Sample records for nonlinear forecasting method

  1. Counteracting structural errors in ensemble forecast of influenza outbreaks.

    PubMed

    Pei, Sen; Shaman, Jeffrey

    2017-10-13

For influenza forecasts generated using dynamical models, forecast inaccuracy is partly attributable to the nonlinear growth of error. As a consequence, quantification of the nonlinear error structure in current forecast models is needed so that this growth can be corrected and forecast skill improved. Here, we inspect the error growth of a compartmental influenza model and find that a robust error structure arises naturally from the nonlinear model dynamics. By counteracting these structural errors, diagnosed using error breeding, we develop a new forecast approach that combines dynamical error correction and statistical filtering techniques. In retrospective forecasts of historical influenza outbreaks for 95 US cities from 2003 to 2014, overall forecast accuracy for outbreak peak timing, peak intensity and attack rate is substantially improved for predicted lead times up to 10 weeks. This error growth correction method can be generalized to improve the forecast accuracy of other infectious disease dynamical models. Inaccuracy of influenza forecasts based on dynamical models is partly due to nonlinear error growth. Here the authors address the error structure of a compartmental influenza model, and develop a new improved forecast approach combining dynamical error correction and statistical filtering techniques.

  2. Nonlinear modeling of chaotic time series: Theory and applications

    NASA Astrophysics Data System (ADS)

    Casdagli, M.; Eubank, S.; Farmer, J. D.; Gibson, J.; Desjardins, D.; Hunter, N.; Theiler, J.

    We review recent developments in the modeling and prediction of nonlinear time series. In some cases, apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases, it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years, methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics, and human speech.
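The scheme the review describes, reconstructing a state space by delay coordinates and then approximating the dynamics locally, can be sketched in a few lines. The embedding dimension, delay, neighbor count, and the logistic-map test signal below are illustrative choices, not taken from the review.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Reconstruct a state space from a scalar series via delay coordinates."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def knn_forecast(x, dim=2, tau=1, k=3):
    """One-step forecast: average the successors of the k nearest
    embedded neighbors of the current state (a local-constant model)."""
    emb = delay_embed(x, dim, tau)
    state, hist = emb[-1], emb[:-1]
    succ = x[(dim - 1) * tau + 1 :]          # successor of each state in hist
    dist = np.linalg.norm(hist - state, axis=1)
    return succ[np.argsort(dist)[:k]].mean()

# Deterministic but chaotic test signal: the logistic map.
x = np.empty(500)
x[0] = 0.4
for t in range(499):
    x[t + 1] = 3.9 * x[t] * (1 - x[t])

pred = knn_forecast(x[:-1])                  # forecast the held-out last value
```

On this deterministic signal the local predictor beats any linear stochastic model for one-step forecasts, which is the point the review makes about exploiting determinism.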

  3. Fuzzy neural network technique for system state forecasting.

    PubMed

    Li, Dezhi; Wang, Wilson; Ismail, Fathy

    2013-10-01

In many system state forecasting applications, the prediction is performed based on multiple datasets, each corresponding to a distinct system condition. The traditional methods dealing with multiple datasets (e.g., vector autoregressive moving average models and neural networks) have some shortcomings, such as limited modeling capability and opaque reasoning operations. To tackle these problems, a novel fuzzy neural network (FNN) is proposed in this paper to effectively extract information from multiple datasets, so as to improve forecasting accuracy. The proposed predictor consists of both autoregressive (AR) nodes and nonlinear nodes; the AR nodes capture the linear correlation of the datasets, while the nonlinear correlation is modeled with nonlinear neuron nodes. A novel particle swarm technique [i.e., the Laplace particle swarm (LPS) method] is proposed to facilitate parameter estimation for the predictor and improve modeling accuracy. The effectiveness of the developed FNN predictor and the associated LPS method is verified by a series of tests related to Mackey-Glass data forecast, exchange rate data prediction, and gear system prognosis. Test results show that the developed FNN predictor and the LPS method can capture the dynamics of multiple datasets effectively and track system characteristics accurately.

  4. Decreasing the temporal complexity for nonlinear, implicit reduced-order models by forecasting

    DOE PAGES

    Carlberg, Kevin; Ray, Jaideep; van Bloemen Waanders, Bart

    2015-02-14

Implicit numerical integration of nonlinear ODEs requires solving a system of nonlinear algebraic equations at each time step. Each of these systems is often solved by a Newton-like method, which incurs a sequence of linear-system solves. Most model-reduction techniques for nonlinear ODEs exploit knowledge of the system's spatial behavior to reduce the computational complexity of each linear-system solve. However, the number of linear-system solves for the reduced-order simulation often remains roughly the same as that for the full-order simulation. We propose exploiting knowledge of the model's temporal behavior to (1) forecast the unknown variable of the reduced-order system of nonlinear equations at future time steps, and (2) use this forecast as an initial guess for the Newton-like solver during the reduced-order-model simulation. To compute the forecast, we propose using the Gappy POD technique. The goal is to generate an accurate initial guess so that the Newton solver requires many fewer iterations to converge, thereby decreasing the number of linear-system solves in the reduced-order-model simulation.
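The idea of seeding the Newton solver with a temporal forecast can be illustrated on a scalar backward-Euler integration. The cubic-decay ODE and the linear extrapolation of past solutions below are invented stand-ins for the paper's reduced-order model and Gappy-POD forecast; they only show why a forecast initial guess cuts the iteration count.

```python
import numpy as np

def newton_solve(x_prev, x0, dt, tol=1e-12):
    """Newton iterations for one backward-Euler step of x' = -x**3,
    i.e. solve g(x) = x - x_prev + dt*x**3 = 0; returns (root, iterations)."""
    x, its = x0, 0
    while abs(x - x_prev + dt * x**3) > tol:
        g = x - x_prev + dt * x**3
        x -= g / (1.0 + 3.0 * dt * x**2)       # Newton update: g / g'
        its += 1
    return x, its

dt, hist = 0.1, [1.0]
naive_its = extrap_its = 0
for n in range(50):
    # Naive initial guess: the previous solution.
    _, i0 = newton_solve(hist[-1], hist[-1], dt)
    naive_its += i0
    # Forecast initial guess: linear extrapolation of the last two solutions.
    guess = 2 * hist[-1] - hist[-2] if len(hist) > 1 else hist[-1]
    x_new, i1 = newton_solve(hist[-1], guess, dt)
    extrap_its += i1
    hist.append(x_new)
```

Because the extrapolated guess starts closer to the root, the total Newton iteration count over the trajectory does not exceed that of the naive guess, mirroring the paper's goal of fewer linear-system solves.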

  5. Does money matter in inflation forecasting?

    NASA Astrophysics Data System (ADS)

    Binner, J. M.; Tino, P.; Tepper, J.; Anderson, R.; Jones, B.; Kendall, G.

    2010-11-01

This paper provides the most comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best fitting US inflation forecasting models and are then compared to forecasts from a naïve random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists' long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies.
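The finite-memory kernel predictor can be illustrated with Gaussian-kernel ridge regression on lag vectors; this batch formulation is a stand-in for the paper's kernel recursive least squares, and the logistic-map series, lag order, and hyperparameters are invented for the sketch.

```python
import numpy as np

# Nonlinear test series (stand-in for an inflation series): the logistic map.
T = 300
x = np.empty(T)
x[0] = 0.3
for t in range(T - 1):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

train, held_out = x[:-1], x[-1]

# Lag-2 input vectors and next-value targets.
Z = np.column_stack([train[1:-1], train[:-2]])
y = train[2:]

def gauss_kernel(A, B, gamma=10.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Kernel ridge regression: solve (K + lam*I) alpha = y,
# predict with k(z_new, Z) @ alpha.
lam = 1e-4
alpha = np.linalg.solve(gauss_kernel(Z, Z) + lam * np.eye(len(Z)), y)
z_new = np.array([[train[-1], train[-2]]])
pred = float(gauss_kernel(z_new, Z) @ alpha)
```

The held-out last value is recovered closely, showing how a kernel regression on a short lag window acts as a nonlinear autoregressive forecaster.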

  6. Short-term forecasting of electric loads using nonlinear autoregressive artificial neural networks with exogenous vector inputs

    DOE PAGES

    Buitrago, Jaime; Asfour, Shihab

    2017-01-01

Short-term load forecasting is crucial for the operations planning of an electrical grid. Forecasting the next 24 h of electrical load in a grid allows operators to plan and optimize their resources. The purpose of this study is to develop a more accurate short-term load forecasting method utilizing non-linear autoregressive artificial neural networks (ANN) with exogenous multi-variable input (NARX). The proposed implementation of the network is new: the neural network is trained in open-loop using actual load and weather data, and then the network is placed in closed-loop to generate a forecast using the predicted load as the feedback input. Unlike existing short-term load forecasting methods using ANNs, the proposed method uses its own output as the input in order to improve the accuracy, thus effectively implementing a feedback loop for the load and making it less dependent on external data. Using the proposed framework, mean absolute percent errors in the forecast on the order of 1% have been achieved, which is a 30% improvement on the average error using feedforward ANNs, ARMAX and state space methods, and which can result in large savings by avoiding commissioning of unnecessary power plants. Finally, the New England electrical load data are used to train and validate the forecast.
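The open-loop-train / closed-loop-forecast pattern can be sketched with a linear autoregressive model standing in for the NARX network; the synthetic load and temperature series, the lag order of 2, and the 24-step horizon are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 400
temp = 10.0 * np.sin(2 * np.pi * np.arange(T + 24) / 24)   # exogenous "weather"
load = 100.0 + 0.8 * temp[:T] + rng.normal(0.0, 0.5, T)
for t in range(1, T):                                      # autoregressive memory
    load[t] += 0.3 * (load[t - 1] - 100.0)

# Open-loop training: regress load[t] on *actual* lagged loads + exogenous input.
X = np.column_stack([load[1:T-1], load[0:T-2], temp[2:T], np.ones(T - 2)])
w, *_ = np.linalg.lstsq(X, load[2:T], rcond=None)

# Closed-loop forecasting: feed each prediction back as the lagged-load input.
hist = [load[T - 2], load[T - 1]]
for h in range(24):
    features = np.array([hist[-1], hist[-2], temp[T + h], 1.0])
    hist.append(float(features @ w))
forecast = np.array(hist[2:])
```

Only the exogenous weather input is needed at forecast time; the load lags come from the model's own output, which is exactly the feedback loop the abstract describes.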

  7. Short-term forecasting of electric loads using nonlinear autoregressive artificial neural networks with exogenous vector inputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buitrago, Jaime; Asfour, Shihab

Short-term load forecasting is crucial for the operations planning of an electrical grid. Forecasting the next 24 h of electrical load in a grid allows operators to plan and optimize their resources. The purpose of this study is to develop a more accurate short-term load forecasting method utilizing non-linear autoregressive artificial neural networks (ANN) with exogenous multi-variable input (NARX). The proposed implementation of the network is new: the neural network is trained in open-loop using actual load and weather data, and then the network is placed in closed-loop to generate a forecast using the predicted load as the feedback input. Unlike existing short-term load forecasting methods using ANNs, the proposed method uses its own output as the input in order to improve the accuracy, thus effectively implementing a feedback loop for the load and making it less dependent on external data. Using the proposed framework, mean absolute percent errors in the forecast on the order of 1% have been achieved, which is a 30% improvement on the average error using feedforward ANNs, ARMAX and state space methods, and which can result in large savings by avoiding commissioning of unnecessary power plants. Finally, the New England electrical load data are used to train and validate the forecast.

  8. Nonlinear modeling of chaotic time series: Theory and applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casdagli, M.; Eubank, S.; Farmer, J.D.

    1990-01-01

We review recent developments in the modeling and prediction of nonlinear time series. In some cases, apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases, it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years, methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics and human speech. 162 refs., 13 figs.

  9. Precipitation and floodiness

    NASA Astrophysics Data System (ADS)

    Stephens, E.; Day, J. J.; Pappenberger, F.; Cloke, H.

    2015-12-01

    There are a number of factors that lead to nonlinearity between precipitation anomalies and flood hazard; this nonlinearity is a pertinent issue for applications that use a precipitation forecast as a proxy for imminent flood hazard. We assessed the degree of this nonlinearity for the first time using a recently developed global-scale hydrological model driven by the ERA-Interim/Land precipitation reanalysis (1980-2010). We introduced new indices to assess large-scale flood hazard, or floodiness, and quantified the link between monthly precipitation, river discharge, and floodiness anomalies at the global and regional scales. The results show that monthly floodiness is not well correlated with precipitation, therefore demonstrating the value of hydrometeorological systems for providing floodiness forecasts for decision-makers. A method is described for forecasting floodiness using the Global Flood Awareness System, building a climatology of regional floodiness from which to forecast floodiness anomalies out to 2 weeks.

  10. Research on Nonlinear Time Series Forecasting of Time-Delay NN Embedded with Bayesian Regularization

    NASA Astrophysics Data System (ADS)

    Jiang, Weijin; Xu, Yusheng; Xu, Yuhui; Wang, Jianmin

Based on the idea of nonlinear prediction via phase-space reconstruction, this paper presents a time-delay BP neural network model whose generalization capability is improved by Bayesian regularization. The model is applied to forecast import and export trade in one industry. The results show that the improved model has excellent generalization capability: it not only learned the historical curve but also efficiently predicted the business trend. Comparing with common forecast evaluations, we conclude that nonlinear forecasting can not only focus on data combination and precision improvement but also vividly reflect the nonlinear characteristics of the forecast system. While analyzing the forecasting precision of the model, we assess the model by calculating the nonlinear characteristic values of the combined series and the original series, showing that the forecasting model can reasonably 'catch' the dynamic characteristics of the nonlinear system that produced the original series.

  11. Nonlinear Prediction Model for Hydrologic Time Series Based on Wavelet Decomposition

    NASA Astrophysics Data System (ADS)

    Kwon, H.; Khalil, A.; Brown, C.; Lall, U.; Ahn, H.; Moon, Y.

    2005-12-01

Traditionally, forecasting and characterization of hydrologic systems are performed using many techniques. Stochastic linear methods such as AR and ARIMA, and nonlinear ones such as tools based on statistical learning theory, have been extensively used. The common difficulty for all methods is determining the sufficient and necessary information and predictors for a successful prediction. Relationships between hydrologic variables are often highly nonlinear and interrelated across temporal scales. A new hybrid approach is proposed for the simulation of hydrologic time series, combining the wavelet transform and a nonlinear model. The present model employs the merits of both the wavelet transform and nonlinear time series models. The wavelet transform is adopted to decompose a hydrologic nonlinear process into a set of mono-component signals, which are simulated by the nonlinear model. The hybrid methodology is formulated in a manner to improve the accuracy of long-term forecasting. The proposed hybrid model yields much better results in terms of capturing and reproducing the time-frequency properties of the system at hand. Prediction results are promising when compared to traditional univariate time series models. An application demonstrating the plausibility of the proposed methodology is provided, and the results show that the wavelet-based time series model can simulate and forecast hydrologic variables reasonably well. This will ultimately serve the purpose of integrated water resources planning and management.
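The decomposition step can be illustrated with a one-level Haar wavelet transform (a hand-rolled stand-in, since the record does not specify the wavelet); each mono-component (approximation and detail) would then be fed to its own nonlinear model. The synthetic streamflow series is invented.

```python
import numpy as np

def haar_decompose(y):
    """One-level Haar transform: approximation (slow) and detail (fast) parts."""
    a = (y[0::2] + y[1::2]) / np.sqrt(2.0)
    d = (y[0::2] - y[1::2]) / np.sqrt(2.0)
    return a, d

def haar_reconstruct(a, d):
    """Invert the one-level Haar transform exactly."""
    y = np.empty(2 * len(a))
    y[0::2] = (a + d) / np.sqrt(2.0)
    y[1::2] = (a - d) / np.sqrt(2.0)
    return y

rng = np.random.default_rng(1)
flow = 50.0 + 20.0 * np.sin(2 * np.pi * np.arange(256) / 64) + rng.normal(0, 2, 256)
approx, detail = haar_decompose(flow)   # model each mono-component separately
```

The transform is exactly invertible, so component-wise forecasts can be recombined without loss; the seasonal signal lands almost entirely in the approximation component, leaving mostly noise in the detail.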

  12. Forecasting currency circulation data of Bank Indonesia by using hybrid ARIMAX-ANN model

    NASA Astrophysics Data System (ADS)

Prayoga, I. Gede Surya Adi; Suhartono; Rahayu, Santi Puteri

    2017-05-01

The purpose of this study is to forecast currency inflow and outflow data of Bank Indonesia. Currency circulation in Indonesia is highly influenced by the presence of Eid al-Fitr. One way to forecast data with the Eid al-Fitr effect is to use an autoregressive integrated moving average with exogenous input (ARIMAX) model. However, ARIMAX is a linear model, which cannot handle nonlinear correlation structures in the data. In the field of forecasting, inaccurate predictions can be attributed to the existence of nonlinear components that are not captured by the model. In this paper, we propose a hybrid model of ARIMAX and artificial neural networks (ANN) that can handle both linear and nonlinear correlation. This method was applied to 46 series of currency inflow and 46 series of currency outflow. The results showed that, based on out-of-sample root mean squared error (RMSE), the hybrid models are up to 10.26 and 10.65 percent better than ARIMAX for the inflow and outflow series, respectively. This means that the ANN performs well in modeling the nonlinear correlation of the data and can increase the accuracy of the linear model.
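The two-stage hybrid idea, a linear model with a calendar dummy plus a nonlinear model of its residuals, can be sketched as follows. The synthetic series, the dummy placement, and the k-nearest-neighbor residual model (standing in for the ANN) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 300
eid = np.zeros(T)
eid[::30] = 1.0                        # stand-in calendar-effect dummy
y = 50.0 + 8.0 * eid + rng.normal(0.0, 1.0, T)
y += 0.5 * np.sin(y / 5.0)             # mild nonlinear component

# Stage 1: linear model with lagged value + exogenous dummy (ARIMAX stand-in).
X = np.column_stack([y[0:T-1], eid[1:T], np.ones(T - 1)])
w, *_ = np.linalg.lstsq(X, y[1:T], rcond=None)
resid = y[1:T] - X @ w

# Stage 2: nonlinear residual model (ANN stand-in: k-NN on the lag-1 residual).
def predict_residual(r, k=5):
    dist = np.abs(r[:-1] - r[-1])
    return r[1:][np.argsort(dist)[:k]].mean()

# Hybrid one-step forecast = linear forecast + predicted residual.
next_eid = 1.0 if T % 30 == 0 else 0.0   # dummy value for the next period
linear_part = np.array([y[-1], next_eid, 1.0]) @ w
forecast = float(linear_part + predict_residual(resid))
```

The linear stage absorbs the calendar effect and autocorrelation; whatever structure it misses is left in the residuals for the nonlinear stage, which is the division of labor the abstract describes.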

  13. A hybrid approach EMD-HW for short-term forecasting of daily stock market time series data

    NASA Astrophysics Data System (ADS)

    Awajan, Ahmad Mohd; Ismail, Mohd Tahir

    2017-08-01

Recently, forecasting time series has attracted considerable attention in the field of analyzing financial time series data, specifically within the stock market index. Moreover, stock market forecasting is a challenging area of financial time-series forecasting. In this study, a hybrid methodology combining Empirical Mode Decomposition with the Holt-Winters method (EMD-HW) is used to improve forecasting performance for financial time series. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW has relatively high accuracy and offers a new forecasting method for time series. Daily stock market time series data from 11 countries are used to show the forecasting performance of the proposed EMD-HW. Based on three forecast accuracy measures, the results indicate that the forecasting performance of EMD-HW is superior to the traditional Holt-Winters forecasting method.
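The Holt-Winters half of the hybrid can be sketched directly (the EMD step is omitted here; in the hybrid each intrinsic mode function would be smoothed and the forecasts summed). The additive formulation, smoothing constants, and synthetic seasonal series below are illustrative assumptions.

```python
import numpy as np

def holt_winters_additive(y, m, alpha=0.3, beta=0.05, gamma=0.2, horizon=6):
    """Additive Holt-Winters smoothing with period m; returns h-step forecasts."""
    level = y[:m].mean()
    trend = (y[m:2 * m].mean() - y[:m].mean()) / m
    season = list(y[:m] - level)
    for t in range(m, len(y)):
        s = season[t - m]
        prev_level = level
        level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season.append(gamma * (y[t] - level) + (1 - gamma) * s)
    return np.array([level + (h + 1) * trend + season[len(y) - m + h % m]
                     for h in range(horizon)])

# Synthetic monthly-style series: linear trend plus seasonality.
t = np.arange(120)
y = 0.1 * t + 5.0 * np.sin(2 * np.pi * t / 12)
fc = holt_winters_additive(y, m=12, horizon=12)
```

Each forecast extrapolates the current level and trend and reuses the most recent seasonal indices, which is why the method handles the slowly varying components that EMD extracts.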

  14. Nonlinear problems in data-assimilation : Can synchronization help?

    NASA Astrophysics Data System (ADS)

    Tribbia, J. J.; Duane, G. S.

    2009-12-01

Over the past several years, operational weather centers have initiated ensemble prediction and assimilation techniques to estimate the error covariance of forecasts in the short and the medium range. The ensemble techniques used are based on linear methods, which have been shown to be useful indicators of skill in the linear range, where forecast errors are small relative to climatological variance. While this advance has been impressive, there are still ad hoc aspects of its use in practice, such as the need for covariance inflation, which are troubling. Furthermore, to be of utility in the nonlinear range, an ensemble assimilation and prediction method must be capable of giving probabilistic information for the situation where a probability density forecast becomes multi-modal. A prototypical, simplest example of such a situation is the planetary-wave regime transition, where the pdf is bimodal. Our recent research shows how the inconsistencies and extensions of linear methodology can be consistently treated using the paradigm of synchronization, which views the problems of assimilation and forecasting as that of optimizing the forecast model state with respect to the future evolution of the atmosphere.

  15. Automation of energy demand forecasting

    NASA Astrophysics Data System (ADS)

    Siddique, Sanzad

    Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometrics models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.

  16. Artificial Neural Network with Regular Graph for Maximum Air Temperature Forecasting:. the Effect of Decrease in Nodes Degree on Learning

    NASA Astrophysics Data System (ADS)

    Ghaderi, A. H.; Darooneh, A. H.

The behavior of nonlinear systems can be analyzed with artificial neural networks; air temperature change is one example of such a nonlinear system. In this work, a new neural network method is proposed for forecasting the maximum air temperature in two cities. In this method, the regular graph concept is used to construct partially connected neural networks that have regular structures. The learning results of a fully connected ANN and of networks built with the proposed method are compared. In some cases, the proposed method gives better results than the conventional ANN. After identifying the best network, the effect of the number of input patterns on the prediction is studied; the results show that increasing the number of input patterns directly improves prediction accuracy.

  17. Nonlinear Dynamical Modes as a Basis for Short-Term Forecast of Climate Variability

    NASA Astrophysics Data System (ADS)

    Feigin, A. M.; Mukhin, D.; Gavrilov, A.; Seleznev, A.; Loskutov, E.

    2017-12-01

We study the ability of data-driven stochastic models, constructed by nonlinear dynamical decomposition of spatially distributed data, to make quantitative (short-term) forecasts of climate characteristics. We compare two data processing techniques: (i) the widely used empirical orthogonal function (EOF) approach, and (ii) the nonlinear dynamical modes (NDMs) framework [1,2]. We also compare two kinds of prognostic models: (i) a traditional autoregressive (linear) model and (ii) a model in the form of a random ("stochastic") nonlinear dynamical system [3]. We apply all combinations of the above-mentioned data mining techniques and model kinds to short-term forecasts of climate indices based on sea surface temperature (SST) data. We use the NOAA_ERSST_V4 dataset (monthly SST with 2° × 2° spatial resolution) covering the tropical belt and starting from the year 1960. We demonstrate that the NDM-based nonlinear model shows better prediction skill than the EOF-based linear and nonlinear models. Finally, we discuss the capability of the NDM-based nonlinear model for long-term (decadal) prediction of climate variability. [1] D. Mukhin, A. Gavrilov, E. Loskutov, A. Feigin, J. Kurths, 2015: Principal nonlinear dynamical modes of climate variability, Scientific Reports, 5, 15510; doi: 10.1038/srep15510. [2] Gavrilov, A., Mukhin, D., Loskutov, E., Volodin, E., Feigin, A., & Kurths, J., 2016: Method for reconstructing nonlinear modes with adaptive structure from multidimensional data. Chaos: An Interdisciplinary Journal of Nonlinear Science, 26(12), 123101. [3] Ya. Molkov, D. Mukhin, E. Loskutov, A. Feigin, 2012: Random dynamical models from time series. Phys. Rev. E, 85(3).
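The EOF half of the comparison (empirical orthogonal functions, i.e. an SVD of the time-by-space anomaly matrix) can be sketched as follows; the synthetic one-pattern "SST" field is invented for illustration and is not the NOAA_ERSST_V4 data.

```python
import numpy as np

rng = np.random.default_rng(3)
months, gridpoints = 240, 50
# Synthetic field: one coherent oscillating spatial pattern plus noise.
pattern = np.sin(np.linspace(0.0, np.pi, gridpoints))
index = np.sin(2 * np.pi * np.arange(months) / 60)
field = np.outer(index, pattern) + 0.1 * rng.normal(size=(months, gridpoints))

# EOF analysis: SVD of the anomaly matrix.
anom = field - field.mean(axis=0)
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
explained = s**2 / np.sum(s**2)          # variance fraction per mode
pc1 = U[:, 0] * s[0]                     # leading principal component (time series)
eof1 = Vt[0]                             # leading spatial pattern
```

The leading mode recovers both the spatial pattern and its time index; a prognostic model (linear or nonlinear) would then be fitted to the principal-component series.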

  18. Neural network based short-term load forecasting using weather compensation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, T.W.S.; Leung, C.T.

This paper presents a novel technique for electric load forecasting based on neural weather compensation. The proposed method is a nonlinear generalization of the Box-Jenkins approach for nonstationary time-series prediction. A weather compensation neural network is implemented for one-day-ahead electric load forecasting. The weather compensation neural network can accurately predict the change in actual electric load consumption from the previous day. The results, based on Hong Kong Island historical load demand, indicate that this methodology is capable of providing a more accurate load forecast with a 0.9% reduction in forecast error.

  19. Fitting and forecasting coupled dark energy in the non-linear regime

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casas, Santiago; Amendola, Luca; Pettorino, Valeria

    2016-01-01

We consider cosmological models in which dark matter feels a fifth force mediated by the dark energy scalar field, also known as coupled dark energy. Our interest resides in estimating forecasts for future surveys like Euclid when we take into account non-linear effects, relying on new fitting functions that reproduce the non-linear matter power spectrum obtained from N-body simulations. We obtain fitting functions for models in which the dark matter-dark energy coupling is constant. Their validity is demonstrated for all available simulations in the redshift range z = 0-1.6 and wave modes below k = 1 h/Mpc. These fitting formulas can be used to test the predictions of the model in the non-linear regime without the need for additional computing-intensive N-body simulations. We then use these fitting functions to perform forecasts on the constraining power that future galaxy-redshift surveys like Euclid will have on the coupling parameter, using the Fisher matrix method for galaxy clustering (GC) and weak lensing (WL). We find that by using information in the non-linear power spectrum, and combining the GC and WL probes, we can constrain the dark matter-dark energy coupling constant squared, β², with precision smaller than 4% and all other cosmological parameters better than 1%, which is a considerable improvement of more than an order of magnitude compared to corresponding linear power spectrum forecasts with the same survey specifications.

  20. An optimized Nash nonlinear grey Bernoulli model based on particle swarm optimization and its application in prediction for the incidence of Hepatitis B in Xinjiang, China.

    PubMed

    Zhang, Liping; Zheng, Yanling; Wang, Kai; Zhang, Xueliang; Zheng, Yujian

    2014-06-01

In this paper, by using a particle swarm optimization algorithm to solve the optimal parameter estimation problem, an improved Nash nonlinear grey Bernoulli model termed PSO-NNGBM(1,1) is proposed. To test its forecasting performance, the optimized model is applied to forecast the incidence of hepatitis B in Xinjiang, China. Four models, the traditional GM(1,1), the grey Verhulst model (GVM), the original nonlinear grey Bernoulli model (NGBM(1,1)) and the Holt-Winters exponential smoothing method, are also established for comparison with the proposed model under the criteria of mean absolute percentage error and root mean square percent error. The prediction results show that the optimized NNGBM(1,1) model is more accurate than the traditional GM(1,1), GVM, NGBM(1,1) and Holt-Winters exponential smoothing methods.
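A minimal NGBM(1,1) can be written down directly, with a grid search over the Bernoulli power n standing in for the particle swarm step; the short incidence-like series below is invented, not the Xinjiang data.

```python
import numpy as np

def ngbm_forecast(x0, n, steps=0):
    """NGBM(1,1) with power n: least-squares fit of (a, b) in
    x0(k) + a*z1(k) = b*z1(k)**n, then the standard time-response forecast."""
    x1 = np.cumsum(x0)
    z1 = 0.5 * (x1[1:] + x1[:-1])                    # background values
    B = np.column_stack([-z1, z1**n])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + steps)
    x1_hat = ((x0[0]**(1 - n) - b / a) * np.exp(-a * (1 - n) * k)
              + b / a) ** (1.0 / (1 - n))
    return np.diff(np.concatenate([[0.0], x1_hat]))  # back to the original series

x0 = np.array([2.87, 3.10, 3.39, 3.65, 3.92, 4.21, 4.55])  # hypothetical incidence data

# Grid search over n (a simple stand-in for the paper's PSO optimization).
best_n, best_err = None, np.inf
for n in np.linspace(-1.0, 0.9, 39):
    fit = ngbm_forecast(x0, n)
    err = float(np.abs(fit - x0).mean())             # in-sample MAE
    if np.isfinite(err) and err < best_err:
        best_n, best_err = n, err

future = ngbm_forecast(x0, best_n, steps=3)[len(x0):]
```

Setting n = 0 recovers the classical GM(1,1); tuning n is what lets the Bernoulli variant bend the growth curve, and PSO simply searches that space more efficiently than the grid here.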

  1. The NLS-Based Nonlinear Grey Multivariate Model for Forecasting Pollutant Emissions in China.

    PubMed

    Pei, Ling-Ling; Li, Qin; Wang, Zheng-Xin

    2018-03-08

The relationship between pollutant discharge and economic growth has been a major research focus in environmental economics. To accurately estimate the nonlinear change law of China's pollutant discharge with economic growth, this study establishes a transformed nonlinear grey multivariable (TNGM(1,N)) model based on the nonlinear least squares (NLS) method. The Gauss-Seidel iterative algorithm was used to solve the parameters of the TNGM(1,N) model based on the NLS basic principle. This algorithm improves the precision of the model by continuous iteration, constantly approximating the optimal regression coefficients of the nonlinear model. In our empirical analysis, the traditional grey multivariate model GM(1,N) and the NLS-based TNGM(1,N) model were respectively adopted to forecast and analyze the relationship among wastewater discharge per capita (WDPC) and per capita emissions of SO₂ and dust, alongside GDP per capita in China during the period 1996-2015. Results indicated that the NLS algorithm effectively helps the grey multivariable model identify the nonlinear relationship between pollutant discharge and economic growth. The NLS-based TNGM(1,N) model achieves greater precision when forecasting WDPC and per capita SO₂ and dust emissions than the traditional GM(1,N) model; WDPC shows a growing tendency aligned with the growth of GDP, while the per capita emissions of SO₂ and dust decrease accordingly.

  2. Product demand forecasts using wavelet kernel support vector machine and particle swarm optimization in manufacture system

    NASA Astrophysics Data System (ADS)

    Wu, Qi

    2010-03-01

Demand forecasts play a crucial role in supply chain management. The future demand for a certain product is the basis for the respective replenishment systems. For demand series with small samples, seasonality, nonlinearity, randomness and fuzziness, the existing support vector kernels do not approximate the random curve of the sales time series well in the quadratic continuous integral space. In this paper, we present a hybrid intelligent system combining a wavelet kernel support vector machine and particle swarm optimization for demand forecasting. The results of an application to car sales series forecasting show that the forecasting approach based on the hybrid PSOWv-SVM model is effective and feasible. A comparison between the method proposed in this paper and other methods is also given, which shows that, for the discussed example, this method is better than hybrid PSOv-SVM and other traditional methods.
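A minimal particle swarm optimizer is easy to sketch; in the paper it would tune the wavelet-kernel SVM's hyperparameters, while here it is demonstrated on a toy quadratic objective with a known minimum (all settings below are illustrative).

```python
import numpy as np

def pso(f, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization over box bounds (minimizes f)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, (n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([f(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1 = rng.random(x.shape)
        r2 = rng.random(x.shape)
        # Velocity update: inertia + pull toward personal and global bests.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, float(pbest_f.min())

# Toy objective with known minimum at (1, -2).
best, val = pso(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
                bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

In the hybrid model, f would instead return a validation forecasting error for a candidate (kernel, regularization) setting, so the swarm searches the hyperparameter box rather than this toy plane.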

  3. A comparison of the stochastic and machine learning approaches in hydrologic time series forecasting

    NASA Astrophysics Data System (ADS)

    Kim, T.; Joo, K.; Seo, J.; Heo, J. H.

    2016-12-01

Hydrologic time series forecasting is an essential task in water resources management, and it becomes more difficult due to the complexity of the runoff process. Traditional stochastic models such as the ARIMA family have been used as a standard approach in time series modeling and forecasting of hydrological variables. Due to the nonlinearity in hydrologic time series data, machine learning approaches have been studied, with the advantage of discovering relevant features in nonlinear relations among variables. This study aims to compare the predictability of the traditional stochastic model and the machine learning approach. A seasonal ARIMA model was used as the traditional time series model, and a Random Forest model, which consists of decision trees and an ensemble method using a multiple-predictor approach, was applied as the machine learning approach. In the application, monthly inflow data from 1986 to 2015 of Chungju dam in South Korea were used for modeling and forecasting. In order to evaluate the performance of the models, one-step-ahead and multi-step-ahead forecasting were applied. The root mean squared error and mean absolute error of the two models were compared.
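The evaluation setup described here, one-step-ahead versus recursive multi-step-ahead forecasting scored by RMSE and MAE, can be sketched with a least-squares AR(12) standing in for both compared models; the synthetic monthly inflow series is invented.

```python
import numpy as np

rng = np.random.default_rng(4)
T, p, H = 240, 12, 12
y = 10.0 + 5.0 * np.sin(2 * np.pi * np.arange(T) / 12) + rng.normal(0.0, 0.5, T)
train, test = y[:T - H], y[T - H:]

# Fit an AR(12) by least squares (a simple stand-in for the compared models).
X = np.column_stack([train[p - i - 1 : len(train) - i - 1] for i in range(p)]
                    + [np.ones(len(train) - p)])
w, *_ = np.linalg.lstsq(X, train[p:], rcond=None)

def predict(lags):                        # lags: last p values, oldest first
    return float(np.concatenate([lags[::-1], [1.0]]) @ w)

# One-step-ahead: condition on observed values at every step.
one_step = np.array([predict(y[T - H + h - p : T - H + h]) for h in range(H)])

# Multi-step-ahead: recursively feed forecasts back in.
hist = list(train[-p:])
multi_step = []
for h in range(H):
    multi_step.append(predict(np.array(hist[-p:])))
    hist.append(multi_step[-1])
multi_step = np.array(multi_step)

rmse = lambda e: float(np.sqrt(np.mean(e**2)))
mae = lambda e: float(np.mean(np.abs(e)))
```

One-step errors reflect pure model fit, while multi-step errors also accumulate feedback error, which is why the study reports both modes for each model.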

  4. A Four-Stage Hybrid Model for Hydrological Time Series Forecasting

    PubMed Central

    Di, Chongli; Yang, Xiaohua; Wang, Xiaochao

    2014-01-01

    Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of ‘denoising, decomposition and ensemble’. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noises in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulties of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models. PMID:25111782

  5. Reservoir inflow forecasting with a modified coactive neuro-fuzzy inference system: a case study for a semi-arid region

    NASA Astrophysics Data System (ADS)

    Allawi, Mohammed Falah; Jaafar, Othman; Mohamad Hamzah, Firdaus; Mohd, Nuruol Syuhadaa; Deo, Ravinesh C.; El-Shafie, Ahmed

    2017-10-01

    Existing forecast models applied for reservoir inflow forecasting encounter several drawbacks, owing to the difficulty the underlying mathematical procedures have in coping with and mimicking the natural variability and stochasticity of the inflow data patterns. In this study, appropriate adjustments to the conventional coactive neuro-fuzzy inference system (CANFIS) method are proposed to improve the mathematical procedure, thus enabling better detection of the highly nonlinear patterns found in the reservoir inflow training data. This modification includes updating the back-propagation algorithm, leading to a consequent update of the membership rules, and inducing a centre-weighted set rather than the global weighted set used in feature extraction. The modification also aids in constructing an integrated model that is able to detect not only the nonlinearity in the training data but also the wide range of features within the training data records used to simulate the forecasting model. To demonstrate the model's efficacy, the proposed CANFIS method has been applied to forecast monthly inflow data at Aswan High Dam (AHD), located in southern Egypt. Comparative analyses of the forecasting skill of the modified CANFIS and the conventional ANFIS model are carried out with statistical score indicators to assess the reliability of the developed method. The statistical metrics support the better performance of the developed CANFIS model, which significantly outperforms the ANFIS model, attaining a low relative error (23%), mean absolute error (1.4 BCM month-1), root mean square error (1.14 BCM month-1), and a relatively large coefficient of determination (0.94). The present study ascertains the better utility of the modified CANFIS model with respect to the traditional ANFIS model applied in reservoir inflow forecasting for a semi-arid region.

  6. A four-stage hybrid model for hydrological time series forecasting.

    PubMed

    Di, Chongli; Yang, Xiaohua; Wang, Xiaochao

    2014-01-01

    Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of 'denoising, decomposition and ensemble'. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noises in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulties of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models.
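For readers who want the shape of the four-stage pipeline, here is a deliberately simplified sketch: a trailing moving average stands in for the EMD denoising, a slow/fast split stands in for the EEMD decomposition, and least-squares AR components stand in for the RBFNN and LNN stages. All names and parameter choices are illustrative, and the function forecasts the next value of the denoised series rather than of the raw one.

```python
import numpy as np

def ar_forecast(c, p=4):
    """One-step least-squares AR(p) forecast (stand-in for the paper's RBFNN)."""
    X = np.column_stack([c[i:len(c) - p + i] for i in range(p)])
    coef = np.linalg.lstsq(X, c[p:], rcond=None)[0]
    return c[-p:] @ coef

def hybrid_forecast(x, window=5, p=4):
    """Denoise -> decompose -> predict components -> ensemble (simplified)."""
    x = np.asarray(x, float)
    k = np.ones(window) / window
    denoised = np.convolve(x, k, mode="valid")     # stage 1: trailing-MA denoising
    slow = np.convolve(denoised, k, mode="valid")  # stage 2: slow component
    fast = denoised[window - 1:] - slow            #          fast (residual) component
    return ar_forecast(slow, p) + ar_forecast(fast, p)  # stages 3-4: predict and sum
```

The decomposition stage matters because each component is individually much easier to model than the raw series, which is the paper's central point.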

  7. The behaviour of PM10 and ozone in Malaysia through non-linear dynamical systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sapini, Muhamad Luqman; Rahim, Nurul Zahirah binti Abd; Noorani, Mohd Salmi Md.

    Prediction of ozone (O3) and PM10 is very important, as both of these air pollutants affect human health, human activities and more. Short-term forecasting of air quality is needed so that preventive measures and effective action can be taken. If the ozone data are found to arise from a chaotic dynamical system, a model based on the nonlinear dynamics of chaos theory can be built, and short-term forecasts would thus be more accurate. This study uses two methods, namely the 0-1 test and the Lyapunov exponent. In addition, the effect of noise reduction on the analysis of time series data is examined using two smoothing methods: the rectangular method and the triangular method. At the end of the study, recommendations are made for obtaining better results in the future.
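The 0-1 test reduces a time series to a single diagnostic K: translation variables built from the data diffuse for chaotic dynamics (K near 1) but stay bounded for regular dynamics (K near 0). A minimal single-c version is sketched below; the published test additionally medians over many random c values and uses a regression-based K, both omitted here.

```python
import numpy as np

def zero_one_test(x, c=1.7):
    """Gottwald-Melbourne 0-1 test, basic form: correlation K of the
    mean-square displacement of the translation variables with time."""
    x = np.asarray(x, float)
    n = len(x)
    t = np.arange(1, n + 1)
    p = np.cumsum(x * np.cos(c * t))   # translation variables
    q = np.cumsum(x * np.sin(c * t))
    ncut = n // 10                      # displacements up to a tenth of the series
    j = np.arange(1, ncut)
    msd = np.array([np.mean((p[d:] - p[:-d]) ** 2 + (q[d:] - q[:-d]) ** 2)
                    for d in j])
    return np.corrcoef(j, msd)[0, 1]   # ~1 chaotic, ~0 regular
```

On a chaotic logistic-map series K comes out near 1, while a pure sinusoid gives a value near 0.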

  8. The NLS-Based Nonlinear Grey Multivariate Model for Forecasting Pollutant Emissions in China

    PubMed Central

    Pei, Ling-Ling; Li, Qin

    2018-01-01

    The relationship between pollutant discharge and economic growth has been a major research focus in environmental economics. To accurately estimate the nonlinear change law of China’s pollutant discharge with economic growth, this study establishes a transformed nonlinear grey multivariable (TNGM (1, N)) model based on the nonlinear least square (NLS) method. The Gauss–Seidel iterative algorithm was used to solve the parameters of the TNGM (1, N) model based on the NLS basic principle. This algorithm improves the precision of the model by continuous iteration and constantly approximating the optimal regression coefficient of the nonlinear model. In our empirical analysis, the traditional grey multivariate model GM (1, N) and the NLS-based TNGM (1, N) models were respectively adopted to forecast and analyze the relationship among wastewater discharge per capita (WDPC), and per capita emissions of SO2 and dust, alongside GDP per capita in China during the period 1996–2015. Results indicated that the NLS algorithm is able to effectively help the grey multivariable model identify the nonlinear relationship between pollutant discharge and economic growth. The results show that the NLS-based TNGM (1, N) model presents greater precision when forecasting WDPC, SO2 emissions and dust emissions per capita, compared to the traditional GM (1, N) model; WDPC indicates a growing tendency aligned with the growth of GDP, while the per capita emissions of SO2 and dust reduce accordingly. PMID:29517985
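For orientation, the univariate GM(1,1) model at the root of this grey-model family can be sketched as follows; the TNGM(1,N) of the paper adds driving variables and the NLS refinement, neither of which is reproduced here.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Classic GM(1,1) grey forecast: accumulate, fit the grey differential
    equation by least squares, solve, then difference back."""
    x0 = np.asarray(x0, float)
    x1 = np.cumsum(x0)                         # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])              # background (mean-generated) values
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.r_[x1_hat[0], np.diff(x1_hat)]
    return x0_hat[len(x0):]                    # out-of-sample forecasts only
```

On data that grow roughly exponentially, even a short sample pins down the model well, which is why grey models are popular for small-sample socioeconomic series.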

  9. Optimization of autoregressive, exogenous inputs-based typhoon inundation forecasting models using a multi-objective genetic algorithm

    NASA Astrophysics Data System (ADS)

    Ouyang, Huei-Tau

    2017-07-01

    Three types of model for forecasting inundation levels during typhoons were optimized: the linear autoregressive model with exogenous inputs (LARX), the nonlinear autoregressive model with exogenous inputs with wavelet function (NLARX-W) and the nonlinear autoregressive model with exogenous inputs with sigmoid function (NLARX-S). The forecast performance was evaluated by three indices: coefficient of efficiency, error in peak water level and relative time shift. Historical typhoon data were used to establish water-level forecasting models that satisfy all three objectives. A multi-objective genetic algorithm was employed to search for the Pareto-optimal model set that satisfies all three objectives and select the ideal models for the three indices. Findings showed that the optimized nonlinear models (NLARX-W and NLARX-S) outperformed the linear model (LARX). Among the nonlinear models, the optimized NLARX-W model achieved a more balanced performance on the three indices than the NLARX-S models and is recommended for inundation forecasting during typhoons.
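A LARX model of the kind optimized here can be fitted by ordinary least squares. The sketch below uses illustrative variable names and a single exogenous input (e.g. rainfall driving water level), and returns the coefficient vector together with a one-step-ahead forecast.

```python
import numpy as np

def fit_larx(y, u, na=2, nb=2):
    """Linear ARX: y_t regressed on na past outputs and nb past exogenous inputs."""
    y, u = np.asarray(y, float), np.asarray(u, float)
    start = max(na, nb)
    X = np.array([np.r_[y[t - na:t][::-1], u[t - nb:t][::-1]]
                  for t in range(start, len(y))])          # newest lag first
    theta = np.linalg.lstsq(X, y[start:], rcond=None)[0]
    next_row = np.r_[y[-na:][::-1], u[-nb:][::-1]]
    return theta, next_row @ theta                         # coefficients, forecast
```

The NLARX variants in the paper pass such regressors through wavelet or sigmoid units instead of using them linearly.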

  10. Reconstructing latent dynamical noise for better forecasting observables

    NASA Astrophysics Data System (ADS)

    Hirata, Yoshito

    2018-03-01

    I propose a method for reconstructing multi-dimensional dynamical noise inspired by the embedding theorem of Muldoon et al. [Dyn. Stab. Syst. 13, 175 (1998)] by regarding multiple predictions as different observables. Then, applying the embedding theorem by Stark et al. [J. Nonlinear Sci. 13, 519 (2003)] for a forced system, I produce time series forecast by supplying the reconstructed past dynamical noise as auxiliary information. I demonstrate the proposed method on toy models driven by auto-regressive models or independent Gaussian noise.

  11. Mid- and long-term runoff predictions by an improved phase-space reconstruction model.

    PubMed

    Hong, Mei; Wang, Dong; Wang, Yuankun; Zeng, Xiankui; Ge, Shanshan; Yan, Hengqian; Singh, Vijay P

    2016-07-01

    In recent years, the phase-space reconstruction method has usually been used for mid- and long-term runoff predictions. However, the traditional phase-space reconstruction method still needs improvement. Using a genetic algorithm to improve the phase-space reconstruction method, a new nonlinear model of monthly runoff is constructed. The new model does not rely heavily on embedding dimensions. Recognizing that the rainfall-runoff process is complex and affected by a number of factors, more variables (e.g. temperature and rainfall) are incorporated in the model. In order to detect the possible presence of chaos in the runoff dynamics, chaotic characteristics of the model are also analyzed, which shows the model can represent the nonlinear and chaotic characteristics of the runoff. The model is tested for its forecasting performance in four types of experiments using data from six hydrological stations on the Yellow River and the Yangtze River. Results show that the medium- and long-term runoff is satisfactorily forecasted at the hydrological stations. Not only is the forecasting trend accurate, but the mean absolute percentage error is also no more than 15%. Moreover, the forecast results for wet years and dry years are both good, which means that the improved model can overcome the traditional ''wet years and dry years predictability barrier,'' to some extent. The model forecasts for different regions are all good, showing the universality of the approach. Compared with selected conceptual and empirical methods, the model exhibits greater reliability and stability in long-term runoff prediction. Our study provides a new way of thinking for research on the association between monthly runoff and other hydrological factors, and also provides a new method for the prediction of monthly runoff. Copyright © 2015 Elsevier Inc. All rights reserved.
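The phase-space reconstruction step itself is easy to illustrate. Below is a minimal delay-embedding forecaster in the method-of-analogues style, a far simpler relative of the genetic-algorithm-improved model described above: embed the series, find the nearest reconstructed states to the current one, and average their observed futures.

```python
import numpy as np

def embed_forecast(x, dim=3, tau=1, k=4, horizon=1):
    """Delay-embed x, then forecast by averaging the futures of the k
    nearest neighbours of the most recent reconstructed state."""
    x = np.asarray(x, float)
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
    query = emb[-1]                        # current reconstructed state
    cand = emb[:n - horizon]               # states whose future is observed
    d = np.linalg.norm(cand - query, axis=1)
    nbrs = np.argsort(d)[:k]
    return x[nbrs + (dim - 1) * tau + horizon].mean()
```

The embedding dimension `dim` and delay `tau` are exactly the quantities the paper's genetic algorithm tunes; here they are fixed by hand.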

  12. Quasi-most unstable modes: a window to 'À la carte' ensemble diversity?

    NASA Astrophysics Data System (ADS)

    Homar Santaner, Victor; Stensrud, David J.

    2010-05-01

    The atmospheric scientific community is nowadays facing the ambitious challenge of providing useful forecasts of atmospheric events that produce high societal impact. The low level of social resilience to false alarms creates tremendous pressure on forecasting offices to issue accurate, timely and reliable warnings. Currently, no operational numerical forecasting system is able to respond to the societal demand for high-resolution (in time and space) predictions in the 12-72 h time span. The main reasons for such deficiencies are the lack of adequate observations and the high nonlinearity of the numerical models that are currently used. The whole weather forecasting problem is intrinsically probabilistic, and current methods aim at coping with the various sources of uncertainty and the error propagation throughout the forecasting system. This probabilistic perspective is often created by generating ensembles of deterministic predictions that are aimed at sampling the most important sources of uncertainty in the forecasting system. The ensemble generation/sampling strategy is a crucial aspect of their performance, and various methods have been proposed. Although global forecasting offices have been using ensembles of perturbed initial conditions for medium-range operational forecasts since 1994, no consensus exists regarding the optimum sampling strategy for high-resolution short-range ensemble forecasts. Bred vectors, however, have been hypothesized to better capture the growing modes in the highly nonlinear mesoscale dynamics of severe episodes than singular vectors or observation perturbations. Yet even this technique is not able to produce enough diversity in the ensembles to accurately and routinely predict extreme phenomena such as severe weather. Thus, we propose a new method to generate ensembles of initial-condition perturbations that is based on the breeding technique.
Given a standard bred mode, a set of customized perturbations is derived with specified amplitudes and horizontal scales. This allows the ensemble to excite growing modes across a wider range of scales. Results show that this approach produces significantly more spread in the ensemble prediction than standard bred modes alone. Several examples that illustrate the benefits from this approach for severe weather forecasts will be provided.

  13. Sufficient Forecasting Using Factor Models

    PubMed Central

    Fan, Jianqing; Xue, Lingzhou; Yao, Jiawei

    2017-01-01

    We consider forecasting a single time series when there is a large number of predictors and a possible nonlinear effect. The dimensionality was first reduced via a high-dimensional (approximate) factor model implemented by the principal component analysis. Using the extracted factors, we develop a novel forecasting method called the sufficient forecasting, which provides a set of sufficient predictive indices, inferred from high-dimensional predictors, to deliver additional predictive power. The projected principal component analysis will be employed to enhance the accuracy of inferred factors when a semi-parametric (approximate) factor model is assumed. Our method is also applicable to cross-sectional sufficient regression using extracted factors. The connection between the sufficient forecasting and the deep learning architecture is explicitly stated. The sufficient forecasting correctly estimates projection indices of the underlying factors even in the presence of a nonparametric forecasting function. The proposed method extends the sufficient dimension reduction to high-dimensional regimes by condensing the cross-sectional information through factor models. We derive asymptotic properties for the estimate of the central subspace spanned by these projection directions as well as the estimates of the sufficient predictive indices. We further show that the natural method of running multiple regression of target on estimated factors yields a linear estimate that actually falls into this central subspace. Our method and theory allow the number of predictors to be larger than the number of observations. We finally demonstrate that the sufficient forecasting improves upon the linear forecasting in both simulation studies and an empirical study of forecasting macroeconomic variables. PMID:29731537
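The factor-extraction-plus-regression backbone of this approach can be sketched with plain PCA and a linear link; the paper's sufficient forecasting generalizes this to several predictive indices and a possibly nonlinear forecasting function, neither of which is attempted here.

```python
import numpy as np

def pca_factor_forecast(X, y, k=1):
    """Extract k principal-component factors from the predictor panel X
    (periods in rows) and regress the next target value on them."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    F = Xc @ Vt[:k].T                          # estimated factors, one row per period
    A = np.column_stack([F[:-1], np.ones(len(F) - 1)])
    coef = np.linalg.lstsq(A, y[1:], rcond=None)[0]   # y_{t+1} on factors_t
    return np.r_[F[-1], 1.0] @ coef            # forecast of the next y
```

This is the "multiple regression of target on estimated factors" baseline the abstract mentions; the sufficient-forecasting indices live in the same factor space but need not be linear in it.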

  14. The time series approach to short term load forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagan, M.T.; Behr, S.M.

    The application of time series analysis methods to load forecasting is reviewed. It is shown that Box and Jenkins time series models, in particular, are well suited to this application. The logical and organized procedures for model development using the autocorrelation function make these models particularly attractive. One of the drawbacks of these models is the inability to accurately represent the nonlinear relationship between load and temperature. A simple procedure for overcoming this difficulty is introduced, and several Box and Jenkins models are compared with a forecasting procedure currently used by a utility company.

  15. Data-driven forecasting of high-dimensional chaotic systems with long short-term memory networks.

    PubMed

    Vlachas, Pantelis R; Byeon, Wonmin; Wan, Zhong Y; Sapsis, Themistoklis P; Koumoutsakos, Petros

    2018-05-01

    We introduce a data-driven forecasting method for high-dimensional chaotic systems using long short-term memory (LSTM) recurrent neural networks. The proposed LSTM neural networks perform inference of high-dimensional dynamical systems in their reduced order space and are shown to be an effective set of nonlinear approximators of their attractor. We demonstrate the forecasting performance of the LSTM and compare it with Gaussian processes (GPs) in time series obtained from the Lorenz 96 system, the Kuramoto-Sivashinsky equation and a prototype climate model. The LSTM networks outperform the GPs in short-term forecasting accuracy in all applications considered. A hybrid architecture, extending the LSTM with a mean stochastic model (MSM-LSTM), is proposed to ensure convergence to the invariant measure. This novel hybrid method is fully data-driven and extends the forecasting capabilities of LSTM networks.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Mei; Wang, Dong, E-mail: wangdong@nju.edu.cn; Wang, Yuankun

    In recent years, the phase-space reconstruction method has usually been used for mid- and long-term runoff predictions. However, the traditional phase-space reconstruction method still needs improvement. Using a genetic algorithm to improve the phase-space reconstruction method, a new nonlinear model of monthly runoff is constructed. The new model does not rely heavily on embedding dimensions. Recognizing that the rainfall–runoff process is complex and affected by a number of factors, more variables (e.g. temperature and rainfall) are incorporated in the model. In order to detect the possible presence of chaos in the runoff dynamics, chaotic characteristics of the model are also analyzed, which shows the model can represent the nonlinear and chaotic characteristics of the runoff. The model is tested for its forecasting performance in four types of experiments using data from six hydrological stations on the Yellow River and the Yangtze River. Results show that the medium- and long-term runoff is satisfactorily forecasted at the hydrological stations. Not only is the forecasting trend accurate, but the mean absolute percentage error is also no more than 15%. Moreover, the forecast results for wet years and dry years are both good, which means that the improved model can overcome the traditional ‘‘wet years and dry years predictability barrier,’’ to some extent. The model forecasts for different regions are all good, showing the universality of the approach. Compared with selected conceptual and empirical methods, the model exhibits greater reliability and stability in long-term runoff prediction. Our study provides a new way of thinking for research on the association between monthly runoff and other hydrological factors, and also provides a new method for the prediction of monthly runoff. - Highlights: • The improved phase-space reconstruction model of monthly runoff is established. • Two variables (temperature and rainfall) are incorporated in the model. • Chaotic characteristics of the model are also analyzed. • The forecast results of the mid- and long-term runoff in six stations are accurate.

  17. Why preferring parametric forecasting to nonparametric methods?

    PubMed

    Jabot, Franck

    2015-05-07

    A recent series of papers by Charles T. Perretti and collaborators have shown that nonparametric forecasting methods can outperform parametric methods in noisy nonlinear systems. Such a situation can arise for two main reasons: the instability of parametric inference procedures in chaotic systems, which can lead to biased parameter estimates, and the discrepancy between the real system dynamics and the modeled one, a problem that Perretti and collaborators call "the true model myth". Should ecologists go on using the demanding parametric machinery when trying to forecast the dynamics of complex ecosystems? Or should they rely on the elegant nonparametric approach that appears so promising? It is argued here that ecological forecasting based on parametric models presents two key comparative advantages over nonparametric approaches. First, the likelihood of parametric forecasting failure can be diagnosed thanks to simple Bayesian model checking procedures. Second, when parametric forecasting is diagnosed to be reliable, forecasting uncertainty can be estimated on virtual data generated with the parametric model fitted to the data. In contrast, nonparametric techniques provide forecasts with unknown reliability. This argumentation is illustrated with the simple theta-logistic model that was previously used by Perretti and collaborators to make their point. It should convince ecologists to stick to standard parametric approaches, until methods have been developed to assess the reliability of nonparametric forecasting. Copyright © 2015 Elsevier Ltd. All rights reserved.
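The first advantage, diagnosing forecasting failure through model checking, can be illustrated with a plain parametric bootstrap, a frequentist stand-in for the Bayesian checking procedures the paper advocates: simulate replicate data sets from the fitted model and locate the observed summary statistic among them.

```python
import numpy as np

def bootstrap_check(x, simulate, stat, n_rep=200, rng=None):
    """Tail probability of the observed statistic under the fitted model;
    values near 0 or 1 flag model misspecification."""
    rng = np.random.default_rng(0) if rng is None else rng
    reps = np.array([stat(simulate(rng)) for _ in range(n_rep)])
    return float(np.mean(reps >= stat(x)))
```

When the fitted model is badly wrong, the observed statistic lands in the far tail of the replicate distribution, which is exactly the signal that parametric forecasts should not be trusted.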

  18. Nonparametric Stochastic Model for Uncertainty Quantification of Short-term Wind Speed Forecasts

    NASA Astrophysics Data System (ADS)

    AL-Shehhi, A. M.; Chaouch, M.; Ouarda, T.

    2014-12-01

    Wind energy is increasing in importance as a renewable energy source due to its potential role in reducing carbon emissions. It is a safe, clean, and inexhaustible source of energy. The amount of wind energy generated by wind turbines is closely related to the wind speed. Wind speed forecasting plays a vital role in the wind energy sector in terms of wind turbine optimal operation, wind energy dispatch and scheduling, efficient energy harvesting, etc. It is also considered during the planning, design, and assessment of any proposed wind project. Therefore, accurate prediction of wind speed carries particular importance and plays a significant role in the wind industry. Many methods have been proposed in the literature for short-term wind speed forecasting. These methods are usually based on modeling historical fixed time intervals of the wind speed data and using them for future prediction. The methods mainly include statistical models such as the ARMA and ARIMA models, physical models such as numerical weather prediction, and artificial intelligence techniques such as support vector machines and neural networks. In this paper, we are interested in estimating hourly wind speed measures in the United Arab Emirates (UAE). More precisely, we predict hourly wind speed using a nonparametric kernel estimation of the regression and volatility functions pertaining to a nonlinear autoregressive model with an ARCH component, which includes the unknown nonlinear regression function and volatility function already discussed in the literature. The unknown nonlinear regression function describes the dependence between the value of the wind speed at time t and its historical data at times t - 1, t - 2, … , t - d. This function plays a key role in predicting the hourly wind speed process. The volatility function, i.e., the conditional variance given the past, measures the risk associated with this prediction. Since the regression and volatility functions are supposed to be unknown, they are estimated using nonparametric kernel methods. In addition to the pointwise hourly wind speed forecasts, a confidence interval is also provided, which allows the uncertainty around the forecasts to be quantified.
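The kernel estimates described here are straightforward to sketch for the first-order case (d = 1): a Nadaraya-Watson smoother with a Gaussian kernel and a hand-picked bandwidth gives both the point forecast and the conditional variance. Bandwidth selection, which the operational method would need, is omitted.

```python
import numpy as np

def nw_forecast(x, h=0.2):
    """Kernel estimates of the conditional mean (forecast) and conditional
    variance (volatility) of x_t given x_{t-1}, evaluated at the last value."""
    x = np.asarray(x, float)
    past, future = x[:-1], x[1:]
    w = np.exp(-0.5 * ((past - x[-1]) / h) ** 2)   # Gaussian kernel weights
    w /= w.sum()
    m = np.sum(w * future)                          # point forecast
    v = np.sum(w * (future - m) ** 2)               # conditional variance
    return m, v
```

A rough prediction interval then follows as m plus or minus a multiple of the square root of v, which is the uncertainty quantification the abstract refers to.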

  19. Detecting and disentangling nonlinear structure from solar flux time series

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.; Roszman, L.

    1992-01-01

    Interest in solar activity has grown in the past two decades for many reasons. Most importantly for flight dynamics, solar activity changes the atmospheric density, which has important implications for spacecraft trajectory and lifetime prediction. Building upon the previously developed Rayleigh-Benard nonlinear dynamic solar model, which exhibits many dynamic behaviors observed in the Sun, this work introduces new chaotic solar forecasting techniques. Our attempt to use recently developed nonlinear chaotic techniques to model and forecast solar activity has uncovered highly entangled dynamics. Numerical techniques for decoupling additive and multiplicative white noise from deterministic dynamics are presented, and the falloff of the power spectra at high frequencies is examined as a possible means of distinguishing deterministic chaos from noise that is spectrally white or colored. The power spectral techniques presented are less cumbersome than current methods for identifying deterministic chaos, which require more computationally intensive calculations, such as those involving Lyapunov exponents and attractor dimension.

  20. Forecasting space weather: Can new econometric methods improve accuracy?

    NASA Astrophysics Data System (ADS)

    Reikard, Gordon

    2011-06-01

    Space weather forecasts are currently used in areas ranging from navigation and communication to electric power system operations. The relevant forecast horizons can range from as little as 24 h to several days. This paper analyzes the predictability of two major space weather measures using new time series methods, many of them derived from econometrics. The data sets are the Ap geomagnetic index and the solar radio flux at 10.7 cm. The methods tested include nonlinear regressions, neural networks, frequency domain algorithms, GARCH models (which utilize the residual variance), state transition models, and models that combine elements of several techniques. While combined models are complex, they can be programmed using modern statistical software. The data frequency is daily, and forecasting experiments are run over horizons ranging from 1 to 7 days. Two major conclusions stand out. First, the frequency domain method forecasts the Ap index more accurately than any time domain model, including both regressions and neural networks. This finding is very robust, and holds for all forecast horizons. Combining the frequency domain method with other techniques yields a further small improvement in accuracy. Second, the neural network forecasts the solar flux more accurately than any other method, although at short horizons (2 days or less) the regression and the net yield similar results. The neural net does best when it includes measures of the long-term component in the data.
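A minimal frequency-domain forecaster of the sort compared in this study: remove the mean, keep the dominant harmonics of the FFT, and evaluate them beyond the sample. This sketch assumes bin-aligned periodicities; real Ap or flux data would need tapering and model selection.

```python
import numpy as np

def fourier_extrapolate(x, steps, n_harm=3):
    """Keep the n_harm strongest harmonics of the demeaned series and
    evaluate them past the end of the sample."""
    x = np.asarray(x, float)
    n = len(x)
    F = np.fft.rfft(x - x.mean())
    freqs = np.fft.rfftfreq(n)
    keep = np.argsort(np.abs(F))[::-1][:n_harm]   # dominant harmonics
    t_ext = np.arange(n, n + steps)
    out = np.full(steps, x.mean())
    for i in keep:
        amp, ph = np.abs(F[i]) / n, np.angle(F[i])
        # factor 2 for one-sided spectrum (bins 0 and Nyquist would need 1)
        out = out + 2 * amp * np.cos(2 * np.pi * freqs[i] * t_ext + ph)
    return out
```

For strongly periodic geophysical indices this kind of harmonic extrapolation is the simplest member of the frequency-domain family the paper evaluates.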

  1. Residual delay maps unveil global patterns of atmospheric nonlinearity and produce improved local forecasts

    PubMed Central

    Sugihara, George; Casdagli, Martin; Habjan, Edward; Hess, Dale; Dixon, Paul; Holland, Greg

    1999-01-01

    We use residual-delay maps of observational field data for barometric pressure to demonstrate the structure of latitudinal gradients in nonlinearity in the atmosphere. Nonlinearity is weak and largely lacking in tropical and subtropical sites and increases rapidly into the temperate regions, where the time series also appear to be much noisier. The degree of nonlinearity closely follows the meridional variation of midlatitude storm track frequency. We extract the specific functional form of this nonlinearity, a V shape in the lagged residuals that appears to be a basic feature of midlatitude synoptic weather systems associated with frontal passages. We present evidence that this form arises from the relative time scales of high-pressure versus low-pressure events. Finally, we show that this nonlinear feature is weaker in a well-regarded numerical forecast model (from the European Centre for Medium-Range Weather Forecasts) because small-scale temporal and spatial variation is smoothed out in the gridded inputs. This is significant, in that it allows us to demonstrate how application of statistical corrections based on the residual-delay map may provide marked increases in local forecast accuracy, especially for severe weather systems. PMID:10588685
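The statistical correction described in the last sentence can be sketched as follows: learn a map from yesterday's forecast residual to today's and use it to adjust the newest model forecast. A straight line stands in here for the V-shaped residual-delay structure the authors identify.

```python
import numpy as np

def residual_correction(obs, raw_fc):
    """Fit residual_t ~ residual_{t-1} on past (observation, forecast) pairs;
    return a corrector for a new raw forecast given the latest residual."""
    r = np.asarray(obs, float) - np.asarray(raw_fc, float)
    a, b = np.polyfit(r[:-1], r[1:], 1)        # linear residual-delay map
    return lambda f_new, last_resid: f_new + a * last_resid + b
```

Whenever the forecast errors are serially dependent, as the residual-delay map reveals, this post-processing step recovers part of the structure the numerical model smoothed away.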

  2. Forecasting Nonlinear Chaotic Time Series with Function Expression Method Based on an Improved Genetic-Simulated Annealing Algorithm

    PubMed Central

    Wang, Jun; Zhou, Bi-hua; Zhou, Shu-dao; Sheng, Zheng

    2015-01-01

    The paper proposes a novel function expression method to forecast chaotic time series, using an improved genetic-simulated annealing (IGSA) algorithm to establish the optimum function expression that describes the behavior of time series. In order to deal with the weakness associated with the genetic algorithm, the proposed algorithm incorporates the simulated annealing operation which has the strong local search ability into the genetic algorithm to enhance the performance of optimization; besides, the fitness function and genetic operators are also improved. Finally, the method is applied to the chaotic time series of Quadratic and Rossler maps for validation. The effect of noise in the chaotic time series is also studied numerically. The numerical results verify that the method can forecast chaotic time series with high precision and effectiveness, and the forecasting precision with certain noise is also satisfactory. It can be concluded that the IGSA algorithm is energy-efficient and superior. PMID:26000011

  3. Forecasting nonlinear chaotic time series with function expression method based on an improved genetic-simulated annealing algorithm.

    PubMed

    Wang, Jun; Zhou, Bi-hua; Zhou, Shu-dao; Sheng, Zheng

    2015-01-01

    The paper proposes a novel function expression method to forecast chaotic time series, using an improved genetic-simulated annealing (IGSA) algorithm to establish the optimum function expression that describes the behavior of time series. In order to deal with the weakness associated with the genetic algorithm, the proposed algorithm incorporates the simulated annealing operation which has the strong local search ability into the genetic algorithm to enhance the performance of optimization; besides, the fitness function and genetic operators are also improved. Finally, the method is applied to the chaotic time series of Quadratic and Rossler maps for validation. The effect of noise in the chaotic time series is also studied numerically. The numerical results verify that the method can forecast chaotic time series with high precision and effectiveness, and the forecasting precision with certain noise is also satisfactory. It can be concluded that the IGSA algorithm is energy-efficient and superior.

  4. Dynamic properties of combustion instability in a lean premixed gas-turbine combustor.

    PubMed

    Gotoda, Hiroshi; Nikimoto, Hiroyuki; Miyano, Takaya; Tachibana, Shigeru

    2011-03-01

    We experimentally investigate the dynamic behavior of the combustion instability in a lean premixed gas-turbine combustor from the viewpoint of nonlinear dynamics. A nonlinear time series analysis in combination with a surrogate data method clearly reveals that as the equivalence ratio increases, the dynamic behavior of the combustion instability undergoes a significant transition from stochastic fluctuation to periodic oscillation through low-dimensional chaotic oscillation. We also show that a nonlinear forecasting method is useful for predicting the short-term dynamic behavior of the combustion instability in a lean premixed gas-turbine combustor, which has not been addressed in the fields of combustion science and physics.
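The surrogate-data step combines easily with any nonlinear statistic: phase-randomised surrogates preserve the power spectrum but destroy nonlinear structure, so a statistic computed on the original series should fall outside the surrogate distribution when the dynamics are nonlinear. The time-asymmetry statistic below is an illustrative choice, not necessarily the one used in the paper.

```python
import numpy as np

def phase_surrogate(x, rng):
    """Surrogate with the same amplitude spectrum as x but random Fourier phases."""
    F = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(F))
    phases[0] = 0.0                              # keep the mean real
    return np.fft.irfft(np.abs(F) * np.exp(1j * phases), n=len(x))

def time_asymmetry(x):
    """Skewness of one-step differences: near zero for time-reversible
    (linear Gaussian) processes, generally nonzero for nonlinear dynamics."""
    d = np.diff(x)
    return np.mean(d ** 3) / np.mean(d ** 2) ** 1.5
```

Ranking the original statistic against, say, 19 surrogates gives a simple nonparametric test at the 5% level.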

  5. Forecasting Non-Stationary Diarrhea, Acute Respiratory Infection, and Malaria Time-Series in Niono, Mali

    PubMed Central

    Medina, Daniel C.; Findley, Sally E.; Guindo, Boubacar; Doumbia, Seydou

    2007-01-01

    Background Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with diarrhea, acute respiratory infection, and malaria. With the increasing awareness that these infectious diseases impose an enormous burden on developing countries, public health programs there could benefit from parsimonious general-purpose forecasting methods to enhance infectious disease intervention. Unfortunately, these disease time-series often i) suffer from non-stationarity; ii) exhibit large inter-annual and seasonal fluctuations; and iii) require disease-specific tailoring of forecasting methods. Methodology/Principal Findings In this longitudinal retrospective (01/1996–06/2004) investigation, diarrhea, acute respiratory infection of the lower tract, and malaria consultation time-series are fitted with a general-purpose econometric method, namely the multiplicative Holt-Winters method, to produce contemporaneous on-line forecasts for the district of Niono, Mali. This method accommodates seasonal as well as inter-annual fluctuations and produces reasonably accurate median 2- and 3-month horizon forecasts for these non-stationary time-series: 92% of the 24 time-series forecasts generated (2 forecast horizons, 3 diseases, and 4 age categories) have mean absolute percentage errors of circa 25%. 
Conclusions/Significance The multiplicative Holt-Winters forecasting method: i) performs well across diseases with dramatically distinct transmission modes and hence is a strong general-purpose forecasting candidate for non-stationary epidemiological time-series; ii) obliquely captures prior non-linear interactions between climate and the aforementioned disease dynamics, thus obviating the need for more complex disease-specific climate-based parametric forecasting methods in the district of Niono; and iii) readily decomposes time-series into seasonal components, thereby potentially assisting with the programming of public health interventions as well as the monitoring of changes in disease dynamics. Therefore, these forecasts could improve infectious disease management in the district of Niono, Mali, and elsewhere in the Sahel. PMID:18030322
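    As an illustration of the method named above, a minimal multiplicative Holt-Winters smoother fits in a few lines. The initialization scheme and smoothing constants below are common textbook choices, not those used in the study, and the synthetic series is hypothetical.

```python
def holt_winters_multiplicative(y, season_len, alpha=0.3, beta=0.05,
                                gamma=0.2, horizon=3):
    """Multiplicative Holt-Winters: (level + trend) scaled by seasonal
    factors. Level/trend/seasonals initialized from the first two seasons
    (a simple common scheme; the paper does not specify its own)."""
    m = season_len
    level = sum(y[:m]) / m
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)
    season = [y[i] / level for i in range(m)]
    for t in range(m, len(y)):
        s = season[t % m]
        last_level = level
        level = alpha * (y[t] / s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] / level) + (1 - gamma) * s
    return [(level + (h + 1) * trend) * season[(len(y) + h) % m]
            for h in range(horizon)]

# Synthetic "consultation" series with a linear trend and a
# multiplicative season of length 4.
base = [100 + 2 * t for t in range(48)]
seas = [1.0, 1.2, 0.8, 1.0] * 12
y = [b * s for b, s in zip(base, seas)]
fc = holt_winters_multiplicative(y, season_len=4, horizon=4)
```

Because both trend and seasonality are smoothed recursively, the method adapts to the inter-annual and seasonal fluctuations the abstract emphasizes without any disease-specific tailoring.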

  6. Forecasting non-stationary diarrhea, acute respiratory infection, and malaria time-series in Niono, Mali.

    PubMed

    Medina, Daniel C; Findley, Sally E; Guindo, Boubacar; Doumbia, Seydou

    2007-11-21

    Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with diarrhea, acute respiratory infection, and malaria. With the increasing awareness that these infectious diseases impose an enormous burden on developing countries, public health programs there could benefit from parsimonious general-purpose forecasting methods to enhance infectious disease intervention. Unfortunately, these disease time-series often i) suffer from non-stationarity; ii) exhibit large inter-annual and seasonal fluctuations; and iii) require disease-specific tailoring of forecasting methods. In this longitudinal retrospective (01/1996-06/2004) investigation, diarrhea, acute respiratory infection of the lower tract, and malaria consultation time-series are fitted with a general-purpose econometric method, namely the multiplicative Holt-Winters method, to produce contemporaneous on-line forecasts for the district of Niono, Mali. This method accommodates seasonal as well as inter-annual fluctuations and produces reasonably accurate median 2- and 3-month horizon forecasts for these non-stationary time-series: 92% of the 24 time-series forecasts generated (2 forecast horizons, 3 diseases, and 4 age categories) have mean absolute percentage errors of circa 25%. 
The multiplicative Holt-Winters forecasting method: i) performs well across diseases with dramatically distinct transmission modes and hence is a strong general-purpose forecasting candidate for non-stationary epidemiological time-series; ii) obliquely captures prior non-linear interactions between climate and the aforementioned disease dynamics, thus obviating the need for more complex disease-specific climate-based parametric forecasting methods in the district of Niono; and iii) readily decomposes time-series into seasonal components, thereby potentially assisting with the programming of public health interventions as well as the monitoring of changes in disease dynamics. Therefore, these forecasts could improve infectious disease management in the district of Niono, Mali, and elsewhere in the Sahel.

  7. Neural net forecasting for geomagnetic activity

    NASA Technical Reports Server (NTRS)

    Hernandez, J. V.; Tajima, T.; Horton, W.

    1993-01-01

    We use neural nets to construct nonlinear models to forecast the AL index given solar wind and interplanetary magnetic field (IMF) data. We follow two approaches: (1) the state space reconstruction approach, which is a nonlinear generalization of autoregressive-moving average models (ARMA) and (2) the nonlinear filter approach, which reduces to a moving average model (MA) in the linear limit. The database used here is that of Bargatze et al. (1985).

  8. A Novel Multilevel-SVD Method to Improve Multistep Ahead Forecasting in Traffic Accidents Domain.

    PubMed

    Barba, Lida; Rodríguez, Nibaldo

    2017-01-01

    A novel method is proposed for decomposing a nonstationary time series into components of low and high frequency. The method is based on the Multilevel Singular Value Decomposition (MSVD) of a Hankel matrix. The decomposition is used to improve the forecasting accuracy of Multiple Input Multiple Output (MIMO) linear and nonlinear models. Three time series from the traffic-accident domain are used; they represent the number of persons injured in traffic accidents in Santiago, Chile. The data were collected continuously by the Chilean Police and sampled weekly from 2000:1 to 2014:12. The performance of MSVD is compared with the low- and high-frequency decomposition of a commonly accepted method based on the Stationary Wavelet Transform (SWT). SWT in conjunction with an Autoregressive model (SWT + MIMO-AR) and SWT in conjunction with an Autoregressive Neural Network (SWT + MIMO-ANN) were evaluated. The empirical results show that the best accuracy was achieved by the forecasting model based on the proposed decomposition method, MSVD, in comparison with the forecasting models based on SWT.
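    The core decomposition step can be sketched as a one-level SVD of a Hankel (trajectory) matrix followed by diagonal averaging, as in singular spectrum analysis; the multilevel refinement of the paper's MSVD is not reproduced here, and the window/rank choices below are illustrative.

```python
import numpy as np

def hankel_lowpass(x, window, rank):
    """Split a series into low- and high-frequency parts: the leading
    `rank` singular components of the Hankel matrix capture the smooth
    part, the remainder forms the high-frequency part."""
    n = len(x)
    H = np.array([x[i:i + window] for i in range(n - window + 1)])
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    H_low = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    # Diagonal averaging maps the rank-reduced matrix back to a series.
    low = np.zeros(n)
    counts = np.zeros(n)
    for i in range(H_low.shape[0]):
        low[i:i + window] += H_low[i]
        counts[i:i + window] += 1
    low /= counts
    return low, x - low

# A slow sinusoid plus a faster one: rank 2 captures the slow component,
# since a single sinusoid contributes (at most) rank 2 to a Hankel matrix.
t = np.arange(200)
x = np.sin(2 * np.pi * t / 50) + 0.3 * np.sin(2 * np.pi * t / 5)
low, high = hankel_lowpass(x, window=25, rank=2)
```

Each component can then be forecast separately by a MIMO model, which is the accuracy-improving step the abstract describes.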

  9. A Novel Multilevel-SVD Method to Improve Multistep Ahead Forecasting in Traffic Accidents Domain

    PubMed Central

    Rodríguez, Nibaldo

    2017-01-01

    A novel method is proposed for decomposing a nonstationary time series into components of low and high frequency. The method is based on the Multilevel Singular Value Decomposition (MSVD) of a Hankel matrix. The decomposition is used to improve the forecasting accuracy of Multiple Input Multiple Output (MIMO) linear and nonlinear models. Three time series from the traffic-accident domain are used; they represent the number of persons injured in traffic accidents in Santiago, Chile. The data were collected continuously by the Chilean Police and sampled weekly from 2000:1 to 2014:12. The performance of MSVD is compared with the low- and high-frequency decomposition of a commonly accepted method based on the Stationary Wavelet Transform (SWT). SWT in conjunction with an Autoregressive model (SWT + MIMO-AR) and SWT in conjunction with an Autoregressive Neural Network (SWT + MIMO-ANN) were evaluated. The empirical results show that the best accuracy was achieved by the forecasting model based on the proposed decomposition method, MSVD, in comparison with the forecasting models based on SWT. PMID:28261267

  10. Hybrid methodology for tuberculosis incidence time-series forecasting based on ARIMA and a NAR neural network.

    PubMed

    Wang, K W; Deng, C; Li, J P; Zhang, Y Y; Li, X Y; Wu, M C

    2017-04-01

    Tuberculosis (TB) affects people globally and is being reconsidered as a serious public health problem in China. Reliable forecasting is useful for the prevention and control of TB. This study proposes a hybrid model combining an autoregressive integrated moving average (ARIMA) model with a nonlinear autoregressive (NAR) neural network for forecasting the incidence of TB from January 2007 to March 2016. Prediction performance was compared between the hybrid model and the ARIMA model. The best-fit hybrid model combined an ARIMA(3,1,0) × (0,1,1)12 model with a NAR neural network with four delays and 12 neurons in the hidden layer. The ARIMA-NAR hybrid model, which exhibited lower mean square error, mean absolute error, and mean absolute percentage error (0.2209, 0.1373, and 0.0406, respectively) in modelling performance, produced more accurate forecasts of TB incidence than the ARIMA model. This study shows that developing and applying the ARIMA-NAR hybrid model is an effective way to fit both the linear and nonlinear patterns of time-series data, and this model could be helpful in the prevention and control of TB.

  11. Spatial nonlinearities: Cascading effects in the earth system

    USGS Publications Warehouse

    Peters, Debra P.C.; Pielke, R.A.; Bestelmeyer, B.T.; Allen, Craig D.; Munson-McGee, Stuart; Havstad, K. M.; Canadell, Josep G.; Pataki, Diane E.; Pitelka, Louis F.

    2006-01-01

    Nonlinear behavior is prevalent in all aspects of the Earth System, including ecological responses to global change (Gallagher and Appenzeller 1999; Steffen et al. 2004). Nonlinear behavior refers to a large, discontinuous change in response to a small change in a driving variable (Rial et al. 2004). In contrast to linear systems where responses are smooth, well-behaved, continuous functions, nonlinear systems often undergo sharp or discontinuous transitions resulting from the crossing of thresholds. These nonlinear responses can result in surprising behavior that makes forecasting difficult (Kaplan and Glass 1995). Given that many system dynamics are nonlinear, it is imperative that conceptual and quantitative tools be developed to increase our understanding of the processes leading to nonlinear behavior in order to determine if forecasting can be improved under future environmental changes (Clark et al. 2001).

  12. Extending nonlinear analysis to short ecological time series.

    PubMed

    Hsieh, Chih-hao; Anderson, Christian; Sugihara, George

    2008-01-01

    Nonlinearity is important and ubiquitous in ecology. Though detectable in principle, nonlinear behavior is often difficult to characterize, analyze, and incorporate mechanistically into models of ecosystem function. One obvious reason is that quantitative nonlinear analysis tools are data intensive (require long time series), and time series in ecology are generally short. Here we demonstrate a useful method that circumvents data limitation and reduces sampling error by combining ecologically similar multispecies time series into one long time series. With this technique, individual ecological time series containing as few as 20 data points can be mined for such important information as (1) significantly improved forecast ability, (2) the presence and location of nonlinearity, and (3) the effective dimensionality (the number of relevant variables) of an ecological system.
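    The nonlinear forecasting machinery behind this approach can be sketched as delay embedding plus simplex-style nearest-neighbour prediction, with delay vectors from several short series pooled into one library. The logistic-map data and parameter choices below are illustrative only, not the paper's ecological data.

```python
import numpy as np

def pooled_library(series_list, E):
    """Pool delay vectors from several short series into one library,
    never embedding across a series boundary."""
    pats, nxt = [], []
    for s in series_list:
        for i in range(len(s) - E):
            pats.append(s[i:i + E])
            nxt.append(s[i + E])
    return np.array(pats), np.array(nxt)

def simplex_forecast(pats, nxt, query, k=3):
    """Weighted nearest-neighbour (simplex-style) one-step forecast."""
    d = np.linalg.norm(pats - np.asarray(query), axis=1)
    idx = np.argsort(d)[:k]
    w = np.exp(-d[idx] / max(d[idx][0], 1e-12))
    return float(np.dot(w, nxt[idx]) / w.sum())

def logistic(x0, n, r=3.8):
    out = [x0]
    for _ in range(n - 1):
        out.append(r * out[-1] * (1.0 - out[-1]))
    return out

# Eight short (25-point) "species" series from the same chaotic dynamics
# form the composite library; a ninth series is held out for testing.
E = 2
train = [logistic(x0, 25) for x0 in
         (0.11, 0.23, 0.31, 0.42, 0.52, 0.66, 0.73, 0.9)]
pats, nxt = pooled_library(train, E)

held_out = logistic(0.37, 30)
errs = [(simplex_forecast(pats, nxt, held_out[i:i + E]) - held_out[i + E]) ** 2
        for i in range(len(held_out) - E)]
mse = float(np.mean(errs))
```

No single 25-point series would populate the state space densely enough for reliable neighbours; pooling dynamically similar series is what makes short-series nonlinear analysis feasible, as the abstract argues.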

  13. Nonlinear time series modeling and forecasting the seismic data of the Hindu Kush region

    NASA Astrophysics Data System (ADS)

    Khan, Muhammad Yousaf; Mittnik, Stefan

    2018-01-01

    In this study, we extended the application of linear and nonlinear time-series models in the field of earthquake seismology and examined the out-of-sample forecast accuracy of linear Autoregressive (AR), Autoregressive Conditional Duration (ACD), Self-Exciting Threshold Autoregressive (SETAR), Threshold Autoregressive (TAR), Logistic Smooth Transition Autoregressive (LSTAR), Additive Autoregressive (AAR), and Artificial Neural Network (ANN) models for seismic data of the Hindu Kush region. We also extended previous studies by using Vector Autoregressive (VAR) and Threshold Vector Autoregressive (TVAR) models and compared their forecasting accuracy with the linear AR model. Unlike previous studies, which typically specify threshold models with an internal threshold variable, we specified these models with external transition variables and compared their out-of-sample forecasting performance with the linear benchmark AR model. The modeling results show that the time series models used in the present study are capable of capturing the dynamic structure present in the seismic data. The point forecast results indicate that the AR model generally outperforms the nonlinear models. However, in some cases, threshold models specified with external threshold variables produce more accurate forecasts, indicating that the specification of threshold time series models is of crucial importance. For raw seismic data, the ACD model does not show improved out-of-sample forecasting performance over the linear AR model. The results indicate that the AR model is the best device for modeling and forecasting the raw seismic data of the Hindu Kush region.
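    A two-regime threshold autoregression of the kind compared in the study can be sketched by fitting separate AR models on either side of a threshold on a transition variable, which, as the abstract emphasizes, may be external to the series. The simulated data, the external driver, and the known threshold are hypothetical.

```python
import numpy as np

def fit_setar(y, z, p, thresh):
    """Two-regime threshold AR: separate least-squares AR(p) fits for
    observations whose transition variable z is below/above `thresh`.
    z may be a lag of y itself (classic SETAR) or an external variable."""
    coefs = {}
    for regime in (0, 1):
        rows, targ = [], []
        for i in range(p, len(y)):
            if (z[i] > thresh) == bool(regime):
                rows.append(y[i - p:i])
                targ.append(y[i])
        X = np.column_stack([np.ones(len(rows)), np.array(rows)])
        coefs[regime], *_ = np.linalg.lstsq(X, np.array(targ), rcond=None)
    return coefs

# Toy data: AR(1) dynamics switching with an external sinusoidal driver.
rng = np.random.default_rng(3)
n = 500
z = np.sin(2 * np.pi * np.arange(n) / 100.0)   # external transition variable
y = np.zeros(n)
for t in range(1, n):
    a = 0.9 if z[t] > 0 else -0.5
    y[t] = a * y[t - 1] + 0.1 * rng.standard_normal()

coefs = fit_setar(y, z, p=1, thresh=0.0)
# coefs[1][1] should recover ~0.9 and coefs[0][1] ~ -0.5.
```

In practice the threshold itself is estimated (e.g. by grid search over the regression error), which is where the specification issues the abstract mentions become crucial.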

  14. Spatiotemporal drought forecasting using nonlinear models

    NASA Astrophysics Data System (ADS)

    Vasiliades, Lampros; Loukas, Athanasios

    2010-05-01

    Spatiotemporal data mining is the extraction of unknown and implicit knowledge, structures, spatiotemporal relationships, or patterns not explicitly stored in spatiotemporal databases. As one data mining technique, forecasting is widely used to predict the unknown future based on patterns hidden in current and past data. To achieve spatiotemporal forecasting, some mature analysis tools, e.g., time series and spatial statistics, are extended to the spatial and temporal dimensions, respectively. Drought forecasting plays an important role in the planning and management of natural resources and water resource systems in a river basin. Early and timely forecasting of a drought event can help in taking proactive measures and setting out drought mitigation strategies to alleviate the impacts of drought. Despite the widespread application of nonlinear mathematical models, comparative studies on spatiotemporal drought forecasting using different models remain a huge task for modellers. This study uses a promising approach, the Gamma Test (GT), to select the input variables and the training data length, so that the trial-and-error workload can be greatly reduced. The GT makes it possible to quickly estimate, prior to model construction, the best mean squared error that can be achieved by a smooth model on any unseen data for a given selection of inputs. The GT is applied to forecast droughts using monthly Standardized Precipitation Index (SPI) time series at multiple timescales at several precipitation stations in the Pinios river basin, Thessaly region, Greece. Several nonlinear models have been developed efficiently, with the aid of the GT, for 1-month up to 12-month ahead forecasting. Several temporal and spatial statistical indices were considered for the performance evaluation of the models. 
The predicted results show reasonably good agreement with the actual data for short lead times, whereas the forecasting accuracy decreases with increase in lead time. Finally, the developed nonlinear models could be used in an early warning system for risk and decision analyses at the study area.
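    The Gamma Test used above for input selection can be sketched as a near-neighbour statistic: for each k, regress half the mean squared output difference of k-th nearest neighbours against their mean squared input-space distance; the regression intercept estimates the best achievable (noise) mean squared error. The sample data below are synthetic, and this is a bare-bones sketch of the statistic, not a full GT implementation.

```python
import numpy as np

def gamma_test(X, y, kmax=10):
    """Gamma Test sketch: intercept of the gamma-vs-delta regression,
    where delta_k is the mean squared k-th-neighbour input distance and
    gamma_k is half the mean squared output difference of those pairs."""
    X = np.atleast_2d(X)
    n = len(y)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(D, np.inf)            # exclude self-matches
    order = np.argsort(D, axis=1)
    deltas, gammas = [], []
    for k in range(kmax):
        nb = order[:, k]                   # index of (k+1)-th neighbour
        deltas.append(np.mean(D[np.arange(n), nb] ** 2))
        gammas.append(0.5 * np.mean((y[nb] - y) ** 2))
    slope, intercept = np.polyfit(deltas, gammas, 1)
    return intercept

# Smooth function of one input plus noise of variance 0.05**2 = 0.0025;
# the Gamma statistic should land near that noise variance.
rng = np.random.default_rng(4)
X = rng.uniform(-1.0, 1.0, (400, 1))
y = np.sin(2.0 * X[:, 0]) + 0.05 * rng.standard_normal(400)
gamma = gamma_test(X, y)
```

Comparing the intercept across candidate input sets is what lets the GT rank inputs before any model is trained, which is the trial-and-error saving the abstract describes.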

  15. Forecasting daily streamflow using online sequential extreme learning machines

    NASA Astrophysics Data System (ADS)

    Lima, Aranildo R.; Cannon, Alex J.; Hsieh, William W.

    2016-06-01

    While nonlinear machine learning methods have been widely used in environmental forecasting, in situations where new data arrive continually, the need for frequent model updates can become cumbersome and computationally costly. To alleviate this problem, an online sequential learning algorithm for single-hidden-layer feedforward neural networks - the online sequential extreme learning machine (OSELM) - can be updated automatically and inexpensively as new data arrive (and the new data can then be discarded). OSELM was applied to forecast daily streamflow at two small watersheds in British Columbia, Canada, at lead times of 1-3 days. Predictors were weather forecast data generated by the NOAA Global Ensemble Forecasting System (GEFS) and local hydro-meteorological observations. OSELM forecasts were tested with daily, monthly, or yearly model updates. More frequent updating gave smaller forecast errors, including errors for data above the 90th percentile. Larger datasets used in the initial training of OSELM helped to find better parameters (number of hidden nodes) for the model, yielding better predictions. With online sequential multiple linear regression (OSMLR) as the benchmark, we concluded that OSELM is an attractive approach, as it easily outperformed OSMLR in forecast accuracy.

  16. Effects of using a posteriori methods for the conservation of integral invariants. [for weather forecasting

    NASA Technical Reports Server (NTRS)

    Takacs, Lawrence L.

    1988-01-01

    The nature and effect of using a posteriori adjustments to nonconservative finite-difference schemes to enforce integral invariants of the corresponding analytic system are examined. The method of a posteriori integral constraint restoration is analyzed for the case of linear advection, and the harmonic response associated with the a posteriori adjustments is examined in detail. The conservative properties of the shallow water system are reviewed, and the constraint restoration algorithm applied to the shallow water equations is described. A comparison is made between forecasts obtained using implicit and a posteriori methods for the conservation of mass, energy, and potential enstrophy in the complete nonlinear shallow-water system.

  17. Spatial-temporal forecasting the sunspot diagram

    NASA Astrophysics Data System (ADS)

    Covas, Eurico

    2017-09-01

    Aims: We attempt to forecast the Sun's sunspot butterfly diagram in both space (i.e., in latitude) and time, instead of the usual one-dimensional time series forecasts prevalent in the scientific literature. Methods: We use a prediction method based on the non-linear embedding of data series in high dimensions. We use this method to forecast both in latitude (space) and in time, using a full spatial-temporal series of the sunspot diagram from 1874 to 2015. Results: The analysis of the results shows that it is indeed possible to reconstruct the overall shape and amplitude of the spatial-temporal pattern of sunspots, but that the method in its current form does not have real predictive power. We also apply a metric called structural similarity to compare the forecasted and the observed butterfly cycles, showing that this metric can be a useful addition to the usual root mean square error metric when analysing the efficiency of different prediction methods. Conclusions: We conclude that it is in principle possible to reconstruct the full sunspot butterfly diagram for at least one cycle using this approach, and that this method and others should be explored, since looking only at metrics such as sunspot count or total sunspot area coverage is too reductive given the spatial-temporal dynamical complexity of the sunspot butterfly diagram. However, more data and/or an improved approach is probably necessary to achieve true predictive power.
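    The structural-similarity metric mentioned above can be sketched in its global (single-window) form; library implementations such as scikit-image's compute it over local sliding windows instead. The synthetic field below is a hypothetical stand-in for a butterfly diagram.

```python
import numpy as np

def ssim_global(x, y, c1=1e-4, c2=9e-4):
    """Global (single-window) structural similarity of two 2-D fields:
    a luminance term in the means times a structure term in the
    (co)variances, stabilized by the small constants c1, c2."""
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return float((2 * mx * my + c1) * (2 * cov + c2) /
                 ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2)))

# Latitude envelope times a cyclic activity pattern, compared with a
# noisy copy and with a spatially scrambled copy.
rng = np.random.default_rng(0)
field = np.outer(np.hanning(60), np.sin(np.linspace(0, 3 * np.pi, 120)) ** 2)
noisy = field + 0.1 * rng.standard_normal(field.shape)
scrambled = rng.permutation(field.ravel()).reshape(field.shape)

s_self = ssim_global(field, field)            # identical fields -> 1
s_noisy = ssim_global(field, noisy)           # high: structure preserved
s_scrambled = ssim_global(field, scrambled)   # near 0: structure destroyed
```

Unlike RMSE, which the scrambled copy can match point-for-point in distribution, SSIM penalizes the loss of spatial structure, which is why it complements RMSE for spatial-temporal forecasts.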

  18. Nonlinear forecasting analysis of inflation-deflation patterns of an active caldera (Campi Flegrei, Italy)

    USGS Publications Warehouse

    Cortini, M.; Barton, C.C.

    1993-01-01

    The ground level in Pozzuoli, Italy, at the center of the Campi Flegrei caldera, has been monitored by tide gauges. Previous work suggests that the dynamics of the Campi Flegrei system, as reconstructed from the tide gauge record, is chaotic and low dimensional. According to this suggestion, in spite of the complexity of the system, at a time scale of days the ground motion is driven by a deterministic mechanism with few degrees of freedom; however, the interactions of the system may never be describable in full detail. New analysis of the tide gauge record using Nonlinear Forecasting confirms low-dimensional chaos in the ground elevation record at Campi Flegrei and suggests that Nonlinear Forecasting could be a useful tool in volcanic surveillance. -from Authors

  19. Streamflow Forecasting Using Neuro-Fuzzy Inference System

    NASA Astrophysics Data System (ADS)

    Nanduri, U. V.; Swain, P. C.

    2005-12-01

    The prediction of flow into a reservoir is fundamental in water resources planning and management. The need for timely and accurate streamflow forecasting is widely recognized and emphasized in the water resources community. Real-time forecasts of natural inflows to reservoirs are of particular interest for operation and scheduling. The physical system of the river basin that takes rainfall as input and produces runoff is highly nonlinear, complicated, and very difficult to fully comprehend. The system is influenced by a large number of factors and variables, and its large spatial extent introduces uncertainty into the hydrologic information. A variety of methods have been proposed for forecasting reservoir inflows, including conceptual (physical) and empirical (statistical) models (WMO 1994), but none can be considered a uniquely superior model (Shamseldin 1997). Owing to the difficulty of formulating reasonable non-linear watershed models, recent attempts have resorted to the Neural Network (NN) approach for complex hydrologic modeling, and in recent years the use of soft computing in hydrological forecasting has been gaining ground. The relatively new soft computing technique of the Adaptive Neuro-Fuzzy Inference System (ANFIS), developed by Jang (1993), is able to handle the non-linearity, uncertainty, and vagueness embedded in the system. It is a judicious combination of neural networks and fuzzy systems: the embedded NN is efficient in learning and generalization, so ANFIS can learn and generalize highly nonlinear and uncertain phenomena, while the fuzzy system mimics the cognitive capability of the human brain. Hence, ANFIS can learn the complicated processes involved in the basin and correlate the precipitation to the corresponding discharge. In the present study, one-step-ahead forecasts are made for ten-daily flows, which are mostly required for short-term operational planning of multipurpose reservoirs. 
A Neuro-Fuzzy model is developed to forecast ten-daily flows into the Hirakud reservoir on the River Mahanadi in the state of Orissa, India. Correlation analysis is carried out to identify the variables most influential on the ten-daily flow at Hirakud. Based on this analysis, four variables, namely flow during the previous time period (ql1), rainfall during the previous two time periods (rl1 and rl2), and flow during the same period in the previous year (qpy), are identified as the most influential variables for forecasting the ten-daily flow. Performance measures such as Root Mean Square Error (RMSE), Correlation Coefficient (CORR), and coefficient of efficiency (R2) are computed for the training and testing phases of the model to evaluate its performance. The results indicate that the ten-daily forecasting model predicts high and medium flows with reasonable accuracy, whereas forecasts of low flows are less accurate. REFERENCES Jang, J.S.R. (1993). "ANFIS: Adaptive-network-based fuzzy inference system." IEEE Trans. on Systems, Man and Cybernetics, 23(3), 665-685. Shamseldin, A.Y. (1997). "Application of a neural network technique to rainfall-runoff modeling." Journal of Hydrology, 199, 272-294. World Meteorological Organization (1975). Intercomparison of conceptual models used in operational hydrological forecasting. Technical Report No. 429, Geneva, Switzerland.

  20. Water Stage Forecasting in Tidal streams during High Water Using EEMD

    NASA Astrophysics Data System (ADS)

    Chen, Yen-Chang; Kao, Su-Pai; Su, Pei-Yi

    2017-04-01

    Many factors affect water stages in tidal streams: not only ocean waves but also stream flow. During high water, two of the most important factors affecting water stages in tidal streams are floods and tides. However, the hydrological processes in tidal streams during high water are nonlinear and nonstationary, and the conventional methods used for forecasting water stages in tidal streams are very complicated. This explains why accurately forecasting water stages in tidal streams, especially during high water, is always a difficult task. This study makes use of Ensemble Empirical Mode Decomposition (EEMD) to analyze the water stages in tidal streams. One advantage of EEMD is that it can be used to analyze nonlinear and nonstationary data. EEMD divides the water stage into several intrinsic mode functions (IMFs) and a residual, while the physical meaning is retained throughout the process. By comparing each IMF's frequency with the tidal frequency, it is possible to identify whether that IMF is affected by tides. The IMFs are then separated into two groups, affected by tides or not, and the IMFs in each group are summed to form a factor. The water stages in tidal streams are therefore described by two factors, a tidal factor and a flood factor. Finally, regression analysis is used to establish the relationship between the factors at the gaging stations in the tidal stream. Data from 15 typhoon periods on the Tanshui River, whose downstream reach lies in an estuary, are used to illustrate the accuracy and reliability of the proposed method. The results show that this simple but reliable method is capable of forecasting water stages in tidal streams.
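    EEMD itself requires an ensemble sifting implementation (e.g. the PyEMD package), so the sketch below covers only the grouping step: classifying already-computed IMFs as tidal or flood components by comparing each IMF's dominant FFT frequency with the semidiurnal tidal frequency. The "IMFs" here are synthetic stand-ins, and the tolerance is an assumed parameter.

```python
import numpy as np

def dominant_freq(imf, dt):
    """Dominant frequency of an IMF via the FFT magnitude peak
    (in cycles per unit time)."""
    spec = np.abs(np.fft.rfft(imf - imf.mean()))
    freqs = np.fft.rfftfreq(len(imf), dt)
    return freqs[np.argmax(spec)]

def group_imfs(imfs, dt, tidal_freq, tol=0.3):
    """Sum IMFs into a tidal factor and a flood factor: an IMF whose
    dominant frequency lies within +/- tol (relative) of the tidal
    frequency joins the tidal group."""
    tide = np.zeros_like(imfs[0])
    flood = np.zeros_like(imfs[0])
    for imf in imfs:
        if abs(dominant_freq(imf, dt) - tidal_freq) <= tol * tidal_freq:
            tide += imf
        else:
            flood += imf
    return tide, flood

# Synthetic stand-ins for EEMD output: a semidiurnal tide (~12.4 h
# period) and a slow flood wave, sampled hourly over 10 days.
dt = 1.0                                # hours
t = np.arange(240.0)
tidal_freq = 1.0 / 12.4                 # cycles per hour
imf_tide = 0.8 * np.sin(2 * np.pi * tidal_freq * t)
imf_flood = 1.5 * np.exp(-((t - 120.0) ** 2) / (2 * 30.0 ** 2))
tide, flood = group_imfs([imf_tide, imf_flood], dt, tidal_freq)
```

The two summed factors then serve as the regressors linking gaging stations, as described above.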

  1. A morphological perceptron with gradient-based learning for Brazilian stock market forecasting.

    PubMed

    Araújo, Ricardo de A

    2012-04-01

    Several linear and non-linear techniques have been proposed to solve the stock market forecasting problem. However, all these techniques share a limitation known as the random walk dilemma (RWD): forecasts generated by arbitrary models have a characteristic one-step-ahead delay with respect to the time series values, so that there is a time-phase distortion in the reconstruction of stock market phenomena. In this paper, we propose a suitable model inspired by concepts from mathematical morphology (MM) and lattice theory (LT), generically called the increasing morphological perceptron (IMP). We also present a gradient steepest-descent method to design the proposed IMP, based on ideas from the back-propagation (BP) algorithm and using a systematic approach to overcome the non-differentiability of morphological operations. The learning process includes a procedure to overcome the RWD: an automatic correction step geared toward eliminating the time-phase distortions that occur in stock market phenomena. Furthermore, an experimental analysis is conducted with the IMP using four complex non-linear time series forecasting problems from the Brazilian stock market. Additionally, two natural-phenomena time series are used to assess the forecasting performance of the proposed IMP on non-financial time series. Finally, the results obtained are discussed and compared to results found with models recently proposed in the literature. Copyright © 2011 Elsevier Ltd. All rights reserved.
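    The morphological ingredients of such a perceptron can be sketched as max-plus (dilation) and min-plus (erosion) neurons; the convex combination below is one common formulation, and the paper's gradient-based training and RWD correction are not reproduced here.

```python
import numpy as np

def dilation_neuron(x, w):
    """Morphological dilation neuron: a max-plus 'product' replaces the
    usual weighted sum. Note max is non-differentiable, which is the
    training obstacle the paper's systematic approach addresses."""
    return float(np.max(x + w))

def erosion_neuron(x, m):
    """Dual erosion neuron: min-plus."""
    return float(np.min(x + m))

def morphological_perceptron(x, w, m, lam=0.5):
    """Convex mix of dilation and erosion outputs (a simplified sketch
    of an increasing morphological perceptron, not the paper's IMP)."""
    return lam * dilation_neuron(x, w) + (1 - lam) * erosion_neuron(x, m)

x = np.array([1.0, 3.0, 2.0])   # input pattern (e.g. lagged prices)
w = np.array([0.5, -1.0, 0.0])  # dilation weights
m = np.array([0.0, 1.0, -0.5])  # erosion weights
out = morphological_perceptron(x, w, m)
```

Because max and min are increasing lattice operations, the unit's output is monotone in its inputs, which is the "increasing" property in the model's name.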

  2. Nonlinear data assimilation for the regional modeling of maximum ozone values.

    PubMed

    Božnar, Marija Zlata; Grašič, Boštjan; Mlakar, Primož; Gradišar, Dejan; Kocijan, Juš

    2017-11-01

    We present a new method of data assimilation aimed at correcting forecasts of the maximum values of ozone in regional photo-chemical models over complex terrain using multilayer perceptron artificial neural networks. Until now, models of this type have been used as single models for one location when forecasting concentrations of air pollutants. We propose a method for constructing a more ambitious model: a single model that can be used at several locations, because it is spatially transferable and valid over the whole 2D domain. To achieve this goal, we introduce three novel ideas. The new method improves correlation at measurement station locations by 10% on average, and by approximately 5% elsewhere.

  3. Linear dynamical modes as new variables for data-driven ENSO forecast

    NASA Astrophysics Data System (ADS)

    Gavrilov, Andrey; Seleznev, Aleksei; Mukhin, Dmitry; Loskutov, Evgeny; Feigin, Alexander; Kurths, Juergen

    2018-05-01

    A new data-driven model for analysis and prediction of spatially distributed time series is proposed. The model is based on a linear dynamical mode (LDM) decomposition of the observed data, which is derived from a recently developed nonlinear dimensionality reduction approach. The key point of this approach is its ability to take into account simple dynamical properties of the observed system by revealing the system's dominant time scales. The LDMs are used as new variables for the empirical construction of a nonlinear stochastic evolution operator. The method is applied to the sea surface temperature anomaly field in the tropical belt, where the El Niño Southern Oscillation (ENSO) is the main mode of variability. The advantage of LDMs over the traditionally used empirical orthogonal function decomposition is demonstrated for these data. Specifically, it is shown that the new model has competitive ENSO forecast skill in comparison with other existing ENSO models.

  4. A simple new filter for nonlinear high-dimensional data assimilation

    NASA Astrophysics Data System (ADS)

    Tödter, Julian; Kirchgessner, Paul; Ahrens, Bodo

    2015-04-01

    The ensemble Kalman filter (EnKF) and its deterministic variants, mostly square root filters such as the ensemble transform Kalman filter (ETKF), represent a popular alternative to variational data assimilation schemes and are applied in a wide range of operational and research activities. Their forecast step employs an ensemble integration that fully respects the nonlinear nature of the analyzed system. In the analysis step, they implicitly assume the prior state and observation errors to be Gaussian. Consequently, in nonlinear systems, the analysis mean and covariance are biased, and these filters remain suboptimal. In contrast, the fully nonlinear, non-Gaussian particle filter (PF) only relies on Bayes' theorem, which guarantees an exact asymptotic behavior, but because of the so-called curse of dimensionality it is exposed to weight collapse. This work shows how to obtain a new analysis ensemble whose mean and covariance exactly match the Bayesian estimates. This is achieved by a deterministic matrix square root transformation of the forecast ensemble, and subsequently a suitable random rotation that significantly contributes to filter stability while preserving the required second-order statistics. The forecast step remains as in the ETKF. The proposed algorithm, which is fairly easy to implement and computationally efficient, is referred to as the nonlinear ensemble transform filter (NETF). The properties and performance of the proposed algorithm are investigated via a set of Lorenz experiments. They indicate that such a filter formulation can increase the analysis quality, even for relatively small ensemble sizes, compared to other ensemble filters in nonlinear, non-Gaussian scenarios. Furthermore, localization enhances the potential applicability of this PF-inspired scheme in larger-dimensional systems. Finally, the novel algorithm is coupled to a large-scale ocean general circulation model. 
    The NETF is stable, behaves reasonably and shows good performance with a realistic ensemble size. The results confirm that, in principle, it can be applied successfully, and as simply as the ETKF, in high-dimensional problems without further modifications of the algorithm, even though it is only based on the particle weights. This demonstrates that the suggested method constitutes a useful filter for nonlinear, high-dimensional data assimilation, and is able to overcome the curse of dimensionality even in deterministic systems.
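The core of the analysis step described above can be sketched in a few lines of NumPy. This is an illustrative reduction, not the authors' implementation: the ensemble, observation operator `H`, and error covariance `R` are invented inputs, and the mean-preserving random rotation that the NETF adds for filter stability is omitted for brevity.

```python
import numpy as np

def netf_analysis(Xf, y, H, R):
    """One NETF-style analysis step (sketch). Xf: (n, m) forecast ensemble,
    y: observation vector, H: observation operator, R: obs error covariance."""
    n, m = Xf.shape
    innov = y[:, None] - H @ Xf                        # innovations per member
    logw = -0.5 * np.sum(innov * np.linalg.solve(R, innov), axis=0)
    w = np.exp(logw - logw.max())
    w /= w.sum()                                       # normalized particle weights
    xa = Xf @ w                                        # exact Bayesian posterior mean
    A = Xf - Xf.mean(axis=1, keepdims=True)            # forecast perturbations
    # Symmetric square root of m*(diag(w) - w w^T): the transformed
    # perturbations then reproduce the Bayesian posterior covariance.
    M = m * (np.diag(w) - np.outer(w, w))
    vals, vecs = np.linalg.eigh(M)
    T = vecs @ np.diag(np.sqrt(np.clip(vals, 0.0, None))) @ vecs.T
    # (The full NETF applies a mean-preserving random rotation to T here.)
    return xa[:, None] + A @ T
```

Because the column vector of ones lies in the null space of the transform matrix, the analysis ensemble mean equals the weighted Bayesian mean by construction.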

  5. Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion

    NASA Astrophysics Data System (ADS)

    Li, Z.; Ghaith, M.

    2017-12-01

    Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov chain Monte Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy to complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner, compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
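As a toy illustration of the expansion idea (not the authors' multivariate hydrological setup), a one-dimensional chaos expansion of a model output in probabilists' Hermite polynomials of a standard normal parameter can be fitted by collocation and least squares; the model function, expansion order, and collocation count below are invented for the example.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def fit_pce(model, order=3, n_colloc=50, seed=0):
    """Fit a 1-D chaos expansion: model(theta) with theta ~ N(0,1).
    Returns coefficients of probabilists' Hermite polynomials He_0..He_order."""
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(n_colloc)        # collocation points, standard-normal space
    V = hermevander(xi, order)                # design matrix, columns He_k(xi)
    coef, *_ = np.linalg.lstsq(V, model(xi), rcond=None)
    return coef

def pce_eval(coef, xi):
    """Evaluate the fitted expansion at new parameter values."""
    return hermevander(np.atleast_1d(xi), len(coef) - 1) @ coef
```

A convenient by-product: because every He_k with k > 0 has zero mean under N(0,1), the leading coefficient is directly the mean of the model output.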

  6. Future mission studies: Forecasting solar flux directly from its chaotic time series

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.

    1991-01-01

    The mathematical structure of the programs written to construct a nonlinear predictive model to forecast solar flux directly from its time series, without reference to any underlying solar physics, is presented. The method and the programs are written so that the same technique can be applied to forecast other chaotic time series, such as geomagnetic data, attitude and orbit data, and even financial indexes and stock market data. Perhaps the most important application of this technique to flight dynamics is to model the Goddard Trajectory Determination System (GTDS) output of residuals between the observed position of a spacecraft and the position calculated with no drag (drag flag = off). This would result in a new model of drag working directly from observed data.
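The forecasting scheme this record describes (reconstruct a state space from the series, then predict locally) can be illustrated with a minimal local-constant predictor. This is a generic sketch of the technique, not the paper's programs; the embedding dimension, delay, neighbor count, and test signal are arbitrary choices for the example.

```python
import numpy as np

def embed(x, dim, tau):
    """Time-delay embedding of a scalar series into dim-dimensional vectors."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def nn_forecast(x, dim=3, tau=1, k=4):
    """Predict the next value by averaging the successors of the k nearest
    delay vectors to the current one (local-constant predictor)."""
    E = embed(x, dim, tau)
    query, library = E[-1], E[:-1]               # last vector has no known successor
    d = np.linalg.norm(library - query, axis=1)
    idx = np.argsort(d)[:k]
    successors = x[idx + (dim - 1) * tau + 1]    # value following each neighbor
    return successors.mean()
```

Nothing in the predictor refers to the physics generating the series, which is the point: the same code applies unchanged to solar flux, geomagnetic indices, or financial data.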

  7. Travel Demand Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Southworth, Frank; Garrow, Dr. Laurie

    This chapter describes the principal types of both passenger and freight demand models in use today, providing a brief history of model development supported by references to a number of popular texts on the subject, and directing the reader to papers covering some of the more recent technical developments in the area. Over the past half century a variety of methods have been used to estimate and forecast travel demands, drawing concepts from economic/utility maximization theory, transportation system optimization and spatial interaction theory, using and often combining solution techniques as varied as Box-Jenkins methods, non-linear multivariate regression, non-linear mathematical programming, and agent-based microsimulation.

  8. Real-time Mainshock Forecast by Statistical Discrimination of Foreshock Clusters

    NASA Astrophysics Data System (ADS)

    Nomura, S.; Ogata, Y.

    2016-12-01

    Foreshock discrimination is one of the most effective ways to make short-term forecasts of large main shocks. Though many large earthquakes are accompanied by foreshocks, discriminating these from the enormous number of small earthquakes is difficult, and only probabilistic evaluation from their spatio-temporal features and magnitude evolution may be available. Logistic regression is the statistical learning method best suited to such binary pattern recognition problems, where estimates of the a-posteriori probability of class membership are required. Statistical learning methods can keep learning discriminating features from an updating catalog and give probabilistic recognition for forecasts in real time. We estimated a non-linear function of foreshock proportion by smooth spline bases and evaluated the possibility of foreshocks by the logit function. In this study, we classified foreshocks from the earthquake catalog of the Japan Meteorological Agency by single-link clustering methods and learned spatial and temporal features of foreshocks by probability density ratio estimation. We use the epicentral locations, time spans and differences in magnitude for learning and forecasting. Magnitudes of main shocks are also predicted by incorporating b-values into our method. We discuss the spatial pattern of foreshocks from the classifier composed by our model. We also implement a back test to validate the predictive performance of the model on this catalog.
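A minimal version of the logistic-regression step (without the spline bases or the catalog features the authors use) might look like this; the two synthetic features merely stand in for cluster attributes such as magnitude difference and epicentral spread.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=5000):
    """Logistic regression by gradient descent on the log loss:
    models P(cluster is a foreshock sequence | features)."""
    X1 = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))        # logistic (logit-link) response
        w -= lr * X1.T @ (p - y) / len(y)        # gradient of the log loss
    return w

def predict_proba(w, X):
    """A-posteriori class probabilities for new feature vectors."""
    X1 = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-X1 @ w))
```

Because the output is a calibrated probability rather than a hard label, it slots directly into a probabilistic forecast.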

  9. Artificial intelligence based models for stream-flow forecasting: 2000-2015

    NASA Astrophysics Data System (ADS)

    Yaseen, Zaher Mundher; El-shafie, Ahmed; Jaafar, Othman; Afan, Haitham Abdulmohsin; Sayl, Khamis Naba

    2015-11-01

    The use of Artificial Intelligence (AI) has increased since the middle of the 20th century, as seen in its application to a wide range of engineering and science problems. The last two decades, for example, have seen a dramatic increase in the development and application of various types of AI approaches for stream-flow forecasting. Generally speaking, AI has exhibited significant progress in forecasting and modeling non-linear hydrological applications and in capturing the noise complexity in the dataset. This paper explores the state-of-the-art application of AI in stream-flow forecasting, focusing on data-driven AI approaches and the advantages of complementary models, surveying the literature and possible future applications in modeling and forecasting stream-flow. The review also identifies the major challenges and opportunities for prospective research, including a new scheme for modeling the inflow, a novel method for preprocessing time series frequency based on Fast Orthogonal Search (FOS) techniques, and Swarm Intelligence (SI) as an optimization approach.

  10. Solar flux forecasting using mutual information with an optimal delay

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.; Conway, D.; Rokni, M.; Sperling, R.; Roszman, L.; Cooley, J.

    1993-01-01

    Solar flux F(sub 10.7) directly affects the atmospheric density, thereby changing the lifetime and prediction of satellite orbits. For this reason, accurate forecasting of F(sub 10.7) is crucial for orbit determination of spacecraft. Our attempts to model and forecast F(sub 10.7) uncovered highly entangled dynamics. We concluded that the general lack of predictability in solar activity arises from its nonlinear nature. Nonlinear dynamics allow us to predict F(sub 10.7) more accurately than is possible using stochastic methods for time scales shorter than a characteristic horizon, and with about the same accuracy as using stochastic techniques when the forecasted data exceed this horizon. The forecast horizon is a function of two dynamical invariants: the attractor dimension and the Lyapunov exponent. In recent years, estimation of the attractor dimension reconstructed from a time series has become an important tool in data analysis. In calculating the invariants of the system, the first necessary step is the reconstruction of the attractor for the system from the time-delayed values of the time series. The choice of the time delay is critical for this reconstruction. For an infinite amount of noise-free data, the time delay can, in principle, be chosen almost arbitrarily. However, the quality of the phase portraits produced using the time-delay technique is determined by the value chosen for the delay time. Fraser and Swinney have shown that a good choice for this time delay is the one suggested by Shaw, which uses the first local minimum of the mutual information rather than the autocorrelation function to determine the time delay. This paper presents a refinement of this criterion and applies the refined technique to solar flux data to produce a forecast of the solar activity.
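The Fraser-Swinney criterion discussed above (choose the delay at the first local minimum of the mutual information, rather than from the autocorrelation) is straightforward to sketch with a histogram MI estimator; the bin count and lag range below are arbitrary illustration choices, not values from the paper.

```python
import numpy as np

def mutual_information(x, lag, bins=16):
    """Histogram estimate of the mutual information between x(t) and x(t+lag)."""
    a, b = x[:-lag], x[lag:]
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy /= pxy.sum()                              # joint distribution
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)     # marginals
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

def first_minimum_delay(x, max_lag=50):
    """First local minimum of mutual information vs lag (Fraser-Swinney choice)."""
    mi = [mutual_information(x, lag) for lag in range(1, max_lag + 1)]
    for i in range(1, len(mi) - 1):
        if mi[i] < mi[i - 1] and mi[i] < mi[i + 1]:
            return i + 1                          # lags start at 1
    return max_lag
```

The returned lag is then used as the delay for the phase-space reconstruction.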

  11. Precipitation and floodiness: forecasts of flood hazard at the regional scale

    NASA Astrophysics Data System (ADS)

    Stephens, Liz; Day, Jonny; Pappenberger, Florian; Cloke, Hannah

    2016-04-01

    In 2008, a seasonal forecast of an increased likelihood of above-normal rainfall in West Africa led the Red Cross to take early humanitarian action (such as prepositioning of relief items) on the basis that this forecast implied heightened flood risk. However, there are a number of factors that lead to non-linearity between precipitation anomalies and flood hazard, so in this presentation we use a recently developed global-scale hydrological model driven by the ERA-Interim/Land precipitation reanalysis (1980-2010) to quantify this non-linearity. Using these data, we introduce the concept of floodiness to measure the incidence of floods over a large area, and quantify the link between monthly precipitation, river discharge and floodiness anomalies. Our analysis shows that floodiness is not well correlated with precipitation, demonstrating the problem of using seasonal precipitation forecasts as a proxy for forecasting flood hazard. This analysis demonstrates the value of developing hydrometeorological forecasts of floodiness for decision-makers. As a result, we are now working with the European Centre for Medium-Range Weather Forecasts and the Joint Research Centre, as partners of the operational Global Flood Awareness System (GloFAS), to implement floodiness forecasts in real-time.

  12. Artificial Neural Network Models for Long Lead Streamflow Forecasts using Climate Information

    NASA Astrophysics Data System (ADS)

    Kumar, J.; Devineni, N.

    2007-12-01

    Information on season-ahead streamflow forecasts is very beneficial for the operation and management of water supply systems. Daily streamflow conditions at any particular reservoir primarily depend on atmospheric and land surface conditions, including soil moisture and snow pack. On the other hand, recent studies suggest that developing long lead streamflow forecasts (3 months ahead) typically depends on exogenous climatic conditions, particularly Sea Surface Temperature (SST) conditions in the tropical oceans. Examples of such oceanic variables are the El Nino Southern Oscillation (ENSO) and the Pacific Decadal Oscillation (PDO). Identification of the conditions that influence the moisture transport into a given basin poses many challenges, given the nonlinear dependency between the predictors (SST) and the predictand (streamflow). In this study, we apply both linear and nonlinear dependency measures to identify the predictors that influence the winter flows into the Neuse basin. The predictor identification approach adopted here uses measures ranging from simple correlation coefficients to Spearman rank correlations for detecting nonlinear dependency. All these dependency measures are employed with a lag-3 time series of the high flow season (January-February-March) using 75 years (1928-2002) of streamflows recorded into Falls Lake, Neuse River Basin. Developing streamflow forecasts contingent on these exogenous predictors will play an important role towards improved water supply planning and management. Recently, soft computing techniques such as artificial neural networks (ANNs) have provided an alternative method to solve complex problems efficiently. ANNs are data-driven models which train on the examples given to them, function as universal approximators, and are nonlinear in nature. This paper presents a study aimed at using climatic predictors for 3-month lead time streamflow forecasts. ANN models representing the physical process of the system are developed between the identified predictors and the predictand. The predictors used are the scores of a Principal Component Analysis (PCA). The models were tested and validated. Feed-forward multi-layer perceptron (MLP) type neural networks trained using the back-propagation algorithm are employed in the current study. The performance of the ANN-model forecasts is evaluated using measures such as the correlation coefficient and root mean square error (RMSE). The preliminary results show that ANNs can efficiently forecast long lead time streamflows using climatic predictors.
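A bare-bones version of the feed-forward MLP with back-propagation described above might be sketched as follows; it is trained here on a synthetic nonlinear predictor-predictand relationship rather than the Neuse basin data, and the layer size and learning rate are illustrative.

```python
import numpy as np

def train_mlp(X, y, hidden=8, lr=0.05, epochs=8000, seed=0):
    """Single-hidden-layer perceptron trained by full-batch backpropagation
    on a mean-squared-error loss. Returns the learned parameters."""
    rng = np.random.default_rng(seed)
    W1 = rng.standard_normal((X.shape[1], hidden)) * 0.5
    b1 = np.zeros(hidden)
    W2 = rng.standard_normal(hidden) * 0.5
    b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)             # hidden-layer activations
        pred = h @ W2 + b2                   # linear output unit
        err = pred - y
        gW2 = h.T @ err / len(y)             # output-layer gradients
        gb2 = err.mean()
        dh = np.outer(err, W2) * (1 - h**2)  # backprop through tanh
        gW1 = X.T @ dh / len(y)
        gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return W1, b1, W2, b2

def mlp_predict(params, X):
    W1, b1, W2, b2 = params
    return np.tanh(X @ W1 + b1) @ W2 + b2
```

In the study's setting, the two input columns would be PCA scores of the SST predictors and the target the seasonal streamflow.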

  13. Forecasting of Machined Surface Waviness on the Basis of Self-oscillations Analysis

    NASA Astrophysics Data System (ADS)

    Belov, E. B.; Leonov, S. L.; Markov, A. M.; Sitnikov, A. A.; Khomenko, V. A.

    2017-01-01

    The paper states the problem of ensuring the quality of the geometrical characteristics of machined surfaces, which makes it necessary to forecast the onset and amplitude of oscillations appearing in the course of mechanical treatment. Objectives and tasks of the research are formulated. Sources of oscillation onset are identified: coordinate connections and the nonlinear dependence of cutting force on cutting velocity. A mathematical model for forecasting steady-state self-oscillations is investigated. The equation of the cutter tip motion is a system of two second-order nonlinear differential equations. The paper presents an algorithm based on the harmonic linearization method, which allows for a significant reduction of the calculation time; to apply it, one must determine the amplitude of oscillations, the frequency, and the steady component of the first harmonic. Software which allows obtaining data on surface waviness parameters is described. The paper studies an example of the use of the developed model in semi-finished lathe machining of a shaft made from steel 40H, which is a part of the BelAZ wheel electric actuator unit. Recommendations are given on eliminating self-oscillations in the process of shaft cutting and on correcting surface waviness defects.

  14. Chaos and Forecasting - Proceedings of the Royal Society Discussion Meeting

    NASA Astrophysics Data System (ADS)

    Tong, Howell

    1995-04-01

    The Table of Contents for the full book PDF is as follows: * Preface * Orthogonal Projection, Embedding Dimension and Sample Size in Chaotic Time Series from a Statistical Perspective * A Theory of Correlation Dimension for Stationary Time Series * On Prediction and Chaos in Stochastic Systems * Locally Optimized Prediction of Nonlinear Systems: Stochastic and Deterministic * A Poisson Distribution for the BDS Test Statistic for Independence in a Time Series * Chaos and Nonlinear Forecastability in Economics and Finance * Paradigm Change in Prediction * Predicting Nonuniform Chaotic Attractors in an Enzyme Reaction * Chaos in Geophysical Fluids * Chaotic Modulation of the Solar Cycle * Fractal Nature in Earthquake Phenomena and its Simple Models * Singular Vectors and the Predictability of Weather and Climate * Prediction as a Criterion for Classifying Natural Time Series * Measuring and Characterising Spatial Patterns, Dynamics and Chaos in Spatially-Extended Dynamical Systems and Ecologies * Non-Linear Forecasting and Chaos in Ecology and Epidemiology: Measles as a Case Study

  15. Identifying trends in climate: an application to the cenozoic

    NASA Astrophysics Data System (ADS)

    Richards, Gordon R.

    1998-05-01

    The recent literature on trending in climate has raised several issues: whether trends should be modeled as deterministic or stochastic, whether trends are nonlinear, and the relative merits of statistical models versus models based on physics. This article models trending since the late Cretaceous. This 68 million-year interval is selected because the reliability of tests for trending is critically dependent on the length of time spanned by the data. Two main hypotheses are tested: that the trend has been caused primarily by CO2 forcing, and that it reflects a variety of forcing factors which can be approximated by statistical methods. The CO2 data are obtained from model simulations. Several widely-used statistical models are found to be inadequate. ARIMA methods parameterize too much of the short-term variation and do not identify low-frequency movements. Further, the unit root in the ARIMA process does not predict the long-term path of temperature. Spectral methods also have little ability to predict temperature at long horizons. Instead, the statistical trend is estimated using a nonlinear smoothing filter. Both trend representations make it possible to model climate as a cointegrated process, in which temperature can wander quite far from the trend path in the intermediate term but converges back over longer horizons. Comparing the forecasting properties of the two trend models demonstrates that the optimal forecasting model includes CO2 forcing and a parametric representation of the nonlinear variability in climate.

  16. Long-term forecasting of meteorological time series using Nonlinear Canonical Correlation Analysis (NLCCA)

    NASA Astrophysics Data System (ADS)

    Woldesellasse, H. T.; Marpu, P. R.; Ouarda, T.

    2016-12-01

    Wind is one of the crucial renewable energy sources expected to bring solutions to the challenges of clean energy and the global issue of climate change. A number of linear and nonlinear multivariate techniques have been used to predict the stochastic character of wind speed. A wind forecast with good accuracy has a positive impact on reducing electricity system cost and is essential for effective grid management. Over the past years, few studies have assessed teleconnections and their possible effects on long-term wind speed variability in the UAE region. In this study, the Nonlinear Canonical Correlation Analysis (NLCCA) method is applied to study the relationship between global climate oscillation indices and meteorological variables, with a major emphasis on the wind speed and wind direction of Abu Dhabi, UAE. The wind dataset was obtained from six ground stations. The first mode of NLCCA is capable of capturing the nonlinear mode of the climate indices at different seasons, showing the symmetry between the warm states and the cool states. The strength of the nonlinear canonical correlation between the two sets of variables varies with the lead/lag time. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE) and the mean absolute error (MAE). The results indicate that NLCCA models provide more accurate information about the nonlinear intrinsic behaviour of the dataset than the linear CCA model, in terms of both correlation and root mean square error. Key words: Nonlinear Canonical Correlation Analysis (NLCCA), Canonical Correlation Analysis, Neural Network, Climate Indices, wind speed, wind direction

  17. Equation-free mechanistic ecosystem forecasting using empirical dynamic modeling

    PubMed Central

    Ye, Hao; Beamish, Richard J.; Glaser, Sarah M.; Grant, Sue C. H.; Hsieh, Chih-hao; Richards, Laura J.; Schnute, Jon T.; Sugihara, George

    2015-01-01

    It is well known that current equilibrium-based models fall short as predictive descriptions of natural ecosystems, and particularly of fisheries systems that exhibit nonlinear dynamics. For example, model parameters assumed to be fixed constants may actually vary in time, models may fit well to existing data but lack out-of-sample predictive skill, and key driving variables may be misidentified due to transient (mirage) correlations that are common in nonlinear systems. With these frailties, it is somewhat surprising that static equilibrium models continue to be widely used. Here, we examine empirical dynamic modeling (EDM) as an alternative to imposed model equations, one that accommodates both nonequilibrium dynamics and nonlinearity. Using time series from nine stocks of sockeye salmon (Oncorhynchus nerka) from the Fraser River system in British Columbia, Canada, we perform, for the first time to our knowledge, a real-data comparison of contemporary fisheries models with equivalent EDM formulations that explicitly use spawning stock and environmental variables to forecast recruitment. We find that EDM models produce more accurate and precise forecasts, and unlike extensions of the classic Ricker spawner–recruit equation, they show significant improvements when environmental factors are included. Our analysis demonstrates the strategic utility of EDM for incorporating environmental influences into fisheries forecasts and, more generally, for providing insight into how environmental factors can operate in forecast models, thus paving the way for equation-free mechanistic forecasting to be applied in management contexts. PMID:25733874
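EDM forecasts are commonly built on simplex projection; a minimal single-step sketch (not the authors' multivariate formulation) looks like this, demonstrated on a chaotic logistic map rather than salmon data. Embedding dimension, delay, and forecast horizon are illustrative.

```python
import numpy as np

def simplex_project(x, E=3, tau=1, tp=1):
    """Simplex projection (sketch): forecast x tp steps beyond the end of the
    series from the E+1 nearest neighbors of the final delay vector."""
    n = len(x) - (E - 1) * tau
    V = np.column_stack([x[i * tau : i * tau + n] for i in range(E)])
    query = V[-1]                                   # most recent delay vector
    lib = V[: n - tp]                               # vectors whose future is known
    d = np.linalg.norm(lib - query, axis=1)
    idx = np.argsort(d)[: E + 1]                    # E+1 nearest neighbors
    w = np.exp(-d[idx] / max(d[idx].min(), 1e-12))  # exponential distance weights
    targets = x[idx + (E - 1) * tau + tp]           # where each neighbor went
    return float(np.sum(w * targets) / w.sum())
```

No equations are imposed on the system; the reconstructed attractor itself serves as the model, which is the sense in which EDM is "equation-free".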

  18. Nonlinear forecasting as a way of distinguishing chaos from measurement error in time series

    NASA Astrophysics Data System (ADS)

    Sugihara, George; May, Robert M.

    1990-04-01

    An approach is presented for making short-term predictions about the trajectories of chaotic dynamical systems. The method is applied to data on measles, chickenpox, and marine phytoplankton populations, to show how apparent noise associated with deterministic chaos can be distinguished from sampling error and other sources of externally induced environmental noise.

  19. Forecasting the portuguese stock market time series by using artificial neural networks

    NASA Astrophysics Data System (ADS)

    Isfan, Monica; Menezes, Rui; Mendes, Diana A.

    2010-04-01

    In this paper, we show that neural networks can be used to uncover the non-linearity that exists in financial data. First, we follow a traditional approach by analysing the deterministic/stochastic characteristics of the Portuguese stock market data, and some typical features, such as the Hurst exponents, are studied. We also apply a BDS test to investigate nonlinearities, and the results are as expected: the financial time series do not exhibit linear dependence. Secondly, we trained four types of neural networks for the stock markets and used the models to make forecasts. The artificial neural networks were obtained using a three-layer feed-forward topology and the back-propagation learning algorithm. The quite large number of parameters that must be selected to develop a neural network forecasting model involves some trial and error, and as a consequence the error is not small enough. To improve this, we use a nonlinear optimization algorithm to minimize the error. Finally, the output of the four models is quite similar, leading to a qualitative forecast that we compare with the results of applying k-nearest-neighbors to the same time series.

  20. Nonlinear techniques for forecasting solar activity directly from its time series

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.; Roszman, L.; Cooley, J.

    1992-01-01

    Numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series are presented. This approach makes it possible to extract dynamical invariants of our system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the attractor (strange), give a procedure for constructing a predictor of future solar activity, and discuss extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.
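One of the dynamical invariants mentioned, the largest Lyapunov exponent, can be estimated directly from a time series by tracking the average divergence of initially close delay vectors (a Rosenstein-style scheme). The sketch below uses a chaotic logistic map rather than solar flux, and every parameter is an illustrative choice.

```python
import numpy as np

def largest_lyapunov(x, E=2, tau=1, horizon=5, min_sep=10):
    """Estimate the largest Lyapunov exponent from a scalar series:
    slope of the mean log-divergence of nearest-neighbor delay vectors."""
    n = len(x) - (E - 1) * tau
    V = np.column_stack([x[i * tau : i * tau + n] for i in range(E)])
    usable = n - horizon
    logdiv = np.zeros(horizon + 1)
    count = 0
    for i in range(usable):
        d = np.linalg.norm(V[:usable] - V[i], axis=1)
        d[max(0, i - min_sep) : i + min_sep] = np.inf   # exclude temporal neighbors
        j = int(np.argmin(d))
        if not np.isfinite(d[j]) or d[j] == 0:
            continue
        sep = np.linalg.norm(V[i : i + horizon + 1] - V[j : j + horizon + 1], axis=1)
        if np.all(sep > 0):
            logdiv += np.log(sep)                       # accumulate log separations
            count += 1
    logdiv /= count
    t = np.arange(horizon + 1)
    return float(np.polyfit(t, logdiv, 1)[0])           # divergence rate per step
```

A positive estimate indicates sensitive dependence on initial conditions, and its reciprocal sets the characteristic forecast horizon discussed in the companion records.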

  1. Nonlinear techniques for forecasting solar activity directly from its time series

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.; Roszman, L.; Cooley, J.

    1993-01-01

    This paper presents numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series. This approach makes it possible to extract dynamical invariants of our system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the attractor (strange), give a procedure for constructing a predictor of future solar activity, and discuss extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.

  2. Seasonal forecasting of fire over Kalimantan, Indonesia

    NASA Astrophysics Data System (ADS)

    Spessa, A. C.; Field, R. D.; Pappenberger, F.; Langner, A.; Englhart, S.; Weber, U.; Stockdale, T.; Siegert, F.; Kaiser, J. W.; Moore, J.

    2015-03-01

    Large-scale fires occur frequently across Indonesia, particularly in the southern region of Kalimantan and eastern Sumatra. They have considerable impacts on carbon emissions, haze production, biodiversity, health, and economic activities. In this study, we demonstrate that severe fire and haze events in Indonesia can generally be predicted months in advance using predictions of seasonal rainfall from the ECMWF System 4 coupled ocean-atmosphere model. Based on analyses of long, up-to-date observational series of burnt area, rainfall, and tree cover, we demonstrate that fire activity is negatively correlated with rainfall and positively associated with deforestation in Indonesia. There is a contrast between the southern region of Kalimantan (high fire activity, high tree cover loss, and a strong non-linear correlation between observed rainfall and fire) and the central region of Kalimantan (low fire activity, low tree cover loss, and a weak non-linear correlation between observed rainfall and fire). The ECMWF seasonal forecast provides skilful forecasts of burnt and fire-affected area with several months' lead time, explaining at least 70% of the variance between rainfall and burnt and fire-affected area. Results are strongly influenced by El Niño years, which show a consistent positive bias. Overall, our findings point to a high potential for predicting fires in the tropics several months ahead using a more physically based method rather than one based on indexes only. We argue that seasonal precipitation forecasts should be central to Indonesia's evolving fire management policy.

  3. State estimation and prediction using clustered particle filters.

    PubMed

    Lee, Yoonsang; Majda, Andrew J

    2016-12-20

    Particle filtering is an essential tool to improve uncertain model predictions by incorporating noisy observational data from complex systems including non-Gaussian features. A class of particle filters, clustered particle filters, is introduced for high-dimensional nonlinear systems, which uses relatively few particles compared with the standard particle filter. The clustered particle filter captures non-Gaussian features of the true signal, which are typical in complex nonlinear dynamical systems such as geophysical systems. The method is also robust in the difficult regime of high-quality sparse and infrequent observations. The key features of the clustered particle filtering are coarse-grained localization through the clustering of the state variables and particle adjustment to stabilize the method; each observation affects only neighbor state variables through clustering and particles are adjusted to prevent particle collapse due to high-quality observations. The clustered particle filter is tested for the 40-dimensional Lorenz 96 model with several dynamical regimes including strongly non-Gaussian statistics. The clustered particle filter shows robust skill in both achieving accurate filter results and capturing non-Gaussian statistics of the true signal. It is further extended to multiscale data assimilation, which provides the large-scale estimation by combining a cheap reduced-order forecast model and mixed observations of the large- and small-scale variables. This approach enables the use of a larger number of particles due to the computational savings in the forecast model. The multiscale clustered particle filter is tested for one-dimensional dispersive wave turbulence using a forecast model with model errors.
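A bootstrap particle filter, the baseline that the clustered variant improves on, can be sketched as follows; the scalar state model and noise levels in the test below are invented for illustration, and no localization or particle adjustment is included.

```python
import numpy as np

def particle_filter(obs, propagate, loglik, n_particles=500, init=None, seed=0):
    """Bootstrap particle filter (sketch): propagate particles with the
    forecast model, reweight by the observation likelihood, then resample."""
    rng = np.random.default_rng(seed)
    particles = init(rng, n_particles)
    means = []
    for y in obs:
        particles = propagate(particles, rng)       # forecast (sampling) step
        logw = loglik(y, particles)                 # log-likelihood weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(float(np.sum(w * particles)))  # posterior mean estimate
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]                  # multinomial resampling
    return np.array(means)
```

The resampling step is exactly where weight collapse occurs in high dimensions; the clustering and particle adjustment described above are designed to prevent it.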

  4. State estimation and prediction using clustered particle filters

    PubMed Central

    Lee, Yoonsang; Majda, Andrew J.

    2016-01-01

    Particle filtering is an essential tool to improve uncertain model predictions by incorporating noisy observational data from complex systems including non-Gaussian features. A class of particle filters, clustered particle filters, is introduced for high-dimensional nonlinear systems, which uses relatively few particles compared with the standard particle filter. The clustered particle filter captures non-Gaussian features of the true signal, which are typical in complex nonlinear dynamical systems such as geophysical systems. The method is also robust in the difficult regime of high-quality sparse and infrequent observations. The key features of the clustered particle filtering are coarse-grained localization through the clustering of the state variables and particle adjustment to stabilize the method; each observation affects only neighbor state variables through clustering and particles are adjusted to prevent particle collapse due to high-quality observations. The clustered particle filter is tested for the 40-dimensional Lorenz 96 model with several dynamical regimes including strongly non-Gaussian statistics. The clustered particle filter shows robust skill in both achieving accurate filter results and capturing non-Gaussian statistics of the true signal. It is further extended to multiscale data assimilation, which provides the large-scale estimation by combining a cheap reduced-order forecast model and mixed observations of the large- and small-scale variables. This approach enables the use of a larger number of particles due to the computational savings in the forecast model. The multiscale clustered particle filter is tested for one-dimensional dispersive wave turbulence using a forecast model with model errors. PMID:27930332

  5. Heterogeneity: The key to failure forecasting

    PubMed Central

    Vasseur, Jérémie; Wadsworth, Fabian B.; Lavallée, Yan; Bell, Andrew F.; Main, Ian G.; Dingwell, Donald B.

    2015-01-01

    Elastic waves are generated when brittle materials are subjected to increasing strain. Their number and energy increase non-linearly, ending in a system-sized catastrophic failure event. Accelerating rates of geophysical signals (e.g., seismicity and deformation) preceding large-scale dynamic failure can serve as proxies for damage accumulation in the Failure Forecast Method (FFM). Here we test the hypothesis that the style and mechanisms of deformation, and the accuracy of the FFM, are both tightly controlled by the degree of microstructural heterogeneity of the material under stress. We generate a suite of synthetic samples with variable heterogeneity, controlled by the gas volume fraction. We experimentally demonstrate that the accuracy of failure prediction increases drastically with the degree of material heterogeneity. These results have significant implications in a broad range of material-based disciplines for which failure forecasting is of central importance. In particular, the FFM has been used with only variable success to forecast failure scenarios both in the field (volcanic eruptions and landslides) and in the laboratory (rock and magma failure). Our results show that this variability may be explained, and the reliability and accuracy of forecast quantified significantly improved, by accounting for material heterogeneity as a first-order control on forecasting power. PMID:26307196
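The FFM itself is compact: for the classical exponent alpha = 2, the inverse of the accelerating precursor rate decays linearly in time, so its extrapolated zero-crossing estimates the failure time. A sketch on synthetic data (not the paper's experiments):

```python
import numpy as np

def ffm_failure_time(t, rate):
    """Classic FFM estimate (alpha = 2): fit a line to the inverse event
    rate and return its zero-crossing, the forecast failure time."""
    inv = 1.0 / np.asarray(rate, dtype=float)   # inverse rate decays linearly
    slope, intercept = np.polyfit(t, inv, 1)
    return -intercept / slope                   # time where 1/rate reaches zero
```

The paper's point is that how well the observed inverse rate actually follows this straight line, and hence how accurate the extrapolation is, depends strongly on material heterogeneity.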

  6. Heterogeneity: The key to failure forecasting.

    PubMed

    Vasseur, Jérémie; Wadsworth, Fabian B; Lavallée, Yan; Bell, Andrew F; Main, Ian G; Dingwell, Donald B

    2015-08-26

    Elastic waves are generated when brittle materials are subjected to increasing strain. Their number and energy increase non-linearly, culminating in a system-sized catastrophic failure event. Accelerating rates of geophysical signals (e.g., seismicity and deformation) preceding large-scale dynamic failure can serve as proxies for damage accumulation in the Failure Forecast Method (FFM). Here we test the hypothesis that the style and mechanisms of deformation, and the accuracy of the FFM, are both tightly controlled by the degree of microstructural heterogeneity of the material under stress. We generate a suite of synthetic samples with variable heterogeneity, controlled by the gas volume fraction. We experimentally demonstrate that the accuracy of failure prediction increases drastically with the degree of material heterogeneity. These results have significant implications for a broad range of material-based disciplines in which failure forecasting is of central importance. In particular, the FFM has been used with only variable success to forecast failure both in the field (volcanic eruptions and landslides) and in the laboratory (rock and magma failure). Our results show that this variability may be explained, and the reliability and accuracy of forecasts quantified and significantly improved, by accounting for material heterogeneity as a first-order control on forecasting power.

  7. Heterogeneity: The key to failure forecasting

    NASA Astrophysics Data System (ADS)

    Vasseur, Jérémie; Wadsworth, Fabian B.; Lavallée, Yan; Bell, Andrew F.; Main, Ian G.; Dingwell, Donald B.

    2015-08-01

    Elastic waves are generated when brittle materials are subjected to increasing strain. Their number and energy increase non-linearly, culminating in a system-sized catastrophic failure event. Accelerating rates of geophysical signals (e.g., seismicity and deformation) preceding large-scale dynamic failure can serve as proxies for damage accumulation in the Failure Forecast Method (FFM). Here we test the hypothesis that the style and mechanisms of deformation, and the accuracy of the FFM, are both tightly controlled by the degree of microstructural heterogeneity of the material under stress. We generate a suite of synthetic samples with variable heterogeneity, controlled by the gas volume fraction. We experimentally demonstrate that the accuracy of failure prediction increases drastically with the degree of material heterogeneity. These results have significant implications for a broad range of material-based disciplines in which failure forecasting is of central importance. In particular, the FFM has been used with only variable success to forecast failure both in the field (volcanic eruptions and landslides) and in the laboratory (rock and magma failure). Our results show that this variability may be explained, and the reliability and accuracy of forecasts quantified and significantly improved, by accounting for material heterogeneity as a first-order control on forecasting power.

  8. High-order fuzzy time-series based on multi-period adaptation model for forecasting stock markets

    NASA Astrophysics Data System (ADS)

    Chen, Tai-Liang; Cheng, Ching-Hsue; Teoh, Hia-Jong

    2008-02-01

    Stock investors usually make their short-term investment decisions according to recent stock information such as late-breaking market news, technical analysis reports, and price fluctuations. To reflect these short-term factors that affect stock prices, this paper proposes a comprehensive fuzzy time-series model that factors both linear relationships between recent periods of stock prices and fuzzy logical relationships (nonlinear relationships) mined from the time series into its forecasting process. In the empirical analysis, the TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) and HSI (Hang Seng Index) are employed as experimental datasets, and four recent fuzzy time-series models, Chen's (1996), Yu's (2005), Cheng's (2006) and Chen's (2007), are used as comparison models. In addition, for comparison with a conventional statistical method, least squares is used to estimate auto-regressive models over the testing periods within the datasets. The performance comparisons indicate that the multi-period adaptation model proposed in this paper can effectively improve the forecasting performance of conventional fuzzy time-series models, which factor only fuzzy logical relationships into their forecasting processes. The empirical study further shows that both the traditional statistical method and the proposed model reveal that stock price patterns in the Taiwan and Hong Kong stock markets are short-term.
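    For context, the simplest of the comparison models above, Chen's (1996) fuzzy time-series model, can be sketched in a few lines. The interval count, the equal-width partition, and the absence of a margin around the universe of discourse are simplifying assumptions here, not the paper's exact setup.

```python
import numpy as np

def chen_fts_forecast(series, n_intervals=7):
    """One-step forecast with a simplified Chen (1996) fuzzy time series:
    partition the data range into equal-width intervals, fuzzify each
    value to its interval, collect the fuzzy logical relationships
    A_i -> {A_j}, and forecast the average midpoint of the successors
    of the current state."""
    x = np.asarray(series, float)
    edges = np.linspace(x.min(), x.max(), n_intervals + 1)
    mids = 0.5 * (edges[1:] + edges[:-1])
    # np.digitize maps each value to an interval index in 0..n-1.
    labels = np.clip(np.digitize(x, edges) - 1, 0, n_intervals - 1)
    groups = {}
    for cur, nxt in zip(labels[:-1], labels[1:]):
        groups.setdefault(cur, set()).add(nxt)
    # If the current state was never seen as an antecedent, persist it.
    successors = groups.get(labels[-1], {labels[-1]})
    return float(np.mean([mids[j] for j in successors]))
```

On a strictly alternating series the logical relationships are deterministic, so the forecast is simply the midpoint of the other interval.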

  9. Predictability of extremes in non-linear hierarchically organized systems

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.; Soloviev, A.

    2011-12-01

    Understanding the complexity of the non-linear dynamics of hierarchically organized systems is leading to new approaches for assessing the hazard and risk of extreme catastrophic events. In particular, a series of interrelated step-by-step studies of the seismic process, with its non-stationary yet self-organized behavior, has already led to a reproducible intermediate-term, middle-range earthquake forecast/prediction technique that has passed control in forward real-time applications over the last two decades. The seismic dynamics observed before and after many mega, great, major, and strong earthquakes demonstrate common features of predictability and diverse behavior during durable phase transitions in the complex hierarchical non-linear block-and-fault system of the Earth's lithosphere. The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimates of seismic hazard (from term-less to short-term) rest on erroneous assumptions of easily tractable analytical models, which has led to the widespread practice of their deceptive application. The consequences of underestimating seismic hazard propagate non-linearly into underestimation of risk and, eventually, into unexpected societal losses due to earthquakes and associated phenomena (collapse of buildings, landslides, tsunamis, liquefaction, etc.). Studies aimed at forecasting/predicting extreme events (interpreted as critical transitions) in geophysical and socio-economic systems include: (i) large earthquakes in the lithosphere's block-and-fault systems, (ii) starts and ends of economic recessions, (iii) episodes of sharp increase in the unemployment rate, and (iv) surges of homicides in socio-economic systems. These studies are based on a heuristic search for phenomena preceding critical transitions and the application of pattern-recognition methodologies for infrequent events.
Any study of rare phenomena of highly complex origin, by its nature, requires problem-oriented methods whose design goes beyond the limits of classical statistical or econometric applications. The unambiguously designed forecast/prediction algorithms of the "yes or no" variety analyze the observable quantitative integrals and indicators available up to a given date, then provide an unambiguous answer to the question of whether a critical transition should be expected in the next time interval. Since the predictability of the underlying non-linear dynamical system is limited in principle, the probabilistic component of such algorithms is represented by the empirical probabilities of alarms, on the one side, and of failures-to-predict, on the other, estimated on control sets in retrospective and prospective experiments. Prediction in advance is the only decisive test of forecasts/predictions, and the relevant ongoing experiments are being conducted for seismic extremes, recessions, and increases in the unemployment rate. The results achieved in real-time testing remain encouraging and confirm the predictability of these extremes.
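    The alarm-based "yes or no" evaluation described above is commonly summarized by two empirical rates: the fraction of time covered by alarms and the fraction of target events missed (failures-to-predict). A minimal sketch of that bookkeeping (the interval representation and the function name are assumptions for illustration):

```python
def error_diagram(alarm_intervals, event_times, total_time):
    """Molchan-style evaluation of a yes/no alarm forecast.

    Returns (tau, nu): tau is the fraction of total time covered by
    declared alarms, nu the fraction of target events not covered by
    any alarm (the failure-to-predict rate).
    """
    alarm_time = sum(b - a for a, b in alarm_intervals)
    hits = sum(any(a <= t <= b for a, b in alarm_intervals)
               for t in event_times)
    tau = alarm_time / total_time
    nu = 1.0 - hits / len(event_times)
    return tau, nu
```

A skillful algorithm keeps both tau (cost of alarms) and nu (misses) small; a random strategy lies on the diagonal tau + nu = 1 of the error diagram.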

  10. Lifting primordial non-Gaussianity above the noise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welling, Yvette; Woude, Drian van der; Pajer, Enrico, E-mail: welling@strw.leidenuniv.nl, E-mail: D.C.vanderWoude@uu.nl, E-mail: enrico.pajer@gmail.com

    2016-08-01

    Primordial non-Gaussianity (PNG) in Large Scale Structure is obfuscated by the many additional sources of non-linearity. Within the Effective Field Theory approach to Standard Perturbation Theory, we show that matter non-linearities in the bispectrum can be modeled sufficiently well to strengthen current bounds with near-future surveys such as Euclid. We find that the EFT corrections are crucial to this improvement in sensitivity. Yet our understanding of non-linearities is still insufficient to reach important theoretical benchmarks for equilateral PNG, while for local PNG our forecast is more optimistic. We consistently account for the theoretical error intrinsic to the perturbative approach and discuss the details of its implementation in Fisher forecasts.

  11. Seasonal streamflow forecast with machine learning and teleconnection indices in the context of a non-stationary climate

    NASA Astrophysics Data System (ADS)

    Haguma, D.; Leconte, R.

    2017-12-01

    Spatial and temporal water resources variability is associated with large-scale pressure and circulation anomalies, known as teleconnections, that influence the pattern of the atmospheric circulation. Teleconnection indices have been used successfully for short-term streamflow forecasting. However, in some watersheds, classical methods cannot establish relationships between seasonal streamflow and teleconnection indices because of weak correlation. In this study, machine learning algorithms were applied to seasonal streamflow forecasting using teleconnection indices. Machine learning offers an alternative to classical methods for addressing the non-linear relationship between streamflow and teleconnection indices in the context of a non-stationary climate. Two machine learning algorithms, random forest (RF) and support vector machine (SVM), with teleconnection indices associated with North American climatology, were used to forecast inflows one and two seasons ahead for the Romaine River and Manicouagan River watersheds, located in Quebec, Canada. The indices are the Pacific-North America pattern (PNA), North Atlantic Oscillation (NAO), El Niño-Southern Oscillation (ENSO), Arctic Oscillation (AO) and Pacific Decadal Oscillation (PDO). The results showed that the machine learning algorithms have substantial predictive power for seasonal streamflow at one- and two-season lead times. RF performed better in training, while SVM generally produced better results, with high predictive capability, in testing. RF, as an ensemble method, also allowed the uncertainty of the forecast to be assessed. The integration of teleconnection indices thus supports seasonal streamflow forecasting under non-stationary climate conditions, even though the teleconnection indices are only weakly correlated with streamflow.
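    A minimal sketch of the two learners named above, using scikit-learn; the synthetic predictors stand in for teleconnection indices, and the paper's index lagging, lead times and train/test protocol are not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR

def fit_seasonal_models(X, y):
    """Fit the two learners named in the abstract, random forest and
    support vector machine, to a matrix of teleconnection indices X
    (columns standing in for PNA, NAO, ENSO, AO, PDO) and seasonal
    inflow y.  Toy illustration only; hyperparameters are guesses."""
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
    svm = SVR(kernel="rbf", C=10.0).fit(X, y)
    return rf, svm
```

In the paper's spirit, RF additionally yields forecast spread from its individual trees, which is why it supports uncertainty assessment.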

  12. Assessment of Forecast Sensitivity to Observation and Its Application to Satellite Radiances

    NASA Astrophysics Data System (ADS)

    Ide, K.

    2017-12-01

    Forecast sensitivity to observations provides a practical and useful metric for assessing observation impact without conducting computationally intensive data-denial experiments. Complex data assimilation systems quite often use a simplified, ensemble-based version of the forecast sensitivity formulation. In this talk, we first present a comparison of forecast sensitivity for 4DVar, Hybrid-4DEnVar, and 4DEnKF, with and without such simplifications, using a highly nonlinear model. We then present results of ensemble forecast sensitivity to satellite radiance observations for Hybrid-4DEnVar using NOAA's Global Forecast System.

  13. Forecasting of foreign exchange rates of Taiwan’s major trading partners by novel nonlinear Grey Bernoulli model NGBM(1, 1)

    NASA Astrophysics Data System (ADS)

    Chen, Chun-I.; Chen, Hong Long; Chen, Shuo-Pei

    2008-08-01

    The traditional Grey Model is easy to understand and simple to calculate, with satisfactory accuracy, but it lacks the flexibility to be adjusted for higher forecasting precision. This research studies the feasibility and effectiveness of a novel Grey model that incorporates the Bernoulli differential equation from ordinary differential equations; the authors name this newly proposed model the Nonlinear Grey Bernoulli Model (NGBM). The NGBM is a nonlinear differential equation with power index n. By adjusting n, the curvature of the solution curve can be tuned to fit the result of the one-time accumulated generating operation (1-AGO) of the raw data. One extreme case from a Grey-system textbook is studied with the NGBM, and two published articles are chosen for practical tests. The results show that the novel NGBM is feasible and efficient. Finally, the NGBM is used to forecast the 2005 foreign exchange rates of the twelve major trading partners of Taiwan, including Taiwan itself.
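    The NGBM(1, 1) procedure described above is concrete enough to sketch: least-squares estimation of the greyed Bernoulli equation followed by its closed-form time response. This follows the standard NGBM formulation (n = 0 recovers the classical GM(1, 1); n = 1 is excluded because the closed form divides by 1 - n); variable names are ours.

```python
import numpy as np

def ngbm_fit_predict(x0, n, horizon):
    """Nonlinear Grey Bernoulli Model NGBM(1, 1).

    Fits x0(k) + a*z1(k) = b*z1(k)**n by least squares, where x1 is
    the 1-AGO (cumulative sum) of the raw series x0 and z1 is the
    mean of consecutive x1 values, then returns the fitted series
    plus `horizon` out-of-sample forecasts.  Requires n != 1.
    """
    x0 = np.asarray(x0, float)
    x1 = np.cumsum(x0)
    z1 = 0.5 * (x1[1:] + x1[:-1])
    A = np.column_stack([-z1, z1 ** n])
    (a, b), *_ = np.linalg.lstsq(A, x0[1:], rcond=None)
    k = np.arange(len(x0) + horizon)
    # Closed-form solution of the Bernoulli whitening equation.
    x1_hat = ((x0[0] ** (1 - n) - b / a) * np.exp(-a * (1 - n) * k)
              + b / a) ** (1.0 / (1 - n))
    # Inverse AGO: difference the fitted cumulative series.
    return np.concatenate([[x1_hat[0]], np.diff(x1_hat)])
```

As a sanity check, with n = 0 the model reduces to GM(1, 1) and reproduces a geometric (exponentially growing) series almost exactly.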

  14. Error sensitivity analysis in 10-30-day extended range forecasting by using a nonlinear cross-prediction error model

    NASA Astrophysics Data System (ADS)

    Xia, Zhiye; Xu, Lisheng; Chen, Hongbin; Wang, Yongqian; Liu, Jinbao; Feng, Wenlan

    2017-06-01

    Extended-range forecasting of 10-30 days, which lies between medium-term and climate prediction in timescale, plays a significant role in decision-making for the prevention and mitigation of disastrous meteorological events. The sensitivity of the initial error, model parameter error, and random error in a nonlinear cross-prediction error (NCPE) model, and their stability over the prediction validity period in 10-30-day extended-range forecasting, are analyzed quantitatively. The associated sensitivity of precipitable water, temperature, and geopotential height during heavy-rain and hurricane cases is also discussed. The results are summarized as follows. First, the initial error and random error interact. When the ratio of random error to initial error is small (10^-6 to 10^-2), minor variation in random error cannot significantly change the dynamic features of a chaotic system, and therefore random error has minimal effect on the prediction. When the ratio is in the range of 10^-1 to 10^2 (i.e., random error dominates), attention should be paid to the random error instead of only the initial error. When the ratio is around 10^-2 to 10^-1, both influences must be considered; their mutual effects may bring considerable uncertainty to extended-range forecasting, and de-noising is therefore necessary. Second, in terms of model parameter error, the embedding dimension m should be determined from the actual nonlinear time series. When m is small, the dynamic features of a chaotic system cannot be depicted because the structure of the attractor is incomplete; when m is large, prediction indicators can vanish because of the scarcity of phase points in phase space. A method for overcoming the cut-off effect (m > 4) is proposed. Third, for heavy rains, precipitable water is more sensitive to the prediction validity period than temperature or geopotential height; for hurricanes, geopotential height is the most sensitive, followed by precipitable water.

  15. Hybrid Forecasting of Daily River Discharges Considering Autoregressive Heteroscedasticity

    NASA Astrophysics Data System (ADS)

    Szolgayová, Elena Peksová; Danačová, Michaela; Komorniková, Magda; Szolgay, Ján

    2017-06-01

    It is widely acknowledged in the hydrological and meteorological communities that there is a continuing need to improve the quality of quantitative rainfall and river flow forecasts. A hybrid (combined deterministic-stochastic) modelling approach is proposed here that combines, in parallel, the advantages of modelling the system dynamics with a deterministic model and the deterministic model's forecasting-error series with a data-driven model. Since the processes to be modelled are generally nonlinear and the model error series may exhibit nonstationarity and heteroscedasticity, GARCH-type nonlinear time series models are considered here. The fitting, forecasting and simulation performance of such models has to be explored on a case-by-case basis. The goal of this paper is to test and develop an appropriate methodology for model fitting and forecasting, applicable to daily river discharge forecast-error data, from the GARCH family of time series models. We concentrated on verifying whether a GARCH-type model is suitable for modelling and forecasting a hydrological model error time series on the Hron and Morava Rivers in Slovakia. For this purpose we verified the presence of heteroscedasticity in the simulation error series of the KLN multilinear flow routing model; we then fitted the GARCH-type models to the data and compared their fit with that of an ARMA-type model. We produced one-step-ahead forecasts from the fitted models and again compared the models' performance.
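    As a point of reference for the GARCH-type models discussed above, a bare-bones Gaussian maximum-likelihood fit of GARCH(1,1) can be written directly with scipy; the `arch` package is the usual production choice, and the starting values and bounds below are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def fit_garch11(e):
    """Fit GARCH(1,1), h_t = w + a*e_{t-1}^2 + b*h_{t-1}, to a
    zero-mean error series by Gaussian maximum likelihood.

    Minimal sketch of the model class the abstract applies to
    hydrological forecast errors; returns (w, a, b)."""
    e = np.asarray(e, float)

    def neg_loglik(p):
        w, a, b = p
        h = np.empty_like(e)
        h[0] = e.var()  # initialize conditional variance at sample variance
        for t in range(1, len(e)):
            h[t] = w + a * e[t - 1] ** 2 + b * h[t - 1]
        return 0.5 * np.sum(np.log(h) + e ** 2 / h)

    res = minimize(neg_loglik, x0=[0.1 * e.var(), 0.1, 0.8],
                   bounds=[(1e-8, None), (0.0, 0.999), (0.0, 0.999)])
    return res.x
```

On data simulated from a known GARCH(1,1) process, the estimates recover the true coefficients to within sampling error.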

  16. An enhanced PM2.5 air quality forecast model based on nonlinear regression and back-trajectory concentrations

    NASA Astrophysics Data System (ADS)

    Cobourn, W. Geoffrey

    2010-08-01

    An enhanced PM2.5 air quality forecast model based on nonlinear regression (NLR) and back-trajectory concentrations has been developed for use in the Louisville, Kentucky metropolitan area. The PM2.5 air quality forecast model is designed for use in the warm season, from May through September, when PM2.5 air quality is more likely to be critical for human health. The enhanced PM2.5 model consists of a basic NLR model, developed for use with an automated air quality forecast system, and an additional parameter based on upwind PM2.5 concentration, called PM24. The PM24 parameter is designed to be determined manually, by synthesizing backward air trajectory and regional air quality information to compute 24-h back-trajectory concentrations. The PM24 parameter may be used by air quality forecasters to adjust the forecast provided by the automated forecast system. In this study of the 2007 and 2008 forecast seasons, the enhanced model performed well using forecasted meteorological data and PM24 as input. The enhanced PM2.5 model was compared with three alternative models: the basic NLR model, the basic NLR model with a persistence parameter added, and the NLR model with both persistence and PM24. The two models that included PM24 were of comparable accuracy, with lower mean absolute errors and higher rates of detecting unhealthy PM2.5 concentrations than the other models.
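    The shape of such an enhanced NLR model can be sketched with scipy's curve_fit; the functional form, parameter names and synthetic predictors below are illustrative assumptions, not the paper's actual regression.

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_pm25_nlr(temp, wind, pm24, pm25_obs):
    """Nonlinear-regression sketch of the enhanced model's structure:
    PM2.5 as a nonlinear function of forecast meteorology plus an
    additive upwind back-trajectory concentration term (PM24).
    The functional form is a hypothetical stand-in."""
    def model(X, a, b, c, d):
        t, w, p24 = X
        return a * np.exp(b * t) + c * w + d * p24

    popt, _ = curve_fit(model, (temp, wind, pm24), pm25_obs,
                        p0=[1.0, 0.04, 0.0, 0.0], maxfev=20000)
    return popt, lambda t, w, p24: model((t, w, p24), *popt)
```

A forecaster-supplied PM24 value then shifts the prediction up or down in proportion to its fitted coefficient, mirroring the manual-adjustment role described in the abstract.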

  17. Multidimensional k-nearest neighbor model based on EEMD for financial time series forecasting

    NASA Astrophysics Data System (ADS)

    Zhang, Ningning; Lin, Aijing; Shang, Pengjian

    2017-07-01

    In this paper, we propose a new two-stage methodology that combines ensemble empirical mode decomposition (EEMD) with a multidimensional k-nearest neighbor model (MKNN) in order to forecast the closing price and high price of stocks simultaneously. Modified k-nearest neighbor (KNN) algorithms have increasingly wide application in prediction across many fields. Empirical mode decomposition (EMD) decomposes a nonlinear and non-stationary signal into a series of intrinsic mode functions (IMFs); however, mode mixing prevents it from revealing the characteristic information of the signal accurately. Ensemble empirical mode decomposition (EEMD), an improved version of EMD, resolves this weakness by adding white noise to the original data, so that components with true physical meaning can be extracted from the time series. Combining the advantages of EEMD and MKNN, the proposed EEMD-MKNN model achieves high predictive precision for short-term forecasting. Moreover, we extend the methodology to two dimensions to forecast the closing price and high price of four stock indices (NAS, S&P 500, DJI and STI) at the same time. The results indicate that the proposed EEMD-MKNN model has higher forecast precision than the EMD-KNN, KNN and ARIMA methods.
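    The KNN building block of the method above can be sketched in a few lines: embed the series in lag vectors, find the nearest historical analogues, and average their successors. The EEMD decomposition and the multidimensional extension are omitted here; m and k are free parameters.

```python
import numpy as np

def knn_forecast(series, m, k):
    """One-step k-nearest-neighbour forecast: embed the series in
    m-dimensional lag vectors, find the k training vectors closest
    to the most recent one, and average their successors."""
    x = np.asarray(series, float)
    emb = np.array([x[i:i + m] for i in range(len(x) - m)])
    targets = x[m:]            # successor of each embedded vector
    query = x[-m:]             # the most recent lag vector
    d = np.linalg.norm(emb - query, axis=1)
    nearest = np.argsort(d)[:k]
    return float(targets[nearest].mean())
```

On a purely periodic series the nearest analogues are exact repeats, so the forecast reproduces the next value of the cycle.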

  18. Preliminary analysis on hybrid Box-Jenkins - GARCH modeling in forecasting gold price

    NASA Astrophysics Data System (ADS)

    Yaziz, Siti Roslindar; Azizan, Noor Azlinna; Ahmad, Maizah Hura; Zakaria, Roslinazairimah; Agrawal, Manju; Boland, John

    2015-02-01

    Gold has long been regarded as a valuable precious metal and is the most popular commodity for investment with healthy returns. Hence, the analysis and prediction of the gold price are very significant to investors. This study is a preliminary analysis of the gold price and its volatility that focuses on the performance of hybrid Box-Jenkins models combined with GARCH in analyzing and forecasting the gold price. The Box-Cox formula is used as the data transformation method because of its effectiveness in normalizing data, stabilizing variance and reducing heteroscedasticity, applied to a 41-year daily gold price series starting 2 January 1973. Our study indicates that the proposed hybrid ARIMA-GARCH model with t-innovations can be a promising new approach to forecasting the gold price. This finding demonstrates the strength of GARCH in handling volatility in the gold price, as well as overcoming the nonlinearity limitation of Box-Jenkins modeling.
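    The Box-Cox pre-processing step above is directly reproducible with scipy; the short price series here is made up, and the inverse transform must be applied to any forecasts produced in the transformed space.

```python
import numpy as np
from scipy import stats

def boxcox_roundtrip(prices):
    """Box-Cox pre-processing as in the abstract: transform the
    strictly positive price series to stabilise variance (lambda is
    chosen by maximum likelihood inside scipy), model in the
    transformed space, then invert on the way back."""
    transformed, lam = stats.boxcox(np.asarray(prices, float))
    # Inverse transform: x = (lam*y + 1)^(1/lam), or exp(y) if lam == 0.
    inverted = ((lam * transformed + 1.0) ** (1.0 / lam)
                if lam != 0 else np.exp(transformed))
    return transformed, lam, inverted
```

In the hybrid workflow, the ARIMA-GARCH model would be fitted to `transformed`, and its forecasts passed through the same inverse formula.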

  19. Studying Climate Response to Forcing by the Nonlinear Dynamical Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Mukhin, Dmitry; Gavrilov, Andrey; Loskutov, Evgeny; Feigin, Alexander

    2017-04-01

    An analysis of the global climate response to external forcing, both anthropogenic (mainly CO2 and aerosol) and natural (solar and volcanic), is needed for adequate predictions of global climate change. Being a complex dynamical system, the climate reacts to external perturbations by exciting feedbacks (both positive and negative) that make the response non-trivial and poorly predictable. Thus the extraction of the internal modes of the climate system, investigation of their interaction with external forcings, and modeling and forecasting of their dynamics are all problems central to successful climate modeling. In this report a new method for principal mode extraction from climate data is presented. The method is based on the Nonlinear Dynamical Mode (NDM) expansion [1,2], but takes into account a number of external forcings applied to the system. Each NDM is represented by a hidden time series governing the observed variability, which, together with the external forcing time series, is mapped onto data space. While the forcing time series are considered known, the hidden signals underlying the internal climate dynamics are extracted from observed data by the suggested method. In particular, this gives us an opportunity to study the evolution of the system's principal mode structure under changing external conditions and to separate the internal climate variability from trends forced by external perturbations. Furthermore, the modes so obtained can be extrapolated beyond the observational time series, enabling long-term prognosis of the modes' structure, including the characteristics of their interconnections and responses to external perturbations. In this work the method is used for reconstructing and studying the principal modes of climate variability on inter-annual and decadal time scales, accounting for external forcings such as anthropogenic emissions and variations in solar and volcanic activity.
The structure of the obtained modes, as well as their response to external factors, e.g., the forecast of their change in the 21st century under different CO2 emission scenarios, is discussed. [1] Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. http://doi.org/10.1038/srep15510 [2] Gavrilov, A., Mukhin, D., Loskutov, E., Volodin, E., Feigin, A., & Kurths, J. (2016). Method for reconstructing nonlinear modes with adaptive structure from multidimensional data. Chaos: An Interdisciplinary Journal of Nonlinear Science, 26(12), 123101. http://doi.org/10.1063/1.4968852

  20. Development of Ensemble Model Based Water Demand Forecasting Model

    NASA Astrophysics Data System (ADS)

    Kwon, Hyun-Han; So, Byung-Jin; Kim, Seong-Hyeon; Kim, Byung-Seop

    2014-05-01

    The Smart Water Grid (SWG) concept has emerged globally over the last decade and has gained significant recognition in South Korea. In particular, growing interest in water demand forecasting and optimal pump operation has led to various studies on energy saving and improving water supply reliability. Existing water demand forecasting models fall into two groups according to how they model and predict time-series behavior: one considers embedded patterns such as seasonality, periodicity and trends, and the other uses autoregressive models based on short-memory Markovian processes (Emmanuel et al., 2012). The main disadvantage of these models is their limited predictability of water demand at sub-daily scales, because the system is nonlinear. This study therefore aims to develop a nonlinear ensemble model for hourly water demand forecasting that allows uncertainties to be estimated across different model classes. The proposed model consists of two parts: a multi-model scheme based on the combination of independent prediction models, and a cross-validation scheme, the bagging approach introduced by Breiman (1996), used to derive weighting factors for the individual models. The individual forecasting models used in this study are linear regression, polynomial regression, multivariate adaptive regression splines (MARS) and support vector machines (SVM). The approach is demonstrated through application to data observed at water plants at several locations in South Korea. Keywords: water demand, nonlinear model, ensemble forecasting model, uncertainty. Acknowledgements: This work is supported by the Korea Ministry of Environment through the "Projects for Developing Eco-Innovation Technologies" (GT-11-G-02-001-6).
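    The multi-model weighting idea can be sketched as follows: weight each independent model by the inverse of its cross-validated mean-squared error. This is a simplified stand-in for the Breiman-style bagging weights described above, and MARS is omitted because scikit-learn does not provide it.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.svm import SVR

def weighted_ensemble(models, X, y):
    """Combine independent demand models with weights proportional to
    the inverse of their 5-fold cross-validated MSE; returns a
    prediction function and the weight vector."""
    mse = np.array([
        -cross_val_score(m, X, y, cv=5,
                         scoring="neg_mean_squared_error").mean()
        for m in models])
    weights = (1.0 / mse) / np.sum(1.0 / mse)
    fitted = [m.fit(X, y) for m in models]

    def predict(X_new):
        return sum(w * m.predict(X_new) for w, m in zip(weights, fitted))

    return predict, weights
```

Models that generalize poorly in cross-validation receive correspondingly small weights, which is the mechanism by which the ensemble expresses uncertainty across model classes.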

  1. New Approach To Hour-By-Hour Weather Forecast

    NASA Astrophysics Data System (ADS)

    Liao, Q. Q.; Wang, B.

    2017-12-01

    Fine hourly forecasts at single stations are required in many production and daily-life applications. Most previous MOS (Model Output Statistics) approaches used a linear regression model, which struggles with the nonlinear nature of weather, and forecast accuracy has been insufficient at high temporal resolution. This study predicts future meteorological elements, including temperature, precipitation, relative humidity and wind speed, in a local region over a relatively short period at hourly resolution. Using hour-by-hour NWP (Numerical Weather Prediction) meteorological fields from Forecast.io (https://darksky.net/dev/docs/forecast) and real-time instrumental observations from 29 stations in Yunnan and 3 stations in Tianjin, China, from June to October 2016, hour-by-hour predictions are made up to 24 hours ahead. This study presents an ensemble approach that combines information from the instrumental observations themselves and from NWP: an autoregressive-moving-average (ARMA) model predicts future values of the observation time series; the newest NWP products are fed into equations derived from the multiple-linear-regression MOS technique; and the residual series of the MOS output is handled with an autoregressive (AR) model to capture its linear structure. Because of the complexity of the nonlinear behavior of atmospheric flow, a support vector machine (SVM) is also introduced. Basic data quality control and cross-validation make it possible to optimize the model parameters and to perform 24-hour-ahead residual reduction with the AR/SVM models. Results show that the AR technique outperforms the corresponding multi-variate MOS regression, especially in the first 4 hours when the predictand is temperature. The combined MOS-AR model, which is comparable to the MOS-SVM model, outperforms MOS alone: their root mean square errors and correlation coefficients for 2 m temperature reach 1.6 degrees Celsius and 0.91, respectively.
The fraction of 24-hour forecasts with deviation no more than 2 degrees Celsius is 78.75% for the MOS-AR model and 81.23% for the AR model.
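    The MOS-plus-AR residual correction at the core of the approach can be sketched with one predictor and an AR(1) residual model; the data and function name are illustrative, and the paper's SVM variant and quality-control steps are omitted.

```python
import numpy as np

def mos_ar_forecast(nwp, obs, nwp_next):
    """MOS-AR sketch under simplifying assumptions (one predictor,
    AR(1) residuals): regress station observations on NWP output
    (the MOS step), fit a lag-1 autoregression to the MOS residual
    series, and add the residual prediction to the next MOS forecast."""
    nwp, obs = np.asarray(nwp, float), np.asarray(obs, float)
    A = np.column_stack([nwp, np.ones_like(nwp)])
    coef, *_ = np.linalg.lstsq(A, obs, rcond=None)
    resid = obs - A @ coef
    # Lag-1 autoregression coefficient of the MOS residuals
    # (small regularizer avoids division by zero for a perfect fit).
    phi = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1] + 1e-12)
    return coef[0] * nwp_next + coef[1] + phi * resid[-1]
```

When the residuals are persistent, the AR term carries recent local bias forward, which is why the correction helps most in the first few lead hours.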

  2. Balanced Atmospheric Data Assimilation

    NASA Astrophysics Data System (ADS)

    Hastermann, Gottfried; Reinhardt, Maria; Klein, Rupert; Reich, Sebastian

    2017-04-01

    The atmosphere's multi-scale structure poses several major challenges in numerical weather prediction. One of these arises in the context of data assimilation. The large-scale dynamics of the atmosphere are balanced in the sense that acoustic or rapid internal wave oscillations generally come with negligibly small amplitudes. If triggered artificially, however, through inappropriate initialization or by data assimilation, such oscillations can have a detrimental effect on forecast quality as they interact with the moist aerothermodynamics of the atmosphere. In the setting of sequential Bayesian data assimilation, we therefore investigate two different strategies to reduce these artificial oscillations induced by the analysis step. On the one hand, we develop a new modification for a local ensemble transform Kalman filter, which penalizes imbalances via a minimization problem. On the other hand, we modify the first steps of the subsequent forecast to push the ensemble members back to the slow evolution. We therefore propose the use of certain asymptotically consistent integrators that can blend between the balanced and the unbalanced evolution model seamlessly. In our work, we furthermore present numerical results and performance of the proposed methods for two nonlinear ordinary differential equation models, where we can identify the different scales clearly. The first one is a Lorenz 96 model coupled with a wave equation. In this case the balance relation is linear and the imbalances are caused only by the localization of the filter. The second one is the elastic double pendulum where the balance relation itself is already highly nonlinear. In both cases the methods perform very well and could significantly reduce the imbalances and therefore increase the forecast quality of the slow variables.

  3. Ensemble downscaling in coupled solar wind-magnetosphere modeling for space weather forecasting.

    PubMed

    Owens, M J; Horbury, T S; Wicks, R T; McGregor, S L; Savani, N P; Xiong, M

    2014-06-01

    Advanced forecasting of space weather requires simulation of the whole Sun-to-Earth system, which necessitates driving magnetospheric models with the outputs from solar wind models. This presents a fundamental difficulty, as the magnetosphere is sensitive to both large-scale solar wind structures, which can be captured by solar wind models, and small-scale solar wind "noise," which is far below typical solar wind model resolution and results primarily from stochastic processes. Following similar approaches in terrestrial climate modeling, we propose statistical "downscaling" of solar wind model results prior to their use as input to a magnetospheric model. As the magnetospheric response can be highly nonlinear, this is preferable to downscaling the results of magnetospheric modeling. To demonstrate the benefit of this approach, we first approximate solar wind model output by smoothing solar wind observations with an 8 h filter, then add small-scale structure back in through the addition of random noise with the observed spectral characteristics. Here we use a very simple parameterization of noise based upon the observed probability distribution functions of solar wind parameters, but more sophisticated methods will be developed in the future. An ensemble of results from the simple downscaling scheme is tested using a model-independent method and shown to add value to the magnetospheric forecast, both improving the best estimate and quantifying the uncertainty. We suggest a number of features desirable in an operational solar wind downscaling scheme. Key points: solar wind models must be downscaled in order to drive magnetospheric models; ensemble downscaling is more effective than deterministic downscaling; the magnetosphere responds nonlinearly to small-scale solar wind fluctuations.
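    The downscaling recipe above (smooth large-scale model output plus stochastic small-scale noise) can be sketched as follows; plain resampling from an observed noise sample is used here, which ignores the observed spectral characteristics the paper imposes.

```python
import numpy as np

def downscale_ensemble(model_series, noise_sample, n_members, rng):
    """Statistical downscaling sketch: treat the solar wind model
    output as the resolved large-scale component and build an
    ensemble by adding small-scale fluctuations resampled from an
    observed noise sample.  Temporal correlation of the noise is
    deliberately ignored in this simplification."""
    model_series = np.asarray(model_series, float)
    draws = rng.choice(np.asarray(noise_sample, float),
                       size=(n_members, model_series.size))
    return model_series[None, :] + draws
```

Each ensemble member is then a plausible high-resolution driver for the magnetospheric model, and the member spread quantifies the uncertainty contributed by unresolved fluctuations.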

  4. State-space forecasting of Schistosoma haematobium time-series in Niono, Mali.

    PubMed

    Medina, Daniel C; Findley, Sally E; Doumbia, Seydou

    2008-08-13

    Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with infectious diseases. The incidence of Schistosoma sp.-which are neglected tropical diseases exposing and infecting more than 500 and 200 million individuals in 77 countries, respectively-is rising because of 1) numerous irrigation and hydro-electric projects, 2) steady shifts from nomadic to sedentary existence, and 3) ineffective control programs. Notwithstanding the colossal scope of these parasitic infections, less than 0.5% of Schistosoma sp. investigations have attempted to predict their spatial and/or temporal distributions. Undoubtedly, public health programs in developing countries could benefit from parsimonious forecasting and early warning systems to enhance management of these parasitic diseases. In this longitudinal retrospective (01/1996-06/2004) investigation, the Schistosoma haematobium time-series for the district of Niono, Mali, was fitted with general-purpose exponential smoothing methods to generate contemporaneous on-line forecasts. These methods, which are encapsulated within a state-space framework, accommodate seasonal and inter-annual time-series fluctuations. Mean absolute percentage error values were circa 25% for 1- to 5-month horizon forecasts. The exponential smoothing state-space framework employed herein produced reasonably accurate forecasts for this time-series, which reflects the incidence of S. haematobium-induced terminal hematuria. It obliquely captured prior non-linear interactions between disease dynamics and exogenous covariates (e.g., climate, irrigation, and public health interventions), thus obviating the need for more complex forecasting methods in the district of Niono, Mali. Therefore, this framework could assist with managing and assessing S. haematobium transmission and intervention impact, respectively, in this district and potentially elsewhere in the Sahel.
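    As a rough illustration of the family of methods used here, the sketch below implements additive Holt-Winters exponential smoothing (one member of the exponential smoothing state-space family) by hand on a synthetic seasonal series and scores it with MAPE. The smoothing parameters and data are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

def holt_winters_additive(y, m=12, alpha=0.3, beta=0.05, gamma=0.2, h=5):
    """Additive Holt-Winters smoothing; returns h-step-ahead forecasts."""
    level = y[:m].mean()
    trend = (y[m:2 * m].mean() - y[:m].mean()) / m
    season = list(y[:m] - level)              # one seasonal index per month
    for i in range(len(y)):
        s = season[i % m]
        last_level = level
        level = alpha * (y[i] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[i % m] = gamma * (y[i] - level) + (1 - gamma) * s
    n = len(y)
    return np.array([level + (k + 1) * trend + season[(n + k) % m] for k in range(h)])

rng = np.random.default_rng(1)
months = np.arange(120)   # ten years of synthetic monthly incidence
series = 100 + 0.2 * months + 30 * np.sin(2 * np.pi * months / 12) \
         + 5 * rng.standard_normal(120)

train, test = series[:-5], series[-5:]
fcast = holt_winters_additive(train, h=5)
mape = 100 * np.mean(np.abs((test - fcast) / test))
print(round(mape, 1))
```

    In the paper's state-space formulation the same recursions arise as the filtered estimates of unobserved level, trend, and seasonal states.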

  5. State–Space Forecasting of Schistosoma haematobium Time-Series in Niono, Mali

    PubMed Central

    Medina, Daniel C.; Findley, Sally E.; Doumbia, Seydou

    2008-01-01

    Background Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with infectious diseases. The incidence of Schistosoma sp.—which are neglected tropical diseases exposing and infecting more than 500 and 200 million individuals in 77 countries, respectively—is rising because of 1) numerous irrigation and hydro-electric projects, 2) steady shifts from nomadic to sedentary existence, and 3) ineffective control programs. Notwithstanding the colossal scope of these parasitic infections, less than 0.5% of Schistosoma sp. investigations have attempted to predict their spatial and/or temporal distributions. Undoubtedly, public health programs in developing countries could benefit from parsimonious forecasting and early warning systems to enhance management of these parasitic diseases. Methodology/Principal Findings In this longitudinal retrospective (01/1996–06/2004) investigation, the Schistosoma haematobium time-series for the district of Niono, Mali, was fitted with general-purpose exponential smoothing methods to generate contemporaneous on-line forecasts. These methods, which are encapsulated within a state–space framework, accommodate seasonal and inter-annual time-series fluctuations. Mean absolute percentage error values were circa 25% for 1- to 5-month horizon forecasts. Conclusions/Significance The exponential smoothing state–space framework employed herein produced reasonably accurate forecasts for this time-series, which reflects the incidence of S. haematobium–induced terminal hematuria. It obliquely captured prior non-linear interactions between disease dynamics and exogenous covariates (e.g., climate, irrigation, and public health interventions), thus obviating the need for more complex forecasting methods in the district of Niono, Mali. Therefore, this framework could assist with managing and assessing S. 
haematobium transmission and intervention impact, respectively, in this district and potentially elsewhere in the Sahel. PMID:18698361

  6. Stochastic Convection Parameterizations

    NASA Technical Reports Server (NTRS)

    Teixeira, Joao; Reynolds, Carolyn; Suselj, Kay; Matheou, Georgios

    2012-01-01

    computational fluid dynamics, radiation, clouds, turbulence, convection, gravity waves, surface interaction, radiation interaction, cloud and aerosol microphysics, complexity (vegetation, biogeochemistry), radiation versus turbulence/convection, stochastic approach, non-linearities, Monte Carlo, high resolutions, large-eddy simulations, cloud structure, plumes, saturation in tropics, forecasting, parameterizations, stochastic, radiation-cloud interaction, hurricane forecasts

  7. Evaluation of trade influence on economic growth rate by computational intelligence approach

    NASA Astrophysics Data System (ADS)

    Sokolov-Mladenović, Svetlana; Milovančević, Milos; Mladenović, Igor

    2017-01-01

    In this study, the influence of trade parameters on economic growth forecasting accuracy was analyzed. A computational intelligence method was used for the analysis, since such methods can handle highly nonlinear data. It is known that economic growth can be modeled from different trade parameters. Five input parameters were considered: trade in services, exports of goods and services, imports of goods and services, trade, and merchandise trade. All these parameters were expressed as percentages of gross domestic product (GDP). The main goal was to determine which parameters have the greatest impact on the economic growth percentage. GDP was used as the economic growth indicator. Results show that imports of goods and services have the highest influence on the economic growth forecasting accuracy.

  8. On the validity of cosmological Fisher matrix forecasts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolz, Laura; Kilbinger, Martin; Weller, Jochen

    2012-09-01

    We present a comparison of Fisher matrix forecasts for cosmological probes with Monte Carlo Markov Chain (MCMC) posterior likelihood estimation methods. We analyse the performance of future Dark Energy Task Force (DETF) stage-III and stage-IV dark-energy surveys using supernovae, baryon acoustic oscillations and weak lensing as probes. We concentrate in particular on the dark-energy equation of state parameters w0 and wa. For purely geometrical probes, and especially when marginalising over wa, we find considerable disagreement between the two methods, since in this case the Fisher matrix cannot reproduce the highly non-elliptical shape of the likelihood function. More quantitatively, the Fisher method underestimates the marginalized errors for purely geometrical probes by 30%-70%. For cases including structure formation, such as weak lensing, we find that the posterior probability contours from the Fisher matrix estimation are in good agreement with the MCMC contours, with the forecasted errors changing only at the 5% level. We then explore non-linear transformations resulting in physically motivated parameters and investigate whether these parameterisations exhibit Gaussian behaviour. We conclude that for the purely geometrical probes and, more generally, in cases where it is not known whether the likelihood is close to Gaussian, the Fisher matrix is not the appropriate tool to produce reliable forecasts.
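    The Fisher forecast machinery being critiqued is compact enough to sketch. Assuming a toy two-parameter observable with Gaussian errors (the model and numbers below are illustrative, not the DETF survey setup), the marginalized error on a parameter is the square root of the corresponding diagonal element of the inverse Fisher matrix, and is never smaller than the conditional (fixed-other-parameters) error:

```python
import numpy as np

# Toy observable mu(z) = w0 + wa * z / (1 + z), with Gaussian errors sigma.
z = np.linspace(0.1, 1.5, 30)
sigma = 0.05

# Derivatives of the observable with respect to the parameters (w0, wa).
dmu_dw0 = np.ones_like(z)
dmu_dwa = z / (1 + z)

# Fisher matrix: F_ij = sum_k (dmu_k/dtheta_i)(dmu_k/dtheta_j) / sigma^2.
J = np.stack([dmu_dw0, dmu_dwa], axis=1)
F = J.T @ J / sigma**2

cov = np.linalg.inv(F)
marginalized = np.sqrt(np.diag(cov))    # error after marginalizing over the other parameter
conditional = 1 / np.sqrt(np.diag(F))   # error with the other parameter held fixed

print(marginalized, conditional)
```

    The Gaussian (elliptical) contours implied by F are exactly what the paper finds inadequate for strongly non-elliptical likelihoods.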

  9. A probabilistic neural network based approach for predicting the output power of wind turbines

    NASA Astrophysics Data System (ADS)

    Tabatabaei, Sajad

    2017-03-01

    Reliable tools for quantifying the uncertainty of wind speed forecasts are highly required as wind power sources penetrate power systems ever more strongly. Traditional models that generate only point forecasts are no longer adequate. Thus, the present paper uses the concept of prediction intervals (PIs) to assess the uncertainty of wind power generation in power systems. This paper also uses a recently introduced non-parametric approach called lower upper bound estimation (LUBE) to build the PIs, since the forecasting errors cannot be modelled properly by probability distribution functions. In the proposed LUBE method, a PI combination-based fuzzy framework is used to overcome the performance instability of the neural networks (NNs) used in LUBE. In comparison with other methods, this formulation better satisfies the PI coverage probability and PI normalised average width (PINAW). Since this non-linear problem is highly complex, a new heuristic-based optimisation algorithm incorporating a novel modification is introduced to solve it. Based on data sets taken from a wind farm in Australia, the feasibility and satisfactory performance of the suggested method are demonstrated.
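    The two interval-quality scores named above are simple to compute from interval bounds. A minimal sketch, with hypothetical data standing in for wind power observations and LUBE-style intervals:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical wind power observations and prediction intervals (lower, upper).
y = rng.uniform(0, 100, size=200)
lower = y - rng.uniform(5, 15, size=200)
upper = y + rng.uniform(5, 15, size=200)
# Corrupt a few intervals so that coverage is below 100%.
lower[:10] = y[:10] + 1

# PI coverage probability: fraction of observations falling inside their interval.
picp = np.mean((y >= lower) & (y <= upper))

# PI normalised average width: mean interval width scaled by the target range.
pinaw = np.mean(upper - lower) / (y.max() - y.min())

print(round(picp, 3), round(pinaw, 3))
```

    Good intervals maximise coverage while keeping PINAW small; methods like LUBE train the interval bounds against exactly this trade-off.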

  10. Analysis and prediction of aperiodic hydrodynamic oscillatory time series by feed-forward neural networks, fuzzy logic, and a local nonlinear predictor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gentili, Pier Luigi, E-mail: pierluigi.gentili@unipg.it; Gotoda, Hiroshi; Dolnik, Milos

    Forecasting of aperiodic time series is a compelling challenge for science. In this work, we analyze aperiodic spectrophotometric data, proportional to the concentrations of two forms of a thermoreversible photochromic spiro-oxazine, that are generated when a cuvette containing a solution of the spiro-oxazine undergoes photoreaction and convection due to localized ultraviolet illumination. We construct the phase space for the system using Takens' theorem and we calculate the Lyapunov exponents and the correlation dimensions to ascertain the chaotic character of the time series. Finally, we predict the time series using three distinct methods: a feed-forward neural network, fuzzy logic, and a local nonlinear predictor. We compare the performances of these three methods.
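    The phase-space reconstruction and local prediction steps can be sketched generically: embed the series with Takens delays, find the nearest embedded states to the current one, and average their successors. This is a standard analog predictor on a logistic-map series, not the authors' implementation; embedding dimension, delay, and neighbour count are illustrative.

```python
import numpy as np

def delay_embed(x, dim, tau):
    # Row i is the reconstructed state (x[i], x[i+tau], ..., x[i+(dim-1)*tau]).
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[j * tau : j * tau + n] for j in range(dim)])

def local_nonlinear_predict(x, dim=3, tau=1, k=3):
    """One-step forecast: average the successors of the k nearest embedded states."""
    emb = delay_embed(x, dim, tau)
    query = emb[-1]                 # current state
    history = emb[:-1]              # states whose successor is known
    succ_idx = np.arange(len(history)) + (dim - 1) * tau + 1
    dists = np.linalg.norm(history - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return x[succ_idx[nearest]].mean()

# Chaotic logistic map as a stand-in for an aperiodic measured series.
x = np.empty(2000)
x[0] = 0.4
for i in range(len(x) - 1):
    x[i + 1] = 3.9 * x[i] * (1 - x[i])

pred = local_nonlinear_predict(x[:-1])   # predict the held-out last value
print(round(pred, 3))
```

    On deterministic data such local predictors are accurate at short horizons; their error growth with horizon is itself a diagnostic of chaos.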

  11. [Research on engine remaining useful life prediction based on oil spectrum analysis and particle filtering].

    PubMed

    Sun, Lei; Jia, Yun-xian; Cai, Li-ying; Lin, Guo-yu; Zhao, Jin-song

    2013-09-01

    Spectrometric oil analysis (SOA) is an important technique for machine state monitoring, fault diagnosis and prognosis, and SOA-based remaining useful life (RUL) prediction has the advantage of identifying the optimal maintenance strategy for a machine system. Because of the complexity of machine systems, their health-state degradation process cannot be simply characterized by a linear model, while particle filtering (PF) possesses obvious advantages over traditional Kalman filtering in dealing with nonlinear and non-Gaussian systems. The PF approach was therefore applied to state forecasting from SOA data, and an RUL prediction technique based on SOA and the PF algorithm is proposed. In the prediction model, the prior probability distribution is updated according to the estimate of the system's posterior probability, and a multi-step-ahead prediction model based on the PF algorithm is established. Finally, practical SOA data from an engine were analyzed and forecast by the above method, and the result was compared with that of the traditional Kalman filtering method. The result fully shows the superiority and effectiveness of the proposed method.
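    A bootstrap particle filter of the kind described, with propagation, likelihood weighting, resampling, and multi-step-ahead prediction, can be sketched as below. The degradation and observation models are invented stand-ins for the engine/SOA setting; all constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy degradation model: wear grows nonlinearly with process noise; noisy
# spectrometric readings observe it directly (hypothetical units).
def step(x):
    return x + 0.05 * x + 0.1 * rng.standard_normal(x.shape)

# Simulate a "true" wear trajectory and noisy SOA measurements.
true = [1.0]
for _ in range(29):
    true.append(true[-1] * 1.05 + 0.1 * rng.standard_normal())
obs = np.array(true) + 0.2 * rng.standard_normal(30)

# Bootstrap particle filter: propagate, weight by likelihood, resample.
n = 1000
particles = 1.0 + 0.1 * rng.standard_normal(n)
for z in obs:
    particles = step(particles)                        # propagate
    w = np.exp(-0.5 * ((z - particles) / 0.2) ** 2)    # Gaussian likelihood
    w /= w.sum()
    particles = rng.choice(particles, size=n, p=w)     # resample

# Multi-step-ahead prediction: run the posterior particles forward in time.
ahead = particles.copy()
for _ in range(5):
    ahead = step(ahead)

print(round(ahead.mean(), 2), round(ahead.std(), 2))
```

    The predicted particle cloud gives both a point forecast (its mean) and an uncertainty band; the RUL estimate follows from the first time the cloud crosses a failure threshold.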

  12. Nonlinear solar cycle forecasting: theory and perspectives

    NASA Astrophysics Data System (ADS)

    Baranovski, A. L.; Clette, F.; Nollau, V.

    2008-02-01

    In this paper we develop a modern approach to solar cycle forecasting, based on the mathematical theory of nonlinear dynamics. We start from the design of a static curve fitting model for the experimental yearly sunspot number series, over a time scale of 306 years starting from year 1700, and we establish a least-squares optimal pulse shape of a solar cycle. The cycle-to-cycle evolution of the parameters of the cycle shape displays different patterns, such as a Gleissberg cycle and a strong anomaly in the cycle evolution during the Dalton minimum. In a second step, we extract a chaotic mapping for the successive values of one of the key model parameters - the rate of the exponential growth-decrease of the solar activity during the n-th cycle. We examine piece-wise linear techniques for the approximation of the derived mapping and we provide its probabilistic analysis: calculation of the invariant distribution and autocorrelation function. We find analytical relationships for the sunspot maxima and minima, as well as their occurrence times, as functions of chaotic values of the above parameter. Based on a Lyapunov spectrum analysis of the embedded mapping, we finally establish a horizon of predictability for the method, which allows us to give the most probable forecast of the upcoming solar cycle 24, with an expected peak height of 93±21 occurring in 2011/2012.

  13. Developing a Mixed Neural Network Approach to Forecast the Residential Electricity Consumption Based on Sensor Recorded Data

    PubMed Central

    Bâra, Adela; Stănică, Justina-Lavinia; Coculescu, Cristina

    2018-01-01

    In this paper, we report a study whose main goal is to obtain a method that can provide an accurate forecast of residential electricity consumption, refined down to the appliance level, using sensor-recorded data, for residential smart home complexes that draw part of their consumed electricity from renewable energy sources. The method overcomes the limitations posed by the lack of available historical meteorological data and by the contractor's unwillingness to periodically acquire accurate short-term forecasts from a specialized institute in the future, due to the implied costs. For this purpose, we developed a mixed artificial neural network (ANN) approach using both non-linear autoregressive with exogenous input (NARX) ANNs and function fitting neural networks (FITNETs). We used a large dataset containing detailed electricity consumption data recorded by sensors monitoring a series of individual appliances, while in the NARX case we also used timestamp datasets as exogenous variables. After having developed and validated the forecasting method, we compiled it with a view to incorporating it into a cloud solution, delivered to the contractor, which can provide it as a service for a monthly fee to both operators and residential consumers. PMID:29734761

  14. Developing a Mixed Neural Network Approach to Forecast the Residential Electricity Consumption Based on Sensor Recorded Data.

    PubMed

    Oprea, Simona-Vasilica; Pîrjan, Alexandru; Căruțașu, George; Petroșanu, Dana-Mihaela; Bâra, Adela; Stănică, Justina-Lavinia; Coculescu, Cristina

    2018-05-05

    In this paper, we report a study whose main goal is to obtain a method that can provide an accurate forecast of residential electricity consumption, refined down to the appliance level, using sensor-recorded data, for residential smart home complexes that draw part of their consumed electricity from renewable energy sources. The method overcomes the limitations posed by the lack of available historical meteorological data and by the contractor's unwillingness to periodically acquire accurate short-term forecasts from a specialized institute in the future, due to the implied costs. For this purpose, we developed a mixed artificial neural network (ANN) approach using both non-linear autoregressive with exogenous input (NARX) ANNs and function fitting neural networks (FITNETs). We used a large dataset containing detailed electricity consumption data recorded by sensors monitoring a series of individual appliances, while in the NARX case we also used timestamp datasets as exogenous variables. After having developed and validated the forecasting method, we compiled it with a view to incorporating it into a cloud solution, delivered to the contractor, which can provide it as a service for a monthly fee to both operators and residential consumers.
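    The core of a NARX-style model is a design matrix of lagged outputs plus exogenous (here timestamp-derived) inputs mapped to the next value. The sketch below uses plain least squares in place of the neural network to keep it short; the consumption data, lag count, and hour-of-day features are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic hourly consumption: a daily cycle plus noise; the hour-of-day
# timestamp serves as the exogenous input (all values are invented).
t = np.arange(24 * 60)
hour = t % 24
y = 2 + np.sin(2 * np.pi * hour / 24) + 0.1 * rng.standard_normal(t.size)

# NARX-style design matrix: lagged outputs y[t-3..t-1] plus exogenous
# timestamp features; least squares stands in for the neural network.
lags = 3
rows, targets = [], []
for i in range(lags, len(y)):
    rows.append(np.concatenate([y[i - lags:i],
                                [np.sin(2 * np.pi * hour[i] / 24),
                                 np.cos(2 * np.pi * hour[i] / 24),
                                 1.0]]))
    targets.append(y[i])
X, yt = np.array(rows), np.array(targets)

split = len(X) - 24                       # hold out the final day
coef, *_ = np.linalg.lstsq(X[:split], yt[:split], rcond=None)
pred = X[split:] @ coef
rmse = np.sqrt(np.mean((yt[split:] - pred) ** 2))
print(round(rmse, 3))
```

    A NARX network replaces the linear map with a nonlinear one over the same lagged-plus-exogenous inputs, which is what lets it capture appliance-level nonlinearities.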

  15. Results on SSH neural network forecasting in the Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Rixen, Michel; Beckers, Jean-Marie; Alvarez, Alberto; Tintore, Joaquim

    2002-01-01

    Nowadays, satellites are the only monitoring systems that cover almost continuously all possible ocean areas, and they are now an essential part of operational oceanography. A novel approach based on artificial intelligence (AI) concepts exploits past time series of satellite images to infer near-future ocean conditions at the surface by neural networks and genetic algorithms. The size of the AI problem is drastically reduced by splitting the spatio-temporal variability contained in the remote sensing data using empirical orthogonal function (EOF) decomposition. The problem of forecasting the dynamics of a 2D surface field can thus be reduced by selecting the most relevant empirical modes, and non-linear time series predictors are then applied to the amplitudes only. In the present case study, we use altimetric maps of the Mediterranean Sea, combining TOPEX-POSEIDON and ERS-1/2 data for the period 1992 to 1997. The learning procedure is applied to each mode individually. The final forecast is then reconstructed from the EOFs and the forecasted amplitudes, and compared to the real observed field for validation of the method.
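    The EOF-then-forecast-the-amplitudes pipeline can be sketched with an SVD and a simple autoregressive amplitude model (standing in for the paper's neural network and genetic algorithm predictors); the SSH-like field below is synthetic and all dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic data: 100 weekly maps on a 20x30 grid dominated by one
# oscillating spatial pattern plus noise (a stand-in for altimetric maps).
nt, ny, nx = 100, 20, 30
pattern = np.outer(np.sin(np.linspace(0, np.pi, ny)),
                   np.cos(np.linspace(0, np.pi, nx)))
amp_true = np.sin(2 * np.pi * np.arange(nt) / 25)
field = amp_true[:, None, None] * pattern + 0.05 * rng.standard_normal((nt, ny, nx))

# EOF decomposition via SVD of the (time x space) anomaly matrix.
X = field.reshape(nt, -1)
mean_map = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mean_map, full_matrices=False)
amps = U[:, 0] * S[0]    # leading mode amplitude time series
eof1 = Vt[0]             # leading spatial mode

# Forecast the leading amplitude one step ahead with an AR(2) model fitted by
# least squares, then reconstruct the forecast map from mean field and EOF.
A = np.column_stack([amps[1:-1], amps[:-2]])
coef, *_ = np.linalg.lstsq(A, amps[2:], rcond=None)
next_amp = coef[0] * amps[-1] + coef[1] * amps[-2]
forecast_map = (mean_map + next_amp * eof1).reshape(ny, nx)
print(forecast_map.shape)
```

    Forecasting a handful of mode amplitudes instead of every grid point is exactly the dimensionality reduction that makes the AI problem tractable.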

  16. Ensemble Nonlinear Autoregressive Exogenous Artificial Neural Networks for Short-Term Wind Speed and Power Forecasting.

    PubMed

    Men, Zhongxian; Yee, Eugene; Lien, Fue-Sang; Yang, Zhiling; Liu, Yongqian

    2014-01-01

    Short-term wind speed and wind power forecasts (for a 72 h period) are obtained using a nonlinear autoregressive exogenous artificial neural network (ANN) methodology which incorporates either numerical weather prediction or high-resolution computational fluid dynamics wind field information as an exogenous input. An ensemble approach is used to combine the predictions from many candidate ANNs in order to provide improved forecasts for wind speed and power, along with the associated uncertainties in these forecasts. More specifically, the ensemble ANN is used to quantify the uncertainties arising from the network weight initialization and from the unknown structure of the ANN. All members forming the ensemble of neural networks were trained using an efficient particle swarm optimization algorithm. The results of the proposed methodology are validated using wind speed and wind power data obtained from an operational wind farm located in Northern China. The assessment demonstrates that this methodology for wind speed and power forecasting generally provides an improvement in predictive skills when compared to the practice of using an "optimal" weight vector from a single ANN while providing additional information in the form of prediction uncertainty bounds.

  17. Ensemble Nonlinear Autoregressive Exogenous Artificial Neural Networks for Short-Term Wind Speed and Power Forecasting

    PubMed Central

    Lien, Fue-Sang; Yang, Zhiling; Liu, Yongqian

    2014-01-01

    Short-term wind speed and wind power forecasts (for a 72 h period) are obtained using a nonlinear autoregressive exogenous artificial neural network (ANN) methodology which incorporates either numerical weather prediction or high-resolution computational fluid dynamics wind field information as an exogenous input. An ensemble approach is used to combine the predictions from many candidate ANNs in order to provide improved forecasts for wind speed and power, along with the associated uncertainties in these forecasts. More specifically, the ensemble ANN is used to quantify the uncertainties arising from the network weight initialization and from the unknown structure of the ANN. All members forming the ensemble of neural networks were trained using an efficient particle swarm optimization algorithm. The results of the proposed methodology are validated using wind speed and wind power data obtained from an operational wind farm located in Northern China. The assessment demonstrates that this methodology for wind speed and power forecasting generally provides an improvement in predictive skills when compared to the practice of using an “optimal” weight vector from a single ANN while providing additional information in the form of prediction uncertainty bounds. PMID:27382627

  18. Use of forecasting signatures to help distinguish periodicity, randomness, and chaos in ripples and other spatial patterns

    USGS Publications Warehouse

    Rubin, D.M.

    1992-01-01

    Forecasting of one-dimensional time series previously has been used to help distinguish periodicity, chaos, and noise. This paper presents two-dimensional generalizations for making such distinctions for spatial patterns. The techniques are evaluated using synthetic spatial patterns and then are applied to a natural example: ripples formed in sand by blowing wind. Tests with the synthetic patterns demonstrate that the forecasting techniques can be applied to two-dimensional spatial patterns, with the same utility and limitations as when applied to one-dimensional time series. One limitation is that some combinations of periodicity and randomness exhibit forecasting signatures that mimic those of chaos. For example, sine waves distorted with correlated phase noise have forecasting errors that increase with forecasting distance, errors that are minimized using nonlinear models at moderate embedding dimensions, and forecasting properties that differ significantly between the original and surrogates. Ripples formed in sand by flowing air or water typically vary in geometry from one to another, even when formed in a flow that is uniform on a large scale; each ripple modifies the local flow or sand-transport field, thereby influencing the geometry of the next ripple downcurrent. Spatial forecasting was used to evaluate the hypothesis that such a deterministic process - rather than randomness or quasiperiodicity - is responsible for the variation between successive ripples. This hypothesis is supported by a forecasting error that increases with forecasting distance, a greater accuracy of nonlinear relative to linear models, and significant differences between forecasts made with the original ripples and those made with surrogate patterns. 
Forecasting signatures cannot be used to distinguish ripple geometry from sine waves with correlated phase noise, but this kind of structure can be ruled out by two geometric properties of the ripples: Successive ripples are highly correlated in wavelength, and ripple crests display dislocations such as branchings and mergers. © 1992 American Institute of Physics.
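    A key ingredient of such tests is the surrogate pattern: data sharing the original's linear (spectral) structure but with any deterministic structure destroyed. A minimal one-dimensional phase-randomization sketch follows (the standard construction, not the paper's two-dimensional variant); the logistic-map series is an illustrative stand-in.

```python
import numpy as np

rng = np.random.default_rng(6)

def phase_randomized_surrogate(x, rng):
    """Surrogate with the same power spectrum but randomized Fourier phases."""
    spec = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, size=spec.size)
    phases[0] = 0.0                       # keep the mean (DC term real)
    if x.size % 2 == 0:
        phases[-1] = 0.0                  # keep the Nyquist term real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size)

# Deterministic (chaotic) series from the logistic map.
x = np.empty(1024)
x[0] = 0.3
for i in range(len(x) - 1):
    x[i + 1] = 3.9 * x[i] * (1 - x[i])

s = phase_randomized_surrogate(x, rng)

# The surrogate preserves the power spectrum (hence linear correlations) but
# destroys the determinism that nonlinear forecasting exploits, so a large
# forecast-skill gap between original and surrogate points to determinism.
print(np.allclose(np.abs(np.fft.rfft(s)), np.abs(np.fft.rfft(x))))
```

    If nonlinear forecasts of the original significantly beat forecasts of its surrogates, the variation is unlikely to be linear noise, which is the paper's line of argument for the ripples.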

  19. Comparison of Filtering Methods for the Modeling and Retrospective Forecasting of Influenza Epidemics

    PubMed Central

    Yang, Wan; Karspeck, Alicia; Shaman, Jeffrey

    2014-01-01

    A variety of filtering methods enable the recursive estimation of system state variables and inference of model parameters. These methods have found application in a range of disciplines and settings, including engineering design and forecasting, and, over the last two decades, have been applied to infectious disease epidemiology. For any system of interest, the ideal filter depends on the nonlinearity and complexity of the model to which it is applied, the quality and abundance of observations being entrained, and the ultimate application (e.g. forecast, parameter estimation, etc.). Here, we compare the performance of six state-of-the-art filter methods when used to model and forecast influenza activity. Three particle filters—a basic particle filter (PF) with resampling and regularization, maximum likelihood estimation via iterated filtering (MIF), and particle Markov chain Monte Carlo (pMCMC)—and three ensemble filters—the ensemble Kalman filter (EnKF), the ensemble adjustment Kalman filter (EAKF), and the rank histogram filter (RHF)—were used in conjunction with a humidity-forced susceptible-infectious-recovered-susceptible (SIRS) model and weekly estimates of influenza incidence. The modeling frameworks, first validated with synthetic influenza epidemic data, were then applied to fit and retrospectively forecast the historical incidence time series of seven influenza epidemics during 2003–2012, for 115 cities in the United States. Results suggest that when using the SIRS model the ensemble filters and the basic PF are more capable of faithfully recreating historical influenza incidence time series, while the MIF and pMCMC do not perform as well for multimodal outbreaks. For forecast of the week with the highest influenza activity, the accuracies of the six model-filter frameworks are comparable; the three particle filters perform slightly better predicting peaks 1–5 weeks in the future; the ensemble filters are more accurate predicting peaks in the past. 
PMID:24762780
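    The analysis step shared by the ensemble filters compared here can be sketched in isolation. Below is a perturbed-observation ensemble Kalman filter update for a toy two-variable epidemic state observed through incidence only; the state layout and all numbers are illustrative, not the paper's SIRS configuration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Prior ensemble for a 2-variable state (e.g. susceptible and infectious
# fractions); only the second component (incidence proxy) is observed.
n_ens = 200
ens = np.column_stack([rng.normal(0.7, 0.05, n_ens),    # susceptible fraction
                       rng.normal(0.02, 0.01, n_ens)])  # infectious fraction

obs = 0.035            # observed incidence
obs_var = 1e-5

H = np.array([0.0, 1.0])            # observation operator: pick out component 2
y_ens = ens @ H                      # model-predicted observations

# Kalman gain from ensemble covariances.
P_xy = np.cov(ens.T, y_ens)[:2, 2]   # cov(state, predicted observation)
P_yy = y_ens.var(ddof=1) + obs_var
K = P_xy / P_yy

# Perturbed-observation update: each member assimilates a jittered observation.
perturbed = obs + np.sqrt(obs_var) * rng.standard_normal(n_ens)
ens = ens + np.outer(perturbed - y_ens, K)

print(round(ens[:, 1].mean(), 4))
```

    The unobserved variable is updated through its ensemble covariance with the observed one, which is how these filters constrain S and model parameters from incidence alone.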

  20. Comparison of Two Hybrid Models for Forecasting the Incidence of Hemorrhagic Fever with Renal Syndrome in Jiangsu Province, China

    PubMed Central

    Wu, Wei; Guo, Junqiao; An, Shuyi; Guan, Peng; Ren, Yangwu; Xia, Linzi; Zhou, Baosen

    2015-01-01

    Background Cases of hemorrhagic fever with renal syndrome (HFRS) are widely distributed in eastern Asia, especially in China, Russia, and Korea. It has proved to be a difficult task to eliminate HFRS completely because of the diverse animal reservoirs and the effects of global warming. Reliable forecasting is useful for the prevention and control of HFRS. Methods Two hybrid models, one composed of a nonlinear autoregressive neural network (NARNN) and an autoregressive integrated moving average (ARIMA) model, the other composed of a generalized regression neural network (GRNN) and ARIMA, were constructed to predict the incidence of HFRS in the coming year. Performances of the two hybrid models were compared with the ARIMA model. Results The ARIMA, ARIMA-NARNN, and ARIMA-GRNN models fitted and predicted the seasonal fluctuation well. Among the three models, the mean square error (MSE), mean absolute error (MAE) and mean absolute percentage error (MAPE) of the ARIMA-NARNN hybrid model were the lowest in both the modeling stage and the forecasting stage. As for the ARIMA-GRNN hybrid model, the MSE, MAE and MAPE of the modeling performance and the MSE and MAE of the forecasting performance were less than those of the ARIMA model, but the MAPE of the forecasting performance did not improve. Conclusion Developing and applying the ARIMA-NARNN hybrid model is an effective way to better understand the epidemic characteristics of HFRS and could be helpful for the prevention and control of HFRS. PMID:26270814
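    The hybrid recipe, a linear stage capturing autocorrelation plus a nonlinear stage modeling its residuals, can be sketched generically. The code below substitutes least-squares AR(12) for ARIMA and k-nearest-neighbour regression for the NARNN, on synthetic data, so it shows the mechanics rather than the paper's fitted models; whether the nonlinear stage helps depends entirely on the data.

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic monthly incidence series with seasonality and noise.
t = np.arange(180)
y = 10 + 5 * np.sin(2 * np.pi * t / 12) + np.cos(4 * np.pi * t / 12) ** 3 \
    + 0.3 * rng.standard_normal(t.size)

p, q, hold = 12, 3, 12
X = np.array([y[i - p:i] for i in range(p, len(y))])
target = y[p:]
ntr = len(target) - hold                # training rows; hold out the last year

# Stage 1: linear AR(12) fitted by least squares (standing in for ARIMA).
coef, *_ = np.linalg.lstsq(X[:ntr], target[:ntr], rcond=None)
lin = X @ coef
resid = target - lin

# Stage 2: k-NN regression on lagged residuals (standing in for the NARNN).
Rq = np.array([resid[i - q:i] for i in range(q, len(resid))])
rt = resid[q:]
ntr_r = ntr - q                          # residual rows fully inside training

def knn(query, k=5):
    d = np.linalg.norm(Rq[:ntr_r] - query, axis=1)
    return rt[:ntr_r][np.argsort(d)[:k]].mean()

# One-step hybrid forecasts over the holdout: linear part plus predicted residual.
hybrid = lin[ntr:] + np.array([knn(resid[i - q:i]) for i in range(ntr, len(resid))])

rmse_lin = np.sqrt(np.mean((target[ntr:] - lin[ntr:]) ** 2))
rmse_hyb = np.sqrt(np.mean((target[ntr:] - hybrid) ** 2))
print(round(rmse_lin, 3), round(rmse_hyb, 3))
```

    In the paper's setting the nonlinear stage captures structure the linear model leaves in its residuals; comparing the two RMSEs on a holdout is how such hybrids are judged.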

  1. An Excel Solver Exercise to Introduce Nonlinear Regression

    ERIC Educational Resources Information Center

    Pinder, Jonathan P.

    2013-01-01

    Business students taking business analytics courses that have significant predictive modeling components, such as marketing research, data mining, forecasting, and advanced financial modeling, are introduced to nonlinear regression using application software that is a "black box" to the students. Thus, although correct models are…

  2. Multi-scale Quantitative Precipitation Forecasting Using Nonlinear and Nonstationary Teleconnection Signals and Artificial Neural Network Models

    EPA Science Inventory

    Global sea surface temperature (SST) anomalies can affect terrestrial precipitation via ocean-atmosphere interaction known as climate teleconnection. Non-stationary and non-linear characteristics of the ocean-atmosphere system make the identification of the teleconnection signals...

  3. Predicting financial market crashes using ghost singularities.

    PubMed

    Smug, Damian; Ashwin, Peter; Sornette, Didier

    2018-01-01

    We analyse the behaviour of a non-linear model of coupled stock and bond prices exhibiting periodically collapsing bubbles. By using the formalism of dynamical system theory, we explain what drives the bubbles and how foreshocks or aftershocks are generated. A dynamical phase space representation of that system coupled with standard multiplicative noise rationalises the log-periodic power law singularity pattern documented in many historical financial bubbles. The notion of 'ghosts of finite-time singularities' is introduced and used to estimate the end of an evolving bubble, using finite-time singularities of an approximate normal form near the bifurcation point. We test the forecasting skill of this method on different stochastic price realisations and compare with Monte Carlo simulations of the full system. Remarkably, the approximate normal form is significantly more precise and less biased. Moreover, the method of ghosts of singularities is less sensitive to the noise realisation, thus providing more robust forecasts.

  4. Predicting financial market crashes using ghost singularities

    PubMed Central

    2018-01-01

    We analyse the behaviour of a non-linear model of coupled stock and bond prices exhibiting periodically collapsing bubbles. By using the formalism of dynamical system theory, we explain what drives the bubbles and how foreshocks or aftershocks are generated. A dynamical phase space representation of that system coupled with standard multiplicative noise rationalises the log-periodic power law singularity pattern documented in many historical financial bubbles. The notion of ‘ghosts of finite-time singularities’ is introduced and used to estimate the end of an evolving bubble, using finite-time singularities of an approximate normal form near the bifurcation point. We test the forecasting skill of this method on different stochastic price realisations and compare with Monte Carlo simulations of the full system. Remarkably, the approximate normal form is significantly more precise and less biased. Moreover, the method of ghosts of singularities is less sensitive to the noise realisation, thus providing more robust forecasts. PMID:29596485

  5. Assessing the potential for improving S2S forecast skill through multimodel ensembling

    NASA Astrophysics Data System (ADS)

    Vigaud, N.; Robertson, A. W.; Tippett, M. K.; Wang, L.; Bell, M. J.

    2016-12-01

    Non-linear logistic regression is well suited to probability forecasting and has been successfully applied in the past to ensemble weather and climate predictions, providing access to the full probabilities distribution without any Gaussian assumption. However, little work has been done at sub-monthly lead times where relatively small re-forecast ensembles and lengths represent new challenges for which post-processing avenues have yet to be investigated. A promising approach consists in extending the definition of non-linear logistic regression by including the quantile of the forecast distribution as one of the predictors. So-called Extended Logistic Regression (ELR), which enables mutually consistent individual threshold probabilities, is here applied to ECMWF, CFSv2 and CMA re-forecasts from the S2S database in order to produce rainfall probabilities at weekly resolution. The ELR model is trained on seasonally-varying tercile categories computed for lead times of 1 to 4 weeks. It is then tested in a cross-validated manner, i.e. allowing real-time predictability applications, to produce rainfall tercile probabilities from individual weekly hindcasts that are finally combined by equal pooling. Results will be discussed over a broader North American region, where individual and MME forecasts generated out to 4 weeks lead are characterized by good probabilistic reliability but low sharpness, exhibiting systematically more skill in winter than summer.
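    Extended logistic regression can be sketched directly: the threshold (quantile) enters the regression as a predictor, so probabilities for all thresholds come from a single fitted model and cannot cross. The sketch below fits the three coefficients by plain gradient descent on synthetic data; the predictor, thresholds, and training scheme are all illustrative assumptions, not the S2S configuration.

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic ensemble-mean predictor x and observed rainfall proxy y.
n = 2000
x = rng.standard_normal(n)
y = 2.0 + 1.5 * x + rng.standard_normal(n)

# Tercile thresholds from the climatology of y.
q1, q2 = np.quantile(y, [1 / 3, 2 / 3])

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Extended logistic regression: P(y <= q) = sigmoid(a + b*x + c*q).
# Stack the training set over both thresholds so one model serves both.
Q = np.concatenate([np.full(n, q1), np.full(n, q2)])
Xs = np.concatenate([x, x])
label = np.concatenate([y <= q1, y <= q2]).astype(float)
feats = np.column_stack([np.ones(2 * n), Xs, Q])

theta = np.zeros(3)                     # (a, b, c)
for _ in range(3000):
    p = sigmoid(feats @ theta)
    grad = feats.T @ (p - label) / (2 * n)
    theta -= 0.3 * grad

a, b, c = theta
# Tercile probabilities for a climatological predictor value x = 0: because the
# same monotone model serves both thresholds, these cannot cross.
p_below_q1 = sigmoid(a + b * 0.0 + c * q1)
p_below_q2 = sigmoid(a + b * 0.0 + c * q2)
print(round(p_below_q1, 2), round(p_below_q2, 2))
```

    The mutual consistency of threshold probabilities is the property the abstract highlights; a separate logistic model per threshold offers no such guarantee.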

  6. Ensemble downscaling in coupled solar wind-magnetosphere modeling for space weather forecasting

    PubMed Central

    Owens, M J; Horbury, T S; Wicks, R T; McGregor, S L; Savani, N P; Xiong, M

    2014-01-01

    Advanced forecasting of space weather requires simulation of the whole Sun-to-Earth system, which necessitates driving magnetospheric models with the outputs from solar wind models. This presents a fundamental difficulty, as the magnetosphere is sensitive to both large-scale solar wind structures, which can be captured by solar wind models, and small-scale solar wind “noise,” which is far below typical solar wind model resolution and results primarily from stochastic processes. Following similar approaches in terrestrial climate modeling, we propose statistical “downscaling” of solar wind model results prior to their use as input to a magnetospheric model. As the magnetospheric response can be highly nonlinear, this is preferable to downscaling the results of magnetospheric modeling. To demonstrate the benefit of this approach, we first approximate solar wind model output by smoothing solar wind observations with an 8 h filter, then add small-scale structure back in through the addition of random noise with the observed spectral characteristics. Here we use a very simple parameterization of the noise based upon the observed probability distribution functions of solar wind parameters, but more sophisticated methods will be developed in the future. An ensemble of results from the simple downscaling scheme is tested using a model-independent method and shown to add value to the magnetospheric forecast, both improving the best estimate and quantifying the uncertainty. We suggest a number of features desirable in an operational solar wind downscaling scheme. Key Points: Solar wind models must be downscaled in order to drive magnetospheric models; ensemble downscaling is more effective than deterministic downscaling; the magnetosphere responds nonlinearly to small-scale solar wind fluctuations. PMID:26213518

  7. Numerical modeling of nonlinear modulation of coda wave interferometry in a multiple scattering medium with the presence of a localized micro-cracked zone

    NASA Astrophysics Data System (ADS)

    Chen, Guangzhi; Pageot, Damien; Legland, Jean-Baptiste; Abraham, Odile; Chekroun, Mathieu; Tournat, Vincent

    2018-04-01

    The spectral element method is used to perform a parametric sensitivity study of the nonlinear coda wave interferometry (NCWI) method in a homogeneous sample with localized damage [1]. The influence of a strong pump wave on a localized nonlinear damage zone is modeled as modifications to the elastic properties of an effective damage zone (EDZ), depending on the pump wave amplitude. The local changes of the elastic modulus and the attenuation coefficient have been shown to vary linearly with the excitation amplitude of the pump wave, as in the previous experimental studies of Zhang et al. [2]. In this study, the boundary conditions of the cracks, i.e. clapping effects, are taken into account in the modeling of the damaged zone. The EDZ is then modeled with cracks of random positions and orientations, and new parametric studies are established to model the pump wave influence with two new parameters: the change of the crack length and the crack density. The numerical results reported constitute another step towards quantification and forecasting of the nonlinear acoustic response of a cracked material, which proves to be necessary for quantitative non-destructive evaluation.

  8. Forecasting the mortality rates using Lee-Carter model and Heligman-Pollard model

    NASA Astrophysics Data System (ADS)

    Ibrahim, R. I.; Ngataman, N.; Abrisam, W. N. A. Wan Mohd

    2017-09-01

    Improvement in life expectancies has driven further declines in mortality. The sustained reduction in mortality rates and its systematic underestimation have been attracting significant interest from researchers in recent years because of their potential impact on population size and structure, social security systems, and (from an actuarial perspective) the life insurance and pensions industry worldwide. Among forecasting methods, the Lee-Carter model has been widely accepted by the actuarial community, and the Heligman-Pollard model has been widely used by researchers in modelling and forecasting future mortality; this paper therefore focuses on these two models. The main objective of this paper is to investigate how accurately the two models perform on Malaysian data. Since the models involve nonlinear equations that are difficult to solve explicitly, the Matrix Laboratory Version 8.0 (MATLAB 8.0) software is used to estimate the model parameters. An Autoregressive Integrated Moving Average (ARIMA) procedure is applied to obtain the forecasted parameters for both models, and the forecasted mortality rates are then computed from these parameter forecasts. To investigate the accuracy of the estimation, the forecasted results are compared against actual mortality data. The results indicate that both models perform better for the male population. However, for the elderly female population, the Heligman-Pollard model seems to underestimate the mortality rates while the Lee-Carter model seems to overestimate them.
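    The Lee-Carter fit itself is compact enough to sketch: log-mortality is modelled as log m(x,t) = a_x + b_x k_t, where a_x is the age-specific mean, (b_x, k_t) come from the first SVD mode of the centred matrix, and k_t is forecast as a random walk with drift (the common ARIMA(0,1,0) choice). The synthetic mortality surface below is an assumption standing in for the Malaysian rates.

```python
import numpy as np

rng = np.random.default_rng(2)
ages, years = 10, 40
# Synthetic log-mortality surface with a known downward trend (illustrative only).
a_true = np.linspace(-6.0, -1.0, ages)
b_true = np.linspace(0.02, 0.2, ages)
b_true /= b_true.sum()
k_true = np.linspace(20.0, -20.0, years)
logm = (a_true[:, None] + b_true[:, None] * k_true[None, :]
        + rng.normal(scale=0.02, size=(ages, years)))

# Lee-Carter fit: a_x = row means, then the first SVD mode of the centred matrix.
a = logm.mean(axis=1)
U, s, Vt = np.linalg.svd(logm - a[:, None], full_matrices=False)
b = U[:, 0]
k = s[0] * Vt[0]
sign = np.sign(b.sum())                 # resolve the sign indeterminacy of the SVD
b, k = b * sign, k * sign
scale = b.sum()
b, k = b / scale, k * scale             # usual identifiability constraint: sum(b) = 1

# Forecast k_t as a random walk with drift, then extrapolate 10 years ahead.
drift = np.diff(k).mean()
k_forecast = k[-1] + drift * np.arange(1, 11)
```

    Forecast mortality at age x and horizon h is then exp(a[x] + b[x] * k_forecast[h]); the negative drift reproduces the continuing mortality decline.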

  9. A comparison of two adaptive multivariate analysis methods (PLSR and ANN) for winter wheat yield forecasting using Landsat-8 OLI images

    NASA Astrophysics Data System (ADS)

    Chen, Pengfei; Jing, Qi

    2017-02-01

    An assumption that a non-linear method is more reasonable than a linear method when canopy reflectance is used to establish a yield prediction model was proposed and tested in this study. For this purpose, partial least squares regression (PLSR) and artificial neural networks (ANN), representing linear and non-linear analysis methods respectively, were applied and compared for wheat yield prediction. Multi-period Landsat-8 OLI images were collected at two different wheat growth stages, and a field campaign was conducted to obtain grain yields at selected sampling sites in 2014. The field data were divided into a calibration database and a testing database. Using the calibration data, a cross-validation concept was introduced for the PLSR and ANN model construction to prevent over-fitting. All models were tested using the test data. The ANN yield-prediction model produced R2, RMSE and RMSE% values of 0.61, 979 kg ha-1, and 10.38%, respectively, in the testing phase, performing better than the PLSR yield-prediction model, which produced R2, RMSE, and RMSE% values of 0.39, 1211 kg ha-1, and 12.84%, respectively. The non-linear method was therefore suggested as the better method for yield prediction.
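    The linear-versus-nonlinear comparison can be illustrated with simple stand-ins: ordinary least squares on raw band reflectances plays the linear (PLSR) role, and OLS on squared and product features plays a crude nonlinear (ANN) role. The reflectances and the multiplicative yield relationship below are purely synthetic assumptions, not the paper's data or models.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
refl = rng.uniform(0.1, 0.5, size=(n, 4))          # 4 synthetic band reflectances
# Assumed toy relationship: yield depends on a band *product*, i.e. nonlinearly.
grain = (20000.0 * refl[:, 0] * refl[:, 1] + 2000.0 * refl[:, 2]
         + rng.normal(scale=100.0, size=n))        # kg/ha, illustrative scale

train, test = slice(0, 200), slice(200, None)

def ols_fit_predict(Xtr, y, Xte):
    A = np.c_[np.ones(len(Xtr)), Xtr]
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.c_[np.ones(len(Xte)), Xte] @ beta

def expand(X):                                     # add squares and cross-products
    cross = np.array([X[:, i] * X[:, j]
                      for i in range(4) for j in range(i, 4)]).T
    return np.hstack([X, cross])

def rmse(a, b):
    return np.sqrt(np.mean((a - b) ** 2))

pred_lin = ols_fit_predict(refl[train], grain[train], refl[test])
pred_nl = ols_fit_predict(expand(refl[train]), grain[train], expand(refl[test]))

rmse_lin = rmse(pred_lin, grain[test])
rmse_nl = rmse(pred_nl, grain[test])
```

    On held-out data the nonlinear feature set recovers the band interaction that the purely linear model cannot, mirroring the ANN-over-PLSR result reported in the abstract.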

  10. Novel images extraction model using improved delay vector variance feature extraction and multi-kernel neural network for EEG detection and prediction.

    PubMed

    Ge, Jing; Zhang, Guoping

    2015-01-01

    Advanced intelligent methodologies could help detect and predict diseases from EEG signals in cases where manual analysis is inefficient or unavailable, for instance epileptic seizure detection and prediction. This is because the diversity and the evolution of epileptic seizures make the underlying disease very difficult to detect and identify. Fortunately, the determinism and nonlinearity in a time series can characterize state changes. The literature indicates that Delay Vector Variance (DVV) can examine the nonlinearity to gain insight into EEG signals, but very limited work has been done to address a quantitative DVV approach; hence, the outcomes of quantitative DVV should be evaluated for detecting epileptic seizures. The objective was to develop a new epileptic seizure detection method based on quantitative DVV. This new method employed an improved delay vector variance (IDVV) to extract the nonlinearity value as a distinct feature. A multi-kernel strategy was then proposed for the extreme learning machine (ELM) network to provide precise disease detection and prediction. The nonlinearity feature proved more sensitive than energy and entropy: 87.5% overall recognition accuracy and 75.0% overall forecasting accuracy were achieved. The proposed IDVV and multi-kernel ELM based method was feasible and effective for epileptic EEG detection, and hence has importance for practical applications.
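    A sketch of the basic (non-improved) DVV computation underlying the IDVV feature: for the delay vectors of a standardised series, the variance of the "targets" of each vector's neighbours is averaged and normalised by the overall target variance. Deterministic nonlinear series yield a low minimum target variance, while white noise stays near one. The neighbourhood radii, counts, and test series below are illustrative assumptions.

```python
import numpy as np

def dvv_target_variances(x, m=2, min_neighbors=20):
    """Normalised DVV target variances over a grid of neighbourhood radii."""
    x = (x - x.mean()) / x.std()
    N = len(x) - m
    vecs = np.column_stack([x[i:i + N] for i in range(m)])    # delay vectors
    targets = x[m:m + N]
    d = np.sqrt(((vecs[:, None, :] - vecs[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)                               # exclude self-pairs
    radii = np.quantile(d[np.isfinite(d)], [0.01, 0.02, 0.05, 0.1, 0.2, 0.5])
    out = []
    for r in radii:
        v = [targets[d[k] <= r].var() for k in range(N)
             if (d[k] <= r).sum() >= min_neighbors]
        out.append(np.mean(v) / targets.var() if v else np.nan)
    return np.array(out)

rng = np.random.default_rng(11)
x_chaos = np.empty(800)
x_chaos[0] = 0.3
for i in range(1, 800):                                       # deterministic chaos
    x_chaos[i] = 3.9 * x_chaos[i - 1] * (1.0 - x_chaos[i - 1])
x_noise = rng.normal(size=800)                                # purely stochastic

dvv_chaos = dvv_target_variances(x_chaos)
dvv_noise = dvv_target_variances(x_noise)
```

    The minimum of the curve is the kind of scalar nonlinearity feature that can be fed to a classifier such as the ELM mentioned in the abstract.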

  11. Using a Hybrid Model to Forecast the Prevalence of Schistosomiasis in Humans

    PubMed Central

    Zhou, Lingling; Xia, Jing; Yu, Lijing; Wang, Ying; Shi, Yun; Cai, Shunxiang; Nie, Shaofa

    2016-01-01

    Background: We previously proposed a hybrid model combining both the autoregressive integrated moving average (ARIMA) and the nonlinear autoregressive neural network (NARNN) models in forecasting schistosomiasis. Our purpose in the current study was to forecast the annual prevalence of human schistosomiasis in Yangxin County using our ARIMA-NARNN model, thereby further certifying the reliability of our hybrid model. Methods: We used the ARIMA, NARNN and ARIMA-NARNN models to fit and forecast the annual prevalence of schistosomiasis. The modeling period covered the annual prevalence from 1956 to 2008, while the testing period covered 2009 to 2012. The mean square error (MSE), mean absolute error (MAE) and mean absolute percentage error (MAPE) were used to measure model performance. We then reconstructed the hybrid model to forecast the annual prevalence from 2013 to 2016. Results: The modeling and testing errors generated by the ARIMA-NARNN model were lower than those obtained from either the single ARIMA or NARNN models. The predicted annual prevalence from 2013 to 2016 demonstrated an initial decreasing trend, followed by an increase. Conclusions: The ARIMA-NARNN model can be well applied to analyze surveillance data for early warning systems for the control and elimination of schistosomiasis. PMID:27023573
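    The two-stage hybrid idea can be sketched with simple stand-ins: a linear AR(2) model plays the ARIMA role, and a nearest-neighbour autoregression on its residuals plays the NARNN role (both are assumptions for illustration; the paper uses a full ARIMA and a neural network). The toy series mixes a linear and a nonlinear term so that the residual stage has something to capture.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1200
y = np.zeros(n)
for t in range(2, n):                        # toy series: linear + nonlinear dynamics
    y[t] = 0.3 * y[t - 1] + np.sin(2.0 * y[t - 2]) + rng.normal(scale=0.05)

lags = np.column_stack([y[1:-1], y[:-2]])    # (y_{t-1}, y_{t-2})
target = y[2:]
split = 600
Xtr, Xte = lags[:split], lags[split:]
ytr, yte = target[:split], target[split:]

# Stage 1: linear AR(2) fit by least squares (stand-in for ARIMA).
A = np.c_[np.ones(len(Xtr)), Xtr]
beta, *_ = np.linalg.lstsq(A, ytr, rcond=None)
ar_te = np.c_[np.ones(len(Xte)), Xte] @ beta
resid_tr = ytr - A @ beta                    # what the linear stage cannot explain

# Stage 2: k-nearest-neighbour regression on the residuals (stand-in for NARNN).
def knn_predict(Xtr, rtr, Xte, k=10):
    out = np.empty(len(Xte))
    for i, x in enumerate(Xte):
        d = np.sum((Xtr - x) ** 2, axis=1)
        out[i] = rtr[np.argsort(d)[:k]].mean()
    return out

hybrid_te = ar_te + knn_predict(Xtr, resid_tr, Xte)

mse_ar = np.mean((yte - ar_te) ** 2)
mse_hybrid = np.mean((yte - hybrid_te) ** 2)
```

    On the held-out half of the series the hybrid beats the linear stage alone, which is the qualitative result the abstract reports for ARIMA-NARNN.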

  12. On-line estimation of nonlinear physical systems

    USGS Publications Warehouse

    Christakos, G.

    1988-01-01

    Recursive algorithms for estimating states of nonlinear physical systems are presented. Orthogonality properties are rediscovered and the associated polynomials are used to linearize state and observation models of the underlying random processes. This requires some key hypotheses regarding the structure of these processes, which may then accommodate a wide range of applications, including streamflow forecasting, flood estimation, environmental protection, earthquake engineering, and mine planning. The proposed estimation algorithm compares favorably to Taylor series-type filters, nonlinear filters which approximate the probability density by Edgeworth or Gram-Charlier series, and conventional statistical linearization-type estimators. Moreover, the method has several advantages over nonrecursive estimators such as disjunctive kriging. To link theory with practice, some numerical results for a simulated system are presented, in which responses from the proposed and extended Kalman algorithms are compared. © 1988 International Association for Mathematical Geology.
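    As a point of reference for the comparison mentioned in the abstract, a minimal extended Kalman filter for a scalar nonlinear system looks like this (the toy system, noise levels and all numbers are assumptions):

```python
import numpy as np

rng = np.random.default_rng(9)

f = lambda x: 0.9 * x + 0.2 * np.sin(x)        # nonlinear state transition
fprime = lambda x: 0.9 + 0.2 * np.cos(x)       # its derivative, for linearisation
q, r = 0.05, 0.1                               # process / observation noise variances

n = 200
x_true = np.zeros(n)
obs = np.zeros(n)
for t in range(1, n):                          # simulate truth and noisy observations
    x_true[t] = f(x_true[t - 1]) + rng.normal(scale=np.sqrt(q))
    obs[t] = x_true[t] + rng.normal(scale=np.sqrt(r))

x_est, p = 0.0, 1.0
errs = []
for t in range(1, n):
    # Predict: propagate the estimate and linearise f around it.
    x_pred = f(x_est)
    p_pred = fprime(x_est) ** 2 * p + q
    # Update with the scalar observation (observation operator H = 1).
    k = p_pred / (p_pred + r)
    x_est = x_pred + k * (obs[t] - x_pred)
    p = (1 - k) * p_pred
    errs.append(x_est - x_true[t])

rmse_ekf = np.sqrt(np.mean(np.square(errs)))
rmse_obs = np.sqrt(np.mean((obs[1:] - x_true[1:]) ** 2))
```

    The filter beats the raw observations by blending them with the model forecast; the Taylor-series linearisation in the predict step is exactly the weakness the paper's polynomial approach aims to improve on.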

  13. The analysis and application of a new hybrid pollutants forecasting model using modified Kolmogorov-Zurbenko filter.

    PubMed

    Li, Peizhi; Wang, Yong; Dong, Qingli

    2017-04-01

    Cities in China suffer from severe smog and haze, and a forecasting system with high accuracy is of great importance for foreseeing the concentrations of airborne particles. Compared with chemical transport models, the growing family of artificial intelligence models can simulate nonlinearities and interactive relationships and obtain more accurate results. In this paper, the Kolmogorov-Zurbenko (KZ) filter is modified and, for the first time, applied to construct a model using an artificial intelligence method. The concentrations of inhalable particles and fine particulate matter in Dalian are used to analyze the filtered components and test the forecasting accuracy. In addition, an extended experiment implements a comprehensive comparison and a stability test using data from three other cities in China. The results testify to the excellent performance of the developed hybrid models, which can be utilized to better understand the temporal features of pollutants and to perform better air pollution control and management. Copyright © 2017 Elsevier B.V. All rights reserved.
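    The core KZ filter (before the paper's modification, which the abstract does not specify) is simply an iterated centred moving average. A sketch of its typical air-quality use, separating a slow baseline from the short-term component that a nonlinear model is then trained on; the synthetic daily PM series and the KZ(15,3) parameters are illustrative assumptions:

```python
import numpy as np

def kz_filter(x, window, iterations):
    """Kolmogorov-Zurbenko filter: an iterated centred moving average."""
    kernel = np.ones(window) / window
    for _ in range(iterations):
        x = np.convolve(x, kernel, mode="same")
    return x

rng = np.random.default_rng(5)
t = np.arange(365)
baseline = 60.0 + 20.0 * np.sin(2 * np.pi * t / 365)   # smooth seasonal signal
pm = baseline + rng.normal(scale=15.0, size=t.size)    # plus synoptic-scale noise

# KZ(15,3) suppresses fluctuations with periods below ~ window*sqrt(iterations) ≈ 26 days.
kz = kz_filter(pm, window=15, iterations=3)
short_term = pm - kz                                   # component fed to the AI model
```

    Iterating the moving average makes the effective kernel approximately Gaussian, so the cutoff is much cleaner than a single boxcar of the same width.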

  14. The analysis and forecasting of male cycling time trial records established within England and Wales.

    PubMed

    Dyer, Bryce; Hassani, Hossein; Shadi, Mehran

    2016-01-01

    The format of cycling time trials in England, Wales and Northern Ireland involves riders competing individually over several fixed race distances of 10-100 miles in length, and over time-constrained formats of 12 and 24 h in duration. Drawing on data provided by the national governing body covering the regions of England and Wales, an analysis of six male competition records was undertaken to illustrate their progression. Future forecasts were then projected through use of the Singular Spectrum Analysis technique; this method has not been applied to sport-based time series data before. All six records have seen a progressive improvement and are non-linear in nature. Five records saw their highest rate of record change during the 1950-1969 period. Whilst the frequency of new records has generally declined since this period, the magnitude of performance improvement has generally increased. The Singular Spectrum Analysis technique successfully provided forecast projections in the short to medium term with a high level of fit to the time series data.
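    Basic SSA with recurrent forecasting, the core of the technique, can be sketched as follows (the toy trend-plus-cycle series, window length and grouping are assumptions; the paper applies it to the record time series):

```python
import numpy as np

def ssa_forecast(y, L, r, steps):
    """Basic SSA: embed, truncate the SVD, then forecast by linear recurrence."""
    n = len(y)
    K = n - L + 1
    X = np.column_stack([y[i:i + L] for i in range(K)])   # trajectory (Hankel) matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :r] * s[:r]) @ Vt[:r]                      # rank-r signal component
    rec = np.zeros(n)                                     # diagonal averaging
    cnt = np.zeros(n)
    for j in range(K):
        rec[j:j + L] += Xr[:, j]
        cnt[j:j + L] += 1
    rec /= cnt
    # Recurrence coefficients from the leading left singular vectors.
    P = U[:, :r]
    pi = P[-1]                                            # last coordinates
    R = (P[:-1] @ pi) / (1.0 - np.sum(pi ** 2))
    out = list(rec)
    for _ in range(steps):                                # recurrent forecasting
        out.append(np.dot(R, out[-(L - 1):]))
    return rec, np.array(out[n:])

t = np.arange(120, dtype=float)
series = 0.05 * t + np.sin(2 * np.pi * t / 12)            # trend + cycle
rec, fc = ssa_forecast(series, L=36, r=4, steps=12)
t_fut = np.arange(120, 132, dtype=float)
true_future = 0.05 * t_fut + np.sin(2 * np.pi * t_fut / 12)
```

    For a noise-free series governed by a linear recurrence (trend plus one harmonic has rank 4), the reconstruction and the 12-step forecast are exact up to floating-point error; real record data would require choosing r from the singular value spectrum.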

  15. Simulating the effect of non-linear mode coupling in cosmological parameter estimation

    NASA Astrophysics Data System (ADS)

    Kiessling, A.; Taylor, A. N.; Heavens, A. F.

    2011-09-01

    Fisher Information Matrix methods are commonly used in cosmology to estimate the accuracy with which cosmological parameters can be measured by a given experiment, and to optimize the design of experiments. However, the standard approach usually assumes that both the data and the parameter estimates are Gaussian-distributed. Further, for survey forecasts and optimization it is usually assumed that the power-spectrum covariance matrix is diagonal in Fourier space. However, in the low-redshift Universe, non-linear mode coupling will tend to correlate small-scale power, moving information from lower to higher order moments of the field. This movement of information will change the predictions of cosmological parameter accuracy. In this paper we quantify this loss of information by comparing naïve Gaussian Fisher matrix forecasts with a maximum likelihood parameter estimation analysis of a suite of mock weak lensing catalogues derived from N-body simulations, based on the SUNGLASS pipeline, for a 2D and tomographic shear analysis of a Euclid-like survey. In both cases, we find that the 68 per cent confidence area of the Ωm-σ8 plane increases by a factor of 5. However, the marginal errors increase by just 20-40 per cent. We propose a new method to model the effects of non-linear shear-power mode coupling in the Fisher matrix by approximating the shear-power distribution as a multivariate Gaussian with a covariance matrix derived from the mock weak lensing survey. We find that this approximation can reproduce the 68 per cent confidence regions of the full maximum likelihood analysis in the Ωm-σ8 plane to high accuracy for both 2D and tomographic weak lensing surveys. Finally, we perform a multiparameter analysis of Ωm, σ8, h, ns, w0 and wa to compare the Gaussian and non-linear mode-coupled Fisher matrix contours. 
The 6D volume of the 1σ error contours for the non-linear Fisher analysis is a factor of 3 larger than for the Gaussian case, and the shape of the 68 per cent confidence volume is modified. We propose that future Fisher matrix estimates of cosmological parameter accuracies should include mode-coupling effects.
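    For reference, the Fisher matrix underlying such Gaussian forecasts takes the standard form (general expression for data with parameter-dependent mean and covariance; not an equation quoted from this paper):

```latex
F_{ij} \;=\; \tfrac{1}{2}\,\mathrm{Tr}\!\left[C^{-1} C_{,i}\, C^{-1} C_{,j}\right]
\;+\; \boldsymbol{\mu}_{,i}^{\mathsf{T}}\, C^{-1}\, \boldsymbol{\mu}_{,j},
```

    where commas denote derivatives with respect to the parameters θ_i, and the forecast marginal error is σ(θ_i) ≥ √((F⁻¹)_ii). The paper's proposal amounts to replacing the assumed diagonal power-spectrum covariance inside C with one measured from the mock surveys.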

  16. Improved nonlinear prediction method

    NASA Astrophysics Data System (ADS)

    Adenan, Nur Hamiza; Md Noorani, Mohd Salmi

    2014-06-01

    The analysis and prediction of time series data have been widely addressed by researchers. Many techniques have been developed for application in various areas, such as weather forecasting, financial markets and hydrological phenomena, where the data are contaminated by noise; accordingly, various refinements have been introduced to analyze and predict such data. Given the importance of the analysis and of the accuracy of the prediction result, a study was undertaken to test the effectiveness of the improved nonlinear prediction method on data that contain noise. The improved nonlinear prediction method involves the formation of composite serial data based on the successive differences of the time series. Phase space reconstruction is then performed on the composite (one-dimensional) data to reconstruct a number of space dimensions. Finally, the local linear approximation method is employed to make a prediction based on the phase space. The improved method was tested on logistic map series containing 0%, 5%, 10%, 20% and 30% noise. The results show that with the improved method the predictions are in close agreement with the observed values, and the correlation coefficient is close to one for data with up to 10% noise. Thus, a way to analyze and predict noisy time series data without any separate noise-reduction step was introduced.
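    The phase-space reconstruction and local linear approximation steps (without the paper's composite-differencing refinement, which is not specified in detail here) can be sketched on a noisy logistic map:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1000
x = np.empty(n)
x[0] = 0.4
for i in range(1, n):                          # logistic map, r = 3.9 (chaotic)
    x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])
noisy = x + rng.normal(scale=0.005, size=n)    # observational noise added

# Delay embedding with dimension 2, delay 1: vectors (x_t, x_{t+1}) -> x_{t+2}.
vecs = np.column_stack([noisy[:-2], noisy[1:-1]])
nxt = noisy[2:]

def local_linear_predict(query, vecs, nxt, k=20):
    """One-step forecast: fit a linear map on the k nearest delay vectors."""
    d = np.sum((vecs - query) ** 2, axis=1)
    idx = np.argsort(d)[:k]
    A = np.c_[np.ones(k), vecs[idx]]
    coef, *_ = np.linalg.lstsq(A, nxt[idx], rcond=None)
    return coef[0] + coef[1:] @ query

# Train on the first 800 points, predict 100 later points out of sample.
preds = np.array([local_linear_predict(vecs[i], vecs[:800], nxt[:800])
                  for i in range(898, 998)])
truth = nxt[898:998]
corr = np.corrcoef(preds, truth)[0, 1]
```

    Even with noise, the local linear maps track the quadratic dynamics closely, which is why the correlation coefficient stays near one at low noise levels, as the abstract reports.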

  17. Forecasting daily source air quality using multivariate statistical analysis and radial basis function networks.

    PubMed

    Sun, Gang; Hoff, Steven J; Zelle, Brian C; Nelson, Minda A

    2008-12-01

    It is vital to forecast gas and particulate matter concentrations and emission rates (GPCER) from livestock production facilities to assess the impact of airborne pollutants on human health, the ecological environment, and global warming. Modeling source air quality is a complex process because of abundant nonlinear interactions between GPCER and other factors. The objective of this study was to introduce statistical methods and a radial basis function (RBF) neural network to predict daily source air quality in Iowa swine deep-pit finishing buildings. The results show that four variables (outdoor and indoor temperature, animal units, and ventilation rates) were identified as relatively important model inputs using statistical methods. It was further demonstrated that only two factors, the environment factor and the animal factor, were capable of explaining more than 94% of the total variability after performing principal component analysis. The introduction of fewer uncorrelated variables to the neural network reduced the model structure complexity, minimized computation cost, and eliminated model overfitting problems. The RBF network predictions were in good agreement with the actual measurements, with correlation coefficients between 0.741 and 0.995 and very low values of the systematic performance indexes for all the models. These good results indicated that the RBF network could be trained to model these highly nonlinear relationships. Thus, RBF neural network technology combined with multivariate statistical methods is a promising tool for modeling air pollutant emissions.
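    A minimal RBF network sketch: Gaussian basis functions on randomly chosen centres with a ridge-regularised linear readout. The centre selection, basis width, and the synthetic two-input data are assumptions for illustration (the paper trains on measured environmental and animal inputs):

```python
import numpy as np

rng = np.random.default_rng(10)
n = 400
X = rng.uniform(-1, 1, size=(n, 2))                  # two scaled inputs, e.g. temp/vent
y = np.sin(3 * X[:, 0]) * X[:, 1] + 0.1 * rng.normal(size=n)  # nonlinear toy target

centers = X[rng.choice(n, 40, replace=False)]        # random centres (k-means is common too)
width = 0.5

def rbf_features(X, centers, width):
    """Gaussian RBF activations of each sample w.r.t. each centre."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

Phi = rbf_features(X[:300], centers, width)
lam = 1e-3                                           # ridge regulariser on the readout
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(len(centers)), Phi.T @ y[:300])

pred = rbf_features(X[300:], centers, width) @ w     # out-of-sample predictions
corr = np.corrcoef(pred, y[300:])[0, 1]
```

    Because only the linear readout is trained, fitting reduces to one regularised least-squares solve, which is what makes RBF networks attractive for modest-sized emission datasets.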

  18. Improved ensemble-mean forecasting of ENSO events by a zero-mean stochastic error model of an intermediate coupled model

    NASA Astrophysics Data System (ADS)

    Zheng, Fei; Zhu, Jiang

    2017-04-01

    How to design a reliable ensemble prediction strategy that accounts for the major uncertainties of a forecasting system is a crucial issue for performing an ensemble forecast. In this study, a new stochastic perturbation technique is developed to improve the prediction skill for the El Niño-Southern Oscillation (ENSO) using an intermediate coupled model. We first estimate and analyze the model uncertainties from ensemble Kalman filter analysis results obtained by assimilating observed sea surface temperatures. Then, based on the pre-analyzed properties of the model errors, we develop a zero-mean stochastic model-error model to characterize the model uncertainties mainly induced by physical processes missing from the original model (e.g., stochastic atmospheric forcing, extra-tropical effects, the Indian Ocean Dipole). Finally, we perturb each member of an ensemble forecast at each step with the developed stochastic model-error model during the 12-month forecasting process, adding the zero-mean perturbations to the physical fields to mimic the presence of missing processes and high-frequency stochastic noise. The impacts of the stochastic model-error perturbations on ENSO deterministic predictions are examined in two sets of 21-yr hindcast experiments, which are initialized from the same initial conditions and differ only in whether they include the stochastic perturbations. The comparison shows that the stochastic perturbations significantly improve the ensemble-mean prediction skill throughout the 12-month forecasting process. This improvement occurs mainly because the nonlinear terms in the model can form a positive ensemble-mean response from a series of zero-mean perturbations, which reduces the forecasting biases and thereby corrects the forecast through this nonlinear heating mechanism.
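    The closing mechanism, zero-mean perturbations producing a nonzero ensemble-mean correction through a nonlinear term, can be demonstrated with a toy quadratic model (the model and all numbers are illustrative assumptions, not the coupled model of the paper):

```python
import numpy as np

rng = np.random.default_rng(7)

def step(x):
    """One step of a toy model whose tendency contains a quadratic term."""
    return x + 0.1 * (x - x ** 2)

n_members, n_steps = 2000, 50
control = 0.2                                   # unperturbed deterministic run
members = np.full(n_members, 0.2)
for _ in range(n_steps):
    control = step(control)
    # Zero-mean stochastic perturbations added to every member at every step.
    members = step(members + rng.normal(scale=0.05, size=n_members))

# The quadratic term rectifies the noise: E[(x+e)^2] = x^2 + Var(e),
# so the ensemble mean drifts away from the control even though E[e] = 0.
shift = members.mean() - control
```

    Here the rectified shift is systematically negative because of the -x² term's curvature; in the coupled model the analogous effect acts through the nonlinear heating terms, which is exactly why the ensemble mean differs from, and can outperform, the unperturbed forecast.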

  19. Extracting Leading Nonlinear Modes of Changing Climate From Global SST Time Series

    NASA Astrophysics Data System (ADS)

    Mukhin, D.; Gavrilov, A.; Loskutov, E. M.; Feigin, A. M.; Kurths, J.

    2017-12-01

    Data-driven modeling of climate requires adequate principal variables extracted from observed high-dimensional data. Constructing such variables requires finding spatio-temporal patterns that explain a substantial part of the variability and comprise all dynamically related time series in the data. The difficulties of this task arise from the nonlinearity and non-stationarity of the climate dynamical system. The nonlinearity makes linear methods of data decomposition insufficient for separating the different processes entangled in the observed time series. On the other hand, various forcings, both anthropogenic and natural, make the dynamics non-stationary, and we should be able to describe the response of the system to such forcings in order to separate out the modes explaining the internal variability. The method we present is aimed at overcoming both of these problems. The method is based on the Nonlinear Dynamical Mode (NDM) decomposition [1,2], but takes external forcing signals into account. Each mode depends on hidden time series, unknown a priori, which, together with the external forcing time series, are mapped onto the data space. Finding both the hidden signals and the mapping allows us to study the evolution of the modes' structure under changing external conditions and to compare the roles of internal variability and forcing in the observed behavior. The method is used for extracting the principal modes of SST variability on inter-annual and multidecadal time scales, accounting for external forcings such as CO2, variations of solar activity and volcanic activity. The structure of the revealed teleconnection patterns as well as their forecast under different CO2 emission scenarios are discussed. [1] Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. [2] Gavrilov, A., Mukhin, D., Loskutov, E., Volodin, E., Feigin, A., & Kurths, J. (2016). Method for reconstructing nonlinear modes with adaptive structure from multidimensional data. Chaos: An Interdisciplinary Journal of Nonlinear Science, 26(12), 123101.

  20. ADAPTATION AND APPLICATION OF THE COMMUNITY MULTISCALE AIR QUALITY (CMAQ) MODELING SYSTEM FOR REAL-TIME AIR QUALITY FORECASTING DURING THE SUMMER OF 2004

    EPA Science Inventory

    The ability to forecast local and regional air pollution events is challenging since the processes governing the production and sustenance of atmospheric pollutants are complex and often non-linear. Comprehensive atmospheric models, by representing in as much detail as possible t...

  1. West-WRF Sensitivity to Sea Surface Temperature Boundary Condition in California Precipitation Forecasts of AR Related Events

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Cornuelle, B. D.; Martin, A.; Weihs, R. R.; Ralph, M.

    2017-12-01

    We evaluated the merit, for coastal precipitation forecasts, of including high-resolution sea surface temperature (SST) from blended satellite and in situ observations as a boundary condition (BC) to the Weather Research and Forecasting (WRF) mesoscale model through simple perturbation tests. Our sensitivity analyses show that the limited improvement of the watershed-scale precipitation forecast is credible. When only the SST BC is changed, an uncertainty is introduced because of artificial model-state equilibration and the nonlinear nature of the WRF model system. With a change of SST on the order of a fraction of a degree centigrade, we found that the random-perturbation part of the forecast response saturates after 48 hours, when it reaches the order of magnitude of the linear response. It is important to update the SST over a shorter time period, so that the independently excited nonlinear modes can cancel each other. The uncertainty in our SST configuration is quantitatively equivalent to adding spatially uncorrelated Gaussian noise of zero mean and 0.05 degree standard deviation to the SST. At this random noise perturbation magnitude, the ensemble average behaves well within a convergent range. We also quantified the sensitivity of the forecast to SST changes, measured by the ratio of the spatial variability of the mean of the ensemble perturbations to the spatial variability of the corresponding forecast. The ratio is about 10% for surface latent heat flux, 5% for integrated water vapor (IWV), and less than 1% for surface pressure.

  2. Predicting climate effects on Pacific sardine

    PubMed Central

    Deyle, Ethan R.; Fogarty, Michael; Hsieh, Chih-hao; Kaufman, Les; MacCall, Alec D.; Munch, Stephan B.; Perretti, Charles T.; Ye, Hao; Sugihara, George

    2013-01-01

    For many marine species and habitats, climate change and overfishing present a double threat. To manage marine resources effectively, it is necessary to adapt management to changes in the physical environment. Simple relationships between environmental conditions and fish abundance have long been used in both fisheries and fishery management. In many cases, however, physical, biological, and human variables feed back on each other. For these systems, associations between variables can change as the system evolves in time. This can obscure relationships between population dynamics and environmental variability, undermining our ability to forecast changes in populations tied to physical processes. Here we present a methodology for identifying physical forcing variables based on nonlinear forecasting and show how the method provides a predictive understanding of the influence of physical forcing on Pacific sardine. PMID:23536299

  3. Convective Weather Forecast Quality Metrics for Air Traffic Management Decision-Making

    NASA Technical Reports Server (NTRS)

    Chatterji, Gano B.; Gyarfas, Brett; Chan, William N.; Meyn, Larry A.

    2006-01-01

    Since numerical weather prediction models are unable to accurately forecast the severity and location of storm cells several hours into the future when compared with observation data, there has been growing interest in probabilistic descriptions of convective weather. The classical approach for generating uncertainty bounds consists of integrating the state equations and covariance propagation equations forward in time; this step is readily recognized as the process update step of the Kalman Filter algorithm. The second well-known method, the Monte Carlo method, consists of generating output samples by driving the forecast algorithm with input samples selected from distributions. The statistical properties of the distributions of the output samples are then used for defining the uncertainty bounds of the output variables. This method is computationally expensive for a complex model compared to the covariance propagation method; its main advantage is that a complex non-linear model can be easily handled. Recently, a few different methods for probabilistic forecasting have appeared in the literature. A method for computing the probability of convection in a region using forecast data is described in Ref. 5. Probability at a grid location is computed as the fraction of grid points, within a box of specified dimensions around the grid location, with forecast convective precipitation exceeding a specified threshold. The main limitation of this method is that the results depend on the chosen dimensions of the box. The examples presented in Ref. 5 show that this process is equivalent to low-pass filtering of the forecast data with a finite-support spatial filter. References 6 and 7 describe a technique for computing percentage coverage within a 92 x 92 square-kilometer box and assigning the value to the center 4 x 4 square-kilometer box; this technique is the same as that described in Ref. 5. 
    Characterizing the forecast, following the process described in Refs. 5 through 7, in terms of percentage coverage or confidence level is notionally sound compared to characterizing it in terms of probabilities, because the probability of the forecast being correct can only be determined using actual observations; Refs. 5 through 7 use only the forecast data and not the observations. A method for computing the probability of detection, the false alarm ratio and several forecast quality metrics (Skill Scores) using both the forecast and observation data is given in Ref. 2. This paper extends the statistical verification method in Ref. 2 to determine co-occurrence probabilities. The method consists of computing the probability that a severe weather cell (grid location) is detected in the observation data in the neighborhood of the severe weather cell in the forecast data. Probabilities of occurrence at the grid location and in its neighborhood with higher severity, and with lower severity, in the observation data compared to the forecast data are examined. The method proposed in Refs. 5 through 7 is used for computing the probability that a certain number of cells in the neighborhood of severe weather cells in the forecast data are seen as severe weather cells in the observation data. Finally, the probability of the existence of gaps in the observation data in the neighborhood of severe weather cells in the forecast data is computed. Gaps are defined as openings between severe weather cells through which an aircraft can safely fly to its intended destination. The rest of the paper is organized as follows. Section II summarizes the statistical verification method described in Ref. 2. The extension of this method for computing the co-occurrence probabilities is discussed in Section III. Numerical examples using NCWF forecast data and NCWD observation data are presented in Section III to elucidate the characteristics of the co-occurrence probabilities. 
    This section also discusses the procedure for computing the probabilities that the severity of convection in the observation data will be higher or lower in the neighborhood of grid locations compared to that indicated at the grid locations in the forecast data. The probability of coverage of neighborhood grid cells is also described via examples in this section. Section IV discusses the gap detection algorithm and presents a numerical example to illustrate the method. The locations of the detected gaps in the observation data are used along with the locations of convective weather cells in the forecast data to determine the probability of the existence of gaps in the neighborhood of these cells. Finally, the paper is concluded in Section V.
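    The box-fraction probability of Refs. 5 through 7 described in this abstract can be sketched directly (the synthetic precipitation field, box half-width and threshold are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(8)
forecast = rng.gamma(shape=2.0, scale=2.0, size=(50, 50))  # synthetic rain field
threshold = 8.0
exceed = (forecast >= threshold).astype(float)             # binary exceedance field

def neighbourhood_probability(binary, half_width):
    """Fraction of exceeding points in a (2w+1)^2 box around each grid cell."""
    ny, nx = binary.shape
    prob = np.zeros_like(binary)
    for j in range(ny):
        for i in range(nx):
            j0, j1 = max(0, j - half_width), min(ny, j + half_width + 1)
            i0, i1 = max(0, i - half_width), min(nx, i + half_width + 1)
            prob[j, i] = binary[j0:j1, i0:i1].mean()       # box clipped at edges
    return prob

prob = neighbourhood_probability(exceed, half_width=3)
```

    As the abstract notes, this is exactly low-pass filtering the binary exceedance field with a finite-support (boxcar) spatial filter, so the result is sensitive to the chosen box size.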

  4. Seasonal forecasting of hydrological drought in the Limpopo Basin: a comparison of statistical methods

    NASA Astrophysics Data System (ADS)

    Seibert, Mathias; Merz, Bruno; Apel, Heiko

    2017-03-01

    The Limpopo Basin in southern Africa is prone to droughts which affect the livelihood of millions of people in South Africa, Botswana, Zimbabwe and Mozambique. Seasonal drought early warning is thus vital for the whole region. In this study, the predictability of hydrological droughts during the main runoff period from December to May is assessed using statistical approaches. Three methods (multiple linear models, artificial neural networks, random forest regression trees) are compared in terms of their ability to forecast streamflow with up to 12 months of lead time. The following four main findings result from the study. 1. There are stations in the basin at which standardised streamflow is predictable with lead times up to 12 months. The results show high inter-station differences of forecast skill but reach a coefficient of determination as high as 0.73 (cross validated). 2. A large range of potential predictors is considered in this study, comprising well-established climate indices, customised teleconnection indices derived from sea surface temperatures and antecedent streamflow as a proxy of catchment conditions. El Niño and customised indices, representing sea surface temperature in the Atlantic and Indian oceans, prove to be important teleconnection predictors for the region. Antecedent streamflow is a strong predictor in small catchments (with median 42 % explained variance), whereas teleconnections exert a stronger influence in large catchments. 3. Multiple linear models show the best forecast skill in this study and the greatest robustness compared to artificial neural networks and random forest regression trees, despite their capabilities to represent nonlinear relationships. 4. Employed in early warning, the models can be used to forecast a specific drought level. 
Even if the coefficient of determination is low, the forecast models have a skill better than a climatological forecast, which is shown by analysis of receiver operating characteristics (ROCs). Seasonal statistical forecasts in the Limpopo show promising results, and thus it is recommended to employ them as complementary to existing forecasts in order to strengthen preparedness for droughts.
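
The cross-validated skill reported above can be illustrated for the simplest of the three methods, a linear model with a single predictor such as antecedent streamflow. This is a minimal sketch with invented data, not the Limpopo study's predictors or stations:

```python
# Minimal sketch: leave-one-out cross-validated R^2 for a one-predictor
# linear forecast model (e.g. next-season flow from antecedent flow).
# Data are illustrative, not from the study.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return my - b * mx, b

def loo_cv_r2(xs, ys):
    """Leave-one-out cross-validated coefficient of determination."""
    preds = []
    for i in range(len(xs)):
        a, b = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        preds.append(a + b * xs[i])
    my = sum(ys) / len(ys)
    ss_res = sum((y - p) ** 2 for y, p in zip(ys, preds))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot
```

Cross-validated R^2 penalizes overfitting, which is one reason the robust linear models can beat the more flexible nonlinear learners on short hydrological records.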

  5. Development of Real-time Tsunami Inundation Forecast Using Ocean Bottom Tsunami Networks along the Japan Trench

    NASA Astrophysics Data System (ADS)

    Aoi, S.; Yamamoto, N.; Suzuki, W.; Hirata, K.; Nakamura, H.; Kunugi, T.; Kubo, T.; Maeda, T.

    2015-12-01

    In the 2011 Tohoku earthquake, in which the huge tsunami claimed a great number of lives, the initial tsunami forecast, based on hypocenter information estimated using seismic data on land, was greatly underestimated. From this lesson, NIED is now constructing S-net (Seafloor Observation Network for Earthquakes and Tsunamis along the Japan Trench), which consists of 150 ocean bottom observatories with seismometers and pressure gauges (tsunamimeters) linked by fiber optic cables. To take full advantage of S-net, we develop a new methodology of real-time tsunami inundation forecast using ocean bottom observation data and construct a prototype system that implements the developed forecasting method for the Pacific coast of Chiba prefecture (Sotobo area). We employ a database-based approach because inundation is a strongly non-linear phenomenon and its calculation costs are rather heavy. We prepare a tsunami scenario bank in advance by constructing the possible tsunami sources and calculating the tsunami waveforms at S-net stations, coastal tsunami heights and tsunami inundation on land. To calculate the inundation for the target Sotobo area, we construct a 10-m-mesh precise elevation model with coastal structures. Based on sensitivity analyses, we construct the tsunami scenario bank so that it efficiently covers the possible tsunami scenarios affecting the Sotobo area. A real-time forecast is carried out by selecting from the tsunami scenario bank several possible scenarios that can well explain the real-time tsunami data observed at S-net. An advantage of our method is that tsunami inundations are estimated directly from the actual tsunami data without any source information, which may have large estimation errors. In addition to the forecast system, we develop Web services, APIs, and smartphone applications, and refine them through social experiments, to provide real-time tsunami observations and forecast information in an easy-to-understand way that urges people to evacuate.
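
The scenario-selection step can be illustrated with a toy sketch: rank the pre-computed scenarios by the misfit between their synthetic waveforms and the observed ocean-bottom records, and keep the best few. Scenario names and waveforms below are invented; the real system matches tsunami waveforms at S-net stations:

```python
# Illustrative sketch of database-based scenario matching.
import math

def rms_misfit(observed, synthetic):
    """Root-mean-square misfit between observed and synthetic waveforms."""
    return math.sqrt(sum((o - s) ** 2
                         for o, s in zip(observed, synthetic)) / len(observed))

def select_scenarios(observed, scenario_bank, k=3):
    """Return the k best-fitting scenarios as (name, misfit), best first."""
    ranked = sorted(
        ((name, rms_misfit(observed, waves))
         for name, waves in scenario_bank.items()),
        key=lambda t: t[1],
    )
    return ranked[:k]
```

Because each selected scenario already carries pre-computed coastal heights and inundation maps, the forecast follows immediately from the match, with no source inversion in the real-time loop.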

  6. Systematic construction and control of stereo nerve vision network in intelligent manufacturing

    NASA Astrophysics Data System (ADS)

    Liu, Hua; Wang, Helong; Guo, Chunjie; Ding, Quanxin; Zhou, Liwei

    2017-10-01

    A systematic method of constructing stereo vision using a neural network is proposed, together with the operation and control mechanism used in actual operation. This method makes effective use of the learning and memory functions of the neural network after training with samples. Moreover, the neural network can learn the nonlinear relationships in the stereoscopic vision system and the interior and exterior orientation elements. Technical aspects worthy of attention include the limited constraints, the scientific selection of the critical group, the operating speed and the operability. The results support our theoretical forecast.

  7. Dynamics of electricity market correlations

    NASA Astrophysics Data System (ADS)

    Alvarez-Ramirez, J.; Escarela-Perez, R.; Espinosa-Perez, G.; Urrea, R.

    2009-06-01

    Electricity market participants rely on demand and price forecasts to decide their bidding strategies, allocate assets, negotiate bilateral contracts, hedge risks, and plan facility investments. However, forecasting is hampered by the non-linear and stochastic nature of price time series. Diverse modeling strategies, from neural networks to traditional transfer functions, have been explored. These approaches are based on the assumption that price series contain correlations that can be exploited for model-based prediction purposes. While many works have been devoted to the demand and price modeling, a limited number of reports on the nature and dynamics of electricity market correlations are available. This paper uses detrended fluctuation analysis to study correlations in the demand and price time series and takes the Australian market as a case study. The results show the existence of correlations in both demand and prices over three orders of magnitude in time ranging from hours to months. However, the Hurst exponent is not constant over time, and its time evolution was computed over a subsample moving window of 250 observations. The computations, also made for two Canadian markets, show that the correlations present important fluctuations over a seasonal one-year cycle. Interestingly, non-linearities (measured in terms of a multifractality index) and reduced price predictability are found for the June-July periods, while the converse behavior is displayed during the December-January period. In terms of forecasting models, our results suggest that non-linear recursive models should be considered for accurate day-ahead price estimation. On the other hand, linear models seem to suffice for demand forecasting purposes.
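
Detrended fluctuation analysis, the tool used in this study, proceeds by integrating the demeaned series into a profile, removing a linear trend in non-overlapping windows of size s, and reading the scaling (Hurst-like) exponent off the slope of log F(s) versus log s. A minimal first-order DFA sketch, for illustration only:

```python
import math

def _detrend_msq(seg):
    """Mean squared residual of a least-squares line fit to a segment."""
    n = len(seg)
    mx = (n - 1) / 2.0
    my = sum(seg) / n
    sxx = sum((x - mx) ** 2 for x in range(n))
    sxy = sum((x - mx) * (y - my) for x, y in enumerate(seg))
    b = sxy / sxx
    a = my - b * mx
    return sum((y - (a + b * x)) ** 2 for x, y in enumerate(seg)) / n

def dfa_exponent(series, window_sizes):
    """First-order DFA scaling exponent: slope of log F(s) vs log s."""
    mean = sum(series) / len(series)
    profile, total = [], 0.0
    for v in series:          # integrate the demeaned series
        total += v - mean
        profile.append(total)
    log_s, log_f = [], []
    for s in window_sizes:
        msqs = [_detrend_msq(profile[i:i + s])
                for i in range(0, len(profile) - s + 1, s)]
        log_s.append(math.log(s))
        log_f.append(0.5 * math.log(sum(msqs) / len(msqs)))
    n = len(log_s)
    mx, my = sum(log_s) / n, sum(log_f) / n
    return (sum((x - mx) * (y - my) for x, y in zip(log_s, log_f))
            / sum((x - mx) ** 2 for x in log_s))
```

An exponent near 0.5 indicates uncorrelated noise, higher values persistent correlations; estimating it over a moving window, as in the paper, tracks how correlations drift through the seasonal cycle.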

  8. Artificial neural network model of the hybrid EGARCH volatility of the Taiwan stock index option prices

    NASA Astrophysics Data System (ADS)

    Tseng, Chih-Hsiung; Cheng, Sheng-Tzong; Wang, Yi-Hsien; Peng, Jin-Tang

    2008-05-01

    This investigation integrates a novel hybrid asymmetric volatility approach into an Artificial Neural Network (ANN) option-pricing model to upgrade the forecasting ability of the price of derivative securities. The new hybrid asymmetric volatility method can simultaneously decrease the stochasticity and nonlinearity of the error term sequence and capture the asymmetric volatility. Accordingly, analytical results of the ANN option-pricing model reveal that Grey-EGARCH volatility provides greater predictability than other volatility approaches.

  9. Comparison of Two Hybrid Models for Forecasting the Incidence of Hemorrhagic Fever with Renal Syndrome in Jiangsu Province, China.

    PubMed

    Wu, Wei; Guo, Junqiao; An, Shuyi; Guan, Peng; Ren, Yangwu; Xia, Linzi; Zhou, Baosen

    2015-01-01

    Cases of hemorrhagic fever with renal syndrome (HFRS) are widely distributed in eastern Asia, especially in China, Russia, and Korea. It has proved to be a difficult task to eliminate HFRS completely because of the diverse animal reservoirs and the effects of global warming. Reliable forecasting is useful for the prevention and control of HFRS. Two hybrid models, one composed of a nonlinear autoregressive neural network (NARNN) and an autoregressive integrated moving average (ARIMA) model, the other composed of a generalized regression neural network (GRNN) and ARIMA, were constructed to predict the incidence of HFRS in the coming year. Performances of the two hybrid models were compared with the ARIMA model. The ARIMA, ARIMA-NARNN and ARIMA-GRNN models all fitted and predicted the seasonal fluctuation well. Among the three models, the mean square error (MSE), mean absolute error (MAE) and mean absolute percentage error (MAPE) of the ARIMA-NARNN hybrid model were the lowest in both the modeling stage and the forecasting stage. As for the ARIMA-GRNN hybrid model, the MSE, MAE and MAPE of the modeling performance and the MSE and MAE of the forecasting performance were less than those of the ARIMA model, but the MAPE of the forecasting performance did not improve. Developing and applying the ARIMA-NARNN hybrid model is an effective method to better understand the epidemic characteristics of HFRS and could be helpful in the prevention and control of HFRS.
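
The three comparison metrics used in the study are standard and easy to state as plain functions (note that MAPE is undefined whenever an actual value is zero, which matters for low-incidence months):

```python
# Standard error metrics used to compare the ARIMA and hybrid models.

def mse(actual, pred):
    """Mean square error."""
    return sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual)

def mae(actual, pred):
    """Mean absolute error."""
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def mape(actual, pred):
    """Mean absolute percentage error (undefined if any actual value is 0)."""
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, pred)) / len(actual)
```

MSE emphasizes large misses, MAE treats all misses linearly, and MAPE scales misses by incidence, which is why all three are reported together.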

  10. Real-time management of a multipurpose water reservoir with a heteroscedastic inflow model

    NASA Astrophysics Data System (ADS)

    Pianosi, F.; Soncini-Sessa, R.

    2009-10-01

    Stochastic dynamic programming has been extensively used as a method for designing optimal regulation policies for water reservoirs. However, the potential of this method is dramatically reduced by its computational burden, which often forces one to introduce strong approximations in the model of the system, especially in the description of the reservoir inflow. In this paper, an approach that partially remedies this problem is proposed and applied to a real world case study. It involves solving the management problem on-line, using a reduced model of the system and the inflow forecast provided by a dynamic model. By doing so, all the hydrometeorological information that is available in real time is fully exploited. The model proposed here for inflow forecasting is a nonlinear, heteroscedastic model that provides both the expected value and the standard deviation of the inflow through dynamic relations. The effectiveness of such a model for the purpose of reservoir regulation is evaluated through simulation and comparison with the results provided by conventional homoscedastic inflow models.

  11. Optimization of GM(1,1) power model

    NASA Astrophysics Data System (ADS)

    Luo, Dang; Sun, Yu-ling; Song, Bo

    2013-10-01

    The GM(1,1) power model is an expansion of the traditional GM(1,1) model and the Grey Verhulst model. Compared with the traditional models, the GM(1,1) power model has the following advantage: the power exponent in the model that best matches the actual data values can be found by a suitable technique. The GM(1,1) power model can therefore reflect nonlinear features of the data, and simulate and forecast with high accuracy. It is very important to determine the best power exponent during the modeling process. In this paper, since the whitening equation of the GM(1,1) power model is a Bernoulli equation, a variable substitution turns it into the linear form of the GM(1,1) whitening equation; the grey differential equation is then properly constructed to establish the GM(1,1) power model, whose parameters are solved with a pattern search method. Finally, we illustrate the effectiveness of the new methods with the example of simulating and forecasting the promotion rates from senior secondary schools to higher education in China.
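
For reference, the base GM(1,1) model (the special case the power model extends, and whose whitening equation is linear) can be sketched as: accumulate the series (1-AGO), form background values, estimate the development coefficient a and grey input b by least squares, and forecast from the time-response function. A minimal sketch of the base model only, not the paper's power-model or pattern-search procedure:

```python
import math

def gm11_forecast(x0, steps=1):
    """Fit the basic GM(1,1) model to a positive series x0, forecast ahead."""
    n = len(x0)
    # 1-AGO: accumulated generating operation
    x1, total = [], 0.0
    for v in x0:
        total += v
        x1.append(total)
    # background values z(k) = 0.5*(x1(k) + x1(k-1))
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    y = x0[1:]
    # least squares for the grey differential equation x0(k) + a*z(k) = b
    m = len(z)
    szz, sz = sum(v * v for v in z), sum(z)
    szy, sy = sum(v * w for v, w in zip(z, y)), sum(y)
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    # time-response function of the whitening equation dx1/dt + a*x1 = b
    def x1_hat(k):
        return (x0[0] - b / a) * math.exp(-a * k) + b / a
    # differencing the accumulated forecast recovers the original scale
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]
```

The power model replaces the linear right-hand side b with b*z(k)^gamma, which is what makes its whitening equation a Bernoulli equation and motivates the substitution described above.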

  12. An evaluation of Bayesian techniques for controlling model complexity and selecting inputs in a neural network for short-term load forecasting.

    PubMed

    Hippert, Henrique S; Taylor, James W

    2010-04-01

    Artificial neural networks have frequently been proposed for electricity load forecasting because of their capabilities for the nonlinear modelling of large multivariate data sets. Modelling with neural networks is not an easy task though; two of the main challenges are defining the appropriate level of model complexity, and choosing the input variables. This paper evaluates techniques for automatic neural network modelling within a Bayesian framework, as applied to six samples containing daily load and weather data for four different countries. We analyse input selection as carried out by the Bayesian 'automatic relevance determination', and the usefulness of the Bayesian 'evidence' for the selection of the best structure (in terms of number of neurones), as compared to methods based on cross-validation. Copyright 2009 Elsevier Ltd. All rights reserved.

  13. Parametric reduced models for the nonlinear Schrödinger equation

    NASA Astrophysics Data System (ADS)

    Harlim, John; Li, Xiantao

    2015-05-01

    Reduced models for the (defocusing) nonlinear Schrödinger equation are developed. In particular, we develop reduced models that involve only the low-frequency modes, given noisy observations of these modes. The ansatz of the reduced parametric models is obtained by employing a rational approximation and a colored-noise approximation, respectively, on the memory terms and the random noise of a generalized Langevin equation that is derived from the standard Mori-Zwanzig formalism. The parameters in the resulting reduced models are inferred from noisy observations with a recently developed ensemble Kalman filter-based parametrization method. The forecasting skill across different temperature regimes is verified by comparing the moments up to order four, a two-time correlation function statistic, and marginal densities of the coarse-grained variables.

  14. Parametric reduced models for the nonlinear Schrödinger equation.

    PubMed

    Harlim, John; Li, Xiantao

    2015-05-01

    Reduced models for the (defocusing) nonlinear Schrödinger equation are developed. In particular, we develop reduced models that involve only the low-frequency modes, given noisy observations of these modes. The ansatz of the reduced parametric models is obtained by employing a rational approximation and a colored-noise approximation, respectively, on the memory terms and the random noise of a generalized Langevin equation that is derived from the standard Mori-Zwanzig formalism. The parameters in the resulting reduced models are inferred from noisy observations with a recently developed ensemble Kalman filter-based parametrization method. The forecasting skill across different temperature regimes is verified by comparing the moments up to order four, a two-time correlation function statistic, and marginal densities of the coarse-grained variables.

  15. The development of a combined mathematical model to forecast the incidence of hepatitis E in Shanghai, China.

    PubMed

    Ren, Hong; Li, Jian; Yuan, Zheng-An; Hu, Jia-Yu; Yu, Yan; Lu, Yi-Han

    2013-09-08

    Sporadic hepatitis E has become an important public health concern in China. Accurate forecasting of the incidence of hepatitis E is needed to better plan future medical needs. Few mathematical models can be used because hepatitis E morbidity data have both linear and nonlinear patterns. We developed a combined mathematical model using an autoregressive integrated moving average (ARIMA) model and a back propagation neural network (BPNN) to forecast the incidence of hepatitis E. The morbidity data for hepatitis E in Shanghai from 2000 to 2012 were retrieved from the China Information System for Disease Control and Prevention. The ARIMA-BPNN combined model was trained with 144 months of morbidity data from January 2000 to December 2011, validated with 12 months of data from January 2012 to December 2012, and then employed to forecast hepatitis E incidence from January 2013 to December 2013 in Shanghai. Residual analysis, root mean square error (RMSE), normalized Bayesian information criterion (BIC), and stationary R square methods were used to compare the goodness-of-fit among ARIMA models. The Bayesian regularization back-propagation algorithm was used to train the network. The mean error rate (MER) was used to assess the validity of the combined model. A total of 7,489 hepatitis E cases were reported in Shanghai from 2000 to 2012. Goodness-of-fit (stationary R2 = 0.531, BIC = -4.768, Ljung-Box Q statistic = 15.59, P = 0.482) and parameter estimates were used to determine the best-fitting model as ARIMA (0,1,1)×(0,1,1)12. Predicted morbidity values for 2012 from the best-fitting ARIMA model and actual morbidity data from 2000 to 2011 were used to further construct the combined model. The MER of the ARIMA model and the ARIMA-BPNN combined model were 0.250 and 0.176, respectively. The forecasted incidence of hepatitis E in 2013 was 0.095 to 0.372 per 100,000 population. There was a seasonal variation with a peak during January-March and a nadir during August-October. 
Time series analysis suggested a seasonal pattern of hepatitis E morbidity in Shanghai, China. An ARIMA-BPNN combined model was used to fit the linear and nonlinear patterns of time series data, and accurately forecast hepatitis E infections.

  16. Forecasts of non-Gaussian parameter spaces using Box-Cox transformations

    NASA Astrophysics Data System (ADS)

    Joachimi, B.; Taylor, A. N.

    2011-09-01

    Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, performing the Fisher matrix calculation on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, with marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic non-linear degeneracy between matter density parameter and normalization of matter density fluctuations is reproduced for several cases, and the capability of breaking this degeneracy with weak-lensing three-point statistics is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
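
The one-parameter Box-Cox transform at the heart of the method, together with its inverse, has a simple closed form (a sketch only; the paper applies it per parameter-space dimension, with the transformation parameter fitted from an initial likelihood evaluation):

```python
import math

def box_cox(x, lam):
    """One-parameter Box-Cox transform; x must be positive."""
    if lam == 0:
        return math.log(x)
    return (x ** lam - 1.0) / lam

def inv_box_cox(y, lam):
    """Inverse Box-Cox transform, mapping back to the original space."""
    if lam == 0:
        return math.exp(y)
    return (lam * y + 1.0) ** (1.0 / lam)
```

The lambda = 0 case is the log transform, the continuous limit of the power formula; choosing lambda per parameter is what Gaussianizes banana-shaped degeneracies before the Fisher step.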

  17. Subsurface characterization with localized ensemble Kalman filter employing adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Delijani, Ebrahim Biniaz; Pishvaie, Mahmoud Reza; Boozarjomehry, Ramin Bozorgmehry

    2014-07-01

    The ensemble Kalman filter (EnKF), a Monte Carlo sequential data assimilation method, has emerged as a promising approach for subsurface media characterization during the past decade. Due to the high computational cost of a large ensemble size, EnKF is limited to small ensemble sets in practice. This results in the appearance of spurious correlations in the covariance structure, leading to incorrect updates or probable divergence of the updated realizations. In this paper, a universal/adaptive thresholding method is presented to remove and/or mitigate the spurious correlation problem in the forecast covariance matrix. This method is then extended to regularize the Kalman gain directly. Four different thresholding functions have been considered to threshold the forecast covariance and gain matrices: hard, soft, lasso and smoothly clipped absolute deviation (SCAD) functions. Three benchmarks are used to evaluate the performances of these methods: a small 1D linear model and two 2D water flooding (petroleum reservoir) cases whose levels of heterogeneity/nonlinearity differ. It should be noted that besides adaptive thresholding, the standard distance-dependent localization and bootstrap Kalman gain are also implemented for comparison purposes. We assessed each setup with different ensemble sets to investigate the sensitivity of each method to ensemble size. The results indicate that thresholding of the forecast covariance yields more reliable performance than thresholding of the Kalman gain. Among the thresholding functions, SCAD is the most robust for both covariance and gain estimation. Our analyses emphasize that not all assimilation cycles require thresholding, and that it should be performed wisely during the early assimilation cycles. The proposed scheme of adaptive thresholding outperforms the other methods for subsurface characterization of the underlying benchmarks.
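
Applied entry-wise to a covariance or gain matrix, the thresholding functions compared above have simple closed forms. Lasso thresholding coincides with soft thresholding for a single entry, and SCAD is shown with its customary shape parameter a = 3.7 (a sketch of the scalar rules, not the paper's adaptive threshold selection):

```python
# Scalar thresholding rules with threshold t, applied entry-wise.

def hard(x, t):
    """Hard thresholding: keep the entry only if it exceeds t in magnitude."""
    return x if abs(x) > t else 0.0

def soft(x, t):
    """Soft (lasso) thresholding: shrink toward zero by t, clip at zero."""
    s = abs(x) - t
    return 0.0 if s <= 0 else (s if x > 0 else -s)

def scad(x, t, a=3.7):
    """Smoothly clipped absolute deviation: soft near zero, identity for
    large entries, with a continuous linear blend in between."""
    ax = abs(x)
    if ax <= 2 * t:
        return soft(x, t)
    if ax <= a * t:
        sign = 1.0 if x > 0 else -1.0
        return ((a - 1) * x - sign * a * t) / (a - 2)
    return x
```

SCAD's appeal, reflected in its robustness in the paper's benchmarks, is that it suppresses small (likely spurious) correlations like soft thresholding while leaving large (likely genuine) ones unbiased like hard thresholding.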

  18. Prediction of ozone concentration in tropospheric levels using artificial neural networks and support vector machine at Rio de Janeiro, Brazil

    NASA Astrophysics Data System (ADS)

    Luna, A. S.; Paredes, M. L. L.; de Oliveira, G. C. G.; Corrêa, S. M.

    2014-12-01

    It is well known that air quality is a complex function of emissions, meteorology and topography, and statistical tools provide a sound framework for relating these variables. The observed data were concentrations of nitrogen dioxide (NO2), nitrogen monoxide (NO), nitrogen oxides (NOx), carbon monoxide (CO), ozone (O3), scalar wind speed (SWS), global solar radiation (GSR), temperature (TEM) and moisture content in the air (HUM), collected by a mobile automatic monitoring station at two places in the metropolitan area of Rio de Janeiro City during 2011 and 2012. The aims of this study were: (1) to analyze the behavior of the variables, using PCA for exploratory data analysis; and (2) to propose forecasts of O3 levels from primary pollutants and meteorological factors, using nonlinear regression methods such as ANN and SVM. The PCA technique showed that for the first dataset the variables NO, NOx and SWS have the greatest impact on the concentration of O3, while for the other dataset TEM and GSR were the most influential variables. The results obtained from the nonlinear regression techniques ANN and SVM were remarkably close and acceptable for one dataset, with coefficients of determination for validation of 0.9122 and 0.9152, and root mean square errors of 7.66 and 7.85, respectively. For these datasets, PCA, SVM and ANN demonstrated their robustness as useful tools for evaluation and forecast scenarios for air quality.

  19. Beyond adaptive-critic creative learning for intelligent mobile robots

    NASA Astrophysics Data System (ADS)

    Liao, Xiaoqun; Cao, Ming; Hall, Ernest L.

    2001-10-01

    Intelligent industrial and mobile robots may be considered proven technology in structured environments. Teach programming and supervised learning methods permit solutions to a variety of applications. However, we believe that extending the operation of these machines to more unstructured environments requires a new learning method. Both unsupervised learning and reinforcement learning are potential candidates for these new tasks. The adaptive critic method has been shown to provide useful approximations or even optimal control policies for non-linear systems. The purpose of this paper is to explore the use of new learning methods that go beyond the adaptive critic method for unstructured environments. The adaptive critic is a form of reinforcement learning: a critic element provides only high-level grading corrections to a cognition module that controls the action module. In the proposed system the critic's grades are modeled and forecasted, so that an anticipated set of sub-grades is available to the cognition module. The forecasted grades are interpolated and are available on the time scale needed by the action model. The success of the system is highly dependent on the accuracy of the forecasted grades and the adaptability of the action module. Examples from the guidance of a mobile robot are provided to illustrate the method, for simple line following and for the more complex navigation and control in an unstructured environment. The theory presented that goes beyond the adaptive critic may be called creative theory. Creative theory is a form of learning that models the highest level of human learning - imagination. Creative theory appears to apply not only to mobile robots but also to many other forms of human endeavor, such as educational learning and business forecasting. Reinforcement learning such as the adaptive critic may be applied to known problems to aid in the discovery of their solutions. 
The significance of creative theory is that it permits the discovery of the unknown problems, ones that are not yet recognized but may be critical to survival or success.

  20. Short-term forecasting of meteorological time series using Nonparametric Functional Data Analysis (NPFDA)

    NASA Astrophysics Data System (ADS)

    Curceac, S.; Ternynck, C.; Ouarda, T.

    2015-12-01

    Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach is in expressing the data as curves. In the present work, the focus is on daily forecasting and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting based on the bandwidth obtained by a cross validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) information criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
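
The functional non-parametric kernel estimator has the Nadaraya-Watson form: weight each historical curve by a kernel of its semi-metric distance to the query curve, then average the responses. A sketch with a plain L2 semi-metric and an asymmetric quadratic kernel (the curves, responses and bandwidth below are illustrative; the study uses derivative- and FPCA-based semi-metrics with a cross-validated bandwidth):

```python
import math

def l2_distance(curve_a, curve_b):
    """Simple semi-metric between two sampled curves (plain L2 here)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(curve_a, curve_b)))

def quad_kernel(u):
    """Asymmetrical quadratic kernel: support only on [0, 1]."""
    return 0.75 * (1.0 - u * u) if 0.0 <= u <= 1.0 else 0.0

def functional_nw(query, curves, responses, bandwidth):
    """Nadaraya-Watson estimate of the response for a query curve."""
    weights = [quad_kernel(l2_distance(query, c) / bandwidth) for c in curves]
    total = sum(weights)
    if total == 0:
        raise ValueError("bandwidth too small: no neighbouring curves")
    return sum(w * r for w, r in zip(weights, responses)) / total
```

In the forecasting setting, each historical day's curve is paired with the next day's value as the response, so the estimator predicts tomorrow from the days whose whole daily profile most resembles today's.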

  1. Development of a Neural Network-Based Renewable Energy Forecasting Framework for Process Industries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Soobin; Ryu, Jun-Hyung; Hodge, Bri-Mathias

    2016-06-25

    This paper presents a neural network-based forecasting framework for photovoltaic power (PV) generation as a decision-supporting tool to employ renewable energies in the process industry. The applicability of the proposed framework is illustrated by comparing its performance against other methodologies such as linear and nonlinear time series modelling approaches. A case study of an actual PV power plant in South Korea is presented.

  2. Testing for nonlinearity in time series: The method of surrogate data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theiler, J.; Galdrikian, B.; Longtin, A.

    1991-01-01

    We describe a statistical approach for identifying nonlinearity in time series; in particular, we want to avoid claims of chaos when simpler models (such as linearly correlated noise) can explain the data. The method requires a careful statement of the null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. We present algorithms for generating surrogate data under various null hypotheses, and we show the results of numerical experiments on artificial data using correlation dimension, Lyapunov exponent, and forecasting error as discriminating statistics. Finally, we consider a number of experimental time series -- including sunspots, electroencephalogram (EEG) signals, and fluid convection -- and evaluate the statistical significance of the evidence for nonlinear structure in each case. 56 refs., 8 figs.

  3. Confidence intervals in Flow Forecasting by using artificial neural networks

    NASA Astrophysics Data System (ADS)

    Panagoulia, Dionysia; Tsekouras, George

    2014-05-01

    One of the major inadequacies in the implementation of Artificial Neural Networks (ANNs) for flow forecasting is the development of confidence intervals, because the relevant estimation cannot be implemented directly, in contrast to the classical forecasting methods. The variation in the ANN output is a measure of uncertainty in the model predictions based on the training data set. Different methods for uncertainty analysis, such as bootstrap, Bayesian and Monte Carlo methods, have already been proposed for hydrologic and geophysical models, while methods for confidence intervals, such as error output, re-sampling and multi-linear regression adapted to ANNs, have been used for power load forecasting [1-2]. The aim of this paper is to present the re-sampling method for ANN prediction models and to develop it for flow forecasting of the next day. The re-sampling method is based on the ascending sorting of the errors between real and predicted values for all input vectors. The cumulative sample distribution function of the prediction errors is calculated and the confidence intervals are estimated by keeping the intermediate values, rejecting the extreme values according to the desired confidence levels, and holding the intervals symmetrical in probability. For the application of the confidence intervals issue, input vectors are used from the Mesochora catchment in western-central Greece. The ANN's training algorithm is the stochastic training back-propagation process with decreasing functions of learning rate and momentum term, for which an optimization process is conducted regarding the values of the crucial parameters, such as the number of neurons, the kind of activation functions, the initial values and time parameters of the learning rate and momentum term, etc. 
Input variables are historical data of previous days, such as flows, nonlinearly weather-related temperatures and nonlinearly weather-related rainfalls, selected through correlation analysis between the flow under prediction and each candidate input variable of different ANN structures [3]. The performance of each ANN structure is evaluated by a voting analysis based on eleven criteria: the root mean square error (RMSE), the correlation index (R), the mean absolute percentage error (MAPE), the mean percentage error (MPE), the mean error (ME), the percentage volume error (VE), the percentage error in peak (MF), the normalized mean bias error (NMBE), the normalized root mean square error (NRMSE), the Nash-Sutcliffe model efficiency coefficient (E) and the modified Nash-Sutcliffe model efficiency coefficient (E1). The next-day flow for the test set is calculated using the best ANN structure's model. Consequently, the confidence intervals of various confidence levels for the training, evaluation and test sets are compared in order to explore how well confidence intervals from the training and evaluation sets generalise. [1] H.S. Hippert, C.E. Pedreira, R.C. Souza, "Neural networks for short-term load forecasting: A review and evaluation," IEEE Trans. on Power Systems, vol. 16, no. 1, 2001, pp. 44-55. [2] G. J. Tsekouras, N.E. Mastorakis, F.D. Kanellos, V.T. Kontargyri, C.D. Tsirekis, I.S. Karanasiou, Ch.N. Elias, A.D. Salis, P.A. Kontaxis, A.A. Gialketsi: "Short term load forecasting in Greek interconnected power system using ANN: Confidence Interval using a novel re-sampling technique with corrective Factor", WSEAS International Conference on Circuits, Systems, Electronics, Control & Signal Processing, (CSECS '10), Vouliagmeni, Athens, Greece, December 29-31, 2010. [3] D. Panagoulia, I. Trichakis, G. J.
Tsekouras: "Flow Forecasting via Artificial Neural Networks - A Study for Input Variables conditioned on atmospheric circulation", European Geosciences Union, General Assembly 2012 (NH1.1 / AS1.16 - Extreme meteorological and hydrological events induced by severe weather and climate change), Vienna, Austria, 22-27 April 2012.
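
    The re-sampling interval construction described above (ascending sort of the prediction errors, then cutting equal probability mass from each tail) can be sketched as follows; the function name and index arithmetic are our illustration, not the authors' code:

    ```python
    def resampling_interval(errors, confidence=0.95):
        """Empirical confidence interval for the prediction error,
        symmetric in probability: reject the same fraction of extreme
        values in each tail of the sorted error sample."""
        s = sorted(errors)                       # ascending sort of errors
        n = len(s)
        tail = (1.0 - confidence) / 2.0
        lo = int(tail * (n - 1))                 # index of lower bound
        hi = int(round((1.0 - tail) * (n - 1)))  # index of upper bound
        return s[lo], s[hi]

    # A forecast interval for a new ANN prediction y_hat is then
    # [y_hat + lower, y_hat + upper], with (lower, upper) taken from the
    # training-set errors.
    ```
    
    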

  4. Learning-based Wind Estimation using Distant Soundings for Unguided Aerial Delivery

    NASA Astrophysics Data System (ADS)

    Plyler, M.; Cahoy, K.; Angermueller, K.; Chen, D.; Markuzon, N.

    2016-12-01

    Delivering unguided, parachuted payloads from aircraft requires accurate knowledge of the wind field inside an operational zone. Usually, a dropsonde released from the aircraft over the drop zone gives a more accurate wind estimate than a forecast. Mission objectives occasionally demand releasing the dropsonde away from the drop zone, but still require accuracy and precision. Barnes interpolation and many other assimilation methods do poorly when the forecast error is inconsistent across a forecast grid. A machine learning approach can better leverage non-linear relations between different weather patterns and thus provide a better wind estimate at the target drop zone when using data collected up to 100 km away. This study uses the 13 km resolution Rapid Refresh (RAP) dataset available through NOAA, subsampled to an area around Yuma, AZ and up to approximately 10 km AMSL. RAP forecast grids are updated with simulated dropsondes taken from analysis (historical weather maps). We train models using different data mining and machine learning techniques, most notably boosted regression trees, that can accurately assimilate the distant dropsonde. The model takes a forecast grid and simulated remote dropsonde data as input and produces an estimate of the wind stick over the drop zone. Using ballistic winds as a defining metric, we show that our data-driven approach does better than Barnes interpolation under some conditions, most notably when the forecast error differs between the two locations, on test data previously unseen by the model. We study and evaluate the model's performance depending on the size, the time lag, the drop altitude, and the geographic location of the training set, and identify the parameters contributing most to the accuracy of the wind estimation. This study demonstrates a new approach for assimilating remotely released dropsondes, based on boosted regression trees, and shows improvement in wind estimation over currently used methods.

  5. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence

    PubMed Central

    Kelly, David; Majda, Andrew J.; Tong, Xin T.

    2015-01-01

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature. PMID:26261335

  6. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.

    PubMed

    Kelly, David; Majda, Andrew J; Tong, Xin T

    2015-08-25

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature.

  7. Forecasting characteristics of flood effects

    NASA Astrophysics Data System (ADS)

    Khamutova, M. V.; Rezchikov, A. F.; Kushnikov, V. A.; Ivaschenko, V. A.; Bogomolov, A. S.; Filimonyuk, L. Yu; Dolinina, O. N.; Kushnikova, E. V.; Shulga, T. E.; Tverdokhlebov, V. A.; Fominykh, D. S.

    2018-05-01

    The article presents the development of a system dynamics mathematical model that allows forecasting the characteristics of flood effects. The model is based on a causal diagram and is expressed as a system of nonlinear differential equations. The simulated characteristics are the nodes of the diagram, and the edges define the functional relationships between them. The numerical solution of the system of equations was obtained using the Runge-Kutta method. Computer experiments to determine the characteristics on different time intervals have been carried out, and the results have been compared with data from a real flood. The obtained results make it possible to assert that the developed model is valid. The results of the study are useful in the development of an information system for the operating and dispatching staff of the Ministry of the Russian Federation for Civil Defence, Emergencies and Elimination of Consequences of Natural Disasters (EMERCOM).
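
    The numerical scheme named in the abstract, classical fourth-order Runge-Kutta applied to a system of nonlinear ODEs, can be sketched generically; the two-variable "flood" system below is purely illustrative and is not the model from the paper:

    ```python
    def rk4_step(f, t, y, h):
        """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
        k1 = f(t, y)
        k2 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k1)])
        k3 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k2)])
        k4 = f(t + h,   [yi + h*ki   for yi, ki in zip(y, k3)])
        return [yi + h/6*(a + 2*b + 2*c + d)
                for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

    def integrate(f, t0, y0, h, steps):
        """Integrate the system and return the trajectory of states."""
        t, y, out = t0, list(y0), [list(y0)]
        for _ in range(steps):
            y = rk4_step(f, t, y, h)
            t += h
            out.append(list(y))
        return out

    # Hypothetical two-node causal diagram: flooded area A drives damage D,
    # while both decay over time (illustrative only, not the paper's model).
    def toy_flood(t, y):
        A, D = y
        return [-0.5 * A,            # water recedes
                0.8 * A - 0.2 * D]   # damage grows with A, is repaired at rate 0.2
    ```
    
    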

  8. Long-term evolution of electron distribution function due to nonlinear resonant interaction with whistler mode waves

    NASA Astrophysics Data System (ADS)

    Artemyev, Anton V.; Neishtadt, Anatoly I.; Vasiliev, Alexei A.

    2018-04-01

    Accurate modelling and forecasting of the dynamics of the Earth's radiation belts with the available computer resources represents an important challenge that still requires significant advances in the theoretical plasma physics of wave-particle resonant interaction. Energetic electron acceleration, and scattering into the Earth's atmosphere, is essentially controlled by resonances with electromagnetic whistler mode waves. The quasi-linear diffusion equation describes this resonant interaction well for low-intensity waves. During the last decade, however, spacecraft observations in the radiation belts have revealed a large number of whistler mode waves with sufficiently high intensity to interact with electrons in the nonlinear regime. A kinetic equation including such nonlinear wave-particle interactions and describing the long-term evolution of the electron distribution is the focus of the present paper. Using the Hamiltonian theory of resonant phenomena, we describe an individual electron resonance with an intense coherent whistler mode wave. The derived characteristics of such a resonance are incorporated into a generalized kinetic equation which includes non-local transport in energy space. This transport is produced by resonant electron trapping and nonlinear acceleration. We describe the methods allowing the construction of nonlinear resonant terms in the kinetic equation and discuss possible applications of this equation.

  9. A recurrence-weighted prediction algorithm for musical analysis

    NASA Astrophysics Data System (ADS)

    Colucci, Renato; Leguizamon Cucunuba, Juan Sebastián; Lloyd, Simon

    2018-03-01

    Forecasting the future behaviour of a system using past data is an important topic. In this article we apply nonlinear time series analysis in the context of music, and present new algorithms for extending a sample of music while maintaining characteristics similar to the original piece. Using ideas from ergodic theory, we adapt the classical prediction method of Lorenz analogues to take recurrence times into account, and demonstrate with examples how the new algorithm can produce predictions with a high degree of similarity to the original sample.
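
    The classical method of Lorenz analogues that the authors start from is simple enough to sketch: find the past states closest to the present one and average their successors. The recurrence-time weighting that the paper adds is not reproduced here:

    ```python
    def analogue_forecast(series, k=3):
        """One-step forecast by the classical Lorenz method of analogues:
        average the successors of the k past values closest to the
        current one."""
        current = series[-1]
        # candidate analogues: every past index that has a successor
        cands = sorted(range(len(series) - 1),
                       key=lambda i: abs(series[i] - current))[:k]
        return sum(series[i + 1] for i in cands) / k
    ```

    For a strictly periodic series the analogues all share the same successor, so the forecast reproduces the next period exactly.
    
    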

  10. Forecasting monthly inflow discharge of the Iffezheim reservoir using data-driven models

    NASA Astrophysics Data System (ADS)

    Zhang, Qing; Aljoumani, Basem; Hillebrand, Gudrun; Hoffmann, Thomas; Hinkelmann, Reinhard

    2017-04-01

    River stream flow is an essential element in the study of hydrology, especially for reservoir management, since it defines the input into reservoirs. Forecasting this stream flow plays an important role in short- or long-term planning and management of the reservoir, e.g. optimized reservoir and hydroelectric operation or agricultural irrigation. Highly accurate flow forecasting can significantly reduce economic losses and is always pursued by reservoir operators. Therefore, hydrologic time series forecasting has received tremendous attention from researchers, and many models have been proposed to improve hydrological forecasting. Because most natural phenomena occurring in environmental systems appear to behave in random or probabilistic ways, different cases may need different methods to forecast the inflow, and even a unique treatment to improve the forecast accuracy. The purpose of this study is to determine an appropriate model for forecasting monthly inflow to the Iffezheim reservoir in Germany, which is the last of the barrages in the Upper Rhine. Monthly time series of discharges, measured from 1946 to 2001 at the Plittersdorf station, which is located 6 km downstream of the Iffezheim reservoir, were applied. The accuracies of the stochastic models used - the Fiering model and Auto-Regressive Integrated Moving Average (ARIMA) models - are compared with Artificial Intelligence (AI) models - a single Artificial Neural Network (ANN) and Wavelet ANN (WANN) models. The Fiering model is a linear stochastic model and is used for generating synthetic monthly data. The basic idea in modeling time series using ARIMA is to identify a simple model with as few model parameters as possible in order to provide a good statistical fit to the data. To identify and fit the ARIMA models, a four-phase approach was used: identification, parameter estimation, diagnostic checking, and forecasting.
An automatic selection criterion, such as the Akaike information criterion, is utilized to enhance this flexible approach to setting up the model. In contrast to the two stochastic models, the ANN and the related conjunction method, the Wavelet-ANN (WANN) model, are effective at handling non-linear systems and have been developed with antecedent flows as inputs to forecast up to 12 months of lead time for the Iffezheim reservoir. In the ANN and WANN models, the Feed Forward Back Propagation method (FFBP) is applied. The sigmoid and linear activation functions were used, with several different numbers of neurons, for the hidden layers and for the output layer, respectively. To compare the accuracy of the different models and identify the most suitable model for reliable forecasting, three standard quantitative statistical performance measures are employed: the root mean square error (RMSE), the mean absolute error (MAE) and the determination coefficient (DC). The results reveal that the ARIMA(2,1,2) model performs better than the Fiering, ANN and WANN models. Further, the WANN model is found to be slightly better than the ANN model for forecasting monthly inflow of the Iffezheim reservoir. As a result, by using the ARIMA model, the predicted and observed values agree reasonably well.
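
The three evaluation measures can be computed directly. Note that the DC is implemented below in its Nash-Sutcliffe form (1 minus the ratio of residual to total variance); this is our assumption about the paper's definition, not a confirmed detail:

```python
import math

def rmse(obs, sim):
    """Root mean square error between observed and simulated series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def mae(obs, sim):
    """Mean absolute error."""
    return sum(abs(o - s) for o, s in zip(obs, sim)) / len(obs)

def dc(obs, sim):
    """Determination coefficient, Nash-Sutcliffe form: 1 is a perfect
    fit, 0 means no better than the mean of the observations."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot
```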

  11. Multidimensional density shaping by sigmoids.

    PubMed

    Roth, Z; Baram, Y

    1996-01-01

    An estimate of the probability density function of a random vector is obtained by maximizing the output entropy of a feedforward network of sigmoidal units with respect to the input weights. Classification problems can be solved by selecting the class associated with the maximal estimated density. Newton's optimization method, applied to the estimated density, yields a recursive estimator for a random variable or a random sequence. A constrained connectivity structure yields a linear estimator, which is particularly suitable for "real time" prediction. A Gaussian nonlinearity yields a closed-form solution for the network's parameters, which may also be used for initializing the optimization algorithm when other nonlinearities are employed. A triangular connectivity between the neurons and the input, which is naturally suggested by the statistical setting, reduces the number of parameters. Applications to classification and forecasting problems are demonstrated.

  12. Forecasting of dissolved oxygen in the Guanting reservoir using an optimized NGBM (1,1) model.

    PubMed

    An, Yan; Zou, Zhihong; Zhao, Yanfei

    2015-03-01

    An optimized nonlinear grey Bernoulli model was proposed, using a particle swarm optimization algorithm to solve the parameter optimization problem. In addition, each item in the first-order accumulated generating sequence was set in turn as the initial condition to determine which alternative would yield the highest forecasting accuracy. To test the forecasting performance, the optimized models with different initial conditions were then used to simulate dissolved oxygen concentrations at the Guanting reservoir inlet and outlet (China). The empirical results show that the optimized model can remarkably improve forecasting accuracy, and that particle swarm optimization is a good tool for solving parameter optimization problems. Moreover, an optimized model whose initial condition performs well in in-sample simulation may not do as well in out-of-sample forecasting.
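
    For orientation, the classical GM(1,1) grey model, which is the special case of NGBM(1,1) with Bernoulli exponent zero, can be sketched as below. The paper's actual contribution (a PSO-optimized exponent and rotating initial conditions) is not reproduced here:

    ```python
    import math

    def gm11_forecast(x0, steps=1):
        """Classical GM(1,1) grey forecast.  NGBM(1,1) reduces to this
        when the Bernoulli exponent is zero; the paper additionally
        optimizes that exponent and the initial condition."""
        n = len(x0)
        x1 = [sum(x0[:i + 1]) for i in range(n)]               # 1-AGO sequence
        z1 = [0.5 * (x1[i] + x1[i - 1]) for i in range(1, n)]  # background values
        # least squares for a, b in the grey equation x0(k) = -a*z1(k) + b
        m = n - 1
        sz, szz = sum(z1), sum(z * z for z in z1)
        sy, szy = sum(x0[1:]), sum(z * y for z, y in zip(z1, x0[1:]))
        den = m * szz - sz * sz
        a = (sz * sy - m * szy) / den
        b = (szz * sy - sz * szy) / den
        # time response function with initial condition x1(0) = x0(0)
        x1_hat = lambda k: (x0[0] - b / a) * math.exp(-a * k) + b / a
        # forecasts are first differences of the fitted accumulated series
        return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]
    ```

    For data with modest exponential growth, the one-step forecast closely matches the true continuation.
    
    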

  13. Improved Neural Networks with Random Weights for Short-Term Load Forecasting

    PubMed Central

    Lang, Kun; Zhang, Mingyuan; Yuan, Yongbo

    2015-01-01

    An effective forecasting model for short-term load plays a significant role in promoting the management efficiency of an electric power system. This paper proposes a new forecasting model based on improved neural networks with random weights (INNRW). The key is to introduce a weighting technique for the inputs of the model and use a novel neural network to forecast the daily maximum load. Eight factors are selected as inputs. A mutual information weighting algorithm is then used to allocate different weights to the inputs. The neural network with random weights and kernels (KNNRW) is applied to approximate the nonlinear function between the selected inputs and the daily maximum load, owing to its fast learning speed and good generalization performance. In an application to the daily load in Dalian, the result of the proposed INNRW is compared with several previously developed forecasting models. The simulation experiment shows that the proposed model performs the best overall in short-term load forecasting. PMID:26629825

  14. Improved Neural Networks with Random Weights for Short-Term Load Forecasting.

    PubMed

    Lang, Kun; Zhang, Mingyuan; Yuan, Yongbo

    2015-01-01

    An effective forecasting model for short-term load plays a significant role in promoting the management efficiency of an electric power system. This paper proposes a new forecasting model based on improved neural networks with random weights (INNRW). The key is to introduce a weighting technique for the inputs of the model and use a novel neural network to forecast the daily maximum load. Eight factors are selected as inputs. A mutual information weighting algorithm is then used to allocate different weights to the inputs. The neural network with random weights and kernels (KNNRW) is applied to approximate the nonlinear function between the selected inputs and the daily maximum load, owing to its fast learning speed and good generalization performance. In an application to the daily load in Dalian, the result of the proposed INNRW is compared with several previously developed forecasting models. The simulation experiment shows that the proposed model performs the best overall in short-term load forecasting.

  15. Verification of an ensemble prediction system for storm surge forecast in the Adriatic Sea

    NASA Astrophysics Data System (ADS)

    Mel, Riccardo; Lionello, Piero

    2014-12-01

    In the Adriatic Sea, storm surges present a significant threat to Venice and to the flat coastal areas of the northern coast of the basin. Sea level forecast is of paramount importance for the management of daily activities and for operating the movable barriers that are presently being built for the protection of the city. In this paper, an EPS (ensemble prediction system) for operational forecasting of storm surge in the northern Adriatic Sea is presented and applied to a 3-month-long period (October-December 2010). The sea level EPS is based on the HYPSE (hydrostatic Padua Sea elevation) model, which is a standard single-layer nonlinear shallow water model, whose forcings (mean sea level pressure and surface wind fields) are provided by the ensemble members of the ECMWF (European Center for Medium-Range Weather Forecasts) EPS. Results are verified against observations at five tide gauges located along the Croatian and Italian coasts of the Adriatic Sea. Forecast uncertainty increases with the predicted value of the storm surge and with the forecast lead time. The EMF (ensemble mean forecast) provided by the EPS has a rms (root mean square) error lower than the DF (deterministic forecast), especially for short (up to 3 days) lead times. Uncertainty for short lead times of the forecast and for small storm surges is mainly caused by uncertainty of the initial condition of the hydrodynamical model. Uncertainty for large lead times and large storm surges is mainly caused by uncertainty in the meteorological forcings. The EPS spread increases with the rms error of the forecast. For large lead times the EPS spread and the forecast error substantially coincide. However, the EPS spread in this study, which does not account for uncertainty in the initial condition, underestimates the error during the early part of the forecast and for small storm surge values. On the contrary, it overestimates the rms error for large surge values. 
The PF (probability forecast) of the EPS has a clear skill in predicting the actual probability distribution of sea level, and it outperforms simple "dressed" PF methods. A probability estimate based on the single DF is shown to be inadequate. However, a PF obtained with a prescribed Gaussian distribution and centered on the DF value performs very similarly to the EPS-based PF.
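
The basic verification quantities used above, the ensemble mean forecast, the rms error and the ensemble spread, are straightforward to compute from a set of member forecasts. This sketch uses our own minimal conventions (each member as an equal-length list over lead times), not the paper's code:

```python
import math

def ensemble_mean(members):
    """EMF: average the forecasts of all ensemble members, per lead time."""
    return [sum(vals) / len(vals) for vals in zip(*members)]

def rms_error(forecast, observed):
    """Root mean square error of a forecast against observations."""
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecast, observed))
                     / len(observed))

def ensemble_spread(members):
    """Standard deviation of the members about the ensemble mean,
    per lead time; compared with the rms error to assess reliability."""
    emf = ensemble_mean(members)
    m = len(members)
    return [math.sqrt(sum((mem[t] - emf[t]) ** 2 for mem in members) / m)
            for t in range(len(emf))]
```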

  16. Short-Term Distribution System State Forecast Based on Optimal Synchrophasor Sensor Placement and Extreme Learning Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Huaiguang; Zhang, Yingchen

    This paper proposes an approach for distribution system state forecasting, which aims to provide accurate and high-speed state forecasting with an optimal synchrophasor sensor placement (OSSP) based state estimator and an extreme learning machine (ELM) based forecaster. Specifically, considering the sensor installation cost and measurement error, an OSSP algorithm is proposed to reduce the number of synchrophasor sensors while keeping the whole distribution system numerically and topologically observable. Then, the weighted least squares (WLS) based system state estimator is used to produce the training data for the proposed forecaster. Traditionally, the artificial neural network (ANN) and support vector regression (SVR) are widely used in forecasting due to their nonlinear modeling capabilities. However, the ANN carries a heavy computation load and the best parameters for SVR are difficult to obtain. In this paper, the ELM, which overcomes these drawbacks, is used to forecast the future system states from the historical system states. The proposed approach is effective and accurate based on the testing results.
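
    The ELM forecaster at the core of the approach is compact: the hidden-layer weights are drawn at random and never trained, and only the linear readout is solved by least squares, which is what makes it faster than back-propagated ANNs. A generic sketch (not the paper's exact configuration):

    ```python
    import numpy as np

    def elm_train(X, y, n_hidden=50, seed=0):
        """Extreme learning machine: hidden weights are random and fixed;
        only the linear output layer is solved, by least squares."""
        rng = np.random.default_rng(seed)
        W = rng.normal(size=(X.shape[1], n_hidden))
        b = rng.normal(size=n_hidden)
        H = np.tanh(X @ W + b)                        # random feature map
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights
        return W, b, beta

    def elm_predict(X, W, b, beta):
        """Forecast: push inputs through the fixed random layer, then read out."""
        return np.tanh(X @ W + b) @ beta
    ```
    
    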

  17. Application and verification of ECMWF seasonal forecast for wind energy

    NASA Astrophysics Data System (ADS)

    Žagar, Mark; Marić, Tomislav; Qvist, Martin; Gulstad, Line

    2015-04-01

    A good understanding of long-term annual energy production (AEP) is crucial when assessing the business case of investing in green energy such as wind power. The art of wind-resource assessment has emerged as a scientific discipline in its own right, which has advanced at a high pace over the last decade. This has resulted in continuous improvement of AEP accuracy and, therefore, an increase in business case certainty. Harvesting the full potential output of a wind farm or a portfolio of wind farms depends heavily on optimizing the operation and management strategy. The necessary information for short-term planning (up to 14 days) is provided by standard weather and power forecasting services, and long-term plans are based on climatology. However, the wind-power industry lacks quality information on intermediate scales about the expected variability in seasonal and intra-annual variations and their geographical distribution. The seasonal power forecast presented here is designed to bridge this gap. The seasonal power production forecast is based on the ECMWF seasonal weather forecast and Vestas' high-resolution, mesoscale weather library. The seasonal weather forecast is enriched through a layer of statistical post-processing added to relate large-scale wind speed anomalies to mesoscale climatology. The resulting predicted energy production anomalies thus include mesoscale effects not captured by the global forecasting systems. The turbine power output is non-linearly related to the wind speed, which has important implications for the wind power forecast. In theory, the wind power is proportional to the cube of the wind speed. However, due to the nature of turbine design, this exponent is close to 3 only at low wind speeds, becomes smaller as the wind speed increases, and above 11-13 m/s the power output remains constant, at the so-called rated power.
The non-linear relationship between wind speed and the power output generally increases sensitivity of the forecasted power to the wind speed anomalies. On the other hand, in some cases and areas where turbines operate close to, or above the rated power, the sensitivity of power forecast is reduced. Thus, the seasonal power forecasting system requires good knowledge of the changes in frequency of events with sufficient wind speeds to have acceptable skill. The scientific background for the Vestas seasonal power forecasting system is described and the relationship between predicted monthly wind speed anomalies and observed wind energy production are investigated for a number of operating wind farms in different climate zones. Current challenges will be discussed and some future research and development areas identified.
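
The power-curve behaviour described above (cubic growth at low speeds, flat at rated power above 11-13 m/s) can be written down as a simple idealized function. The parameter values below are illustrative, not those of any specific turbine:

```python
def power_output(v, rated_power=3000.0, cut_in=3.0, rated_speed=12.0,
                 cut_out=25.0):
    """Idealized turbine power curve (kW): zero below cut-in and at or
    above cut-out, cubic between cut-in and rated speed, flat at rated
    power in between.  All parameter values are illustrative."""
    if v < cut_in or v >= cut_out:
        return 0.0
    if v >= rated_speed:
        return rated_power
    # cubic ramp, scaled so it reaches rated power at rated speed
    return rated_power * ((v ** 3 - cut_in ** 3)
                          / (rated_speed ** 3 - cut_in ** 3))
```

The curve makes the forecast-sensitivity argument concrete: near 6-9 m/s a small wind-speed anomaly moves the power a lot, while above rated speed the same anomaly moves it not at all.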

  18. Forecasting the response of Earth's surface to future climatic and land use changes: A review of methods and research needs

    DOE PAGES

    Pelletier, Jon D.; Murray, A. Brad; Pierce, Jennifer L.; ...

    2015-07-14

    In the future, Earth will be warmer, precipitation events will be more extreme, global mean sea level will rise, and many arid and semiarid regions will be drier. Human modifications of landscapes will also occur at an accelerated rate as developed areas increase in size and population density. We now have gridded global forecasts, being continually improved, of the climatic and land use changes (C&LUC) that are likely to occur in the coming decades. However, besides a few exceptions, consensus forecasts do not exist for how these C&LUC will likely impact Earth-surface processes and hazards. In some cases, we havemore » the tools to forecast the geomorphic responses to likely future C&LUC. Fully exploiting these models and utilizing these tools will require close collaboration among Earth-surface scientists and Earth-system modelers. This paper assesses the state-of-the-art tools and data that are being used or could be used to forecast changes in the state of Earth's surface as a result of likely future C&LUC. We also propose strategies for filling key knowledge gaps, emphasizing where additional basic research and/or collaboration across disciplines are necessary. The main body of the paper addresses cross-cutting issues, including the importance of nonlinear/threshold-dominated interactions among topography, vegetation, and sediment transport, as well as the importance of alternate stable states and extreme, rare events for understanding and forecasting Earth-surface response to C&LUC. Five supplements delve into different scales or process zones (global-scale assessments and fluvial, aeolian, glacial/periglacial, and coastal process zones) in detail.« less

  19. Predictability of CFSv2 in the tropical Indo-Pacific region, at daily and subseasonal time scales

    NASA Astrophysics Data System (ADS)

    Krishnamurthy, V.

    2018-06-01

    The predictability of a coupled climate model is evaluated at daily and intraseasonal time scales in the tropical Indo-Pacific region during boreal summer and winter. This study has assessed the daily retrospective forecasts of the Climate Forecast System version 2 from the National Centers for Environmental Prediction for the period 1982-2010. The growth of errors in the forecasts of daily precipitation, the monsoon intraseasonal oscillation (MISO) and the Madden-Julian oscillation (MJO) is studied. The seasonal cycle of the daily climatology of precipitation is reasonably well predicted, except for the underestimation during the peak of summer. The anomalies follow the typical pattern of error growth in nonlinear systems and show no difference between summer and winter. The initial errors in all the cases are found to be in the nonlinear phase of the error growth. The doubling time of small errors is estimated by applying the Lorenz error-growth formula. For summer and winter, the doubling time of the forecast errors is in the range of 4-7 and 5-14 days, while the doubling time of the predictability errors is 6-8 and 8-14 days, respectively. The doubling time in MISO during the summer and MJO during the winter is in the range of 12-14 days, indicating higher predictability and providing optimism for long-range prediction. There is no significant difference in the growth of forecast errors originating from different phases of MISO and MJO, although the prediction of the active phase seems to be slightly better.
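
    A doubling-time estimate of this kind follows from fitting exponential growth to the early (small-error) part of an error curve, the small-error limit of the Lorenz error-growth formula. The log-linear fit below is our illustration of the idea, not the study's procedure:

    ```python
    import math

    def doubling_time(errors, dt=1.0):
        """Small-error doubling time: fit E(t) = E0 * exp(a*t) to the
        early part of an error-growth curve by log-linear least squares,
        then return ln(2) / a."""
        logs = [math.log(e) for e in errors]
        n = len(logs)
        ts = [i * dt for i in range(n)]
        tbar = sum(ts) / n
        lbar = sum(logs) / n
        a = (sum((t - tbar) * (l - lbar) for t, l in zip(ts, logs))
             / sum((t - tbar) ** 2 for t in ts))
        return math.log(2) / a
    ```
    
    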

  20. Stochastic nonlinear time series forecasting using time-delay reservoir computers: performance and universality.

    PubMed

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2014-07-01

    Reservoir computing is a recently introduced machine learning paradigm that has already shown excellent performance in the processing of empirical data. We study a particular kind of reservoir computers called time-delay reservoirs, which are constructed from samples of the solution of a time-delay differential equation, and show their good performance in forecasting the conditional covariances associated with multivariate discrete-time nonlinear stochastic processes of VEC-GARCH type, as well as in predicting actual daily market realized volatilities computed with intraday quotes, using daily log-return series of moderate size as training input. We tackle some problems associated with the lack of task-universality for individually operating reservoirs and propose a solution based on the use of parallel arrays of time-delay reservoirs.
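
    A heavily simplified, discrete-time caricature of a time-delay reservoir conveys the idea: a fixed random input mask spreads each sample over virtual nodes, each node feeds back its own delayed state, and only a linear (ridge-regression) readout is trained. The structure and parameters here are illustrative only, not the authors' delay-differential setup:

    ```python
    import numpy as np

    def tdr_states(u, n_nodes=30, nu=0.5, phi=0.8, seed=0):
        """Simplified discrete-time time-delay reservoir: each scalar input
        is multiplied by a fixed random mask over the virtual nodes, and
        each node keeps a delayed feedback of its own previous state."""
        rng = np.random.default_rng(seed)
        mask = rng.uniform(-1, 1, n_nodes)
        x = np.zeros(n_nodes)
        states = []
        for ut in u:
            x = np.tanh(nu * mask * ut + phi * x)
            states.append(x.copy())
        return np.array(states)

    def train_readout(states, target, ridge=1e-6):
        """Linear readout by ridge regression (the only trained part)."""
        H = np.hstack([states, np.ones((len(states), 1))])  # bias column
        A = H.T @ H + ridge * np.eye(H.shape[1])
        return np.linalg.solve(A, H.T @ target)

    def readout_predict(states, w):
        H = np.hstack([states, np.ones((len(states), 1))])
        return H @ w
    ```

    Trained for one-step-ahead prediction of a smooth signal, the fixed random reservoir plus linear readout already forecasts accurately after a short washout period.
    
    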

  1. Dissolved oxygen content prediction in crab culture using a hybrid intelligent method

    PubMed Central

    Yu, Huihui; Chen, Yingyi; Hassan, ShahbazGul; Li, Daoliang

    2016-01-01

    A precise predictive model is needed to obtain a clear understanding of the changing dissolved oxygen content in outdoor crab ponds, to assess how to reduce risk and to optimize water quality management. The uncertainties in the data from multiple sensors are a significant factor when building a dissolved oxygen content prediction model. To increase prediction accuracy, a new hybrid dissolved oxygen content forecasting model is developed, based on the radial basis function neural network (RBFNN) data fusion method and a least squares support vector machine (LSSVM) with an improved particle swarm optimization (IPSO). In the modelling process, the RBFNN data fusion method is used to improve information accuracy and provide more trustworthy training samples for the IPSO-LSSVM prediction model. The LSSVM is a powerful tool for nonlinear dissolved oxygen content forecasting. In addition, an improved particle swarm optimization algorithm is developed to determine the optimal parameters for the LSSVM with high accuracy and generalizability. In this study, the comparison of the prediction results of different traditional models validates the effectiveness and accuracy of the proposed hybrid RBFNN-IPSO-LSSVM model for dissolved oxygen content prediction in outdoor crab ponds. PMID:27270206

  2. Dissolved oxygen content prediction in crab culture using a hybrid intelligent method.

    PubMed

    Yu, Huihui; Chen, Yingyi; Hassan, ShahbazGul; Li, Daoliang

    2016-06-08

    A precise predictive model is needed to obtain a clear understanding of the changing dissolved oxygen content in outdoor crab ponds, to assess how to reduce risk, and to optimize water quality management. The uncertainties in the data from multiple sensors are a significant factor when building a dissolved oxygen content prediction model. To increase prediction accuracy, a new hybrid dissolved oxygen content forecasting model is developed, based on a radial basis function neural network (RBFNN) data fusion method and a least squares support vector machine (LSSVM) tuned by an improved particle swarm optimization (IPSO) algorithm. In the modelling process, the RBFNN data fusion method is used to improve information accuracy and provide more trustworthy training samples for the IPSO-LSSVM prediction model. The LSSVM is a powerful tool for nonlinear dissolved oxygen content forecasting. In addition, an improved particle swarm optimization algorithm is developed to determine the optimal parameters of the LSSVM with high accuracy and generalizability. In this study, comparison with the prediction results of traditional models validates the effectiveness and accuracy of the proposed hybrid RBFNN-IPSO-LSSVM model for dissolved oxygen content prediction in outdoor crab ponds.
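
    The LSSVM at the core of the hybrid model above reduces to solving a single linear system. A minimal sketch, assuming an RBF kernel and hand-picked hyperparameters (gamma, sigma) in place of the IPSO tuning described in the abstract:

```python
import numpy as np

def lssvm_fit(X, y, gamma=100.0, sigma=1.0):
    """Least-squares SVM regression: solve the KKT linear system
    [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    # RBF (Gaussian) kernel matrix
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2)) @ alpha + b

# toy usage: recover a smooth nonlinear curve
X = np.linspace(0, 6, 40)[:, None]
y = np.sin(X[:, 0])
b, alpha = lssvm_fit(X, y)
pred = lssvm_predict(X, b, alpha, X)
print(np.abs(pred - y).max())  # small in-sample error
```

    In the paper's full method, IPSO would search over (gamma, sigma) instead of fixing them.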

  3. Estimating the state of a geophysical system with sparse observations: time delay methods to achieve accurate initial states for prediction

    NASA Astrophysics Data System (ADS)

    An, Zhe; Rey, Daniel; Ye, Jingxin; Abarbanel, Henry D. I.

    2017-01-01

    The problem of forecasting the behavior of a complex dynamical system through analysis of observational time-series data becomes difficult when the system expresses chaotic behavior and the measurements are sparse in space and/or time. Although this situation is quite typical across many fields, including numerical weather prediction, the issue of whether the available observations are "sufficient" for generating successful forecasts is still not well understood. An analysis by Whartenby et al. (2013) found that in the context of the nonlinear shallow water equations on a β plane, standard nudging techniques require observing approximately 70 % of the full set of state variables. Here we examine the same system using a method introduced by Rey et al. (2014a), which generalizes standard nudging methods to utilize time delayed measurements. We show that in certain circumstances, it provides a sizable reduction in the number of observations required to construct accurate estimates and high-quality predictions. In particular, we find that this estimate of 70 % can be reduced to about 33 % using time delays, and even further if Lagrangian drifter locations are also used as measurements.
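
    The standard (non-delayed) nudging that the paper generalizes can be sketched on a small chaotic system. A minimal illustration, assuming the Lorenz-63 equations with only the x component observed; the gain and step size are illustrative, not taken from the paper:

```python
import numpy as np

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Lorenz-63 vector field."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

dt, k = 0.001, 40.0                 # Euler step and nudging gain (illustrative)
truth = np.array([1.0, 1.0, 1.0])   # "true" system generating observations
est = np.array([5.0, -5.0, 20.0])   # estimate started from a wrong state

for _ in range(20000):              # 20 time units
    truth = truth + dt * lorenz(truth)
    # nudge the estimate toward the observed x component only
    nudge = np.array([k * (truth[0] - est[0]), 0.0, 0.0])
    est = est + dt * (lorenz(est) + nudge)

print(np.linalg.norm(truth - est))  # estimate synchronizes with the truth
```

    The unobserved y and z components are recovered through the coupling in the dynamics; the time-delay method of Rey et al. extends this idea by also using delayed copies of the observations.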

  4. Uncertainty analysis of neural network based flood forecasting models: An ensemble based approach for constructing prediction interval

    NASA Astrophysics Data System (ADS)

    Kasiviswanathan, K.; Sudheer, K.

    2013-05-01

    Artificial neural network (ANN) based hydrologic models have gained a lot of attention among water resources engineers and scientists owing to their potential for accurate prediction of flood flows compared to conceptual or physics-based hydrologic models. The ANN approximates the non-linear functional relationship among the complex hydrologic variables to arrive at river flow forecast values. Despite a large number of applications, there is still criticism that ANN point predictions lack reliability, since the uncertainty of the predictions is not quantified, which limits their use in practical applications. A major concern in applying traditional uncertainty analysis techniques to the neural network framework is its parallel computing architecture with large degrees of freedom, which makes uncertainty assessment a challenging task. Very few studies have considered assessment of the predictive uncertainty of ANN-based hydrologic models. In this study, a novel method is proposed that helps construct the prediction interval of an ANN flood forecasting model during calibration itself. The method has two stages of optimization during calibration: in stage 1, the ANN model is trained with a genetic algorithm (GA) to obtain the optimal set of weights and biases, and in stage 2, the optimal variability of the ANN parameters (obtained in stage 1) is identified so as to create an ensemble of predictions. The second-stage optimization has multiple objectives: (i) minimum residual variance for the ensemble mean, (ii) maximum number of measured data points falling within the estimated prediction interval, and (iii) minimum width of the prediction interval. The method is illustrated using a real-world case study of an Indian basin. The method was able to produce an ensemble with an average prediction interval width of 23.03 m3/s, with 97.17% of the total validation data points (measured) lying within the interval.
    The derived prediction interval for a selected hydrograph in the validation data set is presented in Fig. 1. Most of the observed flows lie within the constructed prediction interval, which therefore provides information about the uncertainty of the prediction. One specific advantage of the method is that when the ensemble mean is taken as the forecast, peak flows are predicted with improved accuracy compared to traditional single-point ANN forecasts. Fig. 1 Prediction Interval for selected hydrograph
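
    The second-stage idea, perturbing calibrated parameters to obtain an ensemble and reading off interval width and coverage, can be sketched with a much simpler surrogate. A minimal illustration, assuming a linear model in place of the ANN and a fixed parameter spread in place of the optimized variability; all values are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
flow = 3.0 * x + 5.0 + rng.normal(0, 2.0, 200)    # synthetic "observed" flows

# stage 1: best-fit parameters of the surrogate model
A = np.c_[x, np.ones_like(x)]
theta, *_ = np.linalg.lstsq(A, flow, rcond=None)

# stage 2 (simplified): sample parameters around the optimum to form an ensemble
ens = theta + rng.normal(0, [0.25, 0.8], size=(100, 2))
preds = ens @ A.T                                  # 100 ensemble predictions

lo, hi = np.percentile(preds, [2.5, 97.5], axis=0)
coverage = np.mean((flow >= lo) & (flow <= hi))
print(hi.mean() - lo.mean(), coverage)             # interval width and coverage
```

    In the paper the parameter spread is itself optimized against the three objectives (residual variance, coverage, interval width) rather than fixed as here.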

  5. Comparative Performance Evaluation of Rainfall-runoff Models, Six of Black-box Type and One of Conceptual Type, From The Galway Flow Forecasting System (gffs) Package, Applied On Two Irish Catchments

    NASA Astrophysics Data System (ADS)

    Goswami, M.; O'Connor, K. M.; Shamseldin, A. Y.

    The "Galway Real-Time River Flow Forecasting System" (GFFS) is a software package developed at the Department of Engineering Hydrology of the National University of Ireland, Galway, Ireland. It is based on a selection of lumped black-box and conceptual rainfall-runoff models, all developed in Galway, consisting primarily of both the non-parametric (NP) and parametric (P) forms of two black-box-type rainfall-runoff models, namely, the Simple Linear Model (SLM-NP and SLM-P) and the seasonally-based Linear Perturbation Model (LPM-NP and LPM-P), together with the non-parametric wetness-index-based Linearly Varying Gain Factor Model (LVGFM), the black-box Artificial Neural Network (ANN) Model, and the conceptual Soil Moisture Accounting and Routing (SMAR) Model. Comprising the above suite of models, the system enables the user to calibrate each model individually, initially without updating, and it is also capable of producing combined (i.e. consensus) forecasts using the Simple Average Method (SAM), the Weighted Average Method (WAM), or the Artificial Neural Network Method (NNM). The updating of each model output is achieved using one of four different techniques, namely, simple Auto-Regressive (AR) updating, Linear Transfer Function (LTF) updating, Artificial Neural Network updating (NNU), and updating by the Non-linear Auto-Regressive Exogenous-input method (NARXM). The models exhibit a considerable range of variation in structural complexity, with corresponding degrees of complication in objective function evaluation. Operating in continuous river-flow simulation and updating modes, these models and techniques have been applied to two Irish catchments, namely, the Fergus and the Brosna. A number of performance evaluation criteria have been used to comparatively assess the model discharge forecast efficiency.

  6. Predictability of a Coupled Model of ENSO Using Singular Vector Analysis: Optimal Growth and Forecast Skill.

    NASA Astrophysics Data System (ADS)

    Xue, Yan

    The optimal growth and its relationship with the forecast skill of the Zebiak and Cane model are studied using a simple statistical model best fit to the original nonlinear model and local linear tangent models about idealized climatic states (the mean background and ENSO cycles in a long model run) and the actual forecast states, including two sets of runs using two different initialization procedures. The seasonally varying Markov model best fit to a suite of 3-year forecasts in a reduced EOF space (18 EOFs) fits the original nonlinear model reasonably well and has comparable or better forecast skill. The initial error growth under a linear evolution operator A is governed by the eigenvalues of A^{T}A: the square roots of the eigenvalues of A^{T}A are the singular values, and its eigenvectors are the singular vectors. One dominant growing singular vector is found, characterized by north-south and east-west dipoles, convergent winds on the equator in the eastern Pacific, and a deepened thermocline in the whole equatorial belt; the optimal 6-month growth rate is largest for a (boreal) spring start and smallest for a fall start. Most of the variation in the optimal growth rate of the two forecasts is seasonal, attributable to the seasonal variations in the mean background, except that in cold events it is substantially suppressed. The mean background (zero anomaly) is found to be the most unstable state, and the "forecast IC states" are more unstable than the "coupled model states". The dominant singular vector is insensitive to initial time and optimization time, but its final pattern is a strong function of the initial state. The ENSO system is inherently unpredictable, since the dominant singular vector can amplify 5-fold to 24-fold in 6 months and evolve into the large scales characteristic of ENSO.
    However, the inherent ENSO predictability is only a secondary factor; the mismatches between the model and data are the primary factor controlling the current forecast skill.
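
    The singular-vector calculation described above is a plain SVD of the evolution operator. A minimal sketch with a toy nonnormal operator (values illustrative only), showing that optimal transient growth can greatly exceed what the eigenvalues alone suggest:

```python
import numpy as np

# A toy linear evolution operator A (nonnormal, like tangent-linear ENSO
# dynamics). Its eigenvalues are 0.9 and 0.7, both decaying.
A = np.array([[0.9, 5.0],
              [0.0, 0.7]])

# Optimal initial-error growth over one step: singular values of A,
# i.e. the square roots of the eigenvalues of A^T A.
U, s, Vt = np.linalg.svd(A)
v1 = Vt[0]      # dominant singular vector: fastest-growing initial error
growth = s[0]   # its amplification factor (> 5 here despite decaying eigenvalues)

# check: |A v1| equals the leading singular value for unit |v1|
print(growth, np.linalg.norm(A @ v1))
```

    This is the transient-growth mechanism behind the 5-fold to 24-fold amplification quoted in the abstract: a nonnormal operator can amplify particular initial errors even when all its eigenvalues are stable.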

  7. Novel hybrid linear stochastic with non-linear extreme learning machine methods for forecasting monthly rainfall in a tropical climate.

    PubMed

    Zeynoddin, Mohammad; Bonakdari, Hossein; Azari, Arash; Ebtehaj, Isa; Gharabaghi, Bahram; Riahi Madavar, Hossein

    2018-09-15

    A novel hybrid approach is presented that can more accurately predict monthly rainfall in a tropical climate by integrating a linear stochastic model with a powerful non-linear extreme learning machine method. This new hybrid method was evaluated under four general scenarios. In the first scenario, the modeling process is initiated without preprocessing the input data, as a base case. In the other three scenarios, one-step and two-step preprocessing procedures are used to make the model predictions more precise. These scenarios are based on combinations of stationarization techniques (i.e., differencing, seasonal and non-seasonal standardization, and spectral analysis) and normality transforms (i.e., Box-Cox, John and Draper, Yeo and Johnson, Johnson, Box-Cox-Mod, log, log standard, and Manly). In scenario 2, a one-step scenario, the stationarization methods are employed as preprocessing approaches. In scenarios 3 and 4, different combinations of normality transforms and stationarization methods are considered as preprocessing techniques. In total, 61 sub-scenarios are evaluated, resulting in 11,013 models (10,785 linear, 4 nonlinear, and 224 hybrid). The uncertainty of the linear, nonlinear, and hybrid models is examined by the Monte Carlo technique. The best preprocessing technique is the Johnson normality transform followed by seasonal standardization (R² = 0.99; RMSE = 0.6; MAE = 0.38; RMSRE = 0.1; MARE = 0.06; UI = 0.03; UII = 0.05). The results of the uncertainty analysis indicated the good performance of the proposed technique (d-factor = 0.27; 95PPU = 83.57). Moreover, the results of the proposed methodology were compared with an evolutionary hybrid of the adaptive neuro-fuzzy inference system (ANFIS) with the firefly algorithm (ANFIS-FFA), demonstrating that the new hybrid methods outperformed the ANFIS-FFA method. Copyright © 2018 Elsevier Ltd. All rights reserved.
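
    Seasonal standardization, part of the best-performing preprocessing pipeline above, simply removes the per-month mean and scales by the per-month standard deviation. A minimal sketch on a synthetic monthly series (the paper's transforms and data are not reproduced here):

```python
import numpy as np

def seasonal_standardize(x, period=12):
    """Remove the seasonal mean and scale by the seasonal std, per phase."""
    x = np.asarray(x, float)
    mu, sd = np.zeros(period), np.zeros(period)
    z = np.empty_like(x)
    for p in range(period):
        vals = x[p::period]           # all Januaries, all Februaries, ...
        mu[p], sd[p] = vals.mean(), vals.std()
        z[p::period] = (vals - mu[p]) / sd[p]
    return z, mu, sd

# synthetic monthly series with a strong annual cycle
t = np.arange(240)
x = 100 + 30 * np.sin(2 * np.pi * t / 12) + np.random.default_rng(0).normal(0, 5, 240)
z, mu, sd = seasonal_standardize(x)
print(z.mean().round(3), z.std().round(3))  # ≈ 0 and ≈ 1 after stationarization
```

    The stored `mu` and `sd` allow the transform to be inverted after forecasting in the standardized domain.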

  8. Assessment of the Structural Conditions of the San Clemente a Vomano Abbey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benedettini, Francesco; Alaggio, Rocco; Fusco, Felice

    2008-07-08

    Accurate finite element (FE) modeling, dynamical tests, model updating, and nonlinear analysis are combined in the integrated approach used by the authors to assess the structural conditions and the seismic vulnerability of a historical masonry structure: the Abbey Church of San Clemente al Vomano, situated in the Notaresco territory (TE, Italy), commissioned by Ermengarda, daughter of the Emperor Ludovico II, and built at the end of the 9th century together with a monastery to host a monastic community. Dynamical tests 'in operational conditions' and modal identification have been used to perform the FE model validation. Both a simple and direct method, the kinematic analysis applied to meaningful sub-structures, and a nonlinear 3D dynamic analysis conducted with the FE model have been used to forecast the seismic performance of the church.

  9. Using Seasonal Forecasts for medium-term Electricity Demand Forecasting in Italy

    NASA Astrophysics Data System (ADS)

    De Felice, M.; Alessandri, A.; Ruti, P.

    2012-12-01

    Electricity demand forecasting is an essential tool for energy management and operation scheduling in electric utilities. In power engineering, medium-term forecasting is defined as prediction up to 12 months ahead, and is commonly performed using weather climatology rather than actual forecasts. This work aims to analyze the predictability of electricity demand on the seasonal time scale, considering seasonal samples, i.e., three-month averages. Electricity demand data have been provided by the Italian Transmission System Operator for eight different geographical areas; Fig. 1 shows the average yearly demand anomaly for each season in each area. This work uses data for each summer during 1990-2010, and all the datasets have been pre-processed to remove trends and reduce the influence of calendar and economic effects. The choice to focus this research on the summer period is due to the critical demand peaks to which the power grid is subjected on hot days. Weather data have been included considering observations provided by ECMWF ERA-INTERIM reanalyses. Primitive variables (2-metre temperature, pressure, etc.) and derived variables (cooling and heating degree days) have been averaged over the summer months. Particular attention has been given to the influence of persistent positive temperature anomalies, and a derived variable counting the number of consecutive extreme days has been used. Electricity demand forecasting has been performed using linear and nonlinear regression methods, and stepwise model selection procedures have been used to perform variable selection with respect to performance measures. Significance tests on multiple linear regression showed the importance of cooling degree days during summer in the North-East and South of Italy, with an increase in statistical significance after 2003, a result consistent with the diffusion of air conditioning and ventilation equipment in the last decade.
    Finally, using seasonal climate forecasts, we evaluate the performance of electricity demand forecasts driven by predicted variables for the Italian regions, with encouraging results in the South of Italy. This work gives an initial assessment of the predictability of electricity demand on the seasonal time scale, evaluating the relevance of climate information provided by seasonal forecasts for electricity management during high-demand periods.
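
    Cooling degree days (CDD), one of the derived predictors found significant above, are simply the positive part of the daily-mean temperature excess over a base level. A minimal sketch; the 22 °C base and the temperatures are illustrative assumptions, not values from the study:

```python
import numpy as np

# daily-mean 2-metre temperatures for one week, in °C (illustrative)
t2m = np.array([18.0, 21.5, 24.0, 27.5, 30.0, 26.0, 19.5])
base = 22.0                          # assumed comfort base temperature

cdd = np.maximum(t2m - base, 0.0)    # per-day cooling degree days
print(cdd.sum())                     # → 19.5 (weekly CDD total)
```

    Heating degree days are the mirror image, `np.maximum(base - t2m, 0.0)`; seasonal averages of such totals are what enter the regression models above.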

  10. What might we learn from climate forecasts?

    PubMed Central

    Smith, Leonard A.

    2002-01-01

    Most climate models are large dynamical systems involving a million (or more) variables on big computers. Given that they are nonlinear and not perfect, what can we expect to learn from them about the earth's climate? How can we determine which aspects of their output might be useful and which are noise? And how should we distribute resources between making them “better,” estimating variables of true social and economic interest, and quantifying how good they are at the moment? Just as “chaos” prevents accurate weather forecasts, so model error precludes accurate forecasts of the distributions that define climate, yielding uncertainty of the second kind. Can we estimate the uncertainty in our uncertainty estimates? These questions are discussed. Ultimately, all uncertainty is quantified within a given modeling paradigm; our forecasts need never reflect the uncertainty in a physical system. PMID:11875200

  11. Fuzzy Temporal Logic Based Railway Passenger Flow Forecast Model

    PubMed Central

    Dou, Fei; Jia, Limin; Wang, Li; Xu, Jie; Huang, Yakun

    2014-01-01

    Passenger flow forecasting is of essential importance to the organization of railway transportation and is one of the most important bases for decision-making on transportation patterns and train operation planning. High-speed railway passenger flow features quasi-periodic variations over short time spans and complex nonlinear fluctuations owing to many influencing factors. In this study, a fuzzy temporal logic based passenger flow forecast model (FTLPFFM) is presented, based on fuzzy logic relationship recognition techniques, that predicts short-term passenger flow for high-speed railways with significantly improved forecast accuracy. An applied case using real-world data illustrates the precision and accuracy of FTLPFFM. For this applied case, the proposed model performs better than the k-nearest neighbor (KNN) and autoregressive integrated moving average (ARIMA) models. PMID:25431586

  12. Robust nonlinear canonical correlation analysis: application to seasonal climate forecasting

    NASA Astrophysics Data System (ADS)

    Cannon, A. J.; Hsieh, W. W.

    2008-02-01

    Robust variants of nonlinear canonical correlation analysis (NLCCA) are introduced to improve performance on datasets with low signal-to-noise ratios, for example those encountered when making seasonal climate forecasts. The neural network model architecture of standard NLCCA is kept intact, but the cost functions used to set the model parameters are replaced with more robust variants. The Pearson product-moment correlation in the double-barreled network is replaced by the biweight midcorrelation, and the mean squared error (mse) in the inverse mapping networks can be replaced by the mean absolute error (mae). Robust variants of NLCCA are demonstrated on a synthetic dataset and are used to forecast sea surface temperatures in the tropical Pacific Ocean based on the sea level pressure field. Results suggest that adoption of the biweight midcorrelation can lead to improved performance, especially when a strong, common event exists in both predictor/predictand datasets. Replacing the mse by the mae leads to improved performance on the synthetic dataset, but not on the climate dataset except at the longest lead time, which suggests that the appropriate cost function for the inverse mapping networks is more problem dependent.
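
    The biweight midcorrelation that replaces Pearson's correlation above downweights points far from the median, so a single gross outlier barely moves it. A minimal self-contained sketch of the statistic and of the robustness it buys (synthetic data, c = 9 as in the usual definition):

```python
import numpy as np

def bicor(x, y, c=9.0):
    """Biweight midcorrelation: a robust alternative to Pearson's r."""
    def robust_dev(v):
        med = np.median(v)
        mad = np.median(np.abs(v - med))          # median absolute deviation
        u = (v - med) / (c * mad)
        w = (1 - u ** 2) ** 2 * (np.abs(u) < 1)   # zero weight for outliers
        return (v - med) * w
    a, b = robust_dev(x), robust_dev(y)
    return (a @ b) / np.sqrt((a @ a) * (b @ b))

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = x + 0.1 * rng.normal(size=200)
y_out = y.copy()
y_out[0] = 50.0                      # one gross outlier

print(np.corrcoef(x, y_out)[0, 1])   # Pearson r badly degraded
print(bicor(x, y_out))               # bicor stays close to 1
```

    In the paper this statistic replaces the Pearson correlation inside NLCCA's double-barreled network cost function.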

  13. Koopman Operator Framework for Time Series Modeling and Analysis

    NASA Astrophysics Data System (ADS)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties, without requiring explicit knowledge of the generative model. We also introduce different notions of distance on the space of such model forms, which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with a distance, in conjunction with classical machine learning techniques, to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in a power grid application.
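
    One standard way to "identify Koopman spectral properties directly from data" is dynamic mode decomposition (DMD): fit the best linear operator mapping each snapshot to the next and read off its eigenvalues. A minimal sketch on a system whose spectrum is known in advance (DMD is used here as a generic illustration, not as the paper's exact algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)

# generate snapshots of a linear system with known eigenvalues 0.9 and 0.5
true_A = np.array([[0.9, 0.2],
                   [0.0, 0.5]])
states = [rng.normal(size=2)]
for _ in range(50):
    states.append(true_A @ states[-1])
X = np.array(states).T             # states as columns
X1, X2 = X[:, :-1], X[:, 1:]       # snapshot pairs (x_k, x_{k+1})

# exact DMD: best-fit linear operator with X2 ≈ A_hat X1
A_hat = X2 @ np.linalg.pinv(X1)
eigvals = np.sort(np.linalg.eigvals(A_hat).real)
print(eigvals)  # recovers the Koopman/DMD eigenvalues 0.5 and 0.9
```

    For a genuinely nonlinear system the same fit is applied to a dictionary of observables of the state, which is where the Koopman operator viewpoint enters.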

  14. Evaluation and prediction of solar radiation for energy management based on neural networks

    NASA Astrophysics Data System (ADS)

    Aldoshina, O. V.; Van Tai, Dinh

    2017-08-01

    Currently, renewable energy sources and distributed power generation based on intelligent networks are spreading rapidly; meteorological forecasts are therefore particularly useful for planning and managing the energy system in order to increase its overall efficiency and productivity. The application of artificial neural networks (ANN) in the field of photovoltaic energy is presented in this article. Two dynamic recurrent ANNs, the concentrated time-delay neural network (CTDNN) and the nonlinear autoregressive network with exogenous inputs (NARX), are used in the development of a model for estimating and daily forecasting of solar radiation. The ANNs show good performance, yielding reliable and accurate models of daily solar radiation, which makes it possible to successfully predict the photovoltaic output power for this installation. The potential of the proposed method for managing the energy of the electrical network is shown by applying the NARX network to electric load prediction.

  15. Jump-Diffusion models and structural changes for asset forecasting in hydrology

    NASA Astrophysics Data System (ADS)

    Tranquille Temgoua, André Guy; Martel, Richard; Chang, Philippe J. J.; Rivera, Alfonso

    2017-04-01

    Impacts of climate change on surface water and groundwater are of concern in many regions of the world, since water is an essential natural resource. Jump-Diffusion models are generally used in economics and related fields but not in hydrology; a potential application is the analysis and forecasting of hydrologic data series. The present study uses Jump-Diffusion models augmented with structural changes to detect fluctuations in hydrologic processes in relationship with climate change. The model implicitly assumes that modifications in river flow rates can be divided into three categories: (a) normal changes due to irregular precipitation events, especially in tropical regions, causing major disturbance in hydrologic processes (this component is modelled by a discrete Brownian motion); (b) abnormal, sudden, and non-persistent modifications in hydrologic processes, which are handled by Poisson processes; and (c) the persistence of hydrologic fluctuations, characterized by structural changes in hydrological data related to climate variability. The objective of this paper is to add structural changes to diffusion models with jumps, in order to capture the persistence of hydrologic fluctuations. Indirectly, the idea is to observe whether there are structural changes of discharge/recharge over the study area, and to find an efficient and flexible model capable of capturing a wide variety of hydrologic processes. Structural changes in hydrological data are estimated using nonlinear discrete filters via the Method of Simulated Moments (MSM). An application is given using sensitive parameters such as the baseflow index and recession coefficient to capture discharge/recharge. Historical datasets are examined by Volume Spread Analysis (VSA) to detect real-time and random perturbations in hydrologic processes. The application of the method allows establishing more accurate hydrologic parameters.
    The impact of this study is perceptible in forecasting floods and groundwater recession. Keywords: hydrologic processes, Jump-Diffusion models, structural changes, forecast, climate change
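
    The three model components, Brownian diffusion, Poisson jumps, and a structural change, can be sketched in a few lines of simulation. A minimal illustration with purely synthetic parameters (drift, volatility, and jump rates are illustrative, not estimated from hydrologic data), placing the structural break in the drift:

```python
import numpy as np

rng = np.random.default_rng(42)
n, dt = 4000, 1.0 / 252

mu = np.where(np.arange(n) < n // 2, 0.0, 2.0)              # (c) structural break in drift
dW = rng.normal(0.0, np.sqrt(dt), n)                        # (a) Brownian increments
jumps = rng.poisson(3.0 * dt, n) * rng.normal(0.0, 0.4, n)  # (b) rare Poisson jumps

steps = mu * dt + 0.2 * dW + jumps
x = np.cumsum(steps)                                        # the simulated series

# the break shows up in the mean increment of the two halves
print(steps[: n // 2].mean(), steps[n // 2:].mean())
```

    Detecting such a break from data is the harder inverse problem the paper addresses with nonlinear discrete filters and the Method of Simulated Moments.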

  16. Observation Impacts for Longer Forecast Lead-Times

    NASA Astrophysics Data System (ADS)

    Mahajan, R.; Gelaro, R.; Todling, R.

    2013-12-01

    Observation impacts on forecasts evaluated using adjoint-based techniques (e.g. Langland and Baker, 2004) are limited by the validity of the assumptions underlying the forecasting model adjoint. Most applications of this approach have focused on deriving observation impacts on short-range forecasts (e.g. 24-hour), in part to stay well within linearization assumptions. The most widely used measure of observation impact relies on the availability of the analysis for verifying the forecasts. As pointed out by Gelaro et al. (2007), and more recently by Todling (2013), this introduces undesirable correlations in the measure that are likely to affect the resulting assessment of the observing system. Stappers and Barkmeijer (2012) introduced a technique that, in principle, allows extending the validity of tangent linear and corresponding adjoint models to longer lead-times, thereby reducing the correlations in the measures used for observation impact assessments. The methodology provides the means to better represent linearized models by making use of Gaussian quadrature relations to handle various underlying non-linear model trajectories. The formulation is exact for particular bi-linear dynamics; it corresponds to an approximation for general-type nonlinearities and must be tested for large atmospheric models. The present work investigates the approach of Stappers and Barkmeijer (2012) in the context of NASA's Goddard Earth Observing System Version 5 (GEOS-5) atmospheric data assimilation system (ADAS). The goal is to calculate observation impacts in the GEOS-5 ADAS for forecast lead-times of at least 48 hours, in order to reduce the potential for undesirable correlations that occur at shorter forecast lead times. References: [1] Langland, R. H., and N. L. Baker, 2004: Estimation of observation impact using the NRL atmospheric variational data assimilation adjoint system. Tellus, 56A, 189-201. [2] Gelaro, R., Y. Zhu, and R. M. Errico, 2007: Examination of various-order adjoint-based approximations of observation impact. Meteorologische Zeitschrift, 16, 685-692. [3] Stappers, R. J. J., and J. Barkmeijer, 2012: Optimal linearization trajectories for tangent linear models. Q. J. R. Meteorol. Soc., 138, 170-184. [4] Todling, R., 2013: Comparing two approaches for assessing observation impact. Mon. Wea. Rev., 141, 1484-1505.

  17. Algorithm for predicting the evolution of series of dynamics of complex systems in solving information problems

    NASA Astrophysics Data System (ADS)

    Kasatkina, T. I.; Dushkin, A. V.; Pavlov, V. A.; Shatovkin, R. R.

    2018-03-01

    Neural network methods have recently been applied in information systems and software for predicting dynamics series. They are more flexible than existing analogues and can take the nonlinearities of a series into account. In this paper, we propose a modified algorithm for predicting dynamics series, which includes a method for training neural networks and an approach to describing and presenting input data, based on prediction with the multilayer perceptron method. To construct the neural network, the values of the dynamics series at its extremum points, and the time values corresponding to them, formed with the sliding-window method, are used as input data. The proposed algorithm can act as a stand-alone approach to predicting dynamics series, or as one component of a forecasting system. The efficiency of predicting the evolution of a dynamics series is compared, for short-term one-step and long-term multi-step forecasts, between the classical multilayer perceptron method and the modified algorithm, using synthetic and real data. The result of this modification is the minimization of the iterative error that arises when previously predicted values are fed back as inputs to the neural network, as well as increased accuracy of the iterative prediction.
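
    The sliding-window construction described above turns a single series into supervised (window, next value) pairs. A minimal sketch; a linear least-squares readout stands in for the multilayer perceptron to keep the example dependency-free, which is an assumption of this illustration:

```python
import numpy as np

def sliding_windows(series, width):
    """Turn a 1-D series into (input window, next value) training pairs."""
    X = np.array([series[i:i + width] for i in range(len(series) - width)])
    y = series[width:]
    return X, y

# synthetic series: a sum of two sinusoids
t = np.arange(300)
series = np.sin(0.2 * t) + 0.5 * np.sin(0.05 * t)

X, y = sliding_windows(series, width=8)
X_train, y_train, X_test, y_test = X[:250], y[:250], X[250:], y[250:]

# one-step predictor fitted by least squares (MLP stand-in)
D = np.c_[X_train, np.ones(len(X_train))]
w, *_ = np.linalg.lstsq(D, y_train, rcond=None)
pred = np.c_[X_test, np.ones(len(X_test))] @ w
print(np.abs(pred - y_test).max())  # one-step-ahead error on held-out data
```

    Multi-step (iterative) forecasting feeds `pred` back into the next window, which is exactly where the iterative error the paper minimizes accumulates.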

  18. Design of Field Experiments for Adaptive Sampling of the Ocean with Autonomous Vehicles

    NASA Astrophysics Data System (ADS)

    Zheng, H.; Ooi, B. H.; Cho, W.; Dao, M. H.; Tkalich, P.; Patrikalakis, N. M.

    2010-05-01

    Due to the highly non-linear and dynamical nature of oceanic phenomena, the predictive capability of various ocean models depends on the availability of operational data. A practical method to improve the accuracy of the ocean forecast is to use a data assimilation methodology to combine in-situ measured and remotely acquired data with numerical forecast models of the physical environment. Autonomous surface and underwater vehicles with various sensors are economic and efficient tools for exploring and sampling the ocean for data assimilation; however there is an energy limitation to such vehicles, and thus effective resource allocation for adaptive sampling is required to optimize the efficiency of exploration. In this paper, we use physical oceanography forecasts of the coastal zone of Singapore for the design of a set of field experiments to acquire useful data for model calibration and data assimilation. The design process of our experiments relied on the oceanography forecast including the current speed, its gradient, and vorticity in a given region of interest for which permits for field experiments could be obtained and for time intervals that correspond to strong tidal currents. Based on these maps, resources available to our experimental team, including Autonomous Surface Craft (ASC) are allocated so as to capture the oceanic features that result from jets and vortices behind bluff bodies (e.g., islands) in the tidal current. Results are summarized from this resource allocation process and field experiments conducted in January 2009.

  19. Monthly reservoir inflow forecasting using a new hybrid SARIMA genetic programming approach

    NASA Astrophysics Data System (ADS)

    Moeeni, Hamid; Bonakdari, Hossein; Ebtehaj, Isa

    2017-03-01

    Forecasting reservoir inflow is one of the most important components of water resources and hydroelectric systems operation management. Seasonal autoregressive integrated moving average (SARIMA) models have been frequently used for predicting river flow. SARIMA models are linear and do not consider the random component of statistical data. To overcome this shortcoming, monthly inflow is predicted in this study based on a combination of seasonal autoregressive integrated moving average (SARIMA) and gene expression programming (GEP) models, which is a new hybrid method (SARIMA-GEP). To this end, a four-step process is employed. First, the monthly inflow datasets are pre-processed. Second, the datasets are modelled linearly with SARIMA, and in the third stage, the non-linearity of the residual series caused by linear modelling is evaluated. After confirming the non-linearity, the residuals are modelled in the fourth step using a gene expression programming (GEP) method. The proposed hybrid model is employed to predict the monthly inflow to the Jamishan Dam in west Iran. Thirty years' worth of site measurements of monthly reservoir dam inflow with extreme seasonal variations are used. The results of this hybrid model (SARIMA-GEP) are compared with SARIMA, GEP, artificial neural network (ANN) and SARIMA-ANN models. The results indicate that the SARIMA-GEP model (R² = 78.8, VAF = 78.8, RMSE = 0.89, MAPE = 43.4, CRM = 0.053) outperforms the SARIMA and GEP models, and that SARIMA-ANN (R² = 68.3, VAF = 66.4, RMSE = 1.12, MAPE = 56.6, CRM = 0.032) displays better performance than the SARIMA and ANN models. A comparison of the two hybrid models indicates the superiority of SARIMA-GEP over the SARIMA-ANN model.
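
    The two-stage structure above, a linear model first, then a nonlinear model of its residuals, can be sketched compactly. A minimal illustration on a synthetic Hénon-type series, with a linear AR(2) fit for stage one and a quadratic polynomial standing in for GEP (both substitutions are assumptions of this sketch):

```python
import numpy as np

# synthetic nonlinear series (noisy Henon-type map)
rng = np.random.default_rng(7)
y = np.zeros(500)
y[1] = 0.1
for k in range(2, 500):
    y[k] = 1 - 1.4 * y[k - 1] ** 2 + 0.3 * y[k - 2] + 0.005 * rng.normal()

Y, L1, L2 = y[2:], y[1:-1], y[:-2]

# stage 1: linear AR(2) fit (the SARIMA stand-in)
A = np.c_[L1, L2, np.ones(len(Y))]
coef, *_ = np.linalg.lstsq(A, Y, rcond=None)
resid = Y - A @ coef

# stage 2: nonlinear (quadratic) model of the residual vs. the first lag
p = np.polyfit(L1, resid, 2)
hybrid_err = resid - np.polyval(p, L1)

print(np.std(resid), np.std(hybrid_err))  # the hybrid's error is clearly smaller
```

    The point of the exercise mirrors the paper's: whatever nonlinear structure the linear stage leaves behind in the residuals is exactly what the second-stage model should capture.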

  20. Researches on High Accuracy Prediction Methods of Earth Orientation Parameters

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.

    2015-09-01

The Earth rotation reflects the coupling processes among the solid Earth, atmosphere, oceans, mantle, and core of the Earth on multiple spatial and temporal scales. The Earth rotation can be described by the Earth orientation parameters (EOP), mainly including the two polar motion components PM_X and PM_Y, and the variation in the length of day ΔLOD. The EOP are crucial in the transformation between the terrestrial and celestial reference systems, and have important applications in many areas such as deep space exploration, satellite precise orbit determination, and astrogeodynamics. However, the EOP products obtained by space geodetic technologies are generally delayed by several days to two weeks. The growing demands of modern space navigation make high-accuracy EOP prediction a worthy topic. This thesis is composed of the following three aspects, with the purpose of improving the EOP forecast accuracy. (1) We analyze the relation between the length of the basic data series and the EOP forecast accuracy, and compare the EOP prediction accuracy of the linear autoregressive (AR) model and the nonlinear artificial neural network (ANN) method by performing least squares (LS) extrapolations. The results show that high-precision EOP forecasts can be realized by selecting the length of the basic data series appropriately for the required prediction span: for short-term prediction, the basic data series should be shorter, while for long-term prediction, the series should be longer. The analysis also shows that the LS+AR model is more suitable for short-term forecasts, while the LS+ANN model shows advantages in medium- and long-term forecasts. (2) We develop for the first time a new method which combines the autoregressive model and the Kalman filter (AR+Kalman) in short-term EOP prediction.
The equations of observation and state are established using the EOP series and the autoregressive coefficients, respectively, and are used to improve and re-evaluate the AR model. Compared to the single AR model, the AR+Kalman method performs better in the prediction of UT1-UTC and ΔLOD, and the improvement in the prediction of polar motion is significant. (3) Following the successful Earth Orientation Parameter Prediction Comparison Campaign (EOP PCC), the Earth Orientation Parameter Combination of Prediction Pilot Project (EOPC PPP) was launched in 2010. As one of the participants from China, we update and submit short- and medium-term (1 to 90 days) EOP predictions every day. According to the current comparative statistics, our prediction accuracy is at a medium international level. We will carry out further research to improve the EOP forecast accuracy and enhance our EOP forecasting capability.
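The AR+Kalman idea described in point (2) can be sketched as follows, assuming a random-walk state model for the AR coefficients (the thesis's exact observation and state equations are not reproduced here; `q` and `r` are illustrative noise variances):

```python
import numpy as np

def ar_kalman(y, p=2, q=1e-6, r=1e-2):
    """Sketch of an AR+Kalman scheme: the AR coefficients are the state
    vector, re-evaluated by a Kalman filter as each observation arrives.
    Observation equation: y[t] = [y[t-1], ..., y[t-p]] @ a + noise."""
    y = np.asarray(y, float)
    a = np.zeros(p)                         # AR coefficient estimate
    P = np.eye(p)                           # coefficient covariance
    for t in range(p, len(y)):
        H = y[t-1::-1][:p]                  # regressor [y[t-1], ..., y[t-p]]
        P = P + q * np.eye(p)               # state prediction (random walk)
        S = H @ P @ H + r                   # innovation variance
        K = P @ H / S                       # Kalman gain
        a = a + K * (y[t] - H @ a)          # coefficient update
        P = P - np.outer(K, H) @ P          # covariance update
    # one-step prediction with the filtered coefficients
    return a, y[len(y)-1::-1][:p] @ a
```

With a fixed coefficient state and `q = 0` this reduces to recursive least squares; the random-walk term lets the coefficients track slow changes in the series, which is the motivation for re-evaluating the AR model inside a filter.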

  1. 4D Hybrid Ensemble-Variational Data Assimilation for the NCEP GFS: Outer Loops and Variable Transforms

    NASA Astrophysics Data System (ADS)

    Kleist, D. T.; Ide, K.; Mahajan, R.; Thomas, C.

    2014-12-01

The use of hybrid error covariance models has become quite popular for numerical weather prediction (NWP). One such method for incorporating localized covariances from an ensemble within the variational framework utilizes an augmented control variable (EnVar), and has been implemented in the operational NCEP data assimilation system (GSI). By taking the existing 3D EnVar algorithm in GSI and allowing for four-dimensional ensemble perturbations, coupled with the 4DVAR infrastructure already in place, a 4D EnVar capability has been developed. The 4D EnVar algorithm has a few attractive qualities relative to 4DVAR, including the lack of need for tangent-linear and adjoint models as well as reduced computational cost. Preliminary results using real observations have been encouraging, showing forecast improvements nearly as large as were found in moving from 3DVAR to hybrid 3D EnVar. 4D EnVar is the method of choice for the next-generation assimilation system for use with the operational NCEP global model, the Global Forecast System (GFS). The use of an outer loop has long been the method of choice in 4DVAR data assimilation to help address nonlinearity. An outer loop involves re-running the (deterministic) background forecast from the updated initial condition at the beginning of the assimilation window, and proceeding with another inner-loop minimization. Within 4D EnVar, a similar procedure can be adopted, since the solver evaluates a 4D analysis increment throughout the window, consistent with the valid times of the 4D ensemble perturbations. In this procedure, the ensemble perturbations are kept fixed and centered about the updated background state. This is analogous to the quasi-outer-loop idea developed for the EnKF. Here, we present results for both toy-model and real NWP systems demonstrating the impact of incorporating outer loops to address nonlinearity within the 4D EnVar context.
The appropriate amplitudes for observation and background error covariances in subsequent outer loops will be explored. Lastly, variable transformations on the ensemble perturbations will be utilized to help address issues of non-Gaussianity. This may be particularly important for variables that clearly have non-Gaussian error characteristics such as water vapor and cloud condensate.

  2. A Maple package for improved global mapping forecast

    NASA Astrophysics Data System (ADS)

    Carli, H.; Duarte, L. G. S.; da Mota, L. A. C. P.

    2014-03-01

We present a Maple implementation of the well-known global approach to time series analysis, together with some further developments designed to improve the computational efficiency of its forecasting capabilities. This global approach can be summarized as a reconstruction of the phase space based on a time-ordered series of data obtained from the system. Using the reconstructed vectors, a portion of this space is then used to produce a mapping, a polynomial fitting obtained through a minimization procedure, that represents the system and can be employed to forecast further entries of the series. In the present implementation, we introduce a set of commands (tools) to perform all these tasks. For example, the command VecTS deals mainly with the reconstruction of the vectors in the phase space. The command GfiTS produces the minimization and the fitting. ForecasTS uses all of these to produce the prediction of the next entries. For the non-standard algorithms, we present two commands, IforecasTS and NiforecasTS, which deal with one-step and N-step forecasting, respectively. Finally, we introduce two further tools to aid the forecasting: the commands GfiTS and AnalysTS analyse the behavior of each portion of a series with respect to the settings used in the commands mentioned above. Catalogue identifier: AERW_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERW_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 3001 No. of bytes in distributed program, including test data, etc.: 95018 Distribution format: tar.gz Programming language: Maple 14. Computer: Any capable of running Maple. Operating system: Any capable of running Maple. Tested on Windows ME, Windows XP, Windows 7.
RAM: 128 MB Classification: 4.3, 4.9, 5 Nature of problem: Time series analysis and improving forecast capability. Solution method: The method of solution is partially based on a result published in [1]. Restrictions: If the time series that is being analyzed presents a great amount of noise or if the dynamical system behind the time series is of high dimensionality (Dim≫3), then the method may not work well. Unusual features: Our implementation can, in the cases where the dynamics behind the time series is given by a system of low dimensionality, greatly improve the forecast. Running time: This depends strongly on the command that is being used. References: [1] Barbosa, L.M.C.R., Duarte, L.G.S., Linhares, C.A. and da Mota, L.A.C.P., Improving the global fitting method on nonlinear time series analysis, Phys. Rev. E 74, 026702 (2006).
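The reconstruction-plus-fitting procedure that the package's VecTS/GfiTS/ForecasTS commands implement in Maple can be sketched in Python (a hypothetical, simplified re-implementation: delay embedding, a global quadratic polynomial map fitted by least squares, and iterated forecasting):

```python
import numpy as np

def global_poly_forecast(series, dim=2, steps=1):
    """Global-mapping forecast sketch: reconstruct delay vectors from the
    series, fit a quadratic polynomial map by least squares, and iterate
    it to forecast the next entries."""
    x = np.asarray(series, float)
    # delay-embedded state vectors v_t = (x_t, x_{t-1}, ..., x_{t-dim+1})
    V = np.column_stack([x[dim-1-j:len(x)-j] for j in range(dim)])

    def feats(v):
        # constant, linear and quadratic monomials of the state
        quad = [v[..., i] * v[..., j] for i in range(dim) for j in range(i, dim)]
        return np.stack([np.ones(v.shape[:-1]),
                         *[v[..., j] for j in range(dim)], *quad], axis=-1)

    A = feats(V[:-1])                                  # map v_t -> x_{t+1}
    c, *_ = np.linalg.lstsq(A, x[dim:], rcond=None)    # global polynomial fit
    v, out = V[-1], []
    for _ in range(steps):                             # iterate the fitted map
        nxt = feats(v) @ c
        out.append(nxt)
        v = np.concatenate([[nxt], v[:-1]])
    return np.array(out)
```

On data generated by a low-dimensional map the fitted polynomial can reproduce the dynamics almost exactly, which is the regime where the package's documentation says the global method works well.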

  3. Forecasting Daily Patient Outflow From a Ward Having No Real-Time Clinical Data

    PubMed Central

    Tran, Truyen; Luo, Wei; Phung, Dinh; Venkatesh, Svetha

    2016-01-01

Background: Modeling patient flow is crucial in understanding resource demand and prioritization. We study patient outflow from an open ward in an Australian hospital, where bed allocation is currently carried out by a manager relying on past experience and on observed demand. Automatic methods that provide a reasonable estimate of total next-day discharges can aid efficient bed management. The challenges in building such methods lie in dealing with large amounts of discharge noise introduced by the nonlinear nature of hospital procedures, and in the nonavailability of real-time clinical information in wards. Objective: Our study investigates different models to forecast the total number of next-day discharges from an open ward having no real-time clinical data. Methods: We compared 5 popular regression algorithms to model total next-day discharges: (1) autoregressive integrated moving average (ARIMA), (2) autoregressive moving average with exogenous variables (ARMAX), (3) k-nearest neighbor regression, (4) random forest regression, and (5) support vector regression. While the ARIMA model relied on the past 3 months of discharges, nearest-neighbor forecasting used the median of similar past discharges to estimate next-day discharge. In addition, the ARMAX model used the day of the week and the number of patients currently in the ward as exogenous variables. For the random forest and support vector regression models, we designed a predictor set of 20 patient features and 88 ward-level features. Results: Our data consisted of 12,141 patient visits over 1826 days. Forecasting quality was measured using mean forecast error, mean absolute error, symmetric mean absolute percentage error, and root mean square error. When compared with a moving-average prediction model, all 5 models demonstrated superior performance, with the random forests achieving a 22.7% improvement in mean absolute error, for all days in the year 2014.
Conclusions: In the absence of clinical information, our study recommends using patient-level and ward-level data in predicting next-day discharges. Random forest and support vector regression models are able to use all available features from such data, resulting in superior performance over traditional autoregressive methods. An intelligent estimate of available beds in wards plays a crucial role in relieving access block in emergency departments. PMID:27444059
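The four error measures named in the Results can be computed directly. The sMAPE variant below (mean of |actual| + |predicted| in the denominator) is one common convention and may differ from the paper's exact definition:

```python
import math

def forecast_errors(actual, pred):
    """Mean forecast error (bias), mean absolute error, symmetric MAPE
    (in percent) and root mean square error for paired series."""
    n = len(actual)
    diffs = [p - a for a, p in zip(actual, pred)]
    mfe = sum(diffs) / n
    mae = sum(abs(d) for d in diffs) / n
    smape = 100 / n * sum(
        abs(p - a) / ((abs(a) + abs(p)) / 2)
        for a, p in zip(actual, pred) if abs(a) + abs(p) > 0)
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    return mfe, mae, smape, rmse
```

Note that MFE keeps the sign and so measures bias (over- vs under-forecasting), while MAE, sMAPE and RMSE measure magnitude; reporting all four, as the study does, separates the two effects.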

  4. A Hybrid Model for Predicting the Prevalence of Schistosomiasis in Humans of Qianjiang City, China

    PubMed Central

    Wang, Ying; Lu, Zhouqin; Tian, Lihong; Tan, Li; Shi, Yun; Nie, Shaofa; Liu, Li

    2014-01-01

Background/Objective: Schistosomiasis is still a major public health problem in China, despite the fact that the government has implemented a series of strategies to prevent and control the spread of the parasitic disease. Advance warning and reliable forecasting can help policymakers to adjust and implement strategies more effectively, which will lead to the control and elimination of schistosomiasis. Our aim is to explore the application of a hybrid forecasting model to track the trends of the prevalence of schistosomiasis in humans, which provides a methodological basis for predicting and detecting schistosomiasis infection in endemic areas. Methods: A hybrid approach combining the autoregressive integrated moving average (ARIMA) model and the nonlinear autoregressive neural network (NARNN) model was used to forecast the prevalence of schistosomiasis over the next four years. Forecasting performance was compared between the hybrid ARIMA-NARNN model and the single ARIMA or single NARNN model. Results: The modelling mean square error (MSE), mean absolute error (MAE) and mean absolute percentage error (MAPE) of the ARIMA-NARNN model were 0.1869×10⁻⁴, 0.0029 and 0.0419, with corresponding testing errors of 0.9375×10⁻⁴, 0.0081 and 0.9064, respectively. These error values generated with the hybrid model were all lower than those obtained from the single ARIMA or NARNN model. The forecast values were 0.75%, 0.80%, 0.76% and 0.77% for the next four years, demonstrating no downward trend. Conclusion: The hybrid model has high prediction accuracy for the prevalence of schistosomiasis, which provides a methodological basis for future schistosomiasis monitoring and control strategies in the study area. It is worth attempting to utilize the hybrid detection scheme in other schistosomiasis-endemic areas, and for other infectious diseases. PMID:25119882

  5. Estimating the state of a geophysical system with sparse observations: time delay methods to achieve accurate initial states for prediction

    DOE PAGES

    An, Zhe; Rey, Daniel; Ye, Jingxin; ...

    2017-01-16

The problem of forecasting the behavior of a complex dynamical system through analysis of observational time-series data becomes difficult when the system expresses chaotic behavior and the measurements are sparse in space and/or time. Despite the fact that this situation is quite typical across many fields, including numerical weather prediction, the issue of whether the available observations are "sufficient" for generating successful forecasts is still not well understood. An analysis by Whartenby et al. (2013) found that in the context of the nonlinear shallow water equations on a β plane, standard nudging techniques require observing approximately 70 % of the full set of state variables. Here we examine the same system using a method introduced by Rey et al. (2014a), which generalizes standard nudging methods to utilize time-delayed measurements. We show that in certain circumstances, it provides a sizable reduction in the number of observations required to construct accurate estimates and high-quality predictions. In particular, we find that this estimate of 70 % can be reduced to about 33 % using time delays, and even further if Lagrangian drifter locations are also used as measurements.

  6. Estimating the state of a geophysical system with sparse observations: time delay methods to achieve accurate initial states for prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    An, Zhe; Rey, Daniel; Ye, Jingxin

The problem of forecasting the behavior of a complex dynamical system through analysis of observational time-series data becomes difficult when the system expresses chaotic behavior and the measurements are sparse in space and/or time. Despite the fact that this situation is quite typical across many fields, including numerical weather prediction, the issue of whether the available observations are "sufficient" for generating successful forecasts is still not well understood. An analysis by Whartenby et al. (2013) found that in the context of the nonlinear shallow water equations on a β plane, standard nudging techniques require observing approximately 70 % of the full set of state variables. Here we examine the same system using a method introduced by Rey et al. (2014a), which generalizes standard nudging methods to utilize time-delayed measurements. We show that in certain circumstances, it provides a sizable reduction in the number of observations required to construct accurate estimates and high-quality predictions. In particular, we find that this estimate of 70 % can be reduced to about 33 % using time delays, and even further if Lagrangian drifter locations are also used as measurements.

  7. [Prediction of Encapsulation Temperatures of Copolymer Films in Photovoltaic Cells Using Hyperspectral Imaging Techniques and Chemometrics].

    PubMed

    Lin, Ping; Chen, Yong-ming; Yao, Zhi-lei

    2015-11-01

A novel method combining chemometrics and hyperspectral imaging techniques is presented to determine the temperatures of ethylene-vinyl acetate copolymer (EVA) films in photovoltaic cells during the thermal encapsulation process. Four varieties of EVA films, which had been heated at temperatures of 128, 132, 142 and 148 °C during the photovoltaic cell production process, were investigated in this paper. These copolymer encapsulation films were first scanned by hyperspectral imaging equipment (Spectral Imaging Ltd., Oulu, Finland). The scanning band range of the hyperspectral equipment was set between 904.58 and 1700.01 nm. The hyperspectral dataset of copolymer films was randomly divided into two parts for training and testing. For each film type, the training and test sets contained 90 and 10 instances, respectively. The obtained hyperspectral images of EVA films were processed using the ENVI (Exelis Visual Information Solutions, USA) software. The region of interest (ROI) of each hyperspectral image of an EVA film was set as 150 × 150 pixels. The average reflectance spectrum of all the pixels in the ROI was used as the characteristic curve representing the instance. Three kinds of chemometrics methods, including partial least squares regression (PLSR), multi-class support vector machine (SVM) and large margin nearest neighbor (LMNN), were used to correlate the characteristic spectra with the encapsulation temperatures of the copolymer films. The plot of weighted regression coefficients illustrated that both the short- and long-wave near-infrared hyperspectral bands contributed to enhancing the prediction accuracy of the forecast model. Because the reflectance hyperspectral data of EVA materials displayed strong nonlinearity, the prediction performance of the linear PLSR modelling method declined and its prediction precision only reached 95%.
Kernel-based forecast models were introduced to handle the nonlinearity of the hyperspectral data by mapping the original nonlinear hyperspectral data into a high-dimensional linear feature space, so that the relationship between the nonlinear hyperspectral data and the encapsulation temperatures of the EVA films could be fully captured. Comparing the prediction results of the three proposed models, the prediction performance of LMNN was superior to the other two, with a final recognition accuracy of 100%. The results indicated that combining the LMNN model with hyperspectral imaging techniques was the best approach for accurately and rapidly determining the encapsulation temperatures of EVA films in photovoltaic cells. In addition, this work establishes the conditions for automatically monitoring and effectively controlling the encapsulation temperatures of EVA films in the photovoltaic cell production process.
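The kernel idea, implicitly mapping nonlinear data into a high-dimensional feature space, can be illustrated with a minimal RBF kernel ridge regression (a generic stand-in; the paper's SVM and LMNN models differ, and `gamma` and `lam` here are illustrative):

```python
import numpy as np

def kernel_ridge(X, y, Xq, gamma=1.0, lam=1e-3):
    """Minimal RBF kernel ridge regression: the RBF kernel plays the role
    of the implicit high-dimensional feature map, so a linear solve in
    kernel space fits a nonlinear input-output relationship."""
    def K(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)  # pairwise sq. dist.
        return np.exp(-gamma * d2)
    # solve (K + lam*I) alpha = y, then predict as a kernel expansion
    alpha = np.linalg.solve(K(X, X) + lam * np.eye(len(X)), y)
    return K(Xq, X) @ alpha
```

In the hyperspectral setting, each row of `X` would be an averaged ROI spectrum and `y` the encapsulation temperature; the point of the kernel trick is that the nonlinear spectrum-temperature relationship becomes linear in the induced feature space.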

  8. Conditional nonlinear optimal perturbations based on the particle swarm optimization and their applications to the predictability problems

    NASA Astrophysics Data System (ADS)

    Zheng, Qin; Yang, Zubin; Sha, Jianxin; Yan, Jun

    2017-02-01

    In predictability problem research, the conditional nonlinear optimal perturbation (CNOP) describes the initial perturbation that satisfies a certain constraint condition and causes the largest prediction error at the prediction time. The CNOP has been successfully applied in estimation of the lower bound of maximum predictable time (LBMPT). Generally, CNOPs are calculated by a gradient descent algorithm based on the adjoint model, which is called ADJ-CNOP. This study, through the two-dimensional Ikeda model, investigates the impacts of the nonlinearity on ADJ-CNOP and the corresponding precision problems when using ADJ-CNOP to estimate the LBMPT. Our conclusions are that (1) when the initial perturbation is large or the prediction time is long, the strong nonlinearity of the dynamical model in the prediction variable will lead to failure of the ADJ-CNOP method, and (2) when the objective function has multiple extreme values, ADJ-CNOP has a large probability of producing local CNOPs, hence making a false estimation of the LBMPT. Furthermore, the particle swarm optimization (PSO) algorithm, one kind of intelligent algorithm, is introduced to solve this problem. The method using PSO to compute CNOP is called PSO-CNOP. The results of numerical experiments show that even with a large initial perturbation and long prediction time, or when the objective function has multiple extreme values, PSO-CNOP can always obtain the global CNOP. Since the PSO algorithm is a heuristic search algorithm based on the population, it can overcome the impact of nonlinearity and the disturbance from multiple extremes of the objective function. In addition, to check the estimation accuracy of the LBMPT presented by PSO-CNOP and ADJ-CNOP, we partition the constraint domain of initial perturbations into sufficiently fine grid meshes and take the LBMPT obtained by the filtering method as a benchmark. 
The results show that, as the forecast time increases, the estimate presented by PSO-CNOP remains closer to the true value than that by ADJ-CNOP.
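A PSO-CNOP computation can be sketched for the two-dimensional Ikeda map: maximize the T-step prediction error over initial perturbations confined to a norm ball (illustrative map constant and PSO settings; not the paper's configuration):

```python
import numpy as np

def ikeda(x, u=0.9):
    """Vectorized Ikeda map step."""
    t = 0.4 - 6.0 / (1.0 + x[..., 0]**2 + x[..., 1]**2)
    return np.stack([1 + u * (x[..., 0] * np.cos(t) - x[..., 1] * np.sin(t)),
                     u * (x[..., 0] * np.sin(t) + x[..., 1] * np.cos(t))], axis=-1)

def pso_cnop(x0, rho=0.1, T=5, n=40, iters=60, seed=0):
    """PSO sketch for a CNOP-type problem: find the initial perturbation
    with norm <= rho maximizing the Ikeda-map prediction error after T steps."""
    rng = np.random.default_rng(seed)

    def cost(d):  # prediction error caused by perturbation d
        a = np.array(x0) + d
        b = np.broadcast_to(np.array(x0, float), d.shape).copy()
        for _ in range(T):
            a, b = ikeda(a), ikeda(b)
        return np.linalg.norm(a - b, axis=-1)

    def project(p):  # keep particles inside the constraint ball
        return p * np.minimum(1, rho / np.linalg.norm(p, axis=1))[:, None]

    p = project(rng.uniform(-rho, rho, (n, 2)))
    v = np.zeros_like(p)
    best_p, best_c = p.copy(), cost(p)
    g = best_p[np.argmax(best_c)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n, 1))
        v = 0.7 * v + 1.5 * r1 * (best_p - p) + 1.5 * r2 * (g - p)
        p = project(p + v)
        c = cost(p)
        imp = c > best_c
        best_p[imp], best_c[imp] = p[imp], c[imp]
        g = best_p[np.argmax(best_c)].copy()
    return g, best_c.max()
```

Because the swarm explores the whole constraint ball rather than following a single gradient path, it can escape the local extrema that, as the abstract notes, trap adjoint-based CNOP searches.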

  9. Individual versus superensemble forecasts of seasonal influenza outbreaks in the United States.

    PubMed

    Yamana, Teresa K; Kandula, Sasikiran; Shaman, Jeffrey

    2017-11-01

    Recent research has produced a number of methods for forecasting seasonal influenza outbreaks. However, differences among the predicted outcomes of competing forecast methods can limit their use in decision-making. Here, we present a method for reconciling these differences using Bayesian model averaging. We generated retrospective forecasts of peak timing, peak incidence, and total incidence for seasonal influenza outbreaks in 48 states and 95 cities using 21 distinct forecast methods, and combined these individual forecasts to create weighted-average superensemble forecasts. We compared the relative performance of these individual and superensemble forecast methods by geographic location, timing of forecast, and influenza season. We find that, overall, the superensemble forecasts are more accurate than any individual forecast method and less prone to producing a poor forecast. Furthermore, we find that these advantages increase when the superensemble weights are stratified according to the characteristics of the forecast or geographic location. These findings indicate that different competing influenza prediction systems can be combined into a single more accurate forecast product for operational delivery in real time.
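The weighted-average superensemble can be sketched with performance-based weights; the softmax weighting below is a simple stand-in for full Bayesian model averaging, with `tau` an assumed temperature parameter:

```python
import numpy as np

def superensemble(forecasts, past_errors, tau=1.0):
    """Combine competing forecasts into one weighted average, weighting
    each method by a softmax of its (negative) historical error."""
    e = np.asarray(past_errors, float)
    w = np.exp(-e / tau)        # smaller past error -> larger weight
    w /= w.sum()                # normalize to a probability vector
    return w, w @ np.asarray(forecasts, float)
```

Stratifying the weights, recomputing `past_errors` separately per location, forecast lead, or season, mirrors the abstract's finding that conditioned weights improve on a single global weighting.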

  10. Individual versus superensemble forecasts of seasonal influenza outbreaks in the United States

    PubMed Central

    Kandula, Sasikiran; Shaman, Jeffrey

    2017-01-01

    Recent research has produced a number of methods for forecasting seasonal influenza outbreaks. However, differences among the predicted outcomes of competing forecast methods can limit their use in decision-making. Here, we present a method for reconciling these differences using Bayesian model averaging. We generated retrospective forecasts of peak timing, peak incidence, and total incidence for seasonal influenza outbreaks in 48 states and 95 cities using 21 distinct forecast methods, and combined these individual forecasts to create weighted-average superensemble forecasts. We compared the relative performance of these individual and superensemble forecast methods by geographic location, timing of forecast, and influenza season. We find that, overall, the superensemble forecasts are more accurate than any individual forecast method and less prone to producing a poor forecast. Furthermore, we find that these advantages increase when the superensemble weights are stratified according to the characteristics of the forecast or geographic location. These findings indicate that different competing influenza prediction systems can be combined into a single more accurate forecast product for operational delivery in real time. PMID:29107987

  11. Nonlinear autoregressive neural networks with external inputs for forecasting of typhoon inundation level.

    PubMed

    Ouyang, Huei-Tau

    2017-08-01

Accurate inundation level forecasting during typhoon invasion is crucial for organizing response actions such as the evacuation of people from areas that could potentially flood. This paper explores the ability of nonlinear autoregressive neural networks with exogenous inputs (NARX) to predict inundation levels induced by typhoons. Two types of NARX architecture were employed: series-parallel (NARX-S) and parallel (NARX-P). Based on cross-correlation analysis of rainfall and water-level data from historical typhoon records, 10 NARX models (five of each architecture type) were constructed. The forecasting ability of each model was assessed by considering the coefficient of efficiency (CE), relative time shift error (RTS), and peak water-level error (PE). The results revealed that high CE performance could be achieved by employing more model input variables. Comparisons of the two types of model demonstrated that the NARX-S models outperformed the NARX-P models in terms of CE and RTS, whereas both performed exceptionally well in terms of PE, with no significant difference between them. The NARX-S and NARX-P models with the highest overall performance were identified and their predictions were compared with those of traditional ARX-based models. The NARX-S model outperformed the ARX-based models in all three indexes, whereas the NARX-P model exhibited comparable CE performance and superior RTS and PE performance.
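The three assessment indexes can be written down explicitly. The formulations below (Nash-Sutcliffe efficiency for CE, peak-time shift for RTS, relative peak-level error for PE) are common hydrological conventions assumed here, since the paper's exact definitions are not reproduced:

```python
def ce_and_peak_errors(obs, pred):
    """CE: Nash-Sutcliffe coefficient of efficiency (1 is a perfect fit).
    RTS: shift of the predicted peak time, in time steps.
    PE: relative error of the predicted peak level."""
    n = len(obs)
    mean_o = sum(obs) / n
    ce = 1 - sum((o - p) ** 2 for o, p in zip(obs, pred)) / \
             sum((o - mean_o) ** 2 for o in obs)
    t_obs = max(range(n), key=lambda i: obs[i])      # observed peak time
    t_pred = max(range(n), key=lambda i: pred[i])    # predicted peak time
    rts = t_pred - t_obs
    pe = (pred[t_pred] - obs[t_obs]) / obs[t_obs]
    return ce, rts, pe
```

Reporting the three together is deliberate: CE scores the whole hydrograph, while RTS and PE isolate the timing and magnitude of the flood peak, which matter most for evacuation decisions.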

  12. Hourly runoff forecasting for flood risk management: Application of various computational intelligence models

    NASA Astrophysics Data System (ADS)

    Badrzadeh, Honey; Sarukkalige, Ranjan; Jayawardena, A. W.

    2015-10-01

Reliable river flow forecasts play a key role in flood risk mitigation. Among different approaches to river flow forecasting, data-driven approaches have become increasingly popular in recent years due to their minimal information requirements and their ability to simulate the nonlinear and non-stationary characteristics of hydrological processes. In this study, four different types of data-driven approaches are applied: traditional artificial neural networks (ANN), adaptive neuro-fuzzy inference systems (ANFIS), wavelet neural networks (WNN), and hybrid ANFIS with multi-resolution analysis using wavelets (WNF). The developed models were applied to real-time flood forecasting at Casino station on the Richmond River, Australia, which is highly prone to flooding. Hourly rainfall and runoff data were used to drive the models, which were used for forecasting with 1, 6, 12, 24, 36 and 48 h lead times. The performance of the models was further improved by adding upstream river flow data (Wiangaree station) as another effective input. All models perform satisfactorily up to a 12 h lead time. However, the hybrid wavelet-based models significantly outperform the ANFIS and ANN models in longer lead-time forecasting. The results confirm the robustness of the proposed structure of the hybrid models for real-time runoff forecasting in the study area.
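The multi-resolution step underlying the WNN and WNF hybrids can be illustrated with a one-level Haar wavelet split (a minimal sketch; the study's wavelet family and decomposition depth are not specified here):

```python
import numpy as np

def haar_decompose(x):
    """One-level Haar wavelet split of an even-length series into
    approximation (low-frequency) and detail (high-frequency) parts;
    hybrid wavelet models forecast such components separately."""
    x = np.asarray(x, float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def haar_reconstruct(a, d):
    """Invert haar_decompose exactly."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x
```

A wavelet-hybrid forecaster decomposes the runoff series, fits one model per component, and recombines the component forecasts with the inverse transform; separating slow trends from fast fluctuations is what helps at the longer lead times reported above.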

  13. A preliminary study of the impact of the ERS 1 C band scatterometer wind data on the European Centre for Medium-Range Weather Forecasts global data assimilation system

    NASA Technical Reports Server (NTRS)

    Hoffman, Ross N.

    1993-01-01

    A preliminary assessment of the impact of the ERS 1 scatterometer wind data on the current European Centre for Medium-Range Weather Forecasts analysis and forecast system has been carried out. Although the scatterometer data results in changes to the analyses and forecasts, there is no consistent improvement or degradation. Our results are based on comparing analyses and forecasts from assimilation cycles. The two sets of analyses are very similar except for the low level wind fields over the ocean. Impacts on the analyzed wind fields are greater over the southern ocean, where other data are scarce. For the most part the mass field increments are too small to balance the wind increments. The effect of the nonlinear normal mode initialization on the analysis differences is quite small, but we observe that the differences tend to wash out in the subsequent 6-hour forecast. In the Northern Hemisphere, analysis differences are very small, except directly at the scatterometer locations. Forecast comparisons reveal large differences in the Southern Hemisphere after 72 hours. Notable differences in the Northern Hemisphere do not appear until late in the forecast. Overall, however, the Southern Hemisphere impacts are neutral. The experiments described are preliminary in several respects. We expect these data to ultimately prove useful for global data assimilation.

  14. Chaos of radiative heat-loss-induced flame front instability.

    PubMed

    Kinugawa, Hikaru; Ueda, Kazuhiro; Gotoda, Hiroshi

    2016-03-01

We intensively study the chaos arising via the period-doubling bifurcation cascade in radiative heat-loss-induced flame front instability, using analytical methods based on dynamical systems theory and complex networks. Significant changes in flame front dynamics in the chaotic region, which cannot be seen in the bifurcation diagrams, were successfully extracted from recurrence quantification analysis, nonlinear forecasting, and the network entropy. The temporal dynamics of the fuel concentration in the well-developed chaotic region is much more complicated than that of the flame front temperature. It exhibits self-affinity as a result of the scale-free structure of the constructed visibility graph.

  15. Air Pollution Forecasts: An Overview

    PubMed Central

    Bai, Lu; Wang, Jianzhou; Lu, Haiyan

    2018-01-01

    Air pollution is defined as a phenomenon harmful to the ecological system and the normal conditions of human existence and development when some substances in the atmosphere exceed a certain concentration. In the face of increasingly serious environmental pollution problems, scholars have conducted a significant quantity of related research, and in those studies, the forecasting of air pollution has been of paramount importance. As a precaution, the air pollution forecast is the basis for taking effective pollution control measures, and accurate forecasting of air pollution has become an important task. Extensive research indicates that the methods of air pollution forecasting can be broadly divided into three classical categories: statistical forecasting methods, artificial intelligence methods, and numerical forecasting methods. More recently, some hybrid models have been proposed, which can improve the forecast accuracy. To provide a clear perspective on air pollution forecasting, this study reviews the theory and application of those forecasting models. In addition, based on a comparison of different forecasting methods, the advantages and disadvantages of some methods of forecasting are also provided. This study aims to provide an overview of air pollution forecasting methods for easy access and reference by researchers, which will be helpful in further studies. PMID:29673227

  16. Air Pollution Forecasts: An Overview.

    PubMed

    Bai, Lu; Wang, Jianzhou; Ma, Xuejiao; Lu, Haiyan

    2018-04-17

    Air pollution is defined as a phenomenon harmful to the ecological system and the normal conditions of human existence and development when some substances in the atmosphere exceed a certain concentration. In the face of increasingly serious environmental pollution problems, scholars have conducted a significant quantity of related research, and in those studies, the forecasting of air pollution has been of paramount importance. As a precaution, the air pollution forecast is the basis for taking effective pollution control measures, and accurate forecasting of air pollution has become an important task. Extensive research indicates that the methods of air pollution forecasting can be broadly divided into three classical categories: statistical forecasting methods, artificial intelligence methods, and numerical forecasting methods. More recently, some hybrid models have been proposed, which can improve the forecast accuracy. To provide a clear perspective on air pollution forecasting, this study reviews the theory and application of those forecasting models. In addition, based on a comparison of different forecasting methods, the advantages and disadvantages of some methods of forecasting are also provided. This study aims to provide an overview of air pollution forecasting methods for easy access and reference by researchers, which will be helpful in further studies.

  17. Monthly monsoon rainfall forecasting using artificial neural networks

    NASA Astrophysics Data System (ADS)

    Ganti, Ravikumar

    2014-10-01

    The Indian agriculture sector depends heavily on monsoon rainfall for successful harvesting. In the past, prediction of rainfall was mainly performed using regression models, which provide reasonable accuracy in the modelling and forecasting of complex physical systems. Recently, Artificial Neural Networks (ANNs) have been proposed as efficient tools for modelling and forecasting. A feed-forward multi-layer perceptron type of ANN architecture trained using the popular back-propagation algorithm was employed in this study. Other techniques investigated for modelling monthly monsoon rainfall include linear and non-linear regression models, for comparison purposes. The data employed in this study include monthly rainfall and the monthly average of the daily maximum temperature in the North Central region of India. Specifically, four regression models and two ANN models were developed. The performance of the various models was evaluated using a wide variety of standard statistical parameters and scatter plots. The results obtained in this study for forecasting monsoon rainfall using ANNs are encouraging. Accurate monsoon rainfall forecasts can help India manage its economy and agricultural activities effectively.

  18. Reference evapotranspiration forecasting based on local meteorological and global climate information screened by partial mutual information

    NASA Astrophysics Data System (ADS)

    Fang, Wei; Huang, Shengzhi; Huang, Qiang; Huang, Guohe; Meng, Erhao; Luan, Jinkai

    2018-06-01

    In this study, reference evapotranspiration (ET0) forecasting models are developed for the least economically developed regions, which are subject to meteorological data scarcity. Firstly, the partial mutual information (PMI), capable of capturing both linear and nonlinear dependence, is investigated regarding its utility to identify relevant predictors and exclude redundant ones, through comparison with partial linear correlation. An efficient input selection technique is crucial for decreasing model data requirements. Then, the interconnection between global climate indices and regional ET0 is identified. Relevant climatic indices are introduced as additional predictors to supply information regarding ET0 that would otherwise have to be provided by the unavailable meteorological data. The case study in the Jing River and Beiluo River basins, China, reveals that PMI outperforms partial linear correlation in excluding redundant information, yielding smaller predictor sets. The teleconnection analysis identifies a correlation between Nino 1+2 and regional ET0, indicating influences of ENSO events on the evapotranspiration process in the study area. Furthermore, introducing Nino 1+2 as a predictor helps to yield more accurate ET0 forecasts. A model performance comparison also shows that non-linear stochastic models (SVR or RF with input selection through PMI) do not always outperform linear models (MLR with inputs screened by linear correlation). However, the former can offer quite comparable performance while relying on smaller predictor sets. Therefore, efforts such as screening model inputs through PMI and incorporating global climatic indices interconnected with ET0 can benefit the development of ET0 forecasting models suitable for data-scarce regions.
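
    The mutual-information screening idea above can be illustrated with a plain histogram-based estimator. This is a simplified sketch, not the authors' PMI implementation: true PMI additionally conditions each candidate on the predictors already selected, whereas the greedy selector below ranks candidates by unconditional mutual information only. The function names and the binning choice are illustrative.

```python
from collections import Counter
import math

def mutual_info(x, y, bins=4):
    """Histogram-based mutual information between two discretized series."""
    def discretize(v):
        lo, hi = min(v), max(v)
        w = (hi - lo) / bins or 1.0
        return [min(int((s - lo) / w), bins - 1) for s in v]
    xs, ys = discretize(x), discretize(y)
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    mi = 0.0
    for (a, b), c in pxy.items():
        p = c / n
        mi += p * math.log(p * n * n / (px[a] * py[b]))
    return mi

def greedy_select(candidates, target, k=2):
    """Greedily pick the k predictors sharing the most information with the
    target. (True PMI also conditions on predictors already selected.)"""
    chosen = []
    pool = dict(candidates)
    for _ in range(min(k, len(pool))):
        best = max(pool, key=lambda name: mutual_info(pool[name], target))
        chosen.append(best)
        pool.pop(best)
    return chosen
```

    A predictor strongly related to the target scores well above an unrelated one, so it is selected first.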

  19. Cross scale interactions, nonlinearities, and forecasting catastrophic events

    USGS Publications Warehouse

    Peters, Debra P.C.; Pielke, Roger A.; Bestelmeyer, Brandon T.; Allen, Craig D.; Munson-McGee, Stuart; Havstad, Kris M.

    2004-01-01

    Catastrophic events share characteristic nonlinear behaviors that are often generated by cross-scale interactions and feedbacks among system elements. These events result in surprises that cannot easily be predicted based on information obtained at a single scale. Progress on catastrophic events has focused on one of the following two areas: nonlinear dynamics through time without an explicit consideration of spatial connectivity [Holling, C. S. (1992) Ecol. Monogr. 62, 447–502] or spatial connectivity and the spread of contagious processes without a consideration of cross-scale interactions and feedbacks [Zeng, N., Neelin, J. D., Lau, L. M. & Tucker, C. J. (1999) Science 286, 1537–1540]. These approaches rarely have ventured beyond traditional disciplinary boundaries. We provide an interdisciplinary, conceptual, and general mathematical framework for understanding and forecasting nonlinear dynamics through time and across space. We illustrate the generality and usefulness of our approach by using new data and recasting published data from ecology (wildfires and desertification), epidemiology (infectious diseases), and engineering (structural failures). We show that decisions that minimize the likelihood of catastrophic events must be based on cross-scale interactions, and such decisions will often be counterintuitive. Given the continuing challenges associated with global change, approaches that cross disciplinary boundaries to include interactions and feedbacks at multiple scales are needed to increase our ability to predict catastrophic events and develop strategies for minimizing their occurrence and impacts. Our framework is an important step in developing predictive tools and designing experiments to examine cross-scale interactions.

  20. Multiannual forecasting of seasonal influenza dynamics reveals climatic and evolutionary drivers.

    PubMed

    Axelsen, Jacob Bock; Yaari, Rami; Grenfell, Bryan T; Stone, Lewi

    2014-07-01

    Human influenza occurs annually in most temperate climatic zones of the world, with epidemics peaking in the cold winter months. Considerable debate surrounds the relative role of epidemic dynamics, viral evolution, and climatic drivers in driving year-to-year variability of outbreaks. The ultimate test of understanding is prediction; however, existing influenza models rarely forecast beyond a single year at best. Here, we use a simple epidemiological model to reveal multiannual predictability based on high-quality influenza surveillance data for Israel; the model fit is corroborated by simple metapopulation comparisons within Israel. Successful forecasts are driven by temperature, humidity, antigenic drift, and immunity loss. Essentially, influenza dynamics are a balance between large perturbations following significant antigenic jumps, interspersed with nonlinear epidemic dynamics tuned by climatic forcing.
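
    The balance the authors describe, epidemic dynamics tuned by climatic forcing and immunity loss, can be sketched with a toy discrete-time SIRS model in which the transmission rate is modulated sinusoidally over the year. This is a generic illustration with invented parameter values, not the model fitted to the Israeli surveillance data.

```python
import math

def sirs_step(S, I, R, beta, gamma=0.2, xi=0.01):
    """One daily step of a discrete SIRS model (S, I, R are population
    fractions). gamma: recovery rate; xi: rate of immunity loss (R -> S)."""
    new_inf = beta * S * I
    new_rec = gamma * I
    new_sus = xi * R
    return S - new_inf + new_sus, I + new_inf - new_rec, R + new_rec - new_sus

def simulate(days=730, beta0=0.5, amp=0.3):
    """Transmission rate modulated sinusoidally with a 365-day period to
    mimic winter-time climatic forcing; returns the infectious fraction."""
    S, I, R = 0.99, 0.01, 0.0
    series = []
    for t in range(days):
        beta = beta0 * (1.0 + amp * math.cos(2 * math.pi * t / 365.0))
        S, I, R = sirs_step(S, I, R, beta)
        series.append(I)
    return series
```

    The step conserves total population exactly, and the forcing concentrates incidence in the cold part of the annual cycle.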

  1. Ensemble flare forecasting: using numerical weather prediction techniques to improve space weather operations

    NASA Astrophysics Data System (ADS)

    Murray, S.; Guerra, J. A.

    2017-12-01

    One essential component of operational space weather forecasting is the prediction of solar flares. Early flare forecasting work focused on statistical methods based on historical flaring rates, but more complex machine learning methods have been developed in recent years. A multitude of flare forecasting methods are now available; however, it is still unclear which of these methods performs best, and none are substantially better than climatological forecasts. Current operational space weather centres cannot rely on automated methods, and generally use statistical forecasts with some human intervention. Space weather researchers are increasingly looking towards methods used in terrestrial weather forecasting to improve current techniques. Ensemble forecasting has been used in numerical weather prediction for many years as a way to combine different predictions in order to obtain a more accurate result. It has proved useful in areas such as magnetospheric modelling and coronal mass ejection arrival analysis, but has not yet been implemented in operational flare forecasting. Here we construct ensemble forecasts for major solar flares by linearly combining the full-disk probabilistic forecasts from a group of operational forecasting methods (ASSA, ASAP, MAG4, MOSWOC, NOAA, and Solar Monitor). Forecasts from each method are weighted by a factor that accounts for the method's ability to predict previous events, and several performance metrics (both probabilistic and categorical) are considered. The results provide space weather forecasters with a set of parameters (combination weights, thresholds) that allow them to select the most appropriate values for constructing the 'best' ensemble forecast probability value, according to the performance metric of their choice. In this way different forecasts can be made to fit different end-user needs.
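
    One plausible reading of "weighted by a factor that accounts for the method's ability to predict previous events" is an inverse-Brier-score weighting followed by a linear opinion pool. The sketch below is an assumption, not the scheme actually used in the paper; the method names and probabilities are invented.

```python
def brier(probs, outcomes):
    """Mean squared error of probabilistic forecasts against 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def ensemble_weights(history, outcomes, eps=1e-9):
    """Weight each method by its inverse Brier score on past events,
    normalised so the weights sum to one."""
    inv = {m: 1.0 / (brier(p, outcomes) + eps) for m, p in history.items()}
    total = sum(inv.values())
    return {m: w / total for m, w in inv.items()}

def combine(forecasts, weights):
    """Linear opinion pool: weighted sum of the methods' probabilities."""
    return sum(weights[m] * p for m, p in forecasts.items())
```

    A method that predicted past events well receives most of the weight, and the combined probability stays in [0, 1] because the weights are convex.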

  2. Tsunami propagation modelling - a sensitivity study

    NASA Astrophysics Data System (ADS)

    Dao, M. H.; Tkalich, P.

    2007-12-01

    The 2004 Indian Ocean tsunami and its tragic consequences demonstrated a lack of relevant experience and preparedness among the affected coastal nations. After the event, scientific and forecasting communities in those countries began building the capacity to tackle similar problems in the future. Different approximations have been used for tsunami propagation, such as the Boussinesq and Nonlinear Shallow Water Equations (NSWE). These approximations were obtained by assuming different relative importance of the nonlinear, dispersion, and spatial gradient variation phenomena and terms. The paper describes further development of the original TUNAMI-N2 model to take into account additional phenomena: astronomic tide, sea bottom friction, dispersion, Coriolis force, and spherical curvature. The code is modified to be suitable for operational forecasting, and the resulting version (TUNAMI-N2-NUS) is verified using test cases, results of other models, and real case scenarios. Using the 2004 tsunami event as one of the scenarios, the paper examines the sensitivity of the numerical solutions to variation of the different phenomena and parameters, and the results are analysed and ranked accordingly.

  3. Forecasting Japanese encephalitis incidence from historical morbidity patterns: Statistical analysis with 27 years of observation in Assam, India.

    PubMed

    Handique, Bijoy K; Khan, Siraj A; Mahanta, J; Sudhakar, S

    2014-09-01

    Japanese encephalitis (JE) is one of the most dreaded mosquito-borne viral diseases, prevalent mostly in south Asian countries including India. Early warning of the disease in terms of disease intensity is crucial for taking adequate and appropriate intervention measures. The present study was carried out in Dibrugarh district in the state of Assam, located in the northeastern region of India, to assess the accuracy of selected forecasting methods based on historical morbidity patterns of JE incidence over 22 years (1985-2006). Four selected forecasting methods, viz. seasonal average (SA), seasonal adjustment with the last three observations (SAT), a modified method adjusting long-term and cyclic trend (MSAT), and autoregressive integrated moving average (ARIMA), were employed, and the accuracy of each was assessed. The forecasting methods were validated over consecutive years from 2007 to 2012. The method utilising seasonal adjustment with long-term and cyclic trend emerged as the best of the four selected forecasting methods and outperformed even the statistically more advanced ARIMA method. The peak of disease incidence could be predicted effectively with all the methods, but there are significant variations in the magnitude of forecast errors among them. As expected, variation in forecasts at primary health centre (PHC) level is wide compared with district-level forecasts. The study showed that the adopted forecasting techniques could reasonably forecast the intensity of JE cases at PHC level without considering external variables. The results indicate that understanding the long-term and cyclic trend of disease intensity will improve the accuracy of the forecasts, but the forecast models need to be made more robust to explain sudden variations in disease intensity, with detailed analysis of parasite and host population dynamics.
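
    The simplest of the morbidity-pattern methods can be sketched directly. Below, `seasonal_average` implements the SA idea (forecast a month by its historical mean), and `seasonal_adjusted` is only a guess at the flavour of SAT, scaling the seasonal mean by how the last three observed months compared with their own seasonal means; the exact adjustment used in the study may differ.

```python
def seasonal_average(history, month):
    """SA: forecast for a month is its mean over all past years.
    history: list of years, each a list of 12 monthly case counts."""
    vals = [year[month] for year in history]
    return sum(vals) / len(vals)

def seasonal_adjusted(history, month, last3):
    """SAT-style sketch: scale the seasonal average by the ratio of the
    last three observations to their own seasonal averages.
    last3: list of (month_index, observed_value) pairs."""
    base = seasonal_average(history, month)
    expected = [seasonal_average(history, m) for m, _ in last3]
    observed = [v for _, v in last3]
    ratio = sum(observed) / max(sum(expected), 1e-9)
    return base * ratio
```

    If recent months ran at twice their usual level, the adjusted forecast doubles the seasonal mean.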

  4. Constraints on Rational Model Weighting, Blending and Selecting when Constructing Probability Forecasts given Multiple Models

    NASA Astrophysics Data System (ADS)

    Higgins, S. M. W.; Du, H. L.; Smith, L. A.

    2012-04-01

    Ensemble forecasting on a lead time of seconds over several years generates a large forecast-outcome archive, which can be used to evaluate and weight "models". Challenges which arise as the archive becomes smaller are investigated: in weather forecasting one typically has only thousands of forecasts; however, those launched 6 hours apart are not independent of each other, nor is it justified to mix seasons with different dynamics. Seasonal forecasts, as from ENSEMBLES and DEMETER, typically have fewer than 64 unique launch dates; decadal forecasts fewer than eight; and long-range climate forecasts arguably none. It is argued that one does not weight "models" so much as entire ensemble prediction systems (EPSs), and that the marginal value of an EPS will depend on the other members in the mix. The impact of using different skill scores is examined in the limits of both very large forecast-outcome archives (thereby evaluating the efficiency of the skill score) and very small forecast-outcome archives (illustrating fundamental limitations due to sampling fluctuations and memory in the physical system being forecast). It is shown that blending with climatology (J. Bröcker and L.A. Smith, Tellus A, 60(4), 663-678, (2008)) tends to increase the robustness of the results; a new kernel dressing methodology (simply ensuring that the expected probability mass tends to lie outside the range of the ensemble) is also illustrated. Fair comparisons using seasonal forecasts from the ENSEMBLES project illustrate the importance of these results with fairly small archives. The robustness of these results across the range of small, moderate, and huge archives is demonstrated using imperfect models of perfectly known nonlinear (chaotic) dynamical systems. The implications these results hold for distinguishing the skill of a forecast from its value to a user of the forecast are discussed.
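
    Blending with climatology in the sense of Bröcker and Smith can be sketched for binary-event forecasts: choose the mixing weight alpha that minimises the ignorance (log) score of alpha·model + (1 − alpha)·climatology on the forecast-outcome archive. The grid search and binary-event framing below are illustrative simplifications of the cited methodology.

```python
import math

def log_score(probs, outcomes):
    """Mean ignorance (negative log likelihood) of forecast probabilities
    for binary outcomes, clipped away from zero for numerical safety."""
    return -sum(math.log(max(p if o else 1.0 - p, 1e-12))
                for p, o in zip(probs, outcomes)) / len(probs)

def blend_weight(model_probs, clim_prob, outcomes, grid=101):
    """Grid-search the alpha in [0, 1] minimising the ignorance of the
    climatology-blended forecast on a forecast-outcome archive."""
    best_a, best_s = 0.0, float("inf")
    for i in range(grid):
        a = i / (grid - 1)
        blended = [a * p + (1 - a) * clim_prob for p in model_probs]
        s = log_score(blended, outcomes)
        if s < best_s:
            best_a, best_s = a, s
    return best_a
```

    A skilful model earns a weight near one; a model worse than climatology is pushed towards zero, which is the robustness-increasing effect described in the abstract.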

  5. Diffusion Forecasting Model with Basis Functions from QR-Decomposition

    NASA Astrophysics Data System (ADS)

    Harlim, John; Yang, Haizhao

    2018-06-01

    Diffusion forecasting is a nonparametric approach that provably solves the Fokker-Planck PDE corresponding to an Itô diffusion without knowing the underlying equation. The key idea of this method is to approximate the solution of the Fokker-Planck equation with a discrete representation of the shift (Koopman) operator on a set of basis functions generated via the diffusion maps algorithm. While the choice of these basis functions is provably optimal under appropriate conditions, computing them is quite expensive, since it requires the eigendecomposition of an N × N diffusion matrix, where N denotes the data size and could be very large. For large-scale forecasting problems, only a few leading eigenvectors are computationally achievable. To overcome this computational bottleneck, a new set of basis functions constructed by orthonormalizing selected columns of the diffusion matrix and its leading eigenvectors is proposed. This computation can be carried out efficiently via the unpivoted Householder QR factorization. The efficiency and effectiveness of the proposed algorithm are shown in both deterministically chaotic and stochastic dynamical systems; in the former case, the superiority of the proposed basis functions over eigenvectors alone is significant, while in the latter case forecasting accuracy is improved relative to using a small number of eigenvectors alone. Supporting arguments are provided on three- and six-dimensional chaotic ODEs, a three-dimensional SDE that mimics turbulent systems, and also on the two spatial modes associated with the boreal winter Madden-Julian Oscillation obtained from applying Nonlinear Laplacian Spectral Analysis to the measured Outgoing Longwave Radiation.
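
    The basis-construction step, orthonormalizing a few leading eigenvectors together with selected columns of the diffusion matrix, can be sketched as follows. The paper does this with an unpivoted Householder QR factorization; the dependency-free classical Gram-Schmidt below is equivalent in exact arithmetic but less numerically stable, and the toy vectors are invented.

```python
def orthonormalize(vectors, tol=1e-10):
    """Classical Gram-Schmidt orthonormalisation of a list of vectors
    (lists of floats); numerically dependent columns are dropped."""
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            proj = sum(wi * bi for wi, bi in zip(w, b))
            w = [wi - proj * bi for wi, bi in zip(w, b)]
        norm = sum(wi * wi for wi in w) ** 0.5
        if norm > tol:
            basis.append([wi / norm for wi in w])
    return basis

# Union of a few (toy) leading eigenvectors and a selected column of a
# (toy) diffusion matrix, then orthonormalised into one basis.
eigvecs = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
columns = [[1.0, 1.0, 1.0]]
basis = orthonormalize(eigvecs + columns)
```

    The resulting basis spans the eigenvectors plus the extra column while remaining orthonormal, which is what the proposed algorithm needs from the QR step.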


  7. Multifractality and value-at-risk forecasting of exchange rates

    NASA Astrophysics Data System (ADS)

    Batten, Jonathan A.; Kinateder, Harald; Wagner, Niklas

    2014-05-01

    This paper addresses market risk prediction for high frequency foreign exchange rates under nonlinear risk scaling behaviour. We use a modified version of the multifractal model of asset returns (MMAR) where trading time is represented by the series of volume ticks. Our dataset consists of 138,418 5-min round-the-clock observations of EUR/USD spot quotes and trading ticks during the period January 5, 2006 to December 31, 2007. Considering fat-tails, long-range dependence as well as scale inconsistency with the MMAR, we derive out-of-sample value-at-risk (VaR) forecasts and compare our approach to historical simulation as well as a benchmark GARCH(1,1) location-scale VaR model. Our findings underline that the multifractal properties in EUR/USD returns in fact have notable risk management implications. The MMAR approach is a parsimonious model which produces admissible VaR forecasts at the 12-h forecast horizon. For the daily horizon, the MMAR outperforms both alternatives based on conditional as well as unconditional coverage statistics.
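
    VaR forecasts of the kind compared here are commonly backtested with Kupiec's unconditional coverage test, which the abstract alludes to via "unconditional coverage statistics". A minimal sketch, using the sign convention that a breach is a return below the (negative) VaR quantile:

```python
import math

def kupiec_lr(returns, var_forecasts, alpha=0.01):
    """Kupiec unconditional-coverage likelihood-ratio statistic for VaR
    backtesting. Returns (number of breaches, LR statistic); under correct
    coverage, LR is asymptotically chi-squared with one degree of freedom."""
    n = len(returns)
    x = sum(1 for r, v in zip(returns, var_forecasts) if r < v)
    if x in (0, n):
        return x, float("inf")  # degenerate likelihood; handle separately
    phat = x / n
    lr = -2.0 * (x * math.log(alpha / phat)
                 + (n - x) * math.log((1 - alpha) / (1 - phat)))
    return x, lr
```

    A VaR model whose breach frequency matches alpha gives LR near zero; too many breaches push LR past the 3.84 critical value (5% level, 1 d.o.f.) and the forecasts are rejected as inadmissible.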

  8. Short-term PV/T module temperature prediction based on PCA-RBF neural network

    NASA Astrophysics Data System (ADS)

    Li, Jiyong; Zhao, Zhendong; Li, Yisheng; Xiao, Jing; Tang, Yunfeng

    2018-02-01

    To address the non-linearity and large inertia of temperature control in PV/T systems, short-term temperature prediction of the PV/T module is proposed, so that the PV/T system controller can act ahead of time according to the short-term forecast and thereby optimise the control effect. Based on an analysis of the correlation between PV/T module temperature and meteorological factors, and of the temperature of adjacent time series, the principal component analysis (PCA) method is used to pre-process the original input sample data. Combined with RBF neural network theory, the simulation results show that the PCA pre-processing gives the prediction network model higher accuracy and stronger generalization performance than an RBF neural network without principal component extraction.
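
    The PCA pre-processing step can be sketched without any libraries using power iteration for the leading component; the projected scores would then feed the RBF network (not shown here). This is a simplified illustration, not the authors' pipeline, and the sample data are invented.

```python
def pca_first_component(data, iters=200):
    """Leading principal component of row-wise samples via power iteration
    on the sample covariance matrix."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    x = [[row[j] - means[j] for j in range(d)] for row in data]
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

def project(data, v):
    """Scores of each centred sample on the component v; these reduced
    inputs would be passed to the RBF network."""
    d, n = len(v), len(data)
    means = [sum(row[j] for row in data) / n for j in range(d)]
    return [sum((row[j] - means[j]) * v[j] for j in range(d)) for row in data]
```

    For data whose variance is concentrated in one dimension, the recovered component aligns with that dimension, which is the dimensionality reduction the abstract credits for the accuracy gain.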

  9. Improved water-level forecasting for the Northwest European Shelf and North Sea through direct modelling of tide, surge and non-linear interaction

    NASA Astrophysics Data System (ADS)

    Zijl, Firmijn; Verlaan, Martin; Gerritsen, Herman

    2013-07-01

    In real-time operational coastal forecasting systems for the northwest European shelf, the representation accuracy of tide-surge models commonly suffers from insufficiently accurate tidal representation, especially in shallow near-shore areas with complex bathymetry and geometry. Therefore, in conventional operational systems, the surge component from numerical model simulations is used, while the harmonically predicted tide, accurately known from harmonic analysis of tide gauge measurements, is added to forecast the full water-level signal at tide gauge locations. Although there are errors associated with this so-called astronomical correction (e.g. because of the assumption of linearity of tide and surge), for current operational models, astronomical correction has nevertheless been shown to increase the representation accuracy of the full water-level signal. The simulated modulation of the surge through non-linear tide-surge interaction is affected by the poor representation of the tide signal in the tide-surge model, which astronomical correction does not improve. Furthermore, astronomical correction can only be applied to locations where the astronomic tide is known through a harmonic analysis of in situ measurements at tide gauge stations. This provides a strong motivation to improve both tide and surge representation of numerical models used in forecasting. In the present paper, we propose a new generation tide-surge model for the northwest European Shelf (DCSMv6). This is the first application on this scale in which the tidal representation is such that astronomical correction no longer improves the accuracy of the total water-level representation and where, consequently, the straightforward direct model forecasting of total water levels is better. 
The methodology applied to improve both tide and surge representation of the model is discussed, with emphasis on the use of satellite altimeter data and data assimilation techniques for reducing parameter uncertainty. Historic DCSMv6 model simulations are compared against shelf wide observations for a full calendar year. For a selection of stations, these results are compared to those with astronomical correction, which confirms that the tide representation in coastal regions has sufficient accuracy, and that forecasting total water levels directly yields superior results.
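
    The conventional astronomical correction that DCSMv6 is designed to make unnecessary is simple to state: keep the modelled surge (model total minus modelled tide) and add the harmonically predicted tide from the tide-gauge analysis. A sketch, with invented water levels:

```python
def astronomical_correction(model_total, model_tide, harmonic_tide):
    """Corrected water level at a tide gauge: harmonically predicted tide
    plus the modelled surge (model total minus the model's own tide).
    Assumes tide and surge add linearly, which is the approximation the
    abstract notes as a source of error."""
    return [ht + (mt - mtide)
            for ht, mt, mtide in zip(harmonic_tide, model_total, model_tide)]
```

    The correction can only be applied where a harmonic analysis of gauge measurements exists, which is the limitation motivating direct total-water-level forecasting.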

  10. Time series analysis of reference crop evapotranspiration using soft computing techniques for Ganjam District, Odisha, India

    NASA Astrophysics Data System (ADS)

    Patra, S. R.

    2017-12-01

    Evapotranspiration (ET0) influences water resources and is considered a vital process in arid hydrologic frameworks. It is one of the most important measures in identifying drought conditions. Therefore, time series forecasting of evapotranspiration is very important in order to help decision makers and water system managers build up proper systems to sustain and manage water resources. Time series analysis assumes that history repeats itself; hence, by analysing past values, better choices, or forecasts, can be made for the future. Ten years of ET0 data were used in this study to ensure a satisfactory forecast of monthly values. In this study, three models are presented: an autoregressive integrated moving average (ARIMA) mathematical model, an artificial neural network (ANN) model, and a support vector machine (SVM) model. These three models are used for forecasting monthly reference crop evapotranspiration based on ten years of past historical records (1991-2001) of measured evaporation at the Ganjam region, Odisha, India, without considering climate data. The developed models allow water resource managers to predict up to 12 months ahead, making these predictions very useful for optimizing the resources needed for effective water resources management. In this study, multistep-ahead prediction is performed, which is more complex and troublesome than one-step-ahead prediction. Our investigation suggests that nonlinear relationships may exist among the monthly indices, so the ARIMA model might not be able to effectively extract the full relationship hidden in the historical data. Support vector machines are potentially useful time series forecasting strategies on account of their strong nonlinear mapping capability and resistance to complexity in forecasting data. SVMs have great learning capability in time series modelling compared with ANNs. For instance, SVMs implement the structural risk minimization principle, which yields better generalization than neural networks, which use the empirical risk minimization principle. The reliability of these computational models was analysed in light of the simulation results, and it was found that the SVM model produces the best results among the three. Future research should be directed to extending the validation data set and checking the validity of these results in different areas with hybrid intelligence techniques.
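
    The reason multistep-ahead prediction is harder than one-step-ahead can be shown with a recursive forecaster: each prediction is appended to the input window and fed back, so one-step errors compound over the horizon. The scheme below is generic; the paper applies it with ARIMA/ANN/SVM one-step models rather than the toy rule shown.

```python
def recursive_forecast(history, one_step, horizon=12):
    """Multistep-ahead forecasting by iterating a one-step model: each
    prediction is appended to the window and used as an input for the
    next step, so errors compound with the horizon."""
    window = list(history)
    out = []
    for _ in range(horizon):
        yhat = one_step(window)
        out.append(yhat)
        window.append(yhat)
    return out

def damped_trend(window):
    """Toy one-step rule: last value plus half the last increment."""
    return window[-1] + 0.5 * (window[-1] - window[-2])
```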

  11. Nesting large-eddy simulations within mesoscale simulations for wind energy applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundquist, J K; Mirocha, J D; Chow, F K

    2008-09-08

    With increasing demand for more accurate atmospheric simulations for wind turbine micrositing, for operational wind power forecasting, and for more reliable turbine design, simulations of atmospheric flow with resolution of tens of meters or higher are required. These time-dependent large-eddy simulations (LES), which resolve individual atmospheric eddies on length scales smaller than turbine blades and account for complex terrain, are possible with a range of commercial and open-source software, including the Weather Research and Forecasting (WRF) model. In addition to 'local' sources of turbulence within an LES domain, changing weather conditions outside the domain can also affect the flow, suggesting that a mesoscale model should provide boundary conditions to the large-eddy simulations. Nesting a large-eddy simulation within a mesoscale model requires nuanced representations of turbulence. Our group has improved WRF's LES capability by implementing the Nonlinear Backscatter and Anisotropy (NBA) subfilter stress model following Kosovic (1997) and an explicit filtering and reconstruction technique to compute the Resolvable Subfilter-Scale (RSFS) stresses (following Chow et al., 2005). We have also implemented an immersed boundary method (IBM) in WRF to accommodate complex terrain. These new models improve WRF's LES capabilities over complex terrain and in stable atmospheric conditions. We demonstrate approaches to nesting LES within a mesoscale simulation for farms of wind turbines in hilly regions. Results are sensitive to the nesting method, indicating that care must be taken to provide appropriate boundary conditions and to allow adequate spin-up of turbulence in the LES domain.

  12. Neutrino masses and cosmological parameters from a Euclid-like survey: Markov Chain Monte Carlo forecasts including theoretical errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Audren, Benjamin; Lesgourgues, Julien; Bird, Simeon

    2013-01-01

    We present forecasts for the accuracy of determining the parameters of a minimal cosmological model and the total neutrino mass, based on combined mock data for a future Euclid-like galaxy survey and Planck. We consider two different galaxy surveys: a spectroscopic redshift survey and a cosmic shear survey. We make use of the Markov Chain Monte Carlo (MCMC) technique and assume two sets of theoretical errors. The first error is meant to account for uncertainties in the modelling of the effect of neutrinos on the non-linear galaxy power spectrum, and we assume this error to be fully correlated in Fourier space. The second error is meant to parametrize the overall residual uncertainties in modelling the non-linear galaxy power spectrum at small scales, and is conservatively assumed to be uncorrelated and to increase with the ratio of a given scale to the scale of non-linearity. It hence increases with wavenumber and decreases with redshift. With these two assumptions for the errors, and assuming further, conservatively, that the uncorrelated error rises above 2% at k = 0.4 h/Mpc and z = 0.5, we find that a future Euclid-like cosmic shear/galaxy survey achieves a 1-σ error on M_ν close to 32 meV/25 meV, sufficient for detecting the total neutrino mass with good significance. If the residual uncorrelated error indeed rises rapidly towards smaller scales in the non-linear regime, as we have assumed here, then the data on non-linear scales do not increase the sensitivity to the total neutrino mass. Assuming instead a ten times smaller theoretical error with the same scale dependence, the error on the total neutrino mass decreases moderately from σ(M_ν) = 18 meV to 14 meV when mildly non-linear scales with 0.1 h/Mpc < k < 0.6 h/Mpc are included in the analysis of the galaxy survey data.

  13. A Bayesian modelling method for post-processing daily sub-seasonal to seasonal rainfall forecasts from global climate models and evaluation for 12 Australian catchments

    NASA Astrophysics Data System (ADS)

    Schepen, Andrew; Zhao, Tongtiegang; Wang, Quan J.; Robertson, David E.

    2018-03-01

    Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
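
    The Schaake Shuffle step mentioned above is a compact algorithm: at each lead time, the sorted forecast ensemble values are reassigned so that their rank order across members matches the rank order of historical trajectories, restoring realistic dependence between lead times. A minimal sketch of the standard formulation (the RPP-S pairing with the Bayesian calibration step is not shown):

```python
def schaake_shuffle(ensemble, historical):
    """ensemble, historical: one list of member values per lead time.
    Each member receives the forecast value whose rank matches the rank
    of that member's historical value at the same lead time."""
    shuffled = []
    for fcst, hist in zip(ensemble, historical):
        srt = sorted(fcst)
        # indices of historical values in ascending order at this lead time
        ranks = sorted(range(len(hist)), key=lambda i: hist[i])
        out = [0.0] * len(hist)
        for rank, idx in enumerate(ranks):
            out[idx] = srt[rank]
        shuffled.append(out)
    return shuffled
```

    The marginal distribution at each lead time is untouched (the same values appear, reordered); only the ordering across members, and hence the dependence across lead times, changes.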

  14. A Modified LS+AR Model to Improve the Accuracy of the Short-term Polar Motion Prediction

    NASA Astrophysics Data System (ADS)

    Wang, Z. W.; Wang, Q. X.; Ding, Y. Q.; Zhang, J. J.; Liu, S. S.

    2017-03-01

    There are two problems with the LS (Least Squares) + AR (AutoRegressive) model in polar motion forecasting: the residuals of the LS fit are reasonable within the fitting interval, but the residuals of the LS extrapolation are poor; and the LS fitting residual sequence is non-linear, so it is unsuitable to build an AR model for the residual sequence to be forecast from the residual sequence before the forecast epoch. In this paper, we address these two problems in two steps. First, constraints are added at the two endpoints of the LS fitting data to fix them on the LS fitting curve, so that the fitted values near the two endpoints are very close to the observations. Secondly, we select the interpolation residual sequence of an inward LS fitting curve, which has a variation trend similar to that of the LS extrapolation residual sequence, as the AR modeling object for the residual forecast. Calculation examples show that this approach effectively improves the short-term polar motion prediction accuracy of the LS+AR model. In addition, comparisons with the RLS (Robustified Least Squares)+AR, RLS+ARIMA (AutoRegressive Integrated Moving Average), and LS+ANN (Artificial Neural Network) forecast models confirm the feasibility and effectiveness of the approach. The results, especially for polar motion forecasts at lead times of 1-10 days, show that the forecast accuracy of the proposed model is competitive with the best results worldwide.
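
    The generic LS+AR pipeline that the paper modifies can be sketched as follows: a harmonic-plus-trend least-squares fit, an AR(p) model fit to the fitting residuals, and the LS extrapolation plus the AR residual forecast. The design terms, AR order and periods below are illustrative assumptions, not the authors' modified scheme.

```python
import numpy as np

def ls_ar_forecast(y, t, horizon, p=4, periods=(365.24, 432.0)):
    """Generic LS+AR: least-squares fit of bias + trend + harmonics
    (annual and Chandler-like terms), AR(p) on the residuals, then
    LS extrapolation plus the recursive AR residual forecast."""
    def design(tt):
        cols = [np.ones_like(tt), tt]
        for P in periods:
            cols += [np.sin(2*np.pi*tt/P), np.cos(2*np.pi*tt/P)]
        return np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(design(t), y, rcond=None)
    resid = y - design(t) @ beta
    # AR(p) coefficients by least squares on lagged residuals
    X = np.column_stack([resid[p-k:len(resid)-k] for k in range(1, p+1)])
    phi, *_ = np.linalg.lstsq(X, resid[p:], rcond=None)
    # recursive residual forecast
    hist = list(resid[-p:])                      # oldest first, newest last
    res_fc = []
    for _ in range(horizon):
        nxt = float(phi @ np.array(hist[::-1]))  # phi[0] pairs with newest lag
        res_fc.append(nxt)
        hist = hist[1:] + [nxt]
    dt = t[1] - t[0]
    tf = t[-1] + dt * np.arange(1, horizon + 1)
    return design(tf) @ beta + np.array(res_fc)
```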

  15. Key Technology of Real-Time Road Navigation Method Based on Intelligent Data Research

    PubMed Central

    Tang, Haijing; Liang, Yu; Huang, Zhongnan; Wang, Taoyi; He, Lin; Du, Yicong; Ding, Gangyi

    2016-01-01

    The effect of traffic flow prediction plays an important role in route selection. Traditional traffic flow forecasting methods mainly include linear, nonlinear, neural network, and time series analysis methods; however, all of them have shortcomings. This paper analyzes existing traffic flow prediction algorithms and the characteristics of city traffic flow, and proposes a road traffic flow prediction method based on transfer probability. The method first analyzes the transfer probability upstream of the target road and then predicts the traffic flow at the next time step using the traffic flow equation. The Newton interior-point method is used to obtain the optimal parameter values. Finally, the proposed model is used to predict the traffic flow at the next time step. Comparison with existing prediction methods shows that the proposed model performs well: it obtains the optimal parameter values faster and has higher prediction accuracy, making it suitable for real-time traffic flow prediction. PMID:27872637

  16. Hybrid Support Vector Regression and Autoregressive Integrated Moving Average Models Improved by Particle Swarm Optimization for Property Crime Rates Forecasting with Economic Indicators

    PubMed Central

    Alwee, Razana; Hj Shamsuddin, Siti Mariyam; Sallehuddin, Roselina

    2013-01-01

    Crime forecasting is an important area in the field of criminology. Linear models, such as regression and econometric models, are commonly applied in crime forecasting. However, real crime data commonly consist of both linear and nonlinear components, and a single model may not be sufficient to capture all the characteristics of the data. The purpose of this study is to introduce a hybrid model that combines support vector regression (SVR) and the autoregressive integrated moving average (ARIMA) for crime rate forecasting. SVR is very robust with small training data and high-dimensional problems, while ARIMA can model several types of time series. However, the accuracy of the SVR model depends on the values of its parameters, and ARIMA is not robust when applied to small data sets. To overcome these problems, particle swarm optimization is used to estimate the parameters of the SVR and ARIMA models. The proposed hybrid model is used to forecast the property crime rates of the United States based on economic indicators. The experimental results show that the proposed hybrid model produces more accurate forecasts than the individual models. PMID:23766729
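
    The particle swarm optimization step can be sketched generically: in the hybrid model the objective would be a validation error over the SVR and ARIMA parameters, but the optimizer itself only needs a callable and box bounds. The inertia and acceleration constants below are textbook defaults, not the authors' settings.

```python
import numpy as np

def pso(f, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Bare-bones particle swarm optimizer over a box-constrained vector.
    f: objective to minimize; bounds: list of (low, high) per dimension."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, (n_particles, len(lo)))   # particle positions
    v = np.zeros_like(x)                              # velocities
    pbest = x.copy()                                  # personal bests
    pval = np.array([f(p) for p in x])
    g = pbest[pval.argmin()].copy()                   # global best
    for _ in range(iters):
        r1 = rng.random(x.shape)
        r2 = rng.random(x.shape)
        v = w*v + c1*r1*(pbest - x) + c2*r2*(g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        better = fx < pval
        pbest[better] = x[better]
        pval[better] = fx[better]
        g = pbest[pval.argmin()].copy()
    return g, float(pval.min())
```

    Tuning SVR's C, gamma and epsilon would plug in as the dimensions of the search box.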

  18. Forecasting of palm oil price in Malaysia using linear and nonlinear methods

    NASA Astrophysics Data System (ADS)

    Nor, Abu Hassan Shaari Md; Sarmidi, Tamat; Hosseinidoust, Ehsan

    2014-09-01

    The first question that comes to mind is: "How can we predict the palm oil price accurately?" This question has long occupied authorities, policy makers and economists. First, in recent years Malaysia has shown a comparative advantage in palm oil production and has become the world's top producer and exporter. Secondly, the palm oil price plays a significant role in the government budget and represents an important source of income for Malaysia, so it can potentially influence the magnitude of monetary policies and eventually have an impact on inflation. Thirdly, knowledge of future trends is helpful in planning and decision-making procedures and supports precise fiscal and monetary policy. Daily data on palm oil prices are employed in this paper, along with ARIMA models, neural networks and fuzzy logic systems. Empirical findings indicate that the dynamic NARX neural network and the hybrid ANFIS system provide higher accuracy than ARIMA and static neural networks for forecasting the palm oil price in Malaysia.

  19. Water demand forecasting: review of soft computing methods.

    PubMed

    Ghalehkhondabi, Iman; Ardjmand, Ehsan; Young, William A; Weckman, Gary R

    2017-07-01

    Demand forecasting plays a vital role in resource management for governments and private companies. Considering the scarcity of water and its inherent constraints, demand management and forecasting in this domain are critically important. Several soft computing techniques have been developed over the last few decades for water demand forecasting. This study focuses on soft computing methods of water consumption forecasting published between 2005 and 2015. These methods include artificial neural networks (ANNs), fuzzy and neuro-fuzzy models, support vector machines, metaheuristics, and system dynamics. Furthermore, it is discussed that although ANNs have been superior in many short-term forecasting cases, it is still very difficult to pick a single method as the overall best. According to the literature, various methods and their hybrids are applied to water demand forecasting. However, it seems soft computing has much more to contribute to water demand forecasting. These contribution areas include, but are not limited to, various ANN architectures, unsupervised methods, deep learning, various metaheuristics, and ensemble methods. Moreover, it is found that soft computing methods are mainly used for short-term demand forecasting.

  20. Prediction of a service demand using combined forecasting approach

    NASA Astrophysics Data System (ADS)

    Zhou, Ling

    2017-08-01

    Forecasting facilitates cutting operational and management costs for a logistics service provider while ensuring the service level. Our case study investigates how to forecast short-term logistics demand for an LTL (less-than-truckload) carrier. A combined approach draws on several forecasting methods simultaneously instead of a single one; it can offset the weaknesses of one method with the strengths of another, which can improve the precision of the prediction. The main issues in combined forecast modeling are how to select the methods to combine and how to determine the weight coefficients among them. The selection principles are that each method should be applicable to the forecasting problem itself, and that the methods should differ in character as much as possible. Based on these principles, exponential smoothing, ARIMA and a neural network are chosen to form the combined approach. The least squares technique is then employed to determine the optimal weight coefficients among the forecasting methods. Simulation results show the advantage of the combined approach over the three individual methods. The work helps managers select prediction methods in practice.
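
    The least-squares weighting step can be sketched as below. Constraining the weights to sum to one is a common convention assumed here; the paper does not state its exact formulation.

```python
import numpy as np

def combination_weights(forecasts, actual):
    """Least-squares combination weights constrained to sum to one.
    forecasts: (n_obs, n_methods) in-sample forecasts from each method;
    actual: (n_obs,) observed values."""
    F = np.asarray(forecasts, dtype=float)
    y = np.asarray(actual, dtype=float)
    m = F.shape[1]
    # KKT system for: minimize ||F w - y||^2  subject to  sum(w) = 1
    A = np.zeros((m + 1, m + 1))
    A[:m, :m] = 2 * F.T @ F
    A[:m, m] = 1.0
    A[m, :m] = 1.0
    b = np.concatenate([2 * F.T @ y, [1.0]])
    return np.linalg.solve(A, b)[:m]
```

    The combined forecast is then simply the weighted sum of the individual methods' forecasts.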

  1. An overview of health forecasting.

    PubMed

    Soyiri, Ireneous N; Reidpath, Daniel D

    2013-01-01

    Health forecasting is a novel area of forecasting, and a valuable tool for predicting future health events or situations such as demands for health services and healthcare needs. It facilitates preventive medicine and health care intervention strategies, by pre-informing health service providers to take appropriate mitigating actions to minimize risks and manage demand. Health forecasting requires reliable data, information and appropriate analytical tools for the prediction of specific health conditions or situations. There is no single approach to health forecasting, and so various methods have often been adopted to forecast aggregate or specific health conditions. Meanwhile, there are no defined health forecasting horizons (time frames) to match the choices of health forecasting methods/approaches that are often applied. The key principles of health forecasting have not also been adequately described to guide the process. This paper provides a brief introduction and theoretical analysis of health forecasting. It describes the key issues that are important for health forecasting, including: definitions, principles of health forecasting, and the properties of health data, which influence the choices of health forecasting methods. Other matters related to the value of health forecasting, and the general challenges associated with developing and using health forecasting services are discussed. This overview is a stimulus for further discussions on standardizing health forecasting approaches and methods that will facilitate health care and health services delivery.

  2. Linear and non-linear Modified Gravity forecasts with future surveys

    NASA Astrophysics Data System (ADS)

    Casas, Santiago; Kunz, Martin; Martinelli, Matteo; Pettorino, Valeria

    2017-12-01

    Modified Gravity theories generally affect the Poisson equation and the gravitational slip in an observable way that can be parameterized by two generic functions (η and μ) of time and space. We bin their time dependence in redshift and present forecasts on each bin for future surveys like Euclid. We consider both Galaxy Clustering and Weak Lensing surveys, showing the impact of the non-linear regime, with two different semi-analytical approximations. In addition to these future observables, we use a prior covariance matrix derived from the Planck observations of the Cosmic Microwave Background. In this work we neglect the information from the cross correlation of these observables, and treat them as independent. Our results show that η and μ in different redshift bins are significantly correlated, but including non-linear scales reduces or even eliminates the correlation, breaking the degeneracy between Modified Gravity parameters and the overall amplitude of the matter power spectrum. We further apply a Zero-phase Component Analysis and identify which combinations of the Modified Gravity parameter amplitudes, in different redshift bins, are best constrained by future surveys. We extend the analysis to two particular parameterizations of μ and η and consider, in addition to Euclid, also SKA1, SKA2, DESI: we find in this case that future surveys will be able to constrain the current values of η and μ at the 2-5% level when using only linear scales (wavevector k < 0.15 h/Mpc), depending on the specific time parameterization; sensitivity improves to about 1% when non-linearities are included.

  3. Using a Hybrid Model to Forecast the Prevalence of Schistosomiasis in Humans.

    PubMed

    Zhou, Lingling; Xia, Jing; Yu, Lijing; Wang, Ying; Shi, Yun; Cai, Shunxiang; Nie, Shaofa

    2016-03-23

    We previously proposed a hybrid model combining the autoregressive integrated moving average (ARIMA) and the nonlinear autoregressive neural network (NARNN) models for forecasting schistosomiasis. Our purpose in the current study was to forecast the annual prevalence of human schistosomiasis in Yangxin County using our ARIMA-NARNN model, thereby further confirming the reliability of our hybrid model. We used the ARIMA, NARNN and ARIMA-NARNN models to fit and forecast the annual prevalence of schistosomiasis. The modeling period covered the annual prevalence from 1956 to 2008, while the testing period covered 2009 to 2012. The mean square error (MSE), mean absolute error (MAE) and mean absolute percentage error (MAPE) were used to measure model performance. We then reconstructed the hybrid model to forecast the annual prevalence from 2013 to 2016. The modeling and testing errors generated by the ARIMA-NARNN model were lower than those obtained from either the single ARIMA or NARNN models. The predicted annual prevalence from 2013 to 2016 demonstrated an initial decreasing trend, followed by an increase. The ARIMA-NARNN model can be well applied to analyze surveillance data for early warning systems for the control and elimination of schistosomiasis.
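
    The three error measures used for the model comparison are standard; a minimal implementation:

```python
import numpy as np

def forecast_errors(obs, pred):
    """MSE, MAE and MAPE (in percent) between observed and predicted
    series; observations must be nonzero for MAPE."""
    obs = np.asarray(obs, dtype=float)
    pred = np.asarray(pred, dtype=float)
    e = obs - pred
    mse = np.mean(e**2)
    mae = np.mean(np.abs(e))
    mape = np.mean(np.abs(e / obs)) * 100.0
    return mse, mae, mape
```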

  4. Multiple indices method for real-time tsunami inundation forecast using a dense offshore observation network

    NASA Astrophysics Data System (ADS)

    Yamamoto, N.; Aoi, S.; Hirata, K.; Suzuki, W.; Kunugi, T.; Nakamura, H.

    2015-12-01

    We started to develop a new methodology for a real-time tsunami inundation forecast system (Aoi et al., 2015, this meeting) using dense offshore tsunami observations from the Seafloor Observation Network for Earthquakes and Tsunamis (S-net), which is under construction along the Japan Trench (Kanazawa et al., 2012, JpGU; Uehira et al., 2015, IUGG). The key concept of our method is accommodating uncertainties of any type and/or form in the tsunami forecast, which cannot be handled by standard linear/nonlinear least-squares approaches. We first prepare a Tsunami Scenario Bank (TSB), which contains offshore tsunami waveforms at the S-net stations and tsunami inundation information calculated from any possible tsunami source. After a tsunami occurs, we quickly select the acceptable tsunami scenarios that can explain the offshore observations, using multiple indices and appropriate thresholds, and forecast the possible tsunami inundations coupled with the selected scenarios (Yamamoto et al., 2014, AGU). Currently, we define three indices: the correlation coefficient and two variance reductions, whose L2-norm part is normalized either by observations or by calculations (Suzuki et al., 2015, JpGU; Yamamoto et al., 2015, IUGG). In this study, we construct the TSB from the various tsunami source models prepared for the probabilistic tsunami hazard assessment in the Japan Trench region (Hirata et al., 2014, AGU). To evaluate the propriety of our method, we adopt a fault model based on the 2011 Tohoku earthquake as a pseudo "observation". We also calculate the three indices on the coastal maximum tsunami height distributions between observation and calculation, and obtain the correlation between the coastal and offshore indices. We notice that the index values for coastal maximum tsunami heights are closer to 1 than those for offshore waveforms; i.e., the coastal maximum tsunami height may be predictable within appropriate thresholds defined for the offshore indices. We also investigate the effect of rise time. This work was partially supported by the Council for Science, Technology and Innovation (CSTI) through the Cross-ministerial Strategic Innovation Promotion Program (SIP), titled "Enhancement of societal resiliency against natural disasters" (Funding agency: JST).
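
    The three selection indices, as described in the abstract, might be sketched as follows; the exact normalization used operationally may differ.

```python
import numpy as np

def scenario_indices(obs, calc):
    """Correlation coefficient plus two variance reductions, with the
    L2-norm part normalized by observations or by calculations."""
    obs = np.asarray(obs, dtype=float)
    calc = np.asarray(calc, dtype=float)
    r = np.corrcoef(obs, calc)[0, 1]
    sse = np.sum((obs - calc)**2)
    vr_obs = 1.0 - sse / np.sum(obs**2)
    vr_calc = 1.0 - sse / np.sum(calc**2)
    return r, vr_obs, vr_calc
```

    A scenario is accepted when all three indices exceed their thresholds; a perfect match gives (1, 1, 1).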

  5. Linear and Non-linear Information Flows In Rainfall Field

    NASA Astrophysics Data System (ADS)

    Molini, A.; La Barbera, P.; Lanza, L. G.

    The rainfall process is the result of a complex framework of non-linear dynamical interactions between the different components of the atmosphere. It preserves the complexity and the intermittent features of the generating system in space and time, as well as the strong dependence of these properties on the scale of observation. Understanding and quantifying how the non-linearity of the generating process influences single rain events are relevant research issues in hydro-meteorology, especially in applications where timely and effective forecasting of heavy rain events can reduce the risk of failure. This work focuses on the characterization of the non-linear properties of the observed rain process and on the influence of these features on hydrological models. Among the goals of this survey are the search for regular structures in the rainfall phenomenon and the study of the information flows within the rain field. The research focuses on three basic evolution directions for the system: in time, in space and between the different scales. In fact, the information flows that force the system to evolve represent in general a connection between different locations in space, different instants in time and, unless the hypothesis of scale invariance is verified "a priori", the different characteristic scales. A first phase of the analysis is carried out by means of classic statistical methods; then a survey of the information flows within the field is developed by means of techniques borrowed from Information Theory; and finally an analysis of the rain signal in the time and frequency domains is performed, with particular reference to its intermittent structure. The methods adopted in this last part of the work are both the classic techniques of statistical inference and a few procedures for the detection of non-linear and non-stationary features within the process, starting from measured data.
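
    One basic Information Theory tool for such a survey is the mutual information between two rain series (or between a series and its time-shifted copy). The histogram estimator below is a generic illustration, not the authors' procedure.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in histogram estimate of I(X;Y) in bits. Captures linear and
    nonlinear dependence alike; zero (up to estimator bias) when the two
    series are independent."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                              # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)           # marginal of X
    py = pxy.sum(axis=0, keepdims=True)           # marginal of Y
    nz = pxy > 0                                  # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))
```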

  6. Dynamics of attitudes and genetic processes.

    PubMed

    Guastello, Stephen J; Guastello, Denise D

    2008-01-01

    Relatively new discoveries of a genetic component to attitudes have challenged the traditional viewpoint that attitudes are primarily learned ideas and behaviors. Attitudes that are regarded by respondents as "more important" tend to have greater genetic components to them, and tend to be more closely associated with authoritarianism. Nonlinear theories, nonetheless, have also been introduced to study attitude change. The objective of this study was to determine whether change in authoritarian attitudes across two generations would be more aptly described by a linear or a nonlinear model. Participants were 372 college students, their mothers, and their fathers who completed an attitude questionnaire. Results indicated that the nonlinear model (R2 = .09) was slightly better than the linear model (R2 = .08), but the two models offered very different forecasts for future generations of US society. The linear model projected a gradual and continuing bifurcation between authoritarians and non-authoritarians. The nonlinear model projected a stabilization of authoritarian attitudes.

  7. Computational modes and the Machenauer N.L.N.M.I. of the GLAS 4th order model. [NonLinear Normal Mode Initialization in numerical weather forecasting

    NASA Technical Reports Server (NTRS)

    Navon, I. M.; Bloom, S.; Takacs, L. L.

    1985-01-01

    An attempt was made to use the GLAS global 4th order shallow water equations to perform a Machenhauer nonlinear normal mode initialization (NLNMI) for the external vertical mode. A new algorithm was defined for identifying and filtering out computational modes which affect the convergence of the Machenhauer iterative procedure. The computational modes and zonal waves were linearly initialized and gravitational modes were nonlinearly initialized. The Machenhauer NLNMI was insensitive to the absence of high zonal wave numbers. The effects of the Machenhauer scheme were evaluated by performing 24 hr integrations with nondissipative and dissipative explicit time integration models. The NLNMI was found to be inferior to the Rasch (1984) pseudo-secant technique for obtaining convergence when the time scales of nonlinear forcing were much smaller than the time scales expected from the natural frequency of the mode.

  8. Assessing artificial neural networks and statistical methods for infilling missing soil moisture records

    NASA Astrophysics Data System (ADS)

    Dumedah, Gift; Walker, Jeffrey P.; Chik, Li

    2014-07-01

    Soil moisture information is critically important for water management operations including flood forecasting, drought monitoring, and groundwater recharge estimation. While an accurate and continuous record of soil moisture is required for these applications, the available soil moisture data, in practice, is typically fraught with missing values. There is a wide range of methods available for infilling hydrologic variables, but a thorough inter-comparison between statistical methods and artificial neural networks has not been made. This study examines 5 statistical methods including monthly averages, weighted Pearson correlation coefficient, a method based on temporal stability of soil moisture, and a weighted merging of the three methods, together with a method based on the concept of rough sets. Additionally, 9 artificial neural networks are examined, broadly categorized into feedforward, dynamic, and radial basis networks. These 14 infilling methods were used to estimate missing soil moisture records and subsequently validated against known values for 13 soil moisture monitoring stations for three different soil layer depths in the Yanco region in southeast Australia. The evaluation results show that the top three highest performing methods are the nonlinear autoregressive neural network, rough sets method, and monthly replacement. A high estimation accuracy (root mean square error (RMSE) of about 0.03 m/m) was found in the nonlinear autoregressive network, due to its regression based dynamic network which allows feedback connections through discrete-time estimation. An equally high accuracy (0.05 m/m RMSE) in the rough sets procedure illustrates the important role of temporal persistence of soil moisture, with the capability to account for different soil moisture conditions.

  9. Comparison of Adaline and Multiple Linear Regression Methods for Rainfall Forecasting

    NASA Astrophysics Data System (ADS)

    Sutawinaya, IP; Astawa, INGA; Hariyanti, NKD

    2018-01-01

    Heavy rainfall can cause disasters, so forecasts of rainfall intensity are needed. The main factor causing flooding is high rainfall intensity, which pushes a river beyond its capacity and floods the surrounding area. Rainfall is a dynamic factor, which makes it very interesting to study. To support rainfall forecasting, methods ranging from Artificial Intelligence (AI) to statistics can be used. In this research, we used Adaline as the AI method and multiple linear regression as the statistical method. The more accurate the forecast result, the better the method is for forecasting rainfall. Through these methods, we aim to determine which method is better suited to rainfall forecasting in this region.
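
    For reference, ADALINE is a single linear unit trained with the Widrow-Hoff (LMS) delta rule; a textbook sketch follows, in which the learning rate and epoch count are illustrative, not the study's settings.

```python
import numpy as np

def train_adaline(X, y, lr=0.01, epochs=300):
    """ADALINE: linear output, weights nudged along the error gradient
    one sample at a time (the Widrow-Hoff delta rule)."""
    X = np.column_stack([np.ones(len(X)), X])    # prepend bias column
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, target in zip(X, y):
            w += lr * (target - xi @ w) * xi     # delta rule update
    return w

def predict(w, X):
    return np.column_stack([np.ones(len(X)), X]) @ w
```

    Multiple linear regression fits the same linear model in closed form; ADALINE reaches it iteratively.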

  10. Trend time-series modeling and forecasting with neural networks.

    PubMed

    Qi, Min; Zhang, G Peter

    2008-05-01

    Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.
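
    The differencing strategy found most reliable above amounts to modeling the changes Δy rather than the levels y, then integrating the network's forecasts back; a sketch for first-order differencing:

```python
import numpy as np

def difference(y, d=1):
    """Apply d-th order differencing to remove a (stochastic or
    deterministic) trend before fitting the network."""
    y = np.asarray(y, dtype=float)
    for _ in range(d):
        y = np.diff(y)
    return y

def undifference(dy, y0):
    """Invert first-order differencing: rebuild levels from the last
    observed level y0 and the sequence of forecast differences."""
    return y0 + np.cumsum(dy)
```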

  11. A GLM Post-processor to Adjust Ensemble Forecast Traces

    NASA Astrophysics Data System (ADS)

    Thiemann, M.; Day, G. N.; Schaake, J. C.; Draijer, S.; Wang, L.

    2011-12-01

    The skill of hydrologic ensemble forecasts has improved in recent years through a better understanding of climate variability, better climate forecasts and new data assimilation techniques. Having been extensively utilized for probabilistic water supply forecasting, interest is developing to utilize these forecasts in operational decision making. Hydrologic ensemble forecast members typically have inherent biases in flow timing and volume caused by (1) structural errors in the models used, (2) systematic errors in the data used to calibrate those models, (3) uncertain initial hydrologic conditions, and (4) uncertainties in the forcing datasets. Furthermore, hydrologic models have often not been developed for operational decision points and ensemble forecasts are thus not always available where needed. A statistical post-processor can be used to address these issues. The post-processor should (1) correct for systematic biases in flow timing and volume, (2) preserve the skill of the available raw forecasts, (3) preserve spatial and temporal correlation as well as the uncertainty in the forecasted flow data, (4) produce adjusted forecast ensembles that represent the variability of the observed hydrograph to be predicted, and (5) preserve individual forecast traces as equally likely. The post-processor should also allow for the translation of available ensemble forecasts to hydrologically similar locations where forecasts are not available. This paper introduces an ensemble post-processor (EPP) developed in support of New York City water supply operations. The EPP employs a general linear model (GLM) to (1) adjust available ensemble forecast traces and (2) create new ensembles for (nearby) locations where only historic flow observations are available. The EPP is calibrated by developing daily and aggregated statistical relationships from historical flow observations and model simulations.
These are then used in operation to obtain the conditional probability density function (PDF) of the observations to be predicted, thus jointly adjusting individual ensemble members. These steps are executed in a normalized transformed space ('z'-space) to account for the strong non-linearity in the flow observations involved. A data window centered on each calibration date is used to minimize impacts from sampling errors and data noise. Testing on datasets from California and New York suggests that the EPP can successfully minimize biases in ensemble forecasts, while preserving the raw forecast skill in a 'days to weeks' forecast horizon and reproducing the variability of climatology for 'weeks to years' forecast horizons.
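
    The normalized transformed 'z'-space is commonly reached with a normal quantile transform; the sketch below uses Weibull plotting positions, which is an assumption here since the EPP's exact transform is not stated.

```python
import numpy as np
from statistics import NormalDist

def nqt(x):
    """Normal quantile transform: map flows to 'z'-space by replacing
    each value with the standard-normal quantile of its empirical
    (Weibull) plotting position. Assumes no ties in x."""
    x = np.asarray(x, dtype=float)
    ranks = x.argsort().argsort() + 1        # ranks 1..n
    p = ranks / (len(x) + 1.0)               # Weibull plotting positions
    nd = NormalDist()
    return np.array([nd.inv_cdf(pi) for pi in p])
```

    The GLM is fitted between transformed forecasts and transformed observations; results are mapped back through the inverse of the same empirical transform.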

  12. Forecasting the Onset Time of Volcanic Eruptions Using Ground Deformation Data

    NASA Astrophysics Data System (ADS)

    Blake, S.; Cortes, J. A.

    2016-12-01

    The pre-eruptive inflation of the ground surface is a well-known phenomenon at many volcanoes. In a number of intensively studied cases, elevation and/or radial tilt increase with time (t) towards a limiting value by following a decaying exponential with characteristic timescale τ (Kilauea and Mauna Loa: Dvorak and Okamura 1987, Lengliné et al., 2008) or, after sufficiently long times, by following the sum of two such functions such that two timescales, τ1 and τ2, are required to describe the temporal pattern of inflation (Axial Seamount: Nooner and Chadwick, 2009). We have used the Levenberg-Marquardt non-linear fit algorithm to analyse data for 18 inflation periods at Krafla volcano, Iceland (Björnsson and Eysteinsson, 1998) and found the same functional relationship. Pooling all of the available data from 25 eruptions at 4 volcanoes shows that the duration of inflation before an eruption or shallow intrusion (t*) is comparable to τ (or the longer of τ1 and τ2) and follows an almost 1:1 linear relationship (r2 ≈ 0.8). We also find that this scaling is replicated by Monte Carlo simulations of physics-based forward models of hydraulically connected dual magma chamber systems which erupt when the chamber pressure reaches a threshold value. These results lead to a new forecasting method which we describe and assess here: if τ can be constrained during an on-going inflation period, then the statistical distribution of t*/τ values calibrated from other pre-eruptive inflation periods allows the probability of an eruption starting before (or after) a specified time to be estimated. The time at which there is a specified probability of an eruption starting can also be forecast. These approaches rely on fitting deformation data up to time t in order to obtain τ(t) which is then used to forecast t*. Forecasts can be updated after each new deformation measurement.
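
    Fitting the single-timescale inflation model y(t) = a·(1 − exp(−t/τ)) can be sketched without a full Levenberg-Marquardt implementation: for a fixed τ the amplitude a enters linearly, so a one-dimensional grid search over τ suffices for illustration. This is a stand-in for the authors' fit, not their code.

```python
import numpy as np

def fit_inflation(t, y, tau_grid=None):
    """Fit y = a*(1 - exp(-t/tau)) by profiling out the linear amplitude
    at each candidate tau. Returns (a, tau); the onset time t* is then
    forecast as comparable to tau."""
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    if tau_grid is None:
        tau_grid = np.linspace(0.05, 5.0, 500) * (t[-1] - t[0])
    best = (np.inf, None, None)
    for tau in tau_grid:
        g = 1.0 - np.exp(-t / tau)
        a = (g @ y) / (g @ g)                # closed-form amplitude
        sse = np.sum((y - a * g) ** 2)
        if sse < best[0]:
            best = (sse, a, tau)
    return best[1], best[2]
```

    Refitting after each new deformation measurement updates τ(t) and hence the t* forecast, as described above.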

  13. Constraining modified theories of gravity with the galaxy bispectrum

    NASA Astrophysics Data System (ADS)

    Yamauchi, Daisuke; Yokoyama, Shuichiro; Tashiro, Hiroyuki

    2017-12-01

    We explore the use of the galaxy bispectrum induced by nonlinear gravitational evolution as a possible probe for testing general scalar-tensor theories with second-order equations of motion. We find that the time dependence of the leading second-order kernel is approximately characterized by one parameter, the second-order index, which is expected to trace the higher-order growth history of the Universe. We show that this new parameter can carry significant new information about the nonlinear growth of structure. We forecast future constraints on the second-order index as well as the equation-of-state parameter and the growth index.

  14. Extraction and prediction of indices for monsoon intraseasonal oscillations: an approach based on nonlinear Laplacian spectral analysis

    NASA Astrophysics Data System (ADS)

    Sabeerali, C. T.; Ajayamohan, R. S.; Giannakis, Dimitrios; Majda, Andrew J.

    2017-11-01

    An improved index for real-time monitoring and forecast verification of monsoon intraseasonal oscillations (MISOs) is introduced using the recently developed nonlinear Laplacian spectral analysis (NLSA) technique. Using NLSA, a hierarchy of Laplace-Beltrami (LB) eigenfunctions is extracted from unfiltered daily rainfall data from the Global Precipitation Climatology Project over the south Asian monsoon region. Two modes representing the full life cycle of the northeastward-propagating boreal summer MISO are identified from the hierarchy of LB eigenfunctions. These modes have a number of advantages over MISO modes extracted via extended empirical orthogonal function analysis, including higher memory and predictability; stronger amplitude and higher fractional explained variance over the western Pacific, Western Ghats, and adjoining Arabian Sea regions; and more realistic representation of the regional heat sources over the Indian and Pacific Oceans. Real-time prediction of NLSA-derived MISO indices is demonstrated via extended-range hindcasts based on NCEP Coupled Forecast System version 2 operational output. It is shown that in these hindcasts the NLSA MISO indices remain predictable out to ˜3 weeks.

  15. On the limits of probabilistic forecasting in nonlinear time series analysis II: Differential entropy.

    PubMed

    Amigó, José M; Hirata, Yoshito; Aihara, Kazuyuki

    2017-08-01

    In a previous paper, the authors studied the limits of probabilistic prediction in nonlinear time series analysis in a perfect model scenario, i.e., in the ideal case that the uncertainty of an otherwise deterministic model is due only to the finite precision of the observations. The model consisted of the symbolic dynamics of a measure-preserving transformation with respect to a finite partition of the state space, and the quality of the predictions was measured by the so-called ignorance score, which is a conditional entropy. In practice, though, partitions are dispensed with by treating numerical and experimental data as continuous, which prompts us in this paper to trade the Shannon entropy for the differential entropy. Despite technical differences, we show that the core of the previous results also holds in this extended scenario for sufficiently high precision. The corresponding imperfect model scenario is revisited as well, since it is relevant for applications. The theoretical part and its application to probabilistic forecasting are illustrated with numerical simulations and a new prediction algorithm.

  16. Developing the remote sensing-based early warning system for monitoring TSS concentrations in Lake Mead.

    PubMed

    Imen, Sanaz; Chang, Ni-Bin; Yang, Y Jeffrey

    2015-09-01

    Adjustment of the water treatment process to changes in water quality is a focus area for engineers and managers of water treatment plants. This capability depends on timely, quantitative monitoring of water quality in terms of total suspended solids (TSS) concentrations. This paper presents the development of a suite of nowcasting and forecasting methods using high-resolution remote-sensing-based monitoring techniques on a daily basis. First, the integrated data fusion and mining (IDFM) technique was applied to develop a near real-time monitoring system for daily nowcasting of TSS concentrations. Then a nonlinear autoregressive neural network with external input (NARXNET) model was selected and applied, on a rolling basis, to forecast the changes in TSS concentrations over time using the IDFM technique. The implementation of this integrated forecasting and nowcasting approach was assessed in a case study at Lake Mead, which hosts the water intake for Las Vegas, Nevada, in the water-stressed western U.S. Long-term monthly averaged results showed no simultaneous impact of forest fire events on accelerating the rise of TSS concentration. However, the results showed a probable impact of a decade of drought on increasing TSS concentration in the Colorado River Arm and Overton Arm. Results of the forecasting model highlight the reservoir water level as a significant parameter for predicting TSS in Lake Mead. In addition, the R-squared value of 0.98 and the root mean square error of 0.5 between the observed and predicted TSS values demonstrate the reliability and application potential of this remote-sensing-based early warning system for TSS projections at a drinking water intake. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Future mission studies: Preliminary comparisons of solar flux models

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.

    1991-01-01

    The results of comparisons of solar flux models are presented. (The wavelength lambda = 10.7 cm radio flux is the best indicator of the strength of ionizing radiations, such as solar ultraviolet and x-ray emissions, that directly affect atmospheric density, thereby changing the orbit lifetime of satellites. Thus, accurate forecasting of the solar flux F sub 10.7 is crucial for orbit determination of spacecraft.) The measured solar flux recorded by the National Oceanic and Atmospheric Administration (NOAA) is compared against the forecasts made by Schatten, MSFC, and NOAA itself. The possibility of a combined linear, unbiased minimum-variance estimate that properly combines all three models into one that minimizes the variance is also discussed; such an estimate would combine the physics inherent in each model. This is considered to be the end point of the statistical approach to solar flux forecasting, short of any nonlinear chaotic approach.

  18. Demystifying the Complexities of Gravity Wave Dynamics in the Middle Atmosphere: a Roadmap to Improved Weather Forecasts through High-Fidelity Modeling

    NASA Astrophysics Data System (ADS)

    Mixa, T.; Fritts, D. C.; Bossert, K.; Laughman, B.; Wang, L.; Lund, T.; Kantha, L. H.

    2017-12-01

    Gravity waves play a profound role in the mixing of the atmosphere, transporting vast amounts of momentum and energy among different altitudes as they propagate vertically. Above 60 km in the middle atmosphere, high wave amplitudes enable a series of complex, nonlinear interactions with the background environment that produce highly localized wind and temperature variations which alter the layering structure of the atmosphere. These small-scale interactions account for a significant portion of energy transport in the middle atmosphere, but they are difficult to characterize, occurring at spatial scales that are both challenging to observe with ground instruments and prohibitively small to include in weather forecasting models. We use high-fidelity numerical simulations to analyze these nuanced wave interactions, with the aim of improving our understanding of the dynamics and the accuracy of long-term weather forecasting.

  19. Using Time-Series Regression to Predict Academic Library Circulations.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.

    1984-01-01

    Four methods were used to forecast monthly circulation totals in 15 midwestern academic libraries: dummy time-series regression, lagged time-series regression, simple average (straight-line forecasting), monthly average (naive forecasting). In tests of forecasting accuracy, dummy regression method and monthly mean method exhibited smallest average…

  20. Bayesian flood forecasting methods: A review

    NASA Astrophysics Data System (ADS)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly minimized. It is widely recognized that quantifying and reducing the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to estimate floods: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty.
In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvement. Future research in the context of Bayesian flood forecasting should focus on the assimilation of newly available sources of information and on improved methods for assessing predictive performance.

  1. Baseline predictability of daily east Asian summer monsoon circulation indices

    NASA Astrophysics Data System (ADS)

    Ai, Shucong; Chen, Quanliang; Li, Jianping; Ding, Ruiqiang; Zhong, Quanjia

    2017-05-01

    The nonlinear local Lyapunov exponent (NLLE) method is adopted to quantitatively determine the predictability limit of East Asian summer monsoon (EASM) intensity indices on a synoptic timescale. The predictability limit of EASM indices varies widely according to the definitions of the indices. EASM indices defined by zonal wind shear have a limit of around 7 days, which is higher than the predictability limit of EASM indices defined by sea level pressure (SLP) difference and meridional wind shear (about 5 days). The initial error of EASM indices defined by SLP difference and meridional wind shear grows faster than that of indices defined by zonal wind shear. Furthermore, the indices defined by zonal wind shear appear to fluctuate at lower frequencies, whereas the indices defined by SLP difference and meridional wind shear generally fluctuate at higher frequencies. This result may explain why the daily variability of the EASM indices defined by zonal wind shear tends to be more predictable than that of indices defined by SLP difference and meridional wind shear. Analysis of the temporal correlation coefficient (TCC) skill for EASM indices obtained from observations and from NCEP's Global Ensemble Forecast System (GEFS) historical weather forecast dataset shows that GEFS has higher forecast skill for the EASM indices defined by zonal wind shear than for indices defined by SLP difference and meridional wind shear. The predictability limit estimated by the NLLE method is shorter than that in GEFS. In addition, the June-September average TCC skill for different daily EASM indices shows significant interannual variations from 1985 to 2015 in GEFS. However, the TCC for different types of EASM indices does not show coherent interannual fluctuations.
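
    The error-growth idea behind NLLE-style predictability estimates can be illustrated on a toy chaotic system: perturb a trajectory by a tiny amount, track the mean logarithmic error growth, and read off the growth rate. This is a generic sketch of the technique, not the authors' implementation; the logistic map with r = 4 is used because its largest Lyapunov exponent is known to be ln 2.

```python
import numpy as np

def logistic_traj(x0, n, r=4.0):
    """Iterate the logistic map x -> r * x * (1 - x), returning the trajectory."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1 - x[i - 1])
    return x

rng = np.random.default_rng(0)
n_steps, eps, n_samples = 12, 1e-8, 500
log_err = np.zeros(n_steps)
for _ in range(n_samples):
    x0 = rng.uniform(0.1, 0.9)
    # Absolute divergence between a trajectory and a slightly perturbed copy
    err = np.abs(logistic_traj(x0 + eps, n_steps) - logistic_traj(x0, n_steps))
    log_err += np.log(err)
log_err /= n_samples

# Mean log-error grows linearly at the rate of the largest Lyapunov exponent;
# the predictability limit is reached where this growth saturates.
lyap = (log_err[8] - log_err[0]) / 8
```

    For an index time series rather than a known map, the NLLE approach estimates the same growth rate locally from analog states in the observed data.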

  2. Semi-nonparametric VaR forecasts for hedge funds during the recent crisis

    NASA Astrophysics Data System (ADS)

    Del Brio, Esther B.; Mora-Valencia, Andrés; Perote, Javier

    2014-05-01

    The need to provide accurate value-at-risk (VaR) forecasting measures has triggered an important literature in econophysics. Although accurate VaR models and methodologies are particularly in demand among hedge fund managers, few articles are specifically devoted to implementing new techniques for forecasting the VaR of hedge fund returns. This article advances these issues by comparing the performance of risk measures based on parametric distributions (the normal, Student's t and skewed-t), semi-nonparametric (SNP) methodologies based on Gram-Charlier (GC) series, and the extreme value theory (EVT) approach. Our results show that the normal-, Student's t- and skewed-t-based methodologies fail to forecast hedge fund VaR, whilst the SNP and EVT approaches succeed in doing so accurately. We extend these results to the multivariate framework by providing an explicit formula for the GC copula and its density that encompasses the Gaussian copula and accounts for non-linear dependences. We show that the VaR obtained from the meta-GC accurately captures portfolio risk and outperforms regulatory VaR estimates obtained through the meta-Gaussian and Student's t distributions.

  3. Increasing the temporal resolution of direct normal solar irradiance forecasted series

    NASA Astrophysics Data System (ADS)

    Fernández-Peruchena, Carlos M.; Gastón, Martin; Schroedter-Homscheidt, Marion; Marco, Isabel Martínez; Casado-Rubio, José L.; García-Moya, José Antonio

    2017-06-01

    A detailed knowledge of the solar resource is a critical point in the design and control of Concentrating Solar Power (CSP) plants. In particular, accurate forecasting of solar irradiance is essential for the efficient operation of solar thermal power plants, the management of energy markets, and the widespread implementation of this technology. Numerical weather prediction (NWP) models are commonly used for solar radiation forecasting. In the ECMWF deterministic forecasting system, all forecast parameters are commercially available worldwide at 3-hourly intervals. Unfortunately, as Direct Normal solar Irradiance (DNI) exhibits great variability due to the dynamic effects of passing clouds, a 3-h time resolution is insufficient for accurate simulation of CSP plants, whose response to DNI is nonlinear and governed by various thermal inertias. DNI series of hourly or sub-hourly resolution are normally used for accurate modeling and analysis of transient processes in CSP technologies. In this context, the objective of this study is to propose a methodology for generating synthetic DNI time series at 1-h (or higher) temporal resolution from 3-h DNI series. The methodology is based upon patterns defined with the help of the clear-sky envelope approach together with a forecast of the maximum DNI value, and it has been validated with high-quality measured DNI data.

  4. Nonlinear dynamics of the magnetosphere and space weather

    NASA Technical Reports Server (NTRS)

    Sharma, A. Surjalal

    1996-01-01

    The solar wind-magnetosphere system exhibits coherence on the global scale, and such behavior can arise from nonlinearity in the dynamics. Observational time series data were used together with phase space reconstruction techniques to analyze the magnetospheric dynamics. Analysis of the solar wind, auroral electrojet and Dst indices showed low dimensionality of the dynamics, and accurate predictions can be made with an input/output model. The predictability of the magnetosphere, in spite of its apparent complexity, arises from its dynamical synchronism with the solar wind. The electrodynamic coupling between different regions of the magnetosphere yields its coherent, low-dimensional behavior. Data from multiple satellites and ground stations can be used to develop a spatio-temporal model that identifies the coupling between different regions. These nonlinear dynamical models provide space weather forecasting capabilities.

  5. Nonlinear modulation of the HI power spectrum on ultra-large scales. I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Umeh, Obinna; Maartens, Roy; Santos, Mario, E-mail: umeobinna@gmail.com, E-mail: roy.maartens@gmail.com, E-mail: mgrsantos@uwc.ac.za

    2016-03-01

    Intensity mapping of the neutral hydrogen brightness temperature promises to provide a three-dimensional view of the universe on very large scales. Nonlinear effects are typically thought to alter only the small-scale power, but we show how they may bias the extraction of cosmological information contained in the power spectrum on ultra-large scales. For linear perturbations to remain valid on large scales, we need to renormalize perturbations at higher order. In the case of intensity mapping, the second-order contribution to clustering from weak lensing dominates the nonlinear contribution at high redshift. Renormalization modifies the mean brightness temperature and therefore the evolution bias. It also introduces a term that mimics white noise. These effects may influence forecasting analysis on ultra-large scales.

  6. Time-series-based hybrid mathematical modelling method adapted to forecast automotive and medical waste generation: Case study of Lithuania.

    PubMed

    Karpušenkaitė, Aistė; Ruzgas, Tomas; Denafas, Gintaras

    2018-05-01

    The aim of the study was to create a hybrid forecasting method that could produce higher-accuracy forecasts than the previously used 'pure' time-series methods. Those methods had already been tested on total automotive waste, hazardous automotive waste, and total medical waste generation, but demonstrated at least a 6% error rate in various cases, and efforts were made to reduce it further. The newly developed hybrid models used a random start generation method to combine the advantages of different time-series methods, which increased the accuracy of forecasts by 3%-4% in the hazardous automotive waste and total medical waste generation cases; the new model did not increase the accuracy of total automotive waste generation forecasts. The developed models' abilities to produce short- and mid-term forecasts were tested across a range of prediction horizons.

  7. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method.

    PubMed

    Yang, Jun-He; Cheng, Ching-Hsue; Chan, Chia-Pan

    2017-01-01

    Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model that first estimates missing values and then applies variable selection to forecast a reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets are concatenated, based on the ordering of the data, into an integrated research dataset. The proposed time-series forecasting model has three main steps. First, this study applies five imputation methods to estimate the missing values rather than deleting them directly. Second, key variables are identified via factor analysis, and unimportant variables are then deleted sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level, which is compared with the listed methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model, when applied to variable selection with full variables, has better forecasting performance than the listed models. In addition, this experiment shows that the proposed variable selection can help the five forecast methods used here improve their forecasting capability.

  8. Forecasting daily patient volumes in the emergency department.

    PubMed

    Jones, Spencer S; Thomas, Alun; Evans, R Scott; Welch, Shari J; Haug, Peter J; Snow, Gregory L

    2008-02-01

    Shifts in the supply of and demand for emergency department (ED) resources make the efficient allocation of ED resources increasingly important. Forecasting is a vital activity that guides decision-making in many areas of economic, industrial, and scientific planning, but has gained little traction in the health care industry. There are few studies that explore the use of forecasting methods to predict patient volumes in the ED. The goals of this study are to explore and evaluate the use of several statistical forecasting methods to predict daily ED patient volumes at three diverse hospital EDs and to compare the accuracy of these methods to the accuracy of a previously proposed forecasting method. Daily patient arrivals at three hospital EDs were collected for the period January 1, 2005, through March 31, 2007. The authors evaluated the use of seasonal autoregressive integrated moving average, time series regression, exponential smoothing, and artificial neural network models to forecast daily patient volumes at each facility. Forecasts were made for horizons ranging from 1 to 30 days in advance. The forecast accuracy achieved by the various forecasting methods was compared to the forecast accuracy achieved when using a benchmark forecasting method already available in the emergency medicine literature. All time series methods considered in this analysis provided improved in-sample model goodness of fit. However, post-sample analysis revealed that time series regression models that augment linear regression models by accounting for serial autocorrelation offered only small improvements in terms of post-sample forecast accuracy, relative to multiple linear regression models, while seasonal autoregressive integrated moving average, exponential smoothing, and artificial neural network forecasting models did not provide consistently accurate forecasts of daily ED volumes. 
This study confirms the widely held belief that daily demand for ED services is characterized by seasonal and weekly patterns. The authors compared several time series forecasting methods to a benchmark multiple linear regression model. The results suggest that the existing methodology proposed in the literature, multiple linear regression based on calendar variables, is a reasonable approach to forecasting daily patient volumes in the ED. However, the authors conclude that regression-based models that incorporate calendar variables, account for site-specific special-day effects, and allow for residual autocorrelation provide a more appropriate, informative, and consistently accurate approach to forecasting daily ED patient volumes.
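
    The benchmark the authors recommend, multiple linear regression on calendar variables, can be sketched with simulated arrival counts. The weekly volume profile below is invented for illustration; site-specific special-day effects and residual autocorrelation terms are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
n_days = 364
dow = np.arange(n_days) % 7                            # day-of-week index, 0 = Sunday
weekly = np.array([110.0, 95, 92, 90, 93, 100, 120])   # hypothetical mean ED volumes
y = weekly[dow] + rng.normal(0.0, 5.0, n_days)         # simulated daily arrivals

# Design matrix: intercept plus six day-of-week dummies (Sunday as the baseline)
X = np.column_stack([np.ones(n_days)] + [(dow == d).astype(float) for d in range(1, 7)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

def forecast(day_of_week):
    """Predicted volume for a future date depends only on its weekday here."""
    x = np.r_[1.0, np.eye(7)[day_of_week][1:]]
    return float(x @ beta)
```

    Because the design matrix depends only on the calendar, forecasts at any horizon reduce to looking up the fitted weekday effect, which is why such models remain consistently accurate far in advance.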

  9. Performance Comparison of the European Storm Surge Models and Chaotic Model in Forecasting Extreme Storm Surges

    NASA Astrophysics Data System (ADS)

    Siek, M. B.; Solomatine, D. P.

    2009-04-01

    Storm surge modeling has developed rapidly over the past 30 years. A number of significant advances in operational storm surge models have been implemented and tested, including: refining computational grids, calibrating the model, using better numerical schemes (i.e., more realistic model physics for air-sea interaction), and implementing data assimilation and ensemble model forecasts. This paper addresses the performance comparison between existing European storm surge models and recently developed methods of nonlinear dynamics and chaos theory in forecasting storm surge dynamics. The chaotic model is built using adaptive local models based on the dynamical neighbours in the reconstructed phase space of observed time series data. The comparison focuses on model accuracy in forecasting a recent extreme storm surge in the North Sea on November 9th, 2007, which hit the coastlines of several European countries. The combination of a high tide, north-westerly winds exceeding 50 mph and low pressure produced an exceptional storm tide, with tidal levels exceeding normal sea levels by 3 meters. Flood warnings were issued for the east coast of Britain and the entire Dutch coast. The Maeslant barrier's two arc-shaped steel doors in Europe's biggest port, Rotterdam, were closed for the first time since its construction in 1997 due to this storm surge. For comparison with the chaotic model's performance, forecast data from several European physically-based storm surge models were provided by: BSH Germany, DMI Denmark, DNMI Norway, KNMI Netherlands and MUMM Belgium. The performance comparison was made over testing datasets for two periods/conditions: a non-stormy period (1-Sep-2007 till 14-Oct-2007) and a stormy period (15-Oct-2007 till 20-Nov-2007). A scalar chaotic model with optimized parameters was developed by utilizing an hourly training dataset of observations (11-Sep-2005 till 31-Aug-2007).
    The comparison results indicated that the chaotic model yields better forecasts than the existing European storm surge models. The best performance of the European storm surge models for non-storm and storm conditions was achieved by KNMI (with Kalman filter data assimilation) and BSH, with errors of 8.95 cm and 10.92 cm, respectively. The chaotic model, by contrast, provides 6- and 48-hour forecasts with errors of 3.10 cm and 8.55 cm for the non-storm condition, and 5.04 cm and 15.21 cm for the storm condition, respectively. The chaotic model provides better forecasts primarily because its predictions are produced by local models that identify similar developments of storm surges in the past. In practice, the chaotic model can serve as a reliable and accurate model to support decision-makers in operational ship navigation and flood forecasting.
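
    The core of the "adaptive local models in reconstructed phase space" idea can be sketched generically: delay-embed the observed scalar series, find the nearest neighbours of the current state, and average their successors. This is a simplified zeroth-order local model on a toy signal, not the authors' optimized implementation; the embedding dimension, lag, and neighbour count are illustrative choices.

```python
import numpy as np

def embed(x, dim=3, lag=1):
    """Delay-coordinate embedding of a scalar series into dim-dimensional vectors."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

def local_model_forecast(x, dim=3, lag=1, k=5):
    """Predict the next value by averaging the successors of the k nearest delay vectors."""
    E = embed(x, dim, lag)
    query, library = E[-1], E[:-1]
    d = np.linalg.norm(library - query, axis=1)
    idx = np.argsort(d)[:k]
    targets = x[(dim - 1) * lag + idx + 1]   # value that followed each neighbour
    return float(targets.mean())

# Demo: one-step forecast of a sampled sine wave
x = np.sin(0.1 * np.arange(1000))
pred = local_model_forecast(x)
```

    Iterating this one-step prediction, or fitting a local linear map to the neighbours instead of averaging, gives the multi-hour forecasts compared in the study.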

  10. A Solar Time-Based Analog Ensemble Method for Regional Solar Power Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Brian S; Zhang, Xinmin; Li, Yuan

    This paper presents a new analog ensemble method for day-ahead regional photovoltaic (PV) power forecasting with hourly resolution. By utilizing open weather forecast and power measurement data, this prediction method searches a set of historical data with similar meteorological conditions (temperature and irradiance) and astronomical date (solar time and earth declination angle). Further, clustering and blending strategies are applied to improve its accuracy in regional PV forecasting. The robustness of the proposed method is demonstrated with three different numerical weather prediction models, the North American Mesoscale Forecast System, the Global Forecast System, and the Short-Range Ensemble Forecast, for both region-level and single-site-level PV forecasts. Using real measured data, the new forecasting approach is applied to the load zone in Southeastern Massachusetts as a case study. The normalized root mean square error (NRMSE) is reduced by 13.80%-61.21% when compared with three tested baselines.
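
    The analog search at the heart of such methods can be sketched for a single site: find the historical days whose forecast features most resemble today's forecast, and average the power actually measured on those days. The archive, the power curve, and the feature set below are synthetic stand-ins; the paper's solar-time and declination-angle matching and its clustering/blending steps are omitted.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic archive: (temperature [C], irradiance [W/m^2]) forecasts and measured power
hist_feat = rng.uniform([0.0, 0.0], [35.0, 1000.0], size=(500, 2))
hist_power = 0.004 * hist_feat[:, 1] * (1.0 - 0.004 * (hist_feat[:, 0] - 25.0))

def analog_forecast(feat, k=10):
    """Average the measured power of the k most similar historical forecasts."""
    mu, sd = hist_feat.mean(0), hist_feat.std(0)
    d = np.linalg.norm((hist_feat - mu) / sd - (feat - mu) / sd, axis=1)
    return float(hist_power[np.argsort(d)[:k]].mean())

# Forecast for a day predicted to have 25 C and 800 W/m^2
pred = analog_forecast(np.array([25.0, 800.0]))
```

    Standardizing the features before computing distances keeps one variable from dominating the similarity measure, a common choice in analog ensemble implementations.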

  11. How is the weather? Forecasting inpatient glycemic control

    PubMed Central

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B; Thompson, Bithika M

    2017-01-01

    Aim: Apply methods of damped trend analysis to forecast inpatient glycemic control. Method: Observed and calculated point-of-care blood glucose data trends were determined over 62 weeks. Mean absolute percent error was used to calculate differences between observed and forecasted values. Comparisons were drawn between model results and linear regression forecasting. Results: The forecasted mean glucose trends observed during the first 24 and 48 weeks of projections compared favorably to the results provided by linear regression forecasting. However, in some scenarios, the damped trend method changed inferences compared with linear regression. In all scenarios, mean absolute percent error values remained below the 10% accepted by demand industries. Conclusion: Results indicate that forecasting methods historically applied within demand industries can project future inpatient glycemic control. Additional study is needed to determine if forecasting is useful in the analyses of other glucometric parameters and, if so, how to apply the techniques to quality improvement. PMID:29134125
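
    The damped trend method referenced above is commonly formulated as Holt's linear exponential smoothing with a damping parameter φ < 1, which flattens long-horizon projections. The sketch below is a standard textbook formulation with illustrative smoothing parameters, not necessarily the authors' exact configuration.

```python
import numpy as np

def damped_trend_forecast(y, alpha=0.5, beta=0.2, phi=0.9, horizon=4):
    """Holt's linear method with a damped trend; returns forecasts 1..horizon ahead."""
    y = np.asarray(y, dtype=float)
    level, trend = y[0], y[1] - y[0]
    for t in range(1, len(y)):
        prev_level = level
        level = alpha * y[t] + (1 - alpha) * (prev_level + phi * trend)
        trend = beta * (level - prev_level) + (1 - beta) * phi * trend
    # h-step forecast: level + (phi + phi^2 + ... + phi^h) * trend
    damp = np.cumsum(phi ** np.arange(1, horizon + 1))
    return level + damp * trend
```

    With φ = 1 this reduces to ordinary Holt linear trend (and behaves like the linear regression comparator at long horizons); φ < 1 is what changes the long-run inference, as the abstract notes.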

  12. Spatio-Temporal Change Modeling of Lulc: a Semantic Kriging Approach

    NASA Astrophysics Data System (ADS)

    Bhattacharjee, S.; Ghosh, S. K.

    2015-07-01

    Spatio-temporal land-use/land-cover (LULC) change modeling is important for forecasting the future LULC distribution, which may facilitate natural resource management, urban planning, etc. The spatio-temporal trend of LULC change often exhibits nonlinear behavior due to various dynamic factors, such as human intervention (e.g., urbanization) and environmental factors. Hence, proper forecasting of the LULC distribution should involve the study and trend modeling of historical data. The existing literature reports that meteorological attributes (e.g., NDVI, LST, MSI) are semantically related to the terrain. Being influenced by terrestrial dynamics, the temporal changes of these attributes depend on the LULC properties. Hence, incorporating meteorological knowledge into the temporal prediction process may help in developing an accurate forecasting model. This work attempts to study the change in inter-annual LULC pattern and the distribution of different meteorological attributes of a region in Kolkata (a metropolitan city in India) during the years 2000-2010, and to forecast the future spread of LULC using the semantic kriging (SemK) approach. A new time-series variant of SemK, namely Rev-SemKts, is proposed to capture the multivariate semantic associations between different attributes. Empirical analysis shows that augmenting the spatio-temporal modeling of meteorological attributes with semantic knowledge facilitates more precise forecasting of the LULC pattern.

  13. Forecasting the mortality rates of Malaysian population using Heligman-Pollard model

    NASA Astrophysics Data System (ADS)

    Ibrahim, Rose Irnawaty; Mohd, Razak; Ngataman, Nuraini; Abrisam, Wan Nur Azifah Wan Mohd

    2017-08-01

    Actuaries, demographers and other professionals have always been aware of the critical importance of mortality forecasting, due to the declining trend of mortality and continuous increases in life expectancy. The Heligman-Pollard model was introduced in 1980 and has been widely used by researchers in modelling and forecasting future mortality. This paper aims to estimate an eight-parameter model based on Heligman and Pollard's law of mortality. Since the model involves nonlinear equations that are difficult to solve explicitly, the Matrix Laboratory Version 7.0 (MATLAB 7.0) software will be used to estimate the parameters. The Statistical Package for the Social Sciences (SPSS) will be applied to forecast all the parameters using an Autoregressive Integrated Moving Average (ARIMA) model. Empirical data sets for the Malaysian population over the period 1981 to 2015, for both genders, will be considered, in which the period 1981 to 2010 is used as the "training set" and the period 2011 to 2015 as the "testing set". In order to investigate the accuracy of the estimation, the forecast results will be compared against actual mortality rates. The results show that the Heligman-Pollard model fits well for the male population at all ages, while the model seems to underestimate mortality rates for the female population at older ages.
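
    The eight-parameter Heligman-Pollard law expresses the odds of death at age x as the sum of a childhood term, an accident hump, and a Gompertz-like senescent term: q_x/(1 - q_x) = A^((x+B)^C) + D·exp(-E·(ln x - ln F)²) + G·H^x. Evaluating it is straightforward; the parameter values below are illustrative only, not fitted to the Malaysian data.

```python
import numpy as np

def heligman_pollard_q(x, A, B, C, D, E, F, G, H):
    """Heligman-Pollard law: q_x / (1 - q_x) = A**((x+B)**C) + D*exp(-E*(ln x - ln F)**2) + G*H**x."""
    ratio = (A ** ((x + B) ** C)                              # childhood mortality
             + D * np.exp(-E * (np.log(x) - np.log(F)) ** 2)  # accident hump, centred near age F
             + G * H ** x)                                    # senescent (Gompertz-like) term
    return ratio / (1.0 + ratio)

# Illustrative (not fitted) parameter values
params = dict(A=5e-4, B=0.01, C=0.10, D=1e-3, E=10.0, F=20.0, G=5e-5, H=1.10)
ages = np.arange(1, 91, dtype=float)
q = heligman_pollard_q(ages, **params)
```

    Fitting the eight parameters to observed q_x by nonlinear least squares, and then forecasting each fitted parameter with an ARIMA model, reproduces the two-stage workflow the abstract describes.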

  14. Use of bias correction techniques to improve seasonal forecasts for reservoirs - A case-study in northwestern Mediterranean.

    PubMed

    Marcos, Raül; Llasat, Ma Carmen; Quintana-Seguí, Pere; Turco, Marco

    2018-01-01

    In this paper, we have compared different bias correction methodologies to assess whether they could improve the performance of a seasonal prediction model for volume anomalies in the Boadella reservoir (northwestern Mediterranean). The bias correction adjustments have been applied to precipitation and temperature from the European Centre for Medium-Range Weather Forecasts (ECMWF) System 4 (S4). We have used three bias correction strategies: two linear (mean bias correction, BC, and linear regression, LR) and one non-linear (Model Output Statistics analogs, MOS-analog). The results have been compared with climatology and persistence. The volume-anomaly model is a previously computed multiple linear regression that ingests precipitation, temperature and in-flow anomaly data to simulate monthly volume anomalies. The potential utility for end-users has been assessed using economic value curve areas. We have studied the S4 hindcast period 1981-2010 for each month of the year and up to seven months ahead, considering an ensemble of 15 members. We have shown that the MOS-analog and LR bias corrections can improve on the original S4. The application to volume anomalies points towards the possibility of introducing bias correction methods as a tool to improve water resource seasonal forecasts in an end-user context of climate services. In particular, the MOS-analog approach generally gives better results than the other approaches in late autumn and early winter. Copyright © 2017 Elsevier B.V. All rights reserved.
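
    The two linear corrections (BC and LR) can be sketched with synthetic forecast/observation pairs; the numbers below are invented, and the non-linear MOS-analog variant, which searches the hindcast archive for analog situations, is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)
obs = rng.normal(15.0, 3.0, 360)                     # synthetic observed monthly temperature
fcst = 1.2 * obs + 2.0 + rng.normal(0.0, 1.0, 360)   # model forecast with bias and noise

# Mean bias correction (BC): subtract the mean forecast error
bc = fcst - (fcst.mean() - obs.mean())

# Linear regression correction (LR): regress observations on forecasts
slope, intercept = np.polyfit(fcst, obs, 1)
lr = slope * fcst + intercept

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))
```

    BC removes only the mean offset, while LR also rescales amplitude errors, which is why LR (and MOS-analog) can outperform BC when the model's variability is biased.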

  15. Wave Forecasting in Muddy Coastal Environments: Model Development Based on Real-Time Observations

    DTIC Science & Technology

    2003-09-30

    oversees the operation of WAVCIS and tasks pertaining to it. Sheremet is responsible for data analysis. WORK COMPLETED The main effort has been...8, 1121-1131, 1978. Foda, M.A., J.R. Hunt and H.-T. Chou, A nonlinear model for the fluidization of marine mud by waves, J. Geophys. Res. 98

  16. Nonlinear dynamics of global atmospheric and Earth-system processes

    NASA Technical Reports Server (NTRS)

    Saltzman, Barry; Ebisuzaki, Wesley; Maasch, Kirk A.; Oglesby, Robert; Pandolfo, Lionel

    1990-01-01

    Researchers are continuing their studies of the nonlinear dynamics of global weather systems. Sensitivity analyses of large-scale dynamical models of the atmosphere (i.e., general circulation models, GCMs) were performed to establish the role of satellite signatures of soil moisture, sea surface temperature, snow cover, and sea ice as crucial boundary conditions determining global weather variability. To complete their study of the bimodality of the planetary wave states, they are using the dynamical systems approach to construct a low-order theoretical explanation of this phenomenon. This work should have important implications for extended range forecasting of low-frequency oscillations, elucidating the mechanisms for the transitions between the two wave modes. Researchers are using the methods of jump analysis and attractor dimension analysis to examine the long-term satellite records of significant variables (e.g., long wave radiation and cloud amount), to explore the nature of mode transitions in the atmosphere, and to determine the minimum number of equations needed to describe the main weather variations with a low-order dynamical system. Where feasible they will continue to explore the applicability of the methods of complex dynamical systems analysis to the study of the global earth-system from an integrative viewpoint involving the roles of geochemical cycling and the interactive behavior of the atmosphere, hydrosphere, and biosphere.

  17. Hurst exponent: A Brownian approach to characterize the nonlinear behavior of red blood cells deformability

    NASA Astrophysics Data System (ADS)

    Mancilla Canales, M. A.; Leguto, A. J.; Riquelme, B. D.; León, P. Ponce de; Bortolato, S. A.; Korol, A. M.

    2017-12-01

    Ektacytometry techniques quantify red blood cell (RBC) deformability by measuring the elongation of suspended RBCs subjected to shear stress. Raw shear stress-elongation plots are difficult to interpret, thus most research papers apply data reduction methods characterizing the relationship through curve fitting. Our approach works with the naturally generated, photometrically recorded time series of the diffraction pattern of several million RBCs subjected to shear stress, and applies nonlinear quantifiers to study the fluctuations of these elongations. The development of new quantitative methods is crucial for restricting subjectivity in the study of cell behavior, especially methods capable of analyzing biological and mechanical aspects of cells in flowing conditions at the same time and comparing their dynamics. A patented optical system called the Erythrocyte Rheometer was used to evaluate viscoelastic properties of erythrocytes by ektacytometry. To analyze cell dynamics we used the technique of time delay coordinates, false nearest neighbors, the forecasting procedure proposed by Sugihara and May, and the Hurst exponent. The results are meaningful when comparing healthy samples with parasite-treated samples, suggesting that apparent noise associated with deterministic chaos can be used not only to distinguish but also to characterize biological and mechanical aspects of cells at the same time in flowing conditions.

  18. Forecasting peaks of seasonal influenza epidemics.

    PubMed

    Nsoesie, Elaine; Marathe, Madhav; Brownstein, John

    2013-06-21

    We present a framework for near real-time forecast of influenza epidemics using a simulation optimization approach. The method combines an individual-based model and a simple root finding optimization method for parameter estimation and forecasting. In this study, retrospective forecasts were generated for seasonal influenza epidemics using web-based estimates of influenza activity from Google Flu Trends for 2004-2005, 2007-2008 and 2012-2013 flu seasons. In some cases, the peak could be forecasted 5-6 weeks ahead. This study adds to existing resources for influenza forecasting and the proposed method can be used in conjunction with other approaches in an ensemble framework.
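The abstract's "simple root finding optimization method" is not spelled out; as a hedged, generic sketch, bisection can calibrate a single parameter of a monotone simulator so that its output matches an observed value. The exponential toy model below is an assumption for illustration, not the paper's individual-based model:

```python
def bisect_fit(simulate, observed, lo, hi, tol=1e-6):
    """Find the parameter value at which a monotonically increasing
    simulator output matches an observed value, by bisection."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if simulate(mid) < observed:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# toy example: calibrate a growth rate so cumulative cases after 10 steps hit 500
cases_after_10 = lambda r: 10 * (1 + r) ** 10
r_hat = bisect_fit(cases_after_10, 500.0, 0.0, 2.0)
```

In the paper's setting, the simulator would be the individual-based epidemic model and the target would be observed influenza activity, with the calibrated model then run forward to forecast the peak.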

  19. A proposal of monitoring and forecasting system for crustal activity in and around Japan using a large-scale high-fidelity finite element simulation codes

    NASA Astrophysics Data System (ADS)

    Hori, T.; Ichimura, T.

    2015-12-01

    Here we propose a system for monitoring and forecasting of crustal activity, especially great interplate earthquake generation and its preparation processes in subduction zones. Basically, we model great earthquake generation as frictional instability on the subducting plate boundary, so spatio-temporal variation in slip velocity on the plate interface should be monitored and forecasted. Although we can obtain continuous dense surface deformation data on land and partly at the sea bottom, the data obtained are not fully utilized for monitoring and forecasting. It is necessary to develop a physics-based data analysis system including (1) a structural model with the 3D geometry of the plate interface and material properties such as elasticity and viscosity, (2) calculation codes for crustal deformation and seismic wave propagation using (1), and (3) inverse analysis or data assimilation codes for both structure and fault slip using (1) and (2). To accomplish this, it is at least necessary to develop highly reliable large-scale simulation codes to calculate crustal deformation and seismic wave propagation for 3D heterogeneous structure. Ichimura et al. (2014, SC14) developed an unstructured FE non-linear seismic wave simulation code, which achieved physics-based urban earthquake simulation enhanced by 10.7 BlnDOF x 30 K time-steps. Ichimura et al. (2013, GJI) developed a high-fidelity FEM simulation code with a mesh generator to calculate crustal deformation in and around Japan with complicated surface topography and subducting plate geometry at 1 km mesh resolution. Further, for inverse analyses, Errol et al. (2012, BSSA) developed a waveform inversion code for modeling 3D crustal structure, and Agata et al. (2015, this meeting) improved the high-fidelity FEM code to apply an adjoint method for estimating fault slip and asthenosphere viscosity. Hence, we have large-scale simulation and analysis tools for monitoring.
Furthermore, we are developing methods for forecasting the slip velocity variation on the plate interface. The basic concept is given in Hori et al. (2014, Oceanography), introducing an ensemble-based sequential data assimilation procedure. Although the prototype described there is for an elastic half-space model, we will apply it to a 3D heterogeneous structure with the high-fidelity FE model.

  20. Hybrid Intrusion Forecasting Framework for Early Warning System

    NASA Astrophysics Data System (ADS)

    Kim, Sehun; Shin, Seong-Jun; Kim, Hyunwoo; Kwon, Ki Hoon; Han, Younggoo

    Recently, cyber attacks have become a serious hindrance to the stability of the Internet. These attacks exploit the interconnectivity of networks, propagate in an instant, and have become more sophisticated and evolutionary. Traditional Internet security systems such as firewalls, IDS and IPS are limited in their ability to detect recent cyber attacks in advance, as these systems respond to Internet attacks only after the attacks inflict serious damage. In this paper, we propose a hybrid intrusion forecasting system framework for an early warning system. The proposed system utilizes three types of forecasting methods: time-series analysis, probabilistic modeling, and data mining. By combining these methods, it is possible to take advantage of the strengths of each forecasting technique while overcoming its drawbacks. Experimental results show that the hybrid intrusion forecasting method outperforms each of the three individual forecasting methods.

  1. Ensemble-sensitivity Analysis Based Observation Targeting for Mesoscale Convection Forecasts and Factors Influencing Observation-Impact Prediction

    NASA Astrophysics Data System (ADS)

    Hill, A.; Weiss, C.; Ancell, B. C.

    2017-12-01

    The basic premise of observation targeting is that additional observations, when gathered and assimilated with a numerical weather prediction (NWP) model, will produce a more accurate forecast related to a specific phenomenon. Ensemble-sensitivity analysis (ESA; Ancell and Hakim 2007; Torn and Hakim 2008) is a tool capable of accurately estimating the proper location of targeted observations in areas that have initial model uncertainty and large error growth, as well as predicting the reduction of forecast variance due to the assimilated observation. ESA relates an ensemble of NWP model forecasts, specifically an ensemble of scalar forecast metrics, linearly to earlier model states. A thorough investigation is presented to determine how different factors of the forecast process are impacting our ability to successfully target new observations for mesoscale convection forecasts. Our primary goals for this work are to determine: (1) If targeted observations hold more positive impact over non-targeted (i.e. randomly chosen) observations; (2) If there are lead-time constraints to targeting for convection; (3) How inflation, localization, and the assimilation filter influence impact prediction and realized results; (4) If there exist differences between targeted observations at the surface versus aloft; and (5) how physics errors and nonlinearity may augment observation impacts. Ten cases of dryline-initiated convection between 2011 and 2013 are simulated within a simplified OSSE framework and presented here. Ensemble simulations are produced from a cycling system that utilizes the Weather Research and Forecasting (WRF) model v3.8.1 within the Data Assimilation Research Testbed (DART). A "truth" (nature) simulation is produced by supplying a 3-km WRF run with GFS analyses and integrating the model forward 90 hours, from the beginning of ensemble initialization through the end of the forecast. 
Target locations for surface and radiosonde observations are computed 6, 12, and 18 hours into the forecast based on a chosen scalar forecast response metric (e.g., maximum reflectivity at convection initiation). A variety of experiments are designed to achieve the aforementioned goals and will be presented, along with their results, detailing the feasibility of targeting for mesoscale convection forecasts.

  2. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power and are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
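The GMM-plus-inverse-transform step of the pipeline can be sketched as follows. This is a minimal illustration on synthetic errors, not the paper's code: the ensemble forecast and the swinging door extraction are omitted, and `n_components=3` is an arbitrary choice:

```python
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
errors = rng.normal(0.0, 0.05, 2000)     # stand-in historical forecasting errors

# fit a continuous GMM to the error distribution
gmm = GaussianMixture(n_components=3, random_state=0).fit(errors.reshape(-1, 1))
w, mu, sd = gmm.weights_, gmm.means_.ravel(), np.sqrt(gmm.covariances_.ravel())

def gmm_cdf(x):
    """Analytic mixture CDF: weighted sum of component normal CDFs."""
    return np.sum(w * norm.cdf((x[:, None] - mu) / sd), axis=1)

# inverse-transform sampling: draw uniforms and invert the CDF on a fine grid
grid = np.linspace(errors.min() - 0.2, errors.max() + 0.2, 4001)
cdf = gmm_cdf(grid)
u = rng.uniform(0, 1, 10000)
scenarios = np.interp(u, cdf, grid)      # simulated forecasting-error scenarios
```

Each sampled error scenario would then be added to the base forecast before ramps are extracted.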

  3. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    2017-08-31

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.

  4. Intermittent Demand Forecasting in a Tertiary Pediatric Intensive Care Unit.

    PubMed

    Cheng, Chen-Yang; Chiang, Kuo-Liang; Chen, Meng-Yin

    2016-10-01

    Forecasts of the demand for medical supplies both directly and indirectly affect the operating costs and the quality of the care provided by health care institutions. Specifically, overestimating demand induces an inventory surplus, whereas underestimating demand possibly compromises patient safety. Uncertainty in forecasting the consumption of medical supplies generates intermittent demand events. The intermittent demand patterns for medical supplies are generally classified as lumpy, erratic, smooth, and slow-moving demand. This study was conducted with the purpose of advancing a tertiary pediatric intensive care unit's efforts to achieve a high level of accuracy in its forecasting of the demand for medical supplies. To this end, several demand forecasting methods were compared in terms of forecast accuracy. The results confirm that applying Croston's method combined with a single exponential smoothing method yields the most accurate results for forecasting lumpy, erratic, and slow-moving demand, whereas the Simple Moving Average (SMA) method is the most suitable for forecasting smooth demand. In addition, when the classification of demand consumption patterns was combined with the demand forecasting models, the forecasting errors were minimized, indicating that this classification framework can play a role in improving patient safety and reducing inventory management costs in health care institutions.
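Croston's method, the core technique above, can be sketched in a few lines: it smooths the nonzero demand sizes and the intervals between them separately, then forecasts their ratio. This is a generic textbook sketch, not the study's implementation, and the smoothing constant `alpha=0.1` is an assumed value:

```python
def croston(demand, alpha=0.1):
    """Croston's method for intermittent demand: exponentially smooth
    nonzero demand sizes (z) and inter-demand intervals (p) separately,
    and forecast the per-period demand rate z/p."""
    z = None   # smoothed demand size
    p = None   # smoothed inter-demand interval
    q = 1      # periods since the last nonzero demand
    for d in demand:
        if d > 0:
            z = d if z is None else z + alpha * (d - z)
            p = q if p is None else p + alpha * (q - p)
            q = 1
        else:
            q += 1
    return z / p if z is not None else 0.0
```

For smooth (regular) demand the interval stays at 1 and Croston's method reduces to single exponential smoothing, which is why SMA or SES suffice for that pattern.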

  5. Model-free aftershock forecasts constructed from similar sequences in the past

    NASA Astrophysics Data System (ADS)

    van der Elst, N.; Page, M. T.

    2017-12-01

    The basic premise behind aftershock forecasting is that sequences in the future will be similar to those in the past. Forecast models typically use empirically tuned parametric distributions to approximate past sequences, and project those distributions into the future to make a forecast. While parametric models do a good job of describing average outcomes, they are not explicitly designed to capture the full range of variability between sequences, and can suffer from over-tuning of the parameters. In particular, parametric forecasts may produce a high rate of "surprises" - sequences that land outside the forecast range. Here we present a non-parametric forecast method that cuts out the parametric "middleman" between training data and forecast. The method is based on finding past sequences that are similar to the target sequence, and evaluating their outcomes. We quantify similarity as the Poisson probability that the observed event count in a past sequence reflects the same underlying intensity as the observed event count in the target sequence. Event counts are defined in terms of differential magnitude relative to the mainshock. The forecast is then constructed from the distribution of past sequences' outcomes, weighted by their similarity. We compare the similarity forecast with the Reasenberg and Jones (RJ95) method, for a set of 2807 global aftershock sequences of M≥6 mainshocks. We implement a sequence-specific RJ95 forecast using a global average prior and Bayesian updating, but do not propagate epistemic uncertainty. The RJ95 forecast is somewhat more precise than the similarity forecast: 90% of observed sequences fall within a factor of two of the median RJ95 forecast value, whereas the fraction is 85% for the similarity forecast. However, the surprise rate is much higher for the RJ95 forecast; 10% of observed sequences fall in the upper 2.5% of the (Poissonian) forecast range. The surprise rate is less than 3% for the similarity forecast. 
The similarity forecast may be useful to emergency managers and non-specialists when confidence or expertise in parametric forecasting may be lacking. The method makes over-tuning impossible, and minimizes the rate of surprises. At the least, this forecast constitutes a useful benchmark for more precisely tuned parametric forecasts.
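One plausible reading of the similarity weighting is a Poisson likelihood: score each past sequence by the probability of its event count under a rate equal to the target sequence's count. The abstract does not give the exact formula, so this sketch is an assumption for illustration only:

```python
import math

def similarity_weight(n_target, n_past):
    """Sketch of a similarity score (assumed form, not the authors' exact
    formula): the Poisson probability of the past sequence's event count
    if its underlying rate equaled the target sequence's observed count."""
    lam = max(n_target, 1e-9)
    return math.exp(-lam) * lam ** n_past / math.factorial(n_past)
```

A forecast would then weight each past sequence's outcome by this score and normalize the weights to one.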

  6. Elevated nonlinearity as an indicator of shifts in the dynamics of populations under stress.

    PubMed

    Dakos, Vasilis; Glaser, Sarah M; Hsieh, Chih-Hao; Sugihara, George

    2017-03-01

    Populations occasionally experience abrupt changes, such as local extinctions, strong declines in abundance or transitions from stable dynamics to strongly irregular fluctuations. Although most of these changes have important ecological and at times economic implications, they remain notoriously difficult to detect in advance. Here, we study changes in the stability of populations under stress across a variety of transitions. Using a Ricker-type model, we simulate shifts from stable point equilibrium dynamics to cyclic and irregular boom-bust oscillations as well as abrupt shifts between alternative attractors. Our aim is to infer the loss of population stability before such shifts based on changes in nonlinearity of population dynamics. We measure nonlinearity by comparing forecast performance between linear and nonlinear models fitted on reconstructed attractors directly from observed time series. We compare nonlinearity to other suggested leading indicators of instability (variance and autocorrelation). We find that nonlinearity and variance increase in a similar way prior to the shifts. By contrast, autocorrelation is strongly affected by oscillations. Finally, we test these theoretical patterns in datasets of fisheries populations. Our results suggest that elevated nonlinearity could be used as an additional indicator to infer changes in the dynamics of populations under stress. © 2017 The Author(s).

  7. A novel stock forecasting model based on High-order-fuzzy-fluctuation Trends and Back Propagation Neural Network

    PubMed Central

    Dai, Zongli; Zhao, Aiwu; He, Jie

    2018-01-01

    In this paper, we propose a hybrid method to forecast stock prices, called the High-order-fuzzy-fluctuation-Trends-based Back Propagation (HTBP) Neural Network model. First, we compare each value of the historical training data with the previous day's value to obtain a fluctuation trend time series (FTTS). On this basis, the FTTS is blurred into a fuzzy time series (FFTS) based on the amplitude and direction of increasing, unchanged and decreasing fluctuations. Since the relationship between the FFTS and future fluctuation trends is nonlinear, the HTBP neural network algorithm is used to find the mapping rules through self-learning. Finally, the output of the algorithm is used to predict future fluctuations. The proposed model provides several innovative features: (1) it combines fuzzy set theory and a neural network algorithm to avoid the overfitting problems that exist in traditional models; (2) the BP neural network algorithm can intelligently explore the internal rules of sequential data, without the need to analyze the influence factors of specific rules and their paths of action; (3) the hybrid model can reasonably remove noise from the internal rules by proper fuzzy treatment. This paper takes the TAIEX data set of the Taiwan stock exchange as an example, and compares and analyzes the prediction performance of the model. The experimental results show that this method can predict the stock market in a very simple way. At the same time, we use this method to predict the Shanghai stock exchange composite index, further verifying the effectiveness and universality of the method. PMID:29420584

  8. A novel stock forecasting model based on High-order-fuzzy-fluctuation Trends and Back Propagation Neural Network.

    PubMed

    Guan, Hongjun; Dai, Zongli; Zhao, Aiwu; He, Jie

    2018-01-01

    In this paper, we propose a hybrid method to forecast stock prices, called the High-order-fuzzy-fluctuation-Trends-based Back Propagation (HTBP) Neural Network model. First, we compare each value of the historical training data with the previous day's value to obtain a fluctuation trend time series (FTTS). On this basis, the FTTS is blurred into a fuzzy time series (FFTS) based on the amplitude and direction of increasing, unchanged and decreasing fluctuations. Since the relationship between the FFTS and future fluctuation trends is nonlinear, the HTBP neural network algorithm is used to find the mapping rules through self-learning. Finally, the output of the algorithm is used to predict future fluctuations. The proposed model provides several innovative features: (1) it combines fuzzy set theory and a neural network algorithm to avoid the overfitting problems that exist in traditional models; (2) the BP neural network algorithm can intelligently explore the internal rules of sequential data, without the need to analyze the influence factors of specific rules and their paths of action; (3) the hybrid model can reasonably remove noise from the internal rules by proper fuzzy treatment. This paper takes the TAIEX data set of the Taiwan stock exchange as an example, and compares and analyzes the prediction performance of the model. The experimental results show that this method can predict the stock market in a very simple way. At the same time, we use this method to predict the Shanghai stock exchange composite index, further verifying the effectiveness and universality of the method.
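The first two preprocessing steps, differencing into an FTTS and a coarse fuzzification, can be sketched as follows. The three-label scheme and the threshold are simplifying assumptions; the paper's fuzzy sets also grade amplitude:

```python
def fluctuation_trend(series):
    """Step 1: day-over-day differences form the fluctuation trend
    time series (FTTS)."""
    return [b - a for a, b in zip(series, series[1:])]

def fuzzify(ftts, threshold):
    """Step 2 (crude sketch): map each fluctuation to down/flat/up
    labels (-1, 0, 1) using an assumed flatness threshold."""
    return [0 if abs(d) <= threshold else (1 if d > 0 else -1)
            for d in ftts]
```

Windows of these fuzzy labels (the "high-order" trends) would then be fed to the BP network as inputs, with the next label as the training target.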

  9. Chaos in the sunspot cycle - Analysis and prediction

    NASA Technical Reports Server (NTRS)

    Mundt, Michael D.; Maguire, W. Bruce, II; Chase, Robert R. P.

    1991-01-01

    The variability of solar activity over long time scales, given semiquantitatively by measurements of sunspot numbers, is examined as a nonlinear dynamical system. First, a discussion of the data set used and the techniques utilized to reduce the noise and capture the long-term dynamics inherent in the data is presented. Subsequently, an attractor is reconstructed from the data set using the method of time delays. The reconstructed attractor is then used to determine both the dimension of the underlying system and also the largest Lyapunov exponent, which together indicate that the sunspot cycle is indeed chaotic and also low dimensional. In addition, recent techniques of exploiting chaotic dynamics to provide accurate, short-term predictions are utilized in order to improve upon current forecasting methods and also to place theoretical limits on predictability extent. The results are compared to chaotic solar-dynamo models as a possible physically motivated source of this chaotic behavior.
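The attractor reconstruction step, the method of time delays, is standard and can be sketched in a few lines. This is a generic illustration, not the paper's code; the embedding dimension and delay would in practice be chosen by criteria such as false nearest neighbors and mutual information:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Reconstruct a state space from a scalar series by the method of
    time delays: row t is [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
```

Dimension estimates and Lyapunov exponents are then computed on the embedded points rather than on the raw sunspot series.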

  10. Predictability of Extreme Climate Events via a Complex Network Approach

    NASA Astrophysics Data System (ADS)

    Muhkin, D.; Kurths, J.

    2017-12-01

    We analyse climate dynamics from a complex network approach. This leads to an inverse problem: Is there a backbone-like structure underlying the climate system? For this we propose a method to reconstruct and analyze a complex network from data generated by a spatio-temporal dynamical system. This approach enables us to uncover relations to global circulation patterns in oceans and atmosphere. This concept is then applied to Monsoon data; in particular, we develop a general framework to predict extreme events by combining a non-linear synchronization technique with complex networks. Applying this method, we uncover a new mechanism of extreme floods in the eastern Central Andes which could be used for operational forecasts. Moreover, we analyze the Indian Summer Monsoon (ISM) and identify two regions of high importance. By estimating an underlying critical point, this leads to an improved prediction of the onset of the ISM; this scheme was successful in 2016 and 2017.

  11. Wind power prediction based on genetic neural network

    NASA Astrophysics Data System (ADS)

    Zhang, Suhan

    2017-04-01

    The scale of grid-connected wind farms keeps increasing. To ensure the stability of power system operation, make reasonable scheduling schemes and improve the competitiveness of wind farms in the electricity generation market, it is important to accurately forecast short-term wind power. To reduce the influence of the nonlinear relationship between disturbance factors and wind power, an improved prediction model based on a genetic algorithm and a neural network is established. To overcome the shortcomings of the BP neural network, namely long training time and a tendency to fall into local minima, and to improve its accuracy, a genetic algorithm is adopted to optimize the parameters and topology of the neural network. Historical data are used as input to predict short-term wind power. The effectiveness and feasibility of the method are verified using actual data from a wind farm as an example.

  12. Application Study of Comprehensive Forecasting Model Based on Entropy Weighting Method on Trend of PM2.5 Concentration in Guangzhou, China

    PubMed Central

    Liu, Dong-jun; Li, Li

    2015-01-01

    For the issue of haze-fog, PM2.5 is the main influencing factor of haze-fog pollution in China. The trend of PM2.5 concentration was analyzed from a qualitative point of view based on mathematical models and simulation in this study. The comprehensive forecasting model (CFM) was developed based on combination forecasting ideas. The Autoregressive Integrated Moving Average (ARIMA) model, Artificial Neural Networks (ANNs) and the Exponential Smoothing Method (ESM) were used to predict the time series data of PM2.5 concentration. The results of the comprehensive forecasting model were obtained by combining the results of the three methods using weights from the Entropy Weighting Method. The trend of PM2.5 concentration in Guangzhou, China was quantitatively forecasted with the comprehensive forecasting model. The results were compared with those of the three single models, and PM2.5 concentration values for the next ten days were predicted. The comprehensive forecasting model balanced the deviations of each single prediction method and had better applicability. It provides a new prediction method for the air quality forecasting field. PMID:26110332

  13. Application Study of Comprehensive Forecasting Model Based on Entropy Weighting Method on Trend of PM2.5 Concentration in Guangzhou, China.

    PubMed

    Liu, Dong-jun; Li, Li

    2015-06-23

    For the issue of haze-fog, PM2.5 is the main influencing factor of haze-fog pollution in China. The trend of PM2.5 concentration was analyzed from a qualitative point of view based on mathematical models and simulation in this study. The comprehensive forecasting model (CFM) was developed based on combination forecasting ideas. The Autoregressive Integrated Moving Average (ARIMA) model, Artificial Neural Networks (ANNs) and the Exponential Smoothing Method (ESM) were used to predict the time series data of PM2.5 concentration. The results of the comprehensive forecasting model were obtained by combining the results of the three methods using weights from the Entropy Weighting Method. The trend of PM2.5 concentration in Guangzhou, China was quantitatively forecasted with the comprehensive forecasting model. The results were compared with those of the three single models, and PM2.5 concentration values for the next ten days were predicted. The comprehensive forecasting model balanced the deviations of each single prediction method and had better applicability. It provides a new prediction method for the air quality forecasting field.
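One common form of the Entropy Weighting Method for combining forecasts can be sketched as follows; whether this matches the paper's exact formulation is an assumption, but the idea, giving more weight to methods whose error record carries more information, is standard:

```python
import numpy as np

def entropy_weights(abs_errors):
    """Entropy weighting sketch: abs_errors has shape (n_times, n_methods).
    Normalize each method's error column into proportions, compute its
    Shannon entropy, and weight by the degree of divergence 1 - entropy."""
    p = abs_errors / abs_errors.sum(axis=0, keepdims=True)
    n = abs_errors.shape[0]
    # 0 * log(0) is treated as 0
    e = -np.sum(np.where(p > 0, p * np.log(p), 0.0), axis=0) / np.log(n)
    d = 1.0 - e                      # degree of divergence per method
    return d / d.sum()
```

The combined forecast is then the weighted sum of the ARIMA, ANN and ESM forecasts with these weights.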

  14. An impact analysis of forecasting methods and forecasting parameters on bullwhip effect

    NASA Astrophysics Data System (ADS)

    Silitonga, R. Y. H.; Jelly, N.

    2018-04-01

    The bullwhip effect is an increase in the variance of demand fluctuations from downstream to upstream in a supply chain. Forecasting methods and forecasting parameters are recognized as factors that affect the bullwhip phenomenon. To study these factors, we can develop simulations. There are several ways to simulate the bullwhip effect in previous studies, such as mathematical equation modelling, information control modelling, computer programs, and many more. In this study a spreadsheet program named Bullwhip Explorer was used to simulate the bullwhip effect. Several scenarios were developed to show the change in the bullwhip effect ratio due to differences in forecasting methods and forecasting parameters. The forecasting methods used were mean demand, moving average, exponential smoothing, demand signalling, and minimum expected mean squared error. The forecasting parameters were the moving average period, smoothing parameter, signalling factor, and safety stock factor. The results showed that decreasing the moving average period, increasing the smoothing parameter, and increasing the signalling factor can create a bigger bullwhip effect ratio. Meanwhile, the safety stock factor had no impact on the bullwhip effect.
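The moving-average finding can be reproduced with a short simulation independent of the spreadsheet tool. The order-up-to policy and lead time below are assumptions for illustration; the bullwhip ratio is the variance of orders over the variance of demand:

```python
import numpy as np

def bullwhip_ratio(demand, ma_period, lead_time=2):
    """Base-stock sketch: forecast demand with a moving average, keep an
    order-up-to level of (lead_time + 1) periods of forecast, and compare
    the variance of the resulting orders to the variance of demand."""
    orders, prev_level = [], None
    for t in range(ma_period, len(demand)):
        f = np.mean(demand[t - ma_period : t])      # MA forecast
        level = (lead_time + 1) * f                 # order-up-to level
        if prev_level is not None:
            # order = demand observed + change in the order-up-to level
            orders.append(demand[t] + level - prev_level)
        prev_level = level
    return np.var(orders) / np.var(demand)

rng = np.random.default_rng(1)
demand = rng.normal(100.0, 10.0, 4000)
```

With i.i.d. demand, a shorter moving-average period makes the forecast jumpier and the ratio larger, matching the study's conclusion.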

  15. Nonlinear dynamics of homeothermic temperature control in skunk cabbage, Symplocarpus foetidus

    NASA Astrophysics Data System (ADS)

    Ito, Takanori; Ito, Kikukatsu

    2005-11-01

    Certain primitive plants undergo orchestrated temperature control during flowering. Skunk cabbage, Symplocarpus foetidus, has been demonstrated to maintain an internal temperature of around 20 °C even when the ambient temperature drops below freezing. However, it is not clear whether a unique algorithm controls the homeothermic behavior of S. foetidus, or whether such an algorithm might exhibit linear or nonlinear thermoregulatory dynamics. Here we report the underlying dynamics of temperature control in S. foetidus using nonlinear forecasting, attractor and correlation dimension analyses. It was shown that thermoregulation in S. foetidus was governed by low-dimensional chaotic dynamics, the geometry of which showed a strange attractor named the “Zazen attractor.” Our data suggest that the chaotic thermoregulation in S. foetidus is inherent and that it is an adaptive response to the natural environment.

  16. The development of rainfall forecasting using Kalman filter

    NASA Astrophysics Data System (ADS)

    Zulfi, Mohammad; Hasan, Moh.; Dwidja Purnomo, Kosala

    2018-04-01

    Rainfall forecasting is very useful for agricultural planning; rainfall information helps in making decisions about planting certain commodities. In this study, rainfall is forecast by the ARIMA and Kalman filter methods. The Kalman filter method expresses a time series model in linear state-space form to determine future forecasts, using a recursive solution to minimize error. The rainfall data in this research were clustered by K-means clustering, and the Kalman filter method was implemented to model and forecast rainfall in each cluster. We used ARIMA (p,d,q) to construct a state space for the Kalman filter model, giving four groups of data and one model per group. In conclusion, the Kalman filter method is better than the ARIMA model for rainfall forecasting in each group, as shown by the error of the Kalman filter method being smaller than that of the ARIMA model.
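    The recursive error-minimizing update at the heart of the Kalman filter can be sketched for the simplest case. This assumes an AR(1) state equation rather than the ARIMA(p,d,q) state spaces fitted in the study, and the noise variances are illustrative:

```python
import numpy as np

def kalman_filter_ar1(y, phi, q, r):
    """Scalar Kalman filter for an AR(1) model in state-space form:
    state:       x_t = phi * x_{t-1} + w_t,  w_t ~ N(0, q)
    observation: y_t = x_t + v_t,            v_t ~ N(0, r)
    Returns the one-step-ahead forecasts of y."""
    x, p = 0.0, 1.0                               # state estimate and variance
    forecasts = []
    for obs in y:
        x, p = phi * x, phi * phi * p + q         # predict
        forecasts.append(x)
        k = p / (p + r)                           # Kalman gain
        x, p = x + k * (obs - x), (1.0 - k) * p   # recursive update
    return np.array(forecasts)

# synthetic "rainfall-like" series: an AR(1) signal observed with noise
rng = np.random.default_rng(1)
signal = np.zeros(300)
for t in range(1, 300):
    signal[t] = 0.8 * signal[t - 1] + rng.normal(0.0, 1.0)
y = signal + rng.normal(0.0, 1.0, 300)
pred = kalman_filter_ar1(y, phi=0.8, q=1.0, r=1.0)
print(np.mean((y - pred) ** 2))  # one-step-ahead forecast MSE
```

    The same predict/update recursion generalizes to the vector state spaces derived from ARIMA models.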

  17. Forecasting electricity usage using univariate time series models

    NASA Astrophysics Data System (ADS)

    Hock-Eam, Lim; Chee-Yin, Yip

    2014-12-01

    Electricity is one of the important energy sources, and a sufficient supply of electricity is vital to support a country's development and growth. Owing to changing socio-economic characteristics and the increasing competition and deregulation of the electricity supply industry, electricity demand forecasting is even more important than before. It is imperative to evaluate and compare the predictive performance of various forecasting methods, as this provides further insight into the weaknesses and strengths of each method. The literature offers mixed evidence on the best forecasting methods for electricity demand. This paper compares the predictive performance of univariate time series models for forecasting electricity demand using monthly data on maximum electricity load in Malaysia from January 2003 to December 2013. Results reveal that the Box-Jenkins method produces the best out-of-sample predictive performance, while the Holt-Winters exponential smoothing method is a good forecasting method for in-sample predictive performance.
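    The Holt-Winters method found to perform well in-sample can be sketched with a minimal additive implementation. A production forecast would use a tuned library routine (e.g. statsmodels' ExponentialSmoothing); the smoothing constants and the synthetic load series below are arbitrary:

```python
import numpy as np

def holt_winters_additive(y, m, alpha, beta, gamma, h):
    """Minimal additive Holt-Winters: level, trend, and m seasonal terms,
    updated recursively; returns forecasts for the next h steps."""
    level = float(np.mean(y[:m]))
    trend = float(np.mean(y[m:2 * m]) - np.mean(y[:m])) / m
    season = list(y[:m] - level)
    for t in range(m, len(y)):
        prev_level = level
        level = alpha * (y[t] - season[t - m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season.append(gamma * (y[t] - level) + (1 - gamma) * season[t - m])
    return np.array([level + (k + 1) * trend + season[len(y) - m + k % m]
                     for k in range(h)])

# synthetic "monthly electricity load": linear growth plus an annual cycle
t = np.arange(120)
load = 0.5 * t + 10.0 * np.sin(2 * np.pi * t / 12)
forecast = holt_winters_additive(load, m=12, alpha=0.3, beta=0.1, gamma=0.2, h=12)
print(forecast[:3])  # continues both the trend and the seasonal pattern
```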

  18. Compensated Box-Jenkins transfer function for short term load forecast

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Breipohl, A.; Yu, Z.; Lee, F.N.

    In the past years, the Box-Jenkins ARIMA method and the Box-Jenkins transfer function method (BJTF) have been among the most commonly used methods for short term electrical load forecasting. But when there exists a sudden change in the temperature, both methods tend to exhibit larger errors in the forecast. This paper demonstrates that the load forecasting errors resulting from either the BJ ARIMA model or the BJTF model are not simply white noise, but rather well-patterned noise, and the patterns in the noise can be used to improve the forecasts. Thus a compensated Box-Jenkins transfer method (CBJTF) is proposed to improve the accuracy of the load prediction. Some case studies have been made which result in about a 14-33% reduction of the root mean square (RMS) errors of the forecasts, depending on the compensation time period as well as the compensation method used.
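    The compensation idea generalizes beyond the CBJTF model: if forecast errors are patterned rather than white, a model of the errors can predict (and cancel) part of the next error. The sketch below uses a simple AR(1) error model as a stand-in for the paper's compensation scheme:

```python
import numpy as np

def ar1_error_compensation(errors):
    """Fit an AR(1) model to a sequence of forecast errors and return the
    one-step error predictions (to be added back to the base forecasts)."""
    e = np.asarray(errors, float)
    phi = np.dot(e[:-1], e[1:]) / np.dot(e[:-1], e[:-1])
    return phi * e[:-1]

# toy patterned (autocorrelated) error sequence, as left by a base forecaster
rng = np.random.default_rng(2)
err = np.zeros(500)
for t in range(1, 500):
    err[t] = 0.7 * err[t - 1] + rng.normal(0.0, 1.0)

pred = ar1_error_compensation(err)
rms_before = np.sqrt(np.mean(err[1:] ** 2))
rms_after = np.sqrt(np.mean((err[1:] - pred) ** 2))
print(rms_before, rms_after)  # compensation shrinks the RMS forecast error
```

    Because the AR(1) coefficient is fit by least squares, the compensated residuals cannot have a larger RMS than the raw errors on the fitting sample.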

  19. Obesity and severe obesity forecasts through 2030.

    PubMed

    Finkelstein, Eric A; Khavjou, Olga A; Thompson, Hope; Trogdon, Justin G; Pan, Liping; Sherry, Bettylou; Dietz, William

    2012-06-01

    Previous efforts to forecast future trends in obesity applied linear forecasts assuming that the rise in obesity would continue unabated. However, evidence suggests that obesity prevalence may be leveling off. This study presents estimates of adult obesity and severe obesity prevalence through 2030 based on nonlinear regression models. The forecasted results are then used to simulate the savings that could be achieved through modestly successful obesity prevention efforts. The study was conducted in 2009-2010 and used data from the 1990 through 2008 Behavioral Risk Factor Surveillance System (BRFSS). The analysis sample included nonpregnant adults aged ≥ 18 years. The individual-level BRFSS variables were supplemented with state-level variables from the U.S. Bureau of Labor Statistics, the American Chamber of Commerce Research Association, and the Census of Retail Trade. Future obesity and severe obesity prevalence were estimated through regression modeling by projecting trends in explanatory variables expected to influence obesity prevalence. Linear time trend forecasts suggest that by 2030, 51% of the population will be obese. The model estimates a much lower obesity prevalence of 42% and severe obesity prevalence of 11%. If obesity were to remain at 2010 levels, the combined savings in medical expenditures over the next 2 decades would be $549.5 billion. The study estimates a 33% increase in obesity prevalence and a 130% increase in severe obesity prevalence over the next 2 decades. If these forecasts prove accurate, this will further hinder efforts for healthcare cost containment. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. A stepwise-cluster microbial biomass inference model in food waste composting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun Wei; Huang, Guo H., E-mail: huangg@iseis.or; Chinese Research Academy of Environmental Science, North China Electric Power University, Beijing 100012-102206

    2009-12-15

    A stepwise-cluster microbial biomass inference (SMI) model was developed through introducing stepwise-cluster analysis (SCA) into composting process modeling to tackle the nonlinear relationships among state variables and microbial activities. The essence of SCA is to form a classification tree based on a series of cutting or mergence processes according to given statistical criteria. Eight runs of designed experiments in bench-scale reactors in a laboratory were constructed to demonstrate the feasibility of the proposed method. The results indicated that SMI could help establish a statistical relationship between state variables and composting microbial characteristics, where discrete and nonlinear complexities exist. Significance levels of cutting/merging were provided such that the accuracies of the developed forecasting trees were controllable. Through an attempted definition of input effects on the output in SMI, the effects of the state variables on thermophilic bacteria were ranked in descending order as: time (day) > moisture content (%) > ash content (%, dry) > lower temperature (°C) > pH > NH₄⁺-N (mg/kg, dry) > total N (%, dry) > total C (%, dry); the effects on mesophilic bacteria were ordered as: time > upper temperature (°C) > total N > moisture content > NH₄⁺-N > total C > pH. This study made the first attempt at applying SCA to mapping the nonlinear and discrete relationships in composting processes.

  1. Naive vs. Sophisticated Methods of Forecasting Public Library Circulations.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.

    1984-01-01

    Two sophisticated--autoregressive integrated moving average (ARIMA), straight-line regression--and two naive--simple average, monthly average--forecasting techniques were used to forecast monthly circulation totals of 34 public libraries. Comparisons of forecasts and actual totals revealed that ARIMA and monthly average methods had smallest mean…
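    The gap between the naive methods can be reproduced in miniature: on seasonal, circulation-like data, a monthly-average forecast beats a simple overall average. The series below is synthetic, not the study's library data:

```python
import numpy as np

def simple_average_forecast(history):
    """Naive method 1: forecast every future month with the overall mean."""
    return np.full(12, history.mean())

def monthly_average_forecast(history):
    """Naive method 2: forecast each month with the mean of past same months."""
    by_month = history.reshape(-1, 12)   # assumes whole years of monthly data
    return by_month.mean(axis=0)

rng = np.random.default_rng(3)
season = 1000 + 300 * np.sin(2 * np.pi * np.arange(12) / 12)
history = np.tile(season, 5) + rng.normal(0, 50, 60)   # five years observed
actual = season + rng.normal(0, 50, 12)                # held-out sixth year

mae = lambda f: np.mean(np.abs(f - actual))
print(mae(simple_average_forecast(history)))
print(mae(monthly_average_forecast(history)))  # much lower on seasonal data
```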

  2. An Initial Assessment of the Impact of CYGNSS Ocean Surface Wind Assimilation on Navy Global and Mesoscale Numerical Weather Prediction

    NASA Astrophysics Data System (ADS)

    Baker, N. L.; Tsu, J.; Swadley, S. D.

    2017-12-01

    We assess the impact of assimilation of CYclone Global Navigation Satellite System (CYGNSS) ocean surface winds observations into the NAVGEM[i] global and COAMPS®[ii] mesoscale numerical weather prediction (NWP) systems. Both NAVGEM and COAMPS® used the NRL 4DVar assimilation system NAVDAS-AR[iii]. Long term monitoring of the NAVGEM Forecast Sensitivity Observation Impact (FSOI) indicates that the forecast error reduction for ocean surface wind vectors (ASCAT and WindSat) are significantly larger than for SSMIS wind speed observations. These differences are larger than can be explained by simply two pieces of information (for wind vectors) versus one (wind speed). To help understand these results, we conducted a series of Observing System Experiments (OSEs) to compare the assimilation of ASCAT wind vectors with the equivalent (computed) ASCAT wind speed observations. We found that wind vector assimilation was typically 3 times more effective at reducing the NAVGEM forecast error, with a higher percentage of beneficial observations. These results suggested that 4DVar, in the absence of an additional nonlinear outer loop, has limited ability to modify the analysis wind direction. We examined several strategies for assimilating CYGNSS ocean surface wind speed observations. In the first approach, we assimilated CYGNSS as wind speed observations, following the same methodology used for SSMIS winds. The next two approaches converted CYGNSS wind speed to wind vectors, using NAVGEM sea level pressure fields (following Holton, 1979), and using NAVGEM 10-m wind fields with the AER Variational Analysis Method. Finally, we compared these methods to CYGNSS wind speed assimilation using multiple outer loops with NAVGEM Hybrid 4DVar. Results support the earlier studies suggesting that NAVDAS-AR wind speed assimilation is sub-optimal. We present detailed results from multi-month NAVGEM assimilation runs along with case studies using COAMPS®. 
Comparisons include the fit of analyses and forecasts with in-situ observations and analyses from other NWP centers (e.g. ECMWF and GFS). [i] NAVy Global Environmental Model [ii] COAMPS® is a registered trademark of the Naval Research Laboratory for the Navy's Coupled Ocean Atmosphere Mesoscale Prediction System. [iii] NRL Atmospheric Variational Data Assimilation System

  3. Implementation of Automatic Clustering Algorithm and Fuzzy Time Series in Motorcycle Sales Forecasting

    NASA Astrophysics Data System (ADS)

    Rasim; Junaeti, E.; Wirantika, R.

    2018-01-01

    Accurate forecasting of product sales depends on the forecasting method used. The purpose of this research is to build a motorcycle sales forecasting application using the Fuzzy Time Series method combined with interval determination using an automatic clustering algorithm. Forecasting is done using motorcycle sales data from the last ten years. The error rate of the forecasts is then measured using the Mean Percentage Error (MPE) and Mean Absolute Percentage Error (MAPE). The forecasting results for the one-year period obtained in this study show good accuracy.
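    The two error measures used here, MPE and MAPE, are straightforward to compute; a minimal sketch with placeholder sales figures (not the study's data):

```python
import numpy as np

def mpe(actual, forecast):
    """Mean Percentage Error: signed, so it reveals systematic bias."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean((actual - forecast) / actual)

def mape(actual, forecast):
    """Mean Absolute Percentage Error: error magnitude, ignoring sign."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs(actual - forecast) / actual)

actual = [120, 130, 125, 140]
forecast = [118, 135, 124, 138]
print(mpe(actual, forecast))   # near zero: over- and under-shoots cancel
print(mape(actual, forecast))  # small value: forecasts within a few percent
```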

  4. Embracing interactions in ocean acidification research: confronting multiple stressor scenarios and context dependence

    PubMed Central

    Kordas, Rebecca L.; Harley, Christopher D. G.

    2017-01-01

    Changes in the Earth's environment are now sufficiently complex that our ability to forecast the emergent ecological consequences of ocean acidification (OA) is limited. Such projections are challenging because the effects of OA may be enhanced, reduced or even reversed by other environmental stressors or interactions among species. Despite an increasing emphasis on multifactor and multispecies studies in global change biology, our ability to forecast outcomes at higher levels of organization remains low. Much of our failure lies in a poor mechanistic understanding of nonlinear responses, a lack of specificity regarding the levels of organization at which interactions can arise, and an incomplete appreciation for linkages across these levels. To move forward, we need to fully embrace interactions. Mechanistic studies on physiological processes and individual performance in response to OA must be complemented by work on population and community dynamics. We must also increase our understanding of how linkages and feedback among multiple environmental stressors and levels of organization can generate nonlinear responses to OA. This will not be a simple undertaking, but advances are of the utmost importance as we attempt to mitigate the effects of ongoing global change. PMID:28356409

  5. Embracing interactions in ocean acidification research: confronting multiple stressor scenarios and context dependence.

    PubMed

    Kroeker, Kristy J; Kordas, Rebecca L; Harley, Christopher D G

    2017-03-01

    Changes in the Earth's environment are now sufficiently complex that our ability to forecast the emergent ecological consequences of ocean acidification (OA) is limited. Such projections are challenging because the effects of OA may be enhanced, reduced or even reversed by other environmental stressors or interactions among species. Despite an increasing emphasis on multifactor and multispecies studies in global change biology, our ability to forecast outcomes at higher levels of organization remains low. Much of our failure lies in a poor mechanistic understanding of nonlinear responses, a lack of specificity regarding the levels of organization at which interactions can arise, and an incomplete appreciation for linkages across these levels. To move forward, we need to fully embrace interactions. Mechanistic studies on physiological processes and individual performance in response to OA must be complemented by work on population and community dynamics. We must also increase our understanding of how linkages and feedback among multiple environmental stressors and levels of organization can generate nonlinear responses to OA. This will not be a simple undertaking, but advances are of the utmost importance as we attempt to mitigate the effects of ongoing global change. © 2017 The Authors.

  6. Investigation of prospects for forecasting non-linear time series by example of drilling oil and gas wells

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Sizonenko, A. B.; Zhdanov, A. A.

    2018-05-01

    Discrete time series or mappings are proposed for describing the dynamics of a nonlinear system. The article considers the problem of forecasting the dynamics of the system from the time series it generates. In particular, the commercial rate of drilling oil and gas wells can be considered as a series where each next value depends on the previous one, the main parameter being the technical drilling speed. To eliminate measurement error and represent the commercial speed of the object with good accuracy at the current, a future, or any elapsed time point, the use of the Kalman filter is suggested. For the transition from a deterministic model to a probabilistic one, ensemble modeling is suggested. Ensemble systems can provide a wide range of visual output, which helps the user to evaluate the measure of confidence in the model. In particular, the availability of information on the estimated calendar duration of the construction of oil and gas wells will allow drilling companies to optimize production planning by rationalizing the approach to loading drilling rigs, which ultimately leads to maximization of profit and an increase in their competitiveness.

  7. School District Enrollment Projections: A Comparison of Three Methods.

    ERIC Educational Resources Information Center

    Pettibone, Timothy J.; Bushan, Latha

    This study assesses three methods of forecasting school enrollments: the cohort-survival method (grade progression), the statistical forecasting procedure developed by the Statistical Analysis System (SAS) Institute, and a simple ratio computation. The three methods were used to forecast school enrollments for kindergarten through grade 12 in a…

  8. Seasonal forecasts of impact-relevant climate information indices developed as part of the EUPORIAS project

    NASA Astrophysics Data System (ADS)

    Spirig, Christoph; Bhend, Jonas

    2015-04-01

    Climate information indices (CIIs) represent a way to communicate climate conditions to specific sectors and the public. As such, CIIs provide actionable information to stakeholders in an efficient way. Due to their non-linear nature, such CIIs can behave differently than the underlying variables, such as temperature. At the same time, CIIs do not involve impact models with different sources of uncertainties. As part of the EU project EUPORIAS (EUropean Provision Of Regional Impact Assessment on a Seasonal-to-decadal timescale) we have developed examples of seasonal forecasts of CIIs. We present forecasts and analyses of the skill of seasonal forecasts for CIIs that are relevant to a variety of economic sectors and a range of stakeholders: heating and cooling degree days as proxies for energy demand, various precipitation and drought-related measures relevant to agriculture and hydrology, a wild fire index, a climate-driven mortality index and wind-related indices tailored to renewable energy producers. Common to all examples is the finding of limited forecast skill over Europe, highlighting the challenge of providing added-value services to stakeholders operating in Europe. The reasons for the lack of forecast skill vary: often we find little skill in the underlying variable(s) precisely in those areas that are relevant for the CII; in other cases the nature of the CII is particularly demanding for predictions, as seen in the case of counting measures such as frost days or cool nights. On the other hand, several results suggest there may be some predictability in sub-regions for certain indices. Several of the exemplary analyses show potential for skillful forecasts and prospects for improvement by investing in post-processing. Furthermore, in those cases where CII forecasts showed skill similar to that of the underlying meteorological variables, the CII forecasts provide added value from a user perspective.

  9. A Proposal of Monitoring and Forecasting Method for Crustal Activity in and around Japan with 3-dimensional Heterogeneous Medium Using a Large-scale High-fidelity Finite Element Simulation

    NASA Astrophysics Data System (ADS)

    Hori, T.; Agata, R.; Ichimura, T.; Fujita, K.; Yamaguchi, T.; Takahashi, N.

    2017-12-01

    Although we can now obtain continuous, dense surface deformation data on land and partly on the sea floor, the obtained data are not fully utilized for monitoring and forecasting of crustal activity, such as spatio-temporal variation in slip velocity on the plate interface including earthquakes, seismic wave propagation, and crustal deformation. To construct a system for monitoring and forecasting, it is necessary to develop a physics-based data analysis system including (1) a structural model with the 3D geometry of the plate interface and material properties such as elasticity and viscosity, (2) calculation code for crustal deformation and seismic wave propagation using (1), and (3) inverse analysis or data assimilation code both for structure and fault slip using (1) and (2). To accomplish this, it is at least necessary to develop highly reliable large-scale simulation code to calculate crustal deformation and seismic wave propagation for 3D heterogeneous structure. An unstructured FE non-linear seismic wave simulation code has been developed, achieving physics-based urban earthquake simulation with 1.08 T DOF x 6.6 K time steps. A high-fidelity FEM simulation code with a mesh generator has also been developed to calculate crustal deformation in and around Japan with complicated surface topography and subducting plate geometry at 1 km mesh resolution. The crustal deformation code has been further improved, achieving 2.05 T DOF with 45 m resolution on the plate interface. This high-resolution analysis enables computation of the change of stress acting on the plate interface. Further, for inverse analyses, waveform inversion code for modeling 3D crustal structure has been developed, and the high-fidelity FEM code has been improved to apply an adjoint method for estimating fault slip and asthenosphere viscosity. Hence, we have large-scale simulation and analysis tools for monitoring.
    We are developing methods for forecasting the slip velocity variation on the plate interface. Although the prototype is for an elastic half-space model, we are applying it to 3D heterogeneous structure with the high-fidelity FE model. Furthermore, the large-scale simulation codes for monitoring are being implemented on GPU clusters, and the analysis tools are being extended with other functions such as examination of model errors.

  10. Daily Peak Load Forecasting of Next Day using Weather Distribution and Comparison Value of Each Nearby Date Data

    NASA Astrophysics Data System (ADS)

    Ito, Shigenobu; Yukita, Kazuto; Goto, Yasuyuki; Ichiyanagi, Katsuhiro; Nakano, Hiroyuki

    With the development of industry in recent years, dependence on electric energy has grown year by year, so a reliable electric power supply is needed. However, storing a huge amount of electric energy is very difficult, and there is a necessity to keep the balance between demand and supply, which changes hour after hour. Consequently, to supply high-quality, highly dependable electric power economically and with high efficiency, the movement of the electric power demand must be forecast carefully in advance, and the supply and demand management plan should be made using that forecast. Thus load forecasting is an important task in the demand management of electric power companies. So far, forecasting methods using fuzzy logic, neural networks, and regression models have been suggested to improve forecasting accuracy, and their accuracy is at a high level. But to manage electric power more economically with higher accuracy, a new forecasting method with higher accuracy is needed. In this paper, to improve on the accuracy of the former methods, a daily peak load forecasting method is suggested that uses the weather distribution of highest and lowest temperatures and comparison values of each nearby date's data.

  11. Methods of sequential estimation for determining initial data in numerical weather prediction. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Cohn, S. E.

    1982-01-01

    Numerical weather prediction (NWP) is an initial-value problem for a system of nonlinear differential equations, in which initial values are known incompletely and inaccurately. Observational data available at the initial time must therefore be supplemented by data available prior to the initial time, a problem known as meteorological data assimilation. A further complication in NWP is that solutions of the governing equations evolve on two different time scales, a fast one and a slow one, whereas fast scale motions in the atmosphere are not reliably observed. This leads to the so-called initialization problem: initial values must be constrained to result in a slowly evolving forecast. The theory of estimation of stochastic dynamic systems provides a natural approach to such problems. For linear stochastic dynamic models, the Kalman-Bucy (KB) sequential filter is the optimal data assimilation method; for linear models, the optimal combined data assimilation-initialization method is a modified version of the KB filter.

  12. Adaptive spline autoregression threshold method in forecasting Mitsubishi car sales volume at PT Srikandi Diamond Motors

    NASA Astrophysics Data System (ADS)

    Susanti, D.; Hartini, E.; Permana, A.

    2017-01-01

    Growing competition in sales between companies in Indonesia means that every company must plan properly in order to win the competition with other companies. One way to design such a plan is to forecast car sales for the next few periods, so that the inventory of cars stocked is proportional to the number of cars needed. One method that can be used to obtain a correct forecast is Adaptive Spline Threshold Autoregression (ASTAR). This discussion therefore focuses on the use of the ASTAR method in forecasting the volume of car sales at PT Srikandi Diamond Motors using time series data. In this research, forecasting using the ASTAR method produces approximately correct values.

  13. [Improved euler algorithm for trend forecast model and its application to oil spectrum analysis].

    PubMed

    Zheng, Chang-song; Ma, Biao

    2009-04-01

    Oil atomic spectrometric analysis is one of the most important methods for fault diagnosis and state monitoring of large machine equipment, and the gray method is well suited to trend forecasting. Using oil atomic spectrometric analysis results together with gray forecast theory, the present paper establishes a gray forecast model of the Fe/Cu concentration trend in a power-shift steering transmission. To address the shortcomings of the gray method in trend forecasting, an improved Euler algorithm is put forward for the first time, resolving the imprecision caused by the old gray model's forecast value depending on the first test value. The new method makes the forecast value more precise, as shown in the example. Combined with the threshold value of the oil atomic spectrometric analysis, the new method was applied to the Fe/Cu concentration forecast and premonitory fault information was obtained, so that steps can be taken to prevent the fault. The algorithm can be generalized to state monitoring in industry.
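    The gray model underlying such trend forecasts is typically GM(1,1). The sketch below is the standard textbook formulation, not the paper's improved-Euler variant, and the concentration series is invented:

```python
import numpy as np

def gm11_forecast(x0, steps):
    """Gray GM(1,1) model: fit dx1/dt + a*x1 = b on the cumulative series
    x1 by least squares, then forecast the original series x0."""
    x0 = np.asarray(x0, float)
    x1 = np.cumsum(x0)
    z1 = 0.5 * (x1[1:] + x1[:-1])                 # background values
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(1, len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(np.concatenate([[x0[0]], x1_hat]))
    return x0_hat[len(x0) - 1:]                   # the `steps` future values

# invented Fe-concentration trend (ppm) rising roughly exponentially
fe = [10.0, 11.6, 13.1, 15.2, 17.3, 19.9]
print(gm11_forecast(fe, steps=3))  # continues the upward trend
```

    An improved Euler (Heun) step would integrate the whitening differential equation numerically instead of relying on the closed-form exponential solution anchored at the first value.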

  14. Applying different independent component analysis algorithms and support vector regression for IT chain store sales forecasting.

    PubMed

    Dai, Wensheng; Wu, Jui-Yu; Lu, Chi-Jie

    2014-01-01

    Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales, since an IT chain store has many branches. Integrating a feature extraction method with a prediction tool, such as support vector regression (SVR), is a useful way to construct an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique that has been widely applied to various forecasting problems, but, up to now, only the basic ICA method (i.e., the temporal ICA model) has been applied to sales forecasting. In this paper, we utilize three different ICA methods, spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA), to extract features from the sales data and compare their performance in sales forecasting for an IT chain store. Experimental results from real sales data show that the sales forecasting scheme integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data, and the extracted features can improve the prediction performance of SVR for sales forecasting.

  15. Applying Different Independent Component Analysis Algorithms and Support Vector Regression for IT Chain Store Sales Forecasting

    PubMed Central

    Dai, Wensheng

    2014-01-01

    Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales, since an IT chain store has many branches. Integrating a feature extraction method with a prediction tool, such as support vector regression (SVR), is a useful way to construct an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique that has been widely applied to various forecasting problems, but, up to now, only the basic ICA method (i.e., the temporal ICA model) has been applied to sales forecasting. In this paper, we utilize three different ICA methods, spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA), to extract features from the sales data and compare their performance in sales forecasting for an IT chain store. Experimental results from real sales data show that the sales forecasting scheme integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data, and the extracted features can improve the prediction performance of SVR for sales forecasting. PMID:25165740

  16. Verification of Weather Running Estimate-Nowcast (WRE-N) Forecasts Using a Spatial-Categorical Method

    DTIC Science & Technology

    2017-07-01

    forecasts and observations on a common grid, which enables the application of a number of different spatial verification methods that reveal various...forecasts of continuous meteorological variables using categorical and object-based methods. White Sands Missile Range (NM): Army Research Laboratory (US... Research version of the Weather Research and Forecasting Model adapted for generating short-range nowcasts and gridded observations produced by the

  17. A scoping review of malaria forecasting: past work and future directions

    PubMed Central

    Zinszer, Kate; Verma, Aman D; Charland, Katia; Brewer, Timothy F; Brownstein, John S; Sun, Zhuoyu; Buckeridge, David L

    2012-01-01

    Objectives: There is a growing body of literature on malaria forecasting methods, and the objective of our review is to identify and assess methods, including predictors, used to forecast malaria. Design: Scoping review. Two independent reviewers searched information sources, assessed studies for inclusion and extracted data from each study. Information sources: Search strategies were developed and the following databases were searched: CAB Abstracts, EMBASE, Global Health, MEDLINE, ProQuest Dissertations & Theses and Web of Science. Key journals and websites were also manually searched. Eligibility criteria for included studies: We included studies that forecasted incidence, prevalence or epidemics of malaria over time. A description of the forecasting model and an assessment of the forecast accuracy of the model were requirements for inclusion. Studies were restricted to human populations and to autochthonous transmission settings. Results: We identified 29 different studies that met our inclusion criteria for this review. The forecasting approaches included statistical modelling, mathematical modelling and machine learning methods. Climate-related predictors were used consistently in forecasting models, with the most common predictors being rainfall, relative humidity, temperature and the normalised difference vegetation index. Model evaluation was typically based on a reserved portion of data and accuracy was measured in a variety of ways including mean-squared error and correlation coefficients. We could not compare the forecast accuracy of models from the different studies as the evaluation measures differed across the studies.
    Conclusions: Applying different forecasting methods to the same data, exploring the predictive ability of non-environmental variables, including transmission-reducing interventions, and using common forecast accuracy measures will allow malaria researchers to compare and improve models and methods, which should improve the quality of malaria forecasting. PMID:23180505

  18. Combination of synoptical-analogous and dynamical methods to increase skill score of monthly air temperature forecasts over Northern Eurasia

    NASA Astrophysics Data System (ADS)

    Khan, Valentina; Tscepelev, Valery; Vilfand, Roman; Kulikova, Irina; Kruglova, Ekaterina; Tischenko, Vladimir

    2016-04-01

Long-range forecasts at monthly to seasonal time scales are in great demand among socio-economic sectors seeking to manage climate-related risks and opportunities. At the same time, the quality of long-range forecasts does not yet fully meet users' needs. Forecast centres therefore use different approaches, including the combination of different prognostic models, to increase prediction skill both regionally and globally. The present study considers two forecasting methods used in the operational practice of the Hydrometeorological Centre of Russia. The first is a synoptical-analogous method for forecasting surface air temperature at the monthly scale. The second is a dynamical system based on the global semi-Lagrangian model SL-AV, developed jointly by the Institute of Numerical Mathematics and the Hydrometeorological Centre of Russia; the seasonal version of this model is used to issue global and regional forecasts at monthly to seasonal time scales. We evaluate surface air temperature forecasts generated with the above-mentioned synoptical-statistical and dynamical models, and with their combination, to assess the potential gain in skill score over Northern Eurasia. The test sample of operational forecasts covers the period from 2010 through 2015. The seasonal and interannual variability of the skill scores of these methods is discussed. The quality of all forecasts was found to depend strongly on the persistence of macro-circulation processes: skill scores decrease during significant alterations of synoptic fields for both the dynamical and the empirical schemes. In some cases, the procedure of combining forecasts from the different methods demonstrated its effectiveness. This study was supported by Russian Science Foundation Grant №14-37-00053.
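The combination step described above can be illustrated with a minimal inverse-MSE weighting scheme: each model's forecast is weighted by the inverse of its mean squared error on a past verification sample. This is a hypothetical sketch, not the Centre's operational procedure; all function names and error values are invented for illustration.

```python
# Hypothetical sketch: combine two monthly temperature forecasts by
# inverse-MSE weighting estimated on a past verification sample.

def inverse_mse_weights(errors_a, errors_b):
    """Weights proportional to 1/MSE of each model's past errors."""
    mse_a = sum(e * e for e in errors_a) / len(errors_a)
    mse_b = sum(e * e for e in errors_b) / len(errors_b)
    wa, wb = 1.0 / mse_a, 1.0 / mse_b
    total = wa + wb
    return wa / total, wb / total

def combine(forecast_a, forecast_b, wa, wb):
    """Weighted blend of two forecasts of the same quantity."""
    return wa * forecast_a + wb * forecast_b

# Past errors (degrees C) of a synoptical-statistical and a dynamical model.
err_stat = [1.0, -1.5, 2.0, -0.5]
err_dyn = [0.5, 0.5, -0.5, 0.5]
wa, wb = inverse_mse_weights(err_stat, err_dyn)
print(combine(-2.0, -1.0, wa, wb))  # blend, pulled toward the better model
```

In this toy case the dynamical model has the smaller past MSE, so the blended anomaly lies closer to its forecast.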

  19. Forecasting Caspian Sea level changes using satellite altimetry data (June 1992-December 2013) based on evolutionary support vector regression algorithms and gene expression programming

    NASA Astrophysics Data System (ADS)

    Imani, Moslem; You, Rey-Jer; Kuo, Chung-Yen

    2014-10-01

Sea level forecasting at various time intervals is of great importance in water supply management. Evolutionary artificial intelligence (AI) approaches have been accepted as appropriate tools for modeling complex nonlinear phenomena in water bodies. In this study, we investigated the ability of two AI techniques to forecast Caspian Sea level anomalies using satellite altimetry observations from June 1992 to December 2013: the support vector machine (SVM), which is mathematically well founded and provides new insights into function approximation, and gene expression programming (GEP). SVM demonstrates the best performance in predicting Caspian Sea level anomalies, with the minimum root mean square error (RMSE = 0.035) and the maximum coefficient of determination (R2 = 0.96) over the prediction periods. A comparison between the proposed AI approaches and a cascade correlation neural network (CCNN) model also shows the superiority of the GEP and SVM models over the CCNN.

  20. Electron Flux Models for Different Energies at Geostationary Orbit

    NASA Technical Reports Server (NTRS)

    Boynton, R. J.; Balikhin, M. A.; Sibeck, D. G.; Walker, S. N.; Billings, S. A.; Ganushkina, N.

    2016-01-01

Forecast models were derived for energetic electrons at all energy ranges sampled by the third-generation Geostationary Operational Environmental Satellites (GOES). These models were based on Multi-Input Single-Output Nonlinear Autoregressive Moving Average with Exogenous inputs (NARMAX) methodologies. The model inputs include the solar wind velocity, density and pressure, the fraction of time that the interplanetary magnetic field (IMF) was southward, the IMF contribution of a solar wind-magnetosphere coupling function proposed by Boynton et al. (2011b), and the Dst index. As such, this study has deduced five new 1 h resolution models for the low-energy electrons measured by GOES (30-50 keV, 50-100 keV, 100-200 keV, 200-350 keV, and 350-600 keV) and extended the existing >800 keV and >2 MeV Geostationary Earth Orbit electron flux models to forecast at a 1 h resolution. All of these models were shown to provide accurate forecasts, with prediction efficiencies ranging between 66.9% and 82.3%.

  1. A stochastic post-processing method for solar irradiance forecasts derived from NWPs models

    NASA Astrophysics Data System (ADS)

    Lara-Fanego, V.; Pozo-Vazquez, D.; Ruiz-Arias, J. A.; Santos-Alamillos, F. J.; Tovar-Pescador, J.

    2010-09-01

Solar irradiance forecasting is an important area of research for the future of solar-based renewable energy systems. Numerical Weather Prediction (NWP) models have proved to be a valuable tool for solar irradiance forecasting at lead times up to a few days. Nevertheless, these models show low skill in forecasting solar irradiance under cloudy conditions. Additionally, climatic aerosol loadings (averaged over seasons) are usually assumed in these models, leading to considerable errors in Direct Normal Irradiance (DNI) forecasts under high aerosol load conditions. In this work we propose a post-processing method for the Global Horizontal Irradiance (GHI) and DNI forecasts derived from NWP models. In particular, the method is based on Autoregressive Moving Average with External Explanatory Variables (ARMAX) stochastic models, which are applied to the residuals of the NWP forecasts and use as external variables the measured cloud fraction and aerosol loading of the day prior to the forecast. The method is evaluated on a set of one-month-long, three-day-ahead forecasts of GHI and DNI, obtained with the WRF mesoscale atmospheric model for several locations in Andalusia (Southern Spain). The cloud fraction is derived from MSG satellite estimates and the aerosol loading from MODIS estimates; both sources of information are readily available at forecast time. Results show a considerable improvement in the forecasting skill of the WRF model when the proposed post-processing method is used. In particular, the relative improvement (in terms of RMSE) for DNI during summer is about 20%; a similar value is obtained for GHI during winter.
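The core idea of residual post-processing can be sketched with a much-simplified stand-in: fit an AR(1) model to the NWP forecast residuals and add the predicted residual back onto the raw forecast. The full ARMAX scheme of the abstract also uses cloud fraction and aerosol loading as exogenous variables; these are omitted here, and the residual values are synthetic.

```python
# Simplified stand-in for ARMAX post-processing of NWP irradiance
# forecasts: an AR(1) model of the residuals (observation - forecast),
# without the exogenous cloud/aerosol terms used in the actual method.

def ar1_coefficient(residuals):
    """Least-squares AR(1) coefficient for a zero-mean residual series."""
    num = sum(residuals[t] * residuals[t - 1] for t in range(1, len(residuals)))
    den = sum(r * r for r in residuals[:-1])
    return num / den

def corrected_forecast(nwp_forecast, last_residual, phi):
    """Add the predicted next residual back onto the raw NWP forecast."""
    return nwp_forecast + phi * last_residual

# Synthetic residuals (W/m^2) with positive day-to-day persistence.
resid = [40.0, 32.0, 25.0, 21.0, 16.0, 13.0]
phi = ar1_coefficient(resid)
corrected = corrected_forecast(500.0, resid[-1], phi)
print(phi, corrected)
```

Because the residuals are positively autocorrelated, the correction nudges the next forecast upward toward the recent observation errors.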

  2. New Models for Forecasting Enrollments: Fuzzy Time Series and Neural Network Approaches.

    ERIC Educational Resources Information Center

    Song, Qiang; Chissom, Brad S.

    Since university enrollment forecasting is very important, many different methods and models have been proposed by researchers. Two new methods for enrollment forecasting are introduced: (1) the fuzzy time series model; and (2) the artificial neural networks model. Fuzzy time series has been proposed to deal with forecasting problems within a…

  3. Post-processing method for wind speed ensemble forecast using wind speed and direction

    NASA Astrophysics Data System (ADS)

    Sofie Eide, Siri; Bjørnar Bremnes, John; Steinsland, Ingelin

    2017-04-01

Statistical methods are widely applied to enhance the quality of both deterministic and ensemble NWP forecasts. In many situations, such as wind speed forecasting, most of the predictive information is contained in a single variable of the NWP model. However, in statistical calibration of deterministic forecasts it is often found that including more variables can further improve forecast skill. For ensembles this is rarely taken advantage of, mainly because it is generally not straightforward to include multiple variables. In this study, it is demonstrated how multiple variables can be included in Bayesian model averaging (BMA) by using a flexible regression method to estimate the conditional means. The method is applied to wind speed forecasting at 204 Norwegian stations based on wind speed and direction forecasts from the ECMWF ensemble system. At about 85 % of the sites, the ensemble forecasts were improved in terms of CRPS by adding wind direction as a predictor compared with using wind speed only. On average the improvements were about 5 %, mainly for moderate to strong wind situations; for weak wind speeds, adding wind direction had a more or less neutral impact.

  4. A Comparison of Forecast Error Generators for Modeling Wind and Load Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ning; Diao, Ruisheng; Hafen, Ryan P.

    2013-07-25

This paper presents four algorithms for generating random forecast error time series and compares their performance. The error time series are used to create real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast time series that statistically match historically observed forecast data sets used in power grid operation, in order to study the net load balancing need in variable generation integration studies. The four algorithms are truncated-normal distribution models, state-space based Markov models, seasonal autoregressive moving average (ARMA) models, and a stochastic-optimization based approach. The comparison is made using historical DA load forecasts and actual load values to generate new sets of DA forecasts with similar statistical forecast error characteristics (i.e., mean, standard deviation, autocorrelation, and cross-correlation). The results show that all four methods generate satisfactory results: one method may preserve one or two required statistical characteristics better than the others, but may not preserve the remaining characteristics as well. Because the wind and load forecast error generators are used in wind integration studies to produce wind and load forecast time series for stochastic planning processes, it is sometimes critical to use multiple methods to generate the error time series in order to obtain a statistically robust result. The paper therefore discusses and compares the capability of each algorithm to preserve the characteristics of the historical forecast data sets.
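A basic member of this family of generators can be sketched as an AR(1) process tuned to match a target standard deviation and lag-1 autocorrelation of historical forecast errors. This is an illustrative simplification, not one of the paper's four algorithms; the parameter values are arbitrary.

```python
import random

# Illustrative forecast-error generator: an AR(1) series whose standard
# deviation and lag-1 autocorrelation match chosen historical targets.

def generate_ar1_errors(n, sigma, rho, seed=0):
    """e_t = rho * e_{t-1} + w_t, with w_t scaled so std(e) ~= sigma."""
    rng = random.Random(seed)
    w_sigma = sigma * (1.0 - rho * rho) ** 0.5  # innovation std dev
    e, series = 0.0, []
    for _ in range(n):
        e = rho * e + rng.gauss(0.0, w_sigma)
        series.append(e)
    return series

# Target: MW-scale load forecast errors with strong hour-to-hour persistence.
errors = generate_ar1_errors(n=5000, sigma=100.0, rho=0.8)
```

A generated series like this can then be added to actual load values to produce synthetic day-ahead forecasts with realistic error statistics.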

  5. From Tornadoes to Earthquakes: Forecast Verification for Binary Events Applied to the 1999 Chi-Chi, Taiwan, Earthquake

    NASA Astrophysics Data System (ADS)

    Chen, C.; Rundle, J. B.; Holliday, J. R.; Nanjo, K.; Turcotte, D. L.; Li, S.; Tiampo, K. F.

    2005-12-01

Forecast verification procedures for statistical events with binary outcomes typically rely on the use of contingency tables and Relative Operating Characteristic (ROC) diagrams. Originally developed for the statistical evaluation of tornado forecasts on a county-by-county basis, these methods can be adapted to the evaluation of competing earthquake forecasts. Here we apply these methods retrospectively to two forecasts for the m = 7.3 1999 Chi-Chi, Taiwan, earthquake. These forecasts are based on the Pattern Informatics (PI) method, which locates likely sites for future large earthquakes from large changes in the activity of the smallest earthquakes. A competing null hypothesis, Relative Intensity (RI), is based on the idea that future large earthquake locations are correlated with sites having the greatest frequency of small earthquakes. We show that for Taiwan, the PI forecast method is superior to the RI forecast null hypothesis. Inspection of the two maps indicates that their forecast locations are indeed quite different. Our results confirm an earlier result suggesting that the earthquake preparation process for events such as the Chi-Chi earthquake involves anomalous changes in activation or quiescence, and that signatures of these processes can be detected in precursory seismicity data. Furthermore, we find that our methods can accurately forecast the locations of aftershocks from precursory seismicity changes alone, implying that the main shock together with its aftershocks represents a single manifestation of the formation of a high-stress region nucleating prior to the main shock.
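The contingency-table scores underlying an ROC diagram can be computed directly from paired binary forecasts and observations. The sketch below is a generic verification helper; the toy data are invented and do not come from the Chi-Chi study.

```python
# Forecast verification for binary events via a 2x2 contingency table,
# as used for tornado forecasts and adapted here to earthquake alarms.

def contingency_scores(forecasts, observations):
    """forecasts/observations are 0/1 sequences over spatial cells."""
    a = sum(1 for f, o in zip(forecasts, observations) if f and o)          # hits
    b = sum(1 for f, o in zip(forecasts, observations) if f and not o)      # false alarms
    c = sum(1 for f, o in zip(forecasts, observations) if not f and o)      # misses
    d = sum(1 for f, o in zip(forecasts, observations) if not f and not o)  # correct negatives
    hit_rate = a / (a + c)          # y-axis of an ROC diagram
    false_alarm_rate = b / (b + d)  # x-axis of an ROC diagram
    return hit_rate, false_alarm_rate

# Toy example: 10 spatial cells, forecast alarms vs observed large events.
f = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]
o = [1, 0, 0, 0, 1, 0, 1, 0, 0, 0]
print(contingency_scores(f, o))  # (hit rate, false-alarm rate)
```

Sweeping an alarm threshold and replotting these two rates traces out the ROC curve used to compare competing forecasts such as PI and RI.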

  6. Forecasting short-term data center network traffic load with convolutional neural networks.

    PubMed

    Mozo, Alberto; Ordozgoiti, Bruno; Gómez-Canaval, Sandra

    2018-01-01

Efficient resource management in data centers is of central importance to content service providers as 90 percent of the network traffic is expected to go through them in the coming years. In this context we propose the use of convolutional neural networks (CNNs) to forecast short-term changes in the amount of traffic crossing a data center network. This value is an indicator of virtual machine activity and can be utilized to shape the data center infrastructure accordingly. The behaviour of network traffic at the seconds scale is highly chaotic and therefore traditional time-series-analysis approaches such as ARIMA fail to obtain accurate forecasts. We show that our convolutional neural network approach can exploit the non-linear regularities of network traffic, providing significant improvements with respect to the mean absolute and standard deviation of the data, and outperforming ARIMA by an increasingly significant margin as the forecasting granularity rises above the 16-second resolution. In order to increase the accuracy of the forecasting model, we exploit the architecture of the CNNs using multiresolution input distributed among separate channels of the first convolutional layer. We validate our approach with an extensive set of experiments using a data set collected at the core network of an Internet Service Provider over a period of 5 months, totalling 70 days of traffic at the one-second resolution.

  7. Modeling Markov switching ARMA-GARCH neural networks models and an application to forecasting stock returns.

    PubMed

    Bildirici, Melike; Ersin, Özgür

    2014-01-01

The study has two aims. The first is to propose a family of nonlinear GARCH models that incorporate fractional integration and asymmetric power properties into MS-GARCH processes. The second is to augment the MS-GARCH type models with artificial neural networks, exploiting their universal approximation properties to achieve improved forecasting accuracy. Accordingly, the proposed Markov-switching MS-ARMA-FIGARCH, APGARCH, and FIAPGARCH processes are further augmented with MLP, recurrent NN, and hybrid NN type neural networks. The MS-ARMA-GARCH family and MS-ARMA-GARCH-NN family are used to model daily stock returns in an emerging market, the Istanbul Stock Index (ISE100). Forecast accuracy is evaluated in terms of the MAE, MSE, and RMSE error criteria and Diebold-Mariano equal forecast accuracy tests. The results suggest that the fractionally integrated and asymmetric power counterparts of Gray's MS-GARCH model provide promising results, while the best results are obtained for their neural network based counterparts. Further, among the models analyzed, those based on the hybrid MLP and recurrent NN, the MS-ARMA-FIAPGARCH-HybridMLP and MS-ARMA-FIAPGARCH-RNN, provided the best forecast performance, surpassing the baseline single-regime GARCH models and Gray's MS-GARCH model. The models are therefore promising for various economic applications.

  8. Modeling Markov Switching ARMA-GARCH Neural Networks Models and an Application to Forecasting Stock Returns

    PubMed Central

    Bildirici, Melike; Ersin, Özgür

    2014-01-01

The study has two aims. The first is to propose a family of nonlinear GARCH models that incorporate fractional integration and asymmetric power properties into MS-GARCH processes. The second is to augment the MS-GARCH type models with artificial neural networks, exploiting their universal approximation properties to achieve improved forecasting accuracy. Accordingly, the proposed Markov-switching MS-ARMA-FIGARCH, APGARCH, and FIAPGARCH processes are further augmented with MLP, recurrent NN, and hybrid NN type neural networks. The MS-ARMA-GARCH family and MS-ARMA-GARCH-NN family are used to model daily stock returns in an emerging market, the Istanbul Stock Index (ISE100). Forecast accuracy is evaluated in terms of the MAE, MSE, and RMSE error criteria and Diebold-Mariano equal forecast accuracy tests. The results suggest that the fractionally integrated and asymmetric power counterparts of Gray's MS-GARCH model provide promising results, while the best results are obtained for their neural network based counterparts. Further, among the models analyzed, those based on the hybrid MLP and recurrent NN, the MS-ARMA-FIAPGARCH-HybridMLP and MS-ARMA-FIAPGARCH-RNN, provided the best forecast performance, surpassing the baseline single-regime GARCH models and Gray's MS-GARCH model. The models are therefore promising for various economic applications. PMID:24977200

  9. Forecasting short-term data center network traffic load with convolutional neural networks

    PubMed Central

    Ordozgoiti, Bruno; Gómez-Canaval, Sandra

    2018-01-01

Efficient resource management in data centers is of central importance to content service providers as 90 percent of the network traffic is expected to go through them in the coming years. In this context we propose the use of convolutional neural networks (CNNs) to forecast short-term changes in the amount of traffic crossing a data center network. This value is an indicator of virtual machine activity and can be utilized to shape the data center infrastructure accordingly. The behaviour of network traffic at the seconds scale is highly chaotic and therefore traditional time-series-analysis approaches such as ARIMA fail to obtain accurate forecasts. We show that our convolutional neural network approach can exploit the non-linear regularities of network traffic, providing significant improvements with respect to the mean absolute and standard deviation of the data, and outperforming ARIMA by an increasingly significant margin as the forecasting granularity rises above the 16-second resolution. In order to increase the accuracy of the forecasting model, we exploit the architecture of the CNNs using multiresolution input distributed among separate channels of the first convolutional layer. We validate our approach with an extensive set of experiments using a data set collected at the core network of an Internet Service Provider over a period of 5 months, totalling 70 days of traffic at the one-second resolution. PMID:29408936

  10. Evolving forecasting classifications and applications in health forecasting

    PubMed Central

    Soyiri, Ireneous N; Reidpath, Daniel D

    2012-01-01

Health forecasting forewarns the health community about future health situations and disease episodes so that health systems can better allocate resources and manage demand. The tools used for developing and measuring the accuracy and validity of health forecasts are commonly not well defined, although they are usually adapted forms of statistical procedures. This review identifies previous typologies used to classify the forecasting methods commonly applied to health conditions or situations. It then discusses the strengths and weaknesses of these methods and presents the choices available for measuring the accuracy of health-forecasting models, including a note on discrepancies in the modes of validation. PMID:22615533

  11. Automated time series forecasting for biosurveillance.

    PubMed

    Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit

    2007-09-30

    For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
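The residual-forming step described above can be sketched with a non-seasonal Holt (level plus trend) smoother; the paper's actual method is the full Holt-Winters variant with day-of-week seasonality, and the smoothing constants and counts below are arbitrary illustrative choices.

```python
# Sketch of forming detector input: subtract one-step-ahead exponential
# smoothing forecasts from observations to obtain residuals. Non-seasonal
# Holt smoothing stands in for the generalized Holt-Winters method.

def holt_residuals(series, alpha=0.4, beta=0.2):
    """One-step-ahead Holt forecasts; returns observation - forecast."""
    level, trend = series[0], 0.0
    residuals = []
    for y in series[1:]:
        forecast = level + trend        # one-step-ahead forecast
        residuals.append(y - forecast)  # residual fed to the alarm algorithm
        last_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
    return residuals

counts = [12, 14, 13, 15, 16, 18, 17, 19, 21, 20]  # daily syndromic counts
print(holt_residuals(counts))
```

Because the smoother tracks the trend, the residuals are closer to serially uncorrelated noise than the raw counts, which is what control-chart monitoring assumes.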

  12. Real-time eruption forecasting using the material Failure Forecast Method with a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Boué, A.; Lesage, P.; Cortés, G.; Valette, B.; Reyes-Dávila, G.

    2015-04-01

Many attempts at deterministic forecasting of eruptions and landslides have used the material Failure Forecast Method (FFM), which consists of fitting an empirical power law to precursory patterns of seismicity or deformation. Until now, most studies have presented hindsight forecasts based on complete time series of precursors and have not evaluated the ability of the method to carry out real-time forecasting with partial precursory sequences. In this study, we present a rigorous approach to the FFM designed for real-time application to volcano-seismic precursors. We use a Bayesian approach based on FFM theory and an automatic classification of seismic events, with the probability distributions of the data deduced from the performance of this classification used as input. As output, the method provides the probability of the forecast time at each observation time before the eruption. The spread of the a posteriori probability density function of the prediction time, and its stability with respect to the observation time, are used as criteria to evaluate the reliability of the forecast. We test the method on precursory accelerations of long-period seismicity prior to vulcanian explosions at Volcán de Colima (Mexico). For explosions preceded by a single phase of seismic acceleration, we obtain accurate and reliable forecasts using approximately 80% of the whole precursory sequence. It is, however, more difficult to apply the method to multiple acceleration patterns.
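The deterministic core of the FFM can be sketched in its classic form: for the common power-law exponent p = 2, the inverse event rate decreases linearly with time, and extrapolating that line to zero estimates the failure (eruption) time. The Bayesian real-time machinery of the paper is omitted, and the precursor data below are synthetic.

```python
# Minimal deterministic sketch of the material Failure Forecast Method:
# fit a least-squares line through (t, 1/rate) and return its root,
# which estimates the failure time for power-law exponent p = 2.

def ffm_forecast(times, rates):
    """Least-squares line through (t, 1/rate); returns its zero crossing."""
    inv = [1.0 / r for r in rates]
    n = len(times)
    mt = sum(times) / n
    mi = sum(inv) / n
    slope = sum((t - mt) * (v - mi) for t, v in zip(times, inv)) / \
            sum((t - mt) ** 2 for t in times)
    intercept = mi - slope * mt
    return -intercept / slope  # time at which 1/rate extrapolates to zero

# Synthetic accelerating precursor: rate = 100 / (50 - t) events per hour,
# so the true failure time is t = 50.
ts = [10, 20, 30, 40, 45]
rs = [100.0 / (50.0 - t) for t in ts]
print(ffm_forecast(ts, rs))  # ~50
```

Real precursory sequences are noisy and often show multiple acceleration phases, which is exactly why the paper wraps this extrapolation in a Bayesian framework.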

  13. Forecasting the short-term passenger flow on high-speed railway with neural networks.

    PubMed

    Xie, Mei-Quan; Li, Xia-Miao; Zhou, Wen-Liang; Fu, Yan-Bing

    2014-01-01

    Short-term passenger flow forecasting is an important component of transportation systems. The forecasting result can be applied to support transportation system operation and management such as operation planning and revenue management. In this paper, a divide-and-conquer method based on neural network and origin-destination (OD) matrix estimation is developed to forecast the short-term passenger flow in high-speed railway system. There are three steps in the forecasting method. Firstly, the numbers of passengers who arrive at each station or depart from each station are obtained from historical passenger flow data, which are OD matrices in this paper. Secondly, short-term passenger flow forecasting of the numbers of passengers who arrive at each station or depart from each station based on neural network is realized. At last, the OD matrices in short-term time are obtained with an OD matrix estimation method. The experimental results indicate that the proposed divide-and-conquer method performs well in forecasting the short-term passenger flow on high-speed railway.

  14. Data-driven outbreak forecasting with a simple nonlinear growth model

    PubMed Central

    Lega, Joceline; Brown, Heidi E.

    2016-01-01

    Recent events have thrown the spotlight on infectious disease outbreak response. We developed a data-driven method, EpiGro, which can be applied to cumulative case reports to estimate the order of magnitude of the duration, peak and ultimate size of an ongoing outbreak. It is based on a surprisingly simple mathematical property of many epidemiological data sets, does not require knowledge or estimation of disease transmission parameters, is robust to noise and to small data sets, and runs quickly due to its mathematical simplicity. Using data from historic and ongoing epidemics, we present the model. We also provide modeling considerations that justify this approach and discuss its limitations. In the absence of other information or in conjunction with other models, EpiGro may be useful to public health responders. PMID:27770752

  15. A stochastic delay model for pricing debt and equity: Numerical techniques and applications

    NASA Astrophysics Data System (ADS)

    Tambue, Antoine; Kemajou Brown, Elisabeth; Mohammed, Salah

    2015-01-01

Delayed nonlinear models for pricing corporate liabilities and European options were recently developed. Using a self-financing strategy and duplication, we derived a Random Partial Differential Equation (RPDE) whose solutions describe the evolution of debt and equity values of a corporation over the last delay-period interval in the companion paper (Kemajou et al., 2012) [14]. In this paper, we provide robust numerical techniques to solve the delayed nonlinear model for the corporate value, along with the corresponding RPDEs modeling the debt and equity values of the corporation. Using financial data from several firms, we forecast and compare numerical solutions from both the nonlinear delayed model and the classical Merton model with the real corporate data. This comparison suggests that in corporate finance the past dependence of the firm value process may be an important feature and therefore should not be ignored.

  16. Nonlinear GARCH model and 1 / f noise

    NASA Astrophysics Data System (ADS)

    Kononovicius, A.; Ruseckas, J.

    2015-06-01

Auto-regressive conditionally heteroskedastic (ARCH) family models are still used by practitioners in business and economic policy making as conditional volatility forecasting models, and they continue to attract research interest. In this contribution we consider the well-known GARCH(1,1) process and its nonlinear modifications, reminiscent of the NGARCH model. We investigate the possibility of reproducing power-law statistics, i.e. the probability density function and the power spectral density, using ARCH family models. For this purpose we derive stochastic differential equations from the GARCH processes under consideration, and we find the obtained equations to be similar to a general class of stochastic differential equations known to reproduce power-law statistics. We show that the linear GARCH(1,1) process has a power-law distribution, but its power spectral density is Brownian-noise-like. The nonlinear modifications, however, exhibit both a power-law distribution and a power spectral density of the 1/f^β form, including 1/f noise.
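The linear GARCH(1,1) process discussed above is straightforward to simulate directly from its defining recursions. The sketch below uses arbitrary illustrative parameter values satisfying the stationarity condition alpha + beta < 1; it simulates the plain linear process, not the nonlinear NGARCH-like modifications studied in the paper.

```python
import random

# Simulation of a linear GARCH(1,1) return process:
#   r_t = sigma_t * z_t,   sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2,
# with z_t standard normal and alpha + beta < 1 for stationarity.

def simulate_garch(n, omega=0.1, alpha=0.1, beta=0.8, seed=1):
    rng = random.Random(seed)
    var = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    returns = []
    for _ in range(n):
        r = var ** 0.5 * rng.gauss(0.0, 1.0)
        returns.append(r)
        var = omega + alpha * r * r + beta * var
    return returns

# With these parameters the unconditional variance is omega/(1-alpha-beta) = 1.
rets = simulate_garch(20000)
```

A long simulated sample like this is what one would feed into an empirical estimate of the probability density and power spectral density to check for the power-law behaviour the paper analyzes.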

  17. Sensitivity of monthly streamflow forecasts to the quality of rainfall forcing: When do dynamical climate forecasts outperform the Ensemble Streamflow Prediction (ESP) method?

    NASA Astrophysics Data System (ADS)

    Tanguy, M.; Prudhomme, C.; Harrigan, S.; Smith, K. A.; Parry, S.

    2017-12-01

Forecasting hydrological extremes is challenging, especially at lead times over one month for catchments with limited hydrological memory and variable climates. One simple way to derive monthly or seasonal hydrological forecasts is to use historical climate data to drive hydrological models, the Ensemble Streamflow Prediction (ESP) method; this gives a range of possible future streamflows given known initial hydrologic conditions alone. The skill of ESP depends strongly on the forecast initialisation month and on catchment type. Using dynamical rainfall forecasts as driving data instead of historical data could potentially improve streamflow predictions, and considerable effort is being invested within the meteorological community to improve these forecasts. However, while recent progress shows promise (e.g. for the NAO in winter), the skill of these forecasts at monthly to seasonal timescales is generally still limited, and the extent to which they might lead to improved hydrological forecasts is an area of active research. Additionally, these meteorological forecasts are currently produced at monthly or seasonal time steps in the UK, whereas hydrological models require forcings at daily or sub-daily time steps. Keeping in mind these limitations of available rainfall forecasts, the objectives of this study are to find out (i) how accurate monthly dynamical rainfall forecasts need to be to outperform ESP, and (ii) how the method used to disaggregate monthly rainfall forecasts into daily rainfall time series affects the results. For the first objective, synthetic rainfall time series were created by increasingly degrading observed data (a proxy for a `perfect forecast') from 0 % to +/-50 % error. For the second objective, three different methods were used to disaggregate monthly rainfall data into daily time series. These were used to force a simple lumped hydrological model (GR4J) to generate streamflow predictions at a one-month lead time for over 300 catchments representative of the range of the UK's hydro-climatic conditions, and the forecasts were then benchmarked against the traditional ESP method. It is hoped that the results of this work will help the meteorological community identify where to focus their efforts in order to increase the usefulness of their forecasts within hydrological forecasting systems.
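The ESP idea itself is simple to sketch: run the same hydrological model from today's catchment state once per historical climate year, producing one streamflow trace per year. The toy single-linear-reservoir model below is a stand-in for GR4J, and all numbers are invented.

```python
# Sketch of Ensemble Streamflow Prediction (ESP): drive a toy
# rainfall-runoff model from a shared initial storage with each
# historical year's rainfall, yielding one ensemble member per year.

def linear_reservoir(storage, rainfall, k=0.1):
    """Toy model: daily outflow is a fixed fraction k of current storage."""
    flows = []
    for day_rain in rainfall:
        storage += day_rain
        outflow = k * storage
        storage -= outflow
        flows.append(outflow)
    return flows

def esp_ensemble(initial_storage, historical_rainfalls):
    """One ensemble member per historical rainfall sequence."""
    return [linear_reservoir(initial_storage, rain) for rain in historical_rainfalls]

# Three 'historical years' of daily rainfall (mm) over a 5-day horizon.
history = [[5, 0, 2, 0, 1], [0, 0, 0, 0, 0], [10, 8, 0, 4, 2]]
ensemble = esp_ensemble(initial_storage=100.0, historical_rainfalls=history)
```

The spread across members reflects climate uncertainty only; all members share the same initial hydrologic conditions, which is precisely what gives ESP skill in catchments with long memory.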

  18. Forecasting Consumer Adoption of Information Technology and Services--Lessons from Home Video Forecasting.

    ERIC Educational Resources Information Center

    Klopfenstein, Bruce C.

    1989-01-01

    Describes research that examined the strengths and weaknesses of technological forecasting methods by analyzing forecasting studies made for home video players. The discussion covers assessments and explications of correct and incorrect forecasting assumptions, and their implications for forecasting the adoption of home information technologies…

  19. Real-time emergency forecasting technique for situation management systems

    NASA Astrophysics Data System (ADS)

    Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.

    2018-05-01

The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision making in situation management systems. The computational models are improved by an improved Brown's method that applies fractal dimension to forecast short time series of data received from sensors and control systems. The reliability of the emergency forecasting results is ensured by filtering invalid sensor data using methods of correlation analysis.

  20. A Delphi forecast of technology in education

    NASA Technical Reports Server (NTRS)

    Robinson, B. E.

    1973-01-01

    The results are reported of a Delphi forecast of the utilization and social impacts of large-scale educational telecommunications technology. The focus is on both forecasting methodology and educational technology. The various methods of forecasting used by futurists are analyzed from the perspective of the most appropriate method for a prognosticator of educational technology, and review and critical analysis are presented of previous forecasts and studies. Graphic responses, summarized comments, and a scenario of education in 1990 are presented.

  1. Applications of the gambling score in evaluating earthquake predictions and forecasts

    NASA Astrophysics Data System (ADS)

    Zhuang, Jiancang; Zechar, Jeremy D.; Jiang, Changsheng; Console, Rodolfo; Murru, Maura; Falcone, Giuseppe

    2010-05-01

    This study presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some points of his reputation. The reference model, which plays the role of the house, determines according to a fair rule how many reputation points the forecaster gains if he succeeds, and takes away the reputation points bet by the forecaster if he loses. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. For discrete predictions, we apply this method to evaluate the performance of Shebalin's predictions made using the Reverse Tracing of Precursors (RTP) algorithm and of the predictions from the Annual Consultation Meeting on Earthquake Tendency held by the China Earthquake Administration. For the continuous case, we use it to compare probability forecasts of seismicity in the Abruzzo region before and after the L'Aquila earthquake based on the ETAS model and the PPE model.
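    A minimal sketch of the discrete-case idea follows. The payout rule below is one "fair" rule consistent with the description (expected gain is zero if events really follow the reference model); the function and variable names are illustrative:

```python
def gambling_score(bets, outcomes, p_ref):
    """Total reputation gained by a forecaster who bets `bets[i]` points
    that an event occurs in bin i, against a reference model assigning
    probability p_ref[i] to that event. On success the house pays at
    fair odds, r * (1 - p0) / p0; on failure the bet r is lost."""
    score = 0.0
    for r, hit, p0 in zip(bets, outcomes, p_ref):
        score += r * (1.0 - p0) / p0 if hit else -r
    return score
```

    A forecaster who consistently beats the reference model accumulates reputation; one who merely matches it hovers around zero on average.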

  2. An Application to the Prediction of LOD Change Based on General Regression Neural Network

    NASA Astrophysics Data System (ADS)

    Zhang, X. H.; Wang, Q. J.; Zhu, J. J.; Zhang, H.

    2011-07-01

    Traditional prediction of the LOD (length of day) change was based on linear models, such as the least-squares model and the autoregressive technique. Due to the complex non-linear features of the LOD variation, the performance of linear model predictors is not fully satisfactory. This paper applies a non-linear neural network, the general regression neural network (GRNN), to forecast the LOD change, and the results are analyzed and compared with those obtained with the back-propagation neural network and other models. The comparison shows that the GRNN model is an efficient and feasible predictor of the LOD change.
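    At its core, a GRNN prediction reduces to a Gaussian-kernel-weighted average of the training targets (the Nadaraya-Watson form). A one-dimensional sketch, with an assumed smoothing parameter sigma:

```python
import math

def grnn_predict(x, train_x, train_y, sigma=1.0):
    """General regression neural network: predict at x as the
    Gaussian-kernel-weighted average of training targets."""
    weights = [math.exp(-((x - xi) ** 2) / (2.0 * sigma ** 2))
               for xi in train_x]
    return sum(w * y for w, y in zip(weights, train_y)) / sum(weights)
```

    The only free parameter is the kernel width sigma, which is why GRNN training is fast compared with back-propagation networks.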

  3. Gambling scores for earthquake predictions and forecasts

    NASA Astrophysics Data System (ADS)

    Zhuang, Jiancang

    2010-04-01

    This paper presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some points of his reputation. The reference model, which plays the role of the house, determines according to a fair rule how many reputation points the forecaster gains if he succeeds, and takes away the reputation points bet by the forecaster if he loses. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model, and when the reference model is the Poisson model.

  4. Forecasting space weather over short horizons: Revised and updated estimates

    NASA Astrophysics Data System (ADS)

    Reikard, Gordon

    2018-07-01

    Space weather reflects multiple causes. There is a clear influence of the sun on the near-earth environment. Solar activity shows evidence of chaotic properties, implying that prediction may be limited beyond short horizons. At the same time, geomagnetic activity also reflects the rotation of the earth's core, and local currents in the ionosphere. The combination of influences means that geomagnetic indexes behave like multifractals, exhibiting nonlinear variability, with intermittent outliers. This study tests a range of models: regressions, neural networks, and a frequency domain algorithm. Forecasting tests are run for sunspots and irradiance from 1820 onward, for the Aa geomagnetic index from 1868 onward, and the Am index from 1959 onward, over horizons of 1-7 days. For irradiance and sunspots, persistence actually does better over short horizons. None of the other models really dominates. For the geomagnetic indexes, the persistence method does badly, while the neural net also shows large errors. The remaining models all achieve about the same level of accuracy. The errors are in the range of 48% at 1 day, and 54% at all later horizons. Additional tests are run over horizons of 1-4 weeks. At 1 week, the best models reduce the error to about 35%. Over horizons of four weeks, the model errors increase. The findings are somewhat pessimistic. Over short horizons, geomagnetic activity exhibits so much random variation that the forecast errors are extremely high. Over slightly longer horizons, there is some improvement from estimating in the frequency domain, but not a great deal. Including solar activity in the models does not yield any improvement in accuracy.
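    The persistence benchmark used in comparisons like this is trivial to implement; a sketch, where the percentage error metric shown is a mean absolute percentage error (one plausible reading of the reported error percentages, not necessarily the study's exact metric):

```python
def persistence_forecast(series, horizon=1):
    """Persistence: the forecast h steps ahead is the last observed value,
    so the forecast series is the observed series shifted by h."""
    return series[:len(series) - horizon]

def mean_pct_error(actual, forecast):
    """Mean absolute percentage error between aligned series."""
    return 100.0 * sum(abs(a - f) / abs(a)
                       for a, f in zip(actual, forecast)) / len(actual)
```

    Usage: compare mean_pct_error(series[horizon:], persistence_forecast(series, horizon)) against a candidate model's error to see whether the model beats persistence.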

  5. Load Forecasting of Central Urban Area Power Grid Based on Saturated Load Density Index

    NASA Astrophysics Data System (ADS)

    Huping, Yang; Chengyi, Tang; Meng, Yu

    2018-03-01

    In today's society, coordination between urban power grid development and city development has become increasingly prominent. Saturated load forecasting plays an important role in the planning and development of power grids; it is a concept put forward in China in recent years in the field of grid planning. Unlike traditional load forecasting for specific years, urban saturated load forecasting typically covers a large time span and involves a wide range of factors. Taking a county in eastern Jiangxi as an example, this paper applies a variety of load forecasting methods to near-term load forecasting for the central urban area. It then uses the load density index method to produce a long-term forecast of the saturated load of the central urban area through 2030, and further examines the general spatial distribution of the urban saturated load.

  6. Long- Range Forecasting Of The Onset Of Southwest Monsoon Winds And Waves Near The Horn Of Africa

    DTIC Science & Technology

    2017-12-01

    Summary of climate analysis and long-range forecast methodology: prior theses from Heidt (2006) and Lemke (2010) used methods similar to ours. [Remainder is table-of-contents residue: II. Data and Methods; Analysis and Forecast Methods; Predictand Selection.]

  7. Predicting Academic Library Circulations: A Forecasting Methods Competition.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.; Forys, John W., Jr.

    Based on sample data representing five years of monthly circulation totals from 50 academic libraries in Illinois, Iowa, Michigan, Minnesota, Missouri, and Ohio, a study was conducted to determine the most efficient smoothing forecasting methods for academic libraries. Smoothing forecasting methods were chosen because they have been characterized…

  8. Statistical Short-Range Forecast Guidance for Cloud Ceilings Over the Shuttle Landing Facility

    NASA Technical Reports Server (NTRS)

    Lambert, Winifred C.

    2001-01-01

    This report describes the results of the AMU's Short-Range Statistical Forecasting task. The cloud ceiling forecast over the Shuttle Landing Facility (SLF) is a critical element in determining whether a Shuttle should land. Spaceflight Meteorology Group (SMG) forecasters find that ceilings at the SLF are challenging to forecast. The AMU was tasked to develop ceiling forecast equations to minimize the challenge. Studies in the literature that showed success in improving short-term forecasts of ceiling provided the basis for the AMU task. A 20-year record of cool-season hourly surface observations from stations in east-central Florida was used for the equation development. Two methods were used: an observations-based (OBS) method that incorporated data from all stations, and a persistence climatology (PCL) method used as the benchmark. Equations were developed for 1-, 2-, and 3-hour lead times at each hour of the day. A comparison between the two methods indicated that the OBS equations performed well and produced an improvement over the PCL equations. Therefore, the conclusion of the AMU study is that OBS equations produced more accurate forecasts than the PCL equations, and can be used in operations. They provide another tool with which to make the ceiling forecasts that are critical to safe Shuttle landings at KSC.

  9. A framework for improving a seasonal hydrological forecasting system using sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Pappenberger, Florian; Smith, Paul; Cloke, Hannah

    2017-04-01

    Seasonal streamflow forecasts are of great value for the socio-economic sector, for applications such as navigation, flood and drought mitigation and reservoir management for hydropower generation and water allocation to agriculture and drinking water. However, at present, the performance of dynamical seasonal hydrological forecasting systems (systems based on running seasonal meteorological forecasts through a hydrological model to produce seasonal hydrological forecasts) is still limited in space and time. In this context, ESP (Ensemble Streamflow Prediction) remains an attractive method for seasonal streamflow forecasting as it relies on forcing a hydrological model (starting from the latest observed or simulated initial hydrological conditions) with historical meteorological observations. This makes it cheaper to run than a standard dynamical seasonal hydrological forecasting system, for which the seasonal meteorological forecasts first have to be produced, while still producing skilful forecasts. There is thus a need to focus resources and time on those improvements to dynamical seasonal hydrological forecasting systems that will lead to significant improvements in the skill of the streamflow forecasts generated. Sensitivity analyses are a powerful tool that can be used to disentangle the relative contributions of the two main sources of errors in seasonal streamflow forecasts, namely the initial hydrological conditions (IHC; e.g., soil moisture, snow cover, initial streamflow, among others) and the meteorological forcing (MF; i.e., seasonal meteorological forecasts of precipitation and temperature, input to the hydrological model). Sensitivity analyses are, however, most useful if they inform and change current operational practices. To this end, we propose a method to improve the design of a seasonal hydrological forecasting system. 
This method is based on sensitivity analyses, informing the forecasters as to which element of the forecasting chain (i.e., IHC or MF) could potentially lead to the highest increase in seasonal hydrological forecasting performance, after each forecast update.

  10. Non-parametric characterization of long-term rainfall time series

    NASA Astrophysics Data System (ADS)

    Tiwari, Harinarayan; Pandey, Brij Kishor

    2018-03-01

    The statistical study of rainfall time series is one of the approaches for efficient hydrological system design. Identifying and characterizing long-term rainfall time series could aid in improving hydrological forecasting. In the present study, statistical analyses were applied to the long-term (1851-2006) rainfall time series of seven meteorological regions of India. Linear trend analysis was carried out using the Mann-Kendall test for the observed rainfall series, and the observed trend was corroborated using the innovative trend analysis method, which has been found to be a strong tool for detecting the general trend of rainfall time series. The sequential Mann-Kendall test was also carried out to examine nonlinear trends in the series, and the partial sum of cumulative deviation test was likewise found suitable for detecting nonlinear trends. Innovative trend analysis, the sequential Mann-Kendall test and the partial cumulative deviation test can thus detect both general and nonlinear trends in rainfall time series. Annual rainfall analysis suggests that the maximum rise in mean rainfall is 11.53% for West Peninsular India, whereas the maximum fall in mean rainfall is 7.8% for the North Mountainous India region. The innovative trend analysis method is also capable of finding the number of change points present in the time series. Additionally, we performed the von Neumann ratio test and the cumulative deviation test to estimate the departure from homogeneity, and applied singular spectrum analysis to evaluate the order of departure from homogeneity in the rainfall time series. The monsoon season (JS) of the North Mountainous India and West Peninsular India zones has a higher departure from homogeneity, and the singular spectrum analysis results are coherent with this finding.
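    The Mann-Kendall trend test mentioned above is straightforward to sketch; this version omits the tie correction for brevity:

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend test: returns the S statistic and the
    normal-approximation Z score (no tie correction)."""
    n = len(series)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)   # sign of each pairwise difference
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)     # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

    A |Z| above roughly 1.96 indicates a trend significant at the 5% level under the no-trend null hypothesis.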

  11. Assessment of the uncertainty and predictive power of large-scale predictors for nonlinear precipitation downscaling in the European Arctic (Invited)

    NASA Astrophysics Data System (ADS)

    Sauter, T.

    2013-12-01

    Despite the extensive research on downscaling methods, there is still little consensus about the choice of useful atmospheric predictor variables. Besides the general choice of a proper statistical downscaling model, the selection of an informative predictor set is crucial for the accuracy and stability of the resulting downscaled time series. These requirements must be fulfilled by both the atmospheric variables and the predictor domains in terms of geographical location and spatial extent, to which in general not much attention is paid. However, only a limited number of studies are concerned with the predictive capability of the predictor domain size or shape, and with the question of the extent to which variability at neighboring grid points influences local-scale events. In this study we examined the spatial relationships between observed daily precipitation and a selected number of atmospheric variables for the European Arctic. Several nonlinear regression models are used to link the large-scale predictors, obtained from reanalysed Weather Research and Forecast model runs, to the local-scale observed precipitation. Inferences on the sources of uncertainty are then drawn from variance-based sensitivity measures, which also capture interaction effects between individual predictors. This information is further used to develop more parsimonious downscaling models with only small decreases in accuracy. Individual predictors (without interactions) account for almost 2/3 of the total output variance, while the remaining fraction is solely due to interactions. Neglecting predictor interactions in the screening process will therefore lead to some loss of information. Hence, linear screening methods are insufficient, as they account neither for interactions nor for the non-additivity exhibited by many nonlinear prediction algorithms.

  12. Monthly water quality forecasting and uncertainty assessment via bootstrapped wavelet neural networks under missing data for Harbin, China.

    PubMed

    Wang, Yi; Zheng, Tong; Zhao, Ying; Jiang, Jiping; Wang, Yuanyuan; Guo, Liang; Wang, Peng

    2013-12-01

    In this paper, a bootstrapped wavelet neural network (BWNN) was developed for predicting monthly ammonia nitrogen (NH(4+)-N) and dissolved oxygen (DO) in the Harbin region of northeast China. The Morlet wavelet basis function (WBF) was employed as a nonlinear activation function in a traditional three-layer artificial neural network (ANN) structure. Prediction intervals (PI) were constructed from the uncertainties calculated for the model structure and data noise. The performance of the BWNN model was also compared with that of four other models: a traditional ANN, a WNN, a bootstrapped ANN, and an autoregressive integrated moving average model. The results showed that the BWNN could handle severely fluctuating and non-seasonal water quality time series, and that it performed better than the other four models. The uncertainty from data noise was smaller than that from the model structure for NH(4+)-N; conversely, the uncertainty from data noise was larger for the DO series. Moreover, total uncertainties in the low-flow period were the largest, due to complicated processes during the freeze-up period of the Songhua River. Further, a data missing-refilling scheme was designed, and the BWNN performed better for structural data missing (SD) than for incidental data missing (ID). For both ID and SD, the temporal method was satisfactory for filling the NH(4+)-N series, whereas spatial imputation was fit for the DO series. This filling-and-forecasting BWNN method was applied to other areas suffering "real" data missing, and the results demonstrated its efficiency. Thus, the methods introduced here will help managers make informed decisions.

  13. Inverse modelling for real-time estimation of radiological consequences in the early stage of an accidental radioactivity release.

    PubMed

    Pecha, Petr; Šmídl, Václav

    2016-11-01

    A stepwise sequential assimilation algorithm is proposed based on an optimisation approach for recursive parameter estimation and tracking of radioactive plume propagation in the early stage of a radiation accident. Predictions of the radiological situation in each time step of the plume propagation are driven by an existing short-term meteorological forecast and the assimilation procedure manipulates the model parameters to match the observations incoming concurrently from the terrain. Mathematically, the task is a typical ill-posed inverse problem of estimating the parameters of the release. The proposed method is designated as a stepwise re-estimation of the source term release dynamics and an improvement of several input model parameters. It results in a more precise determination of the adversely affected areas in the terrain. The nonlinear least-squares regression methodology is applied for estimation of the unknowns. The fast and adequately accurate segmented Gaussian plume model (SGPM) is used in the first stage of direct (forward) modelling. The subsequent inverse procedure infers (re-estimates) the values of important model parameters from the actual observations. Accuracy and sensitivity of the proposed method for real-time forecasting of the accident propagation is studied. First, a twin experiment generating noiseless simulated "artificial" observations is studied to verify the minimisation algorithm. Second, the impact of the measurement noise on the re-estimated source release rate is examined. In addition, the presented method can be used as a proposal for more advanced statistical techniques using, e.g., importance sampling.

  14. A simplified real time method to forecast semi-enclosed basins storm surge

    NASA Astrophysics Data System (ADS)

    Pasquali, D.; Di Risio, M.; De Girolamo, P.

    2015-11-01

    Semi-enclosed basins are often prone to storm surge events. Indeed, their meteorological exposure, the presence of a large continental shelf and their shape can lead to strong sea level set-up. A real-time system able to forecast storm surge can be of great help in protecting human activities (e.g. forecasting flooding due to storm surge events), managing ports and safeguarding coastal safety. This paper illustrates a simple method able to forecast storm surge events in semi-enclosed basins in real time. The method is based on a mixed approach in which the results obtained with a simplified physics-based model with low computational costs are corrected by means of statistical techniques. The proposed method is applied to a point of interest located in the northern part of the Adriatic Sea. The comparison of forecast levels against observed values shows the satisfactory reliability of the forecasts.

  15. Gas demand forecasting by a new artificial intelligent algorithm

    NASA Astrophysics Data System (ADS)

    Khatibi. B, Vahid; Khatibi, Elham

    2012-01-01

    Energy demand forecasting is a key issue for consumers and generators in all energy markets in the world. This paper presents a new forecasting algorithm for daily gas demand prediction. This algorithm combines a wavelet transform and forecasting models such as the multi-layer perceptron (MLP), linear regression or GARCH. The proposed method is applied to real data from the UK gas markets to evaluate its performance. The results show that forecasting accuracy is improved significantly by the proposed method.

  16. Statistical evaluation of forecasts

    NASA Astrophysics Data System (ADS)

    Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn

    2014-08-01

    Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.
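    A bootstrapped random predictor of the kind used as a reference baseline can be sketched as follows (interval-based alarms with a fixed alarm probability, independent of the events; the analytic framework the abstract describes is more refined than this simulation):

```python
import random

def random_predictor_rates(events, alarm_prob, n_boot=2000, seed=1):
    """Bootstrap a random predictor that raises an alarm in each interval
    with fixed probability, independent of the events, and return its
    mean sensitivity and specificity over n_boot realisations."""
    rng = random.Random(seed)
    n_pos = sum(events)
    n_neg = len(events) - n_pos
    sens_sum = spec_sum = 0.0
    for _ in range(n_boot):
        alarms = [rng.random() < alarm_prob for _ in events]
        tp = sum(a and e for a, e in zip(alarms, events))
        tn = sum((not a) and (not e) for a, e in zip(alarms, events))
        sens_sum += tp / n_pos
        spec_sum += tn / n_neg
    return sens_sum / n_boot, spec_sum / n_boot
```

    A forecasting method is only worth deploying if its (sensitivity, specificity) pair significantly exceeds what such a random predictor achieves at the same alarm rate.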

  17. A Flux-Corrected Transport Based Hydrodynamic Model for the Plasmasphere Refilling Problem following Geomagnetic Storms

    NASA Astrophysics Data System (ADS)

    Chatterjee, K.; Schunk, R. W.

    2017-12-01

    The refilling of the plasmasphere following a geomagnetic storm remains one of the longstanding problems in the area of ionosphere-magnetosphere coupling. Both diffusion and hydrodynamic approximations have been adopted for the modeling and solution of this problem. The diffusion approximation neglects the nonlinear inertial term in the momentum equation, so this approximation is not rigorously valid immediately after the storm. Over the last few years, we have developed a hydrodynamic refilling model using the flux-corrected transport method, a numerical method that is extremely well suited to handling nonlinear problems with shocks and discontinuities. The plasma transport equations are solved along 1D closed magnetic field lines that connect conjugate ionospheres, and the model currently includes three ion (H+, O+, He+) and two neutral (O, H) species. In this work, each ion species under consideration has been modeled as two separate streams emanating from the conjugate hemispheres, and the model correctly predicts supersonic ion speeds and the presence of high levels of helium during the early hours of refilling. The ultimate objective of this research is the development of a 3D model for the plasmasphere refilling problem; with additional development, the same methodology can potentially be applied to the study of other complex space plasma coupling problems in closed flux tube geometries. Index Terms: 2447 Modeling and forecasting [IONOSPHERE]; 2753 Numerical modeling [MAGNETOSPHERIC PHYSICS]; 7959 Models [SPACE WEATHER]

  18. Development of a stacked ensemble model for forecasting and analyzing daily average PM2.5 concentrations in Beijing, China.

    PubMed

    Zhai, Binxu; Chen, Jianguo

    2018-04-18

    A stacked ensemble model is developed for forecasting and analyzing the daily average concentrations of fine particulate matter (PM2.5) in Beijing, China. Special feature extraction procedures, including simplification, polynomial, transformation and combination, are conducted before modeling to identify potentially significant features based on an exploratory data analysis. Stability feature selection and tree-based feature selection methods are applied to select important variables and evaluate the degrees of feature importance. Single models including LASSO, Adaboost, XGBoost and a multi-layer perceptron optimized by a genetic algorithm (GA-MLP) are established in the level-0 space and are then integrated by support vector regression (SVR) in the level-1 space via stacked generalization. A feature importance analysis reveals that nitrogen dioxide (NO2) and carbon monoxide (CO) concentrations measured in the city of Zhangjiakou are the most important pollution factors for forecasting PM2.5 concentrations. Local extreme wind speeds and maximal wind speeds are found to exert the strongest meteorological effects on the cross-regional transport of contaminants. Pollutants from the cities of Zhangjiakou and Chengde have a stronger impact on air quality in Beijing than other surrounding factors. Our model evaluation shows that the ensemble model generally performs better than a single nonlinear forecasting model when applied to new data, with a coefficient of determination (R²) of 0.90 and a root mean squared error (RMSE) of 23.69 μg/m³. For single-pollutant grade recognition, the proposed model performs better on days characterized by good air quality than on days registering high levels of pollution. The overall classification accuracy is 73.93%, with most misclassifications made among adjacent categories. 
The results demonstrate the interpretability and generalizability of the stacked ensemble model.

  19. Forecasting in foodservice: model development, testing, and evaluation.

    PubMed

    Miller, J L; Thompson, P A; Orabella, M M

    1991-05-01

    This study was designed to develop, test, and evaluate mathematical models appropriate for forecasting menu-item production demand in foodservice. Data were collected from residence and dining hall foodservices at Ohio State University. Objectives of the study were to collect, code, and analyze the data; develop and test models using actual operation data; and compare forecasting results with current methods in use. Customer count was forecast using deseasonalized simple exponential smoothing. Menu-item demand was forecast by multiplying the count forecast by a predicted preference statistic. Forecasting models were evaluated using mean squared error, mean absolute deviation, and mean absolute percentage error techniques. All models were more accurate than current methods. A broad spectrum of forecasting techniques could be used by foodservice managers with access to a personal computer and spreadsheet and database-management software. The findings indicate that mathematical forecasting techniques may be effective in foodservice operations to control costs, increase productivity, and maximize profits.
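    The smoothing-and-evaluation pipeline described above can be sketched as follows; deseasonalization (dividing each observation by a monthly index before smoothing and multiplying back afterwards) is omitted for brevity, and alpha is an assumed smoothing constant:

```python
def ses(series, alpha=0.3):
    """One-step-ahead simple exponential smoothing forecasts:
    f[t+1] = alpha * y[t] + (1 - alpha) * f[t]."""
    forecasts = [series[0]]              # seed the first forecast
    for y in series[:-1]:
        forecasts.append(alpha * y + (1 - alpha) * forecasts[-1])
    return forecasts

def mse(actual, forecast):
    """Mean squared error."""
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)

def mad(actual, forecast):
    """Mean absolute deviation."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def mape(actual, forecast):
    """Mean absolute percentage error."""
    return 100.0 * sum(abs(a - f) / a
                       for a, f in zip(actual, forecast)) / len(actual)
```

    Menu-item demand would then be the smoothed customer count multiplied by the item's preference statistic, and the three metrics compared across candidate models.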

  20. Development, testing, and applications of site-specific tsunami inundation models for real-time forecasting

    NASA Astrophysics Data System (ADS)

    Tang, L.; Titov, V. V.; Chamberlin, C. D.

    2009-12-01

    The study describes the development, testing and applications of site-specific tsunami inundation models (forecast models) for use in NOAA's tsunami forecast and warning system. The model development process includes sensitivity studies of tsunami wave characteristics in the nearshore and inundation, for a range of model grid setups, resolutions and parameters. To demonstrate the process, four forecast models in Hawaii, at Hilo, Kahului, Honolulu, and Nawiliwili, are described. The models were validated with fourteen historical tsunamis and compared with numerical results from reference inundation models of higher resolution. The accuracy of the modeled maximum wave height is greater than 80% when the observation is greater than 0.5 m; when the observation is below 0.5 m the error is less than 0.3 m. The error of the modeled arrival time of the first peak is within 3% of the travel time. The developed forecast models were further applied to hazard assessment from simulated magnitude 7.5, 8.2, 8.7 and 9.3 tsunamis based on subduction zone earthquakes in the Pacific. The tsunami hazard assessment study indicates that use of a seismic magnitude alone for a tsunami source assessment is inadequate to achieve such accuracy for tsunami amplitude forecasts. The forecast models apply local bathymetric and topographic information, and utilize dynamic boundary conditions from the tsunami source function database, to provide site- and event-specific coastal predictions. Only by combining a Deep-ocean Assessment and Reporting of Tsunamis (DART)-constrained tsunami magnitude with site-specific high-resolution models can the forecasts completely cover the evolution of earthquake-generated tsunami waves: generation, deep-ocean propagation, and coastal inundation. Wavelet analysis of the tsunami waves suggests the coastal tsunami frequency responses at different sites are dominated by the local bathymetry, yet they can be partially related to the locations of the tsunami sources. 
The study also demonstrates the nonlinearity between offshore and nearshore maximum wave amplitudes.

  1. The value of forecasting key-decision variables for rain-fed farming

    NASA Astrophysics Data System (ADS)

    Winsemius, Hessel; Werner, Micha

    2013-04-01

    Rain-fed farmers are highly vulnerable to variability in rainfall. Timely knowledge of the onset of the rainy season, the expected amount of rainfall and the occurrence of dry spells can help rain-fed farmers plan the cropping season. Seasonal probabilistic weather forecasts may provide such information to farmers, but need to provide reliable forecasts of the key variables with which farmers make decisions. In this contribution, we present a new method to evaluate the value of meteorological forecasts in predicting these key variables. The proposed method measures skill by assessing whether a forecast was useful to a given decision, taking into account the accuracy of event timing required for the decision to be useful. It then progresses from forecast skill to forecast value based on the cost/loss ratio of possible decisions. The method is applied over the Limpopo region in Southern Africa, using the example of temporary water harvesting techniques. Such techniques require time to construct and must be ready long enough before the occurrence of a dry spell to be effective. The value of the forecasts to this example decision is shown to be highly sensitive to the accuracy of the timing of forecast dry spells, and to the decision's tolerance of timing error. The skill with which dry spells can be predicted is shown to be higher in some parts of the basin, indicating that the forecasts have higher value for the decision in those parts than in others. By assessing the skill of forecasting key decision variables for farmers, we show that it becomes easier to understand whether the forecasts have value in reducing risk, or whether other adaptation strategies should be implemented.

  2. Analysis of forecasting and inventory control of raw material supplies in PT INDAC INT’L

    NASA Astrophysics Data System (ADS)

    Lesmana, E.; Subartini, B.; Riaman; Jabar, D. A.

    2018-03-01

This study discusses forecasting sales data for carbon electrodes at PT. INDAC INT'L using the Winters and double moving average methods, and predicts the inventory level and cost required for ordering carbon electrode raw material in the next period using the Economic Order Quantity (EOQ) model. Error analysis based on MAE, MSE, and MAPE shows that the Winters method is the better forecasting method for sales of carbon electrode products. PT. INDAC INT'L is therefore advised to stock products in line with the sales amounts forecast by the Winters method.
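The EOQ model referenced above has a simple closed form; the sketch below uses the classic formula with purely illustrative numbers (the paper's actual demand and cost figures are not given in the abstract).

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Classic Economic Order Quantity: the order size that minimizes the sum
    of ordering and holding costs, sqrt(2*D*S/H)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

# Illustrative numbers only: 12,000 units/yr demand, 50 per order placed,
# 2 per unit-year of holding cost.
q = eoq(12000, 50, 2)          # optimal order quantity
orders_per_year = 12000 / q    # implied reorder frequency
```

The forecast (e.g. from the Winters method) supplies the demand `D`; EOQ then converts it into an ordering policy.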

  3. Exploration of DGVM Parameter Solution Space Using Simulated Annealing: Implications for Forecast Uncertainties

    NASA Astrophysics Data System (ADS)

    Wells, J. R.; Kim, J. B.

    2011-12-01

Parameters in dynamic global vegetation models (DGVMs) are thought to be weakly constrained and can be a significant source of errors and uncertainties. DGVMs use between 5 and 26 plant functional types (PFTs) to represent the average plant life form in each simulated plot, and each PFT typically has a dozen or more parameters that define the way it uses resources and responds to the simulated growing environment. Sensitivity analysis explores how varying parameters affects the output, but does not fully explore the parameter solution space. The solution space for DGVM parameter values is thought to be complex and non-linear, and multiple sets of acceptable parameters may exist. In published studies, PFT parameters are estimated from the literature, often from a single published value, and are then "tuned" using somewhat arbitrary, trial-and-error methods. BIOMAP is a new DGVM created by fusing the MAPSS biogeography model with Biome-BGC; it represents the vegetation of North America using 26 PFTs. We are using simulated annealing, a global search method, to systematically and objectively explore the solution space for the BIOMAP PFT and system parameters important for plant water use. We defined the boundaries of the solution space by obtaining maximum and minimum values from the published literature and, where those were not available, by using +/-20% of current values. We used stratified random sampling to select a set of grid cells representing the vegetation of the conterminous USA. The simulated annealing algorithm is applied to the parameters for a spin-up and a transient run during the historical period 1961-1990. A set of parameter values is considered acceptable if the associated simulation run produces a modern potential vegetation distribution map that is as accurate as one produced by trial-and-error calibration.
We expect to confirm that the solution space is non-linear and complex, and that multiple acceptable parameter sets exist. Further, we expect to demonstrate that the multiple parameter sets produce significantly divergent future forecasts of NEP, C storage, ET, and runoff, thereby identifying a highly important source of DGVM uncertainty.
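Simulated annealing itself is generic; a minimal sketch follows, with a toy objective standing in for the (expensive) DGVM map-accuracy evaluation. The two separated minima of the toy objective mimic the "multiple acceptable parameter sets" the abstract anticipates; everything here is an assumption for illustration.

```python
import math, random

def simulated_annealing(objective, bounds, n_iter=5000, t0=1.0, seed=0):
    """Generic simulated annealing over a box-bounded parameter vector.
    `objective` maps a parameter list to a scalar to be minimized (for a DGVM
    this would be some measure of disagreement with the reference map)."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for lo, hi in bounds]
    fx = objective(x)
    best, fbest = list(x), fx
    for k in range(1, n_iter + 1):
        t = t0 / k                                    # cooling schedule
        cand = [min(hi, max(lo, xi + rng.gauss(0, 0.1 * (hi - lo))))
                for xi, (lo, hi) in zip(x, bounds)]
        fc = objective(cand)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
    return best, fbest

# Toy objective with two separated minima (at p0 = +1 and p0 = -1),
# mimicking multiple acceptable parameter sets.
f = lambda p: min((p[0] - 1) ** 2, (p[0] + 1) ** 2) + p[1] ** 2
sol, val = simulated_annealing(f, [(-2, 2), (-2, 2)])
```

Different random seeds can settle in different basins, which is exactly why a single "tuned" parameter set may hide forecast divergence.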

  4. Evidence of low dimensional chaos in renal blood flow control in genetic and experimental hypertension

    NASA Astrophysics Data System (ADS)

    Yip, K.-P.; Marsh, D. J.; Holstein-Rathlou, N.-H.

    1995-01-01

    We applied a surrogate data technique to test for nonlinear structure in spontaneous fluctuations of hydrostatic pressure in renal tubules of hypertensive rats. Tubular pressure oscillates at 0.03-0.05 Hz in animals with normal blood pressure, but the fluctuations become irregular with chronic hypertension. Using time series from rats with hypertension we produced surrogate data sets to test whether they represent linearly correlated noise or ‘static’ nonlinear transforms of a linear stochastic process. The correlation dimension and the forecasting error were used as discriminating statistics to compare surrogate with experimental data. The results show that the original experimental time series can be distinguished from both linearly and static nonlinearly correlated noise, indicating that the nonlinear behavior is due to the intrinsic dynamics of the system. Together with other evidence this strongly suggests that a low dimensional chaotic attractor governs renal hemodynamics in hypertension. This appears to be the first demonstration of a transition to chaotic dynamics in an integrated physiological control system occurring in association with a pathological condition.
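A common way to build the surrogate data sets described above is phase randomization: keep the power spectrum (hence all linear correlations) and scramble the Fourier phases, destroying any nonlinear structure. This sketch shows that construction on a synthetic signal; the paper's exact surrogate variants (including the 'static' nonlinear-transform surrogates) involve additional steps not shown here.

```python
import numpy as np

def phase_randomized_surrogate(x, seed=0):
    """Surrogate series with the same power spectrum (and thus the same linear
    autocorrelation) as `x`, but with randomized Fourier phases."""
    rng = np.random.default_rng(seed)
    n = len(x)
    spec = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(spec))
    phases[0] = 0.0                      # keep the DC component real
    if n % 2 == 0:
        phases[-1] = 0.0                 # keep the Nyquist component real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=n)

# Noisy oscillation standing in for a tubular-pressure record.
x = (np.sin(np.linspace(0, 20 * np.pi, 512))
     + 0.1 * np.random.default_rng(1).normal(size=512))
s = phase_randomized_surrogate(x)
# Same variance (same spectrum), scrambled phase structure.
```

A discriminating statistic (correlation dimension, nonlinear forecasting error) computed on the original and on an ensemble of such surrogates then supports or rejects the linear-noise null hypothesis.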

  5. Rao-Blackwellization for Adaptive Gaussian Sum Nonlinear Model Propagation

    NASA Technical Reports Server (NTRS)

    Semper, Sean R.; Crassidis, John L.; George, Jemin; Mukherjee, Siddharth; Singla, Puneet

    2015-01-01

When dealing with imperfect data and general models of dynamic systems, the best estimate is always sought in the presence of uncertainty or unknown parameters. In many cases, as a first attempt, the extended Kalman filter (EKF) provides sufficient solutions to the issues arising from nonlinear and non-Gaussian estimation problems, but these issues may lead to unacceptable performance and even divergence. In order to accurately capture the nonlinearities of most real-world dynamic systems, advanced filtering methods have been created to reduce filter divergence while enhancing performance. Approaches such as Gaussian sum filtering, grid-based Bayesian methods, and particle filters are well-known examples of advanced methods used to represent and recursively reproduce an approximation to the state probability density function (pdf). Some of these filtering methods were conceptually developed years before their widespread use became practical; advanced nonlinear filtering methods now benefit from advances in computational speed, memory, and parallel processing. Grid-based methods, multiple-model approaches, and Gaussian sum filtering are numerical solutions that take advantage of different state coordinates or multiple models to reduce the number of approximations required. Choosing an efficient grid is very difficult for multi-dimensional state spaces, and expensive computations must often be done at each point. In the original Gaussian sum filter, a weighted sum of Gaussian density functions approximates the pdf, but the filter suffers at the update step when selecting the individual component weights. To improve upon the original Gaussian sum filter, Ref. [2] introduces a weight update at the filter propagation stage instead of the measurement update stage. This weight update is performed by minimizing the integral square difference between the true forecast pdf and its Gaussian sum approximation.
By adaptively updating each component weight during the nonlinear propagation stage, an approximation of the true pdf can be successfully reconstructed. Particle filtering (PF) methods have gained popularity recently for solving nonlinear estimation problems due to their straightforward approach and the processing capabilities mentioned above. The basic concept behind PF is to represent any pdf as a set of random samples; as the number of samples increases, they theoretically converge to an exact representation of the desired pdf. When the estimated qth moment is needed, the samples are used in its construction, allowing further analysis of the pdf characteristics. However, filter performance deteriorates as the dimension of the state vector increases. To overcome this problem, Ref. [5] applies a marginalization technique to PF methods, reducing the system to one linear and one nonlinear state estimation problem. The marginalization theory was originally developed independently by Rao and Blackwell. According to Ref. [6], it improves any given estimator under every convex loss function. The improvement comes from calculating a conditional expected value, often involving integrating out a supporting statistic. In other words, Rao-Blackwellization allows smaller, separate computations to be carried out while reaching the main objective of the estimator. In the case of improving an estimator's variance, a supporting statistic can be integrated out and its variance determined; any other quantity that depends on the supporting statistic is then found along with its respective variance. A new approach is developed here by combining the strengths of the adaptive Gaussian sum propagation of Ref. [2] with the marginalization approach for PF methods found in Ref. [7].
In the following sections, a modified filtering approach is presented, based on a special state-space model for nonlinear systems, that reduces the dimensionality of the optimization problem in Ref. [2]. First, adaptive Gaussian sum propagation is explained; then the new marginalized adaptive Gaussian sum propagation is derived. Finally, an example simulation is presented.
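The Gaussian sum representation at the heart of the abstract is easy to sketch in one dimension: the pdf is a weighted mixture of Gaussians, and each component can be pushed through a nonlinear map by linearization (an EKF-style step per component). The adaptive weight update of Ref. [2] is not reproduced here; this is only the representation and propagation it operates on, with illustrative numbers.

```python
import math

def gaussian_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def gaussian_sum_pdf(x, weights, means, variances):
    """Evaluate a weighted-sum-of-Gaussians approximation to a pdf."""
    return sum(w * gaussian_pdf(x, m, v)
               for w, m, v in zip(weights, means, variances))

def propagate_components(f, fprime, means, variances):
    """Propagate each Gaussian component through a nonlinear map f by
    linearization about its mean: m -> f(m), v -> f'(m)^2 * v. The component
    weights are updated separately in the adaptive scheme described above."""
    new_means = [f(m) for m in means]
    new_vars = [fprime(m) ** 2 * v for m, v in zip(means, variances)]
    return new_means, new_vars

# Two-component mixture pushed through a mildly nonlinear map.
w, mu, var = [0.5, 0.5], [-1.0, 1.0], [0.2, 0.2]
f = lambda x: x + 0.1 * x ** 3
fp = lambda x: 1 + 0.3 * x ** 2
mu2, var2 = propagate_components(f, fp, mu, var)
```

Because each component is propagated independently, the mixture can track a pdf that an EKF's single Gaussian cannot, which is the motivation for the weight-adaptation scheme.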

  6. Forecasting the Short-Term Passenger Flow on High-Speed Railway with Neural Networks

    PubMed Central

    Xie, Mei-Quan; Li, Xia-Miao; Zhou, Wen-Liang; Fu, Yan-Bing

    2014-01-01

Short-term passenger flow forecasting is an important component of transportation systems; the results can support system operation and management tasks such as operation planning and revenue management. In this paper, a divide-and-conquer method based on neural networks and origin-destination (OD) matrix estimation is developed to forecast short-term passenger flow in a high-speed railway system. The method has three steps. First, the numbers of passengers arriving at and departing from each station are obtained from historical passenger flow data, which take the form of OD matrices. Second, these station-level arrival and departure counts are forecast for the short term with a neural network. Finally, the short-term OD matrices are recovered with an OD matrix estimation method. The experimental results indicate that the proposed divide-and-conquer method performs well in forecasting short-term passenger flow on high-speed railway. PMID:25544838
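The final step, recovering an OD matrix from forecast station totals, is commonly done with iterative proportional fitting (biproportional scaling of a historical seed matrix). The sketch below uses that standard technique as a stand-in; the paper's specific OD estimation method and the numbers here are not taken from the abstract.

```python
def ipf(seed, row_totals, col_totals, n_iter=50):
    """Iterative proportional fitting: scale a seed OD matrix so its row sums
    match forecast departures and its column sums match forecast arrivals."""
    m = [row[:] for row in seed]
    for _ in range(n_iter):
        for i, target in enumerate(row_totals):        # match departures
            s = sum(m[i])
            if s:
                m[i] = [x * target / s for x in m[i]]
        for j, target in enumerate(col_totals):        # match arrivals
            s = sum(row[j] for row in m)
            if s:
                for row in m:
                    row[j] *= target / s
    return m

seed = [[0, 30, 20], [25, 0, 15], [10, 20, 0]]         # historical OD pattern
departures = [60, 45, 35]                              # NN-forecast station totals
arrivals = [40, 55, 45]
od = ipf(seed, departures, arrivals)
```

Dividing the problem this way keeps the neural network small (one total per station) while the matrix structure comes from historical data.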

  7. Spatial forecast of landslides in three gorges based on spatial data mining.

    PubMed

    Wang, Xianmin; Niu, Ruiqing

    2009-01-01

The Three Gorges is a region with a very high density of landslides and a concentrated population; landslide disasters are frequent there, and the potential risk is tremendous. In this paper, focusing on the complicated landform of Three Gorges, spatial forecasting of landslides is studied by establishing 20 forecast factors (spectra, texture, vegetation coverage, reservoir water level, slope structure, engineering rock group, elevation, slope, aspect, etc.). China-Brazil Earth Resources Satellite (CBERS) images were used with a C4.5 decision tree to mine spatial landslide-forecast criteria for Guojiaba Town (Zhigui County) in Three Gorges, and based on this knowledge, intelligent spatial landslide forecasts were performed for Guojiaba Town. All landslides lie in the regions forecast as dangerous and unstable, so the forecast result is good. The proposed method is compared with seven other methods: IsoData, K-Means, Mahalanobis Distance, Maximum Likelihood, Minimum Distance, Parallelepiped, and the Information Content Model. The experimental results show that the proposed method has a forecast precision noticeably higher than that of the other seven methods.
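C4.5 grows its tree by choosing, at each node, the split with the highest information gain ratio. A minimal sketch of that criterion for a binary split on one numeric forecast factor follows; the slope values and labels are invented for illustration.

```python
import math

def entropy(labels):
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def gain_ratio(values, labels, threshold):
    """Information gain ratio of a binary split on a numeric forecast factor,
    the splitting criterion C4.5 uses to grow its decision tree."""
    left = [y for v, y in zip(values, labels) if v <= threshold]
    right = [y for v, y in zip(values, labels) if v > threshold]
    n = len(labels)
    gain = entropy(labels) - (len(left) / n * entropy(left)
                              + len(right) / n * entropy(right))
    split_info = entropy([v <= threshold for v in values])
    return gain / split_info if split_info else 0.0

# Toy factor (e.g. slope in degrees) vs. landslide / no-landslide labels.
slope = [12, 15, 33, 38, 41, 8, 36, 29]
label = [0, 0, 1, 1, 1, 0, 1, 0]
gr = gain_ratio(slope, label, threshold=30)   # a perfect split scores 1.0
```

Repeating this search over all 20 factors and candidate thresholds, and recursing on each branch, yields the forecast criteria mined from the imagery.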

  9. Forecasting Glacier Evolution and Hindcasting Paleoclimates In Light of Mass Balance Nonlinearities

    NASA Astrophysics Data System (ADS)

    Malone, A.; Doughty, A. M.; MacAyeal, D. R.

    2016-12-01

Glaciers are commonly used barometers of present and past climate change, with their variations often being linked to shifts in the mean climate. Climate variability within an unchanging mean state, however, can produce short-term mass balance and glacier length anomalies, complicating this linkage. Moreover, the mass balance response to this variability can be nonlinear, possibly affecting the longer-term state of the glacier. We propose a conceptual model to understand these nonlinearities and quantify their impacts on longer-term mass balance and glacier length. The relationship between mass balance and elevation, i.e. the vertical balance profile (VBP), illuminates these nonlinearities (Figure A). The VBP, given here for a wet tropical glacier, is piecewise, which can lead to different mass balance responses to climate anomalies of similar magnitude but opposite sign. We simulate the mass balance response to climate variability by vertically (temperature anomalies) and horizontally (precipitation anomalies) transposing the VBP for the mean climate (Figure A); the resulting anomalous VBP is the superposition of the two translations. We drive a 1-D flowline model with 10,000 years of anomalous VBPs. The aggregate VBP for the mean climate including variability differs from the VBP for the mean climate excluding variability, having a higher equilibrium line altitude (ELA) and a negative mass balance (Figure B). Accordingly, the glacier retreats, and the equilibrium glacier length for the aggregate VBP is the same as the mean length from the 10,000-year flowline simulation (Figure C). The magnitude of the VBP shift and glacier retreat increases with greater temperature variability and larger discontinuities in the VBP slope. These results highlight the importance of both the climate mean and its variability in determining the longer-term state of the glacier.
Thus, forecasting glacier evolution or hindcasting past climates should also include representation of climate variability.
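The core nonlinearity is a Jensen-inequality effect: because the VBP kinks at the ELA, symmetric climate variability produces an asymmetric mean balance. The sketch below demonstrates this with an invented piecewise-linear VBP and temperature anomalies expressed as vertical shifts of the profile; all gradients and magnitudes are illustrative, not the paper's.

```python
import random

def vbp(z, ela=5000.0, grad_abl=0.01, grad_acc=0.002):
    """Piecewise-linear vertical balance profile (m w.e./yr): a steep ablation
    gradient below the ELA, a gentler accumulation gradient above it."""
    dz = z - ela
    return grad_abl * dz if dz < 0 else grad_acc * dz

def mean_balance_with_variability(z, shift_sd_m, n=20000, seed=0):
    """Average balance at elevation z when interannual temperature anomalies
    shift the profile vertically (expressed as metres of ELA shift)."""
    rng = random.Random(seed)
    return sum(vbp(z - rng.gauss(0.0, shift_sd_m)) for _ in range(n)) / n

z = 5000.0                                 # a point at the mean-climate ELA
b_mean_climate = vbp(z)                    # zero balance in the mean climate
b_with_var = mean_balance_with_variability(z, shift_sd_m=150.0)
# The kink makes the mean balance under variability negative even though the
# mean climate is unchanged: the steep ablation limb dominates the average.
```

This is exactly why the aggregate VBP in Figure B has a higher ELA than the mean-climate VBP: averaging over variability is not the same as evaluating at the average.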

  10. Calibration and combination of dynamical seasonal forecasts to enhance the value of predicted probabilities for managing risk

    NASA Astrophysics Data System (ADS)

    Dutton, John A.; James, Richard P.; Ross, Jeremy D.

    2013-06-01

Seasonal probability forecasts produced with numerical dynamics on supercomputers offer great potential value in managing risk and opportunity created by seasonal variability. The skill and reliability of contemporary forecast systems can be increased by calibration methods that use the historical performance of the forecast system to improve the ongoing real-time forecasts. Two calibration methods are applied to seasonal surface temperature forecasts of the US National Weather Service and the European Centre for Medium-Range Weather Forecasts, and to a World Climate Service multi-model ensemble created by combining those two forecasts with Bayesian methods. As expected, the multi-model is somewhat more skillful and more reliable than the original models taken alone. The potential value of the multi-model in decision making is illustrated with the profits achieved in simulated trading of a weather derivative. In addition to examining the seasonal models, the article demonstrates that calibrated probability forecasts of weekly average temperatures for leads of 2-4 weeks are also skillful and reliable. The conversion of ensemble forecasts into probability distributions of impact variables is illustrated with degree days derived from the temperature forecasts. Some issues related to loss of stationarity owing to long-term warming are considered. The main conclusion of the article is that properly calibrated probabilistic forecasts possess sufficient skill and reliability to contribute to effective decisions in government and business activities that are sensitive to intraseasonal and seasonal climate variability.
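Converting a temperature ensemble into a distribution of an impact variable, as the article does with degree days, amounts to pushing each member's trajectory through the impact formula. A minimal sketch with heating degree days and invented member trajectories:

```python
def heating_degree_days(daily_mean_temps, base=18.0):
    """Heating degree days: accumulated shortfall of the daily mean
    temperature below a base temperature."""
    return sum(max(0.0, base - t) for t in daily_mean_temps)

def hdd_distribution(ensemble, base=18.0):
    """Map each ensemble member's temperature trajectory to an HDD total,
    turning a temperature ensemble into a distribution of an impact variable."""
    return sorted(heating_degree_days(m, base) for m in ensemble)

ensemble = [
    [15.0, 14.0, 16.5, 13.0],     # member 1 (illustrative daily means, deg C)
    [17.0, 16.0, 18.5, 15.0],     # member 2
    [12.0, 11.5, 13.0, 12.5],     # member 3
]
hdds = hdd_distribution(ensemble)
```

The resulting empirical distribution of HDD totals is what a derivative-pricing or risk model would consume directly.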

  11. The role of ensemble-based statistics in variational assimilation of cloud-affected observations from infrared imagers

    NASA Astrophysics Data System (ADS)

    Hacker, Joshua; Vandenberghe, Francois; Jung, Byoung-Jo; Snyder, Chris

    2017-04-01

Effective assimilation of cloud-affected radiance observations from space-borne imagers, with the aim of improving cloud analysis and forecasting, has proven to be difficult. Large observation biases, nonlinear observation operators, and non-Gaussian innovation statistics present many challenges. Ensemble-variational data assimilation (EnVar) systems offer the benefits of flow-dependent background error statistics from an ensemble, and the ability of variational minimization to handle nonlinearity. The specific benefits of ensemble statistics, relative to the static background errors more commonly used in variational systems, have not been quantified for the problem of assimilating cloudy radiances. A simple experiment framework is constructed with a regional NWP model and an operational variational data assimilation system, to provide a basis for understanding the importance of ensemble statistics in cloudy radiance assimilation. Restricting the observations to those corresponding to clouds in the background forecast leads to innovations that are more Gaussian. The number of large innovations is reduced compared to the more general case of all observations, but not eliminated. The Huber norm is investigated to handle the fat tails of the distributions, and to allow more observations to be assimilated without the need for strict background checks that eliminate them. Comparing assimilation using only ensemble background error statistics with assimilation using only static background error statistics elucidates the importance of the ensemble statistics. Although the cost functions in both experiments converge to similar values after sufficient outer-loop iterations, the resulting cloud water, ice, and snow content are greater in the ensemble-based analysis. The subsequent forecasts from the ensemble-based analysis also retain more condensed water species, indicating that the local environment is more supportive of clouds.
In this presentation we provide details that explain the apparent benefit from using ensembles for cloudy radiance assimilation in an EnVar context.
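The Huber norm mentioned above is quadratic for small normalized innovations and linear for large ones, so fat-tailed innovations are downweighted rather than rejected. A minimal sketch of the norm and its equivalent observation reweighting (the transition point `delta` and the innovation values are illustrative):

```python
def huber(r, delta=1.0):
    """Huber norm of a normalized innovation r: quadratic inside |r| <= delta,
    linear outside, so outliers contribute less than under a pure L2 norm."""
    a = abs(r)
    return 0.5 * r * r if a <= delta else delta * (a - 0.5 * delta)

def huber_weight(r, delta=1.0):
    """Equivalent observation-error reweighting inside a variational
    minimization: weight 1 in the quadratic regime, delta/|r| in the tails."""
    a = abs(r)
    return 1.0 if a <= delta else delta / a

innovations = [0.3, -0.8, 4.0, -6.5]       # normalized (obs - background)
costs = [huber(r) for r in innovations]
weights = [huber_weight(r) for r in innovations]
```

With these weights, a large cloudy-radiance innovation still pulls the analysis, just with reduced influence, which is why strict background checks can be relaxed.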

  12. Application and evaluation of forecasting methods for municipal solid waste generation in an Eastern-European city.

    PubMed

    Rimaityte, Ingrida; Ruzgas, Tomas; Denafas, Gintaras; Racys, Viktoras; Martuzevicius, Dainius

    2012-01-01

Forecasting the generation of municipal solid waste (MSW) in developing countries is often challenging due to a lack of data and the difficulty of selecting a suitable forecasting method. This article aimed to select and evaluate several methods for MSW forecasting in a medium-sized Eastern European city (Kaunas, Lithuania) with a rapidly developing economy, with respect to affluence-related and seasonal impacts. MSW generation was forecast with respect to the economic activity of the city (regression modelling) and using time series analysis. The modelling based on socio-economic indicators (regression implemented in the LCA-IWM model) showed particular sensitivity (deviation from actual data in the range of 2.2 to 20.6%) to external factors, such as the synergetic effects of affluence parameters or changes in the MSW collection system. For the time series analysis, the combination of autoregressive integrated moving average (ARIMA) and seasonal exponential smoothing (SES) techniques was found to be the most accurate (mean absolute percentage error equalled 6.5). The time series analysis method was very valuable for forecasting the weekly variation of waste generation (r² > 0.87), but the forecast yearly increase should be verified against the data obtained by regression modelling. The methods and findings of this study may assist experts, decision-makers and scientists performing forecasts of MSW generation, especially in developing countries.
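The smoothing component and the MAPE accuracy metric quoted above are both simple to state; this sketch shows simple exponential smoothing (one building block of the SES technique, without the seasonal terms) and MAPE on invented weekly tonnages.

```python
def simple_exponential_smoothing(series, alpha=0.3):
    """One-step-ahead forecasts from simple exponential smoothing:
    forecasts[t] is the prediction for series[t+1]."""
    level = series[0]
    forecasts = [level]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
        forecasts.append(level)
    return forecasts

def mape(actual, predicted):
    """Mean absolute percentage error, the accuracy metric quoted above."""
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, predicted)) / len(actual)

waste = [410, 395, 430, 445, 420, 460, 455]     # illustrative weekly tonnages
f = simple_exponential_smoothing(waste)
err = mape(waste[1:], f[:-1])                   # align forecasts with outcomes
```

The full method in the article layers ARIMA and seasonal terms on top of this; the alignment of forecasts with outcomes shown here is the same in either case.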

  13. Low Streamflow Forecasting using Minimum Relative Entropy

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2013-12-01

Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation in such a way that the relative entropy of the underlying process is minimized, allowing the time series to be forecasted. Different priors, such as uniform, exponential and Gaussian assumptions, are used to estimate the spectral density depending on the autocorrelation structure. Seasonal and nonseasonal low-streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of low-streamflow series with higher resolution than conventional methods. The forecasted streamflow is compared to predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy, and the advantages and disadvantages of each method in forecasting low streamflow are discussed.
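Entropy-based spectral methods such as Burg's MESA ultimately yield an autoregressive representation of the series, which is then run forward to forecast. As a stand-in for those estimators, the sketch below fits AR coefficients by least squares and forecasts recursively; the seasonal toy series and the AR order are illustrative, not the paper's data.

```python
import numpy as np

def fit_ar(series, order):
    """Least-squares AR(p) fit: y[t] ~ sum_k coef[k] * y[t-1-k]. Entropy-based
    spectral methods (e.g. Burg's MESA) also produce an AR model, estimated
    from the autocorrelation structure rather than by least squares."""
    y = np.asarray(series, dtype=float)
    X = np.column_stack([y[order - 1 - k: len(y) - 1 - k]
                         for k in range(order)])
    coef, *_ = np.linalg.lstsq(X, y[order:], rcond=None)
    return coef

def forecast_ar(series, coef, steps):
    """Run the fitted AR recursion forward, feeding forecasts back in."""
    hist = list(series)
    out = []
    for _ in range(steps):
        nxt = float(sum(c * hist[-1 - k] for k, c in enumerate(coef)))
        out.append(nxt)
        hist.append(nxt)
    return out

# Seasonal toy series (period 12), which an AR(12) model captures exactly.
t = np.arange(120)
flow = 10 + 3 * np.sin(2 * np.pi * t / 12)
coef = fit_ar(flow, order=12)
pred = forecast_ar(flow, coef, steps=12)
```

A higher-resolution spectral estimate corresponds to AR coefficients that pin down the oscillation more sharply, which is the practical payoff of the entropy methods compared in the abstract.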

  14. Short Term Load Forecasting with Fuzzy Logic Systems for power system planning and reliability-A Review

    NASA Astrophysics Data System (ADS)

    Holmukhe, R. M.; Dhumale, Mrs. Sunita; Chaudhari, Mr. P. S.; Kulkarni, Mr. P. P.

    2010-10-01

Load forecasting is essential to the operation of electricity companies, enhancing the energy-efficient and reliable operation of the power system. Forecasting of load demand forms an important component in planning generation schedules. The purpose of this paper is to identify the issues involved and a better method for load forecasting, focusing on fuzzy-logic-based short-term load forecasting. It serves as an overview of the state of the art in intelligent techniques employed for load forecasting in power system planning and reliability. A literature review has been conducted and the fuzzy logic method summarized to highlight the advantages and disadvantages of this technique. The proposed technique for implementing fuzzy-logic-based forecasting identifies a specific day, uses the maximum and minimum temperature for that day, and relates the maximum temperature to the peak load for that day. The results show that load forecasting where there are considerable changes in the temperature parameter is better dealt with by the fuzzy logic method than by other short-term forecasting techniques.
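The first step of any fuzzy-logic load forecaster is fuzzification: mapping a crisp input such as the day's maximum temperature to degrees of membership in linguistic sets. The membership functions and breakpoints below are invented for illustration; the paper does not specify them.

```python
def triangular(x, a, b, c):
    """Triangular membership function: rises from a to a peak at b,
    falls back to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify_temperature(t):
    """Degrees of membership of a daily maximum temperature (deg C) in three
    illustrative fuzzy sets used by a load-forecast rule base."""
    return {
        "low": triangular(t, -10, 5, 20),
        "medium": triangular(t, 10, 22, 34),
        "high": triangular(t, 28, 40, 55),
    }

m = fuzzify_temperature(30)
# Rules such as "IF temperature is high THEN peak load is high" fire with the
# strength of these memberships and are then defuzzified into a crisp forecast.
```

Partial memberships in overlapping sets (here "medium" and "high" simultaneously) are what let the method respond smoothly to large temperature swings.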

  15. Egg production forecasting: Determining efficient modeling approaches.

    PubMed

    Ahmad, H A

    2011-12-01

Several mathematical or statistical and artificial intelligence models were developed to compare egg production forecasts in commercial layers. Initial data for these models were collected from a comparative layer trial on commercial strains conducted at the Poultry Research Farms, Auburn University. Simulated data were produced to represent new scenarios by using the means and SD of egg production of the 22 commercial strains. From the simulated data, random examples were generated for neural network training and testing for weekly egg production prediction from wk 22 to 36. Three neural network architectures (back-propagation-3, Ward-5, and the general regression neural network) were compared for their efficiency in forecasting egg production, along with other traditional models. The general regression neural network gave the best-fitting line, which almost overlapped with the commercial egg production data, with an R² of 0.71. The curve predicted by the general regression neural network was compared with the original egg production data, the average curves of white-shelled and brown-shelled strains, linear regression predictions, and the Gompertz nonlinear model. The general regression neural network was superior in all these comparisons and may be the model of choice if the initial overprediction is managed efficiently. In general, neural network models are efficient, are easy to use, require fewer data, and are practical under farm management conditions to forecast egg production.
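The Gompertz nonlinear model used as a baseline above has the form a·exp(-b·exp(-c·t)): production rises sigmoidally toward an asymptote a. The parameter values below are invented to mimic a lay curve over weeks 22-36; the paper's fitted parameters are not given in the abstract.

```python
import math

def gompertz(t, a, b, c):
    """Gompertz curve a*exp(-b*exp(-c*t)), a common nonlinear model for
    egg production rising toward an asymptote a."""
    return a * math.exp(-b * math.exp(-c * t))

# Illustrative parameters: production climbs toward a ~90% asymptote
# over weeks 22 to 36 (t measured in weeks since week 21).
weeks = range(22, 37)
curve = [gompertz(w - 21, a=90.0, b=4.0, c=0.45) for w in weeks]
```

A nonlinear least-squares fit of (a, b, c) to observed weekly production gives the Gompertz forecast that the neural networks were compared against.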

  16. Nonmechanistic forecasts of seasonal influenza with iterative one-week-ahead distributions.

    PubMed

    Brooks, Logan C; Farrow, David C; Hyun, Sangwon; Tibshirani, Ryan J; Rosenfeld, Roni

    2018-06-15

    Accurate and reliable forecasts of seasonal epidemics of infectious disease can assist in the design of countermeasures and increase public awareness and preparedness. This article describes two main contributions we made recently toward this goal: a novel approach to probabilistic modeling of surveillance time series based on "delta densities", and an optimization scheme for combining output from multiple forecasting methods into an adaptively weighted ensemble. Delta densities describe the probability distribution of the change between one observation and the next, conditioned on available data; chaining together nonparametric estimates of these distributions yields a model for an entire trajectory. Corresponding distributional forecasts cover more observed events than alternatives that treat the whole season as a unit, and improve upon multiple evaluation metrics when extracting key targets of interest to public health officials. Adaptively weighted ensembles integrate the results of multiple forecasting methods, such as delta density, using weights that can change from situation to situation. We treat selection of optimal weightings across forecasting methods as a separate estimation task, and describe an estimation procedure based on optimizing cross-validation performance. We consider some details of the data generation process, including data revisions and holiday effects, both in the construction of these forecasting methods and when performing retrospective evaluation. The delta density method and an adaptively weighted ensemble of other forecasting methods each improve significantly on the next best ensemble component when applied separately, and achieve even better cross-validated performance when used in conjunction. We submitted real-time forecasts based on these contributions as part of CDC's 2015/2016 FluSight Collaborative Comparison. Among the fourteen submissions that season, this system was ranked by CDC as the most accurate.
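The adaptively weighted ensemble described above reweights component methods by their recent performance. As a simple stand-in for the cross-validated weight optimization in the paper, the sketch below uses an exponential-weights update (components with lower recent loss gain weight); the loss values and forecasts are illustrative.

```python
import math

def update_weights(weights, losses, eta=1.0):
    """One exponential-weights step: multiply each component's weight by
    exp(-eta * loss) and renormalize, so recently accurate methods dominate."""
    w = [wi * math.exp(-eta * li) for wi, li in zip(weights, losses)]
    total = sum(w)
    return [wi / total for wi in w]

def ensemble_forecast(weights, component_forecasts):
    """Weighted combination of component point forecasts."""
    return sum(w * f for w, f in zip(weights, component_forecasts))

w = [1 / 3] * 3                                      # e.g. delta density + two others
for losses in [[0.2, 0.5, 0.9], [0.1, 0.6, 0.8]]:    # recent per-method losses
    w = update_weights(w, losses)
combined = ensemble_forecast(w, [1200.0, 1100.0, 1500.0])
```

Because the weights shift from situation to situation, the ensemble can lean on whichever method is currently performing best, which is the behavior the paper's cross-validation-based estimator formalizes.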

  17. Performance of time-series methods in forecasting the demand for red blood cell transfusion.

    PubMed

    Pereira, Arturo

    2004-05-01

Planning future blood collection efforts must be based on adequate forecasts of transfusion demand. In this study, univariate time-series methods were investigated for their performance in forecasting the monthly demand for RBCs at one tertiary-care, university hospital. Three time-series methods were investigated: autoregressive integrated moving average (ARIMA), the Holt-Winters family of exponential smoothing models, and one neural-network-based method. The time series consisted of the monthly demand for RBCs from January 1988 to December 2002 and was divided into two segments: the older one was used to fit or train the models, and the younger to test the accuracy of predictions. Performance was compared across forecasting methods by calculating goodness-of-fit statistics, the percentage of months in which forecast-based supply would have met the RBC demand (coverage rate), and the outdate rate. The RBC transfusion series was best fitted by a seasonal ARIMA(0,1,1)(0,1,1)(12) model. Over 1-year time horizons, forecasts generated by ARIMA or exponential smoothing lay within the +/- 10 percent interval of the real RBC demand in 79 percent of months (62% in the case of neural networks). The coverage rate for the three methods was 89, 91, and 86 percent, respectively. Over 2-year time horizons, exponential smoothing largely outperformed the other methods. Predictions by exponential smoothing lay within the +/- 10 percent interval of real values in 75 percent of the 24 forecasted months, and the coverage rate was 87 percent. Over 1-year time horizons, predictions of RBC demand generated by ARIMA or exponential smoothing are accurate enough to be of help in planning blood collection efforts. For longer time horizons, exponential smoothing outperforms the other forecasting methods.
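The Holt-Winters family mentioned above maintains level, trend, and seasonal components with simple recursive updates. A minimal additive-form sketch follows, with crude initializations and invented monthly demand figures; the study's actual series and smoothing constants are not in the abstract.

```python
def holt_winters_additive(series, period, alpha=0.3, beta=0.05, gamma=0.2):
    """Additive Holt-Winters smoothing: recursive level, trend and seasonal
    updates, then h-step-ahead forecasts for one full season."""
    level = sum(series[:period]) / period
    trend = (sum(series[period:2 * period]) - sum(series[:period])) / period ** 2
    season = [y - level for y in series[:period]]
    for t in range(period, len(series)):
        y = series[t]
        last_level = level
        level = alpha * (y - season[t % period]) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % period] = gamma * (y - level) + (1 - gamma) * season[t % period]
    n = len(series)
    return [level + (h + 1) * trend + season[(n + h) % period]
            for h in range(period)]

# Two years of illustrative monthly RBC demand with seasonality and mild growth.
demand = [100 + 2 * t + s for t, s in enumerate(
    [10, -5, 0, 5, 15, 20, 8, -2, -8, -12, -4, 3] * 2)]
forecast = holt_winters_additive(demand, period=12)
```

The explicit seasonal component is what lets exponential smoothing stay competitive with the seasonal ARIMA model over long horizons, as the study found.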

  18. A Wind Forecasting System for Energy Application

    NASA Astrophysics Data System (ADS)

    Courtney, Jennifer; Lynch, Peter; Sweeney, Conor

    2010-05-01

Accurate forecasting of available energy is crucial for the efficient management and use of wind power in the national power grid. With energy output critically dependent upon wind strength, there is a need to reduce the errors associated with wind forecasting. The objective of this research is to obtain the best possible wind forecasts for the wind energy industry. To achieve this goal, three methods are being applied. First, a mesoscale numerical weather prediction (NWP) model called WRF (Weather Research and Forecasting) is being used to predict wind values over Ireland. Currently, a grid resolution of 10 km is used, and higher model resolutions are being evaluated to establish whether they are economically viable given the forecast skill improvement they produce. Second, the WRF model is being used in conjunction with ECMWF (European Centre for Medium-Range Weather Forecasts) ensemble forecasts to produce a probabilistic weather forecasting product. Due to the chaotic nature of the atmosphere, a single, deterministic weather forecast can only have limited skill. The ECMWF ensemble system produces 51 global forecasts, twice a day, by perturbing the initial conditions of a 'control' forecast, which is the best estimate of the initial state of the atmosphere. This method provides an indication of the reliability of the forecast and a quantitative basis for probabilistic forecasting. The limitation of ensemble forecasting lies in the fact that the perturbed model runs behave differently under different weather patterns and each model run is equally likely to be closest to the observed weather situation. Models have biases, and involve assumptions about physical processes and forcing factors such as the underlying topography. Third, Bayesian Model Averaging (BMA) is being applied to the output from the ensemble forecasts in order to statistically post-process the results and achieve a better wind forecasting system.
BMA is a promising technique that will offer calibrated probabilistic wind forecasts which will be invaluable in wind energy management. In brief, this method turns the ensemble forecasts into a calibrated predictive probability distribution. Each ensemble member is provided with a 'weight' determined by its relative predictive skill over a training period of around 30 days. Verification of data is carried out using observed wind data from operational wind farms. These are then compared to existing forecasts produced by ECMWF and Met Eireann in relation to skill scores. We are developing decision-making models to show the benefits achieved using the data produced by our wind energy forecasting system. An energy trading model will be developed, based on the rules currently used by the Single Electricity Market Operator for energy trading in Ireland. This trading model will illustrate the potential for financial savings by using the forecast data generated by this research.
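In the BMA post-processing described above, the calibrated predictive distribution is a weighted mixture of densities, one centred on each (bias-corrected) ensemble member, with weights reflecting each member's skill over the training period. The sketch below evaluates such a Gaussian mixture; the member forecasts, weights, and spread are invented, and in practice the weights and sigma are fit by maximum likelihood on the ~30-day training window rather than given.

```python
import math

def bma_pdf(x, member_means, weights, sigma):
    """BMA predictive density: a weighted mixture of Gaussians, one centred
    on each ensemble member's forecast."""
    return sum(w * math.exp(-(x - m) ** 2 / (2 * sigma ** 2))
               / math.sqrt(2 * math.pi * sigma ** 2)
               for w, m in zip(weights, member_means))

members = [7.2, 8.1, 6.5, 9.0]        # member wind-speed forecasts (m/s)
weights = [0.4, 0.3, 0.2, 0.1]        # relative skill from the training period
sigma = 1.2                           # per-member predictive spread
mean_forecast = sum(w * m for w, m in zip(weights, members))
density_at_8 = bma_pdf(8.0, members, weights, sigma)
```

Integrating or sampling this predictive density yields the calibrated exceedance probabilities that feed the energy trading model.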

  19. Advanced data assimilation in strongly nonlinear dynamical systems

    NASA Technical Reports Server (NTRS)

    Miller, Robert N.; Ghil, Michael; Gauthiez, Francois

    1994-01-01

    Advanced data assimilation methods are applied to simple but highly nonlinear problems. The dynamical systems studied here are the stochastically forced double well and the Lorenz model. In both systems, linear approximation of the dynamics about the critical points near which regime transitions occur is not always sufficient to track their occurrence or nonoccurrence. Straightforward application of the extended Kalman filter yields mixed results. The ability of the extended Kalman filter to track transitions of the double-well system from one stable critical point to the other depends on the frequency and accuracy of the observations relative to the mean-square amplitude of the stochastic forcing. The ability of the filter to track the chaotic trajectories of the Lorenz model is limited to short times, as is the ability of strong-constraint variational methods. Examples are given to illustrate the difficulties involved, and qualitative explanations for these difficulties are provided. Three generalizations of the extended Kalman filter are described. The first is based on inspection of the innovation sequence, that is, the successive differences between observations and forecasts; it works very well for the double-well problem. The second, an extension to fourth-order moments, yields excellent results for the Lorenz model but will be unwieldy when applied to models with high-dimensional state spaces. A third, more practical method--based on an empirical statistical model derived from a Monte Carlo simulation--is formulated, and shown to work very well. Weak-constraint methods can be made to perform satisfactorily in the context of these simple models, but such methods do not seem to generalize easily to practical models of the atmosphere and ocean. In particular, it is shown that the equations derived in the weak variational formulation are difficult to solve conveniently for large systems.
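    A scalar extended Kalman filter for the stochastically forced double well can be written compactly; the sketch below uses the drift f(x) = x - x^3 and its tangent linear model F = 1 - 3x^2, with a synthetic observation sequence near the x = +1 well. The specific noise levels and time steps are illustrative choices, not the paper's settings.

```python
import numpy as np

def ekf_double_well(obs, dt, n_sub, q, r, x0=1.0, p0=0.1):
    """EKF for dx = (x - x^3) dt + sqrt(q) dW, observed as y = x + noise."""
    x, P = x0, p0
    estimates = []
    for y in obs:
        for _ in range(n_sub):                    # model forecast between observations
            F = 1.0 - 3.0 * x * x                 # tangent linear model df/dx
            x = x + (x - x ** 3) * dt
            P = P + (2.0 * F * P + q) * dt        # scalar covariance propagation
        K = P / (P + r)                           # Kalman gain
        x = x + K * (y - x)                       # analysis update
        P = (1.0 - K) * P
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(1)
dt, n_sub, q, r = 0.01, 10, 0.05, 0.04
truth = 1.0 + 0.1 * np.sin(np.linspace(0.0, 3.0, 50))  # stays near the x = +1 well
obs = truth + rng.normal(0.0, np.sqrt(r), size=50)
est = ekf_double_well(obs, dt, n_sub, q, r)
```

    With frequent, accurate observations the filter stays in the correct well; as the abstract notes, tracking breaks down when observations are sparse or noisy relative to the stochastic forcing.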

  20. Correcting wave predictions with artificial neural networks

    NASA Astrophysics Data System (ADS)

    Makarynskyy, O.; Makarynska, D.

    2003-04-01

    The predictions of wind waves with different lead times are necessary in a large scope of coastal and open ocean activities. Numerical wave models, which usually provide this information, are based on deterministic equations that do not entirely account for the complexity and uncertainty of the wave generation and dissipation processes. An attempt to improve short-term forecasts of wave parameters based on artificial neural networks is reported. In recent years, artificial neural networks have been used in a number of coastal engineering applications due to their ability to approximate nonlinear mathematical behavior without a priori knowledge of the interrelations among the elements within a system. Common multilayer feed-forward networks, with nonlinear transfer functions in the hidden layers, were developed and employed to forecast the wave characteristics over one-hour intervals from one up to 24 hours ahead, and to correct these predictions. Three non-overlapping data sets of wave characteristics, all from a buoy moored roughly 60 miles west of the Aran Islands, off the west coast of Ireland, were used to train and validate the neural nets involved. The networks were trained with the error back-propagation algorithm. Time series plots and scatterplots of the wave characteristics, as well as tables of statistics, show an improvement of the results achieved due to the correction procedure employed.
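    The training loop of such a network is a short exercise in back-propagation. The toy sketch below learns a nonlinear correction from "model output" to "observations" with one tanh hidden layer, written out explicitly in numpy; the data, network size, and learning rate are all illustrative assumptions rather than the paper's configuration.

```python
import numpy as np

# One-hidden-layer feed-forward net trained by error back-propagation to map
# raw forecasts x to observed values y (a toy stand-in for wave correction).
rng = np.random.default_rng(2)
x = np.linspace(-1.0, 1.0, 64).reshape(-1, 1)   # "numerical model" forecasts
y = x + 0.3 * np.sin(3.0 * x)                   # "observed" values (nonlinear bias)

W1 = rng.normal(0.0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05
losses = []
for _ in range(2000):
    h = np.tanh(x @ W1 + b1)                    # hidden layer activations
    out = h @ W2 + b2                           # linear output layer
    err = out - y
    losses.append(float((err ** 2).mean()))
    g_out = 2.0 * err / len(x)                  # gradient of mean squared error
    g_h = (g_out @ W2.T) * (1.0 - h ** 2)       # back-propagate through tanh
    W2 -= lr * h.T @ g_out
    b2 -= lr * g_out.sum(axis=0)
    W1 -= lr * x.T @ g_h
    b1 -= lr * g_h.sum(axis=0)
```

    The training loss falls substantially over the run, which is all this sketch is meant to show; a production system would hold out validation data, as the paper does with its three non-overlapping sets.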

  1. Forecasting volcanic eruptions and other material failure phenomena: An evaluation of the failure forecast method

    NASA Astrophysics Data System (ADS)

    Bell, Andrew F.; Naylor, Mark; Heap, Michael J.; Main, Ian G.

    2011-08-01

    Power-law accelerations in the mean rate of strain, earthquakes and other precursors have been widely reported prior to material failure phenomena, including volcanic eruptions, landslides and laboratory deformation experiments, as predicted by several theoretical models. The Failure Forecast Method (FFM), which linearizes the power-law trend, has been routinely used to forecast the failure time in retrospective analyses; however, its performance has never been formally evaluated. Here we use synthetic and real data, recorded in laboratory brittle creep experiments and at volcanoes, to show that the assumptions of the FFM are inconsistent with the error structure of the data, leading to biased and imprecise forecasts. We show that a Generalized Linear Model method provides higher-quality forecasts that converge more accurately to the eventual failure time, accounting for the appropriate error distributions. This approach should be employed in place of the FFM to provide reliable quantitative forecasts and estimate their associated uncertainties.
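    The FFM linearization the paper evaluates is easy to state for the common exponent p = 2: the inverse precursor rate then decays linearly in time, and the forecast failure time is the zero crossing of a straight-line fit through 1/rate. The sketch below uses noiseless synthetic data and ordinary least squares, i.e. exactly the Gaussian error assumption whose inconsistency with real rate data the abstract criticizes.

```python
import numpy as np

# FFM for exponent p = 2: rate = A / (tf - t), so 1/rate is linear in t and
# crosses zero at the failure time tf. Synthetic, noise-free illustration.
tf = 100.0                                      # true failure time
t = np.linspace(0.0, 90.0, 46)                  # observation times
rate = 50.0 / (tf - t)                          # hyperbolically accelerating precursor
inv = 1.0 / rate
slope, intercept = np.polyfit(t, inv, 1)        # straight-line fit to inverse rate
t_forecast = -intercept / slope                 # zero crossing = forecast failure time
```

    With perfect data the fit recovers tf exactly; with count data, the paper's point is that a Generalized Linear Model with the appropriate (e.g. Poisson) error distribution should replace this least-squares step.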

  2. Characterizing Time Series Data Diversity for Wind Forecasting: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Brian S; Chartan, Erol Kevin; Feng, Cong

    Wind forecasting plays an important role in integrating variable and uncertain wind power into the power grid. Various forecasting models have been developed to improve the forecasting accuracy. However, it is challenging to accurately compare the true forecasting performances from different methods and forecasters due to the lack of diversity in forecasting test datasets. This paper proposes a time series characteristic analysis approach to visualize and quantify wind time series diversity. The developed method first calculates six time series characteristic indices from various perspectives. Then the principal component analysis is performed to reduce the data dimension while preserving the important information. The diversity of the time series dataset is visualized by the geometric distribution of the newly constructed principal component space. The volume of the 3-dimensional (3D) convex polytope (or the length of 1D number axis, or the area of the 2D convex polygon) is used to quantify the time series data diversity. The method is tested with five datasets with various degrees of diversity.
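    The pipeline of characteristic indices, PCA, and a geometric diversity measure can be sketched directly. The six indices below are illustrative stand-ins (not the paper's exact set), and the 1-D variant of the diversity measure is used: the spread of the dataset along the first principal component.

```python
import numpy as np

def indices(ts):
    """Six illustrative characteristic indices of one time series."""
    d = np.diff(ts)
    return np.array([ts.mean(), ts.std(), ts.max() - ts.min(),
                     d.std(), np.abs(d).mean(),
                     ((d[:-1] * d[1:]) < 0).mean()])   # sign-change rate

rng = np.random.default_rng(3)
# Four synthetic "wind" series with increasing variability -> diverse dataset.
dataset = [rng.normal(0.0, s, 200) for s in (0.5, 1.0, 2.0, 4.0)]
X = np.array([indices(ts) for ts in dataset])
Xc = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)    # standardize indices
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)      # PCA via SVD
pc1 = Xc @ Vt[0]                                       # first principal component
diversity = float(pc1.max() - pc1.min())               # 1-D diversity measure
```

    In 2D or 3D the same projection would feed a convex-hull area or volume computation (e.g. `scipy.spatial.ConvexHull`), as the abstract describes.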

  3. Forecasting Occurrences of Activities.

    PubMed

    Minor, Bryan; Cook, Diane J

    2017-07-01

    While activity recognition has been shown to be valuable for pervasive computing applications, less work has focused on techniques for forecasting the future occurrence of activities. We present an activity forecasting method to predict the time that will elapse until a target activity occurs. This method generates an activity forecast using a regression tree classifier and offers an advantage over sequence prediction methods in that it can predict expected time until an activity occurs. We evaluate this algorithm on real-world smart home datasets and provide evidence that our proposed approach is most effective at predicting activity timings.
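    The regression-tree idea can be shown with a depth-1 tree (a "stump") trained from scratch: predict minutes until the next occurrence of an activity from a single context feature. The feature, activity, and data below are hypothetical; the paper's system uses full trees over many smart-home features.

```python
import numpy as np

def fit_stump(x, y):
    """Depth-1 regression tree: pick the threshold minimizing total squared error."""
    best = (None, None, None, np.inf)
    for thr in np.unique(x)[:-1]:
        left, right = y[x <= thr], y[x > thr]
        sse = (((left - left.mean()) ** 2).sum()
               + ((right - right.mean()) ** 2).sum())
        if sse < best[3]:
            best = (thr, left.mean(), right.mean(), sse)
    return best[:3]

# Hour of day -> minutes until a (hypothetical) "dinner prep" activity.
hour = np.array([6, 7, 8, 9, 17, 18, 19, 20])
minutes_until = np.array([30.0, 20.0, 10.0, 5.0, 240.0, 230.0, 220.0, 210.0])
thr, left_mean, right_mean = fit_stump(hour, minutes_until)
```

    The stump splits mornings from evenings and predicts the mean time-to-activity within each branch, which is the regression (rather than sequence-prediction) formulation the abstract emphasizes.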

  4. Forecasting Jakarta composite index (IHSG) based on chen fuzzy time series and firefly clustering algorithm

    NASA Astrophysics Data System (ADS)

    Ningrum, R. W.; Surarso, B.; Farikhin; Safarudin, Y. M.

    2018-03-01

    This paper proposes the combination of the Firefly Algorithm (FA) and Chen Fuzzy Time Series forecasting. Most existing fuzzy forecasting methods based on fuzzy time series use a static interval length. Therefore, we apply an artificial intelligence technique, the Firefly Algorithm (FA), to set a non-stationary interval length for each cluster in the Chen method. The method is evaluated by applying it to the Jakarta Composite Index (IHSG) and comparing it with classical Chen Fuzzy Time Series forecasting. Its performance is verified through simulation using Matlab.

  5. The Research of Regression Method for Forecasting Monthly Electricity Sales Considering Coupled Multi-factor

    NASA Astrophysics Data System (ADS)

    Wang, Jiangbo; Liu, Junhui; Li, Tiantian; Yin, Shuo; He, Xinhui

    2018-01-01

    The monthly electricity sales forecasting is a basic work to ensure the safety of the power system. This paper presented a monthly electricity sales forecasting method which comprehensively considers the coupled multi-factors of temperature, economic growth, electric power replacement and business expansion. The mathematical model is constructed by using regression method. The simulation results show that the proposed method is accurate and effective.
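    A minimal version of such a multi-factor regression fits monthly sales against the coupled drivers by ordinary least squares. The coefficients and data below are synthetic, and "business expansion" stands in for the paper's full factor set.

```python
import numpy as np

# Monthly sales modelled as a linear function of temperature, economic growth,
# and a business-expansion index (synthetic data, illustrative coefficients).
rng = np.random.default_rng(4)
n = 48                                           # four years of monthly data
temp = rng.normal(15.0, 8.0, n)
gdp = rng.normal(2.0, 0.5, n)
expand = rng.normal(1.0, 0.2, n)
sales = 300.0 + 4.0 * temp + 25.0 * gdp + 40.0 * expand + rng.normal(0.0, 5.0, n)

A = np.column_stack([np.ones(n), temp, gdp, expand])  # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, sales, rcond=None)      # least-squares fit
pred = A @ coef
mape = float(np.mean(np.abs((sales - pred) / sales)) * 100.0)
```

    The recovered temperature coefficient is close to the true value of 4, and the in-sample MAPE is small, which is the basic accuracy check such a model would undergo.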

  6. Near real time wind energy forecasting incorporating wind tunnel modeling

    NASA Astrophysics Data System (ADS)

    Lubitz, William David

    A series of experiments and investigations were carried out to inform the development of a day-ahead wind power forecasting system. An experimental near-real time wind power forecasting system was designed and constructed that operates on a desktop PC and forecasts 12--48 hours in advance. The system uses model output of the Eta regional scale forecast (RSF) to forecast the power production of a wind farm in the Altamont Pass, California, USA from 12 to 48 hours in advance. It is of modular construction and designed to also allow diagnostic forecasting using archived RSF data, thereby allowing different methods of completing each forecasting step to be tested and compared using the same input data. Wind-tunnel investigations of the effect of wind direction and hill geometry on wind speed-up above a hill were conducted. Field data from an Altamont Pass, California site was used to evaluate several speed-up prediction algorithms, both with and without wind direction adjustment. These algorithms were found to be of limited usefulness for the complex terrain case evaluated. Wind-tunnel and numerical simulation-based methods were developed for determining a wind farm power curve (the relation between meteorological conditions at a point in the wind farm and the power production of the wind farm). Both methods, as well as two methods based on fits to historical data, ultimately showed similar levels of accuracy: mean absolute errors predicting power production of 5 to 7 percent of the wind farm power capacity. The downscaling of RSF forecast data to the wind farm was found to be complicated by the presence of complex terrain. Poor results using the geostrophic drag law and regression methods motivated the development of a database search method that is capable of forecasting not only wind speeds but also power production with accuracy better than persistence.

  7. Prediction on carbon dioxide emissions based on fuzzy rules

    NASA Astrophysics Data System (ADS)

    Pauzi, Herrini; Abdullah, Lazim

    2014-06-01

    There are several ways to predict air quality, varying from simple regression to models based on artificial intelligence. Most of the conventional methods are not sufficiently able to provide good forecasting performances due to the problems of non-linearity, uncertainty and complexity in the data. Artificial intelligence techniques are successfully used in modeling air quality in order to cope with these problems. This paper describes a fuzzy inference system (FIS) to predict CO2 emissions in Malaysia. Furthermore, an adaptive neuro-fuzzy inference system (ANFIS) is used to compare the prediction performance. Data on five variables: energy use, gross domestic product per capita, population density, combustible renewables and waste, and CO2 intensity are employed in this comparative study. The results from the two models proposed are compared, and it is clearly shown that ANFIS outperforms FIS in CO2 prediction.
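    The core mechanics of a fuzzy inference system fit in a few lines: fuzzify the input with membership functions, fire the rules, and combine the consequents. The two-rule, single-input sketch below is purely illustrative (Sugeno-style weighted average, invented rule base and units), far simpler than the paper's five-variable system.

```python
import numpy as np

def predict_co2(energy_use):
    """Toy two-rule fuzzy system:
    IF energy use is low  THEN CO2 is low  (2 units);
    IF energy use is high THEN CO2 is high (9 units).
    Output is the membership-weighted average of rule consequents."""
    mu_low = np.clip((5.0 - energy_use) / 5.0, 0.0, 1.0)   # membership in "low"
    mu_high = np.clip(energy_use / 5.0, 0.0, 1.0)          # membership in "high"
    return (mu_low * 2.0 + mu_high * 9.0) / (mu_low + mu_high)
```

    An ANFIS goes one step further by tuning the membership-function parameters and rule consequents from data with a neural-network-style learning rule, which is why it outperforms a hand-specified FIS in the study.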

  8. System load forecasts for an electric utility. [Hourly loads using Box-Jenkins method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uri, N.D.

    This paper discusses forecasting hourly system load for an electric utility using Box-Jenkins time-series analysis. The results indicate that a model based on the method of Box and Jenkins, given its simplicity, gives excellent results over the forecast horizon.
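    The Box-Jenkins family includes pure autoregressive models as a special case, and fitting one by conditional least squares shows the mechanics. The sketch below simulates and refits an AR(2); a real hourly-load application would identify the full (p, d, q) orders from ACF/PACF diagnostics (or use a library such as statsmodels) rather than assume them.

```python
import numpy as np

# Simulate a stable AR(2) process, then recover its coefficients by
# conditional least squares and issue a one-step-ahead forecast.
rng = np.random.default_rng(5)
n = 500
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.7 * y[t - 1] + 0.2 * y[t - 2] + rng.normal()

X = np.column_stack([y[1:-1], y[:-2]])          # lag-1 and lag-2 regressors
phi, *_ = np.linalg.lstsq(X, y[2:], rcond=None) # estimated AR coefficients
one_step = phi[0] * y[-1] + phi[1] * y[-2]      # forecast for the next hour
```

    The estimated coefficients land close to the true (0.7, 0.2), which is the identification-then-forecast loop underlying the paper's hourly-load model.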

  9. Flood forecasting within urban drainage systems using NARX neural network.

    PubMed

    Abou Rjeily, Yves; Abbas, Oras; Sadek, Marwan; Shahrour, Isam; Hage Chehade, Fadi

    2017-11-01

    Urbanization activity and climate change increase runoff volumes, and consequently the surcharge of urban drainage systems (UDS). In addition, age and structural failures of these utilities limit their capacities, and thus generate hydraulic operation shortages, leading to flooding events. The large increase in floods within urban areas requires rapid actions from the UDS operators. Proactivity in taking the appropriate actions is a key element in applying efficient management and flood mitigation. Therefore, this work focuses on developing a flooding forecast system (FFS), able to alert the UDS managers in advance of possible flooding. For a forecasted storm event, a quick estimation of the water depth variation within critical manholes allows a reliable evaluation of the flood risk. The Nonlinear Auto Regressive with eXogenous inputs (NARX) neural network was chosen to develop the FFS because, by the nature of its calculations, it is capable of relating water depth variation in manholes to rainfall intensities. The campus of the University of Lille is used as an experimental site to test and evaluate the FFS proposed in this paper.
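    The NARX structure, stripped to its essentials, regresses the output on its own past values (the autoregressive part) plus past values of the exogenous input. The linearized sketch below captures that input-output structure with synthetic depth/rainfall data; the paper's model is a nonlinear neural network, not this linear regression.

```python
import numpy as np

# Linear NARX-style model: manhole water depth from lagged depth (AR part)
# and lagged rainfall (exogenous input). Synthetic single-lag system.
rng = np.random.default_rng(6)
n = 300
rain = np.clip(rng.normal(0.0, 1.0, n), 0.0, None)   # rainfall intensity (>= 0)
depth = np.zeros(n)
for t in range(1, n):
    depth[t] = 0.8 * depth[t - 1] + 0.5 * rain[t - 1] + rng.normal(0.0, 0.02)

X = np.column_stack([depth[1:-1], rain[1:-1]])       # lagged depth and rain
beta, *_ = np.linalg.lstsq(X, depth[2:], rcond=None)
forecast = beta[0] * depth[-1] + beta[1] * rain[-1]  # one-step-ahead depth
```

    The fitted coefficients recover the simulated dynamics, illustrating why lagged rainfall is an informative exogenous driver for manhole depth.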

  10. Hybrid modelling based on support vector regression with genetic algorithms in forecasting the cyanotoxins presence in the Trasona reservoir (Northern Spain).

    PubMed

    García Nieto, P J; Alonso Fernández, J R; de Cos Juez, F J; Sánchez Lasheras, F; Díaz Muñiz, C

    2013-04-01

    Cyanotoxins, a class of poisonous substances produced by cyanobacteria, are responsible for health risks in drinking and recreational waters. As a result, anticipating their presence is important for preventing risks. The aim of this study is to use a hybrid approach based on support vector regression (SVR) in combination with genetic algorithms (GAs), known as a genetic algorithm support vector regression (GA-SVR) model, in forecasting the cyanotoxins presence in the Trasona reservoir (Northern Spain). The GA-SVR approach is aimed at highly nonlinear biological problems with sharp peaks, and the tests carried out proved its high performance. Some physical-chemical parameters have been considered along with the biological ones. The results obtained are two-fold. First, the significance of each biological and physical-chemical variable on the cyanotoxins presence in the reservoir is determined with success. Second, a predictive model able to forecast the possible presence of cyanotoxins in the short term was obtained.
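    The "GA tunes the regressor's hyper-parameters" loop can be sketched compactly. To keep the example self-contained, RBF kernel ridge regression stands in for SVR (a deliberate substitution: both are kernel regressors with a regularization and a kernel-width parameter), and a tiny elitist GA searches over (lambda, gamma).

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0.0, 4.0, 60)[:, None]
y = np.sin(2.0 * x[:, 0]) + rng.normal(0.0, 0.1, 60)  # sharp-peaked signal + noise

def fit_mse(lam, gam):
    """Training MSE of RBF kernel ridge regression with the given hyper-parameters."""
    K = np.exp(-gam * (x - x.T) ** 2)                 # RBF Gram matrix
    alpha = np.linalg.solve(K + lam * np.eye(60), y)
    return float(((K @ alpha - y) ** 2).mean())

# Elitist GA over (lambda, gamma): keep the best 4, mutate them multiplicatively.
pop = rng.uniform([1e-3, 0.1], [1.0, 5.0], size=(8, 2))
for _ in range(10):                                   # 10 generations
    fitness = np.array([fit_mse(l, g) for l, g in pop])
    parents = pop[np.argsort(fitness)[:4]]            # selection (elitism)
    children = parents[rng.integers(0, 4, 4)] * rng.normal(1.0, 0.1, (4, 2))
    pop = np.vstack([parents, np.abs(children)])      # mutation keeps params > 0

best = pop[np.argmin([fit_mse(l, g) for l, g in pop])]
```

    In a real GA-SVR the fitness would be cross-validated error rather than training error, to avoid rewarding overfit hyper-parameters.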

  11. Development of Parallel Code for the Alaska Tsunami Forecast Model

    NASA Astrophysics Data System (ADS)

    Bahng, B.; Knight, W. R.; Whitmore, P.

    2014-12-01

    The Alaska Tsunami Forecast Model (ATFM) is a numerical model used to forecast propagation and inundation of tsunamis generated by earthquakes and other means in both the Pacific and Atlantic Oceans. At the U.S. National Tsunami Warning Center (NTWC), the model is mainly used in a pre-computed fashion. That is, results for hundreds of hypothetical events are computed before alerts, and are accessed and calibrated with observations during tsunamis to immediately produce forecasts. ATFM uses the non-linear, depth-averaged, shallow-water equations of motion with multiply nested grids in two-way communications between domains of each parent-child pair as waves get closer to coastal waters. Even with the pre-computation the task becomes non-trivial as sub-grid resolution gets finer. Currently, the finest resolution Digital Elevation Models (DEM) used by ATFM are 1/3 arc-seconds. With a serial code, large or multiple areas of very high resolution can produce run-times that are unrealistic even in a pre-computed approach. One way to increase the model performance is code parallelization used in conjunction with a multi-processor computing environment. NTWC developers have undertaken an ATFM code-parallelization effort to streamline the creation of the pre-computed database of results with the long term aim of tsunami forecasts from source to high resolution shoreline grids in real time. Parallelization will also permit timely regeneration of the forecast model database with new DEMs; and, will make possible future inclusion of new physics such as the non-hydrostatic treatment of tsunami propagation. The purpose of our presentation is to elaborate on the parallelization approach and to show the compute speed increase on various multi-processor systems.

  12. Forecasting and analyzing high O3 time series in educational area through an improved chaotic approach

    NASA Astrophysics Data System (ADS)

    Hamid, Nor Zila Abd; Adenan, Nur Hamiza; Noorani, Mohd Salmi Md

    2017-08-01

    Forecasting and analyzing the ozone (O3) concentration time series is important because the pollutant is harmful to health. This study is a pilot study for forecasting and analyzing the O3 time series in a Malaysian educational area, namely Shah Alam, using a chaotic approach. Through this approach, the observed hourly scalar time series is reconstructed into a multi-dimensional phase space, which is then used to forecast the future time series through the local linear approximation method. The main purpose is to forecast the high O3 concentrations. The original method performed poorly, but the improved method addressed this weakness, enabling the high concentrations to be successfully forecast. The correlation coefficient between the observed and forecasted time series through the improved method is 0.9159, and both the mean absolute error and root mean squared error are low. Thus, the improved method is advantageous. The time series analysis by means of the phase space plot and the Cao method identified the presence of low-dimensional chaotic dynamics in the observed O3 time series. Results showed that at least seven factors affect the studied O3 time series, which is consistent with the listed factors from the diurnal variations investigation and the sensitivity analysis from past studies. In conclusion, the chaotic approach has successfully forecast and analyzed the O3 time series in the educational area of Shah Alam. These findings are expected to help stakeholders such as the Ministry of Education and the Department of Environment achieve better air pollution management.
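    The chaotic-approach pipeline (delay embedding, then a local model built from nearest neighbours of the current state) can be sketched directly. For determinism the example uses a smooth quasi-periodic signal rather than real O3 data, and a local average of the neighbours' successors stands in for the study's local linear fit.

```python
import numpy as np

def embed(series, dim, tau):
    """Delay-coordinate reconstruction of the phase space."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau:i * tau + n] for i in range(dim)])

t = np.arange(4000) * 0.05
series = np.sin(t) + 0.5 * np.sin(2.3 * t)       # toy quasi-periodic "O3" signal

dim, tau, k = 3, 4, 8
train = series[:-1]                              # hold out the final value
E = embed(train, dim, tau)
d = np.linalg.norm(E[:-1] - E[-1], axis=1)       # distance to the current state
nn = np.argsort(d)[:k]                           # k nearest historical analogues
next_idx = nn + (dim - 1) * tau + 1              # their one-step successors
forecast = float(train[next_idx].mean())         # local forecast of series[-1]
```

    Because nearby states in the reconstructed space evolve similarly, the analogues' successors give a skilful one-step forecast of the held-out value.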

  13. A case study of the sensitivity of forecast skill to data and data analysis techniques

    NASA Technical Reports Server (NTRS)

    Baker, W. E.; Atlas, R.; Halem, M.; Susskind, J.

    1983-01-01

    A series of experiments have been conducted to examine the sensitivity of forecast skill to various data and data analysis techniques for the 0000 GMT case of January 21, 1979. These include the individual components of the FGGE observing system, the temperatures obtained with different satellite retrieval methods, and the method of vertical interpolation between the mandatory pressure analysis levels and the model sigma levels. It is found that NESS TIROS-N infrared retrievals seriously degrade a rawinsonde-only analysis over land, resulting in a poorer forecast over North America. Less degradation in the 72-hr forecast skill at sea level and some improvement at 500 mb is noted, relative to the control with TIROS-N retrievals produced with a physical inversion method which utilizes a 6-hr forecast first guess. NESS VTPR oceanic retrievals lead to an improved forecast over North America when added to the control.

  14. Predicting and downscaling ENSO impacts on intraseasonal precipitation statistics in California: The 1997/98 event

    USGS Publications Warehouse

    Gershunov, A.; Barnett, T.P.; Cayan, D.R.; Tubbs, T.; Goddard, L.

    2000-01-01

    Three long-range forecasting methods have been evaluated for prediction and downscaling of seasonal and intraseasonal precipitation statistics in California. Full-statistical, hybrid dynamical-statistical and full-dynamical approaches have been used to forecast El Niño-Southern Oscillation (ENSO)-related total precipitation, daily precipitation frequency, and average intensity anomalies during the January-March season. For El Niño winters, the hybrid approach emerges as the best performer, while La Niña forecasting skill is poor. The full-statistical forecasting method features reasonable forecasting skill for both La Niña and El Niño winters. The performance of the full-dynamical approach could not be evaluated as rigorously as that of the other two forecasting schemes. Although the full-dynamical forecasting approach is expected to outperform simpler forecasting schemes in the long run, evidence is presented to conclude that, at present, the full-dynamical forecasting approach is the least viable of the three, at least in California. The authors suggest that operational forecasting of any intraseasonal temperature, precipitation, or streamflow statistic derivable from the available records is possible now for ENSO-extreme years.

  15. The Second NWRA Flare-Forecasting Comparison Workshop: Methods Compared and Methodology

    NASA Astrophysics Data System (ADS)

    Leka, K. D.; Barnes, G.; the Flare Forecasting Comparison Group

    2013-07-01

    The Second NWRA Workshop to compare methods of solar flare forecasting was held 2-4 April 2013 in Boulder, CO. This is a follow-on to the First NWRA Workshop on Flare Forecasting Comparison, also known as the ``All-Clear Forecasting Workshop'', held in 2009 jointly with NASA/SRAG and NOAA/SWPC. For this most recent workshop, many researchers who are active in the field participated, and diverse methods were represented in terms of both the characterization of the Sun and the statistical approaches used to create a forecast. A standard dataset was created for this investigation, using data from the Solar Dynamics Observatory/ Helioseismic and Magnetic Imager (SDO/HMI) vector magnetic field HARP series. For each HARP on each day, 6 hours of data were used, allowing for nominal time-series analysis to be included in the forecasts. We present here a summary of the forecasting methods that participated and the standardized dataset that was used. Funding for the workshop and the data analysis was provided by NASA/Living with a Star contract NNH09CE72C and NASA/Guest Investigator contract NNH12CG10C.

  16. Applications of Principled Search Methods in Climate Influences and Mechanisms

    NASA Technical Reports Server (NTRS)

    Glymour, Clark

    2005-01-01

    Forest and grass fires cause economic losses in the billions of dollars in the U.S. alone. In addition, boreal forests constitute a large carbon store; it has been estimated that, were no burning to occur, an additional 7 gigatons of carbon would be sequestered in boreal soils each century. Effective wildfire suppression requires anticipation of locales and times for which wildfire is most probable, preferably with a two to four week forecast, so that limited resources can be efficiently deployed. The United States Forest Service (USFS), and other experts and agencies, have developed several measures of fire risk combining physical principles and expert judgment, and have used them in automated procedures for forecasting fire risk. Forecasting accuracies for some fire risk indices in combination with climate and other variables have been estimated for specific locations, with the value of fire risk index variables assessed by their statistical significance in regressions. In other cases, for example the MAPSS forecasts [23, 24], forecasting accuracy has been estimated only with simulated data. We describe alternative forecasting methods that predict fire probability by locale and time using statistical or machine learning procedures trained on historical data, and we give comparative assessments of their forecasting accuracy for one fire season, April-October 2003, for all U.S. Forest Service lands. Aside from providing an accuracy baseline for other forecasting methods, the results illustrate the interdependence between the statistical significance of prediction variables and the forecasting method used.

  17. A Load-Based Temperature Prediction Model for Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Sobhani, Masoud

    Electric load forecasting, as a basic requirement for the decision-making in power utilities, has been improved in various aspects in the past decades. Many factors may affect the accuracy of the load forecasts, such as data quality, goodness of the underlying model and load composition. Due to the strong correlation between the input variables (e.g., weather and calendar variables) and the load, the quality of input data plays a vital role in forecasting practices. Even if the forecasting model were able to capture most of the salient features of the load, a low quality input data may result in inaccurate forecasts. Most of the data cleansing efforts in the load forecasting literature have been devoted to the load data. Few studies focused on weather data cleansing for load forecasting. This research proposes an anomaly detection method for the temperature data. The method consists of two components: a load-based temperature prediction model and a detection technique. The effectiveness of the proposed method is demonstrated through two case studies: one based on the data from the Global Energy Forecasting Competition 2014, and the other based on the data published by ISO New England. The results show that by removing the detected observations from the original input data, the final load forecast accuracy is enhanced.
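    The two-component design (a load-based temperature prediction model plus a detection rule) can be sketched with a linear temperature-from-load model and a 3-sigma residual threshold. The data, corruption pattern, and threshold below are illustrative assumptions; the thesis's prediction model and detection technique are more elaborate.

```python
import numpy as np

# (1) Predict temperature from load; (2) flag observations whose residual
# exceeds three standard deviations. Three sensor readings are corrupted.
rng = np.random.default_rng(9)
n = 200
temp_true = rng.uniform(0.0, 30.0, n)
load = 500.0 + 20.0 * temp_true + rng.normal(0.0, 20.0, n)  # load tracks temperature
temp_obs = temp_true.copy()
temp_obs[[20, 80, 150]] += 25.0                  # injected temperature anomalies

A = np.column_stack([np.ones(n), load])          # temperature ~ a + b * load
coef, *_ = np.linalg.lstsq(A, temp_obs, rcond=None)
resid = temp_obs - A @ coef
flags = np.where(np.abs(resid) > 3.0 * resid.std())[0]
```

    The three corrupted readings are exactly the points flagged; removing them before fitting the load forecast model is the cleansing step the abstract advocates.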

  18. Inferring internal properties of Earth's core dynamics and their evolution from surface observations and a numerical geodynamo model

    NASA Astrophysics Data System (ADS)

    Aubert, J.; Fournier, A.

    2011-10-01

    Over the past decades, direct three-dimensional numerical modelling has been successfully used to reproduce the main features of the geodynamo. Here we report on efforts to solve the associated inverse problem, aiming at inferring the underlying properties of the system from the sole knowledge of surface observations and the first principle dynamical equations describing the convective dynamo. To this end we rely on twin experiments. A reference model time sequence is first produced and used to generate synthetic data, restricted here to the large-scale component of the magnetic field and its rate of change at the outer boundary. Starting from a different initial condition, a second sequence is next run and attempts are made to recover the internal magnetic, velocity and buoyancy anomaly fields from the sparse surficial data. In order to reduce the vast underdetermination of this problem, we use stochastic inversion, a linear estimation method determining the most likely internal state compatible with the observations and some prior knowledge, and we also implement a sequential evolution algorithm in order to invert time-dependent surface observations. The prior is the multivariate statistics of the numerical model, which are directly computed from a large number of snapshots stored during a preliminary direct run. The statistics display strong correlation between different harmonic degrees of the surface observations and internal fields, provided they share the same harmonic order, a natural consequence of the linear coupling of the governing dynamical equations and of the leading influence of the Coriolis force. Synthetic experiments performed with a weakly nonlinear model yield an excellent quantitative retrieval of the internal structure. 
In contrast, the use of a strongly nonlinear (and more realistic) model results in less accurate static estimations, which in turn fail to constrain the unobserved small scales in the time integration of the evolution scheme. Evaluating the quality of forecasts of the system evolution against the reference solution, we show that our scheme can improve predictions based on linear extrapolations on forecast horizons shorter than the system e-folding time. Still, in the perspective of forthcoming data assimilation activities, our study underlines the need for advanced estimation techniques able to cope with the moderate to strong nonlinearities present in the geodynamo.
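    The stochastic-inversion step described above is the classic linear estimator x_hat = P H^T (H P H^T + R)^{-1} y, with the prior covariance P computed from model snapshots. The low-dimensional sketch below uses synthetic "snapshots" in which one hidden component is strongly correlated with an observed one, mimicking the harmonic-order correlations the abstract reports.

```python
import numpy as np

rng = np.random.default_rng(10)
# 500 snapshots of a 6-component state; component 3 is hidden but strongly
# correlated with observed component 0 (a stand-in for the model statistics).
snapshots = rng.normal(size=(500, 6)) @ np.diag([3.0, 2.0, 1.0, 1.0, 0.5, 0.5])
snapshots[:, 3] = 0.9 * snapshots[:, 0] + 0.1 * rng.normal(size=500)
P = np.cov(snapshots.T)                          # prior covariance from the free run

H = np.zeros((2, 6)); H[0, 0] = 1.0; H[1, 1] = 1.0   # only two "surface" fields seen
R = 0.01 * np.eye(2)                                  # observation error covariance
x_true = snapshots[0]
y = H @ x_true + rng.normal(0.0, 0.1, 2)

# Best linear unbiased estimate of the full state from the sparse observations.
x_hat = P @ H.T @ np.linalg.solve(H @ P @ H.T + R, y)
```

    The hidden correlated component is recovered accurately through the prior covariance, while uncorrelated hidden components simply shrink toward the prior mean.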

  19. Conditional Monthly Weather Resampling Procedure for Operational Seasonal Water Resources Forecasting

    NASA Astrophysics Data System (ADS)

    Beckers, J.; Weerts, A.; Tijdeman, E.; Welles, E.; McManamon, A.

    2013-12-01

    To provide reliable and accurate seasonal streamflow forecasts for water resources management, several operational hydrologic agencies and hydropower companies around the world use the Extended Streamflow Prediction (ESP) procedure. The ESP in its original implementation does not accommodate any additional information that the forecaster may have about expected deviations from climatology in the near future. Several attempts have been made to improve the skill of the ESP forecast, especially for areas which are affected by teleconnections (e.g., ENSO, PDO), via selection (Hamlet and Lettenmaier, 1999) or weighting schemes (Werner et al., 2004; Wood and Lettenmaier, 2006; Najafi et al., 2012). A disadvantage of such schemes is that they lead to a reduction of the signal-to-noise ratio of the probabilistic forecast. To overcome this, we propose a resampling method, conditional on climate indices, to generate meteorological time series to be used in the ESP. The method can be used to generate a large number of meteorological ensemble members in order to improve the statistical properties of the ensemble. The effectiveness of the method was demonstrated in a real-time operational hydrologic seasonal forecast system for the Columbia River basin operated by the Bonneville Power Administration. The forecast skill of the k-nn resampler was tested against the original ESP for three basins at the long-range seasonal time scale. The BSS and CRPSS were used to compare the results to those of the original ESP method. Positive forecast skill scores were found for the resampler method conditioned on different indices for the prediction of spring peak flows in the Dworshak and Hungry Horse basins. For the Libby Dam basin, however, no improvement in skill was found. The proposed resampling method is a promising practical approach that can add skill to ESP forecasts at the seasonal time scale. Further improvement is possible by fine-tuning the method and selecting the most informative climate indices for the region of interest.
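    The k-nn conditional resampling step can be sketched as: pick the k historical years whose climate index is closest to the current value, then resample their weather traces as ensemble forcing. The index-weather relationship below is synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)
years = np.arange(1980, 2010)
index = rng.normal(0.0, 1.0, len(years))          # one climate index value per year
# Monthly precipitation traces; wetter in positive-index years (synthetic link).
weather = 100.0 + 20.0 * index[:, None] + rng.normal(0.0, 5.0, (len(years), 12))

def knn_resample(current_index, k=5, n_members=50):
    """Resample monthly weather from the k analogue years nearest in index space."""
    order = np.argsort(np.abs(index - current_index))
    pool = order[:k]                              # k nearest analogue years
    picks = rng.choice(pool, size=n_members)      # resample with replacement
    return weather[picks]

ens = knn_resample(1.5)                           # condition on a strongly positive index
```

    Conditioning on a positive index yields a wetter-than-climatology ensemble, while resampling with replacement lets the ensemble be arbitrarily large, which is the statistical-properties benefit the abstract highlights.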

  20. Replacement Beef Cow Valuation under Data Availability Constraints

    PubMed Central

    Hagerman, Amy D.; Thompson, Jada M.; Ham, Charlotte; Johnson, Kamina K.

    2017-01-01

    Economists are often tasked with estimating the benefits or costs associated with livestock production losses; however, lack of available data or absence of consistent reporting can reduce the accuracy of these valuations. This work looks at three potential estimation techniques for determining the value of replacement beef cows with varying types of market data, to proxy constrained data availability, and discusses the potential margin of error for each technique. Oklahoma bred replacement cows are valued using hedonic pricing based on Oklahoma bred cow data (a best case scenario), vector error correction modeling (VECM) based on national cow sales data, and cost of production (COP) based on just a representative enterprise budget and very limited sales data. Each method was then used to perform a within-sample forecast of January to December 2016, and forecasts are compared with the 2016 monthly observed market prices in Oklahoma using the mean absolute percent error (MAPE). Hedonic pricing methods tend to overvalue in within-sample forecasting but performed best, as measured by MAPE, for high quality cows. The VECM tended to undervalue cows but performed best for younger animals. COP performed well compared with the more data-intensive methods. Examining each method individually across eight representative replacement beef female types, the VECM forecast resulted in a MAPE under 10% for 33% of forecasted months, followed by hedonic pricing at 24% of the forecasted months and COP at 14% of the forecasted months for average quality beef females. For high quality females, the hedonic pricing method worked best, producing a MAPE under 10% in 36% of the forecasted months, followed by the COP method at 21% of months and the VECM at 14% of the forecasted months.
These results suggested that livestock valuation method selection was not one-size-fits-all and may need to vary based not only on the data available but also on the characteristics (e.g., quality or age) of the livestock being valued. PMID:29164141
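
    The MAPE used to compare the three valuation methods can be computed in a few lines. A minimal sketch in Python (the price figures are illustrative, not from the study):

```python
def mape(observed, forecast):
    """Mean absolute percent error, in percent; assumes no zero observations."""
    return 100.0 * sum(abs((o - f) / o) for o, f in zip(observed, forecast)) / len(observed)

# hypothetical monthly replacement-cow prices (USD) vs. within-sample forecasts
obs = [1500.0, 1520.0, 1480.0]
fc = [1450.0, 1560.0, 1500.0]
print(mape(obs, fc))
```

    A forecast month then counts as "good" in the sense of the abstract when its MAPE falls under 10%.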

  1. Post-processing through linear regression

    NASA Astrophysics Data System (ADS)

    van Schaeybroeck, B.; Vannitsem, S.

    2011-03-01

    Various post-processing techniques are compared for both deterministic and ensemble forecasts, all based on linear regression between forecast data and observations. In order to evaluate the quality of the regression methods, three criteria are proposed, related to the effective correction of forecast error, the optimal variability of the corrected forecast and multicollinearity. The regression schemes under consideration include the ordinary least-square (OLS) method, a new time-dependent Tikhonov regularization (TDTR) method, the total least-square method, a new geometric-mean regression (GM), a recently introduced error-in-variables (EVMOS) method and, finally, a "best member" OLS method. The advantages and drawbacks of each method are clarified. These techniques are applied in the context of the Lorenz '63 system, whose model version is affected by both initial condition and model errors. For short forecast lead times, the number and choice of predictors play an important role. Contrary to the other techniques, GM degrades when the number of predictors increases. At intermediate lead times, linear regression is unable to provide corrections to the forecast and can sometimes degrade the performance (GM and the best member OLS with noise). At long lead times the regression schemes (EVMOS, TDTR), which yield the correct variability and the largest correlation between ensemble error and spread, should be preferred.
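
    The simplest of the schemes above, OLS post-processing, regresses observations on forecasts and applies the fitted line to correct new forecasts. A minimal univariate sketch (one predictor; the paper's schemes are more general):

```python
def ols_correction(forecasts, observations):
    """Fit observation = a + b * forecast by ordinary least squares and
    return a function that corrects a new raw forecast."""
    n = len(forecasts)
    mx = sum(forecasts) / n
    my = sum(observations) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(forecasts, observations))
    sxx = sum((x - mx) ** 2 for x in forecasts)
    b = sxy / sxx
    a = my - b * mx
    return lambda x: a + b * x

# hypothetical training pairs (raw forecast, observation)
corrector = ols_correction([0.0, 2.0, 4.0], [2.0, 3.0, 4.0])
print(corrector(10.0))
```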

  2. Performance of the 'material Failure Forecast Method' in real-time situations: A Bayesian approach applied on effusive and explosive eruptions

    NASA Astrophysics Data System (ADS)

    Boué, A.; Lesage, P.; Cortés, G.; Valette, B.; Reyes-Dávila, G.; Arámbula-Mendoza, R.; Budi-Santoso, A.

    2016-11-01

    Most attempts at deterministic eruption forecasting are based on the material Failure Forecast Method (FFM). This method assumes that a precursory observable, such as the rate of seismic activity, can be described by a simple power law which presents a singularity at a time close to the eruption onset. Until now, this method has been applied only in a small number of cases, generally for forecasts in hindsight. In this paper, a rigorous Bayesian approach to the FFM designed for real-time applications is applied. Using an automatic recognition system, seismo-volcanic events are detected and classified according to their physical mechanism, and time series of probability distributions of the rates of events are calculated. At each time of observation, a Bayesian inversion provides estimations of the exponent of the power law and of the time of eruption, together with their probability density functions. Two criteria are defined in order to evaluate the quality and reliability of the forecasts. Our automated procedure has allowed the analysis of long, continuous seismic time series: 13 years from Volcán de Colima, Mexico, 10 years from Piton de la Fournaise, Reunion Island, France, and several months from Merapi volcano, Java, Indonesia. The new forecasting approach has been applied to 64 pre-eruptive sequences which present various types of dominant seismic activity (volcano-tectonic or long-period events) and patterns of seismicity with different levels of complexity. This has allowed us to test the FFM assumptions, to determine in which conditions the method can be applied, and to quantify the success rate of the forecasts. 62% of the precursory sequences analysed are suitable for the application of FFM and half of the total number of eruptions are successfully forecast in hindsight. In real time, the method allows for the successful forecast of 36% of all the eruptions considered.
Nevertheless, real-time forecasts are successful for 83% of the cases that fulfil the reliability criteria. Therefore, confidence in the method is high when the reliability criteria are met.
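
    The FFM power law can be illustrated with a minimal sketch. Under the common linearization with power-law exponent equal to 2, the inverse precursor rate decays linearly and crosses zero at the failure (eruption) time, so a straight-line fit yields the forecast. The synthetic data below are illustrative, not from the paper:

```python
def ffm_failure_time(times, rates):
    """Linearized FFM assuming exponent alpha = 2: fit a line to the
    inverse event rate and return the time where it crosses zero."""
    inv = [1.0 / r for r in rates]
    n = len(times)
    mt = sum(times) / n
    mi = sum(inv) / n
    b = sum((t - mt) * (i - mi) for t, i in zip(times, inv)) / \
        sum((t - mt) ** 2 for t in times)
    a = mi - b * mt
    return -a / b  # zero crossing of the fitted inverse rate

# synthetic precursor: rate accelerating toward a singularity at t = 10
times = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
rates = [1.0 / (10.0 - t) for t in times]
print(ffm_failure_time(times, rates))
```

    The Bayesian version in the paper additionally treats the exponent as unknown and returns probability densities rather than point estimates.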

  3. Support vector machine-an alternative to artificial neuron network for water quality forecasting in an agricultural nonpoint source polluted river?

    PubMed

    Liu, Mei; Lu, Jun

    2014-09-01

    Water quality forecasting in agricultural drainage river basins is difficult because of the complicated nonpoint source (NPS) pollution transport processes and river self-purification processes involved in highly nonlinear problems. Artificial neural network (ANN) and support vector machine (SVM) models were developed to predict total nitrogen (TN) and total phosphorus (TP) concentrations for any location of a river polluted by agricultural NPS pollution in eastern China. River flow, water temperature, flow travel time, rainfall, dissolved oxygen, and upstream TN or TP concentrations were selected as initial inputs of the two models. Monthly, bimonthly, and trimonthly datasets were selected to train the two models, respectively, and the same monthly dataset which had not been used for training was chosen to test the models in order to compare their generalization performance. Trial-and-error analysis and genetic algorithms (GA) were employed to optimize the parameters of the ANN and SVM models, respectively. The results indicated that the proposed SVM models showed better generalization ability because they avoid overtraining and optimize fewer parameters, based on the structural risk minimization (SRM) principle. Furthermore, both TN and TP SVM models trained on trimonthly datasets achieved greater forecasting accuracy than the corresponding ANN models. Thus, SVM models are a powerful alternative because they are efficient and economical tools for accurately predicting water quality with low risk. The sensitivity analyses of the two models indicated that decreasing upstream input concentrations during the dry season and NPS emission along the reach during the average or flood season should be an effective way to improve Changle River water quality. If the necessary water quality and hydrology data, even only trimonthly data, are available, the SVM methodology developed here can easily be applied to other NPS-polluted rivers.

  4. A proposal of a monitoring and forecasting system for crustal activity in and around Japan using large-scale high-fidelity finite element simulation codes

    NASA Astrophysics Data System (ADS)

    Hori, Takane; Ichimura, Tsuyoshi; Takahashi, Narumi

    2017-04-01

    Here we propose a system for monitoring and forecasting of crustal activity, such as spatio-temporal variation in slip velocity on the plate interface including earthquakes, seismic wave propagation, and crustal deformation. Although, we can obtain continuous dense surface deformation data on land and partly on the sea floor, the obtained data are not fully utilized for monitoring and forecasting. It is necessary to develop a physics-based data analysis system including (1) a structural model with the 3D geometry of the plate interface and the material property such as elasticity and viscosity, (2) calculation code for crustal deformation and seismic wave propagation using (1), (3) inverse analysis or data assimilation code both for structure and fault slip using (1) & (2). To accomplish this, it is at least necessary to develop highly reliable large-scale simulation code to calculate crustal deformation and seismic wave propagation for 3D heterogeneous structure. Actually, Ichimura et al. (2015, SC15) has developed unstructured FE non-linear seismic wave simulation code, which achieved physics-based urban earthquake simulation enhanced by 1.08 T DOF x 6.6 K time-step. Ichimura et al. (2013, GJI) has developed high fidelity FEM simulation code with mesh generator to calculate crustal deformation in and around Japan with complicated surface topography and subducting plate geometry for 1km mesh. Fujita et al. (2016, SC16) has improved the code for crustal deformation and achieved 2.05 T-DOF with 45m resolution on the plate interface. This high-resolution analysis enables computation of change of stress acting on the plate interface. Further, for inverse analyses, Errol et al. (2012, BSSA) has developed waveform inversion code for modeling 3D crustal structure, and Agata et al. (2015, AGU Fall Meeting) has improved the high-fidelity FEM code to apply an adjoint method for estimating fault slip and asthenosphere viscosity. 
Hence, we have large-scale simulation and analysis tools for monitoring. Furthermore, we are developing methods for forecasting the slip velocity variation on the plate interface. The basic concept is given in Hori et al. (2014, Oceanography), which introduces an ensemble-based sequential data assimilation procedure. Although the prototype described there is for an elastic half-space model, we are applying it to a 3D heterogeneous structure with the high-fidelity FE model.

  5. Verification of FLYSAFE Clear Air Turbulence (CAT) objects against aircraft turbulence measurements

    NASA Astrophysics Data System (ADS)

    Lunnon, R.; Gill, P.; Reid, L.; Mirza, A.

    2009-09-01

    Prediction of gridded CAT fields: The main causes of CAT are (a) vertical wind shear (low Richardson number), (b) mountain waves, and (c) convection. All three causes contribute roughly equally to CAT occurrences globally. Prediction of shear-induced CAT: The prediction of shear-induced CAT has a longer history than that of either mountain-wave-induced or convectively induced CAT. Both Global Aviation Forecasting Centres are currently using the Ellrod TI1 algorithm (Ellrod and Knapp, 1992). This predictor is the scalar product of deformation and vertical wind shear. More sophisticated algorithms can amplify errors in non-linear, differentiated quantities, so it is very likely that Ellrod will out-perform other algorithms when verified globally. Prediction of mountain wave CAT: The Global Aviation Forecasting Centre in the UK has been generating automated forecasts of mountain wave CAT since the late 1990s, based on the diagnosis of gravity wave drag. Generation of CAT objects: In the FLYSAFE project it was decided at an early stage that short range forecasts of meteorological hazards, i.e. icing, Clear Air Turbulence, Cumulonimbus Clouds, should be represented as weather objects, that is, descriptions of individual hazardous volumes of airspace. For CAT, the forecast information on which the weather objects were based was gridded, i.e. it comprised a representation of a hazard level for all points in a pre-defined 3-D grid, for a range of forecast times. A "grid-to-objects" capability was generated. This is discussed further in Mirza and Drouin (this conference). Verification of CAT forecasts: Verification was performed using digital accelerometer data from aircraft in the British Airways Boeing 747 fleet. A preliminary processing of the aircraft data was performed to generate a truth field on a scale similar to that used to provide gridded forecasts to airlines. This truth field was binary, i.e. 
each flight segment was characterised as being either "turbulent" or "benign". A gridded forecast field is a continuously changing variable. In contrast, a simple weather object must be characterised by a specific threshold. For a gridded forecast and a binary truth measure it is possible to generate Relative Operating Characteristic (ROC) curves. For weather objects, a single point in the hit-rate/false-alarm-rate space can be generated. If this point is plotted on a ROC curve graph then the skill of the forecast using weather objects can be compared with the skill of the gridded forecast.
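
    The Ellrod TI1 predictor mentioned above is, in its standard formulation, total deformation multiplied by vertical wind shear. A minimal sketch (the gradient values in the example are hypothetical):

```python
def ellrod_ti1(du_dx, du_dy, dv_dx, dv_dy, vws):
    """Ellrod TI1 turbulence index: total deformation times vertical
    wind shear (vws), following Ellrod and Knapp (1992)."""
    dst = du_dx - dv_dy                      # stretching deformation
    dsh = dv_dx + du_dy                      # shearing deformation
    deformation = (dst ** 2 + dsh ** 2) ** 0.5
    return deformation * vws

# hypothetical horizontal wind gradients and vertical shear
print(ellrod_ti1(2.0e-5, 0.0, 0.0, 0.0, 3.0e-3))
```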

  6. Statistical Correction of Air Temperature Forecasts for City and Road Weather Applications

    NASA Astrophysics Data System (ADS)

    Mahura, Alexander; Petersen, Claus; Sass, Bent; Gilet, Nicolas

    2014-05-01

    The method for statistical correction of air and road surface temperature forecasts was developed based on analysis of long-term time series of meteorological observations and forecasts (from the HIgh Resolution Limited Area Model and the Road Conditions Model; 3 km horizontal resolution). It was tested for May-Aug 2012 and Oct 2012 - Mar 2013, respectively. The developed method is based mostly on forecasted meteorological parameters, with a minimal inclusion of observations (covering only a pre-history period). Although the first iteration of the correction takes relevant temperature observations into account, the further adjustment of air and road temperature forecasts is based purely on forecasted meteorological parameters. The method is model independent, i.e. it can be applied for temperature correction with other types of models having different horizontal resolutions. It is relatively fast due to the application of the singular value decomposition method in the matrix solution used to find the coefficients. Moreover, there is always a possibility for additional improvement through extra tuning of the temperature forecasts for some locations (stations), in particular where, for example, the MAEs are generally higher than elsewhere (see Gilet et al., 2014). For city weather applications, a new operational procedure for statistical correction of the air temperature forecasts has been elaborated and implemented for the HIRLAM-SKA model runs at 00, 06, 12, and 18 UTC, covering forecast lengths up to 48 hours. The procedure includes segments for extraction of observations and forecast data, assigning these to forecast lengths, statistical correction of temperature, one- and multi-day statistical evaluation of model performance, decision-making on using corrections by stations, interpolation, visualisation and storage/backup. Pre-operational air temperature correction runs have been performed for mainland Denmark since mid-April 2013 and have shown good results. 
Tests also showed that the CPU time required for the operational procedure is relatively short (less than 15 minutes, including a large share spent on interpolation). They also showed that, in order to start correcting forecasts, there is no need for long-term pre-history data (containing forecasts and observations); a couple of weeks is sufficient when a new observational station is included and added to the forecast point. For the road weather application, the operational statistical correction of road surface temperature forecasts (for the RWM system's daily hourly runs, covering forecast lengths up to 5 hours ahead) for the Danish road network (about 400 road stations) was also implemented, and it has been running in test mode since Sep 2013. The method can also be applied for correction of the dew point temperature and wind speed (as parts of the observations/forecasts at synoptic stations), since both of these meteorological parameters are part of the proposed system of equations. Evaluation of the method's performance for improving wind speed forecasts is planned as well, along with consideration of possible wind direction improvements (which is more complex due to the multi-modal distribution of such data). The method worked for the entire domain of mainland Denmark (tested for 60 synoptic and 395 road stations) and hence can also be applied for any geographical point within this domain, e.g. through interpolation to about 100 cities' locations (for Danish national byvejr forecasts). Moreover, we can assume that the same method can be used in other geographical areas. Evaluation for other domains (with a focus on Greenland and the Nordic countries) is planned. In addition, a similar approach might also be tested for statistical correction of concentrations of chemical species, but such an approach will require additional elaboration and evaluation.

  7. Performance of univariate forecasting on seasonal diseases: the case of tuberculosis.

    PubMed

    Permanasari, Adhistya Erna; Rambli, Dayang Rohaya Awang; Dominic, P Dhanapal Durai

    2011-01-01

    Predicting the annual number of disease incidents worldwide is desirable for adopting appropriate policies to prevent disease outbreaks. This chapter considers the performance of different forecasting methods in predicting the future number of disease incidences, especially for seasonal diseases. Six forecasting methods, namely linear regression, moving average, decomposition, Holt-Winters, ARIMA, and artificial neural network (ANN), were used for disease forecasting on monthly tuberculosis data. The models derived met the requirements of a time series with a seasonal pattern and a downward trend. The forecasting performance was compared using the same error measures on the basis of the last 5 years of forecast results. The findings indicate that the ARIMA model was the most appropriate model, since it obtained a lower relative error than the other models.

  8. Study on load forecasting to data centers of high power density based on power usage effectiveness

    NASA Astrophysics Data System (ADS)

    Zhou, C. C.; Zhang, F.; Yuan, Z.; Zhou, L. M.; Wang, F. M.; Li, W.; Yang, J. H.

    2016-08-01

    Data centers usually exhibit considerable energy consumption. Load forecasting for data centers helps in formulating regional load density indexes and is of great benefit for more accurate regional spatial load forecasting. The building structure and other influential factors, i.e. equipment, geographic and climatic conditions, are considered for data centers, and a method to forecast the load of data centers based on power usage effectiveness is proposed. In this method, the cooling capacity of a data center and the power usage effectiveness index are used to forecast the power load of the data center. The cooling capacity is obtained by calculating the heat load of the data center. The index is estimated using the group decision-making method of mixed language information. An example is given to demonstrate the applicability and accuracy of this method.
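
    The core arithmetic behind the method is simple: power usage effectiveness (PUE) is the ratio of total facility power to IT equipment power, and the IT electrical load approximately equals the heat it dissipates (the quantity recovered from the cooling-capacity calculation). A hedged sketch with hypothetical numbers:

```python
def forecast_total_load(it_heat_load_kw, pue):
    """Total facility load (kW) from the IT heat load and a PUE estimate.

    Assumption for this sketch: IT electrical load ~ dissipated heat,
    so total power = IT power * PUE.
    """
    return it_heat_load_kw * pue

# hypothetical data center: 100 kW heat load, estimated PUE of 1.6
print(forecast_total_load(100.0, 1.6))  # 160.0 kW total facility load
```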

  9. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and particle swarm optimization techniques.

    PubMed

    Chen, Shyi-Ming; Manalu, Gandhi Maruli Tua; Pan, Jeng-Shyang; Liu, Hsiang-Chuan

    2013-06-01

    In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and particle swarm optimization (PSO) techniques. First, we fuzzify the historical training data of the main factor and the secondary factor, respectively, to form two-factors second-order fuzzy logical relationships. Then, we group the two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Finally, we obtain the optimal weighting vector for each fuzzy-trend logical relationship group by using PSO techniques to perform the forecasting. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index and the NTD/USD exchange rates. The experimental results show that the proposed method achieves better forecasting performance than existing methods.
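
    The weight-optimization step can be illustrated with a minimal global-best particle swarm optimizer. This is a generic sketch (the inertia and acceleration constants are common textbook values, not the paper's), into which any forecasting-error loss over a weighting vector could be plugged:

```python
import random

def pso_minimize(loss, dim, n_particles=20, iters=100, seed=0):
    """Minimal global-best PSO: returns (best position, best loss)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    pbest_val = [loss(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    w, c1, c2 = 0.7, 1.5, 1.5                   # inertia, cognitive, social
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = loss(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# demo: minimize a 2-D sphere function as a stand-in for a forecast-error loss
best, val = pso_minimize(lambda p: sum(x * x for x in p), dim=2)
print(val)
```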

  10. Calibration of decadal ensemble predictions

    NASA Astrophysics Data System (ADS)

    Pasternack, Alexander; Rust, Henning W.; Bhend, Jonas; Liniger, Mark; Grieger, Jens; Müller, Wolfgang; Ulbrich, Uwe

    2017-04-01

    Decadal climate predictions are of great socio-economic interest due to the corresponding planning horizons of several political and economic decisions. Due to the uncertainties of weather and climate forecasts (e.g. initial condition uncertainty), they are issued in a probabilistic way. One issue frequently observed for probabilistic forecasts is that they tend not to be reliable, i.e. the forecasted probabilities are not consistent with the relative frequencies of the associated observed events. Thus, these kinds of forecasts need to be re-calibrated. While re-calibration methods for seasonal time scales are available and frequently applied, these methods still have to be adapted for decadal time scales and their characteristic problems, such as the climate trend and the lead-time dependent bias. We therefore propose a method to re-calibrate decadal ensemble predictions that takes the above-mentioned characteristics into account. Finally, this method is applied and validated on decadal forecasts from the MiKlip system (Germany's initiative for decadal prediction).

  11. Scenario Generation and Assessment Framework Solution in Support of the Comprehensive Approach

    DTIC Science & Technology

    2010-04-01

    attention, stress, fatigue etc.) and neurofeedback tracking for evaluation in a qualitative manner the real involvement of the trained participants in CAX...Series, Softrade, 2006 (in Bulgarian). [11] Minchev Z., Dukov G., Georgiev S. EEG Spectral Analysis in Serious Gaming: An Ad Hoc Experimental...Nonlinear and linear forecasting of the EEG time series, Biological Cybernetics, 66, 221-259, 1991. [20] Schubert, J., Svenson, P., and Mårtenson, Ch

  12. Nonlinear filtering techniques for noisy geophysical data: Using big data to predict the future

    NASA Astrophysics Data System (ADS)

    Moore, J. M.

    2014-12-01

    Chaos is ubiquitous in physical systems. Within the Earth sciences it is readily evident in seismology, groundwater flows and drilling data. Models and workflows have been applied successfully to understand and even to predict chaotic systems in other scientific fields, including electrical engineering, neurology and oceanography. Unfortunately, the high levels of noise characteristic of our planet's chaotic processes often render these frameworks ineffective. This contribution presents techniques for the reduction of noise associated with measurements of nonlinear systems. Our ultimate aim is to develop data assimilation techniques for forward models that describe chaotic observations, such as episodic tremor and slip (ETS) events in fault zones. A series of nonlinear filters are presented and evaluated using classical chaotic systems. To investigate whether the filters can successfully mitigate the effect of noise typical of Earth science, they are applied to sunspot data. The filtered data can be used successfully to forecast sunspot evolution for up to eight years (see figure).
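
    The prediction side of this workflow, reconstructing a state space by delay embedding and forecasting with nearest neighbours, can be sketched minimally. This is a generic local-prediction sketch in the spirit of the nonlinear-modeling literature, not the authors' filter:

```python
def delay_embed_forecast(series, dim=3, k=3):
    """One-step forecast via k nearest neighbours in a delay-embedded
    state space: average the successors of the k closest delay vectors."""
    # delay vectors whose one-step successor is known
    vecs = [tuple(series[i:i + dim]) for i in range(len(series) - dim)]
    succ = [series[i + dim] for i in range(len(series) - dim)]
    query = tuple(series[-dim:])  # current state
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(range(len(vecs)), key=lambda i: dist(vecs[i], query))[:k]
    return sum(succ[i] for i in nearest) / k

# demo on a noiseless periodic signal: the next value is recovered exactly
print(delay_embed_forecast([0.0, 1.0] * 10))
```

    Real use would add the noise-reduction filters discussed above before embedding, since nearest-neighbour prediction degrades quickly with measurement noise.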

  13. A new scoring method for evaluating the performance of earthquake forecasts and predictions

    NASA Astrophysics Data System (ADS)

    Zhuang, J.

    2009-12-01

    This study presents a new method, namely the gambling score, for evaluating the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular scheme of forecasts and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. A fair scoring scheme should reward success in a way that is compatible with the risk taken. Suppose that we have a reference model, usually the Poisson model for usual cases or the Omori-Utsu formula for the case of forecasting aftershocks, which gives a probability p0 that at least 1 event occurs in a given space-time-magnitude window. The forecaster, similar to a gambler who starts with a certain number of reputation points, bets 1 reputation point on ``Yes'' or ``No'' according to his forecast, or bets nothing if he performs a NA-prediction. If the forecaster bets 1 reputation point on ``Yes'' and loses, the number of his reputation points is reduced by 1; if his forecast is successful, he is rewarded (1-p0)/p0 reputation points. The quantity (1-p0)/p0 is the return (reward/bet) ratio for bets on ``Yes''. In this way, if the reference model is correct, the expected return that he gains from this bet is 0. This rule also applies to probability forecasts. Suppose that p is the occurrence probability of an earthquake given by the forecaster. We can regard the forecaster as splitting 1 reputation point by betting p on ``Yes'' and 1-p on ``No''. In this way, the forecaster's expected pay-off based on the reference model is still 0. From the viewpoints of both the reference model and the forecaster, the rule for reward and punishment is fair. This method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. 
We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and when the reference model is the Poisson model.
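
    The reward rule above can be written down directly. A minimal sketch covering the binary bets only (not the continuous extension):

```python
def gambling_reward(p0, event_occurred, bet_on_yes=True):
    """Change in reputation points for a 1-point bet, given the reference
    probability p0 that at least one event occurs in the window."""
    if bet_on_yes:
        return (1.0 - p0) / p0 if event_occurred else -1.0
    return p0 / (1.0 - p0) if not event_occurred else -1.0

# under the reference model the expected return is zero for either side:
p0 = 0.2
e_yes = p0 * gambling_reward(p0, True) + (1.0 - p0) * gambling_reward(p0, False)
print(e_yes)
```

    This zero expectation under the reference model is exactly the fairness property the abstract describes.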

  14. PONS2train: a tool for testing MLP architectures and local training methods for runoff forecasting

    NASA Astrophysics Data System (ADS)

    Maca, P.; Pavlasek, J.; Pech, P.

    2012-04-01

    The purpose of the presented poster is to introduce PONS2train, a software tool developed for runoff prediction via the multilayer perceptron (MLP). The application enables the implementation of 12 different MLP transfer functions, the comparison of 9 local training algorithms and, finally, the evaluation of MLP performance via 17 selected model evaluation metrics. The PONS2train software is written in the C++ programming language. Its implementation consists of 4 classes. The NEURAL_NET and NEURON classes implement the MLP, the CRITERIA class estimates model evaluation metrics for model performance evaluation on testing and validation datasets, and the DATA_PATTERN class prepares the validation, testing and calibration datasets. The application uses the LAPACK, BLAS and ARMADILLO C++ linear algebra libraries. PONS2train implements the following first-order local optimization algorithms: standard on-line and batch back-propagation with learning rate combined with momentum and its variants with a regularization term, Rprop, and standard batch back-propagation with variable momentum and learning rate. The second-order local training algorithms comprise the Levenberg-Marquardt algorithm with and without regularization and four variants of scaled conjugate gradients. Other important PONS2train features are: multi-run, weight saturation control, early stopping of training, and MLP weights analysis. Weight initialization is done via two different methods: random sampling from a uniform distribution on an open interval, or the Nguyen-Widrow method. The data patterns can be transformed via linear and nonlinear transformations. The runoff forecast case study focuses on the PONS2train implementation and shows different aspects of MLP training, MLP architecture estimation, neural network weights analysis and model uncertainty estimation.

  15. Data-driven forecasting algorithms for building energy consumption

    NASA Astrophysics Data System (ADS)

    Noh, Hae Young; Rajagopal, Ram

    2013-04-01

    This paper introduces two forecasting methods for building energy consumption data recorded by smart meters in high resolution. For utility companies, it is important to reliably forecast the aggregate consumption profile to determine the energy supply for the next day and prevent any crisis. The proposed methods forecast individual loads on the basis of their measurement history and weather data, without using complicated models of the building systems. The first method is most efficient for very short-term prediction, such as a prediction period of one hour, and uses a simple adaptive time-series model. For longer-term prediction, a nonparametric Gaussian process is applied to forecast the day-ahead load profiles and their uncertainty bounds. These methods are computationally simple and adaptive, and thus suitable for analyzing large sets of data whose patterns change over time. The forecasting methods are applied to several sets of building energy consumption data for lighting and heating-ventilation-air-conditioning (HVAC) systems collected from a campus building at Stanford University. The measurements are collected every minute, and corresponding weather data are provided hourly. The results show that the proposed algorithms can predict energy consumption with high accuracy.
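
    The paper describes its short-term model only as "a simple adaptive time-series model". A minimal stand-in of that flavour is one-step exponential smoothing (the smoothing constant below is an assumption, not a value from the paper):

```python
def adaptive_forecast(series, alpha=0.3):
    """One-step-ahead exponential smoothing: the forecast is a level that
    adapts to each new observation with weight alpha."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1.0 - alpha) * level
    return level

# hypothetical hourly kWh readings; forecast the next hour
print(adaptive_forecast([4.1, 4.3, 4.0, 4.4, 4.2]))
```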

  16. An information-theoretical perspective on weighted ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Weijs, Steven V.; van de Giesen, Nick

    2013-08-01

    This paper presents an information-theoretical method for weighting ensemble forecasts with new information. Weighted ensemble forecasts can be used to adjust the distribution that an existing ensemble of time series represents, without modifying the values in the ensemble itself. The weighting can, for example, add new seasonal forecast information to an existing ensemble of historically measured time series that represents climatic uncertainty. A recent article in this journal compared several methods to determine the weights for the ensemble members and introduced the pdf-ratio method. In this article, a new method, the minimum relative entropy update (MRE-update), is presented. Based on the principle of minimum discrimination information, an extension of the principle of maximum entropy (POME), the method ensures that no more information is added to the ensemble than is present in the forecast. This is achieved by minimizing relative entropy, with the forecast information imposed as constraints. From this same perspective, an information-theoretical view on the various weighting methods is presented. The MRE-update is compared with the existing methods and the parallels with the pdf-ratio method are analysed. The paper provides a new, information-theoretical justification for one version of the pdf-ratio method that turns out to be equivalent to the MRE-update. All other methods result in sets of ensemble weights that, seen from the information-theoretical perspective, add either too little or too much (i.e. fictitious) information to the ensemble.
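
    For a single mean constraint, minimizing relative entropy to uniform weights has the classic exponential-tilting solution. A sketch that solves for the tilting parameter by bisection (a generic illustration of the principle, not the paper's exact formulation):

```python
import math

def mre_weights(values, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Ensemble weights w_i that minimize relative entropy to uniform
    weights subject to sum_i w_i * values[i] = target_mean.
    Solution: w_i proportional to exp(lam * values[i]); lam by bisection."""
    def tilted_mean(lam):
        w = [math.exp(lam * v) for v in values]
        s = sum(w)
        return sum(wi * v for wi, v in zip(w, values)) / s
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if tilted_mean(mid) < target_mean:  # tilted mean increases with lam
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * v) for v in values]
    s = sum(w)
    return [wi / s for wi in w]

# shift a 3-member toy ensemble's mean from 1.0 to 1.2 without editing members
print(mre_weights([0.0, 1.0, 2.0], 1.2))
```

    The target mean must lie strictly inside the range of ensemble values for the bisection to converge.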

  17. A two-stage method of quantitative flood risk analysis for reservoir real-time operation using ensemble-based hydrologic forecasts

    NASA Astrophysics Data System (ADS)

    Liu, P.

    2013-12-01

    Quantitative analysis of the risk of reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts directly depict the inflows, capturing not only the marginal distributions but also their persistence via scenarios. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed by using the forecast horizon point to divide the future time into two stages, the forecast lead time and the unpredicted time. The risk within the forecast lead time is computed by counting the number of failing forecast scenarios, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risk by defining the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where the parameter and precipitation uncertainties are incorporated to produce ensemble-based hydrologic forecasts. Bayesian inference via Markov chain Monte Carlo is used to account for the parameter uncertainty. Two reservoir operation schemes, the actually operated and the scenario-optimized, are evaluated in terms of flood risk and hydropower profit. For the 2010 flood, it is found that improving hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operation risk, and most risks arise within the forecast lead time. It is therefore valuable to decrease the variance of ensemble-based hydrologic forecasts, with less bias, for reservoir operational purposes.
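
    The lead-time stage of the risk computation reduces to counting failing scenarios. A minimal sketch (scenario values and the critical level are hypothetical):

```python
def flood_risk(scenarios, critical_level):
    """Lead-time flood risk: fraction of forecast scenarios whose peak
    water level exceeds the critical value."""
    failures = sum(1 for s in scenarios if max(s) > critical_level)
    return failures / len(scenarios)

# four hypothetical ensemble scenarios of water levels over the lead time
scenarios = [[1.0, 2.0], [3.0, 5.0], [0.0, 1.0], [6.0, 2.0]]
print(flood_risk(scenarios, critical_level=4.0))
```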

  18. On Manpower Forecasting. Methods for Manpower Analysis, No.2.

    ERIC Educational Resources Information Center

    Morton, J.E.

    Some of the problems and techniques involved in manpower forecasting are discussed. This non-technical introduction to the field aims at reducing fears of data manipulation methods and at increasing respect for conceptual, logical, and analytical issues. The major approaches to manpower forecasting are explicated and evaluated under the headings:…

  19. A Comparison of Conventional Linear Regression Methods and Neural Networks for Forecasting Educational Spending.

    ERIC Educational Resources Information Center

    Baker, Bruce D.; Richards, Craig E.

    1999-01-01

    Applies neural network methods for forecasting 1991-95 per-pupil expenditures in U.S. public elementary and secondary schools. Forecasting models included the National Center for Education Statistics' multivariate regression model and three neural architectures. Regarding prediction accuracy, neural network results were comparable or superior to…

  20. Educational Forecasting Methodologies: State of the Art, Trends, and Highlights.

    ERIC Educational Resources Information Center

    Hudson, Barclay; Bruno, James

    This overview of both quantitative and qualitative methods of educational forecasting is introduced by a discussion of a general typology of forecasting methods. In each of the following sections, discussion follows the same general format: a number of basic approaches are identified (e.g. extrapolation, correlation, systems modelling), and each…

  1. Techniques for Forecasting Air Passenger Traffic

    NASA Technical Reports Server (NTRS)

    Taneja, N.

    1972-01-01

    The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.

  2. Verification of operational solar flare forecast: Case of Regional Warning Center Japan

    NASA Astrophysics Data System (ADS)

    Kubo, Yûki; Den, Mitsue; Ishii, Mamoru

    2017-08-01

    In this article, we discuss a verification study of an operational solar flare forecast in the Regional Warning Center (RWC) Japan. The RWC Japan has been issuing four-categorical deterministic solar flare forecasts for a long time. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated some conventional scalar verification measures with 95% confidence intervals, as well as a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and the recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitely that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose to use a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecasts, the proposed set of verification measures comprises frequency bias for bias, proportion correct and critical success index for accuracy, probability of detection for discrimination, false alarm ratio for reliability, Peirce skill score for forecast skill, and symmetric extremal dependence index for association. For multi-categorical forecasts, we propose marginal distributions of forecast and observation for bias, proportion correct for accuracy, correlation coefficient and joint probability distribution for association, the likelihood distribution for discrimination, the calibration distribution for reliability and resolution, and the Gandin-Murphy-Gerrity score and judgment skill score for skill.
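    The dichotomous measures named above all derive from a 2x2 contingency table of hits, misses, false alarms, and correct negatives. A minimal sketch with made-up counts (the function name is an assumption; the formulas are the standard ones):

```python
def dichotomous_scores(hits, misses, false_alarms, correct_negatives):
    """Standard 2x2 contingency-table verification measures."""
    a, c, b, d = hits, misses, false_alarms, correct_negatives
    n = a + b + c + d
    return {
        "frequency_bias": (a + b) / (a + c),   # bias: forecast yes / observed yes
        "proportion_correct": (a + d) / n,     # accuracy
        "csi": a / (a + b + c),                # critical success index
        "pod": a / (a + c),                    # probability of detection
        "far": b / (a + b),                    # false alarm ratio
        "peirce": a / (a + c) - b / (b + d),   # Peirce skill score (POD - POFD)
    }

# Hypothetical 100-event verification sample
print(dichotomous_scores(hits=30, misses=10, false_alarms=20, correct_negatives=40))
```

    A frequency bias of 1 means the event is forecast as often as it occurs; the Peirce skill score is 0 for unskilled (random or constant) forecasts and 1 for perfect ones.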

  3. Flare forecasting at the Met Office Space Weather Operations Centre

    NASA Astrophysics Data System (ADS)

    Murray, S. A.; Bingham, S.; Sharpe, M.; Jackson, D. R.

    2017-04-01

    The Met Office Space Weather Operations Centre produces 24/7/365 space weather guidance, alerts, and forecasts to a wide range of government and commercial end-users across the United Kingdom. Solar flare forecasts are one of its products, which are issued multiple times a day in two forms: forecasts for each active region on the solar disk over the next 24 h and full-disk forecasts for the next 4 days. Here the forecasting process is described in detail, as well as first verification of archived forecasts using methods commonly used in operational weather prediction. Real-time verification available for operational flare forecasting use is also described. The influence of human forecasters is highlighted, with human-edited forecasts outperforming original model results and forecasting skill decreasing over longer forecast lead times.

  4. The Cooperative VAS Program with the Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Diak, George R.; Menzel, W. Paul

    1988-01-01

    Work was divided between the analysis/forecast model development and evaluation of the impact of satellite data in mesoscale numerical weather prediction (NWP), development of the Multispectral Atmospheric Mapping Sensor (MAMS), and other related research. The Cooperative Institute for Meteorological Satellite Studies (CIMSS) Synoptic Scale Model (SSM) has progressed from a relatively basic analysis/forecast system to a package which includes such features as nonlinear vertical mode initialization, comprehensive Planetary Boundary Layer (PBL) physics, and the core of a fully four-dimensional data assimilation package. The MAMS effort has produced a calibrated visible and infrared sensor that produces imagery at high spatial resolution. The MAMS was developed in order to study small scale atmospheric moisture variability, to monitor and classify clouds, and to investigate the role of surface characteristics in the production of clouds, precipitation, and severe storms.

  5. Wind wave prediction in shallow water: Theory and applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cavaleri, L.; Rizzoli, P.M.

    1981-11-20

    A wind wave forecasting model is described, based upon the ray technique, which is specifically designed for shallow water areas. The model explicitly includes wave generation, refraction, and shoaling, while nonlinear dissipative processes (breaking and bottom friction) are introduced through a suitable parametrization. The forecast is provided at a specified time and target position, in terms of a directional spectrum, from which the one-dimensional spectrum and the significant wave height are derived. The model has been used to hindcast storms both in shallow water (Northern Adriatic Sea) and in deep water conditions (Tyrrhenian Sea). The results have been compared with local measurements, and the rms error for the significant wave height is between 10 and 20%. A major problem has been found in the correct evaluation of the wind field.

  6. Iterative near-term ecological forecasting: Needs, opportunities, and challenges

    USGS Publications Warehouse

    Dietze, Michael C.; Fox, Andrew; Beck-Johnson, Lindsay; Betancourt, Julio L.; Hooten, Mevin B.; Jarnevich, Catherine S.; Keitt, Timothy H.; Kenney, Melissa A.; Laney, Christine M.; Larsen, Laurel G.; Loescher, Henry W.; Lunch, Claire K.; Pijanowski, Bryan; Randerson, James T.; Read, Emily; Tredennick, Andrew T.; Vargas, Rodrigo; Weathers, Kathleen C.; White, Ethan P.

    2018-01-01

    Two foundational questions about sustainability are “How are ecosystems and the services they provide going to change in the future?” and “How do human decisions affect these trajectories?” Answering these questions requires an ability to forecast ecological processes. Unfortunately, most ecological forecasts focus on centennial-scale climate responses, therefore neither meeting the needs of near-term (daily to decadal) environmental decision-making nor allowing comparison of specific, quantitative predictions to new observational data, one of the strongest tests of scientific theory. Near-term forecasts provide the opportunity to iteratively cycle between performing analyses and updating predictions in light of new evidence. This iterative process of gaining feedback, building experience, and correcting models and methods is critical for improving forecasts. Iterative, near-term forecasting will accelerate ecological research, make it more relevant to society, and inform sustainable decision-making under high uncertainty and adaptive management. Here, we identify the immediate scientific and societal needs, opportunities, and challenges for iterative near-term ecological forecasting. Over the past decade, data volume, variety, and accessibility have greatly increased, but challenges remain in interoperability, latency, and uncertainty quantification. Similarly, ecologists have made considerable advances in applying computational, informatic, and statistical methods, but opportunities exist for improving forecast-specific theory, methods, and cyberinfrastructure. Effective forecasting will also require changes in scientific training, culture, and institutions. The need to start forecasting is now; the time for making ecology more predictive is here, and learning by doing is the fastest route to drive the science forward.

  7. Iterative near-term ecological forecasting: Needs, opportunities, and challenges.

    PubMed

    Dietze, Michael C; Fox, Andrew; Beck-Johnson, Lindsay M; Betancourt, Julio L; Hooten, Mevin B; Jarnevich, Catherine S; Keitt, Timothy H; Kenney, Melissa A; Laney, Christine M; Larsen, Laurel G; Loescher, Henry W; Lunch, Claire K; Pijanowski, Bryan C; Randerson, James T; Read, Emily K; Tredennick, Andrew T; Vargas, Rodrigo; Weathers, Kathleen C; White, Ethan P

    2018-02-13

    Two foundational questions about sustainability are "How are ecosystems and the services they provide going to change in the future?" and "How do human decisions affect these trajectories?" Answering these questions requires an ability to forecast ecological processes. Unfortunately, most ecological forecasts focus on centennial-scale climate responses, therefore neither meeting the needs of near-term (daily to decadal) environmental decision-making nor allowing comparison of specific, quantitative predictions to new observational data, one of the strongest tests of scientific theory. Near-term forecasts provide the opportunity to iteratively cycle between performing analyses and updating predictions in light of new evidence. This iterative process of gaining feedback, building experience, and correcting models and methods is critical for improving forecasts. Iterative, near-term forecasting will accelerate ecological research, make it more relevant to society, and inform sustainable decision-making under high uncertainty and adaptive management. Here, we identify the immediate scientific and societal needs, opportunities, and challenges for iterative near-term ecological forecasting. Over the past decade, data volume, variety, and accessibility have greatly increased, but challenges remain in interoperability, latency, and uncertainty quantification. Similarly, ecologists have made considerable advances in applying computational, informatic, and statistical methods, but opportunities exist for improving forecast-specific theory, methods, and cyberinfrastructure. Effective forecasting will also require changes in scientific training, culture, and institutions. The need to start forecasting is now; the time for making ecology more predictive is here, and learning by doing is the fastest route to drive the science forward.

  8. Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS)

    NASA Astrophysics Data System (ADS)

    OConnor, A.; Kirtman, B. P.; Harrison, S.; Gorman, J.

    2016-02-01

    Current US Navy forecasting systems cannot easily incorporate extended-range forecasts that can improve mission readiness and effectiveness; ensure safety; and reduce cost, labor, and resource requirements. If Navy operational planners had systems that incorporated these forecasts, they could plan missions using more reliable and longer-term weather and climate predictions. Further, using multi-model forecast ensembles instead of single forecasts would produce higher predictive performance. Extended-range multi-model forecast ensembles, such as those available in the North American Multi-Model Ensemble (NMME), are ideal for system integration because of their high-skill predictions; however, even higher-skill predictions can be produced if forecast model ensembles are combined correctly. While many methods for weighting models exist, selecting the best method in a given environment requires expert knowledge of the models and combination methods. We present an innovative approach that uses machine learning to combine extended-range predictions from multi-model forecast ensembles and generate a probabilistic forecast for any region of the globe up to 12 months in advance. Our machine-learning approach uses 30 years of hindcast predictions to learn patterns of forecast model successes and failures. Each model is assigned a weight for each environmental condition, 100 km2 region, and day given any expected environmental information. These weights are then applied to the respective predictions for the region and time of interest to effectively stitch together a single, coherent probabilistic forecast. Our experimental results demonstrate the benefits of our approach to produce extended-range probabilistic forecasts for regions and time periods of interest that are superior, in terms of skill, to individual NMME forecast models and commonly weighted models.
    The probabilistic forecast leverages the strengths of three NMME forecast models to predict environmental conditions for an area spanning from San Diego, CA to Honolulu, HI, seven months in advance. Key findings include: weighted combinations of models are strictly better than individual models; machine-learned combinations are better still; and forecasts produced using our approach most often have the highest rank probability skill score.
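    The paper's learned, condition-dependent weighting is not specified in the abstract; a much simpler stand-in that conveys the idea of weighting models by hindcast performance is inverse-MSE weighting (function names and toy data are illustrative assumptions):

```python
import numpy as np

def skill_weights(hindcasts, obs):
    """Weight each model by its inverse mean-squared hindcast error,
    a simplified stand-in for a learned model-weighting scheme."""
    hindcasts = np.asarray(hindcasts, float)   # shape (n_models, n_times)
    mse = ((hindcasts - np.asarray(obs, float)) ** 2).mean(axis=1)
    w = 1.0 / mse
    return w / w.sum()                         # normalize to sum to 1

def combine(forecasts, weights):
    """Weighted combination of the models' current forecasts."""
    return float(np.dot(weights, forecasts))

# Three hypothetical models' hindcasts against observations
hind = [[1.0, 2.0, 3.0], [1.5, 2.5, 3.5], [0.5, 1.5, 2.5]]
obs = [1.1, 2.0, 2.9]
w = skill_weights(hind, obs)
print(combine([3.2, 3.7, 2.7], w))
```

    The real system learns separate weights per environmental condition, region, and day, so the weighting adapts to the situation rather than being a single global vector as here.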

  9. Approaches in Health Human Resource Forecasting: A Roadmap for Improvement

    PubMed Central

    Rafiei, Sima; Mohebbifar, Rafat; Hashemi, Fariba; Ezzatabadi, Mohammad Ranjbar; Farzianpour, Fereshteh

    2016-01-01

    Introduction: Forecasting the demand for and supply of health manpower in an accurate manner makes appropriate planning possible. The aim of this paper was to review approaches and methods for health manpower forecasting and consequently propose the features that improve the effectiveness of this important process of health manpower planning. Methods: A literature review was conducted for studies published in English from 1990–2014 using the PubMed, ScienceDirect, ProQuest, and Google Scholar databases. Review articles, qualitative studies, and retrospective and prospective studies describing or applying various types of forecasting approaches and methods in health manpower forecasting were included in the review. The authors designed an extraction data sheet based on the study questions to collect data on studies’ references, designs, and types of forecasting approaches, whether discussed or applied, with their strengths and weaknesses. Results: Forty studies were included in the review. Two main categories of approaches (conceptual and analytical) for health manpower forecasting were identified. Each approach had several strengths and weaknesses. As a whole, most of them faced challenges such as being static and unable to capture dynamic variables and causal relationships in manpower forecasting. They also lacked the capacity to benefit from scenario making to assist policy makers in effective decision making. Conclusions: An effective forecasting approach should resolve the deficits that exist in current approaches and meet the key features found in the literature, in order to develop the open-system, dynamic, and comprehensive method necessary for today's complex health care systems. PMID:27790343

  10. Using Meteorological Analogues for Reordering Postprocessed Precipitation Ensembles in Hydrological Forecasting

    NASA Astrophysics Data System (ADS)

    Bellier, Joseph; Bontron, Guillaume; Zin, Isabella

    2017-12-01

    Meteorological ensemble forecasts are nowadays widely used as input to hydrological models for probabilistic streamflow forecasting. These forcings are frequently biased and have to be statistically postprocessed, most of the time using univariate techniques that apply independently to individual locations, lead times, and weather variables. Postprocessed ensemble forecasts therefore need to be reordered so as to reconstruct suitable multivariate dependence structures. The Schaake shuffle and ensemble copula coupling are the two most popular methods for this purpose. This paper proposes two adaptations of them that make use of meteorological analogues for reconstructing spatiotemporal dependence structures of precipitation forecasts. Performances of the original and adapted techniques are compared through a multistep verification experiment using real forecasts from the European Centre for Medium-Range Weather Forecasts. This experiment evaluates not only multivariate precipitation forecasts but also the corresponding streamflow forecasts derived from hydrological modeling. Results show that the relative performances of the different reordering methods vary depending on the verification step. In particular, the standard Schaake shuffle is found to perform poorly when evaluated on streamflow. This emphasizes the crucial role of the precipitation spatiotemporal dependence structure in hydrological ensemble forecasting.
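    The core of the Schaake shuffle is to reorder the postprocessed ensemble values so that their rank pattern matches that of a historical template. A minimal one-dimensional sketch (in practice the same template indices are applied jointly across locations, lead times, and variables; names and toy values are illustrative):

```python
import numpy as np

def schaake_shuffle(ensemble, template):
    """Reorder postprocessed ensemble values so their ranks match the
    ranks of a historical template trajectory (Schaake shuffle).

    ensemble, template: 1-D arrays of equal length (n_members,)."""
    sorted_vals = np.sort(np.asarray(ensemble, float))
    ranks = np.argsort(np.argsort(template))  # rank of each template value
    return sorted_vals[ranks]

ens = [5.0, 1.0, 3.0]              # postprocessed (unordered) members
tmpl = [10.0, 30.0, 20.0]          # template ranks: lowest, highest, middle
print(schaake_shuffle(ens, tmpl))  # -> [1. 5. 3.]
```

    Because the same historical dates supply the templates everywhere, the shuffled members inherit the observed spatiotemporal correlation structure; the paper's adaptation selects those dates by meteorological analogy rather than at random.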

  11. Fuzzy State Transition and Kalman Filter Applied in Short-Term Traffic Flow Forecasting

    PubMed Central

    Ming-jun, Deng; Shi-ru, Qu

    2015-01-01

    Traffic flow is widely recognized as an important parameter for road traffic state forecasting. Fuzzy state transition and the Kalman filter (KF) have been applied in this field separately. Studies show that the former method performs well in forecasting the trend of traffic state variation but always involves several numerical errors, while the latter is good at numerical forecasting but deficient in expressing time hysteresis. This paper proposes an approach that combines the fuzzy state transition and KF forecasting models. To exploit the advantages of both models, a weighted combination model is proposed, in which the combination weight is optimized dynamically by minimizing the sum of squared forecasting errors. Real detection data are used to test the efficiency. Results indicate that the method performs well in short-term traffic forecasting. PMID:26779258

  12. Fuzzy State Transition and Kalman Filter Applied in Short-Term Traffic Flow Forecasting.

    PubMed

    Deng, Ming-jun; Qu, Shi-ru

    2015-01-01

    Traffic flow is widely recognized as an important parameter for road traffic state forecasting. Fuzzy state transition and the Kalman filter (KF) have been applied in this field separately. Studies show that the former method performs well in forecasting the trend of traffic state variation but always involves several numerical errors, while the latter is good at numerical forecasting but deficient in expressing time hysteresis. This paper proposes an approach that combines the fuzzy state transition and KF forecasting models. To exploit the advantages of both models, a weighted combination model is proposed, in which the combination weight is optimized dynamically by minimizing the sum of squared forecasting errors. Real detection data are used to test the efficiency. Results indicate that the method performs well in short-term traffic forecasting.
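    Minimizing the sum of squared errors of a two-forecast combination w*f1 + (1-w)*f2 has a closed-form least-squares solution. A sketch under that simplification (the paper updates the weight dynamically; the toy forecasts and names here are illustrative assumptions):

```python
import numpy as np

def optimal_weight(f1, f2, obs):
    """Closed-form w minimizing sum((obs - (w*f1 + (1-w)*f2))**2),
    a static least-squares sketch of dynamic weight combination."""
    f1, f2, obs = (np.asarray(x, float) for x in (f1, f2, obs))
    d = f1 - f2
    w = float(np.dot(obs - f2, d) / np.dot(d, d))
    return min(max(w, 0.0), 1.0)       # keep the weight in [0, 1]

fuzzy = [10.0, 12.0, 11.0]    # trend-oriented forecast (hypothetical)
kf    = [ 9.0, 13.0, 12.5]    # numerically oriented forecast (hypothetical)
obs   = [ 9.5, 12.5, 12.0]
w = optimal_weight(fuzzy, kf, obs)
combined = w * np.asarray(fuzzy) + (1 - w) * np.asarray(kf)
```

    On the fitting window the combined forecast can be no worse than either member; a dynamic scheme re-estimates w over a sliding window as new detections arrive.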

  13. A technique for determining viable military logistics support alternatives

    NASA Astrophysics Data System (ADS)

    Hester, Jesse Stuart

    Today's US military operates well beyond the scope of protecting and defending the United States. Its operations now include, but are not limited to, humanitarian aid, disaster relief, peacekeeping, and conflict resolution. This broad spectrum of operational environments has necessitated a transformation of the individual military services into a hybrid force that attempts to leverage the inherent and emerging capabilities and strengths of all those under the umbrella of the Department of Defense (DOD); this concept has been coined Joint Operations. Supporting Joint Operations requires a new approach to determining a viable military logistics support system. The logistics architecture for these operations has to accommodate scale, time, varied mission objectives, and imperfect information. Compounding the problem is the human-in-the-loop (HITL) decision maker (DM), who is a necessary component for quickly assessing and planning logistics support activities. Past outcomes are not necessarily good indicators of future results, but they can provide a reasonable starting point for planning and predicting specific future requirements. Forecasting the logistical support structure and commodities needed for any resource-intensive environment has progressed well beyond stable-demand assumptions to approaches in which dynamic and nonlinear environments can be captured with some degree of fidelity and accuracy. While these advances are important, no holistic approach exists that allows exploration of the operational environment or design space to guide the military logistician, in a methodical way, in supporting military forecasting activities. To bridge this capability gap, a method called Adaptive Technique for Logistics Architecture Solutions (ATLAS) has been developed. This method provides a process that facilitates the use of techniques and tools that filter and provide relevant information to the DM.
    By doing so, a justifiable course of action (COA) can be determined from the variety of quantitative and qualitative information available. This thesis describes and applies the ATLAS method to a notional military scenario involving the Navy concept of Seabasing and the Marine Corps concept of Distributed Operations, applied to a platoon-sized element. The small force is tasked to conduct deterrence and combat operations over a seven-day period. This work uses modeling and simulation to incorporate expert opinion and knowledge of military operations, dynamic reasoning methods, and certainty analysis to create a decision support system (DSS) that gives the DM an enhanced view of the logistics environment and uses variables that impact specific measures of effectiveness. Applying the ATLAS method enables the DM to conduct logistics planning and execution more efficiently and quickly, by providing relevant data that can be applied to dynamic forecasting activities for the platoon and that aid in determining the support architecture needed to fulfill the forecasted need.

  14. The total probabilities from high-resolution ensemble forecasting of floods

    NASA Astrophysics Data System (ADS)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2015-04-01

    Ensemble forecasting has for a long time been used in meteorological modelling, to give an indication of the uncertainty of the forecasts. As meteorological ensemble forecasts often show some bias and dispersion errors, there is a need for calibration and post-processing of the ensembles. Typical methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these methods (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). To make optimal predictions of floods along the stream network in hydrology, we can easily use the ensemble members as input to the hydrological models. However, some of the post-processing methods will need modifications when regionalizing the forecasts outside the calibration locations, as done by Hemri et al. (2013). We present a method for spatial regionalization of the post-processed forecasts based on EMOS and top-kriging (Skøien et al., 2006). We will also look into different methods for handling the non-normality of runoff and the effect on forecasts skills in general and for floods in particular. Berrocal, V. J., Raftery, A. E. and Gneiting, T.: Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts, Mon. Weather Rev., 135(4), 1386-1402, doi:10.1175/MWR3341.1, 2007. Gneiting, T., Raftery, A. E., Westveld, A. H. and Goldman, T.: Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation, Mon. Weather Rev., 133(5), 1098-1118, doi:10.1175/MWR2904.1, 2005. Hemri, S., Fundel, F. and Zappa, M.: Simultaneous calibration of ensemble river flow predictions over an entire range of lead times, Water Resour. Res., 49(10), 6744-6755, doi:10.1002/wrcr.20542, 2013. Raftery, A. E., Gneiting, T., Balabdaoui, F. and Polakowski, M.: Using Bayesian Model Averaging to Calibrate Forecast Ensembles, Mon. 
Weather Rev., 133(5), 1155-1174, doi:10.1175/MWR2906.1, 2005. Skøien, J. O., Merz, R. and Blöschl, G.: Top-kriging - Geostatistics on stream networks, Hydrol. Earth Syst. Sci., 10(2), 277-287, 2006.
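    The EMOS model cited above (Gneiting et al., 2005) issues a single predictive Gaussian whose mean and variance are affine functions of the ensemble statistics. A minimal sketch with assumed, pre-fitted coefficients (in practice a, b, c, d are estimated by minimum-CRPS optimization over a training period):

```python
import numpy as np

def emos_predictive(members, a, b, c, d):
    """EMOS predictive Gaussian for one forecast case:
    mean  = a + b * ensemble mean
    var   = c + d * ensemble variance
    Coefficients are assumed already fitted (e.g. by minimum CRPS)."""
    members = np.asarray(members, float)
    mu = a + b * members.mean()
    sigma2 = c + d * members.var()
    return float(mu), float(np.sqrt(sigma2))

# Hypothetical 4-member ensemble and illustrative coefficients
mu, sigma = emos_predictive([2.1, 2.6, 1.9, 2.4], a=0.1, b=1.0, c=0.05, d=0.8)
print(mu, sigma)
```

    The spread-dependent variance term (d > 0) is what lets EMOS correct underdispersion; regionalized variants as discussed in the abstract make the coefficients vary in space, e.g. via top-kriging on the stream network.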

  15. Action-based flood forecasting for triggering humanitarian action

    NASA Astrophysics Data System (ADS)

    Coughlan de Perez, Erin; van den Hurk, Bart; van Aalst, Maarten K.; Amuron, Irene; Bamanya, Deus; Hauser, Tristan; Jongma, Brenden; Lopez, Ana; Mason, Simon; Mendler de Suarez, Janot; Pappenberger, Florian; Rueth, Alexandra; Stephens, Elisabeth; Suarez, Pablo; Wagemaker, Jurjen; Zsoter, Ervin

    2016-09-01

    Too often, credible scientific early warning information of increased disaster risk does not result in humanitarian action. With financial resources tilted heavily towards response after a disaster, disaster managers have limited incentive and ability to process complex scientific data, including uncertainties. These incentives are beginning to change, with the advent of several new forecast-based financing systems that provide funding based on a forecast of an extreme event. Given the changing landscape, here we demonstrate a method to select and use appropriate forecasts for specific humanitarian disaster prevention actions, even in a data-scarce location. This action-based forecasting methodology takes into account the parameters of each action, such as action lifetime, when verifying a forecast. Forecasts are linked with action based on an understanding of (1) the magnitude of previous flooding events and (2) the willingness to act "in vain" for specific actions. This is applied in the context of the Uganda Red Cross Society forecast-based financing pilot project, with forecasts from the Global Flood Awareness System (GloFAS). Using this method, we define the "danger level" of flooding, and we select the probabilistic forecast triggers that are appropriate for specific actions. Results from this methodology can be applied globally across hazards and fed into a financing system that ensures that automatic, pre-funded early action will be triggered by forecasts.
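    The trigger logic described above reduces to comparing the forecast exceedance probability against an action-specific threshold chosen from the acceptable probability of acting "in vain". A minimal sketch (function name, toy discharge values, and thresholds are illustrative assumptions):

```python
def should_trigger(ensemble, danger_level, trigger_probability):
    """Trigger a pre-funded action when the forecast probability of
    exceeding the danger level reaches the action-specific trigger."""
    n_exceed = sum(1 for member in ensemble if member >= danger_level)
    prob = n_exceed / len(ensemble)
    return prob >= trigger_probability, prob

# Toy 5-member flood forecast (hypothetical discharge values);
# a real GloFAS forecast would have 51 members.
fire, prob = should_trigger([950, 1200, 1100, 800, 1300],
                            danger_level=1000, trigger_probability=0.5)
print(fire, prob)
```

    Actions with a long lifetime or low cost tolerate a lower trigger probability (more frequent acting in vain); short-lived or expensive actions demand a higher one.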

  16. Forecasting Error Calculation with Mean Absolute Deviation and Mean Absolute Percentage Error

    NASA Astrophysics Data System (ADS)

    Khair, Ummul; Fahmi, Hasanul; Hakim, Sarudin Al; Rahim, Robbi

    2017-12-01

    Prediction using a forecasting method is one of the most important activities for an organization. Selecting an appropriate forecasting method is important, but the method's percentage error is more important if decision makers are to act on the forecasts. Using the Mean Absolute Deviation and the Mean Absolute Percentage Error to quantify the error of the least squares method yielded a percentage error of 9.77%, and it was decided that the least squares method is workable for time series and trend data.
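    The two error measures named above are simple averages over the forecast errors. A minimal sketch with made-up values (MAPE assumes nonzero actuals):

```python
def mad(actual, forecast):
    """Mean Absolute Deviation of the forecast errors."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent."""
    return 100.0 * sum(abs((a - f) / a)
                       for a, f in zip(actual, forecast)) / len(actual)

actual   = [100.0, 110.0, 120.0]
forecast = [ 98.0, 113.0, 118.0]
print(mad(actual, forecast))    # (2 + 3 + 2) / 3 ≈ 2.333
print(mape(actual, forecast))   # ≈ 2.13 (percent)
```

    MAD keeps the units of the data, while MAPE is scale-free and so is the more natural measure for comparing forecasting methods across series of different magnitudes, as done in the study.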

  17. An Integrated Crustal Dynamics Simulator

    NASA Astrophysics Data System (ADS)

    Xing, H. L.; Mora, P.

    2007-12-01

    Numerical modelling offers an outstanding opportunity to gain an understanding of crustal dynamics and complex crustal system behaviour. This presentation describes our long-term and ongoing effort in finite-element-based computational model and software development to simulate interacting fault systems for earthquake forecasting. An R-minimum-strategy-based finite element computational model and software tool, PANDAS, for modelling 3-dimensional nonlinear frictional contact behaviour between multiple deformable bodies with an arbitrarily-shaped contact element strategy has been developed by the authors. It builds a virtual laboratory to simulate interacting fault systems including crustal boundary conditions and various nonlinearities (e.g. from frictional contact, materials, geometry, and thermal coupling). It has been successfully applied to large-scale computation of complex nonlinear phenomena in non-continuum media involving nonlinear frictional instability, multiple material properties, and complex geometries on supercomputers, such as the South Australia (SA) interacting fault system, the Southern California fault model, and the Sumatra subduction model. It has also been extended to simulate the hot fractured rock (HFR) geothermal reservoir system, in collaboration with Geodynamics Ltd, which is constructing the first geothermal reservoir system in Australia, and to model tsunami generation induced by earthquakes. Both are supported by the Australian Research Council.

  18. Episode forecasting in bipolar disorder: Is energy better than mood?

    PubMed

    Ortiz, Abigail; Bradler, Kamil; Hintze, Arend

    2018-01-22

    Bipolar disorder is a severe mood disorder characterized by alternating episodes of mania and depression. Several interventions have been developed to decrease the high admission rates and high suicide rates associated with the illness, including psychoeducation and early episode detection, with mixed results. More recently, machine learning approaches have been used to aid clinical diagnosis or to detect a particular clinical state; however, results are contradictory, with confusion around which of the many automatically generated variables contribute most to detecting a particular clinical state. Our aim for this study was to apply machine learning techniques and nonlinear analyses to a physiological time series dataset in order to find the best predictor for forecasting episodes in mood disorders. We employed three different techniques: entropy calculations and two different machine learning approaches (genetic programming and Markov Brains as classifiers) to determine whether mood, energy, or sleep was the best predictor for forecasting a mood episode in a physiological time series. Evening energy was the best predictor for both manic and depressive episodes in each of the three aforementioned techniques. This suggests that energy might be a better predictor than mood for forecasting mood episodes in bipolar disorder and that these particular machine learning approaches are valuable tools for clinical use. Energy should be considered an important factor for episode prediction. Machine learning approaches provide better tools to forecast episodes and to increase our understanding of the processes that underlie mood regulation. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  19. A review of multimodel superensemble forecasting for weather, seasonal climate, and hurricanes

    NASA Astrophysics Data System (ADS)

    Krishnamurti, T. N.; Kumar, V.; Simon, A.; Bhardwaj, A.; Ghosh, T.; Ross, R.

    2016-06-01

    This review provides a summary of work in the area of ensemble forecasts for weather, climate, oceans, and hurricanes. This includes a combination of multiple forecast model results that does not dwell on the ensemble mean but uses a unique collective bias reduction procedure. A theoretical framework for this procedure is provided, utilizing a suite of models constructed from the well-known Lorenz low-order nonlinear system. A tutorial, including a walk-through table, illustrates the inner workings of the multimodel superensemble principle. Systematic errors in a single deterministic model arise from a host of features that range from the model's initial state (data assimilation), resolution, representation of physics, dynamics, and ocean processes, to local aspects of orography, water bodies, and details of the land surface. Models, in their diversity of representation of such features, end up leaving unique signatures of systematic errors. The multimodel superensemble utilizes as many as 10 million weights to take into account the bias errors arising from these diverse features of the member models. The design of a single deterministic forecast model that utilizes multiple features derived from this large volume of weights is provided here. This has led to a better understanding of error growth and collective bias reduction for several of the physical parameterizations within diverse models, such as cumulus convection, planetary boundary layer physics, and radiative transfer. A number of examples of weather, seasonal climate, hurricane, and subsurface oceanic forecast skills of member models, the ensemble mean, and the superensemble are provided.
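
    The core superensemble idea — regress the observations on the member-model forecasts over a training period, then apply the fitted weights in the forecast period — can be sketched in a few lines. The two "member models", their biases, and the data below are all synthetic; the operational scheme described above fits gridpoint-wise weights (millions of them), not a single global set.

```python
# Minimal sketch of superensemble weighting: least-squares weights beat the
# plain ensemble mean when members have differing biases and noise levels.
import numpy as np

rng = np.random.default_rng(0)
truth_train = rng.normal(size=200)
# Two biased, noisy synthetic "member models"
m1 = 0.6 * truth_train + 0.5 + 0.3 * rng.normal(size=200)
m2 = 1.3 * truth_train - 0.2 + 0.4 * rng.normal(size=200)

# Remove each member's training-period mean (the collective bias step),
# then fit combination weights by least squares.
A = np.column_stack([m1 - m1.mean(), m2 - m2.mean(), np.ones(200)])
w, *_ = np.linalg.lstsq(A, truth_train - truth_train.mean(), rcond=None)

def superensemble(f1, f2):
    return (w[0] * (f1 - m1.mean()) + w[1] * (f2 - m2.mean())
            + w[2] + truth_train.mean())

# Verify on an independent sample against the simple ensemble mean
truth_new = rng.normal(size=100)
f1 = 0.6 * truth_new + 0.5 + 0.3 * rng.normal(size=100)
f2 = 1.3 * truth_new - 0.2 + 0.4 * rng.normal(size=100)
rmse = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))
print(rmse(superensemble(f1, f2), truth_new) < rmse((f1 + f2) / 2, truth_new))
```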

  20. Short-Term foF2 Forecast: Present-Day State of the Art

    NASA Astrophysics Data System (ADS)

    Mikhailov, A. V.; Depuev, V. H.; Depueva, A. H.

    An analysis of the F2-layer short-term forecast problem has been carried out. Both objective and methodological problems prevent reliable F2-layer forecasts from being issued at present. An empirical approach based on statistical methods may be recommended for practical use. A forecast method based on a new aeronomic index (a proxy), AI, has been proposed and tested on 64 selected severe storm events. The method provides acceptable prediction accuracy for both strongly disturbed and quiet conditions. The problems with the prediction of F2-layer quiet-time disturbances, as well as some other unsolved problems, are discussed.

  1. Load forecast method of electric vehicle charging station using SVR based on GA-PSO

    NASA Astrophysics Data System (ADS)

    Lu, Kuan; Sun, Wenxue; Ma, Changhui; Yang, Shenquan; Zhu, Zijian; Zhao, Pengfei; Zhao, Xin; Xu, Nan

    2017-06-01

    This paper presents a Support Vector Regression (SVR) method for electric vehicle (EV) charging station load forecasting based on the genetic algorithm (GA) and particle swarm optimization (PSO). Fuzzy C-Means (FCM) clustering is used to establish similar-day samples. GA is used for global parameter searching and PSO for more accurate local searching. The load forecast is then regressed using SVR. Practical load data from an EV charging station were used to illustrate the proposed method. The results indicate an obvious improvement in forecasting accuracy compared with SVRs based on PSO or GA alone.
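
    The PSO stage of such a scheme can be sketched independently of the SVR itself: a bare-bones particle swarm searches a 2-D hyperparameter space (stand-ins for SVR's C and gamma) for the minimum of a validation-error surface. The error surface below is invented for illustration; in the paper's pipeline it would be the cross-validated SVR error on the FCM similar-day samples.

```python
# Minimal particle swarm optimization over a hypothetical validation-error
# surface with a known minimum at (C, gamma) = (2.0, 0.5).
import random

random.seed(1)

def validation_error(c, g):
    # Hypothetical smooth error surface (stand-in for cross-validated SVR error)
    return (c - 2.0) ** 2 + 4.0 * (g - 0.5) ** 2

def pso(f, n=20, iters=60, w=0.7, c1=1.4, c2=1.4):
    pos = [[random.uniform(0, 5), random.uniform(0, 2)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]                      # personal bests
    gbest = min(pbest, key=lambda p: f(*p))[:]       # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(*pos[i]) < f(*pbest[i]):
                pbest[i] = pos[i][:]
                if f(*pos[i]) < f(*gbest):
                    gbest = pos[i][:]
    return gbest

c_opt, g_opt = pso(validation_error)
print(round(c_opt, 2), round(g_opt, 2))  # should land near (2.0, 0.5)
```

    In the hybrid described above, a GA run would first narrow the search region globally, and PSO would then refine within it; only the PSO refinement is sketched here.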

  2. Assessing the impact of different satellite retrieval methods on forecast available potential energy

    NASA Technical Reports Server (NTRS)

    Whittaker, Linda M.; Horn, Lyle H.

    1990-01-01

    The effects of the inclusion of satellite temperature retrieval data, and of different satellite retrieval methods, on forecasts made with the NASA Goddard Laboratory for Atmospheres (GLA) fourth-order model were investigated using, as the parameter, the available potential energy (APE) in its isentropic form. Calculations of the APE were used to study the differences between the forecast sets, both globally and in the Northern Hemisphere, during the 72-h forecast period. The analysis data sets used for the forecasts included one containing the NESDIS TIROS-N retrievals, one containing the GLA retrievals obtained with the physical inversion method, and a third, without satellite data, used as a control; two data sets, with and without satellite data, were used for verification. For all three data sets, the Northern Hemisphere values of the total APE increased throughout the forecast period, mostly due to an increase in the zonal component, in contrast to the verification sets, which showed a steady level of total APE.

  3. A hybrid group method of data handling with discrete wavelet transform for GDP forecasting

    NASA Astrophysics Data System (ADS)

    Isa, Nadira Mohamed; Shabri, Ani

    2013-09-01

    This study proposes a hybrid model combining the Group Method of Data Handling (GMDH) and the Discrete Wavelet Transform (DWT) for time series forecasting. The objective of this paper is to examine the flexibility of the hybrid GMDH in time series forecasting using Gross Domestic Product (GDP) data. A time series data set is used to demonstrate the effectiveness of the forecasting model; these data are used for forecasting in an application aimed at handling real-life time series. The experiment compares the performance of the hybrid model with that of single models: Wavelet-Linear Regression (WR), Artificial Neural Network (ANN), and conventional GMDH. It is shown that the proposed model can provide a promising alternative technique for GDP forecasting.
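
    The wavelet stage of such a hybrid can be sketched with a one-level Haar DWT, which splits a series into a smooth approximation and a detail component; the hybrid scheme would then model each component separately (the GMDH stage is out of scope here, so a perfect-reconstruction check stands in for it). The GDP figures below are invented, and the abstract does not state which wavelet the authors used.

```python
# One-level Haar discrete wavelet transform and its inverse, in pure Python.
from math import sqrt

def haar_dwt(x):
    """One-level Haar transform; len(x) must be even."""
    approx = [(x[2*i] + x[2*i+1]) / sqrt(2) for i in range(len(x) // 2)]
    detail = [(x[2*i] - x[2*i+1]) / sqrt(2) for i in range(len(x) // 2)]
    return approx, detail

def haar_idwt(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / sqrt(2), (a - d) / sqrt(2)]
    return out

gdp = [100.0, 103.0, 101.0, 107.0, 110.0, 108.0, 115.0, 118.0]  # hypothetical
a, d = haar_dwt(gdp)
rec = haar_idwt(a, d)
print(all(abs(x - y) < 1e-9 for x, y in zip(gdp, rec)))  # → True
```

    The approximation `a` carries the smooth trend and the detail `d` the short-run fluctuations; forecasting each separately and recombining is the usual rationale for wavelet-hybrid forecasters.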

  4. A system approach to the long-term forecasting of climate data in the Baikal region

    NASA Astrophysics Data System (ADS)

    Abasov, N.; Berezhnykh, T.

    2003-04-01

    The Angara river, running from Baikal with a cascade of hydropower plants built on it, plays a peculiar role in the economy of the region. In view of the high variability of water inflow into the rivers and lakes (long-term low-water periods and catastrophic floods), which is due to climatic peculiarities of water resource formation, long-term forecasting is developed and applied to reduce risk at hydropower plants. The methodology and methods of long-term forecasting of natural-climatic processes employ ideas of the research schools of Academician I.P. Druzhinin and Prof. A.P. Reznikhov and consist in detailed investigation of cause-effect relations, identification of physical analogs, and their application to formalized methods of long-term forecasting. The methods are divided into qualitative (the background method; the method of analogs based on solar activity), probabilistic, and approximative methods (analog-similarity relations; a discrete-continuous model). These forecasting methods have been implemented as the analytical tools of the information-forecasting software "GIPSAR", which provides some elements of artificial intelligence. Background forecasts of the runoff of the Ob, Yenisei, and Angara Rivers in the south of Siberia are based on space-time regularities revealed by taking account of the phase shifts in the occurrence of secular maxima and minima on integral-difference curves of many-year hydrological processes in the objects compared. Solar activity plays an essential role in investigations of global variations of climatic processes. Its consideration in the method of superimposed epochs has led to the conclusion that a low-water period in the actual inflow to Lake Baikal is more probable on the increasing branch of the 11-year solar activity cycle. A high-water period is more probable on the decreasing branch of solar activity, from the 2nd to the 5th year after its maximum.
The probabilistic method of forecasting (a year in advance) is based on the property of alternation of series of years with increase and decrease in the observed indicators (characteristic indices) of natural processes. Most of the series (98.4-99.6%) last one to three years. The forecasting problem is divided into two parts: 1) a qualitative forecast of the probability that the current series will either continue or be replaced by a new series during the next year, based on the frequency characteristics of series of years with increase or decrease of the forecasted sequence; 2) a quantitative estimate of the forecasted value in the form of a curve of conditional frequencies, made on the basis of intra-sequence interrelations among hydrometeorological elements, by differentiating them with respect to series of years of increase or decrease, constructing particular curves of conditional frequencies of the runoff for each expected variant of series development, and subsequently constructing a generalized curve. Approximative learning methods form forecast trajectories of the studied process indices for a long-term perspective. The method of analog-similarity relations is based on the fact that long observation periods reveal similarities in the variability of indices for some fragments of the sequence x(t) by definite criteria. The idea of the method is to estimate the similarity of such fragments of the sequence, which are called analogs. The method applies multistage optimization of external parameters (e.g., the number of iterations of sliding averaging needed to decompose the sequence into two components: a smoothed one with isolated periodic oscillations, and a residual, random one). The method is applicable to forecast terms ranging from the current term up to the double solar cycle.
Using a special integration procedure, it selects the terms with the best results for the given optimization subsample. Several optimal parameter vectors are then tested on an examination (verification) subsample. If the procedure is successful, the forecast is made by integrating the several best solutions. Peculiarities of forecasting extreme processes: methods of long-term forecasting allow sufficiently reliable forecasts within the interval [xmin + Δ1, xmax − Δ2], i.e., for medium values of the indices. In intervals close to the extremes, forecast reliability is substantially lower. While for medium values the statistics of the 100-year sequence give acceptable results, owing to a sufficiently large number of revealed analogs corresponding to prognostic samples, for extreme values the situation is quite different, primarily because of the scarcity of statistical data. Decreasing the values Δ1, Δ2 → 0 (by including them among the optimization parameters of the considered forecasting methods) could be one way to improve forecast reliability. This approach has been partially realized in the method of analog-similarity relations, which makes it possible to form a range of possible forecast trajectories in two variants, from the minimum possible trajectory to the maximum possible one. Reliability of long-term forecasts: both the methodology and the methods considered above have been realized in the information-forecasting system "GIPSAR".
The system includes tools implementing several forecasting methods, analysis of initial and forecast information, a developed database, a set of tools for verification of algorithms, additional information on the algorithms for statistical processing of sequences (sliding averaging, integral-difference curves, etc.), aids for organizing the input of initial information (in its various forms), and aids for drawing up output prognostic documents. Risk management: the normal functioning of the Angara cascade is periodically interrupted by risks of two types in the Baikal, Bratsk, and Ust-Ilimsk reservoirs: long low-water periods and sudden periods of extremely high water levels. For example, low-water periods observed in the reservoirs of the Angara cascade can be classified into four risk categories: 1 - acceptable (negligible reduction of electric power generation by hydropower plants; certain difficulty in meeting environmental and navigation requirements); 2 - significant (substantial reduction of electric power generation; certain restrictions on water releases for navigation; violation of environmental requirements in some years); 3 - emergency (large losses in electric power generation; limited electricity supply to large consumers; significant restriction of water releases for navigation; threat of exposure of drinking-water intake works; violation of environmental requirements for a number of years); 4 - catastrophic (energy crisis; social crisis; exposure of drinking-water intake works; termination of navigation; environmental catastrophe). Management of energy systems consists of operative, many-year regulation and perspective planning, and has to take into account the analysis of operative data (water reserves in reservoirs), long-term statistics, relations among natural processes, and forecasts - short-term (for a day, week, or decade), long-term, and/or super-long-term (from a month to several decades).
Such natural processes as water inflow to reservoirs and air temperatures during heating periods depend in turn on external factors: prevailing types of atmospheric circulation, intensity of the 11- and 22-year cycles of solar activity, volcanic activity, interaction between the ocean and atmosphere, etc. Until recently, despite the established scientific schools of long-term forecasting (I.P. Druzhinin, A.P. Reznikhov), energy system management has been based only on specially drawn dispatching schedules and long-term hydrometeorological forecasts, without the use of perspective forecast indices. Inserting a parallel forecast block (based on the analysis of data on natural processes and special forecasting methods) into the scheme can largely smooth unfavorable consequences of the impact of natural processes on the sustainable development of energy systems, and especially on their safe operation. However, the requirements for the reliability and accuracy of long-term forecasts then increase significantly. The considered approach to long-term forecasting can be used to predict mean winter and summer air temperatures, droughts, and forest fires.
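
    The qualitative step of the probabilistic method described above — estimating whether a started series of "up" or "down" years will continue, from the frequency characteristics of past run lengths — can be sketched as follows. The sign record is hypothetical, not Angara inflow data.

```python
# Sketch: probability that a run of k same-sign years continues for at least
# one more year, estimated from the run-length frequencies of the record.
from itertools import groupby

def run_lengths(signs):
    """Lengths of consecutive same-sign runs."""
    return [len(list(g)) for _, g in groupby(signs)]

def p_continue(signs, k):
    """P(run length > k | run length >= k), from run-length frequencies."""
    lengths = run_lengths(signs)
    at_least_k = sum(1 for L in lengths if L >= k)
    longer = sum(1 for L in lengths if L > k)
    return longer / at_least_k if at_least_k else 0.0

# +1 = inflow increased vs. previous year, -1 = decreased (hypothetical record)
signs = [1, 1, -1, 1, -1, -1, -1, 1, 1, 1, -1, 1, -1, -1, 1, 1, -1, -1, -1, 1]
print(round(p_continue(signs, 1), 2))   # chance a 1-year-old run continues
print(round(p_continue(signs, 3), 2))   # no observed run exceeded 3 years
```

    The quantitative step would then condition a frequency curve of the forecast variable on whichever continuation scenario is chosen; only the run statistics are sketched here.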

  5. Insight into the theoretical and experimental studies of 1-phenyl-3-methyl-4-benzoyl-5-pyrazolone N(4)-methyl-N(4)-phenylthiosemicarbazone - A potential NLO material

    NASA Astrophysics Data System (ADS)

    Sangeetha, K. G.; Aravindakshan, K. K.; Safna Hussan, K. P.

    2017-12-01

    The synthesis, geometrical parameters, spectroscopic studies, optimised molecular structure, vibrational analysis, Mulliken population analysis, MEP, NBO, frontier molecular orbitals and NLO effects of 1-phenyl-3-methyl-4-benzoyl-5-pyrazolone N(4)-methyl-N(4)-phenylthiosemicarbazone, C25H23N5OS (L1), are reported in this paper. A combined experimental and theoretical approach was used to explore the structure and properties of the compound. The Gaussian 09 program was used for the computational studies. The starting geometry of the molecule was taken from X-ray refinement data and optimized using the DFT (B3LYP) method with the 6-31+G(d,p) basis set. NBO analysis gave insight into the strongly delocalized structure responsible for the nonlinearity, and hence the stability, of the molecule. Frontier molecular orbitals were used to forecast the global reactivity descriptors of L1. The computed first-order hyperpolarizability (β) of the compound is 2 times higher than that of urea, and this accounts for its nonlinear optical property. In addition, a molecular docking study of the compound was performed using the GLIDE program, with three biological enzymes, histone deacetylase, ribonucleotide reductase and DNA methyltransferase, selected as receptor molecules.

  6. Using statistical and artificial neural network models to forecast potentiometric levels at a deep well in South Texas

    NASA Astrophysics Data System (ADS)

    Uddameri, V.

    2007-01-01

    Reliable forecasts of monthly and quarterly fluctuations in groundwater levels are necessary for short- and medium-term planning and management of aquifers to ensure proper service of seasonal demands within a region. Development of physically based transient mathematical models at this time scale poses considerable challenges due to the lack of suitable data and other uncertainties. Artificial neural networks (ANN) possess flexible mathematical structures and are capable of mapping highly nonlinear relationships. Feed-forward neural network models were constructed and trained using the back-percolation algorithm to forecast monthly and quarterly time-series water levels at a well that taps the deeper Evangeline formation of the Gulf Coast aquifer in Victoria, TX. Unlike in unconfined formations, no causal relationships exist between water levels and hydro-meteorological variables measured in the vicinity of the well. As such, an endogenous forecasting model using dummy variables to capture short-term seasonal fluctuations and longer-term (decadal) trends was constructed. The root mean square error, mean absolute deviation and correlation coefficient (R) were 1.40 m, 0.33 m and 0.77, respectively, for an evaluation dataset of quarterly measurements, and 1.17 m, 0.46 m and 0.88 for a monthly evaluation dataset not used to train or test the model. These statistics were better for the ANN model than for models developed using statistical regression techniques.
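
    The endogenous design described above — dummy variables for seasonality plus a term for the long-term trend — can be sketched with an ordinary least-squares stand-in. The synthetic quarterly water levels, coefficients, and noise level below are invented; the paper's actual model is a feed-forward neural network, not this linear fit.

```python
# Sketch: quarterly dummy variables + linear trend, fit by least squares on
# synthetic water-level data (meters).
import numpy as np

rng = np.random.default_rng(2)
n_quarters = 40
t = np.arange(n_quarters)
season = np.array([0.0, 1.5, 0.8, -1.2])[t % 4]        # seasonal signal (m)
level = 30.0 - 0.05 * t + season + 0.2 * rng.normal(size=n_quarters)

# Design matrix: intercept, decadal trend, and three quarter dummies
# (the fourth quarter is the baseline to avoid collinearity).
X = np.column_stack([np.ones(n_quarters), t,
                     (t % 4 == 0), (t % 4 == 1), (t % 4 == 2)]).astype(float)
beta, *_ = np.linalg.lstsq(X, level, rcond=None)

def forecast(q):
    """Forecast the level for quarter index q from the fitted coefficients."""
    return beta @ np.array([1.0, q, q % 4 == 0, q % 4 == 1, q % 4 == 2],
                           dtype=float)

resid = level - X @ beta
print(round(float(np.sqrt(np.mean(resid ** 2))), 2))   # in-sample RMSE (m)
```

    An ANN would replace the linear map from these inputs to the level, which is what lets it capture the nonlinear response the abstract emphasizes.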

  7. Models for forecasting the flowering of Cornicabra olive groves.

    PubMed

    Rojo, Jesús; Pérez-Badia, Rosa

    2015-11-01

    This study examined the impact of weather-related variables on flowering phenology in the Cornicabra olive tree and constructed models based on linear and Poisson regression to forecast the onset and length of the pre-flowering and flowering phenophases. Spain is the world's leading olive oil producer, and Cornicabra is the second largest Spanish variety in terms of surface area; however, there has been little phenological research into this variety. Phenological observations were made over a 5-year period (2009-2013) at four sampling sites in the province of Toledo (central Spain). Results showed that the onset of the pre-flowering phase is governed largely by temperature: it displayed a positive correlation with temperature at the start of dormancy (November) and a negative correlation with temperature during the months prior to budburst (January, February and March). A similar relationship was recorded for the onset of flowering. Other weather-related variables, including solar radiation and rainfall, also influenced the succession of olive flowering phenophases. Linear models proved the most suitable for forecasting the onset and length of the pre-flowering period and the onset of flowering. The onset and length of pre-flowering can be predicted up to 1 or 2 months prior to budburst, whilst the onset of flowering can be forecast up to 3 months beforehand. By contrast, a nonlinear model using Poisson regression was best suited to predicting the length of the flowering period.
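
    Fitting a Poisson regression like the flowering-length model above reduces to a few Newton-Raphson (IRLS) iterations. The predictor (spring temperature), coefficients, and data below are synthetic illustrations; the study's actual covariates and coefficients are not reproduced here.

```python
# Sketch: Poisson regression of a count (flowering days) on one predictor,
# fit by Newton-Raphson on synthetic data.
import numpy as np

rng = np.random.default_rng(3)
temp = rng.uniform(5.0, 15.0, size=200)            # hypothetical spring temp (°C)
true_beta = np.array([0.5, 0.15])
lam = np.exp(true_beta[0] + true_beta[1] * temp)   # expected flowering days
days = rng.poisson(lam)                            # observed counts

X = np.column_stack([np.ones_like(temp), temp])
beta = np.array([np.log(days.mean()), 0.0])        # safe starting point
for _ in range(25):                                # Newton-Raphson iterations
    mu = np.exp(X @ beta)
    grad = X.T @ (days - mu)                       # score vector
    hess = X.T @ (X * mu[:, None])                 # Fisher information
    beta = beta + np.linalg.solve(hess, grad)

print(np.round(beta, 2))  # should be close to the true (0.5, 0.15)
```

    The log link is what makes the model nonlinear in the sense the abstract contrasts with the linear onset models: a unit change in temperature multiplies, rather than shifts, the expected flowering length.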

  8. Increased performance in the short-term water demand forecasting through the use of a parallel adaptive weighting strategy

    NASA Astrophysics Data System (ADS)

    Sardinha-Lourenço, A.; Andrade-Campos, A.; Antunes, A.; Oliveira, M. S.

    2018-03-01

    Recent research on short-term water demand forecasting has shown that models using univariate time series based on historical data are useful and can be combined with other prediction methods to reduce errors. Water demand in drinking water distribution networks is strongly repetitive and, under similar meteorological conditions and consumer profiles, allows the development of a heuristic forecast model that, combined with other autoregressive models, can provide reliable forecasts. In this study, a parallel adaptive weighting strategy for forecasting water consumption over the next 24-48 h, using univariate time series of potable water consumption, is proposed. Two Portuguese potable water distribution networks are used as case studies, where the only input data are water consumption and the national calendar. For the development of the strategy, the Autoregressive Integrated Moving Average (ARIMA) method and a short-term forecast heuristic algorithm are used. Simulations with the model showed that, when using a parallel adaptive weighting strategy, the prediction error can be reduced by 15.96% and the average error by 9.20%. This reduction is important in the control and management of water supply systems. The proposed methodology can be extended to other forecast methods, especially when multiple forecast models are available.
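
    One simple form of adaptive weighting between parallel forecasters is to weight each model by the inverse of its recent absolute error, as sketched below. The two forecast streams (stand-ins for the ARIMA and heuristic models), the window length, and the data are all invented; the paper's exact weighting rule is not reproduced here.

```python
# Sketch: combine two forecast streams with weights proportional to the
# inverse of each model's recent absolute error.
def adaptive_combine(obs, f1, f2, window=3, eps=1e-9):
    combined = []
    for t in range(len(obs)):
        lo = max(0, t - window)
        e1 = sum(abs(obs[i] - f1[i]) for i in range(lo, t)) + eps
        e2 = sum(abs(obs[i] - f2[i]) for i in range(lo, t)) + eps
        w1 = (1 / e1) / (1 / e1 + 1 / e2)   # better recent record → more weight
        combined.append(w1 * f1[t] + (1 - w1) * f2[t])
    return combined

# Hypothetical hourly consumption, an accurate model, and a biased model (+2)
obs = [10, 12, 11, 13, 14, 13, 15, 16, 15, 17]
f1  = [10, 12, 11, 13, 14, 13, 15, 16, 15, 17]
f2  = [12, 14, 13, 15, 16, 15, 17, 18, 17, 19]

mae = lambda a, b: sum(abs(x - y) for x, y in zip(a, b)) / len(a)
comb = adaptive_combine(obs, f1, f2)
print(mae(comb, obs) < mae(f2, obs))  # → True
```

    After the first few steps the weights shift almost entirely to the model with the better recent record, which is the mechanism behind the error reductions reported above.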

  9. A scoping review of nursing workforce planning and forecasting research.

    PubMed

    Squires, Allison; Jylhä, Virpi; Jun, Jin; Ensio, Anneli; Kinnunen, Juha

    2017-11-01

    This study critically evaluates forecasting models and their content in workforce planning policies for nursing professionals and highlights the strengths and weaknesses of existing approaches. Although macro-level nursing workforce issues may not be the first thing that many nurse managers consider in daily operations, the current and impending nursing shortage in many countries makes nursing-specific models for workforce forecasting important. A scoping review was conducted using a directed and summative content analysis approach to capture supply and demand analytic methods of nurse workforce planning and forecasting. The literature on nurse workforce forecasting studies published in peer-reviewed journals as well as in grey literature was included in the scoping review. Thirty-six studies met the inclusion criteria, with the majority coming from the USA. Forecasting methods were biased towards service utilization analyses and were not consistent across studies. Current methods for nurse workforce forecasting are inconsistent and have not accounted sufficiently for socioeconomic and political factors that can influence workforce projections. Additional studies examining past trends are needed to improve future modelling. Accurate nursing workforce forecasting can help nurse managers, administrators and policy makers to understand the supply and demand of the workforce in order to prepare and maintain an adequate and competent current and future workforce. © 2017 John Wiley & Sons Ltd.

  10. Robustness of disaggregate oil and gas discovery forecasting models

    USGS Publications Warehouse

    Attanasi, E.D.; Schuenemeyer, J.H.

    1989-01-01

    The trend in forecasting oil and gas discoveries has been to develop and use models that allow forecasts of the size distribution of future discoveries. From such forecasts, exploration and development costs can more readily be computed. Two classes of these forecasting models are the Arps-Roberts type models and the 'creaming method' models. This paper examines the robustness of the forecasts made by these models when the historical data on which the models are based have been subject to economic upheavals, or when historical discovery data are aggregated from areas with widely differing economic structures. Model performance is examined in the context of forecasting discoveries for offshore Texas State and Federal areas. The analysis shows how the model forecasts are limited by the information contained in the historical discovery data. Because the Arps-Roberts type models require more regularity in the discovery sequence than the creaming models, prior information had to be introduced into the Arps-Roberts models to accommodate the influence of economic changes. The creaming methods captured the overall decline in discovery size but did not easily allow the introduction of exogenous information to compensate for incomplete historical data. Moreover, the predictive lognormal distribution associated with the creaming-model methods appears to understate the importance of the potential contribution of small fields.

  11. Why did the 2015/16 El Niño Fail to Bring Excessive Precipitation to California?

    NASA Astrophysics Data System (ADS)

    Jong, B. T.; Ting, M.; Seager, R.; Lee, D. E.

    2016-12-01

    California has experienced severe drought in recent years, posing great challenges to water resources, agriculture, and land management. El Niño, as the prime source of seasonal-to-interannual climate predictability, offers the potential to alleviate drought in California. Here, El Niño's impacts on California winter precipitation are examined. Our results, based on observations during 1901-2010, show that El Niño's influence on precipitation strengthens from early to late winter even as El Niño weakens. The cause of the nonlinear relationship between sea surface temperature anomaly (SSTA) amplitude and teleconnection strength is the late-winter warming of the climatological mean SST over the tropical eastern Pacific, which allows a more active and eastward-extending tropical deep convection anomaly. The 2015/16 El Niño, one of the strongest events in recent history, did not bring the heavy precipitation to California anticipated on the basis of model forecasts and experience with the previous two strong El Niños, 1982/83 and 1997/98. In North American Multi-Model Ensemble (NMME) 3-month average forecasts of SST initialized on 1 February 2016, models overestimated the Niño3 SSTA compared to what actually occurred and, consistently, forecast heavier-than-observed California precipitation. The overestimated Niño3 SSTA drove overly strong deep convection anomalies in the eastern tropical Pacific, triggering an overly strong teleconnection that made the forecast California precipitation too wet. Thus, the faster-than-forecast decay of the Niño3 SST anomalies at the end of the 2015/16 El Niño is one possible reason why the event failed to bring excess precipitation to California in the late winter. Controlled GCM experiments support this hypothesis and show that the teleconnection forced by the multimodel mean forecast of 2016 February-March-April SSTAs is stronger than the one forced by the observed SSTAs.
Within the NMME, those models that more correctly forecast the decay of El Niño 2015/16 also more correctly forecast modest precipitation anomalies over California.

  12. Forecasting hotspots in East Kutai, Kutai Kartanegara, and West Kutai as early warning information

    NASA Astrophysics Data System (ADS)

    Wahyuningsih, S.; Goejantoro, R.; Rizki, N. A.

    2018-04-01

    The aims of this research are to model hotspots and to forecast 2017 hotspots in East Kutai, Kutai Kartanegara and West Kutai. The methods used in this research were Holt's exponential smoothing, Holt's additive damped trend method, Holt-Winters' additive method, the additive decomposition method, the multiplicative decomposition method, the Loess decomposition method and the Box-Jenkins method. Among the smoothing techniques, additive decomposition performed better than Holt's exponential smoothing. The hotspot models obtained using the Box-Jenkins method were the Autoregressive Integrated Moving Average models ARIMA(1,1,0), ARIMA(0,2,1) and ARIMA(0,1,0). Comparing the results of all the methods used in this research on the basis of the Root Mean Squared Error (RMSE) shows that the Loess decomposition method is the best time series model, because it has the lowest RMSE. The Loess decomposition model was therefore used to forecast the number of hotspots. The forecasting results indicate that hotspots tend to increase at the end of 2017 in Kutai Kartanegara and West Kutai, but remain stationary in East Kutai.
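
    Two of the smoothing methods compared above can be sketched in a few lines each: simple exponential smoothing and Holt's linear-trend method, scored by the RMSE of one-step-ahead forecasts, which is the same selection criterion the study uses. The hotspot counts and smoothing constants below are hypothetical.

```python
# Sketch: simple exponential smoothing vs. Holt's linear-trend method,
# compared by one-step-ahead RMSE on a toy trending series.
from math import sqrt

def ses_forecasts(y, alpha=0.5):
    level, preds = y[0], []
    for t in range(1, len(y)):
        preds.append(level)                  # one-step-ahead forecast
        level = alpha * y[t] + (1 - alpha) * level
    return preds

def holt_forecasts(y, alpha=0.5, beta=0.3):
    level, trend, preds = y[0], y[1] - y[0], []
    for t in range(1, len(y)):
        preds.append(level + trend)          # one-step-ahead forecast
        new_level = alpha * y[t] + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
    return preds

def rmse(preds, y):
    errs = [(p - o) ** 2 for p, o in zip(preds, y[1:])]
    return sqrt(sum(errs) / len(errs))

hotspots = [12, 15, 18, 22, 25, 29, 33, 36, 40, 44]   # hypothetical rising counts
print(rmse(holt_forecasts(hotspots), hotspots)
      < rmse(ses_forecasts(hotspots), hotspots))       # → True on this trend
```

    On a trending series, SES lags persistently while the trend term lets Holt's method keep up, which is why trend-aware and decomposition methods tend to win the RMSE comparison for rising hotspot counts.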

  13. Identifying sensitive areas of adaptive observations for prediction of the Kuroshio large meander using a shallow-water model

    NASA Astrophysics Data System (ADS)

    Zou, Guang'an; Wang, Qiang; Mu, Mu

    2016-09-01

    Sensitive areas for prediction of the Kuroshio large meander using a 1.5-layer, shallow-water ocean model were investigated using the conditional nonlinear optimal perturbation (CNOP) and first singular vector (FSV) methods. A series of sensitivity experiments were designed to test the sensitivity of sensitive areas within the numerical model. The following results were obtained: (1) the effect of initial CNOP and FSV patterns in their sensitive areas is greater than that of the same patterns in randomly selected areas, with the effect of the initial CNOP patterns in CNOP sensitive areas being the greatest; (2) both CNOP- and FSV-type initial errors grow more quickly than random errors; (3) the effect of random errors superimposed on the sensitive areas is greater than that of random errors introduced into randomly selected areas, and initial errors in the CNOP sensitive areas have greater effects on final forecasts. These results reveal that the sensitive areas determined using the CNOP are more sensitive than those of FSV and other randomly selected areas. In addition, ideal hindcasting experiments were conducted to examine the validity of the sensitive areas. The results indicate that reduction (or elimination) of CNOP-type errors in CNOP sensitive areas at the initial time has a greater forecast benefit than the reduction (or elimination) of FSV-type errors in FSV sensitive areas. These results suggest that the CNOP method is suitable for determining sensitive areas in the prediction of the Kuroshio large-meander path.

  14. Using Landslide Failure Forecast Models in Near Real Time: the Mt. de La Saxe case-study

    NASA Astrophysics Data System (ADS)

    Manconi, Andrea; Giordan, Daniele

    2014-05-01

    Forecasting the occurrence of landslide phenomena in space and time is a major scientific challenge. The approaches used to forecast landslides depend mainly on the spatial scale analyzed (regional vs. local), the temporal range of forecast (long- vs. short-term), and the triggering factor and landslide typology considered. Focusing on short-term forecast methods for large, deep-seated slope instabilities, the potential time of failure (ToF) can be estimated by studying the evolution of the landslide deformation over time (i.e., strain rate), provided that, under constant stress conditions, landslide materials follow a creep mechanism before reaching rupture. In the last decades, different procedures have been proposed to estimate the ToF by applying simplified empirical and/or graphical methods to time series of deformation data. Fukuzono (1985) proposed a failure forecast method based on large-scale laboratory experiments aimed at observing the kinematic evolution of a landslide induced by rain. This approach, also known as the inverse-velocity method, considers the evolution over time of the inverse of the surface velocity (v) as an indicator of the ToF, assuming that failure approaches as 1/v tends to zero. Here we present an innovative method aimed at achieving failure forecasts of landslide phenomena using near-real-time monitoring data. Starting from the inverse-velocity theory, we analyze landslide surface displacements over different temporal windows, and then apply straightforward statistical methods to obtain confidence intervals on the time of failure. Our results can be relevant for supporting the management of early warning systems during landslide emergency conditions, including when the predefined displacement and/or velocity thresholds are exceeded.
In addition, our statistical approach to the definition of confidence intervals and forecast reliability can also be applied to other failure forecast methods. We applied the approach presented here for the first time in near real time during the emergency scenario of the reactivation of the La Saxe rockslide, a large mass movement menacing the population of Courmayeur, northern Italy, and the important European route E25. We show how the application of simplified but robust forecast models can be a convenient way to manage and support early warning systems during critical situations. References: Fukuzono T. (1985), A New Method for Predicting the Failure Time of a Slope, Proc. IVth International Conference and Field Workshop on Landslides, Tokyo.
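
    The inverse-velocity step itself is a straight-line fit: plot 1/v against time and extrapolate to 1/v = 0 to estimate the ToF. The sketch below uses synthetic velocities following the idealized accelerating-creep law v = 1/(a(tf − t)); the creep parameter, failure time, and observation window are hypothetical, and the statistical windowing/confidence-interval layer of the paper's method is not reproduced.

```python
# Sketch of the Fukuzono inverse-velocity method: least-squares line through
# (t, 1/v), extrapolated to 1/v = 0 to get the time of failure (ToF).
def fit_line(ts, ys):
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
             / sum((t - mt) ** 2 for t in ts))
    return slope, my - slope * mt            # slope, intercept

a, tf = 0.5, 100.0                           # hypothetical creep parameter, ToF (days)
ts = list(range(60, 90))                     # observation window (days)
inv_v = [a * (tf - t) for t in ts]           # 1/v is linear in t for ideal creep

slope, intercept = fit_line(ts, inv_v)
tof = -intercept / slope                     # where the fitted line hits 1/v = 0
print(round(tof, 1))  # → 100.0
```

    Repeating this fit over several sliding windows and reporting the spread of the resulting ToF estimates is one way to attach the confidence intervals the abstract describes.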

  15. Bias Adjusted Precipitation Threat Scores

    NASA Astrophysics Data System (ADS)

    Mesinger, F.

    2008-04-01

    Among the wide variety of performance measures available for assessing the skill of deterministic precipitation forecasts, the equitable threat score (ETS) might well be the one used most frequently. It is typically used in conjunction with the bias score. However, apart from its mathematical definition, the meaning of the ETS is not clear. It has been pointed out (Mason, 1989; Hamill, 1999) that forecasts with a larger bias tend to have a higher ETS. Even so, the present author has not seen this accounted for in any of the numerous papers that in recent years have used the ETS along with bias "as a measure of forecast accuracy". A method to adjust the threat score (TS) or the ETS to the values they would have at unit bias, in order to show the model's or forecaster's accuracy in placing precipitation, was proposed earlier by the present author (Mesinger and Brill, the so-called dH/dF method). A serious deficiency has since been noted in the dH/dF method: the hypothetical function it uses to interpolate or extrapolate the observed number of hits to unit bias can yield more hits than forecasts when the forecast area tends to zero. Another method is proposed here, based on the assumption that the increase in hits per unit increase in false alarms is proportional to the as-yet unhit area. This new method removes the deficiency of the dH/dF method. Examples of its performance for 12 months of forecasts by three NCEP operational models are given.
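    The proposed assumption, dH/dF proportional to the unhit area (O - H), integrates to H(F) = O(1 - exp(-kF)), from which a bias-adjusted threat score can be computed. A minimal sketch with hypothetical contingency counts (not the NCEP verification data):

```python
import math

# Hypothetical contingency counts at one precipitation threshold
hits, fals = 30.0, 50.0        # hits and false alarms (forecast area = 80)
obs = 40.0                     # observed area (hits + misses)

bias = (hits + fals) / obs     # frequency bias; > 1 means overforecasting
ts_raw = hits / (obs + fals)   # threat score: hits / (hits + misses + false alarms)

# Assumption from the abstract: dH/dF = k * (obs - H), so hits saturate toward
# the observed area. Integrating from F = 0 gives H(F) = obs * (1 - exp(-k*F));
# k follows from the observed point, then bisection finds the false-alarm area
# f1 at unit bias, where H(f1) + f1 = obs.
k = -math.log(1.0 - hits / obs) / fals

def hits_at(f):
    return obs * (1.0 - math.exp(-k * f))

lo, hi = 0.0, obs
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if hits_at(mid) + mid < obs:
        lo = mid
    else:
        hi = mid
f1 = 0.5 * (lo + hi)
ts_adj = hits_at(f1) / (obs + f1)
print(f"bias={bias:.2f}  TS={ts_raw:.3f}  bias-adjusted TS={ts_adj:.3f}")
```

For this overforecasting example (bias = 2), the adjusted TS comes out below the raw TS, consistent with the observation that a larger bias inflates the score.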

  16. Survey of air cargo forecasting techniques

    NASA Technical Reports Server (NTRS)

    Kuhlthan, A. R.; Vermuri, R. S.

    1978-01-01

    Forecasting techniques currently in use for estimating or predicting the demand for air cargo in various markets are discussed, with emphasis on the fundamentals of the different forecasting approaches. References to specific studies are cited where appropriate. The effectiveness of current methods is evaluated, and several prospects for future activities or approaches are suggested. Appendices contain summary-type analyses of about 50 specific publications on forecasting, and selected bibliographies on air cargo forecasting, air passenger demand forecasting, and general demand and modal-split modeling.

  17. Stochastic demographic forecasting.

    PubMed

    Lee, R D

    1992-11-01

    "This paper describes a particular approach to stochastic population forecasting, which is implemented for the U.S.A. through 2065. Statistical time series methods are combined with demographic models to produce plausible long run forecasts of vital rates, with probability distributions. The resulting mortality forecasts imply gains in future life expectancy that are roughly twice as large as those forecast by the Office of the Social Security Actuary.... Resulting stochastic forecasts of the elderly population, elderly dependency ratios, and payroll tax rates for health, education and pensions are presented." excerpt

  18. Comparison between stochastic and machine learning methods for hydrological multi-step ahead forecasting: All forecasts are wrong!

    NASA Astrophysics Data System (ADS)

    Papacharalampous, Georgia; Tyralis, Hristos; Koutsoyiannis, Demetris

    2017-04-01

    Machine learning (ML) is considered to be a promising approach to forecasting hydrological processes. We conduct a comparison between several stochastic and ML point estimation methods by performing large-scale computational experiments based on simulations. The purpose is to provide generalized results, whereas the respective comparisons in the literature are usually based on case studies. The stochastic methods used include simple methods and models from the frequently used families of Autoregressive Moving Average (ARMA), Autoregressive Fractionally Integrated Moving Average (ARFIMA) and Exponential Smoothing models. The ML methods used are Random Forests (RF), Support Vector Machines (SVM) and Neural Networks (NN). The comparison refers to the multi-step ahead forecasting properties of the methods. A total of 20 methods are used, of which 9 are ML methods. Twelve simulation experiments are performed, each using 2000 simulated time series of 310 observations. The time series are simulated using stochastic processes from the families of ARMA and ARFIMA models. Each time series is split into a fitting set (first 300 observations) and a testing set (last 10 observations). The comparative assessment of the methods is based on 18 metrics that quantify the methods' performance according to several criteria related to accurate forecasting of the testing set, the capturing of its variation, and the correlation between the testing and forecasted values. The most important outcome of this study is that there is no uniformly better or worse method. However, there are methods that are regularly better or worse than others with respect to specific metrics. It appears that, although a general ranking of the methods is not possible, their classification based on their similar or contrasting performance in the various metrics is possible to some extent.
Another important conclusion is that more sophisticated methods do not necessarily provide better forecasts compared to simpler methods. It is pointed out that the ML methods do not differ dramatically from the stochastic methods, while it is interesting that the NN, RF and SVM algorithms used in this study offer potentially very good performance in terms of accuracy. It should be noted that, although this study focuses on hydrological processes, the results are of general scientific interest. Another important point in this study is the use of several methods and metrics. Using fewer methods and fewer metrics would have led to a very different overall picture, particularly if those fewer metrics corresponded to fewer criteria. For this reason, we consider that the proposed methodology is appropriate for the evaluation of forecasting methods.
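    The experimental design above (fit on the first 300 observations, forecast the last 10 by iterating the fitted model) can be illustrated with a minimal AR(1) example. The process, its parameter, and the single error metric are simplified stand-ins, not the study's actual 20-method, 18-metric setup.

```python
import random

random.seed(42)

# Simulate an AR(1) process, a simple member of the ARMA family:
#   x_t = phi * x_{t-1} + eps_t, with hypothetical phi = 0.7
phi_true = 0.7
x = [0.0]
for _ in range(309):
    x.append(phi_true * x[-1] + random.gauss(0, 1))

fit, test = x[:300], x[300:]     # 300 fitting and 10 testing observations

# Least-squares estimate of phi from the fitting set
num = sum(a * b for a, b in zip(fit[:-1], fit[1:]))
den = sum(a * a for a in fit[:-1])
phi_hat = num / den

# Multi-step ahead forecasts: iterate the fitted model from the last value
forecasts, last = [], fit[-1]
for _ in range(10):
    last = phi_hat * last
    forecasts.append(last)

rmse = (sum((f - o) ** 2 for f, o in zip(forecasts, test)) / 10) ** 0.5
print(f"phi_hat={phi_hat:.3f}  multi-step RMSE={rmse:.3f}")
```

The same fit/forecast/score loop, repeated over thousands of simulated series and many methods, is the skeleton of the large-scale comparison the abstract describes.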

  19. Strength resistance of reinforced concrete elements of high-rise buildings under dynamic loads

    NASA Astrophysics Data System (ADS)

    Berlinov, Mikhail

    2018-03-01

    A new method is proposed for calculating reinforced concrete structures of high-rise buildings under dynamic loads from wind, seismic activity, transport and equipment, based on the initial assumptions of the modern phenomenological theory of a nonlinearly deformable elastic-creeping body. The article examines the influence of reinforcement on the behavior of concrete in a triaxial stress-strain state, based on the compatibility of deformation between concrete and reinforcement. Mathematical phenomenological equations have been obtained that make it possible to calculate reinforced concrete elements with and without cracks. A method for linearizing these equations based on integral estimates is proposed, which captures the vibro-creep processes over the period considered. Applying this technique with the finite-difference method, the step method, and successive approximations allows a numerical solution of the problem to be found. Such an approach to the design of reinforced concrete structures will not only take fuller account of the real conditions of their work, revealing additional reserves of load capacity, but also open additional opportunities for analyzing and forecasting their performance at various stages of operation.

  20. Short-term Forecasting of the Prevalence of Trachoma: Expert Opinion, Statistical Regression, versus Transmission Models

    PubMed Central

    Liu, Fengchen; Porco, Travis C.; Amza, Abdou; Kadri, Boubacar; Nassirou, Baido; West, Sheila K.; Bailey, Robin L.; Keenan, Jeremy D.; Solomon, Anthony W.; Emerson, Paul M.; Gambhir, Manoj; Lietman, Thomas M.

    2015-01-01

    Background Trachoma programs rely on guidelines made in large part using expert opinion of what will happen with and without intervention. Large community-randomized trials offer an opportunity to actually compare forecasting methods in a masked fashion. Methods The Program for the Rapid Elimination of Trachoma trials estimated longitudinal prevalence of ocular chlamydial infection from 24 communities treated annually with mass azithromycin. Given antibiotic coverage and biannual assessments from baseline through 30 months, forecasts of the prevalence of infection in each of the 24 communities at 36 months were made by three methods: the sum of 15 experts’ opinion, statistical regression of the square-root-transformed prevalence, and a stochastic hidden Markov model of infection transmission (Susceptible-Infectious-Susceptible, or SIS model). All forecasters were masked to the 36-month results and to the other forecasts. Forecasts of the 24 communities were scored by the likelihood of the observed results and compared using Wilcoxon’s signed-rank statistic. Findings Regression and SIS hidden Markov models had significantly better likelihood than community expert opinion (p = 0.004 and p = 0.01, respectively). All forecasts scored better when perturbed to decrease Fisher’s information. Each individual expert’s forecast was poorer than the sum of experts. Interpretation Regression and SIS models performed significantly better than expert opinion, although all forecasts were overly confident. Further model refinements may score better, although would need to be tested and compared in new masked studies. Construction of guidelines that rely on forecasting future prevalence could consider use of mathematical and statistical models. PMID:26302380
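    A stochastic SIS model of the kind cited above can be sketched as a Monte Carlo forecast: simulate many random trajectories and summarize the prevalence distribution at the forecast horizon. The population size, rates, initial condition, and horizon below are hypothetical, not fitted to the trial data.

```python
import random

random.seed(0)

# Hypothetical SIS parameters for one community (not fitted to the PRET data)
N, beta, gamma = 500, 0.5, 0.25   # population, weekly transmission and recovery
I0, weeks, runs = 50, 26, 200     # initial infected, horizon, Monte Carlo runs

def simulate(i0):
    """One stochastic SIS trajectory; returns the final prevalence I/N."""
    i = i0
    for _ in range(weeks):
        # Each susceptible is infected with probability beta*I/N this week;
        # each infected recovers (back to susceptible) with probability gamma.
        new_inf = sum(random.random() < beta * i / N for _ in range(N - i))
        recov = sum(random.random() < gamma for _ in range(i))
        i = max(0, i + new_inf - recov)
    return i / N

finals = sorted(simulate(I0) for _ in range(runs))
median = finals[runs // 2]
lo, hi = finals[int(0.025 * runs)], finals[int(0.975 * runs)]
print(f"forecast prevalence: median={median:.2f}, 95% interval=({lo:.2f}, {hi:.2f})")
```

A forecast of this kind can be scored against the eventually observed prevalence by the likelihood it assigns to that outcome, which is how the three methods in the study were compared.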

  1. Wind Turbine Gust Prediction Using Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Towers, Paul; Jones, Bryn

    2013-11-01

    Offshore wind energy is a growing energy source as governments around the world look for environmentally friendly solutions to potential future energy shortages. In order to capture more energy from the wind, larger turbines are being designed, leading to the structures becoming increasingly vulnerable to damage caused by violent gusts of wind. Advance knowledge of such gusts will enable turbine control systems to take preventative action, reducing turbine maintenance costs. We present a system which can accurately forecast the velocity profile of an oncoming wind, given only limited spatial measurements from light detection and ranging (LiDAR) units, which are currently operational in industry. Our method combines nonlinear state estimation techniques with low-order models of atmospheric boundary-layer flows to generate flow-field estimates. We discuss the accuracy of our velocity profile predictions by direct comparison to data derived from large eddy simulations of the atmospheric boundary layer.

  2. Copula Entropy coupled with Wavelet Neural Network Model for Hydrological Prediction

    NASA Astrophysics Data System (ADS)

    Wang, Yin; Yue, JiGuang; Liu, ShuGuang; Wang, Li

    2018-02-01

    Artificial Neural Networks (ANNs) have been widely used in hydrological forecasting. In this paper, an attempt has been made to find an alternative method for hydrological prediction by combining Copula Entropy (CE) with a Wavelet Neural Network (WNN). CE theory permits the calculation of mutual information (MI) to select input variables, which avoids the limitations of traditional linear correlation coefficient (LCC) analysis. Wavelet analysis can provide the exact locality of any changes in the dynamical patterns of the sequence and, coupled with the strong nonlinear fitting ability of ANNs, the WNN model was able to provide a good fit to the hydrological data. Finally, the hybrid model (CE+WNN) was applied to daily water levels of the Taihu Lake Basin and compared with CE+ANN, LCC+WNN and LCC+ANN models. Results showed that the hybrid model produced better results in estimating the hydrograph properties than the latter models.
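    The advantage of mutual information over linear correlation for input selection can be shown with a minimal histogram-based MI estimate: a purely nonlinear relationship (x = y², which has near-zero linear correlation with y) still yields high MI. The estimator and data below are illustrative assumptions, not the paper's copula-entropy implementation.

```python
import math
import random

random.seed(3)

def mutual_information(xs, ys, bins=8):
    """Histogram estimate of mutual information (in nats) between two samples."""
    def binned(v):
        lo, hi = min(v), max(v)
        w = (hi - lo) / bins or 1.0
        return [min(int((x - lo) / w), bins - 1) for x in v]
    bx, by = binned(xs), binned(ys)
    n = len(xs)
    pj, px, py = {}, {}, {}
    for a, b in zip(bx, by):
        pj[(a, b)] = pj.get((a, b), 0) + 1
        px[a] = px.get(a, 0) + 1
        py[b] = py.get(b, 0) + 1
    return sum(c / n * math.log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pj.items())

# Hypothetical candidate inputs: one nonlinearly related to the target, one noise.
y = [random.gauss(0, 1) for _ in range(2000)]
x_related = [v ** 2 + random.gauss(0, 0.1) for v in y]   # nonlinear dependence
x_noise = [random.gauss(0, 1) for _ in y]

mi_rel = mutual_information(x_related, y)
mi_noise = mutual_information(x_noise, y)
print(f"MI(related)={mi_rel:.3f}  MI(noise)={mi_noise:.3f}")
```

A selection rule would keep `x_related` and drop `x_noise`, even though an LCC-based screen would miss the quadratic dependence entirely.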

  3. A dynamic method to forecast the wheel slip for antilock braking system and its experimental evaluation.

    PubMed

    Oniz, Yesim; Kayacan, Erdal; Kaynak, Okyay

    2009-04-01

    The control of an antilock braking system (ABS) is a difficult problem due to its strongly nonlinear and uncertain characteristics. To overcome this difficulty, the integration of gray-system theory and sliding-mode control is proposed in this paper. In this way, the prediction capabilities of the former and the robustness of the latter are combined to regulate optimal wheel slip depending on the vehicle forward velocity. The design approach described is novel, considering that a point, rather than a line, is used as the sliding control surface. The control algorithm is derived and subsequently tested on a quarter vehicle model. Encouraged by simulation results indicating the ability to overcome the stated difficulties with fast convergence, experiments were carried out on a laboratory setup. The results presented indicate the potential of the approach in handling difficult real-time control problems.

  4. User's guide for a general purpose dam-break flood simulation model (K-634)

    USGS Publications Warehouse

    Land, Larry F.

    1981-01-01

    An existing computer program for simulating dam-break floods for forecast purposes has been modified with an emphasis on general purpose applications. The original model was formulated, developed and documented by the National Weather Service. This model is based on the complete flow equations and uses a nonlinear implicit finite-difference numerical method. The first phase of the simulation routes a flood wave through the reservoir and computes an outflow hydrograph which is the sum of the flow through the dam's structures and the gradually developing breach. The second phase routes this outflow hydrograph through the stream which may be nonprismatic and have segments with subcritical or supercritical flow. The results are discharge and stage hydrographs at the dam as well as all of the computational nodes in the channel. From these hydrographs, peak discharge and stage profiles are tabulated. (USGS)

  5. Data-driven outbreak forecasting with a simple nonlinear growth model.

    PubMed

    Lega, Joceline; Brown, Heidi E

    2016-12-01

    Recent events have thrown the spotlight on infectious disease outbreak response. We developed a data-driven method, EpiGro, which can be applied to cumulative case reports to estimate the order of magnitude of the duration, peak and ultimate size of an ongoing outbreak. It is based on a surprisingly simple mathematical property of many epidemiological data sets, does not require knowledge or estimation of disease transmission parameters, is robust to noise and to small data sets, and runs quickly due to its mathematical simplicity. Using data from historic and ongoing epidemics, we present the model. We also provide modeling considerations that justify this approach and discuss its limitations. In the absence of other information or in conjunction with other models, EpiGro may be useful to public health responders.
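    A simple nonlinear growth model of the kind described can be sketched by fitting daily incidence as a quadratic function of cumulative cases (the logistic form), from which the final outbreak size follows as K = -a/b. The data below are noise-free toy values, not the historic epidemics used by the authors, and this sketch is not the published EpiGro code.

```python
# Generate toy cumulative case counts from a discrete logistic model:
#   C_{t+1} = C_t + r*C_t*(1 - C_t/K), with hypothetical K = 1000, r = 0.2
K_true, r = 1000.0, 0.2
C = [10.0]
for _ in range(60):
    C.append(C[-1] + r * C[-1] * (1 - C[-1] / K_true))

# Fit daily incidence dC as a quadratic in cumulative cases C:
#   dC = a*C + b*C^2, which implies a final size K = -a/b
dC = [c2 - c1 for c1, c2 in zip(C[:-1], C[1:])]
Cs = C[:-1]

# Two-parameter least squares via the normal equations (Cramer's rule)
s11 = sum(c * c for c in Cs)
s12 = sum(c ** 3 for c in Cs)
s22 = sum(c ** 4 for c in Cs)
t1 = sum(c * d for c, d in zip(Cs, dC))
t2 = sum(c * c * d for c, d in zip(Cs, dC))
det = s11 * s22 - s12 * s12
a = (t1 * s22 - t2 * s12) / det
b = (t2 * s11 - t1 * s12) / det

K_est = -a / b
print(f"estimated final outbreak size: {K_est:.0f}")
```

Note that no transmission parameters are estimated: the forecast of the ultimate size comes entirely from the shape of the incidence-versus-cumulative-cases curve, which is the property the abstract highlights.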

  6. Developing a predictive tropospheric ozone model for Tabriz

    NASA Astrophysics Data System (ADS)

    Khatibi, Rahman; Naghipour, Leila; Ghorbani, Mohammad A.; Smith, Michael S.; Karimi, Vahid; Farhoudi, Reza; Delafrouz, Hadi; Arvanaghi, Hadi

    2013-04-01

    Predictive ozone models are becoming indispensable tools by providing a capability for pollution alerts to serve people who are vulnerable to the risks. We have developed a tropospheric ozone prediction capability for Tabriz, Iran, using the following five modeling strategies: three regression-type methods, Multiple Linear Regression (MLR), Artificial Neural Networks (ANNs), and Gene Expression Programming (GEP); and two auto-regression-type models, Nonlinear Local Prediction (NLP), implementing chaos theory, and Auto-Regressive Integrated Moving Average (ARIMA) models. The regression-type strategies explain the data in terms of temperature, solar radiation, dew point temperature, and wind speed, while the auto-regression-type strategies regress present ozone values on their past values. The ozone time series are available at various time intervals, including hourly intervals, from August 2010 to March 2011. The results for the MLR, ANN and GEP models are not overly good, but those produced by NLP and ARIMA are promising for establishing a forecasting capability.

  7. Establishing a method of short-term rainfall forecasting based on GNSS-derived PWV and its application.

    PubMed

    Yao, Yibin; Shan, Lulu; Zhao, Qingzhi

    2017-09-29

    Global Navigation Satellite System (GNSS) observations can effectively retrieve precipitable water vapor (PWV) with high precision and high temporal resolution. GNSS-derived PWV can be used to reflect water vapor variation during strong convective weather. By studying the relationship between time-varying PWV and rainfall, it can be found that PWV content increases sharply before rain. Therefore, a short-term rainfall forecasting method is proposed based on GNSS-derived PWV. The method is validated using hourly GNSS-PWV data from the Zhejiang Continuously Operating Reference Station (CORS) network for the period 1 September 2014 to 31 August 2015, together with the corresponding hourly rainfall information. The results show that the forecast correct rate can reach about 80%, while the false alarm rate is about 66%. Compared with the results of previous studies, the correct rate is improved by about 7%, and the false alarm rate is comparable. The method is also applied to three other actual rainfall events of different regions, durations, and types. The results show that the method has good applicability and high accuracy, and can be used for rainfall forecasting; in future work, it can be assimilated with traditional weather forecasting techniques to improve forecast accuracy.
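    The threshold idea (forecast rain when PWV rises sharply) and the two verification scores can be sketched as follows. The synthetic data, the threshold value, and the exact score definitions used here are illustrative assumptions, not the paper's values or formulas.

```python
import random

random.seed(7)

# Hypothetical hourly PWV increments (mm/h) and rain occurrence: rain tends
# to be preceded by a sharp PWV rise, plus occasional rain without one.
hours = 2000
pwv_rise = [random.gauss(0, 1) for _ in range(hours)]
rain = [(r > 1.0 and random.random() < 0.7) or random.random() < 0.02
        for r in pwv_rise]

threshold = 1.0   # hypothetical increment threshold (mm/h)
forecast = [r > threshold for r in pwv_rise]

hits = sum(1 for f, o in zip(forecast, rain) if f and o)
misses = sum(1 for f, o in zip(forecast, rain) if not f and o)
false_alarms = sum(1 for f, o in zip(forecast, rain) if f and not o)

correct_rate = hits / (hits + misses)               # fraction of rain events caught
false_alarm_rate = false_alarms / (hits + false_alarms)
print(f"correct rate={correct_rate:.2f}  false alarm rate={false_alarm_rate:.2f}")
```

Tuning the threshold trades the two scores against each other: a lower threshold catches more rain events but raises the false alarm rate, which is the trade-off the abstract's 80%/66% figures reflect.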

  8. Advanced, Cost-Based Indices for Forecasting the Generation of Photovoltaic Power

    NASA Astrophysics Data System (ADS)

    Bracale, Antonio; Carpinelli, Guido; Di Fazio, Annarita; Khormali, Shahab

    2014-01-01

    Distribution systems are undergoing significant changes as they evolve toward the grids of the future, which are known as smart grids (SGs). The perspective of SGs is to facilitate large-scale penetration of distributed generation using renewable energy sources (RESs), encourage the efficient use of energy, reduce system losses, and improve the quality of power. Photovoltaic (PV) systems have become one of the most promising RESs due to the expected cost reduction and the increased efficiency of PV panels and interfacing converters. The ability to forecast power-production information accurately and reliably is of primary importance for the appropriate management of an SG and for making decisions relative to the energy market. Several forecasting methods have been proposed, and many indices have been used to quantify the accuracy of forecasts of PV power production. Unfortunately, the indices that have been used have deficiencies and usually do not directly account for the economic consequences of forecasting errors in the framework of liberalized electricity markets. In this paper, advanced, more accurate indices are proposed that account directly for the economic consequences of forecasting errors. The proposed indices were also compared with the most frequently used indices to demonstrate their improved capability. The comparisons were based on results obtained using a forecasting method based on an artificial neural network, chosen because it is deemed one of the most promising methods available for forecasting PV power. Numerical applications based on an actual PV plant are also presented to provide evidence of the forecasting performance of all of the indices considered.

  9. A Hybrid Approach on Tourism Demand Forecasting

    NASA Astrophysics Data System (ADS)

    Nor, M. E.; Nurul, A. I. M.; Rusiman, M. S.

    2018-04-01

    Tourism has become one of the important industries that contributes to the country’s economy. Tourism demand forecasting gives valuable information to policy makers, decision makers and organizations related to the tourism industry in order to make crucial decisions and plans. However, it is challenging to produce an accurate forecast, since economic data such as tourism data are affected by social, economic and environmental factors. In this study, an equally-weighted hybrid method, which is a combination of Box-Jenkins and Artificial Neural Networks, was applied to forecast Malaysia’s tourism demand. The forecasting performance was assessed by taking each individual method as a benchmark. The results showed that this hybrid approach outperformed the two individual models.
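    An equally-weighted hybrid simply averages the two component forecasts. A toy illustration with hypothetical numbers (not the Malaysian tourism data) shows how opposite-signed errors of the component models can partially cancel:

```python
# Hypothetical tourist arrivals (thousands) and two component model forecasts,
# stand-ins for the Box-Jenkins and ANN outputs.
actual = [120, 135, 150, 160]
box_jenkins = [118, 130, 158, 150]
ann = [125, 142, 146, 166]

# Equally-weighted hybrid: the mean of the two component forecasts
hybrid = [0.5 * (a + b) for a, b in zip(box_jenkins, ann)]

def mape(obs, fc):
    """Mean absolute percentage error."""
    return 100 * sum(abs(o - f) / o for o, f in zip(obs, fc)) / len(obs)

for name, fc in [("Box-Jenkins", box_jenkins), ("ANN", ann), ("hybrid", hybrid)]:
    print(f"{name}: MAPE={mape(actual, fc):.2f}%")
```

In this toy case the two models err in opposite directions at most points, so the hybrid's MAPE is lower than either component's; that cancellation is the usual rationale for equal-weight combination.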

  10. Rogue waves in terms of multi-point statistics and nonequilibrium thermodynamics

    NASA Astrophysics Data System (ADS)

    Hadjihosseini, Ali; Lind, Pedro; Mori, Nobuhito; Hoffmann, Norbert P.; Peinke, Joachim

    2017-04-01

    Ocean waves, which lead to rogue waves, are investigated against the background of complex systems. In contrast to deterministic approaches based on the nonlinear Schrödinger equation or focusing effects, we analyze this system as a noisy stochastic system. In particular, we present a statistical method that maps the complexity of multi-point data into the statistics of hierarchically ordered height increments for different time scales. We show that the stochastic cascade process with Markov properties is governed by a Fokker-Planck equation. Conditional probabilities, as well as the Fokker-Planck equation itself, can be estimated directly from the available observational data. This stochastic description enables us to show several new aspects of wave states. Surrogate data sets can in turn be generated, allowing us to work out different statistical features of the complex sea state in general and of extreme rogue wave events in particular. The results also open up new perspectives for forecasting the occurrence probability of extreme rogue wave events, and even for forecasting the occurrence of individual rogue waves based on precursory dynamics. As a new outlook, the ocean wave states are considered in terms of nonequilibrium thermodynamics, for which the entropy production of different wave heights is examined. We show evidence that rogue waves are characterized by negative entropy production. The statistics of the entropy production can be used to distinguish different wave states.

  11. Fuzzy forecasting based on fuzzy-trend logical relationship groups.

    PubMed

    Chen, Shyi-Ming; Wang, Nai-Yi

    2010-10-01

    In this paper, we present a new method to predict the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) based on fuzzy-trend logical relationship groups (FTLRGs). The proposed method divides fuzzy logical relationships into FTLRGs based on the trend of adjacent fuzzy sets appearing in the antecedents of the fuzzy logical relationships. First, we apply an automatic clustering algorithm to cluster the historical data into intervals of different lengths and define fuzzy sets based on these intervals. The historical data are then fuzzified into fuzzy sets to derive fuzzy logical relationships, which we divide into FTLRGs for forecasting the TAIEX. Moreover, we also apply the proposed method to forecast enrollments and inventory demand, respectively. The experimental results show that the proposed method achieves higher average forecasting accuracy rates than the existing methods.

  12. North Carolina forecasts for truck traffic

    DOT National Transportation Integrated Search

    2006-07-01

    North Carolina has experienced significant increases in truck traffic on many of its highways. Yet, current NCDOT project-level highway traffic forecasts do not appropriately capture anticipated truck traffic growth. NCDOT methods forecast total ...

  13. Adaptive correction of ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Pelosi, Anna; Battista Chirico, Giovanni; Van den Bergh, Joris; Vannitsem, Stephane

    2017-04-01

    Forecasts from numerical weather prediction (NWP) models often suffer from both systematic and non-systematic errors. These are present in both deterministic and ensemble forecasts, and originate from various sources such as model error and subgrid variability. Statistical post-processing techniques can partly remove such errors, which is particularly important when NWP outputs concerning surface weather variables are employed for site specific applications. Many different post-processing techniques have been developed. For deterministic forecasts, adaptive methods such as the Kalman filter are often used, which sequentially post-process the forecasts by continuously updating the correction parameters as new ground observations become available. These methods are especially valuable when long training data sets do not exist. For ensemble forecasts, well-known techniques are ensemble model output statistics (EMOS), and so-called "member-by-member" approaches (MBM). Here, we introduce a new adaptive post-processing technique for ensemble predictions. The proposed method is a sequential Kalman filtering technique that fully exploits the information content of the ensemble. One correction equation is retrieved and applied to all members, however the parameters of the regression equations are retrieved by exploiting the second order statistics of the forecast ensemble. We compare our new method with two other techniques: a simple method that makes use of a running bias correction of the ensemble mean, and an MBM post-processing approach that rescales the ensemble mean and spread, based on minimization of the Continuous Ranked Probability Score (CRPS). We perform a verification study for the region of Campania in southern Italy. 
We use two years (2014-2015) of daily meteorological observations of 2-meter temperature and 10-meter wind speed from 18 ground-based automatic weather stations distributed across the region, comparing them with the corresponding COSMO-LEPS ensemble forecasts. Deterministic verification scores (e.g., mean absolute error, bias) and probabilistic scores (e.g., CRPS) are used to evaluate the post-processing techniques. We conclude that the new adaptive method outperforms the simpler running bias-correction. The proposed adaptive method often outperforms the MBM method in removing bias. The MBM method has the advantage of correcting the ensemble spread, although it needs more training data.
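    A sequential, Kalman-style bias correction of the kind compared above can be sketched for a single station: a scalar bias estimate is updated each day from the latest forecast-minus-observation error and subtracted from subsequent forecasts (in an ensemble setting, the same correction would be applied to every member). The true bias and noise variances below are hypothetical, not the COSMO-LEPS values.

```python
import random

random.seed(5)

true_bias = 1.5          # systematic error of the raw forecast (hypothetical)
q, r = 0.01, 1.0         # assumed process and observation noise variances
b, p = 0.0, 1.0          # bias estimate and its error variance

raw_err, corr_err = [], []
for day in range(365):
    err = true_bias + random.gauss(0, 1.0)     # today's observed forecast error
    corr_err.append(abs(err - b))              # correct with the prior estimate
    raw_err.append(abs(err))
    # Kalman predict/update of the scalar bias state as the new observation arrives
    p += q
    gain = p / (p + r)
    b += gain * (err - b)
    p *= 1 - gain

print(f"mean |error|: raw={sum(raw_err)/365:.2f}  corrected={sum(corr_err)/365:.2f}")
```

Because the correction parameters update continuously, the filter needs no long training archive, which is the advantage the abstract notes for adaptive methods.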

  14. Approaches in Health Human Resource Forecasting: A Roadmap for Improvement.

    PubMed

    Rafiei, Sima; Mohebbifar, Rafat; Hashemi, Fariba; Ezzatabadi, Mohammad Ranjbar; Farzianpour, Fereshteh

    2016-09-01

    Forecasting the demand for and supply of health manpower in an accurate manner makes appropriate planning possible. The aim of this paper was to review approaches and methods for health manpower forecasting and consequently propose features that would improve the effectiveness of this important process of health manpower planning. A literature review was conducted for studies published in English from 1990-2014 using the PubMed, Science Direct, ProQuest, and Google Scholar databases. Review articles, qualitative studies, and retrospective and prospective studies describing or applying various types of forecasting approaches and methods in health manpower forecasting were included in the review. The authors designed an extraction data sheet based on the study questions to collect data on studies' references, designs, and types of forecasting approaches, whether discussed or applied, with their strengths and weaknesses. Forty studies were included in the review. As a result, two main categories of approaches (conceptual and analytical) for health manpower forecasting were identified. Each approach had several strengths and weaknesses. As a whole, most of them faced challenges such as being static, unable to capture dynamic variables in manpower forecasting, and unable to represent causal relationships. They also lacked the capacity to benefit from scenario making to assist policy makers in effective decision making. An effective forecasting approach should resolve the deficits of current approaches and incorporate the key features found in the literature, developing the open, dynamic, and comprehensive method that today's complex health care systems require.

  15. Forecasting United States heartworm Dirofilaria immitis prevalence in dogs.

    PubMed

    Bowman, Dwight D; Liu, Yan; McMahan, Christopher S; Nordone, Shila K; Yabsley, Michael J; Lund, Robert B

    2016-10-10

    This paper forecasts next year's canine heartworm prevalence in the United States from 16 climate, geographic and societal factors. The forecast's construction and an assessment of its performance are described. The forecast is based on a spatial-temporal conditional autoregressive model fitted to over 31 million antigen heartworm tests conducted in the 48 contiguous United States during 2011-2015. The forecast uses county-level data on 16 predictive factors, including temperature, precipitation, median household income, local forest and surface water coverage, and presence/absence of eight mosquito species. Non-static factors are extrapolated into the forthcoming year with various statistical methods. The fitted model and factor extrapolations are used to estimate next year's regional prevalence. The correlation between the observed and model-estimated county-by-county heartworm prevalence for the 5-year period 2011-2015 is 0.727, demonstrating reasonable model accuracy. The correlation between 2015 observed and forecasted county-by-county heartworm prevalence is 0.940, demonstrating significant skill and showing that heartworm prevalence can be forecasted reasonably accurately. The forecast presented herein can a priori alert veterinarians to areas expected to see higher than normal heartworm activity. The proposed methods may prove useful for forecasting other diseases.

  16. A Hybrid Neural Network Model for Sales Forecasting Based on ARIMA and Search Popularity of Article Titles.

    PubMed

    Omar, Hani; Hoang, Van Hai; Liu, Duen-Ren

    2016-01-01

    Enhancing sales and operations planning through forecasting analysis and business intelligence is demanded in many industries and enterprises. Publishing industries usually pick attractive titles and headlines for their stories to increase sales, since popular article titles and headlines can attract readers to buy magazines. In this paper, information retrieval techniques are adopted to extract words from article titles. The popularity measures of article titles are then analyzed by using the search indexes obtained from Google search engine. Backpropagation Neural Networks (BPNNs) have successfully been used to develop prediction models for sales forecasting. In this study, we propose a novel hybrid neural network model for sales forecasting based on the prediction result of time series forecasting and the popularity of article titles. The proposed model uses the historical sales data, popularity of article titles, and the prediction result of a time series, Autoregressive Integrated Moving Average (ARIMA) forecasting method to learn a BPNN-based forecasting model. Our proposed forecasting model is experimentally evaluated by comparing with conventional sales prediction techniques. The experimental result shows that our proposed forecasting method outperforms conventional techniques which do not consider the popularity of title words.

  17. A Hybrid Neural Network Model for Sales Forecasting Based on ARIMA and Search Popularity of Article Titles

    PubMed Central

    Omar, Hani; Hoang, Van Hai; Liu, Duen-Ren

    2016-01-01

Enhancing sales and operations planning through forecasting analysis and business intelligence is in demand in many industries and enterprises. Publishing industries usually pick attractive titles and headlines for their stories to increase sales, since popular article titles and headlines can attract readers to buy magazines. In this paper, information retrieval techniques are adopted to extract words from article titles. The popularity of article titles is then measured using the search indexes obtained from the Google search engine. Backpropagation Neural Networks (BPNNs) have been used successfully to develop prediction models for sales forecasting. In this study, we propose a novel hybrid neural network model for sales forecasting based on the results of time series forecasting and the popularity of article titles. The proposed model uses the historical sales data, the popularity of article titles, and the predictions of a time-series forecasting method, Autoregressive Integrated Moving Average (ARIMA), to learn a BPNN-based forecasting model. The proposed forecasting model is evaluated experimentally by comparison with conventional sales prediction techniques. The experimental results show that our proposed forecasting method outperforms conventional techniques that do not consider the popularity of title words. PMID:27313605


  18. Statistical and Machine Learning forecasting methods: Concerns and ways forward

    PubMed Central

    Makridakis, Spyros; Assimakopoulos, Vassilios

    2018-01-01

    Machine Learning (ML) methods have been proposed in the academic literature as alternatives to statistical ones for time series forecasting. Yet, scant evidence is available about their relative performance in terms of accuracy and computational requirements. The purpose of this paper is to evaluate such performance across multiple forecasting horizons using a large subset of 1045 monthly time series used in the M3 Competition. After comparing the post-sample accuracy of popular ML methods with that of eight traditional statistical ones, we found that the former are dominated across both accuracy measures used and for all forecasting horizons examined. Moreover, we observed that their computational requirements are considerably greater than those of statistical methods. The paper discusses the results, explains why the accuracy of ML models is below that of statistical ones and proposes some possible ways forward. The empirical results found in our research stress the need for objective and unbiased ways to test the performance of forecasting methods that can be achieved through sizable and open competitions allowing meaningful comparisons and definite conclusions. PMID:29584784
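The evaluation protocol above, post-sample accuracy measured per forecasting horizon, can be illustrated with a toy comparison using sMAPE, a headline accuracy measure of the M-competitions. The two methods below (a seasonal-naive forecast and an iterated linear autoregression) are simple stand-ins chosen for brevity, not the statistical and ML methods evaluated in the paper, and the monthly series is synthetic.

```python
import numpy as np

def smape(actual, pred):
    """Symmetric MAPE (%), as used in the M-competitions."""
    return 100.0 * np.mean(2.0 * np.abs(pred - actual)
                           / (np.abs(actual) + np.abs(pred)))

rng = np.random.default_rng(1)
t = np.arange(120)
series = 50 + 0.2 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, 120)
train, test = series[:108], series[108:]      # hold out the final 12 months

# Benchmark: seasonal naive (repeat the last observed year).
naive_fc = train[-12:]

# Alternative: linear autoregression on 12 lags, iterated 12 steps ahead.
p = 12
A = np.column_stack([train[i:len(train) - p + i] for i in range(p)]
                    + [np.ones(len(train) - p)])
coef, *_ = np.linalg.lstsq(A, train[p:], rcond=None)
hist = list(train)
for _ in range(12):
    hist.append(float(np.append(hist[-p:], 1.0) @ coef))
ar_fc = np.array(hist[-12:])

for h in (1, 6, 12):                          # accuracy by forecast horizon
    print(h, round(smape(test[:h], naive_fc[:h]), 2),
          round(smape(test[:h], ar_fc[:h]), 2))
```

Scoring each horizon separately, as above, is what allows statements like "dominated for all forecasting horizons examined".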

  19. Probabilistic precipitation nowcasting based on an extrapolation of radar reflectivity and an ensemble approach

    NASA Astrophysics Data System (ADS)

    Sokol, Zbyněk; Mejsnar, Jan; Pop, Lukáš; Bližňák, Vojtěch

    2017-09-01

A new method for the probabilistic nowcasting of instantaneous rain rates (ENS), based on the ensemble technique and extrapolation of current radar reflectivity along Lagrangian trajectories, is presented. Assuming inaccurate forecasts of the trajectories, an ensemble of precipitation forecasts is calculated and used to estimate the probability that rain rates will exceed a given threshold in a given grid point. Although the extrapolation neglects the growth and decay of precipitation, their impact on the probability forecast is taken into account by calibrating the forecasts using the reliability component of the Brier score (BS). ENS forecasts the probability that the rain rates will exceed thresholds of 0.1, 1.0 and 3.0 mm/h in squares of 3 km by 3 km. The lead times were up to 60 min, and forecast accuracy was measured by the BS. The ENS forecasts were compared with two other methods: a combined method (COM) and a neighbourhood method (NEI). NEI treated the extrapolated values in the square neighbourhood of 5 by 5 grid points around the point of interest as ensemble members, and the COM ensemble comprised the union of the ENS and NEI ensemble members. The results showed that the calibration technique significantly reduces the bias of the probability forecasts by including additional uncertainties that correspond to processes neglected during the extrapolation. In addition, the calibration can also be used to find the maximum lead times for which the forecasting method remains useful. We found that ENS is useful for lead times up to 60 min for thresholds of 0.1 and 1 mm/h and approximately 30 to 40 min for a threshold of 3 mm/h. We also found that a reasonable ensemble size is 100 members, which provided better scores than ensembles with 10, 25 and 50 members. In terms of the BS, the best results were obtained by ENS and COM, which are comparable. However, ENS is better calibrated and thus preferable.
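The core probability calculation, the fraction of ensemble members exceeding a rain-rate threshold, verified with the Brier score, can be sketched as below. The ensemble and the "observed" field are independent synthetic gamma draws with invented parameters, so the scores only illustrate the computation, not the skill of ENS.

```python
import numpy as np

def exceedance_probability(ensemble, threshold):
    """Fraction of ensemble members exceeding the threshold at each grid point."""
    return np.mean(ensemble > threshold, axis=0)

def brier_score(prob, occurred):
    """Mean squared difference between forecast probability and the 0/1
    outcome; lower is better, and calibration reduces its reliability term."""
    return float(np.mean((prob - occurred) ** 2))

rng = np.random.default_rng(2)
# 100 members x 500 grid points of nowcast rain rates (mm/h); the "truth" is
# drawn independently here, so any apparent skill is climatological only.
members = rng.gamma(shape=0.8, scale=1.2, size=(100, 500))
truth = rng.gamma(shape=0.8, scale=1.2, size=500)

for thr in (0.1, 1.0, 3.0):                     # thresholds used in the study
    prob = exceedance_probability(members, thr)
    bs = brier_score(prob, (truth > thr).astype(float))
    print(thr, round(bs, 4))
```

In the paper's setting the same probabilities would additionally be calibrated against the reliability component of the BS before use.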

  20. Past speculations of the future: a review of the methods used for forecasting emerging health technologies

    PubMed Central

    Doos, Lucy; Packer, Claire; Ward, Derek; Simpson, Sue; Stevens, Andrew

    2016-01-01

    Objectives Forecasting can support rational decision-making around the introduction and use of emerging health technologies and prevent investment in technologies that have limited long-term potential. However, forecasting methods need to be credible. We performed a systematic search to identify the methods used in forecasting studies to predict future health technologies within a 3–20-year timeframe. Identification and retrospective assessment of such methods potentially offer a route to more reliable prediction. Design Systematic search of the literature to identify studies reported on methods of forecasting in healthcare. Participants People are not needed in this study. Data sources The authors searched MEDLINE, EMBASE, PsychINFO and grey literature sources, and included articles published in English that reported their methods and a list of identified technologies. Main outcome measure Studies reporting methods used to predict future health technologies within a 3–20-year timeframe with an identified list of individual healthcare technologies. Commercially sponsored reviews, long-term futurology studies (with over 20-year timeframes) and speculative editorials were excluded. Results 15 studies met our inclusion criteria. Our results showed that the majority of studies (13/15) consulted experts either alone or in combination with other methods such as literature searching. Only 2 studies used more complex forecasting tools such as scenario building. Conclusions The methodological fundamentals of formal 3–20-year prediction are consistent but vary in details. Further research needs to be conducted to ascertain if the predictions made were accurate and whether accuracy varies by the methods used or by the types of technologies identified. PMID:26966060

  1. Selecting Single Model in Combination Forecasting Based on Cointegration Test and Encompassing Test

    PubMed Central

    Jiang, Chuanjin; Zhang, Jing; Song, Fugen

    2014-01-01

Combination forecasting takes the characteristics of each single forecasting method into consideration and combines the methods into a composite, which increases forecasting accuracy. Existing research on combination forecasting selects the single models arbitrarily, neglecting the internal characteristics of the forecasting object. After discussing the role of the cointegration test and the encompassing test in the selection of single models, supplemented by empirical analysis, the paper gives the following guidance for single-model selection: no more than five suitable single models should be selected from the many alternatives for a given forecasting target, which increases accuracy and stability. PMID:24892061

  2. Selecting single model in combination forecasting based on cointegration test and encompassing test.

    PubMed

    Jiang, Chuanjin; Zhang, Jing; Song, Fugen

    2014-01-01

Combination forecasting takes the characteristics of each single forecasting method into consideration and combines the methods into a composite, which increases forecasting accuracy. Existing research on combination forecasting selects the single models arbitrarily, neglecting the internal characteristics of the forecasting object. After discussing the role of the cointegration test and the encompassing test in the selection of single models, supplemented by empirical analysis, the paper gives the following guidance for single-model selection: no more than five suitable single models should be selected from the many alternatives for a given forecasting target, which increases accuracy and stability.
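A common form of the encompassing test regresses the error of one forecast on the difference between the competing forecasts: if the coefficient is significantly non-zero, the second model adds information and should be kept in the combination. The sketch below uses synthetic data, and this t-statistic variant is one standard formulation, not necessarily the exact test the authors used.

```python
import numpy as np

def encompassing_stat(y, f1, f2):
    """Forecast-encompassing regression  y - f1 = lambda * (f2 - f1) + e.
    A lambda far from 0 suggests f2 adds information that f1 misses."""
    d = f2 - f1
    lam = d @ (y - f1) / (d @ d)
    resid = (y - f1) - lam * d
    se = np.sqrt(resid @ resid / (len(y) - 1) / (d @ d))
    return lam, lam / se                      # estimate and its t-statistic

rng = np.random.default_rng(3)
signal = rng.normal(0, 1, 300)
y = signal + rng.normal(0, 0.5, 300)          # target series
f1 = 0.5 * signal + rng.normal(0, 0.5, 300)   # a weaker single model
f2 = signal + rng.normal(0, 0.3, 300)         # a stronger single model

lam, tstat = encompassing_stat(y, f1, f2)
print(round(float(lam), 2), round(float(tstat), 1))  # large t: f1 does not encompass f2
```

A symmetric test with the roles of f1 and f2 swapped completes the pairwise screening of candidate single models.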

  3. Application of empirical mode decomposition with local linear quantile regression in financial time series forecasting.

    PubMed

    Jaber, Abobaker M; Ismail, Mohd Tahir; Altaher, Alsaidi M

    2014-01-01

    This paper mainly forecasts the daily closing price of stock markets. We propose a two-stage technique that combines the empirical mode decomposition (EMD) with nonparametric methods of local linear quantile (LLQ). We use the proposed technique, EMD-LLQ, to forecast two stock index time series. Detailed experiments are implemented for the proposed method, in which EMD-LPQ, EMD, and Holt-Winter methods are compared. The proposed EMD-LPQ model is determined to be superior to the EMD and Holt-Winter methods in predicting the stock closing prices.

  4. Technical note: Combining quantile forecasts and predictive distributions of streamflows

    NASA Astrophysics Data System (ADS)

    Bogner, Konrad; Liechti, Katharina; Zappa, Massimiliano

    2017-11-01

The enhanced availability of many different hydro-meteorological modelling and forecasting systems raises the issue of how to optimally combine this wealth of information. In particular, the use of deterministic and probabilistic forecasts with sometimes widely divergent predicted future streamflow values makes it even more complicated for decision makers to sift out the relevant information. In this study, multiple streamflow forecasts are aggregated on the basis of several different predictive distributions and quantile forecasts. For this combination, the Bayesian model averaging (BMA) approach, non-homogeneous Gaussian regression (NGR), also known as the ensemble model output statistics (EMOS) technique, and a novel method called Beta-transformed linear pooling (BLP) are applied. With the help of the quantile score (QS) and the continuous ranked probability score (CRPS), the combination results for the Sihl River in Switzerland, with about 5 years of forecast data, are compared, and the differences between the raw and optimally combined forecasts are highlighted. The results demonstrate the importance of applying proper forecast combination methods for decision makers in the field of flood and water resource management.
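Two of the verification tools named above, the quantile (pinball) score and the ensemble CRPS, are compact enough to sketch. The formulas are the standard textbook definitions; the two ensembles are synthetic and only illustrate that the scores reward calibrated, sharp forecasts.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample CRPS,  E|X - y| - 0.5 E|X - X'|,  estimated from ensemble
    members; lower is better."""
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return float(term1 - term2)

def quantile_score(q_forecast, tau, obs):
    """Pinball loss for a tau-quantile forecast; lower is better."""
    u = obs - q_forecast
    return float(max(tau * u, (tau - 1.0) * u))

rng = np.random.default_rng(4)
obs = 120.0                                     # observed streamflow (m^3/s)
sharp = rng.normal(118, 5, 50)                  # nearly unbiased, sharp ensemble
wide = rng.normal(100, 40, 50)                  # biased, overdispersive ensemble

print(round(crps_ensemble(sharp, obs), 2), round(crps_ensemble(wide, obs), 2))
print(round(quantile_score(float(np.quantile(sharp, 0.9)), 0.9, obs), 2))
```

In a combination study these scores are averaged over many forecast dates, and the combination weights (e.g. for BMA or BLP) are chosen to minimise them on a training period.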

  5. A Simulation Optimization Approach to Epidemic Forecasting

    PubMed Central

    Nsoesie, Elaine O.; Beckman, Richard J.; Shashaani, Sara; Nagaraj, Kalyani S.; Marathe, Madhav V.

    2013-01-01

Reliable forecasts of influenza can aid in the control of both seasonal and pandemic outbreaks. We introduce a simulation optimization (SIMOP) approach for forecasting the influenza epidemic curve. This study represents the final step of a project aimed at using a combination of simulation, classification, statistical and optimization techniques to forecast the epidemic curve and infer underlying model parameters during an influenza outbreak. The SIMOP procedure combines an individual-based model and the Nelder-Mead simplex optimization method. The method is used to forecast epidemics simulated over synthetic social networks representing Montgomery County in Virginia, Miami, Seattle and surrounding metropolitan regions. The results are presented for the first four weeks. Depending on the synthetic network, the peak time could be predicted within a 95% CI as early as seven weeks before the actual peak. The peak infected and total infected were also accurately forecasted for Montgomery County in Virginia within the forecasting period. Forecasting of the epidemic curve for both seasonal and pandemic influenza outbreaks is a complex problem; however, this is a preliminary step, and the results suggest that more can be achieved in this area. PMID:23826222

  6. A Simulation Optimization Approach to Epidemic Forecasting.

    PubMed

    Nsoesie, Elaine O; Beckman, Richard J; Shashaani, Sara; Nagaraj, Kalyani S; Marathe, Madhav V

    2013-01-01

Reliable forecasts of influenza can aid in the control of both seasonal and pandemic outbreaks. We introduce a simulation optimization (SIMOP) approach for forecasting the influenza epidemic curve. This study represents the final step of a project aimed at using a combination of simulation, classification, statistical and optimization techniques to forecast the epidemic curve and infer underlying model parameters during an influenza outbreak. The SIMOP procedure combines an individual-based model and the Nelder-Mead simplex optimization method. The method is used to forecast epidemics simulated over synthetic social networks representing Montgomery County in Virginia, Miami, Seattle and surrounding metropolitan regions. The results are presented for the first four weeks. Depending on the synthetic network, the peak time could be predicted within a 95% CI as early as seven weeks before the actual peak. The peak infected and total infected were also accurately forecasted for Montgomery County in Virginia within the forecasting period. Forecasting of the epidemic curve for both seasonal and pandemic influenza outbreaks is a complex problem; however, this is a preliminary step, and the results suggest that more can be achieved in this area.
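The optimization step, fitting epidemic-curve parameters with the Nelder-Mead simplex, can be sketched by calibrating a logistic curve to noisy weekly counts. The logistic curve is a simple stand-in for the individual-based simulation used in SIMOP, and all numbers (growth rate, final size, peak week, noise level) are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

def epidemic_curve(params, weeks):
    """Weekly incidence implied by a logistic cumulative epidemic curve
    with growth rate r, final size K and inflection (peak) week t0."""
    r, K, t0 = params
    cumulative = K / (1.0 + np.exp(-r * (weeks - t0)))
    return np.diff(cumulative, prepend=0.0)     # new cases per week

def sse(params, weeks, observed):
    return np.sum((epidemic_curve(params, weeks) - observed) ** 2)

rng = np.random.default_rng(5)
weeks = np.arange(25, dtype=float)
true_curve = epidemic_curve((0.8, 5000.0, 12.0), weeks)
observed = true_curve + rng.normal(0, 20, weeks.size)   # noisy surveillance data

# Derivative-free simplex search, as in the SIMOP procedure.
fit = minimize(sse, x0=(0.5, 4000.0, 10.0), args=(weeks, observed),
               method="Nelder-Mead")
r_hat, K_hat, t0_hat = fit.x
print(round(float(t0_hat), 1))                  # estimated peak week, near 12
```

In the full procedure the objective would be the mismatch between observed incidence and an individual-based simulation, re-run at each simplex vertex, which is why a derivative-free optimizer is needed.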

  7. Automated Statistical Forecast Method to 36-48H ahead of Storm Wind and Dangerous Precipitation at the Mediterranean Region

    NASA Astrophysics Data System (ADS)

    Perekhodtseva, E. V.

    2009-09-01

Development of a successful method for forecasting storm winds, including squalls and tornadoes, and heavy rainfalls, which often result in human and material losses, would allow proper measures to be taken to protect people and prevent the destruction of buildings. A successful forecast issued well in advance (from 12 to 48 hours) makes it possible to reduce the losses. Until recently, prediction of these phenomena was a very difficult problem for forecasters, and the existing graphical and calculation methods still depend on the subjective decision of an operator. At present there is no hydrodynamic model in Russia for forecasting maximal precipitation and wind velocities V > 25 m/s, so the main tools of objective forecasting are statistical methods that use the dependence of the phenomena involved on a number of atmospheric parameters (predictors). A statistical decision rule for the alternative and probabilistic forecast of these events was obtained in accordance with the "perfect prognosis" concept, using objective analysis data. For this purpose, separate training samples of cases with and without storm wind and heavy rainfall were arranged automatically, each including the values of forty physically substantiated potential predictors. An empirical statistical method was then applied that involves diagonalization of the mean correlation matrix R of the predictors and extraction of diagonal blocks of strongly correlated predictors. In this way the most informative predictors for these phenomena were selected without loss of information. The statistical decision rules U(X) for diagnosis and prognosis of the phenomena were calculated for the chosen informative vector-predictor, using the Mahalanobis distance criterion and the Vapnik-Chervonenkis minimum-entropy criterion for predictor selection.
Successful development of hydrodynamic models for short-term forecasting, and the improvement of 36-48 h forecasts of pressure, temperature and other parameters, allowed us to use the prognostic fields of those models to calculate the discriminant functions at the nodes of a 150x150 km grid, together with the probabilities P of dangerous wind, and thus to obtain fully automated forecasts. To convert to an alternative (yes/no) forecast, empirical threshold values are proposed, specified for each phenomenon and a lead time of 36 hours. According to the Peirce-Obukhov criterion T, the skill of these automated statistical forecasts of squalls and tornadoes at 36-48 hours ahead, and of heavy rainfalls in the warm season, for the territory of Italy, Spain and the Balkan countries is T = 1-a-b = 0.54-0.78 in the author's experiments. Many examples of very successful forecasts of summer storm winds and heavy rainfalls over Italy and Spain are presented in this report. The same decision rules were also applied to forecasting these phenomena during the cold period of this year, when heavy snowfalls and storm winds were observed very often over Spain and Italy, and those forecasts too were successful.
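The discriminant-function machinery described above can be illustrated with a minimal two-class linear discriminant that classifies by Mahalanobis distance to each class mean under a pooled covariance matrix. The predictors and class separations below are invented, and the full method's predictor-block selection and entropy criterion are not reproduced.

```python
import numpy as np

def fit_linear_discriminant(X_yes, X_no):
    """Two-class linear discriminant with a pooled covariance matrix:
    U(x) > 0 when x is closer (in Mahalanobis distance) to the event class."""
    mu1, mu0 = X_yes.mean(axis=0), X_no.mean(axis=0)
    pooled = (np.cov(X_yes.T) * (len(X_yes) - 1) +
              np.cov(X_no.T) * (len(X_no) - 1)) / (len(X_yes) + len(X_no) - 2)
    P = np.linalg.inv(pooled)
    def U(x):
        d1 = (x - mu1) @ P @ (x - mu1)          # squared Mahalanobis distances
        d0 = (x - mu0) @ P @ (x - mu0)
        return d0 - d1                           # > 0 favours the "event" class
    return U

rng = np.random.default_rng(6)
# Synthetic predictor vectors (e.g. instability indices) for storm / calm days.
storm = rng.normal([2.0, 1.0, 3.0], 1.0, (200, 3))
calm = rng.normal([0.0, 0.0, 0.0], 1.0, (300, 3))
U = fit_linear_discriminant(storm, calm)
print(U(np.array([2.0, 1.0, 3.0])) > 0, U(np.zeros(3)) > 0)   # True False
```

Evaluating U(X) on a model's prognostic fields at every grid node, and thresholding it, is what turns such a rule into an automated alternative (yes/no) forecast.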

  8. Residual uncertainty estimation using instance-based learning with applications to hydrologic forecasting

    NASA Astrophysics Data System (ADS)

    Wani, Omar; Beckers, Joost V. L.; Weerts, Albrecht H.; Solomatine, Dimitri P.

    2017-08-01

A non-parametric method is applied to quantify residual uncertainty in hydrologic streamflow forecasting. This method acts as a post-processor on deterministic model forecasts and generates a residual-uncertainty distribution. Based on instance-based learning, it uses a k-nearest-neighbour search for similar historical hydrometeorological conditions to determine uncertainty intervals from a set of historical errors, i.e. discrepancies between past forecasts and observations. The performance of this method is assessed using test cases of hydrologic forecasting in two UK rivers: the Severn and the Brue. Retrospective forecasts were made and their uncertainties estimated using kNN resampling and two alternative uncertainty estimators: quantile regression (QR) and uncertainty estimation based on local errors and clustering (UNEEC). Results show that kNN uncertainty estimation produces accurate and narrow uncertainty intervals with good probability coverage. Analysis also shows that the performance of this technique depends on the choice of search space. Nevertheless, the accuracy and reliability of uncertainty intervals generated using kNN resampling are at least comparable to those produced by QR and UNEEC. It is concluded that kNN uncertainty estimation is an interesting alternative to other post-processors, such as QR and UNEEC, for estimating forecast uncertainty. Apart from its concept being simple and well understood, an advantage of this method is that it is relatively easy to implement.
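The kNN resampling idea, searching historical records for similar hydrometeorological conditions and reading uncertainty bounds off the empirical distribution of the associated errors, can be sketched as follows. The features, the heteroscedastic error model, and the choices of k and quantile levels are all invented for illustration.

```python
import numpy as np

def knn_error_quantiles(hist_features, hist_errors, x_new, k=50,
                        quantiles=(0.05, 0.95)):
    """Find the k historical forecasts issued under conditions most similar
    to x_new and return empirical quantiles of their errors."""
    dist = np.linalg.norm(hist_features - x_new, axis=1)
    nearest = np.argsort(dist)[:k]
    return np.quantile(hist_errors[nearest], quantiles)

rng = np.random.default_rng(7)
# Historical conditions: [forecast flow, recent rainfall], both scaled to [0, 1].
feats = rng.uniform(0, 1, (2000, 2))
# Heteroscedastic errors: wetter conditions -> larger forecast errors.
errors = rng.normal(0, 0.2 + 1.5 * feats[:, 1])

lo_dry, hi_dry = knn_error_quantiles(feats, errors, np.array([0.5, 0.05]))
lo_wet, hi_wet = knn_error_quantiles(feats, errors, np.array([0.5, 0.95]))
print(round(hi_dry - lo_dry, 2), round(hi_wet - lo_wet, 2))  # wet interval wider
```

Adding the interval to the deterministic forecast yields the predictive bounds; the "choice of search space" discussed in the abstract corresponds to which features enter `hist_features` and how they are scaled.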

  9. Probability for Weather and Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error.
The aims of different ensemble strategies, and fundamental differences in ensemble design to support decision making versus to advance science, are noted. It is argued that, just as no point forecast is complete without an estimate of its accuracy, no model-based probability forecast is complete without an estimate of its own irrelevance. The same nonlinearities that made the electronic computer so valuable link the selection and assimilation of observations, the formation of ensembles, the evolution of models, the casting of model simulations back into observables, and the presentation of this information to those who use it to take action or to advance science. Timescales of interest exceed the lifetime of a climate model and the career of a climate scientist, disarming the trichotomy that led to swift advances in weather forecasting. Providing credible, informative climate services is a more difficult task. In this context, the value of comparing the forecasts of simulation models not only with each other but also with the performance of simple empirical models, whenever possible, is stressed. The credibility of meteorology is based on its ability to forecast and explain the weather. The credibility of climatology will always be based on flimsier stuff. Solid insights of climate science may be obscured if the severe limits on our ability to see the details of the future, even probabilistically, are not communicated clearly.

  10. Tourism forecasting using modified empirical mode decomposition and group method of data handling

    NASA Astrophysics Data System (ADS)

    Yahya, N. A.; Samsudin, R.; Shabri, A.

    2017-09-01

In this study, a hybrid model combining a modified Empirical Mode Decomposition (EMD) with the Group Method of Data Handling (GMDH) is proposed for tourism forecasting. The approach reconstructs the intrinsic mode functions (IMFs) produced by EMD using a trial-and-error method. The new component and the remaining IMFs are then predicted separately with the GMDH model, and the forecasts for each component are aggregated to construct an ensemble forecast. The data used in this experiment are monthly time series of tourist arrivals from China, Thailand and India to Malaysia from 2000 to 2016. Model performance is evaluated using the Root Mean Square Error (RMSE) and the Mean Absolute Percentage Error (MAPE), with the conventional GMDH model and the EMD-GMDH model as benchmarks. Empirical results show that the proposed model produces better forecasts than the benchmark models.

  11. Purposes and methods of scoring earthquake forecasts

    NASA Astrophysics Data System (ADS)

    Zhuang, J.

    2010-12-01

Studies of earthquake prediction or forecasting serve two kinds of purpose: one is to give a systematic estimate of earthquake risk in a particular region and period, as advice to governments and enterprises for disaster reduction; the other is to search for reliable precursors that can be used to improve earthquake prediction or forecasts. For the first purpose a complete score is necessary, while for the second a partial score, which can evaluate whether the forecasts or predictions have some advantage over a well-known reference model, suffices. This study reviews different scoring methods for evaluating the performance of earthquake predictions and forecasts. In particular, the recently developed gambling scoring method shows its capacity to find the good points of an earthquake prediction algorithm or model that are absent from a reference model, even if its overall performance is no better than that of the reference model.

  12. Statistical Post-Processing of Wind Speed Forecasts to Estimate Relative Economic Value

    NASA Astrophysics Data System (ADS)

    Courtney, Jennifer; Lynch, Peter; Sweeney, Conor

    2013-04-01

The objective of this research is to obtain the best possible wind speed forecasts for the wind energy industry by using an optimal combination of well-established forecasting and post-processing methods. We start with the ECMWF 51-member ensemble prediction system (EPS), which is underdispersive and hence uncalibrated. We aim to produce wind speed forecasts that are more accurate and better calibrated than the EPS. The 51 members of the EPS are clustered to 8 weighted representative members (RMs), chosen to minimize the within-cluster spread while maximizing the inter-cluster spread. The forecasts are then downscaled using two limited area models, WRF and COSMO, at two resolutions, 14 km and 3 km. This process creates four distinguishable ensembles, which are used as input to statistical post-processing methods requiring multi-model forecasts. Two such methods are presented here. The first, Bayesian Model Averaging, has been proven to provide more calibrated and accurate wind speed forecasts than the ECMWF EPS using this multi-model input data. The second, heteroscedastic censored regression, also indicates positive results. We compare the two post-processing methods, applied to a year of hindcast wind speed data around Ireland, using an array of deterministic and probabilistic verification techniques, such as MAE, CRPS, probability integral transforms and verification rank histograms, to show which method provides the most accurate and calibrated forecasts. However, the value of a forecast to an end-user cannot be fully quantified by just the accuracy and calibration measurements mentioned, as the relationship between skill and value is complex. Capturing the full potential of the forecast benefits also requires detailed knowledge of the end-users' weather-sensitive decision-making processes and, most importantly, the economic impact on their income.
Finally, we present the continuous relative economic value of both post-processing methods to identify which is more beneficial to the wind energy industry of Ireland.
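The relative economic value mentioned in the closing sentence is typically computed with the static cost-loss model, in which each user is characterised by the ratio of the cost of protecting to the loss if the event strikes unprotected. A minimal sketch, using hypothetical verification statistics (hit rate, false-alarm rate, base rate) rather than the study's actual numbers:

```python
import numpy as np

def relative_value(hit_rate, false_alarm_rate, base_rate, cost_loss):
    """Relative economic value in the static cost-loss model:
    1 = value of a perfect forecast, 0 = no better than climatology."""
    e_clim = np.minimum(cost_loss, base_rate)        # best climatological policy
    e_perfect = cost_loss * base_rate                # protect only on event days
    e_forecast = (cost_loss * (hit_rate * base_rate
                               + false_alarm_rate * (1.0 - base_rate))
                  + base_rate * (1.0 - hit_rate))    # protections plus misses
    return (e_clim - e_forecast) / (e_clim - e_perfect)

# Hypothetical verification numbers for a calibrated high-wind forecast.
alphas = np.linspace(0.05, 0.95, 19)                 # users' cost/loss ratios
v = relative_value(hit_rate=0.8, false_alarm_rate=0.1,
                   base_rate=0.3, cost_loss=alphas)
print(round(float(v.max()), 2))                      # peaks where alpha = base rate
```

Plotting v against the cost/loss ratio gives the continuous value curve; different post-processing methods can then be compared by which curve lies higher for the ratios relevant to wind energy operators.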

  13. Assessing the Value of Frost Forecasts to Orchardists: A Dynamic Decision-Making Approach.

    NASA Astrophysics Data System (ADS)

    Katz, Richard W.; Murphy, Allan H.; Winkler, Robert L.

    1982-04-01

The methodology of decision analysis is used to investigate the economic value of frost (i.e., minimum temperature) forecasts to orchardists. First, the fruit-frost situation and previous studies of the value of minimum temperature forecasts in this context are described. Then, after a brief overview of decision analysis, a decision-making model for the fruit-frost problem is presented. The model involves identifying the relevant actions and events (or outcomes), specifying the effect of taking protective action, and describing the relationships among temperature, bud loss, and yield loss. A bivariate normal distribution is used to model the relationship between forecast and observed temperatures, thereby characterizing the quality of different types of information. Since the orchardist wants to minimize expenses (or maximize payoffs) over the entire frost-protection season, and since current actions and outcomes at any point in the season are related to both previous and future actions and outcomes, the decision-making problem is inherently dynamic in nature. As a result, a class of dynamic models known as Markov decision processes is considered. A computational technique called dynamic programming is used in conjunction with these models to determine the optimal actions and to estimate the value of meteorological information. Some results concerning the value of frost forecasts to orchardists in the Yakima Valley of central Washington are presented for the cases of Red Delicious apples, Bartlett pears, and Elberta peaches. Estimates of the parameter values in the Markov decision process are obtained from relevant physical and economic data. Twenty years of National Weather Service forecast and observed temperatures for the Yakima key station are used to estimate the quality of different types of information, including perfect forecasts, current forecasts, and climatological information.
The orchardist's optimal actions over the frost-protection season and the expected expenses associated with the use of such information are determined using a dynamic programming algorithm. The value of meteorological information is defined as the difference between the expected expense for the information of interest and the expected expense for climatological information. Over the entire frost-protection season, the value estimates (in 1977 dollars) for current forecasts were $808 per acre for Red Delicious apples, $492 per acre for Bartlett pears, and $270 per acre for Elberta peaches. These amounts account for 66, 63, and 47%, respectively, of the economic value associated with decisions based on perfect forecasts. Varying the quality of the minimum temperature forecasts reveals that the relationship between the accuracy and value of such forecasts is nonlinear, and that improvements in current forecasts would not be as significant in terms of economic value as were comparable improvements in the past. Several possible extensions of this study of the value of frost forecasts to orchardists are briefly described. Finally, the application of the dynamic model formulated in this paper to other decision-making problems involving the use of meteorological information is mentioned.
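The backward-induction logic of the Markov decision process can be shown with a deliberately tiny version of the frost-protection problem: one binary state (crop alive or lost), perfectly effective protection, and a handful of nights with forecast frost probabilities. All the numbers are invented, and the paper's model is far richer (bud loss, yield curves, a bivariate normal forecast-observation model).

```python
def optimal_policy(frost_probs, protect_cost, crop_value):
    """Backward induction: minimise the expected seasonal loss when, each
    night, the orchardist either pays protect_cost or risks the crop."""
    expected = 0.0                 # expected future loss, crop still alive
    actions = []
    for p in reversed(frost_probs):            # work backwards from harvest
        protect = protect_cost + expected              # pay; crop survives
        risk = p * crop_value + (1.0 - p) * expected   # gamble on no frost
        if protect <= risk:
            actions.append("protect"); expected = protect
        else:
            actions.append("risk"); expected = risk
    return expected, list(reversed(actions))

season = [0.05, 0.40, 0.70, 0.20, 0.02]   # forecast frost probabilities by night
cost, policy = optimal_policy(season, protect_cost=30.0, crop_value=1000.0)
print(cost, policy)   # 140.0 ['protect', 'protect', 'protect', 'protect', 'risk']
```

Note that the optimal action on an early night depends on the probabilities of all later nights through the `expected` term, which is exactly why a myopic night-by-night rule can be suboptimal and a dynamic programme is needed.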

  14. First Assessment of Itaipu Dam Ensemble Inflow Forecasting System

    NASA Astrophysics Data System (ADS)

    Mainardi Fan, Fernando; Machado Vieira Lisboa, Auder; Gomes Villa Trinidad, Giovanni; Rógenes Monteiro Pontes, Paulo; Collischonn, Walter; Tucci, Carlos; Costa Buarque, Diogo

    2017-04-01

    Inflow forecasting for hydropower plant (HPP) dams is one of the prominent uses of hydrological forecasts. A very important HPP in terms of energy generation for South America is the Itaipu Dam, located on the Paraná River between Brazil and Paraguay, with a drainage area of 820,000 km2. In this work, we present the development of an ensemble forecasting system for Itaipu, operational since November 2015. The system is based on the MGB-IPH hydrological model, includes hydrodynamic simulation of the main river, and is run every morning forced by seven different rainfall forecasts: (i) CPTEC-ETA 15 km; (ii) CPTEC-BRAMS 5 km; (iii) SIMEPAR WRF Ferrier; (iv) SIMEPAR WRF Lin; (v) SIMEPAR WRF Morrison; (vi) SIMEPAR WRF WDM6; (vii) SIMEPAR MEDIAN. The last one (vii) is the median of the rainfall forecasts from the SIMEPAR WRF model versions (iii to vi). In addition to the developed system, the "traditional" method of generating inflow forecasts for the Itaipu Dam is also run every day. This traditional method approximates the future inflow from the discharge tendency at upstream telemetric gauges. After all the forecasts are run, the hydrology team of Itaipu develops a consensus forecast, based on all of the results, which is the one used for the operation of the Itaipu HPP Dam. After one year of operation, a first evaluation of the ensemble forecasting system was conducted. Results show that the system performs satisfactorily for rising flows up to a five-day lead time. However, in some cases most ensemble members issued false alarms, and the system did not outperform the traditional method in all cases, especially during hydrograph recessions. Regarding the meteorological forecasts, the use of some members is being discontinued. Regarding the hydrodynamic representation, better information on river cross sections could improve the forecasts of hydrograph recession curves. These opportunities for improvement are currently being addressed in the system's next update.
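    The SIMEPAR MEDIAN member described above is simply the day-by-day median of the four SIMEPAR WRF rainfall members. A minimal sketch of that aggregation (member names follow the listing above; the rainfall values are invented for illustration):

```python
import numpy as np

# Hypothetical 10-day rainfall forecasts (mm/day) from the four
# SIMEPAR WRF microphysics configurations; values are illustrative only.
members = {
    "wrf_ferrier":  np.array([12.0, 8.5, 0.0, 3.2, 1.0, 0.0, 0.0, 5.5, 9.0, 2.0]),
    "wrf_lin":      np.array([10.5, 9.0, 0.5, 2.8, 0.0, 0.0, 1.0, 6.0, 8.0, 1.5]),
    "wrf_morrison": np.array([14.0, 7.5, 0.0, 4.0, 0.5, 0.0, 0.0, 4.5, 10.0, 2.5]),
    "wrf_wdm6":     np.array([11.0, 8.0, 0.0, 3.5, 0.5, 0.0, 0.5, 5.0, 9.5, 2.0]),
}

# The "SIMEPAR MEDIAN" member: the day-by-day median across the
# four WRF configurations.
stacked = np.stack(list(members.values()))   # shape (4, 10)
median_member = np.median(stacked, axis=0)   # shape (10,)
```

    The median is robust to a single outlying member, which is one reason median aggregation is a common pragmatic choice for multi-physics rainfall ensembles.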

  15. Gridded Calibration of Ensemble Wind Vector Forecasts Using Ensemble Model Output Statistics

    NASA Astrophysics Data System (ADS)

    Lazarus, S. M.; Holman, B. P.; Splitt, M. E.

    2017-12-01

    A computationally efficient method is developed that performs gridded post-processing of ensemble wind vector forecasts. An extensive set of idealized WRF model simulations is generated to provide physically consistent high-resolution winds over a coastal domain characterized by an intricate land/water mask. Ensemble model output statistics (EMOS) is used to calibrate the ensemble wind vector forecasts at observation locations. The local EMOS predictive parameters (mean and variance) are then spread throughout the grid using flow-dependent statistical relationships extracted from the downscaled WRF winds. Using data withholding and 28 east central Florida stations, the method is applied to one year of 24 h wind forecasts from the Global Ensemble Forecast System (GEFS). Compared to the raw GEFS, the approach improves both deterministic and probabilistic forecast skill. Analysis of multivariate rank histograms indicates that the post-processed forecasts are calibrated. Two downscaling case studies are presented: a quiescent easterly flow event and a frontal passage. Strengths and weaknesses of the approach are presented and discussed.
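    EMOS fits a predictive normal distribution whose mean is an affine function of the ensemble mean and whose variance is an affine function of the ensemble variance. A simplified single-component sketch on synthetic data (operational EMOS estimates the coefficients by minimum-CRPS optimization; the least-squares and moment fits below, and all the data, are illustrative stand-ins):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: 200 past cases, 20-member ensemble of one
# wind component (m/s); "truth" is a biased, noisy function of the mean.
n_cases, n_mem = 200, 20
ens = rng.normal(5.0, 2.0, (n_cases, n_mem)) + rng.normal(0, 1.0, (n_cases, 1))
obs = 0.8 * ens.mean(axis=1) + 1.0 + rng.normal(0, 0.7, n_cases)

ens_mean = ens.mean(axis=1)
ens_var = ens.var(axis=1, ddof=1)

# EMOS predictive mean: mu = a + b * ensemble mean (least-squares fit
# here, as a simple surrogate for minimum-CRPS estimation).
b, a = np.polyfit(ens_mean, obs, 1)
mu = a + b * ens_mean

# EMOS predictive variance: sigma^2 = c + d * ensemble variance, fitted
# crudely by regressing squared residuals on the ensemble variance.
resid2 = (obs - mu) ** 2
d, c = np.polyfit(ens_var, resid2, 1)
sigma2 = np.clip(c + d * ens_var, 1e-6, None)  # keep variance positive
```

    In the gridded approach of the study, the locally fitted (mu, sigma2) pairs would then be interpolated across the grid using the flow-dependent relationships from the downscaled WRF winds.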

  16. Forecasting the onset of an allergic risk to poaceae in Nancy and Strasbourg (France) with different methods.

    PubMed

    Cassagne, E; Caillaud, P D; Besancenot, J P; Thibaudon, M

    2007-10-01

    Poaceae pollen is, together with birch pollen, among the most allergenic in Europe. It is therefore useful to develop models to help pollen allergy sufferers. The objective of this study was to construct forecast models able to predict the first day with a given level of allergic risk, called here the Starting Date of the Allergic Risk (SDAR). The models result from four forecast methods used in the literature (three summing methods and one multiple regression analysis). They were applied to Nancy and Strasbourg for the period 1988 to 2005 and were tested on 2006. The Mean Absolute Error and an actual forecast-ability test were used to select the best models and to assess and compare their accuracy. On the whole, all the models showed good and broadly equivalent forecast accuracy. All were reliable and were used to forecast the SDAR in 2006, with contrasting results in forecasting precision.
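    A common form of "summing" method for onset dates is a temperature (degree-day) sum: degrees above a base temperature are accumulated from the start of the year, and onset is predicted when the sum crosses a threshold calibrated on past seasons. A minimal sketch with invented temperatures, base temperature and threshold (the abstract does not give the study's actual parameters):

```python
import numpy as np

base_temp = 5.0   # assumed base temperature (deg C)
threshold = 60.0  # assumed calibrated degree-day threshold

# Hypothetical daily mean temperatures (deg C) from 1 January.
temps = np.array([2.0, 4.0, 6.0, 8.0, 7.0, 9.0, 11.0, 12.0, 10.0,
                  13.0, 14.0, 12.0, 15.0, 16.0])

# Accumulate only the degrees above the base temperature.
gdd = np.cumsum(np.clip(temps - base_temp, 0.0, None))

# Predicted SDAR: first day on which the sum reaches the threshold.
sdar_index = int(np.argmax(gdd >= threshold))  # 0-based day index
```

    In practice, the base temperature, start date and threshold would each be calibrated against the observed SDAR in the 1988-2005 training seasons, e.g. by minimizing the mean absolute error in onset date.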

  17. Assessing the skill of seasonal precipitation and streamflow forecasts in sixteen French catchments

    NASA Astrophysics Data System (ADS)

    Crochemore, Louise; Ramos, Maria-Helena; Pappenberger, Florian

    2015-04-01

    Meteorological centres make sustained efforts to provide seasonal forecasts that are increasingly skilful. Streamflow forecasting is one of the many applications that can benefit from these efforts. Seasonal flow forecasts, generated by feeding seasonal ensemble precipitation forecasts into a hydrological model, can support anticipatory measures for water supply reservoir operation or drought risk management. The objective of this study is to assess the skill of seasonal precipitation and streamflow forecasts in France. First, we evaluated the skill of ECMWF System 4 (SYS4) seasonal precipitation forecasts for streamflow forecasting in sixteen French catchments. Daily flow forecasts were produced using raw seasonal precipitation forecasts as input to the GR6J hydrological model. Ensemble forecasts are issued every month, with 15 or 51 members depending on the month of the year, and are evaluated for up to 90 days ahead. In a second step, we applied eight variants of bias correction to the precipitation forecasts prior to generating the flow forecasts, based on the linear scaling and distribution mapping methods. The skill of the ensemble forecasts was assessed in terms of accuracy (MAE), reliability (PIT diagrams) and overall performance (CRPS). The results show that, in most catchments, raw seasonal precipitation and streamflow forecasts are more skilful in accuracy and overall performance than a reference prediction based on historic observed precipitation and watershed initial conditions at the time of forecast. Reliability is the only attribute that is not significantly improved. The skill of the forecasts is, in general, further improved by bias correction. Two bias correction methods performed best for the studied catchments: simple linear scaling of monthly values and empirical distribution mapping of daily values. L. Crochemore is funded by the Interreg IVB DROP project (Benefit of governance in DROught adaPtation).
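    Monthly linear scaling, one of the two best-performing bias-correction variants in the study, multiplies each forecast by the ratio of observed to forecast climatology for its calendar month. A minimal sketch with invented climatologies:

```python
import numpy as np

# Hypothetical Jan-Dec precipitation climatologies (mm/month): observed
# vs. the forecast system's own climatology over a common past period.
obs_clim = np.array([80., 70., 65., 60., 75., 50., 40., 45., 55., 70., 85., 90.])
fc_clim  = np.array([95., 85., 70., 55., 80., 60., 50., 40., 60., 80., 95., 100.])

# One multiplicative correction factor per calendar month.
scale = obs_clim / fc_clim

def linear_scaling(precip_fc, month_idx):
    """Apply the calendar month's factor to a forecast precipitation value."""
    return precip_fc * scale[month_idx]

corrected = linear_scaling(100.0, 0)  # a January forecast of 100 mm
```

    Distribution (quantile) mapping, the other well-performing variant, instead maps each forecast value through the empirical CDFs of forecast and observed daily precipitation, correcting the whole distribution rather than only the mean.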

  18. Forecasting biodiversity in breeding birds using best practices

    PubMed Central

    Taylor, Shawn D.; White, Ethan P.

    2018-01-01

    Biodiversity forecasts are important for conservation, management, and evaluating how well current models characterize natural systems. While the number of biodiversity forecasts is increasing, there is little information available on how well they work. Most biodiversity forecasts are not evaluated to determine how well they predict future diversity, fail to account for uncertainty, and do not use time-series data that capture the actual dynamics being studied. We addressed these limitations by using best practices to explore our ability to forecast the species richness of breeding birds in North America. We used hindcasting to evaluate six different modeling approaches for predicting richness. Hindcasts for each method were evaluated annually for a decade at 1,237 sites distributed throughout the continental United States. All models explained more than 50% of the variance in richness, but none of them consistently outperformed a baseline model that predicted constant richness at each site. The best practices implemented in this study directly influenced the forecasts and evaluations. Stacked species distribution models and “naive” forecasts produced poor estimates of uncertainty, and accounting for this caused these models to drop in relative performance compared to the other models. Accounting for observer effects improved model performance overall, but also changed the rank ordering of models because it did not improve the accuracy of the “naive” model. Considering the forecast horizon revealed that prediction accuracy decreased across all models as the time horizon of the forecast increased. To facilitate the rapid improvement of biodiversity forecasts, we emphasize the value of specific best practices in making forecasts and evaluating forecasting methods. PMID:29441230
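    A hindcast evaluation against a constant-richness baseline can be sketched on synthetic data. Here richness is generated as a site-level constant plus noise, i.e. a world in which the baseline is genuinely hard to beat, as the study found for breeding birds; all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic richness for 50 sites over 15 years: per-site constant + noise.
n_sites, n_train, n_test = 50, 10, 5
site_mean = rng.uniform(20, 80, n_sites)
richness = site_mean[:, None] + rng.normal(0, 3, (n_sites, n_train + n_test))

train, test = richness[:, :n_train], richness[:, n_train:]

# Baseline: predict each site's training-period mean for all future years
# ("constant richness at each site").
baseline_pred = train.mean(axis=1, keepdims=True)
baseline_mae = np.abs(test - baseline_pred).mean()

# "Naive" forecast: carry each site's last observed richness forward.
naive_pred = train[:, -1:]
naive_mae = np.abs(test - naive_pred).mean()
```

    Averaging the training years damps observation noise, so on data like these the constant baseline beats the naive carry-forward; a richer model only wins if there is real temporal signal for it to exploit.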

  19. Consistent nonlinear deterministic and stochastic evolution equations for deep to shallow water wave shoaling

    NASA Astrophysics Data System (ADS)

    Vrecica, Teodor; Toledo, Yaron

    2015-04-01

    One-dimensional deterministic and stochastic evolution equations are derived for dispersive nonlinear waves while taking dissipation of energy into account. The deterministic nonlinear evolution equations are formulated using operational calculus, following the approach of Bredmose et al. (2005). Their formulation is extended to include the linear and nonlinear effects of wave dissipation due to friction and breaking. The resulting equation set describes the linear evolution of the velocity potential for each wave harmonic, coupled by quadratic nonlinear terms. These terms describe the nonlinear interactions between triads of waves, which represent the leading-order nonlinear effects in the near-shore region. The equations are translated to the amplitudes of the surface elevation using the approach of Agnon and Sheremet (1997) with the correction of Eldeberky and Madsen (1999). Currently, the only practical way to calculate the surface gravity wave field over large domains is with stochastic wave evolution models. Hence, the above deterministic model is also formulated as a stochastic one using the method of Agnon and Sheremet (1997), with two types of stochastic closure relations (Benney and Saffman, 1966, and Holloway, 1980). These formulations cannot be applied to common wave forecasting models without further manipulation, as they include non-local wave shoaling coefficients (i.e., ones that require integration along the wave rays). Therefore, a localization method was applied (see Stiassnie and Drimer, 2006, and Toledo and Agnon, 2012). This process essentially extracts the local terms that constitute the mean nonlinear energy transfer while discarding the remaining oscillatory terms, which transfer energy back and forth. One of the main findings of this work is that the approximated non-local coefficients behave in two essentially different manners. At intermediate water depths these coefficients indeed consist of rapidly oscillating terms, but as the water becomes shallow they change to exponential growth (or decay) behavior. Hence, the formerly used localization technique cannot be justified in the shallow water region. A new formulation is devised for the localization in shallow water: it approximates the nonlinear non-local shoaling coefficient in shallow water and matches it to the one appropriate for the intermediate water region. This makes the model behavior consistent from deep water through intermediate depths and up to the shallow water regime. Simulations of the model were performed for intermediate and shallow water cases; overall, the model was found to give good results in both regimes. The essential difference between the shallow and intermediate nonlinear shoaling physics is explained via the dominant class III Bragg resonance phenomenon. By inspecting the resonance conditions and the nature of the dispersion relation, it is shown that, unlike in the intermediate water regime, in shallow water the formation of resonant interactions is possible without taking bottom components into account. References: Agnon, Y. & Sheremet, A. 1997 Stochastic nonlinear shoaling of directional spectra. J. Fluid Mech. 345, 79-99. Benney, D. J. & Saffman, P. G. 1966 Nonlinear interactions of random waves. Proc. R. Soc. Lond. A 289, 301-321. Bredmose, H., Agnon, Y., Madsen, P. A. & Schaffer, H. A. 2005 Wave transformation models with exact second-order transfer. Eur. J. Mech. B/Fluids 24 (6), 659-682. Eldeberky, Y. & Madsen, P. A. 1999 Deterministic and stochastic evolution equations for fully dispersive and weakly nonlinear waves. Coastal Engineering 38, 1-24. Kaihatu, J. M. & Kirby, J. T. 1995 Nonlinear transformation of waves in finite water depth. Phys. Fluids 8, 175-188. Holloway, G. 1980 Oceanic internal waves are not weak waves. J. Phys. Oceanogr. 10, 906-914. Stiassnie, M. & Drimer, N. 2006 Prediction of long forcing waves for harbor agitation studies. J. Waterw. Port Coastal Ocean Eng. 132 (3), 166-171. Toledo, Y. & Agnon, Y. 2012 Stochastic evolution equations with localized nonlinear shoaling coefficients. Eur. J. Mech. B/Fluids 34, 13-18.
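    The role of the dispersion relation in the resonance argument can be made explicit with the standard textbook conditions (a sketch, not a derivation from this paper). For a collinear triad, resonance requires both the frequencies and the wavenumbers to close:

```latex
\text{Triad resonance: } \quad \omega_3 = \omega_1 + \omega_2, \qquad k_3 = k_1 + k_2 .

\text{Intermediate depth: } \quad \omega^2 = g k \tanh(kh) \quad \text{(dispersive)}

\text{Shallow water } (kh \ll 1): \quad \omega \approx k\sqrt{gh} \quad \text{(nondispersive)}
```

    At intermediate depth the $\tanh$ makes $\omega(k)$ nonlinear in $k$, so frequencies that sum do not in general give wavenumbers that sum, and closing the triad requires an additional wavenumber contribution such as a bottom component. In the shallow-water limit $\omega$ is proportional to $k$, so $\omega_3 = \omega_1 + \omega_2$ implies $k_3 = k_1 + k_2$ automatically, consistent with the abstract's conclusion that resonant interactions form in shallow water without bottom components.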

  20. Application of an Ensemble Smoother to Precipitation Assimilation

    NASA Technical Reports Server (NTRS)

    Zhang, Sara; Zupanski, Dusanka; Hou, Arthur; Zupanski, Milija

    2008-01-01

    Assimilation of precipitation in a global modeling system poses a special challenge in that the observation operators for precipitation processes are highly nonlinear. In the variational approach, substantial development work and model simplifications are required to include precipitation-related physical processes in the tangent linear model and its adjoint. An ensemble-based data assimilation algorithm, the Maximum Likelihood Ensemble Smoother (MLES), has been developed to explore the ensemble representation of the precipitation observation operator with nonlinear convection and large-scale moist physics. An ensemble assimilation system based on the NASA GEOS-5 GCM has been constructed to assimilate satellite precipitation data within the MLES framework. The configuration of the smoother takes the time dimension into account for the relationship between state variables and observable rainfall. The full nonlinear forward model ensembles are used to represent components involving the observation operator and its transpose. Several assimilation experiments using satellite precipitation observations have been carried out to investigate the effectiveness of the ensemble representation of the nonlinear observation operator and the data impact of assimilating rain retrievals from the TMI and SSM/I sensors. Preliminary results show that this ensemble assimilation approach is capable of extracting information from nonlinear observations to improve the analysis and forecast, provided the ensemble size is adequate and a suitable localization scheme is applied. In addition to a dynamically consistent precipitation analysis, the assimilation system produces a statistical estimate of the analysis uncertainty.
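    The key point, that an ensemble represents a nonlinear observation operator simply by applying it to each member, so no tangent linear model or adjoint is needed, can be sketched with a toy stochastic ensemble update. This is a generic perturbed-observation scheme on invented data, not the MLES algorithm or GEOS-5 physics:

```python
import numpy as np

rng = np.random.default_rng(2)

n_ens, n_state = 40, 3
truth = np.array([1.0, 2.0, 0.5])

def h(x):
    # Hypothetical nonlinear "rain rate" operator: positive part, squared.
    return np.array([max(x[0] + x[1], 0.0) ** 2])

obs_err = 0.5
y_obs = h(truth) + rng.normal(0, obs_err, 1)

# Prior ensemble; the operator is applied member by member.
ens = truth[None, :] + rng.normal(0, 0.8, (n_ens, n_state))
hx = np.array([h(m) for m in ens])            # shape (n_ens, 1)

# Sample covariances stand in for the linearized operator and its adjoint.
x_pert = ens - ens.mean(axis=0)
y_pert = hx - hx.mean(axis=0)
pxy = x_pert.T @ y_pert / (n_ens - 1)         # state-obs covariance
pyy = y_pert.T @ y_pert / (n_ens - 1) + obs_err ** 2
gain = pxy / pyy                              # Kalman gain (scalar obs)

# Stochastic (perturbed-observation) analysis update of each member.
y_pert_obs = y_obs + rng.normal(0, obs_err, (n_ens, 1))
analysis = ens + (y_pert_obs - hx) @ gain.T
```

    A smoother such as MLES additionally carries the time dimension, relating state variables at earlier times to rainfall observed over a window, but the member-wise application of the nonlinear operator is the same idea.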
