Prediction of a service demand using combined forecasting approach
NASA Astrophysics Data System (ADS)
Zhou, Ling
2017-08-01
Forecasting helps a logistics service provider cut operational and management costs while maintaining service levels. Our case study investigates how to forecast short-term logistics demand for a less-than-truckload (LTL) carrier. A combined approach draws on several forecasting methods simultaneously instead of a single method; it can offset the weaknesses of one method with the strengths of another, improving prediction accuracy. The main issues in combined forecast modeling are how to select the methods to combine and how to determine the weight coefficients among them. The principles of method selection are that each method should suit the forecasting problem itself and that the methods should differ from one another in category as much as possible. Based on these principles, exponential smoothing, ARIMA, and a neural network are chosen to form the combined approach, and a least-squares technique is employed to determine the optimal weight coefficients among the methods. Simulation results show the advantage of the combined approach over the three single methods. The work reported in the paper helps managers select a prediction method in practice.
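The least-squares weighting described above can be sketched in a few lines. All demand numbers below are hypothetical, and the weights are left unconstrained (the paper may additionally require them to be nonnegative or to sum to one):

```python
import numpy as np

def ls_weights(forecasts, actual):
    """Combination weights minimizing ||forecasts @ w - actual||^2.

    forecasts: (periods, methods) array of past single-method forecasts.
    actual:    (periods,) array of realized demand.
    """
    w, *_ = np.linalg.lstsq(forecasts, actual, rcond=None)
    return w

# Hypothetical history for three methods: exponential smoothing, ARIMA, NN.
hist = np.array([
    [100.0, 103.0, 101.0],
    [ 97.0,  99.0,  96.0],
    [108.0, 112.0, 109.0],
    [104.0, 106.0, 103.0],
    [100.0,  98.0, 101.0],
    [106.0, 108.0, 105.0],
])
actual = np.array([102.0, 98.0, 110.0, 105.0, 99.0, 107.0])

w = ls_weights(hist, actual)
# Combined forecast for a new period, given the three single-method forecasts.
combined = np.array([104.0, 106.0, 103.0]) @ w
```

By construction, the in-sample squared error of the combination is no larger than that of any single method, since each single method corresponds to a unit weight vector.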
Individual versus superensemble forecasts of seasonal influenza outbreaks in the United States.
Yamana, Teresa K; Kandula, Sasikiran; Shaman, Jeffrey
2017-11-01
Recent research has produced a number of methods for forecasting seasonal influenza outbreaks. However, differences among the predicted outcomes of competing forecast methods can limit their use in decision-making. Here, we present a method for reconciling these differences using Bayesian model averaging. We generated retrospective forecasts of peak timing, peak incidence, and total incidence for seasonal influenza outbreaks in 48 states and 95 cities using 21 distinct forecast methods, and combined these individual forecasts to create weighted-average superensemble forecasts. We compared the relative performance of these individual and superensemble forecast methods by geographic location, timing of forecast, and influenza season. We find that, overall, the superensemble forecasts are more accurate than any individual forecast method and less prone to producing a poor forecast. Furthermore, we find that these advantages increase when the superensemble weights are stratified according to the characteristics of the forecast or geographic location. These findings indicate that different competing influenza prediction systems can be combined into a single more accurate forecast product for operational delivery in real time.
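A toy version of such a superensemble can be sketched as follows. The skill-based weighting below (weights shrinking exponentially with each method's historical mean squared error) is a simple stand-in for the paper's Bayesian model averaging, and all numbers are hypothetical:

```python
import numpy as np

def superensemble(hist_fc, hist_obs, new_fc):
    """Weighted-average superensemble of competing forecast methods.

    hist_fc:  (seasons, methods) past forecasts, e.g. of peak incidence.
    hist_obs: (seasons,) observed outcomes.
    new_fc:   (methods,) current-season forecasts to combine.
    """
    mse = np.mean((hist_fc - hist_obs[:, None]) ** 2, axis=0)
    w = np.exp(-mse / mse.min())      # better past skill -> larger weight
    w = w / w.sum()
    return float(new_fc @ w), w

# Hypothetical peak-incidence forecasts from three methods over three seasons.
peak_hist = np.array([[52.0, 48.0, 60.0],
                      [41.0, 44.0, 39.0],
                      [55.0, 50.0, 58.0]])
obs = np.array([50.0, 42.0, 54.0])
combined, weights = superensemble(peak_hist, obs, np.array([47.0, 45.0, 51.0]))
```

Because the weights are positive and sum to one, the superensemble forecast always lies within the range of its members, which is one reason it is less prone to producing a very poor forecast.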
NASA Astrophysics Data System (ADS)
Murray, S.; Guerra, J. A.
2017-12-01
One essential component of operational space weather forecasting is the prediction of solar flares. Early flare forecasting work focused on statistical methods based on historical flaring rates, but more complex machine learning methods have been developed in recent years. A multitude of flare forecasting methods are now available; however, it is still unclear which of these performs best, and none are substantially better than climatological forecasts. Current operational space weather centres cannot rely on automated methods alone and generally use statistical forecasts with some human intervention. Space weather researchers are increasingly looking to methods used in terrestrial weather forecasting to improve current techniques. Ensemble forecasting has been used in numerical weather prediction for many years as a way to combine different predictions and obtain a more accurate result. It has proved useful in areas such as magnetospheric modelling and coronal mass ejection arrival analysis; however, it has not yet been implemented in operational flare forecasting. Here we construct ensemble forecasts for major solar flares by linearly combining the full-disk probabilistic forecasts from a group of operational forecasting methods (ASSA, ASAP, MAG4, MOSWOC, NOAA, and Solar Monitor). Forecasts from each method are weighted by a factor that accounts for the method's ability to predict previous events, and several performance metrics (both probabilistic and categorical) are considered. The results provide space weather forecasters with a set of parameters (combination weights, thresholds) from which to select the most appropriate values for constructing the 'best' ensemble forecast probability, according to the performance metric of their choice. In this way different forecasts can be made to fit different end-user needs.
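In outline, such an ensemble can be formed by normalizing skill-based weights and linearly combining the member probabilities. The sketch below weights each method by the inverse of its Brier score on past events, which is one of several plausible choices; all probabilities and outcomes are hypothetical:

```python
import numpy as np

def brier(p, outcome):
    """Brier score of probabilistic forecasts p against 0/1 outcomes (lower is better)."""
    return float(np.mean((p - outcome) ** 2))

def ensemble_probs(hist_probs, hist_events, new_probs):
    """Linearly combine member flare probabilities with weights
    proportional to 1/Brier on past events."""
    scores = np.array([brier(m, hist_events) for m in hist_probs])
    w = 1.0 / scores
    w = w / w.sum()
    return float(new_probs @ w), w

# Hypothetical daily M-class flare probabilities from three methods,
# and the observed flare occurrence (1 = flare).
hist_probs = np.array([[0.2, 0.7, 0.4, 0.1],
                       [0.3, 0.5, 0.6, 0.2],
                       [0.1, 0.9, 0.3, 0.3]])
events = np.array([0.0, 1.0, 1.0, 0.0])
p_ens, w = ensemble_probs(hist_probs, events, np.array([0.25, 0.4, 0.3]))
```

The weights are positive and sum to one, so the ensemble value is a valid probability whenever the member values are.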
Technical note: Combining quantile forecasts and predictive distributions of streamflows
NASA Astrophysics Data System (ADS)
Bogner, Konrad; Liechti, Katharina; Zappa, Massimiliano
2017-11-01
The enhanced availability of many different hydro-meteorological modelling and forecasting systems raises the issue of how to combine this wealth of information optimally. In particular, the use of deterministic and probabilistic forecasts with sometimes widely divergent predicted future streamflow values makes it complicated for decision makers to sift out the relevant information. In this study, multiple sources of streamflow forecast information are aggregated based on several different predictive distributions and quantile forecasts. For this combination, the Bayesian model averaging (BMA) approach, non-homogeneous Gaussian regression (NGR), also known as the ensemble model output statistics (EMOS) technique, and a novel method called Beta-transformed linear pooling (BLP) are applied. With the help of the quantile score (QS) and the continuous ranked probability score (CRPS), the combination results for the Sihl River in Switzerland, with about five years of forecast data, are compared, and the differences between the raw and optimally combined forecasts are highlighted. The results demonstrate the importance of applying proper forecast combination methods for decision makers in the field of flood and water resource management.
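The quantile score used here is the pinball loss. A minimal sketch follows, together with a simple equal-weight pooling of quantile forecasts from two hypothetical systems; the paper's BMA, NGR/EMOS and BLP combinations are more sophisticated than this:

```python
def quantile_score(q, y, tau):
    """Pinball loss of quantile forecast q at level tau against observation y.

    Lower is better; averaging over many forecast cases gives the QS.
    """
    return tau * (y - q) if y >= q else (1.0 - tau) * (q - y)

def pool_quantiles(q_sys1, q_sys2, w=0.5):
    """Quantile-wise weighted average of two systems' quantile forecasts."""
    return [w * a + (1.0 - w) * b for a, b in zip(q_sys1, q_sys2)]

# Hypothetical 10%/50%/90% streamflow quantiles (m^3/s) from two systems.
pooled = pool_quantiles([12.0, 20.0, 35.0], [10.0, 18.0, 40.0])
```

Evaluating `quantile_score` for each pooled quantile against observed flows, and averaging over a verification period, lets the raw and combined systems be ranked on a common scale.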
Selecting Single Model in Combination Forecasting Based on Cointegration Test and Encompassing Test
Jiang, Chuanjin; Zhang, Jing; Song, Fugen
2014-01-01
Combination forecasting takes the characteristics of each single forecasting method into consideration and combines the methods to form a composite, which increases forecasting accuracy. Existing research on combination forecasting selects single models arbitrarily, neglecting the internal characteristics of the forecasting object. After discussing the role of the cointegration test and the encompassing test in the selection of single models, supplemented by empirical analysis, the paper gives the following single-model selection guidance: no more than five suitable single models should be selected from the many alternatives for a given forecasting target, which increases accuracy and stability. PMID:24892061
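A forecast encompassing test can be sketched as a regression of model 1's forecast error on the difference between the two forecasts, y_t − f1_t = λ(f2_t − f1_t) + ε_t: if λ is significantly nonzero, f1 does not encompass f2 and f2 merits inclusion in the combination. A minimal version with synthetic data (the paper's full procedure also involves cointegration testing, omitted here):

```python
import numpy as np

def encompassing_lambda(y, f1, f2):
    """OLS estimate of lambda in  y - f1 = lambda * (f2 - f1) + e,
    with its t-statistic. Large |t| suggests f1 does not encompass f2."""
    d = f2 - f1
    e = y - f1
    lam = float(d @ e / (d @ d))
    resid = e - lam * d
    se = float(np.sqrt(resid @ resid / (len(y) - 1) / (d @ d)))
    return lam, lam / se

# Synthetic example: the outcome draws equally on both forecasts,
# so neither should encompass the other (lambda near 0.5).
rng = np.random.default_rng(0)
f1 = rng.normal(10.0, 1.0, 200)
f2 = rng.normal(10.0, 1.0, 200)
y = 0.5 * f1 + 0.5 * f2 + rng.normal(0.0, 0.1, 200)
lam, t = encompassing_lambda(y, f1, f2)
```

In practice the test would be run pairwise over the candidate single models, discarding any model that is encompassed by another.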
NASA Astrophysics Data System (ADS)
Khan, Valentina; Tscepelev, Valery; Vilfand, Roman; Kulikova, Irina; Kruglova, Ekaterina; Tischenko, Vladimir
2016-04-01
Long-range forecasts at the monthly-seasonal time scale are in great demand among socio-economic sectors for managing climate-related risks and opportunities. At the same time, the quality of long-range forecasts does not fully meet users' application needs. Different approaches, including the combination of different prognostic models, are used in forecast centres to increase prediction skill for specific regions and globally. In the present study, two forecasting methods used in the operational practice of the Hydrometeorological Centre of Russia are considered. One is a synoptic-analogue method for forecasting surface air temperature at the monthly scale. The other is a dynamical system based on the global semi-Lagrangian model SL-AV, developed jointly by the Institute of Numerical Mathematics and the Hydrometeorological Centre of Russia; the seasonal version of this model is used to issue global and regional forecasts at monthly-seasonal time scales. This study presents an evaluation of surface air temperature forecasts generated with the above-mentioned synoptic-statistical and dynamical models, and with their combination, to potentially increase skill scores over Northern Eurasia. The test sample of operational forecasts encompasses the period from 2010 through 2015. The seasonal and interannual variability of the skill scores of these methods is discussed. The quality of all forecasts depends strongly on the inertia of macro-circulation processes, and the skill scores decrease during significant alterations of synoptic fields for both the dynamical and the empirical schemes. The procedure of combining forecasts from the different methods has, in some cases, demonstrated its effectiveness. This study was supported by a grant of the Russian Science Foundation (No. 14-37-00053).
Fuzzy State Transition and Kalman Filter Applied in Short-Term Traffic Flow Forecasting
Deng, Ming-jun; Qu, Shi-ru
2015-01-01
Traffic flow is widely recognized as an important parameter for road traffic state forecasting. The fuzzy state transform and the Kalman filter (KF) have been applied in this field separately. Studies show that the former performs well in forecasting the trend of traffic state variation but often incurs numerical errors, while the latter is good at numerical forecasting but poorly captures time-lag effects. This paper proposes an approach that combines the fuzzy state transform and KF forecasting models. To exploit the advantages of both models, a weighted combination model is proposed in which the combination weights are optimized dynamically by minimizing the sum of squared forecasting errors. Real detection data are used to test the approach. Results indicate that the method performs well for short-term traffic forecasting. PMID:26779258
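For two forecasts combined as w·f1 + (1−w)·f2, the weight minimizing the sum of squared errors has a closed form. A sketch with hypothetical traffic volumes follows; the paper updates this weight dynamically over time rather than computing it once:

```python
def optimal_weight(f1, f2, y):
    """Weight w minimizing sum((w*f1 + (1-w)*f2 - y)^2).

    Setting the derivative to zero gives w = sum(a*b)/sum(a*a),
    with a = f1 - f2 and b = y - f2.
    """
    a = [x1 - x2 for x1, x2 in zip(f1, f2)]
    b = [obs - x2 for obs, x2 in zip(y, f2)]
    return sum(ai * bi for ai, bi in zip(a, b)) / sum(ai * ai for ai in a)

# Hypothetical 5-min traffic volumes: fuzzy-state and Kalman forecasts vs. observed.
fuzzy = [310.0, 295.0, 330.0, 342.0]
kalman = [300.0, 305.0, 318.0, 336.0]
observed = [306.0, 299.0, 325.0, 340.0]
w = optimal_weight(fuzzy, kalman, observed)
combined = [w * a + (1 - w) * b for a, b in zip(fuzzy, kalman)]
```

Since w = 0 and w = 1 are feasible, the combined squared error over the fitting window can never exceed that of either single model.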
Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS)
NASA Astrophysics Data System (ADS)
O'Connor, A.; Kirtman, B. P.; Harrison, S.; Gorman, J.
2016-02-01
Current US Navy forecasting systems cannot easily incorporate extended-range forecasts that can improve mission readiness and effectiveness; ensure safety; and reduce cost, labor, and resource requirements. If Navy operational planners had systems that incorporated these forecasts, they could plan missions using more reliable and longer-term weather and climate predictions. Further, using multi-model forecast ensembles instead of single forecasts would produce higher predictive performance. Extended-range multi-model forecast ensembles, such as those available in the North American Multi-Model Ensemble (NMME), are ideal for system integration because of their high-skill predictions; however, even higher-skill predictions can be produced if forecast model ensembles are combined correctly. While many methods for weighting models exist, choosing the best method in a given environment requires expert knowledge of the models and combination methods. We present an innovative approach that uses machine learning to combine extended-range predictions from multi-model forecast ensembles and generate a probabilistic forecast for any region of the globe up to 12 months in advance. Our machine-learning approach uses 30 years of hindcast predictions to learn patterns of forecast model successes and failures. Each model is assigned a weight for each environmental condition, 100 km² region, and day, given any expected environmental information. These weights are then applied to the respective predictions for the region and time of interest to stitch together a single, coherent probabilistic forecast. Our experimental results demonstrate that our approach produces extended-range probabilistic forecasts for regions and time periods of interest that are superior, in terms of skill, to individual NMME forecast models and commonly weighted models.
The probabilistic forecast leverages the strengths of three NMME forecast models to predict environmental conditions for an area spanning from San Diego, CA to Honolulu, HI, seven months in advance. Key findings include: weighted combinations of models are strictly better than individual models; machine-learned combinations are especially better; and forecasts produced using our approach most often have the highest ranked probability skill score.
Intermittent Demand Forecasting in a Tertiary Pediatric Intensive Care Unit.
Cheng, Chen-Yang; Chiang, Kuo-Liang; Chen, Meng-Yin
2016-10-01
Forecasts of the demand for medical supplies both directly and indirectly affect the operating costs and the quality of care provided by health care institutions. Specifically, overestimating demand induces an inventory surplus, whereas underestimating demand possibly compromises patient safety. Uncertainty in forecasting the consumption of medical supplies generates intermittent demand events. Intermittent demand patterns for medical supplies are generally classified as lumpy, erratic, smooth, and slow-moving. This study was conducted to advance a tertiary pediatric intensive care unit's efforts to achieve high accuracy in forecasting the demand for medical supplies; to this end, several demand forecasting methods were compared in terms of forecast accuracy. The results confirm that applying Croston's method combined with single exponential smoothing yields the most accurate results for forecasting lumpy, erratic, and slow-moving demand, whereas the Simple Moving Average (SMA) method is the most suitable for forecasting smooth demand. In addition, when the classification of demand consumption patterns was combined with the demand forecasting models, the forecasting errors were minimized, indicating that this classification framework can play a role in improving patient safety and reducing inventory management costs in health care institutions.
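Croston's method maintains separate exponentially smoothed estimates of the nonzero demand size and of the interval between demands, updated only when demand occurs; the per-period forecast is their ratio. A sketch (the smoothing constant and the demand series are hypothetical):

```python
def croston(demand, alpha=0.1):
    """Croston's method for intermittent demand (sketch).

    Returns one-step-ahead forecasts: smoothed nonzero demand size
    divided by smoothed inter-demand interval.
    """
    fc = []            # one-step-ahead forecast made before each period
    f = 0.0            # no forecast until the first demand is seen
    z = p = None       # smoothed size and smoothed interval
    q = 1              # periods since the last demand
    for d in demand:
        fc.append(f)
        if d > 0:
            if z is None:            # initialize on first demand
                z, p = float(d), float(q)
            else:
                z += alpha * (d - z)
                p += alpha * (q - p)
            f = z / p
            q = 1
        else:
            q += 1
    return fc

# Hypothetical daily usage of a slow-moving item (mostly zeros).
forecasts = croston([0.0, 3.0, 0.0, 0.0, 4.0, 0.0, 3.0, 0.0])
```

For smooth (non-intermittent) demand the interval estimate stays near one and the method reduces to single exponential smoothing, which is consistent with the study's finding that SMA or SES-type methods suffice there.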
Liu, Dong-jun; Li, Li
2015-01-01
PM2.5 is the main factor in haze-fog pollution in China. In this study, the trend of PM2.5 concentration was first analyzed qualitatively based on mathematical models and simulation. A comprehensive forecasting model (CFM) was then developed based on combination forecasting ideas. The Autoregressive Integrated Moving Average (ARIMA) model, Artificial Neural Networks (ANNs), and the Exponential Smoothing Method (ESM) were used to predict the PM2.5 concentration time series, and the results of the comprehensive forecasting model were obtained by combining the results of the three methods using weights from the Entropy Weighting Method. The trend of PM2.5 concentration in Guangzhou, China was quantitatively forecasted with the comprehensive forecasting model; the results were compared with those of the three single models, and PM2.5 concentrations for the next ten days were predicted. The comprehensive forecasting model balanced the deviations of the single prediction methods and had better applicability, offering a new prediction method for the air quality forecasting field. PMID:26110332
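One common variant of the entropy weighting idea assigns each method a weight based on the entropy of its normalized absolute errors, with lower-entropy (more informative) error sequences receiving larger weight. The sketch below follows that assumption with hypothetical per-day errors; the paper's exact formulation may differ:

```python
import numpy as np

def entropy_weights(abs_errors):
    """Entropy-based combination weights (sketch).

    abs_errors: (periods, methods) array of absolute forecast errors.
    Each method's errors are normalized to proportions p_ij; its entropy is
    e_j = -sum(p_ij * ln p_ij) / ln(n), and weights are proportional to 1 - e_j.
    """
    p = abs_errors / abs_errors.sum(axis=0, keepdims=True)
    n = abs_errors.shape[0]
    e = -(p * np.log(np.where(p > 0, p, 1.0))).sum(axis=0) / np.log(n)
    w = 1.0 - e
    return w / w.sum()

# Hypothetical absolute errors of ARIMA, ANN and ESM over six days.
abs_err = np.array([[2.1, 1.0, 3.2],
                    [1.8, 1.2, 2.9],
                    [2.4, 0.9, 3.5],
                    [2.0, 1.1, 3.0],
                    [1.9, 1.3, 2.8],
                    [2.2, 1.0, 3.1]])
w = entropy_weights(abs_err)
```

The combined forecast is then the weight-w average of the three single-model forecasts for each day.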
Forecasting space weather: Can new econometric methods improve accuracy?
NASA Astrophysics Data System (ADS)
Reikard, Gordon
2011-06-01
Space weather forecasts are currently used in areas ranging from navigation and communication to electric power system operations. The relevant forecast horizons range from as little as 24 h to several days. This paper analyzes the predictability of two major space weather measures using new time series methods, many of them derived from econometrics. The data sets are the Ap geomagnetic index and the solar radio flux at 10.7 cm. The methods tested include nonlinear regressions, neural networks, frequency domain algorithms, GARCH models (which utilize the residual variance), state transition models, and models that combine elements of several techniques. While combined models are complex, they can be programmed using modern statistical software. The data frequency is daily, and forecasting experiments are run over horizons ranging from 1 to 7 days. Two major conclusions stand out. First, the frequency domain method forecasts the Ap index more accurately than any time domain model, including both regressions and neural networks. This finding is very robust and holds for all forecast horizons; combining the frequency domain method with other techniques yields a further small improvement in accuracy. Second, the neural network forecasts the solar flux more accurately than any other method, although at short horizons (2 days or less) the regression and the net yield similar results. The neural net does best when it includes measures of the long-term component in the data.
NASA Astrophysics Data System (ADS)
Areekul, Phatchakorn; Senjyu, Tomonobu; Urasaki, Naomitsu; Yona, Atsushi
Electricity price forecasting is becoming increasingly relevant to power producers and consumers in the new competitive electric power markets when planning bidding strategies in order to maximize their benefits and utilities, respectively. This paper proposes a method to predict hourly electricity prices for next-day electricity markets by combining ARIMA and ANN models. The proposed method is examined on the Australian National Electricity Market (NEM), New South Wales region, for the year 2006. A comparison of the forecasting performance of the ARIMA, ANN, and combined (ARIMA-ANN) models is presented. Empirical results indicate that the ARIMA-ANN model can improve price forecasting accuracy.
[Improved Euler algorithm for trend forecast model and its application to oil spectrum analysis].
Zheng, Chang-song; Ma, Biao
2009-04-01
Oil atomic spectrometric analysis is one of the most important methods for fault diagnosis and state monitoring of large machine equipment, while the grey method is well suited to trend forecasting. Using oil atomic spectrometric analysis results and grey forecast theory, this paper establishes a grey forecast model of the Fe/Cu concentration trend in a power-shift steering transmission. To address a shortcoming of the grey method in trend forecasting, the improved Euler algorithm is put forward for the first time to resolve the problem that the classical grey model's forecast value depends on the first test value, which limits precision. The new method makes the forecast value more precise, as shown in the example. Combined with the threshold values of oil atomic spectrometric analysis, the new method was applied to Fe/Cu concentration forecasting, and early warning of fault information was obtained, so that steps can be taken to prevent the fault. The algorithm can also be extended to state monitoring in industry.
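The classical GM(1,1) grey model underlying this work can be sketched as follows; the paper's improved Euler algorithm modifies how the model's whitening equation is discretized, which is not reproduced here, and the concentration values below are hypothetical:

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Classical GM(1,1) grey forecast (sketch).

    x0: positive observation sequence, e.g. Fe or Cu concentrations (ppm).
    Returns the next `steps` forecast values.
    """
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                  # accumulated series
    z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]    # grey parameters
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # time response function
    x0_hat = np.diff(x1_hat)                            # back to original series
    return x0_hat[-steps:]

# Hypothetical Fe concentration readings (ppm) from successive oil samples.
next_fe = gm11_forecast([12.0, 13.1, 14.5, 15.8, 17.4], steps=1)
```

Comparing the forecast against the spectrometric threshold value is what turns the trend model into an early-warning rule.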
Forecasting peaks of seasonal influenza epidemics.
Nsoesie, Elaine; Marathe, Madhav; Brownstein, John
2013-06-21
We present a framework for near real-time forecast of influenza epidemics using a simulation optimization approach. The method combines an individual-based model and a simple root finding optimization method for parameter estimation and forecasting. In this study, retrospective forecasts were generated for seasonal influenza epidemics using web-based estimates of influenza activity from Google Flu Trends for 2004-2005, 2007-2008 and 2012-2013 flu seasons. In some cases, the peak could be forecasted 5-6 weeks ahead. This study adds to existing resources for influenza forecasting and the proposed method can be used in conjunction with other approaches in an ensemble framework.
Entropy Econometrics for combining regional economic forecasts: A Data-Weighted Prior Estimator
NASA Astrophysics Data System (ADS)
Fernández-Vázquez, Esteban; Moreno, Blanca
2017-10-01
Forecast combination has been studied in econometrics for a long time, and the literature has shown the superior performance of forecast combination over individual predictions. However, there is still controversy on which is the best procedure to specify the forecast weights. This paper explores the possibility of using a procedure based on Entropy Econometrics, which allows setting the weights for the individual forecasts as a mixture of different alternatives. In particular, we examine the ability of the Data-Weighted Prior Estimator proposed by Golan (J Econom 101(1):165-193, 2001) to combine forecasting models in a context of small sample sizes, a relative common scenario when dealing with time series for regional economies. We test the validity of the proposed approach using a simulation exercise and a real-world example that aims at predicting gross regional product growth rates for a regional economy. The forecasting performance of the Data-Weighted Prior Estimator proposed is compared with other combining methods. The simulation results indicate that in scenarios of heavily ill-conditioned datasets the approach suggested dominates other forecast combination strategies. The empirical results are consistent with the conclusions found in the numerical experiment.
NASA Astrophysics Data System (ADS)
Bulatov, S. V.
2018-05-01
The article considers a method of short-term combined forecasting, including theoretical and experimental estimates of the demand for parts for units and assemblies, which makes it possible to obtain the optimum number of spare parts necessary to operate rolling stock without downtime in repair areas.
NASA Astrophysics Data System (ADS)
Rodrigues, Luis R. L.; Doblas-Reyes, Francisco J.; Coelho, Caio A. S.
2018-02-01
A Bayesian method known as Forecast Assimilation (FA) was used to calibrate and combine monthly near-surface temperature and precipitation outputs from seasonal dynamical forecast systems. The simple multimodel (SMM), a method that combines predictions with equal weights, was used as a benchmark. This research focuses on Europe and adjacent regions for predictions initialized in May and November, covering the boreal summer and winter months. The forecast quality of the FA and SMM, as well as of the single seasonal dynamical forecast systems, was assessed using deterministic and probabilistic measures, and a non-parametric bootstrap was used to account for the sampling uncertainty of the forecast quality measures. We show that the FA performs as well as or better than the SMM in regions where the dynamical forecast systems are able to represent the main modes of climate covariability. An illustration is offered with the near-surface temperature over the North Atlantic, the Mediterranean Sea and the Middle East in summer months, associated with the well-predicted first mode of climate covariability. However, the main modes of climate covariability are not well represented in most situations discussed in this study, as the seasonal dynamical forecast systems have limited skill when predicting the European climate; in these situations, the SMM performs better more often.
NASA Astrophysics Data System (ADS)
Rasim; Junaeti, E.; Wirantika, R.
2018-01-01
Accurate forecasting of product sales depends on the forecasting method used. The purpose of this research is to build a motorcycle sales forecasting application using the Fuzzy Time Series method combined with interval determination using an automatic clustering algorithm. Forecasting is carried out on motorcycle sales data from the last ten years, and the forecast error is measured using the Mean Percentage Error (MPE) and Mean Absolute Percentage Error (MAPE). The forecasts obtained for the one-year period in this study achieve good accuracy.
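The two error measures are straightforward; note that in MPE positive and negative errors can cancel, so it indicates bias rather than overall accuracy. A sketch with hypothetical sales figures:

```python
def mpe(actual, forecast):
    """Mean Percentage Error (%); signed, so over- and under-forecasts cancel."""
    return 100.0 * sum((a - f) / a for a, f in zip(actual, forecast)) / len(actual)

def mape(actual, forecast):
    """Mean Absolute Percentage Error (%); always nonnegative."""
    return 100.0 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical monthly motorcycle sales vs. fuzzy-time-series forecasts:
# one month over-forecast by 10%, one under-forecast by 10%.
sales = [100.0, 200.0]
fc = [110.0, 180.0]
```

Here the MPE is zero (the errors cancel) while the MAPE is 10%, which is why both measures are reported together.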
Gas demand forecasting by a new artificial intelligent algorithm
NASA Astrophysics Data System (ADS)
Khatibi B., Vahid; Khatibi, Elham
2012-01-01
Energy demand forecasting is a key issue for consumers and generators in all energy markets in the world. This paper presents a new forecasting algorithm for daily gas demand prediction. The algorithm combines a wavelet transform with forecasting models such as the multi-layer perceptron (MLP), linear regression, or GARCH. The proposed method is applied to real data from the UK gas markets to evaluate its performance. The results show that forecasting accuracy is improved significantly by the proposed method.
How bootstrap can help in forecasting time series with more than one seasonal pattern
NASA Astrophysics Data System (ADS)
Cordeiro, Clara; Neves, M. Manuela
2012-09-01
The search for the future is an appealing challenge in time series analysis. The diversity of forecasting methodologies is inevitable and still expanding. Exponential smoothing methods are the launch platform for modelling and forecasting in time series analysis. Recently this methodology has been combined with bootstrapping, revealing good performance. The Boot.EXPOS algorithm, which uses exponential smoothing and bootstrap methodologies, has shown promising results for forecasting time series with one seasonal pattern. For series with more than one seasonal pattern, the double seasonal Holt-Winters and related exponential smoothing methods were developed. The new challenge was to combine these seasonal methods with the bootstrap, carrying over a resampling scheme similar to that used in the Boot.EXPOS procedure. The performance of this partnership is illustrated on some well-known data sets available in the software.
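The core idea of pairing exponential smoothing with the bootstrap can be sketched as follows: fit a smoothing model, resample its residuals, and simulate forecast paths. This is only in the spirit of Boot.EXPOS (the published algorithm selects the smoothing model automatically and handles seasonality, which is omitted here), and the data are hypothetical:

```python
import random

def ses_fit(y, alpha=0.3):
    """Simple exponential smoothing: returns the final level and the
    one-step-ahead residuals."""
    level = y[0]
    resid = []
    for obs in y[1:]:
        resid.append(obs - level)
        level += alpha * (obs - level)
    return level, resid

def boot_forecast(y, h=3, alpha=0.3, n_boot=500, seed=1):
    """Bootstrap forecast paths: resample SES residuals, add them to the
    evolving level, and average the simulated paths at each horizon."""
    random.seed(seed)
    level, resid = ses_fit(y, alpha)
    paths = []
    for _ in range(n_boot):
        lvl = level
        path = []
        for _ in range(h):
            obs = lvl + random.choice(resid)   # simulated future observation
            path.append(obs)
            lvl += alpha * (obs - lvl)
        paths.append(path)
    return [sum(p[t] for p in paths) / n_boot for t in range(h)]

# Hypothetical series and a 3-step-ahead bootstrap point forecast.
point = boot_forecast([12.0, 13.5, 12.8, 13.9, 13.2, 14.1, 13.6, 14.4], h=3)
```

The simulated paths also yield prediction intervals directly, by taking empirical quantiles across paths at each horizon instead of the mean.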
Hybrid Intrusion Forecasting Framework for Early Warning System
NASA Astrophysics Data System (ADS)
Kim, Sehun; Shin, Seong-Jun; Kim, Hyunwoo; Kwon, Ki Hoon; Han, Younggoo
Recently, cyber attacks have become a serious hindrance to the stability of the Internet. These attacks exploit the interconnectivity of networks, propagate in an instant, and have become more sophisticated and evolutionary. Traditional Internet security systems such as firewalls, IDS, and IPS are limited in their ability to detect recent cyber attacks in advance, as these systems respond only after the attacks have inflicted serious damage. In this paper, we propose a hybrid intrusion forecasting framework for an early warning system. The proposed system utilizes three types of forecasting methods: time-series analysis, probabilistic modeling, and a data mining method. By combining these methods, it is possible to take advantage of each technique's strengths while overcoming its drawbacks. Experimental results show that the hybrid intrusion forecasting method outperforms each of the three individual forecasting methods.
NASA Astrophysics Data System (ADS)
Dutton, John A.; James, Richard P.; Ross, Jeremy D.
2013-06-01
Seasonal probability forecasts produced with numerical dynamics on supercomputers offer great potential value in managing risk and opportunity created by seasonal variability. The skill and reliability of contemporary forecast systems can be increased by calibration methods that use the historical performance of the forecast system to improve the ongoing real-time forecasts. Two calibration methods are applied to seasonal surface temperature forecasts of the US National Weather Service, the European Centre for Medium-Range Weather Forecasts, and a World Climate Service multi-model ensemble created by combining those two forecasts with Bayesian methods. As expected, the multi-model is somewhat more skillful and more reliable than either original model taken alone. The potential value of the multi-model in decision making is illustrated with the profits achieved in simulated trading of a weather derivative. In addition to examining the seasonal models, the article demonstrates that calibrated probability forecasts of weekly average temperatures for leads of 2-4 weeks are also skillful and reliable. The conversion of ensemble forecasts into probability distributions of impact variables is illustrated with degree days derived from the temperature forecasts, and some issues related to loss of stationarity owing to long-term warming are considered. The main conclusion is that properly calibrated probabilistic forecasts possess sufficient skill and reliability to contribute to effective decisions in government and business activities that are sensitive to intraseasonal and seasonal climate variability.
Gan, Ruijing; Chen, Xiaojun; Yan, Yu; Huang, Daizheng
2015-01-01
Accurate incidence forecasting of infectious disease provides potentially valuable insights in its own right. It is critical for early prevention and may contribute to health services management and syndromic surveillance. This study investigates the use of a hybrid algorithm combining the grey model (GM) and back propagation artificial neural networks (BP-ANN) to forecast hepatitis B in China based on yearly numbers of hepatitis B cases, and evaluates the method's feasibility. The results showed that the proposed method has advantages over GM (1, 1) and GM (2, 1) on all the evaluation indexes.
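A hedged sketch of the grey-model half of such a hybrid (the BP-ANN stage and the actual hepatitis data are not reproduced; the yearly counts below are illustrative): GM(1,1) fits the grey differential equation to the accumulated series and extrapolates the original series.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """GM(1,1): fit dx1/dt + a*x1 = b to the accumulated series,
    then extrapolate the original (incremental) series."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                          # accumulated generating series
    z1 = 0.5 * (x1[1:] + x1[:-1])               # background (mean) values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(x1_hat, prepend=0.0)       # back to yearly increments
    return x0_hat[-steps:]

# Toy yearly case counts (not the study's data).
next_year = gm11_forecast([100, 106, 113, 120, 128], steps=1)[0]
```

In the hybrid of the abstract, a neural network would then model the residual structure the grey model misses.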
A Simulation Optimization Approach to Epidemic Forecasting
Nsoesie, Elaine O.; Beckman, Richard J.; Shashaani, Sara; Nagaraj, Kalyani S.; Marathe, Madhav V.
2013-01-01
Reliable forecasts of influenza can aid in the control of both seasonal and pandemic outbreaks. We introduce a simulation optimization (SIMOP) approach for forecasting the influenza epidemic curve. This study represents the final step of a project aimed at using a combination of simulation, classification, statistical and optimization techniques to forecast the epidemic curve and infer underlying model parameters during an influenza outbreak. The SIMOP procedure combines an individual-based model and the Nelder-Mead simplex optimization method. The method is used to forecast epidemics simulated over synthetic social networks representing Montgomery County in Virginia, Miami, Seattle and surrounding metropolitan regions. The results are presented for the first four weeks. Depending on the synthetic network, the peak time could be predicted within a 95% CI as early as seven weeks before the actual peak. The peak infected and total infected were also accurately forecasted for Montgomery County in Virginia within the forecasting period. Forecasting of the epidemic curve for both seasonal and pandemic influenza outbreaks is a complex problem; however, this is a preliminary step and the results suggest that more can be achieved in this area. PMID:23826222
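As an illustration of the optimization step only (not the SIMOP system itself, which couples an individual-based model to the optimizer; the logistic curve and synthetic counts below are assumptions), the Nelder-Mead simplex method can fit an epidemic-curve model to early observations:

```python
import numpy as np
from scipy.optimize import minimize

def logistic(t, K, r, t0):
    # K: final size, r: growth rate, t0: inflection (fastest-growth) week
    return K / (1.0 + np.exp(-r * (t - t0)))

weeks = np.arange(15, dtype=float)
observed = logistic(weeks, K=5000.0, r=0.9, t0=7.0)  # toy "data"

def sse(params):
    K, r, t0 = params
    return float(np.sum((logistic(weeks, K, r, t0) - observed) ** 2))

# Derivative-free simplex search from a rough initial guess.
fit = minimize(sse, x0=[4000.0, 0.5, 6.0], method="Nelder-Mead")
K_hat, r_hat, t0_hat = fit.x
```

Nelder-Mead needs no gradients, which suits objectives defined through simulation runs.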
NASA Astrophysics Data System (ADS)
Ningrum, R. W.; Surarso, B.; Farikhin; Safarudin, Y. M.
2018-03-01
This paper proposes the combination of the Firefly Algorithm (FA) and Chen fuzzy time series forecasting. Most existing fuzzy forecasting methods based on fuzzy time series use a static interval length. Therefore, we apply an artificial intelligence technique, the Firefly Algorithm (FA), to set varying interval lengths for each cluster in the Chen method. The method is evaluated by applying it to the Jakarta Composite Index (IHSG) and comparing it with classical Chen fuzzy time series forecasting. Its performance is verified through simulation using Matlab.
Applications of Principled Search Methods in Climate Influences and Mechanisms
NASA Technical Reports Server (NTRS)
Glymour, Clark
2005-01-01
Forest and grass fires cause economic losses in the billions of dollars in the U.S. alone. In addition, boreal forests constitute a large carbon store; it has been estimated that, were no burning to occur, an additional 7 gigatons of carbon would be sequestered in boreal soils each century. Effective wildfire suppression requires anticipation of locales and times for which wildfire is most probable, preferably with a two to four week forecast, so that limited resources can be efficiently deployed. The United States Forest Service (USFS), and other experts and agencies, have developed several measures of fire risk combining physical principles and expert judgment, and have used them in automated procedures for forecasting fire risk. Forecasting accuracies for some fire risk indices in combination with climate and other variables have been estimated for specific locations, with the value of fire risk index variables assessed by their statistical significance in regressions. In other cases, the MAPSS forecasts [23, 24] for example, forecasting accuracy has been estimated only with simulated data. We describe alternative forecasting methods that predict fire probability by locale and time using statistical or machine learning procedures trained on historical data, and we give comparative assessments of their forecasting accuracy for one fire season, April to October 2003, for all U.S. Forest Service lands. Aside from providing an accuracy baseline for other forecasting methods, the results illustrate the interdependence between the statistical significance of prediction variables and the forecasting method used.
NASA Astrophysics Data System (ADS)
Sardinha-Lourenço, A.; Andrade-Campos, A.; Antunes, A.; Oliveira, M. S.
2018-03-01
Recent research on water demand short-term forecasting has shown that models using univariate time series based on historical data are useful and can be combined with other prediction methods to reduce errors. Water demand in drinking water distribution networks is largely repetitive in nature; under similar meteorological conditions and consumer profiles, this allows the development of a heuristic forecast model that, combined with other autoregressive models, can provide reliable forecasts. In this study, a parallel adaptive weighting strategy for water consumption forecasting for the next 24-48 h, using univariate time series of potable water consumption, is proposed. Two Portuguese potable water distribution networks are used as case studies, where the only input data are water consumption and the national calendar. For the development of the strategy, the Autoregressive Integrated Moving Average (ARIMA) method and a short-term forecast heuristic algorithm are used. Simulations with the model showed that, when using a parallel adaptive weighting strategy, the prediction error can be reduced by 15.96% and the average error by 9.20%. This reduction is important in the control and management of water supply systems. The proposed methodology can be extended to other forecast methods, especially when multiple forecast models are available.
Combining forecast weights: Why and how?
NASA Astrophysics Data System (ADS)
Yin, Yip Chee; Kok-Haur, Ng; Hock-Eam, Lim
2012-09-01
This paper proposes a procedure called forecast weight averaging, which is a specific combination of forecast weights obtained from different methods of constructing forecast weights, for the purpose of improving the accuracy of pseudo out-of-sample forecasting. It is found that under certain specified conditions, forecast weight averaging can lower the mean squared forecast error obtained from model averaging. In addition, we show that in a linear and homoskedastic environment, this superior predictive ability of forecast weight averaging holds true irrespective of whether the coefficients are tested by a t statistic or a z statistic, provided the significance level is within the 10% range. By theoretical proofs and simulation study, we show that model averaging methods such as variance model averaging, simple model averaging and standard error model averaging each produce a mean squared forecast error larger than that of forecast weight averaging. Finally, this result also holds true marginally when applied to business and economic empirical data sets: the Gross Domestic Product (GDP) growth rate, Consumer Price Index (CPI) and Average Lending Rate (ALR) of Malaysia.
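The core idea can be sketched as follows (the three weighting schemes and all numbers are hypothetical, not the paper's): rather than committing to one scheme's weights, average the weight vectors produced by several schemes and combine the model forecasts once.

```python
def normalize(w):
    s = sum(w)
    return [x / s for x in w]

model_forecasts = [10.0, 12.0, 11.0]

# Weight vectors over the three models from three hypothetical schemes.
scheme_weights = [
    normalize([1.0, 1.0, 1.0]),   # simple model averaging
    normalize([0.5, 0.2, 0.3]),   # e.g. inverse-MSE weights
    normalize([0.4, 0.3, 0.3]),   # e.g. inverse-variance weights
]

# Forecast weight averaging: average the weight vectors themselves,
# then form a single combined forecast.
avg_w = [sum(ws[i] for ws in scheme_weights) / len(scheme_weights)
         for i in range(len(model_forecasts))]
combined = sum(w * f for w, f in zip(avg_w, model_forecasts))
```

Since each scheme's weights sum to one, the averaged weights do too, so the combination stays a convex mix of the model forecasts.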
Stochastic demographic forecasting.
Lee, R D
1992-11-01
"This paper describes a particular approach to stochastic population forecasting, which is implemented for the U.S.A. through 2065. Statistical time series methods are combined with demographic models to produce plausible long run forecasts of vital rates, with probability distributions. The resulting mortality forecasts imply gains in future life expectancy that are roughly twice as large as those forecast by the Office of the Social Security Actuary.... Resulting stochastic forecasts of the elderly population, elderly dependency ratios, and payroll tax rates for health, education and pensions are presented." excerpt
NASA Astrophysics Data System (ADS)
Wang, Wei; Zhong, Ming; Cheng, Ling; Jin, Lu; Shen, Si
2018-02-01
Against the background of building a global energy internet, forecasting and analysing the ratio of electric energy to terminal energy consumption has both theoretical and practical significance. This paper first analyses the factors influencing the ratio of electric energy to terminal energy and then uses a combination method to forecast and analyse the global proportion of electric energy. A cointegration model for the proportion of electric energy is then constructed using influencing factors such as the electricity price index, GDP, economic structure, energy use efficiency and total population. Finally, a forecast of the proportion of electric energy is obtained using a combination forecasting model based on the multiple linear regression, trend analysis, and variance-covariance methods. The forecast describes the development trend of the proportion of electric energy over 2017-2050, and the proportion in 2050 is analysed in detail using scenario analysis.
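The variance-covariance (minimum-variance) combination mentioned above can be written down in closed form for two methods; the error variances, covariance and point forecasts below are illustrative assumptions, not values from the paper.

```python
def var_cov_weight(var1, var2, cov12):
    """Weight on method 1 that minimizes the variance of the combined
    error w*e1 + (1-w)*e2, given error variances and covariance."""
    return (var2 - cov12) / (var1 + var2 - 2.0 * cov12)

# Hypothetical error statistics for two component forecasts.
w1 = var_cov_weight(var1=4.0, var2=9.0, cov12=1.0)
combined = w1 * 102.0 + (1 - w1) * 98.0  # hypothetical point forecasts
```

The same idea extends to more methods by inverting the full error covariance matrix.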
Jaber, Abobaker M; Ismail, Mohd Tahir; Altaher, Alsaidi M
2014-01-01
This paper forecasts the daily closing prices of stock markets. We propose a two-stage technique that combines empirical mode decomposition (EMD) with the nonparametric method of local linear quantile (LLQ) regression. We use the proposed technique, EMD-LLQ, to forecast two stock index time series. Detailed experiments are implemented for the proposed method, in which the EMD-LLQ, EMD, and Holt-Winter methods are compared. The proposed EMD-LLQ model is determined to be superior to the EMD and Holt-Winter methods in predicting stock closing prices.
NASA Astrophysics Data System (ADS)
Pokhrel, Prafulla; Wang, Q. J.; Robertson, David E.
2013-10-01
Seasonal streamflow forecasts are valuable for planning and allocation of water resources. In Australia, the Bureau of Meteorology employs a statistical method to forecast seasonal streamflows. The method uses predictors that are related to catchment wetness at the start of a forecast period and to climate during the forecast period. For the latter, a predictor is selected from among a number of lagged climate indices as candidates to give the "best" model in terms of model performance in cross validation. This study investigates two strategies for further improvement in seasonal streamflow forecasts. The first is to combine, through Bayesian model averaging, multiple candidate models with different lagged climate indices as predictors, to take advantage of the different predictive strengths of the multiple models. The second strategy is to introduce additional candidate models, using rainfall and sea surface temperature predictions from a global climate model as predictors. This is to take advantage of the direct simulations of various dynamic processes. The results show that combining forecasts from multiple statistical models generally yields more skillful forecasts than using only the best model and appears to moderate the worst forecast errors. The use of rainfall predictions from the dynamical climate model marginally improves the streamflow forecasts when viewed over all the study catchments and seasons, but the use of sea surface temperature predictions provides little additional benefit.
Latent fluctuation periods and long-term forecasting of the level of Markakol lake
NASA Astrophysics Data System (ADS)
Madibekov, A. S.; Babkin, A. V.; Musakulkyzy, A.; Cherednichenko, A. V.
2018-01-01
Analysis of the time series of the level of Markakol Lake by the "Periodicities" method reveals harmonics with periods of 12 and 14 years in its variations. Verification forecasts of the lake level were computed from the trend tendency and from its combination with these sinusoids, with lead times of 5 and 10 years. Evaluation of the forecast results against new independent data leads to the conclusion that forecasts combining the sinusoids with the trend tendency are better than those from the trend tendency alone, and no worse than the mean-value prediction.
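A sketch of the trend-plus-sinusoids idea (the 12- and 14-year periods come from the abstract; the lake-level series below is synthetic): fit a linear trend plus the two harmonics by least squares and extrapolate over a 5-year lead.

```python
import numpy as np

t = np.arange(40, dtype=float)  # years of record (synthetic)
level = (0.02 * t + 0.5 * np.sin(2 * np.pi * t / 12)
         + 0.3 * np.sin(2 * np.pi * t / 14) + 100.0)

def design(t):
    # Columns: intercept, trend, and sine/cosine pairs for both periods.
    return np.column_stack([
        np.ones_like(t), t,
        np.sin(2 * np.pi * t / 12), np.cos(2 * np.pi * t / 12),
        np.sin(2 * np.pi * t / 14), np.cos(2 * np.pi * t / 14),
    ])

coef, *_ = np.linalg.lstsq(design(t), level, rcond=None)
t_future = np.arange(40, 45, dtype=float)   # 5-year lead time
forecast = design(t_future) @ coef
```

Including both sine and cosine terms for each period lets least squares absorb the unknown phases.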
A Hybrid Approach on Tourism Demand Forecasting
NASA Astrophysics Data System (ADS)
Nor, M. E.; Nurul, A. I. M.; Rusiman, M. S.
2018-04-01
Tourism has become one of the important industries that contribute to the country's economy. Tourism demand forecasting gives valuable information to policy makers, decision makers and organizations related to the tourism industry in order to make crucial decisions and plans. However, it is challenging to produce an accurate forecast, since economic data such as tourism data are affected by social, economic and environmental factors. In this study, an equally-weighted hybrid method, a combination of Box-Jenkins and Artificial Neural Networks, was applied to forecast Malaysia's tourism demand. The forecasting performance was assessed by taking each individual method as a benchmark. The results showed that this hybrid approach outperformed the two individual models.
Counteracting structural errors in ensemble forecast of influenza outbreaks.
Pei, Sen; Shaman, Jeffrey
2017-10-13
For influenza forecasts generated using dynamical models, forecast inaccuracy is partly attributable to the nonlinear growth of error. As a consequence, quantification of the nonlinear error structure in current forecast models is needed so that this growth can be corrected and forecast skill improved. Here, we inspect the error growth of a compartmental influenza model and find that a robust error structure arises naturally from the nonlinear model dynamics. By counteracting these structural errors, diagnosed using error breeding, we develop a new forecast approach that combines dynamical error correction and statistical filtering techniques. In retrospective forecasts of historical influenza outbreaks for 95 US cities from 2003 to 2014, overall forecast accuracy for outbreak peak timing, peak intensity and attack rate, are substantially improved for predicted lead times up to 10 weeks. This error growth correction method can be generalized to improve the forecast accuracy of other infectious disease dynamical models. Inaccuracy of influenza forecasts based on dynamical models is partly due to nonlinear error growth. Here the authors address the error structure of a compartmental influenza model, and develop a new improved forecast approach combining dynamical error correction and statistical filtering techniques.
Superensemble forecasts of dengue outbreaks
Kandula, Sasikiran; Shaman, Jeffrey
2016-01-01
In recent years, a number of systems capable of predicting future infectious disease incidence have been developed. As more of these systems are operationalized, it is important that the forecasts generated by these different approaches be formally reconciled so that individual forecast error and bias are reduced. Here we present a first example of such multi-system, or superensemble, forecast. We develop three distinct systems for predicting dengue, which are applied retrospectively to forecast outbreak characteristics in San Juan, Puerto Rico. We then use Bayesian averaging methods to combine the predictions from these systems and create superensemble forecasts. We demonstrate that on average, the superensemble approach produces more accurate forecasts than those made from any of the individual forecasting systems. PMID:27733698
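A toy version of the Bayesian-averaging combination step (an illustration, not the authors' superensemble code; the log-likelihood scores and peak predictions are invented): weight each system by the likelihood of past observations under its predictions, then average the current forecasts.

```python
import math

def bma_weights(log_likelihoods, prior=None):
    """Posterior model weights from historical log-likelihood scores,
    computed stably via the max-shift trick."""
    n = len(log_likelihoods)
    prior = prior or [1.0 / n] * n
    m = max(log_likelihoods)
    post = [p * math.exp(ll - m) for p, ll in zip(prior, log_likelihoods)]
    s = sum(post)
    return [x / s for x in post]

# Hypothetical scores for three dengue forecasting systems.
weights = bma_weights([-12.0, -10.0, -15.0])
peak_forecasts = [33.0, 35.0, 30.0]   # hypothetical peak-week predictions
superensemble = sum(w * f for w, f in zip(weights, peak_forecasts))
```

The best-scoring system dominates, but poorer systems still contribute, which damps the occasional bad individual forecast.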
Forecasting of the electrical actuators condition using stator’s current signals
NASA Astrophysics Data System (ADS)
Kruglova, T. N.; Yaroshenko, I. V.; Rabotalov, N. N.; Melnikov, M. A.
2017-02-01
This article describes a forecasting method for electrical actuators realized through the combination of Fourier transformation and neural network techniques. The method finds the values of the diagnostic functions for the current operating cycle and estimates the number of operating cycles remaining before the BLDC actuator fails. For forecasting the condition of the actuator, we propose a hierarchical neural network structure aimed at reducing the training time of the neural network and improving estimation accuracy.
Forecasting Container Throughput at the Doraleh Port in Djibouti through Time Series Analysis
NASA Astrophysics Data System (ADS)
Mohamed Ismael, Hawa; Vandyck, George Kobina
The Doraleh Container Terminal (DCT) located in Djibouti has been noted as the most technologically advanced container terminal on the African continent. DCT's strategic location at the crossroads of the main shipping lanes connecting Asia, Africa and Europe puts it in a unique position to provide important shipping services to vessels plying that route. This paper aims to forecast container throughput through the Doraleh Container Port in Djibouti by time series analysis. A selection of univariate forecasting models has been used, namely the Triple Exponential Smoothing model, the Grey model and the Linear Regression model. By utilizing these three models and their combination, a forecast of container throughput through the Doraleh port was produced. A comparison of the forecasting results of the three models, and of the combination forecast, was then undertaken, based on the commonly used evaluation criteria of Mean Absolute Deviation (MAD) and Mean Absolute Percentage Error (MAPE). The study found that the Linear Regression model was the best prediction method for forecasting container throughput, since its forecast error was the least. Based on the regression model, a ten-year forecast of container throughput at DCT was made.
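Triple exponential smoothing, one of the three candidate models, can be sketched compactly (a minimal additive Holt-Winters; the smoothing constants and the toy quarterly throughput series are assumptions, not the port's data):

```python
def holt_winters_additive(x, period, alpha=0.3, beta=0.1, gamma=0.2, h=1):
    """Additive Holt-Winters: smooth level, trend and seasonal
    components, then extrapolate h steps ahead."""
    level = sum(x[:period]) / period
    trend = (sum(x[period:2 * period]) - sum(x[:period])) / period ** 2
    season = [x[i] - level for i in range(period)]
    for i in range(len(x)):
        s = season[i % period]
        last_level = level
        level = alpha * (x[i] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[i % period] = gamma * (x[i] - level) + (1 - gamma) * s
    return [level + (k + 1) * trend + season[(len(x) + k) % period]
            for k in range(h)]

# Toy quarterly container throughput (trend + seasonality).
series = [10, 14, 8, 12, 11, 15, 9, 13, 12, 16, 10, 14]
fcst = holt_winters_additive(series, period=4, h=4)
```

Grey and linear regression forecasts would be produced analogously and compared on MAD and MAPE.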
Hybrid model for forecasting time series with trend, seasonal and calendar variation patterns
NASA Astrophysics Data System (ADS)
Suhartono; Rahayu, S. P.; Prastyo, D. D.; Wijayanti, D. G. P.; Juliyanto
2017-09-01
Most of the monthly time series data in economics and business in Indonesia and other Moslem countries contain not only trend and seasonal patterns, but are also affected by two types of calendar variation effects, i.e. the effects of the number of working or trading days, and holiday effects. The purpose of this research is to develop a hybrid model, a combination of several forecasting models, to predict time series that contain trend, seasonal and calendar variation patterns. This hybrid model combines classical models (namely time series regression and ARIMA models) and/or modern methods (an artificial intelligence method, i.e. Artificial Neural Networks). A simulation study shows that the proposed procedure for building the hybrid model works well for forecasting time series with trend, seasonal and calendar variation patterns. Furthermore, the proposed hybrid model is applied to forecasting real data, i.e. monthly data on the inflow and outflow of currency at Bank Indonesia. The results show that the hybrid model tends to provide more accurate forecasts than the individual forecasting models. Moreover, this result is in line with the results of the M3 competition, i.e. that hybrid models on average provide more accurate forecasts than individual models.
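A two-stage sketch in the spirit of such hybrids (synthetic data, simplified components: the calendar-effect and ANN stages are omitted): a time series regression captures trend and seasonality, and an AR(1) model of the residuals, standing in for the ARIMA stage, supplies the remaining signal.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(120, dtype=float)   # ten years of monthly data (synthetic)
y = 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, 120)

# Stage 1: time series regression on trend + annual harmonics.
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t / 12), np.cos(2 * np.pi * t / 12)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Stage 2: AR(1) on the regression residuals.
phi = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)

# One-step-ahead hybrid forecast for month t = 120.
x_next = np.array([1.0, 120.0, np.sin(2 * np.pi * 120 / 12),
                   np.cos(2 * np.pi * 120 / 12)])
hybrid_forecast = x_next @ beta + phi * resid[-1]
```

In the paper's framework, an ANN could replace or augment the residual model.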
Bayesian flood forecasting methods: A review
NASA Astrophysics Data System (ADS)
Han, Shasha; Coulibaly, Paulin
2017-08-01
Over the past few decades, floods have been among the most common and widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly minimized. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to estimate floods: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty.
In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvement. Future research in the context of Bayesian flood forecasting should focus on the assimilation of newly available sources of information and on improving methods for assessing predictive performance.
Improving Seasonal Crop Monitoring and Forecasting for Soybean and Corn in Iowa
NASA Astrophysics Data System (ADS)
Togliatti, K.; Archontoulis, S.; Dietzel, R.; VanLoocke, A.
2016-12-01
Accurately forecasting crop yield in advance of harvest could greatly benefit farmers; however, few evaluations have been conducted to determine the effectiveness of forecasting methods. We tested one such method, which used short-term weather forecasts from the Weather Research and Forecasting (WRF) model to predict in-season weather variables, such as maximum and minimum temperature, precipitation and radiation, at four different forecast lengths (2 weeks, 1 week, 3 days, and 0 days). These forecasted weather data, along with current and historical (previous 35 years) data from the Iowa Environmental Mesonet, were combined to drive Agricultural Production Systems sIMulator (APSIM) simulations to forecast soybean and corn yields in 2015 and 2016. The goal of this study is to find the forecast length that reduces the variability of simulated yield predictions while also increasing the accuracy of those predictions. APSIM simulations of crop variables were evaluated against bi-weekly field measurements of phenology, biomass, and leaf area index from early- and late-planted soybean plots located at the Agricultural Engineering and Agronomy Research Farm in central Iowa as well as the Northwest Research Farm in northwestern Iowa. WRF model predictions were evaluated against observed weather data collected at the experimental fields. Maximum temperature was the most accurately predicted variable, followed by minimum temperature and radiation; precipitation was least accurate according to RMSE values and the number of days forecasted within 20% error of the observed weather. Our analysis indicated that for the majority of months in the growing season the 3-day forecast performed best, the 1-week forecast came in second, and the 2-week forecast was least accurate. Preliminary results for yield indicate that the 2-week forecast is the least variable of the forecast lengths, but it is also the least accurate; the 3-day and 1-week forecasts have better accuracy, with an increase in variability.
NASA Astrophysics Data System (ADS)
Mel, Riccardo; Viero, Daniele Pietro; Carniello, Luca; Defina, Andrea; D'Alpaos, Luigi
2014-09-01
Providing reliable and accurate storm surge forecasts is important for a wide range of problems related to coastal environments. To adequately support decision-making processes, it has also become increasingly important to be able to estimate the uncertainty associated with a storm surge forecast. The procedure commonly adopted to do this uses the results of a hydrodynamic model forced by a set of different meteorological forecasts; however, this approach requires a considerable, if not prohibitive, computational cost for real-time application. In the present paper we present two simplified methods for estimating the uncertainty affecting storm surge prediction with moderate computational effort. In the first approach we use a computationally fast, statistical tidal model instead of a hydrodynamic numerical model to estimate storm surge uncertainty. The second approach is based on the observation that the uncertainty in the sea level forecast mainly stems from the uncertainty affecting the meteorological fields; this led to the idea of estimating forecast uncertainty via a linear combination of suitable meteorological variances, extracted directly from the meteorological fields. The proposed methods were applied to estimate the uncertainty in the storm surge forecast for the Venice Lagoon. The results clearly show that the uncertainty estimated through a linear combination of suitable meteorological variances closely matches the one obtained using the deterministic approach and overcomes some intrinsic limitations in the use of a statistical tidal model.
NASA Astrophysics Data System (ADS)
Omi, Takahiro; Ogata, Yosihiko; Hirata, Yoshito; Aihara, Kazuyuki
2015-04-01
Because aftershock occurrences can cause significant seismic risks for a considerable time after the main shock, prospective forecasting of intermediate-term aftershock activity as soon as possible is important. The epidemic-type aftershock sequence (ETAS) model with the maximum likelihood estimate effectively reproduces general aftershock activity, including secondary or higher-order aftershocks, and can be employed for such forecasting. However, because we cannot always expect accurate parameter estimation from incomplete early aftershock data in which many events are missing, forecasting using only a single estimated parameter set (plug-in forecasting) can frequently perform poorly. Therefore, we here propose Bayesian forecasting that combines the forecasts by the ETAS model with various probable parameter sets given the data. By conducting forecasting tests of aftershocks over a 1-month period based on the first 1 day of data after the main shock, as an example of early intermediate-term forecasting, we show that Bayesian forecasting performs better than plug-in forecasting on average in terms of the log-likelihood score. Furthermore, to improve forecasting of large aftershocks, we apply a nonparametric (NP) model using magnitude data during the learning period and compare its forecasting performance with that of the Gutenberg-Richter (G-R) formula. We show that the NP forecast performs better than the G-R formula in some cases but worse in others. Therefore, robust forecasting can be obtained by employing an ensemble forecast that combines the two complementary forecasts. Our proposed method is useful for a stable, unbiased intermediate-term assessment of aftershock probabilities.
NASA Astrophysics Data System (ADS)
Wu, Qi
2010-03-01
Demand forecasts play a crucial role in supply chain management. The future demand for a certain product is the basis for the respective replenishment system. For demand series with small samples, seasonality, nonlinearity, randomness and fuzziness, existing support vector kernels do not approximate well the random curve of the sales time series in L2 space (quadratic continuous integral space). In this paper, we present a hybrid intelligent system combining a wavelet kernel support vector machine and particle swarm optimization for demand forecasting. The results of its application to car sales series forecasting show that the forecasting approach based on the hybrid PSOWv-SVM model is effective and feasible; a comparison between the method proposed in this paper and others is also given, showing that, for the discussed example, this method is better than the hybrid PSOv-SVM model and other traditional methods.
Rimaityte, Ingrida; Ruzgas, Tomas; Denafas, Gintaras; Racys, Viktoras; Martuzevicius, Dainius
2012-01-01
Forecasting of the generation of municipal solid waste (MSW) in developing countries is often a challenging task due to the lack of data and the selection of a suitable forecasting method. This article aimed to select and evaluate several methods for MSW forecasting in a medium-sized Eastern European city (Kaunas, Lithuania) with a rapidly developing economy, with respect to affluence-related and seasonal impacts. The MSW generation was forecast with respect to the economic activity of the city (regression modelling) and using time series analysis. The modelling based on social-economic indicators (regression implemented in the LCA-IWM model) showed particular sensitivity (deviation from actual data in the range of 2.2 to 20.6%) to external factors, such as the synergetic effects of affluence parameters or changes in the MSW collection system. For the time series analysis, the combination of autoregressive integrated moving average (ARIMA) and seasonal exponential smoothing (SES) techniques was found to be the most accurate (mean absolute percentage error equal to 6.5). The time series analysis method was very valuable for forecasting the weekly variation of waste generation data (r² > 0.87), but the forecast yearly increase should be verified against the data obtained by regression modelling. The methods and findings of this study may assist the experts, decision-makers and scientists performing forecasts of MSW generation, especially in developing countries.
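The evaluation criterion quoted above, mean absolute percentage error (MAPE), is easy to state precisely; the series and component forecasts below are toy numbers illustrating how a combined (here equally weighted) forecast is scored against the individual ones.

```python
def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, predicted)) / len(actual)

# Toy weekly waste-generation figures and two component forecasts.
actual = [100.0, 110.0, 105.0, 120.0]
arima  = [ 98.0, 113.0, 100.0, 118.0]
ses    = [104.0, 106.0, 108.0, 115.0]
combined = [(a + s) / 2 for a, s in zip(arima, ses)]
```

When the component errors partly cancel, as here, the combined forecast scores a lower MAPE than either member.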
A Case Study on a Combination NDVI Forecasting Model Based on the Entropy Weight Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Shengzhi; Ming, Bo; Huang, Qiang
Accurately predicting the NDVI (Normalized Difference Vegetation Index) is critically meaningful, as it helps guide regional ecological remediation and environmental management. In this study, a combination forecasting model (CFM) was proposed to improve the performance of NDVI predictions in the Yellow River Basin (YRB) based on three individual forecasting models, i.e., the Multiple Linear Regression (MLR), Artificial Neural Network (ANN), and Support Vector Machine (SVM) models. The entropy weight method was employed to determine the weight coefficient for each individual model depending on its predictive performance. Results showed that: (1) ANN exhibits the highest fitting capability among the four forecasting models in the calibration period, whilst its generalization ability becomes weak in the validation period; MLR has a poor performance in both calibration and validation periods; the predicted results of CFM in the calibration period have the highest stability; (2) CFM generally outperforms all individual models in the validation period, and can improve the reliability and stability of predicted results by combining the strengths of the individual models while reducing their weaknesses; (3) the performances of all forecasting models are better in dense vegetation areas than in sparse vegetation areas.
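The entropy weight method itself is a small, concrete computation (the accuracy matrix below is a toy assumption; rows are evaluation periods, columns are the MLR, ANN and SVM models): columns with more dispersed values carry more information (lower entropy) and receive larger weights.

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: normalize each column, compute its
    Shannon entropy, and weight by the divergence 1 - entropy."""
    n, m = len(matrix), len(matrix[0])
    col_sums = [sum(row[j] for row in matrix) for j in range(m)]
    entropies = []
    for j in range(m):
        e = 0.0
        for row in matrix:
            p = row[j] / col_sums[j]
            if p > 0:
                e -= p * math.log(p)
        entropies.append(e / math.log(n))   # normalize to [0, 1]
    d = [1.0 - e for e in entropies]        # divergence per column
    s = sum(d)
    return [x / s for x in d]

acc = [[0.80, 0.90, 0.85],
       [0.82, 0.70, 0.86],
       [0.78, 0.95, 0.84]]
w = entropy_weights(acc)
```

Here the second column varies most across periods, so it receives the largest weight.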
Ensemble averaging and stacking of ARIMA and GSTAR model for rainfall forecasting
NASA Astrophysics Data System (ADS)
Anggraeni, D.; Kurnia, I. F.; Hadi, A. F.
2018-04-01
Unpredictable rainfall changes can affect human activities, such as agriculture, aviation, and shipping, which depend on weather forecasts. Therefore, we need forecasting tools with high accuracy for predicting future rainfall. This research focuses on local forecasting of rainfall in Jember from 2005 to 2016, from 77 rainfall stations. Rainfall at a station here is related not only to that station's own past values but also to those of other stations; this is called the spatial effect. The aim of this research is to apply the GSTAR model to determine whether there are correlations of spatial effect among stations. The GSTAR model is an expansion of the space-time model that combines time-related effects, time series effects across locations (stations), and the locations themselves. The GSTAR model is also compared to the ARIMA model, which completely ignores the independent variables. The forecast values of the ARIMA and GSTAR models are then combined using ensemble forecasting techniques. The averaging and stacking methods of ensemble forecasting provide the best model, with higher accuracy and a smaller RMSE (Root Mean Square Error) value. Finally, with the best model we can offer better local rainfall forecasts for Jember in the future.
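The two ensemble techniques named here can be sketched for a two-member ensemble (standing in for ARIMA and GSTAR; the member forecasts below are made-up numbers): averaging takes the unweighted mean, while stacking fits least-squares weights on past observations.

```python
def rmse(obs, pred):
    """Root Mean Square Error between observations and predictions."""
    return (sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)) ** 0.5

def stack_two(obs, f1, f2):
    """Stacking for two members: least-squares weights (w1, w2)
    minimizing ||w1*f1 + w2*f2 - obs||^2 via the 2x2 normal equations."""
    a11 = sum(x * x for x in f1)
    a22 = sum(x * x for x in f2)
    a12 = sum(x * y for x, y in zip(f1, f2))
    b1 = sum(x * y for x, y in zip(f1, obs))
    b2 = sum(x * y for x, y in zip(f2, obs))
    det = a11 * a22 - a12 * a12
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a12 * b1) / det

f1 = [1.0, 2.0, 3.0, 4.0]                      # e.g. ARIMA member
f2 = [2.0, 1.0, 4.0, 3.0]                      # e.g. GSTAR member
obs = [0.7 * a + 0.3 * b for a, b in zip(f1, f2)]
average = [(a + b) / 2 for a, b in zip(f1, f2)]
w1, w2 = stack_two(obs, f1, f2)
stacked = [w1 * a + w2 * b for a, b in zip(f1, f2)]
```

When the truth really is a weighted blend of the members, stacking recovers the weights and beats plain averaging on RMSE, which is the comparison the abstract reports.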
A FAST BAYESIAN METHOD FOR UPDATING AND FORECASTING HOURLY OZONE LEVELS
A Bayesian hierarchical space-time model is proposed by combining information from real-time ambient AIRNow air monitoring data, and output from a computer simulation model known as the Community Multi-scale Air Quality (Eta-CMAQ) forecast model. A model validation analysis shows...
A comparison of LOD and UT1-UTC forecasts by different combined prediction techniques
NASA Astrophysics Data System (ADS)
Kosek, W.; Kalarus, M.; Johnson, T. J.; Wooden, W. H.; McCarthy, D. D.; Popiński, W.
Stochastic prediction techniques, including autocovariance, autoregressive, autoregressive moving average, and neural network methods, were applied to the UT1-UTC and Length of Day (LOD) International Earth Rotation and Reference Systems Service (IERS) EOPC04 time series to evaluate the capabilities of each method. All known effects such as leap seconds and solid Earth zonal tides were first removed from the observed values of UT1-UTC and LOD. Two combination procedures were applied to predict the resulting LODR time series: 1) the combination of least-squares (LS) extrapolation with a stochastic prediction method, and 2) the combination of discrete wavelet transform (DWT) filtering and a stochastic prediction method. The results of the combination of LS extrapolation with different stochastic prediction techniques were compared with the results of the UT1-UTC prediction method currently used by the IERS Rapid Service/Prediction Centre (RS/PC). It was found that the prediction accuracy depends on the starting prediction epochs, and that for the combined forecast methods, the mean prediction errors for 1 to about 70 days in the future are of the same order as those of the method used by the IERS RS/PC.
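The first combination procedure, LS extrapolation plus a stochastic prediction of the residuals, can be illustrated with the simplest possible ingredients: a least-squares straight line for the deterministic part and an AR(1) model for what the line leaves behind. This is a toy sketch on synthetic data, not the LODR processing chain itself:

```python
def ls_linear_fit(y):
    """Least-squares line y_t = a + b*t for t = 0..n-1."""
    n = len(y)
    t_mean = (n - 1) / 2.0
    y_mean = sum(y) / n
    num = sum((t - t_mean) * (yt - y_mean) for t, yt in enumerate(y))
    den = sum((t - t_mean) ** 2 for t in range(n))
    b = num / den
    return y_mean - b * t_mean, b

def combined_forecast(y, horizon):
    """LS linear extrapolation plus an AR(1) prediction of the residuals."""
    a, b = ls_linear_fit(y)
    resid = [yt - (a + b * t) for t, yt in enumerate(y)]
    den = sum(r * r for r in resid[:-1])
    phi = (sum(r0 * r1 for r0, r1 in zip(resid[:-1], resid[1:])) / den
           if den > 1e-12 else 0.0)             # guard: trend-only series
    t_next = len(y) + horizon - 1
    return a + b * t_next + (phi ** horizon) * resid[-1]

y = [2.0 * t + 1.0 for t in range(10)]   # purely linear synthetic series
pred = combined_forecast(y, 1)           # next value of the trend
```

On a purely linear series the residual model contributes nothing and the forecast is the extrapolated trend; on real LODR data the stochastic term is what carries the short-horizon skill.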
Post-processing of a low-flow forecasting system in the Thur basin (Switzerland)
NASA Astrophysics Data System (ADS)
Bogner, Konrad; Joerg-Hess, Stefanie; Bernhard, Luzi; Zappa, Massimiliano
2015-04-01
Low-flows and droughts are natural hazards with potentially severe impacts and economic loss or damage in a number of environmental and socio-economic sectors. As droughts develop slowly, there is time to prepare for and pre-empt some of these impacts. Real-time information and forecasting of a drought situation can therefore be an effective component of drought management. Although Switzerland has traditionally been more concerned with problems related to floods, in recent years some unprecedented low-flow situations have been experienced. Driven by the climate change debate, a drought information platform has been developed to guide water resources management during situations where water resources drop below critical low-flow levels characterised by the indices duration (time between onset and offset), severity (cumulative water deficit) and magnitude (severity/duration). However, to gain maximum benefit from such an information system it is essential to remove the bias from the meteorological forecast, to derive optimal estimates of the initial conditions, and to post-process the stream-flow forecasts. Quantile mapping methods for pre-processing the meteorological forecasts and improved data assimilation methods for snow measurements, which account for much of the seasonal stream-flow predictability for the majority of the basins in Switzerland, have been tested previously. The objective of this study is the testing of post-processing methods in order to remove bias and dispersion errors and to derive the predictive uncertainty of a calibrated low-flow forecast system. Therefore, various stream-flow error correction methods with different degrees of complexity have been applied and combined with the Hydrological Uncertainty Processor (HUP) in order to minimise the differences between the observations and model predictions and to derive posterior probabilities.
The complexity of the analysed error correction methods ranges from simple AR(1) models to methods including wavelet transformations and support vector machines. These methods have been combined with forecasts driven by Numerical Weather Prediction (NWP) systems with different temporal and spatial resolutions, lead-times and numbers of ensemble members, covering short- to medium- to extended-range forecasts (COSMO-LEPS, 10-15 days, monthly and seasonal ENS) as well as climatological forecasts. Additionally, the suitability of various skill scores and efficiency measures for low-flow predictions will be tested. Amongst others, the novel 2afc (2 alternatives forced choices) score and the quantile skill score with its decompositions will be applied to evaluate the probabilistic forecasts and the effects of post-processing. First results of the performance of the low-flow predictions of the hydrological model PREVAH initialised with different NWPs will be shown.
New technique for ensemble dressing combining Multimodel SuperEnsemble and precipitation PDF
NASA Astrophysics Data System (ADS)
Cane, D.; Milelli, M.
2009-09-01
The Multimodel SuperEnsemble technique (Krishnamurti et al., Science 285, 1548-1550, 1999) is a postprocessing method for the estimation of weather forecast parameters that reduces direct model output errors. It differs from other ensemble analysis techniques by the use of an adequate weighting of the input forecast models to obtain a combined estimation of meteorological parameters. Weights are calculated by least-square minimization of the difference between the model and the observed field during a so-called training period. Although it can be applied successfully to continuous parameters like temperature, humidity, wind speed and mean sea level pressure (Cane and Milelli, Meteorologische Zeitschrift, 15, 2, 2006), the Multimodel SuperEnsemble also gives good results when applied to precipitation, a parameter quite difficult to handle with standard post-processing methods. Here we present our methodology for Multimodel precipitation forecasts applied to a wide spectrum of results over the very dense non-GTS weather station network of Piemonte. We will focus particularly on an accurate statistical method for bias correction and on ensemble dressing in agreement with the observed precipitation forecast-conditioned PDF. Acknowledgement: this work is supported by the Italian Civil Defence Department.
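A minimal two-model sketch of the SuperEnsemble construction described above, assuming the usual anomaly form of Krishnamurti et al.: each model's training-period mean is removed (implicit bias correction), least-squares weights are fitted to the observed anomalies, and the forecast is the observed climatology plus the weighted model anomalies. All numbers are synthetic:

```python
def superensemble(train_obs, train_f1, train_f2, new_f1, new_f2):
    """Two-model Multimodel SuperEnsemble: regress observed anomalies on
    model anomalies over the training period (2x2 normal equations),
    then forecast as observed climatology plus weighted model anomalies."""
    m1 = sum(train_f1) / len(train_f1)
    m2 = sum(train_f2) / len(train_f2)
    mo = sum(train_obs) / len(train_obs)
    a1 = [x - m1 for x in train_f1]
    a2 = [x - m2 for x in train_f2]
    ao = [x - mo for x in train_obs]
    s11 = sum(x * x for x in a1)
    s22 = sum(x * x for x in a2)
    s12 = sum(x * y for x, y in zip(a1, a2))
    b1 = sum(x * y for x, y in zip(a1, ao))
    b2 = sum(x * y for x, y in zip(a2, ao))
    det = s11 * s22 - s12 * s12
    w1 = (b1 * s22 - b2 * s12) / det
    w2 = (s11 * b2 - s12 * b1) / det
    return mo + w1 * (new_f1 - m1) + w2 * (new_f2 - m2)

f1, f2 = [1.0, 2.0, 3.0, 4.0], [2.0, 1.0, 4.0, 3.0]
obs = [10 + 0.5 * (a - 2.5) + 0.5 * (b - 2.5) for a, b in zip(f1, f2)]
pred = superensemble(obs, f1, f2, 5.0, 5.0)
```

Subtracting each model's own mean is what lets the weights absorb systematic model bias; with more models the same normal equations simply grow to an NxN solve.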
Jensen, Dan B; Hogeveen, Henk; De Vries, Albert
2016-09-01
Rapid detection of dairy cow mastitis is important so corrective action can be taken as soon as possible. Automatically collected sensor data used to monitor the performance and the health state of the cow could be useful for rapid detection of mastitis while reducing the labor needs for monitoring. The state of the art in combining sensor data to predict clinical mastitis still does not perform well enough to be applied in practice. Our objective was to combine a multivariate dynamic linear model (DLM) with a naïve Bayesian classifier (NBC) in a novel method using sensor and nonsensor data to detect clinical cases of mastitis. We also evaluated reductions in the number of sensors for detecting mastitis. With the DLM, we co-modeled 7 sources of sensor data (milk yield, fat, protein, lactose, conductivity, blood, body weight) collected at each milking for individual cows to produce one-step-ahead forecasts for each sensor. The observations were subsequently categorized according to the errors of the forecasted values and the estimated forecast variance. The categorized sensor data were combined with other data pertaining to the cow (week in milk, parity, mastitis history, somatic cell count category, and season) using Bayes' theorem, which produced a combined probability of the cow having clinical mastitis. If this probability was above a set threshold, the cow was classified as mastitis positive. To illustrate the performance of our method, we used sensor data from 1,003,207 milkings from the University of Florida Dairy Unit collected from 2008 to 2014. Of these, 2,907 milkings were associated with recorded cases of clinical mastitis. Using the DLM/NBC method, we reached an area under the receiver operating characteristic curve of 0.89, with a specificity of 0.81 when the sensitivity was set at 0.80. Specificities with omissions of sensor data ranged from 0.58 to 0.81. 
These results are comparable to other studies, but differences in data quality, definitions of clinical mastitis, and time windows make comparisons across studies difficult. We found the DLM/NBC method to be a flexible method for combining multiple sensor and nonsensor data sources to predict clinical mastitis and accommodate missing observations. Further research is needed before practical implementation is possible. In particular, the performance of our method needs to be improved in the first 2 wk of lactation. The DLM method produces forecasts that are based on continuously estimated multivariate normal distributions, which makes forecasts and forecast errors easy to interpret, and new sensors can easily be added. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
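The naive Bayesian combination step at the heart of the DLM/NBC method can be sketched as follows. The prior and likelihood values are hypothetical placeholders (the paper estimates them from herd data), but the mechanics of combining categorized evidence under the independence assumption and thresholding the posterior are as described:

```python
def nbc_posterior(prior, likelihoods):
    """Combine categorized evidence with Bayes' theorem under the naive
    independence assumption. `likelihoods` is a list of
    (P(evidence | mastitis), P(evidence | healthy)) pairs."""
    num_m = prior
    num_h = 1.0 - prior
    for p_m, p_h in likelihoods:
        num_m *= p_m
        num_h *= p_h
    return num_m / (num_m + num_h)

# Hypothetical numbers: prior rate of clinical mastitis at a milking,
# plus two pieces of categorized evidence (e.g. a large conductivity
# forecast error and a high somatic cell count category).
post = nbc_posterior(0.01, [(0.8, 0.2), (0.6, 0.3)])
alarm = post > 0.05   # cow classified mastitis-positive above a set threshold
```

Missing sensors are naturally accommodated by dropping their likelihood pair, which is why the authors could evaluate reduced sensor sets without retraining the whole model.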
NASA Astrophysics Data System (ADS)
Stephenson, S. R.; Babiker, M.; Sandven, S.; Muckenhuber, S.; Korosov, A.; Bobylev, L.; Vesman, A.; Mushta, A.; Demchev, D.; Volkov, V.; Smirnov, K.; Hamre, T.
2015-12-01
Sea ice monitoring and forecasting systems are important tools for minimizing accident risk and environmental impacts of Arctic maritime operations. Satellite data such as synthetic aperture radar (SAR), combined with atmosphere-ice-ocean forecasting models, navigation models and automatic identification system (AIS) transponder data from ships are essential components of such systems. Here we present first results from the SONARC project (project term: 2015-2017), an international multidisciplinary effort to develop novel and complementary ice monitoring and forecasting systems for vessels and offshore platforms in the Arctic. Automated classification methods (Zakhvatkina et al., 2012) are applied to Sentinel-1 dual-polarization SAR images from the Barents and Kara Sea region to identify ice types (e.g. multi-year ice, level first-year ice, deformed first-year ice, new/young ice, open water) and ridges. Short-term (1-3 days) ice drift forecasts are computed from SAR images using feature tracking and pattern tracking methods (Berg & Eriksson, 2014). Ice classification and drift forecast products are combined with ship positions based on AIS data from a selected period of 3-4 weeks to determine optimal vessel speed and routing in ice. Results illustrate the potential of high-resolution SAR data for near-real-time monitoring and forecasting of Arctic ice conditions. Over the next 3 years, SONARC findings will contribute new knowledge about sea ice in the Arctic while promoting safe and cost-effective shipping, domain awareness, resource management, and environmental protection.
Nonmechanistic forecasts of seasonal influenza with iterative one-week-ahead distributions.
Brooks, Logan C; Farrow, David C; Hyun, Sangwon; Tibshirani, Ryan J; Rosenfeld, Roni
2018-06-15
Accurate and reliable forecasts of seasonal epidemics of infectious disease can assist in the design of countermeasures and increase public awareness and preparedness. This article describes two main contributions we made recently toward this goal: a novel approach to probabilistic modeling of surveillance time series based on "delta densities", and an optimization scheme for combining output from multiple forecasting methods into an adaptively weighted ensemble. Delta densities describe the probability distribution of the change between one observation and the next, conditioned on available data; chaining together nonparametric estimates of these distributions yields a model for an entire trajectory. Corresponding distributional forecasts cover more observed events than alternatives that treat the whole season as a unit, and improve upon multiple evaluation metrics when extracting key targets of interest to public health officials. Adaptively weighted ensembles integrate the results of multiple forecasting methods, such as delta density, using weights that can change from situation to situation. We treat selection of optimal weightings across forecasting methods as a separate estimation task, and describe an estimation procedure based on optimizing cross-validation performance. We consider some details of the data generation process, including data revisions and holiday effects, both in the construction of these forecasting methods and when performing retrospective evaluation. The delta density method and an adaptively weighted ensemble of other forecasting methods each improve significantly on the next best ensemble component when applied separately, and achieve even better cross-validated performance when used in conjunction. We submitted real-time forecasts based on these contributions as part of CDC's 2015/2016 FluSight Collaborative Comparison. Among the fourteen submissions that season, this system was ranked by CDC as the most accurate.
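The adaptive-weighting idea can be illustrated with a deliberately simplified stand-in for the paper's cross-validation scheme: choose the mixture weight for two probabilistic forecasters by maximizing the log score of the probabilities each assigned to the events that actually occurred. The grid search and the probability values are illustrative assumptions:

```python
import math

def best_weight(p_method_a, p_method_b):
    """Grid-search the mixture weight w for two probabilistic forecasters,
    maximizing the total log score of the mixture probabilities assigned
    to the observed events."""
    best_w, best_score = None, -math.inf
    for i in range(101):
        w = i / 100.0
        score = sum(math.log(w * pa + (1 - w) * pb)
                    for pa, pb in zip(p_method_a, p_method_b))
        if score > best_score:
            best_w, best_score = w, score
    return best_w

# Probabilities each method assigned to the outcomes that occurred.
w = best_weight([0.8, 0.7, 0.9], [0.2, 0.3, 0.4])
```

When one component dominates on every held-out event, all weight flows to it; in practice the weights are estimated per situation (e.g. per region or season), which is what "adaptively weighted" refers to.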
Forecasting daily passenger traffic volumes in the Moscow metro
NASA Astrophysics Data System (ADS)
Ivanov, V. V.; Osetrov, E. S.
2018-01-01
In this paper we have developed a methodology for the medium-term prediction of daily volumes of passenger traffic in the Moscow metro. It includes three options for the forecast: (1) artificial neural networks (ANNs), (2) singular spectrum analysis implemented in the Caterpillar-SSA package, and (3) a combination of the ANN and Caterpillar-SSA approaches. The methods and algorithms allow the medium-term forecasting of passenger traffic flows in the Moscow metro with reasonable accuracy.
The Prediction of Teacher Turnover Employing Time Series Analysis.
ERIC Educational Resources Information Center
Costa, Crist H.
The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…
NASA Astrophysics Data System (ADS)
Xu, Yongbin; Xie, Haihong; Wu, Liuyi
2018-05-01
The share of coal transportation in total railway freight volume is about 50%. As is widely acknowledged, the coal industry is vulnerable to the economic situation and national policies, and coal transportation volume fluctuates significantly under the new economic normal. Grasping the overall development trend of the railway coal transportation market therefore has important reference and guidance value for decision-making in the railway and coal industries. By analyzing economic indicators and policy implications, this paper describes the trend of coal transportation volume, and further combines the economic indicators most highly correlated with coal transportation volume with a traditional traffic prediction model to establish a combined forecasting model based on a back-propagation neural network. Testing of the prediction errors shows that the method has higher accuracy and practical applicability.
Mixture EMOS model for calibrating ensemble forecasts of wind speed.
Baran, S; Lerch, S
2016-03-01
Ensemble model output statistics (EMOS) is a statistical tool for post-processing forecast ensembles of weather variables obtained from multiple runs of numerical weather prediction models in order to produce calibrated predictive probability density functions. The EMOS predictive probability density function is given by a parametric distribution with parameters depending on the ensemble forecasts. We propose an EMOS model for calibrating wind speed forecasts based on weighted mixtures of truncated normal (TN) and log-normal (LN) distributions where model parameters and component weights are estimated by optimizing the values of proper scoring rules over a rolling training period. The new model is tested on wind speed forecasts of the 50-member European Centre for Medium-range Weather Forecasts ensemble, the 11-member Aire Limitée Adaptation dynamique Développement International-Hungary Ensemble Prediction System ensemble of the Hungarian Meteorological Service, and the eight-member University of Washington mesoscale ensemble, and its predictive performance is compared with that of various benchmark EMOS models based on single parametric families and combinations thereof. The results indicate improved calibration of probabilistic forecasts and accuracy of point forecasts in comparison with the raw ensemble and climatological forecasts. The mixture EMOS model significantly outperforms the TN and LN EMOS methods; moreover, it provides better calibrated forecasts than the TN-LN combination model and offers increased flexibility while avoiding covariate selection problems. © 2016 The Authors. Environmetrics Published by John Wiley & Sons Ltd.
Automated flare forecasting using a statistical learning technique
NASA Astrophysics Data System (ADS)
Yuan, Yuan; Shih, Frank Y.; Jing, Ju; Wang, Hai-Min
2010-08-01
We present a new method for automatically forecasting the occurrence of solar flares based on photospheric magnetic measurements. The method is a cascading combination of an ordinal logistic regression model and a support vector machine classifier. The predictive variables are three photospheric magnetic parameters, i.e., the total unsigned magnetic flux, length of the strong-gradient magnetic polarity inversion line, and total magnetic energy dissipation. The output is true or false for the occurrence of a certain level of flares within 24 hours. Experimental results, from a sample of 230 active regions between 1996 and 2005, show the accuracies of a 24-hour flare forecast to be 0.86, 0.72, 0.65 and 0.84 respectively for the four different levels. Comparison shows an improvement in the accuracy of X-class flare forecasting.
Research on time series data prediction based on clustering algorithm - A case study of Yuebao
NASA Astrophysics Data System (ADS)
Lu, Xu; Zhao, Tianzhong
2017-08-01
Forecasting is a prerequisite for making scientific decisions: based on past information about the phenomenon under study, combined with factors affecting that phenomenon, scientific methods are used to forecast its future development trend, and this is an important way for people to understand the world. Forecasting is particularly important for financial data, because proper financial data forecasts can greatly help financial institutions in strategy implementation, strategic alignment and risk control. However, current forecasts of financial data generally work on the overall data and neglect customer behavior and other factors in the forecasting process, even though these are important factors influencing changes in financial data. Given this situation, this paper analyzed the data of Yuebao and, according to user attributes and operating characteristics, classified 567 users of Yuebao and further predicted the Yuebao data for each class of users. The results showed that the forecasting model in this paper can meet the demand of forecasting.
Modeling and Computing of Stock Index Forecasting Based on Neural Network and Markov Chain
Dai, Yonghui; Han, Dongmei; Dai, Weihui
2014-01-01
The stock index reflects the fluctuation of the stock market. For a long time, there has been a great deal of research on stock index forecasting. However, traditional methods are limited in achieving ideal precision in a dynamic market due to the influence of many factors such as the economic situation, policy changes, and emergency events. Therefore, approaches based on adaptive modeling and conditional probability transfer have attracted new attention from researchers. This paper presents a new forecast method combining an improved back-propagation (BP) neural network and a Markov chain, together with its modeling and computing technology. The method comprises initial forecasting by the improved BP neural network, division of Markov state regions, computation of the state transition probability matrix, and prediction adjustment. Results of the empirical study show that this method can achieve high accuracy in stock index prediction, and it could provide a good reference for investment in the stock market. PMID:24782659
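The Markov-chain step of such hybrid methods estimates a transition matrix over discretized forecast-error states. A minimal sketch (the state binning and the residual sequence are illustrative; the paper's exact state regions are not reproduced):

```python
def transition_matrix(states, n_states):
    """Estimate a Markov state transition probability matrix from a
    sequence of error states (e.g. BP-network residuals binned into
    under-/over-prediction regions)."""
    counts = [[0] * n_states for _ in range(n_states)]
    for s0, s1 in zip(states, states[1:]):
        counts[s0][s1] += 1
    matrix = []
    for row in counts:
        total = sum(row)
        matrix.append([c / total if total else 1.0 / n_states for c in row])
    return matrix

# Residual states alternating between 0 (under-) and 1 (over-prediction).
P = transition_matrix([0, 1, 0, 1, 0, 1], 2)
# The most probable next state can then be used to shift the raw network
# forecast toward the centre of that state's residual interval.
```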
NASA Astrophysics Data System (ADS)
Rings, Joerg; Vrugt, Jasper A.; Schoups, Gerrit; Huisman, Johan A.; Vereecken, Harry
2012-05-01
Bayesian model averaging (BMA) is a standard method for combining predictive distributions from different models. In recent years, this method has enjoyed widespread application and use in many fields of study to improve the spread-skill relationship of forecast ensembles. The BMA predictive probability density function (pdf) of any quantity of interest is a weighted average of pdfs centered around the individual (possibly bias-corrected) forecasts, where the weights are equal to the posterior probabilities of the models generating the forecasts and reflect each individual model's skill over a training (calibration) period. The original BMA approach presented by Raftery et al. (2005) assumes that the conditional pdf of each individual model is adequately described with a rather standard Gaussian or Gamma statistical distribution, possibly with a heteroscedastic variance. Here we analyze the advantages of using BMA with a flexible representation of the conditional pdf. A joint particle filtering and Gaussian mixture modeling framework is presented to derive analytically, as closely and consistently as possible, the evolving forecast density (conditional pdf) of each constituent ensemble member. The median forecasts and evolving conditional pdfs of the constituent models are subsequently combined using BMA to derive one overall predictive distribution. This paper introduces the theory and concepts of this new ensemble postprocessing method, and demonstrates its usefulness and applicability by numerical simulation of the rainfall-runoff transformation using discharge data from three different catchments in the contiguous United States. The revised BMA method achieves significantly lower prediction errors than the original default BMA method (due to filtering), with predictive uncertainty intervals that are substantially smaller but still statistically coherent (due to the use of a time-variant conditional pdf).
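The BMA predictive pdf described above is just a weighted mixture of member densities. A sketch with Gaussian components (as in Raftery et al.'s original formulation; the weights, means, and spreads are placeholder numbers rather than fitted values):

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bma_pdf(x, weights, means, sigmas):
    """BMA predictive density: a weighted average of conditional pdfs
    centred on the (possibly bias-corrected) member forecasts."""
    return sum(w * gaussian_pdf(x, m, s)
               for w, m, s in zip(weights, means, sigmas))

# Two members with posterior weights 0.5/0.5 and identical Gaussians:
# the mixture collapses to a single N(3, 1) density.
density = bma_pdf(3.0, [0.5, 0.5], [3.0, 3.0], [1.0, 1.0])
```

The paper's contribution is to replace these fixed parametric components with evolving, particle-filter-derived mixture densities, but the final averaging step has exactly this form.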
Spatial Ensemble Postprocessing of Precipitation Forecasts Using High Resolution Analyses
NASA Astrophysics Data System (ADS)
Lang, Moritz N.; Schicker, Irene; Kann, Alexander; Wang, Yong
2017-04-01
Ensemble prediction systems are designed to account for errors or uncertainties in the initial and boundary conditions, imperfect parameterizations, etc. However, due to sampling errors and underestimation of the model errors, these ensemble forecasts tend to be underdispersive, and to lack both reliability and sharpness. To overcome such limitations, statistical postprocessing methods are commonly applied to these forecasts. In this study, a full-distributional spatial post-processing method is applied to short-range precipitation forecasts over Austria using Standardized Anomaly Model Output Statistics (SAMOS). Following Stauffer et al. (2016), observation and forecast fields are transformed into standardized anomalies by subtracting a site-specific climatological mean and dividing by the climatological standard deviation. Because only a single regression model needs to be fitted for the whole domain, the SAMOS framework provides a computationally inexpensive method to create operationally calibrated probabilistic forecasts for any arbitrary location or for all grid points in the domain simultaneously. Taking advantage of the INCA system (Integrated Nowcasting through Comprehensive Analysis), high resolution analyses are used for the computation of the observed climatology and for model training. The INCA system operationally combines station measurements and remote sensing data into real-time objective analysis fields at 1 km horizontal resolution and 1 h temporal resolution. The precipitation forecast used in this study is obtained from a limited area model ensemble prediction system also operated by ZAMG. The so-called ALADIN-LAEF provides, by applying a multi-physics approach, a 17-member forecast at a horizontal resolution of 10.9 km and a temporal resolution of 1 hour. The SAMOS approach thus statistically combines the in-house developed high resolution analysis and ensemble prediction system.
The station-based validation of 6 hour precipitation sums shows a mean improvement of more than 40% in CRPS when compared to bilinearly interpolated uncalibrated ensemble forecasts. The validation on randomly selected grid points, representing the true height distribution over Austria, still indicates a mean improvement of 35%. The applied statistical model is currently set up for 6-hourly and daily accumulation periods, but will be extended to a temporal resolution of 1-3 hours within a new probabilistic nowcasting system operated by ZAMG.
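The standardized-anomaly transform that gives SAMOS its name is simple enough to sketch directly; the climatological mean and standard deviation below are invented site values, and in the real system they come from the INCA-based climatology:

```python
def to_anomaly(x, clim_mean, clim_sd):
    """SAMOS-style transform: standardize a value by the site-specific
    climatological mean and standard deviation."""
    return (x - clim_mean) / clim_sd

def from_anomaly(z, clim_mean, clim_sd):
    """Back-transform a (possibly regression-corrected) anomaly to the
    original units of the site."""
    return z * clim_sd + clim_mean

# A 12 mm accumulation at a hypothetical site with climatological mean 4
# and standard deviation 5 maps to the anomaly 1.6; one regression fitted
# in anomaly space then serves every grid point regardless of local climate.
z = to_anomaly(12.0, 4.0, 5.0)
x = from_anomaly(z, 4.0, 5.0)
```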
Doos, Lucy; Packer, Claire; Ward, Derek; Simpson, Sue; Stevens, Andrew
2016-01-01
Objectives Forecasting can support rational decision-making around the introduction and use of emerging health technologies and prevent investment in technologies that have limited long-term potential. However, forecasting methods need to be credible. We performed a systematic search to identify the methods used in forecasting studies to predict future health technologies within a 3–20-year timeframe. Identification and retrospective assessment of such methods potentially offer a route to more reliable prediction. Design Systematic search of the literature to identify studies reporting methods of forecasting in healthcare. Participants People are not needed in this study. Data sources The authors searched MEDLINE, EMBASE, PsychINFO and grey literature sources, and included articles published in English that reported their methods and a list of identified technologies. Main outcome measure Studies reporting methods used to predict future health technologies within a 3–20-year timeframe with an identified list of individual healthcare technologies. Commercially sponsored reviews, long-term futurology studies (with over 20-year timeframes) and speculative editorials were excluded. Results 15 studies met our inclusion criteria. Our results showed that the majority of studies (13/15) consulted experts either alone or in combination with other methods such as literature searching. Only 2 studies used more complex forecasting tools such as scenario building. Conclusions The methodological fundamentals of formal 3–20-year prediction are consistent but vary in details. Further research needs to be conducted to ascertain whether the predictions made were accurate and whether accuracy varies by the methods used or by the types of technologies identified. PMID:26966060
The application of hybrid artificial intelligence systems for forecasting
NASA Astrophysics Data System (ADS)
Lees, Brian; Corchado, Juan
1999-03-01
The results to date are presented from an ongoing investigation, in which the aim is to combine the strengths of different artificial intelligence methods into a single problem solving system. The premise underlying this research is that a system which embodies several cooperating problem solving methods will be capable of achieving better performance than if only a single method were employed. The work has so far concentrated on the combination of case-based reasoning and artificial neural networks. The relative merits of artificial neural networks and case-based reasoning problem solving paradigms, and their combination are discussed. The integration of these two AI problem solving methods in a hybrid systems architecture, such that the neural network provides support for learning from past experience in the case-based reasoning cycle, is then presented. The approach has been applied to the task of forecasting the variation of physical parameters of the ocean. Results obtained so far from tests carried out in the dynamic oceanic environment are presented.
Combining a Spatial Model and Demand Forecasts to Map Future Surface Coal Mining in Appalachia
Strager, Michael P.; Strager, Jacquelyn M.; Evans, Jeffrey S.; Dunscomb, Judy K.; Kreps, Brad J.; Maxwell, Aaron E.
2015-01-01
Predicting the locations of future surface coal mining in Appalachia is challenging for a number of reasons. Economic and regulatory factors impact the coal mining industry, and forecasts of future coal production do not specifically predict changes in the location of future coal production. Given the potential environmental impacts of surface coal mining, prediction of the location of future activity would be valuable to decision makers. The goal of this study was to provide a method for predicting future surface coal mining extents under changing economic and regulatory forecasts through the year 2035. This was accomplished by integrating a spatial model with production demand forecasts to predict land cover change at a 1 km² gridded cell size. Combining these two inputs was possible with a ratio linking coal extraction quantities to a unit area extent. The result was a spatial distribution of probabilities allocated over forecasted demand for the Appalachian region including the northern, central, southern, and eastern Illinois coal regions. The results can be used to better plan for land use alterations and potential cumulative impacts. PMID:26090883
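The coupling step described here, a ratio converting forecast tonnage into mined area, allocated over a probability surface, can be sketched as follows. The demand, extraction ratio, and cell probabilities are invented numbers, not values from the study:

```python
def allocate_mining_area(demand_tons, tons_per_km2, cell_probs):
    """Convert a production demand forecast into per-cell mined area using
    the ratio linking extraction quantity to unit area, allocated in
    proportion to each grid cell's suitability probability."""
    total_area = demand_tons / tons_per_km2        # km^2 implied by demand
    total_p = sum(cell_probs)
    return [total_area * p / total_p for p in cell_probs]

# 100 t of forecast demand at 10 t/km^2 spread over three 1 km^2 cells.
areas = allocate_mining_area(100.0, 10.0, [0.5, 0.3, 0.2])
```

Cells with higher modeled suitability absorb proportionally more of the forecast extent, while the total allocated area always matches what the demand forecast implies.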
Short-term forecasts gain in accuracy. [Regression technique using ''Box-Jenkins'' analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Box-Jenkins time-series models offer accuracy for short-term forecasts comparable with large-scale macroeconomic forecasts. Utilities need to forecast peak demand in order to plan their generating, transmitting, and distribution systems. This method differs from conventional models by not assuming specific data patterns, but by fitting available data into a tentative pattern on the basis of autocorrelations. Three types of models (autoregressive, moving average, or mixed autoregressive/moving average) can be used, according to which provides the most appropriate combination of autocorrelations and related derivatives. The major steps in choosing a model are identifying potential models, estimating the parameters of the problem, and running a diagnostic check to see if the model fits the parameters. The Box-Jenkins technique is well suited for seasonal patterns, which makes it possible to produce load demand forecasts as fine-grained as hourly. With accuracy up to two years, the method will allow electricity price-elasticity forecasting that can be applied to facility planning and rate design. (DCK)
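The identification step mentioned above rests on sample autocorrelations, whose decay pattern suggests an AR, MA, or mixed model. A minimal sketch on a made-up, strongly alternating series:

```python
def acf(series, max_lag):
    """Sample autocorrelations for lags 1..max_lag, as used in the
    Box-Jenkins identification step."""
    n = len(series)
    mean = sum(series) / n
    dev = [x - mean for x in series]
    var = sum(d * d for d in dev)
    return [sum(dev[t] * dev[t + k] for t in range(n - k)) / var
            for k in range(1, max_lag + 1)]

# A strongly alternating load-like series: large negative lag-1
# autocorrelation, positive lag-2, hinting at an AR-type structure.
r = acf([1.0, -1.0] * 4, 2)
```

In practice the sample ACF is read together with the partial ACF: a sharp cutoff in one and gradual decay in the other is what discriminates the three model families.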
Prediction on sunspot activity based on fuzzy information granulation and support vector machine
NASA Astrophysics Data System (ADS)
Peng, Lingling; Yan, Haisheng; Yang, Zhigang
2018-04-01
In order to analyze the range of sunspots, a combined prediction method for forecasting the fluctuation range of sunspots, based on fuzzy information granulation (FIG) and support vector machine (SVM), was put forward. First, FIG is employed to granulate the sample data and extract valid information from each window, namely the minimum, general average, and maximum values of each window. Second, a forecasting model is built for each with SVM, and a cross-validation method is used to optimize its parameters. Finally, the fluctuation range of sunspots is forecast with the optimized SVM model. A case study demonstrates that the model has high accuracy and can effectively predict the fluctuation of sunspots.
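The granulation step described above can be sketched in a few lines: each non-overlapping window of the series is replaced by a granule of (minimum, average, maximum) values, which a regressor such as an SVM can then be trained on. The window size is a free parameter assumed here; the paper's fuzzy membership details are omitted.

```python
# Sketch of the information-granulation step: one (min, mean, max) granule
# per non-overlapping window of the input series.
def granulate(series, window):
    granules = []
    for i in range(0, len(series) - window + 1, window):
        w = series[i:i + window]
        granules.append((min(w), sum(w) / len(w), max(w)))
    return granules
```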
NASA Astrophysics Data System (ADS)
Zhu, Zhiwei; Li, Tim
2017-01-01
The extended-range (10-30-day) rainfall forecast over the entire China was carried out using spatial-temporal projection models (STPMs). Using a rotated empirical orthogonal function analysis of intraseasonal (10-80-day) rainfall anomalies, China is divided into ten sub-regions. Different predictability sources were selected for each of the ten regions. The forecast skills are ranked for each region. Based on temporal correlation coefficient (TCC) and Gerrity skill score, useful skills are found for most parts of China at a 20-25-day lead. The southern China and the mid-lower reaches of Yangtze River Valley show the highest predictive skills, whereas southwestern China and Huang-Huai region have the lowest predictive skills. By combining forecast results from ten regional STPMs, the TCC distribution of 8-year (2003-2010) independent forecast for the entire China is investigated. The combined forecast results from ten STPMs show significantly higher skills than the forecast with just one single STPM for the entire China. Independent forecast examples of summer rainfall anomalies around the period of Beijing Olympic Games in 2008 and Shanghai World Expo in 2010 are presented. The result shows that the current model is able to reproduce the gross pattern of the summer intraseasonal rainfall over China at a 20-day lead. The present study provides, for the first time, a guide on the statistical extended-range forecast of summer rainfall anomalies for the entire China. It is anticipated that the ideas and methods proposed here will facilitate the extended-range forecast in China.
Total probabilities of ensemble runoff forecasts
NASA Astrophysics Data System (ADS)
Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian
2017-04-01
Ensemble forecasting has a long history in meteorological modelling as a way of indicating the uncertainty of the forecasts. However, it is necessary to calibrate and post-process the ensembles, as they often exhibit both bias and dispersion errors. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these approaches (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters varying in space and time while giving a spatially and temporally consistent output. However, their method is computationally complex for our larger number of stations, which makes it unsuitable for our purpose. Our post-processing method for the ensembles is developed in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu), where we are making forecasts for the whole of Europe, based on observations from around 700 catchments. As the target is flood forecasting, we are more interested in improving the forecast skill for high flows than in a good prediction of the entire flow regime. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different meteorological forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we post-process all model outputs to estimate the total probability, the post-processed mean and the uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, but we add a spatial penalty in the calibration process to force a spatial correlation of the parameters. 
The penalty takes distance, stream-connectivity and size of the catchment areas into account. This can in some cases have a slight negative impact on the calibration error, but avoids large differences between parameters of nearby locations, whether stream connected or not. The spatial calibration also makes it easier to interpolate the post-processing parameters to uncalibrated locations. We also look into different methods for handling the non-normal distributions of runoff data and the effect of different data transformations on forecasts skills in general and for floods in particular. Berrocal, V. J., Raftery, A. E. and Gneiting, T.: Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts, Mon. Weather Rev., 135(4), 1386-1402, doi:10.1175/MWR3341.1, 2007. Engeland, K. and Steinsland, I.: Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times, Water Resour. Res., 50(1), 182-197, doi:10.1002/2012WR012757, 2014. Gneiting, T., Raftery, A. E., Westveld, A. H. and Goldman, T.: Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation, Mon. Weather Rev., 133(5), 1098-1118, doi:10.1175/MWR2904.1, 2005. Hemri, S., Fundel, F. and Zappa, M.: Simultaneous calibration of ensemble river flow predictions over an entire range of lead times, Water Resour. Res., 49(10), 6744-6755, doi:10.1002/wrcr.20542, 2013. Raftery, A. E., Gneiting, T., Balabdaoui, F. and Polakowski, M.: Using Bayesian Model Averaging to Calibrate Forecast Ensembles, Mon. Weather Rev., 133(5), 1155-1174, doi:10.1175/MWR2906.1, 2005.
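A much-simplified sketch of the EMOS-style post-processing referenced above: observations are regressed on the ensemble mean to correct bias, and the residual variance serves as the predictive variance. Full EMOS also links the predictive variance to the ensemble spread and is fitted by minimum CRPS (Gneiting et al., 2005); this ordinary-least-squares version is illustrative only, and all names here are assumptions.

```python
# Simplified EMOS-like post-processing: linear bias correction of the ensemble
# mean plus a constant predictive variance taken from the fit residuals.
def fit_emos(ens_means, obs):
    n = len(obs)
    mx = sum(ens_means) / n
    my = sum(obs) / n
    b = sum((x - mx) * (y - my) for x, y in zip(ens_means, obs)) / \
        sum((x - mx) ** 2 for x in ens_means)
    a = my - b * mx
    var = sum((y - (a + b * x)) ** 2 for x, y in zip(ens_means, obs)) / n
    return a, b, var

def predict(params, ens_mean):
    """Return the predictive (mean, variance) for a new ensemble mean."""
    a, b, var = params
    return a + b * ens_mean, var
```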
Total probabilities of ensemble runoff forecasts
NASA Astrophysics Data System (ADS)
Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian
2016-04-01
Ensemble forecasting has for a long time been used as a method in meteorological modelling to indicate the uncertainty of the forecasts. However, as the ensembles often exhibit both bias and dispersion errors, it is necessary to calibrate and post-process them. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these approaches (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters that differ in space and time but still give a spatially and temporally consistent output. However, their method is computationally complex for our larger number of stations, and cannot directly be regionalized in the way we would like, so we suggest a different path below. The target of our work is to create a mean forecast with uncertainty bounds for a large number of locations in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu). We are therefore more interested in improving the forecast skill for high flows than the forecast skill at lower runoff levels. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we post-process all model outputs to find a total probability, the post-processed mean and the uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, while assuring that they have some spatial correlation, by adding a spatial penalty in the calibration process. 
This can in some cases have a slight negative impact on the calibration error, but makes it easier to interpolate the post-processing parameters to uncalibrated locations. We also look into different methods for handling the non-normal distributions of runoff data and the effect of different data transformations on forecasts skills in general and for floods in particular. Berrocal, V. J., Raftery, A. E. and Gneiting, T.: Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts, Mon. Weather Rev., 135(4), 1386-1402, doi:10.1175/MWR3341.1, 2007. Engeland, K. and Steinsland, I.: Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times, Water Resour. Res., 50(1), 182-197, doi:10.1002/2012WR012757, 2014. Gneiting, T., Raftery, A. E., Westveld, A. H. and Goldman, T.: Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation, Mon. Weather Rev., 133(5), 1098-1118, doi:10.1175/MWR2904.1, 2005. Hemri, S., Fundel, F. and Zappa, M.: Simultaneous calibration of ensemble river flow predictions over an entire range of lead times, Water Resour. Res., 49(10), 6744-6755, doi:10.1002/wrcr.20542, 2013. Raftery, A. E., Gneiting, T., Balabdaoui, F. and Polakowski, M.: Using Bayesian Model Averaging to Calibrate Forecast Ensembles, Mon. Weather Rev., 133(5), 1155-1174, doi:10.1175/MWR2906.1, 2005.
Software forecasting as it is really done: A study of JPL software engineers
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.
1993-01-01
This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and it was also found that the mental models cluster around three forecasting techniques. Further partitioning of the mental models revealed clustering of activities that is very suggestive of a forecasting lifecycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or an additional effort forecast. Virtually no subject used risk reduction steps in combination. The results of the analysis include: the identification of a core set of well-defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.
Aviation Turbulence: Dynamics, Forecasting, and Response to Climate Change
NASA Astrophysics Data System (ADS)
Storer, Luke N.; Williams, Paul D.; Gill, Philip G.
2018-03-01
Atmospheric turbulence is a major hazard in the aviation industry and can cause injuries to passengers and crew. Understanding the physical and dynamical generation mechanisms of turbulence aids with the development of new forecasting algorithms and, therefore, reduces the impact that it has on the aviation industry. The scope of this paper is to review the dynamics of aviation turbulence, its response to climate change, and current forecasting methods at the cruising altitude of aircraft. Aviation-affecting turbulence comes from three main sources: vertical wind shear instabilities, convection, and mountain waves. Understanding these features helps researchers to develop better turbulence diagnostics. Recent research suggests that turbulence will increase in frequency and strength with climate change, and therefore, turbulence forecasting may become more important in the future. The current methods of forecasting are unable to predict every turbulence event, and research is ongoing to find the best solution to this problem by combining turbulence predictors and using ensemble forecasts to increase skill. The skill of operational turbulence forecasts has increased steadily over recent decades, mirroring improvements in our understanding. However, more work is needed—ideally in collaboration with the aviation industry—to improve observations and increase forecast skill, to help maintain and enhance aviation safety standards in the future.
NASA Astrophysics Data System (ADS)
Papacharalampous, Georgia; Tyralis, Hristos; Koutsoyiannis, Demetris
2018-02-01
We investigate the predictability of monthly temperature and precipitation by applying automatic univariate time series forecasting methods to a sample of 985 40-year-long monthly temperature and 1552 40-year-long monthly precipitation time series. The methods include a naïve one based on the monthly values of the last year, as well as the random walk (with drift), AutoRegressive Fractionally Integrated Moving Average (ARFIMA), exponential smoothing state-space model with Box-Cox transformation, ARMA errors, Trend and Seasonal components (BATS), simple exponential smoothing, Theta and Prophet methods. Prophet is a recently introduced model inspired by the nature of time series forecasted at Facebook and has not been applied to hydrometeorological time series before, while the use of random walk, BATS, simple exponential smoothing and Theta is rare in hydrology. The methods are tested in performing multi-step ahead forecasts for the last 48 months of the data. We further investigate how different choices of handling the seasonality and non-normality affect the performance of the models. The results indicate that: (a) all the examined methods apart from the naïve and random walk ones are accurate enough to be used in long-term applications; (b) monthly temperature and precipitation can be forecasted to a level of accuracy which can barely be improved using other methods; (c) the externally applied classical seasonal decomposition results mostly in better forecasts compared to the automatic seasonal decomposition used by the BATS and Prophet methods; and (d) Prophet is competitive, especially when it is combined with externally applied classical seasonal decomposition.
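One of the benchmark methods in the comparison above, simple exponential smoothing, is compact enough to sketch directly. The smoothing parameter alpha is assumed here; in the automatic forecasting setting of the study it would be selected by optimizing in-sample fit. The multi-step forecast of this model is flat at the final smoothed level.

```python
# Simple exponential smoothing: level_t = alpha * x_t + (1 - alpha) * level_{t-1}.
# The h-step-ahead forecast is the last level, repeated.
def ses_forecast(series, alpha, steps):
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return [level] * steps
```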
Multidimensional k-nearest neighbor model based on EEMD for financial time series forecasting
NASA Astrophysics Data System (ADS)
Zhang, Ningning; Lin, Aijing; Shang, Pengjian
2017-07-01
In this paper, we propose a new two-stage methodology that combines ensemble empirical mode decomposition (EEMD) with a multidimensional k-nearest neighbor model (MKNN) in order to forecast the closing price and high price of stocks simultaneously. Modified k-nearest neighbors (KNN) algorithms are finding increasingly wide application in prediction problems across many fields. Empirical mode decomposition (EMD) decomposes a nonlinear and non-stationary signal into a series of intrinsic mode functions (IMFs); however, it cannot reveal characteristic information of the signal with much accuracy as a result of mode mixing. Ensemble empirical mode decomposition (EEMD), an improved version of EMD, resolves this weakness by adding white noise to the original data, so that components with true physical meaning can be extracted from the time series. Exploiting the advantages of EEMD and MKNN, the proposed combined model (EEMD-MKNN) has high predictive precision for short-term forecasting. Moreover, we extend this methodology to the two-dimensional case to forecast the closing price and high price of four stock indices (NAS, S&P500, DJI and STI) at the same time. The results indicate that the proposed EEMD-MKNN model has a higher forecast precision than EMD-KNN, the KNN method and ARIMA.
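The k-nearest-neighbour forecasting idea underlying KNN/MKNN can be sketched as follows: find the k historical windows closest (in Euclidean distance) to the most recent window and average the values that followed them. This is a bare-bones univariate sketch; the EEMD preprocessing and the multidimensional extension of the paper are omitted, and window size and k are assumed free parameters.

```python
# Bare-bones k-NN time-series forecast: average the successors of the k
# historical windows most similar to the latest window.
def knn_forecast(series, window, k):
    recent = series[-window:]
    candidates = []
    for i in range(len(series) - window):  # only windows with a known successor
        w = series[i:i + window]
        dist = sum((a - b) ** 2 for a, b in zip(w, recent)) ** 0.5
        candidates.append((dist, series[i + window]))
    candidates.sort(key=lambda t: t[0])
    return sum(succ for _, succ in candidates[:k]) / k
```

On a perfectly periodic series the nearest window matches exactly and the forecast reproduces the cycle.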
Probabilistic Forecasting of Surface Ozone with a Novel Statistical Approach
NASA Technical Reports Server (NTRS)
Balashov, Nikolay V.; Thompson, Anne M.; Young, George S.
2017-01-01
The recent change in the Environmental Protection Agency's surface ozone regulation, lowering the surface ozone daily maximum 8-h average (MDA8) exceedance threshold from 75 to 70 ppbv, poses significant challenges to U.S. air quality (AQ) forecasters responsible for ozone MDA8 forecasts. The forecasters, supplied with only a few AQ model products, end up relying heavily on self-developed tools. To help U.S. AQ forecasters, this study explores a surface ozone MDA8 forecasting tool that is based solely on statistical methods and standard meteorological variables from numerical weather prediction (NWP) models. The model combines the self-organizing map (SOM), a clustering technique, with a stepwise weighted quadratic regression using meteorological variables as predictors for ozone MDA8. The SOM method identifies different weather regimes, to distinguish between various modes of ozone variability, and groups them according to similarity. In this way, when a regression is developed for a specific regime, data from the other regimes are also used, with weights based on their similarity to this specific regime. This approach, regression in SOM (REGiS), yields a distinct model for each regime taking into account both the training cases for that regime and other similar training cases. To produce probabilistic MDA8 ozone forecasts, REGiS weighs and combines all of the developed regression models on the basis of the weather patterns predicted by an NWP model. REGiS is evaluated over the San Joaquin Valley in California and the northeastern plains of Colorado. The results suggest that the model performs best when trained and adjusted separately for an individual AQ station and its corresponding meteorological site.
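The regime-weighted regression at the heart of this approach can be sketched with a one-predictor weighted least-squares fit, where each training case carries a weight reflecting its similarity to the target regime. The SOM-based similarity weights and the stepwise quadratic form of the actual REGiS model are omitted; this closed-form sketch only shows how the weights enter the fit.

```python
# Weighted least-squares fit of y = a + b * x, with per-case weights that would,
# in REGiS, come from regime similarity. Returns (intercept, slope).
def weighted_fit(x, y, weights):
    sw = sum(weights)
    mx = sum(w * xi for w, xi in zip(weights, x)) / sw
    my = sum(w * yi for w, yi in zip(weights, y)) / sw
    b = sum(w * (xi - mx) * (yi - my) for w, xi, yi in zip(weights, x, y)) / \
        sum(w * (xi - mx) ** 2 for w, xi in zip(weights, x))
    return my - b * mx, b
```

With uniform weights this reduces to ordinary least squares; up-weighting the cases from one regime pulls the fit toward that regime's behaviour.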
Reservoir water level forecasting using group method of data handling
NASA Astrophysics Data System (ADS)
Zaji, Amir Hossein; Bonakdari, Hossein; Gharabaghi, Bahram
2018-06-01
Accurate reservoir water level forecasts are among the most vital data for efficient reservoir structure design and management. In this study, the group method of data handling is combined with the minimum description length method to develop a very practical and functional model for predicting reservoir water levels. The models' performance is evaluated using two groups of input combinations based on recent days and recent weeks; four different input combinations are considered in total. The data collected from Chahnimeh#1 Reservoir in eastern Iran are used for model training and validation. To assess the models' applicability in practical situations, the models are made to predict a non-observed dataset for the nearby Chahnimeh#4 Reservoir. According to the results, input combinations (L, L-1) and (L, L-1, L-12) for recent days, with root-mean-squared errors (RMSE) of 0.3478 and 0.3767, respectively, outperform input combinations (L, L-7) and (L, L-7, L-14) for recent weeks, with RMSE of 0.3866 and 0.4378, respectively. Accordingly, (L, L-1) is selected as the best input combination for making 7-day ahead predictions of reservoir water levels.
Doos, Lucy; Packer, Claire; Ward, Derek; Simpson, Sue; Stevens, Andrew
2016-03-10
Forecasting can support rational decision-making around the introduction and use of emerging health technologies and prevent investment in technologies that have limited long-term potential. However, forecasting methods need to be credible. We performed a systematic search to identify the methods used in forecasting studies to predict future health technologies within a 3-20-year timeframe. Identification and retrospective assessment of such methods potentially offer a route to more reliable prediction. We systematically searched the literature to identify studies reporting methods of forecasting in healthcare; no human participants were involved. The authors searched MEDLINE, EMBASE, PsychINFO and grey literature sources, and included articles published in English that reported their methods and a list of identified technologies. Eligible studies reported methods used to predict future health technologies within a 3-20-year timeframe with an identified list of individual healthcare technologies. Commercially sponsored reviews, long-term futurology studies (with over 20-year timeframes) and speculative editorials were excluded. 15 studies met our inclusion criteria. Our results showed that the majority of studies (13/15) consulted experts, either alone or in combination with other methods such as literature searching. Only 2 studies used more complex forecasting tools such as scenario building. The methodological fundamentals of formal 3-20-year prediction are consistent but vary in detail. Further research needs to be conducted to ascertain whether the predictions made were accurate and whether accuracy varies by the methods used or by the types of technologies identified.
The total probabilities from high-resolution ensemble forecasting of floods
NASA Astrophysics Data System (ADS)
Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian
2015-04-01
Ensemble forecasting has for a long time been used in meteorological modelling, to give an indication of the uncertainty of the forecasts. As meteorological ensemble forecasts often show some bias and dispersion errors, there is a need for calibration and post-processing of the ensembles. Typical methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these methods (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). To make optimal predictions of floods along the stream network in hydrology, we can easily use the ensemble members as input to the hydrological models. However, some of the post-processing methods will need modifications when regionalizing the forecasts outside the calibration locations, as done by Hemri et al. (2013). We present a method for spatial regionalization of the post-processed forecasts based on EMOS and top-kriging (Skøien et al., 2006). We will also look into different methods for handling the non-normality of runoff and the effect on forecasts skills in general and for floods in particular. Berrocal, V. J., Raftery, A. E. and Gneiting, T.: Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts, Mon. Weather Rev., 135(4), 1386-1402, doi:10.1175/MWR3341.1, 2007. Gneiting, T., Raftery, A. E., Westveld, A. H. and Goldman, T.: Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation, Mon. Weather Rev., 133(5), 1098-1118, doi:10.1175/MWR2904.1, 2005. Hemri, S., Fundel, F. and Zappa, M.: Simultaneous calibration of ensemble river flow predictions over an entire range of lead times, Water Resour. Res., 49(10), 6744-6755, doi:10.1002/wrcr.20542, 2013. Raftery, A. E., Gneiting, T., Balabdaoui, F. and Polakowski, M.: Using Bayesian Model Averaging to Calibrate Forecast Ensembles, Mon. 
Weather Rev., 133(5), 1155-1174, doi:10.1175/MWR2906.1, 2005. Skøien, J. O., Merz, R. and Blöschl, G.: Top-kriging - Geostatistics on stream networks, Hydrol. Earth Syst. Sci., 10(2), 277-287, 2006.
NASA Astrophysics Data System (ADS)
Sokol, Zbyněk; Mejsnar, Jan; Pop, Lukáš; Bližňák, Vojtěch
2017-09-01
A new method for the probabilistic nowcasting of instantaneous rain rates (ENS), based on the ensemble technique and extrapolation along Lagrangian trajectories of current radar reflectivity, is presented. Assuming inaccurate forecasts of the trajectories, an ensemble of precipitation forecasts is calculated and used to estimate the probability that rain rates will exceed a given threshold at a given grid point. Although the extrapolation neglects the growth and decay of precipitation, their impact on the probability forecast is taken into account by calibrating the forecasts using the reliability component of the Brier score (BS). ENS forecasts the probability that rain rates will exceed thresholds of 0.1, 1.0 and 3.0 mm/h in squares of 3 km by 3 km. The lead times were up to 60 min, and forecast accuracy was measured by the BS. The ENS forecasts were compared with two other methods: a combined method (COM) and a neighbourhood method (NEI). NEI treated the extrapolated values in the square neighbourhood of 5 by 5 grid points around the point of interest as ensemble members, and the COM ensemble comprised the united ensemble members of ENS and NEI. The results showed that the calibration technique significantly reduces the bias of the probability forecasts by including additional uncertainties that correspond to processes neglected during the extrapolation. In addition, the calibration can also be used to find the limits of the maximum lead times for which the forecasting method is useful. We found that ENS is useful for lead times up to 60 min for thresholds of 0.1 and 1 mm/h and approximately 30 to 40 min for a threshold of 3 mm/h. We also found that a reasonable ensemble size is 100 members, which provided better scores than ensembles with 10, 25 and 50 members. In terms of the BS, the best results were obtained by ENS and COM, which are comparable. However, ENS is better calibrated and thus preferable.
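The Brier score used above for calibration and verification is simply the mean squared difference between forecast probabilities and binary outcomes (lower is better; 0 is perfect). A minimal sketch:

```python
# Brier score: mean squared error of probability forecasts against 0/1 outcomes.
def brier_score(probs, outcomes):
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)
```

The paper decomposes this score further (its reliability component drives the calibration); only the overall score is shown here.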
Statistical Post-Processing of Wind Speed Forecasts to Estimate Relative Economic Value
NASA Astrophysics Data System (ADS)
Courtney, Jennifer; Lynch, Peter; Sweeney, Conor
2013-04-01
The objective of this research is to obtain the best possible wind speed forecasts for the wind energy industry by using an optimal combination of well-established forecasting and post-processing methods. We start with the ECMWF 51-member ensemble prediction system (EPS), which is underdispersive and hence uncalibrated. We aim to produce wind speed forecasts that are more accurate and calibrated than the EPS. The 51 members of the EPS are clustered into 8 weighted representative members (RMs), chosen to minimize the within-cluster spread while maximizing the inter-cluster spread. The forecasts are then downscaled using two limited area models, WRF and COSMO, at two resolutions, 14 km and 3 km. This process creates four distinguishable ensembles which are used as input to statistical post-processes requiring multi-model forecasts. Two such processes are presented here. The first, Bayesian Model Averaging, has been shown to provide more calibrated and accurate wind speed forecasts than the ECMWF EPS using this multi-model input data. The second, heteroscedastic censored regression, is also showing positive results. We compare the two post-processing methods, applied to a year of hindcast wind speed data around Ireland, using an array of deterministic and probabilistic verification techniques, such as MAE, CRPS, probability integral transforms and verification rank histograms, to show which method provides the most accurate and calibrated forecasts. However, the value of a forecast to an end-user cannot be fully quantified by just the accuracy and calibration measurements mentioned, as the relationship between skill and value is complex. Capturing the full potential of the forecast benefits also requires detailed knowledge of the end-users' weather-sensitive decision-making processes and, most importantly, the economic impact it will have on their income. 
Finally, we present the continuous relative economic value of both post-processing methods to identify which is more beneficial to the wind energy industry of Ireland.
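Of the probabilistic verification scores mentioned above, the CRPS for an ensemble forecast has a convenient empirical form, CRPS = E|X − y| − (1/2)·E|X − X′|, estimated directly from the members. A minimal sketch (the O(m²) double sum is fine for small ensembles):

```python
# Empirical CRPS of an ensemble forecast against a scalar observation.
# First term: mean absolute error of members; second: half the mean
# absolute difference between member pairs (a spread correction).
def crps_ensemble(members, obs):
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(a - b) for a in members for b in members) / (2 * m * m)
    return term1 - term2
```

For a single-member "ensemble" the CRPS reduces to the absolute error, which is why it is often described as a probabilistic generalization of the MAE.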
An improved Multimodel Approach for Global Sea Surface Temperature Forecasts
NASA Astrophysics Data System (ADS)
Khan, M. Z. K.; Mehrotra, R.; Sharma, A.
2014-12-01
The concept of ensemble combinations for formulating improved climate forecasts has gained popularity in recent years. However, many climate models share similar physics or modeling processes, which may lead to similar (or strongly correlated) forecasts. Recent approaches for combining forecasts that take into consideration differences in model accuracy over space and time have either ignored the similarity of forecasts among the models or followed a pairwise dynamic combination approach. Here we present a basis for combining model predictions, illustrating the improvements that can be achieved if procedures for factoring in inter-model dependence are utilised. The utility of the approach is demonstrated by combining sea surface temperature (SST) forecasts from five climate models over the period 1960-2005. The variable of interest, the monthly global sea surface temperature anomalies (SSTA) on a 5° × 5° latitude-longitude grid, is predicted three months in advance to demonstrate the utility of the proposed algorithm. Results indicate that the proposed approach offers consistent and significant improvements for the majority of grid points compared to the case where the dependence among the models is ignored. Therefore, the proposed approach of combining multiple models by taking into account the existing interdependence provides an attractive alternative for obtaining improved climate forecasts. In addition, an approach to combine seasonal forecasts from multiple climate models with varying periods of availability is also demonstrated.
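The effect of inter-model dependence on combination weights can be illustrated with the textbook two-forecast case: for two unbiased forecasts with error variances v1, v2 and error covariance c, the minimum-variance weight on the first forecast is w = (v2 − c) / (v1 + v2 − 2c). Ignoring c (an independence assumption) yields suboptimal weights when the models' errors are correlated, which is the situation the abstract describes. This closed form is a standard result, not the paper's specific algorithm.

```python
# Minimum-variance combination of two unbiased, correlated forecasts.
# v1, v2: error variances; c: error covariance between the two forecasts.
def combine_two(f1, f2, v1, v2, c):
    w = (v2 - c) / (v1 + v2 - 2 * c)
    return w * f1 + (1 - w) * f2
```

With equal variances and zero covariance the weights reduce to a simple average; positive covariance shifts weight toward the more accurate forecast.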
NASA Astrophysics Data System (ADS)
Gastón, Martín; Fernández-Peruchena, Carlos; Körnich, Heiner; Landelius, Tomas
2017-06-01
The present work describes a first version of a new procedure to forecast Direct Normal Irradiance (DNI), the #hashtdim, which combines ground-based observations with Numerical Weather Predictions. The system focuses on generating predictions at very short lead times. It combines the outputs of the Numerical Weather Prediction model HARMONIE with an adaptive methodology based on machine learning. The DNI predictions are generated at 15-minute and hourly temporal resolutions and are updated every 3 hours. Each update provides forecasts for the next 12 hours: the first nine hours at 15-minute temporal resolution and the last three hours at hourly temporal resolution. The system is tested at a site in southern Spain with an operational BSRN station (PSA station). The #hashtdim has been implemented in the framework of the Direct Normal Irradiance Nowcasting methods for optimized operation of concentrating solar technologies (DNICast) project, under the European Union's Seventh Framework Programme for research, technological development and demonstration.
Forecasting the 2013–2014 influenza season using Wikipedia
Hickmann, Kyle S.; Fairchild, Geoffrey; Priedhorsky, Reid; ...
2015-05-14
Infectious diseases are one of the leading causes of morbidity and mortality around the world; thus, forecasting their impact is crucial for planning an effective response strategy. According to the Centers for Disease Control and Prevention (CDC), seasonal influenza affects 5% to 20% of the U.S. population and causes major economic impacts resulting from hospitalization and absenteeism. Understanding influenza dynamics and forecasting its impact is fundamental for developing prevention and mitigation strategies. We combine modern data assimilation methods with Wikipedia access logs and CDC influenza-like illness (ILI) reports to create a weekly forecast for seasonal influenza. The methods are applied to the 2013-2014 influenza season but are sufficiently general to forecast any disease outbreak, given incidence or case count data. We adjust the initialization and parametrization of a disease model and show that this allows us to determine systematic model bias. In addition, we provide a way to determine where the model diverges from observation and evaluate forecast accuracy. Wikipedia article access logs are shown to be highly correlated with historical ILI records and allow for accurate prediction of ILI data several weeks before it becomes available. The results show that prior to the peak of the flu season, our forecasting method produced 50% and 95% credible intervals for the 2013-2014 ILI observations that contained the actual observations for most weeks in the forecast. However, since our model does not account for re-infection or multiple strains of influenza, the tail of the epidemic is not predicted well after the peak of flu season has passed.
NASA Astrophysics Data System (ADS)
Hoss, F.; Fischbeck, P. S.
2014-10-01
This study further develops the method of quantile regression (QR) to predict exceedance probabilities of flood stages by post-processing forecasts. Using data from the 82 river gages for which the National Weather Service's North Central River Forecast Center issues daily forecasts, this is the first application of QR to US river gages. Archived forecasts for lead times up to six days from 2001-2013 were analyzed. Earlier implementations of QR used the forecast itself as the only independent variable (Weerts et al., 2011; López López et al., 2014). This study adds the rise rate of the river stage in the last 24 and 48 h and the forecast error 24 and 48 h ago to the QR model. Including those four variables significantly improved the forecasts, as measured by the Brier Skill Score (BSS). Mainly, the resolution increases, as the original QR implementation already delivered high reliability. Combining the forecast with the other four variables results in much less favorable BSSs. Lastly, forecast performance does not depend on the size of the training dataset, but on the year, river gage, lead time, and event threshold being forecast. We find that each event threshold requires a separate model configuration, or at least calibration.
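The QR post-processing idea above can be sketched in code: fit a linear model for a chosen quantile of the observed stage by minimizing the pinball loss, using the raw forecast plus auxiliary predictors. The sketch below is a minimal illustration on synthetic data (only a 24 h rise rate and a 24 h forecast error, not the full four-predictor set), using plain subgradient descent rather than the linear-programming solvers normally used for quantile regression.

```python
import numpy as np

def pinball_grad(X, y, w, tau):
    # Subgradient of the mean pinball (quantile) loss at quantile level tau.
    r = y - X @ w
    g = np.where(r > 0, -tau, 1.0 - tau)
    return X.T @ g / len(y)

def fit_quantile(X, y, tau, lr=0.05, n_iter=5000):
    # Linear quantile regression via plain subgradient descent (illustrative).
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        w -= lr * pinball_grad(X, y, w, tau)
    return w

rng = np.random.default_rng(0)
n = 2000
forecast = rng.uniform(1.0, 5.0, n)        # raw stage forecast [m]
rise_24h = rng.normal(0.0, 0.3, n)         # stage rise over the last 24 h [m]
err_24h = rng.normal(0.0, 0.2, n)          # forecast error 24 h ago [m]
obs = forecast + 0.5 * rise_24h + 0.3 * err_24h + rng.normal(0.0, 0.2, n)

X = np.column_stack([np.ones(n), forecast, rise_24h, err_24h])
w90 = fit_quantile(X, obs, tau=0.9)
coverage = np.mean(obs <= X @ w90)         # should sit close to 0.9
print(round(coverage, 2))
```

The fitted 0.9-quantile line, evaluated at any forecast, gives the stage that should be exceeded with roughly 10% probability, which is the exceedance information the study derives.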
Monthly streamflow forecasting at varying spatial scales in the Rhine basin
NASA Astrophysics Data System (ADS)
Schick, Simon; Rössler, Ole; Weingartner, Rolf
2018-02-01
Model output statistics (MOS) methods can be used to empirically relate an environmental variable of interest to predictions from earth system models (ESMs). This variable often belongs to a spatial scale not resolved by the ESM. Here, using a linear model fitted by least squares, we regress monthly mean streamflow of the Rhine River at Lobith and Basel against seasonal predictions of precipitation, surface air temperature, and runoff from the European Centre for Medium-Range Weather Forecasts. To address potential effects of a scale mismatch between the ESM's horizontal grid resolution and the hydrological application, the MOS method is further tested with an experiment conducted at the subcatchment scale. This experiment applies the MOS method to 133 additional gauging stations located within the Rhine basin and combines the forecasts from the subcatchments to predict streamflow at Lobith and Basel. In doing so, the MOS method is tested for catchment areas covering 4 orders of magnitude. Using data from the period 1981-2011, the results show that skill, with respect to climatology, is restricted on average to the first month ahead. This result holds both for the predictor combination that mimics the initial conditions and for the predictor combinations that additionally include the dynamical seasonal predictions. The latter, however, reduce the mean absolute error of the former by 5 to 12%, which is consistently reproduced at the subcatchment scale. An additional experiment conducted for 5-day mean streamflow indicates that the dynamical predictions help to reduce uncertainties up to about 20 days ahead, but it also reveals some shortcomings of the present MOS method.
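The MOS step described above is, at its core, an ordinary least-squares regression of the target variable on ESM output. A minimal sketch with synthetic monthly data (the variable names and coefficients are illustrative, not taken from the study):

```python
import numpy as np

# Toy MOS sketch: relate monthly mean streamflow to ESM-predicted
# precipitation, temperature, and runoff via ordinary least squares.
rng = np.random.default_rng(1)
n = 360  # 30 years of monthly values (synthetic)
precip = rng.gamma(2.0, 40.0, n)                 # predicted precipitation [mm]
temp = rng.normal(10.0, 8.0, n)                  # predicted temperature [degC]
runoff = 0.4 * precip + rng.normal(0, 10, n)     # predicted runoff [mm]
flow = 5.0 + 12.0 * runoff - 3.0 * temp + rng.normal(0, 50, n)

X = np.column_stack([np.ones(n), precip, temp, runoff])
coef, *_ = np.linalg.lstsq(X, flow, rcond=None)
pred = X @ coef

# Skill relative to climatology (the mean of the training data):
mae_model = np.mean(np.abs(flow - pred))
mae_clim = np.mean(np.abs(flow - flow.mean()))
print(mae_model < mae_clim)
```

The comparison against the climatological mean loosely mirrors how the study measures whether the regression adds skill over climatology.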
Seasonal drought predictability in Portugal using statistical-dynamical techniques
NASA Astrophysics Data System (ADS)
Ribeiro, A. F. S.; Pires, C. A. L.
2016-08-01
Atmospheric forecasting and predictability are important for promoting adaptation and mitigation measures that minimize drought impacts. This study estimates hybrid (statistical-dynamical) long-range forecasts of the regional drought index SPI (3-months) over homogeneous regions of mainland Portugal, based on forecasts from the UKMO operational forecasting system, with lead times up to 6 months. ERA-Interim reanalysis data are used to build a set of SPI predictors integrating recent past information prior to the forecast launch. Then, the advantage of combining predictors of both dynamical and statistical origin in the prediction of drought conditions at different lags is evaluated. A two-step hybridization procedure is performed, in which both forecasted and observed 500 hPa geopotential height fields are subjected to a PCA in order to use forecasted PCs and persistent PCs as predictors. The second hybridization step consists of a statistical/hybrid downscaling to the regional SPI, based on regression techniques, after pre-selection of the statistically significant predictors. The SPI forecasts and the added value of combining dynamical and statistical methods are evaluated in cross-validation mode, using the R2 and binary event scores. Results are obtained for all four seasons; winter is found to be the most predictable season, and most of the predictive power lies in the large-scale fields from past observations. The hybridization improves the downscaling based on the forecasted PCs, since they provide complementary (though modest) information beyond that of the persistent PCs. These findings provide clues about the predictability of the SPI, particularly in Portugal, and may contribute to the predictability of crop yields and provide some guidance for users (such as farmers) in their decision-making.
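The two-step hybridization can be illustrated in miniature: apply PCA to a (time × gridpoint) geopotential-height anomaly field, then regress the drought index on the leading PCs. The field, the index, and the number of retained PCs below are synthetic stand-ins, not values from the study:

```python
import numpy as np

# Step 1: PCA of a synthetic 500 hPa geopotential-height anomaly field.
rng = np.random.default_rng(2)
n_t, n_grid = 200, 500
pattern = rng.normal(size=n_grid)            # one dominant spatial pattern
pc_true = rng.normal(size=n_t)               # its time amplitude
z500 = np.outer(pc_true, pattern) + 0.5 * rng.normal(size=(n_t, n_grid))

anom = z500 - z500.mean(axis=0)              # anomalies
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
pcs = U[:, :3] * s[:3]                       # leading three PCs as predictors

# Step 2: regress a synthetic SPI-like index on the retained PCs.
spi = 0.8 * pc_true + 0.3 * rng.normal(size=n_t)
X = np.column_stack([np.ones(n_t), pcs])
beta, *_ = np.linalg.lstsq(X, spi, rcond=None)
r2 = 1 - np.sum((spi - X @ beta) ** 2) / np.sum((spi - spi.mean()) ** 2)
print(r2 > 0.5)
```

In the study the PC predictors come from both the forecast system and past observations; here a single synthetic field stands in for either source.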
NASA Astrophysics Data System (ADS)
Liu, Li; Gao, Chao; Xuan, Weidong; Xu, Yue-Ping
2017-11-01
Ensemble flood forecasts by hydrological models using numerical weather prediction products as forcing data are becoming more commonly used in operational flood forecasting applications. In this study, a hydrological ensemble flood forecasting system comprised of an automatically calibrated Variable Infiltration Capacity model and quantitative precipitation forecasts from TIGGE dataset is constructed for Lanjiang Basin, Southeast China. The impacts of calibration strategies and ensemble methods on the performance of the system are then evaluated. The hydrological model is optimized by the parallel programmed ε-NSGA II multi-objective algorithm. According to the solutions by ε-NSGA II, two differently parameterized models are determined to simulate daily flows and peak flows at each of the three hydrological stations. Then a simple yet effective modular approach is proposed to combine these daily and peak flows at the same station into one composite series. Five ensemble methods and various evaluation metrics are adopted. The results show that ε-NSGA II can provide an objective determination on parameter estimation, and the parallel program permits a more efficient simulation. It is also demonstrated that the forecasts from ECMWF have more favorable skill scores than other Ensemble Prediction Systems. The multimodel ensembles have advantages over all the single model ensembles and the multimodel methods weighted on members and skill scores outperform other methods. Furthermore, the overall performance at three stations can be satisfactory up to ten days, however the hydrological errors can degrade the skill score by approximately 2 days, and the influence persists until a lead time of 10 days with a weakening trend. With respect to peak flows selected by the Peaks Over Threshold approach, the ensemble means from single models or multimodels are generally underestimated, indicating that the ensemble mean can bring overall improvement in forecasting of flows. 
For peak values, taking flood forecasts from each individual ensemble member into account is more appropriate.
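One of the ensemble methods mentioned above weights members by skill scores. A hedged sketch of the general idea, using inverse-MSE weights (the optimal linear weights for independent, unbiased members) on synthetic forecasts rather than the specific skill scores used in the study:

```python
import numpy as np

def inverse_mse_weights(errors_sq):
    # Weight each model by the inverse of its mean squared error.
    inv = 1.0 / np.asarray(errors_sq)
    return inv / inv.sum()

rng = np.random.default_rng(3)
truth = rng.normal(100.0, 20.0, 2000)                 # "observed" flows
# Three synthetic models with different error levels:
models = [truth + rng.normal(0, s, 2000) for s in (5.0, 15.0, 30.0)]

mse = [np.mean((m - truth) ** 2) for m in models]
w = inverse_mse_weights(mse)
combined = sum(wi * m for wi, m in zip(w, models))

mae = [np.mean(np.abs(m - truth)) for m in models]
mae_combined = np.mean(np.abs(combined - truth))
print(mae_combined < min(mae))
```

With independent errors the weighted combination beats even the best single member, which is the advantage of skill-weighted multimodel ensembles reported above.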
Evaluation of Clear-Air Turbulence Diagnostics: GTG in Korea
NASA Astrophysics Data System (ADS)
Kim, J.-H.; Chun, H.-Y.; Jang, W.; Sharman, R. D.
2009-04-01
The Graphical Turbulence Guidance (GTG) turbulence forecasting system developed at NCAR (Sharman et al., 2006) is evaluated against available turbulence observations (pilot reports; PIREPs) from South Korea over four recent years (2003-2007). Clear-air turbulence (CAT) events are extracted from the PIREPs using cloud-to-ground lightning flash data from the Korea Meteorological Administration (KMA). The GTG system involves several steps. First, 45 turbulence indices are calculated over the East Asian region near the Korean peninsula using Regional Data Assimilation and Prediction System (RDAPS) analysis data with 30 km horizontal grid spacing, provided by KMA. Second, the 10 CAT indices with the best forecasting scores are selected. The scoring method is based on the probability of detection, calculated using PIREPs of moderate-or-greater intensity only. Various statistical examinations and sensitivity tests of the GTG system are performed on yearly and seasonally classified PIREPs in South Korea. The performance of GTG is more consistent and stable than that of any individual diagnostic in each year and season. In addition, current-year forecasting based on yearly PIREPs is better than adjacent-year and year-after-year forecasting. Seasonal forecasting is generally better than yearly forecasting, because the CAT indices selected for each season represent the meteorological conditions more appropriately than indices selected across all seasons. Wintertime forecasting is the best of the four seasons, likely because the GTG system contains many CAT indices related to the jet stream, and jet-related turbulence is most active in wintertime when the jet is strong. Summertime forecasting skill, by contrast, is much lower than in wintertime.
To improve summertime forecasting, additional turbulence indices related to, for example, convection would likely be needed. A sensitivity test on the number of combined indices shows that both yearly and seasonal GTG perform best when about 7 CAT indices are combined.
Voukantsis, Dimitris; Karatzas, Kostas; Kukkonen, Jaakko; Räsänen, Teemu; Karppinen, Ari; Kolehmainen, Mikko
2011-03-01
In this paper we propose a methodology consisting of specific computational intelligence methods, i.e. principal component analysis and artificial neural networks, in order to inter-compare air quality and meteorological data and to forecast the concentration levels of environmental parameters of interest (air pollutants). We apply these methods to data monitored in the urban areas of Thessaloniki and Helsinki, in Greece and Finland, respectively. For this purpose, we applied the principal component analysis method in order to inter-compare the patterns of air pollution in the two selected cities. We then proceeded with the development of air quality forecasting models for both studied areas. On this basis, we formulated and employed a novel hybrid scheme for the selection of input variables for the forecasting models, involving a combination of linear regression and artificial neural network (multi-layer perceptron) models. The latter were used for forecasting the daily mean concentrations of PM₁₀ and PM₂.₅ for the next day. Results demonstrated an index of agreement between measured and modelled daily averaged PM₁₀ concentrations of between 0.80 and 0.85, while the kappa index for the forecasting of the daily averaged PM₁₀ concentrations reached 60% for both cities. Compared with previous corresponding studies, these statistical parameters indicate an improved performance of air quality parameter forecasting. It was also found that the performance of the models for forecasting the daily mean concentrations of PM₁₀ was not substantially different between the two cities, despite the major differences between the two urban environments under consideration.
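The hybrid input-selection scheme pairs linear regression with a multi-layer perceptron. The stand-alone sketch below shows only the first half on synthetic data: greedily ranking candidate predictors by the R² gain of a linear model of next-day PM₁₀, after which the selected inputs would feed the neural network. The variable names and coefficients are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
cands = {
    "pm10_today": rng.normal(40, 10, n),
    "wind_speed": rng.normal(4, 1.5, n),
    "temperature": rng.normal(12, 6, n),
    "humidity": rng.normal(70, 10, n),     # irrelevant in this toy setup
}
pm10_next = (0.7 * cands["pm10_today"] - 3.0 * cands["wind_speed"]
             + 0.2 * cands["temperature"] + rng.normal(0, 4, n))

def r2_with(names):
    # R^2 of a linear model of next-day PM10 on the named predictors.
    X = np.column_stack([np.ones(n)] + [cands[k] for k in names])
    beta, *_ = np.linalg.lstsq(X, pm10_next, rcond=None)
    resid = pm10_next - X @ beta
    return 1 - resid.var() / pm10_next.var()

selected = []
for _ in range(2):  # greedily keep the two strongest predictors
    best = max((k for k in cands if k not in selected),
               key=lambda k: r2_with(selected + [k]))
    selected.append(best)
print(selected)
```

In this toy setup the persistence term and wind speed carry the signal, so those two are selected; the weakly informative and irrelevant candidates are dropped.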
Statistical security for Social Security.
Soneji, Samir; King, Gary
2012-08-01
The financial viability of Social Security, the single largest U.S. government program, depends on accurate forecasts of the solvency of its intergenerational trust fund. We begin by detailing information necessary for replicating the Social Security Administration's (SSA's) forecasting procedures, which until now has been unavailable in the public domain. We then offer a way to improve the quality of these procedures via age- and sex-specific mortality forecasts. The most recent SSA mortality forecasts were based on the best available technology at the time, which was a combination of linear extrapolation and qualitative judgments. Unfortunately, linear extrapolation excludes known risk factors and is inconsistent with long-standing demographic patterns, such as the smoothness of age profiles. Modern statistical methods typically outperform even the best qualitative judgments in these contexts. We show how to use such methods, enabling researchers to forecast using far more information, such as the known risk factors of smoking and obesity and known demographic patterns. Including this extra information makes a substantial difference. For example, by improving only mortality forecasting methods, we predict three fewer years of net surplus, $730 billion less in Social Security Trust Funds, and program costs that are 0.66% greater for projected taxable payroll by 2031 compared with SSA projections. More important than specific numerical estimates are the advantages of transparency, replicability, reduction of uncertainty, and what may be the resulting lower vulnerability to the politicization of program forecasts. In addition, by offering with this article software and detailed replication information, we hope to marshal the efforts of the research community to include ever more informative inputs and to continue to reduce uncertainties in Social Security forecasts.
Hansen, J V; Nelson, R D
1997-01-01
Ever since the initial planning for the 1997 Utah legislative session, neural-network forecasting techniques have provided valuable insights for analysts forecasting tax revenues. These revenue estimates are critically important since agency budgets, support for education, and improvements to infrastructure all depend on their accuracy. Underforecasting generates windfalls that concern taxpayers, whereas overforecasting produces budget shortfalls that cause inadequately funded commitments. The pattern-finding ability of neural networks gives insightful and alternative views of the seasonal and cyclical components commonly found in economic time series data. Two applications of neural networks to revenue forecasting clearly demonstrate how these models complement traditional time series techniques. In the first, preoccupation with a potential downturn in the economy distracts analysis based on traditional time series methods so that it overlooks an emerging new phenomenon in the data. In this case, neural networks identify the new pattern, which then allows modification of the time series models and finally gives more accurate forecasts. In the second application, data structure found by traditional statistical tools allows analysts to provide neural networks with important information that the networks then use to create more accurate models. In summary, for the Utah revenue outlook, the insights that result from a portfolio of forecasts that includes neural networks exceed the understanding generated by strictly statistical forecasting techniques. In this case, the synergy clearly results in the whole of the portfolio of forecasts being more accurate than the sum of the individual parts.
The Betting Odds Rating System: Using soccer forecasts to forecast soccer.
Wunderlich, Fabian; Memmert, Daniel
2018-01-01
Betting odds are frequently found to outperform mathematical models in sports-related forecasting tasks; however, the factors contributing to betting odds are not fully traceable, and, in contrast to rating-based forecasts, no straightforward measure of team-specific quality is deducible from the betting odds. The present study investigates the approach of combining the methods of mathematical models with the information included in betting odds. A soccer forecasting model based on the well-known ELO rating system, taking advantage of betting odds as a source of information, is presented. Data from almost 15,000 soccer matches (seasons 2007/2008 until 2016/2017) are used, including both domestic matches (English Premier League, German Bundesliga, Spanish Primera Division and Italian Serie A) and international matches (UEFA Champions League, UEFA Europa League). The novel betting-odds-based ELO model is shown to outperform classic ELO models, thus demonstrating that betting odds prior to a match contain more relevant information than the result of the match itself. It is shown how the novel model can help to gain valuable insights into the quality of soccer teams and its development over time, and thus has a practical benefit in performance analysis. Moreover, it is argued that network-based approaches might help in further improving rating and forecasting methods.
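The core idea of a betting-odds-based ELO model can be sketched as follows: ratings are updated toward the win probability implied by the margin-corrected odds instead of toward the actual result. The function names, home-advantage constant, and K-factor below are illustrative assumptions, not values from the paper:

```python
import math

def expected_score(r_home, r_away, home_adv=100.0):
    # Standard ELO expected score with a fixed home-advantage bonus.
    return 1.0 / (1.0 + 10 ** (-(r_home + home_adv - r_away) / 400.0))

def implied_probs(odds_home, odds_draw, odds_away):
    # Remove the bookmaker margin by normalising inverse decimal odds.
    inv = [1.0 / o for o in (odds_home, odds_draw, odds_away)]
    s = sum(inv)
    return [p / s for p in inv]

def odds_elo_update(r_home, r_away, odds, k=20.0):
    # Update toward the odds-implied "score" rather than the match result.
    p_home, p_draw, p_away = implied_probs(*odds)
    target = p_home + 0.5 * p_draw
    delta = k * (target - expected_score(r_home, r_away))
    return r_home + delta, r_away - delta

home, away = 1500.0, 1500.0
# Odds say the home side is a clear favourite: its rating should rise.
home2, away2 = odds_elo_update(home, away, (1.5, 4.0, 6.0))
print(home2 > home and away2 < away)
```

Because the odds-implied probability is available before kickoff, this update injects the bookmakers' information into the rating even when the eventual result is an upset.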
Ren, Hong; Li, Jian; Yuan, Zheng-An; Hu, Jia-Yu; Yu, Yan; Lu, Yi-Han
2013-09-08
Sporadic hepatitis E has become an important public health concern in China. Accurate forecasting of the incidence of hepatitis E is needed to better plan future medical needs. Few mathematical models can be used because hepatitis E morbidity data have both linear and nonlinear patterns. We developed a combined mathematical model using an autoregressive integrated moving average model (ARIMA) and a back propagation neural network (BPNN) to forecast the incidence of hepatitis E. The morbidity data of hepatitis E in Shanghai from 2000 to 2012 were retrieved from the China Information System for Disease Control and Prevention. The ARIMA-BPNN combined model was trained with 144 months of morbidity data from January 2000 to December 2011, validated with 12 months of data from January 2012 to December 2012, and then employed to forecast hepatitis E incidence from January 2013 to December 2013 in Shanghai. Residual analysis, Root Mean Square Error (RMSE), normalized Bayesian Information Criterion (BIC), and stationary R square methods were used to compare the goodness-of-fit among ARIMA models. The Bayesian regularization back-propagation algorithm was used to train the network. The mean error rate (MER) was used to assess the validity of the combined model. A total of 7,489 hepatitis E cases were reported in Shanghai from 2000 to 2012. Goodness-of-fit (stationary R2=0.531, BIC=-4.768, Ljung-Box Q statistic=15.59, P=0.482) and parameter estimates were used to determine the best-fitting model, ARIMA (0,1,1)×(0,1,1)12. Predicted morbidity values for 2012 from the best-fitting ARIMA model and actual morbidity data from 2000 to 2011 were used to further construct the combined model. The MER of the ARIMA model and the ARIMA-BPNN combined model were 0.250 and 0.176, respectively. The forecasted incidence of hepatitis E in 2013 was 0.095 to 0.372 per 100,000 population. There was a seasonal variation, with a peak during January-March and a nadir during August-October.
Time series analysis suggested a seasonal pattern of hepatitis E morbidity in Shanghai, China. The ARIMA-BPNN combined model fit both the linear and nonlinear patterns of the time series data and accurately forecast hepatitis E infections.
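The ARIMA-BPNN decomposition can be imitated in miniature: a linear autoregressive fit (standing in for ARIMA) handles the linear structure, and a second model fitted to its residuals (represented here by month-of-year means rather than a BPNN) captures the remaining nonlinear seasonality. The MER below follows a mean-absolute-relative-error form; all series are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)
months = np.arange(156)                       # 13 years of monthly incidence
seasonal = 0.1 * np.sin(2 * np.pi * months / 12) ** 2  # nonlinear seasonality
incidence = 0.2 + seasonal + rng.normal(0, 0.01, 156)

# Stage 1: linear AR(1) model fitted by least squares (ARIMA stand-in).
y, y_lag = incidence[1:], incidence[:-1]
A = np.column_stack([np.ones(len(y_lag)), y_lag])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
linear_pred = A @ coef
resid = y - linear_pred

# Stage 2: model the residuals with month-of-year means (BPNN stand-in).
moy = months[1:] % 12
resid_model = np.array([resid[moy == m].mean() for m in range(12)])
combined_pred = linear_pred + resid_model[moy]

mer = lambda obs, pred: np.mean(np.abs(obs - pred) / obs)
print(mer(y, combined_pred) < mer(y, linear_pred))
```

The second stage only has to explain what the linear stage missed, which is exactly the division of labour between the ARIMA and BPNN components described above.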
Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)
NASA Astrophysics Data System (ADS)
Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.
2013-12-01
Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving these datasets on web-based platforms such as www.quakesim.org, www.e-decider.org, and www.openhazards.com.
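The natural-time idea of turning small-earthquake counts into large-earthquake probabilities can be sketched with a Weibull model: the conditional probability of a large event within the next dn small events, given that n have occurred since the last large one, rises when the Weibull shape parameter exceeds 1. The scale and shape values below are purely illustrative, not fitted to any catalog or taken from the NTW paper:

```python
import math

def conditional_weibull_prob(n, dn, scale=300.0, shape=1.4):
    # P(large event within the next dn small events | n since the last one),
    # with the inter-event small-quake count Weibull-distributed.
    surv = lambda x: math.exp(-((x / scale) ** shape))
    return 1.0 - surv(n + dn) / surv(n)

p_early = conditional_weibull_prob(n=50, dn=50)
p_late = conditional_weibull_prob(n=500, dn=50)
print(p_late > p_early)  # hazard grows as small-event counts accumulate
```

With shape > 1 the hazard is increasing, so a region that has accumulated many small events since its last large earthquake is assigned a higher probability, which is the qualitative behaviour the NTW method exploits.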
NASA Astrophysics Data System (ADS)
Rollett, T.; Möstl, C.; Isavnin, A.; Davies, J. A.; Kubicka, M.; Amerstorfer, U. V.; Harrison, R. A.
2016-06-01
In this study, we present a new method for forecasting arrival times and speeds of coronal mass ejections (CMEs) at any location in the inner heliosphere. This new approach enables the adoption of a highly flexible geometrical shape for the CME front, with an adjustable CME angular width and an adjustable radius of curvature of its leading edge, i.e., the assumed geometry is elliptical. Using, as input, Solar TErrestrial RElations Observatory (STEREO) heliospheric imager (HI) observations, a new elliptic conversion (ElCon) method is introduced and combined with drag-based model (DBM) fitting to quantify the deceleration or acceleration experienced by CMEs during propagation. The result is then used as input for the Ellipse Evolution Model (ElEvo). Together, ElCon, DBM fitting, and ElEvo form the novel ElEvoHI forecasting utility. To demonstrate the applicability of ElEvoHI, we forecast the arrival times and speeds of 21 CMEs remotely observed by STEREO/HI and compare them to in situ arrival times and speeds at 1 AU. Compared to the commonly used STEREO/HI fitting techniques (Fixed-ϕ, Harmonic Mean, and Self-similar Expansion fitting), ElEvoHI improves the arrival time forecast by about 2 hr, to ±6.5 hr, and the arrival speed forecast by ≈250 km s-1, to ±53 km s-1, depending on the ellipse aspect ratio assumed. In particular, the remarkable improvement of the arrival speed prediction is potentially beneficial for predicting geomagnetic storm strength at Earth.
NASA Astrophysics Data System (ADS)
Merker, Claire; Ament, Felix; Clemens, Marco
2017-04-01
The quantification of measurement uncertainty for rain radar data remains challenging. Radar reflectivity measurements are affected, amongst other things, by calibration errors, noise, blocking and clutter, and attenuation. Their combined impact on measurement accuracy is difficult to quantify due to incomplete process understanding and complex interdependencies. An improved quality assessment of rain radar measurements is of interest for applications both in meteorology and hydrology, for example for precipitation ensemble generation, rainfall runoff simulations, or in data assimilation for numerical weather prediction. Especially a detailed description of the spatial and temporal structure of errors is beneficial in order to make best use of the areal precipitation information provided by radars. Radar precipitation ensembles are one promising approach to represent spatially variable radar measurement errors. We present a method combining ensemble radar precipitation nowcasting with data assimilation to estimate radar measurement uncertainty at each pixel. This combination of ensemble forecast and observation yields a consistent spatial and temporal evolution of the radar error field. We use an advection-based nowcasting method to generate an ensemble reflectivity forecast from initial data of a rain radar network. Subsequently, reflectivity data from single radars is assimilated into the forecast using the Local Ensemble Transform Kalman Filter. The spread of the resulting analysis ensemble provides a flow-dependent, spatially and temporally correlated reflectivity error estimate at each pixel. We will present first case studies that illustrate the method using data from a high-resolution X-band radar network.
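The spread-as-uncertainty principle at the end of the pipeline is simple to state in code: the per-pixel standard deviation across the analysis ensemble serves as the error estimate. The synthetic stand-in below omits the nowcasting and Kalman-filter steps and shows only the final spread computation:

```python
import numpy as np

rng = np.random.default_rng(6)
n_members, ny, nx = 20, 50, 50
base = 30.0 + 10.0 * rng.random((ny, nx))      # synthetic reflectivity [dBZ]

# Perturb the right half of the domain more strongly, mimicking a region
# where the ensemble disagrees (e.g. poorly observed or fast-evolving rain):
noise_scale = np.ones((ny, nx))
noise_scale[:, 25:] = 3.0
ensemble = base + noise_scale * rng.normal(size=(n_members, ny, nx))

spread = ensemble.std(axis=0)                  # per-pixel error estimate
print(spread[:, 25:].mean() > spread[:, :25].mean())
```

The spread field is larger exactly where the members disagree, giving the flow-dependent, spatially correlated error map the abstract describes.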
Forecast Method of Solar Irradiance with Just-In-Time Modeling
NASA Astrophysics Data System (ADS)
Suzuki, Takanobu; Goto, Yusuke; Terazono, Takahiro; Wakao, Shinji; Oozeki, Takashi
PV power output depends mainly on solar irradiance, which is affected by various meteorological factors, so forecasting future solar irradiance is required for the efficient operation of PV systems. In this paper, we develop a novel approach to solar irradiance forecasting that combines a black-box model (Just-In-Time (JIT) modeling) with a physical model (GPV data). We investigate the predictive accuracy of solar irradiance over the wide control area of each electric power company by utilizing measured data from the 44 observation points throughout Japan offered by the JMA and the 64 points around Kanto offered by NEDO. Finally, we propose a method for applying the solar irradiance forecast to points for which compiling a database is difficult, and we consider the influence of different GPV initial (default) times on the solar irradiance prediction.
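Just-In-Time modeling is a lazy, database-driven approach: instead of fitting one global model, a local prediction is built on demand from the stored situations most similar to the current query. A minimal sketch of that idea, with a simple k-nearest-neighbour average standing in for the full method and invented toy data:

```python
def jit_forecast(database, query, k=3):
    """Predict irradiance by averaging the k stored records whose
    feature vectors are closest (squared Euclidean) to the query."""
    ranked = sorted(database,
                    key=lambda rec: sum((a - b) ** 2
                                        for a, b in zip(rec[0], query)))
    return sum(y for _, y in ranked[:k]) / k

# Toy database of (meteorological features, observed irradiance in W/m^2).
db = [((0.1, 10.0), 700.0), ((0.2, 11.0), 650.0),
      ((0.8, 12.0), 200.0), ((0.9, 13.0), 150.0)]
pred = jit_forecast(db, query=(0.15, 10.5), k=2)  # near the two clear-sky records
```

The local model is rebuilt for every query, which is why the approach degrades at sites lacking a historical database, the problem the final part of the abstract addresses.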
NASA Astrophysics Data System (ADS)
He, Shixuan; Xie, Wanyi; Zhang, Ping; Fang, Shaoxi; Li, Zhe; Tang, Peng; Gao, Xia; Guo, Jinsong; Tlili, Chaker; Wang, Deqiang
2018-02-01
The analysis of algae, and of the dominant alga in particular, plays an important role in ecological and environmental fields, since it can be used to forecast water blooms and control their potential deleterious effects. Herein, we combine in vivo confocal resonance Raman spectroscopy with multivariate analysis methods to preliminarily identify three algal genera in water blooms at the unicellular scale. Statistical analysis of characteristic Raman peaks demonstrates that certain shifts and different normalized intensities, resulting from the composition of different carotenoids, exist in the Raman spectra of the three algal cells. Principal component analysis (PCA) scores and the corresponding loading weights show some differences arising from Raman spectral characteristics that are caused by vibrations of carotenoids in unicellular algae. Then, the discriminant partial least squares (DPLS) classification method is used to verify the effectiveness of algal identification with confocal resonance Raman spectroscopy. Our results show that confocal resonance Raman spectroscopy combined with PCA and DPLS can handle the preliminary identification of the dominant alga for the forecasting and control of water blooms.
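The discrimination described above rests on normalized intensities of carotenoid Raman peaks differing between genera. A minimal sketch of that idea using a nearest-centroid rule on normalized peak intensities; the genus names and peak values are invented for illustration, and the paper's actual pipeline is PCA followed by DPLS, not this rule:

```python
def normalize(spectrum):
    """Scale peak intensities so the strongest peak equals 1."""
    m = max(spectrum)
    return [x / m for x in spectrum]

def nearest_centroid(sample, centroids):
    """Assign a spectrum to the genus whose mean normalized peak
    pattern is closest in squared-error terms."""
    s = normalize(sample)
    return min(centroids,
               key=lambda g: sum((a - b) ** 2
                                 for a, b in zip(s, centroids[g])))

# Invented normalized intensities at three carotenoid peak positions.
centroids = {"Microcystis": [1.0, 0.45, 0.30],
             "Anabaena":    [1.0, 0.60, 0.20],
             "Chlorella":   [1.0, 0.35, 0.55]}
genus = nearest_centroid([950.0, 420.0, 290.0], centroids)
```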
NASA Astrophysics Data System (ADS)
E, Jianwei; Bao, Yanling; Ye, Jimin
2017-10-01
As one of the most vital energy resources in the world, crude oil plays a significant role in the international economic market, and the fluctuation of its price has attracted academic and commercial attention. Many methods exist for forecasting the trend of the crude oil price, but traditional models often fail to predict it accurately. This paper therefore proposes a hybrid method that combines variational mode decomposition (VMD), independent component analysis (ICA) and the autoregressive integrated moving average (ARIMA), called VMD-ICA-ARIMA. The purpose of this study is to analyze the factors influencing the crude oil price and to predict its future value. The major steps are as follows. Firstly, applying the VMD model to the original signal (the crude oil price), the mode functions are decomposed adaptively. Secondly, independent components are separated by the ICA, and how the independent components affect the crude oil price is analyzed. Finally, the crude oil price is forecast with the ARIMA model; the forecast trend demonstrates that the crude oil price declines periodically. Compared with the benchmark ARIMA and EEMD-ICA-ARIMA, VMD-ICA-ARIMA forecasts the crude oil price more accurately.
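The decompose-forecast-recombine pattern behind VMD-ICA-ARIMA can be illustrated with much simpler stand-ins: a moving-average split into a smooth component and a residual, trend persistence for the smooth part, and a decaying forecast for the residual. This is a structural sketch only, with invented data, and none of its components are the authors' actual VMD, ICA or ARIMA steps:

```python
def moving_average(series, w=3):
    """Trailing moving average as a crude 'low-frequency mode'."""
    return [sum(series[max(0, i - w + 1): i + 1])
            / len(series[max(0, i - w + 1): i + 1])
            for i in range(len(series))]

def hybrid_forecast(series, w=3):
    """Decompose into trend + residual, forecast each component,
    then recombine (trend persistence + residual decay toward 0)."""
    trend = moving_average(series, w)
    resid = [x - t for x, t in zip(series, trend)]
    return trend[-1] + 0.5 * resid[-1]

prices = [60.0, 62.0, 61.0, 63.0, 64.0, 66.0]  # toy daily prices
fc = hybrid_forecast(prices)
```

The point of the decomposition is that each component is simpler to forecast than the raw series; VMD-ICA-ARIMA applies the same logic with far more capable components.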
Zhang, Jie; Hodge, Bri -Mathias; Lu, Siyuan; ...
2015-11-10
Accurate solar photovoltaic (PV) power forecasting allows utilities to reliably utilize solar resources on their systems. However, to truly measure the improvements that any new solar forecasting methods provide, it is important to develop a methodology for determining baseline and target values for the accuracy of solar forecasting at different spatial and temporal scales. This paper aims at developing a framework to derive baseline and target values for a suite of generally applicable, value-based, and custom-designed solar forecasting metrics. The work was informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models in combination with a radiative transfer model. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of PV power output. The proposed reserve-based methodology is a reasonable and practical approach that can be used to assess the economic benefits gained from improvements in accuracy of solar forecasting. Lastly, the financial baseline and targets can be translated back to forecasting accuracy metrics and requirements, which will guide research on solar forecasting improvements toward the areas that are most beneficial to power systems operations.
Leveraging LSTM for rapid intensifications prediction of tropical cyclones
NASA Astrophysics Data System (ADS)
Li, Y.; Yang, R.; Yang, C.; Yu, M.; Hu, F.; Jiang, Y.
2017-10-01
Tropical cyclones (TCs) usually cause severe damage and destruction. TC intensity forecasting helps people prepare for the extreme weather and can save lives and property. Rapid intensifications (RI) of TCs are the major source of error in TC intensity forecasting. A large number of factors, such as sea surface temperature and wind shear, affect the RI processes of TCs, and considerable work has been done to identify the combination of conditions most favorable to RI. In this study, a deep learning method is utilized to combine conditions for RI prediction of TCs. Experiments show that the long short-term memory (LSTM) network provides the ability to leverage past conditions to predict TC rapid intensifications.
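Rapid intensification is conventionally defined as an intensity increase of at least 30 kt within 24 h, and labels derived this way from best-track intensity records are what a sequence classifier such as an LSTM would be trained to predict. A minimal labelling sketch for 6-hourly intensity data; the threshold and window follow the common convention, not necessarily this paper's exact choice:

```python
def label_rapid_intensification(intensities_kt, window=4, threshold=30):
    """Label each 6-hourly step 1 if intensity rises by at least
    `threshold` kt over the next `window` steps (24 h), else 0."""
    return [1 if intensities_kt[i + window] - intensities_kt[i] >= threshold
            else 0
            for i in range(len(intensities_kt) - window)]

# A storm that jumps 35 kt in each 24 h window is labelled RI throughout.
labels = label_rapid_intensification([50, 55, 65, 75, 85, 90])
```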
NASA Astrophysics Data System (ADS)
Gibergans-Báguena, J.; Llasat, M. C.
2007-12-01
The objective of this paper is to present an improvement in the quantitative forecasting of daily rainfall in Catalonia (NE Spain) using an analogues technique that takes into account synoptic and local data. The method is based on an analogues sorting technique: meteorological situations similar to the current one, in terms of the 700 and 1000 hPa geopotential fields at 00 UTC, are sought in a historical data file, complemented with the inclusion of some thermodynamic parameters. The thermodynamic analysis acts as a highly discriminating feature for situations in which the synoptic situation fails to explain either the atmospheric phenomena or the rainfall distribution. This is the case in heavy rainfall situations, where the existence of instability and high water vapor content is essential. With the objective of including these vertical thermodynamic features, information provided by the Palma de Mallorca radiosounding (Spain) has been used. First, a selection of the thermodynamic parameters most discriminating for daily rainfall was made, and the analogues technique was then applied to them. Finally, three analog forecasting methods were applied to quantitative daily rainfall forecasting in Catalonia: the first is based on analogies of geopotential fields at the synoptic scale; the second is based exclusively on the search for similarity in local thermodynamic information; and the third combines the other two. The results show that this last method provides a substantial improvement in quantitative rainfall estimation.
Fuzzy logic-based analogue forecasting and hybrid modelling of horizontal visibility
NASA Astrophysics Data System (ADS)
Tuba, Zoltán; Bottyán, Zsolt
2018-04-01
Forecasting visibility is one of the greatest challenges in aviation meteorology. At the same time, high-accuracy visibility forecasts can significantly reduce, or even help avoid, weather-related risk in aviation. To improve visibility forecasting, this research links fuzzy logic-based analogue forecasting and post-processed numerical weather prediction model outputs in a hybrid forecast. The performance of the analogue forecasting model was improved by applying the Analytic Hierarchy Process. A linear combination of the two outputs was then used to create an ultra-short-term hybrid visibility prediction which gradually shifts the focus from statistical to numerical products, taking advantage of each during the forecast period. This gives the opportunity to bring the numerical visibility forecast closer to the observations even if it is initially wrong. A complete verification of the categorical forecasts was carried out; results are also available for persistence and terminal aerodrome forecasts (TAF) for comparison. The average Heidke Skill Score (HSS) over the examined airports is very similar for the analogue and hybrid forecasts, even at the end of the forecast period, where the weight of the analogue prediction in the final hybrid output is only 0.1-0.2. However, in the case of poor visibility (1000-2500 m), the hybrid (0.65) and analogue (0.64) forecasts have a similar average HSS in the first 6 h of the forecast period, and perform better than persistence (0.60) or TAF (0.56). An important achievement is that the hybrid model takes into consideration the physics and dynamics of the atmosphere through the increasing weight of the numerical weather prediction. In spite of this, its performance is similar to the most effective visibility forecasting methods and does not follow the poor verification results of purely numerical outputs.
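The Heidke Skill Score used in the verification above measures categorical forecast accuracy relative to random chance. For a 2 × 2 contingency table with a hits, b false alarms, c misses and d correct negatives, a direct implementation is:

```python
def heidke_skill_score(a, b, c, d):
    """HSS = 2(ad - bc) / [(a + c)(c + d) + (a + b)(b + d)].
    1 = perfect forecast, 0 = no skill relative to chance."""
    denom = (a + c) * (c + d) + (a + b) * (b + d)
    return 2.0 * (a * d - b * c) / denom
```

A perfect forecast (all counts on the diagonal) scores exactly 1, while random agreement with the observed frequencies scores 0, which is why HSS is preferred over raw hit rate for rare events such as poor visibility.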
NASA Astrophysics Data System (ADS)
Vislocky, Robert L.; Fritsch, J. Michael
1997-12-01
A prototype advanced model output statistics (MOS) forecast system that was entered in the 1996-97 National Collegiate Weather Forecast Contest is described and its performance compared to that of widely available objective guidance and to contest participants. The prototype system uses an optimal blend of aviation (AVN) and nested grid model (NGM) MOS forecasts, explicit output from the NGM and Eta guidance, and the latest surface weather observations from the forecast site. The forecasts are totally objective and can be generated quickly on a personal computer. Other "objective" forms of guidance tracked in the contest are 1) the consensus forecast (i.e., the average of the forecasts from all of the human participants), 2) the combination of NGM raw output (for precipitation forecasts) and NGM MOS guidance (for temperature forecasts), and 3) the combination of Eta Model raw output (for precipitation forecasts) and AVN MOS guidance (for temperature forecasts).Results show that the advanced MOS system finished in 20th place out of 737 original entrants, or better than approximately 97% of the human forecasters who entered the contest. Moreover, the advanced MOS system was slightly better than consensus (23d place). The fact that an objective forecast system finished ahead of consensus is a significant accomplishment since consensus is traditionally a very formidable "opponent" in forecast competitions. Equally significant is that the advanced MOS system was superior to the traditional guidance products available from the National Centers for Environmental Prediction (NCEP). Specifically, the combination of NGM raw output and NGM MOS guidance finished in 175th place, and the combination of Eta Model raw output and AVN MOS guidance finished in 266th place. 
The latter result is most intriguing since the proposed elimination of all NGM products would likely result in a serious degradation of objective products disseminated by NCEP, unless they are replaced with equal or better substitutes. On the other hand, the positive performance of the prototype advanced MOS system shows that it is possible to create a single objective product that is not only superior to currently available objective guidance products, but is also on par with some of the better human forecasters.
Novel approach for streamflow forecasting using a hybrid ANFIS-FFA model
NASA Astrophysics Data System (ADS)
Yaseen, Zaher Mundher; Ebtehaj, Isa; Bonakdari, Hossein; Deo, Ravinesh C.; Danandeh Mehr, Ali; Mohtar, Wan Hanna Melini Wan; Diop, Lamine; El-shafie, Ahmed; Singh, Vijay P.
2017-11-01
The present study proposes a new hybrid evolutionary Adaptive Neuro-Fuzzy Inference Systems (ANFIS) approach for monthly streamflow forecasting. The proposed method is a novel combination of the ANFIS model with the firefly algorithm as an optimizer tool to construct a hybrid ANFIS-FFA model. The results of the ANFIS-FFA model are compared with the classical ANFIS model, which utilizes the fuzzy c-means (FCM) clustering method in the Fuzzy Inference Systems (FIS) generation. The historical monthly streamflow data for the Pahang River, a major river system in Malaysia that is characterized by highly stochastic hydrological patterns, are used in the study. Sixteen different input combinations with one to five time-lagged input variables are incorporated into the ANFIS-FFA and ANFIS models to consider the antecedent seasonal variations in historical streamflow data. The mean absolute error (MAE), root mean square error (RMSE) and correlation coefficient (r) are used to evaluate the forecasting performance of the ANFIS-FFA model. In conjunction with these metrics, the refined Willmott's Index (Drefined), Nash-Sutcliffe coefficient (ENS) and Legates and McCabe's Index (ELM) are also utilized as normalized goodness-of-fit metrics. Comparison of the results reveals that the FFA is able to improve the forecasting accuracy of the hybrid ANFIS-FFA model (r = 1; RMSE = 0.984; MAE = 0.364; ENS = 1; ELM = 0.988; Drefined = 0.994) applied to monthly streamflow forecasting in comparison with the traditional ANFIS model (r = 0.998; RMSE = 3.276; MAE = 1.553; ENS = 0.995; ELM = 0.950; Drefined = 0.975). The results also show that the ANFIS-FFA is not only superior to the ANFIS model but also exhibits a parsimonious modelling framework for streamflow forecasting, requiring a smaller number of input variables to yield comparatively better performance.
It is concluded that the FFA optimizer can thus surpass the accuracy of the traditional ANFIS model in general, and is able to correct the inaccurately forecasted values produced by the ANFIS model for extremely low flows. The present results have wider implications not only for streamflow forecasting, but also for other hydro-meteorological forecasting variables requiring only historical input data, attaining a greater level of predictive accuracy through the incorporation of the FFA algorithm as an optimization tool in an ANFIS model.
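The goodness-of-fit metrics quoted above have simple definitions. A sketch covering MAE, RMSE, the Nash-Sutcliffe coefficient ENS and the Legates-McCabe index ELM (the correlation coefficient and refined Willmott index are omitted for brevity):

```python
import math

def metrics(obs, pred):
    """MAE, RMSE, Nash-Sutcliffe ENS and Legates-McCabe ELM
    for paired observed/predicted series."""
    n = len(obs)
    mean_o = sum(obs) / n
    mae = sum(abs(o - p) for o, p in zip(obs, pred)) / n
    rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / n)
    ens = 1.0 - (sum((o - p) ** 2 for o, p in zip(obs, pred))
                 / sum((o - mean_o) ** 2 for o in obs))
    elm = 1.0 - (sum(abs(o - p) for o, p in zip(obs, pred))
                 / sum(abs(o - mean_o) for o in obs))
    return {"MAE": mae, "RMSE": rmse, "ENS": ens, "ELM": elm}
```

Both ENS and ELM equal 1 for a perfect forecast and fall below 0 when the model is worse than simply predicting the observed mean; ELM penalizes large errors less severely because it uses absolute rather than squared differences.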
A Four-Stage Hybrid Model for Hydrological Time Series Forecasting
Di, Chongli; Yang, Xiaohua; Wang, Xiaochao
2014-01-01
Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of ‘denoising, decomposition and ensemble’. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noises in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulties of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models. PMID:25111782
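The final ensemble stage above, a linear neural network combining component forecasts, amounts to finding linear weights that minimize squared error against observations. For two component forecasts the least-squares weights have a closed form via the 2 × 2 normal equations; a sketch of that calculation (not the authors' LNN implementation):

```python
def ls_weights(f1, f2, y):
    """Solve min_w ||y - w1*f1 - w2*f2||^2 via the normal equations
    [[<f1,f1>, <f1,f2>], [<f1,f2>, <f2,f2>]] w = [<f1,y>, <f2,y>]."""
    a = sum(x * x for x in f1)
    b = sum(x * z for x, z in zip(f1, f2))
    c = sum(x * x for x in f2)
    p = sum(x * z for x, z in zip(f1, y))
    q = sum(x * z for x, z in zip(f2, y))
    det = a * c - b * b
    return (p * c - q * b) / det, (q * a - p * b) / det

# If y is exactly 0.3*f1 + 0.7*f2, the solver recovers those weights.
f1 = [1.0, 2.0, 3.0, 4.0]
f2 = [2.0, 1.0, 4.0, 3.0]
y = [0.3 * u + 0.7 * v for u, v in zip(f1, f2)]
w1, w2 = ls_weights(f1, f2, y)
```

With more than two components the same normal equations are solved as a general linear system, which is exactly what training a linear neural network by least squares does.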
Near-field hazard assessment of March 11, 2011 Japan Tsunami sources inferred from different methods
Wei, Y.; Titov, V.V.; Newman, A.; Hayes, G.; Tang, L.; Chamberlin, C.
2011-01-01
The tsunami source is the origin of the subsequent transoceanic water waves, and thus the most critical component in modern tsunami forecast methodology. Although impractical to quantify directly, a tsunami source can be estimated by different methods based on a variety of measurements provided by deep-ocean tsunameters, seismometers, GPS, and other advanced instruments, some in real time, some in post-real time. Here we assess these different sources of the devastating March 11, 2011 Japan tsunami by model-data comparison for generation, propagation and inundation in the near field of Japan. This comparative study furthers understanding of the advantages and shortcomings of the different methods that may potentially be used in real-time warning and forecasting of tsunami hazards, especially in the near field. The model study also highlights the critical role of deep-ocean tsunami measurements for high-quality tsunami forecasts; their combination with land GPS measurements may lead to a better understanding of both earthquake mechanisms and the tsunami generation process. © 2011 MTS.
Wang, K W; Deng, C; Li, J P; Zhang, Y Y; Li, X Y; Wu, M C
2017-04-01
Tuberculosis (TB) affects people globally and is being reconsidered as a serious public health problem in China. Reliable forecasting is useful for the prevention and control of TB. This study proposes a hybrid model combining an autoregressive integrated moving average (ARIMA) model with a nonlinear autoregressive (NAR) neural network for forecasting the incidence of TB from January 2007 to March 2016. Prediction performance was compared between the hybrid model and the ARIMA model. The best-fit hybrid model combined an ARIMA (3,1,0) × (0,1,1)12 model with a NAR neural network with four delays and 12 neurons in the hidden layer. The ARIMA-NAR hybrid model, which exhibited lower mean square error, mean absolute error, and mean absolute percentage error (0.2209, 0.1373, and 0.0406, respectively) in the modelling performance, produced more accurate forecasts of TB incidence than the ARIMA model. This study shows that developing and applying the ARIMA-NAR hybrid model is an effective method to fit the linear and nonlinear patterns of time-series data, and this model could be helpful in the prevention and control of TB.
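The ARIMA-NAR hybrid follows the classic scheme of fitting a linear model first and then modelling its residuals nonlinearly, the two forecasts being summed. A toy sketch of that structure, with a closed-form AR(1) fit as the linear part and a one-lag nearest-neighbour rule as an invented stand-in for the NAR network:

```python
def ar1_coef(x):
    """Least-squares AR(1) coefficient: phi = sum x_t*x_{t-1} / sum x_{t-1}^2."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(v * v for v in x[:-1])
    return num / den

def hybrid_next(x):
    """Linear AR(1) forecast plus a nearest-neighbour residual correction."""
    phi = ar1_coef(x)
    resid = [x[t] - phi * x[t - 1] for t in range(1, len(x))]
    # Nonlinear part: find the past residual closest to the latest one
    # and reuse its successor as the correction term.
    last = resid[-1]
    j = min(range(len(resid) - 1), key=lambda i: abs(resid[i] - last))
    return phi * x[-1] + resid[j + 1]

# A purely linear series (doubles each step) leaves zero residuals,
# so the hybrid reduces to the linear forecast.
nxt = hybrid_next([1.0, 2.0, 4.0, 8.0, 16.0])
```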
Multi-step-ahead crude oil price forecasting using a hybrid grey wave model
NASA Astrophysics Data System (ADS)
Chen, Yanhui; Zhang, Chuan; He, Kaijian; Zheng, Aibing
2018-07-01
Crude oil is crucial to the operation and economic well-being of modern society, and large swings in its price can cause panic in the global economy. Many factors influence the crude oil price, and its prediction remains a difficult research problem widely discussed among researchers. Based on research on the Heterogeneous Market Hypothesis and on the relationships between the crude oil price and macroeconomic factors, the exchange market and the stock market, this paper proposes a hybrid grey wave forecasting model, combined with Random Walk (RW)/ARMA, to forecast the crude oil price multiple steps ahead. More specifically, we use the grey wave forecasting model to capture the periodical characteristics of the crude oil price and ARMA/RW to simulate the daily random movements. The innovation also comes from using the information in the time-series graph to forecast the crude oil price, since grey wave forecasting is a graphical prediction method. The empirical results demonstrate that, based on daily crude oil price data, the hybrid grey wave forecasting model performs well in 15- to 20-step-ahead prediction and consistently dominates ARMA and Random Walk in correct direction prediction.
Forecasting the value-at-risk of Chinese stock market using the HARQ model and extreme value theory
NASA Astrophysics Data System (ADS)
Liu, Guangqiang; Wei, Yu; Chen, Yongfei; Yu, Jiang; Hu, Yang
2018-06-01
Using intraday data of the CSI300 index, this paper discusses value-at-risk (VaR) forecasting of the Chinese stock market from the perspective of high-frequency volatility models. First, we measure the realized volatility (RV) with 5-minute high-frequency returns of the CSI300 index and then model it with the newly introduced heterogeneous autoregressive quarticity (HARQ) model, which can handle the time-varying coefficients of the HAR model. Second, we forecast the out-of-sample VaR of the CSI300 index by combining the HARQ model and extreme value theory (EVT). Finally, using several popular backtesting methods, we compare the VaR forecasting accuracy of the HARQ model with that of traditional HAR-type models, such as HAR, HAR-J, CHAR, and SHAR. The empirical results show that the novel HARQ model can beat the other HAR-type models in forecasting the VaR of the Chinese stock market at various risk levels.
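The realized-variance input to the HAR family is simply the sum of squared intraday returns over the day, and a much simpler historical-simulation quantile can stand in for the HARQ-EVT machinery to show how a VaR number is extracted from a return distribution. A sketch with illustrative data; this is not the paper's model:

```python
def realized_variance(intraday_returns):
    """Daily realized variance: sum of squared (e.g. 5-minute) returns."""
    return sum(r * r for r in intraday_returns)

def historical_var(returns, alpha=0.05):
    """Historical-simulation VaR: the loss at the alpha quantile of the
    empirical return distribution, reported as a positive number."""
    s = sorted(returns)
    idx = max(0, int(alpha * len(s)) - 1)
    return -s[idx]

rv = realized_variance([0.01, -0.02, 0.01])
var95 = historical_var([-0.05] + [0.01] * 19, alpha=0.05)
```

EVT-based VaR replaces the crude empirical quantile with a tail model fitted to extreme losses, which matters precisely where historical simulation has too few observations.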
Appraisal of artificial neural network for forecasting of economic parameters
NASA Astrophysics Data System (ADS)
Kordanuli, Bojana; Barjaktarović, Lidija; Jeremić, Ljiljana; Alizamir, Meysam
2017-01-01
The main aim of this research is to develop and apply artificial neural networks (ANN) with the extreme learning machine (ELM) and back propagation (BP) to forecast gross domestic product (GDP) and the Hirschman-Herfindahl Index (HHI). GDP can be modelled as a combination of different factors; in this investigation, GDP forecasting based on the agriculture and industry value added in GDP was analysed separately. Other inputs are the final consumption expenditure of general government, gross fixed capital formation (investments) and the fertility rate. The relation between product market competition and corporate investment is contentious: on one hand the relation can be positive, but on the other it can be negative. Several methods have been proposed to monitor market power for the purpose of developing procedures to mitigate or eliminate its effects; the most widely used are based on indices such as the HHI. The reliability of the ANN models was assessed based on simulation results and several statistical indicators. The simulation results showed that ELM performs better than the BP learning algorithm in GDP and HHI forecasting applications.
Cohen, Justin M; Singh, Inder; O'Brien, Megan E
2008-01-01
Background An accurate forecast of global demand is essential to stabilize the market for artemisinin-based combination therapy (ACT) and to ensure access to high-quality, life-saving medications at the lowest sustainable prices by avoiding underproduction and excessive overproduction, each of which can have negative consequences for the availability of affordable drugs. A robust forecast requires an understanding of the resources available to support procurement of these relatively expensive antimalarials, in particular from the Global Fund, at present the single largest source of ACT funding. Methods Predictive regression models estimating the timing and rate of disbursements from the Global Fund to recipient countries for each malaria grant were derived using a repeated split-sample procedure intended to avoid over-fitting. Predictions were compared against actual disbursements in a group of validation grants, and forecasts of ACT procurement extrapolated from disbursement predictions were evaluated against actual procurement in two sub-Saharan countries. Results Quarterly forecasts were correlated highly with actual smoothed disbursement rates (r = 0.987, p < 0.0001). Additionally, predicted ACT procurement, extrapolated from forecasted disbursements, was correlated strongly with actual ACT procurement supported by two grants from the Global Fund's first (r = 0.945, p < 0.0001) and fourth (r = 0.938, p < 0.0001) funding rounds. Conclusion This analysis derived predictive regression models that successfully forecasted disbursement patterning for individual Global Fund malaria grants. These results indicate the utility of this approach for demand forecasting of ACT and, potentially, for other commodities procured using funding from the Global Fund. Further validation using data from other countries in different regions and environments will be necessary to confirm its generalizability. PMID:18831742
Tighe, Patrick J.; Harle, Christopher A.; Hurley, Robert W.; Aytug, Haldun; Boezaart, Andre P.; Fillingim, Roger B.
2015-01-01
Background Given their ability to process highly dimensional datasets with hundreds of variables, machine learning algorithms may offer one solution to the vexing challenge of predicting postoperative pain. Methods Here, we report on the application of machine learning algorithms to predict postoperative pain outcomes in a retrospective cohort of 8071 surgical patients using 796 clinical variables. Five algorithms were compared in terms of their ability to forecast moderate to severe postoperative pain: Least Absolute Shrinkage and Selection Operator (LASSO), gradient-boosted decision tree, support vector machine, neural network, and k-nearest neighbor, with logistic regression included for baseline comparison. Results In forecasting moderate to severe postoperative pain for postoperative day (POD) 1, the LASSO algorithm, using all 796 variables, had the highest accuracy with an area under the receiver-operating curve (ROC) of 0.704. Next, the gradient-boosted decision tree had an ROC of 0.665 and the k-nearest neighbor algorithm had an ROC of 0.643. For POD 3, the LASSO algorithm, using all variables, again had the highest accuracy, with an ROC of 0.727. Logistic regression had a lower ROC of 0.5 for predicting pain outcomes on POD 1 and 3. Conclusions Machine learning algorithms, when combined with complex and heterogeneous data from electronic medical record systems, can forecast acute postoperative pain outcomes with accuracies similar to methods that rely only on variables specifically collected for pain outcome prediction. PMID:26031220
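The area under the ROC curve reported above equals the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative case, which permits a direct pairwise implementation without explicit thresholding. A minimal sketch (O(n²), fine for illustration):

```python
def auc(scores_pos, scores_neg):
    """AUC as the fraction of positive/negative pairs ranked correctly,
    with ties counted as half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUC of 0.704 therefore means the LASSO model ranks a randomly drawn high-pain patient above a randomly drawn low-pain patient about 70% of the time; 0.5 is chance level.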
Short-term wind speed prediction based on the wavelet transformation and Adaboost neural network
NASA Astrophysics Data System (ADS)
Hai, Zhou; Xiang, Zhu; Haijian, Shao; Ji, Wu
2018-03-01
The operation of the power grid is inevitably affected by the increasing scale of wind farms, owing to the inherent randomness and uncertainty of wind, so accurate wind speed forecasting is critical for stable grid operation. Traditional forecasting methods typically do not take into account the frequency characteristics of wind speed and therefore cannot reflect the nature of changes in the wind speed signal, a consequence of the limited generalization ability of the model structure. An AdaBoost neural network in combination with a multi-resolution, multi-scale decomposition of wind speed is proposed for the model structure in order to improve forecasting accuracy and generalization ability. An experimental evaluation using data from a real wind farm in Jiangsu province demonstrates that the proposed strategy improves the robustness and accuracy of the forecasts.
NASA Astrophysics Data System (ADS)
Choi, Yonghan; Cha, Dong-Hyun; Lee, Myong-In; Kim, Joowan; Jin, Chun-Sil; Park, Sang-Hun; Joh, Min-Su
2017-06-01
A total of three binary tropical cyclone (TC) cases over the Western North Pacific are selected to investigate the effects of satellite radiance data assimilation on analyses and forecasts of binary TCs. Two parallel cycling experiments with a 6 h interval are performed for each binary TC case, and the difference between the two experiments is whether satellite radiance observations are assimilated. Satellite radiance observations are assimilated using the Weather Research and Forecasting Data Assimilation (WRFDA)'s three-dimensional variational (3D-Var) system, which includes the observation operator, quality control procedures, and bias correction algorithm for radiance observations. On average, radiance assimilation results in slight improvements of environmental fields and track forecasts of binary TC cases, but the detailed effects vary with the case. When there is no direct interaction between binary TCs, radiance assimilation leads to better depictions of environmental fields, and finally it results in improved track forecasts. However, positive effects of radiance assimilation on track forecasts can be reduced when there exists a direct interaction between binary TCs and intensities/structures of binary TCs are not represented well. An initialization method (e.g., dynamic initialization) combined with radiance assimilation and/or more advanced DA techniques (e.g., hybrid method) can be considered to overcome these limitations.
NASA Astrophysics Data System (ADS)
Yi, J.; Choi, C.
2014-12-01
Rainfall observation and forecasting using remote sensing, such as RADAR (Radio Detection and Ranging) and satellite images, are widely used to delineate the increasing damage caused by rapid weather changes such as regional storms and flash floods. Flood runoff was calculated using an adaptive neuro-fuzzy inference system (a data-driven model) with MAPLE (McGill Algorithm for Precipitation Nowcasting by Lagrangian Extrapolation) forecasted precipitation data as the input variables. The flood estimation results obtained with the neuro-fuzzy technique and RADAR-forecasted precipitation data were evaluated by comparison with observed data. The adaptive neuro-fuzzy method was applied to the Chungju Reservoir basin in Korea, using six rainfall events during the flood seasons of 2010 and 2011 as input data. Reservoir inflow estimates were compared according to the rainfall data used for training, checking and testing in the model setup process, and the results of 15 models with different combinations of input variables were compared and analysed. Using a relatively larger clustering radius and the largest observed flood as training data gave better flood estimates in this study. The model using the MAPLE forecasted precipitation data showed better inflow estimation for the Chungju Reservoir.
On the usage of divergence nudging in the DMI nowcasting system
NASA Astrophysics Data System (ADS)
Korsholm, Ulrik; Petersen, Claus; Hansen Sass, Bent; Woetmann Nielsen, Niels; Getreuer Jensen, David; Olsen, Bjarke Tobias; Vedel, Henrik
2014-05-01
DMI has recently proposed a new method for nudging radar reflectivity CAPPI products into their operational nowcasting system. The system is based on rapid update cycles (with hourly frequency) with the High Resolution Limited Area Model combined with surface and upper air analysis at each initial time. During the first 1.5 hours of a simulation the model dynamical state is nudged in accordance with the CAPPI product, after which a free forecast is produced with a forecast length of 12 hours. The nudging method is based on the assumption that precipitation is forced by low level moisture convergence and that an enhanced moisture source will lead to convective triggering of the model cloud scheme. If the model under-predicts precipitation before cut-off, horizontal low level divergence is nudged towards an estimated value. These pseudo observations are calculated from the CAPPI product by assuming a specific vertical profile of the change in the divergence field. The strength of the nudging is proportional to the difference between observed and modelled precipitation. When over-predicting, the low level moisture source is reduced, and in-cloud moisture is nudged towards environmental values. Results have been analysed in terms of the fractions skill score, and the ability of the nudging method to position the precipitation cells correctly is discussed. The ability of the model to retain memory of the precipitation systems in the free forecast has also been investigated, and examples of combining the nudging method with extrapolated reflectivity fields are shown.
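As an illustration of the proportional nudging rule described above, the sketch below applies a divergence increment driven by the precipitation mismatch. The coefficient `alpha`, the field shapes, and the vertical `profile` are assumptions for the example, not the DMI implementation:

```python
import numpy as np

def divergence_nudging_increment(div_model, precip_obs, precip_model,
                                 profile, alpha=1e-6):
    """Nudge low-level horizontal divergence towards pseudo-observations.

    The increment is proportional to the precipitation mismatch
    (observed minus modelled) and distributed in the vertical with an
    assumed profile; where the model under-predicts, divergence is
    reduced, i.e. low-level moisture convergence is enhanced.
    """
    mismatch = precip_obs - precip_model              # 2-D (ny, nx) field
    # broadcast: (nz, 1, 1) profile times (1, ny, nx) mismatch
    increment = -alpha * profile[:, None, None] * mismatch[None, :, :]
    return div_model + increment
```

The sign convention mirrors the abstract: an under-prediction (positive mismatch) yields negative divergence (convergence) at levels where the profile is non-zero.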
Data Quality Assessment Methods for the Eastern Range 915 MHz Wind Profiler Network
NASA Technical Reports Server (NTRS)
Lambert, Winifred C.; Taylor, Gregory E.
1998-01-01
The Eastern Range installed a network of five 915 MHz Doppler Radar Wind Profilers with Radio Acoustic Sounding Systems in the Cape Canaveral Air Station/Kennedy Space Center area to provide three-dimensional wind speed and direction and virtual temperature estimates in the boundary layer. The Applied Meteorology Unit, staffed by ENSCO, Inc., was tasked by the 45th Weather Squadron, the Spaceflight Meteorology Group, and the National Weather Service in Melbourne, Florida to investigate methods which will help forecasters assess profiler network data quality when developing forecasts and warnings for critical ground, launch and landing operations. Four routines were evaluated in this study: a consensus time period check, a precipitation contamination check, a median filter, and the Weber-Wuertz (WW) algorithm. No routine was able to effectively flag suspect data when used by itself. Therefore, the routines were used in different combinations. An evaluation of all possible combinations revealed two that provided the best results. The precipitation contamination and consensus time routines were used in both combinations. The median filter or WW was used as the final routine in the combinations to flag all other suspect data points.
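A median filter of the kind used as one QC routine above can be sketched as a running-median outlier check. The window length and the MAD-based threshold are illustrative choices, not the AMU's actual settings:

```python
import numpy as np

def median_filter_flags(series, window=5, n_mad=4.0):
    """Flag suspect samples whose value departs from a running median
    by more than n_mad robust standard deviations (1.4826 * MAD)."""
    series = np.asarray(series, dtype=float)
    half = window // 2
    flags = np.zeros(series.size, dtype=bool)
    for i in range(series.size):
        lo, hi = max(0, i - half), min(series.size, i + half + 1)
        win = series[lo:hi]
        med = np.median(win)
        mad = np.median(np.abs(win - med)) or 1e-6   # guard against zero MAD
        if abs(series[i] - med) > n_mad * 1.4826 * mad:
            flags[i] = True
    return flags
```

A single spike in an otherwise smooth wind time series is flagged while its neighbours pass, which is the behaviour a forecaster wants from a point-outlier screen.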
ERIC Educational Resources Information Center
Smith, Curtis A.
"EnrollForecast for Excel" will generate a 5-year forecast of K-12 student enrollment. It will also work for any combination of grades between kindergarten and twelth. The forecasts can be printed as either a table or a graph. The user must provide birth history (only if forecasting kindergarten) and enrollment history information. The user also…
NASA Astrophysics Data System (ADS)
Yoon, S.; Lee, B.; Nakakita, E.; Lee, G.
2016-12-01
Recent climate change and abnormal weather phenomena have resulted in increased occurrences of localized torrential rainfall. Urban areas in Korea have suffered from localized heavy rainfall, including the notable Seoul flood disasters of 2010 and 2011. The urban hydrological environment has changed in relation to precipitation, with reduced concentration time, a decreased storage rate, and increased peak discharge. These changes have altered and accelerated the severity of damage to urban areas. In order to prevent such urban flash flood damage, we must secure sufficient lead time for evacuation by improving radar-based quantitative precipitation forecasting (QPF). The purpose of this research is to improve QPF products using a spatial-scale decomposition method that considers the lifetime of the storm, and to assess the accuracy of the traditional QPF method and the proposed method in terms of urban flood management. The layout of this research is as follows. First, image filtering is applied to separate the spatial scales of the rainfall field. Second, the separated small- and large-scale rainfall fields are extrapolated by different forecasting methods. Third, the forecasted rainfall fields are combined at each lead time. Finally, the results of this method are evaluated and compared with the results of a uniform advection model for urban flood modeling. It is expected that urban flood information based on the improved QPF will help reduce the casualties and property damage caused by urban flooding.
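The scale-separation step can be sketched with a separable Gaussian low-pass filter: the smoothed field is the large-scale component and the residual is the small-scale component. The Gaussian kernel is an assumed choice here; the paper's exact image filter is not specified in the abstract:

```python
import numpy as np

def gaussian_kernel(sigma):
    """Normalized 1-D Gaussian kernel truncated at 3 sigma."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def separate_scales(rain, sigma=3.0):
    """Split a 2-D rainfall field into a smooth large-scale component
    (Gaussian low-pass applied separably along rows then columns) and
    a small-scale residual, so each part can be extrapolated by a
    different forecasting method and recombined at each lead time."""
    k = gaussian_kernel(sigma)
    smooth = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, rain)
    smooth = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, smooth)
    return smooth, rain - smooth
```

By construction the two components sum exactly back to the original field, so recombining the separately extrapolated fields is well defined.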
QPF verification using different radar-based analyses: a case study
NASA Astrophysics Data System (ADS)
Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.
2009-09-01
Verification of QPF in NWP models has always been challenging, not only in knowing which scores best quantify a particular skill of a model but also in choosing the most appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can provide conclusions that are not in agreement with those obtained by the "eyeball" method. Consequently, QPF can provide valuable information to forecasters in spite of having poor scores. On the other hand, there are difficulties in knowing the "truth", so different results can be achieved depending on the procedures used to obtain the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain a better knowledge of the skills of a forecasting system. In particular, a short range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied in a local convective precipitation event that took place in the NE Iberian Peninsula on 3 October 2008. For this purpose, a variety of verification methods (dichotomous, recalibration and object oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.
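The dichotomous verification mentioned above reduces forecast/observation pairs to a yes/no contingency table at a precipitation threshold. A minimal sketch of the standard scores (probability of detection, false alarm ratio, critical success index):

```python
def dichotomous_scores(forecast, observed, threshold=1.0):
    """POD, FAR and CSI from a yes/no contingency table.

    An event is 'yes' when the value meets or exceeds the threshold.
    Undefined ratios (empty denominators) are returned as NaN.
    """
    hits = misses = false_alarms = 0
    for f, o in zip(forecast, observed):
        fy, oy = f >= threshold, o >= threshold
        if fy and oy:
            hits += 1
        elif oy:
            misses += 1
        elif fy:
            false_alarms += 1
    pod = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
    csi = hits / (hits + misses + false_alarms) if hits + misses + false_alarms else float("nan")
    return pod, far, csi
```

Because the same table can be built against any of the radar-based analyses, the sensitivity of the scores to the choice of "truth" is easy to explore.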
Evaluation of annual, global seismicity forecasts, including ensemble models
NASA Astrophysics Data System (ADS)
Taroni, Matteo; Zechar, Jeremy; Marzocchi, Warner
2013-04-01
In 2009, the Collaboratory for the Study of Earthquake Predictability (CSEP) initiated a prototype global earthquake forecast experiment. Three models participated in this experiment for 2009, 2010 and 2011; each model forecast the number of earthquakes above magnitude 6 in 1x1 degree cells that span the globe. Here we use likelihood-based metrics to evaluate the consistency of the forecasts with the observed seismicity. We compare model performance with statistical tests and a new method based on the peer-to-peer gambling score. The results of the comparisons are used to build ensemble models that are a weighted combination of the individual models. Notably, in these experiments the ensemble model always performs significantly better than the single best-performing model. Our results indicate the following: i) time-varying forecasts, if not updated after each major shock, may not provide significant advantages with respect to time-invariant models in 1-year forecast experiments; ii) the spatial distribution seems to be the most important feature for characterizing the different forecasting performances of the models; iii) the interpretation of consistency tests may be misleading because some good models may be rejected while trivial models pass; iv) proper ensemble modeling seems to be a valuable procedure for obtaining the best-performing model for practical purposes.
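The likelihood-based evaluation and the weighted combination above can be sketched with a Poisson likelihood per cell and likelihood-ratio weights. The weighting rule (exponentiated log-likelihood differences) is one simple choice, not necessarily the scheme of the paper:

```python
import numpy as np

def poisson_loglik(rates, counts):
    """Joint Poisson log-likelihood of observed earthquake counts given
    forecast rates per cell (the log-factorial term is omitted: it is
    common to all models and cancels in comparisons)."""
    rates = np.maximum(rates, 1e-12)           # guard against zero rates
    return float(np.sum(counts * np.log(rates) - rates))

def likelihood_weights(model_rates, counts):
    """Ensemble weights proportional to exp(loglik - max loglik),
    so the best-fitting model dominates and weights sum to one."""
    ll = np.array([poisson_loglik(r, counts) for r in model_rates])
    w = np.exp(ll - ll.max())
    return w / w.sum()
```

The ensemble rate field is then the weighted sum of the individual rate fields, re-evaluated as each year of observations arrives.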
Plazas-Nossa, Leonardo; Torres, Andrés
2014-01-01
The objective of this work is to introduce a forecasting method for UV-Vis spectrometry time series that combines principal component analysis (PCA) and discrete Fourier transform (DFT), and to compare the results obtained with those obtained by using DFT. Three time series for three different study sites were used: (i) Salitre wastewater treatment plant (WWTP) in Bogotá; (ii) Gibraltar pumping station in Bogotá; and (iii) San Fernando WWTP in Itagüí (in the south part of Medellín). Each of these time series had an equal number of samples (1051). In general terms, the results obtained are hardly generalizable, as they seem to be highly dependent on specific water system dynamics; however, some trends can be outlined: (i) for UV range, DFT and PCA/DFT forecasting accuracy were almost the same; (ii) for visible range, the PCA/DFT forecasting procedure proposed gives systematically lower forecasting errors and variability than those obtained with the DFT procedure; and (iii) for short forecasting times the PCA/DFT procedure proposed is more suitable than the DFT procedure, according to processing times obtained.
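The DFT building block of the PCA/DFT scheme above can be sketched as harmonic extrapolation: keep the strongest Fourier harmonics of a (PCA-reduced) score series and evaluate them beyond the observed record. The number of retained harmonics is an assumed tuning choice:

```python
import numpy as np

def dft_extrapolate(x, n_ahead, n_harm=4):
    """Extend a series by keeping its n_harm largest-amplitude Fourier
    harmonics (plus the mean) and evaluating them past the data."""
    x = np.asarray(x, dtype=float)
    n = x.size
    t = np.arange(n + n_ahead)
    coeffs = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(n)                     # cycles per sample
    # indices of the strongest non-constant harmonics
    order = np.argsort(np.abs(coeffs[1:]))[::-1][:n_harm] + 1
    out = np.full(t.size, coeffs[0].real / n)      # series mean
    for k in order:
        a = coeffs[k]
        out += (2 / n) * np.abs(a) * np.cos(2 * np.pi * freqs[k] * t + np.angle(a))
    return out[n:]
```

In the full scheme each retained principal-component score series would be extrapolated this way and the forecast spectra reconstructed from the PCA basis.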
Air Pollution Forecasts: An Overview
Bai, Lu; Wang, Jianzhou; Lu, Haiyan
2018-01-01
Air pollution is defined as a phenomenon harmful to the ecological system and the normal conditions of human existence and development when some substances in the atmosphere exceed a certain concentration. In the face of increasingly serious environmental pollution problems, scholars have conducted a significant quantity of related research, and in those studies, the forecasting of air pollution has been of paramount importance. As a precaution, the air pollution forecast is the basis for taking effective pollution control measures, and accurate forecasting of air pollution has become an important task. Extensive research indicates that the methods of air pollution forecasting can be broadly divided into three classical categories: statistical forecasting methods, artificial intelligence methods, and numerical forecasting methods. More recently, some hybrid models have been proposed, which can improve the forecast accuracy. To provide a clear perspective on air pollution forecasting, this study reviews the theory and application of those forecasting models. In addition, based on a comparison of different forecasting methods, the advantages and disadvantages of some methods of forecasting are also provided. This study aims to provide an overview of air pollution forecasting methods for easy access and reference by researchers, which will be helpful in further studies. PMID:29673227
Research on Nonlinear Time Series Forecasting of Time-Delay NN Embedded with Bayesian Regularization
NASA Astrophysics Data System (ADS)
Jiang, Weijin; Xu, Yusheng; Xu, Yuhui; Wang, Jianmin
Based on the idea of nonlinear prediction via phase-space reconstruction, this paper presents a time-delay BP neural network model whose generalization capability is improved by Bayesian regularization. The model is applied to forecast the import and export trades of one industry. The results showed that the improved model has excellent generalization capability: it not only learned the historical curve but also efficiently predicted the business trend. Comparing with common forecast evaluation, we conclude that nonlinear forecasting can not only focus on data combination and precision improvement, but can also vividly reflect the nonlinear characteristics of the forecasting system. In analyzing the forecasting precision of the model, we provide a model judgment by calculating the nonlinear characteristic values of the combined series and the original series, showing that the forecasting model can reasonably capture the dynamic characteristics of the nonlinear system that produced the original series.
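The phase-space reconstruction feeding the time-delay network can be sketched as delay embedding: each input pattern is a window of lagged values and the target is the next value. Embedding dimension and delay are assumed hyperparameters:

```python
import numpy as np

def delay_embed(series, dim, tau=1):
    """Phase-space reconstruction by time-delay embedding.

    Each row of X is a delay vector [x(t-(dim-1)*tau), ..., x(t)] in
    time order, and y holds the next value x(t+1), giving the
    input/target pairs a time-delay neural network trains on.
    """
    series = np.asarray(series, dtype=float)
    n = series.size - (dim - 1) * tau - 1   # rows that have a target
    X = np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])
    y = series[(dim - 1) * tau + 1 : (dim - 1) * tau + 1 + n]
    return X, y
```

Bayesian regularization then acts on the network trained on these (X, y) pairs, penalizing weight magnitudes to improve generalization.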
Lu, Chi-Jie; Chang, Chi-Chang
2014-01-01
Sales forecasting plays an important role in operating a business since it can be used to determine the required inventory level to meet consumer demand and avoid the problem of under/overstocking. Improving the accuracy of sales forecasting has thus become an important issue in operating a business. This study proposes a hybrid sales forecasting scheme combining independent component analysis (ICA) with K-means clustering and support vector regression (SVR). The proposed scheme first uses ICA to extract hidden information from the observed sales data. The extracted features are then fed to the K-means algorithm to cluster the sales data into several disjoint clusters. Finally, SVR forecasting models are applied to each group to generate the final forecasting results. Experimental results from information technology (IT) product agent sales data reveal that the proposed sales forecasting scheme outperforms the three comparison models and hence provides an efficient alternative for sales forecasting.
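The cluster-then-regress structure of the scheme can be sketched with a tiny k-means and a per-cluster least-squares fit. This is only the skeleton: the ICA feature-extraction stage is omitted, the seeding is a deliberately simple deterministic choice, and OLS stands in for SVR to keep the example short:

```python
import numpy as np

def kmeans(X, k, n_iter=20):
    """Tiny k-means with deterministic seeding (points spread evenly
    across the data); a stand-in for the clustering stage."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def fit_per_cluster(X, y, labels, k):
    """Ordinary least squares fitted separately in each cluster
    (slope(s) plus intercept); the paper uses SVR here instead."""
    models = {}
    for j in range(k):
        m = labels == j
        A = np.column_stack([X[m], np.ones(m.sum())])
        models[j] = np.linalg.lstsq(A, y[m], rcond=None)[0]
    return models
```

At forecast time a new sample is assigned to its nearest cluster and scored with that cluster's model, which is exactly the routing the hybrid scheme relies on.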
NASA Technical Reports Server (NTRS)
Blankenship, Clay; Zavodsky, Bradley; Jedlovec, Gary; Wick, Gary; Neiman, Paul
2013-01-01
Atmospheric rivers are transient, narrow regions in the atmosphere responsible for the transport of large amounts of water vapor. These phenomena can have a large impact on precipitation. In particular, they can be responsible for intense rain events on the western coast of North America during the winter season. This paper focuses on attempts to improve forecasts of heavy precipitation events in the Western US due to atmospheric rivers. Profiles of water vapor derived from Atmospheric Infrared Sounder (AIRS) observations are combined with GFS forecasts by three-dimensional variational data assimilation in the Gridpoint Statistical Interpolation (GSI) system. Weather Research and Forecasting (WRF) forecasts initialized from the combined field are compared to forecasts initialized from the GFS forecast alone for three test cases in the winter of 2011. Results will be presented showing the impact of the AIRS profile data on water vapor and temperature fields, and on the resultant precipitation forecasts.
Some New Mathematical Methods for Variational Objective Analysis
NASA Technical Reports Server (NTRS)
Wahba, G.; Johnson, D. R.
1984-01-01
New and/or improved variational methods for simultaneously combining forecast, heterogeneous observational data, a priori climatology, and physics to obtain improved estimates of the initial state of the atmosphere for the purpose of numerical weather prediction are developed. Cross validated spline methods are applied to atmospheric data for the purpose of improved description and analysis of atmospheric phenomena such as the tropopause and frontal boundary surfaces.
An operational procedure for rapid flood risk assessment in Europe
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Kalas, Milan; Salamon, Peter; Bianchi, Alessandra; Alfieri, Lorenzo; Feyen, Luc
2017-07-01
The development of methods for rapid flood mapping and risk assessment is a key step to increase the usefulness of flood early warning systems and is crucial for effective emergency response and flood impact mitigation. Currently, flood early warning systems rarely include real-time components to assess potential impacts generated by forecasted flood events. To overcome this limitation, this study describes the benchmarking of an operational procedure for rapid flood risk assessment based on predictions issued by the European Flood Awareness System (EFAS). Daily streamflow forecasts produced for major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in terms of flood-prone areas, economic damage and affected population, infrastructures and cities. An extensive testing of the operational procedure has been carried out by analysing the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-based and report-based flood extent data, while modelled estimates of economic damage and affected population are compared against ground-based estimations. Finally, we evaluate the skill of risk estimates derived from EFAS flood forecasts with different lead times and combinations of probabilistic forecasts. Results highlight the potential of the real-time operational procedure in helping emergency response and management.
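The hazard-times-exposure step above can be sketched as follows: a piecewise-linear depth-damage curve converts water depth per cell into a damage fraction, which is multiplied by the exposed value, while affected population is summed over flooded cells. All layers and the curve values are purely illustrative:

```python
import numpy as np

def flood_impacts(depth, exposure, population, curve_depths, curve_fracs):
    """Combine an event-based hazard map (water depth per cell) with
    exposure and vulnerability layers: economic damage from a
    piecewise-linear depth-damage curve times exposed value, and
    affected population summed over cells with non-zero depth."""
    frac = np.interp(depth.ravel(), curve_depths, curve_fracs).reshape(depth.shape)
    damage = float((frac * exposure).sum())
    affected = float(population[depth > 0.0].sum())
    return damage, affected
```

Running this over each forecasted flood event and lead time yields exactly the kind of damage/affected-population figures the procedure reports in real time.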
On the skill of various ensemble spread estimators for probabilistic short range wind forecasting
NASA Astrophysics Data System (ADS)
Kann, A.
2012-05-01
A variety of applications ranging from civil protection associated with severe weather to economic interests are heavily dependent on meteorological information. For example, precise planning of an energy supply with a high share of renewables requires detailed meteorological information at high temporal and spatial resolution. With respect to wind power, detailed analyses and forecasts of wind speed are of crucial interest for energy management. Although the applicability and the current skill of state-of-the-art probabilistic short range forecasts have increased during the last years, ensemble systems still show systematic deficiencies which limit their practical use. This paper presents methods to improve the ensemble skill of 10-m wind speed forecasts by combining deterministic information from a nowcasting system at very high horizontal resolution with uncertainty estimates from a limited area ensemble system. It is shown for a one-month validation period that a statistical post-processing procedure (a modified non-homogeneous Gaussian regression) adds further skill to the probabilistic forecasts, especially beyond the nowcasting range after +6 h.
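The non-homogeneous Gaussian regression idea can be sketched as a predictive Gaussian whose mean is linear in the ensemble mean and whose variance is linear in the ensemble variance. Real NGR/EMOS implementations estimate the coefficients by minimizing the CRPS; the two-stage least-squares fit below is only a moment-based sketch of the idea:

```python
import numpy as np

def fit_ngr(ens_mean, ens_var, obs):
    """Moment-based fit of mu = a + b*ens_mean and
    sigma^2 = c + d*ens_var from a training set of past forecasts.

    Stage 1 regresses observations on the ensemble mean; stage 2
    regresses the squared residuals on the ensemble variance, linking
    spread to skill.
    """
    A = np.column_stack([np.ones_like(ens_mean), ens_mean])
    a, b = np.linalg.lstsq(A, obs, rcond=None)[0]
    resid2 = (obs - (a + b * ens_mean)) ** 2
    B = np.column_stack([np.ones_like(ens_var), ens_var])
    c, d = np.linalg.lstsq(B, resid2, rcond=None)[0]
    return a, b, c, d
```

Given new ensemble statistics, the calibrated forecast distribution is N(a + b*mean, c + d*var), from which any quantile or exceedance probability follows.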
NASA Astrophysics Data System (ADS)
Medina, Hanoi; Tian, Di; Srivastava, Puneet; Pelosi, Anna; Chirico, Giovanni B.
2018-07-01
Reference evapotranspiration (ET0) plays a fundamental role in agronomic, forestry, and water resources management. Estimating and forecasting ET0 have long been recognized as a major challenge for researchers and practitioners in these communities. This work explored the potential of multiple leading numerical weather predictions (NWPs) for estimating and forecasting summer ET0 at 101 U.S. Regional Climate Reference Network stations over nine climate regions across the contiguous United States (CONUS). Three leading global NWP model forecasts from the THORPEX Interactive Grand Global Ensemble (TIGGE) dataset were used in this study, including the single model ensemble forecasts from the European Centre for Medium-Range Weather Forecasts (EC), the National Centers for Environmental Prediction Global Forecast System (NCEP), and the United Kingdom Meteorological Office forecasts (MO), as well as multi-model ensemble forecasts from combinations of these NWP models. A regression calibration was employed to bias correct the ET0 forecasts. The impact of individual forecast variables on ET0 forecasts was also evaluated. The results showed that the EC forecasts provided the least error and highest skill and reliability, followed by the MO and NCEP forecasts. The multi-model ensembles constructed from the combination of EC and MO forecasts provided slightly better performance than the single model EC forecasts. The regression process greatly improved ET0 forecast performance, particularly for regions with stations near the coast or with complex orography. The performance of EC forecasts was only slightly influenced by the number of ensemble members, particularly at short lead times. Even with fewer ensemble members, EC still performed better than the other two NWPs. Errors in the radiation forecasts, followed by those in the wind, had the most detrimental effects on the ET0 forecast performance.
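The multi-model combination and skill comparison above can be sketched with two standard pieces: pooling members from several ensembles, and scoring an ensemble with the CRPS via the kernel identity CRPS = E|X - y| - 0.5 E|X - X'|. This is generic verification machinery, not the paper's specific evaluation code:

```python
import numpy as np

def ensemble_crps(members, obs):
    """CRPS of an ensemble forecast for one scalar observation, using
    the kernel identity CRPS = E|X - y| - 0.5 * E|X - X'| computed
    directly from the members (lower is better)."""
    m = np.asarray(members, dtype=float)
    term1 = np.abs(m - obs).mean()
    term2 = 0.5 * np.abs(m[:, None] - m[None, :]).mean()
    return term1 - term2

def multimodel(*ensembles):
    """Pool members from several NWP ensembles (e.g. EC plus MO) into
    one multi-model ensemble, the simplest combination strategy."""
    return np.concatenate(ensembles)
```

Averaging the CRPS over stations and lead times for single-model versus pooled ensembles reproduces the kind of comparison reported in the study.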
NASA Astrophysics Data System (ADS)
Berthet, Lionel; Marty, Renaud; Bourgin, François; Viatgé, Julie; Piotte, Olivier; Perrin, Charles
2017-04-01
An increasing number of operational flood forecasting centres assess the predictive uncertainty associated with their forecasts and communicate it to the end users. This information can match the end users' needs (i.e. prove useful for efficient crisis management) only if it is reliable: reliability is therefore a key quality of operational flood forecasts. In 2015, the French flood forecasting national and regional services (Vigicrues network; www.vigicrues.gouv.fr) implemented a framework to compute quantitative discharge and water level forecasts and to assess the predictive uncertainty. Among the possible technical options to achieve this goal, a statistical analysis of past forecasting errors of deterministic models was selected (QUOIQUE method, Bourgin, 2014). It is a data-based and non-parametric approach resting on as few assumptions as possible about the mathematical structure of the forecasting error. In particular, a very simple assumption is made regarding the predictive uncertainty distributions for large events outside the range of the calibration data: the multiplicative error distribution is assumed to be constant, whatever the magnitude of the flood. Indeed, the predictive distributions may not be reliable in extrapolation. However, estimating the predictive uncertainty for these rare events is crucial when major floods are of concern. In order to improve forecast reliability for major floods, an attempt is made to combine the operational strength of the empirical statistical analysis with simple error modelling. Since the heteroscedasticity of forecast errors can considerably weaken the predictive reliability for large floods, this error modelling is based on the log-sinh transformation, which has been shown to significantly reduce the heteroscedasticity of the transformed error in a simulation context, even for flood peaks (Wang et al., 2012). 
Exploratory tests on some operational forecasts issued during the recent floods experienced in France (major spring floods in June 2016 on the Loire river tributaries and flash floods in fall 2016) will be shown and discussed. References: Bourgin, F. (2014). How to assess the predictive uncertainty in hydrological modelling? An exploratory work on a large sample of watersheds, AgroParisTech. Wang, Q. J., Shrestha, D. L., Robertson, D. E. and Pokhrel, P. (2012). A log-sinh transformation for data normalization and variance stabilization. Water Resources Research, W05514, doi:10.1029/2011WR010973.
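The log-sinh transformation of Wang et al. (2012), z = (1/b) * log(sinh(a + b*x)), and its inverse can be sketched directly; the errors are modelled in z-space where their variance is closer to constant, then mapped back. Parameter values here are arbitrary illustrations:

```python
import numpy as np

def log_sinh(x, a, b):
    """Log-sinh transformation z = (1/b) * log(sinh(a + b*x)), used to
    stabilise the variance of heteroscedastic forecast errors.
    Requires a + b*x > 0."""
    return np.log(np.sinh(a + b * x)) / b

def log_sinh_inverse(z, a, b):
    """Back-transform: x = (arcsinh(exp(b*z)) - a) / b."""
    return (np.arcsinh(np.exp(b * z)) - a) / b
```

For small a + b*x the transform behaves like a log (strong variance compression); for large values it becomes nearly linear, which is why it handles flood peaks more gracefully than a plain log transform.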
NASA Astrophysics Data System (ADS)
Salvage, R. O.; Neuberg, J. W.
2016-09-01
Prior to many volcanic eruptions, an acceleration in seismicity has been observed, suggesting the potential for this as a forecasting tool. The Failure Forecast Method (FFM) relates an accelerating precursor to the timing of failure by an empirical power law, with failure being defined in this context as the onset of an eruption. Previous applications of the FFM have used a wide variety of accelerating time series, often generating questionable forecasts with large misfits between data and the forecast, as well as the generation of a number of different forecasts from the same data series. Here, we show an alternative approach applying the FFM in combination with a cross correlation technique which identifies seismicity from a single active source mechanism and location at depth. Isolating a single system at depth avoids additional uncertainties introduced by averaging data over a number of different accelerating phenomena, and consequently reduces the misfit between the data and the forecast. Similar seismic waveforms were identified in the precursory accelerating seismicity to dome collapses at Soufrière Hills volcano, Montserrat in June 1997, July 2003 and February 2010. These events were specifically chosen since they represent a spectrum of collapse scenarios at this volcano. The cross correlation technique generates a five-fold increase in the number of seismic events which could be identified from continuous seismic data rather than using triggered data, thus providing a more holistic understanding of the ongoing seismicity at the time. The use of similar seismicity as a forecasting tool for collapses in 1997 and 2003 greatly improved the forecasted timing of the dome collapse, as well as improving the confidence in the forecast, thereby outperforming the classical application of the FFM. 
We suggest that focusing on a single active seismic system at depth allows a more accurate forecast of some of the major dome collapses from the ongoing eruption at Soufrière Hills volcano, and provides a simple addition to the well-used methodology of the FFM.
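The classical FFM fit referred to above can be sketched for the common case of power-law exponent alpha = 2, where the inverse event rate decreases linearly in time and the forecast failure (eruption) time is the zero-crossing of a straight-line fit:

```python
import numpy as np

def ffm_failure_time(times, rates):
    """Failure Forecast Method with alpha = 2: fit a straight line to
    the inverse rate of precursory events and return the time at which
    it crosses zero, the forecast failure time."""
    inv = 1.0 / np.asarray(rates, dtype=float)
    slope, intercept = np.polyfit(times, inv, 1)
    return -intercept / slope
```

Applying this only to the rate of cross-correlated (similar) events, rather than to all triggered seismicity, is the paper's key modification: the inverse-rate series from a single source is cleaner, so the fitted line and its zero-crossing are better constrained.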
NASA Astrophysics Data System (ADS)
Cooper, Elizabeth; Dance, Sarah; Garcia-Pintado, Javier; Nichols, Nancy; Smith, Polly
2017-04-01
Timely and accurate inundation forecasting provides vital information about the behaviour of fluvial flood water, enabling mitigating actions to be taken by residents and emergency services. Data assimilation is a powerful mathematical technique for combining forecasts from hydrodynamic models with observations to produce a more accurate forecast. We discuss the effect of both domain size and channel friction parameter estimation on observation impact in data assimilation for inundation forecasting. Numerical shallow water simulations are carried out in a simple, idealized river channel topography. Data assimilation is performed using an Ensemble Transform Kalman Filter (ETKF) and synthetic observations of water depth in identical twin experiments. We show that reinitialising the numerical inundation model with corrected water levels after an assimilation can cause an initialisation shock if a hydrostatic assumption is made, leading to significant degradation of the forecast for several hours immediately following an assimilation. We demonstrate an effective and novel method for dealing with this. We find that using data assimilation to combine observations of water depth with forecasts from a hydrodynamic model corrects the forecast very effectively at the time of the observations. In agreement with other authors we find that the corrected forecast then moves quickly back to the open loop forecast which does not take the observations into account. Our investigations show that the time taken for the forecast to decay back to the open loop case depends on the length of the domain of interest when only water levels are corrected. This is because the assimilation corrects water depths in all parts of the domain, even when observations are only available in one area. Error growth in the forecast step then starts at the upstream part of the domain and propagates downstream. The impact of the observations is therefore longer-lived in a longer domain. 
We have found that the upstream-downstream pattern of error growth can be due to incorrect friction parameter specification, rather than errors in inflow as shown elsewhere. Our results show that joint state-parameter estimation can recover accurate values for the parameter controlling channel friction processes in the model, even when observations of water level are only available on part of the flood plain. Correcting water levels and the channel friction parameter together leads to a large improvement in the forecast water levels at all simulation times. The impact of the observations is therefore much greater when the channel friction parameter is corrected along with water levels. We find that domain length effects disappear for joint state-parameter estimation.
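The joint state-parameter estimation idea can be sketched with one ensemble Kalman analysis step on an augmented state vector (water level plus friction parameter): the parameter is updated through its sample correlation with the observed level. The paper uses an ETKF; the perturbed-observation EnKF below is only the shortest-to-write variant of the same update:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err, H, seed=1):
    """One stochastic EnKF analysis step on an augmented
    state-parameter ensemble (rows = variables, columns = members).

    Unobserved variables, such as a friction parameter, are corrected
    via their ensemble cross-covariance with the observed variables.
    """
    n_ens = ensemble.shape[1]
    rng = np.random.default_rng(seed)
    A = ensemble - ensemble.mean(axis=1, keepdims=True)
    Hx = H @ ensemble
    HA = Hx - Hx.mean(axis=1, keepdims=True)
    P_xy = A @ HA.T / (n_ens - 1)                           # cross-covariance
    P_yy = HA @ HA.T / (n_ens - 1) + np.eye(H.shape[0]) * obs_err ** 2
    K = P_xy @ np.linalg.inv(P_yy)                          # Kalman gain
    perturbed = obs[:, None] + rng.normal(0.0, obs_err, (H.shape[0], n_ens))
    return ensemble + K @ (perturbed - Hx)
```

Because the gain row for the parameter is proportional to its covariance with the observed level, an ensemble in which friction and level co-vary will see its friction estimate pulled toward values consistent with the observations, which is the mechanism behind the forecast improvement described above.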
Population forecasts for Bangladesh, using a Bayesian methodology.
Mahsin, Md; Hossain, Syed Shahadat
2012-12-01
Population projection for many developing countries can be quite a challenging task for demographers, mostly due to the lack of sufficient reliable data. The objective of this paper is to present an overview of the existing methods for population forecasting and to propose an alternative based on Bayesian statistics, combining the formality of statistical inference with expert judgement. The analysis has been made using the Markov Chain Monte Carlo (MCMC) technique for Bayesian methodology available with the software WinBUGS. Convergence diagnostic techniques available with the WinBUGS software have been applied to ensure the convergence of the chains necessary for the implementation of MCMC. The Bayesian approach allows for the use of observed data and expert judgements by means of appropriate priors, and more realistic population forecasts, along with associated uncertainty, have been obtained.
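The elementary building block behind such MCMC-based projections can be sketched as a random-walk Metropolis sampler drawing from a posterior over, say, a growth-rate parameter (WinBUGS itself uses Gibbs sampling and more elaborate schemes; step size and chain length here are illustrative):

```python
import numpy as np

def metropolis(log_post, x0, n_steps=20000, step=1.0, seed=0):
    """Random-walk Metropolis sampler for a scalar parameter.

    Proposes Gaussian perturbations and accepts with probability
    min(1, exp(log_post(prop) - log_post(current))), yielding a chain
    whose stationary distribution is the target posterior.
    """
    rng = np.random.default_rng(seed)
    x, lp = float(x0), log_post(x0)
    out = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + rng.normal(0.0, step)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            x, lp = prop, lp_prop
        out[i] = x
    return out
```

Population forecasts with uncertainty then follow by running the projection model forward once per posterior draw and summarizing the resulting trajectories.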
Guo, Zhiqiang; Wang, Huaiqing; Yang, Jie; Miller, David J
2015-01-01
In this paper, we propose and implement a hybrid model combining two-directional two-dimensional principal component analysis ((2D)2PCA) and a Radial Basis Function Neural Network (RBFNN) to forecast stock market behavior. First, 36 stock market technical variables are selected as the input features, and a sliding window is used to obtain the input data of the model. Next, (2D)2PCA is utilized to reduce the dimension of the data and extract its intrinsic features. Finally, an RBFNN accepts the data processed by (2D)2PCA to forecast the next day's stock price or movement. The proposed model is used on the Shanghai stock market index, and the experiments show that the model achieves a good level of fitness. The proposed model is then compared with one that uses the traditional dimension reduction method principal component analysis (PCA) and independent component analysis (ICA). The empirical results show that the proposed model outperforms the PCA-based model, as well as alternative models based on ICA and on the multilayer perceptron.
PMID:25849483
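As an illustration of the RBFNN stage only (not the paper's (2D)2PCA pipeline or its 36 technical indicators), the sketch below fits a minimal Gaussian radial basis function regressor by solving the regularised kernel linear system; all data and parameter values are invented:

```python
import math

def gauss_solve(A, b):
    # Plain Gaussian elimination with partial pivoting.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf_fit(xs, ys, gamma=1.0, ridge=1e-8):
    # One Gaussian basis function per training point; ridge term for stability.
    K = [[math.exp(-gamma * (a - b) ** 2) + (ridge if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    return gauss_solve(K, ys)

def rbf_predict(xs, w, x, gamma=1.0):
    return sum(wi * math.exp(-gamma * (x - c) ** 2) for wi, c in zip(w, xs))

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [math.sin(x) for x in xs]
w = rbf_fit(xs, ys)        # output-layer weights of the RBF network
```

A full stock-forecasting pipeline would first reduce the input window with (2D)2PCA and feed the compressed features to the network; this sketch shows only the final function-approximation step.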
NASA Astrophysics Data System (ADS)
Herman, J. D.; Steinschneider, S.; Nayak, M. A.
2017-12-01
Short-term weather forecasts are not codified into the operating policies of federal, multi-purpose reservoirs, despite their potential to improve service provision. This is particularly true for facilities that provide flood protection and water supply, since the potential flood damages are often too severe to accept the risk of inaccurate forecasts. Instead, operators must maintain empty storage capacity to mitigate flood risk, even if the system is currently in drought, as occurred in California from 2012-2016. This study investigates the potential for forecast-informed operating rules to improve water supply efficiency while maintaining flood protection, combining state-of-the-art weather hindcasts with a novel tree-based policy optimization framework. We hypothesize that forecasts need only accurately predict the occurrence of a storm, rather than its intensity, to be effective in regions like California where wintertime, synoptic-scale storms dominate the flood regime. We also investigate the potential for downstream groundwater injection to improve the utility of forecasts. These hypotheses are tested in a case study of Folsom Reservoir on the American River. Because available weather hindcasts are relatively short (10-20 years), we propose a new statistical framework to develop synthetic forecasts to assess the risk associated with inaccurate forecasts. The efficiency of operating policies is tested across a range of scenarios that include varying forecast skill and additional groundwater pumping capacity. Results suggest that the combined use of groundwater storage and short-term weather forecasts can substantially improve the tradeoff between water supply and flood control objectives in large, multi-purpose reservoirs in California.
Handique, Bijoy K; Khan, Siraj A; Mahanta, J; Sudhakar, S
2014-09-01
Japanese encephalitis (JE) is one of the dreaded mosquito-borne viral diseases, mostly prevalent in south Asian countries including India. Early warning of the disease in terms of disease intensity is crucial for taking adequate and appropriate intervention measures. The present study was carried out in Dibrugarh district in the state of Assam, located in the northeastern region of India, to assess the accuracy of selected forecasting methods based on historical morbidity patterns of JE incidence during the past 22 years (1985-2006). Four selected forecasting methods, viz. seasonal average (SA), seasonal adjustment with last three observations (SAT), a modified method adjusting long-term and cyclic trend (MSAT), and autoregressive integrated moving average (ARIMA), were employed to assess the accuracy of each of the forecasting methods. The forecasting methods were validated for five consecutive years from 2007-2012 and the accuracy of each method was assessed. The forecasting method utilising seasonal adjustment with long-term and cyclic trend emerged as the best among the four selected methods and outperformed even the statistically more advanced ARIMA method. The peak of disease incidence could effectively be predicted with all the methods, but there are significant variations in the magnitude of forecast errors among the selected methods. As expected, variation in forecasts at the primary health centre (PHC) level is wide compared with that of district-level forecasts. The study showed that the adopted forecasting techniques could reasonably forecast the intensity of JE cases at the PHC level without considering external variables.
The results indicate that understanding the long-term and cyclic trend of disease intensity will improve the accuracy of the forecasts, but the forecast models need to be made more robust to explain sudden variations in disease intensity, through detailed analysis of parasite and host population dynamics.
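The simplest of the four methods, the seasonal average (SA), is easy to make concrete. The sketch below uses hypothetical monthly case counts (not the Dibrugarh data), forecasts each month as its mean over past years, and scores the result with mean absolute error:

```python
def seasonal_average_forecast(history, months_per_year=12):
    # history: monthly counts, oldest first, whole years only.
    # Forecast each calendar month as its average over the past years.
    years = len(history) // months_per_year
    return [sum(history[y * months_per_year + m] for y in range(years)) / years
            for m in range(months_per_year)]

def mean_absolute_error(forecast, observed):
    return sum(abs(f - o) for f, o in zip(forecast, observed)) / len(observed)

# Two years of hypothetical monthly JE case counts with a summer peak
history = [1, 1, 2, 4, 8, 20, 35, 30, 12, 5, 2, 1,
           2, 1, 3, 5, 9, 22, 33, 28, 14, 4, 3, 1]
forecast = seasonal_average_forecast(history)
observed = [1, 2, 2, 5, 10, 21, 36, 29, 13, 5, 2, 2]
mae = mean_absolute_error(forecast, observed)   # validation-year error
```

The MSAT and SAT variants in the study adjust this baseline with recent observations and long-term/cyclic trend terms before scoring.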
Researches on High Accuracy Prediction Methods of Earth Orientation Parameters
NASA Astrophysics Data System (ADS)
Xu, X. Q.
2015-09-01
The Earth's rotation reflects the coupling process among the solid Earth, atmosphere, oceans, mantle, and core on multiple spatial and temporal scales. The Earth's rotation can be described by the Earth orientation parameters (EOP), mainly including two polar motion components (PM_X and PM_Y) and the variation in the length of day (ΔLOD). The EOP are crucial in the transformation between the terrestrial and celestial reference systems, and have important applications in many areas such as deep space exploration, satellite precise orbit determination, and astrogeodynamics. However, the EOP products obtained by space geodetic technologies are generally delayed by several days to two weeks. The growing demands of modern space navigation make high-accuracy EOP prediction a worthy topic. This thesis is composed of the following three aspects, for the purpose of improving EOP forecast accuracy. (1) We analyze the relation between the length of the basic data series and the EOP forecast accuracy, and compare the EOP prediction accuracy of the linear autoregressive (AR) model and the nonlinear artificial neural network (ANN) method by performing least squares (LS) extrapolations. The results show that high-precision EOP forecasts can be realized by appropriate selection of the basic data series length according to the required time span of the prediction: for short-term prediction the basic data series should be shorter, while for long-term prediction the series should be longer. The analysis also shows that the LS+AR model is more suitable for short-term forecasts, while the LS+ANN model shows advantages in medium- and long-term forecasts. (2) We develop, for the first time, a new method which combines the autoregressive model and the Kalman filter (AR+Kalman) in short-term EOP prediction.
The equations of observation and state are established using the EOP series and the autoregressive coefficients respectively, and are used to improve/re-evaluate the AR model. Compared to the single AR model, the AR+Kalman method performs better in the prediction of UT1-UTC and ΔLOD, and the improvement in the prediction of polar motion is significant. (3) Following the successful Earth Orientation Parameter Prediction Comparison Campaign (EOP PCC), the Earth Orientation Parameter Combination of Prediction Pilot Project (EOPC PPP) was launched in 2010. As one of the participants from China, we update and submit short- and medium-term (1 to 90 days) EOP predictions every day. According to the current comparative statistics, our prediction accuracy is at a medium international level. We will carry out more innovative research to improve the EOP forecast accuracy and enhance our standing in EOP forecasting.
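The LS+AR idea, fitting a least-squares trend and modelling the residuals with an autoregressive process, can be sketched in a toy form (AR(1) only, with synthetic data; the thesis uses higher-order AR models and real EOP series):

```python
def ls_fit(series):
    # Ordinary least-squares line y = a + b * t over t = 0..n-1.
    n = len(series)
    xbar = (n - 1) / 2
    ybar = sum(series) / n
    b = sum((i - xbar) * (y - ybar) for i, y in enumerate(series)) / \
        sum((i - xbar) ** 2 for i in range(n))
    return ybar - b * xbar, b

def ar1_coeff(resid):
    # Least-squares AR(1) coefficient for the detrended residuals.
    num = sum(resid[t] * resid[t - 1] for t in range(1, len(resid)))
    den = sum(r * r for r in resid[:-1])
    return num / den if den else 0.0

def ls_ar_forecast(series, steps):
    a, b = ls_fit(series)
    resid = [y - (a + b * i) for i, y in enumerate(series)]
    phi = ar1_coeff(resid)
    n, r = len(series), resid[-1]
    out = []
    for h in range(1, steps + 1):
        r *= phi                       # AR(1) residual decays toward zero
        out.append(a + b * (n - 1 + h) + r)
    return out
```

Extrapolating the trend and adding the damped residual is the LS+AR forecast; the AR+Kalman variant in the thesis instead re-evaluates the AR coefficients recursively through a Kalman filter.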
The Operational Forecasting of Undesirable Pollution Levels Based on a Combined Pollution Index
ERIC Educational Resources Information Center
McAdie, H. G.; Gillies, D. K. A.
1973-01-01
Describes the application of an air pollution index, in conjunction with synoptic meteorological forecasting, to an operational program for forecasting pollution potential in the Sarnia (Ontario) petrochemical complex. (JR)
Hybrid robust predictive optimization method of power system dispatch
Chandra, Ramu Sharat [Niskayuna, NY; Liu, Yan [Ballston Lake, NY; Bose, Sumit [Niskayuna, NY; de Bedout, Juan Manuel [West Glenville, NY
2011-08-02
A method of power system dispatch control solves power system dispatch problems by integrating a larger variety of generation, load and storage assets, including, without limitation, combined heat and power (CHP) units, renewable generation with forecasting, controllable loads, and electric, thermal and water energy storage. The method employs a predictive algorithm to dynamically schedule the different assets in order to achieve global optimization and maintain normal system operation.
A hybrid least squares support vector machines and GMDH approach for river flow forecasting
NASA Astrophysics Data System (ADS)
Samsudin, R.; Saad, P.; Shabri, A.
2010-06-01
This paper proposes a novel hybrid forecasting model, which combines the group method of data handling (GMDH) and the least squares support vector machine (LSSVM), known as GLSSVM. The GMDH is used to determine the useful input variables for the LSSVM model, and the LSSVM model performs the time series forecasting. In this study, the application of GLSSVM to monthly river flow forecasting for the Selangor and Bernam rivers is investigated. The results of the proposed GLSSVM approach are compared with conventional artificial neural network (ANN) models, the Autoregressive Integrated Moving Average (ARIMA) model, and the GMDH and LSSVM models, using long-term observations of monthly river flow discharge. Standard statistical measures, the root mean square error (RMSE) and the coefficient of correlation (R), are employed to evaluate the performance of the various models developed. Experimental results indicate that the hybrid model is a powerful tool for modeling discharge time series and can be applied successfully to complex hydrological modeling.
Monthly streamflow forecasting based on hidden Markov model and Gaussian Mixture Regression
NASA Astrophysics Data System (ADS)
Liu, Yongqi; Ye, Lei; Qin, Hui; Hong, Xiaofeng; Ye, Jiajun; Yin, Xingli
2018-06-01
Reliable streamflow forecasts can be highly valuable for water resources planning and management. In this study, we combined a hidden Markov model (HMM) and Gaussian Mixture Regression (GMR) for probabilistic monthly streamflow forecasting. The HMM is initialized using a kernelized K-medoids clustering method, and the Baum-Welch algorithm is then executed to learn the model parameters. GMR derives a conditional probability distribution for the predictand given covariate information, including the antecedent flow at a local station and two surrounding stations. The performance of HMM-GMR was verified based on the mean square error and continuous ranked probability score skill scores. The reliability of the forecasts was assessed by examining the uniformity of the probability integral transform values. The results show that HMM-GMR obtained reasonably high skill scores and the uncertainty spread was appropriate. Different HMM states were assumed to be different climate conditions, which would lead to different types of observed values. We demonstrated that the HMM-GMR approach can handle multimodal and heteroscedastic data.
NASA Technical Reports Server (NTRS)
Duda, David P.; Minnis, Patrick
2009-01-01
Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast to accurately predict contrail formation over the contiguous United States (CONUS) is created by using meteorological data based on hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and from the Rapid Update Cycle (RUC) as well as GOES water vapor channel measurements, combined with surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.
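At prediction time, the logistic models referred to above reduce to a logistic function of meteorological predictors. A minimal sketch follows; the predictors and coefficients below are invented for illustration, not taken from the SURFACE or OUTBREAK models:

```python
import math

def logistic_probability(coeffs, intercept, predictors):
    # Standard logistic regression score: p = 1 / (1 + exp(-(b0 + b·x))).
    z = intercept + sum(c * x for c, x in zip(coeffs, predictors))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical predictors: relative humidity with respect to ice (%) and
# ambient temperature (deg C); coefficients chosen only for illustration.
coeffs, intercept = [0.08, -0.10], -10.0
p = logistic_probability(coeffs, intercept, [110.0, -55.0])
```

Fitting such a model amounts to estimating `coeffs` and `intercept` by maximum likelihood against the surface and satellite contrail observations described above.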
NASA Astrophysics Data System (ADS)
Schepen, Andrew; Zhao, Tongtiegang; Wang, Quan J.; Robertson, David E.
2018-03-01
Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
NASA Astrophysics Data System (ADS)
Arsenault, R.; Mai, J.; Latraverse, M.; Tolson, B.
2017-12-01
Probabilistic ensemble forecasts generated by the ensemble streamflow prediction (ESP) methodology are subject to biases due to errors in the hydrological model's initial states. In day-to-day operations, hydrologists must compensate for discrepancies between observed and simulated states such as streamflow. However, in data-scarce regions, little to no information is available to guide the streamflow assimilation process. The manual assimilation process can then lead to more uncertainty due to the numerous options available to the forecaster. Furthermore, the model's mass balance may be compromised, which could affect future forecasts. In this study we propose a data-driven approach in which specific variables that may be adjusted during assimilation are defined. The underlying principle was to identify key variables that would be the most appropriate to modify during streamflow assimilation depending on the initial conditions, such as the time period of the assimilation, the snow water equivalent of the snowpack, and meteorological conditions. The variables to adjust were determined by performing an automatic variational data assimilation on individual (or combinations of) model state variables and meteorological forcings. The assimilation aimed to simultaneously optimize: (1) the error between the observed and simulated streamflow at the time point where the forecast starts, and (2) the bias between medium- to long-term observed and simulated flows, which were simulated by running the model with the observed meteorological data over a hindcast period. The optimal variables were then classified according to the initial conditions at the time the forecast is initiated. The proposed method was evaluated by measuring the average electricity generation of a hydropower complex in Québec, Canada, when driven by this method.
A test-bed which simulates the real-world assimilation, forecasting, water release optimization and decision-making of a hydropower cascade was developed to assess the performance of each individual process in the reservoir management chain. Here the proposed method was compared to the PF algorithm while keeping all other elements intact. Preliminary results are encouraging in terms of power generation and robustness for the proposed approach.
NASA Astrophysics Data System (ADS)
Anastasiadis, Anastasios; Sandberg, Ingmar; Papaioannou, Athanasios; Georgoulis, Manolis; Tziotziou, Kostas; Jiggens, Piers; Hilgers, Alain
2015-04-01
We present a novel integrated prediction system for both solar flares and solar energetic particle (SEP) events, which is in place to provide short-term warnings for hazardous solar radiation storms. The FORSPEF system provides forecasting of solar eruptive events, such as solar flares with a projection to coronal mass ejections (CMEs) (occurrence and velocity), and the likelihood of occurrence of an SEP event. It also provides nowcasting of SEP events based on actual solar flare and CME near-real-time alerts, as well as SEP characteristics (peak flux, fluence, rise time, duration) per parent solar event. The prediction of solar flares relies on a morphological method based on the sophisticated derivation of the effective connected magnetic field strength (Beff) of potentially flaring active-region (AR) magnetic configurations, utilizing analysis of a large number of AR magnetograms. For the prediction of SEP events, a new reductive statistical method has been implemented based on a newly constructed database of solar flares, CMEs and SEP events that covers a large time span, from 1984 to 2013. The method is based on flare location (longitude), flare size (maximum soft X-ray intensity), and the occurrence (or not) of a CME. Warnings are issued for all > C1.0 soft X-ray flares. The warning time in the forecasting scheme extends to 24 hours with a refresh rate of 3 hours, while the respective warning time for the nowcasting scheme depends on the availability of the near-real-time data and falls between 15 and 20 minutes. We discuss the modules of the FORSPEF system, their interconnection and the operational set-up. The dual approach in the development of FORSPEF (i.e. forecasting and nowcasting schemes) permits the refinement of predictions upon the availability of new data that characterize changes on the Sun and in interplanetary space, while the combined usage of solar flare and SEP forecasting methods makes FORSPEF an integrated forecasting solution.
This work has been funded through the "FORSPEF: FORecasting Solar Particle Events and Flares", ESA Contract No. 4000109641/13/NL/AK
NASA Astrophysics Data System (ADS)
Kaltenboeck, Rudolf; Kerschbaum, Markus; Hennermann, Karin; Mayer, Stefan
2013-04-01
Nowcasting of precipitation events, especially thunderstorm events or winter storms, has a high impact on flight safety and efficiency for air traffic management. Future strategic planning by air traffic control will result in circumnavigation of potentially hazardous areas, reduction of load around efficiency hot spots by offering alternatives, increased handling capacity, anticipation of avoidance manoeuvres, and increased awareness before dangerous areas are entered by aircraft. To facilitate this, rapid-update forecasts of the location, intensity, size, movement and development of local storms are necessary. Weather radar data deliver precipitation analyses of high temporal and spatial resolution close to real time by using clever scanning strategies. These data are the basis for generating rapid-update forecasts in a time frame of up to 2 hours and more for applications in aviation meteorological service provision, such as optimizing safety and economic impact in the context of sub-scale phenomena. Movement vectors are calculated from successive weather radar images by tracking radar echoes via correlation. For every new radar image, a set of ensemble precipitation fields is collected by using different parameter sets, such as pattern match size, different time steps, filter methods, and an implementation of tracking-vector history and plausibility checks. This method accounts for the uncertainty in rain field displacement and for different scales in time and space. By manually validating a set of case studies, the best verification method and skill score are defined and implemented in an online verification scheme, which calculates the optimized forecasts for different time steps and different areas using different extrapolation ensemble members. To obtain information about the quality and reliability of the extrapolation process, additional data-quality information (e.g. shielding in Alpine areas) is extrapolated and combined with an extrapolation-quality index. Subsequently the probability and quality information of the forecast ensemble is available, and flexible blending to a numerical prediction model for each subarea is possible. Simultaneously with automatic processing, the ensemble nowcasting product is visualized in a new, innovative way which combines the intensity, probability and quality information for different subareas in one forecast image.
Improving medium-range and seasonal hydroclimate forecasts in the southeast USA
NASA Astrophysics Data System (ADS)
Tian, Di
Accurate hydro-climate forecasts are important for decision making by water managers, agricultural producers, and other stakeholders. Numerical weather prediction models and general circulation models may have potential for improving hydro-climate forecasts at different scales. In this study, forecast analogs of the Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) based on different approaches were evaluated for medium-range reference evapotranspiration (ETo), irrigation scheduling, and urban water demand forecasts in the southeast United States; the Climate Forecast System version 2 (CFSv2) and the North American national multi-model ensemble (NMME) were statistically downscaled for seasonal forecasts of ETo, precipitation (P), and 2-m temperature (T2M) at the regional level. The GFS mean temperature (Tmean), relative humidity, and wind speed (Wind) reforecasts, combined with the climatology of Reanalysis 2 solar radiation (Rs), produced higher skill than using the direct GFS output only. Constructed analogs showed slightly higher skill than natural analogs for deterministic forecasts. Both irrigation scheduling driven by the GEFS-based ETo forecasts and GEFS-based ETo forecast skill were generally positive up to one week throughout the year. The GEFS improved ETo forecast skill compared to the GFS. The GEFS-based analog forecasts for the input variables of an operational urban water demand model were skillful when applied in the Tampa Bay area. The modified operational models driven by GEFS analog forecasts showed higher forecast skill than the operational model based on persistence. The results for CFSv2 seasonal forecasts showed that maximum temperature (Tmax) and Rs had the greatest influence on ETo. The downscaled Tmax showed the highest predictability, followed by Tmean, Tmin, Rs, and Wind.
The CFSv2 model could better predict ETo in cold seasons during El Niño Southern Oscillation (ENSO) events only when the forecast initial condition was in ENSO. Downscaled P and T2M forecasts were produced either by directly downscaling the NMME P and T2M output or indirectly by using the NMME forecasts of Niño 3.4 sea surface temperatures to predict local-scale P and T2M. The indirect method generally showed the highest forecast skill, which occurred in cold seasons. The bias-corrected NMME ensemble forecast skill did not outperform that of the best single model.
Mobile geophysics for searching and exploration of Domanic hydrocarbon deposits
NASA Astrophysics Data System (ADS)
Borovsky, M. Ya; Uspensky, B. V.; Valeeva, S. E.; Borisov, A. S.
2018-05-01
Features of shale hydrocarbon occurrence are noted, and the role of geophysical prospecting in the exploration of non-traditional hydrocarbon sources is shown. The possibilities of non-seismic methods for forecasting, prospecting, exploration, and preparation of Domanikovian hydrocarbon accumulations for exploration are considered. The need for geophysical studies of tectonic disturbances is emphasized. Modern aerogeophysical instrumentation and methodological support allow high-precision magnetic prospecting to be combined with gravimetry and gamma-ray spectrometry. This combination of geophysical methods contributes to the diagnosis of active and latent faults.
Research on light rail electric load forecasting based on ARMA model
NASA Astrophysics Data System (ADS)
Huang, Yifan
2018-04-01
The article compares a variety of time series models in light of the characteristics of power load forecasting, and establishes a light rail electric load forecasting model based on the ARMA model. Using this model, the load of a light rail system is forecasted. The prediction results show that the accuracy of the model's predictions is high.
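A toy version of the differencing-plus-autoregression idea behind ARMA/ARIMA load forecasting can be sketched as follows (an ARIMA(1,1,0)-style forecaster with the AR coefficient fitted by least squares; the load values are invented, and a real ARMA fit would also estimate moving-average terms):

```python
def arima110_forecast(series, steps):
    # First-difference the series to remove the trend.
    d = [series[t] - series[t - 1] for t in range(1, len(series))]
    # Fit AR(1) to the differences by least squares.
    num = sum(d[t] * d[t - 1] for t in range(1, len(d)))
    den = sum(x * x for x in d[:-1])
    phi = num / den if den else 0.0
    # Forecast differences with AR(1), then integrate back to the level.
    level, diff, out = series[-1], d[-1], []
    for _ in range(steps):
        diff *= phi
        level += diff
        out.append(level)
    return out
```

For a series with a steady increment the fitted `phi` is 1 and the forecast simply continues the trend; real load series yield `phi` between -1 and 1, so forecast increments decay toward zero.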
NASA Technical Reports Server (NTRS)
Meng, Huan; Ferraro, Ralph; Kongoli, Cezar; Yan, Banghua; Zavodsky, Bradley; Zhao, Limin; Dong, Jun; Wang, Nai-Yu
2015-01-01
Passive microwave observations are available from the Advanced Microwave Sounding Unit (AMSU), the Microwave Humidity Sounder (MHS), and the Advanced Technology Microwave Sounder (ATMS). ATMS is the follow-on sensor to AMSU and MHS. Currently, an AMSU- and MHS-based land snowfall rate (SFR) product runs operationally at NOAA/NESDIS. Based on the AMSU/MHS SFR, an ATMS SFR algorithm has also been developed. The algorithm performs retrieval in three steps: snowfall detection, retrieval of cloud properties, and estimation of snow particle terminal velocity and snowfall rate. The snowfall detection component utilizes principal component analysis and a logistic regression model. It employs a combination of temperature and water vapor sounding channels to detect the scattering signal from falling snow and derives the probability of snowfall. Cloud properties are retrieved using an inversion method with an iteration algorithm and a two-stream radiative transfer model. A method is adopted to calculate snow particle terminal velocity. Finally, snowfall rate is computed by numerically solving a complex integral. The SFR products are used mainly in two communities: hydrology and weather forecasting. Global blended precipitation products traditionally do not include snowfall derived from satellites because such products were not available operationally in the past. The ATMS and AMSU/MHS SFR now provide the winter precipitation information for these blended precipitation products. Weather forecasters mainly rely on radar and station observations for snowfall forecasts. The SFR products can fill in gaps where no conventional snowfall data are available to forecasters. The products can also be used to confirm radar and gauge snowfall data and increase forecasters' confidence in their predictions.
NASA Astrophysics Data System (ADS)
Haiyang, Yu; Yanmei, Liu; Guijun, Yang; Xiaodong, Yang; Dong, Ren; Chenwei, Nie
2014-03-01
To achieve dynamic winter wheat quality monitoring and forecasting over large regions, the objective of this study was to design and develop a winter wheat quality monitoring and forecasting system using a remote sensing index and environmental factors. The winter wheat quality trend was forecasted before the harvest, and quality was monitored after the harvest. The traditional quality-vegetation index remote sensing monitoring and forecasting models were improved. Combined with latitude information, the vegetation index was used to estimate agronomy parameters related to winter wheat quality in the early stages, in order to forecast the quality trend. A combination of rainfall in May, temperature in May, illumination in late May, the soil available nitrogen content, and other environmental factors established the quality monitoring model. Compared with a simple quality-vegetation index, the remote sensing monitoring and forecasting models used in this system achieved greatly improved accuracy. Winter wheat quality was monitored and forecasted based on the above models, and the system was implemented using WebGIS technology. Finally, the operation of the winter wheat quality monitoring system was demonstrated for Beijing in 2010, and the monitoring and forecasting results were output as thematic maps.
NASA Astrophysics Data System (ADS)
Ma, Chaoqun; Wang, Tijian; Zang, Zengliang; Li, Zhijin
2018-07-01
Atmospheric chemistry models usually perform badly in forecasting wintertime air pollution because of their uncertainties. Generally, such uncertainties can be decreased effectively by techniques such as data assimilation (DA) and model output statistics (MOS). However, the relative importance and combined effects of the two techniques have not been clarified. Here, a one-month air quality forecast with the Weather Research and Forecasting-Chemistry (WRF-Chem) model was carried out in a virtually operational setup focusing on Hebei Province, China. Meanwhile, three-dimensional variational (3DVar) DA and MOS based on one-dimensional Kalman filtering were implemented separately and simultaneously to investigate their performance in improving the model forecast. Comparison with observations shows that the chemistry forecast with MOS outperforms that with 3DVar DA, which could be seen in all the species tested over the whole 72 forecast hours. Combined use of both techniques does not guarantee a better forecast than MOS only, with the improvements and degradations being small and appearing rather randomly. Results indicate that the implementation of MOS is more suitable than 3DVar DA in improving the operational forecasting ability of WRF-Chem.
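The one-dimensional Kalman filtering used for MOS in the study can be sketched as a recursive estimator of the model's additive forecast bias, which is then subtracted from new forecasts. All noise variances and error values below are assumed for illustration, not taken from the paper:

```python
def kalman_bias_estimates(errors, q=0.01, r=1.0):
    # errors: sequence of (forecast - observation) values, oldest first.
    # q: variance of the bias random walk; r: variance of a single error.
    # Returns the bias estimate after each update.
    bias, p, out = 0.0, 1.0, []
    for e in errors:
        p += q                   # predict: bias persists, uncertainty grows
        k = p / (p + r)          # Kalman gain
        bias += k * (e - bias)   # update with the latest forecast error
        p *= (1 - k)             # shrink the estimate uncertainty
        out.append(bias)
    return out

# A model that is persistently about 3 units too high
errors = [3.1, 2.9, 3.0, 3.2, 2.8, 3.0, 3.1, 2.9]
estimates = kalman_bias_estimates(errors)
```

The corrected forecast at each step is simply the raw model output minus the current bias estimate; the estimate adapts if the model's bias drifts, which is what makes the approach attractive for wintertime pollution episodes.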
A simple Lagrangian forecast system with aviation forecast potential
NASA Technical Reports Server (NTRS)
Petersen, R. A.; Homan, J. H.
1983-01-01
A trajectory forecast procedure is developed which uses geopotential tendency fields obtained from a simple, multiple layer, potential vorticity conservative isentropic model. This model can objectively account for short-term advective changes in the mass field when combined with fine-scale initial analyses. This procedure for producing short-term, upper-tropospheric trajectory forecasts employs a combination of a detailed objective analysis technique, an efficient mass advection model, and a diagnostically proven trajectory algorithm, none of which require extensive computer resources. Results of initial tests are presented, which indicate an exceptionally good agreement for trajectory paths entering the jet stream and passing through an intensifying trough. It is concluded that this technique not only has potential for aiding in route determination, fuel use estimation, and clear air turbulence detection, but also provides an example of the types of short range forecasting procedures which can be applied at local forecast centers using simple algorithms and a minimum of computer resources.
Water demand forecasting: review of soft computing methods.
Ghalehkhondabi, Iman; Ardjmand, Ehsan; Young, William A; Weckman, Gary R
2017-07-01
Demand forecasting plays a vital role in resource management for governments and private companies. Considering the scarcity of water and its inherent constraints, demand management and forecasting in this domain are critically important. Several soft computing techniques have been developed over the last few decades for water demand forecasting. This study focuses on soft computing methods of water consumption forecasting published between 2005 and 2015. These methods include artificial neural networks (ANNs), fuzzy and neuro-fuzzy models, support vector machines, metaheuristics, and system dynamics. Furthermore, it was discussed that although ANNs have been superior in many short-term forecasting cases, it is still very difficult to pick a single method as the overall best. According to the literature, various methods and their hybrids are applied to water demand forecasting. However, it seems soft computing has much more to contribute to water demand forecasting. These contribution areas include, but are not limited to, various ANN architectures, unsupervised methods, deep learning, various metaheuristics, and ensemble methods. Moreover, it is found that soft computing methods are mainly used for short-term demand forecasting.
NASA Astrophysics Data System (ADS)
Foster, Kean; Bertacchi Uvo, Cintia; Olsson, Jonas
2018-05-01
Hydropower makes up nearly half of Sweden's electrical energy production. However, the distribution of the water resources is not aligned with demand, as most of the inflows to the reservoirs occur during the spring flood period. This means that carefully planned reservoir management is required to help redistribute water resources to ensure optimal production, and accurate forecasts of the spring flood volume (SFV) are essential for this. The current operational SFV forecasts use a historical ensemble approach where the HBV model is forced with historical observations of precipitation and temperature. In this work we develop and test a multi-model prototype, building on previous work, and evaluate its ability to forecast the SFV in 84 sub-basins in northern Sweden. The hypothesis explored in this work is that a multi-model seasonal forecast system incorporating different modelling approaches is generally more skilful at forecasting the SFV in snow-dominated regions than a forecast system that utilises only one approach. The testing is done using cross-validated hindcasts for the period 1981-2015 and the results are evaluated against both climatology and the current system to determine skill. Both multi-model methods considered showed skill over the reference forecasts. The version that combined the historical modelling chain, dynamical modelling chain, and statistical modelling chain performed better than the other and was chosen for the prototype. The prototype was able to outperform the current operational system 57 % of the time on average and reduce the error in the SFV by ~6 % across all sub-basins and forecast dates.
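A minimal sketch of the combine-and-verify idea: average the forecasts from several modelling chains and score the result against a climatology reference. The numbers and the equal-weight combination are illustrative only; the actual chains are the HBV-based historical, dynamical, and statistical systems, and skill is judged against both climatology and the operational system.

```python
def mse(pred, obs):
    """Mean squared error of a forecast series against observations."""
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)

def skill_score(pred, obs, reference):
    """1 - MSE(pred)/MSE(reference); positive means skill over the reference."""
    return 1.0 - mse(pred, obs) / mse(reference, obs)

# Invented SFV hindcasts (arbitrary units) from three modelling chains
obs         = [100.0, 120.0,  90.0, 110.0]
historical  = [105.0, 115.0, 100.0, 105.0]   # HBV forced with historical weather
dynamical   = [ 98.0, 125.0,  85.0, 112.0]   # dynamical seasonal chain
statistical = [102.0, 118.0,  92.0, 108.0]   # statistical chain

# Equal-weight multi-model combination, scored against climatology
multi_model = [(h + d + s) / 3 for h, d, s in zip(historical, dynamical, statistical)]
climatology = [sum(obs) / len(obs)] * len(obs)
ss = skill_score(multi_model, obs, climatology)
```

In this toy case the combination beats each member because their errors partly cancel, which is the core argument for the multi-model prototype.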
Development of predictive weather scenarios for early prediction of rice yield in South Korea
NASA Astrophysics Data System (ADS)
Shin, Y.; Cho, J.; Jung, I.
2017-12-01
International grain prices are becoming unstable due to the frequent occurrence of abnormal weather caused by climate change. Early prediction of grain yield using weather forecast data is therefore important for the stabilization of international grain prices. The APEC Climate Center (APCC) provides seasonal forecast data based on monthly climate prediction models for global seasonal forecasting services. The 3-month and 6-month seasonal forecast data produced with the multi-model ensemble (MME) technique are available on its website, ADSS (APCC Data Service System, http://adss.apcc21.org/). The spatial resolution of the seasonal forecast data for each individual model is 2.5° × 2.5° (about 250 km) and the time scale is monthly. In this study, we developed customized weather forecast scenarios that combine seasonal forecast data with observational data for use in an early rice yield prediction model. A statistical downscaling method was applied to produce the meteorological input data for the crop model, because the field-scale crop model (ORYZA2000) requires daily weather data. To determine whether the forecast data are suitable for the crop model, we produced spatio-temporally downscaled weather scenarios and evaluated their predictability by comparison with observed weather data at 57 ASOS stations in South Korea. The customized weather forecast scenarios can be applied to various fields beyond early rice yield prediction. Acknowledgement: This work was carried out with the support of the "Cooperative Research Program for Agriculture Science and Technology Development (Project No: PJ012855022017)" of the Rural Development Administration, Republic of Korea.
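One common building block of statistical downscaling is empirical quantile mapping: a coarse-model value is replaced by the station value at the same empirical quantile. This toy version is not necessarily the method used in the study; the climatologies are invented.

```python
import bisect

def quantile_map(value, model_clim, obs_clim):
    """Map a forecast value onto the observed distribution via its
    empirical quantile in the model climatology."""
    m = sorted(model_clim)
    o = sorted(obs_clim)
    rank = bisect.bisect_left(m, value)
    q = min(rank / len(m), 1.0 - 1e-9)   # empirical quantile, capped below 1
    return o[int(q * len(o))]            # value at the same quantile in obs

model_clim = list(range(10, 20))         # hypothetical coarse-model climatology
obs_clim = list(range(12, 22))           # station climatology, biased 2 units up
corrected = quantile_map(15, model_clim, obs_clim)
```

Here the model's median value 15 maps to the station's median 17, removing the systematic bias while preserving the forecast's rank.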
Ultra-Short-Term Wind Power Prediction Using a Hybrid Model
NASA Astrophysics Data System (ADS)
Mohammed, E.; Wang, S.; Yu, J.
2017-05-01
This paper develops and applies a hybrid model of two data-analytical methods, multiple linear regression and least squares (MLR&LS), for ultra-short-term wind power prediction (WPP), taking Northeast China's electricity demand as an example. The data were obtained from historical wind power records of an offshore region and of a wind farm within the wind power plants of the area. The WPP is achieved in two stages: first, wind power ratios are forecast using the proposed hybrid method, and then these ratios are transformed to obtain the forecast power values. The hybrid model combines the persistence method, MLR, and LS, and supports two prediction types: multi-point prediction and single-point prediction. The WPP is benchmarked against different models, namely the autoregressive moving average (ARMA), autoregressive integrated moving average (ARIMA) and artificial neural network (ANN). Comparing the results of these models confirms the validity of the proposed hybrid model in terms of error and correlation coefficient, showing that the proposed method works effectively. Additionally, forecasting errors were computed and compared to improve understanding of how to depict highly variable wind power and the correlations between actual and predicted wind power.
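The MLR-with-least-squares core can be sketched as a lagged linear model fitted by ordinary least squares. The series (a noisy sinusoid standing in for wind power ratios) and the three-lag structure are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "wind power ratio" series: smooth signal plus noise
ratios = np.sin(np.linspace(0, 6, 60)) + 0.05 * rng.standard_normal(60)

lags = 3
# Row t of X holds (ratios[t], ratios[t+1], ratios[t+2]); target is ratios[t+3]
X = np.column_stack([ratios[i:len(ratios) - lags + i] for i in range(lags)])
y = ratios[lags:]

coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares fit of the MLR
pred = X @ coef                                # in-sample one-step predictions
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
```

The residual error is dominated by the injected noise, since a sinusoid is exactly representable by a short autoregression.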
NASA Astrophysics Data System (ADS)
Pathak, Jaideep; Wikner, Alexander; Fussell, Rebeckah; Chandra, Sarthak; Hunt, Brian R.; Girvan, Michelle; Ott, Edward
2018-04-01
A model-based approach to forecasting chaotic dynamical systems utilizes knowledge of the mechanistic processes governing the dynamics to build an approximate mathematical model of the system. In contrast, machine learning techniques have demonstrated promising results for forecasting chaotic systems purely from past time series measurements of system state variables (training data), without prior knowledge of the system dynamics. The motivation for this paper is the potential of machine learning for filling in the gaps in our underlying mechanistic knowledge that cause widely used knowledge-based models to be inaccurate. Thus, we here propose a general method that leverages the advantages of these two approaches by combining a knowledge-based model and a machine learning technique to build a hybrid forecasting scheme. Potential applications for such an approach are numerous (e.g., improving weather forecasting). We demonstrate and test the utility of this approach using a particular illustrative version of a machine learning technique known as reservoir computing, and we apply the resulting hybrid forecaster to a low-dimensional chaotic system, as well as to a high-dimensional spatiotemporal chaotic system. These tests yield extremely promising results in that our hybrid technique is able to accurately predict for a much longer period of time than either its machine-learning component or its model-based component alone.
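The reservoir computing component can be illustrated with a toy echo state network: a fixed random recurrent network driven by the input, with only a linear (ridge-regression) readout trained. The hyperparameters and the one-step sine-prediction task are purely illustrative, far simpler than the chaotic systems in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200                                            # reservoir size
W_in = rng.uniform(-0.5, 0.5, (N, 1))              # fixed input weights
W = rng.standard_normal((N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # set spectral radius to 0.9

u = np.sin(0.2 * np.arange(600))                   # input time series
states = np.zeros((len(u), N))
x = np.zeros(N)
for t, ut in enumerate(u):
    x = np.tanh(W_in[:, 0] * ut + W @ x)           # reservoir state update
    states[t] = x

# Train a ridge-regression readout to predict the next input value
X, y = states[100:-1], u[101:]                     # discard initial transient
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)
pred = float(states[-1] @ W_out)                   # one-step-ahead forecast
```

Only `W_out` is learned; the random reservoir supplies a rich nonlinear memory of the input history, which is what makes training so cheap.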
NASA Astrophysics Data System (ADS)
Bailey, Monika E.; Isaac, George A.; Gultepe, Ismail; Heckman, Ivan; Reid, Janti
2014-01-01
An automated short-range forecasting system, adaptive blending of observations and model (ABOM), was tested in real time during the 2010 Vancouver Olympic and Paralympic Winter Games in British Columbia. Data at 1-min time resolution were available from a newly established, dense network of surface observation stations. Climatological data were not available at these new stations. This, combined with output from new high-resolution numerical models, provided a unique and exciting setting to test nowcasting systems in mountainous terrain during winter weather conditions. The ABOM method blends extrapolations in time of recent local observations with numerical weather prediction (NWP) model forecasts to generate short-range point forecasts of surface variables out to 6 h. The relative weights of the model forecast and the observation extrapolation are based on performance over recent history. The average performance of ABOM nowcasts during February and March 2010 was evaluated using standard scores and thresholds important for Olympic events. Significant improvements over the model forecasts alone were obtained for continuous variables such as temperature, relative humidity and wind speed. The small improvements to forecasts of variables such as visibility and ceiling, subject to discontinuous changes, are attributed to the persistence component of ABOM.
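The performance-based blending idea can be sketched as inverse-squared-error weighting of the two sources. The weighting rule and all numbers are illustrative, not the operational ABOM scheme.

```python
def blend(extrap, nwp, err_extrap, err_nwp):
    """Weight each source by the inverse of its recent squared error."""
    w_e = 1.0 / err_extrap ** 2
    w_n = 1.0 / err_nwp ** 2
    return (w_e * extrap + w_n * nwp) / (w_e + w_n)

# Recent history (hypothetical): the observation extrapolation has been
# twice as accurate as the NWP model, so it gets four times the weight.
nowcast = blend(extrap=2.0, nwp=4.0, err_extrap=0.5, err_nwp=1.0)
```

The blended value (2.4) sits much closer to the historically better source, and the weights adapt automatically as recent errors change.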
MAG4 versus alternative techniques for forecasting active region flare productivity.
Falconer, David A; Moore, Ronald L; Barghouty, Abdulnasser F; Khazanov, Igor
2014-05-01
MAG4 is a technique of forecasting an active region's rate of production of major flares in the coming few days from a free magnetic energy proxy. We present a statistical method of measuring the difference in performance between MAG4 and comparable alternative techniques that forecast an active region's major-flare productivity from alternative observed aspects of the active region. We demonstrate the method by measuring the difference in performance between the "Present MAG4" technique and each of three alternative techniques, called "McIntosh Active-Region Class," "Total Magnetic Flux," and "Next MAG4." We do this by using (1) the MAG4 database of magnetograms and major flare histories of sunspot active regions, (2) the NOAA table of the major-flare productivity of each of 60 McIntosh active-region classes of sunspot active regions, and (3) five technique performance metrics (Heidke Skill Score, True Skill Score, Percent Correct, Probability of Detection, and False Alarm Rate) evaluated from 2000 random two-by-two contingency tables obtained from the databases. We find that (1) Present MAG4 far outperforms both McIntosh Active-Region Class and Total Magnetic Flux, (2) Next MAG4 significantly outperforms Present MAG4, (3) the performance of Next MAG4 is insensitive to the forward and backward temporal windows used, in the range of one to a few days, and (4) forecasting from the free-energy proxy in combination with either any broad category of McIntosh active-region classes or any Mount Wilson active-region class gives no significant performance improvement over forecasting from the free-energy proxy alone (Present MAG4). Key Points: (1) Quantitative comparison of the performance of pairs of forecasting techniques. (2) Next MAG4 forecasts major flares more accurately than Present MAG4. (3) Present MAG4 outperforms McIntosh AR Class and total magnetic flux.
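The five verification metrics named above follow from a 2x2 contingency table of forecasts versus outcomes (hits a, false alarms b, misses c, correct negatives d), using their standard definitions; the example table is invented.

```python
def scores(a, b, c, d):
    """Standard verification scores from a 2x2 contingency table."""
    n = a + b + c + d
    pc = (a + d) / n                                 # Percent Correct
    pod = a / (a + c)                                # Probability of Detection
    far = b / (b + d)                                # False Alarm Rate (POFD)
    tss = pod - far                                  # True Skill Score
    e = ((a + b) * (a + c) + (c + d) * (b + d)) / n  # correct by chance
    hss = (a + d - e) / (n - e)                      # Heidke Skill Score
    return pc, pod, far, tss, hss

# Invented table: 30 hits, 10 false alarms, 5 misses, 55 correct negatives
pc, pod, far, tss, hss = scores(a=30, b=10, c=5, d=55)
```

TSS and HSS discount correct forecasts attributable to chance, which is why they, rather than Percent Correct alone, are used to rank techniques.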
An overview of health forecasting.
Soyiri, Ireneous N; Reidpath, Daniel D
2013-01-01
Health forecasting is a novel area of forecasting, and a valuable tool for predicting future health events or situations such as demands for health services and healthcare needs. It facilitates preventive medicine and health care intervention strategies, by pre-informing health service providers to take appropriate mitigating actions to minimize risks and manage demand. Health forecasting requires reliable data, information and appropriate analytical tools for the prediction of specific health conditions or situations. There is no single approach to health forecasting, and so various methods have often been adopted to forecast aggregate or specific health conditions. Meanwhile, there are no defined health forecasting horizons (time frames) to match the choices of health forecasting methods/approaches that are often applied. The key principles of health forecasting have also not been adequately described to guide the process. This paper provides a brief introduction and theoretical analysis of health forecasting. It describes the key issues that are important for health forecasting, including: definitions, principles of health forecasting, and the properties of health data, which influence the choices of health forecasting methods. Other matters related to the value of health forecasting, and the general challenges associated with developing and using health forecasting services are discussed. This overview is a stimulus for further discussions on standardizing health forecasting approaches and methods that will facilitate health care and health services delivery.
ERIC Educational Resources Information Center
Kvetan, Vladimir, Ed.
2014-01-01
Reliable and consistent time series are essential to any kind of economic forecasting. Skills forecasting needs to combine data from national accounts and labour force surveys, with the pan-European dimension of Cedefop's skills supply and demand forecasts, relying on different international classification standards. Sectoral classification (NACE)…
Time-varying loss forecast for an earthquake scenario in Basel, Switzerland
NASA Astrophysics Data System (ADS)
Herrmann, Marcus; Zechar, Jeremy D.; Wiemer, Stefan
2014-05-01
When an unexpected earthquake occurs, people suddenly want advice on how to cope with the situation. The 2009 L'Aquila earthquake highlighted the significance of public communication and pushed the use of scientific methods to drive alternative risk mitigation strategies. For instance, van Stiphout et al. (2010) suggested a new approach for objective short-term evacuation decisions: probabilistic risk forecasting combined with cost-benefit analysis. In the present work, we apply this approach to an earthquake sequence that simulated a repeat of the 1356 Basel earthquake, one of the most damaging events in Central Europe. A recent development to benefit society in case of an earthquake is the probabilistic forecasting of aftershock occurrence, but seismic risk delivers a more direct expression of the socio-economic impact. To forecast seismic risk on short time scales, we translate aftershock probabilities into time-varying seismic hazard and combine this with time-invariant loss estimation. Compared with van Stiphout et al. (2010), we use an advanced aftershock forecasting model and detailed settlement data, which allow spatial forecasts and settlement-specific decision-making. We quantify the risk forecast probabilistically in terms of human loss. For instance, one minute after the M6.6 mainshock, the probability for an individual to die within the next 24 hours is 41 000 times higher than the long-term average, but the absolute value remains small, at 0.04 %. The final cost-benefit analysis adds value beyond a pure statistical approach: it provides objective statements that may justify evacuations. To deliver supportive information in a simple form, we propose a warning approach in terms of alarm levels. Our results do not justify evacuations prior to the M6.6 mainshock, but do in certain districts afterwards.
The ability to forecast short-term seismic risk at any time, and with sufficient data anywhere, is the first step of personal decision-making and raising risk awareness among the public. Reference: Van Stiphout, T., S. Wiemer, and W. Marzocchi (2010). 'Are short-term evacuations warranted? Case of the 2009 L'Aquila earthquake'. Geophysical Research Letters 37(6), pp. 1-5. http://onlinelibrary.wiley.com/doi/10.1029/2009GL042352/abstract
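The cost-benefit rule behind such warnings can be sketched in one line: an action (e.g. an evacuation call) is justified when probability times avoided loss exceeds the cost of acting. All numbers below are illustrative, not the study's values.

```python
def alarm_justified(p_event, avoided_loss, action_cost):
    """Cost-loss rule: act when the expected avoided loss exceeds the cost."""
    return p_event * avoided_loss > action_cost

p_background = 1e-8                   # hypothetical long-term daily probability
p_elevated = 41_000 * p_background    # a post-mainshock gain like the one quoted

# With an illustrative avoided loss of 1e7 and an action cost of 1e3:
quiet = alarm_justified(p_background, 1e7, 1e3)   # background risk: no alarm
warn = alarm_justified(p_elevated, 1e7, 1e3)      # elevated risk: alarm
```

The same rule shows why a 41 000-fold probability increase can still leave the absolute risk too small to justify action in some districts: the decision depends on the product, not the ratio.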
A hybrid approach EMD-HW for short-term forecasting of daily stock market time series data
NASA Astrophysics Data System (ADS)
Awajan, Ahmad Mohd; Ismail, Mohd Tahir
2017-08-01
Recently, forecasting time series has attracted considerable attention in the field of analyzing financial time series data, specifically within the stock market index. Moreover, stock market forecasting is a challenging area of financial time-series forecasting. In this study, a hybrid methodology combining Empirical Mode Decomposition with the Holt-Winters method (EMD-HW) is used to improve forecasting performance in financial time series. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW has a relatively high accuracy and offers a new forecasting method for time series. The daily stock market time series data of 11 countries are used to show the forecasting performance of the proposed EMD-HW. Based on three forecast accuracy measures, the results indicate that the EMD-HW forecasting performance is superior to the traditional Holt-Winters forecasting method.
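The Holt-Winters half of the hybrid can be shown as a minimal additive-seasonal smoother; in EMD-HW each EMD component series would be forecast this way and the component forecasts summed. The initialisation and smoothing constants below are simplistic, for illustration only.

```python
def holt_winters_additive(y, m, alpha=0.3, beta=0.1, gamma=0.2, h=1):
    """Additive Holt-Winters: recursively update level, trend and m
    seasonal terms, then extrapolate h steps ahead."""
    level, trend = y[0], y[1] - y[0]          # crude initialisation
    season = [y[i] - y[0] for i in range(m)]
    for t, obs in enumerate(y):
        s = season[t % m]
        prev_level = level
        level = alpha * (obs - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % m] = gamma * (obs - level) + (1 - gamma) * s
    return level + h * trend + season[(len(y) + h - 1) % m]

# Sanity check: a constant series forecasts its own constant value
flat = holt_winters_additive([5.0] * 20, m=4)
```

Because each EMD component is smoother than the raw series, this simple recursion forecasts the components far better than it would the original data, which is the rationale for the hybrid.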
Impact of data assimilation on ocean current forecasts in the Angola Basin
NASA Astrophysics Data System (ADS)
Phillipson, Luke; Toumi, Ralf
2017-06-01
The ocean current predictability in the data-limited Angola Basin was investigated using the Regional Ocean Modelling System (ROMS) with four-dimensional variational data assimilation. Six experiments were undertaken comprising a baseline case of the assimilation of salinity/temperature profiles and satellite sea surface temperature, with the subsequent addition of altimetry, OSCAR (satellite-derived sea surface currents), drifters, altimetry and drifters combined, and OSCAR and drifters combined. The addition of drifters significantly improves Lagrangian predictability in comparison to the baseline case as well as the addition of either altimetry or OSCAR. OSCAR assimilation improves Lagrangian predictability only as much as altimetry assimilation does. On average the assimilation of either altimetry or OSCAR with drifter velocities does not significantly improve Lagrangian predictability compared to the drifter assimilation alone, even degrading predictability in some cases. When the forecast current speed is large, it is more likely that the combination improves trajectory forecasts. Conversely, when the currents are weaker, it is more likely that the combination degrades the trajectory forecast.
Dynamic and static initialization of a mesoscale model using VAS satellite data. M.S. Thesis
NASA Technical Reports Server (NTRS)
Beauchamp, James G.
1985-01-01
Various combinations of temperature and moisture data from the VISSR Atmospheric Sounder (VAS), conventional radiosonde data, and National Meteorological Center (NMC) global analysis, were used in a successive-correction type of objective-analysis procedure to produce analyses for 1200 GMT. The NMC global analyses served as the first-guess field for all of the objective analysis procedures. The first-guess field was enhanced by radiosonde data alone, VAS data alone, both radiosonde and VAS data, or by neither data source. In addition, two objective analyses were used in a dynamic initialization: one included only radiosonde data and the other used both radiosonde and VAS data. The dependence of 12 hour forecast skill on data type and the methods by which the data were used in the analysis/initialization were then investigated. This was done by comparison of forecast and observed fields of sea-level pressure, temperature, wind, moisture, and accumulated precipitation. The use of VAS data in the initial conditions had a slight positive impact upon forecast temperature and moisture but a negative impact upon forecast wind. This was true for both the static and dynamic initialization experiments. Precipitation forecasts from all of the model simulations were nearly the same.
NASA Astrophysics Data System (ADS)
Pan, Yujie; Xue, Ming; Zhu, Kefeng; Wang, Mingjun
2018-05-01
A dual-resolution (DR) version of a regional ensemble Kalman filter (EnKF)-3D ensemble variational (3DEnVar) coupled hybrid data assimilation system is implemented as a prototype for the operational Rapid Refresh forecasting system. The DR 3DEnVar system combines a high-resolution (HR) deterministic background forecast with lower-resolution (LR) EnKF ensemble perturbations used for flow-dependent background error covariance to produce a HR analysis. The computational cost is substantially reduced by running the ensemble forecasts and EnKF analyses at LR. The DR 3DEnVar system is tested with 3-h cycles over a 9-day period using a 40/~13-km grid spacing combination. The HR forecasts from the DR hybrid analyses are compared with forecasts launched from HR Gridpoint Statistical Interpolation (GSI) 3D variational (3DVar) analyses, and single LR hybrid analyses interpolated to the HR grid. With the DR 3DEnVar system, a 90% weight for the ensemble covariance yields the lowest forecast errors and the DR hybrid system clearly outperforms the HR GSI 3DVar. Humidity and wind forecasts are also better than those launched from interpolated LR hybrid analyses, but the temperature forecasts are slightly worse. The humidity forecasts are improved most. For precipitation forecasts, the DR 3DEnVar always outperforms HR GSI 3DVar. It also outperforms the LR 3DEnVar, except for the initial forecast period and lower thresholds.
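The hybrid-weighting idea in miniature: the background-error covariance used in a 3DEnVar analysis is a convex blend of a static matrix and an ensemble-derived one, here with the 90 % ensemble weight that performed best in the paper. The 2x2 matrices and perturbations are toy examples.

```python
import numpy as np

w = 0.9                                        # weight on the ensemble covariance
B_static = np.eye(2)                           # toy static background covariance

# Four zero-mean LR ensemble perturbations for a 2-variable state
perts = np.array([[0.5, -0.5, 0.2, -0.2],
                  [0.4, -0.4, 0.1, -0.1]])
B_ens = perts @ perts.T / (perts.shape[1] - 1) # sample (flow-dependent) covariance

B_hybrid = (1 - w) * B_static + w * B_ens      # convex blend used in the analysis
```

The static term keeps the blend full-rank and well-conditioned even when the small ensemble under-samples the true covariance, which is why a residual 10 % static weight is retained.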
A probabilistic neural network based approach for predicting the output power of wind turbines
NASA Astrophysics Data System (ADS)
Tabatabaei, Sajad
2017-03-01
Reliable tools for quantifying the uncertainty of wind power forecasts are increasingly needed as wind power penetration grows. Traditional models that generate only point forecasts are no longer sufficient. Thus, the present paper uses the concept of prediction intervals (PIs) to assess the uncertainty of wind power generation in power systems. Because the forecasting errors cannot be modelled properly by probability distribution functions, this paper uses a recently introduced non-parametric approach called lower upper bound estimation (LUBE) to build the PIs. In the proposed LUBE method, a PI combination-based fuzzy framework is used to overcome the performance instability of the neural networks (NNs) used in LUBE. In comparison to other methods, this formulation better satisfies the PI coverage probability and PI normalised average width (PINAW) criteria. Since this non-linear problem is highly complex, a new heuristic optimisation algorithm comprising a novel modification is introduced to solve it. Based on data sets taken from a wind farm in Australia, the feasibility and satisfying performance of the suggested method have been demonstrated.
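The two interval-quality measures mentioned above can be written down directly: PI coverage probability (the fraction of observations falling inside their intervals) and PINAW (average interval width, normalised by the observation range). The toy intervals are invented.

```python
def picp(lower, upper, obs):
    """PI coverage probability: fraction of observations inside the interval."""
    inside = sum(l <= o <= u for l, u, o in zip(lower, upper, obs))
    return inside / len(obs)

def pinaw(lower, upper, obs):
    """PI normalised average width: mean width over the observation range."""
    r = max(obs) - min(obs)
    return sum(u - l for l, u in zip(lower, upper)) / (len(obs) * r)

obs = [3.0, 5.0, 4.0, 6.6]
lower = [2.5, 4.0, 3.8, 5.5]
upper = [3.5, 5.5, 4.5, 6.5]
coverage = picp(lower, upper, obs)   # the last observation falls outside
width = pinaw(lower, upper, obs)
```

Good intervals need high coverage and low width simultaneously; optimising either alone is trivial (infinitely wide or infinitely narrow intervals), which is why LUBE-style methods treat the pair as one objective.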
NASA Astrophysics Data System (ADS)
Claverie, M.; Franch, B.; Vermote, E.; Becker-Reshef, I.; Justice, C. O.
2015-12-01
Wheat is one of the key cereal crops grown worldwide. Thus, accurate and timely forecasts of its production are critical for informing agricultural policies and investments, as well as increasing market efficiency and stability. Becker-Reshef et al. (2010) used an empirical generalized model for forecasting winter wheat production, combining BRDF-corrected daily surface reflectance from the Moderate Resolution Imaging Spectroradiometer (MODIS) Climate Modeling Grid (CMG) with detailed official crop statistics and crop type masks. It is based on the relationship between the Normalized Difference Vegetation Index (NDVI) at the peak of the growing season, percent wheat within the CMG pixel, and the final yields. This method predicts the yield approximately one month to six weeks prior to harvest. Recently, Franch et al. (2015) included Growing Degree Day (GDD) information extracted from NCEP/NCAR reanalysis data in order to improve the winter wheat production forecast by increasing the timeliness of the forecasts to between a month and a month and a half prior to the peak NDVI (i.e. 1-2.5 months prior to harvest), while conserving the accuracy of the original model. In this study, we apply these methods to historical data from the Advanced Very High Resolution Radiometer (AVHRR). We apply both the original and the modified model to the United States from 1990 to 2014 and inter-compare the AVHRR results with MODIS results from 2000 to 2014.
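The empirical core of such forecasts is a simple regression of final yield on peak-season NDVI. The sketch below fits a one-variable line (implicitly assuming a pure-wheat pixel, i.e. ignoring the percent-wheat term of the real model); all data pairs are invented.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of ys = slope * xs + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

ndvi = [0.55, 0.60, 0.65, 0.70]       # invented historical peak NDVI values
yield_t_ha = [2.8, 3.1, 3.4, 3.7]     # corresponding final yields (t/ha)

a, b = fit_line(ndvi, yield_t_ha)
forecast = a * 0.62 + b               # forecast from this season's peak NDVI
```

Because peak NDVI is observed weeks before harvest, the fitted line turns a mid-season satellite measurement into an early production forecast.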
NASA Astrophysics Data System (ADS)
Hernandez, F.; Liang, X.
2017-12-01
Reliable real-time hydrological forecasting, to predict important phenomena such as floods, is invaluable to society. However, modern high-resolution distributed models have faced challenges when dealing with uncertainties that are caused by the large number of parameters and initial state estimations involved. Therefore, to rely on these high-resolution models for critical real-time forecast applications, considerable improvements in the parameter and initial state estimation techniques must be made. In this work we present a unified data assimilation algorithm called Optimized PareTo Inverse Modeling through Inverse STochastic Search (OPTIMISTS) to deal with the challenge of achieving robust flood forecasting with high-resolution distributed models. This new algorithm combines the advantages of particle filters and variational methods in a unique way to overcome their individual weaknesses. The analysis of candidate particles compares model results with observations in a flexible time frame, and a multi-objective approach is proposed which attempts to simultaneously minimize differences with the observations and departures from the background states by using both Bayesian sampling and non-convex evolutionary optimization. Moreover, the resulting Pareto front is given a probabilistic interpretation through kernel density estimation to create a non-Gaussian distribution of the states. OPTIMISTS was tested on a low-resolution distributed land surface model using VIC (Variable Infiltration Capacity) and on a high-resolution distributed hydrological model using the DHSVM (Distributed Hydrology Soil Vegetation Model). In these tests, streamflow observations are assimilated. OPTIMISTS was also compared with a traditional particle filter and a variational method.
Results show that our method can reliably produce adequate forecasts and that it is able to outperform those resulting from assimilating the observations using a particle filter or an evolutionary 4D variational method alone. In addition, our method is shown to be efficient in tackling high-resolution applications with robust results.
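The multi-objective selection step described above can be shown in miniature as a non-dominated filter: keep every candidate state that no other candidate beats on both objectives (observation misfit and background departure) at once. The objective values are invented.

```python
def pareto_front(points):
    """Return the non-dominated subset of (obs_misfit, bg_departure) pairs."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical candidates: (observation misfit, background departure)
candidates = [(0.2, 0.9), (0.4, 0.4), (0.9, 0.1), (0.8, 0.8)]
front = pareto_front(candidates)   # (0.8, 0.8) is dominated by (0.4, 0.4)
```

Retaining the whole front, rather than a single cost-minimising state, is what lets the algorithm later build a non-Gaussian state distribution over it.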
How do I know if I’ve improved my continental scale flood early warning system?
NASA Astrophysics Data System (ADS)
Cloke, Hannah L.; Pappenberger, Florian; Smith, Paul J.; Wetterhall, Fredrik
2017-04-01
Flood early warning systems mitigate damages and loss of life and are an economically efficient way of enhancing disaster resilience. The use of continental scale flood early warning systems is rapidly growing. The European Flood Awareness System (EFAS) is a pan-European flood early warning system forced by a multi-model ensemble of numerical weather predictions. Responses to scientific and technical changes can be complex in these computationally expensive continental scale systems, and improvements need to be tested by evaluating runs of the whole system. It is demonstrated here that forecast skill is not correlated with the value of warnings. In order to tell if the system has been improved, an evaluation strategy is required that considers both forecast skill and warning value. The combination of a multi-forcing ensemble of EFAS flood forecasts is evaluated with a new skill-value strategy. The full multi-forcing ensemble is recommended for operational forecasting, but there are spatial variations in the optimal forecast combination. Results indicate that optimizing forecasts based on value rather than skill alters the optimal forcing combination and the forecast performance. Also indicated is that model diversity and ensemble size are both important in achieving best overall performance. The use of several evaluation measures that consider both skill and value is strongly recommended when considering improvements to early warning systems.
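The distinction between skill and value can be made concrete with the standard cost-loss "relative economic value" calculation: the same contingency table can score well on skill yet deliver little value for a given user's cost/loss ratio. The table entries and the cost/loss ratio are illustrative.

```python
def relative_value(hits, misses, false_alarms, quiet, cost_loss):
    """Relative economic value of warnings for a user with a given
    cost/loss ratio: 1 = perfect forecasts, 0 = no better than climatology."""
    n = hits + misses + false_alarms + quiet
    clim = (hits + misses) / n                        # event base rate
    e_forecast = (hits + false_alarms) / n * cost_loss + misses / n
    e_perfect = clim * cost_loss                      # protect only when needed
    e_climate = min(cost_loss, clim)                  # always/never protect
    return (e_climate - e_forecast) / (e_climate - e_perfect)

# Invented warning record, for a user whose protection cost is 20% of the loss
value = relative_value(hits=30, misses=5, false_alarms=10, quiet=55, cost_loss=0.2)
```

Because `cost_loss` varies between users, the warning combination that maximises value can differ from the one that maximises a skill score, which is the paper's central point.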
Comparison of Adaline and Multiple Linear Regression Methods for Rainfall Forecasting
NASA Astrophysics Data System (ADS)
Sutawinaya, IP; Astawa, INGA; Hariyanti, NKD
2018-01-01
Heavy rainfall can cause disasters, so forecasts of rainfall intensity are needed. The main cause of flooding is high rainfall intensity that pushes a river beyond its capacity, flooding the surrounding area. Because rainfall is a dynamic factor, it is an interesting subject of study. Methods to support rainfall forecasting range from artificial intelligence (AI) to statistics. In this research, we used Adaline as the AI method and multiple linear regression as the statistical method. The method producing the more accurate forecasts is judged the better one for rainfall forecasting; through this comparison we determine which of the two is best suited to rainfall forecasting here.
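Adaline's core is the LMS (delta-rule) update: a linear unit whose weights move a small step against the error after each sample. The toy data (an exactly linear relation) and learning rate are illustrative, not the study's rainfall inputs.

```python
def train_adaline(X, y, lr=0.01, epochs=500):
    """Adaline / LMS: per-sample delta-rule updates of a linear unit."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            out = sum(wj * xj for wj, xj in zip(w, xi)) + b   # linear output
            err = target - out
            w = [wj + lr * err * xj for wj, xj in zip(w, xi)] # delta rule
            b += lr * err
    return w, b

X = [[1.0], [2.0], [3.0], [4.0]]
y = [3.0, 5.0, 7.0, 9.0]          # underlying relation: y = 2x + 1
w, b = train_adaline(X, y)
```

On noise-free linear data, LMS converges to the same solution as the least-squares regression it is compared against; the two methods differ mainly in how they reach it and how they behave on noisy, streaming data.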
A stacking ensemble learning framework for annual river ice breakup dates
NASA Astrophysics Data System (ADS)
Sun, Wei; Trevor, Bernard
2018-06-01
River ice breakup dates (BDs) are not merely a proxy indicator of climate variability and change, but a direct concern in the management of local ice-caused flooding. A framework of stacking ensemble learning for annual river ice BDs was developed, which included two levels of components: member and combining models. The member models described the relations between BD and its affecting indicators; the combining models linked the BD predicted by each member model with the observed BD. Specifically, Bayesian regularization back-propagation artificial neural networks (BRANN) and adaptive neuro-fuzzy inference systems (ANFIS) were employed as both member and combining models. The candidate combining models also included the simple average method (SAM). The input variables for the member models were selected by a hybrid filter-and-wrapper method. The performances of these models were examined using leave-one-out cross validation. As the largest unregulated river in Alberta, Canada, with ice jams frequently occurring in the vicinity of Fort McMurray, the Athabasca River at Fort McMurray was selected as the study area. The breakup dates and candidate affecting indicators for 1980-2015 were collected. The results showed that the BRANN member models generally outperformed the ANFIS member models in terms of better performances and simpler structures. The difference between the R and MI rankings of inputs in the optimal member models may imply that the linear-correlation-based filter method is feasible for generating a range of candidate inputs for further screening through other wrapper or embedded IVS methods. The SAM and BRANN combining models generally outperformed all member models. The optimal SAM combining model combined two BRANN member models and improved upon them in terms of average squared errors by 14.6% and 18.1% respectively.
In this study, stacking ensemble learning was applied for the first time to the forecasting of river ice breakup dates, and the approach appears promising for other river ice forecasting problems.
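The two-level idea can be sketched in a few lines. This is a hedged illustration only: the paper's member models are BRANN and ANFIS networks trained on affecting indicators, while here two trivial stand-in predictors (persistence and a moving average) are combined with the simple average method (SAM); the breakup-day values are hypothetical.

```python
# Minimal two-level stacking sketch: member models produce individual
# forecasts, a combining model (here SAM) merges them.

def member_persistence(history):
    """Member model 1 (stand-in): predict the last observed value."""
    return history[-1]

def member_mean(history, window=3):
    """Member model 2 (stand-in): mean of the last `window` values."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def combine_sam(predictions):
    """Simple average method (SAM) combining model."""
    return sum(predictions) / len(predictions)

def forecast(history):
    preds = [member_persistence(history), member_mean(history)]
    return combine_sam(preds)

bd_history = [110, 112, 108, 115, 111]   # hypothetical breakup day-of-year
print(forecast(bd_history))
```

In the paper the combining step is itself a trained model (BRANN or ANFIS) fitted to the member predictions under leave-one-out cross validation; SAM is the simplest member of that candidate set.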
Smart sensorless prediction diagnosis of electric drives
NASA Astrophysics Data System (ADS)
Kruglova, TN; Glebov, NA; Shoshiashvili, ME
2017-10-01
In this paper, a method for diagnosis and prediction of the technical condition of an electric motor using an artificial intelligence approach, based on the combination of fuzzy logic and neural networks, is discussed. The fuzzy sub-model determines the degree of development of each fault. The neural network determines the state of the object as a whole and the number of serviceable work periods of the motor actuator. The combination of advanced techniques reduces the learning time and increases the forecasting accuracy. The experimental implementation of the method for diagnosis of an electric drive and associated equipment was carried out at different speeds. As a result, it was found that this method allows troubleshooting of the drive at any given speed.
NASA Astrophysics Data System (ADS)
Dreano, Denis; Tsiaras, Kostas; Triantafyllou, George; Hoteit, Ibrahim
2017-07-01
Forecasting the state of large marine ecosystems is important for many economic and public health applications. However, advanced three-dimensional (3D) ecosystem models, such as the European Regional Seas Ecosystem Model (ERSEM), are computationally expensive, especially when implemented within an ensemble data assimilation system requiring several parallel integrations. As an alternative to 3D ecological forecasting systems, we propose to implement a set of regional one-dimensional (1D) water-column ecological models that run at a fraction of the computational cost. The 1D model domains are determined using a Gaussian mixture model (GMM)-based clustering method and satellite chlorophyll-a (Chl-a) data. Regionally averaged Chl-a data is assimilated into the 1D models using the singular evolutive interpolated Kalman (SEIK) filter. To laterally exchange information between subregions and improve the forecasting skill, we introduce a new correction step to the assimilation scheme, in which we assimilate a statistical forecast of future Chl-a observations based on information from neighbouring regions. We apply this approach to the Red Sea and show that the assimilative 1D ecological models can forecast surface Chl-a concentration with high accuracy. The statistical assimilation step further improves the forecasting skill by as much as 50%. This general approach of clustering large marine areas and running several interacting 1D ecological models is very flexible. It allows many combinations of clustering, filtering and regression techniques to be used and can be applied to build efficient forecasting systems in other large marine ecosystems.
Rixen, M.; Ferreira-Coelho, E.; Signell, R.
2008-01-01
Despite numerous and regular improvements in underlying models, surface drift prediction in the ocean remains a challenging task because of our yet limited understanding of all processes involved. Hence, deterministic approaches to the problem are often limited by empirical assumptions on underlying physics. Multi-model hyper-ensemble forecasts, which exploit the power of an optimal local combination of available information including ocean, atmospheric and wave models, may show superior forecasting skills when compared to individual models because they allow for local correction and/or bias removal. In this work, we explore in greater detail the potential and limitations of the hyper-ensemble method in the Adriatic Sea, using a comprehensive surface drifter database. The performance of the hyper-ensembles and the individual models are discussed by analyzing associated uncertainties and probability distribution maps. Results suggest that the stochastic method may reduce position errors significantly for 12- to 72-h forecasts and hence compete with pure deterministic approaches. © 2007 NATO Undersea Research Centre (NURC).
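The "optimal local combination" behind a hyper-ensemble can be illustrated with a least-squares fit of combination weights against observations. The sketch below uses entirely synthetic numbers: the real system blends ocean, atmospheric and wave model output against drifter velocities, and the column layout here (three hypothetical models) is an assumption for illustration.

```python
# Hyper-ensemble sketch: find weights w minimizing ||models @ w - obs||^2
# over a local training window, then apply them as the combined forecast.
import numpy as np

obs = np.array([0.21, 0.25, 0.30, 0.27, 0.24])   # observed drift speed (m/s)
models = np.array([                               # columns: three models
    [0.20, 0.30, 0.18],
    [0.24, 0.33, 0.22],
    [0.29, 0.40, 0.26],
    [0.26, 0.35, 0.25],
    [0.23, 0.31, 0.21],
])

# least-squares combination weights over the training window
weights, *_ = np.linalg.lstsq(models, obs, rcond=None)
hyper = models @ weights                          # combined local forecast
print(weights.round(3))
```

By construction the least-squares combination fits the training window at least as well (in squared error) as any single model, which is the source of the local correction and bias removal noted in the abstract.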
Combining SVM and flame radiation to forecast BOF end-point
NASA Astrophysics Data System (ADS)
Wen, Hongyuan; Zhao, Qi; Xu, Lingfei; Zhou, Munchun; Chen, Yanru
2009-05-01
Because of the complex reactions in the Basic Oxygen Furnace (BOF) for steelmaking, the main end-point control methods of steelmaking face insurmountable difficulties. Aiming at these problems, a support vector machine (SVM) method for forecasting the BOF steelmaking end-point is presented based on flame radiation information. The basis is that the furnace flame reflects the carbon-oxygen reaction, which is the major reaction in the steelmaking furnace. The system can acquire spectrum and image data quickly in the adverse steelmaking environment. The structures of the SVM and the multilayer feed-forward neural network are similar, but the SVM model can overcome the inherent defects of the latter. The model is trained and used for forecasting with SVM and appropriate variables of light and image characteristic information. The model training process follows the structural risk minimization (SRM) criterion, and the design parameters can be adjusted automatically according to the sampled data during training. Experimental results indicate that the prediction precision of the SVM model and its execution time both meet the requirements of online end-point judgment.
An Integrated Urban Flood Analysis System in South Korea
NASA Astrophysics Data System (ADS)
Moon, Young-Il; Kim, Min-Seok; Yoon, Tae-Hyung; Choi, Ji-Hyeok
2017-04-01
Due to climate change and the rapid growth of urbanization, the increasing frequency of concentrated heavy rainfall has caused urban floods. We therefore studied climate change in Korea and developed an integrated flood analysis system that systematizes technology to quantify flood risk and to forecast floods in urban areas. As part of the measures to deal with increasing inland flood damage, it is necessary to build a systematic urban flood prevention system that considers both inland and river water. This combined inland-river flood analysis system predicts flash rain or short-term rainfall using radar and satellite information and performs prompt and accurate prediction of inland flooded areas. Flood forecasts should be both accurate and immediate: accurate forecasts mean that the predicted watch and warning times and water levels are precise, while immediate forecasts provide the lead time needed to evacuate. Therefore, in this study, in order to apply a rainfall-runoff method to medium and small urban streams for flood forecasting, short-term rainfall forecasting using radar is applied to improve immediacy. Finally, the system supports synthetic decision-making for flood disaster prevention through real-time monitoring. Keywords: Urban Flood, Integrated flood analysis system, Rainfall forecasting, Korea Acknowledgments This research was supported by a grant (16AWMP-B066744-04) from the Advanced Water Management Research Program (AWMP) funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
NASA Astrophysics Data System (ADS)
Vermote, E.; Franch, B.; Becker-Reshef, I.; Claverie, M.; Huang, J.; Zhang, J.; Sobrino, J. A.
2014-12-01
Wheat is the most important cereal crop traded on international markets and winter wheat constitutes approximately 80% of global wheat production. Thus, accurate and timely forecasts of its production are critical for informing agricultural policies and investments, as well as increasing market efficiency and stability. Becker-Reshef et al. (2010) used an empirical generalized model for forecasting winter wheat production. Their approach combined BRDF-corrected daily surface reflectance from the Moderate Resolution Imaging Spectroradiometer (MODIS) Climate Modeling Grid (CMG) with detailed official crop statistics and crop type masks. It is based on the relationship between the Normalized Difference Vegetation Index (NDVI) at the peak of the growing season, percent wheat within the CMG pixel, and the final yields. This method predicts the yield approximately one month to six weeks prior to harvest. In this study, we include Growing Degree Day (GDD) information extracted from NCEP/NCAR reanalysis data in order to improve the winter wheat production forecast by increasing the timeliness of the forecasts while conserving the accuracy of the original model. We apply this modified model to three major wheat-producing countries: the United States of America, Ukraine and China, from 2001 to 2012. We show that a reliable forecast can be made one month to a month and a half prior to the peak NDVI (meaning two to two and a half months prior to harvest) while conserving an accuracy of 10% in the production forecast.
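The Growing Degree Day quantity used above is a simple thermal-time accumulation. A minimal sketch follows; the base temperature of 0 °C is a common choice for winter wheat but is an assumption here, and the temperatures are hypothetical.

```python
# Accumulating growing degree days (GDD) from daily max/min temperatures,
# the quantity extracted from reanalysis data in the study above.

def daily_gdd(t_max, t_min, t_base=0.0):
    """Degree days contributed by one day (clamped at zero)."""
    return max(0.0, (t_max + t_min) / 2.0 - t_base)

def accumulated_gdd(daily_temps, t_base=0.0):
    """Sum daily GDD over a sequence of (t_max, t_min) pairs in deg C."""
    return sum(daily_gdd(tmax, tmin, t_base) for tmax, tmin in daily_temps)

temps = [(12.0, 2.0), (15.0, 5.0), (8.0, -4.0)]
print(accumulated_gdd(temps))  # 7.0 + 10.0 + 2.0 = 19.0
```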
Olshansky, S Jay; Goldman, Dana P; Zheng, Yuhui; Rowe, John W
2009-01-01
Context: The aging of the baby boom generation, the extension of life, and progressive increases in disability-free life expectancy have generated a dramatic demographic transition in the United States. Official government forecasts may, however, have inadvertently underestimated life expectancy, which would have major policy implications, since small differences in forecasts of life expectancy produce very large differences in the number of people surviving to an older age. This article presents a new set of population and life expectancy forecasts for the United States, focusing on transitions that will take place by midcentury. Methods: Forecasts were made with a cohort-components methodology, based on the premise that the risk of death will be influenced in the coming decades by accelerated advances in biomedical technology that either delay the onset and age progression of major fatal diseases or that slow the aging process itself. Findings: Results indicate that the current forecasts of the U.S. Social Security Administration and U.S. Census Bureau may underestimate the rise in life expectancy at birth for men and women combined, by 2050, from 3.1 to 7.9 years. Conclusions: The cumulative outlays for Medicare and Social Security could be higher by $3.2 to $8.3 trillion relative to current government forecasts. This article discusses the implications of these results regarding the benefits and costs of an aging society and the prospect that health disparities could attenuate some of these changes. PMID:20021588
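The survival step of a cohort-components projection can be sketched briefly. This is a hedged, simplified stand-in: the survival probabilities and cohort sizes are hypothetical, and the article's method additionally models births, migration, and biomedically driven shifts in mortality risk.

```python
# One-year survival step of a cohort-components projection: each
# single-year age cohort is aged forward using its survival probability.

def project_one_year(population, survival):
    """Age each cohort by one year, applying its survival probability."""
    aged = [0.0] * len(population)
    for age in range(len(population) - 1):
        aged[age + 1] = population[age] * survival[age]
    # the last group is open-ended (e.g. 90+) and keeps its own survivors
    aged[-1] += population[-1] * survival[-1]
    return aged

pop = [100.0, 98.0, 95.0, 60.0]    # hypothetical cohorts: ages 87, 88, 89, 90+
surv = [0.90, 0.88, 0.85, 0.70]    # hypothetical one-year survival probabilities
print(project_one_year(pop, surv))
```

Iterating this step to midcentury, with survival probabilities raised to reflect accelerated biomedical advances, is what drives the higher life-expectancy forecasts described above.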
Using Time-Series Regression to Predict Academic Library Circulations.
ERIC Educational Resources Information Center
Brooks, Terrence A.
1984-01-01
Four methods were used to forecast monthly circulation totals in 15 midwestern academic libraries: dummy time-series regression, lagged time-series regression, simple average (straight-line forecasting), monthly average (naive forecasting). In tests of forecasting accuracy, dummy regression method and monthly mean method exhibited smallest average…
Teodoro, Douglas; Lovis, Christian
2013-01-01
Background Antibiotic resistance is a major worldwide public health concern. In clinical settings, timely antibiotic resistance information is key for care providers as it allows appropriate targeted treatment or improved empirical treatment when the specific results of the patient are not yet available. Objective To improve antibiotic resistance trend analysis algorithms by building a novel, fully data-driven forecasting method from the combination of trend extraction and machine learning models for enhanced biosurveillance systems. Methods We investigate a robust model for extraction and forecasting of antibiotic resistance trends using a decade of microbiology data. Our method consists of breaking down the resistance time series into independent oscillatory components via the empirical mode decomposition technique. The resulting waveforms describing intrinsic resistance trends serve as the input for the forecasting algorithm. The algorithm applies the delay coordinate embedding theorem together with the k-nearest neighbor framework to project mappings from past events into the future dimension and estimate the resistance levels. Results The algorithms that decompose the resistance time series and filter out high frequency components showed statistically significant performance improvements in comparison with a benchmark random walk model. We present further qualitative use-cases of antibiotic resistance trend extraction, where empirical mode decomposition was applied to highlight the specificities of the resistance trends. Conclusion The decomposition of the raw signal was found not only to yield valuable insight into the resistance evolution, but also to produce novel models of resistance forecasters with boosted prediction performance, which could be utilized as a complementary method in the analysis of antibiotic resistance trends. PMID:23637796
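The delay-coordinate embedding plus k-nearest-neighbour step can be sketched compactly. The embedding dimension, delay, k, and resistance values below are illustrative assumptions, not the paper's fitted parameters, and the EMD pre-processing step is omitted.

```python
# Delay-coordinate embedding + k-NN forecasting sketch: find past states
# similar to the current one and average their successors.

def embed(series, dim, delay):
    """Build delay vectors x_t = (s_t, s_{t-delay}, ..., s_{t-(dim-1)*delay})."""
    start = (dim - 1) * delay
    return [
        tuple(series[t - j * delay] for j in range(dim))
        for t in range(start, len(series))
    ]

def knn_forecast(series, dim=2, delay=1, k=2):
    """Predict the next value from successors of the k nearest past states."""
    vectors = embed(series, dim, delay)
    query = vectors[-1]
    # candidate states must have a known successor, so exclude the last one
    candidates = list(enumerate(vectors[:-1]))
    candidates.sort(key=lambda iv: sum((a - b) ** 2 for a, b in zip(iv[1], query)))
    start = (dim - 1) * delay
    successors = [series[start + i + 1] for i, _ in candidates[:k]]
    return sum(successors) / len(successors)

resistance = [0.10, 0.12, 0.11, 0.14, 0.13, 0.15, 0.14]  # hypothetical levels
print(round(knn_forecast(resistance), 3))
```

In the paper this projection is applied to each intrinsic mode extracted by empirical mode decomposition rather than to the raw resistance series.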
Wang, Dong; Borthwick, Alistair G; He, Handan; Wang, Yuankun; Zhu, Jieyu; Lu, Yuan; Xu, Pengcheng; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin
2018-01-01
Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, wavelet de-noising (WD) and Rank-Set Pair Analysis (RSPA), that takes full advantage of a combination of the two approaches to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influence at some representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs) (BP-error Back Propagation, MLP-Multilayer Perceptron and RBF-Radial Basis Function), and RSPA alone. Nine error metrics are used to evaluate the model performance. Compared to the three other generic methods, the results generated by the WD-RSPA model consistently presented smaller error measures, meaning the forecasting capability of the WD-RSPA model is better than that of the other models. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various generic methods compared in this paper, even when extreme events are included within a time series. Copyright © 2017 Elsevier Inc. All rights reserved.
An interdisciplinary approach for earthquake modelling and forecasting
NASA Astrophysics Data System (ADS)
Han, P.; Zhuang, J.; Hattori, K.; Ogata, Y.
2016-12-01
Earthquakes are among the most serious disasters, causing heavy casualties and economic losses. Especially in the past two decades, huge/mega earthquakes have hit many countries. Effective earthquake forecasting (including time, location, and magnitude) becomes extremely important and urgent. To date, various heuristically derived algorithms have been developed for forecasting earthquakes. Generally, they can be classified into two types: catalog-based approaches and non-catalog-based approaches. Thanks to the rapid development of statistical seismology in the past 30 years, we are now able to evaluate the performance of these earthquake forecast approaches quantitatively. Although a certain amount of precursory information is available in both earthquake catalogs and non-catalog observations, earthquake forecasting is still far from satisfactory. In most cases, precursory phenomena have been studied individually. An earthquake model that combines self-exciting and mutually exciting elements was developed by Ogata and Utsu from the Hawkes process. The core idea of this combined model is that the status of the event at present is controlled by the event itself (self-exciting) and all the external factors (mutually exciting) in the past. In essence, the conditional intensity function is a time-varying Poisson process with rate λ(t), which is composed of the background rate, the self-exciting term (the information from past seismic events), and the external excitation term (the information from past non-seismic observations). This model shows us a way to integrate the catalog-based forecast and non-catalog-based forecast. Against this background, we are trying to develop a new earthquake forecast model which combines catalog-based and non-catalog-based approaches.
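The conditional intensity described in the abstract can be written out; the notation below is a sketch consistent with that description (background rate, past seismic event times t_i, past non-seismic observation times τ_j), with kernel functions g and h left unspecified:

```latex
\lambda(t) = \mu
  + \sum_{i:\, t_i < t} g(t - t_i)
  + \sum_{j:\, \tau_j < t} h(t - \tau_j)
```

Here μ is the background rate, the first sum is the self-exciting (Hawkes) term over past earthquakes, and the second sum is the mutually exciting term over past non-seismic observations, which is what allows catalog-based and non-catalog-based information to enter one forecast model.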
Time series modeling and forecasting using memetic algorithms for regime-switching models.
Bergmeir, Christoph; Triguero, Isaac; Molina, Daniel; Aznarte, José Luis; Benitez, José Manuel
2012-11-01
In this brief, we present a novel model fitting procedure for the neuro-coefficient smooth transition autoregressive model (NCSTAR), as presented by Medeiros and Veiga. The model is endowed with a statistically founded iterative building procedure and can be interpreted in terms of fuzzy rule-based systems. The interpretability of the generated models and a mathematically sound building procedure are two very important properties of forecasting models. The model fitting procedure employed by the original NCSTAR is a combination of initial parameter estimation by a grid search procedure with a traditional local search algorithm. We propose a different fitting procedure, using a memetic algorithm, in order to obtain more accurate models. An empirical evaluation of the method is performed, applying it to various real-world time series originating from three forecasting competitions. The results indicate that we can significantly enhance the accuracy of the models, making them competitive to models commonly used in the field.
Forecasting stochastic neural network based on financial empirical mode decomposition.
Wang, Jie; Wang, Jun
2017-06-01
In an attempt to improve the forecasting accuracy of stock price fluctuations, a new one-step-ahead model is developed in this paper which combines empirical mode decomposition (EMD) with a stochastic time strength neural network (STNN). EMD is a processing technique introduced to extract all the oscillatory modes embedded in a series, and the STNN model is established to account for the weight of the occurrence time of the historical data. Linear regression is used to assess the predictive ability of the proposed model, and the effectiveness of EMD-STNN is clearly revealed by comparing its predictions with those of traditional models. Moreover, a new evaluation method (q-order multiscale complexity invariant distance) is applied to measure the predicted results of real stock index series, and the empirical results show that the proposed model indeed displays good performance in forecasting stock market fluctuations. Copyright © 2017 Elsevier Ltd. All rights reserved.
A practitioner's tool for assessing glide crack activity
Hendrikx, Jordy; Peitzsch, Erich H.; Fagre, Daniel B.
2010-01-01
Glide cracks can result in full-depth glide avalanche release. Avalanches from glide cracks are notoriously difficult to forecast, but are a reoccurring problem in a number of different avalanche forecasting programs across a range of snow climates. Despite this, there is no consensus for how to best manage, mitigate, or even observe glide cracks and the potential resultant avalanche activity. It is thought that an increase in the rate of snow gliding occurs prior to full-depth avalanche activity, so frequent measuring of glide crack movement provides an index of instability. Therefore, a comprehensive avalanche program with glide crack avalanche activity, should at the least, undertake some form of direct monitoring of glide crack movement. In this paper we present a simple, cheap and repeatable method to track glide crack activity using a series of stakes, reflectors and a laser rangefinder (LaserTech TruPulse360B) linked to a GPS (Trimble Geo XH). We tested the methodology in April 2010, on a glide crack above the Going to the Sun Road in Glacier National Park, Montana, USA. This study suggests a new method to better track the development and movement of glide cracks. It is hoped that by introducing a workable method to easily record glide crack movement, avalanche forecasters will improve their understanding of when, or if, avalanche activity will ensue. Our initial results suggest that these new observations, when combined with local micrometeorological data will result in improved process understanding and forecasting of these phenomena.
Forecasting approaches to the Mekong River
NASA Astrophysics Data System (ADS)
Plate, E. J.
2009-04-01
Hydrologists distinguish between flood forecasts, which are concerned with events of the immediate future, and flood predictions, which are concerned with events that are possible, but whose date of occurrence is not determined. Although in principle both involve the determination of runoff from rainfall, the analytical approaches differ because of different objectives. The differences between the two approaches will be discussed, starting with an analysis of the forecasting process. The Mekong River in south-east Asia is used as an example. Prediction is defined as forecast for a hypothetical event, such as the 100-year flood, which is usually sufficiently specified by its magnitude and its probability of occurrence. It forms the basis for designing flood protection structures and risk management activities. The method for determining these quantities is hydrological modeling combined with extreme value statistics, today usually applied both to rainfall events and to observed river discharges. A rainfall-runoff model converts extreme rainfall events into extreme discharges, which at certain gage points along a river are calibrated against observed discharges. The quality of the model output is assessed against the mean value by means of the Nash-Sutcliffe quality criterion. The result of this procedure is a design hydrograph (or a family of design hydrographs) which are used as inputs into a hydraulic model, which converts the hydrograph into design water levels according to the hydraulic situation of the location. The accuracy of making a prediction in this sense is not particularly high: hydrologists know that the 100-year flood is a statistical quantity which can be estimated only within comparatively wide error bounds, and the hydraulics of a river site, in particular under conditions of heavy sediment loads has many uncertainties. Safety margins, such as additional freeboards are arranged to compensate for the uncertainty of the prediction. 
Forecasts, on the other hand, aim to obtain an accurate hydrograph of the near future. The method by which this is done is less important than the accuracy of the forecast. A mathematical rainfall-runoff model is not necessarily a good forecast model: it has to be very carefully designed, and in many cases statistical models are found to give better results than mathematical models. Forecasters have the advantage of knowing the course of the hydrographs up to the point in time at which forecasts have to be made. Therefore, models can be calibrated online against the hydrograph of the immediate past. To assess the quality of a forecast, the quality criterion should not be based on the mean value, as the Nash-Sutcliffe criterion is, but on the best forecast given the information available up to the forecast time. Without any additional information, the best forecast when only the present-day value is known is to assume a no-change scenario, i.e. to assume that the present value does not change in the immediate future. For the Mekong there exists a forecasting system based on a rainfall-runoff model operated by the Mekong River Commission. This model is found not to be adequate for forecast periods longer than one or two days ahead. Improvements are sought through two approaches: a strictly deterministic rainfall-runoff model, and a strictly statistical model based on regression with upstream stations. The two approaches are compared, and suggestions are made on how best to combine their advantages. This requires that due consideration is given to critical hydraulic conditions of the river at and between the gauging stations. Critical situations occur in two ways: when the river overtops its banks, in which case the rainfall-runoff model is incomplete unless overflow losses are considered, and at confluences with tributaries.
Of particular importance is the role of the large Tonle Sap Lake, which dampens the hydrograph downstream of Phnom Penh. The effect of these components of river hydraulics on forecasting accuracy will be assessed.
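The distinction drawn above between the Nash-Sutcliffe criterion (benchmarked on the observed mean) and a no-change baseline can be made concrete. The discharge values below are hypothetical; the point is only that a persistence forecast sets a tougher reference than the mean.

```python
# Nash-Sutcliffe efficiency, plus a no-change (persistence) baseline that
# a forecast model should beat to be useful, as argued above.

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

obs = [100.0, 150.0, 200.0, 150.0, 100.0]   # observed discharge (m3/s)
sim = [110.0, 140.0, 190.0, 160.0, 105.0]   # hypothetical model forecast
persistence = [obs[0]] + obs[:-1]            # no-change forecast of each step

print(nash_sutcliffe(obs, sim))          # near 1: fits the observations well
print(nash_sutcliffe(obs, persistence))  # the baseline the model must beat
```

A model can score a respectable NSE against the mean yet fail to improve on simple persistence during a rising flood wave, which is the author's argument for forecast-time-aware quality criteria.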
NASA Astrophysics Data System (ADS)
Sinsky, E.; Zhu, Y.; Li, W.; Guan, H.; Melhauser, C.
2017-12-01
Optimal forecast quality is crucial for the preservation of life and property. Improving monthly forecast performance over both the tropics and extra-tropics requires attention to various physical aspects such as the representation of the underlying SST, model physics and the representation of model physics uncertainty in an ensemble forecast system. This work focuses on the impact of stochastic physics, SST and the convection scheme on forecast performance at the sub-seasonal scale over the tropics and extra-tropics, with emphasis on the Madden-Julian Oscillation (MJO). A 2-year period is evaluated using the National Centers for Environmental Prediction (NCEP) Global Ensemble Forecast System (GEFS). Three experiments with different configurations than the operational GEFS were performed to illustrate the impact of the stochastic physics, SST and convection scheme. These experiments are compared against a control experiment (CTL), which consists of the operational GEFS with its integration extended from 16 to 35 days. The three configurations are: 1) SPs, which uses Stochastically Perturbed Physics Tendencies (SPPT), Stochastic Perturbed Humidity (SHUM) and Stochastic Kinetic Energy Backscatter (SKEB); 2) SPs+SST_bc, which uses a combination of SPs and a bias-corrected forecast SST from the NCEP Climate Forecast System Version 2 (CFSv2); and 3) SPs+SST_bc+SA_CV, which combines SPs, a bias-corrected forecast SST and a scale-aware convection scheme. Compared with the CTL experiment, SPs shows substantial improvement. The MJO skill improved by about 4 lead days over the 2-year period. Improvement is also seen over the extra-tropics due to the updated stochastic physics, with a 3.1% and a 4.2% improvement during weeks 3 and 4 over the northern and southern hemispheres, respectively. Improvement is also seen when the bias-corrected CFSv2 SST is combined with SPs.
Additionally, forecast performance improves when the scale-aware convection scheme (SPs+SST_bc+SA_CV) is added, especially over the tropics. Among the three experiments, SPs+SST_bc+SA_CV is the best configuration in terms of MJO forecast skill.
Karpušenkaitė, Aistė; Ruzgas, Tomas; Denafas, Gintaras
2018-05-01
The aim of the study was to create a hybrid forecasting method that could produce higher-accuracy forecasts than previously used 'pure' time series methods. Those methods had already been tested on total automotive waste, hazardous automotive waste, and total medical waste generation, but demonstrated at least a 6% error rate in different cases, and efforts were made to decrease it further. The newly developed hybrid models used a random start generation method to incorporate the advantages of different time series methods, which helped increase forecast accuracy by 3%-4% in the hazardous automotive waste and total medical waste generation cases; the new model did not increase the accuracy of total automotive waste generation forecasts. The developed models' ability to produce short- and mid-term forecasts was tested using different prediction horizons.
A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method.
Yang, Jun-He; Cheng, Ching-Hsue; Chan, Chia-Pan
2017-01-01
Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model based on estimating missing values followed by variable selection to forecast the reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets were concatenated into an integrated research dataset based on the ordering of the data. The proposed time-series forecasting model has three main steps. First, this study uses five imputation methods to handle missing values rather than directly deleting them. Second, we identify the key variables via factor analysis and then delete the unimportant variables sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level, which is compared with the listing methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model, when applied to variable selection with full variables, has better forecasting performance than the listing models. In addition, this experiment shows that the proposed variable selection can help the five forecast methods used here improve their forecasting capability.
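The variable-selection idea can be illustrated with a simple correlation filter. This is a hedged stand-in: the paper's pipeline uses factor analysis followed by sequential deletion and a Random Forest, and the variable names and values below are invented for illustration.

```python
# Rank candidate predictors by absolute Pearson correlation with the
# target water level and keep the top ones (simplified selection step).

def pearson(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def select_variables(variables, target, keep=2):
    """Return the `keep` variable names most correlated with the target."""
    ranked = sorted(variables,
                    key=lambda name: -abs(pearson(variables[name], target)))
    return ranked[:keep]

level = [244.1, 243.8, 244.5, 245.0, 244.3]          # hypothetical levels (m)
candidates = {
    "rainfall":    [12.0, 3.0, 25.0, 40.0, 8.0],
    "temperature": [21.0, 22.5, 20.0, 19.5, 23.0],
    "wind_speed":  [3.0, 3.1, 2.9, 3.0, 3.2],
}
print(select_variables(candidates, level))
```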
Projecting technology change to improve space technology planning and systems management
NASA Astrophysics Data System (ADS)
Walk, Steven Robert
2011-04-01
Projecting technology performance evolution has been improving over the years. Reliable quantitative forecasting methods have been developed that project the growth, diffusion, and performance of technology in time, including projecting technology substitutions, saturation levels, and performance improvements. These forecasts can be applied at the early stages of space technology planning to better predict available future technology performance, assure the successful selection of technology, and improve technology systems management strategy. Often what is published as a technology forecast is simply scenario planning, usually made by extrapolating current trends into the future, with perhaps some subjective insight added. Typically, the accuracy of such predictions falls rapidly with distance in time. Quantitative technology forecasting (QTF), on the other hand, includes the study of historic data to identify one of or a combination of several recognized universal technology diffusion or substitution patterns. In the same manner that quantitative models of physical phenomena provide excellent predictions of system behavior, so do QTF models provide reliable technological performance trajectories. In practice, a quantitative technology forecast is completed to ascertain with confidence when the projected performance of a technology or system of technologies will occur. Such projections provide reliable time-referenced information when considering cost and performance trade-offs in maintaining, replacing, or migrating a technology, component, or system. This paper introduces various quantitative technology forecasting techniques and illustrates their practical application in space technology and technology systems management.
Layered Ensemble Architecture for Time Series Forecasting.
Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin
2016-01-01
Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown, and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed the lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both the accuracy and the diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble using different training sets, with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble. This reflects LEA's emphasis on the accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 neural network forecasting competitions. It has also been tested on several standard benchmark time series data sets. In terms of forecasting accuracy, our experimental results have revealed clearly that LEA is better than other ensemble and non-ensemble methods.
Results on SSH neural network forecasting in the Mediterranean Sea
NASA Astrophysics Data System (ADS)
Rixen, Michel; Beckers, Jean-Marie; Alvarez, Alberto; Tintore, Joaquim
2002-01-01
Nowadays, satellites are the only monitoring systems that cover almost continuously all possible ocean areas, and they are now an essential part of operational oceanography. A novel approach based on artificial intelligence (AI) concepts exploits past time series of satellite images to infer near-future ocean conditions at the surface by means of neural networks and genetic algorithms. The size of the AI problem is drastically reduced by splitting the spatio-temporal variability contained in the remote sensing data using empirical orthogonal function (EOF) decomposition. The problem of forecasting the dynamics of a 2D surface field can thus be reduced by selecting the most relevant empirical modes, and non-linear time series predictors are then applied to the amplitudes only. In the present case study, we use altimetric maps of the Mediterranean Sea, combining TOPEX-POSEIDON and ERS-1/2 data for the period 1992 to 1997. The learning procedure is applied to each mode individually. The final forecast is then reconstructed from the EOFs and the forecasted amplitudes, and compared to the real observed field for validation of the method.
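The EOF reduction step can be sketched with a singular value decomposition; here a persistence forecast of the modal amplitudes stands in for the paper's neural-network and genetic-algorithm predictors:

```python
import numpy as np

def eof_forecast(field, n_modes=2):
    """field: (time, space) array of anomalies plus mean.
    Reduce with EOFs, forecast each mode's amplitude by persistence
    (a toy stand-in for the trained non-linear predictors), and
    reconstruct the next surface field."""
    mean = field.mean(axis=0)
    anom = field - mean
    # SVD: temporal amplitudes are U*S, spatial EOF patterns are rows of Vt
    u, s, vt = np.linalg.svd(anom, full_matrices=False)
    amps = u[:, :n_modes] * s[:n_modes]      # (time, modes)
    next_amps = amps[-1]                     # persistence forecast per mode
    return mean + next_amps @ vt[:n_modes]   # reconstructed field
```

Only the leading modes are retained, which is what shrinks the prediction problem from a 2D field to a handful of scalar time series.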
NASA Astrophysics Data System (ADS)
Vouterakos, P. A.; Moustris, K. P.; Bartzokas, A.; Ziomas, I. C.; Nastos, P. T.; Paliatsos, A. G.
2012-12-01
In this work, artificial neural networks (ANNs) were developed and applied in order to forecast the discomfort levels due to the combination of high temperature and air humidity during the hot season of the year, in eight different regions within the Greater Athens area (GAA), Greece. For the selection of the best type and architecture of ANN forecasting models, the multiple criteria analysis (MCA) technique was applied. Three different types of ANNs were developed and tested with the MCA method: the multilayer perceptron, the generalized feed-forward network (GFFN), and the time-lag recurrent network. Results showed that the best performance among the ANN types was achieved by the GFFN model for the prediction of discomfort levels due to high temperature and air humidity within the GAA. For the evaluation of the constructed ANNs, appropriate statistical indices were used. The analysis proved that the forecasting ability of the developed ANN models is very satisfactory, statistically significant at the p < 0.01 level.
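The abstract does not state which temperature-humidity discomfort measure was forecast; as one common choice, Thom's discomfort index combines the two inputs as below (an illustrative assumption, not necessarily the index used by the authors):

```python
def thom_discomfort_index(temp_c, rel_humidity):
    # Thom's discomfort index: air temperature in deg C, relative
    # humidity in percent. At 100% humidity the index equals the
    # temperature; drier air lowers perceived discomfort.
    return temp_c - 0.55 * (1.0 - 0.01 * rel_humidity) * (temp_c - 14.5)
```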
Forecasting daily patient volumes in the emergency department.
Jones, Spencer S; Thomas, Alun; Evans, R Scott; Welch, Shari J; Haug, Peter J; Snow, Gregory L
2008-02-01
Shifts in the supply of and demand for emergency department (ED) resources make the efficient allocation of ED resources increasingly important. Forecasting is a vital activity that guides decision-making in many areas of economic, industrial, and scientific planning, but has gained little traction in the health care industry. There are few studies that explore the use of forecasting methods to predict patient volumes in the ED. The goals of this study are to explore and evaluate the use of several statistical forecasting methods to predict daily ED patient volumes at three diverse hospital EDs and to compare the accuracy of these methods to the accuracy of a previously proposed forecasting method. Daily patient arrivals at three hospital EDs were collected for the period January 1, 2005, through March 31, 2007. The authors evaluated the use of seasonal autoregressive integrated moving average, time series regression, exponential smoothing, and artificial neural network models to forecast daily patient volumes at each facility. Forecasts were made for horizons ranging from 1 to 30 days in advance. The forecast accuracy achieved by the various forecasting methods was compared to the forecast accuracy achieved when using a benchmark forecasting method already available in the emergency medicine literature. All time series methods considered in this analysis provided improved in-sample model goodness of fit. However, post-sample analysis revealed that time series regression models that augment linear regression models by accounting for serial autocorrelation offered only small improvements in terms of post-sample forecast accuracy, relative to multiple linear regression models, while seasonal autoregressive integrated moving average, exponential smoothing, and artificial neural network forecasting models did not provide consistently accurate forecasts of daily ED volumes. 
This study confirms the widely held belief that daily demand for ED services is characterized by seasonal and weekly patterns. The authors compared several time series forecasting methods to a benchmark multiple linear regression model. The results suggest that the existing methodology proposed in the literature, multiple linear regression based on calendar variables, is a reasonable approach to forecasting daily patient volumes in the ED. However, the authors conclude that regression-based models that incorporate calendar variables, account for site-specific special-day effects, and allow for residual autocorrelation provide a more appropriate, informative, and consistently accurate approach to forecasting daily ED patient volumes.
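A minimal version of the benchmark calendar-variable regression might look like the following, using only day-of-week dummies; the study's actual models also include other calendar terms, site-specific special-day effects, and residual autocorrelation:

```python
import numpy as np

def calendar_regression(volumes, start_weekday=0):
    """Fit daily ED volumes on an intercept plus day-of-week dummies
    (Sunday, dow == 6, is the baseline). Returns a predictor for day t."""
    n = len(volumes)
    X = np.zeros((n, 7))
    X[:, 0] = 1.0
    for t in range(n):
        dow = (start_weekday + t) % 7
        if dow < 6:
            X[t, 1 + dow] = 1.0
    beta, *_ = np.linalg.lstsq(X, np.asarray(volumes, float), rcond=None)

    def predict(t):
        x = np.zeros(7)
        x[0] = 1.0
        dow = (start_weekday + t) % 7
        if dow < 6:
            x[1 + dow] = 1.0
        return float(x @ beta)

    return predict
```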
Stream-flow forecasting using extreme learning machines: A case study in a semi-arid region in Iraq
NASA Astrophysics Data System (ADS)
Yaseen, Zaher Mundher; Jaafar, Othman; Deo, Ravinesh C.; Kisi, Ozgur; Adamowski, Jan; Quilty, John; El-Shafie, Ahmed
2016-11-01
Monthly stream-flow forecasting can yield important information for hydrological applications, including sustainable design of rural and urban water management systems, optimization of water resource allocations, water use, pricing and water quality assessment, and agriculture and irrigation operations. The motivation for exploring and developing expert predictive models is an ongoing endeavor for hydrological applications. In this study, the potential of a relatively new data-driven method, namely the extreme learning machine (ELM) method, was explored for forecasting monthly stream-flow discharge rates in the Tigris River, Iraq. The ELM algorithm is a single-layer feedforward neural network (SLFN) that randomly selects the input weights and hidden-layer biases and analytically determines the output weights of the SLFN. Based on the partial autocorrelation functions of historical stream-flow data, a set of five input combinations with lagged stream-flow values is employed to establish the best forecasting model. A comparative investigation is conducted to evaluate the performance of the ELM against other data-driven models: support vector regression (SVR) and generalized regression neural network (GRNN). The forecasting metrics, defined as the correlation coefficient (r), Nash-Sutcliffe efficiency (ENS), Willmott's Index (WI), root-mean-square error (RMSE) and mean absolute error (MAE), computed between the observed and forecasted stream-flow data, are employed to assess the ELM model's effectiveness. The results revealed that the ELM model outperformed the SVR and the GRNN models across a number of statistical measures. In quantitative terms, the superiority of ELM over the SVR and GRNN models was exhibited by ENS = 0.578, 0.378 and 0.144, r = 0.799, 0.761 and 0.468, and WI = 0.853, 0.802 and 0.689, respectively, and the ELM model attained a lower RMSE value by approximately 21.3% (relative to SVR) and approximately 44.7% (relative to GRNN).
Based on the findings of this study, several recommendations were suggested for further exploration of the ELM model in hydrological forecasting problems.
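The ELM training step described above, with random input weights and biases and analytically determined output weights via a pseudo-inverse, can be sketched as:

```python
import numpy as np

def train_elm(X, y, hidden=30, seed=0):
    """Minimal extreme learning machine: the input-to-hidden weights W
    and biases b are random and never trained; only the output weights
    beta are solved analytically by least squares (pseudo-inverse)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], hidden))   # random input weights
    b = rng.normal(size=hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                      # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                # analytic output weights

    def predict(Xnew):
        return np.tanh(Xnew @ W + b) @ beta

    return predict
```

Because only `beta` is fitted, training reduces to one linear solve, which is what makes ELM fast relative to iteratively trained networks.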
NASA Astrophysics Data System (ADS)
He, Zhibin; Wen, Xiaohu; Liu, Hu; Du, Jun
2014-02-01
Data-driven models are very useful for river flow forecasting when the underlying physical relationships are not fully understood, but it is not clear whether these models still perform well in the small river basins of semiarid mountain regions with complicated topography. In this study, the potential of three different data-driven methods, artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS) and support vector machine (SVM), was explored for forecasting river flow in a semiarid mountain region of northwestern China. The models analyzed different combinations of antecedent river flow values, and the appropriate input vector was selected based on the analysis of residuals. The performance of the ANN, ANFIS and SVM models on the training and validation sets was compared with the observed data. The model consisting of three antecedent values of flow was selected as the best-fit model for river flow forecasting. To evaluate the ANN, ANFIS and SVM models more rigorously, four standard statistical performance measures, the coefficient of correlation (R), root mean squared error (RMSE), Nash-Sutcliffe efficiency coefficient (NS) and mean absolute relative error (MARE), were employed. The results indicate that the performance obtained by ANN, ANFIS and SVM in terms of the different evaluation criteria during the training and validation periods does not vary substantially; the performance of all three models in river flow forecasting was satisfactory. A detailed comparison of overall performance indicated that the SVM model performed better than ANN and ANFIS on the validation data sets. The results also suggest that the ANN, ANFIS and SVM methods can be successfully applied to establish river flow forecasting models in semiarid mountain regions with complicated topography.
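Two of the evaluation measures named above, RMSE and the Nash-Sutcliffe efficiency (NS), can be computed as:

```python
import math

def rmse(obs, sim):
    # Root mean squared error between observed and simulated flows
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def nash_sutcliffe(obs, sim):
    # NS = 1 means a perfect fit; NS <= 0 means the model is no better
    # than predicting the observed mean
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den
```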
A Solar Time-Based Analog Ensemble Method for Regional Solar Power Forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Brian S; Zhang, Xinmin; Li, Yuan
This paper presents a new analog ensemble method for day-ahead regional photovoltaic (PV) power forecasting with hourly resolution. By utilizing open weather forecast and power measurement data, this prediction method searches a set of historical data for periods with similar meteorological data (temperature and irradiance) and astronomical date (solar time and earth declination angle). Further, clustering and blending strategies are applied to improve its accuracy in regional PV forecasting. The robustness of the proposed method is demonstrated with three different numerical weather prediction models, the North American Mesoscale Forecast System, the Global Forecast System, and the Short-Range Ensemble Forecast, for both region-level and single-site-level PV forecasts. Using real measured data, the new forecasting approach is applied to the load zone in Southeastern Massachusetts as a case study. The normalized root mean square error (NRMSE) is reduced by 13.80%-61.21% when compared with three tested baselines.
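The analog-selection step can be sketched as a nearest-neighbour search over historical feature vectors; the feature set and distance below are illustrative assumptions, not the paper's exact procedure:

```python
def analog_forecast(history, query, k=3):
    """history: list of (features, power) pairs from past days;
    query: feature vector (e.g. forecast temperature, irradiance).
    Returns the mean power of the k most similar historical analogs."""
    def dist(features):
        # squared Euclidean distance in feature space
        return sum((a - b) ** 2 for a, b in zip(features, query))
    nearest = sorted(history, key=lambda h: dist(h[0]))[:k]
    return sum(power for _, power in nearest) / k
```

In practice the features would be normalized and restricted to the same solar time, so that analogs are compared under comparable sun geometry.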
Nowcasting of rainfall and of combined sewage flow in urban drainage systems.
Achleitner, Stefan; Fach, Stefan; Einfalt, Thomas; Rauch, Wolfgang
2009-01-01
Nowcasting of rainfall may be used in addition to online rain measurements to optimize the operation of urban drainage systems. Uncertainties quoted for rain volume are in the range of 5% to 10% mean square error (MSE), whereas for rain intensities 45% to 75% MSE is noted. For larger forecast periods of up to 3 hours, the uncertainties increase to several hundred percent. Combined with the growing number of real-time control concepts in sewer systems, rainfall forecasts are used more and more in urban drainage systems. It is therefore of interest how these uncertainties influence the final evaluation of a defined objective function. Uncertainty levels associated with the forecast itself are not necessarily transferable to the resulting uncertainties in the catchment's flow dynamics. The aim of this paper is to analyse forecasts of rainfall and of specific sewer output variables. For this study, the combined sewer system of the city of Linz, in the northern part of Austria on the Danube, was selected. The city itself represents a total area of 96 km2 with 39 municipalities connected. It was found that the available weather radar data lead to large deviations in the forecast precipitation at forecast horizons larger than 90 minutes. The same is true for sewer variables such as combined sewer overflow (CSO) volumes for small sub-catchments. Although the results improve at larger spatial scales, acceptable levels at forecast horizons larger than 90 minutes are not reached.
How is the weather? Forecasting inpatient glycemic control
Saulnier, George E; Castro, Janna C; Cook, Curtiss B; Thompson, Bithika M
2017-01-01
Aim: Apply methods of damped trend analysis to forecast inpatient glycemic control. Method: Observed and calculated point-of-care blood glucose data trends were determined over 62 weeks. Mean absolute percent error was used to calculate differences between observed and forecasted values. Comparisons were drawn between model results and linear regression forecasting. Results: The forecasted mean glucose trends observed during the first 24 and 48 weeks of projections compared favorably to the results provided by linear regression forecasting. However, in some scenarios, the damped trend method changed inferences compared with linear regression. In all scenarios, mean absolute percent error values remained below the 10% accepted by demand industries. Conclusion: Results indicate that forecasting methods historically applied within demand industries can project future inpatient glycemic control. Additional study is needed to determine if forecasting is useful in the analyses of other glucometric parameters and, if so, how to apply the techniques to quality improvement. PMID:29134125
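A compact sketch of a damped-trend method (Holt's linear method with a damping parameter phi, following Gardner and McKenzie), together with the mean absolute percent error used in the study; the smoothing constants here are illustrative, not the paper's fitted values:

```python
def damped_trend_forecast(series, alpha=0.5, beta=0.3, phi=0.9, horizon=1):
    # Holt's linear exponential smoothing with a damped trend:
    # the trend's influence decays geometrically with the horizon.
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + phi * trend)
        trend = beta * (level - prev_level) + (1 - beta) * phi * trend
    # h-step forecast: level + (phi + phi^2 + ... + phi^h) * trend
    damp = sum(phi ** i for i in range(1, horizon + 1))
    return level + damp * trend

def mape(actual, forecast):
    # Mean absolute percent error, the accuracy measure in the study
    return 100.0 * sum(abs((a - f) / a)
                       for a, f in zip(actual, forecast)) / len(actual)
```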
NASA Astrophysics Data System (ADS)
dos Santos, A. F.; Freitas, S. R.; de Mattos, J. G. Z.; de Campos Velho, H. F.; Gan, M. A.; da Luz, E. F. P.; Grell, G. A.
2013-09-01
In this paper we consider an optimization problem applying the metaheuristic Firefly algorithm (FY) to weight an ensemble of rainfall forecasts from daily precipitation simulations with the Brazilian developments on the Regional Atmospheric Modeling System (BRAMS) over South America during January 2006. The method is addressed as a parameter estimation problem to weight the ensemble of precipitation forecasts carried out using different options of the convective parameterization scheme. Ensemble simulations were performed using different choices of closures, representing different formulations of dynamic control (the modulation of convection by the environment) in a deep convection scheme. The optimization problem is solved as an inverse problem of parameter estimation. The application and validation of the methodology is carried out using daily precipitation fields, defined over South America and obtained by merging remote sensing estimations with rain gauge observations. The quadratic difference between the model and observed data was used as the objective function to determine the best combination of the ensemble members to reproduce the observations. To reduce the model rainfall biases, the set of weights determined by the algorithm is used to weight members of an ensemble of model simulations in order to compute a new precipitation field that represents the observed precipitation as closely as possible. The validation of the methodology is carried out using classical statistical scores. The algorithm has produced the best combination of the weights, resulting in a new precipitation field closest to the observations.
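As a linear-algebra stand-in for the Firefly search, the weights minimizing the quadratic difference between a weighted combination of ensemble members and the observations can be obtained by least squares (the paper uses a metaheuristic, which additionally allows constraints; this sketch is unconstrained):

```python
import numpy as np

def optimal_weights(members, observed):
    """members: (n_members, n_points) array of member forecasts;
    observed: (n_points,) observed field values.
    Returns the weights minimizing the sum of squared differences."""
    F = np.asarray(members, float).T              # (n_points, n_members)
    w, *_ = np.linalg.lstsq(F, np.asarray(observed, float), rcond=None)
    return w
```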
Trend Analysis of Betel Nut-associated Oral Cancer and Health Burden in China.
Hu, Yan Jia; Chen, Jie; Zhong, Wai Sheng; Ling, Tian You; Jian, Xin Chun; Lu, Ruo Huang; Tang, Zhan Gui; Tao, Lin
To forecast the future trend of betel nut-associated oral cancer and the resulting burden on health based on historical oral cancer patient data in Hunan province, China. Oral cancer patient data in five hospitals in Changsha (the capital city of Hunan province) were collected for the past 12 years. Three methods were used to analyse the data: Microsoft Excel Forecast Sheet, Excel Trendline, and the logistic growth model. A combination of these three methods was used to forecast the future trend of betel nut-associated oral cancer and the resulting burden on health. Betel nut-associated oral cancer cases have been increasing rapidly over the past 12 years in Changsha. As of 2016, betel nuts had caused 8,222 cases of oral cancer in Changsha and close to 25,000 cases in Hunan, resulting in about ¥5 billion in accumulated financial loss. The combined trend analysis predicts that by 2030, betel nuts will cause more than 100,000 cases of oral cancer in Changsha, more than 300,000 cases in Hunan, and more than ¥64 billion in accumulated financial loss in medical expenses. The trend analysis of oral cancer patient data predicts that the growing betel nut industry in Hunan province will cause a humanitarian catastrophe with massive loss of human life and national resources. To prevent this catastrophe, China should ban betel nuts and provide early oral cancer screening for betel nut consumers as soon as possible.
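The logistic growth model mentioned above projects cumulative case counts toward a carrying capacity; a minimal form (the parameter names and values are illustrative, not the study's fitted estimates):

```python
import math

def logistic(t, capacity, rate, midpoint):
    # Logistic growth curve: cumulative count at time t, saturating at
    # `capacity`, with growth `rate` and inflection at `midpoint`.
    return capacity / (1.0 + math.exp(-rate * (t - midpoint)))
```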
Improved Forecasting of Next Day Ozone Concentrations in the Eastern U.S.
There is an urgent need to provide accurate air quality information and forecasts to the general public. A hierarchical space-time model is used to forecast next day spatial patterns of daily maximum 8-hr ozone concentrations. The model combines ozone monitoring data and gridded...
Foretelling Flares and Solar Energetic Particle Events: the FORSPEF tool
NASA Astrophysics Data System (ADS)
Anastasiadis, Anastasios; Papaioannou, Athanasios; Sandberg, Ingmar; Georgoulis, Manolis K.; Tziotziou, Kostas; Jiggens, Piers
2017-04-01
A novel integrated prediction system for both solar flares (SFs) and solar energetic particle (SEP) events is presented. The Forecasting Solar Particle Events and Flares (FORSPEF) system provides forecasting of solar eruptive events, such as SFs, with a projection to coronal mass ejections (CMEs) (occurrence and velocity) and the likelihood of occurrence of a SEP event. In addition, FORSPEF also provides nowcasting of SEP events based on actual SF and CME near real-time data, as well as the complete SEP profile (peak flux, fluence, rise time, duration) per parent solar event. The prediction of SFs relies on a morphological method, the effective connected magnetic field strength (Beff); it is based on an assessment of potentially flaring active-region (AR) magnetic configurations and utilizes sophisticated analysis of a large number of AR magnetograms. For the prediction of SEP events, new methods have been developed for both the likelihood of SEP occurrence and the expected SEP characteristics. In particular, using the location of the flare (longitude) and the flare size (maximum soft X-ray intensity), a reductive statistical method has been implemented. Moreover, employing CME parameters (velocity and width), proper functions per width (i.e. halo, partial halo, non-halo) and integral energy (E>30, 60, 100 MeV) have been identified. In our technique, warnings are issued for all >C1.0 soft X-ray flares. The prediction time in the forecasting scheme extends to 24 hours with a refresh rate of 3 hours, while the respective prediction time for the nowcasting scheme depends on the availability of the near real-time data and falls between 15-20 minutes for solar flares and 6 hours for CMEs. We present the modules of the FORSPEF system, their interconnection and the operational set up. The dual approach in the development of FORSPEF (i.e. 
forecasting and nowcasting scheme) permits the refinement of predictions upon the availability of new data that characterize changes on the Sun and the interplanetary space, while the combined usage of SF and SEP forecasting methods upgrades FORSPEF to an integrated forecasting solution. Finally, we demonstrate the validation of the modules of the FORSPEF tool using categorical scores constructed on archived data and we further discuss independent case studies. This work has been funded through the "FORSPEF: FORecasting Solar Particle Events and Flares", ESA Contract No. 4000109641/13/NL/AK and the "SPECS: Solar Particle Events foreCasting Studies" project of the National Observatory of Athens.
A novel application of artificial neural network for wind speed estimation
NASA Astrophysics Data System (ADS)
Fang, Da; Wang, Jianzhou
2017-05-01
Providing accurate multi-step wind speed estimation models has increasing significance because of the important technical and economic impacts of wind speed on power grid security and environmental benefits. In this study, combined strategies for wind speed forecasting are proposed based on an intelligent data processing system using artificial neural networks (ANNs). A generalized regression neural network and an Elman neural network are employed to form two hybrid models. The approach employs one ANN to model the samples, achieving data denoising and assimilation, and applies the other to predict wind speed using the pre-processed samples. The proposed method is demonstrated in terms of the prediction improvements of the hybrid models compared with a single ANN and a typical forecasting method. To give sufficient cases for the study, four observation sites with monthly average wind speeds for four given years in Western China were used to test the models. Multiple evaluation methods demonstrated that the proposed method provides a promising alternative technique for monthly average wind speed estimation.
An experimental system for flood risk forecasting and monitoring at global scale
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Alfieri, Lorenzo; Kalas, Milan; Lorini, Valerio; Salamon, Peter
2017-04-01
Global flood forecasting and monitoring systems are nowadays a reality and are being applied by a wide range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk based forecasting, combining streamflow estimations with expected inundated areas and flood impacts. Finally, emerging technologies such as crowdsourcing and social media monitoring can play a crucial role in flood disaster management and preparedness. Here, we present some recent advances of an experimental procedure for near-real time flood mapping and impact assessment. The procedure translates in near real-time the daily streamflow forecasts issued by the Global Flood Awareness System (GloFAS) into event-based flood hazard maps, which are then combined with exposure and vulnerability information at global scale to derive risk forecast. Impacts of the forecasted flood events are evaluated in terms of flood prone areas, potential economic damage, and affected population, infrastructures and cities. To increase the reliability of our forecasts we propose the integration of model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification and correction of impact forecasts. Finally, we present the results of preliminary tests which show the potential of the proposed procedure in supporting emergency response and management.
New Approach To Hour-By-Hour Weather Forecast
NASA Astrophysics Data System (ADS)
Liao, Q. Q.; Wang, B.
2017-12-01
Fine hourly forecasts at a single station are required in many production and daily-life applications. Most previous MOS (Model Output Statistics) approaches, which used a linear regression model, can hardly capture the nonlinear nature of weather prediction, and forecast accuracy has not been sufficient at high temporal resolution. This study aims to predict future meteorological elements, including temperature, precipitation, relative humidity and wind speed, in a local region over a relatively short period of time at the hourly level. Using hour-by-hour NWP (Numerical Weather Prediction) meteorological fields from Forecastio (https://darksky.net/dev/docs/forecast) and real-time instrumental observations from 29 stations in Yunnan and 3 stations in Tianjin, China, from June to October 2016, hour-by-hour predictions are made up to 24 hours ahead. This study presents an ensemble approach to combine the information of the instrumental observations themselves and NWP. An autoregressive moving-average (ARMA) model is used to predict future values of the observation time series; the newest NWP products are fed into equations derived from the multiple linear regression MOS technique; and the residual series of the MOS outputs are handled with an autoregressive (AR) model to capture their linear structure. Due to the complexity of the non-linear properties of atmospheric flow, a support vector machine (SVM) is also introduced. Basic data quality control and cross-validation make it possible to optimize the model parameters and to perform 24-hour-ahead residual reduction with the AR/SVM models. Results show that the AR technique is better than the corresponding multi-variate MOS regression method, especially in the first 4 hours, when the predictor is temperature. The combined MOS-AR model, which is comparable to the MOS-SVM model, outperforms MOS alone. Their root mean square error and correlation coefficient for 2 m temperature reach 1.6 degrees Celsius and 0.91, respectively. 
The proportion of 24-hour forecasts with a deviation of no more than 2 degrees Celsius is 78.75% for the MOS-AR model and 81.23% for the AR model.
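The AR handling of MOS residuals can be sketched for the AR(1) case: fit the autoregressive coefficient on past residuals, then add the predicted residual to the raw MOS forecast. This is a simplified, illustrative version of the study's AR step:

```python
def ar1_residual_correction(residuals):
    """Fit an AR(1) coefficient by least squares, r_t ~ phi * r_{t-1},
    and return the predicted next residual. Adding this value to the
    raw MOS forecast yields the corrected (MOS-AR style) forecast."""
    num = sum(residuals[t] * residuals[t - 1]
              for t in range(1, len(residuals)))
    den = sum(r * r for r in residuals[:-1])
    phi = num / den
    return phi * residuals[-1]
```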
NASA Astrophysics Data System (ADS)
Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara
2015-09-01
Simulation of rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large scale climate signals have been proved to be effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps of rainfall forecasting, rainfall-runoff simulation and future runoff prediction. In the first step, data driven models are developed and used to forecast rainfall using large scale climate signals as rainfall predictors. Due to high effect of different sources of uncertainty on the output of hydrologic models, in the second step uncertainty associated with input data, model parameters and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by alternative rainfall-runoff models are combined, through developing a weighting method based on K-means clustering. Model parameters and input uncertainty are investigated using an adaptive Markov Chain Monte Carlo method. Finally, calibrated rainfall-runoff models are driven using the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of uncertainty analysis of rainfall-runoff modeling reveal that simultaneous estimation of model parameters and input uncertainty significantly changes the probability distribution of the model parameters. 
It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the watershed is improved by up to 50% in comparison with the simulations of the individual models. Results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also adequate time for incorporating required mitigation measures in dealing with potentially extreme runoff events and flood hazards. Results of this study can be used in the identification of the main factors affecting flood hazard analysis.
A national-scale seasonal hydrological forecast system: development and evaluation over Britain
NASA Astrophysics Data System (ADS)
Bell, Victoria A.; Davies, Helen N.; Kay, Alison L.; Brookshaw, Anca; Scaife, Adam A.
2017-09-01
Skilful winter seasonal predictions for the North Atlantic circulation and northern Europe have now been demonstrated, and the potential for seasonal hydrological forecasting in the UK is now being explored. One of the techniques being used combines seasonal rainfall forecasts provided by operational weather forecast systems with hydrological modelling tools to provide estimates of seasonal mean river flows up to a few months ahead. The work presented here shows how spatial information contained in a distributed hydrological model typically requiring high-resolution (daily or better) rainfall data can be used to provide an initial condition for a much simpler forecast model tailored to use low-resolution monthly rainfall forecasts. Rainfall forecasts (hindcasts) from the GloSea5 model (1996 to 2009) are used to provide the first assessment of skill in these national-scale flow forecasts. The skill in the combined modelling system is assessed for different seasons and regions of Britain, and compared to what might be achieved using other approaches such as use of an ensemble of historical rainfall in a hydrological model, or a simple flow persistence forecast. The analysis indicates that only limited forecast skill is achievable for spring and summer seasonal hydrological forecasts; however, autumn and winter flows can be reasonably well forecast using (ensemble mean) rainfall forecasts based on either GloSea5 forecasts or historical rainfall (the preferred type of forecast depends on the region). Flow forecasts using ensemble mean GloSea5 rainfall perform most consistently well across Britain, and provide the most skilful forecasts overall at the 3-month lead time. Much of the skill (64 %) in the 1-month ahead seasonal flow forecasts can be attributed to the hydrological initial condition (particularly in regions with a significant groundwater contribution to flows), whereas for the 3-month ahead lead time, GloSea5 forecasts account for ~70 % of the forecast skill (mostly in areas of high rainfall to the north and west) and only 30 % of the skill arises from hydrological memory (typically groundwater-dominated areas). Given the high spatial heterogeneity in typical patterns of UK rainfall and evaporation, future development of skilful spatially distributed seasonal forecasts could lead to substantial improvements in seasonal flow forecast capability, potentially benefitting practitioners interested in predicting hydrological extremes, not only in the UK but also across Europe.
NASA Astrophysics Data System (ADS)
Cheng, K.; Guo, L. M.; Wang, Y. K.; Zafar, M. T.
2017-11-01
In order to select effective samples from many years of PV power generation data and improve the accuracy of PV power generation forecasting models, this paper studies the application of clustering analysis in this field and establishes forecasting models based on neural networks. Based on three different types of weather, sunny, cloudy and rainy days, this research screens samples of historical data with the clustering analysis method. After screening, it establishes BP neural network prediction models using the screened data as training data. Then the six types of photovoltaic power generation prediction models, before and after the data screening, are compared. Results show that the prediction model combining clustering analysis with BP neural networks is an effective method to improve the precision of photovoltaic power generation forecasting.
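The screening step, grouping historical days into weather types before training per-type models, can be illustrated with a minimal 1-D k-means; the paper's actual clustering variables and settings are not given here, so this is only a sketch:

```python
def kmeans_1d(values, k=2, iters=20):
    """Minimal 1-D k-means for screening similar-weather days.
    Returns the cluster centers; each historical day would then be
    assigned to its nearest center before per-cluster training."""
    # crude initialization: spread initial centers across sorted values
    centers = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for v in values:
            i = min(range(len(centers)), key=lambda j: abs(v - centers[j]))
            clusters[i].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers
```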
NASA Astrophysics Data System (ADS)
Tito Arandia Martinez, Fabian
2014-05-01
Adequate uncertainty assessment is an important issue in hydrological modelling. A key concern for hydropower producers is obtaining ensemble forecasts that truly grasp the uncertainty linked to upcoming streamflows. If properly assessed, this uncertainty can lead to optimal reservoir management and energy production (e.g. [1]). The meteorological inputs to the hydrological model account for an important part of the total uncertainty in streamflow forecasting. Since the creation of the THORPEX initiative and the TIGGE database, meteorological ensemble forecasts from nine agencies throughout the world have been made available. This allows for hydrological ensemble forecasts based on multiple meteorological ensemble forecasts, so that both the uncertainty linked to the architecture of the meteorological model and the uncertainty linked to the initial condition of the atmosphere can be accounted for. The main objective of this work is to show that a weighted combination of meteorological ensemble forecasts based on different atmospheric models can lead to improved hydrological ensemble forecasts, for horizons from one to ten days. This experiment is performed for the Baskatong watershed, a head subcatchment of the Gatineau watershed in the province of Quebec, Canada. The Baskatong watershed is of great importance for hydropower production, as it comprises the main reservoir for the Gatineau watershed, on which there are six hydropower plants managed by Hydro-Québec. Since the 1970s, they have been using pseudo-ensemble forecasts based on deterministic meteorological forecasts to which variability derived from past forecasting errors is added. We use a combination of meteorological ensemble forecasts from different models (precipitation and temperature) as the main inputs for the hydrological model HSAMI ([2]).
The meteorological ensembles from eight of the nine agencies available through TIGGE are weighted according to their individual performance and combined to form a grand ensemble. Results show that the hydrological forecasts derived from the grand ensemble perform better than the pseudo-ensemble forecasts currently used operationally at Hydro-Québec. References: [1] M. Verbunt, A. Walser, J. Gurtz et al., "Probabilistic flood forecasting with a limited-area ensemble prediction system: Selected case studies," Journal of Hydrometeorology, vol. 8, no. 4, pp. 897-909, Aug. 2007. [2] N. Evora, Valorisation des prévisions météorologiques d'ensemble, Institut de recherche d'Hydro-Québec, 2005. [3] V. Fortin, Le modèle météo-apport HSAMI: historique, théorie et application, Institut de recherche d'Hydro-Québec, 2000.
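The record says each agency's ensemble is "weighted according to their individual performance" without specifying the scheme; a common and simple choice, used here purely as an assumed illustration, is inverse mean-squared-error weighting of the ensemble means.

```python
# Hypothetical sketch of skill-based weighting: each agency's ensemble gets a
# weight inversely proportional to its historical mean squared error, and the
# grand-ensemble mean is the weighted mean of the individual ensemble means.
def inverse_mse_weights(hist_errors):
    """hist_errors: {agency: [past forecast errors]} -> normalized weights."""
    inv = {a: 1.0 / (sum(e * e for e in errs) / len(errs))
           for a, errs in hist_errors.items()}
    total = sum(inv.values())
    return {a: v / total for a, v in inv.items()}

hist_errors = {"ECMWF": [0.5, -0.5], "NCEP": [1.0, -1.0]}   # made-up errors
weights = inverse_mse_weights(hist_errors)

ens_means = {"ECMWF": 10.0, "NCEP": 14.0}   # made-up ensemble-mean precip (mm)
grand_mean = sum(weights[a] * ens_means[a] for a in weights)
```

The better-performing ensemble (smaller historical errors) dominates the combination; with the numbers above it receives weight 0.8.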
Using a Hybrid Model to Forecast the Prevalence of Schistosomiasis in Humans
Zhou, Lingling; Xia, Jing; Yu, Lijing; Wang, Ying; Shi, Yun; Cai, Shunxiang; Nie, Shaofa
2016-01-01
Background: We previously proposed a hybrid model combining both the autoregressive integrated moving average (ARIMA) and the nonlinear autoregressive neural network (NARNN) models in forecasting schistosomiasis. Our purpose in the current study was to forecast the annual prevalence of human schistosomiasis in Yangxin County, using our ARIMA-NARNN model, thereby further certifying the reliability of our hybrid model. Methods: We used the ARIMA, NARNN and ARIMA-NARNN models to fit and forecast the annual prevalence of schistosomiasis. The modeling period covered the annual prevalence from 1956 to 2008, while the testing period covered 2009 to 2012. The mean square error (MSE), mean absolute error (MAE) and mean absolute percentage error (MAPE) were used to measure model performance. We reconstructed the hybrid model to forecast the annual prevalence from 2013 to 2016. Results: The modeling and testing errors generated by the ARIMA-NARNN model were lower than those obtained from either the single ARIMA or NARNN models. The predicted annual prevalence from 2013 to 2016 demonstrated an initial decreasing trend, followed by an increase. Conclusions: The ARIMA-NARNN model can be well applied to analyze surveillance data for early warning systems for the control and elimination of schistosomiasis. PMID:27023573
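The hybrid idea (linear model plus a nonlinear correction fitted to its residuals) can be sketched minimally. This is not the paper's ARIMA-NARNN: an AR(1) fit stands in for ARIMA, and a crude nearest-neighbour rule on lagged residuals stands in for the NARNN.

```python
# Minimal linear-plus-nonlinear hybrid sketch (assumed stand-ins, see above).
def fit_ar1(x):
    m = sum(x) / len(x)
    d = [v - m for v in x]
    phi = sum(d[t] * d[t - 1] for t in range(1, len(d))) / sum(v * v for v in d[:-1])
    return m, phi

def hybrid_forecast(x):
    m, phi = fit_ar1(x)
    # residuals of the linear model
    resid = [x[t] - (m + phi * (x[t - 1] - m)) for t in range(1, len(x))]
    linear = m + phi * (x[-1] - m)
    # nonlinear part: find the past residual most similar to the latest one
    # and reuse the residual that followed it (a crude stand-in for NARNN)
    last = resid[-1]
    j = min(range(len(resid) - 1), key=lambda t: abs(resid[t] - last))
    return linear + resid[j + 1]
```

The hybrid forecast is the linear prediction plus the residual correction, mirroring the two-stage structure of ARIMA-NARNN.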
Valizadeh, Nariman; El-Shafie, Ahmed; Mirzaei, Majid; Galavi, Hadi; Mukhlisin, Muhammad; Jaafar, Othman
2014-01-01
Water level forecasting is an essential topic in water management, affecting reservoir operations and decision making. Recently, modern methods utilizing artificial intelligence, fuzzy logic, and combinations of these techniques have been used in hydrological applications because of their considerable ability to map an input-output pattern without requiring prior knowledge of the criteria influencing the forecasting procedure. The adaptive neuro-fuzzy inference system (ANFIS) is one of the most accurate models used in water resource management. Because the membership functions (MFs) possess the characteristics of smoothness and mathematical components, each set of input data is able to yield the best result using a certain type of MF in the ANFIS models. The objective of this study is to define different ANFIS models by applying different types of MFs for each type of input to forecast the water level in two case studies, the Klang Gates Dam and the Rantau Panjang station on the Johor River in Malaysia, and to compare the traditional ANFIS model with the newly introduced one in two different situations, reservoir and stream; the new approach outperforms the traditional one in both case studies. This objective is accomplished by evaluating model fitness and performance in daily forecasting.
Validation of reactive gases and aerosols in the MACC global analysis and forecast system
NASA Astrophysics Data System (ADS)
Eskes, H.; Huijnen, V.; Arola, A.; Benedictow, A.; Blechschmidt, A.-M.; Botek, E.; Boucher, O.; Bouarar, I.; Chabrillat, S.; Cuevas, E.; Engelen, R.; Flentje, H.; Gaudel, A.; Griesfeller, J.; Jones, L.; Kapsomenakis, J.; Katragkou, E.; Kinne, S.; Langerock, B.; Razinger, M.; Richter, A.; Schultz, M.; Schulz, M.; Sudarchikova, N.; Thouret, V.; Vrekoussis, M.; Wagner, A.; Zerefos, C.
2015-11-01
The European MACC (Monitoring Atmospheric Composition and Climate) project is preparing the operational Copernicus Atmosphere Monitoring Service (CAMS), one of the services of the European Copernicus Programme on Earth observation and environmental services. MACC uses data assimilation to combine in situ and remote sensing observations with global and regional models of atmospheric reactive gases, aerosols, and greenhouse gases, and is based on the Integrated Forecasting System of the European Centre for Medium-Range Weather Forecasts (ECMWF). The global component of the MACC service has a dedicated validation activity to document the quality of the atmospheric composition products. In this paper we discuss the approach to validation that has been developed over the past 3 years. Topics discussed are the validation requirements, the operational aspects, the measurement data sets used, the structure of the validation reports, the models and assimilation systems validated, the procedure to introduce new upgrades, and the scoring methods. One specific target of the MACC system concerns forecasting special events with high-pollution concentrations. Such events receive extra attention in the validation process. Finally, a summary is provided of the results from the validation of the latest set of daily global analysis and forecast products from the MACC system reported in November 2014.
Zhao, Xiuli; Yiranbon, Ethel
2014-01-01
The idea of aggregating information is clearly recognizable in the daily lives of all entities, whether individuals or groups; since time immemorial, corporate organizations, governments, and individuals as economic agents have aggregated information to formulate decisions. Energy planning represents an investment-decision problem where information needs to be aggregated from credible sources to predict both demand and supply of energy. The available methods vary, ranging from portfolio theory for managing risk and maximizing portfolio performance under a variety of unpredictable economic outcomes. The future demand for energy, and the need to use solar energy to avoid a future energy crisis in Jiangsu province in China, require energy planners in the province to abandon their reliance on traditional, “least-cost,” stand-alone technology cost estimates and instead evaluate conventional and renewable energy supply on the basis of a hybrid of optimization models in order to ensure effective and reliable supply. Our task in this research is to propose measures towards optimal solar energy forecasting by employing a systematic optimization approach based on a hybrid of weather and energy forecast models. After giving an overview of sustainable energy issues in China, we review and classify the various models that existing studies have used to predict the influence of weather on the output of solar energy production units. Further, we evaluate the performance of an exemplary ensemble model which combines the forecast output of two popular statistical prediction methods using a dynamic weighting factor. PMID:24511292
Short-term load forecasting of power system
NASA Astrophysics Data System (ADS)
Xu, Xiaobin
2017-05-01
In order to ensure the scientific soundness of power system optimization, it is necessary to improve load forecasting accuracy. Power system load forecasting starts from accurate statistical and survey data on the history and current situation of electricity consumption and uses scientific methods to predict the future development trend and pattern of power load. Short-term load forecasting is the basis of power system operation and analysis, and is of great significance to unit commitment, economic dispatch and safety checks. Therefore, load forecasting of the power system is explained in detail in this paper. First, we use data from 2012 to 2014 to establish a partial least squares model for regression analysis of the relationship between daily maximum load, daily minimum load, daily average load and each meteorological factor, and, by inspecting the histogram of regression coefficients, select daily maximum temperature, daily minimum temperature and daily average temperature as the meteorological factors with which to improve load forecasting accuracy. Secondly, in the case of uncertain climate impact, we use a time series model to predict the load data for 2015: the 2009-2014 load data were sorted, and the data from the previous six years were used to forecast the corresponding values in 2015. The criterion for prediction accuracy is the standard deviation of the prediction results relative to the average load of the previous six years. Finally, considering the climate effect, we use a BP neural network model to predict the 2015 data and optimize the forecast results on the basis of the time series model.
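The first step above regresses daily load statistics on meteorological factors. A minimal one-factor sketch with made-up numbers (the paper uses partial least squares across several factors; plain ordinary least squares on a single factor stands in here):

```python
# One-factor OLS sketch: daily maximum load regressed on daily maximum
# temperature. All numbers are invented for illustration.
def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b    # intercept, slope

temp = [28.0, 30.0, 32.0, 34.0]          # daily max temperature (deg C), made up
load = [900.0, 950.0, 1000.0, 1050.0]    # daily max load (MW), made up
a, b = ols(temp, load)
pred = a + b * 33.0    # predicted daily max load for a 33 deg C day
```

The size of the fitted coefficient plays the role of the regression-coefficient histogram in the record: factors with large coefficients are retained as predictors.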
Statistical earthquake focal mechanism forecasts
NASA Astrophysics Data System (ADS)
Kagan, Yan Y.; Jackson, David D.
2014-04-01
Forecasts of the focal mechanisms of future shallow (depth 0-70 km) earthquakes are important for seismic hazard estimates, Coulomb stress calculations, and other models of earthquake occurrence. Here we report on a high-resolution global forecast of earthquake rate density as a function of location, magnitude and focal mechanism. In previous publications we reported forecasts of 0.5° spatial resolution, covering the latitude range from -75° to +75°, based on the Global Central Moment Tensor earthquake catalogue. In the new forecasts we have improved the spatial resolution to 0.1° and extended the latitude range from pole to pole. Our focal mechanism estimates require distance-weighted combinations of observed focal mechanisms within 1000 km of each grid point. Simultaneously, we calculate an average rotation angle between the forecast mechanism and all the surrounding mechanisms, using the method of Kagan & Jackson proposed in 1994. This average angle reveals the level of tectonic complexity of a region and indicates the accuracy of the prediction. The procedure becomes problematic where longitude lines are not approximately parallel, and where shallow earthquakes are so sparse that an adequate sample spans very large distances. North or south of 75°, the azimuths of points 1000 km away may vary by about 35°. We solved this problem by calculating focal mechanisms on a plane tangent to the Earth's surface at each forecast point, correcting for the rotation of the longitude lines at the locations of earthquakes included in the averaging. The corrections are negligible between -30° and +30° latitude, but outside that band uncorrected rotations can be significantly off. Improved forecasts at 0.5° and 0.1° resolution are posted at http://eq.ess.ucla.edu/kagan/glob_gcmt_index.html.
NASA Astrophysics Data System (ADS)
Hasan, Md Alfi; Islam, A. K. M. Saiful
2018-05-01
Accurate forecasting of heavy rainfall is crucial for improving flood warnings to prevent loss of life and property damage due to flash-flood-related landslides in the hilly region of Bangladesh. Forecasting heavy rainfall events is challenging, and the microphysics and cumulus parameterization schemes of the Weather Research and Forecasting (WRF) model play an important role. In this study, a comparison was made between observed and simulated rainfall using 19 different combinations of microphysics and cumulus schemes available in WRF over Bangladesh. Two severe rainfall events, on 11 June 2007 and 24-27 June 2012, over the eastern hilly region of Bangladesh, were selected for performance evaluation using a number of indicators. A combination of the Stony Brook University microphysics scheme with the Tiedtke cumulus scheme is found to be the most suitable for reproducing those events. Another combination, of the single-moment 6-class microphysics scheme with the New Grell 3D cumulus scheme, also showed reasonable performance in forecasting heavy rainfall over this region. The sensitivity analysis confirms that cumulus schemes play a greater role than microphysics schemes in reproducing the heavy rainfall events using WRF.
NASA Astrophysics Data System (ADS)
Kutty, Govindan; Muraleedharan, Rohit; Kesarkar, Amit P.
2018-03-01
Uncertainties in the numerical weather prediction models are generally not well-represented in ensemble-based data assimilation (DA) systems. The performance of an ensemble-based DA system becomes suboptimal, if the sources of error are undersampled in the forecast system. The present study examines the effect of accounting for model error treatments in the hybrid ensemble transform Kalman filter—three-dimensional variational (3DVAR) DA system (hybrid) in the track forecast of two tropical cyclones viz. Hudhud and Thane, formed over the Bay of Bengal, using Advanced Research Weather Research and Forecasting (ARW-WRF) model. We investigated the effect of two types of model error treatment schemes and their combination on the hybrid DA system; (i) multiphysics approach, which uses different combination of cumulus, microphysics and planetary boundary layer schemes, (ii) stochastic kinetic energy backscatter (SKEB) scheme, which perturbs the horizontal wind and potential temperature tendencies, (iii) a combination of both multiphysics and SKEB scheme. Substantial improvements are noticed in the track positions of both the cyclones, when flow-dependent ensemble covariance is used in 3DVAR framework. Explicit model error representation is found to be beneficial in treating the underdispersive ensembles. Among the model error schemes used in this study, a combination of multiphysics and SKEB schemes has outperformed the other two schemes with improved track forecast for both the tropical cyclones.
NASA Astrophysics Data System (ADS)
WANG, D.; Wang, Y.; Zeng, X.
2017-12-01
Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, Wavelet De-noising (WD) and Rank-Set Pair Analysis (RSPA), that takes full advantage of a combination of the two approaches to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influences at some representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs) (BP-error Back Propagation, MLP-Multilayer Perceptron and RBF-Radial Basis Function), and RSPA alone. Nine error metrics are used to evaluate the model performance. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various generic methods compared in this paper, even when the extreme events are included within a time series.
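The WD step above separates noise from signal via a wavelet transform. A minimal sketch with a single-level Haar transform and soft thresholding (the paper's choice of wavelet, decomposition depth, and threshold are not specified here; this is an illustrative stand-in that assumes an even-length series):

```python
# One-level Haar wavelet de-noising sketch: transform, soft-threshold the
# detail coefficients, reconstruct. Assumes len(x) is even.
import math

def haar_denoise(x, thresh):
    a = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    # soft thresholding shrinks small (noisy) detail coefficients to zero
    soft = [math.copysign(max(abs(v) - thresh, 0.0), v) for v in d]
    out = []
    for ai, di in zip(a, soft):
        out += [(ai + di) / math.sqrt(2), (ai - di) / math.sqrt(2)]
    return out
```

With threshold 0 the series is reconstructed exactly; with a large threshold each pair collapses to its local average, i.e. the high-frequency "noise" is removed before the forecasting stage.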
Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qin; Florita, Anthony R; Krishnan, Venkat K
Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim to reduce the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
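The scenario-generation step can be sketched with stand-in choices: the paper fits a GMM and inverts its CDF analytically, whereas this illustration inverts a sorted empirical CDF of made-up historical errors.

```python
# Inverse-transform scenario generation sketch (empirical CDF stands in for
# the paper's fitted GMM; all numbers are invented).
import random

def inverse_transform_samples(errors, n, seed=0):
    """Draw n error samples from the empirical CDF by inverse transform."""
    rng = random.Random(seed)
    srt = sorted(errors)
    return [srt[min(int(rng.random() * len(srt)), len(srt) - 1)]
            for _ in range(n)]

hist_errors = [-2.0, -1.0, 0.0, 1.0, 2.0]   # made-up historical errors (MW)
point_forecast = [50.0, 52.0, 55.0]         # made-up wind power forecasts (MW)
scenarios = [[f + e for f in point_forecast]
             for e in inverse_transform_samples(hist_errors, 100)]
```

Each scenario is the point forecast shifted by a sampled error; ramp statistics are then extracted from the full scenario set.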
Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qin; Florita, Anthony R; Krishnan, Venkat K
2017-08-31
Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim to reduce the impact of WPRs for power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
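The ramp-extraction step names an optimized swinging door algorithm; a plain (non-optimized) swinging-door segmentation can be sketched as follows, with a made-up series, tolerance, and ramp threshold.

```python
# Basic swinging-door segmentation: within each segment, every point stays
# within eps of the straight line from the segment's start.
def swinging_door(series, eps):
    segs = []
    start = 0
    up, lo = float("inf"), float("-inf")
    for i in range(1, len(series)):
        up = min(up, (series[i] + eps - series[start]) / (i - start))
        lo = max(lo, (series[i] - eps - series[start]) / (i - start))
        if lo > up:                  # the doors have crossed: close the segment
            segs.append((start, i - 1))
            start = i - 1            # new segment pivots on the previous point
            up = (series[i] + eps - series[start]) / (i - start)
            lo = (series[i] - eps - series[start]) / (i - start)
    segs.append((start, len(series) - 1))
    return segs

series = [0, 0, 0, 5, 10, 10, 10]    # made-up wind power trace
segs = swinging_door(series, eps=0.5)
# A ramp is a segment whose slope magnitude exceeds a chosen threshold.
ramps = [(a, b) for a, b in segs
         if abs(series[b] - series[a]) / (b - a) >= 2.0]
```

The middle segment (the steep rise from 0 to 10) is the only one flagged as a ramp; applied to every generated scenario, this yields the distribution of ramp start times and durations.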
Model-free aftershock forecasts constructed from similar sequences in the past
NASA Astrophysics Data System (ADS)
van der Elst, N.; Page, M. T.
2017-12-01
The basic premise behind aftershock forecasting is that sequences in the future will be similar to those in the past. Forecast models typically use empirically tuned parametric distributions to approximate past sequences, and project those distributions into the future to make a forecast. While parametric models do a good job of describing average outcomes, they are not explicitly designed to capture the full range of variability between sequences, and can suffer from over-tuning of the parameters. In particular, parametric forecasts may produce a high rate of "surprises" - sequences that land outside the forecast range. Here we present a non-parametric forecast method that cuts out the parametric "middleman" between training data and forecast. The method is based on finding past sequences that are similar to the target sequence, and evaluating their outcomes. We quantify similarity as the Poisson probability that the observed event count in a past sequence reflects the same underlying intensity as the observed event count in the target sequence. Event counts are defined in terms of differential magnitude relative to the mainshock. The forecast is then constructed from the distribution of past sequences' outcomes, weighted by their similarity. We compare the similarity forecast with the Reasenberg and Jones (RJ95) method, for a set of 2807 global aftershock sequences of M≥6 mainshocks. We implement a sequence-specific RJ95 forecast using a global average prior and Bayesian updating, but do not propagate epistemic uncertainty. The RJ95 forecast is somewhat more precise than the similarity forecast: 90% of observed sequences fall within a factor of two of the median RJ95 forecast value, whereas the fraction is 85% for the similarity forecast. However, the surprise rate is much higher for the RJ95 forecast; 10% of observed sequences fall in the upper 2.5% of the (Poissonian) forecast range. The surprise rate is less than 3% for the similarity forecast.
The similarity forecast may be useful to emergency managers and non-specialists when confidence or expertise in parametric forecasting may be lacking. The method makes over-tuning impossible, and minimizes the rate of surprises. At the least, this forecast constitutes a useful benchmark for more precisely tuned parametric forecasts.
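The similarity-weighting idea can be sketched in a toy form. All counts and outcomes below are invented, and the weight used (the Poisson probability of a past sequence's early count given the target's count as the rate) is a simplification of the paper's similarity measure.

```python
# Toy similarity-weighted forecast: weight each past sequence by the Poisson
# probability of its early aftershock count under the target's count, then
# take the weighted mean of the past outcomes. All numbers are invented.
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

target_count = 5                       # aftershocks observed so far (made up)
past = [(5, 30), (6, 34), (20, 120)]   # (early count, eventual total), made up

w = [poisson_pmf(c, target_count) for c, _ in past]
forecast = sum(wi * tot for wi, (_, tot) in zip(w, past)) / sum(w)
```

Sequences with early counts close to the target's dominate the weighted outcome, while dissimilar sequences (the third one here) contribute almost nothing; in the paper, the whole weighted distribution of outcomes, not just its mean, forms the forecast.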
Detection and forecasting of oyster norovirus outbreaks: recent advances and future perspectives.
Wang, Jiao; Deng, Zhiqiang
2012-09-01
Norovirus is a highly infectious pathogen that is commonly found in oysters growing in fecally contaminated waters. Norovirus outbreaks can cause the closure of oyster harvesting waters and acute gastroenteritis in humans associated with consumption of contaminated raw oysters. Extensive efforts have been made, and considerable progress achieved, in detection and forecasting of oyster norovirus outbreaks over the past decades. The main objective of this paper is to provide a literature review of methods and techniques for detecting and forecasting oyster norovirus outbreaks and thereby to identify future directions for improving their detection and forecasting. It is found that (1) norovirus outbreaks display strong seasonality, with the outbreak peak commonly occurring in December-March in the U.S. and April-May in Europe; (2) norovirus outbreaks are affected by multiple environmental factors, including but not limited to precipitation, temperature, solar radiation, wind, and salinity; (3) various modeling approaches may be employed to forecast norovirus outbreaks, including Bayesian models, regression models, artificial neural networks, and process-based models; and (4) diverse techniques are available for near real-time detection of norovirus outbreaks, including multiplex PCR, seminested PCR, real-time PCR, quantitative PCR, and satellite remote sensing. The findings are important to the management of oyster growing waters and to future investigations into norovirus outbreaks. It is recommended that a combined approach of sensor-assisted real-time monitoring and modeling-based forecasting be utilized for efficient and effective detection and forecasting of norovirus outbreaks caused by consumption of contaminated oysters. Copyright © 2012 Elsevier Ltd. All rights reserved.
An impact analysis of forecasting methods and forecasting parameters on bullwhip effect
NASA Astrophysics Data System (ADS)
Silitonga, R. Y. H.; Jelly, N.
2018-04-01
The bullwhip effect is an increase in the variance of demand fluctuations from the downstream to the upstream end of a supply chain. Forecasting methods and forecasting parameters are recognized as factors that affect the bullwhip phenomenon. To study these factors, simulations can be developed; previous studies have simulated the bullwhip effect in several ways, such as mathematical equation modelling, information control modelling, and computer programs. In this study a spreadsheet program named Bullwhip Explorer was used to simulate the bullwhip effect. Several scenarios were developed to show the change in the bullwhip effect ratio caused by differences in forecasting methods and forecasting parameters. The forecasting methods used were mean demand, moving average, exponential smoothing, demand signalling, and minimum expected mean squared error. The forecasting parameters were the moving-average period, smoothing parameter, signalling factor, and safety stock factor. The simulations showed that decreasing the moving-average period, increasing the smoothing parameter, and increasing the signalling factor can create a bigger bullwhip effect ratio, whereas the safety stock factor had no impact on the bullwhip effect.
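The smoothing-parameter result above can be reproduced with a toy simulation (not the Bullwhip Explorer spreadsheet): a retailer forecasts i.i.d. demand by exponential smoothing and places order-up-to orders, and the bullwhip ratio is Var(orders)/Var(demand).

```python
# Toy base-stock simulation of the bullwhip ratio; demand stream is made up.
import random
from statistics import pvariance

def bullwhip_ratio(alpha, lead=2, n=2000, seed=1):
    rng = random.Random(seed)
    demand = [100.0 + rng.gauss(0.0, 10.0) for _ in range(n)]
    f = demand[0]                     # smoothed demand forecast
    orders, prev_target = [], None
    for d in demand:
        f = alpha * d + (1.0 - alpha) * f     # exponential smoothing
        target = (lead + 1) * f               # order-up-to level
        if prev_target is not None:
            orders.append(max(0.0, d + target - prev_target))
        prev_target = target
    return pvariance(orders) / pvariance(demand)
```

A larger smoothing parameter makes the order-up-to level chase every demand shock, so the ratio grows with `alpha`, consistent with the record's finding.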
The development of rainfall forecasting using Kalman filter
NASA Astrophysics Data System (ADS)
Zulfi, Mohammad; Hasan, Moh.; Dwidja Purnomo, Kosala
2018-04-01
Rainfall forecasting is of great interest for agricultural planning, as rainfall information is useful for making decisions about when to plant certain commodities. In this study, rainfall is forecast by the ARIMA and Kalman filter methods. The Kalman filter method expresses a time series model in linear state-space form to determine future forecasts, using a recursive solution to minimize error. The rainfall data in this research were clustered by K-means clustering, and the Kalman filter method was implemented to model and forecast rainfall in each cluster, with ARIMA(p,d,q) used to construct the state space for the Kalman filter model. This yields four groups of data and one model for each group. In conclusion, the Kalman filter method is better than the ARIMA model for rainfall forecasting in each group, as shown by the Kalman filter's forecast error being smaller than that of the ARIMA model.
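The recursive predict-update cycle can be sketched with the simplest state-space model, a local level (the study builds its state space from an ARIMA(p,d,q) model instead; the noise variances here are assumed for illustration):

```python
# Minimal local-level Kalman filter: recursively update a level estimate and
# use it as the one-step-ahead rainfall forecast. q, r are assumed variances.
def kalman_level_forecasts(obs, q=1.0, r=4.0):
    x, p = obs[0], 1.0           # initial state estimate and its variance
    forecasts = []
    for z in obs[1:]:
        # predict: the level is assumed to persist; uncertainty grows by q
        p = p + q
        forecasts.append(x)
        # update: blend prediction and observation via the Kalman gain
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
    forecasts.append(x)          # forecast for the next, unseen step
    return forecasts

obs = [12.0, 15.0, 11.0, 14.0]   # made-up monthly rainfall for one cluster
preds = kalman_level_forecasts(obs)
```

In the study's setup, one such filter would be fitted per K-means cluster, with the state transition taken from the cluster's ARIMA model rather than this persistence assumption.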
Forecasting electricity usage using univariate time series models
NASA Astrophysics Data System (ADS)
Hock-Eam, Lim; Chee-Yin, Yip
2014-12-01
Electricity is one of the most important energy sources, and a sufficient supply of electricity is vital to support a country's development and growth. Due to changing socio-economic characteristics, increasing competition and deregulation of the electricity supply industry, electricity demand forecasting is even more important than before. It is imperative to evaluate and compare the predictive performance of various forecasting methods, as this provides further insights into the weaknesses and strengths of each method. In the literature, there is mixed evidence on the best forecasting methods for electricity demand. This paper compares the predictive performance of univariate time series models for forecasting electricity demand using monthly data on maximum electricity load in Malaysia from January 2003 to December 2013. Results reveal that the Box-Jenkins method produces the best out-of-sample predictive performance, while the Holt-Winters exponential smoothing method is a good choice for in-sample predictive performance.
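The Holt-Winters method compared above can be sketched in its additive form; the season length, smoothing constants, and data here are illustrative (a monthly series would use m=12).

```python
# Additive Holt-Winters smoothing sketch, returning a one-step-ahead forecast.
def holt_winters_add(x, m, alpha, beta, gamma):
    # initialize level, trend and seasonal indices from the first cycles
    level = sum(x[:m]) / m
    trend = (sum(x[m:2 * m]) - sum(x[:m])) / (m * m)
    season = [x[i] - level for i in range(m)]
    for t in range(m, len(x)):
        prev_level = level
        level = alpha * (x[t] - season[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % m] = gamma * (x[t] - level) + (1 - gamma) * season[t % m]
    return level + trend + season[len(x) % m]
```

On a perfectly repeating seasonal series the method simply reproduces the next seasonal value, which makes it easy to sanity-check.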
Compensated Box-Jenkins transfer function for short term load forecast
DOE Office of Scientific and Technical Information (OSTI.GOV)
Breipohl, A.; Yu, Z.; Lee, F.N.
In past years, the Box-Jenkins ARIMA method and the Box-Jenkins transfer function (BJTF) method have been among the most commonly used methods for short-term electrical load forecasting. But when there is a sudden change in temperature, both methods tend to exhibit larger forecast errors. This paper demonstrates that the load forecasting errors resulting from either the BJ ARIMA model or the BJTF model are not simply white noise, but rather well-patterned noise, and the patterns in the noise can be used to improve the forecasts. Thus a compensated Box-Jenkins transfer function (CBJTF) method is proposed to improve the accuracy of the load prediction. Case studies show about a 14-33% reduction in the root mean square (RMS) errors of the forecasts, depending on the compensation time period as well as the compensation method used.
Naive vs. Sophisticated Methods of Forecasting Public Library Circulations.
ERIC Educational Resources Information Center
Brooks, Terrence A.
1984-01-01
Two sophisticated (autoregressive integrated moving average (ARIMA) and straight-line regression) and two naive (simple average and monthly average) forecasting techniques were used to forecast monthly circulation totals of 34 public libraries. Comparisons of forecasts and actual totals revealed that the ARIMA and monthly average methods had the smallest mean…
NASA Astrophysics Data System (ADS)
Abaza, Mabrouk; Anctil, François; Fortin, Vincent; Perreault, Luc
2017-12-01
Meteorological and hydrological ensemble prediction systems are imperfect. Their outputs could often be improved through the use of a statistical processor, opening up the question of the necessity of using both processors (meteorological and hydrological), only one of them, or none. This experiment compares the predictive distributions from four hydrological ensemble prediction systems (H-EPS) utilising the Ensemble Kalman filter (EnKF) probabilistic sequential data assimilation scheme. They differ in the inclusion or not of the Distribution Based Scaling (DBS) method for post-processing meteorological forecasts and the ensemble Bayesian Model Averaging (ensemble BMA) method for hydrological forecast post-processing. The experiment is implemented on three large watersheds and relies on the combination of two meteorological reforecast products: the 4-member Canadian reforecasts from the Canadian Centre for Meteorological and Environmental Prediction (CCMEP) and the 10-member American reforecasts from the National Oceanic and Atmospheric Administration (NOAA), leading to 14 members at each time step. Results show that all four tested H-EPS lead to resolution and sharpness values that are quite similar, with an advantage to DBS + EnKF. The ensemble BMA is unable to compensate for any bias left in the precipitation ensemble forecasts. On the other hand, it succeeds in calibrating ensemble members that are otherwise under-dispersed. If reliability is preferred over resolution and sharpness, DBS + EnKF + ensemble BMA performs best, making use of both processors in the H-EPS system. Conversely, for enhanced resolution and sharpness, DBS is the preferred method.
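The post-processing step described above combines member forecasts into a single weighted product. As a simplified stand-in for ensemble BMA weight estimation (the real method also fits a predictive distribution around each member), weights can be set proportional to each member's historical skill over a verification period. All numbers below are toy values.

```python
# Sketch: skill-weighted combination of ensemble member forecasts,
# a simplified stand-in for ensemble BMA weighting.

# Past verification period: member forecasts and matching observations.
member_fc = [
    [10.2, 11.8, 13.1, 12.0, 10.5],   # member A
    [ 9.0, 12.5, 14.0, 11.0,  9.8],   # member B
    [10.8, 11.2, 12.6, 12.4, 10.9],   # member C
]
obs = [10.5, 11.5, 13.0, 12.2, 10.7]

def mse(fc, obs):
    return sum((f - o) ** 2 for f, o in zip(fc, obs)) / len(obs)

# Inverse-MSE weights, normalised to sum to one.
raw = [1.0 / mse(m, obs) for m in member_fc]
weights = [w / sum(raw) for w in raw]

# New forecast cycle: weighted average of the members' next-step forecasts.
next_members = [11.0, 10.2, 11.4]
combined = sum(w * f for w, f in zip(weights, next_members))
print("weights:", [round(w, 3) for w in weights])
print("combined forecast:", round(combined, 2))
```

The combined value always lies within the span of the member forecasts, which is why such a combination sharpens but cannot correct a bias shared by all members, consistent with the abstract's finding that ensemble BMA cannot compensate for bias left in the precipitation forecasts.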
Medina, Daniel C.; Findley, Sally E.; Guindo, Boubacar; Doumbia, Seydou
2007-01-01
Background Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with diarrhea, acute respiratory infection, and malaria. With the increasing awareness that the aforementioned infectious diseases impose an enormous burden on developing countries, public health programs therein could benefit from parsimonious general-purpose forecasting methods to enhance infectious disease intervention. Unfortunately, these disease time-series often i) suffer from non-stationarity; ii) exhibit large inter-annual plus seasonal fluctuations; and, iii) require disease-specific tailoring of forecasting methods. Methodology/Principal Findings In this longitudinal retrospective (01/1996–06/2004) investigation, diarrhea, acute respiratory infection of the lower tract, and malaria consultation time-series are fitted with a general-purpose econometric method, namely the multiplicative Holt-Winters, to produce contemporaneous on-line forecasts for the district of Niono, Mali. This method accommodates seasonal, as well as inter-annual, fluctuations and produces reasonably accurate median 2- and 3-month horizon forecasts for these non-stationary time-series, i.e., 92% of the 24 time-series forecasts generated (2 forecast horizons, 3 diseases, and 4 age categories = 24 time-series forecasts) have mean absolute percentage errors circa 25%. 
Conclusions/Significance The multiplicative Holt-Winters forecasting method: i) performs well across diseases with dramatically distinct transmission modes and hence it is a strong general-purpose forecasting method candidate for non-stationary epidemiological time-series; ii) obliquely captures prior non-linear interactions between climate and the aforementioned disease dynamics thus, obviating the need for more complex disease-specific climate-based parametric forecasting methods in the district of Niono; furthermore, iii) readily decomposes time-series into seasonal components thereby potentially assisting with programming of public health interventions, as well as monitoring of disease dynamics modification. Therefore, these forecasts could improve infectious diseases management in the district of Niono, Mali, and elsewhere in the Sahel. PMID:18030322
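The multiplicative Holt-Winters recursion used in this study maintains a level, a trend and a set of seasonal factors, and multiplies level-plus-trend by the seasonal factor to forecast. Below is a minimal sketch with crude initialisation and fixed smoothing constants, run on a synthetic monthly series rather than the Niono consultation data.

```python
# Minimal multiplicative Holt-Winters (level, trend, seasonal factors) on a
# synthetic monthly series with multiplicative seasonality. Illustrative only.
import math, random

random.seed(3)
m = 12  # season length (months)
series = [(100 + 2 * t) * (1 + 0.2 * math.sin(2 * math.pi * t / m))
          * random.uniform(0.98, 1.02) for t in range(96)]

def holt_winters_mult(y, m, alpha=0.3, beta=0.05, gamma=0.2, horizon=3):
    # Crude initialisation from the first two seasons.
    level = sum(y[:m]) / m
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / m ** 2
    season = [y[i] / level for i in range(m)]
    for t in range(m, len(y)):
        s = season[t % m]
        new_level = alpha * y[t] / s + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        season[t % m] = gamma * y[t] / new_level + (1 - gamma) * s
        level = new_level
    return [(level + (h + 1) * trend) * season[(len(y) + h) % m]
            for h in range(horizon)]

train, actual = series[:93], series[93:]
fc = holt_winters_mult(train, m)
mape = 100 * sum(abs(a - f) / a for a, f in zip(actual, fc)) / len(actual)
print(f"3-month-ahead MAPE: {mape:.1f}%")
```

The 2- and 3-month horizons evaluated in the study correspond to the later entries of the returned forecast list; the study's roughly 25% MAPE benchmark applies to far noisier epidemiological series than this clean synthetic example.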
Human-model hybrid Korean air quality forecasting system.
Chang, Lim-Seok; Cho, Ara; Park, Hyunju; Nam, Kipyo; Kim, Deokrae; Hong, Ji-Hyoung; Song, Chang-Keun
2016-09-01
The Korean national air quality forecasting system, consisting of the Weather Research and Forecasting model, the Sparse Matrix Operator Kernel Emissions model, and the Community Multiscale Air Quality model (CMAQ), commenced operation on August 31, 2013, with particulate matter (PM) and ozone as target pollutants. Factors contributing to PM forecasting accuracy include the CMAQ inputs of meteorological fields and emissions, the forecasters' capacity, and CMAQ's inherent limits. Four numerical experiments were conducted, covering two global meteorological inputs from the Global Forecast System (GFS) and the Unified Model (UM), two emission inventories, the Model Intercomparison Study Asia (MICS-Asia) and the Intercontinental Chemical Transport Experiment (INTEX-B) for Northeast Asia combined with the Clean Air Policy Support System (CAPSS) for South Korea, and data assimilation of the Monitoring Atmospheric Composition and Climate (MACC) product. Significant PM underpredictions were found with both emission inventories for PM mass and major components (sulfate and organic carbon). CMAQ predicts PM2.5 much better than PM10 (NMB of PM2.5: -20 to -25%; PM10: -43 to -47%). Forecasters' errors usually occurred on the day after a high-PM event: once CMAQ fails to predict a high-PM event, forecasters are likely to dismiss the model predictions on the next day, which then turn out to be true. The best combination of CMAQ inputs is the UM global meteorological field with the MICS-Asia and CAPSS 2010 emissions, giving an NMB of -12.3%, an RMSE of 16.6 μg/m3 and an R2 of 0.68. Using MACC data as initial and boundary conditions would improve CMAQ's performance, especially where coarse emissions are undefined. A variety of methods, such as ensembles and data assimilation, are being considered to further improve the accuracy of air quality forecasting, especially for high-PM events, so that it becomes comparable to that for all cases.
The growing utilization of the air quality forecast has led the public to strongly demand improvements in the accuracy of the national forecasts. In this study, we investigated the problems in the current forecasting system as well as various alternatives for solving them. Such efforts to improve forecast accuracy are expected to contribute to the protection of public health by increasing the usefulness of the forecast system.
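The evaluation scores quoted in this abstract (NMB, RMSE, R2) can be computed as follows; the model and observation values here are toy numbers, not the study's CMAQ output.

```python
# Sketch: normalised mean bias (NMB), RMSE and R^2 for paired
# model/observation series (toy PM10 values in ug/m3).
import math

model = [32.0, 41.5, 55.0, 28.0, 60.5, 47.0]
obs   = [38.0, 45.0, 61.0, 33.0, 70.0, 52.0]

nmb = 100 * sum(m - o for m, o in zip(model, obs)) / sum(obs)
rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))
mean_m, mean_o = sum(model) / len(model), sum(obs) / len(obs)
cov = sum((m - mean_m) * (o - mean_o) for m, o in zip(model, obs))
r2 = (cov / math.sqrt(sum((m - mean_m) ** 2 for m in model)
                      * sum((o - mean_o) ** 2 for o in obs))) ** 2
print(f"NMB: {nmb:.1f}%  RMSE: {rmse:.1f} ug/m3  R^2: {r2:.2f}")
```

A negative NMB, as in this toy example, signals the systematic underprediction the study reports for PM mass.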
Hybrid Forecasting of Daily River Discharges Considering Autoregressive Heteroscedasticity
NASA Astrophysics Data System (ADS)
Szolgayová, Elena Peksová; Danačová, Michaela; Komorniková, Magda; Szolgay, Ján
2017-06-01
It is widely acknowledged in the hydrological and meteorological communities that there is a continuing need to improve the quality of quantitative rainfall and river flow forecasts. A hybrid (combined deterministic-stochastic) modelling approach is proposed here that combines the advantages of modelling the system dynamics with a deterministic model and the deterministic forecasting error series with a data-driven model, run in parallel. Since the processes to be modelled are generally nonlinear and the model error series may exhibit nonstationarity and heteroscedasticity, GARCH-type nonlinear time series models are considered. The fitting, forecasting and simulation performance of such models has to be explored on a case-by-case basis. The goal of this paper is to test and develop an appropriate methodology for model fitting and forecasting, applicable to daily river discharge forecast error data, from the GARCH family of time series models. We concentrated on verifying whether a GARCH-type model is suitable for modelling and forecasting a hydrological model error time series on the Hron and Morava Rivers in Slovakia. For this purpose we verified the presence of heteroscedasticity in the simulation error series of the KLN multilinear flow routing model; we then fitted GARCH-type models to the data and compared their fit with that of an ARMA-type model. We produced one-step-ahead forecasts from the fitted models and again compared the models' performance.
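The GARCH(1,1) conditional-variance recursion at the heart of such models is short enough to sketch directly. The parameters below are fixed by hand, whereas a real application to a discharge error series would estimate omega, alpha and beta by maximum likelihood.

```python
# Sketch of the GARCH(1,1) recursion for a heteroscedastic error series:
#   sigma2[t] = omega + alpha * e[t-1]^2 + beta * sigma2[t-1]
# Parameters are assumed, not estimated.
import random

random.seed(11)
omega, alpha, beta = 0.1, 0.1, 0.85    # alpha + beta < 1 => stationary variance

# Simulate a heteroscedastic error series from the recursion itself.
errors, sigma2 = [], [omega / (1 - alpha - beta)]   # start at unconditional variance
for _ in range(500):
    e = random.gauss(0, sigma2[-1] ** 0.5)
    errors.append(e)
    sigma2.append(omega + alpha * e ** 2 + beta * sigma2[-1])

# One-step-ahead volatility forecast for the next model error.
next_sigma = sigma2[-1] ** 0.5
print(f"one-step-ahead error std forecast: {next_sigma:.3f}")
```

The forecast variance rises after large errors and decays back toward the unconditional level, which is the clustering behaviour an ARMA-type model with constant variance cannot represent.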
An experimental system for flood risk forecasting at global scale
NASA Astrophysics Data System (ADS)
Alfieri, L.; Dottori, F.; Kalas, M.; Lorini, V.; Bianchi, A.; Hirpa, F. A.; Feyen, L.; Salamon, P.
2016-12-01
Global flood forecasting and monitoring systems are nowadays a reality and are being applied by an increasing range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk based forecasts, combining streamflow estimations with expected inundated areas and flood impacts. To this end, we have developed an experimental procedure for near-real time flood mapping and impact assessment based on the daily forecasts issued by the Global Flood Awareness System (GloFAS). The methodology translates GloFAS streamflow forecasts into event-based flood hazard maps based on the predicted flow magnitude and the forecast lead time and a database of flood hazard maps with global coverage. Flood hazard maps are then combined with exposure and vulnerability information to derive flood risk. Impacts of the forecasted flood events are evaluated in terms of flood prone areas, potential economic damage, and affected population, infrastructures and cities. To further increase the reliability of the proposed methodology we integrated model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification of impact forecasts. The preliminary tests provided good results and showed the potential of the developed real-time operational procedure in helping emergency response and management. In particular, the link with social media is crucial for improving the accuracy of impact predictions.
Löwe, Roland; Mikkelsen, Peter Steen; Rasmussen, Michael R; Madsen, Henrik
2013-01-01
Merging of radar rainfall data with rain gauge measurements is a common approach to overcome problems in deriving rain intensities from radar measurements. We extend an existing approach for adjustment of C-band radar data using state-space models and use the resulting rainfall intensities as input for forecasting outflow from two catchments in the Copenhagen area. Stochastic grey-box models are applied to create the runoff forecasts, providing us with not only a point forecast but also a quantification of the forecast uncertainty. Evaluating the results, we can show that using the adjusted radar data improves runoff forecasts compared with using the original radar data and that rain gauge measurements as forecast input are also outperformed. Combining the data merging approach with short-term rainfall forecasting algorithms may result in further improved runoff forecasts that can be used in real time control.
Energy Consumption Forecasting Using Semantic-Based Genetic Programming with Local Search Optimizer.
Castelli, Mauro; Trujillo, Leonardo; Vanneschi, Leonardo
2015-01-01
Energy consumption forecasting (ECF) is an important policy issue in today's economies, and an accurate ECF has great benefits for electric utilities, since both negative and positive errors lead to increased operating costs. This paper proposes a semantic-based genetic programming framework to address the ECF problem. In particular, we propose a system that finds (quasi-)perfect solutions with high probability and that generates models able to produce near-optimal predictions also on unseen data. The framework blends a recently developed version of genetic programming that integrates semantic genetic operators with a local search method. The main idea in combining semantic genetic programming and a local searcher is to couple the exploration ability of the former with the exploitation ability of the latter. Experimental results confirm the suitability of the proposed method for predicting energy consumption: the system produces a lower error with respect to the existing state-of-the-art techniques used on the same dataset. More importantly, this case study shows that including a local searcher in the geometric semantic genetic programming system can speed up the search process and result in fitter models that produce accurate forecasts also on unseen data.
School District Enrollment Projections: A Comparison of Three Methods.
ERIC Educational Resources Information Center
Pettibone, Timothy J.; Bushan, Latha
This study assesses three methods of forecasting school enrollments: the cohort-survival method (grade progression), the statistical forecasting procedure developed by the Statistical Analysis System (SAS) Institute, and a simple ratio computation. The three methods were used to forecast school enrollments for kindergarten through grade 12 in a…
NASA Astrophysics Data System (ADS)
Ito, Shigenobu; Yukita, Kazuto; Goto, Yasuyuki; Ichiyanagi, Katsuhiro; Nakano, Hiroyuki
With the development of industry in recent years, dependence on electric energy has grown year by year, so a reliable electric power supply is needed. However, storing a huge amount of electric energy is very difficult, and the balance between demand and supply, which changes hour by hour, must be maintained. Consequently, to supply high-quality, highly dependable electric power economically and efficiently, the movement of electric power demand must be forecast carefully in advance, and the supply and demand management plan should be based on that forecast. Load forecasting is thus an important job in the demand management of electric power companies. To improve forecasting accuracy, methods using fuzzy logic, neural networks and regression models have been suggested, and their accuracy is already high. But to manage electric power more economically and with higher accuracy, a new forecasting method with still higher accuracy is needed. In this paper, to improve on the accuracy of the former methods, a daily peak load forecasting method is suggested that uses the weather distribution of highest and lowest temperatures and comparisons with data from nearby dates.
NASA Astrophysics Data System (ADS)
Manikumari, N.; Murugappan, A.; Vinodhini, G.
2017-07-01
Time series forecasting has gained remarkable interest from researchers in the last few decades, and neural-network-based time series forecasting has been employed in various application areas. Reference evapotranspiration (ETO) is one of the most important components of the hydrologic cycle, and its precise assessment is vital in water balance and crop yield estimation and in water resources system design and management. This work aimed at achieving an accurate time series forecast of ETO using a combination of neural network approaches, and was carried out using data collected in the command area of the VEERANAM Tank in India during the period 2004-2014. The neural network (NN) models were combined by ensemble learning to improve the accuracy of forecasting daily ETO (for the year 2015), using bagged neural network (Bagged-NN) and boosted neural network (Boosted-NN) ensembles. The Bagged-NN and Boosted-NN ensemble models proved better than individual NN models in terms of accuracy, and among the ensemble models, Boosted-NN reduced the forecasting errors compared with Bagged-NN and the individual NNs. The regression coefficient, mean absolute deviation, mean absolute percentage error and root mean square error also confirm that Boosted-NN leads to improved ETO forecasting performance.
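Bagging, the simpler of the two ensemble schemes above, trains base learners on bootstrap resamples and averages their predictions. In this sketch, simple linear fits stand in for the neural networks used in the study, and the data are invented, not the VEERANAM Tank records.

```python
# Sketch of bagging: average the forecasts of learners trained on
# bootstrap resamples. Linear fits stand in for neural networks.
import random

random.seed(5)
# Toy data: an ETO-like target depending linearly on a temperature-like input.
x = [20 + i * 0.1 for i in range(100)]
y = [0.3 * xi - 1.0 + random.gauss(0, 0.3) for xi in x]

def fit_linear(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((a - mx) * (c - my) for a, c in zip(xs, ys))
         / sum((a - mx) ** 2 for a in xs))
    return b, my - b * mx           # slope, intercept

def bagged_predict(x_new, n_models=25):
    preds = []
    for _ in range(n_models):
        idx = [random.randrange(len(x)) for _ in range(len(x))]  # bootstrap
        b, a = fit_linear([x[i] for i in idx], [y[i] for i in idx])
        preds.append(a + b * x_new)
    return sum(preds) / len(preds)

print(f"bagged forecast at x=31.0: {bagged_predict(31.0):.2f}")
```

Boosting differs in that each successive learner is trained to correct the residuals of the ensemble so far, rather than on an independent resample, which is why it tends to reduce error further, as the study found.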
Siedlecki, Samantha A.; Kaplan, Isaac C.; Hermann, Albert J.; Nguyen, Thanh Tam; Bond, Nicholas A.; Newton, Jan A.; Williams, Gregory D.; Peterson, William T.; Alin, Simone R.; Feely, Richard A.
2016-01-01
Resource managers at the state, federal, and tribal levels make decisions on a weekly to quarterly basis, and fishers operate on a similar timeframe. To determine the potential of a support tool for these efforts, a seasonal forecast system is experimented with here. JISAO’s Seasonal Coastal Ocean Prediction of the Ecosystem (J-SCOPE) features dynamical downscaling of regional ocean conditions in Washington and Oregon waters using a combination of a high-resolution regional model with biogeochemistry and forecasts from NOAA’s Climate Forecast System (CFS). Model performance and predictability were examined for sea surface temperature (SST), bottom temperature, bottom oxygen, pH, and aragonite saturation state through model hindcasts, reforecast, and forecast comparisons with observations. Results indicate J-SCOPE forecasts have measurable skill on seasonal timescales. Experiments suggest that seasonal forecasting of ocean conditions important for fisheries is possible with the right combination of components. Those components include regional predictability on seasonal timescales of the physical environment from a large-scale model, a high-resolution regional model with biogeochemistry that simulates seasonal conditions in hindcasts, a relationship with local stakeholders, and a real-time observational network. Multiple efforts and approaches in different regions would advance knowledge to provide additional tools to fishers and other stakeholders. PMID:27273473
Refining value-at-risk estimates using a Bayesian Markov-switching GJR-GARCH copula-EVT model.
Sampid, Marius Galabe; Hasim, Haslifah M; Dai, Hongsheng
2018-01-01
In this paper, we propose a model for forecasting Value-at-Risk (VaR) using a Bayesian Markov-switching GJR-GARCH(1,1) model with skewed Student's-t innovation, copula functions and extreme value theory. A Bayesian Markov-switching GJR-GARCH(1,1) model that identifies non-constant volatility over time and allows the GARCH parameters to vary over time following a Markov process, is combined with copula functions and EVT to formulate the Bayesian Markov-switching GJR-GARCH(1,1) copula-EVT VaR model, which is then used to forecast the level of risk on financial asset returns. We further propose a new method for threshold selection in EVT analysis, which we term the hybrid method. Empirical and back-testing results show that the proposed VaR models capture VaR reasonably well in periods of calm and in periods of crisis.
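For orientation, the quantity being forecast here can be shown with the simplest baseline, historical-simulation VaR, i.e. an empirical quantile of past returns. The paper's Bayesian Markov-switching GJR-GARCH copula-EVT model replaces this static quantile with a time-varying, tail-aware estimate; the returns below are simulated, not market data.

```python
# Baseline sketch: historical-simulation Value-at-Risk from an empirical
# quantile of past returns (toy simulated daily returns).
import random

random.seed(9)
returns = [random.gauss(0.0005, 0.012) for _ in range(1000)]

def historical_var(returns, level=0.99):
    """Loss threshold exceeded on roughly (1 - level) of past days."""
    ordered = sorted(returns)
    k = int((1 - level) * len(ordered))
    return -ordered[k]              # VaR reported as a positive loss

var99 = historical_var(returns)
print(f"99% one-day VaR: {var99:.4f}")
```

Back-testing, as in the paper, then checks whether realised losses exceed the reported VaR at about the nominal rate in both calm and crisis periods, which is where static baselines typically fail and regime-switching volatility models earn their keep.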
NASA Astrophysics Data System (ADS)
Susanti, D.; Hartini, E.; Permana, A.
2017-01-01
Growing competition among companies in Indonesia means that every company needs proper planning in order to win against its competitors. One way to design such a plan is to forecast car sales for the next few periods, so that the inventory of cars to be sold is proportional to the number of cars needed. One method that can be used to obtain such forecasts is Adaptive Spline Threshold Autoregression (ASTAR). This paper therefore focuses on the use of the ASTAR method to forecast the volume of car sales at PT. Srikandi Diamond Motors using time series data. In this study, the forecasts produced by the ASTAR method were approximately correct.
Dai, Wensheng; Wu, Jui-Yu; Lu, Chi-Jie
2014-01-01
Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales, since an IT chain store has many branches. Integrating a feature extraction method with a prediction tool such as support vector regression (SVR) is a useful way of constructing an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique that has been widely applied to various forecasting problems, but up to now only the basic ICA method (i.e., the temporal ICA model) has been applied to the sales forecasting problem. In this paper, we utilize three different ICA methods, spatial ICA (sICA), temporal ICA (tICA) and spatiotemporal ICA (stICA), to extract features from the sales data and compare their performance in sales forecasting for an IT chain store. Experimental results on real sales data show that the sales forecasting scheme integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data, and the extracted features can improve the prediction performance of SVR for sales forecasting.
NASA Astrophysics Data System (ADS)
Guoxing, Zheng; Minghu, Jiang; Hongliang, Gong; Nannan, Zhang; Jianguang, Wei
2018-02-01
Following the basic principles of combining strata series and the requirements of the same-well injection-production technique, an optimisation design method for the technique's injection-production circulation system is given. Based on an oil-water two-phase model under an arbitrary well network, a dynamic forecast method for reservoirs applying same-well injection-production is established, taking into account the requirements and capacity of the technique; sample wells are selected to carry out forecast evaluation and analysis of the reservoir application's effect. Results show that the single-test-well composite water cut decreases by 4.7% and the test-well-group composite water cut decreases by 1.56% with the ground water injection rate basically unchanged. The method provides theoretical support for demonstrating the improvement in reservoir development achieved by the same-well injection-production technique and for further tests.
2017-07-01
forecasts and observations on a common grid, which enables the application of a number of different spatial verification methods that reveal various...forecasts of continuous meteorological variables using categorical and object-based methods. White Sands Missile Range (NM): Army Research Laboratory (US)... Research version of the Weather Research and Forecasting Model adapted for generating short-range nowcasts and gridded observations produced by the
A scoping review of malaria forecasting: past work and future directions
Zinszer, Kate; Verma, Aman D; Charland, Katia; Brewer, Timothy F; Brownstein, John S; Sun, Zhuoyu; Buckeridge, David L
2012-01-01
Objectives There is a growing body of literature on malaria forecasting methods and the objective of our review is to identify and assess methods, including predictors, used to forecast malaria. Design Scoping review. Two independent reviewers searched information sources, assessed studies for inclusion and extracted data from each study. Information sources Search strategies were developed and the following databases were searched: CAB Abstracts, EMBASE, Global Health, MEDLINE, ProQuest Dissertations & Theses and Web of Science. Key journals and websites were also manually searched. Eligibility criteria for included studies We included studies that forecasted incidence, prevalence or epidemics of malaria over time. A description of the forecasting model and an assessment of the forecast accuracy of the model were requirements for inclusion. Studies were restricted to human populations and to autochthonous transmission settings. Results We identified 29 different studies that met our inclusion criteria for this review. The forecasting approaches included statistical modelling, mathematical modelling and machine learning methods. Climate-related predictors were used consistently in forecasting models, with the most common predictors being rainfall, relative humidity, temperature and the normalised difference vegetation index. Model evaluation was typically based on a reserved portion of data and accuracy was measured in a variety of ways including mean-squared error and correlation coefficients. We could not compare the forecast accuracy of models from the different studies as the evaluation measures differed across the studies. 
Conclusions Applying different forecasting methods to the same data, exploring the predictive ability of non-environmental variables, including transmission reducing interventions and using common forecast accuracy measures will allow malaria researchers to compare and improve models and methods, which should improve the quality of malaria forecasting. PMID:23180505
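The two accuracy measures named in the review, mean squared error and correlation, can rank the same pair of models differently, which is why the authors call for common measures. In this toy illustration with hypothetical case counts, one forecast correlates perfectly with the observations yet is badly biased.

```python
# Sketch: MSE and Pearson correlation can disagree on which forecast is better.
import math

obs     = [10, 20, 35, 50, 30, 15]
model_a = [12, 18, 33, 52, 28, 17]    # close to obs, slight scatter
model_b = [30, 40, 55, 70, 50, 35]    # perfectly correlated but biased +20

def mse(fc):
    return sum((f - o) ** 2 for f, o in zip(fc, obs)) / len(obs)

def corr(fc):
    mf, mo = sum(fc) / len(fc), sum(obs) / len(obs)
    num = sum((f - mf) * (o - mo) for f, o in zip(fc, obs))
    den = math.sqrt(sum((f - mf) ** 2 for f in fc)
                    * sum((o - mo) ** 2 for o in obs))
    return num / den

print(f"A: MSE={mse(model_a):.1f}, r={corr(model_a):.3f}")
print(f"B: MSE={mse(model_b):.1f}, r={corr(model_b):.3f}")
```

Model B wins on correlation but loses badly on MSE, so a review comparing studies that each report only one of these measures cannot rank the underlying models.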
Predictability of Bristol Bay, Alaska, sockeye salmon returns one to four years in the future
Adkison, Milo D.; Peterson, R.M.
2000-01-01
Historically, forecast error for returns of sockeye salmon Oncorhynchus nerka to Bristol Bay, Alaska, has been large. Using cross-validation forecast error as our criterion, we selected forecast models for each of the nine principal Bristol Bay drainages. Competing forecast models included stock-recruitment relationships, environmental variables, prior returns of siblings, or combinations of these predictors. For most stocks, we found prior returns of siblings to be the best single predictor of returns; however, forecast accuracy was low even when multiple predictors were considered. For a typical drainage, an 80% confidence interval ranged from one half to double the point forecast. These confidence intervals appeared to be appropriately wide.
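The cross-validation criterion described above can be illustrated with a minimal sketch: leave-one-out forecast error for competing linear models, with synthetic data standing in for the Bristol Bay return records (the predictor values and coefficients are hypothetical, not the paper's):

```python
import numpy as np

def loocv_rmse(X, y):
    """Leave-one-out cross-validation RMSE for an ordinary least squares model."""
    n = len(y)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        # fit on all years except year i, then forecast the held-out year
        coef, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        errs.append(y[i] - X[i] @ coef)
    return float(np.sqrt(np.mean(np.square(errs))))

rng = np.random.default_rng(0)
n = 30
sibling = rng.gamma(2.0, 1.0, n)                  # prior returns of siblings (hypothetical)
env = rng.normal(0.0, 1.0, n)                     # environmental index (hypothetical)
returns = 1.5 * sibling + rng.normal(0.0, 0.5, n) # returns driven mainly by siblings

ones = np.ones((n, 1))
models = {
    "siblings": np.column_stack([ones, sibling]),
    "environment": np.column_stack([ones, env]),
    "both": np.column_stack([ones, sibling, env]),
}
scores = {name: loocv_rmse(X, returns) for name, X in models.items()}
best = min(scores, key=scores.get)                # model with lowest CV forecast error
```

With the synthetic data generated from sibling returns, the cross-validation criterion prefers the sibling-based predictors, mirroring the paper's finding.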
Inference and forecast of H7N9 influenza in China, 2013 to 2015.
Li, Ruiyun; Bai, Yuqi; Heaney, Alex; Kandula, Sasikiran; Cai, Jun; Zhao, Xuyi; Xu, Bing; Shaman, Jeffrey
2017-02-16
The recent emergence of A(H7N9) avian influenza poses a significant challenge to public health in China and around the world; however, understanding of the transmission dynamics and progression of influenza A(H7N9) infection in domestic poultry, as well as spillover transmission to humans, remains limited. Here, we develop a mathematical model-Bayesian inference system which combines a simple epidemic model and data assimilation method, and use it in conjunction with data on observed human influenza A(H7N9) cases from 19 February 2013 to 19 September 2015 to estimate key epidemiological parameters and to forecast infection in both poultry and humans. Our findings indicate a high outbreak attack rate of 33% among poultry but a low rate of chicken-to-human spillover transmission. In addition, we generated accurate forecasts of the peak timing and magnitude of human influenza A(H7N9) cases. This work demonstrates that transmission dynamics within an avian reservoir can be estimated and that real-time forecast of spillover avian influenza in humans is possible. This article is copyright of The Authors, 2017.
NASA Astrophysics Data System (ADS)
Pan, Xiaoduo; Li, Xin; Cheng, Guodong
2017-04-01
Traditionally, ground-based, in situ observations, remote sensing, and regional climate modeling, individually, cannot provide the high-quality precipitation data required for hydrological prediction, especially over complex terrain. Data assimilation techniques are often used to assimilate ground observations and remote sensing products into models for dynamic downscaling. In this study, the Weather Research and Forecasting (WRF) model was used to assimilate two satellite precipitation products (TRMM 3B42 and FY-2D) using the 4D-Var data assimilation method. The results show that the assimilation of remote sensing precipitation products can improve the initial WRF fields of humidity and temperature, thereby improving precipitation forecasting and decreasing the spin-up time. Hence, assimilating TRMM and FY-2D remote sensing precipitation products using WRF 4D-Var can be viewed as a positive step toward improving the accuracy and lead time of numerical weather prediction models, particularly for short-term weather forecasting. Future work is proposed to assimilate a suite of remote sensing data, e.g., the combination of precipitation and soil moisture data, into a WRF model to improve 7-8 day forecasts of precipitation and other atmospheric variables.
Validation of reactive gases and aerosols in the MACC global analysis and forecast system
NASA Astrophysics Data System (ADS)
Eskes, H.; Huijnen, V.; Arola, A.; Benedictow, A.; Blechschmidt, A.-M.; Botek, E.; Boucher, O.; Bouarar, I.; Chabrillat, S.; Cuevas, E.; Engelen, R.; Flentje, H.; Gaudel, A.; Griesfeller, J.; Jones, L.; Kapsomenakis, J.; Katragkou, E.; Kinne, S.; Langerock, B.; Razinger, M.; Richter, A.; Schultz, M.; Schulz, M.; Sudarchikova, N.; Thouret, V.; Vrekoussis, M.; Wagner, A.; Zerefos, C.
2015-02-01
The European MACC (Monitoring Atmospheric Composition and Climate) project is preparing the operational Copernicus Atmosphere Monitoring Service (CAMS), one of the services of the European Copernicus Programme on Earth observation and environmental services. MACC uses data assimilation to combine in-situ and remote sensing observations with global and regional models of atmospheric reactive gases, aerosols and greenhouse gases, and is based on the Integrated Forecast System of the ECMWF. The global component of the MACC service has a dedicated validation activity to document the quality of the atmospheric composition products. In this paper we discuss the approach to validation that has been developed over the past three years. Topics discussed are the validation requirements, the operational aspects, the measurement data sets used, the structure of the validation reports, the models and assimilation systems validated, the procedure to introduce new upgrades, and the scoring methods. One specific target of the MACC system concerns forecasting special events with high pollution concentrations. Such events receive extra attention in the validation process. Finally, a summary is provided of the results from the validation of the latest set of daily global analysis and forecast products from the MACC system reported in November 2014.
NASA Astrophysics Data System (ADS)
Courdent, Vianney; Grum, Morten; Munk-Nielsen, Thomas; Mikkelsen, Peter S.
2017-05-01
Precipitation is the cause of major perturbation to the flow in urban drainage and wastewater systems. Flow forecasts, generated by coupling rainfall predictions with a hydrologic runoff model, can potentially be used to optimize the operation of integrated urban drainage-wastewater systems (IUDWSs) during both wet and dry weather periods. Numerical weather prediction (NWP) models have significantly improved in recent years, having increased their spatial and temporal resolution. Finer resolution NWP are suitable for urban-catchment-scale applications, providing longer lead time than radar extrapolation. However, forecasts are inevitably uncertain, and fine resolution is especially challenging for NWP. This uncertainty is commonly addressed in meteorology with ensemble prediction systems (EPSs). Handling uncertainty is challenging for decision makers and hence tools are necessary to provide insight on ensemble forecast usage and to support the rationality of decisions (i.e. forecasts are uncertain and therefore errors will be made; decision makers need tools to justify their choices, demonstrating that these choices are beneficial in the long run). This study presents an economic framework to support the decision-making process by providing information on when acting on the forecast is beneficial and how to handle the EPS. The relative economic value (REV) approach associates economic values with the potential outcomes and determines the preferential use of the EPS forecast. The envelope curve of the REV diagram combines the results from each probability forecast to provide the highest relative economic value for a given gain-loss ratio. This approach is traditionally used at larger scales to assess mitigation measures for adverse events (i.e. the actions are taken when events are forecast). The specificity of this study is to optimize the energy consumption in IUDWS during low-flow periods by exploiting the electrical smart grid market (i.e. 
the actions are taken when no events are forecast). Furthermore, the results demonstrate the benefit of NWP neighbourhood post-processing methods to enhance the forecast skill and increase the range of beneficial uses.
NASA Astrophysics Data System (ADS)
Pillosu, F. M.; Jurlina, T.; Baugh, C.; Tsonevsky, I.; Hewson, T.; Prates, F.; Pappenberger, F.; Prudhomme, C.
2017-12-01
During hurricane Harvey the greater east Texas area was affected by extensive flash flooding. The localised nature of these floods meant they were too small for conventional large-scale flood forecasting systems to capture. We are testing the use of two real-time forecast products from the European Centre for Medium-range Weather Forecasts (ECMWF) in combination with local vulnerability information to provide flash flood forecasting tools at the medium range (up to 7 days ahead). The meteorological forecasts are the total precipitation extreme forecast index (EFI), a measure of how the ensemble forecast probability distribution differs from the model-climate distribution for the chosen location, time of year and forecast lead time; and the shift of tails (SOT), which complements the EFI by quantifying how extreme an event could potentially be. Both products give the likelihood of flash-flood-generating precipitation. For hurricane Harvey, 3-day EFI and SOT products for the period 26th - 29th August 2017 were used, generated from the twice-daily, 18 km, 51-ensemble-member ECMWF Integrated Forecast System. After regridding to 1 km resolution the forecasts were combined with vulnerable area data to produce a flash flood hazard risk area. The vulnerability data were floodplains (EU Joint Research Centre), road networks (Texas Department of Transport) and urban areas (Census Bureau geographic database), together reflecting the susceptibility of the landscape to flash floods. The flash flood hazard risk area forecasts were verified using a traditional approach against observed National Weather Service flash flood reports; a total of 153 flash floods were reported in that period. Forecasts performed best for SOT = 5 (hit ratio = 65%, false alarm ratio = 44%) and EFI = 0.7 (hit ratio = 74%, false alarm ratio = 45%) at 72 h lead time. 
By including the vulnerable areas data, our verification results improved by 5-15%, demonstrating the value of vulnerability information within natural hazard forecasts. This research shows that flash flooding from hurricane Harvey was predictable up to 4 days ahead and that filtering the forecasts to vulnerable areas provides a more focused guidance to civil protection agencies planning their emergency response.
A multiscale forecasting method for power plant fleet management
NASA Astrophysics Data System (ADS)
Chen, Hongmei
In recent years the electric power industry has been challenged by a high level of uncertainty and volatility brought on by deregulation and globalization. A power producer must minimize life cycle cost while meeting stringent safety and regulatory requirements and fulfilling customer demand for high reliability. In response, to achieve true system excellence, a more sophisticated system-level decision-making process, supported by a more accurate forecasting system, has been created to manage diverse and often widely dispersed generation units as a single, easily scaled and deployed fleet and thereby fully utilize the critical assets of a power producer. The process takes into account the time horizon for each of the major decision actions taken in a power plant and develops methods for sharing information between them. These decisions are highly interrelated, and no optimal operation can be achieved without sharing information across the overall process. The process includes a forecasting system to provide information for planning under uncertainty. A new forecasting method is proposed, which utilizes a synergy of several modeling techniques properly combined at the different time-scales of the forecasting objects. It can not only take advantage of the abundant historical data but also take into account the impact of pertinent driving forces from the external business environment to achieve more accurate forecasting results. Then block bootstrap is utilized to measure the bias in the estimate of the expected life cycle cost that will actually be needed to drive the business of a power plant in the long run. Finally, scenario analysis is used to provide a composite picture of future developments for decision making or strategic planning. The decision-making process is applied to a typical power producer chosen to represent challenging customer demand during high-demand periods. 
The process enhances system excellence by providing more accurate market information, evaluating the impact of external business environment, and considering cross-scale interactions between decision actions. Along with this process, system operation strategies, maintenance schedules, and capacity expansion plans that guide the operation of the power plant are optimally identified, and the total life cycle costs are estimated.
A stochastic post-processing method for solar irradiance forecasts derived from NWPs models
NASA Astrophysics Data System (ADS)
Lara-Fanego, V.; Pozo-Vazquez, D.; Ruiz-Arias, J. A.; Santos-Alamillos, F. J.; Tovar-Pescador, J.
2010-09-01
Solar irradiance forecasting is an important area of research for the future of solar-based renewable energy systems. Numerical Weather Prediction models (NWPs) have proved to be a valuable tool for solar irradiance forecasting with lead times up to a few days. Nevertheless, these models show low skill in forecasting the solar irradiance under cloudy conditions. Additionally, climatological (seasonally averaged) aerosol loadings are usually considered in these models, leading to considerable errors in the Direct Normal Irradiance (DNI) forecasts during high aerosol load conditions. In this work we propose a post-processing method for the Global Horizontal Irradiance (GHI) and DNI forecasts derived from NWPs. In particular, the method is based on the use of Autoregressive Moving Average with External Explanatory Variables (ARMAX) stochastic models. These models are applied to the residuals of the NWP forecasts and use as external variables the measured cloud fraction and aerosol loading of the day previous to the forecast. The method is evaluated on a set of one-month-long, three-day-ahead forecasts of the GHI and DNI, obtained with the WRF mesoscale atmospheric model, for several locations in Andalusia (Southern Spain). The cloud fraction is derived from MSG satellite estimates and the aerosol loading from MODIS platform estimates. Both sources of information are readily available at the time of the forecast. Results showed a considerable improvement in the forecasting skill of the WRF model using the proposed post-processing method. In particular, the relative improvement (in terms of the RMSE) for the DNI during summer is about 20%. A similar value is obtained for the GHI during the winter.
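A full ARMAX fit requires an iterative estimator; the sketch below uses the simpler ARX special case (the moving-average term is omitted), fitted by least squares to synthetic NWP irradiance residuals, with a hypothetical prior-day cloud fraction as the external variable:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 200
cloud = rng.uniform(0.0, 1.0, T)  # prior-day measured cloud fraction (hypothetical)

# synthetic NWP irradiance forecast error: persistent, and larger after cloudy days
resid = np.zeros(T)
for t in range(1, T):
    resid[t] = 0.6 * resid[t - 1] + 80.0 * cloud[t - 1] + rng.normal(0.0, 20.0)

# fit r_t = a + b * r_{t-1} + c * cloud_{t-1} by ordinary least squares
# (an ARX(1) model; a full ARMAX model would add a moving-average term)
X = np.column_stack([np.ones(T - 1), resid[:-1], cloud[:-1]])
coef, *_ = np.linalg.lstsq(X, resid[1:], rcond=None)
pred = X @ coef

corrected = resid[1:] - pred               # residual left after post-processing
rmse_raw = float(np.sqrt(np.mean(resid[1:] ** 2)))
rmse_post = float(np.sqrt(np.mean(corrected ** 2)))
```

On this synthetic series the post-processing removes most of the systematic, cloud-driven error, which is the mechanism the paper exploits.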
New Models for Forecasting Enrollments: Fuzzy Time Series and Neural Network Approaches.
ERIC Educational Resources Information Center
Song, Qiang; Chissom, Brad S.
Since university enrollment forecasting is very important, many different methods and models have been proposed by researchers. Two new methods for enrollment forecasting are introduced: (1) the fuzzy time series model; and (2) the artificial neural networks model. Fuzzy time series has been proposed to deal with forecasting problems within a…
Post-processing method for wind speed ensemble forecast using wind speed and direction
NASA Astrophysics Data System (ADS)
Sofie Eide, Siri; Bjørnar Bremnes, John; Steinsland, Ingelin
2017-04-01
Statistical methods are widely applied to enhance the quality of both deterministic and ensemble NWP forecasts. In many situations, such as wind speed forecasting, most of the predictive information is contained in one variable of the NWP models. However, in statistical calibration of deterministic forecasts it is often seen that including more variables can further improve forecast skill. For ensembles this is rarely taken advantage of, mainly because it is generally not straightforward to include multiple variables. In this study, it is demonstrated how multiple variables can be included in Bayesian model averaging (BMA) by using a flexible regression method to estimate the conditional means. The method is applied to wind speed forecasting at 204 Norwegian stations based on wind speed and direction forecasts from the ECMWF ensemble system. At about 85 % of the sites the ensemble forecasts were improved in terms of CRPS by adding wind direction as a predictor compared to only using wind speed. On average the improvements were about 5 %, mainly for moderate to strong wind situations. For weak wind speeds adding wind direction had a more or less neutral impact.
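A minimal sketch of BMA weight estimation for a two-member ensemble, using the standard EM scheme with Gaussian kernels and a shared variance; the wind observations are synthetic and the member skill levels are assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 500
truth = rng.normal(8.0, 3.0, n)             # observed wind speed (synthetic)
member_a = truth + rng.normal(0.0, 1.0, n)  # skilful ensemble member (assumed)
member_b = truth + rng.normal(0.0, 3.0, n)  # less skilful member (assumed)
F = np.column_stack([member_a, member_b])

# EM for Bayesian model averaging with Gaussian kernels:
#   p(y) = sum_k w_k * N(y; f_k, sigma^2)
w = np.array([0.5, 0.5])
sigma2 = 4.0
for _ in range(100):
    # E-step: responsibility of member k for each observation
    dens = w * np.exp(-(truth[:, None] - F) ** 2 / (2 * sigma2)) \
             / np.sqrt(2 * np.pi * sigma2)
    z = dens / dens.sum(axis=1, keepdims=True)
    # M-step: update member weights and the common kernel variance
    w = z.mean(axis=0)
    sigma2 = float(np.sum(z * (truth[:, None] - F) ** 2) / n)
```

The EM iterations assign the larger weight to the more skilful member, which is the behaviour the superensemble weighting relies on.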
NASA Astrophysics Data System (ADS)
Jolliff, Jason Keith; Smith, Travis A.; Ladner, Sherwin; Arnone, Robert A.
2014-03-01
The U.S. Naval Research Laboratory (NRL) is developing nowcast/forecast software systems designed to combine satellite ocean color data streams with physical circulation models in order to produce prognostic fields of ocean surface materials. The Deepwater Horizon oil spill in the Gulf of Mexico provided a test case for the Bio-Optical Forecasting (BioCast) system to rapidly combine the latest satellite imagery of the oil slick distribution with surface circulation fields in order to produce oil slick transport scenarios and forecasts. In one such sequence of experiments, MODIS satellite true color images were combined with high-resolution ocean circulation forecasts from the Coupled Ocean-Atmosphere Mesoscale Prediction System (COAMPS®) to produce 96-h oil transport simulations. These oil forecasts predicted a major oil slick landfall at Grand Isle, Louisiana, USA that was subsequently observed. A key driver of the landfall scenario was the development of a coastal buoyancy current associated with Mississippi River Delta freshwater outflow. In another series of experiments, longer-term regional circulation model results were combined with oil slick source/sink scenarios to simulate the observed containment of surface oil within the Gulf of Mexico. Both sets of experiments underscore the importance of identifying and simulating potential hydrodynamic conduits of surface oil transport. The addition of explicit sources and sinks of surface oil concentrations provides a framework for increasingly complex oil spill modeling efforts that extend beyond horizontal trajectory analysis.
A Feature Fusion Based Forecasting Model for Financial Time Series
Guo, Zhiqiang; Wang, Huaiqing; Liu, Quan; Yang, Jie
2014-01-01
Predicting the stock market has become an increasingly interesting research area for both researchers and investors, and many prediction models have been proposed. In these models, feature selection techniques are used to pre-process the raw data and remove noise. In this paper, a prediction model is constructed to forecast stock market behavior with the aid of independent component analysis, canonical correlation analysis, and a support vector machine. First, two types of features are extracted from the historical closing prices and 39 technical variables by independent component analysis. Second, a canonical correlation analysis method is utilized to combine the two types of features and extract intrinsic features to improve the performance of the prediction model. Finally, a support vector machine is applied to forecast the next day's closing price. The proposed model is applied to the Shanghai stock market index and the Dow Jones index, and experimental results show that the proposed model outperforms the other two similar models in prediction. PMID:24971455
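A rough sketch of the first and last stages of such a pipeline (ICA feature extraction followed by support vector regression), using scikit-learn on synthetic indicator data; the CCA fusion step is omitted, and the indicators, latent sources, and target are all hypothetical:

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.svm import SVR

rng = np.random.default_rng(2)
T = 300
# 10 hypothetical technical indicators driven by 3 latent sources
S = rng.laplace(size=(T, 3))
A = rng.normal(size=(3, 10))
indicators = S @ A + 0.1 * rng.normal(size=(T, 10))
next_close = S[:, 0] + 0.1 * rng.normal(size=T)   # proxy for next-day close

# Step 1: extract independent components from the noisy indicators
ica = FastICA(n_components=3, random_state=0)
features = ica.fit_transform(indicators)

# Step 2: support vector regression maps the features to the next-day close
train, test = slice(0, 250), slice(250, T)
model = SVR(kernel="rbf", C=10.0).fit(features[train], next_close[train])
pred = model.predict(features[test])
corr = float(np.corrcoef(pred, next_close[test])[0, 1])
```

Because the target here is driven by one of the latent sources, the ICA features carry the predictive signal and the SVR recovers it out of sample.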
NASA Astrophysics Data System (ADS)
Qiu, Yunfei; Li, Xizhong; Zheng, Wei; Hu, Qinghe; Wei, Zhanmeng; Yue, Yaqin
2017-08-01
Climate changes have a great impact on residents' electricity consumption, so studying the impact of climatic factors on electric power load is significant. In this paper, the effects of temperature, rainfall and wind data from a smart city on short-term power load are studied in order to predict the load. The authors studied the relation between power load and daily temperature, rainfall and wind during the 31 days of January of one year. In the research, the authors used the Matlab neural network toolbox to establish a combined forecasting model. The authors trained on the original input data continuously to learn the internal patterns in the data and used these patterns to predict the daily power load for the following January. The prediction method relies on the accuracy of the weather forecast: if the forecast weather differs from the actual weather, the climatic factors need to be corrected to ensure accurate prediction.
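A hedged sketch of the same idea, using scikit-learn's MLPRegressor in place of the Matlab toolbox; the January weather series, load relationship, and coefficients are all invented for illustration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
days = 31 * 5                               # five Januaries of daily data
temp = rng.normal(-2.0, 5.0, days)          # daily mean temperature, deg C
rain = rng.gamma(1.0, 2.0, days)            # daily rainfall, mm
wind = rng.gamma(2.0, 2.0, days)            # daily mean wind speed, m/s
# heating load rises as temperature drops; wind adds a small effect (invented)
load = 500.0 - 8.0 * temp + 2.0 * wind + rng.normal(0.0, 10.0, days)

X = StandardScaler().fit_transform(np.column_stack([temp, rain, wind]))
ymu, ysd = load[:-31].mean(), load[:-31].std()

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
net.fit(X[:-31], (load[:-31] - ymu) / ysd)  # train on the earlier Januaries
pred = net.predict(X[-31:]) * ysd + ymu     # forecast the next January's load
corr = float(np.corrcoef(pred, load[-31:])[0, 1])
```

As the abstract notes, the forecast is only as good as the weather inputs: feeding the network erroneous weather values would degrade `pred` accordingly.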
A Comparison of Forecast Error Generators for Modeling Wind and Load Uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Ning; Diao, Ruisheng; Hafen, Ryan P.
2013-07-25
This paper presents four algorithms to generate random forecast error time series, and compares their performance. The error time series are used to create real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast time series that statistically match historically observed forecasting data sets used in power grid operation, in order to study the net load balancing need in variable generation integration studies. The four algorithms are truncated-normal distribution models, state-space based Markov models, seasonal autoregressive moving average (ARMA) models, and a stochastic-optimization based approach. The comparison is made using historical DA load forecast and actual load values to generate new sets of DA forecasts with similar statistical forecast error characteristics (i.e., mean, standard deviation, autocorrelation, and cross-correlation). The results show that all methods generate satisfactory results. One method may preserve one or two required statistical characteristics better than the other methods, but may not preserve other statistical characteristics as well as the other methods. Because the wind and load forecast error generators are used in wind integration studies to produce wind and load forecast time series for stochastic planning processes, it is sometimes critical to use multiple methods to generate the error time series to obtain a statistically robust result. Therefore, this paper discusses and compares the capabilities of each algorithm to preserve the characteristics of the historical forecast data sets.
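As one minimal instance of such a generator, an AR(1) model (the simplest member of the ARMA family named above) can produce an error series matching a target mean, standard deviation, and lag-1 autocorrelation; the target statistics here are invented, not taken from the paper's data sets:

```python
import numpy as np

def ar1_error_series(mean, std, rho, n, rng):
    """Generate a forecast-error series with a given mean, standard deviation,
    and lag-1 autocorrelation using an AR(1) process."""
    e = np.empty(n)
    e[0] = 0.0
    # choose the innovation variance so the stationary variance equals std**2
    innov_std = std * np.sqrt(1.0 - rho ** 2)
    for t in range(1, n):
        e[t] = rho * e[t - 1] + rng.normal(0.0, innov_std)
    return mean + e

rng = np.random.default_rng(4)
errors = ar1_error_series(mean=5.0, std=50.0, rho=0.8, n=20000, rng=rng)
lag1 = float(np.corrcoef(errors[:-1], errors[1:])[0, 1])
```

The generated series reproduces the requested statistics, which is the matching property the paper evaluates across its four algorithms.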
Sufficient Forecasting Using Factor Models
Fan, Jianqing; Xue, Lingzhou; Yao, Jiawei
2017-01-01
We consider forecasting a single time series when there is a large number of predictors and a possible nonlinear effect. The dimensionality was first reduced via a high-dimensional (approximate) factor model implemented by the principal component analysis. Using the extracted factors, we develop a novel forecasting method called the sufficient forecasting, which provides a set of sufficient predictive indices, inferred from high-dimensional predictors, to deliver additional predictive power. The projected principal component analysis will be employed to enhance the accuracy of inferred factors when a semi-parametric (approximate) factor model is assumed. Our method is also applicable to cross-sectional sufficient regression using extracted factors. The connection between the sufficient forecasting and the deep learning architecture is explicitly stated. The sufficient forecasting correctly estimates projection indices of the underlying factors even in the presence of a nonparametric forecasting function. The proposed method extends the sufficient dimension reduction to high-dimensional regimes by condensing the cross-sectional information through factor models. We derive asymptotic properties for the estimate of the central subspace spanned by these projection directions as well as the estimates of the sufficient predictive indices. We further show that the natural method of running multiple regression of target on estimated factors yields a linear estimate that actually falls into this central subspace. Our method and theory allow the number of predictors to be larger than the number of observations. We finally demonstrate that the sufficient forecasting improves upon the linear forecasting in both simulation studies and an empirical study of forecasting macroeconomic variables. PMID:29731537
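A simplified numerical sketch of the two stages, factor extraction by principal components followed by a flexible regression on the estimated factors, on synthetic data; the quadratic link, the dimensions, and the noise levels are assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
n, p, k = 400, 150, 3
F = rng.normal(size=(n, k))                # latent factors (unobserved)
B = rng.normal(size=(k, p))                # factor loadings
X = F @ B + rng.normal(size=(n, p))        # high-dimensional predictors
# target depends on the factors through a nonlinear link
y = F[:, 0] ** 2 + F[:, 1] + 0.1 * rng.normal(size=n)

# Step 1: estimate factors by PCA (top-k left singular vectors of centered X)
U, s, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
F_hat = U[:, :k] * np.sqrt(n)              # estimated factors, up to rotation

# Step 2: fit a flexible (here quadratic) forecast in the estimated factors;
# cross-products are included because F_hat is only a rotation of F
cross = np.column_stack([F_hat[:, i] * F_hat[:, j]
                         for i in range(k) for j in range(i, k)])
Z = np.column_stack([np.ones(n), F_hat, cross])
coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
r2 = 1.0 - np.sum((y - Z @ coef) ** 2) / np.sum((y - y.mean()) ** 2)
```

Even though p is of the same order as n, the factor step condenses the cross-sectional information so the nonlinear link can be fitted in a low-dimensional space.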
NASA Astrophysics Data System (ADS)
Chen, C.; Rundle, J. B.; Holliday, J. R.; Nanjo, K.; Turcotte, D. L.; Li, S.; Tiampo, K. F.
2005-12-01
Forecast verification procedures for statistical events with binary outcomes typically rely on the use of contingency tables and Relative Operating Characteristic (ROC) diagrams. Originally developed for the statistical evaluation of tornado forecasts on a county-by-county basis, these methods can be adapted to the evaluation of competing earthquake forecasts. Here we apply these methods retrospectively to two forecasts for the m = 7.3 1999 Chi-Chi, Taiwan, earthquake. These forecasts are based on a method, Pattern Informatics (PI), that locates likely sites for future large earthquakes based on large changes in the activity of the smallest earthquakes. A competing null hypothesis, Relative Intensity (RI), is based on the idea that future large earthquake locations are correlated with sites having the greatest frequency of small earthquakes. We show that for Taiwan, the PI forecast method is superior to the RI forecast null hypothesis. Inspection of the two maps indicates that their forecast locations are indeed quite different. Our results confirm an earlier result suggesting that the earthquake preparation process for events such as the Chi-Chi earthquake involves anomalous changes in activation or quiescence, and that signatures of these processes can be detected in precursory seismicity data. Furthermore, we find that our methods can accurately forecast the locations of aftershocks from precursory seismicity changes alone, implying that the main shock together with its aftershocks represent a single manifestation of the formation of a high-stress region nucleating prior to the main shock.
MAG4 Versus Alternative Techniques for Forecasting Active-Region Flare Productivity
NASA Technical Reports Server (NTRS)
Falconer, David A.; Moore, Ronald L.; Barghouty, Abdulnasser F.; Khazanov, Igor
2014-01-01
MAG4 is a technique of forecasting an active region's rate of production of major flares in the coming few days from a free-magnetic-energy proxy. We present a statistical method of measuring the difference in performance between MAG4 and comparable alternative techniques that forecast an active region's major-flare productivity from alternative observed aspects of the active region. We demonstrate the method by measuring the difference in performance between the "Present MAG4" technique and each of three alternative techniques, called "McIntosh Active-Region Class," "Total Magnetic Flux," and "Next MAG4." We do this by using (1) the MAG4 database of magnetograms and major-flare histories of sunspot active regions, (2) the NOAA table of the major-flare productivity of each of 60 McIntosh active-region classes of sunspot active regions, and (3) five technique-performance metrics (Heidke Skill Score, True Skill Score, Percent Correct, Probability of Detection, and False Alarm Rate) evaluated from 2000 random two-by-two contingency tables obtained from the databases. We find that (1) Present MAG4 far outperforms both McIntosh Active-Region Class and Total Magnetic Flux, (2) Next MAG4 significantly outperforms Present MAG4, (3) the performance of Next MAG4 is insensitive to the forward and backward temporal windows used, in the range of one to a few days, and (4) forecasting from the free-energy proxy in combination with either any broad category of McIntosh active-region classes or any Mount Wilson active-region class gives no significant performance improvement over forecasting from the free-energy proxy alone (Present MAG4).
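The five performance metrics named above can all be computed directly from a two-by-two contingency table. The sketch below uses invented counts, and reports the false alarm ratio variant b/(a+b) (terminology varies; some verification texts reserve "False Alarm Rate" for b/(b+d)):

```python
def skill_scores(hits, misses, false_alarms, correct_nulls):
    """Forecast-verification metrics from a two-by-two contingency table."""
    a, b, c, d = hits, false_alarms, misses, correct_nulls
    n = a + b + c + d
    pod = a / (a + c)                      # Probability of Detection
    far = b / (a + b)                      # False Alarm Ratio
    pc = (a + d) / n                       # Percent Correct
    tss = a / (a + c) - b / (b + d)        # True Skill Score (Hanssen-Kuipers)
    # Heidke Skill Score: accuracy relative to random-chance agreement
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    hss = (a + d - expected) / (n - expected)
    return {"POD": pod, "FAR": far, "PC": pc, "TSS": tss, "HSS": hss}

# invented counts for illustration only
scores = skill_scores(hits=30, misses=10, false_alarms=20, correct_nulls=140)
```

Resampling many such tables (as the paper does with 2000 random tables) yields a distribution for each metric, from which performance differences between techniques can be tested.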
Evolving forecasting classifications and applications in health forecasting
Soyiri, Ireneous N; Reidpath, Daniel D
2012-01-01
Health forecasting forewarns the health community about future health situations and disease episodes so that health systems can better allocate resources and manage demand. The tools used for developing and measuring the accuracy and validity of health forecasts commonly are not defined although they are usually adapted forms of statistical procedures. This review identifies previous typologies used in classifying the forecasting methods commonly used in forecasting health conditions or situations. It then discusses the strengths and weaknesses of these methods and presents the choices available for measuring the accuracy of health-forecasting models, including a note on the discrepancies in the modes of validation. PMID:22615533
Physician supply forecast: better than peering in a crystal ball?
Roberfroid, Dominique; Leonard, Christian; Stordeur, Sabine
2009-01-01
Background Anticipating physician supply to tackle future health challenges is a crucial but complex task for policy planners. A number of forecasting tools are available, but the methods, advantages and shortcomings of such tools are not straightforward and not always well appraised. Therefore this paper had two objectives: to present a typology of existing forecasting approaches and to analyse the methodology-related issues. Methods A literature review was carried out in electronic databases Medline-Ovid, Embase and ERIC. Concrete examples of planning experiences in various countries were analysed. Results Four main forecasting approaches were identified. The supply projection approach defines the necessary inflow to maintain or to reach in the future an arbitrary predefined level of service offer. The demand-based approach estimates the quantity of health care services used by the population in the future to project physician requirements. The needs-based approach involves defining and predicting health care deficits so that they can be addressed by an adequate workforce. Benchmarking health systems with similar populations and health profiles is the last approach. These different methods can be combined to perform a gap analysis. The methodological challenges of such projections are numerous: most often static models are used and their uncertainty is not assessed; valid and comprehensive data to feed into the models are often lacking; and a rapidly evolving environment affects the likelihood of projection scenarios. As a result, the internal and external validity of the projections included in our review appeared limited. Conclusion There is no single accepted approach to forecasting physician requirements. The value of projections lies in their utility in identifying the current and emerging trends to which policy-makers need to respond. 
A genuine gap analysis, an effective monitoring of key parameters and comprehensive workforce planning are key elements to improving the usefulness of physician supply projections. PMID:19216772
A Study on the Potential Applications of Satellite Data in Air Quality Monitoring and Forecasting
NASA Technical Reports Server (NTRS)
Li, Can; Hsu, N. Christina; Tsay, Si-Chee
2011-01-01
In this study we explore the potential applications of MODIS (Moderate Resolution Imaging Spectroradiometer) -like satellite sensors in air quality research for some Asian regions. The MODIS aerosol optical thickness (AOT), NCEP global reanalysis meteorological data, and daily surface PM(sub 10) concentrations over China and Thailand from 2001 to 2009 were analyzed using simple and multiple regression models. The AOT-PM(sub 10) correlation demonstrates substantial seasonal and regional difference, likely reflecting variations in aerosol composition and atmospheric conditions, Meteorological factors, particularly relative humidity, were found to influence the AOT-PM(sub 10) relationship. Their inclusion in regression models leads to more accurate assessment of PM(sub 10) from space borne observations. We further introduced a simple method for employing the satellite data to empirically forecast surface particulate pollution, In general, AOT from the previous day (day 0) is used as a predicator variable, along with the forecasted meteorology for the following day (day 1), to predict the PM(sub 10) level for day 1. The contribution of regional transport is represented by backward trajectories combined with AOT. This method was evaluated through PM(sub 10) hindcasts for 2008-2009, using ohservations from 2005 to 2007 as a training data set to obtain model coefficients. For five big Chinese cities, over 50% of the hindcasts have percentage error less than or equal to 30%. Similar performance was achieved for cities in northern Thailand. The MODIS AOT data are responsible for at least part of the demonstrated forecasting skill. This method can be easily adapted for other regions, but is probably most useful for those having sparse ground monitoring networks or no access to sophisticated deterministic models. 
We also highlight several existing issues, including some inherent to a regression-based approach, as exemplified by a case study for Beijing. Further studies will be necessary before satellite data can see more extensive applications in operational air quality monitoring and forecasting.
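The regression-based hindcast described above can be sketched as follows. This is a minimal illustration on synthetic data, using only AOT and relative humidity as predictors (the actual model also uses other meteorological variables and trajectory-based transport terms); all variable names and coefficients are ours, not the paper's:

```python
import numpy as np

def fit_pm10_model(aot, rh, pm10):
    """Least-squares fit of PM10 on AOT and relative humidity.

    Hypothetical simplification of the paper's multiple-regression
    approach; in practice models are fitted per city and season.
    """
    X = np.column_stack([np.ones_like(aot), aot, rh])
    coef, *_ = np.linalg.lstsq(X, pm10, rcond=None)
    return coef

def hindcast_pm10(coef, aot_day0, rh_day1):
    """Predict day-1 PM10 from day-0 AOT and forecast day-1 humidity."""
    return coef[0] + coef[1] * aot_day0 + coef[2] * rh_day1

# Synthetic training data: PM10 rises with AOT, falls with humidity.
rng = np.random.default_rng(0)
aot = rng.uniform(0.1, 1.5, 200)
rh = rng.uniform(30, 90, 200)
pm10 = 60 + 80 * aot - 0.3 * rh + rng.normal(0, 5, 200)

coef = fit_pm10_model(aot, rh, pm10)
pred = hindcast_pm10(coef, aot, rh)          # in-sample, for illustration
pct_err = np.abs(pred - pm10) / pm10 * 100
print(f"share of hindcasts within 30% error: {np.mean(pct_err <= 30):.2f}")
```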
Automated time series forecasting for biosurveillance.
Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit
2007-09-30
For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
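The Holt-Winters step described above — forecast one step ahead, subtract from the observation, monitor the residuals — can be sketched as follows. The additive formulation, weekly period, and smoothing constants are illustrative choices, not the paper's tuned values:

```python
import numpy as np

def holt_winters_residuals(y, alpha=0.4, beta=0.0, gamma=0.3, period=7):
    """One-step-ahead additive Holt-Winters forecasts and residuals.

    Residuals (observation minus forecast) become the input to
    control-chart detection algorithms, as in the paper.
    """
    level = y[:period].mean()
    trend = 0.0
    season = y[:period] - level           # initial day-of-week effects
    forecasts = np.empty(len(y))
    for t in range(len(y)):
        s = season[t % period]
        forecasts[t] = level + trend + s
        if t >= period:                   # update after a full first cycle
            err = y[t] - forecasts[t]
            level_new = level + trend + alpha * err
            trend = trend + alpha * beta * err
            season[t % period] = s + gamma * (y[t] - level_new - s)
            level = level_new
    return forecasts, y - forecasts

# Synthetic syndromic counts: weekly cycle plus noise.
rng = np.random.default_rng(1)
t = np.arange(28 * 8)
y = 100 + 15 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 3, t.size)
fc, resid = holt_winters_residuals(y)
r = resid[7:]                             # skip the burn-in cycle
lag1 = np.corrcoef(r[:-1], r[1:])[0, 1]
print(f"residual std: {r.std():.2f}, lag-1 autocorrelation: {lag1:.2f}")
```

The residual standard deviation should be far below the weekly swing, and the lag-1 autocorrelation small, echoing the paper's finding that Holt-Winters removes most of the systematic and serial structure.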
Real-time eruption forecasting using the material Failure Forecast Method with a Bayesian approach
NASA Astrophysics Data System (ADS)
Boué, A.; Lesage, P.; Cortés, G.; Valette, B.; Reyes-Dávila, G.
2015-04-01
Many attempts at deterministic forecasting of eruptions and landslides have been made using the material Failure Forecast Method (FFM). This method consists of fitting an empirical power law to precursory patterns of seismicity or deformation. Until now, most studies have presented hindsight forecasts based on complete time series of precursors and have not evaluated the ability of the method to carry out real-time forecasting with partial precursory sequences. In this study, we present a rigorous approach to the FFM designed for real-time applications on volcano-seismic precursors. We use a Bayesian approach based on the FFM theory and an automatic classification of seismic events. The probability distributions of the data deduced from the performance of this classification are used as input. As output, it provides the probability of the forecast time at each observation time before the eruption. The spread of the a posteriori probability density function of the prediction time and its stability with respect to the observation time are used as criteria to evaluate the reliability of the forecast. We test the method on precursory accelerations of long-period seismicity prior to vulcanian explosions at Volcán de Colima (Mexico). For explosions preceded by a single phase of seismic acceleration, we obtain accurate and reliable forecasts using approximately 80% of the whole precursory sequence. It is, however, more difficult to apply the method to multiple acceleration patterns.
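The underlying (non-Bayesian) FFM can be sketched as follows: for the commonly assumed power-law exponent of 2, the inverse precursor rate decays linearly in time, so a least-squares line fitted to the inverse rate and extrapolated to zero yields the failure time. The paper wraps this idea in a Bayesian treatment; this sketch shows only the deterministic core:

```python
import numpy as np

def ffm_failure_time(times, rates):
    """Classic FFM estimate: fit a line to 1/rate and find where it
    reaches zero, which is the predicted failure (eruption) time.
    """
    inv = 1.0 / np.asarray(rates, float)
    slope, intercept = np.polyfit(times, inv, 1)
    return -intercept / slope       # time at which 1/rate hits zero

# Synthetic acceleration toward failure at t_f = 10 (exponent 2):
t_f = 10.0
t = np.linspace(0.0, 8.0, 50)       # partial precursory sequence
rate = 1.0 / (t_f - t)              # rate diverges as t -> t_f
print(f"estimated failure time: {ffm_failure_time(t, rate):.2f}")
```

On noise-free data the fit recovers the failure time exactly; the real difficulty, which motivates the paper's Bayesian framework, is the instability of this extrapolation on noisy, partial sequences.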
Benefits of Sharing Information: Supermodel Ensemble and Applications in South America
NASA Astrophysics Data System (ADS)
Dias, P. L.
2006-05-01
A model intercomparison program involving a large number of academic and operational institutions has been implemented in South America since 2003, motivated by the SALLJEX Intercomparison Program in 2003 (a research program focused on identifying the role of the Andes low level jet in moisture transport from the Amazon to the Plata basin) and the WMO/THORPEX (www.wmo.int/thorpex) goals to improve predictability through the proper combination of numerical weather forecasts. This program also explores the potential predictability associated with the combination of a large number of possible scenarios on time scales of a few days to up to 15 days. Five academic institutions and five operational forecasting centers in several countries in South America, one academic institution in the USA, and the main global forecasting centers (NCEP, UKMO, ECMWF) agreed to provide numerical products based on operational and experimental models. The metric for model validation is concentrated on the fit of the forecast to surface observations. Meteorological data from airports, synoptic stations operated by national weather services, automatic data platforms maintained by different institutions, the PIRATA buoys, etc. are all collected through LDM/NCAR or direct transmission. Approximately 40 model outputs are available on a daily basis, twice a day. A simple procedure based on data assimilation principles was quite successful in combining the available forecasts in order to produce temperature, dew point, wind, pressure and precipitation forecasts at station points in S. America. The procedure is based on removing each model's bias at the observational point and a weighted average based on the mean square error of the forecasts. The base period for estimating the bias and mean square error is of the order of 15 to 30 days.
Products of the intercomparison model program and the optimal statistical combination of the available forecasts are public and available in real time (www.master.iag.usp.br/). Monitoring of the use of the products reveals a growing trend in the last year (reaching about 10,000 accesses per day in recent months). The intercomparison program provides a rich data set for educational products (real-time use in Synoptic Meteorology and Numerical Weather Forecasting lectures), operational weather forecasts in national or regional weather centers, and research purposes. During the first phase of the program it was difficult to convince potential participants to share their information on the public homepage. However, as the system evolved, more and more institutions became associated with the program. The general opinion of the participants is that the system provides a unified metric for evaluation and a forum for discussing the physical origin of the model forecast differences, and therefore improves the quality of the numerical guidance.
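The combination procedure — remove each model's bias at the station, then weight by inverse mean square error over a 15-30 day training window — can be sketched as follows; array shapes and variable names are ours:

```python
import numpy as np

def combine_forecasts(forecasts, observations):
    """Bias-correct each model at a station and weight it by the
    inverse of its mean square error over the training window.

    forecasts: (n_models, n_days); observations: (n_days,).
    """
    bias = (forecasts - observations).mean(axis=1, keepdims=True)
    corrected = forecasts - bias
    mse = ((corrected - observations) ** 2).mean(axis=1)
    w = (1.0 / mse) / (1.0 / mse).sum()
    return corrected, w

rng = np.random.default_rng(2)
obs = 20 + 5 * np.sin(np.arange(30) / 3)        # 30-day training window
models = np.vstack([
    obs + 2.0 + rng.normal(0, 0.5, 30),         # warm bias, accurate
    obs - 1.0 + rng.normal(0, 2.0, 30),         # cool bias, noisy
])
corrected, w = combine_forecasts(models, obs)
combined = w @ corrected
rmse = np.sqrt(((combined - obs) ** 2).mean())
print(f"weights: {w.round(2)}, combined RMSE: {rmse:.2f}")
```

The accurate model receives most of the weight, and the combined forecast's error is close to (or below) that of the best individual model, which is the rationale for the scheme.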
Economic indicators selection for crime rates forecasting using cooperative feature selection
NASA Astrophysics Data System (ADS)
Alwee, Razana; Shamsuddin, Siti Mariyam Hj; Salleh Sallehuddin, Roselina
2013-04-01
Feature selection in a multivariate forecasting model is very important to ensure that the model is accurate. The purpose of this study is to apply the Cooperative Feature Selection method for feature selection. The features are economic indicators that will be used in a crime rate forecasting model. Cooperative Feature Selection combines grey relational analysis and an artificial neural network to establish a cooperative model that can rank and select the significant economic indicators. Grey relational analysis is used to select the best data series to represent each economic indicator and also to rank the economic indicators according to their importance to the crime rate. After that, the artificial neural network is used to select the significant economic indicators for forecasting the crime rates. In this study, we used the economic indicators of unemployment rate, consumer price index, gross domestic product and consumer sentiment index, as well as rates of property crime and violent crime for the United States. A Levenberg-Marquardt neural network is used in this study. From our experiments, we found that the consumer price index is an important economic indicator with a significant influence on the violent crime rate, while for the property crime rate, the gross domestic product, unemployment rate and consumer price index are the influential economic indicators. Cooperative Feature Selection is also found to produce smaller errors than Multiple Linear Regression in forecasting property and violent crime rates.
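The grey relational ranking step can be sketched as follows; min-max normalisation and the distinguishing coefficient rho = 0.5 follow common practice and are not necessarily the paper's exact choices:

```python
import numpy as np

def grey_relational_grades(reference, candidates, rho=0.5):
    """Grey relational grade of each candidate indicator series
    against a reference series (here: a crime rate). A higher grade
    means a closer relationship; grades are used to rank indicators.
    """
    def norm(x):
        x = np.asarray(x, float)
        return (x - x.min()) / (x.max() - x.min())

    ref = norm(reference)
    deltas = [np.abs(ref - norm(c)) for c in candidates]
    dmin = min(d.min() for d in deltas)     # global extrema across
    dmax = max(d.max() for d in deltas)     # all candidate series
    return np.array([np.mean((dmin + rho * dmax) / (d + rho * dmax))
                     for d in deltas])

t = np.arange(20, dtype=float)
crime = 2 * t + 5                        # reference series
cpi = 2 * t + 0.1 * np.sin(t)            # tracks the reference closely
unemp = np.cos(t)                        # unrelated oscillation
grades = grey_relational_grades(crime, [cpi, unemp])
print(f"CPI grade: {grades[0]:.2f}, unemployment grade: {grades[1]:.2f}")
```

The closely tracking series scores near 1 and the unrelated one much lower, which is the ranking the neural-network stage then refines.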
Forecasting the short-term passenger flow on high-speed railway with neural networks.
Xie, Mei-Quan; Li, Xia-Miao; Zhou, Wen-Liang; Fu, Yan-Bing
2014-01-01
Short-term passenger flow forecasting is an important component of transportation systems. The forecasting results can be applied to support transportation system operation and management, such as operation planning and revenue management. In this paper, a divide-and-conquer method based on neural networks and origin-destination (OD) matrix estimation is developed to forecast short-term passenger flow in a high-speed railway system. There are three steps in the forecasting method. Firstly, the numbers of passengers who arrive at or depart from each station are obtained from historical passenger flow data, which take the form of OD matrices in this paper. Secondly, short-term forecasting of the numbers of passengers who arrive at or depart from each station is realized with a neural network. Finally, the short-term OD matrices are obtained with an OD matrix estimation method. The experimental results indicate that the proposed divide-and-conquer method performs well in forecasting short-term passenger flow on high-speed railways.
Use of JPSS ATMS, CrIS, and VIIRS data to Improve Tropical Cyclone Track and Intensity Forecasting
NASA Astrophysics Data System (ADS)
Chirokova, G.; Demaria, M.; DeMaria, R.; Knaff, J. A.; Dostalek, J.; Musgrave, K. D.; Beven, J. L.
2015-12-01
JPSS data provide unique information that could be critical for forecasting tropical cyclone (TC) track and intensity but are currently underutilized. Preliminary results from several TC applications using data from the Advanced Technology Microwave Sounder (ATMS), the Cross-track Infrared Sounder (CrIS), and the Visible Infrared Imaging Radiometer Suite (VIIRS), carried by the Suomi National Polar-orbiting Partnership (SNPP) satellite, will be discussed. The first group of applications, which includes applications for moisture flux and eye detection, aims to improve rapid intensification (RI) forecasts, one of the highest priorities within NOAA. The applications could be used by forecasters directly and will also provide additional input to the Rapid Intensification Index (RII), the statistical-dynamical tool for forecasting RI events that is operational at the National Hurricane Center. The moisture flux application uses bias-corrected ATMS-MIRS (Microwave Integrated Retrieval System) and NUCAPS (NOAA Unique CrIS/ATMS Processing System) retrievals, which provide very accurate temperature and humidity soundings in the TC environment, to detect dry air intrusions. The objective automated eye-detection application uses geostationary and VIIRS data in combination with machine learning and computer vision techniques to determine the onset of eye formation, which is very important for TC intensity forecasting but is usually determined by subjective methods. The first version of the algorithm showed very promising results, with a 75% success rate. The second group of applications develops tools to better utilize VIIRS data, including day-night band (DNB) imagery, for tropical cyclone forecasting. Disclaimer: The views, opinions, and findings contained in this article are those of the authors and should not be construed as an official National Oceanic and Atmospheric Administration (NOAA) or U.S. Government position, policy, or decision.
Shukla, Shraddhanand; McEvoy, Daniel; Hobbins, Michael; Husak, Gregory; Huntington, Justin; Funk, Chris; Macharia, Denis; Verdin, James P.
2017-01-01
The Famine Early Warning Systems Network (FEWS NET) team provides food insecurity outlooks for several developing countries in Africa, Central Asia, and Central America. This study describes the development of a new global reference evapotranspiration (ETo) seasonal reforecast and its skill evaluation, with a particular emphasis on the potential use of this dataset by FEWS NET to support food insecurity early warning. The ETo reforecasts span the 1982-2009 period and are calculated following ASCE's formulation of the Penman-Monteith method, driven by seasonal climate forecasts of monthly mean temperature, humidity, wind speed, and solar radiation from NCEP's CFSv2 and NASA's GEOS-5 models. The skill evaluation, using deterministic and probabilistic scores, focuses on the December-February (DJF), March-May (MAM), June-August (JJA) and September-November (SON) seasons. The results indicate that ETo forecasts are a promising tool for early warning of drought and food insecurity. Globally, the regions where forecasts are most skillful (correlation >0.35 at lead-2) include the western U.S., northern parts of South America, parts of the Sahel region and Southern Africa. The FEWS NET regions where forecasts are most skillful (correlation >0.35 at lead-3) include northern Sub-Saharan Africa (DJF, dry season), Central America (DJF, dry season), parts of East Africa (JJA, wet season), Southern Africa (JJA, dry season), and Central Asia (MAM, wet season). A case study over parts of East Africa for the JJA season shows that ETo forecasts, in combination with precipitation forecasts, could have provided early warning of recent severe drought events (e.g., 2002, 2004, 2009) that contributed to substantial food insecurity in the region.
Initial assessment of a multi-model approach to spring flood forecasting in Sweden
NASA Astrophysics Data System (ADS)
Olsson, J.; Uvo, C. B.; Foster, K.; Yang, W.
2015-06-01
Hydropower is a major energy source in Sweden and proper reservoir management prior to the spring flood onset is crucial for optimal production. This requires useful forecasts of the accumulated discharge in the spring flood period (i.e. the spring-flood volume, SFV). Today's SFV forecasts are generated using a model-based climatological ensemble approach, where time series of precipitation and temperature from historical years are used to force a calibrated and initialised set-up of the HBV model. In this study, a number of new approaches to spring flood forecasting that reflect the latest developments with respect to analysis and modelling on seasonal time scales are presented and evaluated. Three main approaches, represented by specific methods, are evaluated in SFV hindcasts for three main Swedish rivers over a 10-year period with lead times between 0 and 4 months. In the first approach, historically analogous years with respect to the climate in the period preceding the spring flood are identified and used to compose a reduced ensemble. In the second, seasonal meteorological ensemble forecasts are used to drive the HBV model over the spring flood period. In the third approach, statistical relationships between the SFV and the large-scale atmospheric circulation are used to build forecast models. None of the new approaches consistently outperforms the climatological ensemble approach, but for specific locations and lead times improvements of 20-30 % are found. When combining all forecasts in a weighted multi-model approach, a mean improvement over all locations and lead times of nearly 10 % was indicated. This demonstrates the potential of the approach, and further development and optimisation into an operational system is ongoing.
Evaluating simplified methods for liquefaction assessment for loss estimation
NASA Astrophysics Data System (ADS)
Kongar, Indranil; Rossetto, Tiziana; Giovinazzi, Sonia
2017-06-01
Currently, some catastrophe models used by the insurance industry account for liquefaction by applying a simple factor to shaking-induced losses. The factor is based only on local liquefaction susceptibility and this highlights the need for a more sophisticated approach to incorporating the effects of liquefaction in loss models. This study compares 11 unique models, each based on one of three principal simplified liquefaction assessment methods: liquefaction potential index (LPI) calculated from shear-wave velocity, the HAZUS software method and a method created specifically to make use of USGS remote sensing data. Data from the September 2010 Darfield and February 2011 Christchurch earthquakes in New Zealand are used to compare observed liquefaction occurrences to forecasts from these models using binary classification performance measures. The analysis shows that the best-performing model is the LPI calculated using known shear-wave velocity profiles, which correctly forecasts 78 % of sites where liquefaction occurred and 80 % of sites where liquefaction did not occur, when the threshold is set at 7. However, these data may not always be available to insurers. The next best model is also based on LPI but uses shear-wave velocity profiles simulated from the combination of USGS VS30 data and empirical functions that relate VS30 to average shear-wave velocities at shallower depths. This model correctly forecasts 58 % of sites where liquefaction occurred and 84 % of sites where liquefaction did not occur, when the threshold is set at 4. These scores increase to 78 and 86 %, respectively, when forecasts are based on liquefaction probabilities that are empirically related to the same values of LPI. This model is potentially more useful for insurance since the input data are publicly available. 
HAZUS models, which are commonly used in studies where no local model is available, perform poorly and incorrectly forecast 87 % of sites where liquefaction occurred, even at optimal thresholds. This paper also considers two models (HAZUS and EPOLLS) for estimation of the scale of liquefaction in terms of permanent ground deformation but finds that both models perform poorly, with correlations between observations and forecasts lower than 0.4 in all cases. Therefore these models potentially provide negligible additional value to loss estimation analysis outside of the regions for which they have been developed.
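The binary classification comparison used above — hit rate on sites that liquefied versus correct-negative rate on sites that did not, at a chosen LPI threshold — can be sketched with synthetic data as follows:

```python
import numpy as np

def forecast_skill(lpi, occurred, threshold):
    """Hit rate on liquefied sites and correct-negative rate on
    non-liquefied sites for a given LPI threshold, mirroring the
    paper's binary classification measures. Data here is synthetic.
    """
    predicted = lpi >= threshold
    hit = np.mean(predicted[occurred])      # forecast where it occurred
    cn = np.mean(~predicted[~occurred])     # no forecast where it did not
    return hit, cn

rng = np.random.default_rng(3)
occurred = rng.random(500) < 0.3
# LPI tends to be higher at sites that actually liquefied:
lpi = np.where(occurred, rng.normal(10, 3, 500), rng.normal(4, 3, 500))
hit, cn = forecast_skill(lpi, occurred, threshold=7)
print(f"hit rate: {hit:.2f}, correct-negative rate: {cn:.2f}")
```

Sweeping the threshold and reading off these two rates is how an optimal threshold (such as the paper's LPI = 7 for the shear-wave-velocity model) is selected.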
NASA Astrophysics Data System (ADS)
Tanguy, M.; Prudhomme, C.; Harrigan, S.; Smith, K. A.; Parry, S.
2017-12-01
Forecasting hydrological extremes is challenging, especially at lead times over 1 month for catchments with limited hydrological memory and variable climates. One simple way to derive monthly or seasonal hydrological forecasts is to use historical climate data to drive hydrological models using the Ensemble Streamflow Prediction (ESP) method. This gives a range of possible future streamflow given known initial hydrologic conditions alone. The degree of skill of ESP depends highly on the forecast initialisation month and catchment type. Using dynamic rainfall forecasts as driving data instead of historical data could potentially improve streamflow predictions. A lot of effort is being invested within the meteorological community to improve these forecasts. However, while recent progress shows promise (e.g. NAO in winter), the skill of these forecasts at monthly to seasonal timescales is generally still limited, and the extent to which they might lead to improved hydrological forecasts is an area of active research. Additionally, these meteorological forecasts are currently produced at monthly or seasonal time-steps in the UK, whereas hydrological models require forcings at daily or sub-daily time-steps. Keeping in mind these limitations of available rainfall forecasts, the objectives of this study are to find out (i) how accurate monthly dynamical rainfall forecasts need to be to outperform ESP, and (ii) how the method used to disaggregate monthly rainfall forecasts into daily rainfall time series affects results. For the first objective, synthetic rainfall time series were created by increasingly degrading observed data (a proxy for a 'perfect forecast') from 0 % to +/-50 % error. For the second objective, three different methods were used to disaggregate monthly rainfall data into daily time series.
These were used to force a simple lumped hydrological model (GR4J) to generate streamflow predictions at a one-month lead time for over 300 catchments representative of the range of UK's hydro-climatic conditions. These forecasts were then benchmarked against the traditional ESP method. It is hoped that the results of this work will help the meteorological community to identify where to focus their efforts in order to increase the usefulness of their forecasts within hydrological forecasting systems.
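The degradation design of the first objective — perturbing observations to create synthetic forecasts of controlled quality — can be sketched as follows; the multiplicative-error form and gamma-distributed rainfall are our assumptions for illustration:

```python
import numpy as np

def degrade(monthly_rain, error_frac, rng):
    """Synthetic 'imperfect forecast': perturb observed monthly
    rainfall by multiplicative errors up to +/-error_frac
    (0 % error = the perfect-forecast proxy in the study).
    """
    noise = rng.uniform(-error_frac, error_frac, monthly_rain.shape)
    return monthly_rain * (1.0 + noise)

rng = np.random.default_rng(4)
obs = rng.gamma(shape=2.0, scale=40.0, size=120)   # ten years of months
corrs = []
for err in (0.0, 0.25, 0.5):
    syn = degrade(obs, err, rng)
    corrs.append(np.corrcoef(obs, syn)[0, 1])
    print(f"error +/-{err:.0%}: correlation with observations {corrs[-1]:.2f}")
```

Driving a hydrological model with each degraded series and benchmarking against ESP then reveals the error level at which the synthetic forecast stops adding skill.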
Man-made Boards Technology Trends based on TRIZ Evolution Theory
NASA Astrophysics Data System (ADS)
Yu, Huiling; Fan, Delin
China is one of the world's largest manufacturers and consumers of man-made boards. A systematic and efficient method of foreseeing future technology trends and their evolutionary potential is a key task that can help companies guide their planning and allocate their resources. Application of the law of evolution with an S-shaped curve could contribute essentially to the accuracy of a long-term forecast. This research seeks to determine the current stage and position on the S-curve of man-made board technology in China based on TRIZ evolution theory, and to introduce a methodology that combines patent analysis and technology life cycle forecasting to find a niche space for man-made board technology development in China.
Barba, Lida; Rodríguez, Nibaldo; Montt, Cecilia
2014-01-01
Two smoothing strategies, combined with autoregressive integrated moving average (ARIMA) and autoregressive neural network (ANN) models to improve the forecasting of time series, are presented. The forecasting strategy is implemented in two stages. In the first stage the time series is smoothed using either 3-point moving average smoothing or singular value decomposition of the Hankel matrix (HSVD). In the second stage, an ARIMA model and two ANNs for one-step-ahead time series forecasting are used. The coefficients of the first ANN are estimated through the particle swarm optimization (PSO) learning algorithm, while the coefficients of the second ANN are estimated with the resilient backpropagation (RPROP) learning algorithm. The proposed models are evaluated using a weekly time series of traffic accidents from the Valparaíso region of Chile, from 2003 to 2012. The best result is given by the combination HSVD-ARIMA, with a MAPE of 0.26%, followed by MA-ARIMA with a MAPE of 1.12%; the worst result is given by the MA-ANN based on PSO with a MAPE of 15.51%.
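The first-stage moving-average smoothing and the MAPE comparison metric can be sketched as follows; the naive one-step forecast stands in for the ARIMA/ANN second stage and is our illustrative simplification:

```python
import numpy as np

def ma3(y):
    """3-point moving-average smoothing (first stage of the MA-* models)."""
    y = np.asarray(y, float)
    out = y.copy()
    out[1:-1] = (y[:-2] + y[1:-1] + y[2:]) / 3.0
    return out

def mape(actual, predicted):
    """Mean absolute percentage error, the paper's comparison metric."""
    actual = np.asarray(actual, float)
    predicted = np.asarray(predicted, float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

# Synthetic weekly accident counts: slow cycle plus noise.
rng = np.random.default_rng(5)
accidents = 60 + 10 * np.sin(np.arange(100) / 8) + rng.normal(0, 4, 100)
smooth = ma3(accidents)
naive_raw = mape(accidents[1:], accidents[:-1])
naive_smooth = mape(smooth[1:], smooth[:-1])
print(f"naive MAPE raw: {naive_raw:.2f}%, smoothed: {naive_smooth:.2f}%")
```

Even a naive forecaster benefits from the smoothing stage, which is the intuition behind combining smoothing with the ARIMA and ANN models.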
Supplier Short Term Load Forecasting Using Support Vector Regression and Exogenous Input
NASA Astrophysics Data System (ADS)
Matijaš, Marin; Vukićević, Milan; Krajcar, Slavko
2011-09-01
In power systems, the task of load forecasting is important for keeping the equilibrium between production and consumption. With the liberalization of electricity markets, the task of load forecasting has changed because each market participant has to forecast their own load. Consumption of end-consumers is stochastic in nature. Due to competition, suppliers are not in a position to transfer their costs to end-consumers; therefore it is essential to keep the forecasting error as low as possible. Numerous papers investigate load forecasting from the perspective of the grid or production planning. We research forecasting models from the perspective of a supplier. In this paper, we investigate different combinations of exogenous input on simulated supplier loads and show that using points of delivery as a feature for Support Vector Regression leads to lower forecasting error, while adding customer numbers in different datasets does the opposite.
ERIC Educational Resources Information Center
Klopfenstein, Bruce C.
1989-01-01
Describes research that examined the strengths and weaknesses of technological forecasting methods by analyzing forecasting studies made for home video players. The discussion covers assessments and explications of correct and incorrect forecasting assumptions, and their implications for forecasting the adoption of home information technologies…
Real-time emergency forecasting technique for situation management systems
NASA Astrophysics Data System (ADS)
Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.
2018-05-01
The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision making in situation management systems. The computational models are improved by the Improved Brown's method, which applies fractal dimension to forecast short time series of data received from sensors and control systems. The reliability of the emergency forecasting results is ensured by filtering invalid sensed data using methods of correlation analysis.
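Brown's double exponential smoothing, the base method the article improves, can be sketched as follows; the fractal-dimension-driven adaptation of the smoothing constant is the article's contribution and is not reproduced here, so the fixed alpha below is an illustrative assumption:

```python
import numpy as np

def browns_forecast(y, alpha=0.5, horizon=1):
    """Brown's double exponential smoothing: smooth the series twice
    with the same constant, then recover level and trend to
    extrapolate `horizon` steps ahead.
    """
    s1 = s2 = y[0]
    for v in y:
        s1 = alpha * v + (1 - alpha) * s1   # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2  # second smoothing
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + horizon * trend

# Short sensor series with a linear trend (~+2 per step):
y = np.array([10.0, 12.1, 13.9, 16.2, 18.0, 20.1])
print(f"one-step forecast: {browns_forecast(y):.2f}")
```

On a short trended series like this the forecast lands near the extrapolated trend, which is exactly the short-time-series regime the article targets.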
A Delphi forecast of technology in education
NASA Technical Reports Server (NTRS)
Robinson, B. E.
1973-01-01
The results are reported of a Delphi forecast of the utilization and social impacts of large-scale educational telecommunications technology. The focus is on both forecasting methodology and educational technology. The various methods of forecasting used by futurists are analyzed from the perspective of the most appropriate method for a prognosticator of educational technology, and review and critical analysis are presented of previous forecasts and studies. Graphic responses, summarized comments, and a scenario of education in 1990 are presented.
Applications of the gambling score in evaluating earthquake predictions and forecasts
NASA Astrophysics Data System (ADS)
Zhuang, Jiancang; Zechar, Jeremy D.; Jiang, Changsheng; Console, Rodolfo; Murru, Maura; Falcone, Giuseppe
2010-05-01
This study presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some points of his reputation. The reference model, which plays the role of the house, determines how many reputation points the forecaster can gain if he succeeds, according to a fair rule, and also takes away the reputation points bet by the forecaster if he loses. This method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. For discrete predictions, we apply this method to evaluate the performance of Shebalin's predictions made using the Reverse Tracing of Precursors (RTP) algorithm and of the predictions issued by the Annual Consultation Meeting on Earthquake Tendency held by the China Earthquake Administration. For the continuous case, we use it to compare probability forecasts of seismicity in the Abruzzo region before and after the L'Aquila earthquake based on the ETAS model and the PPE model.
Gambling scores for earthquake predictions and forecasts
NASA Astrophysics Data System (ADS)
Zhuang, Jiancang
2010-04-01
This paper presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some points of his reputation. The reference model, which plays the role of the house, determines how many reputation points the forecaster can gain if he succeeds, according to a fair rule, and also takes away the reputation points bet by the forecaster if he loses. This method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and when the reference model is the Poisson model.
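The reputation-point bookkeeping behind the gambling score can be sketched as follows. The fair-odds payout (a bet of r points against an event the reference model assigns probability p returns r(1-p)/p on success) makes the expected gain exactly zero under the reference model, which is the "fair rule" the abstract describes:

```python
def gambling_score(bets, outcomes, p_ref):
    """Net reputation-point change for a sequence of predictions.

    bets: points bet on each prediction; outcomes: whether each
    prediction succeeded; p_ref: probability the reference model
    (the house) assigned to each predicted event. Toy sketch of the
    scoring rule described in the abstract.
    """
    total = 0.0
    for r, hit, p in zip(bets, outcomes, p_ref):
        total += r * (1 - p) / p if hit else -r
    return total

# Three predictions, 1 point bet each, against a reference model that
# deems the predicted events unlikely (p = 0.1): wins pay 9 points.
score = gambling_score([1, 1, 1], [True, False, True], [0.1, 0.1, 0.1])
print(f"net reputation change: {score:.1f}")
```

A forecaster who beats the Poisson-like reference accumulates points faster than chance; trivial predictions of likely events (high p) earn almost nothing, which is how the score compensates for the risk taken.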
Operational hydrological forecasting in Bavaria. Part II: Ensemble forecasting
NASA Astrophysics Data System (ADS)
Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.
2009-04-01
In part I of this study, the operational flood forecasting system in Bavaria and an approach to identify and quantify forecast uncertainty was introduced. The approach is split into the calculation of an empirical 'overall error' from archived forecasts and the calculation of an empirical 'model error' based on hydrometeorological forecast tests, where rainfall observations were used instead of forecasts. The 'model error' can, especially in upstream catchments where forecast uncertainty depends strongly on the current predictability of the atmosphere, be superimposed on the spread of a hydrometeorological ensemble forecast. In Bavaria, two meteorological ensemble prediction systems are currently tested for operational use: the 16-member COSMO-LEPS forecast and a poor man's ensemble composed of DWD GME, DWD Cosmo-EU, NCEP GFS, Aladin-Austria and MeteoSwiss Cosmo-7. The determination of the overall forecast uncertainty depends on the catchment characteristics: 1. Upstream catchments with high influence of the weather forecast: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. b) Corresponding to the characteristics of the meteorological ensemble forecast, each resulting forecast hydrograph can be regarded as equally likely. c) The 'model error' distribution, with parameters dependent on the hydrological case and lead time, is added to each forecast timestep of each ensemble member. d) For each forecast timestep, the overall error distribution (i.e. over all 'model error' distributions of the ensemble members) is calculated. e) From this distribution, the uncertainty range at a desired level (here: the 10% and 90% percentiles) is extracted and drawn as a forecast envelope. f) As the mean or median of an ensemble forecast does not necessarily exhibit a meteorologically sound temporal evolution, a single hydrological forecast termed the 'lead forecast' is chosen and shown in addition to the uncertainty bounds.
This can be either an intermediate forecast between the extremes of the ensemble spread or a manually selected forecast based on a meteorologist's advice. 2. Downstream catchments with low influence of the weather forecast: In downstream catchments with strong human impact on discharge (e.g. by reservoir operation) and large influence of upstream gauge observation quality on forecast quality, the 'overall error' may in most cases be larger than the combination of the 'model error' and an ensemble spread. Therefore, the overall forecast uncertainty bounds are calculated differently: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. Here, additionally, the corresponding inflow hydrographs from all upstream catchments must be used. b) As for an upstream catchment, the uncertainty range is determined by combination of the 'model error' and the ensemble member forecasts. c) In addition, the 'overall error' is superimposed on the 'lead forecast'. For reasons of consistency, the lead forecast must be based on the same meteorological forecast in the downstream and all upstream catchments. d) From the resulting two uncertainty ranges (one from the ensemble forecast and 'model error', one from the 'lead forecast' and 'overall error'), the envelope is taken as the most prudent uncertainty range. In sum, the uncertainty associated with each forecast run is calculated and communicated to the public in the form of 10% and 90% percentiles. As in part I of this study, the methodology, as well as the usefulness (or otherwise) of the resulting uncertainty ranges, will be presented and discussed using typical examples.
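The upstream-catchment recipe (steps a-e) can be sketched as follows; the Gaussian form of the 'model error' and the toy numbers are our simplifying assumptions:

```python
import numpy as np

def forecast_envelope(members, model_error_std, n_samples=10000, rng=None):
    """10% / 90% forecast envelope per timestep: superimpose a
    'model error' distribution on each equally likely ensemble
    member, pool the samples, and extract percentiles.

    members: (n_members, n_timesteps) hydrological ensemble forecast.
    """
    rng = rng or np.random.default_rng(6)
    n_members, n_steps = members.shape
    draws = members[:, None, :] + rng.normal(
        0.0, model_error_std, size=(n_members, n_samples, n_steps))
    pooled = draws.reshape(-1, n_steps)     # overall error distribution
    return np.percentile(pooled, [10, 90], axis=0)

# Three ensemble member hydrographs over two timesteps (discharge):
members = np.array([[100.0, 120.0], [110.0, 140.0], [90.0, 130.0]])
lo, hi = forecast_envelope(members, model_error_std=5.0)
print(f"t=0 envelope: [{lo[0]:.1f}, {hi[0]:.1f}]")
```

The envelope is wider than the raw ensemble spread because the 'model error' is superimposed on every member before the percentiles are taken, mirroring steps c)-e).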
Multivariate postprocessing techniques for probabilistic hydrological forecasting
NASA Astrophysics Data System (ADS)
Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian
2016-04-01
Hydrologic ensemble forecasts driven by atmospheric ensemble prediction systems need statistical postprocessing in order to account for systematic errors in terms of both mean and spread. Runoff is an inherently multivariate process with typical events lasting from hours in case of floods to weeks or even months in case of droughts. This calls for multivariate postprocessing techniques that yield well calibrated forecasts in univariate terms and ensure a realistic temporal dependence structure at the same time. To this end, the univariate ensemble model output statistics (EMOS; Gneiting et al., 2005) postprocessing method is combined with two different copula approaches that ensure multivariate calibration throughout the entire forecast horizon. These approaches comprise ensemble copula coupling (ECC; Schefzik et al., 2013), which preserves the dependence structure of the raw ensemble, and a Gaussian copula approach (GCA; Pinson and Girard, 2012), which estimates the temporal correlations from training observations. Both methods are tested in a case study covering three subcatchments of the river Rhine that represent different sizes and hydrological regimes: the Upper Rhine up to the gauge Maxau, the river Moselle up to the gauge Trier, and the river Lahn up to the gauge Kalkofen. The results indicate that both ECC and GCA are suitable for modelling the temporal dependences of probabilistic hydrologic forecasts (Hemri et al., 2015). References Gneiting, T., A. E. Raftery, A. H. Westveld, and T. Goldman (2005), Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation, Monthly Weather Review, 133(5), 1098-1118, DOI: 10.1175/MWR2904.1. Hemri, S., D. Lisniak, and B. Klein (2015), Multivariate postprocessing techniques for probabilistic hydrological forecasting, Water Resources Research, 51(9), 7436-7451, DOI: 10.1002/2014WR016473. Pinson, P., and R. 
Girard (2012), Evaluating the quality of scenarios of short-term wind power generation, Applied Energy, 96, 12-20, DOI: 10.1016/j.apenergy.2011.11.004. Schefzik, R., T. L. Thorarinsdottir, and T. Gneiting (2013), Uncertainty quantification in complex simulation models using ensemble copula coupling, Statistical Science, 28, 616-640, DOI: 10.1214/13-STS443.
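The ECC idea, re-imposing the raw ensemble's temporal rank structure on univariately calibrated margins, can be sketched as follows. The Gaussian margins standing in for EMOS output and the bias/spread corrections are assumptions for illustration, not the study's fitted values.

```python
import numpy as np

rng = np.random.default_rng(1)
n_members, n_lead = 8, 10

# Raw ensemble runoff forecasts (members x lead times); these carry the temporal
# dependence structure that ECC preserves.
raw = 50.0 + np.cumsum(rng.standard_normal((n_members, n_lead)), axis=1)

# Univariately calibrated margins per lead time (assumed Gaussian, standing in for EMOS).
mu = raw.mean(axis=0) + 1.0        # assumed bias correction
sigma = 1.2 * raw.std(axis=0)      # assumed spread correction

# Draw one calibrated sample per member and lead time, then sort per lead time...
calibrated_sorted = np.sort(mu + sigma * rng.standard_normal((n_members, n_lead)), axis=0)

# ...and reorder the sorted samples according to the raw ensemble's ranks (the ECC step).
ranks = raw.argsort(axis=0).argsort(axis=0)
ecc = np.take_along_axis(calibrated_sorted, ranks, axis=0)
```

Each ECC member is univariately calibrated yet ordered like the raw ensemble at every lead time, which is what restores a realistic temporal dependence structure.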
Evaluation of the CFSv2 CMIP5 decadal predictions
NASA Astrophysics Data System (ADS)
Bombardi, Rodrigo J.; Zhu, Jieshun; Marx, Lawrence; Huang, Bohua; Chen, Hua; Lu, Jian; Krishnamurthy, Lakshmi; Krishnamurthy, V.; Colfescu, Ioana; Kinter, James L.; Kumar, Arun; Hu, Zeng-Zhen; Moorthi, Shrinivas; Tripp, Patrick; Wu, Xingren; Schneider, Edwin K.
2015-01-01
Retrospective decadal forecasts were undertaken using the Climate Forecast System version 2 (CFSv2) as part of phase 5 of the Coupled Model Intercomparison Project (CMIP5). Decadal forecasts were performed separately by the National Centers for Environmental Prediction (NCEP) and by the Center for Ocean-Land-Atmosphere Studies (COLA), with the centers using two different analyses for the ocean initial conditions: the NCEP Climate Forecast System Reanalysis (CFSR) and the NEMOVAR-COMBINE analysis. COLA also examined the sensitivity to the inclusion of forcing by specified volcanic aerosols. Biases in the CFSv2 for both sets of initial conditions include cold midlatitude sea surface temperatures and rapid melting of sea ice associated with warm polar oceans. Forecasts from the NEMOVAR-COMBINE analysis showed strong weakening of the Atlantic Meridional Overturning Circulation (AMOC), eventually approaching the weaker AMOC associated with CFSR. The decadal forecasts showed high predictive skill over the Indian, the western Pacific, and the Atlantic Oceans and low skill over the central and eastern Pacific. The volcanic forcing shows only small regional differences in the predictability of surface temperature at 2 m (T2m) in comparison to forecasts without volcanic forcing, especially over the Indian Ocean. An ocean heat content (OHC) budget analysis showed that the OHC has substantial memory, indicating potential for decadal predictability of T2m; however, the model has a systematic drift in global mean OHC. The results suggest that the reduction of model biases may be the most productive path towards improving the model's decadal forecasts.
Load Forecasting of Central Urban Area Power Grid Based on Saturated Load Density Index
NASA Astrophysics Data System (ADS)
Huping, Yang; Chengyi, Tang; Meng, Yu
2018-03-01
In today's society, coordination between urban power grid development and city development has become more and more prominent. Saturated load forecasting plays an important role in the planning and development of power grids; it is a new concept put forward in China in recent years in the field of grid planning. Urban saturation load forecasting differs from traditional load forecasting for specific years: its time span is often relatively large, and it involves a wide range of aspects. Taking a county in eastern Jiangxi as an example, this paper applies a variety of load forecasting methods to near-term load forecasting for the central urban area. At the same time, the load density index method is used for long-term forecasting of the saturated electric load of the central urban area, extending to 2030. A further study shows the general spatial distribution of the urban saturation load.
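The load density index calculation behind such a saturation forecast reduces to areas times class densities, scaled by a coincidence factor. A minimal sketch with invented land-use blocks, densities and coincidence factor (none taken from the paper):

```python
# Hypothetical land-use blocks for a central urban area:
# name -> (area in km^2, assumed saturated load density in MW/km^2).
blocks = {
    "residential": (12.0, 8.0),
    "commercial": (3.5, 30.0),
    "industrial": (6.0, 20.0),
}

# Assumed coincidence (simultaneity) factor among the block peak loads.
simultaneity = 0.85

# Saturated load estimate: sum of block loads, scaled by the coincidence factor.
saturated_load_mw = simultaneity * sum(area * density for area, density in blocks.values())
```

Mapping each block's contribution back onto the map is what yields the spatial distribution of the saturation load mentioned above.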
Long-Range Forecasting Of The Onset Of Southwest Monsoon Winds And Waves Near The Horn Of Africa
2017-12-01
SUMMARY OF CLIMATE ANALYSIS AND LONG-RANGE FORECAST METHODOLOGY: prior theses from Heidt (2006) and Lemke (2010) used methods similar to ours. (The remainder of this excerpt is table-of-contents residue: II. Data and Methods; Analysis and Forecast Methods; Predictand Selection.)
Predicting Academic Library Circulations: A Forecasting Methods Competition.
ERIC Educational Resources Information Center
Brooks, Terrence A.; Forys, John W., Jr.
Based on sample data representing five years of monthly circulation totals from 50 academic libraries in Illinois, Iowa, Michigan, Minnesota, Missouri, and Ohio, a study was conducted to determine the most efficient smoothing forecasting methods for academic libraries. Smoothing forecasting methods were chosen because they have been characterized…
Volcano Deformation and Eruption Forecasting using Data Assimilation: Building the Strategy
NASA Astrophysics Data System (ADS)
Bato, M. G.; Pinel, V.; Yan, Y.
2016-12-01
In monitoring active volcanoes, the magma overpressure is one of the key parameters used in forecasting volcanic eruptions. It can be inferred from the ground displacements measured at the Earth's surface by applying inversion techniques. During the inversion, however, we lose the temporal characteristics, along with a great deal of information, of the behaviour of the volcano. Our work focuses on developing a strategy to better forecast the magma overpressure using data assimilation. Data assimilation is a sequential, time-forward process that optimally combines models and observations, sometimes with a priori information based on error statistics, to predict the state of a dynamical system. It has gained popularity in various fields of geoscience (e.g. ocean-weather forecasting, geomagnetism and natural resources exploration), but remains a new and emerging technique in the field of volcanology. With the increasing amount of geodetic data (i.e. InSAR and GPS) recorded on volcanoes nowadays, and the wide availability of dynamical models that can provide a better understanding of the volcano plumbing system, developing a forecasting framework that can efficiently combine them is crucial. Here, we build our strategy on the basis of the Ensemble Kalman Filter (EnKF) [1]. We predict the temporal behaviour of the magma overpressures and surface deformations by adopting the two-magma-chamber model proposed by Reverso et al., 2014 [2] and by using synthetic GPS and/or InSAR data. Several tests are performed in order to: 1) assess the efficiency of EnKF in forecasting volcanic unrest, 2) constrain unknown parameters of the model, 3) determine how to properly use GPS and/or InSAR data during assimilation, and 4) compare EnKF with classic inversion using the same dynamical model. Results show that EnKF works well in the synthetic cases, and there is great potential in utilising the method for real-time monitoring of volcanic unrest. 
[1] Evensen, G. (2003), The Ensemble Kalman Filter: theoretical formulation and practical implementation, Ocean Dyn., 53, 343-367. [2] Reverso, T., J. Vandemeulebrouck, F. Jouanne, V. Pinel, T. Villemin, and E. Sturkell (2014), A two-magma chamber as a source of deformation at Grímsvötn volcano, Iceland, JGR.
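The EnKF analysis step at the core of such a strategy can be sketched in a few lines. Everything here is synthetic: a two-component state (an overpressure-like and a displacement-like variable), a single assumed GPS-type observation of the second component, and invented prior statistics; the stochastic ('perturbed observations') variant of the filter is used.

```python
import numpy as np

rng = np.random.default_rng(2)
n_ens, n_state = 50, 2

# Hypothetical prior ensemble: row 0 ~ overpressure-like, row 1 ~ surface-displacement-like.
X = rng.standard_normal((n_state, n_ens)) * np.array([[5.0], [1.0]]) + np.array([[10.0], [2.0]])

H = np.array([[0.0, 1.0]])   # only the displacement component is observed
R = np.array([[0.1]])        # assumed observation-error covariance
y = np.array([2.5])          # one synthetic GPS-type observation

# Ensemble covariance and Kalman gain.
A = X - X.mean(axis=1, keepdims=True)
P = A @ A.T / (n_ens - 1)
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)

# Perturb the observation for each member, then update the whole ensemble.
Y = y[:, None] + np.sqrt(R) @ rng.standard_normal((1, n_ens))
Xa = X + K @ (Y - H @ X)
```

Cycling this update over successive observations is what lets the filter track the overpressure in time; the unobserved component is corrected through the ensemble covariance.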
Development of a satellite-based nowcasting system for surface solar radiation
NASA Astrophysics Data System (ADS)
Limbach, Sebastian; Hungershoefer, Katja; Müller, Richard; Trentmann, Jörg; Asmus, Jörg; Schömer, Elmar; Groß, André
2014-05-01
The goal of the RadNowCast project was the development of a tool-chain for satellite-based nowcasting of the all-sky global and direct surface solar radiation. One important application of such short-term forecasts is the computation of the expected energy yield of photovoltaic systems. This information is of great importance for an efficient balancing of power generation and consumption in large, decentralized power grids. Our nowcasting approach is based on an optical-flow analysis of a series of Meteosat SEVIRI satellite images. For this, we extended and combined several existing software tools and set up a series of benchmarks for determining the optimal forecasting parameters. The first step in our processing-chain is the determination of the cloud albedo from the HRV (High Resolution Visible) satellite images using a Heliosat-type method. The actual nowcasting is then performed by a commercial software system in two steps: First, vector fields characterizing the movement of the clouds are derived from the cloud albedo data from the previous 15 min to 2 hours. Next, these vector fields are combined with the most recent cloud albedo data in order to extrapolate the cloud albedo in the near future. In the last step of the processing, the Gnu-Magic software is used to calculate the global and direct solar radiation based on the forecasted cloud albedo data. For an evaluation of the strengths and weaknesses of our nowcasting system, we analyzed four different benchmarks, each of which covered different weather conditions. We compared the forecasted data with radiation data derived from the real satellite images of the corresponding time steps. The impact of different parameters on the cloud albedo nowcasting and the surface radiation computation has been analysed. Additionally, we could show that our cloud-albedo-based forecasts outperform forecasts based on the original HRV images. 
Possible future extensions include the incorporation of additional data sources (for example, NWC-SAF high-resolution wind fields) to improve the quality of the atmospheric motion fields, as well as experiments with custom, optimized software components for the optical-flow estimation and the nowcasting.
Statistical Short-Range Forecast Guidance for Cloud Ceilings Over the Shuttle Landing Facility
NASA Technical Reports Server (NTRS)
Lambert, Winifred C.
2001-01-01
This report describes the results of the AMU's Short-Range Statistical Forecasting task. The cloud ceiling forecast over the Shuttle Landing Facility (SLF) is a critical element in determining whether a Shuttle should land. Spaceflight Meteorology Group (SMG) forecasters find that ceilings at the SLF are challenging to forecast. The AMU was tasked to develop ceiling forecast equations to minimize the challenge. Studies in the literature that showed success in improving short-term forecasts of ceiling provided the basis for the AMU task. A 20-year record of cool-season hourly surface observations from stations in east-central Florida was used for the equation development. Two methods were used: an observations-based (OBS) method that incorporated data from all stations, and a persistence climatology (PCL) method used as the benchmark. Equations were developed for 1-, 2-, and 3-hour lead times at each hour of the day. A comparison between the two methods indicated that the OBS equations performed well and produced an improvement over the PCL equations. Therefore, the conclusion of the AMU study is that OBS equations produced more accurate forecasts than the PCL equations, and can be used in operations. They provide another tool with which to make the ceiling forecasts that are critical to safe Shuttle landings at KSC.
A framework for improving a seasonal hydrological forecasting system using sensitivity analysis
NASA Astrophysics Data System (ADS)
Arnal, Louise; Pappenberger, Florian; Smith, Paul; Cloke, Hannah
2017-04-01
Seasonal streamflow forecasts are of great value for the socio-economic sector, for applications such as navigation, flood and drought mitigation, and reservoir management for hydropower generation and water allocation to agriculture and drinking water. However, at present, the performance of dynamical seasonal hydrological forecasting systems (systems based on running seasonal meteorological forecasts through a hydrological model to produce seasonal hydrological forecasts) is still limited in space and time. In this context, ESP (Ensemble Streamflow Prediction) remains an attractive method for seasonal streamflow forecasting, as it relies on forcing a hydrological model (starting from the latest observed or simulated initial hydrological conditions) with historical meteorological observations. This makes it cheaper to run than a standard dynamical seasonal hydrological forecasting system, for which the seasonal meteorological forecasts first have to be produced, while still producing skilful forecasts. There is thus a need to focus resources and time on improvements in dynamical seasonal hydrological forecasting systems that will eventually lead to significant improvements in the skill of the streamflow forecasts generated. Sensitivity analyses are a powerful tool that can be used to disentangle the relative contributions of the two main sources of errors in seasonal streamflow forecasts, namely the initial hydrological conditions (IHC; e.g., soil moisture, snow cover and initial streamflow, among others) and the meteorological forcing (MF; i.e., seasonal meteorological forecasts of precipitation and temperature, input to the hydrological model). Sensitivity analyses are, however, most useful if they inform and change current operational practices. To this end, we propose a method to improve the design of a seasonal hydrological forecasting system. 
This method is based on sensitivity analyses, informing the forecasters as to which element of the forecasting chain (i.e., IHC or MF) could potentially lead to the highest increase in seasonal hydrological forecasting performance, after each forecast update.
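An ESP run as described, the same initial hydrological conditions forced with each historical meteorological year, can be sketched with a toy linear-reservoir model. The model, the 20-year synthetic precipitation archive and the initial storage are all assumptions standing in for the operational components.

```python
import numpy as np

def linear_reservoir(initial_storage, precip, k=0.1, c=0.5):
    """Toy hydrological model: a single linear reservoir (stand-in for the real model)."""
    storage, flows = initial_storage, []
    for p in precip:
        storage += c * p          # effective rainfall recharges storage
        q = k * storage           # outflow proportional to storage
        storage -= q
        flows.append(q)
    return np.array(flows)

rng = np.random.default_rng(3)
historical_precip = rng.gamma(2.0, 3.0, size=(20, 90))  # assumed: 20 past years x 90 days
ihc = 150.0                                             # latest simulated storage (the IHC)

# ESP: one ensemble member per historical year, all starting from the same IHC.
esp = np.array([linear_reservoir(ihc, year) for year in historical_precip])
seasonal_totals = esp.sum(axis=1)
```

Swapping the IHC or the forcing between members is exactly the kind of perturbation a sensitivity analysis would apply to attribute forecast error to the IHC or the MF.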
A simplified real time method to forecast semi-enclosed basins storm surge
NASA Astrophysics Data System (ADS)
Pasquali, D.; Di Risio, M.; De Girolamo, P.
2015-11-01
Semi-enclosed basins are often prone to storm surge events. Indeed, their meteorological exposure, the presence of a large continental shelf and their shape can lead to strong sea level set-up. A real-time system aimed at forecasting storm surge can be of great help in protecting human activities (i.e. forecasting flooding due to storm surge events), managing ports and safeguarding coastal safety. This paper illustrates a simple method able to forecast storm surge events in semi-enclosed basins in real time. The method is based on a mixed approach in which the results obtained by means of a simplified physics-based model with low computational costs are corrected by means of statistical techniques. The proposed method is applied to a point of interest located in the northern part of the Adriatic Sea. The comparison of forecasted levels against observed values shows the satisfactory reliability of the forecasts.
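The mixed approach, a cheap physics-based estimate corrected statistically, can be sketched with a linear correction fitted on a synthetic hindcast. The linear form and the synthetic data are assumptions for illustration, not the paper's actual correction.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic hindcast: simplified-model surge levels vs. 'observed' levels (m).
model_levels = rng.uniform(0.0, 1.0, 200)
observed = 0.15 + 1.3 * model_levels + 0.05 * rng.standard_normal(200)

# Fit the statistical correction (here: ordinary least squares on a line).
slope, intercept = np.polyfit(model_levels, observed, deg=1)

# Real-time use: run the cheap physics-based model, then apply the correction.
new_model_level = 0.7
corrected_forecast = slope * new_model_level + intercept
```

Because the correction is fitted offline, the real-time cost stays that of the simplified model alone.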
NASA Astrophysics Data System (ADS)
Busuioc, Aristita; Dumitrescu, Alexandru; Dumitrache, Rodica; Iriza, Amalia
2017-04-01
Seasonal climate forecasts in Europe are currently issued at the European Centre for Medium-Range Weather Forecasts (ECMWF) in the form of multi-model ensemble predictions available within the "EUROSIP" system. Different statistical techniques to calibrate, downscale and combine the EUROSIP direct model output are used to optimize the quality of the final probabilistic forecasts. In this study, a statistical downscaling model (SDM) based on canonical correlation analysis (CCA) is used to downscale the EUROSIP seasonal forecast at a spatial resolution of 1 km x 1 km over the Movila farm, located in southeastern Romania. This application is achieved in the framework of the H2020 MOSES project (http://www.moses-project.eu). The combination of monthly standardized values of three climate variables (maximum/minimum temperature, Tmax/Tmin; total precipitation, Prec) is used as the predictand, while combinations of various large-scale predictors are tested in terms of their availability as outputs of the EUROSIP seasonal probabilistic forecasting system (sea level pressure, temperature at 850 hPa and geopotential height at 500 hPa). The predictors are taken from the ECMWF system considering 15 members of the ensemble, for which hindcasts from 1991 to the present are available. The model was calibrated over the period 1991-2014, and predictions for the summers of 2015 and 2016 were produced. The calibration was made for the ensemble average as well as for each ensemble member. The model was developed for each lead time: one-month anticipation for June, two-month anticipation for July and three-month anticipation for August. 
The main conclusions from these preliminary results are: the best predictions (in terms of the anomaly sign) were obtained for Tmax (July: 2-month anticipation; August: 3-month anticipation) for both years (2015, 2016); for Tmin, good predictions only for August (3-month anticipation) for both years; for precipitation, good predictions for July (2-month anticipation) in 2015 and August (3-month anticipation) in 2016; and failed predictions for June (1-month anticipation) for all parameters. To see whether the results obtained for the 2015 and 2016 summers are in agreement with the general ECMWF model performance in forecasting the three predictors used in the CCA SDM calibration, the mean bias and root mean square error (RMSE) were computed over the entire period in each grid point, for each ensemble member and for the ensemble average. The obtained results are confirmed, showing the highest ECMWF performance in forecasting the three predictors at 3-month anticipation (August) and the lowest performance at 1-month anticipation (June). The added value of the CCA SDM in forecasting local Tmax/Tmin and total precipitation was compared to the ECMWF performance using the nearest grid point method. Comparisons were performed for the 1991-2014 period, taking into account the forecast made in May for July. An important improvement was found for the CCA SDM predictions in terms of the RMSE value (computed against observations) for Tmax/Tmin, and less so for precipitation. Tests are in progress for the other summer months (June, July).
Remote Sensing and River Discharge Forecasting for Major Rivers in South Asia (Invited)
NASA Astrophysics Data System (ADS)
Webster, P. J.; Hopson, T. M.; Hirpa, F. A.; Brakenridge, G. R.; De-Groeve, T.; Shrestha, K.; Gebremichael, M.; Restrepo, P. J.
2013-12-01
South Asia is a flashpoint for natural disasters; in particular, flooding of the Indus, Ganges, and Brahmaputra has profound societal impacts for the region and globally. The 2007 Brahmaputra floods affecting India and Bangladesh, the 2008 avulsion of the Kosi River in India, the 2010 flooding of the Indus River in Pakistan and the 2013 Uttarakhand floods exemplify disasters on scales almost inconceivable elsewhere. The frequent occurrence of floods, combined with large and rapidly growing populations, high levels of poverty and low resilience, exacerbates the impact of these hazards. Mitigation of these devastating hazards is further complicated by limited flood forecast capability, a lack of rain/gauge measuring stations and of forecast use within and outside each country, and limited transboundary data sharing on natural hazards. Here, we demonstrate the utility of remotely derived hydrologic and weather products in producing skillful flood forecasting information without reliance on vulnerable in situ data sources. Over the last decade, a forecast system providing operational probabilistic forecasts of severe flooding of the Brahmaputra and Ganges Rivers in Bangladesh was developed (Hopson and Webster 2010). The system utilizes ECMWF weather forecast uncertainty information and ensemble weather forecasts, rain gauge and satellite-derived precipitation estimates, together with the limited near-real-time river stage observations from Bangladesh. This system has been expanded to Pakistan and successfully forecast the 2010-2012 flooding (Shrestha and Webster 2013). To overcome the in situ hydrological data problem, recent efforts in parallel with the numerical modeling have utilized microwave satellite remote sensing of river widths to generate operational advection-based discharge forecasts for the Ganges and Brahmaputra. More than twenty remotely sensed locations upstream of Bangladesh were used to produce stand-alone river flow nowcasts and forecasts at 1-15 days lead time, 
showing that satellite-based flow estimates are a useful source of dynamical surface water information in data-scarce regions and that they could be used for model calibration and data assimilation purposes in near-real-time hydrologic forecast applications (Hirpa et al. 2013). More recent efforts during this year's monsoon season are optimally combining these different independent sources of river forecast information, along with archived flood inundation imagery from the Dartmouth Flood Observatory, to improve the visualization and overall skill of the ongoing CFAB ensemble weather-forecast-based flood forecasting system for Bangladesh.
Lu, Fred Sun; Hou, Suqin; Baltrusaitis, Kristin; Shah, Manan; Leskovec, Jure; Sosic, Rok; Hawkins, Jared; Brownstein, John; Conidi, Giuseppe; Gunn, Julia; Gray, Josh; Zink, Anna
2018-01-01
Background: Influenza outbreaks pose major challenges to public health around the world, leading to thousands of deaths a year in the United States alone. Accurate systems that track influenza activity at the city level are necessary to provide actionable information that can be used for clinical, hospital, and community outbreak preparation. Objective: Although Internet-based real-time data sources such as Google searches and tweets have been successfully used to produce influenza activity estimates ahead of traditional health care–based systems at national and state levels, influenza tracking and forecasting at finer spatial resolutions, such as the city level, remain an open question. Our study aimed to present a precise, near real-time methodology capable of producing influenza estimates ahead of those collected and published by the Boston Public Health Commission (BPHC) for the Boston metropolitan area. This approach has great potential to be extended to other cities with access to similar data sources. Methods: We first tested the ability of Google searches, Twitter posts, electronic health records, and a crowd-sourced influenza reporting system to detect influenza activity in the Boston metropolitan area separately. We then adapted a multivariate dynamic regression method named ARGO (autoregression with general online information), designed for tracking influenza at the national level, and showed that it effectively uses the above data sources to monitor and forecast influenza at the city level 1 week ahead of the current date. Finally, we presented an ensemble-based approach capable of combining information from models based on multiple data sources to more robustly nowcast as well as forecast influenza activity in the Boston metropolitan area. The performances of our models were evaluated in an out-of-sample fashion over 4 influenza seasons within 2012-2016, as well as a holdout validation period from 2016 to 2017. 
Results: Our ensemble-based methods incorporating information from diverse models based on multiple data sources, including ARGO, produced the most robust and accurate results. The observed Pearson correlations between our out-of-sample flu activity estimates and those historically reported by the BPHC were 0.98 in nowcasting influenza and 0.94 in forecasting influenza 1 week ahead of the current date. Conclusions: We show that information from Internet-based data sources, when combined using an informed, robust methodology, can be effectively used as early indicators of influenza activity at fine geographic resolutions. PMID:29317382
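An ARGO-style nowcast combines autoregressive lags of the flu signal with current Internet-based proxies in a single regression. The sketch below uses synthetic series and closed-form ridge regression as a stand-in for ARGO's L1-regularised fit; all series, lag counts and the penalty are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
T, lags = 120, 3

# Synthetic weekly flu-activity series plus two noisy Internet-based proxies.
flu = 10 + 5 * np.sin(2 * np.pi * np.arange(T) / 52) + 0.3 * rng.standard_normal(T)
search = flu + 0.5 * rng.standard_normal(T)   # e.g. a search-volume signal
tweets = flu + 0.8 * rng.standard_normal(T)   # e.g. a Twitter signal

# Design matrix: past flu values (autoregression) + current exogenous proxies
# (the "general online information" part of ARGO).
X = np.array([np.r_[flu[t - lags:t], search[t], tweets[t]] for t in range(lags, T)])
y = flu[lags:]

# Ridge regression (closed form) as a simple stand-in for ARGO's LASSO.
lam = 0.1
beta = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
nowcast = X @ beta
```

An ensemble approach like the one in the study would combine this model's output with other data-source-specific models, weighted by recent performance.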
Reservoir studies with geostatistics to forecast performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, R.W.; Behrens, R.A.; Emanuel, A.S.
1991-05-01
In this paper, example geostatistics and streamtube applications are presented for a waterflood and a CO{sub 2} flood in two low-permeability sandstone reservoirs. The hybrid approach of combining fine vertical resolution in cross-sectional models with streamtubes resulted in models that showed water channeling and provided realistic performance estimates. Results indicate that the combination of detailed geostatistical cross sections and fine-grid streamtube models offers a systematic approach for realistic performance forecasts.
NASA Astrophysics Data System (ADS)
Aulov, Oleg
This dissertation presents a novel approach that utilizes quantifiable social media data as a human aware, near real-time observing system, coupled with geophysical predictive models for improved response to disasters and extreme events. It shows that social media data has the potential to significantly improve disaster management beyond informing the public, and emphasizes the importance of different roles that social media can play in management, monitoring, modeling and mitigation of natural and human-caused extreme disasters. In the proposed approach Social Media users are viewed as "human sensors" that are "deployed" in the field, and their posts are considered to be "sensor observations", thus different social media outlets all together form a Human Sensor Network. We utilized the "human sensor" observations, as boundary value forcings, to show improved geophysical model forecasts of extreme disaster events when combined with other scientific data such as satellite observations and sensor measurements. Several recent extreme disasters are presented as use case scenarios. In the case of the Deepwater Horizon oil spill disaster of 2010 that devastated the Gulf of Mexico, the research demonstrates how social media data from Flickr can be used as a boundary forcing condition of GNOME oil spill plume forecast model, and results in an order of magnitude forecast improvement. In the case of Hurricane Sandy NY/NJ landfall impact of 2012, we demonstrate how the model forecasts, when combined with social media data in a single framework, can be used for near real-time forecast validation, damage assessment and disaster management. Owing to inherent uncertainties in the weather forecasts, the NOAA operational surge model only forecasts the worst-case scenario for flooding from any given hurricane. 
Geolocated and time-stamped Instagram photos and tweets allow near real-time assessment of the surge levels at different locations, which can validate model forecasts, give timely views of the actual levels of surge, as well as provide an upper bound beyond which the surge did not spread. Additionally, we developed AsonMaps---a crisis-mapping tool that combines dynamic model forecast outputs with social media observations and physical measurements to define the regions of event impacts.
NASA Astrophysics Data System (ADS)
Welling, D. T.; Manchester, W.; Savani, N.; Sokolov, I.; van der Holst, B.; Jin, M.; Toth, G.; Liemohn, M. W.; Gombosi, T. I.
2017-12-01
The future of space weather prediction depends on the community's ability to predict L1 values from observations of the solar atmosphere, which can yield hours of lead time. While both empirical and physics-based L1 forecast methods exist, it is not yet known if this nascent capability can translate to skilled dB/dt forecasts at the Earth's surface. This paper shows results for the first forecast-quality, solar-atmosphere-to-Earth's-surface dB/dt predictions. Two methods are used to predict solar wind and IMF conditions at L1 for several real-world coronal mass ejection events. The first method is an empirical and observationally based system to estimate the plasma characteristics. The magnetic field predictions are based on the Bz4Cast system, which assumes that the CME has a cylindrical flux rope geometry locally around Earth's trajectory. The remaining plasma parameters of density, temperature and velocity are estimated from white-light coronagraphs via a variety of triangulation methods and forward-based modelling. The second is a first-principles-based approach that combines the Eruptive Event Generator using Gibson-Low configuration (EEGGL) model with the Alfvén Wave Solar Model (AWSoM). EEGGL specifies parameters for the Gibson-Low flux rope such that it erupts, driving a CME in the coronal model that reproduces coronagraph observations and propagates to 1 AU. The resulting solar wind predictions are used to drive the operational Space Weather Modeling Framework (SWMF) for geospace. Following the configuration used by NOAA's Space Weather Prediction Center, this setup couples the BATS-R-US global magnetohydrodynamic model to the Rice Convection Model (RCM) ring current model and a height-integrated ionosphere electrodynamics model. The long-lead-time predictions of dB/dt are compared to model results that are driven by L1 solar wind observations. Both are compared to real-world observations from surface magnetometers at a variety of geomagnetic latitudes. 
Metrics are calculated to examine how the simulated solar wind drivers impact forecast skill. These results illustrate the current state of long-lead-time forecasting and the promise of this technology for operational use.
A Comparison of Synoptic Classification Methods for Application to Wind Power Prediction
NASA Astrophysics Data System (ADS)
Fowler, P.; Basu, S.
2008-12-01
Wind energy is a highly variable resource. To make it competitive with other sources of energy for integration on the power grid, at the very least a day-ahead forecast of power output must be available. In many grid operations worldwide, next-day power output is scheduled in 30-minute intervals and grid management routinely occurs in real time. Maintenance and repairs require costly time to complete and must be scheduled along with normal operations. Revenue is dependent on the reliability of the entire system. In other words, there is financial and managerial benefit to short-term prediction of wind power. One approach to short-term forecasting is to combine a data-centric method, such as an artificial neural network, with a physically based approach like numerical weather prediction (NWP). The key is in associating high-dimensional NWP model output with the most appropriately trained neural network. Because neural networks perform best in the situations they are designed for, one can hypothesize that if similar recurring states can be identified in historical weather data, these data can be used to train multiple custom-designed neural networks to be used when called upon by numerical prediction. Identifying similar recurring states may offer insight into how a neural network forecast can be improved, but amassing the knowledge and utilizing it efficiently in the time required for power prediction would be difficult for a human to master, thus showing the advantage of classification. Classification methods are important tools for short-term forecasting because they can be unsupervised, objective, and computationally quick. They primarily involve categorizing data sets into dominant weather classes, but there are numerous ways to define a class and great variety in the interpretation of the results. In the present study, a collection of classification methods is used on a sampling of atmospheric variables from the North American Regional Reanalysis data set. 
The results will be discussed in relation to their use for short-term wind power forecasting by neural networks.
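The routing idea described above, classify the weather state, then hand the forecast to a model trained for that state, can be sketched with a toy stand-in. Everything below (the 1-D k-means on forecast wind speed, the per-state mean-power "models", all numbers) is invented for illustration; the study itself applies multiple classification methods to North American Regional Reanalysis fields.

```python
import random

def kmeans_1d(values, k, iters=20, seed=0):
    """Tiny k-means on scalar features (a stand-in for a real
    classification of reanalysis fields)."""
    rng = random.Random(seed)
    centroids = rng.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[idx].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

def assign(v, centroids):
    return min(range(len(centroids)), key=lambda i: abs(v - centroids[i]))

# Hypothetical history: (NWP forecast wind speed, observed power) pairs.
history = [(4.0, 0.1), (5.0, 0.2), (4.5, 0.15),
           (11.0, 0.7), (12.0, 0.8), (11.5, 0.75)]
speeds = [s for s, _ in history]
centroids = kmeans_1d(speeds, k=2)

# One "custom model" per weather state -- here simply the state mean power.
state_models = {}
for s, p in history:
    state_models.setdefault(assign(s, centroids), []).append(p)
state_models = {c: sum(v) / len(v) for c, v in state_models.items()}

# Route a new NWP forecast to the model trained for its state.
new_speed = 11.8
pred = state_models[assign(new_speed, centroids)]
```

In a real system the per-state "model" would be a neural network trained only on days belonging to that state, but the routing logic is the same.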
Statistical evaluation of forecasts
NASA Astrophysics Data System (ADS)
Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn
2014-08-01
Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.
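A minimal sketch of validating a prediction system against an analytic random predictor. The framework in the abstract separately validates sensitivity and specificity and accounts for event timing windows; the binomial baseline below is a deliberately simplified stand-in with made-up numbers.

```python
from math import comb

def p_at_least(n_events, hit_prob, k):
    """P(a random predictor scores >= k hits out of n_events) when each
    alarm independently covers an event with probability hit_prob."""
    return sum(comb(n_events, j) * hit_prob**j * (1 - hit_prob)**(n_events - j)
               for j in range(k, n_events + 1))

# A method that forecast 8 of 10 events, while its alarm rate would let a
# random predictor cover any single event with probability 0.3 (invented):
p_value = p_at_least(10, 0.3, 8)
significant = p_value < 0.05
```

If `p_value` is small, the method's hit count is unlikely to arise from alarms raised independently of the events, which is the essence of the comparison the abstract describes.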
Uncertainties in Forecasting Streamflow using Entropy Theory
NASA Astrophysics Data System (ADS)
Cui, H.; Singh, V. P.
2017-12-01
Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, uncertainties always accompany a forecast; they may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide a reliable streamflow forecast. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Streamflow forecasting therefore entails modeling seasonality, periodicity, and the correlation structure of the series, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory for streamflow forecasting involves determination of spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy. With information theory, we describe how these uncertainties are transported and aggregated through these processes.
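As a small illustration of entropy as an uncertainty measure, the sketch below computes the Shannon entropy of a binned streamflow record. The binning scheme and data are invented; the study works with spectral densities rather than simple histograms.

```python
from math import log2

def shannon_entropy(series, n_bins=5):
    """Entropy (bits) of a series after equal-width binning -- a crude
    proxy for the uncertainty carried by a streamflow record."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / n_bins or 1.0   # guard against a constant series
    counts = [0] * n_bins
    for x in series:
        idx = min(int((x - lo) / width), n_bins - 1)
        counts[idx] += 1
    n = len(series)
    return -sum(c / n * log2(c / n) for c in counts if c)

# A constant flow carries no uncertainty; a spread-out one carries more.
h_flat = shannon_entropy([10.0] * 20)
h_varied = shannon_entropy([1, 3, 5, 7, 9, 2, 4, 6, 8, 10])
```

Here `h_flat` is 0 bits while `h_varied` reaches the maximum log2(5) for five equally occupied bins, matching the intuition that a more variable flow is harder to forecast.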
NASA Astrophysics Data System (ADS)
Engeland, K.; Steinsland, I.
2012-04-01
This work is driven by the needs of next-generation short-term optimization methodology for hydropower production. Stochastic optimization is about to be introduced, i.e. optimizing when available resources (water) and utility (prices) are uncertain. In this paper we focus on the available resources, i.e. water, where uncertainty mainly comes from uncertainty in future runoff. When optimizing a water system, all catchments and several lead times have to be considered simultaneously. Depending on the system of hydropower reservoirs, it might be a set of headwater catchments, a system of upstream/downstream reservoirs where water used from one catchment/dam arrives in a lower catchment perhaps days later, or a combination of both. The aim of this paper is therefore to construct a simultaneous probabilistic forecast for several catchments and lead times, i.e. to provide a predictive distribution for the forecasts. Stochastic optimization methods need samples/ensembles of runoff forecasts as input; hence, it should also be possible to sample from our probabilistic forecast. A post-processing approach is taken, and an error model based on a Box-Cox power transform and a temporal-spatial copula model is used. It accounts for both between-catchment and between-lead-time dependencies. In operational use it is straightforward to sample runoff ensembles from this model, and they inherit the catchment and lead time dependencies. The methodology is tested and demonstrated in the Ulla-Førre river system, and simultaneous probabilistic forecasts for five catchments and ten lead times are constructed. The methodology has enough flexibility to model operationally important features in this case study such as heteroscedasticity, lead-time-varying temporal dependency and lead-time-varying inter-catchment dependency. Our model is evaluated using CRPS for the marginal predictive distributions and the energy score for the joint predictive distribution. It is tested against a deterministic runoff forecast, a climatology forecast and a persistence forecast, and is found to be the better probabilistic forecast for lead times greater than two. From an operational point of view the results are interesting, as the between-catchment dependency gets stronger with longer lead times.
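The post-processing idea, transform runoff to a near-Gaussian space, add fitted residual noise, and transform back to obtain ensembles, can be sketched for a single catchment and lead time. The λ and σ values below are invented, and the copula that couples catchments and lead times is omitted.

```python
import random
from math import log, exp

def boxcox(x, lam):
    """Box-Cox power transform (lam != 0 branch and the log limit)."""
    return (x**lam - 1) / lam if lam != 0 else log(x)

def inv_boxcox(z, lam):
    """Inverse transform back to runoff space."""
    return (lam * z + 1) ** (1 / lam) if lam != 0 else exp(z)

# Hypothetical: deterministic forecast of 50 m3/s, error model fitted in
# Box-Cox space with lambda = 0.3 and residual std 0.4 (made-up numbers).
lam, sigma, det = 0.3, 0.4, 50.0
rng = random.Random(42)
mu = boxcox(det, lam)
ensemble = [inv_boxcox(rng.gauss(mu, sigma), lam) for _ in range(500)]
```

Sampling in the transformed space keeps the ensemble members positive and right-skewed, as runoff is; in the paper the Gaussian draws are additionally coupled across catchments and lead times by the copula.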
Real-Time CME Forecasting Using HMI Active-Region Magnetograms and Flare History
NASA Technical Reports Server (NTRS)
Falconer, David; Moore, Ron; Barghouty, Abdulnasser F.; Khazanov, Igor
2011-01-01
We have recently developed a method of predicting an active region's probability of producing a CME, an X-class flare, an M-class flare, or a Solar Energetic Particle event from a free-energy proxy measured from SOHO/MDI line-of-sight magnetograms. This year we have added three major improvements to our forecast tool: 1) transition from MDI magnetograms to SDO/HMI magnetograms, enabling near-real-time forecasts; 2) automation of acquisition and measurement of HMI magnetograms, giving us near-real-time forecasts no older than 2 hours; and 3) determination of how to improve the forecast by using the active region's previous flare history in combination with its free-energy proxy. HMI was turned on in May 2010 and MDI was turned off in April 2011. Using the overlap period, we have calibrated HMI to yield what MDI would measure. This is important since the value of the free-energy proxy used for our forecast is resolution dependent, and the forecasts are made from results of a 1996-2004 database of MDI observations. With near-real-time magnetograms from HMI, near-real-time forecasts are now possible. We have augmented the code so that it continually acquires and measures new magnetograms as they become available online, and updates the whole-Sun forecast for the coming day. The next planned improvement is to use an active region's previous flare history, in conjunction with its free-energy proxy, to forecast the active region's event rate. It has long been known that active regions that have produced flares in the past are likely to produce flares in the future, and that active regions that are nonpotential (have large free energy) are more likely to produce flares in the future. This year we have determined that persistence of flaring is not just a reflection of an active region's free energy. In other words, after controlling for free energy, we have found that active regions that have flared recently are more likely to flare in the future.
NASA Astrophysics Data System (ADS)
Curceac, S.; Ternynck, C.; Ouarda, T.
2015-12-01
Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach is in expressing the data as curves. In the present work, the focus is on daily forecasting and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting, with the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied, and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) information criteria and the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
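A functional kernel estimator is, at its core, a locally weighted average. The sketch below is a scalar Nadaraya-Watson estimator with a quadratic (Epanechnikov-type) kernel standing in for the asymmetrical quadratic kernel and semi-metrics of the study; the data and bandwidth are invented.

```python
def kernel_regression(x_train, y_train, x0, h):
    """Nadaraya-Watson estimator: a kernel-weighted average of the
    responses of training points near x0."""
    def K(u):
        # Quadratic kernel with support on [0, 1] of the scaled distance.
        return (1 - u * u) if 0 <= u <= 1 else 0.0
    w = [K(abs(x - x0) / h) for x in x_train]
    s = sum(w)
    if s == 0:
        raise ValueError("no training point within bandwidth h of x0")
    return sum(wi * yi for wi, yi in zip(w, y_train)) / s

# Hypothetical daily-mean temperatures indexed by day.
days = [1, 2, 3, 4, 5, 6, 7]
temps = [20.0, 21.0, 22.5, 23.0, 22.0, 21.5, 20.5]
pred = kernel_regression(days, temps, x0=4, h=2.5)
```

In the functional setting of the abstract, `abs(x - x0)` is replaced by a semi-metric between whole daily curves, but the weighting and averaging are identical in form.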
NASA Astrophysics Data System (ADS)
Gunda, T.; Bazuin, J. T.; Nay, J.; Yeung, K. L.
2017-03-01
Access to seasonal climate forecasts can benefit farmers by allowing them to make more informed decisions about their farming practices. However, it is unclear whether farmers realize these benefits when crop choices available to farmers have different and variable costs and returns; multiple countries have programs that incentivize production of certain crops while other crops are subject to market fluctuations. We hypothesize that the benefits of forecasts on farmer livelihoods will be moderated by the combined impact of differing crop economics and changing climate. Drawing upon methods and insights from both physical and social sciences, we develop a model of farmer decision-making to evaluate this hypothesis. The model dynamics are explored using empirical data from Sri Lanka; primary sources include survey and interview information as well as game-based experiments conducted with farmers in the field. Our simulations show that a farmer using seasonal forecasts has more diversified crop selections, which drive increases in average agricultural income. Increases in income are particularly notable under a drier climate scenario, when a farmer using seasonal forecasts is more likely to plant onions, a crop with higher possible returns. Our results indicate that, when water resources are scarce (i.e. drier climate scenario), farmer incomes could become stratified, potentially compounding existing disparities in farmers’ financial and technical abilities to use forecasts to inform their crop selections. This analysis highlights that while programs that promote production of certain crops may ensure food security in the short-term, the long-term implications of these dynamics need careful evaluation.
A human judgment approach to epidemiological forecasting
Farrow, David C.; Brooks, Logan C.; Rosenfeld, Roni
2017-01-01
Infectious diseases impose considerable burden on society, despite significant advances in technology and medicine over the past century. Advanced warning can be helpful in mitigating and preparing for an impending or ongoing epidemic. Historically, such a capability has lagged for many reasons, including in particular the uncertainty in the current state of the system and in the understanding of the processes that drive epidemic trajectories. Presently we have access to data, models, and computational resources that enable the development of epidemiological forecasting systems. Indeed, several recent challenges hosted by the U.S. government have fostered an open and collaborative environment for the development of these technologies. The primary focus of these challenges has been to develop statistical and computational methods for epidemiological forecasting, but here we consider a serious alternative based on collective human judgment. We created the web-based “Epicast” forecasting system which collects and aggregates epidemic predictions made in real-time by human participants, and with these forecasts we ask two questions: how accurate is human judgment, and how do these forecasts compare to their more computational, data-driven alternatives? To address the former, we assess by a variety of metrics how accurately humans are able to predict influenza and chikungunya trajectories. As for the latter, we show that real-time, combined human predictions of the 2014–2015 and 2015–2016 U.S. flu seasons are often more accurate than the same predictions made by several statistical systems, especially for short-term targets. We conclude that there is valuable predictive power in collective human judgment, and we discuss the benefits and drawbacks of this approach. PMID:28282375
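One simple way to aggregate the kind of crowd predictions Epicast collects is an inverse-error weighted average. Epicast's actual aggregation may differ; the participants and error figures below are invented.

```python
def combine_forecasts(predictions, past_errors):
    """Weight each participant's prediction by the inverse of their
    historical error (one plausible aggregation rule among many)."""
    weights = [1.0 / e for e in past_errors]
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, predictions)) / total

# Three hypothetical participants predict peak weekly flu incidence (%):
preds = [5.0, 6.0, 4.0]
errors = [1.0, 2.0, 4.0]   # mean absolute error on past seasons (invented)
consensus = combine_forecasts(preds, errors)
```

The participant with the best track record (error 1.0) pulls the consensus toward 5.0, which is the sense in which collective judgment can outperform any single forecaster.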
A human judgment approach to epidemiological forecasting.
Farrow, David C; Brooks, Logan C; Hyun, Sangwon; Tibshirani, Ryan J; Burke, Donald S; Rosenfeld, Roni
2017-03-01
Infectious diseases impose considerable burden on society, despite significant advances in technology and medicine over the past century. Advanced warning can be helpful in mitigating and preparing for an impending or ongoing epidemic. Historically, such a capability has lagged for many reasons, including in particular the uncertainty in the current state of the system and in the understanding of the processes that drive epidemic trajectories. Presently we have access to data, models, and computational resources that enable the development of epidemiological forecasting systems. Indeed, several recent challenges hosted by the U.S. government have fostered an open and collaborative environment for the development of these technologies. The primary focus of these challenges has been to develop statistical and computational methods for epidemiological forecasting, but here we consider a serious alternative based on collective human judgment. We created the web-based "Epicast" forecasting system which collects and aggregates epidemic predictions made in real-time by human participants, and with these forecasts we ask two questions: how accurate is human judgment, and how do these forecasts compare to their more computational, data-driven alternatives? To address the former, we assess by a variety of metrics how accurately humans are able to predict influenza and chikungunya trajectories. As for the latter, we show that real-time, combined human predictions of the 2014-2015 and 2015-2016 U.S. flu seasons are often more accurate than the same predictions made by several statistical systems, especially for short-term targets. We conclude that there is valuable predictive power in collective human judgment, and we discuss the benefits and drawbacks of this approach.
Forecasting in foodservice: model development, testing, and evaluation.
Miller, J L; Thompson, P A; Orabella, M M
1991-05-01
This study was designed to develop, test, and evaluate mathematical models appropriate for forecasting menu-item production demand in foodservice. Data were collected from residence and dining hall foodservices at Ohio State University. Objectives of the study were to collect, code, and analyze the data; develop and test models using actual operation data; and compare forecasting results with current methods in use. Customer count was forecast using deseasonalized simple exponential smoothing. Menu-item demand was forecast by multiplying the count forecast by a predicted preference statistic. Forecasting models were evaluated using mean squared error, mean absolute deviation, and mean absolute percentage error techniques. All models were more accurate than current methods. A broad spectrum of forecasting techniques could be used by foodservice managers with access to a personal computer and spread-sheet and database-management software. The findings indicate that mathematical forecasting techniques may be effective in foodservice operations to control costs, increase productivity, and maximize profits.
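The forecasting recipe in this abstract, deseasonalize customer counts, apply simple exponential smoothing, reseasonalize, then multiply by a menu-item preference statistic, can be sketched directly. The counts, the 5-day period, α = 0.3 and the 25% preference fraction are all invented.

```python
def ses(series, alpha):
    """Simple exponential smoothing; returns the one-step-ahead forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

def forecast_counts(counts, period, alpha=0.3):
    """Deseasonalize by a periodic index, smooth, then reseasonalize the
    forecast for the next slot."""
    mean = sum(counts) / len(counts)
    # Seasonal index: average of each slot relative to the overall mean.
    index = [sum(counts[i::period]) / len(counts[i::period]) / mean
             for i in range(period)]
    deseason = [c / index[i % period] for i, c in enumerate(counts)]
    nxt = len(counts) % period          # which slot comes next
    return ses(deseason, alpha) * index[nxt]

# Two weeks of hypothetical daily customer counts (5-day service weeks).
counts = [100, 120, 110, 130, 140, 102, 118, 112, 132, 138]
count_fc = forecast_counts(counts, period=5)
# Menu-item demand = count forecast x predicted preference fraction.
item_fc = count_fc * 0.25
```

With these numbers the next slot is a "Monday", so the smoothed level is scaled back down by Monday's below-average seasonal index before the preference fraction is applied.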
The value of forecasting key-decision variables for rain-fed farming
NASA Astrophysics Data System (ADS)
Winsemius, Hessel; Werner, Micha
2013-04-01
Rain-fed farmers are highly vulnerable to variability in rainfall. Timely knowledge of the onset of the rainy season, the expected amount of rainfall and the occurrence of dry spells can help rain-fed farmers to plan the cropping season. Seasonal probabilistic weather forecasts may provide such information to farmers, but need to provide reliable forecasts of key variables with which farmers can make decisions. In this contribution, we present a new method to evaluate the value of meteorological forecasts in predicting these key variables. The proposed method measures skill by assessing whether a forecast was useful to a decision, taking into account the accuracy in timing of the event that is required for the decision to be useful. The method then progresses from forecast skill to forecast value by taking into account the accuracy required to make the decision valuable, based on the cost/loss ratio of possible decisions. The method is applied over the Limpopo region in Southern Africa. We demonstrate the method using the example of temporary water harvesting techniques. Such techniques require time to construct and must be ready long enough before the occurrence of a dry spell to be effective. The value of the forecasts to the decision used as an example is shown to be highly sensitive to the accuracy in the timing of forecasted dry spells, and to the tolerance of the decision to timing error. The skill with which dry spells can be predicted is shown to be higher in some parts of the basin, indicating that the forecasts have higher value for the decision in those parts than in others. By assessing the skill of forecasting the key decision variables of farmers, we show that it is easier to understand whether the forecasts have value in reducing risk, or whether other adaptation strategies should be implemented.
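The cost/loss reasoning can be made concrete with the standard relative-value calculation: compare the expected expense of acting on the forecast with the cheaper of always/never protecting, and normalize by a perfect forecast. This is a generic sketch, not the paper's exact timing-aware method; the contingency counts and the cost and loss figures are invented.

```python
def relative_value(hits, misses, false_alarms, quiets, C, L):
    """Relative economic value of a yes/no forecast:
    1 = perfect forecast, 0 = no better than climatology, < 0 = worse.
    C = cost of protecting, L = loss if an unprotected event occurs."""
    n = hits + misses + false_alarms + quiets
    s = (hits + misses) / n                 # event base rate
    e_forecast = ((hits + false_alarms) * C + misses * L) / n
    e_climate = min(C, s * L)               # best of always/never protect
    e_perfect = s * C                       # protect exactly when needed
    return (e_climate - e_forecast) / (e_climate - e_perfect)

# Hypothetical dry-spell forecasts over 100 planting windows:
v = relative_value(hits=18, misses=7, false_alarms=10, quiets=65,
                   C=1.0, L=5.0)
```

Counting a "hit" only when the forecast timing falls within the decision's tolerance window, as the paper does, would feed directly into the same formula.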
Analysis of forecasting and inventory control of raw material supplies in PT INDAC INT’L
NASA Astrophysics Data System (ADS)
Lesmana, E.; Subartini, B.; Riaman; Jabar, D. A.
2018-03-01
This study discusses forecasting of carbon electrode sales data at PT INDAC INT'L using the Winters and double moving average methods, while the amount of inventory and the cost required for ordering carbon-electrode raw material in the next period are predicted using the Economic Order Quantity (EOQ) model. The error analysis, based on MAE, MSE, and MAPE, shows that the Winters method gives the smaller errors for the next period and is therefore the better method for forecasting sales of carbon electrode products. PT INDAC INT'L is thus advised to stock products in line with the sales amounts forecast by the Winters method.
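The EOQ part of the abstract rests on the classic square-root formula, EOQ = sqrt(2DS/H), which balances ordering cost against holding cost. The demand, ordering-cost and holding-cost figures below are invented.

```python
from math import sqrt

def eoq(annual_demand, order_cost, holding_cost):
    """Economic Order Quantity: the order size that minimizes the sum of
    annual ordering cost (D/Q * S) and annual holding cost (Q/2 * H)."""
    return sqrt(2 * annual_demand * order_cost / holding_cost)

# Hypothetical figures for carbon-electrode raw material:
D, S, H = 12000, 50.0, 2.4   # units/year, cost/order, holding cost/unit/year
q = eoq(D, S, H)
orders_per_year = D / q
```

In the study, `D` would come from the Winters sales forecast for the next period rather than being a fixed figure.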
A Hybrid Supervised/Unsupervised Machine Learning Approach to Solar Flare Prediction
NASA Astrophysics Data System (ADS)
Benvenuto, Federico; Piana, Michele; Campi, Cristina; Massone, Anna Maria
2018-01-01
This paper introduces a novel method for flare forecasting, combining prediction accuracy with the ability to identify the most relevant predictive variables. This result is obtained by means of a two-step approach: first, a supervised regularization method for regression, namely LASSO, is applied, where a sparsity-enhancing penalty term allows the identification of the significance with which each data feature contributes to the prediction; then, an unsupervised fuzzy clustering technique for classification, namely Fuzzy C-Means, is applied, where the regression outcome is partitioned through the minimization of a cost function and without focusing on the optimization of a specific skill score. This approach is therefore hybrid, since it combines supervised and unsupervised learning; realizes classification in an automatic, skill-score-independent way; and provides effective prediction performance even in the case of imbalanced data sets. Its prediction power is verified against NOAA Space Weather Prediction Center data, using as test set the data from 1996 August to 2010 December and as training set the data from 1988 December to 1996 June. To validate the method, we computed several skill scores typically utilized in flare prediction and compared the values provided by the hybrid approach with those provided by several standard (non-hybrid) machine learning methods. The results showed that the hybrid approach performs classification better than all other supervised methods and with an effectiveness comparable to that of clustering methods; in addition, it provides a reliable ranking of the weights with which the data properties contribute to the forecast.
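The two steps of the hybrid scheme have simple numerical cores: LASSO's sparsity comes from soft-thresholding, and Fuzzy C-Means assigns graded memberships from distances to cluster centers. The sketch below shows both in one dimension with invented numbers; the paper applies them to full feature vectors and regression outputs.

```python
def soft_threshold(x, lam):
    """The proximal step behind LASSO's sparsity: shrink toward zero and
    set coefficients smaller than the penalty exactly to zero."""
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0

def fcm_memberships(value, centers, m=2.0):
    """Fuzzy C-Means membership of one regression output in each class."""
    d = [abs(value - c) for c in centers]
    if 0.0 in d:
        return [1.0 if di == 0.0 else 0.0 for di in d]
    p = 2.0 / (m - 1.0)
    return [1.0 / sum((d[i] / d[j]) ** p for j in range(len(centers)))
            for i in range(len(centers))]

coef = soft_threshold(0.08, 0.1)    # weak feature -> dropped entirely
strong = soft_threshold(0.9, 0.1)   # strong feature survives, shrunk
u = fcm_memberships(0.8, centers=[0.1, 0.9])  # "quiet" vs "flaring" classes
```

The memberships sum to one, so the partition of the regression outcome is obtained without tuning any skill-score threshold, which is the point the abstract emphasizes.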
Forecasting the Short-Term Passenger Flow on High-Speed Railway with Neural Networks
Xie, Mei-Quan; Li, Xia-Miao; Zhou, Wen-Liang; Fu, Yan-Bing
2014-01-01
Short-term passenger flow forecasting is an important component of transportation systems. The forecasting result can be applied to support transportation system operation and management such as operation planning and revenue management. In this paper, a divide-and-conquer method based on neural network and origin-destination (OD) matrix estimation is developed to forecast the short-term passenger flow in high-speed railway system. There are three steps in the forecasting method. Firstly, the numbers of passengers who arrive at each station or depart from each station are obtained from historical passenger flow data, which are OD matrices in this paper. Secondly, short-term passenger flow forecasting of the numbers of passengers who arrive at each station or depart from each station based on neural network is realized. At last, the OD matrices in short-term time are obtained with an OD matrix estimation method. The experimental results indicate that the proposed divide-and-conquer method performs well in forecasting the short-term passenger flow on high-speed railway. PMID:25544838
Spatial forecast of landslides in three gorges based on spatial data mining.
Wang, Xianmin; Niu, Ruiqing
2009-01-01
The Three Gorges is a region with a very high landslide distribution density and a concentrated population. In the Three Gorges there are often landslide disasters, and the potential risk of landslides is tremendous. In this paper, focusing on the Three Gorges, which has a complicated landform, spatial forecasting of landslides is studied by establishing 20 forecast factors (spectra, texture, vegetation coverage, water level of the reservoir, slope structure, engineering rock group, elevation, slope, aspect, etc.). China-Brazil Earth Resources Satellite (CBERS) images were adopted and a C4.5 decision tree was used to mine spatial landslide forecast criteria in Guojiaba Town (Zhigui County) in the Three Gorges; based on this knowledge, intelligent spatial landslide forecasts were performed for Guojiaba Town. All landslides lie in the dangerous and unstable regions, so the forecast result is good. The method proposed in the paper is compared with seven other methods: IsoData, K-Means, Mahalanobis Distance, Maximum Likelihood, Minimum Distance, Parallelepiped and Information Content Model. The experimental results show that the method proposed in this paper has a high forecast precision, noticeably higher than that of the other seven methods.
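C4.5, used here to mine forecast criteria, chooses splits by gain ratio (information gain normalized by split information). A toy two-class example with an invented "steep slope" attribute:

```python
from math import log2

def entropy(labels):
    """Shannon entropy of a class-label list."""
    n = len(labels)
    return -sum((labels.count(c) / n) * log2(labels.count(c) / n)
                for c in set(labels))

def gain_ratio(labels, groups):
    """C4.5 split criterion: information gain divided by split info,
    which penalizes attributes that shatter the data into many groups."""
    n = len(labels)
    remainder = sum(len(g) / n * entropy(g) for g in groups)
    gain = entropy(labels) - remainder
    split_info = -sum(len(g) / n * log2(len(g) / n) for g in groups)
    return gain / split_info if split_info else 0.0

# Hypothetical: does "steep slope" separate landslide (1) from stable (0)?
labels = [1, 1, 1, 0, 0, 0]
steep, gentle = [1, 1, 1, 0], [0, 0]
gr = gain_ratio(labels, [steep, gentle])
```

The tree grows by picking, at each node, the factor (slope, rock group, water level, ...) with the highest gain ratio, and the resulting branches are the "spatial forecast criteria" the paper mines.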
NASA Astrophysics Data System (ADS)
Nikolić, Vlastimir; Petković, Dalibor; Lazov, Lyubomir; Milovančević, Miloš
2016-07-01
Water-jet assisted underwater laser cutting has shown some advantages as it produces much less turbulence, gas bubbles and aerosols, resulting in a more gentle process. However, this process has relatively low efficiency due to different losses in water. It is important to determine which parameters are the most important for the process. This investigation analyzed the forecasting of water-jet assisted underwater laser cutting parameters. The ANFIS (adaptive neuro-fuzzy inference system) method was applied to the data in order to select the most influential factors for forecasting the water-jet assisted underwater laser cutting parameters. Three inputs are considered: laser power, cutting speed and water-jet speed. The ANFIS process for variable selection was implemented in order to detect the predominant factors affecting the forecasting of the water-jet assisted underwater laser cutting parameters. According to the results, the combination of laser power and cutting speed is the most influential for the prediction of the water-jet assisted underwater laser cutting parameters. The best prediction was observed for the bottom kerf width (R2 = 0.9653). The worst prediction was observed for dross area per unit length (R2 = 0.6804). According to the results, a greater improvement in estimation accuracy can be achieved by removing the unnecessary parameters.
NASA Technical Reports Server (NTRS)
Hathaway, D. H.
2000-01-01
A number of techniques for predicting solar activity on a solar cycle time scale are identified, described, and tested with historical data. Some techniques, e.g., regression and curve-fitting, work well as solar activity approaches maximum and provide a month-by-month description of future activity, while others, e.g., geomagnetic precursors, work well near solar minimum but provide an estimate only of the amplitude of the cycle. A synthesis of different techniques is shown to provide a more accurate and useful forecast of solar cycle activity levels. A combination of two uncorrelated geomagnetic precursor techniques provides the most accurate prediction for the amplitude of a solar activity cycle at a time well before activity minimum. This precursor method gave a smoothed sunspot number maximum of 154±21 for cycle 23. A mathematical function dependent upon the time of cycle initiation and the cycle amplitude then describes the level of solar activity for the complete cycle. As the time of cycle maximum approaches, a better estimate of the cycle activity is obtained by including the fit between recent activity levels and this function. This Combined Solar Cycle Activity Forecast now gives a smoothed sunspot maximum of 140±20 for cycle 23. The success of the geomagnetic precursors in predicting future solar activity suggests that solar magnetic phenomena at latitudes above the sunspot activity belts are linked to the solar activity which occurs many years later in the lower latitudes.
The improved business valuation model for RFID company based on the community mining method.
Li, Shugang; Yu, Zhaoxu
2017-01-01
Nowadays, the appetite for investment and mergers and acquisitions (M&A) activity in RFID companies is growing rapidly. Although a large number of papers have addressed the topic of business valuation models based on statistical or neural network methods, only a few are dedicated to constructing a general framework for business valuation that improves performance with a network graph (NG) and the corresponding community mining (CM) method. In this study, an NG-based business valuation model is proposed, in which a real options approach (ROA) integrating the CM method is designed to predict the company's net profit as well as estimate the company value. Three improvements are made in the proposed valuation model. Firstly, the model determines the credibility with which each node belongs to each community and clusters the network according to an evolutionary Bayesian method. Secondly, an improved bacterial foraging optimization algorithm (IBFOA) is adopted to calculate the optimized Bayesian posterior probability function. Finally, in IBFOA, a bi-objective method is used to assess the accuracy of prediction, and the two objectives are combined into one objective function using a new Pareto boundary method. The proposed method returns lower forecasting error than 10 well-known forecasting models on 3 different time-interval valuation tasks in a real-life simulation of RFID companies.
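The "new Pareto boundary method" is not specified in detail here, but the basic building block of any bi-objective combination is the Pareto dominance test, sketched below with invented objective pairs (both objectives are errors to be minimized):

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b: no worse on every objective
    and strictly better on at least one (minimization)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(solutions):
    """Keep only the non-dominated solutions."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Hypothetical (objective-1 error, objective-2 error) pairs for candidate
# valuation models:
cands = [(0.10, 0.30), (0.20, 0.10), (0.15, 0.25), (0.30, 0.35)]
front = pareto_front(cands)
```

Any scalarization built on top of such a front trades the two prediction-accuracy objectives against each other only among candidates that are not strictly beaten on both.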
Why preferring parametric forecasting to nonparametric methods?
Jabot, Franck
2015-05-07
A recent series of papers by Charles T. Perretti and collaborators have shown that nonparametric forecasting methods can outperform parametric methods in noisy nonlinear systems. Such a situation can arise for two main reasons: the instability of parametric inference procedures in chaotic systems, which can lead to biased parameter estimates, and the discrepancy between the real system dynamics and the modeled one, a problem that Perretti and collaborators call "the true model myth". Should ecologists go on using the demanding parametric machinery when trying to forecast the dynamics of complex ecosystems? Or should they rely on the elegant nonparametric approach that appears so promising? It is argued here that ecological forecasting based on parametric models presents two key comparative advantages over nonparametric approaches. First, the likelihood of parametric forecasting failure can be diagnosed with simple Bayesian model checking procedures. Second, when parametric forecasting is diagnosed to be reliable, forecasting uncertainty can be estimated on virtual data generated with the parametric model fitted to the data. In contrast, nonparametric techniques provide forecasts with unknown reliability. This argumentation is illustrated with the simple theta-logistic model that was previously used by Perretti and collaborators to make their point. It should convince ecologists to stick to standard parametric approaches, until methods have been developed to assess the reliability of nonparametric forecasting.
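The second advantage argued here, estimating forecasting uncertainty from virtual data generated by the fitted model, can be sketched with a stochastic theta-logistic model. The parameter values and noise level below are illustrative, not those of the paper.

```python
import random
from math import exp

def theta_logistic_step(n, r, K, theta, sigma, rng):
    """One step of a stochastic theta-logistic (Ricker-type) model with
    multiplicative lognormal environmental noise."""
    return n * exp(r * (1 - (n / K) ** theta) + rng.gauss(0.0, sigma))

def forecast_interval(n0, steps, reps=2000, r=1.0, K=100.0,
                      theta=1.0, sigma=0.1, seed=1):
    """Monte Carlo 90% forecast band from the fitted parametric model:
    simulate many virtual trajectories and take empirical quantiles."""
    rng = random.Random(seed)
    finals = []
    for _ in range(reps):
        n = n0
        for _ in range(steps):
            n = theta_logistic_step(n, r, K, theta, sigma, rng)
        finals.append(n)
    finals.sort()
    return finals[int(0.05 * reps)], finals[int(0.95 * reps)]

lo, hi = forecast_interval(n0=50.0, steps=3)
```

This is exactly the kind of model-based uncertainty band that, per the abstract, has no counterpart in nonparametric forecasting.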
NASA Astrophysics Data System (ADS)
Huijnen, V.; Bouarar, I.; Chabrillat, S. H.; Christophe, Y.; Thierno, D.; Karydis, V.; Marecal, V.; Pozzer, A.; Flemming, J.
2017-12-01
Operational atmospheric composition analyses and forecasts such as those developed in the Copernicus Atmosphere Monitoring Service (CAMS) rely on modules describing emissions, chemical conversion, transport and removal processes, as well as data assimilation methods. The CAMS forecasts can be used to drive regional air quality models across the world. Critical analyses of uncertainties in any of these processes are continuously needed to advance the quality of such systems on a global scale, ranging from the surface up to the stratosphere. With regard to the atmospheric chemistry describing the fate of trace gases, the operational system currently relies on a modified version of the CB05 chemistry scheme for the troposphere combined with the Cariolle scheme to describe stratospheric ozone, as integrated in ECMWF's Integrated Forecasting System (IFS). It is further constrained by assimilation of satellite observations of CO, O3 and NO2. As part of CAMS we have recently developed three fully independent schemes to describe the chemical conversion throughout the atmosphere. These parameterizations originate from parent model codes in MOZART, MOCAGE and a combination of TM5/BASCOE. In this contribution we evaluate the correspondence and fundamental differences in the performance of the three schemes in an otherwise identical model configuration (excluding data assimilation) against a large range of in-situ and satellite-based observations of ozone, CO, VOCs and chlorine-containing trace gases for both the troposphere and stratosphere. This analysis aims to provide a measure of model uncertainty in the operational system for tracers that are not, or poorly, constrained by data assimilation. It also aims to provide guidance on the directions for further model improvement with regard to the chemical conversion module.
Forecasting of Radiation Belts: Results From the PROGRESS Project.
NASA Astrophysics Data System (ADS)
Balikhin, M. A.; Arber, T. D.; Ganushkina, N. Y.; Walker, S. N.
2017-12-01
The overall goal of the PROGRESS project, funded under the EU Horizon 2020 programme, is to combine first-principles-based models with systems science methodologies to achieve reliable forecasts of the geo-space particle radiation environment. PROGRESS incorporates three themes: the propagation of the solar wind to L1, forecasting of geomagnetic indices, and forecasting of fluxes of energetic electrons within the magnetosphere. One of the important aspects of the PROGRESS project is the development of statistical wave models for magnetospheric waves that affect the dynamics of energetic electrons, such as lower-band chorus, hiss and equatorial noise. The error reduction ratio (ERR) concept has been used to optimise the set of solar wind and geomagnetic parameters for organisation of statistical wave models for these emissions. The resulting sets of parameters and statistical wave models will be presented and discussed. However, the ERR analysis also indicates that the combination of solar wind and geomagnetic parameters accounts for only part of the variance of the emissions under investigation (lower-band chorus, hiss and equatorial noise). In addition, advances in the forecast of fluxes of energetic electrons achieved by the PROGRESS project, exploiting empirical models and the first-principles IMPTAM model, are presented.
Low Streamflow Forecasting using Minimum Relative Entropy
NASA Astrophysics Data System (ADS)
Cui, H.; Singh, V. P.
2013-12-01
Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation such that the relative entropy of the underlying process is minimized, so that the time series can be forecasted. Different prior assumptions, such as uniform, exponential and Gaussian, are used to estimate the spectral density depending on the autocorrelation structure. Seasonal and nonseasonal low streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of the low streamflow series with higher resolution than conventional methods. The forecasted streamflow is compared to predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy. The advantages and disadvantages of each method in forecasting low streamflow are discussed.
Future mission studies: Preliminary comparisons of solar flux models
NASA Technical Reports Server (NTRS)
Ashrafi, S.
1991-01-01
The results of comparisons of solar flux models are presented. (The wavelength lambda = 10.7 cm radio flux is the best indicator of the strength of ionizing radiations, such as solar ultraviolet and x-ray emissions, that directly affect the atmospheric density, thereby changing the orbit lifetime of satellites. Thus, accurate forecasting of the solar flux F10.7 is crucial for orbit determination of spacecraft.) The measured solar flux recorded by the National Oceanic and Atmospheric Administration (NOAA) is compared against forecasts made by Schatten, MSFC, and NOAA itself. The possibility of a combined linear, unbiased minimum-variance estimation that properly combines all three models into one that minimizes the variance is also discussed. All the physics inherent in each model is thereby combined. This is considered the end point of purely statistical approaches to solar flux forecasting, before any nonlinear chaotic approach is attempted.
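The combined linear, unbiased minimum-variance estimate discussed above can be illustrated with generic inverse-variance weighting; this is a sketch under the assumption of independent, unbiased model errors, with invented numbers, not the report's actual estimator:

```python
import numpy as np

def combine_min_variance(forecasts, variances):
    """Linear unbiased minimum-variance combination of independent forecasts."""
    v = np.asarray(variances, dtype=float)
    w = (1.0 / v) / (1.0 / v).sum()        # inverse-variance weights, sum to 1
    combined = w @ np.asarray(forecasts, dtype=float)
    combined_var = 1.0 / (1.0 / v).sum()   # never exceeds the best single model
    return combined, combined_var, w

# three hypothetical F10.7 forecasts with illustrative error variances
f, var, w = combine_min_variance([120.0, 130.0, 125.0], [4.0, 16.0, 8.0])
```

The combined variance is always at most the smallest input variance, which is the sense in which such a combination "minimizes the variance".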
NASA Astrophysics Data System (ADS)
Holmukhe, R. M.; Dhumale, Mrs. Sunita; Chaudhari, Mr. P. S.; Kulkarni, Mr. P. P.
2010-10-01
Load forecasting is essential to the operation of electricity companies. It enables energy-efficient and reliable operation of the power system. Forecasting of load demand data forms an important component in planning generation schedules in a power system. The purpose of this paper is to identify issues and better methods for load forecasting. In this paper we focus on fuzzy-logic-based short-term load forecasting. It serves as an overview of the state of the art in intelligent techniques employed for load forecasting in power system planning and reliability. A literature review has been conducted and the fuzzy logic method summarized to highlight the advantages and disadvantages of this technique. The proposed technique for implementing fuzzy-logic-based forecasting is to identify the specific day, use the maximum and minimum temperature for that day, and finally list the maximum temperature and peak load for that day. The results show that load forecasting in cases with considerable changes in the temperature parameter is better handled by the fuzzy logic method than by other short-term forecasting techniques.
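As a minimal sketch of the fuzzy-logic idea (not the authors' actual rule base; the membership ranges and load values below are invented), a triangular membership function with weighted-average defuzzification might look like:

```python
def tri(x, a, b, c):
    """Triangular membership: 0 outside [a, c], rising to 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def forecast_peak_load(temp_c):
    """Two hypothetical rules: mild temperature -> 900 MW, hot -> 1400 MW."""
    mu_mild = tri(temp_c, 10.0, 20.0, 30.0)
    mu_hot = tri(temp_c, 25.0, 35.0, 45.0)
    total = mu_mild + mu_hot
    if total == 0.0:
        return None                       # no rule fires for this temperature
    return (mu_mild * 900.0 + mu_hot * 1400.0) / total
```

Intermediate temperatures fire both rules partially, yielding a load estimate that interpolates smoothly between the rule consequents.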
Forecasts of non-Gaussian parameter spaces using Box-Cox transformations
NASA Astrophysics Data System (ADS)
Joachimi, B.; Taylor, A. N.
2011-09-01
Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, performing the Fisher matrix calculation on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, with marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic non-linear degeneracy between the matter density parameter and the normalization of matter density fluctuations is reproduced for several cases, and the capability of weak-lensing three-point statistics to break this degeneracy is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
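The Gaussianizing step can be sketched with a one-parameter Box-Cox transform whose exponent is chosen by a simple grid search minimizing sample skewness; the paper instead determines its transform parameters from an initial likelihood evaluation, so this is only an illustration:

```python
import numpy as np

def boxcox(x, lam):
    """One-parameter Box-Cox transform of positive data."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if abs(lam) < 1e-12 else (x**lam - 1.0) / lam

def skewness(z):
    z = np.asarray(z, dtype=float)
    return np.mean((z - z.mean())**3) / z.std()**3

def gaussianize(x, lams=np.linspace(-2.0, 2.0, 81)):
    """Pick the lambda whose transformed sample has the smallest |skewness|."""
    best = min(lams, key=lambda lam: abs(skewness(boxcox(x, lam))))
    return boxcox(x, best), best

rng = np.random.default_rng(1)
samples = rng.lognormal(mean=0.0, sigma=0.8, size=4000)   # strongly skewed
z, lam = gaussianize(samples)
```

For lognormal samples the search recovers a near-logarithmic transform, after which the sample is approximately Gaussian.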
Forecasting of natural gas consumption with neural network and neuro fuzzy system
NASA Astrophysics Data System (ADS)
Kaynar, Oguz; Yilmaz, Isik; Demirkoparan, Ferhan
2010-05-01
The prediction of natural gas consumption is crucial for Turkey, which follows a foreign-dependent policy for its natural gas supply and whose storage capacity is only 5% of total internal consumption. The accuracy of demand prediction is one of the factors influencing sectoral investments and agreements on obtaining natural gas, and hence the development of the sector. In recent years, new techniques such as artificial neural networks and fuzzy inference systems have been widely used for natural gas consumption prediction in addition to classical time series analysis. In this study, the weekly natural gas consumption of Turkey has been predicted by means of three different approaches. The first is the Autoregressive Integrated Moving Average (ARIMA), a classical time series analysis method. The second is the Artificial Neural Network (ANN): two different ANN models, the Multi-Layer Perceptron (MLP) and the Radial Basis Function Network (RBFN), are employed to predict natural gas consumption. The last is the Adaptive Neuro-Fuzzy Inference System (ANFIS), which combines an ANN with a fuzzy inference system. Different prediction models have been constructed, and the model with the best forecasting performance is determined for each method. Predictions are then made using these models and the results are compared. Keywords: ANN, ANFIS, ARIMA, Natural Gas, Forecasting
A probabilistic drought forecasting framework: A combined dynamical and statistical approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Hongxiang; Moradkhani, Hamid; Zarekarizi, Mahkameh
In order to improve drought forecasting skill, this study develops a probabilistic drought forecasting framework comprised of dynamical and statistical modeling components. The novelty of this study is the use of data assimilation to quantify initial condition uncertainty with Monte Carlo ensemble members, rather than relying entirely on the hydrologic model or land surface model to generate a single deterministic initial condition, as currently implemented in operational drought forecasting systems. Next, the initial condition uncertainty is quantified through data assimilation and coupled with a newly developed probabilistic drought forecasting model using a copula function. The initial conditions at each forecast start date are sampled from the data assimilation ensembles for forecast initialization. Finally, seasonal drought forecasting products are generated with the updated initial conditions. This study introduces the theory behind the proposed drought forecasting system, with an application in the Columbia River Basin, Pacific Northwest, United States. Results from both synthetic and real case studies suggest that the proposed drought forecasting system significantly improves seasonal drought forecasting skill and can facilitate state drought preparation and declaration, at least three months before the official state drought declaration.
Recursive least squares background prediction of univariate syndromic surveillance data
2009-01-01
Background Surveillance of univariate syndromic data as a potential indicator of developing public health conditions has been used extensively. This paper aims to improve the performance of detecting outbreaks by using a background forecasting algorithm based on the adaptive recursive least squares (RLS) method combined with a novel treatment of the day-of-the-week effect. Methods Previous work by the first author has suggested that univariate recursive least squares analysis of syndromic data can be used to characterize the background upon which a prediction and detection component of a biosurveillance system may be built. An adaptive implementation is used to deal with data non-stationarity. In this paper we develop and implement the RLS method for background estimation of univariate data. The distinctly dissimilar distribution of data for different days of the week, however, can affect filter implementations adversely, and so a novel procedure based on linear transformations of the sorted values of the daily counts is introduced. Seven-day-ahead daily predicted counts are used as background estimates. A signal injection procedure is used to examine the integrated algorithm's ability to detect synthetic anomalies in real syndromic time series. We compare the method to a baseline CDC forecasting algorithm known as the W2 method. Results We present detection results in the form of Receiver Operating Characteristic curve values for four different injected signal-to-noise ratios using 16 sets of syndromic data. We find improvements in the false alarm probabilities when compared to the baseline W2 background forecasts. Conclusion The current paper introduces a prediction approach for city-level biosurveillance data streams such as time series of outpatient clinic visits and sales of over-the-counter remedies.
This approach uses RLS filters modified by a correction for the weekly patterns often seen in these data series, and a threshold detection algorithm from the residuals of the RLS forecasts. We compare the detection performance of this algorithm to the W2 method recently implemented at CDC. The modified RLS method gives consistently better sensitivity at multiple background alert rates, and we recommend that it should be considered for routine application in bio-surveillance systems. PMID:19149886
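A generic adaptive recursive-least-squares filter of the kind this work builds on can be sketched as follows; the forgetting factor, initialization, and regressors are illustrative rather than the paper's settings:

```python
import numpy as np

class RLSFilter:
    """Recursive least squares with exponential forgetting."""
    def __init__(self, n_features, forgetting=0.99, delta=1000.0):
        self.w = np.zeros(n_features)        # coefficient estimate
        self.P = delta * np.eye(n_features)  # inverse correlation matrix
        self.lam = forgetting

    def update(self, x, y):
        """One adaptive step: return the a priori error, then refine w and P."""
        x = np.asarray(x, dtype=float)
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)         # gain vector
        err = y - self.w @ x                 # a priori prediction error
        self.w = self.w + k * err
        self.P = (self.P - np.outer(k, Px)) / self.lam
        return err

# track a noiseless synthetic background y = 2*sin(0.1 t) + 5
rls = RLSFilter(2)
for t in range(200):
    x = np.array([np.sin(0.1 * t), 1.0])
    rls.update(x, 2.0 * x[0] + 5.0)
```

The forgetting factor discounts old observations geometrically, which is what makes the filter adaptive to non-stationary backgrounds.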
Performance of time-series methods in forecasting the demand for red blood cell transfusion.
Pereira, Arturo
2004-05-01
Planning future blood collection efforts must be based on adequate forecasts of transfusion demand. In this study, univariate time-series methods were investigated for their performance in forecasting the monthly demand for RBCs at one tertiary-care, university hospital. Three time-series methods were investigated: autoregressive integrated moving average (ARIMA), the Holt-Winters family of exponential smoothing models, and one neural-network-based method. The time series consisted of the monthly demand for RBCs from January 1988 to December 2002 and was divided into two segments: the older one was used to fit or train the models, and the younger to test the accuracy of predictions. Performance was compared across forecasting methods by calculating goodness-of-fit statistics, the percentage of months in which forecast-based supply would have met the RBC demand (coverage rate), and the outdate rate. The RBC transfusion series was best fitted by a seasonal ARIMA(0,1,1)(0,1,1)12 model. Over 1-year time horizons, forecasts generated by ARIMA or exponential smoothing lay within +/- 10 percent of the real RBC demand in 79 percent of months (62% in the case of neural networks). The coverage rates for the three methods were 89, 91, and 86 percent, respectively. Over 2-year time horizons, exponential smoothing largely outperformed the other methods: its predictions lay within +/- 10 percent of real values in 75 percent of the 24 forecasted months, and the coverage rate was 87 percent. Over 1-year time horizons, predictions of RBC demand generated by ARIMA or exponential smoothing are accurate enough to be of help in planning blood collection efforts. For longer time horizons, exponential smoothing outperforms the other forecasting methods.
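The simplest member of the Holt-Winters exponential smoothing family can be sketched as follows (the smoothing constant is illustrative; the study selects its models by goodness of fit, and its seasonal variant also tracks trend and monthly seasonality):

```python
def exp_smooth_forecast(series, alpha=0.3):
    """One-step-ahead forecast by simple exponential smoothing."""
    level = float(series[0])
    for y in series[1:]:
        # new level is a convex combination of the observation and the old level
        level = alpha * float(y) + (1.0 - alpha) * level
    return level

# hypothetical recent monthly RBC demand, in units
next_month = exp_smooth_forecast([310.0, 305.0, 320.0, 315.0])
```

Because each update is a convex combination, the forecast always stays within the range of the observed series.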
Ensemble Streamflow Prediction in Korea: Past and Future 5 Years
NASA Astrophysics Data System (ADS)
Jeong, D.; Kim, Y.; Lee, J.
2005-05-01
The Ensemble Streamflow Prediction (ESP) approach was first introduced in 2000 by the Hydrology Research Group (HRG) at Seoul National University as an alternative probabilistic forecasting technique for improving the 'Water Supply Outlook' that is issued every month by the Ministry of Construction and Transportation in Korea. That study motivated the Korea Water Resources Corporation (KOWACO) to establish their seasonal probabilistic forecasting system for the 5 major river basins using the ESP approach. In cooperation with the HRG, KOWACO developed monthly optimal multi-reservoir operating systems for the Geum river basin in 2004, which coupled the ESP forecasts with an optimization model using sampling stochastic dynamic programming (SSDP). User interfaces for both ESP and SSDP have also been designed to make the developed computer systems more practical. Projects extending ESP systems to the other 3 major river basins (the Nakdong, Han and Seomjin river basins) were also completed by the HRG and KOWACO at the end of December 2004. The ESP system has therefore become the most important mid- and long-term streamflow forecast technique in Korea. In addition to the practical aspects, recent research experience with ESP has raised some questions about ways of improving the accuracy of ESP in Korea. Jeong and Kim (2002) performed an error analysis on its resulting probabilistic forecasts and found that the modeling error is dominant in the dry season, while the meteorological error is dominant in the flood season. To address the first issue, Kim et al. (2004) tested various combinations and/or combining techniques and showed that the ESP probabilistic accuracy could be improved considerably during the dry season when the hydrologic models were combined and/or corrected. In addition, an attempt was also made to improve the ESP accuracy for the flood season using climate forecast information.
This ongoing project handles three types of climate forecast information: (1) the Monthly Industrial Meteorology Information Magazine (MIMIM) of the Korea Meteorological Administration, (2) the Global Data Assimilation Prediction System (GDAPS), and (3) the US National Centers for Environmental Prediction (NCEP). Each of these forecasts is issued in a unique format: (1) MIMIM is a most-probable-event forecast, (2) GDAPS is a single series of deterministic forecasts, and (3) NCEP is an ensemble of deterministic forecasts. Other minor issues include how long the initial conditions influence the ESP accuracy, and how many ESP scenarios are needed to obtain the best accuracy. This presentation also addresses some future research that is needed for ESP in Korea.
A Wind Forecasting System for Energy Application
NASA Astrophysics Data System (ADS)
Courtney, Jennifer; Lynch, Peter; Sweeney, Conor
2010-05-01
Accurate forecasting of available energy is crucial for the efficient management and use of wind power in the national power grid. With energy output critically dependent upon wind strength, there is a need to reduce the errors associated with wind forecasting. The objective of this research is to obtain the best possible wind forecasts for the wind energy industry. To achieve this goal, three methods are being applied. First, a mesoscale numerical weather prediction (NWP) model called WRF (Weather Research and Forecasting) is being used to predict wind values over Ireland. Currently, a grid resolution of 10 km is used, and higher model resolutions are being evaluated to establish whether they are economically viable given the forecast skill improvement they produce. Second, the WRF model is being used in conjunction with ECMWF (European Centre for Medium-Range Weather Forecasts) ensemble forecasts to produce a probabilistic weather forecasting product. Due to the chaotic nature of the atmosphere, a single, deterministic weather forecast can only have limited skill. The ECMWF ensemble methods produce an ensemble of 51 global forecasts, twice a day, by perturbing the initial conditions of a 'control' forecast, which is the best estimate of the initial state of the atmosphere. This method provides an indication of the reliability of the forecast and a quantitative basis for probabilistic forecasting. The limitation of ensemble forecasting lies in the fact that the perturbed model runs behave differently under different weather patterns and each model run is equally likely to be closest to the observed weather situation. Models have biases, and involve assumptions about physical processes and forcing factors such as the underlying topography. Third, Bayesian Model Averaging (BMA) is being applied to the output from the ensemble forecasts in order to statistically post-process the results and achieve a better wind forecasting system.
BMA is a promising technique that will offer calibrated probabilistic wind forecasts which will be invaluable in wind energy management. In brief, this method turns the ensemble forecasts into a calibrated predictive probability distribution. Each ensemble member is provided with a 'weight' determined by its relative predictive skill over a training period of around 30 days. Verification of data is carried out using observed wind data from operational wind farms. These are then compared to existing forecasts produced by ECMWF and Met Eireann in relation to skill scores. We are developing decision-making models to show the benefits achieved using the data produced by our wind energy forecasting system. An energy trading model will be developed, based on the rules currently used by the Single Electricity Market Operator for energy trading in Ireland. This trading model will illustrate the potential for financial savings by using the forecast data generated by this research.
Kim, Taegu; Hong, Jungsik; Kang, Pilsung
2017-01-01
Accurate box office forecasting models are developed by considering competition and word-of-mouth (WOM) effects in addition to screening-related information. Nationality, genre, ratings, and distributors of motion pictures running concurrently with the target motion picture are used to describe the competition, whereas the numbers of informative, positive, and negative mentions posted on social network services (SNS) are used to gauge the atmosphere spread by WOM. Among these candidate variables, only significant variables are selected by genetic algorithm (GA), based on which machine learning algorithms are trained to build forecasting models. The forecasts are combined to improve forecasting performance. Experimental results on the Korean film market show that the forecasting accuracy in early screening periods can be significantly improved by considering competition. In addition, WOM has a stronger influence on total box office forecasting. Considering both competition and WOM improves forecasting performance to a larger extent than when only one of them is considered. PMID:28819355
Forecasting the Human Pathogen Vibrio Parahaemolyticus in Shellfish Tissue within Long Island Sound
NASA Astrophysics Data System (ADS)
Whitney, M. M.; DeRosia-Banick, K.
2016-02-01
Vibrio parahaemolyticus (Vp) is a marine bacterium that occurs naturally in brackish and saltwater environments and may be found in higher concentrations in the warmest months. Vp is a growing threat to the production of safe seafood: consumption of shellfish with high Vp levels can result in gastrointestinal illness in humans. Management responses to Vp-related illness outbreaks include closure of shellfish growing areas. Water quality observations, Vp measurements, and model forecasts are key components of effective management of shellfish growing areas. There is a clear need for observations within the growing areas themselves, which lie offshore of coastal stations and typically inshore of the observing system moorings. New field observations in Long Island Sound (LIS) shellfish growing areas are described, and their agreement with high-resolution satellite sea surface temperature data is discussed. A new dataset of Vp concentrations in shellfish tissue is used to determine the LIS-specific Vp vs. temperature relationship, following the methods of the FDA pre-harvest Vp risk model. This information is combined with output from a high-resolution hydrodynamic model of LIS to make daily forecasts of Vp levels. The influence of river inflows, the role of heat waves, and predictions for future warmer climates are discussed. The key elements of this observational-modeling approach to pathogen forecasting are extendable to other coastal systems.
NASA Astrophysics Data System (ADS)
Bell, Andrew F.; Naylor, Mark; Heap, Michael J.; Main, Ian G.
2011-08-01
Power-law accelerations in the mean rate of strain, earthquakes and other precursors have been widely reported prior to material failure phenomena, including volcanic eruptions, landslides and laboratory deformation experiments, as predicted by several theoretical models. The Failure Forecast Method (FFM), which linearizes the power-law trend, has been routinely used to forecast the failure time in retrospective analyses; however, its performance has never been formally evaluated. Here we use synthetic and real data, recorded in laboratory brittle creep experiments and at volcanoes, to show that the assumptions of the FFM are inconsistent with the error structure of the data, leading to biased and imprecise forecasts. We show that a Generalized Linear Model method provides higher-quality forecasts that converge more accurately to the eventual failure time, accounting for the appropriate error distributions. This approach should be employed in place of the FFM to provide reliable quantitative forecasts and estimate their associated uncertainties.
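The FFM linearization being critiqued can be sketched for the common case of a power-law exponent p = 2, where the inverse rate decays linearly to zero at the failure time; the ordinary least-squares step below embodies exactly the error-structure assumption the authors show is problematic:

```python
import numpy as np

def ffm_failure_time(t, rate):
    """Failure Forecast Method: regress 1/rate on time, extrapolate to zero."""
    inv_rate = 1.0 / np.asarray(rate, dtype=float)
    slope, intercept = np.polyfit(np.asarray(t, dtype=float), inv_rate, 1)
    return -intercept / slope   # time at which the inverse rate reaches zero

# synthetic power-law acceleration with true failure time t_f = 100
t = np.arange(0.0, 90.0)
rate = 1.0 / (100.0 - t)
predicted_tf = ffm_failure_time(t, rate)
```

The Generalized Linear Model approach the authors recommend replaces this OLS fit with a regression under an error distribution appropriate to the observed counts.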
Characterizing Time Series Data Diversity for Wind Forecasting: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Brian S; Chartan, Erol Kevin; Feng, Cong
Wind forecasting plays an important role in integrating variable and uncertain wind power into the power grid. Various forecasting models have been developed to improve forecasting accuracy. However, it is challenging to accurately compare the true forecasting performance of different methods and forecasters due to the lack of diversity in forecasting test datasets. This paper proposes a time series characteristic analysis approach to visualize and quantify wind time series diversity. The developed method first calculates six time series characteristic indices from various perspectives. Then principal component analysis is performed to reduce the data dimension while preserving the important information. The diversity of the time series dataset is visualized by the geometric distribution of the newly constructed principal component space. The volume of the 3-dimensional (3D) convex polytope (or the length of the 1D number axis, or the area of the 2D convex polygon) is used to quantify the time series data diversity. The method is tested with five datasets with various degrees of diversity.
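The dimension-reduction step can be sketched with a plain PCA via SVD; for brevity the diversity measure below uses an axis-aligned bounding-box area in the first two principal components rather than the convex-polytope volume used in the paper, and the characteristic indices are random placeholders:

```python
import numpy as np

def pca_project(features, n_components=2):
    """Project standardized feature rows onto the top principal components."""
    X = np.asarray(features, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)   # rows of Vt = PC axes
    return X @ Vt[:n_components].T

def bounding_box_area(points):
    """Crude diversity proxy: area of the axis-aligned bounding box in PC space."""
    p = np.asarray(points, dtype=float)
    return float(np.prod(p.max(axis=0) - p.min(axis=0)))

# six hypothetical characteristic indices for 50 wind time series
rng = np.random.default_rng(2)
indices = rng.normal(size=(50, 6))
proj = pca_project(indices)
diversity = bounding_box_area(proj)
```

A more faithful reproduction would replace the bounding box with the convex-hull area or volume of the projected points.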
Forecasting Occurrences of Activities.
Minor, Bryan; Cook, Diane J
2017-07-01
While activity recognition has been shown to be valuable for pervasive computing applications, less work has focused on techniques for forecasting the future occurrence of activities. We present an activity forecasting method to predict the time that will elapse until a target activity occurs. This method generates an activity forecast using a regression tree classifier and offers an advantage over sequence prediction methods in that it can predict expected time until an activity occurs. We evaluate this algorithm on real-world smart home datasets and provide evidence that our proposed approach is most effective at predicting activity timings.
Orsini, Luisa; Schwenk, Klaus; De Meester, Luc; Colbourne, John K.; Pfrender, Michael E.; Weider, Lawrence J.
2013-01-01
Evolutionary changes are determined by a complex assortment of ecological, demographic and adaptive histories. Predicting how evolution will shape the genetic structures of populations coping with current (and future) environmental challenges has principally relied on investigations through space, in lieu of time, because long-term phenotypic and molecular data are scarce. Yet, dormant propagules in sediments, soils and permafrost are convenient natural archives of population histories from which to trace adaptive trajectories over extended time periods. DNA sequence data obtained from these natural archives, combined with pioneering methods for analyzing both ecological and population genomic time-series data, are likely to provide predictive models to forecast evolutionary responses of natural populations to environmental changes resulting from natural and anthropogenic stressors, including climate change. PMID:23395434
A Response Function Approach for Rapid Far-Field Tsunami Forecasting
NASA Astrophysics Data System (ADS)
Tolkova, Elena; Nicolsky, Dmitry; Wang, Dailin
2017-08-01
Predicting tsunami impacts at remote coasts largely relies on tsunami en-route measurements in an open ocean. In this work, these measurements are used to generate instant tsunami predictions in deep water and near the coast. The predictions are generated as a response or a combination of responses to one or more tsunameters, with each response obtained as a convolution of real-time tsunameter measurements and a pre-computed pulse response function (PRF). Practical implementation of this method requires tables of PRFs in a 3D parameter space: earthquake location-tsunameter-forecasted site. Examples of hindcasting the 2010 Chilean and the 2011 Tohoku-Oki tsunamis along the US West Coast and beyond demonstrated high accuracy of the suggested technology in application to trans-Pacific seismically generated tsunamis.
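The core of the method, prediction by convolving the tsunameter record with a pre-computed pulse response function, can be sketched as follows; the PRF here is a toy pure-delay impulse, not a computed hydrodynamic response:

```python
import numpy as np

def prf_forecast(measurements, prf):
    """Predicted series at a remote site: tsunameter record convolved with a PRF."""
    full = np.convolve(np.asarray(measurements, dtype=float),
                       np.asarray(prf, dtype=float))
    return full[:len(measurements)]   # keep the causal part of the convolution

# a PRF that is a pure two-sample travel delay just shifts the record in time
record = [0.0, 0.1, 0.4, 0.2, 0.0]   # hypothetical gauge amplitudes (m)
pred = prf_forecast(record, [0.0, 0.0, 1.0])
```

A realistic PRF would encode both the travel delay and the amplitude and dispersion changes along the path, pre-computed for each source region and forecast site.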
NASA Astrophysics Data System (ADS)
Shaman, J.; Stieglitz, M.; Zebiak, S.; Cane, M.; Day, J. F.
2002-12-01
We present an ensemble local hydrologic forecast derived from the seasonal forecasts of the International Research Institute for Climate Prediction (IRI). Three-month seasonal forecasts were used to resample historical meteorological conditions and generate ensemble forcing datasets for a TOPMODEL-based hydrology model. Eleven retrospective forecasts were run at a Florida site and a New York site. Forecast skill was assessed for mean-area modeled water table depth (WTD), i.e. near-surface soil wetness conditions, and compared with WTD simulated with observed data. Hydrology model forecast skill was evident at the Florida site but not at the New York site. At the Florida site, persistence of hydrologic conditions and local skill of the IRI seasonal forecast contributed to the local hydrologic forecast skill. This forecast will permit probabilistic prediction of future hydrologic conditions. At the Florida site, we have also quantified the link between modeled WTD (i.e. drought) and the amplification and transmission of St. Louis Encephalitis virus (SLEV). We derive an empirical relationship between modeled land surface wetness and levels of SLEV transmission associated with human clinical cases. We then combine the seasonal forecasts of local, modeled WTD with this empirical relationship and produce retrospective probabilistic seasonal forecasts of epidemic SLEV transmission in Florida. Epidemic SLEV transmission forecast skill is demonstrated. These findings will permit real-time forecasts of drought and resultant SLEV transmission in Florida.
Rodríguez, Nibaldo
2014-01-01
Two smoothing strategies, combined with autoregressive integrated moving average (ARIMA) and autoregressive neural network (ANN) models, are presented to improve the forecasting of time series. The forecasting strategy is implemented in two stages. In the first stage, the time series is smoothed using either 3-point moving average smoothing or singular value decomposition of the Hankel matrix (HSVD). In the second stage, an ARIMA model and two ANNs for one-step-ahead time series forecasting are used. The coefficients of the first ANN are estimated through the particle swarm optimization (PSO) learning algorithm, while the coefficients of the second ANN are estimated with the resilient backpropagation (RPROP) learning algorithm. The proposed models are evaluated using a weekly time series of traffic accidents from the Valparaíso region, Chile, from 2003 to 2012. The best result is given by the combination HSVD-ARIMA, with a MAPE of 0.26%, followed by MA-ARIMA with a MAPE of 1.12%; the worst result is given by the MA-ANN based on PSO, with a MAPE of 15.51%. PMID:25243200
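A minimal sketch of the two-stage idea (smoothing, then a linear time series model); here a least-squares AR(1) fit stands in for the full ARIMA model, and the series is a toy example:

```python
import numpy as np

def ma3_smooth(x):
    """Stage 1: 3-point moving-average smoothing, edge values kept as-is."""
    s = x.astype(float).copy()
    s[1:-1] = (x[:-2] + x[1:-1] + x[2:]) / 3.0
    return s

def ar1_one_step(x):
    """Stage 2: one-step-ahead forecast from a least-squares AR(1) fit.
    A simplified stand-in for the ARIMA model used in the paper."""
    X = np.vstack([x[:-1], np.ones(len(x) - 1)]).T
    phi, c = np.linalg.lstsq(X, x[1:], rcond=None)[0]
    return phi * x[-1] + c

series = np.array([10.0, 12.0, 11.0, 13.0, 12.0, 14.0, 13.0, 15.0])
smoothed = ma3_smooth(series)
forecast = ar1_one_step(smoothed)
```

The HSVD variant would replace `ma3_smooth` with a rank-truncated singular value decomposition of the series' Hankel matrix before the same second stage.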
NASA Astrophysics Data System (ADS)
Wang, Jiangbo; Liu, Junhui; Li, Tiantian; Yin, Shuo; He, Xinhui
2018-01-01
Monthly electricity sales forecasting is fundamental to ensuring the secure operation of the power system. This paper presents a monthly electricity sales forecasting method that comprehensively considers the coupled factors of temperature, economic growth, electric power replacement and business expansion. The mathematical model is constructed using regression. The simulation results show that the proposed method is accurate and effective.
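A hedged sketch of such a multi-factor regression model; the driver names, units, coefficients and data are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 36  # three years of monthly observations (synthetic)

# Hypothetical monthly drivers mirroring the four factors in the abstract.
X = np.column_stack([
    rng.normal(20, 8, n),    # mean temperature (deg C)
    rng.normal(6, 1, n),     # economic growth (%)
    rng.normal(2, 0.5, n),   # electric power replacement index
    rng.normal(1, 0.3, n),   # business expansion index
    np.ones(n),              # intercept
])
true_beta = np.array([1.5, 40.0, 25.0, 30.0, 500.0])
sales = X @ true_beta + rng.normal(0, 5, n)  # synthetic sales (GWh)

# Ordinary least-squares fit and a one-month-ahead prediction.
beta, *_ = np.linalg.lstsq(X, sales, rcond=None)
next_month = np.array([22.0, 6.2, 2.1, 1.05, 1.0])
pred = next_month @ beta
```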
NASA Astrophysics Data System (ADS)
Kozel, Tomas; Stary, Milos
2017-12-01
The main advantage of stochastic forecasting is that it yields a fan of possible values, which deterministic forecasting cannot provide; the future development of a random process is described better by stochastic than by deterministic forecasting, and discharge at a measurement profile can be treated as a random process. This article presents the construction and application of a forecasting model for a managed large open water reservoir with a supply function. The model is based on neural networks (NN) and zone models, and it forecasts average monthly flows from past average monthly flows, the trained neural network and random numbers. Part of the data is assigned to a moving zone created around the last measured average monthly flow, and the correlation matrix is assembled only from data belonging to that zone. The model was compiled for forecasts of 1 to 12 months ahead, using 2 to 11 backward monthly flows as NN inputs. The data were rid of asymmetry with the Box-Cox transform (Box and Cox, 1964), with the transformation parameter found by optimization, and then transformed to the standard normal distribution. The data have a monthly step and the forecast is non-recurring. A 90-year observed flow series was used to compile the model: the first 75 years for calibration (the input-output relationship matrix) and the last 15 years for validation only. Model outputs were compared with the observed flow series. For this comparison, both the observed series (representing a 100% successful forecast) and the forecasts were applied to the management of an artificial reservoir: management using a genetic algorithm (GA) with the observed series was compared with a fuzzy model driven by forecasts from the moving-zone model. During evaluation, the best zone size was also sought.
The results show that the highest number of inputs did not give the best results, and that the ideal zone size lies in the interval from 25 to 35, within which the course of management was almost identical for all values. The resulting course of management was compared with that obtained using the GA with the observed flow series. The comparison showed that the fuzzy model with forecasted values was able to handle the main failures, and the disturbances introduced artificially by the model proved inessential once the water volumes during management were evaluated. The forecasting model combined with the fuzzy model provides very good results for the management of a water reservoir with a storage function and can be recommended for this purpose.
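The Box-Cox step mentioned above can be sketched as follows; the flow values and the parameter value 0.2 are illustrative, and in the paper the parameter is found by optimization:

```python
import numpy as np

def boxcox(x, lam):
    """Box-Cox (1964) power transform, used to reduce the asymmetry
    of flow data before model fitting."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def boxcox_inverse(y, lam):
    """Back-transform forecasts to the original flow scale."""
    y = np.asarray(y, dtype=float)
    return np.exp(y) if lam == 0 else (lam * y + 1.0) ** (1.0 / lam)

flows = np.array([3.2, 5.1, 40.0, 4.4, 2.9, 55.0])  # skewed monthly flows
z = boxcox(flows, 0.2)
back = boxcox_inverse(z, 0.2)
```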
Near real time wind energy forecasting incorporating wind tunnel modeling
NASA Astrophysics Data System (ADS)
Lubitz, William David
A series of experiments and investigations were carried out to inform the development of a day-ahead wind power forecasting system. An experimental near-real time wind power forecasting system was designed and constructed that operates on a desktop PC and forecasts 12--48 hours in advance. The system uses model output of the Eta regional scale forecast (RSF) to forecast the power production of a wind farm in the Altamont Pass, California, USA from 12 to 48 hours in advance. It is of modular construction and designed to also allow diagnostic forecasting using archived RSF data, thereby allowing different methods of completing each forecasting step to be tested and compared using the same input data. Wind-tunnel investigations of the effect of wind direction and hill geometry on wind speed-up above a hill were conducted. Field data from an Altamont Pass, California site was used to evaluate several speed-up prediction algorithms, both with and without wind direction adjustment. These algorithms were found to be of limited usefulness for the complex terrain case evaluated. Wind-tunnel and numerical simulation-based methods were developed for determining a wind farm power curve (the relation between meteorological conditions at a point in the wind farm and the power production of the wind farm). Both methods, as well as two methods based on fits to historical data, ultimately showed similar levels of accuracy: mean absolute errors predicting power production of 5 to 7 percent of the wind farm power capacity. The downscaling of RSF forecast data to the wind farm was found to be complicated by the presence of complex terrain. Poor results using the geostrophic drag law and regression methods motivated the development of a database search method that is capable of forecasting not only wind speeds but also power production with accuracy better than persistence.
NASA Astrophysics Data System (ADS)
Goswami, M.; O'Connor, K. M.; Shamseldin, A. Y.
The "Galway Real-Time River Flow Forecasting System" (GFFS) is a software package developed at the Department of Engineering Hydrology of the National University of Ireland, Galway, Ireland. It is based on a selection of lumped black-box and conceptual rainfall-runoff models, all developed in Galway, consisting primarily of both the non-parametric (NP) and parametric (P) forms of two black-box-type rainfall-runoff models, namely the Simple Linear Model (SLM-NP and SLM-P) and the seasonally-based Linear Perturbation Model (LPM-NP and LPM-P), together with the non-parametric wetness-index-based Linearly Varying Gain Factor Model (LVGFM), the black-box Artificial Neural Network (ANN) Model, and the conceptual Soil Moisture Accounting and Routing (SMAR) Model. Comprising the above suite of models, the system enables the user to calibrate each model individually, initially without updating, and it is also capable of producing combined (i.e. consensus) forecasts using the Simple Average Method (SAM), the Weighted Average Method (WAM), or the Artificial Neural Network Method (NNM). The updating of each model output is achieved using one of four different techniques, namely simple Auto-Regressive (AR) updating, Linear Transfer Function (LTF) updating, Artificial Neural Network updating (NNU), and updating by the Non-linear Auto-Regressive Exogenous-input method (NARXM). The models exhibit a considerable range of variation in structural complexity, with corresponding degrees of complication in objective function evaluation. Operating in continuous river-flow simulation and updating modes, these models and techniques have been applied to two Irish catchments, namely the Fergus and the Brosna. A number of performance evaluation criteria have been used to comparatively assess the model discharge forecast efficiency.
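The SAM and WAM combination schemes can be sketched roughly as follows; the calibration data are synthetic, and the unconstrained least-squares weighting is an illustrative simplification of how WAM weights might be fitted:

```python
import numpy as np

def sam(forecasts):
    """Simple Average Method: unweighted mean across model forecasts.
    `forecasts` has one row per model."""
    return forecasts.mean(axis=0)

def wam_weights(forecasts, observed):
    """Weighted Average Method: least-squares weights fitted on a
    calibration period (not constrained to sum to one here)."""
    w, *_ = np.linalg.lstsq(forecasts.T, observed, rcond=None)
    return w

# Toy calibration: three models forecasting the same discharge series.
obs = np.array([5.0, 7.0, 6.0, 8.0, 7.5])
models = np.vstack([obs * 0.9, obs * 1.1, obs + 0.5])  # biased models
w = wam_weights(models, obs)
combined = w @ models
```

Because the toy models' biases are complementary, the fitted weighted combination recovers the observed series almost exactly, while the simple average retains a small bias.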
Future Research in Health Information Technology: A Review
Hemmat, Morteza; Ayatollahi, Haleh; Maleki, Mohammad Reza; Saghafi, Fatemeh
2017-01-01
Introduction Currently, information technology is considered an important tool to improve healthcare services. To adopt the right technologies, policy makers should have adequate information about present and future advances. This study aimed to review and compare studies with a focus on the future of health information technology. Method This review study was completed in 2015. The databases used were Scopus, Web of Science, ProQuest, Ovid Medline, and PubMed. Keyword searches were used to identify papers and materials published between 2000 and 2015. Initially, 407 papers were obtained, and they were reduced to 11 papers at the final stage. The selected papers were described and compared in terms of the country of origin, objective, methodology, and time horizon. Results The papers were divided into two groups: those forecasting the future of health information technology (seven papers) and those providing health information technology foresight (four papers). The results showed that papers related to forecasting the future of health information technology were mostly a literature review, and the time horizon was up to 10 years in most of these studies. In the health information technology foresight group, most of the studies used a combination of techniques, such as scenario building and Delphi methods, and had long-term objectives. Conclusion To make the most of an investment and to improve planning and successful implementation of health information technology, a strategic plan for the future needs to be set. To achieve this aim, methods such as forecasting the future of health information technology and offering health information technology foresight can be applied. The forecasting method is used when the objectives are not very large, and the foresight approach is recommended when large-scale objectives are set to be achieved. 
In the field of health information technology, the results of foresight studies can help to establish realistic long-term expectations of the future of health information technology. PMID:28566991
Operational hydrological forecasting in Bavaria. Part I: Forecast uncertainty
NASA Astrophysics Data System (ADS)
Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.
2009-04-01
In Bavaria, operational flood forecasting has been established since the disastrous flood of 1999. Nowadays, forecasts based on rainfall information from about 700 raingauges and 600 rivergauges are calculated and issued for nearly 100 rivergauges. With the added experience of the 2002 and 2005 floods, awareness grew that the standard deterministic forecast, neglecting the uncertainty associated with each forecast, is misleading, creating a false feeling of unambiguousness. As a consequence, a system to identify, quantify and communicate the sources and magnitude of forecast uncertainty has been developed, which is presented in part I of this study. In this system, the use of ensemble meteorological forecasts plays a key role, which is presented in part II. In developing the system, several constraints stemming from the range of hydrological regimes and operational requirements had to be met. Firstly, operational time constraints obviate the variation of all components of the modeling chain, as would be done in a full Monte Carlo simulation. Therefore, an approach was chosen where only the most relevant sources of uncertainty were dynamically considered, while the others were jointly accounted for by static error distributions from offline analysis. Secondly, the dominant sources of uncertainty vary over the wide range of forecasted catchments: in alpine headwater catchments, typically a few hundred square kilometers in size, rainfall forecast uncertainty is the key factor for forecast uncertainty, with a magnitude dynamically changing with the prevailing predictability of the atmosphere. In lowland catchments encompassing several thousand square kilometers, forecast uncertainty in the desired range (usually up to two days) depends mainly on upstream gauge observation quality, routing and unpredictable human impacts such as reservoir operation.
The determination of forecast uncertainty comprised the following steps: a) From comparison of gauge observations with several years of archived forecasts, overall empirical error distributions, termed 'overall error', were derived for each gauge for a range of relevant forecast lead times. b) The error distributions vary strongly with the hydrometeorological situation, so a subdivision into the hydrological cases 'low flow', 'rising flood', 'flood' and 'flood recession' was introduced. c) For the sake of numerical compression, theoretical distributions were fitted to the empirical distributions using the method of moments. Here, the normal distribution was generally best suited. d) Further data compression was achieved by representing the distribution parameters as a function (second-order polynomial) of lead time. In general, the 'overall error' obtained from the above procedure is most useful in regions where large human impact occurs and where the influence of the meteorological forecast is limited. In upstream regions, however, forecast uncertainty is strongly dependent on the current predictability of the atmosphere, which is contained in the spread of an ensemble forecast. Including this dynamically in the hydrological forecast uncertainty estimation requires prior elimination of the contribution of the weather forecast to the 'overall error'. This was achieved by calculating long series of hydrometeorological forecast tests, where rainfall observations were used instead of forecasts. The resulting error distribution is termed 'model error' and can be applied to hydrological ensemble forecasts, where ensemble rainfall forecasts are used as forcing. The concept is illustrated by examples (good and bad ones) covering a wide range of catchment sizes, hydrometeorological regimes and qualities of hydrological model calibration. The methodology to combine the static and dynamic shares of uncertainty is presented in part II of this study.
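Steps c) and d) above (method-of-moments fitting of a normal distribution per lead time, then polynomial compression over lead time) might look roughly like this; the error archive is synthetic and the growth rates are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
lead_times = np.arange(1, 49)  # forecast lead time in hours (illustrative)

# Synthetic archive of forecast errors per lead time; bias and spread
# grow with lead time, mimicking a degrading forecast.
mu_fit, sigma_fit = [], []
for lt in lead_times:
    errors = rng.normal(0.02 * lt, 0.1 + 0.05 * lt, size=500)
    mu_fit.append(errors.mean())           # method of moments: sample mean
    sigma_fit.append(errors.std(ddof=1))   # method of moments: sample std dev

# Step d): compress the parameters to second-order polynomials of lead time.
mu_poly = np.polyfit(lead_times, mu_fit, 2)
sigma_poly = np.polyfit(lead_times, sigma_fit, 2)
mu_at = lambda lt: np.polyval(mu_poly, lt)
sigma_at = lambda lt: np.polyval(sigma_poly, lt)
```

Six polynomial coefficients per gauge and hydrological case then replace the full empirical error archive.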
Air Quality Forecasts Using the NASA GEOS Model
NASA Technical Reports Server (NTRS)
Keller, Christoph A.; Knowland, K. Emma; Nielsen, Jon E.; Orbe, Clara; Ott, Lesley; Pawson, Steven; Saunders, Emily; Duncan, Bryan; Follette-Cook, Melanie; Liu, Junhua;
2018-01-01
We provide an introduction to a new high-resolution (0.25 degree) global composition forecast produced by NASA's Global Modeling and Assimilation Office. The NASA Goddard Earth Observing System version 5 (GEOS-5) model has been expanded to provide global near-real-time forecasts of atmospheric composition at a horizontal resolution of 0.25 degrees (25 km). Previously, this combination of detailed chemistry and resolution was only provided by regional models. This system combines the operational GEOS-5 weather forecasting model with the state-of-the-science GEOS-Chem chemistry module (version 11) to provide detailed chemical analysis of a wide range of air pollutants such as ozone, carbon monoxide, nitrogen oxides, and fine particulate matter (PM2.5). This is the highest resolution among currently publicly available global composition forecasts. Evaluation and validation of modeled trace gases and aerosols compared to surface and satellite observations will be presented for constituents relevant to health-based air quality standards. Comparisons of modeled trace gases and aerosols against satellite observations show that the model produces realistic concentrations of atmospheric constituents in the free troposphere. Model comparisons against surface observations highlight the model's capability to capture the diurnal variability of air pollutants under a variety of meteorological conditions. The GEOS-5 composition forecasting system offers a new tool for scientists and the public health community, and is being developed jointly with several government and non-profit partners. Potential applications include air quality warnings, flight campaign planning and exposure studies using the archived analysis fields.
Lu, Wei-Zhen; Wang, Wen-Jian; Wang, Xie-Kang; Yan, Sui-Hang; Lam, Joseph C
2004-09-01
The forecasting of air pollutant trends has received much attention in recent years. It is an important and popular topic in environmental science, as concerns have been raised about the health impacts caused by unacceptable ambient air pollutant levels. Of greatest concern are metropolitan cities like Hong Kong. In Hong Kong, respirable suspended particulates (RSP), nitrogen oxides (NOx), and nitrogen dioxide (NO2) are major air pollutants due to the dominant usage of diesel fuel by commercial vehicles and buses. Hence, the study of the influence and the trends relating to these pollutants is extremely significant to public health and the image of the city. The use of neural network techniques to predict trends relating to air pollutants is regarded as a reliable and cost-effective method for the task of prediction. The work reported here involves developing an improved neural network model that combines the principal component analysis technique with the radial basis function network and forecasts pollutant tendencies based on a recorded database. Compared with general neural network models, the proposed model features a simpler network architecture, a faster training speed, and a more satisfactory prediction performance. The improved model was evaluated with hourly time series of RSP, NOx and NO2 concentrations monitored at the Mong Kok Roadside Gaseous Monitoring Station in Hong Kong during the year 2000 and proved to be effective. The model developed is a potential tool for forecasting air quality parameters and is superior to traditional neural network methods.
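A rough sketch of the PCA-plus-RBF idea: PCA compresses correlated inputs, and a Gaussian radial basis function layer with a linear least-squares output layer stands in for the trained network. All data are synthetic and every name is illustrative:

```python
import numpy as np

def pca_project(X, k):
    """Project centered inputs onto the first k principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def rbf_design(Z, centers, width):
    """Gaussian RBF activations of inputs Z for the given centers."""
    d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

# Synthetic pollutant-like data: 5 inputs with one redundant (correlated)
# channel, and a nonlinear target.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=200)  # near-duplicate input
y = np.sin(X[:, 0]) + 0.1 * X[:, 1]

Z = pca_project(X, k=3)                       # PCA removes redundancy
centers = Z[rng.choice(len(Z), 10, replace=False)]
H = rbf_design(Z, centers, width=1.0)
w, *_ = np.linalg.lstsq(H, y, rcond=None)     # linear output layer
y_hat = H @ w
```

Solving only the linear output weights, rather than training all layers, is one reason RBF networks of this kind train faster than general feedforward networks.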
Multi-Step Time Series Forecasting with an Ensemble of Varied Length Mixture Models.
Ouyang, Yicun; Yin, Hujun
2018-05-01
Many real-world problems require modeling and forecasting of time series, such as weather temperature, electricity demand, stock prices and foreign exchange (FX) rates. Often, the tasks involve predicting over a long-term period, e.g. several weeks or months. Most existing time series models are inherently one-step models, that is, they predict one time point ahead. Multi-step or long-term prediction is difficult and challenging due to the lack of information and the accumulation of uncertainty or error. The main existing approaches, iterative and independent, either apply a one-step model recursively or treat each forecast horizon with an independent model. They generally perform poorly in practical applications. In this paper, as an extension of the self-organizing mixture autoregressive (AR) model, the varied length mixture (VLM) models are proposed to model and forecast time series over multiple steps. The key idea is to preserve the dependencies between the time points within the prediction horizon. Training data are segmented to various lengths corresponding to various forecasting horizons, and the VLM models are trained in a self-organizing fashion on these segments to capture these dependencies in their component AR models of various predicting horizons. The VLM models form a probabilistic mixture of these varied length models. A combination of short and long VLM models and an ensemble of them are proposed to further enhance the prediction performance. The effectiveness of the proposed methods and their marked improvements over the existing methods are demonstrated through a number of experiments on synthetic data, real-world FX rates and weather temperatures.
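The iterative and independent (direct) multi-step strategies that the paper contrasts can be sketched with a simple AR model; the AR order, the toy trend series and the least-squares fitting are illustrative, not the VLM method itself:

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares AR(p) coefficients (lag-1 first, intercept last)."""
    X = np.column_stack(
        [x[p - i - 1 : len(x) - i - 1] for i in range(p)] + [np.ones(len(x) - p)]
    )
    return np.linalg.lstsq(X, x[p:], rcond=None)[0]

def iterative_forecast(x, p, h):
    """Iterative strategy: apply a one-step AR model recursively h times."""
    coef = fit_ar(x, p)
    hist = list(x)
    for _ in range(h):
        hist.append(coef[:p] @ hist[-1 : -p - 1 : -1] + coef[p])
    return np.array(hist[-h:])

def direct_forecast(x, p, h):
    """Direct (independent) strategy: fit a separate model mapping
    p lags straight to the value h steps ahead."""
    X = np.column_stack(
        [x[p - i - 1 : len(x) - i - h] for i in range(p)]
        + [np.ones(len(x) - p - h + 1)]
    )
    coef = np.linalg.lstsq(X, x[p + h - 1 :], rcond=None)[0]
    return coef[:p] @ x[-1 : -p - 1 : -1] + coef[p]

x = np.arange(20.0)  # a trend both strategies can extrapolate exactly
it = iterative_forecast(x, p=2, h=3)
dr = direct_forecast(x, p=2, h=3)
```

The VLM models instead learn segments of varying length so that dependencies across the whole horizon are captured jointly rather than step by step.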
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Jared A.; Hacker, Joshua P.; Monache, Luca Delle
A current barrier to greater deployment of offshore wind turbines is the poor quality of numerical weather prediction model wind and turbulence forecasts over the open ocean. The bulk of development for atmospheric boundary layer (ABL) parameterization schemes has focused on land, partly due to a scarcity of observations over the ocean. The 100-m FINO1 tower in the North Sea is one of the few sources worldwide of atmospheric profile observations from the sea surface to turbine hub height. These observations are crucial to developing a better understanding and modeling of physical processes in the marine ABL. In this paper we use the WRF single column model (SCM), coupled with an ensemble Kalman filter from the Data Assimilation Research Testbed (DART), to create 100-member ensembles at the FINO1 location. The goal of this study is to determine the extent to which model parameter estimation can improve offshore wind forecasts. Combining two datasets that provide lateral forcing for the SCM and two methods for determining z0, the time-varying sea-surface roughness length, we conduct four WRF-SCM/DART experiments over the October-December 2006 period. The two methods for determining z0 are the default Fairall-adjusted Charnock formulation in WRF, and using parameter estimation techniques to estimate z0 in DART. Using DART to estimate z0 is found to reduce 1-h forecast errors of wind speed over the Charnock-Fairall z0 ensembles by 4%-22%. However, parameter estimation of z0 does not simultaneously reduce turbulent flux forecast errors, indicating limitations of this approach and the need for new marine ABL parameterizations.
System load forecasts for an electric utility. [Hourly loads using Box-Jenkins method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uri, N.D.
This paper discusses forecasting hourly system load for an electric utility using Box-Jenkins time-series analysis. The results indicate that a model based on the method of Box and Jenkins, given its simplicity, gives excellent results over the forecast horizon.
OAST planning model for space systems technology
NASA Technical Reports Server (NTRS)
Sadin, S. R.
1978-01-01
The NASA Office of Aeronautics and Space Technology (OAST) planning model for space systems technology is described, and some space technology forecasts of a general nature are reported. Technology forecasts are presented as a span of technology levels; uncertainties in level of commitment to project and in required time are taken into account, with emphasis on differences resulting from high or low commitment. Forecasts are created by combining several types of data, including information on past technology trends, the trends of past predictions, the rate of advancement predicted by experts in the field, and technology forecasts already published.
Profitability analysis of KINGLONG nearly 5 years
NASA Astrophysics Data System (ADS)
Zhang, Mei; Wen, Jinghua
2017-08-01
Profitability analysis plays an important role in measuring business performance and forecasting a company's prospects. This paper takes King Long Motor as a research case. On the basis of the fundamental theory of financial management, it combines theory with data analysis, using indicators that measure profitability to analyze King Long Motor's profitability in detail, identifying the factors that constrain the company's profitability as well as the drivers for improving it. Recommendations are then made to improve the profitability of King Long Motor and promote the company's better and faster future development.
NASA Astrophysics Data System (ADS)
Hamid, Nor Zila Abd; Adenan, Nur Hamiza; Noorani, Mohd Salmi Md
2017-08-01
Forecasting and analyzing the ozone (O3) concentration time series is important because the pollutant is harmful to health. This study is a pilot study for forecasting and analyzing the O3 time series in a Malaysian educational area, namely Shah Alam, using a chaotic approach. Through this approach, the observed hourly scalar time series is reconstructed into a multi-dimensional phase space, which is then used to forecast the future time series through the local linear approximation method. The main purpose is to forecast high O3 concentrations. The original method performed poorly, but the improved method addressed this weakness, enabling the high concentrations to be successfully forecast. The correlation coefficient between the observed and forecasted time series through the improved method is 0.9159, and both the mean absolute error and root mean squared error are low. Thus, the improved method is advantageous. The time series analysis by means of the phase space plot and the Cao method identified the presence of low-dimensional chaotic dynamics in the observed O3 time series. Results showed that at least seven factors affect the studied O3 time series, which is consistent with the factors listed in the diurnal variations investigation and the sensitivity analysis from past studies. In conclusion, the chaotic approach has successfully forecast and analyzed the O3 time series in the educational area of Shah Alam. These findings are expected to help stakeholders such as the Ministry of Education and the Department of Environment achieve better air pollution management.
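The phase-space reconstruction and local linear approximation steps can be sketched as follows; the embedding dimension, neighbour count and sine-wave test series are illustrative choices, not the study's settings:

```python
import numpy as np

def embed(x, dim, tau=1):
    """Time-delay embedding: reconstruct a dim-dimensional phase space
    from a scalar time series."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def local_linear_predict(x, dim, k=10):
    """Predict the next value by a linear least-squares fit over the
    k nearest neighbours of the current state in the embedded space."""
    V = embed(x, dim)
    current, past, targets = V[-1], V[:-1], x[dim:]
    idx = np.argsort(((past - current) ** 2).sum(axis=1))[:k]
    A = np.column_stack([past[idx], np.ones(k)])
    coef = np.linalg.lstsq(A, targets[idx], rcond=None)[0]
    return np.append(current, 1.0) @ coef

x = np.sin(0.3 * np.arange(200))  # a deterministic toy series
pred = local_linear_predict(x, dim=4)
```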
A case study of the sensitivity of forecast skill to data and data analysis techniques
NASA Technical Reports Server (NTRS)
Baker, W. E.; Atlas, R.; Halem, M.; Susskind, J.
1983-01-01
A series of experiments have been conducted to examine the sensitivity of forecast skill to various data and data analysis techniques for the 0000 GMT case of January 21, 1979. These include the individual components of the FGGE observing system, the temperatures obtained with different satellite retrieval methods, and the method of vertical interpolation between the mandatory pressure analysis levels and the model sigma levels. It is found that NESS TIROS-N infrared retrievals seriously degrade a rawinsonde-only analysis over land, resulting in a poorer forecast over North America. Less degradation in the 72-hr forecast skill at sea level and some improvement at 500 mb is noted, relative to the control with TIROS-N retrievals produced with a physical inversion method which utilizes a 6-hr forecast first guess. NESS VTPR oceanic retrievals lead to an improved forecast over North America when added to the control.
Gershunov, A.; Barnett, T.P.; Cayan, D.R.; Tubbs, T.; Goddard, L.
2000-01-01
Three long-range forecasting methods have been evaluated for prediction and downscaling of seasonal and intraseasonal precipitation statistics in California. Full-statistical, hybrid dynamical-statistical and full-dynamical approaches have been used to forecast El Niño-Southern Oscillation (ENSO)-related total precipitation, daily precipitation frequency, and average intensity anomalies during the January-March season. For El Niño winters, the hybrid approach emerges as the best performer, while La Niña forecasting skill is poor. The full-statistical forecasting method features reasonable forecasting skill for both La Niña and El Niño winters. The performance of the full-dynamical approach could not be evaluated as rigorously as that of the other two forecasting schemes. Although the full-dynamical forecasting approach is expected to outperform simpler forecasting schemes in the long run, evidence is presented to conclude that, at present, the full-dynamical forecasting approach is the least viable of the three, at least in California. The authors suggest that operational forecasting of any intraseasonal temperature, precipitation, or streamflow statistic derivable from the available records is possible now for ENSO-extreme years.
The Second NWRA Flare-Forecasting Comparison Workshop: Methods Compared and Methodology
NASA Astrophysics Data System (ADS)
Leka, K. D.; Barnes, G.; the Flare Forecasting Comparison Group
2013-07-01
The Second NWRA Workshop to compare methods of solar flare forecasting was held 2-4 April 2013 in Boulder, CO. This is a follow-on to the First NWRA Workshop on Flare Forecasting Comparison, also known as the "All-Clear Forecasting Workshop", held in 2009 jointly with NASA/SRAG and NOAA/SWPC. For this most recent workshop, many researchers who are active in the field participated, and diverse methods were represented in terms of both the characterization of the Sun and the statistical approaches used to create a forecast. A standard dataset was created for this investigation, using data from the Solar Dynamics Observatory/Helioseismic and Magnetic Imager (SDO/HMI) vector magnetic field HARP series. For each HARP on each day, 6 hours of data were used, allowing for nominal time-series analysis to be included in the forecasts. We present here a summary of the forecasting methods that participated and the standardized dataset that was used. Funding for the workshop and the data analysis was provided by NASA/Living with a Star contract NNH09CE72C and NASA/Guest Investigator contract NNH12CG10C.
Convective Weather Forecast Quality Metrics for Air Traffic Management Decision-Making
NASA Technical Reports Server (NTRS)
Chatterji, Gano B.; Gyarfas, Brett; Chan, William N.; Meyn, Larry A.
2006-01-01
Since numerical weather prediction models are unable to accurately forecast the severity and the location of storm cells several hours into the future when compared with observation data, there has been growing interest in probabilistic descriptions of convective weather. The classical approach for generating uncertainty bounds consists of integrating the state equations and covariance propagation equations forward in time. This step is readily recognized as the process update step of the Kalman Filter algorithm. The second well-known method, the Monte Carlo method, consists of generating output samples by driving the forecast algorithm with input samples selected from distributions. The statistical properties of the distributions of the output samples are then used for defining the uncertainty bounds of the output variables. This method is computationally expensive for a complex model compared to the covariance propagation method. The main advantage of the Monte Carlo method is that a complex non-linear model can be easily handled. Recently, a few different methods for probabilistic forecasting have appeared in the literature. A method for computing the probability of convection in a region using forecast data is described in Ref. 5. Probability at a grid location is computed as the fraction of grid points, within a box of specified dimensions around the grid location, with forecast convective precipitation exceeding a specified threshold. The main limitation of this method is that the results are dependent on the chosen dimensions of the box. The examples presented in Ref. 5 show that this process is equivalent to low-pass filtering of the forecast data with a finite-support spatial filter. References 6 and 7 describe the technique of computing percentage coverage within a 92 x 92 square-kilometer box and assigning the value to the center 4 x 4 square-kilometer box. This technique is the same as that described in Ref. 5.
Characterizing the forecast, following the process described in Refs. 5 through 7, in terms of percentage coverage or confidence level is notionally sound compared to characterizing it in terms of probabilities, because the probability of the forecast being correct can only be determined using actual observations. References 5 through 7 only use the forecast data and not the observations. The method for computing the probability of detection, false alarm ratio and several forecast quality metrics (skill scores) using both the forecast and observation data is given in Ref. 2. This paper extends the statistical verification method in Ref. 2 to determine co-occurrence probabilities. The method consists of computing the probability that a severe weather cell (grid location) is detected in the observation data in the neighborhood of the severe weather cell in the forecast data. Probabilities of occurrence at the grid location and in its neighborhood with higher severity, and with lower severity, in the observation data compared to that in the forecast data are examined. The method proposed in Refs. 5 through 7 is used for computing the probability that a certain number of cells in the neighborhood of severe weather cells in the forecast data are seen as severe weather cells in the observation data. Finally, the probability of existence of gaps in the observation data in the neighborhood of severe weather cells in forecast data is computed. Gaps are defined as openings between severe weather cells through which an aircraft can safely fly to its intended destination. The rest of the paper is organized as follows. Section II summarizes the statistical verification method described in Ref. 2. The extension of this method for computing the co-occurrence probabilities is discussed in Section III. Numerical examples using NCWF forecast data and NCWD observation data are presented in Section III to elucidate the characteristics of the co-occurrence probabilities.
This section also discusses the procedure for computing the probabilities that the severity of convection in the observation data will be higher or lower in the neighborhood of grid locations compared to that indicated at the grid locations in the forecast data. The probability of coverage of neighborhood grid cells is also described via examples in this section. Section IV discusses the gap detection algorithm and presents a numerical example to illustrate the method. The locations of the detected gaps in the observation data are used along with the locations of convective weather cells in the forecast data to determine the probability of existence of gaps in the neighborhood of these cells. Finally, the paper is concluded in Section V.
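The Ref. 5 box-coverage computation lends itself to a compact sketch. The function below is illustrative only (the box size and threshold are assumed values, not from the paper): at each grid point it computes the fraction of surrounding points whose forecast exceeds a threshold, which is exactly a box-filter smoothing of the binary exceedance field.

```python
import numpy as np

def coverage_probability(forecast, threshold, box=3):
    """Fraction of grid points within a (box x box) window around each
    cell whose forecast precipitation exceeds `threshold`.

    Equivalent to low-pass filtering the binary exceedance field with a
    finite-support spatial box filter, as noted for the Ref. 5 method.
    """
    exceed = (forecast >= threshold).astype(float)
    pad = box // 2
    padded = np.pad(exceed, pad, mode="constant")
    out = np.empty_like(exceed)
    rows, cols = exceed.shape
    for i in range(rows):
        for j in range(cols):
            out[i, j] = padded[i:i + box, j:j + box].mean()
    return out
```

Because the result changes with `box`, the sketch also makes the stated limitation concrete: the computed "probability" depends on the chosen window dimensions.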
A Load-Based Temperature Prediction Model for Anomaly Detection
NASA Astrophysics Data System (ADS)
Sobhani, Masoud
Electric load forecasting, as a basic requirement for decision-making in power utilities, has been improved in various aspects in the past decades. Many factors may affect the accuracy of the load forecasts, such as data quality, goodness of the underlying model and load composition. Due to the strong correlation between the input variables (e.g., weather and calendar variables) and the load, the quality of input data plays a vital role in forecasting practices. Even if the forecasting model were able to capture most of the salient features of the load, low-quality input data may result in inaccurate forecasts. Most of the data cleansing efforts in the load forecasting literature have been devoted to the load data. Few studies have focused on weather data cleansing for load forecasting. This research proposes an anomaly detection method for the temperature data. The method consists of two components: a load-based temperature prediction model and a detection technique. The effectiveness of the proposed method is demonstrated through two case studies: one based on the data from the Global Energy Forecasting Competition 2014, and the other based on the data published by ISO New England. The results show that by removing the detected observations from the original input data, the final load forecast accuracy is enhanced.
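As a rough illustration of the two-component idea (a load-based temperature prediction model plus a detection technique), the sketch below predicts temperature from load with a simple polynomial fit and flags large standardized residuals. The functional form and the 3-sigma threshold are assumptions for illustration; the paper's actual model is more elaborate.

```python
import numpy as np

def flag_temperature_anomalies(load, temperature, z_thresh=3.0):
    """Flag temperature observations inconsistent with the observed load.

    Assumed form, not the paper's exact model: fit a quadratic
    load-to-temperature relationship, then flag observations whose
    residuals exceed z_thresh standard deviations.
    """
    coeffs = np.polyfit(load, temperature, deg=2)
    predicted = np.polyval(coeffs, load)
    resid = temperature - predicted
    z = (resid - resid.mean()) / resid.std()
    return np.abs(z) > z_thresh
```

Removing the flagged observations before refitting the load forecast model is the cleansing step the abstract describes.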
A new accuracy measure based on bounded relative error for time series forecasting
Chen, Chao; Twycross, Jamie; Garibaldi, Jonathan M.
2017-01-01
Many accuracy measures have been proposed in the past for time series forecasting comparisons. However, many of these measures suffer from one or more issues such as poor resistance to outliers and scale dependence. In this paper, while summarising commonly used accuracy measures, a special review is made on the symmetric mean absolute percentage error. Moreover, a new accuracy measure called the Unscaled Mean Bounded Relative Absolute Error (UMBRAE), which combines the best features of various alternative measures, is proposed to address the common issues of existing measures. A comparative evaluation on the proposed and related measures has been made with both synthetic and real-world data. The results indicate that the proposed measure, with user selectable benchmark, performs as well as or better than other measures on selected criteria. Though it has been commonly accepted that there is no single best accuracy measure, we suggest that UMBRAE could be a good choice to evaluate forecasting methods, especially for cases where measures based on geometric mean of relative errors, such as the geometric mean relative absolute error, are preferred. PMID:28339480
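The measure itself is simple to compute. The sketch below follows the bounded-relative-error construction described above (each forecast error is bounded by comparing it against the corresponding benchmark error before averaging, and the mean is then unscaled); the exact formula should be checked against the paper.

```python
import numpy as np

def umbrae(actual, forecast, benchmark):
    """Unscaled Mean Bounded Relative Absolute Error (sketch).

    `benchmark` is the user-selectable benchmark forecast, e.g. the
    naive (random-walk) forecast.
    """
    e = np.abs(np.asarray(actual, float) - np.asarray(forecast, float))
    e_star = np.abs(np.asarray(actual, float) - np.asarray(benchmark, float))
    # Bounded relative absolute error lies in [0, 1], resisting outliers
    brae = e / (e + e_star)
    mbrae = brae.mean()
    # Unscale so that a value of 1.0 means "same error as the benchmark"
    return mbrae / (1.0 - mbrae)
```

A value below 1.0 indicates the forecast outperforms the benchmark; a perfect forecast gives 0.0.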
NASA Astrophysics Data System (ADS)
Beckers, J.; Weerts, A.; Tijdeman, E.; Welles, E.; McManamon, A.
2013-12-01
To provide reliable and accurate seasonal streamflow forecasts for water resources management, several operational hydrologic agencies and hydropower companies around the world use the Extended Streamflow Prediction (ESP) procedure. The ESP in its original implementation does not accommodate any additional information that the forecaster may have about expected deviations from climatology in the near future. Several attempts have been made to improve the skill of the ESP forecast, especially for areas affected by teleconnections (e.g., ENSO, PDO), via selection (Hamlet and Lettenmaier, 1999) or weighting schemes (Werner et al., 2004; Wood and Lettenmaier, 2006; Najafi et al., 2012). A disadvantage of such schemes is that they lead to a reduction of the signal-to-noise ratio of the probabilistic forecast. To overcome this, we propose a resampling method conditional on climate indices to generate meteorological time series to be used in the ESP. The method can be used to generate a large number of meteorological ensemble members in order to improve the statistical properties of the ensemble. The effectiveness of the method was demonstrated in a real-time operational hydrologic seasonal forecast system for the Columbia River basin operated by the Bonneville Power Administration. The forecast skill of the k-nn resampler was tested against the original ESP for three basins at the long-range seasonal time scale. The Brier skill score (BSS) and the continuous ranked probability skill score (CRPSS) were used to compare the results to those of the original ESP method. Positive forecast skill scores were found for the resampler method conditioned on different indices for the prediction of spring peak flows in the Dworshak and Hungry Horse basins. For the Libby Dam basin, however, no improvement in skill was found. The proposed resampling method is a promising practical approach that can add skill to ESP forecasts at the seasonal time scale.
Further improvement is possible by fine tuning the method and selecting the most informative climate indices for the region of interest.
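A minimal version of the proposed conditional resampling can be sketched as follows. The distance metric, the neighbour count, and the classic 1/j kernel weighting are assumptions for illustration, not the scheme as implemented for the Columbia River basin.

```python
import numpy as np

def knn_resample_years(index_now, index_history, years, k=5,
                       n_members=100, seed=0):
    """k-nn resampler conditioned on a climate index (illustrative).

    Find the k historical years whose index value (e.g. an ENSO index)
    is closest to the current value, then draw trace years from them
    with weights decreasing in nearness rank.
    """
    rng = np.random.default_rng(seed)
    dist = np.abs(np.asarray(index_history, float) - index_now)
    nearest = np.argsort(dist)[:k]
    # Standard k-nn kernel: weight the j-th nearest neighbour by 1/j
    w = 1.0 / np.arange(1, k + 1)
    w /= w.sum()
    return rng.choice(np.asarray(years)[nearest], size=n_members, p=w)
```

The sampled years index historical meteorological traces, which then drive the hydrologic model as in ordinary ESP; drawing many members improves the statistical properties of the ensemble, as the abstract notes.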
Replacement Beef Cow Valuation under Data Availability Constraints
Hagerman, Amy D.; Thompson, Jada M.; Ham, Charlotte; Johnson, Kamina K.
2017-01-01
Economists are often tasked with estimating the benefits or costs associated with livestock production losses; however, lack of available data or absence of consistent reporting can reduce the accuracy of these valuations. This work looks at three potential estimation techniques for determining the value of replacement beef cows with varying types of market data to proxy constrained data availability, and discusses the potential margin of error for each technique. Oklahoma bred replacement cows are valued using hedonic pricing based on Oklahoma bred cow data (a best case scenario), vector error correction modeling (VECM) based on national cow sales data, and cost of production (COP) based on just a representative enterprise budget and very limited sales data. Each method was then used to perform a within-sample forecast of January to December 2016, and the forecasts are compared with the 2016 monthly observed market prices in Oklahoma using the mean absolute percent error (MAPE). Hedonic pricing methods tend to overvalue for within-sample forecasting but performed best, as measured by MAPE, for high quality cows. The VECM tended to undervalue cows but performed best for younger animals. COP performed well compared with the more data-intensive methods. Examining each method individually across eight representative replacement beef female types, the VECM forecast resulted in a MAPE under 10% for 33% of forecasted months, followed by hedonic pricing at 24% of the forecasted months and COP at 14% of the forecasted months for average quality beef females. For high quality females, the hedonic pricing method worked best, producing a MAPE under 10% in 36% of the forecasted months, followed by the COP method at 21% of months and the VECM at 14% of the forecasted months.
These results suggested that livestock valuation method selection was not one-size-fits-all and may need to vary based not only on the data available but also on the characteristics (e.g., quality or age) of the livestock being valued. PMID:29164141
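For reference, the MAPE criterion used to compare the three valuation methods above is straightforward to compute (a minimal sketch, assuming observed prices are nonzero):

```python
import numpy as np

def mape(observed, forecast):
    """Mean absolute percent error between observed and forecast values."""
    observed = np.asarray(observed, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(np.abs((observed - forecast) / observed))
```

Applying it month by month and counting the share of months under 10% reproduces the comparison criterion used in the study.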
NASA Astrophysics Data System (ADS)
Shukla, S.; McEvoy, D.; Hobbins, M.; Husak, G. J.; Huntington, J. L.; Funk, C.; Verdin, J.; Macharia, D.
2017-12-01
The Famine Early Warning Systems Network (FEWS NET) team provides food insecurity outlooks for several developing countries in Africa, Central Asia, and Central America. Thus far, in terms of the agroclimatic conditions that influence food insecurity, FEWS NET's primary focus has been on seasonal precipitation forecasts, without adequately accounting for atmospheric evaporative demand, which is also directly related to agricultural production and hence food insecurity, and is most often estimated by reference evapotranspiration (ETo). This presentation reports on the development of a new global ETo seasonal reforecast and its skill evaluation, with a particular emphasis on the potential use of this dataset by FEWS NET to support food insecurity early warning. The ETo reforecasts span the 1982-2009 period and are calculated following the ASCE formulation of the Penman-Monteith method, driven by seasonal climate forecasts of monthly mean temperature, humidity, wind speed, and solar radiation from NCEP's CFSv2 and NASA's GEOS-5 models. The skill evaluation using deterministic and probabilistic scores focuses on the December-February (DJF), March-May (MAM), June-August (JJA) and September-November (SON) seasons. The results indicate that ETo forecasts are a promising tool for early warning of drought and food insecurity. The FEWS NET regions with a promising level of skill (correlation >0.35 at lead times of 3 months) include Northern Sub-Saharan Africa (DJF, dry season), Central America (DJF, dry season), parts of East Africa (JJA, wet season), Southern Africa (JJA, dry season), and Central Asia (MAM, wet season). A case study over parts of East Africa for the JJA season shows that, in combination with precipitation forecasts, ETo forecasts could have provided early warning of recent severe drought events (e.g., 2002, 2004, 2009) that contributed to substantial food insecurity in the region.
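The ASCE standardized Penman-Monteith equation mentioned above can be sketched for a daily, short (grass) reference. This is a simplified illustration: net radiation, actual vapour pressure, and 2-m wind are taken as given inputs rather than derived from the forecast model fields, and the constants are those of the daily short-reference form.

```python
import math

def eto_asce_short(t_mean, rn, u2, ea, pressure=101.3, g=0.0):
    """Daily reference evapotranspiration (mm/day), ASCE standardized
    Penman-Monteith, short (grass) reference.

    t_mean: mean air temperature (deg C); rn: net radiation
    (MJ m-2 day-1); u2: 2-m wind speed (m/s); ea: actual vapour
    pressure (kPa); pressure: surface pressure (kPa); g: soil heat
    flux (MJ m-2 day-1).
    """
    # Saturation vapour pressure and the slope of its curve
    es = 0.6108 * math.exp(17.27 * t_mean / (t_mean + 237.3))
    delta = 4098.0 * es / (t_mean + 237.3) ** 2
    gamma = 0.000665 * pressure  # psychrometric constant (kPa/degC)
    num = (0.408 * delta * (rn - g)
           + gamma * (900.0 / (t_mean + 273.0)) * u2 * (es - ea))
    den = delta + gamma * (1.0 + 0.34 * u2)
    return num / den
```

Driving this equation with forecast monthly means of temperature, humidity, wind and radiation, as the abstract describes, yields the seasonal ETo reforecast.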
A novel single-parameter approach for forecasting algal blooms.
Xiao, Xi; He, Junyu; Huang, Haomin; Miller, Todd R; Christakos, George; Reichwaldt, Elke S; Ghadouani, Anas; Lin, Shengpan; Xu, Xinhua; Shi, Jiyan
2017-01-01
Harmful algal blooms frequently occur globally, and forecasting could constitute an essential proactive strategy for bloom control. To decrease the cost of aquatic environmental monitoring and increase the accuracy of bloom forecasting, a novel single-parameter approach combining wavelet analysis with artificial neural networks (WNN) was developed and verified based on daily online monitoring datasets of algal density in the Siling Reservoir, China and Lake Winnebago, U.S.A. Firstly, a detailed modeling process was illustrated using the forecasting of cyanobacterial cell density in the Chinese reservoir as an example. Three WNN models occupying various prediction time intervals were optimized through model training using an early stopped training approach. All models performed well in fitting historical data and predicting the dynamics of cyanobacterial cell density, with the best model predicting cyanobacteria density one day ahead (r = 0.986 and mean absolute error = 0.103 × 10⁴ cells mL⁻¹). Secondly, the potential of this novel approach was further confirmed by the precise predictions of algal biomass dynamics measured as chl a in both study sites, demonstrating its high performance in forecasting algal blooms, including cyanobacteria as well as other blooming species. Thirdly, the WNN model was compared to current algal forecasting methods (i.e., artificial neural networks and the autoregressive integrated moving average model), and was found to be more accurate. In addition, the application of this novel single-parameter approach is cost effective as it requires only a buoy-mounted fluorescent probe, which is merely a fraction (∼15%) of the cost of a typical auto-monitoring system. As such, the newly developed approach presents a promising and cost-effective tool for the future prediction and management of harmful algal blooms.
Integrating Remote Sensing and Disease Surveillance to Forecast Malaria Epidemics
NASA Astrophysics Data System (ADS)
Wimberly, M. C.; Beyane, B.; DeVos, M.; Liu, Y.; Merkord, C. L.; Mihretie, A.
2015-12-01
Advance information about the timing and locations of malaria epidemics can facilitate the targeting of resources for prevention and emergency response. Early detection methods can detect incipient outbreaks by identifying deviations from expected seasonal patterns, whereas early warning approaches typically forecast future malaria risk based on lagged responses to meteorological factors. A critical limiting factor for implementing either of these approaches is the need for timely and consistent acquisition, processing and analysis of both environmental and epidemiological data. To address this need, we have developed EPIDEMIA, an integrated system for surveillance and forecasting of malaria epidemics. The EPIDEMIA system includes a public health interface for uploading and querying weekly surveillance reports as well as algorithms for automatically validating incoming data and updating the epidemiological surveillance database. The newly released EASTWeb 2.0 software application automatically downloads, processes, and summarizes remotely sensed environmental data from multiple earth science data archives. EASTWeb was implemented as a component of the EPIDEMIA system, which combines the environmental monitoring data and epidemiological surveillance data into a unified database that supports both early detection and early warning models. Dynamic linear models implemented with Kalman filtering were used to carry out forecasting and model updating. Preliminary forecasts have been disseminated to public health partners in the Amhara Region of Ethiopia and will be validated and refined as the EPIDEMIA system ingests new data. In addition to continued model development and testing, future work will involve updating the public health interface to provide a broader suite of outbreak alerts and data visualization tools that are useful to our public health partners.
The EPIDEMIA system demonstrates a feasible approach to synthesizing the information from epidemiological surveillance systems and remotely-sensed environmental monitoring systems to improve malaria epidemic detection and forecasting.
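A dynamic linear model updated by Kalman filtering, as used for forecasting above, can be illustrated with a minimal local-level version. This is a toy stand-in: the EPIDEMIA models also incorporate lagged environmental covariates and seasonality.

```python
import numpy as np

def dlm_forecast(y, obs_var=1.0, state_var=0.1):
    """Local-level dynamic linear model with Kalman filtering.

    Each new case count updates the filtered level; the one-step-ahead
    forecast for time t is the level estimated from data up to t-1.
    """
    level, p = y[0], 1.0
    forecasts = []
    for obs in y[1:]:
        # Predict step: the level persists, state uncertainty grows
        p = p + state_var
        forecasts.append(level)
        # Update step: the Kalman gain blends forecast and observation
        k = p / (p + obs_var)
        level = level + k * (obs - level)
        p = (1.0 - k) * p
    return np.array(forecasts)
```

Because the filter updates sequentially, it naturally supports the weekly model-updating cycle described in the abstract as new surveillance reports arrive.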
Post-processing through linear regression
NASA Astrophysics Data System (ADS)
van Schaeybroeck, B.; Vannitsem, S.
2011-03-01
Various post-processing techniques are compared for both deterministic and ensemble forecasts, all based on linear regression between forecast data and observations. In order to evaluate the quality of the regression methods, three criteria are proposed, related to the effective correction of forecast error, the optimal variability of the corrected forecast, and multicollinearity. The regression schemes under consideration include the ordinary least-square (OLS) method, a new time-dependent Tikhonov regularization (TDTR) method, the total least-square method, a new geometric-mean regression (GM), a recently introduced error-in-variables (EVMOS) method and, finally, a "best member" OLS method. The advantages and drawbacks of each method are clarified. These techniques are applied in the context of the Lorenz '63 system, whose model version is affected by both initial condition and model errors. For short forecast lead times, the number and choice of predictors play an important role. Contrary to the other techniques, GM degrades when the number of predictors increases. At intermediate lead times, linear regression is unable to provide corrections to the forecast and can sometimes degrade the performance (GM and the best member OLS with noise). At long lead times, the regression schemes (EVMOS, TDTR) which yield the correct variability and the largest correlation between ensemble error and spread should be preferred.
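The common core of the schemes compared above is linear regression between forecasts and observations; the OLS baseline can be sketched in a few lines (illustrative only, for a single predictor; the paper's schemes differ in how they regularize or treat errors in both variables).

```python
import numpy as np

def ols_correct(forecasts, observations):
    """Fit obs ~ a + b * forecast on a training set by ordinary least
    squares and return a correction function for future forecasts."""
    A = np.column_stack([np.ones_like(forecasts), forecasts])
    (a, b), *_ = np.linalg.lstsq(A, observations, rcond=None)
    return lambda f: a + b * f
```

The known drawback this paper addresses is that the plain OLS correction shrinks forecast variability, which is why variants such as EVMOS and TDTR are compared on the variability criterion.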
Recursive least squares background prediction of univariate syndromic surveillance data.
Najmi, Amir-Homayoon; Burkom, Howard
2009-01-16
Surveillance of univariate syndromic data as a potential indicator of developing public health conditions has been used extensively. This paper aims to improve the performance of detecting outbreaks by using a background forecasting algorithm based on the adaptive recursive least squares (RLS) method combined with a novel treatment of the day-of-the-week effect. Previous work by the first author has suggested that univariate recursive least squares analysis of syndromic data can be used to characterize the background upon which a prediction and detection component of a biosurveillance system may be built. An adaptive implementation is used to deal with data non-stationarity. In this paper we develop and implement the RLS method for background estimation of univariate data. The distinctly dissimilar distribution of data for different days of the week, however, can affect filter implementations adversely, and so a novel procedure based on linear transformations of the sorted values of the daily counts is introduced. Seven-day-ahead daily predicted counts are used as background estimates. A signal injection procedure is used to examine the integrated algorithm's ability to detect synthetic anomalies in real syndromic time series. We compare the method to a baseline CDC forecasting algorithm known as the W2 method. We present detection results in the form of receiver operating characteristic curve values for four different injected signal-to-noise ratios using 16 sets of syndromic data. We find improvements in the false alarm probabilities when compared to the baseline W2 background forecasts. The current paper introduces a prediction approach for city-level biosurveillance data streams such as time series of outpatient clinic visits and sales of over-the-counter remedies. This approach uses RLS filters modified by a correction for the weekly patterns often seen in these data series, and a threshold detection algorithm from the residuals of the RLS forecasts.
We compare the detection performance of this algorithm to the W2 method recently implemented at CDC. The modified RLS method gives consistently better sensitivity at multiple background alert rates, and we recommend that it should be considered for routine application in bio-surveillance systems.
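A generic adaptive RLS background predictor (without the paper's day-of-week transformation) can be sketched as follows; the filter order, forgetting factor, and initialization are illustrative choices, not the values used in the study.

```python
import numpy as np

def rls_predict(y, order=7, lam=0.98, delta=100.0):
    """One-step-ahead recursive least squares background prediction.

    Predicts each count from the previous `order` counts; the
    forgetting factor lam < 1 lets the filter track non-stationary
    data, as in the adaptive implementation described above.
    """
    w = np.zeros(order)
    P = delta * np.eye(order)       # inverse-correlation estimate
    preds = np.full(len(y), np.nan)
    for t in range(order, len(y)):
        x = y[t - order:t][::-1].astype(float)  # most recent lag first
        preds[t] = w @ x
        err = y[t] - preds[t]
        k = P @ x / (lam + x @ P @ x)           # gain vector
        w = w + k * err
        P = (P - np.outer(k, x @ P)) / lam
    return preds
```

Outbreak detection then operates on the residuals `y - preds`, with an alarm threshold tuned to the desired false alarm rate.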
NASA Astrophysics Data System (ADS)
Boué, A.; Lesage, P.; Cortés, G.; Valette, B.; Reyes-Dávila, G.; Arámbula-Mendoza, R.; Budi-Santoso, A.
2016-11-01
Most attempts of deterministic eruption forecasting are based on the material Failure Forecast Method (FFM). This method assumes that a precursory observable, such as the rate of seismic activity, can be described by a simple power law which presents a singularity at a time close to the eruption onset. Until now, this method has been applied only in a small number of cases, generally for forecasts in hindsight. In this paper, a rigorous Bayesian approach of the FFM designed for real-time applications is applied. Using an automatic recognition system, seismo-volcanic events are detected and classified according to their physical mechanism and time series of probability distributions of the rates of events are calculated. At each time of observation, a Bayesian inversion provides estimations of the exponent of the power law and of the time of eruption, together with their probability density functions. Two criteria are defined in order to evaluate the quality and reliability of the forecasts. Our automated procedure has allowed the analysis of long, continuous seismic time series: 13 years from Volcán de Colima, Mexico, 10 years from Piton de la Fournaise, Reunion Island, France, and several months from Merapi volcano, Java, Indonesia. The new forecasting approach has been applied to 64 pre-eruptive sequences which present various types of dominant seismic activity (volcano-tectonic or long-period events) and patterns of seismicity with different level of complexity. This has allowed us to test the FFM assumptions, to determine in which conditions the method can be applied, and to quantify the success rate of the forecasts. 62% of the precursory sequences analysed are suitable for the application of FFM and half of the total number of eruptions are successfully forecast in hindsight. In real-time, the method allows for the successful forecast of 36% of all the eruptions considered. 
Nevertheless, real-time forecasts are successful for 83% of the cases that fulfil the reliability criteria. Therefore, good confidence in the method is obtained when the reliability criteria are met.
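The FFM power law can be inverted in the classic graphical way: for the commonly assumed exponent of 2 in Voight's relation, the event rate behaves as a constant over (tf − t), so the inverse rate decays linearly toward zero at the eruption time tf, and a linear fit yields the forecast. This is a deliberate simplification of the paper's Bayesian inversion, which also estimates the exponent and full probability densities.

```python
import numpy as np

def ffm_eruption_time(times, rates):
    """Classic FFM inverse-rate method, assuming exponent alpha = 2:
    fit a line to the inverse event rate and return its zero crossing,
    the forecast eruption onset time."""
    inv = 1.0 / np.asarray(rates, dtype=float)
    slope, intercept = np.polyfit(times, inv, 1)
    return -intercept / slope
```

In practice the fit quality and the stability of the estimate as new events arrive serve as the kind of reliability criteria the abstract evaluates.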
Buitrago, Jaime; Asfour, Shihab
2017-01-01
Short-term load forecasting is crucial for the operations planning of an electrical grid. Forecasting the next 24 h of electrical load in a grid allows operators to plan and optimize their resources. The purpose of this study is to develop a more accurate short-term load forecasting method utilizing non-linear autoregressive artificial neural networks (ANN) with exogenous multi-variable input (NARX). The proposed implementation of the network is new: the neural network is trained in open-loop using actual load and weather data, and then, the network is placed in closed-loop to generate a forecast using the predicted load as the feedback input. Unlike the existing short-term load forecasting methods using ANNs, the proposed method uses its own output as the input in order to improve the accuracy, thus effectively implementing a feedback loop for the load, making it less dependent on external data. Using the proposed framework, mean absolute percent errors in the forecast in the order of 1% have been achieved, which is a 30% improvement on the average error using feedforward ANNs, ARMAX and state space methods, which can result in large savings by avoiding commissioning of unnecessary power plants. Finally, the New England electrical load data are used to train and validate the forecast prediction.
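The open-loop/closed-loop distinction is the heart of the NARX scheme above. The toy below replaces the neural network with a linear autoregression to keep the sketch short: training uses actual lagged loads (open loop, teacher forcing), while forecasting feeds the model's own predictions back in as lag inputs (closed loop).

```python
import numpy as np

def train_open_loop(series, order=2):
    """Fit a linear autoregression on actual (teacher-forced) lags,
    the open-loop analogue of the NARX training stage."""
    X = np.column_stack([series[i:len(series) - order + i]
                         for i in range(order)])  # lags, oldest first
    y = series[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def forecast_closed_loop(series, coeffs, steps):
    """Closed-loop forecasting: feed predictions back as lag inputs,
    as in the NARX feedback configuration."""
    hist = list(series[-len(coeffs):])
    out = []
    for _ in range(steps):
        nxt = float(np.dot(coeffs, hist))
        out.append(nxt)
        hist = hist[1:] + [nxt]   # the forecast becomes the newest lag
    return out
```

A real NARX network would add the exogenous weather inputs and a nonlinear hidden layer, but the training-versus-forecasting loop structure is the same.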
Statistical Correction of Air Temperature Forecasts for City and Road Weather Applications
NASA Astrophysics Data System (ADS)
Mahura, Alexander; Petersen, Claus; Sass, Bent; Gilet, Nicolas
2014-05-01
The method for statistical correction of air and road surface temperature forecasts was developed based on analysis of long-term time series of meteorological observations and forecasts (from the HIgh Resolution Limited Area Model and the Road Conditions Model; 3 km horizontal resolution). It was tested for May-Aug 2012 and Oct 2012 - Mar 2013, respectively. The developed method is based mostly on forecasted meteorological parameters, with a minimal inclusion of observations (covering only a pre-history period). Although the first-iteration correction takes relevant temperature observations into account, the further adjustment of air and road temperature forecasts is based purely on forecasted meteorological parameters. The method is model independent, i.e. it can be applied for temperature correction with other types of models having different horizontal resolutions. It is relatively fast due to application of the singular value decomposition method for solving the matrix system to find the coefficients. Moreover, there is always a possibility for additional improvement due to extra tuning of the temperature forecasts for some locations (stations), in particular where the MAEs are generally higher compared with others (see Gilet et al., 2014). For city weather applications, a new operationalized procedure for statistical correction of air temperature forecasts has been elaborated and implemented for the HIRLAM-SKA model runs at 00, 06, 12, and 18 UTC, covering forecast lengths up to 48 hours. The procedure includes segments for extraction of observations and forecast data, assigning these to forecast lengths, statistical correction of temperature, one- and multi-day statistical evaluation of model performance, decision-making on using corrections by station, interpolation, visualisation and storage/backup. Pre-operational air temperature correction runs have been performed for mainland Denmark since mid-April 2013 and have shown good results.
Tests also showed that the CPU time required for the operational procedure is relatively short (less than 15 minutes, including a large share spent on interpolation). They also showed that in order to start correcting forecasts there is no need for long-term pre-historical data (containing forecasts and observations); at least a couple of weeks is sufficient when a new observational station is included and added to the forecast point. For the road weather application, the operationalization of the statistical correction of road surface temperature forecasts (for the RWM system's daily hourly runs covering forecast lengths up to 5 hours ahead) for the Danish road network (about 400 road stations) was also implemented, and it has been running in test mode since Sep 2013. The method can also be applied for correction of the dew point temperature and wind speed (as parts of observations/forecasts at synoptic stations), since both of these meteorological parameters are parts of the proposed system of equations. The evaluation of the method's performance for improvement of wind speed forecasts is planned as well, with consideration of possibilities for wind direction improvements (which is more complex due to the multi-modal distribution of such data). The method works for the entire domain of mainland Denmark (tested for 60 synoptic and 395 road stations), and hence it can also be applied for any geographical point within this domain, such as through interpolation to about 100 cities' locations (for the Danish national byvejr forecasts). Moreover, we can assume that the same method can be used in other geographical areas. Evaluation for other domains (with a focus on Greenland and the Nordic countries) is planned. In addition, a similar approach might also be tested for statistical correction of concentrations of chemical species, but such an approach will require additional elaboration and evaluation.
Performance of univariate forecasting on seasonal diseases: the case of tuberculosis.
Permanasari, Adhistya Erna; Rambli, Dayang Rohaya Awang; Dominic, P Dhanapal Durai
2011-01-01
Predicting the annual disease incidence worldwide is desirable for adopting appropriate policies to prevent disease outbreaks. This chapter considers the performance of different forecasting methods in predicting the future number of disease incidences, especially for seasonal diseases. Six forecasting methods, namely linear regression, moving average, decomposition, Holt-Winters, ARIMA, and artificial neural network (ANN), were used for disease forecasting on monthly tuberculosis data. The model derived met the requirements of a time series with a seasonal pattern and a downward trend. The forecasting performance was compared using the same error measures on the basis of the last 5 years of forecast results. The findings indicate that the ARIMA model was the most appropriate model, since it obtained a lower relative error than the other models.
Study on load forecasting to data centers of high power density based on power usage effectiveness
NASA Astrophysics Data System (ADS)
Zhou, C. C.; Zhang, F.; Yuan, Z.; Zhou, L. M.; Wang, F. M.; Li, W.; Yang, J. H.
2016-08-01
Data centers usually have considerable energy consumption. Load forecasting for data centers helps in formulating regional load density indexes and is of great benefit to making regional spatial load forecasting more accurate. The building structure and other influential factors, i.e. equipment, geographic and climatic conditions, are considered for the data centers, and a method to forecast the load of data centers based on power usage effectiveness is proposed. In this method, the cooling capacity of a data center and the power usage effectiveness index are used to forecast the power load of the data center. The cooling capacity is obtained by calculating the heat load of the data center. The index is estimated using the group decision-making method of mixed language information. An example is given to demonstrate the applicability and accuracy of this method.
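The core accounting behind the method can be reduced to one line, under a simplifying assumption that is ours, not the paper's: the cooling capacity approximates the IT equipment heat load (nearly all IT power is dissipated as heat), and scaling by the estimated power usage effectiveness (PUE) gives total facility power. The paper's estimation of the PUE index via group decision-making is more involved.

```python
def datacenter_power_load(cooling_capacity_kw, pue):
    """Rough facility power forecast from cooling capacity and PUE.

    Assumes heat removed by cooling ~ IT power drawn, so total
    facility power = IT power * PUE (PUE = total power / IT power).
    """
    it_load_kw = cooling_capacity_kw
    return it_load_kw * pue

# e.g. 1 MW of cooling at an estimated PUE of 1.6
total_kw = datacenter_power_load(1000.0, 1.6)
```

The forecasting problem then shifts to estimating the heat load from building structure, equipment, and climate, and the PUE index from expert judgment, as the abstract describes.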
Chen, Shyi-Ming; Manalu, Gandhi Maruli Tua; Pan, Jeng-Shyang; Liu, Hsiang-Chuan
2013-06-01
In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and particle swarm optimization (PSO) techniques. First, we fuzzify the historical training data of the main factor and the secondary factor, respectively, to form two-factors second-order fuzzy logical relationships. Then, we group the two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Next, we obtain the optimal weighting vector for each fuzzy-trend logical relationship group by using PSO techniques to perform the forecasting. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index and the NTD/USD exchange rates. The experimental results show that the proposed method achieves better forecasting performance than the existing methods.
Calibration of decadal ensemble predictions
NASA Astrophysics Data System (ADS)
Pasternack, Alexander; Rust, Henning W.; Bhend, Jonas; Liniger, Mark; Grieger, Jens; Müller, Wolfgang; Ulbrich, Uwe
2017-04-01
Decadal climate predictions are of great socio-economic interest because of the corresponding planning horizons of several political and economic decisions. Owing to the uncertainties of weather and climate forecasts (e.g. initial condition uncertainty), they are issued in a probabilistic way. One issue frequently observed for probabilistic forecasts is that they tend not to be reliable, i.e. the forecast probabilities are not consistent with the relative frequencies of the associated observed events. Such forecasts therefore need to be re-calibrated. While re-calibration methods for seasonal time scales are available and frequently applied, these methods still have to be adapted to decadal time scales and their characteristic problems, such as the climate trend and the lead-time dependent bias. We therefore propose a method to re-calibrate decadal ensemble predictions that takes the above-mentioned characteristics into account. Finally, this method is applied to and validated with decadal forecasts from the MiKlip system (Germany's initiative for decadal prediction).
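One ingredient of such re-calibration, removing a lead-time dependent mean bias, can be sketched as follows. This is only an illustrative fragment under simplifying assumptions; the actual method also handles the climate trend and ensemble spread, which are omitted here:

```python
import numpy as np

def leadtime_bias_correct(hindcast, obs):
    """Remove a lead-time dependent mean bias from decadal hindcasts.

    hindcast : array (n_starts, n_leads), ensemble-mean hindcasts
    obs      : array (n_starts, n_leads), verifying observations
    Returns the bias-corrected hindcasts. A minimal sketch of one step
    of decadal re-calibration, not the full MiKlip procedure.
    """
    hindcast = np.asarray(hindcast, float)
    obs = np.asarray(obs, float)
    bias = (hindcast - obs).mean(axis=0)   # one mean bias per lead time
    return hindcast - bias
```

If the hindcasts differ from the observations by a constant offset at each lead time, the correction recovers the observations exactly.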
Ensemble Downscaling of Winter Seasonal Forecasts: The MRED Project
NASA Astrophysics Data System (ADS)
Arritt, R. W.; Mred Team
2010-12-01
The Multi-Regional climate model Ensemble Downscaling (MRED) project is a multi-institutional project that is producing large ensembles of downscaled winter seasonal forecasts from coupled atmosphere-ocean seasonal prediction models. Eight regional climate models each are downscaling 15-member ensembles from the National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) and the new NASA seasonal forecast system based on the GEOS5 atmospheric model coupled with the MOM4 ocean model. This produces 240-member ensembles, i.e., 8 regional models x 15 global ensemble members x 2 global models, for each winter season (December-April) of 1982-2003. Results to date show that combined global-regional downscaled forecasts have greatest skill for seasonal precipitation anomalies during strong El Niño events such as 1982-83 and 1997-98. Ensemble means of area-averaged seasonal precipitation for the regional models generally track the corresponding results for the global model, though there is considerable inter-model variability amongst the regional models. For seasons and regions where area mean precipitation is accurately simulated the regional models bring added value by extracting greater spatial detail from the global forecasts, mainly due to better resolution of terrain in the regional models. Our results also emphasize that an ensemble approach is essential to realizing the added value from the combined global-regional modeling system.
Concrete ensemble Kalman filters with rigorous catastrophic filter divergence
Kelly, David; Majda, Andrew J.; Tong, Xin T.
2015-01-01
The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature. PMID:26261335
Application of wavelet-based multi-model Kalman filters to real-time flood forecasting
NASA Astrophysics Data System (ADS)
Chou, Chien-Ming; Wang, Ru-Yih
2004-04-01
This paper presents the application of a multimodel method using a wavelet-based Kalman filter (WKF) bank to simultaneously estimate decomposed state variables and unknown parameters for real-time flood forecasting. Applying the Haar wavelet transform alters the state vector and input vector of the state space. In this way, an overall detail plus approximation describes each new state vector and input vector, which allows the WKF to simultaneously estimate and decompose state variables. The wavelet-based multimodel Kalman filter (WMKF) is a multimodel Kalman filter (MKF) in which each Kalman filter is replaced by a WKF. The WMKF then obtains M estimated state vectors. Next, the M state estimates, each weighted by its model probability, which is also determined on-line, are combined to form an optimal estimate. Validations conducted for the Wu-Tu watershed, a small watershed in Taiwan, have demonstrated that the method is effective because of the decomposition of the wavelet transform, the adaptation of the time-varying Kalman filter and the characteristics of the multimodel method. Validation results also reveal that the resulting method enhances the accuracy of the runoff prediction of the rainfall-runoff process in the Wu-Tu watershed.
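The combination step, M state estimates merged by on-line model probabilities, is a standard multimodel pattern and can be sketched as below. The Gaussian innovation likelihood used for the weight update is an assumption (it is the conventional choice, but the abstract does not specify the paper's exact rule):

```python
import numpy as np

def combine_estimates(estimates, weights):
    """Combine M state estimates into one optimal estimate, each
    weighted by its on-line model probability (weights sum to 1)."""
    estimates = np.asarray(estimates, float)   # (M, state_dim)
    weights = np.asarray(weights, float)
    weights = weights / weights.sum()
    return weights @ estimates

def update_model_weights(weights, innovations, innovation_vars):
    """Update model probabilities from each filter's scalar innovation
    using a Gaussian likelihood (a standard, assumed weighting rule)."""
    weights = np.asarray(weights, float)
    nu = np.asarray(innovations, float)
    s = np.asarray(innovation_vars, float)
    like = np.exp(-0.5 * nu**2 / s) / np.sqrt(2 * np.pi * s)
    post = weights * like
    return post / post.sum()
```

A model whose forecast matches the observation (small innovation) rapidly accumulates weight, so the combined estimate tracks the best-performing filter in the bank.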
A new scoring method for evaluating the performance of earthquake forecasts and predictions
NASA Astrophysics Data System (ADS)
Zhuang, J.
2009-12-01
This study presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. A fair scoring scheme should reward success in a way that is compatible with the risk taken. Suppose that we have a reference model, usually the Poisson model for usual cases or the Omori-Utsu formula for forecasting aftershocks, which gives the probability p0 that at least 1 event occurs in a given space-time-magnitude window. The forecaster, like a gambler, starts with a certain number of reputation points and bets 1 reputation point on "Yes" or "No" according to his forecast, or bets nothing if he makes a NA-prediction. If the forecaster bets 1 reputation point on "Yes" and loses, the number of his reputation points is reduced by 1; if his forecast is successful, he is rewarded (1-p0)/p0 reputation points. The quantity (1-p0)/p0 is the return (reward/bet) ratio for bets on "Yes". In this way, if the reference model is correct, the expected return that he gains from this bet is 0. This rule also applies to probability forecasts. Suppose that p is the occurrence probability of an earthquake given by the forecaster. We can regard the forecaster as splitting 1 reputation point by betting p on "Yes" and 1-p on "No". In this way, the forecaster's expected pay-off based on the reference model is still 0. From the viewpoints of both the reference model and the forecaster, the rule for reward and punishment is fair. This method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest.
We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and when the reference model is the Poisson model.
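The deterministic scoring rule described above is simple to state in code. The payoff for a successful "Yes" bet, (1-p0)/p0, is taken directly from the abstract; the payoff p0/(1-p0) for a successful "No" bet is inferred by symmetry (it is what makes the expected return zero under the reference model) and should be read as an assumption:

```python
def gambling_score(bets, outcomes, p0s):
    """Cumulative reputation change for a sequence of yes/no forecasts.

    bets     : list of "yes", "no", or None (NA-prediction: nothing bet)
    outcomes : list of bool, True if >= 1 event occurred in the window
    p0s      : reference-model probabilities of >= 1 event per window
    A successful "yes" bet returns (1 - p0)/p0 points; a successful
    "no" bet returns p0/(1 - p0) (inferred by symmetry); any lost bet
    costs the 1 point staked.
    """
    score = 0.0
    for bet, occurred, p0 in zip(bets, outcomes, p0s):
        if bet is None:
            continue
        won = (bet == "yes") == occurred
        if not won:
            score -= 1.0
        elif bet == "yes":
            score += (1 - p0) / p0
        else:
            score += p0 / (1 - p0)
    return score
```

With p0 = 0.2, a successful "Yes" bet earns 4 reputation points, so under the reference model the expected return 0.2 * 4 - 0.8 * 1 is exactly zero.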
Data-driven forecasting algorithms for building energy consumption
NASA Astrophysics Data System (ADS)
Noh, Hae Young; Rajagopal, Ram
2013-04-01
This paper introduces two forecasting methods for building energy consumption data recorded by smart meters at high resolution. For utility companies, it is important to reliably forecast the aggregate consumption profile to determine the energy supply for the next day and prevent any crisis. The proposed methods forecast individual loads on the basis of their measurement history and weather data, without using complicated models of the building system. The first method is most efficient for very short-term prediction, such as a prediction period of one hour, and uses a simple adaptive time-series model. For longer-term prediction, a nonparametric Gaussian process is applied to forecast the load profiles and their uncertainty bounds a day ahead. These methods are computationally simple and adaptive, and thus suitable for analyzing large sets of data whose patterns change over time. The forecasting methods are applied to several sets of building energy consumption data for lighting and heating-ventilation-air-conditioning (HVAC) systems collected from a campus building at Stanford University. The measurements are collected every minute, and corresponding weather data are provided hourly. The results show that the proposed algorithms can predict energy consumption with high accuracy.
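For the very short-term case, a minimal adaptive time-series forecaster can be sketched with simple exponential smoothing. This is a stand-in under assumptions, not the paper's actual model (whose form and parameters the abstract does not specify), and it ignores the weather covariates:

```python
def adaptive_forecast(history, alpha=0.3):
    """One-step-ahead forecast by simple exponential smoothing.

    The smoothed level adapts to recent observations, so the forecast
    tracks pattern changes over time. alpha (0 < alpha <= 1) controls
    how quickly old data are forgotten; 0.3 is an arbitrary choice.
    """
    level = history[0]
    for x in history[1:]:
        level = alpha * x + (1 - alpha) * level
    return level
```

On a flat load profile the forecast reproduces the constant level; after a step change, the level converges toward the new value at a rate set by alpha.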
NASA Astrophysics Data System (ADS)
Schmidt, F.; Liu, S.
2016-12-01
Source water quality plays an important role in the safety of drinking water, and early detection of its contamination is vital to taking appropriate countermeasures. However, compared to drinking water, it is more difficult to detect contamination events in source water because its environment is less controlled and numerous natural causes contribute to a high variability of the background values. In this project, Artificial Neural Networks (ANNs) and a Contamination Event Detection Process (CED Process) were used to identify events in river water. The ANN models the response of basic water quality sensors obtained in laboratory experiments in an off-line learning stage and continuously forecasts future values of the time series in an on-line forecasting step. During this second stage, the CED Process compares the forecast to the measured value and classifies it as regular background or event value, which modifies the ANN's continuous learning and influences its forecasts. In addition to this basic setup, external information is fed to the CED Process: a so-called Operator Input (OI) is provided to flag unusual water quality levels that are unrelated to the presence of contamination, for example due to cooling water discharge from a nearby power plant. This study's primary goal is to evaluate how well the OI fits into the design of the combined forecasting ANN and CED Process and to understand its effects on the on-line forecasting stage. To test this, data from laboratory experiments conducted previously at the School of Environment, Tsinghua University, were used to perform simulations highlighting features and drawbacks of this method. Applying the OI has been shown to have a positive influence on the ANN's ability to handle a sudden change in background values that is unrelated to contamination. However, it might also mask the presence of an event, an issue that underlines the necessity of running several instances of the algorithm in parallel.
Other difficulties addressed in this study include the source and the format of the OI. This project tries to add to the ongoing research into algorithms for CED. It provides ideas for how results from the binary classification of time series could be evaluated in a more realistic fashion and shows what the advantages and limitations of such a method would be.
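The forecast-versus-measurement comparison at the heart of the CED Process can be sketched as a residual threshold test. This is a deliberately simplified stand-in (the real classifier also feeds its decision back into the ANN's learning and consumes the Operator Input; the threshold here is a hypothetical parameter):

```python
def classify_point(forecast, measured, threshold):
    """Label a measurement as background or event by comparing it with
    the forecast value: an event is declared when the absolute residual
    exceeds the threshold. A simplified sketch of the CED decision step."""
    residual = abs(measured - forecast)
    return "event" if residual > threshold else "background"
```

A measurement close to the forecast is treated as background and may be used for continued learning; a large deviation is flagged as a candidate contamination event.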
An information-theoretical perspective on weighted ensemble forecasts
NASA Astrophysics Data System (ADS)
Weijs, Steven V.; van de Giesen, Nick
2013-08-01
This paper presents an information-theoretical method for weighting ensemble forecasts with new information. Weighted ensemble forecasts can be used to adjust the distribution that an existing ensemble of time series represents, without modifying the values in the ensemble itself. The weighting can, for example, add new seasonal forecast information to an existing ensemble of historically measured time series that represents climatic uncertainty. A recent article in this journal compared several methods to determine the weights for the ensemble members and introduced the pdf-ratio method. In this article, a new method, the minimum relative entropy update (MRE-update), is presented. Based on the principle of minimum discrimination information, an extension of the principle of maximum entropy (POME), the method ensures that no more information is added to the ensemble than is present in the forecast. This is achieved by minimizing relative entropy, with the forecast information imposed as constraints. From this same perspective, an information-theoretical view on the various weighting methods is presented. The MRE-update is compared with the existing methods and the parallels with the pdf-ratio method are analysed. The paper provides a new, information-theoretical justification for one version of the pdf-ratio method that turns out to be equivalent to the MRE-update. All other methods result in sets of ensemble weights that, seen from the information-theoretical perspective, add either too little or too much (i.e. fictitious) information to the ensemble.
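A concrete special case of the MRE-update can be sketched for a single mean constraint and a uniform prior over ensemble members: minimize the relative entropy of the weights subject to the weighted ensemble mean matching the forecast value. The solution is an exponential tilting, here found by bisection on the Lagrange multiplier. The choice of constraint and prior is an assumption for illustration; the paper's formulation may impose different or additional constraints:

```python
import numpy as np

def mre_weights(x, target_mean, tol=1e-10):
    """Minimum relative entropy weights w_i (relative to uniform) with
    the constraint sum_i w_i * x_i = target_mean. The minimizer has the
    form w_i ~ exp(lam * x_i); lam is found by bisection, using the fact
    that the tilted mean increases monotonically with lam."""
    x = np.asarray(x, float)
    assert x.min() < target_mean < x.max(), "target must lie inside the ensemble range"

    def weighted_mean(lam):
        w = np.exp(lam * (x - x.mean()))   # shift exponent for stability
        w /= w.sum()
        return w, w @ x

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        _, m = weighted_mean(mid)
        if m < target_mean:
            lo = mid
        else:
            hi = mid
    w, _ = weighted_mean(0.5 * (lo + hi))
    return w
```

By construction the weights add exactly the information needed to satisfy the constraint and no more, which is the property the MRE-update is designed to guarantee.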
NASA Astrophysics Data System (ADS)
Liu, P.
2013-12-01
Quantitative analysis of the risk of reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts directly depict the inflows, capturing not only the marginal distributions but also their persistence via scenarios. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed that uses the forecast horizon point to divide the future time into two stages: the forecast lead-time and the unpredicted time. The risk within the forecast lead-time is computed by counting the number of failing forecast scenarios, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risk by defining the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where the parameter and precipitation uncertainties are implemented to produce ensemble-based hydrologic forecasts. Bayesian inference via Markov Chain Monte Carlo is used to account for the parameter uncertainty. Two reservoir operation schemes, the real operated and scenario optimization, are evaluated for flood risk and hydropower profit analysis. With the 2010 flood, it is found that improving the hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operation risk, and most risks come from the forecast lead-time. It is therefore valuable to decrease the variance of ensemble-based hydrologic forecasts while keeping their bias low for reservoir operational purposes.
Probing magma reservoirs to improve volcano forecasts
Lowenstern, Jacob B.; Sisson, Thomas W.; Hurwitz, Shaul
2017-01-01
When it comes to forecasting eruptions, volcano observatories rely mostly on real-time signals from earthquakes, ground deformation, and gas discharge, combined with probabilistic assessments based on past behavior [Sparks and Cashman, 2017]. There is comparatively less reliance on geophysical and petrological understanding of subsurface magma reservoirs.
On Manpower Forecasting. Methods for Manpower Analysis, No.2.
ERIC Educational Resources Information Center
Morton, J.E.
Some of the problems and techniques involved in manpower forecasting are discussed. This non-technical introduction to the field aims at reducing fears of data manipulation methods and at increasing respect for conceptual, logical, and analytical issues. The major approaches to manpower forecasting are explicated and evaluated under the headings:…
ERIC Educational Resources Information Center
Baker, Bruce D.; Richards, Craig E.
1999-01-01
Applies neural network methods for forecasting 1991-95 per-pupil expenditures in U.S. public elementary and secondary schools. Forecasting models included the National Center for Education Statistics' multivariate regression model and three neural architectures. Regarding prediction accuracy, neural network results were comparable or superior to…
Educational Forecasting Methodologies: State of the Art, Trends, and Highlights.
ERIC Educational Resources Information Center
Hudson, Barclay; Bruno, James
This overview of both quantitative and qualitative methods of educational forecasting is introduced by a discussion of a general typology of forecasting methods. In each of the following sections, discussion follows the same general format: a number of basic approaches are identified (e.g. extrapolation, correlation, systems modelling), and each…
Techniques for Forecasting Air Passenger Traffic
NASA Technical Reports Server (NTRS)
Taneja, N.
1972-01-01
The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.
Verification of operational solar flare forecast: Case of Regional Warning Center Japan
NASA Astrophysics Data System (ADS)
Kubo, Yûki; Den, Mitsue; Ishii, Mamoru
2017-08-01
In this article, we discuss a verification study of an operational solar flare forecast at the Regional Warning Center (RWC) Japan. RWC Japan has long issued four-categorical deterministic solar flare forecasts. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated some conventional scalar verification measures with 95% confidence intervals. We also estimated a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and the recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitively that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose using a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecasts, the proposed set of verification measures comprises frequency bias for bias; proportion correct and critical success index for accuracy; probability of detection for discrimination; false alarm ratio for reliability; Peirce skill score for forecast skill; and symmetric extremal dependence index for association.
For multi-categorical forecasts, we propose a set of verification measures comprising the marginal distributions of forecast and observation for bias; proportion correct for accuracy; correlation coefficient and joint probability distribution for association; the likelihood distribution for discrimination; the calibration distribution for reliability and resolution; and the Gandin-Murphy-Gerrity score and judgment skill score for skill.
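Most of the dichotomous measures named above follow standard definitions from a 2x2 contingency table (hits, misses, false alarms, correct negatives) and can be computed directly. The symmetric extremal dependence index is omitted here for brevity:

```python
def dichotomous_scores(hits, misses, false_alarms, correct_negatives):
    """Standard verification scores from a 2x2 contingency table, using
    the conventional definitions (a = hits, b = false alarms, c = misses,
    d = correct negatives)."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    pod = a / (a + c)                      # probability of detection
    pofd = b / (b + d)                     # probability of false detection
    return {
        "frequency_bias": (a + b) / (a + c),
        "proportion_correct": (a + d) / n,
        "critical_success_index": a / (a + b + c),
        "probability_of_detection": pod,
        "false_alarm_ratio": b / (a + b),
        "peirce_skill_score": pod - pofd,
    }
```

For example, with 30 hits, 20 misses, 10 false alarms and 40 correct negatives, POD = 0.6, FAR = 0.25 and the Peirce skill score = 0.4.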
Flare forecasting at the Met Office Space Weather Operations Centre
NASA Astrophysics Data System (ADS)
Murray, S. A.; Bingham, S.; Sharpe, M.; Jackson, D. R.
2017-04-01
The Met Office Space Weather Operations Centre produces 24/7/365 space weather guidance, alerts, and forecasts to a wide range of government and commercial end-users across the United Kingdom. Solar flare forecasts are one of its products, which are issued multiple times a day in two forms: forecasts for each active region on the solar disk over the next 24 h and full-disk forecasts for the next 4 days. Here the forecasting process is described in detail, as well as first verification of archived forecasts using methods commonly used in operational weather prediction. Real-time verification available for operational flare forecasting use is also described. The influence of human forecasters is highlighted, with human-edited forecasts outperforming original model results and forecasting skill decreasing over longer forecast lead times.
NASA Astrophysics Data System (ADS)
Gan, Chuen-Meei
Air quality model forecasts from the Weather Research and Forecasting (WRF) model and the Community Multiscale Air Quality (CMAQ) model are often used to support air quality applications such as regulatory issues and scientific inquiries into atmospheric science processes. In urban environments, these models become more complex due to the inherent complexity of the land surface coupling and the enhanced pollutant emissions. This makes it very difficult to diagnose the model if the surface parameter forecasts, such as PM2.5 (particulate matter with aerodynamic diameter less than 2.5 μm), are not accurate. For this reason, obtaining accurate boundary layer dynamic forecasts is as essential as quantifying realistic pollutant emissions. In this thesis, we explore the usefulness of vertical sounding measurements for assessing meteorological and air quality forecast models. In particular, we focus on assessing the WRF model (12 km x 12 km) coupled with the CMAQ model for the urban New York City (NYC) area using multiple vertical profiling and column-integrated remote sensing measurements. This assessment is helpful in probing the root causes of WRF-CMAQ overestimates of surface PM2.5 occurring both predawn and post-sunset in the NYC area during the summer. In particular, we find that the significant underestimate in the WRF PBL height forecast is a key factor in explaining this anomaly. On the other hand, the model predictions of the PBL height during daytime, when convective heating dominates, were found to be highly correlated with lidar-derived PBL height with minimal bias. Additional topics covered in this thesis include a mathematical method using a direct Mie scattering approach to convert aerosol microphysical properties from CMAQ into optical parameters, making direct comparisons with lidar and multispectral radiometers feasible. Finally, we explore some tentative ideas on combining visible (VIS) and mid-infrared (MIR) sensors to better separate aerosols into fine and coarse modes.
Forecasting Natural Gas Prices Using Wavelets, Time Series, and Artificial Neural Networks
Jin, Junghwan; Kim, Jinsoo
2015-01-01
Following the unconventional gas revolution, the forecasting of natural gas prices has become increasingly important because the association of these prices with those of crude oil has weakened. With this as motivation, we propose some modified hybrid models in which various combinations of the wavelet approximation, detail components, autoregressive integrated moving average, generalized autoregressive conditional heteroskedasticity, and artificial neural network models are employed to predict natural gas prices. We also emphasize the boundary problem in wavelet decomposition, and compare results that consider the boundary problem case with those that do not. The empirical results show that our suggested approach can handle the boundary problem, such that it facilitates the extraction of the appropriate forecasting results. The performance of the wavelet-hybrid approach was superior in all cases, whereas the application of detail components in the forecasting was only able to yield a small improvement in forecasting performance. Therefore, forecasting with only an approximation component would be acceptable, in consideration of forecasting efficiency. PMID:26539722
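The wavelet-hybrid idea, decompose the series, forecast the approximation component, can be illustrated in a toy form. This sketch uses a one-level Haar transform and an AR(1) forecaster as stand-ins; the paper's actual hybrids use ARIMA, GARCH, and ANN components and explicitly treat the wavelet boundary problem, none of which is modeled here:

```python
import numpy as np

def haar_level1(x):
    """One-level Haar decomposition into approximation and detail
    coefficients (series length must be even)."""
    x = np.asarray(x, float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # smooth/approximation part
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail/fluctuation part
    return a, d

def ar1_forecast(series):
    """Fit y_t = c + phi * y_{t-1} by least squares and forecast one
    step ahead -- a toy stand-in for the ARIMA/GARCH/ANN components."""
    y = np.asarray(series, float)
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    return c + phi * y[-1]
```

Forecasting only the approximation component, as the abstract suggests is often sufficient, would mean applying `ar1_forecast` (or a richer model) to `a` alone and discarding `d`.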
Iterative near-term ecological forecasting: Needs, opportunities, and challenges
Dietze, Michael C.; Fox, Andrew; Beck-Johnson, Lindsay; Betancourt, Julio L.; Hooten, Mevin B.; Jarnevich, Catherine S.; Keitt, Timothy H.; Kenney, Melissa A.; Laney, Christine M.; Larsen, Laurel G.; Loescher, Henry W.; Lunch, Claire K.; Pijanowski, Bryan; Randerson, James T.; Read, Emily; Tredennick, Andrew T.; Vargas, Rodrigo; Weathers, Kathleen C.; White, Ethan P.
2018-01-01
Two foundational questions about sustainability are “How are ecosystems and the services they provide going to change in the future?” and “How do human decisions affect these trajectories?” Answering these questions requires an ability to forecast ecological processes. Unfortunately, most ecological forecasts focus on centennial-scale climate responses, therefore neither meeting the needs of near-term (daily to decadal) environmental decision-making nor allowing comparison of specific, quantitative predictions to new observational data, one of the strongest tests of scientific theory. Near-term forecasts provide the opportunity to iteratively cycle between performing analyses and updating predictions in light of new evidence. This iterative process of gaining feedback, building experience, and correcting models and methods is critical for improving forecasts. Iterative, near-term forecasting will accelerate ecological research, make it more relevant to society, and inform sustainable decision-making under high uncertainty and adaptive management. Here, we identify the immediate scientific and societal needs, opportunities, and challenges for iterative near-term ecological forecasting. Over the past decade, data volume, variety, and accessibility have greatly increased, but challenges remain in interoperability, latency, and uncertainty quantification. Similarly, ecologists have made considerable advances in applying computational, informatic, and statistical methods, but opportunities exist for improving forecast-specific theory, methods, and cyberinfrastructure. Effective forecasting will also require changes in scientific training, culture, and institutions. The need to start forecasting is now; the time for making ecology more predictive is here, and learning by doing is the fastest route to drive the science forward.
EU pharmaceutical expenditure forecast
Urbinati, Duccio; Rémuzat, Cécile; Kornfeld, Åsa; Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Mzoughi, Olfa; Toumi, Mondher
2014-01-01
Background and Objectives With constant incentives for healthcare payers to contain their pharmaceutical budgets, forecasting has become critically important. Some countries have, for instance, developed pharmaceutical horizon scanning units. The objective of this project was to build a model to assess the net effect of the entrance of new patented medicinal products versus medicinal products going off-patent, with a defined forecast horizon, on selected European Union (EU) Member States’ pharmaceutical budgets. This model took into account population ageing, as well as current and future country-specific pricing, reimbursement, and market access policies (the project was performed for the European Commission; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). Method In order to have a representative heterogeneity of EU Member States, the following countries were selected for the analysis: France, Germany, Greece, Hungary, Poland, Portugal, and the United Kingdom. A forecasting period of 5 years (2012–2016) was chosen to assess the net pharmaceutical budget impact. A model for generics and biosimilars was developed for each country. The model estimated a separate and combined effect of the direct and indirect impacts of the patent cliff. A second model, estimating the sales development and the risk of development failure, was developed for new drugs. New drugs were reviewed individually to assess their clinical potential and translate it into commercial potential. The forecast was carried out according to three perspectives (healthcare public payer, society, and manufacturer), and several types of distribution chains (retail, hospital, and combined retail and hospital). Probabilistic and deterministic sensitivity analyses were carried out. Results According to the model, all countries experienced drug budget reductions except Poland (+€41 million). 
Savings were expected to be the highest in the United Kingdom (−€9,367 million) and France (−€5,589 million), followed, far behind them, by Germany (−€831 million), Greece (−€808 million), Portugal (−€243 million), and Hungary (−€84 million). The main sources of savings were the cardiovascular, central nervous system, and respiratory areas, together with biosimilar entries. Oncology, immunology, and inflammation, in contrast, led to additional expenditure. The model was particularly sensitive to the time to market of branded products, generic prices, generic penetration, and the distribution of biosimilars. Conclusions The results of this forecast suggested a decrease in pharmaceutical expenditure over the studied period. The model was sensitive to pharmaceutical policy decisions. PMID:27226837
Approaches in Health Human Resource Forecasting: A Roadmap for Improvement
Rafiei, Sima; Mohebbifar, Rafat; Hashemi, Fariba; Ezzatabadi, Mohammad Ranjbar; Farzianpour, Fereshteh
2016-01-01
Introduction Forecasting the demand and supply of health manpower in an accurate manner makes appropriate planning possible. The aim of this paper was to review approaches and methods for health manpower forecasting and consequently propose the features that improve the effectiveness of this important process of health manpower planning. Methods A literature review was conducted for studies published in English from 1990–2014 using the PubMed, ScienceDirect, ProQuest, and Google Scholar databases. Review articles, qualitative studies, and retrospective and prospective studies describing or applying various types of forecasting approaches and methods in health manpower forecasting were included in the review. The authors designed an extraction data sheet based on the study questions to collect data on the studies’ references, designs, and types of forecasting approaches, whether discussed or applied, with their strengths and weaknesses. Results Forty studies were included in the review. Two main categories of approaches (conceptual and analytical) for health manpower forecasting were identified. Each approach had several strengths and weaknesses. As a whole, most faced challenges such as being static, unable to capture dynamic variables in manpower forecasting, and unable to represent causal relationships. They also lacked the capacity to benefit from scenario making to assist policy makers in effective decision making. Conclusions An effective forecasting approach should resolve the deficits of current approaches and meet the key features found in the literature, in order to develop the open, dynamic, and comprehensive method necessary for today's complex health care systems. PMID:27790343
NASA Astrophysics Data System (ADS)
O'Connor, Alison; Kirtman, Benjamin; Harrison, Scott; Gorman, Joe
2016-05-01
The US Navy faces several limitations when planning operations with regard to forecasting environmental conditions. Currently, mission analysis and planning tools rely heavily on short-term (less than a week) forecasts or on long-term statistical climate products. However, newly available weather forecast ensembles provide dynamical and statistical extended-range predictions that can be more accurate, provided the ensemble members are combined correctly. Charles River Analytics is designing the Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS), which performs data fusion over extended-range multi-model ensembles, such as the North American Multi-Model Ensemble (NMME), to produce a unified forecast for several weeks to several seasons into the future. We evaluated thirty years of forecasts, using machine learning to select the predictions to combine into a single, superior forecast that can inform the Navy's decision-planning process.
NASA Astrophysics Data System (ADS)
Petrova, Desislava; Koopman, Siem Jan; Ballester, Joan; Rodó, Xavier
2017-02-01
El Niño (EN) is a dominant feature of climate variability on inter-annual time scales, driving changes in the climate throughout the globe and having widespread natural and socio-economic consequences. Its forecast is therefore an important task, and predictions are issued on a regular basis by a wide array of prediction schemes and climate centres around the world. This study explores a novel method for EN forecasting. To date, the advantageous statistical technique of unobserved components time series modeling, also known as structural time series modeling, has not been applied to this problem. We have therefore developed such a model, in which the statistical analysis, including parameter estimation and forecasting, is based on state space methods and includes the celebrated Kalman filter. The distinguishing feature of this dynamic model is the decomposition of a time series into a range of stochastically time-varying components such as level (or trend), seasonal, cycles of different frequencies, irregular, and regression effects incorporated as explanatory covariates. These components are modeled separately and ultimately combined in a single forecasting scheme. Customary statistical models for EN prediction essentially use SST and wind stress in the equatorial Pacific. In addition to these, we introduce a new domain of regression variables accounting for the state of the subsurface ocean temperature in the western and central equatorial Pacific, motivated by our analysis, as well as by recent and classical research, showing that subsurface processes and heat accumulation there are fundamental to the genesis of EN. An important feature of the scheme is that different regression predictors are used at different lead months, thus capturing the dynamical evolution of the system and yielding more efficient forecasts. The new model has been tested on the prediction of all warm events that occurred in the period 1996-2015.
Retrospective forecasts of these events were made for long lead times of at least two and a half years. Hence, the present study demonstrates that the theoretical limit of ENSO prediction lies well beyond the commonly accepted "Spring Barrier". The high correspondence between the forecasts and observations indicates that the proposed model outperforms all current operational statistical models and behaves comparably to the best dynamical models used for EN prediction. Thus, the novel way in which the modeling scheme has been structured could also be used to improve other statistical and dynamical modeling systems.
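At the core of such a structural time series model is the Kalman filter recursion. The paper's model includes level, seasonal, cyclical, and regression components; as a minimal sketch of the filtering step only, a local-level model can be written in plain NumPy (the series and variance values below are illustrative assumptions, not data from the study):

```python
import numpy as np

def local_level_filter(y, sigma_eps2, sigma_eta2, a0=0.0, p0=1e6):
    """Kalman filter for the local-level model:
    y_t = mu_t + eps_t,  mu_{t+1} = mu_t + eta_t."""
    a, p = a0, p0
    filtered = []
    for yt in y:
        f = p + sigma_eps2              # one-step prediction-error variance
        k = p / f                       # Kalman gain
        a = a + k * (yt - a)            # filtered level
        p = p * (1 - k) + sigma_eta2    # variance of the next predicted level
        filtered.append(a)
    return np.array(filtered), a        # final level = h-step-ahead forecast

# illustrative monthly SST-like anomalies (assumed numbers)
y = np.array([25.1, 25.4, 26.0, 26.3, 26.1, 26.8])
levels, forecast = local_level_filter(y, sigma_eps2=0.25, sigma_eta2=0.05)
```

In a full implementation the state vector is extended with seasonal and cyclical components and regression effects, and the unknown variances are estimated by maximum likelihood from the prediction-error decomposition.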
NASA Astrophysics Data System (ADS)
Bellier, Joseph; Bontron, Guillaume; Zin, Isabella
2017-12-01
Meteorological ensemble forecasts are nowadays widely used as input to hydrological models for probabilistic streamflow forecasting. These forcings are frequently biased and have to be statistically postprocessed, most often with univariate techniques that apply independently to individual locations, lead times, and weather variables. Postprocessed ensemble forecasts therefore need to be reordered so as to reconstruct suitable multivariate dependence structures. The Schaake shuffle and ensemble copula coupling are the two most popular methods for this purpose. This paper proposes two adaptations of them that use meteorological analogues to reconstruct the spatiotemporal dependence structures of precipitation forecasts. The performances of the original and adapted techniques are compared through a multistep verification experiment using real forecasts from the European Centre for Medium-Range Weather Forecasts. This experiment evaluates not only the multivariate precipitation forecasts but also the corresponding streamflow forecasts derived from hydrological modeling. Results show that the relative performances of the different reordering methods vary depending on the verification step. In particular, the standard Schaake shuffle is found to perform poorly when evaluated on streamflow. This emphasizes the crucial role of the precipitation spatiotemporal dependence structure in hydrological ensemble forecasting.
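The Schaake shuffle at the heart of these adaptations can be sketched compactly: at each location and lead time, the postprocessed ensemble members are reordered so that their rank structure follows that of a historical template trajectory. A minimal NumPy sketch (the array shapes and toy values are assumptions, not the paper's implementation):

```python
import numpy as np

def schaake_shuffle(forecast, historical):
    """Reorder ensemble members so that, at every location/lead time,
    their rank order matches that of a historical template trajectory.
    Both arrays have shape (n_members, n_dims)."""
    out = np.empty_like(forecast)
    for j in range(forecast.shape[1]):
        sorted_fc = np.sort(forecast[:, j])
        # rank (0 = smallest) of each historical member in dimension j
        ranks = np.argsort(np.argsort(historical[:, j]))
        out[:, j] = sorted_fc[ranks]
    return out

# toy example: 3 members, 2 locations/lead times
fc = np.array([[3., 1.], [1., 2.], [2., 3.]])
hist = np.array([[10., 5.], [30., 4.], [20., 6.]])
shuffled = schaake_shuffle(fc, hist)
```

The analogue-based adaptations in the paper differ mainly in how the historical template trajectories are chosen, not in this reordering step itself.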
NASA Technical Reports Server (NTRS)
1988-01-01
ROFFS stands for Roffer's Ocean Fishing Forecasting Service, Inc. Roffer's combines satellite and computer technology with oceanographic information from several sources to produce frequently updated charts, sometimes as often as 30 times a day, showing clues to the location of marlin, sailfish, tuna, swordfish, and a variety of other species. It also provides customized forecasts for racing boats and the shipping industry, along with seasonal forecasts that allow the marine industry to formulate fishing strategies based on foreknowledge of the arrival and departure times of different fish. The ROFFS service exemplifies the potential benefits to marine industries from satellite observations. The most notable results are reduced search time and substantial fuel savings.
Benchmarking an operational procedure for rapid flood mapping and risk assessment in Europe
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Salamon, Peter; Kalas, Milan; Bianchi, Alessandra; Feyen, Luc
2016-04-01
The development of real-time methods for rapid flood mapping and risk assessment is crucial to improve emergency response and mitigate flood impacts. This work describes the benchmarking of an operational procedure for rapid flood risk assessment based on the flood predictions issued by the European Flood Awareness System (EFAS). The daily forecasts produced for the major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations, based on the hydro-meteorological dataset of EFAS. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in near real-time in terms of flood-prone areas, potential economic damage, and affected population, infrastructure, and cities. Extensive testing of the operational procedure is carried out using the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-derived flood footprints, while ground-based estimates of economic damage and affected population are compared against modelled estimates. We evaluated the skill of flood hazard and risk estimations derived from EFAS flood forecasts with different lead times and combinations. The assessment includes a comparison of several alternative approaches to producing and presenting the information content, in order to meet the requests of EFAS users. The tests provided good results and showed the potential of the developed real-time operational procedure in supporting emergency response and management.
Air Quality Forecasts Using the NASA GEOS Model: A Unified Tool from Local to Global Scales
NASA Technical Reports Server (NTRS)
Knowland, E. Emma; Keller, Christoph; Nielsen, J. Eric; Orbe, Clara; Ott, Lesley; Pawson, Steven; Saunders, Emily; Duncan, Bryan; Cook, Melanie; Liu, Junhua;
2017-01-01
We provide an introduction to a new high-resolution (0.25 degree) global composition forecast produced by NASA's Global Modeling and Assimilation Office. The NASA Goddard Earth Observing System version 5 (GEOS-5) model has been expanded to provide global near-real-time forecasts of atmospheric composition at a horizontal resolution of 0.25 degrees (approximately 25 km). Previously, this combination of detailed chemistry and resolution was only provided by regional models. The system combines the operational GEOS-5 weather forecasting model with the state-of-the-science GEOS-Chem chemistry module (version 11) to provide detailed chemical analysis of a wide range of air pollutants such as ozone, carbon monoxide, nitrogen oxides, and fine particulate matter (PM2.5). This is the highest resolution among currently publicly available global composition forecasts. Evaluation and validation of modeled trace gases and aerosols against surface and satellite observations will be presented for constituents relevant to health-based air quality standards. Comparisons of modeled trace gases and aerosols against satellite observations show that the model produces realistic concentrations of atmospheric constituents in the free troposphere. Comparisons against surface observations highlight the model's capability to capture the diurnal variability of air pollutants under a variety of meteorological conditions. The GEOS-5 composition forecasting system offers a new tool for scientists and the public health community, and is being developed jointly with several government and non-profit partners. Potential applications include air quality warnings, flight campaign planning, and exposure studies using the archived analysis fields.
NASA Technical Reports Server (NTRS)
Cardone, V. J.; Pierson, W. J.
1975-01-01
On Skylab, a combination microwave radar-radiometer (S193) made measurements in a tropical hurricane (AVA), a tropical storm, and various extratropical wind systems. The winds at each cell scanned by the instrument were determined by objective numerical analysis techniques. The measured radar backscatter is compared to the analyzed winds and shown to provide an accurate method for measuring winds from space. An operational version of the instrument on an orbiting satellite will be able to provide the kind of measurements in tropical cyclones available today only through expensive and dangerous aircraft reconnaissance. Additionally, the specification of the wind field in the tropical boundary layer should contribute to improved accuracy of tropical cyclone forecasts made with the numerical weather prediction models currently being applied to the tropical atmosphere.
NASA Astrophysics Data System (ADS)
Harty, T. M.; Lorenzo, A.; Holmgren, W.; Morzfeld, M.
2017-12-01
The irradiance incident on a solar panel is the main factor determining the power output of that panel. For this reason, accurate global horizontal irradiance (GHI) estimates and forecasts are critical when determining the optimal location for a solar power plant, forecasting utility-scale solar power production, or forecasting distributed, behind-the-meter rooftop solar power production. Satellite images provide a basis for producing the GHI estimates needed to undertake these objectives. The focus of this work is to combine satellite-derived GHI estimates with ground sensor measurements and an advection model. The idea is to use accurate but sparsely distributed ground sensors to improve satellite-derived GHI estimates, which can cover large areas (the size of a city or a region of the United States). We use a Bayesian framework to perform the data assimilation, which enables us to produce irradiance forecasts and associated uncertainties that incorporate both satellite and ground sensor data. Within this framework, we utilize satellite images taken from the GOES-15 geostationary satellite (available every 15-30 minutes) as well as ground data taken from irradiance sensors and rooftop solar arrays (available every 5 minutes). The advection model, driven by wind forecasts from a numerical weather model, simulates cloud motion between measurements. We use the Local Ensemble Transform Kalman Filter (LETKF) to perform the data assimilation. We present preliminary results towards making such a system useful in an operational context. We explain how localization and inflation in the LETKF, perturbations of wind fields, and random perturbations of the advection model affect the accuracy of our estimates and forecasts. We present experiments showing the accuracy of our forecasted GHI over forecast horizons of 15 minutes to 1 hour. The limitations of our approach and future improvements are also discussed.
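The full LETKF involves localization and covariance inflation, which are beyond a short sketch. As an illustration of the underlying ensemble Kalman analysis step only, here is the simpler perturbed-observation EnKF variant, not the LETKF itself (all values, dimensions, and names are toy assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, obs, obs_idx, obs_err_std):
    """Perturbed-observation EnKF analysis step (no localization/inflation).
    ensemble: (n_members, n_state); obs observed at state indices obs_idx."""
    n, n_state = ensemble.shape
    m = len(obs_idx)
    H = np.zeros((m, n_state))
    H[np.arange(m), obs_idx] = 1.0                 # linear observation operator
    X = ensemble - ensemble.mean(axis=0)           # state perturbations
    Y = X @ H.T                                    # observation-space perturbations
    P_yy = Y.T @ Y / (n - 1) + (obs_err_std ** 2) * np.eye(m)
    P_xy = X.T @ Y / (n - 1)
    K = P_xy @ np.linalg.inv(P_yy)                 # Kalman gain
    perturbed_obs = obs + rng.normal(0.0, obs_err_std, (n, m))
    return ensemble + (perturbed_obs - ensemble @ H.T) @ K.T

# illustrative: a 2-variable "GHI field", observed only at index 0
prior = rng.normal(10.0, 2.0, (300, 2))
posterior = enkf_update(prior, np.array([5.0]), [0], obs_err_std=0.1)
```

The LETKF performs an equivalent analysis in ensemble space, independently in localized regions around each grid point, which is what makes it attractive for large satellite-derived state vectors.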
An Operational System for Surveillance and Ecological Forecasting of West Nile Virus Outbreaks
NASA Astrophysics Data System (ADS)
Wimberly, M. C.; Davis, J. K.; Vincent, G.; Hess, A.; Hildreth, M. B.
2017-12-01
Mosquito-borne disease surveillance has traditionally focused on tracking human cases along with the abundance and infection status of mosquito vectors. For many of these diseases, vector and host population dynamics are also sensitive to climatic factors, including temperature fluctuations and the availability of surface water for mosquito breeding. Thus, there is potential to strengthen surveillance and predict future outbreaks by monitoring environmental risk factors using broad-scale sensor networks that include earth-observing satellites. The South Dakota Mosquito Information System (SDMIS) project combines entomological surveillance with gridded meteorological data from NASA's North American Land Data Assimilation System (NLDAS) to generate weekly risk maps for West Nile virus (WNV) in the north-central United States. Critical components include a mosquito infection model that smooths the noisy infection rate and compensates for unbalanced sampling, and a human infection model that combines the entomological risk estimates with lagged effects of the NLDAS meteorological variables. Two types of forecasts are generated: long-term forecasts of statewide risk extending through the entire WNV season, and short-term forecasts of the geographic pattern of WNV risk in the upcoming week. Model forecasts are connected to public health actions through decision support matrices that link predicted risk levels to a set of phased responses. In 2016, the SDMIS successfully forecast an early start to the WNV season and a large outbreak of WNV cases following several years of low transmission. An evaluation of the 2017 forecasts will also be presented. Our experiences with the SDMIS highlight several important lessons that can inform future efforts at disease early warning.
These include the value of integrating climatic models with recent observations of infection, the critical role of automated workflows to facilitate the timely integration of multiple data streams, the need for effective synthesis and visualization of forecasts, and the importance of linking forecasts to specific public health responses.
Action-based flood forecasting for triggering humanitarian action
NASA Astrophysics Data System (ADS)
Coughlan de Perez, Erin; van den Hurk, Bart; van Aalst, Maarten K.; Amuron, Irene; Bamanya, Deus; Hauser, Tristan; Jongma, Brenden; Lopez, Ana; Mason, Simon; Mendler de Suarez, Janot; Pappenberger, Florian; Rueth, Alexandra; Stephens, Elisabeth; Suarez, Pablo; Wagemaker, Jurjen; Zsoter, Ervin
2016-09-01
Too often, credible scientific early warning information of increased disaster risk does not result in humanitarian action. With financial resources tilted heavily towards response after a disaster, disaster managers have limited incentive and ability to process complex scientific data, including uncertainties. These incentives are beginning to change, with the advent of several new forecast-based financing systems that provide funding based on a forecast of an extreme event. Given the changing landscape, here we demonstrate a method to select and use appropriate forecasts for specific humanitarian disaster prevention actions, even in a data-scarce location. This action-based forecasting methodology takes into account the parameters of each action, such as action lifetime, when verifying a forecast. Forecasts are linked with action based on an understanding of (1) the magnitude of previous flooding events and (2) the willingness to act "in vain" for specific actions. This is applied in the context of the Uganda Red Cross Society forecast-based financing pilot project, with forecasts from the Global Flood Awareness System (GloFAS). Using this method, we define the "danger level" of flooding, and we select the probabilistic forecast triggers that are appropriate for specific actions. Results from this methodology can be applied globally across hazards and fed into a financing system that ensures that automatic, pre-funded early action will be triggered by forecasts.
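The trigger-selection logic described above can be sketched as follows: given a reforecast record, choose the lowest probability threshold whose historical false-alarm ratio stays within the accepted willingness to act "in vain". This is a simplified sketch; the function and the toy numbers are illustrative assumptions, not the GloFAS implementation:

```python
def select_trigger(forecast_probs, flood_occurred, max_false_alarm_ratio):
    """Pick the lowest forecast-probability trigger whose historical
    false-alarm ratio stays within the accepted chance of acting in vain."""
    for threshold in sorted(set(forecast_probs)):
        acts = [p >= threshold for p in forecast_probs]
        alarms = sum(acts)
        if alarms == 0:
            continue
        hits = sum(a and f for a, f in zip(acts, flood_occurred))
        false_alarm_ratio = (alarms - hits) / alarms
        if false_alarm_ratio <= max_false_alarm_ratio:
            return threshold, false_alarm_ratio
    return None, None

# illustrative reforecast record: probability of exceeding the danger level,
# and whether damaging flooding actually occurred
probs = [0.1, 0.2, 0.4, 0.6, 0.8, 0.9]
floods = [False, False, False, True, True, True]
trigger, far = select_trigger(probs, floods, max_false_alarm_ratio=0.25)
```

In the action-based methodology, `max_false_alarm_ratio` would differ per action (reflecting its cost and lifetime), so each action can receive its own trigger.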
Forecasting Error Calculation with Mean Absolute Deviation and Mean Absolute Percentage Error
NASA Astrophysics Data System (ADS)
Khair, Ummul; Fahmi, Hasanul; Hakim, Sarudin Al; Rahim, Robbi
2017-12-01
Prediction using a forecasting method is one of the most important activities for an organization. Selecting an appropriate forecasting method is important, but quantifying a method's percentage error is even more important if decision makers are to make the right choices. Using the Mean Absolute Deviation and the Mean Absolute Percentage Error to calculate the error of the least-squares method yielded a percentage error of 9.77%, and it was decided that the least-squares method is suitable for time series and trend data.
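The two error measures, and the least-squares trend fit they are applied to, can be sketched directly (a generic illustration, not the paper's code):

```python
def mad(actual, forecast):
    """Mean Absolute Deviation of the forecast errors."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent (actuals must be nonzero)."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

def least_squares_trend(y):
    """Fit the trend line y_t = a + b*t by ordinary least squares."""
    n = len(y)
    t_bar, y_bar = (n - 1) / 2.0, sum(y) / n
    b = sum((t - t_bar) * (yt - y_bar) for t, yt in enumerate(y)) \
        / sum((t - t_bar) ** 2 for t in range(n))
    a = y_bar - b * t_bar
    return [a + b * t for t in range(n)]
```

For example, `mad([10, 20], [12, 18])` is 2.0 and `mape([100, 200], [110, 180])` is 10.0; the 9.77% figure in the abstract is a MAPE of this kind computed for the authors' data.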
PREMAQ: A NEW PRE-PROCESSOR TO CMAQ FOR AIR-QUALITY FORECASTING
A new pre-processor to CMAQ (PREMAQ) has been developed as part of the national air-quality forecasting system. PREMAQ combines the functionality of MCIP and parts of SMOKE in a single real-time processor. PREMAQ was specifically designed to link NCEP's Eta model with CMAQ, and...
A Fifteen-Year Forecast of Information-Processing Technology. Final Report.
ERIC Educational Resources Information Center
Bernstein, George B.
This study developed a variation of the DELPHI approach, a polling technique for systematically soliciting opinions from experts, to produce a technological forecast of developments in the information-processing industry. SEER (System for Event Evaluation and Review) combines the more desirable elements of existing techniques: (1) intuitive…
Post-Secondary Enrolment Forecasting with Traditional and Cross Pressure-Impact Methodologies.
ERIC Educational Resources Information Center
Hoffman, Bernard B.
A model for forecasting postsecondary enrollment, the PDEM-1, is considered, which combines the traditional with a cross-pressure impact decision-making model. The model is considered in relation to its background, assumptions, survey instrument, model conception, applicability to educational environments, and implementation difficulties. The…
Short-Term foF2 Forecast: Present-Day State of the Art
NASA Astrophysics Data System (ADS)
Mikhailov, A. V.; Depuev, V. H.; Depueva, A. H.
An analysis of the F2-layer short-term forecast problem has been carried out. Both objective and methodological problems prevent reliable F2-layer forecasts from being issued at present. An empirical approach based on statistical methods may be recommended for practical use. A forecast method based on a new aeronomic index (a proxy), AI, has been proposed and tested on 64 selected severe storm events. The method provides acceptable prediction accuracy for both strongly disturbed and quiet conditions. The problems with the prediction of F2-layer quiet-time disturbances, as well as some other unsolved problems, are discussed.
Load forecast method of electric vehicle charging station using SVR based on GA-PSO
NASA Astrophysics Data System (ADS)
Lu, Kuan; Sun, Wenxue; Ma, Changhui; Yang, Shenquan; Zhu, Zijian; Zhao, Pengfei; Zhao, Xin; Xu, Nan
2017-06-01
This paper presents a Support Vector Regression (SVR) method for electric vehicle (EV) charging station load forecasting based on the genetic algorithm (GA) and particle swarm optimization (PSO). Fuzzy C-Means (FCM) clustering is used to establish similar-day samples. GA is used for global parameter searching and PSO for more accurate local searching. The load forecast is then regressed using SVR. Practical load data from an EV charging station were used to illustrate the proposed method. The result indicates an obvious improvement in forecasting accuracy compared with SVRs based on PSO or GA alone.
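A sketch of the PSO stage is below. To keep it self-contained, kernel ridge regression with an RBF kernel stands in for epsilon-SVR (a closely related kernel method), and the swarm searches over log-scaled (C, gamma). The data, bounds, and PSO coefficients are illustrative assumptions, and the GA global-search stage is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_kernel(A, B, gamma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_predict(X_tr, y_tr, X_te, C, gamma):
    """Kernel ridge regression with an RBF kernel -- a lightweight
    stand-in here for epsilon-SVR."""
    K = rbf_kernel(X_tr, X_tr, gamma)
    alpha = np.linalg.solve(K + np.eye(len(X_tr)) / C, y_tr)
    return rbf_kernel(X_te, X_tr, gamma) @ alpha

def pso_tune(X_tr, y_tr, X_val, y_val, n_particles=15, n_iter=30):
    """Plain PSO over (log10 C, log10 gamma), minimizing validation MSE."""
    lo, hi = np.array([-1.0, -2.0]), np.array([3.0, 1.0])   # search bounds
    pos = rng.uniform(lo, hi, (n_particles, 2))
    vel = np.zeros_like(pos)
    def loss(p):
        pred = fit_predict(X_tr, y_tr, X_val, 10 ** p[0], 10 ** p[1])
        return float(((pred - y_val) ** 2).mean())
    pbest = pos.copy()
    pbest_val = np.array([loss(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([loss(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return 10 ** gbest[0], 10 ** gbest[1]

# synthetic daily "charging load" curve (assumed data)
t = np.linspace(0.0, 6.0, 48)[:, None]
load = np.sin(t[:, 0]) + 0.05 * rng.normal(size=48)
X_tr, y_tr = t[::2], load[::2]
X_val, y_val = t[1::2], load[1::2]
C, gamma = pso_tune(X_tr, y_tr, X_val, y_val)
```

In the paper's hybrid, GA would first provide a good starting region and PSO would then refine it; here the swarm explores the whole bounded region directly.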
Assessing the impact of different satellite retrieval methods on forecast available potential energy
NASA Technical Reports Server (NTRS)
Whittaker, Linda M.; Horn, Lyle H.
1990-01-01
The effects of the inclusion of satellite temperature retrieval data, and of different satellite retrieval methods, on forecasts made with the NASA Goddard Laboratory for Atmospheres (GLA) fourth-order model were investigated using, as the parameter, the available potential energy (APE) in its isentropic form. Calculations of the APE were used to study the differences in the forecast sets, both globally and in the Northern Hemisphere, during the 72-h forecast period. The analysis data sets used for the forecasts included one containing the NESDIS TIROS-N retrievals, one with the GLA retrievals using the physical inversion method, and a third, without satellite data, used as a control; two data sets, with and without satellite data, were used for verification. For all three data sets, the Northern Hemisphere values of the total APE showed an increase throughout the forecast period, mostly due to an increase in the zonal component, in contrast to the verification sets, which showed a steady level of total APE.
Time series forecasting using ERNN and QR based on Bayesian model averaging
NASA Astrophysics Data System (ADS)
Pwasong, Augustine; Sathasivam, Saratha
2017-08-01
The Bayesian model averaging technique is a multi-model combination technique. It was employed here to amalgamate the Elman recurrent neural network (ERNN) technique with the quadratic regression (QR) technique, producing a hybrid known as the ERNN-QR technique. The forecasting potential of the hybrid technique is compared with the forecasting capabilities of the individual ERNN and QR techniques. The outcome revealed that the hybrid technique is superior to the individual techniques in the mean-square-error sense.
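A minimal sketch of the weighting step: Bayesian model averaging can be approximated by converting each model's in-sample fit into a BIC-style score and normalizing (equal model complexity is assumed here, which is a simplification of full BMA; the functions are illustrative, not the paper's code):

```python
import math

def bma_weights(residuals_by_model):
    """Approximate BMA weights from in-sample residuals via a BIC-style
    score, assuming equal model complexity (a common simplification)."""
    n = len(residuals_by_model[0])
    scores = [n * math.log(sum(e * e for e in res) / n)
              for res in residuals_by_model]
    best = min(scores)
    raw = [math.exp(-0.5 * (s - best)) for s in scores]
    total = sum(raw)
    return [r / total for r in raw]

def bma_combine(forecasts_by_model, weights):
    """Weighted average of the individual model forecasts at each step."""
    return [sum(w * f for w, f in zip(weights, step))
            for step in zip(*forecasts_by_model)]
```

A model with small residuals (here, ERNN or QR) receives a weight near one, so the hybrid forecast leans towards whichever component fits better while still hedging across both.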
A hybrid group method of data handling with discrete wavelet transform for GDP forecasting
NASA Astrophysics Data System (ADS)
Isa, Nadira Mohamed; Shabri, Ani
2013-09-01
This study proposes a hybrid model combining the Group Method of Data Handling (GMDH) and the Discrete Wavelet Transform (DWT) for time series forecasting. The objective of this paper is to examine the flexibility of the hybrid GMDH in time series forecasting using Gross Domestic Product (GDP) data. A time series data set is used in this study to demonstrate the effectiveness of the forecasting model. These data are utilized to forecast through an application aimed at handling real-life time series. The experiment compares the performance of the hybrid model with single models: Wavelet-Linear Regression (WR), an Artificial Neural Network (ANN), and conventional GMDH. It is shown that the proposed model can provide a promising alternative technique for GDP forecasting.
A System Approach to the Long-Term Forecasting of Climate Data in the Baikal Region
NASA Astrophysics Data System (ADS)
Abasov, N.; Berezhnykh, T.
2003-04-01
The Angara River, running from Lake Baikal with a cascade of hydropower plants built on it, plays a peculiar role in the economy of the region. In view of the high variability of water inflow into the rivers and lakes (long-term low-water periods and catastrophic floods), which is due to the climatic peculiarities of water resource formation, long-term forecasting is developed and applied to decrease risk at the hydropower plants. The methodology and methods of long-term forecasting of natural-climatic processes employ some ideas of the research schools of Academician I. P. Druzhinin and Prof. A. P. Reznikhov, and consist in detailed investigation of cause-effect relations, finding physical analogs, and applying them to formalized methods of long-term forecasting. The methods are divided into qualitative (the background method; the method of analogs based on solar activity), probabilistic, and approximative methods (analog-similarity relations; a discrete-continuous model). These forecasting methods have been implemented as analytical tools of the information-forecasting software "GIPSAR", which provides some elements of artificial intelligence. Background forecasts of the runoff of the Ob, the Yenisei, and the Angara Rivers in the south of Siberia are based on space-time regularities revealed by taking account of the phase shifts in the occurrence of secular maxima and minima on integral-difference curves of many-year hydrological processes in the objects compared. Solar activity plays an essential role in investigations of global variations of climatic processes. Its consideration in the method of superimposed epochs has led to the conclusion that a low-water period in the actual inflow to Lake Baikal is more probable on the increasing branch of the 11-year solar activity cycle. A high-water period is more probable on the decreasing branch of solar activity, from the 2nd to the 5th year after its maximum.
The probabilistic method of forecasting (one year in advance) is based on the property of alternation of series of years with increase and decrease in the observed indicators (characteristic indices) of natural processes. Most of the series (98.4-99.6%) are series of one to three years. The problem of forecasting is divided into two parts: 1) a qualitative forecast of the probability that the started series will either continue or be replaced by a new series during the next year, based on the frequency characteristics of series of years with increase or decrease of the forecasted sequence; 2) a quantitative estimate of the forecasted value in the form of a curve of conditional frequencies, made on the basis of intra-sequence interrelations among hydrometeorological elements by differentiating them with respect to series of years of increase or decrease, by constructing particular curves of conditional frequencies of the runoff for each expected variant of series development, and by subsequently constructing a generalized curve. Approximative learning methods form forecasted trajectories of the studied process indices for a long-term perspective. The method of analog-similarity relations is based on the fact that long periods of observations reveal similarities, by definite criteria, in the character of variability of indices for some fragments of the sequence x(t). The idea of the method is to estimate the similarity of such fragments of the sequence, which have been called analogs. The method applies multistage optimization of external parameters (e.g. the number of iterations of the sliding averaging needed to decompose the sequence into two components: the smoothed one with isolated periodic oscillations, and the residual or random one). The method is applicable to forecast terms ranging from the current one up to the double solar cycle.
Using a special integration procedure, it selects the terms with the best results for the given optimization subsample. The several optimal parameter vectors obtained are tested on the examination (verification) subsample. If the procedure is successful, the forecast is made immediately by integrating several of the best solutions. Peculiarities of forecasting extreme processes: methods of long-term forecasting allow sufficiently reliable forecasts within the interval [x_min + Δ₁, x_max − Δ₂], i.e., in the interval of medium values of the indices. In the intervals close to the extremes, however, the reliability of forecasts is substantially lower. While for medium values the statistics of a 100-year sequence give acceptable results owing to the sufficiently large number of revealed analogs corresponding to prognostic samples, for extreme values the situation is quite different, first of all because of the scarcity of statistical data. Decreasing the values of Δ₁ and Δ₂ (letting Δ₁, Δ₂ → 0 by including them among the optimization parameters of the considered forecasting methods) could be one way to improve forecast reliability. This approach has been partially realized in the method of analog-similarity relations, making it possible to form a range of possible forecasted trajectories in two variants: from the minimum possible trajectory to the maximum possible one. Reliability of long-term forecasts: both the methodology and the methods considered above have been realized as the information-forecasting system "GIPSAR".
The system includes tools implementing several forecasting methods, analysis of initial and forecasted information, a developed database, a set of tools for verification of algorithms, additional information on the algorithms of statistical processing of sequences (sliding averaging, integral-difference curves, etc.), aids for organizing the input of initial information (in its various forms), and aids for drawing up output prognostic documents. Risk management: the normal functioning of the Angara cascade is periodically interrupted by risks of two types arising in Lake Baikal and the Bratsk and Ust-Ilimsk reservoirs: long low-water periods and sudden periods of extremely high water levels. For example, the low-water periods observed in the reservoirs of the Angara cascade can be classified under four risk categories: 1 - acceptable (negligible reduction of electric power generation by hydropower plants; some difficulty in meeting environmental and navigation requirements); 2 - significant (substantial reduction of electric power generation; some restriction on water releases for navigation; violation of environmental requirements in some years); 3 - emergency (large losses in electric power generation; limited electricity supply to large consumers; significant restriction of water releases for navigation; threat of exposure of drinking water intakes; violation of environmental requirements for a number of years); 4 - catastrophic (energy crisis; social crisis; exposure of drinking water intakes; termination of navigation; environmental catastrophe). Management of energy systems consists in operative, many-year regulation and perspective planning, and has to take into account the analysis of operative data (water reserves in reservoirs), long-term statistics, relations among natural processes, and also forecasts: short-term (for a day, a week, a ten-day period), long-term, and/or super-long-term (from a month to several decades).
Natural processes such as water inflow to reservoirs and air temperatures during heating periods depend in turn on external factors: prevailing types of atmospheric circulation, intensity of the 11- and 22-year cycles of solar activity, volcanic activity, interaction between the ocean and the atmosphere, etc. Until recently, despite the established scientific schools of long-term forecasting (I.P. Druzhinin, A.P. Reznikhov), energy system management has been based only on specially drawn dispatching schedules and long-term hydrometeorological forecasts, without involving prospective forecasted indices. Inserting a parallel forecasting block (based on the analysis of data on natural processes and special forecasting methods) into the management scheme can largely smooth the unfavorable consequences of natural processes for the sustainable development of energy systems, and especially for their safe operation. However, the requirements on the reliability and accuracy of long-term forecasts then increase significantly. The considered approach to long-term forecasting can also be used to predict mean winter and summer air temperatures, droughts, and forest fires.
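The qualitative half of the probabilistic method described above rests on run-length statistics of years with increase or decrease. A minimal sketch of that bookkeeping is given below; the inflow indices are synthetic and ties are treated as decreases purely for simplicity (both are assumptions, not taken from the paper):

```python
def run_lengths(x):
    """Lengths of consecutive runs of year-on-year increase or decrease.
    Ties are counted as decreases here purely for simplicity."""
    signs = [1 if b > a else -1 for a, b in zip(x, x[1:])]
    runs, length = [], 1
    for prev, cur in zip(signs, signs[1:]):
        if cur == prev:
            length += 1
        else:
            runs.append(length)
            length = 1
    runs.append(length)
    return runs

def p_continue(runs, k):
    """P(a run already k years long continues for at least one more year)."""
    at_least_k = sum(1 for r in runs if r >= k)
    longer = sum(1 for r in runs if r > k)
    return longer / at_least_k if at_least_k else 0.0

# Synthetic annual inflow indices (illustrative only)
x = [5, 7, 9, 6, 4, 8, 10, 11, 7, 5, 6, 9, 12, 8]
runs = run_lengths(x)
print(runs)                 # [2, 2, 3, 2, 3, 1]
print(p_continue(runs, 1))  # 5 of the 6 runs last beyond one year
```

With a century of data, the same frequencies supply the probability that the currently observed series continues, which is the first stage of the method.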
Can we use Earth Observations to improve monthly water level forecasts?
NASA Astrophysics Data System (ADS)
Slater, L. J.; Villarini, G.
2017-12-01
Dynamical-statistical hydrologic forecasting approaches offer different strengths compared with traditional hydrologic forecasting systems: they are computationally efficient, can integrate and `learn' from a broad selection of input data (e.g., General Circulation Model (GCM) forecasts, Earth Observation time series, teleconnection patterns), and can take advantage of recent progress in machine learning (e.g., multi-model blending, post-processing, and ensembling techniques). Recent efforts to develop a dynamical-statistical ensemble approach for forecasting seasonal streamflow using both GCM forecasts and changing land cover have shown promising results over the U.S. Midwest. Here, we use climate forecasts from several GCMs of the North American Multi-Model Ensemble (NMME) alongside 15-minute stage time series from the National River Flow Archive (NRFA) and land cover classes extracted from the European Space Agency's Climate Change Initiative 300 m annual Global Land Cover time series. With these data, we conduct systematic long-range probabilistic forecasting of monthly water levels in UK catchments over timescales ranging from one to twelve months ahead. We evaluate the improvement in model fit and model forecasting skill that comes from using land cover classes as predictors in the models. This work opens up new possibilities for combining Earth Observation time series with GCM forecasts to predict a variety of hazards from space using data science techniques.
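The gain in model fit from an added land-cover predictor can be checked, in spirit, by comparing a regression's fit with and without the extra covariate. A toy sketch with synthetic data follows; the predictors, coefficients, and noise level are all invented, and the study's actual models are probabilistic and far richer:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 120                                    # ten years of monthly data
gcm = rng.normal(size=n)                   # hypothetical GCM climate predictor
landcover = rng.normal(size=n)             # hypothetical land-cover fraction
level = 0.8 * gcm + 0.3 * landcover + rng.normal(scale=0.5, size=n)

def fit_rmse(X, y):
    """In-sample RMSE of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sqrt(np.mean((y - X @ beta) ** 2))

X_base = np.column_stack([np.ones(n), gcm])             # climate only
X_full = np.column_stack([np.ones(n), gcm, landcover])  # + land cover
print(fit_rmse(X_base, level), fit_rmse(X_full, level))
```

In-sample RMSE can only go down when a predictor is added, which is why out-of-sample forecast skill, as evaluated in the study, is the more demanding test.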
A scoping review of nursing workforce planning and forecasting research.
Squires, Allison; Jylhä, Virpi; Jun, Jin; Ensio, Anneli; Kinnunen, Juha
2017-11-01
This study critically evaluates forecasting models and their content in workforce planning policies for nursing professionals and highlights the strengths and weaknesses of existing approaches. Although macro-level nursing workforce issues may not be the first thing that many nurse managers consider in daily operations, the current and impending nursing shortage in many countries makes nursing-specific models for workforce forecasting important. A scoping review was conducted using a directed and summative content analysis approach to capture supply and demand analytic methods of nurse workforce planning and forecasting. Nurse workforce forecasting studies published in peer-reviewed journals as well as in the grey literature were included in the scoping review. Thirty-six studies met the inclusion criteria, with the majority coming from the USA. Forecasting methods were biased towards service utilization analyses and were not consistent across studies. Current methods for nurse workforce forecasting are inconsistent and have not accounted sufficiently for socioeconomic and political factors that can influence workforce projections. Additional studies examining past trends are needed to improve future modelling. Accurate nursing workforce forecasting can help nurse managers, administrators and policy makers to understand the supply and demand of the workforce to prepare and maintain an adequate and competent current and future workforce. © 2017 John Wiley & Sons Ltd.
Robustness of disaggregate oil and gas discovery forecasting models
Attanasi, E.D.; Schuenemeyer, J.H.
1989-01-01
The trend in forecasting oil and gas discoveries has been to develop and use models that allow forecasts of the size distribution of future discoveries. From such forecasts, exploration and development costs can more readily be computed. Two classes of these forecasting models are the Arps-Roberts type models and the 'creaming method' models. This paper examines the robustness of the forecasts made by these models when the historical data on which the models are based have been subject to economic upheavals or when historical discovery data are aggregated from areas having widely differing economic structures. Model performance is examined in the context of forecasting discoveries for offshore Texas State and Federal areas. The analysis shows how the model forecasts are limited by information contained in the historical discovery data. Because the Arps-Roberts type models require more regularity in discovery sequence than the creaming models, prior information had to be introduced into the Arps-Roberts models to accommodate the influence of economic changes. The creaming methods captured the overall decline in discovery size but did not easily allow introduction of exogenous information to compensate for incomplete historical data. Moreover, the predictive log normal distribution associated with the creaming model methods appears to understate the importance of the potential contribution of small fields. © 1989.
NASA Astrophysics Data System (ADS)
Domínguez, Efraín; Angarita, Hector; Rosmann, Thomas; Mendez, Zulma; Angulo, Gustavo
2013-04-01
A viable quantitative hydrological forecasting service is a combination of technological elements, personnel, and knowledge working together to establish a stable operational cycle of forecast emission, dissemination, and assimilation; hence, establishing such a system usually requires significant resources and time to reach adequate development and integration and to produce forecasts with acceptable performance. Here we present the results of this process for the recently implemented Operational Forecast Service for the Betania Hydropower Reservoir (SPHEB), located in the Upper Magdalena River Basin (Colombia). The current scope of the SPHEB includes forecasting of water levels and discharge for the three main streams flowing into the reservoir, for lead times between +1 and +57 hours and between +1 and +10 days. The core of the SPHEB is the Flexible, Adaptive, Simple and Transient Time forecasting approach, FAST-T. This comprises a set of data structures, a mathematical kernel, and distributed computing and network infrastructure designed to provide seamless real-time operational forecasts and automatic model adjustment in case of failures in data transmission or assimilation. Among FAST-T's main features are: autonomous evaluation and detection of the most relevant information for the subsequent configuration of forecasting models; an adaptively linearized mathematical kernel, the optimal adaptive linear combination (OALC), which provides a computationally simple and efficient algorithm for real-time applications; and a meta-model catalog containing prioritized forecast models for given stream conditions. The SPHEB is at present fed by the fraction of the basin's hydrological monitoring network that has telemetric capabilities via NOAA-GOES satellites (8 stations, approximately 47%), with data availability of about 90% at one-hour intervals.
However, there is a dense network of 'conventional' hydro-meteorological stations, read manually once or twice per day, that, although not ideal in the context of a real-time system, improve model performance significantly and are therefore entered into the system by manual input. In its current configuration, the SPHEB performance objectives are fulfilled for 90% of the forecasts with lead times up to +2 days and +15 hours (using the S/σΔ predictability criterion of the Russian Hydrometeorological Center), and the average accuracy is in the range 70-99% (r² criterion). However, longer lead times are at present not satisfactory in terms of forecast accuracy.
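The abstract does not spell out the OALC algorithm, but one minimal stand-in for an optimal linear combination is least-squares weighting of member forecasts against observations. The member models, their skill levels, and the data below are all invented for illustration:

```python
import numpy as np

def combine_weights(members, observed):
    """Least-squares weights for a linear combination of member forecasts
    (a minimal stand-in for an optimal linear combination scheme)."""
    X = np.column_stack(members)
    w, *_ = np.linalg.lstsq(X, observed, rcond=None)
    return w

rng = np.random.default_rng(1)
truth = rng.normal(size=100)                  # observed discharges (synthetic)
m1 = truth + rng.normal(scale=0.5, size=100)  # two hypothetical member models
m2 = truth + rng.normal(scale=0.8, size=100)

w = combine_weights([m1, m2], truth)
combined = w @ np.vstack([m1, m2])
mse = lambda f: np.mean((f - truth) ** 2)
print(mse(combined) <= mse(m1))               # True: LS cannot do worse in-sample
```

An adaptive version would re-estimate the weights recursively as new observations arrive, which is closer to what a real-time kernel needs.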
NASA Astrophysics Data System (ADS)
Zheng, Minghua
Cool-season extratropical cyclones near the U.S. East Coast often have significant impacts on the safety, health, environment, and economy of this most densely populated region. Hence it is of vital importance for numerical weather prediction (NWP) to forecast these high-impact winter storm events as accurately as possible, including in the medium range. Ensemble forecasts are appealing to operational forecasters for such events because they provide an envelope of likely solutions to serve user communities. However, it is generally accepted that ensemble outputs are not used efficiently in NWS operations, mainly due to the lack of simple, quantitative tools for communicating forecast uncertainties and of ensemble verification for assessing model errors and biases. Ensemble sensitivity analysis (ESA), which employs linear correlation and regression between a chosen forecast metric and the forecast state vector, can be used to analyze the development of forecast uncertainty for both short- and medium-range forecasts. Application of ESA to a high-impact winter storm in December 2010 demonstrated that the sensitivity signals based on different forecast metrics are robust. In particular, ESA based on the leading two EOF PCs can separate the sensitive regions associated with cyclone intensity and track uncertainties, respectively. The sensitivity signals were verified using leave-one-out cross validation (LOOCV) based on a multi-model ensemble from CMC, ECMWF, and NCEP. A climatology of ensemble sensitivities for the leading two EOF PCs, based on 3-day and 6-day forecasts of historical cyclone cases, is presented. It was found that the EOF1 pattern often represents intensity variations, while the EOF2 pattern represents track variations along the west-southwest to east-northeast direction.
For PC1, the upper-level trough associated with the East Coast cyclone and its downstream ridge are important to the forecast uncertainty in cyclone strength. Initial differences in forecasting the ridge along the west coast of North America have the greatest impact on the EOF1 pattern. For PC2, it was shown that the shift of the tri-polar structure is most significantly related to the cyclone track forecasts. The EOF/fuzzy clustering tool was applied to diagnose the scenarios in operational ensemble forecasts of East Coast winter storms. It was shown that the clustering method can efficiently separate the forecast scenarios associated with East Coast storms based on the 90-member multi-model ensemble. A scenario-based ensemble verification method has been proposed and applied to examine the capability of different EPSs to capture the analysis scenario for historical East Coast cyclone cases at lead times of 1-9 days. The results suggest that the NCEP model performs better in short-range forecasts in capturing the analysis scenario, although it is under-dispersed. The ECMWF ensemble shows the best performance in the medium range. The CMC model shows the smallest percentage of members in the analysis group and a relatively high missing rate, suggesting that it is less reliable in capturing the analysis scenario compared with the other two EPSs. A combination of the NCEP and CMC models was found to reduce the missing rate and improve the error-spread skill in medium- to extended-range forecasts. Based on the orthogonality of the EOF patterns, the model errors for 1-6-day forecasts were decomposed onto the leading two EOF patterns. The error decomposition shows that the NCEP model tends to represent both the EOF1 and EOF2 patterns better at days 1-3, with smaller intensity and displacement errors, while the ECMWF model has the smallest errors in both patterns at days 4-6.
We have also found that East Coast cyclones in the ECMWF forecast tend to lie to the southwest of those in the other two models in terms of the EOF2 pattern, which is associated with the southwest-northeast shifting of the cyclone. This result suggests that the ECMWF model may have a tendency toward closer-to-shore solutions in forecasting East Coast winter storms. The downstream impacts of Rossby wave packets (RWPs) on the predictability of winter storms are investigated to explore the source of ensemble uncertainties. The composited RWP amplitude (RWPA) anomalies show enhanced RWPs propagating across the Pacific in both large-error and large-spread cases over the verification regions. There are also indications that the errors may propagate at a speed comparable with the group velocity of RWPs. Based on the composite results as well as our observations of the operational daily RWPA, a conceptual model of error/uncertainty development associated with RWPs has been proposed to serve as a practical tool for understanding the evolution of forecast errors and uncertainties associated with coherent RWPs originating upstream, as far away as the western Pacific. (Abstract shortened by ProQuest.)
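Ensemble sensitivity analysis, as used above, is a member-wise linear regression of a scalar forecast metric onto the state vector. A small sketch with a synthetic 90-member ensemble follows; the grid size, metric, and noise level are invented:

```python
import numpy as np

def ensemble_sensitivity(J, X):
    """Ensemble sensitivity: least-squares slope of a scalar forecast
    metric J with respect to each state variable, across members."""
    Ja = J - J.mean()
    Xa = X - X.mean(axis=0)
    return (Xa * Ja[:, None]).sum(axis=0) / (Xa ** 2).sum(axis=0)

rng = np.random.default_rng(3)
n_members, n_grid = 90, 5                  # e.g. a 90-member multi-model ensemble
X = rng.normal(size=(n_members, n_grid))   # state perturbations (synthetic)
J = 2.0 * X[:, 1] + rng.normal(scale=0.1, size=n_members)  # metric tied to x1
s = ensemble_sensitivity(J, X)
print(np.argmax(np.abs(s)))                # variable 1 dominates
```

In practice J would be something like a cyclone-metric EOF PC and X an initial-condition field, with statistical significance testing before the slopes are interpreted.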
Forecasting hotspots in East Kutai, Kutai Kartanegara, and West Kutai as early warning information
NASA Astrophysics Data System (ADS)
Wahyuningsih, S.; Goejantoro, R.; Rizki, N. A.
2018-04-01
The aims of this research are to model hotspots and forecast 2017 hotspots in East Kutai, Kutai Kartanegara, and West Kutai. The methods used in this research were Holt's exponential smoothing, Holt's additive damped trend method, Holt-Winters' additive method, the additive decomposition method, the multiplicative decomposition method, the Loess decomposition method, and the Box-Jenkins method. Among the smoothing techniques, additive decomposition performed better than Holt's exponential smoothing. The hotspot models obtained with the Box-Jenkins method were ARIMA(1,1,0), ARIMA(0,2,1), and ARIMA(0,1,0). Comparing the results of all methods used in this research on the basis of Root Mean Squared Error (RMSE) shows that the Loess decomposition method is the best time series model, because it has the smallest RMSE. The Loess decomposition model was therefore used to forecast the number of hotspots. The forecasting results indicate that hotspots tend to increase at the end of 2017 in Kutai Kartanegara and West Kutai, but remain stationary in East Kutai.
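Comparing candidate methods by RMSE, as done in the paper, can be sketched with a hand-rolled Holt's linear-trend smoother against a last-value benchmark. The hotspot counts and smoothing constants below are invented, and only one of the paper's seven methods is shown:

```python
import math

def holt_forecast(y, alpha=0.5, beta=0.3):
    """One-step-ahead forecasts from Holt's linear trend method."""
    level, trend = y[0], y[1] - y[0]
    preds = [y[0]]
    for t in range(1, len(y)):
        preds.append(level + trend)          # forecast made at t-1 for time t
        new_level = alpha * y[t] + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
    return preds

def rmse(y, preds):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, preds)) / len(y))

# Synthetic monthly hotspot counts with an upward trend (illustrative only)
y = [10, 12, 15, 14, 18, 21, 20, 24, 27, 26, 30, 33]
naive = [y[0]] + y[:-1]                      # last-value benchmark
print(rmse(y, holt_forecast(y)), rmse(y, naive))  # Holt beats the benchmark here
```

The same loop-over-methods, pick-smallest-RMSE logic extends to the decomposition and ARIMA candidates.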
Nonlinear Prediction Model for Hydrologic Time Series Based on Wavelet Decomposition
NASA Astrophysics Data System (ADS)
Kwon, H.; Khalil, A.; Brown, C.; Lall, U.; Ahn, H.; Moon, Y.
2005-12-01
Traditionally, forecasting and characterization of hydrologic systems have been performed using many techniques. Stochastic linear methods such as AR and ARIMA, and nonlinear ones such as tools based on statistical learning theory, have been used extensively. The difficulty common to all methods is determining the information and predictors sufficient and necessary for a successful prediction. Relationships between hydrologic variables are often highly nonlinear and interrelated across temporal scales. A new hybrid approach is proposed for the simulation of hydrologic time series, combining the wavelet transform with a nonlinear model and thus drawing on the merits of both. The wavelet transform is adopted to decompose a hydrologic nonlinear process into a set of mono-component signals, which are then simulated by the nonlinear model. The hybrid methodology is formulated so as to improve the accuracy of long-term forecasting. The proposed hybrid model yields much better results in terms of capturing and reproducing the time-frequency properties of the system at hand, and prediction results are promising when compared to traditional univariate time series models. An application demonstrating the plausibility of the proposed methodology is provided, and the results show that the wavelet-based time series model can simulate and forecast hydrologic variables reasonably well. This will ultimately serve the purpose of integrated water resources planning and management.
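The hybrid idea is decompose-then-model: split the series into components and forecast each separately before recombining. The sketch below substitutes a moving average for the wavelet transform and an AR(1) fit for the nonlinear component models, purely to show the recombination step; all data are synthetic:

```python
import numpy as np

def decompose(y, window=5):
    """Split a series into a smoothed component and a residual; a moving
    average stands in here for one level of a wavelet decomposition."""
    kernel = np.ones(window) / window
    smooth = np.convolve(y, kernel, mode="same")
    return smooth, y - smooth

def ar1_forecast(x):
    """One-step forecast from an AR(1) model fitted by least squares
    (a stand-in for the hybrid's nonlinear component models)."""
    b, a = np.polyfit(x[:-1], x[1:], 1)
    return b * x[-1] + a

t = np.arange(200)
y = np.sin(2 * np.pi * t / 24) + 0.1 * np.random.default_rng(0).standard_normal(200)
smooth, resid = decompose(y)
# Model each component separately, then recombine the forecasts
print(ar1_forecast(smooth) + ar1_forecast(resid))
```

A faithful implementation would use an actual wavelet transform (e.g., a multi-level discrete decomposition) and a nonlinear learner per component, but the pipeline shape is the same.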
Uncertainty forecasts improve weather-related decisions and attenuate the effects of forecast error.
Joslyn, Susan L; LeClerc, Jared E
2012-03-01
Although uncertainty is inherent in weather forecasts, explicit numeric uncertainty estimates are rarely included in public forecasts for fear that they will be misunderstood. Of particular concern are situations in which precautionary action is required at low probabilities, often the case with severe events. At present, a categorical weather warning system is used. The work reported here tested the relative benefits of several forecast formats, comparing decisions made with and without uncertainty forecasts. In three experiments, participants assumed the role of a manager of a road maintenance company in charge of deciding whether to pay to salt the roads and avoid a potential penalty associated with icy conditions. Participants used overnight low temperature forecasts accompanied in some conditions by uncertainty estimates and in others by decision advice comparable to categorical warnings. Results suggested that uncertainty information improved decision quality overall and increased trust in the forecast. Participants with uncertainty forecasts took appropriate precautionary action and withheld unnecessary action more often than did participants using deterministic forecasts. When error in the forecast increased, participants with conventional forecasts were reluctant to act. However, this effect was attenuated by uncertainty forecasts. Providing categorical decision advice alone did not improve decisions. However, combining decision advice with uncertainty estimates resulted in the best performance overall. The results reported here have important implications for the development of forecast formats to increase compliance with severe weather warnings as well as other domains in which one must act in the face of uncertainty. PsycINFO Database Record (c) 2012 APA, all rights reserved.
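The road-salting task is an instance of the classic cost-loss decision problem, which is exactly what a numeric uncertainty estimate enables. A sketch with invented costs (the experiments' actual payoffs are only described qualitatively in the abstract):

```python
def should_act(p_event, cost, loss):
    """Cost-loss rule: take protective action when the expected loss
    avoided exceeds the cost of acting."""
    return p_event * loss > cost

# Invented numbers: salting costs 250, an icy-road penalty costs 1000,
# so the break-even freeze probability is 250/1000 = 0.25
print(should_act(0.30, 250, 1000))  # True: salt the roads
print(should_act(0.20, 250, 1000))  # False: withhold action
```

A deterministic forecast collapses p_event to 0 or 1, which is why participants without uncertainty estimates both over-acted and under-acted in the experiments.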
Using Landslide Failure Forecast Models in Near Real Time: the Mt. de La Saxe case-study
NASA Astrophysics Data System (ADS)
Manconi, Andrea; Giordan, Daniele
2014-05-01
Forecasting the occurrence of landslide phenomena in space and time is a major scientific challenge. The approaches used to forecast landslides depend mainly on the spatial scale analyzed (regional vs. local), the temporal range of the forecast (long- vs. short-term), the triggering factor, and the landslide typology considered. Focusing on short-term forecast methods for large, deep-seated slope instabilities, the potential time of failure (ToF) can be estimated by studying the evolution of landslide deformation over time (i.e., the strain rate), given that, under constant stress conditions, landslide materials follow a creep mechanism before reaching rupture. In recent decades, different procedures have been proposed to estimate the ToF using simplified empirical and/or graphical methods applied to time series of deformation data. Fukuzono (1985) proposed a failure forecast method based on large-scale laboratory experiments aimed at observing the kinematic evolution of a rain-induced landslide. This approach, also known as the inverse-velocity method, considers the evolution over time of the inverse of the surface velocity (v) as an indicator of the ToF, assuming that failure approaches as 1/v tends to zero. Here we present a method aimed at forecasting landslide failure from near-real-time monitoring data. Starting from the inverse-velocity theory, we analyze landslide surface displacements over different temporal windows and then apply straightforward statistical methods to obtain confidence intervals on the time of failure. Our results can be relevant to supporting the management of early warning systems during landslide emergencies, including when predefined displacement and/or velocity thresholds are exceeded.
In addition, our statistical approach to defining confidence intervals and forecast reliability can be applied to other failure forecast methods. We applied the approach presented here for the first time in near real time during the emergency caused by the reactivation of the La Saxe rockslide, a large mass movement threatening the population of Courmayeur, northern Italy, and the important European route E25. We show how the application of simplified but robust forecast models can be a convenient way to manage and support early warning systems during critical situations. References: Fukuzono T. (1985), A New Method for Predicting the Failure Time of a Slope, Proc. IVth International Conference and Field Workshop on Landslides, Tokyo.
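In its linear form, the inverse-velocity method fits a straight line to 1/v over time and extrapolates to zero. A sketch with synthetic velocities constructed to fail at t = 10 is given below; the paper's statistical confidence intervals, obtained by fitting over multiple temporal windows, are not reproduced:

```python
import numpy as np

def fukuzono_tof(t, v):
    """Estimate time of failure by linear extrapolation of 1/v to zero
    (Fukuzono's inverse-velocity method, linear case)."""
    inv_v = 1.0 / np.asarray(v, dtype=float)
    slope, intercept = np.polyfit(t, inv_v, 1)  # fit 1/v = slope*t + intercept
    return -intercept / slope                   # t at which 1/v reaches zero

# Synthetic accelerating surface velocities (e.g. mm/day), hypothetical values
t = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
v = 1.0 / (10.0 - t)        # true failure at t = 10 by construction
print(fukuzono_tof(t, v))   # ≈ 10
```

Repeating the fit over sliding windows and summarizing the spread of the resulting ToF estimates is one simple way to attach a confidence interval, in the spirit of the approach above.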
Forecasting extinction risk with nonstationary matrix models.
Gotelli, Nicholas J; Ellison, Aaron M
2006-02-01
Matrix population growth models are standard tools for forecasting population change and for managing rare species, but they are less useful for predicting extinction risk in the face of changing environmental conditions. Deterministic models provide point estimates of lambda, the finite rate of increase, as well as measures of matrix sensitivity and elasticity. Stationary matrix models can be used to estimate extinction risk in a variable environment, but they assume that the matrix elements are randomly sampled from a stationary (i.e., non-changing) distribution. Here we outline a method for using nonstationary matrix models to construct realistic forecasts of population fluctuation in changing environments. Our method requires three pieces of data: (1) field estimates of transition matrix elements, (2) experimental data on the demographic responses of populations to altered environmental conditions, and (3) forecasting data on environmental drivers. These three pieces of data are combined to generate a series of sequential transition matrices that emulate a pattern of long-term change in environmental drivers. Realistic estimates of population persistence and extinction risk can be derived from stochastic permutations of such a model. We illustrate the steps of this analysis with data from two populations of Sarracenia purpurea growing in northern New England. Sarracenia purpurea is a perennial carnivorous plant that is potentially at risk of local extinction because of increased nitrogen deposition. Long-term monitoring records or models of environmental change can be used to generate time series of driver variables under different scenarios of changing environments. Both manipulative and natural experiments can be used to construct a linking function that describes how matrix parameters change as a function of the environmental driver. 
This synthetic modeling approach provides quantitative estimates of extinction probability that have an explicit mechanistic basis.
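A nonstationary matrix projection simply applies a different transition matrix each year, with matrix elements driven by the environmental forecast. A two-stage sketch follows; the matrix entries and the driver-fecundity link are invented, not the Sarracenia estimates:

```python
import numpy as np

def project(n0, matrices):
    """Project a stage-structured population through a sequence of
    year-specific transition matrices (the nonstationary case)."""
    n = np.asarray(n0, dtype=float)
    sizes = [n.sum()]
    for A in matrices:
        n = A @ n
        sizes.append(n.sum())
    return sizes

def matrix_for_year(driver):
    """Hypothetical 2-stage (juvenile, adult) matrix whose fecundity
    declines with a rising environmental driver (invented linking function)."""
    fecundity = max(0.0, 1.2 - 0.5 * driver)
    return np.array([[0.0, fecundity],
                     [0.4, 0.85]])

drivers = np.linspace(0.0, 2.0, 50)   # e.g. a rising N-deposition index
sizes = project([100.0, 50.0], [matrix_for_year(d) for d in drivers])
print(round(sizes[-1], 1))            # population size after 50 years
```

Stochastic permutations of the matrix sequence, as the abstract describes, would turn this single trajectory into a distribution from which extinction probability can be read off.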
Lee, Jared A.; Hacker, Joshua P.; Monache, Luca Delle; ...
2016-08-03
A current barrier to greater deployment of offshore wind turbines is the poor quality of numerical weather prediction model wind and turbulence forecasts over the open ocean. The bulk of development for atmospheric boundary layer (ABL) parameterization schemes has focused on land, partly due to a scarcity of observations over the ocean. The 100-m FINO1 tower in the North Sea is one of the few sources worldwide of atmospheric profile observations from the sea surface to turbine hub height. These observations are crucial to developing a better understanding and modeling of physical processes in the marine ABL. In this paper we use the WRF single-column model (SCM), coupled with an ensemble Kalman filter from the Data Assimilation Research Testbed (DART), to create 100-member ensembles at the FINO1 location. The goal of this study is to determine the extent to which model parameter estimation can improve offshore wind forecasts. Combining two datasets that provide lateral forcing for the SCM and two methods for determining z0, the time-varying sea-surface roughness length, we conduct four WRF-SCM/DART experiments over the October-December 2006 period. The two methods for determining z0 are the default Fairall-adjusted Charnock formulation in WRF, and using parameter estimation techniques to estimate z0 in DART. Using DART to estimate z0 is found to reduce 1-h forecast errors of wind speed over the Charnock-Fairall z0 ensembles by 4%-22%. However, parameter estimation of z0 does not simultaneously reduce turbulent flux forecast errors, indicating limitations of this approach and the need for new marine ABL parameterizations.
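For context, the basic Charnock relation behind one of the two z0 formulations is z0 = α u*²/g. WRF's Fairall adjustment adds a viscous term not shown here, and the α value below is a commonly used default, not taken from the paper:

```python
def charnock_z0(u_star, alpha=0.0185, g=9.81):
    """Basic Charnock relation for sea-surface roughness length (m):
    roughness grows with the square of the friction velocity u*."""
    return alpha * u_star ** 2 / g

# For a typical open-ocean friction velocity of 0.3 m/s:
print(charnock_z0(0.3))  # ~1.7e-4 m
```

Treating α (or z0 itself) as an uncertain parameter to be updated by the ensemble filter is the essence of the parameter-estimation experiments described above.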
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madankan, R.; Pouget, S.; Singla, P., E-mail: psingla@buffalo.edu
Volcanic ash advisory centers are charged with forecasting the movement of volcanic ash plumes, for aviation, health and safety preparation. Deterministic mathematical equations model the advection and dispersion of these plumes. However initial plume conditions – height, profile of particle location, volcanic vent parameters – are known only approximately at best, and other features of the governing system such as the windfield are stochastic. These uncertainties make forecasting plume motion difficult. As a result of these uncertainties, ash advisories based on a deterministic approach tend to be conservative, and many times over- or underestimate the extent of a plume. This paper presents an end-to-end framework for generating a probabilistic approach to ash plume forecasting. This framework uses an ensemble of solutions, guided by the Conjugate Unscented Transform (CUT) method for evaluating expectation integrals. This ensemble is used to construct a polynomial chaos expansion that can be sampled cheaply, to provide a probabilistic model forecast. The CUT method is then combined with a minimum variance condition, to provide a full posterior PDF of the uncertain source parameters, based on observed satellite imagery. The April 2010 eruption of the Eyjafjallajökull volcano in Iceland is employed as a test example. The puff advection/dispersion model is used to hindcast the motion of the ash plume through time, concentrating on the period 14-16 April 2010. Variability in the height and particle loading of that eruption is introduced through a volcano column model called bent. Output uncertainty due to the assumed uncertain input parameter probability distributions, and a probabilistic spatial-temporal estimate of ash presence are computed.
NASA Technical Reports Server (NTRS)
Raymond, William H.; Olson, William S.; Callan, Geary
1990-01-01
The focus of this part of the investigation is to find one or more general modeling techniques that help reduce the time taken by numerical forecast models to initiate, or spin up, precipitation processes and that enhance storm intensity. If the conventional database could explain the atmospheric mesoscale flow in detail, much of the problem would be eliminated. But the database is primarily synoptic-scale, so a solution must be sought either in nonconventional data, in methods of initializing mesoscale circulations, or in ways of retaining between forecasts the model-generated mesoscale dynamics and precipitation fields. All three approaches are investigated. The initialization and assimilation of explicit cloud and rainwater quantities computed from conservation equations in a mesoscale regional model are examined. The physical processes include condensation, evaporation, autoconversion, accretion, and the removal of rainwater by fallout. How to initialize the explicit liquid water calculations in numerical models and how to retain information about precipitation processes during the 4-D assimilation cycle are important issues that are addressed. The explicit cloud calculations were purposely kept simple so that different initialization techniques could be easily and economically tested. Precipitation spin-up processes associated with three different types of weather phenomena are examined. Our findings show that diabatic initialization, alone or in combination with a new diabatic forcing procedure, works effectively to enhance the spin-up of precipitation in a mesoscale numerical weather prediction forecast. The retention of cloud and rain water during the analysis phase of the 4-D data assimilation procedure is also shown to be valuable. Without detailed observations, the vertical placement of the diabatic heating remains a critical problem.
Bias Adjusted Precipitation Threat Scores
NASA Astrophysics Data System (ADS)
Mesinger, F.
2008-04-01
Among the wide variety of performance measures available for assessing the skill of deterministic precipitation forecasts, the equitable threat score (ETS) may well be the one used most frequently, typically in conjunction with the bias score. However, apart from its mathematical definition, the meaning of the ETS is not clear. It has been pointed out (Mason, 1989; Hamill, 1999) that forecasts with a larger bias tend to have a higher ETS. Even so, the present author has not seen this accounted for in any of the numerous recent papers that use the ETS along with bias "as a measure of forecast accuracy". The present author earlier proposed a method (Mesinger and Brill, the so-called dH/dF method) to adjust the threat score (TS) or the ETS to the values corresponding to unit bias, so as to show the model's or forecaster's accuracy in placing precipitation. A serious deficiency has since been noted in the dH/dF method: the hypothetical function it uses to interpolate or extrapolate the observed number of hits to unit bias can yield more hits than forecasts as the forecast area tends to zero. Another method is proposed here, based on the assumption that the increase in hits per unit increase in false alarms is proportional to the yet-unhit area. This new method removes the deficiency of the dH/dF method. Examples of its performance for 12 months of forecasts by three NCEP operational models are given.
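For concreteness, the scores discussed above can be computed from the four cells of the 2x2 contingency table (hits, false alarms, misses, correct negatives). The sketch below shows only the standard bias, TS and ETS definitions, not the proposed unhit-area adjustment:

```python
def precip_scores(hits, false_alarms, misses, correct_negatives):
    """Bias, threat score (TS) and equitable threat score (ETS)
    from a 2x2 contingency table of precipitation forecasts."""
    forecast = hits + false_alarms                # events forecast
    observed = hits + misses                      # events observed
    total = hits + false_alarms + misses + correct_negatives
    bias = forecast / observed                    # frequency bias (1 = unbiased)
    ts = hits / (hits + false_alarms + misses)
    hits_random = forecast * observed / total     # hits expected by chance
    ets = (hits - hits_random) / (hits + false_alarms + misses - hits_random)
    return bias, ts, ets
```

A forecast with larger bias inflates both `forecast` and, usually, `hits`, which is exactly the effect the proposed adjustment is meant to remove.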
Survey of air cargo forecasting techniques
NASA Technical Reports Server (NTRS)
Kuhlthau, A. R.; Vemuri, R. S.
1978-01-01
Forecasting techniques currently in use for estimating or predicting the demand for air cargo in various markets are discussed, with emphasis on the fundamentals of the different forecasting approaches. References to specific studies are cited where appropriate. The effectiveness of current methods is evaluated and several prospects for future activities or approaches are suggested. Appendices contain summary-type analyses of about 50 specific publications on forecasting, and selected bibliographies on air cargo forecasting, air passenger demand forecasting, and general demand and modal-split modeling.
NASA Astrophysics Data System (ADS)
Papacharalampous, Georgia; Tyralis, Hristos; Koutsoyiannis, Demetris
2017-04-01
Machine learning (ML) is considered a promising approach to forecasting hydrological processes. We conduct a comparison between several stochastic and ML point estimation methods by performing large-scale computational experiments based on simulations. The purpose is to provide generalized results, while the respective comparisons in the literature are usually based on case studies. The stochastic methods used include simple methods and models from the frequently used families of Autoregressive Moving Average (ARMA), Autoregressive Fractionally Integrated Moving Average (ARFIMA) and Exponential Smoothing models. The ML methods used are Random Forests (RF), Support Vector Machines (SVM) and Neural Networks (NN). The comparison refers to the multi-step-ahead forecasting properties of the methods. A total of 20 methods are used, 9 of which are ML methods. Twelve simulation experiments are performed, each using 2,000 simulated time series of 310 observations. The time series are simulated using stochastic processes from the families of ARMA and ARFIMA models. Each time series is split into a fitting set (first 300 observations) and a testing set (last 10 observations). The comparative assessment of the methods is based on 18 metrics that quantify the methods' performance according to several criteria related to the accurate forecasting of the testing set, the capturing of its variation and the correlation between the testing and forecasted values. The most important outcome of this study is that there is no uniformly better or worse method. However, there are methods that are regularly better or worse than others with respect to specific metrics. It appears that, although a general ranking of the methods is not possible, their classification based on their similar or contrasting performance in the various metrics is possible to some extent.
Another important conclusion is that more sophisticated methods do not necessarily provide better forecasts than simpler methods. Notably, the ML methods do not differ dramatically from the stochastic methods, and the NN, RF and SVM algorithms used in this study offer potentially very good performance in terms of accuracy. Although this study focuses on hydrological processes, the results are of general scientific interest. Another important point is the use of several methods and metrics: using fewer methods and fewer metrics would have led to a very different overall picture, particularly if those fewer metrics corresponded to fewer criteria. For this reason, we consider the proposed methodology appropriate for the evaluation of forecasting methods.
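The experimental design described above can be illustrated at toy scale: one simulated AR(1) series with a 300/10 fitting/testing split, and two simple methods compared by RMSE (one process, two methods and one metric, versus the study's twelve experiments, twenty methods and eighteen metrics):

```python
import math
import random

random.seed(42)

# Simulate one AR(1) series of 310 values: x_t = 0.7 * x_{t-1} + noise.
phi = 0.7
x = [0.0]
for _ in range(309):
    x.append(phi * x[-1] + random.gauss(0, 1))

fit, test = x[:300], x[300:]            # first 300 to fit, last 10 to assess

# Method 1: "naive" -- repeat the last fitted value for all 10 steps.
naive = [fit[-1]] * 10

# Method 2: AR(1) with phi estimated by least squares, iterated forward.
num = sum(a * b for a, b in zip(fit[:-1], fit[1:]))
den = sum(a * a for a in fit[:-1])
phi_hat = num / den
ar1, last = [], fit[-1]
for _ in range(10):
    last = phi_hat * last
    ar1.append(last)

def rmse(pred, obs):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

print("naive RMSE:", rmse(naive, test))
print("AR(1) RMSE:", rmse(ar1, test))
```

Repeating this over thousands of simulated series, many methods and many metrics is what turns such a toy comparison into the generalized assessment the study aims for.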
Verification of Space Weather Forecasts using Terrestrial Weather Approaches
NASA Astrophysics Data System (ADS)
Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.
2015-12-01
The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the rank probability skill score, and comparison of forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete-time Markov chains to assess and improve the performance of our geomagnetic storm forecasts.
We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help MOSWOC forecasters view verification results in near real-time; plans to objectively assess flare forecasts under the EU Horizon 2020 FLARECAST project; and summarise ISES efforts to achieve consensus on verification.
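As a sketch of the probabilistic verification mentioned above, the discrete ranked probability score and a skill score against a climatological benchmark can be computed as follows; the three ordered categories are illustrative, not MOSWOC's actual storm levels:

```python
def rps(probs, observed_cat):
    """Ranked probability score for one forecast: mean squared difference
    between cumulative forecast probabilities and the cumulative
    observation indicator (0 until the observed category, then 1)."""
    cum_f, cum_o, score = 0.0, 0.0, 0.0
    for k, p in enumerate(probs):
        cum_f += p
        if k == observed_cat:
            cum_o = 1.0
        score += (cum_f - cum_o) ** 2
    return score / (len(probs) - 1)

def rpss(forecasts, climatology, observations):
    """Skill relative to a fixed climatological distribution
    (1 = perfect, 0 = no better than climatology)."""
    rps_f = sum(rps(f, o) for f, o in zip(forecasts, observations))
    rps_c = sum(rps(climatology, o) for o in observations)
    return 1.0 - rps_f / rps_c
```

Because the score works on cumulative probabilities, a forecast concentrated far from the observed category is penalised more than one in an adjacent category, which suits ordered storm levels.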
Liu, Fengchen; Porco, Travis C.; Amza, Abdou; Kadri, Boubacar; Nassirou, Baido; West, Sheila K.; Bailey, Robin L.; Keenan, Jeremy D.; Solomon, Anthony W.; Emerson, Paul M.; Gambhir, Manoj; Lietman, Thomas M.
2015-01-01
Background: Trachoma programs rely on guidelines made in large part using expert opinion of what will happen with and without intervention. Large community-randomized trials offer an opportunity to actually compare forecasting methods in a masked fashion. Methods: The Program for the Rapid Elimination of Trachoma trials estimated longitudinal prevalence of ocular chlamydial infection from 24 communities treated annually with mass azithromycin. Given antibiotic coverage and biannual assessments from baseline through 30 months, forecasts of the prevalence of infection in each of the 24 communities at 36 months were made by three methods: the sum of 15 experts' opinion, statistical regression of the square-root-transformed prevalence, and a stochastic hidden Markov model of infection transmission (Susceptible-Infectious-Susceptible, or SIS model). All forecasters were masked to the 36-month results and to the other forecasts. Forecasts of the 24 communities were scored by the likelihood of the observed results and compared using Wilcoxon's signed-rank statistic. Findings: Regression and SIS hidden Markov models had significantly better likelihood than community expert opinion (p = 0.004 and p = 0.01, respectively). All forecasts scored better when perturbed to decrease Fisher's information. Each individual expert's forecast was poorer than the sum of experts. Interpretation: Regression and SIS models performed significantly better than expert opinion, although all forecasts were overly confident. Further model refinements may score better, although they would need to be tested and compared in new masked studies. Construction of guidelines that rely on forecasting future prevalence could consider the use of mathematical and statistical models. PMID:26302380
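The transmission model named above is of the SIS type; a minimal discrete-time stochastic (chain-binomial) SIS simulation looks as follows, with community size, rates and horizon chosen purely for illustration rather than taken from the trial:

```python
import math
import random

def simulate_sis(n, infected0, beta, gamma, steps, rng):
    """Chain-binomial SIS model: per step, each susceptible becomes
    infected with probability 1 - exp(-beta * I / n); each infectious
    individual recovers (returning to susceptible) with probability gamma."""
    i = infected0
    path = [i / n]                       # prevalence trajectory
    for _ in range(steps):
        p_inf = 1.0 - math.exp(-beta * i / n)
        new_inf = sum(rng.random() < p_inf for _ in range(n - i))
        recoveries = sum(rng.random() < gamma for _ in range(i))
        i += new_inf - recoveries
        path.append(i / n)
    return path

rng = random.Random(1)
prevalence = simulate_sis(n=500, infected0=100, beta=0.3, gamma=0.2,
                          steps=36, rng=rng)
```

Fitting such a model to observed prevalence (the hidden-Markov step) and running it forward is what produces the forecast that was scored against expert opinion.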
Yao, Yibin; Shan, Lulu; Zhao, Qingzhi
2017-09-29
Global Navigation Satellite System (GNSS) observations can effectively retrieve precipitable water vapor (PWV) with high precision and high temporal resolution. GNSS-derived PWV can be used to reflect water vapor variation during strong convective weather. By studying the relationship between time-varying PWV and rainfall, it can be found that PWV content increases sharply before rain. Therefore, a short-term rainfall forecasting method is proposed based on GNSS-derived PWV. The method is validated using hourly GNSS-PWV data from the Zhejiang Continuously Operating Reference Station (CORS) network for the period 1 September 2014 to 31 August 2015 and the corresponding hourly rainfall information. The results show that the forecast correct rate can reach about 80%, while the false alarm rate is about 66%. Compared with the results of previous studies, the correct rate is improved by about 7%, and the false alarm rate is comparable. The method is also applied to three further rainfall events of different regions, durations, and types. The results show that the method has good applicability and high accuracy and can be used for rainfall forecasting; in future work it could be combined with traditional weather forecasting techniques to further improve forecast accuracy.
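The core idea, raising an alarm when PWV jumps sharply, and the two verification rates reported above can be sketched as follows; the 2 mm threshold and the toy series are illustrative, not the paper's calibrated values:

```python
def forecast_rain(pwv, threshold):
    """Flag hour t as 'rain expected' when PWV rose by more than
    `threshold` (mm) over the previous hour."""
    return [pwv[t] - pwv[t - 1] > threshold for t in range(1, len(pwv))]

def verify(alarms, rain):
    """Correct rate = detected rain events / all rain events;
    false alarm rate = false alarms / all alarms issued."""
    hits = sum(a and r for a, r in zip(alarms, rain))
    events = sum(rain)
    alarms_total = sum(alarms)
    correct_rate = hits / events if events else 0.0
    false_alarm_rate = (alarms_total - hits) / alarms_total if alarms_total else 0.0
    return correct_rate, false_alarm_rate
```

Note that the two rates trade off through the threshold: lowering it catches more rain events but issues more false alarms, which is consistent with the high correct rate and high false alarm rate reported.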
Advanced, Cost-Based Indices for Forecasting the Generation of Photovoltaic Power
NASA Astrophysics Data System (ADS)
Bracale, Antonio; Carpinelli, Guido; Di Fazio, Annarita; Khormali, Shahab
2014-01-01
Distribution systems are undergoing significant changes as they evolve toward the grids of the future, known as smart grids (SGs). The purpose of SGs is to facilitate large-scale penetration of distributed generation using renewable energy sources (RESs), encourage the efficient use of energy, reduce system losses, and improve power quality. Photovoltaic (PV) systems have become one of the most promising RESs due to the expected cost reduction and the increasing efficiency of PV panels and interfacing converters. The ability to forecast power production accurately and reliably is of primary importance for the appropriate management of an SG and for decision making in the energy market. Several forecasting methods have been proposed, and many indices have been used to quantify the accuracy of PV power-production forecasts. Unfortunately, the indices in use have deficiencies and usually do not directly account for the economic consequences of forecasting errors in the framework of liberalized electricity markets. In this paper, advanced, more accurate indices are proposed that directly account for the economic consequences of forecasting errors. The proposed indices are also compared with the most frequently used indices in order to demonstrate their different, improved capability. The comparisons are based on results obtained using a forecasting method based on an artificial neural network, chosen because it is deemed one of the most promising methods available for forecasting PV power. Numerical applications considering an actual PV plant are also presented to provide evidence of the forecasting performance of all the indices considered.
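One simple way to make an error index "cost-based" in the sense discussed above is to price under- and over-forecasting asymmetrically, as in imbalance settlement. The sketch below is purely illustrative (the prices are invented and this is not one of the paper's proposed indices):

```python
def economic_loss(forecast_kw, actual_kw, short_price, long_price):
    """Total cost of forecast errors over a settlement period:
    energy sold but not produced is bought back at `short_price`;
    surplus production forgoes `long_price` of margin (per kWh)."""
    cost = 0.0
    for f, a in zip(forecast_kw, actual_kw):
        if f > a:                        # under-production: buy the shortfall
            cost += (f - a) * short_price
        else:                            # over-production: lost revenue margin
            cost += (a - f) * long_price
    return cost
```

Unlike a symmetric RMSE, such an index ranks two forecasts differently when their errors have the same magnitude but opposite sign, which is the economic asymmetry the paper argues standard indices miss.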
NASA Astrophysics Data System (ADS)
Zhou, Jianzhong; Zhang, Hairong; Zhang, Jianyun; Zeng, Xiaofan; Ye, Lei; Liu, Yi; Tayyab, Muhammad; Chen, Yufan
2017-07-01
Accurate flood forecasting with a long lead time can be of great value for flood prevention and utilization. This paper develops a one-way coupled hydro-meteorological modeling system, consisting of the mesoscale numerical weather model Weather Research and Forecasting (WRF) and the Chinese Xinanjiang hydrological model, to extend flood forecasting lead time in the Jinshajiang River Basin, the largest hydropower base in China. Focusing on four typical precipitation events, the study first investigated which combinations and model structures of WRF parameterization schemes are suitable for simulating precipitation in the Jinshajiang River Basin. The Xinanjiang model was then calibrated and validated to complete the hydro-meteorological system. It was found that the selection of the cloud microphysics scheme and the boundary layer scheme has a great impact on precipitation simulation, and only a proper combination of the two schemes yields accurate simulations in the Jinshajiang River Basin; with such a combination, the hydro-meteorological system can provide instructive flood forecasts with long lead times. On the whole, the one-way coupled hydro-meteorological model can be used for precipitation simulation and flood prediction in the Jinshajiang River Basin because of its relatively high precision and long lead time.
Future Research in Health Information Technology: A Review.
Hemmat, Morteza; Ayatollahi, Haleh; Maleki, Mohammad Reza; Saghafi, Fatemeh
2017-01-01
Currently, information technology is considered an important tool to improve healthcare services. To adopt the right technologies, policy makers should have adequate information about present and future advances. This study aimed to review and compare studies with a focus on the future of health information technology. This review study was completed in 2015. The databases used were Scopus, Web of Science, ProQuest, Ovid Medline, and PubMed. Keyword searches were used to identify papers and materials published between 2000 and 2015. Initially, 407 papers were obtained, and they were reduced to 11 papers at the final stage. The selected papers were described and compared in terms of country of origin, objective, methodology, and time horizon. The papers were divided into two groups: those forecasting the future of health information technology (seven papers) and those providing health information technology foresight (four papers). The results showed that the papers forecasting the future of health information technology were mostly literature reviews, and the time horizon was up to 10 years in most of these studies. In the health information technology foresight group, most of the studies used a combination of techniques, such as scenario building and Delphi methods, and had long-term objectives. To make the most of an investment and to improve planning and successful implementation of health information technology, a strategic plan for the future needs to be set. To achieve this aim, methods such as forecasting the future of health information technology and offering health information technology foresight can be applied. The forecasting method is used when the objectives are not very large, and the foresight approach is recommended when large-scale objectives are to be achieved.
In the field of health information technology, the results of foresight studies can help to establish realistic long-term expectations of the future of health information technology.
Fuzzy forecasting based on fuzzy-trend logical relationship groups.
Chen, Shyi-Ming; Wang, Nai-Yi
2010-10-01
In this paper, we present a new method to predict the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) based on fuzzy-trend logical relationship groups (FTLRGs). The proposed method divides fuzzy logical relationships into FTLRGs based on the trend of adjacent fuzzy sets appearing in the antecedents of the fuzzy logical relationships. First, we apply an automatic clustering algorithm to partition the historical data into intervals of different lengths. We then define fuzzy sets on these intervals, fuzzify the historical data into these fuzzy sets to derive fuzzy logical relationships, and divide the fuzzy logical relationships into FTLRGs for forecasting the TAIEX. Moreover, we also apply the proposed method to forecast enrollments and inventory demand, respectively. The experimental results show that the proposed method achieves higher average forecasting accuracy rates than the existing methods.
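A minimal first-order fuzzy time-series forecaster in the style that the FTLRG method refines can be sketched as follows; it uses equal-width intervals instead of the paper's automatic clustering, and plain logical-relationship groups rather than trend-based groups:

```python
def fuzzy_forecast(history):
    """First-order fuzzy time-series forecast of the next value.
    1) Partition the data range into 7 equal intervals (fuzzy sets A0..A6).
    2) Fuzzify each observation to the interval containing it.
    3) Group relationships A_i -> A_j by antecedent A_i.
    4) Forecast = mean midpoint of the consequents in the last state's group.
    """
    n_sets = 7
    lo, hi = min(history), max(history)
    width = (hi - lo) / n_sets or 1.0            # avoid zero width

    def fuzzify(v):
        return min(int((v - lo) / width), n_sets - 1)

    mids = [lo + (k + 0.5) * width for k in range(n_sets)]
    states = [fuzzify(v) for v in history]
    groups = {}
    for a, b in zip(states[:-1], states[1:]):    # logical relationships
        groups.setdefault(a, []).append(b)
    consequents = groups.get(states[-1], [states[-1]])  # persistence fallback
    return sum(mids[c] for c in consequents) / len(consequents)
```

The FTLRG refinement additionally splits each group by whether the antecedent was reached on an upward or downward trend, which this sketch omits.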
North Carolina forecasts for truck traffic
DOT National Transportation Integrated Search
2006-07-01
North Carolina has experienced significant increases in truck traffic on many of its highways. Yet, current NCDOT project-level highway traffic forecasts do not appropriately capture anticipated truck traffic growth. NCDOT methods forecast total ...
Adaptive correction of ensemble forecasts
NASA Astrophysics Data System (ADS)
Pelosi, Anna; Battista Chirico, Giovanni; Van den Bergh, Joris; Vannitsem, Stephane
2017-04-01
Forecasts from numerical weather prediction (NWP) models often suffer from both systematic and non-systematic errors. These are present in both deterministic and ensemble forecasts, and originate from various sources such as model error and subgrid variability. Statistical post-processing techniques can partly remove such errors, which is particularly important when NWP outputs concerning surface weather variables are employed for site-specific applications. Many different post-processing techniques have been developed. For deterministic forecasts, adaptive methods such as the Kalman filter are often used, which sequentially post-process the forecasts by continuously updating the correction parameters as new ground observations become available. These methods are especially valuable when long training data sets do not exist. For ensemble forecasts, well-known techniques are ensemble model output statistics (EMOS) and so-called "member-by-member" approaches (MBM). Here, we introduce a new adaptive post-processing technique for ensemble predictions. The proposed method is a sequential Kalman filtering technique that fully exploits the information content of the ensemble. One correction equation is retrieved and applied to all members; however, the parameters of the regression equations are retrieved by exploiting the second-order statistics of the forecast ensemble. We compare our new method with two other techniques: a simple method that makes use of a running bias correction of the ensemble mean, and an MBM post-processing approach that rescales the ensemble mean and spread, based on minimization of the Continuous Ranked Probability Score (CRPS). We perform a verification study for the region of Campania in southern Italy.
We use two years (2014-2015) of daily meteorological observations of 2-meter temperature and 10-meter wind speed from 18 ground-based automatic weather stations distributed across the region, comparing them with the corresponding COSMO-LEPS ensemble forecasts. Deterministic verification scores (e.g., mean absolute error, bias) and probabilistic scores (e.g., CRPS) are used to evaluate the post-processing techniques. We conclude that the new adaptive method outperforms the simpler running bias-correction. The proposed adaptive method often outperforms the MBM method in removing bias. The MBM method has the advantage of correcting the ensemble spread, although it needs more training data.
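The adaptive idea can be illustrated in its simplest form: a scalar Kalman filter that tracks the bias of the ensemble mean and updates it sequentially as observations arrive. Unlike the proposed method, this sketch does not exploit the ensemble's second-order statistics, and the noise settings below are arbitrary:

```python
class BiasKalman:
    """Sequential estimate of forecast bias: the 'measurement' at each
    step is the raw error (forecast - observation); the corrected
    forecast subtracts the current bias estimate."""

    def __init__(self, q=0.01, r=1.0):
        self.b = 0.0        # bias estimate
        self.p = 1.0        # its variance
        self.q = q          # process noise: how fast the bias may drift
        self.r = r          # noise of the error measurements

    def correct(self, forecast):
        return forecast - self.b

    def update(self, forecast, observation):
        self.p += self.q                    # predict step
        k = self.p / (self.p + self.r)      # Kalman gain
        self.b += k * ((forecast - observation) - self.b)
        self.p *= 1.0 - k

kf = BiasKalman()
for day in range(200):
    raw = 20.0 + 2.0       # raw ensemble-mean temperature with a +2 C bias
    truth = 20.0
    corrected = kf.correct(raw)
    kf.update(raw, truth)
```

Because the gain adapts, the filter needs no long training archive: it starts correcting after a handful of observations, which is the advantage the abstract highlights for adaptive methods.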
Approaches in Health Human Resource Forecasting: A Roadmap for Improvement.
Rafiei, Sima; Mohebbifar, Rafat; Hashemi, Fariba; Ezzatabadi, Mohammad Ranjbar; Farzianpour, Fereshteh
2016-09-01
Forecasting the demand and supply of health manpower in an accurate manner makes appropriate planning possible. The aim of this paper was to review approaches and methods for health manpower forecasting and consequently propose features that improve the effectiveness of this important process of health manpower planning. A literature review was conducted for studies published in English from 1990 to 2014 using the PubMed, Science Direct, ProQuest, and Google Scholar databases. Review articles, qualitative studies, and retrospective and prospective studies describing or applying various types of forecasting approaches and methods in health manpower forecasting were included in the review. The authors designed a data extraction sheet based on the study questions to collect data on the studies' references, designs, and types of forecasting approaches, whether discussed or applied, with their strengths and weaknesses. Forty studies were included in the review. Two main categories of approaches (conceptual and analytical) for health manpower forecasting were identified. Each approach had several strengths and weaknesses. As a whole, most of them faced challenges such as being static and unable to capture dynamic variables and causal relationships in manpower forecasting. They also lacked the capacity to benefit from scenario making to assist policy makers in effective decision making. An effective forecasting approach should resolve the deficits of current approaches and meet the key features found in the literature, in order to develop an open-system, dynamic, and comprehensive method suited to today's complex health care systems.
Forecasting United States heartworm Dirofilaria immitis prevalence in dogs.
Bowman, Dwight D; Liu, Yan; McMahan, Christopher S; Nordone, Shila K; Yabsley, Michael J; Lund, Robert B
2016-10-10
This paper forecasts next year's canine heartworm prevalence in the United States from 16 climate, geographic and societal factors. The forecast's construction and an assessment of its performance are described. The forecast is based on a spatial-temporal conditional autoregressive model fitted to over 31 million antigen heartworm tests conducted in the 48 contiguous United States during 2011-2015. The forecast uses county-level data on 16 predictive factors, including temperature, precipitation, median household income, local forest and surface water coverage, and presence/absence of eight mosquito species. Non-static factors are extrapolated into the forthcoming year with various statistical methods. The fitted model and factor extrapolations are used to estimate next year's regional prevalence. The correlation between the observed and model-estimated county-by-county heartworm prevalence for the 5-year period 2011-2015 is 0.727, demonstrating reasonable model accuracy. The correlation between 2015 observed and forecasted county-by-county heartworm prevalence is 0.940, demonstrating significant skill and showing that heartworm prevalence can be forecasted reasonably accurately. The forecast presented herein can a priori alert veterinarians to areas expected to see higher than normal heartworm activity. The proposed methods may prove useful for forecasting other diseases.
Omar, Hani; Hoang, Van Hai; Liu, Duen-Ren
2016-01-01
Enhancing sales and operations planning through forecasting analysis and business intelligence is demanded in many industries and enterprises. Publishing industries usually pick attractive titles and headlines for their stories to increase sales, since popular article titles and headlines can attract readers to buy magazines. In this paper, information retrieval techniques are adopted to extract words from article titles. The popularity measures of article titles are then analyzed by using the search indexes obtained from the Google search engine. Backpropagation Neural Networks (BPNNs) have successfully been used to develop prediction models for sales forecasting. In this study, we propose a novel hybrid neural network model for sales forecasting based on the prediction result of time series forecasting and the popularity of article titles. The proposed model uses the historical sales data, the popularity of article titles, and the prediction result of a time-series forecasting method, Autoregressive Integrated Moving Average (ARIMA), to learn a BPNN-based forecasting model. Our proposed forecasting model is experimentally evaluated by comparison with conventional sales prediction techniques. The experimental results show that our proposed forecasting method outperforms conventional techniques that do not consider the popularity of title words.
Statistical and Machine Learning forecasting methods: Concerns and ways forward
Makridakis, Spyros; Assimakopoulos, Vassilios
2018-01-01
Machine Learning (ML) methods have been proposed in the academic literature as alternatives to statistical ones for time series forecasting. Yet, scant evidence is available about their relative performance in terms of accuracy and computational requirements. The purpose of this paper is to evaluate such performance across multiple forecasting horizons using a large subset of 1045 monthly time series used in the M3 Competition. After comparing the post-sample accuracy of popular ML methods with that of eight traditional statistical ones, we found that the former are dominated across both accuracy measures used and for all forecasting horizons examined. Moreover, we observed that their computational requirements are considerably greater than those of statistical methods. The paper discusses the results, explains why the accuracy of ML models is below that of statistical ones and proposes some possible ways forward. The empirical results found in our research stress the need for objective and unbiased ways to test the performance of forecasting methods that can be achieved through sizable and open competitions allowing meaningful comparisons and definite conclusions. PMID:29584784
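The accuracy measures customary in the M competitions, sMAPE and MASE, can be computed as follows. This is a sketch of the standard definitions; the abstract itself does not spell out which two measures it used:

```python
def smape(actual, forecast):
    """Symmetric mean absolute percentage error, in percent."""
    terms = [200.0 * abs(f - a) / (abs(a) + abs(f))
             for a, f in zip(actual, forecast) if abs(a) + abs(f) > 0]
    return sum(terms) / len(terms)

def mase(insample, actual, forecast, m=12):
    """Mean Absolute Scaled Error: out-of-sample MAE scaled by the
    in-sample MAE of the seasonal naive method (period m; 12 for
    monthly series such as those in the M3 Competition)."""
    scale = sum(abs(insample[t] - insample[t - m])
                for t in range(m, len(insample))) / (len(insample) - m)
    mae = sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)
    return mae / scale
```

Both measures are scale-free, which is what allows accuracy to be averaged across the 1,045 heterogeneous monthly series.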
NASA Astrophysics Data System (ADS)
Cofino, A. S.; Santos, C.; Garcia-Moya, J. A.; Gutierrez, J. M.; Orfila, B.
2009-04-01
The Short-Range Ensemble Prediction System (SREPS) is a multi-LAM (UM, HIRLAM, MM5, LM and HRM), multi-analysis/boundary-condition (ECMWF, UKMetOffice, DWD and GFS) system run twice a day by AEMET (72-hour lead time) over a European domain, with a total of 5 (LAMs) x 4 (GCMs) = 20 members. One of the main goals of this project is to analyze the impact of models and boundary conditions on short-range, high-resolution forecasted precipitation. A previous validation of this method was carried out on a set of climate networks in Spain, France and Germany, by interpolating the prediction to the gauge locations (SREPS, 2008). In this work we compare these results with those obtained by using a statistical downscaling method to post-process the global predictions, obtaining an "advanced interpolation" of the local precipitation using climate-network precipitation observations. In particular, we apply the PROMETEO downscaling system based on analogs and compare the SREPS ensemble of 20 members with the PROMETEO statistical ensemble of 5 (analog ensemble) x 4 (GCMs) = 20 members. Moreover, we also compare the performance of a combined approach that post-processes the SREPS outputs using the PROMETEO system. References: SREPS 2008. 2008 EWGLAM-SRNWP Meeting (http://www.aemet.es/documentos/va/divulgacion/conferencias/prediccion/Ewglam/PRED_CSantos.pdf)
NASA Astrophysics Data System (ADS)
Cervone, G.; Clemente-Harding, L.; Alessandrini, S.; Delle Monache, L.
2016-12-01
A methodology based on Artificial Neural Networks (ANN) and an Analog Ensemble (AnEn) is presented to generate 72-hour deterministic and probabilistic forecasts of power generated by photovoltaic (PV) power plants, using input from a numerical weather prediction model and computed astronomical variables. ANN and AnEn are used individually and in combination to generate forecasts for three solar power plants located in Italy. The computational scalability of the proposed solution is tested using synthetic data simulating 4,450 PV power stations. The NCAR Yellowstone supercomputer is employed to test the parallel implementation of the proposed solution, ranging from 1 node (32 cores) to 4,450 nodes (141,140 cores). Results show that a combined AnEn + ANN solution yields the best results, and that the proposed solution is well suited for massive-scale computation.
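The AnEn component can be sketched as follows: for a new NWP forecast, the k most similar past forecasts are found and their observed power values form the ensemble. The plain Euclidean distance here stands in for the weighted, windowed metric used operationally:

```python
def analog_ensemble(past_predictors, past_observations, current, k=3):
    """Return the k observed outcomes whose historical NWP predictor
    vectors are closest to the current forecast's predictors, plus
    their mean as a deterministic forecast."""
    def dist(p):
        return sum((a - b) ** 2 for a, b in zip(p, current)) ** 0.5

    ranked = sorted(range(len(past_predictors)),
                    key=lambda i: dist(past_predictors[i]))
    members = [past_observations[i] for i in ranked[:k]]
    return members, sum(members) / len(members)
```

Because each station's search is independent of the others, the method parallelizes naturally, which is what the 4,450-station scalability test exploits.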
NASA Astrophysics Data System (ADS)
Perekhodtseva, E. V.
2009-09-01
Development of successful method of forecast of storm winds, including squalls and tornadoes and heavy rainfalls, that often result in human and material losses, could allow one to take proper measures against destruction of buildings and to protect people. Well-in-advance successful forecast (from 12 hours to 48 hour) makes possible to reduce the losses. Prediction of the phenomena involved is a very difficult problem for synoptic till recently. The existing graphic and calculation methods still depend on subjective decision of an operator. Nowadays in Russia there is no hydrodynamic model for forecast of the maximal precipitation and wind velocity V> 25m/c, hence the main tools of objective forecast are statistical methods using the dependence of the phenomena involved on a number of atmospheric parameters (predictors). Statistical decisive rule of the alternative and probability forecast of these events was obtained in accordance with the concept of "perfect prognosis" using the data of objective analysis. For this purpose the different teaching samples of present and absent of this storm wind and rainfalls were automatically arranged that include the values of forty physically substantiated potential predictors. Then the empirical statistical method was used that involved diagonalization of the mean correlation matrix R of the predictors and extraction of diagonal blocks of strongly correlated predictors. Thus for these phenomena the most informative predictors were selected without loosing information. The statistical decisive rules for diagnosis and prognosis of the phenomena involved U(X) were calculated for choosing informative vector-predictor. We used the criterion of distance of Mahalanobis and criterion of minimum of entropy by Vapnik-Chervonenkis for the selection predictors. 
The successful development of hydrodynamic models for short-term forecasting, and the improvement of 36-48 h forecasts of pressure, temperature and other parameters, allowed us to use the prognostic fields of those models to calculate the discriminant functions at the nodes of a 150x150 km grid together with the probabilities P of dangerous wind, and thus to obtain fully automated forecasts. To convert to an alternative (yes/no) forecast, the author proposes empirical threshold values specified for this phenomenon and a 36-hour lead time. According to the Peirce-Obukhov criterion T = 1 - a - b, the success of these automated statistical forecasts of squalls and tornadoes 36-48 hours ahead, and of heavy warm-season rainfall over the territory of Italy, Spain and the Balkan countries, ranges from T = 0.54 to 0.78 in the author's experiments. Many examples of very successful forecasts of summer storm winds and heavy rainfall over Italy and Spain are presented in this report. The same decision rules were also applied to forecasting these phenomena during the cold period of this year, when heavy snowfalls and storm winds were observed very often over Spain and Italy, and our forecasts were successful.
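The kind of discriminant rule U(X) described above can be sketched, under strong simplifying assumptions, as a two-class linear discriminant with just two predictors: the weight vector S^-1(m1 - m0) separates the "event" and "no event" training samples by their Mahalanobis distance. All data and names below are invented for illustration.

```python
# Hedged sketch of a two-class linear discriminant rule U(X) with two
# predictors: w = S^-1 (m1 - m0) separates "event" and "no event" samples
# by their Mahalanobis distance. All data and names are invented.

def mean_vec(sample):
    n = len(sample)
    return [sum(p[i] for p in sample) / n for i in (0, 1)]

def pooled_cov(a, b):
    """Pooled 2x2 covariance matrix of two samples of 2-D points."""
    s = [[0.0, 0.0], [0.0, 0.0]]
    for sample, m in ((a, mean_vec(a)), (b, mean_vec(b))):
        for p in sample:
            d = (p[0] - m[0], p[1] - m[1])
            for i in (0, 1):
                for j in (0, 1):
                    s[i][j] += d[i] * d[j]
    n = len(a) + len(b) - 2
    return [[s[i][j] / n for j in (0, 1)] for i in (0, 1)]

def discriminant(event, no_event):
    """Return U(X) = w.x - c; U > 0 forecasts the event."""
    m1, m0 = mean_vec(event), mean_vec(no_event)
    s = pooled_cov(event, no_event)
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det],
           [-s[1][0] / det, s[0][0] / det]]
    w = [inv[i][0] * (m1[0] - m0[0]) + inv[i][1] * (m1[1] - m0[1])
         for i in (0, 1)]
    c = sum(w[i] * (m1[i] + m0[i]) / 2 for i in (0, 1))  # midpoint threshold
    return lambda x: w[0] * x[0] + w[1] * x[1] - c

# Toy training data: (instability-like, shear-like) predictors.
storm = [(2.0, 3.0), (2.5, 3.5), (3.0, 3.0), (2.2, 2.8)]
no_storm = [(0.5, 1.0), (1.0, 0.5), (0.8, 1.2), (0.2, 0.9)]
U = discriminant(storm, no_storm)
forecast_event = U((2.4, 3.1)) > 0  # positive U(X) -> forecast the event
```

The operational method uses forty candidate predictors and correlation-block screening; this sketch only shows the discriminant step.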
NASA Astrophysics Data System (ADS)
Wani, Omar; Beckers, Joost V. L.; Weerts, Albrecht H.; Solomatine, Dimitri P.
2017-08-01
A non-parametric method is applied to quantify residual uncertainty in hydrologic streamflow forecasting. This method acts as a post-processor on deterministic model forecasts and generates a residual uncertainty distribution. Based on instance-based learning, it uses a k-nearest-neighbour search for similar historical hydrometeorological conditions to determine uncertainty intervals from a set of historical errors, i.e. discrepancies between past forecasts and observations. The performance of this method is assessed using test cases of hydrologic forecasting in two UK rivers: the Severn and the Brue. Retrospective forecasts were made and their uncertainties were estimated using kNN resampling and two alternative uncertainty estimators: quantile regression (QR) and uncertainty estimation based on local errors and clustering (UNEEC). Results show that kNN uncertainty estimation produces accurate and narrow uncertainty intervals with good probability coverage. Analysis also shows that the performance of this technique depends on the choice of search space. Nevertheless, the accuracy and reliability of uncertainty intervals generated using kNN resampling are at least comparable to those produced by QR and UNEEC. It is concluded that kNN uncertainty estimation is an interesting alternative to other post-processors, like QR and UNEEC, for estimating forecast uncertainty. Apart from its concept being simple and well understood, an advantage of this method is that it is relatively easy to implement.
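The kNN resampling post-processor described above can be sketched in a few lines: find the k historical cases with hydrometeorological conditions most similar to today's, collect the forecast errors made in those cases, and add their empirical quantiles to today's deterministic forecast. The data, the crude quantile rule, and all names are illustrative assumptions, not the authors' implementation.

```python
# Sketch of kNN error resampling: find the k historical cases most similar
# to today's hydrometeorological conditions, and turn the empirical quantiles
# of the errors made in those cases into an uncertainty interval around
# today's deterministic forecast. All data and names are illustrative.

def knn_interval(today, past_conditions, past_errors, det_forecast,
                 k=5, lo=0.05, hi=0.95):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    idx = sorted(range(len(past_conditions)),
                 key=lambda i: dist(past_conditions[i], today))[:k]
    errs = sorted(past_errors[i] for i in idx)
    # crude empirical quantile of the neighbours' errors
    q = lambda p: errs[min(len(errs) - 1, int(p * len(errs)))]
    return det_forecast + q(lo), det_forecast + q(hi)

# Conditions: (recent rainfall, soil moisture proxy); errors: obs - forecast.
past_conditions = [(10, 1), (11, 1), (9, 1), (50, 5), (52, 6), (48, 5), (30, 3)]
past_errors = [-2.0, -1.0, 1.0, -20.0, 15.0, 10.0, 0.0]
low, high = knn_interval((10.5, 1), past_conditions, past_errors,
                         det_forecast=100.0, k=3)
```

Because the errors come from hydrologically similar situations, the interval widens automatically in regimes where the model has historically performed poorly.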
2014-01-01
Gold price forecasting has been a hot issue in economics recently. In this work, a wavelet neural network (WNN) combined with a novel artificial bee colony (ABC) algorithm is proposed for gold price forecasting. In this improved algorithm, the conventional roulette selection strategy is discarded; instead, the convergence status in a previous iteration cycle is fully utilized as a feedback message to adjust the search intensity in the subsequent cycle. Experimental results confirm that the new algorithm converges faster than the conventional ABC when tested on classical benchmark functions and is effective in improving the modeling capacity of the WNN for gold price forecasting. PMID:24744773
Tourism forecasting using modified empirical mode decomposition and group method of data handling
NASA Astrophysics Data System (ADS)
Yahya, N. A.; Samsudin, R.; Shabri, A.
2017-09-01
In this study, a hybrid model combining a modified Empirical Mode Decomposition (EMD) and the Group Method of Data Handling (GMDH) is proposed for tourism forecasting. The approach reconstructs the intrinsic mode functions (IMFs) produced by EMD using a trial-and-error method. The new component and the remaining IMFs are then each predicted using the GMDH model. Finally, the forecasts for each component are aggregated to construct an ensemble forecast. The data used in this experiment are monthly time series of tourist arrivals from China, Thailand and India to Malaysia from 2000 to 2016. The performance of the model is evaluated using the Root Mean Square Error (RMSE) and the Mean Absolute Percentage Error (MAPE), with the conventional GMDH model and the EMD-GMDH model as benchmarks. Empirical results show that the proposed model produces better forecasts than the benchmark models.
Purposes and methods of scoring earthquake forecasts
NASA Astrophysics Data System (ADS)
Zhuang, J.
2010-12-01
Studies of earthquake prediction or forecasting serve two kinds of purposes: one is to give a systematic estimate of earthquake risk in a particular region and period in order to advise governments and enterprises on disaster reduction; the other is to search for reliable precursors that can be used to improve earthquake predictions or forecasts. For the first purpose a complete score is necessary, while for the latter a partial score, which can evaluate whether forecasts or predictions have advantages over a well-known reference model, is sufficient. This study reviews different scoring methods for evaluating the performance of earthquake predictions and forecasts. In particular, the recently developed gambling scoring method shows its capacity to find good points of an earthquake prediction algorithm or model that are absent from a reference model, even if its overall performance is no better than that of the reference model.
Integrating Satellite Measurements from Polar-orbiting Instruments into Smoke Dispersion Forecasts
NASA Astrophysics Data System (ADS)
Smith, N.; Pierce, R. B.; Barnet, C.; Gambacorta, A.; Davies, J. E.; Strabala, K.
2015-12-01
The IDEA-I (Infusion of Satellite Data into Environmental Applications-International) is a real-time system that currently generates trajectory-based forecasts of aerosol dispersion and stratospheric intrusions. Here we demonstrate new capabilities that use satellite measurements from the Joint Polar Satellite System (JPSS) Suomi-NPP (S-NPP) instruments (operational since 2012) in the generation of trajectory-based predictions of smoke dispersion from North American wildfires. Two such data products are used, namely the Visible Infrared Imaging Radiometer Suite (VIIRS) Aerosol Optical Depth (AOD) and the combined Cross-track Infrared Sounder (CrIS) and Advanced Technology Microwave Sounder (ATMS) NOAA-Unique CrIS-ATMS Processing System (NUCAPS) carbon monoxide (CO) retrievals. The latter is a new data product made possible by the release of full spectral-resolution CrIS measurements since December 2014. Once NUCAPS CO becomes operationally available it will be used in real-time applications such as IDEA-I along with VIIRS AOD and meteorological forecast fields to support National Weather Service (NWS) Incident Meteorologist (IMET) and air quality management decision making. By combining different measurements, the information content of the IDEA-I transport and dispersion forecast is improved within the complex terrain features that dominate the Western US and Alaska. The primary user community of smoke forecasts is the Western regions of the National Weather Service (NWS) and US Environmental Protection Agency (EPA) due to the significant impacts of wildfires in these regions. With this we demonstrate the quality of the smoke dispersion forecasts that can be achieved by integrating polar-orbiting satellite measurements with forecast models to enable on-site decision support services for fire incident management teams and other real-time air quality agencies.
First Assessment of Itaipu Dam Ensemble Inflow Forecasting System
NASA Astrophysics Data System (ADS)
Mainardi Fan, Fernando; Machado Vieira Lisboa, Auder; Gomes Villa Trinidad, Giovanni; Rógenes Monteiro Pontes, Paulo; Collischonn, Walter; Tucci, Carlos; Costa Buarque, Diogo
2017-04-01
Inflow forecasting for hydropower plant (HPP) dams is one of the prominent uses of hydrological forecasts. A very important HPP for energy generation in South America is the Itaipu Dam, located on the Paraná River between Brazil and Paraguay, with a drainage area of 820,000 km2. In this work, we present the development of an ensemble forecasting system for Itaipu, operational since November 2015. The system is based on the MGB-IPH hydrological model, includes hydrodynamic simulation of the main river, and is run every morning forced by seven different rainfall forecasts: (i) CPTEC-ETA 15 km; (ii) CPTEC-BRAMS 5 km; (iii) SIMEPAR WRF Ferrier; (iv) SIMEPAR WRF Lin; (v) SIMEPAR WRF Morrison; (vi) SIMEPAR WRF WDM6; (vii) SIMEPAR MEDIAN. The last (vii) corresponds to the median of the rainfall forecasts from the SIMEPAR WRF model versions (iii to vi). Alongside the developed system, the "traditional" method of generating inflow forecasts for the Itaipu Dam is also run every day; it approximates future inflow from the discharge tendency of upstream telemetric gauges. After all the forecasts are run, the hydrology team of Itaipu develops a consensus forecast, based on all obtained results, which is the one used for operating the Itaipu HPP dam. After one year of operation, a first evaluation of the ensemble forecasting system was conducted. Results show that the system performs satisfactorily for rising flows up to five days of lead time. However, in some cases false alarms were issued by most ensemble members, and the system did not always perform better than the traditional method, especially during hydrograph recessions. In terms of meteorological forecasts, the use of some members is being discontinued. In terms of the hydrodynamic representation, better information on river cross sections could improve the forecasts of hydrograph recession curves.
These opportunities for improvement are being addressed in the system's next update.
Gridded Calibration of Ensemble Wind Vector Forecasts Using Ensemble Model Output Statistics
NASA Astrophysics Data System (ADS)
Lazarus, S. M.; Holman, B. P.; Splitt, M. E.
2017-12-01
A computationally efficient method is developed that performs gridded post-processing of ensemble wind vector forecasts. An expansive set of idealized WRF model simulations is generated to provide physically consistent high-resolution winds over a coastal domain characterized by an intricate land/water mask. Ensemble model output statistics (EMOS) is used to calibrate the ensemble wind vector forecasts at observation locations. The local EMOS predictive parameters (mean and variance) are then spread throughout the grid utilizing flow-dependent statistical relationships extracted from the downscaled WRF winds. Using data withdrawal and 28 east central Florida stations, the method is applied to one year of 24 h wind forecasts from the Global Ensemble Forecast System (GEFS). Compared to the raw GEFS, the approach improves both the deterministic and probabilistic forecast skill. Analysis of multivariate rank histograms indicates that the post-processed forecasts are calibrated. Two downscaling case studies are presented, a quiescent easterly flow event and a frontal passage. Strengths and weaknesses of the approach are presented and discussed.
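A one-dimensional toy version of the EMOS calibration used above can be sketched as follows: the predictive distribution is N(a + b*m, c + d*s2), with m and s2 the ensemble mean and variance. Operational EMOS typically estimates a, b, c, d by minimizing the CRPS; purely for illustration, this sketch uses ordinary least squares for the mean coefficients and a regression of squared residuals on the ensemble variance for the variance coefficients.

```python
# Toy one-dimensional EMOS sketch: predictive distribution N(a + b*m, c + d*s2)
# with m, s2 the ensemble mean and variance. Real EMOS usually minimizes the
# CRPS; here a, b come from ordinary least squares and c, d from regressing
# squared residuals on the ensemble variance (an illustrative stand-in).

def ols(x, y):
    """Intercept and slope of the least-squares line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def fit_emos(ens_means, ens_vars, obs):
    a, b = ols(ens_means, obs)
    resid2 = [(o - (a + b * m)) ** 2 for m, o in zip(ens_means, obs)]
    c, d = ols(ens_vars, resid2)
    return a, b, max(c, 0.0), max(d, 0.0)   # keep the variance non-negative

# Toy training data: a biased ensemble whose mean relates linearly to obs.
ens_means = [0.0, 1.0, 2.0, 3.0]
ens_vars = [1.0, 2.0, 1.0, 2.0]
obs = [1.0, 3.0, 5.0, 7.0]          # obs = 1 + 2 * ensemble mean, exactly
a, b, c, d = fit_emos(ens_means, ens_vars, obs)
mu = a + b * 1.5                    # calibrated mean for a new ensemble mean
```

The paper's wind vector application is bivariate and adds flow-dependent spreading of the parameters across the grid; this sketch shows only the station-level calibration idea.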
Cassagne, E; Caillaud, P D; Besancenot, J P; Thibaudon, M
2007-10-01
Poaceae pollen is, along with birch pollen, among the most allergenic in Europe. It is therefore useful to develop models to help pollen allergy sufferers. The objective of this study was to construct forecast models that could predict the first day characterized by a certain level of allergic risk, called here the Starting Date of the Allergic Risk (SDAR). The models are based on four forecasting methods used in the literature (three sum-based methods and one multiple regression analysis). They were applied to Nancy and Strasbourg from 1988 to 2005 and tested on 2006. The Mean Absolute Error and an actual forecast ability test were the parameters used to choose the best models and to assess and compare their accuracy. On the whole, all the models presented good and broadly equivalent forecast accuracy. They were all reliable and were used to forecast the SDAR in 2006, with contrasting results in forecasting precision.
Assessing the skill of seasonal precipitation and streamflow forecasts in sixteen French catchments
NASA Astrophysics Data System (ADS)
Crochemore, Louise; Ramos, Maria-Helena; Pappenberger, Florian
2015-04-01
Meteorological centres make sustained efforts to provide seasonal forecasts that are increasingly skilful. Streamflow forecasting is one of the many applications that can benefit from these efforts. Seasonal flow forecasts generated using seasonal ensemble precipitation forecasts as input to a hydrological model can help to take anticipatory measures for water supply reservoir operation or drought risk management. The objective of this study is to assess the skill of seasonal precipitation and streamflow forecasts in France. First, we evaluated the skill of ECMWF SYS4 seasonal precipitation forecasts for streamflow forecasting in sixteen French catchments. Daily flow forecasts were produced using raw seasonal precipitation forecasts as input to the GR6J hydrological model. Ensemble forecasts are issued every month with 15 or 51 members, according to the month of the year, and evaluated for up to 90 days ahead. In a second step, we applied eight variants of bias correction approaches to the precipitation forecasts prior to generating the flow forecasts. The approaches were based on the linear scaling and distribution mapping methods. The skill of the ensemble forecasts was assessed in terms of accuracy (MAE), reliability (PIT diagram) and overall performance (CRPS). The results show that, in most catchments, raw seasonal precipitation and streamflow forecasts are more skilful in terms of accuracy and overall performance than a reference prediction based on historic observed precipitation and watershed initial conditions at the time of forecast. Reliability is the only attribute that is not significantly improved. The skill of the forecasts is, in general, improved by applying bias correction. Two bias correction methods showed the best performance for the studied catchments: the simple linear scaling of monthly values and the empirical distribution mapping of daily values. L.
Crochemore is funded by the Interreg IVB DROP Project (Benefit of governance in DROught adaPtation).
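The "simple linear scaling of monthly values" found to perform well above can be sketched as a per-month multiplicative correction: each forecast is scaled by the ratio of observed to forecast mean precipitation over a calibration period. The numbers below are invented.

```python
# Sketch of the "simple linear scaling of monthly values": each forecast is
# multiplied by the ratio of observed to forecast mean precipitation for its
# calendar month, computed over a calibration period. Numbers are invented.

def linear_scaling_factors(obs_by_month, fcst_by_month):
    """Per-month multiplicative correction factor obs_mean / fcst_mean."""
    return {m: sum(obs_by_month[m]) / sum(fcst_by_month[m])
            for m in obs_by_month}

# Calibration-period monthly precipitation (mm) for January and July.
obs = {1: [50.0, 60.0, 70.0], 7: [10.0, 20.0, 30.0]}
fcst = {1: [40.0, 50.0, 30.0], 7: [40.0, 50.0, 30.0]}
factors = linear_scaling_factors(obs, fcst)
# Correct two new January forecasts:
corrected_january = [f * factors[1] for f in [45.0, 55.0]]
```

Distribution mapping, the other well-performing method, goes further by matching the whole empirical distribution rather than only the mean.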
Forecasting biodiversity in breeding birds using best practices
Taylor, Shawn D.; White, Ethan P.
2018-01-01
Biodiversity forecasts are important for conservation, management, and evaluating how well current models characterize natural systems. While the number of forecasts for biodiversity is increasing, there is little information available on how well these forecasts work. Most biodiversity forecasts are not evaluated to determine how well they predict future diversity, fail to account for uncertainty, and do not use time-series data that captures the actual dynamics being studied. We addressed these limitations by using best practices to explore our ability to forecast the species richness of breeding birds in North America. We used hindcasting to evaluate six different modeling approaches for predicting richness. Hindcasts for each method were evaluated annually for a decade at 1,237 sites distributed throughout the continental United States. All models explained more than 50% of the variance in richness, but none of them consistently outperformed a baseline model that predicted constant richness at each site. The best practices implemented in this study directly influenced the forecasts and evaluations. Stacked species distribution models and “naive” forecasts produced poor estimates of uncertainty and accounting for this resulted in these models dropping in the relative performance compared to other models. Accounting for observer effects improved model performance overall, but also changed the rank ordering of models because it did not improve the accuracy of the “naive” model. Considering the forecast horizon revealed that the prediction accuracy decreased across all models as the time horizon of the forecast increased. To facilitate the rapid improvement of biodiversity forecasts, we emphasize the value of specific best practices in making forecasts and evaluating forecasting methods. PMID:29441230
NASA Astrophysics Data System (ADS)
Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.
2012-04-01
In the context of a national energy company (EDF: Électricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards, and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of the uncertainties of meteorological and hydrological forecasts and improve the human expertise that is essential to synthesize the available information coming from different meteorological and hydrological models and from human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where considerable human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF to improve the reliability of ensemble forecasts (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The post-processing methods dress hydrological ensemble forecasts with hydrological model uncertainties estimated from perfect forecasts. The first method (the empirical approach) is based on statistical modeling of the empirical errors of perfect forecasts, using streamflow sub-samples stratified by quantile class and lead time. The second method (the dynamical approach) uses streamflow sub-samples stratified by quantile class, streamflow variation, and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches ensure good post-processing of the hydrological ensemble, yielding a clear improvement in the reliability, skill and sharpness of the ensemble forecasts.
The comparison of the empirical and dynamical approaches shows the limits of the empirical approach, which cannot take hydrological dynamics and processes, i.e. sample heterogeneity, into account. The same streamflow range can correspond to different processes, such as rising limbs or recessions, where the uncertainties differ. The dynamical approach improves the reliability, skill and sharpness of the forecasts and globally reduces the width of the confidence intervals. In detail, it allows a noticeable reduction of confidence intervals during recessions, where uncertainty is relatively low, and a slight widening during rising limbs or snowmelt, where uncertainty is greater. The dynamical approach, validated by forecasters' experience that deemed the empirical approach not discriminative enough, improved forecasters' confidence and the communication of uncertainties. Montanari, A. and Brath, A. (2004). A stochastic approach for assessing the uncertainty of rainfall-runoff simulations. Water Resources Research, 40, W01106, doi:10.1029/2003WR002540. Schaefli, B., Balin Talamba, D. and Musy, A. (2007). Quantifying hydrological modeling errors through a mixture of normal distributions. Journal of Hydrology, 332, 303-315.
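The "empirical approach" described above can be sketched in miniature: past errors of perfect forecasts are pooled into streamflow quantile classes, and a new deterministic forecast is dressed with the empirical errors of its class. The class edges, the multiplicative error model, and the toy data are illustrative assumptions, not EDF's implementation.

```python
# Bare-bones sketch of the "empirical approach": relative errors of past
# perfect forecasts are grouped by the streamflow quantile class of the
# forecast, and a new forecast is dressed with the errors of its class.
# Class edges, error model and data are illustrative.

def build_error_classes(past_fcst, past_obs, edges):
    """Group relative errors obs/fcst by the quantile class of the forecast."""
    classes = {i: [] for i in range(len(edges) + 1)}
    for f, o in zip(past_fcst, past_obs):
        i = sum(f > e for e in edges)      # index of the class containing f
        classes[i].append(o / f)
    return classes

def dress(fcst, classes, edges):
    i = sum(fcst > e for e in edges)
    return sorted(fcst * r for r in classes[i])  # empirical predictive sample

edges = [50.0]                                   # two classes: low / high flow
past_fcst = [20.0, 30.0, 40.0, 100.0, 120.0, 80.0]
past_obs = [18.0, 33.0, 38.0, 120.0, 110.0, 90.0]
classes = build_error_classes(past_fcst, past_obs, edges)
ensemble = dress(60.0, classes, edges)           # dressed high-flow forecast
```

The dynamical approach adds streamflow variation (rising vs receding) as a further stratification key, so recessions and rising limbs draw on different error samples.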
NASA Technical Reports Server (NTRS)
Lambert, Winifred C.
2000-01-01
This report describes the outcome of Phase 1 of the AMU's Improved Anvil Forecasting task. Forecasters in the 45th Weather Squadron and the Spaceflight Meteorology Group have found that anvil forecasting is a difficult task when predicting LCC and FR violations. The purpose of this task is to determine the technical feasibility of creating an anvil-forecasting tool. Work on this study was separated into three steps: a literature search, forecaster discussions, and a determination of technical feasibility. The literature search revealed no existing anvil-forecasting techniques. However, there appears to be growing interest in anvils in recent years; if this interest continues to grow, more information will become available to aid in developing a reliable anvil-forecasting tool. The forecaster discussions revealed an array of ideas for how better forecasting techniques could be developed, based on sound meteorological principles and the forecasters' personal experience in forecasting and analyzing anvils. Based on the information gathered in these discussions, the conclusion of this report is that it is technically feasible at this time to develop an anvil-forecasting technique that will significantly contribute to confidence in anvil forecasts.
Does money matter in inflation forecasting?
NASA Astrophysics Data System (ADS)
Binner, J. M.; Tino, P.; Tepper, J.; Anderson, R.; Jones, B.; Kendall, G.
2010-11-01
This paper provides the most fully comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naïve random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists' long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies.
Peak Wind Tool for General Forecasting
NASA Technical Reports Server (NTRS)
Barrett, Joe H., III
2010-01-01
The expected peak wind speed of the day is an important forecast element in the 45th Weather Squadron's (45 WS) daily 24-Hour and Weekly Planning Forecasts. The forecasts are used for ground and space launch operations at the Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS). The 45 WS also issues wind advisories for KSC/CCAFS when they expect wind gusts to meet or exceed 25 kt, 35 kt and 50 kt thresholds at any level from the surface to 300 ft. The 45 WS forecasters have indicated peak wind speeds are challenging to forecast, particularly in the cool season months of October - April. In Phase I of this task, the Applied Meteorology Unit (AMU) developed a tool to help the 45 WS forecast non-convective winds at KSC/CCAFS for the 24-hour period of 0800 to 0800 local time. The tool was delivered as a Microsoft Excel graphical user interface (GUI). The GUI displayed the forecast of peak wind speed, 5-minute average wind speed at the time of the peak wind, timing of the peak wind and probability the peak speed would meet or exceed 25 kt, 35 kt and 50 kt. For the current task (Phase II ), the 45 WS requested additional observations be used for the creation of the forecast equations by expanding the period of record (POR). Additional parameters were evaluated as predictors, including wind speeds between 500 ft and 3000 ft, static stability classification, Bulk Richardson Number, mixing depth, vertical wind shear, temperature inversion strength and depth and wind direction. Using a verification data set, the AMU compared the performance of the Phase I and II prediction methods. Just as in Phase I, the tool was delivered as a Microsoft Excel GUI. The 45 WS requested the tool also be available in the Meteorological Interactive Data Display System (MIDDS). The AMU first expanded the POR by two years by adding tower observations, surface observations and CCAFS (XMR) soundings for the cool season months of March 2007 to April 2009. 
The POR was expanded again by six years, from October 1996 to April 2002, by interpolating 1000-ft sounding data to 100-ft increments. The Phase II developmental data set included observations for the cool season months of October 1996 to February 2007. The AMU calculated 68 candidate predictors from the XMR soundings, including 19 stability parameters, 48 wind speed parameters and one wind shear parameter. Each day in the data set was stratified by synoptic weather pattern, low-level wind direction, precipitation and Richardson Number, for a total of 60 stratification methods. Linear regression equations, using the 68 predictors and 60 stratification methods, were created for the tool's three forecast parameters: the highest peak wind speed of the day (PWSD), the 5-minute average speed at the same time (AWSD), and the timing of the PWSD. For PWSD and AWSD, 30 Phase II methods were selected for evaluation in the verification data set. For timing of the PWSD, 12 Phase II methods were selected for evaluation. The verification data set contained observations for the cool season months of March 2007 to April 2009. The data set was used to compare the Phase I and II forecast methods to climatology, model forecast winds and wind advisories issued by the 45 WS. The model forecast winds were derived from the 0000 and 1200 UTC runs of the 12-km North American Mesoscale (MesoNAM) model. The forecast methods that performed the best in the verification data set were selected for the Phase II version of the tool. For PWSD and AWSD, linear regression equations based on MesoNAM forecasts performed significantly better than the Phase I and II methods. For timing of the PWSD, none of the methods performed significantly better than climatology. The AMU then developed the Microsoft Excel and MIDDS GUIs. The GUIs display the forecasts for PWSD, AWSD and the probability the PWSD will meet or exceed 25 kt, 35 kt and 50 kt.
Since none of the prediction methods for timing of the PWSD performed significantly better than climatology, the tool no longer displays this predictand. The Excel and MIDDS GUIs display forecasts for Day-1 to Day-3 and Day-1 to Day-5, respectively. The Excel GUI uses MesoNAM forecasts as input, while the MIDDS GUI uses input from the MesoNAM and Global Forecast System models. Based on feedback from the 45 WS, the AMU added the daily average wind speed from 30 ft to 60 ft to the tool, which is one of the parameters in the 24-Hour and Weekly Planning Forecasts issued by the 45 WS. In addition, the AMU expanded the MIDDS GUI to include forecasts out to Day-7.
NASA Astrophysics Data System (ADS)
Kotegawa, Tatsuya
Complexity in the Air Transportation System (ATS) arises from the intermingling of many independent physical resources, operational paradigms, and stakeholder interests, as well as the dynamic variation of these interactions over time. Currently, trade-offs and cost-benefit analyses of new ATS concepts are carried out with system-wide evaluation simulations driven by air traffic forecasts that assume fixed airline routes. However, this does not reflect reality well, as airlines regularly add and remove routes. An airline service route network evolution model that projects route addition and removal was created and combined with state-of-the-art air traffic forecast methods to better reflect the dynamic properties of the ATS in system-wide simulations. Guided by a system-of-systems framework, network theory metrics and machine learning algorithms were applied to develop the route network evolution models based on patterns extracted from historical data. Constructing the route addition section of the model posed the greatest challenge due to the large pool of new link candidates compared to the actual number of routes historically added to the network. Of the models explored, algorithms based on logistic regression, random forests, and support vector machines showed the best route addition and removal forecast accuracies, at approximately 20% and 40% respectively, when validated against historical data. The combination of network evolution models and a system-wide evaluation tool quantified the impact of airline route network evolution on air traffic delay: the expected delay minutes increased by approximately 5% for a forecast schedule on 3/19/2020 when network evolution was considered. Performance trade-off studies among several airline route network topologies, from the perspectives of passenger travel efficiency, fuel burn, and robustness, were also conducted to provide bounds that could serve as targets for ATS transformation efforts.
This series of analyses revealed that high robustness is achievable only in exchange for lower passenger travel and fuel burn efficiency; however, an increase in network density can mitigate this trade-off.
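Framing route addition as binary classification, one of the approaches the work compares, can be sketched with a tiny logistic regression: each candidate city pair gets a feature vector and a label saying whether the route was added. The features (normalized demand and distance), the toy data, and the plain stochastic-gradient fit are illustrative assumptions, not the study's actual model.

```python
# Sketch of route addition as binary classification via logistic regression.
# Each candidate city pair has a hypothetical feature vector
# (normalized demand, normalized distance); the label marks whether the
# route was added. Data and the plain SGD fit are illustrative.

from math import exp

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    w = [0.0] * (len(X[0]) + 1)            # feature weights + intercept
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi + [1.0]))
            g = sigmoid(z) - yi            # gradient of the log loss
            w = [wj - lr * g * xj for wj, xj in zip(w, xi + [1.0])]
    return w

def predict(w, x):
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x + [1.0])))

X = [[0.9, 0.2], [0.8, 0.3], [0.2, 0.9], [0.1, 0.8], [0.7, 0.4], [0.3, 0.7]]
y = [1, 1, 0, 0, 1, 0]                     # 1 = route was added
w = fit_logistic(X, y)
p_add = predict(w, [0.85, 0.25])           # likely-added candidate
p_skip = predict(w, [0.15, 0.85])          # likely-not-added candidate
```

The class imbalance the abstract mentions (few added routes among many candidate links) is exactly what makes this classification hard in practice; this toy data set is balanced only for clarity.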
Funk, Chris; Verdin, James P.; Husak, Gregory
2007-01-01
Famine early warning in Africa presents unique challenges and rewards. Hydrologic extremes must be tracked and anticipated over complex and changing climate regimes. The successful anticipation and interpretation of hydrologic shocks can initiate effective government response, saving lives and softening the impacts of droughts and floods. While both monitoring and forecast technologies continue to advance, discontinuities between monitoring and forecast systems inhibit effective decision making. Monitoring systems typically rely on high-resolution satellite remote-sensed normalized difference vegetation index (NDVI) and rainfall imagery. Forecast systems provide information in a variety of scales and formats. Non-meteorologists are often unable or unwilling to connect the dots between these disparate sources of information. To mitigate these problems, researchers at UCSB's Climate Hazard Group, NASA GIMMS and USGS/EROS are implementing a NASA-funded integrated decision support system that combines the monitoring of precipitation and NDVI with statistical one-to-three month forecasts. We present the monitoring/forecast system, assess its accuracy, and demonstrate its application in food-insecure sub-Saharan Africa.
A Bias and Variance Analysis for Multistep-Ahead Time Series Forecasting.
Ben Taieb, Souhaib; Atiya, Amir F
2016-01-01
Multistep-ahead forecasts can either be produced recursively by iterating a one-step-ahead time series model or directly by estimating a separate model for each forecast horizon. In addition, there are other strategies; some of them combine aspects of both aforementioned concepts. In this paper, we present a comprehensive investigation into the bias and variance behavior of multistep-ahead forecasting strategies. We provide a detailed review of the different multistep-ahead strategies. Subsequently, we perform a theoretical study that derives the bias and variance for a number of forecasting strategies. Finally, we conduct a Monte Carlo experimental study that compares and evaluates the bias and variance performance of the different strategies. From the theoretical and the simulation studies, we analyze the effect of different factors, such as the forecast horizon and the time series length, on the bias and variance components, and on the different multistep-ahead strategies. Several lessons are learned, and recommendations are given concerning the advantages, disadvantages, and best conditions of use of each strategy.
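The recursive and direct strategies discussed above can be contrasted in a few lines, with a one-lag linear model fitted by least squares standing in for the time series model: the recursive strategy iterates a single one-step model, feeding its own predictions back in, while the direct strategy fits a separate model for each horizon. The synthetic series is illustrative; on noise-free linear data the two strategies coincide, and their bias/variance differences emerge only under noise and misspecification.

```python
# Recursive vs direct multistep forecasting, with a one-lag least-squares
# linear model as the stand-in time series model. Synthetic, noise-free data.

def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def recursive_forecast(series, h):
    """Fit one one-step model y[t+1] = a + b*y[t]; iterate it h times."""
    a, b = fit_line(series[:-1], series[1:])
    y = series[-1]
    for _ in range(h):
        y = a + b * y
    return y

def direct_forecast(series, h):
    """Fit a separate model y[t+h] = a_h + b_h*y[t] for the horizon h."""
    a, b = fit_line(series[:-h], series[h:])
    return a + b * series[-1]

# Series generated by y[t+1] = 1 + 0.5*y[t], starting at 10.
series = [10.0, 6.0, 4.0, 3.0, 2.5, 2.25]
rec = recursive_forecast(series, h=2)
dirf = direct_forecast(series, h=2)
```

Broadly, the recursive strategy tends toward lower variance but accumulates bias through error feedback, while the direct strategy avoids that feedback at the cost of estimating more models from fewer usable observations per horizon.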
Clark, M.R.; Gangopadhyay, S.; Hay, L.; Rajagopalan, B.; Wilby, R.
2004-01-01
A number of statistical methods that are used to provide local-scale ensemble forecasts of precipitation and temperature do not contain realistic spatial covariability between neighboring stations or realistic temporal persistence for subsequent forecast lead times. To demonstrate this point, output from a global-scale numerical weather prediction model is used in a stepwise multiple linear regression approach to downscale precipitation and temperature to individual stations located in and around four study basins in the United States. Output from the forecast model is downscaled for lead times up to 14 days. Residuals in the regression equation are modeled stochastically to provide 100 ensemble forecasts. The precipitation and temperature ensembles from this approach have a poor representation of the spatial variability and temporal persistence. The spatial correlations for downscaled output are considerably lower than observed spatial correlations at short forecast lead times (e.g., less than 5 days) when there is high accuracy in the forecasts. At longer forecast lead times, the downscaled spatial correlations are close to zero. Similarly, the observed temporal persistence is only partly present at short forecast lead times. A method is presented for reordering the ensemble output in order to recover the space-time variability in precipitation and temperature fields. In this approach, the ensemble members for a given forecast day are ranked and matched with the rank of precipitation and temperature data from days randomly selected from similar dates in the historical record. The ensembles are then reordered to correspond to the original order of the selection of historical data. Using this approach, the observed intersite correlations, intervariable correlations, and the observed temporal persistence are almost entirely recovered. This reordering methodology also has applications for recovering the space-time variability in modeled streamflow. © 2004 American Meteorological Society.
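The rank-matching reordering described in the abstract can be sketched for a single station and variable. This is a simplified illustration under the assumption that one historical analog is drawn per ensemble member: the sorted forecast values are placed according to the rank structure of the historical sample, so the ensemble inherits the historical space-time dependence when this is applied jointly across stations and lead times.

```python
import numpy as np

def reorder_ensemble(forecast, historical):
    """Give the forecast ensemble the rank order of the historical sample.

    forecast, historical: 1-D arrays of equal length (one value per member).
    The sorted forecast values are assigned the ranks that the historical
    values hold, preserving the forecast's marginal distribution while
    imposing the historical ordering.
    """
    forecast = np.sort(np.asarray(forecast, dtype=float))
    historical = np.asarray(historical, dtype=float)
    ranks = historical.argsort().argsort()  # rank of each historical value
    return forecast[ranks]

fc = np.array([2.0, 9.0, 5.0])    # ensemble members for one station/day
hist = np.array([0.3, 0.1, 0.2])  # values from randomly selected analog days
print(reorder_ensemble(fc, hist))  # -> [9. 2. 5.]
```

Applying the same historical draw across all stations and lead times is what recovers the intersite correlations and temporal persistence.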
NASA Technical Reports Server (NTRS)
Raymond, William H.; Olson, William S.
1990-01-01
Delay in the spin-up of precipitation early in numerical atmospheric forecasts is a deficiency correctable by diabatic initialization combined with diabatic forcing. For either to be effective requires some knowledge of the magnitude and vertical placement of the latent heating fields. Until recently the best source of cloud and rain water data was the remotely sensed vertically integrated precipitation rate or liquid water content. Vertical placement of the condensation remains unknown. Some information about the vertical distribution of the heating rates and precipitating liquid water and ice can be obtained from retrieval techniques that use a physical model of precipitating clouds to refine and improve the interpretation of the remotely sensed data. A description of this procedure and an examination of its 3-D liquid water products, along with improved modeling methods that enhance or speed up storm development, are discussed.
NASA Technical Reports Server (NTRS)
Shiau, Jyh-Jen; Wahba, Grace; Johnson, Donald R.
1986-01-01
A new method, based on partial spline models, is developed for including specified discontinuities in otherwise smooth two- and three-dimensional objective analyses. The method is appropriate for including tropopause height information in two- and three-dimensional temperature analyses, using the O'Sullivan-Wahba physical variational method for analysis of satellite radiance data, and may in principle be used in a combined variational analysis of observed, forecast, and climate information. A numerical method for its implementation is described and a prototype two-dimensional analysis based on simulated radiosonde and tropopause height data is shown. The method may also be appropriate for other geophysical problems, such as modeling the ocean thermocline, fronts, discontinuities, etc.
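The core idea of a partial spline model, fitting a smooth function plus a term that is allowed to jump at a specified location, can be illustrated in one dimension. This is a crude least-squares stand-in (polynomial basis augmented with a Heaviside step at a known breakpoint), not the variational method of the paper:

```python
import numpy as np

def fit_with_jump(x, y, x0, degree=3):
    """Fit y ≈ smooth polynomial in x plus a step of unknown size at x0.

    The design matrix is a polynomial basis augmented with an indicator
    column for x >= x0, so the fit is smooth except for one allowed jump.
    Returns polynomial coefficients followed by the estimated jump size.
    """
    X = np.vander(np.asarray(x, float), degree + 1, increasing=True)
    step = (np.asarray(x, float) >= x0).astype(float)[:, None]
    A = np.hstack([X, step])
    coef, *_ = np.linalg.lstsq(A, np.asarray(y, float), rcond=None)
    return coef

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
# smooth signal with a known discontinuity of size 1.5 at x = 0.6
y = np.sin(2 * x) + 1.5 * (x >= 0.6) + 0.01 * rng.standard_normal(x.size)
coef = fit_with_jump(x, y, 0.6)
print(round(coef[-1], 2))  # estimated jump size, close to 1.5
```

In the paper's setting the smooth part is a thin-plate-type spline in two or three dimensions and the discontinuity is tied to tropopause height, but the augmented-basis structure is analogous.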