Sample records for forecast method based

  1. Application Study of Comprehensive Forecasting Model Based on Entropy Weighting Method on Trend of PM2.5 Concentration in Guangzhou, China

    PubMed Central

    Liu, Dong-jun; Li, Li

    2015-01-01

    PM2.5 is the main contributing factor to haze-fog pollution in China. In this study, the trend of PM2.5 concentration was first analyzed qualitatively using mathematical models and simulation. The comprehensive forecasting model (CFM) was developed based on combination forecasting ideas. The Autoregressive Integrated Moving Average (ARIMA) model, Artificial Neural Networks (ANNs), and the Exponential Smoothing Method (ESM) were used to predict the time series of PM2.5 concentration. The results of the three methods were combined using weights from the Entropy Weighting Method to obtain the CFM output. The trend of PM2.5 concentration in Guangzhou, China was then forecasted quantitatively with the CFM, the results were compared with those of the three single models, and PM2.5 concentrations for the next ten days were predicted. The comprehensive forecasting model balanced the deviations of the single prediction methods and had better applicability, offering a new prediction approach for the air quality forecasting field. PMID:26110332
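
    As a rough illustration of the combination step, the sketch below derives entropy-based weights from each model's historical absolute errors and blends three point forecasts. The normalization shown is one common entropy-weighting variant and may differ from the paper's exact formulation; all data are placeholders.

    ```python
    import numpy as np

    def entropy_weights(abs_errors):
        """abs_errors: (n_samples, n_methods) historical absolute errors.
        Methods whose errors are small and evenly spread get larger weights."""
        m, _ = abs_errors.shape
        p = abs_errors / abs_errors.sum(axis=0)        # each method's error shares
        p = np.clip(p, 1e-12, None)                    # guard against log(0)
        h = -(p * np.log(p)).sum(axis=0) / np.log(m)   # entropy per method, in [0, 1]
        d = 1.0 - h                                    # divergence of each method
        w = 1.0 - d / d.sum()                          # low divergence -> high weight
        return w / w.sum()

    hist_err = np.abs(np.random.randn(100, 3))         # placeholder ARIMA/ANN/ESM errors
    next_day = np.array([41.2, 39.8, 43.1])            # placeholder PM2.5 forecasts
    cfm_forecast = float(entropy_weights(hist_err) @ next_day)
    ```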

  2. Application Study of Comprehensive Forecasting Model Based on Entropy Weighting Method on Trend of PM2.5 Concentration in Guangzhou, China.

    PubMed

    Liu, Dong-jun; Li, Li

    2015-06-23

    PM2.5 is the main contributing factor to haze-fog pollution in China. In this study, the trend of PM2.5 concentration was first analyzed qualitatively using mathematical models and simulation. The comprehensive forecasting model (CFM) was developed based on combination forecasting ideas. The Autoregressive Integrated Moving Average (ARIMA) model, Artificial Neural Networks (ANNs), and the Exponential Smoothing Method (ESM) were used to predict the time series of PM2.5 concentration. The results of the three methods were combined using weights from the Entropy Weighting Method to obtain the CFM output. The trend of PM2.5 concentration in Guangzhou, China was then forecasted quantitatively with the CFM, the results were compared with those of the three single models, and PM2.5 concentrations for the next ten days were predicted. The comprehensive forecasting model balanced the deviations of the single prediction methods and had better applicability, offering a new prediction approach for the air quality forecasting field.

  3. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability density function (PDF) of the forecasting errors, and the cumulative distribution function (CDF) is deduced analytically. The inverse transform method, based on Monte Carlo sampling and the CDF, is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results for ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that, within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
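
    A compact sketch of the error-scenario step under stated assumptions (synthetic errors, three mixture components, numerical inversion of the analytic CDF on a grid); this is not the authors' code:

    ```python
    import numpy as np
    from scipy.stats import norm
    from sklearn.mixture import GaussianMixture

    errors = np.random.randn(5000) * 0.05               # placeholder historical errors
    gmm = GaussianMixture(n_components=3).fit(errors.reshape(-1, 1))
    w = gmm.weights_
    mu = gmm.means_.ravel()
    sd = np.sqrt(gmm.covariances_.ravel())

    def gmm_cdf(x):
        """Analytic CDF of the fitted Gaussian mixture."""
        return np.sum(w * norm.cdf((x[..., None] - mu) / sd), axis=-1)

    grid = np.linspace(errors.min() - 0.1, errors.max() + 0.1, 4001)
    cdf = gmm_cdf(grid)
    u = np.random.rand(10000)                            # Monte Carlo uniforms
    scenario_errors = np.interp(u, cdf, grid)            # inverse-transform sampling

    base_forecast = 0.6                                  # normalized power forecast
    scenarios = np.clip(base_forecast + scenario_errors, 0.0, 1.0)
    ```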

  4. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    2017-08-31

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability density function (PDF) of the forecasting errors, and the cumulative distribution function (CDF) is deduced analytically. The inverse transform method, based on Monte Carlo sampling and the CDF, is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results for ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that, within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.

  5. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method.

    PubMed

    Yang, Jun-He; Cheng, Ching-Hsue; Chan, Chia-Pan

    2017-01-01

    Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model that combines missing-value estimation with variable selection to forecast a reservoir's water level. The study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015; the two datasets were merged by date into an integrated research dataset. The proposed model has three main steps. First, five imputation methods are applied to handle the missing values. Second, the key variables are identified via factor analysis, and unimportant variables are then deleted sequentially via the variable selection method. Finally, a Random Forest is used to build the forecasting model of the reservoir's water level, which is compared with the listed benchmark methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model combined with the proposed variable selection achieves better forecasting performance than the listed models using the full variable set. In addition, the experiments show that the proposed variable selection can help the five forecast methods used here improve their forecasting capability.
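
    A minimal sketch of the imputation-plus-Random-Forest pipeline with scikit-learn on synthetic data; the paper compares five imputation methods, whereas this sketch stands in a single mean imputer, and all column names are hypothetical:

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.impute import SimpleImputer
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    X = pd.DataFrame(rng.normal(size=(2000, 5)),
                     columns=["rain", "temp", "humidity", "wind", "inflow"])
    y = 5 * X["rain"] + 2 * X["inflow"] + rng.normal(size=2000)  # water-level proxy
    X.iloc[rng.integers(0, 2000, 150), 0] = np.nan               # inject missing values

    model = make_pipeline(SimpleImputer(strategy="mean"),
                          RandomForestRegressor(n_estimators=300, random_state=0))
    model.fit(X[:1500], y[:1500])                 # train on the older part of the record
    rmse = float(np.sqrt(np.mean((model.predict(X[1500:]) - y[1500:]) ** 2)))
    ```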

  6. Forecasting peaks of seasonal influenza epidemics.

    PubMed

    Nsoesie, Elaine; Marathe, Madhav; Brownstein, John

    2013-06-21

    We present a framework for near real-time forecast of influenza epidemics using a simulation optimization approach. The method combines an individual-based model and a simple root finding optimization method for parameter estimation and forecasting. In this study, retrospective forecasts were generated for seasonal influenza epidemics using web-based estimates of influenza activity from Google Flu Trends for 2004-2005, 2007-2008 and 2012-2013 flu seasons. In some cases, the peak could be forecasted 5-6 weeks ahead. This study adds to existing resources for influenza forecasting and the proposed method can be used in conjunction with other approaches in an ensemble framework.

  7. Action-based flood forecasting for triggering humanitarian action

    NASA Astrophysics Data System (ADS)

    Coughlan de Perez, Erin; van den Hurk, Bart; van Aalst, Maarten K.; Amuron, Irene; Bamanya, Deus; Hauser, Tristan; Jongma, Brenden; Lopez, Ana; Mason, Simon; Mendler de Suarez, Janot; Pappenberger, Florian; Rueth, Alexandra; Stephens, Elisabeth; Suarez, Pablo; Wagemaker, Jurjen; Zsoter, Ervin

    2016-09-01

    Too often, credible scientific early warning information of increased disaster risk does not result in humanitarian action. With financial resources tilted heavily towards response after a disaster, disaster managers have limited incentive and ability to process complex scientific data, including uncertainties. These incentives are beginning to change, with the advent of several new forecast-based financing systems that provide funding based on a forecast of an extreme event. Given the changing landscape, here we demonstrate a method to select and use appropriate forecasts for specific humanitarian disaster prevention actions, even in a data-scarce location. This action-based forecasting methodology takes into account the parameters of each action, such as action lifetime, when verifying a forecast. Forecasts are linked with action based on an understanding of (1) the magnitude of previous flooding events and (2) the willingness to act "in vain" for specific actions. This is applied in the context of the Uganda Red Cross Society forecast-based financing pilot project, with forecasts from the Global Flood Awareness System (GloFAS). Using this method, we define the "danger level" of flooding, and we select the probabilistic forecast triggers that are appropriate for specific actions. Results from this methodology can be applied globally across hazards and fed into a financing system that ensures that automatic, pre-funded early action will be triggered by forecasts.

  8. A two-stage method of quantitative flood risk analysis for reservoir real-time operation using ensemble-based hydrologic forecasts

    NASA Astrophysics Data System (ADS)

    Liu, P.

    2013-12-01

    Quantitative risk analysis for reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts depict not only the marginal distributions of inflows but also their temporal persistence via scenarios. This motivates us to analyze reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed that uses the forecast horizon point to divide the future into two stages: the forecast lead time and the unpredicted time. The risk within the forecast lead time is computed by counting the number of forecast scenarios that fail, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up that quantifies the entire flood risk as the ratio of the number of scenarios exceeding the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where parameter and precipitation uncertainties are implemented to produce ensemble-based hydrologic forecasts. Bayesian inference via Markov chain Monte Carlo is used to account for the parameter uncertainty. Two reservoir operation schemes, the actual historical operation and a scenario-based optimization, are evaluated in terms of flood risk and hydropower profit. For the 2010 flood, it is found that improving hydrologic forecast accuracy does not necessarily decrease the real-time operation risk, and that most of the risk comes from the forecast lead time. It is therefore valuable to reduce the variance of ensemble-based hydrologic forecasts, while keeping their bias low, for reservoir operational purposes.
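
    The lead-time component of the risk measure reduces to counting exceedances across the ensemble; a minimal sketch with placeholder water-level scenarios (the unpredicted-time stage, which routes design floods from the horizon-point levels, is omitted):

    ```python
    import numpy as np

    def lead_time_flood_risk(scenarios, critical_level):
        """scenarios: (n_scenarios, n_steps) forecasted reservoir water levels
        within the forecast lead time. Risk = fraction of scenarios whose
        peak exceeds the critical value."""
        peaks = scenarios.max(axis=1)
        return float((peaks > critical_level).mean())

    ens = np.random.normal(160.0, 2.0, size=(1000, 72))   # placeholder ensemble, metres
    risk = lead_time_flood_risk(ens, critical_level=165.0)
    ```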

  9. Spatial forecast of landslides in Three Gorges based on spatial data mining.

    PubMed

    Wang, Xianmin; Niu, Ruiqing

    2009-01-01

    The Three Gorges is a region with a very high landslide distribution density and a concentrated population, where landslide disasters are frequent and the potential risk of landslides is tremendous. In this paper, focusing on the complicated landform of Three Gorges, spatial forecasting of landslides is studied by establishing 20 forecast factors (spectra, texture, vegetation coverage, reservoir water level, slope structure, engineering rock group, elevation, slope, aspect, etc.). China-Brazil Earth Resources Satellite (CBERS) images and a C4.5 decision tree were used to mine spatial landslide forecast criteria for Guojiaba Town (Zhigui County) in Three Gorges, and based on this knowledge, intelligent spatial landslide forecasts were performed for Guojiaba Town. All landslides lie in the regions forecast as dangerous and unstable, so the forecast result is good. The method proposed in the paper is compared with seven other methods: IsoData, K-Means, Mahalanobis Distance, Maximum Likelihood, Minimum Distance, Parallelepiped and the Information Content Model. The experimental results show that the proposed method has a forecast precision noticeably higher than that of the other seven methods.
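
    A rough analogue of the mining step using scikit-learn's decision tree with the entropy criterion; scikit-learn implements CART rather than C4.5, so this is only an approximation, and the 20 forecast factors are stand-in synthetic features:

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(1)
    X = rng.normal(size=(3000, 20))          # 20 forecast factors per map cell
    y = (X[:, 0] + 0.5 * X[:, 3]             # synthetic landslide / no-landslide label
         + rng.normal(scale=0.5, size=3000) > 1.0).astype(int)

    # entropy criterion mimics C4.5's information-gain splitting
    tree = DecisionTreeClassifier(criterion="entropy", max_depth=6, random_state=0)
    print(cross_val_score(tree, X, y, cv=5).mean())
    tree.fit(X, y)
    susceptible = tree.predict(X)            # per-cell susceptibility label
    ```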

  10. Spatial Forecast of Landslides in Three Gorges Based On Spatial Data Mining

    PubMed Central

    Wang, Xianmin; Niu, Ruiqing

    2009-01-01

    The Three Gorges is a region with a very high landslide distribution density and a concentrated population, where landslide disasters are frequent and the potential risk of landslides is tremendous. In this paper, focusing on the complicated landform of Three Gorges, spatial forecasting of landslides is studied by establishing 20 forecast factors (spectra, texture, vegetation coverage, reservoir water level, slope structure, engineering rock group, elevation, slope, aspect, etc.). China-Brazil Earth Resources Satellite (CBERS) images and a C4.5 decision tree were used to mine spatial landslide forecast criteria for Guojiaba Town (Zhigui County) in Three Gorges, and based on this knowledge, intelligent spatial landslide forecasts were performed for Guojiaba Town. All landslides lie in the regions forecast as dangerous and unstable, so the forecast result is good. The method proposed in the paper is compared with seven other methods: IsoData, K-Means, Mahalanobis Distance, Maximum Likelihood, Minimum Distance, Parallelepiped and the Information Content Model. The experimental results show that the proposed method has a forecast precision noticeably higher than that of the other seven methods. PMID:22573999

  11. From Tornadoes to Earthquakes: Forecast Verification for Binary Events Applied to the 1999 Chi-Chi, Taiwan, Earthquake

    NASA Astrophysics Data System (ADS)

    Chen, C.; Rundle, J. B.; Holliday, J. R.; Nanjo, K.; Turcotte, D. L.; Li, S.; Tiampo, K. F.

    2005-12-01

    Forecast verification procedures for statistical events with binary outcomes typically rely on the use of contingency tables and Relative Operating Characteristic (ROC) diagrams. Originally developed for the statistical evaluation of tornado forecasts on a county-by-county basis, these methods can be adapted to the evaluation of competing earthquake forecasts. Here we apply them retrospectively to two forecasts for the m = 7.3, 1999 Chi-Chi, Taiwan, earthquake. These forecasts are based on the Pattern Informatics (PI) method, which locates likely sites for future large earthquakes based on large changes in the activity of the smallest earthquakes. A competing null hypothesis, Relative Intensity (RI), is based on the idea that future large earthquake locations are correlated with sites having the greatest frequency of small earthquakes. We show that for Taiwan, the PI forecast method is superior to the RI null hypothesis. Inspection of the two maps indicates that their forecast locations are indeed quite different. Our results confirm an earlier finding that the earthquake preparation process for events such as the Chi-Chi earthquake involves anomalous changes in activation or quiescence, and that signatures of these processes can be detected in precursory seismicity data. Furthermore, we find that our methods can accurately forecast the locations of aftershocks from precursory seismicity changes alone, implying that the main shock together with its aftershocks represents a single manifestation of the formation of a high-stress region nucleating prior to the main shock.
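
    The verification machinery itself is compact. A sketch of the contingency-table counts and the hit-rate/false-alarm-rate pair that traces out an ROC curve as the forecast threshold is swept; the score and observation arrays are random placeholders:

    ```python
    import numpy as np

    def contingency(forecast_hot, observed_hot):
        """forecast_hot, observed_hot: boolean arrays over spatial cells."""
        a = np.sum(forecast_hot & observed_hot)      # hits
        b = np.sum(forecast_hot & ~observed_hot)     # false alarms
        c = np.sum(~forecast_hot & observed_hot)     # misses
        d = np.sum(~forecast_hot & ~observed_hot)    # correct negatives
        return a / (a + c), b / (b + d)              # hit rate, false alarm rate

    scores = np.random.rand(10000)                   # e.g. PI index per cell
    obs = np.random.rand(10000) < 0.02               # cells with large earthquakes
    roc = [contingency(scores > t, obs) for t in np.linspace(0, 1, 51)]
    ```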

  12. A stochastic post-processing method for solar irradiance forecasts derived from NWPs models

    NASA Astrophysics Data System (ADS)

    Lara-Fanego, V.; Pozo-Vazquez, D.; Ruiz-Arias, J. A.; Santos-Alamillos, F. J.; Tovar-Pescador, J.

    2010-09-01

    Solar irradiance forecasting is an important area of research for the future of solar-based renewable energy systems. Numerical Weather Prediction (NWP) models have proved to be a valuable tool for solar irradiance forecasting with lead times of up to a few days. Nevertheless, these models show low skill in forecasting solar irradiance under cloudy conditions. Additionally, climatological (seasonally averaged) aerosol loadings are usually used in these models, leading to considerable errors in Direct Normal Irradiance (DNI) forecasts during high aerosol load conditions. In this work we propose a post-processing method for the Global Horizontal Irradiance (GHI) and DNI forecasts derived from NWP models. In particular, the method is based on Autoregressive Moving Average models with External Explanatory Variables (ARMAX). These stochastic models are applied to the residuals of the NWP forecasts and use as external variables the measured cloud fraction and aerosol loading of the day previous to the forecast. The method is evaluated on a set of one-month-long, three-day-ahead forecasts of GHI and DNI, obtained with the WRF mesoscale atmospheric model, for several locations in Andalusia (Southern Spain). The cloud fraction is derived from MSG satellite estimates and the aerosol loading from MODIS platform estimates; both sources of information are readily available at the time of the forecast. Results show a considerable improvement in the forecasting skill of the WRF model when using the proposed post-processing method. In particular, the relative improvement (in terms of RMSE) for DNI during summer is about 20%, with a similar value for GHI during winter.
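
    A minimal sketch of the residual post-processing using statsmodels' SARIMAX as an ARMAX(1,1); the synthetic residual and exogenous series are placeholders for the WRF residuals, previous-day cloud fraction, and aerosol load:

    ```python
    import numpy as np
    import statsmodels.api as sm

    n = 300
    rng = np.random.default_rng(2)
    cloud_prev = rng.uniform(0, 1, n)                  # previous-day cloud fraction
    aerosol_prev = rng.uniform(0, 0.6, n)              # previous-day aerosol load
    resid = 50 * cloud_prev + 30 * aerosol_prev + rng.normal(0, 10, n)  # NWP residuals

    exog = np.column_stack([cloud_prev, aerosol_prev])
    armax = sm.tsa.SARIMAX(resid, exog=exog, order=(1, 0, 1)).fit(disp=False)

    # predict the residual for the next 3 days, then subtract it from the raw forecast
    exog_new = np.column_stack([[0.4, 0.5, 0.3], [0.2, 0.2, 0.25]])
    resid_hat = armax.forecast(steps=3, exog=exog_new)
    # corrected_ghi = raw_nwp_ghi - resid_hat
    ```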

  13. Air Pollution Forecasts: An Overview

    PubMed Central

    Bai, Lu; Wang, Jianzhou; Lu, Haiyan

    2018-01-01

    Air pollution is defined as a phenomenon harmful to the ecological system and the normal conditions of human existence and development when some substances in the atmosphere exceed a certain concentration. In the face of increasingly serious environmental pollution problems, scholars have conducted a significant quantity of related research, and in those studies, the forecasting of air pollution has been of paramount importance. As a precaution, the air pollution forecast is the basis for taking effective pollution control measures, and accurate forecasting of air pollution has become an important task. Extensive research indicates that the methods of air pollution forecasting can be broadly divided into three classical categories: statistical forecasting methods, artificial intelligence methods, and numerical forecasting methods. More recently, some hybrid models have been proposed, which can improve the forecast accuracy. To provide a clear perspective on air pollution forecasting, this study reviews the theory and application of those forecasting models. In addition, based on a comparison of different forecasting methods, the advantages and disadvantages of some methods of forecasting are also provided. This study aims to provide an overview of air pollution forecasting methods for easy access and reference by researchers, which will be helpful in further studies. PMID:29673227

  14. Air Pollution Forecasts: An Overview.

    PubMed

    Bai, Lu; Wang, Jianzhou; Ma, Xuejiao; Lu, Haiyan

    2018-04-17

    Air pollution is defined as a phenomenon harmful to the ecological system and the normal conditions of human existence and development when some substances in the atmosphere exceed a certain concentration. In the face of increasingly serious environmental pollution problems, scholars have conducted a significant quantity of related research, and in those studies, the forecasting of air pollution has been of paramount importance. As a precaution, the air pollution forecast is the basis for taking effective pollution control measures, and accurate forecasting of air pollution has become an important task. Extensive research indicates that the methods of air pollution forecasting can be broadly divided into three classical categories: statistical forecasting methods, artificial intelligence methods, and numerical forecasting methods. More recently, some hybrid models have been proposed, which can improve the forecast accuracy. To provide a clear perspective on air pollution forecasting, this study reviews the theory and application of those forecasting models. In addition, based on a comparison of different forecasting methods, the advantages and disadvantages of some methods of forecasting are also provided. This study aims to provide an overview of air pollution forecasting methods for easy access and reference by researchers, which will be helpful in further studies.

  15. A Load-Based Temperature Prediction Model for Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Sobhani, Masoud

    Electric load forecasting, as a basic requirement for the decision-making in power utilities, has been improved in various aspects in the past decades. Many factors may affect the accuracy of the load forecasts, such as data quality, goodness of the underlying model and load composition. Due to the strong correlation between the input variables (e.g., weather and calendar variables) and the load, the quality of input data plays a vital role in forecasting practices. Even if the forecasting model were able to capture most of the salient features of the load, a low quality input data may result in inaccurate forecasts. Most of the data cleansing efforts in the load forecasting literature have been devoted to the load data. Few studies focused on weather data cleansing for load forecasting. This research proposes an anomaly detection method for the temperature data. The method consists of two components: a load-based temperature prediction model and a detection technique. The effectiveness of the proposed method is demonstrated through two case studies: one based on the data from the Global Energy Forecasting Competition 2014, and the other based on the data published by ISO New England. The results show that by removing the detected observations from the original input data, the final load forecast accuracy is enhanced.
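
    The abstract does not give the model's exact form, so the sketch below stands in a simple polynomial load-temperature relationship and flags observations with extreme residuals; the threshold and data are illustrative only:

    ```python
    import numpy as np

    def flag_temperature_anomalies(load, temp, z_thresh=4.0):
        """Fit a cubic temperature-vs-load curve (a stand-in for the
        load-based prediction model) and flag extreme residuals."""
        coef = np.polyfit(load, temp, deg=3)
        resid = temp - np.polyval(coef, load)
        z = (resid - resid.mean()) / resid.std()
        return np.abs(z) > z_thresh

    rng = np.random.default_rng(3)
    temp = rng.uniform(-5, 35, 5000)
    load = 1200 + 35 * temp + rng.normal(0, 40, 5000)   # synthetic monotone relation
    temp_bad = temp.copy()
    temp_bad[::500] += 25                                # inject faulty sensor readings
    mask = flag_temperature_anomalies(load, temp_bad)    # drop these before forecasting
    ```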

  16. Forecasting Jakarta composite index (IHSG) based on chen fuzzy time series and firefly clustering algorithm

    NASA Astrophysics Data System (ADS)

    Ningrum, R. W.; Surarso, B.; Farikhin; Safarudin, Y. M.

    2018-03-01

    This paper proposes a combination of the Firefly Algorithm (FA) and Chen's fuzzy time series forecasting. Most existing fuzzy forecasting methods based on fuzzy time series use static interval lengths. Therefore, we apply an artificial intelligence technique, the Firefly Algorithm (FA), to set non-uniform interval lengths for each cluster in Chen's method. The method is evaluated on the Jakarta Composite Index (IHSG) and compared with classical Chen fuzzy time series forecasting; its performance is verified through simulation in MATLAB.

  17. Automated time series forecasting for biosurveillance.

    PubMed

    Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit

    2007-09-30

    For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
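
    A sketch of forecast method (3) on a synthetic daily series with a weekly cycle, producing the residual stream that would feed the control chart, together with the MedAPE criterion; the data and smoothing configuration are assumptions:

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    idx = pd.date_range("2006-01-01", periods=365, freq="D")
    rng = np.random.default_rng(4)
    y = pd.Series(100 + 20 * np.sin(2 * np.pi * idx.dayofweek / 7)
                  + rng.poisson(10, 365), index=idx)     # placeholder syndromic counts

    fit = ExponentialSmoothing(y, trend="add", seasonal="add",
                               seasonal_periods=7).fit()
    residuals = y - fit.fittedvalues          # input for the detection algorithm
    medape = float(np.median(np.abs(residuals / y)) * 100)
    ```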

  18. Load forecast method of electric vehicle charging station using SVR based on GA-PSO

    NASA Astrophysics Data System (ADS)

    Lu, Kuan; Sun, Wenxue; Ma, Changhui; Yang, Shenquan; Zhu, Zijian; Zhao, Pengfei; Zhao, Xin; Xu, Nan

    2017-06-01

    This paper presents a Support Vector Regression (SVR) method for electric vehicle (EV) charging station load forecasting based on a genetic algorithm (GA) and particle swarm optimization (PSO). Fuzzy C-Means (FCM) clustering is used to establish similar-day samples. GA is used for global parameter searching and PSO for a more accurate local search; the load forecast is then regressed using SVR. Practical load data from an EV charging station illustrate the proposed method. The results indicate an obvious improvement in forecasting accuracy compared with SVRs based on PSO or GA alone.

  19. Fuzzy forecasting based on fuzzy-trend logical relationship groups.

    PubMed

    Chen, Shyi-Ming; Wang, Nai-Yi

    2010-10-01

    In this paper, we present a new method to predict the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) based on fuzzy-trend logical relationship groups (FTLRGs). The proposed method divides fuzzy logical relationships into FTLRGs based on the trend of adjacent fuzzy sets appearing in the antecedents of fuzzy logical relationships. First, we apply an automatic clustering algorithm to cluster the historical data into intervals of different lengths, and we define fuzzy sets based on these intervals. The historical data are then fuzzified into fuzzy sets to derive fuzzy logical relationships, which are divided into FTLRGs for forecasting the TAIEX. Moreover, we also apply the proposed method to forecast enrollments and inventory demand, respectively. The experimental results show that the proposed method achieves higher average forecasting accuracy rates than the existing methods.

  20. Forecasting the short-term passenger flow on high-speed railway with neural networks.

    PubMed

    Xie, Mei-Quan; Li, Xia-Miao; Zhou, Wen-Liang; Fu, Yan-Bing

    2014-01-01

    Short-term passenger flow forecasting is an important component of transportation systems. The forecasting result can be applied to support transportation system operation and management, such as operation planning and revenue management. In this paper, a divide-and-conquer method based on neural networks and origin-destination (OD) matrix estimation is developed to forecast short-term passenger flow in a high-speed railway system. There are three steps in the forecasting method. First, the numbers of passengers who arrive at or depart from each station are obtained from historical passenger flow data, which form the OD matrices in this paper. Second, short-term forecasting of the numbers of passengers who arrive at or depart from each station is realized with neural networks. Finally, the short-term OD matrices are obtained with an OD matrix estimation method. The experimental results indicate that the proposed divide-and-conquer method performs well in forecasting short-term passenger flow on high-speed railway.

  1. A simplified real time method to forecast semi-enclosed basins storm surge

    NASA Astrophysics Data System (ADS)

    Pasquali, D.; Di Risio, M.; De Girolamo, P.

    2015-11-01

    Semi-enclosed basins are often prone to storm surge events: their meteorological exposure, the presence of a large continental shelf, and their shape can lead to strong sea level set-up. A real-time system aimed at forecasting storm surge can be of great help in protecting human activities (i.e., forecasting flooding due to storm surge events), managing ports, and safeguarding coastal safety. This paper illustrates a simple method able to forecast storm surge events in semi-enclosed basins in real time. The method is based on a mixed approach in which the results obtained by a simplified physics-based model with low computational cost are corrected by statistical techniques. The proposed method is applied to a point of interest located in the northern part of the Adriatic Sea. The comparison of forecasted levels against observed values shows the satisfactory reliability of the forecasts.

  2. Predicting Academic Library Circulations: A Forecasting Methods Competition.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.; Forys, John W., Jr.

    Based on sample data representing five years of monthly circulation totals from 50 academic libraries in Illinois, Iowa, Michigan, Minnesota, Missouri, and Ohio, a study was conducted to determine the most efficient smoothing forecasting methods for academic libraries. Smoothing forecasting methods were chosen because they have been characterized…

  3. A hybrid approach EMD-HW for short-term forecasting of daily stock market time series data

    NASA Astrophysics Data System (ADS)

    Awajan, Ahmad Mohd; Ismail, Mohd Tahir

    2017-08-01

    Recently, time series forecasting has attracted considerable attention in the analysis of financial time series data, specifically stock market indices; stock market forecasting remains a challenging area of financial time-series forecasting. In this study, a hybrid methodology combining Empirical Mode Decomposition with the Holt-Winters method (EMD-HW) is used to improve forecasting performance for financial time series. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without any need for a transformation method; moreover, it has relatively high accuracy and offers a new forecasting approach for time series. Daily stock market time series data from 11 countries are used to show the forecasting performance of the proposed EMD-HW. Based on three forecast accuracy measures, the results indicate that the forecasting performance of EMD-HW is superior to the traditional Holt-Winters forecasting method.
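
    A rough sketch of the EMD-HW idea under stated assumptions: the PyEMD package (pip name EMD-signal) supplies the decomposition, each intrinsic mode function and the residue are forecast with additive-trend exponential smoothing, and the component forecasts are summed back together; the price series is a placeholder:

    ```python
    import numpy as np
    from PyEMD import EMD                       # pip install EMD-signal
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    prices = 100 + np.cumsum(np.random.default_rng(6).normal(size=500))
    imfs = EMD().emd(prices)                    # IMFs plus the final residue

    horizon = 10
    forecast = np.zeros(horizon)
    for comp in imfs:                           # forecast each component separately
        fit = ExponentialSmoothing(comp, trend="add").fit()
        forecast += fit.forecast(horizon)       # recombine by summation
    ```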

  4. Verification of Weather Running Estimate-Nowcast (WRE-N) Forecasts Using a Spatial-Categorical Method

    DTIC Science & Technology

    2017-07-01

    ...forecasts and observations on a common grid, which enables the application of a number of different spatial verification methods that reveal various... forecasts of continuous meteorological variables using categorical and object-based methods. White Sands Missile Range (NM): Army Research Laboratory (US)... research version of the Weather Research and Forecasting Model adapted for generating short-range nowcasts, and gridded observations produced by the...

  5. Prediction of a service demand using combined forecasting approach

    NASA Astrophysics Data System (ADS)

    Zhou, Ling

    2017-08-01

    Forecasting facilitates cutting operational and management costs while ensuring service levels for a logistics service provider. Our case study investigates how to forecast short-term logistics demand for an LTL (less-than-truckload) carrier. A combined approach depends on several forecasting methods simultaneously instead of a single method; it can offset the weaknesses of one forecasting method with the strengths of another, which improves the precision of the prediction. The main issues in combined forecast modeling are how to select the methods to combine and how to determine the weight coefficients among them. The principles of method selection are that each method should be applicable to the forecasting problem itself, and that the methods should differ in character as much as possible. Based on these principles, exponential smoothing, ARIMA and a neural network are chosen to form the combined approach. A least-squares technique is then employed to determine the optimal weight coefficients among the forecasting methods. Simulation results show the advantage of the combined approach over the three single methods. The work helps managers select prediction methods in practice.
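
    A minimal sketch of the weighting step: solve an ordinary least-squares problem on in-sample predictions from the three methods. The paper may additionally constrain the weights (e.g. to sum to one); the renormalization here is a simplification, and all data are placeholders:

    ```python
    import numpy as np

    def combination_weights(preds, actual):
        """preds: (n_samples, 3) in-sample predictions from exponential
        smoothing, ARIMA and the neural network; actual: (n_samples,)."""
        w, *_ = np.linalg.lstsq(preds, actual, rcond=None)
        return w / w.sum()                      # simple renormalization

    rng = np.random.default_rng(4)
    actual = np.cumsum(rng.normal(size=200)) + 50           # demand series stand-in
    preds = actual[:, None] + rng.normal(0, (1, 2, 3), (200, 3))
    w = combination_weights(preds, actual)
    combined = preds @ w                                    # combined forecast series
    ```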

  6. Replacement Beef Cow Valuation under Data Availability Constraints

    PubMed Central

    Hagerman, Amy D.; Thompson, Jada M.; Ham, Charlotte; Johnson, Kamina K.

    2017-01-01

    Economists are often tasked with estimating the benefits or costs associated with livestock production losses; however, lack of available data or absence of consistent reporting can reduce the accuracy of these valuations. This work looks at three potential estimation techniques for determining the value of replacement beef cows under varying levels of market data availability and discusses the potential margin of error for each technique. Oklahoma bred replacement cows are valued using hedonic pricing based on Oklahoma bred cow data (a best case scenario), vector error correction modeling (VECM) based on national cow sales data, and cost of production (COP) based on just a representative enterprise budget and very limited sales data. Each method was then used to perform a within-sample forecast for January to December 2016, and the forecasts were compared with the 2016 monthly observed market prices in Oklahoma using the mean absolute percent error (MAPE). Hedonic pricing tends to overvalue in within-sample forecasting but performed best, as measured by MAPE, for high quality cows. The VECM tended to undervalue cows but performed best for younger animals. COP performed well compared with the more data-intensive methods. Examining each method across eight representative replacement beef female types, the VECM forecast produced a MAPE under 10% in 33% of forecasted months, followed by hedonic pricing at 24% and COP at 14% for average quality beef females. For high quality females, hedonic pricing worked best, producing a MAPE under 10% in 36% of forecasted months, followed by COP at 21% and the VECM at 14%. These results suggest that livestock valuation method selection is not one-size-fits-all and may need to vary based not only on the data available but also on the characteristics (e.g., quality or age) of the livestock being valued. PMID:29164141

  7. Time-series-based hybrid mathematical modelling method adapted to forecast automotive and medical waste generation: Case study of Lithuania.

    PubMed

    Karpušenkaitė, Aistė; Ruzgas, Tomas; Denafas, Gintaras

    2018-05-01

    The aim of the study was to create a hybrid forecasting method that could produce more accurate forecasts than previously used 'pure' time series methods. Those methods had already been tested on total automotive waste, hazardous automotive waste, and total medical waste generation, but demonstrated at least a 6% error rate in different cases, and efforts were made to decrease it further. The newly developed hybrid models used a random start generation method to incorporate the advantages of different time-series methods, which helped increase forecast accuracy by 3%-4% in the hazardous automotive waste and total medical waste generation cases; the new model did not increase the accuracy of total automotive waste generation forecasts. The developed models' abilities to produce short- and mid-term forecasts were tested over a range of prediction horizons.

  8. Nonmechanistic forecasts of seasonal influenza with iterative one-week-ahead distributions.

    PubMed

    Brooks, Logan C; Farrow, David C; Hyun, Sangwon; Tibshirani, Ryan J; Rosenfeld, Roni

    2018-06-15

    Accurate and reliable forecasts of seasonal epidemics of infectious disease can assist in the design of countermeasures and increase public awareness and preparedness. This article describes two main contributions we made recently toward this goal: a novel approach to probabilistic modeling of surveillance time series based on "delta densities", and an optimization scheme for combining output from multiple forecasting methods into an adaptively weighted ensemble. Delta densities describe the probability distribution of the change between one observation and the next, conditioned on available data; chaining together nonparametric estimates of these distributions yields a model for an entire trajectory. Corresponding distributional forecasts cover more observed events than alternatives that treat the whole season as a unit, and improve upon multiple evaluation metrics when extracting key targets of interest to public health officials. Adaptively weighted ensembles integrate the results of multiple forecasting methods, such as delta density, using weights that can change from situation to situation. We treat selection of optimal weightings across forecasting methods as a separate estimation task, and describe an estimation procedure based on optimizing cross-validation performance. We consider some details of the data generation process, including data revisions and holiday effects, both in the construction of these forecasting methods and when performing retrospective evaluation. The delta density method and an adaptively weighted ensemble of other forecasting methods each improve significantly on the next best ensemble component when applied separately, and achieve even better cross-validated performance when used in conjunction. We submitted real-time forecasts based on these contributions as part of CDC's 2015/2016 FluSight Collaborative Comparison. Among the fourteen submissions that season, this system was ranked by CDC as the most accurate.

  9. Performance of time-series methods in forecasting the demand for red blood cell transfusion.

    PubMed

    Pereira, Arturo

    2004-05-01

    Planning future blood collection efforts must be based on adequate forecasts of transfusion demand. In this study, univariate time-series methods were investigated for their performance in forecasting the monthly demand for RBCs at one tertiary-care university hospital. Three time-series methods were investigated: autoregressive integrated moving average (ARIMA), the Holt-Winters family of exponential smoothing models, and one neural-network-based method. The time series consisted of the monthly demand for RBCs from January 1988 to December 2002 and was divided into two segments: the older one was used to fit or train the models, and the younger to test the accuracy of predictions. Performance was compared across forecasting methods by calculating goodness-of-fit statistics, the percentage of months in which forecast-based supply would have met the RBC demand (coverage rate), and the outdate rate. The RBC transfusion series was best fitted by a seasonal ARIMA(0,1,1)(0,1,1)12 model. Over 1-year time horizons, forecasts generated by ARIMA or exponential smoothing lay within the +/- 10 percent interval of the real RBC demand in 79 percent of months (62 percent in the case of neural networks). The coverage rates for the three methods were 89, 91, and 86 percent, respectively. Over 2-year time horizons, exponential smoothing largely outperformed the other methods: its predictions lay within the +/- 10 percent interval of real values in 75 percent of the 24 forecasted months, and the coverage rate was 87 percent. Over 1-year time horizons, predictions of RBC demand generated by ARIMA or exponential smoothing are accurate enough to be of help in planning blood collection efforts. For longer time horizons, exponential smoothing outperforms the other forecasting methods.
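
    The selected model is straightforward to reproduce with statsmodels; a sketch fitting a seasonal ARIMA(0,1,1)(0,1,1)12 to a synthetic monthly demand series and checking the +/- 10 percent criterion on a 12-month holdout:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    idx = pd.date_range("1988-01", periods=180, freq="MS")    # monthly, 15 years
    y = pd.Series(2000 + 150 * np.sin(2 * np.pi * np.arange(180) / 12)
                  + rng.normal(0, 60, 180), index=idx)        # RBC units stand-in

    train, test = y[:-12], y[-12:]
    fit = sm.tsa.SARIMAX(train, order=(0, 1, 1),
                         seasonal_order=(0, 1, 1, 12)).fit(disp=False)
    pred = fit.forecast(steps=12)
    share_within_10pct = float((np.abs(pred - test) / test < 0.10).mean())
    ```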

  10. Forecasting the Short-Term Passenger Flow on High-Speed Railway with Neural Networks

    PubMed Central

    Xie, Mei-Quan; Li, Xia-Miao; Zhou, Wen-Liang; Fu, Yan-Bing

    2014-01-01

    Short-term passenger flow forecasting is an important component of transportation systems. The forecasting result can be applied to support transportation system operation and management, such as operation planning and revenue management. In this paper, a divide-and-conquer method based on neural networks and origin-destination (OD) matrix estimation is developed to forecast short-term passenger flow in a high-speed railway system. There are three steps in the forecasting method. First, the numbers of passengers who arrive at or depart from each station are obtained from historical passenger flow data, which form the OD matrices in this paper. Second, short-term forecasting of the numbers of passengers who arrive at or depart from each station is realized with neural networks. Finally, the short-term OD matrices are obtained with an OD matrix estimation method. The experimental results indicate that the proposed divide-and-conquer method performs well in forecasting short-term passenger flow on high-speed railway. PMID:25544838

  11. Ensemble flare forecasting: using numerical weather prediction techniques to improve space weather operations

    NASA Astrophysics Data System (ADS)

    Murray, S.; Guerra, J. A.

    2017-12-01

    One essential component of operational space weather forecasting is the prediction of solar flares. Early flare forecasting work focused on statistical methods based on historical flaring rates, but more complex machine learning methods have been developed in recent years. A multitude of flare forecasting methods are now available, however it is still unclear which of these methods performs best, and none are substantially better than climatological forecasts. Current operational space weather centres cannot rely on automated methods, and generally use statistical forecasts with a little human intervention. Space weather researchers are increasingly looking towards methods used in terrestrial weather to improve current forecasting techniques. Ensemble forecasting has been used in numerical weather prediction for many years as a way to combine different predictions in order to obtain a more accurate result. It has proved useful in areas such as magnetospheric modelling and coronal mass ejection arrival analysis, however has not yet been implemented in operational flare forecasting. Here we construct ensemble forecasts for major solar flares by linearly combining the full-disk probabilistic forecasts from a group of operational forecasting methods (ASSA, ASAP, MAG4, MOSWOC, NOAA, and Solar Monitor). Forecasts from each method are weighted by a factor that accounts for the method's ability to predict previous events, and several performance metrics (both probabilistic and categorical) are considered. The results provide space weather forecasters with a set of parameters (combination weights, thresholds) that allow them to select the most appropriate values for constructing the 'best' ensemble forecast probability value, according to the performance metric of their choice. In this way different forecasts can be made to fit different end-user needs.

  12. Residual uncertainty estimation using instance-based learning with applications to hydrologic forecasting

    NASA Astrophysics Data System (ADS)

    Wani, Omar; Beckers, Joost V. L.; Weerts, Albrecht H.; Solomatine, Dimitri P.

    2017-08-01

    A non-parametric method is applied to quantify residual uncertainty in hydrologic streamflow forecasting. This method acts as a post-processor on deterministic model forecasts and generates a residual uncertainty distribution. Based on instance-based learning, it uses a k nearest-neighbour search for similar historical hydrometeorological conditions to determine uncertainty intervals from a set of historical errors, i.e. discrepancies between past forecast and observation. The performance of this method is assessed using test cases of hydrologic forecasting in two UK rivers: the Severn and Brue. Retrospective forecasts were made and their uncertainties were estimated using kNN resampling and two alternative uncertainty estimators: quantile regression (QR) and uncertainty estimation based on local errors and clustering (UNEEC). Results show that kNN uncertainty estimation produces accurate and narrow uncertainty intervals with good probability coverage. Analysis also shows that the performance of this technique depends on the choice of search space. Nevertheless, the accuracy and reliability of uncertainty intervals generated using kNN resampling are at least comparable to those produced by QR and UNEEC. It is concluded that kNN uncertainty estimation is an interesting alternative to other post-processors, like QR and UNEEC, for estimating forecast uncertainty. Apart from its concept being simple and well understood, an advantage of this method is that it is relatively easy to implement.
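
    A minimal sketch of the kNN resampling idea: find the k historical cases most similar to the current hydrometeorological conditions and take quantiles of their forecast errors as the uncertainty interval; the feature choice, k, and data are assumptions:

    ```python
    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def knn_uncertainty(hist_X, hist_err, query_X, k=50, q=(0.05, 0.95)):
        """hist_X: conditions for past forecasts; hist_err: observed minus
        forecast for those cases. Returns error quantiles from the k most
        similar past situations."""
        nn = NearestNeighbors(n_neighbors=k).fit(hist_X)
        _, idx = nn.kneighbors(np.atleast_2d(query_X))
        return np.quantile(hist_err[idx[0]], q)

    rng = np.random.default_rng(7)
    hist_X = rng.normal(size=(5000, 3))       # e.g. forecast flow, rainfall, season
    hist_err = rng.normal(scale=1 + np.abs(hist_X[:, 0]))
    lo, hi = knn_uncertainty(hist_X, hist_err, query_X=[1.2, 0.3, -0.5])
    # interval around the deterministic forecast f: [f + lo, f + hi]
    ```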

  13. Forecasting daily patient volumes in the emergency department.

    PubMed

    Jones, Spencer S; Thomas, Alun; Evans, R Scott; Welch, Shari J; Haug, Peter J; Snow, Gregory L

    2008-02-01

    Shifts in the supply of and demand for emergency department (ED) resources make the efficient allocation of ED resources increasingly important. Forecasting is a vital activity that guides decision-making in many areas of economic, industrial, and scientific planning, but has gained little traction in the health care industry. There are few studies that explore the use of forecasting methods to predict patient volumes in the ED. The goals of this study are to explore and evaluate the use of several statistical forecasting methods to predict daily ED patient volumes at three diverse hospital EDs and to compare the accuracy of these methods to the accuracy of a previously proposed forecasting method. Daily patient arrivals at three hospital EDs were collected for the period January 1, 2005, through March 31, 2007. The authors evaluated the use of seasonal autoregressive integrated moving average, time series regression, exponential smoothing, and artificial neural network models to forecast daily patient volumes at each facility. Forecasts were made for horizons ranging from 1 to 30 days in advance. The forecast accuracy achieved by the various forecasting methods was compared to the forecast accuracy achieved when using a benchmark forecasting method already available in the emergency medicine literature. All time series methods considered in this analysis provided improved in-sample model goodness of fit. However, post-sample analysis revealed that time series regression models that augment linear regression models by accounting for serial autocorrelation offered only small improvements in terms of post-sample forecast accuracy, relative to multiple linear regression models, while seasonal autoregressive integrated moving average, exponential smoothing, and artificial neural network forecasting models did not provide consistently accurate forecasts of daily ED volumes. This study confirms the widely held belief that daily demand for ED services is characterized by seasonal and weekly patterns. The authors compared several time series forecasting methods to a benchmark multiple linear regression model. The results suggest that the existing methodology proposed in the literature, multiple linear regression based on calendar variables, is a reasonable approach to forecasting daily patient volumes in the ED. However, the authors conclude that regression-based models that incorporate calendar variables, account for site-specific special-day effects, and allow for residual autocorrelation provide a more appropriate, informative, and consistently accurate approach to forecasting daily ED patient volumes.

  14. Short-Term foF2 Forecast: Present-Day State of the Art

    NASA Astrophysics Data System (ADS)

    Mikhailov, A. V.; Depuev, V. H.; Depueva, A. H.

    An analysis of the F2-layer short-term forecast problem is presented. Both objective and methodological problems currently prevent reliable F2-layer forecasts from being issued. An empirical approach based on statistical methods may be recommended for practical use. A forecast method based on a new aeronomic index (a proxy), AI, has been proposed and tested on 64 selected severe storm events. The method provides acceptable prediction accuracy for both strongly disturbed and quiet conditions. The problems with predicting F2-layer quiet-time disturbances, as well as some other unsolved problems, are discussed.

  15. A Comparison of Forecast Error Generators for Modeling Wind and Load Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ning; Diao, Ruisheng; Hafen, Ryan P.

    2013-07-25

    This paper presents four algorithms to generate random forecast error time series and compares their performance. The error time series are used to create real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast time series that statistically match historically observed forecasting data sets used in power grid operation, in order to study the net load balancing need in variable generation integration studies. The four algorithms are truncated-normal distribution models, state-space based Markov models, seasonal autoregressive moving average (ARMA) models, and a stochastic-optimization based approach. The comparison is made using historical DA load forecasts and actual load values to generate new sets of DA forecasts with similar statistical forecast error characteristics (i.e., mean, standard deviation, autocorrelation, and cross-correlation). The results show that all methods generate satisfactory results: one method may preserve one or two required statistical characteristics better than the other methods, but may not preserve the remaining characteristics as well. Because the wind and load forecast error generators are used in wind integration studies to produce wind and load forecast time series for stochastic planning processes, it is sometimes critical to use multiple methods to generate the error time series to obtain a statistically robust result. Therefore, this paper discusses and compares the capabilities of each algorithm to preserve the characteristics of the historical forecast data sets.
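
    A sketch of the first generator only, with an AR(1) recursion added so the series also carries autocorrelation; combining truncation and the AR step by clipping is this sketch's simplification, not necessarily the report's formulation:

    ```python
    import numpy as np
    from scipy.stats import truncnorm

    def tn_error_series(n, sigma, bound, rho):
        """Truncated-normal forecast errors with AR(1) autocorrelation rho,
        kept within +/- bound."""
        a, b = -bound / sigma, bound / sigma       # standardized truncation points
        e = np.empty(n)
        e[0] = truncnorm.rvs(a, b, scale=sigma)
        for t in range(1, n):
            innov = truncnorm.rvs(a, b, scale=sigma)
            e[t] = np.clip(rho * e[t - 1] + np.sqrt(1 - rho**2) * innov,
                           -bound, bound)
        return e

    da_err = tn_error_series(n=8760, sigma=0.04, bound=0.12, rho=0.8)
    # synthetic DA forecast series: actual_load * (1 + da_err)
    ```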

  16. Short Term Load Forecasting with Fuzzy Logic Systems for power system planning and reliability-A Review

    NASA Astrophysics Data System (ADS)

    Holmukhe, R. M.; Dhumale, Mrs. Sunita; Chaudhari, Mr. P. S.; Kulkarni, Mr. P. P.

    2010-10-01

    Load forecasting is essential to the operation of electricity companies: it enhances the energy-efficient and reliable operation of the power system, and forecasts of load demand form an important component in planning generation schedules. The purpose of this paper is to identify issues and better methods for load forecasting, focusing on fuzzy-logic-based short-term load forecasting. It serves as an overview of the state of the art in intelligent techniques employed for load forecasting in power system planning and reliability. A literature review has been conducted, and the fuzzy logic method is summarized to highlight its advantages and disadvantages. The proposed technique for implementing fuzzy-logic-based forecasting identifies a specific day, uses the maximum and minimum temperature for that day, and relates the maximum temperature to the peak load for that day. The results show that load forecasting in the presence of considerable temperature changes is better handled by the fuzzy logic method than by other short-term forecasting techniques.

  17. Real-time eruption forecasting using the material Failure Forecast Method with a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Boué, A.; Lesage, P.; Cortés, G.; Valette, B.; Reyes-Dávila, G.

    2015-04-01

    Many attempts for deterministic forecasting of eruptions and landslides have been performed using the material Failure Forecast Method (FFM). This method consists in adjusting an empirical power law on precursory patterns of seismicity or deformation. Until now, most of the studies have presented hindsight forecasts based on complete time series of precursors and do not evaluate the ability of the method for carrying out real-time forecasting with partial precursory sequences. In this study, we present a rigorous approach of the FFM designed for real-time applications on volcano-seismic precursors. We use a Bayesian approach based on the FFM theory and an automatic classification of seismic events. The probability distributions of the data deduced from the performance of this classification are used as input. As output, it provides the probability of the forecast time at each observation time before the eruption. The spread of the a posteriori probability density function of the prediction time and its stability with respect to the observation time are used as criteria to evaluate the reliability of the forecast. We test the method on precursory accelerations of long-period seismicity prior to vulcanian explosions at Volcán de Colima (Mexico). For explosions preceded by a single phase of seismic acceleration, we obtain accurate and reliable forecasts using approximately 80% of the whole precursory sequence. It is, however, more difficult to apply the method to multiple acceleration patterns.
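
    For the common exponent alpha = 2, the FFM reduces to a linear fit of the inverse precursor rate against time, with the failure time at the zero crossing; a sketch on synthetic accelerating seismicity (the paper's Bayesian treatment replaces this point estimate with a posterior over the forecast time):

    ```python
    import numpy as np

    def ffm_failure_time(t, rate):
        """Classic FFM with alpha = 2: the inverse precursor rate decays
        linearly and reaches zero at the failure (eruption) time."""
        slope, intercept = np.polyfit(t, 1.0 / rate, 1)
        return -intercept / slope                 # x-intercept of the linear fit

    t = np.linspace(0, 10, 60)                    # days of observation
    t_f_true = 12.0
    rate = 1.0 / (0.3 * (t_f_true - t))           # synthetic accelerating seismicity
    rate *= np.random.lognormal(0, 0.1, t.size)   # observation noise
    print(ffm_failure_time(t, rate))              # forecast close to 12
    ```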

  18. A Bayesian modelling method for post-processing daily sub-seasonal to seasonal rainfall forecasts from global climate models and evaluation for 12 Australian catchments

    NASA Astrophysics Data System (ADS)

    Schepen, Andrew; Zhao, Tongtiegang; Wang, Quan J.; Robertson, David E.

    2018-03-01

    Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
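
    The Schaake Shuffle step can be sketched in a few lines: daily samples calibrated independently at each lead time are reordered so their ranks follow observed historical trajectories, restoring temporal coherence. The gamma draws are placeholders for calibrated rainfall samples and historical observations:

    ```python
    import numpy as np

    def schaake_shuffle(ensemble, hist):
        """ensemble: (n_members, n_days) calibrated daily samples, independent
        across lead times; hist: (n_members, n_days) observed daily trajectories.
        Each day's samples are reordered to inherit the rank structure of hist."""
        out = np.empty_like(ensemble)
        for d in range(ensemble.shape[1]):
            ranks = np.argsort(np.argsort(hist[:, d]))   # rank of each hist member
            out[:, d] = np.sort(ensemble[:, d])[ranks]
        return out

    rng = np.random.default_rng(8)
    ens = rng.gamma(2.0, 3.0, size=(100, 30))      # post-processed daily samples
    hist = rng.gamma(2.0, 3.0, size=(100, 30))     # 100 historical trajectories
    shuffled = schaake_shuffle(ens, hist)
    ```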

  19. System load forecasts for an electric utility [Hourly loads using the Box-Jenkins method]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uri, N.D.

    This paper discusses forecasting hourly system load for an electric utility using Box-Jenkins time-series analysis. The results indicate that a model based on the method of Box and Jenkins, given its simplicity, gives excellent results over the forecast horizon.
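
    As a rough illustration of the Box-Jenkins workflow on hourly load, here is a minimal sketch using the statsmodels ARIMA implementation; the synthetic load series and the (2, 1, 1) order are illustrative choices, not those of the paper.

```python
# Box-Jenkins style hourly load forecast sketch with statsmodels.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# Synthetic hourly load: daily cycle plus noise (10 days of data).
load = 1000 + 50 * np.sin(np.arange(240) * 2 * np.pi / 24) + rng.normal(0, 5, 240)

model = ARIMA(load, order=(2, 1, 1)).fit()  # order chosen via ACF/PACF in practice
print(model.forecast(steps=24))             # next 24 hourly loads
```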

  20. Study on load forecasting to data centers of high power density based on power usage effectiveness

    NASA Astrophysics Data System (ADS)

    Zhou, C. C.; Zhang, F.; Yuan, Z.; Zhou, L. M.; Wang, F. M.; Li, W.; Yang, J. H.

    2016-08-01

    Data centers usually consume considerable energy. Load forecasting for data centers helps in formulating regional load density indexes and greatly improves the accuracy of regional spatial load forecasting. Taking into account the building structure and other influential factors, such as equipment, geographic and climatic conditions, a method is proposed to forecast the load of data centers based on power usage effectiveness. In this method, the cooling capacity of a data center and its power usage effectiveness index are used to forecast its power load. The cooling capacity is obtained by calculating the heat load of the data center, and the index is estimated using the group decision-making method of mixed language information. An example is given to demonstrate the applicability and accuracy of this method.
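
    A back-of-the-envelope sketch of the core arithmetic may help: the facility load is the IT (heat) load scaled by the PUE, with the PUE here obtained by a simple weighted expert average standing in for the paper's group decision-making method. All figures are hypothetical.

```python
# PUE-based load estimate: facility load = PUE x IT load.
it_heat_load_kw = 800.0          # heat load computed for the server rooms (~IT load)
expert_pue = [1.5, 1.6, 1.7]     # PUE estimates from several experts (illustrative)
weights = [0.5, 0.3, 0.2]        # expert weights (stand-in for the paper's scheme)

pue = sum(w * p for w, p in zip(weights, expert_pue))
total_load_kw = pue * it_heat_load_kw
print(pue, total_load_kw)        # 1.57, 1256.0 kW
```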

  1. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and particle swarm optimization techniques.

    PubMed

    Chen, Shyi-Ming; Manalu, Gandhi Maruli Tua; Pan, Jeng-Shyang; Liu, Hsiang-Chuan

    2013-06-01

    In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and particle swarm optimization (PSO) techniques. First, we fuzzify the historical training data of the main factor and the secondary factor, respectively, to form two-factors second-order fuzzy logical relationships. Then, we group these relationships into two-factors second-order fuzzy-trend logical relationship groups. Finally, we obtain the optimal weighting vector for each fuzzy-trend logical relationship group by using PSO techniques to perform the forecasting. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index and the NTD/USD exchange rates. The experimental results show that the proposed method achieves better forecasting performance than existing methods.
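
    To make the first steps concrete, here is a minimal sketch of fuzzifying two factors into interval fuzzy sets and forming second-order trend keys; the interval count and data are illustrative assumptions, and the PSO weight optimization of the paper is not shown.

```python
# Fuzzify two factors and build second-order fuzzy-trend keys.
import numpy as np

def fuzzify(series, n_sets=7):
    """Assign each value to one of n_sets equal-width interval fuzzy sets."""
    edges = np.linspace(series.min(), series.max(), n_sets + 1)
    return np.clip(np.digitize(series, edges) - 1, 0, n_sets - 1)

def trends(fuzzy):
    return np.sign(np.diff(fuzzy))   # -1 down, 0 equal, +1 up

main = np.array([8100.0, 8120, 8090, 8150, 8200, 8180])    # e.g. stock index closes
secondary = np.array([31.2, 31.4, 31.3, 31.5, 31.6, 31.5])  # e.g. NTD/USD rates
key = list(zip(trends(fuzzify(main)), trends(fuzzify(secondary))))
pairs = [(key[i], key[i + 1]) for i in range(len(key) - 1)]  # second-order keys
print(pairs)   # consecutive two-factor trend pairs used to group relationships
```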

  2. A Solar Time-Based Analog Ensemble Method for Regional Solar Power Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Brian S; Zhang, Xinmin; Li, Yuan

    This paper presents a new analog ensemble method for day-ahead regional photovoltaic (PV) power forecasting with hourly resolution. Utilizing open weather forecasts and power measurement data, the method searches a set of historical data for periods with similar meteorological conditions (temperature and irradiance) and astronomical dates (solar time and Earth declination angle). Further, clustering and blending strategies are applied to improve its accuracy in regional PV forecasting. The robustness of the proposed method is demonstrated with three different numerical weather prediction models, the North American Mesoscale Forecast System, the Global Forecast System, and the Short-Range Ensemble Forecast, for both region-level and single-site-level PV forecasts. Using real measured data, the new forecasting approach is applied to the load zone in Southeastern Massachusetts as a case study. The normalized root mean square error (NRMSE) is reduced by 13.80%-61.21% when compared with three tested baselines.
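
    The analog-ensemble idea can be sketched compactly: for a new forecast, locate the k most similar historical forecasts and take their measured power as the ensemble. The feature set and scaling below are assumptions for illustration, not the paper's exact configuration.

```python
# Analog ensemble sketch: nearest historical forecasts supply the power ensemble.
import numpy as np

def analog_ensemble(new_fc, hist_fc, hist_power, k=10):
    """new_fc: (n_feat,), hist_fc: (n_hist, n_feat), hist_power: (n_hist,)."""
    scale = hist_fc.std(axis=0) + 1e-9
    dist = np.sqrt((((hist_fc - new_fc) / scale) ** 2).sum(axis=1))
    analogs = np.argsort(dist)[:k]
    return hist_power[analogs]          # ensemble of k analog power values

rng = np.random.default_rng(1)
hist_fc = rng.uniform(0, 1, size=(500, 3))   # [temperature, irradiance, solar time]
hist_power = hist_fc[:, 1] * 100 + rng.normal(0, 3, 500)
print(analog_ensemble(np.array([0.5, 0.8, 0.4]), hist_fc, hist_power).mean())
```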

  3. A Hybrid Neural Network Model for Sales Forecasting Based on ARIMA and Search Popularity of Article Titles.

    PubMed

    Omar, Hani; Hoang, Van Hai; Liu, Duen-Ren

    2016-01-01

    Enhancing sales and operations planning through forecasting analysis and business intelligence is demanded in many industries and enterprises. Publishing industries usually pick attractive titles and headlines for their stories to increase sales, since popular article titles and headlines can attract readers to buy magazines. In this paper, information retrieval techniques are adopted to extract words from article titles. The popularity measures of article titles are then analyzed by using the search indexes obtained from the Google search engine. Backpropagation Neural Networks (BPNNs) have successfully been used to develop prediction models for sales forecasting. In this study, we propose a novel hybrid neural network model for sales forecasting based on the prediction result of time-series forecasting and the popularity of article titles. The proposed model uses the historical sales data, the popularity of article titles, and the prediction result of a time-series Autoregressive Integrated Moving Average (ARIMA) forecasting method to learn a BPNN-based forecasting model. Our proposed forecasting model is experimentally evaluated by comparing it with conventional sales prediction techniques. The experimental results show that our proposed forecasting method outperforms conventional techniques that do not consider the popularity of title words.
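
    A minimal sketch of the hybrid architecture is given below: an ARIMA point forecast and a title-popularity feature are fed jointly into a backpropagation network (scikit-learn's MLPRegressor as a stand-in for the paper's BPNN). The data and the popularity series are synthetic placeholders for the Google search indexes.

```python
# Hybrid ARIMA + neural network sales-forecasting sketch.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
sales = 100 + np.cumsum(rng.normal(0, 2, 120))   # monthly issue sales (synthetic)
popularity = rng.uniform(0, 1, 120)              # search-index proxy per issue

# One-step ARIMA predictions as an input feature (in-sample for brevity).
arima_pred = ARIMA(sales, order=(1, 1, 1)).fit().predict(start=1, end=119)

X = np.column_stack([arima_pred, popularity[1:]])
y = sales[1:]
nn = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, y)
print(nn.predict(X[-1:]))                        # hybrid forecast for the latest issue
```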

  4. A Hybrid Neural Network Model for Sales Forecasting Based on ARIMA and Search Popularity of Article Titles

    PubMed Central

    Omar, Hani; Hoang, Van Hai; Liu, Duen-Ren

    2016-01-01

    Enhancing sales and operations planning through forecasting analysis and business intelligence is demanded in many industries and enterprises. Publishing industries usually pick attractive titles and headlines for their stories to increase sales, since popular article titles and headlines can attract readers to buy magazines. In this paper, information retrieval techniques are adopted to extract words from article titles. The popularity measures of article titles are then analyzed by using the search indexes obtained from the Google search engine. Backpropagation Neural Networks (BPNNs) have successfully been used to develop prediction models for sales forecasting. In this study, we propose a novel hybrid neural network model for sales forecasting based on the prediction result of time-series forecasting and the popularity of article titles. The proposed model uses the historical sales data, the popularity of article titles, and the prediction result of a time-series Autoregressive Integrated Moving Average (ARIMA) forecasting method to learn a BPNN-based forecasting model. Our proposed forecasting model is experimentally evaluated by comparing it with conventional sales prediction techniques. The experimental results show that our proposed forecasting method outperforms conventional techniques that do not consider the popularity of title words. PMID:27313605

  5. Forecasting Japanese encephalitis incidence from historical morbidity patterns: Statistical analysis with 27 years of observation in Assam, India.

    PubMed

    Handique, Bijoy K; Khan, Siraj A; Mahanta, J; Sudhakar, S

    2014-09-01

    Japanese encephalitis (JE) is one of the dreaded mosquito-borne viral diseases, mostly prevalent in South Asian countries including India. Early warning of the disease in terms of intensity is crucial for taking adequate and appropriate intervention measures. The present study was carried out in Dibrugarh district in the state of Assam, located in the northeastern region of India, to assess the accuracy of selected forecasting methods based on historical morbidity patterns of JE incidence during the past 22 years (1985-2006). Four selected forecasting methods, viz. seasonal average (SA), seasonal adjustment with last three observations (SAT), a modified method adjusting long-term and cyclic trend (MSAT), and autoregressive integrated moving average (ARIMA), were employed and the accuracy of each was assessed. The forecasting methods were validated over consecutive years from 2007 to 2012. The method utilising seasonal adjustment with long-term and cyclic trend emerged as the best of the four selected methods and outperformed even the statistically more advanced ARIMA method. The peak of disease incidence could be predicted effectively with all the methods, but there are significant variations in the magnitude of forecast errors among them. As expected, variation in forecasts at the primary health centre (PHC) level is wide compared with district-level forecasts. The study showed that the adopted techniques could reasonably forecast the intensity of JE cases at PHC level without considering external variables. The results indicate that understanding the long-term and cyclic trend of disease intensity will improve forecast accuracy, but the forecast models need to be made more robust to explain sudden variations in disease intensity through detailed analysis of parasite and host population dynamics.
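
    As an illustration of the simplest compared method, the seasonal average, a minimal sketch follows: the forecast for each month is the mean of that month's incidence over past years. The monthly case profile is synthetic.

```python
# Seasonal average (SA) forecasting sketch for monthly disease incidence.
import numpy as np

rng = np.random.default_rng(0)
years, months = 22, 12
# Synthetic JE-like seasonality: monsoon-season peak in monthly case counts.
profile = [1, 1, 2, 4, 8, 20, 35, 30, 12, 4, 2, 1]
cases = rng.poisson(lam=np.tile(profile, (years, 1)))

sa_forecast = cases.mean(axis=0)        # per-month seasonal average
print(np.round(sa_forecast, 1))         # forecast profile for the next year
```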

  6. Influence of Forecast Accuracy of Photovoltaic Power Output on Capacity Optimization of Microgrid Composition under 30 min Power Balancing Control

    NASA Astrophysics Data System (ADS)

    Sone, Akihito; Kato, Takeyoshi; Shimakage, Toyonari; Suzuoki, Yasuo

    A microgrid (MG) is one measure for enabling high penetration of renewable energy (RE)-based distributed generators (DGs). If a number of MGs are controlled to maintain a predetermined electricity demand, treating RE-based DGs as negative demand, they can contribute to supply-demand balancing of the whole electric power system. To construct an MG economically, optimizing the capacity of controllable DGs against RE-based DGs is essential. Using a numerical simulation model developed from a demonstrative study of an MG with a PAFC and a NaS battery as controllable DGs and a photovoltaic power generation system (PVS) as an RE-based DG, this study discusses the influence of the forecast accuracy of PVS output on capacity optimization. Three forecast cases with different accuracy are compared. The main results are as follows. Even with the ideal forecast method, which has no forecast error over each 30-min interval, the required NaS battery capacity reaches about 40% of the PVS capacity in order to mitigate the instantaneous forecast error within 30 min. The required capacity to compensate for forecast error doubles with the actual forecast method. The influence of forecast error can be reduced by adjusting the scheduled power output of controllable DGs according to the weather forecast. Moreover, the required capacity can be reduced significantly if errors in the MG's balancing control are acceptable for a few percent of periods, because periods of large forecast error are infrequent.

  7. Neural network based short-term load forecasting using weather compensation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, T.W.S.; Leung, C.T.

    This paper presents a novel technique for electric load forecasting based on neural weather compensation. The proposed method is a nonlinear generalization of the Box and Jenkins approach for nonstationary time-series prediction. A weather compensation neural network is implemented for one-day-ahead electric load forecasting. The weather compensation neural network can accurately predict the change of actual electric load consumption from the previous day. The results, based on historical load demand for Hong Kong Island, indicate that this methodology is capable of providing a more accurate load forecast, with a 0.9% reduction in forecast error.
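
    The weather-compensation idea can be sketched as follows: a network learns the day-to-day change in load as a function of the day-to-day change in weather, and the forecast adds the predicted change to the previous day's load. This is a hypothetical reconstruction on synthetic data, not the authors' network.

```python
# Weather-compensation load forecasting sketch.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
temp = 25 + 5 * np.sin(np.arange(400) / 30.0) + rng.normal(0, 1, 400)
load = 500 + 20 * temp + rng.normal(0, 10, 400)

d_temp = np.diff(temp).reshape(-1, 1)            # day-to-day weather change
d_load = np.diff(load)                           # day-to-day load change
nn = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
nn.fit(d_temp[:-1], d_load[:-1])                 # learn the compensation
print(load[-2] + nn.predict(d_temp[-1:]))        # one-day-ahead compensated forecast
```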

  8. Research on time series data prediction based on clustering algorithm - A case study of Yuebao

    NASA Astrophysics Data System (ADS)

    Lu, Xu; Zhao, Tianzhong

    2017-08-01

    Forecasting is a prerequisite for sound decision-making: based on past information about a phenomenon, combined with the factors influencing it, scientific methods are used to project its future development, making forecasting an important way for people to understand the world. This is particularly important for financial data, because proper financial forecasts provide a great deal of help to financial institutions in strategy implementation, strategic alignment and risk control. However, current forecasts of financial data generally work on the aggregate data and lack consideration of customer behavior and other factors, which are important drivers of change in financial data. Given this situation, this paper analyzed the data of Yuebao; according to user attributes and operating characteristics, 567 users of Yuebao were classified, and the Yuebao data were further predicted for each class of users. The results showed that the forecasting model in this paper can meet the demands of forecasting.
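
    A minimal sketch of the two-stage idea follows: cluster users by their attributes, then forecast each cluster's aggregate series separately. The attribute matrix, cluster count, and trivial trend forecaster are illustrative assumptions.

```python
# Cluster-then-forecast sketch for per-class financial time series.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
user_attrs = rng.normal(size=(567, 4))            # behavioural attributes per user
balances = rng.gamma(2.0, 100.0, size=(567, 90))  # daily balances per user

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(user_attrs)
for c in range(4):
    series = balances[labels == c].sum(axis=0)    # aggregate series per class
    drift = np.diff(series).mean()                # simple trend forecaster
    print(f"cluster {c}: next-day forecast = {series[-1] + drift:.1f}")
```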

  9. A framework for improving a seasonal hydrological forecasting system using sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Pappenberger, Florian; Smith, Paul; Cloke, Hannah

    2017-04-01

    Seasonal streamflow forecasts are of great value for the socio-economic sector, for applications such as navigation, flood and drought mitigation, and reservoir management for hydropower generation and water allocation to agriculture and drinking water. However, at present, the performance of dynamical seasonal hydrological forecasting systems (systems based on running seasonal meteorological forecasts through a hydrological model to produce seasonal hydrological forecasts) is still limited in space and time. In this context, ESP (Ensemble Streamflow Prediction) remains an attractive method for seasonal streamflow forecasting, as it relies on forcing a hydrological model (starting from the latest observed or simulated initial hydrological conditions) with historical meteorological observations. This makes it cheaper to run than a standard dynamical seasonal hydrological forecasting system, for which the seasonal meteorological forecasts must first be produced, while still producing skilful forecasts. There is thus a need to focus resources and time on improvements in dynamical seasonal hydrological forecasting systems that will eventually lead to significant improvements in the skill of the generated streamflow forecasts. Sensitivity analyses are a powerful tool that can be used to disentangle the relative contributions of the two main sources of errors in seasonal streamflow forecasts, namely the initial hydrological conditions (IHC; e.g., soil moisture, snow cover, initial streamflow, among others) and the meteorological forcing (MF; i.e., seasonal meteorological forecasts of precipitation and temperature, input to the hydrological model). Sensitivity analyses are, however, most useful if they inform and change current operational practice. To this end, we propose a method to improve the design of a seasonal hydrological forecasting system. This method is based on sensitivity analyses, informing forecasters as to which element of the forecasting chain (i.e., IHC or MF) could potentially lead to the highest increase in seasonal hydrological forecasting performance after each forecast update.
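
    The ESP principle lends itself to a compact sketch: run one hydrological model from today's initial conditions under each historical year's meteorological forcing to obtain an ensemble of streamflow traces. The one-parameter linear reservoir below is a toy stand-in for an operational model.

```python
# ESP sketch: one initial condition, many historical forcing traces.
import numpy as np

def toy_model(storage, precip, k=0.1):
    """Linear-reservoir step: one streamflow trace per forcing member."""
    flows = []
    for p in precip:                  # one historical forcing trace per member
        s, q = storage, []
        for x in p:
            s += x
            out = k * s               # outflow proportional to storage
            s -= out
            q.append(out)
        flows.append(q)
    return np.array(flows)

rng = np.random.default_rng(0)
hist_precip = rng.gamma(0.8, 3.0, size=(30, 90))   # 30 historical years, 90 days
ensemble = toy_model(storage=50.0, precip=hist_precip)
print(ensemble[:, -1].mean(), ensemble[:, -1].std())  # day-90 ensemble statistics
```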

  10. Advanced, Cost-Based Indices for Forecasting the Generation of Photovoltaic Power

    NASA Astrophysics Data System (ADS)

    Bracale, Antonio; Carpinelli, Guido; Di Fazio, Annarita; Khormali, Shahab

    2014-01-01

    Distribution systems are undergoing significant changes as they evolve toward the grids of the future, known as smart grids (SGs). The aim of SGs is to facilitate large-scale penetration of distributed generation using renewable energy sources (RESs), encourage the efficient use of energy, reduce system losses, and improve power quality. Photovoltaic (PV) systems have become one of the most promising RESs due to the expected cost reduction and the increased efficiency of PV panels and interfacing converters. The ability to forecast power-production information accurately and reliably is of primary importance for the appropriate management of an SG and for making decisions relative to the energy market. Several forecasting methods have been proposed, and many indices have been used to quantify the accuracy of forecasts of PV power production. Unfortunately, the indices in use have deficiencies and usually do not directly account for the economic consequences of forecasting errors in the framework of liberalized electricity markets. In this paper, advanced, more accurate indices are proposed that account directly for the economic consequences of forecasting errors. The proposed indices were also compared with the most frequently used indices to demonstrate their improved capability. The comparisons were based on results obtained using a forecasting method based on an artificial neural network, chosen because it is deemed one of the most promising available for forecasting PV power. Numerical applications on an actual PV plant are also presented to provide evidence of the forecasting performance of all the indices considered.
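
    To illustrate the spirit of a cost-based index (without reproducing the paper's exact definitions), the sketch below weights under- and over-production errors by different imbalance prices so the score reads in monetary terms rather than in kW. Prices and errors are hypothetical.

```python
# Cost-weighted forecast-error index sketch vs. a classic RMSE.
import numpy as np

rng = np.random.default_rng(0)
forecast = rng.uniform(0, 100, 24)               # hourly PV forecast (kW)
actual = forecast + rng.normal(0, 8, 24)         # measured production (kW)
price_short = 0.12                               # cost per kWh under-delivered
price_long = 0.05                                # cost per kWh over-delivered

err = actual - forecast
cost = np.where(err < 0, -err * price_short, err * price_long).sum()
rmse = np.sqrt(np.mean(err ** 2))
print(f"classic RMSE = {rmse:.2f} kW, economic index = {cost:.2f} currency units")
```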

  11. A Novel Multilevel-SVD Method to Improve Multistep Ahead Forecasting in Traffic Accidents Domain.

    PubMed

    Barba, Lida; Rodríguez, Nibaldo

    2017-01-01

    A novel method is proposed for decomposing a nonstationary time series into components of low and high frequency. The method is based on Multilevel Singular Value Decomposition (MSVD) of a Hankel matrix. The decomposition is used to improve the forecasting accuracy of Multiple Input Multiple Output (MIMO) linear and nonlinear models. Three time series from the traffic accidents domain are used, representing the number of persons injured in traffic accidents in Santiago, Chile. The data were continuously collected by the Chilean Police and sampled weekly from 2000:1 to 2014:12. The performance of MSVD is compared with the low- and high-frequency decomposition of a commonly accepted method based on the Stationary Wavelet Transform (SWT). SWT in conjunction with an autoregressive model (SWT + MIMO-AR) and SWT in conjunction with an autoregressive neural network (SWT + MIMO-ANN) were evaluated. The empirical results show that the best accuracy was achieved by the forecasting model based on the proposed decomposition method, MSVD, in comparison with the forecasting models based on SWT.
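
    A single level of the Hankel-SVD decomposition can be sketched briefly: embed the series in a Hankel matrix, keep the leading singular component as the low-frequency part, and recover a series by anti-diagonal averaging. The full MSVD iterates this over several levels; the window length here is an arbitrary choice.

```python
# One-level Hankel-SVD split into low- and high-frequency components.
import numpy as np
from scipy.linalg import hankel

def hankel_svd_split(x, window=20):
    H = hankel(x[:window], x[window - 1:])       # H[i, j] = x[i + j]
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    H1 = s[0] * np.outer(U[:, 0], Vt[0])         # rank-1 (low-frequency) matrix
    # Average over anti-diagonals to map the rank-1 matrix back to a series.
    low = np.array([np.mean(H1[::-1].diagonal(k))
                    for k in range(-window + 1, H1.shape[1])])
    return low, x - low

rng = np.random.default_rng(0)
x = np.sin(np.arange(300) / 15.0) + rng.normal(0, 0.2, 300)
low, high = hankel_svd_split(x)                  # smooth trend + residual detail
```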

  12. A Novel Multilevel-SVD Method to Improve Multistep Ahead Forecasting in Traffic Accidents Domain

    PubMed Central

    Rodríguez, Nibaldo

    2017-01-01

    A novel method is proposed for decomposing a nonstationary time series into components of low and high frequency. The method is based on Multilevel Singular Value Decomposition (MSVD) of a Hankel matrix. The decomposition is used to improve the forecasting accuracy of Multiple Input Multiple Output (MIMO) linear and nonlinear models. Three time series from the traffic accidents domain are used, representing the number of persons injured in traffic accidents in Santiago, Chile. The data were continuously collected by the Chilean Police and sampled weekly from 2000:1 to 2014:12. The performance of MSVD is compared with the low- and high-frequency decomposition of a commonly accepted method based on the Stationary Wavelet Transform (SWT). SWT in conjunction with an autoregressive model (SWT + MIMO-AR) and SWT in conjunction with an autoregressive neural network (SWT + MIMO-ANN) were evaluated. The empirical results show that the best accuracy was achieved by the forecasting model based on the proposed decomposition method, MSVD, in comparison with the forecasting models based on SWT. PMID:28261267

  13. Comparison between stochastic and machine learning methods for hydrological multi-step ahead forecasting: All forecasts are wrong!

    NASA Astrophysics Data System (ADS)

    Papacharalampous, Georgia; Tyralis, Hristos; Koutsoyiannis, Demetris

    2017-04-01

    Machine learning (ML) is considered to be a promising approach to forecasting hydrological processes. We conduct a comparison between several stochastic and ML point estimation methods by performing large-scale computational experiments based on simulations. The purpose is to provide generalized results, while the respective comparisons in the literature are usually based on case studies. The stochastic methods used include simple methods and models from the frequently used families of Autoregressive Moving Average (ARMA), Autoregressive Fractionally Integrated Moving Average (ARFIMA) and Exponential Smoothing models. The ML methods used are Random Forests (RF), Support Vector Machines (SVM) and Neural Networks (NN). The comparison refers to the multi-step ahead forecasting properties of the methods. A total of 20 methods are used, of which 9 are ML methods. Twelve simulation experiments are performed, each using 2,000 simulated time series of 310 observations. The time series are simulated using stochastic processes from the families of ARMA and ARFIMA models. Each time series is split into a fitting set (first 300 observations) and a testing set (last 10 observations). The comparative assessment of the methods is based on 18 metrics that quantify the methods' performance according to several criteria related to the accurate forecasting of the testing set, the capturing of its variation, and the correlation between the testing and forecasted values. The most important outcome of this study is that there is no uniformly better or worse method. However, there are methods that are regularly better or worse than others with respect to specific metrics. It appears that, although a general ranking of the methods is not possible, their classification based on their similar or contrasting performance in the various metrics is possible to some extent. Another important conclusion is that more sophisticated methods do not necessarily provide better forecasts than simpler methods. The ML methods do not differ dramatically from the stochastic methods, and it is notable that the NN, RF and SVM algorithms used in this study offer potentially very good performance in terms of accuracy. It should be noted that, although this study focuses on hydrological processes, the results are of general scientific interest. Another important point is the use of several methods and metrics: using fewer methods and fewer metrics would have led to a very different overall picture, particularly if those fewer metrics corresponded to fewer criteria. For this reason, we consider the proposed methodology appropriate for the evaluation of forecasting methods.
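
    The experimental protocol can be miniaturized as follows: simulate many AR-type series, fit on the first 300 points, forecast the final 10 multi-step ahead, and average a metric across series. The two toy forecasters (naive and AR(1)) stand in for the study's 20 methods, and RMSE for its 18 metrics.

```python
# Miniature version of the simulation-based forecast comparison protocol.
import numpy as np

rng = np.random.default_rng(0)

def simulate_arma(n=310, phi=0.7):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

rmses = {"naive": [], "ar1": []}
for _ in range(100):                                 # 2,000 series in the actual study
    x = simulate_arma()
    fit, test = x[:300], x[300:]
    phi = np.corrcoef(fit[:-1], fit[1:])[0, 1]       # AR(1) moment estimate
    ar1 = fit[-1] * phi ** np.arange(1, 11)          # multi-step AR(1) forecast
    rmses["naive"].append(np.sqrt(np.mean((test - fit[-1]) ** 2)))
    rmses["ar1"].append(np.sqrt(np.mean((test - ar1) ** 2)))
print({k: np.mean(v) for k, v in rmses.items()})
```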

  14. Optimizing Tsunami Forecast Model Accuracy

    NASA Astrophysics Data System (ADS)

    Whitmore, P.; Nyland, D. L.; Huang, P. Y.

    2015-12-01

    Recent tsunamis provide a means to determine the accuracy that can be expected of real-time tsunami forecast models. Forecast accuracy using two different tsunami forecast models is compared for seven events since 2006, based on both real-time application and optimized, after-the-fact "forecasts". Lessons learned by comparing the forecast accuracy determined during an event to modified applications of the models after the fact provide improved methods for real-time forecasting of future events. Variables such as source definition, data assimilation, and model scaling factors are examined to optimize forecast accuracy. Forecast accuracy is also compared for direct forward modeling based on earthquake source parameters versus accuracy obtained by assimilating sea level data into the forecast model. Results show that including assimilated sea level data in the models increases accuracy by approximately 15% for the events examined.

  15. A travel time forecasting model based on change-point detection method

    NASA Astrophysics Data System (ADS)

    LI, Shupeng; GUANG, Xiaoping; QIAN, Yongsheng; ZENG, Junwei

    2017-06-01

    Travel time parameters obtained from road traffic sensor data play an important role in traffic management practice. In this paper, a travel time forecasting model based on change-point detection is proposed for urban road traffic sensor data. A first-order differencing operation is used to preprocess the actual loop data; a change-point detection algorithm is designed to classify the long sequence of travel time data into several patterns; a travel time forecasting model is then established based on the autoregressive integrated moving average (ARIMA) model. In computer simulations, different control parameters are chosen for the adaptive change-point search, dividing the travel time series into several sections of similar state. A linear weight function is then used to fit the travel time sequence and forecast travel time. The results show that the model has high accuracy in travel time forecasting.
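
    As an illustration of the detection step, a minimal rolling-mean shift detector is sketched below; it is a simplification of the paper's algorithm, and the subsequent per-segment ARIMA fitting is omitted.

```python
# Rolling-mean change-point detection sketch for a travel-time series.
import numpy as np

def detect_change_points(x, window=10, thresh=5.0):
    """Flag indices where the mean shifts by more than `thresh` between windows."""
    cps = []
    for i in range(window, len(x) - window):
        if abs(x[i:i + window].mean() - x[i - window:i].mean()) > thresh:
            cps.append(i)
    return cps

rng = np.random.default_rng(0)
# Synthetic regime change: free flow (~60 s) then congestion (~90 s) at t = 100.
travel = np.concatenate([rng.normal(60, 2, 100), rng.normal(90, 2, 100)])
print(detect_change_points(travel))   # indices clustered around the change at t=100
```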

  16. Self-Organizing Maps-based ocean currents forecasting system.

    PubMed

    Vilibić, Ivica; Šepić, Jadranka; Mihanović, Hrvoje; Kalinić, Hrvoje; Cosoli, Simone; Janeković, Ivica; Žagar, Nedjeljka; Jesenko, Blaž; Tudor, Martina; Dadić, Vlado; Ivanković, Damir

    2016-03-16

    An ocean surface currents forecasting system, based on a Self-Organizing Maps (SOM) neural network algorithm, high-frequency (HF) ocean radar measurements and numerical weather prediction (NWP) products, has been developed for a coastal area of the northern Adriatic and compared with operational ROMS-derived surface currents. The two systems differ significantly in architecture and algorithms, being based on either unsupervised learning techniques or ocean physics. To compare performance of the two methods, their forecasting skills were tested on independent datasets. The SOM-based forecasting system has a slightly better forecasting skill, especially during strong wind conditions, with potential for further improvement when data sets of higher quality and longer duration are used for training.
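
    A from-scratch sketch of the SOM forecasting idea follows: train a small map on joint wind-current vectors, then match only the wind (NWP) part of a new sample and read the current components off the best-matching unit. Grid size, learning rates, and the synthetic data are illustrative.

```python
# Tiny Self-Organizing Map trained on joint wind-current vectors.
import numpy as np

rng = np.random.default_rng(0)
wind = rng.normal(size=(1000, 2))                            # NWP wind (u, v)
cur = 0.05 * wind + rng.normal(scale=0.02, size=(1000, 2))   # surface currents
data = np.hstack([wind, cur])

grid = rng.normal(size=(6, 6, 4))                # 6x6 map of codebook vectors
for t in range(5000):
    x = data[rng.integers(len(data))]
    bmu = np.unravel_index(((grid - x) ** 2).sum(axis=2).argmin(), (6, 6))
    lr, sigma = 0.5 * np.exp(-t / 2500), 2.0 * np.exp(-t / 2500)
    ii, jj = np.meshgrid(np.arange(6), np.arange(6), indexing="ij")
    h = np.exp(-((ii - bmu[0]) ** 2 + (jj - bmu[1]) ** 2) / (2 * sigma ** 2))
    grid += lr * h[..., None] * (x - grid)       # pull neighbourhood toward x

wind_now = np.array([0.3, -0.7])                 # new NWP wind input
d = ((grid[..., :2] - wind_now) ** 2).sum(axis=2)
bi, bj = np.unravel_index(d.argmin(), (6, 6))
print(grid[bi, bj, 2:])                          # SOM-based current forecast
```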

  17. Self-Organizing Maps-based ocean currents forecasting system

    PubMed Central

    Vilibić, Ivica; Šepić, Jadranka; Mihanović, Hrvoje; Kalinić, Hrvoje; Cosoli, Simone; Janeković, Ivica; Žagar, Nedjeljka; Jesenko, Blaž; Tudor, Martina; Dadić, Vlado; Ivanković, Damir

    2016-01-01

    An ocean surface currents forecasting system, based on a Self-Organizing Maps (SOM) neural network algorithm, high-frequency (HF) ocean radar measurements and numerical weather prediction (NWP) products, has been developed for a coastal area of the northern Adriatic and compared with operational ROMS-derived surface currents. The two systems differ significantly in architecture and algorithms, being based on either unsupervised learning techniques or ocean physics. To compare performance of the two methods, their forecasting skills were tested on independent datasets. The SOM-based forecasting system has a slightly better forecasting skill, especially during strong wind conditions, with potential for further improvement when data sets of higher quality and longer duration are used for training. PMID:26979129

  18. When mechanism matters: Bayesian forecasting using models of ecological diffusion

    USGS Publications Warehouse

    Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.

    2017-01-01

    Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.
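
    For intuition, plain Fickian diffusion (a simplified stand-in for the ecological diffusion model of the paper) can be stepped forward explicitly to forecast the spread of prevalence from an initial focus:

```python
# Explicit finite-difference step of 1-D diffusion as a spread forecast.
import numpy as np

def diffuse(u, mu=0.2, steps=100):
    for _ in range(steps):
        u = u + mu * (np.roll(u, 1) + np.roll(u, -1) - 2 * u)  # discrete Laplacian
    return u

prev = np.zeros(100)
prev[50] = 1.0                           # initial prevalence focus
print(diffuse(prev).max())               # prevalence has spread out after 100 steps
```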

  19. Short-term data forecasting based on wavelet transformation and chaos theory

    NASA Astrophysics Data System (ADS)

    Wang, Yi; Li, Cunbin; Zhang, Liang

    2017-09-01

    An overview of wavelet transformation and its applications is given. Considering the characteristics of the time sequence, the Haar wavelet was used for data reduction; after processing, the effect of data spikes ("data nails") on forecasting was reduced. Chaos theory was also introduced, and a new chaotic time-series forecasting workflow based on wavelet transformation was proposed. The largest Lyapunov exponent, estimated from small data sets, was greater than zero, verifying that the data still behave chaotically. On this basis, chaotic time-series methods can be used to forecast short-term behavior. Finally, an example analysis of prices from a real electricity market showed that the forecasting method improves forecasting precision effectively and stably.
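
    A minimal sketch of the Haar-wavelet reduction step with PyWavelets: decompose, zero the detail coefficients that carry the spikes, and reconstruct a smoothed series before the chaos-based forecast. The decomposition level is an illustrative choice.

```python
# Haar-wavelet data reduction sketch with PyWavelets.
import numpy as np
import pywt

rng = np.random.default_rng(0)
price = np.sin(np.arange(512) / 20.0) + rng.normal(0, 0.3, 512)

coeffs = pywt.wavedec(price, "haar", level=3)
coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]   # drop detail ("data nail") terms
smooth = pywt.waverec(coeffs, "haar")                 # smoothed series for forecasting
print(smooth.shape)
```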

  20. Post-processing method for wind speed ensemble forecast using wind speed and direction

    NASA Astrophysics Data System (ADS)

    Sofie Eide, Siri; Bjørnar Bremnes, John; Steinsland, Ingelin

    2017-04-01

    Statistical methods are widely applied to enhance the quality of both deterministic and ensemble NWP forecasts. In many situations, such as wind speed forecasting, most of the predictive information is contained in one variable of the NWP models. However, in statistical calibration of deterministic forecasts it is often seen that including more variables can further improve forecast skill. For ensembles this is rarely exploited, mainly because it is generally not straightforward to include multiple variables. In this study, it is demonstrated how multiple variables can be included in Bayesian model averaging (BMA) by using a flexible regression method for estimating the conditional means. The method is applied to wind speed forecasting at 204 Norwegian stations based on wind speed and direction forecasts from the ECMWF ensemble system. At about 85% of the sites, the ensemble forecasts were improved in terms of CRPS by adding wind direction as a predictor compared with using wind speed only. On average the improvements were about 5%, mainly for moderate to strong wind situations. For weak winds, adding wind direction had a more or less neutral impact.
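
    The flavor of multi-variable BMA can be sketched as follows: the predictive density is a weighted mixture of distributions centred on regression-adjusted member forecasts, with wind direction entering through its components. The regression form, weights, and spread are illustrative assumptions, not the paper's fitted model.

```python
# BMA predictive density sketch with wind speed and direction as predictors.
import numpy as np

def bma_density(x, members_speed, members_dir, coef, weights, sigma=1.5):
    u = members_speed * np.cos(members_dir)        # direction enters via components
    v = members_speed * np.sin(members_dir)
    means = coef[0] + coef[1] * members_speed + coef[2] * u + coef[3] * v
    pdf = np.exp(-0.5 * ((x - means) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return np.sum(weights * pdf)                   # weighted mixture density at x

speeds = np.array([8.0, 9.5, 7.2])
dirs = np.radians([200.0, 210.0, 195.0])
w = np.array([0.4, 0.35, 0.25])                    # BMA weights (fit by EM in practice)
print(bma_density(8.5, speeds, dirs, coef=[0.3, 0.9, 0.2, -0.1], weights=w))
```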

  1. First Assessment of Itaipu Dam Ensemble Inflow Forecasting System

    NASA Astrophysics Data System (ADS)

    Mainardi Fan, Fernando; Machado Vieira Lisboa, Auder; Gomes Villa Trinidad, Giovanni; Rógenes Monteiro Pontes, Paulo; Collischonn, Walter; Tucci, Carlos; Costa Buarque, Diogo

    2017-04-01

    Inflow forecasting for hydropower plant (HPP) dams is one of the prominent uses of hydrological forecasts. A very important HPP for energy generation in South America is the Itaipu Dam, located on the Paraná River between Brazil and Paraguay, with a drainage area of 820,000 km2. In this work, we present the development of an ensemble forecasting system for Itaipu, operational since November 2015. The system is based on the MGB-IPH hydrological model, includes hydrodynamic simulations of the main river, and is run every morning forced by seven different rainfall forecasts: (i) CPTEC-ETA 15km; (ii) CPTEC-BRAMS 5km; (iii) SIMEPAR WRF Ferrier; (iv) SIMEPAR WRF Lin; (v) SIMEPAR WRF Morrison; (vi) SIMEPAR WRF WDM6; (vii) SIMEPAR MEDIAN. The last (vii) corresponds to the median of the rainfall forecasts from the SIMEPAR WRF model versions (iii to vi). Besides the developed system, the "traditional" method of inflow forecasting for the Itaipu Dam is also run every day; it consists of approximating the future inflow from the discharge tendency of upstream telemetric gauges. After all the forecasts are run, the hydrology team of Itaipu develops a consensus forecast based on all obtained results, which is the one used for Itaipu HPP dam operation. After one year of operation, a first evaluation of the ensemble forecasting system was conducted. Results show that the system performs satisfactorily for rising flows up to a five-day lead time. However, false alarms were also issued by most ensemble members in some cases, and the system did not outperform the traditional method in all cases, especially during hydrograph recessions. In terms of meteorological forecasts, the use of some members is being discontinued. In terms of the hydrodynamic representation, better information on river cross sections could improve forecasts of hydrograph recession curves. These opportunities for improvement are being addressed in the system's next update.

  2. A quantitative comparison of precipitation forecasts between the storm-scale numerical weather prediction model and auto-nowcast system in Jiangsu, China

    NASA Astrophysics Data System (ADS)

    Wang, Gaili; Yang, Ji; Wang, Dan; Liu, Liping

    2016-11-01

    Extrapolation techniques and storm-scale numerical weather prediction (NWP) models are two primary approaches to short-term precipitation forecasting. The primary objective of this study is to verify precipitation forecasts and compare the performance of two nowcasting schemes: the Beijing Auto-Nowcast system (BJ-ANC), based on extrapolation techniques, and a storm-scale NWP model called the Advanced Regional Prediction System (ARPS). The verification and comparison take into account six heavy precipitation events that occurred in the summers of 2014 and 2015 in Jiangsu, China. The forecast performance of the two schemes was evaluated for the next 6 h at 1-h intervals using the gridpoint-based measures of critical success index, bias, index of agreement, and root mean square error, and using an object-based verification method called the Structure-Amplitude-Location (SAL) score. Regarding gridpoint-based measures, BJ-ANC outperforms ARPS at first, but its accuracy decreases rapidly with lead time and falls below that of ARPS 4-5 h after forecast initialization. Regarding the object-based verification, most forecasts produced by BJ-ANC cluster near the center of the SAL diagram at the 1-h lead time, indicating high-quality forecasts. As the lead time increases, BJ-ANC overestimates precipitation amounts and produces widespread precipitation, especially at the 6-h lead time. The ARPS model overestimates precipitation at all lead times, particularly at first.
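
    The gridpoint scores used here are straightforward to compute; below is a minimal sketch of the critical success index and frequency bias from forecast and observed threshold exceedances on a common grid (the threshold and fields are synthetic).

```python
# Critical success index (CSI) and frequency bias from a contingency table.
import numpy as np

def csi_and_bias(fcst, obs, thresh=1.0):
    f, o = fcst >= thresh, obs >= thresh
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    csi = hits / (hits + misses + false_alarms)
    bias = (hits + false_alarms) / (hits + misses)
    return csi, bias

rng = np.random.default_rng(0)
obs = rng.gamma(0.5, 2.0, size=(100, 100))           # observed rain field (mm/h)
fcst = obs + rng.normal(0, 1.0, size=(100, 100))     # imperfect forecast field
print(csi_and_bias(fcst, obs))
```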

  3. GPS-based PWV for precipitation forecasting and its application to a typhoon event

    NASA Astrophysics Data System (ADS)

    Zhao, Qingzhi; Yao, Yibin; Yao, Wanqiang

    2018-01-01

    The temporal variability of precipitable water vapour (PWV) derived from Global Navigation Satellite System (GNSS) observations can be used to forecast precipitation events. A number of case studies of precipitation events were analysed in Zhejiang Province, and a forecasting method for precipitation events is proposed. The PWV time series retrieved from Global Positioning System (GPS) observations was processed with a least-squares fitting method to obtain the linear tendency of PWV ascents and descents. The PWV increment over a short period (two to six hours) and the PWV slope over a longer period (a few hours to more than ten hours) during the ascending phase are used as predictive factors for forecasting precipitation events. The numerical results show that about 80%-90% of precipitation events and more than 90% of heavy rain events can be forecast two to six hours in advance with the proposed method. Five-minute PWV data derived from GPS observations based on real-time precise point positioning (RT-PPP) were used for the typhoon event that passed over Zhejiang Province between 10 and 12 July 2015. Good results were acquired using the proposed method: about 74% of precipitation events were predicted some ten to thirty minutes before their onset, with a false alarm rate of 18%. This study shows that GPS-based PWV is promising for short-term precipitation forecasting and nowcasting.
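
    The two predictive factors reduce to simple calculations, sketched below with illustrative alarm thresholds in place of the calibrated values of the paper: a least-squares PWV slope over a longer window and the PWV increment over a shorter one.

```python
# PWV increment + slope alarm rule sketch.
import numpy as np

def pwv_alarm(t_hours, pwv, inc_thresh=5.0, slope_thresh=1.0):
    slope = np.polyfit(t_hours, pwv, 1)[0]      # mm per hour over the window
    increment = pwv[-1] - pwv[0]                # mm over the window
    return increment > inc_thresh or slope > slope_thresh

t = np.arange(0.0, 6.0, 0.5)                    # a 6-hour ascending window
pwv = 40.0 + 1.2 * t                            # PWV rising before rainfall
print(pwv_alarm(t, pwv))                        # True -> precipitation warning
```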

  4. Comparison of ensemble post-processing approaches, based on empirical and dynamical error modelisation of rainfall-runoff model forecasts

    NASA Astrophysics Data System (ADS)

    Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.

    2012-04-01

    In the context of a national energy company (EDF: Electricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards, and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve the human expertise of hydrological forecasts, which is essential to synthesize the available information coming from different meteorological and hydrological models and from human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where considerable human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (the empirical approach) is based on statistical modelling of the empirical error of perfect forecasts, using streamflow sub-samples stratified by quantile class and lead time. The second method (the dynamical approach) uses streamflow sub-samples stratified by quantile class, streamflow variation, and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches ensure good post-processing of the hydrological ensemble, markedly improving the reliability, skill and sharpness of the ensemble forecasts. The comparison, however, shows the limits of the empirical approach, which is not able to take into account hydrological dynamics and processes, i.e. sample heterogeneity: the same streamflow range corresponds to different processes, such as rising limbs or recessions, with different uncertainties. The dynamical approach improves the reliability, skill and sharpness of the forecasts and globally reduces confidence interval width. In detail, the dynamical approach allows a noticeable reduction of confidence intervals during recessions, where uncertainty is relatively low, and a slight increase during rising limbs or snowmelt, where uncertainty is greater. The dynamical approach, validated by forecasters' experience (the empirical approach was considered not discriminative enough), improved forecasters' confidence and the communication of uncertainties. Montanari, A. and Brath, A. (2004). A stochastic approach for assessing the uncertainty of rainfall-runoff simulations. Water Resources Research, 40, W01106, doi:10.1029/2003WR002540. Schaefli, B., Balin Talamba, D. and Musy, A. (2007). Quantifying hydrological modeling errors through a mixture of normal distributions. Journal of Hydrology, 332, 303-315.

  5. Load Forecasting of Central Urban Area Power Grid Based on Saturated Load Density Index

    NASA Astrophysics Data System (ADS)

    Huping, Yang; Chengyi, Tang; Meng, Yu

    2018-03-01

    In today's society, coordination between urban power grid development and city development has become increasingly important. Saturated load forecasting plays an important role in the planning and development of power grids; it is a concept put forward in China in recent years in the field of grid planning. Urban saturation load forecasting differs from traditional load forecasting for specific years: its time span is often relatively large, and it involves a wide range of aspects. Taking a county in eastern Jiangxi as an example, this paper applies a variety of load forecasting methods to near-term load forecasting for the central urban area. It then uses the load density index method to forecast the long-term saturated electricity load of the central urban area through 2030. Further study shows the general spatial distribution of the urban saturation load.

  6. Improved Anvil Forecasting

    NASA Technical Reports Server (NTRS)

    Lambert, Winifred C.

    2000-01-01

    This report describes the outcome of Phase 1 of the AMU's Improved Anvil Forecasting task. Forecasters in the 45th Weather Squadron and the Spaceflight Meteorology Group have found that anvil forecasting is a difficult task when predicting LCC and FR violations. The purpose of this task is to determine the technical feasibility of creating an anvil-forecasting tool. Work on this study was separated into three steps: a literature search, forecaster discussions, and a determination of technical feasibility. The literature search revealed no existing anvil-forecasting techniques; however, there appears to be growing interest in anvils in recent years. If this interest continues to grow, more information will become available to aid in developing a reliable anvil-forecasting tool. The forecaster discussions revealed an array of ideas for how better forecasting techniques could be developed. The forecasters have ideas based on sound meteorological principles and personal experience in forecasting and analyzing anvils. Based on the information gathered in these discussions, the conclusion of this report is that it is technically feasible at this time to develop an anvil-forecasting technique that will significantly contribute to confidence in anvil forecasts.

  7. Near real time wind energy forecasting incorporating wind tunnel modeling

    NASA Astrophysics Data System (ADS)

    Lubitz, William David

    A series of experiments and investigations were carried out to inform the development of a day-ahead wind power forecasting system. An experimental near-real time wind power forecasting system was designed and constructed that operates on a desktop PC and forecasts 12--48 hours in advance. The system uses model output of the Eta regional scale forecast (RSF) to forecast the power production of a wind farm in the Altamont Pass, California, USA from 12 to 48 hours in advance. It is of modular construction and designed to also allow diagnostic forecasting using archived RSF data, thereby allowing different methods of completing each forecasting step to be tested and compared using the same input data. Wind-tunnel investigations of the effect of wind direction and hill geometry on wind speed-up above a hill were conducted. Field data from an Altamont Pass, California site was used to evaluate several speed-up prediction algorithms, both with and without wind direction adjustment. These algorithms were found to be of limited usefulness for the complex terrain case evaluated. Wind-tunnel and numerical simulation-based methods were developed for determining a wind farm power curve (the relation between meteorological conditions at a point in the wind farm and the power production of the wind farm). Both methods, as well as two methods based on fits to historical data, ultimately showed similar levels of accuracy: mean absolute errors predicting power production of 5 to 7 percent of the wind farm power capacity. The downscaling of RSF forecast data to the wind farm was found to be complicated by the presence of complex terrain. Poor results using the geostrophic drag law and regression methods motivated the development of a database search method that is capable of forecasting not only wind speeds but also power production with accuracy better than persistence.

  8. Statistical Correction of Air Temperature Forecasts for City and Road Weather Applications

    NASA Astrophysics Data System (ADS)

    Mahura, Alexander; Petersen, Claus; Sass, Bent; Gilet, Nicolas

    2014-05-01

    A method for statistical correction of air and road surface temperature forecasts was developed based on analysis of long-term time series of meteorological observations and forecasts (from the HIgh Resolution Limited Area Model and the Road Conditions Model; 3 km horizontal resolution). It was tested for May-Aug 2012 and Oct 2012 - Mar 2013, respectively. The developed method is based mostly on forecasted meteorological parameters, with minimal inclusion of observations (covering only a pre-history period). The first-iteration correction takes relevant temperature observations into account, but the further adjustment of air and road temperature forecasts is based purely on forecasted meteorological parameters. The method is model independent: it can be applied to temperature correction with other types of models at different horizontal resolutions. It is relatively fast owing to the use of the singular value decomposition method to solve the matrix system for the coefficients. Moreover, there is always a possibility of additional improvement through extra tuning of the temperature forecasts for some locations (stations), in particular those where the MAEs are generally higher than elsewhere (see Gilet et al., 2014). For city weather applications, a new operational procedure for statistical correction of air temperature forecasts was elaborated and implemented for the HIRLAM-SKA model runs at 00, 06, 12, and 18 UTC, covering forecast lengths up to 48 hours. The procedure includes segments for extracting observations and forecast data, assigning them to forecast lengths, statistically correcting temperature, one- and multi-day statistical evaluation of model performance, deciding which stations' corrections to use, interpolation, visualisation, and storage/backup. Pre-operational air temperature correction runs, performed for mainland Denmark since mid-April 2013, have shown good results. Tests also showed that the CPU time required for the operational procedure is relatively short (less than 15 minutes, including the large share spent on interpolation). They also showed that starting the correction does not require long-term pre-historical data (containing forecasts and observations); at least a couple of weeks is sufficient when a new observational station is added to a forecast point. For the road weather application, statistical correction of road surface temperature forecasts (for the RWM system's daily hourly runs covering forecast lengths up to 5 hours ahead) was implemented for the Danish road network (about 400 road stations) and has been running in test mode since Sep 2013. The method can also be applied to correct the dew point temperature and wind speed (as parts of observations/forecasts at synoptic stations), since both meteorological parameters appear in the proposed system of equations. Evaluation of the method's performance for improving wind speed forecasts is planned, along with consideration of possible wind direction improvements (more complex owing to the multi-modal distribution of such data).
The method worked for the entire domain of mainland Denmark (tested for 60 synoptic and 395 road stations), and hence it can be applied to any geographical point within this domain, e.g. through interpolation to about 100 cities' locations (for the Danish national byvejr forecasts). Moreover, we can assume that the same method can be used in other geographical areas; evaluation for other domains (with a focus on Greenland and the Nordic countries) is planned. In addition, a similar approach might be tested for statistical correction of concentrations of chemical species, but such an approach will require additional elaboration and evaluation.
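
    The core regression step can be sketched compactly: predictors built from forecasted parameters are regressed on observed temperature, with the least-squares system solved via singular value decomposition (NumPy's lstsq uses SVD internally). The predictors and data are illustrative.

```python
# SVD-based least-squares correction of temperature forecasts.
import numpy as np

rng = np.random.default_rng(0)
fc_t2m = rng.normal(10, 5, 300)                 # forecast 2 m temperature
fc_wind = rng.gamma(2.0, 2.0, 300)              # forecast wind speed
obs = 0.9 * fc_t2m - 0.2 * fc_wind + 1.5 + rng.normal(0, 0.5, 300)

A = np.column_stack([fc_t2m, fc_wind, np.ones_like(fc_t2m)])
coef, *_ = np.linalg.lstsq(A, obs, rcond=None)  # lstsq solves via SVD
corrected = A @ coef                            # bias-corrected temperatures
print(coef)
```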

  9. Establishing a method of short-term rainfall forecasting based on GNSS-derived PWV and its application.

    PubMed

    Yao, Yibin; Shan, Lulu; Zhao, Qingzhi

    2017-09-29

    The Global Navigation Satellite System (GNSS) can effectively retrieve precipitable water vapor (PWV) with high precision and high temporal resolution. GNSS-derived PWV can reflect water vapor variation during strong convective weather. By studying the relationship between time-varying PWV and rainfall, it is found that PWV content increases sharply before rain. Therefore, a short-term rainfall forecasting method based on GNSS-derived PWV is proposed. The method is validated using hourly GNSS-PWV data from the Zhejiang Continuously Operating Reference Station (CORS) network for the period 1 September 2014 to 31 August 2015 and the corresponding hourly rainfall information. The results show that the forecast correct rate reaches about 80%, while the false alarm rate is about 66%. Compared with the results of previous studies, the correct rate is improved by about 7%, and the false alarm rate is comparable. The method is also applied to three other actual rainfall events of different regions, durations, and types. The results show that the method has good applicability and high accuracy and can be used for rainfall forecasting; in future work it can be combined with traditional weather forecasting techniques to further improve forecast accuracy.

  10. Short-term load and wind power forecasting using neural network-based prediction intervals.

    PubMed

    Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas

    2014-02-01

    Electrical power systems are evolving from today's centralized bulk systems to more decentralized systems. Penetrations of renewable energies, such as wind and solar power, significantly increase the level of uncertainty in power systems. Accurate load forecasting becomes more complex, yet more important for management of power systems. Traditional methods for generating point forecasts of load demands cannot properly handle uncertainties in system operations. To quantify potential uncertainties associated with forecasts, this paper implements a neural network (NN)-based method for the construction of prediction intervals (PIs). A newly introduced method, called lower upper bound estimation (LUBE), is applied and extended to develop PIs using NN models. A new problem formulation is proposed, which translates the primary multiobjective problem into a constrained single-objective problem. Compared with the cost function, this new formulation is closer to the primary problem and has fewer parameters. Particle swarm optimization (PSO) integrated with the mutation operator is used to solve the problem. Electrical demands from Singapore and New South Wales (Australia), as well as wind power generation from Capital Wind Farm, are used to validate the PSO-based LUBE method. Comparative results show that the proposed method can construct higher quality PIs for load and wind power generation forecasts in a short time.
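
    The two standard interval-quality measures behind LUBE-style training are easy to state in code; below is a minimal sketch of the PI coverage probability (PICP) and the normalized average width (PINAW) on synthetic forecasts.

```python
# Prediction-interval quality metrics: coverage (PICP) and width (PINAW).
import numpy as np

def picp(lower, upper, y):
    return np.mean((y >= lower) & (y <= upper))

def pinaw(lower, upper, y):
    return np.mean(upper - lower) / (y.max() - y.min())

rng = np.random.default_rng(0)
y = rng.normal(100, 10, 500)                 # observed load
pred = y + rng.normal(0, 5, 500)             # point forecasts with error
lower, upper = pred - 8.0, pred + 8.0        # candidate prediction intervals
print(picp(lower, upper, y), pinaw(lower, upper, y))
```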

  11. Cable Overheating Risk Warning Method Based on Impedance Parameter Estimation in Distribution Network

    NASA Astrophysics Data System (ADS)

    Yu, Zhang; Xiaohui, Song; Jianfang, Li; Fei, Gao

    2017-05-01

    Cable overheating reduces the cable insulation level, accelerates insulation aging, and can even cause short-circuit faults. Identifying and warning of cable overheating risk is therefore necessary for distribution network operators. A cable overheating risk warning method based on impedance parameter estimation is proposed in this paper to improve the safe and reliable operation of distribution networks. First, a cable impedance estimation model is established using the least-squares method with data from the distribution SCADA system, improving the accuracy of the impedance parameter estimation. Second, the threshold value of cable impedance is calculated from historical data, and the forecast value of cable impedance is calculated from future forecasting data from the distribution SCADA system. Third, a library of cable overheating risk warning rules is established; the forecast impedance and its rate of change are analyzed, and the overheating risk of the cable line is flagged according to the rules library, based on the relationship between impedance and line temperature rise. The warning method is simulated in the paper. The simulation results show that the method can accurately identify the impedance and forecast the temperature rise of cable lines in a distribution network, and the resulting overheating risk warnings can provide a decision basis for operation, maintenance, and repair.
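
    The estimation step can be sketched with a linearized voltage-drop model, dV ≈ R·I_active + X·I_reactive, solved by least squares over SCADA samples; the model form and figures are illustrative simplifications of the paper's formulation.

```python
# Least-squares cable impedance estimation from SCADA-like samples.
import numpy as np

rng = np.random.default_rng(0)
i_act = rng.uniform(50, 200, 96)                # active current samples (A)
i_rea = rng.uniform(10, 60, 96)                 # reactive current samples (A)
R_true, X_true = 0.12, 0.08                     # ohms (unknown in practice)
dv = R_true * i_act + X_true * i_rea + rng.normal(0, 0.2, 96)  # voltage drop (V)

A = np.column_stack([i_act, i_rea])
(R_est, X_est), *_ = np.linalg.lstsq(A, dv, rcond=None)
print(R_est, X_est)   # compare against a historical threshold to raise a warning
```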

  12. A nowcasting technique based on application of the particle filter blending algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Yuanzhao; Lan, Hongping; Chen, Xunlai; Zhang, Wenhai

    2017-10-01

    To improve the accuracy of nowcasting, a new extrapolation technique called particle filter blending was configured in this study and applied to experimental nowcasting. Radar echo extrapolation was performed using the radar mosaic at an altitude of 2.5 km obtained from the images of 12 S-band radars in Guangdong Province, China. A bilateral filter was first applied for quality control of the radar data; an optical flow method based on the Lucas-Kanade algorithm and the Harris corner detection algorithm was used to track radar echoes and retrieve the echo motion vectors; the motion vectors were then blended with the particle filter blending algorithm to estimate the optimal motion vector of the true echo motions; finally, semi-Lagrangian extrapolation was used for radar echo extrapolation based on the obtained motion vector field. A comparative study of the extrapolated forecasts of four precipitation events in 2016 in Guangdong was conducted. The results indicate that the particle filter blending algorithm could realistically reproduce the spatial pattern, echo intensity, and echo location at 30- and 60-min forecast lead times. The forecasts agreed well with observations, and the results were of operational significance. Quantitative evaluation indicates that the particle filter blending algorithm performed better than the cross-correlation method and the optical flow method; it is therefore superior to these traditional methods and can be used to enhance nowcasting capability in operational weather forecasts.
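
    The final extrapolation step admits a compact sketch: each grid point takes the echo value found upstream along the motion vector, iterated over time steps. A single uniform motion vector stands in for the blended particle-filter motion field.

```python
# Semi-Lagrangian radar-echo extrapolation sketch.
import numpy as np
from scipy.ndimage import map_coordinates

def extrapolate(echo, u, v, steps=6):
    ny, nx = echo.shape
    yy, xx = np.mgrid[0:ny, 0:nx].astype(float)
    out = echo
    for _ in range(steps):                      # e.g. 6 x 10-min steps = 1 h
        coords = [yy - v, xx - u]               # upstream departure points
        out = map_coordinates(out, coords, order=1, mode="constant")
    return out

echo = np.zeros((50, 50))
echo[20:25, 10:15] = 40.0                       # a 40 dBZ cell
print(extrapolate(echo, u=2.0, v=1.0).argmax()) # the cell has been advected downstream
```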

  13. Forecasting Construction Cost Index based on visibility graph: A network approach

    NASA Astrophysics Data System (ADS)

    Zhang, Rong; Ashuri, Baabak; Shyr, Yu; Deng, Yong

    2018-03-01

    Engineering News-Record (ENR), a professional magazine in the field of global construction engineering, publishes the Construction Cost Index (CCI) every month. Cost estimators and contractors assess projects, arrange budgets, and prepare bids by forecasting CCI. However, fluctuations and uncertainty in CCI occasionally lead to unreasonable estimates. This paper aims at achieving more accurate predictions of CCI through a network approach in which the time series is first converted into a visibility graph and future values are forecast by link prediction. According to the experimental results, the proposed method shows satisfactory performance, with acceptable error measures. Compared with other methods, the proposed method is easier to implement and forecasts CCI with smaller errors. The proposed method thus efficiently provides considerably accurate CCI predictions, contributing to construction engineering by assisting individuals and organizations in reducing costs and preparing project schedules.
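
    The graph construction is simple to sketch: in a natural visibility graph, two observations are linked if the straight line between them passes above every intermediate observation. The link-prediction forecasting stage of the paper is not shown.

```python
# Natural visibility graph construction from a time series.
import numpy as np

def visibility_graph(y):
    n = len(y)
    edges = []
    for a in range(n):
        for b in range(a + 1, n):
            # Visible if every point between a and b lies strictly below the chord.
            if all(y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                   for c in range(a + 1, b)):
                edges.append((a, b))
    return edges

cci = np.array([100.0, 102.5, 101.0, 104.2, 103.1, 105.0])  # illustrative CCI values
print(visibility_graph(cci))
```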

  14. A Method for Oscillation Errors Restriction of SINS Based on Forecasted Time Series.

    PubMed

    Zhao, Lin; Li, Jiushun; Cheng, Jianhua; Jia, Chun; Wang, Qiufan

    2015-07-17

    Continuity, real-time performance, and accuracy are the key technical indexes for evaluating the comprehensive performance of a strapdown inertial navigation system (SINS). However, Schuler, Foucault, and Earth periodic oscillation errors significantly degrade the real-time accuracy of SINS. A method for oscillation error restriction of SINS based on forecasted time series is proposed by analyzing the characteristics of the periodic oscillation errors. The method gains multiple sets of navigation solutions with different phase delays by virtue of the forecasted time series acquired from the measurement data of the inertial measurement unit (IMU). With the help of curve fitting based on the least squares method, the forecasted time series is obtained while small angular motion interference is distinguished and removed during initial alignment. Finally, the periodic oscillation errors are restricted based on the principle that a periodic oscillation signal can be eliminated by averaging it with a half-wave-delayed copy of itself. Simulation and test results show that the method performs well in restricting the Schuler, Foucault, and Earth oscillation errors of SINS.

  15. A Method for Oscillation Errors Restriction of SINS Based on Forecasted Time Series

    PubMed Central

    Zhao, Lin; Li, Jiushun; Cheng, Jianhua; Jia, Chun; Wang, Qiufan

    2015-01-01

    Continuity, real-time performance, and accuracy are the key technical indexes for evaluating the comprehensive performance of a strapdown inertial navigation system (SINS). However, Schuler, Foucault, and Earth periodic oscillation errors significantly degrade the real-time accuracy of SINS. A method for oscillation error restriction of SINS based on forecasted time series is proposed by analyzing the characteristics of the periodic oscillation errors. The method gains multiple sets of navigation solutions with different phase delays by virtue of the forecasted time series acquired from the measurement data of the inertial measurement unit (IMU). With the help of curve fitting based on the least squares method, the forecasted time series is obtained while small angular motion interference is distinguished and removed during initial alignment. Finally, the periodic oscillation errors are restricted based on the principle that a periodic oscillation signal can be eliminated by averaging it with a half-wave-delayed copy of itself. Simulation and test results show that the method performs well in restricting the Schuler, Foucault, and Earth oscillation errors of SINS. PMID:26193283
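
    The cancellation principle behind the final step can be demonstrated in a few lines: averaging a signal with a copy of itself delayed by half the oscillation period removes that periodic component while approximately preserving slowly varying content. The Schuler period and amplitudes below are illustrative values, not those of the paper's tests.

        import numpy as np

        T = 84.4 * 60.0                        # Schuler period, s (illustrative)
        dt = 1.0
        t = np.arange(0, 6 * T, dt)
        truth = 0.001 * t                      # slowly varying content to keep
        osc = 0.5 * np.sin(2 * np.pi * t / T)  # Schuler-type oscillation error
        x = truth + osc

        half = int(round(T / 2 / dt))          # samples in a half wave
        restricted = 0.5 * (x[half:] + x[:-half])   # mean with half-wave delay

        # The average estimates the signal a quarter period later; the
        # oscillation cancels exactly while the drift is preserved.
        quarter = half // 2
        resid = restricted - truth[quarter:quarter + restricted.size]
        print(np.abs(osc).max(), np.abs(resid).max())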

  16. A Comparison of the Performance of Advanced Statistical Techniques for the Refinement of Day-ahead and Longer NWP-based Wind Power Forecasts

    NASA Astrophysics Data System (ADS)

    Zack, J. W.

    2015-12-01

    Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors including the limited space-time resolution of the NWP models and shortcomings in the model's representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There are an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble, which is a case-matching scheme. The presentation will provide (1) an overview of each method and the experimental design, (2) performance comparisons based on standard metrics such as bias, MAE and RMSE, (3) a summary of the performance characteristics of each approach and (4) a preview of further experiments to be conducted.
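
    A sketch of such a MOS-method comparison harness, using scikit-learn stand-ins for five of the six methods (the analog ensemble is omitted) on synthetic NWP-feature/observed-power pairs; the features, data, and hyperparameters are placeholder assumptions, not the experiment's configuration.

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import mean_absolute_error
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor
        from sklearn.svm import SVR

        # Synthetic stand-in: raw NWP features vs observed power, with a
        # nonlinear systematic error for the MOS step to learn.
        rng = np.random.default_rng(1)
        X = rng.uniform(0, 25, size=(3000, 3))   # e.g. speed, direction, shear
        y = np.clip((X[:, 0] - 4) / 8, 0, 1) ** 3 + 0.05 * rng.normal(size=3000)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        models = {
            "linear (baseline)": LinearRegression(),
            "neural net": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                                       random_state=0),
            "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
            "gradient boosting": GradientBoostingRegressor(random_state=0),
            "support vector": SVR(C=10.0),
        }
        for name, m in models.items():
            m.fit(X_tr, y_tr)
            print(f"{name:20s} MAE = {mean_absolute_error(y_te, m.predict(X_te)):.4f}")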

  17. Visualizing Uncertainty for Probabilistic Weather Forecasting based on Reforecast Analogs

    NASA Astrophysics Data System (ADS)

    Pelorosso, Leandro; Diehl, Alexandra; Matković, Krešimir; Delrieux, Claudio; Ruiz, Juan; Gröeller, M. Eduard; Bruckner, Stefan

    2016-04-01

    Numerical weather forecasts are prone to uncertainty coming from inaccuracies in the initial and boundary conditions and lack of precision in numerical models. Ensembles of forecasts partially address these problems by considering several runs of the numerical model. Each forecast is generated with different initial and boundary conditions and different model configurations [GR05]. The ensembles can be expressed as probabilistic forecasts, which have proven to be very effective in the decision-making processes [DE06]. The ensemble of forecasts represents only some of the possible future atmospheric states, usually underestimating the degree of uncertainty in the predictions [KAL03, PH06]. Hamill and Whitaker [HW06] introduced the "Reforecast Analog Regression" (RAR) technique to overcome the limitations of ensemble forecasting. This technique produces probabilistic predictions based on the analysis of historical forecasts and observations. Visual analytics provides tools for processing, visualizing, and exploring data to get new insights and discover hidden information patterns in an interactive exchange between the user and the application [KMS08]. In this work, we introduce Albero, a visual analytics solution for probabilistic weather forecasting based on the RAR technique. Albero targets at least two different types of users: "forecasters", who are meteorologists working in operational weather forecasting, and "researchers", who work in the construction of numerical prediction models. Albero is an efficient tool for analyzing precipitation forecasts, allowing forecasters to make and communicate quick decisions. Our solution facilitates the analysis of a set of probabilistic forecasts, associated statistical data, observations and uncertainty. A dashboard with small-multiples of probabilistic forecasts allows the forecasters to analyze at a glance the distribution of probabilities as a function of time, space, and magnitude. It provides the user with a more accurate measure of forecast uncertainty that could result in better decision-making. It offers different levels of abstraction to help with the recalibration of the RAR method. It also has an inspection tool that displays the selected analogs, their observations and statistical data. It gives the users access to inner parts of the method, unveiling hidden information. References [GR05] GNEITING T., RAFTERY A. E.: Weather forecasting with ensemble methods. Science 310, 5746, 248-249, 2005. [KAL03] KALNAY E.: Atmospheric modeling, data assimilation and predictability. Cambridge University Press, 2003. [PH06] PALMER T., HAGEDORN R.: Predictability of weather and climate. Cambridge University Press, 2006. [HW06] HAMILL T. M., WHITAKER J. S.: Probabilistic quantitative precipitation forecasts based on reforecast analogs: Theory and application. Monthly Weather Review 134, 11, 3209-3229, 2006. [DE06] DEITRICK S., EDSALL R.: The influence of uncertainty visualization on decision making: An empirical evaluation. Springer, 2006. [KMS08] KEIM D. A., MANSMANN F., SCHNEIDEWIND J., THOMAS J., ZIEGLER H.: Visual analytics: Scope and challenges. Springer, 2008.

  18. Use of Vertically Integrated Ice in WRF-Based Forecasts of Lightning Threat

    NASA Technical Reports Server (NTRS)

    McCaul, E. W., jr.; Goodman, S. J.

    2008-01-01

    Previously reported methods of forecasting lightning threat using fields of graupel flux from WRF simulations are extended to include the simulated field of vertically integrated ice within storms. Although the ice integral shows less temporal variability than graupel flux, it provides more areal coverage, and can thus be used to create a lightning forecast that better matches the areal coverage of the lightning threat found in observations of flash extent density. A blended lightning forecast threat can be constructed that retains much of the desirable temporal sensitivity of the graupel flux method, while also incorporating the coverage benefits of the ice integral method. The graupel flux and ice integral fields contributing to the blended forecast are calibrated against observed lightning flash origin density data, based on Lightning Mapping Array observations from a series of case studies chosen to cover a wide range of flash rate conditions. Linear curve fits that pass through the origin are found to be statistically robust for the calibration procedures.
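
    The calibration step relies on a least-squares line through the origin, whose slope has the closed form b = Σxy / Σx². A minimal sketch, with made-up proxy and flash-density values standing in for the WRF fields and Lightning Mapping Array data.

        import numpy as np

        def fit_through_origin(x, y):
            """Least-squares slope of y = b*x (zero intercept): b = sum(xy)/sum(x^2)."""
            x = np.asarray(x, float)
            y = np.asarray(y, float)
            return (x * y).sum() / (x * x).sum()

        # Toy calibration: model proxy (e.g. graupel flux) vs observed flash density.
        proxy = np.array([0.0, 1.2, 2.5, 4.1, 6.0])
        flash = np.array([0.0, 0.9, 2.1, 3.5, 4.8])
        print(f"calibrated threat = {fit_through_origin(proxy, flash):.3f} * proxy")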

  19. Winter Precipitation Forecast in the European and Mediterranean Regions Using Cluster Analysis

    NASA Astrophysics Data System (ADS)

    Totz, Sonja; Tziperman, Eli; Coumou, Dim; Pfeiffer, Karl; Cohen, Judah

    2017-12-01

    The European climate is changing under global warming, and especially the Mediterranean region has been identified as a hot spot for climate change, with climate models projecting a reduction in winter rainfall and a very pronounced increase in summertime heat waves. These trends are already detectable over the historic period. Hence, it is beneficial to forecast seasonal droughts well in advance so that water managers and stakeholders can prepare to mitigate deleterious impacts. We developed a new cluster-based empirical forecast method to predict precipitation anomalies in winter. This algorithm considers not only the strength but also the pattern of the precursors. We compare our algorithm with dynamic forecast models and a canonical correlation analysis-based prediction method, demonstrating that our prediction method performs better in terms of time and pattern correlation in the Mediterranean and European regions.

  20. A system approach to the long-term forecasting of climate data in the Baikal region

    NASA Astrophysics Data System (ADS)

    Abasov, N.; Berezhnykh, T.

    2003-04-01

    The Angara River, running from Lake Baikal with a cascade of hydropower plants built on it, plays a particular role in the economy of the region. In view of the high variability of water inflow into the rivers and lakes (long low-water periods and catastrophic floods), which is due to climatic peculiarities of water resource formation, long-term forecasting has been developed and applied to reduce risk at hydropower plants. The methodology and methods of long-term forecasting of natural-climatic processes employ ideas of the research schools of Academician I.P. Druzhinin and Prof. A.P. Reznikhov and consist in the detailed investigation of cause-effect relations, the identification of physical analogs, and their application to formalized methods of long-term forecasting. The methods are divided into qualitative ones (the background method; the method of analogs based on solar activity) and probabilistic and approximative ones (analog-similarity relations; a discrete-continuous model). These forecasting methods have been implemented as analytical tools of the information-forecasting software "GIPSAR", which provides some elements of artificial intelligence. Background forecasts of the runoff of the Ob, Yenisei and Angara Rivers in the south of Siberia are based on space-time regularities revealed by taking account of the phase shifts in the occurrence of secular maxima and minima on integral-difference curves of long-term hydrological processes in the objects compared. Solar activity plays an essential role in investigations of global variations of climatic processes. Its consideration in the method of superimposed epochs has led to the conclusion that a low-water period in the actual inflow to Lake Baikal is more probable on the increasing branch of the 11-year solar activity cycle, while a high-water period is more probable on the decreasing branch, from the 2nd to the 5th year after the maximum. The probabilistic method of forecasting (one year in advance) is based on the alternation of series of years with increases and decreases in the observed indicators (characteristic indices) of natural processes. Most of the series (98.4-99.6%) last from one to three years. The forecasting problem is divided into two parts: 1) a qualitative forecast of the probability that the current series will either continue or be replaced by a new series during the next year, based on the frequency characteristics of series of years with increases or decreases of the forecasted sequence; 2) a quantitative estimate of the forecasted value in the form of a curve of conditional frequencies, made on the basis of intra-sequence interrelations among hydrometeorological elements by differentiating them with respect to series of years of increase or decrease, by constructing particular curves of conditional frequencies of the runoff for each expected variant of series development, and by subsequently constructing a generalized curve. Approximative learning methods form forecasted trajectories of the studied process indices over a long-term perspective. The method of analog-similarity relations is based on the fact that long observation periods reveal similarities, by definite criteria, in the variability of indices for some fragments of the sequence x(t). The idea of the method is to estimate the similarity of such fragments of the sequence, which are called analogs.
The method applies multistage optimization of external parameters (e.g., the number of iterations of sliding averaging needed to decompose the sequence into two components: a smoothed one with isolated periodic oscillations, and a residual or random one). The method is applicable to forecast terms ranging from the current term up to the double solar cycle. Using a special integration procedure, it selects the terms with the best results for the given optimization subsample. Several optimal parameter vectors thus obtained are tested on the examination (verifying) subsample. If the procedure is successful, the forecast is immediately made by integrating several of the best solutions. Peculiarities of forecasting extreme processes. Methods of long-term forecasting allow sufficiently reliable forecasts to be made within the interval [xmin + Δ1, xmax - Δ2] (i.e., in the interval of medium values of the indices). Meanwhile, in intervals close to the extremes, the reliability of forecasts is substantially lower. While for medium values the statistics of the 100-year sequence give acceptable results, owing to a sufficiently large number of revealed analogs corresponding to the prognostic samples, for extreme values the situation is quite different, first of all because of the scarcity of statistical data. Decreasing the values of Δ1 and Δ2 toward zero (by including them among the optimization parameters of the considered forecasting methods) could be one way to improve the reliability of forecasts. Such an approach has been partially realized in the method of analog-similarity relations, making it possible to form a range of possible forecasted trajectories in two variants, from the minimum possible trajectory to the maximum possible one. Reliability of long-term forecasts. Both the methodology and the methods considered above have been realized as the information-forecasting system "GIPSAR". The system includes tools implementing several forecasting methods, analysis of initial and forecasted information, a developed database, a set of tools for the verification of algorithms, additional information on the algorithms of statistical processing of sequences (sliding averaging, integral-difference curves, etc.), aids for organizing the input of initial information (in its various forms), as well as aids for drawing up output prognostic documents. Risk management. The normal functioning of the Angara cascade is periodically interrupted by risks of two types that take place in the Baikal, Bratsk and Ust-Ilimsk reservoirs: long low-water periods and sudden periods of extremely high water levels. For example, low-water periods observed in the reservoirs of the Angara cascade can be classified under four risk categories: 1 - acceptable (negligible reduction of electric power generation by hydropower plants; certain difficulty in meeting environmental and navigation requirements); 2 - significant (substantial reduction of electric power generation by hydropower plants; certain restriction on water releases for navigation; violation of environmental requirements in some years); 3 - emergency (big losses in electric power generation; limited electricity supply to large consumers; significant restriction of water releases for navigation; threat of exposure of drinking water intake works; violation of environmental requirements for a number of years); 4 - catastrophic (energy crisis; social crisis; exposure of drinking water intake works; termination of navigation; environmental catastrophe).
Management of energy systems consists in operative and many-year regulation and perspective planning, and has to take into account the analysis of operative data (water reserves in reservoirs), long-term statistics and relations among natural processes, and also forecasts: short-term (for a day, week, or ten-day period), long-term, and/or super-long-term (from a month to several decades). Natural processes such as water inflow to reservoirs and air temperatures during heating periods depend in turn on external factors: prevailing types of atmospheric circulation, the intensity of the 11- and 22-year cycles of solar activity, volcanic activity, interaction between the ocean and atmosphere, etc. Until recently, despite the established scientific schools on long-term forecasting (I.P. Druzhinin, A.P. Reznikhov), energy system management was based only on specially drawn dispatching schedules and long-term hydrometeorological forecasts, without drawing on long-range forecast indices. Inserting a parallel forecast block (based on the analysis of data on natural processes and special forecasting methods) into the scheme can largely smooth the unfavorable consequences of natural processes for the sustainable development of energy systems, and especially for their safe operation. However, the requirements for the reliability and accuracy of long-term forecasts then increase significantly. The considered approach to long-term forecasting can be used to predict mean winter and summer air temperatures, droughts, and forest fires.

  1. Performance of univariate forecasting on seasonal diseases: the case of tuberculosis.

    PubMed

    Permanasari, Adhistya Erna; Rambli, Dayang Rohaya Awang; Dominic, P Dhanapal Durai

    2011-01-01

    Forecasting the annual number of disease incidents worldwide is desirable for adopting appropriate policies to prevent disease outbreaks. This chapter considers the performance of different forecasting methods for predicting the future number of disease incidents, especially for seasonal diseases. Six forecasting methods, namely linear regression, moving average, decomposition, Holt-Winters, ARIMA, and artificial neural network (ANN), were applied to monthly tuberculosis data. The model derived met the requirements of a time series with a seasonal pattern and a downward trend. Forecasting performance was compared using the same error measures on forecasts for the last 5 years of data. The findings indicate that the ARIMA model was the most appropriate model, since it obtained a smaller relative error than the other models.
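
    As a hedged sketch of the kind of seasonal model compared here, the snippet below fits a seasonal ARIMA to a synthetic monthly series with a downward trend and yearly seasonality, and scores a 12-month holdout by MAPE; the orders and data are placeholders, not those of the chapter.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        # Synthetic monthly series with yearly seasonality and a downward trend,
        # standing in for monthly tuberculosis counts.
        rng = np.random.default_rng(2)
        t = np.arange(120)
        y = 500 - 1.5 * t + 60 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 15, 120)

        train, test = y[:-12], y[-12:]
        fit = ARIMA(train, order=(1, 1, 1), seasonal_order=(1, 0, 1, 12)).fit()
        fc = fit.forecast(steps=12)
        mape = np.mean(np.abs((test - fc) / test)) * 100
        print(f"12-month holdout MAPE = {mape:.1f}%")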

  2. Predicting areas of sustainable error growth in quasigeostrophic flows using perturbation alignment properties

    NASA Astrophysics Data System (ADS)

    Rivière, G.; Hua, B. L.

    2004-10-01

    A new perturbation initialization method is used to quantify error growth due to inaccuracies of the forecast model initial conditions in a quasigeostrophic box ocean model describing a wind-driven double gyre circulation. This method is based on recent analytical results on Lagrangian alignment dynamics of the perturbation velocity vector in quasigeostrophic flows. More specifically, it consists in initializing a unique perturbation from the sole knowledge of the control flow properties at the initial time of the forecast, with a velocity vector orientation that satisfies a Lagrangian equilibrium criterion. This Alignment-based Initialization method is hereafter denoted as the AI method. In terms of the spatial distribution of the errors, the AI error forecast compares favorably with the mean error obtained with a Monte Carlo ensemble prediction. It is shown that the AI forecast is on average as efficient as the error forecast initialized with the leading singular vector for the palinstrophy norm, and significantly more efficient than that for the total energy and enstrophy norms. Furthermore, a more precise examination shows that the AI forecast is systematically relevant for all control flows, whereas the palinstrophy singular vector forecast sometimes leads to very good scores and sometimes to very bad ones. A principal component analysis at the final time of the forecast shows that the AI mode spatial structure is comparable to that of the first eigenvector of the error covariance matrix for a "bred mode" ensemble. Furthermore, the kinetic energy of the AI mode grows at the same constant rate as that of the "bred modes" from the initial time to the final time of the forecast and is therefore characterized by a sustained phase of error growth. In this sense, the AI mode based on Lagrangian dynamics of the perturbation velocity orientation provides a rationale for the "bred mode" behavior.

  3. Evaluating Downscaling Methods for Seasonal Climate Forecasts over East Africa

    NASA Technical Reports Server (NTRS)

    Roberts, J. Brent; Robertson, Franklin R.; Bosilovich, Michael; Lyon, Bradfield; Funk, Chris

    2013-01-01

    The U.S. National Multi-Model Ensemble seasonal forecasting system is providing hindcast and real-time data streams to be used in assessing and improving seasonal predictive capacity. The NASA / USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of NMME forecasts specifically for use in impact modeling within hub regions including East Africa, the Hindu Kush-Himalayan (HKH) region and Mesoamerica. One of the participating models in NMME is the NASA Goddard Earth Observing System (GEOS5). This work will present an intercomparison of downscaling methods using the GEOS5 seasonal forecasts of temperature and precipitation over East Africa. The current seasonal forecasting system provides monthly averaged forecast anomalies. These anomalies must be spatially downscaled and temporally disaggregated for use in application modeling (e.g. hydrology, agriculture). There are several available downscaling methodologies that can be implemented to accomplish this goal. Selected methods include both a non-homogeneous hidden Markov model and an analogue-based approach. A particular emphasis will be placed on quantifying the ability of different methods to capture the intermittency of precipitation within both the short and long rain seasons. Further, the ability to capture spatial covariances will be assessed. Both probabilistic and deterministic skill measures will be evaluated over the hindcast period.

  4. Evaluating Downscaling Methods for Seasonal Climate Forecasts over East Africa

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Roberts, J. Brent; Bosilovich, Michael; Lyon, Bradfield

    2013-01-01

    The U.S. National Multi-Model Ensemble seasonal forecasting system is providing hindcast and real-time data streams to be used in assessing and improving seasonal predictive capacity. The NASA / USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of NMME forecasts specifically for use in impact modeling within hub regions including East Africa, the Hindu Kush-Himalayan (HKH) region and Mesoamerica. One of the participating models in NMME is the NASA Goddard Earth Observing System (GEOS5). This work will present an intercomparison of downscaling methods using the GEOS5 seasonal forecasts of temperature and precipitation over East Africa. The current seasonal forecasting system provides monthly averaged forecast anomalies. These anomalies must be spatially downscaled and temporally disaggregated for use in application modeling (e.g. hydrology, agriculture). There are several available downscaling methodologies that can be implemented to accomplish this goal. Selected methods include both a non-homogeneous hidden Markov model and an analogue-based approach. A particular emphasis will be placed on quantifying the ability of different methods to capture the intermittency of precipitation within both the short and long rain seasons. Further, the ability to capture spatial covariances will be assessed. Both probabilistic and deterministic skill measures will be evaluated over the hindcast period.

  5. Statistical Short-Range Forecast Guidance for Cloud Ceilings Over the Shuttle Landing Facility

    NASA Technical Reports Server (NTRS)

    Lambert, Winifred C.

    2001-01-01

    This report describes the results of the AMU's Short-Range Statistical Forecasting task. The cloud ceiling forecast over the Shuttle Landing Facility (SLF) is a critical element in determining whether a Shuttle should land. Spaceflight Meteorology Group (SMG) forecasters find that ceilings at the SLF are challenging to forecast. The AMU was tasked with developing ceiling forecast equations to help address this challenge. Studies in the literature that showed success in improving short-term ceiling forecasts provided the basis for the AMU task. A 20-year record of cool-season hourly surface observations from stations in east-central Florida was used for the equation development. Two methods were used: an observations-based (OBS) method that incorporated data from all stations, and a persistence climatology (PCL) method used as the benchmark. Equations were developed for 1-, 2-, and 3-hour lead times at each hour of the day. A comparison between the two methods indicated that the OBS equations performed well and produced an improvement over the PCL equations. Therefore, the conclusion of the AMU study is that the OBS equations produced more accurate forecasts than the PCL equations and can be used in operations. They provide another tool with which to make the ceiling forecasts that are critical to safe Shuttle landings at KSC.

  6. Applications of the gambling score in evaluating earthquake predictions and forecasts

    NASA Astrophysics Data System (ADS)

    Zhuang, Jiancang; Zechar, Jeremy D.; Jiang, Changsheng; Console, Rodolfo; Murru, Maura; Falcone, Giuseppe

    2010-05-01

    This study presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecast scheme and treat each earthquake equally regardless of magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some of his reputation points. The reference model, which plays the role of the house, determines how many reputation points the forecaster gains if he succeeds, according to a fair rule, and takes away the reputation points bet by the forecaster if he loses. This method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. For discrete predictions, we apply this method to evaluate the performance of Shebalin's predictions made using the Reverse Tracing of Precursors (RTP) algorithm and of the predictions from the Annual Consultation Meeting on Earthquake Tendency held by the China Earthquake Administration. For the continuous case, we use it to compare probability forecasts of seismicity in the Abruzzo region before and after the L'Aquila earthquake based on the ETAS model and the PPE model.
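
    One plausible reading of the fair rule (an assumption; the paper's exact bookkeeping may differ): a forecaster who bets r reputation points on an event with reference-model probability p0 gains r(1 - p0)/p0 if the event occurs and loses r otherwise, so the expected gain is zero whenever the reference model is correct.

        def gambling_score(bets):
            """Accumulate reputation points under a fair rule set by a reference model.

            bets: iterable of (r, p0, occurred), where r is the number of points
            wagered on an event, p0 the reference-model probability of that
            event, and occurred a bool. A correct forecast gains r*(1-p0)/p0,
            so the expected gain is zero when the reference model is right.
            """
            total = 0.0
            for r, p0, occurred in bets:
                total += r * (1.0 - p0) / p0 if occurred else -r
            return total

        # A forecaster who bets on low-probability events and is often right.
        print(gambling_score([(1, 0.1, True), (1, 0.1, False), (1, 0.1, True)]))  # 17.0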

  7. Selecting Single Model in Combination Forecasting Based on Cointegration Test and Encompassing Test

    PubMed Central

    Jiang, Chuanjin; Zhang, Jing; Song, Fugen

    2014-01-01

    Combination forecasting takes all the characteristics of each single forecasting method into consideration and combines them to form a composite, which increases forecasting accuracy. Existing research on combination forecasting selects single models arbitrarily, neglecting the internal characteristics of the forecasting object. After discussing the role of the cointegration test and the encompassing test in the selection of single models, supplemented by empirical analysis, the paper gives guidance for single-model selection: no more than five suitable single models should be selected from the many alternative single models for a certain forecasting target, which increases accuracy and stability. PMID:24892061

  8. Selecting single model in combination forecasting based on cointegration test and encompassing test.

    PubMed

    Jiang, Chuanjin; Zhang, Jing; Song, Fugen

    2014-01-01

    Combination forecasting takes all the characteristics of each single forecasting method into consideration and combines them to form a composite, which increases forecasting accuracy. Existing research on combination forecasting selects single models arbitrarily, neglecting the internal characteristics of the forecasting object. After discussing the role of the cointegration test and the encompassing test in the selection of single models, supplemented by empirical analysis, the paper gives guidance for single-model selection: no more than five suitable single models should be selected from the many alternative single models for a certain forecasting target, which increases accuracy and stability.
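
    The cointegration screen can be sketched with the Engle-Granger test from statsmodels: a candidate single model whose forecasts share a stochastic trend with the target series yields a small p-value. The series below are synthetic stand-ins, not the paper's data.

        import numpy as np
        from statsmodels.tsa.stattools import coint

        # Two synthetic series sharing a stochastic trend (hence cointegrated),
        # standing in for the target series and one candidate model's forecasts.
        rng = np.random.default_rng(3)
        trend = np.cumsum(rng.normal(size=500))          # common random walk
        target = trend + rng.normal(scale=0.5, size=500)
        forecast = 0.9 * trend + rng.normal(scale=0.5, size=500)

        t_stat, p_value, _ = coint(target, forecast)
        print(f"Engle-Granger t = {t_stat:.2f}, p = {p_value:.4f}")
        # A small p-value supports keeping this single model in the combination.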

  9. A scoping review of malaria forecasting: past work and future directions

    PubMed Central

    Zinszer, Kate; Verma, Aman D; Charland, Katia; Brewer, Timothy F; Brownstein, John S; Sun, Zhuoyu; Buckeridge, David L

    2012-01-01

    Objectives There is a growing body of literature on malaria forecasting methods and the objective of our review is to identify and assess methods, including predictors, used to forecast malaria. Design Scoping review. Two independent reviewers searched information sources, assessed studies for inclusion and extracted data from each study. Information sources Search strategies were developed and the following databases were searched: CAB Abstracts, EMBASE, Global Health, MEDLINE, ProQuest Dissertations & Theses and Web of Science. Key journals and websites were also manually searched. Eligibility criteria for included studies We included studies that forecasted incidence, prevalence or epidemics of malaria over time. A description of the forecasting model and an assessment of the forecast accuracy of the model were requirements for inclusion. Studies were restricted to human populations and to autochthonous transmission settings. Results We identified 29 different studies that met our inclusion criteria for this review. The forecasting approaches included statistical modelling, mathematical modelling and machine learning methods. Climate-related predictors were used consistently in forecasting models, with the most common predictors being rainfall, relative humidity, temperature and the normalised difference vegetation index. Model evaluation was typically based on a reserved portion of data and accuracy was measured in a variety of ways including mean-squared error and correlation coefficients. We could not compare the forecast accuracy of models from the different studies as the evaluation measures differed across the studies. Conclusions Applying different forecasting methods to the same data, exploring the predictive ability of non-environmental variables, including transmission reducing interventions and using common forecast accuracy measures will allow malaria researchers to compare and improve models and methods, which should improve the quality of malaria forecasting. PMID:23180505

  10. Combination of synoptical-analogous and dynamical methods to increase skill score of monthly air temperature forecasts over Northern Eurasia

    NASA Astrophysics Data System (ADS)

    Khan, Valentina; Tscepelev, Valery; Vilfand, Roman; Kulikova, Irina; Kruglova, Ekaterina; Tischenko, Vladimir

    2016-04-01

    Long-range forecasts at monthly-to-seasonal time scales are in great demand among socio-economic sectors for managing climate-related risks and opportunities. At the same time, the quality of long-range forecasts does not fully meet the needs of user applications. Different approaches, including the combination of different prognostic models, are used in forecast centers to increase prediction skill for specific regions and globally. In the present study, two forecasting methods used in the operational practice of the Hydrometeorological Center of Russia are considered. One of them is a synoptical-analogous method for forecasting surface air temperature at the monthly scale. The other is a dynamical system based on the global semi-Lagrangian model SL-AV, developed in collaboration between the Institute of Numerical Mathematics and the Hydrometeorological Centre of Russia. The seasonal version of this model has been used to issue global and regional forecasts at monthly-to-seasonal time scales. This study presents results of the evaluation of surface air temperature forecasts generated using the above-mentioned synoptical-statistical and dynamical models, and their combination, to potentially increase skill scores over Northern Eurasia. The test sample of operational forecasts encompasses the period from 2010 through 2015. The seasonal and interannual variability of the skill scores of these methods is discussed. It was noted that the quality of all forecasts is highly dependent on the inertia of macro-circulation processes. The skill scores of the forecasts decrease during significant alterations of synoptic fields for both the dynamical and empirical schemes. The procedure of combining forecasts from different methods has, in some cases, demonstrated its effectiveness. This study was supported by a grant of the Russian Science Foundation (No. 14-37-00053).

  11. The Betting Odds Rating System: Using soccer forecasts to forecast soccer.

    PubMed

    Wunderlich, Fabian; Memmert, Daniel

    2018-01-01

    Betting odds are frequently found to outperform mathematical models in sports-related forecasting tasks; however, the factors contributing to betting odds are not fully traceable, and in contrast to rating-based forecasts, no straightforward measure of team-specific quality is deducible from the betting odds. The present study investigates the approach of combining the methods of mathematical models and the information included in betting odds. A soccer forecasting model based on the well-known ELO rating system and taking advantage of betting odds as a source of information is presented. Data from almost 15,000 soccer matches (seasons 2007/2008 until 2016/2017) are used, including both domestic matches (English Premier League, German Bundesliga, Spanish Primera Division and Italian Serie A) and international matches (UEFA Champions League, UEFA Europa League). The novel betting-odds-based ELO model is shown to outperform classic ELO models, thus demonstrating that betting odds prior to a match contain more relevant information than the result of the match itself. It is shown how the novel model can help to gain valuable insights into the quality of soccer teams and its development over time, thus having a practical benefit in performance analysis. Moreover, it is argued that network-based approaches might help in further improving rating and forecasting methods.

  12. The Betting Odds Rating System: Using soccer forecasts to forecast soccer

    PubMed Central

    Wunderlich, Fabian; Memmert, Daniel

    2018-01-01

    Betting odds are frequently found to outperform mathematical models in sports-related forecasting tasks; however, the factors contributing to betting odds are not fully traceable, and in contrast to rating-based forecasts, no straightforward measure of team-specific quality is deducible from the betting odds. The present study investigates the approach of combining the methods of mathematical models and the information included in betting odds. A soccer forecasting model based on the well-known ELO rating system and taking advantage of betting odds as a source of information is presented. Data from almost 15,000 soccer matches (seasons 2007/2008 until 2016/2017) are used, including both domestic matches (English Premier League, German Bundesliga, Spanish Primera Division and Italian Serie A) and international matches (UEFA Champions League, UEFA Europa League). The novel betting-odds-based ELO model is shown to outperform classic ELO models, thus demonstrating that betting odds prior to a match contain more relevant information than the result of the match itself. It is shown how the novel model can help to gain valuable insights into the quality of soccer teams and its development over time, thus having a practical benefit in performance analysis. Moreover, it is argued that network-based approaches might help in further improving rating and forecasting methods. PMID:29870554
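
    A hedged sketch of the ingredients, not the paper's actual model: a standard logistic ELO expectation and update, with the margin-corrected odds-implied probability available as an alternative target for the update. The home-advantage constant and K-factor are arbitrary assumptions.

        def elo_expected(r_home, r_away, home_adv=80.0):
            """Standard logistic ELO expectation for the home side."""
            return 1.0 / (1.0 + 10 ** (-((r_home + home_adv) - r_away) / 400.0))

        def odds_implied_home_prob(odds_home, odds_draw, odds_away):
            """Normalize inverse decimal odds to strip the bookmaker margin."""
            inv = [1.0 / o for o in (odds_home, odds_draw, odds_away)]
            return inv[0] / sum(inv)

        def elo_update(r_home, r_away, target, k=20.0):
            """Move ratings toward a target; a betting-odds-based variant can
            use the odds-implied probability instead of the 0/0.5/1 result."""
            delta = k * (target - elo_expected(r_home, r_away))
            return r_home + delta, r_away - delta

        p_home = odds_implied_home_prob(1.80, 3.60, 4.50)
        print(elo_update(1500.0, 1450.0, p_home))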

  13. Forecasting Non-Stationary Diarrhea, Acute Respiratory Infection, and Malaria Time-Series in Niono, Mali

    PubMed Central

    Medina, Daniel C.; Findley, Sally E.; Guindo, Boubacar; Doumbia, Seydou

    2007-01-01

    Background Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with diarrhea, acute respiratory infection, and malaria. With the increasing awareness that the aforementioned infectious diseases impose an enormous burden on developing countries, public health programs therein could benefit from parsimonious general-purpose forecasting methods to enhance infectious disease intervention. Unfortunately, these disease time-series often i) suffer from non-stationarity; ii) exhibit large inter-annual plus seasonal fluctuations; and, iii) require disease-specific tailoring of forecasting methods. Methodology/Principal Findings In this longitudinal retrospective (01/1996–06/2004) investigation, diarrhea, acute respiratory infection of the lower tract, and malaria consultation time-series are fitted with a general-purpose econometric method, namely the multiplicative Holt-Winters, to produce contemporaneous on-line forecasts for the district of Niono, Mali. This method accommodates seasonal, as well as inter-annual, fluctuations and produces reasonably accurate median 2- and 3-month horizon forecasts for these non-stationary time-series, i.e., 92% of the 24 time-series forecasts generated (2 forecast horizons, 3 diseases, and 4 age categories = 24 time-series forecasts) have mean absolute percentage errors circa 25%. Conclusions/Significance The multiplicative Holt-Winters forecasting method: i) performs well across diseases with dramatically distinct transmission modes and hence it is a strong general-purpose forecasting method candidate for non-stationary epidemiological time-series; ii) obliquely captures prior non-linear interactions between climate and the aforementioned disease dynamics thus, obviating the need for more complex disease-specific climate-based parametric forecasting methods in the district of Niono; furthermore, iii) readily decomposes time-series into seasonal components thereby potentially assisting with programming of public health interventions, as well as monitoring of disease dynamics modification. Therefore, these forecasts could improve infectious diseases management in the district of Niono, Mali, and elsewhere in the Sahel. PMID:18030322

  14. Forecasting non-stationary diarrhea, acute respiratory infection, and malaria time-series in Niono, Mali.

    PubMed

    Medina, Daniel C; Findley, Sally E; Guindo, Boubacar; Doumbia, Seydou

    2007-11-21

    Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with diarrhea, acute respiratory infection, and malaria. With the increasing awareness that the aforementioned infectious diseases impose an enormous burden on developing countries, public health programs therein could benefit from parsimonious general-purpose forecasting methods to enhance infectious disease intervention. Unfortunately, these disease time-series often i) suffer from non-stationarity; ii) exhibit large inter-annual plus seasonal fluctuations; and, iii) require disease-specific tailoring of forecasting methods. In this longitudinal retrospective (01/1996-06/2004) investigation, diarrhea, acute respiratory infection of the lower tract, and malaria consultation time-series are fitted with a general-purpose econometric method, namely the multiplicative Holt-Winters, to produce contemporaneous on-line forecasts for the district of Niono, Mali. This method accommodates seasonal, as well as inter-annual, fluctuations and produces reasonably accurate median 2- and 3-month horizon forecasts for these non-stationary time-series, i.e., 92% of the 24 time-series forecasts generated (2 forecast horizons, 3 diseases, and 4 age categories = 24 time-series forecasts) have mean absolute percentage errors circa 25%. The multiplicative Holt-Winters forecasting method: i) performs well across diseases with dramatically distinct transmission modes and hence it is a strong general-purpose forecasting method candidate for non-stationary epidemiological time-series; ii) obliquely captures prior non-linear interactions between climate and the aforementioned disease dynamics thus, obviating the need for more complex disease-specific climate-based parametric forecasting methods in the district of Niono; furthermore, iii) readily decomposes time-series into seasonal components thereby potentially assisting with programming of public health interventions, as well as monitoring of disease dynamics modification. Therefore, these forecasts could improve infectious diseases management in the district of Niono, Mali, and elsewhere in the Sahel.
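
    The core model is available off the shelf; a minimal sketch with statsmodels fits a multiplicative-seasonality Holt-Winters model to a synthetic monthly series (the clinic data are not reproduced here) and issues short-horizon forecasts.

        import numpy as np
        from statsmodels.tsa.holtwinters import ExponentialSmoothing

        # Synthetic monthly counts with multiplicative seasonality, standing in
        # for one of the Niono consultation series (01/1996-06/2004 = 102 months).
        rng = np.random.default_rng(4)
        t = np.arange(102)
        y = (200 + 0.8 * t) * (1.0 + 0.4 * np.sin(2 * np.pi * t / 12)) \
            * rng.lognormal(0, 0.05, size=t.size)

        fit = ExponentialSmoothing(
            y, trend="add", seasonal="mul", seasonal_periods=12
        ).fit()
        print(fit.forecast(3))   # on-line 1- to 3-month horizon forecasts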

  15. Model-free aftershock forecasts constructed from similar sequences in the past

    NASA Astrophysics Data System (ADS)

    van der Elst, N.; Page, M. T.

    2017-12-01

    The basic premise behind aftershock forecasting is that sequences in the future will be similar to those in the past. Forecast models typically use empirically tuned parametric distributions to approximate past sequences, and project those distributions into the future to make a forecast. While parametric models do a good job of describing average outcomes, they are not explicitly designed to capture the full range of variability between sequences, and can suffer from over-tuning of the parameters. In particular, parametric forecasts may produce a high rate of "surprises" - sequences that land outside the forecast range. Here we present a non-parametric forecast method that cuts out the parametric "middleman" between training data and forecast. The method is based on finding past sequences that are similar to the target sequence, and evaluating their outcomes. We quantify similarity as the Poisson probability that the observed event count in a past sequence reflects the same underlying intensity as the observed event count in the target sequence. Event counts are defined in terms of differential magnitude relative to the mainshock. The forecast is then constructed from the distribution of past sequences outcomes, weighted by their similarity. We compare the similarity forecast with the Reasenberg and Jones (RJ95) method, for a set of 2807 global aftershock sequences of M≥6 mainshocks. We implement a sequence-specific RJ95 forecast using a global average prior and Bayesian updating, but do not propagate epistemic uncertainty. The RJ95 forecast is somewhat more precise than the similarity forecast: 90% of observed sequences fall within a factor of two of the median RJ95 forecast value, whereas the fraction is 85% for the similarity forecast. However, the surprise rate is much higher for the RJ95 forecast; 10% of observed sequences fall in the upper 2.5% of the (Poissonian) forecast range. The surprise rate is less than 3% for the similarity forecast. The similarity forecast may be useful to emergency managers and non-specialists when confidence or expertise in parametric forecasting may be lacking. The method makes over-tuning impossible, and minimizes the rate of surprises. At the least, this forecast constitutes a useful benchmark for more precisely tuned parametric forecasts.
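
    One plausible reading of the similarity weighting (the exact likelihood used in the abstract may differ): weight each past sequence by the Poisson probability of its early event count under an intensity set equal to the target sequence's count, then forecast from the weighted distribution of past outcomes. All counts below are made up.

        import numpy as np
        from scipy.stats import poisson

        n_t = 7                                    # target: early aftershock count
        past_counts = np.array([2, 6, 8, 15, 7])   # same window, past sequences
        past_outcomes = np.array([5, 12, 19, 40, 16])  # e.g. later totals

        w = poisson.pmf(past_counts, mu=n_t)       # similarity weights
        w = w / w.sum()
        order = np.argsort(past_outcomes)
        print("weights:", np.round(w, 3))
        print("weighted median outcome:",
              np.interp(0.5, np.cumsum(w[order]), past_outcomes[order]))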

  16. Forecasting runout of rock and debris avalanches

    USGS Publications Warehouse

    Iverson, Richard M.; Evans, S.G.; Mugnozza, G.S.; Strom, A.; Hermanns, R.L.

    2006-01-01

    Physically based mathematical models and statistically based empirical equations each may provide useful means of forecasting runout of rock and debris avalanches. This paper compares the foundations, strengths, and limitations of a physically based model and a statistically based forecasting method, both of which were developed to predict runout across three-dimensional topography. The chief advantage of the physically based model results from its ties to physical conservation laws and well-tested axioms of soil and rock mechanics, such as the Coulomb friction rule and effective-stress principle. The output of this model provides detailed information about the dynamics of avalanche runout, at the expense of high demands for accurate input data, numerical computation, and experimental testing. In comparison, the statistical method requires relatively modest computation and no input data except identification of prospective avalanche source areas and a range of postulated avalanche volumes. Like the physically based model, the statistical method yields maps of predicted runout, but it provides no information on runout dynamics. Although the two methods differ significantly in their structure and objectives, insights gained from one method can aid refinement of the other.

  17. Discrete post-processing of total cloud cover ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Haiden, Thomas; Pappenberger, Florian

    2017-04-01

    This contribution presents an approach to post-process ensemble forecasts for the discrete and bounded weather variable of total cloud cover. Two methods for discrete statistical post-processing of ensemble predictions are tested. The first approach is based on multinomial logistic regression, the second involves a proportional odds logistic regression model. Applying them to total cloud cover raw ensemble forecasts from the European Centre for Medium-Range Weather Forecasts improves forecast skill significantly. Based on station-wise post-processing of raw ensemble total cloud cover forecasts for a global set of 3330 stations over the period from 2007 to early 2014, the more parsimonious proportional odds logistic regression model proved to slightly outperform the multinomial logistic regression model. Reference Hemri, S., Haiden, T., & Pappenberger, F. (2016). Discrete post-processing of total cloud cover ensemble forecasts. Monthly Weather Review 144, 2565-2577.
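
    The first of the two approaches can be sketched with scikit-learn's multinomial logistic regression, mapping simple ensemble summary features to probabilities of discrete cloud-cover categories; the features and synthetic data are assumptions, not ECMWF's configuration.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Synthetic training data: ensemble mean and spread vs observed okta.
        rng = np.random.default_rng(5)
        n = 2000
        ens_mean = rng.uniform(0, 8, n)        # ensemble mean, oktas
        ens_std = rng.uniform(0.2, 2.5, n)     # ensemble spread
        obs = np.clip(np.rint(ens_mean + rng.normal(0, ens_std)), 0, 8).astype(int)

        X = np.column_stack([ens_mean, ens_std])
        clf = LogisticRegression(max_iter=1000).fit(X, obs)
        print(np.round(clf.predict_proba([[6.5, 1.0]]), 3))  # probs per category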

  18. Short-term solar activity forecasting

    NASA Technical Reports Server (NTRS)

    Xie-Zhen, C.; Ai-Di, Z.

    1979-01-01

    A method of forecasting the level of activity of every active region on the surface of the Sun within one to three days is proposed in order to estimate the possibility of the occurrence of ionospheric disturbances and proton events. The forecasting method is a probabilistic process based on statistics. In many of the cases, the accuracy in predicting short-term solar activity was around 70%, although there were many false alarms.

  19. Modeling and Computing of Stock Index Forecasting Based on Neural Network and Markov Chain

    PubMed Central

    Dai, Yonghui; Han, Dongmei; Dai, Weihui

    2014-01-01

    The stock index reflects the fluctuations of the stock market. For a long time, there has been a great deal of research on stock index forecasting. However, traditional methods are limited in achieving ideal precision in a dynamic market due to the influence of many factors, such as the economic situation, policy changes, and emergency events. Therefore, approaches based on adaptive modeling and conditional probability transfer have attracted new attention from researchers. This paper presents a new forecast method combining an improved back-propagation (BP) neural network and a Markov chain, together with its modeling and computing technology. The method includes initial forecasting by the improved BP neural network, division of Markov state regions, computation of the state transition probability matrix, and adjustment of the prediction. Results of the empirical study show that this method can achieve high accuracy in stock index prediction, and it could provide a good reference for investment in the stock market. PMID:24782659
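
    A sketch of the Markov-chain adjustment stage under stated assumptions (the BP network is replaced by placeholder predictions): discretize the relative errors of the initial forecasts into states, estimate a transition matrix, and correct the next forecast by the expected relative error of the next state.

        import numpy as np

        # `pred` stands in for the initial BP-neural-network forecasts.
        rng = np.random.default_rng(6)
        actual = 3000 + np.cumsum(rng.normal(0, 20, 300))
        pred = actual * (1 + rng.normal(0, 0.01, 300))

        rel_err = (actual - pred) / actual
        edges = np.quantile(rel_err, [0.25, 0.5, 0.75])   # 4 error states
        states = np.digitize(rel_err, edges)

        K = 4
        T = np.zeros((K, K))
        for a, b in zip(states[:-1], states[1:]):         # count transitions
            T[a, b] += 1
        rows = T.sum(axis=1, keepdims=True)
        T = T / np.where(rows == 0, 1.0, rows)

        centers = np.array([rel_err[states == k].mean() for k in range(K)])
        adjustment = T[states[-1]] @ centers              # expected next error
        print(f"adjusted next forecast = {pred[-1] * (1 + adjustment):.1f}")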

  20. Forecasting hotspots using predictive visual analytics approach

    DOEpatents

    Maciejewski, Ross; Hafen, Ryan; Rudolph, Stephen; Cleveland, William; Ebert, David

    2014-12-30

    A method for forecasting hotspots is provided. The method may include the steps of receiving input data at an input of the computational device, generating a temporal prediction based on the input data, generating a geospatial prediction based on the input data, and generating output data based on the time series and geospatial predictions. The output data may be configured to display at least one user interface at an output of the computational device.

  1. Assessment of GNSS-based height data of multiple ships for measuring and forecasting great tsunamis

    NASA Astrophysics Data System (ADS)

    Inazu, Daisuke; Waseda, Takuji; Hibiya, Toshiyuki; Ohta, Yusaku

    2016-12-01

    Ship height positioning by the Global Navigation Satellite System (GNSS) was investigated for measuring and forecasting great tsunamis. We first examined GNSS height-positioning data of a navigating vessel. If we use the kinematic precise point positioning (PPP) method, tsunamis greater than 10⁻¹ m will be detected by ship height positioning. Based on Automatic Identification System (AIS) data, we found that tens of cargo ships and tankers are usually identified to navigate over the Nankai Trough, southwest Japan. We assumed that a future Nankai Trough great earthquake tsunami will be observed by the kinematic PPP height positioning of an AIS-derived ship distribution, and examined the tsunami forecast capability of the offshore tsunami measurements based on the PPP-based ship height. A method to estimate the initial tsunami height distribution using offshore tsunami observations was used for forecasting. Tsunami forecast tests were carried out using simulated tsunami data by the PPP-based ship height of 92 cargo ships/tankers, and by currently operating deep-sea pressure and Global Positioning System (GPS) buoy observations at 71 stations over the Nankai Trough. The forecast capability using the PPP-based height of the 92 ships was shown to be comparable to or better than that using the operating offshore observatories at the 71 stations. We suppose that, immediately after the occurrence of a great earthquake, stations receiving successive ship information (AIS data) along certain areas of the coast would fail to acquire ship data due to strong ground shaking, especially near the epicenter. Such a situation would significantly deteriorate the tsunami-forecast capability using ship data. On the other hand, operational real-time analysis of seismic/geodetic data would be carried out for estimating a tsunamigenic fault model. Incorporating the seismic/geodetic fault model estimation into the tsunami forecast above possibly compensates for the deteriorated forecast capability.

  2. Ensemble-based methods for forecasting census in hospital units

    PubMed Central

    2013-01-01

    Background The ability to accurately forecast census counts in hospital departments has considerable implications for hospital resource allocation. In recent years several different methods have been proposed for forecasting census counts; however, many of these approaches do not use available patient-specific information. Methods In this paper we present an ensemble-based methodology for forecasting the census under a framework that simultaneously incorporates both (i) arrival trends over time and (ii) patient-specific baseline and time-varying information. The proposed model for predicting census has three components, namely: current census count, number of daily arrivals and number of daily departures. To model the number of daily arrivals, we use a seasonality adjusted Poisson Autoregressive (PAR) model where the parameter estimates are obtained via conditional maximum likelihood. The number of daily departures is predicted by modeling the probability of departure from the census using logistic regression models that are adjusted for the amount of time spent in the census and incorporate both patient-specific baseline and time varying patient-specific covariate information. We illustrate our approach using neonatal intensive care unit (NICU) data collected at Women & Infants Hospital, Providence RI, which consists of 1001 consecutive NICU admissions between April 1st 2008 and March 31st 2009. Results Our results demonstrate statistically significant improved prediction accuracy for 3, 5, and 7 day census forecasts and increased precision of our forecasting model compared to a forecasting approach that ignores patient-specific information. Conclusions Forecasting models that utilize patient-specific baseline and time-varying information make the most of data typically available and have the capacity to substantially improve census forecasts. PMID:23721123
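
    The arrivals component can be sketched as a Poisson regression of daily admission counts on the previous day's count plus day-of-week indicators; this GLM stand-in approximates the seasonality-adjusted PAR model (which the authors fit by conditional maximum likelihood) on synthetic data.

        import numpy as np
        import statsmodels.api as sm

        # Synthetic daily admissions with a weekday effect and lag dependence.
        rng = np.random.default_rng(7)
        n = 400
        dow = np.arange(n) % 7
        y = rng.poisson(4 + (dow < 5) * 2)            # more weekday arrivals
        y_lag = np.roll(y, 1); y_lag[0] = y.mean()

        X = np.column_stack([y_lag] + [(dow == d).astype(float) for d in range(6)])
        X = sm.add_constant(X)
        res = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        print(np.round(res.params, 3))                # lag and weekday effects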

  3. Probabilistic precipitation nowcasting based on an extrapolation of radar reflectivity and an ensemble approach

    NASA Astrophysics Data System (ADS)

    Sokol, Zbyněk; Mejsnar, Jan; Pop, Lukáš; Bližňák, Vojtěch

    2017-09-01

    A new method for the probabilistic nowcasting of instantaneous rain rates (ENS) based on the ensemble technique and extrapolation along Lagrangian trajectories of current radar reflectivity is presented. Assuming inaccurate forecasts of the trajectories, an ensemble of precipitation forecasts is calculated and used to estimate the probability that rain rates will exceed a given threshold in a given grid point. Although the extrapolation neglects the growth and decay of precipitation, their impact on the probability forecast is taken into account by the calibration of forecasts using the reliability component of the Brier score (BS). ENS forecasts the probability that the rain rates will exceed thresholds of 0.1, 1.0 and 3.0 mm/h in squares of 3 km by 3 km. The lead times were up to 60 min, and the forecast accuracy was measured by the BS. The ENS forecasts were compared with two other methods: combined method (COM) and neighbourhood method (NEI). NEI considered the extrapolated values in the square neighbourhood of 5 by 5 grid points of the point of interest as ensemble members, and the COM ensemble was comprised of united ensemble members of ENS and NEI. The results showed that the calibration technique significantly reduces the bias of the probability forecasts by including additional uncertainties that correspond to neglected processes during the extrapolation. In addition, the calibration can also be used for finding the limits of maximum lead times for which the forecasting method is useful. We found that ENS is useful for lead times up to 60 min for thresholds of 0.1 and 1 mm/h and approximately 30 to 40 min for a threshold of 3 mm/h. We also found that a reasonable size of the ensemble is 100 members, which provided better scores than ensembles with 10, 25 and 50 members. In terms of the BS, the best results were obtained by ENS and COM, which are comparable. However, ENS is better calibrated and thus preferable.
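
    The calibration criterion can be made concrete by computing the Brier score and its reliability component (from the Murphy decomposition) over probability bins; the binning choices and synthetic data below are illustrative, not the study's.

        import numpy as np

        def brier_decomposition(p, o, n_bins=10):
            """Brier score and its reliability component (Murphy decomposition).

            p: forecast probabilities in [0, 1]; o: binary outcomes (0/1).
            Reliability is the weighted squared gap between each bin's mean
            forecast probability and its observed frequency.
            """
            p, o = np.asarray(p, float), np.asarray(o, float)
            bs = np.mean((p - o) ** 2)
            bins = np.minimum((p * n_bins).astype(int), n_bins - 1)
            rel = 0.0
            for k in range(n_bins):
                m = bins == k
                if m.any():
                    rel += m.mean() * (p[m].mean() - o[m].mean()) ** 2
            return bs, rel

        rng = np.random.default_rng(8)
        p = rng.uniform(size=5000)
        o = (rng.uniform(size=5000) < p ** 1.3).astype(int)  # miscalibrated
        print(brier_decomposition(p, o))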

  4. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan

    2015-08-05

    Accurate solar power forecasting allows utilities to get the most out of the solar resources on their systems. To truly measure the improvements that any new solar forecasting methods can provide, it is important to first develop (or determine) baseline and target solar forecasting values at different spatial and temporal scales. This paper aims to develop baseline and target values for solar forecasting metrics. These were informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of solar power output.

  5. Replacing climatological potential evapotranspiration estimates with dynamic satellite-based observations in operational hydrologic prediction models

    NASA Astrophysics Data System (ADS)

    Franz, K. J.; Bowman, A. L.; Hogue, T. S.; Kim, J.; Spies, R.

    2011-12-01

    In the face of a changing climate, growing populations, and increased human habitation in hydrologically risky locations, both short- and long-range planners increasingly require robust and reliable streamflow forecast information. Current operational forecasting utilizes watershed-scale, conceptual models driven by ground-based (commonly point-scale) observations of precipitation and temperature and climatological potential evapotranspiration (PET) estimates. The PET values are derived from historic pan evaporation observations and remain static from year to year. Regional, dynamic PET values are vital for improved operational forecasting. With the advent of satellite remote sensing and the adoption of a more flexible operational forecast system by the National Weather Service, incorporation of advanced data products is now more feasible than in years past. In this study, we will test a previously developed satellite-derived PET product (UCLA MODIS-PET) in the National Weather Service forecast models and compare the model results to current methods. The UCLA MODIS-PET method is based on the Priestley-Taylor formulation, is driven with MODIS satellite products, and produces a daily, 250 m PET estimate. The focus area is eight headwater basins in the upper Midwest U.S. There is a need to develop improved forecasting methods for this region that are able to account for climatic and landscape changes more readily and effectively than current methods. This region is highly flood prone yet sensitive to prolonged dry periods in late summer and early fall, and is characterized by a highly managed landscape, which has drastically altered the natural hydrologic cycle. Our goal is to improve model simulations, and thereby, the initial conditions prior to the start of a forecast through the use of PET values that better reflect actual watershed conditions. The forecast models are being tested in both distributed and lumped mode.
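    For reference, the Priestley-Taylor formulation on which the UCLA MODIS-PET product is based can be written in a few lines; the units and the value of the alpha coefficient below are the conventional ones, not necessarily those of the product itself:

```python
def priestley_taylor_pet(delta, gamma, rn, g, alpha=1.26, lam=2.45):
    """Priestley-Taylor potential evapotranspiration (mm/day):
    PET = alpha * (delta / (delta + gamma)) * (Rn - G) / lambda,
    where delta is the slope of the saturation vapour pressure curve (kPa/C),
    gamma the psychrometric constant (kPa/C), Rn net radiation and G soil heat
    flux (MJ m-2 day-1), and lambda the latent heat of vaporization (MJ/kg)."""
    return alpha * (delta / (delta + gamma)) * (rn - g) / lam

print(priestley_taylor_pet(delta=0.145, gamma=0.066, rn=12.0, g=0.8))
```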

  6. Comparative assessment of several post-processing methods for correcting evapotranspiration forecasts derived from TIGGE datasets.

    NASA Astrophysics Data System (ADS)

    Tian, D.; Medina, H.

    2017-12-01

    Post-processing of medium range reference evapotranspiration (ETo) forecasts based on numerical weather prediction (NWP) models has the potential of improving the quality and utility of these forecasts. This work compares the performance of several post-processing methods for correcting ETo forecasts over the continental U.S. generated from The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) database using data from Europe (EC), the United Kingdom (MO), and the United States (NCEP). The post-processing techniques considered are: simple bias correction, the use of multimodels, Ensemble Model Output Statistics (EMOS, Gneiting et al., 2005) and Bayesian Model Averaging (BMA, Raftery et al., 2005). ETo estimates based on quality-controlled U.S. Regional Climate Reference Network measurements, and computed with the FAO-56 Penman-Monteith equation, are adopted as the baseline. EMOS and BMA are generally the most efficient post-processing techniques for the ETo forecasts. Nevertheless, the simple bias correction of the best model is commonly much more rewarding than using multimodel raw forecasts. Our results demonstrate the potential of different forecasting and post-processing frameworks in operational evapotranspiration and irrigation advisory systems at the national scale.
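    Of the techniques compared, the simple bias correction is the easiest to reproduce: subtract the mean training-period error from each new forecast. A minimal sketch, assuming paired training forecasts and Penman-Monteith baseline values:

```python
import numpy as np

def simple_bias_correction(train_fcst, train_obs, new_fcst):
    """Additive bias correction: remove the mean (forecast - baseline) error
    estimated on a training period from new ETo forecasts."""
    bias = np.mean(np.asarray(train_fcst) - np.asarray(train_obs))
    return np.asarray(new_fcst) - bias
```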

  7. Assessing the skill of seasonal precipitation and streamflow forecasts in sixteen French catchments

    NASA Astrophysics Data System (ADS)

    Crochemore, Louise; Ramos, Maria-Helena; Pappenberger, Florian

    2015-04-01

    Meteorological centres make sustained efforts to provide seasonal forecasts that are increasingly skilful. Streamflow forecasting is one of the many applications that can benefit from these efforts. Seasonal flow forecasts generated using seasonal ensemble precipitation forecasts as input to a hydrological model can help to take anticipatory measures for water supply reservoir operation or drought risk management. The objective of the study is to assess the skill of seasonal precipitation and streamflow forecasts in France. First, we evaluated the skill of ECMWF SYS4 seasonal precipitation forecasts for streamflow forecasting in sixteen French catchments. Daily flow forecasts were produced using raw seasonal precipitation forecasts as input to the GR6J hydrological model. Ensemble forecasts are issued every month with 15 or 51 members according to the month of the year and evaluated for up to 90 days ahead. In a second step, we applied eight variants of bias correction approaches to the precipitation forecasts prior to generating the flow forecasts. The approaches were based on the linear scaling and the distribution mapping methods. The skill of the ensemble forecasts was assessed in terms of accuracy (MAE), reliability (PIT diagram) and overall performance (CRPS). The results show that, in most catchments, raw seasonal precipitation and streamflow forecasts are more skilful in terms of accuracy and overall performance than a reference prediction based on historic observed precipitation and watershed initial conditions at the time of forecast. Reliability is the only attribute that is not significantly improved. The skill of the forecasts is, in general, improved when applying bias correction. Two bias correction methods showed the best performance for the studied catchments: the simple linear scaling of monthly values and the empirical distribution mapping of daily values. L. Crochemore is funded by the Interreg IVB DROP Project (Benefit of governance in DROught adaPtation).
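    The linear scaling approach mentioned above amounts to rescaling each month's forecast precipitation by the ratio of observed to forecast climatology. A sketch, assuming paired historical series as NumPy arrays:

```python
import numpy as np

def linear_scaling_factors(obs, fcst, months):
    """Monthly correction factors: mean observed / mean forecast precipitation."""
    return {m: obs[months == m].mean() / fcst[months == m].mean()
            for m in np.unique(months)}

def apply_linear_scaling(fcst, months, factors):
    """Multiply each forecast value by its month's correction factor."""
    return np.array([f * factors[m] for f, m in zip(fcst, months)])
```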

  8. 3D cloud detection and tracking system for solar forecast using multiple sky imagers

    DOE PAGES

    Peng, Zhenzhou; Yu, Dantong; Huang, Dong; ...

    2015-06-23

    We propose a system for forecasting short-term solar irradiance based on multiple total sky imagers (TSIs). The system utilizes a novel method of identifying and tracking clouds in three-dimensional space and an innovative pipeline for forecasting surface solar irradiance based on the image features of clouds. First, we develop a supervised classifier to detect clouds at the pixel level and output a cloud mask. In the next step, we design intelligent algorithms to estimate the block-wise base height and motion of each cloud layer based on images from multiple TSIs. This information is then applied to stitch images together into larger views, which are then used for solar forecasting. We examine the system's ability to track clouds under various cloud conditions and investigate different irradiance forecast models at various sites. We confirm that this system can 1) robustly detect clouds and track layers, and 2) extract the significant global and local features for obtaining stable irradiance forecasts with short forecast horizons from the obtained images. Finally, we vet our forecasting system at the 32-megawatt Long Island Solar Farm (LISF). Compared with the persistence model, our system achieves at least a 26% improvement for all irradiance forecasts between one and fifteen minutes.

  9. Baseline and target values for regional and point PV power forecasts: Toward improved solar forecasting

    DOE PAGES

    Zhang, Jie; Hodge, Bri -Mathias; Lu, Siyuan; ...

    2015-11-10

    Accurate solar photovoltaic (PV) power forecasting allows utilities to reliably utilize solar resources on their systems. However, to truly measure the improvements that any new solar forecasting methods provide, it is important to develop a methodology for determining baseline and target values for the accuracy of solar forecasting at different spatial and temporal scales. This paper aims at developing a framework to derive baseline and target values for a suite of generally applicable, value-based, and custom-designed solar forecasting metrics. The work was informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models in combination with a radiative transfer model. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of PV power output. The proposed reserve-based methodology is a reasonable and practical approach that can be used to assess the economic benefits gained from improvements in accuracy of solar forecasting. Lastly, the financial baseline and targets can be translated back to forecasting accuracy metrics and requirements, which will guide research on solar forecasting improvements toward the areas that are most beneficial to power systems operations.

  10. Why preferring parametric forecasting to nonparametric methods?

    PubMed

    Jabot, Franck

    2015-05-07

    A recent series of papers by Charles T. Perretti and collaborators has shown that nonparametric forecasting methods can outperform parametric methods in noisy nonlinear systems. Such a situation can arise for two main reasons: the instability of parametric inference procedures in chaotic systems, which can lead to biased parameter estimates, and the discrepancy between the real system dynamics and the modeled one, a problem that Perretti and collaborators call "the true model myth". Should ecologists go on using the demanding parametric machinery when trying to forecast the dynamics of complex ecosystems? Or should they rely on the elegant nonparametric approach that appears so promising? It is argued here that ecological forecasting based on parametric models presents two key comparative advantages over nonparametric approaches. First, the likelihood of parametric forecasting failure can be diagnosed thanks to simple Bayesian model checking procedures. Second, when parametric forecasting is diagnosed to be reliable, forecasting uncertainty can be estimated on virtual data generated with the parametric model fitted to the data. In contrast, nonparametric techniques provide forecasts with unknown reliability. This argumentation is illustrated with the simple theta-logistic model that was previously used by Perretti and collaborators to make their point. It should convince ecologists to stick to standard parametric approaches, until methods have been developed to assess the reliability of nonparametric forecasting. Copyright © 2015 Elsevier Ltd. All rights reserved.
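    The theta-logistic model used in this argument is easy to simulate, which is also how forecasting uncertainty can be estimated on virtual data once the model has been fitted. A sketch with illustrative parameter values:

```python
import numpy as np

def theta_logistic(n0, r, k, theta, sigma, steps, rng):
    """Simulate N(t+1) = N(t) * exp(r * (1 - (N(t)/K)**theta) + eps),
    with lognormal process noise eps ~ Normal(0, sigma^2)."""
    n = np.empty(steps)
    n[0] = n0
    for t in range(steps - 1):
        eps = rng.normal(0.0, sigma)
        n[t + 1] = n[t] * np.exp(r * (1.0 - (n[t] / k) ** theta) + eps)
    return n

print(theta_logistic(n0=50, r=1.8, k=100, theta=1.0, sigma=0.1,
                     steps=10, rng=np.random.default_rng(0)))
```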

  11. Application and evaluation of forecasting methods for municipal solid waste generation in an Eastern-European city.

    PubMed

    Rimaityte, Ingrida; Ruzgas, Tomas; Denafas, Gintaras; Racys, Viktoras; Martuzevicius, Dainius

    2012-01-01

    Forecasting of the generation of municipal solid waste (MSW) in developing countries is often a challenging task due to the lack of data and the difficulty of selecting a suitable forecasting method. This article aimed to select and evaluate several methods for MSW forecasting in a medium-scale Eastern European city (Kaunas, Lithuania) with rapidly developing economics, with respect to affluence-related and seasonal impacts. The MSW generation was forecast with respect to the economic activity of the city (regression modelling) and using time series analysis. The modelling based on socio-economic indicators (regression implemented in the LCA-IWM model) showed particular sensitivity (deviation from actual data in the range from 2.2 to 20.6%) to external factors, such as the synergetic effects of affluence parameters or changes in the MSW collection system. For the time series analysis, the combination of autoregressive integrated moving average (ARIMA) and seasonal exponential smoothing (SES) techniques was found to be the most accurate (mean absolute percentage error of 6.5%). The time series analysis method was very valuable for forecasting the weekly variation of waste generation data (r² > 0.87), but the forecast yearly increase should be verified against the data obtained by regression modelling. The methods and findings of this study may assist experts, decision-makers and scientists performing forecasts of MSW generation, especially in developing countries.
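    The accuracy criterion quoted above, mean absolute percentage error, is computed as follows (a plain-Python sketch):

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - f) / a)
                       for a, f in zip(actual, forecast)) / len(actual)

print(mape([100, 120, 90], [104, 110, 95]))  # approximately 6.0
```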

  12. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan

    2015-10-05

    Accurate solar power forecasting allows utilities to get the most out of the solar resources on their systems. To truly measure the improvements that any new solar forecasting methods can provide, it is important to first develop (or determine) baseline and target solar forecasting values at different spatial and temporal scales. This paper aims to develop baseline and target values for solar forecasting metrics. These were informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of solar power output.

  13. Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.

    2013-12-01

    Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based' models) to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
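    Schematically, a natural-time hazard estimate of this kind can be expressed as a Weibull distribution over the count of small earthquakes elapsed since the last large one; the sketch below is a loose reading of that idea with hypothetical parameters, not the published NTW formulation:

```python
import numpy as np

def large_event_probability(n_small, n_scale, beta):
    """Weibull CDF in 'natural time': probability that the next large
    earthquake has occurred by the time n_small small events have elapsed.
    n_scale (scale) and beta (shape) are hypothetical fitted parameters."""
    return 1.0 - np.exp(-(n_small / n_scale) ** beta)

print(large_event_probability(n_small=120, n_scale=200, beta=1.4))
```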

  14. A New Multivariate Approach in Generating Ensemble Meteorological Forcings for Hydrological Forecasting

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Moradkhani, Hamid

    2015-04-01

    Reliable and accurate hydrologic ensemble forecasts are subject to various sources of uncertainty, including meteorological forcing, initial conditions, model structure, and model parameters. Producing reliable and skillful precipitation ensemble forecasts is one approach to reduce the total uncertainty in hydrological applications. Currently, numerical weather prediction (NWP) models are developing ensemble forecasts for various temporal ranges. Raw products from NWP models are known to be biased in mean and spread. Given this, there is a need for methods that are able to generate reliable ensemble forecasts for hydrological applications. One of the common techniques is to apply statistical procedures in order to generate an ensemble forecast from NWP-generated single-value forecasts. The procedure is based on the bivariate probability distribution between the observation and the single-value precipitation forecast. However, one of the assumptions of the current method is that a Gaussian distribution fits the marginal distributions of the observed and modeled climate variable. Here, we describe and evaluate a Bayesian approach based on copula functions to develop an ensemble precipitation forecast from the conditional distribution of single-value precipitation forecasts. Copula functions join univariate marginal distributions into a multivariate joint distribution and are presented as an alternative procedure for capturing the uncertainties related to meteorological forcing. Copulas are capable of modeling the joint distribution of two variables with any level of correlation and dependency. This study is conducted over a sub-basin in the Columbia River Basin in the USA using the monthly precipitation forecasts from the Climate Forecast System (CFS) with 0.5° × 0.5° spatial resolution to reproduce the observations. The verification is conducted on a different period, and the procedure's superiority is assessed against the Ensemble Pre-Processor approach currently used by National Weather Service River Forecast Centers in the USA.
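    A bivariate Gaussian copula version of this idea conditions the observation's distribution on the single-value forecast through rank (quantile) transforms. A sketch, with the copula correlation rho assumed already estimated:

```python
import numpy as np
from scipy import stats

def copula_ensemble(fcst_value, fcst_hist, obs_hist, rho, n_members, rng):
    """Sample an ensemble from P(obs | forecast) under a bivariate Gaussian
    copula; marginals are handled empirically, so no Gaussian assumption is
    placed on precipitation itself."""
    u_f = stats.percentileofscore(fcst_hist, fcst_value) / 100.0
    z_f = stats.norm.ppf(np.clip(u_f, 0.01, 0.99))      # forecast's normal score
    # conditional Gaussian: mean rho * z_f, variance 1 - rho**2
    z_o = rng.normal(rho * z_f, np.sqrt(1.0 - rho ** 2), size=n_members)
    # map normal scores back through the empirical observation quantiles
    return np.quantile(obs_hist, stats.norm.cdf(z_o))
```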

  15. Adaptive correction of ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Pelosi, Anna; Battista Chirico, Giovanni; Van den Bergh, Joris; Vannitsem, Stephane

    2017-04-01

    Forecasts from numerical weather prediction (NWP) models often suffer from both systematic and non-systematic errors. These are present in both deterministic and ensemble forecasts, and originate from various sources such as model error and subgrid variability. Statistical post-processing techniques can partly remove such errors, which is particularly important when NWP outputs concerning surface weather variables are employed for site-specific applications. Many different post-processing techniques have been developed. For deterministic forecasts, adaptive methods such as the Kalman filter are often used, which sequentially post-process the forecasts by continuously updating the correction parameters as new ground observations become available. These methods are especially valuable when long training data sets do not exist. For ensemble forecasts, well-known techniques are ensemble model output statistics (EMOS) and so-called "member-by-member" approaches (MBM). Here, we introduce a new adaptive post-processing technique for ensemble predictions. The proposed method is a sequential Kalman filtering technique that fully exploits the information content of the ensemble. One correction equation is retrieved and applied to all members; however, the parameters of the regression equations are retrieved by exploiting the second-order statistics of the forecast ensemble. We compare our new method with two other techniques: a simple method that makes use of a running bias correction of the ensemble mean, and an MBM post-processing approach that rescales the ensemble mean and spread, based on minimization of the Continuous Ranked Probability Score (CRPS). We perform a verification study for the region of Campania in southern Italy. We use two years (2014-2015) of daily meteorological observations of 2-meter temperature and 10-meter wind speed from 18 ground-based automatic weather stations distributed across the region, comparing them with the corresponding COSMO-LEPS ensemble forecasts. Deterministic verification scores (e.g., mean absolute error, bias) and probabilistic scores (e.g., CRPS) are used to evaluate the post-processing techniques. We conclude that the new adaptive method outperforms the simpler running bias correction. The proposed adaptive method often outperforms the MBM method in removing bias. The MBM method has the advantage of correcting the ensemble spread, although it needs more training data.
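    The sequential idea behind such adaptive corrections can be illustrated with a scalar Kalman filter that tracks the ensemble-mean bias and applies one correction to all members; this is a simplified stand-in for the proposed method, which also exploits the ensemble's second-order statistics:

```python
class KalmanBiasCorrector:
    """Scalar Kalman filter tracking the additive bias of the ensemble mean."""

    def __init__(self, q=0.01, r=1.0):
        self.bias, self.p = 0.0, 1.0   # bias estimate and its variance
        self.q, self.r = q, r          # process / observation noise variances

    def update(self, ens_mean, obs):
        """Sequentially update the bias as a new observation arrives."""
        self.p += self.q                      # predict step
        gain = self.p / (self.p + self.r)     # Kalman gain
        self.bias += gain * ((ens_mean - obs) - self.bias)
        self.p *= 1.0 - gain

    def correct(self, members):
        """Apply the same additive correction to every ensemble member."""
        return [m - self.bias for m in members]
```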

  16. Improving flood forecasting capability of physically based distributed hydrological model by parameter optimization

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Li, J.; Xu, H.

    2015-10-01

    Physically based distributed hydrological models discretize the terrain of the whole catchment into a number of grid cells at fine resolution, assimilate different terrain data and precipitation to different cells, and are regarded to have the potential to improve the simulation and prediction of catchment hydrological processes. In the early stage, physically based distributed hydrological models were assumed to derive model parameters from the terrain properties directly, so there was no need to calibrate model parameters; unfortunately, the uncertainties associated with this parameter derivation are very high, which has impacted their application in flood forecasting, so parameter optimization may also be necessary. There are two main purposes for this study: the first is to propose a parameter optimization method for physically based distributed hydrological models in catchment flood forecasting by using the PSO algorithm, and to test its competence and improve its performance; the second is to explore the possibility of improving the capability of physically based distributed hydrological models in catchment flood forecasting by parameter optimization. In this paper, based on the scalar concept, a general framework for parameter optimization of the PBDHMs for catchment flood forecasting is first proposed that could be used for all PBDHMs. Then, with the Liuxihe model as the study model, which is a physically based distributed hydrological model proposed for catchment flood forecasting, an improved Particle Swarm Optimization (PSO) algorithm is developed for the parameter optimization of the Liuxihe model in catchment flood forecasting; the improvements include adopting the linearly decreasing inertia weight strategy to change the inertia weight, and the arccosine function strategy to adjust the acceleration coefficients. This method has been tested in two catchments in southern China with different sizes, and the results show that the improved PSO algorithm could be used for Liuxihe model parameter optimization effectively and could largely improve the model capability in catchment flood forecasting, thus proving that parameter optimization is necessary to improve the flood forecasting capability of physically based distributed hydrological models. It has also been found that the appropriate particle number and maximum evolution number of the PSO algorithm used for Liuxihe model catchment flood forecasting are 20 and 30, respectively.
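    A compact sketch of PSO with the linearly decreasing inertia weight described above (the arccosine adjustment of the acceleration coefficients is omitted here; fixed c1 = c2 = 2.0 is an assumption), using the particle number and evolution number the study found appropriate:

```python
import numpy as np

def pso_minimize(f, lo, hi, n_particles=20, n_iter=30, seed=0):
    """Particle swarm optimization with inertia weight decreasing linearly
    from 0.9 to 0.4 over the iterations."""
    rng = np.random.default_rng(seed)
    dim = len(lo)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[pval.argmin()].copy()
    for it in range(n_iter):
        w = 0.9 - (0.9 - 0.4) * it / max(n_iter - 1, 1)   # linear decrease
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + 2.0 * r1 * (pbest - x) + 2.0 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([f(p) for p in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        gbest = pbest[pval.argmin()].copy()
    return gbest, float(pval.min())

# toy usage: minimize the sphere function in place of a model's error metric
best, err = pso_minimize(lambda p: float((p ** 2).sum()),
                         lo=np.array([-5.0, -5.0]), hi=np.array([5.0, 5.0]))
print(best, err)
```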

  17. A Condition Based Maintenance Approach to Forecasting B-1 Aircraft Parts

    DTIC Science & Technology

    2017-03-23

    Problem Statement: The USAF relies on... This research will seek to highlight common condition-based maintenance (CBM) forecasting methods that are well established and to evaluate their suitability with current USAF practice, with the aim of making the USAF aware of CBM methods and recommending which techniques to consider for implementation.

  18. Forecasting Lightning Threat using Cloud-resolving Model Simulations

    NASA Technical Reports Server (NTRS)

    McCaul, E. W., Jr.; Goodman, S. J.; LaCasse, K. M.; Cecil, D. J.

    2009-01-01

    As numerical forecasts capable of resolving individual convective clouds become more common, it is of interest to see if quantitative forecasts of lightning flash rate density are possible, based on fields computed by the numerical model. Previous observational research has shown robust relationships between observed lightning flash rates and inferred updraft and large precipitation ice fields in the mixed phase regions of storms, and that these relationships might allow simulated fields to serve as proxies for lightning flash rate density. It is shown in this paper that two simple proxy fields do indeed provide reasonable and cost-effective bases for creating time-evolving maps of predicted lightning flash rate density, judging from a series of diverse simulation case study events in North Alabama for which Lightning Mapping Array data provide ground truth. One method is based on the product of upward velocity and the mixing ratio of precipitating ice hydrometeors, modeled as graupel only, in the mixed phase region of storms at the -15 °C level, while the second method is based on the vertically integrated amounts of ice hydrometeors in each model grid column. Each method can be calibrated by comparing domainwide statistics of the peak values of simulated flash rate proxy fields against domainwide peak total lightning flash rate density data from observations. Tests show that the first method is able to capture much of the temporal variability of the lightning threat, while the second method does a better job of depicting the areal coverage of the threat. A blended solution is designed to retain most of the temporal sensitivity of the first method, while adding the improved spatial coverage of the second. Weather Research and Forecast Model simulations of selected North Alabama cases show that this model can distinguish the general character and intensity of most convective events, and that the proposed methods show promise as a means of generating quantitatively realistic fields of lightning threat. However, because models tend to have more difficulty in correctly predicting the instantaneous placement of storms, forecasts of the detailed location of the lightning threat based on single simulations can be in error. Although these model shortcomings presently limit the precision of lightning threat forecasts from individual runs of current generation models, the techniques proposed herein should continue to be applicable as newer and more accurate physically-based model versions, physical parameterizations, initialization techniques and ensembles of cloud-allowing forecasts become available.
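    In pseudocode terms, the two proxies and their blend reduce to a weighted sum; the calibration constants and blend weights below are placeholders to be fit against observed peak flash-rate densities, as the paper describes:

```python
def lightning_threat(w_m15, q_graupel_m15, ice_column,
                     k1=1.0, k2=1.0, w1=0.95, w2=0.05):
    """Blend of the two proxy fields: (1) updraft times graupel mixing ratio
    at the -15 C level (temporal variability) and (2) vertically integrated
    ice (areal coverage). k1, k2, w1, w2 are illustrative calibration values."""
    f1 = k1 * w_m15 * q_graupel_m15
    f2 = k2 * ice_column
    return w1 * f1 + w2 * f2
```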

  19. Some Advances in Downscaling Probabilistic Climate Forecasts for Agricultural Decision Support

    NASA Astrophysics Data System (ADS)

    Han, E.; Ines, A.

    2015-12-01

    Seasonal climate forecasts, commonly provided in tercile-probabilities format (below-, near- and above-normal), need to be translated into more meaningful information for decision support of practitioners in agriculture. In this paper, we will present two novel approaches to temporally downscale probabilistic seasonal climate forecasts: one non-parametric and one parametric method. First, the non-parametric downscaling approach called FResampler1 uses the concept of 'conditional block sampling' of weather data to create daily weather realizations of a tercile-based seasonal climate forecast. FResampler1 randomly draws time series of daily weather parameters (e.g., rainfall, maximum and minimum temperature and solar radiation) from historical records, for the season of interest, from years that belong to a certain rainfall tercile category (e.g., below-, near- or above-normal). In this way, FResampler1 preserves the covariance between rainfall and other weather parameters, conditionally sampling maximum and minimum temperature and solar radiation on whether that day is wet or dry. The second approach, called predictWTD, is a parametric method based on a conditional stochastic weather generator. The tercile-based seasonal climate forecast is converted into a theoretical forecast cumulative probability curve. Then the deviates for each percentile are converted into rainfall amount, frequency or intensity to downscale the 'full' distribution of probabilistic seasonal climate forecasts. Those seasonal deviates are then disaggregated on a monthly basis and used to constrain the downscaling of forecast realizations at different percentile values of the theoretical forecast curve. Alongside the theoretical basis of the approaches, we will discuss their sensitivity to the length of data and the size of samples. In addition, their potential applications for managing climate-related risks in agriculture will be shown through a couple of case studies based on actual seasonal climate forecasts for rice cropping in the Philippines and maize cropping in India and Kenya.
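    The conditional block sampling idea of FResampler1 can be sketched as follows: whole season-years are drawn from the historical record, with each draw's tercile category chosen according to the forecast probabilities, so the covariance among daily variables is preserved automatically:

```python
import numpy as np

def fresampler1(daily_weather, year_tercile, forecast_probs,
                n_realizations, seed=0):
    """daily_weather: dict year -> array of daily weather rows (rainfall,
    Tmax, Tmin, radiation kept together); year_tercile: dict year -> tercile
    label; forecast_probs: e.g. {'below': 0.2, 'near': 0.3, 'above': 0.5}."""
    rng = np.random.default_rng(seed)
    categories = list(forecast_probs)
    probs = [forecast_probs[c] for c in categories]
    out = []
    for _ in range(n_realizations):
        cat = rng.choice(categories, p=probs)
        candidates = [y for y, t in year_tercile.items() if t == cat]
        out.append(daily_weather[int(rng.choice(candidates))])
    return out
```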

  20. Forecasting hotspots in East Kutai, Kutai Kartanegara, and West Kutai as early warning information

    NASA Astrophysics Data System (ADS)

    Wahyuningsih, S.; Goejantoro, R.; Rizki, N. A.

    2018-04-01

    The aims of this research are to model hotspots and to forecast hotspots in 2017 in East Kutai, Kutai Kartanegara and West Kutai. The methods used in this research were Holt exponential smoothing, Holt's additive damped trend method, Holt-Winters' additive method, the additive decomposition method, the multiplicative decomposition method, the Loess decomposition method and the Box-Jenkins method. Among the smoothing techniques, additive decomposition performed better than Holt's exponential smoothing. The hotspot models obtained using the Box-Jenkins method were the Autoregressive Integrated Moving Average models ARIMA(1,1,0), ARIMA(0,2,1) and ARIMA(0,1,0). Comparing the results from all methods used in this research based on the root mean squared error (RMSE) shows that the Loess decomposition method is the best time series model, because it has the least RMSE. Thus the Loess decomposition model was used to forecast the number of hotspots. The forecasting results indicate that the hotspot pattern tends to increase at the end of 2017 in Kutai Kartanegara and West Kutai, but to remain stationary in East Kutai.

  1. Interval forecasting of cyberattack intensity on informatization objects of industry using probability cluster model

    NASA Astrophysics Data System (ADS)

    Krakovsky, Y. M.; Luzgin, A. N.; Mikhailova, E. A.

    2018-05-01

    At present, cyber-security issues associated with the informatization objects of industry occupy one of the key niches in the state management system. Functional disruption of these systems via cyberattacks may cause an emergency involving loss of life, environmental disasters, major financial and economic damage, or disrupted activities of cities and settlements. When cyberattacks occur with high intensity, there is a need to develop protection against them based on machine learning methods. This paper examines interval forecasting and presents results for a pre-set intensity level. The interval forecasting is carried out based on a probabilistic cluster model. The method forecasts which of two predetermined intervals a future value of the indicator will fall in, using probability estimates for this purpose. The dividing bound of these intervals is determined by a calculation method based on statistical characteristics of the indicator. The source data comprise hourly counts of cyberattacks recorded by a honeypot from March to September 2013.
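    One minimal reading of the two-interval scheme: set the dividing bound from the indicator's statistics and forecast the interval whose recent empirical frequency is higher. This sketch uses the median as the bound, an assumption for illustration only:

```python
import numpy as np

def interval_forecast(history, window=24):
    """Forecast which of two intervals (below/above the bound) the next
    hourly cyberattack count will fall in, with a probability estimate."""
    history = np.asarray(history, dtype=float)
    bound = np.median(history)              # dividing bound of the two intervals
    p_upper = float((history[-window:] > bound).mean())
    interval = "upper" if p_upper >= 0.5 else "lower"
    return bound, interval, max(p_upper, 1.0 - p_upper)

print(interval_forecast([3, 5, 9, 2, 12, 7, 4, 11, 6, 8] * 5))
```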

  2. Post-processing through linear regression

    NASA Astrophysics Data System (ADS)

    van Schaeybroeck, B.; Vannitsem, S.

    2011-03-01

    Various post-processing techniques are compared for both deterministic and ensemble forecasts, all based on linear regression between forecast data and observations. In order to evaluate the quality of the regression methods, three criteria are proposed, related to the effective correction of forecast error, the optimal variability of the corrected forecast and multicollinearity. The regression schemes under consideration include the ordinary least-square (OLS) method, a new time-dependent Tikhonov regularization (TDTR) method, the total least-square method, a new geometric-mean regression (GM), a recently introduced error-in-variables (EVMOS) method and, finally, a "best member" OLS method. The advantages and drawbacks of each method are clarified. These techniques are applied in the context of the Lorenz '63 system, whose model version is affected by both initial condition and model errors. For short forecast lead times, the number and choice of predictors plays an important role. Contrary to the other techniques, GM degrades when the number of predictors increases. At intermediate lead times, linear regression is unable to provide corrections to the forecast and can sometimes degrade the performance (GM and the best member OLS with noise). At long lead times, the regression schemes that yield the correct variability and the largest correlation between ensemble error and spread (EVMOS, TDTR) should be preferred.
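    The common core of these schemes is the OLS case: regress observations on forecasts over a training set and apply the fitted line to new forecasts. A sketch for a single predictor:

```python
import numpy as np

def ols_correction(train_fcst, train_obs):
    """Fit obs = a + b * forecast by ordinary least squares and return a
    function correcting new forecasts."""
    A = np.column_stack([np.ones(len(train_fcst)), train_fcst])
    (a, b), *_ = np.linalg.lstsq(A, np.asarray(train_obs), rcond=None)
    return lambda f: a + b * np.asarray(f)

correct = ols_correction([1.0, 2.0, 3.0, 4.0], [1.3, 2.1, 3.4, 4.1])
print(correct([2.5, 5.0]))
```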

  3. Model of medicines sales forecasting taking into account factors of influence

    NASA Astrophysics Data System (ADS)

    Kravets, A. G.; Al-Gunaid, M. A.; Loshmanov, V. I.; Rasulov, S. S.; Lempert, L. B.

    2018-05-01

    The article describes a method for forecasting sales of medicines under conditions where the data sample is insufficient for building a model based on historical data alone. The developed method is applicable mainly to new drugs that are already licensed and released for sale but do not yet have stable sales performance in the market. The purpose of this study is to prove the effectiveness of the suggested method for forecasting drug sales, taking into account selected factors of influence revealed during the review of existing solutions and analysis of the specificity of the area under study. Three experiments were performed on samples of different sizes, which showed improved accuracy of sales forecasts on small samples.

  4. Stress-based aftershock forecasts made within 24 h post main shock: Expected north San Francisco Bay area seismicity changes after the 2014 M = 6.0 West Napa earthquake

    USGS Publications Warehouse

    Parsons, Thomas E.; Segou, Margaret; Sevilgen, Volkan; Milner, Kevin; Field, Edward; Toda, Shinji; Stein, Ross S.

    2014-01-01

    We calculate stress changes resulting from the M = 6.0 West Napa earthquake on north San Francisco Bay area faults. The earthquake ruptured within a series of long faults that pose significant hazard to the Bay area, and we are thus concerned with potential increases in the probability of a large earthquake through stress transfer. We conduct this exercise as a prospective test because the skill of stress-based aftershock forecasting methodology is inconclusive. We apply three methods: (1) generalized mapping of regional Coulomb stress change, (2) stress changes resolved on Uniform California Earthquake Rupture Forecast faults, and (3) a mapped rate/state aftershock forecast. All calculations were completed within 24 h after the main shock and were made without benefit of known aftershocks, which will be used to evaluate the prospective forecast. All methods suggest that we should expect heightened seismicity on parts of the southern Rodgers Creek, northern Hayward, and Green Valley faults.

  5. Stress-based aftershock forecasts made within 24 h post main shock: Expected north San Francisco Bay area seismicity changes after the 2014 M = 6.0 West Napa earthquake

    NASA Astrophysics Data System (ADS)

    Parsons, Tom; Segou, Margaret; Sevilgen, Volkan; Milner, Kevin; Field, Edward; Toda, Shinji; Stein, Ross S.

    2014-12-01

    We calculate stress changes resulting from the M = 6.0 West Napa earthquake on north San Francisco Bay area faults. The earthquake ruptured within a series of long faults that pose significant hazard to the Bay area, and we are thus concerned with potential increases in the probability of a large earthquake through stress transfer. We conduct this exercise as a prospective test because the skill of stress-based aftershock forecasting methodology is inconclusive. We apply three methods: (1) generalized mapping of regional Coulomb stress change, (2) stress changes resolved on Uniform California Earthquake Rupture Forecast faults, and (3) a mapped rate/state aftershock forecast. All calculations were completed within 24 h after the main shock and were made without benefit of known aftershocks, which will be used to evaluate the prospective forecast. All methods suggest that we should expect heightened seismicity on parts of the southern Rodgers Creek, northern Hayward, and Green Valley faults.

  6. Landslide early warning based on failure forecast models: the example of Mt. de La Saxe rockslide, northern Italy

    NASA Astrophysics Data System (ADS)

    Manconi, A.; Giordan, D.

    2015-02-01

    We investigate the use of landslide failure forecast models by exploiting near-real-time monitoring data. Starting from the inverse velocity theory, we analyze landslide surface displacements on different temporal windows, and apply straightforward statistical methods to obtain confidence intervals on the estimated time of failure. Here we describe the main concepts of our method, and show an example of application to a real emergency scenario, the La Saxe rockslide, Aosta Valley region, northern Italy. Based on the herein presented case study, we identify operational thresholds based on the reliability of the forecast models, in order to support the management of early warning systems in the most critical phases of the landslide emergency.
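    The inverse velocity theory referred to above forecasts failure where a line fitted to 1/velocity crosses zero; bootstrapping the fit gives the kind of confidence interval the method relies on. A minimal sketch:

```python
import numpy as np

def failure_time(t, v):
    """Fit 1/v = intercept + slope * t; failure is forecast at the zero
    crossing of the fitted line."""
    slope, intercept = np.polyfit(t, 1.0 / np.asarray(v), 1)
    return -intercept / slope

def failure_time_ci(t, v, n_boot=1000, seed=0):
    """Bootstrap 95% confidence interval on the estimated time of failure
    (a simple stand-in for the statistical methods mentioned above)."""
    rng = np.random.default_rng(seed)
    t, v = np.asarray(t, float), np.asarray(v, float)
    idx = rng.integers(0, len(t), size=(n_boot, len(t)))
    # skip degenerate resamples where all sampled times coincide
    estimates = [failure_time(t[i], v[i]) for i in idx
                 if len(np.unique(t[i])) > 1]
    return np.percentile(estimates, [2.5, 97.5])

t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
v = np.array([1.0, 1.3, 1.8, 2.7, 5.0])   # accelerating displacement rate
print(failure_time(t, v), failure_time_ci(t, v))
```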

  7. Improving medium-range and seasonal hydroclimate forecasts in the southeast USA

    NASA Astrophysics Data System (ADS)

    Tian, Di

    Accurate hydro-climate forecasts are important for decision making by water managers, agricultural producers, and other stakeholders. Numerical weather prediction models and general circulation models may have potential for improving hydro-climate forecasts at different scales. In this study, forecast analogs of the Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) based on different approaches were evaluated for medium-range reference evapotranspiration (ETo), irrigation scheduling, and urban water demand forecasts in the southeast United States; the Climate Forecast System version 2 (CFSv2) and the North American Multi-Model Ensemble (NMME) were statistically downscaled for seasonal forecasts of ETo, precipitation (P) and 2-m temperature (T2M) at the regional level. The GFS mean temperature (Tmean), relative humidity, and wind speed (Wind) reforecasts combined with the climatology of Reanalysis 2 solar radiation (Rs) produced higher skill than using the direct GFS output only. Constructed analogs showed slightly higher skill than natural analogs for deterministic forecasts. Skill for both GEFS-based ETo forecasts and the irrigation schedules they drive was generally positive up to one week ahead throughout the year. The GEFS improved ETo forecast skill compared to the GFS. The GEFS-based analog forecasts for the input variables of an operational urban water demand model were skillful when applied in the Tampa Bay area. The modified operational models driven by GEFS analog forecasts showed higher forecast skill than the operational model based on persistence. The results for CFSv2 seasonal forecasts showed that maximum temperature (Tmax) and Rs had the greatest influence on ETo. The downscaled Tmax showed the highest predictability, followed by Tmean, Tmin, Rs, and Wind. The CFSv2 model better predicted ETo in cold seasons during El Nino Southern Oscillation (ENSO) events, but only when the forecast was initialized under ENSO conditions. Downscaled P and T2M forecasts were produced by directly downscaling the NMME P and T2M output or indirectly by using the NMME forecasts of Nino3.4 sea surface temperatures to predict local-scale P and T2M. The indirect method generally showed the highest forecast skill, which occurred in cold seasons. The bias-corrected NMME ensemble forecasts did not outperform the best single model.

  8. Product demand forecasts using wavelet kernel support vector machine and particle swarm optimization in manufacture system

    NASA Astrophysics Data System (ADS)

    Wu, Qi

    2010-03-01

    Demand forecasts play a crucial role in supply chain management. The future demand for a certain product is the basis for the respective replenishment systems. For demand series with small samples, seasonality, nonlinearity, randomness and fuzziness, existing support vector kernels cannot adequately approximate the sales time series in L2 space (the space of square-integrable functions). In this paper, we present a hybrid intelligent system combining the wavelet kernel support vector machine and particle swarm optimization for demand forecasting. The results of an application to car sales series forecasting show that the forecasting approach based on the hybrid PSOWv-SVM model is effective and feasible. A comparison between the method proposed in this paper and other methods is also given, which shows that this method is, for the discussed example, better than hybrid PSOv-SVM and other traditional methods.
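    The wavelet kernel in question is typically built from a mother-wavelet product; the sketch below uses the common Morlet-based form (an assumption, since the paper's exact kernel parameters are not reproduced here):

```python
import numpy as np

def wavelet_kernel(x, y, a=1.0):
    """Translation-invariant wavelet kernel
    K(x, y) = prod_i h((x_i - y_i) / a), with the Morlet-type mother wavelet
    h(u) = cos(1.75 * u) * exp(-u**2 / 2); 'a' is the dilation parameter
    (the kind of hyperparameter PSO would tune)."""
    u = (np.asarray(x, float) - np.asarray(y, float)) / a
    return float(np.prod(np.cos(1.75 * u) * np.exp(-u ** 2 / 2.0)))

print(wavelet_kernel([0.2, 0.5], [0.1, 0.9], a=0.5))
```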

  9. Improving flood forecasting capability of physically based distributed hydrological models by parameter optimization

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Li, J.; Xu, H.

    2016-01-01

    Physically based distributed hydrological models (hereafter referred to as PBDHMs) divide the terrain of the whole catchment into a number of grid cells at fine resolution and assimilate different terrain data and precipitation to different cells. They are regarded to have the potential to improve the catchment hydrological process simulation and prediction capability. In the early stage, physically based distributed hydrological models were assumed to derive model parameters from the terrain properties directly, so there was no need to calibrate model parameters. However, the uncertainties associated with this model derivation are unfortunately very high, which has impacted their application in flood forecasting, so parameter optimization may also be necessary. There are two main purposes for this study: the first is to propose a parameter optimization method for physically based distributed hydrological models in catchment flood forecasting by using the particle swarm optimization (PSO) algorithm, and to test its competence and improve its performance; the second is to explore the possibility of improving physically based distributed hydrological model capability in catchment flood forecasting by parameter optimization. In this paper, based on the scalar concept, a general framework for parameter optimization of the PBDHMs for catchment flood forecasting is first proposed that could be used for all PBDHMs. Then, with the Liuxihe model as the study model, which is a physically based distributed hydrological model proposed for catchment flood forecasting, the improved PSO algorithm is developed for the parameter optimization of the Liuxihe model in catchment flood forecasting. The improvements include adoption of the linearly decreasing inertia weight strategy to change the inertia weight and the arccosine function strategy to adjust the acceleration coefficients. This method has been tested in two catchments in southern China with different sizes, and the results show that the improved PSO algorithm could be used for the Liuxihe model parameter optimization effectively and could improve the model capability largely in catchment flood forecasting, thus proving that parameter optimization is necessary to improve the flood forecasting capability of physically based distributed hydrological models. It also has been found that the appropriate particle number and the maximum evolution number of PSO algorithm used for the Liuxihe model catchment flood forecasting are 20 and 30 respectively.

  10. Appraisal of artificial neural network for forecasting of economic parameters

    NASA Astrophysics Data System (ADS)

    Kordanuli, Bojana; Barjaktarović, Lidija; Jeremić, Ljiljana; Alizamir, Meysam

    2017-01-01

    The main aim of this research is to develop and apply artificial neural networks (ANN) with extreme learning machine (ELM) and back propagation (BP) to forecast gross domestic product (GDP) and the Hirschman-Herfindahl Index (HHI). GDP could be developed based on a combination of different factors. In this investigation, GDP forecasting based on the agriculture and industry value added in GDP was analysed separately. Other inputs are final consumption expenditure of general government, gross fixed capital formation (investment) and the fertility rate. The relation between product market competition and corporate investment is contentious: on one hand, the relation can be positive, but on the other hand, it can be negative. Several methods have been proposed to monitor market power for the purpose of developing procedures to mitigate or eliminate its effects. The most widely used methods are based on indices such as the Hirschman-Herfindahl Index (HHI). The reliability of the ANN models was assessed based on simulation results and using several statistical indicators. Based upon the simulation results, ELM showed better performance than the BP learning algorithm in applications of GDP and HHI forecasting.

  11. Automatic load forecasting. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, D.J.; Vemuri, S.

    A method which lends itself to on-line forecasting of hourly electric loads is presented and the results of its use are compared to models developed using the Box-Jenkins method. The method consists of processing the historical hourly loads with a sequential least-squares estimator to identify a finite order autoregressive model which in turn is used to obtain a parsimonious autoregressive-moving average model. A procedure is also defined for incorporating temperature as a variable to improve forecasts where loads are temperature dependent. The method presented has several advantages in comparison to the Box-Jenkins method, including much less human intervention and improved model identification. The method has been tested using three-hourly data from the Lincoln Electric System, Lincoln, Nebraska. In the exhaustive analyses performed on this database, this method produced significantly better results than the Box-Jenkins method. The method also proved to be more robust in that greater confidence could be placed in the accuracy of models based upon the various measures available at the identification stage.

  12. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy logical relationships.

    PubMed

    Chen, Shyi-Ming; Chen, Shen-Wen

    2015-03-01

    In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy-trend logical relationships. Firstly, the proposed method fuzzifies the historical training data of the main factor and the secondary factor into fuzzy sets, respectively, to form two-factors second-order fuzzy logical relationships. Then, it groups the obtained two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Then, it calculates the probability of the "down-trend," the probability of the "equal-trend" and the probability of the "up-trend" of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group, respectively. Finally, it performs the forecasting based on the probabilities of the down-trend, the equal-trend, and the up-trend of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) and the NTD/USD exchange rates. The experimental results show that the proposed method outperforms the existing methods.
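    The trend-probability step reduces to counting trend labels within each relationship group; a plain-Python sketch:

```python
from collections import Counter

def trend_probabilities(group_trends):
    """Empirical probabilities of the 'down', 'equal' and 'up' trends within
    one two-factors second-order fuzzy-trend logical relationship group."""
    counts = Counter(group_trends)
    total = sum(counts.values())
    return {t: counts.get(t, 0) / total for t in ("down", "equal", "up")}

print(trend_probabilities(["up", "up", "down", "equal", "up"]))
```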

  13. Evaluating the performance of the Lee-Carter method and its variants in modelling and forecasting Malaysian mortality

    NASA Astrophysics Data System (ADS)

    Zakiyatussariroh, W. H. Wan; Said, Z. Mohammad; Norazan, M. R.

    2014-12-01

    This study investigated the performance of the Lee-Carter (LC) method and its variants in modeling and forecasting Malaysian mortality. These include the original LC, the Lee-Miller (LM) variant and the Booth-Maindonald-Smith (BMS) variant. These methods were evaluated using Malaysia's mortality data, measured as age-specific death rates (ASDR) for 1971 to 2009 for the overall population, while data for 1980-2009 were used in separate models for the male and female populations. The performance of the variants was examined in terms of the goodness of fit of the models and forecasting accuracy. Comparison was made based on several criteria, namely mean square error (MSE), root mean square error (RMSE), mean absolute deviation (MAD) and mean absolute percentage error (MAPE). The results indicate that the BMS method performed best in in-sample fitting, both for the overall population and when the models were fitted separately for the male and female populations. However, in the case of out-of-sample forecast accuracy, the BMS method was only best when the data were fitted to the overall population. When the data were fitted separately for males and females, the original LC method performed better for the male population and the LM method for the female population.
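    For orientation, the classic Lee-Carter fit that underlies all three variants can be written as an SVD of the centered log-rate matrix; a minimal sketch under the usual identifiability constraints:

```python
import numpy as np

def lee_carter_fit(log_rates):
    """Fit log m(x, t) = a_x + b_x * k_t. Rows are ages, columns are years;
    a_x is the row mean, and (b_x, k_t) come from the first singular triplet,
    normalized so that sum(b) = 1."""
    a = log_rates.mean(axis=1)
    u, s, vt = np.linalg.svd(log_rates - a[:, None], full_matrices=False)
    scale = u[:, 0].sum()
    b = u[:, 0] / scale
    k = s[0] * vt[0] * scale
    return a, b, k

rng = np.random.default_rng(0)
a, b, k = lee_carter_fit(np.log(rng.uniform(0.001, 0.1, size=(10, 30))))
print(b.sum(), k.mean())   # b sums to 1; k is mean zero by construction
```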

  14. Two levels ARIMAX and regression models for forecasting time series data with calendar variation effects

    NASA Astrophysics Data System (ADS)

    Suhartono; Lee, Muhammad Hisyam; Prastyo, Dedy Dwi

    2015-12-01

    The aim of this research is to develop a calendar variation model for forecasting retail sales data with the Eid ul-Fitr effect. The proposed model is based on two methods, namely two-level ARIMAX and regression methods. The two-level ARIMAX and regression models are built by using ARIMAX for the first level and regression for the second level. Monthly men's jeans and women's trousers sales in a retail company for the period January 2002 to September 2009 are used as a case study. In general, the two-level calendar variation model yields two models: the first reconstructs the sales pattern that has already occurred, and the second forecasts the increase in sales due to Eid ul-Fitr, which affects sales in the same and the previous month. The results show that the proposed two-level calendar variation model based on ARIMAX and regression methods yields better forecasts than the seasonal ARIMA model and neural networks.

  15. The value of forecasting key-decision variables for rain-fed farming

    NASA Astrophysics Data System (ADS)

    Winsemius, Hessel; Werner, Micha

    2013-04-01

    Rain-fed farmers are highly vulnerable to variability in rainfall. Timely knowledge of the onset of the rainy season, the expected amount of rainfall and the occurrence of dry spells can help rain-fed farmers to plan the cropping season. Seasonal probabilistic weather forecasts may provide such information to farmers, but need to provide reliable forecasts of key variables with which farmers can make decisions. In this contribution, we present a new method to evaluate the value of meteorological forecasts in predicting these key variables. The method measures skill by assessing whether a forecast was timed accurately enough to be useful to the decision at hand, and it progresses from forecast skill to forecast value by taking into account the cost/loss ratio of possible decisions. The method is applied over the Limpopo region in Southern Africa. We demonstrate the method using the example of temporary water harvesting techniques. Such techniques require time to construct and must be ready long enough before the occurrence of a dry spell to be effective. The value of the forecasts to this example decision is shown to be highly sensitive to the accuracy in the timing of forecasted dry spells, and to the tolerance of the decision to timing error. The skill with which dry spells can be predicted is shown to be higher in some parts of the basin, indicating that these forecasts have higher value for the decision in those parts than in others. By assessing the skill of forecasting the farmers' key decision variables, we show whether the forecasts have value in reducing risk, or whether other adaptation strategies should be implemented.

  16. [Predicting Incidence of Hepatitis E in China Using Fuzzy Time Series Based on Fuzzy C-Means Clustering Analysis].

    PubMed

    Luo, Yi; Zhang, Tao; Li, Xiao-song

    2016-05-01

    To explore the application of a fuzzy time series model based on fuzzy c-means clustering in forecasting the monthly incidence of Hepatitis E in mainland China. A predictive model (fuzzy time series method based on fuzzy c-means clustering) was developed using Hepatitis E incidence data in mainland China between January 2004 and July 2014. The incidence data from August 2014 to November 2014 were used to test the fitness of the predictive model. The forecasting results were compared with those from traditional fuzzy time series models. The fuzzy time series model based on fuzzy c-means clustering had a fitting mean squared error (MSE) of 0.0011 and a forecasting MSE of 6.9775 × 10⁻⁴, compared with 0.0017 and 0.0014 from the traditional forecasting model. The results indicate that the fuzzy time series model based on fuzzy c-means clustering has a better performance in forecasting the incidence of Hepatitis E.

  17. Forecasting VaR and ES of stock index portfolio: A Vine copula method

    NASA Astrophysics Data System (ADS)

    Zhang, Bangzheng; Wei, Yu; Yu, Jiang; Lai, Xiaodong; Peng, Zhenfeng

    2014-12-01

    Risk measurement has both theoretical and practical significance in risk management. Using daily samples of 10 international stock indices, this paper first models the internal structures among different stock markets with C-Vine, D-Vine and R-Vine copula models. Secondly, the Value-at-Risk (VaR) and Expected Shortfall (ES) of the international stock markets portfolio are forecasted using a Monte Carlo method based on the dependence estimated by the different Vine copulas. Finally, the accuracy of the VaR and ES measurements obtained from the different statistical models is evaluated by UC, IND, CC and posterior analysis. The empirical results show that the VaR forecasts at the quantile levels of 0.9, 0.95, 0.975 and 0.99 with the three kinds of Vine copula models are sufficiently accurate. Several traditional methods, such as historical simulation, mean-variance and DCC-GARCH models, fail to pass the CC backtesting. The Vine copula methods can accurately forecast the ES of the portfolio on the basis of the VaR measurement, and the D-Vine copula model is superior to the other Vine copulas.
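    Whatever the dependence model used to simulate portfolio returns, VaR and ES then follow from the simulated loss distribution; a sketch on the Monte Carlo output (Gaussian noise stands in here for vine-copula draws):

```python
import numpy as np

def var_es(simulated_returns, alpha=0.95):
    """VaR is the alpha-quantile of losses; ES is the mean loss beyond VaR."""
    losses = -np.asarray(simulated_returns)
    var = np.quantile(losses, alpha)
    es = float(losses[losses >= var].mean())
    return float(var), es

rng = np.random.default_rng(2)
print(var_es(rng.normal(0.0005, 0.01, size=100_000)))
```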

  18. A Four-Stage Hybrid Model for Hydrological Time Series Forecasting

    PubMed Central

    Di, Chongli; Yang, Xiaohua; Wang, Xiaochao

    2014-01-01

    Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To address this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of ‘denoising, decomposition and ensemble’. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noise in the hydrological time series. Then, an improved version of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, hybrid models without denoising or decomposition, and hybrid models based on other methods, such as wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulty of forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models. PMID:25111782

  20. A comparative verification of high resolution precipitation forecasts using model output statistics

    NASA Astrophysics Data System (ADS)

    van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees

    2017-04-01

    Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution meso-scale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study, a verification strategy based on model output statistics is applied that aims to address both the double-penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, as a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the 3 post-processed models, but larger differences for individual lead times. In addition, the Fractions Skill Score is computed using the 3 deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they perform only similarly to or somewhat better than precipitation forecasts from the 2 lower-resolution models, at least in the Netherlands.

  1. A stochastic HMM-based forecasting model for fuzzy time series.

    PubMed

    Li, Sheng-Tun; Cheng, Yi-Chung

    2010-10-01

    Recently, fuzzy time series have attracted more academic attention than traditional time series due to their capability of dealing with the uncertainty and vagueness inherent in the data collected. The formulation of fuzzy relations is one of the key issues affecting forecasting results. Most present works adopt IF-THEN rules for relationship representation, which leads to higher computational overhead and rule redundancy. Sullivan and Woodall proposed a Markov-based formulation and a forecasting model to reduce computational overhead; however, its applicability is limited to handling one-factor problems. In this paper, we propose a novel forecasting model based on the hidden Markov model by enhancing Sullivan and Woodall's work to allow handling of two-factor forecasting problems. Moreover, in order to make the nature of conjecture and randomness of forecasting more realistic, the Monte Carlo method is adopted to estimate the outcome. To test the effectiveness of the resulting stochastic model, we conduct two experiments and compare the results with those from other models. The first experiment consists of forecasting the daily average temperature and cloud density in Taipei, Taiwan, and the second experiment is based on the Taiwan Weighted Stock Index, forecasting the exchange rate of the New Taiwan dollar against the U.S. dollar. In addition to improving forecasting accuracy, the proposed model adheres to the central limit theorem, and thus the result statistically approximates the real mean of the target value being forecast.
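
    The Monte Carlo step can be illustrated independently of the full two-factor model: given a fitted transition matrix over fuzzy states and a representative value per state, repeated sampling of next-step outcomes yields a forecast that converges to the model's one-step mean, which is the central-limit-theorem behavior the abstract refers to. The states, probabilities and values below are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Illustrative fitted quantities: 3 fuzzy states, their transition matrix,
    # and a representative (defuzzified) value for each state.
    P = np.array([[0.6, 0.3, 0.1],
                  [0.2, 0.5, 0.3],
                  [0.1, 0.4, 0.5]])
    state_values = np.array([15.0, 20.0, 25.0])   # e.g. temperature midpoints

    current_state = 1
    draws = rng.choice(3, size=10_000, p=P[current_state])
    mc_forecast = state_values[draws].mean()      # Monte Carlo estimate

    exact_mean = P[current_state] @ state_values  # analytical one-step mean
    print(f"MC forecast: {mc_forecast:.3f}, exact mean: {exact_mean:.3f}")
    ```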

  2. Forecasting Lightning Threat using Cloud-Resolving Model Simulations

    NASA Technical Reports Server (NTRS)

    McCaul, Eugene W., Jr.; Goodman, Steven J.; LaCasse, Katherine M.; Cecil, Daniel J.

    2008-01-01

    Two new approaches are proposed and developed for making time- and space-dependent, quantitative short-term forecasts of lightning threat, and a blend of these approaches is devised that capitalizes on the strengths of each. The new methods are distinctive in that they are based entirely on the ice-phase hydrometeor fields generated by regional cloud-resolving numerical simulations, such as those produced by the WRF model. These methods are justified by established observational evidence linking aspects of the precipitating ice hydrometeor fields to total flash rates. The methods are straightforward and easy to implement, and offer an effective near-term alternative to the incorporation of complex and costly cloud electrification schemes into numerical models. One method is based on upward fluxes of precipitating ice hydrometeors in the mixed-phase region at the -15 °C level, while the second method is based on the vertically integrated amounts of ice hydrometeors in each model grid column. Each method can be calibrated by comparing domain-wide statistics of the peak values of simulated flash rate proxy fields against domain-wide peak total lightning flash rate density data from observations. Tests show that the first method is able to capture much of the temporal variability of the lightning threat, while the second method does a better job of depicting the areal coverage of the threat. Our blended solution is designed to retain most of the temporal sensitivity of the first method, while adding the improved spatial coverage of the second. Exploratory tests for selected North Alabama cases show that, because WRF can distinguish the general character of most convective events, our methods show promise as a means of generating quantitatively realistic fields of lightning threat. However, because the models tend to have more difficulty in predicting the instantaneous placement of storms, forecasts of the detailed location of the lightning threat based on single simulations can be in error. Although these model shortcomings presently limit the precision of lightning threat forecasts from individual runs of current-generation models, the techniques proposed herein should continue to be applicable as newer and more accurate physically based model versions, physical parameterizations, initialization techniques and ensembles of forecasts become available.

  3. Study on the medical meteorological forecast of the number of hypertension inpatient based on SVR

    NASA Astrophysics Data System (ADS)

    Zhai, Guangyu; Chai, Guorong; Zhang, Haifeng

    2017-06-01

    The purpose of this study is to build a hypertension prediction model by examining the meteorological factors associated with hypertension incidence. Standard data on relative humidity, air temperature, visibility, wind speed and air pressure in Lanzhou from 2010 to 2012 (calculating the maximum, minimum and average values with 5 days as a unit) were selected as the input variables of Support Vector Regression (SVR), and the standard data on hypertension incidence over the same period as the output dependent variables; the optimal prediction parameters were obtained by a cross-validation algorithm, and then, through SVR learning and training, an SVR forecast model for hypertension incidence was built. The results show that the hypertension prediction model is composed of 15 input independent variables; the training accuracy is 0.005, and the final error is 0.0026389. The forecast accuracy of the SVR model is 97.1429%, which is higher than that of a statistical forecast equation and a neural network prediction method. It is concluded that the SVR model provides a new method for hypertension prediction, with simple calculation, small error, good fit to historical samples and strong independent-sample forecasting capability.
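
    A minimal sketch of this modeling pattern with scikit-learn follows; synthetic arrays stand in for the 15 meteorological predictors and the admission counts, and the hyperparameter grid is an illustrative assumption rather than the study's actual tuning setup.

    ```python
    import numpy as np
    from sklearn.model_selection import GridSearchCV
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 15))        # stand-in for 15 meteorological inputs
    y = 2.0 * X[:, 0] + X[:, 3] - X[:, 7] + rng.normal(scale=0.3, size=200)

    # Standardize inputs, then tune the SVR hyperparameters by cross-validation.
    pipe = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
    grid = GridSearchCV(pipe,
                        param_grid={"svr__C": [1, 10, 100],
                                    "svr__epsilon": [0.01, 0.1, 0.5]},
                        cv=5)
    grid.fit(X, y)
    print("best params:", grid.best_params_)
    print("cross-validated R^2:", round(grid.best_score_, 3))
    ```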

  4. Improvement of the analog forecasting method by using local thermodynamic data. Application to autumn precipitation in Catalonia

    NASA Astrophysics Data System (ADS)

    Gibergans-Báguena, J.; Llasat, M. C.

    2007-12-01

    The objective of this paper is to present an improvement in the quantitative forecasting of daily rainfall in Catalonia (NE Spain) using an analogues technique that takes into account both synoptic and local data. The method is based on an analogue sorting technique: meteorological situations similar to the current one are sought in terms of the 700 and 1000 hPa geopotential fields at 00 UTC, complemented with thermodynamic parameters extracted from a historical data file. Thermodynamic analysis acts as a highly discriminating feature in situations where the synoptic situation fails to explain either the atmospheric phenomena or the rainfall distribution. This is the case in heavy rainfall situations, where the existence of instability and high water vapor content is essential. To include these vertical thermodynamic features, information provided by the Palma de Mallorca radiosounding (Spain) has been used. First, the most discriminating thermodynamic parameters for daily rainfall were selected, and the analogues technique was then applied to them. Finally, three analog forecasting methods were applied to the quantitative daily rainfall forecasting in Catalonia. The first is based on analogies of geopotential fields at the synoptic scale; the second is based exclusively on the search for similarity in local thermodynamic information; and the third combines the other two. The results show that this last method provides a substantial improvement in quantitative rainfall estimation.

  5. Technical note: Combining quantile forecasts and predictive distributions of streamflows

    NASA Astrophysics Data System (ADS)

    Bogner, Konrad; Liechti, Katharina; Zappa, Massimiliano

    2017-11-01

    The enhanced availability of many different hydro-meteorological modelling and forecasting systems raises the issue of how to optimally combine this wealth of information. The use of deterministic and probabilistic forecasts with sometimes widely divergent predictions of future streamflow makes it especially complicated for decision makers to sift out the relevant information. In this study, multiple sources of streamflow forecast information are aggregated based on several different predictive distributions and quantile forecasts. For this combination, the Bayesian model averaging (BMA) approach, the non-homogeneous Gaussian regression (NGR), also known as the ensemble model output statistics (EMOS) technique, and a novel method called Beta-transformed linear pooling (BLP) are applied. With the help of the quantile score (QS) and the continuous ranked probability score (CRPS), the combination results for the Sihl River in Switzerland, with about 5 years of forecast data, are compared, and the differences between the raw and optimally combined forecasts are highlighted. The results demonstrate the importance of applying proper forecast combination methods for decision makers in the field of flood and water resource management.
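
    The CRPS used for this comparison has a convenient sample-based (energy) form, CRPS = E|X - y| - 0.5 E|X - X'|, that can be computed directly from ensemble members. A minimal sketch with illustrative data follows; lower values indicate a better forecast.

    ```python
    import numpy as np

    def crps_ensemble(members, obs):
        """Sample-based CRPS: E|X - y| - 0.5 * E|X - X'| for one forecast case."""
        members = np.asarray(members, dtype=float)
        term1 = np.abs(members - obs).mean()
        term2 = 0.5 * np.abs(members[:, None] - members[None, :]).mean()
        return term1 - term2

    rng = np.random.default_rng(1)
    raw = rng.normal(loc=120.0, scale=30.0, size=50)        # raw ensemble [m3/s]
    combined = rng.normal(loc=100.0, scale=15.0, size=50)   # combined forecast
    obs = 105.0
    print("CRPS raw:     ", round(crps_ensemble(raw, obs), 2))
    print("CRPS combined:", round(crps_ensemble(combined, obs), 2))
    ```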

  6. A Simulation Optimization Approach to Epidemic Forecasting

    PubMed Central

    Nsoesie, Elaine O.; Beckman, Richard J.; Shashaani, Sara; Nagaraj, Kalyani S.; Marathe, Madhav V.

    2013-01-01

    Reliable forecasts of influenza can aid in the control of both seasonal and pandemic outbreaks. We introduce a simulation optimization (SIMOP) approach for forecasting the influenza epidemic curve. This study represents the final step of a project aimed at using a combination of simulation, classification, statistical and optimization techniques to forecast the epidemic curve and infer underlying model parameters during an influenza outbreak. The SIMOP procedure combines an individual-based model and the Nelder-Mead simplex optimization method. The method is used to forecast epidemics simulated over synthetic social networks representing Montgomery County in Virginia, Miami, Seattle and surrounding metropolitan regions. The results are presented for the first four weeks. Depending on the synthetic network, the peak time could be predicted within a 95% CI as early as seven weeks before the actual peak. The peak infected and total infected were also accurately forecasted for Montgomery County in Virginia within the forecasting period. Forecasting of the epidemic curve for both seasonal and pandemic influenza outbreaks is a complex problem; however, this is a preliminary step, and the results suggest that more can be achieved in this area. PMID:23826222

  8. Forecasting Container Throughput at the Doraleh Port in Djibouti through Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Mohamed Ismael, Hawa; Vandyck, George Kobina

    The Doraleh Container Terminal (DCT) located in Djibouti has been noted as the most technologically advanced container terminal on the African continent. DCT's strategic location at the crossroads of the main shipping lanes connecting Asia, Africa and Europe puts it in a unique position to provide important shipping services to vessels plying that route. This paper aims to forecast container throughput through the Doraleh Container Port in Djibouti by time series analysis. A selection of univariate forecasting models has been used, namely the Triple Exponential Smoothing, Grey and Linear Regression models. By utilizing these three models and their combination, forecasts of container throughput through the Doraleh port were produced. A comparison of the forecasting results of the three models, in addition to the combination forecast, is then undertaken based on the commonly used evaluation criteria of Mean Absolute Deviation (MAD) and Mean Absolute Percentage Error (MAPE). The study found that the Linear Regression model was the best prediction method for forecasting container throughput, since its forecast error was the smallest. Based on the regression model, a ten (10) year forecast for container throughput at DCT has been made.
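
    The evaluation logic admits a short sketch: fit candidate models, form a combination, score all of them by MAD and MAPE, and extrapolate with the winner. Below, a linear trend and a naive persistence forecast stand in for the three models of the study, and the throughput figures are illustrative, not DCT's actual data.

    ```python
    import numpy as np

    years = np.arange(2008, 2016)
    teu = np.array([300, 340, 390, 420, 480, 520, 560, 610], dtype=float)  # '000 TEU

    # Model 1: linear regression on time.
    coef = np.polyfit(years, teu, deg=1)
    lin_fc = np.polyval(coef, years)

    # Model 2: naive persistence, standing in for the smoothing/grey models.
    naive_fc = np.concatenate(([teu[0]], teu[:-1]))

    combo_fc = 0.5 * lin_fc + 0.5 * naive_fc     # equally weighted combination

    def mad(actual, fc):
        return np.abs(actual - fc).mean()

    def mape(actual, fc):
        return 100.0 * np.abs((actual - fc) / actual).mean()

    for name, fc in [("linear", lin_fc), ("naive", naive_fc), ("combined", combo_fc)]:
        print(f"{name:8s} MAD={mad(teu, fc):6.2f}  MAPE={mape(teu, fc):5.2f}%")

    # Ten-year extrapolation from the winning model, as in the study.
    print(np.round(np.polyval(coef, np.arange(2016, 2026)), 1))
    ```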

  9. Prediction on sunspot activity based on fuzzy information granulation and support vector machine

    NASA Astrophysics Data System (ADS)

    Peng, Lingling; Yan, Haisheng; Yang, Zhigang

    2018-04-01

    In order to analyze the range of sunspot activity, a combined prediction method for forecasting the fluctuation range of sunspots based on fuzzy information granulation (FIG) and support vector machine (SVM) was put forward. First, FIG is employed to granulate the sample data and extract valid information from each window, namely the minimum, general average and maximum values of each window. Second, a forecasting model is built with SVM for each of these values, and cross-validation is used to optimize the model parameters. Finally, the fluctuation range of sunspots is forecasted with the optimized SVM models. A case study demonstrates that the model has high accuracy and can effectively predict the fluctuation of sunspots.

  10. Increased performance in the short-term water demand forecasting through the use of a parallel adaptive weighting strategy

    NASA Astrophysics Data System (ADS)

    Sardinha-Lourenço, A.; Andrade-Campos, A.; Antunes, A.; Oliveira, M. S.

    2018-03-01

    Recent research on short-term water demand forecasting has shown that models using univariate time series based on historical data are useful and can be combined with other prediction methods to reduce errors. Water demand in drinking water distribution networks is largely repetitive in nature and, under similar meteorological conditions and consumer profiles, allows the development of a heuristic forecast model that, combined with other autoregressive models, can provide reliable forecasts. In this study, a parallel adaptive weighting strategy for forecasting water consumption over the next 24-48 h, using univariate time series of potable water consumption, is proposed. Two Portuguese potable water distribution networks are used as case studies, where the only input data are the water consumption and the national calendar. For the development of the strategy, the Autoregressive Integrated Moving Average (ARIMA) method and a short-term forecast heuristic algorithm are used. Simulations with the model showed that, when using the parallel adaptive weighting strategy, the prediction error can be reduced by 15.96% and the average error by 9.20%. This reduction is important in the control and management of water supply systems. The proposed methodology can be extended to other forecast methods, especially given the availability of multiple forecast models.
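
    The adaptive weighting idea can be sketched as follows: two models run in parallel and, at each step, their forecasts are combined with weights inversely proportional to each model's recent mean absolute error. This is a simplified stand-in for the paper's strategy, with the heuristic and ARIMA components mocked by noisy copies of the signal.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    T = 200
    demand = 100 + 10 * np.sin(2 * np.pi * np.arange(T) / 24) + rng.normal(0, 2, T)

    # Mock one-step forecasts from two imperfect models.
    fc_a = demand + rng.normal(0, 3, T)            # e.g. heuristic model
    fc_b = demand + rng.normal(0, 5, T)            # e.g. ARIMA model

    window = 24                                    # look-back for recent skill
    combined = np.empty(T)
    combined[:window] = 0.5 * (fc_a[:window] + fc_b[:window])
    for t in range(window, T):
        err_a = np.abs(demand[t-window:t] - fc_a[t-window:t]).mean()
        err_b = np.abs(demand[t-window:t] - fc_b[t-window:t]).mean()
        w_a = (1.0 / err_a) / (1.0 / err_a + 1.0 / err_b)   # inverse-error weight
        combined[t] = w_a * fc_a[t] + (1.0 - w_a) * fc_b[t]

    for name, fc in [("A", fc_a), ("B", fc_b), ("adaptive", combined)]:
        print(name, round(np.abs(demand - fc).mean(), 3))
    ```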

  11. Robustness of disaggregate oil and gas discovery forecasting models

    USGS Publications Warehouse

    Attanasi, E.D.; Schuenemeyer, J.H.

    1989-01-01

    The trend in forecasting oil and gas discoveries has been to develop and use models that allow forecasts of the size distribution of future discoveries. From such forecasts, exploration and development costs can more readily be computed. Two classes of these forecasting models are the Arps-Roberts type models and the 'creaming method' models. This paper examines the robustness of the forecasts made by these models when the historical data on which the models are based have been subject to economic upheavals or when historical discovery data are aggregated from areas having widely differing economic structures. Model performance is examined in the context of forecasting discoveries for offshore Texas State and Federal areas. The analysis shows how the model forecasts are limited by information contained in the historical discovery data. Because the Arps-Roberts type models require more regularity in the discovery sequence than the creaming models, prior information had to be introduced into the Arps-Roberts models to accommodate the influence of economic changes. The creaming methods captured the overall decline in discovery size but did not easily allow introduction of exogenous information to compensate for incomplete historical data. Moreover, the predictive log normal distribution associated with the creaming model methods appears to understate the importance of the potential contribution of small fields. © 1989.

  12. Distributed HUC-based modeling with SUMMA for ensemble streamflow forecasting over large regional domains.

    NASA Astrophysics Data System (ADS)

    Saharia, M.; Wood, A.; Clark, M. P.; Bennett, A.; Nijssen, B.; Clark, E.; Newman, A. J.

    2017-12-01

    Most operational streamflow forecasting systems rely on a forecaster-in-the-loop approach in which some parts of the forecast workflow require an experienced human forecaster. But this approach faces challenges surrounding process reproducibility, hindcasting capability, and extension to large domains. The operational hydrologic community is increasingly moving towards 'over-the-loop' (completely automated) large-domain simulations, yet recent developments indicate a widespread lack of community knowledge about the strengths and weaknesses of such systems for forecasting. A realistic representation of land surface hydrologic processes is a critical element for improving forecasts, but often comes at the substantial cost of forecast system agility and efficiency. While popular grid-based models support the distributed representation of land surface processes, intermediate-scale Hydrologic Unit Code (HUC)-based modeling could provide a more efficient and process-aligned spatial discretization, reducing the need for tradeoffs between model complexity and critical forecasting requirements such as ensemble methods and comprehensive model calibration. The National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation and the USACE to implement, assess, and demonstrate real-time, over-the-loop distributed streamflow forecasting for several large western US river basins and regions. In this presentation, we present early results from short- to medium-range hydrologic and streamflow forecasts for the Pacific Northwest (PNW). We employ real-time 1/16th-degree daily ensemble model forcings as well as downscaled Global Ensemble Forecasting System (GEFS) meteorological forecasts. These datasets drive an intermediate-scale configuration of the Structure for Unifying Multiple Modeling Alternatives (SUMMA) model, which represents the PNW using over 11,700 HUCs. The system produces not only streamflow forecasts (using the MizuRoute channel routing tool) but also distributed model states such as soil moisture and snow water equivalent. We also describe challenges in distributed model-based forecasting, including the application and early results of real-time hydrologic data assimilation.

  13. Seasonal drought ensemble predictions based on multiple climate models in the upper Han River Basin, China

    NASA Astrophysics Data System (ADS)

    Ma, Feng; Ye, Aizhong; Duan, Qingyun

    2017-03-01

    An experimental seasonal drought forecasting system is developed based on 29-year (1982-2010) seasonal meteorological hindcasts generated by the climate models from the North American Multi-Model Ensemble (NMME) project. The system makes use of a bias correction and spatial downscaling method, and a distributed time-variant gain model (DTVGM) hydrologic model. DTVGM was calibrated using observed daily hydrological data, and its streamflow simulations achieved Nash-Sutcliffe efficiency values of 0.727 and 0.724 during the calibration (1978-1995) and validation (1996-2005) periods, respectively, at the Danjiangkou reservoir station. The experimental seasonal drought forecasting system (known as NMME-DTVGM) is used to generate seasonal drought forecasts, which were evaluated against reference forecasts (i.e., the persistence forecast and the climatological forecast). The NMME-DTVGM drought forecasts have higher detectability and accuracy and a lower false alarm rate than the reference forecasts at different lead times (from 1 to 4 months) during the cold-dry season. No apparent advantage is shown in drought predictions during the spring and summer seasons because of the long memory of the initial conditions in spring and the lower predictive skill for precipitation in summer. Overall, the NMME-based seasonal drought forecasting system has meaningful skill in predicting drought several months in advance, which can provide critical information for drought preparedness and response planning as well as the sustainable practice of water resource conservation over the basin.
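
    The Nash-Sutcliffe efficiency used to judge the calibration and validation runs is simple to compute; a minimal helper with illustrative flow values:

    ```python
    import numpy as np

    def nse(obs, sim):
        """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    obs = np.array([120.0, 150.0, 300.0, 280.0, 190.0, 140.0])   # observed flows
    sim = np.array([110.0, 160.0, 270.0, 300.0, 200.0, 150.0])   # simulated flows
    print(round(nse(obs, sim), 3))   # 1.0 would be a perfect simulation
    ```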

  14. Operating Hours Based Inventory Management.

    DTIC Science & Technology

    1986-12-01

    forecasting can be based on expert opinion. The Delphi technique is one such method of forecasting which uses a group of decision makers with a... errors occur. If large amounts of material are procured and warehoused, there could be a greater chance that the material will no longer be needed

  15. An information-theoretical perspective on weighted ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Weijs, Steven V.; van de Giesen, Nick

    2013-08-01

    This paper presents an information-theoretical method for weighting ensemble forecasts with new information. Weighted ensemble forecasts can be used to adjust the distribution that an existing ensemble of time series represents, without modifying the values in the ensemble itself. The weighting can, for example, add new seasonal forecast information to an existing ensemble of historically measured time series that represents climatic uncertainty. A recent article in this journal compared several methods to determine the weights for the ensemble members and introduced the pdf-ratio method. In this article, a new method, the minimum relative entropy update (MRE-update), is presented. Based on the principle of minimum discrimination information, an extension of the principle of maximum entropy (POME), the method ensures that no more information is added to the ensemble than is present in the forecast. This is achieved by minimizing relative entropy, with the forecast information imposed as constraints. From this same perspective, an information-theoretical view on the various weighting methods is presented. The MRE-update is compared with the existing methods and the parallels with the pdf-ratio method are analysed. The paper provides a new, information-theoretical justification for one version of the pdf-ratio method, which turns out to be equivalent to the MRE-update. All other methods result in sets of ensemble weights that, seen from the information-theoretical perspective, add either too little or too much (i.e. fictitious) information to the ensemble.

  16. A perturbative approach for enhancing the performance of time series forecasting.

    PubMed

    de Mattos Neto, Paulo S G; Ferreira, Tiago A E; Lima, Aranildo R; Vasconcelos, Germano C; Cavalcanti, George D C

    2017-04-01

    This paper proposes a method to perform time series prediction based on perturbation theory. The approach is based on continuously adjusting an initial forecasting model to asymptotically approximate a desired time series model. First, a predictive model generates an initial forecast for the time series. Second, a residual time series is calculated as the difference between the original time series and that initial forecast. If the residual series is not white noise, it can be used to improve the accuracy of the initial model, and a new predictive model is adjusted using the residual series. The whole process is repeated until convergence or until the residual series becomes white noise. The output of the method is then given by summing the outputs of all trained predictive models in a perturbative sense. To test the method, an experimental investigation was conducted on six real-world time series. A comparison was made with six other methods tested in the same experiments and with ten other results found in the literature. The results show that not only is the performance of the initial model significantly improved, but the proposed method also outperforms the previously published results.
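
    A minimal sketch of this iterative residual scheme follows, with ordinary least-squares autoregressions standing in for the predictive models (the paper's models and its white-noise stopping test are more elaborate).

    ```python
    import numpy as np

    def fit_ar(series, p=3):
        """Least-squares AR(p); returns in-sample one-step predictions."""
        X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
        design = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(design, series[p:], rcond=None)
        pred = design @ coef
        return np.concatenate([series[:p], pred])   # pad warm-up with actuals

    rng = np.random.default_rng(5)
    t = np.arange(300)
    series = np.sin(0.1 * t) + 0.3 * np.sin(0.37 * t) + rng.normal(0, 0.1, 300)

    total_pred = np.zeros_like(series)
    residual = series.copy()
    for stage in range(3):                  # iterate toward a white-noise residual
        pred = fit_ar(residual)
        total_pred += pred                  # perturbative sum of stage outputs
        residual = residual - pred
        print(f"stage {stage}: residual std = {residual.std():.4f}")

    print("final in-sample MSE:", round(np.mean((series - total_pred) ** 2), 6))
    ```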

  17. Three-Month Real-Time Dengue Forecast Models: An Early Warning System for Outbreak Alerts and Policy Decision Support in Singapore.

    PubMed

    Shi, Yuan; Liu, Xu; Kok, Suet-Yheng; Rajarethinam, Jayanthi; Liang, Shaohong; Yap, Grace; Chong, Chee-Seng; Lee, Kim-Sung; Tan, Sharon S Y; Chin, Christopher Kuan Yew; Lo, Andrew; Kong, Waiming; Ng, Lee Ching; Cook, Alex R

    2016-09-01

    With its tropical rainforest climate, rapid urbanization, and changing demography and ecology, Singapore experiences endemic dengue; the last large outbreak in 2013 culminated in 22,170 cases. In the absence of a vaccine on the market, vector control is the key approach for prevention. We sought to forecast the evolution of dengue epidemics in Singapore to provide early warning of outbreaks and to facilitate the public health response to moderate an impending outbreak. We developed a set of statistical models using least absolute shrinkage and selection operator (LASSO) methods to forecast the weekly incidence of dengue notifications over a 3-month time horizon. This forecasting tool used a variety of data streams and was updated weekly, including recent case data, meteorological data, vector surveillance data, and population-based national statistics. The forecasting methodology was compared with alternative approaches that have been proposed to model dengue case data (seasonal autoregressive integrated moving average and step-down linear regression) by fielding them on the 2013 dengue epidemic, the largest on record in Singapore. Operationally useful forecasts were obtained at a 3-month lag using the LASSO-derived models. Based on the mean average percentage error, the LASSO approach provided more accurate forecasts than the other methods we assessed. We demonstrate its utility in Singapore's dengue control program by providing a forecast of the 2013 outbreak for advance preparation of outbreak response. Statistical models built using machine learning methods such as LASSO have the potential to markedly improve forecasting techniques for recurrent infectious disease outbreaks such as dengue. Shi Y, Liu X, Kok SY, Rajarethinam J, Liang S, Yap G, Chong CS, Lee KS, Tan SS, Chin CK, Lo A, Kong W, Ng LC, Cook AR. 2016. Three-month real-time dengue forecast models: an early warning system for outbreak alerts and policy decision support in Singapore. Environ Health Perspect 124:1369-1375; http://dx.doi.org/10.1289/ehp.1509981.
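
    The core LASSO pattern can be sketched briefly: build lagged case and weather predictors, then fit an L1-penalized regression for a fixed multi-week horizon. The sketch below uses synthetic series as stand-ins for Singapore's surveillance and meteorological data streams; it is not the authors' operational model.

    ```python
    import numpy as np
    from sklearn.linear_model import LassoCV

    rng = np.random.default_rng(11)
    T = 300                                       # weeks of history
    cases = 100 + 50 * np.sin(2 * np.pi * np.arange(T) / 52) + rng.normal(0, 10, T)
    temp = 28 + 2 * np.sin(2 * np.pi * (np.arange(T) - 8) / 52) + rng.normal(0, 0.5, T)

    horizon, n_lags = 12, 8                       # ~3-month-ahead weekly forecast
    rows, targets = [], []
    for t in range(n_lags, T - horizon):
        rows.append(np.concatenate([cases[t - n_lags:t], temp[t - n_lags:t]]))
        targets.append(cases[t + horizon])
    X, y = np.array(rows), np.array(targets)

    model = LassoCV(cv=5).fit(X, y)               # L1 path tuned by cross-validation
    print("selected predictors:", int(np.sum(model.coef_ != 0)), "of", X.shape[1])
    print("12-weeks-ahead forecast:", round(model.predict(X[-1:]).item(), 1))
    ```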

  18. Research on classified real-time flood forecasting framework based on K-means cluster and rough set.

    PubMed

    Xu, Wei; Peng, Yong

    2015-01-01

    This research presents a new classified real-time flood forecasting framework. In this framework, historical floods are classified by a K-means cluster according to the spatial and temporal distribution of precipitation, the time variance of precipitation intensity and other hydrological factors. Based on the classification results, a rough set is used to extract the identification rules for real-time flood forecasting. Then, the parameters of the different categories within the conceptual hydrological model are calibrated using a genetic algorithm. In real-time forecasting, the corresponding category of parameters is selected for flood forecasting according to the obtained flood information. This research tests the new classified framework on the Guanyinge Reservoir and compares the framework with the traditional flood forecasting method, finding that the performance of the new classified framework is significantly better in terms of accuracy. Furthermore, the framework can also be considered for catchments with fewer historical floods.
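
    The classification step can be sketched as follows: cluster historical floods by precipitation descriptors with K-means, then assign an incoming event to the nearest cluster and select the corresponding calibrated parameter set. The descriptors and parameter values below are illustrative placeholders for the genetic-algorithm-calibrated sets described in the abstract.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(2)
    # Descriptors per historical flood: [total precip (mm), peak intensity, duration (h)].
    centers_true = np.array([[80, 12, 10], [150, 30, 6], [60, 8, 20]], dtype=float)
    events = np.vstack([rng.normal(c, [10.0, 2.0, 2.0], size=(30, 3))
                        for c in centers_true])

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(events)

    # One calibrated parameter set per category (placeholders for the GA output).
    params_by_class = {0: {"k": 0.8}, 1: {"k": 1.3}, 2: {"k": 0.5}}

    new_event = np.array([[140.0, 28.0, 7.0]])   # descriptors of the incoming flood
    label = int(km.predict(new_event)[0])
    print("flood category:", label, "-> model parameters:", params_by_class[label])
    ```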

  19. A Canonical Ensemble Correlation Prediction Model for Seasonal Precipitation Anomaly

    NASA Technical Reports Server (NTRS)

    Shen, Samuel S. P.; Lau, William K. M.; Kim, Kyu-Myong; Li, Guilong

    2001-01-01

    This report describes an optimal ensemble forecasting model for seasonal precipitation and its error estimation. Each individual forecast is based on the canonical correlation analysis (CCA) in the spectral spaces whose bases are empirical orthogonal functions (EOF). The optimal weights in the ensemble forecasting crucially depend on the mean square error of each individual forecast. An estimate of the mean square error of a CCA prediction is made also using the spectral method. The error is decomposed onto EOFs of the predictand and decreases linearly according to the correlation between the predictor and predictand. This new CCA model includes the following features: (1) the use of area-factor, (2) the estimation of prediction error, and (3) the optimal ensemble of multiple forecasts. The new CCA model is applied to the seasonal forecasting of the United States precipitation field. The predictor is the sea surface temperature.

  1. Forecasting United States heartworm Dirofilaria immitis prevalence in dogs.

    PubMed

    Bowman, Dwight D; Liu, Yan; McMahan, Christopher S; Nordone, Shila K; Yabsley, Michael J; Lund, Robert B

    2016-10-10

    This paper forecasts next year's canine heartworm prevalence in the United States from 16 climate, geographic and societal factors. The forecast's construction and an assessment of its performance are described. The forecast is based on a spatial-temporal conditional autoregressive model fitted to over 31 million antigen heartworm tests conducted in the 48 contiguous United States during 2011-2015. The forecast uses county-level data on 16 predictive factors, including temperature, precipitation, median household income, local forest and surface water coverage, and presence/absence of eight mosquito species. Non-static factors are extrapolated into the forthcoming year with various statistical methods. The fitted model and factor extrapolations are used to estimate next year's regional prevalence. The correlation between the observed and model-estimated county-by-county heartworm prevalence for the 5-year period 2011-2015 is 0.727, demonstrating reasonable model accuracy. The correlation between 2015 observed and forecasted county-by-county heartworm prevalence is 0.940, demonstrating significant skill and showing that heartworm prevalence can be forecasted reasonably accurately. The forecast presented herein can a priori alert veterinarians to areas expected to see higher than normal heartworm activity. The proposed methods may prove useful for forecasting other diseases.

  2. Does money matter in inflation forecasting?

    NASA Astrophysics Data System (ADS)

    Binner, J. M.; Tino, P.; Tepper, J.; Anderson, R.; Jones, B.; Kendall, G.

    2010-11-01

    This paper provides the most fully comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite-memory predictor. The two methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naïve random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists' long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies.

  3. Geochemical monitoring of drilling fluids; A powerful tool to forecast and detect formation waters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vuataz, F.D.; Brach, M.; Criaud, A.

    1990-06-01

    This paper describes a method for analyzing drilling mud to forecast fluid-producing zones, based on the difference between the chemical compositions of formation fluids and drilling fluids. The method was successfully applied in three boreholes in crystalline rocks in France. Subsequent geophysical logs and hydraulic tests confirmed the occurrence of flowing fractures.

  4. An operational integrated short-term warning solution for solar radiation storms: introducing the Forecasting Solar Particle Events and Flares (FORSPEF) system

    NASA Astrophysics Data System (ADS)

    Anastasiadis, Anastasios; Sandberg, Ingmar; Papaioannou, Athanasios; Georgoulis, Manolis; Tziotziou, Kostas; Jiggens, Piers; Hilgers, Alain

    2015-04-01

    We present a novel integrated prediction system for both solar flares and solar energetic particle (SEP) events, which is in place to provide short-term warnings for hazardous solar radiation storms. The FORSPEF system provides forecasting of solar eruptive events, such as solar flares with a projection to coronal mass ejections (CMEs) (occurrence and velocity), and the likelihood of occurrence of an SEP event. It also provides nowcasting of SEP events based on actual solar flare and CME near real-time alerts, as well as SEP characteristics (peak flux, fluence, rise time, duration) per parent solar event. The prediction of solar flares relies on a morphological method based on the sophisticated derivation of the effective connected magnetic field strength (Beff) of potentially flaring active-region (AR) magnetic configurations, and it utilizes analysis of a large number of AR magnetograms. For the prediction of SEP events, a new reductive statistical method has been implemented based on a newly constructed database of solar flares, CMEs and SEP events that covers the period 1984-2013. The method is based on flare location (longitude), flare size (maximum soft X-ray intensity), and the occurrence (or not) of a CME. Warnings are issued for all > C1.0 soft X-ray flares. The warning time in the forecasting scheme extends to 24 hours with a refresh rate of 3 hours, while the respective warning time for the nowcasting scheme depends on the availability of the near real-time data and falls between 15 and 20 minutes. We discuss the modules of the FORSPEF system, their interconnection and the operational set-up. The dual approach in the development of FORSPEF (i.e., the forecasting and nowcasting schemes) permits the refinement of predictions upon the availability of new data that characterize changes on the Sun and in interplanetary space, while the combined usage of solar flare and SEP forecasting methods makes FORSPEF an integrated forecasting solution. This work has been funded through "FORSPEF: FORecasting Solar Particle Events and Flares", ESA Contract No. 4000109641/13/NL/AK.

  5. Verification of ECMWF System 4 for seasonal hydrological forecasting in a northern climate

    NASA Astrophysics Data System (ADS)

    Bazile, Rachel; Boucher, Marie-Amélie; Perreault, Luc; Leconte, Robert

    2017-11-01

    Hydropower production requires optimal dam and reservoir management to prevent flooding damage and avoid operation losses. In a northern climate, where the spring freshet constitutes the main inflow volume, seasonal forecasts can help to establish a yearly strategy. Long-term hydrological forecasts often rely on past observations of streamflow or meteorological data. Another alternative is to use ensemble meteorological forecasts produced by climate models. In this paper, those produced by the ECMWF (European Centre for Medium-Range Weather Forecasts) System 4 are examined and their bias is characterized. Bias correction, through the linear scaling method, improves the performance of the raw ensemble meteorological forecasts in terms of the continuous ranked probability score (CRPS). Then, three seasonal ensemble hydrological forecasting systems are compared: (1) the climatology of simulated streamflow, (2) ensemble hydrological forecasts based on climatology (ESP) and (3) hydrological forecasts based on bias-corrected ensemble meteorological forecasts from System 4 (corr-DSP). Simulated streamflow computed using observed meteorological data is used as the benchmark. Accounting for initial conditions is valuable even for long-term forecasts. ESP and corr-DSP both outperform the climatology of simulated streamflow for lead times from 1 to 5 months, depending on the season and watershed. Integrating information about future meteorological conditions also improves monthly volume forecasts. For the 1-month lead time, a gain exists for almost all watersheds during winter, summer and fall. However, the performance of spring volume forecasts varies from one watershed to another; for most of them, it is close to the performance of ESP. For longer lead times, the CRPS skill score is mostly in favour of ESP, even if for many watersheds ESP and corr-DSP have comparable skill. Corr-DSP appears quite reliable, but in some cases under-dispersion or bias is observed. A more complex bias-correction method should be investigated to remedy this weakness and take fuller advantage of the ensemble forecasts produced by the climate model. Overall, in this study, bias-corrected ensemble meteorological forecasts appear to be an interesting source of information for hydrological forecasting for lead times up to 1 month, and they could also complement ESP for longer lead times.
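
    Linear scaling itself is a one-line correction: monthly factors are the ratio (for precipitation) or the difference (for temperature) between the observed and hindcast climatologies, applied to each ensemble member. A sketch with synthetic data for a single calendar month:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_years, n_members = 29, 15

    # Hindcast ensemble and observations for one calendar month (e.g. April precip).
    hindcast = rng.gamma(shape=2.0, scale=40.0, size=(n_years, n_members))  # mm
    obs = rng.gamma(shape=2.0, scale=50.0, size=n_years)                    # mm

    # Multiplicative linear scaling for precipitation.
    factor = obs.mean() / hindcast.mean()
    corrected = hindcast * factor

    print("raw mean: %.1f  obs mean: %.1f  corrected mean: %.1f"
          % (hindcast.mean(), obs.mean(), corrected.mean()))
    # For temperature an additive shift would be used instead:
    #   corrected = hindcast + (obs.mean() - hindcast.mean())
    ```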

  6. Approaches in Health Human Resource Forecasting: A Roadmap for Improvement

    PubMed Central

    Rafiei, Sima; Mohebbifar, Rafat; Hashemi, Fariba; Ezzatabadi, Mohammad Ranjbar; Farzianpour, Fereshteh

    2016-01-01

    Introduction Forecasting the demand and supply of health manpower in an accurate manner makes appropriate planning possible. The aim of this paper was to review approaches and methods for health manpower forecasting and consequently to propose features that improve the effectiveness of this important process of health manpower planning. Methods A literature review was conducted for studies published in English from 1990-2014 using the PubMed, ScienceDirect, ProQuest, and Google Scholar databases. Review articles, qualitative studies, and retrospective and prospective studies describing or applying various types of forecasting approaches and methods in health manpower forecasting were included in the review. The authors designed an extraction data sheet based on the study questions to collect data on the studies' references, designs, and types of forecasting approaches, whether discussed or applied, with their strengths and weaknesses. Results Forty studies were included in the review. As a result, two main categories of approaches (conceptual and analytical) for health manpower forecasting were identified. Each approach had several strengths and weaknesses. As a whole, most of them faced challenges such as being static and unable to capture dynamic variables and causal relationships in manpower forecasting. They also lacked the capacity to benefit from scenario making to assist policy makers in effective decision making. Conclusions An effective forecasting approach is expected to resolve the deficits that exist in current approaches and meet the key features found in the literature, in order to develop the open, dynamic and comprehensive method necessary for today's complex health care systems. PMID:27790343

  7. Application of a hybrid method combining grey model and back propagation artificial neural networks to forecast hepatitis B in china.

    PubMed

    Gan, Ruijing; Chen, Xiaojun; Yan, Yu; Huang, Daizheng

    2015-01-01

    Accurate incidence forecasting of infectious disease provides potentially valuable insights in its own right. It is critical for early prevention and may contribute to health services management and syndromic surveillance. This study investigates the use of a hybrid algorithm combining a grey model (GM) and back propagation artificial neural networks (BP-ANN) to forecast hepatitis B in China based on the yearly numbers of hepatitis B cases, and evaluates the method's feasibility. The results showed that the proposed method has advantages over GM (1, 1) and GM (2, 1) in all the evaluation indexes.
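
    The grey half of the hybrid is compact enough to sketch in full; the BP-ANN residual correction is omitted here, and the yearly counts below are illustrative, not the study's data.

    ```python
    import numpy as np

    def gm11_forecast(x0, n_ahead=3):
        """Grey model GM(1,1): fit on series x0, return fitted + n_ahead forecasts."""
        x0 = np.asarray(x0, dtype=float)
        x1 = np.cumsum(x0)                                  # accumulated series
        z1 = 0.5 * (x1[1:] + x1[:-1])                       # mean sequence
        B = np.column_stack([-z1, np.ones(len(z1))])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]    # grey parameters
        k = np.arange(len(x0) + n_ahead)
        x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # time response function
        return np.concatenate([[x0[0]], np.diff(x1_hat)])   # back to original scale

    # Illustrative yearly case counts (in 10,000s), not the study's data.
    cases = [95.1, 102.3, 108.7, 112.0, 118.9, 121.4]
    print(np.round(gm11_forecast(cases), 1))
    ```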

  8. Towards an improved ensemble precipitation forecast: A probabilistic post-processing approach

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Moradkhani, Hamid

    2017-03-01

    Recently, ensemble post-processing (EPP) has become a commonly used approach for reducing the uncertainty in forcing data and hence in hydrologic simulation. The procedure was introduced to build ensemble precipitation forecasts based on the statistical relationship between observations and forecasts. More specifically, the approach relies on a transfer function that is developed from a bivariate joint distribution between the observations and the simulations in the historical period; the transfer function is then used to post-process the forecast. In this study, we propose a Bayesian EPP approach based on copula functions (COP-EPP) to improve the reliability of ensemble precipitation forecasts. Evaluation of the copula-based method is carried out by comparing the performance of the generated ensemble precipitation with the outputs from an existing procedure, i.e., the mixed-type meta-Gaussian distribution. Monthly precipitation from the Climate Forecast System Reanalysis (CFS) and gridded observations from the Parameter-Elevation Relationships on Independent Slopes Model (PRISM) have been employed to generate the post-processed ensemble precipitation. Deterministic and probabilistic verification frameworks are utilized to evaluate the outputs from the proposed technique. The distribution of seasonal precipitation for the ensemble generated by the copula-based technique is compared to the observations and raw forecasts for three sub-basins located in the western United States. Results show that both techniques are successful in producing reliable and unbiased ensemble forecasts; however, COP-EPP demonstrates considerable improvement in the ensemble forecast in both deterministic and probabilistic verification, in particular in characterizing extreme events in wet seasons.

  9. Application research for 4D technology in flood forecasting and evaluation

    NASA Astrophysics Data System (ADS)

    Li, Ziwei; Liu, Yutong; Cao, Hongjie

    1998-08-01

    In order to monitor the regions of China where disastrous floods happen frequently, satisfy the great need of provincial governments for high-accuracy disaster monitoring and evaluation data, and improve the efficiency of disaster mitigation, a method for flood forecasting and evaluation using satellite and aerial remotely sensed imagery together with ground monitoring data was researched under the Ninth Five-Year National Key Technologies Programme. An effective and practicable flood forecasting and evaluation system was established, with DongTing Lake selected as the test site. Modern digital photogrammetry, remote sensing and GIS technologies are used in this system, so that disastrous floods can be forecasted and losses evaluated based on a '4D' (DEM -- Digital Elevation Model, DOQ -- Digital Orthophoto Quads, DRG -- Digital Raster Graph, DTI -- Digital Thematic Information) disaster background database. The technology for gathering and establishing the '4D' disaster environment background database, the application technology for flood forecasting and evaluation based on the '4D' background data, and the experimental results for the DongTing Lake test site are introduced in detail in this paper.

  10. Comparative study of four time series methods in forecasting typhoid fever incidence in China.

    PubMed

    Zhang, Xingyu; Liu, Yuanyuan; Yang, Min; Zhang, Tao; Young, Alistair A; Li, Xiaosong

    2013-01-01

    Accurate incidence forecasting of infectious disease is critical for early prevention and for better government strategic planning. In this paper, we present a comprehensive study of different forecasting methods based on the monthly incidence of typhoid fever. The seasonal autoregressive integrated moving average (SARIMA) model and three models inspired by neural networks, namely back propagation neural networks (BPNN), radial basis function neural networks (RBFNN), and Elman recurrent neural networks (ERNN), were compared. The differences, as well as the advantages and disadvantages, among the SARIMA model and the neural networks were summarized and discussed. The data obtained for 2005 to 2009 and for 2010 from the Chinese Center for Disease Control and Prevention were used as modeling and forecasting samples, respectively. The performances were evaluated based on three metrics: mean absolute error (MAE), mean absolute percentage error (MAPE), and mean square error (MSE). The results showed that RBFNN obtained the smallest MAE, MAPE and MSE in both the modeling and forecasting processes. The performances of the four models, ranked in descending order, were: RBFNN, ERNN, BPNN and the SARIMA model.
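
    The SARIMA baseline can be sketched with statsmodels as below; the orders and the synthetic monthly series are illustrative assumptions, and the three neural network models would require a separate framework.

    ```python
    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(6)
    months = 72
    # Synthetic monthly incidence with yearly seasonality (stand-in for the data).
    y = 50 + 15 * np.sin(2 * np.pi * np.arange(months) / 12) + rng.normal(0, 3, months)

    train, test = y[:60], y[60:]
    model = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12)).fit(disp=False)
    fc = model.forecast(steps=len(test))

    mae = np.abs(test - fc).mean()
    mape = 100 * np.abs((test - fc) / test).mean()
    mse = np.mean((test - fc) ** 2)
    print(f"MAE={mae:.2f}  MAPE={mape:.2f}%  MSE={mse:.2f}")
    ```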

  12. Recent Achievements of the Collaboratory for the Study of Earthquake Predictability

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; Liukis, M.; Werner, M. J.; Schorlemmer, D.; Yu, J.; Maechling, P. J.; Jackson, D. D.; Rhoades, D. A.; Zechar, J. D.; Marzocchi, W.

    2016-12-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) supports a global program to conduct prospective earthquake forecasting experiments. CSEP testing centers are now operational in California, New Zealand, Japan, China, and Europe with 442 models under evaluation. The California testing center, started by SCEC on Sept 1, 2007, currently hosts 30-minute, 1-day, 3-month, 1-year and 5-year forecasts, both alarm-based and probabilistic, for California, the Western Pacific, and worldwide. Our tests are now based on the hypocentral locations and magnitudes of cataloged earthquakes, but we plan to test focal mechanisms, seismic hazard models, ground motion forecasts, and finite rupture forecasts as well. We have increased computational efficiency for high-resolution global experiments, such as the evaluation of the Global Earthquake Activity Rate (GEAR) model, introduced Bayesian ensemble models, and implemented support for non-Poissonian simulation-based forecast models. We are currently developing formats and procedures to evaluate externally hosted forecasts and predictions. CSEP supports the USGS program in operational earthquake forecasting and a DHS project to register and test external forecast procedures from experts outside seismology. We found that earthquakes as small as magnitude 2.5 provide important information on subsequent earthquakes larger than magnitude 5. A retrospective experiment for the 2010-2012 Canterbury earthquake sequence showed that some physics-based and hybrid models outperform catalog-based (e.g., ETAS) models. This experiment also demonstrates the ability of the CSEP infrastructure to support retrospective forecast testing. Current CSEP development activities include adoption of the Comprehensive Earthquake Catalog (ComCat) as an authorized data source, retrospective testing of simulation-based forecasts, and support for additive ensemble methods. We describe the open-source CSEP software that is available to researchers as they develop their forecast models. We also discuss how CSEP procedures are being adapted to intensity and ground motion prediction experiments as well as hazard model testing.

  13. A Dirichlet process model for classifying and forecasting epidemic curves.

    PubMed

    Nsoesie, Elaine O; Leman, Scotland C; Marathe, Madhav V

    2014-01-09

    A forecast can be defined as an endeavor to quantitatively estimate a future event or probabilities assigned to a future occurrence. Forecasting stochastic processes such as epidemics is challenging since there are several biological, behavioral, and environmental factors that influence the number of cases observed at each point during an epidemic. However, accurate forecasts of epidemics would support the timely and effective implementation of public health interventions. In this study, we introduce a Dirichlet process (DP) model for classifying and forecasting influenza epidemic curves. The DP model is a nonparametric Bayesian approach that enables the matching of current influenza activity to simulated and historical patterns, identifies epidemic curves different from those observed in the past and enables prediction of the expected epidemic peak time. The method was validated using simulated influenza epidemics from an individual-based model and the accuracy was compared to that of the tree-based classification technique, Random Forest (RF), which has been shown to achieve high accuracy in the early prediction of epidemic curves using a classification approach. We also applied the method to forecasting influenza outbreaks in the United States from 1997-2013 using influenza-like illness (ILI) data from the Centers for Disease Control and Prevention (CDC). We made the following observations. First, the DP model performed as well as RF in identifying several of the simulated epidemics. Second, the DP model correctly forecasted the peak time several days in advance for most of the simulated epidemics. Third, the accuracy of identifying epidemics different from those already observed improved with additional data, as expected. Fourth, both methods correctly classified epidemics with higher reproduction numbers (R) with a higher accuracy compared to epidemics with lower R values. Lastly, in the classification of seasonal influenza epidemics based on ILI data from the CDC, the methods' performance was comparable. Although RF requires less computational time compared to the DP model, the algorithm is fully supervised implying that epidemic curves different from those previously observed will always be misclassified. In contrast, the DP model can be unsupervised, semi-supervised or fully supervised. Since both methods have their relative merits, an approach that uses both RF and the DP model could be beneficial.
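
    The DP classification step can be approximated with off-the-shelf tools. The sketch below is not the authors' implementation: it uses scikit-learn's BayesianGaussianMixture, whose truncated stick-breaking prior approximates a DP mixture, to cluster simulated epidemic curves by simple summary features (peak week, peak height, total size) that stand in for whatever representation the paper actually uses.

```python
# A minimal sketch (not the authors' implementation): clustering epidemic
# curves with a truncated Dirichlet-process mixture via scikit-learn.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(42)

# Simulate 60 epidemic curves (weekly case counts) with 3 latent shapes.
weeks = np.arange(30)
curves = []
for _ in range(60):
    peak = rng.choice([8, 14, 20])            # latent peak-week groups
    shape = np.exp(-0.5 * ((weeks - peak) / 3.0) ** 2)
    curves.append(shape * rng.uniform(200, 400) + rng.normal(0, 5, 30))
curves = np.array(curves)

# Illustrative summaries used as clustering features.
features = np.column_stack([
    curves.argmax(axis=1),    # peak week
    curves.max(axis=1),       # peak height
    curves.sum(axis=1),       # total epidemic size
])

# Truncated stick-breaking approximation to a DP mixture: up to 10
# components are available, but unused ones are switched off.
dp = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(features)

labels = dp.predict(features)
print("components actually used:", np.unique(labels).size)
```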

  14. Prediction of welding shrinkage deformation of bridge steel box girder based on wavelet neural network

    NASA Astrophysics Data System (ADS)

    Tao, Yulong; Miao, Yunshui; Han, Jiaqi; Yan, Feiyun

    2018-05-01

    To address the low accuracy of traditional forecasting methods such as linear regression, this paper presents a wavelet neural network method for predicting the welding shrinkage deformation of bridge steel box girders. Compared with traditional forecasting methods, this scheme has better local characteristics and learning ability, which greatly improves the prediction of deformation. A case study shows that the wavelet neural network yields higher prediction accuracy for girder deformation than traditional methods, outperforms the BP neural network, and meets the practical demands of engineering design.

  15. Ensemble-based methods for forecasting census in hospital units.

    PubMed

    Koestler, Devin C; Ombao, Hernando; Bender, Jesse

    2013-05-30

    The ability to accurately forecast census counts in hospital departments has considerable implications for hospital resource allocation. In recent years, several different methods have been proposed for forecasting census counts; however, many of these approaches do not use available patient-specific information. In this paper we present an ensemble-based methodology for forecasting the census under a framework that simultaneously incorporates both (i) arrival trends over time and (ii) patient-specific baseline and time-varying information. The proposed model for predicting census has three components, namely: current census count, number of daily arrivals and number of daily departures. To model the number of daily arrivals, we use a seasonality adjusted Poisson Autoregressive (PAR) model where the parameter estimates are obtained via conditional maximum likelihood. The number of daily departures is predicted by modeling the probability of departure from the census using logistic regression models that are adjusted for the amount of time spent in the census and incorporate both patient-specific baseline and time-varying covariate information. We illustrate our approach using neonatal intensive care unit (NICU) data collected at Women & Infants Hospital, Providence RI, which consists of 1001 consecutive NICU admissions between April 1st 2008 and March 31st 2009. Our results demonstrate statistically significant improved prediction accuracy for 3, 5, and 7 day census forecasts and increased precision of our forecasting model compared to a forecasting approach that ignores patient-specific information. Forecasting models that utilize patient-specific baseline and time-varying information make the most of data typically available and have the capacity to substantially improve census forecasts.
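
    Of the three components, the arrivals model is the easiest to sketch. The following is a hedged illustration, not the authors' exact specification: a seasonality-adjusted Poisson autoregression fit by maximum likelihood as a Poisson GLM with a lagged-count term and annual harmonics, on simulated data.

```python
# Hedged sketch of a seasonality-adjusted Poisson autoregression (PAR-like)
# model for daily arrivals; the paper's exact specification may differ.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 365
day = np.arange(n)
season = 1.5 + 0.4 * np.sin(2 * np.pi * day / 365)
arrivals = rng.poisson(np.exp(season))            # simulated daily arrivals

lag1 = arrivals[:-1]
y = arrivals[1:]
X = np.column_stack([
    np.log1p(lag1),                               # autoregressive term
    np.sin(2 * np.pi * day[1:] / 365),            # annual seasonality
    np.cos(2 * np.pi * day[1:] / 365),
])
X = sm.add_constant(X)

par = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(par.params)

# One-step-ahead forecast of tomorrow's expected arrivals.
x_new = np.array([[1.0,
                   np.log1p(arrivals[-1]),
                   np.sin(2 * np.pi * n / 365),
                   np.cos(2 * np.pi * n / 365)]])
print("expected arrivals tomorrow:", float(par.predict(x_new)[0]))
```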

  16. Staged decision making based on probabilistic forecasting

    NASA Astrophysics Data System (ADS)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. One technique of decision support is the cost-loss approach, which defines whether or not to issue a warning or implement mitigation measures (a risk-based method). Under the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it motivates decisions based only on economic values and is relatively static (no reasoning, just a yes/no decision). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method more applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty: from known uncertainty to deep uncertainty. Based on the types of uncertainties, concepts for dealing with such situations were analysed and possible applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with bigger consequences all at once, the idea is that actions and decisions are cut up into smaller pieces, and the final decision to implement is made based on the economic costs of decisions and measures and the reduced effect of flooding. The more lead time there is in flood event management, the more damage can be reduced. With decisions based on probabilistic forecasts, partial decisions can be made earlier in time (at a lower probability) and can be scaled up or down later when there is more certainty about whether the event will take place. Partial decisions are often cheaper, or shorten the final mitigation time at the moment when there is more certainty. The proposed method is tested on Stonehaven, on the Carron River in Scotland. Decisions to deploy demountable defences in the town are currently made at very short lead times due to the absence of certainty. Application showed that staged decision making is possible and gives the decision maker more time to respond to a situation. The decision maker is able to take a lower-regret decision under higher uncertainty with fewer related negative consequences. Although it is not possible to quantify intangible effects, reducing these effects is part of the analysis. Overall, the proposed approach has shown to be a possible improvement in economic terms and opens up possibilities for more flexible and robust decision making.
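
    The cost-loss rule itself fits in a few lines. A minimal sketch of the decision criterion described above (the numbers are illustrative, not the Stonehaven case):

```python
# Minimal illustration of the cost-loss rule: issue a warning when the
# cost/loss ratio does not exceed the forecast flood probability.
def issue_warning(cost: float, loss: float, prob_flood: float) -> bool:
    """Return True if mitigation is worthwhile under the cost-loss rule."""
    return (cost / loss) <= prob_flood

# Example: mitigation costs 20k and prevents 200k of damage, so the
# break-even probability is 0.1.
print(issue_warning(cost=20_000, loss=200_000, prob_flood=0.25))  # True
print(issue_warning(cost=20_000, loss=200_000, prob_flood=0.05))  # False
```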

  17. Real-time forecasting of an epidemic using a discrete time stochastic model: a case study of pandemic influenza (H1N1-2009)

    PubMed Central

    2011-01-01

    Background Real-time forecasting of epidemics, especially forecasting based on a likelihood approach, is understudied. This study aimed to develop a simple method that can be used for real-time epidemic forecasting. Methods A discrete time stochastic model, accounting for demographic stochasticity and conditional measurement, was developed and applied as a case study to the weekly incidence of pandemic influenza (H1N1-2009) in Japan. By imposing a branching process approximation and by assuming the linear growth of cases within each reporting interval, the epidemic curve is predicted using only two parameters. The uncertainty bounds of the forecasts are computed using chains of conditional offspring distributions. Results The quality of the forecasts made before the epidemic peak appears largely to depend on obtaining valid parameter estimates. The forecasts of both weekly incidence and final epidemic size greatly improved at and after the epidemic peak with all the observed data points falling within the uncertainty bounds. Conclusions Real-time forecasting using the discrete time stochastic model with its simple computation of the uncertainty bounds was successful. Because of the simplistic model structure, the proposed model has the potential to additionally account for various types of heterogeneity, time-dependent transmission dynamics and epidemiological details. The impact of such complexities on forecasting should be explored when the data become available as part of disease surveillance. PMID:21324153

  18. Application of new methods based on ECMWF ensemble model for predicting severe convective weather situations

    NASA Astrophysics Data System (ADS)

    Lazar, Dora; Ihasz, Istvan

    2013-04-01

    Short- and medium-range operational forecasting of severe weather, including warnings and alarms, is one of the most important activities of the Hungarian Meteorological Service. Our study provides a comprehensive summary of newly developed methods based on ECMWF ensemble forecasts to assist successful prediction of convective weather situations. In the first part of the study, a brief overview is given of the components of atmospheric convection: atmospheric lifting, convergence and vertical wind shear. Atmospheric instability is often characterized by so-called instability indices; one of the most popular and frequently used is the convective available potential energy. Severe convective events, such as intense storms, supercells and tornadoes, require vertical instability, adequate moisture and vertical wind shear. As a first step, various statistical analyses of these three parameters were performed on a nine-year time series of the 51-member ensemble forecasting model for the convective summer period. The relationship between the ratio of convective to total precipitation and the above three parameters was studied with different statistical methods. Four visualization methods were applied to support successful forecasting of severe weather. Two of the four, the ensemble meteogram and the ensemble vertical profiles, had been available at the beginning of our work; both show probabilities of the meteorological parameters for a selected location. Additionally, two new methods have been developed. The first provides a probability map of the event exceeding predefined values, so the spatial uncertainty of an incident is well characterized. Because convective weather events often occur sporadically in space rather than exactly where expected, the ability to select the event area means the ensemble forecasts give very good support. The other new visualization tool shows the time evolution of exceedances of multiple predefined thresholds in graphical form for any selected location. With this tool, the severity of dangerous weather conditions can be well estimated, and intensive convective periods are clearly marked during the forecast period. Development was done with the MAGICS++ software under the UNIX operating system. In the third part of the study, the usefulness of these tools is demonstrated in three interesting case studies from last summer.

  19. Sensitivity of WRF precipitation field to assimilation sources in northeastern Spain

    NASA Astrophysics Data System (ADS)

    Lorenzana, Jesús; Merino, Andrés; García-Ortega, Eduardo; Fernández-González, Sergio; Gascón, Estíbaliz; Hermida, Lucía; Sánchez, José Luis; López, Laura; Marcos, José Luis

    2015-04-01

    Numerical weather prediction (NWP) of precipitation is a challenge. Models predict precipitation after solving many physical processes. In particular, mesoscale NWP models have different parameterizations, such as microphysics, cumulus or radiation schemes. These facilitate, according to required spatial and temporal resolutions, precipitation fields with increasing reliability. Nevertheless, large uncertainties are inherent to precipitation forecasting. Consequently, assimilation methods are very important. The Atmospheric Physics Group at the University of León in Spain and the Castile and León Supercomputing Center carry out daily weather prediction based on the Weather Research and Forecasting (WRF) model, covering the entire Iberian Peninsula. Forecasts of severe precipitation affecting the Ebro Valley, in the southern Pyrenees range of northeastern Spain, are crucial in the decision-making process for managing reservoirs or initializing runoff models. These actions can avert floods and ensure uninterrupted economic activity in the area. We investigated a set of cases corresponding to intense or severe precipitation patterns, using a rain gauge network. Simulations were performed with a dual objective, i.e., to analyze forecast improvement using a specific assimilation method, and to study the sensitivity of model outputs to different types of assimilation data. A WRF forecast model initialized by an NCEP SST analysis was used as the control run. The assimilation was based on the Meteorological Assimilation Data Ingest System (MADIS) developed by NOAA. The MADIS data used were METAR, maritime, ACARS, radiosonde, and satellite products. The results show forecast improvement using the suggested assimilation method, and differences in the accuracy of forecast precipitation patterns varied with the assimilation data source.

  20. Data assimilation method based on the constraints of confidence region

    NASA Astrophysics Data System (ADS)

    Li, Yong; Li, Siming; Sheng, Yao; Wang, Luheng

    2018-03-01

    The ensemble Kalman filter (EnKF) is a distinguished data assimilation method that is widely used and studied in various fields including meteorology and oceanography. However, due to the limited sample size or imprecise dynamics model, it is usually easy for the forecast error variance to be underestimated, which further leads to the phenomenon of filter divergence. Additionally, the assimilation results of the initial stage are poor if the initial condition settings differ greatly from the true initial state. To address these problems, the variance inflation procedure is usually adopted. In this paper, we propose a new method based on the constraints of a confidence region constructed by the observations, called EnCR, to estimate the inflation parameter of the forecast error variance of the EnKF method. In the new method, the state estimate is more robust to both inaccurate forecast models and initial condition settings. The new method is compared with other adaptive data assimilation methods in the Lorenz-63 and Lorenz-96 models under various model parameter settings. The simulation results show that the new method performs better than the competing methods.
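
    The confidence-region machinery of EnCR is specific to the paper, but the inflation step it tunes is standard. A toy sketch of multiplicative ensemble inflation, with an assumed fixed factor standing in for the paper's adaptively estimated one:

```python
# Standard multiplicative inflation of ensemble perturbations; EnCR-style
# methods estimate rho adaptively, here it is a fixed illustrative value.
import numpy as np

def inflate(ensemble: np.ndarray, rho: float) -> np.ndarray:
    """Inflate ensemble perturbations about the mean by a factor rho >= 1."""
    mean = ensemble.mean(axis=0)
    return mean + rho * (ensemble - mean)

rng = np.random.default_rng(0)
ens = rng.normal(loc=1.0, scale=0.2, size=(50, 3))  # 50 members, 3 state vars
print("spread before:", ens.std(axis=0))
print("spread after :", inflate(ens, rho=1.15).std(axis=0))
```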

  1. Deep Learning Based Solar Flare Forecasting Model. I. Results for Line-of-sight Magnetograms

    NASA Astrophysics Data System (ADS)

    Huang, Xin; Wang, Huaning; Xu, Long; Liu, Jinfu; Li, Rong; Dai, Xinghua

    2018-03-01

    Solar flares originate from the release of the energy stored in the magnetic field of solar active regions; the triggering mechanism for these flares, however, remains unknown. For this reason, conventional solar flare forecasting is essentially based on the statistical relationship between solar flares and measures extracted from observational data. In the current work, the deep learning method is applied to set up the solar flare forecasting model, in which forecasting patterns can be learned from line-of-sight magnetograms of solar active regions. In order to obtain a large amount of observational data to train the forecasting model and test its performance, a data set is created from line-of-sight magnetograms of active regions observed by SOHO/MDI and SDO/HMI from 1996 April to 2015 October and corresponding soft X-ray solar flares observed by GOES. The testing results of the forecasting model indicate that (1) the forecasting patterns can be automatically learned from the MDI data and they can also be applied to the HMI data; furthermore, these forecasting patterns are robust to the noise in the observational data; (2) the performance of the deep learning forecasting model is not sensitive to the given forecasting periods (6, 12, 24, or 48 hr); (3) the performance of the proposed forecasting model is comparable to that of the state-of-the-art flare forecasting models, even if the duration of the total magnetograms continuously spans 19.5 years. Case analyses demonstrate that the deep learning based solar flare forecasting model pays attention to areas with the magnetic polarity-inversion line or the strong magnetic field in magnetograms of active regions.

  2. A temperature match based optimization method for daily load prediction considering DLC effect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Z.

    This paper presents a unique optimization method for short term load forecasting. The new method is based on the optimal template temperature match between the future and past temperatures. The optimal error reduction technique is a new concept introduced in this paper. Two case studies show that for hourly load forecasting, this method can yield results as good as the rather complicated Box-Jenkins Transfer Function method, and better than the Box-Jenkins method; for peak load prediction, this method is comparable in accuracy to the neural network method with back propagation, and can produce more accurate results than the multi-linear regression method. The DLC effect on system load is also considered in this method.

  3. Metrics for Evaluating the Accuracy of Solar Power Forecasting: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, J.; Hodge, B. M.; Florita, A.

    2013-10-01

    Forecasting solar energy generation is a challenging task due to the variety of solar power systems and weather regimes encountered. Forecast inaccuracies can result in substantial economic losses and power system reliability issues. This paper presents a suite of generally applicable and value-based metrics for solar forecasting for a comprehensive set of scenarios (i.e., different time horizons, geographic locations, applications, etc.). In addition, a comprehensive framework is developed to analyze the sensitivity of the proposed metrics to three types of solar forecasting improvements using a design of experiments methodology, in conjunction with response surface and sensitivity analysis methods. The results show that the developed metrics can efficiently evaluate the quality of solar forecasts, and assess the economic and reliability impact of improved solar forecasting.
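
    For orientation, a few common members of such a metric suite are easy to compute; the set below is illustrative rather than the paper's exact list:

```python
# Illustrative solar forecast accuracy metrics on toy irradiance data.
import numpy as np

def solar_forecast_metrics(obs: np.ndarray, fc: np.ndarray) -> dict:
    err = fc - obs
    return {
        "MBE": err.mean(),                    # mean bias error
        "MAE": np.abs(err).mean(),            # mean absolute error
        "RMSE": np.sqrt((err ** 2).mean()),   # root mean square error
        # Skill relative to a one-step persistence forecast.
        "skill_vs_persistence": 1 - np.sqrt((err[1:] ** 2).mean())
            / np.sqrt(((obs[1:] - obs[:-1]) ** 2).mean()),
    }

obs = np.array([0., 120., 340., 510., 480., 300., 90.])   # W/m^2, toy data
fc  = np.array([0., 100., 360., 495., 500., 280., 70.])
print(solar_forecast_metrics(obs, fc))
```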

  4. A New Approach in Generating Meteorological Forecasts for Ensemble Streamflow Forecasting using Multivariate Functions

    NASA Astrophysics Data System (ADS)

    Khajehei, S.; Madadgar, S.; Moradkhani, H.

    2014-12-01

    The reliability and accuracy of hydrological predictions are subject to various sources of uncertainty, including meteorological forcing, initial conditions, model parameters and model structure. To reduce the total uncertainty in hydrological applications, one approach is to reduce the uncertainty in meteorological forcing by using statistical methods based on conditional probability density functions (pdfs). However, one of the requirements of current methods is to assume a Gaussian distribution for the marginal distributions of the observed and modeled meteorology. Here we propose a Bayesian approach based on copula functions to develop the conditional distribution of precipitation forecasts needed for driving a hydrologic model for a sub-basin in the Columbia River Basin. Copula functions are introduced as an alternative approach for capturing the uncertainties related to meteorological forcing. Copulas are multivariate joint distributions built from univariate marginal distributions, capable of modeling the joint behavior of variables with any level of correlation and dependency. The method is applied to the monthly CPC forecast at 0.25x0.25 degree resolution to reproduce the PRISM dataset over 1970-2000. Results are compared with the Ensemble Pre-Processor approach, a common procedure used by National Weather Service River Forecast Centers, in reproducing observed climatology during a ten-year verification period (2000-2010).
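
    As a rough sketch of the copula idea, and not the authors' Bayesian procedure, the following fits a Gaussian copula between synthetic forecast and observation series and reads off the conditional distribution of the observation given a new forecast; the marginals, the data, and the choice of a Gaussian copula are all assumptions for illustration.

```python
# Hedged sketch: Gaussian copula between forecast and observation, then the
# conditional observation distribution given a new forecast value.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Toy training data: correlated, non-Gaussian forecast/observation pairs.
z = rng.multivariate_normal([0, 0], [[1, .8], [.8, 1]], size=1000)
fcst = stats.gamma(2).ppf(stats.norm.cdf(z[:, 0]))   # gamma-marginal forecast
obsv = stats.gamma(3).ppf(stats.norm.cdf(z[:, 1]))   # gamma-marginal obs

def to_normal_scores(x):
    ranks = stats.rankdata(x) / (len(x) + 1)         # empirical CDF in (0,1)
    return stats.norm.ppf(ranks)

zf, zo = to_normal_scores(fcst), to_normal_scores(obsv)
rho = np.corrcoef(zf, zo)[0, 1]                      # copula correlation

# Conditional Gaussian copula: z_obs | z_fcst ~ N(rho*z_fcst, 1 - rho^2).
new_fcst = 4.0
z_new = stats.norm.ppf(stats.percentileofscore(fcst, new_fcst) / 100)
cond = stats.norm(rho * z_new, np.sqrt(1 - rho ** 2))

# Map conditional normal quantiles back through the empirical obs marginal.
qs = cond.ppf([0.1, 0.5, 0.9])
obs_quantiles = np.quantile(obsv, stats.norm.cdf(qs))
print("conditional 10/50/90% obs given forecast=4.0:", obs_quantiles)
```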

  5. Winter precipitation forecast in the European and Mediterranean regions using cluster analysis

    NASA Astrophysics Data System (ADS)

    Molnos, S.

    2017-12-01

    The European and Mediterranean climates are sensitive to the large-scale circulation of the atmosphere and ocean, making it difficult to forecast precipitation or temperature on seasonal time scales. In addition, the Mediterranean region has been identified as a hotspot for climate change, and a drying of the region is already observed today. Thus, it is critically important to predict seasonal droughts as early as possible so that water managers and stakeholders can mitigate impacts. We developed a novel cluster-based forecast method to empirically predict winter precipitation anomalies in the European and Mediterranean regions using precursors in autumn. This approach utilizes not only the amplitude but also the pattern of the precursors in generating the forecast. Using a toy model, we show that it achieves better forecast skill than more traditional regression models. Furthermore, we compare our algorithm with dynamical forecast models, demonstrating that our prediction method performs better in terms of time and pattern correlation in the Mediterranean and European regions.

  6. Approaches in Health Human Resource Forecasting: A Roadmap for Improvement.

    PubMed

    Rafiei, Sima; Mohebbifar, Rafat; Hashemi, Fariba; Ezzatabadi, Mohammad Ranjbar; Farzianpour, Fereshteh

    2016-09-01

    Forecasting the demand and supply of health manpower in an accurate manner makes appropriate planning possible. The aim of this paper was to review approaches and methods for health manpower forecasting and consequently propose the features that improve the effectiveness of this important process of health manpower planning. A literature review was conducted for studies published in English from 1990-2014 using the PubMed, Science Direct, ProQuest, and Google Scholar databases. Review articles, qualitative studies, and retrospective and prospective studies describing or applying various types of forecasting approaches and methods in health manpower forecasting were included in the review. The authors designed an extraction data sheet based on the study questions to collect data on studies' references, designs, and types of forecasting approaches, whether discussed or applied, with their strengths and weaknesses. Forty studies were included in the review. As a result, two main categories of approaches (conceptual and analytical) for health manpower forecasting were identified. Each approach had several strengths and weaknesses. As a whole, most of them faced challenges such as being static and unable to capture dynamic variables and causal relationships in manpower forecasting. They also lacked the capacity to benefit from scenario making to assist policy makers in effective decision making. An effective forecasting approach is supposed to resolve all the deficits that exist in current approaches and meet the key features found in the literature, in order to develop the open, dynamic, and comprehensive method necessary for today's complex health care systems.

  7. Performance of the 'material Failure Forecast Method' in real-time situations: A Bayesian approach applied on effusive and explosive eruptions

    NASA Astrophysics Data System (ADS)

    Boué, A.; Lesage, P.; Cortés, G.; Valette, B.; Reyes-Dávila, G.; Arámbula-Mendoza, R.; Budi-Santoso, A.

    2016-11-01

    Most attempts at deterministic eruption forecasting are based on the material Failure Forecast Method (FFM). This method assumes that a precursory observable, such as the rate of seismic activity, can be described by a simple power law which presents a singularity at a time close to the eruption onset. Until now, this method has been applied only in a small number of cases, generally for forecasts in hindsight. In this paper, a rigorous Bayesian approach to the FFM designed for real-time applications is applied. Using an automatic recognition system, seismo-volcanic events are detected and classified according to their physical mechanism, and time series of probability distributions of the rates of events are calculated. At each time of observation, a Bayesian inversion provides estimations of the exponent of the power law and of the time of eruption, together with their probability density functions. Two criteria are defined in order to evaluate the quality and reliability of the forecasts. Our automated procedure has allowed the analysis of long, continuous seismic time series: 13 years from Volcán de Colima, Mexico, 10 years from Piton de la Fournaise, Reunion Island, France, and several months from Merapi volcano, Java, Indonesia. The new forecasting approach has been applied to 64 pre-eruptive sequences which present various types of dominant seismic activity (volcano-tectonic or long-period events) and patterns of seismicity with different levels of complexity. This has allowed us to test the FFM assumptions, to determine in which conditions the method can be applied, and to quantify the success rate of the forecasts. 62% of the precursory sequences analysed are suitable for the application of FFM and half of the total number of eruptions are successfully forecast in hindsight. In real-time, the method allows for the successful forecast of 36% of all the eruptions considered. Nevertheless, real-time forecasts are successful for 83% of the cases that fulfil the reliability criteria. Therefore, good confidence in the method is obtained when the reliability criteria are met.
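
    The paper's contribution is the Bayesian inversion and the reliability criteria; the underlying FFM idea, however, can be shown in a few lines. In the classical deterministic shortcut below (assuming the common exponent alpha = 2, so the inverse event rate decays linearly to zero at the failure time), the forecast eruption time is the x-intercept of a straight-line fit to 1/rate, on synthetic data:

```python
# Classical deterministic FFM shortcut (alpha = 2): fit 1/rate linearly in
# time; its x-intercept estimates the failure (eruption) time.
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0, 9, 40)                    # days of observation
t_erupt_true = 10.0
rate = 50.0 / (t_erupt_true - t)             # accelerating event rate
rate = rate * rng.normal(1, 0.05, t.size)    # add measurement noise

inv = 1.0 / rate
slope, intercept = np.polyfit(t, inv, 1)     # linear fit of 1/rate vs time
t_forecast = -intercept / slope              # x-intercept => failure time
print(f"forecast eruption time: day {t_forecast:.2f} (truth: day 10)")
```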

  8. Calibration of limited-area ensemble precipitation forecasts for hydrological predictions

    NASA Astrophysics Data System (ADS)

    Diomede, Tommaso; Marsigli, Chiara; Montani, Andrea; Nerozzi, Fabrizio; Paccagnella, Tiziana

    2015-04-01

    The main objective of this study is to investigate the impact of calibration for limited-area ensemble precipitation forecasts, to be used for driving discharge predictions up to 5 days in advance. A reforecast dataset, which spans 30 years, based on the Consortium for Small Scale Modeling Limited-Area Ensemble Prediction System (COSMO-LEPS) was used for testing the calibration strategy. Three calibration techniques were applied: quantile-to-quantile mapping, linear regression, and analogs. The performance of these methodologies was evaluated in terms of statistical scores for the precipitation forecasts operationally provided by COSMO-LEPS in the years 2003-2007 over Germany, Switzerland, and the Emilia-Romagna region (northern Italy). The analog-based method proved preferable because of its capability to correct position errors and spread deficiencies. A suitable spatial domain for the analog search can help to handle model spatial errors as systematic errors. However, the performance of the analog-based method may degrade in cases where a limited training dataset is available. A sensitivity test on the length of the training dataset over which to perform the analog search has been performed. The quantile-to-quantile mapping and linear regression methods were less effective, mainly because the forecast-analysis relation was not so strong for the available training dataset. A comparison between the calibration based on the deterministic reforecast and the calibration based on the full operational ensemble used as the training dataset has been considered, with the aim of evaluating whether reforecasts are really worthwhile for calibration, given that their computational cost is remarkable. The verification of the calibration process was then performed by coupling ensemble precipitation forecasts with a distributed rainfall-runoff model. This test was carried out for a medium-sized catchment located in Emilia-Romagna, showing a beneficial impact of the analog-based method on the reduction of missed events for discharge predictions.

  9. Bias Adjusted Precipitation Threat Scores

    NASA Astrophysics Data System (ADS)

    Mesinger, F.

    2008-04-01

    Among the wide variety of performance measures available for assessing the skill of deterministic precipitation forecasts, the equitable threat score (ETS) might well be the one used most frequently. It is typically used in conjunction with the bias score. However, apart from its mathematical definition, the meaning of the ETS is not clear. It has been pointed out (Mason, 1989; Hamill, 1999) that forecasts with a larger bias tend to have a higher ETS. Even so, the present author has not seen this accounted for in any of the numerous papers that in recent years have used the ETS along with bias "as a measure of forecast accuracy". A method to adjust the threat score (TS) or the ETS to the values corresponding to unit bias, in order to show the model's or forecaster's accuracy in placing precipitation, was proposed earlier by the present author (Mesinger and Brill, the so-called dH/dF method). A serious deficiency has since been noted with the dH/dF method, however: the hypothetical function it arrives at to interpolate or extrapolate the observed value of hits to unit bias can have hits greater than forecasts when the forecast area tends to zero. Another method is proposed here, based on the assumption that the increase in hits per unit increase in false alarms is proportional to the yet-unhit area. This new method removes the deficiency of the dH/dF method. Examples of its performance for 12 months of forecasts by three NCEP operational models are given.
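
    A sketch of that assumption in code follows; this is one reading of the stated relation, not necessarily the paper's final formulas. If dH/dF = beta(O - H) with H(0) = 0, then H(F) = O(1 - exp(-beta F)); calibrating beta from the actual contingency table and moving along this curve to unit bias (hits plus false alarms equal to the observed area) gives an adjusted threat score.

```python
# Sketch of a bias-adjusted threat score under the assumption
# dH/dF = beta * (O - H), so H(F) = O * (1 - exp(-beta * F)).
import numpy as np
from scipy.optimize import brentq

def bias_adjusted_ts(hits: float, false_alarms: float, obs_area: float) -> float:
    beta = -np.log(1 - hits / obs_area) / false_alarms   # from H(F1) = H1
    # At unit bias, H + F = O, i.e. F = O * exp(-beta * F); solve for F.
    f_star = brentq(lambda f: f - obs_area * np.exp(-beta * f), 1e-9, obs_area)
    h_star = obs_area - f_star
    return h_star / (obs_area + f_star)                  # TS = H / (O + F)

# Over-forecast example: bias = (60+70)/100 = 1.3; raw TS = 60/170 ~ 0.353.
print("adjusted TS:", round(bias_adjusted_ts(60.0, 70.0, 100.0), 3))
```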

  10. Software selection based on analysis and forecasting methods, practised in 1C

    NASA Astrophysics Data System (ADS)

    Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.

    2015-09-01

    The research focuses on the built-in mechanisms of the "1C: Enterprise 8" platform for data analysis and forecasting. It is important to evaluate and select proper software to develop effective strategies for customer relationship management in terms of sales, as well as the implementation and further maintenance of software. The research data allow creating new forecast models to schedule further software distribution.

  11. A short-term and high-resolution distribution system load forecasting approach using support vector regression with hybrid parameters optimization

    DOE PAGES

    Jiang, Huaiguang; Zhang, Yingchen; Muljadi, Eduard; ...

    2016-01-01

    This paper proposes an approach for distribution system load forecasting, which aims to provide highly accurate short-term load forecasting with high resolution utilizing a support vector regression (SVR) based forecaster and a two-step hybrid parameters optimization method. Specifically, because the load profiles in distribution systems contain abrupt deviations, a data normalization is designed as the pretreatment for the collected historical load data. Then an SVR model is trained by the load data to forecast the future load. For better performance of SVR, a two-step hybrid optimization algorithm is proposed to determine the best parameters. In the first step of the hybrid optimization algorithm, a designed grid traverse algorithm (GTA) is used to narrow the parameters searching area from a global to local space. In the second step, based on the result of the GTA, particle swarm optimization (PSO) is used to determine the best parameters in the local parameter space. After the best parameters are determined, the SVR model is used to forecast the short-term load deviation in the distribution system. The performance of the proposed approach is compared to some classic methods in later sections of the paper.
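
    A condensed sketch of the two-step idea on toy data, using scikit-learn's SVR: a coarse log-spaced grid first narrows the (C, gamma) space, and a simple local random search then refines it, standing in for the paper's PSO step.

```python
# Two-step parameter search for SVR: coarse grid traverse, then local
# refinement (a simple random search substitutes for the paper's PSO).
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, (200, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, 200)    # toy "load" signal

def score(C, gamma):
    return cross_val_score(SVR(C=C, gamma=gamma), X, y, cv=3,
                           scoring="neg_mean_squared_error").mean()

# Step 1: coarse grid traverse over a wide log-spaced space.
grid = [(C, g) for C in np.logspace(-2, 3, 6) for g in np.logspace(-3, 1, 5)]
C0, g0 = max(grid, key=lambda p: score(*p))

# Step 2: local refinement around the best coarse point.
best = (C0, g0, score(C0, g0))
for _ in range(30):
    C = C0 * 10 ** rng.uniform(-0.5, 0.5)          # perturb in log space
    g = g0 * 10 ** rng.uniform(-0.5, 0.5)
    s = score(C, g)
    if s > best[2]:
        best = (C, g, s)
print(f"best C={best[0]:.3g}, gamma={best[1]:.3g}, CV MSE={-best[2]:.4f}")
```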

  12. Forbush Decrease Prediction Based on Remote Solar Observations

    NASA Astrophysics Data System (ADS)

    Dumbovic, Mateja; Vrsnak, Bojan; Calogovic, Jasa

    2016-04-01

    We study the relation between remote observations of coronal mass ejections (CMEs), their associated solar flares and short-term depressions in the galactic cosmic-ray flux (so-called Forbush decreases). Statistical relations between Forbush decrease magnitude and several CME/flare parameters are examined. In general we find that Forbush decrease magnitude is larger for faster CMEs with larger apparent width, which are associated with stronger flares that originate close to the center of the solar disk and are (possibly) involved in a CME-CME interaction. The statistical relations are quantified and employed to forecast the expected Forbush decrease magnitude range based on the selected remote solar observations of the CME and associated solar flare. Several verification measures are used to evaluate the forecast method. We find that the forecast is most reliable in predicting whether or not a CME will produce a Forbush decrease with a magnitude >3 %. The main advantage of the method is that it provides an early prediction, 1-4 days in advance. Based on the presented research, an online forecast tool was developed (Forbush Decrease Forecast Tool, FDFT), available at the Hvar Observatory web page: http://oh.geof.unizg.hr/FDFT/fdft.php. We acknowledge the support of the Croatian Science Foundation under the project 6212 "Solar and Stellar Variability" and of the European Social Fund under the project "PoKRet".

  13. Electricity forecasting on the individual household level enhanced based on activity patterns

    PubMed Central

    Gajowniczek, Krzysztof; Ząbkowski, Tomasz

    2017-01-01

    Leveraging smart metering solutions to support energy efficiency on the individual household level poses novel research challenges in monitoring usage and providing accurate load forecasting. Forecasting electricity usage is an especially important component that can provide intelligence to smart meters. In this paper, we propose an enhanced approach for load forecasting at the household level. The impacts of residents’ daily activities and appliance usages on the power consumption of the entire household are incorporated to improve the accuracy of the forecasting model. The contributions of this paper are threefold: (1) we addressed short-term electricity load forecasting for 24 hours ahead, not on the aggregate but on the individual household level, which fits into the Residential Power Load Forecasting (RPLF) methods; (2) for the forecasting, we utilized a household specific dataset of behaviors that influence power consumption, which was derived using segmentation and sequence mining algorithms; and (3) an extensive load forecasting study using different forecasting algorithms enhanced by the household activity patterns was undertaken. PMID:28423039

  14. Electricity forecasting on the individual household level enhanced based on activity patterns.

    PubMed

    Gajowniczek, Krzysztof; Ząbkowski, Tomasz

    2017-01-01

    Leveraging smart metering solutions to support energy efficiency on the individual household level poses novel research challenges in monitoring usage and providing accurate load forecasting. Forecasting electricity usage is an especially important component that can provide intelligence to smart meters. In this paper, we propose an enhanced approach for load forecasting at the household level. The impacts of residents' daily activities and appliance usages on the power consumption of the entire household are incorporated to improve the accuracy of the forecasting model. The contributions of this paper are threefold: (1) we addressed short-term electricity load forecasting for 24 hours ahead, not on the aggregate but on the individual household level, which fits into the Residential Power Load Forecasting (RPLF) methods; (2) for the forecasting, we utilized a household specific dataset of behaviors that influence power consumption, which was derived using segmentation and sequence mining algorithms; and (3) an extensive load forecasting study using different forecasting algorithms enhanced by the household activity patterns was undertaken.

  15. Three ingredients for Improved global aftershock forecasts: Tectonic region, time-dependent catalog incompleteness, and inter-sequence variability

    USGS Publications Warehouse

    Page, Morgan T.; Van Der Elst, Nicholas; Hardebeck, Jeanne L.; Felzer, Karen; Michael, Andrew J.

    2016-01-01

    Following a large earthquake, seismic hazard can be orders of magnitude higher than the long‐term average as a result of aftershock triggering. Because of this heightened hazard, emergency managers and the public demand rapid, authoritative, and reliable aftershock forecasts. In the past, U.S. Geological Survey (USGS) aftershock forecasts following large global earthquakes have been released on an ad hoc basis with inconsistent methods, and in some cases aftershock parameters adapted from California. To remedy this, the USGS is currently developing an automated aftershock product based on the Reasenberg and Jones (1989) method that will generate more accurate forecasts. To better capture spatial variations in aftershock productivity and decay, we estimate regional aftershock parameters for sequences within the García et al. (2012) tectonic regions. We find that regional variations for mean aftershock productivity reach almost a factor of 10. We also develop a method to account for the time‐dependent magnitude of completeness following large events in the catalog. In addition to estimating average sequence parameters within regions, we develop an inverse method to estimate the intersequence parameter variability. This allows for a more complete quantification of the forecast uncertainties and Bayesian updating of the forecast as sequence‐specific information becomes available.
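
    For context, the Reasenberg and Jones (1989) model that the product is built on gives the aftershock rate as lambda(t) = 10^(a + b(Mm - M)) (t + c)^(-p). The parameter values below are generic illustrations, not the regional estimates derived in the paper.

```python
# Illustrative Reasenberg-Jones (1989) aftershock rate and expected count;
# the a, b, c, p values here are generic examples only.
import numpy as np
from scipy.integrate import quad

def rj_rate(t_days, mainshock_mag, mag_min, a=-1.67, b=0.91, c=0.05, p=1.08):
    """Aftershock rate (events/day) with magnitude >= mag_min at time t."""
    return 10 ** (a + b * (mainshock_mag - mag_min)) * (t_days + c) ** (-p)

# Expected number of M>=5 aftershocks in the week after an M7.0 mainshock.
n_expected, _ = quad(rj_rate, 0, 7, args=(7.0, 5.0))
print(f"expected M>=5 aftershocks, days 0-7: {n_expected:.1f}")
```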

  16. Battery Energy Storage State-of-Charge Forecasting: Models, Optimization, and Accuracy

    DOE PAGES

    Rosewater, David; Ferreira, Summer; Schoenwald, David; ...

    2018-01-25

    Battery energy storage systems (BESS) are a critical technology for integrating high penetration renewable power on an intelligent electrical grid. As limited energy restricts the steady-state operational state-of-charge (SoC) of storage systems, SoC forecasting models are used to determine feasible charge and discharge schedules that supply grid services. Smart grid controllers use SoC forecasts to optimize BESS schedules to make grid operation more efficient and resilient. This study presents three advances in BESS state-of-charge forecasting. First, two forecasting models are reformulated to be conducive to parameter optimization. Second, a new method for selecting optimal parameter values based on operational data is presented. Last, a new framework for quantifying model accuracy is developed that enables a comparison between models, systems, and parameter selection methods. The accuracies achieved by both models, on two example battery systems, with each method of parameter selection are then compared in detail. The results of this analysis suggest variation in the suitability of these models for different battery types and applications. Finally, the proposed model formulations, optimization methods, and accuracy assessment framework can be used to improve the accuracy of SoC forecasts enabling better control over BESS charge/discharge schedules.

  17. Battery Energy Storage State-of-Charge Forecasting: Models, Optimization, and Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosewater, David; Ferreira, Summer; Schoenwald, David

    Battery energy storage systems (BESS) are a critical technology for integrating high penetration renewable power on an intelligent electrical grid. As limited energy restricts the steady-state operational state-of-charge (SoC) of storage systems, SoC forecasting models are used to determine feasible charge and discharge schedules that supply grid services. Smart grid controllers use SoC forecasts to optimize BESS schedules to make grid operation more efficient and resilient. This study presents three advances in BESS state-of-charge forecasting. First, two forecasting models are reformulated to be conducive to parameter optimization. Second, a new method for selecting optimal parameter values based on operational data is presented. Last, a new framework for quantifying model accuracy is developed that enables a comparison between models, systems, and parameter selection methods. The accuracies achieved by both models, on two example battery systems, with each method of parameter selection are then compared in detail. The results of this analysis suggest variation in the suitability of these models for different battery types and applications. Finally, the proposed model formulations, optimization methods, and accuracy assessment framework can be used to improve the accuracy of SoC forecasts enabling better control over BESS charge/discharge schedules.

  18. Individual versus superensemble forecasts of seasonal influenza outbreaks in the United States.

    PubMed

    Yamana, Teresa K; Kandula, Sasikiran; Shaman, Jeffrey

    2017-11-01

    Recent research has produced a number of methods for forecasting seasonal influenza outbreaks. However, differences among the predicted outcomes of competing forecast methods can limit their use in decision-making. Here, we present a method for reconciling these differences using Bayesian model averaging. We generated retrospective forecasts of peak timing, peak incidence, and total incidence for seasonal influenza outbreaks in 48 states and 95 cities using 21 distinct forecast methods, and combined these individual forecasts to create weighted-average superensemble forecasts. We compared the relative performance of these individual and superensemble forecast methods by geographic location, timing of forecast, and influenza season. We find that, overall, the superensemble forecasts are more accurate than any individual forecast method and less prone to producing a poor forecast. Furthermore, we find that these advantages increase when the superensemble weights are stratified according to the characteristics of the forecast or geographic location. These findings indicate that different competing influenza prediction systems can be combined into a single more accurate forecast product for operational delivery in real time.
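
    A simplified sketch of the weighting idea follows. Full BMA (e.g., Raftery et al.) fits weights and predictive variances jointly by EM; here, as a stand-in, weights are posterior model probabilities under equal priors, computed directly from each method's Gaussian likelihood over a training window. Note that this shortcut tends to concentrate on the single best method as the window grows.

```python
# Simplified Bayesian-model-averaging-style weighting of competing
# forecast methods (likelihood-based weights, not the full EM fit).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
obs = rng.normal(10, 1, 50)                          # training-period outcomes
fc = np.stack([obs + rng.normal(0, s, 50)            # 3 methods, varying skill
               for s in (0.5, 1.0, 2.0)])

sig = (fc - obs).std(axis=1)                         # per-method error spread
loglik = norm.logpdf(obs, loc=fc, scale=sig[:, None]).sum(axis=1)
w = np.exp(loglik - loglik.max())
w /= w.sum()                                         # posterior model weights
print("weights:", np.round(w, 3))

new_fc = np.array([12.1, 11.6, 13.0])                # the methods' new forecasts
print("superensemble point forecast:", float(w @ new_fc))
```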

  19. Individual versus superensemble forecasts of seasonal influenza outbreaks in the United States

    PubMed Central

    Kandula, Sasikiran; Shaman, Jeffrey

    2017-01-01

    Recent research has produced a number of methods for forecasting seasonal influenza outbreaks. However, differences among the predicted outcomes of competing forecast methods can limit their use in decision-making. Here, we present a method for reconciling these differences using Bayesian model averaging. We generated retrospective forecasts of peak timing, peak incidence, and total incidence for seasonal influenza outbreaks in 48 states and 95 cities using 21 distinct forecast methods, and combined these individual forecasts to create weighted-average superensemble forecasts. We compared the relative performance of these individual and superensemble forecast methods by geographic location, timing of forecast, and influenza season. We find that, overall, the superensemble forecasts are more accurate than any individual forecast method and less prone to producing a poor forecast. Furthermore, we find that these advantages increase when the superensemble weights are stratified according to the characteristics of the forecast or geographic location. These findings indicate that different competing influenza prediction systems can be combined into a single more accurate forecast product for operational delivery in real time. PMID:29107987

  20. How much are you prepared to PAY for a forecast?

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Coughlan, Erin; Ramos, Maria-Helena; Pappenberger, Florian; Wetterhall, Fredrik; Bachofen, Carina; van Andel, Schalk Jan

    2015-04-01

    Probabilistic hydro-meteorological forecasts are a crucial element of the decision-making chain in the field of flood prevention. The operational use of probabilistic forecasts is increasingly promoted through the development of novel state-of-the-art forecast methods, and numerical skill is continuously increasing. However, the value of such forecasts for flood early-warning systems is a topic of diverging opinions. Indeed, the word value, when applied to flood forecasting, is multifaceted. It refers not only to the raw cost of acquiring and maintaining a probabilistic forecasting system (in terms of human and financial resources, data volume and computational time), but also, and perhaps most importantly, to the use of such products. This game aims at investigating this point. It is a willingness-to-pay game, embedded in a risk-based decision-making experiment. Based on a "Red Cross/Red Crescent Climate Centre" game, it is a contribution to the international Hydrologic Ensemble Prediction Experiment (HEPEX). A limited number of probabilistic forecasts will be auctioned to the participants, the price of these forecasts being market driven. All participants (irrespective of whether they have bought a forecast set) will then be taken through a decision-making process to issue warnings for extreme rainfall. This game will promote discussions around the topic of the value of forecasts for decision-making in the field of flood prevention.

  1. Demand forecast model based on CRM

    NASA Astrophysics Data System (ADS)

    Cai, Yuancui; Chen, Lichao

    2006-11-01

    As the management philosophy of putting the customer at the center becomes internalized day by day, forecasting customer demand becomes more and more important. In demand forecasting for customer relationship management, traditional forecast methods face great limitations because of the high uncertainty of demand; this calls for new models that meet the demands of development. In this paper, the idea is to forecast demand according to the characteristics of potential customers and to build the model accordingly. The model first describes customers using multiple uniform indexes. Second, the model acquires characteristic customers on the basis of a data warehouse and data-mining technology. Finally, the most similar characteristic customer is found by comparison, and the demands of a new customer are forecast from that most similar characteristic customer.

  2. Hybrid forecasting of chaotic processes: Using machine learning in conjunction with a knowledge-based model

    NASA Astrophysics Data System (ADS)

    Pathak, Jaideep; Wikner, Alexander; Fussell, Rebeckah; Chandra, Sarthak; Hunt, Brian R.; Girvan, Michelle; Ott, Edward

    2018-04-01

    A model-based approach to forecasting chaotic dynamical systems utilizes knowledge of the mechanistic processes governing the dynamics to build an approximate mathematical model of the system. In contrast, machine learning techniques have demonstrated promising results for forecasting chaotic systems purely from past time series measurements of system state variables (training data), without prior knowledge of the system dynamics. The motivation for this paper is the potential of machine learning for filling in the gaps in our underlying mechanistic knowledge that cause widely used knowledge-based models to be inaccurate. Thus, we here propose a general method that leverages the advantages of these two approaches by combining a knowledge-based model and a machine learning technique to build a hybrid forecasting scheme. Potential applications for such an approach are numerous (e.g., improving weather forecasting). We demonstrate and test the utility of this approach using a particular illustrative version of a machine learning technique known as reservoir computing, and we apply the resulting hybrid forecaster to a low-dimensional chaotic system, as well as to a high-dimensional spatiotemporal chaotic system. These tests yield extremely promising results in that our hybrid technique is able to accurately predict for a much longer period of time than either its machine-learning component or its model-based component alone.
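
    A compact sketch of the hybrid scheme on a toy system (all sizes and constants illustrative, and far simpler than the paper's spatiotemporal experiments): the "knowledge-based model" is a logistic map with a deliberately wrong parameter, and a small reservoir computer learns from data to correct it, with the model's prediction fed both into the reservoir and into the linear readout.

```python
# Toy hybrid forecaster: imperfect knowledge model + small reservoir.
import numpy as np

rng = np.random.default_rng(0)
true_step  = lambda x: 3.9 * x * (1 - x)      # true dynamics
model_step = lambda x: 3.8 * x * (1 - x)      # imperfect knowledge model

# Generate training data from the true system.
T = 3000
u = np.empty(T); u[0] = 0.4
for t in range(T - 1):
    u[t + 1] = true_step(u[t])

# Reservoir: sparse random recurrence rescaled to spectral radius 0.9.
N = 200
W_in = rng.uniform(-0.5, 0.5, (N, 2))                      # feeds [u, model(u)]
A = rng.normal(0, 1, (N, N)) * (rng.random((N, N)) < 0.05)
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))

r = np.zeros(N)
feats, targets = [], []
for t in range(T - 1):
    k = model_step(u[t])
    r = np.tanh(A @ r + W_in @ np.array([u[t], k]))
    if t >= 100:                                           # discard transient
        feats.append(np.append(r, k))                      # hybrid feature
        targets.append(u[t + 1])
X, y = np.array(feats), np.array(targets)

# Ridge-regression readout.
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(X.shape[1]), X.T @ y)

# Closed-loop forecast: feed predictions back in as inputs.
x, preds = u[-1], []
for _ in range(20):
    k = model_step(x)
    r = np.tanh(A @ r + W_in @ np.array([x, k]))
    x = float(np.append(r, k) @ W_out)
    preds.append(x)
print("first 5 hybrid predictions:", np.round(preds[:5], 4))
```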

  3. A Wind Forecasting System for Energy Application

    NASA Astrophysics Data System (ADS)

    Courtney, Jennifer; Lynch, Peter; Sweeney, Conor

    2010-05-01

    Accurate forecasting of available energy is crucial for the efficient management and use of wind power in the national power grid. With energy output critically dependent upon wind strength, there is a need to reduce the errors associated with wind forecasting. The objective of this research is to get the best possible wind forecasts for the wind energy industry. To achieve this goal, three methods are being applied. First, a mesoscale numerical weather prediction (NWP) model called WRF (Weather Research and Forecasting) is being used to predict wind values over Ireland. Currently, a grid resolution of 10 km is used, and higher model resolutions are being evaluated to establish whether they are economically viable given the forecast skill improvement they produce. Second, the WRF model is being used in conjunction with ECMWF (European Centre for Medium-Range Weather Forecasts) ensemble forecasts to produce a probabilistic weather forecasting product. Due to the chaotic nature of the atmosphere, a single, deterministic weather forecast can only have limited skill. The ECMWF ensemble methods produce an ensemble of 51 global forecasts, twice a day, by perturbing the initial conditions of a 'control' forecast which is the best estimate of the initial state of the atmosphere. This method provides an indication of the reliability of the forecast and a quantitative basis for probabilistic forecasting. The limitation of ensemble forecasting lies in the fact that the perturbed model runs behave differently under different weather patterns and each model run is equally likely to be closest to the observed weather situation. Models have biases, and involve assumptions about physical processes and forcing factors such as underlying topography. Third, Bayesian Model Averaging (BMA) is being applied to the output from the ensemble forecasts in order to statistically post-process the results and achieve a better wind forecasting system. BMA is a promising technique that will offer calibrated probabilistic wind forecasts which will be invaluable in wind energy management. In brief, this method turns the ensemble forecasts into a calibrated predictive probability distribution. Each ensemble member is provided with a 'weight' determined by its relative predictive skill over a training period of around 30 days. Verification is carried out using observed wind data from operational wind farms; the forecasts are then compared to existing forecasts produced by ECMWF and Met Eireann in terms of skill scores. We are developing decision-making models to show the benefits achieved using the data produced by our wind energy forecasting system. An energy trading model will be developed, based on the rules currently used by the Single Electricity Market Operator for energy trading in Ireland. This trading model will illustrate the potential for financial savings by using the forecast data generated by this research.

  4. Automation of energy demand forecasting

    NASA Astrophysics Data System (ADS)

    Siddique, Sanzad

    Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometrics models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.

  5. Fuzzy time-series based on Fibonacci sequence for stock price forecasting

    NASA Astrophysics Data System (ADS)

    Chen, Tai-Liang; Cheng, Ching-Hsue; Jong Teoh, Hia

    2007-07-01

    Time-series models have been utilized to make reasonably accurate predictions in areas such as stock price movements, academic enrollments, and weather. To improve the forecasting performance of fuzzy time-series models, this paper proposes a new model that incorporates the concept of the Fibonacci sequence, the framework of Song and Chissom's model, and the weighted method of Yu's model. The paper employs a 5-year period of TSMC (Taiwan Semiconductor Manufacturing Company) stock price data and a 13-year period of TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) stock index data as experimental datasets. Comparing our forecasting performance with that of Chen's (Forecasting enrollments based on fuzzy time-series. Fuzzy Sets Syst. 81 (1996) 311-319), Yu's (Weighted fuzzy time-series models for TAIEX forecasting. Physica A 349 (2004) 609-624) and Huarng's (The application of neural networks to forecast fuzzy time series. Physica A 336 (2006) 481-491) models, we conclude that the proposed model surpasses these conventional fuzzy time-series models in accuracy.
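
    The sketch below illustrates, under strong simplifying assumptions, how a Fibonacci-weighted fuzzy time-series forecast might look: the universe of discourse is partitioned into intervals, observations are fuzzified to interval indices, and the next value is a Fibonacci-weighted mean of recent interval midpoints. It is a toy version only; the actual model's fuzzy logical relationships and defuzzification follow Song and Chissom's framework and Yu's weighting, which this sketch does not reproduce.

      import numpy as np

      def fuzzify(series, n_intervals=7):
          """Partition the universe of discourse into equal intervals and map
          each price to the index of the interval (fuzzy set) it falls in."""
          lo, hi = series.min(), series.max()
          edges = np.linspace(lo, hi, n_intervals + 1)
          idx = np.clip(np.searchsorted(edges, series, side="right") - 1,
                        0, n_intervals - 1)
          mids = (edges[:-1] + edges[1:]) / 2
          return idx, mids

      def fib_weighted_forecast(series, k=5, n_intervals=7):
          """Forecast the next value as a Fibonacci-weighted mean of the interval
          midpoints of the k most recent fuzzy sets (recent sets weigh more)."""
          idx, mids = fuzzify(series, n_intervals)
          fib = [1, 1]
          while len(fib) < k:
              fib.append(fib[-1] + fib[-2])
          w = np.array(fib[:k], dtype=float)
          w /= w.sum()
          recent_mids = mids[idx[-k:]]          # oldest ... newest
          return float(np.dot(w, recent_mids))  # weights increase with recency

      prices = np.array([61.0, 62.5, 60.8, 63.2, 64.0, 63.5, 65.1, 66.0])
      print(fib_weighted_forecast(prices))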

  6. Research on electricity consumption forecast based on mutual information and random forests algorithm

    NASA Astrophysics Data System (ADS)

    Shi, Jing; Shi, Yunli; Tan, Jian; Zhu, Lei; Li, Hu

    2018-02-01

    Traditional power forecasting models can neither efficiently take various factors into account nor identify the most relevant ones. In this paper, mutual information from information theory and the random forests algorithm from artificial intelligence are introduced into medium- and long-term electricity demand prediction. Mutual information identifies highly related factors based on the average mutual information between candidate variables and electricity demand; different industries may be highly associated with different variables. The random forests algorithm is then used to build industry-specific forecasting models from the selected correlation factors. Electricity consumption data from Jiangsu Province are used as a practical example, and the proposed approach is compared with methods that disregard mutual information and industry structure. The simulation results show that the method is sound and effective, and provides higher prediction accuracy.
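
    A compact sketch of the two-stage pipeline described above, using scikit-learn: rank candidate drivers by mutual information with demand, then fit a random forest on the selected drivers. The synthetic data and the choice of keeping three drivers are illustrative.

      import numpy as np
      from sklearn.feature_selection import mutual_info_regression
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(2)
      X = rng.normal(size=(120, 6))            # candidate drivers (GDP, population, ...)
      y = 2.0 * X[:, 0] + np.sin(X[:, 2]) + 0.1 * rng.normal(size=120)  # demand

      # Step 1: rank candidate variables by mutual information with demand.
      mi = mutual_info_regression(X, y, random_state=0)
      selected = np.argsort(mi)[::-1][:3]      # keep the 3 most informative drivers

      # Step 2: fit a random forest on the selected drivers only.
      rf = RandomForestRegressor(n_estimators=200, random_state=0)
      rf.fit(X[:, selected], y)
      forecast = rf.predict(X[-12:, selected]) # e.g., most recent year of inputs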

  7. Using time series structural characteristics to analyze grain prices in food insecure countries

    USGS Publications Warehouse

    Davenport, Frank; Funk, Chris

    2015-01-01

    Two components of food security monitoring are accurate forecasts of local grain prices and the ability to identify unusual price behavior. We evaluated a method that can both facilitate forecasts of cross-country grain price data and identify dissimilarities in price behavior across multiple markets. This method, characteristic based clustering (CBC), identifies similarities in multiple time series based on structural characteristics in the data. Here, we conducted a simulation experiment to determine if CBC can be used to improve the accuracy of maize price forecasts. We then compared forecast accuracies among clustered and non-clustered price series over a rolling time horizon. We found that the accuracy of forecasts on clusters of time series was equal to or worse than that of forecasts based on individual time series. However, in the following experiment we found that CBC was still useful for price analysis. We used the clusters to explore the similarity of price behavior among Kenyan maize markets. We found that price behavior in the isolated markets of Mandera and Marsabit has become increasingly dissimilar from markets in other Kenyan cities, and that these dissimilarities could not be explained solely by geographic distance. The structural isolation of Mandera and Marsabit that we find in this paper is supported by field studies on food security and market integration in Kenya. Our results suggest that a market with a unique price series (as measured by structural characteristics that differ from neighboring markets) may lack market integration and food security.
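
    A minimal sketch of characteristic-based clustering: compute a few structural features per price series, standardize them, and cluster hierarchically. The three features used here (trend slope, volatility, lag-1 autocorrelation) are illustrative stand-ins for the fuller CBC feature set.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.stats import zscore

      def characteristics(series):
          """Structural features of one price series: trend slope, volatility,
          and lag-1 autocorrelation (a small stand-in for the CBC feature set)."""
          t = np.arange(len(series))
          slope = np.polyfit(t, series, 1)[0]
          vol = np.std(np.diff(series))
          ac1 = np.corrcoef(series[:-1], series[1:])[0, 1]
          return [slope, vol, ac1]

      rng = np.random.default_rng(3)
      markets = [np.cumsum(rng.normal(m, 1.0, size=60)) for m in (0.0, 0.1, 0.1, 1.0)]
      feats = zscore(np.array([characteristics(s) for s in markets]), axis=0)

      # Cluster markets on standardized structural features; a market left in its
      # own cluster (here the fourth) flags unusual price behaviour.
      labels = fcluster(linkage(feats, method="ward"), t=2, criterion="maxclust")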

  8. Fuzzy logic-based analogue forecasting and hybrid modelling of horizontal visibility

    NASA Astrophysics Data System (ADS)

    Tuba, Zoltán; Bottyán, Zsolt

    2018-04-01

    Forecasting visibility is one of the greatest challenges in aviation meteorology. At the same time, high-accuracy visibility forecasts can significantly reduce, or even prevent, weather-related risk in aviation. To improve visibility forecasting, this research links fuzzy logic-based analogue forecasting and post-processed numerical weather prediction model outputs in a hybrid forecast. The performance of the analogue forecasting model was improved by applying the Analytic Hierarchy Process. A linear combination of these outputs was then used to create an ultra-short-term hybrid visibility prediction that gradually shifts the weight from the statistical to the numerical product over the forecast period, exploiting the strengths of each. This brings the numerical visibility forecast closer to the observations even when it is initially wrong. Complete verification of the categorical forecasts was carried out; for comparison, results are also provided for persistence and terminal aerodrome forecasts (TAF). The average Heidke Skill Score (HSS) over the examined airports is very similar for the analogue and hybrid forecasts, even at the end of the forecast period, where the weight of the analogue prediction in the final hybrid output is only 0.1-0.2. In the case of poor visibility (1000-2500 m), however, the hybrid (0.65) and analogue (0.64) forecasts have a similar average HSS in the first 6 h of the forecast period, and perform better than persistence (0.60) or TAF (0.56). An important achievement is that the hybrid model takes the physics and dynamics of the atmosphere into account through the increasing share of numerical weather prediction. Despite this, its performance matches the most effective visibility forecasting methods and does not inherit the poor verification results of purely numerical outputs.
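
    The lead-time-dependent blending can be sketched as below; the decay of the analogue weight toward roughly 0.1-0.2 at the end of the period follows the abstract, but the linear decay shape and the starting weight are assumptions.

      import numpy as np

      def hybrid_visibility(analogue, nwp, w_start=0.8, w_end=0.15):
          """Blend analogue and NWP visibility forecasts with a weight on the
          analogue member that decays linearly over the forecast period
          (end weight ~0.1-0.2, as in the abstract); decay shape is assumed."""
          w = np.linspace(w_start, w_end, len(analogue))
          return w * analogue + (1.0 - w) * nwp

      analogue = np.array([800.0, 900, 1200, 1500, 2000, 2500])  # visibility (m) per hour
      nwp = np.array([1500.0, 1400, 1600, 1800, 2200, 2600])
      hybrid = hybrid_visibility(analogue, nwp)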

  9. Generalization of information-based concepts in forecast verification

    NASA Astrophysics Data System (ADS)

    Tödter, J.; Ahrens, B.

    2012-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification. Recent findings concerning the Ignorance Score are briefly reviewed, and the generalization to continuous forecasts is then shown. For ensemble forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are the prominent verification measures for probabilistic forecasts. In particular, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up the natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The applicability and usefulness of the conceptually appealing CRIGN is illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This is also directly applicable to the more traditional CRPS.
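
    For concreteness, a small sketch of both scores for a single ensemble forecast and verifying observation; the kernel-density estimate of p(obs) and its Silverman-style bandwidth are assumptions of this sketch, since the paper computes the ensemble measures exactly.

      import numpy as np
      from scipy.stats import norm

      def ignorance_score(ensemble, obs, bw=None):
          """IGN = -log2 p(obs) under a kernel density built from the ensemble;
          the Silverman-style bandwidth is an assumption, not part of the paper."""
          ens = np.asarray(ensemble, dtype=float)
          if bw is None:
              bw = 1.06 * ens.std() * ens.size ** (-0.2)
          p = norm.pdf(obs, loc=ens, scale=bw).mean()
          return -np.log2(p)

      def brier_score(ensemble, obs, threshold):
          """BS for the binary event 'value exceeds threshold'."""
          p = np.mean(np.asarray(ensemble) > threshold)
          o = float(obs > threshold)
          return (p - o) ** 2

      ens = np.random.default_rng(4).normal(10.0, 2.0, size=51)
      print(ignorance_score(ens, 11.3), brier_score(ens, 11.3, threshold=12.0))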

  10. Forecast on Water Locking Damage of Low Permeable Reservoir with Quantum Neural Network

    NASA Astrophysics Data System (ADS)

    Zhao, Jingyuan; Sun, Yuxue; Feng, Fuping; Zhao, Fulei; Sui, Dianjie; Xu, Jianjun

    2018-01-01

    Timely and accurate forecasting of water locking damage, the most serious damage in low-permeability reservoirs, is of great importance in oil-gas reservoir protection. The formation mechanism and the various influence factors of water locking damage are analyzed. On this basis, a quantum neuron is constructed following the information-processing scheme of a biological neuron and the principles of the quantum neural algorithm; a quantum neural network model for forecasting reservoir water locking is then established, and accompanying software was developed to forecast water locking damage in gas reservoirs. This method overcomes the drawbacks of grey correlation analysis, which requires an evaluation matrix and complicated computation. In practical application in the Longxi Area of the Daqing Oilfield, the method is characterized by fast operation, few system parameters and a high accuracy rate (reaching 90% in general), and can provide reliable support for protection techniques for low-permeability reservoirs.

  11. A multi-source data assimilation framework for flood forecasting: Accounting for runoff routing lags

    NASA Astrophysics Data System (ADS)

    Meng, S.; Xie, X.

    2015-12-01

    In flood forecasting practice, model performance is usually degraded by various sources of uncertainty, including uncertainties from input data, model parameters, model structures and output observations. Data assimilation is a useful methodology for reducing uncertainties in flood forecasting. For short-term flood forecasting, an accurate estimate of the initial soil moisture condition improves forecasting performance, and the time delay of runoff routing is another important consideration. Moreover, observations of hydrological variables (including ground observations and satellite observations) are becoming easily available, so the reliability of short-term flood forecasting can be improved by assimilating multi-source data. The objective of this study is to develop a multi-source data assimilation framework for real-time flood forecasting. In this framework, the first step assimilates upper-layer soil moisture observations to update the model state and generated runoff based on the ensemble Kalman filter (EnKF) method, and the second step assimilates discharge observations to update the model state and runoff within a fixed time window based on the ensemble Kalman smoother (EnKS) method. The smoothing technique is adopted to account for the runoff routing lag. Assimilating soil moisture and discharge observations in this way is expected to improve the flood forecasting. To isolate the effectiveness of this dual-step assimilation framework, we designed a dual-EnKF algorithm in which the observed soil moisture and discharge are assimilated separately, without accounting for the runoff routing lag. The results show that the multi-source data assimilation framework can effectively improve flood forecasting, especially when the runoff routing has a distinct time lag. This new data assimilation framework thus holds great potential in operational flood forecasting by merging observations from ground measurements and remote sensing retrievals.
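
    A textbook sketch of the EnKF analysis step used in the first stage, with an upper-layer soil moisture observation updating a three-layer state ensemble. This is not the authors' full dual-step EnKF/EnKS framework, and all dimensions and noise levels are illustrative.

      import numpy as np

      def enkf_update(X, y_obs, H, r_var, rng):
          """Stochastic EnKF analysis step: update an ensemble of states X
          (n_state x n_ens) with a perturbed observation y_obs."""
          n_ens = X.shape[1]
          Xp = X - X.mean(axis=1, keepdims=True)            # ensemble perturbations
          HX = H @ X
          HXp = HX - HX.mean(axis=1, keepdims=True)
          Pxy = Xp @ HXp.T / (n_ens - 1)                    # state-obs covariance
          Pyy = HXp @ HXp.T / (n_ens - 1) + r_var * np.eye(H.shape[0])
          K = Pxy @ np.linalg.inv(Pyy)                      # Kalman gain
          y_pert = y_obs + rng.normal(0.0, np.sqrt(r_var), size=(H.shape[0], n_ens))
          return X + K @ (y_pert - HX)

      rng = np.random.default_rng(5)
      X = rng.normal(0.3, 0.05, size=(3, 40))  # e.g., 3 soil moisture layers, 40 members
      H = np.array([[1.0, 0.0, 0.0]])          # only the upper layer is observed
      Xa = enkf_update(X, np.array([0.35]), H, r_var=0.01**2, rng=rng)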

  12. A pilot study of river flow prediction in urban area based on phase space reconstruction

    NASA Astrophysics Data System (ADS)

    Adenan, Nur Hamiza; Hamid, Nor Zila Abd; Mohamed, Zulkifley; Noorani, Mohd Salmi Md

    2017-08-01

    River flow prediction is closely related to urban hydrology and can provide information for solving problems such as urban flooding. In this pilot study, the daily river flow of the Klang River, Malaysia, was forecasted based on phase space reconstruction. The reconstruction embeds the single river flow variable into an m-dimensional phase space, where the dimension m is chosen from the optimal values of Cao's method. The reconstructed phase space was then used in the forecasting process with the local linear approximation method. Our analysis with Cao's method indicates that flow in the Klang River is chaotic. The overall results provide a good correlation coefficient, which is acceptable given that the study area is influenced by many factors. This pilot study may therefore be proposed for forecasting daily river flow, with the purpose of providing information about the flow of river systems in urban areas.
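
    A minimal sketch of the two steps named above: time-delay embedding into an m-dimensional phase space, then a one-step local linear forecast from nearest neighbours. In the paper m comes from Cao's method; here it is fixed by hand.

      import numpy as np

      def embed(x, m, tau=1):
          """Time-delay embedding of a scalar series into m-dimensional phase space."""
          n = len(x) - (m - 1) * tau
          return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

      def local_linear_forecast(x, m=3, tau=1, k=10):
          """One-step forecast: fit a linear map on the k nearest neighbours of
          the last embedded point (m would come from Cao's method in the paper)."""
          Y = embed(x, m, tau)
          target = x[(m - 1) * tau + 1 :]            # next value for each point
          X_hist, y_hist, last = Y[:-1], target, Y[-1]
          d = np.linalg.norm(X_hist - last, axis=1)
          nn = np.argsort(d)[:k]
          A = np.column_stack([X_hist[nn], np.ones(k)])
          coef, *_ = np.linalg.lstsq(A, y_hist[nn], rcond=None)
          return float(np.append(last, 1.0) @ coef)

      rng = np.random.default_rng(6)
      flow = np.sin(np.linspace(0, 60, 600)) + 0.05 * rng.normal(size=600)
      print(local_linear_forecast(flow))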

  13. Predictability of monthly temperature and precipitation using automatic time series forecasting methods

    NASA Astrophysics Data System (ADS)

    Papacharalampous, Georgia; Tyralis, Hristos; Koutsoyiannis, Demetris

    2018-02-01

    We investigate the predictability of monthly temperature and precipitation by applying automatic univariate time series forecasting methods to a sample of 985 40-year-long monthly temperature and 1552 40-year-long monthly precipitation time series. The methods include a naïve one based on the monthly values of the last year, as well as the random walk (with drift), AutoRegressive Fractionally Integrated Moving Average (ARFIMA), exponential smoothing state-space model with Box-Cox transformation, ARMA errors, Trend and Seasonal components (BATS), simple exponential smoothing, Theta and Prophet methods. Prophet is a recently introduced model inspired by the nature of time series forecasted at Facebook and has not been applied to hydrometeorological time series before, while the use of random walk, BATS, simple exponential smoothing and Theta is rare in hydrology. The methods are tested in performing multi-step ahead forecasts for the last 48 months of the data. We further investigate how different choices of handling the seasonality and non-normality affect the performance of the models. The results indicate that: (a) all the examined methods apart from the naïve and random walk ones are accurate enough to be used in long-term applications; (b) monthly temperature and precipitation can be forecasted to a level of accuracy which can barely be improved using other methods; (c) the externally applied classical seasonal decomposition results mostly in better forecasts compared to the automatic seasonal decomposition used by the BATS and Prophet methods; and (d) Prophet is competitive, especially when it is combined with externally applied classical seasonal decomposition.
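
    As a sketch of the externally applied classical decomposition that the paper finds mostly beneficial: remove the seasonal component, forecast the deseasonalized series (here with simple exponential smoothing via statsmodels), and add the seasonal cycle back for the 48-month test horizon. The synthetic data and model choices are illustrative.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.seasonal import seasonal_decompose
      from statsmodels.tsa.holtwinters import SimpleExpSmoothing

      rng = np.random.default_rng(7)
      n = 480                                           # 40 years of monthly data
      t = np.arange(n)
      y = pd.Series(15 + 8 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1.5, n),
                    index=pd.date_range("1977-01", periods=n, freq="MS"))

      # Externally applied classical decomposition: remove the seasonal cycle,
      # forecast the deseasonalized series, then add the seasonal cycle back.
      dec = seasonal_decompose(y, model="additive", period=12)
      deseason = y - dec.seasonal
      fit = SimpleExpSmoothing(deseason).fit()
      fc_deseason = fit.forecast(48)
      seasonal_cycle = dec.seasonal[:12].values         # one year of seasonal terms
      fc = fc_deseason + np.tile(seasonal_cycle, 4)     # 48-month forecast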

  14. A new scoring method for evaluating the performance of earthquake forecasts and predictions

    NASA Astrophysics Data System (ADS)

    Zhuang, J.

    2009-12-01

    This study presents a new method, the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of its magnitude, this scoring method compensates for the risk that the forecaster has taken; a fair scoring scheme should reward success in a way that is compatible with that risk. Suppose that we have a reference model, usually the Poisson model in ordinary cases or the Omori-Utsu formula when forecasting aftershocks, which gives a probability p0 that at least one event occurs in a given space-time-magnitude window. The forecaster, like a gambler, starts with a certain number of reputation points and bets 1 reputation point on "Yes" or "No" according to his forecast, or bets nothing if he makes a NA-prediction. If the forecaster bets 1 reputation point on "Yes" and loses, his reputation is reduced by 1 point; if his forecast is successful, he is rewarded (1-p0)/p0 reputation points. The quantity (1-p0)/p0 is the return (reward/bet) ratio for bets on "Yes". In this way, if the reference model is correct, the expected return that he gains from this bet is 0. The rule also applies to probability forecasts: if p is the occurrence probability given by the forecaster, we can regard the forecaster as splitting 1 reputation point by betting p on "Yes" and 1-p on "No", so that the forecaster's expected pay-off under the reference model is still 0. From the viewpoints of both the reference model and the forecaster, the rule for reward and punishment is fair. The method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and the reference model is the Poisson model.
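
    The betting rule translates directly into code; the sketch below follows the pay-off description in the abstract and verifies that the expected gain under the reference model is zero.

      import numpy as np

      def gambling_score(p0, bet_yes, outcome):
          """Reputation gain for one space-time-magnitude window.
          p0: reference-model probability of at least one event;
          bet_yes: stake placed on 'Yes' (1 for a deterministic alarm, or the
          forecaster's probability p for a probabilistic forecast);
          outcome: True if an event occurred."""
          gain = 0.0
          if outcome:                      # 'Yes' bet rewarded at (1 - p0) / p0
              gain += bet_yes * (1.0 - p0) / p0
              gain -= (1.0 - bet_yes)      # the 'No' part of the stake is lost
          else:                            # 'No' bet rewarded at p0 / (1 - p0)
              gain += (1.0 - bet_yes) * p0 / (1.0 - p0)
              gain -= bet_yes
          return gain

      # Under the reference model the expected gain is zero, e.g. for bet_yes = 1:
      p0 = 0.1
      expected = (p0 * gambling_score(p0, 1.0, True)
                  + (1 - p0) * gambling_score(p0, 1.0, False))
      print(round(expected, 12))           # ~0.0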

  15. Comparison of Observation Impacts in Two Forecast Systems using Adjoint Methods

    NASA Technical Reports Server (NTRS)

    Gelaro, Ronald; Langland, Rolf; Todling, Ricardo

    2009-01-01

    An experiment is being conducted to compare directly the impact of all assimilated observations on short-range forecast errors in different operational forecast systems. We use the adjoint-based method developed by Langland and Baker (2004), which allows these impacts to be efficiently calculated. This presentation describes preliminary results for a "baseline" set of observations, including both satellite radiances and conventional observations, used by the Navy/NOGAPS and NASA/GEOS-5 forecast systems for the month of January 2007. In each system, about 65% of the total reduction in 24-h forecast error is provided by satellite observations, although the impact of rawinsonde, aircraft, land, and ship-based observations remains significant. Only a small majority (50-55%) of all observations assimilated improves the forecast, while the rest degrade it. It is found that most of the total forecast error reduction comes from observations with moderate-size innovations providing small to moderate impacts, not from outliers with very large positive or negative innovations. In a global context, the relative impacts of the major observation types are fairly similar in each system, although regional differences in observation impact can be significant. Of particular interest is the fact that while satellite radiances have a large positive impact overall, they degrade the forecast in certain locations common to both systems, especially over land and ice surfaces. Ongoing comparisons of this type, with results expected from other operational centers, should lead to more robust conclusions about the impacts of the various components of the observing system as well as about the strengths and weaknesses of the methodologies used to assimilate them.

  16. The total probabilities from high-resolution ensemble forecasting of floods

    NASA Astrophysics Data System (ADS)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2015-04-01

    Ensemble forecasting has long been used in meteorological modelling to give an indication of the uncertainty of the forecasts. As meteorological ensemble forecasts often show bias and dispersion errors, there is a need for calibration and post-processing of the ensembles. Typical methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also approaches for regionalizing these methods (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). To make optimal predictions of floods along the stream network in hydrology, we can easily use the ensemble members as input to the hydrological models. However, some of the post-processing methods need modifications when the forecasts are regionalized outside the calibration locations, as done by Hemri et al. (2013). We present a method for spatial regionalization of the post-processed forecasts based on EMOS and top-kriging (Skøien et al., 2006). We also look into different methods for handling the non-normality of runoff and their effect on forecast skill in general and for floods in particular. Berrocal, V. J., Raftery, A. E. and Gneiting, T.: Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts, Mon. Weather Rev., 135(4), 1386-1402, doi:10.1175/MWR3341.1, 2007. Gneiting, T., Raftery, A. E., Westveld, A. H. and Goldman, T.: Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation, Mon. Weather Rev., 133(5), 1098-1118, doi:10.1175/MWR2904.1, 2005. Hemri, S., Fundel, F. and Zappa, M.: Simultaneous calibration of ensemble river flow predictions over an entire range of lead times, Water Resour. Res., 49(10), 6744-6755, doi:10.1002/wrcr.20542, 2013. Raftery, A. E., Gneiting, T., Balabdaoui, F. and Polakowski, M.: Using Bayesian Model Averaging to Calibrate Forecast Ensembles, Mon. Weather Rev., 133(5), 1155-1174, doi:10.1175/MWR2906.1, 2005. Skøien, J. O., Merz, R. and Blöschl, G.: Top-kriging - Geostatistics on stream networks, Hydrol. Earth Syst. Sci., 10(2), 277-287, 2006.
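
    A compact sketch of EMOS in the Gneiting et al. (2005) form: a Gaussian predictive distribution whose mean and variance are affine in the ensemble mean and variance, fitted by minimum CRPS estimation using the closed-form Gaussian CRPS. The optimizer choice and synthetic training data are illustrative.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      def crps_normal(mu, sigma, y):
          """Closed-form CRPS of a normal predictive distribution N(mu, sigma^2)."""
          z = (y - mu) / sigma
          return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z)
                          - 1 / np.sqrt(np.pi))

      def fit_emos(ens_mean, ens_var, obs):
          """EMOS: predictive N(a + b*mean, c + d*var), parameters found by
          minimum CRPS estimation over a training set."""
          def loss(theta):
              a, b, c, d = theta
              mu = a + b * ens_mean
              sigma = np.sqrt(np.maximum(c + d * ens_var, 1e-6))
              return crps_normal(mu, sigma, obs).mean()
          res = minimize(loss, x0=[0.0, 1.0, 1.0, 1.0], method="Nelder-Mead")
          return res.x

      rng = np.random.default_rng(8)
      ens = rng.normal(5.0, 1.0, size=(200, 20)) + rng.normal(0, 0.5, size=(200, 1))
      obs = 0.5 + 0.9 * ens.mean(axis=1) + rng.normal(0, 0.7, size=200)
      a, b, c, d = fit_emos(ens.mean(axis=1), ens.var(axis=1), obs)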

  17. Day-Ahead PM2.5 Concentration Forecasting Using WT-VMD Based Decomposition Method and Back Propagation Neural Network Improved by Differential Evolution

    PubMed Central

    Wang, Deyun; Liu, Yanling; Luo, Hongyuan; Yue, Chenqiang; Cheng, Sheng

    2017-01-01

    Accurate PM2.5 concentration forecasting is crucial for protecting public health and the atmospheric environment. However, the intermittent and unstable nature of the PM2.5 concentration series makes forecasting it a very difficult task. In order to improve the forecast accuracy of PM2.5 concentration, this paper proposes a hybrid model based on wavelet transform (WT), variational mode decomposition (VMD) and a back propagation (BP) neural network optimized by the differential evolution (DE) algorithm. Firstly, WT is employed to disassemble the PM2.5 concentration series into a number of subsets with different frequencies. Secondly, VMD is applied to decompose each subset into a set of variational modes (VMs). Thirdly, the DE-BP model is utilized to forecast all the VMs. Fourthly, the forecast value of each subset is obtained by aggregating the forecast results of all the VMs obtained from the VMD decomposition of that subset. Finally, the final forecast series of PM2.5 concentration is obtained by adding up the forecast values of all subsets. Two PM2.5 concentration series, collected from Wuhan and Tianjin, China, are used to test the effectiveness of the proposed model. The results demonstrate that the proposed model outperforms all the other models considered in this paper. PMID:28704955

  18. Forecasting Nonlinear Chaotic Time Series with Function Expression Method Based on an Improved Genetic-Simulated Annealing Algorithm

    PubMed Central

    Wang, Jun; Zhou, Bi-hua; Zhou, Shu-dao; Sheng, Zheng

    2015-01-01

    The paper proposes a novel function expression method to forecast chaotic time series, using an improved genetic-simulated annealing (IGSA) algorithm to establish the optimum function expression that describes the behavior of the time series. To deal with the weaknesses of the genetic algorithm, the proposed algorithm incorporates the simulated annealing operation, which has strong local search ability, into the genetic algorithm to enhance the performance of the optimization; the fitness function and genetic operators are also improved. Finally, the method is applied to the chaotic time series of the Quadratic and Rossler maps for validation, and the effect of noise in the chaotic time series is studied numerically. The numerical results verify that the method can forecast chaotic time series with high precision and effectiveness, and that the forecasting precision in the presence of some noise is also satisfactory. It can be concluded that the IGSA algorithm is energy-efficient and superior. PMID:26000011

  19. Forecast and analysis of the ratio of electric energy to terminal energy consumption for global energy internet

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Zhong, Ming; Cheng, Ling; Jin, Lu; Shen, Si

    2018-02-01

    Against the background of building the global energy internet, forecasting and analysing the ratio of electric energy to terminal energy consumption has both theoretical and practical significance. This paper first analyses the factors influencing the ratio of electric energy to terminal energy consumption and then uses a combination method to forecast and analyse the global proportion of electric energy. A cointegration model for the proportion of electric energy is constructed using influencing factors such as the electricity price index, GDP, economic structure, energy use efficiency and total population. Finally, a forecast of the proportion of electric energy is produced with a combination-forecasting model based on multiple linear regression, trend analysis and the variance-covariance method. The forecast describes the development trend of the proportion of electric energy over 2017-2050, and the proportion of electric energy in 2050 is analysed in detail using scenario analysis.
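
    The variance-covariance combination named above is commonly implemented with weights inversely proportional to each method's historical error variance; the sketch below uses that simple variant, which is an assumption about the paper's exact formulation, and all data are illustrative.

      import numpy as np

      def var_cov_weights(errors):
          """Variance-covariance combination: weights inversely proportional to
          each method's historical error variance (a common simple variant)."""
          v = np.var(errors, axis=0)
          w = 1.0 / v
          return w / w.sum()

      # Historical one-step errors of three methods (regression, trend, ...):
      rng = np.random.default_rng(9)
      errs = rng.normal(0, [0.5, 1.0, 2.0], size=(40, 3))
      w = var_cov_weights(errs)
      forecasts = np.array([0.52, 0.55, 0.49])  # each method's forecast of the ratio
      combined = float(w @ forecasts)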

  20. Forecasting nonlinear chaotic time series with function expression method based on an improved genetic-simulated annealing algorithm.

    PubMed

    Wang, Jun; Zhou, Bi-hua; Zhou, Shu-dao; Sheng, Zheng

    2015-01-01

    The paper proposes a novel function expression method to forecast chaotic time series, using an improved genetic-simulated annealing (IGSA) algorithm to establish the optimum function expression that describes the behavior of time series. In order to deal with the weakness associated with the genetic algorithm, the proposed algorithm incorporates the simulated annealing operation which has the strong local search ability into the genetic algorithm to enhance the performance of optimization; besides, the fitness function and genetic operators are also improved. Finally, the method is applied to the chaotic time series of Quadratic and Rossler maps for validation. The effect of noise in the chaotic time series is also studied numerically. The numerical results verify that the method can forecast chaotic time series with high precision and effectiveness, and the forecasting precision with certain noise is also satisfactory. It can be concluded that the IGSA algorithm is energy-efficient and superior.

  1. Load forecasting via suboptimal seasonal autoregressive models and iteratively reweighted least squares estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mbamalu, G.A.N.; El-Hawary, M.E.

    The authors propose suboptimal least squares or iteratively reweighted least squares (IRWLS) procedures for estimating the parameters of a seasonal multiplicative AR model encountered in power system load forecasting. The proposed method uses an interactive computer environment to estimate the parameters of a seasonal multiplicative AR process and comprises five major computational steps. The first determines the order of the seasonal multiplicative AR process, and the second uses least squares or IRWLS to estimate the optimal nonseasonal AR model parameters. In the third step one obtains the intermediate series by back forecasting, which is followed by using least squares or IRWLS to estimate the optimal seasonal AR parameters. The final step uses the estimated parameters to forecast future load. The method is applied to predict the Nova Scotia Power Corporation's hourly load at lead times up to 168 hours. The results obtained are documented and compared with results based on the Box and Jenkins method.
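
    A minimal sketch of an IRWLS fit for the nonseasonal AR step: reweighting by the inverse absolute residual drives the fit toward an L1-type robust solution. The weighting scheme and stopping rule are assumptions, not the authors' exact procedure.

      import numpy as np

      def irwls_ar(y, p, n_iter=20, eps=1e-6):
          """Fit an AR(p) model by iteratively reweighted least squares: weights
          1/max(|residual|, eps) give a robust L1-like fit (details assumed)."""
          X = np.column_stack([y[p - i - 1 : len(y) - i - 1] for i in range(p)])
          t = y[p:]
          w = np.ones(len(t))
          for _ in range(n_iter):
              sw = np.sqrt(w)
              coef, *_ = np.linalg.lstsq(sw[:, None] * X, sw * t, rcond=None)
              resid = t - X @ coef
              w = 1.0 / np.maximum(np.abs(resid), eps)
          return coef

      rng = np.random.default_rng(10)
      load = np.zeros(300)
      for k in range(2, 300):                 # synthetic AR(2) hourly load
          load[k] = 1.2 * load[k - 1] - 0.4 * load[k - 2] + rng.normal()
      phi = irwls_ar(load, p=2)               # ~[1.2, -0.4]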

  2. The method of planning the energy consumption for electricity market

    NASA Astrophysics Data System (ADS)

    Russkov, O. V.; Saradgishvili, S. E.

    2017-10-01

    The limitations of existing forecast models are identified. The proposed method is based on game theory, probability theory and forecasting of energy price relations. The new method provides a basis for planning the uneven energy consumption of an industrial enterprise, and its ecological aspect is also discussed. A program module implementing the algorithm of the method is described, and positive test results at an industrial enterprise are reported. The method optimizes the difference between planned and actual energy consumption for every hour of the day. It is concluded that the method is applicable to both economic and ecological challenges.

  3. Crude oil price analysis and forecasting based on variational mode decomposition and independent component analysis

    NASA Astrophysics Data System (ADS)

    E, Jianwei; Bao, Yanling; Ye, Jimin

    2017-10-01

    As one of the most vital energy resources in the world, crude oil plays a significant role in the international economic market, and the fluctuation of the crude oil price has attracted academic and commercial attention. Many methods exist for forecasting the trend of the crude oil price, but traditional models often fail to predict it accurately. This paper therefore proposes a hybrid method, called VMD-ICA-ARIMA, which combines variational mode decomposition (VMD), independent component analysis (ICA) and the autoregressive integrated moving average (ARIMA) model. The purpose of this study is to analyze the factors influencing the crude oil price and to predict its future values. The major steps are as follows. First, VMD is applied to the original signal (the crude oil price) to adaptively decompose it into mode functions. Second, ICA separates the independent components, and how these components affect the crude oil price is analyzed. Finally, the crude oil price is forecasted with the ARIMA model; the forecast trend shows that the crude oil price declines periodically. Compared with the benchmark ARIMA and EEMD-ICA-ARIMA, VMD-ICA-ARIMA forecasts the crude oil price more accurately.

  4. Skill of Global Raw and Postprocessed Ensemble Predictions of Rainfall over Northern Tropical Africa

    NASA Astrophysics Data System (ADS)

    Vogel, Peter; Knippertz, Peter; Fink, Andreas H.; Schlueter, Andreas; Gneiting, Tilmann

    2018-04-01

    Accumulated precipitation forecasts are of high socioeconomic importance for agriculturally dominated societies in northern tropical Africa. In this study, we analyze the performance of nine operational global ensemble prediction systems (EPSs) relative to climatology-based forecasts for 1- to 5-day accumulated precipitation, based on the monsoon seasons 2007-2014 for three regions within northern tropical Africa. To assess the full potential of raw ensemble forecasts across spatial scales, we apply state-of-the-art statistical postprocessing methods in the form of Bayesian Model Averaging (BMA) and Ensemble Model Output Statistics (EMOS), and verify against station and spatially aggregated, satellite-based gridded observations. Raw ensemble forecasts are uncalibrated, unreliable, and underperform relative to climatology, independently of region, accumulation time, monsoon season, and ensemble. Differences between raw ensemble and climatological forecasts are large, and partly stem from poor prediction of low precipitation amounts. BMA and EMOS postprocessed forecasts are calibrated, reliable, and strongly improve on the raw ensembles, but - somewhat disappointingly - typically do not outperform climatology. Most EPSs exhibit slight improvements over the period 2007-2014, but overall have little added value compared to climatology. We suspect that the parametrization of convection is a potential cause of the sobering lack of ensemble forecast skill in a region dominated by mesoscale convective systems.

  5. Adaptive time-variant models for fuzzy-time-series forecasting.

    PubMed

    Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching

    2010-12-01

    A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.

  6. Automated flare forecasting using a statistical learning technique

    NASA Astrophysics Data System (ADS)

    Yuan, Yuan; Shih, Frank Y.; Jing, Ju; Wang, Hai-Min

    2010-08-01

    We present a new method for automatically forecasting the occurrence of solar flares based on photospheric magnetic measurements. The method is a cascading combination of an ordinal logistic regression model and a support vector machine classifier. The predictive variables are three photospheric magnetic parameters, i.e., the total unsigned magnetic flux, length of the strong-gradient magnetic polarity inversion line, and total magnetic energy dissipation. The output is true or false for the occurrence of a certain level of flares within 24 hours. Experimental results, from a sample of 230 active regions between 1996 and 2005, show the accuracies of a 24-hour flare forecast to be 0.86, 0.72, 0.65 and 0.84 respectively for the four different levels. Comparison shows an improvement in the accuracy of X-class flare forecasting.
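
    A sketch of the cascading idea with scikit-learn, noting that scikit-learn has no ordinal logistic regression, so an ordinary logistic regression stands in for the paper's first stage; its predicted probability is appended to the predictors for the SVM stage. Data and thresholds are illustrative.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.svm import SVC

      rng = np.random.default_rng(11)
      # Three photospheric predictors: total unsigned flux, PIL length,
      # total magnetic energy dissipation (synthetic stand-ins).
      X = rng.normal(size=(230, 3))
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 230) > 0.8).astype(int)

      # Stage 1: logistic regression produces a flare probability
      # (an ordinary LR stands in here for the paper's ordinal LR).
      lr = LogisticRegression().fit(X, y)
      p_flare = lr.predict_proba(X)[:, [1]]

      # Stage 2: an SVM classifies using the original predictors plus the
      # stage-1 probability, turning the cascade into a yes/no forecast.
      svm = SVC(kernel="rbf").fit(np.hstack([X, p_flare]), y)
      forecast = svm.predict(np.hstack([X[:5], p_flare[:5]]))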

  7. An assessment of a North American Multi-Model Ensemble (NMME) based global drought early warning forecast system

    NASA Astrophysics Data System (ADS)

    Wood, E. F.; Yuan, X.; Sheffield, J.; Pan, M.; Roundy, J.

    2013-12-01

    One of the key recommendations of the WCRP Global Drought Information System (GDIS) workshop is to develop an experimental real-time global monitoring and prediction system. While great advances have been made in global drought monitoring based on satellite observations and model reanalysis data, global drought forecasting has lagged behind, in part due to the limited skill of both climate forecast models and global hydrologic predictions. Having worked on drought monitoring and forecasting over the USA for more than a decade, the Princeton land surface hydrology group is now developing an experimental global drought early warning system based on multiple climate forecast models and a calibrated global hydrologic model. In this presentation, we test its capability for seasonal forecasting of meteorological, agricultural and hydrologic droughts over major global river basins, using precipitation, soil moisture and streamflow forecasts respectively. Based on the joint probability distribution between observations from Princeton's global drought monitoring system and model hindcasts and real-time forecasts from the North American Multi-Model Ensemble (NMME) project, we (i) bias correct the monthly precipitation and temperature forecasts from multiple climate forecast models, (ii) downscale them to a daily time scale, and (iii) use them to drive the calibrated VIC model to produce global drought forecasts at a 1-degree resolution. A parallel run using the ESP forecast method, which is based on resampling historical forcings, is also carried out for comparison. Analysis is conducted over major global river basins with multiple drought indices that have different time scales and characteristics. The meteorological drought forecast carries no uncertainty from hydrologic models and can be validated directly against observations, making the validation an 'apples-to-apples' comparison. Preliminary results for the evaluation of meteorological drought onset hindcasts indicate that climate models increase drought detectability over ESP by 31%-81%. However, less than 30% of global drought onsets can be detected by climate models; the missed drought events are associated with weak ENSO signals and lower potential predictability. Given the high false alarm rate of climate models, reliability is more important than sharpness for a skillful probabilistic drought onset forecast. Validations and skill assessments for agricultural and hydrologic drought forecasts are carried out using soil moisture and streamflow output from the VIC land surface model (LSM) forced by a global forcing data set. Given our previous drought forecasting experience over the USA and Africa, validating hydrologic drought forecasts is a significant challenge for a global drought early warning system.

  8. Evaluation of the fast orthogonal search method for forecasting chloride levels in the Deltona groundwater supply (Florida, USA)

    NASA Astrophysics Data System (ADS)

    El-Jaat, Majda; Hulley, Michael; Tétreault, Michel

    2018-02-01

    Despite the broad impact and importance of saltwater intrusion in coastal aquifers, little research has been directed towards forecasting saltwater intrusion in areas where the source of saltwater is uncertain. Saline contamination of inland groundwater supplies is a concern for numerous communities in the southern US, including the city of Deltona, Florida. Furthermore, conventional numerical tools for forecasting saltwater contamination are heavily dependent on reliable characterization of the physical properties of the underlying aquifers, information that is often absent or challenging to obtain. To overcome these limitations, a reliable alternative data-driven model for forecasting salinity in a groundwater supply was developed for Deltona using the fast orthogonal search (FOS) method. FOS was applied to monthly water-demand data and corresponding chloride concentrations at water supply wells. Groundwater salinity measurements from Deltona water supply wells were used to evaluate the forecasting capability and accuracy of the FOS model. Accurate and reliable groundwater salinity forecasting is necessary to support effective and sustainable coastal water resource planning and management. The 27 available water supply wells for Deltona were randomly split into three test groups for the purposes of FOS model development and performance assessment. Based on four performance indices (RMSE, RSR, NSEC, and R), the FOS model proved to be a reliable and robust forecaster of groundwater salinity. FOS is relatively inexpensive to apply, is not based on rigorous physical characterization of the water supply aquifer, and yields reliable estimates of groundwater salinity in active water supply wells.

  9. The Past, Present and Future of the Meteorological Phenomena Identification Near the Ground (mPING) Project

    NASA Astrophysics Data System (ADS)

    Elmore, K. L.

    2016-12-01

    The Meteorological Phenomena Identification Near the Ground (mPING) project is an example of a crowd-sourced, citizen science effort to gather data of sufficient quality and quantity for new post-processing methods that use machine learning. Transportation and infrastructure are particularly sensitive to precipitation type in winter weather. We extract attributes from operational numerical forecast models and use them in a random forest to generate forecast winter precipitation types. We find that random forests applied to forecast soundings are effective at generating skillful forecasts of surface precipitation type (ptype), with considerably more skill than the current algorithms, especially for ice pellets and freezing rain. We also find that three very different forecast models yield similar overall results, showing that random forests are able to extract essentially equivalent information from different forecast models. We also show that the random forest for each model and each profile type is unique to the particular forecast model, and that random forests developed using a particular model suffer significant degradation when given attributes derived from a different model. This implies that no single algorithm can perform well across all forecast models. Clearly, random forests extract information unavailable to "physically based" methods, because the physical information in the models does not appear as we expect. One interesting result is that the classic "warm nose" sounding profile is, by far, the most sensitive to the particular forecast model, but it is also the profile for which random forests are most skillful. Finally, a method for calibrating probabilities for each ptype using multinomial logistic regression is shown.

  10. Predictability and Prediction of Low-Frequency Rainfall Over the Lower Reaches of the Yangtze River Valley on the Time Scale of 20 to 30 days

    NASA Astrophysics Data System (ADS)

    Yang, Qiuming

    2018-01-01

    This paper presents a predictability study of the 20-30-day low-frequency rainfall over the lower reaches of the Yangtze River valley (LYRV). The study relies on an extended complex autoregressive (ECAR) model based on the principal components of the global 850 hPa low-frequency meridional wind. ECAR is a recently developed, data-driven climate forecast method. It not only captures the lagged covariation between the leading low-frequency components of the global circulation and rainfall in a complex space, but also describes the joint variations of the low-frequency components of the climate system in a low-dimensional space. A 6-year forecast experiment is conducted on the low-frequency rainfall over the LYRV for extended-range daily forecasts during 2009-2014, based on a time-varying high-order ECAR. The experiments demonstrate useful real-time forecast skill out to a lead time of 28 days with a fifth-order model, and out to 27 days for forecasts initiated from a weak intraseasonal oscillation (ISO); the high-order ECAR thus significantly improves predictions of the ISO. The analysis of 20-30-day ISO predictability reveals a predictability limit of about 28-40 days. The forecast framework used in this study therefore has the potential to improve real-time forecasts of the 20-30-day oscillations related to heavy summer rainfall over the LYRV.

  11. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part II: Evaluation of Sample Models

    NASA Technical Reports Server (NTRS)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast of contrail formation over the contiguous United States (CONUS) is created using hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and the Rapid Update Cycle (RUC), as well as GOES water vapor channel measurements, combined with surface and satellite observations of contrails. Two groups of logistic models were created. The first group (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies of both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.

  12. New Aspects of Probabilistic Forecast Verification Using Information Theory

    NASA Astrophysics Data System (ADS)

    Tödter, Julian; Ahrens, Bodo

    2013-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are briefly reviewed, and a consistent generalization to continuous forecasts is motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. In particular, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.

  13. The Weather Forecast Using Data Mining Research Based on Cloud Computing.

    NASA Astrophysics Data System (ADS)

    Wang, ZhanJie; Mazharul Mujib, A. B. M.

    2017-10-01

    Weather forecasting has been an important application in meteorology and one of the most scientifically and technologically challenging problems around the world. In this study, we analyze the use of data mining techniques in forecasting weather. The paper proposes a method to develop a service-oriented architecture for weather information systems that forecast weather using these data mining techniques. This can be carried out using artificial neural network and decision tree algorithms on meteorological data collected over a specific period of time. The algorithms presented the best results in generating classification rules for the mean weather variables. The results showed that these data mining techniques can be sufficient for weather forecasting.

  14. Growth of Errors and Uncertainties in Medium Range Ensemble Forecasts of U.S. East Coast Cool Season Extratropical Cyclones

    NASA Astrophysics Data System (ADS)

    Zheng, Minghua

    Cool-season extratropical cyclones near the U.S. East Coast often have significant impacts on the safety, health, environment and economy of this most densely populated region. Hence it is of vital importance to forecast these high-impact winter storm events as accurately as possible by numerical weather prediction (NWP), including in the medium range. Ensemble forecasts are appealing to operational forecasters when forecasting such events because they can provide an envelope of likely solutions to serve user communities. However, it is generally accepted that ensemble outputs are not used efficiently in NWS operations, mainly due to the lack of simple and quantitative tools to communicate forecast uncertainties and of ensemble verification to assess model errors and biases. Ensemble sensitivity analysis (ESA), which employs a linear correlation and regression between a chosen forecast metric and the forecast state vector, can be used to analyze the development of forecast uncertainty for both short- and medium-range forecasts. The application of ESA to a high-impact winter storm in December 2010 demonstrated that the sensitivity signals based on different forecast metrics are robust. In particular, the ESA based on the leading two EOF PCs can separate sensitive regions associated with cyclone amplitude and intensity uncertainties, respectively. The sensitivity signals were verified using the leave-one-out cross validation (LOOCV) method based on a multi-model ensemble from CMC, ECMWF, and NCEP. The climatology of ensemble sensitivities for the leading two EOF PCs, based on 3-day and 6-day forecasts of historical cyclone cases, was presented. It was found that the EOF1 pattern often represents the intensity variations while the EOF2 pattern represents the track variations along the west-southwest to east-northeast direction. For PC1, the upper-level trough associated with the East Coast cyclone and its downstream ridge are important to the forecast uncertainty in cyclone strength, and the initial differences in forecasting the ridge along the west coast of North America impact the EOF1 pattern most. For PC2, it was shown that the shift of the tri-polar structure is most significantly related to the cyclone track forecasts. The EOF/fuzzy clustering tool was applied to diagnose the scenarios in operational ensemble forecasts of East Coast winter storms. It was shown that the clustering method could efficiently separate the forecast scenarios associated with East Coast storms based on the 90-member multi-model ensemble. A scenario-based ensemble verification method has been proposed and applied to examine the capability of different EPSs in capturing the analysis scenarios for historical East Coast cyclone cases at lead times of 1-9 days. The results suggest that the NCEP model performs better in short-range forecasts in capturing the analysis scenario, although it is under-dispersed. The ECMWF ensemble shows the best performance in the medium range. The CMC model is found to show the smallest percentage of members in the analysis group and a relatively high missing rate, suggesting that it is less reliable in capturing the analysis scenario when compared with the other two EPSs. A combination of the NCEP and CMC models has been found to reduce the missing rate and improve the error-spread skill in medium- to extended-range forecasts. Based on the orthogonal features of the EOF patterns, the model errors for 1-6-day forecasts have been decomposed for the leading two EOF patterns.
The results of the error decomposition show that the NCEP model tends to better represent both the EOF1 and EOF2 patterns, with smaller intensity and displacement errors during days 1-3. The ECMWF model is found to have the smallest errors in both the EOF1 and EOF2 patterns during days 4-6. We have also found that East Coast cyclones in the ECMWF forecast tend to lie to the southwest of those in the other two models in representing the EOF2 pattern, which is associated with the southwest-northeast shifting of the cyclone. This result suggests that the ECMWF model may have a tendency to show a closer-to-shore solution in forecasting East Coast winter storms. The downstream impacts of Rossby wave packets (RWPs) on the predictability of winter storms are investigated to explore the source of ensemble uncertainties. The composited RWP amplitude (RWPA) anomalies show that there are enhanced RWPs propagating across the Pacific in both large-error and large-spread cases over the verification regions. There are also indications that the errors might propagate with a speed comparable to the group velocity of RWPs. Based on the composite results as well as our observations of the operational daily RWPA, a conceptual model of error/uncertainty development associated with RWPs has been proposed to serve as a practical tool for understanding the evolution of forecast errors and uncertainties associated with coherent RWPs originating from as far upstream as the western Pacific. (Abstract shortened by ProQuest.)
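
    The core ESA computation described in this record is a univariate regression of a scalar forecast metric on each initial-state variable across the ensemble, as sketched below; the dimensions and the synthetic metric are illustrative.

      import numpy as np

      def ensemble_sensitivity(J, X):
          """Ensemble sensitivity analysis: univariate regression of a scalar
          forecast metric J (n_ens,) on each initial-state grid point in
          X (n_ens, n_points): dJ/dx = cov(J, x) / var(x)."""
          Ja = J - J.mean()
          Xa = X - X.mean(axis=0)
          cov = Ja @ Xa / (len(J) - 1)
          var = Xa.var(axis=0, ddof=1)
          return cov / var

      rng = np.random.default_rng(12)
      X = rng.normal(size=(90, 1000))     # 90-member multi-model ensemble, 1000 points
      J = 2.0 * X[:, 10] + rng.normal(0, 0.5, 90)  # e.g., PC1 of the cyclone EOF
      sens = ensemble_sensitivity(J, X)   # large magnitude flags sensitive regions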

  15. Real-time localization of mobile device by filtering method for sensor fusion

    NASA Astrophysics Data System (ADS)

    Fuse, Takashi; Nagara, Keita

    2017-06-01

    Most applications on mobile devices require self-localization of the device. Since GPS cannot be used in indoor environments, the positions of mobile devices are estimated autonomously using an IMU; because the IMU has low accuracy, self-localization in indoor environments remains challenging. Image-based self-localization methods have also been developed, and their accuracy is improving. This paper develops a GPS-free indoor self-localization method that simultaneously integrates mobile device sensors such as the IMU and camera. The proposed method consists of observations, forecasting and filtering. The position and velocity of the mobile device are defined as a state vector. In the self-localization, observations correspond to data from the IMU and camera (observation vector), forecasting to a mobile device motion model (system model), and filtering to tracking by inertial surveying with a coplanarity condition and an inverse depth model (observation model). Positions of the tracked device are first estimated by the system model (forecasting step), which assumes linear motion. The estimated positions are then optimized against the new observation data based on likelihood (filtering step); this optimization corresponds to estimation of the maximum a posteriori probability. A particle filter is used for the computations in the forecasting and filtering steps. The proposed method is applied to data acquired by mobile devices in an indoor environment, and the experiments confirm the high performance of the method.
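
    A minimal particle filter cycle matching the forecasting/filtering description above; the motion and observation models and all noise levels are illustrative stand-ins for the paper's inertial-surveying and coplanarity/inverse-depth models.

      import numpy as np

      def particle_filter_step(particles, weights, control, z_obs, rng,
                               q_std=0.05, r_std=0.5):
          """One forecast+filter cycle of a particle filter for 2-D positioning:
          propagate with an IMU-style motion increment (system model), reweight
          by the likelihood of the observation (observation model), resample."""
          # Forecasting step: move every particle by the control plus noise.
          particles = particles + control + rng.normal(0, q_std, particles.shape)
          # Filtering step: weight by Gaussian likelihood of the observed position.
          d2 = np.sum((particles - z_obs) ** 2, axis=1)
          weights = weights * np.exp(-0.5 * d2 / r_std**2)
          weights /= weights.sum()
          # Resample to avoid degeneracy (systematic resampling would also work).
          idx = rng.choice(len(particles), size=len(particles), p=weights)
          return particles[idx], np.full(len(particles), 1.0 / len(particles))

      rng = np.random.default_rng(13)
      particles = rng.normal(0, 1.0, size=(500, 2))
      weights = np.full(500, 1 / 500)
      particles, weights = particle_filter_step(particles, weights,
                                                control=np.array([0.1, 0.0]),
                                                z_obs=np.array([0.12, 0.01]),
                                                rng=rng)
      estimate = particles.mean(axis=0)    # posterior position estimate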

  16. Operational Planning of Channel Airlift Missions Using Forecasted Demand

    DTIC Science & Technology

    2013-03-01

    tailored to the specific problem (Metaheuristics, 2005). As seen in the section Cargo Loading Algorithm, heuristic methods are often iterative...that are equivalent to the forecasted cargo amount. The simulated pallets are then used in a heuristic cargo loading algorithm. The loading...algorithm places cargo onto available aircraft (based on real schedules) given the date and the destination, and outputs statistics based on the aircraft ton

  17. Global Earthquake Activity Rate models based on version 2 of the Global Strain Rate Map

    NASA Astrophysics Data System (ADS)

    Bird, P.; Kreemer, C.; Kagan, Y. Y.; Jackson, D. D.

    2013-12-01

    Global Earthquake Activity Rate (GEAR) models have usually been based on either relative tectonic motion (fault slip rates and/or distributed strain rates), or on smoothing of seismic catalogs. However, a hybrid approach appears to perform better than either parent, at least in some retrospective tests. First, we construct a Tectonic ('T') forecast of shallow (≤ 70 km) seismicity based on global plate-boundary strain rates from version 2 of the Global Strain Rate Map. Our approach is the SHIFT (Seismic Hazard Inferred From Tectonics) method described by Bird et al. [2010, SRL], in which the character of the strain rate tensor (thrusting and/or strike-slip and/or normal) is used to select the most comparable type of plate boundary for calibration of the coupled seismogenic lithosphere thickness and corner magnitude. One difference is that activity of offshore plate boundaries is spatially smoothed using empirical half-widths [Bird & Kagan, 2004, BSSA] before conversion to seismicity. Another is that the velocity-dependence of coupling in subduction and continental-convergent boundaries [Bird et al., 2009, BSSA] is incorporated. Another forecast component is the smoothed-seismicity ('S') forecast model of [Kagan & Jackson, 1994, JGR; Kagan & Jackson, 2010, GJI], which was based on optimized smoothing of the shallow part of the GCMT catalog, years 1977-2004. Both forecasts were prepared for threshold magnitude 5.767. Then, we create hybrid forecasts by one of 3 methods: (a) taking the greater of S or T; (b) simple weighted-average of S and T; or (c) log of the forecast rate is a weighted average of the logs of S and T. In methods (b) and (c) there is one free parameter, which is the fractional contribution from S. All hybrid forecasts are normalized to the same global rate. Pseudo-prospective tests for 2005-2012 (using versions of S and T calibrated on years 1977-2004) show that many hybrid models outperform both parents (S and T), and that the optimal weight on S is in the neighborhood of 5/8. This is true whether forecast performance is scored by Kagan's [2009, GJI] I1 information score, or by the S-test of Zechar & Jordan [2010, BSSA]. These hybrids also score well (0.97) in the ASS-test of Zechar & Jordan [2008, GJI] with respect to prior relative intensity.
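
    The three hybridization rules can be sketched compactly; in the log-linear variant the log rates are averaged with weight w_s on S, and the weight 5/8 echoes the optimum reported above. The synthetic rate maps and normalization convention are illustrative.

      import numpy as np

      def hybrid_gear(S, T, w_s=0.625, method="log"):
          """Combine smoothed-seismicity (S) and tectonic (T) rate maps, then
          renormalize so the hybrid preserves the global total rate."""
          if method == "max":
              H = np.maximum(S, T)
          elif method == "linear":
              H = w_s * S + (1.0 - w_s) * T
          else:                                   # log-linear combination
              H = np.exp(w_s * np.log(S) + (1.0 - w_s) * np.log(T))
          return H * (S.sum() / H.sum())          # preserve the global rate

      rng = np.random.default_rng(14)
      S = rng.lognormal(mean=-8, sigma=1.0, size=(90, 180))   # EQ rate per cell
      T = rng.lognormal(mean=-8, sigma=1.0, size=(90, 180))
      T *= S.sum() / T.sum()                      # both normalized to the same total
      H = hybrid_gear(S, T)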

  18. Assessing methods for developing crop forecasting in the Iberian Peninsula

    NASA Astrophysics Data System (ADS)

    Ines, A. V. M.; Capa Morocho, M. I.; Baethgen, W.; Rodriguez-Fonseca, B.; Han, E.; Ruiz Ramos, M.

    2015-12-01

    Seasonal climate prediction may allow predicting crop yield to reduce the vulnerability of agricultural production to climate variability and its extremes. It has already been demonstrated that seasonal climate predictions at the European (or Iberian) scale from ensembles of global coupled climate models have some skill (Palmer et al., 2004). The limited predictability that the atmosphere exhibits in mid-latitudes, and therefore over the Iberian Peninsula (IP), can be managed by a probabilistic approach based on terciles. This study presents an application for the IP of two methods for linking tercile-based seasonal climate forecasts with crop models to improve crop predictability. Two methods were evaluated and applied for disaggregating seasonal rainfall forecasts into daily weather realizations: 1) a stochastic weather generator and 2) a forecast tercile resampler. Both methods were evaluated in a case study where the impacts of two seasonal rainfall forecasts (a wet and a dry forecast for 1998 and 2015, respectively) on rainfed wheat yield and irrigation requirements of maize in the IP were analyzed. Simulated wheat yield and irrigation requirements of maize were computed with the crop models CERES-wheat and CERES-maize, which are included in the Decision Support System for Agrotechnology Transfer (DSSAT v.4.5, Hoogenboom et al., 2010). Simulations were run at several locations in Spain where the crop model was calibrated and validated with independent field data. These methodologies would allow quantifying the benefits and risks of a seasonal climate forecast to potential users such as farmers, agroindustry and insurance companies in the IP. Therefore, we would be able to establish early warning systems and to design crop management adaptation strategies that take advantage of favorable conditions or reduce the effect of adverse ones. References: Palmer, T. et al., 2004. Development of a European multimodel ensemble system for seasonal-to-interannual prediction (DEMETER). Bulletin of the American Meteorological Society, 85(6): 853-872.
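
    A minimal sketch of the second method, a forecast tercile resampler: historical years are drawn with probabilities proportional to the forecast tercile probabilities, and the daily weather of the drawn years would then drive the crop model. All names and the even split within each tercile are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def tercile_resample(seasonal_rain, forecast_probs, n_realizations=100):
        """Resample historical years consistent with a tercile forecast.

        seasonal_rain  : (n_years,) seasonal rainfall total of each year
        forecast_probs : probabilities of (below, normal, above) terciles
        Returns indices of the drawn years.
        """
        t1, t2 = np.quantile(seasonal_rain, [1 / 3, 2 / 3])
        tercile = np.digitize(seasonal_rain, [t1, t2])        # 0, 1 or 2
        # Each year is drawn with its tercile's forecast probability,
        # shared evenly among the years falling in that tercile.
        p = np.array([forecast_probs[t] / np.sum(tercile == t) for t in tercile])
        return rng.choice(len(seasonal_rain), size=n_realizations, p=p / p.sum())
    ```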

  19. Measuring and forecasting great tsunamis by GNSS-based vertical positioning of multiple ships

    NASA Astrophysics Data System (ADS)

    Inazu, D.; Waseda, T.; Hibiya, T.; Ohta, Y.

    2016-12-01

    Vertical ship positioning by the Global Navigation Satellite System (GNSS) was investigated for measuring and forecasting great tsunamis. We first examined existing GNSS vertical position data of a navigating vessel. The result indicated that by using the kinematic Precise Point Positioning (PPP) method, tsunamis greater than 10^-1 m can be detected from the vertical position of the ship. Based on Automatic Identification System (AIS) data, tens of cargo ships and tankers are regularly identified navigating over the Nankai Trough, southwest of Japan. We then assumed that a future Nankai Trough great earthquake tsunami would be observed by ships at locations based on AIS data. The tsunami forecast capability of these virtual offshore tsunami measurements was examined. A conventional Green's function based inversion was used to determine the initial tsunami height distribution. Tsunami forecast tests over the Nankai Trough were carried out using simulated tsunami data of the vertical positions of multiple cargo ships/tankers on a certain day, and of the currently operating observations by deep-sea pressure gauges and Global Positioning System (GPS) buoys. The forecast capability of ship-based tsunami height measurements alone was shown to be comparable to or better than that using the existing offshore observations.
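
    The Green's function based inversion step can be sketched as a regularized least-squares problem; the Tikhonov term and all names are illustrative assumptions rather than details of the paper.

    ```python
    import numpy as np

    def invert_initial_height(G, obs, reg=1e-2):
        """Least-squares estimate of the initial tsunami height distribution.

        G   : (n_obs, n_sources) Green's functions, i.e. modeled sea-surface
              elevation at each ship location per unit-height source element
        obs : (n_obs,) GNSS vertical ship displacements (waves/tides removed)
        """
        A = G.T @ G + reg * np.eye(G.shape[1])   # small Tikhonov regularization
        return np.linalg.solve(A, G.T @ obs)     # initial height per source
    ```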

  20. Financial forecasts accuracy in Brazil's social security system.

    PubMed

    Silva, Carlos Patrick Alves da; Puty, Claudio Alberto Castelo Branco; Silva, Marcelino Silva da; Carvalho, Solon Venâncio de; Francês, Carlos Renato Lisboa

    2017-01-01

    Long-term social security statistical forecasts produced and disseminated by the Brazilian government aim to provide accurate results that would serve as background information for optimal policy decisions. These forecasts are being used as support for the government's proposed pension reform that plans to radically change the Brazilian Constitution insofar as Social Security is concerned. However, the reliability of official results is uncertain since no systematic evaluation of these forecasts has ever been published by the Brazilian government or anyone else. This paper aims to present a study of the accuracy and methodology of the instruments used by the Brazilian government to carry out long-term actuarial forecasts. We base our research on an empirical and probabilistic analysis of the official models. Our empirical analysis shows that the long-term Social Security forecasts are systematically biased in the short term and have significant errors that render them meaningless in the long run. Moreover, the low level of transparency of the methods impaired the replication of the results published by the Brazilian government, and the use of outdated data compromises the forecast results. In the theoretical analysis, based on a mathematical modeling approach, we discuss the complexity and limitations of the macroeconomic forecast through the computation of confidence intervals. We demonstrate the problems related to error measurement inherent in any forecasting process. We then extend this exercise to the computation of confidence intervals for Social Security forecasts. This mathematical exercise raises questions about the degree of reliability of the Social Security forecasts.

  1. Financial forecasts accuracy in Brazil’s social security system

    PubMed Central

    2017-01-01

    Long-term social security statistical forecasts produced and disseminated by the Brazilian government aim to provide accurate results that would serve as background information for optimal policy decisions. These forecasts are being used as support for the government's proposed pension reform that plans to radically change the Brazilian Constitution insofar as Social Security is concerned. However, the reliability of official results is uncertain since no systematic evaluation of these forecasts has ever been published by the Brazilian government or anyone else. This paper aims to present a study of the accuracy and methodology of the instruments used by the Brazilian government to carry out long-term actuarial forecasts. We base our research on an empirical and probabilistic analysis of the official models. Our empirical analysis shows that the long-term Social Security forecasts are systematically biased in the short term and have significant errors that render them meaningless in the long run. Moreover, the low level of transparency of the methods impaired the replication of the results published by the Brazilian government, and the use of outdated data compromises the forecast results. In the theoretical analysis, based on a mathematical modeling approach, we discuss the complexity and limitations of the macroeconomic forecast through the computation of confidence intervals. We demonstrate the problems related to error measurement inherent in any forecasting process. We then extend this exercise to the computation of confidence intervals for Social Security forecasts. This mathematical exercise raises questions about the degree of reliability of the Social Security forecasts. PMID:28859172

  2. A Dirichlet process model for classifying and forecasting epidemic curves

    PubMed Central

    2014-01-01

    Background A forecast can be defined as an endeavor to quantitatively estimate a future event or probabilities assigned to a future occurrence. Forecasting stochastic processes such as epidemics is challenging since there are several biological, behavioral, and environmental factors that influence the number of cases observed at each point during an epidemic. However, accurate forecasts of epidemics would impact timely and effective implementation of public health interventions. In this study, we introduce a Dirichlet process (DP) model for classifying and forecasting influenza epidemic curves. Methods The DP model is a nonparametric Bayesian approach that enables the matching of current influenza activity to simulated and historical patterns, identifies epidemic curves different from those observed in the past and enables prediction of the expected epidemic peak time. The method was validated using simulated influenza epidemics from an individual-based model and the accuracy was compared to that of the tree-based classification technique, Random Forest (RF), which has been shown to achieve high accuracy in the early prediction of epidemic curves using a classification approach. We also applied the method to forecasting influenza outbreaks in the United States from 1997–2013 using influenza-like illness (ILI) data from the Centers for Disease Control and Prevention (CDC). Results We made the following observations. First, the DP model performed as well as RF in identifying several of the simulated epidemics. Second, the DP model correctly forecasted the peak time several days in advance for most of the simulated epidemics. Third, the accuracy of identifying epidemics different from those already observed improved with additional data, as expected. Fourth, both methods correctly classified epidemics with higher reproduction numbers (R) with a higher accuracy compared to epidemics with lower R values. Lastly, in the classification of seasonal influenza epidemics based on ILI data from the CDC, the methods’ performance was comparable. Conclusions Although RF requires less computational time compared to the DP model, the algorithm is fully supervised implying that epidemic curves different from those previously observed will always be misclassified. In contrast, the DP model can be unsupervised, semi-supervised or fully supervised. Since both methods have their relative merits, an approach that uses both RF and the DP model could be beneficial. PMID:24405642
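
    As a loose illustration of the idea (not the authors' implementation), a truncated Dirichlet-process mixture from scikit-learn can cluster fixed-length epidemic-curve features and softly match a new curve against the learned classes; the data below are synthetic.

    ```python
    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    # Synthetic stand-ins for simulated/historical epidemic curves: each row
    # is a season summarized as a fixed-length vector of weekly case counts.
    rng = np.random.default_rng(2)
    curves = np.vstack([rng.poisson(lam, size=(100, 20))
                        for lam in (30, 60)]).astype(float)

    # Truncated Dirichlet-process mixture: the stick-breaking prior lets the
    # data choose how many of the 15 candidate curve classes are actually used.
    dp = BayesianGaussianMixture(
        n_components=15,
        weight_concentration_prior_type="dirichlet_process",
        covariance_type="diag",
        max_iter=500,
        random_state=0,
    ).fit(curves)

    print(dp.predict(curves[:5]))        # hard class labels
    print(dp.predict_proba(curves[:1]))  # soft match of one curve to the classes
    ```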

  3. Development and evaluation of a data-adaptive alerting algorithm for univariate temporal biosurveillance data.

    PubMed

    Elbert, Yevgeniy; Burkom, Howard S

    2009-11-20

    This paper discusses further advances in making robust predictions with the Holt-Winters forecasts for a variety of syndromic time series behaviors and introduces a control-chart detection approach based on these forecasts. Using three collections of time series data, we compare biosurveillance alerting methods with quantified measures of forecast agreement, signal sensitivity, and time-to-detect. The study presents practical rules for initialization and parameterization of biosurveillance time series. Several outbreak scenarios are used for detection comparison. We derive an alerting algorithm from forecasts using Holt-Winters-generalized smoothing for prospective application to daily syndromic time series. The derived algorithm is compared with simple control-chart adaptations and to more computationally intensive regression modeling methods. The comparisons are conducted on background data from both authentic and simulated data streams. Both types of background data include time series that vary widely by both mean value and cyclic or seasonal behavior. Plausible, simulated signals are added to the background data for detection performance testing at signal strengths calculated to be neither too easy nor too hard to separate the compared methods. Results show that both the sensitivity and the timeliness of the Holt-Winters-based algorithm proved to be comparable or superior to that of the more traditional prediction methods used for syndromic surveillance.
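
    A minimal sketch of the overall approach, pairing Holt-Winters smoothing from statsmodels with a simple control-chart alerting rule; the synthetic series, parameter choices and 3-sigma threshold are illustrative, not the paper's tuned settings.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    # Synthetic daily syndromic counts with a weekly cycle.
    rng = np.random.default_rng(3)
    idx = pd.date_range("2020-01-01", periods=400, freq="D")
    y = pd.Series(50 + 10 * np.sin(2 * np.pi * np.arange(400) / 7)
                  + rng.normal(0, 4, 400), index=idx)

    train, test = y[:-30], y[-30:]
    fit = ExponentialSmoothing(train, trend="add", seasonal="add",
                               seasonal_periods=7).fit()
    forecast = fit.forecast(30)

    # Control-chart alerting: flag days exceeding the forecast by 3 residual
    # standard deviations (threshold choice is illustrative).
    sigma = fit.resid.std()
    print(test[test > forecast + 3 * sigma])
    ```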

  4. A Bayesian spatio-temporal model for forecasting Anaplasma species seroprevalence in domestic dogs within the contiguous United States.

    PubMed

    Liu, Yan; Watson, Stella C; Gettings, Jenna R; Lund, Robert B; Nordone, Shila K; Yabsley, Michael J; McMahan, Christopher S

    2017-01-01

    This paper forecasts the 2016 canine Anaplasma spp. seroprevalence in the United States from eight climate, geographic and societal factors. The forecast's construction and an assessment of its performance are described. The forecast is based on a spatial-temporal conditional autoregressive model fitted to over 11 million Anaplasma spp. seroprevalence test results for dogs conducted in the 48 contiguous United States during 2011-2015. The forecast uses county-level data on eight predictive factors, including annual temperature, precipitation, relative humidity, county elevation, forestation coverage, surface water coverage, population density and median household income. Non-static factors are extrapolated into the forthcoming year with various statistical methods. The fitted model and factor extrapolations are used to estimate next year's regional prevalence. The correlation between the observed and model-estimated county-by-county Anaplasma spp. seroprevalence for the five-year period 2011-2015 is 0.902, demonstrating reasonable model accuracy. The weighted correlation (accounting for different sample sizes) between 2015 observed and forecasted county-by-county Anaplasma spp. seroprevalence is 0.987, exhibiting that the proposed approach can be used to accurately forecast Anaplasma spp. seroprevalence. The forecast presented herein can a priori alert veterinarians to areas expected to see Anaplasma spp. seroprevalence beyond the accepted endemic range. The proposed methods may prove useful for forecasting other diseases.

  5. Influenza forecasting with Google Flu Trends.

    PubMed

    Dugas, Andrea Freyer; Jalalpour, Mehdi; Gel, Yulia; Levin, Scott; Torcaso, Fred; Igusa, Takeru; Rothman, Richard E

    2013-01-01

    We developed a practical influenza forecast model based on real-time, geographically focused, and easy to access data, designed to provide individual medical centers with advanced warning of the expected number of influenza cases, thus allowing for sufficient time to implement interventions. Secondly, we evaluated the effects of incorporating a real-time influenza surveillance system, Google Flu Trends, and meteorological and temporal information on forecast accuracy. Forecast models designed to predict one week in advance were developed from weekly counts of confirmed influenza cases over seven seasons (2004-2011) divided into seven training and out-of-sample verification sets. Forecasting procedures using classical Box-Jenkins, generalized linear models (GLM), and generalized linear autoregressive moving average (GARMA) methods were employed to develop the final model and assess the relative contribution of external variables such as Google Flu Trends, meteorological data, and temporal information. A GARMA(3,0) forecast model with Negative Binomial distribution integrating Google Flu Trends information provided the most accurate influenza case predictions. The model, on average, predicts weekly influenza cases during 7 out-of-sample outbreaks within 7 cases for 83% of estimates. Google Flu Trends data was the only source of external information to provide statistically significant forecast improvements over the base model in four of the seven out-of-sample verification sets. Overall, the p-value of adding this external information to the model is 0.0005. The other exogenous variables did not yield a statistically significant improvement in any of the verification sets. Integer-valued autoregression of influenza cases provides a strong base forecast model, which is enhanced by the addition of Google Flu Trends, confirming the predictive capabilities of search-query-based syndromic surveillance. This accessible and flexible forecast model can be used by individual medical centers to provide advanced warning of future influenza cases.
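
    statsmodels has no GARMA implementation, so as a rough stand-in the sketch below fits a Gaussian ARMAX model with a Google-Flu-Trends-like exogenous regressor via SARIMAX; the data are synthetic and the order (3, 0, 0) simply mirrors the GARMA(3,0) order above.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(4)
    n = 7 * 52                                                 # seven seasons, weekly
    gft = pd.Series(np.abs(rng.normal(100, 30, n)))            # search-query index
    cases = (0.3 * gft + rng.normal(0, 10, n)).clip(lower=0)   # weekly case counts

    # Fit on all but the last 10 weeks, then forecast with the exogenous
    # regressor supplied for the forecast horizon.
    fit = SARIMAX(cases[: n - 10], exog=gft[: n - 10],
                  order=(3, 0, 0)).fit(disp=False)
    print(fit.forecast(steps=10, exog=gft[n - 10:]))
    ```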

  6. Ensemble Flow Forecasts for Risk Based Reservoir Operations of Lake Mendocino in Mendocino County, California

    NASA Astrophysics Data System (ADS)

    Delaney, C.; Hartman, R. K.; Mendoza, J.; Evans, K. M.; Evett, S.

    2016-12-01

    Forecast informed reservoir operations (FIRO) is a methodology that incorporates short to mid-range precipitation or flow forecasts to inform the flood operations of reservoirs. Previous research and modeling for flood control reservoirs have shown that FIRO can reduce flood risk and increase water supply for many reservoirs. The risk-based method of FIRO presents a unique approach that incorporates flow forecasts made by NOAA's California-Nevada River Forecast Center (CNRFC) to model and assess risk of meeting or exceeding identified management targets or thresholds. Forecasted risk is evaluated against set risk tolerances to set reservoir flood releases. A water management model was developed for Lake Mendocino, a 116,500 acre-foot reservoir located near Ukiah, California. Lake Mendocino is a dual-use reservoir, which is owned and operated for flood control by the United States Army Corps of Engineers and is operated by the Sonoma County Water Agency for water supply. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has been plagued with water supply reliability issues since 2007. FIRO is applied to Lake Mendocino by simulating daily hydrologic conditions from 1985 to 2010 in the Upper Russian River from Lake Mendocino to the City of Healdsburg approximately 50 miles downstream. The risk-based method is simulated using a 15-day, 61-member streamflow hindcast by the CNRFC. Model simulation results of risk-based flood operations demonstrate a 23% increase in average end of water year (September 30) storage levels over current operations. Model results show no increase in occurrence of flood damages for points downstream of Lake Mendocino. This investigation demonstrates that FIRO may be a viable flood control operations approach for Lake Mendocino and warrants further investigation through additional modeling and analysis.
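
    The core of the risk-based method, forecasted risk as the fraction of ensemble members meeting or exceeding a management threshold, compared against a risk tolerance, can be sketched as follows; the numbers are invented for illustration.

    ```python
    import numpy as np

    def exceedance_risk(ensemble, threshold):
        """Fraction of members at or above the threshold, per lead day.

        ensemble : (n_members, n_days) simulated storage traces driven by a
        streamflow hindcast (61 members and 15 days mirror the CNRFC product).
        """
        return (ensemble >= threshold).mean(axis=0)

    rng = np.random.default_rng(5)
    risk = exceedance_risk(rng.normal(100e3, 8e3, (61, 15)), threshold=111e3)
    make_release = risk[4] > 0.10   # e.g. tolerate at most 10% risk at day 5
    print(risk.round(2), make_release)
    ```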

  7. Software forecasting as it is really done: A study of JPL software engineers

    NASA Technical Reports Server (NTRS)

    Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.

    1993-01-01

    This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol Analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and it was also found that the mental models cluster around three forecasting techniques. Further partitioning of the mental models revealed clustering of activities that is strongly suggestive of a forecasting lifecycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or an additional effort forecast. Virtually no subject used risk reduction steps in combination. The results of the analysis include: the identification of a core set of well-defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.

  8. Forecasting daily passenger traffic volumes in the Moscow metro

    NASA Astrophysics Data System (ADS)

    Ivanov, V. V.; Osetrov, E. S.

    2018-01-01

    In this paper we have developed a methodology for the medium-term prediction of daily volumes of passenger traffic in the Moscow metro. It includes three options for the forecast: (1) artificial neural networks (ANNs), (2) singular-spectrum analysis implemented in the Caterpillar-SSA package, and (3) a combination of the ANN and Caterpillar-SSA approaches. The methods and algorithms allow the medium-term forecasting of passenger traffic flows in the Moscow metro with reasonable accuracy.

  9. Economic analysis for transmission operation and planning

    NASA Astrophysics Data System (ADS)

    Zhou, Qun

    2011-12-01

    Restructuring of the electric power industry has caused dramatic changes in the use of the transmission system. Increasing congestion as well as the necessity of integrating renewable energy introduce new challenges and uncertainties to transmission operation and planning. Accurate short-term congestion forecasting facilitates market traders in bidding and trading activities. The cost-sharing and recovery issue is a major impediment to long-term transmission investment for integrating renewable energy. In this research, a new short-term forecasting algorithm is proposed for predicting congestion, LMPs, and other power system variables based on the concept of system patterns. The advantage of this algorithm relative to standard statistical forecasting methods is that structural aspects underlying power market operations are exploited to reduce the forecasting error. The advantage relative to previously proposed structural forecasting methods is that data requirements are substantially reduced. Forecasting results based on a NYISO case study demonstrate the feasibility and accuracy of the proposed algorithm. Moreover, a negotiation methodology is developed to guide transmission investment for integrating renewable energy. Built on Nash bargaining theory, the negotiation of investment plans and payment rates can proceed between renewable generation and transmission companies for cost sharing and recovery. The proposed approach is applied to Garver's six-bus system. The numerical results demonstrate the fairness and efficiency of the approach, and hence can be used as guidelines for renewable energy investors. The results also shed light on policy-making for renewable energy subsidies.

  10. Comparisons of forecasting for hepatitis in Guangxi Province, China by using three neural networks models.

    PubMed

    Gan, Ruijing; Chen, Ni; Huang, Daizheng

    2016-01-01

    This study compares and evaluates predictions of hepatitis incidence in Guangxi Province, China made using back-propagation neural networks based on a genetic algorithm (BPNN-GA), generalized regression neural networks (GRNN), and wavelet neural networks (WNN). To compare the forecasting results, data from 2004 to 2013 were used as the modeling sample and data from 2014 as the forecasting sample. The results show that when a small hepatitis data set has seasonal fluctuation, BPNN-GA gives better predictions than the other two methods; the WNN method is suitable for predicting a large data set with seasonal fluctuation, and the GRNN method is suitable when the data increase steadily.

  11. Applying Binary Forecasting Approaches to Induced Seismicity in the Western Canada Sedimentary Basin

    NASA Astrophysics Data System (ADS)

    Kahue, R.; Shcherbakov, R.

    2016-12-01

    The Western Canada Sedimentary Basin has been chosen as a focus due to an increase in the recent observed seismicity there which is most likely linked to anthropogenic activities related to unconventional oil and gas exploration. Seismicity caused by these types of activities is called induced seismicity. The occurrence of moderate to larger induced earthquakes in areas where critical infrastructure is present can be potentially problematic. Here we use a binary forecast method to analyze past seismicity and well production data in order to quantify future areas of increased seismicity. This method splits the given region into spatial cells. The binary forecast method used here has been suggested in the past to retroactively forecast large earthquakes occurring globally in areas called alarm cells. An alarm cell, or alert zone, is a bin in which there is a higher likelihood for earthquakes to occur based on previous data. The first method utilizes the cumulative Benioff strain, based on earthquakes that had occurred in each bin above a given magnitude over a time interval called the training period. The second method utilizes the cumulative well production data within each bin. Earthquakes that occurred within an alert zone in the retrospective forecast period contribute to the hit rate, while alert zones that did not have an earthquake occur within them in the forecast period contribute to the false alarm rate. In the resulting analysis the hit rate and false alarm rate are determined after optimizing and modifying the initial parameters using the receiver operating characteristic diagram. It is found that when modifying the cell size and threshold magnitude parameters within various training periods, hit and false alarm rates are obtained for specific regions in Western Canada using both recent seismicity and cumulative well production data. Certain areas are thus shown to be more prone to potential larger earthquakes based on both datasets. This has implications for the potential link between oil and gas production and induced seismicity observed in the Western Canada Sedimentary Basin.
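
    A compact sketch of the alarm-cell evaluation, with alert zones chosen by a quantile threshold on the training-period statistic (Benioff strain or well production); the quantile rule and score definitions follow standard ROC usage and are assumptions here.

    ```python
    import numpy as np

    def binary_forecast_scores(train_stat, future_quakes, q=0.9):
        """Hit and false alarm rates for quantile-based alarm cells.

        train_stat    : (n_cells,) cumulative Benioff strain or well production
                        in each spatial bin during the training period
        future_quakes : (n_cells,) target-event counts in the forecast period
        """
        alarm = train_stat >= np.quantile(train_stat, q)
        hit_rate = future_quakes[alarm].sum() / max(future_quakes.sum(), 1)
        quiet = future_quakes == 0
        false_alarm_rate = np.sum(alarm & quiet) / max(np.sum(quiet), 1)
        return hit_rate, false_alarm_rate   # sweep q to trace the ROC diagram
    ```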

  12. Testing hypotheses of earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.

    2003-12-01

    We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would be also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter. For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of the second, and beta, the probability that the second would be wrongly rejected in favor of the first. Computing alpha and beta requires knowing the theoretical distribution of likelihood scores under each hypothesis, which we will estimate by simulations. Each forecast is given equal status; there is no "null hypothesis" which would be accepted by default. Forecasts and test results would be archived and posted on the RELM web site. The same methods can be applied to any region with adequate monitoring and sufficient earthquakes. If fewer than ten events are forecasted, the likelihood tests may not give definitive results. The tests do force certain requirements on the forecast models. Because the tests are based on absolute rates, stress models must be explicit about how stress increments affect past seismicity rates. Aftershocks of triggered events must be accounted for. Furthermore, the tests are sensitive to magnitude, so forecast models must specify the magnitude distribution of triggered events. Models should account for probable errors in magnitude and location by appropriate smoothing of the probabilities, as the tests will be "cold hearted:" near misses won't count.
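
    For forecasts stated as expected rates in bins, the likelihood score and the simulated alpha error can be sketched as below, assuming Poisson counts per bin; the function names are illustrative.

    ```python
    import numpy as np
    from scipy.stats import poisson

    def log_likelihood(rates, counts):
        """Joint Poisson log-likelihood of observed bin counts given a forecast."""
        return poisson.logpmf(counts, rates).sum()

    def alpha_error(rates_A, rates_B, n_sim=10_000, seed=6):
        """P(wrongly rejecting A in favour of B | A true), by simulation."""
        rng = np.random.default_rng(seed)
        sims = rng.poisson(rates_A, size=(n_sim, len(rates_A)))
        dll = (poisson.logpmf(sims, rates_B).sum(axis=1)
               - poisson.logpmf(sims, rates_A).sum(axis=1))
        return np.mean(dll > 0.0)
    ```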

  13. Multi-step-ahead crude oil price forecasting using a hybrid grey wave model

    NASA Astrophysics Data System (ADS)

    Chen, Yanhui; Zhang, Chuan; He, Kaijian; Zheng, Aibing

    2018-07-01

    Crude oil is crucial to the operation and economic well-being of modern society, and large swings in the crude oil price cause panic in the global economy. Many factors influence the crude oil price, and its prediction remains a difficult research problem widely discussed among researchers. Based on research on the Heterogeneous Market Hypothesis and on the relationship between the crude oil price and macroeconomic factors, the exchange market and the stock market, this paper proposes a hybrid grey wave forecasting model that combines a grey wave model with Random Walk (RW)/ARMA models to forecast the multi-step-ahead crude oil price. More specifically, we use the grey wave forecasting model to capture the periodical characteristics of the crude oil price and ARMA/RW to simulate the daily random movements. The innovation also comes from using the information in the time series graph to forecast the crude oil price, since grey wave forecasting is a graphical prediction method. The empirical results demonstrate that, based on daily crude oil price data, the hybrid grey wave forecasting model performs well in 15- to 20-step-ahead prediction and consistently dominates ARMA and Random Walk in correct direction prediction.

  14. Interplay between past market correlation structure changes and future volatility outbursts.

    PubMed

    Musmeci, Nicoló; Aste, Tomaso; Di Matteo, T

    2016-11-18

    We report significant relations between past changes in the market correlation structure and future changes in the market volatility. This relation is made evident by using a measure of "correlation structure persistence" on correlation-based information filtering networks that quantifies the rate of change of the market dependence structure. We also measured changes in the correlation structure by means of a "metacorrelation" that measures a lagged correlation between correlation matrices computed over different time windows. Both methods show a deep interplay between past changes in correlation structure and future changes in volatility and we demonstrate they can anticipate market risk variations and this can be used to better forecast portfolio risk. Notably, these methods overcome the curse of dimensionality that limits the applicability of traditional econometric tools to portfolios made of a large number of assets. We report on forecasting performances and statistical significance of both methods for two different equity datasets. We also identify an optimal region of parameters in terms of True Positive and False Positive trade-off, through a ROC curve analysis. We find that this forecasting method is robust and it outperforms logistic regression predictors based on past volatility only. Moreover the temporal analysis indicates that methods based on correlation structural persistence are able to adapt to abrupt changes in the market, such as financial crises, more rapidly than methods based on past volatility.
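
    A rough sketch of the "metacorrelation" measure, assuming a pandas DataFrame of asset returns; window and lag sizes are arbitrary illustrations. A falling metacorrelation signals a fast-changing dependence structure, which the paper links to subsequent volatility outbursts.

    ```python
    import numpy as np

    def metacorrelation(returns, window=100, step=20):
        """Correlation between correlation matrices of consecutive windows.

        returns : (T, n_assets) pandas DataFrame of returns.
        """
        tri = []
        for start in range(0, len(returns) - window + 1, step):
            C = returns.iloc[start:start + window].corr().values
            tri.append(C[np.triu_indices_from(C, k=1)])  # upper triangle, flat
        return np.array([np.corrcoef(tri[i], tri[i + 1])[0, 1]
                         for i in range(len(tri) - 1)])
    ```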

  15. Interplay between past market correlation structure changes and future volatility outbursts

    NASA Astrophysics Data System (ADS)

    Musmeci, Nicoló; Aste, Tomaso; Di Matteo, T.

    2016-11-01

    We report significant relations between past changes in the market correlation structure and future changes in the market volatility. This relation is made evident by using a measure of “correlation structure persistence” on correlation-based information filtering networks that quantifies the rate of change of the market dependence structure. We also measured changes in the correlation structure by means of a “metacorrelation” that measures a lagged correlation between correlation matrices computed over different time windows. Both methods show a deep interplay between past changes in correlation structure and future changes in volatility and we demonstrate they can anticipate market risk variations and this can be used to better forecast portfolio risk. Notably, these methods overcome the curse of dimensionality that limits the applicability of traditional econometric tools to portfolios made of a large number of assets. We report on forecasting performances and statistical significance of both methods for two different equity datasets. We also identify an optimal region of parameters in terms of True Positive and False Positive trade-off, through a ROC curve analysis. We find that this forecasting method is robust and it outperforms logistic regression predictors based on past volatility only. Moreover the temporal analysis indicates that methods based on correlation structural persistence are able to adapt to abrupt changes in the market, such as financial crises, more rapidly than methods based on past volatility.

  16. Interplay between past market correlation structure changes and future volatility outbursts

    PubMed Central

    Musmeci, Nicoló; Aste, Tomaso; Di Matteo, T.

    2016-01-01

    We report significant relations between past changes in the market correlation structure and future changes in the market volatility. This relation is made evident by using a measure of “correlation structure persistence” on correlation-based information filtering networks that quantifies the rate of change of the market dependence structure. We also measured changes in the correlation structure by means of a “metacorrelation” that measures a lagged correlation between correlation matrices computed over different time windows. Both methods show a deep interplay between past changes in correlation structure and future changes in volatility and we demonstrate they can anticipate market risk variations and this can be used to better forecast portfolio risk. Notably, these methods overcome the curse of dimensionality that limits the applicability of traditional econometric tools to portfolios made of a large number of assets. We report on forecasting performances and statistical significance of both methods for two different equity datasets. We also identify an optimal region of parameters in terms of True Positive and False Positive trade-off, through a ROC curve analysis. We find that this forecasting method is robust and it outperforms logistic regression predictors based on past volatility only. Moreover the temporal analysis indicates that methods based on correlation structural persistence are able to adapt to abrupt changes in the market, such as financial crises, more rapidly than methods based on past volatility. PMID:27857144

  17. Spatial-temporal forecasting the sunspot diagram

    NASA Astrophysics Data System (ADS)

    Covas, Eurico

    2017-09-01

    Aims: We attempt to forecast the Sun's sunspot butterfly diagram in both space (i.e. in latitude) and time, instead of the usual one-dimensional time series forecasts prevalent in the scientific literature. Methods: We use a prediction method based on the non-linear embedding of data series in high dimensions. We use this method to forecast both in latitude (space) and in time, using the full spatial-temporal series of the sunspot diagram from 1874 to 2015. Results: The analysis of the results shows that it is indeed possible to reconstruct the overall shape and amplitude of the spatial-temporal pattern of sunspots, but that the method in its current form does not have real predictive power. We also apply a metric called structural similarity to compare the forecasted and the observed butterfly cycles, showing that this metric can be a useful addition to the usual root mean square error metric when analysing the efficiency of different prediction methods. Conclusions: We conclude that it is in principle possible to reconstruct the full sunspot butterfly diagram for at least one cycle using this approach, and that this method and others should be explored, since just looking at metrics such as sunspot count number or sunspot total area coverage is too reductive given the spatial-temporal dynamical complexity of the sunspot butterfly diagram. However, more data and/or an improved approach is probably necessary to obtain true predictive power.
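
    The essence of such a non-linear embedding forecast can be sketched with a nearest-neighbour predictor in delay-coordinate space; applied strip by strip in latitude, the same idea extends to the spatial-temporal diagram. The embedding dimension and neighbour count are arbitrary choices here.

    ```python
    import numpy as np

    def embed_forecast(series, dim=5, k=3):
        """One-step forecast via nearest neighbours in a delay embedding.

        Each time t is represented by (x_t, ..., x_{t+dim-1}); the k past
        states closest to the latest one vote, and their successors are
        averaged as the forecast.
        """
        x = np.asarray(series, dtype=float)
        X = np.array([x[i:i + dim] for i in range(len(x) - dim + 1)])
        current, history = X[-1], X[:-1]      # every history row has a successor
        successors = x[dim:]                  # successor of X[i] is x[i + dim]
        nearest = np.argsort(np.linalg.norm(history - current, axis=1))[:k]
        return successors[nearest].mean()
    ```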

  18. Using Landslide Failure Forecast Models in Near Real Time: the Mt. de La Saxe case-study

    NASA Astrophysics Data System (ADS)

    Manconi, Andrea; Giordan, Daniele

    2014-05-01

    Forecasting the occurrence of landslide phenomena in space and time is a major scientific challenge. The approaches used to forecast landslides depend mainly on the spatial scale analyzed (regional vs. local), the temporal range of the forecast (long- vs. short-term), as well as the triggering factor and the landslide typology considered. Focusing on short-term forecast methods for large, deep-seated slope instabilities, the potential time of failure (ToF) can be estimated by studying the evolution of the landslide deformation over time (i.e., the strain rate), provided that, under constant stress conditions, landslide materials follow a creep mechanism before reaching rupture. In the last decades, different procedures have been proposed to estimate the ToF by applying simplified empirical and/or graphical methods to time series of deformation data. Fukuzono (1985) proposed a failure forecast method based on large-scale laboratory experiments aimed at observing the kinematic evolution of a landslide induced by rain. This approach, known also as the inverse-velocity method, considers the evolution over time of the inverse of the surface velocity (v) as an indicator of the ToF, assuming that failure approaches as 1/v tends to zero. Here we present an innovative method aimed at forecasting landslide failure from near-real-time monitoring data. Starting from the inverse-velocity theory, we analyze landslide surface displacements over different temporal windows, and then apply straightforward statistical methods to obtain confidence intervals on the time of failure. Our results can be relevant to support the management of early warning systems during landslide emergency conditions, also when predefined displacement and/or velocity thresholds are exceeded. In addition, our statistical approach for the definition of the confidence interval and forecast reliability can also be applied to different failure forecast methods. We applied the presented approach for the first time in near real time during the emergency caused by the reactivation of the La Saxe rockslide, a large mass wasting menacing the population of Courmayeur, northern Italy, and the important European route E25. We show how the application of simplified but robust forecast models can be a convenient way to manage and support early warning systems during critical situations. References: Fukuzono T. (1985), A New Method for Predicting the Failure Time of a Slope, Proc. IVth International Conference and Field Workshop on Landslides, Tokyo.
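
    The inverse-velocity estimate itself is a short computation: fit a line to 1/v and extrapolate it to zero. Repeating the fit over several temporal windows, as the approach above does, yields a spread of failure times from which a confidence interval can be formed; the window lengths shown are illustrative.

    ```python
    import numpy as np

    def inverse_velocity_tof(t, v):
        """Fukuzono-style time-of-failure estimate from displacement velocity.

        Fits a straight line to 1/v(t); the forecast failure time is where
        the fitted line crosses zero.
        """
        slope, intercept = np.polyfit(t, 1.0 / np.asarray(v, float), 1)
        return -intercept / slope

    # Spread over staggered windows -> simple confidence interval on ToF:
    # tofs = [inverse_velocity_tof(t[-w:], v[-w:]) for w in (10, 15, 20, 30)]
    ```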

  19. A Comparison of the Forecast Skills among Three Numerical Models

    NASA Astrophysics Data System (ADS)

    Lu, D.; Reddy, S. R.; White, L. J.

    2003-12-01

    Three numerical weather forecast models, MM5, COAMPS and WRF, operated in a joint effort of NOAA HU-NCAS and Jackson State University (JSU) during summer 2003, have been chosen to study their forecast skill against observations. The models forecast over the same region with the same initialization, boundary conditions, forecast length and spatial resolution. The AVN global dataset has been ingested as initial conditions. A grid resolution of 27 km is chosen, representative of current mesoscale models. Forecasts of 36-h length are performed, with output at 12-h intervals. The key parameters used to evaluate forecast skill include 12-h accumulated precipitation, sea level pressure, wind, surface temperature and dew point. Precipitation is evaluated statistically using the conventional skill scores Threat Score (TS) and Bias Score (BS) for different threshold values based on 12-h rainfall observations, whereas other statistical measures such as Mean Error (ME), Mean Absolute Error (MAE) and Root Mean Square Error (RMSE) are applied to the other forecast parameters.
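
    The two precipitation skill scores reduce to a small contingency-table computation; a sketch for one threshold (array names are illustrative):

    ```python
    import numpy as np

    def threat_and_bias(forecast_rain, observed_rain, threshold):
        """Threat Score and Bias Score for 12-h precipitation at one threshold."""
        f = forecast_rain >= threshold
        o = observed_rain >= threshold
        hits = np.sum(f & o)
        false_alarms = np.sum(f & ~o)
        misses = np.sum(~f & o)
        ts = hits / max(hits + false_alarms + misses, 1)    # 1 is perfect
        bs = (hits + false_alarms) / max(hits + misses, 1)  # >1 over-forecasts
        return ts, bs
    ```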

  20. Study on Battery Capacity for Grid-connection Power Planning with Forecasts in Clustered Photovoltaic Systems

    NASA Astrophysics Data System (ADS)

    Shimada, Takae; Kawasaki, Norihiro; Ueda, Yuzuru; Sugihara, Hiroyuki; Kurokawa, Kosuke

    This paper aims to clarify the battery capacity required by a residential area with densely grid-connected photovoltaic (PV) systems. This paper proposes a planning method for tomorrow's grid-connection power from/to the external electric power system, using demand power forecasting and insolation forecasting for PV power predictions, and defines an operation method for the electricity storage device to control the grid-connection power as planned. A residential area consisting of 389 houses consuming 2390 MWh/year of electricity with 2390 kW of PV systems is simulated based on measured data and actual forecasts. The simulation results show that 8.3 MWh of battery capacity is required under the conditions of half-hour planning and a planning error ratio and PV output limiting loss ratio of 1% or less. The results also show that existing forecasting technologies reduce the required battery capacity to 49%, and increase the allowable installed PV amount to 210%.

  1. ENSURF: multi-model sea level forecast - implementation and validation results for the IBIROOS and Western Mediterranean regions

    NASA Astrophysics Data System (ADS)

    Pérez, B.; Brouwer, R.; Beckers, J.; Paradis, D.; Balseiro, C.; Lyons, K.; Cure, M.; Sotillo, M. G.; Hackett, B.; Verlaan, M.; Fanjul, E. A.

    2012-03-01

    ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecast that makes use of several storm surge or circulation models and near-real time tide gauge data in the region, with the following main goals: 1. providing easy access to existing forecasts, as well as to their performance and model validation, by means of an adequate visualization tool; 2. generating better forecasts of sea level, including confidence intervals, by means of the Bayesian Model Average technique (BMA). The Bayesian Model Average technique generates an overall forecast probability density function (PDF) by making a weighted average of the individual forecast PDFs; the weights represent the Bayesian likelihood that a model will give the correct forecast and are continuously updated based on the performance of the models during a recent training period. This implies that the technique needs the availability of sea level data from tide gauges in near-real time. The system was implemented for the European Atlantic facade (IBIROOS region) and the Western Mediterranean coast based on the MATROOS visualization tool developed by Deltares. Results of the validation of the different models and of the BMA implementation for the main harbours are presented for these regions, where this kind of activity is performed for the first time. The system is currently operational at Puertos del Estado and has proved to be useful in the detection of calibration problems in some of the circulation models, in the identification of the systematic differences between baroclinic and barotropic models for sea level forecasts, and in demonstrating the feasibility of providing an overall probabilistic forecast based on the BMA method.
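
    The BMA combination step can be sketched as a weighted sum of per-model error PDFs, here taken as Gaussians centred on each model's forecast; the weights would come from the training period, and all names are illustrative.

    ```python
    import numpy as np
    from scipy.stats import norm

    def bma_pdf(grid, forecasts, weights, sigmas):
        """Combined BMA predictive PDF of sea level on a grid of values.

        forecasts, weights, sigmas : one entry per model; the weights are the
        Bayesian likelihoods learned over the recent training period.
        """
        pdf = sum(w * norm.pdf(grid, loc=f, scale=s)
                  for f, w, s in zip(forecasts, weights, sigmas))
        return pdf / (pdf.sum() * (grid[1] - grid[0]))   # renormalize numerically

    # Confidence intervals follow from the cumulative sum of the combined PDF.
    ```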

  2. [Medium-term forecast of solar cosmic rays radiation risk during a manned Mars mission].

    PubMed

    Petrov, V M; Vlasov, A G

    2006-01-01

    Medium-term forecasting of the radiation hazard from solar cosmic rays (SCR) will be vital in a manned Mars mission. Modern methods of space physics lack acceptable reliability in medium-term forecasting of SCR onset and parameters. The proposed estimation of average radiation risk from SCR during a manned Mars mission is made with the use of existing SCR fluence and spectrum models and the correlation of solar particle event frequency with the predicted Wolf number. Radiation risk is considered as an additional death probability from acute radiation reactions (ergonomic component) or acute radiation disease in flight. The algorithm for the radiation risk calculation is described, and the resulting risk levels for various periods of the 23rd solar cycle are presented. The applicability of this method to advance forecasting and possible improvements are being investigated. Recommendations to the crew based on the risk estimation are exemplified.

  3. Multiple "buy buttons" in the brain: Forecasting chocolate sales at point-of-sale based on functional brain activation using fMRI.

    PubMed

    Kühn, Simone; Strelow, Enrique; Gallinat, Jürgen

    2016-08-01

    We set out to forecast consumer behaviour in a supermarket based on functional magnetic resonance imaging (fMRI). Data were collected while participants viewed six chocolate bar communications and product pictures before and after each communication. Self-reported liking judgements were then collected. fMRI data were extracted from a priori selected brain regions: the nucleus accumbens, medial orbitofrontal cortex, amygdala, hippocampus, inferior frontal gyrus and dorsomedial prefrontal cortex were assumed to contribute positively to sales, and the dorsolateral prefrontal cortex and insula were hypothesized to contribute negatively. The resulting values were rank ordered. After our fMRI-based forecast, an in-store test was conducted in a supermarket on n = 63,617 shoppers. Changes in sales were best forecasted by the fMRI signal during communication viewing, second best by a comparison of the brain signal during product viewing before and after communication, and least by explicit liking judgements. The results demonstrate the feasibility of applying neuroimaging methods in a relatively small sample to correctly forecast sales changes at point-of-sale.

  4. Mixture EMOS model for calibrating ensemble forecasts of wind speed.

    PubMed

    Baran, S; Lerch, S

    2016-03-01

    Ensemble model output statistics (EMOS) is a statistical tool for post-processing forecast ensembles of weather variables obtained from multiple runs of numerical weather prediction models in order to produce calibrated predictive probability density functions. The EMOS predictive probability density function is given by a parametric distribution with parameters depending on the ensemble forecasts. We propose an EMOS model for calibrating wind speed forecasts based on weighted mixtures of truncated normal (TN) and log-normal (LN) distributions, where model parameters and component weights are estimated by optimizing the values of proper scoring rules over a rolling training period. The new model is tested on wind speed forecasts of the 50-member European Centre for Medium-range Weather Forecasts ensemble, the 11-member Aire Limitée Adaptation dynamique Développement International-Hungary Ensemble Prediction System ensemble of the Hungarian Meteorological Service, and the eight-member University of Washington mesoscale ensemble, and its predictive performance is compared with that of various benchmark EMOS models based on single parametric families and combinations thereof. The results indicate improved calibration of probabilistic forecasts and improved accuracy of point forecasts in comparison with the raw ensemble and climatological forecasts. The mixture EMOS model significantly outperforms the TN and LN EMOS methods; moreover, it provides better calibrated forecasts than the TN-LN combination model and offers increased flexibility while avoiding covariate selection problems.
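
    The predictive density of such a mixture model is easy to write down; a sketch with fixed parameters (in the actual EMOS model every parameter is linked to the ensemble and estimated by optimizing a proper scoring rule over the training window):

    ```python
    import numpy as np
    from scipy.stats import lognorm, truncnorm

    def mixture_pdf(x, w, mu, sigma, ln_shape, ln_scale):
        """TN-LN mixture predictive density for wind speed at values x.

        w is the TN component weight; the TN component is truncated below
        at zero. Parameters are fixed here for illustration only.
        """
        a = (0.0 - mu) / sigma                       # standardized lower bound
        tn = truncnorm.pdf(x, a, np.inf, loc=mu, scale=sigma)
        ln = lognorm.pdf(x, s=ln_shape, scale=ln_scale)
        return w * tn + (1.0 - w) * ln

    # Example: evaluate the density on a grid of wind speeds.
    # speeds = np.linspace(0, 30, 301); pdf = mixture_pdf(speeds, 0.6, 8, 3, 0.5, 9)
    ```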

  5. A Short-Term and High-Resolution System Load Forecasting Approach Using Support Vector Regression with Hybrid Parameters Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Huaiguang

    This work proposes an approach for distribution system load forecasting, which aims to provide highly accurate short-term load forecasting with high resolution utilizing a support vector regression (SVR) based forecaster and a two-step hybrid parameters optimization method. Specifically, because the load profiles in distribution systems contain abrupt deviations, a data normalization is designed as the pretreatment for the collected historical load data. Then an SVR model is trained by the load data to forecast the future load. For better performance of SVR, a two-step hybrid optimization algorithm is proposed to determine the best parameters. In the first step of the hybrid optimization algorithm, a designed grid traverse algorithm (GTA) is used to narrow the parameters searching area from a global to local space. In the second step, based on the result of the GTA, particle swarm optimization (PSO) is used to determine the best parameters in the local parameter space. After the best parameters are determined, the SVR model is used to forecast the short-term load deviation in the distribution system.
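
    A simplified sketch of the two-step idea using scikit-learn's SVR: a coarse grid traverse narrows the log-space search area, then a textbook particle swarm refines within it. This is a generic reimplementation under stated assumptions, not the paper's exact GTA/PSO.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVR

    def cv_error(params, X, y):
        """Cross-validated MAE of an SVR with log-space parameters (C, gamma)."""
        C, gamma = np.exp(params)
        return -cross_val_score(SVR(C=C, gamma=gamma), X, y, cv=3,
                                scoring="neg_mean_absolute_error").mean()

    def two_step_tune(X, y, n_particles=10, n_iter=20, seed=7):
        rng = np.random.default_rng(seed)
        # Step 1, grid traverse: a coarse global grid narrows the search area.
        grid = [(c, g) for c in np.linspace(-2, 8, 6) for g in np.linspace(-8, 2, 6)]
        centre = np.array(min(grid, key=lambda p: cv_error(p, X, y)))
        # Step 2, particle swarm: refine inside the selected local cell.
        pos = centre + rng.uniform(-1, 1, (n_particles, 2))
        vel = np.zeros_like(pos)
        pbest, pbest_err = pos.copy(), np.array([cv_error(p, X, y) for p in pos])
        for _ in range(n_iter):
            gbest = pbest[pbest_err.argmin()]
            vel = (0.7 * vel
                   + 1.5 * rng.random((n_particles, 1)) * (pbest - pos)
                   + 1.5 * rng.random((n_particles, 1)) * (gbest - pos))
            pos = pos + vel
            err = np.array([cv_error(p, X, y) for p in pos])
            better = err < pbest_err
            pbest[better], pbest_err[better] = pos[better], err[better]
        return np.exp(pbest[pbest_err.argmin()])   # best (C, gamma)

    # Usage with your own arrays: C_best, gamma_best = two_step_tune(X_train, y_train)
    ```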

  6. Verification of different forecasts of Hungarian Meteorological Service

    NASA Astrophysics Data System (ADS)

    Feher, B.

    2009-09-01

    In this paper I show the results of the forecasts made by the Hungarian Meteorological Service. I focus on the general short- and medium-range forecasts, which contain cloudiness, precipitation, wind speed and temperature for six regions of Hungary. I also show the results of some special forecasts, such as the precipitation predictions made for the catchment areas of the Danube and Tisza rivers, and the daily mean temperature predictions used by Hungarian energy companies. The product received by the user is made by the general forecaster, but these predictions are based on the ALADIN and ECMWF outputs. Because of this, the products of the forecaster and of the models were both verified. A method like this can show us which weather elements are more difficult to forecast and which regions have higher errors. During the verification procedure the basic errors (mean error, mean absolute error) are calculated. The precipitation amount is classified into five categories, and scores like POD, TS, PC, etc. are defined from the contingency table determined by these categories. The procedure runs fully automatically; all the forecasters have to do is print the daily result each morning. Beside the daily results, verification is also made for longer periods, such as a week, a month or a year. Analyzing the results over longer periods, we can say that the best predictions are made for the first few days, that precipitation forecasts are less good for mountainous areas, and that the forecasters' scores sometimes even surpass those of the models. Since forecasters receive the results the next day, the verification helps them to reduce mistakes and learn the weaknesses of the models. This paper contains the verification scores, their trends, the method by which these scores are calculated, and some case studies on worse forecasts.

  7. Counteracting structural errors in ensemble forecast of influenza outbreaks.

    PubMed

    Pei, Sen; Shaman, Jeffrey

    2017-10-13

    For influenza forecasts generated using dynamical models, forecast inaccuracy is partly attributable to the nonlinear growth of error. As a consequence, quantification of the nonlinear error structure in current forecast models is needed so that this growth can be corrected and forecast skill improved. Here, we inspect the error growth of a compartmental influenza model and find that a robust error structure arises naturally from the nonlinear model dynamics. By counteracting these structural errors, diagnosed using error breeding, we develop a new forecast approach that combines dynamical error correction and statistical filtering techniques. In retrospective forecasts of historical influenza outbreaks for 95 US cities from 2003 to 2014, overall forecast accuracy for outbreak peak timing, peak intensity and attack rate is substantially improved for predicted lead times up to 10 weeks. This error growth correction method can be generalized to improve the forecast accuracy of other infectious disease dynamical models.

  8. Tool for Forecasting Cool-Season Peak Winds Across Kennedy Space Center and Cape Canaveral Air Force Station

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Roeder, William P.

    2010-01-01

    The expected peak wind speed for the day is an important element in the daily morning forecast for ground and space launch operations at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS). The 45th Weather Squadron (45 WS) must issue forecast advisories for KSC/CCAFS when they expect peak gusts for >= 25, >= 35, and >= 50 kt thresholds at any level from the surface to 300 ft. In Phase I of this task, the 45 WS tasked the Applied Meteorology Unit (AMU) to develop a cool-season (October - April) tool to help forecast the non-convective peak wind from the surface to 300 ft at KSC/CCAFS. During the warm season, these wind speeds are rarely exceeded except during convective winds or under the influence of tropical cyclones, for which other techniques are already in use. The tool used single and multiple linear regression equations to predict the peak wind from the morning sounding. The forecaster manually entered several observed sounding parameters into a Microsoft Excel graphical user interface (GUI), and then the tool displayed the forecast peak wind speed, average wind speed at the time of the peak wind, the timing of the peak wind and the probability the peak wind will meet or exceed 35, 50 and 60 kt. The 45 WS customers later dropped the requirement for >= 60 kt wind warnings. During Phase II of this task, the AMU expanded the period of record (POR) by six years to increase the number of observations used to create the forecast equations. A large number of possible predictors were evaluated from archived soundings, including inversion depth and strength, low-level wind shear, mixing height, temperature lapse rate and winds from the surface to 3000 ft. Each day in the POR was stratified in a number of ways, such as by low-level wind direction, synoptic weather pattern, precipitation and Bulk Richardson number. The most accurate Phase II equations were then selected for an independent verification. The Phase I and II forecast methods were compared using an independent verification data set. The two methods were compared to climatology, wind warnings and advisories issued by the 45 WS, and North American Mesoscale (NAM) model (MesoNAM) forecast winds. The performance of the Phase I and II methods were similar with respect to mean absolute error. Since the Phase I data were not stratified by precipitation, this method's peak wind forecasts had a large negative bias on days with precipitation and a small positive bias on days with no precipitation. Overall, the climatology methods performed the worst while the MesoNAM performed the best. Since the MesoNAM winds were the most accurate in the comparison, the final version of the tool was based on the MesoNAM winds. The probability the peak wind will meet or exceed the warning thresholds were based on the one standard deviation error bars from the linear regression. For example, the linear regression might forecast the most likely peak speed to be 35 kt and the error bars used to calculate that the probability of >= 25 kt = 76%, the probability of >= 35 kt = 50%, and the probability of >= 50 kt = 19%. The authors have not seen this application of linear regression error bars in any other meteorological applications. Although probability forecast tools should usually be developed with logistic regression, this technique could be easily generalized to any linear regression forecast tool to estimate the probability of exceeding any desired threshold . 
This could be useful for previously developed linear regression forecast tools or new forecast applications where statistical analysis software to perform logistic regression is not available. The tool was delivered in two formats - a Microsoft Excel GUI and a Tool Command Language/Tool Kit (Tcl/Tk) GUI in the Meteorological Interactive Data Display System (MIDDS). The Microsoft Excel GUI reads a MesoNAM text file containing hourly forecasts from 0 to 84 hours, from one model run (00 or 12 UTC). The GUI then displays the peak wind speed, average wind speed, and the probability the peak wind will meet or exceed the 25-, 35- and 50-kt thresholds. The user can display the Day-1 through Day-3 peak wind forecasts, and separate forecasts are made for precipitation and non-precipitation days. The MIDDS GUI uses data from the NAM and Global Forecast System (GFS), instead of the MesoNAM. It can display Day-1 and Day-2 forecasts using NAM data, and Day-1 through Day-5 forecasts using GFS data. The timing of the peak wind is not displayed, since the independent verification showed that none of the forecast methods performed significantly better than climatology. The forecaster should use the climatological timing of the peak wind (2248 UTC) as a first guess and then adjust it based on the movement of weather features.
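
    As a minimal illustration of the error-bar idea described above (not the AMU's operational code), exceedance probabilities can be computed from a regression point forecast and its residual standard deviation under an assumed Gaussian error model; the numbers below are illustrative and only approximately reproduce the example quoted in the abstract.

        from scipy.stats import norm

        def exceedance_probability(point_forecast_kt, residual_std_kt, threshold_kt):
            """P(peak wind >= threshold) assuming normally distributed regression errors."""
            return 1.0 - norm.cdf(threshold_kt, loc=point_forecast_kt, scale=residual_std_kt)

        # Illustrative: a 35-kt point forecast with an assumed 14-kt residual spread.
        for thr in (25, 35, 50):
            print(f"P(peak >= {thr} kt) = {exceedance_probability(35.0, 14.0, thr):.2f}")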

  9. A Beacon Transmission Power Control Algorithm Based on Wireless Channel Load Forecasting in VANETs.

    PubMed

    Mo, Yuanfu; Yu, Dexin; Song, Jun; Zheng, Kun; Guo, Yajuan

    2015-01-01

    In a vehicular ad hoc network (VANET), the periodic exchange of single-hop status information broadcasts (beacon frames) produces channel loading, which causes channel congestion and induces information conflict problems. To guarantee fairness in beacon transmissions from each node and maximum network connectivity, adjustment of the beacon transmission power is an effective method for reducing and preventing channel congestion. In this study, the primary factors that influence wireless channel loading are selected to construct the KF-BCLF, a channel load forecasting algorithm that is based on a recursive Kalman filter and employs a multiple regression equation. By pre-adjusting the transmission power based on the forecasted channel load, the channel load is kept within a predefined range, thereby preventing channel congestion. Based on this method, the CLF-BTPC, a transmission power control algorithm, is proposed. To verify the KF-BCLF algorithm, a traffic survey method involving the collection of floating car data along a major traffic road in Changchun City is employed. By comparing the forecasts with the measured channel loads, the proposed KF-BCLF algorithm is shown to be effective. In addition, the CLF-BTPC algorithm is verified by simulating a section of eight-lane highway and a signal-controlled urban intersection. The results of the two verification processes indicate that this distributed CLF-BTPC algorithm can effectively control channel load, prevent channel congestion, and enhance the stability and robustness of wireless beacon transmission in a vehicular network.

  10. A Beacon Transmission Power Control Algorithm Based on Wireless Channel Load Forecasting in VANETs

    PubMed Central

    Mo, Yuanfu; Yu, Dexin; Song, Jun; Zheng, Kun; Guo, Yajuan

    2015-01-01

    In a vehicular ad hoc network (VANET), the periodic exchange of single-hop status information broadcasts (beacon frames) produces channel loading, which causes channel congestion and induces information conflict problems. To guarantee fairness in beacon transmissions from each node and maximum network connectivity, adjustment of the beacon transmission power is an effective method for reducing and preventing channel congestion. In this study, the primary factors that influence wireless channel loading are selected to construct the KF-BCLF, a channel load forecasting algorithm that is based on a recursive Kalman filter and employs a multiple regression equation. By pre-adjusting the transmission power based on the forecasted channel load, the channel load is kept within a predefined range, thereby preventing channel congestion. Based on this method, the CLF-BTPC, a transmission power control algorithm, is proposed. To verify the KF-BCLF algorithm, a traffic survey method involving the collection of floating car data along a major traffic road in Changchun City is employed. By comparing the forecasts with the measured channel loads, the proposed KF-BCLF algorithm is shown to be effective. In addition, the CLF-BTPC algorithm is verified by simulating a section of eight-lane highway and a signal-controlled urban intersection. The results of the two verification processes indicate that this distributed CLF-BTPC algorithm can effectively control channel load, prevent channel congestion, and enhance the stability and robustness of wireless beacon transmission in a vehicular network. PMID:26571042
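
    A minimal sketch of the recursive-Kalman-filter forecasting idea, not the authors' KF-BCLF implementation: a scalar filter with random-walk dynamics tracks the channel load and issues a one-step-ahead forecast. The noise variances q and r below are illustrative assumptions.

        def kalman_forecast(loads, q=0.01, r=0.05):
            """Return one-step-ahead channel-load forecasts for a measured load series."""
            x, p = loads[0], 1.0               # state estimate and its variance
            forecasts = []
            for z in loads[1:]:
                x_pred, p_pred = x, p + q      # predict step (random-walk dynamics)
                forecasts.append(x_pred)       # forecast used to pre-adjust beacon power
                k = p_pred / (p_pred + r)      # Kalman gain
                x = x_pred + k * (z - x_pred)  # update with the measured load z
                p = (1 - k) * p_pred
            return forecasts

        # Example: forecasts could trigger a power reduction before load leaves a target range.
        print(kalman_forecast([0.42, 0.47, 0.55, 0.61, 0.58, 0.66]))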

  11. Verification of a Non-Hydrostatic Dynamical Core Using Horizontally Spectral Element Vertically Finite Difference Method: 2D Aspects

    DTIC Science & Technology

    2014-04-01

    ... hydrostatic pressure vertical coordinate, which are the same as those used in the Weather Research and Forecasting (WRF) Model, but a hybrid sigma-pressure vertical coordinate ... The Euler equations are in flux form based on the hydrostatic pressure vertical coordinate.

  12. Peak Wind Tool for General Forecasting

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III

    2010-01-01

    The expected peak wind speed of the day is an important forecast element in the 45th Weather Squadron's (45 WS) daily 24-Hour and Weekly Planning Forecasts. The forecasts are used for ground and space launch operations at the Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS). The 45 WS also issues wind advisories for KSC/CCAFS when they expect wind gusts to meet or exceed 25 kt, 35 kt and 50 kt thresholds at any level from the surface to 300 ft. The 45 WS forecasters have indicated peak wind speeds are challenging to forecast, particularly in the cool season months of October - April. In Phase I of this task, the Applied Meteorology Unit (AMU) developed a tool to help the 45 WS forecast non-convective winds at KSC/CCAFS for the 24-hour period of 0800 to 0800 local time. The tool was delivered as a Microsoft Excel graphical user interface (GUI). The GUI displayed the forecast of peak wind speed, 5-minute average wind speed at the time of the peak wind, timing of the peak wind and probability the peak speed would meet or exceed 25 kt, 35 kt and 50 kt. For the current task (Phase II), the 45 WS requested additional observations be used for the creation of the forecast equations by expanding the period of record (POR). Additional parameters were evaluated as predictors, including wind speeds between 500 ft and 3000 ft, static stability classification, Bulk Richardson Number, mixing depth, vertical wind shear, temperature inversion strength and depth, and wind direction. Using a verification data set, the AMU compared the performance of the Phase I and II prediction methods. Just as in Phase I, the tool was delivered as a Microsoft Excel GUI. The 45 WS requested the tool also be available in the Meteorological Interactive Data Display System (MIDDS). The AMU first expanded the POR by two years by adding tower observations, surface observations and CCAFS (XMR) soundings for the cool season months of March 2007 to April 2009. The POR was expanded again by six years, from October 1996 to April 2002, by interpolating 1000-ft sounding data to 100-ft increments. The Phase II developmental data set included observations for the cool season months of October 1996 to February 2007. The AMU calculated 68 candidate predictors from the XMR soundings, including 19 stability parameters, 48 wind speed parameters and one wind shear parameter. Each day in the data set was stratified by synoptic weather pattern, low-level wind direction, precipitation and Richardson Number, for a total of 60 stratification methods. Linear regression equations, using the 68 predictors and 60 stratification methods, were created for the tool's three forecast parameters: the highest peak wind speed of the day (PWSD), the 5-minute average speed at the same time (AWSD), and the timing of the PWSD. For PWSD and AWSD, 30 Phase II methods were selected for evaluation in the verification data set. For timing of the PWSD, 12 Phase II methods were selected for evaluation. The verification data set contained observations for the cool season months of March 2007 to April 2009. The data set was used to compare the Phase I and II forecast methods to climatology, model forecast winds and wind advisories issued by the 45 WS. The model forecast winds were derived from the 0000 and 1200 UTC runs of the 12-km North American Mesoscale (MesoNAM) model. The forecast methods that performed the best in the verification data set were selected for the Phase II version of the tool.
For PWSD and AWSD, linear regression equations based on MesoNAM forecasts performed significantly better than the Phase I and II methods. For timing of the PWSD, none of the methods performed significantly better than climatology. The AMU then developed the Microsoft Excel and MIDDS GUIs. The GUIs display the forecasts for PWSD, AWSD and the probability the PWSD will meet or exceed 25 kt, 35 kt and 50 kt. Since none of the prediction methods for timing of the PWSD performed significantly better than climatology, the tool no longer displays this predictand. The Excel and MIDDS GUIs display forecasts for Day-1 to Day-3 and Day-1 to Day-5, respectively. The Excel GUI uses MesoNAM forecasts as input, while the MIDDS GUI uses input from the MesoNAM and the Global Forecast System model. Based on feedback from the 45 WS, the AMU added the daily average wind speed from 30 ft to 60 ft to the tool, which is one of the parameters in the 24-Hour and Weekly Planning Forecasts issued by the 45 WS. In addition, the AMU expanded the MIDDS GUI to include forecasts out to Day-7.

  13. Water demand forecasting: review of soft computing methods.

    PubMed

    Ghalehkhondabi, Iman; Ardjmand, Ehsan; Young, William A; Weckman, Gary R

    2017-07-01

    Demand forecasting plays a vital role in resource management for governments and private companies. Considering the scarcity of water and its inherent constraints, demand management and forecasting in this domain are critically important. Several soft computing techniques have been developed over the last few decades for water demand forecasting. This study focuses on soft computing methods of water consumption forecasting published between 2005 and 2015. These methods include artificial neural networks (ANNs), fuzzy and neuro-fuzzy models, support vector machines, metaheuristics, and system dynamics. Furthermore, while ANNs have been superior in many short-term forecasting cases, it is still very difficult to pick a single method as the overall best. According to the literature, various methods and their hybrids are applied to water demand forecasting. However, it seems soft computing has a lot more to contribute to water demand forecasting. These contribution areas include, but are not limited to, various ANN architectures, unsupervised methods, deep learning, various metaheuristics, and ensemble methods. Moreover, it is found that soft computing methods are mainly used for short-term demand forecasting.

  14. Forecasting Space Weather-Induced GPS Performance Degradation Using Random Forest

    NASA Astrophysics Data System (ADS)

    Filjar, R.; Filic, M.; Milinkovic, F.

    2017-12-01

    Space weather and ionospheric dynamics have a profound effect on the positioning performance of Global Navigation Satellite Systems (GNSS). However, the quantification of that effect is still the subject of scientific activities around the world. In the latest contribution to the understanding of space weather and ionospheric effects on satellite-based positioning performance, we conducted a study of several candidate forecasting methods for space weather-induced GPS positioning performance deterioration. First, a 5-day set of experimentally collected data was established, encompassing space weather and ionospheric activity indices (including the readings of Sudden Ionospheric Disturbance (SID) monitors, components of geomagnetic field strength, the global Kp index, the Dst index, GPS-derived Total Electron Content (TEC) samples, the standard deviation of TEC samples, and sunspot number) and observations of GPS positioning error components (northing, easting, and height positioning error) derived from the Adriatic Sea IGS reference stations' RINEX raw pseudorange files in quiet space weather periods. This data set was split into training and test sub-sets. Then, a selected set of supervised machine learning methods based on Random Forest was applied to the experimentally collected data set in order to establish appropriate regional (Adriatic Sea) forecasting models for space weather-induced GPS positioning performance deterioration. The forecasting models were developed in the R/rattle statistical programming environment. The forecasting quality of the regional models was assessed, and conclusions were drawn on the advantages and shortcomings of regional forecasting models for space weather-caused GNSS positioning performance deterioration.
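
    The study built its models in R/rattle; purely as an illustrative sketch of the same supervised-learning step, the Python equivalent below trains a Random Forest regressor on synthetic data. The feature names (kp, dst, tec, tec_std, sunspots) are hypothetical stand-ins for the indices listed in the abstract.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.random((500, 5))  # columns: kp, dst, tec, tec_std, sunspots (synthetic)
        # Synthetic northing-error target loosely tied to TEC and Kp, plus noise.
        y = 0.5 * X[:, 2] + 0.2 * X[:, 0] + 0.05 * rng.standard_normal(500)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        print("test R^2:", model.score(X_te, y_te))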

  15. Load Forecasting Based Distribution System Network Reconfiguration -- A Distributed Data-Driven Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Huaiguang; Zhang, Yingchen; Muljadi, Eduard

    In this paper, a network reconfiguration approach based on short-term load forecasting is proposed in a parallel manner. Specifically, a support vector regression (SVR) based short-term load forecasting approach is designed to provide an accurate load prediction and benefit the network reconfiguration. Because of the nonconvexity of the three-phase balanced optimal power flow, a second-order cone program (SOCP) based approach is used to relax the optimal power flow problem. Then, the alternating direction method of multipliers (ADMM) is used to compute the optimal power flow in a distributed manner. Considering the limited number of switches and the increasing computation capability, the proposed network reconfiguration is solved in a parallel way. The numerical results demonstrate the feasibility and effectiveness of the proposed approach.
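
    A minimal sketch of the SVR forecasting stage only, not the paper's SOCP/ADMM reconfiguration pipeline; the use of lagged hourly loads as features and all parameter values are illustrative assumptions.

        import numpy as np
        from sklearn.svm import SVR

        def make_lagged(series, n_lags=24):
            """Build (features, target) pairs from the previous n_lags hourly loads."""
            X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
            y = np.array(series[n_lags:])
            return X, y

        load = list(np.sin(np.linspace(0, 20, 400)) + 5.0)   # synthetic hourly load
        X, y = make_lagged(load)
        svr = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[:-24], y[:-24])
        print("next-hour forecasts:", svr.predict(X[-24:])[:3])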

  16. Real-time anomaly detection for very short-term load forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Jian; Hong, Tao; Yue, Meng

    Although recent load information is critical to very short-term load forecasting (VSTLF), power companies often have difficulties in collecting the most recent load values accurately and in a timely manner for VSTLF applications. This paper tackles the problem of real-time anomaly detection in the most recent load information used by VSTLF. It proposes a model-based anomaly detection method that consists of two components, a dynamic regression model and an adaptive anomaly threshold. The case study is developed using data from ISO New England. The paper demonstrates that the proposed method significantly outperforms three other anomaly detection methods, including two methods commonly used in the field and one state-of-the-art method used by a winning team of the Global Energy Forecasting Competition 2014. Lastly, a general anomaly detection framework is proposed for future research.

  17. Real-time anomaly detection for very short-term load forecasting

    DOE PAGES

    Luo, Jian; Hong, Tao; Yue, Meng

    2018-01-06

    Although recent load information is critical to very short-term load forecasting (VSTLF), power companies often have difficulties in collecting the most recent load values accurately and in a timely manner for VSTLF applications. This paper tackles the problem of real-time anomaly detection in the most recent load information used by VSTLF. It proposes a model-based anomaly detection method that consists of two components, a dynamic regression model and an adaptive anomaly threshold. The case study is developed using data from ISO New England. The paper demonstrates that the proposed method significantly outperforms three other anomaly detection methods, including two methods commonly used in the field and one state-of-the-art method used by a winning team of the Global Energy Forecasting Competition 2014. Lastly, a general anomaly detection framework is proposed for future research.
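
    A minimal sketch of the two-component idea, assuming a simple linear extrapolation in place of the paper's dynamic regression model and a rolling residual spread for the adaptive threshold; the window length and multiplier k are illustrative assumptions, not the paper's values.

        import numpy as np

        def detect_anomalies(load, window=24, k=3.0):
            """Flag load points whose residual exceeds k times the recent residual spread."""
            load = np.asarray(load, dtype=float)
            pred = 2 * load[1:-1] - load[:-2]          # linear extrapolation of last two points
            resid = load[2:] - pred                    # one-step-ahead residuals
            flags = []
            for t in range(window, len(resid)):
                sigma = resid[t - window:t].std()      # adaptive threshold from recent residuals
                flags.append(abs(resid[t]) > k * sigma)
            return flags

        load = np.cumsum(np.random.normal(0, 1, 200)) + 100  # synthetic 5-minute loads
        print(sum(detect_anomalies(load)), "points flagged")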

  18. Short-term solar flare prediction using image-case-based reasoning

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Fu; Li, Fei; Zhang, Huai-Peng; Yu, Da-Ren

    2017-10-01

    Solar flares strongly influence space weather and human activities, and their prediction is highly complex. Existing solutions, such as data-based and model-based approaches, share a common shortcoming: the lack of human engagement in the forecasting process. An image-case-based reasoning method is introduced to address this shortcoming. The image case library is composed of SOHO/MDI longitudinal magnetograms, from which the maximum horizontal gradient, the length of the neutral line and the number of singular points are extracted for retrieving similar image cases. Genetic optimization algorithms are employed to optimize the weight assignment for image features and the number of similar image cases retrieved. Similar image cases, together with prediction results derived by majority voting over those cases, are output and shown to the forecaster so that his/her experience can be integrated into the final prediction. Experimental results demonstrate that the case-based reasoning approach performs slightly better than other methods, and is more effective when its forecasts are improved by humans.
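
    A minimal sketch of the retrieval-and-voting core under stated assumptions: three scalar features per magnetogram, a fixed weight vector, and a fixed retrieval count. In the paper these are tuned by genetic optimization, which this sketch omits.

        import numpy as np
        from collections import Counter

        def predict_flare(query, library_feats, library_labels, w, n=5):
            """Retrieve the n most similar cases by weighted distance and vote."""
            d = np.sqrt((((library_feats - query) ** 2) * w).sum(axis=1))  # weighted distance
            nearest = np.argsort(d)[:n]                                    # n most similar cases
            votes = Counter(library_labels[i] for i in nearest)
            return votes.most_common(1)[0][0], nearest  # prediction + cases shown to forecaster

        feats = np.random.rand(100, 3)          # synthetic case library (3 features per case)
        labels = np.random.randint(0, 2, 100)   # 1 = flare occurred (illustrative labels)
        print(predict_flare(np.random.rand(3), feats, labels, w=np.array([0.5, 0.3, 0.2])))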

  19. Radar-based Quantitative Precipitation Forecasting using Spatial-scale Decomposition Method for Urban Flood Management

    NASA Astrophysics Data System (ADS)

    Yoon, S.; Lee, B.; Nakakita, E.; Lee, G.

    2016-12-01

    Recent climate change and abnormal weather phenomena have resulted in increased occurrences of localized torrential rainfall. Urban areas in Korea have suffered from localized heavy rainfall, including the notable Seoul flood disasters in 2010 and 2011. The urban hydrological environment has changed in relation to precipitation, with reduced concentration time, a decreased storage rate, and increased peak discharge. These changes have altered and accelerated the severity of damage to urban areas. In order to prevent such urban flash flood damage, the lead time for evacuation must be secured through the improvement of radar-based quantitative precipitation forecasting (QPF). The purpose of this research is to improve QPF products using a spatial-scale decomposition method that accounts for the lifetime of storms, and to compare the accuracy of the traditional QPF method and the proposed method in terms of urban flood management. The research proceeds as follows. First, image filtering is applied to separate the spatial scales of the rainfall field. Second, the separated small- and large-scale rainfall fields are extrapolated by different forecasting methods. Third, the forecasted rainfall fields are combined at each lead time. Finally, the results of this method are evaluated and compared with the results of a uniform advection model for urban flood modeling. It is expected that urban flood information based on the improved QPF will help reduce casualties and property damage caused by urban flooding.
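
    A minimal sketch of the scale-separation step, assuming a Gaussian low-pass filter as the image filter; the paper's exact filter and the sigma value below are assumptions. The large- and small-scale fields would then be extrapolated separately and recombined at each lead time.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rain = np.random.gamma(2.0, 1.0, size=(256, 256))  # synthetic rainfall field
        large_scale = gaussian_filter(rain, sigma=8)       # long-lived, broad structures
        small_scale = rain - large_scale                   # short-lived convective detail
        assert np.allclose(rain, large_scale + small_scale)  # decomposition is exact by construction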

  20. Practical implementation of a particle filter data assimilation approach to estimate initial hydrologic conditions and initialize medium-range streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Clark, Elizabeth; Wood, Andy; Nijssen, Bart; Mendoza, Pablo; Newman, Andy; Nowak, Kenneth; Arnold, Jeffrey

    2017-04-01

    In an automated forecast system, hydrologic data assimilation (DA) performs the valuable function of correcting raw simulated watershed model states to better represent external observations, including measurements of streamflow, snow, soil moisture, and the like. Yet the incorporation of automated DA into operational forecasting systems has been a long-standing challenge due to the complexities of the hydrologic system, which include numerous lags between state and output variations. To help demonstrate that such methods can succeed in operational automated implementations, we present results from the real-time application of an ensemble particle filter (PF) for short-range (7-day lead) ensemble flow forecasts in western US river basins. We use the System for Hydromet Applications, Research and Prediction (SHARP), developed by the National Center for Atmospheric Research (NCAR) in collaboration with the University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation. SHARP is a fully automated platform for short-term to seasonal hydrologic forecasting applications, incorporating uncertainty in initial hydrologic conditions (IHCs) and in hydrometeorological predictions through ensemble methods. In this implementation, IHC uncertainty is estimated by propagating an ensemble of 100 temperature and precipitation time series through conceptual and physically-oriented models. The resulting ensemble of derived IHCs exhibits a broad range of possible soil moisture and snow water equivalent (SWE) states. The PF selects and/or weights and resamples the IHCs that are most consistent with external streamflow observations, and uses the particles to initialize a streamflow forecast ensemble driven by ensemble precipitation and temperature forecasts downscaled from the Global Ensemble Forecast System (GEFS). We apply this method in real time for several basins in the western US that are important for water resources management, and perform a hindcast experiment to evaluate the utility of PF-based data assimilation for streamflow forecast skill. This presentation describes findings, including a comparison of sequential and non-sequential particle weighting methods.
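
    A minimal sketch of the PF weighting-and-resampling step, assuming a Gaussian likelihood for the streamflow observation given each particle's simulated flow; sigma_obs is an illustrative assumption, not a SHARP parameter.

        import numpy as np

        def pf_update(sim_flows, obs_flow, sigma_obs=10.0):
            """Weight IHC particles by fit to the observed flow, then resample."""
            w = np.exp(-0.5 * ((sim_flows - obs_flow) / sigma_obs) ** 2)
            w /= w.sum()                                        # normalized particle weights
            idx = np.random.choice(len(sim_flows), size=len(sim_flows), p=w)
            return idx                                          # indices of resampled IHCs

        sim = np.random.normal(100.0, 20.0, 100)  # simulated flows from 100 IHC particles
        print(pf_update(sim, obs_flow=95.0)[:10])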

  1. Development of a method for comprehensive water quality forecasting and its application in Miyun reservoir of Beijing, China.

    PubMed

    Zhang, Lei; Zou, Zhihong; Shan, Wei

    2017-06-01

    Water quality forecasting is an essential part of water resource management. Spatiotemporal variations of water quality and their inherent constraints make it very complex. This study explored a data-based method for short-term water quality forecasting. Predictions of water quality indicators, including dissolved oxygen, chemical oxygen demand by KMnO4 and ammonia nitrogen, obtained using a support vector machine were taken as inputs of a particle swarm optimization based optimal wavelet neural network to forecast the whole status index of water quality. The Gubeikou monitoring section of Miyun reservoir in Beijing, China was taken as the study case to examine the effectiveness of this approach. The experimental results also revealed that the proposed model has advantages of stability and time reduction in comparison with other data-driven models, including the traditional BP neural network model, the wavelet neural network model and the Gradient Boosting Decision Tree model. It can be used as an effective approach to perform short-term comprehensive water quality prediction. Copyright © 2016. Published by Elsevier B.V.

  2. A Gaussian Processes Technique for Short-term Load Forecasting with Considerations of Uncertainty

    NASA Astrophysics Data System (ADS)

    Ohmi, Masataro; Mori, Hiroyuki

    In this paper, an efficient method is proposed to deal with short-term load forecasting using Gaussian Processes. Short-term load forecasting plays a key role in smooth power system operation tasks such as economic load dispatch, unit commitment, etc. Recently, the deregulated and competitive power market has increased the degree of uncertainty. As a result, it is more important to obtain better prediction results to save cost. One of the most important aspects is that power system operators need the upper and lower bounds of the predicted load to deal with the uncertainty, while also requiring more accurate predicted values. The proposed method is based on the Bayes model, in which the output is expressed as a distribution rather than a point. To realize the model efficiently, this paper employs Gaussian Processes, which combine the Bayes linear model and a kernel machine to obtain the distribution of the predicted value. The proposed method is successfully applied to real data of daily maximum load forecasting.
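
    A minimal sketch, assuming an RBF-plus-noise kernel and synthetic daily maximum loads; the point is that a Gaussian Process returns a predictive distribution, so the upper and lower bounds the abstract emphasizes come out directly.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        X = np.arange(30, dtype=float).reshape(-1, 1)                       # day index
        y = 100 + 10 * np.sin(X.ravel() / 3) + np.random.normal(0, 1, 30)  # synthetic load (MW)

        gp = GaussianProcessRegressor(RBF(5.0) + WhiteKernel(1.0)).fit(X, y)
        mean, std = gp.predict(np.array([[30.0]]), return_std=True)        # predictive distribution
        print(f"forecast {mean[0]:.1f} MW, bounds [{mean[0]-2*std[0]:.1f}, {mean[0]+2*std[0]:.1f}]")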

  3. A probabilistic neural network based approach for predicting the output power of wind turbines

    NASA Astrophysics Data System (ADS)

    Tabatabaei, Sajad

    2017-03-01

    Finding authentic prediction tools that eliminate the uncertainty of wind speed forecasts is highly desirable as wind power sources achieve strong penetration. Recently, traditional prediction models that generate point forecasts have no longer been considered trustworthy. Thus, the present paper aims at utilising the concept of prediction intervals (PIs) to assess the uncertainty of wind power generation in power systems. Besides, this paper uses a newly introduced non-parametric approach called lower upper bound estimation (LUBE) to build the PIs, since the forecasting errors cannot be modelled properly by probability distribution functions. In the proposed LUBE method, a PI combination-based fuzzy framework is used to overcome the performance instability of the neural networks (NNs) used in LUBE. In comparison to other methods, this formulation better satisfies the PI coverage probability and the PI normalised average width (PINAW). Since this non-linear problem has high complexity, a new heuristic-based optimisation algorithm comprising a novel modification is introduced to solve it. Based on data sets taken from a wind farm in Australia, the feasibility and satisfactory performance of the suggested method have been demonstrated.

  4. Time series regression and ARIMAX for forecasting currency flow at Bank Indonesia in Sulawesi region

    NASA Astrophysics Data System (ADS)

    Suharsono, Agus; Suhartono, Masyitha, Aulia; Anuravega, Arum

    2015-12-01

    The purpose of this study is to forecast the outflow and inflow of currency at the Indonesian Central Bank, Bank Indonesia (BI), in the Sulawesi Region. The currency outflow and inflow data tend to have a trend pattern which is influenced by calendar variation effects. Therefore, this research focuses on applying forecasting methods that can handle calendar variation effects, i.e. Time Series Regression (TSR) and ARIMAX models, and comparing their forecast accuracy with an ARIMA model. The best model is selected based on the lowest Root Mean Square Error (RMSE) on the out-sample dataset. The results show that ARIMA is the best model for forecasting the currency outflow and inflow at South Sulawesi, whereas the best model for forecasting the currency outflow at Central Sulawesi and Southeast Sulawesi, and for forecasting the currency inflow at South Sulawesi and North Sulawesi, is TSR. Additionally, ARIMAX is the best model for forecasting the currency outflow at North Sulawesi. Hence, the results show that more complex models do not necessarily yield more accurate forecasts than simpler ones.

  5. Evaluation of medium-range ensemble flood forecasting based on calibration strategies and ensemble methods in Lanjiang Basin, Southeast China

    NASA Astrophysics Data System (ADS)

    Liu, Li; Gao, Chao; Xuan, Weidong; Xu, Yue-Ping

    2017-11-01

    Ensemble flood forecasts by hydrological models using numerical weather prediction products as forcing data are becoming more commonly used in operational flood forecasting applications. In this study, a hydrological ensemble flood forecasting system comprising an automatically calibrated Variable Infiltration Capacity model and quantitative precipitation forecasts from the TIGGE dataset is constructed for Lanjiang Basin, Southeast China. The impacts of calibration strategies and ensemble methods on the performance of the system are then evaluated. The hydrological model is optimized by the parallel-programmed ε-NSGA II multi-objective algorithm. According to the solutions by ε-NSGA II, two differently parameterized models are determined to simulate daily flows and peak flows at each of the three hydrological stations. Then a simple yet effective modular approach is proposed to combine the daily and peak flows at the same station into one composite series. Five ensemble methods and various evaluation metrics are adopted. The results show that ε-NSGA II can provide an objective determination of parameter estimation, and the parallel program permits more efficient simulation. It is also demonstrated that the forecasts from ECMWF have more favorable skill scores than the other Ensemble Prediction Systems. The multimodel ensembles have advantages over all the single-model ensembles, and the multimodel methods weighted on members and skill scores outperform the other methods. Furthermore, the overall performance at the three stations can be satisfactory up to ten days; however, hydrological errors can degrade the skill score by approximately 2 days, and the influence persists until a lead time of 10 days with a weakening trend. With respect to peak flows selected by the Peaks Over Threshold approach, the ensemble means from single models or multimodels are generally underestimated, indicating that while the ensemble mean can bring overall improvement in forecasting of flows, for peak values taking flood forecasts from each individual member into account is more appropriate.

  6. Spatio-temporal pattern clustering for skill assessment of the Korea Operational Oceanographic System

    NASA Astrophysics Data System (ADS)

    Kim, J.; Park, K.

    2016-12-01

    In order to evaluate the performance of operational forecast models in the Korea Operational Oceanographic System (KOOS), which has been developed by the Korea Institute of Ocean Science and Technology (KIOST), a skill assessment (SA) tool has been developed that provides multiple skill metrics, including not only correlation and error skills computed by comparing predictions with observations, but also pattern clustering with numerical models, satellite data, and observations. The KOOS produces 72-hour forecast information on atmospheric and hydrodynamic variables of wind, pressure, current, tide, wave, temperature, and salinity every 12 hours per day from numerical models such as WRF, ROMS, MOM5, WW-III, and SWAN, and the SA is conducted to evaluate these forecasts. We have operationally run several kinds of numerical models, such as WRF, ROMS, MOM5, MOHID, and WW-III. Quantitative assessment of an operational ocean forecast model is very important to provide accurate ocean forecast information not only to the general public but also to support ocean-related problems. In this work, we propose a method of pattern clustering using machine learning and GIS-based spatial analytics to evaluate the spatial distributions of numerical models and of spatial observation data such as satellite and HF radar. For the clustering, we use 10- or 15-year-long reanalysis data computed by the KOOS, ECMWF, and HYCOM to build best-matching clusters classified by physical meaning and time variation, and we then compare them with forecast data. Moreover, for evaluating currents, we develop an extraction method for dominant flow and apply it to hydrodynamic models and HF radar sea surface current data. Pattern clustering allows a more accurate and effective assessment of ocean forecast models' performance by comparing not only specific observation stations but also the spatio-temporal distribution of the whole model area. We believe that the proposed method will be very useful for examining and evaluating large amounts of numerical modeling data as well as satellite data.

  7. The meta-Gaussian Bayesian Processor of forecasts and associated preliminary experiments

    NASA Astrophysics Data System (ADS)

    Chen, Fajing; Jiao, Meiyan; Chen, Jing

    2013-04-01

    Public weather services are trending toward providing users with probabilistic weather forecasts, in place of traditional deterministic forecasts. Probabilistic forecasting techniques are continually being improved to optimize available forecasting information. The Bayesian Processor of Forecast (BPF), a new statistical method for probabilistic forecast, can transform a deterministic forecast into a probabilistic forecast according to the historical statistical relationship between observations and forecasts generated by that forecasting system. This technique accounts for the typical forecasting performance of a deterministic forecasting system in quantifying the forecast uncertainty. The meta-Gaussian likelihood model is suitable for a variety of stochastic dependence structures with monotone likelihood ratios. The meta-Gaussian BPF adopting this kind of likelihood model can therefore be applied across many fields, including meteorology and hydrology. The Bayes theorem with two continuous random variables and the normal-linear BPF are briefly introduced. The meta-Gaussian BPF for a continuous predictand using a single predictor is then presented and discussed. The performance of the meta-Gaussian BPF is tested in a preliminary experiment. Control forecasts of daily surface temperature at 0000 UTC at Changsha and Wuhan stations are used as the deterministic forecast data. These control forecasts are taken from ensemble predictions with a 96-h lead time generated by the National Meteorological Center of the China Meteorological Administration, the European Centre for Medium-Range Weather Forecasts, and the US National Centers for Environmental Prediction during January 2008. The results of the experiment show that the meta-Gaussian BPF can transform a deterministic control forecast of surface temperature from any one of the three ensemble predictions into a useful probabilistic forecast of surface temperature. These probabilistic forecasts quantify the uncertainty of the control forecast; accordingly, the performance of the probabilistic forecasts differs based on the source of the underlying deterministic control forecasts.
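
    For reference, the core of the BPF is the standard Bayes theorem for a continuous predictand; the notation below is ours, not the paper's:

        \phi(w \mid x) \;=\; \frac{f(x \mid w)\, g(w)}{\int f(x \mid w')\, g(w')\, \mathrm{d}w'}

    Here g is the climatological prior density of the predictand w, f is the likelihood relating the deterministic forecast x to w (modelled as meta-Gaussian in this work), and \phi is the posterior density that constitutes the probabilistic forecast.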

  8. Incorporating Wind Power Forecast Uncertainties Into Stochastic Unit Commitment Using Neural Network-Based Prediction Intervals.

    PubMed

    Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas

    2015-09-01

    Penetration of renewable energy resources, such as wind and solar power, into power systems significantly increases the uncertainties on system operation, stability, and reliability in smart grids. In this paper, nonparametric neural network-based prediction intervals (PIs) are implemented for forecast uncertainty quantification. Instead of a single-level PI, wind power forecast uncertainties are represented as a list of PIs. These PIs are then decomposed into quantiles of wind power. A new scenario generation method is proposed to handle wind power forecast uncertainties. For each hour, an empirical cumulative distribution function (ECDF) is fitted to these quantile points. The Monte Carlo simulation method is used to generate scenarios from the ECDF. The wind power scenarios are then incorporated into a stochastic security-constrained unit commitment (SCUC) model. A heuristic genetic algorithm is utilized to solve the stochastic SCUC problem. Five deterministic and four stochastic case studies incorporating interval forecasts of wind power are implemented. The results of these cases are presented and discussed together. Generation costs and the scheduled and real-time economic dispatch reserves of different unit commitment strategies are compared. The experimental results show that the stochastic model is more robust than the deterministic ones and, thus, decreases the risk in system operations of smart grids.
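
    A minimal sketch of the scenario-generation step, assuming five illustrative quantile points taken from the PIs; the inverse transform is realized here by linear interpolation of the ECDF.

        import numpy as np

        probs = np.array([0.05, 0.25, 0.50, 0.75, 0.95])    # nominal quantile levels
        quants = np.array([12.0, 20.0, 26.0, 33.0, 45.0])   # wind power (MW) at those levels (illustrative)

        u = np.random.uniform(probs[0], probs[-1], size=1000)  # uniform draws on the covered range
        scenarios = np.interp(u, probs, quants)                # inverse ECDF by interpolation
        print(scenarios[:5])                                   # wind power scenarios for the SCUC model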

  9. [Application of artificial neural networks in forecasting the number of circulatory system diseases death toll].

    PubMed

    Zhang, Ying; Shao, Yi; Shang, Kezheng; Wang, Shigong; Wang, Jinyan

    2014-09-01

    To set up a model forecasting the circulatory system diseases death toll (CSDDT) based on back-propagation (BP) artificial neural networks, and to discuss the relationship between the CSDDT, meteorological factors and ambient air pollution. Data on circulatory system deaths, meteorological factors, and ambient air pollution from 2004 to 2009 in Nanjing were collected. On the basis of analyzing the correlation coefficients between the CSDDT, meteorological factors and ambient air pollution, a BP neural network model of the CSDDT was built for 2004 - 2008 based on meteorological factors and ambient air pollution within the same period, and the data of 2009 were used to test the predictive power of the model. There was a close relationship between meteorological factors, ambient air pollution and the circulatory system diseases death toll. The ANN model structure was 17-16-1: 17 input nodes, 16 hidden nodes and 1 output node. The training precision was 0.005 and the final error was 0.00499942 after 487 training steps. The forecast results show a prediction accuracy of over 78.62%. This method is easy to implement, has small error and high ability for independent prediction of the circulatory system death toll, and can provide a new method for medical-meteorological forecasting that merits further research.

  10. State space model approach for forecasting the use of electrical energy (a case study on: PT. PLN (Persero) district of Kroya)

    NASA Astrophysics Data System (ADS)

    Kurniati, Devi; Hoyyi, Abdul; Widiharih, Tatik

    2018-05-01

    Time series data are a series of data taken or measured based on observations at equal time intervals. Time series analysis is used to analyze data while considering the effect of time. The purpose of time series analysis is to understand the characteristics and patterns of the data and to predict data values in future periods based on data from the past. One of the forecasting methods used for time series data is the state space model. This study discusses the modeling and forecasting of electric energy consumption using the state space model for univariate data. The modeling stage begins with optimal Autoregressive (AR) order selection, determination of the state vector through canonical correlation analysis, parameter estimation, and forecasting. The results of this research show that electric energy consumption is modeled by a state space model of order 4 with a Mean Absolute Percentage Error (MAPE) value of 3.655%, so the model falls in the very good forecasting category.
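
    For reference, the MAPE quoted above is usually defined as:

        \mathrm{MAPE} \;=\; \frac{100\%}{n} \sum_{t=1}^{n} \left| \frac{y_t - \hat{y}_t}{y_t} \right|

    where y_t is the observed consumption and \hat{y}_t the forecast; values below roughly 10% are conventionally read as highly accurate forecasting, which is presumably the "very good" category meant here.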

  11. Artificial intelligence based models for stream-flow forecasting: 2000-2015

    NASA Astrophysics Data System (ADS)

    Yaseen, Zaher Mundher; El-shafie, Ahmed; Jaafar, Othman; Afan, Haitham Abdulmohsin; Sayl, Khamis Naba

    2015-11-01

    The use of Artificial Intelligence (AI) has increased since the middle of the 20th century, as seen in its application to a wide range of engineering and science problems. The last two decades, for example, have seen a dramatic increase in the development and application of various types of AI approaches for stream-flow forecasting. Generally speaking, AI has exhibited significant progress in forecasting and modeling non-linear hydrological applications and in capturing the noise complexity in the dataset. This paper explores the state-of-the-art application of AI in stream-flow forecasting, focusing on defining the data-driven nature of AI, the advantages of complementary models, the literature, and possible future applications in modeling and forecasting stream-flow. The review also identifies the major challenges and opportunities for prospective research, including a new scheme for modeling the inflow, a novel method for preprocessing time series frequency based on Fast Orthogonal Search (FOS) techniques, and Swarm Intelligence (SI) as an optimization approach.

  12. An overview of health forecasting.

    PubMed

    Soyiri, Ireneous N; Reidpath, Daniel D

    2013-01-01

    Health forecasting is a novel area of forecasting, and a valuable tool for predicting future health events or situations such as demands for health services and healthcare needs. It facilitates preventive medicine and health care intervention strategies, by pre-informing health service providers to take appropriate mitigating actions to minimize risks and manage demand. Health forecasting requires reliable data, information and appropriate analytical tools for the prediction of specific health conditions or situations. There is no single approach to health forecasting, and so various methods have often been adopted to forecast aggregate or specific health conditions. Meanwhile, there are no defined health forecasting horizons (time frames) to match the choices of health forecasting methods/approaches that are often applied. The key principles of health forecasting have also not been adequately described to guide the process. This paper provides a brief introduction and theoretical analysis of health forecasting. It describes the key issues that are important for health forecasting, including definitions, principles of health forecasting, and the properties of health data, which influence the choices of health forecasting methods. Other matters related to the value of health forecasting, and the general challenges associated with developing and using health forecasting services, are discussed. This overview is a stimulus for further discussions on standardizing health forecasting approaches and methods that will facilitate health care and health services delivery.

  13. Interevent times in a new alarm-based earthquake forecasting model

    NASA Astrophysics Data System (ADS)

    Talbi, Abdelhak; Nanjo, Kazuyoshi; Zhuang, Jiancang; Satake, Kenji; Hamdache, Mohamed

    2013-09-01

    This study introduces a new earthquake forecasting model that uses the moment ratio (MR) of the first to second order moments of earthquake interevent times as a precursory alarm index to forecast large earthquake events. This MR model is based on the idea that the MR is associated with anomalous long-term changes in background seismicity prior to large earthquake events. In a given region, the MR statistic is defined as the inverse of the index of dispersion or Fano factor, with MR values (or scores) providing a biased estimate of the relative regional frequency of background events, here termed the background fraction. To test the forecasting performance of this proposed MR model, a composite Japan-wide earthquake catalogue for the years between 679 and 2012 was compiled using the Japan Meteorological Agency catalogue for the period between 1923 and 2012, and the Utsu historical seismicity records between 679 and 1922. MR values were estimated by sampling interevent times from events with magnitude M ≥ 6 using an earthquake random sampling (ERS) algorithm developed during previous research. Three retrospective tests of M ≥ 7 target earthquakes were undertaken to evaluate the long-, intermediate- and short-term performance of MR forecasting, using mainly Molchan diagrams and optimal spatial maps obtained by minimizing a forecasting error defined as the sum of the miss and alarm rates. This testing indicates that the MR forecasting technique performs well at long, intermediate and short terms. The MR maps produced during long-term testing indicate significant alarm levels before 15 of the 18 shallow earthquakes within the testing region during the past two decades, with an alarm region covering about 20 per cent (alarm rate) of the testing region. The number of shallow events missed by forecasting was reduced by about 60 per cent after using the MR method instead of the relative intensity (RI) forecasting method. At short term, our model succeeded in forecasting the occurrence region of the 2011 Mw 9.0 Tohoku earthquake, whereas the RI method did not. Cases where a period of quiescent seismicity occurred before the target event often led to low MR scores, meaning that the target event was not predicted, and indicating that our model could be further improved by taking quiescent periods into account in the alarm strategy.
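
    A minimal sketch of the alarm index, following the abstract's definition of MR as the inverse of the index of dispersion (mean over variance of interevent times); the synthetic catalog is illustrative only.

        import numpy as np

        def moment_ratio(event_times):
            """MR score: inverse Fano factor of interevent times in a region."""
            dt = np.diff(np.sort(event_times))  # interevent times
            return dt.mean() / dt.var()         # mean / variance

        times = np.cumsum(np.random.exponential(1.0, 200))  # Poisson-like synthetic catalog
        print(moment_ratio(times))  # near 1/mean for exponential interevent times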

  14. A Bayesian spatio-temporal model for forecasting Anaplasma species seroprevalence in domestic dogs within the contiguous United States

    PubMed Central

    Liu, Yan; Watson, Stella C.; Gettings, Jenna R.; Lund, Robert B.; Nordone, Shila K.; McMahan, Christopher S.

    2017-01-01

    This paper forecasts the 2016 canine Anaplasma spp. seroprevalence in the United States from eight climate, geographic and societal factors. The forecast’s construction and an assessment of its performance are described. The forecast is based on a spatial-temporal conditional autoregressive model fitted to over 11 million Anaplasma spp. seroprevalence test results for dogs conducted in the 48 contiguous United States during 2011–2015. The forecast uses county-level data on eight predictive factors, including annual temperature, precipitation, relative humidity, county elevation, forestation coverage, surface water coverage, population density and median household income. Non-static factors are extrapolated into the forthcoming year with various statistical methods. The fitted model and factor extrapolations are used to estimate next year’s regional prevalence. The correlation between the observed and model-estimated county-by-county Anaplasma spp. seroprevalence for the five-year period 2011–2015 is 0.902, demonstrating reasonable model accuracy. The weighted correlation (accounting for different sample sizes) between 2015 observed and forecasted county-by-county Anaplasma spp. seroprevalence is 0.987, exhibiting that the proposed approach can be used to accurately forecast Anaplasma spp. seroprevalence. The forecast presented herein can a priori alert veterinarians to areas expected to see Anaplasma spp. seroprevalence beyond the accepted endemic range. The proposed methods may prove useful for forecasting other diseases. PMID:28738085

  15. Application of Remote Sensors in Mapping Rice Area and Forecasting Its Production: A Review

    PubMed Central

    Mosleh, Mostafa K.; Hassan, Quazi K.; Chowdhury, Ehsan H.

    2015-01-01

    Rice is one of the staple foods for more than three billion people worldwide. Rice paddies accounted for approximately 11.5% of the World's arable land area during 2012. Rice provided ∼19% of the global dietary energy in recent times, and its annual average consumption per capita was ∼65 kg during 2010–2011. Therefore, mapping rice areas and forecasting rice production are important for food security, where demands often exceed production due to an ever-increasing population. Timely and accurate estimation of rice areas and forecasting of rice production can provide invaluable information for governments, planners, and decision makers in formulating policies in regard to import/export in the event of shortfall and/or surplus. The aim of this paper was to review the applicability of remote sensing-based imagery for rice area mapping and production forecasting. Recent advances in the resolutions (i.e., spectral, spatial, radiometric, and temporal) and availability of remote sensing imagery have allowed timely collection of information on the growth and development stages of the rice crop. For an elaborate understanding of the application of remote sensors, the following issues were described: rice area mapping and production forecasting using optical and microwave imagery, synergy between remote sensing-based methods and other developments, and their implications as operational methods. The overview of the studies to date indicated that remote sensing-based methods using optical and microwave imagery were found to be encouraging. However, there were some limitations: (i) optical remote sensing imagery had relatively low spatial resolution, which led to inaccurate estimation of rice areas; and (ii) radar imagery would suffer from speckle, which could potentially degrade the quality of the images, and the brightness of the backscatter was sensitive to the interacting surface. In addition, most of the methods used in forecasting rice yield were empirical in nature and would thus require further calibration and validation prior to implementation over other geographical locations. PMID:25569753

  16. Application of remote sensors in mapping rice area and forecasting its production: a review.

    PubMed

    Mosleh, Mostafa K; Hassan, Quazi K; Chowdhury, Ehsan H

    2015-01-05

    Rice is one of the staple foods for more than three billion people worldwide. Rice paddies accounted for approximately 11.5% of the World's arable land area during 2012. Rice provided ~19% of the global dietary energy in recent times, and its annual average consumption per capita was ~65 kg during 2010-2011. Therefore, mapping rice areas and forecasting rice production are important for food security, where demands often exceed production due to an ever-increasing population. Timely and accurate estimation of rice areas and forecasting of rice production can provide invaluable information for governments, planners, and decision makers in formulating policies in regard to import/export in the event of shortfall and/or surplus. The aim of this paper was to review the applicability of remote sensing-based imagery for rice area mapping and production forecasting. Recent advances in the resolutions (i.e., spectral, spatial, radiometric, and temporal) and availability of remote sensing imagery have allowed timely collection of information on the growth and development stages of the rice crop. For an elaborate understanding of the application of remote sensors, the following issues were described: rice area mapping and production forecasting using optical and microwave imagery, synergy between remote sensing-based methods and other developments, and their implications as operational methods. The overview of the studies to date indicated that remote sensing-based methods using optical and microwave imagery were found to be encouraging. However, there were some limitations: (i) optical remote sensing imagery had relatively low spatial resolution, which led to inaccurate estimation of rice areas; and (ii) radar imagery would suffer from speckle, which could potentially degrade the quality of the images, and the brightness of the backscatter was sensitive to the interacting surface. In addition, most of the methods used in forecasting rice yield were empirical in nature and would thus require further calibration and validation prior to implementation over other geographical locations.

  17. Forecasting outpatient visits using empirical mode decomposition coupled with back-propagation artificial neural networks optimized by particle swarm optimization

    PubMed Central

    Huang, Daizheng; Wu, Zhihui

    2017-01-01

    Accurately predicting the trend of outpatient visits by mathematical modeling can help policy makers manage hospitals effectively, reasonably organize schedules for human resources and finances, and appropriately distribute hospital material resources. In this study, a hybrid method based on empirical mode decomposition and back-propagation artificial neural networks optimized by particle swarm optimization is developed to forecast outpatient visits on the basis of monthly numbers. Data on outpatient visits from January 2005 to December 2013 are retrieved and first taken as the original time series. Second, the original time series is decomposed into a finite and often small number of intrinsic mode functions by the empirical mode decomposition technique. Third, a three-layer back-propagation artificial neural network is constructed to forecast each intrinsic mode function. To improve network performance and avoid falling into a local minimum, particle swarm optimization is employed to optimize the weights and thresholds of the back-propagation artificial neural networks. Finally, the superposition of the forecasting results of the intrinsic mode functions is regarded as the ultimate forecasting value. Simulation indicates that the proposed method attains a better performance index than the other four methods. PMID:28222194

  18. Forecasting outpatient visits using empirical mode decomposition coupled with back-propagation artificial neural networks optimized by particle swarm optimization.

    PubMed

    Huang, Daizheng; Wu, Zhihui

    2017-01-01

    Accurately predicting the trend of outpatient visits by mathematical modeling can help policy makers manage hospitals effectively, reasonably organize schedules for human resources and finances, and appropriately distribute hospital material resources. In this study, a hybrid method based on empirical mode decomposition and back-propagation artificial neural networks optimized by particle swarm optimization is developed to forecast outpatient visits on the basis of monthly numbers. Data on outpatient visits from January 2005 to December 2013 are retrieved and first taken as the original time series. Second, the original time series is decomposed into a finite and often small number of intrinsic mode functions by the empirical mode decomposition technique. Third, a three-layer back-propagation artificial neural network is constructed to forecast each intrinsic mode function. To improve network performance and avoid falling into a local minimum, particle swarm optimization is employed to optimize the weights and thresholds of the back-propagation artificial neural networks. Finally, the superposition of the forecasting results of the intrinsic mode functions is regarded as the ultimate forecasting value. Simulation indicates that the proposed method attains a better performance index than the other four methods.
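
    A minimal sketch of the decompose-forecast-superpose pattern, using the PyEMD package (pip install EMD-signal) for the decomposition; a naive persistence forecast stands in for the paper's PSO-optimized BP network, so the superposed result here trivially equals the last observation. The decomposition pays off once each IMF gets its own trained model.

        import numpy as np
        from PyEMD import EMD  # pip install EMD-signal

        visits = 1000 + 50 * np.sin(np.arange(108) / 6) + np.random.normal(0, 10, 108)
        imfs = EMD().emd(visits)                   # intrinsic mode functions (residue included)
        next_month = sum(imf[-1] for imf in imfs)  # superpose each IMF's (persistence) forecast
        print(f"forecast: {next_month:.0f}")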

  19. Improved Modeling Tools Development for High Penetration Solar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washom, Byron; Meagher, Kevin

    2014-12-11

    One of the significant objectives of the High Penetration solar research is to help the DOE understand, anticipate, and minimize grid operation impacts as more solar resources are added to the electric power system. For Task 2.2, an effective, reliable approach to predicting solar energy availability for energy generation forecasts using the University of California, San Diego (UCSD) Sky Imager technology has been demonstrated. Granular cloud and ramp forecasts for the next 5 to 20 minutes over an area of 10 square miles were developed. Sky images taken every 30 seconds are processed to determine cloud locations and cloud motion vectors, yielding future cloud shadow locations with respect to distributed generation or utility solar power plants in the area. The performance of the method depends on cloud characteristics. On days with more advective cloud conditions, the developed method outperforms persistence forecasts by up to 30% (based on mean absolute error). On days with dynamic conditions, the method performs worse than persistence. Sky Imagers hold promise for ramp forecasting and ramp mitigation in conjunction with inverter controls and energy storage. The pre-commercial Sky Imager solar forecasting algorithm was documented with licensing information and was a Sunshot website highlight.

  20. Forecasting Behavior in Smart Homes Based on Sleep and Wake Patterns

    PubMed Central

    Williams, Jennifer A.; Cook, Diane J.

    2017-01-01

    Background The goal of this research is to use smart home technology to assist people who are recovering from injuries or coping with disabilities to live independently. Objective We introduce an algorithm to model and forecast wake and sleep behaviors that are exhibited by the participant. Furthermore, we propose that sleep behavior is impacted by and can be modeled from wake behavior, and vice versa. Methods This paper describes the Behavior Forecasting (BF) algorithm. BF consists of 1) defining numeric values that reflect sleep and wake behavior, 2) forecasting wake and sleep values from past behavior, 3) analyzing the effect of wake behavior on sleep and vice versa, and 4) improving prediction performance by using both wake and sleep scores. Results The BF method was evaluated with data collected from 20 smart homes. We found that regardless of the forecasting method utilized, wake behavior and sleep behavior can be modeled with a minimum accuracy of 84%. Additionally, normalizing the wake and sleep scores drastically improves the accuracy to 99%. Conclusions The results show that we can effectively model wake and sleep behaviors in a smart environment. Furthermore, wake behaviors can be predicted from sleep behaviors and vice versa. PMID:27689555

  1. Moving beyond the cost-loss ratio: economic assessment of streamflow forecasts for a risk-averse decision maker

    NASA Astrophysics Data System (ADS)

    Matte, Simon; Boucher, Marie-Amélie; Boucher, Vincent; Fortier Filion, Thomas-Charles

    2017-06-01

    A large effort has been made over the past 10 years to promote the operational use of probabilistic or ensemble streamflow forecasts. Numerous studies have shown that ensemble forecasts are of higher quality than deterministic ones. Many studies also conclude that ensemble rather than deterministic forecasts lead to better decisions in the context of flood mitigation. Hence, it is believed that ensemble forecasts possess a greater economic and social value for both decision makers and the general population. However, the vast majority of, if not all, existing hydro-economic studies rely on a cost-loss ratio framework that assumes a risk-neutral decision maker. To overcome this important flaw, this study borrows from economics and evaluates the economic value of early warning flood systems using the well-known Constant Absolute Risk Aversion (CARA) utility function, which explicitly accounts for the level of risk aversion of the decision maker. This new framework allows for the full exploitation of the information related to a forecast's uncertainty, making it especially suited for the economic assessment of ensemble or probabilistic forecasts. Rather than comparing deterministic and ensemble forecasts, this study focuses on comparing different types of ensemble forecasts. There are multiple ways of assessing and representing forecast uncertainty. Consequently, there exist many different means of building an ensemble forecasting system for future streamflow. One such possibility is to dress deterministic forecasts using the statistics of past forecast errors. Such dressing methods are popular among operational agencies because of their simplicity and intuitiveness. Another approach is the use of ensemble meteorological forecasts for precipitation and temperature, which are then provided as inputs to one or many hydrological model(s). In this study, three concurrent ensemble streamflow forecasting systems are compared: simple statistically dressed deterministic forecasts, forecasts based on meteorological ensembles, and a variant of the latter that also includes an estimation of state variable uncertainty. This comparison takes place for the Montmorency River, a small flood-prone watershed in southern central Quebec, Canada. The assessment of forecasts is performed for lead times of 1 to 5 days, both in terms of forecast quality (relative to the corresponding record of observations) and in terms of economic value, using the newly proposed framework based on the CARA utility function. It is found that the economic value of a forecast for a risk-averse decision maker is closely linked to the forecast's reliability in predicting the upper tail of the streamflow distribution. Hence, post-processing forecasts to avoid over-forecasting could help improve both the quality and the value of forecasts.
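
    The following sketch shows only the economic core of the framework: how a CARA utility with risk-aversion parameter alpha turns an ensemble of possible monetary outcomes into a risk-adjusted certainty equivalent. The payoff values are hypothetical, and the paper's full decision model is not reproduced:

      import numpy as np

      def cara_utility(x, alpha):
          # Constant Absolute Risk Aversion utility; alpha > 0 sets risk aversion.
          return (1.0 - np.exp(-alpha * x)) / alpha

      def certainty_equivalent(payoffs, alpha):
          # Guaranteed payoff a CARA decision maker values equally to the lottery:
          # CE = -ln(E[exp(-alpha * X)]) / alpha.
          return -np.log(np.mean(np.exp(-alpha * np.asarray(payoffs)))) / alpha

      # hypothetical net payoffs (avoided losses minus mitigation costs) under
      # each ensemble member's streamflow scenario
      payoffs = np.array([120., 80., 95., -40., 110., 70., 100., -10.])
      for alpha in (0.001, 0.01, 0.05):          # increasing risk aversion
          print(alpha, certainty_equivalent(payoffs, alpha))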

  2. Simplification of the Kalman filter for meteorological data assimilation

    NASA Technical Reports Server (NTRS)

    Dee, Dick P.

    1991-01-01

    The paper proposes a new statistical method of data assimilation that is based on a simplification of the Kalman filter equations. The forecast error covariance evolution is approximated simply by advecting the mass-error covariance field, deriving the remaining covariances geostrophically, and accounting for external model-error forcing only at the end of each forecast cycle. This greatly reduces the cost of computation of the forecast error covariance. In simulations with a linear, one-dimensional shallow-water model and data generated artificially, the performance of the simplified filter is compared with that of the Kalman filter and the optimal interpolation (OI) method. The simplified filter produces analyses that are nearly optimal, and represents a significant improvement over OI.
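
    For orientation, a sketch of the full Kalman filter forecast-covariance step that the simplification avoids; the model matrix and covariances below are illustrative placeholders, not the shallow-water configuration of the paper:

      import numpy as np

      def kf_forecast_covariance(M, Pa, Q):
          # Full Kalman filter forecast step: Pf = M Pa M^T + Q. Every column
          # (and row) of Pa must be propagated by the model, costing roughly
          # 2n model integrations for an n-dimensional state.
          return M @ Pa @ M.T + Q

      # The simplified filter instead advects only the mass-error covariance
      # field, derives the wind-error covariances geostrophically, and adds the
      # model-error forcing Q only at the end of each forecast cycle, so the
      # cost is close to a single model integration.
      rng = np.random.default_rng(0)
      n = 100
      M = np.eye(n) + 0.01 * rng.normal(size=(n, n))   # linearized model (toy)
      Pa = np.eye(n)                                    # analysis error covariance
      Q = 0.1 * np.eye(n)                               # model error covariance
      Pf = kf_forecast_covariance(M, Pa, Q)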

  3. Improving Global Forecast System of extreme precipitation events with regional statistical model: Application of quantile-based probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Shastri, Hiteshri; Ghosh, Subimal; Karmakar, Subhankar

    2017-02-01

    Forecasting extreme precipitation events at a regional scale is of high importance due to their severe impacts on society. The impacts are stronger in urban regions due to high flood potential as well as high population density, which leads to high vulnerability. Although significant scientific improvements have taken place in global weather forecasting models, they are still not adequate at a regional scale (e.g., for an urban region), with high false-alarm and low detection rates. There is thus a need to improve weather forecast skill at a local scale with probabilistic outcomes. Here we develop a methodology based on quantile regression, where reliably simulated variables from the Global Forecast System are used as predictors and different quantiles of rainfall are generated corresponding to that set of predictors. We apply this method to a flood-prone coastal city of India, Mumbai, which has experienced severe floods in recent years. We find significant improvements in the forecast, with high detection and skill scores. We apply the methodology to 10 ensemble members of the Global Ensemble Forecast System and find a reduction in the ensemble uncertainty of precipitation across realizations with respect to that of the original precipitation forecasts. We validate our model for the monsoon seasons of 2006 and 2007, which are independent of the training/calibration data set used in the study. We find promising results and emphasize the need to implement such data-driven methods for better probabilistic forecasts at an urban scale, primarily for early flood warning.
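
    A compact sketch of the quantile-regression idea with statsmodels, using synthetic stand-ins for the GFS predictors and rainfall; the paper's actual predictor set and calibration are not reproduced here:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 500
      X = sm.add_constant(rng.normal(size=(n, 3)))     # stand-ins for GFS predictors
      rain = np.exp(X[:, 1]) + rng.gamma(2.0, 1.0, n)  # synthetic heavy-tailed rainfall

      # One linear quantile-regression fit per quantile; together the fitted
      # quantiles form a probabilistic rainfall forecast for a given predictor set.
      fits = {q: sm.QuantReg(rain, X).fit(q=q) for q in (0.05, 0.5, 0.95)}
      x_new = sm.add_constant(rng.normal(size=(2, 3)), has_constant="add")
      forecast = {q: r.predict(x_new) for q, r in fits.items()}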

  4. Modeling and Forecasting Influenza-like Illness (ILI) in Houston, Texas Using Three Surveillance Data Capture Mechanisms.

    PubMed

    Paul, Susannah; Mgbere, Osaro; Arafat, Raouf; Yang, Biru; Santos, Eunice

    2017-01-01

    Objective The objective was to forecast and validate prediction estimates of influenza activity in Houston, TX using four years of historical influenza-like illness (ILI) data from three surveillance data capture mechanisms. Background Using novel surveillance methods and historical data to estimate future trends of influenza-like illness can lead to early detection of increases and decreases in influenza activity. Anticipating surges gives public health professionals more time to prepare and increase prevention efforts. Methods Data were obtained from three surveillance systems with diverse data capture mechanisms: Flu Near You, ILINet, and hospital emergency center (EC) visits. Autoregressive integrated moving average (ARIMA) models were fitted to data from each source for week 27 of 2012 through week 26 of 2016 and used to forecast influenza-like activity for the subsequent 10 weeks. Estimates were then compared to actual ILI percentages for the same period. Results Forecasted estimates had wide confidence intervals that crossed zero. The forecasted trend direction differed by data source, resulting in a lack of consensus about future influenza activity. ILINet forecasted estimates and actual percentages had the smallest differences; ILINet thus performed best when forecasting influenza activity in Houston, TX. Conclusion Though the three forecasted estimates did not agree on the trend directions and were thus considered imprecise predictors of long-term ILI activity based on existing data, pooling predictions and careful interpretation may be helpful for short-term intervention efforts. Further work is needed to improve forecast accuracy, given the promise forecasting holds for seasonal influenza prevention and control, and pandemic preparedness.
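
    A minimal sketch of the ARIMA fit-and-forecast step with statsmodels, using placeholder weekly %ILI values; the model order shown is illustrative, not the study's fitted order:

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      # weekly %ILI for weeks 27/2012 - 26/2016 would go here (synthetic values)
      ili = np.abs(np.sin(np.linspace(0, 16, 208))) * 4 + 1

      model = ARIMA(ili, order=(2, 0, 1))         # (p, d, q) chosen for illustration
      res = model.fit()
      fc = res.get_forecast(steps=10)             # the 10-week horizon used in the study
      mean10 = fc.predicted_mean                  # point forecasts
      ci10 = fc.conf_int(alpha=0.05)              # the wide intervals reported above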

  5. Uncertainty analysis of neural network based flood forecasting models: An ensemble based approach for constructing prediction interval

    NASA Astrophysics Data System (ADS)

    Kasiviswanathan, K.; Sudheer, K.

    2013-05-01

    Artificial neural network (ANN) based hydrologic models have gained a lot of attention among water resources engineers and scientists, owing to their potential for more accurate prediction of flood flows compared to conceptual or physics-based hydrologic models. The ANN approximates the non-linear functional relationship between complex hydrologic variables in arriving at the river flow forecast values. Despite a large number of applications, there is still some criticism that an ANN's point prediction lacks reliability, since the uncertainty of the predictions is not quantified, and this limits its use in practical applications. A major concern in applying traditional uncertainty analysis techniques to the neural network framework is its parallel computing architecture with large degrees of freedom, which makes uncertainty assessment a challenging task. Very few studies have considered assessment of the predictive uncertainty of ANN-based hydrologic models. In this study, a novel method is proposed that helps construct the prediction interval of an ANN flood forecasting model during calibration itself. The method is designed to have two stages of optimization during calibration: in stage 1, the ANN model is trained with a genetic algorithm (GA) to obtain an optimal set of weights and biases, and in stage 2, the optimal variability of the ANN parameters (obtained in stage 1) is identified so as to create an ensemble of predictions. During the second stage, the optimization is performed with multiple objectives: (i) minimum residual variance for the ensemble mean, (ii) the maximum number of measured data points falling within the estimated prediction interval and (iii) minimum width of the prediction interval. The method is illustrated using a real-world case study of an Indian basin. The method was able to produce an ensemble with an average prediction interval width of 23.03 m3/s, with 97.17% of the total validation data points (measured) lying within the interval. For a selected hydrograph in the validation data set, most of the observed flows lie within the constructed prediction interval, which therefore provides information about the uncertainty of the prediction. One specific advantage of the method is that when the ensemble mean value is taken as the forecast, peak flows are predicted with improved accuracy compared to traditional single-point-forecast ANNs.
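
    Once the parameter ensemble is available, the interval statistics reported above (coverage and average width) can be computed as in this sketch:

      import numpy as np

      def interval_metrics(ensemble, observed, coverage=95):
          # ensemble: (n_members, n_times) flow forecasts from the optimized
          # parameter ensemble; observed: (n_times,) measured flows.
          lo = np.percentile(ensemble, (100 - coverage) / 2, axis=0)
          hi = np.percentile(ensemble, 100 - (100 - coverage) / 2, axis=0)
          inside = np.mean((observed >= lo) & (observed <= hi)) * 100  # % of points
          width = np.mean(hi - lo)                                     # mean PI width
          return inside, width

      # e.g. the study reports ~97% of validation points inside a ~23 m3/s-wide band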

  6. Operational Earthquake Forecasting of Aftershocks for New England

    NASA Astrophysics Data System (ADS)

    Ebel, J.; Fadugba, O. I.

    2015-12-01

    Although the forecasting of mainshocks is not possible, recent research demonstrates that probabilistic forecasts of expected aftershock activity following moderate and strong earthquakes are possible. Previous work has shown that aftershock sequences in intraplate regions behave similarly to those in California, and thus the operational aftershock forecasting methods currently employed in California can be adopted for use in areas of the eastern U.S. such as New England. In our application, immediately after a felt earthquake in New England, a forecast of expected aftershock activity for the next 7 days will be generated based on a generic aftershock activity model. Approximately 24 hours after the mainshock, the parameters of the aftershock model will be updated using the aftershock activity observed to that point in time, and a new forecast of expected aftershock activity for the next 7 days will be issued. The forecast will estimate the average number of weak, felt aftershocks and the average expected number of aftershocks based on the aftershock statistics of past New England earthquakes. The forecast will also estimate the probability that an earthquake stronger than the mainshock will take place during the next 7 days. The aftershock forecast will specify the expected aftershock locations as well as the areas over which aftershocks of different magnitudes could be felt. The system will use web pages, email and text messages to distribute the aftershock forecasts. For protracted aftershock sequences, new forecasts will be issued on a regular basis, such as weekly. Initially, the distribution system for the aftershock forecasts will be limited, but it will be expanded later as experience with and confidence in the system grows.
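
    As a hedged sketch, expected aftershock counts can be derived from a generic Reasenberg-Jones-type rate model of the kind used operationally in California; the parameter values below are the classic generic California values, not a New England calibration:

      import numpy as np

      def expected_aftershocks(t1, t2, mainshock_mag, mag_min,
                               a=-1.67, b=0.91, c=0.05, p=1.08):
          # Rate lambda(t) = 10**(a + b*(Mm - Mc)) * (t + c)**(-p), integrated
          # from t1 to t2 days after the mainshock (illustrative parameters).
          k = 10 ** (a + b * (mainshock_mag - mag_min))
          if abs(p - 1.0) < 1e-9:
              integral = np.log(t2 + c) - np.log(t1 + c)
          else:
              integral = ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)
          return k * integral

      # expected number of M >= 3 aftershocks in the first 7 days after an M5 event
      n7 = expected_aftershocks(0.0, 7.0, mainshock_mag=5.0, mag_min=3.0)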

  7. The Rise of Complexity in Flood Forecasting: Opportunities, Challenges and Tradeoffs

    NASA Astrophysics Data System (ADS)

    Wood, A. W.; Clark, M. P.; Nijssen, B.

    2017-12-01

    Operational flood forecasting is currently undergoing a major transformation. Most national flood forecasting services have relied for decades on lumped, highly calibrated conceptual hydrological models running on local office computing resources, providing deterministic streamflow predictions at gauged river locations that are important to stakeholders and emergency managers. A variety of recent technological advances now make it possible to run complex, high-to-hyper-resolution models for operational hydrologic prediction over large domains, and the US National Weather Service is now attempting to use hyper-resolution models to create new forecast services and products. Yet other 'increased-complexity' forecasting strategies also exist that pursue different tradeoffs between model complexity (i.e., spatial resolution, physics) and streamflow forecast system objectives. There is currently a pressing need for a greater understanding in the hydrology community of the opportunities, challenges and tradeoffs associated with these different forecasting approaches, and for a greater participation by the hydrology community in evaluating, guiding and implementing these approaches. Intermediate-resolution forecast systems, for instance, use distributed land surface model (LSM) physics but retain the agility to deploy ensemble methods (including hydrologic data assimilation and hindcast-based post-processing). Fully coupled numerical weather prediction (NWP) systems, another example, use still coarser LSMs to produce ensemble streamflow predictions either at the model scale or after sub-grid scale runoff routing. Based on the direct experience of the authors and colleagues in research and operational forecasting, this presentation describes examples of different streamflow forecast paradigms, from the traditional to the recent hyper-resolution, to illustrate the range of choices facing forecast system developers. We also discuss the degree to which the strengths and weaknesses of each strategy map onto the requirements for different types of forecasting services (e.g., flash flooding, river flooding, seasonal water supply prediction).

  8. Quantifying the Usefulness of Ensemble-Based Precipitation Forecasts with Respect to Water Use and Yield during a Field Trial

    NASA Astrophysics Data System (ADS)

    Christ, E.; Webster, P. J.; Collins, G.; Byrd, S.

    2014-12-01

    Recent droughts and the continuing water wars between the states of Georgia, Alabama and Florida have made agricultural producers more aware of the importance of managing their irrigation systems more efficiently. Many southeastern states are beginning to consider laws that will require monitoring and regulation of water used for irrigation. Recently, Georgia suspended issuing irrigation permits in some areas of the southwestern portion of the state to try to limit the amount of water being used for irrigation. However, even in southern Georgia, which receives on average between 23 and 33 inches of rain during the growing season, irrigation can significantly impact crop yields. In fact, studies have shown that when fields do not receive rainfall at the most critical stages in the life of cotton, yields for irrigated fields can be up to twice those of non-irrigated cotton. This leads to the motivation for this study, which is to produce a forecast tool that will enable producers to make more efficient irrigation management decisions. We will use the ECMWF (European Centre for Medium-Range Weather Forecasts) VarEPS (Ensemble Prediction System) model precipitation forecasts for the grid points included in the 1° x 1° lat/lon square surrounding the point of interest. We will then apply quantile-to-quantile (q-to-q) bias corrections to the forecasts. Once we have applied the bias corrections, we will use the checkbook method of irrigation scheduling to determine the probability of receiving the required amount of rainfall for each week of the growing season. These forecasts will be used during a field trial conducted at the C.M. Stripling Irrigation Research Park in Camilla, Georgia. This research will compare differences in yield and water use among the standard checkbook method of irrigation, which uses no precipitation forecast knowledge, the weather.com forecast, a dry-land plot, and the ensemble-based forecasts mentioned above.
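
    A minimal sketch of an empirical quantile-to-quantile (q-to-q) bias correction, with hypothetical gamma-distributed training records standing in for the forecast and gauge climatologies:

      import numpy as np

      def q_to_q_correct(fcst, fcst_hist, obs_hist):
          # Empirical quantile mapping: replace each forecast value by the
          # observed value occupying the same quantile in the training record.
          ranks = np.searchsorted(np.sort(fcst_hist), fcst) / len(fcst_hist)
          ranks = np.clip(ranks, 0.0, 1.0)
          return np.quantile(obs_hist, ranks)

      # hypothetical training records (mm/week) and a new ensemble-member forecast
      rng = np.random.default_rng(0)
      fcst_hist = rng.gamma(2.0, 8.0, 1000)     # model climatology
      obs_hist = rng.gamma(2.0, 10.0, 1000)     # gauge climatology
      corrected = q_to_q_correct(np.array([5.0, 20.0, 60.0]), fcst_hist, obs_hist)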

  9. Multicomponent ensemble models to forecast induced seismicity

    NASA Astrophysics Data System (ADS)

    Király-Proag, E.; Gischig, V.; Zechar, J. D.; Wiemer, S.

    2018-01-01

    In recent years, human-induced seismicity has become an increasingly relevant topic due to its economic and social implications. Several models and approaches have been developed to explain the underlying physical processes or to forecast induced seismicity. They range from simple statistical models to coupled numerical models incorporating complex physics. We advocate the need for forecast testing, currently the best method for ascertaining whether or not models are capable of reasonably accounting for the key governing physical processes. Moreover, operational forecast models are of great interest for supporting on-site decision-making in projects entailing induced earthquakes. We previously introduced a standardized framework following the guidelines of the Collaboratory for the Study of Earthquake Predictability, the Induced Seismicity Test Bench, to test, validate, and rank induced seismicity models. In this study, we describe how to construct multicomponent ensemble models based on Bayesian weightings that deliver more accurate forecasts than the individual models in the case of the Basel 2006 and Soultz-sous-Forêts 2004 enhanced geothermal stimulation projects. For this, we examine five calibrated variants of two significantly different model groups: (1) Shapiro and Smoothed Seismicity, based on the seismogenic index, a simple modified Omori-law-type seismicity decay, and temporally weighted smoothed seismicity; (2) Hydraulics and Seismicity, based on numerically modelled pore pressure evolution that triggers seismicity using the Mohr-Coulomb failure criterion. We also demonstrate how the individual and ensemble models would perform as part of an operational Adaptive Traffic Light System. Investigating seismicity forecasts based on a range of potential injection scenarios, we use forecast periods of different durations to compute the occurrence probabilities of seismic events M ≥ 3. We show that in the case of the Basel 2006 geothermal stimulation the models forecast hazardous levels of seismicity days before the occurrence of felt events.
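
    The Bayesian weighting of ensemble members can be sketched as follows; the log-likelihood values are hypothetical, and the study's actual weighting scheme may differ in detail:

      import numpy as np

      def bayesian_weights(log_likelihoods, priors=None):
          # Posterior model weights from the log-likelihood each model attains
          # on the calibration seismicity; subtract the max for numerical stability.
          ll = np.asarray(log_likelihoods, float)
          priors = np.ones_like(ll) / ll.size if priors is None else np.asarray(priors)
          w = priors * np.exp(ll - ll.max())
          return w / w.sum()

      def ensemble_rates(model_rates, weights):
          # Ensemble forecast rate per space-time-magnitude bin:
          # weighted sum over the (n_models, n_bins) individual forecasts.
          return np.tensordot(weights, model_rates, axes=1)

      w = bayesian_weights([-120.4, -118.9, -125.1, -119.5, -121.0])  # 5 variants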

  10. Development and application of an atmospheric-hydrologic-hydraulic flood forecasting model driven by TIGGE ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Bao, Hongjun; Zhao, Linna

    2012-02-01

    A coupled atmospheric-hydrologic-hydraulic ensemble flood forecasting model, driven by The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) data, has been developed for flood forecasting over the Huaihe River. The incorporation of numerical weather prediction (NWP) information into flood forecasting systems may increase forecast lead time from a few hours to a few days. A single NWP model forecast from a single forecast center, however, is insufficient, as it involves considerable non-predictable uncertainties and leads to a high number of false alarms. The availability of global ensemble NWP systems through TIGGE offers a new opportunity for flood forecasting. The Xinanjiang model, used for hydrological rainfall-runoff modeling, and the one-dimensional unsteady flow model applied to channel flood routing are coupled with ensemble weather predictions based on the TIGGE data from the Canadian Meteorological Centre (CMC), the European Centre for Medium-Range Weather Forecasts (ECMWF), the UK Met Office (UKMO), and the US National Centers for Environmental Prediction (NCEP). The developed ensemble flood forecasting model is applied to the 2007 flood season as a test case, chosen over the upper reaches of the Huaihe River above Lutaizi station with flood diversion and retarding areas. The input flood discharge hydrograph from the main channel to the flood diversion area is estimated with a fixed split ratio of the main channel discharge. The flood flow inside the flood retarding area is calculated as a reservoir with the water balance method. The Muskingum method is used for flood routing in the flood diversion area. A probabilistic discharge and flood inundation forecast is provided as the end product to study the potential benefits of using the TIGGE ensemble forecasts. The results demonstrate satisfactory flood forecasting with clear signals of the probability of floods up to a few days in advance, and show that TIGGE ensemble forecast data are a promising tool for forecasting flood inundation, comparable with forecasts driven by rain gauge observations.
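
    For the channel-routing step, a standard Muskingum recursion (the method named above) can be sketched as follows; K, X and dt are illustrative values, not the Huaihe calibration:

      import numpy as np

      def muskingum_route(inflow, K=12.0, X=0.2, dt=6.0):
          # Classic Muskingum routing: O2 = C0*I2 + C1*I1 + C2*O1, with the
          # storage constant K and time step dt in the same time units.
          d = 2 * K * (1 - X) + dt
          c0 = (dt - 2 * K * X) / d
          c1 = (dt + 2 * K * X) / d
          c2 = (2 * K * (1 - X) - dt) / d       # c0 + c1 + c2 == 1
          out = np.empty_like(inflow)
          out[0] = inflow[0]                    # assume an initial steady state
          for i in range(1, len(inflow)):
              out[i] = c0 * inflow[i] + c1 * inflow[i - 1] + c2 * out[i - 1]
          return out

      hydrograph = np.array([100., 300., 900., 1400., 1100., 700., 400., 200.])
      routed = muskingum_route(hydrograph)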

  11. Forecasting intense geomagnetic activity using interplanetary magnetic field data

    NASA Astrophysics Data System (ADS)

    Saiz, E.; Cid, C.; Cerrato, Y.

    2008-12-01

    Southward interplanetary magnetic fields are considered traces of geoeffectiveness since they are a main agent of magnetic reconnection between the solar wind and the magnetosphere. The first part of this work revises the ability to forecast intense geomagnetic activity using different procedures available in the literature. The study shows that current methods do not succeed in making reliable predictions. This fact led us to develop a new forecasting procedure, which provides trustworthy results in predicting large variations of the Dst index over a sample of 10 years of observations and is based on the value of Bz only. The proposed forecasting method appears to be a worthy tool for space weather purposes because it is not affected by the lack of solar wind plasma data, which usually occurs during severe geomagnetic activity. Moreover, the results obtained guide us to a new interpretation of the physical mechanisms involved in the interaction between the solar wind and the magnetosphere using Faraday's law.

  12. A novel hybrid ensemble learning paradigm for tourism forecasting

    NASA Astrophysics Data System (ADS)

    Shabri, Ani

    2015-02-01

    In this paper, a hybrid forecasting model based on Empirical Mode Decomposition (EMD) and the Group Method of Data Handling (GMDH) is proposed to forecast tourism demand. This methodology first decomposes the original visitor arrival series into several Intrinsic Mode Function (IMF) components and one residual component using the EMD technique. Then, the IMF components and the residual component are each forecasted using a GMDH model whose input variables are selected using the Partial Autocorrelation Function (PACF). The final forecast for the tourism series is produced by aggregating all the component forecasts. To evaluate the performance of the proposed EMD-GMDH methodology, monthly tourist arrivals from Singapore to Malaysia are used as an illustrative example. Empirical results show that the proposed EMD-GMDH model outperforms the EMD-ARIMA model as well as the GMDH and ARIMA (Autoregressive Integrated Moving Average) models without time series decomposition.
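
    The PACF-based input selection can be sketched as follows, keeping the lags whose partial autocorrelation is significant at roughly the 5% level (this significance rule is a common convention, assumed here rather than taken from the paper):

      import numpy as np
      from statsmodels.tsa.stattools import pacf

      def select_lags(series, max_lag=24):
          # Keep lags whose partial autocorrelation falls outside the
          # +/- 1.96/sqrt(n) band; these become the model's input variables.
          vals = pacf(series, nlags=max_lag)
          bound = 1.96 / np.sqrt(len(series))
          return [k for k in range(1, max_lag + 1) if abs(vals[k]) > bound]

      # e.g. select_lags(monthly_arrivals) might return [1, 2, 12] for a
      # seasonal monthly series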

  13. Weather Forecasting Systems and Methods

    NASA Technical Reports Server (NTRS)

    Mecikalski, John (Inventor); MacKenzie, Wayne M., Jr. (Inventor); Walker, John Robert (Inventor)

    2014-01-01

    A weather forecasting system has weather forecasting logic that receives raw image data from a satellite. The raw image data has values indicative of light and radiance data from the Earth as measured by the satellite, and the weather forecasting logic processes such data to identify cumulus clouds within the satellite images. For each identified cumulus cloud, the weather forecasting logic applies interest field tests to determine a score indicating the likelihood of the cumulus cloud forming precipitation and/or lightning in the future within a certain time period. Based on such scores, the weather forecasting logic predicts in which geographic regions the identified cumulus clouds will produce precipitation and/or lightning during the time period. Such predictions may then be used to provide a weather map, thereby providing users with a graphical illustration of the areas predicted to be affected by precipitation within the time period.

  14. Solar Atmosphere to Earth's Surface: Long Lead Time dB/dt Predictions with the Space Weather Modeling Framework

    NASA Astrophysics Data System (ADS)

    Welling, D. T.; Manchester, W.; Savani, N.; Sokolov, I.; van der Holst, B.; Jin, M.; Toth, G.; Liemohn, M. W.; Gombosi, T. I.

    2017-12-01

    The future of space weather prediction depends on the community's ability to predict L1 values from observations of the solar atmosphere, which can yield hours of lead time. While both empirical and physics-based L1 forecast methods exist, it is not yet known whether this nascent capability can translate into skilled dB/dt forecasts at the Earth's surface. This paper shows results for the first forecast-quality, solar-atmosphere-to-Earth's-surface dB/dt predictions. Two methods are used to predict solar wind and IMF conditions at L1 for several real-world coronal mass ejection events. The first method is an empirical, observationally based system to estimate the plasma characteristics. The magnetic field predictions are based on the Bz4Cast system, which assumes that the CME has a cylindrical flux rope geometry locally around Earth's trajectory. The remaining plasma parameters of density, temperature and velocity are estimated from white-light coronagraphs via a variety of triangulation methods and forward-based modelling. The second is a first-principles-based approach that combines the Eruptive Event Generator using Gibson-Low configuration (EEGGL) model with the Alfven Wave Solar Model (AWSoM). EEGGL specifies parameters for the Gibson-Low flux rope such that it erupts, driving a CME in the coronal model that reproduces coronagraph observations and propagates to 1 AU. The resulting solar wind predictions are used to drive the operational Space Weather Modeling Framework (SWMF) for geospace. Following the configuration used by NOAA's Space Weather Prediction Center, this setup couples the BATS-R-US global magnetohydrodynamic model to the Rice Convection Model (RCM) ring current model and a height-integrated ionosphere electrodynamics model. The long-lead-time predictions of dB/dt are compared to model results driven by L1 solar wind observations. Both are compared to real-world observations from surface magnetometers at a variety of geomagnetic latitudes. Metrics are calculated to examine how the simulated solar wind drivers impact forecast skill. These results illustrate the current state of long-lead-time forecasting and the promise of this technology for operational use.

  15. Seasonal drought predictability in Portugal using statistical-dynamical techniques

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. F. S.; Pires, C. A. L.

    2016-08-01

    Atmospheric forecasting and predictability are important to promote adaptation and mitigation measures that minimize drought impacts. This study estimates hybrid (statistical-dynamical) long-range forecasts of the regional drought index SPI (3 months) over homogeneous regions of mainland Portugal, based on forecasts from the UKMO operational forecasting system, with lead times of up to 6 months. ERA-Interim reanalysis data are used to build a set of SPI predictors integrating recent past information prior to the forecast launch. Then, the advantage of combining predictors with both dynamical and statistical backgrounds in the prediction of drought conditions at different lags is evaluated. A two-step hybridization procedure is performed, in which both forecasted and observed 500 hPa geopotential height fields are subjected to a PCA in order to use forecasted PCs and persistent PCs as predictors. The second hybridization step consists of a statistical/hybrid downscaling to the regional SPI, based on regression techniques, after pre-selection of the statistically significant predictors. The SPI forecasts and the added value of combining dynamical and statistical methods are evaluated in cross-validation mode, using the R2 and binary event scores. Results are obtained for the four seasons; winter was found to be the most predictable season, and most of the predictive power lies in the large-scale fields from past observations. The hybridization improves the downscaling based on the forecasted PCs, since they provide complementary (though modest) information beyond that of the persistent PCs. These findings provide clues about the predictability of the SPI, particularly in Portugal, and may contribute to the predictability of crop yields and to guidance for users (such as farmers) in their decision-making process.
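
    A minimal sketch of the two-step idea, PCA compression of the large-scale fields followed by regression downscaling to the SPI, with random stand-ins for the geopotential fields and drought index:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(1)
      z500 = rng.normal(size=(240, 500))   # stand-in for 500 hPa geopotential fields
      spi = rng.normal(size=240)           # stand-in for the regional 3-month SPI

      # Step 1: compress the fields into a few leading PCs; in the hybrid
      # scheme both forecasted and persistent (observed) fields are used.
      pcs = PCA(n_components=5).fit_transform(z500)

      # Step 2: regression downscaling from the (pre-selected) PCs to the SPI.
      downscale = LinearRegression().fit(pcs, spi)
      spi_hat = downscale.predict(pcs)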

  16. QPF verification using different radar-based analyses: a case study

    NASA Astrophysics Data System (ADS)

    Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.

    2009-09-01

    Verification of QPF in NWP models has always been challenging, not only for knowing which scores better quantify a particular skill of a model but also for choosing the most appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can provide conclusions that are not in agreement with those obtained by the "eyeball" method; consequently, QPF can provide valuable information to forecasters in spite of having poor scores. On the other hand, there are difficulties in knowing the "truth", so different results can be achieved depending on the procedures used to obtain the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain better knowledge of the skill of a forecasting system. In particular, a short-range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied for a local convective precipitation event that took place in the NE Iberian Peninsula on 3 October 2008. For this purpose, a variety of verification methods (dichotomous, recalibration and object-oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.
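
    For the dichotomous part of the verification, the standard 2x2 contingency-table scores can be computed as in this sketch (the counts are hypothetical):

      def dichotomous_scores(hits, false_alarms, misses, correct_negatives):
          # Standard scores for yes/no precipitation forecasts after
          # thresholding forecast and analysed rainfall (e.g. at 1 mm).
          pod = hits / (hits + misses)                    # probability of detection
          far = false_alarms / (hits + false_alarms)      # false alarm ratio
          csi = hits / (hits + misses + false_alarms)     # critical success index
          bias = (hits + false_alarms) / (hits + misses)  # frequency bias
          return pod, far, csi, bias

      print(dichotomous_scores(hits=42, false_alarms=18,
                               misses=11, correct_negatives=129))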

  17. A physics-based probabilistic forecasting model for rainfall-induced shallow landslides at regional scale

    NASA Astrophysics Data System (ADS)

    Zhang, Shaojie; Zhao, Luqiang; Delgado-Tellez, Ricardo; Bao, Hongjun

    2018-03-01

    Conventional outputs of physics-based landslide forecasting models are presented as deterministic warnings obtained by calculating the safety factor (Fs) of potentially dangerous slopes. However, these models depend strongly on variables such as cohesion and internal friction angle, which carry a high degree of uncertainty, especially at a regional scale, resulting in unacceptable uncertainty in Fs. Under such circumstances, the outputs of physical models are more suitable if presented in the form of landslide probability values. To develop such a model, a method to link the uncertainty of soil parameter values with landslide probability is devised. This paper proposes the use of Monte Carlo methods to quantitatively express uncertainty by assigning random values to physical variables within a defined interval. The inequality Fs < 1 is tested for each pixel in n simulations, which are integrated into a single parameter. This parameter links the landslide probability to the uncertainties of the soil mechanical parameters and is used to create a physics-based probabilistic forecasting model for rainfall-induced shallow landslides. The prediction ability of this model was tested in a case study, in which forecasting of the landslide disasters associated with the heavy rainfall of 9 July 2013 in the Wenchuan earthquake region of Sichuan province, China, was simulated. The proposed model successfully forecasted landslides at 159 of the 176 disaster points registered by the geo-environmental monitoring station of Sichuan province. These testing results indicate that the new model can be operated in a highly efficient way and gives more reliable results owing to its high prediction accuracy. Accordingly, the new model can potentially be packaged into a forecasting system for shallow landslides, providing technological support for the mitigation of these disasters at regional scale.
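
    A minimal sketch of the Monte Carlo step, using an infinite-slope safety factor with cohesion and friction angle drawn from assumed intervals; the paper's full model also propagates rainfall-driven pore pressure, which is omitted here:

      import numpy as np

      def landslide_probability(n=10000, slope_deg=35.0, depth=1.2, gamma=19.0):
          # Fs for an infinite slope with random soil parameters; the landslide
          # probability is the fraction of simulations with Fs < 1.
          rng = np.random.default_rng(0)
          c = rng.uniform(5.0, 15.0, n)                  # cohesion, kPa (assumed range)
          phi = np.radians(rng.uniform(25.0, 35.0, n))   # friction angle (assumed range)
          beta = np.radians(slope_deg)
          w = gamma * depth                              # overburden stress, kPa
          fs = (c + w * np.cos(beta) ** 2 * np.tan(phi)) / \
               (w * np.sin(beta) * np.cos(beta))
          return np.mean(fs < 1.0)

      p = landslide_probability()   # per-pixel probability for one rainfall scenario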

  18. Non-parametric data-based approach for the quantification and communication of uncertainties in river flood forecasts

    NASA Astrophysics Data System (ADS)

    Van Steenbergen, N.; Willems, P.

    2012-04-01

    Reliable flood forecasts are the most important non-structural measure to reduce the impact of floods. However, flood forecasting systems are subject to uncertainty originating from the input data, model structure and model parameters of the different hydraulic and hydrological submodels. To quantify this uncertainty, a non-parametric data-based approach has been developed. This approach analyses the historical forecast residuals (differences between the predictions and the observations at river gauging stations) without using a predefined statistical error distribution. Because the residuals are correlated with the value of the forecasted water level and the lead time, the residuals are split into discrete classes of simulated water levels and lead times. For each class, percentile values of the model residuals are calculated and stored in a three-dimensional error matrix. By 3D interpolation in this error matrix, the uncertainty in newly forecasted water levels can be quantified. In addition to the quantification of the uncertainty, the communication of this uncertainty is equally important. The communication has to be done in a consistent way, reducing the chance of misinterpretation. The communication also needs to be adapted to the audience: the majority of the general public is not interested in in-depth information on the uncertainty of the predicted water levels, but only in the likelihood that certain alarm levels will be exceeded. Water managers need more information, e.g. time-dependent uncertainty information, because they rely on it to undertake the appropriate flood mitigation actions. There are various ways of presenting uncertainty information (numerical, linguistic, graphical, time-(in)dependent, etc.), each with advantages and disadvantages for a specific audience. A useful method of communicating the uncertainty of flood forecasts is probabilistic flood mapping. These maps represent the probability of flooding of a certain area, based on the uncertainty assessment of the flood forecasts. Using this type of map, water managers can focus their attention on the areas with the highest flood probability. The general public can also consult these maps for information on the probability of flooding at their specific location, so that they can take proactive measures to reduce personal damage. The method of quantifying the uncertainty was implemented in the operational flood forecasting system for the navigable rivers in the Flanders region of Belgium, where it has shown clear benefits during the floods of the past two years.
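
    A sketch of how the three-dimensional error matrix can be built from historical residuals; the class boundaries and percentile levels are illustrative choices:

      import numpy as np

      def build_error_matrix(levels, leads, residuals,
                             percentiles=(5, 25, 50, 75, 95)):
          # levels, leads, residuals: aligned numpy arrays of forecasted water
          # level, lead time, and forecast residual for historical forecasts.
          lev_edges = np.quantile(levels, np.linspace(0, 1, 6))  # 5 level classes
          lead_vals = np.unique(leads)
          mat = np.full((5, len(lead_vals), len(percentiles)), np.nan)
          for i in range(5):
              in_lev = (levels >= lev_edges[i]) & (levels <= lev_edges[i + 1])
              for j, lt in enumerate(lead_vals):
                  sel = residuals[in_lev & (leads == lt)]
                  if sel.size:
                      mat[i, j] = np.percentile(sel, percentiles)
          return mat, lev_edges, lead_vals

      # new forecasts are then assigned uncertainty bands by interpolating in `mat`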

  19. Assessing various approaches for flash flood forecasting in the Yzeron periurban catchment (150 km2) south-east Lyon, France

    NASA Astrophysics Data System (ADS)

    Braud, Isabelle; Breil, Pascal; Javelle, Pierre; Pejakovic, Nikola; Guérin, Stéphane

    2017-04-01

    The Yzeron periurban catchment (150 km2) is prone to flash floods leading to overflows in the downstream part of the catchment. A prevention and management plan has been approved, and the set-up of a flood forecasting system is planned. This study compares several solutions for flood forecasting in the catchment. It is based on an extensive data collection (rain gauges, radar/rain gauge reanalyses, discharge and water level data) from this experimental catchment. A set of rainfall-runoff events leading to floods (problematic and non-problematic floods) was extracted and formed the basis for the definition of a first forecasting method, based on data analysis and the identification of explanatory factors among the following: rainfall amount, intensity, antecedent rainfall, and initial discharge. Several statistical methods, including Factorial Analysis of Mixed Data and Classification and Regression Trees, were used for this purpose. They showed that several classes of problematic floods can be identified. The first is related to wet conditions characterized by high initial discharge and antecedent rainfall. The second is driven by rainfall amount, initial discharge and rainfall intensity. Thresholds on these variables can be identified to provide a first warning. The second forecasting method assessed in the study is the system that will be operational in France in 2017, based on the AIGA method (Javelle et al., 2016). For this purpose, an 18-year discharge simulation produced with the hydrological model of the AIGA method, forced with the radar/rain gauge reanalysis, was available at 44 locations within the catchment. The dates on which quantiles of a given return period were exceeded were identified and compared with the list of problematic events. The AIGA method was found to be relevant in identifying the most problematic events, but the lead time needs further investigation in order to assess its usefulness for warning the population. References: Javelle, P., Organde, D., Demargne, J., Saint-Martin, C., de Saint-Aubin, C., Garandeau, L., Janet, B. (2016). Setting up a French national flash flood warning system for ungauged catchments based on the AIGA method. E3S Web of Conferences 7, 18010, 3rd European Conference on Flood Risk Management (FLOODrisk 2016), http://dx.doi.org/10.1051/e3sconf/20160718010

  20. Intermediate-term forecasting of aftershocks from an early aftershock sequence: Bayesian and ensemble forecasting approaches

    NASA Astrophysics Data System (ADS)

    Omi, Takahiro; Ogata, Yosihiko; Hirata, Yoshito; Aihara, Kazuyuki

    2015-04-01

    Because aftershock occurrences can cause significant seismic risks for a considerable time after the main shock, prospective forecasting of the intermediate-term aftershock activity as soon as possible is important. The epidemic-type aftershock sequence (ETAS) model with the maximum likelihood estimate effectively reproduces general aftershock activity including secondary or higher-order aftershocks and can be employed for the forecasting. However, because we cannot always expect the accurate parameter estimation from incomplete early aftershock data where many events are missing, such forecasting using only a single estimated parameter set (plug-in forecasting) can frequently perform poorly. Therefore, we here propose Bayesian forecasting that combines the forecasts by the ETAS model with various probable parameter sets given the data. By conducting forecasting tests of 1 month period aftershocks based on the first 1 day data after the main shock as an example of the early intermediate-term forecasting, we show that the Bayesian forecasting performs better than the plug-in forecasting on average in terms of the log-likelihood score. Furthermore, to improve forecasting of large aftershocks, we apply a nonparametric (NP) model using magnitude data during the learning period and compare its forecasting performance with that of the Gutenberg-Richter (G-R) formula. We show that the NP forecast performs better than the G-R formula in some cases but worse in other cases. Therefore, robust forecasting can be obtained by employing an ensemble forecast that combines the two complementary forecasts. Our proposed method is useful for a stable unbiased intermediate-term assessment of aftershock probabilities.

  1. Real-time forecasting of an epidemic using a discrete time stochastic model: a case study of pandemic influenza (H1N1-2009).

    PubMed

    Nishiura, Hiroshi

    2011-02-16

    Real-time forecasting of epidemics, especially forecasting based on a likelihood approach, is understudied. This study aimed to develop a simple method that can be used for real-time epidemic forecasting. A discrete-time stochastic model, accounting for demographic stochasticity and conditional measurement, was developed and applied as a case study to the weekly incidence of pandemic influenza (H1N1-2009) in Japan. By imposing a branching process approximation and by assuming linear growth of cases within each reporting interval, the epidemic curve is predicted using only two parameters. The uncertainty bounds of the forecasts are computed using chains of conditional offspring distributions. The quality of the forecasts made before the epidemic peak appears largely to depend on obtaining valid parameter estimates. The forecasts of both weekly incidence and final epidemic size greatly improved at and after the epidemic peak, with all the observed data points falling within the uncertainty bounds. Real-time forecasting using the discrete-time stochastic model, with its simple computation of the uncertainty bounds, was successful. Because of its simple model structure, the proposed model has the potential to additionally account for various types of heterogeneity, time-dependent transmission dynamics and epidemiological details. The impact of such complexities on forecasting should be explored when the data become available as part of disease surveillance.
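
    A simplified Poisson-chain illustration of the branching-process idea; the paper's actual two-parameter model with conditional measurement is not reproduced here:

      import numpy as np

      def forecast_epidemic(weekly_cases, horizon=4, n_sims=5000):
          # Estimate the per-interval reproduction number from recent growth,
          # then simulate Poisson offspring chains to obtain the forecast and
          # its uncertainty bounds.
          rng = np.random.default_rng(0)
          r = np.mean(np.asarray(weekly_cases[1:], float) /
                      np.asarray(weekly_cases[:-1], float))
          sims = np.empty((n_sims, horizon))
          for s in range(n_sims):
              x = weekly_cases[-1]
              for h in range(horizon):
                  x = rng.poisson(r * x)        # offspring of current cases
                  sims[s, h] = x
          return sims.mean(axis=0), np.percentile(sims, [2.5, 97.5], axis=0)

      mean_fc, bounds = forecast_epidemic([12, 20, 31, 50, 78])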

  2. Simplified methods for real-time prediction of storm surge uncertainty: The city of Venice case study

    NASA Astrophysics Data System (ADS)

    Mel, Riccardo; Viero, Daniele Pietro; Carniello, Luca; Defina, Andrea; D'Alpaos, Luigi

    2014-09-01

    Providing reliable and accurate storm surge forecasts is important for a wide range of problems related to coastal environments. In order to adequately support decision-making processes, it has also become increasingly important to be able to estimate the uncertainty associated with a storm surge forecast. The procedure commonly adopted to do this uses the results of a hydrodynamic model forced by a set of different meteorological forecasts; however, this approach entails a considerable, if not prohibitive, computational cost for real-time applications. In the present paper we describe two simplified methods for estimating, with moderate computational effort, the uncertainty affecting storm surge predictions. In the first approach we use a computationally fast, statistical tidal model instead of a hydrodynamic numerical model to estimate storm surge uncertainty. The second approach is based on the observation that the uncertainty in the sea level forecast mainly stems from the uncertainty affecting the meteorological fields; this led to the idea of estimating forecast uncertainty via a linear combination of suitable meteorological variances, extracted directly from the meteorological fields. The proposed methods were applied to estimate the uncertainty in the storm surge forecast for the Venice Lagoon. The results clearly show that the uncertainty estimated through a linear combination of suitable meteorological variances closely matches the one obtained using the deterministic approach and overcomes some intrinsic limitations in the use of a statistical tidal model.

  3. Delphi in Criminal Justice Policy: A Case Study on Judgmental Forecasting

    ERIC Educational Resources Information Center

    Loyens, Kim; Maesschalck, Jeroen; Bouckaert, Geert

    2011-01-01

    This article provides an in-depth case study analysis of a pilot project organized by the section "Strategic Analysis" of the Belgian Federal Police. Using the Delphi method, which is a judgmental forecasting technique, a panel of experts was questioned about future developments of crime, based on their expertise in criminal or social…

  4. [Research on engine remaining useful life prediction based on oil spectrum analysis and particle filtering].

    PubMed

    Sun, Lei; Jia, Yun-xian; Cai, Li-ying; Lin, Guo-yu; Zhao, Jin-song

    2013-09-01

    The spectrometric oil analysis (SOA) is an important technique for machine state monitoring, fault diagnosis and prognosis, and SOA-based remaining useful life (RUL) prediction has the advantage of identifying the optimal maintenance strategy for a machine system. Because of the complexity of machine systems, their health-state degradation process cannot be simply characterized by a linear model, while particle filtering (PF) possesses obvious advantages over traditional Kalman filtering in dealing with nonlinear and non-Gaussian systems. The PF approach was therefore applied to state forecasting by SOA, and an RUL prediction technique based on SOA and the PF algorithm is proposed. In the prediction model, the prior probability distribution is obtained from the estimate of the system's posterior probability, and a multi-step-ahead prediction model based on the PF algorithm is established. Finally, practical SOA data from an engine were analyzed and forecasted by the above method, and the forecasting result was compared with that of the traditional Kalman filtering method. The results fully show the superiority and effectiveness of the proposed method.
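
    A minimal bootstrap particle filter sketch for SOA-based state forecasting and RUL estimation; the degradation model, noise levels and failure threshold are all assumed for illustration:

      import numpy as np

      def pf_rul(measurements, n_particles=2000, drift=0.05, q=0.02, r=0.1,
                 threshold=5.0, dt_max=200):
          # Bootstrap particle filter on a wear-metal concentration state from
          # spectrometric oil analysis (linear-drift state model assumed here).
          rng = np.random.default_rng(0)
          x = rng.normal(1.0, 0.1, n_particles)             # initial particles
          for z in measurements:
              x = x + drift + rng.normal(0, q, n_particles)      # propagate
              w = np.exp(-0.5 * ((z - x) / r) ** 2)              # likelihood
              w /= w.sum()
              x = x[rng.choice(n_particles, n_particles, p=w)]   # resample
          # multi-step-ahead prediction: propagate until the failure threshold
          steps = np.full(n_particles, dt_max, float)
          xs = x.copy()
          for t in range(1, dt_max + 1):
              xs = xs + drift + rng.normal(0, q, n_particles)
              steps = np.where((xs >= threshold) & (steps == dt_max), t, steps)
          return np.median(steps)                                # RUL estimate

      rul = pf_rul([1.1, 1.2, 1.4, 1.5, 1.7])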

  5. Comparison of Adaline and Multiple Linear Regression Methods for Rainfall Forecasting

    NASA Astrophysics Data System (ADS)

    Sutawinaya, IP; Astawa, INGA; Hariyanti, NKD

    2018-01-01

    Heavy rainfall can cause disasters, so a forecast is needed to predict rainfall intensity. The main factor causing flooding is high rainfall intensity, which pushes the river beyond its capacity and floods the surrounding area. Rainfall is a dynamic factor, so it is very interesting to study. To support rainfall forecasting, methods ranging from Artificial Intelligence (AI) to statistics can be used. In this research, we used Adaline as the AI method and regression as the statistical method. The more accurate the forecast results, the better suited the method is for forecasting rainfall. Through these methods, we determine which method is best for rainfall forecasting here.
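
    A minimal Adaline sketch trained with the Widrow-Hoff (LMS) rule; the predictors and targets are hypothetical normalized values:

      import numpy as np

      def train_adaline(X, y, lr=0.01, epochs=200):
          # Adaline: a linear unit updated with the delta rule on the raw
          # linear output, unlike the perceptron's thresholded update.
          w = np.zeros(X.shape[1] + 1)            # bias + weights
          for _ in range(epochs):
              for xi, target in zip(X, y):
                  err = target - (w[0] + xi @ w[1:])
                  w[0] += lr * err
                  w[1:] += lr * err * xi
          return w

      # hypothetical normalized predictors (humidity, temperature, pressure)
      X = np.array([[0.8, 0.6, 0.3], [0.4, 0.7, 0.5],
                    [0.9, 0.5, 0.2], [0.2, 0.8, 0.6]])
      y = np.array([0.9, 0.4, 0.8, 0.2])          # normalized rainfall intensity
      w = train_adaline(X, y)
      pred = w[0] + X @ w[1:]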

  6. Assessing skill of a global bimonthly streamflow ensemble prediction system

    NASA Astrophysics Data System (ADS)

    van Dijk, A. I.; Peña-Arancibia, J.; Sheffield, J.; Wood, E. F.

    2011-12-01

    Ideally, a seasonal streamflow forecasting system might be conceived of as a system that ingests skillful climate forecasts from general circulation models and propagates these through thoroughly calibrated hydrological models that are initialised using hydrometric observations. In practice, there are practical problems with each of these aspects. Instead, we analysed whether a comparatively simple hydrological model-based Ensemble Prediction System (EPS) can provide global bimonthly streamflow forecasts with some skill and, if so, under what circumstances the greatest skill may be expected. The system tested produces ensemble forecasts for each of six annual bimonthly periods based on the previous 30 years of global daily gridded 1° resolution climate variables and an initialised global hydrological model. To incorporate some of the skill derived from ocean conditions, a post-EPS analog method was used to sample from the ensemble based on El Niño Southern Oscillation (ENSO), Indian Ocean Dipole (IOD), North Atlantic Oscillation (NAO) and Pacific Decadal Oscillation (PDO) index values observed prior to the forecast. Forecast skill was assessed through a hindcasting experiment for the period 1979-2008. Potential skill was calculated with reference to a model run with the actual forcing for the forecast period (the 'perfect' model) and was compared to actual forecast skill calculated for each of the six forecast times for, on average, 411 Australian and 51 pan-tropical catchments. Significant potential skill in bimonthly forecasts was largely limited to northern regions during the snowmelt period, seasonally wet tropical regions at the transition from wet to dry season, and the Indonesian region, where rainfall is well correlated with ENSO. The actual skill was approximately 34-50% of the potential skill. We attribute this primarily to limitations in the model structure, parameterisation and global forcing data. Use of better climate forecasts and remote sensing observations of initial catchment conditions should help to increase actual skill in future. Future work could also address the potential skill gain from using weather and climate forecasts and from a calibrated and/or alternative hydrological model or model ensemble. The approach and data might be useful as a benchmark for joint seasonal forecasting experiments planned under GEWEX.

  7. Using Forecast and Observed Weather Data to Assess Performance of Forecast Products in Identifying Heat Waves and Estimating Heat Wave Effects on Mortality

    PubMed Central

    Chen, Yeh-Hsin; Schwartz, Joel D.; Rood, Richard B.; O’Neill, Marie S.

    2014-01-01

    Background: Heat wave and health warning systems are activated based on forecasts of health-threatening hot weather. Objective: We estimated heat–mortality associations based on forecast and observed weather data in Detroit, Michigan, and compared the accuracy of forecast products for predicting heat waves. Methods: We derived and compared apparent temperature (AT) and heat wave days (with heat waves defined as ≥ 2 days of daily mean AT ≥ 95th percentile of warm-season average) from weather observations and six different forecast products. We used Poisson regression with and without adjustment for ozone and/or PM10 (particulate matter with aerodynamic diameter ≤ 10 μm) to estimate and compare associations of daily all-cause mortality with observed and predicted AT and heat wave days. Results: The 1-day-ahead forecast of a local operational product, Revised Digital Forecast, had about half the number of false positives compared with all other forecasts. On average, controlling for heat waves, days with observed AT = 25.3°C were associated with 3.5% higher mortality (95% CI: –1.6, 8.8%) than days with AT = 8.5°C. Observed heat wave days were associated with 6.2% higher mortality (95% CI: –0.4, 13.2%) than non–heat wave days. The accuracy of predictions varied, but associations between mortality and forecast heat generally tended to overestimate heat effects, whereas associations with forecast heat waves tended to underestimate heat wave effects, relative to associations based on observed weather metrics. Conclusions: Our findings suggest that incorporating knowledge of local conditions may improve the accuracy of predictions used to activate heat wave and health warning systems. Citation: Zhang K, Chen YH, Schwartz JD, Rood RB, O’Neill MS. 2014. Using forecast and observed weather data to assess performance of forecast products in identifying heat waves and estimating heat wave effects on mortality. Environ Health Perspect 122:912–918; http://dx.doi.org/10.1289/ehp.1306858 PMID:24833618
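
    A sketch of the Poisson regression step on synthetic data; the heat wave indicator here is a simple daily threshold rather than the study's two-day definition, and pollutant adjustment is omitted:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 365
      at = rng.normal(17.0, 8.0, n)                    # daily apparent temperature
      heatwave = (at > np.quantile(at, 0.95)).astype(float)
      deaths = rng.poisson(np.exp(3.0 + 0.004 * at + 0.05 * heatwave))

      # Poisson regression of daily all-cause mortality on AT and heat wave days.
      X = sm.add_constant(np.column_stack([at, heatwave]))
      fit = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()
      # % change in mortality from AT = 8.5 C to AT = 25.3 C implied by the fit
      pct = (np.exp(fit.params[1] * (25.3 - 8.5)) - 1) * 100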

  8. Remote-sensing based approach to forecast habitat quality under climate change scenarios.

    PubMed

    Requena-Mullor, Juan M; López, Enrique; Castro, Antonio J; Alcaraz-Segura, Domingo; Castro, Hermelindo; Reyes, Andrés; Cabello, Javier

    2017-01-01

    As climate change is expected to have a significant impact on species distributions, there is an urgent challenge to provide reliable information to guide conservation biodiversity policies. In addressing this challenge, we propose a remote sensing-based approach to forecast future habitat quality for the European badger, a species that is not abundant and is at risk of local extinction in the arid environments of southeastern Spain, by incorporating environmental variables related to ecosystem functioning and correlated with climate and land use. Using ensemble prediction methods, we designed global spatial distribution models for the distribution range of the badger using presence-only data and climate variables. Then, we constructed regional models for an arid region in southeastern Spain using EVI (Enhanced Vegetation Index) derived variables and weighting the pseudo-absences with the global model projections applied to this region. Finally, we forecast the badger's potential spatial distribution in the period 2071-2099 based on IPCC scenarios, incorporating the uncertainty derived from the predicted values of the EVI-derived variables. By including remotely sensed descriptors of the temporal dynamics and spatial patterns of ecosystem functioning in the spatial distribution models, results suggest that the future forecast is less favorable for European badgers than when such descriptors are not included. In addition, the change in the spatial pattern of habitat suitability may be larger than when forecasts are based on climate variables alone. Since the validity of future forecasts based only on climate variables is currently questioned, conservation policies supported by such information could have a biased vision and overestimate or underestimate the potential changes in species distribution derived from climate change. The incorporation of ecosystem functional attributes derived from remote sensing into the modeling of future forecasts may contribute to improving the detection of ecological responses under climate change scenarios.

  10. Evaluation of annual, global seismicity forecasts, including ensemble models

    NASA Astrophysics Data System (ADS)

    Taroni, Matteo; Zechar, Jeremy; Marzocchi, Warner

    2013-04-01

    In 2009, the Collaboratory for the Study of Earthquake Predictability (CSEP) initiated a prototype global earthquake forecast experiment. Three models participated in this experiment for 2009, 2010 and 2011—each model forecast the number of earthquakes above magnitude 6 in 1x1 degree cells that span the globe. Here we use likelihood-based metrics to evaluate the consistency of the forecasts with the observed seismicity. We compare model performance with statistical tests and a new method based on the peer-to-peer gambling score. The results of the comparisons are used to build ensemble models that are a weighted combination of the individual models. Notably, in these experiments the ensemble model always performs significantly better than the single best-performing model. Our results indicate the following: i) time-varying forecasts, if not updated after each major shock, may not provide significant advantages with respect to time-invariant models in 1-year forecast experiments; ii) the spatial distribution seems to be the most important feature for characterizing the different forecasting performances of the models; iii) the interpretation of consistency tests may be misleading because some good models may be rejected while trivial models may pass consistency tests; iv) proper ensemble modeling seems to be a valuable procedure for obtaining the best-performing model for practical purposes.

  11. Real-time short-term forecast of water inflow into Bureyskaya reservoir

    NASA Astrophysics Data System (ADS)

    Motovilov, Yury

    2017-04-01

    During recent years, a methodology for operational optimization of hydrosystems, including forecasts of the hydrological situation, has been developed using the Bureya reservoir as an example. Improving the accuracy of forecasts of water inflow into the reservoir for planning the water and energy regime was one of the main goals of this research. The Bureya River is the second-largest left tributary of the Amur after the Zeya River, with a 70.7 thousand square kilometer watershed and a 723 km-long river course. A variety of natural conditions - from plains in the southern part to mountainous northern areas - determines significant spatio-temporal variability in runoff generation patterns and river regime. The Bureyskaya hydropower plant (HPP), with a watershed area of 65.2 thousand square kilometers, is a key station in the Russian Far Eastern energy system, providing its reliable operation. With a spacious reservoir, the Bureyskaya HPP makes a significant contribution to the protection of the Amur region from catastrophic floods. A physically based distributed model of runoff generation built on the ECOMAG (ECOlogical Model for Applied Geophysics) hydrological modeling platform has been developed for the Bureya River basin. The model describes interception of rainfall/snowfall by the canopy, snow accumulation and melt, soil freezing and thawing, water infiltration into unfrozen and frozen soil, evapotranspiration, the thermal and water regime of the soil, and overland, subsurface, ground and river flow. The model's governing equations are derived by integrating the basic hydro- and thermodynamic equations of vertical water and heat transfer in the snowpack and frozen/unfrozen soil, of horizontal water flow under and over catchment slopes, etc. The model setup for the Bureya River basin included watershed and river network schematization with a GIS module by DEM analysis, meteorological time-series preparation, and model calibration and validation against historical observations. The results showed good model performance compared to observed inflow data into the Bureya reservoir and a high diagnostic potential of the data-modeling system of runoff formation. Using this system, the following flowchart for short-range forecasting of inflow into the Bureyskoe reservoir, with a forecast correction technique using continuously updated hydrometeorological data, has been developed: 1 - the weather observation and forecast database is renewed daily via the Internet; 2 - daily runoff from the beginning of the current year to the current date is calculated; 3 - a short-range (up to 7 days) forecast is generated based on the weather forecast. The idea underlying the model assimilation of newly obtained hydrometeorological information to adjust short-range hydrological forecasts lies in the assumption of forecast-error inertia: the difference between calculated and observed streamflow at the forecast release date is "scattered" with specific weights onto the calculated streamflow over the forecast lead time. During 2016, this method for forecasting inflow into the Bureyskaya reservoir up to 7 days ahead was tested in online mode, and a satisfactory short-range inflow forecast success rate was obtained. Tests of the developed method have shown strong sensitivity to the results of short-term precipitation forecasts.
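
    The error-correction step described above lends itself to a compact illustration. Below is a minimal sketch, assuming an exponential decay of the correction weights with lead time; the decay value and inflow numbers are hypothetical, not from the study.

```python
import numpy as np

def correct_inflow_forecast(sim_release, obs_release, sim_lead, decay=0.8):
    """Spread the release-date error over the lead time (error-inertia idea).

    sim_release : simulated inflow at the forecast release date (m3/s)
    obs_release : observed inflow at the release date (m3/s)
    sim_lead    : simulated inflow for lead times 1..7 days (m3/s)
    decay       : assumed per-day weight decay (illustrative value)
    """
    error = obs_release - sim_release
    weights = decay ** np.arange(1, len(sim_lead) + 1)
    return np.asarray(sim_lead, dtype=float) + weights * error

print(correct_inflow_forecast(980.0, 1000.0,
                              [990, 1010, 1040, 1080, 1110, 1130, 1140]))
```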

  12. Do quantitative decadal forecasts from GCMs provide decision relevant skill?

    NASA Astrophysics Data System (ADS)

    Suckling, E. B.; Smith, L. A.

    2012-04-01

    It is widely held that only physics-based simulation models can capture the dynamics required to provide decision-relevant probabilistic climate predictions. This fact in itself provides no evidence that predictions from today's GCMs are fit for purpose. Empirical (data-based) models are employed to make probability forecasts on decadal timescales, where it is argued that these 'physics free' forecasts provide a quantitative 'zero skill' target for the evaluation of forecasts based on more complicated models. It is demonstrated that these zero-skill models are competitive with GCMs on decadal scales for probability forecasts evaluated over the last 50 years. Complications of statistical interpretation due to the 'hindcast' nature of this experiment, and the likely relevance of arguments that the lack of hindcast skill is irrelevant as the signal will soon 'come out of the noise', are discussed. A lack of decision-relevant quantitative skill does not bring the science-based insights of anthropogenic warming into doubt, but it does call for a clear quantification of limits, as a function of lead time, for the spatial and temporal scales on which decisions based on such model output are expected to prove maladaptive. Failing to do so may risk the credibility of science in support of policy in the long term. The performance of a collection of simulation models is evaluated, having transformed ensembles of point forecasts into probability distributions through the kernel dressing procedure [1], according to a selection of proper skill scores [2], and contrasted with purely data-based empirical models. Data-based models are unlikely to yield realistic forecasts for future climate change if the Earth system moves away from the conditions observed in the past, upon which the models are constructed; in this sense the empirical model defines zero skill. When should a decision-relevant simulation model be expected to significantly outperform such empirical models? Probability forecasts up to ten years ahead (decadal forecasts) are considered, both on global and regional spatial scales, for surface air temperature. Such decadal forecasts are not only important in terms of providing information on the impacts of near-term climate change, but also from the perspective of climate model validation, as hindcast experiments and a sufficient database of historical observations allow standard forecast verification methods to be used. Simulation models from the ENSEMBLES hindcast experiment [3] are evaluated and contrasted with static forecasts of the observed climatology, persistence forecasts, and simple statistical models called dynamic climatology (DC). It is argued that DC is a more appropriate benchmark in the case of a non-stationary climate. It is found that the ENSEMBLES models do not demonstrate a significant increase in skill relative to the empirical models, even at global scales, over any lead time up to a decade ahead. It is suggested that the construction and co-evaluation of data-based models become a regular component of the reporting of large simulation model forecasts. The methodology presented may easily be adapted to other forecasting experiments and is expected to influence the design of future experiments. The inclusion of comparisons with dynamic climatology and other data-based approaches provides important information to both scientists and decision makers on which aspects of state-of-the-art simulation forecasts are likely to be fit for purpose. [1] J. Bröcker and L. A. Smith. From ensemble forecasts to predictive distributions, Tellus A, 60(4), 663-678 (2007). [2] J. Bröcker and L. A. Smith. Scoring probabilistic forecasts: The importance of being proper, Weather and Forecasting, 22, 382-388 (2006). [3] F. J. Doblas-Reyes, A. Weisheimer, T. N. Palmer, J. M. Murphy and D. Smith. Forecast quality assessment of the ENSEMBLES seasonal-to-decadal stream 2 hindcasts, ECMWF Technical Memorandum, 621 (2010).

  13. Development of Real-time Tsunami Inundation Forecast Using Ocean Bottom Tsunami Networks along the Japan Trench

    NASA Astrophysics Data System (ADS)

    Aoi, S.; Yamamoto, N.; Suzuki, W.; Hirata, K.; Nakamura, H.; Kunugi, T.; Kubo, T.; Maeda, T.

    2015-12-01

    In the 2011 Tohoku earthquake, in which a huge tsunami claimed a great number of lives, the initial tsunami forecast, based on hypocenter information estimated using seismic data on land, greatly underestimated the event. Learning from this lesson, NIED is now constructing S-net (Seafloor Observation Network for Earthquakes and Tsunamis along the Japan Trench), which consists of 150 ocean-bottom observatories with seismometers and pressure gauges (tsunamimeters) linked by fiber-optic cables. To take full advantage of S-net, we developed a new methodology for real-time tsunami inundation forecasting using ocean-bottom observation data and constructed a prototype system that implements the developed forecasting method for the Pacific coast of Chiba prefecture (Sotobo area). We employ a database-based approach because inundation is a strongly nonlinear phenomenon and its calculation costs are rather heavy. We prepare a tsunami scenario bank in advance by constructing the possible tsunami sources and calculating the tsunami waveforms at S-net stations, coastal tsunami heights, and tsunami inundation on land. To calculate the inundation for the target Sotobo area, we constructed a 10-m-mesh precise elevation model with coastal structures. Based on sensitivity analyses, we constructed a tsunami scenario bank that efficiently covers the possible tsunami scenarios affecting the Sotobo area. A real-time forecast is carried out by selecting from the scenario bank the several possible scenarios that best explain the real-time tsunami data observed at S-net. An advantage of our method is that tsunami inundations are estimated directly from the actual tsunami data, without source information that may carry large estimation errors. In addition to the forecast system, we are developing Web services, APIs, and smartphone applications, refining them through social experiments, to provide real-time tsunami observations and forecast information in an easy-to-understand way that urges people to evacuate.
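
    The scenario-selection step can be illustrated compactly: given a bank of precomputed waveforms and real-time gauge records, rank the scenarios by their misfit to the observations. This is a toy sketch using a simple RMS misfit; the names and synthetic data are illustrative, not the operational S-net matching procedure.

```python
import numpy as np

def select_scenarios(scenario_bank, observed, k=3):
    """Rank precomputed scenarios by RMS misfit to observed tsunami waveforms.

    scenario_bank : dict of scenario id -> (n_stations x n_times) waveforms
    observed      : observed waveforms with the same shape
    """
    misfits = {sid: np.sqrt(np.mean((wave - observed) ** 2))
               for sid, wave in scenario_bank.items()}
    return sorted(misfits, key=misfits.get)[:k]

rng = np.random.default_rng(0)
obs = rng.normal(size=(5, 60))                        # 5 stations, 60 samples
bank = {f"src{i}": obs + rng.normal(scale=0.1 * (i + 1), size=obs.shape)
        for i in range(10)}
print(select_scenarios(bank, obs))                    # best-matching sources
```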

  14. Exploring What Determines the Use of Forecasts of Varying Time Periods in Guanacaste, Costa Rica

    NASA Astrophysics Data System (ADS)

    Babcock, M.; Wong-Parodi, G.; Grossmann, I.; Small, M. J.

    2016-12-01

    Weather and climate forecasts are promoted as ways to improve water management, especially in the face of changing environmental conditions. However, studies indicate many stakeholders who may benefit from such information do not use it. This study sought to better understand which personal factors (e.g., trust in forecast sources, perceptions of accuracy) were important determinants of the use of 4-day, 3-month, and 12-month rainfall forecasts by stakeholders in water management-related sectors in the seasonally dry province of Guanacaste, Costa Rica. From August to October 2015, we surveyed 87 stakeholders from a mix of government agencies, local water committees, large farms, tourist businesses, environmental NGO's, and the public. The result of an exploratory factor analysis suggests that trust in "informal" forecast sources (traditional methods, family advice) and in "formal" sources (government, university and private company science) are independent of each other. The result of logistic regression analyses suggest that 1) greater understanding of forecasts is associated with a greater probability of 4-day and 3-month forecast use, but not 12-month forecast use, 2) a greater probability of 3-month forecast use is associated with a lower level of trust in "informal" sources, and 3), feeling less secure about water resources, and regularly using many sources of information (and specifically formal meetings and reports) are each associated with a greater probability of using 12-month forecasts. While limited by the sample size, and affected by the factoring method and regression model assumptions, these results do appear to suggest that while forecasts of all times scales are used to some extent, local decision makers' decisions to use 4-day and 3-month forecasts appear to be more intrinsically motivated (based on their level of understanding and trust) and the use of 12-month forecasts seems to be more motivated by a sense of requirement or mandate.

  15. Forecasting snowmelt flooding over Britain using the Grid-to-Grid model: a review and assessment of methods

    NASA Astrophysics Data System (ADS)

    Dey, Seonaid R. A.; Moore, Robert J.; Cole, Steven J.; Wells, Steven C.

    2017-04-01

    In many regions of high annual snowfall, snowmelt modelling can prove to be a vital component of operational flood forecasting and warning systems. Although Britain as a whole does not experience prolonged periods of lying snow, with the exception of the Scottish Highlands, the inclusion of snowmelt modelling can still have a significant impact on the skill of flood forecasts. Countrywide operational flood forecasts over Britain are produced using the national Grid-to-Grid (G2G) distributed hydrological model. For Scotland, snowmelt is included in these forecasts through a G2G snow hydrology module involving temperature-based snowfall/rainfall partitioning and functions for temperature-excess snowmelt, snowpack storage and drainage. Over England and Wales, the contribution of snowmelt is included by pre-processing the precipitation prior to input into G2G. This removes snowfall diagnosed from weather model outputs and adds snowmelt from an energy budget land surface scheme to form an effective liquid water gridded input to G2G. To review the operational options for including snowmelt modelling in G2G over Britain, a project was commissioned by the Environment Agency through the Flood Forecasting Centre (FFC) for England and Wales, in partnership with the Scottish Environment Protection Agency (SEPA) and Natural Resources Wales (NRW). Results obtained from this snowmelt review project are reported here. The operational methods used by the FFC and SEPA are compared on past snowmelt floods, alongside new alternative methods of treating snowmelt. Both case study and longer-term analyses are considered, covering periods selected from the winters 2009-2010, 2012-2013, 2013-2014 and 2014-2015. Over Scotland, both of the snowmelt methods used operationally by FFC and SEPA provided a clear improvement to the river flow simulations. Over England and Wales, fewer and less significant snowfall events occurred, leading to less distinction in the results between the methods. It is noted that, for all methods considered, large uncertainties remain in flood forecasts influenced by snowmelt. Understanding and quantifying these uncertainties should lead to more informed flood forecasts and associated guidance information.
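
    The Scotland-style snow module described above (temperature-based snowfall/rainfall partitioning plus temperature-excess melt) reduces, in its simplest form, to a degree-day step like the one below; the threshold and degree-day factor are illustrative defaults, not the operational G2G parameters.

```python
def snow_step(swe, temp, precip, t_thresh=0.0, ddf=3.0):
    """One daily step of a simple temperature-index snow module.

    swe      : snow water equivalent (mm)
    temp     : daily mean air temperature (deg C)
    precip   : daily precipitation (mm)
    t_thresh : rain/snow partition temperature (deg C, assumed)
    ddf      : degree-day melt factor (mm per deg C per day, assumed)
    Returns (updated swe, liquid water input to the hydrological model).
    """
    snowfall = precip if temp <= t_thresh else 0.0
    rainfall = precip - snowfall
    melt = min(swe + snowfall, max(0.0, ddf * (temp - t_thresh)))
    return swe + snowfall - melt, rainfall + melt

print(snow_step(swe=40.0, temp=2.5, precip=5.0))   # -> (32.5, 12.5)
```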

  16. Performance and robustness of probabilistic river forecasts computed with quantile regression based on multiple independent variables in the North Central USA

    NASA Astrophysics Data System (ADS)

    Hoss, F.; Fischbeck, P. S.

    2014-10-01

    This study further develops the method of quantile regression (QR) to predict exceedance probabilities of flood stages by post-processing forecasts. Using data from the 82 river gages for which the National Weather Service's North Central River Forecast Center issues daily forecasts, this is the first application of QR to US river gages. Archived forecasts for lead times up to six days from 2001-2013 were analyzed. Earlier implementations of QR used the forecast itself as the only independent variable (Weerts et al., 2011; López López et al., 2014). This study adds the rise rate of the river stage in the last 24 and 48 h and the forecast error 24 and 48 h ago to the QR model. Including those four variables significantly improved the forecasts, as measured by the Brier Skill Score (BSS); mainly, the resolution increases, as the original QR implementation already delivered high reliability. Combining the forecast with the other four variables results in much less favorable BSSs. Lastly, the forecast performance does not depend on the size of the training dataset, but on the year, river gage, lead time, and event threshold being forecast. We find that each event threshold requires a separate model configuration, or at least calibration.
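
    A minimal sketch of the multi-variable QR idea using statsmodels, with synthetic data standing in for the archived forecasts; the column names and the data-generating relationship are invented for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "forecast": rng.uniform(2, 10, n),    # issued stage forecast (ft)
    "rise24": rng.normal(0, 0.5, n),      # stage rise rate, last 24 h
    "rise48": rng.normal(0, 0.8, n),      # stage rise rate, last 48 h
    "err24": rng.normal(0, 0.3, n),       # forecast error 24 h ago
    "err48": rng.normal(0, 0.4, n),       # forecast error 48 h ago
})
df["observed"] = df["forecast"] + 0.5 * df["err24"] + rng.normal(0, 0.3, n)

# One QR fit per quantile; together the quantiles form a predictive
# distribution from which stage exceedance probabilities can be read off.
for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg("observed ~ forecast + rise24 + rise48 + err24 + err48",
                       df).fit(q=q)
    print(q, fit.params["forecast"].round(3))
```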

  17. Evaluating simplified methods for liquefaction assessment for loss estimation

    NASA Astrophysics Data System (ADS)

    Kongar, Indranil; Rossetto, Tiziana; Giovinazzi, Sonia

    2017-06-01

    Currently, some catastrophe models used by the insurance industry account for liquefaction by applying a simple factor to shaking-induced losses. The factor is based only on local liquefaction susceptibility and this highlights the need for a more sophisticated approach to incorporating the effects of liquefaction in loss models. This study compares 11 unique models, each based on one of three principal simplified liquefaction assessment methods: liquefaction potential index (LPI) calculated from shear-wave velocity, the HAZUS software method and a method created specifically to make use of USGS remote sensing data. Data from the September 2010 Darfield and February 2011 Christchurch earthquakes in New Zealand are used to compare observed liquefaction occurrences to forecasts from these models using binary classification performance measures. The analysis shows that the best-performing model is the LPI calculated using known shear-wave velocity profiles, which correctly forecasts 78 % of sites where liquefaction occurred and 80 % of sites where liquefaction did not occur, when the threshold is set at 7. However, these data may not always be available to insurers. The next best model is also based on LPI but uses shear-wave velocity profiles simulated from the combination of USGS VS30 data and empirical functions that relate VS30 to average shear-wave velocities at shallower depths. This model correctly forecasts 58 % of sites where liquefaction occurred and 84 % of sites where liquefaction did not occur, when the threshold is set at 4. These scores increase to 78 and 86 %, respectively, when forecasts are based on liquefaction probabilities that are empirically related to the same values of LPI. This model is potentially more useful for insurance since the input data are publicly available. HAZUS models, which are commonly used in studies where no local model is available, perform poorly and incorrectly forecast 87 % of sites where liquefaction occurred, even at optimal thresholds. This paper also considers two models (HAZUS and EPOLLS) for estimation of the scale of liquefaction in terms of permanent ground deformation but finds that both models perform poorly, with correlations between observations and forecasts lower than 0.4 in all cases. Therefore these models potentially provide negligible additional value to loss estimation analysis outside of the regions for which they have been developed.
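
    The threshold-based evaluation reduces to a pair of hit rates, as the toy snippet below shows; the LPI values and observations are made up for illustration.

```python
import numpy as np

def binary_scores(lpi, liquefied, threshold):
    """Hit rates when liquefaction is forecast wherever LPI >= threshold."""
    lpi, liquefied = np.asarray(lpi), np.asarray(liquefied, dtype=bool)
    pred = lpi >= threshold
    tpr = (pred & liquefied).sum() / liquefied.sum()       # liquefied sites caught
    tnr = (~pred & ~liquefied).sum() / (~liquefied).sum()  # clean sites cleared
    return tpr, tnr

lpi = np.array([2.0, 5.0, 8.0, 12.0, 3.0, 9.0, 1.0, 15.0])
obs = np.array([0, 0, 1, 1, 0, 1, 0, 1])
print(binary_scores(lpi, obs, threshold=7))
```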

  18. Technical Note: Initial assessment of a multi-method approach to spring-flood forecasting in Sweden

    NASA Astrophysics Data System (ADS)

    Olsson, J.; Uvo, C. B.; Foster, K.; Yang, W.

    2016-02-01

    Hydropower is a major energy source in Sweden, and proper reservoir management prior to the spring-flood onset is crucial for optimal production. This requires accurate forecasts of the accumulated discharge in the spring-flood period (i.e. the spring-flood volume, SFV). Today's SFV forecasts are generated using a model-based climatological ensemble approach, where time series of precipitation and temperature from historical years are used to force a calibrated and initialized set-up of the HBV model. In this study, a number of new approaches to spring-flood forecasting that reflect the latest developments with respect to analysis and modelling on seasonal timescales are presented and evaluated. Three main approaches, represented by specific methods, are evaluated in SFV hindcasts for the Swedish river Vindelälven over a 10-year period with lead times between 0 and 4 months. In the first approach, historically analogue years with respect to the climate in the period preceding the spring flood are identified and used to compose a reduced ensemble. In the second, seasonal meteorological ensemble forecasts are used to drive the HBV model over the spring-flood period. In the third approach, statistical relationships between SFV and the large-scale atmospheric circulation are used to build forecast models. None of the new approaches consistently outperforms the climatological ensemble approach, but for early forecasts improvements of up to 25 % are found. This potential is reasonably well realized in a multi-method system, which over all forecast dates reduced the error in SFV by ~4 %. This improvement is limited but potentially significant for, e.g., energy trading.

  19. Statistical security for Social Security.

    PubMed

    Soneji, Samir; King, Gary

    2012-08-01

    The financial viability of Social Security, the single largest U.S. government program, depends on accurate forecasts of the solvency of its intergenerational trust fund. We begin by detailing information necessary for replicating the Social Security Administration's (SSA's) forecasting procedures, which until now has been unavailable in the public domain. We then offer a way to improve the quality of these procedures via age- and sex-specific mortality forecasts. The most recent SSA mortality forecasts were based on the best available technology at the time, which was a combination of linear extrapolation and qualitative judgments. Unfortunately, linear extrapolation excludes known risk factors and is inconsistent with long-standing demographic patterns, such as the smoothness of age profiles. Modern statistical methods typically outperform even the best qualitative judgments in these contexts. We show how to use such methods, enabling researchers to forecast using far more information, such as the known risk factors of smoking and obesity and known demographic patterns. Including this extra information makes a substantial difference. For example, by improving only mortality forecasting methods, we predict three fewer years of net surplus, $730 billion less in Social Security Trust Funds, and program costs that are 0.66% greater for projected taxable payroll by 2031 compared with SSA projections. More important than specific numerical estimates are the advantages of transparency, replicability, reduction of uncertainty, and what may be the resulting lower vulnerability to the politicization of program forecasts. In addition, by offering with this article software and detailed replication information, we hope to marshal the efforts of the research community to include ever more informative inputs and to continue to reduce uncertainties in Social Security forecasts.

  20. Probabilistic Forecasting of Surface Ozone with a Novel Statistical Approach

    NASA Technical Reports Server (NTRS)

    Balashov, Nikolay V.; Thompson, Anne M.; Young, George S.

    2017-01-01

    The recent change in the Environmental Protection Agency's surface ozone regulation, lowering the surface ozone daily maximum 8-h average (MDA8) exceedance threshold from 75 to 70 ppbv, poses significant challenges to U.S. air quality (AQ) forecasters responsible for ozone MDA8 forecasts. The forecasters, supplied with only a few AQ model products, end up relying heavily on self-developed tools. To help U.S. AQ forecasters, this study explores a surface ozone MDA8 forecasting tool that is based solely on statistical methods and standard meteorological variables from numerical weather prediction (NWP) models. The model combines the self-organizing map (SOM), which is a clustering technique, with a stepwise weighted quadratic regression using meteorological variables as predictors for ozone MDA8. The SOM method identifies different weather regimes, to distinguish between various modes of ozone variability, and groups them according to similarity. In this way, when a regression is developed for a specific regime, data from the other regimes are also used, with weights that are based on their similarity to this specific regime. This approach, regression in SOM (REGiS), yields a distinct model for each regime, taking into account both the training cases for that regime and other similar training cases. To produce probabilistic MDA8 ozone forecasts, REGiS weighs and combines all of the developed regression models on the basis of the weather patterns predicted by an NWP model. REGiS is evaluated over the San Joaquin Valley in California and the northeastern plains of Colorado. The results suggest that the model performs best when trained and adjusted separately for an individual AQ station and its corresponding meteorological site.
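
    A rough sketch of the regime-weighted regression idea follows. For brevity it substitutes k-means clustering for the SOM and a plain (non-stepwise) quadratic regression, so it approximates the structure of REGiS rather than reproducing the authors' implementation; all data are synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(2)
X = rng.normal(size=(400, 3))                 # meteorological predictors
y = 50 + 8 * X[:, 0] - 5 * X[:, 1] ** 2 + rng.normal(0, 3, 400)  # MDA8 ozone

km = KMeans(n_clusters=6, n_init=10, random_state=0).fit(X)  # "regimes"
poly = PolynomialFeatures(degree=2, include_bias=False)
Xq = poly.fit_transform(X)

models = []
for center in km.cluster_centers_:
    # Similarity weights: training cases from nearby regimes also contribute.
    w = np.exp(-np.linalg.norm(X - center, axis=1) ** 2)
    models.append(LinearRegression().fit(Xq, y, sample_weight=w))

x_new = rng.normal(size=(1, 3))               # NWP-predicted weather pattern
regime = km.predict(x_new)[0]
print(models[regime].predict(poly.transform(x_new)))
```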

  1. A generalized approach to wheat yield forecasting using earth observations: Data considerations, application and relevance

    NASA Astrophysics Data System (ADS)

    Becker-Reshef, Inbal

    In recent years there has been a dramatic increase in the demand for timely, comprehensive global agricultural intelligence. The issue of food security has rapidly risen to the top of government agendas around the world as the recent lack of food access led to unprecedented food prices, hunger, poverty, and civil conflict. Timely information on global crop production is indispensable for combating the growing stress on the world's crop production, for stabilizing food prices, for developing effective agricultural policies, and for coordinating responses to regional food shortages. Earth Observations (EO) data offer a practical means for generating such information, as they provide global, timely, cost-effective, and synoptic information on crop condition and distribution. Their utility for crop production forecasting has long been recognized and demonstrated across a wide range of scales and geographic regions. Nevertheless, it is widely acknowledged that EO data could be better utilized within operational monitoring systems, and thus there is a critical need for research focused on developing practical, robust methods for agricultural monitoring. Within this context, this dissertation focused on advancing EO-based methods for crop yield forecasting and on demonstrating the potential relevance of adopting EO-based crop forecasts for providing timely, reliable agricultural intelligence. This thesis contributed to the field by developing and testing a robust EO-based method for wheat production forecasting at state to national scales using available and easily accessible data. The model was developed in Kansas (KS) using coarse-resolution normalized difference vegetation index (NDVI) time-series data in conjunction with out-of-season wheat masks, and was directly applied in Ukraine to assess its transferability. The model estimated yields within 7% of final estimates in KS and within 10% in Ukraine, 6 weeks prior to harvest. The relevance of adopting such methods to provide timely, reliable information to crop commodity markets is demonstrated through a 2010 case study.
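
    The core regression step, relating a seasonal NDVI metric to final yield, can be illustrated in a few lines; the NDVI-yield pairs below are invented, and the operational model uses a more elaborate formulation over masked wheat areas.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training pairs: seasonal peak NDVI over the wheat mask versus
# final reported wheat yield (t/ha); values are illustrative only.
peak_ndvi = np.array([[0.52], [0.61], [0.58], [0.70], [0.66], [0.49]])
yield_tha = np.array([2.1, 2.9, 2.6, 3.6, 3.2, 1.9])

model = LinearRegression().fit(peak_ndvi, yield_tha)
print(model.predict([[0.64]]))   # forecast issued weeks before harvest
```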

  2. Semiparametric modeling: Correcting low-dimensional model error in parametric models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Tyrus, E-mail: thb11@psu.edu; Harlim, John, E-mail: jharlim@psu.edu; Department of Meteorology, the Pennsylvania State University, 503 Walker Building, University Park, PA 16802-5013

    2016-03-01

    In this paper, a semiparametric modeling approach is introduced as a paradigm for addressing model error arising from unresolved physical phenomena. Our approach compensates for model error by learning an auxiliary dynamical model for the unknown parameters. Practically, the proposed approach consists of the following steps. Given a physics-based model and a noisy data set of historical observations, a Bayesian filtering algorithm is used to extract a time series of the parameter values. Subsequently, the diffusion forecast algorithm is applied to the retrieved time series in order to construct the auxiliary model for the time-evolving parameters. The semiparametric forecasting algorithm consists of integrating the existing physics-based model with an ensemble of parameters sampled from the probability density function of the diffusion forecast. To specify initial conditions for the diffusion forecast, a Bayesian semiparametric filtering method that extends the Kalman-based filtering framework is introduced. In difficult test examples, which introduce chaotically and stochastically evolving hidden parameters into the Lorenz-96 model, we show that our approach can effectively compensate for model error, with forecasting skill comparable to that of the perfect model.

  3. Nonlinear problems in data-assimilation : Can synchronization help?

    NASA Astrophysics Data System (ADS)

    Tribbia, J. J.; Duane, G. S.

    2009-12-01

    Over the past several years, operational weather centers have initiated ensemble prediction and assimilation techniques to estimate the error covariance of forecasts in the short and medium range. The ensemble techniques used are based on linear methods and have been shown to be useful indicators of skill in the linear range, where forecast errors are small relative to climatological variance. While this advance has been impressive, there are still ad hoc aspects of its use in practice, like the need for covariance inflation, which are troubling. Furthermore, to be of utility in the nonlinear range, an ensemble assimilation and prediction method must be capable of giving probabilistic information for the situation where a probability density forecast becomes multi-modal. A prototypical, simplest example of such a situation is the planetary-wave regime transition, where the pdf is bimodal. Our recent research shows how the inconsistencies and extensions of linear methodology can be consistently treated using the paradigm of synchronization, which views the problems of assimilation and forecasting as those of optimizing the forecast model state with respect to the future evolution of the atmosphere.

  4. Interval forecasting of cyber-attacks on industrial control systems

    NASA Astrophysics Data System (ADS)

    Ivanyo, Y. M.; Krakovsky, Y. M.; Luzgin, A. N.

    2018-03-01

    At present, cyber-security issues of industrial control systems occupy one of the key niches in state planning and management systems. Functional disruption of these systems via cyber-attacks may lead to emergencies related to loss of life, environmental disasters, major financial and economic damage, or disrupted activities of cities and settlements. There is therefore an urgent need to develop protection methods against cyber-attacks. This paper presents the results of cyber-attack interval forecasting with a pre-set intensity level of cyber-attacks. Interval forecasting means forecasting which of two predetermined intervals a future value of the indicator will fall into; probability estimates of these events were used for this purpose. For the interval forecasting, a probabilistic neural network with a dynamically updated value of the smoothing parameter was used. The dividing bound between the intervals was determined by a calculation method based on statistical characteristics of the indicator. The number of cyber-attacks per hour received through a honeypot from March to September 2013 for the group 'zeppo-norcal' was selected as the indicator.
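
    The two-interval probability estimate can be approximated with a Parzen-window (kernel) density, the estimator underlying probabilistic neural networks; the Poisson-distributed attack counts and the 90th-percentile dividing bound below are assumptions made for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

def interval_probability(history, bound):
    """Estimate P(next hourly attack count > bound) from recent history."""
    kde = gaussian_kde(np.asarray(history, dtype=float))
    grid = np.linspace(min(history), 2 * max(history), 2000)
    pdf = kde(grid)
    return pdf[grid > bound].sum() / pdf.sum()

counts = np.random.default_rng(3).poisson(20, size=500)  # attacks per hour
bound = np.quantile(counts, 0.9)   # dividing bound from indicator statistics
print(interval_probability(counts, bound))
```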

  5. Using Time-Series Regression to Predict Academic Library Circulations.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.

    1984-01-01

    Four methods were used to forecast monthly circulation totals in 15 midwestern academic libraries: dummy time-series regression, lagged time-series regression, simple average (straight-line forecasting), monthly average (naive forecasting). In tests of forecasting accuracy, dummy regression method and monthly mean method exhibited smallest average…
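
    The dummy time-series regression variant amounts to a trend term plus one indicator per calendar month, as sketched below on a synthetic circulation series.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 60   # five years of monthly circulation totals (synthetic)
df = pd.DataFrame({"t": np.arange(n), "month": (np.arange(n) % 12) + 1})
df["circ"] = (5000 + 10 * df["t"]
              + 400 * np.sin(2 * np.pi * df["month"] / 12)
              + rng.normal(0, 100, n))

# Dummy regression: linear trend plus a 0/1 indicator per calendar month.
fit = smf.ols("circ ~ t + C(month)", df).fit()
future = pd.DataFrame({"t": np.arange(n, n + 12),
                       "month": (np.arange(n, n + 12) % 12) + 1})
print(fit.predict(future).round(0).values)   # next year's monthly forecast
```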

  6. Bayesian flood forecasting methods: A review

    NASA Astrophysics Data System (ADS)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly reduced. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models, and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to perform flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge, or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g., the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvement. Future research in the context of Bayesian flood forecasting should focus on the assimilation of various sources of newly available information and on the improvement of predictive performance assessment methods.

  7. Fuzzy Regression Prediction and Application Based on Multi-Dimensional Factors of Freight Volume

    NASA Astrophysics Data System (ADS)

    Xiao, Mengting; Li, Cheng

    2018-01-01

    Based on the realities of air cargo development, a multi-dimensional fuzzy regression method is used to determine the influencing factors, and the three most important influencing factors are identified: GDP, total fixed-asset investment, and scheduled flight route mileage. Taking a systems viewpoint and using analogy methods, fuzzy numbers and multiple regression are combined to predict civil aviation cargo volume. Comparison with the 13th Five-Year Plan for China's Civil Aviation Development (2016-2020) shows that this method can effectively improve forecasting accuracy and reduce forecasting risk, demonstrating that the model is feasible for predicting civil aviation freight volume and has high practical significance and operability.

  8. Diagnosis of edge condition based on force measurement during milling of composites

    NASA Astrophysics Data System (ADS)

    Felusiak, Agata; Twardowski, Paweł

    2018-04-01

    The present paper presents comparative results for forecasting cutting tool wear with different methods of diagnostic deduction based on the measurement of cutting force components. The research was carried out during milling of the Duralcan F3S.10S aluminum-ceramic composite. Prediction of tool wear was based on one-variable and two-variable regression, and on Multilayer Perceptron (MLP) and Radial Basis Function (RBF) neural networks. Forecasting the condition of the cutting tool on the basis of cutting forces yielded very satisfactory results.

  9. Stationarity test with a direct test for heteroskedasticity in exchange rate forecasting models

    NASA Astrophysics Data System (ADS)

    Khin, Aye Aye; Chau, Wong Hong; Seong, Lim Chee; Bin, Raymond Ling Leh; Teng, Kevin Low Lock

    2017-05-01

    Global economic activity has slowed in recent years, manifested in greater exchange rate volatility on the international commodity market. This study analyzes some prominent exchange rate forecasting models for Malaysian commodity trading: univariate ARIMA, ARCH, and GARCH models, in conjunction with a stationarity test and direct testing for heteroskedasticity in the residual diagnostics. All forecasting models utilized monthly data from 1990 to 2015. Given a total of 312 observations, the data were used to forecast the exchange rate over both the short and long term. The forecasting power statistics suggest that the forecasting performance of the ARIMA (1, 1, 1) model is more efficient than that of the ARCH (1) and GARCH (1, 1) models. For the ex-post forecast, the exchange rate increased from RM 3.50 per USD in January 2015 to RM 4.47 per USD in December 2015 based on the baseline data. For the short-term ex-ante forecast, the analysis indicates a decrease in the exchange rate by June 2016 (RM 4.27 per USD) as compared with December 2015. A more appropriate exchange rate forecasting method is vital to aid decision making and planning for sustainable commodity production in the world economy.
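
    Fitting and forecasting with an ARIMA (1, 1, 1) model takes only a few lines in statsmodels; the exchange-rate series below is simulated, not the study's data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Simulated monthly MYR/USD series with 312 observations (1990-2015).
rng = np.random.default_rng(5)
rate = pd.Series(3.0 + np.cumsum(rng.normal(0, 0.02, 312)),
                 index=pd.date_range("1990-01", periods=312, freq="MS"))

fit = ARIMA(rate, order=(1, 1, 1)).fit()
print(fit.forecast(steps=6))   # six-month ex-ante forecast
```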

  10. Neural networks and traditional time series methods: a synergistic combination in state economic forecasts.

    PubMed

    Hansen, J V; Nelson, R D

    1997-01-01

    Ever since the initial planning for the 1997 Utah legislative session, neural-network forecasting techniques have provided valuable insights for analysts forecasting tax revenues. These revenue estimates are critically important since agency budgets, support for education, and improvements to infrastructure all depend on their accuracy. Underforecasting generates windfalls that concern taxpayers, whereas overforecasting produces budget shortfalls that cause inadequately funded commitments. The pattern-finding ability of neural networks gives insightful and alternative views of the seasonal and cyclical components commonly found in economic time series data. Two applications of neural networks to revenue forecasting clearly demonstrate how these models complement traditional time series techniques. In the first, preoccupation with a potential downturn in the economy distracts analysis based on traditional time series methods so that it overlooks an emerging new phenomenon in the data. In this case, neural networks identify the new pattern, which then allows modification of the time series models and finally gives more accurate forecasts. In the second application, data structure found by traditional statistical tools allows analysts to provide neural networks with important information that the networks then use to create more accurate models. In summary, for the Utah revenue outlook, the insights that result from a portfolio of forecasts that includes neural networks exceed the understanding generated from strictly statistical forecasting techniques. In this case, the synergy clearly results in the whole of the portfolio of forecasts being more accurate than the sum of the individual parts.

  11. Microgrid optimal scheduling considering impact of high penetration wind generation

    NASA Astrophysics Data System (ADS)

    Alanazi, Abdulaziz

    The objective of this thesis is to study the impact of high-penetration wind energy on the economic and reliable operation of microgrids. Wind power is variable, i.e., constantly changing, and nondispatchable, i.e., it cannot be controlled by the microgrid controller. Thus, accurate forecasting of wind power is an essential task for studying its impacts on microgrid operation. Two commonly used forecasting methods, the Autoregressive Integrated Moving Average (ARIMA) and the Artificial Neural Network (ANN), are used in this thesis to improve wind power forecasting. The forecasting error is calculated using the Mean Absolute Percentage Error (MAPE) and is reduced using the ANN. The wind forecast is further used in the microgrid optimal scheduling problem. The microgrid optimal scheduling is performed by developing a viable model for security-constrained unit commitment (SCUC) based on a mixed-integer linear programming (MILP) method. The proposed SCUC is solved for various wind penetration levels, and the relationship between the total cost and wind power penetration is found. In order to reduce microgrid power transfer fluctuations, an additional constraint is proposed and added to the SCUC formulation. The new constraint controls the time-based fluctuations. The impact of the constraint on microgrid SCUC results is tested and validated with numerical analysis. Finally, the applicability of the proposed models is demonstrated through numerical simulations.

  12. Monitoring of waste disposal in deep geological formations

    NASA Astrophysics Data System (ADS)

    German, V.; Mansurov, V.

    2003-04-01

    This paper advances the application of a kinetic approach to describing the rock failure process and to microseismic monitoring of waste disposal. Based on a two-stage model of the failure process, the possibility of forecasting rock fracture is demonstrated. Requirements for the monitoring system, such as a real-time mode of data registration and processing and its precision range, are formulated. A method for delineating failure nuclei in rock masses is presented. This method is implemented in a software program for forecasting strong seismic events and is based on direct use of the fracture concentration criterion. The method is applied to the database of microseismic events of the North Ural Bauxite Mine. The results of this application, such as efficiency, stability, and the possibility of forecasting rockbursts, are discussed.

  13. Extended Range Prediction of Indian Summer Monsoon: Current status

    NASA Astrophysics Data System (ADS)

    Sahai, A. K.; Abhilash, S.; Borah, N.; Joseph, S.; Chattopadhyay, R.; S, S.; Rajeevan, M.; Mandal, R.; Dey, A.

    2014-12-01

    The main focus of this study is to develop a forecast consensus for the extended-range prediction (ERP) of monsoon intraseasonal oscillations using a suite of different variants of the Climate Forecast System (CFS) model. In this CFS-based Grand MME prediction system (CGMME), the ensemble members are generated by perturbing the initial condition and using different configurations of CFSv2. This addresses the role of different physical mechanisms known to control error growth in ERP on the 15-20 day time scale. The final formulation of CGMME is based on 21 ensembles of the standalone Global Forecast System (GFS) forced with bias-corrected forecast SST from CFS, 11 low-resolution CFS T126 members, and 11 high-resolution CFS T382 members. Thus, we develop a multi-model consensus forecast for the ERP of the Indian summer monsoon (ISM) using a suite of different variants of the CFS model. This coordinated international effort has led to the development of specific tailor-made regional forecast products over the Indian region. Skill assessment of deterministic and probabilistic categorical rainfall forecasts, as well as verification of large-scale low-frequency monsoon intraseasonal oscillations, has been carried out using hindcasts from 2001-2012 for the monsoon season, in which all models are initialized every five days from 16 May to 28 September. The skill of the deterministic forecast from CGMME is better than that of the best participating single-model ensemble configuration (SME). The CGMME approach is believed to quantify the uncertainty in both initial conditions and model formulation. The main improvement is attained in the probabilistic forecast, owing to an increase in ensemble spread that reduces the error due to over-confident ensembles in a single model configuration. For the probabilistic forecast, three tercile ranges are determined by a ranking method based on the percentage of ensemble members from all participating models falling in each of the three categories. CGMME further adds value to both deterministic and probabilistic forecasts compared to raw SMEs, and this better skill probably flows from the larger spread and improved spread-error relationship. The CGMME system is currently capable of generating ER predictions in real time and has successfully delivered its experimental operational ER forecast of the ISM for the last few years.
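
    The tercile-ranking step for the probabilistic forecast can be sketched directly: pool the members from all participating configurations and count the fraction falling below, within, and above climatological tercile bounds. All numbers below are synthetic.

```python
import numpy as np

def tercile_probabilities(members, climatology):
    """Fraction of pooled ensemble members in each tercile category."""
    lo, hi = np.percentile(climatology, [100 / 3, 200 / 3])
    members = np.asarray(members, dtype=float)
    return ((members < lo).mean(),                       # below normal
            ((members >= lo) & (members <= hi)).mean(),  # near normal
            (members > hi).mean())                       # above normal

rng = np.random.default_rng(6)
clim = rng.gamma(4, 2, size=1000)   # hindcast rainfall climatology
pooled = rng.gamma(5, 2, size=43)   # 21 GFS + 11 T126 + 11 T382 members
print(tercile_probabilities(pooled, clim))
```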

  14. Implementation of bayesian model averaging on the weather data forecasting applications utilizing open weather map

    NASA Astrophysics Data System (ADS)

    Rahmat, R. F.; Nasution, F. R.; Seniman; Syahputra, M. F.; Sitompul, O. S.

    2018-02-01

    Weather is the condition of the air in a certain region over a relatively short period of time, measured with various parameters such as temperature, air pressure, wind velocity, humidity, and other phenomena in the atmosphere. Extreme weather due to global warming can lead to drought, flood, hurricanes, and other weather events, which directly affect social and economic activities. Hence, a forecasting technique is needed to predict the weather with distinctive output, particularly a GIS-based mapping process providing information about the current weather status at the coordinates of each region, with the capability to forecast seven days ahead. The data used in this research are retrieved in real time from the openweathermap server and BMKG. In order to obtain a low error rate and high forecasting accuracy, the authors use the Bayesian Model Averaging (BMA) method. The results show that the BMA method has good accuracy. Forecasting error is calculated using the mean square error (MSE). The MSE is 0.28 for minimum temperature and 0.15 for maximum temperature; 0.38 for minimum humidity and 0.04 for maximum humidity; and 0.076 for wind speed. The lower the forecasting error rate, the better the accuracy.
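
    A toy version of BMA weight estimation via expectation-maximization, with a fixed kernel spread for brevity (full BMA implementations also estimate the variance); the two forecast sources are simulated.

```python
import numpy as np
from scipy.stats import norm

def bma_weights(forecasts, observed, sigma=1.0, iters=50):
    """EM estimate of BMA weights for K competing forecast sources.

    forecasts : (n_days, K) array of point forecasts
    observed  : (n_days,) verifying observations
    sigma     : fixed kernel standard deviation (an assumed simplification)
    """
    F = np.asarray(forecasts, dtype=float)
    y = np.asarray(observed, dtype=float)
    w = np.full(F.shape[1], 1.0 / F.shape[1])
    for _ in range(iters):
        lik = w * norm.pdf(y[:, None], loc=F, scale=sigma)  # E-step
        z = lik / lik.sum(axis=1, keepdims=True)
        w = z.mean(axis=0)                                  # M-step
    return w

rng = np.random.default_rng(7)
y = rng.normal(25, 3, 200)                        # observed temperature
F = np.column_stack([y + rng.normal(0, 1, 200),   # accurate source
                     y + rng.normal(2, 3, 200)])  # biased, noisy source
print(bma_weights(F, y))   # most weight goes to the accurate source
```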

  15. An Oceanographic and Climatological Atlas of Bristol Bay

    DTIC Science & Technology

    1987-10-01

    [Extraction fragment: the record body is garbled, mixing table-of-contents entries (Forecasting Method; Superstructure Icing; Wind) with broken sentences on advection of oil by large-scale and specific ice movement, fetch winds, and forecasting based on surface pressure map analysis.]

  16. A Case Study on a Combination NDVI Forecasting Model Based on the Entropy Weight Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Shengzhi; Ming, Bo; Huang, Qiang

    It is critically important to accurately predict NDVI (Normalized Difference Vegetation Index), which helps guide regional ecological remediation and environmental management. In this study, a combination forecasting model (CFM) was proposed to improve the performance of NDVI predictions in the Yellow River Basin (YRB) based on three individual forecasting models: the Multiple Linear Regression (MLR), Artificial Neural Network (ANN), and Support Vector Machine (SVM) models. The entropy weight method was employed to determine the weight coefficient for each individual model depending on its predictive performance. Results showed that: (1) ANN exhibits the highest fitting capability among the four forecasting models in the calibration period, whilst its generalization ability becomes weak in the validation period; MLR performs poorly in both calibration and validation periods; the predicted results of CFM in the calibration period have the highest stability; (2) CFM generally outperforms all individual models in the validation period and can improve the reliability and stability of predicted results by combining the strengths while reducing the weaknesses of the individual models; (3) the performance of all forecasting models is better in dense vegetation areas than in sparse vegetation areas.
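
    One textbook variant of the entropy weight computation is sketched below; published formulations differ in how the error matrix is normalized, so treat this as illustrative rather than as the study's exact procedure.

```python
import numpy as np

def entropy_weights(errors):
    """Combination weights from a (n_times, K) matrix of absolute errors.

    Each model's error sequence is normalized to a distribution; models whose
    error sequences carry lower entropy (more information) get larger weights.
    """
    E = np.asarray(errors, dtype=float) + 1e-12
    p = E / E.sum(axis=0)                          # per-model distribution
    n = E.shape[0]
    h = -(p * np.log(p)).sum(axis=0) / np.log(n)   # entropy in [0, 1]
    d = 1.0 - h                                    # degree of divergence
    return d / d.sum()

rng = np.random.default_rng(8)
err = np.abs(rng.normal(0, [0.5, 1.0, 2.0], size=(100, 3)))  # MLR/ANN/SVM-like
w = entropy_weights(err)
preds = np.array([0.42, 0.45, 0.50])   # one-step predictions from the 3 models
print(w.round(3), "combined:", float(preds @ w))
```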

  17. Forecasting Natural Rubber Price In Malaysia Using Arima

    NASA Astrophysics Data System (ADS)

    Zahari, Fatin Z.; Khalid, Kamil; Roslan, Rozaini; Sufahani, Suliadi; Mohamad, Mahathir; Saifullah Rusiman, Mohd; Ali, Maselan

    2018-04-01

    High volatility in the price of natural rubber poses significant risk to the producers, traders, consumers, and other parties involved in natural rubber production. To help them make decisions, forecasting is needed to predict the price of natural rubber. The main objective of this research is to forecast the upcoming price of natural rubber using a reliable statistical method. The data were gathered from the Malaysian Rubber Board and span January 2000 to December 2015. In this research, the average monthly price of Standard Malaysian Rubber 20 (SMR20) is forecast using the Box-Jenkins approach. A time series plot is used to determine the pattern of the data. The data show a trend pattern, which indicates non-stationarity, so the data need to be transformed. Using the Box-Jenkins method, the best-fit model for the time series data is ARIMA (1, 1, 0), which satisfies all the required criteria. Hence, ARIMA (1, 1, 0) is the best-fitted model and is used to forecast the average monthly price of SMR20 twelve months ahead.

  18. Fuzzy neural network technique for system state forecasting.

    PubMed

    Li, Dezhi; Wang, Wilson; Ismail, Fathy

    2013-10-01

    In many system state forecasting applications, the prediction is performed based on multiple datasets, each corresponding to a distinct system condition. The traditional methods for dealing with multiple datasets (e.g., vector autoregressive moving average models and neural networks) have some shortcomings, such as limited modeling capability and opaque reasoning operations. To tackle these problems, a novel fuzzy neural network (FNN) is proposed in this paper to effectively extract information from multiple datasets, so as to improve forecasting accuracy. The proposed predictor consists of both autoregressive (AR) node modeling and nonlinear node modeling; AR models/nodes are used to capture the linear correlation of the datasets, while the nonlinear correlation of the datasets is modeled with nonlinear neuron nodes. A novel particle swarm technique [i.e., the Laplace particle swarm (LPS) method] is proposed to facilitate parameter estimation for the predictor and improve modeling accuracy. The effectiveness of the developed FNN predictor and the associated LPS method is verified by a series of tests related to Mackey-Glass data forecasting, exchange rate data prediction, and gear system prognosis. Test results show that the developed FNN predictor and the LPS method can capture the dynamics of multiple datasets effectively and track system characteristics accurately.

  19. Rapid weather information dissemination in Florida

    NASA Technical Reports Server (NTRS)

    Martsolf, J. D.; Heinemann, P. H.; Gerber, J. F.; Crosby, F. L.; Smith, D. L.

    1984-01-01

    The development of the Florida Agricultural Services and Technology (FAST) plan to provide ports for users to call for weather information is described. FAST is based on the Satellite Frost Forecast System, which makes a broad base of weather data available to its users. The methods used for the acquisition and dissemination of data from various networks under the FAST plan are examined. The system provides color-coded IR or thermal maps, precipitation maps, and textual forecast information. A diagram of the system is provided.

  20. Analog-Based Postprocessing of Navigation-Related Hydrological Ensemble Forecasts

    NASA Astrophysics Data System (ADS)

    Hemri, S.; Klein, B.

    2017-11-01

    Inland waterway transport benefits from probabilistic forecasts of water levels, as they allow the ship load to be optimized and, hence, transport costs to be minimized. Probabilistic state-of-the-art hydrologic ensemble forecasts inherit biases and dispersion errors from the atmospheric ensemble forecasts they are driven with. The use of statistical postprocessing techniques like ensemble model output statistics (EMOS) allows these systematic errors to be reduced by fitting a statistical model to training data. In this study, training periods for EMOS are selected based on forecast analogs, i.e., historical forecasts that are similar to the forecast to be verified. Due to the strong autocorrelation of water levels, forecast analogs have to be selected based on entire forecast hydrographs in order to guarantee similar hydrograph shapes. Custom-tailored measures of similarity for forecast hydrographs comprise hydrological series distance (SD), the hydrological matching algorithm (HMA), and dynamic time warping (DTW). Verification against observations reveals that EMOS forecasts for water levels at three gauges along the river Rhine, with training periods selected based on SD, HMA, and DTW, compare favorably with reference EMOS forecasts, which are based either on seasonal training periods or on training periods obtained by dividing the hydrological forecast trajectories into runoff regimes.
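
    Of the three similarity measures, DTW is the easiest to sketch compactly; the snippet below selects the archived forecast hydrographs most similar to the current one for use as an EMOS training set. The hydrographs are synthetic, and the plain O(nm) recursion stands in for whatever optimized variant an operational system would use.

```python
import numpy as np

def dtw_distance(a, b):
    """Plain dynamic-time-warping distance between two hydrographs."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def select_training_analogs(current, archive, n_analogs=30):
    """Indices of archived hydrographs most similar to the current forecast."""
    dists = [(dtw_distance(current, past), k) for k, past in enumerate(archive)]
    return [k for _, k in sorted(dists)[:n_analogs]]

rng = np.random.default_rng(9)
archive = [200 + np.cumsum(rng.normal(0, 1, 48)) for _ in range(100)]
current = 200 + np.cumsum(rng.normal(0, 1, 48))
print(select_training_analogs(current, archive, n_analogs=5))
```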

  1. Snowfall Rate Retrieval Using Passive Microwave Measurements and Its Applications in Weather Forecast and Hydrology

    NASA Technical Reports Server (NTRS)

    Meng, Huan; Ferraro, Ralph; Kongoli, Cezar; Yan, Banghua; Zavodsky, Bradley; Zhao, Limin; Dong, Jun; Wang, Nai-Yu

    2015-01-01

    (AMSU), Microwave Humidity Sounder (MHS), and Advanced Technology Microwave Sounder (ATMS). ATMS is the follow-on sensor to AMSU and MHS. Currently, an AMSU- and MHS-based land snowfall rate (SFR) product runs operationally at NOAA/NESDIS. Based on the AMSU/MHS SFR, an ATMS SFR algorithm has also been developed. The algorithm performs retrieval in three steps: snowfall detection, retrieval of cloud properties, and estimation of snow particle terminal velocity and snowfall rate. The snowfall detection component utilizes principal component analysis and a logistic regression model. It employs a combination of temperature and water vapor sounding channels to detect the scattering signal from falling snow and derives the probability of snowfall. Cloud properties are retrieved using an inversion method with an iteration algorithm and a two-stream radiative transfer model. A method is adopted to calculate snow particle terminal velocity. Finally, snowfall rate is computed by numerically solving a complex integral. The SFR products are used mainly in two communities: hydrology and weather forecasting. Global blended precipitation products traditionally do not include snowfall derived from satellites because such products were not available operationally in the past. The ATMS and AMSU/MHS SFR now provide the winter precipitation information for these blended precipitation products. Weather forecasters mainly rely on radar and station observations for snowfall forecasts. The SFR products can fill in gaps where no conventional snowfall data are available to forecasters. The products can also be used to confirm radar and gauge snowfall data and increase forecasters' confidence in their predictions.

  2. Tsunami Forecasting and Monitoring in New Zealand

    NASA Astrophysics Data System (ADS)

    Power, William; Gale, Nora

    2011-06-01

    New Zealand is exposed to tsunami threats from several sources that vary significantly in their potential impact and travel time. One route for reducing the risk from these tsunami sources is to provide advance warning based on forecasting and monitoring of events in progress. In this paper the National Tsunami Warning System framework, including the responsibilities of key organisations and the procedures that they follow in the event of a tsunami threatening New Zealand, is summarised. A method for forecasting threat levels based on tsunami models is presented, similar in many respects to that developed for Australia by Allen and Greenslade (Nat Hazards 46:35-52, 2008), and a simple system for easy access to the threat-level forecasts using a clickable pdf file is described. Once a tsunami enters or initiates within New Zealand waters, its progress and evolution can be monitored in real-time using a newly established network of online tsunami gauge sensors placed at strategic locations around the New Zealand coasts and offshore islands. Information from these gauges can be used to validate and revise forecasts, and assist in making the all-clear decision.

  3. How is the weather? Forecasting inpatient glycemic control

    PubMed Central

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B; Thompson, Bithika M

    2017-01-01

    Aim: Apply methods of damped trend analysis to forecast inpatient glycemic control. Method: Observed and calculated point-of-care blood glucose data trends were determined over 62 weeks. Mean absolute percent error was used to calculate differences between observed and forecasted values. Comparisons were drawn between model results and linear regression forecasting. Results: The forecasted mean glucose trends observed during the first 24 and 48 weeks of projections compared favorably to the results provided by linear regression forecasting. However, in some scenarios, the damped trend method changed inferences compared with linear regression. In all scenarios, mean absolute percent error values remained below the 10% accepted by demand industries. Conclusion: Results indicate that forecasting methods historically applied within demand industries can project future inpatient glycemic control. Additional study is needed to determine if forecasting is useful in the analyses of other glucometric parameters and, if so, how to apply the techniques to quality improvement. PMID:29134125
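
    The damped trend method referred to above can be written in a few lines. The sketch below is a minimal Holt damped-trend smoother with a MAPE check; the weekly glucose means and smoothing constants are made up for illustration, not the study's fitted values.

    ```python
    import numpy as np

    def damped_holt(y, alpha=0.4, beta=0.1, phi=0.9, horizon=4):
        """Damped-trend (Holt) exponential smoothing; returns h-step forecasts."""
        level, trend = y[0], y[1] - y[0]
        for t in range(1, len(y)):
            prev = level
            level = alpha * y[t] + (1 - alpha) * (prev + phi * trend)
            trend = beta * (level - prev) + (1 - beta) * phi * trend
        damp = np.cumsum(phi ** np.arange(1, horizon + 1))   # phi + ... + phi^h
        return level + damp * trend

    def mape(actual, fc):
        return 100.0 * np.mean(np.abs((actual - fc) / actual))

    glucose = np.array([165.0, 162, 158, 160, 155, 154, 151, 150])  # weekly means (toy)
    fc = damped_holt(glucose[:-4])
    print(np.round(fc, 1), round(mape(glucose[-4:], fc), 2))
    ```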

  4. Cardiac catheterization laboratory inpatient forecast tool: a prospective evaluation

    PubMed Central

    Flanagan, Eleni; Siddiqui, Sauleh; Appelbaum, Jeff; Kasper, Edward K; Levin, Scott

    2016-01-01

    Objective To develop and prospectively evaluate a web-based tool that forecasts the daily bed need for admissions from the cardiac catheterization laboratory using routinely available clinical data within electronic medical records (EMRs). Methods The forecast model was derived using a 13-month retrospective cohort of 6384 catheterization patients. Predictor variables such as demographics, scheduled procedures, and clinical indicators mined from free-text notes were input to a multivariable logistic regression model that predicted the probability of inpatient admission. The model was embedded into a web-based application connected to the local EMR system and used to support bed management decisions. After implementation, the tool was prospectively evaluated for accuracy on a 13-month test cohort of 7029 catheterization patients. Results The forecast model predicted admission with an area under the receiver operating characteristic curve of 0.722. Daily aggregate forecasts were accurate to within one bed for 70.3% of days and within three beds for 97.5% of days during the prospective evaluation period. The web-based application housing the forecast model was used by cardiology providers in practice to estimate daily admissions from the catheterization laboratory. Discussion The forecast model identified older age, male gender, invasive procedures, coronary artery bypass grafts, and a history of congestive heart failure as qualities indicating a patient was at increased risk for admission. Diagnostic procedures and less acute clinical indicators decreased patients’ risk of admission. Despite the site-specific limitations of the model, these findings were supported by the literature. Conclusion Data-driven predictive analytics may be used to accurately forecast daily demand for inpatient beds for cardiac catheterization patients. Connecting these analytics to EMR data sources has the potential to provide advanced operational decision support. PMID:26342217

  5. The clustering-based case-based reasoning for imbalanced business failure prediction: a hybrid approach through integrating unsupervised process with supervised process

    NASA Astrophysics Data System (ADS)

    Li, Hui; Yu, Jun-Ling; Yu, Le-An; Sun, Jie

    2014-05-01

    Case-based reasoning (CBR) is one of the main methods in business forecasting; it predicts well and can explain its results. In business failure prediction (BFP), the number of failed enterprises is small relative to the number of non-failed ones, yet the loss is huge when an enterprise fails. It is therefore necessary to develop methods, trained on imbalanced samples, that forecast well for the small proportion of failed enterprises while remaining accurate overall. Commonly used methods built on the assumption of balanced samples do not predict the minority class well on imbalanced samples consisting of the minority/failed enterprises and the majority/non-failed ones. This article develops a new method called clustering-based CBR (CBCBR), which integrates clustering analysis, an unsupervised process, with CBR, a supervised process, to enhance the efficiency of retrieving information from both the minority and the majority classes in CBR. In CBCBR, case classes are first generated through hierarchical clustering of the stored experienced cases, and class centres are calculated by aggregating the case information within each clustered class. When predicting the label of a target case, its nearest clustered case class is first retrieved by ranking the similarities between the target case and each class centre. Then, the nearest neighbours of the target case within the retrieved class are found. Finally, the labels of these nearest experienced cases are used in prediction. In an empirical experiment with two imbalanced samples from China, the performance of CBCBR was compared with classical CBR, a support vector machine, logistic regression, and multivariate discriminant analysis. The results show that CBCBR performed significantly better than the other four methods in terms of sensitivity for identifying the minority samples while maintaining high total accuracy. The proposed approach makes CBR useful in imbalanced forecasting.
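
    A minimal sketch of the retrieval logic described above: hierarchical clustering of stored cases, nearest-class retrieval, then nearest neighbours within that class. The cluster count, distance metric, and toy data are assumptions, not the article's settings.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    def cbcbr_predict(X_cases, y_cases, target, n_clusters=5, k=5):
        """Retrieve the nearest clustered case class first, then the k nearest
        cases inside it, and average their labels (binary vote)."""
        labels = fcluster(linkage(X_cases, method="ward"),
                          t=n_clusters, criterion="maxclust")
        ids = np.unique(labels)
        centres = np.array([X_cases[labels == c].mean(axis=0) for c in ids])
        nearest = ids[np.argmin(np.linalg.norm(centres - target, axis=1))]
        members = np.where(labels == nearest)[0]
        order = np.argsort(np.linalg.norm(X_cases[members] - target, axis=1))
        return int(round(y_cases[members[order[:k]]].mean()))

    rng = np.random.default_rng(2)
    X = rng.normal(size=(60, 4))
    y = (X[:, 0] > 0.8).astype(int)          # minority class: "failed" firms
    print(cbcbr_predict(X, y, rng.normal(size=4)))
    ```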

  6. Use of High-Resolution WRF Simulations to Forecast Lightning Threat

    NASA Technical Reports Server (NTRS)

    McCaul, E. W., Jr.; LaCasse, K.; Goodman, S. J.; Cecil, D. J.

    2008-01-01

    Recent observational studies have confirmed the existence of a robust statistical relationship between lightning flash rates and the amount of large precipitating ice hydrometeors aloft in storms. This relationship is exploited, in conjunction with the capabilities of cloud-resolving forecast models such as WRF, to forecast explicitly the threat of lightning from convective storms using selected output fields from the model forecasts. The simulated vertical flux of graupel at -15 °C and the shape of the simulated reflectivity profile are tested in this study as proxies for charge separation processes and their associated lightning risk. Our lightning forecast method differs from others in that it is entirely based on high-resolution simulation output, without reliance on any climatological data. Short (6-8 h) simulations are conducted for a number of case studies for which three-dimensional lightning validation data from the North Alabama Lightning Mapping Array are available. Experiments indicate that initialization of the WRF model on a 2 km grid using Eta boundary conditions, Doppler radar radial velocity fields, and METAR and ACARS data yields satisfactory simulations. Analyses of the lightning threat fields suggest that both the graupel flux and reflectivity profile approaches, when properly calibrated, can yield reasonable lightning threat forecasts, although an ensemble approach is probably desirable in order to reduce the tendency for misplacement of modeled storms to hurt the accuracy of the forecasts. Our lightning threat forecasts are also compared to other more traditional means of forecasting thunderstorms, such as those based on inspection of the convective available potential energy field.
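
    For illustration, a graupel-flux threat proxy can be computed from model fields at the -15 °C level as below. The calibration constant and sample values are placeholders, not the study's calibrated numbers.

    ```python
    import numpy as np

    def lightning_threat(w, q_graupel, rho, k=0.042):
        """Threat proxy proportional to the vertical graupel flux at the -15 C
        level; k is an illustrative calibration constant, not the study's value."""
        return k * w * rho * q_graupel   # flux in kg m^-2 s^-1, scaled to a threat

    w = np.array([2.0, 8.5, 0.3])        # vertical velocity at -15 C (m/s)
    qg = np.array([1e-3, 4e-3, 1e-5])    # graupel mixing ratio (kg/kg)
    rho = np.array([0.7, 0.7, 0.7])      # air density (kg/m^3)
    print(lightning_threat(w, qg, rho))
    ```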

  7. An intercomparison of approaches for improving operational seasonal streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Mendoza, Pablo A.; Wood, Andrew W.; Clark, Elizabeth; Rothwell, Eric; Clark, Martyn P.; Nijssen, Bart; Brekke, Levi D.; Arnold, Jeffrey R.

    2017-07-01

    For much of the last century, forecasting centers around the world have offered seasonal streamflow predictions to support water management. Recent work suggests that the two major avenues to advance seasonal predictability are improvements in the estimation of initial hydrologic conditions (IHCs) and the incorporation of climate information. This study investigates the marginal benefits of a variety of methods using IHCs and/or climate information, focusing on seasonal water supply forecasts (WSFs) in five case study watersheds located in the US Pacific Northwest region. We specify two benchmark methods that mimic standard operational approaches - statistical regression against IHCs and model-based ensemble streamflow prediction (ESP) - and then systematically intercompare WSFs across a range of lead times. Additional methods include (i) statistical techniques using climate information either from standard indices or from climate reanalysis variables and (ii) several hybrid/hierarchical approaches harnessing both land surface and climate predictability. In basins where atmospheric teleconnection signals are strong, and when watershed predictability is low, climate information alone provides considerable improvements. For those basins showing weak teleconnections, custom predictors from reanalysis fields yielded greater forecast skill than standard climate indices. ESP predictions tended to have high correlation skill but greater bias compared to other methods, and climate predictors failed to substantially improve these deficiencies within a trace weighting framework. Lower complexity techniques were competitive with more complex methods, and the hierarchical expert regression approach introduced here (hierarchical ensemble streamflow prediction - HESP) provided a robust alternative for skillful and reliable water supply forecasts at all initialization times. Three key findings from this effort are (1) objective approaches supporting methodologically consistent hindcasts open the door to a broad range of beneficial forecasting strategies; (2) the use of climate predictors can add to the seasonal forecast skill available from IHCs; and (3) sample size limitations must be handled rigorously to avoid over-trained forecast solutions. Overall, the results suggest that despite a rich, long heritage of operational use, there remain a number of compelling opportunities to improve the skill and value of seasonal streamflow predictions.

  8. Influenza forecasting in human populations: a scoping review.

    PubMed

    Chretien, Jean-Paul; George, Dylan; Shaman, Jeffrey; Chitale, Rohit A; McKenzie, F Ellis

    2014-01-01

    Forecasts of influenza activity in human populations could help guide key preparedness tasks. We conducted a scoping review to characterize these methodological approaches and identify research gaps. Adapting the PRISMA methodology for systematic reviews, we searched PubMed, CINAHL, Project Euclid, and Cochrane Database of Systematic Reviews for publications in English since January 1, 2000 using the terms "influenza AND (forecast* OR predict*)", excluding studies that did not validate forecasts against independent data or incorporate influenza-related surveillance data from the season or pandemic for which the forecasts were applied. We included 35 publications describing population-based (N = 27), medical facility-based (N = 4), and regional or global pandemic spread (N = 4) forecasts. They included areas of North America (N = 15), Europe (N = 14), and/or Asia-Pacific region (N = 4), or had global scope (N = 3). Forecasting models were statistical (N = 18) or epidemiological (N = 17). Five studies used data assimilation methods to update forecasts with new surveillance data. Models used virological (N = 14), syndromic (N = 13), meteorological (N = 6), internet search query (N = 4), and/or other surveillance data as inputs. Forecasting outcomes and validation metrics varied widely. Two studies compared distinct modeling approaches using common data, 2 assessed model calibration, and 1 systematically incorporated expert input. Of the 17 studies using epidemiological models, 8 included sensitivity analysis. This review suggests need for use of good practices in influenza forecasting (e.g., sensitivity analysis); direct comparisons of diverse approaches; assessment of model calibration; integration of subjective expert input; operational research in pilot, real-world applications; and improved mutual understanding among modelers and public health officials.

  9. Error Estimation of An Ensemble Statistical Seasonal Precipitation Prediction Model

    NASA Technical Reports Server (NTRS)

    Shen, Samuel S. P.; Lau, William K. M.; Kim, Kyu-Myong; Li, Gui-Long

    2001-01-01

    This NASA Technical Memorandum describes an optimal ensemble canonical correlation forecasting model for seasonal precipitation. Each individual forecast is based on canonical correlation analysis (CCA) in spectral spaces whose bases are empirical orthogonal functions (EOFs). The optimal weights in the ensemble forecasting crucially depend on the mean square error of each individual forecast. An estimate of the mean square error of a CCA prediction is also made using the spectral method. The error is decomposed onto EOFs of the predictand and decreases linearly according to the correlation between the predictor and predictand. Since the new CCA scheme is derived for continuous fields of predictor and predictand, an area-factor is automatically included. Thus our model is an improvement of the spectral CCA scheme of Barnett and Preisendorfer. The improvements include (1) the use of the area-factor, (2) the estimation of prediction error, and (3) the optimal ensemble of multiple forecasts. The new CCA model is applied to the seasonal forecasting of the United States (US) precipitation field. The predictor is the sea surface temperature (SST). The US Climate Prediction Center's reconstructed SST is used as the predictor's historical data. The US National Center for Environmental Prediction's optimally interpolated precipitation (1951-2000) is used as the predictand's historical data. Our forecast experiments show that the new ensemble canonical correlation scheme yields reasonable forecasting skill. For example, when using September-October-November SST to predict the next season's December-January-February precipitation, the spatial pattern correlation between the observed and predicted fields is positive in 46 of the 50 years of experiments. The positive correlations are close to or greater than 0.4 in 29 years, which indicates excellent performance of the forecasting model. The forecasting skill can be further enhanced when several predictors are used.
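
    For context, the standard inverse-MSE weighting for combining independent, unbiased forecasts is written out below; the memorandum's exact weighting scheme may differ in detail.

    ```latex
    % Inverse-MSE weights for combining K independent, unbiased forecasts f_i
    % with individual mean square errors E_i (a standard result; the
    % memorandum's exact scheme may differ in detail):
    \[
      \hat{f} \;=\; \sum_{i=1}^{K} w_i f_i, \qquad
      w_i \;=\; \frac{1/E_i}{\sum_{j=1}^{K} 1/E_j}, \qquad
      \operatorname{MSE}\bigl(\hat{f}\bigr) \;=\; \Bigl(\sum_{j=1}^{K} 1/E_j\Bigr)^{-1}.
    \]
    ```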

  10. Influenza Forecasting in Human Populations: A Scoping Review

    PubMed Central

    Chretien, Jean-Paul; George, Dylan; Shaman, Jeffrey; Chitale, Rohit A.; McKenzie, F. Ellis

    2014-01-01

    Forecasts of influenza activity in human populations could help guide key preparedness tasks. We conducted a scoping review to characterize these methodological approaches and identify research gaps. Adapting the PRISMA methodology for systematic reviews, we searched PubMed, CINAHL, Project Euclid, and Cochrane Database of Systematic Reviews for publications in English since January 1, 2000 using the terms “influenza AND (forecast* OR predict*)”, excluding studies that did not validate forecasts against independent data or incorporate influenza-related surveillance data from the season or pandemic for which the forecasts were applied. We included 35 publications describing population-based (N = 27), medical facility-based (N = 4), and regional or global pandemic spread (N = 4) forecasts. They included areas of North America (N = 15), Europe (N = 14), and/or Asia-Pacific region (N = 4), or had global scope (N = 3). Forecasting models were statistical (N = 18) or epidemiological (N = 17). Five studies used data assimilation methods to update forecasts with new surveillance data. Models used virological (N = 14), syndromic (N = 13), meteorological (N = 6), internet search query (N = 4), and/or other surveillance data as inputs. Forecasting outcomes and validation metrics varied widely. Two studies compared distinct modeling approaches using common data, 2 assessed model calibration, and 1 systematically incorporated expert input. Of the 17 studies using epidemiological models, 8 included sensitivity analysis. This review suggests need for use of good practices in influenza forecasting (e.g., sensitivity analysis); direct comparisons of diverse approaches; assessment of model calibration; integration of subjective expert input; operational research in pilot, real-world applications; and improved mutual understanding among modelers and public health officials. PMID:24714027

  11. The Wind Forecast Improvement Project (WFIP). A Public/Private Partnership for Improving Short Term Wind Energy Forecasts and Quantifying the Benefits of Utility Operations -- the Northern Study Area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finley, Cathy

    2014-04-30

    This report contains the results from research aimed at improving short-range (0-6 hour) hub-height wind forecasts in the NOAA weather forecast models through additional data assimilation and model physics improvements for use in wind energy forecasting. Additional meteorological observing platforms including wind profilers, sodars, and surface stations were deployed for this study by NOAA and DOE, and additional meteorological data at or near wind turbine hub height were provided by South Dakota State University and WindLogics/NextEra Energy Resources over a large geographical area in the U.S. Northern Plains for assimilation into NOAA research weather forecast models. The resulting improvements in wind energy forecasts based on the research weather forecast models (with the additional data assimilation and model physics improvements) were examined in many different ways and compared with wind energy forecasts based on the current operational weather forecast models to quantify the forecast improvements important to power grid system operators and wind plant owners/operators participating in energy markets. Two operational weather forecast models (OP_RUC, OP_RAP) and two research weather forecast models (ESRL_RAP, HRRR) were used as the base wind forecasts for generating several different wind power forecasts for the NextEra Energy wind plants in the study area. Power forecasts were generated from the wind forecasts in a variety of ways, from very simple to quite sophisticated, as they might be used by a wide range of both general users and commercial wind energy forecast vendors. The error characteristics of each of these types of forecasts were examined and quantified using bulk error statistics for both the local wind plant and the system aggregate forecasts. The wind power forecast accuracy was also evaluated separately for high-impact wind energy ramp events. The overall bulk error statistics calculated over the first six hours of the forecasts at both the individual wind plant and at the system-wide aggregate level over the one-year study period showed that the research weather model-based power forecasts (all types) had lower overall error rates than the current operational weather model-based power forecasts, both at the individual wind plant level and at the system aggregate level. The bulk error statistics of the various model-based power forecasts were also calculated by season and model runtime/forecast hour as power system operations are more sensitive to wind energy forecast errors during certain times of year and certain times of day. The results showed that there were significant differences in seasonal forecast errors between the various model-based power forecasts. The results from the analysis of the various wind power forecast errors by model runtime and forecast hour showed that the forecast errors were largest during the times of day that have increased significance to power system operators (the overnight hours and the morning/evening boundary layer transition periods), but the research weather model-based power forecasts showed improvement over the operational weather model-based power forecasts at these times.

  12. Value of long-term streamflow forecast to reservoir operations for water supply in snow-dominated catchments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anghileri, Daniela; Voisin, Nathalie; Castelletti, Andrea F.

    In this study, we develop a forecast-based adaptive control framework for Oroville reservoir, California, to assess the value of seasonal and inter-annual forecasts for reservoir operation. We use an Ensemble Streamflow Prediction (ESP) approach to generate retrospective, one-year-long streamflow forecasts based on the Variable Infiltration Capacity hydrology model. The optimal sequence of daily release decisions from the reservoir is then determined by Model Predictive Control, a flexible and adaptive optimization scheme. We assess the forecast value by comparing system performance based on the ESP forecasts with that based on climatology and a perfect forecast. In addition, we evaluate system performance based on a synthetic forecast, which is designed to isolate the contribution of seasonal and inter-annual forecast skill to the overall value of the ESP forecasts. Using the same ESP forecasts, we generalize our results by evaluating forecast value as a function of forecast skill, reservoir features, and demand. Our results show that perfect forecasts are valuable when the water demand is high and the reservoir is sufficiently large to allow for annual carry-over. Conversely, ESP forecast value is highest when the reservoir can shift water on a seasonal basis. On average, for the system evaluated here, the overall ESP value is 35% less than the perfect forecast value. The inter-annual component of the ESP forecast contributes 20-60% of the total forecast value. Improvements in the seasonal component of the ESP forecast would increase the overall ESP forecast value between 15 and 20%.

  13. Big data driven cycle time parallel prediction for production planning in wafer manufacturing

    NASA Astrophysics Data System (ADS)

    Wang, Junliang; Yang, Jungang; Zhang, Jie; Wang, Xiaoxi; Zhang, Wenjun Chris

    2018-07-01

    Cycle time forecasting (CTF) is one of the most crucial issues for production planning to keep high delivery reliability in semiconductor wafer fabrication systems (SWFS). This paper proposes a novel data-intensive cycle time (CT) prediction system with parallel computing to rapidly forecast the CT of wafer lots with large datasets. First, a density peak based radial basis function network (DP-RBFN) is designed to forecast the CT from the diverse and agglomerative CT data. Second, a network learning method based on a clustering technique is proposed to determine the density peaks. Third, a parallel computing approach for network training is proposed in order to speed up the training process with large-scale CT data. Finally, an experiment on an SWFS is presented, which demonstrates that the proposed CTF system can not only speed up the training process of the model but also outperform CTF methods based on the radial basis function network, the back-propagation network, and multivariate regression in terms of the mean absolute deviation and standard deviation.
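
    A rough sketch in the spirit of a density-peak RBF network: centres are chosen by the density-peak rule (high local density, far from any denser point) and output weights are solved by ridge-regularized least squares. The kernel widths, feature set, and data are invented, and the paper's parallel training step is omitted.

    ```python
    import numpy as np

    def density_peak_rbf(X, y, n_centers=10, dc=0.5, ridge=1e-6):
        """Pick RBF centres by the density-peak rule (high local density, far
        from any denser point), then solve the output weights linearly."""
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
        rho = np.exp(-(d / dc) ** 2).sum(axis=1)          # local density
        delta = np.array([d[i][rho > rho[i]].min() if (rho > rho[i]).any()
                          else d[i].max() for i in range(len(X))])
        centers = X[np.argsort(rho * delta)[-n_centers:]]
        phi = np.exp(-np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) ** 2)
        w = np.linalg.solve(phi.T @ phi + ridge * np.eye(n_centers), phi.T @ y)
        return centers, w

    rng = np.random.default_rng(3)
    X = rng.uniform(0.0, 1.0, (200, 3))                   # lot features (toy)
    y = 5.0 + 10.0 * X[:, 0] + rng.normal(0.0, 0.5, 200)  # cycle time, days (toy)
    centers, w = density_peak_rbf(X, y)
    phi_new = np.exp(-np.linalg.norm(X[:5, None, :] - centers[None, :, :], axis=2) ** 2)
    print(np.round(phi_new @ w, 2))                       # predicted cycle times
    ```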

  14. Modeling and forecasting US presidential election using learning algorithms

    NASA Astrophysics Data System (ADS)

    Zolghadr, Mohammad; Niaki, Seyed Armin Akhavan; Niaki, S. T. A.

    2017-09-01

    The primary objective of this research is to obtain an accurate forecasting model for the US presidential election. To identify a reliable model, artificial neural network (ANN) and support vector regression (SVR) models are compared based on specified performance measures. Moreover, six independent variables, such as GDP, the unemployment rate, and the president's approval rate, are considered in a stepwise regression to identify the significant variables. The president's approval rate is identified as the most significant variable, based on which eight other variables are identified and considered in the model development. Preprocessing methods are applied to prepare the data for the learning algorithms. The proposed procedure significantly increases the accuracy of the model by 50%. The learning algorithms (ANN and SVR) proved to be superior to linear regression based on each method's calculated performance measures. The SVR model is identified as the most accurate model among the others, as it successfully predicted the outcome of the last three elections (2004, 2008, and 2012). The proposed approach significantly increases the accuracy of the forecast.

  15. Real-time Mainshock Forecast by Statistical Discrimination of Foreshock Clusters

    NASA Astrophysics Data System (ADS)

    Nomura, S.; Ogata, Y.

    2016-12-01

    Foreshock discrimination is one of the most effective ways of providing short-term forecasts of large main shocks. Though many large earthquakes are preceded by foreshocks, discriminating them from the enormous number of small earthquakes is difficult, and only probabilistic evaluation from their spatio-temporal features and magnitude evolution may be available. Logistic regression is the statistical learning method best suited to such binary pattern recognition problems, where estimates of the a-posteriori probability of class membership are required. Statistical learning methods can keep learning discriminating features from an updating catalog and give probabilistic recognition for forecasts in real time. We estimated a non-linear function of foreshock proportion by smooth spline bases and evaluated the possibility of foreshocks by the logit function. In this study, we classified foreshocks from the earthquake catalog of the Japan Meteorological Agency by single-link clustering and learned spatial and temporal features of foreshocks by probability density ratio estimation. We use the epicentral locations, time spans, and differences in magnitudes for learning and forecasting. Magnitudes of main shocks are also predicted by incorporating b-values into our method. We discuss the spatial pattern of foreshocks from the classifier composed by our model. We also implement a back test to validate the predictive performance of the model on this catalog.

  16. A Posteriori Correction of Forecast and Observation Error Variances

    NASA Technical Reports Server (NTRS)

    Rukhovets, Leonid

    2005-01-01

    The proposed method of total observation and forecast error variance correction is based on the assumption of a normal distribution of "observed-minus-forecast" residuals (O-F), where O is an observed value and F is usually a short-term model forecast. This assumption can be accepted for several types of observations (except humidity) which are not grossly in error. The degree of nearness to a normal distribution can be estimated by the skewness (lack of symmetry) $a_3 = \mu_3/\sigma^3$ and the kurtosis $a_4 = \mu_4/\sigma^4 - 3$, where $\mu_i$ is the moment of order $i$ and $\sigma$ is the standard deviation. It is well known that for a normal distribution $a_3 = a_4 = 0$.
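
    A quick way to apply this check to a residual sample (synthetic here) using standard estimators of skewness and excess kurtosis:

    ```python
    import numpy as np
    from scipy.stats import kurtosis, skew

    rng = np.random.default_rng(4)
    o_minus_f = rng.normal(0.0, 1.5, 5000)   # synthetic O-F residuals

    a3 = skew(o_minus_f)        # mu_3 / sigma^3
    a4 = kurtosis(o_minus_f)    # mu_4 / sigma^4 - 3 (excess kurtosis, Fisher form)
    print(f"a3 = {a3:.3f}, a4 = {a4:.3f}")   # values near zero support normality
    ```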

  17. Improving of local ozone forecasting by integrated models.

    PubMed

    Gradišar, Dejan; Grašič, Boštjan; Božnar, Marija Zlata; Mlakar, Primož; Kocijan, Juš

    2016-09-01

    This paper discusses the problem of forecasting maximum ozone concentrations in urban microlocations, where reliable alerting of the local population when thresholds have been surpassed is necessary. To improve the forecast, a methodology of integrated models is proposed. The model is based on multilayer perceptron neural networks that use as inputs all available information from the QualeAria air-quality model, the WRF numerical weather prediction model, and on-site measurements of meteorology and air pollution. While air-quality and meteorological models cover a large geographical 3-dimensional space, their local resolution is often not satisfactory. On the other hand, empirical methods have the advantage of good local forecasts. In this paper, integrated models are used for improved 1-day-ahead forecasting of the maximum hourly value of ozone within each day for representative locations in Slovenia. The WRF meteorological model is used for forecasting meteorological variables and the QualeAria air-quality model for gas concentrations. Their predictions, together with measurements from ground stations, are used as inputs to a neural network. The model validation results show that integrated models noticeably improve ozone forecasts and provide better alert systems.

  18. Impact of single-point GPS integrated water vapor estimates on short-range WRF model forecasts over southern India

    NASA Astrophysics Data System (ADS)

    Kumar, Prashant; Gopalan, Kaushik; Shukla, Bipasha Paul; Shyam, Abhineet

    2017-11-01

    Specifying physically consistent and accurate initial conditions is one of the major challenges of numerical weather prediction (NWP) models. In this study, ground-based global positioning system (GPS) integrated water vapor (IWV) measurements available from the International Global Navigation Satellite Systems (GNSS) Service (IGS) station in Bangalore, India, are used to assess the impact of GPS data on NWP model forecasts over southern India. Two experiments are performed with and without assimilation of GPS-retrieved IWV observations during the Indian winter monsoon period (November-December, 2012) using a four-dimensional variational (4D-Var) data assimilation method. Assimilation of GPS data improved the model IWV analysis as well as the subsequent forecasts. There is a positive impact of ~10% over Bangalore and nearby regions. The Weather Research and Forecasting (WRF) model-predicted 24-h surface temperature forecasts have also improved when compared with observations. Small but significant improvements were found in the rainfall forecasts compared to control experiments.

  19. Knowing what to expect, forecasting monthly emergency department visits: A time-series analysis.

    PubMed

    Bergs, Jochen; Heerinckx, Philipe; Verelst, Sandra

    2014-04-01

    To evaluate an automatic forecasting algorithm in order to predict the number of monthly emergency department (ED) visits one year ahead. We collected retrospective data of the number of monthly visiting patients for a 6-year period (2005-2011) from 4 Belgian Hospitals. We used an automated exponential smoothing approach to predict monthly visits during the year 2011 based on the first 5 years of the dataset. Several in- and post-sample forecasting accuracy measures were calculated. The automatic forecasting algorithm was able to predict monthly visits with a mean absolute percentage error ranging from 2.64% to 4.8%, indicating an accurate prediction. The mean absolute scaled error ranged from 0.53 to 0.68 indicating that, on average, the forecast was better compared with in-sample one-step forecast from the naïve method. The applied automated exponential smoothing approach provided useful predictions of the number of monthly visits a year in advance. Copyright © 2013 Elsevier Ltd. All rights reserved.
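
    A bare-bones version of such automated smoothing, with the two accuracy measures quoted above: simple exponential smoothing whose constant is chosen by in-sample error, scored by MAPE and by MASE against a naive benchmark. The data and the restriction to non-seasonal SES are simplifying assumptions.

    ```python
    import numpy as np

    def ses_forecast(y, horizon):
        """Simple exponential smoothing; alpha picked by in-sample SSE as a
        stand-in for the paper's automated model selection."""
        best_sse, best_level = np.inf, y[0]
        for alpha in np.linspace(0.05, 0.95, 19):
            level, sse = y[0], 0.0
            for obs in y[1:]:
                sse += (obs - level) ** 2
                level = alpha * obs + (1 - alpha) * level
            if sse < best_sse:
                best_sse, best_level = sse, level
        return np.repeat(best_level, horizon)

    def mape(actual, fc):
        return 100.0 * np.mean(np.abs((actual - fc) / actual))

    def mase(actual, fc, insample):
        scale = np.mean(np.abs(np.diff(insample)))   # in-sample naive one-step errors
        return np.mean(np.abs(actual - fc)) / scale

    visits = np.random.default_rng(5).normal(3000.0, 100.0, 72)  # monthly ED visits (toy)
    fc = ses_forecast(visits[:60], 12)
    print(round(mape(visits[60:], fc), 2), round(mase(visits[60:], fc, visits[:60]), 2))
    ```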

  20. Total probabilities of ensemble runoff forecasts

    NASA Astrophysics Data System (ADS)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2017-04-01

    Ensemble forecasting has a long history in meteorological modelling, as an indication of the uncertainty of the forecasts. However, it is necessary to calibrate and post-process the ensembles, as they often exhibit both bias and dispersion errors. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these methods (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters varying in space and time, while giving a spatially and temporally consistent output. However, their method is computationally complex for our large number of stations, which makes it unsuitable for our purpose. Our post-processing method for the ensembles is developed in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu), where we make forecasts for the whole of Europe based on observations from around 700 catchments. As the target is flood forecasting, we are more interested in improving forecast skill for high flows than in a good prediction of the entire flow regime. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different meteorological forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we now post-process all model outputs to estimate the total probability, the post-processed mean, and the uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, but we add a spatial penalty in the calibration process to force a spatial correlation of the parameters. The penalty takes distance, stream connectivity, and the size of the catchment areas into account. This can in some cases have a slight negative impact on the calibration error, but it avoids large differences between parameters of nearby locations, whether stream-connected or not. The spatial calibration also makes it easier to interpolate the post-processing parameters to uncalibrated locations. We also look into different methods for handling the non-normal distributions of runoff data and the effect of different data transformations on forecast skill in general and for floods in particular. Berrocal, V. J., Raftery, A. E. and Gneiting, T.: Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts, Mon. Weather Rev., 135(4), 1386-1402, doi:10.1175/MWR3341.1, 2007. Engeland, K. and Steinsland, I.: Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times, Water Resour. Res., 50(1), 182-197, doi:10.1002/2012WR012757, 2014. Gneiting, T., Raftery, A. E., Westveld, A. H. and Goldman, T.: Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation, Mon. Weather Rev., 133(5), 1098-1118, doi:10.1175/MWR2904.1, 2005. Hemri, S., Fundel, F. and Zappa, M.: Simultaneous calibration of ensemble river flow predictions over an entire range of lead times, Water Resour. Res., 49(10), 6744-6755, doi:10.1002/wrcr.20542, 2013. Raftery, A. E., Gneiting, T., Balabdaoui, F. and Polakowski, M.: Using Bayesian Model Averaging to Calibrate Forecast Ensembles, Mon. Weather Rev., 133(5), 1155-1174, doi:10.1175/MWR2906.1, 2005.
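
    As a concrete reference point for the post-processing step, the sketch below fits a Gaussian EMOS model by minimum CRPS (after Gneiting et al., 2005) on synthetic runoff-like data. The spatial penalty described above is omitted, and all numbers are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def crps_normal(mu, sigma, obs):
        """Closed-form CRPS of a N(mu, sigma^2) forecast evaluated at obs."""
        z = (obs - mu) / sigma
        return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

    def fit_emos(ens_mean, ens_var, obs):
        """Gaussian EMOS, N(a + b*mean, c + d*var), fit by minimum mean CRPS;
        the clip keeps the variance positive during the search."""
        def loss(p):
            a, b, c, d = p
            sigma = np.sqrt(np.clip(c + d * ens_var, 1e-6, None))
            return crps_normal(a + b * ens_mean, sigma, obs).mean()
        return minimize(loss, x0=[0.0, 1.0, 1.0, 1.0], method="Nelder-Mead").x

    rng = np.random.default_rng(6)
    truth = rng.gamma(5.0, 20.0, 300)                  # runoff-like observations
    ens_mean = truth + rng.normal(5.0, 10.0, 300)      # raw ensemble mean, biased high
    ens_var = rng.uniform(20.0, 60.0, 300)             # underdispersive spread
    a, b, c, d = fit_emos(ens_mean, ens_var, truth)
    print(np.round([a, b, c, d], 2))                   # a should absorb the +5 bias
    ```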

  1. Forecasting Andean rainfall and crop yield from the influence of El Nino on Pleiades visibility

    PubMed

    Orlove; Chiang; Cane

    2000-01-06

    Farmers in drought-prone regions of Andean South America have historically made observations of changes in the apparent brightness of stars in the Pleiades around the time of the southern winter solstice in order to forecast interannual variations in summer rainfall and in autumn harvests. They moderate the effect of reduced rainfall by adjusting the planting dates of potatoes, their most important crop. Here we use data on cloud cover and water vapour from satellite imagery, agronomic data from the Andean altiplano and an index of El Nino variability to analyse this forecasting method. We find that poor visibility of the Pleiades in June, caused by an increase in subvisual high cirrus clouds, is indicative of an El Nino year, which is usually linked to reduced rainfall during the growing season several months later. Our results suggest that this centuries-old method of seasonal rainfall forecasting may be based on a simple indicator of El Nino variability.

  2. Entropy Econometrics for combining regional economic forecasts: A Data-Weighted Prior Estimator

    NASA Astrophysics Data System (ADS)

    Fernández-Vázquez, Esteban; Moreno, Blanca

    2017-10-01

    Forecast combination has been studied in econometrics for a long time, and the literature has shown the superior performance of forecast combination over individual predictions. However, there is still controversy on which is the best procedure to specify the forecast weights. This paper explores the possibility of using a procedure based on Entropy Econometrics, which allows setting the weights for the individual forecasts as a mixture of different alternatives. In particular, we examine the ability of the Data-Weighted Prior Estimator proposed by Golan (J Econom 101(1):165-193, 2001) to combine forecasting models in a context of small sample sizes, a relative common scenario when dealing with time series for regional economies. We test the validity of the proposed approach using a simulation exercise and a real-world example that aims at predicting gross regional product growth rates for a regional economy. The forecasting performance of the Data-Weighted Prior Estimator proposed is compared with other combining methods. The simulation results indicate that in scenarios of heavily ill-conditioned datasets the approach suggested dominates other forecast combination strategies. The empirical results are consistent with the conclusions found in the numerical experiment.

  3. Hybrid Intrusion Forecasting Framework for Early Warning System

    NASA Astrophysics Data System (ADS)

    Kim, Sehun; Shin, Seong-Jun; Kim, Hyunwoo; Kwon, Ki Hoon; Han, Younggoo

    Recently, cyber attacks have become a serious hindrance to the stability of the Internet. These attacks exploit the interconnectivity of networks, propagate in an instant, and have become more sophisticated and evolutionary. Traditional Internet security systems such as firewalls, IDS and IPS are limited in their ability to detect recent cyber attacks in advance, as these systems respond to Internet attacks only after the attacks inflict serious damage. In this paper, we propose a hybrid intrusion forecasting system framework for an early warning system. The proposed system utilizes three types of forecasting methods: time-series analysis, probabilistic modeling, and a data mining method. By combining these methods, it is possible to take advantage of the forecasting technique of each while overcoming their drawbacks. Experimental results show that the hybrid intrusion forecasting method outperforms each of the three forecasting methods.

  4. Computer-Aided Analysis of Patents for Product Technology Maturity Forecasting

    NASA Astrophysics Data System (ADS)

    Liang, Yanhong; Gan, Dequan; Guo, Yingchun; Zhang, Peng

    Product technology maturity forecasting is vital for any enterprise that wants to seize the chance for innovation and stay competitive over the long term. The Theory of Inventive Problem Solving (TRIZ) is acknowledged both as a systematic methodology for innovation and as a powerful tool for technology forecasting. Based on TRIZ, the state of the art in product technology maturity assessment and the limits of its application are discussed. Applying text mining and patent analysis technologies, this paper proposes a computer-aided approach for product technology maturity forecasting. It can overcome the shortcomings of the current methods.

  5. Study on Electricity Business Expansion and Electricity Sales Based on Seasonal Adjustment

    NASA Astrophysics Data System (ADS)

    Zhang, Yumin; Han, Xueshan; Wang, Yong; Zhang, Li; Yang, Guangsen; Sun, Donglei; Wang, Bolun

    2017-05-01

    Reference [1] proposed a novel analysis and forecasting method for electricity business expansion based on seasonal adjustment; we extend this work to include effects at the micro and macro levels, respectively. From the micro aspect, we introduce the concept of load factor to forecast the stable value of the electricity consumption of a single new consumer after the installation of new high-voltage transformer capacity. From the macro aspect, considering that the growth of business expansion is also stimulated by the growth of electricity sales, it is necessary to analyse the antecedent relationship between business expansion and electricity sales. First, we forecast the electricity consumption of the customer group and the release rules of expanding capacity, respectively. Second, we compare goodness of fit and prediction accuracy to identify the antecedent relationship and analyse its cause; this also serves as a contrast for observing how customer groups of different ranges affect prediction precision. Finally, simulation results indicate that the proposed method accurately helps determine the value of expanding capacity and electricity consumption.

  6. Verification of an ensemble prediction system for storm surge forecast in the Adriatic Sea

    NASA Astrophysics Data System (ADS)

    Mel, Riccardo; Lionello, Piero

    2014-12-01

    In the Adriatic Sea, storm surges present a significant threat to Venice and to the flat coastal areas of the northern coast of the basin. Sea level forecast is of paramount importance for the management of daily activities and for operating the movable barriers that are presently being built for the protection of the city. In this paper, an EPS (ensemble prediction system) for operational forecasting of storm surge in the northern Adriatic Sea is presented and applied to a 3-month-long period (October-December 2010). The sea level EPS is based on the HYPSE (hydrostatic Padua Sea elevation) model, which is a standard single-layer nonlinear shallow water model, whose forcings (mean sea level pressure and surface wind fields) are provided by the ensemble members of the ECMWF (European Center for Medium-Range Weather Forecasts) EPS. Results are verified against observations at five tide gauges located along the Croatian and Italian coasts of the Adriatic Sea. Forecast uncertainty increases with the predicted value of the storm surge and with the forecast lead time. The EMF (ensemble mean forecast) provided by the EPS has a rms (root mean square) error lower than the DF (deterministic forecast), especially for short (up to 3 days) lead times. Uncertainty for short lead times of the forecast and for small storm surges is mainly caused by uncertainty of the initial condition of the hydrodynamical model. Uncertainty for large lead times and large storm surges is mainly caused by uncertainty in the meteorological forcings. The EPS spread increases with the rms error of the forecast. For large lead times the EPS spread and the forecast error substantially coincide. However, the EPS spread in this study, which does not account for uncertainty in the initial condition, underestimates the error during the early part of the forecast and for small storm surge values. On the contrary, it overestimates the rms error for large surge values. The PF (probability forecast) of the EPS has a clear skill in predicting the actual probability distribution of sea level, and it outperforms simple "dressed" PF methods. A probability estimate based on the single DF is shown to be inadequate. However, a PF obtained with a prescribed Gaussian distribution and centered on the DF value performs very similarly to the EPS-based PF.

  7. Quasi-most unstable modes: a window to 'À la carte' ensemble diversity?

    NASA Astrophysics Data System (ADS)

    Homar Santaner, Victor; Stensrud, David J.

    2010-05-01

    The atmospheric scientific community is nowadays facing the ambitious challenge of providing useful forecasts of atmospheric events that produce high societal impact. The low level of social resilience to false alarms creates tremendous pressure on forecasting offices to issue accurate, timely and reliable warnings. Currently, no operational numerical forecasting system is able to respond to the societal demand for high-resolution (in time and space) predictions in the 12-72 h time span. The main reasons for such deficiencies are the lack of adequate observations and the high non-linearity of the numerical models that are currently used. The whole weather forecasting problem is intrinsically probabilistic, and current methods aim at coping with the various sources of uncertainty and the error propagation throughout the forecasting system. This probabilistic perspective is often created by generating ensembles of deterministic predictions that are aimed at sampling the most important sources of uncertainty in the forecasting system. The ensemble generation/sampling strategy is a crucial aspect of their performance and various methods have been proposed. Although global forecasting offices have been using ensembles of perturbed initial conditions for medium-range operational forecasts since 1994, no consensus exists regarding the optimum sampling strategy for high-resolution short-range ensemble forecasts. Bred vectors, however, have been hypothesized to better capture the growing modes in the highly nonlinear mesoscale dynamics of severe episodes than singular vectors or observation perturbations. Yet even this technique is not able to produce enough diversity in the ensembles to accurately and routinely predict extreme phenomena such as severe weather. Thus, we propose a new method to generate ensembles of initial-condition perturbations that is based on the breeding technique. Given a standard bred mode, a set of customized perturbations is derived with specified amplitudes and horizontal scales. This allows the ensemble to excite growing modes across a wider range of scales. Results show that this approach produces significantly more spread in the ensemble prediction than standard bred modes alone. Several examples that illustrate the benefits of this approach for severe weather forecasts will be provided.

  8. Predictions instead of panics: the framework and utility of systematic forecasting of novel psychoactive drug trends.

    PubMed

    Stogner, John M

    2015-01-01

    Countless novel psychoactive substances have been sensationally described in the last 15 years by the media and academia. Though some become significant issues, most fail to become a substantial threat. The diversity and breadth of these potential problem substances has led policymakers, law enforcement officers, and healthcare providers alike to feel overwhelmed and underprepared for dealing with novel drugs. Inadequacies in training and preparation may be remedied by a response that is more selective and more proactive. The current manuscript seeks to clarify how to most efficiently forecast the "success" of each newly introduced novel psychoactive substance in order to allow for more efficient decision making and proactive resource allocation. A review of literature, published case reports, and legal studies was used to determine which factors were most closely linked to use of a novel drug spreading. Following the development of a forecasting framework, examples of its use are provided. The resulting five-step forecast method relies on assessments of the availability of a potential user base, the costs--legal and otherwise--of the drug relative to existent analogues, the subjective experience, the substance's dependence potential and that of any existent analogue, and ease of acquisition. These five factors should serve to forecast the prevalence of novel drug use, but reaction should be conditioned by the potential for harm. The five-step forecast method predicts that use of acetyl fentanyl, kratom, Leonotis leonurus, and e-cigarettes will grow, but that use of dragonfly and similar substances will not. While this forecasting approach should not be used as a replacement for monitoring, the use of the five-step method will allow policymakers, law enforcement and practitioners to quickly begin targeted evaluative, intervention, and treatment initiatives only for those drugs with predicted harm.

  9. [Medical human resources planning in Europe: A literature review of the forecasting models].

    PubMed

    Benahmed, N; Deliège, D; De Wever, A; Pirson, M

    2018-02-01

    Healthcare is a labor-intensive sector in which half of the expenses are dedicated to human resources. Therefore, policy makers, at national and international levels, attend to the number of practicing professionals and the skill mix. This paper aims to analyze European forecasting models for the supply of and demand for physicians. To describe the forecasting tools used for physician planning in Europe, a grey literature search was done in the OECD, WHO, and European Union libraries. Electronic databases such as PubMed, Medline, Embase, and EconLit were also searched. Quantitative methods for forecasting medical supply rely mainly on stock-and-flow simulations and less often on system dynamics. Parameters included in forecasting models exhibit wide variability in data availability and quality. The forecasting of physician needs is limited to healthcare consumption and rarely considers overall needs and service targets. Besides quantitative methods, horizon scanning enables an evaluation of the changes in supply and demand in an uncertain future based on qualitative techniques such as semi-structured interviews, Delphi panels, or focus groups. Finally, supply and demand forecasting models should be regularly updated. Moreover, post-hoc analysis is also needed but too rarely implemented. Medical human resource planning in Europe is inconsistent. Political implementation of the results of forecasting projections is essential to ensure efficient planning. However, crucial elements such as mobility data between Member States are poorly understood, impairing medical supply regulation policies. These policies are commonly limited to training regulations, while horizontal and vertical substitution is less frequently taken into consideration. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
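
    The stock-and-flow simulations mentioned above reduce to a simple recursion; the sketch below uses invented headcount, intake, and attrition figures purely for illustration.

    ```python
    def project_supply(stock, entrants_per_year, attrition_rate, years):
        """Stock-and-flow recursion: next stock = stock + inflow - outflow."""
        path = [stock]
        for _ in range(years):
            stock = stock + entrants_per_year - attrition_rate * stock
            path.append(round(stock))
        return path

    # all figures invented for illustration
    print(project_supply(stock=30000, entrants_per_year=1200,
                         attrition_rate=0.035, years=10))
    ```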

  10. Intermittent Demand Forecasting in a Tertiary Pediatric Intensive Care Unit.

    PubMed

    Cheng, Chen-Yang; Chiang, Kuo-Liang; Chen, Meng-Yin

    2016-10-01

    Forecasts of the demand for medical supplies both directly and indirectly affect the operating costs and the quality of the care provided by health care institutions. Specifically, overestimating demand induces an inventory surplus, whereas underestimating demand possibly compromises patient safety. Uncertainty in forecasting the consumption of medical supplies generates intermittent demand events. The intermittent demand patterns for medical supplies are generally classified as lumpy, erratic, smooth, and slow-moving demand. This study was conducted with the purpose of advancing a tertiary pediatric intensive care unit's efforts to achieve a high level of accuracy in its forecasting of the demand for medical supplies. On this point, several demand forecasting methods were compared in terms of the forecast accuracy of each. The results confirm that applying Croston's method combined with a single exponential smoothing method yields the most accurate results for forecasting lumpy, erratic, and slow-moving demand, whereas the Simple Moving Average (SMA) method is the most suitable for forecasting smooth demand. In addition, when the classification of demand consumption patterns were combined with the demand forecasting models, the forecasting errors were minimized, indicating that this classification framework can play a role in improving patient safety and reducing inventory management costs in health care institutions.
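
    A minimal implementation of Croston's method as applied to an intermittent usage series; the toy data and smoothing constant stand in for the unit's actual settings, and the combination with single exponential smoothing described above is not reproduced here.

    ```python
    def croston(demand, alpha=0.1):
        """Croston's method: smooth non-zero demand sizes and inter-demand
        intervals separately; the per-period forecast is size / interval."""
        size = interval = None
        periods_since = 1
        for d in demand:
            if d > 0:
                if size is None:
                    size, interval = float(d), float(periods_since)
                else:
                    size = alpha * d + (1 - alpha) * size
                    interval = alpha * periods_since + (1 - alpha) * interval
                periods_since = 1
            else:
                periods_since += 1
        return size / interval if size else 0.0

    usage = [0, 0, 3, 0, 0, 0, 5, 0, 2, 0, 0, 4]   # daily usage of one supply item
    print(round(croston(usage), 3))                 # expected demand per period
    ```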

  11. An impact analysis of forecasting methods and forecasting parameters on bullwhip effect

    NASA Astrophysics Data System (ADS)

    Silitonga, R. Y. H.; Jelly, N.

    2018-04-01

    The bullwhip effect is an increase in the variance of demand fluctuations from downstream to upstream of a supply chain. Forecasting methods and forecasting parameters are recognized as factors that affect the bullwhip phenomenon. To study these factors, we can develop simulations; previous studies have simulated the bullwhip effect in several ways, such as mathematical equation modelling, information control modelling, and computer programs. In this study a spreadsheet program named Bullwhip Explorer was used to simulate the bullwhip effect. Several scenarios were developed to show the change in the bullwhip effect ratio due to differences in forecasting methods and forecasting parameters. The forecasting methods used were mean demand, moving average, exponential smoothing, demand signalling, and minimum expected mean squared error. The forecasting parameters were the moving-average period, smoothing parameter, signalling factor, and safety stock factor. The results showed that decreasing the moving-average period, increasing the smoothing parameter, and increasing the signalling factor can create a bigger bullwhip effect ratio. Meanwhile, the safety stock factor had no impact on the bullwhip effect.
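
    A small simulation in the spirit of such tools: a retailer with an order-up-to policy whose lead-time demand forecast is a moving average. The demand process, lead time, and safety-stock factor are arbitrary choices, not Bullwhip Explorer's settings; the point is that shorter windows inflate the variance ratio.

    ```python
    import numpy as np

    def bullwhip_ratio(window, n=20000, lead=2, z=1.65, seed=7):
        """Var(orders)/Var(demand) for an order-up-to retailer whose lead-time
        demand estimate comes from a moving average over `window` periods."""
        rng = np.random.default_rng(seed)
        demand = rng.normal(100.0, 10.0, n)
        orders = []
        for t in range(window + 1, n):
            ma = demand[t - window:t].mean()
            sd = demand[t - window:t].std(ddof=1)
            prev_ma = demand[t - 1 - window:t - 1].mean()
            prev_sd = demand[t - 1 - window:t - 1].std(ddof=1)
            base = lead * ma + z * sd * np.sqrt(lead)          # order-up-to level
            prev_base = lead * prev_ma + z * prev_sd * np.sqrt(lead)
            orders.append(demand[t] + base - prev_base)        # order placed at t
        return np.var(orders) / np.var(demand)

    for w in (4, 8, 16):   # shorter moving-average window -> larger bullwhip ratio
        print(w, round(bullwhip_ratio(w), 2))
    ```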

  12. On-line algorithms for forecasting hourly loads of an electric utility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vemuri, S.; Huang, W.L.; Nelson, D.J.

    A method that lends itself to on-line forecasting of hourly electric loads is presented, and the results of its use are compared to models developed using the Box-Jenkins method. The method consists of processing the historical hourly loads with a sequential least-squares estimator to identify a finite-order autoregressive model which, in turn, is used to obtain a parsimonious autoregressive-moving average model. The method presented has several advantages in comparison with the Box-Jenkins method, including much less human intervention, improved model identification, and better results. The method is also more robust in that greater confidence can be placed in the accuracy of models based upon the various measures available at the identification stage.
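
    A sketch of the sequential least-squares identification step, here as recursive least squares with a forgetting factor on a synthetic AR(2) load series. The order, forgetting factor, and data are assumptions, not the utility's settings.

    ```python
    import numpy as np

    def rls_ar(y, order=2, lam=0.999):
        """Sequential (recursive) least squares fit of an AR(p) model; the
        forgetting factor lam < 1 discounts old data for on-line use."""
        theta = np.zeros(order)
        P = np.eye(order) * 1e4
        for t in range(order, len(y)):
            x = y[t - order:t][::-1]             # [y_{t-1}, ..., y_{t-p}]
            k = P @ x / (lam + x @ P @ x)
            theta = theta + k * (y[t] - x @ theta)
            P = (P - np.outer(k, x @ P)) / lam
        return theta

    rng = np.random.default_rng(8)
    load = np.zeros(2000)
    for t in range(2, 2000):                     # synthetic AR(2) hourly load
        load[t] = 1.5 * load[t - 1] - 0.6 * load[t - 2] + rng.normal()
    print(np.round(rls_ar(load), 3))             # should approach [1.5, -0.6]
    ```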

  13. The development rainfall forecasting using kalman filter

    NASA Astrophysics Data System (ADS)

    Zulfi, Mohammad; Hasan, Moh.; Dwidja Purnomo, Kosala

    2018-04-01

    Rainfall forecasting is very useful for agricultural planning: rainfall information supports decisions about when to plant certain commodities. In this study, rainfall is forecast with ARIMA and Kalman filter methods. The Kalman filter method expresses a time series model in linear state-space form to determine future forecasts, using a recursive solution to minimize error. The rainfall data in this research were grouped by K-means clustering, and the Kalman filter method was applied for modelling and forecasting rainfall in each cluster. We used ARIMA(p,d,q) models to construct the state space for the Kalman filter model, giving four groups of data and one model per group. In conclusion, the Kalman filter method is better than the ARIMA model for rainfall forecasting in each group, as its error is smaller than that of the ARIMA model.
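
    For reference, the simplest state-space model a Kalman filter can run is the local-level (ARIMA(0,1,1)-equivalent) form, sketched below on made-up monthly rainfall; the study's per-cluster ARIMA(p,d,q) structures would replace this toy model.

    ```python
    import numpy as np

    def local_level_kalman(y, q=1.0, r=10.0):
        """Scalar Kalman filter for a local-level state-space model (the
        ARIMA(0,1,1)-equivalent form); returns one-step-ahead forecasts."""
        level, P = y[0], 1e4
        forecasts = []
        for obs in y[1:]:
            P = P + q                      # predict: state variance grows by q
            forecasts.append(level)        # one-step-ahead forecast of obs
            K = P / (P + r)                # Kalman gain
            level = level + K * (obs - level)
            P = (1 - K) * P
        return np.array(forecasts)

    rain = np.array([210.0, 250, 190, 230, 300, 280, 240, 260])  # monthly mm (toy)
    print(np.round(local_level_kalman(rain), 1))
    ```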

  14. Forecasting electricity usage using univariate time series models

    NASA Astrophysics Data System (ADS)

    Hock-Eam, Lim; Chee-Yin, Yip

    2014-12-01

    Electricity is one of the most important energy sources. A sufficient supply of electricity is vital to support a country's development and growth. Due to changing socio-economic characteristics and the increasing competition and deregulation of the electricity supply industry, electricity demand forecasting is even more important than before. It is imperative to evaluate and compare the predictive performance of various forecasting methods, as this provides further insight into the weaknesses and strengths of each method. In the literature, the evidence on the best methods for forecasting electricity demand is mixed. This paper compares the predictive performance of univariate time series models for forecasting electricity demand using monthly data on maximum electricity load in Malaysia from January 2003 to December 2013. Results reveal that the Box-Jenkins method produces the best out-of-sample predictive performance. On the other hand, the Holt-Winters exponential smoothing method is a good forecasting method for in-sample predictive performance.
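
    For readers who want to reproduce this style of comparison, the sketch below fits a Holt-Winters model with statsmodels on a synthetic monthly load series and scores it out of sample; the data and model settings (additive trend and seasonality) are illustrative assumptions, not the paper's configuration.

      import numpy as np
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      # hypothetical monthly maximum-load series with trend and annual seasonality
      rng = np.random.default_rng(2)
      months = np.arange(132)                    # Jan 2003 - Dec 2013
      load = 10000 + 30 * months + 800 * np.sin(2 * np.pi * months / 12) \
             + rng.normal(0, 150, 132)

      train, test = load[:-12], load[-12:]       # hold out the final year
      model = ExponentialSmoothing(train, trend="add", seasonal="add",
                                   seasonal_periods=12).fit()
      forecast = model.forecast(12)
      rmse = np.sqrt(np.mean((forecast - test) ** 2))
      print(f"out-of-sample RMSE: {rmse:.1f}")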

  15. Compensated Box-Jenkins transfer function for short term load forecast

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Breipohl, A.; Yu, Z.; Lee, F.N.

    In past years, the Box-Jenkins ARIMA method and the Box-Jenkins transfer function (BJTF) method have been among the most commonly used methods for short term electrical load forecasting. But when there is a sudden change in the temperature, both methods tend to exhibit larger forecast errors. This paper demonstrates that the load forecasting errors resulting from either the BJ ARIMA model or the BJTF model are not simply white noise, but rather well-patterned noise, and that the patterns in the noise can be used to improve the forecasts. Thus a compensated Box-Jenkins transfer function method (CBJTF) is proposed to improve the accuracy of the load prediction. Case studies yield about a 14-33% reduction in the root mean square (RMS) errors of the forecasts, depending on the compensation time period as well as the compensation method used.

  16. Snowmelt runoff modeling in simulation and forecasting modes with the Martinec-Rango model

    NASA Technical Reports Server (NTRS)

    Shafer, B.; Jones, E. B.; Frick, D. M. (Principal Investigator)

    1982-01-01

    The Martinec-Rango snowmelt runoff model was applied to two watersheds in the Rio Grande basin, Colorado: the South Fork Rio Grande, a drainage of 216 sq mi without reservoirs or diversions, and the Rio Grande above Del Norte, a drainage of 1,320 sq mi without major reservoirs. The model was successfully applied to both watersheds when run in simulation mode for the period 1973-79, which included both high and low runoff seasons. Central to adapting the model to run in forecast mode was the need to develop a technique to forecast the shape of the snow cover depletion curves between satellite data points. Four separate approaches were investigated: simple linear estimation, multiple regression, parabolic exponential, and type curve. Only the parabolic exponential and type curve methods were run on the South Fork and Rio Grande watersheds for the 1980 runoff season using satellite snow cover updates when available. Although reasonable forecasts were obtained in certain situations, neither method seemed ready for truly operational forecasts, possibly due to a large amount of estimated climatic data for one or two primary base stations during the 1980 season.

  17. Forecasting behavior in smart homes based on sleep and wake patterns.

    PubMed

    Williams, Jennifer A; Cook, Diane J

    2017-01-01

    The goal of this research is to use smart home technology to assist people who are recovering from injuries or coping with disabilities to live independently. We introduce an algorithm to model and forecast wake and sleep behaviors that are exhibited by the participant. Furthermore, we propose that sleep behavior is impacted by and can be modeled from wake behavior, and vice versa. This paper describes the Behavior Forecasting (BF) algorithm. BF consists of 1) defining numeric values that reflect sleep and wake behavior, 2) forecasting wake and sleep values from past behavior, 3) analyzing the effect of wake behavior on sleep and vice versa, and 4) improving prediction performance by using both wake and sleep scores. The BF method was evaluated with data collected from 20 smart homes. We found that regardless of the forecasting method utilized, wake behavior and sleep behavior can be modeled with a minimum accuracy of 84%. Additionally, normalizing the wake and sleep scores drastically improves the accuracy to 99%. The results show that we can effectively model wake and sleep behaviors in a smart environment. Furthermore, wake behaviors can be predicted from sleep behaviors and vice versa.

  18. Foretelling Flares and Solar Energetic Particle Events: the FORSPEF tool

    NASA Astrophysics Data System (ADS)

    Anastasiadis, Anastasios; Papaioannou, Athanasios; Sandberg, Ingmar; Georgoulis, Manolis K.; Tziotziou, Kostas; Jiggens, Piers

    2017-04-01

    A novel integrated prediction system for both solar flares (SFs) and solar energetic particle (SEP) events is presented. The Forecasting Solar Particle Events and Flares (FORSPEF) tool provides forecasting of solar eruptive events, such as SFs, with a projection to coronal mass ejections (CMEs) (occurrence and velocity) and the likelihood of occurrence of an SEP event. In addition, FORSPEF also provides nowcasting of SEP events based on actual SF and CME near-real-time data, as well as the complete SEP profile (peak flux, fluence, rise time, duration) per parent solar event. The prediction of SFs relies on a morphological method, the effective connected magnetic field strength (Beff), which is based on an assessment of potentially flaring active-region (AR) magnetic configurations and utilizes sophisticated analysis of a large number of AR magnetograms. For the prediction of SEP events, new methods have been developed for both the likelihood of SEP occurrence and the expected SEP characteristics. In particular, using the location of the flare (longitude) and the flare size (maximum soft X-ray intensity), a reductive statistical method has been implemented. Moreover, employing CME parameters (velocity and width), proper functions per width (i.e. halo, partial halo, non-halo) and integral energy (E>30, 60, 100 MeV) have been identified. In our technique, warnings are issued for all soft X-ray flares above C1.0. The prediction time in the forecasting scheme extends to 24 hours with a refresh rate of 3 hours, while the respective prediction time for the nowcasting scheme depends on the availability of the near-real-time data and falls between 15 and 20 minutes for solar flares and 6 hours for CMEs. We present the modules of the FORSPEF system, their interconnection and the operational setup. The dual approach in the development of FORSPEF (i.e. forecasting and nowcasting schemes) permits the refinement of predictions upon the availability of new data that characterize changes on the Sun and in interplanetary space, while the combined usage of SF and SEP forecasting methods upgrades FORSPEF to an integrated forecasting solution. Finally, we demonstrate the validation of the modules of the FORSPEF tool using categorical scores constructed on archived data, and we further discuss independent case studies. This work has been funded through the "FORSPEF: FORecasting Solar Particle Events and Flares" ESA Contract No. 4000109641/13/NL/AK and the "SPECS: Solar Particle Events foreCasting Studies" project of the National Observatory of Athens.

  19. On some methods for assessing earthquake predictions

    NASA Astrophysics Data System (ADS)

    Molchan, G.; Romashkova, L.; Peresan, A.

    2017-09-01

    A regional approach to the problem of assessing earthquake predictions inevitably faces a deficit of data. We point out some basic limits of assessment methods reported in the literature, considering the practical case of the performance of the CN pattern recognition method in the prediction of large Italian earthquakes. Along with classical hypothesis testing, a new game approach, the so-called parimutuel gambling (PG) method, is examined. The PG, originally proposed for the evaluation of probabilistic earthquake forecasts, has recently been adapted to the case of 'alarm-based' CN prediction. The PG approach is a non-standard method; therefore it deserves careful examination and theoretical analysis. We show that the alarm-based PG version leads to an almost complete loss of information about predicted earthquakes (even for a large sample). As a result, any conclusions based on the alarm-based PG approach are not to be trusted. We also show that the original probabilistic PG approach does not necessarily identify the genuine forecast correctly among competing seismicity rate models, even when applied to extensive data.

  20. SONARC: A Sea Ice Monitoring and Forecasting System to Support Safe Operations and Navigation in Arctic Seas

    NASA Astrophysics Data System (ADS)

    Stephenson, S. R.; Babiker, M.; Sandven, S.; Muckenhuber, S.; Korosov, A.; Bobylev, L.; Vesman, A.; Mushta, A.; Demchev, D.; Volkov, V.; Smirnov, K.; Hamre, T.

    2015-12-01

    Sea ice monitoring and forecasting systems are important tools for minimizing accident risk and environmental impacts of Arctic maritime operations. Satellite data such as synthetic aperture radar (SAR), combined with atmosphere-ice-ocean forecasting models, navigation models and automatic identification system (AIS) transponder data from ships are essential components of such systems. Here we present first results from the SONARC project (project term: 2015-2017), an international multidisciplinary effort to develop novel and complementary ice monitoring and forecasting systems for vessels and offshore platforms in the Arctic. Automated classification methods (Zakhvatkina et al., 2012) are applied to Sentinel-1 dual-polarization SAR images from the Barents and Kara Sea region to identify ice types (e.g. multi-year ice, level first-year ice, deformed first-year ice, new/young ice, open water) and ridges. Short-term (1-3 days) ice drift forecasts are computed from SAR images using feature tracking and pattern tracking methods (Berg & Eriksson, 2014). Ice classification and drift forecast products are combined with ship positions based on AIS data from a selected period of 3-4 weeks to determine optimal vessel speed and routing in ice. Results illustrate the potential of high-resolution SAR data for near-real-time monitoring and forecasting of Arctic ice conditions. Over the next 3 years, SONARC findings will contribute new knowledge about sea ice in the Arctic while promoting safe and cost-effective shipping, domain awareness, resource management, and environmental protection.

  1. Multi-Year Revenue and Expenditure Forecasting for Small Municipal Governments.

    DTIC Science & Technology

    1981-03-01

    Keywords: Management Audit; Econometric Revenue Forecast; Gap and Impact Analysis; Deterministic Expenditure Forecast; Municipal Forecasting; Municipal Budget. [Truncated abstract] ...together with a multi-year revenue and expenditure forecasting model for the City of Monterey, California. The Monterey model includes an econometric ... [Contents fragment] D. Forecast Based on the Econometric Model; E. Forecast Based on Expert Judgment and Trend Analysis

  2. An operational procedure for rapid flood risk assessment in Europe

    NASA Astrophysics Data System (ADS)

    Dottori, Francesco; Kalas, Milan; Salamon, Peter; Bianchi, Alessandra; Alfieri, Lorenzo; Feyen, Luc

    2017-07-01

    The development of methods for rapid flood mapping and risk assessment is a key step to increase the usefulness of flood early warning systems and is crucial for effective emergency response and flood impact mitigation. Currently, flood early warning systems rarely include real-time components to assess potential impacts generated by forecasted flood events. To overcome this limitation, this study describes the benchmarking of an operational procedure for rapid flood risk assessment based on predictions issued by the European Flood Awareness System (EFAS). Daily streamflow forecasts produced for major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in terms of flood-prone areas, economic damage and affected population, infrastructure and cities. Extensive testing of the operational procedure has been carried out by analysing the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-based and report-based flood extent data, while modelled estimates of economic damage and affected population are compared against ground-based estimates. Finally, we evaluate the skill of risk estimates derived from EFAS flood forecasts with different lead times and combinations of probabilistic forecasts. The results highlight the potential of the real-time operational procedure to help emergency response and management.

  3. Ionosphere monitoring and forecast activities within the IAG working group "Ionosphere Prediction"

    NASA Astrophysics Data System (ADS)

    Hoque, Mainul; Garcia-Rigo, Alberto; Erdogan, Eren; Cueto Santamaría, Marta; Jakowski, Norbert; Berdermann, Jens; Hernandez-Pajares, Manuel; Schmidt, Michael; Wilken, Volker

    2017-04-01

    Ionospheric disturbances can affect technologies in space and on Earth, disrupting satellite and airline operations, communications networks, and navigation systems. As the world becomes ever more dependent on these technologies, ionospheric disturbances, as part of space weather, pose an increasing risk to economic vitality and national security. Therefore, knowledge of the ionospheric state in advance during space weather events is becoming more and more important. To promote scientific cooperation we recently formed a Working Group (WG) called "Ionosphere Predictions" within the International Association of Geodesy (IAG) under Sub-Commission 4.3 "Atmosphere Remote Sensing" of Commission 4 "Positioning and Applications". The general objective of the WG is to promote the development of ionosphere prediction algorithms/models based on the dependence of ionospheric characteristics on solar and magnetic conditions, combining data from different sensors to improve the spatial and temporal resolution and sensitivity and taking advantage of different sounding geometries and latency. The work presented here enables comparison of total electron content (TEC) prediction approaches/results from different centers contributing to this WG, such as the German Aerospace Center (DLR), Universitat Politècnica de Catalunya (UPC), Technische Universität München (TUM) and GMV. DLR developed a model-assisted TEC forecast algorithm that takes advantage of actual trends in the TEC behavior at each grid point. Since this approach may fail during perturbations characterized by large TEC fluctuations or ionization fronts, the trend information is merged with the current background model, which provides a stable climatological TEC behavior. The presented solution is a first step toward regularly providing forecasted TEC services via SWACI/IMPC by DLR. The UPC forecast model is based on applying linear regression to a temporal window of TEC maps in the Discrete Cosine Transform (DCT) domain. Performance tests are currently being conducted in order to improve UPC predicted products for 1 and 2 days ahead. In addition, UPC is working to enable short-term predictions based on UPC real-time GIMs (labelled URTG) and implementing an improved prediction approach. TUM developed a forecast method based on a time series analysis of TEC products, which are either B-spline coefficients estimated by a Kalman filter or TEC grid maps derived from the B-spline coefficients. The forecast method uses a Fourier series expansion to extract the trend functions from the estimated TEC product; the trend functions are then extrapolated to provide predicted TEC products. The forecast algorithm developed by GMV is based on the estimation of ionospheric delay from previous epochs using GNSS data and the main dependence of ionospheric delays on solar and magnetic conditions. Since ionospheric behavior is highly dependent on the region of the Earth, different region-based algorithmic modifications have been implemented in GMV's magicSBAS ionospheric algorithms to be able to estimate and forecast ionospheric delays worldwide. The different TEC prediction approaches outlined here will certainly help to advance the forecasting of ionospheric ionization.
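
    To illustrate the flavor of the TUM-style approach, the sketch below fits a trend plus a few Fourier harmonics to a TEC time series by least squares and extrapolates it forward. This is a generic Fourier-series forecast under illustrative assumptions (hourly sampling, daily period, two harmonics), not the WG's operational code.

      import numpy as np

      def fourier_forecast(y, dt_hours=1.0, period=24.0, n_harm=2, horizon=24):
          """Fit mean + linear trend + n_harm Fourier harmonics of the given
          period by least squares, then extrapolate horizon steps ahead."""
          def design(t):
              cols = [np.ones_like(t), t]
              for k in range(1, n_harm + 1):
                  cols += [np.sin(2 * np.pi * k * t / period),
                           np.cos(2 * np.pi * k * t / period)]
              return np.column_stack(cols)

          t = np.arange(len(y)) * dt_hours
          coef, *_ = np.linalg.lstsq(design(t), y, rcond=None)
          t_future = (len(y) + np.arange(horizon)) * dt_hours
          return design(t_future) @ coef

      # hypothetical hourly TEC series (TEC units) with a diurnal cycle
      t = np.arange(240)
      tec = 20 + 0.01 * t + 8 * np.sin(2 * np.pi * t / 24) \
            + np.random.default_rng(3).normal(0, 1, 240)
      print(fourier_forecast(tec, horizon=24)[:6])  # next six hourly predictions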

  4. Naive vs. Sophisticated Methods of Forecasting Public Library Circulations.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.

    1984-01-01

    Two sophisticated forecasting techniques (autoregressive integrated moving average (ARIMA) and straight-line regression) and two naive techniques (simple average and monthly average) were used to forecast monthly circulation totals of 34 public libraries. Comparisons of forecasts and actual totals revealed that the ARIMA and monthly average methods had the smallest mean…

  5. Monthly streamflow forecasting based on hidden Markov model and Gaussian Mixture Regression

    NASA Astrophysics Data System (ADS)

    Liu, Yongqi; Ye, Lei; Qin, Hui; Hong, Xiaofeng; Ye, Jiajun; Yin, Xingli

    2018-06-01

    Reliable streamflow forecasts can be highly valuable for water resources planning and management. In this study, we combined a hidden Markov model (HMM) and Gaussian Mixture Regression (GMR) for probabilistic monthly streamflow forecasting. The HMM is initialized using a kernelized K-medoids clustering method, and the Baum-Welch algorithm is then executed to learn the model parameters. GMR derives a conditional probability distribution for the predictand given covariate information, including the antecedent flow at a local station and two surrounding stations. The performance of HMM-GMR was verified based on the mean square error and continuous ranked probability score skill scores. The reliability of the forecasts was assessed by examining the uniformity of the probability integral transform values. The results show that HMM-GMR obtained reasonably high skill scores and the uncertainty spread was appropriate. Different HMM states were assumed to be different climate conditions, which would lead to different types of observed values. We demonstrated that the HMM-GMR approach can handle multimodal and heteroscedastic data.
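
    For readers unfamiliar with Gaussian Mixture Regression: given a Gaussian mixture fitted to the joint vector (x, y), the predictive distribution of y given x is itself a mixture whose component means follow the usual Gaussian conditioning formula. Below is a minimal sketch using scikit-learn's GaussianMixture and illustrative data, not the authors' HMM-GMR code.

      import numpy as np
      from scipy.stats import multivariate_normal
      from sklearn.mixture import GaussianMixture

      def gmr_predict(gmm, x, dx):
          """Conditional mean E[y | x] from a GaussianMixture fitted on
          joint samples [x, y]; dx = dimension of the covariate block."""
          w, cond_means = [], []
          for pi, mu, S in zip(gmm.weights_, gmm.means_, gmm.covariances_):
              mx, my = mu[:dx], mu[dx:]
              Sxx, Sxy = S[:dx, :dx], S[:dx, dx:]
              # responsibility of each component for the covariate x
              w.append(pi * multivariate_normal.pdf(x, mx, Sxx))
              # Gaussian conditioning: my + Syx Sxx^-1 (x - mx)
              cond_means.append(my + Sxy.T @ np.linalg.solve(Sxx, x - mx))
          w = np.array(w) / np.sum(w)
          return np.sum(w[:, None] * np.array(cond_means), axis=0)

      # hypothetical monthly flows: predict this month from the previous two
      rng = np.random.default_rng(4)
      prev = rng.lognormal(3, 0.4, (500, 2))
      flow = 0.6 * prev[:, 0] + 0.3 * prev[:, 1] + rng.normal(0, 1, 500)
      gmm = GaussianMixture(n_components=3, covariance_type="full").fit(
          np.column_stack([prev, flow]))
      print(gmr_predict(gmm, x=np.array([20.0, 18.0]), dx=2))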

  6. Urban flood early warning systems: approaches to hydrometeorological forecasting and communicating risk

    NASA Astrophysics Data System (ADS)

    Cranston, Michael; Speight, Linda; Maxey, Richard; Tavendale, Amy; Buchanan, Peter

    2015-04-01

    One of the main challenges for the flood forecasting community remains the provision of reliable early warnings of surface (or pluvial) flooding. The Scottish Flood Forecasting Service has been developing approaches for forecasting the risk of surface water flooding, capitalising on the latest developments in quantitative precipitation forecasting from the Met Office. A probabilistic Heavy Rainfall Alert decision support tool helps operational forecasters assess the likelihood of surface water flooding against regional rainfall depth-duration estimates from MOGREPS-UK linked to historical short-duration flooding in Scotland. The surface water flood risk is communicated through the daily Flood Guidance Statement to emergency responders. A more recent development is an innovative risk-based hydrometeorological approach that links 24-hour ensemble rainfall forecasts through a hydrological model (Grid-to-Grid) to a library of impact assessments (Speight et al., 2015). The early warning tool, FEWS Glasgow, presents the risk of flooding to people, property and transport across a 1 km grid over the city of Glasgow with a lead time of 24 hours. The risk was communicated in a bespoke surface water flood forecast product designed around emergency responder requirements and trialled during the 2014 Commonwealth Games in Glasgow. The new approaches to surface water flood forecasting are leading to improved methods of communicating the risk and better early warning performance, with a reduction in the false alarm rate for summer flood guidance from 2013 (81%) to 2014 (67%), although verification of instances of surface water flooding remains difficult. However, the introduction of more demanding hydrometeorological capabilities with associated greater levels of uncertainty does increase the demand on operational flood forecasting skills and resources. Speight, L., Cole, S.J., Moore, R.J., Pierce, C., Wright, B., Golding, B., Cranston, M., Tavendale, A., Ghimire, S., and Dhondia, J. (2015) Developing surface water flood forecasting capabilities in Scotland: an operational pilot for the 2014 Commonwealth Games in Glasgow. Journal of Flood Risk Management, in press.

  7. Evaluation of Clear-Air Turbulence Diagnostics: GTG in Korea

    NASA Astrophysics Data System (ADS)

    Kim, J.-H.; Chun, H.-Y.; Jang, W.; Sharman, R. D.

    2009-04-01

    The Graphical Turbulence Guidance (GTG) turbulence forecasting system developed at NCAR (Sharman et al., 2006) is evaluated with available turbulence observations (e.g. pilot reports; PIREPs) reported in South Korea during 2003-2007. Clear-air turbulence (CAT) events are extracted from PIREPs by using cloud-to-ground lightning flash data from the Korea Meteorological Administration (KMA). The GTG system includes several steps. First, 45 turbulence indices are calculated in the East Asian region near the Korean peninsula using the Regional Data Assimilation and Prediction System (RDAPS) analysis data with 30 km horizontal grid spacing provided by KMA. Second, the 10 CAT indices with the best forecasting scores are selected. The scoring method is based on the probability of detection, which is calculated exclusively from PIREPs of moderate-or-greater intensity. Various statistical examinations and sensitivity tests of the GTG system are performed with yearly and seasonally classified PIREPs in South Korea. The performance of GTG is more consistent and stable than that of any individual diagnostic in each year and season. In addition, current-year forecasting based on yearly PIREPs is better than adjacent-year and year-after-year forecasting. Seasonal forecasting is generally better than yearly forecasting, because the CAT indices selected for each season represent the meteorological conditions more properly than the same indices applied to all seasons. Wintertime forecasting is the best among the four seasons, likely because the GTG system consists of many CAT indices related to the jet stream, and turbulence associated with the jet is most active in wintertime under strong jet magnitudes. On the other hand, summertime forecasting skill is much lower than in wintertime; to improve it, additional turbulence indices related to, for example, convection would likely need to be developed. A sensitivity test on the number of combined indices shows that yearly and seasonal GTG forecasts are best when about 7 CAT indices are combined.

  8. Landslide early warning based on failure forecast models: the example of the Mt. de La Saxe rockslide, northern Italy

    NASA Astrophysics Data System (ADS)

    Manconi, A.; Giordan, D.

    2015-07-01

    We apply failure forecast models by exploiting near-real-time monitoring data for the La Saxe rockslide, a large unstable slope threatening the Aosta Valley in northern Italy. Starting from the inverse velocity theory, we analyze landslide surface displacements automatically and in near real time over different temporal windows, and we apply straightforward statistical methods to obtain confidence intervals on the estimated time of failure. Based on this case study, we identify operational thresholds that are established on the reliability of the forecast models. Our approach is aimed at supporting the management of early warning systems in the most critical phases of a landslide emergency.
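
    The inverse velocity method rests on the empirical observation that, approaching failure, the reciprocal of the surface displacement velocity often decays roughly linearly with time, so the time of failure can be estimated as the intercept where the fitted line reaches zero. A minimal sketch with synthetic data follows; the window length and measurement noise are illustrative, not values from the La Saxe monitoring system.

      import numpy as np

      def inverse_velocity_tof(t, v):
          """Fit 1/v = a*t + b by least squares and return the extrapolated
          time of failure, i.e. where the fitted line crosses 1/v = 0."""
          a, b = np.polyfit(t, 1.0 / v, 1)
          if a >= 0:
              raise ValueError("1/v is not decreasing; no failure forecast")
          return -b / a

      # synthetic accelerating displacement: v grows as failure (t_f = 100) nears
      rng = np.random.default_rng(5)
      t = np.arange(60, 95, 1.0)                  # days of monitoring
      v = 1.0 / (0.002 * (100.0 - t)) * (1 + rng.normal(0, 0.05, t.size))
      print(f"forecast time of failure: day {inverse_velocity_tof(t, v):.1f}")
      # repeating the fit over sliding windows, as in the paper, yields a
      # distribution of failure times from which confidence bounds can be drawn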

  9. Introducing an operational method to forecast long-term regional drought based on the application of artificial intelligence capabilities

    NASA Astrophysics Data System (ADS)

    Kousari, Mohammad Reza; Hosseini, Mitra Esmaeilzadeh; Ahani, Hossein; Hakimelahi, Hemila

    2017-01-01

    An effective drought forecast offers definite advantages for the management of water resources used in agriculture, industry, and household consumption. To introduce such a model with simple data inputs, this study presents a regional drought forecast method for Fars Province, Iran, based on artificial intelligence capabilities (artificial neural networks) and the Standardized Precipitation Index (SPI, in 3-, 6-, 9-, 12-, 18-, and 24-month series). Precipitation data from 41 rain gauge stations were used to compute the SPI values. In addition, weather signals including the Multivariate ENSO Index (MEI), North Atlantic Oscillation (NAO), Southern Oscillation Index (SOI), NINO1+2, anomaly NINO1+2, NINO3, anomaly NINO3, NINO4, anomaly NINO4, NINO3.4, and anomaly NINO3.4 were used as predictor variables to forecast the SPI time series for the next 12 months. Frequent testing and validation steps were applied to obtain the best artificial neural network (ANN) models. The forecasted values were mapped in the verification step and compared with the observed maps for the same dates. Results showed considerable spatial and temporal relationships, even among maps of different SPI time series. Also, the first 6 months of forecasted maps showed an average of 73% agreement with the observed ones. The most important finding and strong point of this study is that, although the drought forecast for each station and time series was completely independent, the relationships between spatial and temporal predictions remained. This strength is owed mainly to the frequent testing and validation steps used to select the best drought forecast models from the many ANN models produced. Finally, wherever precipitation data are available, practical application of the presented method is possible.
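
    As a schematic of this kind of ANN setup, the sketch below trains a small multilayer perceptron to map climate-signal predictors to a future SPI value, with a held-out validation split. The data, predictor set, and network size are all illustrative stand-ins for the study's 41-station, multi-signal configuration.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(6)
      n = 400
      # hypothetical monthly predictors: e.g. MEI, NAO, SOI, NINO3.4 anomaly
      X = rng.normal(0, 1, (n, 4))
      # hypothetical SPI-3 twelve months later, weakly driven by the signals
      y = 0.5 * X[:, 0] - 0.3 * X[:, 2] + rng.normal(0, 0.5, n)

      X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.25,
                                                random_state=0)
      ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                         random_state=0).fit(X_tr, y_tr)
      print(f"validation R^2: {ann.score(X_va, y_va):.2f}")
      # the study repeats such train/validate cycles per station and SPI
      # scale, keeping the best-performing network for each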

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendes, J.; Bessa, R.J.; Keko, H.

    Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios (with spatial and/or temporal dependence). Statistical approaches to uncertainty forecasting basically consist of estimating the uncertainty based on observed forecasting errors. Quantile regression (QR) is currently a commonly used approach in uncertainty forecasting. In Chapter 3, we propose new statistical approaches to the uncertainty estimation problem by employing kernel density forecast (KDF) methods. We use two estimators in both offline and time-adaptive modes, namely, the Nadaraya-Watson (NW) and Quantile-copula (QC) estimators. We conduct detailed tests of the new approaches using QR as a benchmark. One of the major issues in wind power generation is sudden and large changes of wind power output over a short period of time, namely ramping events. In Chapter 4, we perform a comparative study of existing definitions and methodologies for ramp forecasting. We also introduce a new probabilistic method for ramp event detection. The method starts with a stochastic algorithm that generates wind power scenarios, which are passed through a high-pass filter for ramp detection and estimation of the likelihood of ramp events. The report is organized as follows: Chapter 2 presents the results of the application of ITL training criteria to deterministic WPF; Chapter 3 reports the study on probabilistic WPF, including new contributions to wind power uncertainty forecasting; Chapter 4 presents a new method to predict and visualize ramp events, comparing it with state-of-the-art methodologies; Chapter 5 briefly summarizes the main findings and contributions of this report.
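
    Since quantile regression serves as the report's benchmark for uncertainty forecasting, here is a minimal sketch of producing quantile forecasts of wind power with scikit-learn's gradient boosting and a pinball (quantile) loss. The features and data are hypothetical placeholders for NWP-derived inputs, not the report's wind farm data.

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(7)
      n = 2000
      wind_speed = rng.weibull(2.0, n) * 8.0        # hypothetical NWP wind speed
      power = np.clip((wind_speed / 12.0) ** 3, 0, 1) + rng.normal(0, 0.05, n)
      X = wind_speed.reshape(-1, 1)

      # one model per quantile gives a predictive interval around the median
      quantiles = {}
      for q in (0.1, 0.5, 0.9):
          m = GradientBoostingRegressor(loss="quantile", alpha=q,
                                        n_estimators=200, max_depth=3)
          quantiles[q] = m.fit(X[:-100], power[:-100]).predict(X[-100:])

      coverage = np.mean((power[-100:] >= quantiles[0.1]) &
                         (power[-100:] <= quantiles[0.9]))
      print(f"empirical 10-90% coverage on held-out data: {coverage:.2f}")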

  11. Efficient ensemble forecasting of marine ecology with clustered 1D models and statistical lateral exchange: application to the Red Sea

    NASA Astrophysics Data System (ADS)

    Dreano, Denis; Tsiaras, Kostas; Triantafyllou, George; Hoteit, Ibrahim

    2017-07-01

    Forecasting the state of large marine ecosystems is important for many economic and public health applications. However, advanced three-dimensional (3D) ecosystem models, such as the European Regional Seas Ecosystem Model (ERSEM), are computationally expensive, especially when implemented within an ensemble data assimilation system requiring several parallel integrations. As an alternative to 3D ecological forecasting systems, we propose to implement a set of regional one-dimensional (1D) water-column ecological models that run at a fraction of the computational cost. The 1D model domains are determined using a Gaussian mixture model (GMM)-based clustering method and satellite chlorophyll-a (Chl-a) data. Regionally averaged Chl-a data are assimilated into the 1D models using the singular evolutive interpolated Kalman (SEIK) filter. To laterally exchange information between subregions and improve the forecasting skill, we introduce a new correction step to the assimilation scheme, in which we assimilate a statistical forecast of future Chl-a observations based on information from neighbouring regions. We apply this approach to the Red Sea and show that the assimilative 1D ecological models can forecast surface Chl-a concentration with high accuracy. The statistical assimilation step further improves the forecasting skill by as much as 50%. This general approach of clustering large marine areas and running several interacting 1D ecological models is very flexible. It allows many combinations of clustering, filtering and regression techniques to be used and can be applied to build efficient forecasting systems for other large marine ecosystems.

  12. Implementation of Automatic Clustering Algorithm and Fuzzy Time Series in Motorcycle Sales Forecasting

    NASA Astrophysics Data System (ADS)

    Rasim; Junaeti, E.; Wirantika, R.

    2018-01-01

    Accurate forecasting of product sales depends on the forecasting method used. The purpose of this research is to build a motorcycle sales forecasting application using the fuzzy time series method combined with interval determination using an automatic clustering algorithm. Forecasting is done using motorcycle sales data from the last ten years. The forecasting error rate is then measured using the Mean Percentage Error (MPE) and Mean Absolute Percentage Error (MAPE). The one-year forecasts obtained in this study achieve good accuracy.
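
    For reference, the two error measures cited are straightforward to compute; the sketch below defines both, with made-up sales numbers for illustration.

      import numpy as np

      def mpe(actual, forecast):
          """Mean Percentage Error: signed, so it reveals bias."""
          actual = np.asarray(actual, float)
          forecast = np.asarray(forecast, float)
          return 100.0 * np.mean((actual - forecast) / actual)

      def mape(actual, forecast):
          """Mean Absolute Percentage Error: magnitude of relative error."""
          actual = np.asarray(actual, float)
          forecast = np.asarray(forecast, float)
          return 100.0 * np.mean(np.abs((actual - forecast) / actual))

      sales = [120, 135, 128, 150]      # hypothetical monthly unit sales
      pred = [115, 140, 130, 142]
      print(f"MPE:  {mpe(sales, pred):+.2f}%")
      print(f"MAPE: {mape(sales, pred):.2f}%")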

  13. NWS Operational Requirements for Ensemble-Based Hydrologic Forecasts

    NASA Astrophysics Data System (ADS)

    Hartman, R. K.

    2008-12-01

    Ensemble-based hydrologic forecasts have been developed and issued by National Weather Service (NWS) staff at River Forecast Centers (RFCs) for many years. These forecasts have been used principally for long-range water supply forecasting, and traditionally only the uncertainty associated with weather and climate has been considered. As technology advances and societal expectations of resource managers increase, the use of and desire for risk-based decision support tools have also increased. These tools require forecast information that includes reliable uncertainty estimates across all time and space domains. The development of reliable uncertainty estimates associated with hydrologic forecasts is being actively pursued within the United States and internationally. This presentation will describe the challenges, components, and requirements for operational hydrologic ensemble-based forecasts from the perspective of a NOAA/NWS River Forecast Center.

  14. A novel visualisation tool for climate services: a case study of temperature extremes and human mortality in Europe

    NASA Astrophysics Data System (ADS)

    Lowe, R.; Ballester, J.; Robine, J.; Herrmann, F. R.; Jupp, T. E.; Stephenson, D.; Rodó, X.

    2013-12-01

    Users of climate information often require probabilistic information on which to base their decisions. However, communicating the information contained within a probabilistic forecast presents a challenge. In this paper we demonstrate a novel visualisation technique to display ternary probabilistic forecasts on a map in order to inform decision making. In this method, ternary probabilistic forecasts, which assign probabilities to a set of three outcomes (e.g. low, medium, and high risk), are considered as a point in a triangle of barycentric coordinates. This allows a unique colour to be assigned to each forecast from a continuum of colours defined on the triangle. Colour saturation increases with information gain relative to the reference forecast (i.e. the long-term average). This provides additional information to decision makers compared with conventional methods used in seasonal climate forecasting, where one colour is used to represent one forecast category on a forecast map (e.g. red = 'dry'). We use the tool to present climate-related mortality projections across Europe. Temperature and humidity are related to human mortality via location-specific transfer functions, calculated using historical data. Daily mortality data at the NUTS2 level for 16 countries in Europe were obtained for 1998-2005. Transfer functions were calculated for 54 aggregations in Europe, defined using criteria related to population and climatological similarities. Aggregations are restricted to fall within political boundaries to avoid problems related to varying adaptation policies between countries. A statistical model is fit to cold and warm tails to estimate future mortality using forecast temperatures, in a Bayesian probabilistic framework. Using predefined categories of temperature-related mortality risk, we present maps of probabilistic projections for human mortality at seasonal to decadal time scales. We demonstrate the information gained from using this technique compared to more traditional methods of displaying ternary probabilistic forecasts. This technique allows decision makers to identify areas where the model predicts area-specific heat waves or cold snaps with certainty, in order to effectively target resources to the areas most at risk for a given season or year. It is hoped that this visualisation tool will facilitate the interpretation of probabilistic forecasts not only for public health decision makers but also within a multi-sectoral climate service framework.
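
    To illustrate the barycentric idea: mix three corner colours in proportion to the three outcome probabilities, then scale saturation by how far the forecast departs from the uniform reference (1/3, 1/3, 1/3). The corner colours and saturation rule below are illustrative choices, not the paper's exact scheme.

      import numpy as np

      # illustrative corner colours (RGB) for low / medium / high risk
      CORNERS = np.array([[0.0, 0.3, 1.0],    # low    -> blue
                          [1.0, 0.9, 0.0],    # medium -> yellow
                          [1.0, 0.1, 0.0]])   # high   -> red

      def ternary_colour(p, grey=np.array([0.7, 0.7, 0.7])):
          """Map a ternary probability vector to one RGB colour: barycentric
          mix of corner colours, desaturated toward grey near (1/3,1/3,1/3)."""
          p = np.asarray(p, float)
          mix = p @ CORNERS                          # barycentric colour blend
          # saturation grows with distance from the uniform reference forecast
          sat = np.linalg.norm(p - 1.0 / 3) / np.linalg.norm([2/3, -1/3, -1/3])
          return sat * mix + (1 - sat) * grey

      print(ternary_colour([0.1, 0.2, 0.7]))     # confident high risk: strong red
      print(ternary_colour([0.34, 0.33, 0.33]))  # near-climatology: grey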

  15. Meteor Shower Forecasting for Spacecraft Operations

    NASA Technical Reports Server (NTRS)

    Moorhead, Althea V.; Cooke, William J.; Campbell-Brown, Margaret D.

    2017-01-01

    Although sporadic meteoroids generally pose a much greater hazard to spacecraft than shower meteoroids, meteor showers can significantly increase the risk of damage over short time periods. Because showers are brief, it is sometimes possible to mitigate the risk operationally, which requires accurate predictions of shower activity. NASA's Meteoroid Environment Office (MEO) generates an annual meteor shower forecast that describes the variations in the near-Earth meteoroid flux produced by meteor showers, and presents the shower flux both in absolute terms and relative to the sporadic flux. The shower forecast incorporates model predictions of annual variations in shower activity and quotes fluxes to several limiting particle kinetic energies. In this work, we describe our forecasting methods and present recent improvements to the temporal profiles based on flux measurements from the Canadian Meteor Orbit Radar (CMOR).

  16. Integration of Remote Sensing Data In Operational Flood Forecast In Southwest Germany

    NASA Astrophysics Data System (ADS)

    Bach, H.; Appel, F.; Schulz, W.; Merkel, U.; Ludwig, R.; Mauser, W.

    Methods to accurately assess and forecast flood discharge are mandatory to minimise the impact of hydrological hazards. However, existing rainfall-runoff models rarely accurately consider the spatial characteristics of the watershed, which is essential for a suitable and physics-based description of processes relevant for runoff formation. Spatial information with low temporal variability like elevation, slopes and land use can be mapped or extracted from remote sensing data. However, land surface parameters of high temporal variability, like soil moisture and snow properties, are hardly available and used in operational forecasts. Remote sensing methods can improve flood forecast by providing information on the actual water retention capacities in the watershed and facilitate the regionalisation of hydrological models. To prove and demonstrate this, the project 'InFerno' (Integration of remote sensing data in operational water balance and flood forecast modelling) has been set up, funded by DLR (50EE0053). Within InFerno remote sensing data (optical and microwave) are thoroughly processed to deliver spatially distributed parameters of snow properties and soil moisture. Especially during the onset of a flood this information is essential to estimate the initial conditions of the model. At the flood forecast centres of 'Baden-Württemberg' and 'Rheinland-Pfalz' (Southwest Germany) the remote sensing based maps on soil moisture and snow properties will be integrated in the continuously operated water balance and flood forecast model LARSIM. The concept is to transfer the developed methodology from the Neckar to the Mosel basin. The major challenges lie on the one hand in the implementation of algorithms developed for a multisensoral synergy and the creation of robust, operationally applicable remote sensing products. On the other hand, the operational flood forecast must be adapted to make full use of the new data sources. In the operational phase of the project ESA's ENVISAT satellite, which will be launched in 2002, will serve as remote sensing data source. Until ENVISAT data is available, algorithm retrieval, software development and product generation is performed using existing sensors with ENVISAT-like specifications. Based on these data sets test cases and demonstration runs are conducted and will be presented to prove the advantages of the approach.

  17. School District Enrollment Projections: A Comparison of Three Methods.

    ERIC Educational Resources Information Center

    Pettibone, Timothy J.; Bushan, Latha

    This study assesses three methods of forecasting school enrollments: the cohort-survival method (grade progression), the statistical forecasting procedure developed by the Statistical Analysis System (SAS) Institute, and a simple ratio computation. The three methods were used to forecast school enrollments for kindergarten through grade 12 in a…

  18. Long-term ensemble forecast of snowmelt inflow into the Cheboksary Reservoir under two different weather scenarios

    NASA Astrophysics Data System (ADS)

    Gelfan, Alexander; Moreydo, Vsevolod; Motovilov, Yury; Solomatine, Dimitri P.

    2018-04-01

    A long-term forecasting ensemble methodology, applied to water inflows into the Cheboksary Reservoir (Russia), is presented. The methodology is based on a version of the semi-distributed hydrological model ECOMAG (ECOlogical Model for Applied Geophysics) that allows for the calculation of an ensemble of inflow hydrographs using two different sets of weather ensembles for the lead time period: observed weather data, constructed on the basis of the Ensemble Streamflow Prediction methodology (ESP-based forecast), and synthetic weather data, simulated by a multi-site weather generator (WG-based forecast). We have studied the following: (1) whether there is any advantage of the developed ensemble forecasts in comparison with the currently issued operational forecasts of water inflow into the Cheboksary Reservoir, and (2) whether there is any noticeable improvement in probabilistic forecasts when using the WG-simulated ensemble compared to the ESP-based ensemble. We have found that for a 35-year period beginning from the reservoir filling in 1982, both continuous and binary model-based ensemble forecasts (issued in the deterministic form) outperform the operational forecasts of the April-June inflow volume actually used and, additionally, provide acceptable forecasts of additional water regime characteristics besides the inflow volume. We have also demonstrated that the model performance measures (in the verification period) obtained from the WG-based probabilistic forecasts, which are based on a large number of possible weather scenarios, appeared to be more statistically reliable than the corresponding measures calculated from the ESP-based forecasts based on the observed weather scenarios.

  19. Evaluation of the NCEP CFSv2 45-day Forecasts for Predictability of Intraseasonal Tropical Storm Activities

    NASA Astrophysics Data System (ADS)

    Schemm, J. E.; Long, L.; Baxter, S.

    2013-12-01

    Predictability of intraseasonal tropical storm (TS) activities is assessed using the 1999-2010 CFSv2 hindcast suite. Weekly TS activities in the CFSv2 45-day forecasts were determined using the TS detection and tracking method devised by Camargo and Zebiak (2002). The forecast periods are divided into weekly intervals from Week 1 through Week 6, plus the 30-day mean. The TS activities in those intervals are compared to the observed activities based on the NHC HURDAT and JTWC Best Track datasets. The CFSv2 45-day hindcast suite consists of forecast runs initialized at 00, 06, 12 and 18Z every day during the 1999-2010 period. For the predictability evaluation, forecast TS activities are analyzed based on 20-member ensemble forecasts comprised of 45-day runs made during the most recent 5 days prior to the verification period. The forecast TS activities are evaluated in terms of the number of storms, genesis locations and storm tracks during the weekly periods. The CFSv2 forecasts are shown to have a fair level of skill in predicting the number of storms over the Atlantic Basin, with temporal correlation scores ranging from 0.73 for Week 1 forecasts to 0.63 for Week 6, and average RMS errors ranging from 0.86 to 1.07 during the 1999-2010 hurricane seasons. Also, the forecast track density distribution and false alarm statistics are compiled using the hindcast analyses. In real-time applications of the intraseasonal TS activity forecasts, the climatological TS forecast statistics will be used to make model bias corrections in terms of the storm counts, track distribution and removal of false alarms. An operational implementation of the weekly TS activity prediction is planned for early 2014 to provide an objective input for CPC's Global Tropical Hazards Outlooks.

  20. A Method for Forecasting the Commercial Air Traffic Schedule in the Future

    NASA Technical Reports Server (NTRS)

    Long, Dou; Lee, David; Gaier, Eric; Johnson, Jesse; Kostiuk, Peter

    1999-01-01

    This report presents an integrated set of models that forecast air carriers' future operations when delays due to limited terminal-area capacity are considered. The report models the industry as a whole, avoiding unnecessary details of competition among the carriers. To develop the schedule outputs, we first present a model to forecast unconstrained future flight schedules, based on the assumption of rational behavior by the carriers. We then develop a method to modify the unconstrained schedules, accounting for the effects of congestion due to limited NAS capacities. Our underlying assumption is that carriers will modify their operations to keep mean delays within certain limits; we estimate values for those limits from changes in planned block times reflected in the OAG. Our method for modifying schedules takes many means of reducing delays into consideration, albeit some of them indirectly. The direct actions include depeaking, operating in off-hours, and reducing hub airports' operations. Indirect actions include using secondary airports, using larger aircraft, and selecting new hub airports, which, we assume, have already been modeled in the FAA's TAF. Users of our suite of models can substitute an alternative forecast for the TAF.

  1. The Value of Humans in the Operational River Forecasting Enterprise

    NASA Astrophysics Data System (ADS)

    Pagano, T. C.

    2012-04-01

    The extent of human control over operational river forecasts, such as by adjusting model inputs and outputs, varies from nearly completely automated systems to those where forecasts are generated after discussion among a group of experts. Historical and real-time data availability, the complexity of hydrologic processes, forecast user needs, and forecasting institution support and resource availability (e.g. computing power, money for model maintenance) influence the character and effectiveness of operational forecasting systems. Automated data quality algorithms, if used at all, are typically very basic (e.g. checks for impossible values); substantial human effort is devoted to cleaning up forcing data using subjective methods. Similarly, although it is an active research topic, nearly all operational forecasting systems struggle to make quantitative use of precipitation forecasts from Numerical Weather Prediction models, instead relying on the assessment of meteorologists. Conversely, while there is a strong tradition in meteorology of making raw model outputs available to forecast users via the Internet, this is rarely done in hydrology; operational river forecasters express concerns about exposing users to raw guidance, due to the potential for misinterpretation and misuse. However, this limits the ability of users to build their confidence in operational products through their own value-added analyses. Forecasting agencies also struggle with provenance (i.e. documenting the production process and archiving the pieces that went into creating a forecast), although this is necessary for quantifying the benefits of human involvement in forecasting and diagnosing weak links in the forecasting chain. In hydrology, the space between model outputs and final operational products is nearly unstudied by the academic community, although some studies exist in other fields such as meteorology.

  2. Cloud and DNI nowcasting with MSG/SEVIRI for the optimized operation of concentrating solar power plants

    NASA Astrophysics Data System (ADS)

    Sirch, Tobias; Bugliaro, Luca; Zinner, Tobias; Möhrlein, Matthias; Vazquez-Navarro, Margarita

    2017-02-01

    A novel approach for the nowcasting of clouds and direct normal irradiance (DNI) based on the Spinning Enhanced Visible and Infrared Imager (SEVIRI) aboard the geostationary Meteosat Second Generation (MSG) satellite is presented for forecast horizons up to 120 min. The basis of the algorithm is an optical flow method that derives cloud motion vectors for all cloudy pixels. To facilitate forecasts over a relevant time period, a classification of clouds into objects and a weighted triangular interpolation of clear-sky regions are used. Low and high level clouds are forecasted separately because they show different velocities and motion directions. Additionally, a distinction between advective and convective clouds, together with an intensity correction for quickly thinning convective clouds, is integrated. The DNI is calculated from the forecasted optical thickness of the low and high level clouds. In order to quantitatively assess the performance of the algorithm, a forecast validation against MSG/SEVIRI observations is performed for a period of 2 months, and error rates and Hanssen-Kuiper skill scores are derived for the forecasted cloud masks. For a 5 min forecast, in most cloud situations more than 95 % of all pixels are predicted correctly as cloudy or clear. This number decreases to 80-95 % for a 2 h forecast, depending on cloud type and vertical cloud level. Hanssen-Kuiper skill scores for cloud mask go down to 0.6-0.7 for a 2 h forecast. Compared to persistence, an improvement of the forecast horizon by a factor of 2 is reached for all forecasts up to 2 h. A comparison of forecasted optical thickness distributions and DNI against observations yields correlation coefficients larger than 0.9 for 15 min forecasts and around 0.65 for 2 h forecasts.

  3. The application of hybrid artificial intelligence systems for forecasting

    NASA Astrophysics Data System (ADS)

    Lees, Brian; Corchado, Juan

    1999-03-01

    The results to date are presented from an ongoing investigation, in which the aim is to combine the strengths of different artificial intelligence methods into a single problem solving system. The premise underlying this research is that a system which embodies several cooperating problem solving methods will be capable of achieving better performance than if only a single method were employed. The work has so far concentrated on the combination of case-based reasoning and artificial neural networks. The relative merits of artificial neural networks and case-based reasoning problem solving paradigms, and their combination are discussed. The integration of these two AI problem solving methods in a hybrid systems architecture, such that the neural network provides support for learning from past experience in the case-based reasoning cycle, is then presented. The approach has been applied to the task of forecasting the variation of physical parameters of the ocean. Results obtained so far from tests carried out in the dynamic oceanic environment are presented.

  4. Stream-flow forecasting using extreme learning machines: A case study in a semi-arid region in Iraq

    NASA Astrophysics Data System (ADS)

    Yaseen, Zaher Mundher; Jaafar, Othman; Deo, Ravinesh C.; Kisi, Ozgur; Adamowski, Jan; Quilty, John; El-Shafie, Ahmed

    2016-11-01

    Monthly stream-flow forecasting can yield important information for hydrological applications, including sustainable design of rural and urban water management systems, optimization of water resource allocations, water use, pricing and water quality assessment, and agriculture and irrigation operations. The motivation to explore and develop expert predictive models is an ongoing endeavor in hydrological applications. In this study, the potential of a relatively new data-driven method, namely the extreme learning machine (ELM) method, was explored for forecasting monthly stream-flow discharge rates in the Tigris River, Iraq. The ELM algorithm is a single-layer feedforward neural network (SLFN) that randomly selects the input weights and hidden layer biases and analytically determines the output weights of the SLFN. Based on the partial autocorrelation functions of historical stream-flow data, a set of five input combinations with lagged stream-flow values was employed to establish the best forecasting model. A comparative investigation was conducted to evaluate the performance of the ELM against other data-driven models: support vector regression (SVR) and the generalized regression neural network (GRNN). The forecasting metrics, defined as the correlation coefficient (r), Nash-Sutcliffe efficiency (ENS), Willmott's Index (WI), root-mean-square error (RMSE) and mean absolute error (MAE), computed between the observed and forecasted stream-flow data, were employed to assess the ELM model's effectiveness. The results revealed that the ELM model outperformed the SVR and GRNN models across a number of statistical measures. In quantitative terms, the superiority of ELM over the SVR and GRNN models was exhibited by ENS = 0.578, 0.378 and 0.144, r = 0.799, 0.761 and 0.468, and WI = 0.853, 0.802 and 0.689, respectively, and the ELM model attained a lower RMSE value by approximately 21.3% (relative to SVR) and approximately 44.7% (relative to GRNN). Based on the findings of this study, several recommendations were made for further exploration of the ELM model in hydrological forecasting problems.
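
    The ELM training step described here is short enough to show in full: draw random input weights and biases, compute the hidden activations, and solve for the output weights with a pseudoinverse. A minimal sketch on synthetic lagged-flow data follows; the layer size and lag choices are illustrative, not the study's PACF-selected configuration.

      import numpy as np

      class ELM:
          """Single-layer feedforward network trained ELM-style: random hidden
          layer, output weights solved analytically by least squares."""
          def __init__(self, n_hidden=30, seed=0):
              self.n_hidden = n_hidden
              self.rng = np.random.default_rng(seed)

          def fit(self, X, y):
              self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
              self.b = self.rng.normal(size=self.n_hidden)
              H = np.tanh(X @ self.W + self.b)     # random hidden activations
              self.beta = np.linalg.pinv(H) @ y    # analytic output weights
              return self

          def predict(self, X):
              return np.tanh(X @ self.W + self.b) @ self.beta

      # synthetic monthly flow; predict from lags 1-3
      rng = np.random.default_rng(8)
      m = np.arange(400)
      flow = 300 + 120 * np.sin(2 * np.pi * m / 12) + rng.normal(0, 20, 400)
      X = np.column_stack([flow[2:-1], flow[1:-2], flow[:-3]])
      y = flow[3:]
      model = ELM().fit(X[:300], y[:300])
      rmse = np.sqrt(np.mean((model.predict(X[300:]) - y[300:]) ** 2))
      print(f"test RMSE: {rmse:.1f}")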

  5. Development of a model-based flood emergency management system in Yujiang River Basin, South China

    NASA Astrophysics Data System (ADS)

    Zeng, Yong; Cai, Yanpeng; Jia, Peng; Mao, Jiansu

    2014-06-01

    Flooding is the most frequent disaster in China. It affects people's lives and property, causing considerable economic loss. Flood forecasting and reservoir operation are important in flood emergency management. Although great progress has been achieved in flood forecasting and reservoir operation in China through the use of computers, network technology, and geographic information system technology, the prediction accuracy of models is not satisfactory due to the unavailability of real-time monitoring data. Also, real-time flood control scenario analysis is not effective in many regions and can seldom provide an online decision support function. In this research, a decision support system for real-time flood forecasting in the Yujiang River Basin, South China (DSS-YRB), is introduced. This system is based on hydrological and hydraulic mathematical models. The conceptual framework and detailed components of the proposed DSS-YRB are illustrated; the system employs real-time rainfall data conversion, model-driven hydrologic forecasting, model calibration, data assimilation methods, and reservoir operational scenario analysis. The multi-tiered architecture offers great flexibility, portability, reusability, and reliability. The case study results show that the development and application of a decision support system for real-time flood forecasting and operation are beneficial for flood control.

  6. Daily Peak Load Forecasting of Next Day using Weather Distribution and Comparison Value of Each Nearby Date Data

    NASA Astrophysics Data System (ADS)

    Ito, Shigenobu; Yukita, Kazuto; Goto, Yasuyuki; Ichiyanagi, Katsuhiro; Nakano, Hiroyuki

    With the development of industry in recent years, dependence on electric energy has grown year by year, so a reliable electric power supply is needed. However, stocking a huge amount of electric energy is very difficult, and the balance between demand and supply, which changes hour by hour, must be maintained. Consequently, to supply high-quality, highly dependable electric power economically and efficiently, the movement of electric power demand must be forecasted carefully in advance, and supply and demand management plans should be based on that forecast. Load forecasting is thus an important job in the demand management of electric power companies. So far, forecasting methods using fuzzy logic, neural networks, and regression models have been suggested to improve forecasting accuracy, and their accuracy is already high. But to invest electric power more economically and with higher accuracy, a new forecasting method with still higher accuracy is needed. In this paper, to improve on the forecasting accuracy of the former methods, a daily peak load forecasting method using the weather distribution of the highest and lowest temperatures and comparison values of nearby date data is suggested.

  7. An experiment in hurricane track prediction using parallel computing methods

    NASA Technical Reports Server (NTRS)

    Song, Chang G.; Jwo, Jung-Sing; Lakshmivarahan, S.; Dhall, S. K.; Lewis, John M.; Velden, Christopher S.

    1994-01-01

    The barotropic model is used to explore the advantages of parallel processing in deterministic forecasting. We apply this model to the track forecasting of hurricane Elena (1985). In this particular application, solutions to systems of elliptic equations are the essence of the computational mechanics. One set of equations is associated with the decomposition of the wind into irrotational and nondivergent components - this determines the initial nondivergent state. Another set is associated with recovery of the streamfunction from the forecasted vorticity. We demonstrate that direct parallel methods based on accelerated block cyclic reduction (BCR) significantly reduce the computational time required to solve the elliptic equations germane to this decomposition and forecast problem. A 72-h track prediction was made using incremental time steps of 16 min on a network of 3000 grid points nominally separated by 100 km. The prediction took 30 sec on the 8-processor Alliant FX/8 computer. This was a speed-up of 3.7 when compared to the one-processor version. The 72-h prediction of Elena's track was made as the storm moved toward Florida's west coast. Approximately 200 km west of Tampa Bay, Elena executed a dramatic recurvature that ultimately changed its course toward the northwest. Although the barotropic track forecast was unable to capture the hurricane's tight cycloidal looping maneuver, the subsequent northwesterly movement was accurately forecasted as was the location and timing of landfall near Mobile Bay.
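    Recovering the streamfunction from the forecasted vorticity amounts to solving a Poisson equation, lap(psi) = zeta. The study solves its elliptic systems with accelerated block cyclic reduction on parallel hardware; the sketch below is only a compact serial stand-in, using an FFT-based solve on a doubly periodic grid, with grid size and vorticity field as synthetic assumptions.

```python
import numpy as np

n, dx = 128, 100e3                          # grid points, ~100 km spacing
k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
kx, ky = np.meshgrid(k, k)
k2 = kx**2 + ky**2
k2[0, 0] = 1.0                              # avoid division by zero (mean mode)

def streamfunction(zeta):
    psihat = -np.fft.fft2(zeta) / k2        # invert the Laplacian spectrally
    psihat[0, 0] = 0.0                      # fix the arbitrary constant
    return np.real(np.fft.ifft2(psihat))

zeta = np.random.default_rng(1).normal(scale=1e-5, size=(n, n))  # toy vorticity
psi = streamfunction(zeta)
```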

  8. Adaptive spline autoregression threshold method in forecasting Mitsubishi car sales volume at PT Srikandi Diamond Motors

    NASA Astrophysics Data System (ADS)

    Susanti, D.; Hartini, E.; Permana, A.

    2017-01-01

    Growing competition in sales and purchases between companies in Indonesia means that every company needs proper planning in order to win the competition with other companies. One of the things that can be done in designing such a plan is to forecast car sales for the next few periods, since the amount of car inventory to be sold should be proportional to the number of cars needed. To obtain a correct forecast, one of the methods that can be used is Adaptive Spline Threshold Autoregression (ASTAR). Therefore, this discussion focuses on the use of the ASTAR method in forecasting the volume of car sales at PT Srikandi Diamond Motors using time series data. In this research, forecasting with the ASTAR method produced approximately correct values.

  9. Forecasting approaches to the Mekong River

    NASA Astrophysics Data System (ADS)

    Plate, E. J.

    2009-04-01

    Hydrologists distinguish between flood forecasts, which are concerned with events of the immediate future, and flood predictions, which are concerned with events that are possible, but whose date of occurrence is not determined. Although in principle both involve the determination of runoff from rainfall, the analytical approaches differ because of different objectives. The differences between the two approaches are discussed, starting with an analysis of the forecasting process, using the Mekong River in south-east Asia as an example. Prediction is defined as a forecast for a hypothetical event, such as the 100-year flood, which is usually sufficiently specified by its magnitude and its probability of occurrence. It forms the basis for designing flood protection structures and risk management activities. The method for determining these quantities is hydrological modeling combined with extreme value statistics, today usually applied both to rainfall events and to observed river discharges. A rainfall-runoff model converts extreme rainfall events into extreme discharges, which at certain gage points along a river are calibrated against observed discharges. The quality of the model output is assessed against the mean value by means of the Nash-Sutcliffe quality criterion. The result of this procedure is a design hydrograph (or a family of design hydrographs) used as input into a hydraulic model, which converts the hydrograph into design water levels according to the hydraulic situation of the location. The accuracy of a prediction in this sense is not particularly high: hydrologists know that the 100-year flood is a statistical quantity which can be estimated only within comparatively wide error bounds, and the hydraulics of a river site, in particular under conditions of heavy sediment loads, has many uncertainties. Safety margins, such as additional freeboard, are arranged to compensate for the uncertainty of the prediction. Forecasts, on the other hand, aim to obtain an accurate hydrograph of the near future. The method by which this is done is not as important as the accuracy of the forecast. A mathematical rainfall-runoff model is not necessarily a good forecast model: it has to be very carefully designed, and in many cases statistical models are found to give better results than mathematical models. Forecasters have the advantage of knowing the course of the hydrograph up to the point in time at which the forecast has to be made. Therefore, models can be calibrated online against the hydrograph of the immediate past. To assess the quality of a forecast, the quality criterion should not be based on the mean value, as the Nash-Sutcliffe criterion is, but on the best forecast given the information up to the forecast time. Without any additional information, the best forecast when only the present-day value is known is to assume a no-change scenario, i.e. to assume that the present value does not change in the immediate future. For the Mekong there exists a forecasting system based on a rainfall-runoff model operated by the Mekong River Commission. This model is found not to be adequate for forecasting periods longer than one or two days ahead. Improvements are sought through two approaches: a strictly deterministic rainfall-runoff model, and a strictly statistical model based on regression with upstream stations. The two approaches are compared, and suggestions are made on how best to combine the advantages of both. This requires that due consideration be given to critical hydraulic conditions of the river at and between the gauging stations. Critical situations occur in two ways: when the river overtops its banks, in which case the rainfall-runoff model is incomplete unless overflow losses are considered, and at confluences with tributaries. Of particular importance is the role of the large Tonle Sap Lake, which dampens the hydrograph downstream of Phnom Penh. The effect of these components of river hydraulics on forecasting accuracy is assessed.
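    The no-change argument above suggests a concrete verification recipe: benchmark a forecast against persistence rather than against the climatological mean used by the Nash-Sutcliffe criterion. A minimal sketch, with synthetic data standing in for an observed hydrograph and a one-day-ahead forecast:

```python
import numpy as np

def nash_sutcliffe(obs, fcst):
    # skill relative to the climatological mean (the usual NSE)
    return 1.0 - np.sum((obs - fcst) ** 2) / np.sum((obs - obs.mean()) ** 2)

def persistence_skill(obs, fcst):
    # skill relative to the no-change baseline: "tomorrow equals today"
    persist = obs[:-1]
    return 1.0 - np.sum((obs[1:] - fcst[1:]) ** 2) / np.sum((obs[1:] - persist) ** 2)

rng = np.random.default_rng(2)
obs = np.cumsum(rng.normal(size=365)) + 100.0     # toy hydrograph
fcst = obs + rng.normal(scale=0.5, size=365)      # toy 1-day-ahead forecast
print(nash_sutcliffe(obs, fcst), persistence_skill(obs, fcst))
```

    On a strongly autocorrelated river stage, a forecast can score a high NSE while still being worse than persistence, which is exactly the distinction drawn above.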

  10. [Improved Euler algorithm for trend forecast model and its application to oil spectrum analysis].

    PubMed

    Zheng, Chang-song; Ma, Biao

    2009-04-01

    Oil atomic spectrometric analysis is one of the most important methods for fault diagnosis and state monitoring of large machine equipment, and the gray method is advantageous for trend forecasting. Using oil atomic spectrometric analysis results and gray forecast theory, the present paper establishes a gray forecast model of the Fe/Cu concentration trend in a power-shift steering transmission. Aiming at the shortcomings of the gray method in trend forecasting, the improved Euler algorithm is put forward for the first time to resolve the problem that the old gray model's forecast value depends on the first test value, which causes imprecision. The new method makes the forecast value more precise, as shown in an example. Combined with the threshold value of the oil atomic spectrometric analysis, the new method was applied to the Fe/Cu concentration forecast and premonitory fault information was obtained, so that steps can be taken to prevent the fault. This algorithm can be popularized to state monitoring in industry.
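    For orientation, the sketch below implements the classic GM(1,1) gray model: the whitening equation dx1/dt + a*x1 = b is fitted by least squares, and the standard analytic discretization anchors forecasts to the first observation, which is exactly the dependence the paper's improved Euler (Heun) variant is meant to remove. This is an illustrative reading with toy data, not the paper's algorithm.

```python
import numpy as np

def gm11_fit(x0):
    x1 = np.cumsum(x0)                          # accumulated series
    z1 = 0.5 * (x1[1:] + x1[:-1])               # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    return a, b

def gm11_forecast(x0, steps, a, b):
    # classic analytic discretization, anchored on x0[0]
    k = np.arange(1, len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x1_hat = np.concatenate([[x0[0]], x1_hat])
    return np.diff(x1_hat)[-steps:]             # back to the original series

x0 = np.array([2.87, 3.28, 3.34, 3.77, 3.99, 4.36])  # toy Fe concentration
a, b = gm11_fit(x0)
print(gm11_forecast(x0, 3, a, b))
```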

  11. Applying different independent component analysis algorithms and support vector regression for IT chain store sales forecasting.

    PubMed

    Dai, Wensheng; Wu, Jui-Yu; Lu, Chi-Jie

    2014-01-01

    Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales, since an IT chain store has many branches. Integrating a feature extraction method with a prediction tool, such as support vector regression (SVR), is a useful way to construct an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to deal with various forecasting problems. However, up to now only the basic ICA method (i.e., the temporal ICA model) has been applied to the sales forecasting problem. In this paper, we utilize three different ICA methods, including spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA), to extract features from the sales data and compare their performance in sales forecasting for an IT chain store. Experimental results from real sales data show that the sales forecasting scheme integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data, and the extracted features can improve the prediction performance of SVR for sales forecasting.
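    A minimal sketch of the ICA-plus-SVR idea using scikit-learn: independent components are extracted from a panel of branch sales series and fed to an SVR as predictors. Which of the sICA/tICA/stICA variants results depends on how the data matrix is oriented and stacked; the toy below corresponds to the temporal variant, and all sizes and settings are assumptions, not the paper's pipeline.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.svm import SVR

rng = np.random.default_rng(3)
sales = rng.poisson(100, size=(200, 8)).astype(float)  # 200 weeks, 8 branches

ica = FastICA(n_components=3, random_state=0)
features = ica.fit_transform(sales)          # (200, 3) independent components

X, y = features[:-1], sales[1:, 0]           # predict branch 0, one step ahead
model = SVR(kernel="rbf", C=10.0).fit(X[:150], y[:150])
print(model.predict(X[150:])[:5])            # out-of-sample predictions
```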

  12. Applying Different Independent Component Analysis Algorithms and Support Vector Regression for IT Chain Store Sales Forecasting

    PubMed Central

    Dai, Wensheng

    2014-01-01

    Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales, since an IT chain store has many branches. Integrating a feature extraction method with a prediction tool, such as support vector regression (SVR), is a useful way to construct an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to deal with various forecasting problems. However, up to now only the basic ICA method (i.e., the temporal ICA model) has been applied to the sales forecasting problem. In this paper, we utilize three different ICA methods, including spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA), to extract features from the sales data and compare their performance in sales forecasting for an IT chain store. Experimental results from real sales data show that the sales forecasting scheme integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data, and the extracted features can improve the prediction performance of SVR for sales forecasting. PMID:25165740

  13. Evaluating Snow Data Assimilation Framework for Streamflow Forecasting Applications Using Hindcast Verification

    NASA Astrophysics Data System (ADS)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2012-12-01

    Snow water equivalent (SWE) estimation is a key factor in producing reliable streamflow simulations and forecasts in snow-dominated areas. However, measuring or predicting SWE has significant uncertainty. Sequential data assimilation, which updates states using both observed and modeled data based on error estimation, has been shown to reduce streamflow simulation errors but has had limited testing for forecasting applications. In the current study, a snow data assimilation framework integrated with the National Weather Service River Forecast System (NWSRFS) is evaluated for use in ensemble streamflow prediction (ESP). Seasonal water supply ESP hindcasts are generated for the North Fork of the American River Basin (NFARB) in northern California. Parameter sets from the California Nevada River Forecast Center (CNRFC), the Differential Evolution Adaptive Metropolis (DREAM) algorithm and the Multistep Automated Calibration Scheme (MACS) are tested both with and without sequential data assimilation. The traditional ESP method considers uncertainty in future climate conditions using historical temperature and precipitation time series to generate future streamflow scenarios conditioned on the current basin state. We include data uncertainty analysis in the forecasting framework through the DREAM-based parameter set, which is part of a recently developed Integrated Uncertainty and Ensemble-based data Assimilation framework (ICEA). Extensive verification of all tested approaches is undertaken using traditional forecast verification measures, including root mean square error (RMSE), Nash-Sutcliffe efficiency coefficient (NSE), volumetric bias, joint distribution, ranked probability score (RPS), and discrimination and reliability plots. In comparison to the RFC parameters, the DREAM and MACS sets show significant improvement in volumetric bias in flow. Use of assimilation improves hindcasts of higher flows but does not significantly improve performance in the mid-flow and low-flow categories.
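    Two of the verification measures listed above are easy to state in code: the Nash-Sutcliffe efficiency for a deterministic trace, and the ranked probability score for a categorical (here tercile) ensemble forecast. A sketch with assumed category boundaries and toy data:

```python
import numpy as np

def nse(obs, sim):
    # Nash-Sutcliffe efficiency against the observed mean
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rps(ens, obs, edges):
    # squared distance between cumulative forecast probabilities and the
    # observed category, summed over category boundaries
    fcst_cdf = np.array([np.mean(ens <= e) for e in edges])
    obs_cdf = (obs <= edges).astype(float)
    return np.sum((fcst_cdf - obs_cdf) ** 2)

rng = np.random.default_rng(4)
ens = rng.normal(10.0, 2.0, size=40)       # 40-member seasonal volume ensemble
edges = np.array([8.5, 11.5])              # assumed tercile boundaries
print(rps(ens, 9.3, edges))
```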

  14. The potential of radar-based ensemble forecasts for flash-flood early warning in the southern Swiss Alps

    NASA Astrophysics Data System (ADS)

    Liechti, K.; Panziera, L.; Germann, U.; Zappa, M.

    2013-10-01

    This study explores the limits of radar-based forecasting for hydrological runoff prediction. Two novel radar-based ensemble forecasting chains for flash-flood early warning are investigated in three catchments in the southern Swiss Alps and set in relation to deterministic discharge forecasts for the same catchments. The first radar-based ensemble forecasting chain is driven by NORA (Nowcasting of Orographic Rainfall by means of Analogues), an analogue-based heuristic nowcasting system to predict orographic rainfall for the following eight hours. The second ensemble forecasting system evaluated is REAL-C2, where the numerical weather prediction model COSMO-2 is initialised with 25 different initial conditions derived from a four-day nowcast with the radar ensemble REAL. Additionally, three deterministic forecasting chains were analysed. The performance of these five flash-flood forecasting systems was analysed for 1389 h between June 2007 and December 2010 for which NORA forecasts were issued, due to the presence of orographic forcing. A clear preference was found for the ensemble approach. Discharge forecasts perform better when forced by NORA and REAL-C2 rather than by deterministic weather radar data. Moreover, it was observed that using an ensemble of initial conditions at the forecast initialisation, as in REAL-C2, significantly improved the forecast skill. These forecasts also perform better than forecasts forced by ensemble rainfall forecasts (NORA) initialised from a single initial condition of the hydrological model. Thus the best results were obtained with the REAL-C2 forecasting chain. However, for regions where REAL cannot be produced, NORA might be an option for forecasting events triggered by orographic precipitation.

  15. A novel hybrid decomposition-and-ensemble model based on CEEMD and GWO for short-term PM2.5 concentration forecasting

    NASA Astrophysics Data System (ADS)

    Niu, Mingfei; Wang, Yufang; Sun, Shaolong; Li, Yongwu

    2016-06-01

    To enhance prediction reliability and accuracy, a hybrid model based on the promising principle of "decomposition and ensemble" and a recently proposed meta-heuristic called the grey wolf optimizer (GWO) is introduced for daily PM2.5 concentration forecasting. Compared with existing PM2.5 forecasting methods, the proposed model improves prediction accuracy and the hit rates of directional prediction. The proposed model involves three main steps: decomposing the original PM2.5 series into several intrinsic mode functions (IMFs) via complementary ensemble empirical mode decomposition (CEEMD) to simplify the complex data; individually predicting each IMF with support vector regression (SVR) optimized by GWO; and integrating all predicted IMFs into the final ensemble prediction with another SVR optimized by GWO. Seven benchmark models, including single artificial intelligence (AI) models, other decomposition-ensemble models with different decomposition methods, and models with the same decomposition-ensemble method but optimized by different algorithms, are considered to verify the superiority of the proposed hybrid model. The empirical study indicates that the proposed hybrid decomposition-ensemble model is remarkably superior to all considered benchmark models in terms of prediction accuracy and hit rates of directional prediction.
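    The three-step structure translates into a short skeleton: decompose, predict each component, then integrate the component predictions with a second-stage learner. In the sketch below the CEEMD step is replaced by a hypothetical decompose() placeholder (a real implementation would come from a CEEMD/CEEMDAN library), and the GWO hyperparameter search is omitted in favor of fixed SVR settings, so this shows the architecture only.

```python
import numpy as np
from sklearn.svm import SVR

def decompose(series):
    # placeholder for CEEMD: split the series into moving-average "trends"
    # of decreasing smoothness plus a residual, purely for illustration
    comps, resid = [], series.copy()
    for w in (24, 6):
        trend = np.convolve(resid, np.ones(w) / w, mode="same")
        comps.append(trend)
        resid = resid - trend
    comps.append(resid)
    return comps

def lagged(x, lags=4):
    X = np.column_stack([x[i:len(x) - lags + i] for i in range(lags)])
    return X, x[lags:]

series = np.random.default_rng(5).gamma(3.0, 25.0, size=400)  # toy PM2.5
preds = []
for comp in decompose(series):
    X, y = lagged(comp)
    preds.append(SVR(C=10.0).fit(X[:-50], y[:-50]).predict(X[-50:]))

# second-stage SVR integrates the component predictions (the "ensemble")
stackX = np.column_stack(preds)
final = SVR(C=10.0).fit(stackX[:-10], series[-50:-10]).predict(stackX[-10:])
```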

  16. New Models for Forecasting Enrollments: Fuzzy Time Series and Neural Network Approaches.

    ERIC Educational Resources Information Center

    Song, Qiang; Chissom, Brad S.

    Since university enrollment forecasting is very important, many different methods and models have been proposed by researchers. Two new methods for enrollment forecasting are introduced: (1) the fuzzy time series model; and (2) the artificial neural networks model. Fuzzy time series has been proposed to deal with forecasting problems within a…

  17. Risky Business: Development, Communication and Use of Hydroclimatic Forecasts

    NASA Astrophysics Data System (ADS)

    Lall, U.

    2012-12-01

    Inter-seasonal and longer hydroclimatic forecasts have been made increasingly in the last two decades, following the increase in ENSO activity since the early 1980s and the success in seasonal ENSO forecasting. Yet the number of examples of systematic use of these forecasts and their incorporation into water systems operation continues to be small. This may be due in part to the limited skill of such forecasts over much of the world, but is also likely due to the limited evolution of methods and opportunities to "safely" use uncertain forecasts. There has been a trend to rely more on "physically based" rather than "physically informed" empirical forecasts, and this may in part explain the limited success in developing usable products in more locations. Given the limited skill, forecasters have tended to "dumb down" their forecasts - either formally or subjectively shrinking the forecasts towards climatology, or reducing them to tercile forecasts that serve to obscure the potential information in the forecast. Consequently, the potential utility of such forecasts for decision making is compromised. Water system operating rules are often designed to be robust in the face of historical climate variability, and consequently are adapted to the potential conditions that a forecast seeks to inform. In such situations, there is understandable reluctance by managers to use the forecasts as presented, except in special cases where an alternate course of action is pragmatically appealing in any case. In this talk, I review opportunities to present targeted forecasts for use with decision systems that directly address climate risk and the risk induced by unbiased yet uncertain forecasts, focusing especially on extreme events and water allocation in a competitive environment. Examples from Brazil and India covering surface and ground water conjunctive use strategies that could potentially be insured and lead to improvements over traditional system operation and resource allocation are provided.

  18. How Hydroclimate Influences the Effectiveness of Particle Filter Data Assimilation of Streamflow in Initializing Short- to Medium-range Streamflow Forecasts

    NASA Astrophysics Data System (ADS)

    Clark, E.; Wood, A.; Nijssen, B.; Clark, M. P.

    2017-12-01

    Short- to medium-range (1- to 7-day) streamflow forecasts are important for flood control operations and for issuing potentially life-saving flood warnings. In the U.S., the National Weather Service River Forecast Centers (RFCs) issue such forecasts in real time, depending heavily on a manual data assimilation (DA) approach. Forecasters adjust model inputs, states, parameters and outputs based on experience and consideration of a range of supporting real-time information. Achieving high-quality forecasts from new automated, centralized forecast systems will depend critically on the adequacy of automated DA approaches to make analogous corrections to the forecasting system. Such approaches would further enable systematic evaluation of real-time flood forecasting methods and strategies. Toward this goal, we have implemented a real-time Sequential Importance Resampling particle filter (SIR-PF) approach to assimilate observed streamflow into simulated initial hydrologic conditions (states) for initializing ensemble flood forecasts. Assimilating streamflow alone in SIR-PF improves simulated streamflow and soil moisture during the model spin-up period prior to a forecast, with consequent benefits for forecasts. Nevertheless, it only consistently limits error in simulated snow water equivalent during the snowmelt season and in basins where precipitation falls primarily as snow. We examine how the simulated initial conditions with and without SIR-PF propagate into 1- to 7-day ensemble streamflow forecasts. Forecasts are evaluated in terms of reliability and skill over the period 2005-2015. The focus of this analysis is on how interactions between hydroclimate and SIR-PF performance impact forecast skill. To this end, we examine forecasts for 5 hydroclimatically diverse basins in the western U.S. Some of these basins receive most of their precipitation as snow, others as rain. Some freeze throughout the mid-winter while others experience significant mid-winter melt events. We describe the methodology and present seasonal and inter-basin variations in DA-enhanced forecast skill.
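    The core SIR-PF update is compact: weight each particle (a model state realization) by the likelihood of the observed discharge, then resample in proportion to the weights. The sketch below uses a hypothetical one-dimensional storage state and observation operator; the hydrologic model itself is not represented.

```python
import numpy as np

rng = np.random.default_rng(6)

def sir_update(states, q_obs, obs_operator, rel_err=0.1):
    q_sim = obs_operator(states)                       # simulated discharge
    # Gaussian likelihood with a relative observation error (an assumption)
    w = np.exp(-0.5 * ((q_sim - q_obs) / (rel_err * q_obs)) ** 2)
    w /= w.sum()
    idx = rng.choice(len(states), size=len(states), p=w)  # importance resampling
    return states[idx]

# toy example: 1-D storage states, discharge = 0.1 * storage
states = rng.normal(100.0, 20.0, size=(500, 1))
states = sir_update(states, q_obs=9.0, obs_operator=lambda s: 0.1 * s[:, 0])
```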

  19. Spatial Ensemble Postprocessing of Precipitation Forecasts Using High Resolution Analyses

    NASA Astrophysics Data System (ADS)

    Lang, Moritz N.; Schicker, Irene; Kann, Alexander; Wang, Yong

    2017-04-01

    Ensemble prediction systems are designed to account for errors or uncertainties in the initial and boundary conditions, imperfect parameterizations, etc. However, due to sampling errors and underestimation of the model errors, these ensemble forecasts tend to be underdispersive and to lack both reliability and sharpness. To overcome such limitations, statistical postprocessing methods are commonly applied to these forecasts. In this study, a full-distributional spatial post-processing method is applied to short-range precipitation forecasts over Austria using Standardized Anomaly Model Output Statistics (SAMOS). Following Stauffer et al. (2016), observation and forecast fields are transformed into standardized anomalies by subtracting a site-specific climatological mean and dividing by the climatological standard deviation. Due to the need to fit only a single regression model for the whole domain, the SAMOS framework provides a computationally inexpensive method to create operationally calibrated probabilistic forecasts for any arbitrary location or for all grid points in the domain simultaneously. Taking advantage of the INCA system (Integrated Nowcasting through Comprehensive Analysis), high resolution analyses are used for the computation of the observed climatology and for model training. The INCA system operationally combines station measurements and remote sensing data into real-time objective analysis fields at 1 km horizontal resolution and 1 h temporal resolution. The precipitation forecast used in this study is obtained from a limited area model ensemble prediction system also operated by ZAMG. The so-called ALADIN-LAEF provides, by applying a multi-physics approach, a 17-member forecast at a horizontal resolution of 10.9 km and a temporal resolution of 1 hour. The performed SAMOS approach statistically combines the in-house developed high resolution analysis and ensemble prediction system. The station-based validation of 6-hour precipitation sums shows a mean improvement of more than 40% in CRPS when compared to bilinearly interpolated uncalibrated ensemble forecasts. The validation on randomly selected grid points, representing the true height distribution over Austria, still indicates a mean improvement of 35%. The applied statistical model is currently set up for 6-hourly and daily accumulation periods, but will be extended to a temporal resolution of 1-3 hours within a new probabilistic nowcasting system operated by ZAMG.
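    The SAMOS pre-processing step is just the standardized-anomaly transform described above; one regression is then fitted in anomaly space for the whole domain and its output back-transformed per site. A sketch of the transform alone, with assumed climatological values:

```python
import numpy as np

def to_anomaly(x, clim_mean, clim_sd):
    # site-specific standardization, as in the SAMOS pre-processing
    return (x - clim_mean) / clim_sd

def from_anomaly(a, clim_mean, clim_sd):
    # back-transform a calibrated anomaly to physical units
    return a * clim_sd + clim_mean

rng = np.random.default_rng(7)
clim_mean, clim_sd = 3.2, 1.1            # assumed site climatology (e.g., from INCA)
ens = rng.gamma(2.0, 2.0, size=17)       # toy 17-member precipitation forecast
anoms = to_anomaly(ens, clim_mean, clim_sd)
print(from_anomaly(anoms.mean(), clim_mean, clim_sd))
```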

  20. Ocean state and uncertainty forecasts using HYCOM with the Local Ensemble Transform Kalman Filter (LETKF)

    NASA Astrophysics Data System (ADS)

    Wei, Mozheng; Hogan, Pat; Rowley, Clark; Smedstad, Ole-Martin; Wallcraft, Alan; Penny, Steve

    2017-04-01

    An ensemble forecast system based on the US Navy's operational HYCOM using Local Ensemble Transform Kalman Filter (LETKF) technology has been developed for ocean state and uncertainty forecasts. One of the advantages is that the best possible initial analysis states for the HYCOM forecasts are provided by the LETKF, which assimilates the operational observations using an ensemble method. The background covariance during this assimilation process is supplied by the ensemble, thus avoiding the difficulty of developing tangent linear and adjoint models for 4D-VAR from the complicated hybrid isopycnal vertical coordinate in HYCOM. Another advantage is that the ensemble system provides a valuable uncertainty estimate corresponding to every state forecast from HYCOM. Uncertainty forecasts have proven to be critical for downstream users and managers in making more scientifically sound decisions in the numerical prediction community. In addition, the ensemble mean is generally more accurate and skilful than a single traditional deterministic forecast at the same resolution. We will introduce the ensemble system design and setup, present some results from a 30-member ensemble experiment, and discuss scientific, technical and computational issues and challenges, such as covariance localization, inflation, model-related uncertainties and sensitivity to the ensemble size.

  1. Sufficient Forecasting Using Factor Models

    PubMed Central

    Fan, Jianqing; Xue, Lingzhou; Yao, Jiawei

    2017-01-01

    We consider forecasting a single time series when there is a large number of predictors and a possible nonlinear effect. The dimensionality is first reduced via a high-dimensional (approximate) factor model implemented by principal component analysis. Using the extracted factors, we develop a novel forecasting method called sufficient forecasting, which provides a set of sufficient predictive indices, inferred from high-dimensional predictors, to deliver additional predictive power. The projected principal component analysis is employed to enhance the accuracy of inferred factors when a semi-parametric (approximate) factor model is assumed. Our method is also applicable to cross-sectional sufficient regression using extracted factors. The connection between sufficient forecasting and the deep learning architecture is explicitly stated. The sufficient forecasting correctly estimates projection indices of the underlying factors even in the presence of a nonparametric forecasting function. The proposed method extends sufficient dimension reduction to high-dimensional regimes by condensing the cross-sectional information through factor models. We derive asymptotic properties for the estimate of the central subspace spanned by these projection directions as well as the estimates of the sufficient predictive indices. We further show that the natural method of running multiple regression of the target on estimated factors yields a linear estimate that actually falls into this central subspace. Our method and theory allow the number of predictors to be larger than the number of observations. We finally demonstrate that the sufficient forecasting improves upon the linear forecasting in both simulation studies and an empirical study of forecasting macroeconomic variables. PMID:29731537
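    As a point of reference for the method, the sketch below shows the linear special case: principal-component factors are extracted from a large predictor panel and the target is regressed on the estimated factors, the "natural method" whose estimate the paper shows falls into the central subspace. Dimensions, loadings and coefficients are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)
T, p, k = 200, 500, 3                       # time points, predictors, factors
F = rng.normal(size=(T, k))                 # latent factors
X = F @ rng.normal(size=(k, p)) + 0.5 * rng.normal(size=(T, p))
y = F[:-1] @ np.array([1.0, -0.5, 0.2]) + 0.1 * rng.normal(size=T - 1)

# factor extraction by PCA (SVD of the centered predictor panel)
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
F_hat = U[:, :k] * np.sqrt(T)               # estimated factors

# multiple regression of the target on estimated factors
coef, *_ = np.linalg.lstsq(F_hat[:-1], y, rcond=None)
print(F_hat[-1] @ coef)                     # one-step-ahead linear forecast
```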

  2. A comparison of ensemble post-processing approaches that preserve correlation structures

    NASA Astrophysics Data System (ADS)

    Schefzik, Roman; Van Schaeybroeck, Bert; Vannitsem, Stéphane

    2016-04-01

    Despite the fact that ensemble forecasts address the major sources of uncertainty, they exhibit biases and dispersion errors and are therefore known to improve with calibration or statistical post-processing. For instance, the ensemble model output statistics (EMOS) method, also known as the non-homogeneous regression approach (Gneiting et al., 2005), is known to strongly improve forecast skill. EMOS is based on fitting and adjusting a parametric probability density function (PDF). However, EMOS and other common post-processing approaches apply to a single weather quantity at a single location for a single look-ahead time. They are therefore unable to take into account spatial, inter-variable and temporal dependence structures. Recently, many research efforts have been invested both in designing post-processing methods that resolve this drawback and in verification methods that enable the detection of dependence structures. New verification methods are applied to two classes of post-processing methods, both generating physically coherent ensembles. A first class uses ensemble copula coupling (ECC), which starts from EMOS but adjusts the rank structure (Schefzik et al., 2013). The second class is a member-by-member post-processing (MBM) approach that maps each raw ensemble member to a corrected one (Van Schaeybroeck and Vannitsem, 2015). We compare variants of the EMOS-ECC and MBM classes and highlight a specific theoretical connection between them. All post-processing variants are applied in the context of the ensemble system of the European Centre for Medium-Range Weather Forecasts (ECMWF) and compared using multivariate verification tools, including the energy score, the variogram score (Scheuerer and Hamill, 2015) and the band depth rank histogram (Thorarinsdottir et al., 2015). Gneiting, Raftery, Westveld, and Goldman, 2005: Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation. Mon. Wea. Rev., 133, 1098-1118. Scheuerer and Hamill, 2015: Variogram-based proper scoring rules for probabilistic forecasts of multivariate quantities. Mon. Wea. Rev., 143, 1321-1334. Schefzik, Thorarinsdottir, and Gneiting, 2013: Uncertainty quantification in complex simulation models using ensemble copula coupling. Statistical Science, 28, 616-640. Thorarinsdottir, Scheuerer, and Heinz, 2015: Assessing the calibration of high-dimensional ensemble forecasts using rank histograms, arXiv:1310.0236. Van Schaeybroeck and Vannitsem, 2015: Ensemble post-processing using member-by-member approaches: theoretical aspects. Q.J.R. Meteorol. Soc., 141, 807-818.
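    A minimal Gaussian EMOS sketch, assuming the usual formulation in which the predictive mean is affine in the ensemble mean and the predictive variance affine in the ensemble variance, with parameters fitted by minimum CRPS using the closed-form CRPS of a normal distribution. Data and starting values are toy assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def crps_normal(y, mu, sigma):
    # closed-form CRPS of N(mu, sigma^2) evaluated at observation y
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z)
                    - 1 / np.sqrt(np.pi))

def emos_loss(params, ens_mean, ens_var, obs):
    a, b, c, d = params
    mu = a + b * ens_mean                       # affine mean correction
    sigma = np.sqrt(np.abs(c + d * ens_var)) + 1e-6  # affine variance model
    return np.mean(crps_normal(obs, mu, sigma))

rng = np.random.default_rng(9)
ens = rng.normal(15.0, 2.0, size=(300, 20)) + rng.normal(size=(300, 1))
obs = 0.9 * ens.mean(axis=1) + rng.normal(scale=1.5, size=300) + 2.0
res = minimize(emos_loss, x0=[0.0, 1.0, 1.0, 0.1],
               args=(ens.mean(axis=1), ens.var(axis=1), obs))
print(res.x)   # fitted EMOS coefficients
```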

  3. Evolving forecasting classifications and applications in health forecasting

    PubMed Central

    Soyiri, Ireneous N; Reidpath, Daniel D

    2012-01-01

    Health forecasting forewarns the health community about future health situations and disease episodes so that health systems can better allocate resources and manage demand. The tools used for developing and measuring the accuracy and validity of health forecasts commonly are not defined although they are usually adapted forms of statistical procedures. This review identifies previous typologies used in classifying the forecasting methods commonly used in forecasting health conditions or situations. It then discusses the strengths and weaknesses of these methods and presents the choices available for measuring the accuracy of health-forecasting models, including a note on the discrepancies in the modes of validation. PMID:22615533

  4. Evaluation of CMAQ and CAMx Ensemble Air Quality Forecasts during the 2015 MAPS-Seoul Field Campaign

    NASA Astrophysics Data System (ADS)

    Kim, E.; Kim, S.; Bae, C.; Kim, H. C.; Kim, B. U.

    2015-12-01

    The performance of air quality forecasts during the 2015 MAPS-Seoul field campaign was evaluated. A forecast system was operated to support the campaign's daily aircraft route decisions for airborne measurements observing long-range transported plumes. We utilized two real-time ensemble systems based on the Weather Research and Forecasting (WRF)-Sparse Matrix Operator Kernel Emissions (SMOKE)-Comprehensive Air quality Model with extensions (CAMx) modeling framework and the WRF-SMOKE-Community Multiscale Air Quality (CMAQ) framework over northeastern Asia to simulate PM10 concentrations. The Global Forecast System (GFS) from the National Centers for Environmental Prediction (NCEP) was used to provide meteorological inputs for the forecasts. For an additional set of retrospective simulations, the ERA-Interim reanalysis from the European Centre for Medium-Range Weather Forecasts (ECMWF) was also utilized to assess forecast uncertainties arising from the meteorological data used. The Model Inter-Comparison Study for Asia (MICS-Asia) and National Institute of Environmental Research (NIER) Clean Air Policy Support System (CAPSS) emission inventories are used for foreign and domestic emissions, respectively. In this study, we evaluate the CMAQ and CAMx model performance during the campaign by comparing the results to the airborne and surface measurements. Contributions of foreign and domestic emissions are estimated using a brute force method. Analyses of model performance and emissions will be utilized to improve air quality forecasts for the upcoming KORUS-AQ field campaign planned in 2016.

  5. Uncertainty analysis of an inflow forecasting model: extension of the UNEEC machine learning-based method

    NASA Astrophysics Data System (ADS)

    Pianosi, Francesca; Lal Shrestha, Durga; Solomatine, Dimitri

    2010-05-01

    This research presents an extension of the UNEEC (Uncertainty Estimation based on Local Errors and Clustering; Shrestha and Solomatine, 2006, 2008; Solomatine and Shrestha, 2009) method in the direction of explicitly including parameter uncertainty. The UNEEC method assumes that there is an optimal model and that the residuals of the model can be used to assess the uncertainty of the model prediction. It is assumed that all sources of uncertainty, including input, parameter and model structure uncertainty, are explicitly manifested in the model residuals. In this research, these assumptions are relaxed, and the UNEEC method is extended to consider parameter uncertainty as well (abbreviated UNEEC-P). In UNEEC-P, we first use Monte Carlo (MC) sampling in parameter space to generate N model realizations (each of which is a time series), estimate the prediction quantiles based on the empirical distribution functions of the model residuals considering all the residual realizations, and only then apply the standard UNEEC method, which encapsulates the uncertainty of a hydrologic model (expressed by quantiles of the error distribution) in a machine learning model (e.g., an ANN). UNEEC-P is applied first to a linear regression model of synthetic data, and then to a real case study of forecasting inflow to Lake Lugano in northern Italy. The inflow forecasting model is a stochastic heteroscedastic model (Pianosi and Soncini-Sessa, 2009). The preliminary results show that the UNEEC-P method produces wider uncertainty bounds, which is consistent with the fact that the method also considers parameter uncertainty of the optimal model. In the future, the UNEEC method will be further extended to consider input and structure uncertainty, which will provide more realistic estimates of model predictions.
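    The UNEEC-P step described above can be sketched directly: draw a Monte Carlo sample of parameters, pool the residuals of all model realizations, and read off empirical quantiles as uncertainty bounds. The clustering and the machine-learning encapsulation of the quantiles used by UNEEC are omitted, and model() is a hypothetical stand-in for the hydrologic model.

```python
import numpy as np

rng = np.random.default_rng(10)

def model(x, theta):                      # toy stand-in for the forecast model
    return theta[0] * x + theta[1]

x = np.linspace(0, 10, 200)
y_obs = 2.0 * x + 1.0 + rng.normal(scale=1.0, size=x.size)

# Monte Carlo sample in parameter space (N = 1000 realizations)
thetas = rng.normal([2.0, 1.0], [0.1, 0.3], size=(1000, 2))
resid = np.stack([y_obs - model(x, th) for th in thetas])    # (1000, 200)

# empirical quantiles of the pooled residual realizations
lo, hi = np.quantile(resid, [0.05, 0.95], axis=0)
center = model(x, thetas.mean(axis=0))
band = (center + lo, center + hi)          # 90% prediction bounds
```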

  6. Operational Hydrological Forecasting During the Iphex-iop Campaign - Meet the Challenge

    NASA Technical Reports Server (NTRS)

    Tao, Jing; Wu, Di; Gourley, Jonathan; Zhang, Sara Q.; Crow, Wade; Peters-Lidard, Christa D.; Barros, Ana P.

    2016-01-01

    An operational streamflow forecasting testbed was implemented during the Intense Observing Period (IOP) of the Integrated Precipitation and Hydrology Experiment (IPHEx-IOP) in May-June 2014 to characterize flood predictability in complex terrain. Specifically, hydrological forecasts were issued daily for 12 headwater catchments in the Southern Appalachians using the Duke Coupled surface-groundwater Hydrology Model (DCHM) forced by hourly atmospheric fields and QPFs (Quantitative Precipitation Forecasts) produced by the NASA-Unified Weather Research and Forecasting (NU-WRF) model. Previous-day hindcasts forced by radar-based QPEs (Quantitative Precipitation Estimates) were used to provide initial conditions for present-day forecasts. This manuscript first describes the operational testbed framework and workflow during the IPHEx-IOP, including a synthesis of results. Second, various data assimilation approaches are explored a posteriori (post-IOP) to improve operational (flash) flood forecasting. Although all flood events during the IOP were predicted by the IPHEx operational testbed with lead times of up to 6 h, significant errors of over- and/or under-prediction were identified that could be traced back to the QPFs and subgrid-scale variability of radar QPEs. To improve operational flood prediction, three data-merging strategies were pursued post-IOP: (1) the spatial patterns of QPFs were improved through assimilation of satellite-based microwave radiances into NU-WRF; (2) QPEs were improved by merging raingauge observations with ground-based radar observations using bias-correction methods to produce streamflow hindcasts and an associated uncertainty envelope capturing the streamflow observations; and (3) river discharge observations were assimilated into the DCHM to improve streamflow forecasts using the Ensemble Kalman Filter (EnKF), the fixed-lag Ensemble Kalman Smoother (EnKS), and the Asynchronous EnKF (AEnKF) methods. Both flood hindcasts and forecasts were significantly improved by assimilating discharge observations into the DCHM. Specifically, Nash-Sutcliffe Efficiency (NSE) values as high as 0.98, 0.71 and 0.99 at 15-min time-scales were attained for three headwater catchments in the inner mountain region, demonstrating that the assimilation of discharge observations at the basin's outlet can reduce the errors and uncertainties in soil moisture at very small scales. Success in operational flood forecasting at lead times of 6, 9, 12 and 15 h was also achieved through discharge assimilation, with NSEs of 0.87, 0.78, 0.72 and 0.51, respectively. Analysis of experiments using various data assimilation system configurations indicates that the optimal assimilation time window depends both on basin properties and on the storm-specific space-time structure of rainfall; therefore adaptive, context-aware configurations of the data assimilation system are recommended to address the challenges of flood prediction in headwater basins.

  7. Operational hydrological forecasting during the IPHEx-IOP campaign - Meet the challenge

    NASA Astrophysics Data System (ADS)

    Tao, Jing; Wu, Di; Gourley, Jonathan; Zhang, Sara Q.; Crow, Wade; Peters-Lidard, Christa; Barros, Ana P.

    2016-10-01

    An operational streamflow forecasting testbed was implemented during the Intense Observing Period (IOP) of the Integrated Precipitation and Hydrology Experiment (IPHEx-IOP) in May-June 2014 to characterize flood predictability in complex terrain. Specifically, hydrological forecasts were issued daily for 12 headwater catchments in the Southern Appalachians using the Duke Coupled surface-groundwater Hydrology Model (DCHM) forced by hourly atmospheric fields and QPFs (Quantitative Precipitation Forecasts) produced by the NASA-Unified Weather Research and Forecasting (NU-WRF) model. Previous-day hindcasts forced by radar-based QPEs (Quantitative Precipitation Estimates) were used to provide initial conditions for present-day forecasts. This manuscript first describes the operational testbed framework and workflow during the IPHEx-IOP, including a synthesis of results. Second, various data assimilation approaches are explored a posteriori (post-IOP) to improve operational (flash) flood forecasting. Although all flood events during the IOP were predicted by the IPHEx operational testbed with lead times of up to 6 h, significant errors of over- and/or under-prediction were identified that could be traced back to the QPFs and subgrid-scale variability of radar QPEs. To improve operational flood prediction, three data-merging strategies were pursued post-IOP: (1) the spatial patterns of QPFs were improved through assimilation of satellite-based microwave radiances into NU-WRF; (2) QPEs were improved by merging raingauge observations with ground-based radar observations using bias-correction methods to produce streamflow hindcasts and an associated uncertainty envelope capturing the streamflow observations; and (3) river discharge observations were assimilated into the DCHM to improve streamflow forecasts using the Ensemble Kalman Filter (EnKF), the fixed-lag Ensemble Kalman Smoother (EnKS), and the Asynchronous EnKF (AEnKF) methods. Both flood hindcasts and forecasts were significantly improved by assimilating discharge observations into the DCHM. Specifically, Nash-Sutcliffe Efficiency (NSE) values as high as 0.98, 0.71 and 0.99 at 15-min time-scales were attained for three headwater catchments in the inner mountain region, demonstrating that the assimilation of discharge observations at the basin's outlet can reduce the errors and uncertainties in soil moisture at very small scales. Success in operational flood forecasting at lead times of 6, 9, 12 and 15 h was also achieved through discharge assimilation, with NSEs of 0.87, 0.78, 0.72 and 0.51, respectively. Analysis of experiments using various data assimilation system configurations indicates that the optimal assimilation time window depends both on basin properties and on the storm-specific space-time structure of rainfall; therefore adaptive, context-aware configurations of the data assimilation system are recommended to address the challenges of flood prediction in headwater basins.

  8. Scenario approach for the seasonal forecast of Kharif flows from the Upper Indus Basin

    NASA Astrophysics Data System (ADS)

    Fraz Ismail, Muhammad; Bogacki, Wolfgang

    2018-02-01

    Snow and glacial melt runoff are the major sources of water contribution from the high mountainous terrain of the Indus River upstream of the Tarbela reservoir. A reliable forecast of seasonal water availability for the Kharif cropping season (April-September) can pave the way towards better water management and a subsequent boost in the agro-economy of Pakistan. The use of degree-day models in conjunction with satellite-based remote-sensing data for the forecasting of seasonal snow and ice melt runoff has proved to be a suitable approach for data-scarce regions. In the present research, the Snowmelt Runoff Model (SRM) has not only been enhanced by incorporating a glacier (G) component but also applied for the forecast of seasonal water availability from the Upper Indus Basin (UIB). The Excel-based SRM+G accounts for separate degree-day factors for snow and glacier melt processes. All-year simulation runs with SRM+G for the period 2003-2014 result in an average flow component distribution of 53, 21, and 26 % for snow, glacier, and rain, respectively. The UIB has been divided into Upper and Lower parts because of the different climatic conditions in the Tibetan Plateau. The scenario approach for seasonal forecasting, which, like the Ensemble Streamflow Prediction method, uses historical meteorology as model forcing, has proven to be adequate for long-term water availability forecasts. The accuracy of the forecast, with a mean absolute percentage error (MAPE) of 9.5 %, could be slightly improved compared to two existing operational forecasts for the UIB, and the bias could be reduced to -2.0 %. However, the association between forecasts and observations as well as the skill in predicting extreme conditions is rather weak for all three models, which motivates further research on the selection of a subset of ensemble members according to forecasted seasonal anomalies.
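    The degree-day principle behind SRM+G reduces to one line: melt depth equals a degree-day factor times the positive degree-days, with separate factors for snow and glacier surfaces. A sketch with assumed, purely illustrative factor values:

```python
import numpy as np

def melt(temps, ddf, t_crit=0.0):
    # melt depth (cm/day) = degree-day factor * positive degree-days
    return ddf * np.maximum(temps - t_crit, 0.0)

temps = np.array([-2.0, 1.5, 4.0, 6.5, 3.0])       # daily mean temps (deg C)
snow_melt = melt(temps, ddf=0.45)                  # assumed snow factor
glacier_melt = melt(temps, ddf=0.65)               # higher assumed ice factor
print(snow_melt.sum(), glacier_melt.sum())
```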

  9. Evaluation of model-based seasonal streamflow and water allocation forecasts for the Elqui Valley, Chile

    NASA Astrophysics Data System (ADS)

    Delorit, Justin; Cristian Gonzalez Ortuya, Edmundo; Block, Paul

    2017-09-01

    In many semi-arid regions, multisectoral demands often stress available water supplies. Such is the case in the Elqui River valley of northern Chile, which draws on a limited-capacity reservoir to allocate 25 000 water rights. Delayed infrastructure investment forces water managers to address demand-based allocation strategies, particularly in dry years, which are realized through reductions in the volume associated with each water right. Skillful season-ahead streamflow forecasts have the potential to inform managers with an indication of future conditions to guide reservoir allocations. This work evaluates season-ahead statistical prediction models of October-January (growing season) streamflow at multiple lead times associated with manager and user decision points, and links predictions with a reservoir allocation tool. Skillful results (streamflow forecasts outperform climatology) are produced for short lead times (1 September: ranked probability skill score (RPSS) of 0.31, categorical hit skill score of 61 %). At longer lead times, climatological skill exceeds forecast skill due to fewer observations of precipitation. However, coupling the 1 September statistical forecast model with a sea surface temperature phase and strength statistical model allows for equally skillful categorical streamflow forecasts to be produced for a 1 May lead, triggered for 60 % of years (1950-2015), suggesting forecasts need not be strictly deterministic to be useful for water rights holders. An early (1 May) categorical indication of expected conditions is reinforced with a deterministic forecast (1 September) as more observations of local variables become available. The reservoir allocation model is skillful at the 1 September lead (categorical hit skill score of 53 %); skill improves to 79 % when categorical allocation prediction certainty exceeds 80 %. This result implies that allocation efficiency may improve when forecasts are integrated into reservoir decision frameworks. The methods applied here advance the understanding of the mechanisms and timing responsible for moisture transport to the Elqui Valley and provide a unique application of streamflow forecasting in the prediction of water right allocations.

  10. A Copula-Based Conditional Probabilistic Forecast Model for Wind Power Ramps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Brian S; Krishnan, Venkat K; Zhang, Jie

    Efficient management of wind ramping characteristics can significantly reduce wind integration costs for balancing authorities. By considering the stochastic dependence of wind power ramp (WPR) features, this paper develops a conditional probabilistic wind power ramp forecast (cp-WPRF) model based on copula theory. The WPR dataset is constructed by extracting ramps from a large dataset of historical wind power. Each WPR feature (e.g., rate, magnitude, duration, and start-time) is separately forecasted by considering the coupling effects among different ramp features. To accurately model the marginal distributions within a copula, a Gaussian mixture model (GMM) is adopted to characterize the WPR uncertainty and features. The canonical maximum likelihood (CML) method is used to estimate the parameters of the multivariate copula. The optimal copula model is chosen based on the Bayesian information criterion (BIC) from each copula family. Finally, the best condition-based cp-WPRF model is determined by predictive interval (PI) based evaluation metrics. Numerical simulations on publicly available wind power data show that the developed copula-based cp-WPRF model can predict WPRs with a high level of reliability and sharpness.
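    The marginal/copula split can be sketched for two ramp features: fit a Gaussian mixture to each marginal, push the data through the mixture CDF to uniforms, and estimate the dependence from the normal scores (a common shortcut consistent with the CML idea). The paper selects among copula families by BIC and conditions the forecast on ramp features; the sketch below covers only the Gaussian-copula case with toy data.

```python
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

def gmm_cdf(x, gmm):
    # CDF of a 1-D Gaussian mixture: weighted sum of component normal CDFs
    w = gmm.weights_
    mu = gmm.means_.ravel()
    sd = np.sqrt(gmm.covariances_.ravel())
    return np.sum(w * norm.cdf((x[:, None] - mu) / sd), axis=1)

rng = np.random.default_rng(11)
mag = rng.gamma(2.0, 30.0, size=1000)                 # toy ramp magnitudes
dur = 0.02 * mag + rng.gamma(2.0, 1.0, size=1000)     # correlated durations

u = []
for feat in (mag, dur):
    gmm = GaussianMixture(n_components=3, random_state=0).fit(feat[:, None])
    u.append(np.clip(gmm_cdf(feat, gmm), 1e-6, 1 - 1e-6))

z = norm.ppf(np.column_stack(u))                      # normal scores
rho = np.corrcoef(z.T)[0, 1]                          # Gaussian-copula correlation
print(rho)
```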

  11. Sensitivity of monthly streamflow forecasts to the quality of rainfall forcing: When do dynamical climate forecasts outperform the Ensemble Streamflow Prediction (ESP) method?

    NASA Astrophysics Data System (ADS)

    Tanguy, M.; Prudhomme, C.; Harrigan, S.; Smith, K. A.; Parry, S.

    2017-12-01

    Forecasting hydrological extremes is challenging, especially at lead times over 1 month for catchments with limited hydrological memory and variable climates. One simple way to derive monthly or seasonal hydrological forecasts is to use historical climate data to drive hydrological models using the Ensemble Streamflow Prediction (ESP) method. This gives a range of possible future streamflows given known initial hydrologic conditions alone. The degree of skill of ESP depends highly on the forecast initialisation month and catchment type. Using dynamical rainfall forecasts as driving data instead of historical data could potentially improve streamflow predictions. Much effort is being invested within the meteorological community to improve these forecasts. However, while recent progress shows promise (e.g. NAO in winter), the skill of these forecasts at monthly to seasonal timescales is generally still limited, and the extent to which they might lead to improved hydrological forecasts is an area of active research. Additionally, these meteorological forecasts are currently being produced at monthly or seasonal time-steps in the UK, whereas hydrological models require forcings at daily or sub-daily time-steps. Keeping in mind these limitations of available rainfall forecasts, the objectives of this study are to find out (i) how accurate monthly dynamical rainfall forecasts need to be to outperform ESP, and (ii) how the method used to disaggregate monthly rainfall forecasts into daily rainfall time series affects results. For the first objective, synthetic rainfall time series were created by increasingly degrading observed data (a proxy for a 'perfect forecast') from 0 % to +/-50 % error. For the second objective, three different methods were used to disaggregate monthly rainfall data into daily time series. These were used to force a simple lumped hydrological model (GR4J) to generate streamflow predictions at a one-month lead time for over 300 catchments representative of the range of the UK's hydro-climatic conditions. These forecasts were then benchmarked against the traditional ESP method. It is hoped that the results of this work will help the meteorological community to identify where to focus their efforts in order to increase the usefulness of their forecasts within hydrological forecasting systems.

  12. A Study of Subseasonal Predictability of the Atmospheric Circulation Low-frequency Modes based on SL-AV forecasts

    NASA Astrophysics Data System (ADS)

    Kruglova, Ekaterina; Kulikova, Irina; Khan, Valentina; Tischenko, Vladimir

    2017-04-01

    The subseasonal predictability of low-frequency modes and atmospheric circulation regimes is investigated based on outputs from the global semi-Lagrangian (SL-AV) model of the Hydrometcentre of Russia and the Institute of Numerical Mathematics of the Russian Academy of Sciences. Teleconnection indices (AO, WA, EA, NAO, EU, WP, PNA) are used as quantitative characteristics of low-frequency variability to identify zonal and meridional flow regimes, with a focus on the distribution of high-impact weather patterns in northern Eurasia. The predictability of weekly and monthly averaged indices is estimated by diagnostic verification of forecast and reanalysis data covering the hindcast period, and also with the use of the quantitative criteria recommended by the WMO. Characteristics of the low-frequency variability are discussed. In particular, it is revealed that the meridional flow regimes are reproduced by SL-AV better for the summer season than for the winter period. It is shown that the model's deterministic forecast (ensemble mean) skill at week 1 (days 1-7) is noticeably better than that of climatic forecasts. The decrease of skill scores at week 2 (days 8-14) and week 3 (days 15-21) is explained by deficiencies in the modeling system and inaccurate initial conditions. A slight improvement of model skill was noticed at week 4 (days 22-28), when the state of the atmosphere is determined more by the flow of energy from outside. The reliability of forecasts of monthly (days 1-30) averaged indices is comparable to that at week 1 (days 1-7). Numerical experiments demonstrated that the forecast accuracy can be improved (and thus the limit of practical predictability extended) through the use of a probabilistic approach based on ensemble forecasts. It is shown that the quality of forecasts of circulation regimes like blocking is higher than that of zonal flow.

  13. Tsunami Forecast Progress Five Years After Indonesian Disaster

    NASA Astrophysics Data System (ADS)

    Titov, Vasily V.; Bernard, Eddie N.; Weinstein, Stuart A.; Kanoglu, Utku; Synolakis, Costas E.

    2010-05-01

    Almost five years after the 26 December 2004 Indian Ocean tragedy, tsunami warnings are finally benefiting from decades of research toward effective model-based forecasts. Since the 2004 tsunami, two seminal advances have been (i) deep-ocean tsunami measurements with tsunameters and (ii) their use in accurately forecasting tsunamis after the tsunami has been generated. Using direct measurements of deep-ocean tsunami heights, assimilated into numerical models for specific locations, greatly improves real-time forecast accuracy over earthquake-derived magnitude estimates of tsunami impact. Since 2003, this method has been used to forecast tsunamis at specific harbors for different events in the Pacific and Indian Oceans. Recent tsunamis illustrated how this technology is being adopted in global tsunami warning operations. The U.S. forecasting system was used by both research and operations to evaluate the tsunami hazard. Tests demonstrated the effectiveness of operational tsunami forecasting using real-time deep-ocean data assimilated into forecast models. Several examples also showed the potential of distributed forecast tools. With IOC and USAID funding, NOAA researchers at PMEL developed the Community Model Interface for Tsunami (ComMIT) tool and distributed it through extensive capacity-building sessions in the Indian Ocean. Over one hundred scientists have been trained in tsunami inundation mapping, leading to the first generation of inundation models for many Indian Ocean shorelines. These same inundation models can also be used for real-time tsunami forecasts, as was demonstrated during several events.

  14. Application of clustering analysis in the prediction of photovoltaic power generation based on neural network

    NASA Astrophysics Data System (ADS)

    Cheng, K.; Guo, L. M.; Wang, Y. K.; Zafar, M. T.

    2017-11-01

    In order to select effective samples from many years of PV power generation data and improve the accuracy of PV power generation forecasting models, this paper studies the application of clustering analysis in this field and establishes a forecasting model based on a neural network. Considering three types of weather (sunny, cloudy, and rainy days), the research screens samples of historical data with the clustering analysis method. After screening, BP neural network prediction models are established using the screened data as training data. The six types of photovoltaic power generation prediction models before and after data screening are then compared. Results show that combining clustering analysis with BP neural networks is an effective way to improve the precision of photovoltaic power generation forecasting.
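
    A minimal sketch of this screen-then-train idea, assuming hypothetical daily weather features and PV output; k-means stands in for the paper's unspecified clustering method, and scikit-learn's MLPRegressor (a backpropagation-trained perceptron) stands in for its BP network.

```python
# Screen historical days by weather type, then train one network per type.
# All data here are random placeholders for the paper's (unspecified) inputs.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
weather_features = rng.random((365, 3))   # hypothetical daily weather descriptors
pv_output = rng.random(365)               # hypothetical daily PV generation

# Step 1: cluster historical days into three weather types (sunny/cloudy/rainy).
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(weather_features)

# Step 2: train one BP-style network per weather type on the screened samples.
models = {}
for c in range(3):
    mask = labels == c
    model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    model.fit(weather_features[mask], pv_output[mask])
    models[c] = model

# Step 3: forecast a new day with the model of its nearest weather type.
new_day = rng.random((1, 3))
cluster = kmeans.predict(new_day)[0]
print(models[cluster].predict(new_day))
```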

  15. Generating short-term probabilistic wind power scenarios via nonparametric forecast error density estimators

    DOE PAGES

    Staid, Andrea; Watson, Jean-Paul; Wets, Roger J.-B.; ...

    2017-07-11

    Forecasts of available wind power are critical in key electric power systems operations planning problems, including economic dispatch and unit commitment. Such forecasts are necessarily uncertain, limiting the reliability and cost effectiveness of operations planning models based on a single deterministic or “point” forecast. A common approach to address this limitation involves the use of a number of probabilistic scenarios, each specifying a possible trajectory of wind power production, with associated probability. We present and analyze a novel method for generating probabilistic wind power scenarios, leveraging available historical information in the form of forecasted and corresponding observed wind power time series. We estimate non-parametric forecast error densities, specifically using epi-spline basis functions, allowing us to capture the skewed and non-parametric nature of error densities observed in real-world data. We then describe a method to generate probabilistic scenarios from these basis functions that allows users to control for the degree to which extreme errors are captured. We compare the performance of our approach to the current state-of-the-art considering publicly available data associated with the Bonneville Power Administration, analyzing aggregate production of a number of wind farms over a large geographic region. Finally, we discuss the advantages of our approach in the context of specific power systems operations planning problems: stochastic unit commitment and economic dispatch. Our methodology is embodied in the joint Sandia – University of California Davis Prescient software package for assessing and analyzing stochastic operations strategies.
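
    The pipeline can be sketched compactly. The stand-in below replaces the paper's epi-spline density estimate with a Gaussian kernel density estimate but keeps the inverse-transform sampling idea; the error sample and point forecast are hypothetical.

```python
# Fit a nonparametric density to historical forecast errors, then sample
# scenarios by inverse-CDF lookup. KDE is a stand-in for epi-splines.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
# Hypothetical skewed historical errors (observed minus forecasted power, MW).
errors = rng.normal(0.0, 5.0, size=1000) + rng.gamma(2.0, 2.0, size=1000) - 4.0
point_forecast = 60.0  # MW, hypothetical

kde = gaussian_kde(errors)

# Build an empirical CDF on a grid and sample by inverse transform.
grid = np.linspace(errors.min(), errors.max(), 512)
cdf = np.cumsum(kde(grid))
cdf /= cdf[-1]

u = rng.random(20)                          # 20 scenarios
sampled_errors = np.interp(u, cdf, grid)    # inverse-CDF lookup
scenarios = point_forecast + sampled_errors
probabilities = np.full(20, 1 / 20)         # equally likely scenarios
print(scenarios.round(1), probabilities[0])
```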

  16. Generating short-term probabilistic wind power scenarios via nonparametric forecast error density estimators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Staid, Andrea; Watson, Jean-Paul; Wets, Roger J.-B.

    Forecasts of available wind power are critical in key electric power systems operations planning problems, including economic dispatch and unit commitment. Such forecasts are necessarily uncertain, limiting the reliability and cost effectiveness of operations planning models based on a single deterministic or “point” forecast. A common approach to address this limitation involves the use of a number of probabilistic scenarios, each specifying a possible trajectory of wind power production, with associated probability. We present and analyze a novel method for generating probabilistic wind power scenarios, leveraging available historical information in the form of forecasted and corresponding observed wind power time series. We estimate non-parametric forecast error densities, specifically using epi-spline basis functions, allowing us to capture the skewed and non-parametric nature of error densities observed in real-world data. We then describe a method to generate probabilistic scenarios from these basis functions that allows users to control for the degree to which extreme errors are captured. We compare the performance of our approach to the current state-of-the-art considering publicly available data associated with the Bonneville Power Administration, analyzing aggregate production of a number of wind farms over a large geographic region. Finally, we discuss the advantages of our approach in the context of specific power systems operations planning problems: stochastic unit commitment and economic dispatch. Our methodology is embodied in the joint Sandia – University of California Davis Prescient software package for assessing and analyzing stochastic operations strategies.

  17. Practical implementation of a particle filter data assimilation approach to estimate initial hydrologic conditions and initialize medium-range streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Clark, E.; Wood, A.; Nijssen, B.; Newman, A. J.; Mendoza, P. A.

    2016-12-01

    The System for Hydrometeorological Applications, Research and Prediction (SHARP), developed at the National Center for Atmospheric Research (NCAR), University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation, is a fully automated ensemble prediction system for short-term to seasonal applications. It incorporates uncertainty in initial hydrologic conditions (IHCs) and in hydrometeorological predictions. In this implementation, IHC uncertainty is estimated by propagating an ensemble of 100 plausible temperature and precipitation time series through the Sacramento/Snow-17 model. The forcing ensemble explicitly accounts for measurement and interpolation uncertainties in the development of gridded meteorological forcing time series. The resulting ensemble of derived IHCs exhibits a broad range of possible soil moisture and snow water equivalent (SWE) states. To select the IHCs that are most consistent with the observations, we employ a particle filter (PF) that weights IHC ensemble members based on observations of streamflow and SWE. These particles are then used to initialize ensemble precipitation and temperature forecasts downscaled from the Global Ensemble Forecast System (GEFS), generating a streamflow forecast ensemble. We test this method in two basins in the Pacific Northwest that are important for water resources management: 1) the Green River upstream of Howard Hanson Dam, and 2) the South Fork Flathead River upstream of Hungry Horse Dam. The first of these is characterized by mixed snow and rain, while the second is snow-dominated. The PF-based forecasts are compared to forecasts based on 1) a single IHC (corresponding to median streamflow) paired with the full GEFS ensemble, and 2) the full IHC ensemble, without filtering, paired with the full GEFS ensemble. In addition to assessing improvements in the spread of IHCs, we perform a hindcast experiment to evaluate the utility of PF-based data assimilation on streamflow forecasts at 1- to 7-day lead times.
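
    A sketch of the particle-filter weighting and resampling step at the core of this setup, assuming independent Gaussian observation errors for streamflow and SWE; the ensemble values and error standard deviations are hypothetical stand-ins for the 100 Sacramento/Snow-17 members.

```python
# Weight IHC ensemble members by the likelihood of streamflow and SWE
# observations, then resample in proportion to the weights.
import numpy as np

rng = np.random.default_rng(2)
n = 100
sim_flow = rng.normal(50.0, 10.0, n)   # each member's simulated streamflow
sim_swe = rng.normal(300.0, 40.0, n)   # each member's simulated SWE
obs_flow, obs_swe = 55.0, 310.0        # observations (hypothetical)
sigma_flow, sigma_swe = 5.0, 20.0      # assumed observation error std devs

# Log-likelihood of the observations given each particle's state.
log_w = (-0.5 * ((sim_flow - obs_flow) / sigma_flow) ** 2
         - 0.5 * ((sim_swe - obs_swe) / sigma_swe) ** 2)
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Resample members in proportion to their weights to initialize the forecast.
idx = rng.choice(n, size=n, p=w)
print("effective sample size:", 1.0 / np.sum(w ** 2))
```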

  18. Forecasting Consumer Adoption of Information Technology and Services--Lessons from Home Video Forecasting.

    ERIC Educational Resources Information Center

    Klopfenstein, Bruce C.

    1989-01-01

    Describes research that examined the strengths and weaknesses of technological forecasting methods by analyzing forecasting studies made for home video players. The discussion covers assessments and explications of correct and incorrect forecasting assumptions, and their implications for forecasting the adoption of home information technologies…

  19. Combining empirical approaches and error modelling to enhance predictive uncertainty estimation in extrapolation for operational flood forecasting. Tests on flood events on the Loire basin, France.

    NASA Astrophysics Data System (ADS)

    Berthet, Lionel; Marty, Renaud; Bourgin, François; Viatgé, Julie; Piotte, Olivier; Perrin, Charles

    2017-04-01

    An increasing number of operational flood forecasting centres assess the predictive uncertainty associated with their forecasts and communicate it to the end users. This information can match the end users' needs (i.e., prove useful for efficient crisis management) only if it is reliable: reliability is therefore a key quality for operational flood forecasts. In 2015, the French flood forecasting national and regional services (Vigicrues network; www.vigicrues.gouv.fr) implemented a framework to compute quantitative discharge and water level forecasts and to assess the predictive uncertainty. Among the possible technical options to achieve this goal, a statistical analysis of past forecasting errors of deterministic models was selected (QUOIQUE method, Bourgin, 2014). It is a data-based and non-parametric approach built on as few assumptions as possible about the mathematical structure of the forecasting error. In particular, a very simple assumption is made regarding the predictive uncertainty distributions for large events outside the range of the calibration data: the multiplicative error distribution is assumed to be constant, whatever the magnitude of the flood. Indeed, the predictive distributions may not be reliable in extrapolation. However, estimating the predictive uncertainty for these rare events is crucial when major floods are of concern. In order to improve forecast reliability for major floods, an attempt is made to combine the operational strength of the empirical statistical analysis with simple error modelling. Since the heteroscedasticity of forecast errors can considerably weaken the predictive reliability for large floods, this error modelling is based on the log-sinh transformation, which has been shown to significantly reduce the heteroscedasticity of the transformed error in a simulation context, even for flood peaks (Wang et al., 2012). Exploratory tests on some operational forecasts issued during the recent floods experienced in France (major spring floods in June 2016 on the Loire river tributaries and flash floods in fall 2016) will be shown and discussed. References Bourgin, F. (2014). How to assess the predictive uncertainty in hydrological modelling? An exploratory work on a large sample of watersheds, AgroParisTech. Wang, Q. J., Shrestha, D. L., Robertson, D. E. and Pokhrel, P. (2012). A log-sinh transformation for data normalization and variance stabilization. Water Resources Research, W05514, doi:10.1029/2011WR010973
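
    The log-sinh transformation cited above (Wang et al., 2012) has the form z = (1/b) ln(sinh(a + b*y)). A minimal sketch with hypothetical parameters a and b, which in practice are fitted to the data:

```python
# Log-sinh transform and its inverse, used to stabilize the variance of
# errors before modelling; a and b are illustrative, not fitted values.
import numpy as np

def log_sinh(y, a, b):
    """z = (1/b) * ln(sinh(a + b*y)); requires a + b*y > 0."""
    return np.log(np.sinh(a + b * y)) / b

def log_sinh_inverse(z, a, b):
    """Back-transform: y = (arcsinh(exp(b*z)) - a) / b."""
    return (np.arcsinh(np.exp(b * z)) - a) / b

flows = np.array([10.0, 50.0, 200.0, 800.0])  # hypothetical discharges
a, b = 0.1, 0.01
z = log_sinh(flows, a, b)
assert np.allclose(log_sinh_inverse(z, a, b), flows)  # round trip checks out
print(z.round(2))
```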

  20. Analysis and forecast of railway coal transportation volume based on BP neural network combined forecasting model

    NASA Astrophysics Data System (ADS)

    Xu, Yongbin; Xie, Haihong; Wu, Liuyi

    2018-05-01

    The share of coal in total railway freight volume is about 50%. As is widely acknowledged, the coal industry is sensitive to the economic situation and national policies, and coal transportation volume fluctuates significantly under the new economic normal. Grasping the overall development trend of the railway coal transportation market therefore has important reference and guidance value for decision-making in the railway and coal industries. By analyzing economic indicators and policy implications, this paper describes the trend of coal transportation volume, and further combines the economic indicators most highly correlated with coal transportation volume with a traditional traffic prediction model to establish a combined forecasting model based on a back-propagation neural network. Testing of the prediction errors shows that the method has higher accuracy and practical applicability.

  1. Real-time emergency forecasting technique for situation management systems

    NASA Astrophysics Data System (ADS)

    Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.

    2018-05-01

    The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision-making in situation management systems. Computational models are improved by the Improved Brown's method, which applies fractal dimension to forecast short time series of data received from sensors and control systems. The reliability of emergency forecasting results is ensured by filtering invalid sensed data using methods of correlation analysis.
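
    For orientation, classic Brown's double exponential smoothing, the base of the Improved Brown's method named above; the paper's refinement (choosing the smoothing constant from the series' fractal dimension) is not reproduced here, so a fixed alpha is assumed.

```python
# Brown's double exponential smoothing: two smoothing passes give a level
# and a trend, from which an h-step-ahead forecast is formed.
import numpy as np

def brown_forecast(x, alpha, horizon=1):
    s1 = s2 = x[0]
    for value in x:
        s1 = alpha * value + (1 - alpha) * s1   # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2      # second smoothing
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + horizon * trend

sensor_series = np.array([3.1, 3.4, 3.8, 4.1, 4.6, 5.0])  # hypothetical short series
print(brown_forecast(sensor_series, alpha=0.5, horizon=1))
```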

  2. A Delphi forecast of technology in education

    NASA Technical Reports Server (NTRS)

    Robinson, B. E.

    1973-01-01

    The results of a Delphi forecast of the utilization and social impacts of large-scale educational telecommunications technology are reported. The focus is on both forecasting methodology and educational technology. The various forecasting methods used by futurists are analyzed from the perspective of the most appropriate method for a prognosticator of educational technology, and a review and critical analysis of previous forecasts and studies are presented. Graphic responses, summarized comments, and a scenario of education in 1990 are presented.

  3. Gambling scores for earthquake predictions and forecasts

    NASA Astrophysics Data System (ADS)

    Zhuang, Jiancang

    2010-04-01

    This paper presents a new method, the gambling score, for evaluating the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some of his reputation points. The reference model, which plays the role of the house, determines how many reputation points the forecaster gains if he succeeds, according to a fair rule, and takes away the reputation points bet by the forecaster if he loses. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model, or the ETAS model and the reference model is the Poisson model.
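
    The fair rule can be made concrete: if the reference model assigns probability p0 to the event and the forecaster bets r points, a fair payout is r(1 - p0)/p0 on success and -r on failure, so the expected gain under the reference model is zero. A minimal sketch of the binary case, with hypothetical bets:

```python
# Gambling-score bookkeeping for binary forecasts under the fair rule.
def gambling_score(bets):
    """bets: list of (r, p0, occurred) tuples; returns total reputation change."""
    total = 0.0
    for r, p0, occurred in bets:
        total += r * (1.0 - p0) / p0 if occurred else -r
    return total

# Three hypothetical forecasts: points bet, reference probability, outcome.
history = [(1.0, 0.1, True), (1.0, 0.5, False), (2.0, 0.2, True)]
print(gambling_score(history))  # 9.0 - 1.0 + 8.0 = 16.0
```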

  4. Modeling method of time sequence model based grey system theory and application proceedings

    NASA Astrophysics Data System (ADS)

    Wei, Xuexia; Luo, Yaling; Zhang, Shiqiang

    2015-12-01

    This article presents a modeling method for the grey system GM(1,1) model based on reusing information and grey system theory. The method not only greatly enhances the fitting and predicting accuracy of the GM(1,1) model but also retains the conventional approach's merit of simple computation. On this basis, we give a syphilis trend forecasting method based on information reuse and the grey system GM(1,1) model.
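
    A sketch of the standard GM(1,1) construction underlying the method, on hypothetical data; the paper's reuse-of-information refinement is not reproduced.

```python
# Standard GM(1,1): accumulate the series, fit the grey differential
# equation by least squares, then difference the fitted accumulation.
import numpy as np

def gm11_forecast(x0, steps=1):
    n = len(x0)
    x1 = np.cumsum(x0)                           # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                # background values
    B = np.column_stack([-z1, np.ones(n - 1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]  # develop coefficient, grey input
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(x1_hat, prepend=x1_hat[0])  # back to the original series
    x0_hat[0] = x0[0]
    return x0_hat[-steps:]

cases = np.array([12.0, 13.5, 15.1, 17.0, 19.2])  # hypothetical yearly counts
print(gm11_forecast(cases, steps=2))
```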

  5. Cognitive methodology for forecasting oil and gas industry using pattern-based neural information technologies

    NASA Astrophysics Data System (ADS)

    Gafurov, O.; Gafurov, D.; Syryamkin, V.

    2018-05-01

    The paper analyses a field of computer science formed at the intersection of such areas of natural science as artificial intelligence, mathematical statistics, and database theory, referred to as "Data Mining" (knowledge discovery in data). The theory of neural networks is applied along with classical methods of mathematical analysis and numerical simulation. The paper describes the technique protected by a patent of the Russian Federation for the invention “A Method for Determining Location of Production Wells during the Development of Hydrocarbon Fields” [1–3] and implemented using the geoinformation system NeuroInformGeo; it has no analogues in domestic or international practice. The paper gives an example comparing a forecast of oil reservoir quality made by a geophysicist interpreter using standard methods with a forecast made using this technology. The technical result demonstrates increased efficiency, effectiveness, and ecological compatibility of mineral deposit development, including the discovery of a new oil deposit.

  6. Ocean Predictability and Uncertainty Forecasts Using the Local Ensemble Transform Kalman Filter (LETKF)

    NASA Astrophysics Data System (ADS)

    Wei, M.; Hogan, P. J.; Rowley, C. D.; Smedstad, O. M.; Wallcraft, A. J.; Penny, S. G.

    2017-12-01

    Ocean predictability and uncertainty are studied with an ensemble system developed from the US Navy's operational HYCOM using the Local Ensemble Transform Kalman Filter (LETKF) technique. One advantage of this method is that the best possible initial analysis states for the HYCOM forecasts are provided by the LETKF, which assimilates operational observations with an ensemble method. The background covariance during this assimilation process is supplied implicitly by the ensemble, avoiding the difficult task of developing tangent-linear and adjoint models of HYCOM, with its complicated hybrid isopycnal vertical coordinate, for 4D-Var. The flow-dependent background covariance from the ensemble will be an indispensable part of the next-generation hybrid 4D-Var/ensemble data assimilation system. The predictability and uncertainty of the ocean forecasts are studied initially for the Gulf of Mexico. The results are compared with another ensemble system using the Ensemble Transform (ET) method, which has been used in the Navy's operational center. The advantages and disadvantages of the two systems are discussed.
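
    A single-region sketch (no localization) of the LETKF analysis step in the formulation of Hunt et al. (2007), with small hypothetical arrays standing in for the HYCOM ensemble and the observations:

```python
# LETKF analysis in ensemble-weight space: the analysis mean and
# perturbations are linear combinations of background members.
import numpy as np

rng = np.random.default_rng(3)
k, n, p = 10, 50, 5                    # ensemble size, state dim, obs count
Xb = rng.normal(size=(n, k))           # background ensemble (state x members)
H = np.zeros((p, n)); H[np.arange(p), np.arange(p)] = 1.0  # linear obs operator
R = 0.5 * np.eye(p)                    # obs error covariance
yo = rng.normal(size=p)                # observations

xb_mean = Xb.mean(axis=1)
Xp = Xb - xb_mean[:, None]             # state perturbations
Yb = H @ Xb
yb_mean = Yb.mean(axis=1)
Yp = Yb - yb_mean[:, None]             # obs-space perturbations

C = Yp.T @ np.linalg.inv(R)
Pa_tilde = np.linalg.inv((k - 1) * np.eye(k) + C @ Yp)
w_mean = Pa_tilde @ C @ (yo - yb_mean)

# Symmetric square root gives the analysis perturbation weights.
evals, evecs = np.linalg.eigh((k - 1) * Pa_tilde)
Wa = evecs @ np.diag(np.sqrt(evals)) @ evecs.T

Xa = xb_mean[:, None] + Xp @ (w_mean[:, None] + Wa)   # analysis ensemble
print(Xa.shape)
```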

  7. Near-field hazard assessment of March 11, 2011 Japan Tsunami sources inferred from different methods

    USGS Publications Warehouse

    Wei, Y.; Titov, V.V.; Newman, A.; Hayes, G.; Tang, L.; Chamberlin, C.

    2011-01-01

    The tsunami source is the origin of the subsequent transoceanic water waves and thus the most critical component in modern tsunami forecast methodology. Although impractical to quantify directly, a tsunami source can be estimated by different methods based on a variety of measurements provided by deep-ocean tsunameters, seismometers, GPS, and other advanced instruments, some in real time, some in post real-time. Here we assess different sources of the devastating March 11, 2011 Japan tsunami by model-data comparison for generation, propagation, and inundation in the near field of Japan. This comparative study furthers understanding of the advantages and shortcomings of different methods that may potentially be used in real-time warning and forecasting of tsunami hazards, especially in the near field. The model study also highlights the critical role of deep-ocean tsunami measurements for high-quality tsunami forecasts; their combination with land GPS measurements may lead to a better understanding of both earthquake mechanisms and the tsunami generation process. © 2011 MTS.

  8. Day-Ahead Short-Term Forecasting of Electricity Load via Approximation

    NASA Astrophysics Data System (ADS)

    Khamitov, R. N.; Gritsay, A. S.; Tyunkov, D. A.; E Sinitsin, G.

    2017-04-01

    A method of short-term forecasting of power consumption is offered. The model is based on a sinusoidal function describing the day and night cycles of power consumption. The function coefficients (period and amplitude) are tuned adaptively, accounting for the dynamics of power consumption with the use of an artificial neural network. The presented results are tested on real retrospective data from a power supply company. The method can be especially useful when there is no opportunity to collect interval readings from consumers' metering devices and the power supply company operates with electrical supply points. The method can be used by any power supply company purchasing electric power in the wholesale market. For this purpose, it is necessary to obtain the approximation coefficients of the sinusoidal function and to have retrospective power consumption data over an interval of at least one year.
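
    A sketch of the basic sinusoidal approximation: amplitude and phase of a 24-hour cycle are fitted by linear least squares to synthetic hourly load; the paper's adaptive, neural-network-based tuning of the coefficients is not reproduced.

```python
# Fit y = c0 + c1*sin(wt) + c2*cos(wt) with w = 2*pi/24, then recover
# amplitude and phase; the load series is a synthetic stand-in.
import numpy as np

hours = np.arange(24 * 7)                                   # one week, hourly
load = 100 + 20 * np.sin(2 * np.pi * hours / 24 - 1.0) \
       + np.random.default_rng(4).normal(0, 3, hours.size)  # synthetic load, MW

w = 2 * np.pi / 24
A = np.column_stack([np.ones_like(hours, dtype=float),
                     np.sin(w * hours), np.cos(w * hours)])
c0, c1, c2 = np.linalg.lstsq(A, load, rcond=None)[0]

amplitude = np.hypot(c1, c2)                 # c1*sin + c2*cos = A*sin(wt + phi)
phase = np.arctan2(c2, c1)
next_day = np.arange(24 * 7, 24 * 8)
forecast = c0 + amplitude * np.sin(w * next_day + phase)
print(round(amplitude, 2), forecast[:4].round(1))
```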

  9. A data-driven SVR model for long-term runoff prediction and uncertainty analysis based on the Bayesian framework

    NASA Astrophysics Data System (ADS)

    Liang, Zhongmin; Li, Yujie; Hu, Yiming; Li, Binquan; Wang, Jun

    2017-06-01

    Accurate and reliable long-term forecasting plays an important role in water resources management and utilization. In this paper, a hybrid model called SVR-HUP is presented to predict long-term runoff and quantify the prediction uncertainty. The model is created in three steps. First, appropriate predictors are selected according to the correlations between meteorological factors and runoff. Second, a support vector regression (SVR) model is structured and optimized based on the LibSVM toolbox and a genetic algorithm. Finally, using forecasted and observed runoff, a hydrologic uncertainty processor (HUP) based on a Bayesian framework is used to estimate the posterior probability distribution of the simulated values, and the associated prediction uncertainty is quantitatively analyzed. Six precision evaluation indexes, including the correlation coefficient (CC), relative root mean square error (RRMSE), relative error (RE), mean absolute percentage error (MAPE), Nash-Sutcliffe efficiency (NSE), and qualification rate (QR), are used to measure the prediction accuracy. As a case study, the proposed approach is applied in the Han River basin, South Central China. Three types of SVR models are established to forecast the monthly, flood season, and annual runoff volumes. The results indicate that SVR yields satisfactory accuracy and reliability at all three scales. In addition, the results suggest that the HUP can not only quantify the prediction uncertainty based on a confidence interval but also provide a more accurate single-value prediction than the initial SVR forecasting result. Thus, the SVR-HUP model provides an alternative method for long-term runoff forecasting.

  10. Long-Range Forecasting Of The Onset Of Southwest Monsoon Winds And Waves Near The Horn Of Africa

    DTIC Science & Technology

    2017-12-01

    [Fragmentary record] Summary of climate analysis and long-range forecast methodology: prior theses from Heidt (2006) and Lemke (2010) used methods similar to ours. The remainder of the record is a fragment of the table of contents (II. Data and Methods; Analysis and Forecast Methods; Predictand Selection).

  11. Short-term forecasting of meteorological time series using Nonparametric Functional Data Analysis (NPFDA)

    NASA Astrophysics Data System (ADS)

    Curceac, S.; Ternynck, C.; Ouarda, T.

    2015-12-01

    Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded over a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach lies in expressing the data as curves. In the present work, the focus is on daily forecasting, and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The estimator is computed using an asymmetrical quadratic kernel function for local weighting, with the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied, and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) Information Criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS, and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
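
    A sketch of a functional kernel regression forecast of this kind: past daily curves are weighted by the asymmetrical quadratic kernel K(u) = (1 - u^2) on [0, 1), applied here to a plain L2 distance between curves as a simplification of the derivative- and FPCA-based semi-metrics used in the study; data and bandwidth are hypothetical.

```python
# Nadaraya-Watson-type functional regression: weight each historical daily
# curve by its distance to today's curve, then average the responses.
import numpy as np

def functional_kernel_forecast(past_curves, responses, query_curve, h):
    d = np.sqrt(((past_curves - query_curve) ** 2).sum(axis=1))  # L2 semi-metric
    u = d / h
    K = np.where(u < 1, 1 - u ** 2, 0.0)    # asymmetrical quadratic kernel
    if K.sum() == 0:
        return responses.mean()             # bandwidth too small: fall back
    return (K * responses).sum() / K.sum()

rng = np.random.default_rng(6)
curves = rng.normal(25, 3, size=(365, 24))    # a year of daily 24-hour curves
X, y = curves[:-1], curves[1:].mean(axis=1)   # predict next day's daily mean
print(functional_kernel_forecast(X, y, curves[-1], h=25.0))
```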

  12. Generational forecasting in academic medicine: a unique method of planning for success in the next two decades.

    PubMed

    Howell, Lydia Pleotis; Joad, Jesse P; Callahan, Edward; Servis, Gregg; Bonham, Ann C

    2009-08-01

    Multigenerational teams are essential to the missions of academic health centers (AHCs). Generational forecasting using Strauss and Howe's predictive model, "the generational diagonal," can be useful for anticipating and addressing issues so that each generation is effective. Forecasts are based on the observation that cyclical historical events are experienced by all generations, but the response of each generation differs according to its phase of life and previous defining experiences. This article relates Strauss and Howe's generational forecasts to AHCs. Predicted issues such as work-life balance, indebtedness, and succession planning have existed previously, but they now have different causes or consequences because of the unique experiences and life stages of current generations. Efforts to address these issues at the authors' AHC include a work-life balance workgroup, expanded leave, and intramural grants.

  13. The use of snowcovered area in runoff forecasts

    NASA Technical Reports Server (NTRS)

    Rango, A.; Hannaford, J. F.; Hall, R. L.; Rosenzweig, M.; Brown, A. J.

    1977-01-01

    Long-term snowcovered area data from aircraft and satellite observations have proven useful in reducing seasonal runoff forecast error on the Kern river watershed. Similar use of snowcovered area on the Kings river watershed produced results that were about equivalent to methods based solely on conventional data. Snowcovered area will be most effective in reducing forecast procedural error on watersheds with: (1) a substantial amount of area within a limited elevation range; (2) an erratic precipitation and/or snowpack accumulation pattern not strongly related to elevation; and (3) poor coverage by precipitation stations or snow courses restricting adequate indexing of water supply conditions. When satellite data acquisition and delivery problems are resolved, the derived snowcover information should provide a means for enhancing operational streamflow forecasts for areas that depend primarily on snowmelt for their water supply.

  14. ElEvoHI: A Novel CME Prediction Tool for Heliospheric Imaging Combining an Elliptical Front with Drag-based Model Fitting

    NASA Astrophysics Data System (ADS)

    Rollett, T.; Möstl, C.; Isavnin, A.; Davies, J. A.; Kubicka, M.; Amerstorfer, U. V.; Harrison, R. A.

    2016-06-01

    In this study, we present a new method for forecasting arrival times and speeds of coronal mass ejections (CMEs) at any location in the inner heliosphere. This new approach enables the adoption of a highly flexible geometrical shape for the CME front with an adjustable CME angular width and an adjustable radius of curvature of its leading edge, i.e., the assumed geometry is elliptical. Using, as input, Solar TErrestrial RElations Observatory (STEREO) heliospheric imager (HI) observations, a new elliptic conversion (ElCon) method is introduced and combined with drag-based model (DBM) fitting to quantify the deceleration or acceleration experienced by CMEs during propagation. The result is then used as input for the Ellipse Evolution Model (ElEvo). Together, ElCon, DBM fitting, and ElEvo form the novel ElEvoHI forecasting utility. To demonstrate the applicability of ElEvoHI, we forecast the arrival times and speeds of 21 CMEs remotely observed from STEREO/HI and compare them to in situ arrival times and speeds at 1 AU. Compared to the commonly used STEREO/HI fitting techniques (Fixed-ϕ, Harmonic Mean, and Self-similar Expansion fitting), ElEvoHI improves the arrival time forecast by about 2 hr, to ±6.5 hr, and the arrival speed forecast by ≈250 km s-1, to ±53 km s-1, depending on the ellipse aspect ratio assumed. In particular, the remarkable improvement of the arrival speed prediction is potentially beneficial for predicting geomagnetic storm strength at Earth.
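
    The drag-based model used in the DBM-fitting stage has a well-known analytic solution (Vrsnak et al., 2013) of dv/dt = -gamma (v - w)|v - w|; a sketch with typical, not the paper's, parameter values:

```python
# Analytic DBM kinematics: a CME decelerates (or accelerates) toward the
# ambient solar-wind speed w under an aerodynamic-drag-like force.
import numpy as np

def dbm(t, r0, v0, w, gamma):
    """Heliocentric distance (km) and speed (km/s) at time t (s)."""
    s = np.sign(v0 - w)
    arg = 1.0 + s * gamma * (v0 - w) * t
    v = w + (v0 - w) / arg
    r = r0 + w * t + s / gamma * np.log(arg)
    return r, v

AU = 1.496e8                     # km
r0, v0 = 20 * 6.96e5, 1000.0     # start at 20 solar radii, 1000 km/s (typical)
w, gamma = 400.0, 0.2e-7         # ambient wind speed, drag parameter (1/km)

t = np.linspace(0, 5 * 86400, 500)           # five days of propagation
r, v = dbm(t, r0, v0, w, gamma)
hit = np.argmax(r >= AU)                      # first sample at/after 1 AU
print(f"arrival after {t[hit]/3600:.1f} h at {v[hit]:.0f} km/s")
```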

  15. Increased Accuracy in Statistical Seasonal Hurricane Forecasting

    NASA Astrophysics Data System (ADS)

    Nateghi, R.; Quiring, S. M.; Guikema, S. D.

    2012-12-01

    Hurricanes are among the costliest and most destructive natural hazards in the U.S. Accurate hurricane forecasts are crucial to optimal preparedness and mitigation decisions in the U.S., where 50 percent of the population lives within 50 miles of the coast. We developed a flexible statistical approach to forecast the annual number of hurricanes in the Atlantic region during the hurricane season. Our model is based on the Random Forest method and captures the complex relationship between hurricane activity and climatic conditions through careful variable selection, model testing, and validation. We used the National Hurricane Center's Best Track hurricane data from 1949-2011 and sixty-one candidate climate descriptors to develop our model. The model includes information prior to the hurricane season, i.e., from the last three months of the previous year (Oct. through Dec.) and the first five months of the current year (January through May). Our forecast errors are substantially lower than those of other leading forecasts, such as that of the National Oceanic and Atmospheric Administration (NOAA).

  16. Assessing the Impact of Observations on Numerical Weather Forecasts Using the Adjoint Method

    NASA Technical Reports Server (NTRS)

    Gelaro, Ronald

    2012-01-01

    The adjoint of a data assimilation system provides a flexible and efficient tool for estimating observation impacts on short-range weather forecasts. The impacts of any or all observations can be estimated simultaneously based on a single execution of the adjoint system. The results can be easily aggregated according to data type, location, channel, etc., making this technique especially attractive for examining the impacts of new hyper-spectral satellite instruments and for conducting regular, even near-real-time, monitoring of the entire observing system. This talk provides a general overview of the adjoint method, including the theoretical basis and practical implementation of the technique. Results are presented from the adjoint-based observation impact monitoring tool in NASA's GEOS-5 global atmospheric data assimilation and forecast system. When performed in conjunction with standard observing system experiments (OSEs), the adjoint results reveal both redundancies and dependencies between observing system impacts as observations are added or removed from the assimilation system. Understanding these dependencies may be important for optimizing the use of the current observational network and defining requirements for future observing systems.

  17. Score Matrix for HWBI Forecast Model

    EPA Pesticide Factsheets

    2000-2010 Annual State-Scale Service and Domain scores used to support the approach for forecasting EPA's Human Well-Being Index. A modeling approach was developed based on relationship-function equations derived from select economic, social, and ecosystem final goods and service scores and the calculated human well-being index and related domain scores. These data are being used in a secondary capacity. The foundational data and scoring techniques were originally described in: a) U.S. EPA. 2012. Indicators and Methods for Constructing a U.S. Human Well-being Index (HWBI) for Ecosystem Services Research. Report. EPA/600/R-12/023. pp. 121; b) U.S. EPA. 2014. Indicators and Methods for Evaluating Economic, Ecosystem and Social Services Provisioning. Report. EPA/600/R-14/184. pp. 174; and c) Smith, L. M., Harwell, L. C., Summers, J. K., Smith, H. M., Wade, C. M., Straub, K. R. and J. L. Case (2014). This dataset is associated with the following publication: Summers, K., L. Harwell, and L. Smith. A Model For Change: An Approach for Forecasting Well-Being From Service-Based Decisions. Ecological Indicators. Elsevier Science Ltd, New York, NY, USA, 69: 295-309, (2016).

  18. Diabatic forcing and initialization with assimilation of cloud water and rainwater in a forecast model

    NASA Technical Reports Server (NTRS)

    Raymond, William H.; Olson, William S.; Callan, Geary

    1995-01-01

    In this study, diabatic forcing and liquid water assimilation techniques are tested in a semi-implicit hydrostatic regional forecast model containing explicit representations of grid-scale cloud water and rainwater. Diabatic forcing, in conjunction with diabatic contributions in the initialization, is found to help the forecast retain the diabatic signal found in the liquid water or heating rate data, consequently reducing the spinup time associated with grid-scale precipitation processes. Both observational Special Sensor Microwave/Imager (SSM/I) and model-generated data are used. A physical retrieval method incorporating SSM/I radiance data is utilized to estimate the 3D distribution of precipitating storms. In the retrieval method, the relationship between precipitation distributions and upwelling microwave radiances is parameterized based upon cloud ensemble-radiative model simulations. Regression formulae relating vertically integrated liquid and ice-phase precipitation amounts to latent heating rates are also derived from the cloud ensemble simulations. Thus, retrieved SSM/I precipitation structures can be used in conjunction with the regression formulae to infer the 3D distribution of latent heating rates. These heating rates are used directly in the forecast model to help initiate Tropical Storm Emily (21 September 1987). The 14-h forecast of Emily's development yields atmospheric precipitation water contents that compare favorably with coincident SSM/I estimates.

  19. Using modified fruit fly optimisation algorithm to perform the function test and case studies

    NASA Astrophysics Data System (ADS)

    Pan, Wen-Tsao

    2013-06-01

    Evolutionary computation is a computing paradigm established by simulating natural evolutionary processes based on Darwinian theory, and it is a common research method. The main contribution of this paper is to strengthen the search for the optimized solution in the fruit fly optimization algorithm (FOA), in order to avoid convergence to local extrema. Evolutionary computation has grown to include concepts of animal foraging behaviour and group behaviour. This study discusses three common evolutionary computation methods and compares them with the modified fruit fly optimization algorithm (MFOA). It further investigates their ability to compute extreme values of three mathematical functions, as well as algorithm execution speed and the forecasting ability of models built using optimized general regression neural network (GRNN) parameters. The findings indicate no obvious difference between particle swarm optimization and the MFOA in the ability to compute extreme values; both, however, were better than the artificial fish swarm algorithm and the FOA. In addition, the MFOA outperformed particle swarm optimization in execution speed, and the forecasting ability of the model built using the MFOA's GRNN parameters was better than that of the other three forecasting models.

  20. Mining key elements for severe convection prediction based on CNN

    NASA Astrophysics Data System (ADS)

    Liu, Ming; Pan, Ning; Zhang, Changan; Sha, Hongzhou; Zhang, Bolei; Liu, Liang; Zhang, Meng

    2017-04-01

    Severe convective weather is a type of weather disaster accompanied by heavy rainfall, gusty wind, hail, etc. Along with recent developments in remote sensing and numerical modeling, high-volume, long-term observational and modeling data have been accumulated that capture massive numbers of severe convective events over particular areas and time periods. With those high-volume, high-variety weather data, most existing studies and methods address the dynamical laws, cause analysis, potential rules, and prediction enhancement by utilizing the governing equations of fluid dynamics and thermodynamics. In this study, a key-element mining method is proposed for severe convection prediction based on a convolutional neural network (CNN). It aims to identify the key areas and key elements from huge amounts of historical weather data, including conventional measurements, weather radar, and satellite data, as well as numerical modeling and/or reanalysis data. In this manner, the machine-learning-based method can help human forecasters with their decision-making on operational forecasts of severe convective weather by extracting key information from real-time and historical weather big data. The method first utilizes computer vision technology to preprocess the meteorological variables. Then, it uses information such as radar maps and expert knowledge to annotate all images automatically. Finally, using the CNN model, it can analyze and evaluate each weather element (e.g., particular variables, patterns, features, etc.), identify key areas of those critical weather elements, and help forecasters quickly screen out the key elements from huge amounts of observation data under current weather conditions. Based on rich weather measurement and model data (up to 10 years) over Fujian province in China, where severe convective weather is very active during the summer months, experimental tests were conducted with the new machine-learning method via CNN models. Based on the analysis of the experimental results and case studies, the proposed method has the following benefits for severe convection prediction: (1) it helps forecasters narrow down the scope of analysis and saves lead time for high-impact severe convection; (2) it processes huge amounts of weather big data with machine learning methods rather than relying solely on traditional theory and knowledge, providing a new way to explore and quantify severe convective weather; and (3) it provides machine-learning-based end-to-end analysis and processing with considerable scalability in data volume, accomplishing the analysis without human intervention.
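
    A minimal sketch of a CNN classifier of the kind described, taking a stack of gridded meteorological variables as input channels; the architecture, grid size, and channel count are assumptions, not the paper's configuration.

```python
# Tiny CNN mapping a (variables x lat x lon) field stack to a
# convection / no-convection decision.
import torch
import torch.nn as nn

class ConvectionCNN(nn.Module):
    def __init__(self, in_channels=8, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),       # collapse to one value per filter
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):                  # x: (batch, channels, lat, lon)
        h = self.features(x).flatten(1)
        return self.classifier(h)

model = ConvectionCNN()
fields = torch.randn(4, 8, 64, 64)         # 4 samples, 8 variables, 64x64 grid
logits = model(fields)
print(logits.shape)                         # torch.Size([4, 2])
```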

  1. Graphic comparison of reserve-growth models for conventional oil and gas accumulations

    USGS Publications Warehouse

    Klett, T.R.

    2003-01-01

    The U.S. Geological Survey (USGS) periodically assesses crude oil, natural gas, and natural gas liquids resources of the world. The assessment procedure requires estimated recoverable oil and natural gas volumes (field size, cumulative production plus remaining reserves) in discovered fields. Because initial reserves are typically conservative, subsequent estimates increase through time as these fields are developed and produced. The USGS assessment of petroleum resources makes estimates, or forecasts, of the potential additions to reserves in discovered oil and gas fields resulting from field development, and it also estimates the potential fully developed sizes of undiscovered fields. The term “reserve growth” refers to the commonly observed upward adjustment of reserve estimates. Because such additions are related to increases in the total size of a field, the USGS uses field sizes to model reserve growth. Future reserve growth in existing fields is a major component of remaining U.S. oil and natural gas resources and has therefore become a necessary element of U.S. petroleum resource assessments. Past and currently proposed reserve-growth models compared herein aid in the selection of a suitable set of forecast functions to provide an estimate of potential additions to reserves from reserve growth in the ongoing National Oil and Gas Assessment Project (NOGA). Reserve growth is modeled by construction of a curve that represents annual fractional changes of recoverable oil and natural gas volumes (for fields and reservoirs), which provides growth factors. Growth factors are used to calculate forecast functions, which are sets of field- or reservoir-size multipliers. Comparisons of forecast functions were made based on datasets used to construct the models, field type, modeling method, and length of forecast span. Comparisons were also made between forecast functions based on field-level and reservoir-level growth, and between forecast functions based on older and newer data. The reserve-growth model used in the 1995 USGS National Assessment and the model currently used in the NOGA project provide forecast functions that yield similar estimates of potential additions to reserves. Both models are based on the Oil and Gas Integrated Field File from the Energy Information Administration (EIA), but different vintages of data (from 1977 through 1991 and 1977 through 1996, respectively). The model based on newer data can be used in place of the previous model, providing similar estimates of potential additions to reserves. Forecast functions for oil fields vary little from those for gas fields in these models; therefore, a single function may be used for both oil and gas fields, like that used in the USGS World Petroleum Assessment 2000. Forecast functions based on the field-level reserve growth model derived from the NRG Associates databases (from 1982 through 1998) differ from those derived from EIA databases (from 1977 through 1996). However, the difference may not be enough to preclude the use of the forecast functions derived from NRG data in place of the forecast functions derived from EIA data. Should the model derived from NRG data be used, separate forecast functions for oil fields and gas fields must be employed. The forecast function for oil fields from the model derived from NRG data varies significantly from that for gas fields, and a single function for both oil and gas fields may not be appropriate.

  2. A weather-driven model of malaria transmission

    PubMed Central

    Hoshen, Moshe B; Morse, Andrew P

    2004-01-01

    Background Climate is a major driving force behind malaria transmission and climate data are often used to account for the spatial, seasonal and interannual variation in malaria transmission. Methods This paper describes a mathematical-biological model of the parasite dynamics, comprising both the weather-dependent within-vector stages and the weather-independent within-host stages. Results Numerical evaluations of the model in both time and space show that it qualitatively reconstructs the prevalence of infection. Conclusion A process-based modelling structure has been developed that may be suitable for the simulation of malaria forecasts based on seasonal weather forecasts. PMID:15350206

  3. Gas demand forecasting by a new artificial intelligent algorithm

    NASA Astrophysics Data System (ADS)

    Khatibi. B, Vahid; Khatibi, Elham

    2012-01-01

    Energy demand forecasting is a key issue for consumers and generators in all energy markets in the world. This paper presents a new forecasting algorithm for daily gas demand prediction. The algorithm combines a wavelet transform with forecasting models such as the multi-layer perceptron (MLP), linear regression, or GARCH. The proposed method is applied to real data from the UK gas markets to evaluate its performance. The results show that forecasting accuracy is improved significantly by the proposed method.

  4. Statistical evaluation of forecasts

    NASA Astrophysics Data System (ADS)

    Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn

    2014-08-01

    Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.

  5. A new solar power output prediction based on hybrid forecast engine and decomposition model.

    PubMed

    Zhang, Weijiang; Dang, Hongshe; Simoes, Rolando

    2018-06-12

    Given the growing role of photovoltaic (PV) energy as a clean energy source in electrical networks and its uncertain nature, PV energy prediction has been studied by researchers in recent decades. This problem directly affects operation of the power network, and due to the high volatility of the signal, an accurate prediction model is demanded. A new prediction model based on the Hilbert-Huang transform (HHT), integrating improved empirical mode decomposition (IEMD) with feature selection and a forecast engine, is presented in this paper. The proposed approach is divided into three main sections. In the first section, the signal is decomposed by the proposed IEMD as an accurate decomposition tool. To increase the accuracy of the proposed method, a new interpolation method is used instead of cubic spline curve (CSC) fitting in EMD. The obtained output is then entered into the new feature selection procedure to choose the best candidate inputs. Finally, the signal is predicted by a hybrid forecast engine composed of support vector regression (SVR) tuned by an intelligent algorithm. The effectiveness of the proposed approach has been verified on a number of real-world engineering test cases in comparison with other well-known models. The obtained results prove the validity of the proposed method. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  6. Forecasting quantities of disused household CRT appliances--a regional case study approach and its application to Baden-Württemberg.

    PubMed

    Walk, Wolfgang

    2009-02-01

    Due to special requirements regarding logistics and recycling, disused cathode ray tube (CRT) appliances are handled in some countries as a separate waste fraction. This article presents a forecast of future household waste CRT quantities based on the past and present equipment of households with television sets and computer monitors. Additional aspects taken into consideration are the product life time distribution and the ongoing change in display technology. Although CRT technology is fading out, the findings of this forecast show that quantities of waste CRT appliances will not decrease before 2012 in Baden-Württemberg, Germany. The results of this regional case study are not quantitatively transferable without further analysis. The method provided allows analysts to consider how the time shift between production and discard could impact recycling options, and the method could be valuable for future similar analyses elsewhere.

  7. Detection and forecasting of oyster norovirus outbreaks: recent advances and future perspectives.

    PubMed

    Wang, Jiao; Deng, Zhiqiang

    2012-09-01

    Norovirus is a highly infectious pathogen that is commonly found in oysters growing in fecally contaminated waters. Norovirus outbreaks can cause the closure of oyster harvesting waters and acute gastroenteritis in humans associated with consumption of contaminated raw oysters. Extensive efforts and progresses have been made in detection and forecasting of oyster norovirus outbreaks over the past decades. The main objective of this paper is to provide a literature review of methods and techniques for detecting and forecasting oyster norovirus outbreaks and thereby to identify the future directions for improving the detection and forecasting of norovirus outbreaks. It is found that (1) norovirus outbreaks display strong seasonality with the outbreak peak occurring commonly in December-March in the U.S. and April-May in the Europe; (2) norovirus outbreaks are affected by multiple environmental factors, including but not limited to precipitation, temperature, solar radiation, wind, and salinity; (3) various modeling approaches may be employed to forecast norovirus outbreaks, including Bayesian models, regression models, Artificial Neural Networks, and process-based models; and (4) diverse techniques are available for near real-time detection of norovirus outbreaks, including multiplex PCR, seminested PCR, real-time PCR, quantitative PCR, and satellite remote sensing. The findings are important to the management of oyster growing waters and to future investigations into norovirus outbreaks. It is recommended that a combined approach of sensor-assisted real time monitoring and modeling-based forecasting should be utilized for an efficient and effective detection and forecasting of norovirus outbreaks caused by consumption of contaminated oysters. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Forecasting in foodservice: model development, testing, and evaluation.

    PubMed

    Miller, J L; Thompson, P A; Orabella, M M

    1991-05-01

    This study was designed to develop, test, and evaluate mathematical models appropriate for forecasting menu-item production demand in foodservice. Data were collected from residence and dining hall foodservices at Ohio State University. Objectives of the study were to collect, code, and analyze the data; develop and test models using actual operation data; and compare forecasting results with current methods in use. Customer count was forecast using deseasonalized simple exponential smoothing. Menu-item demand was forecast by multiplying the count forecast by a predicted preference statistic. Forecasting models were evaluated using mean squared error, mean absolute deviation, and mean absolute percentage error techniques. All models were more accurate than current methods. A broad spectrum of forecasting techniques could be used by foodservice managers with access to a personal computer and spread-sheet and database-management software. The findings indicate that mathematical forecasting techniques may be effective in foodservice operations to control costs, increase productivity, and maximize profits.

  9. Regional PV power estimation and forecast to mitigate the impact of high photovoltaic penetration on electric grid.

    NASA Astrophysics Data System (ADS)

    Pierro, Marco; De Felice, Matteo; Maggioni, Enrico; Moser, David; Perotto, Alessandro; Spada, Francesco; Cornaro, Cristina

    2017-04-01

    The growing photovoltaic generation results in stochastic variability of the electric demand that could compromise the stability of the grid and increase the amount of energy reserve and the energy imbalance cost. On the regional scale, solar power estimation and forecasting are becoming essential for Distribution System Operators (DSOs), Transmission System Operators (TSOs), energy traders, and aggregators of generation. Indeed, the estimation of regional PV power can be used for PV power supervision and real-time control of the residual load. Mid-term PV power forecasts can be employed for transmission scheduling, to reduce the energy imbalance and the related penalty costs, and for residual load tracking, trading optimization, and secondary energy reserve assessment. In this context, a new upscaling method was developed and used for estimation and mid-term forecasting of the distributed photovoltaic generation in a small area in the north of Italy under the control of a local DSO. The method was based on spatial clustering of the PV fleet and neural network models that take satellite or numerical weather prediction data (centered on cluster centroids) as input to estimate or predict the regional solar generation. It requires low computational effort, and very little input information has to be provided by users. The power estimation model achieved an RMSE of 3% of installed capacity. Intra-day forecasts (1 to 4 hours ahead) obtained an RMSE of 5%-7%, while the one- and two-day forecasts achieved RMSEs of 7% and 7.5%. A model to estimate the forecast error and the prediction intervals was also developed. The photovoltaic production in the considered region provided 6.9% of the electric consumption in 2015. Since this PV penetration is very similar to the one observed at the national level (7.9%), this is a good case study for analysing the impact of PV generation on the electric grid and the effects of PV power forecasting on transmission scheduling and on secondary reserve estimation. It appears that, already at 7% PV penetration, the distributed PV generation can have a great impact both on the DSO energy need and on the transmission scheduling capability. Indeed, for some hours of the day in summer, the photovoltaic generation can provide from 50% to 75% of the energy that the local DSO has to buy from the Italian TSO to cover the electrical demand. Moreover, the mid-term forecast can reduce the annual energy imbalance between the scheduled transmission and the actual one from 10% of the TSO energy supply (without considering the PV forecast) to 2%. Furthermore, it was shown that prediction intervals can be used not only to estimate the probability of a specific PV generation bid on the energy market, but also to reduce the energy reserve predicted for the next day. Two different methods for energy reserve estimation were developed and tested. The first is based on a clear-sky model, while the second makes use of the PV prediction intervals at the 95% confidence level. The latter reduces the amount of day-ahead energy reserve by 36% with respect to the clear-sky method.

  10. Modeling the heterogeneous traffic correlations in urban road systems using traffic-enhanced community detection approach

    NASA Astrophysics Data System (ADS)

    Lu, Feng; Liu, Kang; Duan, Yingying; Cheng, Shifen; Du, Fei

    2018-07-01

    A better characterization of the traffic influence among urban roads is crucial for traffic control and traffic forecasting. Spatial heterogeneity strongly influences the modeling of the extent and degree of road traffic correlation, which is usually neglected by traditional distance-based methods. In this paper, we propose a traffic-enhanced community detection approach to spatially reveal the traffic correlation in city road networks. First, the road network is modeled as a traffic-enhanced dual graph, with the closeness between two road segments determined not only by their topological connection but also by the traffic correlation between them. Then a flow-based community detection algorithm called Infomap is utilized to identify the road segment clusters. Evaluated by Moran's I, the Calinski-Harabasz index, and a traffic interpolation application, we find that, compared to the distance-based method and the community-based method, our proposed traffic-enhanced community-based method performs better in capturing the extent of traffic relevance, as both the topological structure of the road network and the traffic correlations among urban roads are considered. It can be used in further traffic-related applications, such as traffic forecasting, traffic control, and guidance.

  11. Research on light rail electric load forecasting based on ARMA model

    NASA Astrophysics Data System (ADS)

    Huang, Yifan

    2018-04-01

    The article compares a variety of time series models in light of the characteristics of power load forecasting, and then establishes a light rail electric load forecasting model based on the ARMA model. The model is used to forecast the load of a light rail system, and the prediction results show that the model's accuracy is high.
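
    A hedged sketch of ARMA-based load forecasting with statsmodels: an ARMA(p, q) process is the special case ARIMA(p, 0, q), i.e., no differencing. The hourly load series and the order (2, 0, 2) are illustrative stand-ins, not values from the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)

# Synthetic hourly traction-load series standing in for measured light rail load
hours = pd.date_range("2018-01-01", periods=500, freq="h")
load = 100 + 20 * np.sin(2 * np.pi * np.arange(500) / 24) + rng.normal(0, 5, 500)
series = pd.Series(load, index=hours)

# ARMA(p, q) fitted as ARIMA(p, 0, q)
model = ARIMA(series, order=(2, 0, 2)).fit()

# 24-hour-ahead forecast
forecast = model.forecast(steps=24)
print(forecast.head())
```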

  12. Assessing High-Resolution Weather Research and Forecasting (WRF) Forecasts Using an Object-Based Diagnostic Evaluation

    DTIC Science & Technology

    2014-02-01

    Operational Model Archive and Distribution System (NOMADS). The RTMA product was generated using a 2-D variational method to assimilate point weather observations and satellite-derived measurements (National Weather Service, 2013). The products were downloaded using the NOMADS General Regularly… [the excerpt continues with a fragment of a shell script that prompts the user for the start date of the completed WRF run and the 2-digit Zulu observation hour (HH) for remapping]

  13. Comparative Performance Evaluation of Rainfall-runoff Models, Six of Black-box Type and One of Conceptual Type, From The Galway Flow Forecasting System (gffs) Package, Applied On Two Irish Catchments

    NASA Astrophysics Data System (ADS)

    Goswami, M.; O'Connor, K. M.; Shamseldin, A. Y.

    The "Galway Real-Time River Flow Forecasting System" (GFFS) is a software pack- age developed at the Department of Engineering Hydrology, of the National University of Ireland, Galway, Ireland. It is based on a selection of lumped black-box and con- ceptual rainfall-runoff models, all developed in Galway, consisting primarily of both the non-parametric (NP) and parametric (P) forms of two black-box-type rainfall- runoff models, namely, the Simple Linear Model (SLM-NP and SLM-P) and the seasonally-based Linear Perturbation Model (LPM-NP and LPM-P), together with the non-parametric wetness-index-based Linearly Varying Gain Factor Model (LVGFM), the black-box Artificial Neural Network (ANN) Model, and the conceptual Soil Mois- ture Accounting and Routing (SMAR) Model. Comprised of the above suite of mod- els, the system enables the user to calibrate each model individually, initially without updating, and it is capable also of producing combined (i.e. consensus) forecasts us- ing the Simple Average Method (SAM), the Weighted Average Method (WAM), or the Artificial Neural Network Method (NNM). The updating of each model output is achieved using one of four different techniques, namely, simple Auto-Regressive (AR) updating, Linear Transfer Function (LTF) updating, Artificial Neural Network updating (NNU), and updating by the Non-linear Auto-Regressive Exogenous-input method (NARXM). The models exhibit a considerable range of variation in degree of complexity of structure, with corresponding degrees of complication in objective func- tion evaluation. Operating in continuous river-flow simulation and updating modes, these models and techniques have been applied to two Irish catchments, namely, the Fergus and the Brosna. A number of performance evaluation criteria have been used to comparatively assess the model discharge forecast efficiency.

  14. Analysis of forecasting and inventory control of raw material supplies in PT INDAC INT’L

    NASA Astrophysics Data System (ADS)

    Lesmana, E.; Subartini, B.; Riaman; Jabar, D. A.

    2018-03-01

    This study discusses forecasting sales data for carbon electrodes at PT INDAC INT'L using the Winters and double moving average methods, while the Economic Order Quantity (EOQ) model is used to predict the inventory level and the cost of ordering carbon electrode raw material for the next period. The error analysis, based on MAE, MSE, and MAPE, shows that the Winters method is the better forecasting method for sales of carbon electrode products. PT INDAC INT'L is therefore advised to stock products in line with the sales volumes forecast by the Winters method.
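
    A hedged sketch of the two building blocks named in the abstract: Winters forecasting (triple exponential smoothing, via statsmodels) and the classical EOQ formula Q* = sqrt(2DS/H). The sales series, ordering cost S, and holding cost H are invented for illustration.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(3)

# Synthetic monthly carbon electrode sales with annual seasonality
months = pd.date_range("2015-01-01", periods=36, freq="MS")
sales = 200 + 40 * np.sin(2 * np.pi * np.arange(36) / 12) + rng.normal(0, 10, 36)
series = pd.Series(sales, index=months)

# Winters method: triple exponential smoothing, additive trend and seasonality
fit = ExponentialSmoothing(series, trend="add", seasonal="add",
                           seasonal_periods=12).fit()
demand_forecast = fit.forecast(12).sum()  # forecast annual demand D

# EOQ: Q* = sqrt(2 D S / H); ordering cost S and holding cost H are assumed
S, H = 50.0, 2.0
eoq = np.sqrt(2 * demand_forecast * S / H)
print(f"Forecast annual demand: {demand_forecast:.0f} units, EOQ: {eoq:.0f} units")
```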

  15. Seizure Forecasting and the Preictal State in Canine Epilepsy.

    PubMed

    Varatharajah, Yogatheesan; Iyer, Ravishankar K; Berry, Brent M; Worrell, Gregory A; Brinkmann, Benjamin H

    2017-02-01

    The ability to predict seizures may enable patients with epilepsy to better manage their medications and activities, potentially reducing side effects and improving quality of life. Forecasting epileptic seizures remains a challenging problem, but machine learning methods using intracranial electroencephalographic (iEEG) measures have shown promise. A machine-learning-based pipeline was developed to process iEEG recordings and generate seizure warnings. Results support the ability to forecast seizures at rates greater than a Poisson random predictor for all feature sets and machine learning algorithms tested. In addition, subject-specific neurophysiological changes in multiple features are reported preceding lead seizures, providing evidence supporting the existence of a distinct and identifiable preictal state.
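
    As a rough illustration of such a pipeline (not the authors' implementation), the sketch below extracts spectral band-power features from synthetic "iEEG" segments and trains a linear classifier to score preictal probability; the sampling rate, frequency bands, and classifier are assumptions.

```python
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
fs = 400  # assumed iEEG sampling rate, Hz

def bandpower_features(segment, fs,
                       bands=((1, 4), (4, 8), (8, 12), (12, 30), (30, 100))):
    """Average spectral power in canonical bands for one iEEG segment."""
    freqs, psd = welch(segment, fs=fs, nperseg=fs * 2)
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in bands])

# Synthetic stand-ins for labeled interictal (0) and preictal (1) segments
segments = rng.normal(0, 1, size=(200, fs * 10))
labels = rng.integers(0, 2, size=200)

X = np.array([bandpower_features(seg, fs) for seg in segments])
clf = LogisticRegression(max_iter=1000).fit(X, labels)

# A "warning" would be raised when preictal probability crosses a threshold
prob = clf.predict_proba(X[:5])[:, 1]
print("preictal probabilities:", np.round(prob, 2))
```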

  16. SEIZURE FORECASTING AND THE PREICTAL STATE IN CANINE EPILEPSY

    PubMed Central

    Varatharajah, Yogatheesan; Iyer, Ravishankar K.; Berry, Brent M.; Worrell, Gregory A.; Brinkmann, Benjamin H.

    2017-01-01

    The ability to predict seizures may enable patients with epilepsy to better manage their medications and activities, potentially reducing side effects and improving quality of life. Forecasting epileptic seizures remains a challenging problem, but machine learning methods using intracranial electroencephalographic (iEEG) measures have shown promise. A machine-learning-based pipeline was developed to process iEEG recordings and generate seizure warnings. Results support the ability to forecast seizures at rates greater than a Poisson random predictor for all feature sets and machine learning algorithms tested. In addition, subject-specific neurophysiological changes in multiple features are reported preceding lead seizures, providing evidence supporting the existence of a distinct and identifiable preictal state. PMID:27464854

  17. High-order fuzzy time-series based on multi-period adaptation model for forecasting stock markets

    NASA Astrophysics Data System (ADS)

    Chen, Tai-Liang; Cheng, Ching-Hsue; Teoh, Hia-Jong

    2008-02-01

    Stock investors usually make their short-term investment decisions according to recent stock information such as the latest market news, technical analysis reports, and price fluctuations. To reflect these short-term factors which impact stock price, this paper proposes a comprehensive fuzzy time-series model, which factors both linear relationships between recent periods of stock prices and fuzzy logical relationships (nonlinear relationships) mined from the time series into the forecasting process. In the empirical analysis, the TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) and HSI (Hang Seng Index) are employed as experimental datasets, and four recent fuzzy time-series models, Chen's (1996), Yu's (2005), Cheng's (2006) and Chen's (2007), are used as comparison models. In addition, for comparison with a conventional statistical method, the method of least squares is utilized to estimate auto-regressive models over the testing periods within the datasets. The performance comparisons indicate that the multi-period adaptation model proposed in this paper can effectively improve the forecasting performance of conventional fuzzy time-series models, which factor only fuzzy logical relationships into the forecasting process. In the empirical study, both the traditional statistical method and the proposed model reveal that stock price patterns in the Taiwan and Hong Kong stock markets are short-term.
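
    For readers unfamiliar with fuzzy time series, a minimal first-order sketch in the style of Chen (1996) — the simplest of the comparison models above, not the proposed high-order multi-period model — shows the fuzzify/relate/defuzzify cycle. The price series and interval count are invented.

```python
import numpy as np

def chen_fts_forecast(series, n_intervals=7):
    """First-order fuzzy time-series forecast (Chen, 1996), simplified."""
    lo, hi = series.min() - 1, series.max() + 1
    edges = np.linspace(lo, hi, n_intervals + 1)
    mids = (edges[:-1] + edges[1:]) / 2

    # Fuzzify: map each observation to the index of its interval
    states = np.clip(np.digitize(series, edges) - 1, 0, n_intervals - 1)

    # Mine fuzzy logical relationships A_i -> {A_j}
    flrg = {}
    for a, b in zip(states[:-1], states[1:]):
        flrg.setdefault(a, set()).add(b)

    # Defuzzify: forecast is the mean midpoint of the consequent set
    last = states[-1]
    nxt = flrg.get(last, {last})
    return mids[list(nxt)].mean()

prices = np.array([13, 14, 14, 15, 16, 16, 15, 17, 18, 17, 19, 20], dtype=float)
print(f"next-step forecast: {chen_fts_forecast(prices):.2f}")
```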

  18. Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Cai, X.; Yang, D.

    2010-12-01

    Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic forecast error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, there are various indices reflecting the magnitude of streamflow forecast uncertainty; meanwhile, few models illustrate the forecast uncertainty evolution process. This research introduces the Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecasts as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as the forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as efficient as ESF. Moreover, streamflow variability and reservoir capacity can change the magnitude of the effects of forecast uncertainty, but not the relative merit of DSF, DPSF, and ESF. [Figure: schematic diagram of the increase in forecast uncertainty with forecast lead time and the dynamic updating property of real-time streamflow forecasts.]
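
    The additive form of MMFE is easy to simulate: the forecast of flow at a fixed target date is revised at each stage by an independent zero-mean shock, and the final forecast coincides with the realization, so uncertainty shrinks as lead time decreases. A minimal sketch with invented numbers:

```python
import numpy as np

rng = np.random.default_rng(5)

# Additive MMFE: f(t, T) = f(t-1, T) + e_t, with e_t i.i.d. N(0, sigma^2)
# and f(T, T) equal to the realized flow.
true_flow = 100.0   # realized streamflow (synthetic)
lead_times = 5      # forecasts issued 5, 4, ..., 1 steps ahead
update_sd = 4.0     # assumed standard deviation of each revision

# Work backwards from the realization: the forecast at lead k differs from
# the truth by the sum of the k revisions that have not yet occurred.
updates = rng.normal(0.0, update_sd, size=lead_times)
forecasts = true_flow - np.cumsum(updates[::-1])[::-1]

for k, f in enumerate(forecasts):
    print(f"forecast {lead_times - k} steps ahead: {f:6.1f}")
print(f"realized flow:             {true_flow:6.1f}")
```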

  19. Accurate Influenza Monitoring and Forecasting Using Novel Internet Data Streams: A Case Study in the Boston Metropolis

    PubMed Central

    Lu, Fred Sun; Hou, Suqin; Baltrusaitis, Kristin; Shah, Manan; Leskovec, Jure; Sosic, Rok; Hawkins, Jared; Brownstein, John; Conidi, Giuseppe; Gunn, Julia; Gray, Josh; Zink, Anna

    2018-01-01

    Background Influenza outbreaks pose major challenges to public health around the world, leading to thousands of deaths a year in the United States alone. Accurate systems that track influenza activity at the city level are necessary to provide actionable information that can be used for clinical, hospital, and community outbreak preparation. Objective Although Internet-based real-time data sources such as Google searches and tweets have been successfully used to produce influenza activity estimates ahead of traditional health care–based systems at national and state levels, influenza tracking and forecasting at finer spatial resolutions, such as the city level, remain an open question. Our study aimed to present a precise, near real-time methodology capable of producing influenza estimates ahead of those collected and published by the Boston Public Health Commission (BPHC) for the Boston metropolitan area. This approach has great potential to be extended to other cities with access to similar data sources. Methods We first tested the ability of Google searches, Twitter posts, electronic health records, and a crowd-sourced influenza reporting system to detect influenza activity in the Boston metropolis separately. We then adapted a multivariate dynamic regression method named ARGO (autoregression with general online information), designed for tracking influenza at the national level, and showed that it effectively uses the above data sources to monitor and forecast influenza at the city level 1 week ahead of the current date. Finally, we presented an ensemble-based approach capable of combining information from models based on multiple data sources to more robustly nowcast as well as forecast influenza activity in the Boston metropolitan area. The performances of our models were evaluated in an out-of-sample fashion over 4 influenza seasons within 2012-2016, as well as a holdout validation period from 2016 to 2017. Results Our ensemble-based methods incorporating information from diverse models based on multiple data sources, including ARGO, produced the most robust and accurate results. The observed Pearson correlations between our out-of-sample flu activity estimates and those historically reported by the BPHC were 0.98 in nowcasting influenza and 0.94 in forecasting influenza 1 week ahead of the current date. Conclusions We show that information from Internet-based data sources, when combined using an informed, robust methodology, can be effectively used as early indicators of influenza activity at fine geographic resolutions. PMID:29317382
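
    An ARGO-style model is, at its core, a regularized regression of current flu activity on its own recent lags plus Internet-based predictors. The sketch below uses synthetic data and illustrative lag orders; the original method also re-trains the model dynamically each week, which is omitted here for brevity.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(6)

# Synthetic weekly data: city-level flu rate plus Internet-based predictors
n_weeks, n_terms = 200, 10
flu = np.abs(np.cumsum(rng.normal(0, 1, n_weeks))) + 1
queries = flu[:, None] * rng.uniform(0.5, 1.5, (n_weeks, n_terms)) \
          + rng.normal(0, 1, (n_weeks, n_terms))

# ARGO-style design matrix: autoregressive lags of flu activity plus
# contemporaneous query volumes
n_lags = 3
rows = []
for t in range(n_lags, n_weeks):
    rows.append(np.concatenate([flu[t - n_lags:t], queries[t]]))
X = np.array(rows)
y = flu[n_lags:]

# L1-regularized regression; a single fit stands in for weekly re-training
model = LassoCV(cv=5).fit(X[:-20], y[:-20])
print("held-out R^2:", round(model.score(X[-20:], y[-20:]), 3))
```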

  20. Quality Assessment of the Cobel-Isba Numerical Forecast System of Fog and Low Clouds

    NASA Astrophysics Data System (ADS)

    Bergot, Thierry

    2007-06-01

    Short-term forecasting of fog is a difficult issue which can have a large societal impact. Fog appears in the surface boundary layer and is driven by the interactions between the land surface and the lower layers of the atmosphere. These interactions are still not well parameterized in current operational NWP models, and a new methodology based on local observations, an adaptive assimilation scheme and a local numerical model is tested. The proposed numerical method for forecasting foggy conditions was run for three years at Paris-CdG international airport. This test over a long time period allows an in-depth evaluation of forecast quality. The study demonstrates that detailed 1-D models, including detailed physical parameterizations and high vertical resolution, can reasonably represent the major features of the life cycle of fog (onset, development and dissipation) up to +6 h. The error on the forecast onset and burn-off times is typically 1 h. The major weakness of the methodology is related to the evolution of low clouds (stratus lowering). Even when the occurrence of fog is well forecast, the value of the horizontal visibility is only crudely predicted. Improvements in the microphysical parameterization and in the translation algorithm converting NWP prognostic variables into a corresponding horizontal visibility seem necessary to accurately forecast the visibility.
