Sample records for sample load forecasting

  1. Load forecast method of electric vehicle charging station using SVR based on GA-PSO

    NASA Astrophysics Data System (ADS)

    Lu, Kuan; Sun, Wenxue; Ma, Changhui; Yang, Shenquan; Zhu, Zijian; Zhao, Pengfei; Zhao, Xin; Xu, Nan

    2017-06-01

    This paper presents a Support Vector Regression (SVR) method for electric vehicle (EV) charging station load forecasting based on a genetic algorithm (GA) and particle swarm optimization (PSO). Fuzzy C-Means (FCM) clustering is used to establish similar-day samples. GA performs the global parameter search and PSO performs a more accurate local search; the load forecast is then regressed using SVR. Practical load data from an EV charging station were used to illustrate the proposed method. The results indicate an obvious improvement in forecasting accuracy compared with SVRs based on PSO or GA alone.
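
    The pipeline in this abstract (cluster similar days, then regress the daily profile) can be sketched in a dependency-free way. Below, a plain ridge regression stands in for the GA-PSO-tuned SVR, and nearest-neighbour day selection stands in for FCM clustering; all data and parameter values are synthetic and invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic history: per-day weather features and 24-hour charging-station
# load profiles (invented data; a real study would use measured loads).
n_days, n_feat = 200, 3
weather = rng.normal(size=(n_days, n_feat))
true_w = rng.normal(size=(n_feat, 24))
loads = weather @ true_w + 0.1 * rng.normal(size=(n_days, 24))

def forecast_from_similar_days(weather, loads, target_weather, k=30, lam=1e-2):
    """Select the k historical days most similar to the target day's weather,
    then fit a ridge regressor (stand-in for SVR) on that subset only."""
    dist = np.linalg.norm(weather - target_weather, axis=1)
    idx = np.argsort(dist)[:k]                      # the "similar day" sample
    X, Y = weather[idx], loads[idx]
    # Closed-form ridge solution: (X'X + lam*I)^(-1) X'Y
    w = np.linalg.solve(X.T @ X + lam * np.eye(n_feat), X.T @ Y)
    return target_weather @ w                       # 24-hour load forecast

target_weather = rng.normal(size=n_feat)
pred = forecast_from_similar_days(weather, loads, target_weather)
```

    On this synthetic linear problem the similar-day subset is enough to recover the profile; the paper's actual contribution is tuning the SVR hyperparameters with GA for the global search and PSO for the local refinement.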

  2. NASA Products to Enhance Energy Utility Load Forecasting

    NASA Technical Reports Server (NTRS)

    Lough, G.; Zell, E.; Engel-Cox, J.; Fungard, Y.; Jedlovec, G.; Stackhouse, P.; Homer, R.; Biley, S.

    2012-01-01

    Existing energy load forecasting tools rely upon historical load and forecasted weather to predict load within energy company service areas. The shortcomings of load forecasts are often the result of weather forecasts that are not at a fine enough spatial or temporal resolution to capture local-scale weather events. This project aims to improve the performance of load forecasting tools through the integration of high-resolution, weather-related NASA Earth Science Data, such as temperature, relative humidity, and wind speed. Three companies are participating in operational testing: one natural gas company and two electric providers. Operational results comparing load forecasts with and without NASA weather forecasts have been generated since March 2010. We have worked with end users at the three companies to refine the selection of weather forecast information and optimize load forecast model performance. The project will conclude in 2012 by transitioning the documented improvements from including NASA forecasts into sustained use by energy utilities nationwide in a variety of load forecasting tools. In addition, Battelle has consulted with energy companies nationwide to document their information needs for long-term planning, in light of climate change and regulatory impacts.

  3. Load Forecasting of Central Urban Area Power Grid Based on Saturated Load Density Index

    NASA Astrophysics Data System (ADS)

    Huping, Yang; Chengyi, Tang; Meng, Yu

    2018-03-01

    Coordination between urban power grid development and city development has become increasingly important, and saturated load forecasting plays an important role in the planning and development of power grids. Saturated load forecasting is a concept put forward in China in recent years in the field of grid planning. Unlike traditional load forecasting for specific target years, urban saturation load forecasting typically spans a long time horizon and involves a wide range of factors. Taking a county in eastern Jiangxi as an example, this study applies several load forecasting methods to near-term load forecasting for the central urban area. It then uses the load density index method to produce a long-term forecast of the saturated load of the central urban area through 2030, and further shows the general spatial distribution of the urban saturation load.

  4. 7 CFR 1710.208 - RUS criteria for approval of all load forecasts by power supply borrowers and by distribution...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... power supply borrowers and by distribution borrowers required to maintain an approved load forecast on... forecasts by power supply borrowers and by distribution borrowers required to maintain an approved load forecast on an ongoing basis. All load forecasts submitted by power supply borrowers and by distribution...

  5. 7 CFR 1710.208 - RUS criteria for approval of all load forecasts by power supply borrowers and by distribution...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... power supply borrowers and by distribution borrowers required to maintain an approved load forecast on... forecasts by power supply borrowers and by distribution borrowers required to maintain an approved load forecast on an ongoing basis. All load forecasts submitted by power supply borrowers and by distribution...

  6. 7 CFR 1710.208 - RUS criteria for approval of all load forecasts by power supply borrowers and by distribution...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... power supply borrowers and by distribution borrowers required to maintain an approved load forecast on... forecasts by power supply borrowers and by distribution borrowers required to maintain an approved load forecast on an ongoing basis. All load forecasts submitted by power supply borrowers and by distribution...

  7. 7 CFR 1710.208 - RUS criteria for approval of all load forecasts by power supply borrowers and by distribution...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... power supply borrowers and by distribution borrowers required to maintain an approved load forecast on... forecasts by power supply borrowers and by distribution borrowers required to maintain an approved load forecast on an ongoing basis. All load forecasts submitted by power supply borrowers and by distribution...

  8. Integration of Behind-the-Meter PV Fleet Forecasts into Utility Grid System Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoff, Thomas; Kankiewicz, Adam

    Four major research objectives were completed over the course of this study. Three of the objectives were to evaluate three new state-of-the-art solar irradiance forecasting models. The fourth objective was to improve the California Independent System Operator's (ISO) load forecasts by integrating behind-the-meter (BTM) PV forecasts. The three new solar irradiance forecasting models were: the infrared (IR) satellite-based cloud motion vector (CMV) model; the WRF-SolarCA model and variants; and the Optimized Deep Machine Learning (ODML)-training model. The first two forecasting models targeted known weaknesses in current operational solar forecasts. They were benchmarked against existing operational numerical weather prediction (NWP) forecasts, visible satellite CMV forecasts, and measured PV plant power production. The IR CMV, WRF-SolarCA, and ODML-training forecasting models all improved the forecast to a significant degree. Improvements varied depending on time of day, cloudiness index, and geographic location. The fourth objective was to demonstrate that the California ISO's load forecasts could be improved by integrating BTM PV forecasts. This objective represented the project's most exciting and applicable gains. Operational BTM forecasts consisting of 200,000+ individual rooftop PV forecasts were delivered into the California ISO's real-time automated load forecasting (ALFS) environment. They were then evaluated side by side with operational load forecasts with no BTM treatment. Overall, ALFS-BTM day-ahead (DA) forecasts performed better than baseline ALFS forecasts when compared to actual load data. Specifically, ALFS-BTM DA forecasts were observed to have the largest reduction of error during the afternoon on cloudy days. Shorter-term 30-minute-ahead ALFS-BTM forecasts were shown to have less error under all sky conditions, especially during the morning periods when traditional load forecasts often experience their largest uncertainties. This work culminated in a GO decision by the California ISO to include zonal BTM forecasts in its operational load forecasting system. The California ISO's Manager of Short Term Forecasting, Jim Blatchford, summarized the research performed in this project with the following quote: “The behind-the-meter (BTM) California ISO region forecasting research performed by Clean Power Research and sponsored by the Department of Energy's SUNRISE program was an opportunity to verify value and demonstrate improved load forecast capability. In 2016, the California ISO will be incorporating the BTM forecast into the Hour Ahead and Day Ahead load models to look for improvements in the overall load forecast accuracy as BTM PV capacity continues to grow.”

  9. A Load-Based Temperature Prediction Model for Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Sobhani, Masoud

    Electric load forecasting, as a basic requirement for decision-making in power utilities, has been improved in various aspects in the past decades. Many factors may affect the accuracy of load forecasts, such as data quality, goodness of the underlying model and load composition. Due to the strong correlation between the input variables (e.g., weather and calendar variables) and the load, the quality of input data plays a vital role in forecasting practice. Even if the forecasting model captures most of the salient features of the load, low-quality input data may result in inaccurate forecasts. Most data cleansing efforts in the load forecasting literature have been devoted to the load data; few studies have focused on weather data cleansing for load forecasting. This research proposes an anomaly detection method for temperature data. The method consists of two components: a load-based temperature prediction model and a detection technique. The effectiveness of the proposed method is demonstrated through two case studies: one based on data from the Global Energy Forecasting Competition 2014, and the other based on data published by ISO New England. The results show that removing the detected observations from the original input data enhances the final load forecast accuracy.
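
    A minimal sketch of the two components described above, with a linear load-to-temperature regression standing in for the paper's prediction model and a robust z-score as the detection technique (all data, coefficients, and thresholds are invented):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic hourly data: load rises with temperature (cooling-driven load).
temp = rng.uniform(15, 35, size=500)
load = 200 + 8 * temp + rng.normal(0, 5, size=500)

# Corrupt a few temperature observations to emulate sensor/transcription errors.
bad = [10, 250, 400]
temp_obs = temp.copy()
temp_obs[bad] += [15.0, -20.0, 18.0]

# Load-based temperature prediction: regress temperature on load, then flag
# observations whose residual exceeds 3 robust standard deviations (MAD-based).
A = np.column_stack([np.ones_like(load), load])
beta, *_ = np.linalg.lstsq(A, temp_obs, rcond=None)
resid = temp_obs - A @ beta
mad = np.median(np.abs(resid - np.median(resid)))
flags = np.abs(resid - np.median(resid)) > 3 * 1.4826 * mad
```

    In the paper, the cleansing step then re-runs the load forecast with the flagged temperature observations removed.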

  10. Forecasting electricity usage using univariate time series models

    NASA Astrophysics Data System (ADS)

    Hock-Eam, Lim; Chee-Yin, Yip

    2014-12-01

    Electricity is one of the most important energy sources, and a sufficient supply of electricity is vital to support a country's development and growth. Owing to changing socio-economic characteristics, increasing competition and the deregulation of the electricity supply industry, electricity demand forecasting is even more important than before. It is imperative to evaluate and compare the predictive performance of various forecasting methods, as this provides further insight into the weaknesses and strengths of each method. The literature offers mixed evidence on the best forecasting methods for electricity demand. This paper compares the predictive performance of univariate time series models for forecasting electricity demand using monthly data on maximum electricity load in Malaysia from January 2003 to December 2013. Results reveal that the Box-Jenkins method produces the best out-of-sample predictive performance, while the Holt-Winters exponential smoothing method is a good choice for in-sample predictive performance.
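
    For reference, the Holt-Winters method benchmarked here can be hand-rolled in a few lines. The additive formulation below is dependency-free; the smoothing constants and the synthetic monthly series are illustrative, not the Malaysian data:

```python
import numpy as np

def holt_winters_additive(y, season=12, alpha=0.3, beta=0.05, gamma=0.2, horizon=12):
    """Additive Holt-Winters exponential smoothing with fixed (untuned)
    smoothing constants; returns point forecasts for the next `horizon` steps."""
    y = np.asarray(y, dtype=float)
    level = y[:season].mean()
    trend = (y[season:2 * season].mean() - y[:season].mean()) / season
    seas = y[:season] - level
    for t in range(len(y)):
        s = seas[t % season]
        last_level = level
        level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seas[t % season] = gamma * (y[t] - level) + (1 - gamma) * s
    return np.array([level + (h + 1) * trend + seas[(len(y) + h) % season]
                     for h in range(horizon)])

# Synthetic monthly peak-load-like series: linear trend plus yearly seasonality.
t = np.arange(132)
y = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12)
fc = holt_winters_additive(y, horizon=12)
```

    A Box-Jenkins (ARIMA) benchmark would be fitted with a statistics library rather than by hand; the comparison logic is the same: hold out the final months and score out-of-sample errors.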

  11. 7 CFR 1710.209 - Approval requirements for load forecast work plans.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... cooperate in the preparation of and submittal of the load forecast work plan of their power supply borrower. (b) An approved load forecast work plan establishes the process for the preparation and maintenance... approved load forecast work plan must outline the coordination and preparation requirements for both the...

  12. Supplier Short Term Load Forecasting Using Support Vector Regression and Exogenous Input

    NASA Astrophysics Data System (ADS)

    Matijaš, Marin; Vukićević, Milan; Krajcar, Slavko

    2011-09-01

    In power systems, load forecasting is important for keeping the equilibrium between production and consumption. With the liberalization of electricity markets, the load forecasting task has changed because each market participant has to forecast its own load. The consumption of end-consumers is stochastic in nature, and due to competition, suppliers are not in a position to transfer their costs to end-consumers; it is therefore essential to keep the forecasting error as low as possible. Numerous papers investigate load forecasting from the perspective of the grid or production planning; we research forecasting models from the perspective of a supplier. In this paper, we investigate different combinations of exogenous inputs on simulated supplier loads and show that using points of delivery as a feature for Support Vector Regression leads to lower forecasting error, while adding the customer number in different datasets does the opposite.

  13. Short Term Load Forecasting with Fuzzy Logic Systems for power system planning and reliability-A Review

    NASA Astrophysics Data System (ADS)

    Holmukhe, R. M.; Dhumale, Mrs. Sunita; Chaudhari, Mr. P. S.; Kulkarni, Mr. P. P.

    2010-10-01

    Load forecasting is essential to the operation of electricity companies: it enhances the energy-efficient and reliable operation of a power system, and forecasts of load demand form an important component in planning generation schedules. The purpose of this paper is to identify issues and better methods for load forecasting. We focus on fuzzy logic based short-term load forecasting and give an overview of the state of the art in intelligent techniques employed for load forecasting in power system planning and reliability. A literature review has been conducted and the fuzzy logic method summarized to highlight its advantages and disadvantages. The proposed technique for implementing fuzzy logic based forecasting is to identify the specific day, use the maximum and minimum temperature for that day, and finally list the maximum temperature and peak load for that day. The results show that load forecasting under considerable changes in temperature is better handled by the fuzzy logic method than by other short-term forecasting techniques.
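
    The rule structure described above, mapping the day's maximum temperature through fuzzy membership functions to a peak-load estimate, can be illustrated with a toy two-rule system; the membership breakpoints and load values are invented:

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_peak_load(t_max):
    """Two illustrative rules: IF t_max is MILD THEN peak is LOW (900 MW);
    IF t_max is HOT THEN peak is HIGH (1300 MW). Defuzzify by the weighted
    average of the rule outputs (all numbers invented for this sketch)."""
    mild = tri(t_max, 10, 20, 30)
    hot = tri(t_max, 25, 35, 45)
    if mild + hot == 0:
        return None  # temperature falls outside the rule base
    return (mild * 900 + hot * 1300) / (mild + hot)
```

    A temperature between the two peaks activates both rules partially, so the estimate interpolates smoothly between the LOW and HIGH consequents.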

  14. Neural network based short-term load forecasting using weather compensation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, T.W.S.; Leung, C.T.

    This paper presents a novel technique for electric load forecasting based on neural weather compensation. The proposed method is a nonlinear generalization of Box and Jenkins approach for nonstationary time-series prediction. A weather compensation neural network is implemented for one-day ahead electric load forecasting. The weather compensation neural network can accurately predict the change of actual electric load consumption from the previous day. The results, based on Hong Kong Island historical load demand, indicate that this methodology is capable of providing a more accurate load forecast with a 0.9% reduction in forecast error.

  15. Short-Term Forecasting of Loads and Wind Power for Latvian Power System: Accuracy and Capacity of the Developed Tools

    NASA Astrophysics Data System (ADS)

    Radziukynas, V.; Klementavičius, A.

    2016-04-01

    The paper analyses the performance of the recently developed short-term forecasting suite for the Latvian power system. The system load and wind power are forecast using ANN and ARIMA models, respectively, and the forecasting accuracy is evaluated in terms of errors, mean absolute errors and mean absolute percentage errors. The influence of additional input variables on load forecasting errors is investigated. The interplay of hourly load and wind power forecasting errors is also evaluated for the Latvian power system with historical loads (for 2011) and planned wind power capacities (for 2023).

  16. Short-Term Load Forecasting Based Automatic Distribution Network Reconfiguration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Huaiguang; Ding, Fei; Zhang, Yingchen

    In a traditional dynamic network reconfiguration study, the optimal topology is determined at every scheduled time point by using the real load data measured at that time. The development of the load forecasting technique can provide an accurate prediction of the load power that will happen in a future time and provide more information about load changes. With the inclusion of load forecasting, the optimal topology can be determined based on the predicted load conditions during a longer time period instead of using a snapshot of the load at the time when the reconfiguration happens; thus, the distribution system operator can use this information to better operate the system reconfiguration and achieve optimal solutions. This paper proposes a short-term load forecasting approach to automatically reconfigure distribution systems in a dynamic and pre-event manner. Specifically, a short-term and high-resolution distribution system load forecasting approach is proposed with a forecaster based on support vector regression and parallel parameters optimization. The network reconfiguration problem is solved by using the forecasted load continuously to determine the optimal network topology with the minimum amount of loss at the future time. The simulation results validate and evaluate the proposed approach.

  17. Short-Term Load Forecasting Based Automatic Distribution Network Reconfiguration: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Huaiguang; Ding, Fei; Zhang, Yingchen

    In the traditional dynamic network reconfiguration study, the optimal topology is determined at every scheduled time point by using the real load data measured at that time. The development of load forecasting technique can provide accurate prediction of load power that will happen in future time and provide more information about load changes. With the inclusion of load forecasting, the optimal topology can be determined based on the predicted load conditions during the longer time period instead of using the snapshot of load at the time when the reconfiguration happens, and thus it can provide information to the distribution system operator (DSO) to better operate the system reconfiguration to achieve optimal solutions. Thus, this paper proposes a short-term load forecasting based approach for automatically reconfiguring distribution systems in a dynamic and pre-event manner. Specifically, a short-term and high-resolution distribution system load forecasting approach is proposed with support vector regression (SVR) based forecaster and parallel parameters optimization. And the network reconfiguration problem is solved by using the forecasted load continuously to determine the optimal network topology with the minimum loss at the future time. The simulation results validate and evaluate the proposed approach.

  18. Short-Term Load Forecasting-Based Automatic Distribution Network Reconfiguration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Huaiguang; Ding, Fei; Zhang, Yingchen

    In a traditional dynamic network reconfiguration study, the optimal topology is determined at every scheduled time point by using the real load data measured at that time. The development of the load forecasting technique can provide an accurate prediction of the load power that will happen in a future time and provide more information about load changes. With the inclusion of load forecasting, the optimal topology can be determined based on the predicted load conditions during a longer time period instead of using a snapshot of the load at the time when the reconfiguration happens; thus, the distribution system operator can use this information to better operate the system reconfiguration and achieve optimal solutions. This paper proposes a short-term load forecasting approach to automatically reconfigure distribution systems in a dynamic and pre-event manner. Specifically, a short-term and high-resolution distribution system load forecasting approach is proposed with a forecaster based on support vector regression and parallel parameters optimization. The network reconfiguration problem is solved by using the forecasted load continuously to determine the optimal network topology with the minimum amount of loss at the future time. The simulation results validate and evaluate the proposed approach.

  19. Load Modeling and Forecasting | Grid Modernization | NREL

    Science.gov Websites

    Load Modeling and Forecasting. NREL's work in load modeling is focused on new load models for the distribution system that capture distributed resources (such as rooftop photovoltaic systems) and changing customer energy use profiles. In addition, NREL researchers are developing load models for individual appliances.

  20. Electricity forecasting on the individual household level enhanced based on activity patterns

    PubMed Central

    Gajowniczek, Krzysztof; Ząbkowski, Tomasz

    2017-01-01

    Leveraging smart metering solutions to support energy efficiency on the individual household level poses novel research challenges in monitoring usage and providing accurate load forecasting. Forecasting electricity usage is an especially important component that can provide intelligence to smart meters. In this paper, we propose an enhanced approach for load forecasting at the household level. The impacts of residents’ daily activities and appliance usages on the power consumption of the entire household are incorporated to improve the accuracy of the forecasting model. The contributions of this paper are threefold: (1) we addressed short-term electricity load forecasting for 24 hours ahead, not on the aggregate but on the individual household level, which fits into the Residential Power Load Forecasting (RPLF) methods; (2) for the forecasting, we utilized a household specific dataset of behaviors that influence power consumption, which was derived using segmentation and sequence mining algorithms; and (3) an extensive load forecasting study using different forecasting algorithms enhanced by the household activity patterns was undertaken. PMID:28423039

  1. Electricity forecasting on the individual household level enhanced based on activity patterns.

    PubMed

    Gajowniczek, Krzysztof; Ząbkowski, Tomasz

    2017-01-01

    Leveraging smart metering solutions to support energy efficiency on the individual household level poses novel research challenges in monitoring usage and providing accurate load forecasting. Forecasting electricity usage is an especially important component that can provide intelligence to smart meters. In this paper, we propose an enhanced approach for load forecasting at the household level. The impacts of residents' daily activities and appliance usages on the power consumption of the entire household are incorporated to improve the accuracy of the forecasting model. The contributions of this paper are threefold: (1) we addressed short-term electricity load forecasting for 24 hours ahead, not on the aggregate but on the individual household level, which fits into the Residential Power Load Forecasting (RPLF) methods; (2) for the forecasting, we utilized a household specific dataset of behaviors that influence power consumption, which was derived using segmentation and sequence mining algorithms; and (3) an extensive load forecasting study using different forecasting algorithms enhanced by the household activity patterns was undertaken.

  2. A Comparison of Forecast Error Generators for Modeling Wind and Load Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ning; Diao, Ruisheng; Hafen, Ryan P.

    2013-12-18

    This paper presents four algorithms to generate random forecast error time series: a truncated-normal distribution model, a state-space based Markov model, a seasonal autoregressive moving average (ARMA) model, and a stochastic-optimization based model. The error time series are used to create real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast time series that statistically match historically observed forecasting data sets, for use in variable generation integration studies. A comparison is made using historical DA load forecasts and actual load values to generate new sets of DA forecasts with similar statistical forecast error characteristics. This paper discusses and compares the capabilities of each algorithm to preserve the characteristics of the historical forecast data sets.
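
    The simplest of the four generators can be illustrated with a truncated AR(1) process, a one-lag special case of the ARMA approach; all parameter values below are invented:

```python
import numpy as np

def simulate_forecast_errors(n, phi=0.8, sigma=1.0, bound=3.0, seed=1):
    """AR(1) error series e[t] = phi*e[t-1] + w[t], truncated to +/- bound so
    synthetic errors stay inside the historically observed range. The innovation
    variance is scaled so the stationary standard deviation equals sigma."""
    rng = np.random.default_rng(seed)
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = np.clip(phi * e[t - 1] + sigma * np.sqrt(1 - phi**2) * rng.normal(),
                       -bound, bound)
    return e

errors = simulate_forecast_errors(8760)    # one year of hourly DA errors
synthetic_forecast = 1000 + 50 * errors    # overlay on a flat 1000 MW actual load
```

    The resulting series reproduces a chosen lag-1 autocorrelation and error magnitude, which is the statistical-matching property integration studies need.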

  3. Research on light rail electric load forecasting based on ARMA model

    NASA Astrophysics Data System (ADS)

    Huang, Yifan

    2018-04-01

    The article compares a variety of time series models and, combining the characteristics of power load forecasting, establishes a light rail load forecasting model based on the ARMA model. The model is then used to forecast the load of a light rail system, and the prediction results show that the model's accuracy is high.

  4. Parametric analysis of parameters for electrical-load forecasting using artificial neural networks

    NASA Astrophysics Data System (ADS)

    Gerber, William J.; Gonzalez, Avelino J.; Georgiopoulos, Michael

    1997-04-01

    Accurate total system electrical load forecasting is a necessary part of resource management for power generation companies. The better the hourly load forecast, the more closely the power generation assets of the company can be configured to minimize the cost. Automating this process is a profitable goal and neural networks should provide an excellent means of doing the automation. However, prior to developing such a system, the optimal set of input parameters must be determined. The approach of this research was to determine what those inputs should be through a parametric study of potentially good inputs. Input parameters tested were ambient temperature, total electrical load, the day of the week, humidity, dew point temperature, daylight savings time, length of daylight, season, forecast light index and forecast wind velocity. For testing, a limited number of temperatures and total electrical loads were used as a basic reference input parameter set. Most parameters showed some forecasting improvement when added individually to the basic parameter set. Significantly, major improvements were exhibited with the day of the week, dew point temperatures, additional temperatures and loads, forecast light index and forecast wind velocity.

  5. Short-term forecasting of electric loads using nonlinear autoregressive artificial neural networks with exogenous vector inputs

    DOE PAGES

    Buitrago, Jaime; Asfour, Shihab

    2017-01-01

    Short-term load forecasting is crucial for the operations planning of an electrical grid. Forecasting the next 24 h of electrical load in a grid allows operators to plan and optimize their resources. The purpose of this study is to develop a more accurate short-term load forecasting method utilizing non-linear autoregressive artificial neural networks (ANN) with exogenous multi-variable input (NARX). The proposed implementation of the network is new: the neural network is trained in open loop using actual load and weather data, and then the network is placed in closed loop to generate a forecast using the predicted load as the feedback input. Unlike existing short-term load forecasting methods using ANNs, the proposed method uses its own output as an input in order to improve accuracy, effectively implementing a feedback loop for the load and making it less dependent on external data. Using the proposed framework, mean absolute percent errors on the order of 1% have been achieved, a 30% improvement on the average error using feedforward ANNs, ARMAX and state space methods, which can result in large savings by avoiding the commissioning of unnecessary power plants. Finally, the New England electrical load data are used to train and validate the forecast prediction.
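
    The open-loop/closed-loop scheme can be sketched with a linear autoregressive model standing in for the NARX network (the data, coefficients, and lag order below are invented; the point is training on actual lagged loads, then feeding predictions back as input during forecasting):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic hourly series: load depends on its own past and on temperature.
n = 1000
t_idx = np.arange(n + 24)
temp = 10 * np.sin(2 * np.pi * t_idx / 24) + rng.normal(size=n + 24)
load = np.zeros(n)
for t in range(2, n):
    load[t] = (0.6 * load[t - 1] + 0.2 * load[t - 2]
               + 0.5 * temp[t] + 0.1 * rng.normal())

# Open loop: regress load[t] on the *actual* lagged loads plus temperature.
X = np.column_stack([load[1:n - 1], load[0:n - 2], temp[2:n]])
coef, *_ = np.linalg.lstsq(X, load[2:n], rcond=None)

# Closed loop: the model's own output becomes the lagged-load input.
def closed_loop_forecast(history, future_temp, coef, horizon=24):
    h = list(history[-2:])
    out = []
    for k in range(horizon):
        yhat = coef[0] * h[-1] + coef[1] * h[-2] + coef[2] * future_temp[k]
        out.append(yhat)
        h.append(yhat)
    return np.array(out)

pred = closed_loop_forecast(load, temp[n:n + 24], coef)
```

    The NARX network replaces the linear map with a trained ANN, but the feedback structure is the same: beyond one step ahead, predicted load substitutes for the unavailable actual load.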

  6. Short-term forecasting of electric loads using nonlinear autoregressive artificial neural networks with exogenous vector inputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buitrago, Jaime; Asfour, Shihab

    Short-term load forecasting is crucial for the operations planning of an electrical grid. Forecasting the next 24 h of electrical load in a grid allows operators to plan and optimize their resources. The purpose of this study is to develop a more accurate short-term load forecasting method utilizing non-linear autoregressive artificial neural networks (ANN) with exogenous multi-variable input (NARX). The proposed implementation of the network is new: the neural network is trained in open loop using actual load and weather data, and then the network is placed in closed loop to generate a forecast using the predicted load as the feedback input. Unlike existing short-term load forecasting methods using ANNs, the proposed method uses its own output as an input in order to improve accuracy, effectively implementing a feedback loop for the load and making it less dependent on external data. Using the proposed framework, mean absolute percent errors on the order of 1% have been achieved, a 30% improvement on the average error using feedforward ANNs, ARMAX and state space methods, which can result in large savings by avoiding the commissioning of unnecessary power plants. Finally, the New England electrical load data are used to train and validate the forecast prediction.

  7. Load Forecasting in Electric Utility Integrated Resource Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carvallo, Juan Pablo; Larsen, Peter H.; Sanstad, Alan H.

    Integrated resource planning (IRP) is a process used by many vertically-integrated U.S. electric utilities to determine least-cost/risk supply and demand-side resources that meet government policy objectives and future obligations to customers and, in many cases, shareholders. Forecasts of energy and peak demand are a critical component of the IRP process. There have been few, if any, quantitative studies of IRP long-run (planning horizons of two decades) load forecast performance and its relationship to resource planning and actual procurement decisions. In this paper, we evaluate load forecasting methods, assumptions, and outcomes for 12 Western U.S. utilities by examining and comparing plans filed in the early 2000s against recent plans, up to year 2014. We find a convergence in the methods and data sources used. We also find that forecasts in more recent IRPs generally took account of new information, but that there continued to be a systematic over-estimation of load growth rates during the period studied. We compare planned and procured resource expansion against customer load and year-to-year load growth rates, but do not find a direct relationship. Load sensitivities performed in resource plans do not appear to be related to later procurement strategies even in the presence of large forecast errors. These findings suggest that resource procurement decisions may be driven by other factors than customer load growth. Our results have important implications for the integrated resource planning process, namely that load forecast accuracy may not be as important for resource procurement as is generally believed, that load forecast sensitivities could be used to improve the procurement process, and that management of load uncertainty should be prioritized over more complex forecasting techniques.

  8. Short-term load forecasting of power system

    NASA Astrophysics Data System (ADS)

    Xu, Xiaobin

    2017-05-01

In order to ensure the scientific nature of power system optimization, it is necessary to improve load forecasting accuracy. Power system load forecasting starts from accurate statistical and survey data on the history and current state of electricity consumption and uses scientific methods to predict the future development trend and variation pattern of power load. Short-term load forecasting is the basis of power system operation and analysis, and is of great significance to unit commitment, economic dispatch and security checking. Therefore, load forecasting of the power system is explained in detail in this paper. First, we use data from 2012 to 2014 to establish a partial least squares model for regression analysis of the relationship between daily maximum load, daily minimum load, daily average load and each meteorological factor, and, by inspecting the regression coefficient histogram, select daily maximum temperature, daily minimum temperature and daily average temperature as the meteorological factors used to improve load forecasting accuracy. Secondly, in the case of uncertain climate impact, we use a time series model to predict the load data for 2015: the 2009-2014 load data were sorted, and the data for the corresponding time in 2015 were forecast from the previous six years of data. The criterion for prediction accuracy is the standard deviation of the prediction results relative to the average load of the previous six years. Finally, considering the climate effect, we use a BP neural network model to predict the 2015 data, and optimize the forecast results on the basis of the time series model.
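The regression step above relates daily load to meteorological factors. As a simplified stand-in for the paper's partial-least-squares model, the sketch below fits ordinary least squares (via the normal equations) of daily maximum load on daily maximum temperature; the data are illustrative, not the 2012-2014 records used by the author:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x via the normal equations."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

temps = [28.0, 30.0, 33.0, 35.0, 31.0, 29.0]        # daily max temperature (C)
loads = [820.0, 860.0, 925.0, 965.0, 880.0, 840.0]  # daily max load (MW)

a, b = fit_line(temps, loads)
print(f"load ~= {a:.1f} + {b:.1f} * temp")  # b > 0: load rises with temperature
```

A large, stable regression coefficient is exactly what the coefficient-histogram inspection in the paper is screening for when selecting meteorological inputs.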

  9. Short-term Power Load Forecasting Based on Balanced KNN

    NASA Astrophysics Data System (ADS)

    Lv, Xianlong; Cheng, Xingong; YanShuang; Tang, Yan-mei

    2018-03-01

To improve the accuracy of load forecasting, a short-term load forecasting model based on a balanced KNN algorithm is proposed. According to the load characteristics, the massive historical power load data are divided into scenes by the K-means algorithm. In view of the unbalanced load scenes, the balanced KNN algorithm is proposed to classify the scenes accurately, and a locally weighted linear regression algorithm is used to fit and predict the load. Adopting the Apache Hadoop programming framework for cloud computing, the proposed algorithm is parallelized and improved to enhance its ability to deal with massive, high-dimensional data. Household electricity consumption data for a residential district were analyzed on a 23-node cloud computing cluster, and experimental results show that the load forecasting accuracy and execution time of the proposed model are better than those of traditional forecasting algorithms.
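One way to read "balanced KNN" is to weight neighbor votes by inverse class frequency so that rare load scenes are not swamped by common ones. The sketch below is an illustrative interpretation of that idea, not the paper's exact scheme:

```python
from collections import Counter

def balanced_knn(train, labels, query, k=3):
    """KNN vote where each neighbor's vote is scaled by 1/class-frequency."""
    freq = Counter(labels)
    dists = sorted((sum((a - b) ** 2 for a, b in zip(x, query)), y)
                   for x, y in zip(train, labels))
    votes = Counter()
    for _, y in dists[:k]:
        votes[y] += 1.0 / freq[y]   # down-weight majority-class neighbors
    return votes.most_common(1)[0][0]

# Two load scenes: "weekday" is 4x more common than "holiday"
train = [(0.9, 0.8), (0.85, 0.9), (0.8, 0.85), (0.95, 0.75), (0.3, 0.2)]
labels = ["weekday"] * 4 + ["holiday"]
print(balanced_knn(train, labels, query=(0.35, 0.25), k=3))
```

With plain majority voting the two distant weekday neighbors would outvote the single nearby holiday point; the inverse-frequency weighting lets the minority scene win, which is the imbalance problem the abstract targets.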

  10. A Comparison of Forecast Error Generators for Modeling Wind and Load Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ning; Diao, Ruisheng; Hafen, Ryan P.

    2013-07-25

This paper presents four algorithms to generate random forecast error time series, and compares their performance. The error time series are used to create real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast time series that statistically match historically observed forecasting data sets used in power grid operation, in order to study the net load balancing need in variable generation integration studies. The four algorithms are truncated-normal distribution models, state-space based Markov models, seasonal autoregressive moving average (ARMA) models, and a stochastic-optimization based approach. The comparison is made using historical DA load forecast and actual load values to generate new sets of DA forecasts with similar statistical forecast error characteristics (i.e., mean, standard deviation, autocorrelation, and cross-correlation). The results show that all methods generate satisfactory results. One method may preserve one or two required statistical characteristics better than the other methods, but may not preserve the remaining characteristics as well. Because the wind and load forecast error generators are used in wind integration studies to produce wind and load forecast time series for stochastic planning processes, it is sometimes critical to use multiple methods to generate the error time series to obtain a statistically robust result. Therefore, this paper discusses and compares the capabilities of each algorithm to preserve the characteristics of the historical forecast data sets.
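Two of the four generator families can be sketched compactly: a truncated-normal sampler (rejection sampling inside a band) and a first-order autoregressive process as a minimal stand-in for the ARMA family. Target statistics here are illustrative, not fitted to the historical data sets:

```python
import random

def truncated_normal(mu, sigma, lo, hi, n, rng):
    """Rejection-sample n values from N(mu, sigma) restricted to [lo, hi]."""
    out = []
    while len(out) < n:
        x = rng.gauss(mu, sigma)
        if lo <= x <= hi:        # reject samples outside the band
            out.append(x)
    return out

def ar1(phi, sigma, n, rng):
    """AR(1) process: autocorrelated errors, unlike i.i.d. normal draws."""
    errs, prev = [], 0.0
    for _ in range(n):
        prev = phi * prev + rng.gauss(0.0, sigma)
        errs.append(prev)
    return errs

rng = random.Random(42)
tn = truncated_normal(0.0, 0.05, -0.1, 0.1, 1000, rng)
ar = ar1(0.8, 0.03, 1000, rng)
print(max(abs(x) for x in tn))  # never exceeds the truncation band
```

The trade-off the paper measures shows up even here: the truncated-normal draws match a marginal distribution but carry no autocorrelation, while the AR(1) series preserves autocorrelation but has unbounded support.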

  11. A Short-Term and High-Resolution System Load Forecasting Approach Using Support Vector Regression with Hybrid Parameters Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Huaiguang

This work proposes an approach for distribution system load forecasting, which aims to provide highly accurate short-term load forecasting with high resolution utilizing a support vector regression (SVR) based forecaster and a two-step hybrid parameters optimization method. Specifically, because the load profiles in distribution systems contain abrupt deviations, a data normalization is designed as the pretreatment for the collected historical load data. Then an SVR model is trained by the load data to forecast the future load. For better performance of SVR, a two-step hybrid optimization algorithm is proposed to determine the best parameters. In the first step of the hybrid optimization algorithm, a designed grid traverse algorithm (GTA) is used to narrow the parameters searching area from a global to local space. In the second step, based on the result of the GTA, particle swarm optimization (PSO) is used to determine the best parameters in the local parameter space. After the best parameters are determined, the SVR model is used to forecast the short-term load deviation in the distribution system.
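The two-step search can be sketched in miniature: a coarse grid traverse narrows the range, then a stochastic local search (a stand-in for PSO) refines it. The quadratic `cv_error` below is a toy substitute for SVR cross-validation error, and the parameter ranges are invented for illustration:

```python
import random

def cv_error(c):                 # toy loss with optimum at c = 3.7
    return (c - 3.7) ** 2 + 1.0

# Step 1: grid traverse (GTA) over the global space
grid = [c / 2 for c in range(0, 21)]            # 0.0, 0.5, ..., 10.0
best = min(grid, key=cv_error)

# Step 2: stochastic local refinement around the grid winner
rng = random.Random(0)
for _ in range(200):
    cand = best + rng.uniform(-0.5, 0.5)
    if cv_error(cand) < cv_error(best):
        best = cand

print(round(best, 2))
```

The design rationale matches the abstract: the grid pass is cheap and cannot miss the right neighborhood, while the local pass delivers the fine resolution a grid alone would make expensive.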

  12. Long term load forecasting accuracy in electric utility integrated resource planning

    DOE PAGES

    Carvallo, Juan Pablo; Larsen, Peter H.; Sanstad, Alan H.; ...

    2018-05-23

Forecasts of electricity consumption and peak demand over time horizons of one or two decades are a key element in electric utilities' meeting their core objective and obligation to ensure reliable and affordable electricity supplies for their customers while complying with a range of energy and environmental regulations and policies. These forecasts are an important input to integrated resource planning (IRP) processes involving utilities, regulators, and other stakeholders. Despite their importance, however, there has been little analysis of long term utility load forecasting accuracy. We conduct a retrospective analysis of long term load forecasts made by twelve Western U.S. electric utilities in the mid-2000s and find that most overestimated both energy consumption and peak demand growth. A key reason was the use of assumptions that led to an overestimation of economic growth. We find that the complexity of forecast methods and the accuracy of these forecasts are only mildly correlated. In addition, sensitivity and risk analysis of load growth and its implications for capacity expansion were not well integrated with subsequent implementation. Finally, we review changes in the utilities' load forecasting methods over the subsequent decade, and discuss the policy implications of long term load forecast inaccuracy and its underlying causes.

  14. Short term load forecasting of anomalous load using hybrid soft computing methods

    NASA Astrophysics Data System (ADS)

    Rasyid, S. A.; Abdullah, A. G.; Mulyadi, Y.

    2016-04-01

Load forecast accuracy has a direct impact on how economical the generation cost is. Electricity consumption on holidays tends not to follow the load pattern of a normal day; such a load is defined as an anomalous load. In this paper, a hybrid ANN-particle swarm method is proposed to improve the accuracy of forecasting the anomalous loads that often occur on holidays. The proposed methodology has been used to forecast the half-hourly electricity demand for power systems in the Indonesia National Electricity Market in the West Java region. Experiments were conducted by testing various learning rates and learning data inputs, and the performance of the methodology was validated with real data from the national electricity company. The results show that the proposed method is very effective for short-term load forecasting in the case of anomalous loads. The hybrid ANN-particle swarm approach is relatively simple and easy for engineers to use as an analysis tool.
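In hybrids of this kind, PSO typically tunes the network's weights or hyperparameters. The sketch below shows a minimal particle swarm optimizer minimizing a toy one-dimensional "training error" in place of a full ANN loss; all constants are conventional PSO defaults, not values from the paper:

```python
import random

def pso(loss, lo, hi, n_particles=10, iters=100, seed=1):
    """Minimal PSO: inertia + cognitive + social velocity update."""
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                       # personal bests
    gbest = min(pos, key=loss)           # global best
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vel[i] = (0.7 * vel[i]
                      + 1.5 * r1 * (pbest[i] - pos[i])
                      + 1.5 * r2 * (gbest - pos[i]))
            pos[i] += vel[i]
            if loss(pos[i]) < loss(pbest[i]):
                pbest[i] = pos[i]
            if loss(pos[i]) < loss(gbest):
                gbest = pos[i]
    return gbest

best = pso(lambda w: (w - 2.5) ** 2, lo=-10.0, hi=10.0)
print(round(best, 3))
```

Because PSO needs only loss evaluations, not gradients, it pairs naturally with network training criteria that are noisy or non-differentiable, which is its usual appeal in these hybrids.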

  15. Residential Saudi load forecasting using analytical model and Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Al-Harbi, Ahmad Abdulaziz

In recent years, load forecasting has become one of the main fields of study and research. Short Term Load Forecasting (STLF) is an important part of electrical power system operation and planning. This work investigates the applicability of two approaches, Artificial Neural Networks (ANNs) and hybrid analytical models, to forecast residential load in the Kingdom of Saudi Arabia (KSA). Both techniques are based on modeling human behavior modes, which represent the impact of social, religious, and official occasions as well as environmental parameters. The analysis is carried out on residential areas in three regions across two countries with distinct human activities and weather conditions: the collected data are for Al-Khubar and Yanbu industrial city in KSA, in addition to Seattle, USA, to show the validity of the proposed models when applied to residential load. For each region, two models are proposed: the first forecasts next-hour load, while the second forecasts next-day load. Both models are analyzed using the two techniques. The ANN next-hour models yield very accurate results for all areas, while relatively reasonable results are achieved with the hybrid analytical model. For next-day load forecasting, the two approaches yield satisfactory results. Comparative studies were conducted to prove the effectiveness of the proposed models.

  16. Study on load forecasting to data centers of high power density based on power usage effectiveness

    NASA Astrophysics Data System (ADS)

    Zhou, C. C.; Zhang, F.; Yuan, Z.; Zhou, L. M.; Wang, F. M.; Li, W.; Yang, J. H.

    2016-08-01

There is usually considerable energy consumption in data centers. Load forecasting for data centers helps formulate regional load density indexes and is of great benefit to more accurate regional spatial load forecasting. Considering the building structure and other influential factors, i.e. equipment, geographic and climatic conditions, a method to forecast the load of data centers based on power usage effectiveness is proposed. In this method, the cooling capacity of a data center and the power usage effectiveness index are used to forecast the power load of the data center. The cooling capacity is obtained by calculating the heat load of the data center, and the index is estimated using a group decision-making method based on mixed language information. An example is given to prove the applicability and accuracy of this method.
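The core arithmetic behind a PUE-based forecast is simple: PUE is defined as total facility power divided by IT equipment power, so total load is the IT (here, cooling-derived) load times the PUE index. A minimal sketch with illustrative numbers, not data from the studied facility:

```python
def facility_load_kw(it_load_kw, pue):
    """PUE = total facility power / IT equipment power,
    so total facility load = IT load * PUE."""
    return it_load_kw * pue

it_load = 800.0      # kW, estimated from the room's heat/cooling load
pue = 1.6            # typical index for a conventional data center
print(facility_load_kw(it_load, pue))  # → 1280.0
```

The hard part, which the paper addresses with group decision-making, is estimating the PUE index itself; once it is fixed, the load forecast follows directly from the cooling-capacity estimate.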

  17. Improved Neural Networks with Random Weights for Short-Term Load Forecasting

    PubMed Central

    Lang, Kun; Zhang, Mingyuan; Yuan, Yongbo

    2015-01-01

An effective forecasting model for short-term load plays a significant role in promoting the management efficiency of an electric power system. This paper proposes a new forecasting model based on improved neural networks with random weights (INNRW). The key is to introduce a weighting technique for the inputs of the model and use a novel neural network to forecast the daily maximum load. Eight factors are selected as the inputs, and a mutual information weighting algorithm is used to allocate different weights to them. The neural network with random weights and kernels (KNNRW) is applied to approximate the nonlinear function between the selected inputs and the daily maximum load, owing to its fast learning speed and good generalization performance. In an application to daily load in Dalian, the result of the proposed INNRW is compared with several previously developed forecasting models. The simulation experiment shows that the proposed model performs the best overall in short-term load forecasting. PMID:26629825
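The mutual-information weighting idea is that inputs sharing more information with the target receive larger weights. The sketch below computes histogram-style mutual information for discretized (binned) series and normalizes the scores into weights; the data are illustrative, not the eight factors used in the paper:

```python
from math import log
from collections import Counter

def mutual_information(xs, ys):
    """MI of two discrete series from empirical joint/marginal frequencies."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

target      = [0, 0, 1, 1, 0, 1, 1, 0]  # binned daily max load
informative = [0, 0, 1, 1, 0, 1, 1, 0]  # e.g. binned temperature
noise       = [0, 1, 0, 1, 0, 1, 0, 1]  # unrelated input

mi = {"temp": mutual_information(informative, target),
      "noise": mutual_information(noise, target)}
total = sum(mi.values())
weights = {k: v / total for k, v in mi.items()} if total else mi
print(weights)
```

Here the perfectly informative input gets all the weight and the independent one gets none; real factors fall in between, which is what lets the weighting de-emphasize weak inputs without discarding them.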

  19. System load forecasts for an electric utility. [Hourly loads using Box-Jenkins method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uri, N.D.

    This paper discusses forecasting hourly system load for an electric utility using Box-Jenkins time-series analysis. The results indicate that a model based on the method of Box and Jenkins, given its simplicity, gives excellent results over the forecast horizon.
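A minimal Box-Jenkins-flavored sketch of the idea: fit an AR(1) model to an hourly load series via the lag-1 autocorrelation (a Yule-Walker estimate) and issue a one-step-ahead forecast. The series below is synthetic, not utility records:

```python
import random

def fit_ar1(series):
    """Estimate mean and AR(1) coefficient from the lag-1 autocovariance."""
    n = len(series)
    mean = sum(series) / n
    dev = [x - mean for x in series]
    phi = sum(dev[t] * dev[t - 1] for t in range(1, n)) / \
          sum(d * d for d in dev)
    return mean, phi

def forecast_next(series, mean, phi):
    """One-step-ahead AR(1) forecast: revert toward the mean by factor phi."""
    return mean + phi * (series[-1] - mean)

# Synthetic AR(1)-like hourly load around 1000 MW
rng = random.Random(7)
load = [1000.0]
for _ in range(500):
    load.append(1000.0 + 0.9 * (load[-1] - 1000.0) + rng.gauss(0, 5))

mean, phi = fit_ar1(load)
next_hour = forecast_next(load, mean, phi)
print(round(phi, 2), round(next_hour, 1))
```

The estimated coefficient recovers the generating value (0.9) closely, which illustrates the paper's point: for hourly load, even a simple Box-Jenkins-style model captures most of the persistence that drives short-horizon accuracy.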

  20. An evaluation of Bayesian techniques for controlling model complexity and selecting inputs in a neural network for short-term load forecasting.

    PubMed

    Hippert, Henrique S; Taylor, James W

    2010-04-01

    Artificial neural networks have frequently been proposed for electricity load forecasting because of their capabilities for the nonlinear modelling of large multivariate data sets. Modelling with neural networks is not an easy task though; two of the main challenges are defining the appropriate level of model complexity, and choosing the input variables. This paper evaluates techniques for automatic neural network modelling within a Bayesian framework, as applied to six samples containing daily load and weather data for four different countries. We analyse input selection as carried out by the Bayesian 'automatic relevance determination', and the usefulness of the Bayesian 'evidence' for the selection of the best structure (in terms of number of neurones), as compared to methods based on cross-validation. Copyright 2009 Elsevier Ltd. All rights reserved.

  1. Short-Term Load Forecasting Error Distributions and Implications for Renewable Integration Studies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, B. M.; Lew, D.; Milligan, M.

    2013-01-01

Load forecasting in the day-ahead timescale is a critical aspect of power system operations that is used in the unit commitment process. It is also an important factor in renewable energy integration studies, where the combination of load and wind or solar forecasting techniques creates the net load uncertainty that must be managed by the economic dispatch process or with suitable reserves. An understanding of the load forecasting errors that may be expected in this process can lead to better decisions about the amount of reserves necessary to compensate for errors. In this work, we performed a statistical analysis of the day-ahead (and two-day-ahead) load forecasting errors observed in two independent system operators for a one-year period. Comparisons were made with the normal distribution commonly assumed in power system operation simulations used for renewable power integration studies. Further analysis identified time periods when the load is more likely to be under- or overforecast.
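One simple check in this style of analysis is tail weight: excess kurtosis above zero means the error distribution has heavier tails than the normal distribution often assumed in integration studies. A sketch on synthetic errors (a mixture of mostly small errors and occasional large misses):

```python
import random
from math import sqrt

def stats(errs):
    """Return mean, standard deviation, and excess kurtosis of a sample."""
    n = len(errs)
    m = sum(errs) / n
    var = sum((e - m) ** 2 for e in errs) / n
    kurt = sum((e - m) ** 4 for e in errs) / n / var ** 2 - 3.0
    return m, sqrt(var), kurt

rng = random.Random(3)
# Mixture: mostly small errors, occasional large misses -> heavy tails
errors = [rng.gauss(0, 1) if rng.random() < 0.9 else rng.gauss(0, 4)
          for _ in range(5000)]

mean, std, excess_kurtosis = stats(errors)
print(excess_kurtosis)  # > 0 indicates heavier tails than a normal
```

The practical consequence mirrors the paper's: if large misses are more frequent than a fitted normal predicts, reserves sized from normal quantiles will be systematically short during exactly the periods that matter.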

  2. Steam-load-forecasting technique for central-heating plants. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, M.C.; Carnahan, J.V.

Because boilers generally are most efficient at full loads, the Army could achieve significant savings by running fewer boilers at high loads rather than more boilers at low loads. A reliable load prediction technique could help ensure that only those boilers required to meet demand are on line. This report presents the results of an investigation into the feasibility of forecasting heat plant steam loads from historical patterns and weather information. Using steam flow data collected at Fort Benjamin Harrison, IN, a Box-Jenkins transfer function model with an acceptably small prediction error was initially identified. Initial investigation of forecast model development appeared successful. Dynamic regression methods using actual ambient temperatures yielded the best results. Box-Jenkins univariate models' results appeared slightly less accurate. Since temperature information was not needed for model building and forecasting, however, it is recommended that Box-Jenkins models be considered prime candidates for load forecasting due to their simpler mathematics.

  3. 7 CFR 1710.205 - Minimum approval requirements for all load forecasts.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Minimum approval requirements for all load forecasts. 1710.205 Section 1710.205 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE GENERAL AND PRE-LOAN POLICIES AND PROCEDURES COMMON TO ELECTRIC LOANS AND GUARANTEES Load Forecasts §...

  4. A short-term and high-resolution distribution system load forecasting approach using support vector regression with hybrid parameters optimization

    DOE PAGES

    Jiang, Huaiguang; Zhang, Yingchen; Muljadi, Eduard; ...

    2016-01-01

This paper proposes an approach for distribution system load forecasting, which aims to provide highly accurate short-term load forecasting with high resolution utilizing a support vector regression (SVR) based forecaster and a two-step hybrid parameters optimization method. Specifically, because the load profiles in distribution systems contain abrupt deviations, a data normalization is designed as the pretreatment for the collected historical load data. Then an SVR model is trained by the load data to forecast the future load. For better performance of SVR, a two-step hybrid optimization algorithm is proposed to determine the best parameters. In the first step of the hybrid optimization algorithm, a designed grid traverse algorithm (GTA) is used to narrow the parameters searching area from a global to local space. In the second step, based on the result of the GTA, particle swarm optimization (PSO) is used to determine the best parameters in the local parameter space. After the best parameters are determined, the SVR model is used to forecast the short-term load deviation in the distribution system. The performance of the proposed approach is compared to some classic methods in later sections of the paper.

  5. 7 CFR 1710.209 - Approval requirements for load forecast work plans.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) In addition to the approved load forecast required under §§ 1710.202 and 1710.203, any power supply... that are members of a power supply borrower with a total utility plant of $500 million or more must cooperate in the preparation of and submittal of the load forecast work plan of their power supply borrower...

  6. Analysis and Synthesis of Load Forecasting Data for Renewable Integration Studies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steckler, N.; Florita, A.; Zhang, J.

    2013-11-01

As renewable energy constitutes greater portions of the generation fleet, the importance of modeling uncertainty as part of integration studies also increases. In pursuit of optimal system operations, it is important to capture not only the definitive behavior of power plants, but also the risks associated with systemwide interactions. This research examines the dependence of load forecast errors on external predictor variables such as temperature, day type, and time of day. The analysis was utilized to create statistically relevant instances of sequential load forecasts with only a time series of historic, measured load available. The creation of such load forecasts relies on Bayesian techniques for informing and updating the model, thus providing a basis for networked and adaptive load forecast models in future operational applications.
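The simplest form of conditioning synthesized errors on a predictor is to pool historical residuals per condition (e.g. day type) and sample from the matching pool; the Bayesian machinery in the paper refines this by updating the pools as data arrive. A minimal empirical-pool sketch with illustrative residuals:

```python
import random
from collections import defaultdict

# Hypothetical (day_type, forecast_residual) history
history = [("weekday", -0.02), ("weekday", 0.01), ("weekday", 0.00),
           ("weekend", 0.06), ("weekend", 0.08), ("weekend", 0.05)]

pools = defaultdict(list)
for day_type, resid in history:
    pools[day_type].append(resid)

def sample_error(day_type, rng):
    """Draw a synthetic forecast error from the matching conditional pool."""
    return rng.choice(pools[day_type])

rng = random.Random(0)
synthetic = [sample_error("weekend", rng) for _ in range(4)]
print(synthetic)  # all drawn from the weekend residual pool
```

Conditioning this way preserves the dependence structure the paper measures: weekend errors here are systematically positive, a pattern an unconditional error generator would average away.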

  7. Comparison of Wind Power and Load Forecasting Error Distributions: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, B. M.; Florita, A.; Orwig, K.

    2012-07-01

The introduction of large amounts of variable and uncertain power sources, such as wind power, into the electricity grid presents a number of challenges for system operations. One issue involves the uncertainty associated with scheduling power that wind will supply in future timeframes. However, this is not an entirely new challenge; load is also variable and uncertain, and is strongly influenced by weather patterns. In this work we make a comparison between the day-ahead forecasting errors encountered in wind power forecasting and load forecasting. The study examines the distribution of errors from operational forecasting systems in two different Independent System Operator (ISO) regions for both wind power and load forecasts at the day-ahead timeframe. The day-ahead timescale is critical in power system operations because it serves the unit commitment function for slow-starting conventional generators.

  8. Compensated Box-Jenkins transfer function for short term load forecast

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Breipohl, A.; Yu, Z.; Lee, F.N.

In past years, the Box-Jenkins ARIMA method and the Box-Jenkins transfer function (BJTF) method have been among the most commonly used methods for short term electrical load forecasting. But when there is a sudden change in temperature, both methods tend to exhibit larger forecast errors. This paper demonstrates that the load forecasting errors resulting from either the BJ ARIMA model or the BJTF model are not simply white noise, but rather well-patterned noise, and that the patterns in the noise can be used to improve the forecasts. Thus a compensated Box-Jenkins transfer function (CBJTF) method is proposed to improve the accuracy of the load prediction. Case studies show about a 14-33% reduction in the root mean square (RMS) errors of the forecasts, depending on the compensation time period as well as the compensation method used.
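The compensation idea can be sketched simply: if base-model errors are patterned rather than white, the previous residual predicts part of the next one, so feeding a fraction of the last error back into the forecast reduces RMS error. The fraction and numbers below are illustrative, not the paper's compensation scheme or case-study data:

```python
def compensated(base_forecasts, actuals, alpha=0.7):
    """Correct each base forecast with a fraction of the previous residual."""
    out, last_err = [], 0.0
    for f, a in zip(base_forecasts, actuals):
        corrected = f + alpha * last_err   # reuse the residual pattern
        out.append(corrected)
        last_err = a - f                   # base model's new residual
    return out

base = [100.0, 102.0, 104.0, 106.0]
actual = [103.0, 105.0, 107.0, 109.0]     # base persistently low by 3
corr = compensated(base, actual)

rms = lambda xs: (sum(x * x for x in xs) / len(xs)) ** 0.5
print(rms([a - f for f, a in zip(base, actual)]),
      rms([a - c for c, a in zip(corr, actual)]))
```

On this toy persistent-bias series the correction cuts RMS error substantially; if the residuals were truly white noise, the same feedback would help not at all, which is exactly the diagnostic distinction the paper draws.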

  9. Advanced Intelligent System Application to Load Forecasting and Control for Hybrid Electric Bus

    NASA Technical Reports Server (NTRS)

    Momoh, James; Chattopadhyay, Deb; Elfayoumy, Mahmoud

    1996-01-01

    The primary motivation for this research emanates from providing a decision support system to the electric bus operators in the municipal and urban localities which will guide the operators to maintain an optimal compromise among the noise level, pollution level, fuel usage etc. This study is backed up by our previous studies on study of battery characteristics, permanent magnet DC motor studies and electric traction motor size studies completed in the first year. The operator of the Hybrid Electric Car must determine optimal power management schedule to meet a given load demand for different weather and road conditions. The decision support system for the bus operator comprises three sub-tasks viz. forecast of the electrical load for the route to be traversed divided into specified time periods (few minutes); deriving an optimal 'plan' or 'preschedule' based on the load forecast for the entire time-horizon (i.e., for all time periods) ahead of time; and finally employing corrective control action to monitor and modify the optimal plan in real-time. A fully connected artificial neural network (ANN) model is developed for forecasting the kW requirement for hybrid electric bus based on inputs like climatic conditions, passenger load, road inclination, etc. The ANN model is trained using back-propagation algorithm employing improved optimization techniques like projected Lagrangian technique. The pre-scheduler is based on a Goal-Programming (GP) optimization model with noise, pollution and fuel usage as the three objectives. GP has the capability of analyzing the trade-off among the conflicting objectives and arriving at the optimal activity levels, e.g., throttle settings. The corrective control action or the third sub-task is formulated as an optimal control model with inputs from the real-time data base as well as the GP model to minimize the error (or deviation) from the optimal plan. 
These three activities are linked, with the ANN forecaster providing its output to the GP model, which in turn produces the pre-schedule for the optimal control model. Some preliminary results based on a hypothetical test case will be presented for the load forecasting module. The computer codes for the three modules will be made available for adoption by bus operating agencies. Sample results will be provided using these models. The software will be a useful tool for supporting the control systems for the Electric Bus project of NASA.

  10. 7 CFR 1710.202 - Requirement to prepare a load forecast-power supply borrowers.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 11 2011-01-01 2011-01-01 false Requirement to prepare a load forecast-power supply...—power supply borrowers. (a) A power supply borrower with a total utility plant of $500 million or more... be prepared pursuant to the approved load forecast work plan. (b) A power supply borrower that is a...

  11. [Demography perspectives and forecasts of the demand for electricity].

    PubMed

    Roy, L; Guimond, E

    1995-01-01

    "Demographic perspectives form an integral part in the development of electric load forecasts. These forecasts in turn are used to justify the addition and repair of generating facilities that will supply power in the coming decades. The goal of this article is to present how demographic perspectives are incorporated into the electric load forecasting in Quebec. The first part presents the methods, hypotheses and results of population and household projections used by Hydro-Quebec in updating its latest development plan. The second section demonstrates applications of such demographic projections for forecasting the electric load, with a focus on the residential sector." (SUMMARY IN ENG AND SPA) excerpt

  12. Studies of air traffic forecasts, airspace load and the effect of ADS-B via satellites on flight times

    NASA Astrophysics Data System (ADS)

    Zhong, Z. W.; Ridhwan Salleh, Saiful; Chow, W. X.; Ong, Z. M.

    2016-10-01

Air traffic forecasting is important as it helps stakeholders plan their budgets and facilities. Three of the most commonly used forecasting models were therefore compared to see which best suited the air passenger traffic, and general forecasting equations were created to forecast passenger traffic; the equations forecast around 6.0% growth from 2015 onwards. Another study sought to provide initial work for determining a theoretical airspace load with relevant calculations: the air traffic was simulated to investigate the current airspace load, and logical and reasonable results were obtained from the modelling and simulations. The current utilization percentages for airspace load per hour and for static airspace load in the airspace of interest were found to be 6.64% and 11.21%, respectively. Our research also studied how ADS-B would affect aircraft travel times; 6000 flights departing from and landing at the airport were studied. New flight plans were simulated with flight paths improved by the implementation of ADS-B, and the flight times of all studied flights could be improved.

  13. 7 CFR 1710.207 - RUS criteria for approval of load forecasts by distribution borrowers not required to maintain an...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false RUS criteria for approval of load forecasts by distribution borrowers not required to maintain an approved load forecast on an ongoing basis. 1710.207 Section 1710.207 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE GENERAL AND PR...

  14. Short term load forecasting using a self-supervised adaptive neural network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, H.; Pimmel, R.L.

The authors developed a self-supervised adaptive neural network to perform short term load forecasts (STLF) for a large power system covering a wide service area with several heavy load centers. They used the self-supervised network to extract correlational features from temperature and load data. Using data from the calendar year 1993 as a test case, they found a 0.90 percent error for hour-ahead forecasting and a 1.92 percent error for day-ahead forecasting. These levels of error compare favorably with those obtained by other techniques. The algorithm ran in a couple of minutes on a PC containing an Intel Pentium 120 MHz CPU. Since the algorithm included searching the historical database, training the network, and actually performing the forecasts, this approach provides a real-time, portable, and adaptable STLF.
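Percent-error figures like those quoted above are typically mean absolute percentage errors (MAPE); assuming that metric, a minimal sketch on illustrative hourly data:

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) \
           / len(actual)

actual   = [1000.0, 1100.0, 1050.0, 980.0]   # observed load (MW)
forecast = [ 990.0, 1120.0, 1040.0, 985.0]   # hour-ahead forecasts (MW)
print(round(mape(actual, forecast), 2))
```

Because MAPE normalizes by the actual load, it lets hour-ahead and day-ahead models be compared on one scale even as the load level varies across the service area.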

  15. Energy management of a university campus utilizing short-term load forecasting with an artificial neural network

    NASA Astrophysics Data System (ADS)

    Palchak, David

    Electrical load forecasting is a tool that has been utilized by distribution designers and operators as a means for resource planning and generation dispatch. The techniques employed in these predictions are proving useful in the growing market of consumer, or end-user, participation in electrical energy consumption. These predictions are based on exogenous variables, such as weather, and time variables, such as day of week and time of day as well as prior energy consumption patterns. The participation of the end-user is a cornerstone of the Smart Grid initiative presented in the Energy Independence and Security Act of 2007, and is being made possible by the emergence of enabling technologies such as advanced metering infrastructure. The optimal application of the data provided by an advanced metering infrastructure is the primary motivation for the work done in this thesis. The methodology for using this data in an energy management scheme that utilizes a short-term load forecast is presented. The objective of this research is to quantify opportunities for a range of energy management and operation cost savings of a university campus through the use of a forecasted daily electrical load profile. The proposed algorithm for short-term load forecasting is optimized for Colorado State University's main campus, and utilizes an artificial neural network that accepts weather and time variables as inputs. The performance of the predicted daily electrical load is evaluated using a number of error measurements that seek to quantify the best application of the forecast. The energy management presented utilizes historical electrical load data from the local service provider to optimize the time of day that electrical loads are being managed. Finally, the utilization of forecasts in the presented energy management scenario is evaluated based on cost and energy savings.
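    As a rough illustration of the kind of model the thesis describes (an artificial neural network fed weather and time variables), the sketch below trains a small feed-forward network on synthetic hourly load. The data, feature choices, and network size are all invented for the example; the thesis's actual model, inputs, and campus data differ.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
hours = np.arange(24 * 365)
temp = 15 + 10 * np.sin(2 * np.pi * hours / (24 * 365)) + rng.normal(0, 2, hours.size)
hour_of_day = hours % 24
day_of_week = (hours // 24) % 7
# Synthetic campus-like load: daily cycle, weekday effect, temperature sensitivity.
load = (50 + 20 * np.sin(2 * np.pi * (hour_of_day - 6) / 24)
        + 10 * (day_of_week < 5) + 0.8 * np.abs(temp - 18)
        + rng.normal(0, 2, hours.size))

# Time-of-day is encoded cyclically so hour 23 sits next to hour 0.
X = np.column_stack([np.sin(2 * np.pi * hour_of_day / 24),
                     np.cos(2 * np.pi * hour_of_day / 24),
                     (day_of_week < 5).astype(float), temp])
scaler = StandardScaler().fit(X[:-168])
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
model.fit(scaler.transform(X[:-168]), load[:-168])     # hold out the last week
pred = model.predict(scaler.transform(X[-168:]))
mape = np.mean(np.abs((load[-168:] - pred) / load[-168:])) * 100
```

    The error measure shown (MAPE) is one of several the thesis evaluates; which measure best reflects the energy-management application is itself part of that study.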

  16. Short-term load forecasting using neural network for future smart grid application

    NASA Astrophysics Data System (ADS)

    Zennamo, Joseph Anthony, III

    Short-term load forecasting of power systems has long been a classic problem. Not only has it been researched extensively and intensively, but a variety of forecasting methods has also been proposed. This thesis outlines some aspects and functions of smart meters. It also presents the policies, current status, and future projects and objectives of SG development in several countries, and compares the main features of the latest smart meter products from different companies. Finally, three types of prediction models are built in MATLAB to emulate the short-term load forecasting function of a smart grid, and their results are compared and analyzed in terms of accuracy. In this thesis, additional variables such as dew point temperature are used in the neural network model to achieve higher accuracy in the short-term load forecasting results.

  17. Uses and Applications of Climate Forecasts for Power Utilities.

    NASA Astrophysics Data System (ADS)

    Changnon, Stanley A.; Changnon, Joyce M.; Changnon, David

    1995-05-01

    The uses and potential applications of climate forecasts for electric and gas utilities were assessed 1) to discern needs for improving climate forecasts and guiding future research, and 2) to assist utilities in making wise use of forecasts. In-depth structured interviews were conducted with 56 decision makers in six utilities to assess existing and potential uses of climate forecasts. Only 3 of the 56 use forecasts. Eighty percent of those sampled envisioned applications of climate forecasts, given certain changes and additional information. Primary applications exist in power trading, load forecasting, fuel acquisition, and systems planning, with slight differences in interests between utilities. Utility staff understand probability-based forecasts but desire climatological information related to forecasted outcomes, including analogs similar to the forecasts, and explanations of the forecasts. Desired lead times vary from a week to three months, along with forecasts of up to four seasons ahead. The new NOAA forecasts initiated in 1995 provide the lead times and longer-term forecasts desired. Major hindrances to use of forecasts are hard-to-understand formats, lack of corporate acceptance, and lack of access to expertise. Recent changes in government regulations altered the utility industry, leading to a more competitive world wherein information about future weather conditions assumes much more value. Outreach efforts by government forecast agencies appear valuable to help achieve the appropriate and enhanced use of climate forecasts by the utility industry. An opportunity for service exists also for the private weather sector.

  18. Short-term forecasting of individual household electricity loads with investigating impact of data resolution and forecast horizon

    NASA Astrophysics Data System (ADS)

    Yildiz, Baran; Bilbao, Jose I.; Dore, Jonathon; Sproul, Alistair B.

    2018-05-01

    Smart grid components such as smart home and battery energy management systems, high penetration of renewable energy systems, and demand response activities require accurate electricity demand forecasts for the successful operation of electricity distribution networks. For example, in order to optimize residential PV generation and electricity consumption and to plan battery charge-discharge regimes by scheduling household appliances, forecasts need to target and be tailored to individual household electricity loads. The recent uptake of smart meters allows easier access to electricity readings at very fine resolutions; hence, it is possible to utilize this source of available data to create forecast models. In this paper, models which predominantly use smart meter data alongside weather variables, or smart meter based models (SMBM), are implemented to forecast individual household loads. Well-known machine learning models such as artificial neural networks (ANN), support vector machines (SVM) and Least-Squares SVM are implemented within the SMBM framework and their performance is compared. The analysed household stock consists of 14 households from the state of New South Wales, Australia, each with at least a year's worth of 5-min resolution data. In order for the results to be comparable between different households, our study first investigates household load profiles according to their volatility and reveals the relationship between load standard deviation and forecast performance. The analysis extends previous research by evaluating forecasts over four data resolutions (5, 15, 30 and 60 min), each analysed for four horizons (1, 6, 12 and 24 h ahead). Both data resolution and forecast horizon proved to have a significant impact on forecast performance, and the obtained results provide important insights for the operation of various smart grid applications.
    Finally, it is shown that the load profiles of some households vary significantly across different days; as a result, a single model for the entire period may deliver limited performance. Through a pre-clustering step, similar daily load profiles are grouped together according to their standard deviation, and instead of applying one SMBM to the entire data set of a particular household, separate SMBMs are applied to each cluster. This preliminary clustering step increases the complexity of the analysis, but it results in significant improvements in forecast performance.
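    The pre-clustering idea can be sketched in a few lines: group days by the standard deviation of their load profile and fit one model per group. Everything below (the synthetic household readings, the two-cluster split at the mean standard deviation, and a cluster-mean profile standing in for a fully trained SMBM) is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-in for one household: 200 days of 48 half-hourly readings,
# mixing calm days with volatile ones (values invented).
calm = 0.5 + 0.1 * rng.random((120, 48))
volatile = 0.5 + 1.5 * rng.random((80, 48))
days = np.vstack([calm, volatile])

stds = days.std(axis=1)
threshold = stds.mean()                  # simple two-cluster split on volatility
labels = (stds > threshold).astype(int)

# One "model" per cluster: here just the cluster-mean daily profile, standing
# in for a separately trained SMBM on each cluster.
models = {k: days[labels == k].mean(axis=0) for k in (0, 1)}

def forecast(day_profile):
    """Route a day to its volatility cluster, then apply that cluster's model."""
    k = int(day_profile.std() > threshold)
    return models[k]
```

    The paper's actual clustering and per-cluster forecast models are more sophisticated; the point here is only the routing structure: cluster first, then forecast within the cluster.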

  19. Towards smart energy systems: application of kernel machine regression for medium term electricity load forecasting.

    PubMed

    Alamaniotis, Miltiadis; Bargiotas, Dimitrios; Tsoukalas, Lefteri H

    2016-01-01

    Integration of energy systems with information technologies has facilitated the realization of smart energy systems that utilize information to optimize system operation. To that end, accurate, ahead-of-time forecasting of load demand is crucial for optimizing energy system operation. In particular, load forecasting allows planning of system expansion and decision making for enhancing system safety and reliability. In this paper, the application of two types of kernel machines for medium term load forecasting (MTLF) is presented and their performance is recorded based on a set of historical electricity load demand data. The two kernel machine models, namely Gaussian process regression (GPR) and relevance vector regression (RVR), are utilized for making predictions over future load demand. Both models, i.e., GPR and RVR, are equipped with a Gaussian kernel and are tested on daily predictions over a 30-day-ahead horizon taken from the New England Area. Furthermore, their performance is compared to the ARMA(2,2) model with respect to mean absolute percentage error and squared correlation coefficient. Results demonstrate the superiority of RVR over the other forecasting models in performing MTLF.
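    As a generic kernel-machine sketch in the spirit of this entry: scikit-learn does not ship RVR, so kernel ridge regression with an RBF (Gaussian) kernel stands in below, evaluated on a 30-day-ahead hold-out. The daily demand series, features, and hyperparameters are all synthetic assumptions, not the paper's New England data or models.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(2)
t = np.arange(365.0)
# Synthetic daily demand with weekly and annual cycles.
load = (1000 + 150 * np.sin(2 * np.pi * t / 7)
        + 300 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 30, t.size))

# Cyclic calendar features for the kernel machine.
X = np.column_stack([np.sin(2 * np.pi * t / 7), np.cos(2 * np.pi * t / 7),
                     np.sin(2 * np.pi * t / 365), np.cos(2 * np.pi * t / 365)])
train, test = slice(0, 335), slice(335, 365)    # last 30 days held out (MTLF horizon)
model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.5)
model.fit(X[train], load[train])
pred = model.predict(X[test])
mape = np.mean(np.abs((load[test] - pred) / load[test])) * 100
```

    GPR would additionally return a predictive variance; RVR yields sparse solutions. Both share the kernel-regression structure shown here.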

  20. A stochastic post-processing method for solar irradiance forecasts derived from NWPs models

    NASA Astrophysics Data System (ADS)

    Lara-Fanego, V.; Pozo-Vazquez, D.; Ruiz-Arias, J. A.; Santos-Alamillos, F. J.; Tovar-Pescador, J.

    2010-09-01

    Solar irradiance forecasting is an important area of research for the future of solar-based renewable energy systems. Numerical Weather Prediction models (NWPs) have proved to be a valuable tool for solar irradiance forecasting with lead times of up to a few days. Nevertheless, these models show low skill in forecasting solar irradiance under cloudy conditions. Additionally, climatic (averaged over seasons) aerosol loadings are usually considered in these models, leading to considerable errors in Direct Normal Irradiance (DNI) forecasts under high aerosol load conditions. In this work we propose a post-processing method for the Global Irradiance (GHI) and DNI forecasts derived from NWPs. Particularly, the method is based on the use of Autoregressive Moving Average with External Explanatory Variables (ARMAX) stochastic models. These models are applied to the residuals of the NWP forecasts and use as external variables the measured cloud fraction and aerosol loading of the day previous to the forecast. The method is evaluated on a set of one-month-long, three-day-ahead forecasts of GHI and DNI, obtained with the WRF mesoscale atmospheric model for several locations in Andalusia (Southern Spain). The cloud fraction is derived from MSG satellite estimates and the aerosol loading from MODIS platform estimates; both sources of information are readily available at the time of the forecast. Results showed a considerable improvement in the forecasting skill of the WRF model with the proposed post-processing method. Particularly, the relative improvement (in terms of RMSE) for DNI during summer is about 20%, and a similar value is obtained for GHI during winter.
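    The residual-modelling step can be sketched with a simplified ARX(1) fit by least squares (a full ARMAX model, e.g. via statsmodels, would also carry moving-average terms). The residual series, the coefficients, and the noise levels below are all synthetic; only the structure (autoregressive residuals driven by prior-day cloud fraction and aerosol load) mirrors the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 120
cloud = rng.random(n)                    # prior-day measured cloud fraction
aerosol = 0.1 + 0.3 * rng.random(n)      # prior-day aerosol loading
# Synthetic NWP forecast residuals: autocorrelated and driven by both externals.
resid = np.zeros(n)
for k in range(1, n):
    resid[k] = 0.6 * resid[k - 1] + 40 * cloud[k] + 60 * aerosol[k] + rng.normal(0, 5)

# ARX(1) fit by least squares:
#   resid[k] ~ a*resid[k-1] + b*cloud[k] + c*aerosol[k] + d
A = np.column_stack([resid[:-1], cloud[1:], aerosol[1:], np.ones(n - 1)])
coef, *_ = np.linalg.lstsq(A, resid[1:], rcond=None)
predicted_resid = coef @ np.array([resid[-1], cloud[-1], aerosol[-1], 1.0])
# Subtracting predicted_resid from the raw NWP forecast gives the corrected value.
```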

  1. A VWWBO-BVO-based GM (1,1) and its parameter optimization by GRA-IGSA integration algorithm for annual power load forecasting

    PubMed Central

    Wang, Hongguang

    2018-01-01

    Annual power load forecasting is not only the premise of formulating reasonable macro power planning, but also an important guarantee for the safe and economic operation of the power system. Given the characteristics of annual power load forecasting, the grey model GM (1,1) is widely applied. Introducing a buffer operator into GM (1,1) to pre-process the historical annual power load data is one approach to improving forecasting accuracy. To solve the problem of the non-adjustable action intensity of the traditional weakening buffer operator, a variable-weight weakening buffer operator (VWWBO) and background value optimization (BVO) are used to dynamically pre-process the historical annual power load data, and a VWWBO-BVO-based GM (1,1) is proposed. To find the optimal values of the variable-weight buffer coefficient and the background value weight generating coefficient of the proposed model, grey relational analysis (GRA) and an improved gravitational search algorithm (IGSA) are integrated into a GRA-IGSA integration algorithm, which aims to maximize the grey relativity between the simulated value sequence and the actual value sequence. Through the adjustable action intensity of the buffer operator, the proposed model optimized by the GRA-IGSA integration algorithm obtains better forecasting accuracy, as demonstrated by the case studies, and can provide an optimized solution for annual power load forecasting. PMID:29768450
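    For readers unfamiliar with grey models, a plain GM (1,1) (without the paper's buffer-operator and background-value optimizations) can be sketched as below. The annual load history in the demo is invented; the algorithm itself is the standard textbook form: accumulate the series, fit the development coefficients by least squares, then difference the fitted response back.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Textbook GM(1,1): accumulate (AGO), fit (a, b) by least squares on the
    whitened equation x0(k) + a*z1(k) = b, then difference the response back."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                            # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                 # background values
    B = np.column_stack([-z1, np.ones(z1.size)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(x0.size + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x1_hat[0]], np.diff(x1_hat)])
    return x0_hat[-steps:]                        # the `steps` values beyond the data

# Hypothetical annual load history (GWh) growing roughly 5% a year.
history = np.array([2850.0, 2990.0, 3140.0, 3300.0, 3470.0, 3650.0])
nxt = gm11_forecast(history, steps=3)
```

    The paper's VWWBO pre-processing would transform `history` before this fit, and its BVO step would replace the fixed 0.5 weighting in `z1` with an optimized coefficient.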

  2. 7 CFR 1710.206 - Approval requirements for load forecasts prepared pursuant to approved load forecast work plans.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... narrative shall address the overall approach, time periods, and expected internal and external uses of the forecast. Examples of internal uses include providing information for developing or monitoring demand side... suppliers. Examples of external uses include meeting state and Federal regulatory requirements, obtaining...

  3. Performance of fuzzy approach in Malaysia short-term electricity load forecasting

    NASA Astrophysics Data System (ADS)

    Mansor, Rosnalini; Zulkifli, Malina; Yusof, Muhammad Mat; Ismail, Mohd Isfahani; Ismail, Suzilah; Yin, Yip Chee

    2014-12-01

    Many activities, such as those in the economic, education and manufacturing sectors, would be paralysed by a limited supply of electricity, while a surplus contributes to high operating cost. Electricity load forecasting is therefore important in order to avoid shortage or excess. Previous findings showed that festive celebrations have an effect on short-term electricity load forecasting. Being a multicultural country, Malaysia has many major festive celebrations, such as Eidul Fitri, Chinese New Year and Deepavali, but these are moving holidays because their dates are not fixed on the Gregorian calendar. This study emphasises the performance of a fuzzy approach in forecasting electricity load when the presence of moving holidays is considered. An Autoregressive Distributed Lag model was estimated using simulated data, incorporating model simplification (manual or automatic), day types (weekday or weekend), public holidays and lags of electricity load. The results indicated that day types, public holidays and several lags of electricity load were significant in the model. Overall, model simplification improved fuzzy performance because fewer variables and rules were needed.

  4. A Gaussian Processes Technique for Short-term Load Forecasting with Considerations of Uncertainty

    NASA Astrophysics Data System (ADS)

    Ohmi, Masataro; Mori, Hiroyuki

    In this paper, an efficient method based on Gaussian Processes is proposed for short-term load forecasting. Short-term load forecasting plays a key role in smooth power system operation, such as economic load dispatch, unit commitment, etc. Recently, the deregulated and competitive power market has increased the degree of uncertainty, making better prediction results more important for saving cost. One of the most important aspects is that power system operators need the upper and lower bounds of the predicted load to deal with the uncertainty, in addition to more accurate predicted values. The proposed method is based on the Bayes model, in which the output is expressed as a distribution rather than a point. To realize the model efficiently, this paper employs Gaussian Processes, which combine the Bayes linear model with a kernel machine to obtain the distribution of the predicted value. The proposed method is successfully applied to real data for daily maximum load forecasting.
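    The distinctive output here, a predictive distribution with upper and lower bounds rather than a point forecast, can be illustrated with scikit-learn's Gaussian process regressor. The daily maximum load series and kernel settings below are synthetic assumptions, not the paper's data or exact formulation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)
day = np.arange(60.0)
# Synthetic daily maximum load with a weekly cycle plus noise.
peak = 900.0 + 120.0 * np.sin(2 * np.pi * day / 7) + rng.normal(0, 15, day.size)

X = day.reshape(-1, 1)
kernel = RBF(length_scale=3.0) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True, random_state=0)
gp.fit(X[:-7], peak[:-7])                       # hold out the last week
mean, std = gp.predict(X[-7:], return_std=True)
# The operator-facing output: a band, not a point.
lower, upper = mean - 1.96 * std, mean + 1.96 * std
```

    An operator can schedule against `upper` (capacity adequacy) while trading against `mean`, which is exactly the use case the abstract motivates.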

  5. Forecasting of hourly load by pattern recognition in a small area power system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dehdashti-Shahrokh, A.

    1982-01-01

    An intuitive, logical, simple and efficient method of forecasting hourly load in a small area power system is presented. A pattern recognition approach is used in developing the forecasting model. Pattern recognition techniques are powerful tools in the field of artificial intelligence (cybernetics) and simulate the way the human brain makes decisions. Pattern recognition is generally used in the analysis of processes where the total physical nature behind the process variation is unknown but specific kinds of measurements explain their behavior. In this research, basic multivariate analyses, in conjunction with pattern recognition techniques, are used to develop a linear deterministic model to forecast hourly load. This method assumes that load patterns in the same geographical area are direct results of climatological changes (weather-sensitive load), and have occurred in the past under similar climatic conditions. The algorithm described here searches for the best possible pattern from a seasonal library of load and weather data in forecasting hourly load. To accommodate the unpredictability of weather and the resulting load, the basic twenty-four-hour load pattern was divided into eight three-hour intervals. This division was made to make the model adaptive to sudden climatic changes. The proposed method offers flexible lead times of one to twenty-four hours. The results of actual data testing indicated that the proposed method is computationally efficient and highly adaptive, with acceptable data storage requirements and accuracy comparable to many other existing methods.
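    The core pattern-search step can be sketched as a nearest-neighbour lookup over a seasonal library: find the historical day whose weather best matches tomorrow's, and reuse its load pattern. The library, features, and load values below are synthetic; the real method also splits the day into eight three-hour intervals and applies multivariate analysis.

```python
import numpy as np

rng = np.random.default_rng(5)
# Seasonal library: 90 historical days of weather features and their
# eight 3-hour load-block averages (all values synthetic).
weather = np.column_stack([rng.uniform(-5, 35, 90),    # daily temperature
                           rng.uniform(20, 100, 90)])  # daily humidity
lib_loads = 40.0 + 0.6 * weather[:, :1] + rng.normal(0, 2, (90, 8))

def forecast_day(tomorrow_weather):
    """Reuse the load pattern of the historical day with the closest weather."""
    scale = weather.std(axis=0)          # normalize so units do not dominate
    dist = np.linalg.norm((weather - tomorrow_weather) / scale, axis=1)
    return lib_loads[np.argmin(dist)]

pattern = forecast_day(np.array([28.0, 60.0]))   # tomorrow: warm, moderate humidity
```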

  6. Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yannan; Hou, Zhangshuan; Meng, Da

    2016-07-17

    In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus on maintaining the inter-dependency between multiple geographically related areas, and are applied to cross-correlated load time series as well as their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.

  7. 7 CFR 1710.206 - Approval requirements for load forecasts prepared pursuant to approved load forecast work plans.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... effects on electric revenues caused by competition from alternative energy sources or other electric... uncertainty or alternative futures that may determine the borrower's actual loads. Examples of economic... basis. Include alternative futures, as applicable. This summary shall be designed to accommodate the...

  8. 7 CFR 1710.206 - Approval requirements for load forecasts prepared pursuant to approved load forecast work plans.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... effects on electric revenues caused by competition from alternative energy sources or other electric... uncertainty or alternative futures that may determine the borrower's actual loads. Examples of economic... basis. Include alternative futures, as applicable. This summary shall be designed to accommodate the...

  9. 7 CFR 1710.206 - Approval requirements for load forecasts prepared pursuant to approved load forecast work plans.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... effects on electric revenues caused by competition from alternative energy sources or other electric... uncertainty or alternative futures that may determine the borrower's actual loads. Examples of economic... basis. Include alternative futures, as applicable. This summary shall be designed to accommodate the...

  10. 7 CFR 1710.206 - Approval requirements for load forecasts prepared pursuant to approved load forecast work plans.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... effects on electric revenues caused by competition from alternative energy sources or other electric... uncertainty or alternative futures that may determine the borrower's actual loads. Examples of economic... basis. Include alternative futures, as applicable. This summary shall be designed to accommodate the...

  11. 7 CFR 1710.205 - Minimum approval requirements for all load forecasts.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... electronically to RUS computer software applications. RUS will evaluate borrower load forecasts for readability...'s engineering planning documents, such as the construction work plan, incorporate consumer and usage...

  12. 7 CFR 1710.205 - Minimum approval requirements for all load forecasts.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... electronically to RUS computer software applications. RUS will evaluate borrower load forecasts for readability...'s engineering planning documents, such as the construction work plan, incorporate consumer and usage...

  13. 7 CFR 1710.205 - Minimum approval requirements for all load forecasts.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... computer software applications. RUS will evaluate borrower load forecasts for readability, understanding..., distribution costs, other systems costs, average revenue per kWh, and inflation. Also, a borrower's engineering...

  14. 7 CFR 1710.205 - Minimum approval requirements for all load forecasts.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... electronically to RUS computer software applications. RUS will evaluate borrower load forecasts for readability...'s engineering planning documents, such as the construction work plan, incorporate consumer and usage...

  15. Daily Peak Load Forecasting of Next Day using Weather Distribution and Comparison Value of Each Nearby Date Data

    NASA Astrophysics Data System (ADS)

    Ito, Shigenobu; Yukita, Kazuto; Goto, Yasuyuki; Ichiyanagi, Katsuhiro; Nakano, Hiroyuki

    With the development of industry in recent years, dependence on electric energy has grown year by year, so a reliable electric power supply is needed. However, storing a huge amount of electric energy is very difficult, and the balance between demand and supply, which changes hour by hour, must be maintained. Consequently, to supply high-quality, highly dependable electric power economically and with high efficiency, the movement of electric power demand must be forecast carefully in advance, and the supply and demand management plan should be based on that forecast. Load forecasting is thus an important task in the demand management of electric power companies. So far, forecasting methods using fuzzy logic, neural networks, and regression models have been suggested to improve forecasting accuracy, and their accuracy is at a high level. But to dispatch electric power more economically and with higher accuracy, a new forecasting method with still higher accuracy is needed. In this paper, to improve on the accuracy of the former methods, a daily peak load forecasting method is suggested that uses the weather distribution of the highest and lowest temperatures and the comparison value of each nearby date's data.

  16. A clustering-based fuzzy wavelet neural network model for short-term load forecasting.

    PubMed

    Kodogiannis, Vassilis S; Amina, Mahdi; Petrounias, Ilias

    2013-10-01

    Load forecasting is a critical element of power system operation, involving prediction of the future level of demand to serve as the basis for supply and demand planning. This paper presents the development of a novel clustering-based fuzzy wavelet neural network (CB-FWNN) model and validates its prediction on the short-term electric load forecasting of the power system of the Greek island of Crete. The proposed model is obtained from the traditional Takagi-Sugeno-Kang fuzzy system by replacing the THEN part of the fuzzy rules with a "multiplication" wavelet neural network (MWNN). Multidimensional Gaussian activation functions have been used in the IF part of the fuzzy rules. A fuzzy subtractive clustering scheme is employed as a pre-processing technique to find the initial set and adequate number of clusters, and ultimately the number of multiplication nodes in the MWNN, while Gaussian Mixture Models with the Expectation Maximization algorithm are utilized for the definition of the multidimensional Gaussians. The results corresponding to the minimum and maximum power load indicate that the proposed load forecasting model provides significantly accurate forecasts compared to conventional neural network models.

  17. Optimize Short Term Load Forecasting Anomalous Based Feed Forward Backpropagation

    NASA Astrophysics Data System (ADS)

    Mulyadi, Y.; Abdullah, A. G.; Rohmah, K. A.

    2017-03-01

    This paper presents Short-Term Load Forecasting (STLF) using an artificial neural network, specifically a feed-forward backpropagation algorithm that is tuned to reduce the resulting error value. The forecasting target is holidays, whose load patterns are not consistent and differ from the weekday pattern; in other words, the holiday load pattern is anomalous. Under these conditions, forecasting accuracy decreases, so a method capable of reducing the error in anomalous load forecasting is needed. The learning process of the algorithm is supervised, so several parameters are set before the computation is performed. The momentum constant is set to 0.8, which serves as a reference because it shows the greatest tendency to converge. The learning rate is selected to two decimal digits. In addition, several variations of the number of hidden layers and input components are tested. The test results lead to the conclusion that the number of hidden layers impacts the forecasting accuracy, and that the test duration is determined by the number of iterations performed on the input data until a parameter's maximum value is reached.
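    The momentum update at the heart of this entry can be demonstrated with a tiny feed-forward network trained by backpropagation, where every weight step blends in 0.8 of the previous step. The toy regression target, layer sizes, and learning rate are invented for the sketch; only the momentum mechanics follow the paper's setting.

```python
import numpy as np

rng = np.random.default_rng(6)
# Toy smooth regression target standing in for a load curve.
X = rng.uniform(-1, 1, (200, 2))
y = (np.sin(np.pi * X[:, 0]) + 0.5 * X[:, 1]).reshape(-1, 1)

# One hidden tanh layer; every weight update carries momentum 0.8.
W1, b1 = rng.normal(0, 0.5, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
params = [W1, b1, W2, b2]
vel = [np.zeros_like(p) for p in params]
lr, momentum = 0.03, 0.8

for epoch in range(1500):
    h = np.tanh(X @ W1 + b1)                 # forward pass
    err = (h @ W2 + b2) - y
    g_W2 = h.T @ err / len(X)                # backpropagation
    g_b2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    g_W1 = X.T @ dh / len(X)
    g_b1 = dh.mean(axis=0)
    for p, g, v in zip(params, [g_W1, g_b1, g_W2, g_b2], vel):
        v *= momentum                        # keep 0.8 of the previous step
        v -= lr * g
        p += v                               # momentum-smoothed update

mse = float(np.mean(((np.tanh(X @ W1 + b1) @ W2 + b2) - y) ** 2))
```

    Momentum smooths successive gradient steps, which is why the paper reports it improves the tendency to converge on irregular (anomalous) targets.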

  18. 7 CFR 1710.203 - Requirement to prepare a load forecast-distribution borrowers.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Requirement to prepare a load forecast-distribution borrowers. 1710.203 Section 1710.203 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE GENERAL AND PRE-LOAN POLICIES AND PROCEDURES COMMON TO ELECTRIC LOANS AND GUARANTEES Load...

  19. The time series approach to short term load forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagan, M.T.; Behr, S.M.

    The application of time series analysis methods to load forecasting is reviewed. It is shown that Box-Jenkins time series models, in particular, are well suited to this application. The logical and organized procedures for model development using the autocorrelation function make these models particularly attractive. One of the drawbacks of these models is their inability to accurately represent the nonlinear relationship between load and temperature. A simple procedure for overcoming this difficulty is introduced, and several Box-Jenkins models are compared with a forecasting procedure currently used by a utility company.
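    The "organized procedure using the autocorrelation function" mentioned here starts with computing the sample ACF of the load series and reading off its seasonal structure. The hourly load below is synthetic; a strong ACF peak at lag 24 is what would tell a Box-Jenkins modeller to include (or difference out) a daily seasonal term.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(24 * 60)
# Synthetic hourly load: a daily cycle plus noise.
load = 100.0 + 30.0 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 5, t.size)

def acf(x, max_lag):
    """Sample autocorrelation function, the core Box-Jenkins identification tool."""
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom if k else 1.0
                     for k in range(max_lag + 1)])

r = acf(load, 48)
# A strong peak at lag 24 flags the daily seasonality to model or difference out.
```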

  20. Peak load demand forecasting using two-level discrete wavelet decomposition and neural network algorithm

    NASA Astrophysics Data System (ADS)

    Bunnoon, Pituk; Chalermyanont, Kusumal; Limsakul, Chusak

    2010-02-01

    This paper proposes discrete wavelet transform and neural network algorithms to obtain the monthly peak load demand in mid-term load forecasting. The Daubechies-2 (db2) mother wavelet is employed to decompose the original signal into high-pass and low-pass filtered components before a feed-forward backpropagation neural network determines the forecasting results. Historical data records for 1997-2007 from the Electricity Generating Authority of Thailand (EGAT) are used as the reference. In this study, historical information on peak load demand (MW), mean temperature (Tmean), consumer price index (CPI), and industrial index (economic: IDI) is used as the feature inputs of the network. The experimental results show a Mean Absolute Percentage Error (MAPE) of approximately 4.32%. These forecasting results can be used for fuel planning and unit commitment of the power system in the future.
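    The two-level db2 decomposition can be sketched directly in numpy by writing out the four-tap Daubechies-2 filters (in practice a library such as PyWavelets would supply them). The monthly peak-load series below is synthetic; the structure matches the paper: each level splits the signal into a smooth approximation and a high-frequency detail, and the sub-signals then feed the neural network.

```python
import numpy as np

# db2 (Daubechies-2) analysis filters, written out explicitly.
s3 = np.sqrt(3.0)
low = np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * np.sqrt(2.0))  # low-pass
high = low[::-1] * np.array([1.0, -1.0, 1.0, -1.0])                    # high-pass (QMF)

def dwt_step(x):
    """One decomposition level: filter, then downsample by 2 (periodic extension)."""
    xp = np.concatenate([x, x[:2]])          # wrap-around for the 4-tap filters
    approx = np.array([np.dot(low, xp[i:i + 4]) for i in range(0, len(x), 2)])
    detail = np.array([np.dot(high, xp[i:i + 4]) for i in range(0, len(x), 2)])
    return approx, detail

rng = np.random.default_rng(8)
months = np.arange(128.0)
# Synthetic monthly peak load with an annual cycle.
peak = 2000.0 + 400.0 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 50, 128)

a1, d1 = dwt_step(peak)   # level 1: smooth component + high-frequency detail
a2, d2 = dwt_step(a1)     # level 2: these sub-signals feed the neural network
```

    Because the db2 filter bank is orthonormal, the decomposition preserves the signal's energy, so no information is lost before the network stage.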

  1. Joint Seasonal ARMA Approach for Modeling of Load Forecast Errors in Planning Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafen, Ryan P.; Samaan, Nader A.; Makarov, Yuri V.

    2014-04-14

    To make informed and robust decisions in the probabilistic power system operation and planning process, it is critical to conduct multiple simulations of the generated combinations of wind and load parameters and their forecast errors to handle the variability and uncertainty of these time series. In order for the simulation results to be trustworthy, the simulated series must preserve the salient statistical characteristics of the real series. In this paper, we analyze day-ahead load forecast error data from multiple balancing authority locations and characterize statistical properties such as mean, standard deviation, autocorrelation, correlation between series, time-of-day bias, and time-of-day autocorrelation. We then construct and validate a seasonal autoregressive moving average (ARMA) model to model these characteristics, and use the model to jointly simulate day-ahead load forecast error series for all BAs.

  2. Toward quantitative forecasts of volcanic ash dispersal: Using satellite retrievals for optimal estimation of source terms

    NASA Astrophysics Data System (ADS)

    Zidikheri, Meelis J.; Lucas, Christopher; Potts, Rodney J.

    2017-08-01

    Airborne volcanic ash is a hazard to aviation. There is an increasing demand for quantitative forecasts of ash properties such as ash mass load to allow airline operators to better manage the risks of flying through airspace likely to be contaminated by ash. In this paper we show how satellite-derived mass load information at times prior to the issuance of the latest forecast can be used to estimate various model parameters that are not easily obtained by other means such as the distribution of mass of the ash column at the volcano. This in turn leads to better forecasts of ash mass load. We demonstrate the efficacy of this approach using several case studies.

  3. 7 CFR 1710.202 - Requirement to prepare a load forecast-power supply borrowers.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Requirement to prepare a load forecast-power supply borrowers. 1710.202 Section 1710.202 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE GENERAL AND PRE-LOAN POLICIES AND PROCEDURES COMMON TO ELECTRIC LOANS AND GUARANTEES Load...

  4. Multivariate Bias Correction Procedures for Improving Water Quality Predictions from the SWAT Model

    NASA Astrophysics Data System (ADS)

    Arumugam, S.; Libera, D.

    2017-12-01

    Water quality observations are usually not available on a continuous basis for longer than 1-2 years at a time over a decadal period, given the labor requirements, which makes calibrating and validating mechanistic models difficult. Further, any physical model's predictions inherently have bias (i.e., under- or over-estimation) and require post-simulation techniques to preserve the long-term mean monthly attributes. This study suggests a multivariate bias-correction technique and compares it to a common technique for improving the performance of the SWAT model in predicting daily streamflow and TN loads across the southeast, based on split-sample validation. The approach is a dimension-reduction technique, canonical correlation analysis (CCA), that regresses the observed multivariate attributes on the SWAT-simulated values. The common approach is a regression-based technique that uses ordinary least squares to adjust model values. The observed cross-correlation between loadings and streamflow is better preserved when using canonical correlation, while individual biases are simultaneously reduced. Additionally, canonical correlation analysis does a better job of preserving the observed joint likelihood of streamflow and loadings. These procedures were applied to 3 watersheds chosen from the Water Quality Network in the Southeast Region, specifically watersheds with sufficiently large drainage areas and numbers of observed data points. The performance of the two approaches is compared for the observed period and over a multi-decadal period using loading estimates from the USGS LOADEST model. Lastly, the CCA technique is applied in a forecasting sense by using 1-month-ahead forecasts of P & T from ECHAM4.5 as forcings in the SWAT model. Skill in using the SWAT model to forecast loadings and streamflow at the monthly and seasonal timescales is also discussed.

  5. Analysis of recurrent neural networks for short-term energy load forecasting

    NASA Astrophysics Data System (ADS)

    Di Persio, Luca; Honchar, Oleksandr

    2017-11-01

    Short-term forecasts have recently gained increasing attention because of the rise of competitive electricity markets. In fact, short-term forecasts of possible future loads turn out to be fundamental for building efficient energy management strategies as well as for avoiding energy wastage. Such challenges are difficult to tackle both from a theoretical and an applied point of view. The latter tasks require sophisticated methods to manage multidimensional time series related to stochastic phenomena which are often highly interconnected. In the present work we first review novel approaches to energy load forecasting based on recurrent neural networks, focusing our attention on long short-term memory architectures (LSTMs). This type of artificial neural network has been widely applied to problems dealing with sequential data, as happens, e.g., in socio-economic settings, for text recognition purposes, and with video signals, always showing its effectiveness in modeling complex temporal data. Moreover, we consider different novel variations of basic LSTMs, such as the sequence-to-sequence approach and bidirectional LSTMs, aiming at providing effective models for energy load data. Last but not least, we test all the described algorithms on real energy load data, showing not only that deep recurrent networks can be successfully applied to energy load forecasting, but also that this approach can be extended to other problems based on time series prediction.
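
As an illustration of the supervised framing behind the LSTM and sequence-to-sequence models reviewed above (not the authors' code), a load series can be cut into (input window, target) pairs; the window length and horizon below are arbitrary choices.

```python
# Turn a univariate load series into (past window, future target) pairs,
# the training samples a recurrent forecaster consumes. A horizon > 1
# gives the sequence-to-sequence setting.

def make_windows(series, window, horizon=1):
    """Slide a fixed-length window over the series; the next `horizon`
    values after each window form the forecast target."""
    pairs = []
    for i in range(len(series) - window - horizon + 1):
        pairs.append((series[i:i + window],
                      series[i + window:i + window + horizon]))
    return pairs

loads = [5.1, 5.3, 5.0, 4.8, 5.2, 5.6, 5.9, 6.1]   # invented hourly loads
samples = make_windows(loads, window=3, horizon=2)
# each element: (3 past values, 2 future values)
```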

  6. Wind Energy Management System EMS Integration Project: Incorporating Wind Generation and Load Forecast Uncertainties into Power Grid Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Huang, Zhenyu; Etingov, Pavel V.

    2010-01-01

    The power system balancing process, which includes the scheduling, real time dispatch (load following) and regulation processes, is traditionally based on deterministic models. Since the conventional generation needs time to be committed and dispatched to a desired megawatt level, the scheduling and load following processes use load, wind, and solar power production forecasts to achieve future balance between the conventional generation and energy storage on the one side, and system load, intermittent resources (such as wind and solar generation), and scheduled interchange on the other side. Although in real life the forecasting procedures imply some uncertainty around the load and wind/solar forecasts (caused by forecast errors), only their mean values are actually used in the generation dispatch and commitment procedures. Since the actual load and intermittent generation can deviate from their forecasts, it becomes increasingly unclear (especially with the increasing penetration of renewable resources) whether the system would actually be able to meet the conventional generation requirements within the look-ahead horizon, what additional balancing efforts would be needed as we get closer to real time, and what additional costs would be incurred by those needs. To improve the system control performance characteristics, maintain system reliability, and minimize expenses related to the system balancing functions, it becomes necessary to incorporate the predicted uncertainty ranges into the scheduling, load following, and, to some extent, the regulation processes. It is also important to address the uncertainty problem comprehensively by taking all sources of uncertainty (load, intermittent generation, generators' forced outages, etc.) into consideration. All aspects of uncertainty, such as the imbalance size (which is the same as the capacity needed to mitigate the imbalance) and the generation ramping requirement, must be taken into account.
The latter unique features make this work a significant step forward toward the objective of incorporating wind, solar, load, and other uncertainties into power system operations. Currently, uncertainties associated with wind and load forecasts, as well as uncertainties associated with random generator outages and unexpected disconnection of supply lines, are not taken into account in power grid operation. Thus, operators have little means to weigh the likelihood and magnitude of upcoming events of power imbalance. In this project, funded by the U.S. Department of Energy (DOE), a framework has been developed for incorporating uncertainties associated with wind and load forecast errors, unpredicted ramps, and forced generation disconnections into the energy management system (EMS) as well as generation dispatch and commitment applications. A new approach to evaluate the uncertainty ranges for the required generation performance envelope, including balancing capacity, ramping capability, and ramp duration, has been proposed. The approach includes three stages: forecast and actual data acquisition, statistical analysis of retrospective information, and prediction of future grid balancing requirements for specified time horizons and confidence levels. Assessment of the capacity and ramping requirements is performed using a specially developed probabilistic algorithm based on a histogram analysis, incorporating all sources of uncertainties of both continuous (wind and load forecast errors) and discrete (forced generator outages and start-up failures) nature. A new method called the “flying brick” technique has been developed to evaluate the look-ahead required generation performance envelope for the worst-case scenario within a user-specified confidence level. A self-validation algorithm has been developed to validate the accuracy of the confidence intervals.
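
A hedged sketch of the histogram/percentile idea described above: sample the combined effect of continuous uncertainties (load and wind forecast errors) and a discrete one (a forced generator outage), then read the balancing-capacity requirement off the empirical distribution at a chosen confidence level. All distributions and parameters are invented for illustration and are not the project's values.

```python
# Monte Carlo combination of continuous forecast errors and a discrete
# forced-outage event; the capacity requirement at a confidence level is
# an empirical percentile of the sampled imbalance.
import random

random.seed(0)   # reproducible illustration

def capacity_requirement(n_samples, confidence):
    imbalances = []
    for _ in range(n_samples):
        load_err = random.gauss(0.0, 50.0)   # MW, continuous (invented)
        wind_err = random.gauss(0.0, 80.0)   # MW, continuous (invented)
        outage = 300.0 if random.random() < 0.02 else 0.0  # MW, discrete
        imbalances.append(load_err + wind_err + outage)
    imbalances.sort()
    idx = min(int(confidence * n_samples), n_samples - 1)
    return imbalances[idx]

# Upward balancing capacity covering 95% of sampled imbalances.
req = capacity_requirement(20000, 0.95)
```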

  7. Wind Energy Management System Integration Project Incorporating Wind Generation and Load Forecast Uncertainties into Power Grid Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Huang, Zhenyu; Etingov, Pavel V.

    2010-09-01

    The power system balancing process, which includes the scheduling, real time dispatch (load following) and regulation processes, is traditionally based on deterministic models. Since the conventional generation needs time to be committed and dispatched to a desired megawatt level, the scheduling and load following processes use load and wind power production forecasts to achieve future balance between the conventional generation and energy storage on the one side, and system load, intermittent resources (such as wind and solar generation) and scheduled interchange on the other side. Although in real life the forecasting procedures imply some uncertainty around the load and wind forecasts (caused by forecast errors), only their mean values are actually used in the generation dispatch and commitment procedures. Since the actual load and intermittent generation can deviate from their forecasts, it becomes increasingly unclear (especially with the increasing penetration of renewable resources) whether the system would actually be able to meet the conventional generation requirements within the look-ahead horizon, what additional balancing efforts would be needed as we get closer to real time, and what additional costs would be incurred by those needs. In order to improve the system control performance characteristics, maintain system reliability, and minimize expenses related to the system balancing functions, it becomes necessary to incorporate the predicted uncertainty ranges into the scheduling, load following, and, to some extent, the regulation processes. It is also important to address the uncertainty problem comprehensively, by taking all sources of uncertainty (load, intermittent generation, generators' forced outages, etc.) into consideration. All aspects of uncertainty, such as the imbalance size (which is the same as the capacity needed to mitigate the imbalance) and the generation ramping requirement, must be taken into account.
The latter unique features make this work a significant step forward toward the objective of incorporating wind, solar, load, and other uncertainties into power system operations. In this report, a new methodology to predict the uncertainty ranges for the required balancing capacity, ramping capability and ramp duration is presented. Uncertainties created by system load forecast errors, wind and solar forecast errors, and generation forced outages are taken into account. The uncertainty ranges are evaluated for different confidence levels of having the actual generation requirements within the corresponding limits. The methodology helps to identify the system balancing reserve requirement based on desired system performance levels, identify system “breaking points”, where the generation system becomes unable to follow the generation requirement curve with the user-specified probability level, and determine the time remaining to these potential events. The approach includes three stages: statistical and actual data acquisition, statistical analysis of retrospective information, and prediction of future grid balancing requirements for specified time horizons and confidence intervals. Assessment of the capacity and ramping requirements is performed using a specially developed probabilistic algorithm based on a histogram analysis incorporating all sources of uncertainty and parameters of a continuous (wind forecast and load forecast errors) and discrete (forced generator outages and failures to start up) nature. Preliminary simulations using California Independent System Operator (California ISO) real life data have shown the effectiveness of the proposed approach. A tool developed based on the new methodology described in this report will be integrated with the California ISO systems. Contractual work is currently in place to integrate the tool with the AREVA EMS system.

  8. Confidence intervals in Flow Forecasting by using artificial neural networks

    NASA Astrophysics Data System (ADS)

    Panagoulia, Dionysia; Tsekouras, George

    2014-05-01

    One of the major inadequacies in the implementation of Artificial Neural Networks (ANNs) for flow forecasting is the development of confidence intervals, because the relevant estimation cannot be carried out directly, in contrast to classical forecasting methods. The variation in the ANN output is a measure of uncertainty in the model predictions based on the training data set. Different methods for uncertainty analysis, such as bootstrap, Bayesian, and Monte Carlo methods, have already been proposed for hydrologic and geophysical models, while methods for confidence intervals, such as error output, re-sampling, and multi-linear regression adapted to ANNs, have been used for power load forecasting [1-2]. The aim of this paper is to present the re-sampling method for ANN prediction models and to develop it for next-day flow forecasting. The re-sampling method is based on the ascending sorting of the errors between real and predicted values for all input vectors. The cumulative sample distribution function of the prediction errors is calculated, and the confidence intervals are estimated by keeping the intermediate values, rejecting the extreme values according to the desired confidence levels, and holding the intervals symmetrical in probability. To apply the confidence intervals, input vectors from the Mesochora catchment in western-central Greece are used. The ANN's training algorithm is the stochastic back-propagation process with decreasing functions of learning rate and momentum term, for which an optimization process is conducted over the crucial parameter values, such as the number of neurons, the kind of activation functions, and the initial values and time parameters of the learning rate and momentum term.
Input variables are historical data of previous days, such as flows, nonlinearly weather-related temperatures and nonlinearly weather-related rainfalls, based on correlation analysis between the flow under prediction and each implicit input variable of different ANN structures [3]. The performance of each ANN structure is evaluated by a voting analysis based on eleven criteria: the root mean square error (RMSE), the correlation index (R), the mean absolute percentage error (MAPE), the mean percentage error (MPE), the mean error (ME), the percentage volume in errors (VE), the percentage error in peak (MF), the normalized mean bias error (NMBE), the normalized root mean square error (NRMSE), the Nash-Sutcliffe model efficiency coefficient (E) and the modified Nash-Sutcliffe model efficiency coefficient (E1). The next-day flow for the test set is calculated using the best ANN structure's model. Consequently, the confidence intervals of various confidence levels for the training, evaluation and test sets are compared in order to explore the generalisation dynamics of confidence intervals from the training and evaluation sets. [1] H.S. Hippert, C.E. Pedreira, R.C. Souza, "Neural networks for short-term load forecasting: A review and evaluation," IEEE Trans. on Power Systems, vol. 16, no. 1, 2001, pp. 44-55. [2] G. J. Tsekouras, N.E. Mastorakis, F.D. Kanellos, V.T. Kontargyri, C.D. Tsirekis, I.S. Karanasiou, Ch.N. Elias, A.D. Salis, P.A. Kontaxis, A.A. Gialketsi: "Short term load forecasting in Greek interconnected power system using ANN: Confidence Interval using a novel re-sampling technique with corrective Factor", WSEAS International Conference on Circuits, Systems, Electronics, Control & Signal Processing, (CSECS '10), Vouliagmeni, Athens, Greece, December 29-31, 2010. [3] D. Panagoulia, I. Trichakis, G. J.
Tsekouras: "Flow Forecasting via Artificial Neural Networks - A Study for Input Variables conditioned on atmospheric circulation", European Geosciences Union, General Assembly 2012 (NH1.1 / AS1.16 - Extreme meteorological and hydrological events induced by severe weather and climate change), Vienna, Austria, 22-27 April 2012.
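
A sketch of the re-sampling confidence-interval construction described above: sort the prediction errors in ascending order, take symmetric empirical quantiles, and attach them to a point forecast. The errors and the point forecast below are synthetic, not the Mesochora results.

```python
# Empirical, symmetric-in-probability interval from sorted prediction
# errors: keep the intermediate values and reject the extreme tails
# according to the desired confidence level.

def symmetric_interval(errors, confidence):
    """Return (lower, upper) error bounds from the empirical error CDF."""
    e = sorted(errors)
    n = len(e)
    tail = (1.0 - confidence) / 2.0       # probability cut in each tail
    lo = e[round(tail * n)]
    hi = e[round((1.0 - tail) * n) - 1]
    return lo, hi

# Synthetic residuals (real - predicted) from a fitted model.
errors = [-4.0, -2.5, -1.0, -0.5, 0.0, 0.3, 0.9, 1.8, 2.6, 5.0]
lo, hi = symmetric_interval(errors, confidence=0.8)

point_forecast = 120.0                    # hypothetical next-day flow
interval = (point_forecast + lo, point_forecast + hi)
```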

  9. Load forecasting via suboptimal seasonal autoregressive models and iteratively reweighted least squares estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mbamalu, G.A.N.; El-Hawary, M.E.

    The authors propose suboptimal least squares or iteratively reweighted least squares (IRWLS) procedures for estimating the parameters of a seasonal multiplicative AR model encountered during power system load forecasting. The proposed method involves using an interactive computer environment to estimate the parameters of a seasonal multiplicative AR process. The method comprises five major computational steps. The first determines the order of the seasonal multiplicative AR process, and the second uses least squares or IRWLS to estimate the optimal nonseasonal AR model parameters. In the third step one obtains the intermediate series by back forecasting, which is followed by using least squares or IRWLS to estimate the optimal seasonal AR parameters. The final step uses the estimated parameters to forecast future load. The method is applied to predict the Nova Scotia Power Corporation's hourly load at a 168-hour lead time. The results obtained are documented and compared with results based on the Box and Jenkins method.
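
An illustrative sketch (not the authors' implementation) of IRWLS on the simplest case, a single AR(1) coefficient: each pass re-fits weighted least squares with weights that shrink the influence of large residuals. The inverse-residual weighting here is one common choice and is an assumption, not necessarily the paper's.

```python
# Iteratively reweighted least squares for an AR(1) coefficient phi in
# y_t = phi * y_{t-1} + e_t. Weights ~ 1/|residual| damp outliers.

def irwls_ar1(series, iters=10, eps=1e-6):
    x = series[:-1]          # lagged values
    y = series[1:]           # current values
    w = [1.0] * len(x)       # first pass is ordinary least squares
    phi = 0.0
    for _ in range(iters):
        num = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
        den = sum(wi * xi * xi for wi, xi in zip(w, x))
        phi = num / den
        # reweight: small residual -> large weight (eps avoids 1/0)
        w = [1.0 / max(abs(yi - phi * xi), eps) for xi, yi in zip(x, y)]
    return phi

series = [1.0, 0.9, 0.81, 0.729, 0.6561, 0.59049]  # exact AR(1), phi = 0.9
phi_hat = irwls_ar1(series)
```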

  10. Operational Planning of Channel Airlift Missions Using Forecasted Demand

    DTIC Science & Technology

    2013-03-01

    tailored to the specific problem (Metaheuristics, 2005). As seen in the section Cargo Loading Algorithm, heuristic methods are often iterative...that are equivalent to the forecasted cargo amount. The simulated pallets are then used in a heuristic cargo loading algorithm. The loading...algorithm places cargo onto available aircraft (based on real schedules) given the date and the destination and outputs statistics based on the aircraft ton

  11. Forecasting the brittle failure of heterogeneous, porous geomaterials

    NASA Astrophysics Data System (ADS)

    Vasseur, Jérémie; Wadsworth, Fabian; Heap, Michael; Main, Ian; Lavallée, Yan; Dingwell, Donald

    2017-04-01

    Heterogeneity develops in magmas during ascent and is dominated by the development of crystal and importantly, bubble populations or pore-network clusters which grow, interact, localize, coalesce, outgas and resorb. Pore-scale heterogeneity is also ubiquitous in sedimentary basin fill during diagenesis. As a first step, we construct numerical simulations in 3D in which randomly generated heterogeneous and polydisperse spheres are placed in volumes and which are permitted to overlap with one another, designed to represent the random growth and interaction of bubbles in a liquid volume. We use these simulated geometries to show that statistical predictions of the inter-bubble lengthscales and evolving bubble surface area or cluster densities can be made based on fundamental percolation theory. As a second step, we take a range of well constrained random heterogeneous rock samples including sandstones, andesites, synthetic partially sintered glass bead samples, and intact glass samples and subject them to a variety of stress loading conditions at a range of temperatures until failure. We record in real time the evolution of the number of acoustic events that precede failure and show that in all scenarios, the acoustic event rate accelerates toward failure, consistent with previous findings. Applying tools designed to forecast the failure time based on these precursory signals, we constrain the absolute error on the forecast time. We find that for all sample types, the error associated with an accurate forecast of failure scales non-linearly with the lengthscale between the pore clusters in the material. Moreover, using a simple micromechanical model for the deformation of porous elastic bodies, we show that the ratio between the equilibrium sub-critical crack length emanating from the pore clusters relative to the inter-pore lengthscale, provides a scaling for the error on forecast accuracy. 
Thus for the first time we provide a potential quantitative correction for forecasting the failure of porous brittle solids that build the Earth's crust.
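
A hedged sketch of the classic inverse-rate forecasting tool consistent with the precursory-signal approach described above: if the acoustic-emission rate accelerates hyperbolically toward failure, its reciprocal falls linearly, and the zero crossing of a straight-line fit estimates the failure time. The data below are synthetic, not the paper's measurements.

```python
# Failure-time forecast from accelerating precursory event rates:
# fit 1/rate = a + b*t by least squares and solve for 1/rate -> 0.

def forecast_failure_time(times, rates):
    inv = [1.0 / r for r in rates]        # inverse event rate
    n = len(times)
    mt = sum(times) / n
    mi = sum(inv) / n
    b = sum((t - mt) * (v - mi) for t, v in zip(times, inv)) / \
        sum((t - mt) ** 2 for t in times)
    a = mi - b * mt
    return -a / b                          # t where the fit hits zero

# Synthetic rates accelerating toward failure at t = 10.
times = [2.0, 4.0, 6.0, 8.0]
rates = [1.0 / (10.0 - t) for t in times]  # rate = 1 / (t_fail - t)
t_fail = forecast_failure_time(times, rates)
```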

  12. Selection of Hidden Layer Neurons and Best Training Method for FFNN in Application of Long Term Load Forecasting

    NASA Astrophysics Data System (ADS)

    Singh, Navneet K.; Singh, Asheesh K.; Tripathy, Manoj

    2012-05-01

    For power industries, electricity load forecasting plays an important role in real-time control, security, optimal unit commitment, economic scheduling, maintenance, energy management, and plant structure planning. A new technique for long term load forecasting (LTLF) using an optimized feed forward artificial neural network (FFNN) architecture is presented in this paper, which selects the optimal number of neurons in the hidden layer as well as the best training method for the case study. The prediction performance of the proposed technique is evaluated using the mean absolute percentage error (MAPE) between Thailand's private electricity consumption and the forecasted data. The results obtained are compared with those of classical auto-regressive (AR) and moving average (MA) methods. It is, in general, observed that the proposed method is more accurate in its predictions.
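
The selection criterion named above, mean absolute percentage error (MAPE), as a small self-contained function; the actual and forecast values below are invented for illustration.

```python
# MAPE: average of |actual - forecast| / |actual|, expressed in percent.

def mape(actual, forecast):
    return 100.0 / len(actual) * sum(
        abs((a - f) / a) for a, f in zip(actual, forecast))

actual   = [100.0, 110.0, 120.0]
forecast = [ 90.0, 110.0, 132.0]
err = mape(actual, forecast)   # (10% + 0% + 10%) / 3
```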

  13. Real-time anomaly detection for very short-term load forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Jian; Hong, Tao; Yue, Meng

    Although recent load information is critical to very short-term load forecasting (VSTLF), power companies often have difficulties in collecting the most recent load values accurately and in a timely manner for VSTLF applications. This paper tackles the problem of real-time anomaly detection in the most recent load information used by VSTLF. It proposes a model-based anomaly detection method that consists of two components, a dynamic regression model and an adaptive anomaly threshold. The case study is developed using data from ISO New England. This paper demonstrates that the proposed method significantly outperforms three other anomaly detection methods, including two methods commonly used in the field and one state-of-the-art method used by a winning team of the Global Energy Forecasting Competition 2014. Lastly, a general anomaly detection framework is proposed for future research.
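
A hedged sketch of the two-component idea described above. A simple rolling mean stands in for the paper's dynamic regression model, and the threshold adapts as a multiple of the rolling residual spread; window size, multiplier, and data are illustrative, not ISO New England values.

```python
# Model-based anomaly detection: predict the next load from recent
# history, flag it when the residual exceeds an adaptive threshold.
import statistics

def detect_anomalies(series, window=4, k=3.0):
    flags = []
    for t in range(window, len(series)):
        history = series[t - window:t]
        pred = sum(history) / window                  # stand-in model
        thresh = k * (statistics.pstdev(history) or 1.0)  # adaptive
        flags.append(abs(series[t] - pred) > thresh)
    return flags

loads = [100, 101, 99, 100, 100, 250, 101, 100, 99]  # 250 is a spike
flags = detect_anomalies(loads)
```

Note that after the spike enters the history window, the rolling spread widens and the threshold loosens; a real dynamic regression model would recover faster.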

  14. Real-time anomaly detection for very short-term load forecasting

    DOE PAGES

    Luo, Jian; Hong, Tao; Yue, Meng

    2018-01-06

    Although recent load information is critical to very short-term load forecasting (VSTLF), power companies often have difficulties in collecting the most recent load values accurately and in a timely manner for VSTLF applications. This paper tackles the problem of real-time anomaly detection in the most recent load information used by VSTLF. It proposes a model-based anomaly detection method that consists of two components, a dynamic regression model and an adaptive anomaly threshold. The case study is developed using data from ISO New England. This paper demonstrates that the proposed method significantly outperforms three other anomaly detection methods, including two methods commonly used in the field and one state-of-the-art method used by a winning team of the Global Energy Forecasting Competition 2014. Lastly, a general anomaly detection framework is proposed for future research.

  15. Load Forecasting Based Distribution System Network Reconfiguration -- A Distributed Data-Driven Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Huaiguang; Zhang, Yingchen; Muljadi, Eduard

    In this paper, a network reconfiguration approach based on short-term load forecasting is proposed and solved in a parallel manner. Specifically, a support vector regression (SVR) based short-term load forecasting approach is designed to provide an accurate load prediction and benefit the network reconfiguration. Because of the nonconvexity of the three-phase balanced optimal power flow, a second-order cone program (SOCP) based approach is used to relax the optimal power flow problem. Then, the alternating direction method of multipliers (ADMM) is used to compute the optimal power flow in a distributed manner. Considering the limited number of switches and the increasing computation capability, the proposed network reconfiguration is solved in a parallel way. The numerical results demonstrate the feasibility and effectiveness of the proposed approach.

  16. The prediction of the impact of climatic factors on short-term electric power load based on the big data of smart city

    NASA Astrophysics Data System (ADS)

    Qiu, Yunfei; Li, Xizhong; Zheng, Wei; Hu, Qinghe; Wei, Zhanmeng; Yue, Yaqin

    2017-08-01

    Climate changes have a great impact on residents’ electricity consumption, so studying the impact of climatic factors on electric power load is of significance. In this paper, the effects of smart-city temperature, rainfall and wind data on short-term power load are studied in order to predict power load. The authors studied the relation between power load and daily temperature, rainfall and wind over the 31 days of January of one year. In the research, the authors used the Matlab neural network toolbox to establish a combinational forecasting model. The authors trained the original input data continuously to extract the internal rules inside the data and used these rules to predict the daily power load in the following January. The prediction method relies on the accuracy of weather forecasting; if the weather forecast differs from the actual weather, the climatic factors need to be corrected to ensure accurate prediction.

  17. A New Approach to Detection of Systematic Errors in Secondary Substation Monitoring Equipment Based on Short Term Load Forecasting

    PubMed Central

    Moriano, Javier; Rodríguez, Francisco Javier; Martín, Pedro; Jiménez, Jose Antonio; Vuksanovic, Branislav

    2016-01-01

    In recent years, Secondary Substations (SSs) are being provided with equipment that allows their full management. This is particularly useful not only for monitoring and planning purposes but also for detecting erroneous measurements, which could negatively affect the performance of the SS. On the other hand, load forecasts are extremely important since they help electricity companies to make crucial decisions regarding purchasing and generating electric power, load switching, and infrastructure development. In this regard, Short Term Load Forecasting (STLF) allows the electric power load to be predicted over an interval ranging from one hour to one week. However, important issues concerning error detection by employing STLF have not been specifically addressed until now. This paper proposes a novel STLF-based approach to the detection of gain and offset errors introduced by the measurement equipment. The implemented system has been tested against real power load data provided by electricity suppliers. Different gain and offset error levels are successfully detected. PMID:26771613

  18. A Beacon Transmission Power Control Algorithm Based on Wireless Channel Load Forecasting in VANETs.

    PubMed

    Mo, Yuanfu; Yu, Dexin; Song, Jun; Zheng, Kun; Guo, Yajuan

    2015-01-01

    In a vehicular ad hoc network (VANET), the periodic exchange of single-hop status information broadcasts (beacon frames) produces channel loading, which causes channel congestion and induces information conflict problems. To guarantee fairness in beacon transmissions from each node and maximum network connectivity, adjustment of the beacon transmission power is an effective method for reducing and preventing channel congestion. In this study, the primary factors that influence wireless channel loading are selected to construct the KF-BCLF, a channel load forecasting algorithm that is based on a recursive Kalman filter and employs a multiple regression equation. By pre-adjusting the transmission power based on the forecasted channel load, the channel load is kept within a predefined range and channel congestion is thereby prevented. Based on this method, the CLF-BTPC, a transmission power control algorithm, is proposed. To verify the KF-BCLF algorithm, a traffic survey method that involved the collection of floating car data along a major traffic road in Changchun City was employed. By comparing the forecasts with the measured channel loads, the proposed KF-BCLF algorithm is shown to be effective. In addition, the CLF-BTPC algorithm is verified by simulating a section of eight-lane highway and a signal-controlled urban intersection. The results of the two verification processes indicate that this distributed CLF-BTPC algorithm can effectively control channel load, prevent channel congestion, and enhance the stability and robustness of wireless beacon transmission in a vehicular network.
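
A sketch of the recursive Kalman-filter forecasting step that the KF-BCLF algorithm builds on, reduced to one dimension with a random-walk state for channel load; the noise variances and the load values are illustrative guesses, not the study's parameters.

```python
# One-dimensional Kalman filter producing one-step-ahead forecasts:
# predict (state carries over, variance grows by q), then update with
# the new measurement using the Kalman gain.

def kalman_forecast(measurements, q=0.01, r=0.5):
    """Return one-step-ahead forecasts for measurements[1:]."""
    x, p = measurements[0], 1.0          # state estimate and variance
    forecasts = []
    for z in measurements[1:]:
        p += q                           # predict: variance grows
        forecasts.append(x)              # forecast before seeing z
        k = p / (p + r)                  # Kalman gain
        x += k * (z - x)                 # update with measurement
        p *= (1.0 - k)
    return forecasts

channel_load = [0.30, 0.32, 0.35, 0.50, 0.52]   # invented load fractions
preds = kalman_forecast(channel_load)
```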

  19. A Beacon Transmission Power Control Algorithm Based on Wireless Channel Load Forecasting in VANETs

    PubMed Central

    Mo, Yuanfu; Yu, Dexin; Song, Jun; Zheng, Kun; Guo, Yajuan

    2015-01-01

    In a vehicular ad hoc network (VANET), the periodic exchange of single-hop status information broadcasts (beacon frames) produces channel loading, which causes channel congestion and induces information conflict problems. To guarantee fairness in beacon transmissions from each node and maximum network connectivity, adjustment of the beacon transmission power is an effective method for reducing and preventing channel congestion. In this study, the primary factors that influence wireless channel loading are selected to construct the KF-BCLF, a channel load forecasting algorithm that is based on a recursive Kalman filter and employs a multiple regression equation. By pre-adjusting the transmission power based on the forecasted channel load, the channel load is kept within a predefined range and channel congestion is thereby prevented. Based on this method, the CLF-BTPC, a transmission power control algorithm, is proposed. To verify the KF-BCLF algorithm, a traffic survey method that involved the collection of floating car data along a major traffic road in Changchun City was employed. By comparing the forecasts with the measured channel loads, the proposed KF-BCLF algorithm is shown to be effective. In addition, the CLF-BTPC algorithm is verified by simulating a section of eight-lane highway and a signal-controlled urban intersection. The results of the two verification processes indicate that this distributed CLF-BTPC algorithm can effectively control channel load, prevent channel congestion, and enhance the stability and robustness of wireless beacon transmission in a vehicular network. PMID:26571042

  20. 2014 Gulf of Mexico Hypoxia Forecast

    USGS Publications Warehouse

    Scavia, Donald; Evans, Mary Anne; Obenour, Dan

    2014-01-01

    The Gulf of Mexico annual summer hypoxia forecasts are based on average May total nitrogen loads from the Mississippi River basin for that year. The load estimate, recently released by USGS, is 4,761 metric tons per day. Based on that estimate, we predict the area of this summer’s hypoxic zone to be 14,000 square kilometers (95% credible interval, 8,000 to 20,000) – an “average year”. Our forecast hypoxic volume is 50 cubic kilometers (95% credible interval, 20 to 77).

  1. Intra-Hour Dispatch and Automatic Generator Control Demonstration with Solar Forecasting - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coimbra, Carlos F. M.

    2016-02-25

    In this project we address multiple resource integration challenges associated with increasing levels of solar penetration that arise from the variability and uncertainty in solar irradiance. We will model the SMUD service region as its own balancing region, and develop an integrated, real-time operational tool that takes solar-load forecast uncertainties into consideration and commits optimal energy resources and reserves for intra-hour and intra-day decisions. The primary objectives of this effort are to reduce power system operation cost by committing an appropriate amount of energy resources and reserves, as well as to provide operators a prediction of the generation fleet’s behavior in real time for realistic PV penetration scenarios. The proposed methodology includes the following steps: clustering analysis of the expected solar variability per region for the SMUD system, day-ahead (DA) and real-time (RT) load forecasts for the entire service areas, 1 year of intra-hour CPR forecasts for cluster centers, 1 year of smart re-forecasting CPR forecasts in real time for determination of irreducible errors, and uncertainty quantification of the integrated solar-load for both distributed and central-station (selected locations within the service region) PV generation.

  2. Short-term load and wind power forecasting using neural network-based prediction intervals.

    PubMed

    Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas

    2014-02-01

    Electrical power systems are evolving from today's centralized bulk systems to more decentralized systems. Penetration of renewable energy sources, such as wind and solar power, significantly increases the level of uncertainty in power systems. Accurate load forecasting becomes more complex, yet more important, for the management of power systems. Traditional methods for generating point forecasts of load demands cannot properly handle uncertainties in system operations. To quantify potential uncertainties associated with forecasts, this paper implements a neural network (NN)-based method for the construction of prediction intervals (PIs). A newly introduced method, called lower upper bound estimation (LUBE), is applied and extended to develop PIs using NN models. A new problem formulation is proposed, which translates the primary multiobjective problem into a constrained single-objective problem. Compared with the cost function, this new formulation is closer to the primary problem and has fewer parameters. Particle swarm optimization (PSO) integrated with the mutation operator is used to solve the problem. Electrical demands from Singapore and New South Wales (Australia), as well as wind power generation from Capital Wind Farm, are used to validate the PSO-based LUBE method. Comparative results show that the proposed method can construct higher quality PIs for load and wind power generation forecasts in a short time.
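
Prediction intervals such as those the LUBE method constructs are commonly judged by coverage probability (PICP) and normalized average width (PINAW); here is a generic sketch of the two metrics with made-up intervals, not the paper's data.

```python
# Interval-quality metrics: PICP should be high (actuals fall inside
# their intervals) while PINAW stays low (intervals stay narrow).

def picp(actuals, lowers, uppers):
    """Fraction of actual values falling inside their interval."""
    hits = sum(l <= a <= u for a, l, u in zip(actuals, lowers, uppers))
    return hits / len(actuals)

def pinaw(actuals, lowers, uppers):
    """Average interval width normalized by the target range."""
    rng = max(actuals) - min(actuals)
    return sum(u - l for l, u in zip(lowers, uppers)) / (len(actuals) * rng)

actuals = [10.0, 12.0, 11.0, 15.0]   # invented demand values
lowers  = [ 9.0, 11.5, 10.0, 15.5]
uppers  = [11.0, 13.0, 12.0, 16.5]
coverage = picp(actuals, lowers, uppers)    # 3 of 4 inside
width    = pinaw(actuals, lowers, uppers)
```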

  3. Quantitative impact of aerosols on numerical weather prediction. Part I: Direct radiative forcing

    NASA Astrophysics Data System (ADS)

    Marquis, J. W.; Zhang, J.; Reid, J. S.; Benedetti, A.; Christensen, M.

    2017-12-01

    While the effects of aerosols on climate have been extensively studied over the past two decades, the impacts of aerosols on operational weather forecasts have not been carefully quantified. Despite this lack of quantification, aerosol plumes can impact weather forecasts directly, by reducing surface-reaching solar radiation, and indirectly, by affecting the remotely sensed data used in weather forecasts. In Part I of this study, the direct impact of smoke aerosol plumes on surface temperature forecasts is quantified using a smoke aerosol event affecting the United States Upper Midwest in 2015. NCEP, ECMWF, and UKMO model forecast surface temperature uncertainties are studied with respect to aerosol loading. Smoke aerosol direct cooling efficiencies are derived, and the potential of including aerosol particles in operational forecasts is discussed in light of aerosol trends, especially over regions with heavy aerosol loading.

  4. Electricity Load Forecasting Using Support Vector Regression with Memetic Algorithms

    PubMed Central

    Hu, Zhongyi; Xiong, Tao

    2013-01-01

    Electricity load forecasting is an important issue that is widely explored and examined in the power systems operation literature as well as in the literature on commercial transactions in electricity markets. Among existing forecasting models, support vector regression (SVR) has gained much attention. Because the performance of SVR depends strongly on its parameters, this study proposes a firefly algorithm (FA) based memetic algorithm (FA-MA) to appropriately determine the parameters of the SVR forecasting model. In the proposed FA-MA algorithm, the FA is applied to explore the solution space, and pattern search is used to conduct individual learning and thus enhance the exploitation of the FA. Experimental results confirm that the proposed FA-MA based SVR model not only yields more accurate forecasting results than four other SVR models based on evolutionary algorithms and three well-known forecasting models, but also outperforms the hybrid algorithms in the existing literature. PMID:24459425
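The FA-MA pairs a global firefly search with pattern-search "individual learning". Below is a hedged sketch of just that local refinement step, a Hooke-Jeeves style coordinate search, run on a stand-in quadratic error surface over two SVR hyperparameters (the objective, starting point, and step sizes are illustrative assumptions, not the paper's actual cross-validation error):

```python
import numpy as np

def validation_error(params):
    # Stand-in for SVR cross-validation error over (log C, log gamma):
    # a smooth bowl with its minimum at (1.0, -2.0) -- purely illustrative.
    c, g = params
    return (c - 1.0) ** 2 + 2.0 * (g + 2.0) ** 2

def pattern_search(x0, step=0.5, shrink=0.5, tol=1e-4, max_iter=500):
    """Hooke-Jeeves style local search: probe +/- step along each axis,
    keep any improving move, and shrink the step when nothing helps."""
    x = np.asarray(x0, dtype=float)
    fx = validation_error(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                ft = validation_error(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= shrink
            if step < tol:
                break
    return x, fx

best, err = pattern_search([3.0, 1.0])  # e.g. a candidate handed over by the global FA stage
```

In the memetic scheme, each promising firefly would be refined this way before re-entering the population, which is what gives FA-MA its stronger exploitation.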

  5. Electricity load forecasting using support vector regression with memetic algorithms.

    PubMed

    Hu, Zhongyi; Bao, Yukun; Xiong, Tao

    2013-01-01

    Electricity load forecasting is an important issue that is widely explored and examined in the power systems operation literature as well as in the literature on commercial transactions in electricity markets. Among existing forecasting models, support vector regression (SVR) has gained much attention. Because the performance of SVR depends strongly on its parameters, this study proposes a firefly algorithm (FA) based memetic algorithm (FA-MA) to appropriately determine the parameters of the SVR forecasting model. In the proposed FA-MA algorithm, the FA is applied to explore the solution space, and pattern search is used to conduct individual learning and thus enhance the exploitation of the FA. Experimental results confirm that the proposed FA-MA based SVR model not only yields more accurate forecasting results than four other SVR models based on evolutionary algorithms and three well-known forecasting models, but also outperforms the hybrid algorithms in the existing literature.

  6. Modeling spot markets for electricity and pricing electricity derivatives

    NASA Astrophysics Data System (ADS)

    Ning, Yumei

    Spot prices for electricity have been very volatile, with dramatic price spikes occurring in restructured markets. The task of forecasting electricity prices and managing price risk presents a new challenge for market players. The objectives of this dissertation are: (1) to develop a stochastic model of price behavior and predict price spikes; (2) to examine the effect of weather forecasts on forecasted prices; and (3) to price electricity options and value generation capacity. The volatile behavior of prices can be represented by a stochastic regime-switching model. In the model, the means of the high-price and low-price regimes and the probabilities of switching from one regime to the other are specified as functions of daily peak load. The probability of switching to the high-price regime is positively related to load, but is still not high enough at the highest loads to predict price spikes accurately. An application of this model shows how the structure of the Pennsylvania-New Jersey-Maryland market changed when market-based offers were allowed, resulting in higher price spikes. An ARIMA model including temperature, seasonal, and weekly effects is estimated to forecast daily peak load. Forecasts of load under different assumptions about weather patterns are used to predict changes in price behavior given the regime-switching model of prices. Results show that moving the temperature forecast from a normal summer to an extremely warm summer causes relatively small increases in temperature (+1.5%) and load (+3.0%); in contrast, the increases in prices are large (+20%). The conclusion is that the seasonal outlook forecasts provided by NOAA are potentially valuable for predicting prices in electricity markets. Traditional option models based on geometric Brownian motion are not appropriate for electricity prices. An option model using the regime-switching framework is developed to value a European call option. The model includes volatility risk and allows changes in prices and volatility to be correlated. The results show that the value of a power plant is much higher using the financial option model than using traditional discounted cash flow.

  7. Payette River Basin Project: Improving Operational Forecasting in Complex Terrain through Chemistry

    NASA Astrophysics Data System (ADS)

    Blestrud, D.; Kunkel, M. L.; Parkinson, S.; Holbrook, V. P.; Benner, S. G.; Fisher, J.

    2015-12-01

    Idaho Power Company (IPC) is an investor-owned, hydroelectric-based utility serving customers throughout southern Idaho and eastern Oregon. The University of Arizona (UA) runs an operational 1.8-km resolution Weather Research and Forecasting (WRF) model for IPC, which is incorporated into IPC's near- and real-time forecasts for hydro, solar, and wind generation, load servicing, and a large-scale wintertime cloud seeding operation to increase winter snowpack. Winter snowpack is critical to IPC, as hydropower provides ~50% of the company's generation needs. In efforts to improve IPC's near-term forecasts and operational guidance to its cloud seeding program, IPC is working extensively with UA and the National Center for Atmospheric Research (NCAR) to improve WRF performance in the complex terrain of central Idaho. As part of this project, NCAR has developed a WRF-based cloud seeding module (WRF CS) to deliver high-resolution, tailored forecasts that provide accurate guidance for IPC's operations. Working with Boise State University (BSU), IPC is conducting a multiyear campaign to validate WRF CS's ability to account for and disperse the cloud seeding agent (AgI) within the boundary layer. This improved understanding of how WRF handles AgI dispersion and fate will ultimately improve the performance of WRF in forecasting other parameters. As part of this campaign, IPC has developed an extensive ground-based monitoring network, including a Remote Area Snow Sampling Device (RASSD) that provides spatially and temporally discrete snow samples during active cloud seeding periods. To quantify AgI dispersion in the complex terrain, BSU conducts trace element analysis using LA-ICP-MS on the RASSD-sampled snow to provide measurements (at the 10^-12 level) of incorporated AgI; these measurements are compared directly with WRF CS's estimates of distributed AgI. Modeling and analysis results from previous years' research and plans for coming seasons will be presented.

  8. 2013 Gulf of Mexico Hypoxia Forecast

    USGS Publications Warehouse

    Scavia, Donald; Evans, Mary Anne; Obenour, Dan

    2013-01-01

    The Gulf of Mexico annual summer hypoxia forecasts are based on average May total nitrogen loads from the Mississippi River basin for that year. The load estimate, recently released by USGS, is 7,316 metric tons per day. Based on that estimate, we predict the area of this summer’s hypoxic zone to be 18,900 square kilometers (95% credible interval, 13,400 to 24,200), the 7th largest reported and about the size of New Jersey. Our forecast hypoxic volume is 74.5 km3 (95% credible interval, 51.5 to 97.0), also the 7th largest on record.

  9. 7 CFR 1710.204 - Filing requirements for borrowers that must maintain an approved load forecast on an ongoing basis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Filing requirements for borrowers that must maintain an approved load forecast on an ongoing basis. 1710.204 Section 1710.204 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE GENERAL AND PRE-LOAN POLICIES AND PROCEDURES COMMON TO...

  10. Automatic load forecasting. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, D.J.; Vemuri, S.

    A method which lends itself to on-line forecasting of hourly electric loads is presented and the results of its use are compared to models developed using the Box-Jenkins method. The method consists of processing the historical hourly loads with a sequential least-squares estimator to identify a finite-order autoregressive model which in turn is used to obtain a parsimonious autoregressive-moving average model. A procedure is also defined for incorporating temperature as a variable to improve forecasts where loads are temperature dependent. The method presented has several advantages in comparison to the Box-Jenkins method, including much less human intervention and improved model identification. The method has been tested using three-hourly data from the Lincoln Electric System, Lincoln, Nebraska. In the exhaustive analyses performed on this data base this method produced significantly better results than the Box-Jenkins method. The method also proved to be more robust in that greater confidence could be placed in the accuracy of models based upon the various measures available at the identification stage.
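The core of the method, fitting a finite-order autoregressive model to historical hourly loads by least squares and forecasting one step ahead, can be sketched as follows (the synthetic sinusoidal load series and AR order are illustrative assumptions, not the Lincoln Electric System data):

```python
import numpy as np

def fit_ar(y, p):
    """Least-squares fit of y[t] = a1*y[t-1] + ... + ap*y[t-p] + c."""
    n = len(y)
    lags = np.column_stack([y[p - 1 - j : n - 1 - j] for j in range(p)])
    X = np.column_stack([lags, np.ones(n - p)])   # intercept column
    coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return coef

def forecast_one(y, coef):
    """One-step-ahead forecast from the last p observations."""
    p = len(coef) - 1
    return coef[:p] @ y[-1 : -p - 1 : -1] + coef[-1]

rng = np.random.default_rng(1)
t = np.arange(24 * 60)                            # 60 days of hourly observations
load = 500 + 80 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 5, t.size)  # synthetic load, MW
coef = fit_ar(load, p=24)                         # AR(24) spans the daily cycle
pred = forecast_one(load, coef)                   # forecast for the next hour
```

A production version would replace the batch `lstsq` call with the paper's sequential (recursive) least-squares update so the model can be refreshed on-line as each new hourly load arrives.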

  11. A temperature match based optimization method for daily load prediction considering DLC effect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Z.

    This paper presents a unique optimization method for short-term load forecasting. The new method is based on the optimal template temperature match between future and past temperatures. The optimal error reduction technique is a new concept introduced in this paper. Two case studies show that for hourly load forecasting, this method can yield results as good as the rather complicated Box-Jenkins transfer function method, and better than the Box-Jenkins method; for peak load prediction, this method is comparable in accuracy to the neural network method with back propagation, and can produce more accurate results than the multi-linear regression method. The DLC effect on system load is also considered in this method.

  12. Chesapeake Bay hypoxic volume forecasts and results

    USGS Publications Warehouse

    Scavia, Donald; Evans, Mary Anne

    2013-01-01

    The 2013 Forecast - Given the average Jan-May 2013 total nitrogen load of 162,028 kg/day, this summer’s hypoxia volume forecast is 6.1 km3, slightly smaller than average size for the period of record and almost the same as 2012. The late July 2013 measured volume was 6.92 km3.

  13. An econometric simulation model of income and electricity demand in Alaska's Railbelt, 1982-2022

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maddigan, R.J.; Hill, L.J.; Hamblin, D.M.

    1987-01-01

    This report describes the specification of, and forecasts derived from, the Alaska Railbelt Electricity Load, Macroeconomic (ARELM) model. ARELM was developed as an independent modeling tool for evaluating the need for power from the Susitna Hydroelectric Project proposed by the Alaska Power Authority. ARELM is an econometric simulation model consisting of 61 equations: 46 behavioral equations and 15 identities. The system includes two components: (1) ARELM-MACRO, a system of equations that simulates the performance of both the total Alaskan and Railbelt macroeconomies, and (2) ARELM-LOAD, which projects electricity-related activity in the Alaskan Railbelt region. The modeling system is block recursive in the sense that forecasts of population, personal income, and employment in the Railbelt derived from ARELM-MACRO are used as explanatory variables in ARELM-LOAD to simulate electricity demand, the real average price of electricity, and the number of customers in the Railbelt. Three scenarios based on assumptions about the future price of crude oil are simulated and documented in the report. The simulations, which do not include the cost-of-power impacts of Susitna-based generation, show that the growth rate in Railbelt electricity load is between 2.5 and 2.7% over the 1982 to 2022 forecast period. The forecasting results are consistent with other projections of load growth in the region using different modeling approaches.

  14. Data-driven forecasting algorithms for building energy consumption

    NASA Astrophysics Data System (ADS)

    Noh, Hae Young; Rajagopal, Ram

    2013-04-01

    This paper introduces two forecasting methods for building energy consumption data recorded from smart meters at high resolution. For utility companies, it is important to reliably forecast the aggregate consumption profile to determine energy supply for the next day and prevent any crisis. The proposed methods forecast individual loads on the basis of their measurement history and weather data, without complicated models of the building systems. The first method is most effective for very short-term prediction, such as a one-hour horizon, and uses a simple adaptive time-series model. For longer-term prediction, a nonparametric Gaussian process is applied to forecast day-ahead load profiles together with their uncertainty bounds. These methods are computationally simple and adaptive, and thus suitable for analyzing large sets of data whose patterns change over time. The forecasting methods are applied to several sets of building energy consumption data for lighting and heating-ventilation-air-conditioning (HVAC) systems collected from a campus building at Stanford University. The measurements are collected every minute, and corresponding weather data are provided hourly. The results show that the proposed algorithms can predict the energy consumption data with high accuracy.
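The day-ahead step uses a nonparametric Gaussian process to produce both a load forecast and its uncertainty bounds. A minimal numpy sketch of GP regression with an RBF kernel on a synthetic daily load shape (the kernel hyperparameters, noise level, and data are illustrative assumptions, not the Stanford campus measurements):

```python
import numpy as np

def rbf(a, b, length=3.0, var=100.0):
    """RBF (squared-exponential) kernel between 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=0.1):
    """Posterior mean and pointwise std of a zero-mean GP with an RBF kernel."""
    K = rbf(x_train, x_train) + noise ** 2 * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    sol = np.linalg.solve(K, Ks)                  # K^{-1} Ks
    mu = sol.T @ y_train
    cov = rbf(x_test, x_test) - Ks.T @ sol
    std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
    return mu, std

hours = np.arange(24.0)
load = 10.0 * np.sin(2 * np.pi * hours / 24)      # zero-mean synthetic daily load shape
mu, std = gp_posterior(hours, load, hours)        # day-ahead profile with uncertainty
band_low, band_high = mu - 2 * std, mu + 2 * std  # 95%-style uncertainty bounds
```

The posterior std is what gives the day-ahead forecast its uncertainty band; in practice the mean would be subtracted from the raw loads first and the hyperparameters fit to the history rather than fixed.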

  15. Forecasting Wind and Solar Generation: Improving System Operations, Greening the Grid (Spanish Version)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Tian; Chernyakhovskiy, Ilya; Brancucci Martinez-Anido, Carlo

    This document is the Spanish version of 'Greening the Grid - Forecasting Wind and Solar Generation: Improving System Operations'. It discusses improving system operations by forecasting wind and solar generation. By integrating variable renewable energy (VRE) forecasts into system operations, power system operators can anticipate up- and down-ramps in VRE generation in order to cost-effectively balance load and generation in intra-day and day-ahead scheduling. This leads to reduced fuel costs, improved system reliability, and maximum use of renewable resources.

  16. The Delicate Analysis of Short-Term Load Forecasting

    NASA Astrophysics Data System (ADS)

    Song, Changwei; Zheng, Yuan

    2017-05-01

    This paper proposes a new method for short-term load forecasting based on the similar-day method, correlation coefficients, and the Fast Fourier Transform (FFT) to achieve a precise analysis of load variation from three aspects (typical day, correlation coefficient, spectral analysis) and three dimensions (the time dimension, the industry dimension, and the main factors influencing load characteristics, such as national policies, regional economics, holidays, and electricity prices). First, the one-class SVM algorithm is adopted to select the typical day. Second, the correlation coefficient method is used to obtain the direction and strength of the linear relationship between two random variables, which can reflect the influence of macro policy and the scale of production on the electricity price. Third, a Fourier-transform residual error correction model is proposed to extract the nature of the load from the residual error. Finally, simulation results indicate the validity and engineering practicability of the proposed method.
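The third step, correcting a forecast by modeling the periodic structure left in its residuals with a Fourier transform, can be sketched as follows (the synthetic residual series and the number of retained harmonics are illustrative assumptions, not the paper's data):

```python
import numpy as np

def fourier_residual_model(residual, n_keep=3):
    """Model the residual by keeping only its n_keep strongest FFT bins."""
    spec = np.fft.rfft(residual)
    keep = np.argsort(np.abs(spec))[::-1][:n_keep]  # dominant frequency bins
    filtered = np.zeros_like(spec)
    filtered[keep] = spec[keep]
    return np.fft.irfft(filtered, n=len(residual))

t = np.arange(96)                                   # one day at 15-minute resolution
rng = np.random.default_rng(2)
# Illustrative residual: an unmodeled daily harmonic plus noise
residual = 4.0 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.5, t.size)

correction = fourier_residual_model(residual)       # periodic structure in the residual
corrected = residual - correction                   # error left after the correction
```

Adding `correction` back onto the base forecast removes the systematic periodic error, leaving only the irregular noise component.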

  17. Application of Classification Methods for Forecasting Mid-Term Power Load Patterns

    NASA Astrophysics Data System (ADS)

    Piao, Minghao; Lee, Heon Gyu; Park, Jin Hyoung; Ryu, Keun Ho

    An automated methodology based on data mining techniques is presented for the prediction of customer load patterns in long-duration load profiles. The approach proposed in this paper consists of three stages: (i) data preprocessing, in which noise and outliers are removed and continuous attribute-valued features are transformed to discrete values; (ii) cluster analysis, in which k-means clustering is used to create load pattern classes and the representative load profile for each class; and (iii) classification, in which several supervised learning methods are evaluated in order to select a suitable prediction method. Following the proposed methodology, power loads measured from an AMR (automatic meter reading) system, as well as customer indexes, were used as inputs for clustering. The output of clustering was the set of representative load profiles (or classes). To evaluate the forecasting of load patterns, the classification methods were applied to a set of high-voltage customers of the Korea power system, with class labels derived from clustering and other features used as input to produce classifiers. Lastly, the results of our experiments are presented.
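Stage (ii), k-means clustering of load profiles into classes with representative (centroid) profiles, can be sketched in plain numpy (the two synthetic customer classes and the greedy farthest-point initialization are illustrative assumptions, not the AMR data):

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain k-means with greedy farthest-point initialization.
    Returns per-sample labels and the k representative (centroid) profiles."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])            # farthest point from current centers
    centers = np.array(centers)
    for _ in range(iters):                         # Lloyd iterations
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(axis=2), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

# Two synthetic customer classes: afternoon-peaking and flat night load
hours = np.arange(24)
day_peak = 1.0 + 0.8 * np.exp(-((hours - 14) ** 2) / 8.0)
night_flat = np.full(24, 0.6)
rng = np.random.default_rng(3)
X = np.vstack([day_peak + rng.normal(0, 0.05, (20, 24)),
               night_flat + rng.normal(0, 0.05, (20, 24))])

labels, centers = kmeans(X, k=2)                   # representative load profile per class
```

The cluster labels produced here are exactly what stage (iii) would use as classification targets, with customer indexes as additional features.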

  18. Ecological Forecasting: Microbial Contamination and Atmospheric Loadings of Nutrients to Land and Water

    EPA Science Inventory

    The development of ecological forecasts, namely, methodologies to predict the chemical, biological, and physical changes in terrestrial and aquatic ecosystems is desirable so that effective strategies for reducing the adverse impacts of human activities and extreme natural events...

  19. Carbon-Carbon Recuperators in Closed-Brayton-Cycle Space Power Systems

    NASA Technical Reports Server (NTRS)

    Barrett, Michael J.; Johnson, Paul K.

    2006-01-01

    The use of carbon-carbon (C-C) recuperators in closed-Brayton-cycle space power conversion systems was assessed. Recuperator performance was forecast based on notional thermodynamic cycle state values for planetary missions. Resulting thermal performance, mass and volume for plate-fin C-C recuperators were estimated and quantitatively compared with values for conventional offset-strip-fin metallic designs. Mass savings of 40-55% were projected for C-C recuperators with effectiveness greater than 0.9 and thermal loads from 25-1400 kWt. The smaller thermal loads corresponded with lower mass savings; however, at least 50% savings were forecast for all loads above 300 kWt. System-related material challenges and compatibility issues were also discussed.

  20. Managing the space-time-load continuum in TMDL planning: A case study for understanding groundwater loads through advanced mapping techniques

    EPA Science Inventory

    The lag time between groundwater recharge and discharge in a watershed and the potential groundwater load to streams is an important factor in forecasting responses to future land use practices. We call this concept managing the “space-time-load continuum”. It’s understood that i...

  1. An evaluation of the impact of biomass burning smoke aerosol particles on near surface temperature forecasts

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Reid, J. S.; Benedetti, A.; Christensen, M.; Marquis, J. W.

    2016-12-01

    With ongoing improvements in aerosol forecast accuracy through aerosol data assimilation, the community is unavoidably facing a scientific question: is it worth the computational time to insert real-time aerosol analyses into numerical models for weather forecasts? In this study, by analyzing a significant biomass burning aerosol event that occurred in 2015 over the northern part of the central US, the impact of aerosol particles on near-surface temperature forecasts is evaluated. The aerosol direct surface cooling efficiency, which links surface temperature changes to aerosol loading, is derived from observation-based data for the first time. The potential of including real-time aerosol analyses in weather forecasting models for near-surface temperature forecasts is also investigated.

  2. 7 CFR 1710.300 - General.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... forecast. The forecast should be used by the board of directors and the manager to guide the system towards... projected results of future actions planned by the borrower's board of directors; (2) The financial goals... type of large power loads, projections of future borrowings and the associated interest, projected...

  3. 7 CFR 1710.300 - General.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... forecast. The forecast should be used by the board of directors and the manager to guide the system towards... projected results of future actions planned by the borrower's board of directors; (2) The financial goals... type of large power loads, projections of future borrowings and the associated interest, projected...

  4. BLOND, a building-level office environment dataset of typical electrical appliances.

    PubMed

    Kriechbaumer, Thomas; Jacobsen, Hans-Arno

    2018-03-27

    Energy metering has gained popularity as conventional meters are replaced by electronic smart meters that promise energy savings and higher comfort levels for occupants. Achieving these goals requires a deeper understanding of consumption patterns to reduce the energy footprint: load profile forecasting, power disaggregation, appliance identification, startup event detection, etc. Publicly available datasets are used to test, verify, and benchmark possible solutions to these problems. For this purpose, we present the BLOND dataset: continuous energy measurements of a typical office environment at high sampling rates with common appliances and load profiles. We provide voltage and current readings for aggregated circuits and matching fully-labeled ground truth data (individual appliance measurements). The dataset contains 53 appliances (16 classes) in a 3-phase power grid. BLOND-50 contains 213 days of measurements sampled at 50kSps (aggregate) and 6.4kSps (individual appliances). BLOND-250 consists of the same setup: 50 days, 250kSps (aggregate), 50kSps (individual appliances). These are the longest continuous measurements at such high sampling rates and fully-labeled ground truth we are aware of.

  5. BLOND, a building-level office environment dataset of typical electrical appliances

    NASA Astrophysics Data System (ADS)

    Kriechbaumer, Thomas; Jacobsen, Hans-Arno

    2018-03-01

    Energy metering has gained popularity as conventional meters are replaced by electronic smart meters that promise energy savings and higher comfort levels for occupants. Achieving these goals requires a deeper understanding of consumption patterns to reduce the energy footprint: load profile forecasting, power disaggregation, appliance identification, startup event detection, etc. Publicly available datasets are used to test, verify, and benchmark possible solutions to these problems. For this purpose, we present the BLOND dataset: continuous energy measurements of a typical office environment at high sampling rates with common appliances and load profiles. We provide voltage and current readings for aggregated circuits and matching fully-labeled ground truth data (individual appliance measurements). The dataset contains 53 appliances (16 classes) in a 3-phase power grid. BLOND-50 contains 213 days of measurements sampled at 50kSps (aggregate) and 6.4kSps (individual appliances). BLOND-250 consists of the same setup: 50 days, 250kSps (aggregate), 50kSps (individual appliances). These are the longest continuous measurements at such high sampling rates and fully-labeled ground truth we are aware of.

  6. BLOND, a building-level office environment dataset of typical electrical appliances

    PubMed Central

    Kriechbaumer, Thomas; Jacobsen, Hans-Arno

    2018-01-01

    Energy metering has gained popularity as conventional meters are replaced by electronic smart meters that promise energy savings and higher comfort levels for occupants. Achieving these goals requires a deeper understanding of consumption patterns to reduce the energy footprint: load profile forecasting, power disaggregation, appliance identification, startup event detection, etc. Publicly available datasets are used to test, verify, and benchmark possible solutions to these problems. For this purpose, we present the BLOND dataset: continuous energy measurements of a typical office environment at high sampling rates with common appliances and load profiles. We provide voltage and current readings for aggregated circuits and matching fully-labeled ground truth data (individual appliance measurements). The dataset contains 53 appliances (16 classes) in a 3-phase power grid. BLOND-50 contains 213 days of measurements sampled at 50kSps (aggregate) and 6.4kSps (individual appliances). BLOND-250 consists of the same setup: 50 days, 250kSps (aggregate), 50kSps (individual appliances). These are the longest continuous measurements at such high sampling rates and fully-labeled ground truth we are aware of. PMID:29583141

  7. Carbon-Carbon Recuperators in Closed-Brayton-Cycle Space Power Systems

    NASA Technical Reports Server (NTRS)

    Barrett, Michael J.; Johnson, Paul K.; Naples, Andrew G.

    2006-01-01

    The feasibility of using carbon-carbon (C-C) recuperators in conceptual closed-Brayton-cycle space power conversion systems was assessed. Recuperator performance expectations were forecast based on notional thermodynamic cycle state values for potential planetary missions. Resulting thermal performance, mass and volume for plate-fin C-C recuperators were estimated and quantitatively compared with values for conventional offset-strip-fin metallic designs. Mass savings of 30 to 60 percent were projected for C-C recuperators with effectiveness greater than 0.9 and thermal loads from 25 to 1400 kWt. The smaller thermal loads corresponded with lower mass savings; however, 60 percent savings were forecast for all loads above 300 kWt. System-related material challenges and compatibility issues were also discussed.

  8. Utilization of Model Predictive Control to Balance Power Absorption Against Load Accumulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abbas, Nikhar; Tom, Nathan M

    2017-06-03

    Wave energy converter (WEC) control strategies have been primarily focused on maximizing power absorption. The use of model predictive control strategies allows for a finite-horizon, multiterm objective function to be solved. This work utilizes a multiterm objective function to maximize power absorption while minimizing the structural loads on the WEC system. Furthermore, a Kalman filter and autoregressive model were used to estimate and forecast the wave exciting force and predict the future dynamics of the WEC. The WEC's power-take-off time-averaged power and structural loads under a perfect forecast assumption in irregular waves were compared against results obtained from the Kalman filter and autoregressive model to evaluate model predictive control performance.

  9. Utilization of Model Predictive Control to Balance Power Absorption Against Load Accumulation: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abbas, Nikhar; Tom, Nathan

    Wave energy converter (WEC) control strategies have been primarily focused on maximizing power absorption. The use of model predictive control strategies allows for a finite-horizon, multiterm objective function to be solved. This work utilizes a multiterm objective function to maximize power absorption while minimizing the structural loads on the WEC system. Furthermore, a Kalman filter and autoregressive model were used to estimate and forecast the wave exciting force and predict the future dynamics of the WEC. The WEC's power-take-off time-averaged power and structural loads under a perfect forecast assumption in irregular waves were compared against results obtained from the Kalman filter and autoregressive model to evaluate model predictive control performance.

  10. On-line algorithms for forecasting hourly loads of an electric utility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vemuri, S.; Huang, W.L.; Nelson, D.J.

    A method that lends itself to on-line forecasting of hourly electric loads is presented, and the results of its use are compared to models developed using the Box-Jenkins method. The method consists of processing the historical hourly loads with a sequential least-squares estimator to identify a finite-order autoregressive model which, in turn, is used to obtain a parsimonious autoregressive-moving average model. The method presented has several advantages in comparison with the Box-Jenkins method, including much less human intervention, improved model identification, and better results. The method is also more robust in that greater confidence can be placed in the accuracy of models based upon the various measures available at the identification stage.

  11. Chesapeake Bay Hypoxic Volume Forecasts and Results

    USGS Publications Warehouse

    Evans, Mary Anne; Scavia, Donald

    2013-01-01

    Given the average Jan-May 2013 total nitrogen load of 162,028 kg/day, this summer's hypoxia volume forecast is 6.1 km3, slightly smaller than average size for the period of record and almost the same as 2012. The late July 2013 measured volume was 6.92 km3.

  12. A Solar Time-Based Analog Ensemble Method for Regional Solar Power Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Brian S; Zhang, Xinmin; Li, Yuan

    This paper presents a new analog ensemble method for day-ahead regional photovoltaic (PV) power forecasting with hourly resolution. Utilizing open weather forecast and power measurement data, the prediction method searches a set of historical data for days with similar meteorological data (temperature and irradiance) and astronomical date (solar time and earth declination angle). Further, clustering and blending strategies are applied to improve its accuracy in regional PV forecasting. The robustness of the proposed method is demonstrated with three different numerical weather prediction models, the North American Mesoscale Forecast System, the Global Forecast System, and the Short-Range Ensemble Forecast, for both region-level and single-site-level PV forecasts. Using real measured data, the new forecasting approach is applied to the load zone in Southeastern Massachusetts as a case study. The normalized root mean square error (NRMSE) has been reduced by 13.80%-61.21% when compared with three tested baselines.
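The core analog-ensemble idea, averaging the measured PV power of the historical days whose weather most resembles the forecast weather, can be sketched as follows (the synthetic weather/power data, feature scaling, and ensemble size are illustrative assumptions, not the Massachusetts case-study data):

```python
import numpy as np

def analog_forecast(hist_weather, hist_power, target_weather, k=5):
    """Average the measured power of the k most weather-similar historical days."""
    dist = np.linalg.norm(hist_weather - target_weather, axis=1)
    analogs = np.argsort(dist)[:k]                 # indices of the k best analog days
    return float(hist_power[analogs].mean())

rng = np.random.default_rng(4)
irr = rng.uniform(200, 1000, 365)                  # daily-mean irradiance, W/m^2
temp = rng.uniform(0, 35, 365)                     # daily-mean temperature, deg C
power = 0.1 * irr - 0.05 * temp + rng.normal(0, 3, 365)  # toy PV response, MW

# Scale features so irradiance and temperature contribute comparably to distance
feats = np.column_stack([irr / irr.std(), temp / temp.std()])
target = np.array([800.0 / irr.std(), 25.0 / temp.std()])  # tomorrow's forecast weather

pred = analog_forecast(feats, power, target)       # ensemble mean of the analog days
```

Keeping the individual analog powers instead of their mean yields an empirical distribution, which is where the method's ensemble (probabilistic) character comes from.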

  13. Analyzing Effect of System Inertia on Grid Frequency Forecasting Using Two Stage Neuro-Fuzzy System

    NASA Astrophysics Data System (ADS)

    Chourey, Divyansh R.; Gupta, Himanshu; Kumar, Amit; Kumar, Jitesh; Kumar, Anand; Mishra, Anup

    2018-04-01

    Frequency forecasting is an important aspect of power system operation. The system frequency varies with the load-generation imbalance, and frequency variation depends on various parameters, including system inertia, which determines the rate of fall of frequency after a disturbance in the grid. However, system inertia is usually not considered when forecasting power system frequency during planning and operation, which leads to significant forecasting errors. In this paper, the effect of inertia on frequency forecasting is analysed for a particular grid system, and a parameter equivalent to system inertia is introduced. This parameter is used to forecast the frequency of a typical power grid at any instant of time. The system gives appreciable results with reduced error.

  14. Youth temperament, harsh parenting, and variation in the oxytocin receptor gene forecast allostatic load during emerging adulthood.

    PubMed

    Brody, Gene H; Yu, Tianyi; Barton, Allen W; Miller, Gregory E; Chen, Edith

    2017-08-01

    An association has been found between receipt of harsh parenting in childhood and adult health problems. However, this research has been principally retrospective, has treated children as passive recipients of parental behavior, and has overlooked individual differences in youth responsivity to harsh parenting. In a 10-year multiple-wave prospective study of African American families, we addressed these issues by focusing on the influence of polymorphisms in the oxytocin receptor gene (OXTR), variants of which appear to buffer or amplify responses to environmental stress. The participants were 303 youths, with a mean age of 11.2 at the first assessment, and their parents, all of whom were genotyped for variations in the rs53576 (A/G) polymorphism. Teachers rated preadolescent (ages 11 to 13) emotionally intense and distractible temperaments, and adolescents (ages 15 and 16) reported receipt of harsh parenting. Allostatic load was assessed during young adulthood (ages 20 and 21). Difficult preadolescent temperament forecast elevated receipt of harsh parenting in adolescence, and adolescents who experienced harsh parenting evinced high allostatic load during young adulthood. However, these associations emerged only among children and parents who carried A alleles of the OXTR genotype. The results suggest the oxytocin system operates along with temperament and parenting to forecast young adults' allostatic load.

  15. Load research manual. Volume 2: Fundamentals of implementing load research procedures

    NASA Astrophysics Data System (ADS)

    1980-11-01

    This manual will assist electric utilities and state regulatory authorities in investigating customer electricity demand as part of cost-of-service studies, rate design, marketing research, system design, load forecasting, rate reform analysis, and load management research. Load research procedures are described in detail. Research programs at three utilities are compared: Carolina Power and Light Company, Long Island Lighting Company, and Southern California Edison Company. A load research bibliography and glossaries of load research and statistical terms are also included.

  16. Analog-Based Postprocessing of Navigation-Related Hydrological Ensemble Forecasts

    NASA Astrophysics Data System (ADS)

    Hemri, S.; Klein, B.

    2017-11-01

    Inland waterway transport benefits from probabilistic forecasts of water levels, as they allow operators to optimize the ship load and, hence, minimize transport costs. Probabilistic state-of-the-art hydrologic ensemble forecasts inherit biases and dispersion errors from the atmospheric ensemble forecasts they are driven with. The use of statistical postprocessing techniques like ensemble model output statistics (EMOS) allows for a reduction of these systematic errors by fitting a statistical model based on training data. In this study, training periods for EMOS are selected based on forecast analogs, i.e., historical forecasts that are similar to the forecast to be verified. Due to the strong autocorrelation of water levels, forecast analogs have to be selected based on entire forecast hydrographs in order to guarantee similar hydrograph shapes. Custom-tailored measures of similarity for forecast hydrographs comprise hydrological series distance (SD), the hydrological matching algorithm (HMA), and dynamic time warping (DTW). Verification against observations reveals that EMOS forecasts for water level at three gauges along the river Rhine with training periods selected based on SD, HMA, and DTW compare favorably with reference EMOS forecasts, which are based on either seasonal training periods or on training periods obtained by dividing the hydrological forecast trajectories into runoff regimes.
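    Of the three similarity measures mentioned, dynamic time warping is the easiest to sketch; a minimal implementation for ranking historical forecast hydrographs against the current one might look like this (SD and HMA are hydrology-specific and not reproduced here):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two forecast hydrographs.

    Classic O(len(a)*len(b)) dynamic program; smaller values mean more
    similar hydrograph shapes, so the lowest-distance historical
    forecasts can form the EMOS training period.
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

    Because warping allows one sample to match several, a hydrograph compared against a time-stretched copy of itself still scores zero, which is exactly the invariance needed when rising limbs of similar shape are shifted by a few hours.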

  17. Traffic-load forecasting using weigh-in-motion data

    DOT National Transportation Integrated Search

    1997-03-01

    Vehicular traffic loading is a crucial consideration for the design and maintenance of pavements. With the help of weigh-in-motion (WIM) systems, the information about date, time, speed, lane of travel, lateral lane position, axle spacing, and wheel ...

  18. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task One Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Shuai; Makarov, Yuri V.

    This is a report for task one of the tail event analysis project for BPA. A tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators fall short of scheduled output, and the imbalance between generation and load becomes very significant. This type of event occurs infrequently and appears on the tails of the distribution of system power imbalance; it is therefore referred to as a tail event. This report analyzes what happened during the Electric Reliability Council of Texas (ERCOT) reliability event on February 26, 2008, which was widely reported because of the involvement of wind generation. The objective is to identify sources of the problem, solutions to it, and potential improvements that can be made to the system. Lessons learned from the analysis include the following: (1) A large mismatch between generation and load can be caused by load forecast error, wind forecast error, and generation scheduling control error on traditional generators, or a combination of all of the above; (2) The capability of system balancing resources should be evaluated both in capacity (MW) and in ramp rate (MW/min), and be procured accordingly to meet both requirements. The resources need to be able to cover a range corresponding to the variability of load and wind in the system, in addition to other uncertainties; (3) Unexpected ramps caused by load and wind can both lead to serious issues; (4) A look-ahead tool that evaluates the system balancing requirement during real-time operations and compares it with available system resources should be very helpful to system operators in predicting similar events and planning ahead; and (5) Demand response (only load reduction in the ERCOT event) can effectively reduce load-generation mismatch and terminate frequency deviation in an emergency situation.

  19. Adaptive measurements of urban runoff quality

    NASA Astrophysics Data System (ADS)

    Wong, Brandon P.; Kerkez, Branko

    2016-11-01

    An approach to adaptively measure runoff water quality dynamics is introduced, focusing specifically on characterizing the timing and magnitude of urban pollutographs. Rather than relying on a static schedule or flow-weighted sampling, which can miss important water quality dynamics if parameterized inadequately, novel Internet-enabled sensor nodes are used to autonomously adapt their measurement frequency to real-time weather forecasts and hydrologic conditions. This dynamic approach has the potential to significantly improve the use of constrained experimental resources, such as automated grab samplers, which continue to provide a strong alternative to sampling water quality dynamics when in situ sensors are not available. Compared to conventional flow-weighted or time-weighted sampling schemes, which rely on preset thresholds, a major benefit of the approach is the ability to dynamically adapt to features of an underlying hydrologic signal. A 28 km2 urban watershed was studied to characterize concentrations of total suspended solids (TSS) and total phosphorus. Water quality samples were autonomously triggered in response to features in the underlying hydrograph and real-time weather forecasts. The study watershed did not exhibit a strong first flush, and intraevent concentration variability was driven by flow acceleration, wherein the largest loadings of TSS and total phosphorus corresponded with the steepest rising limbs of the storm hydrograph. The scalability of the proposed method is discussed in the context of larger sensor network deployments, as well as the potential to improve control of urban water quality.

  20. Development of personal pollen information—the next generation of pollen information and a step forward for hay fever sufferers

    NASA Astrophysics Data System (ADS)

    Kmenta, Maximilian; Bastl, Katharina; Jäger, Siegfried; Berger, Uwe

    2014-10-01

    Pollen allergies affect a large part of the European population and are considered likely to increase. User feedback indicates that there are difficulties in providing proper information and valid forecasts using traditional methods of aerobiology, due to a variety of factors: allergen content, pollen loads, and pollen allergy symptoms vary per region and year. The first steps in addressing such issues have already been undertaken, and a personalized pollen-related symptom forecast is thought to be a possible answer. However, attempts made thus far have not led to an improvement in daily forecasting procedures. This study describes a model that was launched in 2013 in Austria to provide the first available personal pollen information. The system includes innovative forecast models using bi-hourly pollen data, traditional pollen forecasts based on historical data, meteorological data, and recent symptom data from the patient's hay fever diary. Furthermore, it calculates the personal symptom load in real time, in particular from the entries of the previous 5 days, to classify users. The personal pollen information was made available in Austria on the Austrian pollen information website and via a mobile pollen application, described herein for the first time. It is expected that the inclusion of personal symptoms will lead to major improvements in pollen information for hay fever sufferers.

  1. Evaluation of Pollen Apps Forecasts: The Need for Quality Control in an eHealth Service.

    PubMed

    Bastl, Katharina; Berger, Uwe; Kmenta, Maximilian

    2017-05-08

    Pollen forecasts are highly valuable for allergen avoidance and thus for raising the quality of life of persons affected by pollen allergies. They are considered valuable free services for the public, yet careful scientific evaluation of pollen forecasts in terms of accuracy and reliability has not been available to date. The aim of this study was to analyze 9 mobile apps that deliver pollen information and pollen forecasts, with a focus on their accuracy in predicting the pollen load in the grass pollen season 2016, to assess their usefulness for pollen allergy sufferers. The following numbers of apps were evaluated: 3 apps for Vienna (Austria), 4 apps for Berlin (Germany), and 1 app each for Basel (Switzerland) and London (United Kingdom). All mobile apps were freely available. The current day's grass pollen forecast was compared with measured grass pollen concentrations throughout the defined grass pollen season at each location. Hit rates were calculated for exact agreement and for tolerances of ±2 and ±4 pollen per cubic meter. In general, hit rates scored around 50% for most apps (6 apps); 1 app showed better results, whereas 3 apps performed less well. Hit rates increased for most apps when calculated with tolerances. In contrast, the forecast of the "readiness to flower" for grasses performed at a sufficiently accurate level, although only two apps provided such a forecast: the last of those forecasts coincided with the first moderate grass pollen load on the predicted day or within 3 days after it, even when issued about a month in advance. Advertisement was present in 3 of the 9 analyzed apps, whereas an imprint mentioning institutions with experience in pollen forecasting was present in only three other apps. 
The quality of pollen forecasts is in need of improvement, and quality control for pollen forecasts is recommended to avoid potential harm to pollen allergy sufferers from inadequate forecasts. Including information on the reliability of the forecasts provided, handled similarly to probabilistic weather forecasts, should be considered. ©Katharina Bastl, Uwe Berger, Maximilian Kmenta. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 08.05.2017.

  2. Evaluation of Pollen Apps Forecasts: The Need for Quality Control in an eHealth Service

    PubMed Central

    Berger, Uwe; Kmenta, Maximilian

    2017-01-01

    Background Pollen forecasts are highly valuable for allergen avoidance and thus for raising the quality of life of persons affected by pollen allergies. They are considered valuable free services for the public, yet careful scientific evaluation of pollen forecasts in terms of accuracy and reliability has not been available to date. Objective The aim of this study was to analyze 9 mobile apps that deliver pollen information and pollen forecasts, with a focus on their accuracy in predicting the pollen load in the grass pollen season 2016, to assess their usefulness for pollen allergy sufferers. Methods The following numbers of apps were evaluated: 3 apps for Vienna (Austria), 4 apps for Berlin (Germany), and 1 app each for Basel (Switzerland) and London (United Kingdom). All mobile apps were freely available. The current day’s grass pollen forecast was compared with measured grass pollen concentrations throughout the defined grass pollen season at each location. Hit rates were calculated for exact agreement and for tolerances of ±2 and ±4 pollen per cubic meter. Results In general, hit rates scored around 50% for most apps (6 apps); 1 app showed better results, whereas 3 apps performed less well. Hit rates increased for most apps when calculated with tolerances. In contrast, the forecast of the “readiness to flower” for grasses performed at a sufficiently accurate level, although only two apps provided such a forecast: the last of those forecasts coincided with the first moderate grass pollen load on the predicted day or within 3 days after it, even when issued about a month in advance. Advertisement was present in 3 of the 9 analyzed apps, whereas an imprint mentioning institutions with experience in pollen forecasting was present in only three other apps. 
Conclusions The quality of pollen forecasts is in need of improvement, and quality control for pollen forecasts is recommended to avoid potential harm to pollen allergy sufferers from inadequate forecasts. Including information on the reliability of the forecasts provided, handled similarly to probabilistic weather forecasts, should be considered. PMID:28483740
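    The hit-rate metric used in this evaluation reduces to a simple tolerance comparison between forecast and measured concentrations; a sketch (the ±2 and ±4 pollen per cubic meter tolerances correspond to the tol argument):

```python
def hit_rate(forecast, observed, tol=0):
    """Fraction of days on which the forecast pollen load matches the
    measured concentration within ±tol pollen per cubic meter."""
    hits = sum(1 for f, o in zip(forecast, observed) if abs(f - o) <= tol)
    return hits / len(forecast)
```

    With tol=0 this is the exact-agreement hit rate; widening the tolerance can only raise the score, which is why most apps improved when tolerances were applied.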

  3. KSC-2010-1054

    NASA Image and Video Library

    2010-01-07

    CAPE CANAVERAL, Fla. – At the Astrotech Space Operations facility in Titusville, Fla., Boeing spacecraft fueling technicians from Kennedy Space Center prepare the equipment necessary to sample the monomethylhydrazine propellant that will be loaded aboard the Solar Dynamics Observatory, or SDO. The hydrazine fuel is being sampled for purity before it is loaded aboard the spacecraft. The technicians are dressed in self-contained atmospheric protective ensemble suits, or SCAPE suits, as a safety precaution in the unlikely event that any of the highly toxic chemical should escape from the storage tank. The nitrogen tetroxide oxidizer was loaded earlier in the week which is customarily followed by loading of the fuel. Propellant loading is one of the final processing milestones before the spacecraft is encapsulated in its fairing for launch. SDO is the first mission in NASA's Living With a Star Program and is designed to study the causes of solar variability and its impacts on Earth. The spacecraft's long-term measurements will give solar scientists in-depth information to help characterize the interior of the Sun, the Sun's magnetic field, the hot plasma of the solar corona, and the density of radiation that creates the ionosphere of the planets. The information will be used to create better forecasts of space weather needed to protect the aircraft, satellites and astronauts living and working in space. Liftoff aboard an Atlas V rocket is targeted for Feb. 9 from Launch Complex 41 on Cape Canaveral Air Force Station. For information on SDO, visit http://www.nasa.gov/sdo. Photo credit: NASA/Jack Pfaller

  4. Clarus multi-state regional demonstrations, evaluation of use case #2 : seasonal load restriction tool.

    DOT National Transportation Integrated Search

    2011-07-01

    This report presents the results of an evaluation of the demonstration of an experimental seasonal load restriction decision support tool. This system offers state DOTs subsurface condition forecasts (such as moisture, temperature, and freeze-thaw tr...

  5. Linked Hydrologic-Hydrodynamic Model Framework to Forecast Impacts of Rivers on Beach Water Quality

    NASA Astrophysics Data System (ADS)

    Anderson, E. J.; Fry, L. M.; Kramer, E.; Ritzenthaler, A.

    2014-12-01

    The goal of NOAA's beach quality forecasting program is to use a multi-faceted approach to aid in detection and prediction of bacteria in recreational waters. In particular, our focus has been on the connection between tributary loads and bacteria concentrations at nearby beaches. While there is a clear link between stormwater runoff and beach water quality, quantifying the contribution of river loadings to nearshore bacterial concentrations is complicated due to multiple processes that drive bacterial concentrations in rivers as well as those processes affecting the fate and transport of bacteria upon exiting the rivers. In order to forecast potential impacts of rivers on beach water quality, we developed a linked hydrologic-hydrodynamic water quality framework that simulates accumulation and washoff of bacteria from the landscape, and then predicts the fate and transport of washed off bacteria from the watershed to the coastal zone. The framework includes a watershed model (IHACRES) to predict fecal indicator bacteria (FIB) loadings to the coastal environment (accumulation, wash-off, die-off) as a function of effective rainfall. These loadings are input into a coastal hydrodynamic model (FVCOM), including a bacteria transport model (Lagrangian particle), to simulate 3D bacteria transport within the coastal environment. This modeling system provides predictive tools to assist local managers in decision-making to reduce human health threats.
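    The accumulation/wash-off/die-off chain driving the watershed loadings can be illustrated with a first-order toy model (parameters and function name are hypothetical; the operational system uses a calibrated IHACRES-based formulation, not this sketch):

```python
import math

def bacteria_load(rain, b0=1.0, accum=0.2, k_wash=0.5, k_die=0.05):
    """Daily buildup/wash-off/die-off sketch of fecal indicator bacteria.

    rain is a sequence of effective rainfall depths; returns the load
    washed off the landscape each day. Bacteria accumulate at a constant
    rate in dry weather, decay with first-order die-off, and wash off in
    proportion to an exponential function of rainfall.
    """
    b = b0                  # bacteria stock on the landscape
    loads = []
    for r in rain:
        b += accum                                # dry-weather buildup
        b *= math.exp(-k_die)                     # first-order die-off
        washed = b * (1 - math.exp(-k_wash * r))  # rainfall-driven wash-off
        b -= washed
        loads.append(washed)
    return loads
```

    The washed-off loads would then serve as the tributary boundary condition for the hydrodynamic transport step (FVCOM with Lagrangian particles in the paper).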

  6. Bulk electric system reliability evaluation incorporating wind power and demand side management

    NASA Astrophysics Data System (ADS)

    Huang, Dange

    Electric power systems are experiencing dramatic changes with respect to structure, operation, and regulation and are facing increasing pressure due to environmental and societal constraints. Bulk electric system reliability is an important consideration in power system planning, design, and operation, particularly in the new competitive environment. A wide range of methods has been developed to perform bulk electric system reliability evaluation. Theoretically, sequential Monte Carlo simulation can include all aspects and contingencies in a power system and can be used to produce an informative set of reliability indices. Owing to the growth of computing power, it has become a practical and viable technique for large-system reliability assessment and is used in the studies described in this thesis. The well-being approach used in this research provides the opportunity to integrate an accepted deterministic criterion into a probabilistic framework. This research work includes the investigation of important factors that impact bulk electric system adequacy evaluation and security-constrained adequacy assessment using the well-being analysis framework. Load forecast uncertainty is an important consideration in an electrical power system. This research includes load forecast uncertainty considerations in bulk electric system reliability assessment, and the effects on system, load point, and well-being indices and on reliability index probability distributions are examined. There has been increasing worldwide interest in the utilization of wind power as a renewable energy source over the last two decades due to enhanced public awareness of the environment. Increasing penetration of wind power has significant impacts on power system reliability, and security analyses become more uncertain due to the unpredictable nature of wind power. 
The effects of wind power additions in generating and bulk electric system reliability assessment considering site wind speed correlations and the interactive effects of wind power and load forecast uncertainty on system reliability are examined. The concept of the security cost associated with operating in the marginal state in the well-being framework is incorporated in the economic analyses associated with system expansion planning including wind power and load forecast uncertainty. Overall reliability cost/worth analyses including security cost concepts are applied to select an optimal wind power injection strategy in a bulk electric system. The effects of the various demand side management measures on system reliability are illustrated using the system, load point, and well-being indices, and the reliability index probability distributions. The reliability effects of demand side management procedures in a bulk electric system including wind power and load forecast uncertainty considerations are also investigated. The system reliability effects due to specific demand side management programs are quantified and examined in terms of their reliability benefits.

  7. Self-heating forecasting for thick laminate specimens in fatigue

    NASA Astrophysics Data System (ADS)

    Lahuerta, F.; Westphal, T.; Nijssen, R. P. L.

    2014-12-01

    Thick laminate sections can be found from the tip to the root in most common wind turbine blade designs. Obtaining accurate and reliable design data for thick laminates is the subject of ongoing investigations, which include experiments on thick laminate coupons. Due to the poor thermal conductivity of composites and the material self-heating that occurs during fatigue loading, high temperature gradients may appear through the laminate thickness. In the case of thick laminates in high-load regimes, the core temperature might influence the mechanical properties, leading to premature failures. In the present work, a method to forecast the self-heating of thick laminates under fatigue loading is presented. The mechanical loading is related to the laminate self-heating via the cyclic strain energy and the energy loss ratio. Based on this internal volumetric heat load, a thermal model is built and solved to obtain the temperature distribution in the transient state. Based on experimental measurements of the energy loss factor for 10 mm thick coupons, the method is described and the resulting predictions are compared with experimental surface temperature measurements on 10 and 30 mm UD thick laminate specimens.
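    The transient thermal model can be sketched as a one-dimensional heat equation through the thickness with a uniform volumetric heat source representing the hysteretic losses (an explicit finite-difference toy; the material values used below are placeholders, not the paper's measured properties):

```python
import numpy as np

def laminate_temperature(thickness, q_vol, k, rho_cp, t_end, nx=51, t_amb=20.0):
    """Explicit finite-difference solve of the 1D transient heat equation
    through the laminate thickness with a uniform volumetric heat source
    q_vol (W/m^3); both surfaces are held at ambient temperature.

    thickness in m, k in W/(m K), rho_cp in J/(m^3 K), t_end in s.
    Returns the temperature profile across nx through-thickness nodes.
    """
    dx = thickness / (nx - 1)
    alpha = k / rho_cp
    dt = 0.4 * dx * dx / alpha      # below the explicit stability limit
    T = np.full(nx, t_amb)
    for _ in range(int(t_end / dt)):
        T[1:-1] += dt * (alpha * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
                         + q_vol / rho_cp)
        T[0] = T[-1] = t_amb        # surfaces at ambient temperature
    return T
```

    The qualitative behavior matches the abstract: the core runs hotter than the surfaces, and the gap grows with laminate thickness for the same volumetric heat load.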

  8. Forecasting staffing needs for productivity management in hospital laboratories.

    PubMed

    Pang, C Y; Swint, J M

    1985-12-01

    Daily and weekly prediction models are developed to help forecast hospital laboratory work load for the entire laboratory and individual sections of the laboratory. The models are tested using historical data obtained from hospital census and laboratory log books of a 90-bed southwestern hospital. The results indicate that the predictor variables account for 50%, 81%, 56%, and 82% of the daily work load variation for chemistry, hematology, and microbiology sections, and for the entire laboratory, respectively. Equivalent results for the weekly model are 53%, 72%, 12%, and 78% for the same respective sections. On the basis of the predicted work load, staffing assessment is made and a productivity monitoring system constructed. The purpose of such a system is to assist laboratory management in efforts to utilize laboratory manpower in a more efficient and cost-effective manner.

  9. Efficient Resources Provisioning Based on Load Forecasting in Cloud

    PubMed Central

    Hu, Rongdong; Jiang, Jingfei; Liu, Guangming; Wang, Lixin

    2014-01-01

    Cloud providers should ensure QoS while maximizing resource utilization. One optimal strategy is to allocate resources in a timely, fine-grained mode according to an application's actual resource demand. The necessary precondition of this strategy is obtaining future load information in advance. We propose a multi-step-ahead load forecasting method, KSwSVR, based on statistical learning theory, which is suitable for the complex and dynamic characteristics of the cloud computing environment. It integrates an improved support vector regression algorithm and a Kalman smoother. Public trace data taken from multiple types of resources were used to verify its prediction accuracy, stability, and adaptability, in comparison with AR, BPNN, and standard SVR. Subsequently, based on the predicted results, a simple and efficient strategy is proposed for resource provisioning. A CPU allocation experiment indicated that it can effectively reduce resource consumption while meeting service-level agreement requirements. PMID:24701160
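    The Kalman-smoothing ingredient can be illustrated with a scalar random-walk filter pass over a noisy load trace (a simplified stand-in: KSwSVR pairs a proper Kalman smoother with an improved SVR, and the noise variances q and r below are assumptions):

```python
import numpy as np

def kalman_smooth(z, q=1e-3, r=1e-1):
    """One-dimensional Kalman filter pass over a noisy load measurement
    trace, using a random-walk state model x_t = x_{t-1} + w (variance q)
    and observation z_t = x_t + v (variance r). Returns the filtered
    series, which can be fed to a regression model such as SVR."""
    x, p = z[0], 1.0
    out = [x]
    for zt in z[1:]:
        p = p + q              # predict: state uncertainty grows
        k = p / (p + r)        # Kalman gain
        x = x + k * (zt - x)   # update with the new measurement
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)
```

    Smoothing the trace before regression reduces the influence of measurement spikes, which is the motivation the paper gives for combining the two techniques.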

  10. Spring thaw predictor & development of real time spring load restrictions.

    DOT National Transportation Integrated Search

    2011-02-01

    This report summarizes the results of a study to develop a correlation between weather forecasts and the : spring thaw in order to reduce the duration of load limits on New Hampshire roadways. The study used a falling : weight deflectometer at 10 sit...

  11. Managing the space-time-load continuum in TMDL planning: a case study for understanding groundwater loads through advanced mapping techniques

    Treesearch

    Phillip Harte; Marcel Belaval; Andrea Traviglia

    2016-01-01

    The lag time between groundwater recharge and discharge in a watershed and the potential groundwater load to streams is an important factor in forecasting responses to future land use practices. We call this concept managing the “space-time-load continuum.” It’s understood that in any given watershed, the response function (the load at any given time) will differ for...

  12. Forecasting Strategies for Predicting Peak Electric Load Days

    NASA Astrophysics Data System (ADS)

    Saxena, Harshit

    Academic institutions spend thousands of dollars every month on their electric power consumption. Some of these institutions follow a demand-charge pricing structure: the amount a customer pays to the utility is decided based on the total energy consumed during the month, with an additional charge based on the highest average power load required by the customer over a moving window of time as decided by the utility. Therefore, it is crucial for these institutions to minimize the time periods where a high amount of electric load is demanded over a short duration of time. In order to reduce the peak loads and have more uniform energy consumption, it is imperative to predict when these peaks occur so that appropriate mitigation strategies can be developed. The research work presented in this thesis was conducted for Rochester Institute of Technology (RIT), where the demand charges are decided based on a 15-minute sliding window spanned over the entire month. This case study makes use of different statistical and machine learning algorithms to develop a forecasting strategy for predicting the peak electric load days of the month. The proposed strategy was tested for a whole year, from May 2015 to April 2016, during which a total of 57 peak days were observed. The model predicted a total of 74 peak days during this period; 40 of these were true positives, hence achieving an accuracy level of 70 percent. The results obtained with the proposed forecasting strategy are promising and demonstrate an annual savings potential of about $80,000 for a single submeter of RIT.
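    The reported 70 percent corresponds to recall (40 of the 57 observed peak days were caught, 40/57 ≈ 0.70), while precision over the 74 flagged days is lower (40/74 ≈ 0.54). A day-level scoring sketch:

```python
def peak_day_scores(predicted_days, actual_days):
    """Precision and recall for a set of flagged peak-load days.

    predicted_days and actual_days are collections of day identifiers
    (e.g. dates); a true positive is a day appearing in both.
    """
    pred, act = set(predicted_days), set(actual_days)
    tp = len(pred & act)
    return tp / len(pred), tp / len(act)
```

    For demand-charge mitigation, recall is usually the metric to maximize: a missed peak day costs the full demand charge, while a false alarm only costs an unnecessary curtailment.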

  13. Reassessing hypoxia forecasts for the Gulf of Mexico.

    PubMed

    Scavia, Donald; Donnelly, Kristina A

    2007-12-01

    Gulf of Mexico hypoxia has received considerable scientific and policy attention because of its potential ecological and economic impacts and implications for agriculture within its massive watershed. A 2000 assessment concluded that increased nitrate load to the Gulf since the 1950s was the primary cause of large-scale hypoxic areas. More recently, models have suggested that large-scale hypoxia did not start until the mid-1970s, and that a 40-45% nitrogen load reduction may be needed to reach the hypoxia area goal of the Hypoxia Action Plan. Recently, USGS revised nutrient load estimates to the Gulf, and the Action Plan reassessment has questioned the role of phosphorus versus nitrogen in controlling hypoxia. In this paper, we re-evaluate model simulations, hindcasts, and forecasts using revised nitrogen loads, and test the ability of a phosphorus-driven version of the model to reproduce hypoxia trends. Our analysis suggests that, if phosphorus is limiting now, it became so because of relative increases in nitrogen loads during the 1970s and 1980s. While our model suggests nitrogen load reductions of 37-45% or phosphorus load reductions of 40-50% below the 1980-1996 average are needed, we caution that a phosphorus-only strategy is potentially dangerous, and suggest it would be prudent to reduce both.

  14. 7 CFR 1710.203 - Requirement to prepare a load forecast-distribution borrowers.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...—distribution borrowers. (a) A distribution borrower that is a member of a power supply borrower with a total... forecast work plan of its power supply borrower. (b) A distribution borrower that is a member of a power supply borrower which is itself a member of another power supply borrower that has a total utility plant...

  15. Wet snow hazard for power lines: a forecast and alert system applied in Italy

    NASA Astrophysics Data System (ADS)

    Bonelli, P.; Lacavalla, M.; Marcacci, P.; Mariani, G.; Stella, G.

    2011-09-01

    Wet snow accretion on power lines is a real problem in Italy, causing failures on high- and medium-voltage power supplies during the cold season. The phenomenon is a process in which many large-scale and local-scale variables contribute in a complex way that is not completely understood. A numerical weather forecast can be used to select areas where wet snow accretion has a high probability of occurring, but a specific accretion model must also be used to estimate the load of an ice sleeve and its hazard. All the information must be carefully selected and shown to the electric grid operator in order to issue prompt warnings. The authors describe a prototype forecast and alert system, WOLF (Wet snow Overload aLert and Forecast), developed and applied in Italy. The prototype elaborates the output of a numerical weather prediction model, such as temperature, precipitation, and wind intensity and direction, to determine the areas of potential risk for the power lines. An accretion model then computes the ice-sleeve load for different conductor diameters. The highest values are selected and displayed on a web GIS application aimed principally at the electric operator, but also at more expert users. Some experimental field campaigns have been conducted to better parameterize the accretion model. Comparisons between real accidents and forecasted icing conditions are presented and discussed.

  16. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.

    Task report detailing low probability tail event analysis and mitigation in the BPA control area. A tail event refers to the situation in a power system in which unfavorable forecast errors in load and wind are superimposed on fast load and wind ramps, or non-wind generators fall short of scheduled output, causing the imbalance between generation and load to become very significant.

  17. Controller for thermostatically controlled loads

    DOEpatents

    Lu, Ning; Zhang, Yu; Du, Pengwei; Makarov, Yuri V.

    2016-06-07

    A system and method of controlling aggregated thermostatically controlled appliances (TCAs) for demand response is disclosed. A targeted load profile is formulated and a forecasted load profile is generated. The TCAs within an "on" or "off" control group are prioritized based on their operating temperatures. The "on" or "off" status of the TCAs is determined. Command signals are sent to turn on or turn off the TCAs.
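
    The prioritization step can be sketched as sorting the "on" TCAs by operating temperature and toggling just enough units to meet the targeted reduction (the class, fleet, and thresholds below are illustrative, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class TCA:
    name: str
    temperature: float  # current operating temperature, deg C
    on: bool

def select_for_turn_off(tcas, kw_to_shed, kw_per_unit):
    """Pick 'on' cooling units to switch off, coolest room first
    (a cool room can coast longest before violating its setpoint band)."""
    candidates = sorted((t for t in tcas if t.on), key=lambda t: t.temperature)
    n_needed = -(-kw_to_shed // kw_per_unit)  # ceiling division
    return candidates[:n_needed]

fleet = [TCA("ac1", 24.5, True), TCA("ac2", 22.0, True),
         TCA("ac3", 26.1, True), TCA("ac4", 23.2, False)]
for unit in select_for_turn_off(fleet, kw_to_shed=7, kw_per_unit=3):
    unit.on = False  # stand-in for sending the "off" command signal
print([t.name for t in fleet if not t.on])
```

    A real controller would re-rank the fleet each control cycle as temperatures drift.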

  18. Nowcasting and Forecasting the Monthly Food Stamps Data in the US Using Online Search Data

    PubMed Central

    Fantazzini, Dean

    2014-01-01

    We propose the use of Google online search data for nowcasting and forecasting the number of food stamp recipients. We perform a large out-of-sample forecasting exercise with almost 3,000 competing models and forecast horizons up to 2 years ahead, and we show that models including Google search data statistically outperform the competing models at all considered horizons. These results also hold under several robustness checks considering alternative keywords, a falsification test, different out-of-sample periods, directional accuracy, and forecasts at the state level. PMID:25369315

  19. Ramping and Uncertainty Prediction Tool - Analysis and Visualization of Wind Generation Impact on Electrical Grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel; Makarov, PNNL Yuri; Subbarao, PNNL Kris

    RUT software is designed for use by the Balancing Authorities to predict and display additional requirements caused by the variability and uncertainty in load and generation. The prediction is made for the next operating hours as well as for the next day. The tool predicts possible deficiencies in generation capability and ramping capability. This deficiency of balancing resources can cause serious risks to power system stability and also impact real-time market energy prices. The tool dynamically and adaptively correlates changing system conditions with the additional balancing needs triggered by the interplay between forecasted and actual load and output of variable resources. The assessment is performed using a specially developed probabilistic algorithm incorporating multiple sources of uncertainty including wind, solar and load forecast errors. The tool evaluates required generation for a worst case scenario, with a user-specified confidence level.

  20. 76 FR 72203 - Voltage Coordination on High Voltage Grids; Notice of Reliability Workshop Agenda

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-22

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. AD12-5-000] Voltage... currently coordinate the dispatch of reactive resources to support forecasted loads, generation and... reactive power needs of the distribution system or loads are coordinated or optimized. Panelists: Khaled...

  1. A SIMPLE MODEL FOR FORECASTING THE EFFECTS OF NITROGEN LOADS ON CHESAPEAKE BAY HYPOXIA

    EPA Science Inventory

    The causes and consequences of oxygen depletion in Chesapeake Bay have been the focus of research, assessment, and policy action over the past several decades. An ongoing scientific re-evaluation of what nutrient load reductions are necessary to meet the water quality goals is ...

  2. KSC-2010-1052

    NASA Image and Video Library

    2010-01-07

    CAPE CANAVERAL, Fla. – At the Astrotech Space Operations facility in Titusville, Fla., spacecraft fueling technicians from Kennedy Space Center prepare to sample the monomethylhydrazine propellant that will be loaded aboard the Solar Dynamics Observatory, or SDO. From left are SDO technician Brian Kittle and ASTROTECH mission/facility manager Gerard Gleeson. The hydrazine fuel is being sampled for purity before it is loaded aboard the spacecraft. The technicians are dressed in self-contained atmospheric protective ensemble suits, or SCAPE suits, as a safety precaution in the unlikely event that any of the highly toxic chemical should escape from the storage tank. The nitrogen tetroxide oxidizer was loaded earlier in the week which is customarily followed by loading of the fuel. Propellant loading is one of the final processing milestones before the spacecraft is encapsulated in its fairing for launch. SDO is the first mission in NASA's Living With a Star Program and is designed to study the causes of solar variability and its impacts on Earth. The spacecraft's long-term measurements will give solar scientists in-depth information to help characterize the interior of the Sun, the Sun's magnetic field, the hot plasma of the solar corona, and the density of radiation that creates the ionosphere of the planets. The information will be used to create better forecasts of space weather needed to protect the aircraft, satellites and astronauts living and working in space. Liftoff aboard an Atlas V rocket is targeted for Feb. 9 from Launch Complex 41 on Cape Canaveral Air Force Station. For information on SDO, visit http://www.nasa.gov/sdo. Photo credit: NASA/Jack Pfaller

  3. KSC-2010-1055

    NASA Image and Video Library

    2010-01-07

    CAPE CANAVERAL, Fla. – At the Astrotech Space Operations facility in Titusville, Fla., Boeing spacecraft fueling technicians from Kennedy Space Center take a sample of the monomethylhydrazine propellant that will be loaded aboard the Solar Dynamics Observatory, or SDO, which is protectively covered. The hydrazine fuel is being sampled for purity before it is loaded aboard the spacecraft. The technicians are dressed in self-contained atmospheric protective ensemble suits, or SCAPE suits, as a safety precaution in the unlikely event that any of the highly toxic chemical should escape from the storage tank. The nitrogen tetroxide oxidizer was loaded earlier in the week which is customarily followed by loading of the fuel. Propellant loading is one of the final processing milestones before the spacecraft is encapsulated in its fairing for launch. SDO is the first mission in NASA's Living With a Star Program and is designed to study the causes of solar variability and its impacts on Earth. The spacecraft's long-term measurements will give solar scientists in-depth information to help characterize the interior of the Sun, the Sun's magnetic field, the hot plasma of the solar corona, and the density of radiation that creates the ionosphere of the planets. The information will be used to create better forecasts of space weather needed to protect the aircraft, satellites and astronauts living and working in space. Liftoff aboard an Atlas V rocket is targeted for Feb. 9 from Launch Complex 41 on Cape Canaveral Air Force Station. For information on SDO, visit http://www.nasa.gov/sdo. Photo credit: NASA/Jack Pfaller

  4. KSC-2010-1053

    NASA Image and Video Library

    2010-01-07

    CAPE CANAVERAL, Fla. – At the Astrotech Space Operations facility in Titusville, Fla., Boeing spacecraft fueling technicians from Kennedy Space Center prepare to sample the monomethylhydrazine propellant that will be loaded aboard the Solar Dynamics Observatory, or SDO, which is protectively covered. The hydrazine fuel is being sampled for purity before it is loaded aboard the spacecraft. The technicians are dressed in self-contained atmospheric protective ensemble suits, or SCAPE suits, as a safety precaution in the unlikely event that any of the highly toxic chemical should escape from the storage tank. The nitrogen tetroxide oxidizer was loaded earlier in the week which is customarily followed by loading of the fuel. Propellant loading is one of the final processing milestones before the spacecraft is encapsulated in its fairing for launch. SDO is the first mission in NASA's Living With a Star Program and is designed to study the causes of solar variability and its impacts on Earth. The spacecraft's long-term measurements will give solar scientists in-depth information to help characterize the interior of the Sun, the Sun's magnetic field, the hot plasma of the solar corona, and the density of radiation that creates the ionosphere of the planets. The information will be used to create better forecasts of space weather needed to protect the aircraft, satellites and astronauts living and working in space. Liftoff aboard an Atlas V rocket is targeted for Feb. 9 from Launch Complex 41 on Cape Canaveral Air Force Station. For information on SDO, visit http://www.nasa.gov/sdo. Photo credit: NASA/Jack Pfaller

  5. KSC-2010-1050

    NASA Image and Video Library

    2010-01-07

    CAPE CANAVERAL, Fla. – At the Astrotech Space Operations facility in Titusville, Fla., spacecraft fueling technicians from Kennedy Space Center prepare to sample the monomethylhydrazine propellant that will be loaded aboard the Solar Dynamics Observatory, or SDO. From left are Boeing technician Steve Lay and ASTROTECH mission/facility manager Gerard Gleeson. The hydrazine fuel is being sampled for purity before it is loaded aboard the spacecraft. The technicians are dressed in self-contained atmospheric protective ensemble suits, or SCAPE suits, as a safety precaution in the unlikely event that any of the highly toxic chemical should escape from the storage tank. The nitrogen tetroxide oxidizer was loaded earlier in the week which is customarily followed by loading of the fuel. Propellant loading is one of the final processing milestones before the spacecraft is encapsulated in its fairing for launch. SDO is the first mission in NASA's Living With a Star Program and is designed to study the causes of solar variability and its impacts on Earth. The spacecraft's long-term measurements will give solar scientists in-depth information to help characterize the interior of the Sun, the Sun's magnetic field, the hot plasma of the solar corona, and the density of radiation that creates the ionosphere of the planets. The information will be used to create better forecasts of space weather needed to protect the aircraft, satellites and astronauts living and working in space. Liftoff aboard an Atlas V rocket is targeted for Feb. 9 from Launch Complex 41 on Cape Canaveral Air Force Station. For information on SDO, visit http://www.nasa.gov/sdo. Photo credit: NASA/Jack Pfaller

  6. KSC-2010-1049

    NASA Image and Video Library

    2010-01-07

    CAPE CANAVERAL, Fla. – At the Astrotech Space Operations facility in Titusville, Fla., spacecraft fueling technicians from Kennedy Space Center prepare to sample the monomethylhydrazine propellant that will be loaded aboard the Solar Dynamics Observatory, or SDO. From left are Boeing technicians Richard Gillman and Steve Lay, and SDO technician Brian Kittle. The hydrazine fuel is being sampled for purity before it is loaded aboard the spacecraft. The technicians are dressed in self-contained atmospheric protective ensemble suits, or SCAPE suits, as a safety precaution in the unlikely event that any of the highly toxic chemical should escape from the storage tank. The nitrogen tetroxide oxidizer was loaded earlier in the week which is customarily followed by loading of the fuel. Propellant loading is one of the final processing milestones before the spacecraft is encapsulated in its fairing for launch. SDO is the first mission in NASA's Living With a Star Program and is designed to study the causes of solar variability and its impacts on Earth. The spacecraft's long-term measurements will give solar scientists in-depth information to help characterize the interior of the Sun, the Sun's magnetic field, the hot plasma of the solar corona, and the density of radiation that creates the ionosphere of the planets. The information will be used to create better forecasts of space weather needed to protect the aircraft, satellites and astronauts living and working in space. Liftoff aboard an Atlas V rocket is targeted for Feb. 9 from Launch Complex 41 on Cape Canaveral Air Force Station. For information on SDO, visit http://www.nasa.gov/sdo. Photo credit: NASA/Jack Pfaller

  7. KSC-2010-1058

    NASA Image and Video Library

    2010-01-07

    CAPE CANAVERAL, Fla. – In the control room at the Astrotech Space Operations facility in Titusville, Fla., test conductors from ASTROTECH and Kennedy Space Center monitor data received from the clean room as technicians sample the monomethylhydrazine propellant that will be loaded aboard the Solar Dynamics Observatory, or SDO. The hydrazine fuel is being sampled for purity before it is loaded aboard the spacecraft. The technicians are dressed in self-contained atmospheric protective ensemble suits, or SCAPE suits, as a safety precaution in the unlikely event that any of the highly toxic chemical should escape from the storage tank. The nitrogen tetroxide oxidizer was loaded earlier in the week which is customarily followed by loading of the fuel. Propellant loading is one of the final processing milestones before the spacecraft is encapsulated in its fairing for launch. SDO is the first mission in NASA's Living With a Star Program and is designed to study the causes of solar variability and its impacts on Earth. The spacecraft's long-term measurements will give solar scientists in-depth information to help characterize the interior of the Sun, the Sun's magnetic field, the hot plasma of the solar corona, and the density of radiation that creates the ionosphere of the planets. The information will be used to create better forecasts of space weather needed to protect the aircraft, satellites and astronauts living and working in space. Liftoff aboard an Atlas V rocket is targeted for Feb. 9 from Launch Complex 41 on Cape Canaveral Air Force Station. For information on SDO, visit http://www.nasa.gov/sdo. Photo credit: NASA/Jack Pfaller

  8. KSC-2010-1057

    NASA Image and Video Library

    2010-01-07

    CAPE CANAVERAL, Fla. – In the control room at the Astrotech Space Operations facility in Titusville, Fla., a team of Kennedy Space Center spacecraft fueling specialists and engineers monitors data received from the clean room as technicians sample the monomethylhydrazine propellant that will be loaded aboard the Solar Dynamics Observatory, or SDO. The hydrazine fuel is being sampled for purity before it is loaded aboard the spacecraft. The technicians are dressed in self-contained atmospheric protective ensemble suits, or SCAPE suits, as a safety precaution in the unlikely event that any of the highly toxic chemical should escape from the storage tank. The nitrogen tetroxide oxidizer was loaded earlier in the week which is customarily followed by loading of the fuel. Propellant loading is one of the final processing milestones before the spacecraft is encapsulated in its fairing for launch. SDO is the first mission in NASA's Living With a Star Program and is designed to study the causes of solar variability and its impacts on Earth. The spacecraft's long-term measurements will give solar scientists in-depth information to help characterize the interior of the Sun, the Sun's magnetic field, the hot plasma of the solar corona, and the density of radiation that creates the ionosphere of the planets. The information will be used to create better forecasts of space weather needed to protect the aircraft, satellites and astronauts living and working in space. Liftoff aboard an Atlas V rocket is targeted for Feb. 9 from Launch Complex 41 on Cape Canaveral Air Force Station. For information on SDO, visit http://www.nasa.gov/sdo. Photo credit: NASA/Jack Pfaller

  9. KSC-2010-1056

    NASA Image and Video Library

    2010-01-07

    CAPE CANAVERAL, Fla. – At the Astrotech Space Operations facility in Titusville, Fla., Boeing spacecraft fueling technicians from Kennedy Space Center take a sample of the monomethylhydrazine propellant that will be loaded aboard the Solar Dynamics Observatory, or SDO, which is protectively covered. The hydrazine fuel is being sampled for purity before it is loaded aboard the spacecraft. The technicians are dressed in self-contained atmospheric protective ensemble suits, or SCAPE suits, as a safety precaution in the unlikely event that any of the highly toxic chemical should escape from the storage tank. The nitrogen tetroxide oxidizer was loaded earlier in the week which is customarily followed by loading of the fuel. Propellant loading is one of the final processing milestones before the spacecraft is encapsulated in its fairing for launch. SDO is the first mission in NASA's Living With a Star Program and is designed to study the causes of solar variability and its impacts on Earth. The spacecraft's long-term measurements will give solar scientists in-depth information to help characterize the interior of the Sun, the Sun's magnetic field, the hot plasma of the solar corona, and the density of radiation that creates the ionosphere of the planets. The information will be used to create better forecasts of space weather needed to protect the aircraft, satellites and astronauts living and working in space. Liftoff aboard an Atlas V rocket is targeted for Feb. 9 from Launch Complex 41 on Cape Canaveral Air Force Station. For information on SDO, visit http://www.nasa.gov/sdo. Photo credit: NASA/Jack Pfaller

  10. KSC-2010-1051

    NASA Image and Video Library

    2010-01-07

    CAPE CANAVERAL, Fla. – At the Astrotech Space Operations facility in Titusville, Fla., spacecraft fueling technicians from Kennedy Space Center prepare to sample the monomethylhydrazine propellant that will be loaded aboard the Solar Dynamics Observatory, or SDO. From left are Boeing technician Steve Lay and ASTROTECH mission/facility manager Gerard Gleeson. The hydrazine fuel is being sampled for purity before it is loaded aboard the spacecraft. The technicians are dressed in self-contained atmospheric protective ensemble suits, or SCAPE suits, as a safety precaution in the unlikely event that any of the highly toxic chemical should escape from the storage tank. The nitrogen tetroxide oxidizer was loaded earlier in the week which is customarily followed by loading of the fuel. Propellant loading is one of the final processing milestones before the spacecraft is encapsulated in its fairing for launch. SDO is the first mission in NASA's Living With a Star Program and is designed to study the causes of solar variability and its impacts on Earth. The spacecraft's long-term measurements will give solar scientists in-depth information to help characterize the interior of the Sun, the Sun's magnetic field, the hot plasma of the solar corona, and the density of radiation that creates the ionosphere of the planets. The information will be used to create better forecasts of space weather needed to protect the aircraft, satellites and astronauts living and working in space. Liftoff aboard an Atlas V rocket is targeted for Feb. 9 from Launch Complex 41 on Cape Canaveral Air Force Station. For information on SDO, visit http://www.nasa.gov/sdo. Photo credit: NASA/Jack Pfaller

  11. A robust method to forecast volcanic ash clouds

    USGS Publications Warehouse

    Denlinger, Roger P.; Pavolonis, Mike; Sieglaff, Justin

    2012-01-01

    Ash clouds emanating from volcanic eruption columns often form trails of ash extending thousands of kilometers through the Earth's atmosphere, disrupting air traffic and posing a significant hazard to air travel. To mitigate such hazards, the community charged with reducing flight risk must accurately assess the risk of ash ingestion for any flight path and provide robust forecasts of volcanic ash dispersal. In response to this need, a number of different transport models have been developed for this purpose and applied to recent eruptions, providing a means to assess uncertainty in forecasts. Here we provide a framework for optimal forecasts and their uncertainties given any model and any observational data. This involves randomly sampling the probability distributions of the input (source) parameters to a transport model and iteratively running the model with different inputs, each time assessing the predictions the model makes about ash dispersal by direct comparison with satellite data. The results of these comparisons are embodied in a likelihood function whose maximum corresponds to the minimum misfit between model output and observations. Bayes' theorem is then used to determine a normalized posterior probability distribution and, from that, a forecast of future uncertainty in ash dispersal. The nature of ash clouds in heterogeneous wind fields creates a strong maximum likelihood estimate in which most of the probability is localized to narrow ranges of model source parameters. This property is used here to accelerate probability assessment, producing a method to rapidly generate a prediction of future ash concentrations and their distribution based upon assimilation of satellite data as well as model and data uncertainties. Applying this method to the recent eruption of Eyjafjallajökull in Iceland, we show that the 3 and 6 h forecasts of ash cloud location probability encompassed the location of observed satellite-determined ash cloud loads, providing an efficient means to assess all of the hazards associated with these ash clouds.
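
    The sample-run-compare loop can be sketched as importance sampling: draw source parameters from their priors, run the model, weight each draw by a Gaussian misfit likelihood against the observations, and read posterior estimates off the weights. The toy transport model, priors, and noise level below are assumptions for illustration, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)

def transport_model(source_strength, decay, x):
    """Toy stand-in for an ash-transport model: column load at distances x."""
    return source_strength * np.exp(-decay * x)

# Synthetic "satellite" observations from hidden true parameters plus noise
x_obs = np.linspace(0.0, 3.0, 20)
obs = transport_model(5.0, 1.2, x_obs) + rng.normal(0.0, 0.2, size=x_obs.size)

# Random samples from assumed prior distributions of the source parameters
strength = rng.uniform(1.0, 10.0, 5000)
decay = rng.uniform(0.5, 2.0, 5000)

# Gaussian likelihood: weight each sampled run by its misfit to the data
sigma = 0.2
preds = transport_model(strength[:, None], decay[:, None], x_obs[None, :])
log_like = -0.5 * np.sum(((preds - obs) / sigma) ** 2, axis=1)
weights = np.exp(log_like - log_like.max())  # subtract max for stability
weights /= weights.sum()

# Normalized posterior estimates of the source parameters
print("posterior strength:", np.sum(weights * strength))
print("posterior decay:", np.sum(weights * decay))
```

    Forecasts at unobserved points follow by propagating the same weights through the model.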

  12. Modeling and Analysis of Commercial Building Electrical Loads for Demand Side Management

    NASA Astrophysics Data System (ADS)

    Berardino, Jonathan

    In recent years there has been a push in the electric power industry for more customer involvement in the electricity markets. Traditionally the end user has played a passive role in the planning and operation of the power grid. However, many energy markets have begun opening up opportunities to consumers who wish to commit a certain amount of their electrical load under various demand side management programs. The potential benefits of more demand participation include reduced operating costs and new revenue opportunities for the consumer, as well as more reliable and secure operations for the utilities. The management of these load resources creates challenges and opportunities for the end user that were not present in previous market structures. This work examines the behavior of commercial-type building electrical loads and their capacity to support demand side management actions. It is motivated by the need for accurate and dynamic tools to aid the advancement of demand side operations. A dynamic load model is proposed for capturing the response of controllable building loads. Building-specific load forecasting techniques are developed, with particular attention paid to the integration of building management system (BMS) information. These approaches are tested using Drexel University building data. The application of building-specific load forecasts and dynamic load modeling to the optimal scheduling of multi-building systems in the energy market is proposed. Sources of potential load uncertainty are introduced into the proposed energy management problem formulation in order to investigate their impact on the resulting load schedule.

  13. Artificial neural network and SARIMA based models for power load forecasting in Turkish electricity market

    PubMed Central

    2017-01-01

    Load information plays an important role in deregulated electricity markets, since it is the primary factor in critical decisions on production planning, day-to-day operations, unit commitment, and economic dispatch. Being able to predict the load for the short term, which covers one hour to a few days ahead, gives power generation facilities and traders an advantage. With the deregulation of electricity markets, a variety of short-term load forecasting models have been developed. Deregulation of the Turkish electricity market started in 2001, and liberalization is still in progress, with rules taking effect on a predefined schedule. However, there are very few studies of the Turkish market. In this study, we introduce two different models for the current Turkish market, using the Seasonal Autoregressive Integrated Moving Average (SARIMA) and an Artificial Neural Network (ANN), and present their comparative performance. Building models that cope with the dynamic nature of the deregulated market and are able to run in real time is the main contribution of this study. We also use our ANN-based model to evaluate the effect of several factors that are claimed to influence electrical load. PMID:28426739

  14. Artificial neural network and SARIMA based models for power load forecasting in Turkish electricity market.

    PubMed

    Bozkurt, Ömer Özgür; Biricik, Göksel; Tayşi, Ziya Cihan

    2017-01-01

    Load information plays an important role in deregulated electricity markets, since it is the primary factor in critical decisions on production planning, day-to-day operations, unit commitment, and economic dispatch. Being able to predict the load for the short term, which covers one hour to a few days ahead, gives power generation facilities and traders an advantage. With the deregulation of electricity markets, a variety of short-term load forecasting models have been developed. Deregulation of the Turkish electricity market started in 2001, and liberalization is still in progress, with rules taking effect on a predefined schedule. However, there are very few studies of the Turkish market. In this study, we introduce two different models for the current Turkish market, using the Seasonal Autoregressive Integrated Moving Average (SARIMA) and an Artificial Neural Network (ANN), and present their comparative performance. Building models that cope with the dynamic nature of the deregulated market and are able to run in real time is the main contribution of this study. We also use our ANN-based model to evaluate the effect of several factors that are claimed to influence electrical load.
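
    The seasonal-autoregressive idea at the core of SARIMA can be illustrated by regressing the load on its lag-1 and seasonal lag-24 values, fit by ordinary least squares (a toy hourly series; the paper's models would include differencing and moving-average terms as well):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy hourly load: daily (24 h) seasonal cycle plus noise
t = np.arange(24 * 60)
load = 100 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 2, t.size)

# Regress load[t] on load[t-1] and the seasonal lag load[t-24]
s = 24
y = load[s:]
X = np.column_stack([np.ones(y.size), load[s - 1:-1], load[:-s]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# One-step-ahead forecast for the next (unseen) hour
next_hour = coef @ np.array([1.0, load[-1], load[-s]])
print(f"forecast: {next_hour:.1f}")
```

    Short-term forecasters of this kind are retrained as new hourly observations arrive.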

  15. Determination of sample size for higher volatile data using new framework of Box-Jenkins model with GARCH: A case study on gold price

    NASA Astrophysics Data System (ADS)

    Roslindar Yaziz, Siti; Zakaria, Roslinazairimah; Hura Ahmad, Maizah

    2017-09-01

    The Box-Jenkins - GARCH model has been shown to be a promising tool for forecasting highly volatile time series. In this study, a framework for determining the optimal sample size using the Box-Jenkins model with GARCH is proposed for practical application in analysing and forecasting highly volatile data. The proposed framework is applied to the daily world gold price series from 1971 to 2013. The data are divided into 12 different sample sizes (from 30 to 10,200 observations). Each sample is tested using different combinations of the hybrid Box-Jenkins - GARCH model. Our study shows that the optimal sample size for forecasting the gold price with this framework is 1,250 observations (a 5-year sample). Hence, the model selection criteria and 1-step-ahead forecasting evaluations suggest that the latest 12.25% (5 years) of the 10,200 observations is sufficient for the Box-Jenkins - GARCH model, with forecasting performance similar to that obtained using the full 41-year series.
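
    The GARCH(1,1) component captures time-varying volatility through a one-line variance recursion; a self-contained sketch with assumed parameter values (a real application would estimate omega, alpha, and beta by maximum likelihood on the chosen sample):

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance from the GARCH(1,1) recursion:
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]."""
    sigma2 = np.empty(returns.size)
    sigma2[0] = returns.var()  # initialize at the sample variance
    for i in range(1, returns.size):
        sigma2[i] = omega + alpha * returns[i - 1] ** 2 + beta * sigma2[i - 1]
    return sigma2

rng = np.random.default_rng(2)
r = rng.normal(0.0, 1.0, 500) * 0.01  # toy daily returns (~1% volatility)
sig2 = garch11_variance(r, omega=1e-6, alpha=0.1, beta=0.85)

# One-step-ahead variance forecast, the quantity used in volatility forecasting
print("forecast variance:", 1e-6 + 0.1 * r[-1] ** 2 + 0.85 * sig2[-1])
```

    Sample-size experiments like those in the paper refit this recursion on windows of different lengths and compare the resulting 1-step-ahead forecasts.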

  16. Integrating Solar Power onto the Electric Grid - Bridging the Gap between Atmospheric Science, Engineering and Economics

    NASA Astrophysics Data System (ADS)

    Ghonima, M. S.; Yang, H.; Zhong, X.; Ozge, B.; Sahu, D. K.; Kim, C. K.; Babacan, O.; Hanna, R.; Kurtz, B.; Mejia, F. A.; Nguyen, A.; Urquhart, B.; Chow, C. W.; Mathiesen, P.; Bosch, J.; Wang, G.

    2015-12-01

    One of the main obstacles to high penetrations of solar power is the variable nature of solar power generation. To mitigate variability, grid operators have to schedule additional reliability resources, at considerable expense, to ensure that load requirements are met by generation. Thus, even as the cost of solar PV decreases, the cost of integrating solar power will increase as the penetration of solar resources on the electric grid increases. There are three principal tools currently available to mitigate variability impacts: (i) flexible generation, (ii) storage, either virtual (demand response) or physical devices, and (iii) solar forecasting. Storage devices are a powerful tool capable of ensuring smooth power output from renewable resources. However, the high cost of storage is prohibitive, and markets are still being designed to leverage their full potential and mitigate their limitations (e.g. empty storage). Solar forecasting provides valuable information on the daily net load profile and upcoming ramps (increasing or decreasing solar power output), thereby giving the grid advance warning to schedule ancillary generation more accurately, or to curtail solar power output. In order to develop solar forecasting as a tool that can be utilized by grid operators, we identified two focus areas: (i) developing solar forecast technology and improving solar forecast accuracy, and (ii) developing forecasts that can be incorporated within existing grid planning and operation infrastructure. The first area requires atmospheric science and engineering research, while the second requires detailed knowledge of energy markets and power engineering. Motivated by this background, we will emphasize area (i) in this talk and provide an overview of recent advancements in solar forecasting, especially in two areas: (a) numerical modeling tools for coastal stratocumulus to improve scheduling in the day-ahead California energy market; (b) development of a sky imager to provide short-term forecasts (0-20 min ahead) to improve optimization and control of equipment on distribution feeders with high penetration of solar. Leveraging such tools, which have seen extensive use in the atmospheric sciences, supports the development of accurate physics-based solar forecast models. Directions for future research are also provided.

  17. Advanced Cloud Forecasting for Solar Energy Production

    NASA Astrophysics Data System (ADS)

    Werth, D. W.; Parker, M. J.

    2017-12-01

    A power utility must decide days in advance how it will allocate projected loads among its various generating sources. If the latter includes solar plants, the utility must predict how much energy the plants will produce - any shortfall will have to be compensated for by purchasing power as it is needed, when it is more expensive. To avoid this, utilities often err on the side of caution and assume that a relatively small amount of solar energy will be available, and allocate correspondingly more load to coal-fired plants. If solar irradiance can be predicted more accurately, utilities can be more confident that the predicted solar energy will indeed be available when needed, and assign solar plants a larger share of the future load. Solar power production is increasing in the Southeast, but is often hampered by irregular cloud fields, especially during high-pressure periods when rapid afternoon thunderstorm development can occur during what was predicted to be a clear day. We are currently developing an analog forecasting system to predict solar irradiance at the surface at the Savannah River Site in South Carolina, with the goal of improving predictions of available solar energy. Analog forecasting is based on the assumption that similar initial conditions will lead to similar outcomes, and involves the use of an algorithm to look through the weather patterns of the past to identify previous conditions (the analogs) similar to those of today. For our application, we select three predictor variables - sea-level pressure, 700mb geopotential, and 700mb humidity. These fields for the current day are compared to those from past days, and a weighted combination of the differences (defined by a cost function) is used to select the five best analog days. The observed solar irradiance values subsequent to the dates of those analogs are then combined to represent the forecast for the next day. 
We will explain how we apply the analog process, and compare it to existing solar forecasts.
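The analog-selection step described above lends itself to a compact sketch. The following is a minimal, hypothetical implementation (the field names, equal weighting, and RMS-difference cost are illustrative assumptions, not the authors' code): rank past days by a weighted cost over the predictor fields, keep the five lowest-cost days, and average the irradiance observed after them.

```python
import numpy as np

def select_analogs(today, history, weights, n_analogs=5):
    """Rank past days by a weighted cost over predictor fields.

    today:   dict of predictor name -> 2D field for the current day
    history: list of dicts with the same predictor fields, one per past day
    weights: dict of predictor name -> weight in the cost function
    """
    costs = []
    for past in history:
        # Weighted sum of RMS differences between today's and past fields
        cost = sum(w * np.sqrt(np.mean((today[k] - past[k]) ** 2))
                   for k, w in weights.items())
        costs.append(cost)
    # Indices of the lowest-cost (most similar) past days
    return np.argsort(costs)[:n_analogs]

def analog_forecast(analog_indices, observed_irradiance):
    # Average the irradiance observed after each analog day
    return float(np.mean([observed_irradiance[i] for i in analog_indices]))
```

In practice the cost weighting would be tuned (the paper mentions a cost function combining sea-level pressure, 700 mb geopotential, and 700 mb humidity differences), and the observed irradiance would come from the day following each analog date.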

  18. Short-Term Distribution System State Forecast Based on Optimal Synchrophasor Sensor Placement and Extreme Learning Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Huaiguang; Zhang, Yingchen

This paper proposes an approach for distribution system state forecasting that aims to provide accurate, high-speed state forecasts with an optimal synchrophasor sensor placement (OSSP) based state estimator and an extreme learning machine (ELM) based forecaster. Specifically, considering sensor installation cost and measurement error, an OSSP algorithm is proposed to reduce the number of synchrophasor sensors while keeping the whole distribution system numerically and topologically observable. A weighted least squares (WLS) based system state estimator is then used to produce the training data for the proposed forecaster. Traditionally, the artificial neural network (ANN) and support vector regression (SVR) are widely used in forecasting because of their nonlinear modeling capabilities. However, the ANN incurs a heavy computational load and the best parameters for SVR are difficult to obtain. In this paper, the ELM, which overcomes these drawbacks, is used to forecast future system states from historical system states. Testing results show the proposed approach to be effective and accurate.
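The ELM's appeal, alluded to above, is that only the output layer is trained: input weights are random and fixed, so fitting reduces to one least-squares solve. A minimal sketch (illustrative, not the paper's implementation):

```python
import numpy as np

class ELM:
    """Single-hidden-layer extreme learning machine.

    Input weights are drawn at random and never trained; only the output
    weights are solved in closed form by least squares, which is what makes
    ELM training fast compared with back-propagation.
    """
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)          # random hidden activations
        # Closed-form output weights by least squares
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta
```

For state forecasting, `X` would hold lagged historical system states (here produced by the WLS estimator) and `y` the next state.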

  19. Personality, Cognitive Style, Motivation, and Aptitude Predict Systematic Trends in Analytic Forecasting Behavior.

    PubMed

    Poore, Joshua C; Forlines, Clifton L; Miller, Sarah M; Regan, John R; Irvine, John M

    2014-12-01

    The decision sciences are increasingly challenged to advance methods for modeling analysts, accounting for both analytic strengths and weaknesses, to improve inferences taken from increasingly large and complex sources of data. We examine whether psychometric measures-personality, cognitive style, motivated cognition-predict analytic performance and whether psychometric measures are competitive with aptitude measures (i.e., SAT scores) as analyst sample selection criteria. A heterogeneous, national sample of 927 participants completed an extensive battery of psychometric measures and aptitude tests and was asked 129 geopolitical forecasting questions over the course of 1 year. Factor analysis reveals four dimensions among psychometric measures; dimensions characterized by differently motivated "top-down" cognitive styles predicted distinctive patterns in aptitude and forecasting behavior. These dimensions were not better predictors of forecasting accuracy than aptitude measures. However, multiple regression and mediation analysis reveals that these dimensions influenced forecasting accuracy primarily through bias in forecasting confidence. We also found that these facets were competitive with aptitude tests as forecast sampling criteria designed to mitigate biases in forecasting confidence while maximizing accuracy. These findings inform the understanding of individual difference dimensions at the intersection of analytic aptitude and demonstrate that they wield predictive power in applied, analytic domains.

  20. Personality, Cognitive Style, Motivation, and Aptitude Predict Systematic Trends in Analytic Forecasting Behavior

    PubMed Central

    Forlines, Clifton L.; Miller, Sarah M.; Regan, John R.; Irvine, John M.

    2014-01-01

    The decision sciences are increasingly challenged to advance methods for modeling analysts, accounting for both analytic strengths and weaknesses, to improve inferences taken from increasingly large and complex sources of data. We examine whether psychometric measures—personality, cognitive style, motivated cognition—predict analytic performance and whether psychometric measures are competitive with aptitude measures (i.e., SAT scores) as analyst sample selection criteria. A heterogeneous, national sample of 927 participants completed an extensive battery of psychometric measures and aptitude tests and was asked 129 geopolitical forecasting questions over the course of 1 year. Factor analysis reveals four dimensions among psychometric measures; dimensions characterized by differently motivated “top-down” cognitive styles predicted distinctive patterns in aptitude and forecasting behavior. These dimensions were not better predictors of forecasting accuracy than aptitude measures. However, multiple regression and mediation analysis reveals that these dimensions influenced forecasting accuracy primarily through bias in forecasting confidence. We also found that these facets were competitive with aptitude tests as forecast sampling criteria designed to mitigate biases in forecasting confidence while maximizing accuracy. These findings inform the understanding of individual difference dimensions at the intersection of analytic aptitude and demonstrate that they wield predictive power in applied, analytic domains. PMID:25983670

  1. [Forecast of costs of ecodependent cancer treatment for the development of management decisions].

    PubMed

    Krasovskiy, V O

    2014-01-01

A methodical approach to the probabilistic forecasting and differentiation of treatment costs for ecodependent cancer cases has been elaborated. The approach is useful in organizing medical aid to cancer patients, in developing management decisions to reduce the occupational load on the population, and in solving problems of compensating the population for economic and social losses caused by industrial plants.

  2. Visualizing and Integrating AFSCN Utilization into a Common Operational Picture

    NASA Astrophysics Data System (ADS)

    Hays, B.; Carlile, A.; Mitchell, T.

The Department of Defense (DoD) and the 50th Space Network Operations Group Studies and Analysis branch (50th SCS/SCXI), located at Schriever AFB, Colorado, face the unique challenge of forecasting the expected near-term and future utilization of the Air Force Satellite Control Network (AFSCN). The forecasting timeframe covers the planned load from the current date to ten years out. The various satellite missions, satellite requirements, orbital regions, and ground architecture dynamics provide the model inputs and constraints used in generating the forecasted load. The AFSCN is the largest network the Air Force uses to control satellites worldwide. Each day, network personnel perform over 500 scheduled events, from satellite maneuvers to critical data downloads. The forecasting objective is to provide leadership with the insights necessary to manage the network today and tomorrow. To meet both current and future needs, SCXI develops AFSCN utilization forecasts to optimize the ground system's coverage and capacity against user satellite requirements. SCXI also performs satellite-program-specific studies to determine network support feasibility. STK and STK Scheduler form the core of the tools used by SCXI. To establish this tool suite, we had to evaluate, evolve, and validate both the COTS products and our own developed code and processes. This began with calibrating the network model to emulate the real-life scheduling environment of the AFSCN. Multiple STK Scheduler optimizing (de-confliction) algorithms, including Multi-Pass, Sequential, Random, and Neural, were evaluated and adjusted to determine their applicability to the model and the accuracy of the prediction. Additionally, the scheduling Figure of Merit (FOM), which permits custom weighting of various parameters, was analyzed and tested to achieve the most accurate real-life results.
With the inherent capabilities of STK and the ability to wrap and automate output, SCXI is now able to visually communicate satellite loads in a manner never seen before in AFSCN management meetings. Scenarios such as regional antenna load stress, satellite missed opportunities, and the overall network "big picture" can be displayed visually in 3D, versus the textual and line-graph methods used for many years. This is the first step towards an integrated space awareness picture with an operational focus. SCXI is working on taking the visual forecast concept further and beginning to fuse multiple sources of data to build a 50 SW Common Operating Picture (COP). The vision is to integrate more effective orbital determination processes, resource outages, current and forecasted satellite mission requirements, and future architectural changes into a real-time visual status to enable quick and responsive decisions. This COP would be used in a Wing Operations Center to provide up-to-the-minute network status on where satellites are, which ground resources are in contact with them, and what resources are down. The ability to quickly absorb and process this data will enhance decision analysis and save valuable time in both day-to-day operations and wartime scenarios.

  3. Adjusting particle-size distributions to account for aggregation in tephra-deposit model forecasts

    USGS Publications Warehouse

    Mastin, Larry G.; Van Eaton, Alexa; Durant, A.J.

    2016-01-01

Volcanic ash transport and dispersion (VATD) models are used to forecast tephra deposition during volcanic eruptions. Model accuracy is limited by the fact that fine ash aggregates (clumps into clusters), thus altering patterns of deposition. In most models this is accounted for by ad hoc changes to model input, representing fine ash as aggregates with density ρagg and a log-normal size distribution with median μagg and standard deviation σagg. Optimal values may vary between eruptions. To test the variance, we used the Ash3d tephra model to simulate four deposits: 18 May 1980 Mount St. Helens; 16–17 September 1992 Crater Peak (Mount Spurr); 17 June 1996 Ruapehu; and 23 March 2009 Mount Redoubt. In 192 simulations, we systematically varied μagg and σagg, holding ρagg constant at 600 kg m−3. We evaluated the fit using three indices that compare modeled versus measured (1) mass load at sample locations; (2) mass load versus distance along the dispersal axis; and (3) isomass area. For all deposits, under these inputs, the best-fit value of μagg ranged narrowly between ~2.3φ and 2.7φ (0.20–0.15 mm), despite large variations in erupted mass (0.25–50 Tg), plume height (8.5–25 km), mass fraction of fine (< 0.063 mm) ash (3–59%), atmospheric temperature, and water content between these eruptions. This close agreement suggests that aggregation may be treated as a discrete process that is insensitive to eruptive style or magnitude. This result offers the potential for a simple, computationally efficient parameterization scheme for use in operational model forecasts. Further research may indicate whether this narrow range also reflects physical constraints on processes in the evolving cloud.
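The parameter sweep described above amounts to choosing the (μagg, σagg) pair whose simulated deposit best matches observations under a fit index. A minimal sketch (illustrative; the log-RMSE index below is a simple stand-in for the paper's three indices, and the data structures are assumptions):

```python
import numpy as np

def best_fit_aggregate_params(observed, model_runs):
    """Pick the (mu_agg, sigma_agg) pair whose simulation best matches data.

    observed:   measured mass loads at the sample locations
    model_runs: dict mapping (mu, sigma) -> simulated mass loads at the same
                locations. The fit index here is the RMSE of log10 mass load,
                a simple stand-in for the paper's three comparison indices.
    """
    observed = np.asarray(observed, dtype=float)
    best, best_err = None, np.inf
    for params, simulated in model_runs.items():
        err = np.sqrt(np.mean((np.log10(simulated) - np.log10(observed)) ** 2))
        if err < best_err:
            best, best_err = params, err
    return best, best_err
```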

  4. Operational wave now- and forecast in the German Bight as a basis for the assessment of wave-induced hydrodynamic loads on coastal dikes

    NASA Astrophysics Data System (ADS)

    Dreier, Norman; Fröhle, Peter

    2017-12-01

    The knowledge of the wave-induced hydrodynamic loads on coastal dikes including their temporal and spatial resolution on the dike in combination with actual water levels is of crucial importance of any risk-based early warning system. As a basis for the assessment of the wave-induced hydrodynamic loads, an operational wave now- and forecast system is set up that consists of i) available field measurements from the federal and local authorities and ii) data from numerical simulation of waves in the German Bight using the SWAN wave model. In this study, results of the hindcast of deep water wave conditions during the winter storm on 5-6 December, 2013 (German name `Xaver') are shown and compared with available measurements. Moreover field measurements of wave run-up from the local authorities at a sea dike on the German North Sea Island of Pellworm are presented and compared against calculated wave run-up using the EurOtop (2016) approach.

  5. David Palchak | NREL

    Science.gov Websites

Electrical load forecasting with artificial neural networks; demand-side management optimization with Matlab. D. Palchak, S. Suryanarayanan, and D. Zimmerle, "An Artificial Neural Network in Short-Term

  6. A comparative study on GM (1,1) and FRMGM (1,1) model in forecasting FBM KLCI

    NASA Astrophysics Data System (ADS)

    Ying, Sah Pei; Zakaria, Syerrina; Mutalib, Sharifah Sakinah Syed Abd

    2017-11-01

The FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBM KLCI) is a group of indexes combined in a standardized way, used to measure the overall Malaysian market over time. Although a composite index can give investors an idea of the stock market, it is volatile and hard to predict accurately, so it is necessary to identify the best model to forecast FBM KLCI. The objective of this study is to determine the more accurate forecasting model between the GM (1,1) model and the Fourier Residual Modification GM (1,1) (FRMGM (1,1)) model for forecasting FBM KLCI. In this study, actual daily closing data of FBM KLCI were collected from January 1, 2016 to March 15, 2016. The GM (1,1) and FRMGM (1,1) models were used to build the grey model and to test the forecasting power of both models. Mean Absolute Percentage Error (MAPE) was used to determine the best model. Forecasts from the FRMGM (1,1) model differ less from the actual values than those from the GM (1,1) model for both in-sample and out-of-sample data, and its MAPE is lower. These results show that the FRMGM (1,1) model is better than the GM (1,1) model for forecasting FBM KLCI.
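The GM (1,1) construction is standard enough to sketch. The following is an illustrative implementation of the grey model and the MAPE criterion (not the authors' code; the Fourier residual modification of FRMGM (1,1) is omitted): accumulate the series, regress each original value on the mean of consecutive accumulated values, then invert the accumulation to recover forecasts on the original scale.

```python
import numpy as np

def gm11_forecast(x0, n_ahead=1):
    """Fit a GM(1,1) grey model to a positive series x0 and forecast ahead."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                           # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                # background (mean) values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x1_hat[0]], np.diff(x1_hat)])
    return x0_hat[len(x0):]                      # out-of-sample forecasts

def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))
```

On a near-exponential series such as an index level over a short window, the one-step GM(1,1) forecast tracks the trend closely; MAPE then ranks competing models as in the study.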

  7. Short-term forecasts gain in accuracy. [Regression technique using ''Box-Jenkins'' analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

Box-Jenkins time-series models offer short-term forecast accuracy comparable to that of large-scale macroeconomic forecasts. Utilities need to forecast peak demand in order to plan their generating, transmitting, and distribution systems. This method differs from conventional models by not assuming specific data patterns, but by fitting available data into a tentative pattern on the basis of autocorrelations. Three types of models (autoregressive, moving average, or mixed autoregressive/moving average) can be used, according to which provides the most appropriate combination of autocorrelations and related derivatives. The major steps in choosing a model are identifying potential models, estimating the parameters of the problem, and running a diagnostic check to see if the model fits the parameters. The Box-Jenkins technique is well suited to seasonal patterns, which makes load-demand forecasts as fine-grained as hourly possible. With accuracy up to two years out, the method will allow electricity price-elasticity forecasting that can be applied to facility planning and rate design. (DCK)
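As a toy illustration of the autoregressive branch of the Box-Jenkins family (not the article's software), an AR(p) model can be fit by ordinary least squares on lagged values and iterated for multi-step forecasts:

```python
import numpy as np

def fit_ar(series, order):
    """Fit an AR(p) model by ordinary least squares on lagged values."""
    y = series[order:]
    X = np.column_stack([series[order - k:-k] for k in range(1, order + 1)])
    X = np.column_stack([np.ones(len(y)), X])    # intercept column
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs                                # [intercept, phi_1, ..., phi_p]

def forecast_ar(series, coeffs, steps):
    """Iterate the fitted recursion to produce multi-step forecasts."""
    history = list(series)
    order = len(coeffs) - 1
    out = []
    for _ in range(steps):
        lags = history[-order:][::-1]            # most recent lag first
        nxt = coeffs[0] + np.dot(coeffs[1:], lags)
        history.append(nxt)
        out.append(nxt)
    return np.array(out)
```

A full Box-Jenkins workflow would add the identification and diagnostic-checking steps named above (inspecting autocorrelations, then validating residuals).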

  8. Forecasting giant, catastrophic slope collapse: lessons from Vajont, Northern Italy

    NASA Astrophysics Data System (ADS)

    Kilburn, Christopher R. J.; Petley, David N.

    2003-08-01

Rapid, giant landslides, or sturzstroms, are among the most powerful natural hazards on Earth. They have minimum volumes of ~10^6–10^7 m^3 and, normally preceded by prolonged intervals of accelerating creep, are produced by catastrophic and deep-seated slope collapse (loads ~1–10 MPa). Conventional analyses attribute rapid collapse to unusual mechanisms, such as the vaporization of ground water during sliding. Here, catastrophic collapse is related to self-accelerating rock fracture, common in crustal rocks at loads of ~1–10 MPa and readily catalysed by circulating fluids. Fracturing produces an abrupt drop in resisting stress. Measured stress drops in crustal rock account for minimum sturzstrom volumes and rapid collapse accelerations. Fracturing also provides a physical basis for quantitatively forecasting catastrophic slope failure.
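A common quantitative route from accelerating creep to a failure-time forecast is inverse-rate extrapolation in the style of Voight's relation: when the inverse creep rate falls roughly linearly with time, its extrapolated zero crossing marks the projected failure time. The sketch below assumes that linear case and is illustrative, not the authors' method:

```python
import numpy as np

def failure_time_inverse_rate(times, rates):
    """Estimate failure time by extrapolating inverse creep rate to zero.

    For many accelerating-creep records, 1/rate decreases roughly linearly
    with time; the fitted line's zero crossing gives the projected failure
    time.
    """
    inv = 1.0 / np.asarray(rates, dtype=float)
    slope, intercept = np.polyfit(times, inv, 1)  # linear fit of 1/rate vs t
    return -intercept / slope                     # time where 1/rate hits zero
```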

  9. Improving volcanic sulfur dioxide cloud dispersal forecasts by progressive assimilation of satellite observations

    NASA Astrophysics Data System (ADS)

    Boichu, Marie; Clarisse, Lieven; Khvorostyanov, Dmitry; Clerbaux, Cathy

    2014-04-01

    Forecasting the dispersal of volcanic clouds during an eruption is of primary importance, especially for ensuring aviation safety. As volcanic emissions are characterized by rapid variations of emission rate and height, the (generally) high level of uncertainty in the emission parameters represents a critical issue that limits the robustness of volcanic cloud dispersal forecasts. An inverse modeling scheme, combining satellite observations of the volcanic cloud with a regional chemistry-transport model, allows reconstructing this source term at high temporal resolution. We demonstrate here how a progressive assimilation of freshly acquired satellite observations, via such an inverse modeling procedure, allows for delivering robust sulfur dioxide (SO2) cloud dispersal forecasts during the eruption. This approach provides a computationally cheap estimate of the expected location and mass loading of volcanic clouds, including the identification of SO2-rich parts.
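Source-term inversion of this kind is often posed as regularized least squares over time-binned emissions. The sketch below is illustrative only: the transfer matrix G (unit-emission contributions computed by the chemistry-transport model) and the Tikhonov regularization are assumptions, not the authors' exact scheme.

```python
import numpy as np

def invert_source_term(G, obs, reg=1e-3):
    """Recover a time-resolved emission source term from satellite columns.

    G[i, j] holds the modeled contribution of a unit emission in time bin j
    to observation i; Tikhonov regularization keeps the ill-posed inversion
    stable, and clipping at zero enforces non-negative emissions.
    """
    A = G.T @ G + reg * np.eye(G.shape[1])
    emissions = np.linalg.solve(A, G.T @ obs)
    return np.clip(emissions, 0.0, None)
```

Re-running this inversion each time fresh satellite observations arrive is the "progressive assimilation" idea: the reconstructed source term then drives the next dispersal forecast.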

  10. Neural network based load and price forecasting and confidence interval estimation in deregulated power markets

    NASA Astrophysics Data System (ADS)

    Zhang, Li

    With the deregulation of the electric power market in New England, an independent system operator (ISO) has been separated from the New England Power Pool (NEPOOL). The ISO provides a regional spot market, with bids on various electricity-related products and services submitted by utilities and independent power producers. A utility can bid on the spot market and buy or sell electricity via bilateral transactions. Good estimation of market clearing prices (MCP) will help utilities and independent power producers determine bidding and transaction strategies with low risks, and this is crucial for utilities to compete in the deregulated environment. MCP prediction, however, is difficult since bidding strategies used by participants are complicated and MCP is a non-stationary process. The main objective of this research is to provide efficient short-term load and MCP forecasting and corresponding confidence interval estimation methodologies. In this research, the complexity of load and MCP with other factors is investigated, and neural networks are used to model the complex relationship between input and output. With improved learning algorithm and on-line update features for load forecasting, a neural network based load forecaster was developed, and has been in daily industry use since summer 1998 with good performance. MCP is volatile because of the complexity of market behaviors. In practice, neural network based MCP predictors usually have a cascaded structure, as several key input factors need to be estimated first. In this research, the uncertainties involved in a cascaded neural network structure for MCP prediction are analyzed, and prediction distribution under the Bayesian framework is developed. A fast algorithm to evaluate the confidence intervals by using the memoryless Quasi-Newton method is also developed. The traditional back-propagation algorithm for neural network learning needs to be improved since MCP is a non-stationary process. 
The extended Kalman filter (EKF) can be used as an integrated adaptive learning and confidence interval estimation algorithm for neural networks, with fast convergence and small confidence intervals. However, EKF learning is computationally expensive because it involves high dimensional matrix manipulations. A modified U-D factorization within the decoupled EKF (DEKF-UD) framework is developed in this research. The computational efficiency and numerical stability are significantly improved.
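As a much simpler, distribution-free alternative to the Bayesian and EKF-based confidence intervals developed in the thesis, one can attach empirical intervals derived from held-out residuals (an illustrative sketch, not the thesis method):

```python
import numpy as np

def residual_interval(y_true, y_pred, coverage=0.9):
    """Empirical prediction-interval half-width from held-out residuals.

    Take the desired quantile of absolute validation residuals and attach it
    symmetrically to new point forecasts: forecast +/- half-width.
    """
    resid = np.abs(np.asarray(y_true, float) - np.asarray(y_pred, float))
    return float(np.quantile(resid, coverage))
```

Usage: after validating a load or MCP forecaster, `half = residual_interval(val_actual, val_forecast)` and the interval for a new forecast `f` is `(f - half, f + half)`. This ignores the input-dependent uncertainty that the cascaded Bayesian analysis captures, which is precisely its limitation.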

  11. FAA Aviation Forecasts Fiscal Years 1988-1999.

    DTIC Science & Technology

    1988-02-01

...in the 48 contiguous States, Hawaii, Puerto Rico, and the U.S. Virgin Islands. Excluded from the data base is activity in Alaska, other U.S... passenger miles increased by 17.2 percent. Traffic in Hawaii, Puerto Rico, and the U.S. Virgin Islands, however, had slower growth with passenger... trip length for Hawaii/Puerto Rico/Virgin Islands is expected to remain constant at 98.0 miles over the forecast period. The average industry load

  12. Nonlinear time series modeling and forecasting the seismic data of the Hindu Kush region

    NASA Astrophysics Data System (ADS)

    Khan, Muhammad Yousaf; Mittnik, Stefan

    2018-01-01

In this study, we extended the application of linear and nonlinear time series models in the field of earthquake seismology and examined the out-of-sample forecast accuracy of linear Autoregressive (AR), Autoregressive Conditional Duration (ACD), Self-Exciting Threshold Autoregressive (SETAR), Threshold Autoregressive (TAR), Logistic Smooth Transition Autoregressive (LSTAR), Additive Autoregressive (AAR), and Artificial Neural Network (ANN) models for seismic data of the Hindu Kush region. We also extended previous studies by using Vector Autoregressive (VAR) and Threshold Vector Autoregressive (TVAR) models and compared their forecasting accuracy with the linear AR model. Unlike previous studies, which typically specify threshold models with an internal threshold variable, we specified these models with external transition variables and compared their out-of-sample forecasting performance with the linear benchmark AR model. The modeling results show that the time series models used in the present study are capable of capturing the dynamic structure present in the seismic data. The point forecast results indicate that the AR model generally outperforms the nonlinear models. However, in some cases, threshold models specified with external threshold variables produce more accurate forecasts, indicating that the specification of threshold time series models is of crucial importance. For raw seismic data, the ACD model does not show improved out-of-sample forecasting performance over the linear AR model. The results indicate that the AR model is the best forecasting device for modeling and forecasting the raw seismic data of the Hindu Kush region.
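A two-regime threshold AR with an external transition variable, the key specification choice discussed above, can be sketched as follows (illustrative; the single-lag regime rule and per-regime least-squares estimator are simplifying assumptions, not the study's exact models):

```python
import numpy as np

def fit_tar(y, z, threshold, order=1):
    """Two-regime threshold AR(order) with an external transition variable z.

    Observations are split by whether the lagged value of z is above or
    below the threshold, and a separate AR model is fit by least squares in
    each regime.
    """
    y = np.asarray(y, dtype=float)
    z = np.asarray(z, dtype=float)
    targets = y[order:]
    lags = np.column_stack([y[order - k:-k] for k in range(1, order + 1)])
    X = np.column_stack([np.ones(len(targets)), lags])
    regime = z[order - 1:-1] > threshold         # regime decided by lagged z
    params = {}
    for r in (False, True):
        mask = regime == r
        params[r] = np.linalg.lstsq(X[mask], targets[mask], rcond=None)[0]
    return params                                # {regime: [intercept, phis]}
```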

  13. Model of medicines sales forecasting taking into account factors of influence

    NASA Astrophysics Data System (ADS)

    Kravets, A. G.; Al-Gunaid, M. A.; Loshmanov, V. I.; Rasulov, S. S.; Lempert, L. B.

    2018-05-01

The article describes a method for forecasting sales of medicines when the available data sample is insufficient for building a model on historical data alone. The developed method applies mainly to new drugs that are already licensed and released for sale but do not yet have stable sales performance in the market. The purpose of this study is to prove the effectiveness of the suggested method for forecasting drug sales, taking into account selected factors of influence revealed during the review of existing solutions and analysis of the specificity of the area under study. Three experiments were performed on samples of different volumes, which showed an improvement in the accuracy of forecasting sales on small samples.

  14. Impact of hindcast length on estimates of seasonal climate predictability.

    PubMed

    Shi, W; Schaller, N; MacLeod, D; Palmer, T N; Weisheimer, A

    2015-03-16

It has recently been argued that single-model seasonal forecast ensembles are overdispersive, implying that the real world is more predictable than indicated by estimates of so-called perfect model predictability, particularly over the North Atlantic. However, such estimates are based on relatively short forecast data sets comprising just 20 years of seasonal predictions. Here we study longer 40 year seasonal forecast data sets from multimodel seasonal forecast ensemble projects and show that sampling uncertainty due to the length of the hindcast periods is large. The skill of forecasting the North Atlantic Oscillation during winter varies within the 40 year data sets, with high levels of skill found for some subperiods. It is demonstrated that while 20 year estimates of seasonal reliability can show evidence of overdispersive behavior, the 40 year estimates are more stable and show no evidence of overdispersion. Instead, the predominant feature on these longer time scales is underdispersion, particularly in the tropics. Key points: predictions can appear overdispersive due to hindcast-length sampling error; longer hindcasts are more robust and underdispersive, especially in the tropics; and twenty hindcasts are an inadequate sample size to assess seasonal forecast skill.
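Over- versus underdispersion is commonly diagnosed with a spread-error ratio. A minimal sketch of that diagnostic (illustrative, not the paper's reliability analysis):

```python
import numpy as np

def spread_error_ratio(ensemble, observations):
    """Ratio of mean ensemble spread to RMS error of the ensemble mean.

    ensemble: array of shape (n_forecasts, n_members)
    A ratio near 1 indicates a well-dispersed ensemble; > 1 suggests
    overdispersion and < 1 underdispersion.
    """
    ensemble = np.asarray(ensemble, dtype=float)
    mean = ensemble.mean(axis=1)
    spread = np.sqrt(np.mean(ensemble.var(axis=1, ddof=1)))
    rmse = np.sqrt(np.mean((mean - observations) ** 2))
    return spread / rmse
```

The paper's point is that this statistic itself carries large sampling uncertainty when `n_forecasts` is only ~20 hindcast years.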

  15. Forecast Modelling via Variations in Binary Image-Encoded Information Exploited by Deep Learning Neural Networks.

    PubMed

    Liu, Da; Xu, Ming; Niu, Dongxiao; Wang, Shoukai; Liang, Sai

    2016-01-01

Traditional forecasting models fit a function approximation from independent variables to dependent variables. However, they usually get into trouble when data are presented in various formats, such as text, voice, and image. This study proposes a novel image-encoded forecasting method in which input and output binary digital two-dimensional (2D) images are transformed from decimal data. Omitting any data analysis or cleansing steps for simplicity, all raw variables were selected and converted to binary digital images as the input of a deep learning model, a convolutional neural network (CNN). Using shared weights, pooling, and multiple-layer back-propagation techniques, the CNN was adopted to locate the nexus among variations in local binary digital images. Owing to computing capability originally developed for binary digital bitmap manipulation, this model has significant potential for forecasting with vast volumes of data. The model was validated on a power load forecasting dataset from the Global Energy Forecasting Competition 2012.
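One plausible reading of the decimal-to-binary-image transformation can be sketched as follows (the row-per-value, fixed-width bit layout is an assumption for illustration; the abstract does not fully specify the encoding):

```python
import numpy as np

def encode_binary_image(values, n_bits=16, scale=1):
    """Encode non-negative decimal values as rows of a binary 2D image.

    Each value becomes one row of n_bits pixels holding its (scaled,
    rounded) integer binary representation, most significant bit first.
    """
    img = np.zeros((len(values), n_bits), dtype=np.uint8)
    for row, v in enumerate(values):
        iv = int(round(v * scale))
        for bit in range(n_bits):
            img[row, n_bits - 1 - bit] = (iv >> bit) & 1
    return img

def decode_binary_image(img, scale=1):
    """Invert the encoding back to decimal values."""
    n_bits = img.shape[1]
    weights = 2 ** np.arange(n_bits - 1, -1, -1)
    return (img @ weights) / scale
```

A CNN would then consume such images directly, with the output image decoded back to a numeric forecast.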

  16. Forecast Modelling via Variations in Binary Image-Encoded Information Exploited by Deep Learning Neural Networks

    PubMed Central

    Xu, Ming; Niu, Dongxiao; Wang, Shoukai; Liang, Sai

    2016-01-01

Traditional forecasting models fit a function approximation from independent variables to dependent variables. However, they usually get into trouble when data are presented in various formats, such as text, voice, and image. This study proposes a novel image-encoded forecasting method in which input and output binary digital two-dimensional (2D) images are transformed from decimal data. Omitting any data analysis or cleansing steps for simplicity, all raw variables were selected and converted to binary digital images as the input of a deep learning model, a convolutional neural network (CNN). Using shared weights, pooling, and multiple-layer back-propagation techniques, the CNN was adopted to locate the nexus among variations in local binary digital images. Owing to computing capability originally developed for binary digital bitmap manipulation, this model has significant potential for forecasting with vast volumes of data. The model was validated on a power load forecasting dataset from the Global Energy Forecasting Competition 2012. PMID:27281032

  17. Study on power grid characteristics in summer based on Linear regression analysis

    NASA Astrophysics Data System (ADS)

    Tang, Jin-hui; Liu, You-fei; Liu, Juan; Liu, Qiang; Liu, Zhuan; Xu, Xi

    2018-05-01

Correlation analysis of power load and temperature is the precondition and foundation for accurate load prediction, and a great deal of research has been devoted to it. This paper constructs a linear correlation model between temperature and power load, and then studies the correlation of fault-maintenance work orders with power load. Detailed data from Jiangxi province during the summer of 2017, including temperature, power load, and fault-maintenance work orders, were used for data analysis and mining. The linear regression models established in this paper will support electricity load growth forecasting, fault-repair work order review, analysis of distribution network operational weaknesses, and related refinement work.
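The linear correlation model between temperature and load reduces to a one-variable least-squares fit plus a correlation coefficient; a minimal sketch (illustrative, not the paper's code):

```python
import numpy as np

def fit_load_temperature(temps, loads):
    """Least-squares line load = a * temperature + b, plus the Pearson
    correlation coefficient between temperature and load."""
    a, b = np.polyfit(temps, loads, 1)     # slope and intercept
    r = np.corrcoef(temps, loads)[0, 1]    # strength of the linear relation
    return a, b, r
```

The same machinery applies to the second regression in the paper, with daily fault-maintenance work-order counts replacing load.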

  18. Quantitative impact of aerosols on numerical weather prediction. Part II: Impacts to IR radiance assimilation

    NASA Astrophysics Data System (ADS)

    Marquis, J. W.; Campbell, J. R.; Oyola, M. I.; Ruston, B. C.; Zhang, J.

    2017-12-01

This is part II of a two-part series examining the impacts of aerosol particles on weather forecasts. In this study, the aerosol indirect effects on weather forecasts are explored by examining the temperature and moisture analyses associated with assimilating dust-contaminated hyperspectral infrared radiances. The dust-induced temperature and moisture biases are quantified for different aerosol vertical distribution and loading scenarios. The overall impacts of dust contamination on temperature and moisture forecasts are quantified over the west coast of Africa with the assistance of aerosol retrievals from AERONET, MPL, and CALIOP. Finally, methods for improving hyperspectral infrared data assimilation in dust-contaminated regions are proposed.

  19. 7 CFR 1710.152 - Primary support documents.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... capital investments required to serve a borrower's planned new loads, improve service reliability and... manager to guide the system toward its financial goals. The forecast submitted in support of a loan...

  20. Forecasting daily patient volumes in the emergency department.

    PubMed

    Jones, Spencer S; Thomas, Alun; Evans, R Scott; Welch, Shari J; Haug, Peter J; Snow, Gregory L

    2008-02-01

    Shifts in the supply of and demand for emergency department (ED) resources make the efficient allocation of ED resources increasingly important. Forecasting is a vital activity that guides decision-making in many areas of economic, industrial, and scientific planning, but has gained little traction in the health care industry. There are few studies that explore the use of forecasting methods to predict patient volumes in the ED. The goals of this study are to explore and evaluate the use of several statistical forecasting methods to predict daily ED patient volumes at three diverse hospital EDs and to compare the accuracy of these methods to the accuracy of a previously proposed forecasting method. Daily patient arrivals at three hospital EDs were collected for the period January 1, 2005, through March 31, 2007. The authors evaluated the use of seasonal autoregressive integrated moving average, time series regression, exponential smoothing, and artificial neural network models to forecast daily patient volumes at each facility. Forecasts were made for horizons ranging from 1 to 30 days in advance. The forecast accuracy achieved by the various forecasting methods was compared to the forecast accuracy achieved when using a benchmark forecasting method already available in the emergency medicine literature. All time series methods considered in this analysis provided improved in-sample model goodness of fit. However, post-sample analysis revealed that time series regression models that augment linear regression models by accounting for serial autocorrelation offered only small improvements in terms of post-sample forecast accuracy, relative to multiple linear regression models, while seasonal autoregressive integrated moving average, exponential smoothing, and artificial neural network forecasting models did not provide consistently accurate forecasts of daily ED volumes. 
This study confirms the widely held belief that daily demand for ED services is characterized by seasonal and weekly patterns. The authors compared several time series forecasting methods to a benchmark multiple linear regression model. The results suggest that the existing methodology proposed in the literature, multiple linear regression based on calendar variables, is a reasonable approach to forecasting daily patient volumes in the ED. However, the authors conclude that regression-based models that incorporate calendar variables, account for site-specific special-day effects, and allow for residual autocorrelation provide a more appropriate, informative, and consistently accurate approach to forecasting daily ED patient volumes.
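The benchmark calendar-variable regression, in its simplest form with only day-of-week indicators, can be sketched as follows (site-specific special-day effects and the residual-autocorrelation terms the authors recommend are omitted for brevity):

```python
import numpy as np

def fit_calendar_model(volumes, weekdays):
    """Regress daily ED volumes on day-of-week indicator variables.

    weekdays holds an integer day-of-week code (0-6) per observation; with
    only these indicators the fit reduces to the mean volume for each day
    of the week.
    """
    volumes = np.asarray(volumes, dtype=float)
    weekdays = np.asarray(weekdays)
    X = np.zeros((len(volumes), 7))
    X[np.arange(len(volumes)), weekdays] = 1.0   # one-hot day-of-week design
    coef, *_ = np.linalg.lstsq(X, volumes, rcond=None)
    return coef                                  # fitted mean per weekday

def predict_calendar_model(coef, weekdays):
    """Forecast volumes for future dates from their day-of-week codes."""
    return coef[np.asarray(weekdays)]
```

Month and holiday indicators extend the design matrix in the same way, which is the structure of the multiple linear regression benchmark discussed above.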

  1. Planning a Target Renewable Portfolio using Atmospheric Modeling and Stochastic Optimization

    NASA Astrophysics Data System (ADS)

    Hart, E.; Jacobson, M. Z.

    2009-12-01

A number of organizations have suggested that an 80% reduction in carbon emissions by 2050 is a necessary step to mitigate climate change and that decarbonization of the electricity sector is a crucial component of any strategy to meet this target. Integration of large renewable and intermittent generators poses many new problems in power system planning. In this study, we attempt to determine an optimal portfolio of renewable resources to best meet the fluctuating California load while also meeting an 80% carbon emissions reduction requirement. A stochastic optimization scheme is proposed that is based on a simplified model of the California electricity grid. In this single-busbar power system model, the load is met with generation from wind, solar thermal, photovoltaic, hydroelectric, geothermal, and natural gas plants. Wind speeds and insolation are calculated using GATOR-GCMOM, a global-through-urban climate-weather-air pollution model. Fields were produced for California and Nevada at 21 km (S-N) by 14 km (W-E) spatial resolution every 15 minutes for the year 2006. Load data for 2006 were obtained from the California ISO OASIS database. Maximum installed capacities for wind and solar thermal generation were determined using a GIS analysis of potential development sites throughout the state. The stochastic optimization scheme requires that power balance be achieved in a number of meteorological and load scenarios that deviate from the forecasted (or modeled) data. By adjusting the error distributions of the forecasts, the model describes how improvements in wind speed and insolation forecasting may affect the optimal renewable portfolio. Using a simple model, we describe the diversity, size, and sensitivities of a renewable portfolio that is best suited to the resources and needs of California and that contributes significantly to reduction of the state’s carbon emissions.

  2. Computation of probabilistic hazard maps and source parameter estimation for volcanic ash transport and dispersion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madankan, R.; Pouget, S.; Singla, P., E-mail: psingla@buffalo.edu

Volcanic ash advisory centers are charged with forecasting the movement of volcanic ash plumes for aviation, health, and safety preparation. Deterministic mathematical equations model the advection and dispersion of these plumes. However, initial plume conditions – height, profile of particle location, volcanic vent parameters – are known only approximately at best, and other features of the governing system, such as the windfield, are stochastic. These uncertainties make forecasting plume motion difficult. As a result, ash advisories based on a deterministic approach tend to be conservative and often over- or underestimate the extent of a plume. This paper presents an end-to-end framework for a probabilistic approach to ash plume forecasting. This framework uses an ensemble of solutions, guided by the Conjugate Unscented Transform (CUT) method for evaluating expectation integrals. This ensemble is used to construct a polynomial chaos expansion that can be sampled cheaply to provide a probabilistic model forecast. The CUT method is then combined with a minimum variance condition to provide a full posterior pdf of the uncertain source parameters, based on observed satellite imagery. The April 2010 eruption of the Eyjafjallajökull volcano in Iceland is employed as a test example. The puff advection/dispersion model is used to hindcast the motion of the ash plume through time, concentrating on the period 14–16 April 2010. Variability in the height and particle loading of that eruption is introduced through a volcano column model called bent. Output uncertainty due to the assumed uncertain input parameter probability distributions, and a probabilistic spatial-temporal estimate of ash presence, are computed.

  3. Scaling of coupled dilatancy-diffusion processes in space and time

    NASA Astrophysics Data System (ADS)

    Main, I. G.; Bell, A. F.; Meredith, P. G.; Brantut, N.; Heap, M.

    2012-04-01

Coupled dilatancy-diffusion processes resulting from microscopically brittle damage due to precursory cracking have been observed in the laboratory and suggested as a mechanism for earthquake precursors. One reason precursors have proven elusive may be the scaling in space: recent geodetic and seismic data place strong limits on the spatial extent of the nucleation zone for recent earthquakes. Another may be the scaling in time: recent laboratory results on axi-symmetric samples show both a systematic decrease in circumferential extensional strain at failure and a delayed and sharper acceleration of acoustic emission event rate as strain rate is decreased. Here we examine the scaling of such processes in time from laboratory to field conditions using brittle creep (constant stress loading) tests to failure, in an attempt to bridge part of the strain rate gap to natural conditions, and discuss the implications for forecasting the failure time. Dilatancy rate is strongly correlated with strain rate, and decreases to zero in the steady-rate creep phase at strain rates around 10^-9 s^-1 for a basalt from Mount Etna. The data are well described by a creep model based on the linear superposition of transient (decelerating) and accelerating micro-crack growth due to stress corrosion. The model produces good fits to the failure time in retrospect using the accelerating acoustic emission event rate, but in prospective tests on synthetic data with the same properties we find failure-time forecasting is subject to systematic epistemic and aleatory uncertainties that degrade predictability. The next stage is to use the technology developed to attempt failure forecasting in real time, using live streamed data and a public web-based portal to quantify the prospective forecast quality under such controlled laboratory conditions.

  4. Evolution of damage during deformation in porous granular materials (Louis Néel Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Main, Ian

    2014-05-01

    'Crackling noise' occurs in a wide variety of systems that respond to external forcing in an intermittent way, leading to sudden bursts of energy release similar to those heard when crunching up a piece of paper or listening to a fire. In mineral magnetism ('Barkhausen') crackling noise occurs due to sudden changes in the size and orientation of microscopic ferromagnetic domains when the external magnetic field is changed. In rock physics sudden changes in internal stress associated with microscopically brittle failure events lead to acoustic emissions that can be recorded on the sample boundary, and used to infer the state of internal damage. Crackling noise is inherently stochastic, but the population of events often exhibits remarkably robust scaling properties, in terms of the source area, duration, energy, and in the waiting time between events. Here I describe how these scaling properties emerge and evolve spontaneously in a fully-dynamic discrete element model of sedimentary rocks subject to uniaxial compression at a constant strain rate. The discrete elements have structural disorder similar to that of a real rock, and this is the only source of heterogeneity. Despite the stationary loading and the lack of any time-dependent weakening processes, the results are all characterized by emergent power law distributions over a broad range of scales, in agreement with experimental observation. As deformation evolves, the scaling exponents change systematically in a way that is similar to the evolution of damage in experiments on real sedimentary rocks. The potential for real-time failure forecasting is examined by using synthetic and real data from laboratory tests and prior to volcanic eruptions. 
The combination of non-linearity and an irreducible stochastic component leads to significant variations in the precision and accuracy of the forecast failure time, leading to a significant proportion of 'false alarms' (forecasts too early) and 'missed events' (forecasts too late), as well as over-optimistic assessments of forecasting power and quality when the failure time is known (the 'benefit of hindsight'). The evolution becomes progressively more complex, and the forecasting power diminishes, in going from ideal synthetics to controlled laboratory tests to open natural systems at larger scales in space and time.

  5. Volcanic Eruption Forecasts From Accelerating Rates of Drumbeat Long-Period Earthquakes

    NASA Astrophysics Data System (ADS)

    Bell, Andrew F.; Naylor, Mark; Hernandez, Stephen; Main, Ian G.; Gaunt, H. Elizabeth; Mothes, Patricia; Ruiz, Mario

    2018-02-01

    Accelerating rates of quasiperiodic "drumbeat" long-period earthquakes (LPs) are commonly reported before eruptions at andesite and dacite volcanoes, and promise insights into the nature of fundamental preeruptive processes and improved eruption forecasts. Here we apply a new Bayesian Markov chain Monte Carlo gamma point process methodology to investigate an exceptionally well-developed sequence of drumbeat LPs preceding a recent large vulcanian explosion at Tungurahua volcano, Ecuador. For more than 24 hr, LP rates increased according to the inverse power law trend predicted by material failure theory, and with a retrospectively forecast failure time that agrees with the eruption onset within error. LPs resulted from repeated activation of a single characteristic source driven by accelerating loading, rather than a distributed failure process, showing that similar precursory trends can emerge from quite different underlying physics. Nevertheless, such sequences have clear potential for improving forecasts of eruptions at Tungurahua and analogous volcanoes.
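The inverse power law trend invoked here underlies the classic failure-forecast method: for an exponent of 1, the inverse event rate declines linearly to zero at the failure time, so extrapolating a straight-line fit to zero yields a forecast. A minimal sketch on synthetic rates (all values illustrative, not Tungurahua data, and a simple least-squares fit stands in for the paper's Bayesian point-process method):

```python
import numpy as np

rng = np.random.default_rng(1)
t_f_true = 100.0  # synthetic "eruption" time

# Event rate accelerating as an inverse power law, rate = k / (t_f - t)
t = np.linspace(0.0, 90.0, 60)
rate = 50.0 / (t_f_true - t) * rng.lognormal(0.0, 0.05, t.size)

# Linearization: for exponent p = 1 the inverse rate falls linearly to zero
# at the failure time, so fit a line to 1/rate and find its root.
slope, intercept = np.polyfit(t, 1.0 / rate, 1)
t_f_hat = -intercept / slope
print(round(float(t_f_hat), 1))
```

The retrospective forecast recovers the failure time to within a few time units here; real sequences carry far larger rate fluctuations, which is what motivates the full Markov chain Monte Carlo treatment of the uncertainty.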

  6. 7 CFR 1710.210 - Waiver of requirements or approval criteria.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... SERVICE, DEPARTMENT OF AGRICULTURE GENERAL AND PRE-LOAN POLICIES AND PROCEDURES COMMON TO ELECTRIC LOANS AND GUARANTEES Load Forecasts § 1710.210 Waiver of requirements or approval criteria. For good cause...

  7. 7 CFR 1710.210 - Waiver of requirements or approval criteria.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... SERVICE, DEPARTMENT OF AGRICULTURE GENERAL AND PRE-LOAN POLICIES AND PROCEDURES COMMON TO ELECTRIC LOANS AND GUARANTEES Load Forecasts § 1710.210 Waiver of requirements or approval criteria. For good cause...

  8. Dynamic linear models to explore time-varying suspended sediment-discharge rating curves

    NASA Astrophysics Data System (ADS)

    Ahn, Kuk-Hyun; Yellen, Brian; Steinschneider, Scott

    2017-06-01

    This study presents a new method to examine long-term dynamics in sediment yield using time-varying sediment-discharge rating curves. Dynamic linear models (DLMs) are introduced as a time series filter that can assess how the relationship between streamflow and sediment concentration or load changes over time in response to a wide variety of natural and anthropogenic watershed disturbances or long-term changes. The filter operates by updating parameter values using a recursive Bayesian design that responds to 1 day-ahead forecast errors while also accounting for observational noise. The estimated time series of rating curve parameters can then be used to diagnose multiscale (daily-decadal) variability in sediment yield after accounting for fluctuations in streamflow. The technique is applied in a case study examining changes in turbidity load, a proxy for sediment load, in the Esopus Creek watershed, part of the New York City drinking water supply system. The results show that turbidity load exhibits a complex array of variability across time scales. The DLM highlights flood event-driven positive hysteresis, where turbidity load remained elevated for months after large flood events, as a major component of dynamic behavior in the rating curve relationship. The DLM also produces more accurate 1 day-ahead loading forecasts compared to other static and time-varying rating curve methods. The results suggest that DLMs provide a useful tool for diagnosing changes in sediment-discharge relationships over time and may help identify variability in sediment concentrations and loads that can be used to inform dynamic water quality management.
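The recursive Bayesian updating that drives a DLM can be sketched as a random-walk Kalman filter over the two rating-curve parameters, updated on one-step-ahead forecast errors. The example below uses synthetic, standardized log-discharge data, with an intercept shift standing in for post-flood hysteresis; the variances and shift size are hypothetical, and this is a sketch of the idea rather than the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic standardized log-discharge; the rating-curve intercept shifts
# upward after a "flood" at t = 100 (a stand-in for positive hysteresis)
n = 200
logq = rng.normal(0.0, 1.0, n)
a_true = np.where(np.arange(n) < 100, 0.5, 1.0)
logc = a_true + 1.5 * logq + rng.normal(0, 0.1, n)

# Random-walk Kalman filter over the rating-curve parameters [a, b]
theta = np.zeros(2)       # current parameter estimate
P = np.eye(2)             # parameter covariance
W = 1e-3 * np.eye(2)      # random-walk evolution variance
v = 0.01                  # observation variance
a_path = []
for x, yobs in zip(logq, logc):
    F = np.array([1.0, x])
    P = P + W                      # predict: parameters drift as a random walk
    e = yobs - F @ theta           # one-step-ahead forecast error
    S = F @ P @ F + v
    K = P @ F / S                  # Kalman gain
    theta = theta + K * e          # update on the forecast error
    P = P - np.outer(K, F @ P)
    a_path.append(theta[0])

# The filtered intercept tracks the post-flood shift
print(round(a_path[50], 2), round(a_path[-1], 2))
```

The filtered intercept moves from near 0.5 to near 1.0 after the simulated flood, which is the kind of time-varying rating-curve behavior the DLM is designed to diagnose.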

  9. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach, with the aim of improving sampling efficiency for multiple metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple metrics performance, parameter uncertainty, and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII based sampling approach over LHS: (1) it is more effective and efficient; for example, the simulation time required to generate 1000 behavioral parameter sets is roughly nine times shorter; (2) the Pareto tradeoffs between metrics are clearly delineated by the solutions from ɛ-NSGAII based sampling, and their Pareto-optimal values are better than those of LHS, indicating better forecasting accuracy of the ɛ-NSGAII parameter sets; (3) the parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in appropriate ranges rather than uniform, in accord with their physical significance, and parameter uncertainties are significantly reduced; (4) the forecasted floods are close to the observations as evaluated by three measures, the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB), and average deviation amplitude (D), and flood forecasting uncertainty is also substantially reduced. This study provides a new sampling approach to improve multiple metrics uncertainty analysis within the GLUE framework, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
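The GLUE framework itself is easy to sketch: sample parameter sets, keep those whose likelihood measure exceeds a behavioral threshold, and summarize forecast uncertainty across the behavioral sets. In the toy example below everything is hypothetical (the one-line "hydrological model", the parameter ranges, and the Nash-Sutcliffe threshold), and plain Monte Carlo sampling stands in for the paper's ɛ-NSGAII and LHS samplers.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy "hydrological model" q = p1 * rain + p2, with synthetic observations
rain = rng.gamma(2.0, 3.0, 100)
obs = 1.8 * rain + 4.0 + rng.normal(0, 2.0, 100)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, a common GLUE likelihood measure."""
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Plain Monte Carlo sampling of the two parameters (hypothetical ranges)
p1 = rng.uniform(0.0, 4.0, 5000)
p2 = rng.uniform(0.0, 10.0, 5000)
scores = np.array([nse(a * rain + b, obs) for a, b in zip(p1, p2)])

# Keep "behavioral" parameter sets above a likelihood threshold
behavioral = scores > 0.8
sims = p1[behavioral, None] * rain + p2[behavioral, None]

# 5-95% forecast uncertainty band across the behavioral simulations
band = np.quantile(sims, [0.05, 0.95], axis=0)
coverage = np.mean((obs >= band[0]) & (obs <= band[1]))
print(int(behavioral.sum()), round(float(coverage), 2))
```

A smarter sampler such as ɛ-NSGAII concentrates draws in the behavioral region, which is exactly the efficiency gain the paper quantifies.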

  10. The transport forecast - an important stage of transport management

    NASA Astrophysics Data System (ADS)

    Dragu, Vasile; Dinu, Oana; Oprea, Cristina; Alina Roman, Eugenia

    2017-10-01

The transport system is a complex system whose operating loads vary with changes in freight and passenger traffic over different time periods. These variations arise from the specific conditions under which socio-economic activities are organized and develop. The causes of load variation fall into three groups: economic, technical, and organizational. Assessing transport demand variability enables proper forecasting and development of the transport system, given that the market price is determined by the equilibrium between supply and demand. Reducing transport demand variability through technical, organizational, administrative, and legislative measures increases the efficiency and effectiveness of transport. The paper presents a new way of assessing future transport needs through dynamic series. Both researchers and practitioners in transport planning can benefit from the research results. This paper analyzes, in an original approach, how a good transport forecast can lead to better transport management, with significant effects on fully meeting transport demand in quality terms. The case study shows how dynamic statistical series can be used to identify the size of future demand addressed to the transport system.

  11. Influenza forecasting with Google Flu Trends.

    PubMed

    Dugas, Andrea Freyer; Jalalpour, Mehdi; Gel, Yulia; Levin, Scott; Torcaso, Fred; Igusa, Takeru; Rothman, Richard E

    2013-01-01

We developed a practical influenza forecast model based on real-time, geographically focused, and easy-to-access data, designed to provide individual medical centers with advance warning of the expected number of influenza cases, thus allowing sufficient time to implement interventions. Second, we evaluated the effects of incorporating a real-time influenza surveillance system, Google Flu Trends, and meteorological and temporal information on forecast accuracy. Forecast models designed to predict one week in advance were developed from weekly counts of confirmed influenza cases over seven seasons (2004-2011) divided into seven training and out-of-sample verification sets. Forecasting procedures using classical Box-Jenkins, generalized linear models (GLM), and generalized linear autoregressive moving average (GARMA) methods were employed to develop the final model and assess the relative contribution of external variables such as Google Flu Trends, meteorological data, and temporal information. A GARMA(3,0) forecast model with a Negative Binomial distribution integrating Google Flu Trends information provided the most accurate influenza case predictions. The model, on average, predicts weekly influenza cases during 7 out-of-sample outbreaks to within 7 cases for 83% of estimates. Google Flu Trends data were the only source of external information to provide statistically significant forecast improvements over the base model, doing so in four of the seven out-of-sample verification sets. Overall, the p-value of adding this external information to the model is 0.0005. The other exogenous variables did not yield a statistically significant improvement in any of the verification sets. Integer-valued autoregression of influenza cases provides a strong base forecast model, which is enhanced by the addition of Google Flu Trends, confirming the predictive capabilities of search-query-based syndromic surveillance.
This accessible and flexible forecast model can be used by individual medical centers to provide advanced warning of future influenza cases.

  12. Hybrid robust predictive optimization method of power system dispatch

    DOEpatents

    Chandra, Ramu Sharat [Niskayuna, NY; Liu, Yan [Ballston Lake, NY; Bose, Sumit [Niskayuna, NY; de Bedout, Juan Manuel [West Glenville, NY

    2011-08-02

A method of power system dispatch control solves power system dispatch problems by integrating a larger variety of generation, load and storage assets, including without limitation, combined heat and power (CHP) units, renewable generation with forecasting, controllable loads, and electric, thermal and water energy storage. The method employs a predictive algorithm to dynamically schedule different assets in order to achieve global optimization and maintain normal system operation.

  13. Simulation of Wind Profile Perturbations for Launch Vehicle Design

    NASA Technical Reports Server (NTRS)

    Adelfang, S. I.

    2004-01-01

Ideally, a statistically representative sample of measured high-resolution wind profiles, with wavelengths as small as tens of meters, is required in design studies to establish aerodynamic load indicator dispersions and vehicle control system capability. At most potential launch sites, high-resolution wind profiles may not exist. Representative samples of Rawinsonde wind profiles to altitudes of 30 km are more likely to be available from the extensive network of measurement sites established for routine sampling in support of weather observing and forecasting activity. Such a sample, large enough to be statistically representative of relatively large wavelength perturbations, would be inadequate for launch vehicle design assessments because the Rawinsonde system accurately measures wind perturbations with wavelengths no smaller than 2000 m (1000 m altitude increment). The Kennedy Space Center (KSC) Jimsphere wind profiles (150/month and seasonal 2 and 3.5-hr pairs) are the only adequate samples of high-resolution profiles (approx. 150 to 300 m effective resolution, but over-sampled at 25 m intervals) that have been used extensively for launch vehicle design assessments. Therefore, a simulation process has been developed for enhancement of measured low-resolution Rawinsonde profiles that would be applicable in preliminary launch vehicle design studies at launch sites other than KSC.

  14. Knowing what to expect, forecasting monthly emergency department visits: A time-series analysis.

    PubMed

    Bergs, Jochen; Heerinckx, Philipe; Verelst, Sandra

    2014-04-01

    To evaluate an automatic forecasting algorithm in order to predict the number of monthly emergency department (ED) visits one year ahead. We collected retrospective data of the number of monthly visiting patients for a 6-year period (2005-2011) from 4 Belgian Hospitals. We used an automated exponential smoothing approach to predict monthly visits during the year 2011 based on the first 5 years of the dataset. Several in- and post-sample forecasting accuracy measures were calculated. The automatic forecasting algorithm was able to predict monthly visits with a mean absolute percentage error ranging from 2.64% to 4.8%, indicating an accurate prediction. The mean absolute scaled error ranged from 0.53 to 0.68 indicating that, on average, the forecast was better compared with in-sample one-step forecast from the naïve method. The applied automated exponential smoothing approach provided useful predictions of the number of monthly visits a year in advance. Copyright © 2013 Elsevier Ltd. All rights reserved.
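The accuracy measures reported in this study are straightforward to reproduce. The sketch below uses synthetic monthly visit counts and a seasonal-naive forecast as a stand-in for the paper's automated exponential smoothing (the trend, seasonality, and noise levels are all hypothetical), then computes MAPE and MASE, the latter scaled by the in-sample one-step naive forecast error.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic monthly ED visits over six years: trend plus annual seasonality
months = np.arange(72)
y = 3000 + 2 * months + 300 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 50, 72)
train, test = y[:60], y[60:]

# Seasonal-naive forecast of the final year: repeat the last observed year
forecast = train[-12:]

# Mean absolute percentage error (MAPE)
mape = np.mean(np.abs((test - forecast) / test)) * 100

# Mean absolute scaled error (MASE): scale by the in-sample one-step naive error
naive_mae = np.mean(np.abs(np.diff(train)))
mase = np.mean(np.abs(test - forecast)) / naive_mae
print(round(float(mape), 2), round(float(mase), 2))
```

A MASE below 1 means the forecast beats the in-sample one-step naive method on average, which is the interpretation the authors use for their 0.53 to 0.68 range.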

  15. Network-Cognizant Voltage Droop Control for Distribution Grids

    DOE PAGES

    Baker, Kyri; Bernstein, Andrey; Dall'Anese, Emiliano; ...

    2017-08-07

Our paper examines distribution systems with a high integration of distributed energy resources (DERs) and addresses the design of local control methods for real-time voltage regulation. Particularly, the paper focuses on proportional control strategies where the active and reactive output-powers of DERs are adjusted in response to (and proportionally to) local changes in voltage levels. The design of the voltage-active power and voltage-reactive power characteristics leverages suitable linear approximation of the AC power-flow equations and is network-cognizant; that is, the coefficients of the controllers embed information on the location of the DERs and forecasted non-controllable loads/injections and, consequently, on the effect of DER power adjustments on the overall voltage profile. We pursued a robust approach to cope with uncertainty in the forecasted non-controllable loads/power injections. Stability of the proposed local controllers is analytically assessed and numerically corroborated.

  16. Network-Cognizant Voltage Droop Control for Distribution Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Kyri; Bernstein, Andrey; Dall'Anese, Emiliano

Our paper examines distribution systems with a high integration of distributed energy resources (DERs) and addresses the design of local control methods for real-time voltage regulation. Particularly, the paper focuses on proportional control strategies where the active and reactive output-powers of DERs are adjusted in response to (and proportionally to) local changes in voltage levels. The design of the voltage-active power and voltage-reactive power characteristics leverages suitable linear approximation of the AC power-flow equations and is network-cognizant; that is, the coefficients of the controllers embed information on the location of the DERs and forecasted non-controllable loads/injections and, consequently, on the effect of DER power adjustments on the overall voltage profile. We pursued a robust approach to cope with uncertainty in the forecasted non-controllable loads/power injections. Stability of the proposed local controllers is analytically assessed and numerically corroborated.

  17. Some Advances in Downscaling Probabilistic Climate Forecasts for Agricultural Decision Support

    NASA Astrophysics Data System (ADS)

    Han, E.; Ines, A.

    2015-12-01

Seasonal climate forecasts, commonly provided in tercile-probabilities format (below-, near- and above-normal), need to be translated into more meaningful information for decision support of practitioners in agriculture. In this paper, we will present two novel approaches to temporally downscale probabilistic seasonal climate forecasts: one non-parametric and one parametric method. First, the non-parametric downscaling approach, called FResampler1, uses the concept of 'conditional block sampling' of weather data to create daily weather realizations of a tercile-based seasonal climate forecast. FResampler1 randomly draws time series of daily weather parameters (e.g., rainfall, maximum and minimum temperature, and solar radiation) from historical records for the season of interest, from years that belong to a certain rainfall tercile category (e.g., below-, near- or above-normal). In this way, FResampler1 preserves the covariance between rainfall and the other weather parameters, conditionally sampling maximum and minimum temperature and solar radiation according to whether the day is wet or dry. The second approach, called predictWTD, is a parametric method based on a conditional stochastic weather generator. The tercile-based seasonal climate forecast is converted into a theoretical forecast cumulative probability curve. The deviate at each percentile is then converted into a rainfall amount, frequency, or intensity to downscale the 'full' distribution of the probabilistic seasonal climate forecast. These seasonal deviates are then disaggregated on a monthly basis and used to constrain the downscaling of forecast realizations at different percentile values of the theoretical forecast curve. In addition to the theoretical basis of the approaches, we will discuss a sensitivity analysis (length of record and sample size).
In addition, their potential applications for managing climate-related risks in agriculture will be shown through case studies based on actual seasonal climate forecasts: rice cropping in the Philippines and maize cropping in India and Kenya.
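The 'conditional block sampling' idea behind FResampler1 can be sketched directly: classify historical years by seasonal rainfall tercile, then draw whole-season daily sequences only from years in the forecast category, so within-season covariance is preserved. All data below are synthetic, and the 30-year record and 90-day season are hypothetical; a real implementation would also carry the other weather parameters along with each sampled block.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic record: 30 years x 90 season-days of daily rainfall (illustrative)
years, days = 30, 90
rain = rng.gamma(0.5, 8.0, (years, days))
totals = rain.sum(axis=1)

# Classify each historical year into a seasonal-rainfall tercile
lo, hi = np.quantile(totals, [1 / 3, 2 / 3])
above_normal_years = np.where(totals > hi)[0]

# Conditional block sampling: draw whole-season daily sequences only from
# above-normal years, preserving day-to-day covariance within the season
n_realizations = 500
picks = rng.choice(above_normal_years, n_realizations)
realizations = rain[picks]          # shape (500, 90) daily realizations

# Every realization total sits in the above-normal range by construction
print(bool(realizations.sum(axis=1).min() > hi))
```

Sampling years in proportion to the forecast tercile probabilities, rather than from a single category, would generalize this to a full probabilistic forecast.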

  18. When Brain Beats Behavior: Neuroforecasting Crowdfunding Outcomes

    PubMed Central

    Yoon, Carolyn

    2017-01-01

    Although traditional economic and psychological theories imply that individual choice best scales to aggregate choice, primary components of choice reflected in neural activity may support even more generalizable forecasts. Crowdfunding represents a significant and growing platform for funding new and unique projects, causes, and products. To test whether neural activity could forecast market-level crowdfunding outcomes weeks later, 30 human subjects (14 female) decided whether to fund proposed projects described on an Internet crowdfunding website while undergoing scanning with functional magnetic resonance imaging. Although activity in both the nucleus accumbens (NAcc) and medial prefrontal cortex predicted individual choices to fund on a trial-to-trial basis in the neuroimaging sample, only NAcc activity generalized to forecast market funding outcomes weeks later on the Internet. Behavioral measures from the neuroimaging sample, however, did not forecast market funding outcomes. This pattern of associations was replicated in a second study. These findings demonstrate that a subset of the neural predictors of individual choice can generalize to forecast market-level crowdfunding outcomes—even better than choice itself. SIGNIFICANCE STATEMENT Forecasting aggregate behavior with individual neural data has proven elusive; even when successful, neural forecasts have not historically supplanted behavioral forecasts. In the current research, we find that neural responses can forecast market-level choice and outperform behavioral measures in a novel Internet crowdfunding context. Targeted as well as model-free analyses convergently indicated that nucleus accumbens activity can support aggregate forecasts. Beyond providing initial evidence for neuropsychological processes implicated in crowdfunding choices, these findings highlight the ability of neural features to forecast aggregate choice, which could inform applications relevant to business and policy. 
PMID:28821681

  19. Forecasting of dissolved oxygen in the Guanting reservoir using an optimized NGBM (1,1) model.

    PubMed

    An, Yan; Zou, Zhihong; Zhao, Yanfei

    2015-03-01

An optimized nonlinear grey Bernoulli model was proposed, using a particle swarm optimization algorithm to solve the parameter optimization problem. In addition, each item in the first-order accumulated generating sequence was set in turn as the initial condition to determine which alternative would yield the highest forecasting accuracy. To test the forecasting performance, the optimized models with different initial conditions were then used to simulate dissolved oxygen concentrations at the Guanting reservoir inlet and outlet (China). The empirical results show that the optimized model can remarkably improve forecasting accuracy, and that particle swarm optimization is a good tool for solving parameter optimization problems. Moreover, an optimized model whose initial condition performs well in in-sample simulation may not perform as well in out-of-sample forecasting. Copyright © 2015. Published by Elsevier B.V.
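The grey-model machinery underlying the NGBM(1,1) is compact. The sketch below implements the classical GM(1,1), the special case with Bernoulli exponent zero, on a hypothetical dissolved-oxygen series (not the Guanting data); the paper additionally optimizes the Bernoulli exponent and the initial condition with particle swarm optimization.

```python
import numpy as np

# Hypothetical dissolved-oxygen series (mg/L); illustrative values only
x0 = np.array([8.1, 8.4, 8.0, 8.6, 8.9, 9.1])
x1 = np.cumsum(x0)                    # first-order accumulated generating sequence
z = 0.5 * (x1[1:] + x1[:-1])          # background values

# Least-squares estimate of the development coefficient a and grey input b
B = np.column_stack([-z, np.ones(z.size)])
a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]

# Time-response function, then inverse accumulation to recover fitted values
k = np.arange(x0.size)
x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)])
print(np.round(x0_hat, 2))
```

The NGBM(1,1) replaces the linear whitening equation with a Bernoulli term, which is what the particle swarm optimizer tunes alongside the choice of initial condition.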

  20. Predicting Academic Library Circulations: A Forecasting Methods Competition.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.; Forys, John W., Jr.

    Based on sample data representing five years of monthly circulation totals from 50 academic libraries in Illinois, Iowa, Michigan, Minnesota, Missouri, and Ohio, a study was conducted to determine the most efficient smoothing forecasting methods for academic libraries. Smoothing forecasting methods were chosen because they have been characterized…

  1. Analysis of rock mass dynamic impact influence on the operation of a powered roof support control system

    NASA Astrophysics Data System (ADS)

Szurgacz, Dawid; Brodny, Jarosław

    2018-01-01

A powered roof support is a machine responsible for protecting an underground excavation against deformation generated by the rock mass. In the case of dynamic impact of the rock mass, the proper level of protection is hard to achieve. Therefore, the units of the roof support and its components are subject to detailed tests aimed at achieving greater reliability, efficiency and efficacy. In the course of such tests, however, it is not always possible to foresee the load values that may occur in actual conditions. The article presents a case of a dynamic load impacting the powered roof support during a high-energy tremor in an underground hard coal mine. The authors discuss a method for selecting powered roof support units appropriate for specific forecasted load conditions. The method takes into account the construction of the support and the mining and geological conditions of the excavation. Moreover, the paper includes tests carried out on the hydraulic legs and yield valves responsible for additional yielding of the support. Real loads impacting the support unit during tremors are analysed. The results indicate that the registered load values were significantly greater than the forecasted values. The analysis of roof support operation during dynamic impact generated by the rock mass (real-life conditions) prompted the authors to develop a set of recommendations for manufacturers and users of powered roof supports. These include, inter alia, the need for innovative solutions for testing hydraulic section systems.

  2. Assessment of the Charging Policy in Energy Efficiency of the Enterprise

    NASA Astrophysics Data System (ADS)

    Shutov, E. A.; Turukina, T. E.; Anisimov, T. S.

    2017-04-01

    The forecasting problem is currently one of the main challenges for energy facilities with a power exceeding 670 kW. Under the rules of the retail electricity market, such customers also pay for deviations of actual energy consumption from the planned value. In compliance with the hierarchical structure of the electricity market, a guaranteeing supplier must respect the interests of distribution and generation companies, which require load leveling. For an industrial enterprise this is possible only within the technological process, through the implementation of energy-efficient processing chains with an adaptive function and a forecasting tool. In these circumstances, the primary objective of forecasting is to reduce energy consumption costs by taking the 24-hour variation of energy prices into account when forming the pumping unit's work schedule. A virtual model of a pumping unit with a variable frequency drive is considered. The forecasting tool and the optimizer are integrated into a typical control circuit. An economic assessment of the optimization method was performed.
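A minimal sketch of the scheduling idea: given a 24-hour price profile, run the pump during the cheapest hours that still meet the required run time. The prices and required hours below are hypothetical, and a real optimizer would also respect hydraulic and process constraints:

```python
def cheapest_hours_schedule(prices, hours_needed):
    """Return the (sorted) hours with the lowest energy prices."""
    ranked = sorted(range(len(prices)), key=lambda h: prices[h])
    return sorted(ranked[:hours_needed])

# Hypothetical hourly energy prices for one day
prices = [30, 28, 25, 24, 26, 35, 50, 60, 55, 48, 45, 44,
          43, 42, 41, 47, 52, 65, 70, 62, 55, 45, 38, 33]
schedule = cheapest_hours_schedule(prices, hours_needed=6)
```

With these prices the pump runs overnight and in the late evening, avoiding the expensive daytime peak.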

  3. Forecasting selenium discharges to the San Francisco Bay-Delta Estuary: ecological effects of a proposed San Luis drain extension

    USGS Publications Warehouse

    Luoma, Samuel N.; Presser, Theresa S.

    2000-01-01

    During the next few years, federal and state agencies may be required to evaluate proposals and discharge permits that could significantly change selenium (Se) inputs to the San Francisco Bay-Delta Estuary (Bay-Delta), particularly in the North Bay (i.e., Suisun Bay and San Pablo Bay). These decisions may include discharge requirements for an extension of the San Luis Drain (SLD) to the estuary to convey subsurface agricultural drainage from the western San Joaquin Valley (SJV), a renewal of an agreement to allow the existing portion of the SLD to convey subsurface agricultural drainage to a tributary of the San Joaquin River (SJR) (coincident with changes in flow patterns of the lower SJR), and refinements to promulgated Se criteria for the protection of aquatic life for the estuary. Understanding the biotransfer of Se is essential to evaluating the fate and impact of proposed changes in Se discharges to the Bay-Delta. However, past monitoring programs have not addressed the specific protocols necessary for an element that bioaccumulates. Confusion about Se threats in the past has stemmed from failure to consider the full complexity of the processes that result in Se toxicity. Past studies show that predators are more at risk from Se contamination than their prey, making it difficult to use traditional methods to predict risk from environmental concentrations alone. In this report, we employ a novel procedure to model the fate of Se under different, potentially realistic load scenarios from the SJV. For each potential load, we progressively forecast the resulting environmental concentrations, speciation, transformation to particulate form, bioaccumulation by invertebrates, trophic transfer to predators, and effects in those predators. Enough is known to establish a first-order understanding of effects should Se be discharged directly into the North Bay via a conveyance such as the SLD.
    Our approach uses 1) existing knowledge concerning the biogeochemical reactions of Se (e.g., speciation, partitioning between dissolved and particulate forms, and bivalve assimilation efficiency) and 2) site-specific data mainly from 1986 to 1996 on clams and bottom-feeding fish and birds. Forecasts of Se loading from oil refineries and agricultural drainage from the SJV enable the calculation of a composite freshwater endmember Se concentration at the head of the estuary and at Carquinez Strait as a foundation for modeling. Our analysis of effects also takes into account the mode of conveyance for agricultural drainage (i.e., the SLD or SJR). The effects of variable flows on a seasonal or monthly basis from the Sacramento River and SJR are also considered. The results of our forecasts for external SJV watershed sources of Se mirror predictions made since 1955 of a worsening salt (and by inference, Se) buildup exacerbated by the arid climate and irrigation for agricultural use. We show that the reservoir of Se in the SJV is sufficient to provide loading at an annual rate of approximately 42,500 pounds (lbs) of Se to a Bay-Delta disposal point for 63 to 304 years at the lower range of our projections, even if influx of Se from the California Coast Ranges could be curtailed. Disposal of wastewaters on an annual basis outside of the SJV may slow the degradation of valley resources, but drainage alone cannot alleviate the salt and Se buildup in the SJV, at least within a century. Our forecasts show the different proportions of Se loading to the Bay-Delta. Oil refinery loads from 1986 to 1992 ranged from 11 to 15 lbs Se per day; with treatment and cleanup, loads decreased to 3 lbs Se per day in 1999. In contrast, SJV agricultural drainage loads could range from 45 to 117 lbs Se per day across a set of reasonable conditions.
Components of this valley-wide load include five source subareas (i.e., Grassland, Westlands, Tulare, Kern, and Northern) based on water and drainage management. Loads vary per subarea mainly because of proximity of the s
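The composite freshwater endmember is, at its core, a flow-weighted mixing calculation. A sketch using the standard conversion load (lb/day) ≈ concentration (mg/L) × flow (cfs) × 5.39; the source loads and flows below are hypothetical, not the report's values:

```python
def composite_se_ug_per_l(loads_lbs_per_day, flows_cfs):
    """Flow-weighted composite Se concentration in ug/L, using
    load (lb/day) = conc (mg/L) * flow (cfs) * 5.39."""
    total_load = sum(loads_lbs_per_day)   # lb/day
    total_flow = sum(flows_cfs)           # cfs
    return total_load / (5.39 * total_flow) * 1000.0

# Hypothetical sources: a refinery (3 lb/day) and drainage (81 lb/day)
# mixing into hypothetical inflows totalling 5,000 cfs
conc = composite_se_ug_per_l([3.0, 81.0], [2000.0, 3000.0])
```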

  4. 7 CFR 1710.208 - RUS criteria for approval of all load forecasts by power supply borrowers and by distribution...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... borrower developed an adequate supporting database and analyzed a reasonable range of relevant assumptions and alternative futures; (d) The borrower adopted methods and procedures in general use by the...

  5. Evaluation of TxDOT'S traffic data collection and load forecasting process

    DOT National Transportation Integrated Search

    2001-01-01

    This study had two primary objectives: (1) compare current Texas Department of Transportation (TxDOT) procedures and protocols with the state-of-the-practice and the needs of its data customers; and (2) develop enhanced traffic collection, archival, ...

  6. Comparison of ensemble post-processing approaches, based on empirical and dynamical error modelisation of rainfall-runoff model forecasts

    NASA Astrophysics Data System (ADS)

    Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.

    2012-04-01

    In the context of a national energy company (EDF: Électricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve human expertise of hydrological forecasts, which is essential to synthesize available information coming from different meteorological and hydrological models and human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where a large amount of human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (called the empirical approach) is based on a statistical modelling of the empirical error of perfect forecasts, using streamflow sub-samples by quantile class and lead time. The second method (called the dynamical approach) is based on streamflow sub-samples by quantile class, streamflow variation, and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches are necessary to ensure a good post-processing of the hydrological ensemble, allowing a good improvement of the reliability, skill and sharpness of ensemble forecasts.
    The comparison of the empirical and dynamical approaches shows the limits of the empirical approach, which is not able to take into account hydrological dynamics and processes, i.e. sample heterogeneity. The same streamflow range corresponds to different processes, such as rising limbs or recessions, where uncertainties are different. The dynamical approach improves the reliability, skill and sharpness of forecasts and globally reduces confidence interval width. When compared in detail, the dynamical approach allows a noticeable reduction of confidence intervals during recessions, where uncertainty is relatively lower, and a slight increase of confidence intervals during rising limbs or snowmelt, where uncertainty is greater. The dynamical approach, validated by forecasters' experience (they considered the empirical approach not discriminative enough), improved forecasters' confidence and the communication of uncertainties. Montanari, A. and Brath, A., (2004). A stochastic approach for assessing the uncertainty of rainfall-runoff simulations. Water Resources Research, 40, W01106, doi:10.1029/2003WR002540. Schaefli, B., Balin Talamba, D. and Musy, A., (2007). Quantifying hydrological modeling errors through a mixture of normal distributions. Journal of Hydrology, 332, 303-315.
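The empirical approach described above can be sketched as drawing errors from the sub-sample matching the forecast's streamflow quantile class; the class edges and error samples here are hypothetical:

```python
import random
from bisect import bisect_left

def dress_forecast(forecast, errors_by_class, class_edges, n_members=100, seed=0):
    """Dress a deterministic forecast with empirical errors drawn from
    the error sample of the forecast's streamflow quantile class."""
    rng = random.Random(seed)
    cls = bisect_left(class_edges, forecast)   # which quantile class?
    return [forecast + rng.choice(errors_by_class[cls]) for _ in range(n_members)]

# Hypothetical error samples (m3/s) for low/medium/high flow classes
errors_by_class = {0: [-2.0, -1.0, 0.0, 1.0],
                   1: [-5.0, 0.0, 5.0],
                   2: [-12.0, 0.0, 15.0]}
ensemble = dress_forecast(50.0, errors_by_class, class_edges=[20.0, 80.0])
```

The dynamical variant would additionally condition the error sample on streamflow variation (rising limb vs. recession) and lead time.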

  7. Supportive Family Environments, Genes That Confer Sensitivity, and Allostatic Load Among Rural African American Emerging Adults: A Prospective Analysis

    PubMed Central

    Brody, Gene H.; Yu, Tianyi; Chen, Yi-fu; Kogan, Steven M.; Evans, Gary W.; Windle, Michael; Gerrard, Meg; Gibbons, Frederick X.; Simons, Ronald L.; Philibert, Robert A.

    2012-01-01

    The purpose of this study was to investigate interactions between exposure to supportive family environments and genetic characteristics, which were hypothesized to forecast variations in allostatic load (AL) in a representative sample of 315 rural African American youths. Data on family environments were gathered when youths were 11–13 years of age, and genetic data were collected when they were 16. Data on AL were obtained at the beginning of emerging adulthood, age 19 years. The data analyses revealed that, as predicted, emerging adults exposed to less supportive family environments across preadolescence manifested higher levels of AL when they carried the short (s) allele at the 5-HTTLPR and an allele of DRD4 with 7 or more repeats. This is an E(family environment) × G(5-HTTLPR status) × G(DRD4 status) interaction. These data suggest that African American youths carrying genes that confer sensitivity who are exposed to less supportive family environments may be at greater risk for adverse physical health consequences that AL presages. PMID:22468688

  8. When Brain Beats Behavior: Neuroforecasting Crowdfunding Outcomes.

    PubMed

    Genevsky, Alexander; Yoon, Carolyn; Knutson, Brian

    2017-09-06

    Although traditional economic and psychological theories imply that individual choice best scales to aggregate choice, primary components of choice reflected in neural activity may support even more generalizable forecasts. Crowdfunding represents a significant and growing platform for funding new and unique projects, causes, and products. To test whether neural activity could forecast market-level crowdfunding outcomes weeks later, 30 human subjects (14 female) decided whether to fund proposed projects described on an Internet crowdfunding website while undergoing scanning with functional magnetic resonance imaging. Although activity in both the nucleus accumbens (NAcc) and medial prefrontal cortex predicted individual choices to fund on a trial-to-trial basis in the neuroimaging sample, only NAcc activity generalized to forecast market funding outcomes weeks later on the Internet. Behavioral measures from the neuroimaging sample, however, did not forecast market funding outcomes. This pattern of associations was replicated in a second study. These findings demonstrate that a subset of the neural predictors of individual choice can generalize to forecast market-level crowdfunding outcomes, even better than choice itself. SIGNIFICANCE STATEMENT Forecasting aggregate behavior with individual neural data has proven elusive; even when successful, neural forecasts have not historically supplanted behavioral forecasts. In the current research, we find that neural responses can forecast market-level choice and outperform behavioral measures in a novel Internet crowdfunding context. Targeted as well as model-free analyses convergently indicated that nucleus accumbens activity can support aggregate forecasts. Beyond providing initial evidence for neuropsychological processes implicated in crowdfunding choices, these findings highlight the ability of neural features to forecast aggregate choice, which could inform applications relevant to business and policy.
Copyright © 2017 Genevsky et al.

  9. The NRL relocatable ocean/acoustic ensemble forecast system

    NASA Astrophysics Data System (ADS)

    Rowley, C.; Martin, P.; Cummings, J.; Jacobs, G.; Coelho, E.; Bishop, C.; Hong, X.; Peggion, G.; Fabre, J.

    2009-04-01

    A globally relocatable regional ocean nowcast/forecast system has been developed to support rapid implementation of new regional forecast domains. The system is in operational use at the Naval Oceanographic Office for a growing number of regional and coastal implementations. The new system is the basis for an ocean acoustic ensemble forecast and adaptive sampling capability. We present an overview of the forecast system and the ocean ensemble and adaptive sampling methods. The forecast system consists of core ocean data analysis and forecast modules, software for domain configuration, surface and boundary condition forcing processing, and job control, and global databases for ocean climatology, bathymetry, tides, and river locations and transports. The analysis component is the Navy Coupled Ocean Data Assimilation (NCODA) system, a 3D multivariate optimum interpolation system that produces simultaneous analyses of temperature, salinity, geopotential, and vector velocity using remotely-sensed SST, SSH, and sea ice concentration, plus in situ observations of temperature, salinity, and currents from ships, buoys, XBTs, CTDs, profiling floats, and autonomous gliders. The forecast component is the Navy Coastal Ocean Model (NCOM). The system supports one-way nesting and multiple assimilation methods. The ensemble system uses the ensemble transform technique with error variance estimates from the NCODA analysis to represent initial condition error. Perturbed surface forcing or an atmospheric ensemble is used to represent errors in surface forcing. The ensemble transform Kalman filter is used to assess the impact of adaptive observations on future analysis and forecast uncertainty for both ocean and acoustic properties.

  10. Ensemble Bayesian forecasting system Part I: Theory and algorithms

    NASA Astrophysics Data System (ADS)

    Herr, Henry D.; Krzysztofowicz, Roman

    2015-05-01

    The ensemble Bayesian forecasting system (EBFS), whose theory was published in 2001, is developed for the purpose of quantifying the total uncertainty about a discrete-time, continuous-state, non-stationary stochastic process such as a time series of stages, discharges, or volumes at a river gauge. The EBFS is built of three components: an input ensemble forecaster (IEF), which simulates the uncertainty associated with random inputs; a deterministic hydrologic model (of any complexity), which simulates physical processes within a river basin; and a hydrologic uncertainty processor (HUP), which simulates the hydrologic uncertainty (an aggregate of all uncertainties except input). It works as a Monte Carlo simulator: an ensemble of time series of inputs (e.g., precipitation amounts) generated by the IEF is transformed deterministically through a hydrologic model into an ensemble of time series of outputs, which is next transformed stochastically by the HUP into an ensemble of time series of predictands (e.g., river stages). Previous research indicated that in order to attain an acceptable sampling error, the ensemble size must be on the order of hundreds (for probabilistic river stage forecasts and probabilistic flood forecasts) or even thousands (for probabilistic stage transition forecasts). The computing time needed to run the hydrologic model this many times renders the straightforward simulations operationally infeasible. This motivates the development of the ensemble Bayesian forecasting system with randomization (EBFSR), which takes full advantage of the analytic meta-Gaussian HUP and generates multiple ensemble members after each run of the hydrologic model; this auxiliary randomization reduces the required size of the meteorological input ensemble and makes it operationally feasible to generate a Bayesian ensemble forecast of large size. 
    Such a forecast quantifies the total uncertainty, is well calibrated against the prior (climatic) distribution of the predictand, possesses a Bayesian coherence property, constitutes a random sample of the predictand, and has an acceptable sampling error, which makes it suitable for rational decision making under uncertainty.
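The Monte Carlo structure of the EBFS, with the auxiliary randomization that expands the ensemble after each hydrologic model run, can be sketched with toy components; the rainfall-runoff relation and noise model below are hypothetical stand-ins for a real hydrologic model and the meta-Gaussian HUP:

```python
import random

def hydrologic_model(precip):
    """Toy deterministic rainfall-runoff relation (hypothetical)."""
    return 0.6 * precip + 2.0   # runoff, arbitrary units

def hydrologic_uncertainty_processor(output, rng, n=10):
    """Dress one model output with noise, mimicking the auxiliary
    randomization that generates multiple members per model run."""
    return [output * rng.uniform(0.9, 1.1) for _ in range(n)]

rng = random.Random(42)
input_ensemble = [5.0, 8.0, 12.0]        # precipitation members from the IEF
predictand_ensemble = []
for precip in input_ensemble:
    out = hydrologic_model(precip)       # one model run per input member
    predictand_ensemble.extend(hydrologic_uncertainty_processor(out, rng))
```

Here three model runs yield thirty predictand members, illustrating how the randomization reduces the required size of the meteorological input ensemble.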

  11. Forecasts of health care utilization related to pandemic A(H1N1)2009 influenza in the Nord-Pas-de-Calais region, France.

    PubMed

    Giovannelli, J; Loury, P; Lainé, M; Spaccaferri, G; Hubert, B; Chaud, P

    2015-05-01

    To describe and evaluate the forecasts of the load that pandemic A(H1N1)2009 influenza would have on the general practitioner (GP) and hospital care systems, especially during its peak, in the Nord-Pas-de-Calais (NPDC) region, France. Modelling study. The epidemic curve was modelled using an assumption of normal distribution of cases. The values for the forecast parameters were estimated from a literature review of observed data from the Southern hemisphere and French Overseas Territories, where the pandemic had already occurred. Two scenarios were considered, one realistic, the other pessimistic, enabling the authors to evaluate the 'reasonable worst case'. Forecasts were then assessed by comparing them with observed data in the NPDC region, which has 4 million inhabitants. The realistic scenario's forecasts estimated 300,000 cases, 1500 hospitalizations and 225 intensive care unit (ICU) admissions for the pandemic wave; 115 hospital beds and 45 ICU beds would be required per day during the peak. The pessimistic scenario's forecasts were 2-3 times higher than the realistic scenario's forecasts. Observed data were: 235,000 cases, 1585 hospitalizations, 58 ICU admissions, and a maximum of 11.6 ICU beds per day. The realistic scenario correctly estimated the temporal distribution of GP and hospitalized cases but overestimated the number of cases admitted to ICU. Obtaining more robust data for parameter estimation, particularly the rate of ICU admission in the population, which the authors recommend using, may provide better forecasts. Copyright © 2015 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
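The normal-distribution assumption for the epidemic curve can be sketched as follows; the wave length, peak timing and spread are illustrative choices, echoing the realistic scenario's totals rather than the study's actual parameters:

```python
import math

def gaussian_epidemic_curve(total_cases, peak_day, sd_days, n_days):
    """Spread total cases over days with a normal density centred on the peak."""
    curve = []
    for d in range(n_days):
        density = (math.exp(-0.5 * ((d - peak_day) / sd_days) ** 2)
                   / (sd_days * math.sqrt(2 * math.pi)))
        curve.append(total_cases * density)
    return curve

# Hypothetical parameters: 300,000 cases over a 12-week wave
daily_cases = gaussian_epidemic_curve(300_000, peak_day=42, sd_days=14, n_days=84)
hosp_rate = 1500 / 300_000            # hospitalizations per case, from the scenario
peak_admissions = max(daily_cases) * hosp_rate
```

Multiplying the daily case curve by per-case rates (hospitalization, ICU admission) and average lengths of stay gives the daily bed requirements the study forecast.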

  12. Combining forecast weights: Why and how?

    NASA Astrophysics Data System (ADS)

    Yin, Yip Chee; Kok-Haur, Ng; Hock-Eam, Lim

    2012-09-01

    This paper proposes a procedure called forecast weight averaging, which is a specific combination of forecast weights obtained from different methods of constructing forecast weights, for the purpose of improving the accuracy of pseudo out-of-sample forecasting. It is found that under certain specified conditions, forecast weight averaging can lower the mean squared forecast error obtained from model averaging. In addition, we show that in a linear and homoskedastic environment, this superior predictive ability of forecast weight averaging holds true irrespective of whether the coefficients are tested by the t statistic or the z statistic, provided the significance level is within the 10% range. By theoretical proofs and simulation study, we have shown that model averaging methods, such as variance model averaging, simple model averaging and standard error model averaging, each produce a mean squared forecast error larger than that of forecast weight averaging. Finally, this result also holds true, marginally, when applied to business and economic empirical data sets: the Gross Domestic Product (GDP) growth rate, Consumer Price Index (CPI) and Average Lending Rate (ALR) of Malaysia.
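A minimal sketch of the core idea, averaging the weight vectors produced by several weighting methods before combining the individual model forecasts; the weight vectors and forecasts below are hypothetical:

```python
def forecast_weight_averaging(weight_sets, model_forecasts):
    """Average the weight vectors from several weighting methods, then
    combine the model forecasts with the averaged weights."""
    k = len(model_forecasts)
    avg_weights = [sum(w[i] for w in weight_sets) / len(weight_sets)
                   for i in range(k)]
    return sum(w * f for w, f in zip(avg_weights, model_forecasts))

# Hypothetical weights from three weighting methods over two models
weight_sets = [[0.7, 0.3], [0.5, 0.5], [0.6, 0.4]]
combined = forecast_weight_averaging(weight_sets, model_forecasts=[2.0, 3.0])
```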

  13. Loss of Load Probability Calculation for West Java Power System with Nuclear Power Plant Scenario

    NASA Astrophysics Data System (ADS)

    Azizah, I. D.; Abdullah, A. G.; Purnama, W.; Nandiyanto, A. B. D.; Shafii, M. A.

    2017-03-01

    The Loss of Load Probability (LOLP) index indicates the quality and performance of an electrical system. The LOLP value is affected by load growth, the load duration curve, the forced outage rate of the plants, and the number and capacity of generating units. This reliability index calculation begins with load forecasting to 2018 using a multiple regression method. Scenario 1, with a composition of conventional plants, produces the largest LOLP in 2017, amounting to 71.609 days/year, while the best reliability index is generated in scenario 2 with the NPP, amounting to 6.941 days/year in 2015. Improving system reliability with nuclear power is more efficient than with conventional plants because the NPP also has advantages such as emission-free operation, inexpensive fuel costs, and a high level of plant availability.
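The LOLP-style calculation can be sketched via a capacity outage probability table, enumerating unit availability states and summing, for each day, the probability that available capacity falls short of the peak load. The two-unit system and constant peak load below are hypothetical, far simpler than the West Java system:

```python
from itertools import product

def lole_days_per_year(units, daily_peak_loads):
    """Loss-of-load expectation (days/year) from unit (capacity, FOR) pairs."""
    # Enumerate all on/off combinations of the units
    states = []
    for outcome in product([0, 1], repeat=len(units)):   # 1 = unit available
        prob, cap = 1.0, 0.0
        for (c, for_rate), up in zip(units, outcome):
            prob *= (1 - for_rate) if up else for_rate
            cap += c if up else 0.0
        states.append((cap, prob))
    # For each day, add the probability that capacity < peak load
    return sum(sum(p for cap, p in states if cap < load)
               for load in daily_peak_loads)

# Hypothetical system: two 100 MW units, each with a 5% forced outage rate
units = [(100.0, 0.05), (100.0, 0.05)]
lole = lole_days_per_year(units, daily_peak_loads=[150.0] * 365)
```

With a 150 MW peak every day, losing either unit causes a shortfall, so the daily risk is 0.0975 and the annual expectation about 35.6 days/year; real studies use the forecast load duration curve instead of a flat peak.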

  14. Application of global weather and climate model output to the design and operation of wind-energy systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curry, Judith

    This project addressed the challenge of providing weather and climate information to support the operation, management and planning for wind-energy systems. The need for forecast information is extending to longer projection windows with increasing penetration of wind power into the grid and also with diminishing reserve margins to meet peak loads during significant weather events. Maintenance planning and natural gas trading is being influenced increasingly by anticipation of wind generation on timescales of weeks to months. Future scenarios on decadal time scales are needed to support assessment of wind farm siting, government planning, long-term wind purchase agreements and the regulatory environment. The challenge of making wind forecasts on these longer time scales is associated with a wide range of uncertainties in general circulation and regional climate models that make them unsuitable for direct use in the design and planning of wind-energy systems. To address this challenge, CFAN has developed a hybrid statistical/dynamical forecasting scheme for delivering probabilistic forecasts on time scales from one day to seven months using what is arguably the best forecasting system in the world (European Centre for Medium-Range Weather Forecasts, ECMWF). The project also provided a framework to assess future wind power through developing scenarios of interannual to decadal climate variability and change. The Phase II research has successfully developed an operational wind power forecasting system for the U.S., which is being extended to Europe and possibly Asia.

  15. Quasi-most unstable modes: a window to 'À la carte' ensemble diversity?

    NASA Astrophysics Data System (ADS)

    Homar Santaner, Victor; Stensrud, David J.

    2010-05-01

    The atmospheric scientific community is nowadays facing the ambitious challenge of providing useful forecasts of atmospheric events that produce high societal impact. The low level of social resilience to false alarms creates tremendous pressure on forecasting offices to issue accurate, timely and reliable warnings. Currently, no operational numerical forecasting system is able to respond to the societal demand for high-resolution (in time and space) predictions in the 12-72h time span. The main reasons for such deficiencies are the lack of adequate observations and the high non-linearity of the numerical models that are currently used. The whole weather forecasting problem is intrinsically probabilistic and current methods aim at coping with the various sources of uncertainties and the error propagation throughout the forecasting system. This probabilistic perspective is often created by generating ensembles of deterministic predictions that are aimed at sampling the most important sources of uncertainty in the forecasting system. The ensemble generation/sampling strategy is a crucial aspect of their performance and various methods have been proposed. Although global forecasting offices have been using ensembles of perturbed initial conditions for medium-range operational forecasts since 1994, no consensus exists regarding the optimum sampling strategy for high resolution short-range ensemble forecasts. Bred vectors, however, have been hypothesized to better capture the growing modes in the highly nonlinear mesoscale dynamics of severe episodes than singular vectors or observation perturbations. Yet even this technique is not able to produce enough diversity in the ensembles to accurately and routinely predict extreme phenomena such as severe weather. Thus, we propose a new method to generate ensembles of initial conditions perturbations that is based on the breeding technique.
Given a standard bred mode, a set of customized perturbations is derived with specified amplitudes and horizontal scales. This allows the ensemble to excite growing modes across a wider range of scales. Results show that this approach produces significantly more spread in the ensemble prediction than standard bred modes alone. Several examples that illustrate the benefits from this approach for severe weather forecasts will be provided.
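The amplitude customization of a bred mode can be sketched as rescaling the perturbation to a specified norm (the scale-dependent filtering that sets horizontal scale is omitted here); the vector and target amplitude are hypothetical:

```python
import math

def rescale_bred_mode(bred, amplitude):
    """Rescale a bred-vector perturbation to a specified L2 amplitude,
    one of the customizations applied to the standard bred mode."""
    norm = math.sqrt(sum(x * x for x in bred))
    return [x * amplitude / norm for x in bred]

# Hypothetical 3-component bred perturbation, rescaled to unit amplitude
pert = rescale_bred_mode([0.3, -0.4, 0.0], amplitude=1.0)
```

A set of such perturbations with different amplitudes (and, in the full method, different horizontal scales) then perturbs the analysis to excite growing modes across a wider range of scales.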

  16. A temporal-spatial postprocessing model for probabilistic run-off forecast. With a case study from Ulla-Førre with five catchments and ten lead times

    NASA Astrophysics Data System (ADS)

    Engeland, K.; Steinsland, I.

    2012-04-01

    This work is driven by the needs of next-generation short-term optimization methodology for hydro power production. Stochastic optimization is about to be introduced, i.e. optimizing when available resources (water) and utility (prices) are uncertain. In this paper we focus on the available resources, i.e. water, where uncertainty mainly comes from uncertainty in future runoff. When optimizing a water system, all catchments and several lead times have to be considered simultaneously. Depending on the system of hydropower reservoirs, it might be a set of headwater catchments, a system of upstream/downstream reservoirs where water used from one catchment/dam arrives in a lower catchment maybe days later, or a combination of both. The aim of this paper is therefore to construct a simultaneous probabilistic forecast for several catchments and lead times, i.e. to provide a predictive distribution for the forecasts. Stochastic optimization methods need samples/ensembles of run-off forecasts as input. Hence, it should also be possible to sample from our probabilistic forecast. A post-processing approach is taken, and an error model based on a Box-Cox power transformation and a temporal-spatial copula model is used. It accounts for both between-catchment and between-lead-time dependencies. In operational use it is straightforward to sample run-off ensembles from this model, which inherits the catchment and lead time dependencies. The methodology is tested and demonstrated in the Ulla-Førre river system, and simultaneous probabilistic forecasts for five catchments and ten lead times are constructed. The methodology has enough flexibility to model operationally important features in this case study, such as heteroscedasticity, lead-time-varying temporal dependency and lead-time-varying inter-catchment dependency. Our model is evaluated using CRPS for the marginal predictive distributions and the energy score for the joint predictive distribution.
    It is tested against a deterministic run-off forecast, a climatology forecast and a persistent forecast, and is found to be the better probabilistic forecast for lead times greater than two. From an operational point of view the results are interesting, as the between-catchment dependency gets stronger with longer lead times.
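The Box-Cox transformation at the heart of the error model can be sketched as follows; the flow values and the lambda parameter are hypothetical:

```python
import math

def box_cox(y, lam):
    """Box-Cox power transform; reduces to log for lambda == 0."""
    return math.log(y) if lam == 0 else (y ** lam - 1.0) / lam

def box_cox_inverse(z, lam):
    """Back-transform from the Box-Cox domain to flow units."""
    return math.exp(z) if lam == 0 else (lam * z + 1.0) ** (1.0 / lam)

# Hypothetical run-off values (m3/s): transform, model errors, back-transform
flows = [10.0, 25.0, 60.0, 140.0]
transformed = [box_cox(q, lam=0.2) for q in flows]
recovered = [box_cox_inverse(z, lam=0.2) for z in transformed]
```

The Gaussian copula error model is fitted in the transformed domain, where residuals are closer to normal, and sampled ensembles are mapped back through the inverse transform.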

  17. Acceleration to failure in geophysical signals prior to laboratory rock failure and volcanic eruptions (Invited)

    NASA Astrophysics Data System (ADS)

    Main, I. G.; Bell, A. F.; Greenhough, J.; Heap, M. J.; Meredith, P. G.

    2010-12-01

    The nucleation processes that ultimately lead to earthquakes, volcanic eruptions, rock bursts in mines, and landslides from cliff slopes are likely to be controlled at some scale by brittle failure of the Earth’s crust. In laboratory brittle deformation experiments geophysical signals commonly exhibit an accelerating trend prior to dynamic failure. Similar signals have been observed prior to volcanic eruptions, including volcano-tectonic earthquake event and moment release rates. Despite a large amount of effort in the search, no such statistically robust systematic trend is found prior to natural earthquakes. Here we describe the results of a suite of laboratory tests on Mount Etna Basalt and other rocks to examine the nature of the non-linear scaling from laboratory to field conditions, notably using laboratory ‘creep’ tests to reduce the boundary strain rate to conditions more similar to those in the field. Seismic event rate, seismic moment release rate and rate of porosity change show a classic ‘bathtub’ graph that can be derived from a simple damage model based on separate transient and accelerating sub-critical crack growth mechanisms, resulting from separate processes of negative and positive feedback in the population dynamics. The signals exhibit clear precursors based on formal statistical model tests using maximum likelihood techniques with Poisson errors. After correcting for the finite loading time of the signal, the results show a transient creep rate that decays as a classic Omori law for earthquake aftershocks, and remarkably with an exponent near unity, as commonly observed for natural earthquake sequences. The accelerating trend follows an inverse power law when fitted in retrospect, i.e. with prior knowledge of the failure time. In contrast the strain measured on the sample boundary shows a less obvious but still accelerating signal that is often absent altogether in natural strain data prior to volcanic eruptions. 
To test the forecasting power of such constitutive rules in prospective mode, we examine the forecast quality of several synthetic trials, by adding representative statistical fluctuations, due to finite real-time sampling effects, to an underlying accelerating trend. Metrics of forecast quality change systematically and dramatically with time. In particular the model accuracy increases, and the forecast bias decreases, as the failure time approaches.
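For an inverse power-law acceleration with exponent near one, the inverse event rate decays linearly to zero at the failure time, which suggests a simple retrospective estimator; the synthetic rates below assume an exact inverse law with a hypothetical failure time:

```python
def estimate_failure_time(times, rates):
    """For rate ~ 1/(tf - t), the inverse rate decays linearly to zero;
    extrapolate a least-squares line to find tf."""
    inv = [1.0 / r for r in rates]
    n = len(times)
    mt = sum(times) / n
    mi = sum(inv) / n
    slope = (sum((t - mt) * (i - mi) for t, i in zip(times, inv))
             / sum((t - mt) ** 2 for t in times))
    intercept = mi - slope * mt
    return -intercept / slope          # time where the inverse rate hits zero

# Synthetic accelerating event rate with true failure time tf = 100
times = [0, 20, 40, 60, 80]
rates = [1.0 / (100 - t) for t in times]
tf_hat = estimate_failure_time(times, rates)
```

In prospective mode, the statistical fluctuations of finite real-time sampling make such estimates far noisier, which is exactly what the synthetic trials in the abstract probe.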

  18. Forecasting carbon dioxide emissions based on a hybrid of mixed data sampling regression model and back propagation neural network in the USA.

    PubMed

    Zhao, Xin; Han, Meng; Ding, Lili; Calin, Adrian Cantemir

    2018-01-01

    The accurate forecast of carbon dioxide emissions is critical for policy makers to take proper measures to establish a low carbon society. This paper discusses a hybrid of the mixed data sampling (MIDAS) regression model and BP (back propagation) neural network (MIDAS-BP model) to forecast carbon dioxide emissions. Such analysis uses mixed frequency data to study the effects of quarterly economic growth on annual carbon dioxide emissions. The forecasting ability of MIDAS-BP is remarkably better than MIDAS, ordinary least square (OLS), polynomial distributed lags (PDL), autoregressive distributed lags (ADL), and auto-regressive moving average (ARMA) models. The MIDAS-BP model is suitable for forecasting carbon dioxide emissions for both the short and longer term. This research is expected to influence the methodology for forecasting carbon dioxide emissions by improving the forecast accuracy. Empirical results show that economic growth has both negative and positive effects on carbon dioxide emissions that last 15 quarters. Carbon dioxide emissions are also affected by their own change within 3 years. Therefore, there is a need for policy makers to explore an alternative way to develop the economy, especially applying new energy policies to establish a low carbon society.
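The mixed-frequency aggregation step of a MIDAS regression is commonly parameterized with exponential Almon lag weights; a sketch with hypothetical quarterly growth figures and weight parameters (the real model estimates the parameters jointly with the regression, and then feeds the result to the BP network):

```python
import math

def exp_almon_weights(n_lags, theta1, theta2):
    """Exponential Almon lag weights used in MIDAS regressions to
    aggregate high-frequency lags into a single regressor."""
    raw = [math.exp(theta1 * k + theta2 * k * k) for k in range(n_lags)]
    total = sum(raw)
    return [w / total for w in raw]

def midas_aggregate(quarterly_growth, theta1=-0.1, theta2=-0.01):
    """Weight quarterly GDP growth lags into one annual-frequency regressor."""
    w = exp_almon_weights(len(quarterly_growth), theta1, theta2)
    return sum(wi * xi for wi, xi in zip(w, quarterly_growth))

x = midas_aggregate([2.1, 1.8, 2.4, 2.0])   # hypothetical quarterly growth, %
```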

  19. Magnetogram Forecast: An All-Clear Space Weather Forecasting System

    NASA Technical Reports Server (NTRS)

    Barghouty, Nasser; Falconer, David

    2015-01-01

    Solar flares and coronal mass ejections (CMEs) are the drivers of severe space weather. Forecasting the probability of their occurrence is critical in improving space weather forecasts. The National Oceanic and Atmospheric Administration (NOAA) currently uses the McIntosh active region category system, in which each active region on the disk is assigned to one of 60 categories, and uses the historical flare rates of that category to make an initial forecast that can then be adjusted by the NOAA forecaster. Flares and CMEs are caused by the sudden release of energy from the coronal magnetic field by magnetic reconnection. It is believed that the rate of flare and CME occurrence in an active region is correlated with the free energy of an active region. While the free energy cannot be measured directly with present observations, proxies of the free energy can instead be used to characterize the relative free energy of an active region. The Magnetogram Forecast (MAG4) (output is available at the Community Coordinated Modeling Center) was conceived and designed to be a data-based, all-clear forecasting system to support the operational goals of NASA's Space Radiation Analysis Group. The MAG4 system automatically downloads near-real-time line-of-sight magnetograms from the Helioseismic and Magnetic Imager (HMI) on the Solar Dynamics Observatory (SDO) satellite, identifies active regions on the solar disk, measures a free-energy proxy, and then applies forecasting curves to convert the free-energy proxy into predicted event rates for X-class flares, M- and X-class flares, CMEs, fast CMEs, and solar energetic particle events (SPEs). The forecast curves themselves are derived from a sample of 40,000 magnetograms from 1,300 active region samples, observed by the Solar and Heliospheric Observatory Michelson Doppler Imager. Figure 1 is an example of MAG4 visual output.
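    The conversion from a free-energy proxy to predicted event rates can be illustrated with a toy forecasting curve. The power-law form and its coefficients are assumptions for illustration, not the curves MAG4 fits from its magnetogram sample:

```python
import math

def event_rate(proxy, a=-12.0, b=1.1):
    # Hypothetical forecasting curve: expected events per day as a
    # power law of the free-energy proxy (coefficients are illustrative)
    return math.exp(a) * proxy ** b

def all_clear_probability(proxy, hours=24.0):
    # Probability of NO event in the window, assuming Poisson occurrence;
    # an "all-clear" forecast would require this to exceed some threshold
    lam = event_rate(proxy) * hours / 24.0
    return math.exp(-lam)

for proxy in (1e3, 1e4, 1e5):
    print(proxy, round(all_clear_probability(proxy), 3))
```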

  20. Final research findings on traffic-load forecasting using weigh-in-motion data

    DOT National Transportation Integrated Search

    1998-09-01

    The overall objective of Project 7-987 was to develop a long-range pavement rehabilitation plan for a segment of US 59, a four-lane divided principal arterial highway in TxDOT's Lufkin District. To identify feasible pavement : structures, test sectio...

  1. 7 CFR 1710.300 - General.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the forecast, including the methodology used to project loads, rates, revenue, power costs, operating expenses, plant additions, and other factors having a material effect on the balance sheet and on financial... regional office will consult with the Power Supply Division in the case of generation projects for...

  2. Research on Operation Strategy for Bundled Wind-thermal Generation Power Systems Based on Two-Stage Optimization Model

    NASA Astrophysics Data System (ADS)

    Sun, Congcong; Wang, Zhijie; Liu, Sanming; Jiang, Xiuchen; Sheng, Gehao; Liu, Tianyu

    2017-05-01

    Wind power has the advantages of being clean and non-polluting, and the development of bundled wind-thermal generation power systems (BWTGSs) is one of the important means to improve the wind power accommodation rate and implement the “clean alternative” on the generation side. A two-stage optimization strategy for BWTGSs considering wind speed forecasting results and load characteristics is proposed. By taking the short-term wind speed forecasting results on the generation side and the load characteristics on the demand side into account, a two-stage optimization model for BWTGSs is formulated. Using the environmental benefit index of BWTGSs as the objective function, with supply-demand balance and generator operation as the constraints, the first-stage optimization model is developed with chance-constrained programming theory. Using the operation cost of BWTGSs as the objective function, the second-stage optimization model is developed with a greedy algorithm. An improved PSO algorithm is employed to solve the model, and numerical tests verify the effectiveness of the proposed strategy.

  3. Integrating Solar PV in Utility System Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, A.; Botterud, A.; Wu, J.

    2013-10-31

    This study develops a systematic framework for estimating the increase in operating costs due to uncertainty and variability in renewable resources, uses the framework to quantify the integration costs associated with sub-hourly solar power variability and uncertainty, and shows how changes in system operations may affect these costs. Toward this end, we present a statistical method for estimating the required balancing reserves to maintain system reliability along with a model for commitment and dispatch of the portfolio of thermal and renewable resources at different stages of system operations. We estimate the costs of sub-hourly solar variability, short-term forecast errors, and day-ahead (DA) forecast errors as the difference in production costs between a case with “realistic” PV (i.e., sub-hourly solar variability and uncertainty are fully included in the modeling) and a case with “well behaved” PV (i.e., PV is assumed to have no sub-hourly variability and can be perfectly forecasted). In addition, we highlight current practices that allow utilities to compensate for the issues encountered at the sub-hourly time frame with increased levels of PV penetration. In this analysis we use the analytical framework to simulate utility operations with increasing deployment of PV in a case study of Arizona Public Service Company (APS), a utility in the southwestern United States. In our analysis, we focus on three processes that are important in understanding the management of PV variability and uncertainty in power system operations. First, we represent the decisions made the day before the operating day through a DA commitment model that relies on imperfect DA forecasts of load and wind as well as PV generation. Second, we represent the decisions made by schedulers in the operating day through hour-ahead (HA) scheduling. 
Peaking units can be committed or decommitted in the HA schedules and online units can be redispatched using forecasts that are improved relative to DA forecasts, but still imperfect. Finally, we represent decisions within the operating hour by schedulers and transmission system operators as real-time (RT) balancing. We simulate the DA and HA scheduling processes with a detailed unit-commitment (UC) and economic dispatch (ED) optimization model. This model creates a least-cost dispatch and commitment plan for the conventional generating units using forecasts and reserve requirements as inputs. We consider only the generation units and load of the utility in this analysis; we do not consider opportunities to trade power with neighboring utilities. We also do not consider provision of reserves from renewables or from demand-side options. We estimate dynamic reserve requirements in order to meet reliability requirements in the RT operations, considering the uncertainty and variability in load, solar PV, and wind resources. Balancing reserve requirements are based on the 2.5th and 97.5th percentile of 1-min deviations from the HA schedule in a previous year. We then simulate RT deployment of balancing reserves using a separate minute-by-minute simulation of deviations from the HA schedules in the operating year. In the simulations we assume that balancing reserves can be fully deployed in 10 min. The minute-by-minute deviations account for HA forecasting errors and the actual variability of the load, wind, and solar generation. Using these minute-by-minute deviations and deployment of balancing reserves, we evaluate the impact of PV on system reliability through the calculation of the standard reliability metric called Control Performance Standard 2 (CPS2). Broadly speaking, the CPS2 score measures the percentage of 10-min periods in which a balancing area is able to balance supply and demand within a specific threshold. 
Compliance with the North American Electric Reliability Corporation (NERC) reliability standards requires that the CPS2 score exceed 90% (i.e., the balancing area must maintain adequate balance for 90% of the 10-min periods). The combination of representing DA forecast errors in the DA commitments, using 1-min PV data to simulate RT balancing, and estimating reliability performance through the CPS2 metric, all factors that are important to operating systems with increasing amounts of PV, makes this study unique in its scope.
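    The reserve-requirement and CPS2 calculations described above can be sketched roughly as follows, using synthetic 1-min deviations in place of the utility data. The 25 MW bound is an illustrative placeholder for the balancing area's L10 limit:

```python
import numpy as np

rng = np.random.default_rng(1)
# One month of 1-min deviations from the HA schedule (MW), synthetic
dev = rng.normal(0.0, 20.0, 60 * 24 * 30)

# Reserve requirement from the 2.5th and 97.5th percentiles of deviations
down_req, up_req = np.percentile(dev, [2.5, 97.5])
print(f"reserves: {down_req:.1f} MW down, {up_req:.1f} MW up")

# CPS2: share of 10-min periods whose mean imbalance stays within a bound L10
ace_10min = dev.reshape(-1, 10).mean(axis=1)   # 10-min average imbalance
L10 = 25.0                                     # illustrative threshold (MW)
cps2 = 100.0 * np.mean(np.abs(ace_10min) <= L10)
print(f"CPS2 = {cps2:.1f}%  (compliance requires > 90%)")
```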

  4. Bayesian Processor of Output for Probabilistic Quantitative Precipitation Forecasting

    NASA Astrophysics Data System (ADS)

    Krzysztofowicz, R.; Maranzano, C. J.

    2006-05-01

    The Bayesian Processor of Output (BPO) is a new, theoretically-based technique for probabilistic forecasting of weather variates. It processes output from a numerical weather prediction (NWP) model and optimally fuses it with climatic data in order to quantify uncertainty about a predictand. The BPO is being tested by producing Probabilistic Quantitative Precipitation Forecasts (PQPFs) for a set of climatically diverse stations in the contiguous U.S. For each station, the PQPFs are produced for the same 6-h, 12-h, and 24-h periods up to 84-h ahead for which operational forecasts are produced by the AVN-MOS (Model Output Statistics technique applied to output fields from the Global Spectral Model run under the code name AVN). The inputs into the BPO are estimated as follows. The prior distribution is estimated from a (relatively long) climatic sample of the predictand; this sample is retrieved from the archives of the National Climatic Data Center. The family of the likelihood functions is estimated from a (relatively short) joint sample of the predictor vector and the predictand; this sample is retrieved from the same archive that the Meteorological Development Laboratory of the National Weather Service utilized to develop the AVN-MOS system. This talk gives a tutorial introduction to the principles and procedures behind the BPO, and highlights some results from the testing: a numerical example of the estimation of the BPO, and a comparative verification of the BPO forecasts and the MOS forecasts. It concludes with a list of demonstrated attributes of the BPO (vis-à-vis the MOS): more parsimonious definitions of predictors, more efficient extraction of predictive information, better representation of the distribution function of the predictand, and equal or better performance (in terms of calibration and informativeness).
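    The fusion of a climatic prior with NWP output can be illustrated with a conjugate normal-normal toy model. The real BPO handles non-normal precipitation variates via transformations, so this is only a schematic of the Bayesian step:

```python
def bayes_fuse(prior_mean, prior_var, nwp_output, like_var):
    # Conjugate normal-normal fusion: climatic prior + NWP-model output.
    # A toy stand-in for the BPO, which works with transformed,
    # non-normal precipitation variates rather than raw normals.
    w = prior_var / (prior_var + like_var)        # weight on the NWP output
    post_mean = (1 - w) * prior_mean + w * nwp_output
    post_var = prior_var * like_var / (prior_var + like_var)
    return post_mean, post_var

# Climatology says 5 mm with variance 16; the NWP output says 12 mm
m, v = bayes_fuse(prior_mean=5.0, prior_var=16.0, nwp_output=12.0, like_var=4.0)
print(round(m, 2), round(v, 2))
```

The sharper the model (smaller likelihood variance), the more weight the NWP output receives relative to climatology.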

  5. Forecasting outbreaks of the Douglas-fir tussock moth from lower crown cocoon samples.

    Treesearch

    Richard R. Mason; Donald W. Scott; H. Gene Paul

    1993-01-01

    A predictive technique using a simple linear regression was developed to forecast the midcrown density of small tussock moth larvae from estimates of cocoon density in the previous generation. The regression estimator was derived from field samples of cocoons and larvae taken from a wide range of nonoutbreak tussock moth populations. The accuracy of the predictions was...
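    The regression estimator described above amounts to a simple least-squares line. The cocoon and larval densities below are made-up numbers standing in for the field samples:

```python
import numpy as np

# Hypothetical field samples: cocoon density (previous generation) vs
# midcrown density of small larvae (numbers are illustrative only)
cocoons = np.array([0.2, 0.5, 1.1, 1.8, 2.5, 3.1, 4.0])
larvae  = np.array([1.0, 2.2, 4.9, 7.8, 10.4, 13.2, 16.5])

slope, intercept = np.polyfit(cocoons, larvae, 1)
forecast = intercept + slope * 2.0   # predicted larval density at 2.0 cocoons
print(f"larvae ~ {intercept:.2f} + {slope:.2f} * cocoons; forecast = {forecast:.1f}")
```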

  6. The net benefits of human-ignited wildfire forecasting: the case of Tribal land units in the United States

    PubMed Central

    Prestemon, Jeffrey P.; Butry, David T.; Thomas, Douglas S.

    2017-01-01

    Research shows that some categories of human-ignited wildfires might be forecastable, due to their temporal clustering, with the possibility that resources could be pre-deployed to help reduce the incidence of such wildfires. We estimated several kinds of incendiary and other human-ignited wildfire forecast models at the weekly time step for tribal land units in the United States, evaluating their forecast skill out of sample. Analyses show that an Autoregressive Conditional Poisson (ACP) model of both incendiary and non-incendiary human-ignited wildfires is more accurate out of sample compared to alternatives, and the simplest of the ACP models performed the best. Additionally, an ensemble of these and simpler, less analytically intensive approaches performed even better. Wildfire hotspot forecast models using all model types were evaluated in a simulation mode to assess the net benefits of forecasts in the context of law enforcement resource reallocations. Our analyses show that such hotspot tools could yield large positive net benefits for the tribes in terms of suppression expenditures averted for incendiary wildfires but that the hotspot tools were less likely to be beneficial for addressing outbreaks of non-incendiary human-ignited wildfires. PMID:28769549

  7. The net benefits of human-ignited wildfire forecasting: the case of Tribal land units in the United States.

    PubMed

    Prestemon, Jeffrey P; Butry, David T; Thomas, Douglas S

    2016-01-01

    Research shows that some categories of human-ignited wildfires might be forecastable, due to their temporal clustering, with the possibility that resources could be pre-deployed to help reduce the incidence of such wildfires. We estimated several kinds of incendiary and other human-ignited wildfire forecast models at the weekly time step for tribal land units in the United States, evaluating their forecast skill out of sample. Analyses show that an Autoregressive Conditional Poisson (ACP) model of both incendiary and non-incendiary human-ignited wildfires is more accurate out of sample compared to alternatives, and the simplest of the ACP models performed the best. Additionally, an ensemble of these and simpler, less analytically intensive approaches performed even better. Wildfire hotspot forecast models using all model types were evaluated in a simulation mode to assess the net benefits of forecasts in the context of law enforcement resource reallocations. Our analyses show that such hotspot tools could yield large positive net benefits for the tribes in terms of suppression expenditures averted for incendiary wildfires but that the hotspot tools were less likely to be beneficial for addressing outbreaks of non-incendiary human-ignited wildfires.
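    An ACP(1,1) one-step forecast of weekly ignition counts can be sketched as follows. The parameter values and the weekly series are illustrative, not estimates from the tribal land data:

```python
import numpy as np

def acp_forecast(y, omega, alpha, beta):
    # ACP(1,1): conditional mean lam[t] = omega + alpha*y[t-1] + beta*lam[t-1];
    # counts are Poisson(lam[t]). Filtering the observed series yields the
    # one-step-ahead (here, one-week-ahead) expected count.
    lam = float(np.mean(y))            # initialise the conditional mean
    for yt in y:
        lam = omega + alpha * yt + beta * lam
    return lam

# Hypothetical weekly counts of human-ignited wildfires on one land unit
weekly_fires = np.array([2, 0, 1, 3, 5, 4, 2, 1, 0, 2, 6, 4])
lam_hat = acp_forecast(weekly_fires, omega=0.3, alpha=0.4, beta=0.4)
print(f"expected ignitions next week = {lam_hat:.2f}")
```

A hotspot flag could then compare `lam_hat` against a deployment threshold; with alpha + beta < 1 the process is stationary.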

  8. Bayesian quantitative precipitation forecasts in terms of quantiles

    NASA Astrophysics Data System (ADS)

    Bentzien, Sabrina; Friederichs, Petra

    2014-05-01

    Ensemble prediction systems (EPS) for numerical weather predictions on the mesoscale are particularly developed to obtain probabilistic guidance for high impact weather. An EPS not only issues a deterministic future state of the atmosphere but a sample of possible future states. Ensemble postprocessing then translates such a sample of forecasts into probabilistic measures. This study focuses on probabilistic quantitative precipitation forecasts in terms of quantiles. Quantiles are particularly suitable to describe precipitation at various locations, since no assumption is required on the distribution of precipitation. The focus is on the prediction during high-impact events and related to the Volkswagen Stiftung funded project WEX-MOP (Mesoscale Weather Extremes - Theory, Spatial Modeling and Prediction). Quantile forecasts are derived from the raw ensemble and via quantile regression. Neighborhood method and time-lagging are effective tools to inexpensively increase the ensemble spread, which results in more reliable forecasts especially for extreme precipitation events. Since an EPS provides a large amount of potentially informative predictors, a variable selection is required in order to obtain a stable statistical model. A Bayesian formulation of quantile regression allows for inference about the selection of predictive covariates by the use of appropriate prior distributions. Moreover, the implementation of an additional process layer for the regression parameters accounts for spatial variations of the parameters. Bayesian quantile regression and its spatially adaptive extension is illustrated for the German-focused mesoscale weather prediction ensemble COSMO-DE-EPS, which has run (pre)operationally since December 2010 at the German Meteorological Service (DWD). Objective out-of-sample verification uses the quantile score (QS), a weighted absolute error between quantile forecasts and observations. 
The QS is a proper scoring function and can be decomposed into reliability, resolution and uncertainty parts. A quantile reliability plot gives detailed insights into the predictive performance of the quantile forecasts.
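    The quantile score used for the out-of-sample verification is the pinball loss. A minimal sketch, with gamma-distributed synthetic "precipitation" standing in for observations:

```python
import numpy as np

def quantile_score(q_forecast, obs, tau):
    # Pinball (quantile) loss: a weighted absolute error that is proper
    # for the tau-quantile; lower is better
    u = obs - q_forecast
    return np.mean(np.maximum(tau * u, (tau - 1) * u))

rng = np.random.default_rng(2)
obs = rng.gamma(shape=0.4, scale=5.0, size=10_000)  # skewed, rain-like amounts
clim_q90 = np.quantile(obs, 0.9)                    # climatological 90% quantile

qs_clim = quantile_score(np.full_like(obs, clim_q90), obs, tau=0.9)
qs_biased = quantile_score(np.full_like(obs, 2 * clim_q90), obs, tau=0.9)
print(f"QS(climatology) = {qs_clim:.3f}, QS(biased) = {qs_biased:.3f}")
```

Because the score is proper, the forecast equal to the true quantile scores better than the biased one.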

  9. Targeted observations to improve tropical cyclone track forecasts in the Atlantic and eastern Pacific basins

    NASA Astrophysics Data System (ADS)

    Aberson, Sim David

    In 1997, the National Hurricane Center and the Hurricane Research Division began conducting operational synoptic surveillance missions with the Gulfstream IV-SP jet aircraft to improve operational forecast models. During the first two years, twenty-four missions were conducted around tropical cyclones threatening the continental United States, Puerto Rico, and the Virgin Islands. Global Positioning System dropwindsondes were released from the aircraft at 150--200 km intervals along the flight track in the tropical cyclone environment to obtain wind, temperature, and humidity profiles from flight level (around 150 hPa) to the surface. The observations were processed and formatted aboard the aircraft and transmitted to the National Centers for Environmental Prediction (NCEP). There, they were ingested into the Global Data Assimilation System that subsequently provides initial and time-dependent boundary conditions for numerical models that forecast tropical cyclone track and intensity. Three dynamical models were employed in testing the targeting and sampling strategies. With the assimilation into the numerical guidance of all the observations gathered during the surveillance missions, only the 12-h Geophysical Fluid Dynamics Laboratory Hurricane Model forecast showed statistically significant improvement. Neither the forecasts from the Aviation run of the Global Spectral Model nor the shallow-water VICBAR model were improved with the assimilation of the dropwindsonde data. This mediocre result is found to be due mainly to the difficulty in operationally quantifying the storm-motion vector used to create accurate synthetic data to represent the tropical cyclone vortex in the models. A secondary limit on forecast improvements from the surveillance missions is the limited amount of data provided by the one surveillance aircraft in regular missions. 
The inability of some surveillance missions to surround the tropical cyclone with dropwindsonde observations is a possible third limit, though the results are inconclusive. Due to limited aircraft resources, optimal observing strategies for these missions must be developed. Since observations in areas of decaying error modes are unlikely to have large impact on subsequent forecasts, such strategies should be based on taking observations in those geographic locations corresponding to the most rapidly growing error modes in the numerical models and on known deficiencies in current data assimilation systems. Here, the most rapidly growing modes are represented by areas of large forecast spread in the NCEP bred-mode global ensemble forecasting system. The sampling strategy requires sampling the entire target region at approximately the same resolution as the North American rawinsonde network to limit the possibly spurious spread of information from dropwindsonde observations into data-sparse regions where errors are likely to grow. When only the subset of data in these fully-sampled target regions is assimilated into the numerical models, statistically significant reductions in track forecast error of up to 25% are seen within the critical first two days of the forecast. These model improvements are comparable with the cumulative business-as-usual track forecast model improvements expected over eighteen years.

  10. Short-term acoustic forecasting via artificial neural networks for neonatal intensive care units.

    PubMed

    Young, Jason; Macke, Christopher J; Tsoukalas, Lefteri H

    2012-11-01

    Noise levels in hospitals, especially neonatal intensive care units (NICUs), have become of great concern for hospital designers. This paper details an artificial neural network (ANN) approach to forecasting the sound loads in NICUs. The ANN is used to learn the relationship between past, present, and future noise levels. By training the ANN with data specific to the location and device used to measure the sound, the ANN is able to produce reasonable predictions of noise levels in the NICU. Best-case results show average absolute errors of 5.06 ± 4.04% when predicting the noise levels one hour ahead, corresponding to 2.53 ± 2.02 dBA. The ANN has a tendency to overpredict during periods of stability and underpredict during large transients. This forecasting algorithm could be of use in any application where prediction and prevention of harmful noise levels are of the utmost concern.
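    A minimal version of such an hour-ahead ANN forecaster, trained here on a synthetic daily sound cycle rather than NICU recordings. The network size, lag window, and training settings are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic hourly sound levels (dBA) with a daily cycle plus noise;
# real inputs would be SPL logs from the unit's sound-level meter
hours = np.arange(24 * 60)
spl = 55 + 5 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1.0, hours.size)

lags = 6                                    # past 6 hours -> 1 hour ahead
X = np.stack([spl[i:i + lags] for i in range(spl.size - lags)])
y = spl[lags:]
X, y = (X - 55.0) / 5.0, (y - 55.0) / 5.0   # crude normalisation

# One-hidden-layer network trained with full-batch gradient descent
W1 = rng.normal(0, 0.5, (lags, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, 8); b2 = 0.0
lr = 0.05
for _ in range(1000):
    h = np.tanh(X @ W1 + b1)
    err = h @ W2 + b2 - y                   # prediction error
    dh = np.outer(err, W2) * (1 - h ** 2)   # backprop through tanh
    W2 -= lr * h.T @ err / len(y); b2 -= lr * err.mean()
    W1 -= lr * X.T @ dh / len(y); b1 -= lr * dh.mean(axis=0)

pred = np.tanh(X @ W1 + b1) @ W2 + b2
mae_dba = 5.0 * np.mean(np.abs(pred - y))   # back to dBA units
print(f"in-sample MAE = {mae_dba:.2f} dBA")
```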

  11. Detecting and assessing Saharan dust contribution to PM10 loads: A pilot study within the EU-Life+10 project DIAPASON

    NASA Astrophysics Data System (ADS)

    Gobbi, Gian Paolo; Barnaba, Francesca; Bolignano, Andrea; Costabile, Francesca; Di Liberto, Luca; Dionisi, Davide; Drewnick, Frank; Lucarelli, Franco; Manigrasso, Maurizio; Nava, Silvia; Sauvage, Laurent; Sozzi, Roberto; Struckmeier, Caroline; Wille, Holger

    2015-04-01

    The EC LIFE+2010 DIAPASON Project (Desert dust Impact on Air quality through model-Predictions and Advanced Sensors ObservatioNs, www.diapason-life.eu) intends to contribute new methodologies to assess the role of aerosol advections of Saharan dust in the local PM loads recorded in Europe. To this goal, automated Polarization Lidar-Ceilometers (PLCs) were prototyped within DIAPASON to certify the presence of Saharan dust plumes and support evaluating their mass loadings in the lowermost atmosphere. The whole process also involves operational dust forecasts, as well as satellite and in-situ observations. Demonstration of the Project is implemented in the pilot region of Rome (Central Italy), where three networked DIAPASON PLCs started, in October 2013, a year-round, 24 h/day monitoring of the altitude-resolved aerosol backscatter and depolarization profiles. Two intensive observational periods (IOPs) involving chemical analysis and detailed physical characterization of aerosol samples have also been carried out in this year-long campaign, namely in Fall 2013 and Spring 2014. These allowed for an extensive interpretation of the PLC observations, highlighting important synergies between the PLC and the in situ data. The presentation will address capabilities of the employed PLCs, observations agreement with model forecasts of dust advections, retrievals of aerosol properties and methodologies developed to detect Saharan advections and to evaluate the relevant mass contribution to PM10. This latter task is intended to provide suggestions on possible improvements to the current EC Guidelines (2011) on this matter. In fact, specific Guidelines are delivered by the European Commission to provide Member States with a common method to assess the Saharan dust contribution to the currently legislated PM-related Air Quality metrics. The DIAPASON experience shows that improvements can be proposed to make the current EC Methodology more robust and flexible. 
The methodology DIAPASON recommends has been designed and validated taking advantage of the PLC observations and highlights the benefits of the operational use of such systems in routine Air Quality applications. Concurrently, PLC activities are contributing to the COST Action "TOPROF", a European effort aiming at the setup and operational use of Lidar-Ceilometer networks for meteorological and safety purposes.

  12. A probabilistic drought forecasting framework: A combined dynamical and statistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Hongxiang; Moradkhani, Hamid; Zarekarizi, Mahkameh

    In order to improve drought forecasting skill, this study develops a probabilistic drought forecasting framework comprised of dynamical and statistical modeling components. The novelty of this study is to seek the use of data assimilation to quantify initial condition uncertainty with the Monte Carlo ensemble members, rather than relying entirely on the hydrologic model or land surface model to generate a single deterministic initial condition, as currently implemented in the operational drought forecasting systems. Next, the initial condition uncertainty is quantified through data assimilation and coupled with a newly developed probabilistic drought forecasting model using a copula function. The initial conditions at each forecast start date are sampled from the data assimilation ensembles for forecast initialization. Finally, seasonal drought forecasting products are generated with the updated initial conditions. This study introduces the theory behind the proposed drought forecasting system, with an application in the Columbia River Basin, Pacific Northwest, United States. Results from both synthetic and real case studies suggest that the proposed drought forecasting system significantly improves the seasonal drought forecasting skills and can facilitate state drought preparation and declaration, at least three months before the official state drought declaration.

  13. MAFALDA: An early warning modeling tool to forecast volcanic ash dispersal and deposition

    NASA Astrophysics Data System (ADS)

    Barsotti, S.; Nannipieri, L.; Neri, A.

    2008-12-01

    Forecasting the dispersal of ash from explosive volcanoes is a scientific challenge to modern volcanology. It also represents a fundamental step in mitigating the potential impact of volcanic ash on urban areas and transport routes near explosive volcanoes. To this end we developed a Web-based early warning modeling tool named MAFALDA (Modeling and Forecasting Ash Loading and Dispersal in the Atmosphere) able to quantitatively forecast ash concentrations in the air and on the ground. The main features of MAFALDA are the usage of (1) a dispersal model, named VOL-CALPUFF, that couples the column ascent phase with the ash cloud transport and (2) high-resolution weather forecasting data, the capability to run and merge multiple scenarios, and the Web-based structure of the procedure that makes it suitable as an early warning tool. MAFALDA produces plots for a detailed analysis of ash cloud dynamics and ground deposition, as well as synthetic 2-D maps of areas potentially affected by dangerous concentrations of ash. A first application of MAFALDA to the long-lasting weak plumes produced at Mt. Etna (Italy) is presented. A similar tool can be useful to civil protection authorities and volcanic observatories in reducing the impact of the eruptive events. MAFALDA can be accessed at http://mafalda.pi.ingv.it.

  14. Selective inspection planning with ageing forecast for sewer types.

    PubMed

    Baur, R; Herz, R

    2002-01-01

    Investments in sewer rehabilitation must be based on inspection and evaluation of sewer conditions with respect to the severity of sewer damage and to environmental risks. This paper deals with the problem of forecasting the condition of sewers in a network from a small sample of inspected sewers. Transition functions from one condition class into the next poorer one, which were empirically derived from this sample, are used to forecast the condition of sewers. By the same procedure, transition functions were subsequently calibrated for sub-samples of different types of sewers. With these transition functions, the most probable date of entering a critical condition class can be forecast from sewer characteristics, such as material, period of construction, location, use for waste and/or storm water, profile, diameter and gradient. Results are shown for the estimates of the actual condition of the Dresden sewer network and its deterioration in case of doing nothing about it. A procedure is proposed for scheduling the inspection dates for sewers which have not yet been inspected and for those which have been inspected before.
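    The transition functions can be sketched with a Herz-type survival curve, a common choice in sewer deterioration modelling. The parameter values for the two cohorts below are invented for illustration:

```python
import math

def herz_survival(t, a, b, c):
    # Herz-type ageing function: probability that a sewer of age t (years)
    # is still in (or above) a given condition class; parameters are
    # calibrated per sewer type (material, construction period, ...)
    if t <= c:
        return 1.0          # no transitions before the resistance time c
    return (a + 1.0) / (a + math.exp(b * (t - c)))

# Illustrative parameters for two hypothetical sewer cohorts
concrete = dict(a=4.0, b=0.08, c=10.0)
brick    = dict(a=6.0, b=0.05, c=20.0)
for age in (20, 50, 80):
    print(age, round(herz_survival(age, **concrete), 2),
               round(herz_survival(age, **brick), 2))
```

Fitting a, b, c per cohort from the inspected sample, then evaluating the curve at each sewer's age, gives the forecast of its most probable condition class.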

  15. Long-term volcanic hazard forecasts based on Somma-Vesuvio past eruptive activity

    NASA Astrophysics Data System (ADS)

    Lirer, Lucio; Petrosino, Paola; Alberico, Ines; Postiglione, Immacolata

    2001-02-01

    Distributions of pyroclastic deposits from the main explosive events at Somma-Vesuvio during the 8,000-year B.P.-A.D. 1906 time-span have been analysed to provide maps of volcanic hazard for long-term eruption forecasting. In order to define hazard ratings, the spatial distributions and loads (kg/m2) exerted by the fall deposits on the roofs of buildings have been considered. A load higher than 300 kg/m2 is defined as destructive. The load/frequency relationship (where frequency is defined as the number of times that an area has been impacted by the deposition of fall deposits) is considered to be a suitable parameter for differentiating among areas according to hazard rating. Using past fall deposit distributions as the basis for future eruptive scenarios, the total area that could be affected by the products of a future Vesuvio explosive eruption is 1,500 km2. The perivolcanic area (274 km2) has the greatest hazard rating because it could be buried by pyroclastic flow deposits thicker than 0.5 m and up to several tens of metres in thickness. Currently, the perivolcanic area also has the highest risk because of the high exposed value, mainly arising from the high population density.

  16. Power Flow Simulations of a More Renewable California Grid Utilizing Wind and Solar Insolation Forecasting

    NASA Astrophysics Data System (ADS)

    Hart, E. K.; Jacobson, M. Z.; Dvorak, M. J.

    2008-12-01

    Time series power flow analyses of the California electricity grid are performed with extensive addition of intermittent renewable power. The study focuses on the effects of replacing non-renewable and imported (out-of-state) electricity with wind and solar power on the reliability of the transmission grid. Simulations are performed for specific days chosen throughout the year to capture seasonal fluctuations in load, wind, and insolation. Wind farm expansions and new wind farms are proposed based on regional wind resources and time-dependent wind power output is calculated using a meteorological model and the power curves of specific wind turbines. Solar power is incorporated both as centralized and distributed generation. Concentrating solar thermal plants are modeled using local insolation data and the efficiencies of pre-existing plants. Distributed generation from rooftop PV systems is included using regional insolation data, efficiencies of common PV systems, and census data. The additional power output of these technologies offsets power from large natural gas plants and is balanced for the purposes of load matching largely with hydroelectric power and by curtailment when necessary. A quantitative analysis of the effects of this significant shift in the electricity portfolio of the state of California on power availability and transmission line congestion, using a transmission load-flow model, is presented. A sensitivity analysis is also performed to determine the effects of forecasting errors in wind and insolation on load-matching and transmission line congestion.

  17. Analysis of temperature changes on three-phase synchronous generator using infrared: comparison between balanced and unbalanced load

    NASA Astrophysics Data System (ADS)

    Amien, S.; Yoga, W.; Fahmi, F.

    2018-02-01

    Synchronous generators are a major component of electrical energy generating systems, and in practice the load supplied by a generator is often unbalanced. This paper discusses the effect of load balance on synchronous generator temperature, comparing measurement results for the balanced and unbalanced states. Unbalanced loads can be caused by various asymmetric disturbances in the power system and by failures of load forecasting studies, so that the load distribution in each phase differs and causes excessive heating of the generator. Data were collected using an infrared thermometer and the resistance calculation method. Comparing resistive, inductive, and capacitive loads, the highest temperature under balanced load occurred when the generator was loaded resistively, where T = 31.9 °C at t = 65 minutes, while under unbalanced load the highest temperature occurred when the generator was loaded capacitively, where T = 40.1 °C at t = 60 minutes. Understanding this behavior helps maintain the generator for a longer operating life.

  18. NREL Integrate: RCS-4-42326

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hudgins, Andrew P.; Waight, Jim; Grover, Shailendra

    OMNETRIC Corp., Duke Energy, CPS Energy, and the University of Texas at San Antonio (UTSA) created a project team to execute the project 'OpenFMB Reference Architecture Demonstration.' The project included development and demonstration of concepts that will enable the electric utility grid to host larger penetrations of renewable resources. The project concept calls for the aggregation of renewable resources and loads into microgrids and the control of these microgrids with an implementation of the OpenFMB Reference Architecture. The production of power from the renewable resources appearing on the grid today is very closely linked to the weather. The well-understood difficulty of forecasting the weather leads to difficulty in forecasting the production of renewable resources: the current state of the art in forecasting power production from renewables (solar PV and wind) achieves accuracies in the range of 12-25% NMAE. In contrast, the demand for electricity, aggregated to the system level, is easier to predict; the state of the art for demand forecasting 24 hours ahead is about 2-3% MAPE. The load to be supplied from conventional resources (demand minus generation from renewable resources) is thus very hard to forecast. This means that even a few hours before the time of consumption, there can be considerable uncertainty over what must be done to balance supply and demand. Adding to the forecasting difficulty is the variability of the actual production of power from renewables. Due to the variability of wind speeds and solar insolation, the actual output of renewable resources can vary significantly over a short period of time: gusts of wind cause the power output of wind turbines to fluctuate, and the shadows of clouds moving over solar PV arrays cause the production of the array to vary. This compounds the problem of balancing supply and demand in real time.
Establishing a control system that can manage distribution systems with large penetrations of renewable resources is difficult due to two major issues: (1) the lack of standardization and interoperability between the vast array of equipment in operation and on the market, most of which uses different and proprietary means of communication, and (2) the magnitude of the network and the information it generates and consumes. The objective of this project is to provide the industry with a design concept and tools that will enable the electric power grid to overcome these barriers and support a larger penetration of clean energy from renewable resources.
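The two accuracy metrics quoted in this record can be illustrated with a small sketch: NMAE normalizes the mean absolute error by installed capacity (standard for renewable production forecasts, where actuals can be zero), while MAPE normalizes each error by the actual demand. All numbers below are invented for illustration.

```python
def nmae(actual, forecast, capacity):
    """Mean absolute error normalized by installed capacity."""
    errors = [abs(a - f) for a, f in zip(actual, forecast)]
    return sum(errors) / len(errors) / capacity

def mape(actual, forecast):
    """Mean absolute percentage error (actual values must be nonzero)."""
    return sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical wind farm with 100 MW installed capacity
wind_actual   = [40.0, 55.0, 30.0, 70.0]
wind_forecast = [55.0, 40.0, 45.0, 55.0]
print(f"wind NMAE: {nmae(wind_actual, wind_forecast, 100.0):.1%}")   # 15.0%

# Hypothetical system-level demand in MW
demand_actual   = [900.0, 1000.0, 1100.0]
demand_forecast = [918.0, 980.0, 1078.0]
print(f"demand MAPE: {mape(demand_actual, demand_forecast):.1%}")    # 2.0%
```

Note how the same absolute error weighs very differently under the two metrics, which is why renewable forecasts (12-25% NMAE) and demand forecasts (2-3% MAPE) are not directly comparable.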

  19. Using Forecasting to Predict Long-Term Resource Utilization for Web Services

    ERIC Educational Resources Information Center

    Yoas, Daniel W.

    2013-01-01

    Researchers have spent years understanding resource utilization to improve scheduling, load balancing, and system management through short-term prediction of resource utilization. Early research focused primarily on single operating systems; later, interest shifted to distributed systems and, finally, into web services. In each case researchers…

  20. 76 FR 66229 - Transmission Planning Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-26

    ... Transmission Services, at all demand levels over the range of forecast system demands, under the contingency... any planned firm load that is not directly served by the elements that are removed from service as a... to plan for the loss of firm service for a single contingency, the Commission finds that their...

  1. The 30/20 GHZ net market assessment

    NASA Technical Reports Server (NTRS)

    Rogers, J. C.; Reiner, P.

    1980-01-01

    By creating a number of market scenarios, variations in network types, network sizes, and service price levels were analyzed for their impact on market demand. Each market scenario represents a market demand forecast, with results for voice, data, and video service traffic expressed in peak-load megabits per second.

  2. A time series model: First-order integer-valued autoregressive (INAR(1))

    NASA Astrophysics Data System (ADS)

    Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.

    2017-07-01

    Nonnegative integer-valued time series arise in many applications. The first-order Integer-valued AutoRegressive model, INAR(1), is constructed with the binomial thinning operator to model such series; it depends on one preceding period of the process. The model parameter can be estimated by Conditional Least Squares (CLS), and model specification follows that of AR(1). Forecasting in INAR(1) uses the median or a Bayesian forecasting methodology. The median methodology returns the least integer s whose cumulative distribution function (CDF) value is at least 0.5. The Bayesian methodology produces h-step-ahead forecasts by generating the model parameter and the innovation-term parameter using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), then finding the least integer s whose CDF value is at least u, where u is drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 to April 2016.
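The median forecast described above can be sketched as follows for a Poisson-innovation INAR(1), X_t = α∘X_{t-1} + ε_t, where ∘ is binomial thinning: the one-step conditional pmf is the convolution of a Binomial(x, α) survival term and the Poisson innovation, and the forecast is the least integer at which the conditional CDF reaches 0.5. The parameter values here are illustrative, not estimated from the paper's data.

```python
import math

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def pois_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

def inar1_pmf(s, x_prev, alpha, lam):
    """P(X_t = s | X_{t-1} = x_prev): thinning survivors plus innovation."""
    return sum(binom_pmf(k, x_prev, alpha) * pois_pmf(s - k, lam)
               for k in range(0, min(s, x_prev) + 1))

def median_forecast(x_prev, alpha, lam):
    """Least integer s whose conditional CDF is at least 0.5."""
    cdf, s = 0.0, 0
    while True:
        cdf += inar1_pmf(s, x_prev, alpha, lam)
        if cdf >= 0.5:
            return s
        s += 1

# Example: last observed count 10, thinning probability 0.6, innovation mean 2
print(median_forecast(10, 0.6, 2.0))
```

An integer-valued median is the natural point forecast here, since the conditional mean α·x + λ is generally not an integer.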

  3. Regional PV power estimation and forecast to mitigate the impact of high photovoltaic penetration on electric grid.

    NASA Astrophysics Data System (ADS)

    Pierro, Marco; De Felice, Matteo; Maggioni, Enrico; Moser, David; Perotto, Alessandro; Spada, Francesco; Cornaro, Cristina

    2017-04-01

    The growing photovoltaic generation results in a stochastic variability of the electric demand that could compromise the stability of the grid and increase the required energy reserve and the energy imbalance cost. On a regional scale, solar power estimation and forecasting is becoming essential for Distribution System Operators (DSOs), Transmission System Operators (TSOs), energy traders, and aggregators of generation. The estimation of regional PV power can be used for PV power supervision and real-time control of residual load, while mid-term PV power forecasts can be employed for transmission scheduling to reduce energy imbalance and the related penalty costs, residual load tracking, trading optimization, and secondary energy reserve assessment. In this context, a new upscaling method was developed and used for estimation and mid-term forecasting of distributed photovoltaic generation in a small area in the north of Italy under the control of a local DSO. The method is based on spatial clustering of the PV fleet and on neural network models that take satellite or numerical weather prediction data (centered on cluster centroids) as input to estimate or predict the regional solar generation. It requires low computational effort and very little input information from users. The power estimation model achieved an RMSE of 3% of installed capacity. Intra-day forecasts (from 1 to 4 hours) obtained an RMSE of 5-7%, while the one- and two-day forecasts achieved RMSEs of 7% and 7.5%. A model to estimate the forecast error and the prediction intervals was also developed. Photovoltaic production in the considered region provided 6.9% of the electric consumption in 2015. Since this PV penetration is very similar to the one observed at the national level (7.9%), this is a good case study to analyse the impact of PV generation on the electric grid and the effects of PV power forecasting on transmission scheduling and on secondary reserve estimation.
It appears that, already at 7% PV penetration, distributed PV generation can have a great impact both on the DSO energy need and on the transmission scheduling capability. Indeed, for some hours of the day in summer, photovoltaic generation can provide from 50% to 75% of the energy that the local DSO would otherwise buy from the Italian TSO to cover the electrical demand. Moreover, mid-term forecasting can reduce the annual energy imbalance between the scheduled transmission and the actual one from 10% of the TSO energy supply (without the PV forecast) to 2%. Furthermore, it was shown that prediction intervals can be used not only to estimate the probability of a specific PV generation bid on the energy market, but also to reduce the energy reserve predicted for the next day. Two different methods for energy reserve estimation were developed and tested: the first is based on a clear-sky model, while the second makes use of the PV prediction intervals at the 95% confidence level. The latter reduces the day-ahead energy reserve by 36% with respect to the clear-sky method.

  4. Time series regression and ARIMAX for forecasting currency flow at Bank Indonesia in Sulawesi region

    NASA Astrophysics Data System (ADS)

    Suharsono, Agus; Suhartono, Masyitha, Aulia; Anuravega, Arum

    2015-12-01

    The purpose of the study is to forecast the outflow and inflow of currency at the Indonesian Central Bank, Bank Indonesia (BI), in the Sulawesi Region. The currency outflow and inflow data tend to have a trend pattern influenced by calendar variation effects. This research therefore focuses on applying forecasting methods that can handle calendar variation effects, i.e. Time Series Regression (TSR) and ARIMAX models, and compares their forecast accuracy with an ARIMA model. The best model is selected based on the lowest Root Mean Square Error (RMSE) on the out-of-sample dataset. The results show that ARIMA is the best model for forecasting the currency outflow and inflow at South Sulawesi; TSR is best for forecasting the currency outflow at Central Sulawesi and Southeast Sulawesi and the currency inflow at South Sulawesi and North Sulawesi; and ARIMAX is best for forecasting the currency outflow at North Sulawesi. Hence, the results show that more complex models do not necessarily yield more accurate forecasts than simpler ones.

  5. Using Bayesian Model Averaging (BMA) to calibrate probabilistic surface temperature forecasts over Iran

    NASA Astrophysics Data System (ADS)

    Soltanzadeh, I.; Azadi, M.; Vakili, G. A.

    2011-07-01

    Using Bayesian Model Averaging (BMA), an attempt was made to obtain calibrated probabilistic numerical forecasts of 2-m temperature over Iran. The ensemble employs three limited area models (WRF, MM5 and HRM), with WRF used in five different configurations. Initial and boundary conditions for MM5 and WRF are obtained from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS), and for HRM they come from the analysis of the Global Model Europe (GME) of the German Weather Service. The resulting seven-member ensemble was run for a period of 6 months (from December 2008 to May 2009) over Iran. The 48-h raw ensemble outputs were calibrated with the BMA technique for 120 days, using a 40-day training sample of forecasts and the corresponding verification data. The calibrated probabilistic forecasts were assessed using rank histograms and attributes diagrams. Results showed that application of BMA improved the reliability of the raw ensemble. Using the weighted ensemble mean forecast as a deterministic forecast, it was found that the deterministic-style BMA forecasts usually performed better than the best member's deterministic forecast.
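The BMA predictive distribution described above is, in its common form, a weighted mixture of densities centered on bias-corrected member forecasts. The sketch below shows that form for a seven-member ensemble; the weights, bias corrections, and spread are invented for illustration (in practice they are fitted by EM over the training window, 40 days in this record).

```python
import math

def normal_pdf(y, mu, sigma):
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bma_predictive_pdf(y, forecasts, weights, bias, sigma):
    """Mixture density at y; bias[k] = (a_k, b_k) corrects member k linearly."""
    return sum(w * normal_pdf(y, a + b * f, sigma)
               for f, w, (a, b) in zip(forecasts, weights, bias))

def bma_mean(forecasts, weights, bias):
    """Weighted ensemble mean of the bias-corrected member forecasts."""
    return sum(w * (a + b * f) for f, w, (a, b) in zip(forecasts, weights, bias))

# Seven hypothetical members (five WRF configs, MM5, HRM), degrees C
forecasts = [21.0, 22.5, 20.0, 23.0, 21.5, 19.5, 24.0]
weights   = [0.25, 0.20, 0.15, 0.12, 0.11, 0.09, 0.08]   # sum to 1
bias      = [(0.0, 1.0)] * 7                              # no bias correction here
sigma     = 1.8                                           # common member spread

print(f"BMA deterministic forecast: {bma_mean(forecasts, weights, bias):.2f} C")
```

The weighted mean above is the "deterministic-style" BMA forecast the abstract compares against the best single member.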

  6. Global analysis of seasonal streamflow predictability using an ensemble prediction system and observations from 6192 small catchments worldwide

    NASA Astrophysics Data System (ADS)

    van Dijk, Albert I. J. M.; Peña-Arancibia, Jorge L.; Wood, Eric F.; Sheffield, Justin; Beck, Hylke E.

    2013-05-01

    Ideally, a seasonal streamflow forecasting system would ingest skilful climate forecasts and propagate these through calibrated hydrological models initialized with observed catchment conditions. At global scale, practical problems exist in each of these aspects. For the first time, we analyzed theoretical and actual skill in bimonthly streamflow forecasts from a global ensemble streamflow prediction (ESP) system. Forecasts were generated six times per year for 1979-2008 by an initialized hydrological model and an ensemble of 1° resolution daily climate estimates for the preceding 30 years. A post-ESP conditional sampling method was applied to 2.6% of forecasts, based on predictive relationships between precipitation and 1 of 21 climate indices prior to the forecast date. Theoretical skill was assessed against a reference run with historic forcing. Actual skill was assessed against streamflow records for 6192 small (<10,000 km2) catchments worldwide. The results show that initial catchment conditions provide the main source of skill. Post-ESP sampling enhanced skill in equatorial South America and Southeast Asia, particularly in terms of tercile probability skill, due to the persistence and influence of the El Niño Southern Oscillation. Actual skill was on average 54% of theoretical skill but considerably more for selected regions and times of year. The realized fraction of the theoretical skill probably depended primarily on the quality of precipitation estimates. Forecast skill could be predicted as the product of theoretical skill and historic model performance. Increases in seasonal forecast skill are likely to require improvement in the observation of precipitation and initial hydrological conditions.

  7. An early warning system for groundwater pollution based on the assessment of groundwater pollution risks.

    NASA Astrophysics Data System (ADS)

    Zhang, Weihong.; Zhao, Yongsheng; Hong, Mei; Guo, Xiaodong

    2009-04-01

    Groundwater pollution is usually complex and concealed, and its remediation is difficult, costly, time-consuming, and often ineffective. An early warning system for groundwater pollution is needed that detects groundwater quality problems and provides the information necessary to make sound decisions before massive degradation occurs. Early warning was performed by comprehensively considering the current groundwater quality, the groundwater quality trend, and the groundwater pollution risk. A map of basic groundwater quality was obtained by fuzzy comprehensive evaluation or BP neural network evaluation. Based on multi-annual groundwater monitoring datasets, future water quality was forecast using time-series analysis methods, and the water quality trend was analyzed with Spearman's rank correlation coefficient. The relative risk map of groundwater pollution was estimated through a procedure that identifies, cell by cell, the values of three factors: inherent vulnerability, pollution source load risk, and contamination hazard. The DRASTIC method was used to assess the inherent vulnerability of the aquifer. The load risk of pollution sources was analyzed based on contamination potential and pollution degree; the assessment index involves the variety of pollution sources, the quantity of contaminants, the releasing potential of pollutants, and distance. The load risks of all sources were combined using GIS overlay technology. By integrating the early warning model with ComGIS technology, a regional groundwater pollution early-warning information system was developed and applied to Qiqiha'er groundwater. It can be used to evaluate current water quality, forecast water quality trends, and analyze the space-time extent of groundwater quality impacts from natural processes and human activities.
Keywords: groundwater pollution, early warning, aquifer vulnerability, pollution load, pollution risk, ComGIS

  8. Improving the Forecast Accuracy of an Ocean Observation and Prediction System by Adaptive Control of the Sensor Network

    NASA Astrophysics Data System (ADS)

    Talukder, A.; Panangadan, A. V.; Blumberg, A. F.; Herrington, T.; Georgas, N.

    2008-12-01

    The New York Harbor Observation and Prediction System (NYHOPS) is a real-time, estuarine and coastal ocean observing and modeling system for the New York Harbor and surrounding waters. Real-time measurements from in-situ mobile and stationary sensors in the NYHOPS networks are assimilated into marine forecasts in order to reduce the discrepancy with ground truth. The forecasts are obtained from the ECOMSED hydrodynamic model, a shallow water derivative of the Princeton Ocean Model. Currently, all sensors in the NYHOPS system are operated in a fixed mode with uniform sampling rates. This technology infusion effort demonstrates the use of Model Predictive Control (MPC) to autonomously adapt the operation of both mobile and stationary sensors in response to changing events that are automatically detected from the ECOMSED forecasts. The controller focuses sensing resources on those regions that are expected to be impacted by the detected events. The MPC approach involves formulating the problem of calculating the optimal sensor parameters as a constrained multi-objective optimization problem. We have developed an objective function that takes into account the spatiotemporal relationship of the in-situ sensor locations and the locations of events detected by the model. Experiments in simulation were carried out using data collected during a freshwater flooding event. The location of the resulting freshwater plume was calculated from the corresponding model forecasts and was used by the MPC controller to derive control parameters for the sensing assets. The operational parameters that are controlled include the sampling rates of stationary sensors, paths of unmanned underwater vehicles (UUVs), and data transfer routes between sensors and the central modeling computer. The simulation experiments show that MPC-based sensor control reduces the RMS error in the forecast by a factor of 380% as compared to uniform sampling.
The paths of multiple UUVs were simultaneously calculated such that measurements from on-board sensors would lead to maximal reduction in the forecast error after data assimilation. The MPC controller also reduces the consumption of system resources such as energy expended in sampling and wireless communication. The MPC-based control approach can be generalized to accept data from remote sensing satellites. This will enable in-situ sensors to be regulated using forecasts generated by assimilating local high resolution in-situ measurements with wide-area observations from remote sensing satellites.

  9. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power and are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors, and the cumulative distribution function (CDF) is analytically deduced. The inverse transform method, based on Monte Carlo sampling and the CDF, is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start-time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.

  10. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Florita, Anthony R; Krishnan, Venkat K

    2017-08-31

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim to reduce the impact of WPRs for power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
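The scenario-generation step shared by the two records above can be sketched as follows: forecast errors are modeled with a Gaussian mixture, the mixture CDF is evaluated, and inverse-transform Monte Carlo sampling produces a large set of error scenarios. Mixture parameters below are invented for illustration, and the subsequent swinging-door ramp extraction is not shown.

```python
import math, random

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def gmm_cdf(x, comps):
    """comps: list of (weight, mean, std); weights sum to 1."""
    return sum(w * normal_cdf(x, m, s) for w, m, s in comps)

def gmm_inverse_cdf(u, comps, lo=-1.0, hi=1.0, tol=1e-9):
    """Invert the mixture CDF by bisection (errors in per-unit of capacity)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if gmm_cdf(mid, comps) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def sample_error_scenarios(n, comps, seed=0):
    """Inverse-transform sampling: u ~ U(0,1), then x = F^{-1}(u)."""
    rng = random.Random(seed)
    return [gmm_inverse_cdf(rng.random(), comps) for _ in range(n)]

# Illustrative two-component mixture of wind forecast errors (p.u. of capacity)
comps = [(0.7, 0.0, 0.05), (0.3, -0.02, 0.15)]
scenarios = sample_error_scenarios(2000, comps)
print(min(scenarios), max(scenarios))
```

Each sampled error, added to the point forecast, yields one wind power scenario from which ramps can then be extracted.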

  11. An actual load forecasting methodology by interval grey modeling based on the fractional calculus.

    PubMed

    Yang, Yang; Xue, Dingyü

    2017-07-17

    The operation processes of a thermal power plant are measured by real-time data, and a large amount of historical interval data can be obtained from the dataset. Within defined periods of time, this interval information provides important input for decision making and equipment maintenance. Actual load is one of the most important parameters, and the trends hidden in the historical data reflect the overall operating status of the equipment. However, with interval grey numbers, modeling and prediction are more complicated than with real numbers. In order not to lose any information, this paper uses geometric coordinate features, given by the coordinates of the area and middle-point lines, which are proved to carry the same information as the original interval data. A grey prediction model for interval grey numbers based on fractional-order accumulation calculus is proposed. Compared with the integer-order model, the proposed method has more degrees of freedom and better modeling and prediction performance, and it can be widely used for modeling and prediction with small samples of historical interval industrial sequences. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
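The fractional-order accumulation underlying such grey models can be sketched as follows: for order r, the accumulated series is x_r[k] = Σ_{i≤k} C(k-i+r-1, k-i)·x[i], with the generalized binomial coefficient Γ(r+j)/(Γ(j+1)Γ(r)) for lag j. With r = 1 this reduces to the ordinary cumulative sum used in integer-order grey models such as GM(1,1). The series values are illustrative; the paper's interval-number treatment is not reproduced here.

```python
import math

def frac_coeff(j, r):
    """Generalized binomial coefficient C(j + r - 1, j) for lag j, order r."""
    return math.gamma(r + j) / (math.gamma(j + 1) * math.gamma(r))

def fractional_ago(x, r):
    """Fractional-order accumulated generating operation of order r."""
    n = len(x)
    return [sum(frac_coeff(k - i, r) * x[i] for i in range(k + 1))
            for k in range(n)]

series = [2.0, 3.0, 5.0, 4.0, 6.0]
print(fractional_ago(series, 1.0))   # ordinary cumulative sum
print(fractional_ago(series, 0.5))   # fractional accumulation, damped weights
```

The order r is the extra degree of freedom the abstract refers to: tuning it changes how strongly older observations are weighted in the accumulation.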

  12. Forecasting stock market volatility: Do realized skewness and kurtosis help?

    NASA Astrophysics Data System (ADS)

    Mei, Dexiang; Liu, Jing; Ma, Feng; Chen, Wang

    2017-09-01

    In this study, we investigate the predictive power of realized skewness (RSK) and realized kurtosis (RKU) for stock market volatility, which has not been addressed in existing studies. Out-of-sample results show that RSK, which can significantly improve forecast accuracy at mid- and long-term horizons, is more powerful than RKU in forecasting volatility, whereas both variables add little at short horizons. Furthermore, we employ the realized kernel (RK) for a robustness analysis, and the conclusions are consistent with those from the RV measures. Our results are of great importance for portfolio allocation and financial risk management.
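The realized measures discussed above are commonly built from n intraday returns r_1..r_n as RV = Σr², RSK = √n·Σr³ / RV^{3/2}, and RKU = n·Σr⁴ / RV². The sketch below uses that common definition (the paper may differ in detail), with invented return values.

```python
import math

def realized_moments(returns):
    """Daily realized variance, skewness, and kurtosis from intraday returns."""
    n = len(returns)
    rv = sum(r * r for r in returns)
    rsk = math.sqrt(n) * sum(r ** 3 for r in returns) / rv ** 1.5
    rku = n * sum(r ** 4 for r in returns) / rv ** 2
    return rv, rsk, rku

# Hypothetical intraday returns for one trading day
intraday = [0.002, -0.001, 0.0015, -0.003, 0.001, 0.0005]
rv, rsk, rku = realized_moments(intraday)
print(f"RV={rv:.2e}  RSK={rsk:.3f}  RKU={rku:.3f}")
```

A perfectly symmetric return sequence gives RSK = 0, which is a convenient sanity check on the implementation.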

  13. Optimal Asset Distribution for Environmental Assessment and Forecasting Based on Observations, Adaptive Sampling, and Numerical Prediction

    DTIC Science & Technology

    2012-09-30

    Steven R. Ramp, Soliton Ocean Services, Inc., 691 Country Club Drive, Monterey, CA 93924. The results show that the incoming shortwave radiation was the dominant term, even when averaged over the dark hours.

  14. A preliminary study of the statistical analyses and sampling strategies associated with the integration of remote sensing capabilities into the current agricultural crop forecasting system

    NASA Technical Reports Server (NTRS)

    Sand, F.; Christie, R.

    1975-01-01

    Extending the crop survey application of remote sensing from small experimental regions to state and national levels requires that a sample of agricultural fields be chosen for remote sensing of crop acreage, and that a statistical estimate be formulated with measurable characteristics. The critical requirements for the success of the application are reviewed in this report. The problem of sampling in the presence of cloud cover is discussed. Integration of remotely sensed information about crops into current agricultural crop forecasting systems is treated on the basis of the USDA multiple frame survey concepts, with an assumed addition of a new frame derived from remote sensing. Evolution of a crop forecasting system which utilizes LANDSAT and future remote sensing systems is projected for the 1975-1990 time frame.

  15. Predicting vehicle fuel consumption patterns using floating vehicle data.

    PubMed

    Du, Yiman; Wu, Jianping; Yang, Senyan; Zhou, Liutong

    2017-09-01

    The status of energy consumption and air pollution in China is serious, so it is important to analyze and predict the fuel consumption of various types of vehicles under different influencing factors. To fully describe the relationship between fuel consumption and these factors, massive amounts of floating vehicle data were used. Fuel consumption and congestion patterns were explored from large samples of historical floating vehicle data, drivers' information and vehicle parameters were examined across different group classifications, and average velocity and average fuel consumption were analyzed in the temporal and spatial dimensions respectively. The fuel consumption forecasting model was established using a Back Propagation Neural Network; part of the sample set was used to train the forecasting model and the remaining part was used as input to it. Copyright © 2017. Published by Elsevier B.V.
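A Back Propagation Neural Network of the kind named above can be sketched minimally as a single hidden layer with sigmoid activation, trained by gradient descent on squared error. The inputs (normalized speed and congestion level) and the target fuel-rate function below are invented for illustration; the paper's network size and features are not specified here.

```python
import math, random

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class BPNet:
    def __init__(self, n_in, n_hid):
        self.w1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hid)]
        self.b1 = [0.0] * n_hid
        self.w2 = [random.uniform(-0.5, 0.5) for _ in range(n_hid)]
        self.b2 = 0.0

    def forward(self, x):
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        return sum(w * h for w, h in zip(self.w2, self.h)) + self.b2

    def train_step(self, x, y, lr=0.1):
        err = self.forward(x) - y              # dLoss/dOutput for 0.5*err^2
        for j, h in enumerate(self.h):
            grad_h = err * self.w2[j] * h * (1.0 - h)   # backprop through sigmoid
            self.w2[j] -= lr * err * h
            for i, xi in enumerate(x):
                self.w1[j][i] -= lr * grad_h * xi
            self.b1[j] -= lr * grad_h
        self.b2 -= lr * err
        return 0.5 * err * err

# Toy data: fuel rate rises with speed and congestion (both normalized to [0,1])
data = [([s / 10.0, c / 5.0], 0.3 * (s / 10.0) + 0.5 * (c / 5.0))
        for s in range(11) for c in range(6)]

net = BPNet(n_in=2, n_hid=4)
losses = [sum(net.train_step(x, y) for x, y in data) / len(data) for _ in range(300)]
print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

In practice a held-out portion of the floating vehicle data would be used to check generalization, mirroring the train/apply split the abstract describes.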

  16. Impacts of the Midwestern Drought Forecasts of 2000.

    NASA Astrophysics Data System (ADS)

    Changnon, Stanley A.

    2002-10-01

    In March of 2000 (and again in April and May) NOAA issued long-range forecasts indicating that an existing Midwestern drought would continue and intensify through the upcoming summer. These forecasts received extensive media coverage and wide public attention. If the drought persisted and intensified during the summer of 2000, significant agricultural and water supply problems would occur. However, in late May, June, and July heavy rains fell throughout most of the Midwest, ending the drought in most areas and revealing that the forecast was incorrect for most of the region. Significant media coverage was devoted to the 'failed' forecast, with considerable speculation that major economic hardship had resulted from it. This study assesses the effects of the failed drought forecast on agricultural and water agency actions in the Midwest. Assessment of the agricultural and water management sectors revealed notable commonalities. Most people surveyed were aware of the drought forecasts, and the information sources were diverse. One-third of those surveyed indicated they did nothing as a result of the forecasts. The decisions and actions taken by others as a result of the forecasts had mixed impacts. Water resource actions such as conserving water, seeking new sources, and convening state drought groups cost little and were considered beneficial. However, in the three areas of agricultural impacts (crop production shifts, crop insurance purchases, and grain market choices), mainly negative outcomes occurred. The 13 March issuance of the forecast was too late for producers to make sizable changes in production practices or to alter insurance coverage greatly, and most forecast-based actions taken in these two areas resulted in negative but financially minor losses. However, 48% of the 1017 producers sampled altered their normal crop marketing practices, which in 84% of the cases led to sizable losses in revenue.
This loss can be extrapolated as $1.1 billion for the entire Midwest if the sample statistics are representative of the region. A common result of the failed drought forecast among its users was a loss of credibility in climate predictions and a reluctance to use them in the future. Credibility is a fragile commodity that is difficult to obtain and is easy to lose.

  17. Building Technology Forecast and Evaluation (BTFE). Volume 2. Evaluation of Two Structural Systems

    DTIC Science & Technology

    1990-11-01

    insulative foam (expanded polystyrene) strips between each truss. The assembly is held together with 14-gauge wires welded to the trusses on 2-in. centers. Structural load-bearing qualities: expanded polystyrene. No taping and mudding. Tile: thin-set or float over.

  18. Impact on Hurricane Track and Intensity Forecasts of GPS Dropwindsonde Observations from the First-Season Flights of the NOAA Gulfstream-IV Jet Aircraft.

    NASA Astrophysics Data System (ADS)

    Aberson, Sim D.; Franklin, James L.

    1999-03-01

    In 1997, the Tropical Prediction Center (TPC) began operational Gulfstream-IV jet aircraft missions to improve the numerical guidance for hurricanes threatening the continental United States, Puerto Rico, and the Virgin Islands. During these missions, the new generation of Global Positioning System dropwindsondes were released from the aircraft at 150-200-km intervals along the flight track in the environment of the tropical cyclone to obtain profiles of wind, temperature, and humidity from flight level to the surface. The observations were ingested into the global model at the National Centers for Environmental Prediction, which subsequently serves as initial and boundary conditions to other numerical tropical cyclone models. Because of a lack of tropical cyclone activity in the Atlantic basin, only five such missions were conducted during the inaugural 1997 hurricane season. Due to logistical constraints, sampling in all quadrants of the storm environment was accomplished in only one of the five cases during 1997. Nonetheless, the dropwindsonde observations improved mean track forecasts from the Geophysical Fluid Dynamics Laboratory hurricane model by as much as 32%, and the intensity forecasts by as much as 20% during the hurricane watch period (within 48 h of projected landfall). Forecasts from another dynamical tropical cyclone model (VICBAR) also showed modest improvements with the dropwindsonde observations. These improvements, if confirmed by a larger sample, represent a large step toward the forecast accuracy goals of TPC. The forecast track improvements are as large as those accumulated over the past 20-25 years, and those for forecast intensity provide further evidence that better synoptic-scale data can lead to more skillful dynamical tropical cyclone intensity forecasts.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Kyri; Dall'Anese, Emiliano; Summers, Tyler

    This paper outlines a data-driven, distributionally robust approach to solve chance-constrained AC optimal power flow problems in distribution networks. Uncertain forecasts for loads and power generated by photovoltaic (PV) systems are considered, with the goal of minimizing PV curtailment while meeting power flow and voltage regulation constraints. A data-driven approach is utilized to develop a distributionally robust conservative convex approximation of the chance constraints; in particular, the mean and covariance matrix of the forecast errors are updated online and leveraged to enforce voltage regulation with predetermined probability via Chebyshev-based bounds. By combining an accurate linear approximation of the AC power flow equations with the distributionally robust chance constraint reformulation, the resulting optimization problem becomes convex and computationally tractable.
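
    The Chebyshev-based tightening described above can be sketched as follows; this is a minimal numpy illustration of the one-sided (Cantelli) bound for a single voltage constraint, with all numbers hypothetical rather than taken from the paper.

```python
import numpy as np

# Hypothetical forecast-error samples (e.g., PV output errors at one node).
rng = np.random.default_rng(0)
errors = rng.normal(loc=0.0, scale=0.02, size=500)

# Online estimates of the mean and standard deviation of the forecast error.
mu, sigma = errors.mean(), errors.std(ddof=1)

# One-sided Chebyshev (Cantelli) bound: P(X - mu >= k*sigma) <= 1/(1+k^2).
# To enforce P(v + e <= v_max) >= 1 - eps for any error distribution with
# these first two moments, tighten the limit by mu + sqrt((1-eps)/eps)*sigma.
eps = 0.05
k = np.sqrt((1.0 - eps) / eps)
v_max = 1.05                         # per-unit voltage limit
v_max_tight = v_max - (mu + k * sigma)
```

    The tightened limit is then what the convex OPF sees in place of the original voltage bound.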

  20. Forecasting short-term data center network traffic load with convolutional neural networks.

    PubMed

    Mozo, Alberto; Ordozgoiti, Bruno; Gómez-Canaval, Sandra

    2018-01-01

    Efficient resource management in data centers is of central importance to content service providers as 90 percent of the network traffic is expected to go through them in the coming years. In this context we propose the use of convolutional neural networks (CNNs) to forecast short-term changes in the amount of traffic crossing a data center network. This value is an indicator of virtual machine activity and can be utilized to shape the data center infrastructure accordingly. The behaviour of network traffic at the seconds scale is highly chaotic, and therefore traditional time-series-analysis approaches such as ARIMA fail to obtain accurate forecasts. We show that our convolutional neural network approach can exploit the non-linear regularities of network traffic, providing significant improvements with respect to the mean absolute deviation and standard deviation of the data, and outperforming ARIMA by an increasingly significant margin as the forecasting granularity increases beyond the 16-second resolution. In order to increase the accuracy of the forecasting model, we exploit the architecture of the CNNs using multiresolution input distributed among separate channels of the first convolutional layer. We validate our approach with an extensive set of experiments using a data set collected at the core network of an Internet Service Provider over a period of 5 months, totalling 70 days of traffic at the one-second resolution.
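
    The multiresolution-channel idea can be illustrated with a small numpy sketch: the same recent history is presented to the first convolutional layer at several temporal resolutions, one per input channel. The factors, window length, and data below are hypothetical.

```python
import numpy as np

def multires_channels(series, factors=(1, 4, 16), window=64):
    """Build a (channels, window) input from the most recent samples:
    each channel averages the series at a coarser resolution, so the
    CNN sees the same history at several time scales at once."""
    channels = []
    for f in factors:
        needed = window * f
        # Average consecutive groups of f one-second samples.
        x = series[-needed:].reshape(window, f).mean(axis=1)
        channels.append(x)
    return np.stack(channels)

series = np.arange(2048, dtype=float)   # stand-in for 1-second traffic samples
inp = multires_channels(series)
print(inp.shape)                        # (3, 64)
```

    Channel 0 covers the last 64 seconds at full resolution, while channel 2 covers the last 1024 seconds at 16-second resolution, which is the trade-off the abstract describes.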

  1. Forecasting short-term data center network traffic load with convolutional neural networks

    PubMed Central

    Ordozgoiti, Bruno; Gómez-Canaval, Sandra

    2018-01-01

    Efficient resource management in data centers is of central importance to content service providers as 90 percent of the network traffic is expected to go through them in the coming years. In this context we propose the use of convolutional neural networks (CNNs) to forecast short-term changes in the amount of traffic crossing a data center network. This value is an indicator of virtual machine activity and can be utilized to shape the data center infrastructure accordingly. The behaviour of network traffic at the seconds scale is highly chaotic, and therefore traditional time-series-analysis approaches such as ARIMA fail to obtain accurate forecasts. We show that our convolutional neural network approach can exploit the non-linear regularities of network traffic, providing significant improvements with respect to the mean absolute deviation and standard deviation of the data, and outperforming ARIMA by an increasingly significant margin as the forecasting granularity increases beyond the 16-second resolution. In order to increase the accuracy of the forecasting model, we exploit the architecture of the CNNs using multiresolution input distributed among separate channels of the first convolutional layer. We validate our approach with an extensive set of experiments using a data set collected at the core network of an Internet Service Provider over a period of 5 months, totalling 70 days of traffic at the one-second resolution. PMID:29408936

  2. Modeling and Forecasting Mortality With Economic Growth: A Multipopulation Approach.

    PubMed

    Boonen, Tim J; Li, Hong

    2017-10-01

    Research on mortality modeling of multiple populations focuses mainly on extrapolating past mortality trends and summarizing these trends by one or more common latent factors. This article proposes a multipopulation stochastic mortality model that uses the explanatory power of economic growth. In particular, we extend the Li and Lee model (Li and Lee 2005) by including economic growth, represented by the real gross domestic product (GDP) per capita, to capture the common mortality trend for a group of populations with similar socioeconomic conditions. We find that our proposed model provides a better in-sample fit and out-of-sample forecast performance. Moreover, it generates lower (higher) forecasted period life expectancy for countries with high (low) GDP per capita than the Li and Lee model.
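
    A hedged sketch of the model structure, in Lee-Carter-style notation: the exact specification and estimation details are in the paper, and the GDP link shown here is only one plausible reading of the extension.

```latex
% Li-Lee: common factor B_x K_t plus population-specific factor b_x k_t,
% with the common period effect K_t linked to real GDP per capita g_t
% (hypothetical functional form):
\ln m_{x,t}^{(i)} = a_x^{(i)} + B_x K_t + b_x^{(i)} k_t^{(i)} + \varepsilon_{x,t}^{(i)},
\qquad K_t = \beta_0 + \beta_1 \ln g_t + e_t
```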

  3. Study on the medical meteorological forecast of the number of hypertension inpatient based on SVR

    NASA Astrophysics Data System (ADS)

    Zhai, Guangyu; Chai, Guorong; Zhang, Haifeng

    2017-06-01

    The purpose of this study is to build a hypertension prediction model by examining the meteorological factors associated with hypertension incidence. The research method selects the standard data of relative humidity, air temperature, visibility, wind speed and air pressure of Lanzhou from 2010 to 2012 (calculating the maximum, minimum and average value with 5 days as a unit) as the input variables of Support Vector Regression (SVR), and the standard data of hypertension incidence for the same period as the output dependent variables, obtaining the optimal prediction parameters by a cross-validation algorithm; then, through SVR learning and training, an SVR forecast model for hypertension incidence is built. The result shows that the hypertension prediction model is composed of 15 input independent variables, the training accuracy is 0.005, and the final error is 0.0026389. The forecast accuracy of the SVR model is 97.1429%, which is higher than that of a statistical forecast equation and a neural network prediction method. It is concluded that the SVR model provides a new method for hypertension prediction, with simple calculation, small error, good historical sample fitting, and independent sample forecast capability.
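
    As a rough stand-in for the SVR pipeline (which the abstract does not specify in code), the following numpy sketch uses RBF kernel ridge regression with a small hold-out grid search; the feature count matches the 15 input variables mentioned, but the data and hyperparameter grid are entirely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_kernel(A, B, gamma):
    # Pairwise squared distances, then the RBF kernel matrix.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_predict(Xtr, ytr, Xte, gamma, lam):
    # Kernel ridge regression: closed-form dual solution.
    K = rbf_kernel(Xtr, Xtr, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(Xtr)), ytr)
    return rbf_kernel(Xte, Xtr, gamma) @ alpha

# Synthetic stand-in: 15 "meteorological" features -> incidence-like target.
X = rng.normal(size=(120, 15))
y = X[:, 0] + 0.1 * rng.normal(size=120)

# Hold-out selection over a small (gamma, lambda) grid, mirroring the
# cross-validated parameter search described above.
Xtr, ytr, Xva, yva = X[:90], y[:90], X[90:], y[90:]
mse, gamma, lam = min(
    (np.mean((fit_predict(Xtr, ytr, Xva, g, l) - yva) ** 2), g, l)
    for g in (0.01, 0.1, 1.0) for l in (1e-3, 1e-1)
)
```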

  4. Price elasticity matrix of demand in power system considering demand response programs

    NASA Astrophysics Data System (ADS)

    Qu, Xinyao; Hui, Hongxun; Yang, Shengchun; Li, Yaping; Ding, Yi

    2018-02-01

    Increasing renewable energy generation has brought more intermittency and volatility to the electric power system. Demand-side resources can improve the consumption of renewable energy through demand response (DR), which has become one of the important means of improving the reliability of the power system. In price-based DR, the sensitivity analysis of customers' power demand to changing electricity prices is pivotal for setting reasonable prices and forecasting loads of the power system. This paper studies the price elasticity matrix of demand (PEMD). An improved PEMD model is proposed based on elasticity effect weight, which can unify rigid loads and flexible loads. Moreover, the structure of the PEMD, which is decided by price policies and load types, and the calculation method of the PEMD are also proposed. Several cases are studied to prove the effectiveness of this method.
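
    The basic PEMD mechanics (not the paper's improved weighted model) can be sketched in numpy: relative demand changes are the elasticity matrix applied to relative price changes. All numbers below are hypothetical.

```python
import numpy as np

# Hypothetical 3-period (peak, flat, valley) elasticity matrix:
# diagonal entries are self-elasticities (negative), off-diagonal
# entries are cross-elasticities (load moved between periods).
E = np.array([[-0.10,  0.03,  0.02],
              [ 0.03, -0.08,  0.02],
              [ 0.02,  0.02, -0.06]])

d0 = np.array([120.0, 90.0, 60.0])   # baseline demand (MW)
p0 = np.array([0.60, 0.40, 0.20])    # baseline prices
p1 = np.array([0.80, 0.40, 0.15])    # new time-of-use prices

# Relative demand response: dd/d0 = E @ (dp/p0).
d1 = d0 * (1.0 + E @ ((p1 - p0) / p0))
```

    With a peak price increase and a valley price cut, peak demand falls while valley demand rises, which is the load-shaping effect price-based DR relies on.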

  5. Scalable and balanced dynamic hybrid data assimilation

    NASA Astrophysics Data System (ADS)

    Kauranne, Tuomo; Amour, Idrissa; Gunia, Martin; Kallio, Kari; Lepistö, Ahti; Koponen, Sampsa

    2017-04-01

    Scalability of complex weather forecasting suites is dependent on the technical tools available for implementing highly parallel computational kernels, but to an equally large extent also on the dependence patterns between various components of the suite, such as observation processing, data assimilation and the forecast model. Scalability is a particular challenge for 4D variational assimilation methods that necessarily couple the forecast model into the assimilation process and subject this combination to an inherently serial quasi-Newton minimization process. Ensemble based assimilation methods are naturally more parallel, but large models force ensemble sizes to be small and that results in poor assimilation accuracy, somewhat akin to shooting with a shotgun in a million-dimensional space. The Variational Ensemble Kalman Filter (VEnKF) is an ensemble method that can attain the accuracy of 4D variational data assimilation with a small ensemble size. It achieves this by processing a Gaussian approximation of the current error covariance distribution, instead of a set of ensemble members, analogously to the Extended Kalman Filter EKF. Ensemble members are re-sampled every time a new set of observations is processed from a new approximation of that Gaussian distribution which makes VEnKF a dynamic assimilation method. After this a smoothing step is applied that turns VEnKF into a dynamic Variational Ensemble Kalman Smoother VEnKS. In this smoothing step, the same process is iterated with frequent re-sampling of the ensemble but now using past iterations as surrogate observations until the end result is a smooth and balanced model trajectory. In principle, VEnKF could suffer from similar scalability issues as 4D-Var. 
However, this can be avoided by isolating the forecast model completely from the minimization process: the latter is implemented as a wrapper code whose only link to the model is launching many totally independent model runs, each of which is itself parallelized. The only bottleneck in the process is the gathering and scattering of initial and final model state snapshots before and after the parallel runs, which requires a very efficient and low-latency communication network. However, the volume of data communicated is small, and the intervening minimization steps are only 3D-Var, which means their computational load is negligible compared with the fully parallel model runs. We present example results of the fully scalable VEnKF with the 4D lake and shallow sea model COHERENS, assimilating simultaneously continuous in situ measurements at a single point and infrequent satellite images that cover a whole lake.
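
    The core VEnKF idea of carrying a Gaussian approximation and re-sampling the ensemble after each analysis can be sketched as a toy linear update; this is a simplified illustration, not the COHERENS setup, and all dimensions and values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

def venkf_step(mean, cov, H, R, y, n_ens=50):
    """One analysis step in the spirit of VEnKF: propagate the Gaussian
    (mean, cov) rather than raw members, do a Kalman update against the
    new observations, then re-sample a fresh ensemble from the updated
    Gaussian (the 'dynamic' re-sampling described above)."""
    S = H @ cov @ H.T + R                      # innovation covariance
    K = cov @ H.T @ np.linalg.inv(S)           # Kalman gain
    mean_a = mean + K @ (y - H @ mean)         # analysis mean
    cov_a = (np.eye(len(mean)) - K @ H) @ cov  # analysis covariance
    ensemble = rng.multivariate_normal(mean_a, cov_a, size=n_ens)
    return mean_a, cov_a, ensemble

mean = np.zeros(3)
cov = np.eye(3)
H = np.array([[1.0, 0.0, 0.0]])                # observe first component only
R = np.array([[0.1]])
y = np.array([1.0])
mean_a, cov_a, ens = venkf_step(mean, cov, H, R, y)
```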

  6. Dynamics of analyst forecasts and emergence of complexity: Role of information disparity

    PubMed Central

    Ahn, Kwangwon

    2017-01-01

    We report complex phenomena arising among financial analysts, who gather information and generate investment advice, and elucidate them with the help of a theoretical model. Understanding how analysts form their forecasts is important in better understanding the financial market. Carrying out big-data analysis of the analyst forecast data from I/B/E/S for nearly thirty years, we find skew distributions as evidence for emergence of complexity, and show how information asymmetry or disparity affects financial analysts’ forming their forecasts. Here regulations, information dissemination throughout a fiscal year, and interactions among financial analysts are regarded as the proxy for a lower level of information disparity. It is found that financial analysts with better access to information display contrasting behaviors: a few analysts become bolder and issue forecasts independent of other forecasts while the majority of analysts issue more accurate forecasts and flock to each other. Main body of our sample of optimistic forecasts fits a log-normal distribution, with the tail displaying a power law. Based on the Yule process, we propose a model for the dynamics of issuing forecasts, incorporating interactions between analysts. Explaining nicely empirical data on analyst forecasts, this provides an appealing instance of understanding social phenomena in the perspective of complex systems. PMID:28498831
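
    A toy version of Yule-type dynamics, with a size-proportional "herding" rule standing in for analyst interactions, can be sketched in plain Python; the innovation probability rho and the step count are hypothetical, and the paper's actual model is richer than this.

```python
import random

random.seed(3)

def yule_sizes(n_steps, rho=0.3):
    """Toy Yule process: with probability rho a new group is started
    (a fresh, independent forecast), otherwise an existing group is
    joined with probability proportional to its size (herding).
    Group sizes then develop a heavy right tail."""
    sizes = [1]
    total = 1
    for _ in range(n_steps):
        if random.random() < rho:
            sizes.append(1)
        else:
            r = random.random() * total   # size-proportional choice
            acc = 0
            for i, s in enumerate(sizes):
                acc += s
                if r < acc:
                    sizes[i] += 1
                    break
        total += 1
    return sizes

sizes = yule_sizes(5000)
```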

  7. Reasonable Forecasts

    ERIC Educational Resources Information Center

    Taylor, Kelley R.

    2010-01-01

    This article presents a sample legal battle that illustrates school officials' "reasonable forecasts" of substantial disruption in the school environment. In 2006, two students from a Texas high school came to school carrying purses decorated with images of the Confederate flag. The school district has a zero-tolerance policy for…

  8. A Japanese New Altimetry Mission, COMPIRA - Towards High Temporal and Spatial Sampling of Sea Surface Height Measurement

    NASA Astrophysics Data System (ADS)

    Ito, N.; Uematsu, A.; Yajima, Y.; Isoguchi, O.

    2014-12-01

    Japan Aerospace Exploration Agency (JAXA) is working on a conceptual study of an altimeter mission named Coastal and Ocean measurement Mission with Precise and Innovative Radar Altimeter (COMPIRA), which will carry a wide-swath altimeter named Synthetic aperture radar (SAR) Height Imaging Oceanic Sensor with Advanced Interferometry (SHIOSAI). Capturing meso/submeso-scale phenomena is one of the important objectives of the COMPIRA mission, as are operational oceanography and fishery. For operational oceanography including coastal forecasting, the swath of SHIOSAI is set to 80 km on both the left and right sides to maximize temporal and spatial sampling of the sea surface height. The orbit specifications are also designed for better sampling, especially in the mid-latitude region: the spatial sampling grid is 5 km, and each point is observed 2 to 3 times per revisit period (about 10 days). In order to meet both the sampling-frequency and spatial-coverage requirements as far as possible, the orbit inclination was set relatively low, at 51 degrees. Although this sampling frequency is, of course, not high enough to capture the time evolution of coastal phenomena, an assimilation process could compensate for the time evolution if 2D SSH fields were observed at least once within the decorrelation time scale of the phenomena. JAXA has launched a framework called the "Coastal forecast core team" to develop a coastal forecast system through pre-launch activities toward COMPIRA. The assimilation segment, as well as satellite and in situ data provision, will play an important role in these activities. As a first step, we evaluated the improvement in ocean current forecasts using COMPIRA-simulated wide-swath, high-sampling sea surface height (SSH) data. Simulated SSH data are generated from regional ocean numerical models and the COMPIRA orbit and error specifications. 
Then, identical twin experiments are conducted to investigate the effect of wide-swath SSH measurements on coastal forecasts in the Tohoku Pacific coast region. The experiments show that the simulated sea surface current, assimilating COMPIRA data, well represents vortical features that cannot be reproduced by conventional nadir altimeters.

  9. Using Information Processing Techniques to Forecast, Schedule, and Deliver Sustainable Energy to Electric Vehicles

    NASA Astrophysics Data System (ADS)

    Pulusani, Praneeth R.

    As the number of electric vehicles on the road increases, the current power grid infrastructure will not be able to handle the additional load. Some approaches in Smart Grid research attempt to mitigate this, but those approaches alone will not be sufficient: together with the traditional solution of increased power production, they can still result in an insufficient and imbalanced power grid, leading to transformer blowouts, blackouts, blown fuses, etc. The proposed solution supplements the Smart Grid to create a more sustainable power grid. To solve the problem, or mitigate its magnitude, measures can be taken that depend on weather forecast models. For instance, wind and solar forecasts can be used to create first-order Markov chain models that help predict the availability of additional power at certain times. These models are used in conjunction with the information processing layer and bidirectional signal processing components of electric vehicle charging systems to schedule the amount of energy transferred per time interval at various times. The research was divided into three distinct components: (1) Renewable Energy Supply Forecast Model, (2) Energy Demand Forecast from PEVs, and (3) Renewable Energy Resource Estimation. For the first component, power data from a local wind turbine and weather forecast data from NOAA were used to develop a wind energy forecast model, using a first-order Markov chain model as the foundation. In the second component, the additional macro energy demand from PEVs in the Greater Rochester Area was forecasted by simulating concurrent driving routes. In the third component, historical data from renewable energy sources were analyzed to estimate the renewable resources needed to offset the energy demand from PEVs. The results from these models and components can be used in smart grid applications for scheduling and delivering energy. 
    Several solutions are discussed to mitigate the problems of overloaded transformers, insufficient energy supply, and higher utility costs.
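
    The first-order Markov chain component can be sketched by estimating a transition matrix from discretized wind states; the states and the sequence below are hypothetical stand-ins for the turbine and NOAA data.

```python
import numpy as np

# Hypothetical discretized wind-power states (0=low, 1=medium, 2=high).
states = [0, 0, 1, 2, 2, 1, 0, 1, 1, 2, 2, 2, 1, 0, 0, 1]

n = 3
counts = np.zeros((n, n))
for a, b in zip(states[:-1], states[1:]):
    counts[a, b] += 1

# Row-normalise to transition probabilities P[i, j] = P(next=j | current=i).
P = counts / counts.sum(axis=1, keepdims=True)

# One-step forecast distribution given the current state ("high").
current = np.array([0.0, 0.0, 1.0])
forecast = current @ P
```

    Longer horizons follow from repeated multiplication by P, which is how such a chain predicts the availability of additional power at future intervals.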

  10. Adequacy assessment of composite generation and transmission systems incorporating wind energy conversion systems

    NASA Astrophysics Data System (ADS)

    Gao, Yi

    The development and utilization of wind energy for satisfying electrical demand has received considerable attention in recent years due to its tremendous environmental, social and economic benefits, together with public support and government incentives. Electric power generation from wind energy behaves quite differently from that of conventional sources. The fundamentally different operating characteristics of wind energy facilities therefore affect power system reliability in a different manner than those of conventional systems. The reliability impact of such a highly variable energy source is an important aspect that must be assessed when the wind power penetration is significant. The focus of the research described in this thesis is on the utilization of state sampling Monte Carlo simulation in wind integrated bulk electric system reliability analysis and the application of these concepts in system planning and decision making. Load forecast uncertainty is an important factor in long range planning and system development. This thesis describes two approximate approaches developed to reduce the number of steps in a load duration curve which includes load forecast uncertainty, and to provide reasonably accurate generating and bulk system reliability index predictions. The developed approaches are illustrated by application to two composite test systems. A method of generating correlated random numbers with uniform distributions and a specified correlation coefficient in the state sampling method is proposed and used to conduct adequacy assessment in generating systems and in bulk electric systems containing correlated wind farms in this thesis. The studies described show that it is possible to use the state sampling Monte Carlo simulation technique to quantitatively assess the reliability implications associated with adding wind power to a composite generation and transmission system including the effects of multiple correlated wind sites. 
This is an important development as it permits correlated wind farms to be incorporated in large practical system studies without requiring excessive increases in computer solution time. The procedures described in this thesis for creating monthly and seasonal wind farm models should prove useful in situations where time period models are required to incorporate scheduled maintenance of generation and transmission facilities. There is growing interest in combining deterministic considerations with probabilistic assessment in order to evaluate the quantitative system risk and conduct bulk power system planning. A relatively new approach that incorporates deterministic and probabilistic considerations in a single risk assessment framework has been designated as the joint deterministic-probabilistic approach. The research work described in this thesis illustrates that the joint deterministic-probabilistic approach can be effectively used to integrate wind power in bulk electric system planning. The studies described in this thesis show that the application of the joint deterministic-probabilistic method provides more stringent results for a system with wind power than the traditional deterministic N-1 method because the joint deterministic-probabilistic technique is driven by the deterministic N-1 criterion with an added probabilistic perspective which recognizes the power output characteristics of a wind turbine generator.
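
    The state-sampling Monte Carlo technique at the heart of the thesis can be illustrated (without wind correlation or transmission effects) by a minimal loss-of-load probability estimate; the unit data and load level below are hypothetical.

```python
import random

random.seed(4)

# Hypothetical generating units: (capacity in MW, forced outage rate).
units = [(100, 0.05), (100, 0.05), (150, 0.08), (50, 0.02)]
load = 250.0                        # constant load level (MW)
n_samples = 20000

# State sampling: draw each unit's up/down state independently per sample,
# then check whether the surviving capacity covers the load.
loss_of_load = 0
for _ in range(n_samples):
    available = sum(cap for cap, forate in units if random.random() > forate)
    if available < load:
        loss_of_load += 1

lolp = loss_of_load / n_samples     # loss-of-load probability estimate
```

    Correlated wind farms, as in the thesis, would replace the independent draws with correlated uniform random numbers, but the sampling loop itself is unchanged.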

  11. Time series analysis of temporal trends in the pertussis incidence in Mainland China from 2005 to 2016.

    PubMed

    Zeng, Qianglin; Li, Dandan; Huang, Gui; Xia, Jin; Wang, Xiaoming; Zhang, Yamei; Tang, Wanping; Zhou, Hui

    2016-08-31

    Short-term forecasting of pertussis incidence is helpful for advance warning and for planning resource needs for future epidemics. Utilizing the Auto-Regressive Integrated Moving Average (ARIMA) model and the Exponential Smoothing (ETS) model as alternative models in the R software environment, this paper analyzed data from the Chinese Center for Disease Control and Prevention (China CDC) between January 2005 and June 2016. The ARIMA (0,1,0)(1,1,1)12 model (AICc = 1342.2, BIC = 1350.3) was selected as the best performing ARIMA model, and the ETS (M,N,M) model (AICc = 1678.6, BIC = 1715.4) was selected as the best performing ETS model; the ETS (M,N,M) model, having the minimum RMSE, was finally selected for in-sample simulation and out-of-sample forecasting. Descriptive statistics showed that the number of pertussis cases reported by China CDC increased by 66.20% from 2005 (4058 cases) to 2015 (6744 cases). According to the Hodrick-Prescott filter, there was an apparent cyclicity and seasonality in the pertussis reports. In out-of-sample forecasting, the model forecasted a relatively high number of cases in 2016, which indicates an increasing risk of ongoing pertussis resurgence in the near future. In this regard, the ETS model would be a useful tool for simulating and forecasting the incidence of pertussis, and for helping decision makers take efficient decisions based on advance warning of disease incidence.
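
    The RMSE-based model choice can be illustrated with a minimal pure-Python sketch, using simple exponential smoothing as a stand-in for the fitted ETS/ARIMA candidates; the monthly series below is synthetic.

```python
import math

# Hypothetical monthly case counts with trend and 12-month seasonality.
series = [400 + 10 * t + 150 * math.sin(2 * math.pi * t / 12) for t in range(96)]

def ses_forecast(y, alpha):
    """Simple exponential smoothing one-step-ahead forecasts."""
    level, preds = y[0], []
    for obs in y[1:]:
        preds.append(level)                     # forecast before seeing obs
        level = alpha * obs + (1 - alpha) * level
    return preds

def rmse(pred, actual):
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(pred))

# Select the smoothing constant with minimum in-sample RMSE, mirroring
# the paper's RMSE-based choice between candidate models.
best_alpha = min((rmse(ses_forecast(series, a), series[1:]), a)
                 for a in (0.2, 0.5, 0.8))[1]
```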

  12. Replacement Beef Cow Valuation under Data Availability Constraints

    PubMed Central

    Hagerman, Amy D.; Thompson, Jada M.; Ham, Charlotte; Johnson, Kamina K.

    2017-01-01

    Economists are often tasked with estimating the benefits or costs associated with livestock production losses; however, lack of available data or absence of consistent reporting can reduce the accuracy of these valuations. This work looks at three potential estimation techniques for determining the value of replacement beef cows under varying levels of market data availability, used to proxy constrained data, and discusses the potential margin of error for each technique. Oklahoma bred replacement cows are valued using hedonic pricing based on Oklahoma bred cow data (a best-case scenario), vector error correction modeling (VECM) based on national cow sales data, and cost of production (COP) based on just a representative enterprise budget and very limited sales data. Each method was then used to perform a within-sample forecast for January to December 2016, and the forecasts are compared with the 2016 monthly observed market prices in Oklahoma using the mean absolute percent error (MAPE). Hedonic pricing methods tend to overvalue in within-sample forecasting but performed best, as measured by MAPE, for high quality cows. The VECM tended to undervalue cows but performed best for younger animals. COP performed well compared with the more data-intensive methods. Examining each method individually across eight representative replacement beef female types, the VECM forecast resulted in a MAPE under 10% for 33% of forecasted months, followed by hedonic pricing at 24% of the forecasted months and COP at 14% of the forecasted months for average quality beef females. For high quality females, the hedonic pricing method worked best, producing a MAPE under 10% in 36% of the forecasted months, followed by the COP method at 21% of months and the VECM at 14% of the forecasted months. 
These results suggested that livestock valuation method selection was not one-size-fits-all and may need to vary based not only on the data available but also on the characteristics (e.g., quality or age) of the livestock being valued. PMID:29164141
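
    For reference, the MAPE metric used throughout the comparison is straightforward to compute; the prices below are hypothetical.

```python
def mape(forecast, actual):
    """Mean absolute percent error between forecasts and observed values."""
    return 100.0 * sum(abs((f - a) / a)
                       for f, a in zip(forecast, actual)) / len(actual)

observed  = [1450.0, 1500.0, 1480.0]   # hypothetical monthly cow prices ($)
predicted = [1400.0, 1560.0, 1495.0]
error = mape(predicted, observed)       # about 2.82 (percent)
```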

  13. Study on Electricity Business Expansion and Electricity Sales Based on Seasonal Adjustment

    NASA Astrophysics Data System (ADS)

    Zhang, Yumin; Han, Xueshan; Wang, Yong; Zhang, Li; Yang, Guangsen; Sun, Donglei; Wang, Bolun

    2017-05-01

    Reference [1] proposed a novel analysis and forecast method for electricity business expansion based on seasonal adjustment; we extend this work to include effects from the micro and macro aspects, respectively. From the micro aspect, we introduce the concept of load factor to forecast the stable value of electricity consumption of a single new consumer after the installation of new high-voltage transformer capacity. From the macro aspect, since the growth of business expansion is also stimulated by the growth of electricity sales, it is necessary to analyse the antecedent relationship between business expansion and electricity sales. First, we forecast the electricity consumption of the customer group and the release rules of expanding capacity, respectively. Second, we contrast the goodness of fit and prediction accuracy to identify the antecedent relationship and analyse its cause; this also serves as a contrast for observing the influence of customer groups of different ranges on prediction precision. Finally, simulation results indicate that the proposed method is accurate and helps determine the value of expanding capacity and electricity consumption.

  14. Day-Ahead Short-Term Forecasting Electricity Load via Approximation

    NASA Astrophysics Data System (ADS)

    Khamitov, R. N.; Gritsay, A. S.; Tyunkov, D. A.; E Sinitsin, G.

    2017-04-01

    A method for short-term forecasting of power consumption is proposed. The model is based on a sinusoidal function describing the day and night cycles of power consumption. The function's coefficients, the period and the amplitude, are tuned adaptively, tracking the dynamics of power consumption with an artificial neural network. The presented results are tested on real retrospective data from a power supply company. The method can be especially useful when there is no opportunity to collect interval readings from consumers' metering devices and the power supply company operates with electricity supply points. The method can be used by any power supply company purchasing electric power in the wholesale market. For this purpose, it is necessary to obtain the coefficients of the sinusoidal approximation and to have retrospective power consumption data covering an interval of at least one year.
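
    The sinusoidal approximation step (without the neural-network adaptation of the coefficients) can be sketched as a least-squares fit of offset, amplitude, and phase; the hourly load data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical hourly load: base level + daily sinusoid + noise.
t = np.arange(24 * 14, dtype=float)                   # two weeks of hours
load = 500 + 80 * np.sin(2 * np.pi * t / 24 - 1.0) + rng.normal(0, 5, t.size)

# Fit load ~ c0 + c1*sin(wt) + c2*cos(wt); this is linear in (c0, c1, c2),
# and amplitude/phase follow from (c1, c2).
w = 2 * np.pi / 24
A = np.column_stack([np.ones_like(t), np.sin(w * t), np.cos(w * t)])
c, *_ = np.linalg.lstsq(A, load, rcond=None)
amplitude = np.hypot(c[1], c[2])                      # close to the true 80
```

    In the paper's scheme, a neural network would then adapt these coefficients over time instead of keeping them fixed.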

  15. Forecasting Space Weather-Induced GPS Performance Degradation Using Random Forest

    NASA Astrophysics Data System (ADS)

    Filjar, R.; Filic, M.; Milinkovic, F.

    2017-12-01

    Space weather and ionospheric dynamics have a profound effect on the positioning performance of Global Navigation Satellite Systems (GNSS). However, the quantification of that effect is still the subject of scientific activities around the world. In the latest contribution to the understanding of space weather and ionospheric effects on satellite-based positioning performance, we conducted a study of several candidate methods for forecasting space weather-induced GPS positioning performance deterioration. First, a 5-day set of experimentally collected data was established, encompassing space weather and ionospheric activity indices (including the readings of Sudden Ionospheric Disturbance (SID) monitors, components of geomagnetic field strength, the global Kp index, the Dst index, GPS-derived Total Electron Content (TEC) samples, the standard deviation of TEC samples, and sunspot number) and observations of GPS positioning error components (northing, easting, and height positioning error) derived from the Adriatic Sea IGS reference stations' RINEX raw pseudorange files in quiet space weather periods. This data set was split into training and test sub-sets. Then, a selected set of supervised machine learning methods based on Random Forest was applied to the experimentally collected data set in order to establish appropriate regional (Adriatic Sea) forecasting models for space weather-induced GPS positioning performance deterioration. The forecasting models were developed in the R/rattle statistical programming environment. The forecasting quality of the regional models was assessed, and conclusions were drawn on the advantages and shortcomings of regional forecasting models for space weather-caused GNSS positioning performance deterioration.

  16. Final Technical Report for Contract No. DE-EE0006332, "Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cormier, Dallas; Edra, Sherwin; Espinoza, Michael

    This project will enable utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real-time simulation of distributed power generation within utility grids, with the emphasis on potential applications in day-ahead (market) and real-time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing Load and Generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers, but extends across many core operations. The benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and have had immediate direct operational cost savings for Energy Marketing for Day Ahead generation commitments, Real Time Operations, Load Forecasting (at an aggregate system level for Day Ahead), Demand Response, Long term Planning (asset management), Distribution Operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  17. The effect of real-time pricing on load shifting in a highly renewable power system dominated by generation from the renewable sources of wind and photovoltaics

    NASA Astrophysics Data System (ADS)

    Kies, Alexander; Brown, Tom; Schlachtberger, David; Schramm, Stefan

    2017-04-01

    The supply-demand imbalance is a major concern in power systems with large shares of highly variable renewable generation from sources like wind and photovoltaics (PV). Apart from measures on the generation side, such as flexible backup generation or energy storage, sector coupling and demand side management are the most likely options to counter imbalances and thereby ease the integration of renewable generation. Demand side management usually refers to load shifting, i.e., the reaction of electricity consumers to price fluctuations. In this work, we derive a novel methodology to model the interplay of load shifting and the incentives provided via real-time pricing in highly renewable power systems. We use weather data to simulate generation from the renewable sources of wind and photovoltaics, as well as historical load data, split into different consumption categories such as heating, cooling, and domestic use, to model a simplified power system. Together with renewable power forecast data, a simple market model and approaches to incorporate sector coupling [1] and load shifting [2,3], we model the interplay of incentives and load shifting for different scenarios (e.g., in dependence on the risk-aversion of consumers or the forecast horizon) and demonstrate the practical benefits of load shifting. First, we introduce the novel methodology and compare it with existing approaches. Secondly, we show results of numerical simulations on the effects of load shifting: it supports the integration of PV power by providing storage whose characteristics can be described as "daily" and which provides a significant amount of balancing potential. Lastly, we propose an experimental setup to obtain empirical data on end-consumer load-shifting behaviour in response to price incentives. References [1] Brown, T., Schlachtberger, D., Kies, A., Greiner, M., Sector coupling in a highly renewable European energy system, Proc. of the 15th International Workshop on Large-Scale Integration of Wind Power into Power Systems as well as on Transmission Networks for Offshore Wind Power Plants, Vienna, Austria, 15-17 November 2016 [2] Kleinhans, D., Towards a systematic characterization of the potential of demand side management, arXiv preprint arXiv:1401.4121, 2014 [3] Kies, A., Schyska, B. U., von Bremen, L., The Demand Side Management Potential to Balance a Highly Renewable European Power System, Energies, 9(11), 955, 2016
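The price-response mechanism described above can be illustrated with a deliberately simple sketch (not the authors' model): flexible load moves greedily toward the cheapest hour reachable within a shifting window, conserving total energy while lowering the consumer's bill. All prices, loads, and the `flexible_share`/`window` parameters are invented for illustration.

```python
# Toy price-driven load shifting: a fraction of each hour's flexible load
# may move to the cheapest hour within +/- `window` hours.

def shift_load(load, price, flexible_share=0.3, window=3):
    """Greedily move flexible load toward locally cheapest hours."""
    shifted = list(load)
    n = len(load)
    for t in range(n):
        lo, hi = max(0, t - window), min(n, t + window + 1)
        best = min(range(lo, hi), key=lambda h: price[h])  # cheapest reachable hour
        if price[best] < price[t]:
            movable = flexible_share * load[t]
            shifted[t] -= movable
            shifted[best] += movable
    return shifted

price = [30, 28, 25, 40, 55, 60, 45, 35]   # EUR/MWh, invented
load = [10, 10, 10, 12, 15, 16, 13, 11]    # MWh, invented

new_load = shift_load(load, price)
cost_before = sum(l * p for l, p in zip(load, price))
cost_after = sum(l * p for l, p in zip(new_load, price))
```

Energy is conserved by construction; only its timing changes, which is the "daily storage" character the abstract describes.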

  18. An empirical investigation on the forecasting ability of mallows model averaging in a macro economic environment

    NASA Astrophysics Data System (ADS)

    Yin, Yip Chee; Hock-Eam, Lim

    2012-09-01

    This paper investigates the forecasting ability of Mallows Model Averaging (MMA) by conducting an empirical analysis of the GDP growth rates of five Asian countries: Malaysia, Thailand, the Philippines, Indonesia and China. Results reveal that MMA has no noticeable difference in predictive ability compared to the general autoregressive fractional integrated moving average (ARFIMA) model, and that its predictive ability is sensitive to the effect of financial crises. MMA could be an alternative forecasting method for samples without recent outliers such as financial crises.
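As a rough illustration of how Mallows Model Averaging assigns weights, the sketch below applies Hansen's Mallows criterion (sum of squared residuals plus a 2σ²k(w) penalty) to a two-model average of AR(1) and AR(2) fits on simulated growth-rate data; the data-generating process and the grid search are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
y = np.zeros(n)
for t in range(2, n):                      # simulate an AR(2) growth-rate series
    y[t] = 0.5 * y[t-1] - 0.2 * y[t-2] + rng.normal()

maxp = 2
Y = y[maxp:]                               # common estimation sample

def ar_fitted(p):
    """In-sample OLS-fitted values of an AR(p) model on the common sample."""
    X = np.column_stack([y[maxp - j:n - j] for j in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return X @ beta

fits = [ar_fitted(1), ar_fitted(2)]
ks = [1, 2]                                # parameter counts of the candidates
sigma2 = np.mean((Y - fits[-1]) ** 2)      # error variance from the largest model

def mallows(w):
    """Mallows criterion C(w) = SSR(w) + 2*sigma2*k(w) for weights (w, 1-w)."""
    yhat = w * fits[0] + (1 - w) * fits[1]
    k = w * ks[0] + (1 - w) * ks[1]
    return np.sum((Y - yhat) ** 2) + 2 * sigma2 * k

w_star = min(np.linspace(0, 1, 101), key=mallows)
mma_fit = w_star * fits[0] + (1 - w_star) * fits[1]
```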

  19. Synoptic scale forecast skill and systematic errors in the MASS 2.0 model. [Mesoscale Atmospheric Simulation System

    NASA Technical Reports Server (NTRS)

    Koch, S. E.; Skillman, W. C.; Kocin, P. J.; Wetzel, P. J.; Brill, K. F.

    1985-01-01

    The synoptic scale performance characteristics of MASS 2.0 are determined by comparing filtered 12-24 hr model forecasts to same-case forecasts made by the National Meteorological Center's synoptic-scale Limited-area Fine Mesh model. Characteristics of the two systems are contrasted, and the analysis methodology used to determine statistical skill scores and systematic errors is described. The overall relative performance of the two models in the sample is documented, and important systematic errors uncovered are presented.

  20. Analysis of vibrational load influence upon passengers in trains with a compulsory body tilt

    NASA Astrophysics Data System (ADS)

    Antipin, D. Ya; Kobishchanov, V. V.; Lapshin, V. F.; Mitrakov, A. S.; Shorokhov, S. G.

    2017-02-01

    A procedure is offered for forecasting the vibrational load influence upon passengers of rolling stock equipped with a system of compulsory body tilt on railroad curves. The procedure is based on computer simulation methods and the application of solid-state models of anthropometric mannequins. As a result of the investigations carried out, criteria for estimating the comfort level of passengers in the rolling stock under consideration are substantiated. The procedure is validated using the example of a promising domestic rolling stock with compulsory body tilt on railroad curves.

  1. 7 CFR 1710.302 - Financial forecasts-power supply borrowers.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... facilities; (3) Provide an in-depth analysis of the regional markets for power if loan feasibility depends to any degree on a borrower's ability to sell surplus power while its system loads grow to meet the... sensitivity analysis if required by RUS pursuant to § 1710.300(d)(5). (e) The projections shall be coordinated...

  2. Stochastic simulation of predictive space–time scenarios of wind speed using observations and physical model outputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bessac, Julie; Constantinescu, Emil; Anitescu, Mihai

    We propose a statistical space-time model for predicting atmospheric wind speed based on deterministic numerical weather predictions and historical measurements. We consider a Gaussian multivariate space-time framework that combines multiple sources of past physical model outputs and measurements in order to produce a probabilistic wind speed forecast within the prediction window. We illustrate this strategy on wind speed forecasts during several months in 2012 for a region near the Great Lakes in the United States. The results show that the prediction is improved in the mean-squared sense relative to the numerical forecasts as well as in probabilistic scores. Moreover, the samples are shown to produce realistic wind scenarios based on sample spectra and space-time correlation structure.
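A heavily simplified univariate analogue of combining physical model output with measurement-based estimates is precision weighting of independent Gaussian sources; the numbers below are invented, and the paper's actual framework is multivariate and space-time dependent.

```python
import numpy as np

def precision_weighted(means, variances):
    """Combine independent Gaussian estimates of the same quantity:
    the combined Gaussian's precision is the sum of the precisions."""
    means = np.asarray(means, float)
    prec = 1.0 / np.asarray(variances, float)
    var = 1.0 / prec.sum()
    mean = var * (prec * means).sum()
    return mean, var

# NWP model output (8.2 m/s, sd 1.5) vs. a statistical estimate from past
# measurements (7.4 m/s, sd 1.0); both numbers invented
mean, var = precision_weighted([8.2, 7.4], [1.5**2, 1.0**2])
```

The combined variance is smaller than either source's alone, which is the sense in which blending model output with measurements sharpens the probabilistic forecast.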

  3. Stochastic simulation of predictive space–time scenarios of wind speed using observations and physical model outputs

    DOE PAGES

    Bessac, Julie; Constantinescu, Emil; Anitescu, Mihai

    2018-03-01

    We propose a statistical space-time model for predicting atmospheric wind speed based on deterministic numerical weather predictions and historical measurements. We consider a Gaussian multivariate space-time framework that combines multiple sources of past physical model outputs and measurements in order to produce a probabilistic wind speed forecast within the prediction window. We illustrate this strategy on wind speed forecasts during several months in 2012 for a region near the Great Lakes in the United States. The results show that the prediction is improved in the mean-squared sense relative to the numerical forecasts as well as in probabilistic scores. Moreover, the samples are shown to produce realistic wind scenarios based on sample spectra and space-time correlation structure.

  4. Entropy Econometrics for combining regional economic forecasts: A Data-Weighted Prior Estimator

    NASA Astrophysics Data System (ADS)

    Fernández-Vázquez, Esteban; Moreno, Blanca

    2017-10-01

    Forecast combination has been studied in econometrics for a long time, and the literature has shown the superior performance of forecast combination over individual predictions. However, there is still controversy on which is the best procedure to specify the forecast weights. This paper explores the possibility of using a procedure based on Entropy Econometrics, which allows setting the weights for the individual forecasts as a mixture of different alternatives. In particular, we examine the ability of the Data-Weighted Prior Estimator proposed by Golan (J Econom 101(1):165-193, 2001) to combine forecasting models in a context of small sample sizes, a relatively common scenario when dealing with time series for regional economies. We test the validity of the proposed approach using a simulation exercise and a real-world example that aims at predicting gross regional product growth rates for a regional economy. The forecasting performance of the proposed Data-Weighted Prior Estimator is compared with other combining methods. The simulation results indicate that in scenarios of heavily ill-conditioned datasets the approach suggested dominates other forecast combination strategies. The empirical results are consistent with the conclusions found in the numerical experiment.

  5. Age differences in affective forecasting and experienced emotion surrounding the 2008 U.S. presidential election

    PubMed Central

    Scheibe, Susanne; Mata, Rui; Carstensen, Laura L.

    2012-01-01

    In everyday life, people frequently make decisions based on tacit or explicit forecasts about the emotional consequences associated with the possible choices. We investigated age differences in such forecasts and their accuracy by surveying voters about their expected and, subsequently, their actual emotional responses to the 2008 U.S. presidential election. A sample of 762 Democratic and Republican voters aged 20 to 80 years participated in a web-based study; 346 could be re-contacted two days after the election. Older adults forecasted lower increases in high-arousal emotions (e.g., excitement after winning; anger after losing) and larger increases in low-arousal emotions (e.g., sluggishness after losing) than younger adults. Age differences in actual responses to the election were consistent with forecasts, albeit less pervasive. Additionally, among supporters of the winning candidate, but not among supporters of the losing candidate, forecasting accuracy was enhanced with age, suggesting a positivity effect in affective forecasting. These results add to emerging findings about the role of valence and arousal in emotional aging and demonstrate age differences in affective forecasting about a real-world event with an emotionally-charged outcome. PMID:21547760

  6. Potential predictability and forecast skill in ensemble climate forecast: the skill-persistence rule

    NASA Astrophysics Data System (ADS)

    Jin, Y.; Rong, X.; Liu, Z.

    2017-12-01

    This study investigates the factors that impact the forecast skill for the real world (actual skill) and perfect model (perfect skill) in ensemble climate model forecast with a series of fully coupled general circulation model forecast experiments. It is found that the actual skill of sea surface temperature (SST) in seasonal forecast is substantially higher than the perfect skill on a large part of the tropical oceans, especially the tropical Indian Ocean and the central-eastern Pacific Ocean. The higher actual skill is found to be related to the higher observational SST persistence, suggesting a skill-persistence rule: a higher SST persistence in the real world than in the model could overwhelm the model bias to produce a higher forecast skill for the real world than for the perfect model. The relation between forecast skill and persistence is further examined using a first-order autoregressive model (AR1) analytically for theoretical solutions and numerically for analogue experiments. The AR1 model study shows that the skill-persistence rule is strictly valid in the case of infinite ensemble size, but can be distorted by the sampling error and non-AR1 processes.
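The skill-persistence rule can be sketched with an AR1 toy model: a deterministic AR1 forecast at lead τ is a function of the initial state alone, so its correlation with the verification equals the real world's lag-τ autocorrelation, regardless of the (biased) persistence the forecasting model assumes. The parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def ar1(n, r):
    """Simulate a unit-variance AR1 series with lag-1 autocorrelation r."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = r * x[t-1] + np.sqrt(1 - r**2) * rng.normal()
    return x

lead = 3
r_true, r_model = 0.9, 0.7          # real world more persistent than the model
truth = ar1(20000, r_true)

# The model's deterministic forecast at lead `lead` is r_model**lead * x_t.
# Its correlation with the verification is corr(x_t, x_{t+lead}) = r_true**lead,
# independent of r_model: higher real-world persistence, higher actual skill.
fc = r_model**lead * truth[:-lead]
ver = truth[lead:]
skill = np.corrcoef(fc, ver)[0, 1]
```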

  7. On the Likely Utility of Hybrid Weights Optimized for Variances in Hybrid Error Covariance Models

    NASA Astrophysics Data System (ADS)

    Satterfield, E.; Hodyss, D.; Kuhl, D.; Bishop, C. H.

    2017-12-01

    Because of imperfections in ensemble data assimilation schemes, one cannot assume that the ensemble covariance is equal to the true error covariance of a forecast. Previous work demonstrated how information about the distribution of true error variances given an ensemble sample variance can be revealed from an archive of (observation-minus-forecast, ensemble-variance) data pairs. Here, we derive a simple and intuitively compelling formula to obtain the mean of this distribution of true error variances given an ensemble sample variance from (observation-minus-forecast, ensemble-variance) data pairs produced by a single run of a data assimilation system. This formula takes the form of a Hybrid weighted average of the climatological forecast error variance and the ensemble sample variance. Here, we test the extent to which these readily obtainable weights can be used to rapidly optimize the covariance weights used in Hybrid data assimilation systems that employ weighted averages of static covariance models and flow-dependent ensemble based covariance models. Univariate data assimilation and multi-variate cycling ensemble data assimilation are considered. In both cases, it is found that our computationally efficient formula gives Hybrid weights that closely approximate the optimal weights found through the simple but computationally expensive process of testing every plausible combination of weights.
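The hybrid-weight idea (the expected true error variance given an ensemble sample variance is a weighted average of the ensemble variance and a climatological variance) can be sketched with a toy regression of squared innovations on ensemble variances; the distributions and ensemble size below are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 50000, 8                      # forecast cases, ensemble members

true_var = rng.lognormal(mean=0.0, sigma=0.5, size=n)    # flow-dependent truth
# ensemble sample variance: chi-squared sampling error about the true variance
s2 = true_var * rng.chisquare(k - 1, size=n) / (k - 1)
# squared observation-minus-forecast innovations with the true error variance
omf2 = true_var * rng.chisquare(1, size=n)

# Regressing squared innovations on ensemble variance recovers hybrid weights:
# E[true_var | s2] ~ a * s2 + b, with the intercept b absorbing climatology.
A = np.column_stack([s2, np.ones(n)])
(a, b), *_ = np.linalg.lstsq(A, omf2, rcond=None)
clim_var = true_var.mean()
```

The fitted slope falls strictly between 0 and 1: sampling error in the small ensemble shrinks the optimal weight on the flow-dependent variance toward the static (climatological) term.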

  8. Empirical prediction intervals improve energy forecasting

    PubMed Central

    Kaack, Lynn H.; Apt, Jay; Morgan, M. Granger; McSharry, Patrick

    2017-01-01

    Hundreds of organizations and analysts use energy projections, such as those contained in the US Energy Information Administration (EIA)’s Annual Energy Outlook (AEO), for investment and policy decisions. Retrospective analyses of past AEO projections have shown that observed values can differ from the projection by several hundred percent, and thus a thorough treatment of uncertainty is essential. We evaluate the out-of-sample forecasting performance of several empirical density forecasting methods, using the continuous ranked probability score (CRPS). The analysis confirms that a Gaussian density, estimated on past forecasting errors, gives comparatively accurate uncertainty estimates over a variety of energy quantities in the AEO, in particular outperforming scenario projections provided in the AEO. We report probabilistic uncertainties for 18 core quantities of the AEO 2016 projections. Our work frames how to produce, evaluate, and rank probabilistic forecasts in this setting. We propose a log transformation of forecast errors for price projections and a modified nonparametric empirical density forecasting method. Our findings give guidance on how to evaluate and communicate uncertainty in future energy outlooks. PMID:28760997
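The CRPS used here has a closed form for a Gaussian forecast density (Gneiting & Raftery, 2007); below is a minimal stdlib-only sketch with invented example numbers.

```python
import math

def crps_gaussian(x, mu, sigma):
    """Closed-form CRPS of a Gaussian N(mu, sigma^2) forecast at outcome x
    (Gneiting & Raftery, 2007). Lower is better."""
    z = (x - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))

# A sharper density scores better when the outcome still falls near its center:
score_wide = crps_gaussian(0.2, 0.0, 2.0)
score_tight = crps_gaussian(0.2, 0.0, 0.5)
```

Averaging this score over an archive of (projection, outcome) pairs is how density forecasting methods like those in the paper can be ranked.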

  9. Evaluating the performance of infectious disease forecasts: A comparison of climate-driven and seasonal dengue forecasts for Mexico.

    PubMed

    Johansson, Michael A; Reich, Nicholas G; Hota, Aditi; Brownstein, John S; Santillana, Mauricio

    2016-09-26

    Dengue viruses, which infect millions of people per year worldwide, cause large epidemics that strain healthcare systems. Despite diverse efforts to develop forecasting tools including autoregressive time series, climate-driven statistical, and mechanistic biological models, little work has been done to understand the contribution of different components to improved prediction. We developed a framework to assess and compare dengue forecasts produced from different types of models and evaluated the performance of seasonal autoregressive models with and without climate variables for forecasting dengue incidence in Mexico. Climate data did not significantly improve the predictive power of seasonal autoregressive models. Short-term and seasonal autocorrelation were key to improving short-term and long-term forecasts, respectively. Seasonal autoregressive models captured a substantial amount of dengue variability, but better models are needed to improve dengue forecasting. This framework contributes to the sparse literature of infectious disease prediction model evaluation, using state-of-the-art validation techniques such as out-of-sample testing and comparison to an appropriate reference model.
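A minimal sketch of a seasonal autoregressive model of the kind compared here: regress incidence on lag-1 (short-term) and lag-12 (seasonal) terms by OLS and evaluate out-of-sample against a seasonal-naive reference. The synthetic data and the train/test split are assumptions for illustration, not the paper's Mexican dengue data.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 240                                    # 20 years of monthly counts (synthetic)
y = np.zeros(n)
for t in range(12, n):
    y[t] = 0.5 * y[t-1] + 0.3 * y[t-12] + rng.normal()

# design matrix: short-term (lag 1) and seasonal (lag 12) terms plus intercept
Y = y[12:]
X = np.column_stack([y[11:n-1], y[0:n-12], np.ones(n - 12)])

split = 204                                # hold out the last 24 months
beta, *_ = np.linalg.lstsq(X[:split], Y[:split], rcond=None)

pred = X[split:] @ beta                    # one-step-ahead forecasts
obs = Y[split:]
naive = X[split:, 1]                       # seasonal-naive reference: y_{t-12}
mse_model = np.mean((obs - pred) ** 2)
mse_naive = np.mean((obs - naive) ** 2)
```

Out-of-sample testing against an appropriate reference model, as above, is the evaluation pattern the framework formalizes.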

  10. Evaluating the performance of infectious disease forecasts: A comparison of climate-driven and seasonal dengue forecasts for Mexico

    PubMed Central

    Johansson, Michael A.; Reich, Nicholas G.; Hota, Aditi; Brownstein, John S.; Santillana, Mauricio

    2016-01-01

    Dengue viruses, which infect millions of people per year worldwide, cause large epidemics that strain healthcare systems. Despite diverse efforts to develop forecasting tools including autoregressive time series, climate-driven statistical, and mechanistic biological models, little work has been done to understand the contribution of different components to improved prediction. We developed a framework to assess and compare dengue forecasts produced from different types of models and evaluated the performance of seasonal autoregressive models with and without climate variables for forecasting dengue incidence in Mexico. Climate data did not significantly improve the predictive power of seasonal autoregressive models. Short-term and seasonal autocorrelation were key to improving short-term and long-term forecasts, respectively. Seasonal autoregressive models captured a substantial amount of dengue variability, but better models are needed to improve dengue forecasting. This framework contributes to the sparse literature of infectious disease prediction model evaluation, using state-of-the-art validation techniques such as out-of-sample testing and comparison to an appropriate reference model. PMID:27665707

  11. Asymmetric affective forecasting errors and their correlation with subjective well-being

    PubMed Central

    2018-01-01

    Aims Social scientists have postulated that the discrepancy between achievements and expectations affects individuals' subjective well-being. Still, little has been done to qualify and quantify such a psychological effect. Our empirical analysis assesses the consequences of positive and negative affective forecasting errors—the difference between realized and expected subjective well-being—on the subsequent level of subjective well-being. Data We use longitudinal data on a representative sample of 13,431 individuals from the German Socio-Economic Panel. In our sample, 52% of individuals are females, average age is 43 years, average years of education is 11.4 and 27% of our sample lives in East Germany. Subjective well-being (measured by self-reported life satisfaction) is assessed on a 0–10 discrete scale and its sample average is equal to 6.75 points. Methods We develop a simple theoretical framework to assess the consequences of positive and negative affective forecasting errors—the difference between realized and expected subjective well-being—on the subsequent level of subjective well-being, properly accounting for the endogenous adjustment of expectations to positive and negative affective forecasting errors, and use it to derive testable predictions. Given the theoretical framework, we estimate two panel-data equations, the first depicting the association between positive and negative affective forecasting errors and the successive level of subjective well-being and the second describing the correlation between subjective well-being expectations for the future and hedonic failures and successes. Our models control for individual fixed effects and a large battery of time-varying demographic characteristics, health and socio-economic status. Results and conclusions While surpassing expectations is uncorrelated with subjective well-being, failing to match expectations is negatively associated with subsequent realizations of subjective well-being. 
Expectations are positively (negatively) correlated to positive (negative) forecasting errors. We speculate that in the first case the positive adjustment in expectations is strong enough to cancel out the potential positive effects on subjective well-being of beaten expectations, while in the second case it is not, and individuals persistently bear the negative emotional consequences of not achieving expectations. PMID:29513685

  12. STS-114: Discovery L-2 Countdown Status Briefing

    NASA Technical Reports Server (NTRS)

    2005-01-01

    George Diller of NASA Public Affairs hosted this briefing. Pete Nickolenko, NASA Test Director; Scott Higgenbotham, STS-114 Payload Mission Manager; and Cathy Winters, Shuttle Weather Officer, were present. Pete reported that his team had completed the avionics system checkouts, that servicing of the cryogenic tanks would take about seven hours that day, and that engine system checks and pad closeouts would be performed in the evening. Pete also summarized other standard closeout activities: checkouts of the Orbiter and ground communications network, rotating service structure retraction, and external tank loading (ETL). Pete reported that the mission would last 12 days with two weather contingency days, with end-of-mission landing scheduled at Kennedy Space Center (KSC) at approximately 11:00 in the morning, Eastern time, on July 25th. Scott briefly reported that all hardware was on board Discovery, closed out, and ready to fly. Cathy reported that hurricane Dennis had moved to the north, leaving a favorable outlook for launch. She mentioned a new hurricane looming that would be named Emily, noted some crosswinds that would migrate to the west, and gave a 30% probability of weather prohibiting launch. Cathy further gave the current weather forecast, supported with charts: the Launch Forecast, Tanking Forecast, SRB (Shuttle Solid Rocket Booster) Forecast, and CONUS and TAL Launch Sites Forecast, along with the 24-hour and 48-hour turnaround plans. Launch constraints, weather, crosswinds, cloud cover, the ground imagery system, launch countdown, launch crews, mission management simulations, and launch team simulations were topics covered with the news media.

  13. The assisted prediction modelling frame with hybridisation and ensemble for business risk forecasting and an implementation

    NASA Astrophysics Data System (ADS)

    Li, Hui; Hong, Lu-Yao; Zhou, Qing; Yu, Hai-Jie

    2015-08-01

    The business failure of numerous companies results in financial crises. The high social costs associated with such crises have led people to search for effective tools for business risk prediction, among which the support vector machine is very effective. Several modelling means, including single-technique modelling, hybrid modelling, and ensemble modelling, have been suggested for forecasting business risk with support vector machine. However, the existing literature seldom focuses on a general modelling frame for business risk prediction, and seldom investigates performance differences among different modelling means. We reviewed research on forecasting business risk with support vector machine, proposed the general assisted prediction modelling frame with hybridisation and ensemble (APMF-WHAE), and finally investigated the use of principal components analysis, support vector machine, random sampling, and group decision under the general frame in forecasting business risk. Under the APMF-WHAE frame with support vector machine as the base predictive model, four specific predictive models were produced, namely: a pure support vector machine; a hybrid support vector machine involving principal components analysis; a support vector machine ensemble involving random sampling and group decision; and an ensemble of hybrid support vector machines using group decision to integrate various hybrid support vector machines built on variables produced from principal components analysis and samples from random sampling. The experimental results indicate that the hybrid support vector machine and the ensemble of hybrid support vector machines produced dominating performance compared with the pure support vector machine and the support vector machine ensemble.
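A toy version of the hybrid-plus-ensemble pipeline (PCA for hybridisation; bootstrap resampling with a group-decision majority vote for the ensemble) can be sketched as follows. A nearest-centroid rule stands in for the support vector machine, and the synthetic two-class "financial ratio" data are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n, d = 200, 10
X0 = rng.normal(0.0, 1.0, (n, d))          # healthy firms (synthetic ratios)
X1 = rng.normal(0.8, 1.0, (n, d))          # at-risk firms (synthetic ratios)
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

# hybridisation step: PCA via SVD, keeping the three leading components
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:3].T

def centroid_predict(Ztr, ytr, Zte):
    """Nearest-centroid base learner (a stand-in for SVM in this sketch)."""
    c0 = Ztr[ytr == 0].mean(axis=0)
    c1 = Ztr[ytr == 1].mean(axis=0)
    d0 = ((Zte - c0) ** 2).sum(axis=1)
    d1 = ((Zte - c1) ** 2).sum(axis=1)
    return (d1 < d0).astype(int)

# ensemble step: random sampling of cases + group decision (majority vote)
votes = []
for _ in range(15):
    idx = rng.integers(0, len(y), len(y))  # bootstrap sample of training cases
    votes.append(centroid_predict(Z[idx], y[idx], Z))
pred = (np.mean(votes, axis=0) > 0.5).astype(int)
accuracy = (pred == y).mean()
```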

  14. A multivariate time series approach to modeling and forecasting demand in the emergency department.

    PubMed

    Jones, Spencer S; Evans, R Scott; Allen, Todd L; Thomas, Alun; Haug, Peter J; Welch, Shari J; Snow, Gregory L

    2009-02-01

    The goals of this investigation were to study the temporal relationships between the demands for key resources in the emergency department (ED) and the inpatient hospital, and to develop multivariate forecasting models. Hourly data were collected from three diverse hospitals for the year 2006. Descriptive analysis and model fitting were carried out using graphical and multivariate time series methods. Multivariate models were compared to a univariate benchmark model in terms of their ability to provide out-of-sample forecasts of ED census and the demands for diagnostic resources. Descriptive analyses revealed little temporal interaction between the demand for inpatient resources and the demand for ED resources at the facilities considered. Multivariate models provided more accurate forecasts of ED census and of the demands for diagnostic resources. Our results suggest that multivariate time series models can be used to reliably forecast ED patient census; however, forecasts of the demands for diagnostic resources were not sufficiently reliable to be useful in the clinical setting.
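A minimal multivariate (VAR(1)) sketch of the approach: fit the lag matrix by least squares on two coupled synthetic series and produce a one-step-ahead forecast. The coefficient matrix is invented, not an estimate from ED data.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 600
A = np.array([[0.6, 0.2],          # invented cross-coupling, e.g. ED census
              [0.3, 0.5]])         # and diagnostic (lab/imaging) order volume
x = np.zeros((n, 2))
for t in range(1, n):
    x[t] = A @ x[t-1] + rng.normal(size=2)

# fit the VAR(1) by least squares: Y ~ X @ Ahat, so Ahat estimates A.T
X, Y = x[:-1], x[1:]
Ahat, *_ = np.linalg.lstsq(X, Y, rcond=None)
forecast = x[-1] @ Ahat            # one-step-ahead multivariate forecast
```

The off-diagonal entries of `Ahat` are what quantify the temporal interaction between the two demand streams; when they are near zero, as the descriptive analyses in this paper found, multivariate and univariate forecasts converge.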

  15. Forecasting future phosphorus export to the Laurentian Great Lakes from land-derived nutrient inputs

    NASA Astrophysics Data System (ADS)

    LaBeau, M. B.; Robertson, D. M.; Mayer, A. S.; Pijanowski, B. C.

    2011-12-01

    Anthropogenic use of the land through agricultural and urban activities has significantly increased phosphorus loading to rivers that flow to the Great Lakes. Phosphorus (P) is a critical element in the eutrophication of freshwater ecosystems, most notably the Great Lakes. To better understand the factors influencing phosphorus delivery to aquatic systems, and thus its potential harmful effects on lake ecosystems, models that predict P export should account for changes in anthropogenic activities. Land-derived P from high-yielding sources, such as agriculture and urban areas, affects eutrophication at various scales (e.g., from specific bays to all of Lake Erie). SPARROW (SPAtially Referenced Regression On Watershed attributes) is a spatially explicit watershed model that has been used to understand linkages between land-derived sources and nutrient transport to the Great Lakes. The Great Lakes region is expected to experience a doubling of urbanized areas along with a ten percent increase in agricultural use over the next 40 years, which is likely to increase P loading. To determine how these changes will impact P loading, SPARROW models have been developed that relate changes in land use to changes in nutrient sources, including relationships between row-crop acreage and fertilizer intensity, and between urban land use and point-source intensity. We used land use projections from the Land Transformation Model, a spatially explicit, neural-net-based land change model. Land use patterns from the present to 2040 were used as input into HydroSPARROW, a forecasting tool that enables SPARROW to simulate the effects of various land-use and climate scenarios. Consequently, this work focuses on understanding how specific agricultural and urbanization activities affect P loading in the watersheds of the Laurentian Great Lakes, in order to identify strategies that could reduce the extent and severity of future eutrophication.

  16. Forecasting stock return volatility: A comparison between the roles of short-term and long-term leverage effects

    NASA Astrophysics Data System (ADS)

    Pan, Zhiyuan; Liu, Li

    2018-02-01

    In this paper, we extend the GARCH-MIDAS model proposed by Engle et al. (2013) to account for the leverage effect in short-term and long-term volatility components. Our in-sample evidence suggests that both short-term and long-term negative returns can cause higher future volatility than positive returns. Out-of-sample results show that the predictive ability of GARCH-MIDAS is significantly improved after taking the leverage effect into account. The leverage effect for short-term volatility component plays more important role than the leverage effect for long-term volatility component in affecting out-of-sample forecasting performance.
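The leverage effect in the short-term component can be sketched with a GJR-GARCH-style variance recursion, in which negative returns receive an extra loading γ; the parameter values here are invented, and this is the plain GARCH analogue rather than the paper's GARCH-MIDAS specification.

```python
import numpy as np

def gjr_garch_variance(returns, omega=0.05, alpha=0.05, gamma=0.10, beta=0.85):
    """Conditional-variance recursion with a leverage term (GJR-GARCH style):
    negative returns add `gamma` on top of `alpha`, raising next-period variance."""
    h = np.empty(len(returns))
    h[0] = np.var(returns)
    for t in range(1, len(returns)):
        r = returns[t-1]
        h[t] = omega + (alpha + gamma * (r < 0)) * r**2 + beta * h[t-1]
    return h

rng = np.random.default_rng(6)
r = rng.normal(0, 1, 500)
r[100] = -5.0          # a large negative shock...
r[300] = 5.0           # ...and an equally large positive one
h = gjr_garch_variance(r)
```

Comparing the variance response after the two shocks shows the asymmetry the abstract describes: equally sized negative returns cause higher future volatility than positive ones.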

  17. Survival Sales Forecasting.

    ERIC Educational Resources Information Center

    Paradiso, James; Stair, Kenneth

    Intended to provide insight into the dynamics of demand analysis, this paper presents an eight-step method for forecasting sales. Focusing on the sales levels that must be achieved to enjoy targeted profits, rather than the usual approach of emphasizing how much will be sold within a given period, a sample situation is provided to illustrate this…

  18. The NWRA Classification Infrastructure: description and extension to the Discriminant Analysis Flare Forecasting System (DAFFS)

    NASA Astrophysics Data System (ADS)

    Leka, K. D.; Barnes, Graham; Wagner, Eric

    2018-04-01

    A classification infrastructure built upon Discriminant Analysis (DA) has been developed at NorthWest Research Associates for examining the statistical differences between samples of two known populations. Originating to examine the physical differences between flare-quiet and flare-imminent solar active regions, we describe herein some details of the infrastructure including: parametrization of large datasets, schemes for handling "null" and "bad" data in multi-parameter analysis, application of non-parametric multi-dimensional DA, an extension through Bayes' theorem to probabilistic classification, and methods invoked for evaluating classifier success. The classifier infrastructure is applicable to a wide range of scientific questions in solar physics. We demonstrate its application to the question of distinguishing flare-imminent from flare-quiet solar active regions, updating results from the original publications that were based on different data and much smaller sample sizes. Finally, as a demonstration of "Research to Operations" efforts in the space-weather forecasting context, we present the Discriminant Analysis Flare Forecasting System (DAFFS), a near-real-time operationally-running solar flare forecasting tool that was developed from the research-directed infrastructure.
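A miniature version of probabilistic classification via discriminant analysis and Bayes' theorem can be sketched with pooled-covariance linear DA on invented two-parameter samples; this is far simpler than the non-parametric multi-dimensional DA the infrastructure implements.

```python
import numpy as np

rng = np.random.default_rng(7)
# two known populations, e.g. flare-quiet vs. flare-imminent parametrizations
quiet = rng.normal(0.0, 1.0, (500, 2))
immin = rng.normal(1.5, 1.0, (300, 2))

def fit_lda(a, b):
    """Pooled-covariance linear discriminant analysis for two populations."""
    mu_a, mu_b = a.mean(axis=0), b.mean(axis=0)
    cov = np.cov(np.vstack([a - mu_a, b - mu_b]).T)
    prior_a = len(a) / (len(a) + len(b))   # sample-frequency prior
    return mu_a, mu_b, np.linalg.inv(cov), prior_a

def prob_b(x, mu_a, mu_b, icov, prior_a):
    """Posterior probability of population b (e.g. flare-imminent) via Bayes."""
    def loglik(mu):
        d = x - mu
        return -0.5 * d @ icov @ d
    pa = prior_a * np.exp(loglik(mu_a))
    pb = (1 - prior_a) * np.exp(loglik(mu_b))
    return pb / (pa + pb)

params = fit_lda(quiet, immin)
p_far = prob_b(np.array([3.0, 3.0]), *params)   # deep in the imminent class
p_near = prob_b(np.array([0.0, 0.0]), *params)  # deep in the quiet class
```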

  19. Species-area relationships and extinction forecasts.

    PubMed

    Halley, John M; Sgardeli, Vasiliki; Monokrousos, Nikolaos

    2013-05-01

    The species-area relationship (SAR) predicts that smaller areas contain fewer species. This is the basis of the SAR method that has been used to forecast large numbers of species committed to extinction every year due to deforestation. The method has a number of issues that must be handled with care to avoid error. These include the functional form of the SAR, the choice of equation parameters, the sampling procedure used, extinction debt, and forest regeneration. Concerns about the accuracy of the SAR technique often cite errors not much larger than the natural scatter of the SAR itself. Such errors do not undermine the credibility of forecasts predicting large numbers of extinctions, although they may be a serious obstacle in other SAR applications. Very large errors can arise from misinterpretation of extinction debt, inappropriate functional form, and ignoring forest regeneration. Major challenges remain to understand better the relationship between sampling protocol and the functional form of SARs and the dynamics of relaxation, especially in continental areas, and to widen the testing of extinction forecasts. © 2013 New York Academy of Sciences.
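Under a power-law SAR, S = cA^z, the fraction of species committed to extinction when habitat shrinks is 1 − (A_new/A_old)^z; a one-liner makes the arithmetic concrete (z = 0.25 is an illustrative value chosen here, within the commonly cited range).

```python
def sar_extinction_fraction(area_ratio, z=0.25):
    """Fraction of species committed to extinction when habitat shrinks to
    `area_ratio` of its original extent, under the power-law SAR S = c * A**z."""
    return 1.0 - area_ratio ** z

# losing half the habitat with z = 0.25 commits roughly 16% of species
loss = sar_extinction_fraction(0.5)
```

Note that this gives species *committed* to extinction, not yet extinct: the extinction debt and regeneration caveats discussed above concern exactly the gap between this number and realized losses.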

  20. Forecasting productivity in forest fire suppression operations: A methodological approach based on suppression difficulty analysis and documented experience

    Treesearch

    Francisco Rodríguez y Silva; Armando González-Cabán

    2013-01-01

    The abandonment of land, the high energy load generated and accumulated by vegetation covers, climate change and interface scenarios in Mediterranean forest ecosystems are demanding serious attention to forest fire conditions. This is particularly true when dealing with the budget requirements for undertaking protection programs related to the state of current and...

  1. C/NOFS remote sensing of ionospheric reflectance

    NASA Astrophysics Data System (ADS)

    Burke, W. J.; Pfaff, R. F.; Martinis, C. R.; Gentile, L. C.

    2016-05-01

    Alfvén waves play critical roles in the electrodynamic coupling of plasmas at magnetically conjugate regions in near-Earth space. Associated electric (E*) and magnetic (δB*) field perturbations sampled by sensors on satellites in low-Earth orbits are generally superpositions of incident and reflected waves. However, lack of knowledge about ionospheric reflection coefficients (α) hinders understanding of generator outputs and load absorption of Alfvén wave energies. Here we demonstrate a new method for estimating α using satellite measurements of ambient E* and δB* then apply it to a case in which the Communication/Navigation Outage Forecasting System (C/NOFS) satellite flew conjugate to the field of view of a 630.0 nm all-sky imager at El Leoncito, Argentina, while medium-scale traveling ionosphere disturbances were detected in its field of view. In regions of relatively large amplitudes of E* and δB*, calculated α values ranged between 0.67 and 0.88. This implies that due to impedance mismatches, the generator ionosphere puts out significantly more electromagnetic energy than the load can absorb. Our analysis also uncovered caveats concerning the method's range of applicability in regions of low E* and δB*. The method can be validated in future satellite-based auroral studies where energetic particle precipitation fluxes can be used to make independent estimates of α.

  2. C/NOFS Remote Sensing of Ionospheric Reflectance

    NASA Technical Reports Server (NTRS)

    Burke, W. J.; Pfaff, Robert F.; Martinis, C. R.; Gentile, L. C.

    2016-01-01

    Alfvén waves play critical roles in the electrodynamic coupling of plasmas at magnetically conjugate regions in near-Earth space. Associated electric (E*) and magnetic (δB*) field perturbations sampled by sensors on satellites in low-Earth orbits are generally superpositions of incident and reflected waves. However, lack of knowledge about ionospheric reflection coefficients (α) hinders understanding of generator outputs and load absorption of Alfvén wave energies. Here we demonstrate a new method for estimating α using satellite measurements of ambient E* and δB*, then apply it to a case in which the Communication/Navigation Outage Forecasting System (C/NOFS) satellite flew conjugate to the field of view of a 630.0 nm all-sky imager at El Leoncito, Argentina, while medium-scale traveling ionosphere disturbances were detected in its field of view. In regions of relatively large amplitudes of E* and δB*, calculated α values ranged between 0.67 and 0.88. This implies that due to impedance mismatches, the generator ionosphere puts out significantly more electromagnetic energy than the load can absorb. Our analysis also uncovered caveats concerning the method's range of applicability in regions of low E* and δB*. The method can be validated in future satellite-based auroral studies where energetic particle precipitation fluxes can be used to make independent estimates of α.

  3. Evaluating the Impacts of Real-Time Pricing on the Cost and Value of Wind Generation

    DOE PAGES

    Sioshansi, Ramteen

    2010-05-01

    One of the costs associated with integrating wind generation into a power system is the cost of redispatching the system in real-time due to day-ahead wind resource forecast errors. One possible way of reducing these redispatch costs is to introduce demand response in the form of real-time pricing (RTP), which could allow electricity demand to respond to actual real-time wind resource availability using price signals. A day-ahead unit commitment model with day-ahead wind forecasts and a real-time dispatch model with actual wind resource availability are used to estimate system operations in a high wind penetration scenario. System operations are compared to a perfect foresight benchmark, in which actual wind resource availability is known day-ahead. The results show that wind integration costs with fixed demands can be high, both due to real-time redispatch costs and lost load. It is demonstrated that introducing RTP can reduce redispatch costs and eliminate loss of load events. Finally, social surplus with wind generation and RTP is compared to a system with neither, and the results demonstrate that introducing wind and RTP into a market can result in superadditive surplus gains.
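
    A toy illustration of the mechanism described above: when actual wind falls short of its day-ahead forecast, price-responsive demand can contract enough to avoid lost load. All numbers, the linear demand response, and the scarcity-price jump are illustrative assumptions of this sketch, not the paper's model:

```python
def realtime_balance(demand, committed_thermal, wind_actual, elasticity=0.0,
                     base_price=50.0, scarcity_price=500.0):
    """Toy real-time balance. If committed supply plus actual wind falls short
    of demand, the price jumps to scarcity_price; with RTP (elasticity > 0),
    demand contracts linearly with the relative price rise, shrinking lost load.
    Returns lost load (MWh) and the clearing price ($/MWh)."""
    supply = committed_thermal + wind_actual
    if supply >= demand:
        return {"lost_load": 0.0, "price": base_price}
    # hypothetical linear demand response to the price jump
    responsive_demand = demand * (1 - elasticity * (scarcity_price - base_price) / base_price)
    lost = max(0.0, responsive_demand - supply)
    return {"lost_load": lost, "price": scarcity_price}
```

With fixed demand (elasticity 0) a 10 MWh wind shortfall becomes 10 MWh of lost load; even a small elasticity can eliminate the loss-of-load event entirely.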

  4. Prediction-based manufacturing center self-adaptive demand side energy optimization in cyber physical systems

    NASA Astrophysics Data System (ADS)

    Sun, Xinyao; Wang, Xue; Wu, Jiangwei; Liu, Youda

    2014-05-01

    Cyber-physical systems (CPS) have recently emerged as a technology that can provide promising approaches to demand side management (DSM), an important capability in industrial power systems. Meanwhile, the manufacturing center is a typical industrial power subsystem with dozens of high energy consumption devices which have complex physical dynamics. DSM, integrated with CPS, is an effective methodology for solving energy optimization problems in the manufacturing center. This paper presents a prediction-based manufacturing center self-adaptive energy optimization method for demand side management in cyber-physical systems. To gain prior knowledge of DSM operating results, a sparse Bayesian learning based componential forecasting method is introduced to predict 24-hour electric load levels for specific industrial areas in China. From these data, a pricing strategy is designed based on short-term load forecasting results. To minimize total energy costs while guaranteeing manufacturing center service quality, an adaptive demand side energy optimization algorithm is presented. The proposed scheme is tested in a machining center energy optimization experiment. An AMI sensing system is then used to measure the demand side energy consumption of the manufacturing center. Based on the data collected from the sensing system, the load prediction-based energy optimization scheme is implemented. By employing both the PSO and the CPSO methods, the problem of DSM in the manufacturing center is solved. The results of the experiment show the self-adaptive CPSO energy optimization method improves optimization by 5% compared with the traditional PSO method.
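
    For reference, a minimal textbook particle swarm optimizer of the kind invoked above. This is the generic PSO update (inertia plus cognitive and social pulls), not the paper's chaotic CPSO variant; all parameter values are conventional defaults:

```python
import random

def pso(cost, dim, n=20, iters=100, lo=-5.0, hi=5.0, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimize `cost` (a function of a dim-length list) with particle swarm
    optimization. Returns (best position, best cost)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [cost(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                # inertia + pull toward personal best + pull toward global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            f = cost(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f
```

In the paper's setting the cost function would be the total energy cost of a candidate schedule subject to service-quality constraints; here any black-box objective works.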

  5. Optimization modeling of U.S. renewable electricity deployment using local input variables

    NASA Astrophysics Data System (ADS)

    Bernstein, Adam

    For the past five years, state Renewable Portfolio Standard (RPS) laws have been a primary driver of renewable electricity (RE) deployments in the United States. However, four key trends currently developing: (i) lower natural gas prices, (ii) slower growth in electricity demand, (iii) the challenge of balancing intermittent RE within the U.S. transmission regions, and (iv) fewer economical sites for RE development, may limit the efficacy of RPS laws over the remainder of the current RPS statutes' lifetime. An outsized proportion of U.S. RE build occurs in a small number of favorable locations, increasing the effects of these variables on marginal RE capacity additions. A state-by-state analysis is necessary to study the U.S. electric sector and to generate technology-specific generation forecasts. We used LP optimization modeling similar to the National Renewable Energy Laboratory (NREL) Renewable Energy Development System (ReEDS) to forecast RE deployment across the 8 U.S. states with the largest electricity load, and found state-level RE projections to Year 2031 significantly lower than those implied in the Energy Information Administration (EIA) 2013 Annual Energy Outlook forecast. Additionally, the majority of states do not achieve their RPS targets in our forecast. Combined with the tendency of prior research and RE forecasts to focus on larger national and global scale models, we posit that further bottom-up state and local analysis is needed for more accurate policy assessment, forecasting, and ongoing revision of variables as parameter values evolve through time. Current optimization software eliminates much of the need for algorithm coding and programming, allowing for rapid model construction and updating across many customized state and local RE parameters.
Further, our results can be tested against the empirical outcomes that will be observed over the coming years, and the forecast deviation from the actuals can be attributed to discrete parameter variances.
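
    The study uses LP optimization; as a much-simplified sketch of the underlying economics (cheapest renewable sites are developed first, and an RPS target is missed once economical capacity runs out), here is a greedy merit-order allocation with hypothetical site data:

```python
def re_buildout(demand_twh, sites):
    """Fill a renewable-generation requirement from the cheapest sites first.
    sites: list of (name, cost_per_mwh, max_twh) tuples (hypothetical data).
    Returns (build plan, unmet demand); unmet > 0 means the target is missed."""
    plan, remaining = [], demand_twh
    for name, cost, cap in sorted(sites, key=lambda s: s[1]):  # merit order by cost
        take = min(cap, remaining)
        if take > 0:
            plan.append((name, take))
            remaining -= take
    return plan, remaining
```

A true LP additionally handles transmission limits, balancing constraints, and inter-site interactions, which is why the greedy picture is only a first approximation.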

  6. Product demand forecasts using wavelet kernel support vector machine and particle swarm optimization in manufacture system

    NASA Astrophysics Data System (ADS)

    Wu, Qi

    2010-03-01

    Demand forecasts play a crucial role in supply chain management. The future demand for a certain product is the basis for the respective replenishment systems. For demand series characterized by small samples, seasonality, nonlinearity, randomness and fuzziness, the existing support vector kernels cannot closely approximate the random curve of the sales time series in the quadratic continuous integral space. In this paper, we present a hybrid intelligent system combining the wavelet kernel support vector machine and particle swarm optimization for demand forecasting. The results of application to car sale series forecasting show that the forecasting approach based on the hybrid PSOWv-SVM model is effective and feasible. A comparison between the proposed method and other approaches is also given, which shows that, for the discussed example, the method outperforms the hybrid PSOv-SVM and other traditional methods.
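
    Wavelet kernels for SVMs are commonly built from the Morlet-like mother wavelet h(u) = cos(1.75u)·exp(−u²/2), taken as a product over input dimensions. A sketch of that translation-invariant kernel, assuming this standard construction (the dilation parameter a is the kind of hyperparameter PSO would tune, alongside the SVM's cost and epsilon):

```python
import math

def wavelet_kernel(x, y, a=1.0):
    """Translation-invariant wavelet kernel:
    K(x, y) = prod_i h((x_i - y_i) / a), with h(u) = cos(1.75 u) * exp(-u^2 / 2).
    a > 0 is the dilation (scale) parameter."""
    k = 1.0
    for xi, yi in zip(x, y):
        u = (xi - yi) / a
        k *= math.cos(1.75 * u) * math.exp(-u * u / 2.0)
    return k
```

Unlike the Gaussian RBF, this kernel oscillates with distance, which is the property that lets it fit seasonal, wave-like demand curves more compactly.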

  7. Forecasting urban water demand: A meta-regression analysis.

    PubMed

    Sebri, Maamar

    2016-12-01

    Water managers and planners require accurate water demand forecasts over the short-, medium- and long-term for many purposes. These range from assessing water supply needs over spatial and temporal patterns to optimizing future investments and planning future allocations across competing sectors. This study surveys the empirical literature on urban water demand forecasting using the meta-analytical approach. Specifically, using more than 600 estimates, a meta-regression analysis is conducted to identify explanations of cross-study variation in the accuracy of urban water demand forecasting. Our study finds that accuracy depends significantly on study characteristics, including demand periodicity, modeling method, forecasting horizon, model specification and sample size. The meta-regression results remain robust to the different estimators employed as well as to a series of sensitivity checks performed. The importance of these findings lies in the conclusions and implications drawn out for regulators and policymakers and for academics alike. Copyright © 2016. Published by Elsevier Ltd.
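
    At its core, meta-regression is a weighted regression of study-level effect estimates on study characteristics, conventionally with inverse-variance weights. A minimal weighted-least-squares sketch (the study's actual estimators and moderator variables differ):

```python
import numpy as np

def meta_regression(effects, variances, X):
    """Inverse-variance weighted least squares: regress study-level estimates
    (e.g. forecast accuracy) on study characteristics X (1-D or 2-D array of
    moderators). An intercept column is prepended. Returns the coefficients."""
    effects = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)   # precision weights
    W = np.diag(w)
    X = np.column_stack([np.ones(len(effects)), X])
    # solve the weighted normal equations (X'WX) beta = X'W y
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ effects)
```

With real data one would also report heteroskedasticity-robust standard errors and cluster by study, since multiple estimates often come from the same paper.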

  8. A Bayesian spatio-temporal model for forecasting Anaplasma species seroprevalence in domestic dogs within the contiguous United States.

    PubMed

    Liu, Yan; Watson, Stella C; Gettings, Jenna R; Lund, Robert B; Nordone, Shila K; Yabsley, Michael J; McMahan, Christopher S

    2017-01-01

    This paper forecasts the 2016 canine Anaplasma spp. seroprevalence in the United States from eight climate, geographic and societal factors. The forecast's construction and an assessment of its performance are described. The forecast is based on a spatial-temporal conditional autoregressive model fitted to over 11 million Anaplasma spp. seroprevalence test results for dogs conducted in the 48 contiguous United States during 2011-2015. The forecast uses county-level data on eight predictive factors, including annual temperature, precipitation, relative humidity, county elevation, forestation coverage, surface water coverage, population density and median household income. Non-static factors are extrapolated into the forthcoming year with various statistical methods. The fitted model and factor extrapolations are used to estimate next year's regional prevalence. The correlation between the observed and model-estimated county-by-county Anaplasma spp. seroprevalence for the five-year period 2011-2015 is 0.902, demonstrating reasonable model accuracy. The weighted correlation (accounting for different sample sizes) between 2015 observed and forecasted county-by-county Anaplasma spp. seroprevalence is 0.987, showing that the proposed approach can be used to accurately forecast Anaplasma spp. seroprevalence. The forecast presented herein can a priori alert veterinarians to areas expected to see Anaplasma spp. seroprevalence beyond the accepted endemic range. The proposed methods may prove useful for forecasting other diseases.
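
    The sample-size-weighted correlation reported above is a weighted Pearson correlation; a minimal implementation (weighting counties by the number of tests):

```python
import numpy as np

def weighted_corr(x, y, w):
    """Weighted Pearson correlation between x and y with nonnegative weights w
    (e.g. per-county test counts)."""
    x, y, w = (np.asarray(v, dtype=float) for v in (x, y, w))
    mx, my = np.average(x, weights=w), np.average(y, weights=w)
    cov = np.average((x - mx) * (y - my), weights=w)
    vx = np.average((x - mx) ** 2, weights=w)
    vy = np.average((y - my) ** 2, weights=w)
    return cov / np.sqrt(vx * vy)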

  9. [Development of an analyzing system for soil parameters based on NIR spectroscopy].

    PubMed

    Zheng, Li-Hua; Li, Min-Zan; Sun, Hong

    2009-10-01

    A rapid estimation system for soil parameters based on spectral analysis was developed using object-oriented (OO) technology. A SOIL class was designed; an instance of the SOIL class is an object representing soil samples with a particular type, specific physical properties and spectral characteristics. By extracting the effective information from the modeling spectral data of a soil object, a mapping model was established between the soil parameters and the spectral data, and the mapping model parameters can be saved in the model database. When forecasting the content of any soil parameter, the corresponding prediction model can be selected for the same soil type and similar soil physical properties. After the target soil sample object is passed into the prediction model and processed by the system, an accurate prediction of the target samples' content is obtained. The system includes modules for file operations, spectra pretreatment, sample analysis, calibration and validation, and sample content forecasting. The system was designed to run independently of the measurement equipment. The parameters and spectral data files (*.xls) of known soil samples can be input into the system. With data pretreatment selected according to the conditions at hand, the predicted contents appear in the terminal and the forecasting model can be stored in the model database. The system reads the prediction models and their parameters from the model database through the module interface, and the data of the tested samples are then passed into the selected model. Finally, the content of soil parameters can be predicted by the developed system. The system was programmed with Visual C++ 6.0 and Matlab 7.0, and Access XP was used to create and manage the model database.

  10. Four dimensional variational assimilation of in-situ and remote-sensing aerosol data

    NASA Astrophysics Data System (ADS)

    Nieradzik, L. P.; Elbern, H.

    2012-04-01

    Aerosols play an increasingly important role in atmospheric modelling. They have a strong influence on the radiative transfer balance and a significant impact on human health. Their origins are various, and so are their effects. Most of the measurement sites in Europe report an integrated aerosol load PMx (Particulate Matter of less than x μm in diameter), which does not give any qualitative information on the composition of the aerosol. Since very different constituents contribute to PMx, e.g. mineral dust from desert storms or sea salt, it is necessary to make aerosol forecasts that are resolved not only by load but also by type. Four dimensional variational data assimilation (4Dvar) is a widely known technique for enhancing the forecast skill of CTMs (Chemistry-Transport-Models) by ingesting in-situ and, especially, remote-sensing measurements. The EURAD-IM (EURopean Air pollution Dispersion - Inverse Model), containing a full adjoint gas-phase model, has been expanded with an adjoint of MADE (Modal Aerosol Dynamics model for Europe) to optimise initial and boundary values for aerosols using 4Dvar. A forward and an adjoint radiative transfer model is driven by the EURAD-IM as a mapping between BLAOT (Boundary Layer Aerosol Optical Thickness) and internal aerosol species. Furthermore, its condensation scheme has been bypassed by an HDMR (High-Dimensional-Model-Representation) to ensure differentiability. In this study both in-situ measured PMx and satellite-retrieved aerosol optical thicknesses have been assimilated, and the effect on forecast performance has been investigated. The source of BLAOT is the aerosol retrieval system SYNAER (SYNergetic AErosol Retrieval) from DLR-DFD, which retrieves AOT by making use of AATSR/SCIAMACHY and AVHRR/GOME-2 data respectively.
Its strengths are a large spatial coverage, near real-time availability, and the classification of five intrinsic aerosol species, namely water-solubles, water-insolubles, soot, sea salt, and mineral dust which are furthermore size resolved in terms of modes. The skill of the aerosol 4Dvar system was tested in two episodes: 1) July through August 2003, a dry period with strong wildfire activity in Europe, and 2) October through November 2008, the period of the ZEPTER-2 (Second ZEPpelin based Tropospheric photochemical chemistry expERiment) measurement campaign in the area of Lake Constance. In the latter case one-way nesting has been applied from a horizontal grid resolution of 45 km down to 5 km. Overall, the results showed a significant increase in forecast quality of tropospheric aerosol loads.
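
    Variational assimilation minimizes a background-plus-observation misfit cost, J(x) = (x − x_b)ᵀB⁻¹(x − x_b) + (Hx − y)ᵀR⁻¹(Hx − y). In the linear-Gaussian, single-window case the minimizer has a closed form, which this sketch uses; a real 4Dvar system such as EURAD-IM instead minimizes iteratively, with the adjoint model supplying the gradient across the assimilation window:

```python
import numpy as np

def var_analysis(xb, B, obs, H, R):
    """Closed-form minimizer of
        J(x) = (x - xb)^T B^-1 (x - xb) + (H x - y)^T R^-1 (H x - y):
    setting grad J = 0 gives (B^-1 + H^T R^-1 H) x = B^-1 xb + H^T R^-1 y.
    xb: background state; B: background error covariance;
    obs: observation vector y; H: linear observation operator; R: obs covariance."""
    Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)
    A = Binv + H.T @ Rinv @ H
    b = Binv @ xb + H.T @ Rinv @ obs
    return np.linalg.solve(A, b)
```

With equal background and observation variances, the analysis lands midway between background and observation, as expected of a least-squares compromise.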

  11. Combination of synoptical-analogous and dynamical methods to increase skill score of monthly air temperature forecasts over Northern Eurasia

    NASA Astrophysics Data System (ADS)

    Khan, Valentina; Tscepelev, Valery; Vilfand, Roman; Kulikova, Irina; Kruglova, Ekaterina; Tischenko, Vladimir

    2016-04-01

    Long-range forecasts at monthly-to-seasonal time scales are in great demand from socio-economic sectors for managing climate-related risks and opportunities. At the same time, the quality of long-range forecasts does not fully meet the needs of user applications. Different approaches, including the combination of different prognostic models, are used in forecast centers to increase prediction skill for specific regions and globally. In the present study, two forecasting methods are considered that are used in the operational practice of the Hydrometeorological Center of Russia. One of them is a synoptical-analogous method for forecasting surface air temperature at the monthly scale. The other is a dynamical system based on the global semi-Lagrangian model SL-AV, developed in collaboration between the Institute of Numerical Mathematics and the Hydrometeorological Centre of Russia. The seasonal version of this model has been used to issue global and regional forecasts at monthly-seasonal time scales. This study presents results of the evaluation of surface air temperature forecasts generated using the above-mentioned synoptical-statistical and dynamical models, and their combination, to potentially increase skill scores over Northern Eurasia. The test sample of operational forecasts encompasses the period from 2010 through 2015. The seasonal and interannual variability of the skill scores of these methods is discussed. It was noticed that the quality of all forecasts depends strongly on the inertia of macro-circulation processes. The skill scores of forecasts decrease during significant alterations of synoptical fields for both dynamical and empirical schemes. The procedure of combining forecasts from different methods has, in some cases, demonstrated its effectiveness. For this study, support has been provided by Grant of the Russian Science Foundation (№14-37-00053).
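
    One simple way to combine a statistical and a dynamical forecast is inverse-error weighting, where each method's weight is the reciprocal of its historical mean squared error. The abstract does not specify its combination rule, so this is purely illustrative:

```python
def combine_forecasts(forecasts, errors):
    """Inverse-MSE weighted combination of point forecasts.
    forecasts: per-method forecast values (e.g. monthly temperature anomalies);
    errors: per-method historical mean squared errors (must be > 0)."""
    weights = [1.0 / e for e in errors]
    total = sum(weights)
    return sum(w * f for w, f in zip(weights, forecasts)) / total
```

A method with a quarter of the error variance receives four times the weight, so the blend leans toward whichever scheme has been more reliable in the training archive.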

  12. Development of binomial sequential sampling plans for forecasting Listronotus maculicollis (Coleoptera: Curculionidae) larvae based on the relationship to adult counts and turfgrass damage.

    PubMed

    McGraw, Benjamin A; Koppenhöfer, Albrecht M

    2009-06-01

    Binomial sequential sampling plans were developed to forecast Listronotus maculicollis Kirby (Coleoptera: Curculionidae) larval damage to golf course turfgrass and to aid in the development of integrated pest management programs for the weevil. Populations of emerging overwintered adults were sampled over a 2-yr period to determine the relationship between adult counts, larval density, and turfgrass damage. Larval density and the composition of preferred host plants (Poa annua L.) significantly affected the expression of turfgrass damage. Multiple regression indicates that damage may occur in moderately mixed P. annua stands with as few as 10 larvae per 0.09 m2. However, > 150 larvae were required before damage became apparent in pure Agrostis stolonifera L. plots. Adult counts during peaks in emergence as well as cumulative counts across the emergence period were significantly correlated to future densities of larvae. Eight binomial sequential sampling plans based on two tally thresholds for classifying infestation (T = 1 and 2 adults) and four adult density thresholds (0.5, 0.85, 1.15, and 1.35 per 3.34 m2) were developed to forecast the likelihood of turfgrass damage by using adult counts during peak emergence. Resampling for Validation of Sample Plans software was used to validate the sampling plans with field-collected data sets. All sampling plans were found to deliver accurate classifications (correct decisions were made between 84.4 and 96.8% of the time) in a practical timeframe (average sampling cost < 22.7 min).
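
    Binomial sequential sampling plans of this kind are typically built on Wald's sequential probability ratio test: keep sampling until the evidence crosses an upper (treat) or lower (no treat) boundary. A sketch with hypothetical infestation proportions and error rates; the paper's plans use their own tally and density thresholds:

```python
import math

def sprt_decision(x, n, p0=0.2, p1=0.5, alpha=0.1, beta=0.1):
    """One step of Wald's SPRT for a binomial proportion.
    x: samples (out of n taken so far) exceeding the tally threshold;
    p0/p1: hypothesized proportions below/above the damage threshold;
    alpha/beta: tolerated false-positive/false-negative rates (all hypothetical).
    Returns 'treat', 'no_treat', or 'continue' (take another sample)."""
    llr = x * math.log(p1 / p0) + (n - x) * math.log((1 - p1) / (1 - p0))
    upper = math.log((1 - beta) / alpha)   # cross above -> classify as infested
    lower = math.log(beta / (1 - alpha))   # cross below -> classify as clean
    if llr >= upper:
        return "treat"
    if llr <= lower:
        return "no_treat"
    return "continue"
```

The practical appeal, mirrored in the abstract's "< 22.7 min" sampling cost, is that clear-cut cases terminate after very few samples, while only borderline cases require continued sampling.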

  13. Forecasting the realized volatility of the Chinese stock market: Do the G7 stock markets help?

    NASA Astrophysics Data System (ADS)

    Peng, Huan; Chen, Ruoxun; Mei, Dexiang; Diao, Xiaohua

    2018-07-01

    In this paper, we take a comprehensive look at whether the G7 stock markets contain predictive information that helps in forecasting the Chinese stock market volatility. Our out-of-sample empirical results indicate that the kitchen sink (HAR-RV-SK) model is able to attain better performance than the benchmark model (HAR-RV) and other models, implying that the G7 stock markets can help in predicting the one-day volatility of the Chinese stock market. Moreover, the kitchen sink strategy can beat the strategy of simple combination forecasts. Finally, the G7 stock markets do contain useful information, which can increase the accuracy of forecasts of the Chinese stock market.
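
    The HAR-RV family regresses tomorrow's realized volatility on daily, weekly (5-day) and monthly (22-day) averages of past RV; the kitchen-sink variant simply appends the foreign-market predictors as extra columns. A sketch of the baseline feature construction and OLS fit (column layout is the standard HAR convention, not taken from the paper):

```python
import numpy as np

def har_features(rv):
    """Build HAR-RV regressors for each day t >= 22:
    [intercept, RV(t-1), mean RV over t-5..t-1, mean RV over t-22..t-1],
    with target RV(t)."""
    X, y = [], []
    for t in range(22, len(rv)):
        X.append([1.0, rv[t - 1], np.mean(rv[t - 5:t]), np.mean(rv[t - 22:t])])
        y.append(rv[t])
    return np.array(X), np.array(y)

def fit_ols(X, y):
    """Least-squares coefficients for the HAR regression."""
    return np.linalg.lstsq(X, y, rcond=None)[0]
```

For HAR-RV-SK one would append, say, lagged G7 realized volatilities as further columns of X before calling fit_ols.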

  14. Structural changes and out-of-sample prediction of realized range-based variance in the stock market

    NASA Astrophysics Data System (ADS)

    Gong, Xu; Lin, Boqiang

    2018-03-01

    This paper aims to examine the effects of structural changes on forecasting the realized range-based variance in the stock market. Considering structural changes in variance in the stock market, we develop the HAR-RRV-SC model on the basis of the HAR-RRV model. Subsequently, the HAR-RRV and HAR-RRV-SC models are used to forecast the realized range-based variance of the S&P 500 Index. We find that there are many structural changes in variance in the U.S. stock market, and the period after the financial crisis contains more structural change points than the period before the financial crisis. The out-of-sample results show that the HAR-RRV-SC model significantly outperforms the HAR-RRV model when they are employed to forecast the 1-day, 1-week, and 1-month realized range-based variances, which means that structural changes can improve out-of-sample prediction of realized range-based variance. The out-of-sample results remain robust across the alternative rolling fixed-window, the alternative threshold value in the ICSS algorithm, and the alternative benchmark models. More importantly, we believe that considering structural changes can help improve the out-of-sample performances of most other existing HAR-RRV-type models in addition to the models used in this paper.

  15. Optimal selection and placement of green infrastructure to reduce impacts of land use change and climate change on hydrology and water quality: An application to the Trail Creek Watershed, Indiana.

    PubMed

    Liu, Yaoze; Theller, Lawrence O; Pijanowski, Bryan C; Engel, Bernard A

    2016-05-15

    The adverse impacts of urbanization and climate change on hydrology and water quality can be mitigated by applying green infrastructure practices. In this study, the impacts of land use change and climate change on hydrology and water quality in the 153.2 km(2) Trail Creek watershed located in northwest Indiana were estimated using the Long-Term Hydrologic Impact Assessment-Low Impact Development 2.1 (L-THIA-LID 2.1) model for the following environmental concerns: runoff volume, Total Suspended Solids (TSS), Total Phosphorus (TP), Total Kjeldahl Nitrogen (TKN), and Nitrate+Nitrite (NOx). Using a recent 2001 land use map and 2050 land use forecasts, we found that land use change resulted in increased runoff volume and pollutant loads (8.0% to 17.9% increase). Climate change reduced runoff and nonpoint source pollutant loads (5.6% to 10.2% reduction). The 2050 forecasted land use with current rainfall resulted in the largest runoff volume and pollutant loads. The optimal selection and placement of green infrastructure practices using the L-THIA-LID 2.1 model were conducted. Costs of applying green infrastructure were estimated using the L-THIA-LID 2.1 model considering construction, maintenance, and opportunity costs. To attain the same runoff volume and pollutant loads as in 2001 land uses for 2050 land uses, the runoff volume, TSS, TP, TKN, and NOx for 2050 needed to be reduced by 10.8%, 14.4%, 13.1%, 15.2%, and 9.0%, respectively. The corresponding annual costs of implementing green infrastructure to achieve the goals were $2.1, $0.8, $1.6, $1.9, and $0.8 million, respectively. Annual costs of reducing 2050 runoff volume/pollutant loads were estimated, and results show green infrastructure annual cost greatly increased for larger reductions in runoff volume and pollutant loads.
During optimization, the most cost-efficient green infrastructure practices were selected and implementation levels increased for greater reductions of runoff and nonpoint source pollutants. Copyright © 2016 Elsevier B.V. All rights reserved.
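
    "Most cost-efficient practices first" is the intuition behind the selection step; a greedy sketch with hypothetical practice data (the L-THIA-LID optimization is more elaborate, handling multiple pollutants, siting constraints, and interaction effects simultaneously):

```python
def select_practices(target_reduction, practices):
    """Pick green-infrastructure practices in order of cost per unit of runoff
    reduction until the target is met.
    practices: list of (name, annual_cost, reduction) tuples (hypothetical data).
    Returns (chosen names, achieved reduction, total annual cost)."""
    chosen, achieved, cost = [], 0.0, 0.0
    for name, c, r in sorted(practices, key=lambda p: p[1] / p[2]):  # $/unit
        if achieved >= target_reduction:
            break
        chosen.append(name)
        achieved += r
        cost += c
    return chosen, achieved, cost
```

The greedy ordering also explains the abstract's observation that costs rise steeply for larger reduction targets: once the cheap practices are exhausted, each additional unit must come from progressively more expensive ones.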

  16. Forecasting Performance of Grey Prediction for Education Expenditure and School Enrollment

    ERIC Educational Resources Information Center

    Tang, Hui-Wen Vivian; Yin, Mu-Shang

    2012-01-01

    GM(1,1) and GM(1,1) rolling models derived from grey system theory were estimated using time-series data from projection studies by the National Center for Education Statistics (NCES). An out-of-sample forecasting competition between the two grey prediction models and the exponential smoothing method used by NCES was conducted for education expenditure and…
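
    The GM(1,1) grey model fits a first-order differential equation to the accumulated (cumulative-sum) series and forecasts by inverting the accumulation. A standard implementation of the textbook algorithm (it assumes a positive, roughly monotone series, as is typical for expenditure and enrollment data):

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Classic GM(1,1): fit dx1/dt + a*x1 = b to the accumulated series x1 and
    forecast `steps` future values of the original series x0 (1-D, positive)."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                               # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                    # background (mean) values
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # grey parameters

    def x1_hat(k):  # time-response function; k = 0 reproduces x0[0]
        return (x0[0] - b / a) * np.exp(-a * k) + b / a

    n = len(x0)
    return [float(x1_hat(k) - x1_hat(k - 1)) for k in range(n, n + steps)]
```

The rolling variant mentioned in the abstract refits this model on a sliding window, re-estimating (a, b) each time a new observation arrives.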

  17. Constraints on Rational Model Weighting, Blending and Selecting when Constructing Probability Forecasts given Multiple Models

    NASA Astrophysics Data System (ADS)

    Higgins, S. M. W.; Du, H. L.; Smith, L. A.

    2012-04-01

    Ensemble forecasting on a lead time of seconds over several years generates a large forecast-outcome archive, which can be used to evaluate and weight "models". Challenges which arise as the archive becomes smaller are investigated: in weather forecasting one typically has only thousands of forecasts; however, those launched 6 hours apart are not independent of each other, nor is it justified to mix seasons with different dynamics. Seasonal forecasts, as from ENSEMBLES and DEMETER, typically have fewer than 64 unique launch dates; decadal forecasts fewer than eight, and long range climate forecasts arguably none. It is argued that one does not weight "models" so much as entire ensemble prediction systems (EPSs), and that the marginal value of an EPS will depend on the other members in the mix. The impact of using different skill scores is examined in the limits of both very large forecast-outcome archives (thereby evaluating the efficiency of the skill score) and very small forecast-outcome archives (illustrating fundamental limitations due to sampling fluctuations and memory in the physical system being forecast). It is shown that blending with climatology (J. Bröcker and L.A. Smith, Tellus A, 60(4), 663-678, (2008)) tends to increase the robustness of the results; a new kernel dressing methodology (simply ensuring that the expected probability mass tends to lie outside the range of the ensemble) is also illustrated. Fair comparisons using seasonal forecasts from the ENSEMBLES project illustrate the importance of these results with fairly small archives. The robustness of these results across the range of small, moderate and huge archives is demonstrated using imperfect models of perfectly known nonlinear (chaotic) dynamical systems. The implications these results hold for distinguishing the skill of a forecast from its value to a user are discussed.
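
    Blending with climatology replaces the raw ensemble probability p with w·p + (1 − w)·p_clim, shrinking overconfident forecasts toward the climatological baseline. A sketch that selects w by minimizing the ignorance score (mean negative log2 likelihood) on a training archive; the score choice and grid search are illustrative assumptions, not the cited paper's exact procedure:

```python
import math

def ignorance(probs, outcomes):
    """Mean ignorance (negative log2 likelihood) of binary probability forecasts."""
    return -sum(math.log2(p if o else 1 - p)
                for p, o in zip(probs, outcomes)) / len(probs)

def blend_weight(p_model, p_clim, outcomes, grid=51):
    """Choose the blend weight w in [0, 1] minimizing ignorance of
    w * p_model + (1 - w) * p_clim over a forecast-outcome archive."""
    candidates = [i / (grid - 1) for i in range(grid)]
    best = min((ignorance([w * m + (1 - w) * c for m, c in zip(p_model, p_clim)],
                          outcomes), w)
               for w in candidates)
    return best[1]
```

With a small archive the selected w tends toward climatology, which is exactly the robustness effect the abstract describes.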

  18. Interevent times in a new alarm-based earthquake forecasting model

    NASA Astrophysics Data System (ADS)

    Talbi, Abdelhak; Nanjo, Kazuyoshi; Zhuang, Jiancang; Satake, Kenji; Hamdache, Mohamed

    2013-09-01

    This study introduces a new earthquake forecasting model that uses the moment ratio (MR) of the first to second order moments of earthquake interevent times as a precursory alarm index to forecast large earthquake events. This MR model is based on the idea that the MR is associated with anomalous long-term changes in background seismicity prior to large earthquake events. In a given region, the MR statistic is defined as the inverse of the index of dispersion or Fano factor, with MR values (or scores) providing a biased estimate of the relative regional frequency of background events, here termed the background fraction. To test the forecasting performance of this proposed MR model, a composite Japan-wide earthquake catalogue for the years between 679 and 2012 was compiled using the Japan Meteorological Agency catalogue for the period between 1923 and 2012, and the Utsu historical seismicity records between 679 and 1922. MR values were estimated by sampling interevent times from events with magnitude M ≥ 6 using an earthquake random sampling (ERS) algorithm developed during previous research. Three retrospective tests of M ≥ 7 target earthquakes were undertaken to evaluate the long-, intermediate- and short-term performance of MR forecasting, using mainly Molchan diagrams and optimal spatial maps obtained by minimizing forecasting error defined by miss and alarm rate addition. This testing indicates that the MR forecasting technique performs well at long-, intermediate- and short-term. The MR maps produced during long-term testing indicate significant alarm levels before 15 of the 18 shallow earthquakes within the testing region during the past two decades, with an alarm region covering about 20 per cent (alarm rate) of the testing region. The number of shallow events missed by forecasting was reduced by about 60 per cent after using the MR method instead of the relative intensity (RI) forecasting method. 
At short term, our model succeeded in forecasting the occurrence region of the 2011 Mw 9.0 Tohoku earthquake, whereas the RI method did not. Cases where a period of quiescent seismicity occurred before the target event often lead to low MR scores, meaning that the target event was not predicted and indicating that our model could be further improved by taking into account quiescent periods in the alarm strategy.
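
    Per the abstract's definition, the MR score is the inverse of the index of dispersion (Fano factor) of the sampled interevent times, i.e. mean divided by variance:

```python
import numpy as np

def mr_score(interevent_times):
    """Moment-ratio alarm index: mean / variance of interevent times
    (the inverse of the Fano factor). A Poisson-like (exponential) process
    with rate 1 gives MR near 1; clustered seismicity pushes variance up
    and MR down, quiescence-dominated samples behave similarly."""
    tau = np.asarray(interevent_times, dtype=float)
    return tau.mean() / tau.var()
```

In the paper this score is computed over many random samples of interevent times (the ERS algorithm) and mapped spatially to set alarm regions; this function is just the per-sample statistic.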

  19. Forecasting Individual Headache Attacks Using Perceived Stress: Development of a Multivariable Prediction Model for Persons With Episodic Migraine.

    PubMed

    Houle, Timothy T; Turner, Dana P; Golding, Adrienne N; Porter, John A H; Martin, Vincent T; Penzien, Donald B; Tegeler, Charles H

    2017-07-01

    To develop and validate a prediction model that forecasts future migraine attacks for an individual headache sufferer. Many headache patients and physicians believe that precipitants of headache can be identified and avoided or managed to reduce the frequency of headache attacks. Of the numerous candidate triggers, perceived stress has received considerable attention for its association with the onset of headache in episodic and chronic headache sufferers. However, no evidence is available to support forecasting headache attacks within individuals using any of the candidate headache triggers. This longitudinal cohort study with forecasting model development enrolled 100 participants with episodic migraine with or without aura; N = 95 contributed 4626 days of electronic diary data and were included in the analysis. Individual headache forecasts were derived from current headache state and current levels of stress using several aspects of the Daily Stress Inventory, a measure of daily hassles completed at the end of each day. The primary outcome measure was the presence/absence of any headache attack (head pain > 0 on a numerical rating scale of 0-10) over the next 24 h period. After removing missing data (n = 431 days), participants in the study experienced a headache attack on 1613/4195 (38.5%) days. A generalized linear mixed-effects forecast model using either the frequency of stressful events or the perceived intensity of these events fit the data well. This simple forecasting model possessed promising predictive utility, with an AUC of 0.73 (95% CI 0.71-0.75) in the training sample and an AUC of 0.65 (95% CI 0.6-0.67) in a leave-one-out validation sample. The forecasting model had a Brier score of 0.202 and possessed good calibration between forecasted probabilities and observed frequencies, but only low levels of resolution (i.e., sharpness).
This study demonstrates that future headache attacks can be forecasted for a diverse group of individuals over time. Future work will enhance prediction through improvements in the assessment of stress as well as the development of other candidate domains to use in the models. © 2017 American Headache Society.
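    The two scores reported above, AUC and the Brier score, can both be computed directly from paired forecast probabilities and daily outcomes. A self-contained sketch on toy data (not the study's data):

    ```python
    def brier_score(probs, outcomes):
        """Mean squared difference between forecast probability and the
        0/1 outcome; lower is better (a constant 0.5 forecast scores 0.25)."""
        return sum((p - y) ** 2 for p, y in zip(probs, outcomes)) / len(probs)

    def auc(probs, outcomes):
        """Probability that a randomly chosen headache day received a higher
        forecast than a randomly chosen headache-free day (ties count half)."""
        pos = [p for p, y in zip(probs, outcomes) if y == 1]
        neg = [p for p, y in zip(probs, outcomes) if y == 0]
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    # Toy example: four days of forecast probabilities vs. observed attacks.
    probs = [0.9, 0.7, 0.4, 0.2]
    outcomes = [1, 1, 0, 0]
    ```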

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakafuji, Dora; Gouveia, Lauren

    This project supports development of the next-generation, integrated energy management system (EMS) infrastructure able to incorporate advanced visualization of behind-the-meter distributed resource information and probabilistic renewable energy generation forecasts to inform real-time operational decisions. The project involves end-users and active feedback from a Utility Advisory Team (UAT) to help inform how information can be used to enhance operational functions (e.g., unit commitment, load forecasting, Automatic Generation Control (AGC) reserve monitoring, ramp alerts) within two major EMS platforms. Objectives include: engaging utility operations personnel to develop user input on displays, set expectations, test and review; developing ease-of-use and timeliness metrics for measuring enhancements; developing prototype integrated capabilities within two operational EMS environments; demonstrating an integrated decision analysis platform with real-time wind and solar forecasting information and timely distributed resource information; seamlessly integrating new 4-dimensional information into operations without increasing workload and complexity; developing sufficient analytics to inform and confidently transform and adopt new operating practices and procedures; disseminating project lessons learned through industry-sponsored workshops and conferences; and building on collaborative utility-vendor partnerships and industry capabilities.

  1. Deformation and failure of single- and multi-phase silicate liquids: seismic precursors and mechanical work

    NASA Astrophysics Data System (ADS)

    Vasseur, Jeremie; Lavallée, Yan; Hess, Kai-Uwe; Wassermann, Joachim; Dingwell, Donald B.

    2013-04-01

    Like many other catastrophic material failure phenomena, volcanic unrest is often preceded by diverse precursory signals. Although a volcanic system intrinsically behaves in a non-linear and stochastic way, these precursors display systematic evolutionary trends toward upcoming eruptions. Seismic signals in particular generally increase dramatically prior to an eruption and have been extensively reported to show accelerating rates through time, as has also been observed in the laboratory before failure of rock samples. At the laboratory scale, acoustic emissions (AE) are high-frequency transient stress waves used to track fracture initiation and propagation inside a rock sample. Synthesized glass samples featuring a range of porosities (0-30%) and natural rock samples from Volcán de Colima, Mexico, were deformed to failure in high-temperature uniaxial compression experiments at constant stress and constant strain rate. Using the monitored AEs and the mechanical work generated during deformation, we investigated the evolutionary trends of energy patterns associated with different degrees of heterogeneity. We observed that failure of dense, low-porosity glasses requires an elevated strength to be exceeded, and thus a significant accumulation of strain, with only pervasive small-scale cracking occurring. More porous glasses, as well as volcanic samples, need much lower applied stress and deformation to fail, as fractures nucleate, propagate and coalesce into localized large-scale cracks, taking advantage of numerous pre-existing defects (voids in glasses; voids and crystals in volcanic rocks). These observations demonstrate that the mechanical work generated through cracking is efficiently distributed inside denser and more homogeneous samples, as underlined by the overall lower AE energy released during experiments.
In contrast, the quicker and larger AE energy release during loading of heterogeneous samples shows that the mechanical work tends to concentrate in specific weak regions, facilitating dynamic failure of the material through dissipation of the accumulated strain energy. Applying a statistical Global Linearization Method (GLM) to multi-phase silicate liquid samples yields a maximum likelihood power-law fit of the accelerating rates of released AEs. The calculated α exponent of the empirical Failure Forecast Method (FFM) tends to decrease from 2 towards 1 with increasing porosity, suggesting a shift towards an idealized exponential-like acceleration. Single-phase silicate liquids behave more elastically during deformation, without much cracking, and suddenly release their accumulated strain energy at failure, implying less clear trends in monitored AEs. From a predictive perspective, these results indicate that failure forecasting power is enhanced by the presence of heterogeneities inside a material.
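    The α exponent of the Failure Forecast Method describes how the AE rate accelerates toward the failure time t_f, rate ∝ (t_f − t)^(−α). The fits above use maximum likelihood; purely as an illustration, a least-squares version on synthetic data (all parameter values here are made up):

    ```python
    import numpy as np

    # Synthetic AE rate obeying the FFM power law with a known exponent.
    t_f, alpha, k = 100.0, 1.5, 50.0    # failure time, exponent, scale
    t = np.linspace(0.0, 95.0, 200)     # observation times before t_f
    rate = k * (t_f - t) ** (-alpha)    # accelerating AE rate

    # In log-log space the power law is linear; -slope recovers alpha.
    slope, intercept = np.polyfit(np.log(t_f - t), np.log(rate), 1)
    alpha_hat = -slope
    ```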

  2. Distribution-Agnostic Stochastic Optimal Power Flow for Distribution Grids: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Kyri; Dall'Anese, Emiliano; Summers, Tyler

    2016-09-01

    This paper outlines a data-driven, distributionally robust approach to solve chance-constrained AC optimal power flow problems in distribution networks. Uncertain forecasts for loads and power generated by photovoltaic (PV) systems are considered, with the goal of minimizing PV curtailment while meeting power flow and voltage regulation constraints. A data-driven approach is utilized to develop a distributionally robust conservative convex approximation of the chance constraints; in particular, the mean and covariance matrix of the forecast errors are updated online and leveraged to enforce voltage regulation with a predetermined probability via Chebyshev-based bounds. By combining an accurate linear approximation of the AC power flow equations with the distributionally robust chance constraint reformulation, the resulting optimization problem becomes convex and computationally tractable.
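    A Chebyshev-based bound of the kind mentioned above can be illustrated with the one-sided (Cantelli) inequality: knowing only the mean and standard deviation of a voltage, requiring mu + sigma·sqrt((1−ε)/ε) ≤ v_max guarantees the violation probability is at most ε for any distribution with those moments. A minimal sketch (numbers are illustrative, not from the paper):

    ```python
    import math

    def chebyshev_voltage_margin(mu, sigma, epsilon):
        """Distributionally robust (Cantelli) certificate: if this value is
        <= v_max, then P(v > v_max) <= epsilon for *any* distribution with
        mean mu and standard deviation sigma."""
        return mu + sigma * math.sqrt((1.0 - epsilon) / epsilon)

    # Example: mean 1.02 p.u., std 0.01 p.u., 5% violation budget.
    limit_needed = chebyshev_voltage_margin(1.02, 0.01, 0.05)
    ```

    Note how conservative the bound is: tightening ε inflates the required margin as sqrt(1/ε), which is the price of making no distributional assumption.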

  3. Evaluation of the synoptic and mesoscale predictive capabilities of a mesoscale atmospheric simulation system

    NASA Technical Reports Server (NTRS)

    Koch, S. E.; Skillman, W. C.; Kocin, P. J.; Wetzel, P. J.; Brill, K.; Keyser, D. A.; Mccumber, M. C.

    1983-01-01

    The overall performance characteristics of a limited-area, hydrostatic, fine-mesh (52 km), primitive equation numerical weather prediction model are determined in anticipation of satellite data assimilations with the model. The synoptic and mesoscale predictive capabilities of version 2.0 of this model, the Mesoscale Atmospheric Simulation System (MASS 2.0), were evaluated. The two-part study is based on a sample of approximately thirty 12 h and 24 h forecasts of atmospheric flow patterns during spring and early summer. The synoptic-scale evaluation results benchmark the performance of MASS 2.0 against that of an operational, synoptic-scale weather prediction model, the Limited-area Fine Mesh (LFM). The large sample allows for the calculation of statistically significant measures of forecast accuracy and the determination of systematic model errors. The synoptic-scale benchmark is required before unsmoothed mesoscale forecast fields can be seriously considered.

  4. The predictive content of CBOE crude oil volatility index

    NASA Astrophysics Data System (ADS)

    Chen, Hongtao; Liu, Li; Li, Xiaolei

    2018-02-01

    Volatility forecasting is an important issue in the area of econophysics. The information content of implied volatility for financial return volatility has been well documented in the literature, but very few studies focus on oil volatility. In this paper, we show that the CBOE crude oil volatility index (OVX) has predictive ability for the spot volatility of WTI and Brent oil returns, from both in-sample and out-of-sample perspectives. Including OVX-based implied volatility in GARCH-type volatility models can improve forecasting accuracy most of the time. The predictability from OVX to spot volatility is also found for longer forecasting horizons of 5 days and 20 days. The simple GARCH(1,1) and fractionally integrated GARCH with OVX perform significantly better than the other OVX models and all six univariate GARCH-type models without OVX. Robustness test results suggest that OVX provides information different from that of the short-term interest rate.
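    A GARCH(1,1) model augmented with implied volatility adds the lagged OVX level to the conditional variance recursion. A sketch of that recursion on synthetic data (the coefficient values are illustrative placeholders, not estimates from the paper):

    ```python
    import numpy as np

    def garch11x_variance(returns, ovx, omega, alpha, beta, gamma):
        """GARCH(1,1)-X conditional variance recursion with an exogenous
        implied-volatility term:
            sigma2[t] = omega + alpha*r[t-1]^2 + beta*sigma2[t-1]
                        + gamma*ovx[t-1]
        """
        sigma2 = np.empty(len(returns))
        sigma2[0] = np.var(returns)  # initialize at the sample variance
        for t in range(1, len(returns)):
            sigma2[t] = (omega + alpha * returns[t - 1] ** 2
                         + beta * sigma2[t - 1] + gamma * ovx[t - 1])
        return sigma2

    rng = np.random.default_rng(0)
    r = rng.normal(0.0, 1.0, 100)          # synthetic daily returns
    x = np.abs(rng.normal(0.0, 1.0, 100))  # synthetic OVX-like regressor
    s2 = garch11x_variance(r, x, omega=0.05, alpha=0.08, beta=0.9, gamma=0.02)
    ```

    With positive coefficients the recursion keeps the variance positive; setting gamma = 0 recovers the plain GARCH(1,1) benchmark.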

  5. Projection of Patient Condition Code Distributions Based on Mechanism of Injury

    DTIC Science & Technology

    2003-01-01

    The Medical Readiness and Strategic Plan (MRSP)1998-20041 requires that the military services develop a method for linking real world patient load...data with modern Patient Condition (PC) codes to enable planners to forecast medical workload and resource requirements. Determination of the likely...various levels of medical care. Medical planners and logisticians plan for medical contingencies based on anticipated patient streams, distributions of

  6. 7 CFR 1710.207 - RUS criteria for approval of load forecasts by distribution borrowers not required to maintain an...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... distribution borrowers that are unaffiliated with a power supply borrower, or by distribution borrowers that are members of a power supply borrower that has a total utility plant less than $500 million and that is not itself a member of another power supply borrower with a total utility plant of $500 million or...

  7. The Ability of Analysts' Recommendations to Predict Optimistic and Pessimistic Forecasts

    PubMed Central

    Biglari, Vahid; Alfan, Ervina Binti; Ahmad, Rubi Binti; Hajian, Najmeh

    2013-01-01

    Previous research shows that buy (growth) companies conduct income-increasing earnings management in order to meet forecasts and generate positive forecast errors (FEs). This behavior, however, is not inherent in sell (non-growth) companies. Against this background, this research hypothesizes that since sell companies are pressured to avoid income-increasing earnings management, they are capable, and in fact more inclined, to pursue income-decreasing forecast management (FM) with the purpose of generating positive FEs. Using a sample of 6553 firm-years of companies listed on the NYSE between 2005 and 2010, the study determines that sell companies conduct income-decreasing FM to generate positive FEs. However, the frequency of positive FEs of sell companies does not exceed that of buy companies. Using the efficiency perspective, the study suggests that even though buy and sell companies have immense motivation to avoid negative FEs, they exploit different but efficient strategies, respectively, in order to meet forecasts. Furthermore, the findings illuminate the complexities behind informative and opportunistic forecasts that fall under the efficiency versus opportunistic theories in the literature. PMID:24146741

  8. The ability of analysts' recommendations to predict optimistic and pessimistic forecasts.

    PubMed

    Biglari, Vahid; Alfan, Ervina Binti; Ahmad, Rubi Binti; Hajian, Najmeh

    2013-01-01

    Previous research shows that buy (growth) companies conduct income-increasing earnings management in order to meet forecasts and generate positive forecast errors (FEs). This behavior, however, is not inherent in sell (non-growth) companies. Against this background, this research hypothesizes that since sell companies are pressured to avoid income-increasing earnings management, they are capable, and in fact more inclined, to pursue income-decreasing forecast management (FM) with the purpose of generating positive FEs. Using a sample of 6553 firm-years of companies listed on the NYSE between 2005 and 2010, the study determines that sell companies conduct income-decreasing FM to generate positive FEs. However, the frequency of positive FEs of sell companies does not exceed that of buy companies. Using the efficiency perspective, the study suggests that even though buy and sell companies have immense motivation to avoid negative FEs, they exploit different but efficient strategies, respectively, in order to meet forecasts. Furthermore, the findings illuminate the complexities behind informative and opportunistic forecasts that fall under the efficiency versus opportunistic theories in the literature.

  9. Earthquake cycles and physical modeling of the process leading up to a large earthquake

    NASA Astrophysics Data System (ADS)

    Ohnaka, Mitiyasu

    2004-08-01

    A thorough discussion is made on what the rational constitutive law for earthquake ruptures ought to be from the standpoint of the physics of rock friction and fracture on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology of forecasting large earthquakes, the entire process of one cycle for a typical, large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate- and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II), and the phase of rupture nucleation at the critical stage where an adequate amount of the elastic strain energy has been stored (phase III). Phase II plays a critical role in physical modeling of intermediate-term forecasting, and phase III in physical modeling of short-term (immediate) forecasting. The seismogenic layer and individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale-dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling, in order to construct realistic, unified scenarios with predictive capability. The scenario presented may be significant and useful as a necessary first step for establishing the methodology for forecasting large earthquakes.

  10. Forecasting the climate response to volcanic eruptions: prediction skill related to stratospheric aerosol forcing

    NASA Astrophysics Data System (ADS)

    Ménégoz, M.; Bilbao, R.; Bellprat, O.; Guemas, V.; Doblas-Reyes, F. J.

    2018-06-01

    The last major volcanic eruptions, the Agung in 1963, El Chichon in 1982 and Pinatubo in 1991, were each associated with a cooling of the troposphere that has been observed over large continental areas and over the Western Pacific, the Indian Ocean and the Southern Atlantic. Simultaneously, Eastern tropical Pacific temperatures increased due to prevailing El Niño conditions. Here we show that the pattern of these near-surface temperature anomalies is partly reproduced with decadal simulations of the EC-Earth model initialised with climate observations and forced with an estimate of the observed volcanic aerosol optical thickness. Sensitivity experiments highlight a cooling induced by the volcanic forcing, whereas El Niño events following the eruptions would have occurred even without volcanic eruptions. Focusing on the period 1961–2001, the main source of skill of this decadal forecast system during the first 2 years is related to the initialisation of the model. The contribution of the initialisation to the skill becomes smaller than the contribution of the volcanic forcing after two years, the latter being substantial in the Western Pacific, the Indian Ocean and the Western Atlantic. Two simple protocols for real time forecasts are investigated: using the forcing of a past volcanic eruption to simulate the forcing of a new one, and applying a two-year exponential decay to the initial stratospheric aerosol load observed at the beginning of the forecast. This second protocol applied in retrospective forecasts allows a partial reproduction of the skill attained with observed forcing.
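    The second real-time protocol described above amounts to decaying the initially observed stratospheric aerosol load exponentially. Assuming "two-year exponential decay" means a two-year e-folding time (our reading, not stated explicitly), a sketch:

    ```python
    import math

    def aerosol_load(initial_load, years_since_start, efolding_years=2.0):
        """Forecast the stratospheric aerosol optical depth by exponential
        decay of the initially observed load (two-year e-folding assumed)."""
        return initial_load * math.exp(-years_since_start / efolding_years)

    # Example: an initial optical depth of 0.1 decays to ~0.037 after 2 years.
    load_after_2y = aerosol_load(0.1, 2.0)
    ```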

  11. Support vector machine for day ahead electricity price forecasting

    NASA Astrophysics Data System (ADS)

    Razak, Intan Azmira binti Wan Abdul; Abidin, Izham bin Zainal; Siah, Yap Keem; Rahman, Titik Khawa binti Abdul; Lada, M. Y.; Ramani, Anis Niza binti; Nasir, M. N. M.; Ahmad, Arfah binti

    2015-05-01

    Electricity price forecasting has become an important part of power system operation and planning. In a pool-based electric energy market, producers submit selling bids consisting of energy blocks and their corresponding minimum selling prices to the market operator, while consumers submit buying bids consisting of energy blocks and their corresponding maximum buying prices. Hence, both producers and consumers use day-ahead price forecasts to derive their respective bidding strategies in the electricity market and to reduce the cost of electricity. However, forecasting electricity prices is a complex task because a price series is non-stationary and highly volatile. Many factors cause price spikes, such as volatility in load and fuel price, as well as power imported to and exported from outside the market through long-term contracts. This paper introduces a machine learning approach to day-ahead electricity price forecasting with Least Squares Support Vector Machine (LS-SVM). Previous-day data on the Hourly Ontario Electricity Price (HOEP), generation price and demand from the Ontario power market are used as inputs for training. The simulation is carried out using LSSVMlab in Matlab with training and testing data from 2004. SVM, widely used for classification and regression, has great generalization ability owing to the structural risk minimization principle rather than empirical risk minimization. Moreover, identical parameter settings in a trained SVM give identical results, which substantially reduces the simulation process compared with other techniques such as neural networks and time series models. The mean absolute percentage error (MAPE) for the proposed model shows that SVM performs well compared to a neural network.
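    The input setup above (previous-day price and demand predicting the next-day price, scored by MAPE) can be sketched with kernel support vector regression. LS-SVM itself is not in scikit-learn, so epsilon-SVR stands in here as an analogous kernel regressor; the data and parameters are synthetic placeholders, not the Ontario market data:

    ```python
    import numpy as np
    from sklearn.svm import SVR

    # Synthetic autocorrelated price driven by demand (illustrative only).
    rng = np.random.default_rng(1)
    demand = rng.uniform(10.0, 20.0, 200)
    price = np.empty(200)
    price[0] = 35.0
    for t in range(1, 200):
        price[t] = 0.8 * price[t - 1] + 0.5 * demand[t] + rng.normal(0.0, 0.3)

    X = np.column_stack([price[:-1], demand[:-1]])  # previous-day inputs
    y = price[1:]                                   # next-day price target

    model = SVR(kernel="rbf", C=100.0).fit(X[:150], y[:150])
    pred = model.predict(X[150:])
    mape = float(np.mean(np.abs((y[150:] - pred) / y[150:]))) * 100.0
    ```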

  12. Forecasting long-range atmospheric transport episodes of polychlorinated biphenyls using FLEXPART

    NASA Astrophysics Data System (ADS)

    Halse, Anne Karine; Eckhardt, Sabine; Schlabach, Martin; Stohl, Andreas; Breivik, Knut

    2013-06-01

    The analysis of concentrations of persistent organic pollutants (POPs) in ambient air is costly and can only be done for a limited number of samples. It is thus beneficial to maximize the information content of the samples analyzed via a targeted observation strategy. Using polychlorinated biphenyls (PCBs) as an example, a forecasting system to predict and evaluate long-range atmospheric transport (LRAT) episodes of POPs at a remote site in southern Norway has been developed. The system uses the Lagrangian particle transport model FLEXPART, and can be used for triggering extra ("targeted") sampling when LRAT episodes are predicted to occur. The system was evaluated by comparing targeted samples collected over 12-25 h during individual LRAT episodes with monitoring samples regularly collected over one day per week throughout a year. Measured concentrations in all targeted samples were above the 75th percentile of the concentrations obtained from the regular monitoring program and included the highest measured values of all samples. This clearly demonstrates the success of the targeted sampling strategy.

  13. Scalar model for frictional precursors dynamics

    PubMed Central

    Taloni, Alessandro; Benassi, Andrea; Sandfeld, Stefan; Zapperi, Stefano

    2015-01-01

    Recent experiments indicate that frictional sliding occurs by nucleation of detachment fronts at the contact interface that may appear well before the onset of global sliding. This intriguing precursory activity is not accounted for by traditional friction theories but is extremely important for friction-dominated geophysical phenomena such as earthquakes, landslides and avalanches. Here we simulate the onset of slip of a three-dimensional elastic body resting on a surface and show that experimentally observed frictional precursors depend in a complex, non-universal way on the sample geometry and loading conditions. Our model satisfies Archard's law and Amontons' first and second laws, reproducing with remarkable precision the real contact area dynamics, the precursors' envelope dynamics prior to sliding, and the normal and shear internal stress distributions close to the interfacial surface. Moreover, it allows one to assess which features can be attributed to elastic equilibrium and which to the out-of-equilibrium dynamics, suggesting that precursory activity is an intrinsically quasi-static physical process. A direct calculation of the evolution of the Coulomb stress before and during precursor nucleation shows large variations across the sample, explaining why earthquake forecasting methods based only on accumulated slip and Coulomb stress monitoring are often ineffective. PMID:25640079

  14. Scalar model for frictional precursors dynamics.

    PubMed

    Taloni, Alessandro; Benassi, Andrea; Sandfeld, Stefan; Zapperi, Stefano

    2015-02-02

    Recent experiments indicate that frictional sliding occurs by nucleation of detachment fronts at the contact interface that may appear well before the onset of global sliding. This intriguing precursory activity is not accounted for by traditional friction theories but is extremely important for friction-dominated geophysical phenomena such as earthquakes, landslides and avalanches. Here we simulate the onset of slip of a three-dimensional elastic body resting on a surface and show that experimentally observed frictional precursors depend in a complex, non-universal way on the sample geometry and loading conditions. Our model satisfies Archard's law and Amontons' first and second laws, reproducing with remarkable precision the real contact area dynamics, the precursors' envelope dynamics prior to sliding, and the normal and shear internal stress distributions close to the interfacial surface. Moreover, it allows one to assess which features can be attributed to elastic equilibrium and which to the out-of-equilibrium dynamics, suggesting that precursory activity is an intrinsically quasi-static physical process. A direct calculation of the evolution of the Coulomb stress before and during precursor nucleation shows large variations across the sample, explaining why earthquake forecasting methods based only on accumulated slip and Coulomb stress monitoring are often ineffective.

  15. Satellite Altimetry based River Forecasting of Transboundary Flow

    NASA Astrophysics Data System (ADS)

    Hossain, F.; Siddique-E-Akbor, A.; Lee, H.; Shum, C.; Biancamaria, S.

    2012-12-01

    Forecasting of transboundary flow in downstream nations remains notoriously difficult due to the lack of basin-wide in-situ hydrologic measurements or their real-time sharing among nations. In addition, human regulation of upstream flow through diversion projects and dams makes hydrologic models less effective for forecasting on their own. Using the Ganges-Brahmaputra (GB) basin as an example, this study assesses the feasibility of using JASON-2 satellite altimetry for forecasting such transboundary flow at locations further inside the downstream nation of Bangladesh by propagating forecasts derived from upstream (Indian) locations through a hydrodynamic river model. The 5-day forecast of river levels at upstream boundary points inside Bangladesh is used to initialize daily simulation of the hydrodynamic river model and to yield 5-day forecasts of river levels further downstream inside Bangladesh. The forecast river levels are then compared with the 5-day-later "nowcast" simulation by the river model based on in-situ river levels at the upstream boundary points in Bangladesh. Future directions for satellite-based forecasting of flow are also briefly overviewed. Figure caption: ground tracks or virtual stations of the JASON-2 (J2) altimeter over the GB basin are shown in yellow lines; the locations where a track crosses a river, used for deriving forecasting rating curves, are shown with a circle and station number (magenta: Brahmaputra basin; blue: Ganges basin). Circles without a station number represent the broader view of sampling by JASON-2 if all ground tracks on the main-stem rivers and neighboring tributaries of the Ganges and Brahmaputra are considered.

  16. A GLM Post-processor to Adjust Ensemble Forecast Traces

    NASA Astrophysics Data System (ADS)

    Thiemann, M.; Day, G. N.; Schaake, J. C.; Draijer, S.; Wang, L.

    2011-12-01

    The skill of hydrologic ensemble forecasts has improved in recent years through a better understanding of climate variability, better climate forecasts and new data assimilation techniques. Having been extensively utilized for probabilistic water supply forecasting, these forecasts are now attracting interest for use in operational decision making. Hydrologic ensemble forecast members typically have inherent biases in flow timing and volume caused by (1) structural errors in the models used, (2) systematic errors in the data used to calibrate those models, (3) uncertain initial hydrologic conditions, and (4) uncertainties in the forcing datasets. Furthermore, hydrologic models have often not been developed for operational decision points, so ensemble forecasts are not always available where needed. A statistical post-processor can be used to address these issues. The post-processor should (1) correct for systematic biases in flow timing and volume, (2) preserve the skill of the available raw forecasts, (3) preserve spatial and temporal correlation as well as the uncertainty in the forecasted flow data, (4) produce adjusted forecast ensembles that represent the variability of the observed hydrograph to be predicted, and (5) preserve individual forecast traces as equally likely. The post-processor should also allow for the translation of available ensemble forecasts to hydrologically similar locations where forecasts are not available. This paper introduces an ensemble post-processor (EPP) developed in support of New York City water supply operations. The EPP employs a general linear model (GLM) to (1) adjust available ensemble forecast traces and (2) create new ensembles for (nearby) locations where only historic flow observations are available. The EPP is calibrated by developing daily and aggregated statistical relationships from historical flow observations and model simulations.
These are then used in operation to obtain the conditional probability density function (PDF) of the observations to be predicted, thus jointly adjusting individual ensemble members. These steps are executed in a normalized transformed space ('z'-space) to account for the strong non-linearity in the flow observations involved. A data window centered on each calibration date is used to minimize impacts from sampling errors and data noise. Testing on datasets from California and New York suggests that the EPP can successfully minimize biases in ensemble forecasts, while preserving the raw forecast skill in a 'days to weeks' forecast horizon and reproducing the variability of climatology for 'weeks to years' forecast horizons.
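    The normalized 'z'-space step described above can be illustrated with a rank-based normal quantile transform: map flows to standard-normal space through a climatology, adjust them there, and map back. This is a simplified sketch in the spirit of the EPP, not its actual regression; the climatology, ensemble and shrink coefficient (0.8) are all made-up placeholders:

    ```python
    import numpy as np
    from scipy import stats

    def to_z(values, climatology):
        """Map flow values to standard-normal ('z') space through the
        empirical CDF of a climatology sample."""
        ranks = np.searchsorted(np.sort(climatology), values, side="right")
        p = np.clip(ranks / (len(climatology) + 1), 1e-6, 1 - 1e-6)
        return stats.norm.ppf(p)

    # Toy climatology and a raw ensemble forecast biased high.
    rng = np.random.default_rng(2)
    clim = rng.gamma(2.0, 50.0, 5000)        # historical flows
    raw_ensemble = rng.gamma(2.0, 80.0, 50)  # biased forecast traces

    z = to_z(raw_ensemble, clim)
    # A calibrated GLM would regress observed z on forecast z; here an
    # illustrative shrink toward climatology (slope 0.8) stands in for it.
    z_adj = 0.8 * z
    adjusted = np.quantile(clim, stats.norm.cdf(z_adj))
    ```

    Because the adjustment is monotone in z-space, each trace keeps its rank within the ensemble, consistent with treating traces as equally likely.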

  17. Exploring What Determines the Use of Forecasts of Varying Time Periods in Guanacaste, Costa Rica

    NASA Astrophysics Data System (ADS)

    Babcock, M.; Wong-Parodi, G.; Grossmann, I.; Small, M. J.

    2016-12-01

    Weather and climate forecasts are promoted as ways to improve water management, especially in the face of changing environmental conditions. However, studies indicate that many stakeholders who may benefit from such information do not use it. This study sought to better understand which personal factors (e.g., trust in forecast sources, perceptions of accuracy) were important determinants of the use of 4-day, 3-month, and 12-month rainfall forecasts by stakeholders in water management-related sectors in the seasonally dry province of Guanacaste, Costa Rica. From August to October 2015, we surveyed 87 stakeholders from a mix of government agencies, local water committees, large farms, tourist businesses, environmental NGOs, and the public. The results of an exploratory factor analysis suggest that trust in "informal" forecast sources (traditional methods, family advice) and in "formal" sources (government, university and private company science) are independent of each other. The results of logistic regression analyses suggest that 1) greater understanding of forecasts is associated with a greater probability of 4-day and 3-month forecast use, but not 12-month forecast use; 2) a greater probability of 3-month forecast use is associated with a lower level of trust in "informal" sources; and 3) feeling less secure about water resources and regularly using many sources of information (specifically formal meetings and reports) are each associated with a greater probability of using 12-month forecasts. While limited by the sample size and affected by the factoring method and regression model assumptions, these results suggest that while forecasts of all time scales are used to some extent, local decision makers' use of 4-day and 3-month forecasts appears to be more intrinsically motivated (based on their level of understanding and trust), while the use of 12-month forecasts seems to be more motivated by a sense of requirement or mandate.

  18. Forecasting Emergency Department Crowding: An External, Multi-Center Evaluation

    PubMed Central

    Hoot, Nathan R.; Epstein, Stephen K.; Allen, Todd L.; Jones, Spencer S.; Baumlin, Kevin M.; Chawla, Neal; Lee, Anna T.; Pines, Jesse M.; Klair, Amandeep K.; Gordon, Bradley D.; Flottemesch, Thomas J.; LeBlanc, Larry J.; Jones, Ian; Levin, Scott R.; Zhou, Chuan; Gadd, Cynthia S.; Aronsky, Dominik

    2009-01-01

    Objective To apply a previously described tool to forecast ED crowding at multiple institutions, and to assess its generalizability for predicting the near-future waiting count, occupancy level, and boarding count. Methods The ForecastED tool was validated using historical data from five institutions external to the development site. A sliding-window design separated the data for parameter estimation and forecast validation. Observations were sampled at consecutive 10-minute intervals during 12 months (n = 52,560) at four sites and 10 months (n = 44,064) at the fifth. Three outcome measures – the waiting count, occupancy level, and boarding count – were forecast 2, 4, 6, and 8 hours beyond each observation, and forecasts were compared to observed data at corresponding times. The reliability and calibration were measured following previously described methods. After linear calibration, the forecasting accuracy was measured using the median absolute error (MAE). Results The tool was successfully used for five different sites. Its forecasts were more reliable, better calibrated, and more accurate at 2 hours than at 8 hours. The reliability and calibration of the tool were similar between the original development site and external sites; the boarding count was an exception, which was less reliable at four out of five sites. Some variability in accuracy existed among institutions; when forecasting 4 hours into the future, the MAE of the waiting count ranged between 0.6 and 3.1 patients, the MAE of the occupancy level ranged between 9.0 and 14.5% of beds, and the MAE of the boarding count ranged between 0.9 and 2.7 patients. Conclusion The ForecastED tool generated potentially useful forecasts of input and throughput measures of ED crowding at five external sites, without modifying the underlying assumptions. Noting the limitation that this was not a real-time validation, ongoing research will focus on integrating the tool with ED information systems. PMID:19716629
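
The headline accuracy metric in this abstract, the median absolute error of a (linearly calibrated) forecast, is simple to compute. A minimal sketch — not the ForecastED code, and the function name is ours:

```python
import statistics

def median_absolute_error(observed, forecast):
    # MAE as used in this abstract: the median of the absolute forecast errors
    return statistics.median(abs(o - f) for o, f in zip(observed, forecast))
```

For example, observed waiting counts [10, 20, 30] against forecasts [12, 19, 35] give absolute errors [2, 1, 5] and an MAE of 2.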

  19. Real-time prediction of atmospheric Lagrangian coherent structures based on forecast data: An application and error analysis

    NASA Astrophysics Data System (ADS)

    BozorgMagham, Amir E.; Ross, Shane D.; Schmale, David G.

    2013-09-01

    The language of Lagrangian coherent structures (LCSs) provides a new means for studying transport and mixing of passive particles advected by an atmospheric flow field. Recent observations suggest that LCSs govern the large-scale atmospheric motion of airborne microorganisms, paving the way for more efficient models and management strategies for the spread of infectious diseases affecting plants, domestic animals, and humans. In addition, having reliable predictions of the timing of hyperbolic LCSs may contribute to improved aerobiological sampling of microorganisms with unmanned aerial vehicles and LCS-based early warning systems. Chaotic atmospheric dynamics lead to unavoidable forecasting errors in the wind velocity field, which compound errors in LCS forecasting. In this study, we reveal the cumulative effects of errors of (short-term) wind field forecasts on the finite-time Lyapunov exponent (FTLE) fields and the associated LCSs when realistic forecast plans impose certain limits on the forecasting parameters. Objectives of this paper are to (a) quantify the accuracy of prediction of FTLE-LCS features and (b) determine the sensitivity of such predictions to forecasting parameters. Results indicate that forecasts of attracting LCSs exhibit less divergence from the archive-based LCSs than the repelling features. This result is important since attracting LCSs are the backbone of long-lived features in moving fluids. We also show under what circumstances one can trust the forecast results if one merely wants to know if an LCS passed over a region and does not need to precisely know the passage time.

  20. The clustering-based case-based reasoning for imbalanced business failure prediction: a hybrid approach through integrating unsupervised process with supervised process

    NASA Astrophysics Data System (ADS)

    Li, Hui; Yu, Jun-Ling; Yu, Le-An; Sun, Jie

    2014-05-01

    Case-based reasoning (CBR) is one of the main methods in business forecasting; it predicts well and can provide explanations for its results. In business failure prediction (BFP), the number of failed enterprises is small relative to the number of non-failed ones, yet the loss when an enterprise fails is huge. It is therefore necessary to develop methods, trained on imbalanced samples, that forecast this small proportion of failed enterprises well while maintaining high total accuracy. Commonly used methods built on the assumption of balanced samples do not predict the minority class well on imbalanced samples consisting of the minority (failed) enterprises and the majority (non-failed) ones. This article develops a new method called clustering-based CBR (CBCBR), which integrates clustering analysis, an unsupervised process, with CBR, a supervised process, to enhance the efficiency of retrieving information from both the minority and the majority classes in CBR. In CBCBR, case classes are first generated by hierarchical clustering of the stored experienced cases, and class centres are computed by aggregating the information of the cases in each clustered class. When predicting the label of a target case, the nearest clustered case class is first retrieved by ranking the similarities between the target case and each class centre. Then, the nearest neighbours of the target case within the retrieved class are found. Finally, the labels of these nearest experienced cases are used for prediction. In an empirical experiment with two imbalanced samples from China, the performance of CBCBR was compared with classical CBR, a support vector machine, logistic regression and multivariate discriminant analysis. The results show that, compared with the other four methods, CBCBR performed significantly better in terms of sensitivity for identifying the minority samples while also achieving high total accuracy. The proposed approach makes CBR useful for imbalanced forecasting.
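
The retrieval stage of a clustering-based CBR scheme like the one described can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: it assumes the case classes have already been formed (e.g., by hierarchical clustering), represents each experienced case as a (features, label) pair, and uses Euclidean distance; all names are ours.

```python
import math

def class_centre(cases):
    # mean feature vector of one clustered case class; cases are (features, label)
    dim = len(cases[0][0])
    return [sum(c[0][i] for c in cases) / len(cases) for i in range(dim)]

def cbcbr_predict(clusters, target, k=3):
    # 1) retrieve the nearest clustered case class by centre distance
    best = min(clusters, key=lambda cl: math.dist(class_centre(cl), target))
    # 2) retrieve the k nearest experienced cases inside that class
    nearest = sorted(best, key=lambda c: math.dist(c[0], target))[:k]
    # 3) predict the majority label of those neighbours
    labels = [c[1] for c in nearest]
    return max(set(labels), key=labels.count)
```

Restricting the neighbour search to the retrieved class is what lets the minority (failed) cases dominate retrieval when the target falls near a minority cluster.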

  1. Automated flare forecasting using a statistical learning technique

    NASA Astrophysics Data System (ADS)

    Yuan, Yuan; Shih, Frank Y.; Jing, Ju; Wang, Hai-Min

    2010-08-01

    We present a new method for automatically forecasting the occurrence of solar flares based on photospheric magnetic measurements. The method is a cascading combination of an ordinal logistic regression model and a support vector machine classifier. The predictive variables are three photospheric magnetic parameters, i.e., the total unsigned magnetic flux, length of the strong-gradient magnetic polarity inversion line, and total magnetic energy dissipation. The output is true or false for the occurrence of a certain level of flares within 24 hours. Experimental results, from a sample of 230 active regions between 1996 and 2005, show the accuracies of a 24-hour flare forecast to be 0.86, 0.72, 0.65 and 0.84 respectively for the four different levels. Comparison shows an improvement in the accuracy of X-class flare forecasting.
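
The classification stage of such a forecast can be illustrated with a bare logistic model on the three magnetic parameters. This sketch omits the paper's ordinal-regression/SVM cascade, and the weights, bias and threshold below are made up purely for illustration:

```python
import math

def flare_probability(flux, pil_length, energy, weights, bias):
    # logistic score from the three photospheric magnetic parameters
    z = weights[0] * flux + weights[1] * pil_length + weights[2] * energy + bias
    return 1.0 / (1.0 + math.exp(-z))

def forecast_flare(params, weights, bias, threshold=0.5):
    # true/false: will a flare of the given class occur within 24 hours?
    return flare_probability(*params, weights, bias) >= threshold
```

In practice the weights would be fitted on the 230-active-region sample, with one threshold per flare level.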

  2. Evaluating the performance of the Lee-Carter method and its variants in modelling and forecasting Malaysian mortality

    NASA Astrophysics Data System (ADS)

    Zakiyatussariroh, W. H. Wan; Said, Z. Mohammad; Norazan, M. R.

    2014-12-01

    This study investigated the performance of the Lee-Carter (LC) method and its variants in modelling and forecasting Malaysian mortality. These include the original LC, the Lee-Miller (LM) variant and the Booth-Maindonald-Smith (BMS) variant. The methods were evaluated using Malaysian mortality data, measured as age-specific death rates (ASDR) for 1971 to 2009 for the overall population, while data for 1980-2009 were used in separate models for the male and female populations. The performance of the variants was examined in terms of goodness of fit and forecasting accuracy, compared using several criteria: mean square error (MSE), root mean square error (RMSE), mean absolute deviation (MAD) and mean absolute percentage error (MAPE). The results indicate that the BMS method performed best in in-sample fitting, both for the overall population and when the models were fitted separately for the male and female populations. For out-of-sample forecast accuracy, however, the BMS method was best only when the data were fitted to the overall population; when fitted separately by gender, the LCnone variant performed better for the male population and the LM method for the female population.
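
The original LC fit, log m(x,t) = a_x + b_x * k_t estimated by a rank-1 SVD of the centred log-rates, can be sketched as follows. This is a simplified reconstruction using power iteration, not the authors' code; the usual identifiability constraints sum(b) = 1 and sum(k) = 0 are imposed:

```python
import math

def lee_carter(log_rates):
    # fit log m(x,t) = a_x + b_x * k_t by a rank-1 SVD (power iteration)
    X, T = len(log_rates), len(log_rates[0])
    a = [sum(row) / T for row in log_rates]                 # age effect: row means
    R = [[log_rates[x][t] - a[x] for t in range(T)] for x in range(X)]
    k = [1.0] + [0.0] * (T - 1)                             # crude start vector
    for _ in range(100):
        b = [sum(R[x][t] * k[t] for t in range(T)) for x in range(X)]
        norm = math.sqrt(sum(v * v for v in b))
        b = [v / norm for v in b]
        k = [sum(R[x][t] * b[x] for x in range(X)) for t in range(T)]
    s = sum(b)                                              # rescale so sum(b) = 1
    b = [v / s for v in b]
    k = [v * s for v in k]
    m = sum(k) / T                                          # centre so sum(k) = 0
    k = [v - m for v in k]
    a = [a[x] + b[x] * m for x in range(X)]
    return a, b, k
```

Forecasting then reduces to extrapolating the single time index k_t (classically with a random walk with drift); the LM and BMS variants modify the fitting period and the adjustment of k_t.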

  3. Economic consequences of improved temperature forecasts: An experiment with the Florida citrus growers (control group results). [weather forecasting

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A demonstration experiment is being planned to show that frost and freeze prediction improvements are possible using timely Synchronous Meteorological Satellite temperature measurements and that this information can affect Florida citrus grower operations and decisions. An economic experiment was carried out to monitor citrus growers' decisions, actions, costs and losses, together with meteorological forecasts and actual weather events, and to establish the economic benefits of improved temperature forecasts. A summary is given of the economic experiment, the results obtained to date, and the work that remains to be done. Specifically, the following are described in detail: the experiment design; the data collection methodology and procedures; the sampling plan; the data reduction techniques; the cost and loss models; the establishment of frost severity measures; the data obtained from citrus growers, the National Weather Service, and the Federal Crop Insurance Corp.; the resulting protection costs and crop losses for the control group sample; the extrapolation of control group results to the Florida citrus industry; and the method for normalizing these results to a normal or average frost season so that they may be compared with anticipated similar results from test group measurements.

  4. Analysis of continuous GPS measurements from southern Victoria Land, Antarctica

    USGS Publications Warehouse

    Willis, Michael J.

    2007-01-01

    Several years of continuous data have been collected at remote bedrock Global Positioning System (GPS) sites in southern Victoria Land, Antarctica. Annual to sub-annual variations are observed in the position time-series. An atmospheric pressure loading (APL) effect is calculated from pressure field anomalies supplied by the European Centre for Medium-Range Weather Forecasts (ECMWF) model loading an elastic Earth model. The predicted APL signal has a moderate correlation with the vertical position time-series at McMurdo, Ross Island (International Global Navigation Satellite System Service (IGS) station MCM4), produced using a global solution. In contrast, a local solution in which MCM4 is the fiducial site generates a vertical time series for a remote site in Victoria Land (Cape Roberts, ROB4) which exhibits a low, inverse correlation with the predicted atmospheric pressure loading signal. If, in the future, known and well modeled geophysical loads can be separated from the time-series, then local hydrological loading, of interest for glaciological and climate applications, can potentially be extracted from the GPS time-series.

  5. Time Relevance of Convective Weather Forecast for Air Traffic Automation

    NASA Technical Reports Server (NTRS)

    Chan, William N.

    2006-01-01

    The Federal Aviation Administration (FAA) is handling nearly 120,000 flights a day through its Air Traffic Management (ATM) system, and air traffic congestion is expected to increase substantially over the next 20 years. Weather-induced impacts to throughput and efficiency are the leading cause of flight delays, accounting for 70% of all delays, with convective weather accounting for 60% of all weather-related delays. To support the Next Generation Air Traffic System goal of operating at 3X current capacity in the NAS, ATC decision support tools are being developed to create advisories that assist controllers under all weather constraints. Initial development of these decision support tools did not integrate information regarding weather constraints such as thunderstorms and relied on an additional system to provide that information. Future decision support tools should move towards an integrated system where weather constraints are factored into the advisory of a Decision Support Tool (DST). Several groups, such as NASA Ames, Lincoln Laboratory, and MITRE, are integrating convective weather data with DSTs. A survey of current convective weather forecast and observation products shows they span a wide range of temporal and spatial resolutions. Short-range convective observations can be obtained every 5 minutes, with longer-range forecasts out to several days updated every 6 hours. Today, short-range forecasts of less than 2 hours have a temporal resolution of 5 minutes; beyond 2 hours, forecasts have a much lower temporal resolution, typically 1 hour. Spatial resolutions vary from 1 km for short-range to 40 km for longer-range forecasts. Improving the accuracy of long-range convective forecasts is a major challenge. A report published by the National Research Council states that improvements for convective forecasts in the 2- to 6-hour time frame will only be achieved for a limited set of convective phenomena in the next 5 to 10 years. 
Improved longer-range forecasts will be probabilistic, as opposed to the deterministic shorter-range forecasts. Despite the known low level of confidence in long-range convective forecasts, these data are still useful to a DST routing algorithm: it is better to develop an aircraft route using the best information available than no information. The temporally coarse long-range forecast data need to be interpolated to be useful to a DST. A DST uses aircraft trajectory predictions that need to be evaluated for impacts by convective storms, and each time step of a trajectory prediction needs to be checked against weather data. For the case of coarse temporal data, there needs to be a method to fill in weather data where there is none. Simply using the coarse weather data without any interpolation can result in DST routes that are impacted by regions of strong convection. Increasing the temporal resolution of these data can be achieved, but the result is a large dataset that may prove to be an operational challenge to transmit and load into a DST. Currently, it takes about 7 minutes to retrieve a 7 MB RUC2 forecast file from NOAA at NASA Ames Research Center. A prototype NCWF6 1-hour forecast is about 3 MB in size. A six-hour NCWF6 forecast with a 1-hour forecast time step will be about 18 MB (6 x 3 MB), while a six-hour NCWF6 forecast with a 15-minute forecast time step will be about 72 MB (24 x 3 MB). Based on the time it takes to retrieve a 7 MB RUC2 forecast, it will take approximately 70 minutes to retrieve a six-hour NCWF forecast with 15-minute time steps. Until those issues are addressed, there is a need for an algorithm that interpolates between these temporally coarse long-range forecasts. This paper describes a method for using low-temporal-resolution probabilistic weather forecasts in a DST. The paper begins with a description of some convective weather forecast and observation products, followed by an example of how weather data are used by a DST. 
The subsequent sections describe probabilistic forecasts, followed by a description of a method to use low-temporal-resolution probabilistic weather forecasts by assigning a relevance value to these data outside of their valid times.
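
The basic gap-filling idea described here — estimating weather data between temporally coarse forecast valid times — can be sketched as simple linear interpolation of a convection probability. This is an illustrative stand-in, not the relevance algorithm the paper develops:

```python
def interpolated_probability(valid_times, probs, t):
    # linearly interpolate convection probability between coarse
    # forecast valid times (in minutes); clamp outside the valid range
    if t <= valid_times[0]:
        return probs[0]
    if t >= valid_times[-1]:
        return probs[-1]
    for i in range(len(valid_times) - 1):
        t0, t1 = valid_times[i], valid_times[i + 1]
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return (1 - w) * probs[i] + w * probs[i + 1]
```

A trajectory time step at t = 90 minutes between hourly forecasts of 0.6 and 0.4 would then be checked against the interpolated value 0.5 rather than against no data at all.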

  6. A Bayesian spatio-temporal model for forecasting Anaplasma species seroprevalence in domestic dogs within the contiguous United States

    PubMed Central

    Liu, Yan; Watson, Stella C.; Gettings, Jenna R.; Lund, Robert B.; Nordone, Shila K.; McMahan, Christopher S.

    2017-01-01

    This paper forecasts the 2016 canine Anaplasma spp. seroprevalence in the United States from eight climate, geographic and societal factors. The forecast’s construction and an assessment of its performance are described. The forecast is based on a spatial-temporal conditional autoregressive model fitted to over 11 million Anaplasma spp. seroprevalence test results for dogs conducted in the 48 contiguous United States during 2011–2015. The forecast uses county-level data on eight predictive factors, including annual temperature, precipitation, relative humidity, county elevation, forestation coverage, surface water coverage, population density and median household income. Non-static factors are extrapolated into the forthcoming year with various statistical methods. The fitted model and factor extrapolations are used to estimate next year’s regional prevalence. The correlation between the observed and model-estimated county-by-county Anaplasma spp. seroprevalence for the five-year period 2011–2015 is 0.902, demonstrating reasonable model accuracy. The weighted correlation (accounting for different sample sizes) between 2015 observed and forecasted county-by-county Anaplasma spp. seroprevalence is 0.987, showing that the proposed approach can be used to accurately forecast Anaplasma spp. seroprevalence. The forecast presented herein can a priori alert veterinarians to areas expected to see Anaplasma spp. seroprevalence beyond the accepted endemic range. The proposed methods may prove useful for forecasting other diseases. PMID:28738085
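
The weighted correlation used to score the forecast (with county sample sizes as weights) is a standard weighted Pearson correlation. A minimal sketch, with names of our choosing:

```python
import math

def weighted_correlation(x, y, w):
    # Pearson correlation with per-observation weights (e.g. county sample sizes)
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw   # weighted means
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    cov = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
    vx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x))
    vy = sum(wi * (yi - my) ** 2 for wi, yi in zip(w, y))
    return cov / math.sqrt(vx * vy)
```

Weighting this way keeps counties with very few tests from dominating the accuracy assessment.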

  7. Potential predictability and forecast skill in ensemble climate forecast: a skill-persistence rule

    NASA Astrophysics Data System (ADS)

    Jin, Yishuai; Rong, Xinyao; Liu, Zhengyu

    2017-12-01

    This study investigates the relationship between the forecast skill for the real world (actual skill) and for the perfect model (perfect skill) in ensemble climate model forecasts, using a series of fully coupled general circulation model forecast experiments. It is found that the actual skill for sea surface temperature (SST) in seasonal forecasts is substantially higher than the perfect skill over a large part of the tropical oceans, especially the tropical Indian Ocean and the central-eastern Pacific Ocean. The higher actual skill is found to be related to the higher observational SST persistence, suggesting a skill-persistence rule: a higher SST persistence in the real world than in the model can overwhelm the model bias to produce a higher forecast skill for the real world than for the perfect model. The relation between forecast skill and persistence is further demonstrated using a first-order autoregressive (AR1) model, analytically for theoretical solutions and numerically for analogue experiments. The AR1 model study shows that the skill-persistence rule is strictly valid in the case of infinite ensemble size, but can be distorted by sampling errors and non-AR1 processes. This study suggests that the so-called "perfect skill" is model dependent and cannot serve as an accurate estimate of the true upper limit of real-world prediction skill, unless the model can capture at least the persistence property of the observations.
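
The skill-persistence link can be illustrated with a toy AR1 experiment: for an AR1 process with lag-1 autocorrelation r, the correlation skill of a persistence forecast at lead tau is approximately r^tau, so higher persistence yields higher skill. A sketch under those assumptions, not the authors' experimental setup:

```python
import math
import random

def ar1_series(r, n, seed=1):
    # simulate a unit-variance AR1 process: x(t+1) = r*x(t) + noise
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = r * x + math.sqrt(1.0 - r * r) * rng.gauss(0.0, 1.0)
        out.append(x)
    return out

def correlation(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / math.sqrt(vx * vy)

def persistence_skill(series, lead):
    # skill of the persistence forecast x(t) for the truth x(t + lead)
    return correlation(series[:-lead], series[lead:])
```

With a long enough series, the skill at lead 3 for r = 0.9 comes out near 0.9^3 = 0.729 and clearly exceeds the skill for r = 0.5, mirroring the rule that the more persistent process is the more predictable one.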

  8. Forecast Inaccuracies in Power Plant Projects From Project Managers' Perspectives

    NASA Astrophysics Data System (ADS)

    Sanabria, Orlando

    Guided by organizational theory, this phenomenological study explored the factors affecting forecast preparation and inaccuracies during the construction of fossil fuel-fired power plants in the United States. Forecast inaccuracies can create financial stress and uncertain profits during the project construction phase. A combination of purposeful and snowball sampling supported the selection of participants. Twenty project managers with over 15 years of experience in power generation and project experience across the United States were interviewed within a 2-month period. From the inductive codification and descriptive analysis, 5 themes emerged: (a) project monitoring, (b) cost control, (c) management review frequency, (d) factors to achieve a precise forecast, and (e) factors causing forecast inaccuracies. The findings of the study showed that the factors necessary to achieve a precise forecast include a detailed project schedule, accurate labor cost estimates, monthly project reviews and risk assessment, and proper utilization of accounting systems to monitor costs. The primary factors reported as causing forecast inaccuracies were cost overruns by subcontractors, scope gaps, labor cost and availability, and equipment and material cost. Results of this study could improve planning accuracy and the effective use of resources during construction of power plants. The study results could contribute to social change by providing project managers with a framework to lessen forecast inaccuracies, and by promoting construction of power plants that will generate employment opportunities and economic development.

  9. Systematic Evaluation of Stochastic Methods in Power System Scheduling and Dispatch with Renewable Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yishen; Zhou, Zhi; Liu, Cong

    2016-08-01

    As more wind power and other renewable resources are being integrated into the electric power grid, forecast uncertainty brings operational challenges for power system operators. In this report, different operational strategies for uncertainty management are presented and evaluated. A comprehensive and consistent simulation framework is developed to analyze the performance of different reserve policies and scheduling techniques under uncertainty in wind power. Numerical simulations are conducted on a modified version of the IEEE 118-bus system with a 20% wind penetration level, comparing deterministic, interval, and stochastic unit commitment strategies. The results show that stochastic unit commitment provides a reliable schedule without large increases in operational costs. Moreover, decomposition techniques, such as load shift factors and Benders decomposition, can help in overcoming the computational obstacles to stochastic unit commitment and enable the use of a larger scenario set to represent forecast uncertainty. In contrast, deterministic and interval unit commitment tend to give higher system costs as more reserves are scheduled to address forecast uncertainty; however, these approaches require a much lower computational effort. Choosing a proper lower bound for the forecast uncertainty is important for balancing reliability and system operational cost in deterministic and interval unit commitment. Finally, we find that the introduction of zonal reserve requirements improves reliability, but at the expense of higher operational costs.

  10. Convective Weather Forecast Quality Metrics for Air Traffic Management Decision-Making

    NASA Technical Reports Server (NTRS)

    Chatterji, Gano B.; Gyarfas, Brett; Chan, William N.; Meyn, Larry A.

    2006-01-01

    Since numerical weather prediction models are unable to accurately forecast the severity and the location of storm cells several hours into the future when compared with observation data, there has been growing interest in probabilistic descriptions of convective weather. The classical approach for generating uncertainty bounds consists of integrating the state equations and covariance propagation equations forward in time; this step is readily recognized as the process update step of the Kalman filter algorithm. The second well-known method, the Monte Carlo method, consists of generating output samples by driving the forecast algorithm with input samples selected from distributions. The statistical properties of the distributions of the output samples are then used to define the uncertainty bounds of the output variables. This method is computationally expensive for a complex model compared to the covariance propagation method; its main advantage is that a complex non-linear model can be easily handled. Recently, a few different methods for probabilistic forecasting have appeared in the literature. A method for computing the probability of convection in a region using forecast data is described in Ref. 5. Probability at a grid location is computed as the fraction of grid points, within a box of specified dimensions around the grid location, with forecast convective precipitation exceeding a specified threshold. The main limitation of this method is that the results depend on the chosen dimensions of the box. The examples presented in Ref. 5 show that this process is equivalent to low-pass filtering of the forecast data with a finite-support spatial filter. References 6 and 7 describe a technique for computing percentage coverage within a 92 x 92 square-kilometer box and assigning the value to the center 4 x 4 square-kilometer box; this technique is the same as that described in Ref. 5. 
Characterizing the forecast, following the process described in Refs. 5 through 7, in terms of percentage coverage or confidence level is notionally sound compared to characterizing it in terms of probabilities, because the probability of the forecast being correct can only be determined using actual observations; Refs. 5 through 7 use only the forecast data and not the observations. A method for computing the probability of detection, the false alarm ratio and several forecast quality metrics (skill scores) using both the forecast and observation data is given in Ref. 2. This paper extends the statistical verification method in Ref. 2 to determine co-occurrence probabilities. The method consists of computing the probability that a severe weather cell (grid location) is detected in the observation data in the neighborhood of a severe weather cell in the forecast data. Probabilities of occurrence at the grid location and in its neighborhood with higher severity, and with lower severity, in the observation data compared to the forecast data are examined. The method proposed in Refs. 5 through 7 is used for computing the probability that a certain number of cells in the neighborhood of severe weather cells in the forecast data are seen as severe weather cells in the observation data. Finally, the probability of the existence of gaps in the observation data in the neighborhood of severe weather cells in the forecast data is computed. Gaps are defined as openings between severe weather cells through which an aircraft can safely fly to its intended destination. The rest of the paper is organized as follows. Section II summarizes the statistical verification method described in Ref. 2. The extension of this method for computing the co-occurrence probabilities is discussed in Section III. Numerical examples using NCWF forecast data and NCWD observation data are presented in Section III to elucidate the characteristics of the co-occurrence probabilities. 
This section also discusses the procedure for computing the probabilities that the severity of convection in the observation data will be higher or lower in the neighborhood of grid locations compared to that indicated at the grid locations in the forecast data. The probability of coverage of neighborhood grid cells is also described via examples in this section. Section IV discusses the gap detection algorithm and presents a numerical example to illustrate the method. The locations of the detected gaps in the observation data are used along with the locations of convective weather cells in the forecast data to determine the probability of the existence of gaps in the neighborhood of these cells. Finally, the paper is concluded in Section V.
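
The core co-occurrence computation — the probability that a forecast severe-weather cell has an observed severe cell somewhere in its neighborhood — can be sketched on boolean grids. An illustrative reconstruction, not the paper's code; the neighborhood is a square box of half-width `radius`:

```python
def cooccurrence_probability(forecast, observed, radius=1):
    # fraction of forecast severe cells with an observed severe cell
    # anywhere in the (2*radius+1)-cell square neighborhood around them
    rows, cols = len(forecast), len(forecast[0])
    hits = total = 0
    for i in range(rows):
        for j in range(cols):
            if not forecast[i][j]:
                continue
            total += 1
            if any(observed[a][b]
                   for a in range(max(0, i - radius), min(rows, i + radius + 1))
                   for b in range(max(0, j - radius), min(cols, j + radius + 1))):
                hits += 1
    return hits / total if total else 0.0
```

A forecast cell at (1,1) and an observed cell at (2,2) co-occur at radius 1 but not at radius 0, which is exactly the dependence on neighborhood size discussed above.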

  11. Forecasting the value-at-risk of Chinese stock market using the HARQ model and extreme value theory

    NASA Astrophysics Data System (ADS)

    Liu, Guangqiang; Wei, Yu; Chen, Yongfei; Yu, Jiang; Hu, Yang

    2018-06-01

    Using intraday data of the CSI300 index, this paper discusses value-at-risk (VaR) forecasting of the Chinese stock market from the perspective of high-frequency volatility models. First, we measure the realized volatility (RV) with 5-minute high-frequency returns of the CSI300 index and then model it with the newly introduced heterogeneous autoregressive quarticity (HARQ) model, which can handle the time-varying coefficients of the HAR model. Second, we forecast the out-of-sample VaR of the CSI300 index by combining the HARQ model and extreme value theory (EVT). Finally, using several popular backtesting methods, we compare the VaR forecasting accuracy of HARQ model with other traditional HAR-type models, such as HAR, HAR-J, CHAR, and SHAR. The empirical results show that the novel HARQ model can beat other HAR-type models in forecasting the VaR of the Chinese stock market at various risk levels.
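
The realized measure underlying the HAR-type models, daily realized variance from 5-minute returns, is straightforward to compute. A minimal sketch (the HARQ regression and the EVT tail fit are beyond this fragment; the function name is ours):

```python
import math

def realized_variance(intraday_prices):
    # daily realized variance: sum of squared intraday (e.g. 5-minute) log returns
    returns = [math.log(p1 / p0)
               for p0, p1 in zip(intraday_prices, intraday_prices[1:])]
    return sum(r * r for r in returns)
```

The daily RV series produced this way is the dependent variable that HAR, HARQ and the other variants then model with daily, weekly and monthly lags.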

  12. Municipal water consumption forecast accuracy

    NASA Astrophysics Data System (ADS)

    Fullerton, Thomas M.; Molina, Angel L.

    2010-06-01

    Municipal water consumption planning is an active area of research because of infrastructure construction and maintenance costs, supply constraints, and water quality assurance. In spite of that, relatively few water forecast accuracy assessments have been completed to date, although some internal documentation may exist as part of the proprietary "grey literature." This study utilizes a data set of previously published municipal consumption forecasts to partially fill that gap in the empirical water economics literature. Previously published municipal water econometric forecasts for three public utilities are examined for predictive accuracy against two random walk benchmarks commonly used in regional analyses. Descriptive metrics used to quantify forecast accuracy include root-mean-square error and Theil inequality statistics. Formal statistical assessments are completed using four-pronged error differential regression F tests. Similar to studies for other metropolitan econometric forecasts in areas with similar demographic and labor market characteristics, model predictive performances for the municipal water aggregates in this effort are mixed for each of the municipalities included in the sample. Given the competitiveness of the benchmarks, analysts should employ care when utilizing econometric forecasts of municipal water consumption for planning purposes, comparing them to recent historical observations and trends to ensure reliability. Comparative results using data from other markets, including regions facing differing labor and demographic conditions, would also be helpful.
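
Benchmarking an econometric forecast against a random walk, as done here, is often summarized with a Theil U-type ratio of RMSEs. A sketch under the common convention U = RMSE(model) / RMSE(no-change benchmark), with function names of our choosing:

```python
import math

def rmse(actual, forecast):
    # root-mean-square error of a forecast series
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

def theil_u(actual, model_forecast, random_walk_forecast):
    # U < 1: the model beats the no-change (random walk) benchmark
    return rmse(actual, model_forecast) / rmse(actual, random_walk_forecast)
```

Here the random-walk forecast for each period is simply the previous period's observed value, which is why a competitive benchmark makes U close to (or above) 1 for many regional series.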

  13. Evaluation of the product ratio coherent model in forecasting mortality rates and life expectancy at births by States

    NASA Astrophysics Data System (ADS)

    Shair, Syazreen Niza; Yusof, Aida Yuzi; Asmuni, Nurin Haniah

    2017-05-01

    Coherent mortality forecasting models have recently received increasing attention, particularly in their application to sub-populations. The advantage of coherent models over independent models is their ability to forecast non-divergent mortality for two or more sub-populations. One such coherent model, the product-ratio model, was recently developed by [1] as an extension of the functional independent model of [2]. The product-ratio model has been applied in a developed country, Australia [1], and has been extended to a developing nation, Malaysia [3]. While [3] accounted for coherency of mortality rates between genders and ethnic groups, coherency between states in Malaysia has never been explored. This paper forecasts the mortality rates of Malaysian sub-populations by state using the product-ratio coherent model and its independent version, the functional independent model. The forecast accuracies of the two models are evaluated using out-of-sample error measurements: the mean absolute forecast error (MAFE) for age-specific death rates and the mean forecast error (MFE) for life expectancy at birth. We employ Malaysian mortality time series data from 1991 to 2014, segregated by age, gender and state.

  14. Calibration and combination of monthly near-surface temperature and precipitation predictions over Europe

    NASA Astrophysics Data System (ADS)

    Rodrigues, Luis R. L.; Doblas-Reyes, Francisco J.; Coelho, Caio A. S.

    2018-02-01

    A Bayesian method known as Forecast Assimilation (FA) was used to calibrate and combine monthly near-surface temperature and precipitation outputs from seasonal dynamical forecast systems. The simple multimodel (SMM), a method that combines predictions with equal weights, was used as a benchmark. This research focuses on Europe and adjacent regions for predictions initialized in May and November, covering the boreal summer and winter months. The forecast quality of the FA and SMM, as well as of the single seasonal dynamical forecast systems, was assessed using deterministic and probabilistic measures. A non-parametric bootstrap method was used to account for the sampling uncertainty of the forecast quality measures. We show that the FA performs as well as or better than the SMM in regions where the dynamical forecast systems were able to represent the main modes of climate covariability. This is illustrated with near-surface temperature over the North Atlantic, the Mediterranean Sea and the Middle East in the summer months, associated with the well-predicted first mode of climate covariability. However, the main modes of climate covariability are not well represented in most situations discussed in this study, as the seasonal dynamical forecast systems have limited skill when predicting the European climate. In these situations, the SMM performs better more often.
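    The non-parametric bootstrap used to quantify sampling uncertainty of skill differences can be sketched as follows. The paired forecast errors are invented, and mean absolute error stands in for the paper's deterministic and probabilistic measures; an interval excluding zero would indicate a robust skill difference between the two systems.

```python
import random

def bootstrap_skill_diff(errors_a, errors_b, n_boot=2000, seed=1):
    """Non-parametric bootstrap of the mean-absolute-error difference
    between two forecast systems, returning a 95% resampling interval."""
    rng = random.Random(seed)
    n = len(errors_a)
    diffs = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]   # resample with replacement
        mae_a = sum(abs(errors_a[i]) for i in idx) / n
        mae_b = sum(abs(errors_b[i]) for i in idx) / n
        diffs.append(mae_a - mae_b)
    diffs.sort()
    return diffs[int(0.025 * n_boot)], diffs[int(0.975 * n_boot)]

# Hypothetical paired forecast errors (system A vs system B)
a_err = [0.3, -0.5, 0.2, 0.4, -0.1, 0.6, -0.2, 0.3, 0.1, -0.4]
b_err = [0.5, -0.6, 0.4, 0.5, -0.3, 0.7, -0.4, 0.2, 0.3, -0.5]
lo, hi = bootstrap_skill_diff(a_err, b_err)
```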

  15. Design of Field Experiments for Adaptive Sampling of the Ocean with Autonomous Vehicles

    NASA Astrophysics Data System (ADS)

    Zheng, H.; Ooi, B. H.; Cho, W.; Dao, M. H.; Tkalich, P.; Patrikalakis, N. M.

    2010-05-01

    Due to the highly non-linear and dynamical nature of oceanic phenomena, the predictive capability of various ocean models depends on the availability of operational data. A practical method to improve the accuracy of the ocean forecast is to use a data assimilation methodology to combine in-situ measured and remotely acquired data with numerical forecast models of the physical environment. Autonomous surface and underwater vehicles with various sensors are economical and efficient tools for exploring and sampling the ocean for data assimilation; however, such vehicles are energy-limited, so effective resource allocation for adaptive sampling is required to optimize the efficiency of exploration. In this paper, we use physical oceanography forecasts of the coastal zone of Singapore to design a set of field experiments that acquire useful data for model calibration and data assimilation. The design process relied on oceanographic forecasts of current speed, its gradient, and vorticity in a region of interest for which field-experiment permits could be obtained, and for time intervals corresponding to strong tidal currents. Based on these maps, resources available to our experimental team, including an Autonomous Surface Craft (ASC), were allocated so as to capture the oceanic features that result from jets and vortices behind bluff bodies (e.g., islands) in the tidal current. Results are summarized from this resource allocation process and the field experiments conducted in January 2009.

  16. The promise of air cargo-system aspects and vehicle design

    NASA Technical Reports Server (NTRS)

    Whitehead, A. H., Jr.

    1977-01-01

    A review of the current operation of the air cargo system is presented and the prospects for the future are discussed. Attention is given to air cargo demand forecasts, the economics of air cargo transport, the development of an integrated air cargo system, and the evolution of airfreighter design. Particular emphasis is placed on the span-distributed load concept, examining the Boeing, Douglas, and Lockheed spanloaders.

  17. WRF-Chem Model Simulations of Arizona Dust Storms

    NASA Astrophysics Data System (ADS)

    Mohebbi, A.; Chang, H. I.; Hondula, D.

    2017-12-01

    The online Weather Research and Forecasting model with coupled chemistry module (WRF-Chem) is applied to simulate the transport, deposition and emission of dust aerosols in an intense dust outbreak event that took place on July 5th, 2011 over Arizona. The Goddard Chemistry Aerosol Radiation and Transport (GOCART), Air Force Weather Agency (AFWA), and University of Cologne (UoC) parameterization schemes for dust emission were evaluated. The model was found to simulate well the synoptic meteorological conditions widely documented in previous studies. The chemistry module's performance in reproducing the atmospheric desert dust load was evaluated using the horizontal field of Aerosol Optical Depth (AOD) from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Terra and Aqua satellites, employing the standard Dark Target (DT) and Deep Blue (DB) algorithms, and from the ground-based Aerosol Robotic Network (AERONET). To assess the temporal variability of the dust storm, particulate matter mass concentration data (PM10 and PM2.5) from Arizona Department of Environmental Quality (AZDEQ) ground-based air quality stations were used. The promising performance of WRF-Chem indicates that the model is capable of simulating the right timing and loading of a dust event in the planetary boundary layer (PBL), which can be used to forecast approaching severe dust events and to communicate an effective early warning.

  18. LIDAR detection of forest fire smoke above Sofia

    NASA Astrophysics Data System (ADS)

    Grigorov, Ivan; Deleva, Atanaska; Stoyanov, Dimitar; Kolev, Nikolay; Kolarov, Georgi

    2015-01-01

    The distribution of aerosol load in the atmosphere due to two forest fires near Sofia (the capital of Bulgaria) was studied using two aerosol lidars operating at 510.6 nm and 1064 nm. Experimental data are presented as 2D heatmaps of the evolution of attenuated backscatter coefficient profiles and as mean aerosol backscatter coefficient profiles calculated for each lidar observation. The backscatter-related Angstrom exponent was used as a criterion for estimating the particle size of detected smoke layers. The minimal values calculated at the altitudes of the observed aerosol layer corresponded to a predominant fraction of coarse aerosol. Dust-transport forecast maps and backward trajectory calculations were employed to draw conclusions about the aerosol's origin; they confirmed the local transport of smoke aerosol over the city and the lidar station. DREAM forecast maps predicted neither cloud cover nor Saharan dust load in the air above Sofia on the days of measurements. The results of lidar observations are discussed in conjunction with the meteorological situation, aiming to better explain the observed aerosol stratification. Data from regular radiosonde soundings of the atmosphere showed a characteristic behavior, with small differences between the air temperature and dew-point temperature profiles at the altitude of the aerosol smoke layer, so the resulting stratification revealed the existence of atmospheric layers with aerosol-trapping properties.

  19. Survey of spatial data needs and land use forecasting methods in the electric utility industry

    NASA Technical Reports Server (NTRS)

    1981-01-01

    A representative sample of the electric utility industry in the United States was surveyed to determine industry need for spatial data (specifically LANDSAT and other remotely sensed data) and the methods used by the industry to forecast land use changes and future energy demand. Information was acquired through interviews, written questionnaires, and reports (both published and internal).

  20. Forecasting intentional wildfires using temporal and spatiotemporal autocorrelations

    Treesearch

    Jeffrey P. Prestemon; María L. Chas-Amil; Julia M. Touza; Scott L. Goodrick

    2012-01-01

    We report daily time series models containing both temporal and spatiotemporal lags, which are applied to forecasting intentional wildfires in Galicia, Spain. Models are estimated independently for each of the 19 forest districts in Galicia using a 1999–2003 training dataset and evaluated out-of-sample with a 2004–06 dataset. Poisson autoregressive models of order P –...

  1. Forecasting and Evaluation of Gas Pipelines Geometric Forms Breach Hazard

    NASA Astrophysics Data System (ADS)

    Voronin, K. S.

    2016-10-01

    During operation, main gas pipelines are subject to permanent pressure drops, which lead to lengthening and, as a result, to instability of their position in space. In dynamic systems with feedback, phenomena preceding emergencies should be observable. The article discusses forced vibrations of the pipeline's cylindrical surface under dynamic loads caused by pressure surges, and the process of deformation of its geometric shape. The frequency of vibrations arising in the pipeline at the stage preceding bending is determined. Identifying this frequency can form the basis of a method for monitoring the technical condition of the gas pipeline, while forecasting possible emergencies allows timely planning and execution of reconstruction work on pipeline sections that may deviate from the design position.

  2. Developing International Guidelines on Volcanic Hazard Assessments for Nuclear Facilities

    NASA Astrophysics Data System (ADS)

    Connor, Charles

    2014-05-01

    Worldwide, tremendous progress has been made in recent decades in forecasting volcanic events, such as episodes of volcanic unrest, eruptions, and the potential impacts of eruptions. Generally these forecasts are divided into two categories. Short-term forecasts are prepared in response to unrest at volcanoes, rely on geophysical monitoring and related observations, and have the goal of forecasting events on timescales of hours to weeks to provide time for evacuation of people, shutdown of facilities, and implementation of related safety measures. Long-term forecasts are prepared to better understand the potential impacts of volcanism in the future and to plan for potential volcanic activity. Long-term forecasts are particularly useful to better understand and communicate the potential consequences of volcanic events for populated areas around volcanoes and for siting critical infrastructure, such as nuclear facilities. Recent work by an international team, through the auspices of the International Atomic Energy Agency, has focused on developing guidelines for long-term volcanic hazard assessments. These guidelines have now been implemented for hazard assessment for nuclear facilities in nations including Indonesia, the Philippines, Armenia, Chile, and the United States. On any time scale, all volcanic hazard assessments rely on a geologically reasonable conceptual model of volcanism. Such conceptual models are usually built upon years or decades of geological studies of specific volcanic systems, analogous systems, and development of a process-level understanding of volcanic activity. Conceptual models are used to bound potential rates of volcanic activity, potential magnitudes of eruptions, and to understand temporal and spatial trends in volcanic activity. It is these conceptual models that provide essential justification for assumptions made in statistical model development and the application of numerical models to generate quantitative forecasts.
It is a tremendous challenge in quantitative volcanic hazard assessments to encompass alternative conceptual models, and to create models that are robust to evolving understanding of specific volcanic systems by the scientific community. A central question in volcanic hazards forecasts is quantifying rates of volcanic activity. Especially for long-dormant volcanic systems, data from the geologic record may be sparse, individual events may be missing or unrecognized in the geologic record, patterns of activity may be episodic or otherwise nonstationary. This leads to uncertainty in forecasting long-term rates of activity. Hazard assessments strive to quantify such uncertainty, for example by comparing observed rates of activity with alternative parametric and nonparametric models. Numerical models are presented that characterize the spatial distribution of potential volcanic events. These spatial density models serve as the basis for application of numerical models of specific phenomena such as development of lava flow, tephra fallout, and a host of other volcanic phenomena. Monte Carlo techniques (random sampling, stratified sampling, importance sampling) are methods used to sample vent location and other key eruption parameters, such as eruption volume, magma rheology, and eruption column height for probabilistic models. The development of coupled scenarios (e.g., the probability of tephra accumulation on a slope resulting in subsequent debris flows) is also assessed through these methods, usually with the aid of event trees. The primary products of long-term forecasts are a statistical model of the conditional probability of the potential effects of volcanism, should an eruption occur, and the probability of such activity occurring. 
It is emphasized that hazard forecasting is an iterative process, and broad consideration must be given to alternative conceptual models of volcanism, weighting of volcanological data in the analyses, and alternative statistical and numerical models. This structure is amenable to expert elicitation in order to weight alternative models and to explore alternative scenarios.
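    The Monte Carlo and event-tree machinery described above can be sketched in a few lines. All probabilities and rates below are invented for illustration and do not come from any real hazard assessment; occurrence is approximated by the one-event Poisson probability over a single year, followed by a two-branch event tree (tephra accumulation yes/no).

```python
import math
import random

def sample_scenarios(n, annual_rate, p_tephra_given_eruption, seed=0):
    """Toy Monte Carlo estimate of long-term hazard probabilities:
    sample eruption occurrence in a one-year window from the long-term
    rate, then follow a conditional event-tree branch for tephra loading."""
    rng = random.Random(seed)
    eruptions = 0
    tephra_events = 0
    for _ in range(n):
        # Poisson occurrence approximated by its one-or-more-event probability
        if rng.random() < 1.0 - math.exp(-annual_rate):
            eruptions += 1
            # Event-tree branch: conditional probability of tephra accumulation
            if rng.random() < p_tephra_given_eruption:
                tephra_events += 1
    return eruptions / n, tephra_events / n

# Hypothetical long-dormant system: 1 eruption per 1000 years on average
p_erupt, p_tephra = sample_scenarios(100_000, annual_rate=0.001,
                                     p_tephra_given_eruption=0.3)
```

Stratified or importance sampling, as mentioned in the abstract, would replace the plain `rng.random()` draws to spend more samples on the rare branches.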

  3. Trends in the predictive performance of raw ensemble weather forecasts

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Scheuerer, Michael; Pappenberger, Florian; Bogner, Konrad; Haiden, Thomas

    2015-04-01

    Over the last two decades the paradigm in weather forecasting has shifted from being deterministic to probabilistic. Accordingly, numerical weather prediction (NWP) models have been run increasingly as ensemble forecasting systems. The goal of such ensemble forecasts is to approximate the forecast probability distribution by a finite sample of scenarios. Global ensemble forecast systems, like the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble, are prone to probabilistic biases, and are therefore not reliable. They particularly tend to be underdispersive for surface weather parameters. Hence, statistical post-processing is required in order to obtain reliable and sharp forecasts. In this study we apply statistical post-processing to ensemble forecasts of near-surface temperature, 24-hour precipitation totals, and near-surface wind speed from the global ECMWF model. Our main objective is to evaluate the evolution of the difference in skill between the raw ensemble and the post-processed forecasts. The ECMWF ensemble is under continuous development, and hence its forecast skill improves over time. Parts of these improvements may be due to a reduction of probabilistic bias. Thus, we first hypothesize that the gain by post-processing decreases over time. Based on ECMWF forecasts from January 2002 to March 2014 and corresponding observations from globally distributed stations we generate post-processed forecasts by ensemble model output statistics (EMOS) for each station and variable. Parameter estimates are obtained by minimizing the Continuous Ranked Probability Score (CRPS) over rolling training periods that consist of the n days preceding the initialization dates. Given the higher average skill in terms of CRPS of the post-processed forecasts for all three variables, we analyze the evolution of the difference in skill between raw ensemble and EMOS forecasts. 
The fact that the gap in skill remains almost constant over time, especially for near-surface wind speed, suggests that improvements to the atmospheric model have an effect quite different from what calibration by statistical post-processing is doing. That is, they are increasing potential skill. Thus this study indicates that (a) further model development is important even if one is just interested in point forecasts, and (b) statistical post-processing is important because it will keep adding skill in the foreseeable future.
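    A minimal EMOS sketch, under simplifying assumptions: the predictive distribution is normal with mean affine in the ensemble mean and a constant spread, and the parameters are fitted by minimizing the average closed-form CRPS with a crude coordinate search (a stand-in for the optimizers used in practice, and a simplification of spread models that also regress on ensemble variance). The data are synthetic, with a deliberate 2-degree warm bias.

```python
import math

def crps_normal(mu, sigma, y):
    """Closed-form CRPS for a normal predictive distribution."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return sigma * (z * (2.0 * cdf - 1.0) + 2.0 * pdf - 1.0 / math.sqrt(math.pi))

def fit_emos(ens_means, obs, sweeps=200):
    """Fit mu = a + b * (ensemble mean), spread = exp(g), by minimizing
    average CRPS with a simple coordinate search with step halving."""
    params = [0.0, 1.0, 0.0]                     # a, b, g
    step = 0.5
    def cost(p):
        a, b, g = p
        s = math.exp(g)
        return sum(crps_normal(a + b * m, s, y)
                   for m, y in zip(ens_means, obs)) / len(obs)
    best = cost(params)
    for _ in range(sweeps):
        improved = False
        for i in range(3):
            for d in (step, -step):
                trial = list(params)
                trial[i] += d
                c = cost(trial)
                if c < best:
                    best, params, improved = c, trial, True
        if not improved:
            step *= 0.5
    return params, best

# Synthetic verification pairs: ensemble mean is biased warm by ~2 degrees
ens_mean = [12.0, 15.0, 9.0, 20.0, 17.0, 11.0, 14.0, 18.0]
observed = [10.1, 12.9, 7.2, 18.0, 15.1, 8.8, 12.0, 16.2]
params, avg_crps = fit_emos(ens_mean, observed)
```

The fitted intercept absorbs the warm bias, which is exactly the kind of probabilistic-bias correction whose benefit the study tracks over time.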

  4. Does the OVX matter for volatility forecasting? Evidence from the crude oil market

    NASA Astrophysics Data System (ADS)

    Lv, Wendai

    2018-02-01

    In this paper, I investigate whether the OVX and its truncated parts above a certain threshold can significantly help in forecasting oil futures price volatility, based on the Heterogeneous Autoregressive model of Realized Volatility (HAR-RV). In-sample estimation results show that the OVX has a significantly positive impact on futures volatility, and that the impact of large OVX values on future volatility is slightly stronger than that of small ones. Moreover, the HARQ-RV model outperforms the HAR-RV in predicting oil futures volatility. More importantly, the decomposed OVX is more powerful in forecasting oil futures price volatility than the OVX itself.
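    The HAR-RV baseline referenced above regresses next-day realized volatility on daily, weekly-average, and monthly-average RV. The sketch below fits it by ordinary least squares on a simulated persistent series; extending the design matrix with an OVX column (no such data is available here) would give the augmented models the paper compares.

```python
import random

def har_design(rv, week=5, month=22):
    """Build HAR-RV regressors: daily RV, weekly-average RV, monthly-average RV,
    each predicting the next day's RV."""
    X, y = [], []
    for t in range(month - 1, len(rv) - 1):
        d = rv[t]
        w = sum(rv[t - week + 1:t + 1]) / week
        m = sum(rv[t - month + 1:t + 1]) / month
        X.append([1.0, d, w, m])
        y.append(rv[t + 1])
    return X, y

def ols(X, y):
    """Ordinary least squares via normal equations and Gaussian elimination."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for i in range(k):                       # forward elimination with pivoting
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for c in range(i, k):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta

# Simulate a persistent (AR-like) RV series and fit the HAR-RV baseline
rng = random.Random(0)
rv = [1.0]
for _ in range(600):
    rv.append(max(0.05, 0.1 + 0.8 * rv[-1] + rng.gauss(0.0, 0.05)))
X, y = har_design(rv)
beta = ols(X, y)
```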

  5. Medium- and long-term electric power demand forecasting based on the big data of smart city

    NASA Astrophysics Data System (ADS)

    Wei, Zhanmeng; Li, Xiyuan; Li, Xizhong; Hu, Qinghe; Zhang, Haiyang; Cui, Pengjie

    2017-08-01

    Based on the smart city, this paper proposes a new electric power demand forecasting model that integrates external data, such as meteorological, geographic, population, enterprise and economic information, into a big database, and uses an improved algorithm to analyse electric power demand and provide decision support for decision makers. Data mining technology is used to synthesize these kinds of information, and the information of electric power customers is analysed. Forecasts are made based on the trend of electricity demand, and a smart city in north-eastern China is taken as a sample.

  6. Towards Optimal Operation of the Reservoir System in Upper Yellow River: Incorporating Long- and Short-term Operations and Using Rolling Updated Hydrologic Forecast Information

    NASA Astrophysics Data System (ADS)

    Si, Y.; Li, X.; Li, T.; Huang, Y.; Yin, D.

    2016-12-01

    The cascade reservoirs in the Upper Yellow River (UYR), one of the largest hydropower bases in China, play a vital role in peak load and frequency regulation for the Northwest China Power Grid. Joint operation of this system has been proposed for years but has not come into effect due to management difficulties and inflow uncertainties; thus there is still considerable room to improve hydropower production. This study presents a decision support framework incorporating long- and short-term operation of the reservoir system. For long-term operation, we maximize hydropower production of the reservoir system using historical hydrological data of multiple years, and derive operating rule curves for storage reservoirs. For short-term operation, we develop a program consisting of three modules, namely a hydrologic forecast module, a reservoir operation module and a coordination module. The coordination module is responsible for calling the hydrologic forecast module to acquire predicted inflow within a short-term horizon, and transferring the information to the reservoir operation module to generate optimal release decisions. With the hydrologic forecast information updated, the rolling short-term optimization is iterated until the end of the operation period, where the long-term operating curves serve as the ending storage target. As an application, the Digital Yellow River Integrated Model (referred to as "DYRIM", which is specially designed for runoff-sediment simulation in the Yellow River basin by Tsinghua University) is used in the hydrologic forecast module, and successive linear programming (SLP) in the reservoir operation module. The application in the reservoir system of the UYR demonstrates that the framework can effectively support real-time decision making, and ensure both computational accuracy and speed.
Furthermore, it is worth noting that the general framework can be extended to any other reservoir system, with any hydrological model (or combination of models) for forecasting and any solver for optimizing the operation of the reservoir system.
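    The coordination loop described above can be sketched as a rolling-horizon skeleton: at each step the latest short-term inflow forecast is pulled, releases are optimized over the horizon, only the first decision is applied, and the window rolls forward. The forecast and optimizer below are toy placeholders for the hydrologic forecast (DYRIM) and SLP modules, and all numbers are invented.

```python
def rolling_horizon_operation(horizon, total_days, forecast_fn, optimize_fn,
                              initial_storage):
    """Coordination-loop skeleton: forecast, optimize over the horizon,
    apply the first release decision only, update storage, and roll on."""
    storage = initial_storage
    releases = []
    for day in range(total_days):
        inflows = forecast_fn(day, horizon)        # rolling updated forecast
        plan = optimize_fn(storage, inflows)       # release plan over horizon
        release = plan[0]                          # apply first step only
        storage = storage + inflows[0] - release   # simple mass balance
        releases.append(release)
    return releases, storage

# Toy modules: constant inflow forecast, release steering storage to a target
def toy_forecast(day, horizon):
    return [100.0] * horizon

def toy_optimize(storage, inflows, target=5000.0):
    # pass inflow through plus a fraction of the storage deviation from target
    return [max(0.0, inflows[0] + 0.2 * (storage - target))] * len(inflows)

releases, final_storage = rolling_horizon_operation(
    7, 30, toy_forecast, toy_optimize, initial_storage=6000.0)
```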

  7. Heterogeneity: The key to failure forecasting

    PubMed Central

    Vasseur, Jérémie; Wadsworth, Fabian B.; Lavallée, Yan; Bell, Andrew F.; Main, Ian G.; Dingwell, Donald B.

    2015-01-01

    Elastic waves are generated when brittle materials are subjected to increasing strain. Their number and energy increase non-linearly, ending in a system-sized catastrophic failure event. Accelerating rates of geophysical signals (e.g., seismicity and deformation) preceding large-scale dynamic failure can serve as proxies for damage accumulation in the Failure Forecast Method (FFM). Here we test the hypothesis that the style and mechanisms of deformation, and the accuracy of the FFM, are both tightly controlled by the degree of microstructural heterogeneity of the material under stress. We generate a suite of synthetic samples with variable heterogeneity, controlled by the gas volume fraction. We experimentally demonstrate that the accuracy of failure prediction increases drastically with the degree of material heterogeneity. These results have significant implications in a broad range of material-based disciplines for which failure forecasting is of central importance. In particular, the FFM has been used with only variable success to forecast failure scenarios both in the field (volcanic eruptions and landslides) and in the laboratory (rock and magma failure). Our results show that this variability may be explained, and the reliability and accuracy of forecasts significantly improved, by accounting for material heterogeneity as a first-order control on forecasting power. PMID:26307196
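    The FFM itself can be sketched compactly: for an accelerating precursory rate following an inverse power law with exponent 2 (the classic case), the inverse rate decays linearly in time, and the forecast failure time is where its linear fit crosses zero. The event-rate data below are synthetic and noise-free.

```python
def ffm_failure_time(times, rates):
    """Failure Forecast Method sketch: least-squares line through the
    inverse precursor rate; failure is forecast where it reaches zero."""
    inv = [1.0 / r for r in rates]
    n = len(times)
    tbar = sum(times) / n
    ibar = sum(inv) / n
    slope = (sum((t - tbar) * (i - ibar) for t, i in zip(times, inv))
             / sum((t - tbar) ** 2 for t in times))
    intercept = ibar - slope * tbar
    return -intercept / slope           # time at which inverse rate -> 0

# Synthetic accelerating event rate diverging at t_f = 10: rate = 1/(10 - t)
times = [0.0, 2.0, 4.0, 6.0, 8.0]
rates = [1.0 / (10.0 - t) for t in times]
t_f = ffm_failure_time(times, rates)
```

With noisy data from heterogeneous samples the inverse-rate trend is cleaner, which is the paper's point about heterogeneity improving forecast accuracy.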

  8. Heterogeneity: The key to failure forecasting.

    PubMed

    Vasseur, Jérémie; Wadsworth, Fabian B; Lavallée, Yan; Bell, Andrew F; Main, Ian G; Dingwell, Donald B

    2015-08-26

    Elastic waves are generated when brittle materials are subjected to increasing strain. Their number and energy increase non-linearly, ending in a system-sized catastrophic failure event. Accelerating rates of geophysical signals (e.g., seismicity and deformation) preceding large-scale dynamic failure can serve as proxies for damage accumulation in the Failure Forecast Method (FFM). Here we test the hypothesis that the style and mechanisms of deformation, and the accuracy of the FFM, are both tightly controlled by the degree of microstructural heterogeneity of the material under stress. We generate a suite of synthetic samples with variable heterogeneity, controlled by the gas volume fraction. We experimentally demonstrate that the accuracy of failure prediction increases drastically with the degree of material heterogeneity. These results have significant implications in a broad range of material-based disciplines for which failure forecasting is of central importance. In particular, the FFM has been used with only variable success to forecast failure scenarios both in the field (volcanic eruptions and landslides) and in the laboratory (rock and magma failure). Our results show that this variability may be explained, and the reliability and accuracy of forecasts significantly improved, by accounting for material heterogeneity as a first-order control on forecasting power.

  9. Heterogeneity: The key to failure forecasting

    NASA Astrophysics Data System (ADS)

    Vasseur, Jérémie; Wadsworth, Fabian B.; Lavallée, Yan; Bell, Andrew F.; Main, Ian G.; Dingwell, Donald B.

    2015-08-01

    Elastic waves are generated when brittle materials are subjected to increasing strain. Their number and energy increase non-linearly, ending in a system-sized catastrophic failure event. Accelerating rates of geophysical signals (e.g., seismicity and deformation) preceding large-scale dynamic failure can serve as proxies for damage accumulation in the Failure Forecast Method (FFM). Here we test the hypothesis that the style and mechanisms of deformation, and the accuracy of the FFM, are both tightly controlled by the degree of microstructural heterogeneity of the material under stress. We generate a suite of synthetic samples with variable heterogeneity, controlled by the gas volume fraction. We experimentally demonstrate that the accuracy of failure prediction increases drastically with the degree of material heterogeneity. These results have significant implications in a broad range of material-based disciplines for which failure forecasting is of central importance. In particular, the FFM has been used with only variable success to forecast failure scenarios both in the field (volcanic eruptions and landslides) and in the laboratory (rock and magma failure). Our results show that this variability may be explained, and the reliability and accuracy of forecasts significantly improved, by accounting for material heterogeneity as a first-order control on forecasting power.

  10. On the effect of model parameters on forecast objects

    NASA Astrophysics Data System (ADS)

    Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott

    2018-04-01

    Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. The field for some quantities generally consists of spatially coherent and disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.
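    Ingredient (1), Latin hypercube sampling, can be sketched as follows: each parameter's range is divided into n equal strata, one point is drawn per stratum, and the strata are shuffled independently per parameter so every one-dimensional projection is evenly covered. The parameter bounds below are invented for illustration.

```python
import random

def latin_hypercube(n_samples, bounds, seed=42):
    """Latin hypercube sample over box-bounded parameters: one point per
    stratum in every dimension, stratum order shuffled per parameter."""
    rng = random.Random(seed)
    columns = []
    for lo, hi in bounds:
        strata = list(range(n_samples))
        rng.shuffle(strata)
        width = (hi - lo) / n_samples
        # one uniform draw inside each (shuffled) stratum
        columns.append([lo + (s + rng.random()) * width for s in strata])
    return [[col[i] for col in columns] for i in range(n_samples)]

# Ten design points over two hypothetical model parameters
design = latin_hypercube(10, [(0.0, 1.0), (250.0, 500.0)])
```

Each of the ten model runs in this design would then be fed to the clustering and regression steps (ingredients 2 and 3).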

  11. Comparison of discrete Fourier transform (DFT) and principal component analysis/DFT as forecasting tools for absorbance time series received by UV-visible probes installed in urban sewer systems.

    PubMed

    Plazas-Nossa, Leonardo; Torres, Andrés

    2014-01-01

    The objective of this work is to introduce a forecasting method for UV-Vis spectrometry time series that combines principal component analysis (PCA) and discrete Fourier transform (DFT), and to compare the results obtained with those obtained by using DFT. Three time series for three different study sites were used: (i) Salitre wastewater treatment plant (WWTP) in Bogotá; (ii) Gibraltar pumping station in Bogotá; and (iii) San Fernando WWTP in Itagüí (in the south part of Medellín). Each of these time series had an equal number of samples (1051). In general terms, the results obtained are hardly generalizable, as they seem to be highly dependent on specific water system dynamics; however, some trends can be outlined: (i) for UV range, DFT and PCA/DFT forecasting accuracy were almost the same; (ii) for visible range, the PCA/DFT forecasting procedure proposed gives systematically lower forecasting errors and variability than those obtained with the DFT procedure; and (iii) for short forecasting times the PCA/DFT procedure proposed is more suitable than the DFT procedure, according to processing times obtained.
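    The DFT forecasting idea can be sketched as follows: transform the training window, keep the mean and the strongest harmonics (together with their conjugate partners so the reconstruction stays real), and evaluate the truncated series past the end of the window. This is a naive O(n^2) illustration on a synthetic periodic signal, not the authors' PCA/DFT procedure.

```python
import math

def dft_forecast(series, n_harmonics, horizon):
    """Keep the mean plus the strongest Fourier harmonics of the training
    window and extrapolate the truncated series beyond its end."""
    n = len(series)
    coeffs = []
    for k in range(n):                      # plain O(n^2) DFT, fine for short windows
        re = sum(series[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(series[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        coeffs.append((re, im))
    # rank non-DC frequencies by squared amplitude, strongest first
    order = sorted(range(1, n),
                   key=lambda k: -(coeffs[k][0] ** 2 + coeffs[k][1] ** 2))
    keep = {0}
    for k in order:                         # add conjugate pairs to stay real
        keep.update({k, (n - k) % n})
        if len(keep) >= 1 + 2 * n_harmonics:
            break
    out = []
    for t in range(n, n + horizon):
        v = 0.0
        for k in keep:
            re, im = coeffs[k]
            ang = 2 * math.pi * k * t / n
            v += (re * math.cos(ang) - im * math.sin(ang)) / n
        out.append(v)
    return out

# Periodic toy "absorbance" signal with period 12, sampled over 4 cycles
train = [5.0 + 2.0 * math.sin(2 * math.pi * t / 12.0) for t in range(48)]
pred = dft_forecast(train, n_harmonics=1, horizon=12)
```

In the PCA/DFT variant, the same extrapolation would be applied to a few principal-component scores of the spectra rather than to each wavelength series directly.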

  12. Integrated Wind Power Planning Tool

    NASA Astrophysics Data System (ADS)

    Rosgaard, Martin; Giebel, Gregor; Skov Nielsen, Torben; Hahmann, Andrea; Sørensen, Poul; Madsen, Henrik

    2013-04-01

    This poster presents the current state of the public service obligation (PSO) funded project PSO 10464, titled "Integrated Wind Power Planning Tool". The goal is to integrate a mesoscale numerical weather prediction (NWP) model with purely statistical tools in order to assess wind power fluctuations, with focus on long-term power system planning for future wind farms as well as short-term forecasting for existing wind farms. Currently, wind power fluctuation models are either purely statistical or integrated with NWP models of limited resolution. Using the state-of-the-art mesoscale NWP model, the Weather Research & Forecasting model (WRF), the aim is to quantify the forecast error as a function of the time scale involved. This task constitutes a preparatory study for later implementation of features accounting for NWP forecast errors in the DTU Wind Energy maintained Corwind code, a long-term wind power planning tool. Within the framework of PSO 10464, research related to operational short-term wind power prediction will be carried out, including a comparison of forecast quality at different mesoscale NWP model resolutions and development of a statistical wind power prediction tool taking input from WRF. The short-term prediction part of the project is carried out in collaboration with ENFOR A/S, a Danish company that specialises in forecasting and optimisation for the energy sector. The integrated prediction model will allow for the description of the expected variability in wind power production in the coming hours to days, accounting for its spatio-temporal dependencies and depending on the prevailing weather conditions defined by the WRF output. The output from the integrated short-term prediction tool constitutes scenario forecasts for the coming period, which can then be fed into any type of system model or decision-making problem to be solved.
The high resolution of the WRF results loaded into the integrated prediction model will ensure that a high-accuracy data basis is available for use in the decision-making process of the Danish transmission system operator. The need for high-accuracy predictions will only increase over the next decade as Denmark approaches its goal of 50% wind-power-based electricity by 2025, up from the current 20%.

  13. A versatile data-visualization application for the Norwegian flood forecasting service

    NASA Astrophysics Data System (ADS)

    Kobierska, Florian; Langsholt, Elin G.; Hamududu, Byman H.; Engeland, Kolbjørn

    2017-04-01

    - General motivation A graphical user interface has been developed to visualize multi-model hydrological forecasts at the flood forecasting service of the Norwegian water and energy directorate. It is based on the R 'shiny' package, with which interactive web applications can quickly be prototyped. The app queries multiple data sources, building a comprehensive infographics dashboard for the decision maker. - Main features of the app The visualization application comprises several tabs, each built with different functionality and focus. A map of forecast stations gives rapid insight into the flood situation and serves, concurrently, as a station selector (based on the 'leaflet' package). The map selection is linked to multi-panel forecast plots which can present input, state or runoff parameters. Another tab focuses on past model performance and calibration runs. - Software design choices The application was programmed with a focus on flexibility regarding data sources. The parsing of text-based model results was explicitly separated from the app (in the separate R package 'NVEDATA'), so that it only loads standardized RData binary files. We focused on allowing re-usability in other contexts by structuring the app into specific 'shiny' modules. The code was bundled into an R package, which is available on GitHub. - Documentation efforts A documentation website is under development. For easier collaboration, we chose to host it on the 'GitHub Pages' branch of the repository and build it automatically with a continuous integration service. The aim is to gather all information about the flood forecasting methodology at NVE in one location. This encompasses details on each hydrological model used as well as the documentation of the data-visualization application. - Outlook for further development The ability to select a group of stations by filtering a table (i.e.
past performance, past major flood events, catchment parameters) and exporting it to the forecast tab could be of interest for detailed model analysis. The design choices for this app were motivated by a need for extensibility and modularity and those qualities will be tested and improved as new datasets need integrating into this to​ol.

  14. Impact of sampling strategy on stream load estimates in till landscape of the Midwest

    USGS Publications Warehouse

    Vidon, P.; Hubbard, L.E.; Soyeux, E.

    2009-01-01

    Accurately estimating solute loads in streams during storms is critical to determining total maximum daily loads for regulatory purposes. This study investigates the impact of sampling strategy on solute load estimates in streams in the US Midwest. Three solute types (nitrate, magnesium, and dissolved organic carbon (DOC)) and three sampling strategies are assessed. Regardless of the method, the average error on nitrate loads is higher than for magnesium or DOC loads, and all three methods generally underestimate DOC loads and overestimate magnesium loads. Increasing sampling frequency only slightly improves the accuracy of solute load estimates but generally improves the precision of load calculations. This type of investigation is critical for water management and environmental assessment, so that error on solute load calculations can be taken into account by landscape managers and sampling strategies optimized as a function of monitoring objectives. © 2008 Springer Science+Business Media B.V.
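The core of this comparison can be illustrated with a small numerical sketch: a synthetic storm record is integrated at full resolution to give a reference load, then re-estimated from fixed-interval subsamples. All series and numbers below are illustrative assumptions, not data from the study.

```python
# Sketch: how sampling frequency affects stream solute load estimates.
# Synthetic hourly concentration (mg/L) and discharge (L/s) over 10 days,
# with one storm pulse; all numbers are illustrative, not the study's data.
import math

hours = range(240)
discharge = [100 + 900 * math.exp(-((h - 120) / 12) ** 2) for h in hours]  # storm at h=120
conc = [2.0 + 0.004 * (q - 100) for q in discharge]                        # conc rises with flow

# "True" load: integrate c*Q over every hour (mg/s * 3600 s -> mg)
true_load = sum(c * q * 3600 for c, q in zip(conc, discharge))

def fixed_interval_load(step):
    """Estimate load from samples taken every `step` hours, scaling the
    mean instantaneous flux by the full record duration."""
    sampled = [(conc[h], discharge[h]) for h in range(0, 240, step)]
    mean_flux = sum(c * q for c, q in sampled) / len(sampled)
    return mean_flux * 240 * 3600

for step in (1, 24, 72):
    est = fixed_interval_load(step)
    print(step, round(100 * (est - true_load) / true_load, 1))  # % error
```

Depending on whether a coarse schedule happens to hit or miss the storm peak, the estimate is biased high or low, which is the accuracy/precision trade-off the abstract describes.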

  15. Using regression methods to estimate stream phosphorus loads at the Illinois River, Arkansas

    USGS Publications Warehouse

    Haggard, B.E.; Soerens, T.S.; Green, W.R.; Richards, R.P.

    2003-01-01

    The development of total maximum daily loads (TMDLs) requires evaluating existing constituent loads in streams. Accurate estimates of constituent loads are needed to calibrate watershed and reservoir models for TMDL development. The best approach to estimate constituent loads is high frequency sampling, particularly during storm events, and mass integration of constituents passing a point in a stream. Most often, resources are limited and discrete water quality samples are collected on fixed intervals and sometimes supplemented with directed sampling during storm events. When resources are limited, mass integration is not an accurate means to determine constituent loads and other load estimation techniques such as regression models are used. The objective of this work was to determine a minimum number of water-quality samples needed to provide constituent concentration data adequate to estimate constituent loads for a large stream. Twenty sets of water quality samples with and without supplemental storm samples were randomly selected at various fixed intervals from a database at the Illinois River, northwest Arkansas. The random sets were used to estimate total phosphorus (TP) loads using regression models. The regression-based annual TP loads were compared to the integrated annual TP load estimated using all the data. At a minimum, monthly sampling plus supplemental storm samples (six samples per year) was needed to produce a root mean square error of less than 15%. Water quality samples should be collected at least semi-monthly (every 15 days) in studies shorter than two years if seasonal time factors are to be used in the regression models. Annual TP loads estimated from independently collected discrete water quality samples further demonstrated the utility of using regression models to estimate annual TP loads in this stream system.
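The regression family referenced here can be sketched in its simplest form as a log-log rating curve, ln(load) = a + b·ln(Q), fit by ordinary least squares. Operational load-estimation models typically add seasonal and time terms plus a retransformation-bias correction; those are omitted in this minimal sketch, and all data and coefficients below are synthetic.

```python
# Sketch: log-log rating-curve regression, ln(load) = a + b*ln(Q),
# fit by ordinary least squares on synthetic data.
import math
import random

random.seed(1)
true_a, true_b = -2.0, 1.5   # illustrative "true" rating-curve coefficients

# Synthetic "observed" discharges (m3/s) and TP loads (kg/d) with log-normal scatter
Q = [random.uniform(1.0, 100.0) for _ in range(30)]
tp = [math.exp(true_a + true_b * math.log(q) + random.gauss(0, 0.1)) for q in Q]

# OLS fit in log space
x = [math.log(q) for q in Q]
y = [math.log(v) for v in tp]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
     / sum((xi - xbar) ** 2 for xi in x))
a = ybar - b * xbar

def predict_load(q):
    """Rating-curve load estimate. A retransformation-bias correction
    (e.g. Duan's smearing estimator) would normally be applied as well."""
    return math.exp(a + b * math.log(q))
```

Summing `predict_load` over a daily discharge record gives the regression-based annual load that the study compares against the mass-integrated value.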

  16. High-resolution empirical geomagnetic field model TS07D: Investigating run-on-request and forecasting modes of operation

    NASA Astrophysics Data System (ADS)

    Stephens, G. K.; Sitnov, M. I.; Ukhorskiy, A. Y.; Vandegriff, J. D.; Tsyganenko, N. A.

    2010-12-01

    The dramatic increase in the volume of geomagnetic field data available from many recent missions, including GOES, Polar, Geotail, Cluster, and THEMIS, eventually required a qualitative transition in empirical modeling tools. Classical empirical models, such as T96 and T02, used a few custom-tailored modules to represent major magnetospheric current systems and simple data binning or loading-unloading inputs for their fitting with data and the subsequent applications. They have been replaced by more systematic expansions of the equatorial and field-aligned current contributions as well as by advanced data-mining algorithms searching for events with global activity parameters, such as the Sym-H index, similar to those at the time of interest, as is done in the model TS07D (Tsyganenko and Sitnov, 2007; Sitnov et al., 2008). The necessity to mine and fit data dynamically, with an individual subset of the database being used to reproduce the geomagnetic field pattern at every new moment in time, requires a corresponding transition in the use of the new empirical geomagnetic field models. Their use becomes more similar to the runs-on-request offered by the Community Coordinated Modeling Center for many first-principles MHD and kinetic codes. To provide this mode of operation for the TS07D model, a new web-based modeling tool has been created and tested at JHU/APL (http://geomag_field.jhuapl.edu/model/), and we discuss the first results of its performance testing and validation, including in-sample and out-of-sample modeling of a number of CME- and CIR-driven magnetic storms. We also report on the first tests of the forecasting version of the TS07D model, where the magnetospheric part of the macro-parameters involved in the data-binning process (Sym-H index and its trend parameter) is replaced by solar wind-based analogs obtained using the Burton-McPherron-Russell approach.

  17. Contribution of dust and anthropogenic pollution to aerosol optical depth in South Korea during Spring/Summer 2016

    NASA Astrophysics Data System (ADS)

    Beyersdorf, A. J.; Corr, C.; Hite, J. R.; Jordan, C.; Nenes, A.; Thornhill, K. L., II; Winstead, E.; Anderson, B. E.

    2017-12-01

    Aerosol pollution is a major problem over the Korean peninsula during spring and summer each year. Spring coincides with peak transport of dust and biomass-burning aerosol from East Asia. These sources, coupled with persistently high concentrations of local anthropogenic pollution and urban aerosols transported from upwind regions, create complex, spatially inhomogeneous mixtures of aerosol types, especially during periods of high aerosol loading. In order to improve diagnostic and forecasting capabilities for these high-loading events using remote sensors and models, the NASA Korea-US Air Quality Study (KORUS-AQ) provided detailed evaluation of the vertical, spatial, and temporal variations in pollution during May and June 2016. Aerosol measurements from an instrumented aircraft are used to determine the relative abundance and properties of anthropogenic aerosol and dust in South Korea. Of particular interest are differences in the Seoul Metropolitan Area as a function of location and day. Based on preliminary analysis, aerosols over central Seoul were more absorbing than those measured east of Seoul (Taewha Forest), suggesting primary emissions dominate over Seoul while secondary aerosol production occurs as the aerosol is transported downwind. Dust transport will be determined based on a wing-mounted probe in combination with filter samples. Sub-micron anthropogenic aerosol is characterized more completely, including optical and size measurements, composition, and cloud activity.

  18. Vibro-Acoustic Forecasts for STS (Space Transportation System) Launches at V23, Vandenberg AFB: Results Summary and the Payload Preparation Room

    DTIC Science & Technology

    1985-05-08

    Displacement Time Series Forecasts for Channels 2 Through 8 and (b) Channels 9 Through 16 33 17. Sample PSD Plots for (a) Levels 99 and 119 East Cell Rail...for Sensors at (a) Levels 99 and 119 on the West Cell Rail, (b) Level 69 on the East and West Cell Rail Footings, and (c) Level 99 on the West Cell Rail...17. Sample PSD Plots for (a) Levels 99 and 119 East Cell Rail Locations, (b) Level 69 Sensors on the East and West Cell Rail Footings, and (c) Level

  19. The Global Precipitation Mission

    NASA Technical Reports Server (NTRS)

    Braun, Scott; Kummerow, Christian

    2000-01-01

    The Global Precipitation Mission (GPM), expected to begin around 2006, is a follow-up to the Tropical Rainfall Measuring Mission (TRMM). Unlike TRMM, which primarily samples the tropics, GPM will sample both the tropics and mid-latitudes. The primary, or core, satellite will be a single, enhanced TRMM satellite that can quantify the 3-D spatial distributions of precipitation and its associated latent heat release. The core satellite will be complemented by a constellation of very small and inexpensive satellites with passive microwave instruments that will sample the rainfall with sufficient frequency to be not only of climate interest, but also to have local, short-term impacts by providing global rainfall coverage at approx. 3 h intervals. The data are expected to have substantial impact upon quantitative precipitation estimation/forecasting and data assimilation into global and mesoscale numerical models. Based upon previous studies of rainfall data assimilation, GPM is expected to lead to significant improvements in forecasts of extratropical and tropical cyclones. For example, GPM rainfall data can provide improved initialization of frontal systems over the Pacific and Atlantic Oceans. The purpose of this talk is to provide information about GPM to the USWRP (U.S. Weather Research Program) community and to discuss impacts on quantitative precipitation estimation/forecasting and data assimilation.

  20. Real-Time Analysis of a Sensor's Data for Automated Decision Making in an IoT-Based Smart Home.

    PubMed

    Khan, Nida Saddaf; Ghani, Sayeed; Haider, Sajjad

    2018-05-25

    IoT devices frequently generate large volumes of streaming data, and to take advantage of these data, their temporal patterns must be learned and identified. Streaming data analysis has become popular after being successfully used in many applications, including forecasting electricity load, stock market prices, weather conditions, etc. Artificial Neural Networks (ANNs) have been successfully utilized in learning the patterns embedded in such data and forecasting future values from them. In the present study, one such pattern in a Water Management System (WMS) is modelled and learned. This prediction feeds an automated decision support system that switches OFF a hydraulic suction pump at the appropriate time. Three types of ANN, namely Multi-Input Multi-Output (MIMO), Multi-Input Single-Output (MISO), and Recurrent Neural Network (RNN), have been compared for multi-step-ahead forecasting on a sensor's streaming data. Experiments have shown that RNN has the best performance among the three models and that, based on its prediction, a system can be implemented to make the correct decision with 86% accuracy.
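The practical difference between the MIMO and MISO framings is how the sensor stream is windowed into supervised samples: MIMO maps a lag window to a vector of all H future steps, while MISO maps it to a single next value that is fed back recursively at forecast time. A minimal data-shaping sketch (window sizes are illustrative assumptions, not the paper's settings):

```python
# Sketch: shaping a sensor stream into supervised samples for
# multi-step-ahead forecasting with MIMO vs MISO models.

def mimo_samples(series, lag, horizon):
    """Each sample: `lag` past values -> vector of `horizon` future targets."""
    return [(series[i:i + lag], series[i + lag:i + lag + horizon])
            for i in range(len(series) - lag - horizon + 1)]

def miso_samples(series, lag):
    """Each sample: `lag` past values -> single next value
    (recursed step-by-step at forecast time)."""
    return [(series[i:i + lag], series[i + lag])
            for i in range(len(series) - lag)]

stream = list(range(10))            # stand-in for a sensor reading stream
X_mimo = mimo_samples(stream, lag=3, horizon=2)
X_miso = miso_samples(stream, lag=3)
print(X_mimo[0])   # ([0, 1, 2], [3, 4])
print(X_miso[0])   # ([0, 1, 2], 3)
```

The same windowed samples can then be fed to whichever network architecture is being compared; an RNN instead consumes the lag window as an ordered sequence.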

  1. Optical properties of volcanic ash: improving remote sensing observations.

    NASA Astrophysics Data System (ADS)

    Whelley, Patrick; Colarco, Peter; Aquila, Valentina; Krotkov, Nickolay; Bleacher, Jake; Garry, Brent; Young, Kelsey; Rocha Lima, Adriana; Martins, Vanderlei; Carn, Simon

    2016-04-01

    Explosive volcanic eruptions loft ash into the atmosphere many times each year. Global travel and trade rely on aircraft vulnerable to encounters with airborne ash. Volcanic ash advisory centers (VAACs) rely on dispersion forecasts and satellite data to issue timely warnings. To improve ash forecasts, model developers and satellite data providers need realistic information about volcanic ash microphysical and optical properties. In anticipation of future large eruptions, we can study smaller events to improve our remote sensing and modeling skills, so that when the next eruption the size of Pinatubo 1991 or larger occurs, ash can confidently be tracked in a quantitative way. At distances >100 km from their sources, drifting ash plumes, often above meteorological clouds, are not easily detected from conventional remote sensing platforms, let alone characterized quantitatively in terms of properties such as mass density. Quantitative interpretation of these observations depends on a priori knowledge of the spectral optical properties of the ash in UV (>0.3μm) and TIR wavelengths (>10μm). Incorrect assumptions about the optical properties result in large errors in inferred column mass loading and size distribution, which misguide operational ash forecasts. Similarly, simulating ash properties in global climate models also requires some knowledge of optical properties to improve aerosol speciation.

  2. STS-121: Discovery L-1 Countdown Status Briefing

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Bruce Buckingham, NASA Public Affairs, introduces Jeff Spaulding, NASA Test Director; Debbie Hahn, STS-121 Payload Manager; and Kathy Winters, Shuttle Weather Officer. Spaulding gives his opening statement one day prior to the launch of the Space Shuttle Discovery. He discusses the following topics: 1) Launch of the Space Shuttle Discovery; 2) Weather; 3) Loading of onboard reactants; 4) Hold time for liquid hydrogen; 5) Completion of mid-deck stowage; 6) Check-out of onboard and ground network systems; 7) Launch windows; 8) Mission duration; 9) Extravehicular activity (EVA) plans; 10) Space Shuttle landing day; and 11) Scrub turn-around plans. Hahn presents and discusses a short video of the STS-121 payload flow. Kathy Winters gives her weather forecast for launch. She then presents a slide presentation on the following weather conditions for the Space Shuttle Discovery: 1) STS-121 Tanking Forecast; 2) Launch Forecast; 3) SRB Recovery; 4) CONUS Launch; 5) TAL Launch; 6) 24 Hour Delay; 7) CONUS 24 Hour; 8) TAL 24 Hour; 9) 48 Hour Launch; 10) CONUS 48 Hour; and 11) TAL 48 Hour. The briefing ends with a question and answer period with the media.

  3. Assessing skill of a global bimonthly streamflow ensemble prediction system

    NASA Astrophysics Data System (ADS)

    van Dijk, A. I.; Peña-Arancibia, J.; Sheffield, J.; Wood, E. F.

    2011-12-01

    Ideally, a seasonal streamflow forecasting system might be conceived of as a system that ingests skillful climate forecasts from general circulation models and propagates these through thoroughly calibrated hydrological models that are initialised using hydrometric observations. In practice, there are practical problems with each of these aspects. Instead, we analysed whether a comparatively simple hydrological model-based Ensemble Prediction System (EPS) can provide global bimonthly streamflow forecasts with some skill and, if so, under what circumstances the greatest skill may be expected. The system tested produces ensemble forecasts for each of six annual bimonthly periods based on the previous 30 years of global daily gridded 1° resolution climate variables and an initialised global hydrological model. To incorporate some of the skill derived from ocean conditions, a post-EPS analog method was used to sample from the ensemble based on El Niño Southern Oscillation (ENSO), Indian Ocean Dipole (IOD), North Atlantic Oscillation (NAO) and Pacific Decadal Oscillation (PDO) index values observed prior to the forecast. Forecast skill was assessed through a hindcasting experiment for the period 1979-2008. Potential skill was calculated with reference to a model run with the actual forcing for the forecast period (the 'perfect' model) and was compared to actual forecast skill calculated for each of the six forecast times for, on average, 411 Australian and 51 pan-tropical catchments. Significant potential skill in bimonthly forecasts was largely limited to northern regions during the snow melt period, seasonally wet tropical regions at the transition of wet to dry season, and the Indonesian region where rainfall is well correlated with ENSO. The actual skill was approximately 34-50% of the potential skill. We attribute this primarily to limitations in the model structure, parameterisation and global forcing data. Use of better climate forecasts and remote sensing observations of initial catchment conditions should help to increase actual skill in the future. Future work could also address the potential skill gain from using weather and climate forecasts and from a calibrated and/or alternative hydrological model or model ensemble. The approach and data might be useful as a benchmark for joint seasonal forecasting experiments planned under GEWEX.
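The post-EPS analog step described above can be sketched as selecting the historical ensemble members whose climate-index state was closest to the values observed just before the forecast. The index values below are synthetic and purely illustrative:

```python
# Sketch of a post-EPS analog selection: keep the k historical years whose
# climate index (e.g. an ENSO index) was nearest the pre-forecast observation.
# All index values are synthetic, not real ENSO data.

def analog_members(index_by_year, observed_index, k):
    """Return the k historical years whose index was nearest the observed value."""
    return sorted(index_by_year,
                  key=lambda year: abs(index_by_year[year] - observed_index))[:k]

# Hypothetical index anomalies for a few historical years
enso = {1997: 2.3, 1998: -1.4, 1999: -1.6, 2002: 1.1, 2005: 0.3,
        2007: -1.5, 2009: 1.5, 2010: -1.3}

# A pre-forecast observation of 1.2 selects the moderate warm-phase years
print(analog_members(enso, 1.2, 3))   # [2002, 2009, 2005]
```

In a full system the same idea extends to several indices at once (ENSO, IOD, NAO, PDO), e.g. by ranking on a combined distance over the standardized index values.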

  4. Monthly forecasting of agricultural pests in Switzerland

    NASA Astrophysics Data System (ADS)

    Hirschi, M.; Dubrovsky, M.; Spirig, C.; Samietz, J.; Calanca, P.; Weigel, A. P.; Fischer, A. M.; Rotach, M. W.

    2012-04-01

    Given the repercussions of pests and diseases on agricultural production, detailed forecasting tools have been developed to simulate the degree of infestation depending on actual weather conditions. The life cycle of pests is most successfully predicted if the micro-climate of the immediate environment (habitat) of the causative organisms can be simulated. Sub-seasonal pest forecasts therefore require weather information for the relevant habitats and the appropriate time scale. The pest forecasting system SOPRA (www.sopra.info) currently in operation in Switzerland relies on such detailed weather information, using hourly weather observations up to the day the forecast is issued, but only a climatology for the forecasting period. Here, we aim at improving the skill of SOPRA forecasts by transforming the weekly information provided by ECMWF monthly forecasts (MOFCs) into hourly weather series as required for the prediction of upcoming life phases of the codling moth, the major insect pest in apple orchards worldwide. Due to the probabilistic nature of operational monthly forecasts and the limited spatial and temporal resolution, their information needs to be post-processed for use in a pest model. In this study, we developed a statistical downscaling approach for MOFCs that includes the following steps: (i) application of a stochastic weather generator to generate a large pool of daily weather series consistent with the climate at a specific location, (ii) a subsequent re-sampling of weather series from this pool to optimally represent the evolution of the weekly MOFC anomalies, and (iii) a final extension to hourly weather series suitable for the pest forecasting model. Results show a clear improvement in the forecast skill of occurrences of upcoming codling moth life phases when incorporating MOFCs as compared to the operational pest forecasting system. 
This holds both for root mean squared errors and when the continuous ranked probability scores of the probabilistic forecasts are compared with the mean absolute errors of the deterministic system. In addition, applying the climate conserving recalibration (CCR; Weigel et al., 2009) technique successfully corrects the under-confidence in the forecasted occurrences of codling moth life phases. Reference: Weigel, A. P.; Liniger, M. A. & Appenzeller, C. (2009). Seasonal Ensemble Forecasts: Are Recalibrated Single Models Better than Multimodels? Mon. Wea. Rev., 137, 1460-1479.

  5. Forecasting volcanic ash dispersal and coeval resuspension during the April-May 2015 Calbuco eruption

    NASA Astrophysics Data System (ADS)

    Reckziegel, F.; Bustos, E.; Mingari, L.; Báez, W.; Villarosa, G.; Folch, A.; Collini, E.; Viramonte, J.; Romero, J.; Osores, S.

    2016-07-01

    Atmospheric dispersion of volcanic ash from explosive eruptions or from subsequent fallout deposit resuspension causes a range of impacts and disruptions on human activities and ecosystems. The April-May 2015 Calbuco eruption in Chile involved both eruptive and resuspension activity. We overview the chronology, effects, and products resulting from these events, in order to validate an operational forecast strategy for tephra dispersal. The modelling strategy builds on coupling the meteorological Weather Research and Forecasting (WRF/ARW) model with the FALL3D dispersal model for eruptive and resuspension processes. The eruption modelling considers two distinct particle granulometries: a preliminary first-guess distribution used operationally when no field data were yet available, and a refined distribution based on field measurements. Volcanological inputs were inferred from eruption reports and results from an Argentinian-Chilean ash sampling network, which performed in-situ sampling during the eruption. In order to validate the modelling strategy, results were compared with satellite retrievals and ground deposit measurements. Results indicate that the WRF-FALL3D modelling system can provide reasonable forecasts in both eruption and resuspension modes, particularly when the adjusted granulometry is considered. The study also highlights the importance of having dedicated datasets for active volcanoes, furnishing first-guess model inputs during the early stages of an eruption.

  6. Combining Participatory Influenza Surveillance with Modeling and Forecasting: Three Alternative Approaches.

    PubMed

    Brownstein, John S; Chu, Shuyu; Marathe, Achla; Marathe, Madhav V; Nguyen, Andre T; Paolotti, Daniela; Perra, Nicola; Perrotta, Daniela; Santillana, Mauricio; Swarup, Samarth; Tizzoni, Michele; Vespignani, Alessandro; Vullikanti, Anil Kumar S; Wilson, Mandy L; Zhang, Qian

    2017-11-01

    Influenza outbreaks affect millions of people every year, and their surveillance is usually carried out in developed countries through a network of sentinel doctors who report the weekly number of Influenza-like Illness cases observed among the visited patients. Monitoring and forecasting the evolution of these outbreaks supports decision makers in designing effective interventions and allocating resources to mitigate their impact. Our objective is to describe the existing participatory surveillance approaches that have been used for modeling and forecasting of the seasonal influenza epidemic, and how they can help strengthen real-time epidemic science and provide a more rigorous understanding of epidemic conditions. We describe three different participatory surveillance systems, WISDM (Widely Internet Sourced Distributed Monitoring), Influenzanet and Flu Near You (FNY), and show how modeling and simulation can be or has been combined with participatory disease surveillance to: i) measure the non-response bias in a participatory surveillance sample using WISDM; and ii) nowcast and forecast influenza activity in different parts of the world (using Influenzanet and Flu Near You). WISDM-based results measure the participatory and sample bias for three epidemic metrics, i.e. attack rate, peak infection rate, and time-to-peak, and find the participatory bias to be the largest component of the total bias. The Influenzanet platform shows that digital participatory surveillance data combined with a realistic data-driven epidemiological model can provide both short-term and long-term forecasts of epidemic intensities, and the ground truth data lie within the 95 percent confidence intervals for most weeks. The statistical accuracy of the ensemble forecasts increases as the season progresses. The Flu Near You platform shows that participatory surveillance data provide accurate short-term flu activity forecasts and influenza activity predictions. 
The correlation of the HealthMap Flu Trends estimates with the observed CDC ILI rates is 0.99 for 2013-2015. Additional data sources lead to an error reduction of about 40% when compared to the estimates of the model that only incorporates CDC historical information. While the advantages of participatory surveillance, compared to traditional surveillance, include its timeliness, lower costs, and broader reach, it is limited by a lack of control over the characteristics of the population sample. Modeling and simulation can help overcome this limitation as well as provide real-time and long-term forecasting of influenza activity in data-poor parts of the world. ©John S Brownstein, Shuyu Chu, Achla Marathe, Madhav V Marathe, Andre T Nguyen, Daniela Paolotti, Nicola Perra, Daniela Perrotta, Mauricio Santillana, Samarth Swarup, Michele Tizzoni, Alessandro Vespignani, Anil Kumar S Vullikanti, Mandy L Wilson, Qian Zhang. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 01.11.2017.

  7. The Invasive Species Forecasting System (ISFS): An iRODS-Based, Cloud-Enabled Decision Support System for Invasive Species Habitat Suitability Modeling

    NASA Technical Reports Server (NTRS)

    Gill, Roger; Schnase, John L.

    2012-01-01

    The Invasive Species Forecasting System (ISFS) is an online decision support system that allows users to load point occurrence field sample data for a plant species of interest and quickly generate habitat suitability maps for geographic regions of interest, such as a national park, monument, forest, or refuge. Target customers for ISFS are natural resource managers and decision makers who have a need for scientifically valid, model-based predictions of the habitat suitability of plant species of management concern. In a joint project involving NASA and the Maryland Department of Natural Resources, ISFS has been used to model the potential distribution of Wavyleaf Basketgrass in Maryland's Chesapeake Bay Watershed. Maximum entropy techniques are used to generate predictive maps using predictor datasets derived from remotely sensed data and climate simulation outputs. The workflow to run a model is implemented in an iRODS microservice using a custom ISFS file driver that clips and re-projects data to geographic regions of interest, then shells out to perform MaxEnt processing on the input data. When the model completes, all output files and maps from the model run are registered in iRODS and made accessible to the user. The ISFS user interface is a web browser that uses the iRODS PHP client to interact with the ISFS/iRODS server. ISFS is designed to reside in a VMware virtual machine running SLES 11 and iRODS 3.0. The ISFS virtual machine is hosted in a VMware vSphere private cloud infrastructure to deliver the online service.

  8. Impacts of Climate Change on Electricity Consumption in Baden-Wuerttemberg

    NASA Astrophysics Data System (ADS)

    Mimler, S.

    2009-04-01

    Changes in electricity consumption due to changes in mean air temperatures were examined for the German federal state Baden-Wuerttemberg. Unlike in most recent studies on future electricity demand variations due to climate change, other load-influencing factors like the economic, technological and demographic situation were fixed to the state of 2006. This allows isolating the climate change effect on electricity demand. The analysis was realised in two major steps. Firstly, an electricity forecast model based on multiple regressions was estimated for the region of Baden-Wuerttemberg by using historical load and temperature data. The estimation of the forecast model provides information on the temperature sensitivity of electricity demand in the given region. The overall heating and cooling gradients are estimated at -59 and 84 MW/°C, respectively. These results already point to a low temperature sensitivity of demand in the region of Baden-Wuerttemberg, mostly due to a low share of households equipped with electric heating and air conditioning systems. Secondly, near-surface air temperature data of the regional climate model REMO [1] were used to simulate load curves for the control period 1971 to 2000 and for three future scenario periods, 2006 to 2035, 2036 to 2065 and 2066 to 2095. The results show that the overall load decreases throughout all future scenario periods in comparison to the control period. This is due to a higher decrease in heating than increase in cooling load. Nevertheless, the weather-dependent part of the Baden-Wuerttemberg load accounts for only 0.05% of the average load level. Within this weather-dependent part, the heating load decreases are highest in June to September, concentrated in the evening and afternoon. The cooling period broadens from May to September in the control period to April to October by 2095. The highest relative increases occur in October. 
Regarding times of day, the increase in cooling load is concentrated in afternoons, evenings and nights. [1] Jacob, D. (2005a), "REMO A1B Scenario run, UBA project, 0.088 degree resolution, run no. 006211, 1H data", World Data Center for Climate, CERA-DB "REMO_UBA_A1B_1_R006211_1H", http://cera-www.dkrz.de/WDCC/ui/Compact.jsp?acronym=REMO_UBA_A1B_1_R006211_1H; Jacob, D. (2005b), "REMO climate of the 20th century run, UBA project, 0.088 degree resolution, run no. 006210, 1H data", World Data Center for Climate, CERA-DB "REMO_UBA_C20_1_R006210_1H", http://cera-www.dkrz.de/WDCC/ui/Compact.jsp?acronym=REMO_UBA_C20_1_R006210_1H
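Heating and cooling gradients of the kind reported above can be estimated with a degree-day style regression of load against temperature around a base temperature. The sketch below uses synthetic data with an assumed 18 °C base; the coefficients are chosen only to mirror the magnitudes in the abstract, not to reproduce the study.

```python
# Sketch: estimating heating/cooling gradients from load vs temperature,
# with heating- and cooling-degree terms around an assumed 18 C base.
# All data and coefficients are synthetic, not the study's.
import random

random.seed(0)
BASE = 18.0                      # assumed base temperature, deg C
base_load = 6000.0               # assumed temperature-independent load, MW

def hdd(t): return max(BASE - t, 0.0)    # heating degrees
def cdd(t): return max(t - BASE, 0.0)    # cooling degrees

# Synthetic load: +59 MW per heating degree, +84 MW per cooling degree,
# i.e. a heating gradient of -59 MW/C vs temperature below the base.
temps = [random.uniform(-5.0, 35.0) for _ in range(200)]
loads = [base_load + 59.0 * hdd(t) + 84.0 * cdd(t) + random.gauss(0, 20)
         for t in temps]

def slope(xs, ys):
    """Ordinary least squares slope of ys against xs."""
    n = len(xs)
    xb, yb = sum(xs) / n, sum(ys) / n
    return (sum((x - xb) * (y - yb) for x, y in zip(xs, ys))
            / sum((x - xb) ** 2 for x in xs))

# Fit the load-temperature slope separately below and above the base
cold = [(t, l) for t, l in zip(temps, loads) if t < BASE]
warm = [(t, l) for t, l in zip(temps, loads) if t > BASE]
heat_gradient = slope([t for t, _ in cold], [l for _, l in cold])   # ~ -59 MW/C
cool_gradient = slope([t for t, _ in warm], [l for _, l in warm])   # ~ +84 MW/C
```

Driving such a fitted model with climate-model temperature series (as done here with REMO output) then yields the simulated future load curves.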

  9. Predicting the risk of cucurbit downy mildew in the eastern United States using an integrated aerobiological model

    NASA Astrophysics Data System (ADS)

    Neufeld, K. N.; Keinath, A. P.; Gugino, B. K.; McGrath, M. T.; Sikora, E. J.; Miller, S. A.; Ivey, M. L.; Langston, D. B.; Dutta, B.; Keever, T.; Sims, A.; Ojiambo, P. S.

    2017-11-01

    Cucurbit downy mildew caused by the obligate oomycete, Pseudoperonospora cubensis, is considered one of the most economically important diseases of cucurbits worldwide. In the continental United States, the pathogen overwinters in southern Florida and along the coast of the Gulf of Mexico. Outbreaks of the disease in northern states occur annually via long-distance aerial transport of sporangia from infected source fields. An integrated aerobiological modeling system has been developed to predict the risk of disease occurrence and to facilitate timely use of fungicides for disease management. The forecasting system, which combines information on known inoculum sources, long-distance atmospheric spore transport and spore deposition modules, was tested to determine its accuracy in predicting risk of disease outbreak. Rainwater samples at disease monitoring sites in Alabama, Georgia, Louisiana, New York, North Carolina, Ohio, Pennsylvania and South Carolina were collected weekly from planting to the first appearance of symptoms at the field sites during the 2013, 2014, and 2015 growing seasons. A conventional PCR assay with primers specific to P. cubensis was used to detect the presence of sporangia in rainwater samples. Disease forecasts were monitored and recorded for each site after each rain event until initial disease symptoms appeared. The pathogen was detected in 38 of the 187 rainwater samples collected during the study period. The forecasting system correctly predicted the risk of disease outbreak based on the presence of sporangia or appearance of initial disease symptoms with overall accuracy rates of 66 and 75%, respectively. In addition, the probability that the forecasting system correctly classified the presence or absence of disease was ≥ 73%. 
The true skill statistic calculated based on the appearance of disease symptoms in cucurbit field plantings ranged from 0.42 to 0.58, indicating that the disease forecasting system had an acceptable to good performance in predicting the risk of cucurbit downy mildew outbreak in the eastern United States.
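Verification scores of the kind quoted here, overall accuracy and the true skill statistic, come from a 2x2 contingency table of forecast versus observed outbreaks. A minimal sketch with hypothetical counts (not the study's data):

```python
# Sketch: forecast-verification scores from a 2x2 contingency table.
# TSS (Peirce skill score) = hit rate - false alarm rate = a/(a+c) - b/(b+d).
# The counts below are hypothetical, not the study's data.

def verification_scores(hits, false_alarms, misses, correct_negatives):
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    accuracy = (a + d) / (a + b + c + d)       # fraction of correct forecasts
    tss = a / (a + c) - b / (b + d)            # true skill statistic
    return accuracy, tss

acc, tss = verification_scores(hits=30, false_alarms=10,
                               misses=8, correct_negatives=52)
print(round(acc, 2), round(tss, 2))   # 0.82 0.63
```

Unlike raw accuracy, the TSS is insensitive to how common the event is, which is why it is often preferred for judging outbreak forecasts.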

  10. Predicting the risk of cucurbit downy mildew in the eastern United States using an integrated aerobiological model.

    PubMed

    Neufeld, K N; Keinath, A P; Gugino, B K; McGrath, M T; Sikora, E J; Miller, S A; Ivey, M L; Langston, D B; Dutta, B; Keever, T; Sims, A; Ojiambo, P S

    2018-04-01

    Cucurbit downy mildew, caused by the obligate oomycete Pseudoperonospora cubensis, is considered one of the most economically important diseases of cucurbits worldwide. In the continental United States, the pathogen overwinters in southern Florida and along the coast of the Gulf of Mexico. Outbreaks of the disease in northern states occur annually via long-distance aerial transport of sporangia from infected source fields. An integrated aerobiological modeling system has been developed to predict the risk of disease occurrence and to facilitate timely use of fungicides for disease management. The forecasting system, which combines information on known inoculum sources with long-distance atmospheric spore transport and spore deposition modules, was tested to determine its accuracy in predicting the risk of disease outbreak. Rainwater samples at disease monitoring sites in Alabama, Georgia, Louisiana, New York, North Carolina, Ohio, Pennsylvania and South Carolina were collected weekly from planting to the first appearance of symptoms at the field sites during the 2013, 2014, and 2015 growing seasons. A conventional PCR assay with primers specific to P. cubensis was used to detect the presence of sporangia in the rainwater samples. Disease forecasts were monitored and recorded for each site after each rain event until initial disease symptoms appeared. The pathogen was detected in 38 of the 187 rainwater samples collected during the study period. The forecasting system correctly predicted the risk of disease outbreak based on the presence of sporangia or the appearance of initial disease symptoms with overall accuracy rates of 66 and 75%, respectively. In addition, the probability that the forecasting system correctly classified the presence or absence of disease was ≥ 73%.
The true skill statistic calculated based on the appearance of disease symptoms in cucurbit field plantings ranged from 0.42 to 0.58, indicating that the disease forecasting system had an acceptable to good performance in predicting the risk of cucurbit downy mildew outbreak in the eastern United States.
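    The true skill statistic above is simply the hit rate minus the false-alarm rate of a binary forecast, computed from a 2×2 contingency table. A minimal sketch (the contingency counts are hypothetical, not taken from the study):

```python
def true_skill_statistic(hits, misses, false_alarms, correct_negatives):
    """TSS = hit rate - false-alarm rate; ranges from -1 to 1, with 0 for no skill."""
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate - false_alarm_rate

# Hypothetical outbreak-forecast contingency counts, for illustration only
print(round(true_skill_statistic(30, 8, 12, 40), 3))  # → 0.559
```

A TSS in the 0.42 to 0.58 range reported above therefore means the system's hit rate exceeded its false-alarm rate by roughly half a unit.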

  11. An Overview of ANN Application in the Power Industry

    NASA Technical Reports Server (NTRS)

    Niebur, D.

    1995-01-01

    The paper presents a survey of the development of and experience with artificial neural network (ANN) applications for electric power systems, with emphasis on operational systems. The organization and constraints of electric utilities are reviewed, motivations for investigating ANNs are identified, and a current assessment is given based on the experience of 2400 projects using ANNs for load forecasting, alarm processing, fault detection, component fault diagnosis, static and dynamic security analysis, system planning, and operation planning.

  12. Evaluation of precipitation nowcasting techniques for the Alpine region

    NASA Astrophysics Data System (ADS)

    Panziera, L.; Mandapaka, P.; Atencia, A.; Hering, A.; Germann, U.; Gabella, M.; Buzzi, M.

    2010-09-01

    This study presents a large-sample evaluation of different nowcasting systems over the southern Swiss Alps. Radar observations are taken as a reference against which to assess the performance of the following short-term quantitative precipitation forecasting methods:
    - Eulerian persistence: the current radar image is taken as the forecast.
    - Lagrangian persistence: precipitation patterns are advected following the field of storm motion (the MAPLE algorithm is used).
    - NORA: a novel nowcasting system which exploits the presence of orographic forcing; by comparing meteorological predictors estimated in real time with those from a large historical data set, the events with the highest resemblance are picked to produce the forecast.
    - COSMO2: the limited-area numerical model operationally used at MeteoSwiss.
    - Blending of the precipitation forecasts from the aforementioned nowcasting tools.
    The investigation aims to set up a probabilistic radar rainfall-runoff model experiment for steep Alpine catchments as part of the European research project IMPRINTS.
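    The Lagrangian persistence idea (advecting the current radar field along the estimated storm motion) can be sketched in toy form with a single integer grid offset; the real MAPLE algorithm uses variational echo tracking and semi-Lagrangian advection, so this is only an illustrative stand-in:

```python
def advect(field, di, dj):
    """Shift a 2-D radar field by one storm-motion step of (di, dj) grid cells.
    Cells advected in from outside the domain are set to zero."""
    n, m = len(field), len(field[0])
    out = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            si, sj = i - di, j - dj
            if 0 <= si < n and 0 <= sj < m:
                out[i][j] = field[si][sj]
    return out

# Eulerian persistence would return `field` unchanged; Lagrangian persistence
# moves the precipitation pattern with the storm motion (here: one cell east).
rain = [[0.0, 0.0, 0.0],
        [5.2, 0.0, 0.0],
        [0.0, 0.0, 0.0]]
forecast = advect(rain, 0, 1)
```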

  13. Forecasting intense geomagnetic activity using interplanetary magnetic field data

    NASA Astrophysics Data System (ADS)

    Saiz, E.; Cid, C.; Cerrato, Y.

    2008-12-01

    Southward interplanetary magnetic fields are considered traces of geoeffectiveness since they are a main agent of magnetic reconnection of solar wind and magnetosphere. The first part of this work revises the ability to forecast intense geomagnetic activity using different procedures available in the literature. The study shows that current methods do not succeed in making confident predictions. This fact led us to develop a new forecasting procedure, which provides trustworthy results in predicting large variations of Dst index over a sample of 10 years of observations and is based on the value Bz only. The proposed forecasting method appears as a worthy tool for space weather purposes because it is not affected by the lack of solar wind plasma data, which usually occurs during severe geomagnetic activity. Moreover, the results obtained guide us to provide a new interpretation of the physical mechanisms involved in the interaction between the solar wind and the magnetosphere using Faraday's law.

  14. [Prediction model of meteorological grade of wheat stripe rust in winter-reproductive area, Sichuan Basin, China].

    PubMed

    Guo, Xiang; Wang, Ming Tian; Zhang, Guo Zhi

    2017-12-01

    The winter reproductive areas of Puccinia striiformis var. striiformis in the Sichuan Basin are often the places most affected by wheat stripe rust. Using data on the meteorological conditions and stripe rust situation at typical stations in the winter reproductive area of the Sichuan Basin from 1999 to 2016, this paper classified the meteorological conditions inducing wheat stripe rust into 5 grades, based on the incidence area ratio of the disease. The meteorological factors that were biologically related to wheat stripe rust were determined through multiple analytical methods, and a meteorological grade model for forecasting wheat stripe rust was created. The results showed that wheat stripe rust in the Sichuan Basin was significantly correlated with many meteorological factors, such as the average (maximum and minimum) temperature, precipitation and its anomaly percentage, relative humidity and its anomaly percentage, average wind speed and sunshine duration. Among these, the average temperature and the anomaly percentage of relative humidity were the determining factors. According to a historical retrospective test, the accuracy of the forecast based on the model was 64% for samples in the county-level test, and 89% for samples in the municipal-level test. In a meteorological grade forecast of wheat stripe rust in the winter reproductive areas of the Sichuan Basin in 2017, the prediction was accurate for 62.8% of the samples, off by one grade for 27.9%, and off by two or more grades for only 9.3%. As a result, the model could deliver satisfactory forecast results and predict future wheat stripe rust from a meteorological point of view.

  15. Comparative study of four time series methods in forecasting typhoid fever incidence in China.

    PubMed

    Zhang, Xingyu; Liu, Yuanyuan; Yang, Min; Zhang, Tao; Young, Alistair A; Li, Xiaosong

    2013-01-01

    Accurate incidence forecasting of infectious disease is critical for early prevention and for better government strategic planning. In this paper, we present a comprehensive study of different forecasting methods based on the monthly incidence of typhoid fever. The seasonal autoregressive integrated moving average (SARIMA) model and three different models inspired by neural networks, namely, back propagation neural networks (BPNN), radial basis function neural networks (RBFNN), and Elman recurrent neural networks (ERNN), were compared. The differences, as well as the advantages and disadvantages, among the SARIMA model and the neural networks were summarized and discussed. The data obtained for 2005 to 2009 and for 2010 from the Chinese Center for Disease Control and Prevention were used as modeling and forecasting samples, respectively. The performances were evaluated based on three metrics: mean absolute error (MAE), mean absolute percentage error (MAPE), and mean square error (MSE). The results showed that RBFNN obtained the smallest MAE, MAPE and MSE in both the modeling and forecasting processes. The performances of the four models, ranked in descending order, were: RBFNN, ERNN, BPNN and the SARIMA model.
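    The three evaluation metrics used in this comparison are straightforward to compute; a minimal sketch with hypothetical incidence values and forecasts:

```python
def mae(actual, predicted):
    """Mean absolute error."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def mape(actual, predicted):
    """Mean absolute percentage error, in percent (undefined if any actual is 0)."""
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

def mse(actual, predicted):
    """Mean square error."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical monthly incidence counts and model forecasts
actual = [120, 135, 150, 140]
predicted = [110, 140, 145, 150]
print(mae(actual, predicted), mse(actual, predicted))  # → 7.5 62.5
```

MAPE is scale-free, which makes it convenient when comparing series of different magnitude, while MSE penalizes large misses most heavily.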

  16. Multiple "buy buttons" in the brain: Forecasting chocolate sales at point-of-sale based on functional brain activation using fMRI.

    PubMed

    Kühn, Simone; Strelow, Enrique; Gallinat, Jürgen

    2016-08-01

    We set out to forecast consumer behaviour in a supermarket based on functional magnetic resonance imaging (fMRI). Data were collected while participants viewed six chocolate bar communications and product pictures before and after each communication. Self-reported liking judgements were then collected. fMRI data were extracted from a priori selected brain regions: nucleus accumbens, medial orbitofrontal cortex, amygdala, hippocampus, inferior frontal gyrus and dorsomedial prefrontal cortex, which were assumed to contribute positively to sales, and dorsolateral prefrontal cortex and insula, which were hypothesized to contribute negatively. The resulting values were rank ordered. After our fMRI-based forecast, an in-store test was conducted in a supermarket on n = 63,617 shoppers. Changes in sales were forecast best by the fMRI signal during communication viewing, second best by a comparison of the brain signal during product viewing before and after communication, and least well by explicit liking judgements. The results demonstrate the feasibility of applying neuroimaging methods in a relatively small sample to correctly forecast sales changes at point-of-sale. Copyright © 2016. Published by Elsevier Inc.

  17. Comparative Study of Four Time Series Methods in Forecasting Typhoid Fever Incidence in China

    PubMed Central

    Zhang, Xingyu; Liu, Yuanyuan; Yang, Min; Zhang, Tao; Young, Alistair A.; Li, Xiaosong

    2013-01-01

    Accurate incidence forecasting of infectious disease is critical for early prevention and for better government strategic planning. In this paper, we present a comprehensive study of different forecasting methods based on the monthly incidence of typhoid fever. The seasonal autoregressive integrated moving average (SARIMA) model and three different models inspired by neural networks, namely, back propagation neural networks (BPNN), radial basis function neural networks (RBFNN), and Elman recurrent neural networks (ERNN) were compared. The differences as well as the advantages and disadvantages, among the SARIMA model and the neural networks were summarized and discussed. The data obtained for 2005 to 2009 and for 2010 from the Chinese Center for Disease Control and Prevention were used as modeling and forecasting samples, respectively. The performances were evaluated based on three metrics: mean absolute error (MAE), mean absolute percentage error (MAPE), and mean square error (MSE). The results showed that RBFNN obtained the smallest MAE, MAPE and MSE in both the modeling and forecasting processes. The performances of the four models ranked in descending order were: RBFNN, ERNN, BPNN and the SARIMA model. PMID:23650546

  18. Statistical and Machine Learning forecasting methods: Concerns and ways forward

    PubMed Central

    Makridakis, Spyros; Assimakopoulos, Vassilios

    2018-01-01

    Machine Learning (ML) methods have been proposed in the academic literature as alternatives to statistical ones for time series forecasting. Yet, scant evidence is available about their relative performance in terms of accuracy and computational requirements. The purpose of this paper is to evaluate such performance across multiple forecasting horizons using a large subset of 1045 monthly time series used in the M3 Competition. After comparing the post-sample accuracy of popular ML methods with that of eight traditional statistical ones, we found that the former are dominated across both accuracy measures used and for all forecasting horizons examined. Moreover, we observed that their computational requirements are considerably greater than those of statistical methods. The paper discusses the results, explains why the accuracy of ML models is below that of statistical ones and proposes some possible ways forward. The empirical results found in our research stress the need for objective and unbiased ways to test the performance of forecasting methods that can be achieved through sizable and open competitions allowing meaningful comparisons and definite conclusions. PMID:29584784
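    The abstract does not name its accuracy measures; one measure conventionally used with the M-competition data is the symmetric MAPE (sMAPE), sketched here for illustration with a hypothetical hold-out series:

```python
def smape(actual, forecast):
    """Symmetric mean absolute percentage error, in percent (0 to 200),
    as conventionally used in the M-competitions."""
    return (200 / len(actual)) * sum(
        abs(a - f) / (abs(a) + abs(f)) for a, f in zip(actual, forecast))

# Post-sample evaluation: fit on the early part of a series, score the held-out
# horizon. A naive last-value forecast serves as the baseline here.
holdout = [112, 118, 132, 129]
naive_forecast = [110] * 4  # last observed value carried forward
print(round(smape(holdout, naive_forecast), 2))
```

Averaging such a score over many series and horizons, as the M3 evaluation does, is what allows the statistical and ML methods to be ranked on equal footing.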

  19. Value-at-Risk forecasts by a spatiotemporal model in Chinese stock market

    NASA Astrophysics Data System (ADS)

    Gong, Pu; Weng, Yingliang

    2016-01-01

    This paper generalizes a recently proposed spatial autoregressive model and introduces a spatiotemporal model for forecasting stock returns. We support the view that stock returns are affected not only by the absolute values of factors such as firm size, book-to-market ratio and momentum but also by the relative values of factors like trading volume ranking and market capitalization ranking in each period. This article studies a new method, called the quartile method, for constructing stocks' reference groups. Applying the method empirically to the Shanghai Stock Exchange 50 Index, we compare the daily volatility forecasting performance and the out-of-sample forecasting performance of Value-at-Risk (VaR) estimated by different models. The empirical results show that the spatiotemporal model performs surprisingly well in terms of capturing spatial dependences among individual stocks, and it produces more accurate VaR forecasts than the other three models introduced in the previous literature. Moreover, the findings indicate that both allowing for serial correlation in the disturbances and using time-varying spatial weight matrices can greatly improve the predictive accuracy of a spatial autoregressive model.

  20. Prediction on sunspot activity based on fuzzy information granulation and support vector machine

    NASA Astrophysics Data System (ADS)

    Peng, Lingling; Yan, Haisheng; Yang, Zhigang

    2018-04-01

    In order to analyze the range of sunspots, a combined method for forecasting the fluctuation range of sunspots based on fuzzy information granulation (FIG) and support vector machine (SVM) is put forward. First, FIG is employed to granulate the sample data and extract the valid information of each window, namely the minimum, general average, and maximum values of each window. Second, a forecasting model is built with SVM for each of these values, and a cross-validation method is used to optimize the model parameters. Finally, the fluctuation range of sunspots is forecast with the optimized SVM model. A case study demonstrates that the model has high accuracy and can effectively predict the fluctuation of sunspots.
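    The granulation step described above reduces each window of the series to three values; a minimal sketch (the window length and data are hypothetical):

```python
def granulate(series, window):
    """Partition a series into non-overlapping windows and extract the
    (minimum, general average, maximum) granule of each window."""
    granules = []
    for i in range(0, len(series) - window + 1, window):
        w = series[i:i + window]
        granules.append((min(w), sum(w) / len(w), max(w)))
    return granules

# Hypothetical smoothed monthly sunspot numbers, granulated in windows of 3
data = [58, 63, 71, 80, 76, 69, 55, 48, 52]
granules = granulate(data, 3)
```

One SVM can then be trained per granule component (low, average, high), which is what yields a forecast band rather than a point forecast.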

  1. Forecasting carbon dioxide emissions.

    PubMed

    Zhao, Xiaobing; Du, Ding

    2015-09-01

    This study extends the literature on forecasting carbon dioxide (CO2) emissions by applying the reduced-form econometrics approach of Schmalensee et al. (1998) to a more recent sample period, the post-1997 period. Using the post-1997 period is motivated by the observation that the strengthening pace of global climate policy may have accelerated since 1997. Based on our parameter estimates, we project a 25% reduction in CO2 emissions by 2050 according to an economic and population growth scenario that is more consistent with recent global trends. Our forecasts are conservative because we do not have sufficient data to fully take into account recent developments in the global economy. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times

    NASA Astrophysics Data System (ADS)

    Engeland, Kolbjorn; Steinsland, Ingelin

    2014-05-01

    This study introduces a methodology for the construction of probabilistic inflow forecasts for multiple catchments and lead times, and investigates criteria for the evaluation of multivariate forecasts. A post-processing approach is used, and a Gaussian model is applied to transformed variables. The post-processing model has two main components, the mean model and the dependency model. The mean model is used to estimate the marginal distributions of forecasted inflow for each catchment and lead time, whereas the dependency model is used to estimate the full multivariate distribution of forecasts, i.e. the covariances between catchments and lead times. In operational situations, it is a straightforward task to use the models to sample inflow ensembles which inherit the dependencies between catchments and lead times. The methodology was tested and demonstrated in the river systems linked to the Ulla-Førre hydropower complex in southern Norway, where simultaneous probabilistic forecasts for five catchments and ten lead times were constructed. The methodology exhibits sufficient flexibility to utilize deterministic flow forecasts from a numerical hydrological model as well as statistical forecasts such as persistent forecasts and sliding-window climatology forecasts. It also deals with variation in the relative weights of these forecasts with both catchment and lead time. When evaluating predictive performance in the original space using cross-validation, the case study found that it is important to include the persistent forecast for the initial lead times and the hydrological forecast for medium-term lead times. Sliding-window climatology forecasts become more important for the latest lead times. Furthermore, operationally important features in this case study, such as heteroscedasticity, lead-time-varying between-lead-time dependency and lead-time-varying between-catchment dependency, are captured. 
Two criteria were used for evaluating the added value of the dependency model. The first was the energy score (ES), a multi-dimensional generalization of the continuous ranked probability score (CRPS). ES was calculated for all lead times and catchments together, for each catchment across all lead times, and for each lead time across all catchments. The second criterion was the CRPS for forecasted inflows accumulated over several lead times and catchments. The results showed that ES was not very sensitive to a correct covariance structure, whereas the CRPS for accumulated flows was more suitable for evaluating the dependency model. This indicates that it is more appropriate to evaluate relevant univariate variables that depend on the dependency structure than to evaluate the multivariate forecast directly.
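    The energy score used here can be estimated directly from a forecast ensemble; a minimal sketch with hypothetical two-catchment inflows:

```python
import math

def energy_score(ensemble, obs):
    """Monte Carlo estimate of the energy score
    ES = E||X - y|| - 0.5 * E||X - X'||  (lower is better),
    a multivariate generalization of the CRPS."""
    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
    m = len(ensemble)
    spread = sum(dist(x, obs) for x in ensemble) / m
    pairwise = sum(dist(xi, xj) for xi in ensemble for xj in ensemble) / (m * m)
    return spread - 0.5 * pairwise

# Hypothetical inflow ensemble for two catchments (m3/s) and the observation
ensemble = [(10.0, 4.0), (12.0, 5.0), (11.0, 4.5)]
score = energy_score(ensemble, (11.0, 4.2))
```

Because ES averages norms of member-observation differences, an ensemble with correct marginals but a wrong inter-site covariance changes the score only weakly, which is consistent with the finding above.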

  3. Helicopter noise prediction - The current status and future direction

    NASA Technical Reports Server (NTRS)

    Brentner, Kenneth S.; Farassat, F.

    1992-01-01

    The paper takes stock of the progress, assesses the current prediction capabilities, and forecasts the direction of future helicopter noise prediction research. The acoustic analogy approach, specifically theories based on the Ffowcs Williams-Hawkings equation, is the most widely used for deterministic noise sources. Thickness and loading noise can be routinely predicted given good blade motion and blade loading inputs. Blade-vortex interaction (BVI) noise can also be predicted well with measured input data, but prediction of airloads with the high spatial and temporal resolution required for BVI is still difficult. Current semiempirical broadband noise predictions are useful and reasonably accurate. New prediction methods based on a Kirchhoff formula and on direct computation appear to be very promising, but are currently very demanding computationally.

  4. [Methodic approaches to evaluation of microclimate at workplace, with application of various types of protective clothing against occupational hazards].

    PubMed

    Prokopenko, L V; Afanas'eva, R F; Bessonova, N A; Burmistrova, O V; Losik, T K; Konstantinov, E I

    2013-01-01

    Studies of the thermal state of humans engaged in physical work in a heating environment while wearing various types of protective clothing demonstrated the role of protective clothing in modifying the thermal load on the body, and showed that this load can be reduced through correction of air temperature and humidity and through shorter stays at the workplace. The authors present hygienic requirements for the air temperature range in accordance with the allowable degree of body heating, and suggest a mathematical model to forecast an integral parameter of human functional state in accordance with the type of protective clothing applied. The article also covers the necessity of an upper air temperature limit during the hot season when protective clothing made of materials with low air permeability and hydraulic conductivity is applied.

  5. Degradation forecast for PEMFC cathode-catalysts under cyclic loads

    NASA Astrophysics Data System (ADS)

    Moein-Jahromi, M.; Kermani, M. J.; Movahed, S.

    2017-08-01

    Degradation of Fuel Cell (FC) components under cyclic loads is one of the biggest bottlenecks in FC commercialization. In this paper, a novel experiment-based algorithm is presented to predict the Catalyst Layer (CL) performance loss during cyclic load. The algorithm consists of two models, namely Models 1 and 2. Model 1 calculates the Electro-Chemical Surface Area (ECSA) and agglomerate size (e.g. agglomerate radius, rt,agg) for the catalyst layer under cyclic load. Model 2 is the already-existing model from our earlier studies that computes catalyst performance with fixed structural parameters. The combination of these two models predicts the CL performance under an arbitrary cyclic load. A set of parametric/sensitivity studies is performed to investigate the effects of operating parameters on the percentage Voltage Degradation Rate (VDR%), with rank 1 assigned to the most influential one. Among the considered parameters (temperature, relative humidity, pressure, and the minimum and maximum voltage of the cyclic load), the results show that temperature and pressure have the most and the least influence on the VDR%, respectively: increasing the temperature from 60 °C to 80 °C intensifies the VDR by over 20%, while increasing the pressure from 2 atm to 4 atm reduces the VDR by only 1.41%.

  6. Development of a monthly to seasonal forecast framework tailored to inland waterway transport in central Europe

    NASA Astrophysics Data System (ADS)

    Meißner, Dennis; Klein, Bastian; Ionita, Monica

    2017-12-01

    Traditionally, navigation-related forecasts in central Europe cover short- to medium-range lead times linked to the travel times of vessels to pass the main waterway bottlenecks leaving the loading ports. Without doubt, this aspect is still essential for navigational users, but in light of the growing political intention to use the free capacity of the inland waterway transport in Europe, additional lead time supporting strategic decisions is more and more in demand. However, no such predictions offering extended lead times of several weeks up to several months currently exist for considerable parts of the European waterway network. This paper describes the set-up of a monthly to seasonal forecasting system for the German stretches of the international waterways of the Rhine, Danube and Elbe rivers. Two competitive forecast approaches have been implemented: the dynamical set-up forces a hydrological model with post-processed outputs from ECMWF general circulation model System 4, whereas the statistical approach is based on the empirical relationship (teleconnection) of global oceanic, climate and regional hydro-meteorological data with river flows. The performance of both forecast methods is evaluated in relation to the climatological forecast (ensemble of historical streamflow) and the well-known ensemble streamflow prediction approach (ESP, ensemble based on historical meteorology) using common performance indicators (correlation coefficient; mean absolute error, skill score; mean squared error, skill score; and continuous ranked probability, skill score) and an impact-based evaluation quantifying the potential economic gain. The following four key findings result from this study: (1) as former studies for other regions of central Europe indicate, the accuracy and/or skill of the meteorological forcing used has a larger effect than the quality of initial hydrological conditions for relevant stations along the German waterways. 
(2) Despite the predictive limitations at longer lead times in central Europe, this study reveals valuable predictability of streamflow on monthly up to seasonal timescales along the Rhine, upper Danube and Elbe waterways, with the Elbe achieving the highest skill and economic value. (3) Both the dynamical and the statistical approach are able to improve the predictive skill and economic value compared to climatology and the ESP approach. The specific forecast skill highly depends on the forecast location, the lead time and the season. (4) Currently, the statistical approach seems to be the most skilful for the three waterways investigated. The lagged relationship between the monthly and/or seasonal streamflow and the climatic and/or oceanic variables varies between 1 month (e.g. local precipitation, temperature and soil moisture) and 6 months (e.g. sea surface temperature). Besides improving the forecast methodology, especially by combining the individual approaches, the focus is now on developing useful forecast products on monthly to seasonal timescales for waterway transport and on operationalizing the related forecasting service.

  7. Visualization of uncertainties and forecast skill in user-tailored seasonal climate predictions for agriculture

    NASA Astrophysics Data System (ADS)

    Sedlmeier, Katrin; Gubler, Stefanie; Spierig, Christoph; Flubacher, Moritz; Maurer, Felix; Quevedo, Karim; Escajadillo, Yury; Avalos, Griña; Liniger, Mark A.; Schwierz, Cornelia

    2017-04-01

    Seasonal climate forecast products potentially have a high value for users in different sectors. During the first phase (2012-2015) of the project CLIMANDES (a pilot project of the Global Framework for Climate Services led by WMO [http://www.wmo.int/gfcs/climandes]), a demand study conducted with Peruvian farmers indicated a large interest in seasonal climate information for agriculture. The study further showed that the required information should be precise, timely, and understandable. In addition to the actual forecast, two complex measures are essential for understanding seasonal climate predictions and their limitations correctly: forecast uncertainty and forecast skill. The former can be sampled by using an ensemble of climate simulations; the latter is derived by comparing forecasts of past time periods to observations. Including uncertainty and skill information in a way that is understandable for end users (who are often not technically educated) poses a great challenge. However, neglecting this information would lead to a false sense of determinism which could prove fatal to the credibility of climate information. Within the second phase (2016-2018) of the project CLIMANDES, one goal is to develop a prototype of a user-tailored seasonal forecast for the agricultural sector in Peru. In this local context, the basic education level of the rural farming community presents a major challenge for the communication of seasonal climate predictions. This contribution proposes different graphical presentations of climate forecasts along with possible approaches to visualize and communicate the associated skill and uncertainties, considering end users with varying levels of technical knowledge.

  8. A Bayesian joint probability modeling approach for seasonal forecasting of streamflows at multiple sites

    NASA Astrophysics Data System (ADS)

    Wang, Q. J.; Robertson, D. E.; Chiew, F. H. S.

    2009-05-01

    Seasonal forecasting of streamflows can be highly valuable for water resources management. In this paper, a Bayesian joint probability (BJP) modeling approach for seasonal forecasting of streamflows at multiple sites is presented. A Box-Cox transformed multivariate normal distribution is proposed to model the joint distribution of future streamflows and their predictors such as antecedent streamflows and El Niño-Southern Oscillation indices and other climate indicators. Bayesian inference of model parameters and uncertainties is implemented using Markov chain Monte Carlo sampling, leading to joint probabilistic forecasts of streamflows at multiple sites. The model provides a parametric structure for quantifying relationships between variables, including intersite correlations. The Box-Cox transformed multivariate normal distribution has considerable flexibility for modeling a wide range of predictors and predictands. The Bayesian inference formulated allows the use of data that contain nonconcurrent and missing records. The model flexibility and data-handling ability means that the BJP modeling approach is potentially of wide practical application. The paper also presents a number of statistical measures and graphical methods for verification of probabilistic forecasts of continuous variables. Results for streamflows at three river gauges in the Murrumbidgee River catchment in southeast Australia show that the BJP modeling approach has good forecast quality and that the fitted model is consistent with observed data.
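    The Box-Cox transform at the core of the BJP approach maps skewed streamflow data toward normality; a minimal sketch (the lambda value below is a hypothetical placeholder; in the BJP approach the transformation parameters are inferred together with the model parameters):

```python
import math

def box_cox(x, lam):
    """One-parameter Box-Cox transform: (x**lam - 1)/lam, with the
    log transform as the continuous limit at lam = 0 (requires x > 0)."""
    if lam == 0:
        return math.log(x)
    return (x ** lam - 1) / lam

# Skewed seasonal streamflow totals (hypothetical, in GL) made more symmetric
flows = [12.0, 35.0, 51.0, 240.0, 88.0]
transformed = [box_cox(q, 0.2) for q in flows]
```

After transformation, a multivariate normal model for predictors and predictands becomes plausible, which is what makes the subsequent Markov chain Monte Carlo inference tractable.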

  9. The impact bias in self and others: Affective and empathic forecasting in individuals with social anxiety.

    PubMed

    Arditte Hall, Kimberly A; Joormann, Jutta; Siemer, Matthias; Timpano, Kiara R

    2018-07-01

    People tend to overestimate the intensity and duration of affect (i.e., impact bias) when making predictions about their own and others' responding, termed affective and empathic forecasting, respectively. Research links impact biases to clinical symptoms of affective disorders, but little work has been done to examine how social anxiety is related to affective and empathic forecasting biases. The current investigation included two studies examining these associations in independent samples of young adults with dimensionally distributed social anxiety symptoms. Study 1 (N = 100) examined the associations between social anxiety and affective and empathic forecasts in response to a series of novel hypothetical vignettes in which a second-person narrator (i.e., the self) elicited anger, disgust, or happiness from another person (i.e., the other). Study 2 utilized an innovative experimental paradigm involving N = 68 participant dyads. Overall, results supported the existence of affective and empathic forecasting biases. Further, symptoms of social anxiety were associated with the tendency to overestimate one's own and others' negative affect and underestimate others' positive affect. Such forecasting biases may help to explain the avoidance that is characteristic of individuals with social anxiety and could represent a fruitful target of cognitive behavioral intervention. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Elkhorn Slough: Detecting Eutrophication through Geospatial Modeling Applications

    NASA Astrophysics Data System (ADS)

    Caraballo Álvarez, I. O.; Childs, A.; Jurich, K.

    2016-12-01

    Elkhorn Slough in Monterey, California, has experienced substantial nutrient loading and eutrophication over the past 21 years as a result of fertilizer-rich runoff from nearby agricultural fields. This study seeks to identify and track spatial patterns of eutrophication hotspots and the correlation to land use changes, possible nutrient sources, and general climatic trends using remotely sensed and in situ data. Threats of rising sea level, subsiding marshes, and increased eutrophication hotspots demonstrate the necessity to analyze the effects of increasing nutrient loads, relative sea level changes, and sedimentation within Elkhorn Slough. The Soil & Water Assessment Tool (SWAT) model integrates specified inputs to assess nutrient and sediment loading and their sources. TerrSet's Land Change Modeler forecasts the future potential of land change transitions for various land cover classes around the slough as a result of nutrient loading, eutrophication, and increased sedimentation. TerrSet's Earth Trends Modeler provides a comprehensive analysis of image time series to rapidly assess long term eutrophication trends and detect spatial patterns of known hotspots. Results from this study will inform future coastal management practices and provide greater spatial and temporal insight into Elkhorn Slough eutrophication dynamics.

  11. A comparison of five sampling techniques to estimate surface fuel loading in montane forests

    Treesearch

    Pamela G. Sikkink; Robert E. Keane

    2008-01-01

    Designing a fuel-sampling program that accurately and efficiently assesses fuel load at relevant spatial scales requires knowledge of each sample method's strengths and weaknesses. We obtained loading values for six fuel components using five fuel-load sampling techniques at five locations in western Montana, USA. The techniques included fixed-area plots, planar...

  12. The added value of stochastic spatial disaggregation for short-term rainfall forecasts currently available in Canada

    NASA Astrophysics Data System (ADS)

    Gagnon, Patrick; Rousseau, Alain N.; Charron, Dominique; Fortin, Vincent; Audet, René

    2017-11-01

    Several businesses and industries rely on rainfall forecasts to support their day-to-day operations. To deal with the uncertainty associated with rainfall forecasts, some meteorological organisations have developed products such as ensemble forecasts. However, due to the intensive computational requirements of ensemble forecasts, the spatial resolution remains coarse. For example, Environment and Climate Change Canada's (ECCC) Global Ensemble Prediction System (GEPS) data are freely available on a 1-degree grid (about 100 km), while those of the so-called High Resolution Deterministic Prediction System (HRDPS) are available on a 2.5-km grid (about 40 times finer). Potential users are then left with the option of using either a high-resolution rainfall forecast without uncertainty estimation or an ensemble with a spectrum of plausible rainfall values, but at a coarser spatial scale. The objective of this study was to evaluate the added value of coupling the Gibbs Sampling Disaggregation Model (GSDM) with ECCC products to provide accurate, precise and consistent rainfall estimates at a fine spatial resolution (10 km) within a forecast framework (6 h). For 30 6-h rainfall events occurring within a 40,000-km2 area (Québec, Canada), results show that, using 100-km aggregated reference rainfall depths as input, statistics of the rainfall fields generated by GSDM were close to those of the 10-km reference field. However, in forecast mode, GSDM outcomes inherit the ECCC forecast biases, resulting in poor performance when GEPS data were used as input, mainly due to the inherent rainfall depth distribution of the latter product. Better performance was achieved when the Regional Deterministic Prediction System (RDPS), available on a 10-km grid and aggregated at 100 km, was used as input to GSDM. Nevertheless, most of the analyzed ensemble forecasts were weakly consistent. Some areas of improvement are identified herein.
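    The core consistency requirement of a disaggregation scheme such as GSDM is that the fine-scale field reproduce the coarse-cell depth it was generated from. The following is a minimal, hypothetical sketch of that mass-conservation constraint only (random weights, not the actual Gibbs-sampling procedure):

```python
import random

def disaggregate(coarse_depth, n_fine, seed=42):
    """Spread one coarse-cell rainfall depth over n_fine sub-cells.

    Random positive weights are rescaled so the mean of the fine-scale
    depths equals the coarse depth, i.e. the disaggregated field is
    mass-consistent with its coarse-scale input."""
    rng = random.Random(seed)
    weights = [rng.random() for _ in range(n_fine)]
    total = sum(weights)
    return [coarse_depth * w * n_fine / total for w in weights]

fine = disaggregate(12.0, 100)          # 12 mm spread over 100 fine cells
print(round(sum(fine) / len(fine), 6))  # 12.0 (coarse depth preserved)
```

A real disaggregation model would draw the weights from a spatially correlated distribution conditioned on observed fields; only the normalization step is shown here.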

  13. The probability forecast evaluation of hazard and storm wind over the territories of Russia and Europe

    NASA Astrophysics Data System (ADS)

    Perekhodtseva, E. V.

    2012-04-01

    The results of probability forecast methods for summer storm and hazard winds over the territories of Russia and Europe are presented in this paper. These methods use a hydrodynamic-statistical model of these phenomena. The statistical model was developed for recognition of situations involving these phenomena; to this end, samples of atmospheric parameter values (n = 40) for the presence and for the absence of storm and hazard wind were accumulated. The predictor space was compressed without information loss by a special algorithm to k = 7 predictors, and probability forecasts were produced for wind-speed thresholds of 19 m/s, 24 m/s and 29 m/s and for the areas of tornadoes and strong squalls. The evaluation of this probability forecast used the Brier criterion; the estimation was successful, with B = 0.37 for the European part of Russia. The application of the probability forecast of storm and hazard winds makes it possible to mitigate economic losses when the errors of the first and second kind of a categorical storm-wind forecast are not small. A number of examples of the storm-wind probability forecast are presented in this report.
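    The Brier criterion mentioned above is the mean squared difference between forecast probabilities and observed binary outcomes (lower is better). A minimal illustration with made-up forecasts:

```python
def brier_score(probs, outcomes):
    """Brier score: mean squared difference between forecast
    probabilities and observed binary outcomes (1 = event occurred).
    0 is perfect; a constant p = 0.5 forecast scores 0.25."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# Four hypothetical storm-wind forecasts against observed outcomes:
print(round(brier_score([0.9, 0.2, 0.7, 0.1], [1, 0, 1, 0]), 4))  # 0.0375
```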

  14. Economic evaluation of a solar hot-water-system

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Analysis shows economic benefits at six representative sites using actual data from Tempe, Arizona and San Diego, California installations. Model is two-tank cascade water heater with flat-plate collector array for single-family residences. Performances are forecast for Albuquerque, New Mexico; Fort Worth, Texas; Madison, Wisconsin; and Washington, D.C. Costs are compared to net energy savings using variables for each site's environmental conditions, loads, fuel costs, and other economic factors; uncertainty analysis is included.

  15. How Organizations Learn: A Communication Framework.

    DTIC Science & Technology

    1986-04-01

    Information load is defined as the volume of information inputs required for an organization to perform its tasks (Farace, Monge, and Russell, 1977)...information strategy is to set priorities to pinpoint critical information that can be summarized or "chunked" into meaningful units (Farace et al., 1977)..."Environmental Scanning and Forecasting in Strategic Planning--The State of the Art," Long Range Planning, Vol. 14, No. 1 (1981), pp. 32-39. Farace, R. B

  16. An Example of Unsupervised Networks Kohonen's Self-Organizing Feature Map

    NASA Technical Reports Server (NTRS)

    Niebur, Dagmar

    1995-01-01

    Kohonen's self-organizing feature map belongs to a class of unsupervised artificial neural networks commonly referred to as topographic maps. It serves two purposes: the quantization and dimensionality reduction of data. A short description of its history and its biological context is given. We show that the inherent classification properties of the feature map make it a suitable candidate for solving the classification task in power system areas such as load forecasting, fault diagnosis and security assessment.
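    As a concrete illustration of the quantization property described above, here is a minimal one-dimensional Kohonen map for scalar inputs (a toy sketch, not the power-system application):

```python
import random

def train_som(data, n_units=4, epochs=200, lr=0.5, seed=0):
    """Minimal 1-D Kohonen self-organizing map for scalar inputs.

    For each sample the best-matching unit (BMU) and its immediate
    neighbours on the map move toward the input, so the units end up
    quantizing the input distribution while preserving topology."""
    rng = random.Random(seed)
    w = [rng.random() for _ in range(n_units)]
    for t in range(epochs):
        rate = lr * (1 - t / epochs)          # decaying learning rate
        x = rng.choice(data)
        bmu = min(range(n_units), key=lambda i: abs(w[i] - x))
        for i in range(n_units):
            if abs(i - bmu) <= 1:             # BMU plus direct neighbours
                w[i] += rate * (x - w[i])
    return sorted(w)

# Two well-separated input clusters end up represented by different units.
weights = train_som([0.1, 0.12, 0.88, 0.9] * 25)
print([round(v, 2) for v in weights])
```

A full SOM uses vector inputs, a 2-D map lattice, and a shrinking neighbourhood radius; this sketch keeps only the winner-plus-neighbours update rule.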

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cibin, Raj; Trybula, Elizabeth; Chaubey, Indrajeet

    Cellulosic bioenergy feedstocks such as perennial grasses and crop residues are expected to play a significant role in meeting US biofuel production targets. Here, we used an improved version of the Soil and Water Assessment Tool (SWAT) to forecast impacts on watershed hydrology and water quality of implementing an array of plausible land-use changes associated with commercial bioenergy crop production for two watersheds in the Midwest USA. Watershed-scale impacts were estimated for 13 bioenergy crop production scenarios, including: production of Miscanthus × giganteus and upland Shawnee switchgrass on highly erodible landscape positions, agricultural marginal land areas and pastures; removal of corn stover; and combinations of these options. We also measured water quality as erosion and sediment loading; these were forecast to improve compared to baseline when perennial grasses were used for bioenergy production, but not with stover-removal scenarios. Erosion reduction with perennial energy crop production scenarios ranged between 0.2% and 59%. Stream flow at the watershed outlet was reduced by between 0 and 8% across these bioenergy crop production scenarios compared to baseline across the study watersheds. Our results indicate that bioenergy production scenarios that incorporate perennial grasses reduced the nonpoint source pollutant load at the watershed outlet compared to the baseline conditions (0–20% for nitrate-nitrogen and 3–56% for mineral phosphorus), but the reduction rates were specific to site characteristics and management practices.

  18. Feasibility study for the Swaziland/Mozambique interconnector. Final report. Export trade information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-11-01

    This study, conducted by Black & Veatch, was funded by the U.S. Trade and Development Agency. The report, produced for the Ministry of Natural Resources, Energy and Environment (MNRE) of Swaziland, determines the least-cost capacity expansion option to meet the future power demand and system reliability criteria of Swaziland, with particular emphasis on the proposed interconnector between Swaziland and Mozambique. Volume 2, the Final Report, contains the following sections: (1.0) Introduction; (2.0) Review of SEB Power System; (3.0) SEB Load Forecast and Review; (4.0) SEB Load Forecast Revision; (5.0) The SEB Need for Power; (6.0) SEB System Development Plan Review; (7.0) Southern Mozambique EdM Power System Review; (8.0) Southern Mozambique EdM Energy and Demand; (9.0) Supply Side Capacity Options for Swaziland and Mozambique; (10.0) SEB Expansion Plan Development; (11.0) EdM Expansion Plan Development; (12.0) Cost Sharing of the Interconnector; (13.0) Environmental Evaluation of Interconnector Options; (14.0) Generation/Transmission Trade-Offs; (15.0) Draft Interconnection Agreement and Contract Packages; (16.0) Transmission System Study; (17.0) Automatic Generation Control; (18.0) Automatic Startup and Shutdown of Hydro Electric Power Plants; (19.0) Communications and Metering; (20.0) Conclusions and Recommendations; Appendix A: Demand Side Management Primer; Appendix B: PURPA and Avoided Cost Calculations.

  19. Modeling of Micro Deval abrasion loss based on some rock properties

    NASA Astrophysics Data System (ADS)

    Capik, Mehmet; Yilmaz, Ali Osman

    2017-10-01

    Aggregate is one of the most widely used construction materials. The quality of aggregate is determined using several testing methods; among these, the Micro Deval Abrasion Loss (MDAL) test is commonly used to determine the quality and abrasion resistance of aggregate. The main objective of this study is to develop models for the prediction of MDAL from rock properties; uniaxial compressive strength, Brazilian tensile strength, point load index, Schmidt rebound hardness, apparent porosity, void ratio, Cerchar abrasivity index and Bohme abrasion loss are examined. Additionally, the MDAL is modeled using simple regression analysis and multiple linear regression analysis based on the rock properties. The study shows that the MDAL decreases with increasing uniaxial compressive strength, Brazilian tensile strength, point load index, Schmidt rebound hardness and Cerchar abrasivity index. It is also concluded that the MDAL increases with increasing apparent porosity, void ratio and Bohme abrasion loss. The modeling results show that the models based on the Bohme abrasion test and L-type Schmidt rebound hardness give the best forecasting performances for the MDAL. Further models, including the uniaxial compressive strength, the apparent porosity and the Cerchar abrasivity index, are developed for rapid estimation of the MDAL of rocks. The developed models were verified by statistical tests. Additionally, it can be stated that the proposed models can be used as forecasting tools for aggregate quality.

  20. Forecasting approaches to the Mekong River

    NASA Astrophysics Data System (ADS)

    Plate, E. J.

    2009-04-01

    Hydrologists distinguish between flood forecasts, which are concerned with events of the immediate future, and flood predictions, which are concerned with events that are possible, but whose date of occurrence is not determined. Although in principle both involve the determination of runoff from rainfall, the analytical approaches differ because of different objectives. The differences between the two approaches will be discussed, starting with an analysis of the forecasting process. The Mekong River in south-east Asia is used as an example. Prediction is defined as forecast for a hypothetical event, such as the 100-year flood, which is usually sufficiently specified by its magnitude and its probability of occurrence. It forms the basis for designing flood protection structures and risk management activities. The method for determining these quantities is hydrological modeling combined with extreme value statistics, today usually applied both to rainfall events and to observed river discharges. A rainfall-runoff model converts extreme rainfall events into extreme discharges, which at certain gage points along a river are calibrated against observed discharges. The quality of the model output is assessed against the mean value by means of the Nash-Sutcliffe quality criterion. The result of this procedure is a design hydrograph (or a family of design hydrographs) which are used as inputs into a hydraulic model, which converts the hydrograph into design water levels according to the hydraulic situation of the location. The accuracy of making a prediction in this sense is not particularly high: hydrologists know that the 100-year flood is a statistical quantity which can be estimated only within comparatively wide error bounds, and the hydraulics of a river site, in particular under conditions of heavy sediment loads, has many uncertainties. Safety margins, such as additional freeboard, are provided to compensate for the uncertainty of the prediction.
    Forecasts, on the other hand, aim to produce an accurate hydrograph of the near future. The method by which this is done is less important than the accuracy of the forecast. A mathematical rainfall-runoff model is not necessarily a good forecast model. It has to be very carefully designed, and in many cases statistical models are found to give better results than mathematical models. Forecasters have the advantage of knowing the course of the hydrographs up to the point in time where forecasts have to be made. Therefore, models can be calibrated on line against the hydrograph of the immediate past. To assess the quality of a forecast, the quality criterion should not be based on the mean value, as the Nash-Sutcliffe criterion is, but on the best forecast given the information up to the forecast time. Without any additional information, the best forecast when only the present-day value is known is to assume a no-change scenario, i.e. to assume that the present value does not change in the immediate future. For the Mekong there exists a forecasting system based on a rainfall-runoff model operated by the Mekong River Commission. This model is found not to be adequate for forecasting for periods longer than one or two days ahead. Improvements are sought through two approaches: a strictly deterministic rainfall-runoff model, and a strictly statistical model based on regression with upstream stations. The two approaches are compared, and suggestions are made on how best to combine the advantages of both. This requires that due consideration be given to critical hydraulic conditions of the river at and between the gauging stations. Critical situations occur in two ways: when the river overtops its banks, in which case the rainfall-runoff model is incomplete unless overflow losses are considered, and at the confluence with tributaries.
Of particular importance is the role of the large Tonle Sap Lake, which dampens the hydrograph downstream of Phnom Penh. The effect of these components of river hydraulics on forecasting accuracy will be assessed.
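    The contrast drawn above between a mean-value benchmark (Nash-Sutcliffe) and a no-change benchmark can be made concrete. The sketch below computes both criteria for a hypothetical hydrograph; the only difference is the reference forecast in the denominator:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: skill relative to always
    forecasting the observed mean (1 is perfect, 0 matches the mean)."""
    mean = sum(obs) / len(obs)
    err = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ref = sum((o - mean) ** 2 for o in obs)
    return 1 - err / ref

def persistence_skill(obs, sim):
    """Same form, but benchmarked against the no-change forecast:
    each value is 'predicted' by the previous observation."""
    err = sum((o - s) ** 2 for o, s in zip(obs[1:], sim[1:]))
    ref = sum((b - a) ** 2 for a, b in zip(obs, obs[1:]))
    return 1 - err / ref

obs = [10.0, 12.0, 15.0, 19.0, 24.0]          # hypothetical rising hydrograph
sim = [10.5, 12.5, 15.5, 19.5, 24.5]          # forecast with a constant bias
print(round(nse(obs, sim), 3))                # 0.99
print(round(persistence_skill(obs, sim), 3))  # 0.981
```

On a steadily rising hydrograph the mean is a very weak reference, so the NSE flatters the forecast; the persistence benchmark is the stricter test the text recommends for short-lead forecasting.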

  1. Forecasting volatility in gold returns under the GARCH, IGARCH and FIGARCH frameworks: New evidence

    NASA Astrophysics Data System (ADS)

    Bentes, Sonia R.

    2015-11-01

    This study employs three volatility models of the GARCH family to examine the volatility behavior of gold returns. Much of the literature on this topic suggests that gold plays a fundamental role as a hedge and safe haven against adverse market conditions, which is particularly relevant in periods of high volatility. This makes understanding gold volatility important for a number of theoretical and empirical applications, namely investment valuation, portfolio selection, risk management, monetary policy-making, futures and option pricing, hedging strategies and value-at-risk (VaR) policies (e.g. Baur and Lucey (2010)). We use daily data from August 2, 1976 to February 6, 2015 and divide the full sample into two periods: the in-sample period (August 2, 1976-October 24, 2008) is used to estimate model coefficients, while the out-of-sample period (October 27, 2008-February 6, 2015) is for forecasting purposes. Specifically, we employ the GARCH(1,1), IGARCH(1,1) and FIGARCH(1,d,1) specifications. The results show that the FIGARCH(1,d,1) is the best model to capture linear dependence in the conditional variance of the gold returns as given by the information criteria. It is also found to be the best model to forecast the volatility of gold returns.
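    For reference, the GARCH(1,1) member of the family used here propagates conditional variance as sigma2[t] = omega + alpha * r[t-1]^2 + beta * sigma2[t-1]. A small sketch of that recursion with illustrative (not estimated) parameters:

```python
def garch_variance(returns, omega, alpha, beta):
    """GARCH(1,1) conditional variance recursion:
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1],
    started from the unconditional variance omega / (1 - alpha - beta)."""
    sigma2 = [omega / (1 - alpha - beta)]
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r ** 2 + beta * sigma2[-1])
    return sigma2

# A large return shock raises next-period variance, which then decays
# back toward the unconditional level at rate alpha + beta.
s2 = garch_variance([0.0, 3.0, 0.0, 0.0], omega=0.1, alpha=0.1, beta=0.8)
print([round(v, 4) for v in s2])  # [1.0, 0.9, 1.72, 1.476]
```

IGARCH fixes alpha + beta = 1 (shocks never die out), while FIGARCH replaces the recursion with a fractionally integrated lag structure of order d; only the baseline case is sketched here.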

  2. A Sequential Monte Carlo Approach for Streamflow Forecasting

    NASA Astrophysics Data System (ADS)

    Hsu, K.; Sorooshian, S.

    2008-12-01

    As alternatives to traditional physically based models, Artificial Neural Network (ANN) models offer some advantages with respect to flexibility: they do not require a precise quantitative description of the process mechanism and can train themselves from the data directly. In this study, an ANN model was used to generate one-day-ahead streamflow forecasts from the precipitation input over a catchment. The ANN model parameters were trained using a Sequential Monte Carlo (SMC) approach, namely the Regularized Particle Filter (RPF). SMC approaches are known for their ability to track the states and parameters of a nonlinear dynamic process based on Bayes' rule and effective sampling and resampling strategies. In this study, five years of daily rainfall and streamflow measurements were used for model training. Variable RPF sample sizes, from 200 to 2000, were tested. The results show that, beyond 1000 RPF samples, the simulation statistics, in terms of correlation coefficient, root mean square error, and bias, were stabilized. It is also shown that the forecasted daily flows fit the observations very well, with correlation coefficients higher than 0.95. The results of the RPF simulations were also compared with those from the popular back-propagation ANN training approach. The pros and cons of using the SMC approach and the traditional back-propagation approach will be discussed.
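    The resampling step that distinguishes SMC methods such as the RPF can be illustrated with a deterministic toy example (systematic resampling with a fixed offset; the RPF's regularization step is omitted):

```python
def systematic_resample(weights, u):
    """Systematic resampling: one offset u in [0, 1) places n evenly
    spaced pointers over the cumulative weight distribution; particles
    with large weights are duplicated, small ones are dropped."""
    n = len(weights)
    total = sum(weights)
    cum, c = [], 0.0
    for w in weights:
        c += w / total
        cum.append(c)
    indices, j = [], 0
    for i in range(n):
        p = (i + u) / n
        while cum[j] < p:
            j += 1
        indices.append(j)
    return indices

# In practice u is drawn uniformly at random; a fixed u keeps the demo
# deterministic.
idx = systematic_resample([0.05, 0.05, 0.7, 0.1, 0.1], u=0.5)
print(idx)  # [1, 2, 2, 2, 3] -- the 0.7-weight particle is kept 3 times
```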

  3. Comparative verification between GEM model and official aviation terminal forecasts

    NASA Technical Reports Server (NTRS)

    Miller, Robert G.

    1988-01-01

    The Generalized Exponential Markov (GEM) model uses the local standard airways observation (SAO) to predict hour-by-hour the following elements: temperature, pressure, dew point depression, first and second cloud-layer height and amount, ceiling, total cloud amount, visibility, wind, and present weather conditions. GEM is superior to persistence at all projections for all elements in a large independent sample. A minute-by-minute GEM forecasting system utilizing the Automated Weather Observation System (AWOS) is under development.

  4. Comparisons of forecasting for hepatitis in Guangxi Province, China by using three neural networks models.

    PubMed

    Gan, Ruijing; Chen, Ni; Huang, Daizheng

    2016-01-01

    This study compares and evaluates the prediction of hepatitis in Guangxi Province, China using back-propagation neural networks based on a genetic algorithm (BPNN-GA), generalized regression neural networks (GRNN), and wavelet neural networks (WNN). In order to compare the forecasting results, the data obtained from 2004 to 2013 and from 2014 were used as modeling and forecasting samples, respectively. The results show that when a small hepatitis data set has seasonal fluctuation, the prediction by BPNN-GA is better than by the two other methods. The WNN method is suitable for predicting large hepatitis data sets with seasonal fluctuation, as is the GRNN method when the data increase steadily.

  5. Wind Information Uplink to Aircraft Performing Interval Management Operations

    NASA Technical Reports Server (NTRS)

    Ahmad, Nashat; Barmore, Bryan; Swieringa, Kurt

    2015-01-01

    The accuracy of the wind information used to generate trajectories for aircraft performing Interval Management (IM) operations is critical to the success of an IM operation. There are two main forms of uncertainty in the wind information used by the Flight Deck Interval Management (FIM) equipment. The first is the accuracy of the forecast modeling done by the weather provider. The second is that only a small subset of the forecast data can be uplinked to the aircraft for use by the FIM equipment, resulting in a loss of information. This study focuses on what subset of the forecast data, such as the number and location of the points where the wind is sampled, should be made available for uplink to the aircraft.

  6. Enumeration of Salmonella and Campylobacter spp. in environmental farm samples and processing plant carcass rinses from commercial broiler chicken flocks.

    PubMed

    Berghaus, Roy D; Thayer, Stephan G; Law, Bibiana F; Mild, Rita M; Hofacre, Charles L; Singer, Randall S

    2013-07-01

    A prospective cohort study was performed to evaluate the prevalences and loads of Salmonella and Campylobacter spp. in farm and processing plant samples collected from 55 commercial broiler chicken flocks. Environmental samples were collected from broiler houses within 48 h before slaughter, and carcass rinses were performed on birds from the same flocks at 4 different stages of processing. Salmonella was detected in farm samples of 50 (90.9%) flocks and in processing samples of 52 (94.5%) flocks. Campylobacter was detected in farm samples of 35 (63.6%) flocks and in processing samples of 48 (87.3%) flocks. There was a significant positive relationship between environmental farm samples and processing plant carcass rinses with respect to both Salmonella and Campylobacter prevalences and loads. Campylobacter loads were significantly higher than Salmonella loads, and the correlations between samples collected from the same flocks were higher for Campylobacter than they were for Salmonella. Boot socks were the most sensitive sample type for detection of Salmonella on the farm, whereas litter samples had the strongest association with Salmonella loads in pre- and postchill carcass rinses. Boot socks, drag swabs, and fecal samples all had similar sensitivities for detecting Campylobacter on the farm, and all were more strongly associated with Campylobacter loads in carcass rinses than were litter samples. Farm samples explained a greater proportion of the variability in carcass rinse prevalences and loads for Campylobacter than they did for Salmonella. Salmonella and Campylobacter prevalences and loads both decreased significantly as birds progressed through the processing plant.

  7. Enumeration of Salmonella and Campylobacter spp. in Environmental Farm Samples and Processing Plant Carcass Rinses from Commercial Broiler Chicken Flocks

    PubMed Central

    Thayer, Stephan G.; Law, Bibiana F.; Mild, Rita M.; Hofacre, Charles L.; Singer, Randall S.

    2013-01-01

    A prospective cohort study was performed to evaluate the prevalences and loads of Salmonella and Campylobacter spp. in farm and processing plant samples collected from 55 commercial broiler chicken flocks. Environmental samples were collected from broiler houses within 48 h before slaughter, and carcass rinses were performed on birds from the same flocks at 4 different stages of processing. Salmonella was detected in farm samples of 50 (90.9%) flocks and in processing samples of 52 (94.5%) flocks. Campylobacter was detected in farm samples of 35 (63.6%) flocks and in processing samples of 48 (87.3%) flocks. There was a significant positive relationship between environmental farm samples and processing plant carcass rinses with respect to both Salmonella and Campylobacter prevalences and loads. Campylobacter loads were significantly higher than Salmonella loads, and the correlations between samples collected from the same flocks were higher for Campylobacter than they were for Salmonella. Boot socks were the most sensitive sample type for detection of Salmonella on the farm, whereas litter samples had the strongest association with Salmonella loads in pre- and postchill carcass rinses. Boot socks, drag swabs, and fecal samples all had similar sensitivities for detecting Campylobacter on the farm, and all were more strongly associated with Campylobacter loads in carcass rinses than were litter samples. Farm samples explained a greater proportion of the variability in carcass rinse prevalences and loads for Campylobacter than they did for Salmonella. Salmonella and Campylobacter prevalences and loads both decreased significantly as birds progressed through the processing plant. PMID:23624481

  8. An evaluation of flow-stratified sampling for estimating suspended sediment loads

    Treesearch

    Robert B. Thomas; Jack Lewis

    1995-01-01

    Abstract - Flow-stratified sampling is a new method for sampling water quality constituents such as suspended sediment to estimate loads. As with selection-at-list-time (SALT) and time-stratified sampling, flow-stratified sampling is a statistical method requiring random sampling, and yielding unbiased estimates of load and variance. It can be used to estimate event...
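    Since flow-stratified sampling yields the classical stratified estimators of load and variance, the formulas can be sketched as follows (hypothetical numbers, not from the study):

```python
from statistics import mean, variance

def stratified_load(strata):
    """Stratified estimate of a total load and its variance.

    Each stratum is (N_h, samples): N_h time units in the stratum,
    with measured loads for the n_h randomly sampled units.
      total = sum N_h * ybar_h
      var   = sum N_h**2 * (1 - n_h/N_h) * s_h**2 / n_h"""
    total, var = 0.0, 0.0
    for n_units, samples in strata:
        n = len(samples)
        total += n_units * mean(samples)
        var += n_units ** 2 * (1 - n / n_units) * variance(samples) / n
    return total, var

# Hypothetical event: a low-flow stratum (80 h, small loads per hour)
# and a high-flow stratum (20 h, large loads per hour).
total, var = stratified_load([(80, [1.0, 1.2, 0.8]), (20, [10.0, 14.0, 12.0])])
print(round(total, 1))  # 320.0 total load over the event
```

Stratifying on flow concentrates sampling effort in the high-flow stratum, where most of the load (and most of its variance) is generated.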

  9. COP21 climate negotiators' responses to climate model forecasts

    NASA Astrophysics Data System (ADS)

    Bosetti, Valentina; Weber, Elke; Berger, Loïc; Budescu, David V.; Liu, Ning; Tavoni, Massimo

    2017-02-01

    Policymakers involved in climate change negotiations are key users of climate science. It is therefore vital to understand how to communicate scientific information most effectively to this group. We tested how a unique sample of policymakers and negotiators at the Paris COP21 conference update their beliefs on year 2100 global mean temperature increases in response to a statistical summary of climate models' forecasts. We randomized the way information was provided across participants using three different formats similar to those used in Intergovernmental Panel on Climate Change reports. In spite of having received all available relevant scientific information, policymakers adopted such information very conservatively, assigning it less weight than their own prior beliefs. However, providing individual model estimates in addition to the statistical range was more effective in mitigating such inertia. The experiment was repeated with a population of European MBA students who, despite starting from similar priors, reported conditional probabilities closer to the provided models' forecasts than policymakers. There was also no effect of presentation format in the MBA sample. These results highlight the importance of testing visualization tools directly on the population of interest.

  10. Artificial neural networks environmental forecasting in comparison with multiple linear regression technique: From heavy metals to organic micropollutants screening in agricultural soils

    NASA Astrophysics Data System (ADS)

    Bonelli, Maria Grazia; Ferrini, Mauro; Manni, Andrea

    2016-12-01

    The assessment of metal and organic micropollutant contamination in agricultural soils is a difficult challenge due to the extensive area over which a very large number of samples must be collected and analyzed. For the measurement of dioxins and dioxin-like PCBs and the subsequent treatment of data, the European Community advises developing low-cost and fast methods allowing routine analysis of a great number of samples, providing rapid measurement of these compounds in the environment, feeds and food. The aim of the present work has been to find a method suitable for describing the relations occurring between organic and inorganic contaminants and using the values of the latter to forecast the former. In practice, the use of a portable soil metal analyzer coupled with an efficient statistical procedure enables the required objective to be achieved. Compared to Multiple Linear Regression, the Artificial Neural Network technique has proved to be an excellent forecasting method, even though there is no linear correlation between the variables analyzed.

  11. Multiple regression and Artificial Neural Network for long-term rainfall forecasting using large scale climate modes

    NASA Astrophysics Data System (ADS)

    Mekanik, F.; Imteaz, M. A.; Gato-Trinidad, S.; Elmahdi, A.

    2013-10-01

    In this study, the application of Artificial Neural Networks (ANN) and Multiple Regression analysis (MR) to forecast long-term seasonal spring rainfall in Victoria, Australia was investigated using lagged El Nino Southern Oscillation (ENSO) and Indian Ocean Dipole (IOD) as potential predictors. The use of dual (combined lagged ENSO-IOD) input sets for calibrating and validating ANN and MR models is proposed to investigate the simultaneous effect of past values of these two major climate modes on long-term spring rainfall prediction. The MR models that did not violate the limits of statistical significance and multicollinearity were selected for future spring rainfall forecasts. The ANN was developed in the form of a multilayer perceptron using the Levenberg-Marquardt algorithm. Both MR and ANN models were assessed statistically using mean square error (MSE), mean absolute error (MAE), Pearson correlation (r) and the Willmott index of agreement (d). The developed MR and ANN models were tested on out-of-sample test sets; the MR models showed very poor generalisation ability for east Victoria, with correlation coefficients of -0.99 to -0.90, compared to ANN with correlation coefficients of 0.42-0.93; ANN models also showed better generalisation ability for central and west Victoria, with correlation coefficients of 0.68-0.85 and 0.58-0.97 respectively. The ability of the multiple regression models to forecast out-of-sample sets is comparable to ANN for Daylesford in central Victoria and Kaniva in west Victoria (r = 0.92 and 0.67 respectively). The errors of the testing sets for ANN models are generally lower than those of the multiple regression models. The statistical analysis suggests the potential of ANN over MR models for rainfall forecasting using large-scale climate modes.
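    The MAE and Willmott index of agreement (d) used for model assessment above are straightforward to compute; a sketch with hypothetical rainfall values:

```python
def mae(obs, sim):
    """Mean absolute error."""
    return sum(abs(o - s) for o, s in zip(obs, sim)) / len(obs)

def willmott_d(obs, sim):
    """Willmott index of agreement (1 = perfect, 0 = no agreement):
    d = 1 - sum (o - s)^2 / sum (|s - obar| + |o - obar|)^2"""
    obar = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((abs(s - obar) + abs(o - obar)) ** 2 for o, s in zip(obs, sim))
    return 1 - num / den

obs = [120.0, 80.0, 95.0, 150.0]       # hypothetical spring rainfall (mm)
sim = [110.0, 85.0, 100.0, 140.0]      # hypothetical model output (mm)
print(mae(obs, sim))                   # 7.5
print(round(willmott_d(obs, sim), 3))  # 0.971
```

Unlike Pearson's r, the index d penalizes both bias and scatter, which is why it is often reported alongside MAE and MSE.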

  12. Prediction of Individual Serum Infliximab Concentrations in Inflammatory Bowel Disease by a Bayesian Dashboard System.

    PubMed

    Eser, Alexander; Primas, Christian; Reinisch, Sieglinde; Vogelsang, Harald; Novacek, Gottfried; Mould, Diane R; Reinisch, Walter

    2018-01-30

    Despite a robust exposure-response relationship of infliximab in inflammatory bowel disease (IBD), attempts to adjust dosing to individually predicted serum concentrations of infliximab (SICs) are lacking. Compared with labor-intensive conventional software for pharmacokinetic (PK) modeling (e.g., NONMEM), dashboards are easy-to-use programs incorporating complex Bayesian statistics to determine individual pharmacokinetics. We evaluated various infliximab detection assays and the number of samples needed to precisely forecast individual SICs using a Bayesian dashboard. We assessed long-term infliximab retention in patients being dosed concordantly versus discordantly with Bayesian dashboard recommendations. Three hundred eighty-two serum samples from 117 adult IBD patients on infliximab maintenance therapy were analyzed by 3 commercially available assays. Data from each assay were modeled using NONMEM and a Bayesian dashboard. PK parameter precision and residual variability were assessed. Forecast concentrations from both systems were compared with observed concentrations. Infliximab retention was assessed by prediction of the need for dose intensification via the Bayesian dashboard versus real-life practice. Forecast precision of SICs varied between detection assays. At least 3 SICs from a reliable assay are needed for an accurate forecast. The Bayesian dashboard performed similarly to NONMEM in predicting SICs. Patients dosed concordantly with Bayesian dashboard recommendations had a significantly longer median drug survival than those dosed discordantly (51.5 versus 4.6 months, P < .0001). The Bayesian dashboard helps to assess the diagnostic performance of infliximab detection assays. Three SICs, rather than a single one, provide sufficient information for individualized dose adjustment when incorporated into the Bayesian dashboard. Treatment adjusted to forecasted SICs is associated with longer drug retention of infliximab. © 2018, The American College of Clinical Pharmacology.

  13. Development of a drought forecasting model for the Asia-Pacific region using remote sensing and climate data: Focusing on Indonesia

    NASA Astrophysics Data System (ADS)

    Rhee, Jinyoung; Kim, Gayoung; Im, Jungho

    2017-04-01

    Three regions of Indonesia with different rainfall characteristics were chosen to develop drought forecast models based on machine learning. The 6-month Standardized Precipitation Index (SPI6) was selected as the target variable. The models' forecast skill was compared to the skill of long-range climate forecast models in terms of drought accuracy and regression mean absolute error (MAE). Indonesian droughts are known to be related, despite regional differences, to El Niño Southern Oscillation (ENSO) variability, as well as to the monsoon, local sea surface temperature (SST), other large-scale atmosphere-ocean interactions such as the Indian Ocean Dipole (IOD) and the South Pacific Convergence Zone (SPCZ), and local factors including topography and elevation. The machine learning models are thus intended to enhance drought forecast skill by combining local and remote SST and remote sensing information, which reflects initial drought conditions, with the long-range climate forecast model results. A total of 126 machine learning models were developed for the three regions of West Java (JB), West Sumatra (SB), and Gorontalo (GO); six long-range climate forecast models (MSC_CanCM3, MSC_CanCM4, NCEP, NASA, PNU, and POAMA) plus one climatology model based on remote sensing precipitation data; and 1- to 6-month lead times. When the results of the machine learning models and the long-range climate forecast models were compared, the West Java and Gorontalo regions showed similar characteristics in terms of drought accuracy. Drought accuracy of the long-range climate forecast models was generally higher than that of the machine learning models at short lead times, but the opposite held for longer lead times. For West Sumatra, however, the machine learning models and the long-range climate forecast models showed similar drought accuracy. The machine learning models showed smaller regression errors for all three regions, especially at longer lead times. 
Among the three regions, the machine learning models developed for Gorontalo showed the highest drought accuracy and the lowest regression error. West Java showed higher drought accuracy than West Sumatra, while West Sumatra showed lower regression error than West Java. The lower error in West Sumatra may be due to the smaller sample size used for training and evaluation in that region. Regional differences in forecast skill are determined by the effect of ENSO and, in turn, by the forecast skill of the long-range climate forecast models. While somewhat high in West Sumatra, the relative importance of remote sensing variables was low in most cases. The high importance of the variables based on long-range climate forecast models indicates that the forecast skill of the machine learning models is mostly determined by the forecast skill of the climate models.
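As a deliberately simplified sketch of the hybrid idea above: a statistical model can regress observed SPI6 on a climate-model SPI6 forecast plus SST and remote-sensing predictors, and "drought accuracy" can be scored as agreement on a below-threshold drought class. The predictors, the linear form, and the threshold here are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def fit_drought_model(X, y):
    """Ordinary least squares with an intercept; X is (n, p), e.g. columns
    for a climate-model SPI6 forecast and an SST index."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    A = np.column_stack([np.ones(len(X)), X])
    return A @ coef

def drought_accuracy(y_true, y_pred, threshold=-1.0):
    """Fraction of cases where forecast and observation agree on the
    drought / no-drought class (SPI6 below the threshold)."""
    return float(np.mean((y_true < threshold) == (y_pred < threshold)))
```

Scoring both MAE and class agreement mirrors the paper's dual evaluation of regression error and drought accuracy.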

  14. Statistical earthquake focal mechanism forecasts

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.; Jackson, David D.

    2014-04-01

    Forecasts of the focal mechanisms of future shallow (depth 0-70 km) earthquakes are important for seismic hazard estimates, Coulomb stress calculations, and other models of earthquake occurrence. Here we report on a high-resolution global forecast of earthquake rate density as a function of location, magnitude, and focal mechanism. In previous publications we reported forecasts of 0.5° spatial resolution, covering the latitude range from -75° to +75°, based on the Global Central Moment Tensor earthquake catalogue. In the new forecasts we have improved the spatial resolution to 0.1° and extended the latitude range from pole to pole. Our focal mechanism estimates require distance-weighted combinations of observed focal mechanisms within 1000 km of each grid point. Simultaneously, we calculate an average rotation angle between the forecast mechanism and all the surrounding mechanisms, using the method proposed by Kagan & Jackson in 1994. This average angle reveals the level of tectonic complexity of a region and indicates the accuracy of the prediction. The procedure becomes problematic where longitude lines are not approximately parallel and where shallow earthquakes are so sparse that an adequate sample spans very large distances. North or south of 75°, the azimuths of points 1000 km away may vary by about 35°. We solved this problem by calculating focal mechanisms on a plane tangent to the Earth's surface at each forecast point, correcting for the rotation of the longitude lines at the locations of earthquakes included in the averaging. The corrections are negligible between -30° and +30° latitude, but outside that band uncorrected rotations can be significantly off. Improved forecasts at 0.5° and 0.1° resolution are posted at http://eq.ess.ucla.edu/kagan/glob_gcmt_index.html.
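Full focal-mechanism averaging operates on 3-D rotations of double-couple sources; as a hedged, much-reduced illustration, the snippet below distance-weights only strike azimuths (which carry a 180° ambiguity) and computes an average deviation angle as a crude stand-in for the rotation-angle complexity indicator. The weighting kernel and radii are invented for the example.

```python
import math

def weighted_mean_strike(strikes_deg, distances_km, r0_km=200.0,
                         max_km=1000.0):
    """Weight w = 1 / (d + r0) within max_km; average doubled angles so
    that 0 and 180 degrees are treated as the same orientation."""
    sx = sy = 0.0
    for s, d in zip(strikes_deg, distances_km):
        if d > max_km:
            continue
        w = 1.0 / (d + r0_km)
        a = math.radians(2.0 * s)
        sx += w * math.cos(a)
        sy += w * math.sin(a)
    return (math.degrees(math.atan2(sy, sx)) / 2.0) % 180.0

def mean_rotation_angle(strikes_deg, mean_deg):
    """Average angular distance from the mean (complexity indicator)."""
    diffs = [abs((s - mean_deg + 90.0) % 180.0 - 90.0)
             for s in strikes_deg]
    return sum(diffs) / len(diffs)
```

A large mean deviation flags a tectonically complex region where the averaged mechanism is a poor predictor, which is the role the rotation angle plays in the forecast above.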

  15. Operation and Management of Thermostatically Controlled Loads for Providing Regulation Services to Power Grids

    NASA Astrophysics Data System (ADS)

    Vanouni, Maziar

    The notion of demand-side participation in power systems operation and control is on the verge of realization because of advancements in the required technologies and tools, such as communications, smart meters, sensor networks, large-scale data management techniques, large-scale optimization methods, etc. Demand-response (DR) programs can therefore be one of the promising solutions to accommodate part of the increasing demand for load balancing services brought about by the high penetration of intermittent renewable energies in power systems. This dissertation studies different aspects of DR programs that utilize thermostatically controlled loads (TCLs) to provide load balancing services. The importance of TCLs among other loads lies in the flexibility of their power consumption pattern, while customer/end-user comfort is not (or is only minimally) impacted. Chapter 2 discusses a previously presented direct load control (DLC) method to control the power consumption of aggregated TCLs. The DLC method performs power tracking control and is based on a centralized approach, where a central controller broadcasts control commands to the dispersed TCLs to toggle them on/off. The central controller receives measurement feedback from the TCLs once every couple of minutes to run a successful forecast process. Performance criteria to evaluate the load balancing service provided by the TCLs are presented, and the results are discussed under different scenarios and situations. The numerical results show the proper performance of the DLC method, which is used as the control method in all the studies in this dissertation. Chapter 3 presents performance improvements to the original method of Chapter 2, obtained by communicating two more pieces of information called forecast parameters (FPs). Communicating the FPs improves the forecast process in the DLC and hence both the performance accuracy and the amount of wear-and-tear imposed on the TCLs. 
Chapter 4 formulates a stochastic optimization model for a load aggregator (LA) to participate in performance-based regulation markets (PBRMs). PBRMs are the recently developed and practiced regulation market structure recommended by the Federal Energy Regulatory Commission (FERC) in 2011. In PBRMs, regulation resources are paid based on both regulation capacity bids and regulation performance, including the provided mileage and the performance accuracy. To model the income from the PBRM, the convention of the California Independent System Operator (CAISO) is used. In the presented optimization model, the amount of wear-and-tear imposed on the TCLs is constrained to prevent abrupt switching of TCLs. In Chapter 5, a two-stage reward allocation mechanism is developed for an LA recruiting TCLs for regulation service provision. The mechanism helps the LA distribute the total reward (earned from regulation service provision) among the TCLs according to their contribution to the whole provided service. In the first stage, TCLs are prioritized based on their service provision capability: an index called SPCI is presented to quantify TCL capability/flexibility, and a priority list is constructed from it. In the second stage, a reward curve is constructed that represents the possible total reward as a function of the number of top TCLs in the priority list. The reward allocated to individual TCLs is then calculated by applying the incremental method to the constructed reward curve. This reward allocation mechanism is based on the definition of the maximum service capacity (MSC) of a control group of TCLs; MSC is defined and its calculation method is presented before the two stages of the reward allocation mechanism are discussed. 
The numerical results prove the suitability of the proposed prioritization method, as the TCLs with higher rankings are observed to contribute more to the total reward than the TCLs with lower rankings in the priority list.
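The centralized on/off control of an aggregated TCL population can be sketched very simply: each unit follows a thermostat deadband, and a central controller toggles units that are still inside their comfort band to track a power target. All dynamics and parameters below are illustrative toys, not the dissertation's models.

```python
import random

class TCL:
    """One thermostatically controlled cooling load."""
    def __init__(self, temp, setpoint=20.0, deadband=1.0, power_kw=4.0):
        self.temp = temp          # indoor temperature (deg C)
        self.setpoint = setpoint
        self.deadband = deadband
        self.power_kw = power_kw
        self.on = False

    def step(self, ambient=30.0, a=0.9, gain=2.0):
        # First-order thermal dynamics: drift toward ambient when off,
        # pulled down when the compressor runs.
        self.temp = a * self.temp + (1 - a) * ambient - gain * self.on
        # Local thermostat overrides protect comfort at the band edges.
        if self.temp > self.setpoint + self.deadband:
            self.on = True
        elif self.temp < self.setpoint - self.deadband:
            self.on = False

def track(tcls, target_kw):
    """Central controller: toggle flexible units toward the target."""
    power = sum(t.power_kw for t in tcls if t.on)
    flexible = [t for t in tcls if abs(t.temp - t.setpoint) < t.deadband]
    for t in flexible:
        if power < target_kw and not t.on:
            t.on = True
            power += t.power_kw
        elif power > target_kw and t.on:
            t.on = False
            power -= t.power_kw
    return power
```

The thermostat override is what keeps comfort impact minimal; counting the number of toggles per unit would give the wear-and-tear measure the dissertation constrains.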

  16. Influence of various water quality sampling strategies on load estimates for small streams

    USGS Publications Warehouse

    Robertson, Dale M.; Roerish, Eric D.

    1999-01-01

    Extensive streamflow and water quality data from eight small streams were systematically subsampled to represent various water-quality sampling strategies. The subsampled data were then used to determine the accuracy and precision of annual load estimates generated by means of a regression approach (typically used for big rivers) and to determine the most effective sampling strategy for small streams. Estimation of annual loads by regression was imprecise regardless of the sampling strategy used; for the most effective strategy with a regression approach applied to daily average streamflow, median absolute errors were ~30% relative to the loads estimated with an integration method and all available data. The most effective sampling strategy depends on the length of the study. For 1-year studies, fixed-period monthly sampling supplemented by storm chasing was the most effective strategy. For studies of 2 or more years, fixed-period semimonthly sampling resulted in not only the least biased but also the most precise loads. Additional high-flow samples, typically collected to help define the relation between high streamflow and high loads, result in imprecise, overestimated annual loads if these samples are consistently collected early in high-flow events.
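The regression approach evaluated above is, in outline, a rating-curve estimator. A minimal sketch, assuming a log-log linear concentration-streamflow relation and Duan's smearing correction for retransformation bias (details the abstract does not spell out):

```python
import math

def fit_rating_curve(flows, concs):
    """Regress log concentration on log streamflow from sampled days."""
    xs = [math.log(q) for q in flows]
    ys = [math.log(c) for c in concs]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    a = ybar - b * xbar
    resid = [y - (a + b * x) for x, y in zip(xs, ys)]
    smear = sum(math.exp(r) for r in resid) / n  # Duan's smearing factor
    return a, b, smear

def annual_load(daily_flows, a, b, smear):
    """Load = sum over days of flow times estimated concentration."""
    return sum(q * smear * math.exp(a + b * math.log(q))
               for q in daily_flows)
```

The study's point is that the samples fed to `fit_rating_curve` matter: high-flow samples taken only on the rising limb bias the fitted curve high and hence overestimate the annual load.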

  17. Bi-Level Arbitrage Potential Evaluation for Grid-Scale Energy Storage Considering Wind Power and LMP Smoothing Effect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Hantao; Li, Fangxing; Fang, Xin

    Our paper deals with extended-term energy storage (ES) arbitrage problems to maximize the annual revenue in deregulated power systems with high-penetration wind power. The conventional ES arbitrage model takes the locational marginal prices (LMP) as an input and is unable to account for the impacts of ES operations on system LMPs. This paper proposes a bi-level ES arbitrage model, where the upper level maximizes the ES arbitrage revenue and the lower level simulates the market clearing process considering wind power and ES. The bi-level model is formulated as a mathematical program with equilibrium constraints (MPEC) and then recast into a mixed-integer linear programming (MILP) problem using strong duality theory. Wind power fluctuations are characterized by the GARCH forecast model and the forecast error is modeled by forecast-bin based Beta distributions. Case studies are performed on a modified PJM 5-bus system and an IEEE 118-bus system with a weekly time horizon over an annual term to show the validity of the proposed bi-level model. The results from the conventional model and the bi-level model are compared under different ES power and energy ratings, and also various load and wind penetration levels.
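For contrast with the bi-level model, the conventional price-taker arbitrage baseline mentioned above can be sketched as a small dynamic program over a discretized state of charge, taking the LMP series as fixed input. The ratings, efficiency, and discretization below are illustrative, not the paper's.

```python
def arbitrage_revenue(lmps, e_max=4, p_max=1, eff=0.9):
    """Max arbitrage revenue for a price-taking storage unit.
    Energy levels are integers 0..e_max; charge or discharge at most
    p_max units per hour; eff applies to discharged energy."""
    n_states = e_max + 1
    value = [0.0] * n_states          # value-to-go after the last hour
    for price in reversed(lmps):
        new = [0.0] * n_states
        for e in range(n_states):
            best = value[e]                      # stay idle
            for d in range(1, p_max + 1):        # discharge d units
                if e - d >= 0:
                    best = max(best, price * d * eff + value[e - d])
            for c in range(1, p_max + 1):        # charge c units
                if e + c <= e_max:
                    best = max(best, -price * c + value[e + c])
            new[e] = best
        value = new
    return value[0]                   # start with an empty store
```

Because the LMPs here are exogenous, large storage actions cannot smooth prices; capturing that feedback is exactly what the lower-level market-clearing problem in the bi-level model adds.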

  18. Hyperparameterization of soil moisture statistical models for North America with Ensemble Learning Models (Elm)

    NASA Astrophysics Data System (ADS)

    Steinberg, P. D.; Brener, G.; Duffy, D.; Nearing, G. S.; Pelissier, C.

    2017-12-01

    Hyperparameterization of statistical models, i.e. automated model scoring and selection via methods such as evolutionary algorithms, grid searches, and randomized searches, can improve forecast model skill by reducing errors associated with model parameterization, model structure, and statistical properties of training data. Ensemble Learning Models (Elm), and the related Earthio package, provide a flexible interface for automating the selection of parameters and model structure for machine learning models common in climate science and land cover classification, offering convenient tools for loading NetCDF, HDF, Grib, or GeoTiff files, decomposition methods like PCA and manifold learning, and parallel training and prediction with unsupervised and supervised classification, clustering, and regression estimators. Continuum Analytics is using Elm to experiment with statistical soil moisture forecasting based on meteorological forcing data from NASA's North American Land Data Assimilation System (NLDAS). In that work, Elm uses the NSGA-2 multiobjective optimization algorithm to optimize statistical preprocessing of the forcing data and improve goodness-of-fit for statistical models (i.e. feature engineering). This presentation will discuss Elm and its components, including dask (distributed task scheduling), xarray (data structures for n-dimensional arrays), and scikit-learn (statistical preprocessing, clustering, classification, regression), and it will show how NSGA-2 is being used to automate selection of soil moisture forecast statistical models for North America.
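A hedged sketch of the automated-selection idea: a randomized search over a single hyperparameter (a ridge penalty), scored by hold-out error. This stands in for Elm's richer evolutionary/NSGA-2 machinery and uses only numpy; the search range and scoring are assumptions for the example.

```python
import numpy as np

def ridge_fit(X, y, alpha):
    """Closed-form ridge regression weights."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(p), X.T @ y)

def random_search(X_tr, y_tr, X_va, y_va, n_iter=30, seed=0):
    """Draw penalties log-uniformly and keep the best validation MSE."""
    rng = np.random.default_rng(seed)
    best_alpha, best_err = None, np.inf
    for _ in range(n_iter):
        alpha = 10.0 ** rng.uniform(-4, 2)
        w = ridge_fit(X_tr, y_tr, alpha)
        err = float(np.mean((X_va @ w - y_va) ** 2))
        if err < best_err:
            best_alpha, best_err = alpha, err
    return best_alpha, best_err
```

Evolutionary methods like NSGA-2 replace the independent random draws with mutation and selection, and can optimize several objectives (fit quality, model complexity) at once.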

  19. Bi-Level Arbitrage Potential Evaluation for Grid-Scale Energy Storage Considering Wind Power and LMP Smoothing Effect

    DOE PAGES

    Cui, Hantao; Li, Fangxing; Fang, Xin; ...

    2017-10-04

    Our paper deals with extended-term energy storage (ES) arbitrage problems to maximize the annual revenue in deregulated power systems with high-penetration wind power. The conventional ES arbitrage model takes the locational marginal prices (LMP) as an input and is unable to account for the impacts of ES operations on system LMPs. This paper proposes a bi-level ES arbitrage model, where the upper level maximizes the ES arbitrage revenue and the lower level simulates the market clearing process considering wind power and ES. The bi-level model is formulated as a mathematical program with equilibrium constraints (MPEC) and then recast into a mixed-integer linear programming (MILP) problem using strong duality theory. Wind power fluctuations are characterized by the GARCH forecast model and the forecast error is modeled by forecast-bin based Beta distributions. Case studies are performed on a modified PJM 5-bus system and an IEEE 118-bus system with a weekly time horizon over an annual term to show the validity of the proposed bi-level model. The results from the conventional model and the bi-level model are compared under different ES power and energy ratings, and also various load and wind penetration levels.

  20. Prospective Tests of Southern California Earthquake Forecasts

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.; Kagan, Y. Y.; Helmstetter, A.; Wiemer, S.; Field, N.

    2004-12-01

    We are testing earthquake forecast models prospectively using likelihood ratios. Several investigators have developed such models as part of the Southern California Earthquake Center's project called Regional Earthquake Likelihood Models (RELM). Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. Here we describe the testing procedure and present preliminary results. Forecasts are expressed as the yearly rate of earthquakes within pre-specified bins of longitude, latitude, magnitude, and focal mechanism parameters. We test models against each other in pairs, which requires that both forecasts in a pair be defined over the same set of bins. For this reason we specify a standard "menu" of bins and ground rules to guide forecasters in using common descriptions. One menu category includes five-year forecasts of magnitude 5.0 and larger. Contributors will be requested to submit forecasts in the form of a vector of yearly earthquake rates on a 0.1 degree grid at the beginning of the test. Focal mechanism forecasts, when available, are also archived and used in the tests. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.1 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. Tests are based on the log likelihood scores derived from the probability that future earthquakes would occur where they do if a given forecast were true [Kagan and Jackson, J. Geophys. Res., 100, 3943-3959, 1995]. 
For each pair of forecasts, we compute alpha, the probability that the first would be wrongly rejected in favor of the second, and beta, the probability that the second would be wrongly rejected in favor of the first. Computing alpha and beta requires knowing the theoretical distribution of likelihood scores under each hypothesis, which we estimate by simulations. In this scheme, each forecast is given equal status; there is no "null hypothesis" which would be accepted by default. Forecasts and test results will be archived and posted on the RELM web site. Major problems under discussion include how to treat aftershocks, which clearly violate the variable-rate Poissonian hypotheses that we employ, and how to deal with the temporal variations in catalog completeness that follow large earthquakes.
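The likelihood scoring underlying these tests can be sketched directly: under an independent-Poisson assumption for each space-magnitude bin, a forecast of bin rates is scored against observed counts, and two forecasts are compared by their log-likelihood ratio. The simulation of score distributions for alpha and beta is omitted here.

```python
import math

def log_likelihood(rates, counts):
    """Sum over bins of the log Poisson probability of the observed
    count k given the forecast rate lambda."""
    ll = 0.0
    for lam, k in zip(rates, counts):
        ll += -lam + k * math.log(lam) - math.lgamma(k + 1)
    return ll

def likelihood_ratio(rates_a, rates_b, counts):
    """Positive values favour forecast A over forecast B."""
    return log_likelihood(rates_a, counts) - log_likelihood(rates_b, counts)
```

In the RELM scheme the observed ratio is compared against simulated score distributions under each forecast, so that neither model is privileged as a null hypothesis.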

  1. Dust storm events over Delhi: verification of dust AOD forecasts with satellite and surface observations

    NASA Astrophysics Data System (ADS)

    Singh, Aditi; Iyengar, Gopal R.; George, John P.

    2016-05-01

    The Thar desert, located in the northwestern part of India, is considered one of the major dust sources. Dust storms originating in the Thar desert during the pre-monsoon season affect large parts of the Indo-Gangetic (IG) plains. High dust loading deteriorates ambient air quality and degrades visibility. The present study focuses on the identification of dust events and the verification of forecasts of dust events over Delhi and the western part of the IG plains during the pre-monsoon season of 2015. Three dust events were identified over Delhi during the study period. For all the selected days, Terra-MODIS AOD at 550 nm is found to be close to 1.0, while the AURA-OMI aerosol index (AI) shows high values. Dust AOD forecasts from the NCMRWF Unified Model (NCUM) for the three selected dust events are verified against satellite (MODIS) and ground-based observations (AERONET). Comparison of observed AODs at 550 nm from MODIS with NCUM-predicted AODs reveals that NCUM is able to predict the spatial and temporal distribution of dust AOD in these cases. Good correlation (~0.67) is obtained between the NCUM-predicted dust AODs and location-specific observations available from AERONET. The model under-predicted the AODs compared with the AERONET observations, mainly because it accounts only for dust and does not consider anthropogenic activities. The results of the present study emphasize the need for a more realistic representation of local dust emissions, of both natural and anthropogenic origin, in the model to improve its forecasts during dust events.

  2. Forecasting of Information Security Related Incidents: Amount of Spam Messages as a Case Study

    NASA Astrophysics Data System (ADS)

    Romanov, Anton; Okamoto, Eiji

    With the increasing demand for services provided by communication networks, the quality and reliability of such services, as well as the confidentiality of data transfer, are becoming some of the highest concerns. At the same time, because of growing hacker activity, the quality of provided content and the reliability of its continuous delivery strongly depend on the integrity of data transmission and the availability of communication infrastructure, and thus on the information security of a given IT landscape. However, the amount of resources allocated to provide information security (such as security staff and technical countermeasures) must be reasonable from the economic point of view. This fact, in turn, leads to the need to employ a forecasting technique to support IT budget planning and short-term planning for potential bottlenecks. In this paper we present an approach to such forecasting for a wide class of information security related incidents (ISRI) — unambiguously detectable ISRI. The approach is based on autoregression models which are widely used in financial time series analysis but cannot be directly applied to ISRI time series due to specifics related to information security. We investigate and address these specifics by proposing rules (special conditions) for the collection and storage of ISRI time series, adherence to which improves forecasting in this subject field. We present an application of our approach to one type of unambiguously detectable ISRI — the amount of spam messages — which, if not mitigated properly, could create additional load on communication infrastructure and consume significant amounts of network capacity. Finally, we evaluate our approach by simulation and actual measurement.
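A minimal sketch of the autoregressive idea, assuming a plain AR(1) model fitted by least squares to an incident-count series such as daily spam volumes (the paper's models and its special collection rules are more involved):

```python
def fit_ar1(series):
    """Fit x[t+1] = c + phi * x[t] by least squares on lagged pairs."""
    xs = series[:-1]
    ys = series[1:]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    phi = num / den
    c = ybar - phi * xbar
    return c, phi

def forecast_ar1(series, steps=1):
    """Iterate the fitted recursion forward from the last observation."""
    c, phi = fit_ar1(series)
    x = series[-1]
    out = []
    for _ in range(steps):
        x = c + phi * x
        out.append(x)
    return out
```

Such multi-step forecasts of expected spam volume can feed directly into capacity planning for the communication infrastructure mentioned above.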

  3. Semiparametric modeling: Correcting low-dimensional model error in parametric models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Tyrus, E-mail: thb11@psu.edu; Harlim, John, E-mail: jharlim@psu.edu; Department of Meteorology, the Pennsylvania State University, 503 Walker Building, University Park, PA 16802-5013

    2016-03-01

    In this paper, a semiparametric modeling approach is introduced as a paradigm for addressing model error arising from unresolved physical phenomena. Our approach compensates for model error by learning an auxiliary dynamical model for the unknown parameters. Practically, the proposed approach consists of the following steps. Given a physics-based model and a noisy data set of historical observations, a Bayesian filtering algorithm is used to extract a time-series of the parameter values. Subsequently, the diffusion forecast algorithm is applied to the retrieved time-series in order to construct the auxiliary model for the time evolving parameters. The semiparametric forecasting algorithm consists of integrating the existing physics-based model with an ensemble of parameters sampled from the probability density function of the diffusion forecast. To specify initial conditions for the diffusion forecast, a Bayesian semiparametric filtering method that extends the Kalman-based filtering framework is introduced. In difficult test examples, which introduce chaotically and stochastically evolving hidden parameters into the Lorenz-96 model, we show that our approach can effectively compensate for model error, with forecasting skill comparable to that of the perfect model.
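The first step described above, extracting a time series of parameter values by Bayesian filtering, can be sketched with an extended Kalman filter on a toy linear-decay model whose state is augmented with the unknown parameter. Everything here (the model, noise levels, initial guesses) is an invented stand-in for the paper's Lorenz-96 setting.

```python
import numpy as np

def filter_parameter(obs, dt=0.1, q_theta=1e-4, r=0.04):
    """EKF on augmented state [x, theta] for the toy dynamics
    x[k+1] = (1 - theta*dt) * x[k], observed as y[k] = x[k] + noise;
    theta is assumed to follow a slow random walk."""
    z = np.array([obs[0], 0.5])          # crude initial guesses
    P = np.eye(2)
    Q = np.diag([1e-6, q_theta])
    estimates = []
    for y in obs[1:]:
        x, theta = z
        # Predict step with the Jacobian of the augmented dynamics.
        F = np.array([[1.0 - theta * dt, -x * dt],
                      [0.0, 1.0]])
        z = np.array([(1.0 - theta * dt) * x, theta])
        P = F @ P @ F.T + Q
        # Update step; only x is observed.
        s = P[0, 0] + r
        k = P[:, 0] / s
        z = z + k * (y - z[0])
        P = (np.eye(2) - np.outer(k, [1.0, 0.0])) @ P
        estimates.append(float(z[1]))
    return estimates
```

In the semiparametric approach, the resulting parameter time series would then be handed to the diffusion forecast algorithm to learn the auxiliary model, rather than assumed to be a random walk.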

  4. A multifactor approach to forecasting Romanian gross domestic product (GDP) in the short run.

    PubMed

    Armeanu, Daniel; Andrei, Jean Vasile; Lache, Leonard; Panait, Mirela

    2017-01-01

    The purpose of this paper is to investigate the application of a generalized dynamic factor model (GDFM) based on dynamic principal components analysis to forecasting short-term economic growth in Romania. We have used a generalized principal components approach to estimate a dynamic model based on a dataset comprising 86 economic and non-economic variables that are linked to economic output. The model exploits the dynamic correlations between these variables and uses three common components that account for roughly 72% of the information contained in the original space. We show that it is possible to generate reliable forecasts of quarterly real gross domestic product (GDP) using just the common components while also assessing the contribution of the individual variables to the dynamics of real GDP. In order to assess the relative performance of the GDFM to standard models based on principal components analysis, we have also estimated two Stock-Watson (SW) models that were used to perform the same out-of-sample forecasts as the GDFM. The results indicate significantly better performance of the GDFM compared with the competing SW models, which empirically confirms our expectations that the GDFM produces more accurate forecasts when dealing with large datasets.
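A hedged sketch of the factor-model idea in the spirit of the GDFM and Stock-Watson approaches above: extract a few principal components from a standardized panel of indicators, then regress GDP growth on the lagged factors to forecast the next observation. The lag structure and estimator here are simplifications chosen for brevity.

```python
import numpy as np

def extract_factors(panel, n_factors=3):
    """panel: (T, N) array of indicators; returns (T, n_factors) factors
    via the leading principal components of the demeaned panel."""
    X = panel - panel.mean(axis=0)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :n_factors] * S[:n_factors]

def factor_forecast(panel, gdp, n_factors=3):
    """Fit gdp[t] ~ factors[t-1] and forecast the next observation."""
    F = extract_factors(panel, n_factors)
    A = np.column_stack([np.ones(len(F) - 1), F[:-1]])
    coef, *_ = np.linalg.lstsq(A, gdp[1:], rcond=None)
    last = np.concatenate([[1.0], F[-1]])
    return float(last @ coef)
```

The GDFM generalizes this static extraction by using dynamic (frequency-domain) principal components, which is what drives its reported edge over the SW benchmarks.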

  5. A multifactor approach to forecasting Romanian gross domestic product (GDP) in the short run

    PubMed Central

    Armeanu, Daniel; Lache, Leonard; Panait, Mirela

    2017-01-01

    The purpose of this paper is to investigate the application of a generalized dynamic factor model (GDFM) based on dynamic principal components analysis to forecasting short-term economic growth in Romania. We have used a generalized principal components approach to estimate a dynamic model based on a dataset comprising 86 economic and non-economic variables that are linked to economic output. The model exploits the dynamic correlations between these variables and uses three common components that account for roughly 72% of the information contained in the original space. We show that it is possible to generate reliable forecasts of quarterly real gross domestic product (GDP) using just the common components while also assessing the contribution of the individual variables to the dynamics of real GDP. In order to assess the relative performance of the GDFM to standard models based on principal components analysis, we have also estimated two Stock-Watson (SW) models that were used to perform the same out-of-sample forecasts as the GDFM. The results indicate significantly better performance of the GDFM compared with the competing SW models, which empirically confirms our expectations that the GDFM produces more accurate forecasts when dealing with large datasets. PMID:28742100

  6. Vertical distribution of Saharan dust over Rome (Italy): Comparison between 3-year model predictions and lidar soundings

    NASA Astrophysics Data System (ADS)

    Kishcha, P.; Barnaba, F.; Gobbi, G. P.; Alpert, P.; Shtivelman, A.; Krichak, S. O.; Joseph, J. H.

    2005-03-01

    Mineral dust particles loaded into the atmosphere from the Sahara desert represent one major factor affecting the Earth's radiative budget. Regular model-based forecasts of 3-D dust fields can be used in order to determine the dust radiative effect in climate models, in spite of the large gaps in observations of dust vertical profiles. In this study, dust forecasts by the Tel Aviv University (TAU) dust prediction system were compared to lidar observations to better evaluate the model's capabilities. The TAU dust model was initially developed at the University of Athens and later modified at Tel Aviv University. Dust forecasts are initialized with the aid of the Total Ozone Mapping Spectrometer aerosol index (TOMS AI) measurements. The lidar soundings employed were collected at the outskirts of Rome, Italy (41.84°N, 12.64°E) during the high-dust activity season from March to June of the years 2001, 2002, and 2003. The lidar vertical profiles collected in the presence of dust were used for obtaining statistically significant reference parameters of dust layers over Rome and for model versus lidar comparison. The Barnaba and Gobbi (2001) approach was used in the current study to derive height-resolved dust volumes from lidar measurements of backscatter. Close inspection of the juxtaposed vertical profiles, obtained from lidar and model data near Rome, indicates that the majority (67%) of the cases under investigation can be classified as good or acceptable forecasts of the dust vertical distribution. A more quantitative comparison shows that the model predictions are mainly accurate in the middle part of dust layers. This is supported by high correlation (0.85) between lidar and model data for forecast dust volumes greater than the threshold of 1 × 10-12 cm3/cm3. In general, however, the model tends to underestimate the lidar-derived dust volume profiles. 
The effect of clouds on the TOMS AI detection is believed to be the main factor responsible for this underestimation. Moreover, some model assumptions regarding dust sources and particle size, as well as the accuracy of model-simulated meteorological parameters, are also likely to affect the dust forecast quality.

  7. THE EMPACT BEACHES: A CASE STUDY IN RECREATIONAL WATER SAMPLING

    EPA Science Inventory

    Various chapters describe sample and experimental design, use of a geometric mean or an arithmetic mean, modeling and forecasting, and risk assessment in relation to monitoring recreational waters for fecal indicators. All of these aspects of monitoring are dependent on the spat...

  8. Multifractality and value-at-risk forecasting of exchange rates

    NASA Astrophysics Data System (ADS)

    Batten, Jonathan A.; Kinateder, Harald; Wagner, Niklas

    2014-05-01

    This paper addresses market risk prediction for high frequency foreign exchange rates under nonlinear risk scaling behaviour. We use a modified version of the multifractal model of asset returns (MMAR) where trading time is represented by the series of volume ticks. Our dataset consists of 138,418 5-min round-the-clock observations of EUR/USD spot quotes and trading ticks during the period January 5, 2006 to December 31, 2007. Considering fat-tails, long-range dependence as well as scale inconsistency with the MMAR, we derive out-of-sample value-at-risk (VaR) forecasts and compare our approach to historical simulation as well as a benchmark GARCH(1,1) location-scale VaR model. Our findings underline that the multifractal properties in EUR/USD returns in fact have notable risk management implications. The MMAR approach is a parsimonious model which produces admissible VaR forecasts at the 12-h forecast horizon. For the daily horizon, the MMAR outperforms both alternatives based on conditional as well as unconditional coverage statistics.
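VaR forecasts of the kind compared above are typically backtested with coverage statistics; a minimal sketch of Kupiec's unconditional-coverage likelihood-ratio test, which checks whether the observed violation rate matches the nominal level:

```python
import math

def kupiec_lr(n_obs, n_violations, p=0.05):
    """Kupiec LR_uc statistic; approximately chi-squared(1) under
    correct unconditional coverage (violation probability p)."""
    x, n = n_violations, n_obs
    pi = x / n                         # observed violation rate
    def ll(q):
        out = (n - x) * math.log(1 - q)
        if x > 0:
            out += x * math.log(q)
        return out
    return -2.0 * (ll(p) - ll(pi))
```

A statistic above the chi-squared(1) critical value (3.84 at 5%) rejects the model's coverage, which is how "a coverage rate close to the nominal level" is judged formally.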

  9. Forecasting Tehran stock exchange volatility; Markov switching GARCH approach

    NASA Astrophysics Data System (ADS)

    Abounoori, Esmaiel; Elmi, Zahra (Mila); Nademi, Younes

    2016-03-01

    This paper evaluates several GARCH models regarding their ability to forecast volatility in the Tehran Stock Exchange (TSE). These include GARCH models with both Gaussian and fat-tailed residual conditional distributions, concerning their ability to describe and forecast volatility from a 1-day to a 22-day horizon. Results indicate that the AR(2)-MRSGARCH-GED model outperforms other models at the 1-day horizon. The AR(2)-MRSGARCH-GED and AR(2)-MRSGARCH-t models also outperform other models at the 5-day horizon. At the 10-day horizon, the three AR(2)-MRSGARCH models outperform the others. At the 22-day forecast horizon, results indicate no difference between MRSGARCH models and standard GARCH models. Regarding out-of-sample risk management evaluation (95% VaR), a few models seem to provide reasonable and accurate VaR estimates at the 1-day horizon, with a coverage rate close to the nominal level. According to the risk management loss functions, there is no uniformly most accurate model.
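The standard GARCH(1,1) benchmark against which the MRSGARCH models are judged has a simple variance recursion; the sketch below uses assumed (not estimated) parameters and shows how multi-step forecasts revert geometrically to the unconditional variance.

```python
def garch_variances(returns, omega=0.02, alpha=0.1, beta=0.85):
    """In-sample conditional variances, started at the unconditional
    variance omega / (1 - alpha - beta)."""
    var = omega / (1.0 - alpha - beta)
    out = [var]
    for r in returns[:-1]:
        var = omega + alpha * r * r + beta * var
        out.append(var)
    return out

def garch_forecast(last_var, last_return, horizon,
                   omega=0.02, alpha=0.1, beta=0.85):
    """h-step-ahead conditional variance forecasts: one exact step,
    then geometric decay toward the unconditional variance."""
    uncond = omega / (1.0 - alpha - beta)
    var = omega + alpha * last_return ** 2 + beta * last_var
    out = [var]
    for _ in range(horizon - 1):
        var = uncond + (alpha + beta) * (var - uncond)
        out.append(var)
    return out
```

This mean reversion is one reason model rankings flatten at the 22-day horizon: far enough out, most GARCH-family forecasts collapse toward the unconditional variance.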

  10. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part II: Evaluation of Sample Models

    NASA Technical Reports Server (NTRS)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast to accurately predict contrail formation over the contiguous United States (CONUS) is created by using meteorological data based on hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and from the Rapid Update Cycle (RUC) as well as GOES water vapor channel measurements, combined with surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.
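    The logistic models above map meteorological predictors to a contrail-occurrence probability. A minimal sketch of fitting such a model by gradient descent on synthetic data; the two predictors and every number here are invented for illustration and are not the SURFACE/OUTBREAK model inputs:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=3000):
    """Fit a logistic regression by gradient descent on the mean log-loss."""
    X1 = np.hstack([np.ones((len(X), 1)), X])   # prepend an intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))       # predicted probabilities
        w -= lr * (X1.T @ (p - y)) / len(y)     # average log-loss gradient
    return w

def predict_proba(X, w):
    X1 = np.hstack([np.ones((len(X), 1)), X])
    return 1.0 / (1.0 + np.exp(-X1 @ w))

# Synthetic "humidity"/"temperature"-style predictors with known coefficients
rng = np.random.default_rng(1)
X = rng.standard_normal((400, 2))
logits = 0.5 + 2.0 * X[:, 0] - 1.0 * X[:, 1]
y = (rng.random(400) < 1.0 / (1.0 + np.exp(-logits))).astype(float)
w = fit_logistic(X, y)
acc = np.mean((predict_proba(X, w) > 0.5) == (y == 1))
```

    The in-sample accuracy of such a fit is comparable in spirit to the 75-percent figure the record reports, though the real models were built on RUC/ARPS analysis fields.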

  11. Gear fatigue crack prognosis using embedded model, gear dynamic model and fracture mechanics

    NASA Astrophysics Data System (ADS)

    Li, C. James; Lee, Hyungdae

    2005-07-01

    This paper presents a model-based method for predicting the remaining useful life of a gear with a fatigue crack. The method consists of an embedded model to identify gear meshing stiffness from measured gear torsional vibration; an inverse method to estimate crack size from the estimated meshing stiffness; a gear dynamic model to simulate gear meshing dynamics and determine the dynamic load on the cracked tooth; and a fast crack propagation model to forecast the remaining useful life based on the estimated crack size and dynamic load. The fast crack propagation model was established to avoid repeated finite-element (FEM) calculations and to facilitate field deployment of the proposed method. Experimental studies were conducted to validate and demonstrate the feasibility of the proposed method for prognosis of a cracked gear.
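    Crack propagation models of this kind are commonly built on Paris' law, da/dN = C(ΔK)^m. A minimal sketch of estimating remaining life by integrating that law from the current crack size to a critical size; the material constants and stress values below are placeholders, and the paper itself used a fitted fast model rather than this direct integration:

```python
import math

def remaining_cycles(a0, ac, dsigma, C=1e-12, m=3.0, Y=1.0, n_steps=10000):
    """Cycles to grow a crack from size a0 to critical size ac under
    Paris' law da/dN = C * dK^m with dK = Y * dsigma * sqrt(pi * a)."""
    def inv_rate(a):                      # dN/da = 1 / (C * dK^m)
        dK = Y * dsigma * math.sqrt(math.pi * a)
        return 1.0 / (C * dK**m)
    da = (ac - a0) / n_steps
    cycles = 0.0
    for i in range(n_steps):              # trapezoidal integration of dN/da
        a = a0 + i * da
        cycles += 0.5 * (inv_rate(a) + inv_rate(a + da)) * da
    return cycles

# Demo with arbitrary but dimensionally consistent placeholder values
N = remaining_cycles(a0=1e-3, ac=5e-3, dsigma=100.0)
```

    For m != 2 the integral has a closed form, which makes the numerical result easy to cross-check.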

  12. Size-Segregated Aerosol Composition and Mass Loading of Atmospheric Particles as Part of the Pacific Northwest 2001(PNW2001) Air Quality Study In Puget Sound

    NASA Astrophysics Data System (ADS)

    Disselkamp, R. S.; Barrie, L. A.; Shutthanadan, S.; Cliff, S.; Cahill, T.

    2001-12-01

    In mid-August 2001, an aircraft-based air-quality study was performed in the Puget Sound, WA, area entitled PNW2001 (http://www.pnl.gov/pnw2001). The objectives of this field campaign were the following: 1. reveal information about the 3-dimensional distribution of ozone, its gaseous precursors and fine particulate matter during weather conditions favoring air pollution; 2. derive information about the accuracy of urban and biogenic emissions inventories that are used to drive the air quality forecast models; and 3. examine the accuracy of modeled ozone concentration with that observed. In support of these efforts, we collected time-averaged (~10-minute averages), size-segregated, aerosol composition and mass-loading information using ex post facto analysis techniques of synchrotron x-ray fluorescence (s-XRF), proton induced x-ray emissions (PIXE), proton elastic scattering (PESA), and scanning transmission ion microscopy (STIM). This is the first time these analysis techniques have been used together on samples collected from aircraft using an optimized 3-stage rotating drum impactor. In our presentation, we will discuss the aerosol components in three aerosol size fractions as identified by statistical analysis of multielemental data (including total mass, H, Na, Mg, Al, Si, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Pb) and relate variations in these components to physical aerosol properties, other gaseous trace constituents and to air mass origin.

  13. C/NOFS Observations of Electromagnetic Coupling Between Magnetically Conjugate MSTID Structures

    NASA Technical Reports Server (NTRS)

    Burke, W. J.; Martinis, C. R.; Lai, P. C.; Gentile, L. C.; Sullivan, C.; Pfaff, Robert F.

    2016-01-01

    This report demonstrates empirically that couplings between magnetically conjugate medium-scale traveling ionospheric disturbances (MSTIDs) are electromagnetic in nature. This is accomplished by comparing plasma density, electric, and magnetic perturbations sampled simultaneously by sensors on the Communication/Navigation Outage Forecasting System (C/NOFS) satellite. During the period of interest on 17 February 2010, C/NOFS made three consecutive orbits while magnetically conjugate to the field of view of an all-sky imager located at El Leoncito, Argentina (31.8°S, 69.3°W). Imaged 630.0 nm airglow was characterized by alternating bands of relatively bright and dark emissions that were aligned from northeast to southwest and propagated toward the northwest, characteristic of MSTIDs in the southern hemisphere. Measurable Poynting fluxes flow along the Earth's magnetic field (S||) from "generator" to "load" hemispheres. While S|| was predominantly away from the ionosphere above El Leoncito, interhemispheric energy flows were not one-way streets. Measured Poynting flux intensities diminished with time over the three C/NOFS passes, suggesting that source mechanisms of MSTIDs were absent or that initial impedance mismatches between the two hemispheres approached an equilibrium status.

  14. C/NOFS observations of electromagnetic coupling between magnetically conjugate MSTID structures

    NASA Astrophysics Data System (ADS)

    Burke, W. J.; Martinis, C. R.; Lai, P. C.; Gentile, L. C.; Sullivan, C.; Pfaff, R. F.

    2016-03-01

    This report demonstrates empirically that couplings between magnetically conjugate medium-scale traveling ionospheric disturbances (MSTIDs) are electromagnetic in nature. This is accomplished by comparing plasma density, electric, and magnetic perturbations sampled simultaneously by sensors on the Communication/Navigation Outage Forecasting System (C/NOFS) satellite. During the period of interest on 17 February 2010, C/NOFS made three consecutive orbits while magnetically conjugate to the field of view of an all-sky imager located at El Leoncito, Argentina (31.8°S, 69.3°W). Imaged 630.0 nm airglow was characterized by alternating bands of relatively bright and dark emissions that were aligned from northeast to southwest and propagated toward the northwest, characteristic of MSTIDs in the southern hemisphere. Measurable Poynting fluxes flow along the Earth's magnetic field (S||) from "generator" to "load" hemispheres. While S|| was predominantly away from the ionosphere above El Leoncito, interhemispheric energy flows were not one-way streets. Measured Poynting flux intensities diminished with time over the three C/NOFS passes, suggesting that source mechanisms of MSTIDs were absent or that initial impedance mismatches between the two hemispheres approached an equilibrium status.

  15. NASA Tech Briefs, December 2006

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Topic include: Inferring Gear Damage from Oil-Debris and Vibration Data; Forecasting of Storm-Surge Floods Using ADCIRC and Optimized DEMs; User Interactive Software for Analysis of Human Physiological Data; Representation of Serendipitous Scientific Data; Automatic Locking of Laser Frequency to an Absorption Peak; Self-Passivating Lithium/Solid Electrolyte/Iodine Cells; Four-Quadrant Analog Multipliers Using G4-FETs; Noise Source for Calibrating a Microwave Polarimeter; Hybrid Deployable Foam Antennas and Reflectors; Coating MCPs with AlN and GaN; Domed, 40-cm-Diameter Ion Optics for an Ion Thruster; Gesture-Controlled Interfaces for Self-Service Machines; Dynamically Alterable Arrays of Polymorphic Data Types; Identifying Trends in Deep Space Network Monitor Data; Predicting Lifetime of a Thermomechanically Loaded Component; Partial Automation of Requirements Tracing; Automated Synthesis of Architecture of Avionic Systems; SSRL Emergency Response Shore Tool; Wholly Aromatic Ether-Imides as n-Type Semiconductors; Carbon-Nanotube-Carpet Heat-Transfer Pads; Pulse-Flow Microencapsulation System; Automated Low-Gravitation Facility Would Make Optical Fibers; Alignment Cube with One Diffractive Face; Graphite Composite Booms with Integral Hinges; Tool for Sampling Permafrost on a Remote Planet; and Special Semaphore Scheme for UHF Spacecraft Communications.

  16. Military Hydrology. Report 8. Feasibility of Utilizing Satellite and Radar Data in Hydrologic Forecasting.

    DTIC Science & Technology

    1985-09-01

    Extratropical Storm," Draft Report. Atlas, David. 1964. "Advances in Radar Meteorology," Advances in Geophysics, Vol 10, Academic Press, N.Y., pp 318-478. Barnes...forecasting purposes, data on storm morphology, direction of movement, and rate of movement are required in addition to the data cited above. 7...or storm duration. He also showed that, for a given sampling error, the gage density needed for warm season storms was two to three times greater than

  17. Leverage effect, economic policy uncertainty and realized volatility with regime switching

    NASA Astrophysics Data System (ADS)

    Duan, Yinying; Chen, Wang; Zeng, Qing; Liu, Zhicao

    2018-03-01

    In this study, we first investigate the impacts of the leverage effect and economic policy uncertainty (EPU) on future volatility in a regime-switching framework. Out-of-sample results show that the HAR-RV model including the leverage effect and economic policy uncertainty with regimes achieves higher forecast accuracy than RV-type and GARCH-class models. Our robustness results further imply that these factors, in the regime-switching framework, can substantially improve the HAR-RV model's forecast performance.
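    For context, the plain HAR-RV baseline that the regime-switching model extends regresses next-day realized volatility on daily, weekly and monthly RV averages. A generic sketch on synthetic data; the leverage, EPU and regime terms of the paper are omitted here:

```python
import numpy as np

def har_design(rv):
    """Build HAR-RV regressors: daily lag, 5-day mean, 22-day mean."""
    rows, target = [], []
    for t in range(22, len(rv)):
        rows.append([1.0, rv[t - 1], rv[t - 5:t].mean(), rv[t - 22:t].mean()])
        target.append(rv[t])
    return np.array(rows), np.array(target)

rng = np.random.default_rng(2)
rv = np.abs(rng.standard_normal(300)) * 0.01     # synthetic realized volatilities
X, y = har_design(rv)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # OLS fit of the HAR-RV model
x_next = np.array([1.0, rv[-1], rv[-5:].mean(), rv[-22:].mean()])
forecast = x_next @ beta                         # one-step-ahead RV forecast
```

    The cascade of averaging windows is what lets this simple OLS model mimic long-memory behavior in volatility.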

  18. Hyperspectral Remote Sensing of the Coastal Ocean: Adaptive Sampling and Forecasting of In situ Optical Properties

    DTIC Science & Technology

    2003-09-30

    We are developing an integrated rapid environmental assessment capability that will be used to feed an ocean nowcast/forecast system. The goal is to develop a capacity for predicting the dynamics in inherent optical properties in coastal waters. This is being accomplished by developing an integrated observation system that is being coupled to a data assimilative hydrodynamic bio-optical ecosystem model. The system was used adaptively to calibrate hyperspectral remote sensing sensors in optically complex nearshore coastal waters.

  19. The useful potential of using existing data to uniquely identify predictable wind events and regimes, part 1

    NASA Technical Reports Server (NTRS)

    Trettel, D. W.; Aquino, J. T.; Piazza, T. R.; Taylor, L. E.; Trask, D. C.

    1982-01-01

    Correlations between standard meteorological data and wind power generation potential were developed. Combined with appropriate wind forecasts, these correlations can be useful to load dispatchers to supplement conventional energy sources. Hourly wind data were analyzed for four sites, each exhibiting a unique physiography. These sites are Amarillo, Texas; Ludington, Michigan; Montauk Point, New York; and San Gorgonio, California. Synoptic weather maps and tables are presented to illustrate various wind 'regimes' at these sites.

  20. Performance Assessment of the Spare Parts for the Activation of Relocated Systems (SPARES) Forecasting Model

    DTIC Science & Technology

    1991-09-01

    constant data into the gaining base’s computer records. Among the data elements to be loaded, the 1XT434 image contains the level detail effective date...the mission support effective date, and the PBR override (19:19-203). In conjunction with the 1XT434, the Mission Change Parameter Image (Constant...the gaining base (19:19-208). The level detail effective date establishes the date the MCDDFR and MCDDR "are considered by the requirements computation

  1. An Operational System for Surveillance and Ecological Forecasting of West Nile Virus Outbreaks

    NASA Astrophysics Data System (ADS)

    Wimberly, M. C.; Davis, J. K.; Vincent, G.; Hess, A.; Hildreth, M. B.

    2017-12-01

    Mosquito-borne disease surveillance has traditionally focused on tracking human cases along with the abundance and infection status of mosquito vectors. For many of these diseases, vector and host population dynamics are also sensitive to climatic factors, including temperature fluctuations and the availability of surface water for mosquito breeding. Thus, there is a potential to strengthen surveillance and predict future outbreaks by monitoring environmental risk factors using broad-scale sensor networks that include earth-observing satellites. The South Dakota Mosquito Information System (SDMIS) project combines entomological surveillance with gridded meteorological data from NASA's North American Land Data Assimilation System (NLDAS) to generate weekly risk maps for West Nile virus (WNV) in the north-central United States. Critical components include a mosquito infection model that smooths the noisy infection rate and compensates for unbalanced sampling, and a human infection model that combines the entomological risk estimates with lagged effects of meteorological variables from NLDAS. Two types of forecasts are generated: long-term forecasts of statewide risk extending through the entire WNV season, and short-term forecasts of the geographic pattern of WNV risk in the upcoming week. Model forecasts are connected to public health actions through decision support matrices that link predicted risk levels to a set of phased responses. In 2016, the SDMIS successfully forecast an early start to the WNV season and a large outbreak of WNV cases following several years of low transmission. An evaluation of the 2017 forecasts will also be presented. Our experiences with the SDMIS highlight several important lessons that can inform future efforts at disease early warning.
These include the value of integrating climatic models with recent observations of infection, the critical role of automated workflows to facilitate the timely integration of multiple data streams, the need for effective synthesis and visualization of forecasts, and the importance of linking forecasts to specific public health responses.

  2. Pla a_1 aeroallergen immunodetection related to the airborne Platanus pollen content.

    PubMed

    Fernández-González, M; Guedes, A; Abreu, I; Rodríguez-Rajo, F J

    2013-10-01

    Platanus hispanica pollen is considered an important source of aeroallergens in many Southern European cities. This tree is frequently used in urban green spaces as an ornamental species. The flowering period is greatly influenced by meteorological conditions, which directly affect its allergenic load in the atmosphere. The purpose of this study is to develop equations to predict Platanus allergy risk periods as a function of airborne pollen, allergen concentration and the main meteorological parameters. The study was conducted by means of two volumetric samplers: a Lanzoni VPPS 2000 for Platanus pollen sampling and a Burkard multi-vial Cyclone sampler to collect the aeroallergen particles (Pla a_1). In addition, Dot-Blot and Raman spectroscopy methods were used to corroborate the results. The Pla a_1 protein is recorded in the atmosphere after the presence of Platanus pollen, which extends the Platanus pollen allergy risk period. Platanus pollen and Pla a_1 allergen concentrations are associated with statistically significant variations of some meteorological variables: positively with mean and maximum temperature, and negatively with relative humidity. The linear regression equation developed to forecast the Platanus pollen content in the air explains 64.5% of the variance of pollen presence in the environment, whereas the equation developed to forecast the aeroallergen explains 54.1% of the variance of Pla a_1 presence. The combination of pollen counts and allergen quantification should be assessed in epidemiologic studies of allergic respiratory diseases to predict allergy risk periods.

  3. The Invasive Species Forecasting System

    NASA Technical Reports Server (NTRS)

    Schnase, John; Most, Neal; Gill, Roger; Ma, Peter

    2011-01-01

    The Invasive Species Forecasting System (ISFS) provides computational support for the generic work processes found in many regional-scale ecosystem modeling applications. Decision support tools built using ISFS allow a user to load point occurrence field sample data for a plant species of interest and quickly generate habitat suitability maps for geographic regions of management concern, such as a national park, monument, forest, or refuge. This type of decision product helps resource managers plan invasive species protection, monitoring, and control strategies for the lands they manage. Until now, scientists and resource managers have lacked the data-assembly and computing capabilities to produce these maps quickly and cost efficiently. ISFS focuses on regional-scale habitat suitability modeling for invasive terrestrial plants. ISFS's component architecture emphasizes simplicity and adaptability. Its core services can be easily adapted to produce model-based decision support tools tailored to particular parks, monuments, forests, refuges, and related management units. ISFS can be used to build standalone run-time tools that require no connection to the Internet, as well as fully Internet-based decision support applications. ISFS provides the core data structures, operating system interfaces, network interfaces, and inter-component constraints comprising the canonical workflow for habitat suitability modeling. The predictors, analysis methods, and geographic extents involved in any particular model run are elements of the user space and arbitrarily configurable by the user. ISFS provides small, lightweight, readily hardened core components of general utility. These components can be adapted to unanticipated uses, are tailorable, and require at most a loosely coupled, nonproprietary connection to the Web. Users can invoke capabilities from a command line; programmers can integrate ISFS's core components into more complex systems and services. 
Taken together, these features enable a degree of decentralization and distributed ownership that have helped other types of scientific information services succeed in recent years.

  4. Does an inter-flaw length control the accuracy of rupture forecasting in geological materials?

    NASA Astrophysics Data System (ADS)

    Vasseur, Jérémie; Wadsworth, Fabian B.; Heap, Michael J.; Main, Ian G.; Lavallée, Yan; Dingwell, Donald B.

    2017-10-01

    Multi-scale failure of porous materials is an important phenomenon in nature and in material physics - from controlled laboratory tests to rockbursts, landslides, volcanic eruptions and earthquakes. A key unsolved research question is how to accurately forecast the time of system-sized catastrophic failure, based on observations of precursory events such as acoustic emissions (AE) in laboratory samples, or, on a larger scale, small earthquakes. Until now, the length scale associated with precursory events has not been well quantified, resulting in forecasting tools that are often unreliable. Here we test the hypothesis that the accuracy of the forecast failure time depends on the inter-flaw distance in the starting material. We use new experimental datasets for the deformation of porous materials to infer the critical crack length at failure from a static damage mechanics model. The style of acceleration of AE rate prior to failure, and the accuracy of forecast failure time, both depend on whether the cracks can span the inter-flaw length or not. A smooth inverse power-law acceleration of AE rate to failure, and an accurate forecast, occurs when the cracks are sufficiently long to bridge pore spaces. When this is not the case, the predicted failure time is much less accurate and failure is preceded by an exponential AE rate trend. Finally, we provide a quantitative and pragmatic correction for the systematic error in the forecast failure time, valid for structurally isotropic porous materials, which could be tested against larger-scale natural failure events, with suitable scaling for the relevant inter-flaw distances.
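    The "smooth inverse power-law acceleration of AE rate" underpins the classic failure forecast method: when the event rate behaves as k/(t_f - t), its inverse decays linearly to zero at the failure time t_f, so a straight-line fit of inverse rate against time yields the forecast. A minimal sketch on synthetic rates, with the power-law exponent fixed at 1; the paper's correction for inter-flaw distance is not reproduced here:

```python
import numpy as np

def forecast_failure_time(times, rates):
    """Failure forecast method: fit a line to the inverse event rate and
    extrapolate to zero (assumes rate ~ k / (t_f - t))."""
    inv = 1.0 / np.asarray(rates)
    slope, intercept = np.polyfit(times, inv, 1)
    return -intercept / slope          # time where the fitted inverse rate hits zero

# Synthetic AE rates accelerating toward a "true" failure time t_f = 100
t = np.linspace(0, 90, 50)
rates = 5.0 / (100.0 - t)
t_f = forecast_failure_time(t, rates)  # recovers 100 on this noiseless data
```

    The paper's point is precisely that real AE data deviate from this idealized linear inverse-rate trend (exponentially, when cracks cannot bridge the inter-flaw length), which is what degrades the forecast.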

  5. PLS Road surface temperature forecast for susceptibility of ice occurrence

    NASA Astrophysics Data System (ADS)

    Marchetti, Mario; Khalifa, Abderrhamen; Bues, Michel

    2014-05-01

    Winter maintenance relies on many operational tools for monitoring atmospheric and pavement physical parameters. Among them, road weather information systems (RWIS) and thermal mapping are most used by services in charge of managing infrastructure networks. Data from RWIS and thermal mapping serve as inputs for physical numerical forecasting models, commonly in place since the 1980s. These numerical models need an accurate description of the infrastructure, such as pavement layers and sub-layers, along with many meteorological parameters, such as air temperature and global and infrared radiation. The description is sometimes only partially known, and meteorological data are only monitored at specific spots. On the other hand, thermal mapping is now an easy, reliable and cost-effective way to monitor road surface temperature (RST) and many meteorological parameters all along the routes of infrastructure networks, including with a whole fleet of vehicles in the specific cases of roads or airports. The technique uses infrared thermometry to measure RST, and atmospheric probes for air temperature, relative humidity, wind speed and global radiation, both at a high-resolution interval, to identify sections of the road network prone to ice occurrence. However, measurements are time-consuming, and data from thermal mapping are only one input among others to establish the forecast. The idea was to build a reliable forecast on thermal mapping data alone. Previous work established the interest of using principal component analysis (PCA) on the basis of a reduced number of thermal fingerprints. The work presented here focuses on the use of partial least-squares regression (PLS) to build an RST forecast from air temperature measurements. Roads with various environments, weather conditions (mainly clear and cloudy) and seasons were monitored over several months to generate an appropriate number of samples. 
The study was conducted to determine the minimum number of samples needed for a reliable forecast, considering that inputs for numerical models do not exceed five thermal fingerprints. Results show that the PLS model achieved an R² of 0.9562, an RMSEP of 1.34 and a bias of -0.66. The same model applied to forecast a past event indicates an average difference between measurements and forecasts of 0.20 °C. The advantage of such an approach is its potential application not only to winter events but also to extreme summer ones, such as urban heat islands.
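    The PLS step can be sketched with the NIPALS algorithm for a single response (here, an RST-like target as a linear function of a few fingerprint-like predictors). This is a generic PLS1 implementation on synthetic data, not the thermal-mapping pipeline of the study:

```python
import numpy as np

def pls1_fit(X, y, n_components=2):
    """PLS1 regression via NIPALS: extract latent components, then recover
    regression coefficients B = W (P^T W)^{-1} q."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xd, yd = X - x_mean, y - y_mean            # deflated copies
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xd.T @ yd                          # weight vector
        w /= np.linalg.norm(w)
        t = Xd @ w                             # score vector
        tt = t @ t
        p = Xd.T @ t / tt                      # X loading
        q = (yd @ t) / tt                      # y loading
        Xd -= np.outer(t, p)                   # deflate X and y
        yd -= q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)
    return B, x_mean, y_mean

def pls1_predict(X, B, x_mean, y_mean):
    return (X - x_mean) @ B + y_mean

# Demo: a noiseless linear target is recovered with full components
rng = np.random.default_rng(3)
X = rng.standard_normal((100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 3.0
B, xm, ym = pls1_fit(X, y, n_components=3)
pred = pls1_predict(X, B, xm, ym)
```

    In the study's setting, fewer components than predictors would be kept, which is exactly where PLS differs from ordinary least squares.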

  6. Using Haines Index coupled with fire weather model predicted from high-resolution LAM forecasts to assess wildfire extreme behaviour in Southern Europe.

    NASA Astrophysics Data System (ADS)

    Gaetani, Francesco; Baptiste Filippi, Jean; Simeoni, Albert; D'Andrea, Mirko

    2010-05-01

    The Haines Index (HI) was developed by the USDA Forest Service to measure the atmosphere's contribution to the growth potential of a wildfire. It combines two atmospheric factors known to affect wildfires: stability and dryness. As an operational tool, the HI has proved its ability to predict plume-dominated, high-intensity wildfires. However, since the HI does not take into account fuel continuity, composition and moisture conditions, or the effects of wind and topography on fire behaviour, its use as a forecasting tool should be carefully considered. In this work we propose the use of the HI, predicted from high-resolution Limited Area Model forecasts, coupled with a fire weather model (the RISICO system), fully operational in Italy since 2003. RISICO is based on dynamic models able to represent, in space and time, the effects that the environment and plant physiology have on fuels and, in turn, on the potential behaviour of wildfires. The system automatically acquires from remote databases a thorough data set of input information of both in situ and spatial nature. Meteorological observations, radar data, Limited Area Model weather forecasts, EO data, and fuel data are managed by a unified interface able to process a wide set of different data. Specific semi-physical models are used in the system to simulate the dynamics of the fuels (load and moisture contents of dead and live fuel) and the potential fire behaviour (rate of spread and linear intensity). A preliminary validation of this approach will be provided with reference to Sardinia and Corsica, two major islands of the Mediterranean Sea frequently affected by extreme plume-dominated wildfires. A time series of about 3000 wildfires burnt in Sardinia and Corsica in 2007 and 2008 will be used to evaluate the capability of the HI, coupled with the outputs of the fire weather model, to forecast the actual risk in time and space.

  7. An Early-Warning System for Volcanic Ash Dispersal: The MAFALDA Procedure

    NASA Astrophysics Data System (ADS)

    Barsotti, S.; Nannipieri, L.; Neri, A.

    2006-12-01

    Forecasting the dispersal of volcanic ash is a fundamental goal for mitigating its potential impact on urbanized areas and transport routes surrounding explosive volcanoes. To this aim we developed an early-warning procedure named MAFALDA (Modeling And Forecasting Ash Loading and Dispersal in the Atmosphere). The tool is able to quantitatively forecast the atmospheric concentration of ash as well as the ground deposition as a function of time over a 3D spatial domain. The main features of MAFALDA are: (1) the use of the hybrid Lagrangian-Eulerian code VOL-CALPUFF, able to describe both the rising column phase and the atmospheric dispersal as a function of weather conditions; (2) the use of high-resolution weather forecasting data; (3) the short execution time, which allows a set of scenarios to be analysed; and (4) a web-based CGI application (written in Perl) that shows the results in a standard graphical web interface and makes the procedure suitable as an early-warning system during volcanic crises. MAFALDA is composed of a computational part that simulates the ash cloud dynamics and a graphical interface for visualizing the modelling results. The computational part includes the codes for processing the meteorological data, the dispersal code and the post-processing programs. These produce hourly 2D maps of aerial ash concentration at several vertical levels, the extent of the "threat" area in the air, and 2D maps of ash deposit on the ground, in addition to graphs of hourly variations of column height. The processed results are made available on the web by the graphical interface, and users can choose, from a drop-down menu, which data to visualize. A first partial application of the procedure has been carried out for Mt. Etna (Italy). 
In this case, the procedure simulates four volcanological scenarios characterized by different plume intensities and uses 48-hour weather forecasting data with a resolution of 7 km provided by the Italian Air Force.

  8. Spatio-temporal variability of aerosols in the tropics relationship with atmospheric and oceanic environments

    NASA Astrophysics Data System (ADS)

    Zuluaga-Arias, Manuel D.

    2011-12-01

    Earth's radiation budget is directly influenced by aerosols through the absorption of solar radiation and subsequent heating of the atmosphere. Aerosols modulate the hydrological cycle indirectly by modifying cloud properties, precipitation and ocean heat storage. In addition, polluting aerosols impose health risks at local, regional and global scales. In spite of recent advances in the study of aerosol variability, uncertainty in their spatio-temporal distributions still presents a challenge to the understanding of climate variability. For example, aerosol loading varies not only from year to year but also on higher-frequency intraseasonal time scales, producing strong variability on local and regional scales. An assessment of the impact of aerosol variability requires long-period measurements of aerosols at both regional and global scales. The present dissertation compiles a large database of remotely sensed aerosol loading in order to analyze its spatio-temporal variability, and how this load interacts with different variables that characterize the dynamic and thermodynamic states of the environment. Aerosol Index (AI) and Aerosol Optical Depth (AOD) were used as measures of the atmospheric aerosol load. In addition, atmospheric and oceanic satellite observations and reanalysis datasets are used in the analysis to investigate aerosol-environment interactions. A diagnostic study is conducted to produce global and regional aerosol satellite climatologies, and to analyze and compare the validity of aerosol retrievals. We find similarities and differences between the aerosol distributions over various regions of the globe when comparing the different satellite retrievals. A nonparametric approach is also used to examine the spatial distribution of the recent trends in aerosol concentration. A significant positive trend was found over the Middle East, Arabian Sea and South Asia, regions strongly influenced by increases in dust events. 
Spectral and composite analyses of surface temperature, atmospheric wind, geopotential height, outgoing longwave radiation, water vapor and precipitation, together with the climatology of aerosols, provide insight into how the variables interact. Different modes of variability, especially on intraseasonal time scales, appear as strong modulators of the aerosol distribution. In particular, we investigate how two modes of variability related to the westward-propagating synoptic African Easterly Waves of the tropical Atlantic Ocean affect the horizontal and vertical structure of the environment. The statistical significance of these two modes is tested with the use of two different spectral techniques. The pattern of propagation of aerosol load shows good correspondence with the progression of the atmospheric and oceanic conditions suitable for dust mobilization over the Atlantic Ocean. We present extensions to previous studies of dust variability over the Atlantic region by evaluating the performance of the long-period satellite aerosol retrievals in determining modes of aerosol variability. Results on the covariability between aerosols and the environment motivate the use of statistical regression models to test the significance of the forecasting skill of daily AOD time series. The regression models are calibrated using atmospheric variables from the reanalyses as predictors. The results show poor forecasting skill, with significant error growth after the 3rd day of the prediction. It is hypothesized that the simplicity of linear models results in an inability to provide a useful forecast.

  9. Asymptomatic Cerebrospinal Fluid HIV-1 Viral Blips and Viral Escape During Antiretroviral Therapy: A Longitudinal Study.

    PubMed

    Edén, Arvid; Nilsson, Staffan; Hagberg, Lars; Fuchs, Dietmar; Zetterberg, Henrik; Svennerholm, Bo; Gisslén, Magnus

    2016-12-15

    We examined longitudinal cerebrospinal fluid (CSF) samples (median, 5 samples/patient; interquartile range [IQR], 3-8 samples/patient) in 75 neurologically asymptomatic human immunodeficiency virus (HIV)-infected patients receiving antiretroviral therapy. Twenty-seven patients (36%) had ≥1 CSF HIV RNA load of >20 copies/mL (23% had ≥1 load of >50 copies/mL), with a median HIV RNA load of 50 copies/mL (IQR, 32-77 copies/mL). In plasma, 42 subjects (52%) and 22 subjects (29%) had an HIV RNA load of >20 and >50 copies/mL, respectively. Two subjects had an increasing virus load in consecutive CSF samples, representing possible CSF escape. Of 418 samples, 9% had a CSF HIV RNA load of >20 copies/mL (5% had a load of >50 copies/mL) and 19% had a plasma HIV RNA load of >20 copies/mL (8% had a load of >50 copies/mL). A CSF virus load of >20 copies/mL was associated with higher CSF levels of neopterin. In conclusion, CSF escape was rare, and increased CSF HIV RNA loads usually represented CSF virus load blips. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  10. Hybrid ARIMAX quantile regression method for forecasting short term electricity consumption in east java

    NASA Astrophysics Data System (ADS)

    Prastuti, M.; Suhartono; Salehah, NA

    2018-04-01

    The need for energy supply, especially electricity, in Indonesia has been increasing in recent years. Furthermore, high electricity usage by consumers at different times of day leads to heteroscedasticity. Estimating the electricity supply that can fulfill the community's needs is very important, but heteroscedasticity often makes electricity forecasting difficult. An accurate forecast of electricity consumption is one of the key challenges for an energy provider in planning resources and services, and in taking control actions to balance electricity supply and demand. In this paper, a hybrid ARIMAX Quantile Regression (ARIMAX-QR) approach is proposed to predict short-term electricity consumption in East Java. The method is also compared to time series regression using RMSE, MAPE, and MdAPE criteria. The data used in this research were half-hourly electricity consumption during the period September 2015 to April 2016. The results show that the proposed approach is a competitive alternative for short-term electricity forecasting in East Java. ARIMAX-QR using lag values and dummy variables as predictors yields more accurate predictions on both in-sample and out-of-sample data. Moreover, both the time series regression and ARIMAX-QR methods with lag values added as predictors capture the patterns in the data accurately, and hence produce better predictions than models without the additional lag variables.
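    The quantile-regression half of the ARIMAX-QR idea can be sketched with a minimal pinball-loss regression on a lagged predictor. This is a generic illustration, not the authors' implementation: the data are synthetic, and the subgradient solver is one simple choice among many.

```python
import numpy as np

def pinball_loss(e, tau):
    """Check (pinball) loss for quantile tau; e = y - prediction."""
    return np.mean(np.maximum(tau * e, (tau - 1) * e))

def quantile_regression(X, y, tau, iters=6000):
    """Fit beta minimizing the pinball loss by subgradient descent."""
    beta = np.zeros(X.shape[1])
    for t in range(iters):
        e = y - X @ beta
        grad = -X.T @ (tau - (e < 0)) / len(y)  # subgradient of the check loss
        beta -= grad / np.sqrt(t + 1)           # decaying step size
    return beta

rng = np.random.default_rng(1)
n = 800
lag1 = rng.normal(size=n)                      # stand-in for lagged consumption
y = 1.0 + 2.0 * lag1 + rng.normal(scale=0.5, size=n)
X = np.column_stack([np.ones(n), lag1])

beta_med = quantile_regression(X, y, tau=0.5)
print(beta_med)  # roughly [1.0, 2.0] for the conditional median
```

    Fitting several values of tau (e.g. 0.05, 0.5, 0.95) yields a predictive band rather than a point forecast, which is how quantile regression addresses the heteroscedasticity issue raised in the abstract.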

  11. Minimization of Impact from Electric Vehicle Supply Equipment to the Electric Grid Using a Dynamically Controlled Battery Bank for Peak Load Shaving

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castello, Charles C

    This research presents a comparison of two control systems for peak load shaving using local solar power generation (i.e., a photovoltaic array) and local energy storage (i.e., a battery bank). The purpose is to minimize the load demand of electric vehicle supply equipment (EVSE) on the electric grid. A static and a dynamic control system are compared for decreasing demand from EVSE. Static control of the battery bank is based on charging and discharging to the electric grid at fixed times. Dynamic control, with 15-minute resolution, forecasts EVSE load based on analysis of historical data. In the proposed dynamic control system, the sigmoid function is used to shave peak loads while limiting scenarios that can quickly drain the battery bank. These control systems are applied to Oak Ridge National Laboratory's (ORNL) solar-assisted electric vehicle (EV) charging stations. This installation is composed of three independently grid-tied sub-systems: (1) 25 EVSE; (2) a 47 kW photovoltaic (PV) array; and (3) a 60 kWh battery bank. The dynamic control system achieved the greatest peak load shaving, up to 34% on a cloudy day and 38% on a sunny day. The static control system was not ideal; peak load shaving was 14.6% on a cloudy day and 12.7% on a sunny day. Simulations based on ORNL data show that solar-assisted EV charging stations combined with the proposed dynamic battery control system can negate up to 89% of EVSE load demand on sunny days.
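    The sigmoid-throttled discharge idea can be sketched as follows. The control-law details are assumptions for illustration (the abstract only states that a sigmoid limits battery drain); the 60 kWh capacity and 15-minute resolution come from the abstract, while the load profile and threshold are invented.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dispatch(load_kw, threshold_kw, capacity_kwh, dt_h=0.25, k=10.0):
    """Shave load above threshold_kw; a sigmoid on the state of charge
    throttles discharge so a long peak does not drain the bank outright."""
    soc = capacity_kwh                       # start fully charged
    net = []
    for p in load_kw:
        excess = max(p - threshold_kw, 0.0)
        # throttle: near-full SOC -> factor ~1, near-empty -> factor ~0
        factor = sigmoid(k * (soc / capacity_kwh - 0.5))
        discharge = min(excess * factor, soc / dt_h)
        soc -= discharge * dt_h
        net.append(p - discharge)
    return np.array(net), soc

# One day of 15-minute EVSE load with a midday peak (synthetic).
t = np.arange(96) * 0.25
load = 20 + 40 * np.exp(-((t - 12.5) ** 2) / 4)
net, soc_end = dispatch(load, threshold_kw=35.0, capacity_kwh=60.0)
print(round(load.max(), 1), round(net.max(), 1))  # peak is reduced
```

    A static scheme would discharge at fixed hours regardless of the actual load, which is why it underperforms on days when the peak shifts.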

  12. Gas loading apparatus for the Paris-Edinburgh press

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bocian, A.; Kamenev, K. V.; Bull, C. L.

    2010-09-15

    We describe the design and operation of an apparatus for loading gases into the sample volume of the Paris-Edinburgh press at room temperature and high pressure. The system can be used for studies of samples loaded as pure or mixed gases as well as for loading gases as pressure-transmitting media in neutron-scattering experiments. The apparatus consists of a high-pressure vessel and an anvil holder with a clamp mechanism. The vessel, designed to operate at gas pressures of up to 150 MPa, is used for applying the load onto the anvils located inside the clamp. This initial load is sufficient for sealing the pressurized gas inside the sample-containing gasket. The clamp containing the anvils and the sample is then transferred into the Paris-Edinburgh press, by which further load can be applied to the sample. The clamp has apertures for scattered neutron beams and remains in the press for the duration of the experiment. The performance of the gas loading system is illustrated with the results of neutron-diffraction experiments on compressed nitrogen.

  13. Watershed-scale impacts of bioenergy crops on hydrology and water quality using improved SWAT model

    DOE PAGES

    Cibin, Raj; Trybula, Elizabeth; Chaubey, Indrajeet; ...

    2016-01-08

    Cellulosic bioenergy feedstocks such as perennial grasses and crop residues are expected to play a significant role in meeting US biofuel production targets. Here, we used an improved version of the Soil and Water Assessment Tool (SWAT) to forecast the impacts on watershed hydrology and water quality of implementing an array of plausible land-use changes associated with commercial bioenergy crop production for two watersheds in the Midwest USA. Watershed-scale impacts were estimated for 13 bioenergy crop production scenarios, including production of Miscanthus × giganteus and upland Shawnee switchgrass on highly erodible landscape positions, agriculturally marginal land areas and pastures; removal of corn stover; and combinations of these options. Water quality, measured as erosion and sediment loading, was forecast to improve compared to baseline when perennial grasses were used for bioenergy production, but not under the stover removal scenarios. Erosion reduction with perennial energy crop production scenarios ranged between 0.2% and 59%. Stream flow at the watershed outlet was reduced by between 0 and 8% across these bioenergy crop production scenarios compared to baseline across the study watersheds. Our results indicate that bioenergy production scenarios incorporating perennial grasses reduced the nonpoint source pollutant load at the watershed outlet compared to baseline conditions (0-20% for nitrate-nitrogen and 3-56% for mineral phosphorus), but the reduction rates were specific to site characteristics and management practices.

  14. Stochastic Multi-Timescale Power System Operations With Variable Wind Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Hongyu; Krad, Ibrahim; Florita, Anthony

    This paper describes a novel set of stochastic unit commitment and economic dispatch models that consider stochastic loads and variable generation at multiple operational timescales. The stochastic model includes four distinct stages: stochastic day-ahead security-constrained unit commitment (SCUC), stochastic real-time SCUC, stochastic real-time security-constrained economic dispatch (SCED), and deterministic automatic generation control (AGC). These sub-models are integrated such that they are continually updated with decisions passed from one to another. The progressive hedging algorithm (PHA) is applied to solve the stochastic models while maintaining computational tractability. Comparative case studies against two deterministic approaches, one with perfect forecasts and the other with current state-of-the-art but imperfect deterministic forecasts, are conducted in low and high wind penetration scenarios to highlight the advantages of the proposed methodology. The effectiveness of the proposed method is evaluated with sensitivity tests using both economic and reliability metrics to provide a broader view of its impact.
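    Why committing units against a set of scenarios can beat committing against a single deterministic forecast is easy to see in a toy two-generator example. All numbers here are invented, and brute-force enumeration stands in for the paper's far richer PHA-based SCUC/SCED chain.

```python
import itertools

# Toy system: two generators as (fixed cost if committed, marginal cost, capacity).
GENS = [(100.0, 20.0, 60.0),   # cheap energy, but must be committed ahead
        (10.0, 80.0, 100.0)]   # expensive peaker
DEMAND = 120.0
VOLL = 1000.0                  # value of lost load (penalty for shortfall)

def dispatch_cost(commit, wind):
    """Least-cost dispatch of committed units against net demand."""
    need = max(DEMAND - wind, 0.0)
    cost = 0.0
    for on, (fixed, marg, cap) in zip(commit, GENS):
        if not on:
            continue
        cost += fixed
        take = min(need, cap)
        cost += marg * take
        need -= take
    return cost + VOLL * need  # penalize unserved energy

scenarios = [20.0, 60.0, 100.0]   # equiprobable wind outcomes
weights = [1 / 3] * 3

def best_commit(expected_cost):
    return min(itertools.product([0, 1], repeat=2), key=expected_cost)

# Deterministic: commit against the mean wind forecast (60 MW).
det = best_commit(lambda c: dispatch_cost(c, 60.0))
# Stochastic: commit against the expected cost over all scenarios.
sto = best_commit(lambda c: sum(w * dispatch_cost(c, s)
                                for w, s in zip(weights, scenarios)))

exp_det = sum(w * dispatch_cost(det, s) for w, s in zip(weights, scenarios))
exp_sto = sum(w * dispatch_cost(sto, s) for w, s in zip(weights, scenarios))
print(det, sto, round(exp_det, 1), round(exp_sto, 1))
```

    The deterministic plan skips the peaker because the mean forecast never needs it, then pays heavily for unserved energy in the low-wind scenario; the stochastic plan hedges by committing both units.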

  15. Preliminary identification of unicellular algal genus by using combined confocal resonance Raman spectroscopy with PCA and DPLS analysis

    NASA Astrophysics Data System (ADS)

    He, Shixuan; Xie, Wanyi; Zhang, Ping; Fang, Shaoxi; Li, Zhe; Tang, Peng; Gao, Xia; Guo, Jinsong; Tlili, Chaker; Wang, Deqiang

    2018-02-01

    The analysis of algae, and of the dominant alga in particular, plays an important role in ecological and environmental studies, since it can be used to forecast water blooms and control their potential deleterious effects. Herein, we combine in vivo confocal resonance Raman spectroscopy with multivariate analysis methods to preliminarily identify three algal genera in water blooms at the unicellular scale. Statistical analysis of characteristic Raman peaks demonstrates that certain shifts and different normalized intensities, resulting from the composition of different carotenoids, exist in the Raman spectra of the three types of algal cells. Principal component analysis (PCA) scores and the corresponding loading weights show differences in the Raman spectral characteristics that are caused by vibrations of carotenoids in unicellular algae. A discriminant partial least squares (DPLS) classification method is then used to verify the effectiveness of algal identification with confocal resonance Raman spectroscopy. Our results show that confocal resonance Raman spectroscopy combined with PCA and DPLS can handle the preliminary identification of the dominant alga for forecasting and controlling water blooms.
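    The PCA step can be sketched on synthetic "spectra"; a nearest-centroid classifier in PC space stands in for the DPLS step, a deliberate simplification. Peak positions, noise levels and the class structure are all invented.

```python
import numpy as np

rng = np.random.default_rng(2)

def make_spectra(peak_positions, n=30, bins=200, noise=0.05):
    """Synthetic 'Raman spectra': Gaussian peaks plus noise (hypothetical)."""
    axis = np.arange(bins)
    base = sum(np.exp(-0.5 * ((axis - p) / 4.0) ** 2) for p in peak_positions)
    return base + noise * rng.normal(size=(n, bins))

# Three 'genera' distinguished by carotenoid-like peak positions (invented).
X = np.vstack([make_spectra([50, 120]), make_spectra([55, 140]),
               make_spectra([70, 160])])
labels = np.repeat([0, 1, 2], 30)

# PCA via SVD on the mean-centred data.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T            # first three principal-component scores

# Nearest-centroid classification in PC space (simple stand-in for DPLS).
centroids = np.array([scores[labels == k].mean(axis=0) for k in range(3)])
pred = np.argmin(((scores[:, None, :] - centroids) ** 2).sum(-1), axis=1)
acc = (pred == labels).mean()
print(acc)
```

    DPLS differs from this sketch in that its latent components are chosen to maximise covariance with the class labels rather than total variance, which usually improves separation when the discriminating peaks carry little overall variance.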

  16. Energy data sourcebook for the US residential sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wenzel, T.P.; Koomey, J.G.; Sanchez, M.

    Analysts assessing policies and programs to improve energy efficiency in the residential sector require disparate input data from a variety of sources. This sourcebook, which updates a previous report, compiles these input data in a single location. The data provided include information on end-use unit energy consumption (UEC) values of appliances and equipment; historical and current appliance and equipment market shares; appliance and equipment efficiency and sales trends; appliance and equipment efficiency standards; cost vs. efficiency data for appliances and equipment; product lifetime estimates; thermal shell characteristics of buildings; heating and cooling loads; shell measure cost data for new and retrofit buildings; baseline housing stocks; forecasts of housing starts; and forecasts of energy prices and other economic drivers. This report is the essential sourcebook for policy analysts interested in residential sector energy use. The report can be downloaded from the Web at http://enduse.lbl.gov/Projects/RED.html. Future updates to the report, errata, and related links will also be posted at this address.

  17. Evaluation and prediction of solar radiation for energy management based on neural networks

    NASA Astrophysics Data System (ADS)

    Aldoshina, O. V.; Van Tai, Dinh

    2017-08-01

    Currently, renewable energy sources and distributed power generation based on intelligent networks are spreading rapidly; meteorological forecasts are therefore particularly useful for planning and managing the energy system in order to increase its overall efficiency and productivity. This article presents an application of artificial neural networks (ANNs) in the field of photovoltaic energy. Two recurrent dynamic ANNs, a concentrated time-delay neural network (CTDNN) and a nonlinear autoregressive network with exogenous inputs (NAEI), are used to develop a model for estimating and forecasting daily solar radiation. The ANNs perform well, yielding reliable and accurate models of daily solar radiation, which in turn allow the photovoltaic output power of the installation to be predicted successfully. The potential of the proposed method for managing the energy of the electrical network is demonstrated by applying the NAEI network to predict the electric load.
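    The time-delay idea, feeding a network with lagged values of the series it must predict, can be reduced to a tiny two-lag MLP on a synthetic periodic signal. The architecture, learning rate and data below are illustrative assumptions, not the article's configuration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical smooth periodic signal standing in for daily solar radiation.
t = np.arange(400)
series = np.sin(2 * np.pi * t / 24)

# Time-delay inputs: predict y[t] from y[t-1] and y[t-2].
X = np.column_stack([series[1:-1], series[:-2]])
y = series[2:]

# One-hidden-layer network trained by full-batch gradient descent.
W1 = 0.5 * rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = 0.5 * rng.normal(size=(8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)                 # hidden activations
    out = (h @ W2 + b2).ravel()              # linear output
    err = out - y
    g_out = (2.0 / len(y)) * err[:, None]    # dL/d(out) for MSE loss
    g_h = (g_out @ W2.T) * (1 - h ** 2)      # backprop through tanh
    W2 -= lr * (h.T @ g_out); b2 -= lr * g_out.sum(axis=0)
    W1 -= lr * (X.T @ g_h);   b1 -= lr * g_h.sum(axis=0)

h = np.tanh(X @ W1 + b1)
mse = np.mean(((h @ W2 + b2).ravel() - y) ** 2)
print(round(mse, 4))
```

    The NAEI/NARX variant described in the article adds exogenous inputs (e.g. meteorological forecasts) alongside the delayed outputs, but the training loop has the same shape.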

  18. An operational hydrological ensemble prediction system for the city of Zurich (Switzerland): skill, case studies and scenarios

    NASA Astrophysics Data System (ADS)

    Addor, N.; Jaun, S.; Fundel, F.; Zappa, M.

    2011-07-01

    The Sihl River flows through Zurich, Switzerland's most populated city, for which it represents the largest flood threat. To anticipate extreme discharge events and provide decision support in case of flood risk, a hydrometeorological ensemble prediction system (HEPS) was launched operationally in 2008. This model chain relies on limited-area atmospheric forecasts provided by the deterministic model COSMO-7 and the probabilistic model COSMO-LEPS. These atmospheric forecasts are used to force a semi-distributed hydrological model (PREVAH), coupled to a hydraulic model (FLORIS). The resulting hydrological forecasts are eventually communicated to the stakeholders involved in the Sihl discharge management. This fully operational setting provides a real framework with which to compare the potential of deterministic and probabilistic discharge forecasts for flood mitigation. To study the suitability of HEPS for small-scale basins and to quantify the added value conveyed by the probability information, a reforecast was made for the period June 2007 to December 2009 for the Sihl catchment (336 km²). Several metrics support the conclusion that the performance gain can be of up to 2 days lead time for the catchment considered. Brier skill scores show that overall COSMO-LEPS-based hydrological forecasts outperform their COSMO-7-based counterparts for all the lead times and event intensities considered. The small size of the Sihl catchment does not prevent skillful discharge forecasts, but makes them particularly dependent on correct precipitation forecasts, as shown by comparisons with a reference run driven by observed meteorological parameters. Our evaluation stresses that the capacity of the model to provide confident and reliable mid-term probability forecasts for high discharges is limited.
The two most intense events of the study period are investigated using a novel graphical representation of probability forecasts and are used to generate high-discharge scenarios. They highlight the challenges of making decisions on the basis of hydrological predictions, and indicate the need for a tool, used alongside the forecasts, to compare the different mitigation actions possible in the Sihl catchment. No definitive conclusion on the capacity of the model chain to forecast flooding events endangering the city of Zurich could be drawn, because of the under-sampling of extreme events. Further research on the form of the reforecasts needed to draw inferences about floods with return periods of several decades to centuries is encouraged.
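    The Brier skill score used in this evaluation compares the mean squared error of probability forecasts against that of a reference forecast. A minimal sketch with invented exceedance outcomes follows; the study's COSMO-LEPS vs. COSMO-7 comparison is, of course, computed on real reforecasts.

```python
import numpy as np

def brier_score(prob, obs):
    """Mean squared error of probability forecasts against 0/1 outcomes."""
    return np.mean((np.asarray(prob) - np.asarray(obs)) ** 2)

def brier_skill_score(prob, prob_ref, obs):
    """BSS > 0 means the forecast beats the reference; 1 is a perfect score."""
    return 1.0 - brier_score(prob, obs) / brier_score(prob_ref, obs)

obs = [1, 0, 0, 1, 0, 1, 0, 0]                    # discharge-threshold exceedances
ens = [0.9, 0.1, 0.2, 0.7, 0.0, 0.8, 0.3, 0.1]    # ensemble-based probabilities
det = [1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0]    # deterministic (0/1) forecast

bss = brier_skill_score(ens, det, obs)
print(round(bss, 3))
```

    A deterministic forecast pays the full squared penalty whenever it is wrong, while a calibrated probabilistic forecast spreads its risk, which is the mechanism behind the ensemble's advantage reported above.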

  19. Combining empirical approaches and error modelling to enhance predictive uncertainty estimation in extrapolation for operational flood forecasting. Tests on flood events on the Loire basin, France.

    NASA Astrophysics Data System (ADS)

    Berthet, Lionel; Marty, Renaud; Bourgin, François; Viatgé, Julie; Piotte, Olivier; Perrin, Charles

    2017-04-01

    An increasing number of operational flood forecasting centres assess the predictive uncertainty associated with their forecasts and communicate it to end users. This information can match the end-users' needs (i.e. prove useful for efficient crisis management) only if it is reliable: reliability is therefore a key quality of operational flood forecasts. In 2015, the French flood forecasting national and regional services (Vigicrues network; www.vigicrues.gouv.fr) implemented a framework to compute quantitative discharge and water level forecasts and to assess the predictive uncertainty. Among the possible technical options to achieve this goal, a statistical analysis of the past forecasting errors of deterministic models was selected (QUOIQUE method, Bourgin, 2014). It is a data-based and non-parametric approach resting on as few assumptions as possible about the mathematical structure of the forecasting error. In particular, a very simple assumption is made regarding the predictive uncertainty distributions for large events outside the range of the calibration data: the multiplicative error distribution is assumed to be constant, whatever the magnitude of the flood. Indeed, the predictive distributions may not be reliable in extrapolation. However, estimating the predictive uncertainty for these rare events is crucial when major floods are of concern. In order to improve forecast reliability for major floods, we attempt to combine the operational strength of the empirical statistical analysis with a simple error model. Since the heteroscedasticity of forecast errors can considerably weaken predictive reliability for large floods, this error model is based on the log-sinh transformation, which has been shown to significantly reduce the heteroscedasticity of the transformed errors in a simulation context, even for flood peaks (Wang et al., 2012).
Exploratory tests on operational forecasts issued during recent floods in France (the major spring floods of June 2016 on the Loire river tributaries and the flash floods of fall 2016) will be shown and discussed. References: Bourgin, F. (2014). How to assess the predictive uncertainty in hydrological modelling? An exploratory work on a large sample of watersheds. AgroParisTech. Wang, Q. J., Shrestha, D. L., Robertson, D. E. and Pokhrel, P. (2012). A log-sinh transformation for data normalization and variance stabilization. Water Resources Research, W05514, doi:10.1029/2011WR010973.
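    The log-sinh transformation of Wang et al. (2012), z = log(sinh(a + b·y))/b, has a closed-form inverse, which is what makes it practical for back-transforming predictive distributions to discharge space. A minimal sketch follows; the parameter values a and b are invented, whereas in practice they are estimated jointly with the error model.

```python
import numpy as np

def log_sinh(y, a, b):
    """Forward log-sinh transform: z = log(sinh(a + b*y)) / b."""
    return np.log(np.sinh(a + b * y)) / b

def inv_log_sinh(z, a, b):
    """Exact inverse: y = (arcsinh(exp(b*z)) - a) / b."""
    return (np.arcsinh(np.exp(b * z)) - a) / b

a, b = 0.1, 0.02                               # illustrative parameter values
flow = np.array([5.0, 50.0, 500.0, 5000.0])    # hypothetical discharges (m3/s)
z = log_sinh(flow, a, b)
back = inv_log_sinh(z, a, b)
print(np.allclose(back, flow))
```

    For large flows the transform is nearly linear while small flows are stretched, so an error distribution that is roughly constant in transformed space maps back to errors that grow with the magnitude of the flood, addressing the heteroscedasticity discussed above.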

  20. Single-Lap-Joint Screening of Hysol EA 9309NA Epoxy Adhesive

    DTIC Science & Technology

    2017-05-01

    Excerpt from the report's list of figures: load vs. displacement curves for RT (no conditioning) samples, RT (hot/wet conditioning) samples, and ET (66 °C postcure) samples; failure surface for RT (hot/wet conditioning) samples (MSAT ID 20140469; mode of failure: adhesive).
