A class of spatial economic-demographic forecasting models is proposed. The models combine elements of traditional Markov and economic gravity models. A base-period probability structure is modified by the changing relative distribution of economic opportunity. Estimation issues are addressed, and an empirical application to US interstate migration during the late 1970s is described. It is contended that the framework
D A Plane; P A Rogerson
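The core mechanism described above can be sketched in a few lines: a base-period transition matrix is re-weighted column-by-column by relative economic opportunity and then re-normalized. This is our own minimal illustration, not the paper's estimator; the regions, probabilities, and opportunity indices are invented.

```python
# Hypothetical sketch: scale each destination's base-period transition
# probability by that destination's economic-opportunity index, then
# renormalize each row so probabilities again sum to one.

def adjust_transitions(base, opportunity):
    """Re-weight destination columns by opportunity and renormalize rows."""
    adjusted = []
    for row in base:
        scaled = [p * w for p, w in zip(row, opportunity)]
        total = sum(scaled)
        adjusted.append([s / total for s in scaled])
    return adjusted

# Base-period interstate transition probabilities (3 invented regions).
base = [[0.90, 0.06, 0.04],
        [0.05, 0.85, 0.10],
        [0.03, 0.07, 0.90]]

# Relative economic opportunity indices for the forecast period (invented).
opportunity = [1.0, 1.3, 0.8]

forecast = adjust_transitions(base, opportunity)
```

Region 2's improved opportunity raises every origin's probability of moving there, while the stay-put and low-opportunity flows shrink proportionally.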
The design of a general multiregional econometric model of the USA and the design of a regional electricity consumption and demand submodel are presented. The multiregional econometric model is intended to provide forecasts of regional population, economic activity by industrial sector, regional wages, and incomes. The electricity submodel is designed to take forecasts of such general economic indicators (together with forecasts of relative electricity and other energy costs) and to produce forecasts of electricity (kWh) consumption by customer category and forecasts of peak load. While the ultimate purpose of the present effort is regional electricity forecasting, it is clear that the multiregional econometric model which supports the electricity submodel has a great many other uses. The multiregional econometric model design presented in the document represents a natural extension to the regional level of the Wharton Long-Term Annual and Industry Forecasting Model of the USA. The parts of that model that lend themselves to regional disaggregation (employment and wages, for example) are disaggregated. Aggregate US forecasts for such variables are determined by adding up from the bottom. This bottom-up design marks a major departure from earlier regional efforts. In addition to providing a description of the theoretical design of the model, this document provides an extensive review and evaluation of the economic and electricity-energy database needed for its construction.
Adams, F.G.; McCarthy, M.D.; Hill, J.
This paper investigates the forecasting ability of Mallows Model Averaging (MMA) by conducting an empirical analysis of the GDP growth rates of five Asian countries: Malaysia, Thailand, the Philippines, Indonesia, and China. Results reveal that MMA shows no noticeable difference in predictive ability compared to the general autoregressive fractionally integrated moving average (ARFIMA) model, and that its predictive ability is sensitive to the effect of financial crises. MMA could be an alternative forecasting method for samples without recent outliers such as financial crises.
Yin, Yip Chee; Hock-Eam, Lim
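The averaging step behind MMA can be illustrated compactly: candidate weights are chosen to minimize Mallows' criterion, the in-sample squared error plus a penalty on the weighted number of parameters. The sketch below averages just two nested models (mean-only vs. linear trend) over a made-up growth series and assumes a known error variance; real MMA uses many nested models and estimates the variance from the largest one.

```python
# Minimal Mallows Model Averaging sketch: grid-search the weight on the
# smaller of two nested models to minimize SSE + 2*sigma2*(weighted k).
# Data and sigma2 are invented for illustration.

def ols_trend(y):
    """Closed-form OLS fit of y on an intercept and a linear trend."""
    n = len(y)
    t = list(range(n))
    tbar, ybar = sum(t) / n, sum(y) / n
    b = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y)) / \
        sum((ti - tbar) ** 2 for ti in t)
    a = ybar - b * tbar
    return [a + b * ti for ti in t]

def mallows_weight(y, sigma2):
    """Weight on the mean-only model that minimizes Mallows' criterion."""
    f1 = [sum(y) / len(y)] * len(y)   # model 1: mean only (k = 1)
    f2 = ols_trend(y)                 # model 2: intercept + trend (k = 2)
    best_w, best_c = 0.0, float("inf")
    for i in range(101):
        w = i / 100
        fit = [w * a + (1 - w) * b for a, b in zip(f1, f2)]
        sse = sum((yi - fi) ** 2 for yi, fi in zip(y, fit))
        c = sse + 2 * sigma2 * (w * 1 + (1 - w) * 2)  # penalty on parameters
        if c < best_c:
            best_w, best_c = w, c
    return best_w

y = [2.0, 2.1, 2.3, 2.2, 2.6, 2.7, 2.9, 3.1]  # hypothetical GDP growth rates
w = mallows_weight(y, sigma2=0.02)
```

Because the series trends strongly, nearly all weight goes to the trend model; with noisier data the criterion shifts weight toward the parsimonious model.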
Background: We developed an economic model of prostate cancer management from diagnosis until death. We have used the Montreal Prostate Cancer Model to estimate the total economic burden of the disease in a cohort of Canadian men. Methods: Using this Markov state-transition simulation model, we estimated the probability of prostate cancer, annual prostate cancer progression rates and associated direct
Steven A. Grover; Louis Coupal; Hanna Zowall; Raghu Rajan; John Trachtenberg; Mostafa Elhilali; Michael Chetner; Larry Goldenberg
Discusses four reasons why economic forecasting courses are important: (1) forecasting skills are in demand by businesses; (2) forecasters are in demand; (3) forecasting courses have positive externalities; and (4) forecasting provides a real-world context. Describes what should be taught in an economic forecasting course.
Loomis, David G.; Cox, James E., Jr.
In this paper we review the methodology of forecasting with log-linearised DSGE models using Bayesian methods. We focus on the estimation of their predictive distributions, with special attention being paid to the mean and the covariance matrix of h-step ahead forecasts. In the empirical analysis, we examine the forecasting performance of the New Area-Wide Model (NAWM) that has been designed
Kai Christoffel; Günter Coenen; Anders Warne
Crime forecasting is an important area in the field of criminology. Linear models, such as regression and econometric models, are commonly applied in crime forecasting. However, real crime data commonly consist of both linear and nonlinear components, and a single model may not be sufficient to identify all the characteristics of the data. The purpose of this study is to introduce a hybrid model that combines support vector regression (SVR) and the autoregressive integrated moving average (ARIMA) model for crime rate forecasting. SVR is very robust with small training data and high-dimensional problems, while ARIMA has the ability to model several types of time series. However, the accuracy of the SVR model depends on the values of its parameters, and ARIMA is not robust when applied to small data sets. To overcome these problems, particle swarm optimization is used to estimate the parameters of the SVR and ARIMA models. The proposed hybrid model is used to forecast the property crime rates of the United States based on economic indicators. The experimental results show that the proposed hybrid model produces more accurate forecasts than the individual models.
Alwee, Razana; Hj Shamsuddin, Siti Mariyam; Sallehuddin, Roselina
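The hybrid decomposition the paper relies on — a linear time-series model plus a nonlinear learner fitted to its residuals — can be sketched as follows. To keep the sketch self-contained we substitute an AR(1) for ARIMA and a 1-nearest-neighbour regressor for SVR; both substitutions are ours, and the series is invented.

```python
# Hybrid forecast sketch: linear part (AR(1) stand-in for ARIMA) plus a
# nonlinear correction fitted to the linear model's residuals (1-NN
# stand-in for SVR). Purely illustrative data and models.

def fit_ar1(y):
    """Least-squares AR(1) coefficient (no intercept) on a short series."""
    num = sum(a * b for a, b in zip(y[1:], y[:-1]))
    den = sum(b * b for b in y[:-1])
    return num / den

def knn1_forecast(residuals, query):
    """Predict the residual that followed the lagged residual closest to `query`."""
    best_i = min(range(len(residuals) - 1),
                 key=lambda i: abs(residuals[i] - query))
    return residuals[best_i + 1]

y = [5.0, 5.5, 5.2, 5.8, 5.4, 6.0, 5.7, 6.2]  # hypothetical crime-rate series
phi = fit_ar1(y)
resid = [y[t] - phi * y[t - 1] for t in range(1, len(y))]

linear_part = phi * y[-1]                       # AR(1) one-step forecast
nonlinear_part = knn1_forecast(resid, resid[-1])  # residual correction
hybrid_forecast = linear_part + nonlinear_part
```

The point of the hybrid is visible even here: the linear part captures the level dynamics, while the residual learner picks up the alternating pattern the linear model leaves behind.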
This report describes a model for forecasting total US highway travel by all vehicle types, and its implementation in the form of a personal computer program. The model comprises a short-run, econometrically-based module for forecasting through the year 2000, as well as a structural, scenario-based longer term module for forecasting through 2030. The short-term module is driven primarily by economic variables. It includes a detailed vehicle stock model and permits the estimation of fuel use as well as vehicle travel. The longer-term module depends on demographic factors to a greater extent, but also on trends in key parameters such as vehicle load factors, and the dematerialization of GNP. Both passenger and freight vehicle movements are accounted for in both modules. The model has been implemented as a compiled program in the Fox-Pro database management system operating in the Windows environment.
Greene, D.L.; Chin, Shih-Miao; Gibson, R. [Tennessee Univ., Knoxville, TN (United States)]
As regions look to increase their economic development activities, technology-based developments and the penchant for long-term developments in disruptive technologies like nanotechnology become an important part of the options available to these regions. There are typically many technologies, and therefore product areas, that a region can further develop by investing resources in these areas. At the same time, other
Sul Kassicieh; Nabeel Rahal
This publication describes data in the National Weather Service's Model Output Statistics Final Forecast Guidance teletype bulletins. It is intended to serve as a comprehensive guide to the interpretation and use of the forecast bulletins by AWS forecast...
This paper describes the Western Area Gaming and Economic Response Simulator (WAGERS), a forecasting model that emphasizes the role of the gaming industry in Clark County, Nevada. It is designed to generate forecasts of gaming revenues in Clark County, w...
B. Edwards; A. Bando; G. Bassett; A. Rosen; J. Carlson
Motivated by the common finding that linear autoregressive models forecast better than models that incorporate additional information, this paper presents analytical, Monte Carlo, and empirical evidence on the effectiveness of combining forecasts from nes...
T. E. Clark; M. W. McCracken
What is econophysics, and what is its relationship with economics? What is the state of economics after the global economic crisis, and is there a future for the paradigm of market equilibrium, with imaginary perfect competition and rational agents? Can the next paradigm of economics adopt important assumptions derived from econophysics models: that markets are chaotic systems, striving to extremes as bubbles and crashes show, with psychologically motivated, statistically predictable individual behaviors? Is the future of econophysics, as predicted here, to disappear and become a part of economics? A good test of the current state of econophysics and its methods is the valuation of Facebook immediately after the initial public offering: this forecast indicates that Facebook is highly overvalued, and that its IPO valuation of 104 billion dollars mostly reflects a new financial bubble based on expectations of unlimited growth, although it is easy to show that Facebook is close to the upper limit of its user base.
Gajic, Nenad; Budinski-Petkovic, Ljuba
The purpose of this paper is to describe the procedures followed by the Research Department of the Federal Reserve Bank of Minneapolis in producing a forecast of national economic activity with the aid of a large econometric model. We produce such a foreca...
T. M. Supel
Feature selection in a multivariate forecasting model is very important to ensure that the model is accurate. The purpose of this study is to apply the Cooperative Feature Selection method for feature selection. The features are economic indicators that will be used in a crime rate forecasting model. The Cooperative Feature Selection combines grey relational analysis and an artificial neural network to establish a cooperative model that can rank and select the significant economic indicators. Grey relational analysis is used to select the best data series to represent each economic indicator and to rank the economic indicators according to their importance to the crime rate. The artificial neural network is then used to select the significant economic indicators for forecasting the crime rates. In this study, we used the economic indicators of unemployment rate, consumer price index, gross domestic product, and consumer sentiment index, as well as property crime and violent crime rates for the United States. A Levenberg-Marquardt neural network is used in this study. From our experiments, we found that the consumer price index is an important economic indicator that has a significant influence on the violent crime rate, while for the property crime rate, the gross domestic product, unemployment rate, and consumer price index are the influential economic indicators. The Cooperative Feature Selection is also found to produce smaller errors than Multiple Linear Regression in forecasting property and violent crime rates.
Alwee, Razana; Shamsuddin, Siti Mariyam Hj; Salleh Sallehuddin, Roselina
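The grey-relational-analysis ranking step can be shown in miniature: each candidate indicator is scored by its grey relational grade against the reference (crime-rate) series, and indicators are ranked by grade. The series below are invented; the distinguishing coefficient rho = 0.5 is the conventional default, and the neural-network selection stage is omitted.

```python
# Grey relational analysis sketch: min-max normalize both series, compute
# pointwise grey relational coefficients, and average them into a grade.

def grey_relational_grade(reference, series, rho=0.5):
    norm = lambda s: [(v - min(s)) / (max(s) - min(s)) for v in s]
    r, c = norm(reference), norm(series)
    deltas = [abs(a - b) for a, b in zip(r, c)]
    dmin, dmax = min(deltas), max(deltas)
    coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in deltas]
    return sum(coeffs) / len(coeffs)

crime = [100, 110, 130, 125, 140]                  # reference series (invented)
indicators = {
    "unemployment": [5.0, 5.4, 6.2, 6.0, 6.7],     # tracks crime closely
    "cpi":          [2.0, 2.1, 2.0, 2.2, 2.1],     # nearly flat
}
grades = {k: grey_relational_grade(crime, v) for k, v in indicators.items()}
ranked = sorted(grades, key=grades.get, reverse=True)
```

The indicator whose normalized shape follows the reference most closely earns the higher grade, which is exactly the ranking the cooperative model feeds into the neural-network stage.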
The trend in forecasting oil and gas discoveries has been to develop and use models that allow forecasts of the size distribution of future discoveries. From such forecasts, exploration and development costs can more readily be computed. Two classes of these forecasting models are the Arps-Roberts type models and the 'creaming method' models. This paper examines the robustness of the forecasts made by these models when the historical data on which the models are based have been subject to economic upheavals or when historical discovery data are aggregated from areas having widely differing economic structures. Model performance is examined in the context of forecasting discoveries for offshore Texas State and Federal areas. The analysis shows how the model forecasts are limited by the information contained in the historical discovery data. Because the Arps-Roberts type models require more regularity in the discovery sequence than the creaming models, prior information had to be introduced into the Arps-Roberts models to accommodate the influence of economic changes. The creaming methods captured the overall decline in discovery size but did not easily allow the introduction of exogenous information to compensate for incomplete historical data. Moreover, the predictive lognormal distribution associated with the creaming method appears to understate the importance of the potential contribution of small fields. © 1989.
Attanasi, E. D.; Schuenemeyer, J. H.
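Both model families formalize the same empirical regularity: when fields are found with probability proportional to their size (the Arps-Roberts assumption), average discovery size declines as exploration proceeds — the "creaming" effect. The simulation below illustrates that regularity on a made-up lognormal parent population; it is a toy, not either paper's estimator.

```python
# Size-biased discovery simulation: draw fields without replacement with
# probability proportional to size, and compare early vs. late discoveries.
import random

random.seed(7)
# Invented parent population of field sizes (million BOE), lognormal.
fields = [random.lognormvariate(3.0, 1.2) for _ in range(400)]

def discover_sequence(pool):
    """Sample the whole pool without replacement, probability ~ size."""
    pool, sequence = pool[:], []
    while pool:
        pick = random.choices(range(len(pool)), weights=pool)[0]
        sequence.append(pool.pop(pick))
    return sequence

seq = discover_sequence(fields)
early = sum(seq[:50]) / 50    # mean size of the first 50 discoveries
late = sum(seq[-50:]) / 50    # mean size of the last 50 discoveries
```

The early-versus-late gap is the declining discovery rate both abstracts describe; economic truncation enters when the smallest of the late discoveries never get reported at all.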
This paper describes water flow forecasting methods for dams using neural networks and regression models. The water flow forecasting task is very important for reliable and economic operations. Many conventional methods have been used, but they take much time to develop into an accurate forecasting system because it is difficult to adjust their parameters. Water flow forecasting system for dams, which have much
T. Egawa; K. Suzuki; Y. Ichikawa; T. Iizaka; T. Matsui; Y. Shikagawa
A method is presented for the estimation of undiscovered oil and gas resources in partially explored areas where economic truncation has caused some discoveries to go unreported, thereby distorting the relationship between the observed discovery size distribution and the parent or ultimate field size distribution. The method is applied to the UK's northern and central North Sea provinces. A discovery process model is developed to estimate the number and size distribution of undiscovered fields in this area as of 1983. The model is also used to forecast the rate at which fields will be discovered in the future. The appraisal and forecasts pertain to fields in size classes as small as 24 million barrels of oil equivalent (BOE). Estimated undiscovered hydrocarbon resources of 11.79 billion BOE are expected to be contained in 170 remaining fields. Over the first 500 wildcat wells after 1 January 1983, the discovery rate in this area is expected to decline by 60%, from 15 million BOE per wildcat well to six million BOE per wildcat well. © 1984.
Schuenemeyer, J. H.; Attanasi, E. D.
All forecasters are familiar with occasional run-to-run changes in forecast direction that occur with medium-range (and sometimes even short-range) forecasts from the Global Forecast Model (aka AVN/MRF). This case describes two recent model flip-flops in a pair of time-adjacent operational MRF runs, and shows how MRF ensemble forecasts shed light on what is actually going on in the operational MRF runs.
The Collaborative Planning, Forecasting and Replenishment (CPFR) is an application of the Supply Chain Management concept in retailing. In this paper, the collaborative forecasting process between retailers and manufacturers, which is the core of CPFR, is mainly discussed. A combination-forecasting model is created to improve forecasting accuracy and collaboration in the CPFR process. Finally, the formulation results showed the effectiveness of
WenJie Wang; Glorious Sun
Deterministic forecasts of wind production for the next 72 h at a single wind farm or at the regional level are among the main end-user requirements. However, for optimal management of wind power production and distribution it is important to provide, together with a deterministic prediction, a probabilistic one. A deterministic forecast consists of a single value for each future time for the variable to be predicted, while probabilistic forecasting informs on the probabilities of potential future events. This means providing information about uncertainty (i.e. a forecast of the PDF of power) in addition to the commonly provided single-valued power prediction. A significant probabilistic application is the trading of energy in day-ahead electricity markets. It has been shown that, when trading future wind energy production, using probabilistic wind power predictions can lead to higher benefits than those obtained by using deterministic forecasts alone. In fact, by using probabilistic forecasting it is possible to solve economic model equations so as to optimize the revenue for the producer depending, for example, on the specific penalties for forecast errors valid in that market. In this work we have applied a probabilistic wind power forecast system based on the "analog ensemble" method for bidding wind energy during the day-ahead market in the case of a wind farm located in Italy. The actual hourly income for the plant is computed considering the actual selling energy prices and penalties proportional to the unbalancing, defined as the difference between the day-ahead offered energy and the actual production. The economic benefits of using a probabilistic approach for day-ahead energy bidding are evaluated, resulting in a 23% increase in the annual income for a wind farm owner in the case of knowing "a priori" the future energy prices. The uncertainty in price forecasting partly reduces the economic benefit gained by using a probabilistic energy forecast system.
Alessandrini, S.; Davò, F.; Sperati, S.; Benini, M.; Delle Monache, L.
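Why a probabilistic forecast pays off in day-ahead bidding can be made concrete with the standard newsvendor result: when imbalance penalties are asymmetric, the revenue-maximizing offer is a quantile of the predictive power distribution rather than its mean. The ensemble values and penalty levels below are hypothetical, and the real market rules in the paper differ in detail.

```python
# Newsvendor-style bid from an ensemble forecast: offer the quantile that
# balances the per-MWh penalty for shortfall against that for surplus.

def optimal_bid(ensemble, penalty_short, penalty_long):
    """Pick the ensemble quantile at the critical ratio of the two penalties."""
    q = penalty_long / (penalty_short + penalty_long)  # critical ratio
    s = sorted(ensemble)
    return s[min(int(q * len(s)), len(s) - 1)]

# Hypothetical ensemble of production scenarios (MWh) for one delivery hour.
ensemble = [10, 12, 13, 15, 18, 20, 22, 25, 28, 30]

# Shortfalls penalized three times as hard as surpluses, so the optimal
# offer sits well below the ensemble median.
bid = optimal_bid(ensemble, penalty_short=30.0, penalty_long=10.0)
```

A deterministic forecast can only offer its single value; the ensemble lets the producer shade the bid toward whichever imbalance direction is cheaper, which is the source of the income gain reported above.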
A conceptual model of the commercial air transportation industry is developed which can be used to predict trends in economics, demand, and consumption. The methodology is based on digraph theory, which considers the interaction of variables and the propagation of changes. Air transportation economics are treated by examination of major variables, their relationships, and historic trends, and by calculation of regression coefficients. A description of the modeling technique and a compilation of historic airline industry statistics used to determine interaction coefficients are included. Results of model validation show negligible differences between actual and projected values over the eighteen-year period from 1959 to 1976. A limited application of the method presents forecasts of air transportation industry demand, growth, revenue, costs, and fuel consumption to 2020 for two scenarios of future economic growth and energy consumption.
Ayati, M. B.; Liu, C. Y.; English, J. M.
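The digraph "propagation of changes" idea can be sketched as a pulse process: a shock at one vertex travels along weighted edges, and each vertex accumulates the pulses that arrive. The variables and edge weights below are invented stand-ins for the paper's airline-industry quantities.

```python
# Pulse process on a weighted digraph: W[i][j] is the effect of a unit
# change in variable i on variable j at the next step.

labels = ["fuel_price", "ticket_price", "demand"]
W = [[0.0, 0.6, 0.0],    # fuel price raises ticket price
     [0.0, 0.0, -0.8],   # ticket price depresses demand
     [0.0, 0.0, 0.0]]

def propagate(pulse, steps):
    """Run the pulse process; return cumulative value changes per variable."""
    value = pulse[:]
    for _ in range(steps):
        pulse = [sum(pulse[i] * W[i][j] for i in range(3)) for j in range(3)]
        value = [v + p for v, p in zip(value, pulse)]
    return value

# A one-unit shock to fuel price, propagated for two steps.
change = propagate([1.0, 0.0, 0.0], steps=2)
```

After two steps the fuel-price shock has raised ticket prices and, through them, lowered demand, which is the kind of indirect trend effect the digraph model is built to trace.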
Our empirical results show that GDP growth rates can be predicted more accurately in continents with fewer large economies than in smaller economies like Malaysia. This difficulty is very likely positively correlated with subsidy or social security policies. The stage of economic development and the level of competitiveness also appear to have interactive effects on this forecast stability. These results are generally independent of the forecasting procedures. For countries with high stability in their economic growth, forecasting by model selection is better than model averaging. Overall, forecast weight averaging (FWA) is a better forecasting procedure in most countries. FWA also outperforms simple model averaging (SMA) and has the same forecasting ability as Bayesian model averaging (BMA) in almost all countries.
Yin, Yip Chee; Hock-Eam, Lim
This article analyzes the economic impact of price forecast errors on the optimal operation schedules of distributed (battery) storage systems. The presented simulation model extends a linear optimization model that achieves up to 17% annual savings for a storage system in an environment with dynamically changing electricity prices and under the assumptions of ex-ante known load and price data. The
Klaus-Henning Ahlert; Carsten Block
Prediction tools can make wind energy competitive in a liberalized energy market, where deviations in production carry a penalty which is usually an obstacle for developers to access the energy market. This paper analyses these deviations and the economic impact on any developer according to the actual pricing policy. This is made through simulations based on
I. Marti; M. J. San Isidro; M. Gastón; Y. Loureiro; J. Sanz; I. Pérez
A large literature in exchange rate economics has investigated the forecasting performance of empirical exchange rate models using conventional point forecast accuracy criteria. However, in the context of managing exchange rate risk, interest centers on more than just point forecasts. This paper provides a formal evaluation of recent exchange rate models based on the term structure of forward exchange rates,
Lucio Sarno; Giorgio Valente
Offshore wind power projects are critically reliant on accurate wind resource assessment, and large offshore wind farm operations require timely weather forecasts. Most often, however, offshore measurements are scarce, and conventional 6-hourly large-scale weather model outputs with a typical resolution of 0.5° × 0.5° are rather limited in application. The lack of measurement and the need for
Jiri Beran; Barbara Jimenez; Abha Sood
The information contained in one model's forecast compared to that in another can be assessed from a regression of actual values on predicted values from the two models. The authors do this for forecasts of real GNP growth rates for different pairs of models. The models include a structural model (the Fair model), various versions of the vector autoregressive model,
Ray C Fair; Robert J Shiller
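The test described above has a compact form: regress the actual values on the two models' forecasts and see which coefficients are distinguishable from zero. The sketch below does the two-regressor OLS by hand on invented data; the published procedure also includes a constant and formal inference.

```python
# Forecast-information regression in miniature: actual = b_a * model_a +
# b_b * model_b. A coefficient near zero says that model adds no
# information beyond the other. All series are invented.

def ols2(y, x1, x2):
    """Two-regressor OLS (no intercept) via the normal equations."""
    s11 = sum(a * a for a in x1)
    s22 = sum(b * b for b in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    sy1 = sum(a * c for a, c in zip(x1, y))
    sy2 = sum(b * c for b, c in zip(x2, y))
    det = s11 * s22 - s12 * s12
    return ((s22 * sy1 - s12 * sy2) / det, (s11 * sy2 - s12 * sy1) / det)

actual  = [2.0, 3.1, 1.2, 4.0, 2.5, 3.3]   # "real GNP growth"
model_a = [2.1, 3.0, 1.0, 3.8, 2.6, 3.4]   # tracks actual closely
model_b = [2.5, 2.5, 2.6, 2.4, 2.5, 2.5]   # nearly constant "forecast"

beta_a, beta_b = ols2(actual, model_a, model_b)
```

Here the informative model earns a coefficient near one and the near-constant forecast earns one near zero — the pattern the authors look for across pairs of structural and autoregressive models.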
Different from conventional studies that develop reservoir operation models and treat forecasts as input to obtain operation decisions case by case, this study develops a hydro-economic analysis framework and derives some general relationships between optimal flood control decisions and streamflow forecasts. By analogy with the hedging rule theory for water supply, we formulate reservoir flood control as a two-stage optimization model, in which the properties of flood damage (i.e., diminishing marginal damage) and the characteristics of forecast uncertainty (i.e., the longer the forecast horizon, the larger the forecast uncertainty) are incorporated to minimize flood risk. We define flood conveying capacity surplus (FCCS) variables to elaborate the trade-offs between the release of the current stage (i.e., stage 1) and the release of the future stage (i.e., stage 2). Using Karush-Kuhn-Tucker conditions, the flood risk trade-off between the two stages is theoretically represented and illustrated by three typical situations depending on forecast uncertainty and flood magnitude. The analytical results also show some complicated effects of forecast uncertainty and flood magnitude on real-time flood control decisions: 1) when there is a big flood with a small FCCS, the whole FCCS should be allocated to the current stage to hedge against the more certain and urgent flood risk in the current stage; 2) when there is a medium flood with a moderate FCCS, some FCCS should be allocated to the future stage but more FCCS still should be allocated to the current stage; and 3) when there is a small flood with a large FCCS, more FCCS should be allocated to the future stage than to the current stage, as a large FCCS in the future stage can still induce some flood risk (the distribution of future-stage forecast uncertainty is more dispersed) while a moderate FCCS in the current stage induces only a small risk.
Moreover, this study presents a hypothetical case study to analyze the flood risk under pseudo-probabilistic streamflow forecast (pPSF, deterministic forecast with variance) and real probabilistic streamflow forecast (rPSF, ensemble forecast) uncertainties, which shows that ensemble forecast techniques are more efficient at mitigating flood risk.
Zhao, T.; Zhao, J.; Cai, X.; Yang, D.
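A toy version of the two-stage trade-off makes the hedging logic tangible: split a fixed flood-conveying-capacity surplus S between a known current-stage flood and an uncertain future-stage ensemble so that total expected damage, taken here to be quadratic in any overflow, is smallest. The inflows, damage function, and ensemble are invented; the paper derives the split analytically via KKT conditions rather than by grid search.

```python
# Two-stage FCCS allocation sketch: grid-search the split of capacity S
# that minimizes known stage-1 damage plus expected stage-2 damage over
# an equally-likely inflow ensemble.

def expected_damage(capacity, inflows):
    """Mean quadratic damage from overflow across inflow scenarios."""
    return sum(max(q - capacity, 0.0) ** 2 for q in inflows) / len(inflows)

def best_split(S, stage1_inflow, stage2_ensemble, grid=100):
    best = None
    for i in range(grid + 1):
        s1 = S * i / grid
        total = expected_damage(s1, [stage1_inflow]) \
              + expected_damage(S - s1, stage2_ensemble)
        if best is None or total < best[1]:
            best = (s1, total)
    return best[0]

S = 10.0
stage1_inflow = 6.0                      # current-stage flood (known)
stage2_ensemble = [2.0, 5.0, 8.0, 11.0]  # dispersed future-stage forecast

s1 = best_split(S, stage1_inflow, stage2_ensemble)
```

The optimum deliberately leaves some overflow in the certain current stage in order to hedge the dispersed future stage — the diminishing-marginal-damage trade-off the analytical results formalize.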
Dollar losses beyond the farm gate resulting from the entry and establishment of an exotic crop pest may far exceed the direct losses farmers incur. This case study uses an econometric-simulation model to estimate the benefits to U.S. agriculture of preve...
F. Kuchler M. Duffy
Economic forecasting in the world of international finance confronts economists with challenging cross-cultural writing tasks. Producing forecasts in English which convey confidence and credibility entails an understanding of linguistic conventions which typify the genre. A typical linguistic feature of commercial economic forecasts produced by…
Donohue, James P.
In a hydrologic basin where precipitation rates have strong seasonal characteristics, simple seasonal forecasts of rainfall along with regression analysis on a few related meteorological observations can be used to obtain an estimate of the anticipated rainfall rate one day in advance. In this paper a model for forecasting daily rainfall with one day lead time is presented. The model uses smoothed normal daily rainfall rates as seasonal forecasts and a linear regression model on deviations of atmospheric pressure, temperature, and humidity from their seasonal mean values for estimating departures from seasonal rainfall forecasts. The model has been applied to and tested with daily rainfall data of Dhaka (Dacca) city in Bangladesh.
Wasimi, Saleh A.
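The model's two additive parts can be sketched directly: a smoothed seasonal normal supplies the base forecast, and a linear regression on deviations of pressure, temperature, and humidity from their seasonal means supplies the departure. The coefficients and observations below are hypothetical, not the fitted Dhaka values.

```python
# One-day-ahead rainfall sketch: seasonal base plus a regression on
# meteorological deviations, floored at zero (rainfall cannot be negative).

def forecast_rain(seasonal_normal, coeffs, deviations):
    """Seasonal normal plus linear departure term, floored at 0."""
    departure = sum(c * d for c, d in zip(coeffs, deviations))
    return max(seasonal_normal + departure, 0.0)

seasonal_normal = 12.0        # mm/day, smoothed normal for this calendar date
coeffs = [-0.8, 0.5, 0.3]     # pressure, temperature, humidity weights (invented)
deviations = [-2.0, 1.0, 5.0] # today's departures from the seasonal means

tomorrow = forecast_rain(seasonal_normal, coeffs, deviations)
```

Falling pressure and above-normal humidity push the forecast above the seasonal normal, which is exactly the kind of departure the regression term is meant to capture.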
A demonstration experiment is being planned to show that frost and freeze prediction improvements are possible utilizing timely Synchronous Meteorological Satellite temperature measurements and that this information can affect Florida citrus grower operations and decisions. An economic experiment was carried out which will monitor citrus growers' decisions, actions, costs and losses, and meteorological forecasts and actual weather events and will establish the economic benefits of improved temperature forecasts. A summary is given of the economic experiment, the results obtained to date, and the work which still remains to be done. Specifically, the experiment design is described in detail as are the developed data collection methodology and procedures, sampling plan, data reduction techniques, cost and loss models, establishment of frost severity measures, data obtained from citrus growers, National Weather Service, and Federal Crop Insurance Corp., resulting protection costs and crop losses for the control group sample, extrapolation of results of control group to the Florida citrus industry and the method for normalization of these results to a normal or average frost season so that results may be compared with anticipated similar results from test group measurements.
The Office of the Actuary, U.S. Social Security Administration, produces alternative forecasts of mortality to reflect uncertainty about the future. Appropriate probabilistic interpretations of the intervals have not been provided, however, because explicit stochastic models are not used. In this article we identify the components and assumptions of the official forecasts and approximate them by stochastic parametric models. We estimate
Juha M. Alho; Bruce D. Spencer
Differing decision models and operational characteristics affecting the economic expenses (i.e., the costs of protection and losses suffered if no protective measures have been taken) associated with the use of predictive weather information have been examined.
Carter, G. M.
Spatiotemporal data mining is the extraction of unknown and implicit knowledge, structures, spatiotemporal relationships, or patterns not explicitly stored in spatiotemporal databases. As one data mining technique, forecasting is widely used to predict the unknown future based upon the patterns hidden in current and past data. To achieve spatiotemporal forecasting, mature analysis tools such as time series and spatial statistics are extended to the spatial dimension and the temporal dimension, respectively. Drought forecasting plays an important role in the planning and management of natural resources and water resource systems in a river basin. Early and timely forecasting of a drought event can help in taking proactive measures and setting out drought mitigation strategies to alleviate the impacts of drought. Despite the widespread application of nonlinear mathematical models, comparative studies on spatiotemporal drought forecasting using different models remain a huge task for modellers. This study uses a promising approach, the Gamma Test (GT), to select the input variables and the training data length, so that the trial-and-error workload can be greatly reduced. The GT makes it possible to quickly evaluate and estimate the best mean squared error that can be achieved by a smooth model on any unseen data for a given selection of inputs, prior to model construction. The GT is applied to forecast droughts using monthly Standardized Precipitation Index (SPI) time series at multiple timescales from several precipitation stations in the Pinios river basin in the Thessaly region, Greece. Several nonlinear models have been developed efficiently, with the aid of the GT, for 1-month up to 12-month ahead forecasting. Several temporal and spatial statistical indices were considered for the performance evaluation of the models.
The predicted results show reasonably good agreement with the actual data for short lead times, whereas the forecasting accuracy decreases with increasing lead time. Finally, the developed nonlinear models could be used in an early warning system for risk and decision analyses in the study area.
Vasiliades, Lampros; Loukas, Athanasios
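The Gamma Test itself is short enough to sketch: for each input point take its k-th nearest neighbours, plot half the mean squared output difference (gamma) against the mean squared input distance (delta) for k = 1..K, and read the intercept of the fitted line as an estimate of the best achievable mean squared error. The tiny noisy-linear dataset below is fabricated purely to show the mechanics.

```python
# Gamma Test sketch: near-neighbour (delta, gamma) pairs and the intercept
# of their least-squares line, which estimates the noise variance.

def gamma_test(X, y, kmax=3):
    n = len(X)
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    deltas, gammas = [], []
    for k in range(1, kmax + 1):
        d_sum = g_sum = 0.0
        for i in range(n):
            order = sorted((j for j in range(n) if j != i),
                           key=lambda j: dist(X[i], X[j]))
            nn = order[k - 1]                  # k-th nearest neighbour of i
            d_sum += dist(X[i], X[nn])
            g_sum += 0.5 * (y[i] - y[nn]) ** 2
        deltas.append(d_sum / n)
        gammas.append(g_sum / n)
    # Fit gamma = a*delta + b; the intercept b is the Gamma statistic.
    db = sum(deltas) / kmax
    gb = sum(gammas) / kmax
    a = sum((d - db) * (g - gb) for d, g in zip(deltas, gammas)) / \
        sum((d - db) ** 2 for d in deltas)
    return gb - a * db

X = [(0.1,), (0.3,), (0.5,), (0.7,), (0.9,), (1.1,), (1.3,), (1.5,)]
y = [0.22, 0.61, 1.02, 1.41, 1.79, 2.21, 2.62, 2.98]  # ~2x plus small noise

gamma = gamma_test(X, y)
```

Because the underlying relation is smooth and the added noise is tiny, the intercept comes out near zero, signalling that a smooth model could fit these inputs almost perfectly — the same pre-modelling screening the study applies to candidate SPI inputs.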
The National Weather Service (NWS) River Forecast Centers (RFCs) issue deterministic river stage forecasts for 110 locations across the Northeast USA. Nevertheless, uncertainty information can be as important as the forecast itself for forecast users. This paper presents a conditional characterization of river stage values given each forecast value through a four-parameter skewed t distribution, with each parameter modeled as a function of the point forecast value and the 1-day-ago observed value. The model was applied to nine years of daily observed stage values in the warm season and matching 6-hour-lead forecasts at the Plymouth station on the Pemigewasset River in New Hampshire. For each point forecast value, the conditional distribution and resulting prediction intervals provide uncertainty information that is potentially very important to forecast users and algorithm developers in decision making and in improving forecast quality.
Yan, J.; Liao, G.; Gebremichael, M.; Shedd, R.; Vallee, D.
Weather forecasters, particularly those in broadcasting, are the primary conduit to the public for information on climate and climate change. However, many weather forecasters remain skeptical of model-based climate projections. To address this issue, The COMET Program developed an hour-long online lesson on how climate models work, targeting an audience of weather forecasters. The module draws on forecasters' pre-existing knowledge of weather, climate, and numerical weather prediction (NWP) models. In order to measure learning outcomes, quizzes were given before and after the lesson. Preliminary results show large learning gains. For all people who took both pre- and post-tests (n=238), scores improved from 48% to 80%. Similar pre/post improvement occurred for National Weather Service employees (51% to 87%, n=22) and college faculty (50% to 90%, n=7). We believe these results indicate a fundamental misunderstanding among many weather forecasters of (1) the difference between weather and climate models, (2) how researchers use climate models, and (3) how researchers interpret model results. The quiz results indicate that efforts to educate the public about climate change need to include weather forecasters, a vital link between the research community and the general public.
Bol, A.; Kiehl, J. T.; Abshire, W. E.
The aim of this study is the monitoring, mapping, and forecasting of pollen distribution for the city of Rome using in-situ measurements of 10 species of common allergenic pollens and measurements of PM10. The production of daily concentration maps, associated with a mobile phone app, is innovative compared to existing services dedicated to people who suffer from respiratory allergies. Pollen dispersal is one of the most well-known causes of allergic disease, which is manifested by disorders of the respiratory functions. Allergies are the third leading cause of chronic disease, and it is estimated that tens of millions of people in Italy suffer from them. Recent work reveals that during the last few years there has been a progressive increase in affected subjects, especially in urban areas. This situation may depend on the transport of pollutants, on reactions between pollutants and pollen, and on a combination of other irritants existing in densely populated and polluted urban areas. The methodology used to produce the maps is based on in-situ measurement time series for 2012, obtained from networks of air quality and pollen stations in the metropolitan area of Rome. The aerobiological monitoring station of the University of Rome "Tor Vergata" is located at the Department of Biology. The instrument used for pollen monitoring is a volumetric sampler of the Hirst type (Hirst 1952), Model 2000 VPPS Lanzoni; data acquisition is carried out as specified in Standard UNI 11008:2004 ("Air quality - Method for sampling and counting of airborne pollen grains and fungal spores"), the protocol that describes the procedure for measuring the concentration of pollen grains and fungal spores dispersed in the atmosphere, and as reported in the "Manuale di gestione e qualità della R.I.M.A." (Travaglini et al. 2009). All 10 allergenic pollens have been monitored since 1996.
At Tor Vergata university is also operating a meteorological station (SP2000, CAE Bologna, Italy). With pollen and meteorological dataset was created a provisional model for Poaceae. A PLSDA (Partial Least Squares Discriminant Analysis) approach was used in order to predict Poaceae pollen critical concentration (Brighetti et al. 2013) To preserve spatial correlation between pollens and PM10, we choose a Multiavariate Linear Spatial Interpolation Method to quantify pollen concentration in function of PM10, wind, rain and temperature. A test and validation procedure have been conducted to estimate the error associated to the pollen concentration. Validation for the year 2012 shows a good agreement between measured and estimated data , in each area depending of orography and of road traffic (r >0.83, 1%< RRMSE <5% ). This study aims to be a added value to agro-meteorological data in a different branch from the classic sector of defence and of crop production, emphasizing the importance of monitoring and forecast the pollen dispersal in urban areas, evaluated its effect on health and quality of life. In the health area the combined analysis between climate, pollution and dispersal of pollen allows to realize significant operational tools and to develop a reference for subsequent implementations.
Costantini, Monica; Di Giuseppe, Fabio; Medaglia, Carlo Maria; Travaglini, Alessandro; Tocci, Raffaella; Brighetti, M. Antonia; Petitta, Marcello
Through the Water Information Research and Development Alliance (WIRADA) project, CSIRO is conducting research to improve flood and short-term streamflow forecasting services delivered by the Australian Bureau of Meteorology. WIRADA aims to build and test systems to generate ensemble flood and short-term streamflow forecasts with lead times of up to 10 days by integrating rainfall forecasts from Numerical Weather Prediction (NWP) models and hydrological modelling. Here we present an overview of the latest progress towards developing this system. Rainfall during the forecast period is a major source of uncertainty in streamflow forecasting. Ensemble rainfall forecasts are used in streamflow forecasting to characterise the rainfall uncertainty. In Australia, NWP models provide forecasts of rainfall and other weather conditions for lead times of up to 10 days. However, rainfall forecasts from Australian NWP models are deterministic and often contain systematic errors. We use a simplified Bayesian joint probability (BJP) method to post-process rainfall forecasts from the latest generation of Australian NWP models. The BJP method generates reliable and skilful ensemble rainfall forecasts. The post-processed rainfall ensembles are then used to force a semi-distributed conceptual rainfall-runoff model to produce ensemble streamflow forecasts. The performance of the ensemble streamflow forecasts is evaluated on a number of Australian catchments and the benefits of using post-processed rainfall forecasts are demonstrated.
Shrestha, D. L.; Robertson, D.; Bennett, J.; Ward, P.; Wang, Q. J.
Fluctuations in the oil price bear on energy security and the economy, so crude oil price forecasting is important. In this paper, we apply an improved model based on the wavelet transform and a radial basis function (RBF) neural network to forecast future oil prices. The wavelet transform decomposes the original price series, which is used as the output layer of the RBF
Wu Qunli; Hao Ge; Cheng Xiaodong
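A minimal sketch of the wavelet-plus-RBF idea described above. Everything here is an illustrative stand-in: the abstract does not specify the wavelet basis (a one-level Haar decomposition is used), the price series is invented, and a Nadaraya-Watson kernel smoother takes the place of a trained RBF network:

```python
import math

def haar_decompose(series):
    """One-level Haar wavelet decomposition: pairwise approximation and detail."""
    approx = [(series[i] + series[i + 1]) / 2 for i in range(0, len(series) - 1, 2)]
    detail = [(series[i] - series[i + 1]) / 2 for i in range(0, len(series) - 1, 2)]
    return approx, detail

def rbf_predict(train_x, train_y, x, gamma=1.0):
    """RBF-kernel-weighted prediction (Nadaraya-Watson style smoother)."""
    weights = [math.exp(-gamma * (x - xi) ** 2) for xi in train_x]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, train_y)) / total

# hypothetical oil prices (USD/barrel)
prices = [70.0, 72.0, 71.0, 75.0, 74.0, 78.0, 77.0, 80.0]
approx, detail = haar_decompose(prices)

# forecast the next approximation value from lag-1 pairs of its own history
xs, ys = approx[:-1], approx[1:]
next_approx = rbf_predict(xs, ys, approx[-1])
print(round(next_approx, 2))
```

In a fuller implementation each decomposed component would be forecast separately and the forecasts recombined; here only the smooth (approximation) component is extrapolated.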
Data from weather satellites have become integral to the weather forecast process in the United States and abroad. Satellite data are used to derive improved forecasts for short-term routine weather, long-term climate change, and for predicting natural disasters. The resulting forecasts have saved lives, reduced weather-related economic losses, and improved the quality of life. Weather information routinely assists in managing
Henry R. Hertzfeld; Ray A. Williamson; Avery Sen
The NASA Forecast Model WMS (NFMW) provides on-demand visualizations of Earth science data. The current usage focuses on field campaigns and other projects that use the output of the Goddard Earth Observing System (GEOS) and Weather Research Framework (WRF) models, but other models can be supported. The NFMW implements the Open Geospatial Consortium (OGC) Web Map Service (WMS) interoperability specification.
J. de La Beaujardière
Fog has a significant impact on economic activity (traffic management and safety) as well as on environmental issues (a fresh-water source for the population and the biosphere in arid regions). However, reliable fog and visibility forecasting remains a challenging issue. Fog is generally a small-scale phenomenon affected mostly by local advective transport, radiation, topography, vegetation, turbulent mixing at the surface, and its microphysical structure. To account for these intertwined processes, a three-dimensional fog forecast model, COSMO-FOG, with high vertical resolution and varying microphysical complexity has been developed. The model includes a microphysical parameterisation based on a one-dimensional fog forecast model. The implementation of cloud water droplets as a new prognostic variable allows a detailed treatment of sedimentation processes and of variations in visibility. Moreover, the turbulence scheme, based on a Mellor-Yamada level-2.5 second-order closure, has been modified to improve the model's behaviour in a stably stratified atmosphere, which typically occurs during nocturnal radiative fog episodes. The potential of COSMO-FOG will be presented for some realistic fog situations (flat, bumpy and complex terrain). The forecast fog spatial extent will be compared with MSG satellite products for fog and low cloud. The interplay between dynamical and thermodynamical patterns and soil-atmosphere interactions will also be presented.
This case study and generalization quantify benefits made possible through improved weather forecasting resulting from the integration of SEASAT data into local weather forecasts. The major source of avoidable economic losses to shipping from inadequate weather forecasting data is shown to be dependent on local precipitation forecasting. The ports of Philadelphia and Boston were selected for study.
We examine the economic benefits of using realized volatility to forecast future implied volatility for pricing, trading, and hedging in the S&P 500 index options market. We propose an encompassing regression approach to forecast future implied volatility, and hence future option prices, by combining historical realized volatility and current implied volatility. Although the use of realized volatility results in superior
Wing Hong Chan; Ranjini Jha; Madhu Kalimipalli
This module describes model parameterizations of sub-surface, boundary-layer, and free atmospheric processes, such as surface snow processes, soil characteristics, vegetation, evapotranspiration, PBL processes and parameterizations, and trace gases, and their interaction with the radiative transfer process. It specifically addresses how models treat these physical processes and how they can influence forecasts of sensible weather elements.
A strategy for building models for an observed time series is presented in this paper. We seek to fit time domain models which can be interpreted in terms of trend and seasonal components, provide forecasts, and provide spectral estimators. Our time series...
Benefit-cost relationships for the development of meteorological satellites are outlined. The weather forecast capabilities of the various weather satellites (Tiros, SEOS, Nimbus) are discussed, and the development of additional satellite systems is examined. A rational approach is developed that leads to the establishment of the economic benefits which may result from the utilization of meteorological satellite data. The economic and social impacts of improved weather forecasting for industries and resources management are discussed, and significant weather-sensitive industries are listed.
Bhattacharyya, R.; Greenberg, J.
The article considers dynamic processes involving non-linear power-law behavior in such apparently diverse spheres as demographic dynamics and the dynamics of prices of highly liquid commodities such as oil and gold. All the respective variables exhibit features of explosive growth containing precursors that indicate approaching phase transitions/catastrophes/crises. The first part of the article analyzes mathematical models of demographic dynamics that describe various scenarios of demographic development in the post-phase-transition period, including a model that takes the limits of the Earth's carrying capacity into account. This model points to a critical point in the early 2050s, when the world population, after reaching its maximum value, may decrease, stabilizing afterwards at a certain stationary level. The article presents an analysis of the influence of the demographic transition (directly connected with the hyperexponential growth of the world population) on global socioeconomic and geopolitical development. The second part deals with the phenomenon of explosive growth in the prices of such highly liquid commodities as oil and gold. It is demonstrated that at present the respective processes can be regarded as precursors of waves of the global financial-economic crisis that will demand a change of the current global economic and political system. It is also shown that the moments of the start of the first and second waves of the current global crisis could have been forecast with a model of accelerating log-periodic fluctuations superimposed on a power-law trend with a finite-time singularity, developed by Didier Sornette and collaborators. With respect to oil prices, it is shown that it was possible to forecast the 2008 crisis with a precision of up to a month as early as 2007.
The gold price dynamics was used to calculate the possible time of the start of the second wave of the global crisis (July-August 2011); note that this forecast has turned out to be quite correct.
Akaev, A.; Sadovnichy, V.; Korotayev, A.
Any research or policy analysis in economics must be consistent with the time-series properties of observed macroeconomic data. Numerous previous studies of such time series reinforce the need to specify correctly a model's multivariate stochastic structure. This paper discusses in detail the specification of a vector error correction forecasting model that is anchored by long-run equilibrium relationships suggested by economic
Richard G. Anderson; Dennis L. Hoffman; Robert H. Rasche
Research conducted at the Polish Institute of Meteorology and Water Management, National Research Institute, in collaboration with the Consortium for Small Scale Modeling (COSMO), is aimed at developing a new conservative dynamical core for a next-generation operational weather prediction model. Within the frame of the project a new prototype model has been developed. The dynamical core of the model is based on an anelastic set of equations and numerics adopted from the EULAG model. Employing EULAG made it possible to profit from its desirable conservative properties and numerical robustness, confirmed in a number of benchmark tests and widely documented in the scientific literature. The first stage of the project has already been successfully completed. Its main achievement is a hybrid model capable of computing weather forecasts. The model consists of the EULAG dynamical core implemented into the software environment of the operational COSMO model, together with basic COSMO physical parameterizations for turbulence, friction, radiation, moist processes and surface fluxes (COSMO-EULAG). The presentation shows case studies comparing results of 24-hour forecasts calculated with the hybrid model against analogous results obtained with the Runge-Kutta dynamical core standard in COSMO operational applications. The experiments are performed at 2.2 km resolution over the Alpine domain of operational MeteoSwiss numerical forecasts. The results demonstrate that the short-term forecasts employing the different dynamical cores are qualitatively and quantitatively similar, especially in the middle and upper troposphere. Near the surface the COSMO-EULAG results, while similar to the Runge-Kutta ones, show more small-scale variability. It is seen that the anelastic approximation does not impose measurable adverse effects on the forecast. The presentation also shows results of another class of experiments.
They involve 24-hour forecasts with COSMO-EULAG over a realistic Alpine domain at horizontal resolutions of 1.1 and 0.55 km, employing non-filtered orography calculated for each of these resolutions from the SRTM data. The results show a dependence of the forecast flow structure on model resolution, not only for surface features but also for the structure of the upper-level flow, and especially the structure of the jet stream over the Alpine area. The results also document the numerical robustness of the COSMO-EULAG dynamical core, which at the horizontal resolution of 0.55 km copes with Alpine slopes reaching 56 degrees of inclination.
Wójcik, Damian; Kurowski, Marcin; Piotrowski, Zbigniew; Rosa, Bogdan; Ziemiański, Michał
The long term goals of this report are: (1) Develop a global ionospheric and upper atmospheric forecasting model over a range of spatial and temporal scales with space and ground-based data assimilation capability; and (2) Develop an improved understandin...
M. J. Keskinen
The planetary geomagnetic Kp index (3-hour average recorded every 3 hours) exhibits a high degree of correlation from one value to the next. In fact, a simple persistence model that forecasts the next 3-hr value as being equal to the current value shows a...
C. J. Wetterer K. Scro M. Jah
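The persistence baseline described in the abstract is trivial to state in code: each 3-hour value is forecast to equal the current one. The values below are hypothetical, not actual Kp data:

```python
def persistence_forecast(series):
    """Forecast each value as equal to the previous observation."""
    return series[:-1]  # the forecast for step t+1 is the value at step t

def mean_abs_error(forecast, actual):
    """Average absolute forecast error over all verification times."""
    return sum(abs(f - a) for f, a in zip(forecast, actual)) / len(actual)

# illustrative 3-hourly Kp-like values (hypothetical)
kp = [2.0, 2.3, 2.7, 3.0, 4.3, 3.7, 3.0, 2.3]
forecast = persistence_forecast(kp)
actual = kp[1:]
print(round(mean_abs_error(forecast, actual), 3))
```

Any proposed Kp predictor must beat this baseline to demonstrate skill, which is why the abstract frames persistence as the reference model.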
A service called the Optimum Path Aircraft Routing System (OPARS) supplies products based on output data from the Naval Oceanographic Global Atmospheric Prediction System (NOGAPS), a model run on a Cyber-205 computer. Temperatures and winds are extracted from the surface to 100 mb, approximately 55,000 ft. Forecast winds are available in six-hour time steps.
Garthner, John P.
Streamflow forecasts are dynamically updated in real-time, thus facilitating a process of forecast uncertainty evolution. Forecast uncertainty generally decreases over time and as more hydrologic information becomes available. The process of forecasting and uncertainty updating can be described by the martingale model of forecast evolution (MMFE), which formulates the total forecast uncertainty of a streamflow in one future period as the sum of forecast improvements in the intermediate periods. This study tests the assumptions, i.e., unbiasedness, Gaussianity, temporal independence, and stationarity, of MMFE using real-world streamflow forecast data. The results show that (1) real-world forecasts can be biased and tend to underestimate the actual streamflow, and (2) real-world forecast uncertainty is non-Gaussian and heavy-tailed. Based on these statistical tests, this study proposes a generalized martingale model (GMMFE) for the simulation of biased and non-Gaussian forecast uncertainties. The new model combines the normal quantile transform (NQT) with MMFE to formulate the uncertainty evolution of real-world streamflow forecasts. Reservoir operations based on a synthetic forecast by GMMFE illustrate that applications of streamflow forecasting facilitate utility improvements and that special attention should be focused on the statistical distribution of forecast uncertainty.
Zhao, Tongtiegang; Zhao, Jianshi; Yang, Dawen; Wang, Hao
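The MMFE structure, in which the forecast of a future flow equals the actual value plus the sum of the yet-unrealized forecast improvements, can be sketched as follows. The flow value, horizon, and error scale are hypothetical, and the zero-mean independent Gaussian improvements encode exactly the classical assumptions the paper tests (and relaxes in GMMFE):

```python
import random

def simulate_mmfe(actual, horizon, sigma, seed=0):
    """Simulate MMFE: the forecast of a future flow is the actual value plus
    the sum of the zero-mean, independent improvements not yet realized."""
    rng = random.Random(seed)
    improvements = [rng.gauss(0.0, sigma) for _ in range(horizon)]
    forecasts = []
    for lead in range(horizon, 0, -1):
        # forecast made `lead` periods ahead: `lead` improvements still unknown
        forecasts.append(actual + sum(improvements[:lead]))
    forecasts.append(actual)  # at lead 0 the flow is observed exactly
    return forecasts

# hypothetical flow of 100 units forecast from 5 periods out down to observation
path = simulate_mmfe(actual=100.0, horizon=5, sigma=4.0)
print([round(f, 1) for f in path])
```

Because each improvement is independent with variance sigma**2, the forecast error variance at lead L is L * sigma**2, i.e., uncertainty shrinks as the forecast is updated, which is the property the paper's statistical tests probe on real data.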
A planning workshop on "Modeling, Simulation and Forecasting of Subseasonal Variability" was held in June 2003. This workshop was the first of a number of meetings planned to follow the NASA-sponsored workshop entitled "Prospects For Improved Forecasts Of Weather And Short-Term Climate Variability On Sub-Seasonal Time Scales" that was held in April 2002. The 2002 workshop highlighted a number of key sources of unrealized predictability on subseasonal time scales, including tropical heating, soil wetness, the Madden-Julian Oscillation (MJO) [a.k.a. Intraseasonal Oscillation (ISO)], the Arctic Oscillation (AO) and the Pacific/North American (PNA) pattern. The overarching objective of the 2003 follow-up workshop was to proceed with a number of recommendations made at the 2002 workshop, as well as to set an agenda and collate efforts in the areas of modeling, simulation and forecasting of intraseasonal and short-term climate variability. More specifically, the aims of the 2003 workshop were to: 1) develop a baseline of the "state of the art" in subseasonal prediction capabilities, 2) implement a program to carry out experimental subseasonal forecasts, and 3) develop strategies for tapping the above sources of predictability by focusing research, model development, and the development/acquisition of new observations on the subseasonal problem. The workshop was held over two days and was attended by over 80 scientists, modelers, forecasters and agency personnel. The agenda of the workshop focused on issues related to the MJO and tropical-extratropical interactions as they relate to the subseasonal simulation and prediction problem.
This included the development of plans for a coordinated set of GCM hindcast experiments to assess current model subseasonal prediction capabilities and shortcomings, an emphasis on developing a strategy to rectify shortcomings associated with tropical intraseasonal variability, namely diabatic processes, and continuing the implementation of an experimental forecast and model development program that focuses on one of the key sources of untapped predictability, namely the MJO. The tangible outcomes of the meeting included: 1) the development of a recommended framework for a set of multi-year ensembles of 45-day hindcasts to be carried out by a number of GCMs so that they can be analyzed in regards to their representations of subseasonal variability, predictability and forecast skill, 2) an assessment of the present status of GCM representations of the MJO and recommendations for future steps to take in order to remedy the remaining shortcomings in these representations, and 3) a final implementation plan for a multi-institute/multi-nation Experimental MJO Prediction Program.
Waliser, Duane; Schubert, Siegfried; Kumar, Arun; Weickmann, Klaus; Dole, Randall
The behavior of multiple CGCMs in long simulations is investigated as a cause of forecast error in short-term forecasts with respect to lead time. The main analysis focuses on the CFS, which has 9-month forecast integrations for all 12 calendar months. The SINTEX, SNU and UKMO models are analyzed as well, since they provide both control simulations longer than 50 years and 23-year forecasts. For the ENSO forecasts in CFS, a constant phase shift with respect to lead month is clear in monthly forecast composite data. This feature is related to model properties, namely a long life cycle with a summer peak that differs from observations, as shown in the long-run case. For the other models, the systematic errors in the long run - for example, mean bias, phase shift, weak amplitude, and a wrong seasonal cycle - are reflected in the forecast skill as a major factor limiting predictability. In addition, the gradual decline of forecast skill due to model error is also shown in Indian monsoon predictability associated with ENSO. Accordingly, the influence of coupled-model errors on real forecasts is an important factor degrading predictability after the influence of the initial conditions fades with lead time. Therefore, investigating model behavior in long simulations is one key to understanding forecast error and its potential correction.
Jin, E. K.; Kinter, J. L.; Kug, J.; Wang, B.
Macroeconomic policy decisions in real-time are based on the assessment of current and future economic conditions. Crucially, these assessments are made difficult by the presence of incomplete and noisy data. The problem is more acute for emerging market economies, where most economic data are released infrequently with a (sometimes substantial) lag. This paper evaluates nowcasts and forecasts of real GDP
Philip Liu; Troy Matheson; Rafael Romeu
The following sections are included: * Introduction * Basic Dynamical Systems: the Toy Model * Initial Condition Ensemble Forecasting * Model Error * Weather and Climate Modelling * Ways Out? * References
Bradley, Seamus; Frigg, Roman; Du, Hailiang; Smith, Leonard A.
Logistics park demand is an important basis for establishing development policies for the logistics industry and for planning logistics infrastructure. In order to improve the forecast accuracy of logistics park demand, a combination forecasting model is proposed in this paper. First, we use a grey forecast model and the exponential smoothing method to predict demand separately; then we combine the two
Chen Qin; Qi Ming
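A minimal sketch of the kind of combination described above, pairing a GM(1,1) grey model with simple exponential smoothing. The demand series, the smoothing constant, and the equal combination weights are illustrative assumptions; the paper's own weighting scheme is not reproduced here:

```python
import math

def gm11_forecast(x0, steps=1):
    """GM(1,1) grey model: fit dx1/dt + a*x1 = b on the cumulative series x1
    by least squares, then difference the fitted x1 to forecast the next x0."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]                 # accumulated series
    z1 = [(x1[i] + x1[i + 1]) / 2 for i in range(n - 1)]     # background values
    # least squares for [a, b] in x0[k+1] = -a*z1[k] + b
    m = n - 1
    sz = sum(z1)
    szz = sum(z * z for z in z1)
    sy = sum(x0[1:])
    szy = sum(z * y for z, y in zip(z1, x0[1:]))
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    fitted_x1 = lambda k: (x0[0] - b / a) * math.exp(-a * k) + b / a
    k = n + steps - 1
    return fitted_x1(k) - fitted_x1(k - 1)

def ses_forecast(series, alpha=0.4):
    """Simple exponential smoothing: the final smoothed level is the forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

# hypothetical annual logistics-park demand figures
demand = [12.0, 13.1, 14.5, 15.8, 17.4]
combined = 0.5 * gm11_forecast(demand) + 0.5 * ses_forecast(demand)
print(round(combined, 2))
```

The grey model extrapolates the growth trend while exponential smoothing stays close to the recent level; combining them (here with naive equal weights) is the basic mechanism the abstract describes.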
A predictor based on the interacting multiple model (IMM) algorithm is proposed to forecast hourly travel time index (TTI) data in this paper. This is the first time the approach has been applied to time series prediction. Seven baseline individual predictors are selected as combination components. Experimental results demonstrate that the IMM-based predictor can significantly outperform the other predictors and provide a
Yang Zhang; Yuncai Liu
Summary. In spite of widespread criticism, macroeconometric models are still most popular for forecasting and policy analysis. When the most recent data available on both the exogenous and the endogenous variables are preliminary estimates subject to a revision process, the estimators of the coefficients are affected by the presence of the preliminary data, and the projections for the exogenous variables are affected by
Giampiero M. Gallo
Volume I provides a description of forecasts of economic growth for California and its large metropolitan areas. The California economy will continue to lag the national economy until 1996. In the longer term, California will surpass the nation in growth near the end of the decade, as the State's comparative advantages in industry structure, natural and human resources, and geography
The Short-Term Integrated Forecasting System (STIFS) Demand Model consists of a set of energy demand and price models that are used to forecast monthly demand and prices of various energy products up to eight quarters in the future. The STIFS demand model is based on monthly data (unless otherwise noted), but the forecast is published on a quarterly basis. All of the forecasts are presented at the national level, and no regional detail is available. The model discussed in this report is the April 1985 version of the STIFS demand model. The relationships described by this model include: the specification of retail energy prices as a function of input prices, seasonal factors, and other significant variables; and the specification of energy demand by product as a function of price, a measure of economic activity, and other appropriate variables. The STIFS demand model is actually a collection of 18 individual models representing the demand for each type of fuel. The individual fuel models are listed below: motor gasoline; nonutility distillate fuel oil, (a) diesel, (b) nondiesel; nonutility residual fuel oil; jet fuel, kerosene-type and naphtha-type; liquefied petroleum gases; petrochemical feedstocks and ethane; kerosene; road oil and asphalt; still gas; petroleum coke; miscellaneous products; coking coal; electric utility coal; retail and general industry coal; electricity generation; nonutility natural gas; and utility petroleum. The demand estimates produced by these models are used in the STIFS integrating model to produce a full energy balance of energy supply, demand, and stock change. These forecasts are published quarterly in the Outlook. Details of the major changes in the forecasting methodology and an evaluation of previous forecast errors are presented once a year in Volume 2 of the Outlook, the Methodology publication.
In the framework of competitive electricity markets, power producers and consumers need accurate price forecasting tools. Price forecasts embody crucial information for producers and consumers when planning bidding strategies in order to maximize their benefits and utilities, respectively. This paper provides two highly accurate yet efficient price forecasting tools based on time series analysis: dynamic regression and transfer function models.
F. J. Nogales; J. Contreras; A. J. Conejo; R. Espinola
Burlando, P., Rosso, R., Cadavid, L.G. and Salas, J.D., 1993. Forecasting of short-term rainfall using ARMA models. J. Hydrol., 144: 193-211. Flood forecasting depends essentially on forecasting of rainfall or snow melt. In this paper, rainfall forecasting is approached assuming that hourly rainfall follows an autoregressive moving average (ARMA) process. This assumption is based on the fact that the autocovariance
Paolo Burlando; Renzo Rosso; Luis G. Cadavid; Jose D. Salas
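As a toy version of the idea, an AR(1) special case of the ARMA family can be fitted by least squares and used for a one-step rainfall forecast. The hourly depths below are invented for illustration, and a real application would follow the paper in identifying a fuller ARMA(p,q) structure from the autocovariance:

```python
def fit_ar1(series):
    """Least-squares fit of the AR(1) model x[t] = c + phi * x[t-1]."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(x, y))
           / sum((a - mx) ** 2 for a in x))
    c = my - phi * mx
    return c, phi

# hypothetical hourly rainfall depths (mm) over one storm event
rain = [0.0, 0.4, 1.2, 2.5, 3.1, 2.2, 1.0, 0.5, 0.2, 0.1]
c, phi = fit_ar1(rain)
one_step = c + phi * rain[-1]   # forecast for the next hour
print(round(one_step, 3))
```

The positive autoregressive coefficient captures the persistence of hourly rainfall that motivates the ARMA assumption in the abstract.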
This study used the Weather Research and Forecasting (WRF) modeling system and the Distributed Hydrology-Soil-Vegetation Model (DHSVM) to forecast snowmelt runoff in the 800 km2 Juntanghu watershed on the northern slope of the Tianshan Mountains from 29 February to 6 March 2008. The paper explores a meso-microscale snowmelt runoff forecasting model closely tied to practical application. The work included: (1) a limited-region 24-h numerical weather forecasting system established using the new-generation atmospheric model system WRF, with initial fields and lateral boundaries forced by the Chinese T213L31 model; (2) the DHSVM hydrological model, driven by the WRF forecasts, used to predict 24-h snowmelt runoff at the outlet of the Juntanghu watershed. The forecast results show good agreement with the observed data, and the average absolute relative error of the maximum runoff simulation is less than 15%. The results demonstrate the potential of a meso-microscale snowmelt runoff forecasting model for flood forecasting. The model can provide a longer forecast period than traditional models such as those based on rain gauges or statistical forecasts.
Zhao, Q.; Liu, Z.; Li, M.; Wei, Z.; Fang, S.
Forecasts of certain weather elements are improved by linearly relating observed elements to past observations, climatological information, and numerical weather prediction model output. Model output statistics (MOS) is a statistical post-processing of model output that is capable of forecasting sub-grid scale, synoptically-forced events and of correcting some systematic, but state dependent, model bias. We propose to exploit this tendency of accounting for model error by feeding MOS forecasts back into the state estimation problem. MOS forecasts and their associated uncertainty are treated as ``observations'' of the future system state and a four-dimensional variational assimilation procedure is employed to improve the original analysis and resulting model forecast. In a simple-model scenario, it is found that this approach has a small negative impact on the magnitude of forecast errors relative to MOS, but a large positive impact on the variance about the forecast errors: forecast busts are reduced. As a further step, a second round of MOS is performed on the new model forecasts in a manner identical to the original MOS approach. This second application of MOS results in a significant reduction in both the forecast errors and the variance about those errors relative to the first application of MOS.
Hansen, J. A.; Emanuel, K. A.
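The simplest form of the MOS step described above is a linear regression of observations on raw model output, applied to new forecasts. The single-predictor form and the temperature values are illustrative assumptions (operational MOS uses many predictors drawn from model output, climatology, and prior observations):

```python
def fit_mos(model_output, observed):
    """Fit the simple MOS regression obs = b0 + b1 * model_output."""
    n = len(model_output)
    mx = sum(model_output) / n
    my = sum(observed) / n
    b1 = (sum((x - mx) * (y - my) for x, y in zip(model_output, observed))
          / sum((x - mx) ** 2 for x in model_output))
    b0 = my - b1 * mx
    return b0, b1

# hypothetical raw forecast temperatures with a warm bias vs. observations
raw = [21.0, 23.5, 19.0, 25.0, 22.0]
obs = [19.5, 22.0, 17.8, 23.4, 20.6]
b0, b1 = fit_mos(raw, obs)
corrected = b0 + b1 * 24.0   # MOS-corrected forecast for a new raw value
print(round(corrected, 2))
```

The fitted slope below 1 and negative intercept remove the systematic warm bias; the paper's contribution is to feed such corrected "observations" of the future state back into a variational assimilation step.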
Recent earthquake disasters have highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep-drilling and other geophysical and geological observations, together with comprehensive modeling of earthquakes and forecasting of extreme events. Extreme earthquakes (large-magnitude, rare events) are manifestations of the complex behavior of the lithosphere, structured as a hierarchical system of blocks of different sizes. Understanding of the physics and dynamics of extreme events comes from observations, measurements and modeling. A quantitative approach to simulating earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow the study of extreme events and of the influence of fault-network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of the predictability of large earthquakes (how well can large earthquakes be predicted today?) will also be discussed, along with possibilities for the mitigation of earthquake disasters (e.g., 'inverse' forensic investigations of earthquake disasters).
Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.
The NASA Forecast Model WMS (NFMW) provides on-demand visualizations of Earth science data. The current usage focuses on field campaigns and other projects that use the output of the Goddard Earth Observing System (GEOS) and Weather Research Framework (WRF) models, but other models can be supported. The NFMW implements the Open Geospatial Consortium (OGC) Web Map Service (WMS) interoperability specification. WMS provisions for handling time and other dimensions are used extensively to support the multi-dimensional nature of the forecast data. Scientists and other interested parties access the WMS using a variety of clients including our own web-based Viewer, Google Earth, a multi-screen Hyperwall installation, or other WMS-compliant applications. We have found that offering an open-standard interface to our data collection has simplified usage of the data and has permitted users to visualize the data from remote locations using only a web browser. The NFMW and Viewer may be accessed at http://map.nasa.gov/tools.html. The NFMW software is available as open source, and is written in a combination of Perl and Interactive Data Language (IDL). This work is supported by the Geosciences Interoperability Office (GIO) and the Modeling and Analysis Program (MAP) at NASA.
de La Beaujardière, J.
In recent years there has been a considerable development in modelling nonlinearities and asymmetries in economic and financial variables. The aim of the current paper is to compare the forecasting performance of different models for the returns of three of the most traded exchange rates in terms of the U.S. dollar, namely the French franc (FF/$), the German mark (DM/$)
Gianna Boero; Emanuela Marrocu
Timely economic forecasts by means of dynamic models rely on updated time series, the last figure(s) of which are provisional and will typically be subject to a number of revisions. A general approach to the efficient use of provisional observations in dynamic models is presented, based on the state-space methodology and the Kalman filter. Suitable adaptations are introduced, chiefly involving
Silvano Bordignon; Ugo Trivellato
The Railbelt Electricity Demand (RED) Model, reported in this paper, is a simulation model designed to forecast annual electricity consumption for the residential, commercial-industrial-government and miscellaneous end-use sectors of Alaska's Railbelt region. The model also takes into account government intervention in the energy markets via conservation programs in Alaska and produces forecasts of system annual peak demand. The forecasts of
M. J. King; M. J. Scott
This paper provides an account of the performance of a multimodel ensemble for real time forecasts of Atlantic tropical cyclones during 2004, 2005 and 2006. The Florida State University (FSU) superensemble is based on a suite of model forecasts and the interpolated official forecast that were received in real time at the National Hurricane Center. The FSU superensemble is a multimodel ensemble that utilizes forecasts from the member models by removing their individual biases based on a recent past history of their performances. This superensemble carries separate statistical weights for track and intensity forecasts for every 6 h of the member model forecasts. The real time results from 2004 show an improvement of up to 15% for track forecasts and up to 11% for intensity forecasts for the superensemble compared to other models and consensus aids. During 2005, the superensemble intensity performance was best for most lead times. The consistency of the superensemble track forecasts is also illustrated for several storms of the 2004 season. The superensemble methodology produced impressive intensity forecasts for Rita and Wilma during 2005. The study shows the capability of the superensemble in predicting rapidly intensifying storms when most member models failed to capture their strengthening.
Krishnamurti, T. N.; Biswas, Mrinal K.; Mackey, Brian P.; Ellingson, Robert G.; Ruscher, Paul H.
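The bias removal and statistical weighting described in this abstract can be sketched as follows. The weights here come from an ordinary least-squares fit of bias-corrected member anomalies to observed anomalies, which is one plausible reading of the scheme, not necessarily the FSU implementation; all numbers are synthetic:

```python
import numpy as np

# Training period: rows = forecast cases, cols = member-model forecasts
train_models = np.array([[10., 12.], [11., 13.], [12., 14.], [13., 15.]])
train_obs = np.array([10.5, 11.5, 12.5, 13.5])

# Each model's anomaly about its own training mean (removes the mean bias)
anomalies = train_models - train_models.mean(axis=0)

# Least-squares weights on the bias-corrected anomalies
w, *_ = np.linalg.lstsq(anomalies, train_obs - train_obs.mean(), rcond=None)

def superensemble(new_forecasts):
    """Combine bias-corrected member anomalies about the observed mean."""
    return train_obs.mean() + (new_forecasts - train_models.mean(axis=0)) @ w

pred = superensemble(np.array([14., 16.]))
```

In the real scheme separate weight sets are carried for track and intensity at each 6 h lead time.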
The study has two aims. The first aim is to propose a family of nonlinear GARCH models that incorporate fractional integration and asymmetric power properties to MS-GARCH processes. The second purpose of the study is to augment the MS-GARCH type models with artificial neural networks to benefit from the universal approximation properties to achieve improved forecasting accuracy. Therefore, the proposed Markov-switching MS-ARMA-FIGARCH, APGARCH, and FIAPGARCH processes are further augmented with MLP, Recurrent NN, and Hybrid NN type neural networks. The MS-ARMA-GARCH family and MS-ARMA-GARCH-NN family are utilized for modeling the daily stock returns in an emerging market, the Istanbul Stock Index (ISE100). Forecast accuracy is evaluated in terms of MAE, MSE, and RMSE error criteria and Diebold-Mariano equal forecast accuracy tests. The results suggest that the fractionally integrated and asymmetric power counterparts of Gray's MS-GARCH model provided promising results, while the best results are obtained for their neural network based counterparts. Further, among the models analyzed, the models based on the Hybrid-MLP and Recurrent-NN, the MS-ARMA-FIAPGARCH-HybridMLP, and MS-ARMA-FIAPGARCH-RNN provided the best forecast performances over the baseline single regime GARCH models and further, over the Gray's MS-GARCH model. Therefore, the models are promising for various economic applications. PMID:24977200
Bildirici, Melike; Ersin, Ozgür
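The MAE, MSE, and RMSE criteria used above are standard and can be computed directly from paired forecasts and observations (values below are illustrative):

```python
import math

obs = [1.0, 2.0, 3.0, 4.0]
fcst = [1.5, 1.5, 3.5, 3.5]

errors = [f - o for f, o in zip(fcst, obs)]
mae = sum(abs(e) for e in errors) / len(errors)    # mean absolute error
mse = sum(e * e for e in errors) / len(errors)     # mean squared error
rmse = math.sqrt(mse)                              # root mean squared error
```

The Diebold-Mariano test then asks whether the loss differential between two competing forecasts has zero mean.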
We establish an on-line optimization framework to exploit weather forecast information in the operation of energy systems. We argue that anticipating the weather conditions can lead to more proactive and cost-effective operations. The framework is based on the solution of a stochastic dynamic real-time optimization (D-RTO) problem incorporating forecasts generated from a state-of-the-art weather prediction model. The necessary uncertainty information is extracted from the weather model using an ensemble approach. The accuracy of the forecast trends and uncertainty bounds are validated using real meteorological data. We present a numerical simulation study in a building system to demonstrate the developments.
Zavala, V. M.; Constantinescu, E. M.; Krause, T.; Anitescu, M.
The paper discusses different approaches to, and models for, project level cash flow forecasting. The importance of cash flow management, both at the project and at the company level is also discussed. The paper presents a resource-based computerized cash-flow forecasting model. The main issue addressed by that model is the solution of the compatibility problem caused by the different data
The Smart Water Grid (SWG) concept has emerged globally over the last decade and has gained significant recognition in South Korea. In particular, there has been growing interest in water demand forecasting and optimal pump operation, which has led to various studies on energy saving and improvement of water supply reliability. Existing water demand forecasting models fall into two groups in terms of how they model and predict time-series behavior. One group considers embedded patterns such as seasonality, periodicity and trends, and the other uses autoregressive models based on short-memory Markovian processes (Emmanuel et al., 2012). The main disadvantage of these models is that the predictability of water demand at sub-daily scales is limited because the system is nonlinear. In this regard, this study aims to develop a nonlinear ensemble model for hourly water demand forecasting which allows us to estimate uncertainties across different model classes. The proposed model consists of two parts. One is a multi-model scheme based on a combination of independent prediction models. The other is a cross-validation scheme, the bagging approach introduced by Breiman (1996), used to derive weighting factors for the individual models. The individual forecasting models used in this study are linear regression, polynomial regression, multivariate adaptive regression splines (MARS), and support vector machines (SVM). The concepts are demonstrated through application to data observed at water plants at several locations in South Korea. Keywords: water demand, nonlinear model, ensemble forecasting model, uncertainty. Acknowledgements This subject is supported by Korea Ministry of Environment as "Projects for Developing Eco-Innovation Technologies (GT-11-G-02-001-6)"
Kwon, Hyun-Han; So, Byung-Jin; Kim, Seong-Hyeon; Kim, Byung-Seop
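The weighting of independent prediction models by cross-validation performance can be sketched as follows. Inverse-error weighting is used here as one plausible choice; the paper's exact bagging-derived weights may differ, and all numbers are synthetic:

```python
import numpy as np

# Hold-out (cross-validation) RMSEs of three hypothetical base models
cv_rmse = np.array([2.0, 1.0, 4.0])

# Inverse-error weights, normalised to sum to one
w = (1.0 / cv_rmse) / (1.0 / cv_rmse).sum()

# New hourly-demand forecasts from each base model (illustrative values)
member_forecasts = np.array([105.0, 98.0, 110.0])
ensemble_forecast = float(member_forecasts @ w)
```

The best-performing base model (smallest hold-out error) dominates the combined forecast, while the others still contribute.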
Recent manned and unmanned Earth-orbital operations have suggested great promise of improved knowledge and of substantial economic and associated benefits to be derived from services offered by a space station. Proposed application areas include agriculture, forestry, hydrology, public health, oceanography, natural disaster warning, and search/rescue operations. The need for reliable estimates of economic and related Earth-oriented benefits to be realized from Earth-orbital operations is discussed and recent work in this area is reviewed. Emphasis is given to those services based on remote sensing. Requirements for a uniform, comprehensive and flexible methodology are discussed. A brief review of the suggested methodology is presented. This methodology will be exercised through five case studies which were chosen from a gross inventory of almost 400 user candidates. The relationship of case study results to benefits in broader application areas is discussed. Some management implications of possible future program implementation are included.
Summer, R. A.; Smolensky, S. M.; Muir, A. H.
AEM (Arctic Economics Model) for oil and gas was developed to provide an analytic framework for understanding the arctic area resources. It provides the capacity for integrating the resource and technology information gathered by the arctic research and development (R&D) program, measuring the benefits of alternative R&D programs, and providing updated estimates of the future oil and gas potential from arctic areas. AEM enables the user to examine field or basin-level oil and gas recovery, costs, and economics. It provides a standard set of selected basin-specified input values or allows the user to input their own values. AEM consists of five integrated submodels: the geologic/resource submodel, which distributes the arctic resource into 15 master regions, consisting of nine arctic offshore regions, three arctic onshore regions, and three southern Alaska (non-arctic) regions; the technology submodel, which selects the most appropriate exploration and production structure (platform) for each arctic basin and water depth; the oil and gas production submodel, which contains the relationship of per-well recovery as a function of field size and production decline curves by product; the engineering costing and field development submodel, which develops the capital and operating costs associated with arctic oil and gas development; and the economics submodel, which captures the engineering costs and development timing and links these to oil and gas prices, corporate taxes and tax credits, depreciation, and timing of investment. AEM provides measures of producible oil and gas, costs, and economic viability under alternative technology or financial conditions.
Reister, D.B. [Oak Ridge National Lab., TN (United States)]
The Multi-Regional climate model Ensemble Downscaling (MRED) project is a multi-institutional effort to evaluate the usefulness of dynamically downscaled global seasonal forecasts. Seven regional climate models have downscaled 10-member ensembles from the National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) for each winter season (December-April) of 1982-2003. The target region for downscaling is the continental United States. MRED investigators also have developed methods and metrics for analysis of downscaled forecasts. These include an Added Value Index that quantifies skill improvement of a downscaled forecast compared to the corresponding global forecast. Results show that added value from downscaling depends on location, forecast variable, and lead time. Locations with added value are generally in the western United States, and added value tends to be greater for precipitation than for temperature. Downscaled forecasts have greatest skill for seasonal precipitation anomalies in strong El Niño events such as 1982-83 and 1997-98. In most circumstances area averaged seasonal precipitation for the regional models closely tracks the corresponding results for the global model, though with an offset that varies considerably amongst the regional models. There is large spread amongst the 15 CFS ensemble members and this carries through to the corresponding downscaled forecasts. Because of the strong dependence of downscaled results on the global model, future experiments should test the use of multiple global models downscaled by multiple regional models.
Multi-model ensemble seasonal forecasting systems have expanded in recent years, with a dozen coupled climate models around the world being used to produce hindcasts or real-time forecasts. However, many models share similar atmospheric or oceanic components, which may result in similar forecasts. This raises questions of whether the ensemble is over-confident if we treat each model equally, or whether we can obtain an effective subset of models that retains predictability and skill as well. In this study, we use a hierarchical clustering method based on the inverse trigonometric cosine function of the anomaly correlation of pairwise model hindcasts to measure the similarities among twelve American and European seasonal forecast models. Though similarities are found between models sharing the same atmospheric component, different versions of models from the same center sometimes produce quite different temperature forecasts, which indicates that detailed physics packages such as radiation and land surface schemes need to be analyzed in interpreting the clustering result. Uncertainties in clustering for different forecast lead times also make reducing redundant models more complicated. Predictability analysis shows that a multi-model ensemble is not necessarily better than a single model, while the cluster ensemble shows consistent improvement over individual models. The eight-model cluster ensemble forecast shows comparable performance to the full twelve-model ensemble in terms of probabilistic forecast skill for accuracy and discrimination. This study also shows that models developed in the U.S. and Europe are more independent from each other, suggesting the necessity of international collaboration in enhancing multi-model ensemble seasonal forecasting.
Yuan, Xing; Wood, Eric F.
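The similarity measure described above, the inverse cosine of the pairwise anomaly correlation, plugs directly into standard hierarchical clustering. A minimal sketch with synthetic hindcast anomalies (four toy models, two of which share a common component):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
# Synthetic hindcast anomaly series for 4 models; models 0 and 1
# share a common component, models 2 and 3 are independent
base = rng.standard_normal(100)
models = np.vstack([
    base + 0.1 * rng.standard_normal(100),
    base + 0.1 * rng.standard_normal(100),
    rng.standard_normal(100),
    rng.standard_normal(100),
])

corr = np.corrcoef(models)
dist = np.arccos(np.clip(corr, -1.0, 1.0))   # arccos of anomaly correlation
np.fill_diagonal(dist, 0.0)

# Average-linkage clustering on the condensed distance matrix
Z = linkage(squareform(dist, checks=False), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
```

Highly correlated models get a small angular distance (near 0) and merge first; independent models sit near pi/2 apart.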
The study uses the RM3, the regional climate model at the Center for Climate Systems Research of Columbia University and the NASA/Goddard Institute for Space Studies (CCSR/GISS). The paper evaluates 30 48-hour RM3 weather forecasts over West Africa during September 2006, made on a 0.5° grid nested within 1° Global Forecast System (GFS) global forecasts. September 2006 was the Special Observing Period #3 of the African Monsoon Multidisciplinary Analysis (AMMA). Archived GFS initial conditions and lateral boundary conditions for the simulations from the US National Weather Service, National Oceanographic and Atmospheric Administration were interpolated four times daily. Results for precipitation forecasts are validated against Tropical Rainfall Measurement Mission (TRMM) satellite estimates and data from the Famine Early Warning System (FEWS), which includes rain gauge measurements, and forecasts of circulation are compared to reanalysis 2. Performance statistics for the precipitation forecasts include bias, root-mean-square errors and spatial correlation coefficients. The nested regional model forecasts are compared to GFS forecasts to gauge whether nesting provides additional realistic information. They are also compared to RM3 simulations driven by reanalysis 2, representing high-potential-skill forecasts, to gauge the sensitivity of results to lateral boundary conditions. Nested RM3/GFS forecasts generate excessive moisture advection toward West Africa, which in turn causes prodigious amounts of model precipitation. This problem is corrected by empirical adjustments in the preparation of lateral boundary conditions and initial conditions. The resulting modified simulations improve on the GFS precipitation forecasts, achieving time-space correlations with TRMM of 0.77 on the first day and 0.63 on the second day.
One realtime RM3/GFS precipitation forecast made at and posted by the African Centre of Meteorological Application for Development (ACMAD) in Niamey, Niger is shown.
Druyan, Leonard M.; Fulakeza, Matthew; Lonergan, Patrick; Worrell, Ruben
This paper investigates the impact and potential use of the cut-cell vertical discretisation for forecasts covering five days and climate simulations. A first indication of the usefulness of this new method is obtained by a set of five-day forecasts, covering January 1989 with six forecasts. The model area was chosen to include much of Asia, the Himalayas and Australia. The cut-cell model LMZ (Lokal Modell with z-coordinates) provides a much more accurate representation of mountains on model forecasts than the terrain-following coordinate used for comparison. Therefore we are in particular interested in potential forecast improvements in the target area downwind of the Himalayas, over southeastern China, Korea and Japan. The LMZ has previously been tested extensively for one-day forecasts on a European area. Following indications of a reduced temperature error for the short forecasts, this paper investigates the model error for five days in an area influenced by strong orography. The forecasts indicated a strong impact of the cut-cell discretisation on forecast quality. The cut-cell model is available only for an older (2003) version of the model LM (Lokal Modell). It was compared using a control model differing by the use of the terrain-following coordinate only. The cut-cell model improved the precipitation forecasts of this old control model everywhere by a large margin. An improved, more transferable version of the terrain-following model LM has been developed since then under the name CLM (Climate version of the Lokal Modell). The CLM has been used and tested in all climates, while the LM was used for small areas in higher latitudes. The precipitation forecasts of the cut-cell model were compared also to the CLM. As the cut-cell model LMZ did not incorporate the developments for CLM since 2003, the precipitation forecast of the CLM was not improved in all aspects. 
However, for the target area downstream of the Himalayas, the cut-cell model considerably improved the prediction of the monthly precipitation forecast even in comparison with the modern CLM version. The cut-cell discretisation seems to improve in particular the localisation of precipitation, while the improvements leading from LM to CLM had a positive effect mainly on amplitude.
Steppeler, J.; Park, S.-H.; Dobler, A.
This study creates an adaptive procedure for sequential forecasting of incident duration. This adaptive procedure includes two adaptive Artificial Neural Network-based models as well as data fusion techniques to forecast incident duration. Model A is used to forecast the duration time at the time of incident notification, while Model B provides multi-period updates of duration time after the incident notification. These two models together provide a sequential forecast of incident duration from the point of incident notification to the incident road clearance. Model inputs include incident characteristics, traffic data, time gap, space gap, and geometric characteristics. The mean absolute percentage errors of the forecasted incident durations at each forecast time point are mostly under 40%, which indicates that the proposed models have reasonable forecast ability. With these two models, the estimated duration time can be provided by plugging in relevant traffic data as soon as an incident is reported. Travelers and traffic management units can thereby better understand the impact of the existing incident. Based on the model effect assessments, this study shows that the proposed models are feasible in the Intelligent Transportation Systems (ITS) context. PMID:17303059
Wei, Chien-Hung; Lee, Ying
Many efforts have been presented in the literature for wind power forecasting in power systems, and few of them have been used for autonomous power systems. The impact of knowing the distribution function of wind power forecasting error in the economic operation of a power system is studied in this paper. The paper proposes that the distribution of the wind
Antonis G. Tsikalakis; Yiannis A. Katsigiannis; Pavlos S. Georgilakis; Nikos D. Hatziargyriou
In the first article the basic principles and methods of economic forecasting were considered. We pass on now to concrete forecasts of the prospects of development of the American economy which are derived from these methodological principles. Let us begin with the estimates given by American economists of the condition of the United States economy in 5 to 15 years.
A. Shapiro; O. Bogdanov
The economic future of methanol is reviewed in light of its potential uses as a substitute for traditional hydrocarbon fuels and feedstocks as well as some evolving new uses. Methanol's future market position will depend strongly on its production cost in comparison with competitive products. One promising way to reduce the production cost is by use of an improved catalyst in the process by which methanol is obtained from the feedstock, which can be either natural gas or a similar product such as synthesis gas from coal gasification. To estimate the potential cost savings with an improved catalyst, we have based our analysis on a recent study which assumed use of synthesis gas from underground coal gasification as a feedstock for making methanol. The improved catalyst we studied was an actinide oxide whose features include high tolerance to sulfur and heat, and a yield of about 4 mol% methanol per pass with a 2/1 mixture of H2/CO. We calculated the effect of this catalyst on methanol production costs in a 12,000-bbl/day plant. The result was a saving of 1 to 2.5 cents per gallon on the total methanol synthesis cost of 23 cents per gallon (i.e., a saving in the conversion process of 4.4% to 10.9%), excluding the cost of the raw feed gas. We conclude from this study that the improved catalyst could bring important savings in methanol production. The estimated savings range from 4.4% to 10.9% in the cost of methanol synthesis from the feedstock material. Another possibility for lowering methanol production costs in the future may lie in switching from a natural-gas-based feedstock to a coal-based feedstock, for example using synthesis gas from underground coal gasification as the raw material. Our projections suggest that coal will eventually become a less expensive feedstock than natural gas.
Grens, J.; Borg, I.; Stephens, D.; Colmenares, C.
Data from weather satellites have become integral to the weather forecast process in the United States and abroad. Satellite data are used to derive improved forecasts for short-term routine weather, long-term climate change, and for predicting natural disasters. The resulting forecasts have saved lives, reduced weather-related economic losses, and improved the quality of life. Weather information routinely assists in managing resources more efficiently and reducing industrial operating costs. The electric energy industry in particular makes extensive use of weather information supplied by both government and commercial suppliers. Through direct purchases of weather data and information, and through participating in the increasing market for weather derivatives, this sector provides measurable indicators of the economic importance of weather information. Space weather in the form of magnetic disturbances caused by coronal mass ejections from the sun creates geomagnetically induced currents that disturb the electric power grid, sometimes causing significant economic impacts on electric power distribution. This paper examines the use of space-derived weather information on the U.S. electric power industry. It also explores issues that may impair the most optimum use of the information and reviews the longer-term opportunities for employing weather data acquired from satellites in future commercial and government activity.
Hertzfeld, Henry R.; Williamson, Ray A.; Sen, Avery
In this study, the visibility parameterizations developed during Fog Remote Sensing And Modeling (FRAM) projects, conducted in central and eastern Canada, will be summarized and their use for forecasting/nowcasting applications will be discussed. Parameterizations developed during FRAM for reductions in visibility due to 1) fog, 2) rain, 3) snow, and 4) relative humidity (RH) will be given, and uncertainties in the parameterizations will be discussed. Comparisons made between the Canadian GEM NWP model (with 1 and 2.5 km horizontal grid spacing) and observations collected during the Science of Nowcasting Winter Weather for Vancouver 2010 (SNOW-V10) project and FRAM projects, using the new parameterizations, will be given. Observations used in this study were obtained using a fog measuring device (FMD) for fog parameterization; a Vaisala all-weather precipitation sensor, the FD12P, for rain and snow parameterizations and visibility measurements; and a total precipitation sensor (TPS) and disdrometers (the OTT ParSiVel and the Laser Precipitation Measurement, LPM) for rain/snow particle spectra. The results from the three SNOW-V10 sites suggested that visibility values given by the GEM model using the new parameterizations were comparable with observed visibility values when model-based input parameters for the visibility parameterizations, such as liquid water content, RH, and precipitation rate, were predicted accurately.
Gultepe, I.; Milbrandt, J.; Binbin, Z.
Forecasts of time averages of 1-10 days in duration by an operational numerical weather prediction model are documented for the global 500 mb height field in spectral space. Error growth in very idealized models is described in order to anticipate various features of these forecasts and in order to anticipate what the results might be if forecasts longer than 10 days were carried out by present day numerical weather prediction models. The data set for this study is described, and the equilibrium spectra and error spectra are documented; then, the total error is documented. It is shown how forecasts can immediately be improved by removing the systematic error, by using statistical filters, and by ignoring forecasts beyond about a week. Temporal variations in the error field are also documented.
Roads, J. O.
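Removing the systematic error, as suggested above, amounts to subtracting the mean forecast-minus-analysis difference over a training sample from each new forecast. A minimal sketch with synthetic 500 mb height values:

```python
import numpy as np

# Past forecasts and verifying analyses (synthetic 500 mb heights, m)
past_fcst = np.array([5520., 5540., 5560., 5580.])
past_anal = np.array([5510., 5530., 5550., 5570.])

# Mean forecast-minus-analysis error over the training sample
systematic_error = (past_fcst - past_anal).mean()

def correct(forecast):
    """Subtract the systematic (mean) error from a new forecast."""
    return forecast - systematic_error

corrected = correct(5600.0)
```

A statistical filter then damps the remaining (random) error toward climatology as skill decays with lead time.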
A new channel dynamics scheme ASPIRE (alternative system predictor in real time), designed specifically for real-time river flow forecasting, is introduced to reduce uncertainty in the forecast. ASPIRE is a storage routing model that limits the influence of catchment model forecast errors to the downstream station closest to the catchment. Comparisons with the Muskingum routing scheme in field tests suggest that the ASPIRE scheme can provide more accurate forecasts, probably because discharge observations are used to a maximum advantage and routing reaches (and model errors in each reach) are uncoupled. Using ASPIRE in conjunction with the Kalman filter did not improve forecast accuracy relative to a deterministic updating procedure. Theoretical analysis suggests that this is due to a large process noise to measurement noise ratio.
Hoos, A. B.; Koussis, A. D.; Beale, G. O.
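The Muskingum scheme used as the comparison baseline routes an inflow hydrograph through a reach using three coefficients derived from the storage constant K and weighting factor x. A minimal single-reach sketch (parameter and inflow values are illustrative):

```python
def muskingum(inflow, K, x, dt, O0):
    """Route an inflow hydrograph through one Muskingum reach.

    K: storage constant (same units as dt), x: weighting factor (0..0.5).
    """
    denom = K - K * x + 0.5 * dt
    c0 = (0.5 * dt - K * x) / denom
    c1 = (0.5 * dt + K * x) / denom
    c2 = (K - K * x - 0.5 * dt) / denom   # c0 + c1 + c2 == 1
    out = [O0]
    for i in range(1, len(inflow)):
        out.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * out[-1])
    return out

# Triangular inflow wave (m^3/s), K = 2 h, x = 0.2, dt = 1 h
I = [10, 20, 50, 80, 60, 40, 25, 15, 10, 10]
O = muskingum(I, K=2.0, x=0.2, dt=1.0, O0=10.0)
```

The routed peak is attenuated and delayed relative to the inflow peak, which is the behavior both Muskingum and ASPIRE-style storage routing aim to capture.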
We propose the use of signal detection theory (SDT) to evaluate the performance of both probabilistic forecasting systems and individual forecasters. The main advantage of SDT is that it provides a principled way to distinguish the response from system diagnosticity, which is defined as the ability to distinguish events that occur from those that do not. There are two challenges in applying SDT to probabilistic forecasts. First, the SDT model must handle judged probabilities rather than the conventional binary decisions. Second, the model must be able to operate in the presence of sparse data generated within the context of human forecasting systems. Our approach is to specify a model of how individual forecasts are generated from underlying representations and use Bayesian inference to estimate the underlying latent parameters. Given our estimate of the underlying representations, features of the classic SDT model, such as the receiver operating characteristic (ROC) curve and the area under the ROC curve (AUC), follow immediately. We show how our approach allows ROC curves and AUCs to be applied to individuals within a group of forecasters, estimated as a function of time, and extended to measure differences in forecastability across different domains. Among the advantages of this method is that it depends only on the ordinal properties of the probabilistic forecasts. We conclude with a brief discussion of how this approach might facilitate decision making. PMID:24147636
Steyvers, Mark; Wallsten, Thomas S; Merkle, Edgar C; Turner, Brandon M
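The AUC that summarizes the ROC curve can be computed directly from judged probabilities without binning, using its rank (Mann-Whitney) form: the probability that a randomly chosen occurring event received a higher forecast than a randomly chosen non-event. A minimal sketch with illustrative forecast/outcome pairs:

```python
def auc(probs, outcomes):
    """AUC = P(forecast for an occurring event > forecast for a non-event),
    counting ties as half."""
    pos = [p for p, y in zip(probs, outcomes) if y == 1]
    neg = [p for p, y in zip(probs, outcomes) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

probs    = [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]
outcomes = [1,   1,   0,   1,   0,   0]
a = auc(probs, outcomes)
```

Note this depends only on the ordinal ranking of the forecasts, which is exactly the property the abstract highlights.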
This paper introduces a nearest neighbour based fuzzy model (NNFM) based on membership values for forecasting the daily maximum temperature at Delhi. Fuzzy membership values have been used to make single point forecasts into the future on the basis of past nearest neighbours. Compared with other statistical methods and the artificial neural network (ANN) technique, this approach has the advantages
A. K MITRA; SANKAR NATH
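The nearest-neighbour idea in this abstract weights past analogue days by a membership value and combines their next-day values. A minimal sketch using inverse-distance memberships, one common choice that may differ from the paper's membership function; the history is synthetic and a scalar stands in for a full feature vector:

```python
def nn_fuzzy_forecast(history, today, k=3, eps=1e-6):
    """Forecast tomorrow's value from the k nearest past analogues.

    history: list of (pattern, next_value) pairs; pattern and today are
    scalars here for simplicity (a real pattern would be a feature vector).
    """
    ranked = sorted(history, key=lambda h: abs(h[0] - today))[:k]
    memberships = [1.0 / (abs(p - today) + eps) for p, _ in ranked]
    total = sum(memberships)
    return sum(m * v for m, (_, v) in zip(memberships, ranked)) / total

# Synthetic (today's max temp, next day's max temp) pairs, degrees C
history = [(35.0, 36.0), (34.5, 35.5), (30.0, 31.0), (41.0, 40.0)]
forecast = nn_fuzzy_forecast(history, today=35.2)
```

The closest analogue dominates the membership-weighted average, so the forecast lands near its observed successor.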
This report supplements the Economic Impact Forecast System (EIFS) Revised Users Manual (ADA073667, July 1979) developed by the U.S. Army Construction Engineering Research Laboratory. It describes the modifications made to EIFS to allow military personnel...
R. D. Webster S. Odom
This report describes a computerized traffic forecasting model, developed by Brookhaven National Laboratory (BNL) for a portion of the Long Island INFORM Traffic Corridor. The model has gone through a testing phase, and currently is able to make accurate ...
A. Azarm S. Mughabghab D. Stock
In 2009, the Collaboratory for the Study of the Earthquake Predictability (CSEP) initiated a prototype global earthquake forecast experiment. Three models participated in this experiment for 2009, 2010 and 2011—each model forecast the number of earthquakes above magnitude 6 in 1x1 degree cells that span the globe. Here we use likelihood-based metrics to evaluate the consistency of the forecasts with the observed seismicity. We compare model performance with statistical tests and a new method based on the peer-to-peer gambling score. The results of the comparisons are used to build ensemble models that are a weighted combination of the individual models. Notably, in these experiments the ensemble model always performs significantly better than the single best-performing model. Our results indicate the following: i) time-varying forecasts, if not updated after each major shock, may not provide significant advantages with respect to time-invariant models in 1-year forecast experiments; ii) the spatial distribution seems to be the most important feature to characterize the different forecasting performances of the models; iii) the interpretation of consistency tests may be misleading because some good models may be rejected while trivial models may pass consistency tests; iv) a proper ensemble modeling seems to be a valuable procedure to get the best performing model for practical purposes.
Taroni, Matteo; Zechar, Jeremy; Marzocchi, Warner
The GISS general circulation model was used to compute global monthly mean forecasts for January 1973, 1974, and 1975 from initial conditions on the first day of each month and constant sea surface temperatures. Forecasts were evaluated in terms of global and hemispheric energetics, zonally averaged meridional and vertical profiles, forecast error statistics, and monthly mean synoptic fields. Although it generated a realistic mean meridional structure, the model did not adequately reproduce the observed interannual variations in the large scale monthly mean energetics and zonally averaged circulation. The monthly mean sea level pressure field was not predicted satisfactorily, but annual changes in the Icelandic low were simulated. The impact of temporal sea surface temperature variations on the forecasts was investigated by comparing two parallel forecasts for January 1974, one using climatological ocean temperatures and the other observed daily ocean temperatures. The use of daily updated sea surface temperatures produced no discernible beneficial effect.
Spar, J.; Atlas, R. M.; Kuo, E.
This paper develops alternatively structured trip frequency/generation models, and investigates their forecast performance. The first model presented is the simple linear model, with a discussion of its theoretical shortcomings. Models that address, in a progressive fashion, the underlying shortcomings of the linear model are then presented. These models are the truncated normal model, the Poisson model, and the negative binomial
Daniel A. Badoe
In order to get maximum benefit from operational forecast systems based on different model approaches, it is necessary to find an optimal way to combine the forecasts in real time and to derive the predictive probability distribution by assigning different weights to the individual forecasts according to their performance over the previous days. In the European Flood Alert System (EFAS) a Bayesian Forecast System has been implemented in order to derive the overall predictive probability distribution. The EFAS is driven by different numerical weather prediction systems, such as the deterministic forecasts from the German Weather Service and from the ECMWF, as well as Ensemble Prediction Systems from the ECMWF and COSMO-LEPS. In this study the effect of combining these different forecast systems on the total predictive uncertainty is investigated by applying different weighting methods, such as the Non-homogeneous Gaussian Regression (NGR) model, Bayesian Model Averaging (BMA) and an empirical method. In addition, different methods of bias removal, namely additive and regression-based ones, are applied, and their applicability in operational forecasting is tested. One of the problems identified is the difficulty of optimizing the weight parameters for each lead time separately, which results in highly inconsistent forecasts, especially for regression-based bias removal methods. Therefore, in operational use, methods with only sub-optimal skill scores could be preferable, since they show more realistic shapes of the uncertainty bands for the predicted future stream-flow values. Another possible approach is to optimize the weighting parameters not for each lead time separately, but at different levels of aggregation over expanding windows of time ranges. First results indicate the importance of the proper choice of the model combination method with regard to the reliability and sharpness of the forecast system.
Bogner, Konrad; Pappenberger, Florian; Cloke, Hannah L.
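The weighting idea described above, giving more weight to the systems that performed better over the previous days, can be illustrated with a deliberately simplified sketch (not the NGR or BMA schemes themselves; names are invented) in which weights are made inversely proportional to each system's recent error:

```python
# Inverse-error weights: systems with smaller recent errors get larger weights.
def inverse_error_weights(past_errors):
    inv = [1.0 / e for e in past_errors]
    total = sum(inv)
    return [w / total for w in inv]

# Combine the current forecasts of the different systems using those weights.
def combine(forecasts, weights):
    return sum(w * f for w, f in zip(weights, forecasts))
```

The actual methods in the study additionally calibrate a full predictive distribution rather than a single weighted value; this sketch only shows the point-forecast analogue.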
The idea of aggregating information is clearly recognizable in the daily lives of all entities, whether individuals or groups; since time immemorial, corporate organizations, governments, and individuals as economic agents have aggregated information to formulate decisions. Energy planning represents an investment-decision problem where information needs to be aggregated from credible sources to predict both demand and supply of energy. There are various methods for doing this, ranging from the use of portfolio theory to managing risk and maximizing portfolio performance under a variety of unpredictable economic outcomes. The future demand for energy and the need to use solar energy in order to avoid a future energy crisis in Jiangsu province in China require energy planners in the province to abandon their reliance on traditional, "least-cost," stand-alone technology cost estimates and instead evaluate conventional and renewable energy supply on the basis of a hybrid of optimization models in order to ensure effective and reliable supply. Our task in this research is to propose measures towards optimal solar energy forecasting by employing a systematic optimization approach based on a hybrid of weather and energy forecast models. After giving an overview of sustainable energy issues in China, we review and classify the various models that existing studies have used to predict the influence of weather on the output of solar energy production units. Further, we evaluate the performance of an exemplary ensemble model which combines the forecast output of two popular statistical prediction methods using a dynamic weighting factor. PMID:24511292
Zhao, Xiuli; Asante Antwi, Henry; Yiranbon, Ethel
Electricity markets in the United States are evolving. Accurate wind power forecasts are beneficial for wind plant operators, utility operators, and utility customers. An accurate forecast makes it possible for grid operators to schedule the economically ...
M. Milligan M. Schwartz Y. Wan
Two distinct elements seem to be required to make accurate wind-speed forecasts for wind farms: the first is deterministic output from a weather forecast model, and the second is some probabilistic or statistical post-processing to account for local biases, or systematic errors in the model. A variety of statistical post-processing schemes are available, and they are generally worthwhile since they are cheap and at worst do no harm. More typically, they demonstrably improve the accuracy of the deterministic forecast. Gridded output from the operational HARMONIE mesoscale weather forecast model has been interpolated to forecast winds at the precise (3-dimensional) location of the met-mast of a wind farm in southwest Ireland. A sequence of 48-hour forecasts run at 6-hourly intervals for over one year has been validated against winds recorded at turbine height on the mast. All the interpolated deterministic forecasts are also post-processed using Bayesian Model Averaging (BMA) to remove systematic local bias, and to provide forecasts in a calibrated probabilistic format. Three variants of the HARMONIE model were also run during October 2010 and validated against the winds recorded at the met-mast. The HARMONIE variant with the most advanced physics and the larger domain was the most accurate in forecasting met-mast wind speed, with a mean absolute error (MAE) of 1.5 m s-1 (i.e., about 10% of the mean wind speed). The BMA analysis for this short period (using a 25-day training period) did not change the MAE for the best HARMONIE configuration, but did improve the MAE of the other two by about 15%. The most advanced HARMONIE configuration proved more accurate than an ensemble of all three. There was negligible degradation in the skill of the hourly forecasts, at least out to 24 hours (i.e., 24-hr forecasts were only marginally less accurate than 0-hr analyses or 1-hr forecasts). Results are presented from the operational 48-hr HARMONIE forecasts collected during Jan.-Mar.
2012, as compared with recorded winds at the met-mast. The added value of BMA post-processing (using a moving 25-day training period) is quantified. Forecasts from an experimental extra high-resolution HARMONIE (1km resolution, on a 1,000 x 1,000 km domain) are available for a continuous 30-day period starting 10 Nov. 2012, and the extra skill provided by this for the specific wind-farm site is also quantified.
Peters, Martin; McKinstry, Alastair; O'Brien, Enda; Ralph, Adam; Sheehy, Michael
One can observe deterministic seismogenic processes evolving into large earthquakes (EQs) through time series analyses of EQ source parameters collected from a catalog (Takeda, Japanese patent 2003; Takeda and Takeo, AIP Conf. Proc. 2004). The observation has been successfully applied to short-term (weeks or months) deterministic forecasting of large EQs in Japan since 2003 at www.tec21.jp. The predictions of the time, focus and magnitude M of a large EQ are all within narrow limits. The accuracy of the time prediction is particularly good, within a day or two. A key to the observation is to use a magnitude Mc of about 3 to 4, corresponding to the unique fracture size of a few hundred meters to one km, respectively (Aki, EPS 2004). With the EQs collected by a magnitude window of M larger than Mc for a small mesh area of about 5 degrees by 5 degrees, one can detect a subtle departure from the self-similar seismicity that dominates the brittle part of the earth's lithosphere. This departure is also a signature of low-dimensional deterministic chaos in the seismogenic process of large EQs. For example, the largest Lyapunov exponents of every source parameter series are all positive values, statistically distinct from those of the original series surrogated by randomly shuffling their chronological order (Takeda and Takeo, AIP Conf. Proc. 2004). Proposed is a physical model to describe how major EQs are deterministically generated. Central to the model is how a large fracture of size M (more than about 10 km) is created by the characteristic fractures of Mc initiated by ductile fractures. The model thus develops the brittle-ductile interaction hypothesis envisioned by Aki (Aki, EPS 2004; Jin and Aki, EPS 2005).
Takeda, F.; Takeo, M.
This report summarizes the results obtained from a research project sponsored by Florida Department of Transportation to develop a Backpropagation Neural Network (BPNN) model for the forecasting of pavement crack condition of Florida's highway network. Th...
Z. Lou, J. J. Lu M. Gunaratne
The report includes an evaluation of current data on podiatry manpower, description of podiatry forecasting models developed, and recommendations for future podiatry manpower studies. The report is presented in four volumes: Volume I. Report Narrative and...
S. P. Nyman L. G. Buttell
Diabatic processes can alter Rossby wave structure; consequently, errors arising from model processes propagate downstream. However, the chaotic spread of forecasts from initial condition uncertainty makes it difficult to trace back from root-mean-square forecast errors to model errors. Here diagnostics unaffected by phase errors are used, enabling investigation of systematic errors in Rossby waves in winter-season forecasts from three operational centers. Tropopause sharpness adjacent to ridges decreases with forecast lead time. It depends strongly on model resolution, even though the models are examined on a common grid. Rossby wave amplitude is reduced by a lead time of 5 days, consistent with underrepresentation of diabatic modification and transport of air from the lower troposphere into upper-tropospheric ridges, and with too-weak humidity gradients across the tropopause. However, amplitude also decreases when resolution is decreased. Further work is necessary to isolate the contribution from errors in the representation of diabatic processes.
Gray, S. L.; Dunning, C. M.; Methven, J.; Masato, G.; Chagnon, J. M.
Using density forecasts, we compare the predictive performance of duration models that have been developed for modelling intra-day data on stock markets. Our model portfolio encompasses the autoregressive conditional duration (ACD) model, its logarithmic version (Log-ACD), and the threshold ACD (TACD) model, in each case with alternative error distributions, as well as the stochastic conditional duration (SCD) model and the stochastic volatility
Luc BAUWENS; Pierre GIOT; Joachim GRAMMIG; David VEREDAS
In a water-stressed region, such as the western United States, it is essential to have long lead-time streamflow forecasts for reservoir operation and water resources management. In this study, we develop and examine the accuracy of a data-driven model incorporating large-scale climate information for extending the streamflow forecast lead time. A data-driven model, i.e., a Support Vector Machine (SVM), based on
A. Kalra; W. P. Miller; S. Ahmad; K. W. Lamb
The stock index reflects the fluctuation of the stock market. For a long time, there has been a great deal of research on stock index forecasting. However, traditional methods are limited in achieving an ideal precision in a dynamic market due to the influence of many factors such as the economic situation, policy changes, and emergency events. Therefore, approaches based on adaptive modeling and conditional probability transfer have attracted new attention from researchers. This paper presents a new forecast method combining an improved back-propagation (BP) neural network and a Markov chain, as well as its modeling and computing technology. The method comprises initial forecasting by the improved BP neural network, division of the Markov state regions, computation of the state transition probability matrix, and adjustment of the prediction. Results of the empirical study show that this method can achieve high accuracy in stock index prediction, and it could provide a good reference for investment in the stock market.
Dai, Yonghui; Han, Dongmei; Dai, Weihui
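One step of the method above, estimating the state transition probability matrix from the sequence of error states of the network's initial forecasts, can be sketched as follows (a generic illustration, not the authors' code; state indices are assumed to run from 0 to n-1):

```python
# Estimate a Markov transition probability matrix from a state sequence.
def transition_matrix(states, n):
    counts = [[0] * n for _ in range(n)]
    for a, b in zip(states, states[1:]):  # count consecutive state pairs
        counts[a][b] += 1
    probs = []
    for row in counts:
        total = sum(row)
        # normalize each row to probabilities; rows with no data stay zero
        probs.append([c / total if total else 0.0 for c in row])
    return probs
```

In the Markov-adjustment scheme, the row corresponding to the most recent error state then gives the probabilities used to correct the next raw network forecast.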
We propose a method to combine earthquake forecast models. The general procedure is to successively create new generations of a rate-based model by injecting into the current generation the additional knowledge carried by other input models. For a single iteration, we use the differential probability gain calculated in the Molchan diagram that evaluates the performance of the input model with respect to the current generation of the rate-based model. Then, at each point in space and time, the new rate is the product of the current rate times the local differential probability gain. The main advantage of our combining method is that it produces high expected event rates using all types of numerical forecast models. The only restriction is that the input model has to bring an additional amount of information with respect to the current generation of the rate-based model. Here, we apply this method to EAST and EEPAS, two forecast models currently tested in the California testing center of the Collaboratory for the Study of Earthquake Predictability (CSEP). During the testing period from July 2009 to December 2011, the combined model shows better performance than the input model (EAST) and the initial rate-based model (EEPAS), both in terms of Molchan diagrams and likelihood tests. We show that a large number of events occurs in a limited space of higher forecasted rates. Most importantly, these rates are significantly higher than a linear combination of the two forecast models.
Shebalin, P.; Narteau, C.; Zechar, J.; Holschneider, M.
Two of the primary goals of the Collaboratory for the Study of Earthquake Predictability (CSEP) are (1) reducing the controversy surrounding earthquake prediction and (2) promoting rigorous research on earthquake predictability. An essential part of achieving these goals is rigorous and transparent testing and evaluation of submitted earthquake forecasts. Many types of tests have been adopted by CSEP, but the list is neither exhaustive nor fully understood. Repeating the CSEP calculations and performing a suite of additional tests, we analyzed the forecasts submitted to the 2005 RELM experiment to better determine which aspects of the forecasts were being evaluated by each test. We summarize our findings by listing the strengths and weaknesses of each test and suggest additional tests to be incorporated into the CSEP framework.
Holliday, J. R.; Rundle, J. B.; Turcotte, D. L.
Purpose – Airplane technology is undergoing several exciting developments, particularly in avionics, material composites, and design tool capabilities, and, though there are many studies conducted on subsets of airplane technology, market, and economic parameters, few exist in forecasting new commercial aircraft model introduction. In fact, existing research indicates the difficulty in quantitatively forecasting commercial airplanes due in part to the
Ann-Marie Lamb; Tugrul U. Daim; Timothy R. Anderson
In the field of hydrological prediction for medium-sized watersheds, characterized by complex orography and short response times, forecasts cannot rely only upon observed precipitation: predicted rainfall is in this case an essential input for hydrological models. However, the quality and reliability of the deterministic numerical precipitation forecasts driving a hydrological model are often unsatisfactory, because uncertainty in Quantitative Precipitation Forecasts (QPFs) is considerable at the scales of interest for hydrological purposes. The uncertainty inherent in the precipitation forecast can be accounted for by estimating the uncertainty associated with the flood forecast, in order to provide a more informative hydrological prediction. The methodology proposed and adopted in this work is based on a hydrological ensemble forecasting approach that uses multiple precipitation scenarios provided by different high-resolution numerical weather prediction models to drive the same hydrological model. In this way, the uncertainty associated with the meteorological forecasts can propagate into the hydrological model and be used in warnings and decision-making procedures relying upon a probabilistic approach. In the framework of RISK AWARE, an INTERREG III B EU project, a detailed analysis of two cases of intense precipitation affecting the Reno river basin, a medium-sized catchment in northern Italy, has been performed. One case study has been performed using lateral boundary values derived from analysed fields, the other simulating a real-time forecast, i.e., using forecasted boundary conditions. Four different meteorological models (Lokal Modell, RAMS, BOLAM and MOLOCH), operating at different horizontal resolutions, provide the QPFs which are used to force the hydrological model. The discharge predictions are obtained by means of the physically based rainfall-runoff model TOPKAPI.
The results provide examples of the uncertainties inherent in the QPF and show that the hydrological response of the Reno river basin, as simulated by the TOPKAPI model, is highly sensitive to the correct space-time localization of precipitation, even if the total amount of rainfall is, on average, well forecasted. The system seems able to provide useful information concerning the discharge peaks (amount and timing) for warning purposes.
Diomede, T.; Davolio, S.; Marsigli, C.; Miglietta, M. M.; Moscatello, A.; Papetti, P.; Paccagnella, T.; Buzzi, A.; Malguzzi, P.
Earthquake forecasting is one of the most relevant scientific contributions for society, being one of the primary ingredients for a well-founded seismic hazard assessment. Despite the importance of the issue, we are still far from a commonly accepted methodology to accomplish this goal. The epistemic uncertainty on this issue is well depicted by the simultaneous use, in practical cases, of
W. Marzocchi; A. Lombardi
In the framework of the National Project “Sviluppo di distretti industriali per le Osservazioni della Terra” (Development of Industrial Districts for Earth Observations), funded by MIUR (Ministero dell'Università e della Ricerca Scientifica, the Italian Ministry of the University and Scientific Research), two operational mesoscale models were set up for Calabria, the southernmost tip of the Italian peninsula. The models are RAMS (Regional Atmospheric Modeling System) and MM5 (Mesoscale Modeling 5), which are run every day at Crati scrl to produce weather forecasts over Calabria (http://www.crati.it). This paper reports a model intercomparison of Quantitative Precipitation Forecasts evaluated over a 20-month period from 1 October 2000 to 31 May 2002. In addition to the RAMS and MM5 outputs, QBOLAM rainfall fields are available for the selected period and included in the comparison. This model runs operationally at the “Agenzia per la Protezione dell'Ambiente e per i Servizi Tecnici”. Forecasts are verified by comparing model outputs with raingauge data recorded by the regional meteorological network, which has 75 raingauges. Large-scale forcing is the same for all the models considered, and differences are due to physical/numerical parameterizations and horizontal resolutions. The QPFs show differences between models. The largest differences are in BIA compared to the other scores considered. Performance decreases with increasing forecast time for RAMS and MM5, whilst QBOLAM scores better for the second-day forecast.
Federico, S.; Avolio, E.; Bellecci, C.; Colacino, M.; Lavagnini, A.; Accadia, C.; Mariani, S.; Casaioli, M.
This study proposes an approach which applies Weather Research and Forecasting (WRF) model forecasts and satellite rainfall estimates from Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) to a physiographic inundation-drainage model for real-time flood forecasting. The study area is the Dianbao River Basin in southern Taiwan, a low-relief area that easily suffers flood disasters. Since the study area lacks reliable rainfall forecasting and inundation simulation models, the study proposes an approach to refine WRF model forecasts (abbreviated as WRFMFs hereafter) using satellite rainfalls by PERSIANN (abbreviated as PERSIANN rainfalls hereafter) to enhance the inundation forecasts and prolong the lead time. Twenty-one sets of on-line WRFMFs under different hypothesized boundary conditions are provided by the Taiwan Typhoon and Flood Research Institute. The WRFMFs, with a spatial resolution of 5 km x 5 km, cover the extent of Taiwan (120°E~122°E, 22°N~25°N) and are issued for 72 hours ahead every 6 hours. However, WRFMFs have a 6-hour delay and are quite different due to their different non-isolated boundary conditions. On the other hand, PERSIANN rainfalls, provided by CHRS/UCI, are based on real-time satellite images and can provide real-time global rainfall estimation. Therefore, integrating WRFMFs and PERSIANN rainfalls may be a good approach to provide better rainfall forecasts. The main idea of this approach is to give the different WRFMFs different weights by comparing them to the PERSIANN rainfalls when a typhoon has formed in the open sea and is approaching Taiwan. Based on the 21 sets of WRFMFs, a pattern recognition method is used to compare the PERSIANN rainfalls to each of the 21 sets of WRFMFs during the same time period for every 6 hours. For example, at a present time (18:00) the WRFMFs are issued with a 6-hour delay from 12:00 for 72 hours ahead.
The comparison between each of the 21 sets of WRFMFs and the PERSIANN rainfalls during the past 6 hours (12:00~18:00) is made. Based on the comparisons, 21 errors can be calculated for assigning weights to the 21 sets of WRFMFs for the 66 hours ahead (herein, six hours ahead are adopted). A set of WRFMFs with a smaller error is assigned a higher weight. Then, the ensemble of the 21 sets of WRFMFs with their different weights is computed to obtain more reliable rainfall forecasts. Finally, the study uses a physiographic inundation-drainage model for flood inundation simulation. This inundation-drainage model is a pseudo-2-D model which can reasonably simulate flood inundation under conditions of complex topography. By inputting the ensemble of WRFMFs, the inundation-drainage model can forecast the flood extent and depth in the study area with little computational time. This forecasted inundation information can be used to plot flood inundation maps and help decision makers quickly identify flood-prone areas and make emergency preparations in advance.
Kuo, C.; Chen, J.; Yang, T.; Lin, Y.; Wang, Y.; Hsu, K.; Sorooshian, S.; Lee, C.; Yu, P.
The INDEPTH industrial planning methodology will enable utilities to forecast service area electricity demand. The system allows the user to develop energy forecasts for the whole industrial sector, to examine industries most important to the service area, and to study uses of electricity that are of interest in demand-side management programs. The econometric model in this volume forecasts energy use for the entire industrial sector using a set of simultaneous factor demand equations with an imposed structure derived from the economic theory of cost-minimizing behavior.
Andrews, L.M.; King, M.J.; Leary, N.; Perry, D.M.; Snow, C.C.
Energy demand forecasting is a critical task, and it allows operators to anticipate any problems that might affect power systems, especially during periods with high demand peaks. The difficulties of this task are due to the complexity of the systems involved: energy usage patterns are particularly variable and influenced by many factors, such as weather conditions and social, economic and political aspects (i.e. national regulations, international relations). The strong influence of weather on electricity demand in Italy is due to the wide use of residential air-conditioning devices and, more generally, refrigeration and ventilation equipment. For these reasons, accurate climate information may help in obtaining precise energy demand forecasts, usually performed with statistical methods, which show their effectiveness particularly where large amounts of data are available. We present a study with the aim of assessing the effects of the quality of weather data on the performance of statistical models for energy demand forecasting, using data provided by the national transmission grid operator.
De Felice, M.; Alessandri, A.; Ruti, P. M.
Using density forecast evaluation techniques, we compare the predictive performance of econometric specifications that have been developed for modeling duration processes in intra-day financial markets. The model portfolio encompasses various variants of the Autoregressive Conditional Duration (ACD) model and recently proposed dynamic factor models. The evaluation is conducted on time series of trade, price and volume durations computed from transaction
Luc Bauwens; Pierre Giot; Joachim Grammig; David Veredas
We analyse empirical errors observed in historical population forecasts produced by statistical agencies in 14 European countries since 1950. The focus is on forecasts for three demographic variables: fertility (Total Fertility Rate - TFR), mortality (life expectancy at birth), and migration (net migration). We inspect forecast bias and forecast accuracy in the historical forecasts, as well as the distribution of
Distributed Hydrologic Models for Flow Forecasts Part 2 is the second release in a two-part series focused on the science of distributed models and their applicability to different flow forecasting situations. Presented by Dr. Dennis Johnson, the module provides a more detailed look at the processes and mechanisms involved in distributed hydrologic models. It examines the rainfall/runoff component, snowmelt, overland flow routing, and channel response in a basin as represented in a distributed model. Calibration issues and situations in which distributed hydrologic models might be most appropriate are also addressed.
The current generation of time stepping hydrological models used by operational forecasting agencies are process-weak, where model parameters are often assigned unrealistic values to compensate for model structural weaknesses. These time stepping simulation models are therefore subject to the same stationarity predicament that plagues statistical streamflow forecasting systems. Consequently, the operational forecasting community has similar research priorities to the science community, that is, to develop physically realistic hydrological models. This paper describes development of a new modeling framework to improve the representation of hydrological processes within operational streamflow forecasting models. The framework recognizes that the majority of process-based models use the same set of physics - most models use Darcy's Law to represent the flow of water through the soil matrix and Fourier's Law for thermodynamics. The new modeling framework uses numerically robust solutions of the hydrology and thermodynamic governing equations as the structural core, and incorporates multiple options to represent the impact of different modeling decisions, including different methods to represent spatial variability and different parameterizations of surface fluxes and shallow groundwater. Use of multivariate research data to evaluate these different modeling options reveals that the new modeling framework can provide realistic simulations of both point-scale measurements of hydrologic states and fluxes as well as realistic simulations of streamflow in headwater catchments, with minimal calibration. Moreover, the availability of multiple modeling options improves representation of model uncertainty.
Restrepo, Pedro; Wood, Andy; Clark, Martyn
Recently, data with complex characteristics, such as epilepsy electroencephalography (EEG) time series, has emerged. Epilepsy EEG data has special characteristics including nonlinearity, nonnormality, and nonperiodicity. Therefore, it is important to find a suitable forecasting method that covers these special characteristics. In this paper, we propose a coercively adjusted autoregression (CA-AR) method that forecasts future values from a multivariable epilepsy EEG time series. We use the technique of random coefficients, which forcefully adjusts the coefficients to between -1 and 1. The fractal dimension is used to determine the order of the CA-AR model. We applied the CA-AR method, reflecting the special characteristics of the data, to forecast future values of epilepsy EEG data. Experimental results show that, when compared to previous methods, the proposed method can forecast faster and more accurately.
Kim, Sun-Hee; Faloutsos, Christos; Yang, Hyung-Jeong
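The coercion step above can be sketched in a few lines. This is an illustrative reading, assuming the adjustment simply snaps each fitted AR coefficient to -1 or +1 by sign; the paper's exact adjustment rule and its fractal-dimension order selection are not reproduced here.

```python
import numpy as np

def ca_ar_forecast(series, order):
    """One-step forecast from an AR(order) fit whose coefficients are
    coercively snapped to -1 or +1 (sign kept, magnitude forced to 1).
    A sketch of the CA-AR idea, not the authors' exact procedure."""
    y = np.asarray(series, dtype=float)
    # Lagged design matrix: column k holds y[t-1-k] for t = order..n-1
    X = np.column_stack([y[order - k - 1:len(y) - k - 1] for k in range(order)])
    t = y[order:]
    coef, *_ = np.linalg.lstsq(X, t, rcond=None)
    coerced = np.sign(coef)          # forcefully adjust coefficients to +/-1
    coerced[coerced == 0] = 1.0
    lags = y[-1:-order - 1:-1]       # the most recent `order` values, newest first
    return float(coerced @ lags)
```

With an increasing series the fitted coefficient is positive, so the coerced AR(1) forecast simply repeats the last value; with an alternating series it flips the sign.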
This paper presents two avalanche forecasting applications, NXD2000 and NXD-REG, which were developed at the Swiss Federal Institute for Snow and Avalanche Research (SLF). Although both are based on the nearest-neighbour method, they are targeted at different scales. NXD2000 is used to forecast avalanches on a local scale. It is operated by avalanche forecasters responsible for snow safety at snow sport areas, villages or cross-country roads. The area covered ranges from 10 km2 up to 100 km2 depending on the climatological homogeneity. It provides the forecaster with the ten days most similar to a given situation. The observed avalanches of these days are an indication of the actual avalanche danger. NXD-REG is used operationally by the Swiss avalanche warning service for regional avalanche forecasting. The nearest-neighbour approach is applied to the data sets of 60 observer stations. The results of each station are then compiled into a map of current and future avalanche hazard. Evaluation of the model by cross-validation has shown that it can reproduce the official SLF avalanche forecasts on about 52% of days.
Gassner, M.; Brabec, B.
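The local-scale retrieval in NXD2000, returning the ten days most similar to the current situation, is essentially a nearest-neighbour query. A minimal sketch, assuming standardised Euclidean distance over a day-by-feature archive; the SLF tools use their own weighted distance:

```python
import numpy as np

def most_similar_days(archive, today, k=10):
    """Return indices of the k archive days closest to today's
    weather/snow observation vector, in order of similarity.
    Feature scaling is illustrative (z-scores per feature)."""
    X = np.asarray(archive, dtype=float)    # shape (n_days, n_features)
    q = np.asarray(today, dtype=float)
    # Standardise so that e.g. temperature and snow depth are comparable
    mu, sd = X.mean(axis=0), X.std(axis=0) + 1e-12
    d = np.linalg.norm((X - mu) / sd - (q - mu) / sd, axis=1)
    return np.argsort(d)[:k]
```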
A global forecast model is used to examine various sensitivities of numerical predictions of three extreme winter storms that occurred near the eastern continental margin of North America: the Ohio Valley blizzard of January 1978, the New England blizzard of February 1978, and the Mid-Atlantic cyclone of February 1979. While medium-resolution simulations capture much of the intensification, the forecasts of the precise timing and intensity levels suffer from various degrees of error. The coastal cyclones show a 5-10 hPa dependence on the western North Atlantic sea surface temperature, which is varied within a range (±2.5°C) compatible with interannual fluctuations. The associated vertical velocities and precipitation rates show proportionately stronger dependences on the ocean temperature perturbations. The Ohio Valley blizzard, which intensified along a track 700-800 km from the coast, shows little sensitivity to ocean temperature. The effect of a shift of ~10° latitude in the position of the snow boundary is negligible in each case. The forecasts depend strongly on the model resolution, and the coarse-resolution forecasts are consistently inferior to the medium-resolution forecasts. Studies of the corresponding sensitivities of extreme cyclonic events over eastern Asia are encouraged in order to identify characteristics that are common to numerical forecasts for the two regions.
Yih, A. C.; Walsh, J. E.
A methodology for constructing spatial-temporal probabilistic inflow forecasts based on output from deterministic precipitation-runoff models is introduced. The post-processor combines the deterministic forecast with the inflow climatology and a persistence forecast using a regression model. The methodology was tested and demonstrated in the Ulla-Førre river system, where simultaneous probabilistic forecasts for five catchments and ten lead times were constructed. The methodology has enough flexibility to model operationally important features in this case study such as heteroscedasticity, lead-time-varying temporal dependency and lead-time-varying inter-catchment dependency. In operational use it is straightforward to use the models to sample inflow ensembles (5 catchments and 10 lead times) that inherit the catchment and lead-time dependencies. Our model was tested against the deterministic inflow forecast, a climatology forecast and a persistence forecast, and our approach was found to give the best forecasts. The approach has great flexibility since the regression coefficients depend on lead time: for the first lead time the hydrological forecast is given the largest weight, whereas for the longest lead time climatology gets the largest weight.
Engeland, Kolbjørn; Steinsland, Ingelin
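The lead-time-dependent regression can be sketched as follows, assuming a plain least-squares fit per lead time; the authors' full model additionally handles heteroscedasticity and inter-catchment dependence.

```python
import numpy as np

def fit_lead_time_weights(obs, det, clim, pers):
    """Regress observed inflow on the deterministic forecast, the
    climatology and the persistence forecast, separately for each
    lead time, so the weights can shift from the hydrological
    forecast (short leads) towards climatology (long leads).
    All arrays have shape (n_days, n_leads); returns (n_leads, 3)."""
    obs, det, clim, pers = map(np.asarray, (obs, det, clim, pers))
    weights = []
    for L in range(obs.shape[1]):
        X = np.column_stack([det[:, L], clim[:, L], pers[:, L]])
        b, *_ = np.linalg.lstsq(X, obs[:, L], rcond=None)
        weights.append(b)
    return np.array(weights)
```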
Forecasting accuracy is particularly important when forecasting tourism demand on account of the perishable nature of the product. This study compares a range of forecasting models in the context of predicting annual tourist flows into Hong Kong from the major long-haul markets of the US, the UK, Germany and major short-haul markets of China, Japan and Taiwan. Econometric forecasting models
Koon Nam Lee
A coupled general circulation model for global seasonal forecasting, GloSea, has been developed at the Met Office. GloSea is based on the climate version of the Met Office Unified Model, HadCM3, with a number of enhancements appropriate for seasonal forecasting purposes. These include increased vertical ocean resolution, a variable spatial horizontal grid which gives increased meridional ocean resolution in the tropics and a coastal tiling scheme. As well as being used for real-time seasonal forecasting, GloSea is also contributing to the EU DEMETER project in which six state-of-the-art coupled ocean-atmosphere models from European institutes are used to explore the potential for multi-model ensemble seasonal forecasting. An accurate description of the ocean initial state is an important component of the coupled forecast system. Assimilation of sub-surface ocean temperature data combined with ECMWF analysis (real-time forecast) or ERA40 (DEMETER) surface flux forcing is used to produce an ocean analysis. An ensemble of ocean initial conditions for each start date is generated by applying a combination of wind stress and sea surface temperature perturbations, each designed to explore the uncertainty in the observational data. The atmosphere initial state is taken from the ECMWF analysis or ERA40. The length of each prediction is 6 months. The results presented here examine the skill of the Met Office model in hindcasting ENSO events and associated tropical variability. Our most recent ENSO forecast will also be shown.
Ineson, S.; Barnes, R. T. H.; Davey, M. K.; Huddleston, M. R.
Genetic programming (or GP) is a random search technique that emerged in the late 1980s and early 1990s. A formal description of the method was introduced in Koza (1992). GP applies to many optimization areas. One of them is modeling time series and using those models in forecasting. Unlike other modeling techniques, GP is a computer program that 'searches' for
M. A. Kaboudan
Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge(WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.
MacNeice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.
An economic development model was formulated to foster and strengthen commerce and industry retention and expansion in the state of Illinois. The main thrust of the model was on increasing productivity, decreasing business failures, encouraging entrepreneurship, and creating a favorable business climate through community support. To meet these…
Monitoring water scarcity conditions requires medium-term streamflow forecasting. In this contribution, stochastic models for forecasting monthly flows were compared. Data measured at a monthly time step from the Hron and Morava Rivers in Slovakia were considered. When analyzing these data at a shorter, daily time step, it was verified that the so-called heteroscedasticity effect known from econometrics, i.e. the non-constant variance of the time series, was present. Here it was investigated whether this is also the case for data at a monthly time step. In addition, the time series were analyzed from two different perspectives: using a purely data-driven stochastic model, and using a hybrid approach combining a physics-based conceptual model with a data-driven model for the residuals. To model the heteroscedasticity in the time series, the GARCH (generalized autoregressive conditional heteroscedasticity) family of models was fitted to the data. So far, only a few attempts to apply GARCH-class models to discharge data have been reported in the hydrological modelling literature. The goal of the investigation was to expand knowledge of the time series modelling of hydrological data by testing whether the GARCH family of models can be used on time series with a monthly time step, and by comparing forecasting performance with traditional ARMA models. To achieve this, the following steps were taken: 1. The presence of heteroscedasticity in the time series was verified. 2. An ARMA-type model combined with a GARCH model was fitted to the data (either directly to the discharge time series or to the error series resulting from a conceptual model). 3. One-step-ahead forecasts from the fitted models were produced and compared to forecasts obtained using only an ARMA-class model on the same data.
In the case of the purely data-driven model it was found that the monthly time step is not fine enough to capture the heteroscedasticity effect that is present in the data at a finer time step. In the hybrid framework, even though heteroscedasticity was not rejected in the error series, the GARCH family of models did not offer any forecasting improvement over the simpler ARMA class of models. This result shows that non-linearities can exist, and thus may in some cases need to be modelled, even at the monthly time step, although methods offering better forecasting performance still need to be investigated.
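The ARMA-GARCH filtering step behind such one-step-ahead forecasts can be sketched as follows, assuming an AR(1) mean equation with GARCH(1,1) errors and parameters already estimated (in practice by maximum likelihood):

```python
import numpy as np

def ar1_garch11_forecast(y, phi, omega, alpha, beta):
    """One-step-ahead mean and variance forecast from an AR(1) mean
    equation with GARCH(1,1) errors. Only the filtering recursion
        sigma2_t = omega + alpha * eps_{t-1}**2 + beta * sigma2_{t-1}
    is run here; parameter estimation is assumed done elsewhere."""
    y = np.asarray(y, dtype=float)
    eps = y[1:] - phi * y[:-1]          # AR(1) residual series
    sigma2 = np.var(eps)                # initialise at the sample variance
    for e in eps:
        sigma2 = omega + alpha * e**2 + beta * sigma2
    mean_forecast = phi * y[-1]
    return mean_forecast, sigma2
```

With alpha = beta = 0 the conditional variance collapses to the constant omega, i.e. the homoscedastic ARMA case the study compares against.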
I argue that hazard models are more appropriate than single-period models for forecasting bankruptcy. Single-period models are inconsistent, while hazard models produce consistent estimates. I describe a simple technique for estimating a discrete-time hazard model. I find that about half of the accounting ratios that have been used in previous models are not statistically significant. Moreover, market size, past stock
A technology able to rapidly forecast wildfire dynamics would lead to a paradigm shift in the response to emergencies, providing the Fire Service with essential information about the on-going fire. The article at hand presents and explores a novel methodology to forecast wildfire dynamics in wind-driven conditions, using real-time data assimilation and inverse modelling. The forecasting algorithm combines Rothermel's rate-of-spread theory with a perimeter expansion model based on Huygens' principle, and solves the optimisation problem with a tangent linear approach and forward automatic differentiation. Its potential is investigated using synthetic data and evaluated in different wildfire scenarios. The results show the high capacity of the method to quickly predict the location of the fire front with a positive lead time (ahead of the event). This work opens the door to further advances in the framework and to more sophisticated models while keeping the computational time suitable for operational use.
Rios, O.; Jahn, W.; Rein, G.
A newly developed model of the U.S. egg industry provides quarterly forecasts of egg prices and quantities for use in short- to medium-term outlook and policy analysis. The model incorporates both behavioral and biological factors to generate supply and u...
R. P. Stillman
A neural network model was developed to analyze and forecast the behavior of the river Tagliamento, in Italy, during heavy rain periods. The model makes use of distributed rainfall information coming from several rain gauges in the mountain district and predicts the water level of the river at the section closing the mountain district. The water level at the closing
Marina Campolo; Paolo Andreussi; Alfredo Soldati
Travel time estimation is a key factor in the successful implementation of route guidance applications in intelligent transportation systems. Traditional models of traffic congestion and management lack the adaptability and sophistication needed to deal effectively and reliably with increasing traffic volume on certain road stretches. Many existing models rely on either speed or traffic flows for traffic condition forecasting. This
D. Boto-Giralda; F. J. Diaz-Pernas; J. F. Diez-Higuera; M. Anton-Rodriguez
One of the main objectives established in 2000 for the development of a global data assimilation model for the Earth's ionosphere was to enable the forecast of ionospheric electron and ion densities. Following the exciting development of the Global Assimilative Ionospheric Model (GAIM, also known as the Global Assimilation of Ionospheric Measurements) by two teams, the Utah State University team and the
C. Wang; V. Akopian; X. Pi; A. J. Mannucci
Space weather series incorporate several distinct components, cycles at multiple frequencies, irregular trends, and nonlinear variability. The cycles are stochastic, i.e., the amplitude varies over time. Similarly, the trend is stochastic: the slope and direction of trending change repeatedly. This study sets out a combined model using both frequency and time domain methods, in two stages. In the first stage, a frequency domain algorithm is estimated and forecasted. In the second stage, the forecast is used as an input in a neural network. The combined model also includes a term enabling the model to react inversely to large deviations between the actual values and forecast. The models are evaluated using two data sets, the hemispheric power data obtained from the Polar Orbiting Environment satellites, and the Aa geomagnetic index. All the series are at a daily resolution. Forecasting experiments are run over horizons of 1-7 days. The models are estimated using a moving window or adaptive approach. The combined model consistently achieves the most accurate results. Among single equation methods, the frequency domain model is more accurate for the geomagnetic index because it is able to capture the underlying cycles more effectively. In the hemispheric power series, the cycles are less pronounced, so that time domain methods are more accurate, except at very short horizons. Nevertheless, in both data sets, the combined model works well because the frequency domain algorithm captures cyclical behavior, while the neural net is better able to capture short-term dependence and trending.
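The two-stage idea can be sketched as follows; for brevity a truncated Fourier fit stands in for the frequency-domain algorithm and a simple AR(1) residual correction stands in for the neural network, both assumptions of this sketch:

```python
import numpy as np

def two_stage_forecast(y, n_harmonics=3):
    """Stage 1: fit the dominant cycles in the frequency domain by
    keeping the strongest Fourier components. Stage 2: correct the
    forecast using the short-term dependence left in the residuals
    (here AR(1), standing in for the paper's neural network)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    F = np.fft.rfft(y)
    keep = np.argsort(np.abs(F[1:]))[::-1][:n_harmonics] + 1  # strongest cycles
    mask = np.zeros_like(F)
    mask[0] = F[0]                       # keep the mean
    mask[keep] = F[keep]
    cyc = np.fft.irfft(mask, n)          # stage-1 cyclical fit
    resid = y - cyc
    # AR(1) coefficient of the residuals (regularised denominator)
    phi = resid[:-1] @ resid[1:] / (resid[:-1] @ resid[:-1] + 1e-12)
    # The truncated Fourier fit is n-periodic, so its value at t = n is cyc[0]
    return cyc[0] + phi * resid[-1]
```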
Particularly in the cold season, unfavorable dispersion conditions in the ambient air lead to higher-than-average PM10 concentrations in parts of the western Alpe-Adria region, covering the provinces of South Tyrol, Carinthia and Styria. Therefore, EU pollution standards cannot be met in the cold season, and partial traffic regulation measures are taken in Bolzano, Klagenfurt and Graz, the three capitals in this region. Decision making for these regulations may be based on the average PM10 concentration of the next day, provided that reliable forecasts of these values can be offered. In the present paper we show how multiple linear regression models combining information from the present day with meteorological forecasts for the next day can help forecast daily PM10 concentrations for sites located in the three cities. Special emphasis is given to an appropriate selection of the regressor variables readily available as measured values, factors or meteorological forecasts suitable in operational mode. To reflect the quality of the forecast properly, we define a quality function where prediction errors near the PM10 threshold of 50 µg m-3 are assumed to be more severe than errors in regions either far below or far above the threshold. Since December 2004, the forecasts have been used as a monitoring and information tool in Graz. Our daily forecasts have been carried out in cooperation with the meteorologists of ZAMG Styria (the Styrian meteorological office). The investigations in terms of the quality function and corresponding possible decision rules show that our prediction models may support future decisions concerning possible traffic restrictions not only in Graz, but also in Bolzano and Klagenfurt.
Stadlober, Ernst; Hörmann, Siegfried; Pfeiler, Brigitte
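A quality function of the kind described, penalising errors near the 50 µg/m3 limit more heavily, might look as follows; the Gaussian weighting is an assumed form for illustration, not the paper's definition:

```python
import numpy as np

def threshold_quality(pred, obs, limit=50.0, scale=10.0):
    """Weighted mean squared error for PM10 forecasts: squared errors
    are weighted up to twice as heavily when the observation lies
    near the regulatory limit, where a wrong forecast flips the
    traffic-regulation decision."""
    pred = np.asarray(pred, dtype=float)
    obs = np.asarray(obs, dtype=float)
    w = 1.0 + np.exp(-((obs - limit) / scale) ** 2)  # weight peaks at the limit
    return float(np.mean(w * (pred - obs) ** 2))
```

An error of 5 µg/m3 at an observed value of 50 is thus penalised about twice as much as the same error at an observed value of 100.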
A coupled atmospheric-hydrologic-hydraulic ensemble flood forecasting model, driven by The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) data, has been developed for flood forecasting over the Huaihe River. The incorporation of numerical weather prediction (NWP) information into flood forecasting systems may increase forecast lead time from a few hours to a few days. A single NWP model forecast from a single forecast center, however, is insufficient as it involves considerable non-predictable uncertainties and leads to a high number of false alarms. The availability of global ensemble NWP systems through TIGGE offers a new opportunity for flood forecast. The Xinanjiang model used for hydrological rainfall-runoff modeling and the one-dimensional unsteady flow model applied to channel flood routing are coupled with ensemble weather predictions based on the TIGGE data from the Canadian Meteorological Centre (CMC), the European Centre for Medium-Range Weather Forecasts (ECMWF), the UK Met Office (UKMO), and the US National Centers for Environmental Prediction (NCEP). The developed ensemble flood forecasting model is applied to flood forecasting of the 2007 flood season as a test case. The test case is chosen over the upper reaches of the Huaihe River above Lutaizi station with flood diversion and retarding areas. The input flood discharge hydrograph from the main channel to the flood diversion area is estimated with the fixed split ratio of the main channel discharge. The flood flow inside the flood retarding area is calculated as a reservoir with the water balance method. The Muskingum method is used for flood routing in the flood diversion area. A probabilistic discharge and flood inundation forecast is provided as the end product to study the potential benefits of using the TIGGE ensemble forecasts. 
The results demonstrate satisfactory flood forecasting with clear signals of probability of floods up to a few days in advance, and show that TIGGE ensemble forecast data are a promising tool for forecasting of flood inundation, comparable with that driven by raingauge observations.
Bao, Hongjun; Zhao, Linna
For a huge-investment project like urban rail transit, the forecast passenger demand is very important to planning and feasibility studies. Traditional forecasting methods or models cannot make full use of all the survey data in passenger demand forecasting, and some information is wasted. They have the further deficiency of low accuracy in demand forecasting when factors are incomplete
A multimedia course in economic analysis was developed and used in conjunction with the United States Naval Academy. (See ED 043 790 and ED 043 791 for final reports of the project evaluation and development model.) This volume of the text discusses the simulation of behavioral relationships among variable elements in an economy and presents…
Sterling Inst., Washington, DC. Educational Technology Center.
The Regional Short-Term Energy Model (RSTEM) uses macroeconomic variables such as income, employment, industrial production and consumer prices at both the national and regional levels as explanatory variables in the generation of the Short-Term Energy Outlook (STEO). This documentation explains how national macroeconomic forecasts are used to update regional macroeconomic forecasts through the RSTEM Macro Bridge procedure.
Combination forecasting takes the characteristics of each single forecasting method into consideration and combines the methods to form a composite, which increases forecasting accuracy. Existing research on combination forecasting selects single models arbitrarily, neglecting the internal characteristics of the forecasting object. After discussing the role of the cointegration test and the encompassing test in the selection of single models, supplemented by empirical analysis, the paper gives guidance for single-model selection: no more than five suitable single models should be selected from the many alternatives for a given forecasting target, which increases accuracy and stability.
Jiang, Chuanjin; Zhang, Jing; Song, Fugen
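Once the single models are selected, the combination step itself can be sketched as a least-squares weighting of their in-sample forecasts; the cointegration and encompassing tests used for the selection are not reproduced here:

```python
import numpy as np

def combine_forecasts(forecasts, actual):
    """Least-squares combination weights for a set of single-model
    forecasts over a training window.
    forecasts: (n_obs, n_models) matrix of in-sample forecasts.
    actual: length-n_obs vector of realised values."""
    F = np.asarray(forecasts, dtype=float)
    y = np.asarray(actual, dtype=float)
    w, *_ = np.linalg.lstsq(F, y, rcond=None)
    return w   # apply as F_new @ w for the composite forecast
```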
This publication documents the load forecast scenarios and assumptions used to prepare BPA's Whitebook. It is divided into: introduction, summary of the 1993 Whitebook electricity demand forecast, conservation in the load forecast, projection of medium-case electricity sales and underlying drivers, residential sector forecast, commercial sector forecast, industrial sector forecast, non-DSI industrial forecast, direct service industry forecast, and irrigation forecast. Four appendices are included: long-term forecasts, LTOUT forecast, rates and fuel price forecasts, and forecast ranges and calculations.
United States. Bonneville Power Administration.
The motivation for this paper is to determine the potential economic value of advanced modelling methods for devising trading decision tools for 10-year Government bonds. Two advanced methods are used: time-varying parameter models with the implementation of state space modelling using a Kalman filter and nonparametric nonlinear models with Neural Network Regression (NNR). These are benchmarked against more traditional forecasting
Christian L. Dunis; Vincent Morrison
For operational purposes, forecasts of the lumped output of groups of wind farms spread over larger geographic areas will often be of interest. A naive approach is to make forecasts for each individual site and sum them to get the group forecast. It is, however, well documented that a better choice is to use a model that also takes advantage of spatial smoothing effects. It might nevertheless be the case that some sites tend to reflect the total output of the region more accurately, either in general or for certain wind directions, and it will then be of interest to give these a greater influence over the group forecast. Bayesian model averaging (BMA) is a statistical post-processing method for producing probabilistic forecasts from ensembles. Raftery et al. (2005) show how BMA can be used for statistical post-processing of forecast ensembles, producing PDFs of future weather quantities. The BMA predictive PDF of a future weather quantity is a weighted average of the ensemble members' PDFs, where the weights can be interpreted as posterior probabilities and reflect the ensemble members' contributions to overall forecasting skill over a training period. In Revheim and Beyer (2013), the BMA procedure used in Sloughter, Gneiting and Raftery (2010) was found to produce fairly accurate PDFs for the future mean wind speed of a group of sites from the single-site wind speeds. However, when the procedure was applied to wind power, it resulted either in problems with the estimation of the parameters (mainly caused by longer consecutive periods of no power production) or in severe underestimation (mainly caused by problems with reflecting the power curve). In this paper, the problems that arose when applying BMA to wind power forecasting are met through two strategies. First, the BMA procedure is run with a combination of single-site wind speeds and single-site wind power production as input.
This solves the problem with longer consecutive periods where the input data contain no information, but it has the disadvantage of nearly doubling the number of model parameters to be estimated. Second, the BMA procedure is run with group mean wind power as the response variable instead of group mean wind speed. This also solves the problem with longer consecutive periods without information in the input data, but it leaves the power curve to be estimated from the data as well. References: Raftery, A. E., et al. (2005). Using Bayesian Model Averaging to Calibrate Forecast Ensembles. Monthly Weather Review, 133, 1155-1174. Revheim, P. P. and H. G. Beyer (2013). Using Bayesian Model Averaging for wind farm group forecasts. EWEA Wind Power Forecasting Technology Workshop, Rotterdam, 4-5 December 2013. Sloughter, J. M., T. Gneiting and A. E. Raftery (2010). Probabilistic Wind Speed Forecasting Using Ensembles and Bayesian Model Averaging. Journal of the American Statistical Association, Vol. 105, No. 489, 25-35.
Preede Revheim, Pål; Beyer, Hans Georg
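The BMA weight estimation of Raftery et al. (2005) can be sketched with a minimal EM loop, assuming Gaussian member PDFs with a single shared variance; the wind-power complications discussed above are exactly what breaks this simple form:

```python
import numpy as np

def bma_weights(members, obs, iters=200):
    """EM estimation of Bayesian model averaging weights: the
    predictive PDF is a weighted sum of Gaussians centred on each
    ensemble member, and the weights reflect each member's skill
    over the training period.
    members: (n_obs, n_members) forecasts; obs: length-n_obs vector."""
    F = np.asarray(members, dtype=float)
    y = np.asarray(obs, dtype=float)[:, None]
    n, m = F.shape
    w = np.full(m, 1.0 / m)
    s2 = np.var(y - F)                  # initial shared variance
    for _ in range(iters):
        # E-step: responsibility of member k for observation t
        dens = np.exp(-(y - F) ** 2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)
        z = w * dens
        z /= z.sum(axis=1, keepdims=True) + 1e-300
        # M-step: update weights and the shared variance
        w = z.mean(axis=0)
        s2 = np.sum(z * (y - F) ** 2) / n
    return w, s2
```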
Efficient management of patient flow (demand) in emergency departments (EDs) has become an urgent issue for many hospital administrations. Today, more and more attention is being paid to hospital management systems to optimally manage patient flow and to improve management strategies, efficiency and safety in such establishments. To this end, EDs require significant human and material resources, but unfortunately these are limited. Within such a framework, the ability to accurately forecast demand in emergency departments has considerable implications for hospitals to improve resource allocation and strategic planning. The aim of this study was to develop models for forecasting daily attendances at the hospital emergency department in Lille, France. The study demonstrates how time-series analysis can be used to forecast, at least in the short term, demand for emergency services in a hospital emergency department. The forecasts were based on daily patient attendances at the paediatric emergency department in Lille regional hospital centre, France, from January 2012 to December 2012. An autoregressive integrated moving average (ARIMA) method was applied separately to each of the two GEMSA categories and total patient attendances. Time-series analysis was shown to provide a useful, readily available tool for forecasting emergency department demand. PMID:25053208
Kadri, Farid; Harrou, Fouzi; Chaabane, Sondès; Tahon, Christian
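The kind of ARIMA forecast used for such daily attendance series can be sketched without a statistics library by differencing once and fitting an AR model to the differences; the order p=7 is an assumed weekly lag, not the paper's fitted specification:

```python
import numpy as np

def arima_p10_forecast(y, p=7):
    """One-step ARIMA(p,1,0) forecast: difference once to remove
    trend, fit an AR(p) to the differences by least squares, and
    integrate back. A stand-in for a full ARIMA fit (statsmodels
    or similar) to show the mechanics."""
    y = np.asarray(y, dtype=float)
    d = np.diff(y)
    # Lagged design matrix on the differenced series
    X = np.column_stack([d[p - k - 1:len(d) - k - 1] for k in range(p)])
    t = d[p:]
    coef, *_ = np.linalg.lstsq(X, t, rcond=None)
    next_diff = coef @ d[-1:-p - 1:-1]   # most recent p differences, newest first
    return y[-1] + next_diff
```

On a purely linear series every difference is constant, so the forecast simply continues the trend.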
This paper describes the Western Area Gaming and Economic Response Simulator (WAGERS), a forecasting model that emphasizes the role of the gaming industry in Clark County, Nevada. It is designed to generate forecasts of gaming revenues in Clark County, wh...
B. K. Edwards; A. Bando
This paper uses the global optimization of a genetic algorithm to construct a genetic neural network (GANN) model for forecasting listed-company financial crises. The model optimizes the input variables of the neural network model forecasting financial crisis. Forecasting of the financial distress of listed companies in the Shanghai and Shenzhen A-share markets indicates that this model has better ability to predict financial distress
A field experimental study of wave energy dissipation is presented. The experiment was conducted at Lake George, Australia and allowed simultaneous measurements of the source functions in a broad range of conditions, including extreme wind-wave circumstances. Results revealed new physical mechanisms in the processes of spectral dissipation of wave energy, which are presently not accounted for in wave forecast models.
Alexander Babanin; Ian Young; Richard Manasseh; Eric Schultz
A method of forecasting the heat sensitive portion of electrical demand and energy utilizing a summer weather load model and taking into account probability variation of weather factors is discussed in this paper. The heat sensitive portion of the load is separated from base load and historical data is used to determine the effect of weather on the system load.
C. E. Asbury
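The separation of the heat-sensitive load from the base load can be sketched as a regression on cooling degrees; the 18 °C comfort temperature and the linear form are assumptions of this sketch, not the paper's weather load model:

```python
import numpy as np

def split_weather_sensitive_load(load, temp, base_temp=18.0):
    """Separate the heat-sensitive component of summer demand from
    the base load by regressing load on cooling degrees above a
    comfort temperature. Returns (base_load, sensitivity_per_degree)."""
    load = np.asarray(load, dtype=float)
    cdd = np.maximum(np.asarray(temp, dtype=float) - base_temp, 0.0)  # cooling degrees
    A = np.column_stack([np.ones_like(cdd), cdd])
    (base, sens), *_ = np.linalg.lstsq(A, load, rcond=None)
    return base, sens
```

The heat-sensitive portion of any day's load is then `sens * max(temp - base_temp, 0)`, and the remainder is base load.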
We present new approaches for building yearly and seasonal models for 5-minute ahead electricity load forecasting. They are evaluated using two full years of Australian electricity load data. We first analyze the cyclic nature of the electricity load and show that the autocorrelation function captures these patterns and can be used to extract useful features, as the data is highly
Irena Koprinska; Mashud Rana; Vassilios G. Agelidis
In recent years a large amount of literature has evolved on the usage of artificial neural network (ANNs) for weather forecasting, particularly because of ANNs' ability to model an unspecified nonlinear relationship of various meteorological variables. In this paper we proposed a dynamic competitive neural network classifier to predict the maximum potential intensity (MPI) of a given tropical cyclone, based
JAMES N. K. LIU; Bo Feng
In this paper, we take into consideration some issues related to the use of a nonlinear structural econometric model in the presence of a data revision process. We analyse the consequences on the parameter estimation (consistency is still attainable) and on forecast. In the latter case, we show that the asymptotic bias and mean squared prediction error of the deterministic
The preparation of two-week duration case studies for the initialization and verification of Gulf Stream forecast models was begun jointly by Harvard and NOARL in 1988. The six case studies were chosen by determining which time periods had both the REX GE...
D. Crout; L. Perkins; S. M. Glenn
Wind power presented a remarkable growth in the first decade of the 21st century, highly sustained by the economical and ecological benefits of this technology. Not only has it significantly contributed to reduce the dependence on fossil fuels in the production of electrical energy, wind power has also allowed to save great amounts of greenhouse gases emissions. This growth leads
Pedro Gomes; Rui Castro
This dissertation proposes strategies not only for modelling price behavior in the dry bulk market, but also for modelling relationships between economic and technical variables of dry bulk ships, by using modern time series approaches, Monte Carlo simulation and other economic techniques. The time series modelling techniques, described extensively in Appendix A, primarily consist of the Vector Error Correction model
Flow estimation at a point in a river is vital for a number of hydrologic applications including flood forecast. This paper presents the results of a basin scale rainfall-runoff modeling on Bagmati basin in Nepal using the hydrologic model HEC-HMS in a GIS environment. The model, in combination with the GIS extension HEC-GeoHMS, was used to convert the precipitation excess
T. P. KAFLE; M. K. HAZARIKA; S. KARKI; R. M. SHRESTHA; R. SHARMA
At present, continental- to global-scale flood forecasting predicts discharge at a point, with little attention to the detail and accuracy of local-scale inundation predictions. Yet inundation variables are of interest, and all flood impacts are inherently local in nature. This paper proposes a large-scale flood inundation ensemble forecasting model that uses the best available data and modeling approaches in data-scarce areas. The model was built for the Lower Zambezi River to demonstrate current flood inundation forecasting capabilities in large data-scarce regions. ECMWF ensemble forecast (ENS) data were used to force the VIC (Variable Infiltration Capacity) hydrologic model, which simulated and routed daily flows to the input boundary locations of a 2-D hydrodynamic model. Efficient hydrodynamic modeling over large areas still requires model grid resolutions that are typically larger than the width of channels that play a key role in flood wave propagation. We therefore employed a novel subgrid channel scheme to describe the river network in detail while representing the floodplain at an appropriate scale. The modeling system was calibrated using channel water levels from satellite laser altimetry and then applied to predict the February 2007 Mozambique floods. Model evaluation showed that simulated flood edge cells were within a distance of one to two model resolutions of an observed flood edge, and inundation area agreement was on average 86%. Our study highlights that physically plausible parameter values and satisfactory performance can be achieved at spatial scales ranging from tens to several hundreds of thousands of km2 and at model grid resolutions up to several km2.
Schumann, G. J.-P.; Neal, J. C.; Voisin, N.; Andreadis, K. M.; Pappenberger, F.; Phanthuwongpakdee, N.; Hall, A. C.; Bates, P. D.
When independence is assumed, forecasts of mortality for subpopulations are almost always divergent in the long term. We propose a method for coherent forecasting of mortality rates for two or more subpopulations, based on functional principal components models of simple and interpretable functions of rates. The product-ratio functional forecasting method models and forecasts the geometric mean of subpopulation rates and the ratio of subpopulation rates to product rates. Coherence is imposed by constraining the forecast ratio function through stationary time series models. The method is applied to sex-specific data for Sweden and state-specific data for Australia. Based on out-of-sample forecasts, the coherent forecasts are at least as accurate in overall terms as comparable independent forecasts, and forecast accuracy is homogenized across subpopulations. PMID:23055234
Hyndman, Rob J; Booth, Heather; Yasmeen, Farah
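The product-ratio decomposition at the heart of the method above can be sketched in a few lines (a minimal illustration for two subpopulations, not the authors' functional-principal-components implementation):

```python
import math

def product_ratio_decompose(rates_f, rates_m):
    """Split two subpopulation mortality rate schedules into a product
    (geometric mean, the common trend) and per-population ratios
    (the departures, which are the part constrained to be stationary)."""
    product = [math.sqrt(f * m) for f, m in zip(rates_f, rates_m)]
    ratio_f = [f / p for f, p in zip(rates_f, product)]
    ratio_m = [m / p for m, p in zip(rates_m, product)]
    return product, ratio_f, ratio_m

def recombine(product, ratio):
    # rate = product * ratio, exactly inverting the decomposition
    return [p * r for p, r in zip(product, ratio)]
```

For two subpopulations the ratio functions multiply to one at every age, which is why forecasting the ratios with stationary time series models keeps the subpopulation forecasts from diverging.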
The Mid-Range Energy Forecasting System (MEFS) is a model used by the Department of Energy to forecast domestic production, consumption and price for conventional energy sources on a regional basis over a period of 5 to 15 years. Among the energy sources included in the model are oil, gas and other petroleum fuels, coal, uranium, and electricity. Final consumption of alternative energy sources is broken into end-use categories, such as residential, commercial and industrial uses. Regional prices for all energy sources are calculated by iteratively equating domestic supply and demand. The purpose of this paper is to assess the ability of the Oil and Gas Supply Submodels of MEFS to reliably and accurately project oil and gas supply curves, which are used in the integrating model, along with fuel demand curves to estimate market price. The reliability and accuracy of the oil and gas model cannot be judged by comparing its predictions against actual observations because those observations have not yet occurred. The reliability and reasonableness of the oil and gas supply model can be judged, however, by analyzing how well its assumptions and predictions correspond to accepted economic principles. This is the approach taken in this critique. The remainder of this paper describes the general structure of the oil and gas supply model and how it functions to project the quantity of oil and gas forthcoming at given prices in a particular year, then discusses the economic soundness of the model, and finally suggests model changes to improve its performance.
This publication provides actual historical and long-term forecast data on labor force, total wage and salary employment, industry employment, and personal income for the state of Washington. The data are based upon the Washington Office of Financial Management long-term population forecast. Chapter 1 presents long-term forecasts of Washington…
Lefberg, Irv; And Others
As numerical forecasts capable of resolving individual convective clouds become more common, it is of interest to see if quantitative forecasts of lightning flash rate density are possible, based on fields computed by the numerical model. Previous observational research has shown robust relationships between observed lightning flash rates and inferred updraft and large precipitation ice fields in the mixed phase regions of storms, and that these relationships might allow simulated fields to serve as proxies for lightning flash rate density. It is shown in this paper that two simple proxy fields do indeed provide reasonable and cost-effective bases for creating time-evolving maps of predicted lightning flash rate density, judging from a series of diverse simulation case study events in North Alabama for which Lightning Mapping Array data provide ground truth. One method is based on the product of upward velocity and the mixing ratio of precipitating ice hydrometeors, modeled as graupel only, in the mixed phase region of storms at the -15 °C level, while the second method is based on the vertically integrated amounts of ice hydrometeors in each model grid column. Each method can be calibrated by comparing domainwide statistics of the peak values of simulated flash rate proxy fields against domainwide peak total lightning flash rate density data from observations. Tests show that the first method is able to capture much of the temporal variability of the lightning threat, while the second method does a better job of depicting the areal coverage of the threat. A blended solution is designed to retain most of the temporal sensitivity of the first method, while adding the improved spatial coverage of the second.
Weather Research and Forecast Model simulations of selected North Alabama cases show that this model can distinguish the general character and intensity of most convective events, and that the proposed methods show promise as a means of generating quantitatively realistic fields of lightning threat. However, because models tend to have more difficulty in correctly predicting the instantaneous placement of storms, forecasts of the detailed location of the lightning threat based on single simulations can be in error. Although these model shortcomings presently limit the precision of lightning threat forecasts from individual runs of current generation models, the techniques proposed herein should continue to be applicable as newer and more accurate physically-based model versions, physical parameterizations, initialization techniques and ensembles of cloud-allowing forecasts become available.
McCaul, E. W., Jr.; Goodman, S. J.; LaCasse, K. M.; Cecil, D. J.
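The two proxy fields and the blended threat described above reduce to simple per-gridpoint formulas; a minimal sketch (the calibration constants and blend weights here are illustrative assumptions, not the paper's fitted values):

```python
def flash_rate_proxy_updraft(w, q_graupel, k1=1.0):
    """Proxy 1: product of updraft speed (m/s) and graupel mixing ratio
    (kg/kg) at the -15 °C level. k1 is a hypothetical calibration
    constant, in practice set by matching domainwide peak proxy values
    to observed peak flash rate densities."""
    return k1 * w * q_graupel

def flash_rate_proxy_vii(ice_column, k2=1.0):
    """Proxy 2: vertically integrated ice in a grid column, summed over
    (ice mixing ratio, air density, layer depth) triples per level."""
    return k2 * sum(q * rho * dz for q, rho, dz in ice_column)

def blended_threat(f1, f2, w1=0.95, w2=0.05):
    # Hypothetical blend: keep proxy 1's temporal sensitivity while
    # adding proxy 2's broader spatial coverage (weights illustrative)
    return w1 * f1 + w2 * f2
```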
Long lead rainfall forecasts are highly valuable for planning and management of water resources and agriculture. In this study, we establish multiple statistical calibration and bridging models that use general circulation model (GCM) outputs as predictors to produce monthly rainfall forecasts for Australia with lead times up to 8 months. The statistical calibration models make use of raw forecasts of rainfall from a coupled GCM, and the statistical bridging models make use of sea surface temperature (SST) forecasts of the GCM. The forecasts from the multiple models are merged through Bayesian model averaging to take advantage of the strengths of individual models. The skill of monthly rainfall forecasts is generally low. Compared to forecasting seasonal rainfall totals, it is more challenging to forecast monthly rainfall. However, there are regions and months for which forecasts are skillful. In particular, there are months of the year for which forecasts can be skillfully made at long lead times. This is most evident for the period of November and December. Using GCM forecasts of SST through bridging clearly improves monthly rainfall forecasts. For lead time 0, the improvement is particularly evident for February to March, July and October to December. For longer lead times, the benefit of bridging is more apparent. As lead time increases, bridging is able to maintain forecast skill much better than when only calibration is applied.
Hawthorne, Sandra; Wang, Q. J.; Schepen, Andrew; Robertson, David
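The Bayesian model averaging step that merges the calibration and bridging forecasts can be sketched as follows (a minimal illustration; weighting models by predictive log-likelihood over a hindcast period is the generic BMA recipe, assumed here, not necessarily the authors' exact scheme):

```python
import math

def bma_weights(log_likelihoods):
    """Turn per-model predictive log-likelihoods (accumulated over a
    hindcast/cross-validation period) into normalised BMA weights.
    Subtracting the max before exponentiating avoids overflow."""
    m = max(log_likelihoods)
    raw = [math.exp(ll - m) for ll in log_likelihoods]
    total = sum(raw)
    return [r / total for r in raw]

def bma_merge(forecasts, weights):
    # Merged forecast mean: weighted average of the individual
    # calibration and bridging model forecasts
    return sum(w * f for w, f in zip(weights, forecasts))
```

Models with equal hindcast skill receive equal weight; a model that fails in a given region or month is down-weighted there, which is how the merge "takes advantage of the strengths of individual models".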
Tall onshore wind turbines, with hub heights between 80 m and 100 m, can extract large amounts of energy from the atmosphere since they generally encounter higher wind speeds, but they face challenges given the complexity of boundary layer flows. This complexity of the lowest layers of the atmosphere, where wind turbines reside, has made conventional modeling efforts less than ideal. To meet the nation's goal of increasing wind power into the U.S. electrical grid, the accuracy of wind power forecasts must be improved. In this report, the Lawrence Livermore National Laboratory, in collaboration with the University of Colorado at Boulder, University of California at Berkeley, and Colorado School of Mines, evaluates innovative approaches to forecasting sudden changes in wind speed or 'ramping events' at an onshore, multimegawatt wind farm. The forecast simulations are compared to observations of wind speed and direction from tall meteorological towers and a remote-sensing Sound Detection and Ranging (SODAR) instrument. Ramping events, i.e., sudden increases or decreases in wind speed and hence, power generated by a turbine, are especially problematic for wind farm operators. Sudden changes in wind speed or direction can lead to large power generation differences across a wind farm and are very difficult to predict with current forecasting tools. Here, we quantify the ability of three models, mesoscale WRF, WRF-LES, and PF.WRF, which vary in sophistication and required user expertise, to predict three ramping events at a North American wind farm.
Wharton, S; Lundquist, J K; Marjanovic, N; Williams, J L; Rhodes, M; Chow, T K; Maxwell, R
A technology able to rapidly forecast wildfire dynamics would lead to a paradigm shift in the response to emergencies, providing the Fire Service with essential information about the ongoing fire. This paper presents and explores a novel methodology to forecast wildfire dynamics in wind-driven conditions, using real-time data assimilation and inverse modelling. The forecasting algorithm combines Rothermel's rate of spread theory with a perimeter expansion model based on Huygens principle and solves the optimisation problem with a tangent linear approach and forward automatic differentiation. Its potential is investigated using synthetic data and evaluated in different wildfire scenarios. The results show the capacity of the method to quickly predict the location of the fire front with a positive lead time (ahead of the event) in the order of 10 min for a spatial scale of 100 m. The greatest strengths of our method are lightness, speed and flexibility. We specifically tailor the forecast to be efficient and computationally cheap so it can be used in mobile systems for field deployment and operativeness. Thus, we put emphasis on producing a positive lead time and the means to maximise it.
Rios, O.; Jahn, W.; Rein, G.
The adaptive use of a conceptual model for real-time flow forecasting is investigated. Maximum likelihood and ordinary least squares estimation criteria are considered, and the performance of maximum likelihood techniques for autocorrelated (AMLE) and heteroscedastic (HMLE) errors is analyzed jointly with that provided by the commonly used ordinary least squares estimation (OLSE) technique. Streamflow forecasts are compared for three rivers in central Italy, obtained by AMLE, HMLE, and OLSE adaptive calibration of a simple conceptual model describing the rainfall-runoff transformation by accounting for Hortonian infiltration and linear basin response to rainfall excess. Although model residuals display both autocorrelation and heteroscedasticity, OLSE is found to provide a rather satisfactory performance. Because the OLSE technique also requires less computational effort compared to that for AMLE and HMLE, one could consider OLSE as a suitable option for real-time model operation.
Brath, Armando; Rosso, Renzo
There is a growing interest in physical and biogeochemical oceanic hindcasts and forecasts from a wide range of users and businesses. In this contribution we present an operational biogeochemical forecast system for the Portuguese and Galician oceanographic regions, where atmospheric, hydrodynamic and biogeochemical variables are integrated. The ocean model ROMS, with a horizontal resolution of 3 km, is forced by the atmospheric model WRF and includes a Nutrients-Phytoplankton-Zooplankton-Detritus biogeochemical module (NPZD). In addition to oceanographic variables, the system predicts the concentration of nitrate, phytoplankton, zooplankton and detritus (mmol N m(-3)). Model results are compared against radar currents and remotely sensed SST and chlorophyll. Quantitative skill assessment during a summer upwelling period shows that our modelling system adequately represents the surface circulation over the shelf, including the observed spatial variability and trends of temperature and chlorophyll concentration. Additionally, the skill assessment also shows some deficiencies, such as the overestimation of the upwelling circulation and, consequently, of the duration and intensity of the phytoplankton blooms. These and other departures from the observations are discussed, their origins identified and future improvements suggested. The forecast system is the first of its kind in the region and provides free online distribution of model input and output, as well as comparisons of model results with satellite imagery for qualitative operational assessment of model skill. PMID:22666349
Marta-Almeida, Martinho; Reboreda, Rosa; Rocha, Carlos; Dubert, Jesus; Nolasco, Rita; Cordeiro, Nuno; Luna, Tiago; Rocha, Alfredo; Lencart E Silva, João D; Queiroga, Henrique; Peliz, Alvaro; Ruiz-Villarreal, Manuel
Background A forecast can be defined as an endeavor to quantitatively estimate a future event or probabilities assigned to a future occurrence. Forecasting stochastic processes such as epidemics is challenging since there are several biological, behavioral, and environmental factors that influence the number of cases observed at each point during an epidemic. However, accurate forecasts of epidemics would impact timely and effective implementation of public health interventions. In this study, we introduce a Dirichlet process (DP) model for classifying and forecasting influenza epidemic curves. Methods The DP model is a nonparametric Bayesian approach that enables the matching of current influenza activity to simulated and historical patterns, identifies epidemic curves different from those observed in the past and enables prediction of the expected epidemic peak time. The method was validated using simulated influenza epidemics from an individual-based model and the accuracy was compared to that of the tree-based classification technique, Random Forest (RF), which has been shown to achieve high accuracy in the early prediction of epidemic curves using a classification approach. We also applied the method to forecasting influenza outbreaks in the United States from 1997–2013 using influenza-like illness (ILI) data from the Centers for Disease Control and Prevention (CDC). Results We made the following observations. First, the DP model performed as well as RF in identifying several of the simulated epidemics. Second, the DP model correctly forecasted the peak time several days in advance for most of the simulated epidemics. Third, the accuracy of identifying epidemics different from those already observed improved with additional data, as expected. Fourth, both methods correctly classified epidemics with higher reproduction numbers (R) with a higher accuracy compared to epidemics with lower R values. 
Lastly, in the classification of seasonal influenza epidemics based on ILI data from the CDC, the methods’ performance was comparable. Conclusions Although RF requires less computational time compared to the DP model, the algorithm is fully supervised implying that epidemic curves different from those previously observed will always be misclassified. In contrast, the DP model can be unsupervised, semi-supervised or fully supervised. Since both methods have their relative merits, an approach that uses both RF and the DP model could be beneficial.
The ability to forecast future volcanic eruption durations would greatly benefit emergency response planning prior to and during a volcanic crisis. This paper introduces a probabilistic model to forecast the duration of future and on-going eruptions. The model fits theoretical distributions to observed duration data and relies on past eruptions being a good indicator of future activity. A dataset of historical Mt. Etna flank eruptions is presented and used to demonstrate the model. The data have been compiled through critical examination of existing literature along with careful consideration of uncertainties on reported eruption start and end dates between the years 1300 AD and 2010. Data following 1600 are considered to be reliable and free of reporting biases. The distribution of eruption duration between the years 1600 and 1669 is found to be statistically different from that following it, and the forecasting model is run on two datasets of Mt. Etna flank eruption durations: 1600-2010 and 1670-2010. Each dataset is modelled using a log-logistic distribution with parameter values found by maximum likelihood estimation. Survivor function statistics are applied to the model distributions to forecast (a) the probability of an eruption exceeding a given duration, (b) the probability of an eruption that has already lasted a particular number of days exceeding a given total duration and (c) the duration with a given probability of being exceeded. Results show that excluding the 1600-1670 data has little effect on the forecasting model result, especially where short durations are involved. By assigning the terms `likely' and `unlikely' to probabilities of 66 % or more and 33 % or less, respectively, the forecasting model based on the 1600-2010 dataset indicates that a future flank eruption on Mt. Etna would be likely to exceed 20 days (± 7 days) but unlikely to exceed 86 days (± 29 days).
This approach can easily be adapted for use on other highly active, well-documented volcanoes or for different duration data such as the duration of explosive episodes or the duration of repose periods between eruptions.
Gunn, L. S.; Blake, S.; Jones, M. C.; Rymer, H.
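Forecasts (a) and (b) above follow directly from the survivor function of the fitted log-logistic distribution; a minimal sketch with illustrative parameters (not the fitted Mt. Etna values):

```python
def survivor(t, alpha, beta):
    """Log-logistic survivor function S(t): probability that an eruption
    lasts longer than t days. alpha is the scale (median duration) and
    beta the shape; both are illustrative placeholders here."""
    return 1.0 / (1.0 + (t / alpha) ** beta)

def conditional_exceedance(t_total, t_elapsed, alpha, beta):
    # (b): P(duration > t_total | already lasted t_elapsed)
    #      = S(t_total) / S(t_elapsed)
    return survivor(t_total, alpha, beta) / survivor(t_elapsed, alpha, beta)
```

Forecast (c), the duration exceeded with a given probability p, is just the inverse: t = alpha * ((1 - p) / p) ** (1 / beta) applied to S(t) = p.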
The performance of many components in intelligent transportation systems depends heavily on the quality of traffic forecasting. After analyzing the deficiencies of existing algorithms and methods in traffic forecasting, we develop a new traffic forecasting model based on logic reasoning, and in this paper we describe the details of each part of this model. Finally, we illustrate through an example how the model operates in traffic forecasting.
Li, Dancheng; Liu, Zhiliang; Liu, Cheng; Liu, Binsheng; Zhang, Wei
The scope of the report is to present the results of the fourth year's work on the atmospheric modeling part of the global climate studies task. The development testing of computer models and initial results are discussed. The appendices contain studies that provide supporting information and guidance to the modeling work and further details on computer model development. Complete documentation of the models, including user information, will be prepared under separate reports and manuals.
Crowley, T. J.; North, G. R.; Smith, N. R.
In a water-stressed region, such as the western United States, it is essential to have long lead-time streamflow forecasts for reservoir operation and water resources management. In this study, we develop and examine the accuracy of a data-driven model incorporating large-scale climate information for extending streamflow forecast lead-time. A data-driven model, i.e., a Support Vector Machine (SVM) based on statistical learning theory, is used to predict annual streamflow volume 1 year in advance. The SVM model is a learning system that uses a hypothesis space of linear functions in a Kernel-induced higher dimensional feature space, and is trained with a learning algorithm from optimization theory. Annual oceanic-atmospheric indices, comprising the Pacific Decadal Oscillation (PDO), North Atlantic Oscillation (NAO), Atlantic Multidecadal Oscillation (AMO), El Niño-Southern Oscillation (ENSO), and a new Sea Surface Temperature (SST) data set of the “Hondo” region for the period 1906-2006, are used to generate annual streamflow volumes for multiple sites in the Gunnison River Basin (GRB) and San Juan River Basin (SJRB) located in the Upper Colorado River Basin (UCRB). Based on Correlation Coefficient, Root Mean Square Error, and Mean Absolute Error, the model shows satisfactory results, and the predictions are in good agreement with measured streamflow volumes. Previous research has identified NAO and ENSO as main drivers for extending streamflow forecast lead-time in the UCRB. Contrary to this, the current research shows a stronger signal between the “Hondo” region SST and GRB and SJRB streamflow for 1-year lead-time. Streamflow predictions from the SVM model are found to be better when compared with the predictions obtained from a feed-forward back-propagation Artificial Neural Network model and a Multiple Linear Regression model. The streamflow forecasts provide valuable and useful information for optimal management and planning of water resources in the basins.
Kalra, A.; Miller, W. P.; Ahmad, S.; Lamb, K. W.
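The three skill measures named above are standard; a minimal sketch of how each compares observed and predicted streamflow volumes:

```python
import math

def rmse(obs, sim):
    """Root Mean Square Error: penalises large misses quadratically."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def mae(obs, sim):
    """Mean Absolute Error: average magnitude of the misses."""
    return sum(abs(o - s) for o, s in zip(obs, sim)) / len(obs)

def correlation(obs, sim):
    """Pearson correlation coefficient between observed and simulated series."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    ss = math.sqrt(sum((s - ms) ** 2 for s in sim))
    return cov / (so * ss)
```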
Econometric models were developed and estimated for the purpose of forecasting electricity and petroleum demand in US agriculture. A structural approach is pursued which takes account of the fact that the quantity demanded of any one input is a decision made in conjunction with other input decisions. Three different functional forms of varying degrees of complexity are specified for the structural cost function, which describes the cost of production as a function of the level of output and factor prices. Demand for materials (all purchased inputs) is derived from these models. A separate model, which breaks this demand up into demand for the four components of materials, is used to produce forecasts of electricity and petroleum in a stepwise manner.
Christensen, L. R.
Progress in formulating, solving and implementing models with multiple user classes that combine several travel choices into a single, consistent mathematical formulation is reviewed. Models in which the travel times and costs on the road network are link flow-dependent are considered; such models seek to represent congestion endogenously. The paper briefly summarizes the origins of this field in the 1950s
David Boyce; Hillel Bar-Gera
The study aims at developing a new scheme to investigate the potential use of ENSO (El Niño/Southern Oscillation) for drought forecasting. In this regard, objective of this study is to extend a previously developed nonhomogeneous hidden Markov chain model (NHMM) to identify climate states associated with drought that can be potentially used to forecast drought conditions using climate information. As a target variable for forecasting, SPI(standardized precipitation index) is mainly utilized. This study collected monthly precipitation data over 56 stations that cover more than 30 years and K-means cluster analysis using drought properties was applied to partition regions into mutually exclusive clusters. In this study, six main clusters were distinguished through the regionalization procedure. For each cluster, the NHMM was applied to estimate the transition probability of hidden states as well as drought conditions informed by large scale climate indices (e.g. SOI, Nino1.2, Nino3, Nino3.4, MJO and PDO). The NHMM coupled with large scale climate information shows promise as a technique for forecasting drought scenarios. A more detailed explanation of large scale climate patterns associated with the identified hidden states will be provided with anomaly composites of SSTs and SLPs. Acknowledgement This research was supported by a grant(11CTIPC02) from Construction Technology Innovation Program (CTIP) funded by Ministry of Land, Transport and Maritime Affairs of Korean government.
Kwon, H.; Yoo, J.; Kim, T.
Accurate short-term traffic flow forecasting is of vital importance for good road traffic management. To increase precision, this paper proposes a short-term traffic flow forecasting model based on data mining technology. The model consists of three stages: first, the rough set theory and the genetic algorithm are applied to select relevant forecasting
Bin-sheng Liu; Yi-jun Li; Hai-tao Yang; Xue-sheng Sui; Dong-feng Niu
Forecasting methods are routinely employed to predict the outcome of competitive events (CEs) and to shed light on the factors that influence participants’ winning prospects (e.g., in sports events, political elections). Combining statistical models’ forecasts, shown to be highly successful in other settings, has been neglected in CE prediction. Two particular difficulties arise when developing model-based composite forecasts of CE
Stefan Lessmann; Ming-Chien Sung; Johnnie E. V. Johnson; Tiejun Ma
The purpose of this paper is to empirically analyze a hybrid travel time forecasting model with Geographic Information Systems (GIS) technologies for predicting link travel times in congested road networks. In a separate study by You and Kim (2000), a nonparametric regression model has been developed as a core forecasting algorithm to reduce computation time and increase the forecasting accuracy.
Jinsoo You; Tschangho John Kim
We propose the optimal combination forecasting model based on closeness degree and induced ordered weighted harmonic averaging (IOWHA) operator under the uncertain environment in which the raw data are expressed as interval numbers. It is a new kind of combination forecasting model with variant weights. We can obtain weighted coefficient vectors of combination forecasting methods by maximizing the closeness degree
Lei Jin; Huayou Chen; Xiang Li; Mengjie Yao
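The IOWHA operator at the core of the method above can be sketched for point forecasts (the interval-number arithmetic and closeness-degree optimisation of the paper are omitted; this only illustrates the induced reordering followed by harmonic weighting):

```python
def iowha(pairs, weights):
    """Induced ordered weighted harmonic averaging (IOWHA).

    pairs:   (inducing_value, forecast) tuples, one per forecasting
             method; forecasts are reordered by decreasing inducing
             value (e.g. each method's recent accuracy) before the
             position weights are applied.
    weights: position weights summing to 1, applied to a harmonic mean.
    """
    ordered = [f for _, f in sorted(pairs, key=lambda p: p[0], reverse=True)]
    return 1.0 / sum(w / f for w, f in zip(weights, ordered))
```

The "variant weights" property comes from the induced ordering: which forecast receives which position weight changes period by period as the inducing values change.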
This paper considers forecasting by econometric and time series models using preliminary (or provisional) data. The standard practice is to ignore the distinction between provisional and final data. We call the forecasts that ignore such a distinction naïve forecasts, which are generated as projections from a correctly specified model using the most recent estimates of the unobserved final figures. It
Solar radiation forecasts are mainly demanded by the energy sector, besides other applications. Accurate short-term forecasts of solar energy resources are required for management of co-generation systems and energy dispatch in transmission lines. Mesoscale weather forecast models usually have radiation parameterization codes, since solar radiation is the main energy source for atmospheric processes. The Eta model running operationally in the
R. A. Guarnieri; E. B. Pereira; S. C. Chou
Daily reservoir inflow predictions with lead-times of several days are essential to the operational planning and scheduling of hydroelectric power systems. The demand for quantitative precipitation forecasting (QPF) is increasing in hydropower operation with the dramatic advances in numerical weather prediction (NWP) models. This paper presents a simple and effective algorithm for daily reservoir inflow predictions which uses observed precipitation and forecasted precipitation from QPF as predictors, and discharges in the following 1 to 6 days as predicted targets, for multilayer perceptron artificial neural network (MLP-ANN) modeling. An improved error back-propagation algorithm with self-adaptive learning rate and self-adaptive momentum coefficient is used to make the supervised training procedure more efficient in both time saving and search optimization. Several commonly used error measures are employed to evaluate the performance of the proposed model and the results, compared with those of an ARIMA model, show that the proposed model is capable of satisfactory forecasting not only in goodness of fit but also in generalization. Furthermore, the presented algorithm is integrated into a practical software system which has served daily inflow predictions with lead-times varying from 1 to 6 days for more than twenty reservoirs operated by the Fujian Province Grid Company, China.
Zhang, Jun; Cheng, Chun-Tian; Liao, Sheng-Li; Wu, Xin-Yu; Shen, Jian-Jian
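The self-adaptive learning-rate idea can be sketched on a toy one-dimensional problem (the adaptation factors and the reject-and-reset rule below are illustrative assumptions in the spirit of adaptive back-propagation, not the paper's exact algorithm):

```python
def train_adaptive(grad_fn, loss_fn, x0, lr=0.1, mom=0.5,
                   inc=1.05, dec=0.7, steps=50):
    """Gradient descent with self-adaptive learning rate and momentum:
    grow the step while the loss keeps falling; when a step makes the
    loss worse, reject it, shrink the learning rate and reset momentum."""
    x, v, prev_loss = x0, 0.0, loss_fn(x0)
    for _ in range(steps):
        v = mom * v - lr * grad_fn(x)
        x_new = x + v
        new_loss = loss_fn(x_new)
        if new_loss <= prev_loss:          # accept and speed up
            x, prev_loss, lr = x_new, new_loss, lr * inc
        else:                              # reject, cool down
            lr, v = lr * dec, 0.0
    return x
```

In an MLP the same rule is applied to the weight vector, with the gradient coming from error back-propagation; the payoff is faster training without hand-tuning a fixed learning rate.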
A recently implemented real-time ocean prediction system for the western North Atlantic based on the physical circulation model component of the Harvard Ocean Prediction System (HOPS) was used during an observation simulation experiment (OSE) in November 2009. The modeling system was built to capture the mesoscale dynamics of the Gulf Stream (GS), its meanders and rings, and its interaction with the shelf circulation. To accomplish this, the multiscale velocity-based feature models for the GS region are melded with the water-mass-based feature model for the Gulf of Maine and shelf climatology across the shelf/slope front for synoptic initialization. The feature-based initialization scheme was utilized for 4 short-term forecasts of varying lengths during the first two weeks of November 2009 in an ensemble mode with other forecasts to guide glider control. A reanalysis was then carried out by sequentially assimilating the data from three gliders (RU05, RU21 and RU23) for the two-week period. This two-week-long reanalysis framework was used to (i) study model sensitivity to SST and glider data assimilation; and (ii) analyze the impact of assimilation in space and time with patchy glider data. The temporal decay of salinity assimilation is found to be different than that of temperature. The spatial footprint of assimilated temperature appears to be more defined than that of salinity. A strategy for assimilating temperature and salinity in an SST-glider phased manner is then offered. The reanalysis results point to a number of new research directions for future sensitivity and quantitative studies in modeling and data assimilation.
Gangopadhyay, Avijit; Schmidt, Andre; Agel, Laurie; Schofield, Oscar; Clark, Jenifer
To address flight delay forecasting while accounting for the characteristics of airport flight operations, a new composite forecasting model based on danger model theory and grey model theory is proposed in this paper. The composite method weights the component forecasts according to the proportions of their mean square forecasting errors, and a modified approach is used to reflect the periodicity of the data. Experimental results show that the predictions are of acceptable quality and that the new model can meet the real-time prediction requirements of emergency management departments.
Ding, Jianli; Li, Huafeng
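The weighting scheme described above (weights set by the proportions of the models' mean square forecasting errors) can be sketched as follows; the error sequences and forecast values are invented toy numbers, not taken from the paper:

```python
def inverse_mse_weights(errors_by_model):
    """Composite weights proportional to the inverse of each model's
    mean square forecasting error on past forecasts."""
    mses = [sum(e * e for e in errs) / len(errs) for errs in errors_by_model]
    inv = [1.0 / m for m in mses]
    total = sum(inv)
    return [v / total for v in inv]

def composite(forecasts, weights):
    """Weighted combination of the individual model forecasts."""
    return sum(w * f for w, f in zip(weights, forecasts))

# toy past errors: model A (e.g. grey model), model B (e.g. danger-model predictor)
weights = inverse_mse_weights([[1.0, -1.0, 2.0], [0.5, 0.5, -0.5]])
delay = composite([12.0, 18.0], weights)   # combined delay forecast, minutes
```

The model with the smaller historical mean square error automatically receives the larger share of the composite.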
Data assimilation (DA) has been widely used in hydrological models to improve model state and subsequent streamflow estimates. However, for poor or non-existent state observations, the state estimation in hydrological DA can be problematic, leading to inaccurate streamflow updates. This study evaluates the soil moisture and flow variations and forecasts by assimilating streamflow and soil moisture. Three approaches of Ensemble Kalman Filter (EnKF) with dual state-parameter estimation are applied: (1) streamflow assimilation, (2) soil moisture assimilation, and (3) combined assimilation of soil moisture and streamflow. The assimilation approaches are evaluated using the Sacramento Soil Moisture Accounting (SAC-SMA) model in the Spencer Creek catchment in southern Ontario, Canada. The results show that there are significant differences in soil moisture variations and streamflow estimates when the three assimilation approaches are applied. In the streamflow assimilation, soil moisture states were markedly distorted, particularly soil moisture of the lower soil layer; whereas, in the soil moisture assimilation, streamflow estimates are inaccurate. The combined assimilation of streamflow and soil moisture provides more accurate forecasts of both soil moisture and streamflow, particularly for shorter lead times. The combined approach has the flexibility to account for model adjustment through the time variation of parameters together with state variables when soil moisture and streamflow observations are integrated into the assimilation procedure. This evaluation is important for the application of DA methods to simultaneously estimate soil moisture states and watershed response and forecasts.
Samuel, Jos; Coulibaly, Paulin; Dumedah, Gift; Moradkhani, Hamid
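A minimal illustration of one EnKF analysis step, for a single, directly observed scalar state (a stand-in for streamflow or a soil moisture store; the dual state-parameter machinery of the study is not shown, and all numbers are invented):

```python
import random

def enkf_update(ensemble, obs, obs_var, rng):
    """One EnKF analysis step for a directly observed scalar state.
    Each member is nudged toward a perturbed copy of the observation
    with gain K = P / (P + R), where P is the ensemble variance."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    p = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    k = p / (p + obs_var)
    return [x + k * (obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

rng = random.Random(0)
prior = [rng.gauss(10.0, 2.0) for _ in range(500)]       # forecast ensemble
post = enkf_update(prior, obs=12.0, obs_var=1.0, rng=rng)
```

The analysis pulls the ensemble mean toward the observation and shrinks the ensemble spread, the basic behaviour exploited in all three assimilation approaches.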
The advances in weather forecasting traditionally have been based on two lines of improvement: (1) deepening the understanding of the physical phenomena that underlie atmospheric dynamics; and (2) a steady increase in computer power that enables the use of finer grid resolutions. The meteorological centers model the dynamics of the atmosphere with the same basic physical laws, but sometimes take different approaches to capturing small-scale phenomena and generally use different grid sizes. As a result there are dozens of operational models around the globe with various parameterizations of the unresolved processes. The newest attempts at forecast improvement are based on ensemble prediction: multiple outputs are taken from runs with perturbed initial conditions or perturbed parameter values. A novel paradigm exploits the dynamical exchange of variables between simultaneously running models. There are already simulations in which fluxes are exchanged between ocean and atmospheric models, but examples with direct coupling of different atmospheric models are rather new. Within this approach the coupling schemes can differ, but the simplest are those that combine corresponding dynamical variables or tendency components. In this work we present results with an artificial toy model, the Lorenz 96 model. To make the example more faithful, reality (the atmosphere) is represented by one Lorenz 96 class III system, while its imperfect models are three class II systems with different forcing terms. These resemble the models used in three different meteorological centers. The interactive ensemble has a tendency that is a weighted combination of the individual models' tendencies. The weights are obtained with statistical techniques based on past observations that aim to minimize the mismatch between the tendencies of the truth and of the interactive ensemble. By means of anomaly correlation it is numerically verified that this ensemble has a longer forecast range than the individual models.
Basnarkov, L.; Duane, G. S.; Kocarev, L.
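For reference, the standard single-scale (class I) Lorenz 96 system, which the class II and III variants extend, can be integrated with a short RK4 stepper; the 40-variable size, forcing F = 8 and tiny perturbation are conventional illustrative choices, not values from the paper:

```python
def lorenz96_step(x, f, dt=0.01):
    """One RK4 step of the class-I Lorenz 96 system
    dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F."""
    n = len(x)
    def d(x):
        return [(x[(i + 1) % n] - x[(i - 2) % n]) * x[(i - 1) % n] - x[i] + f
                for i in range(n)]
    k1 = d(x)
    k2 = d([xi + dt / 2 * ki for xi, ki in zip(x, k1)])
    k3 = d([xi + dt / 2 * ki for xi, ki in zip(x, k2)])
    k4 = d([xi + dt * ki for xi, ki in zip(x, k3)])
    return [xi + dt / 6 * (a + 2 * b + 2 * c + e)
            for xi, a, b, c, e in zip(x, k1, k2, k3, k4)]

x = [8.0] * 40          # x_i = F is an unstable equilibrium
x[19] += 0.01           # a small perturbation triggers chaos
for _ in range(1000):   # integrate to t = 10
    x = lorenz96_step(x, f=8.0)
```

Because the equilibrium is unstable, the perturbed trajectory departs onto the chaotic attractor, which is what makes this system a useful testbed for ensemble forecasting schemes.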
A real-time flood forecasting system using a channel flow routing model was developed for runoff forecasting at gauged and ungauged points along river channels. The system is based on a flood runoff model composed of upstream part models, tributary part models and downstream part models. The upstream part models and tributary part models are lumped rainfall-runoff models, and the downstream
R. Kudo; H. Chikamori; A. Nagai
Coastal inundations are an increasing threat to the lives and livelihoods of people living in low-lying, highly-populated coastal areas. According to a World Bank report in 2005, at least 2.6 million people may have drowned due to coastal inundation, particularly caused by storm surges, over the last 200 years. Forecasting and prediction of natural events, such as tropical and extra-tropical cyclones, inland flooding, and severe winter weather, provide critical guidance to emergency managers and decision-makers from the local to the national level, with the goal of minimizing both human and economic losses. This guidance is used to facilitate evacuation route planning, post-disaster response and resource deployment, and critical infrastructure protection and securing, and it must be available within a time window in which decision makers can take appropriate action. Recognizing this extreme vulnerability of coastal areas to inundation and flooding, and with a view to improving safety-related services for the community, research should strongly enhance today's forecasting, prediction and early warning capabilities in order to improve the assessment of coastal vulnerability and risks and develop adequate prevention, mitigation and preparedness measures. This paper develops an impact-oriented quantitative coastal inundation forecasting and early warning system with social and economic assessment, addressing the challenges coastal communities face in enhancing their safety and supporting sustainable development through improved coastal inundation forecasting and warning systems.
Fakhruddin, S. H. M.; Babel, Mukand S.; Kawasaki, Akiyuki
A forecasting model of the concentration ratio (CR) of 137Cs in plants, taking into consideration organic carbon, pH, and the mobile and total content of potassium in soil, has been developed on the basis of radioecological investigations in the valleys of the Resseta and Vytebet rivers. The type of functional dependence of CR on soil characteristics can be used for estimating the content of radionuclides in various species and productive parts of plants.
Spirin, E V; Anisimov, V S; Dikarev, D B; Kochetkov, I V; Krylenkin, D V
An experiment on predicting flood flows at upstream sections and a downstream section of a river network is presented using a focused Time Lagged Recurrent Neural Network (TLRN) with three different memory structures: TDNN memory, gamma memory and Laguerre memory. This paper focuses on the application of memory to the input layer of a TLRN in developing flood forecasting models for multiple sections in a river system. The study shows that gamma memory has the best applicability, followed by TDNN and Laguerre memory.
Roy, Parthajit; Choudhury, P. S.; Saharia, Manabendra
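The gamma memory mentioned above has a simple recursive form, x_k(t) = (1 - mu)x_k(t-1) + mu x_{k-1}(t-1), with the raw input as tap 0; a stdlib sketch of the filter bank follows (the tap count and mu are illustrative values, not the paper's configuration):

```python
def gamma_memory(signal, taps=3, mu=0.5):
    """Gamma memory filter bank: tap k is a leaky integrator of tap
    k-1, x_k(t) = (1 - mu) * x_k(t-1) + mu * x_{k-1}(t-1); mu trades
    memory depth against temporal resolution."""
    state = [0.0] * taps
    prev_input = 0.0
    outputs = []
    for u in signal:
        lower = [prev_input] + state[:-1]   # x_{k-1} values at t-1
        state = [(1 - mu) * s + mu * p for s, p in zip(state, lower)]
        outputs.append(list(state))
        prev_input = u
    return outputs

# impulse response: tap k carries the order-k gamma kernel
taps_over_time = gamma_memory([1.0] + [0.0] * 9)
```

Feeding these tap outputs to the input layer of a network gives it access to an adjustable window of past inflows, which is the role the memory structures play in the TLRN.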
The purpose of this paper is to develop and evaluate a hybrid travel time forecasting model with geographic information systems (GIS) technologies for predicting link travel times in congested road networks. In a separate study by You and Kim (cf. You, J., Kim, T.J., 1999b. In: Proceedings of the Third Bi-Annual Conference of the Eastern Asia Society for Transportation Studies,
Jinsoo You; Tschangho John Kim
Data assimilation through Kalman filtering [1,2] is a powerful statistical tool which allows modeling and observations to be combined to increase the degree of knowledge of a given system. We apply this technique to the forecast of solar wind parameters (proton speed, proton temperature, absolute value of the magnetic field and proton density) at 1 AU, using the model described in [3] and ACE data as observations. The model, which relies on GOES 12 observations of the percentage of the meridional slice of the sun covered by coronal holes, provides forecasts of the aforementioned quantities 1 day and 6 hours in advance in quiet times (CMEs are not taken into account) during the declining phase of the solar cycle and is tailored for specific time intervals. We show that the application of data assimilation generally improves the quality of the forecasts during quiet times and, more notably, extends the periods of applicability of the model, which can now provide reliable forecasts also in the presence of CMEs and for periods other than the ones it was designed for. Acknowledgement: The research leading to these results has received funding from the European Commission's Seventh Framework Programme (FP7/2007-2013) under grant agreement N. 218816 (SOTERIA project: http://www.soteria-space.eu). References: [1] R. Kalman, J. Basic Eng. 82, 35 (1960); [2] G. Welch and G. Bishop, Technical Report TR 95-041, University of North Carolina, Department of Computer Science (2001); [3] B. Vrsnak, M. Temmer, and A. Veronig, Solar Phys. 240, 315 (2007).
Innocenti, M.; Lapenta, G.; Vrsnak, B.; Temmer, M.; Veronig, A.; Bettarini, L.; Lee, E.; Markidis, S.; Skender, M.; Crespon, F.; Skandrani, C.; Soteria Space-Weather Forecast; Data Assimilation Team
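A scalar Kalman filter cycle in the spirit of the Kalman filtering cited above, with a random-walk state model and invented numbers standing in for solar wind speed observations (the actual assimilation in the abstract is multivariate and model-driven):

```python
def kalman_step(x, p, z, q, r):
    """One predict/update cycle of a scalar Kalman filter with a
    random-walk state model: x_k = x_{k-1} + w (var q), z_k = x_k + v (var r)."""
    # predict
    x_pred, p_pred = x, p + q
    # update
    k = p_pred / (p_pred + r)            # Kalman gain
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

x, p = 400.0, 100.0                      # initial guess, e.g. wind speed in km/s
for z in [430.0, 445.0, 440.0]:          # incoming in-situ observations
    x, p = kalman_step(x, p, z, q=4.0, r=25.0)
```

Each assimilated observation moves the state estimate toward the data and reduces its variance, which is how assimilation corrects a drifting forecast model.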
Electricity demand forecasting is becoming an essential tool for energy management, maintenance scheduling and investment decisions in the future liberalized energy markets and fluctuating fuel prices. To address these needs, appropriate forecasting tools for the electricity demand in Greece have been developed and tested. Electricity demand depends on economic variables and national circumstances as well as on climatic conditions. Following
S. Mirasgedis; Y. Sarafidis; E. Georgopoulou; D. P. Lalas; M. Moschovits; F. Karagiannis; D. Papakonstantinou
The objective of this study was to assess the suitability of 3 different modeling techniques for the prediction of total daily herd milk yield from a herd of 140 lactating pasture-based dairy cows over varying forecast horizons. A nonlinear auto-regressive model with exogenous input, a static artificial neural network, and a multiple linear regression model were developed using 3 yr of historical milk-production data. The models predicted the total daily herd milk yield over a full season using a 305-d forecast horizon and 50-, 30-, and 10-d moving piecewise horizons to test the accuracy of the models over long- and short-term periods. All 3 models predicted the daily production levels for a full lactation of 305 d with a percentage root mean square error (RMSE) of ≤12.03%. However, the nonlinear auto-regressive model with exogenous input was capable of increasing its prediction accuracy as the horizon was shortened from 305 to 50, 30, and 10 d [RMSE (%)=8.59, 8.1, 6.77, 5.84], whereas the static artificial neural network [RMSE (%)=12.03, 12.15, 11.74, 10.7] and the multiple linear regression model [RMSE (%)=10.62, 10.68, 10.62, 10.54] were not able to reduce their forecast error over the same horizons to the same extent. For this particular application the nonlinear auto-regressive model with exogenous input can be presented as a more accurate alternative to conventional regression modeling techniques, especially for short-term milk-yield predictions.
Murphy, M D; O'Mahony, M J; Shalloo, L; French, P; Upton, J
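The linear core of a NARX model (an ARX(1), dropping the nonlinearity) can be fit in closed form by least squares; the coefficients and input series below are invented for illustration, not the study's milk-yield data:

```python
import math

def fit_arx1(y, u):
    """Least-squares fit of a minimal ARX model
    y_t = a * y_{t-1} + b * u_t (the linear core of a NARX model),
    via the 2x2 normal equations."""
    ys, yl, us = y[1:], y[:-1], u[1:]
    s11 = sum(v * v for v in yl)
    s12 = sum(v * w for v, w in zip(yl, us))
    s22 = sum(w * w for w in us)
    r1 = sum(v * t for v, t in zip(yl, ys))
    r2 = sum(w * t for w, t in zip(us, ys))
    det = s11 * s22 - s12 * s12
    a = (r1 * s22 - r2 * s12) / det
    b = (r2 * s11 - r1 * s12) / det
    return a, b

# synthetic data generated exactly from a = 0.6, b = 2.0
u = [math.sin(0.5 * t) for t in range(60)]
y = [0.0]
for t in range(1, 60):
    y.append(0.6 * y[-1] + 2.0 * u[t])
a, b = fit_arx1(y, u)
```

A full NARX model replaces this linear map with a neural network over the same lagged inputs, which is what allows it to sharpen short-horizon forecasts.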
At present continental to global scale flood forecasting focusses on predicting point discharge, with little attention to the detail and accuracy of local scale inundation predictions. Yet, inundation is actually the variable of interest and all flood impacts are inherently local in nature. This paper proposes a first large scale flood inundation ensemble forecasting model that uses best available data and modeling approaches in data scarce areas and at continental scales. The model was built for the Lower Zambezi River in southeast Africa to demonstrate current flood inundation forecasting capabilities in large data-scarce regions. The inundation model domain has a surface area of approximately 170,000 km2. ECMWF meteorological data were used to force the VIC (Variable Infiltration Capacity) macro-scale hydrological model, which simulated and routed daily flows to the input boundary locations of the 2-D hydrodynamic model. Efficient hydrodynamic modeling over large areas still requires model grid resolutions that are typically larger than the width of many river channels that play a key role in flood wave propagation. We therefore employed a novel sub-grid channel scheme to describe the river network in detail whilst at the same time representing the floodplain at an appropriate and efficient scale. The modeling system was first calibrated using water levels on the main channel from the ICESat (Ice, Cloud, and land Elevation Satellite) laser altimeter and then applied to predict the February 2007 Mozambique floods. Model evaluation showed that simulated flood edge cells were within a distance of about 1 km (one model resolution) of the observed flood edge of the event. Our study highlights that physically plausible parameter values and satisfactory performance can be achieved at spatial scales ranging from tens to several hundreds of thousands of km2 and at model grid resolutions up to several km2.
However, initial model test runs in forecast mode revealed that it is crucial to account for basin-wide hydrological response time when assessing lead time performances notwithstanding structural limitations in the hydrological model and possibly large inaccuracies in precipitation data.
Schumann, Guy J-P; Neal, Jeffrey C.; Voisin, Nathalie; Andreadis, Konstantinos M.; Pappenberger, Florian; Phanthuwongpakdee, Kay; Hall, Amanda C.; Bates, Paul D.
A probabilistic precipitation forecasting model using generalized additive models (GAMs) and Bayesian model averaging (BMA) was proposed in this paper. GAMs were used to fit the spatial-temporal precipitation models to individual ensemble member forecasts. The distributions of the precipitation occurrence and the cumulative precipitation amount were represented simultaneously by a single Tweedie distribution. BMA was then used as a post-processing method to combine the individual models to form a more skillful probabilistic forecasting model. The mixing weights were estimated using the expectation-maximization algorithm. Residual diagnostics were used to examine whether the fitted BMA forecasting model had fully captured the spatial and temporal variations of precipitation. The proposed method was applied to daily observations at the Yishusi River basin for July 2007 using the National Centers for Environmental Prediction ensemble forecasts. By applying scoring rules, the BMA forecasts were verified and showed better performance compared with the empirical probabilistic ensemble forecasts, particularly for extreme precipitation. Finally, possible improvements and application of this method to the downscaling of climate change scenarios were discussed.
Yang, Chi; Yan, Zhongwei; Shao, Yuehong
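The EM re-estimation of BMA mixing weights can be sketched for Gaussian member densities; note the paper uses a Tweedie distribution for precipitation, so the Gaussian components, fixed spread and toy data here are simplifying assumptions made only for illustration:

```python
import math

def em_bma_weights(obs, member_means, sigma=1.0, iters=25):
    """EM estimation of BMA mixing weights for Gaussian member
    densities: the E step computes each member's responsibility for
    each observation, the M step sets the weights to the mean
    responsibilities."""
    m = len(member_means[0])
    w = [1.0 / m] * m
    def pdf(y, mu):
        return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
    for _ in range(iters):
        resp = []
        for y, mus in zip(obs, member_means):
            dens = [w[k] * pdf(y, mus[k]) for k in range(m)]
            s = sum(dens)
            resp.append([d / s for d in dens])
        w = [sum(r[k] for r in resp) / len(obs) for k in range(m)]
    return w

# member 0 tracks the observations; member 1 is biased high by 3
obs = [1.0, 2.0, 1.5, 2.5, 1.8]
member_means = [[y, y + 3.0] for y in obs]
w = em_bma_weights(obs, member_means)
```

The accurate member accumulates nearly all the responsibility, so its mixing weight dominates the averaged predictive density.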
This paper discusses an extension of the "Feature Space Forecast Method" we proposed previously. When forecasting traffic information, various factors such as day of week, season and holidays must be considered. Furthermore, for nation-wide forecast services, the number of road links handled by a forecast model exceeds 0.1 million. Therefore, in order to provide accurate nation-wide services, a forecast method that can efficiently deal with a large amount of traffic data is required. The proposed method achieves an efficient forecast process with a forecast model one-tenth the size of those of traditional methods, by performing the forecasting calculation in a feature space shared by multiple road links.
Kumagai, Masatoshi; Fushiki, Takumi; Kimita, Kazuya; Yokota, Takayoshi
The paper outlines a procedure for using an earthquake instability model and repeated geodetic measurements to attempt an earthquake forecast. The procedure differs from other prediction methods, such as recognizing trends in data or assuming failure at a critical stress level, by using a self-contained instability model that simulates both preseismic and coseismic faulting in a natural way. In short, physical theory supplies a family of curves, and the field data select the member curves whose continuation into the future constitutes a prediction. Model inaccuracy and resolving power of the data determine the uncertainty of the selected curves and hence the uncertainty of the earthquake time.
Stuart, William D.; Archuleta, Ralph J.; Lindh, Allan G.
Quantitative methods have nowadays become very important tools for forecasting purposes in financial markets, improving decisions and investments. Forecasting accuracy is one of the most important factors involved in selecting a forecasting method; hence, research directed at improving the effectiveness of time series models has never stopped. Artificial neural networks (ANNs) are flexible computing frameworks and universal
Mehdi Khashei; Seyed Reza Hejazi; Mehdi Bijari
The energy system and economic models described in this paper are designed for general usage in government and industry. Most energy-policy issues and decisions involve a complex mix of technical, economic, environmental, and social value considerations. ...
K. C. Hoffman
Flood inundation poses a major risk to many populated areas around the world. Despite the economic losses and the devastating societal impacts floods have, low frequency, high magnitude events are still poorly monitored, modelled and predicted in many areas across the globe, especially in data-poor regions of the developing world. In these areas, satellite observations and large scale coupled hydrologic-hydrodynamic models are currently the only option to help understand and predict high magnitude flood events. To contribute to these ongoing efforts, this paper presents a simple index for forecasting large-scale flood inundation in data poor regions. Based on a test case in the Lower Zambezi basin (Mozambique), we demonstrate how satellite data, specifically data from the upcoming SMAP mission can be used in conjunction with meteorological forecast data and outputs from a coupled hydrologic-hydrodynamic (VIC-LISFLOOD-FP) model of the region to build up meaningful correlations between rainfall, antecedent soil moisture and simulated flood inundation variables. Along with the data, these correlations can then be used to build up a long term look-up catalogue to develop a simple flood forecast index. Our project illustrates that this index can be applied to forecast flood inundation based on forecast rainfall and observed antecedent soil moisture without the need to run a model.
Schumann, Guy J.-P.; Andreadis, Kostas; Niebuhr, Emily; Rashid, Kashif; Njoku, Eni
This paper investigates the skill of 90 day low flow forecasts using two conceptual hydrological models and two data-driven models based on Artificial Neural Networks (ANNs) for the Moselle River. One data-driven model, ANN-Indicator (ANN-I), requires historical inputs on precipitation (P), potential evapotranspiration (PET), groundwater (G) and observed discharge (Q), whereas the other data-driven model, ANN-Ensemble (ANN-E), and the two conceptual models, HBV and GR4J, use forecasted meteorological inputs (P and PET), whereby we employ ensemble seasonal meteorological forecasts. We compared low flow forecasts without any meteorological forecasts as input (ANN-I) and five different cases of seasonal meteorological forcing: (1) ensemble P and PET forecasts; (2) ensemble P forecasts and observed climate mean PET; (3) observed climate mean P and ensemble PET forecasts; (4) observed climate mean P and PET and (5) zero P and ensemble PET forecasts as input for the other three models (GR4J, HBV and ANN-E). The ensemble P and PET forecasts, each consisting of 40 members, reveal the forecast ranges due to the model inputs. The five cases are compared for a lead time of 90 days based on model output ranges, whereas the four models are compared based on their skill of low flow forecasts for varying lead times up to 90 days. Before forecasting, the hydrological models are calibrated and validated for a period of 30 and 20 years respectively. The smallest difference between calibration and validation performance is found for HBV, whereas the largest difference is found for ANN-E. From the results, it appears that all models are prone to over-predict low flows using ensemble seasonal meteorological forcing. The largest range for 90 day low flow forecasts is found for the GR4J model when using ensemble seasonal meteorological forecasts as input. 
GR4J, HBV and ANN-E under-predicted 90 day ahead low flows in the very dry year 2003 without precipitation data, whereas ANN-I predicted the magnitude of the low flows better than the other three models. The results of the comparison of forecast skills with varying lead times show that GR4J is less skilful than ANN-E and HBV. Furthermore, the hit rate of ANN-E is higher than the two conceptual models for most lead times. However, ANN-I is not successful in distinguishing between low flow events and non-low flow events. Overall, the uncertainty from ensemble P forecasts has a larger effect on seasonal low flow forecasts than the uncertainty from ensemble PET forecasts and initial model conditions.
Demirel, M. C.; Booij, M. J.; Hoekstra, A. Y.
The stock index reflects the fluctuation of the stock market. For a long time, there has been much research on stock index forecasting. However, traditional methods are limited in achieving ideal precision in the dynamic market due to the influences of many factors such as the economic situation, policy changes, and emergency events. Therefore, approaches based on adaptive modeling and conditional probability transfer have attracted new attention from researchers. This paper presents a new forecast method combining an improved back-propagation (BP) neural network and a Markov chain, together with its modeling and computing technology. The method comprises initial forecasting by the improved BP neural network, division of the Markov state region, computation of the state transition probability matrix, and adjustment of the prediction. Results of the empirical study show that this method can achieve high accuracy in stock index prediction, and it could provide a good reference for investment in the stock market.
Dai, Yonghui; Han, Dongmei; Dai, Weihui
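The state-transition-matrix step of such a method can be illustrated with a toy error-state sequence (the three-state definition and the sequence itself are invented, not taken from the paper):

```python
def transition_matrix(states, n_states):
    """Estimate a Markov state-transition probability matrix from an
    observed sequence of forecast-error states."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    probs = []
    for row in counts:
        s = sum(row)
        probs.append([c / s for c in row] if s else [1.0 / n_states] * n_states)
    return probs

# 0 = model under-forecast, 1 = near, 2 = over-forecast (toy sequence)
seq = [1, 1, 2, 1, 0, 1, 1, 2, 2, 1, 1, 0, 1]
p = transition_matrix(seq, 3)
```

Given the current error state, the corresponding row of the matrix gives the distribution over the next error state, which is what drives the adjustment of the raw network forecast.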
As environmental time series have grown, and computer-intensive statistical methods have become more convenient, fitting mechanistic models that incorporate both process and observation error (i.e. state-space models) has become increasingly popular. It has been suggested that such models are more robust to noise due to their inclusion of a process-error term; however, their out-of-sample forecast ability remains largely untested. Therefore, it is important to determine how various forecasting strategies perform under realistic levels of noise and forcing. We compared the forecast accuracy of a model-free forecasting approach based on nonlinear state-space reconstruction (SSR) against a suite of mechanistic models fit to their own time series with realistic levels of noise added. To further favor the mechanistic approach, these models were fit using a Bayesian adaptive MCMC algorithm actually initiated on the correct parameter values. Surprisingly, we found that the SSR forecasts were more accurate than the correct mechanistic models despite being fit to only one time series of a multivariate system. This was true for four different ecological models, and for experimental data from a series of flour beetle experiments. Our results suggest that for forecasting real ecosystems, where the correct model is never known, a robust model-free approach such as SSR may be a more practical alternative to complex fitted models containing many free parameters.
Perretti, C.; Munch, S. B.; Deyle, E. R.; Ye, H.; Sugihara, G.
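A minimal model-free SSR forecaster in the simplex-projection spirit (delay embedding plus nearest-neighbour averaging); the embedding dimension, neighbour count and sinusoidal test series are illustrative choices, not the paper's setup:

```python
import math

def ssr_forecast(series, dim=2, k=3):
    """Model-free one-step forecast via state-space reconstruction:
    embed the series in delay coordinates, find the k nearest past
    neighbours of the current state, and average where they went next."""
    lib = []
    for t in range(dim - 1, len(series) - 1):
        vec = tuple(series[t - dim + 1:t + 1])
        lib.append((vec, series[t + 1]))          # (state, next value)
    target = tuple(series[-dim:])
    lib.sort(key=lambda pair: sum((a - b) ** 2 for a, b in zip(pair[0], target)))
    return sum(nxt for _, nxt in lib[:k]) / k

series = [math.sin(0.3 * t) for t in range(100)]
pred = ssr_forecast(series)                       # forecast of sin(30.0)
```

No parametric model is ever fit: the reconstructed attractor itself supplies the forecast, which is why the approach sidesteps model misspecification entirely.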
Ever since the initial planning for the 1997 Utah legislative session, neural-network forecasting techniques have provided valuable insights for analysts forecasting tax revenues. These revenue estimates are critically important since agency budgets, support for education, and improvements to infrastructure all depend on their accuracy. Underforecasting generates windfalls that concern taxpayers, whereas overforecasting produces budget shortfalls that cause inadequately funded commitments.
James V. Hansen; Ray D. Nelson
This article introduces a new source of survey data, namely the Bank of England Survey of External Forecasters. The survey collects point and density forecasts of inflation and GDP growth and, hence, offers the opportunity of constructing direct measures of uncertainty. We present a simple statistical framework in which to define and interrelate measures of uncertainty and disagreement. The resulting
Gianna Boero; Jeremy Smith; Kenneth F. Wallis
Mortality models often have inbuilt identification issues challenging the statistician. The statistician can choose to work with well-defined freely varying parameters, derived as maximal invariants in this paper, or with ad hoc identified parameters which at first glance seem more intuitive, but which can introduce a number of unnecessary challenges. In this paper we describe the methodological advantages from using the maximal invariant parameterisation and we go through the extra methodological challenges a statistician has to deal with when insisting on working with ad hoc identifications. These challenges are broadly similar in frequentist and in Bayesian setups. We also go through a number of examples from the literature where ad hoc identifications have been preferred in the statistical analyses.
Nielsen, Jens P.
This paper presents a short-term forecasting model of monthly West Texas Intermediate crude oil spot prices using readily available OECD industrial petroleum inventory levels. The model provides good in-sample and out-of-sample dynamic forecasts for the post-Gulf War time period. In-sample and out-of-sample forecasts from the model are compared with those derived from other models. The model is intended for the
Michael Ye; John Zyren; Joanne Shore
We take a model selection approach to the question of whether a class of adaptive prediction models (artificial neural networks) is useful for predicting future values of nine macroeconomic variables. We use a variety of out-of-sample forecast-based model selection criteria, including forecast error measures and forecast direction accuracy. Ex ante or real-time forecasting results based on rolling window prediction methods
Norman R. Swanson; Halbert White
This paper presents four algorithms to generate random forecast error time series, including a truncated-normal distribution model, a state-space based Markov model, a seasonal autoregressive moving average (ARMA) model, and a stochastic-optimization based model. The error time series are used to create real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast time series that statistically match historically observed forecasting data sets, for use in variable generation integration studies. A comparison is made using historical DA load forecasts and actual load values to generate new sets of DA forecasts with similar statistical forecast error characteristics. This paper discusses and compares the capabilities of each algorithm to preserve the characteristics of the historical forecast data sets.
Lu, Ning; Diao, Ruisheng; Hafen, Ryan P.; Samaan, Nader A.; Makarov, Yuri V.
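Of the four algorithms, the ARMA family is the easiest to sketch; a plain AR(1) error generator that preserves a target autocorrelation is shown below (the parameter values are illustrative, and the paper's seasonal ARMA structure is not reproduced):

```python
import random

def ar1_error_series(n, phi, sigma, rng):
    """Generate a synthetic forecast-error series from an AR(1)
    process e_t = phi * e_{t-1} + w_t, w_t ~ N(0, sigma^2), so the
    simulated errors keep the lag-1 autocorrelation (phi) and the
    stationary variance sigma^2 / (1 - phi^2) of historical errors."""
    e = [0.0]
    for _ in range(n - 1):
        e.append(phi * e[-1] + rng.gauss(0.0, sigma))
    return e

rng = random.Random(42)
errs = ar1_error_series(5000, phi=0.8, sigma=1.0, rng=rng)
```

Adding such a series to actual load or wind values yields synthetic DA/HA/RT forecasts whose error statistics mimic the historical forecasting system.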
Predicting the stock market has become an increasingly interesting research area for both researchers and investors, and many prediction models have been proposed. In these models, feature selection techniques are used to pre-process the raw data and remove noise. In this paper, a prediction model is constructed to forecast stock market behavior with the aid of independent component analysis, canonical correlation analysis, and a support vector machine. First, two types of features are extracted from the historical closing prices and 39 technical variables obtained by independent component analysis. Second, a canonical correlation analysis method is utilized to combine the two types of features and extract intrinsic features to improve the performance of the prediction model. Finally, a support vector machine is applied to forecast the next day's closing price. The proposed model is applied to the Shanghai stock market index and the Dow Jones index, and experimental results show that the proposed model performs better in the area of prediction than other two similar models.
Guo, Zhiqiang; Wang, Huaiqing; Liu, Quan; Yang, Jie
The real-time data of the continuous water quality monitoring station at the Pyeongchang river was analyzed separately for the rainy period and the non-rainy period. Total organic carbon data observed during the rainy period showed a greater mean value, maximum value and standard deviation than the data observed during the non-rainy period. Dissolved oxygen values during the rainy period were lower than those observed during the non-rainy period. The analysis indicated that rainfall-driven discharge from the basin affects the change in water quality. A model for forecasting water quality was constructed and applied using the neural network model and the adaptive neuro-fuzzy inference system. The Levenberg-Marquardt neural network, modular neural network and adaptive neuro-fuzzy inference system models all showed good results for the simulation of total organic carbon. The Levenberg-Marquardt neural network and modular neural network models showed better results than the adaptive neuro-fuzzy inference system model in the forecasting of dissolved oxygen. The modular neural network model, which was applied with qualitative time data in addition to quantitative data, showed the least error.
Yeon, I S; Kim, J H; Jun, K W
A capability for four-dimensional display of meteorological data is being developed at the Space Science and Engineering Center of the University of Wisconsin. McIDAS is used for all aspects of the analysis, including acquiring data, running the model, storing the output, and displaying the results. A version of the Australian Regional Analysis and Forecast Models was applied to the eastern portion of the USA and the adjacent Atlantic Ocean. The assimilation system is being used to analyze intensive observing periods during the GALE (Genesis of Atlantic Lows Experiment) field experiment.
Santek, David; Leslie, Lance; Goodman, Brian; Diak, George; Callan, Geary
In a competitive electricity market, the forecast of energy prices is key information for market participants. However, the price signal usually exhibits complex behavior due to its nonlinearity, nonstationarity, and time variance. In spite of all the research performed in this area in recent years, there is still an essential need for more accurate and robust price forecast methods.
Nima Amjady; Farshid Keynia
Streamflow forecasts are generally produced through the use of a single hydrologic model. In spite of the existence of a wide range of hydrologic models, it is hard to claim that any single model among them performs better than the rest for all types of watersheds under all conditions. This is because hydrologic models, lumped or distributed, introduce many assumptions and simplifications in their structure. Since various model structures capture different aspects of the watershed processes, one way of exploiting the strengths of different models and compensating for their weaknesses is to obtain consensus predictions by combining their results using model combination techniques such as the Multi Model SuperEnsemble (MMSE). MMSE is a special case of ensemble techniques, one which treats the model outputs as ensemble members. This study surveys the performance of MMSE for flood forecasting by using the simulation results from the various distributed models that participated in the Distributed Model Intercomparison Project (DMIP), an international project sponsored by the National Weather Service. The key questions addressed in this study are: (1) What is the skill level of the consensus forecast compared to those of individual forecasts? (2) How many models do we need to produce accurate consensus forecasts? (3) Can model combination techniques compensate for the inadequacy of model calibration? Simulations for the Illinois River Basin at Watts from 7 uncalibrated DMIP models are combined and the results are compared to the calibrated model results.
Ajami, N. K.; Duan, Q.; Gao, X.; Sorooshian, S.
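As a sketch of the consensus idea (synthetic data, not the DMIP output), a simple superensemble can be built by removing each member's training-period bias and regressing the observations on the bias-removed simulations; the fitted weights then combine forecasts outside the training window:

```python
import numpy as np

rng = np.random.default_rng(1)
obs = np.sin(np.linspace(0, 12, 200)) + 2.0              # "observed" streamflow
models = np.column_stack([obs + rng.normal(b, 0.3, 200)  # 7 biased, noisy members
                          for b in np.linspace(-0.5, 0.5, 7)])

train, test = slice(0, 150), slice(150, 200)
anom = models - models[train].mean(axis=0)               # remove each member's training bias
w, *_ = np.linalg.lstsq(anom[train], obs[train] - obs[train].mean(), rcond=None)

consensus = obs[train].mean() + anom[test] @ w           # superensemble forecast
rmse_sse = np.sqrt(np.mean((consensus - obs[test]) ** 2))
rmse_best = min(np.sqrt(np.mean((models[test, i] - obs[test]) ** 2))
                for i in range(7))
```

With biased, noisy members, the weighted consensus typically beats even the best single member, which is the rationale the abstract describes.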
One of the loftier goals in seismic hazard analysis is the creation of an end-to-end earthquake prediction system: a "rupture to rafters" work flow that takes a prediction of fault rupture, propagates it with a ground shaking model, and outputs a damage or loss profile at a given location. So far, the initial prediction of an earthquake rupture (either as a point source or a fault system) has proven to be the most difficult and least solved step in this chain. However, this may soon change. The Collaboratory for the Study of Earthquake Predictability (CSEP) has amassed a suite of earthquake source models for assorted testing regions worldwide. These models are capable of providing rate-based forecasts for earthquake (point) sources over a range of time horizons. Furthermore, these rate forecasts can be easily refined into probabilistic source forecasts. While it is still difficult to fully assess the "goodness" of each of these models, progress is being made: new evaluation procedures are being devised and earthquake statistics continue to accumulate. The scientific community appears to be heading towards a better understanding of rupture predictability. Ground shaking mechanics are better understood, and many different sophisticated models exist. While these models tend to be computationally expensive and often regionally specific, they do a good job of matching empirical data. It is perhaps time to start addressing the third step in the seismic hazard prediction system. We present a model for rapid economic loss estimation using ground motion (PGA or PGV) and socioeconomic measures as its input. We show that the model can be calibrated on a global scale and applied worldwide. We also suggest how the model can be improved and generalized to non-seismic natural disasters such as hurricanes and severe wind storms.
Holliday, J. R.; Rundle, J. B.
The shorter-term variable impact of the Sun's photons, solar wind particles, and interplanetary magnetic field upon the Earth's environment that can adversely affect technological systems is colloquially known as space weather. It includes, for example, the effects of solar coronal mass ejections, solar flares and irradiances, solar and galactic energetic particles, as well as the solar wind, all of which affect Earth's magnetospheric particles and fields, geomagnetic and electrodynamical conditions, radiation belts, aurorae, ionosphere, and the neutral thermosphere and mesosphere. These combined effects create risks to space and ground systems from electric field disturbances, irregularities, and scintillation, for example, where these ionospheric perturbations are a direct result of space weather. A major challenge exists to improve our understanding of ionospheric space weather processes and then translate that knowledge into operational systems. Ionospheric perturbed conditions can be recognized and specified in real-time or predicted through linkages of models and data streams. Linked systems must be based upon multi-spectral observations of the Sun, solar wind measurements by satellites between the Earth and Sun, as well as by measurements from radar and GPS/TEC networks. Models of the solar wind, solar irradiances, the neutral thermosphere, thermospheric winds, joule heating, particle precipitation, substorms, the electric field, and the ionosphere provide climatological best estimates of non-measured current and forecast parameters. We report on a team effort that is developing a prototype operational ionospheric forecast system to detect and predict the conditions leading to dynamic ionospheric changes. The system will provide global-to-local specifications of recent history, current epoch, and 72-hour forecast ionospheric and neutral density profiles, TEC, plasma drifts, neutral winds, and temperatures. 
Geophysical changes will be captured and/or predicted (modeled) at their relevant time scales ranging from 10-minute to hourly cadences. 4-D ionospheric densities are being specified using data assimilation techniques, coupled with physics-based and empirical models for thermospheric, solar, electric field, particle, and magnetic field parameters that maximize accuracy in locales and regions at the current epoch, maintain global self-consistency, and improve reliable forecasts. We report on a system architecture underlying the linkage of models and data streams that is operationally reliable and robust to serve commercial space weather needs.
Tobiska, W.; Bouwer, D.; Forbes, J.; Frahm, R.; Fry, C.; Hagan, M.; Hajj, G.; Hsu, T.; Knipp, D.; Mannucci, A.; Papitashvili, V.; Pi, X.; Sharber, J.; Storz, M.; Wang, C.; Wilson, B.
A methodology is proposed for constructing a flood forecast model using the adaptive neuro-fuzzy inference system (ANFIS). This is based on a self-organizing rule-base generator, a feedforward network, and fuzzy control arithmetic. Given the rainfall-runoff patterns, ANFIS could systematically and effectively construct flood forecast models. The precipitation and flow data sets of the Choshui River in central Taiwan are analysed to identify the useful input variables, and then the forecasting model can be self-constructed through ANFIS. The analysis results suggest that the persistence effect and upstream flow information are the key factors in modelling the flood forecast, and that the watershed's average rainfall provides further information and enhances the accuracy of the model performance. For the purpose of comparison, the commonly used back-propagation neural network (BPNN) is also examined. The forecast results demonstrate that ANFIS is superior to the BPNN, and that ANFIS can effectively and reliably construct an accurate flood forecast model.
Chen, Shen-Hsien; Lin, Yong-Huang; Chang, Li-Chiu; Chang, Fi-John
This paper draws on the guerrilla warfare literature to synthesize and describe the dynamics of the initial stages of a guerrilla war against an established government. It combines two classical economic models, the Solow growth model and the Ricardian model of economic rents, with two classic studies of guerrilla warfare by T. E. Lawrence and by Mao
Dagobert L. Brito; Michael D. Intriligator
We present the results of a PM10 forecasting model that has been applied for air quality management in Santiago, Chile during recent years. The daily operation of this model has served to inform the population in advance about the air quality they will encounter in different areas of the city, and to help environmental authorities decide on actions for days when concentrations fall in ranges considered significantly harmful, imposing restrictions on the activity of the city in advance when extreme episodes are foreseen. At present, the national PM10 standard for the 24 h average is 150 µg m-3. According to the range in which concentrations fall, five levels or classes of air quality are defined: good (A), regular (B), bad (C), critical (D) and emergency (E). Forecasting is based on a combination of artificial neural networks and a nearest neighbor method. Inputs to the models are concentrations measured at several monitoring stations distributed throughout the city and meteorological information for the region. Outputs are the expected maximum concentrations for the following day at the sites of the same monitoring stations. Results for the last three years (2009, 2010, 2011) indicate that the model may be considered an important tool for air pollution control.
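The class assignment itself is a simple thresholding step; a sketch is given below. Only the 150 µg m-3 edge comes from the abstract; the remaining band edges are hypothetical placeholders, not the official Chilean episode thresholds:

```python
# Map a forecast 24-h maximum PM10 concentration (µg/m3) to one of the five
# air quality classes A..E. Only the 150 threshold is given in the text;
# the other band edges here are illustrative assumptions.
def pm10_class(conc, edges=(150, 195, 240, 330)):
    for label, edge in zip("ABCD", edges):
        if conc < edge:
            return label
    return "E"
```

In operation, the ANN/nearest-neighbor forecast of the next day's maximum concentration at each station would be passed through such a mapping to announce the expected class.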
One of the main objectives established in 2000 for the development of a global data assimilation model for the Earth’s ionosphere was to enable the forecast of ionospheric electron and ion densities. Following the exciting development of the Global Assimilative Ionospheric Model (GAIM, also known as the Global Assimilation of Ionospheric Measurements) by two teams, the Utah State University team and the University of Southern California and Jet Propulsion Laboratory team, the goal of forecasting ionospheric electron density has yet to be reached. At the University of Southern California and the Jet Propulsion Laboratory, we have made substantial efforts toward the forecasting of ionospheric conditions. A key component of our efforts is the determination of the driving forces for the ionospheric dynamics using the 4DVAR data assimilation approach. It is well known that the changes in the electron and ion densities in the Earth’s ionosphere are strongly influenced by variations in the solar radiation, the geo-electrical field, the neutral gas densities and the thermospheric wind velocity. These environmental variables are referred to as the driving forces in an ionospheric model. In early versions of the GAIM implementation, the values of these driving forces were taken from climatological models, often indexed simply by the geomagnetic index Ap and the solar irradiance index F10.7. Although these crude estimates of the driving forces are sufficient to provide a prior estimate of electron density for Kalman filter-based approaches to produce a reasonably good ionospheric nowcast, they are not sufficient for forecasts of ionospheric densities. A 4DVAR version of the USC/JPL GAIM was developed to estimate the Earth’s ionospheric driving forces such as the production rate, the ExB drift velocity and the horizontal neutral wind speed.
The implementation uses the adjoint-equation approach to efficiently evaluate the gradient vector of the 4DVAR optimization criterion. The same approach also allows us to simultaneously estimate the driving forces and the electron density. The 4DVAR implementation relies on an intermittent assimilation cycle to process measurement data and to produce forecasts. Sensitivity studies of the ionospheric observables to the driving forces were performed by the USC/JPL team. We have also investigated the feasibility of estimating the ionospheric drivers through a series of Observation System Simulation Experiments (OSSE). Our simulation results indicate that when persistence in time in the sun-fixed frame is a valid assumption, the 4DVAR approach can be effectively used to forecast ionospheric conditions. In this presentation, we present our implementation of the 4DVAR model and the scheduling of data assimilation cycles. We shall also present the results of our sensitivity study, as well as the results of the OSSEs.
Wang, C.; Akopian, V.; Pi, X.; Mannucci, A. J.; Usc/Jpl Gaim Team
The aim of this work is to present the validation of the local sea level forecasts calculated by the hydrodynamic model used in the Operational Sea Level Service in Poland. Three local sea models and two regional models are run every day to calculate the forecast. The parameters used for validation were: correlation, reliability, effectiveness, and productivity indicators. Model simulations and
This study develops a blended version of the monetary and portfolio models for the MK/USD exchange rate, and assesses the forecasting performance of the model against a simple random walk. The results indicate that the model performs better than the simple random walk on the 6, 12 and 24 month forecasting horizons. However, the model does not perform well on
Water is a valuable, limited, and highly regulated resource throughout the United States. When making decisions about water allocations, state and federal water project managers must consider the short-term and long-term needs of agriculture, urban users, hydroelectric production, flood control, and the ecosystems downstream. In the Central Valley of California, river water temperature is a critical indicator of habitat quality for endangered salmonid species and affects re-licensing of major water projects and dam operations worth billions of dollars. There is consequently strong interest in modeling water temperature dynamics and the subsequent impacts on fish growth in such regulated rivers. However, the accuracy of current stream temperature models is limited by the lack of spatially detailed meteorological forecasts. To address these issues, we developed a high-resolution deterministic 1-dimensional stream temperature model (sub-hourly time step, sub-kilometer spatial resolution) in a state-space framework, and applied this model to Upper Sacramento River. We then adapted salmon bioenergetics models to incorporate the temperature data at sub-hourly time steps to provide more realistic estimates of salmon growth. The temperature model uses physically-based heat budgets to calculate the rate of heat transfer to/from the river. We use variables provided by the TOPS-WRF (Terrestrial Observation and Prediction System - Weather Research and Forecasting) model—a high-resolution assimilation of satellite-derived meteorological observations and numerical weather simulations—as inputs. The TOPS-WRF framework allows us to improve the spatial and temporal resolution of stream temperature predictions. The salmon growth models are adapted from the Wisconsin bioenergetics model. 
We have made the output from both models available on an interactive website so that water and fisheries managers can determine the past, current and three day forecasted water temperatures at any point along the river, and view various simulated alterations to the water discharge volume and discharge temperature. The subsequent impacts on fish growth will also be displayed so that managers can view how their operational decisions might impact salmon growth.
Danner, E.; Pike, A.; Lindley, S.; Mendelssohn, R.; Dewitt, L.; Melton, F. S.; Nemani, R. R.; Hashimoto, H.
In a basic forecast model, where the expected activity in a future period is in proportion to the observed activity in the past, i.e. a relative intensity model, there are important options which can improve the performance of the model: (1) The expected number of earthquake occurrences at places where no earthquakes have been observed in the past, and (2) the extent of the area of the past reference activity, are considered to be two of the most significant factors. However, these issues have not been fully examined in previous studies of Japan. Taking into consideration the forecast of the expected number of earthquakes with a magnitude of 5 or greater in the Japan area, as designated by the Japanese test center of the Collaboratory for the Study of Earthquake Predictability (CSEP), retrospective experiments result in an optimized value of 0.00085 per year per unit cell in which no earthquakes were observed in the previous 43 years. Here, the size of a unit cell is 0.1° by 0.1° in latitude and longitude, with depths from the surface to 100 km. In addition, an area of 0.3° by 0.3° in latitude and longitude was found to be the best spatial extent for the reference activity.
Yamashina, K.; Nanjo, K. Z.
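The relative-intensity rule described above is straightforward to sketch: the expected annual rate per 0.1° cell is the past count divided by the reference period, with the paper's optimized floor of 0.00085 events/yr assigned to cells with no past earthquakes. The catalog and grid bounds below are synthetic placeholders, not the CSEP Japan test region:

```python
import numpy as np

def ri_forecast(lons, lats, lon_edges, lat_edges, years_past, floor=0.00085):
    """Annual expected earthquake rate per cell: past count / years, with a
    floor rate in cells where no earthquakes were observed."""
    counts, _, _ = np.histogram2d(lons, lats, bins=[lon_edges, lat_edges])
    rates = counts / years_past
    rates[counts == 0] = floor
    return rates

rng = np.random.default_rng(2)
lon_edges = np.arange(130.0, 146.01, 0.1)   # hypothetical 0.1-degree grid
lat_edges = np.arange(30.0, 46.01, 0.1)
lons = rng.uniform(130, 146, 500)           # synthetic 43-year catalog
lats = rng.uniform(30, 46, 500)
rates = ri_forecast(lons, lats, lon_edges, lat_edges, years_past=43.0)
```

Summing `rates` times the forecast duration over any region gives the expected number of events there, which is the quantity tested in the CSEP experiments.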
This paper introduces a lightning forecasting method called Potential Lightning Region (PLR), which is the probability of the occurrence of lightning over a region of interest. The PLR was calculated using a combination of meteorological variables obtained from high-resolution Weather Research and Forecasting (WRF) model simulations during the summer season in southeastern Brazil. The model parameters used in the PLR definition were: surface-based Convective Available Potential Energy (SBCAPE), Lifted Index (LI), K-Index (KI), average vertical velocity between 850 and 700 hPa (w), and integrated ice-mixing ratio from 700 to 500 hPa (QICE). Short-range runs of twelve non-severe thunderstorm cases were performed with the WRF model, using different convective and microphysical schemes. Through statistical evaluations, the WRF cloud parameterizations that best described the convective thunderstorms with lightning in southeastern Brazil were the combination of the Grell-Devenyi and Thompson schemes. Two calculation methods were proposed: the Linear PLR and the Normalized PLR. The difference between them is basically how they deal with the influence of lightning flashes over the WRF domain's grid points for the twelve thunderstorms analyzed. Three case studies were used to test both methods. A statistical evaluation lowering the spatial resolution of the WRF grid into larger areas was performed to study the behavior and accuracy of the PLR methods. The Normalized PLR proved the more suitable of the two, predicting flash occurrence appropriately.
Zepka, G. S.; Pinto, O.; Saraiva, A. C. V.
Flood forecasting in mountainous areas requires accurate partitioning between rain and snowfall; an incorrect snow/rainfall limit (on daily or sub-daily timescales) typically implies a significant over- (or under-)estimation of the source catchment areas contributing to runoff and infiltration. This study details the development of a snow/rainfall partitioning method which incorporates snowfall limit information from Limited Area Models (LAMs) to improve catchment-scale hydrological modeling. LAMs consider the vertical, humid, atmospheric structure including wet bulb temperature in their snowfall limit calculations. Such an approach is more physically-based than inferring snowfall limit estimates based on dry, ground temperature measurements, which is the standard procedure in most hydrological models. A case study involving complex topography in the Swiss Alps demonstrates the potential of the developed method with the integration of COSMO forecast re-analysis snowfall limit information. Such data and the new method are proven here to significantly improve runoff simulation, particularly in the spring when a large part of the catchment is close to saturation. Integrating LAM snowfall limits thereby provides good estimates of runoff contributing areas, with practical implications for operational hydrology in Alpine regions.
Tobin, C.; Rinaldo, A.; Schaefli, B.
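The core partitioning step can be sketched as an elevation threshold against the LAM-provided snowfall limit. This is a deliberate simplification: the paper's method works with forecast reanalysis limits over catchment grids, and operational schemes often use a transition band rather than the hard cut shown here:

```python
import numpy as np

def partition_precip(precip, elevation, snowfall_limit):
    """Split gridded precipitation (mm) into rain and snow using a forecast
    snowfall limit elevation (m a.s.l.). Hard-threshold sketch: cells at or
    above the limit receive snow, cells below receive rain."""
    is_snow = elevation >= snowfall_limit
    snow = np.where(is_snow, precip, 0.0)
    rain = precip - snow
    return rain, snow

# Hypothetical 1-D elevation transect with uniform 10 mm precipitation.
elev = np.array([500.0, 1200.0, 1800.0, 2400.0, 3000.0])
rain, snow = partition_precip(np.full(5, 10.0), elev, snowfall_limit=1500.0)
```

Because the snowfall limit comes from the LAM's wet-bulb-temperature profile rather than a dry ground-temperature threshold, the rain fraction (and hence the runoff-contributing area) shifts with each forecast update.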
Objective: Global achievements in health may be limited by critical shortages of health-care workers. To help guide workforce policy, we estimate the future demand for, need for and supply of physicians, by WHO region, to determine where likely shortages will occur by 2015, the target date of the Millennium Development Goals. Methods: Using World Bank and WHO data on physicians per capita from 1980 to 2001 for 158 countries, we employ two modelling approaches for estimating the future global requirement for physicians. A needs-based model determines the number of physicians per capita required to achieve 80% coverage of live births by a skilled health-care attendant. In contrast, our economic model identifies the number of physicians per capita that are likely to be demanded, given each country’s economic growth. These estimates are compared to the future supply of physicians projected by extrapolating the historical rate of increase in physicians per capita for each country. Findings: By 2015, the global supply of physicians appears to be in balance with projected economic demand. Because our measure of need reflects the minimum level of workforce density required to provide a basic health service that is met in all but the least developed countries, the needs-based estimates predict a global surplus of physicians. However, on a regional basis, both models predict shortages for many countries in the WHO African Region in 2015, with some countries experiencing a needs-based shortage, a demand-based shortage, or both. Conclusion: The type of policy intervention needed to alleviate projected shortages, such as increasing health-care training or adopting measures to discourage migration, depends on the type of shortage projected.
Liu, Jenny X; Kinfu, Yohannes; Dal Poz, Mario R
The impact of wind and load forecast errors on power grid operations is frequently evaluated by conducting multi-variant studies, where these errors are simulated repeatedly as random processes based on their known statistical characteristics. To generate these errors correctly, we need to reflect their distributions (which do not necessarily follow a known distribution law), standard deviations, auto- and cross-correlations. For instance, load and wind forecast errors can be closely correlated in different zones of the system. This paper introduces a new methodology for generating multiple cross-correlated random processes to simulate forecast error curves based on a transition probability matrix computed from an empirical error distribution function. The matrix will be used to generate new error time series with statistical features similar to observed errors. We present the derivation of the method and present some experimental results by generating new error forecasts together with their statistics.
Makarov, Yuri V.; Reyes Spindola, Jorge F.; Samaan, Nader A.; Diao, Ruisheng; Hafen, Ryan P.
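The transition-matrix mechanism described above can be sketched end to end: bin an observed error series into states, estimate the empirical first-order transition probability matrix, then walk the resulting Markov chain to generate a new error series with a similar distribution and autocorrelation. The AR(1) "observed" errors and the decile binning below are placeholder assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic "observed" forecast-error series with autocorrelation.
err = np.zeros(2000)
for t in range(1, 2000):
    err[t] = 0.8 * err[t - 1] + rng.normal(0, 1)

edges = np.quantile(err, np.linspace(0, 1, 11))          # 10 equiprobable states
states = np.clip(np.searchsorted(edges, err, "right") - 1, 0, 9)

# Empirical transition probability matrix from consecutive state pairs.
P = np.zeros((10, 10))
for a, b in zip(states[:-1], states[1:]):
    P[a, b] += 1
P /= P.sum(axis=1, keepdims=True)                        # row-stochastic

centers = 0.5 * (edges[:-1] + edges[1:])                 # state -> error value
s, sim = states[0], []
for _ in range(2000):
    s = rng.choice(10, p=P[s])                           # next state
    sim.append(centers[s])
sim = np.array(sim)                                      # simulated error curve
```

Cross-correlated zonal errors would extend this by defining joint states over several zones (enlarging the state space) so the sampled chains inherit the cross-correlations as well.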
The solar wind (SW) and interplanetary magnetic field (IMF) have a significant influence on the near-Earth space environment. In this study we evaluate and compare forecasts from two models that predict SW and IMF conditions: the Hakamada-Akasofu-Fry (HAF) version 2, operational at the Air Force Weather Agency, and Wang-Sheeley-Arge (WSA) version 1.6, executed routinely at the Space Weather Prediction Center. SW speed (Vsw) and IMF polarity (Bpol) forecasts at L1 were compared with Wind and Advanced Composition Explorer satellite observations. Verification statistics were computed by study year and forecast day. Results revealed that both models' mean Vsw is slower than observed. The HAF slow bias increases with forecast duration. WSA had lower Vsw forecast-observation difference (F-O) absolute means and standard deviations than HAF. HAF and WSA Vsw forecast standard deviations were less than observed. Vsw F-O mean-square skill rarely exceeds that of recurrence forecasts. Bpol is correctly predicted 65%-85% of the time in both models. Recurrence beats the models in Bpol skill in nearly every year and forecast-day category. Verification by "event" (flare events ≥5 days before forecast start) and "nonevent" (no flares) forecasts showed that most of the HAF Vsw bias growth, F-O standard deviation decrease, and forecast standard deviation decrease were due to the event forecasts. Analysis of single time step Vsw increases of ≥20% in the nonevent forecasts indicated that both models predicted too many occurrences and missed many observed incidences. Neither model had skill above a random guess in predicting Vsw increase arrival time at L1.
Norquist, Donald C.; Meeks, Warner C.
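The verification quantities named in the abstract (F-O mean and standard deviation, and mean-square skill against a recurrence forecast) can be sketched as follows; the series are synthetic, with a 27-day recurrence standing in for the solar-rotation reference:

```python
import numpy as np

def fo_stats(forecast, observed):
    """Forecast-minus-observation (F-O) mean (bias) and standard deviation."""
    d = forecast - observed
    return d.mean(), d.std(ddof=1)

def ms_skill(forecast, observed, reference):
    """Mean-square skill score: 1 - MSE(forecast) / MSE(reference)."""
    mse_f = np.mean((forecast - observed) ** 2)
    mse_r = np.mean((reference - observed) ** 2)
    return 1.0 - mse_f / mse_r

rng = np.random.default_rng(8)
days = 270
vsw_obs = 400 + 80 * np.sin(2 * np.pi * np.arange(days) / 27)  # synthetic Vsw (km/s)
vsw_obs += rng.normal(0, 30, days)
vsw_fcst = vsw_obs + rng.normal(-20, 40, days)   # model with a slow bias
recurrence = np.roll(vsw_obs, 27)                # observed value 27 days earlier

bias, sd = fo_stats(vsw_fcst, vsw_obs)
skill = ms_skill(vsw_fcst[27:], vsw_obs[27:], recurrence[27:])
```

A negative skill value reproduces the abstract's finding that the model rarely beats recurrence; a negative bias reproduces the slow-bias diagnosis.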
For the efficient management of radio resources, a scientific/systematic management system using an engineering approach based on cost-benefit analysis needs to be prepared. In forecasting market competition and demand for the various services provided by frequency band, it is necessary to enhance the reliability of forecasting data using the most rational forecast techniques. Many researchers have investigated models of
Sangmun Shin; Yonghee Lee; Yongsun Choi; Charles Kim
Maintenance, repair and overhaul processes (MRO processes) are elaborate and complex. Rising demands on these after sales services require reliable production planning and control methods particularly for maintaining valuable capital goods. Downtimes lead to high costs and an inability to meet delivery due dates results in severe contract penalties. Predicting the required capacities for maintenance orders in advance is often difficult due to unknown part conditions unless the goods are actually inspected. This planning uncertainty results in extensive capital tie-up by rising stock levels within the whole MRO network. The article outlines an approach to planning capacities when maintenance data forecasting is volatile. It focuses on the development of prerequisites for a reliable capacity planning model. This enables a quick response to maintenance orders by employing appropriate measures. The information gained through the model is then systematically applied to forecast both personnel capacities and the demand for spare parts. The improved planning reliability can support MRO service providers in shortening delivery times and reducing stock levels in order to enhance the performance of their maintenance logistics.
The increasing penetration of wind energy into national electricity markets has increased the demand for accurate surface-layer wind forecasts. There has recently been a focus on forecasting the wind at wind farm sites using both statistical models and numerical weather prediction (NWP) models. Recent advances in computing capacity and non-hydrostatic NWP models mean that it is possible to nest mesoscale models down to Large Eddy Simulation (LES) scales over the spatial area of a typical wind farm. For example, the WRF model (Skamarock 2008) has been run at a resolution of 123 m over a wind farm site in complex terrain in Colorado (Liu et al. 2009). Although these modelling attempts show great promise for applying such models to detailed wind forecasts over wind farms, one of the obvious challenges of running the model at this resolution is that while some boundary layer structures are expected to be modelled explicitly, boundary layer eddies in the inertial sub-range can only be partly captured. Therefore, the amount and nature of sub-grid-scale mixing that is required is uncertain. Analysis of the Liu et al. (2009) modelling results in comparison with wind farm observations indicates that unrealistic wind speed fluctuations with a period of around 1 hour occasionally occurred during the two-day modelling period. The problem was addressed by re-running the same modelling system with (a) a modified diffusion constant and (b) two-way nesting between the high resolution model and its parent domain. The model, which was run with a horizontal grid spacing of 370 m, had dimensions of 505 grid points in the east-west direction and 490 points in the north-south direction. It received boundary conditions from a mesoscale model with a resolution of 1111 m. Both models had 37 levels in the vertical. The mesoscale model was run with a non-local-mixing planetary boundary layer scheme, while the 370 m model was run with no planetary boundary layer scheme.
It was found that increasing the diffusion constant damped the unrealistic fluctuations, but did not completely solve the problem. Using two-way nesting also mitigated the unrealistic fluctuations significantly. It can be concluded that for real-case LES modelling of wind farm circulations, care should be taken to ensure consistency between the mesoscale weather forcing and LES models to avoid exciting spurious noise along the forcing boundary. The development of algorithms that adequately model the sub-grid-scale mixing that cannot be resolved by LES models is an important area for further research. References: Liu, Y., W. Y. Y. Cheng, W. Wu, T. T. Warner and K. Parks, 2009: Simulating intra-farm wind variations with the WRF-RTFDDA-LES modeling system. 10th WRF Users' Workshop, Boulder, CO, USA, June 23-26, 2009. Skamarock, W., J. Dudhia, D. O. Gill, D. M. Barker, M. G. Duda, X.-Y. Huang, W. Wang and J. G. Powers, 2008: A Description of the Advanced Research WRF Version 3, NCAR Technical Note TN-475+STR, NCAR, Boulder, Colorado.
Vincent, Claire Louise; Liu, Yubao
This paper presents a new approach using an Artificial Neural Network technique to improve rainfall forecast performance. A real-world case study was set up in Bangkok; 4 years of hourly data from 75 rain gauge stations in the area were used to develop the ANN model. The developed ANN model is being applied for real-time rainfall forecasting and flood management in Bangkok, Thailand. Aimed at providing forecasts on a near real-time schedule, different network types were tested with different kinds of input information. Preliminary tests showed that a generalized feedforward ANN model using a hyperbolic tangent transfer function achieved the best generalization of rainfall. In particular, using a combination of meteorological parameters (relative humidity, air pressure, wet bulb temperature and cloudiness), the rainfall at the point of forecasting and the rainfall at surrounding stations as input data enabled the ANN model to work with continuous data containing both rainy and non-rainy periods, allowing the model to issue a forecast at any moment. Additionally, forecasts by the ANN model were compared to a conventional approach, namely the simple persistence method. Results show that ANN forecasts are superior to those obtained by the persistence model. Rainfall forecasts for Bangkok from 1 to 3 h ahead were highly satisfactory. Sensitivity analysis indicated that the most important input parameter besides rainfall itself is the wet bulb temperature.
Hung, N. Q.; Babel, M. S.; Weesakul, S.; Tripathi, N. K.
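The persistence benchmark the ANN is compared against is trivial to state in code: the forecast for t + lead hours is simply the rainfall observed at t. A sketch on a synthetic hourly series (the real comparison of course uses the Bangkok gauge data):

```python
import numpy as np

def persistence_forecast(series, lead):
    """Persistence: the forecast for time t + lead is the value at time t."""
    return series[:-lead]

rng = np.random.default_rng(4)
rain = np.maximum(0.0, rng.normal(0.0, 2.0, 500))   # synthetic hourly rainfall (mm)

rmse = {}
for lead in (1, 2, 3):
    pred = persistence_forecast(rain, lead)
    rmse[lead] = float(np.sqrt(np.mean((pred - rain[lead:]) ** 2)))
```

Any candidate model has to beat this baseline's RMSE at each lead time to claim skill, which is the comparison the paper reports.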
Ensemble forecasts are a promising new approach to numerous applications in oceanography. They have long been an essential tool in meteorology. In the marine environment there is potential for even further development, in large part due to the longer predictability. This may, e.g., mean more accurate long-term forecasts for oceanographic parameters. In this work we used the ensemble approach for seasonal forecasting of physical and chemical changes during the spring bloom in the Baltic Sea. We present results of ensemble forecasting in the Baltic, and discuss the applicability of this method to operational biogeochemical ocean modelling. FMI's operational 3-dimensional biogeochemical model was used to produce monthly ensemble forecasts for different physical, chemical and biological variables. The modelled variables were temperature, salinity, velocity, silicate, phosphate, nitrate, diatoms, flagellates and two species of potentially toxic filamentous cyanobacteria. Ensembles were produced by running several 30-day runs of the biogeochemical model. In each run the model was forced with a different set of seasonal weather parameters from ECMWF's mathematically perturbed ensemble prediction forecasts. The ensembles were then analysed by statistical methods, and the median, quartiles, minimum and maximum values were calculated for the model output variables to gain insight into the applicability of the results. The forecast method was validated by comparing the results against in-situ data. The results demonstrated that ensemble forecasting is a viable tool and that it is indeed possible to forecast the Baltic Sea with useful accuracy over these time spans.
Roiha, P.; Westerlund, A.; Stipa, T.
River flow forecasting is an essential procedure that is necessary for proper reservoir operation. Accurate forecasting results in good control of water availability, refined operation of reservoirs and improved hydropower generation. Therefore, it becomes crucial to develop forecasting models for river inflow. Several approaches have been proposed over the past few years based on stochastic modeling or artificial intelligence (AI)
Ahmed El-Shafie; Mahmoud Reda Taha; Aboelmagd Noureldin
The objective of this study is to describe a parsimonious forecasting model for the hourly electricity load in the area covered by an electric utility located in the Midwest of the United States that performs well in out-of-sample forecast evaluation. This study proposes using an autoregressive moving average model with exogenous weather variables (ARMAX) to forecast short-term electricity load using
Jennifer Hinman; Emily Hickey
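The ARMAX approach described above can be illustrated with a small sketch. The data below are synthetic (the utility's actual load and weather series are not public), and the MA term is omitted for brevity, leaving an ARX regression of load on its own lag and an exogenous temperature series:

```python
import numpy as np

# Illustrative ARX fit (the MA term of a full ARMAX model is omitted):
# load_t = c + a1*load_{t-1} + b1*temp_t + noise, on synthetic data.
rng = np.random.default_rng(0)
T = 200
temp = 20 + 5 * np.sin(np.arange(T) * 2 * np.pi / 24)   # synthetic hourly temperature
load = np.zeros(T)
for t in range(1, T):
    load[t] = 10 + 0.6 * load[t - 1] + 0.8 * temp[t] + rng.normal(0, 0.5)

# Design matrix: intercept, lagged load, exogenous temperature.
X = np.column_stack([np.ones(T - 1), load[:-1], temp[1:]])
y = load[1:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # [c, a1, b1]

# One-step-ahead forecast given the latest load and a temperature forecast.
forecast = coef[0] + coef[1] * load[-1] + coef[2] * temp[-1]
```

In practice the paper's model would add MA terms and richer weather regressors, but the one-step-ahead structure is the same.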
With the integration of wind energy into electricity grids, it is becoming increasingly important to obtain accurate wind speed/power forecasts. Accurate wind speed forecasts are necessary to schedule dispatchable generation and tariffs in the day-ahead electricity market. This paper examines the use of fractional-ARIMA (f-ARIMA) models to model and forecast wind speeds on the day-ahead (24 h) and two-day-ahead (48 h)
Rajesh G. Kavasseri; Krithika Seetharaman
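The fractional differencing at the heart of an f-ARIMA model follows directly from the binomial expansion of (1-B)^d; the helper names below are illustrative, not from the paper:

```python
import numpy as np

def frac_diff_weights(d: float, n: int) -> np.ndarray:
    """Binomial expansion weights of (1-B)^d used in fractional differencing."""
    w = np.zeros(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k   # recurrence for (-1)^k C(d, k)
    return w

def frac_diff(x: np.ndarray, d: float) -> np.ndarray:
    """Apply (1-B)^d to a series (truncated at the series start)."""
    w = frac_diff_weights(d, len(x))
    return np.array([w[:t + 1][::-1] @ x[:t + 1] for t in range(len(x))])
```

For d = 1 this reduces to the ordinary first difference, while a fractional d between 0 and 0.5 yields the slowly decaying weights that let f-ARIMA capture long-range dependence in wind speeds.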
The purpose of this research is to investigate collaborative forecasting models that are appropriate both for acceptable accuracy and for effective information sharing with partners via the Internet. Several forecasting models were investigated in this research. For comparing the models, five years of historical data were collected from a lathe machine manufacturer in Taiwan, and one of its partners - a ball screw
Jeng-teng Tsai; Jung-hua Lee; Chung-chieh Hsu; Shui-shun Lin; Chyung Perng; Wen-chih Chiou
This paper introduces a methodology for the construction of probabilistic inflow forecasts for multiple catchments and lead times. A postprocessing approach is used, and a Gaussian model is applied for transformed variables. In operational situations, it is a straightforward task to use the models to sample inflow ensembles which inherit the dependencies between catchments and lead times. The methodology was tested and demonstrated in the river systems linked to the Ulla-Førre hydropower complex in southern Norway, where simultaneous probabilistic forecasts for five catchments and ten lead times were constructed. The methodology exhibits sufficient flexibility to utilize deterministic flow forecasts from a numerical hydrological model as well as statistical forecasts such as persistent forecasts and sliding window climatology forecasts. It also deals with variation in the relative weights of these forecasts with both catchment and lead time. When evaluating predictive performance in original space using cross-validation, the case study found that it is important to include the persistent forecast for the initial lead times and the hydrological forecast for medium-term lead times. Sliding window climatology forecasts become more important for the latest lead times. Furthermore, operationally important features in this case study such as heteroscedasticity, lead time varying between lead time dependency and lead time varying between catchment dependency are captured.
Engeland, K.; Steinsland, I.
The report describes the structure and data base of a computerized model for projecting localized economic, demographic, and fiscal impacts of new energy developments. The model provides baseline and single or multiple-project impact projections for a 15-...
T. Hertsgaard, S. Murdock, N. Toman, M. Henry, R. Ludtke
This paper proposes the application of a filter method in the preprocessing stage of mid-term load demand forecasting, to improve electricity load forecasting and guarantee satisfactory forecasting accuracy. The case study employs historical electricity consumption demand data for Thailand recorded over the period 1997 to 2007. The forecasted load demand value is used for unit commitment and fuel reserve planning in the power system. The method decomposes the original load demand into a trend component and a cyclical component using the Hodrick-Prescott (HP) filter in the preprocessing stage, and forecasts each component using Double Neural Networks (DNNs) in the forecasting stage. Experimental results show that forecasting with preprocessing predicts the load demand better than forecasting without it.
Bunnoon, Pituk; Chalermyanont, Kusumal; Limsakul, Chusak
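The HP-filter preprocessing step described above has a closed form: the trend solves (I + λ D'D)τ = y, where D is the second-difference operator. A minimal dense-matrix sketch follows (the paper's DNN forecasting stage is not reproduced here, and λ = 14400 is a common textbook choice for monthly data, not necessarily the paper's value):

```python
import numpy as np

def hp_filter(y: np.ndarray, lam: float = 14400.0):
    """Hodrick-Prescott decomposition: returns (trend, cycle).

    Solves (I + lam * D'D) trend = y, where D is the second-difference
    operator; the cycle is the residual y - trend.
    """
    T = len(y)
    # Second-difference matrix D of shape (T-2, T).
    D = np.zeros((T - 2, T))
    for i in range(T - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    trend = np.linalg.solve(np.eye(T) + lam * (D.T @ D), y)
    return trend, y - trend
```

A perfectly linear series has a zero second difference, so it passes through as pure trend with zero cycle, which is a handy sanity check. For long series a sparse/banded solver would replace the dense one.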
Recently, fuzzy time series have attracted more academic attention than traditional time series due to their capability of dealing with the uncertainty and vagueness inherent in the data collected. The formulation of fuzzy relations is one of the key issues affecting forecasting results. Most of the present works adopt IF-THEN rules for relationship representation, which leads to higher computational overhead and rule redundancy. Sullivan and Woodall proposed a Markov-based formulation and a forecasting model to reduce computational overhead; however, its applicability is limited to handling one-factor problems. In this paper, we propose a novel forecasting model based on the hidden Markov model by enhancing Sullivan and Woodall's work to allow handling of two-factor forecasting problems. Moreover, in order to make the nature of conjecture and randomness of forecasting more realistic, the Monte Carlo method is adopted to estimate the outcome. To test the effectiveness of the resulting stochastic model, we conduct two experiments and compare the results with those from other models. The first experiment consists of forecasting the daily average temperature and cloud density in Taipei, Taiwan, and the second experiment is based on the Taiwan Weighted Stock Index by forecasting the exchange rate of the New Taiwan dollar against the U.S. dollar. In addition to improving forecasting accuracy, the proposed model adheres to the central limit theorem, and thus, the result statistically approximates to the real mean of the target value being forecast.
Li, Sheng-Tun; Cheng, Yi-Chung
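The Markov-plus-Monte-Carlo idea can be illustrated on a toy discretized series; the states and counts below are invented for illustration and are not the paper's fuzzy-set formulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical discretized series of states (e.g. temperature bins 0..2).
states = np.array([0, 0, 1, 2, 1, 1, 0, 1, 2, 2, 1, 0, 0, 1, 1, 2])
n = 3

# Estimate a first-order Markov transition matrix by counting transitions.
P = np.zeros((n, n))
for a, b in zip(states[:-1], states[1:]):
    P[a, b] += 1
P /= P.sum(axis=1, keepdims=True)

# Monte Carlo estimate of the next state's expected value, mirroring the
# stochastic forecasting step described above.
current = states[-1]
samples = rng.choice(n, size=10000, p=P[current])
mc_forecast = samples.mean()   # converges to P[current] @ [0, 1, 2]
```

By the law of large numbers the Monte Carlo mean converges to the exact conditional expectation, which is the central-limit-theorem property the abstract alludes to.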
Current developments in the field of land use modelling point towards a greater level of spatial and thematic resolution and the possibility of modelling large geographical extents. Improvements are taking place as computational capabilities increase and socioeconomic and environmental data are produced with sufficient detail. Integrated approaches to land use modelling rely on the development of interfaces with specialized models from fields like economics, hydrology, and agriculture. Impact assessment of scenarios/policies at various geographical scales can particularly benefit from these advances. A comprehensive land use modelling framework necessarily includes both the estimation of the quantity and the spatial allocation of land uses within a given timeframe. In this paper, we seek to establish straightforward methods to estimate demand for industrial and commercial land uses that can be used in the context of land use modelling, in particular for applications at continental scale, where the unavailability of data is often a major constraint. We propose a set of approaches based on ‘land use intensity’ measures indicating the amount of economic output per existing areal unit of land use. A base model was designed to estimate land demand based on region-specific land use intensities; in addition, variants accounting for sectoral differences in land use intensity were introduced. A validation was carried out for a set of European countries by estimating land use for 2006 and comparing it to observations. The models’ results were compared with estimations generated using the ‘null model’ (no land use change) and simple trend extrapolations. Results indicate that the proposed approaches clearly outperformed the ‘null model’, but did not consistently outperform the linear extrapolation. An uncertainty analysis further revealed that the models’ performances are particularly sensitive to the quality of the input land use data.
In addition, unknown future trends of regional land use intensity widen considerably the uncertainty bands of the predictions.
Batista e Silva, Filipe; Koomen, Eric; Diogo, Vasco; Lavalle, Carlo
This paper gives the technical solutions of implementing the space-time epidemic-type aftershock sequence (ETAS) model for short-term (1-day) earthquake forecasts for the all-Japan region in the Collaboratory for the Study of Earthquake Predictability (CSEP) project in Japan. For illustration, a retrospective forecasting experiment is carried out to forecast the seismicity in the Japan region before and after the Tokachi-Oki earthquake (M 8.0) at 19:50:07 (UTC) on 25 September 2003, in the format of contour images. The optimal model parameters used for the forecasts are estimated by fitting the model to the observation records up to the starting time of the forecasting period, and the probabilities of earthquake occurrences are obtained through simulations. To tackle the difficulty of heavy computations in fitting a complicated point-process to a huge dataset, an "off-line optimization" and "online forecasting" scheme is proposed to keep both the estimates of model parameters and forecasts updated according to the most recent observations. The results show that the forecasts have captured the spatial distribution and temporal evolution of the features of future seismicity. These forecasts are tested against the reference Poisson model that is stationary in time but spatially inhomogeneous.
Wheat is an important food crop of the country. Its productivity lies in a very wide range due to diverse bio-physical and socio-economic conditions in the growing regions. Crop cutting and sample surveys are time-consuming as well as tedious, and the forecast procedure is delayed. The CAPE methodology, which uses remote sensing, ground truth and prevailing weather, has been very successful, but is empirical in nature. In a joint IARI-SAC research programme, the possibility of linking a dynamic wheat growth model with remote sensing inputs and other relational database layers was tried. Use of WTGROWS, a wheat growth model developed at IARI, with the remote sensing and relational databases is dynamic and can be updated whenever weather, acreage, fertilizer and other inputs are received. A national wheat yield forecast was made for three seasons at the meteorological sub-division scale by using WTGROWS, relational database layers and satellite imagery. WTGROWS was run on a historic weather dataset (the last 25 years) with the relational database inputs through their associated growth rates, and compared with the productivity trends of the met-subdivisions. Calibration factors for each met-subdivision were obtained to capture the other biotic and abiotic stresses and subsequently used to bring the yields in each sub-division down to a realistic scale. The satellite image was used to compute the acreage under wheat in each sub-division. Meteorological data for each sub-division were obtained from IMD on a weekly basis. WTGROWS was run with actual weather data obtained up to a given time and weather normals for the subsequent period, and the forecast was prepared. This was updated on a weekly basis, and the methodology could forecast the wheat yield well in advance with great accuracy.
This procedure shows the pathway to a Crop Growth Monitoring System (CGMS) for the country, to be used for land use planning and agri-production estimates, even though this looks difficult given the diverse agro-ecologies and the wide range of bio-physical and socio-economic characteristics contributing to differential productivity trends.
Kalra, Naveen; Aggarwal, P. K.; Singh, A. K.; Dadhwal, V. K.; Sehgal, V. K.; Harith, R. C.; Sharma, S. K.
Recent studies have found a significant association between climatic variability and basin hydroclimatology, particularly groundwater levels, over the southeast United States. The research reported in this paper evaluates the potential in developing 6-month-ahead groundwater-level forecasts based on the precipitation forecasts from ECHAM 4.5 General Circulation Model Forced with Sea Surface Temperature forecasts. Ten groundwater wells and nine streamgauges from the USGS Groundwater Climate Response Network and Hydro-Climatic Data Network were selected to represent groundwater and surface water flows, respectively, having minimal anthropogenic influences within the Flint River Basin in Georgia, United States. The writers employ two low-dimensional models [principal component regression (PCR) and canonical correlation analysis (CCA)] for predicting groundwater and streamflow at both seasonal and monthly timescales. Three modeling schemes are considered at the beginning of January to predict winter (January, February, and March) and spring (April, May, and June) streamflow and groundwater for the selected sites within the Flint River Basin. The first scheme (model 1) is a null model and is developed using PCR for every streamflow and groundwater site using previous 3-month observations (October, November, and December) available at that particular site as predictors. Modeling schemes 2 and 3 are developed using PCR and CCA, respectively, to evaluate the role of precipitation forecasts in improving monthly and seasonal groundwater predictions. Modeling scheme 3, which employs a CCA approach, is developed for each site by considering observed groundwater levels from nearby sites as predictands. The performance of these three schemes is evaluated using two metrics (correlation coefficient and relative RMS error) by developing groundwater-level forecasts based on leave-five-out cross-validation.
Results from the research reported in this paper show that using precipitation forecasts in climate models improves the ability to predict the interannual variability of winter and spring streamflow and groundwater levels over the basin. However, significant conditional bias exists in all the three modeling schemes, which indicates the need to consider improved modeling schemes as well as the availability of longer time-series of observed hydroclimatic information over the basin.
Almanaseer, Naser; Sankarasubramanian, A.; Bales, Jerad
Impact of aerosols on weather in the boundary layer is examined for short-term forecasts issued over the eastern part of North America in summer 2012. The study employs WRF-Chem and Gridpoint Statistical Interpolation (GSI) for forecasting and 3D-Var simultaneous assimilation of standard meteorological observations, surface measurements of PM2.5 and PM10, and MODIS AOD. It is demonstrated that the assimilation of these species leads to a significant improvement in the prediction of aerosol concentrations. It is also shown that simulated aerosols have a visible impact on weather in the boundary layer. While it is intuitively obvious that such an impact should occur, it is not apparent that the quality of physical parameterizations is sufficient to improve weather forecasts. Verification statistics will be presented for a two-month-long period for simulations that do and do not account for aerosol feedback to radiation.
Pagowski, Mariusz; McKeen, Stuart; Grell, Georg; Hu, Ming
Two computer programs that simulate the multiplier and accelerator processes are described. The simulations can be used in college-level economics classes or by pupils using their own personal computers at home. Very little knowledge of programming is required to implement the simulations.
The High Temperature Gas-Cooled Reactor (HTGR) Application Economic Model was developed at the Idaho National Laboratory for the Next Generation Nuclear Plant Project. The HTGR Application Economic Model calculates either the required selling price of power and/or heat for a given internal rate of return (IRR) or the IRR for power and/or heat being sold at the market price. The user can generate these economic results for a range of reactor outlet temperatures; with and without power cycles, including either a Brayton or Rankine cycle; for the demonstration plant, first of a kind, or nth of a kind project phases; for up to 16 reactor modules; and for module ratings of 200, 350, or 600 MWt. This users manual contains the mathematical models and operating instructions for the HTGR Application Economic Model. Instructions, screenshots, and examples are provided to guide the user through the HTGR Application Economic Model. This model was designed for users who are familiar with the HTGR design, Excel, and engineering economics. Modification of the HTGR Application Economic Model should only be performed by users familiar with the HTGR and its applications, Excel, and Visual Basic.
This paper discusses the fundamental content and the development process of the GM(1,1) forecast model and the dynamic grey forecast model, and predicts the vibration state of an electrical submersible pump unit in an oilfield using the two models. The vibration state was described by vibration acceleration signals. The result showed that the precision of the two
Feng Gao; Ting-jun Yan
To improve the accuracy and speed of oil spill forecasting in the Three Gorges Reservoir Area, and to provide technical support for oil spill emergency decision-making, this paper designs an oil spill forecast model for the Three Gorges Reservoir Area using the EFDC model, considering the effects of the wind and flow, and integrating with oil
Zhang Fan; Huang Liwen
Global severity of potato late blight was estimated by linking two disease forecast models, Blitecast and Simcast, to a climate data base in a geographic information system (GIS). The disease forecast models indirectly estimate late blight severity by determining how many sprays are needed during a growing season as a function of the weather. Global zonation of estimated late blight
R. J. Hijmans; G. A. Forbes; T. S. Walker
A real-time operation model primarily useful for daily operation of reservoirs is developed. This model is based on a chance-constraint formulation and assumes a particular form of the linear decision rule. It uses the conditional distribution function (CDF) of actual streamflows conditioned on the forecasted values. These CDFs are constructed by incorporating the statistical properties of forecast errors for
Bithin Datta; Mark H. Houck
The aim of this paper is to test the effectiveness of feature models in ocean acoustic forecasting. Feature models are simple mathematical representations of the horizontal and vertical structures of ocean features (such as fronts and eddies), and have been used primarily for assimilating new observations into forecasts and for compressing data. In this paper we describe the results of
J. Small; L. Shackleford; G. Pavey
Recent models for credit risk management make use of hidden Markov models (HMMs). HMMs are used to forecast quantiles of corporate default rates. Little research has been done on the quality of such forecasts if the underlying HMM is potentially misspecified. In this paper, we focus on misspecification in the dynamics and dimension of the HMM. We consider both discrete-
Konrad Banachewicz; André Lucas
Seasonal forecasting of streamflows can be highly valuable for water resources management. In this paper, a Bayesian joint probability (BJP) modeling approach for seasonal forecasting of streamflows at multiple sites is presented. A Box-Cox transformed multivariate normal distribution is proposed to model the joint distribution of future streamflows and their predictors such as antecedent streamflows and El Niño-Southern Oscillation indices
Q. J. Wang; D. E. Robertson; F. H. S. Chiew
In recent years the U.S. and other nations have begun programs for short-term local through regional air quality forecasting based upon numerical three-dimensional air quality grid models. These numerical air quality forecast (NAQF) models and systems have been developed and test...
The development of a streamflow forecasting model is one of the most important aspects of water resources planning and management, since it can help in providing early warning of river flooding as well as in short-term operation of water supply systems. In this research the best artificial neural network (ANN) model for simulation and forecasting of the Euphrates river
Cheleng A. Arslan
Kalman filtering is a fundamental tool in statistical time series analysis for understanding the dynamics of large systems for which limited, noisy observations are available. However, standard implementations of the Kalman filter are prohibitive because they require O(N^2) memory and O(N^3) computational cost, where N is the dimension of the state variable. In this work, we focus our attention on the random walk forecast model, which assumes the state transition matrix to be the identity matrix. This model is frequently adopted when the data are acquired at a timescale that is faster than the dynamics of the state variables and there is considerable uncertainty as to the physics governing the state evolution. We derive an efficient representation for the a priori and a posteriori estimate covariance matrices as a weighted sum of two contributions - the process noise covariance matrix and a low-rank term which contains eigenvectors from a generalized eigenvalue problem (GEP), which combines information from the noise covariance matrix and the data. We describe an efficient algorithm to update the weights of the above terms and to compute the eigenmodes of the GEP. The resulting algorithm for the Kalman filter with the random walk forecast model scales as O(N) or O(N log N), both in memory and computational cost. This opens up the possibility of real-time adaptive experimental design and optimal control in systems of much larger dimension than was previously feasible. For a small number of measurements (~300-400), this procedure can be made numerically exact. However, as the number of measurements increases, for several choices of measurement operators and noise covariance matrices, the spectrum of the GEP decays rapidly and we are justified in retaining only the dominant eigenmodes. We discuss tradeoffs between accuracy and computational cost.
The resulting algorithms are applied to an example application from ray-based travel time tomography.
Saibaba, A.; Kitanidis, P. K.
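For reference, one predict/update cycle of the random-walk Kalman filter discussed above looks as follows in its naive dense form; the authors' low-rank representation is designed to replace exactly these O(N^2)/O(N^3) operations, which are not reproduced here:

```python
import numpy as np

def kalman_rw_step(x, P, y, H, Q, R):
    """One predict/update cycle of a Kalman filter whose state transition
    matrix is the identity (the 'random walk forecast model').

    Naive dense implementation for small N, for illustration only.
    """
    # Predict: the state estimate stays put; uncertainty grows by Q.
    P_prior = P + Q
    # Update with measurement y = H x + noise of covariance R.
    S = H @ P_prior @ H.T + R
    K = P_prior @ H.T @ np.linalg.solve(S, np.eye(S.shape[0]))
    x_post = x + K @ (y - H @ x)
    P_post = (np.eye(len(x)) - K @ H) @ P_prior
    return x_post, P_post

# Demo: track a constant 2-D state from repeated measurements.
x, P = np.zeros(2), np.eye(2)
H, Q, R = np.eye(2), 0.01 * np.eye(2), 0.1 * np.eye(2)
for _ in range(60):
    x, P = kalman_rw_step(x, P, np.array([1.0, 1.0]), H, Q, R)
```

With constant measurements the estimate converges to the measured value while the posterior covariance settles at its steady-state value.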
This study introduces a methodology for the construction of probabilistic inflow forecasts for multiple catchments and lead times, and investigates criteria for the evaluation of multivariate forecasts. A post-processing approach is used, and a Gaussian model is applied for transformed variables. The post-processing model has two main components, the mean model and the dependency model. The mean model is used to estimate the marginal distributions of forecasted inflow for each catchment and lead time, whereas the dependency model is used to estimate the full multivariate distribution of forecasts, i.e. covariances between catchments and lead times. In operational situations, it is a straightforward task to use the models to sample inflow ensembles which inherit the dependencies between catchments and lead times. The methodology was tested and demonstrated in the river systems linked to the Ulla-Førre hydropower complex in southern Norway, where simultaneous probabilistic forecasts for five catchments and ten lead times were constructed. The methodology exhibits sufficient flexibility to utilize deterministic flow forecasts from a numerical hydrological model as well as statistical forecasts such as persistent forecasts and sliding window climatology forecasts. It also deals with variation in the relative weights of these forecasts with both catchment and lead time. When evaluating predictive performance in original space using cross-validation, the case study found that it is important to include the persistent forecast for the initial lead times and the hydrological forecast for medium-term lead times. Sliding window climatology forecasts become more important for the latest lead times. Furthermore, operationally important features in this case study such as heteroscedasticity, lead time varying between lead time dependency and lead time varying between catchment dependency are captured.
Two criteria were used for evaluating the added value of the dependency model. The first was the energy score (ES), a multi-dimensional generalization of the continuous rank probability score (CRPS). ES was calculated for all lead times and catchments together, for each catchment across all lead times, and for each lead time across all catchments. The second criterion was to use CRPS for forecasted inflows accumulated over several lead times and catchments. The results showed that ES was not very sensitive to a correct covariance structure, whereas CRPS for accumulated flows was more suitable for evaluating the dependency model. This indicates that it is more appropriate to evaluate relevant univariate variables that depend on the dependency structure than to evaluate the multivariate forecast directly.
Engeland, Kolbjorn; Steinsland, Ingelin
The Tampa Bay Operational Forecast System (TBOFS) has been developed based on a hydrodynamic model system, Regional Ocean Model System (ROMS, Haidvogel, 2008). The curvilinear model grid was constructed and populated with bathymetry obtained from NOS surv...
A. Zhang E. Wei
A new parameter-parsimonious rainfall-runoff model, DDD (Distance Distribution Dynamics), has been run operationally at the Norwegian Flood Forecasting Service for approximately a year. DDD has been calibrated for altogether 104 catchments throughout Norway, and provides runoff forecasts 8 days ahead at a daily temporal resolution, driven by precipitation and temperature from the meteorological forecast models AROME (48 hrs) and EC (192 hrs). The current version of DDD differs from the standard model used for flood forecasting in Norway, the HBV model, in its description of the subsurface and runoff dynamics. In DDD, the capacity of the subsurface water reservoir, M, is the only parameter to be calibrated, whereas the runoff dynamics are completely parameterised from observed characteristics derived from GIS and runoff recession analysis. Water is conveyed through the soils to the river network by waves with celerities determined by the level of saturation in the catchment. The distributions of distances between points in the catchment and the nearest river reach, and of the river network itself, give, together with the celerities, distributions of travel times and consequently unit hydrographs. DDD has six fewer parameters to calibrate in the runoff module than the HBV model. Experience using DDD shows that especially the timing of flood peaks has improved considerably, and in a comparison between DDD and HBV assessing time series of 64 years for 75 catchments, DDD had a higher hit rate and a lower false alarm rate than HBV. For flood peaks higher than the mean annual flood the median hit rates are 0.45 and 0.41 for the DDD and HBV models, respectively; the corresponding false alarm rates are 0.62 and 0.75. For floods over the five-year return interval, the median hit rates are 0.29 and 0.28 for the DDD and HBV models, respectively, with false alarm rates of 0.67 and 0.80.
During 2014 the Norwegian flood forecasting service will run DDD operationally at a 3-h temporal resolution. Running DDD at a 3-h resolution will give better prediction of flood peaks in small catchments, where averaging over 24 hrs leads to an underestimation of high events, and will better describe the progress of floods in larger catchments. Also, at a 3-h temporal resolution we make better use of the meteorological forecasts, which have long been provided at a very detailed temporal resolution.
Skaugen, Thomas; Haddeland, Ingjerd
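The core DDD idea, turning a distance distribution and a saturation-dependent celerity into a unit hydrograph, can be sketched as follows; the distances and celerity are invented values, not calibrated Norwegian catchment data:

```python
import numpy as np

# Hypothetical distances (m) from points in a catchment to the nearest
# river reach, and a wave celerity (m/h) set by the saturation level.
rng = np.random.default_rng(2)
distances = rng.uniform(0, 3000, size=5000)   # sampled distance distribution
celerity = 500.0                               # m/h; higher when wetter

# Travel times (h) -> empirical unit hydrograph on hourly bins.
travel_h = distances / celerity
bins = np.arange(0, np.ceil(travel_h.max()) + 2)
uh, _ = np.histogram(travel_h, bins=bins)
uh = uh / uh.sum()                             # unit hydrograph sums to 1

# Route an input pulse series (e.g. effective precipitation) to runoff.
precip = np.array([0.0, 4.0, 2.0, 0.0, 0.0, 0.0, 0.0, 0.0])
runoff = np.convolve(precip, uh)
```

Because the unit hydrograph sums to one, the convolution conserves mass: total routed runoff equals total input, only redistributed in time. In the full model the celerity (and hence the hydrograph) changes with catchment wetness.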
Enterovirus 71 (EV71) is a growing public health concern, especially in Asia. A surge of EV71 cases in 2008 prompted authorities in China to go on national alert. While there is currently no treatment for EV71 infections, vaccines are under development. We developed a computer simulation model to determine the potential economic value of an EV71 vaccine for children (<5 years old) in China. Our results suggest that routine vaccination in China (EV71 infection incidence 0.04%) may be cost-effective when vaccine cost is $25 and efficacy is ≥70%, or cost is $10 and efficacy is ≥50%. For populations with higher infection risk (≥0.4%), a $50 or $75 vaccine would be highly cost-effective even when vaccine efficacy is as low as 50%.
Lee, Bruce Y.; Wateska, Angela R.; Bailey, Rachel R.; Tai, Julie H.Y.; Bacon, Kristina M.; Smith, Kenneth J.
Earthquake forecasting and time dependent earthquake hazard assessment are increasingly being implemented, automated and tested. Within the framework of the Regional Earthquake Likelihood Modeling (RELM) and CSEP initiative in southern California, for example, statistically and physically based models are now forecasting earthquake probabilities. However, initial conditions, parameter estimation and uncertainty in the observations have not been addressed rigorously. We use an example to show how magnitude errors can strongly distort the forecasts of an ETAS model and how, because of the distortion, the model will be rejected by the suggested RELM evaluation procedure even if we take the (undistorted) most likely forecast as the observation. Data assimilation is the ideal vehicle to help solve problems of uncertain initial conditions. We report on the progress made in implementing data assimilation procedures for simple point-process seismicity models. As a first step, we present a successive Bayesian parameter estimation and forecast procedure for a noisy Poisson process.
Werner, M. J.; Ide, K.; Sornette, D.
Accurate forecasting of available energy is crucial for the efficient management and use of wind power in the national power grid. With energy output critically dependent upon wind strength, there is a need to reduce the errors associated with wind forecasting. The objective of this research is to obtain the best possible wind forecasts for the wind energy industry. To achieve this goal, three methods are being applied. First, a mesoscale numerical weather prediction (NWP) model called WRF (Weather Research and Forecasting) is being used to predict wind values over Ireland. Currently, a grid resolution of 10 km is used, and higher model resolutions are being evaluated to establish whether they are economically viable given the forecast skill improvement they produce. Second, the WRF model is being used in conjunction with ECMWF (European Centre for Medium-Range Weather Forecasts) ensemble forecasts to produce a probabilistic weather forecasting product. Due to the chaotic nature of the atmosphere, a single, deterministic weather forecast can only have limited skill. The ECMWF ensemble methods produce an ensemble of 51 global forecasts, twice a day, by perturbing the initial conditions of a 'control' forecast, which is the best estimate of the initial state of the atmosphere. This method provides an indication of the reliability of the forecast and a quantitative basis for probabilistic forecasting. The limitation of ensemble forecasting lies in the fact that the perturbed model runs behave differently under different weather patterns and each model run is equally likely to be closest to the observed weather situation. Models have biases, and involve assumptions about physical processes and forcing factors such as underlying topography. Third, Bayesian Model Averaging (BMA) is being applied to the output from the ensemble forecasts in order to statistically post-process the results and achieve a better wind forecasting system.
BMA is a promising technique that offers calibrated probabilistic wind forecasts, which will be invaluable in wind energy management. In brief, the method turns the ensemble forecasts into a calibrated predictive probability distribution. Each ensemble member is assigned a 'weight' determined by its relative predictive skill over a training period of around 30 days. Verification is carried out using observed wind data from operational wind farms. The resulting forecasts are then compared, in terms of skill scores, to existing forecasts produced by ECMWF and Met Eireann. We are developing decision-making models to show the benefits achieved using the data produced by our wind energy forecasting system. An energy trading model will be developed, based on the rules currently used by the Single Electricity Market Operator for energy trading in Ireland. This trading model will illustrate the potential for financial savings by using the forecast data generated by this research.
Courtney, Jennifer; Lynch, Peter; Sweeney, Conor
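A minimal sketch of the BMA weighting idea described above, assuming fixed-spread Gaussian kernels around each member's forecast and estimating only the member weights by EM over a training period (the operational method also fits the kernel spread and uses distributions suited to wind speed; all names and values are illustrative):

```python
import numpy as np

def bma_weights(forecasts, obs, sigma, n_iter=200):
    """Estimate BMA member weights by EM, treating each ensemble member as a
    Gaussian kernel of fixed spread `sigma` centered on its forecast."""
    n_members = forecasts.shape[1]
    w = np.full(n_members, 1.0 / n_members)
    for _ in range(n_iter):
        # E-step: responsibility of each member for each training observation
        dens = np.exp(-0.5 * ((obs[:, None] - forecasts) / sigma) ** 2)
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: weights are the mean responsibilities over training days
        w = resp.mean(axis=0)
    return w

# Toy training period: member 0 tracks the observations, member 1 is biased.
rng = np.random.default_rng(0)
obs = rng.normal(10.0, 1.0, size=30)
forecasts = np.column_stack([obs + rng.normal(0, 0.5, 30),
                             obs + 3.0 + rng.normal(0, 0.5, 30)])
w = bma_weights(forecasts, obs, sigma=1.0)
```

The fitted weights then define the calibrated predictive mixture, with the skilful member receiving nearly all of the weight in this toy case.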
This paper presents four algorithms to generate random forecast error time series, and compares their performance. The error time series are used to create real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast time series that statistically match historically observed forecasting data sets used in power grid operation, in order to study the net load balancing need in variable generation integration studies. The four algorithms are truncated-normal distribution models, state-space based Markov models, seasonal autoregressive moving average (ARMA) models, and a stochastic-optimization based approach. The comparison is made using historical DA load forecast and actual load values to generate new sets of DA forecasts with similar statistical forecast error characteristics (i.e., mean, standard deviation, autocorrelation, and cross-correlation). The results show that all methods generate satisfactory results. One method may preserve one or two required statistical characteristics better than the other methods, but may not preserve other statistical characteristics as well. Because the wind and load forecast error generators are used in wind integration studies to produce wind and load forecast time series for stochastic planning processes, it is sometimes critical to use multiple methods to generate the error time series to obtain a statistically robust result. Therefore, this paper discusses and compares the capabilities of each algorithm to preserve the characteristics of the historical forecast data sets.
Lu, Ning; Diao, Ruisheng; Hafen, Ryan P.; Samaan, Nader A.; Makarov, Yuri V.
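One member of the autoregressive family compared above can be sketched as a scaled AR(1) process tuned to match a target mean, standard deviation, and lag-1 autocorrelation (a simplification of the seasonal ARMA generators in the paper; all names and values are illustrative):

```python
import numpy as np

def ar1_error_series(mean, std, rho, n, rng):
    """Generate a synthetic forecast-error time series whose mean, standard
    deviation, and lag-1 autocorrelation approximate the given targets."""
    e = np.empty(n)
    e[0] = rng.normal()                    # start from the stationary law
    innov_std = np.sqrt(1.0 - rho ** 2)    # keeps stationary variance at 1
    for t in range(1, n):
        e[t] = rho * e[t - 1] + innov_std * rng.normal()
    return mean + std * e                  # rescale to target moments

rng = np.random.default_rng(42)
errors = ar1_error_series(mean=0.0, std=50.0, rho=0.8, n=20000, rng=rng)
lag1 = np.corrcoef(errors[:-1], errors[1:])[0, 1]
```

Adding such a synthetic error series to an actual load or wind time series yields a new forecast time series with the desired statistical error characteristics.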
Coulomb stress changes (ΔCFS) have been recognized as a major trigger mechanism for earthquakes; in particular, aftershock distributions and the spatial patterns of ΔCFS are often found to be correlated. However, the Coulomb stress calculations are based on slip inversions and on receiver fault mechanisms, both of which contain large uncertainties. In particular, slip inversions are usually non-unique and often differ strongly for the same earthquake. Here we address the information content of those inversions with respect to aftershock forecasting. To that end, we compare the published slip models to randomized fractal slip models that are constrained only by fault information and moment magnitude. The uncertainty of the aftershock mechanisms is accounted for by using many receiver fault orientations and by calculating ΔCFS at several depth layers. The stress change is then converted into an aftershock probability map by means of a clock-advance model. To estimate the information content of the slip models, we use an Epidemic Type Aftershock Sequence (ETAS) model approach introduced by Bach and Hainzl (2012), in which the spatial probability density of direct aftershocks is related to the ΔCFS calculations. Besides the directly triggered aftershocks, this approach also takes secondary aftershock triggering into account. We quantify our results by calculating the information gain of the randomized slip models relative to the corresponding published slip model. As case studies, we investigate the aftershock sequences of several well-known main shocks, such as 1992 Landers, 1999 Hector Mine, 2004 Parkfield, and 2002 Denali. First results show a huge difference in the information content of slip models. For some of the cases, up to 90% of the random slip models are found to perform better than the originally published model; for some other cases, only a few random models perform better than the published slip model.
Bach, Christoph; Hainzl, Sebastian
CAMD performs a variety of economic modeling analyses to evaluate the impact of air emissions control policies on the electric power sector. A range of tools are used for this purpose including linear programming models, general equilibrium models, and spreadsheet models. Examp...
This paper surveys work on dynamic heterogeneous agent models (HAMs) in economics and finance. Emphasis is given to simple models that, at least to some extent, are tractable by analytic methods in combination with computational tools. Most of these models are behavioral models with boundedly rational agents using different heuristics or rule-of-thumb strategies that may not be perfect,
Cars H. Hommes
Wind power forecasts for the 3-days-ahead period are becoming ever more useful and important for reducing the problems of grid integration and energy price trading caused by increasing wind power penetration. It is therefore clear that the accuracy of this forecast is one of the most important requirements for a successful application. The wind power forecast is based on mesoscale meteorological models that provide the 3-days-ahead wind data. A Model Output Statistics correction is then performed to reduce systematic errors caused, for instance, by a wrong representation of surface roughness or topography in the meteorological models. The corrected wind data are then used as input to the wind farm power curve to obtain the power forecast. These computations require historical time series of measured wind data (from an anemometer located in the wind farm or on the nacelle) and power data in order to perform the statistical analysis on the past. For this purpose a Neural Network (NN) is trained on the past data and then applied in the forecast task. Considering that anemometer measurements are not always available in a wind farm, a different approach has also been adopted: training the NN to link the forecasted meteorological data directly to the power data. The normalized RMSE forecast error turns out to be lower in most cases with this second approach. We have examined two wind farms, one located in Denmark on flat terrain and one located in a mountain area in the south of Italy (Sicily). In both cases we compare the performance of a prediction based on meteorological data coming from a single model with that obtained by using two or more models (RAMS, ECMWF deterministic, LAMI, HIRLAM). It is shown that the multi-model approach reduces the day-ahead normalized RMSE forecast error by at least 1% compared to the single-model approach. Moreover, the use of a deterministic global model (e.g. the ECMWF deterministic model) seems to reach a level of accuracy similar to that of the mesoscale models (LAMI and RAMS). Finally, we have focused on the possibility of using the ensemble model (ECMWF) to estimate the accuracy of the hourly, three-days-ahead power forecast. Contingency diagrams between the RMSE of the deterministic power forecast and the spread of the ensemble members' wind forecasts have been produced. From this first analysis it seems that the ensemble spread could be used as an indicator of forecast accuracy, at least for the first day-ahead period: low spreads often correspond to low forecast errors. For longer forecast horizons the correlation between RMSE and ensemble spread decreases, becoming too low to be used for this purpose.
Alessandrini, S.; Decimi, G.; Hagedorn, R.; Sperati, S.
A numerical advection model which can be run on a local computer in a real-time forecast environment is described. This isentropic forecast model provides the local forecast office with easy access to temporally and spatially detailed estimates of atmospheric temperature, moisture, and wind field changes between 12-h rawinsonde observations. Case studies are presented based on the use of the model to predict the preconvective environment in spring and summer situations. Short-term forecasts of midtropospheric static stability patterns and stability index changes are traced for several severe storm events with and without the inclusion of surface data observed during the day. Forecast images of VAS low- and midlevel moisture fields and vertical moisture gradients are compared with the observations to determine the utility of the combined model/VAS imagery as a nowcasting guide.
Petersen, Ralph A.; Homan, Jeffrey H.
Water supply forecasts in the Sierra Nevada using ground-based measurements of snow water equivalent are uncertain because neither point measurements nor transects adequately explain spatial or temporal variability in mountainous terrain. The statistical relationships between the snow observations and streamflow do not perform well in extreme years or in basins with ephemeral snow, and may prove less reliable in the future with a changing climate. Since 1990, forecasts in the Sacramento, San Joaquin, Tulare and Lahontan drainages have had median errors of 10% to 30%, and in 1 out of every 5 years an error of 25% to 70%. To address this problem, we combine satellite-based retrievals of fractional snow cover for a 12-year period starting in 2000 with spatially distributed energy balance calculations to reconstruct the snow water equivalent (SWE) values throughout each melt season. The 12-year period of study captures an average of 70% of the streamflow range of the last 80 years in the 18 basins with such estimates available. Reconstructed SWE is validated with (i) snow pillows and (ii) snow courses, which show that the model can accurately predict maximum SWE at the regularly sampled locations for a range of wet, mean and dry years. Validation from snow surveys in 2010, on slopes of up to 21° at the highest elevations in the American River basin, shows that the model also performs well in a variety of topography. The relationship of SWE with elevation is significantly different for wet, mean and dry years as well as between drainages. Certain latitudes receive proportionally less water in dry years and more water in wet years than other latitudes. At the scale at which water is managed, the relationship between SWE and SCA becomes increasingly correlated from March 1st to July 1st, such that real-time SCA observations may be sufficient for SWE prediction.
We compare spatially integrated volumes of snow water equivalent from the retrospective model and two near-real-time models with full natural flow estimates in all 18 Sierra Nevada basins. The near-real-time models consist of an interpolation constrained by remotely sensed maps of snow-covered area, and the Snow Data Assimilation System (SNODAS). April 1 SWE is compared with unimpaired streamflow using the absolute magnitudes, the Spearman rank correlation coefficient, and linear regressions. The results show that the reconstruction performs best at estimating the unimpaired streamflow, followed by the interpolation and then SNODAS. The implication is that the real-time models can be evaluated against the retrospective one. Moreover, the reconstruction provides a historical perspective to put the real-time estimates into context.
Rittger, K. E.; Dozier, J.; Kahl, A.
A mesoscale boundary-layer model (BLM) is used to run 12-h low-level wind forecasts for the La Plata River region. Several experiments are performed with different boundary conditions that include operational forecasts of the Eta/CPTEC model, local observations, and a combination of both. The BLM wind forecasts are compared to the surface wind observations of five weather stations during the period November 2003-April 2004. Two accuracy measures are used: the hit rate, or percentage of cases with agreement in the wind direction sector, and the root-mean-squared error (RMSE) of the horizontal wind components. The BLM surface wind forecasts are consistently more accurate: their averaged hit rate is three times greater, and their averaged RMSE one half smaller, than those of the Eta forecasts. Despite the large errors in the surface winds displayed by the Eta forecasts, the Eta model's 850 hPa wind and surface temperature forecasts are able to drive the BLM model to obtain surface wind forecasts with smaller errors than the Eta model itself. An additional experiment demonstrates that the advantage of using the BLM model for forecasting low-level winds over the La Plata River region results from a more appropriate definition of the land-river surface temperature contrast. The particular formulation that the BLM model has for the geometry of the river coasts is fundamental for resolving the smaller-scale details of the low-level local circulation. The main conclusion of the study is that operational low-level wind forecasts for the La Plata River region can be improved by running the BLM model forced by the Eta operational forecasts.
Sraibman, L.; Berri, G. J.
A transition from deterministic to probabilistic forecasts of the dispersion of emissions from the Kilauea Volcano on the Island of Hawaii is under way. Operational forecasts of volcanic smog (vog) have been produced for 3 years by a custom version of NOAA's Hysplit dispersion model (vog model hereafter), a Lagrangian transport model that uses high-resolution WRF-ARW model output for initial conditions run at the University of Hawaii at Manoa. The vog model has been successful in predicting which locations in the State of Hawaii will be impacted by the vog plume. Initial concentrations of emissions from the volcano are set empirically based on downstream observations provided by the Hawaiian Volcano Observatory. Fast changing meteorological conditions and/or rapid variations in emissions rates cause forecast errors to increase. Recent efforts aim to leverage the parallelism of Hysplit to run ensemble forecasts with various initial condition configurations to better quantify the forecast uncertainty. The ensemble will contain 28 members each with perturbed heights and locations of initial aerosol concentrations. Forecast sulfur dioxide and sulfate aerosol concentrations follow Air Resources Laboratory's Air Quality Index (AQI). The resulting probabilistic forecasts will provide probability of exceedance plots and concentration-probability plots for each AQI level. Because some people are extremely sensitive to low concentrations of sulfate aerosols, the lowest AQI levels will be distinguished in the exceedance map output. Downstream observations at Pahala and Kona will be used to validate the ensemble results, which will also be compared to the results of deterministic forecasts.
Pattantyus, A.; Businger, S.
Numerical weather and climate models constitute the best available tools to tackle the problems of weather prediction and climate projection. These models have played a key role in the attribution of the observed climate change to anthropogenic causes. However, a better understanding of the current models and the development of improved models are still required to address issues such as the interpretation of climate projections and the large uncertainties still present in regional climate change studies. Two assumptions lie at the heart of climate model suitability: (1) a climate attractor exists, and (2) the model attractor lies on the climate attractor, or at least on the projection of the climate attractor onto model space. In this contribution, two versions of the Lorenz '96 system are used, one as a prototype system and another as an imperfect model, to investigate the implications of assumption (2). In particular, the dependence of model-generated climate on forecast lead time is examined. It is shown that forecasts produced by the imperfect model rapidly diverge from the system's orbit and that this divergence is mainly due to model error. As a result, climatologies produced from these divergent forecasts show a dependence on forecast lead time. This dependence is characterised by an initial rapid bias growth with respect to the system's climatology. The initial bias growth ends at a saturation level which is reached as the transient period in individual forecasts dies out (spin-up period). Furthermore, it is shown that, once the spin-up period is over, climatologies generated with long-term integrations of both the prototype system and the imperfect model are essentially the same as climatologies generated from short-term forecasts from a perfect and an imperfect model, respectively.
Despite its simplicity with respect to the actual climate system, this study about the Lorenz '96 system shows features that are relevant for climate studies and the understanding of climate models. In order to show this, two examples using real-world data from operational forecasting systems and climate experiments are also discussed.
The neural network approach is applied to the prediction of the flow of the River Nile. A multilayer feedforward network is constructed and trained by the backpropagation algorithm. We propose several different methods for single-step-ahead and multi-step-ahead forecasts in an attempt to obtain the least prediction error. These methods investigate different ways to preprocess the inputs and
S. El Shoura; M. El Sherif; A. Atiya; S. Shaheen
Many large metropolitan areas experience elevated concentrations of ground-level ozone pollution during the summertime “smog season”. Local environmental or health agencies often need to make daily air pollution forecasts for public advisories and for input into decisions regarding abatement measures and air quality management. Such forecasts are usually based on statistical relationships between weather conditions and ambient air pollution concentrations.
Andrew C. Comrie
Probabilistic forecasts of wind speed are becoming critical as interest grows in wind as a clean and renewable source of energy, in addition to a wide range of other uses, from aviation to recreational boating. Statistical approaches to wind forecasting offer two particular challenges: the distribution of wind speeds is highly skewed, and wind observations are reported
J. McLean Sloughter; Tilmann Gneiting; Adrian E. Raftery
It is recognized today that short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with paramount information on the uncertainty of expected wind generation. When considering different areas covering a region, however, such forecasts are produced independently, and thus neglect the interdependence structure of prediction errors induced by the movement of meteorological fronts, or more generally by
George Papaefthymiou; Pierre Pinson
In this study, we present an innovative analytical method to determine the angular width and propagation orientation of halo Coronal Mass Ejections (CMEs). The relation of the actual CME speed to the apparent speed and its components measured at different position angles has been investigated. The present work is based on the cone model proposed by . We have improved this model by (1) eliminating the ambiguity via a new analytical approach, (2) using direct measurements of the projection onto the plane of the sky (POS), and (3) determining the actual radial speeds from projection speeds at different position angles to clarify the uncertainty of projection speeds in previous empirical models. Our analytical approach allows us to use coronagraph data to determine accurately the geometrical features of POS projections, such as the major axis, the minor axis, and the displacement of the center of the projection, and to determine the angular width and orientation of a given halo CME. Our approach allows for the first time the determination of the actual CME speed, width, and source location by using coronagraph data quantitatively and consistently. The method greatly enhances the accuracy of the derived geometrical and kinematical properties of halo CMEs, and can be used to optimize space weather forecasts. The model predictions are in good agreement with observations.
Xie, Hong; Ofman, Leon; Lawrence, Gareth
The Russian Federation, with its giant area, has a low concentration of land-based meteorological checkpoints. The monitoring network is not sufficient for effective forecasting and prediction of weather dynamics and extreme situations. With the increase in extreme situations and incidents such as hurricanes (which have roughly doubled since the beginning of the twenty-first century), a reconstruction, or "perestroika", of the monitoring network is needed. The basis of such progress is remote monitoring using aircraft and satellites, supplementing ground-based contact monitoring built on the existing points and stations. The interaction of contact and remote observations can make hydrometeorological data and predictions more precise and significant. Traditional physical methods should be supplemented by new biological methods of modern study. According to current research, animals are able to predict extreme hazards of natural and anthropogenic origin, possibly through an interaction between biological matter and a physical field that is still under preliminary study; for example, animals anticipated the fall of the Chelyabinsk meteorite in 2013. Adding biological indication to the complex of meteorological data may increase the significance of hazard prediction. Uniting all of these data and approaches may become the basis of the proposed mathematical hydrometeorological weather models. Introducing the reported complex of methods into practice may decrease losses from hydrometeorological risks and hazards and increase the stability of the country's economy.
Sapunov, Valentin; Dikinis, Alexandr; Voronov, Nikolai
Solar irradiance forecasting is an important area of research for the future of solar-based renewable energy systems. Numerical Weather Prediction models (NWPs) have proved to be a valuable tool for solar irradiance forecasting with lead times up to a few days. Nevertheless, these models show low skill in forecasting solar irradiance under cloudy conditions. Additionally, climatic (averaged over seasons) aerosol loadings are usually assumed in these models, leading to considerable errors in the Direct Normal Irradiance (DNI) forecasts during high aerosol load conditions. In this work we propose a post-processing method for the Global Horizontal Irradiance (GHI) and DNI forecasts derived from NWPs. Particularly, the method is based on the use of Autoregressive Moving Average with External Explanatory Variables (ARMAX) stochastic models. These models are applied to the residuals of the NWP forecasts and use as external variables the measured cloud fraction and aerosol loading of the day previous to the forecast. The method is evaluated on a set of one-month-long, three-days-ahead forecasts of GHI and DNI, obtained with the WRF mesoscale atmospheric model, for several locations in Andalusia (Southern Spain). The cloud fraction is derived from MSG satellite estimates and the aerosol loading from MODIS platform estimates. Both sources of information are readily available at the time of the forecast. Results showed a considerable improvement in the forecasting skill of the WRF model using the proposed post-processing method. Particularly, the relative improvement (in terms of RMSE) for the DNI during summer is about 20%. A similar value is obtained for the GHI during winter.
Lara-Fanego, V.; Pozo-Vazquez, D.; Ruiz-Arias, J. A.; Santos-Alamillos, F. J.; Tovar-Pescador, J.
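The residual post-processing described above can be sketched as an ordinary least-squares fit of an autoregressive model with exogenous inputs, a simplified stand-in for the full ARMAX models (variable names, coefficients, and data are illustrative, not from the paper):

```python
import numpy as np

def fit_armax_correction(residuals, cloud_frac, aod):
    """Least-squares fit of r_t = c + a*r_{t-1} + b1*cloud_{t-1} + b2*aod_{t-1},
    i.e. the NWP forecast residual regressed on its own lag and on the
    previous day's cloud fraction and aerosol optical depth."""
    X = np.column_stack([np.ones(len(residuals) - 1),
                         residuals[:-1], cloud_frac[:-1], aod[:-1]])
    y = residuals[1:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Synthetic residual series driven by the exogenous terms, for illustration.
rng = np.random.default_rng(1)
n = 500
cloud = rng.uniform(0, 1, n)
aod = rng.uniform(0, 0.5, n)
r = np.zeros(n)
for t in range(1, n):
    r[t] = 0.5 * r[t - 1] + 2.0 * cloud[t - 1] + 4.0 * aod[t - 1] \
           + 0.1 * rng.normal()
coef = fit_armax_correction(r, cloud, aod)
```

The fitted coefficients predict tomorrow's NWP residual, which is then subtracted from the raw GHI or DNI forecast.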
We have initially developed a time-independent forecast for southern California by smoothing the locations of magnitude 2 and larger earthquakes. We show that using small m ≥ 2 earthquakes gives a reasonably good prediction of m ≥ 5 earthquakes. Our forecast outperforms other time-independent models (Kagan and Jackson, 1994; Frankel et al., 1997), mostly because it has higher spatial resolution. We have
Agnes Helmstetter; Yan Y. Kagan; David D. Jackson
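The smoothed-seismicity idea can be sketched as follows: small-event epicenters are convolved with a Gaussian kernel and normalized into a probability map over grid cells (the published model uses adaptive kernel bandwidths and magnitude weighting; this fixed-bandwidth version and all names are illustrative):

```python
import numpy as np

def smoothed_rate(catalog_xy, grid_xy, bandwidth):
    """Time-independent forecast sketch: place an isotropic Gaussian kernel
    on each epicenter, sum the kernels at each grid cell, and normalize the
    result into a spatial probability map."""
    d2 = ((grid_xy[:, None, :] - catalog_xy[None, :, :]) ** 2).sum(-1)
    rate = np.exp(-0.5 * d2 / bandwidth ** 2).sum(axis=1)
    return rate / rate.sum()

# Toy catalog of small-event epicenters clustered near (0, 0).
rng = np.random.default_rng(7)
catalog = rng.normal(0.0, 0.1, size=(200, 2))
grid = np.array([[0.0, 0.0], [1.0, 1.0]])   # one cell on the cluster, one off
p = smoothed_rate(catalog, grid, bandwidth=0.1)
```

The resulting map concentrates probability where small earthquakes have occurred, which is the sense in which m ≥ 2 activity predicts future m ≥ 5 events.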
To improve the intelligent management of holiday tourism and contribute to its healthy, orderly and safe development, we built a tourism demand forecasting model on the basis of an RBF neural network, realized the classification, analysed and forecasted the tourism status, and realized the intelligent management of holiday tourism. It was used to generate RBF net
Wensheng Guo; Junping Du; Ruijie Wang
This paper examines an archive containing over 40 years of 8-day atmospheric forecasts over the contiguous United States from the NCEP reanalysis project to assess the possibilities for using medium-range numerical weather prediction model output for predictions of streamflow. This analysis shows the biases in the NCEP forecasts to be quite extreme. In many regions, systematic precipitation biases exceed 100%
Martyn P. Clark; Lauren E. Hay
Since October 1995 a daily forecast of the UV index, as the irradiance of biologically effective ultraviolet radiation, for the next day has been published for Austria, Europe and worldwide. The Austrian forecast model as well as its input parameters are described. By connecting the UV index with the sensitivity of the photobiological skin types, a recommendation is given to
A. W. Schmalwieser; G. Schauberger
This paper examines two episodes of heavy rainfall and significantly increased water levels. The first case relates to the period around the third decade of June 2010 in the Kolubara river basin, where extreme rainfall led to two big flood waves on the Kolubara river, with water levels exceeding both regular and extraordinary flood defence levels and approaching their historical maximum. The second case relates to the period spanning the end of November and the beginning of December 2010 in the Jadar river basin, where heavy precipitation caused the water levels of the basin to reach and surpass the warning level. The HBV (Hydrological Bureau Water Balance section) rainfall/snowmelt-runoff model installed at the RHMSS uses gridded quantitative precipitation and air temperature forecasts for 72 hours in advance, based on the WRF-NMM mesoscale weather forecast model. The Nonhydrostatic Mesoscale Model (NMM) core of the Weather Research and Forecasting (WRF) system is a flexible, state-of-the-art numerical weather prediction model capable of describing and estimating the powerful nonhydrostatic mechanisms in convective clouds that cause heavy rain. The HBV model is a semi-distributed conceptual catchment model in which the spatial structure of a catchment area is not explicitly modelled. Instead, the sub-basin represents the primary modelling unit, while the basin is characterised by its area-elevation distribution and a classification of vegetation cover and land use distributed by height zone. The WRF-NMM forecasts show very good agreement with observations in terms of timing, location and amount of precipitation. They are used as input for the HBV model, and the forecasted discharges at the output profile of the selected river basin represent the model output under consideration. 1 Republic Hydrometeorological Service of Serbia
Dekić, L.; Mihalović, A.; Jovičić, I.; Vladiković, D.; Jerinić, J.; Ivković, M.
The performance of a real-time physically based rainfall forecasting model is examined using radar, satellite, and ground station data for a region of Oklahoma. Model formulation is described in an accompanying paper (French and Krajewski, this issue). Spatially distributed radar reflectivity observations are coupled with model physics and uncertainty analysis through (1) linearization of model dynamics and (2) a Kalman filter formulation. Operationally available remote sensing observations from radar and satellite, and surface meteorologic stations define boundary conditions of the two-dimensional rainfall model. The spatially distributed rainfall is represented by a two-dimensional field of cloud columns, and model physics define the evolution of vertically integrated liquid water content (the model state) in space and time. Rainfall forecasts are evaluated using least squares criteria such as mean error of forecasted rainfall intensity, root mean square error of forecasted rainfall intensity, and correlation coefficient between spatially distributed forecasted and observed rainfall rates. The model performs well compared with two alternative real-time forecasting strategies: persistence and advection forecasting.
French, Mark N.; Andrieu, Hervé; Krajewski, Witold F.
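The Kalman filter formulation mentioned above can be illustrated with a scalar predict/update cycle (the paper's model state is a two-dimensional field of vertically integrated liquid water; this one-dimensional version and all values are illustrative):

```python
def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a scalar Kalman filter: propagate the
    state (e.g. vertically integrated liquid water content) with linearized
    model dynamics, then correct it with a radar-derived observation."""
    # Predict: advance the state and its error variance
    x_pred = F * x
    P_pred = F * P * F + Q
    # Update: blend the prediction with the observation via the Kalman gain
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

x, P = 1.0, 0.5                      # initial state and error variance
for z in [1.2, 1.1, 1.3]:            # successive radar observations
    x, P = kalman_step(x, P, z, F=0.95, Q=0.02, H=1.0, R=0.1)
```

Each assimilation cycle pulls the forecast toward the observations while the error variance P shrinks, which is how the uncertainty analysis enters the rainfall forecasts.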
The Electra model for the high-latitude ionosphere is parametrized with polar cap activity and solar wind key parameters. An early version of it, driven with real-time solar wind input from the ACE spacecraft, produces regular real-time forecasts of the large-scale activity (see the website address below). Prediction outputs include qualitative maps of activity, geomagnetic indices, and local ground magnetic and ionospheric electric fields. Prediction accuracy is determined primarily by the input and the type of model, and additionally by the initial conditions. We present three test cases: (a) a northward BZ interval characterized by low magnetic activity and reverse convection patterns; (b) time-dependent enhanced convection; and (c) two small-scale (-600 nT) substorm intervals. The model reproduces the large-scale spatial development of convection and magnetospheric substorms as well as the regional indices of geomagnetic activity. For those cases, the choice of the input sequence is much more important than the initial conditions. The local fields, however, are predicted less accurately; in that case the prediction error is additionally a function of local time and type of activity. http://lep694.gsfc.nasa.gov/RTSM/People/vassi/rt/spatio.html
Klimas, A. J.; Vassiliadis, D.; Weigel, R. S.; Uritsky, V. M.
Macroeconomic policy decisions in real time are based on the assessment of current and future economic conditions. These assessments are made difficult by the presence of incomplete and noisy data. The problem is more acute for emerging market economies, where most economic data are released infrequently with a (sometimes substantial) lag. This paper evaluates
Rafael Romeu; Troy Matheson; Philip Liu
Two projects at NASA Marshall Space Flight Center have collaborated to develop a high-resolution weather forecast model for Mesoamerica: the NASA Short-term Prediction Research and Transition (SPoRT) Center, which integrates unique NASA satellite and weather forecast modeling capabilities into the operational weather forecasting community, and NASA's SERVIR Program, which integrates satellite observations, ground-based data, and forecast models to improve disaster response in Central America, the Caribbean, Africa, and the Himalayas.
Molthan, Andrew; Case, Jonathan; Venner, Jason; Moreno-Madrinan, Max J.; Delgado, Francisco
This work puts forward a new method for establishing a medium- to long-term power system load forecasting model based on state-space time-varying parameter equation theory. The state variable is brought into the model as an observable for solution, which reflects the varying rules of the equilibrium relations among the variables and enhances the accuracy of the model forecast. The state-space time-varying parameter model describes the varying pattern of
Xiang Li; Hao Chen; Shan Gao
The present study developed an artificial neural network (ANN) model to overcome the difficulties in training ANN models with continuous data consisting of rainy and non-rainy days. Among the six models analyzed, the ANN model which used a generalized feedforward-type network, a hyperbolic tangent activation function, and a combination of meteorological parameters (relative humidity, air pressure, wet bulb temperature and cloudiness) together with the rainfall at the point of forecasting and at the surrounding stations as training input was found most satisfactory for forecasting rainfall in Bangkok, Thailand. The developed ANN model was applied to derive rainfall forecasts from 1 to 6 h ahead at 75 rain gauge stations in the study area as forecast points, from the data of 3 consecutive years (1997-1999). Results were highly satisfactory for rainfall forecasts 1 to 3 h ahead. Sensitivity analysis indicated that the most important input parameter besides rainfall itself is the wet bulb temperature. Based on these results, it is recommended that the developed ANN model can be used for real-time rainfall forecasting and flood management in Bangkok, Thailand.
Hung, N. Q.; Babel, M. S.; Weesakul, S.; Tripathi, N. K.
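A feedforward network with a hyperbolic tangent activation, as described above, reduces at prediction time to a simple forward pass. The sketch below is a toy illustration; the layer sizes, weights and input values are placeholders, not the trained Bangkok model.

```python
import math

def tanh_layer(inputs, weights, biases):
    """One dense layer with the hyperbolic tangent activation."""
    return [math.tanh(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def forward(inputs, hidden_w, hidden_b, out_w, out_b):
    """Two-layer net: hidden tanh layer, then a linear output (rainfall depth)."""
    hidden = tanh_layer(inputs, hidden_w, hidden_b)
    return sum(w * h for w, h in zip(out_w, hidden)) + out_b

# Illustrative (scaled) inputs: relative humidity, pressure anomaly, wet-bulb
# temperature, cloudiness, rainfall at the forecast point, rainfall at neighbours
x = [0.8, -0.2, 0.6, 0.9, 0.3, 0.4]
hw = [[0.1] * 6, [-0.2] * 6, [0.05] * 6]   # 3 hidden units, placeholder weights
hb = [0.0, 0.1, -0.1]
rain_1h_ahead = forward(x, hw, hb, [0.5, 0.3, 0.2], 0.0)
```

In practice the weights come from supervised training on the 1997-1999 records; the point of the sketch is only the shape of the input vector and the tanh forward pass.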
Using Bayesian Model Averaging (BMA), an attempt was made to obtain calibrated probabilistic numerical forecasts of 2-m temperature over Iran. The ensemble employs three limited area models (WRF, MM5 and HRM), with WRF used in five different configurations. Initial and boundary conditions for MM5 and WRF are obtained from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS), while for HRM the initial and boundary conditions come from the analysis of the Global Model Europe (GME) of the German Weather Service. The resulting ensemble of seven members was run for a period of 6 months (from December 2008 to May 2009) over Iran. The 48-h raw ensemble outputs were calibrated using the BMA technique for 120 days, with a 40-day training sample of forecasts and the corresponding verification data. The calibrated probabilistic forecasts were assessed using rank histograms and attributes diagrams. Results showed that application of BMA improved the reliability of the raw ensemble. Using the weighted ensemble mean forecast as a deterministic forecast, it was found that the deterministic-style BMA forecasts usually performed better than the best member's deterministic forecast.
Soltanzadeh, I.; Azadi, M.; Vakili, G. A.
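The BMA forecast described above is a weighted mixture of member-centred distributions, and the weighted ensemble mean serves as the deterministic-style forecast. A minimal sketch, assuming Gaussian kernels with a common spread and placeholder weights (in practice the weights and spread come from EM training on the 40-day sample):

```python
import math

def gauss_pdf(y, mu, sigma):
    """Gaussian density used as the BMA kernel for each member."""
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bma_forecast(members, weights, sigma):
    """Return the weighted ensemble mean (deterministic-style BMA forecast)
    and a function evaluating the BMA mixture density at a temperature y."""
    mean = sum(w * f for w, f in zip(weights, members))
    def density(y):
        return sum(w * gauss_pdf(y, f, sigma) for w, f in zip(weights, members))
    return mean, density

# Illustrative 7-member 2-m temperature ensemble (deg C); weights are placeholders
members = [12.1, 12.4, 11.8, 12.9, 12.2, 11.5, 12.6]
weights = [0.25, 0.20, 0.15, 0.10, 0.10, 0.10, 0.10]
mean, pdf = bma_forecast(members, weights, sigma=1.2)
```

The mixture density is what the rank histogram and attributes diagram assess; the scalar `mean` is what was compared against the best member's deterministic forecast.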
Species distribution models (SDMs) are widely used to forecast changes in the spatial distributions of species and communities in response to climate change. However, spatial autocorrelation (SA) is rarely accounted for in these models, despite its ubiquity in broad-scale ecological data. While spatial autocorrelation in model residuals is known to result in biased parameter estimates and the inflation of type I errors, the influence of unmodeled SA on species' range forecasts is poorly understood. Here we quantify how accounting for SA in SDMs influences the magnitude of range shift forecasts produced by SDMs for multiple climate change scenarios. SDMs were fitted to simulated data with a known autocorrelation structure, and to field observations of three mangrove communities from northern Australia displaying strong spatial autocorrelation. Three modeling approaches were implemented: environment-only models (most frequently applied in species' range forecasts), and two approaches that incorporate SA: autologistic models and residuals autocovariate (RAC) models. Differences in forecasts among modeling approaches and climate scenarios were quantified. While all model predictions at the current time closely matched the actual current distribution of the mangrove communities, under the climate change scenarios environment-only models forecast substantially greater range shifts than models incorporating SA. Furthermore, the magnitude of these differences intensified with increasing increments of climate change across the scenarios. When models do not account for SA, forecasts of species' range shifts indicate more extreme impacts of climate change compared to models that explicitly account for SA. Therefore, where biological or population processes induce substantial autocorrelation in the distribution of organisms, and this is not modeled, model predictions will be inaccurate.
These results have global importance for conservation efforts as inaccurate forecasts lead to ineffective prioritization of conservation activities and potentially to avoidable species extinctions. PMID:24845950
Crase, Beth; Liedloff, Adam; Vesk, Peter A; Fukuda, Yusuke; Wintle, Brendan A
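The residuals autocovariate (RAC) approach mentioned above can be sketched in one dimension: fit an environment-only model, compute its residuals, and form each site's autocovariate as the mean residual of its neighbours, which then enters the refitted model as an extra covariate. The toy data and the least-squares stand-in for the SDM are assumptions for illustration.

```python
def fit_env_only(env, occ):
    """Toy 'environment-only model': least-squares intercept/slope of
    occurrence on a single environmental gradient."""
    n = len(env)
    mx, my = sum(env) / n, sum(occ) / n
    sxx = sum((x - mx) ** 2 for x in env)
    b = sum((x - mx) * (y - my) for x, y in zip(env, occ)) / sxx
    return my - b * mx, b

def residual_autocovariate(env, occ, a, b):
    """RAC term: for each site, the mean residual of its 1D neighbours."""
    resid = [y - (a + b * x) for x, y in zip(env, occ)]
    rac = []
    for i in range(len(resid)):
        neigh = [resid[j] for j in (i - 1, i + 1) if 0 <= j < len(resid)]
        rac.append(sum(neigh) / len(neigh))
    return rac

env = [0.0, 1.0, 2.0, 3.0, 4.0]      # hypothetical environmental gradient
occ = [0.1, 0.3, 0.2, 0.8, 0.9]      # hypothetical occurrence data
a, b = fit_env_only(env, occ)
rac = residual_autocovariate(env, occ, a, b)
```

Refitting with `rac` as an additional covariate absorbs the spatial structure the environment alone cannot explain, which is why RAC forecasts shift less under the climate scenarios than environment-only ones.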
A technique has been developed that uses Puff, a volcanic ash transport and dispersion (VATD) model, to forecast the relative exposure of aircraft and ground facilities to ash from a volcanic eruption. VATD models couple numerical weather prediction (NWP) data with physical descriptions of the initial eruptive plume, atmospheric dispersion, and settling of ash particles. Three distinct examples of variations on the technique are given using ERA-40 archived reanalysis NWP data. The Feb. 2000 NASA DC-8 event involving an eruption of Hekla volcano, Iceland, is first used for analyzing a single flight. Results corroborate previous analyses that conclude the aircraft did encounter a diffuse cloud of volcanic origin, and indicate exposure within a factor of 10 compared to measurements made on the flight. The sensitivity of the technique to dispersion physics is demonstrated. The Feb. 2001 eruption of Mt. Cleveland, Alaska, is used as a second example to demonstrate how this technique can be utilized to quickly assess the potential exposure of a multitude of aircraft during and soon after an event. Using flight tracking data from over 40,000 routes over three days, several flights that may have encountered low concentrations of ash were identified, and the exposure calculated. Relative changes in the quantity of exposure when the eruption duration is varied are discussed, and no clear trend is evident as the exposure increased for some flights and decreased for others. A third application of this technique is demonstrated by forecasting the near-surface airborne concentrations of ash that the cities of Yakima, Washington; Boise, Idaho; and Kelowna, British Columbia might have experienced from an eruption of Mt. St. Helens anytime during the year 2000. Results indicate that proximity to the source does not accurately determine the potential hazard.
Although an eruption did not occur during this time, the results serve as a demonstration of how existing cities or potential locations of research facilities or military bases can be assessed for susceptibility to hazardous and unhealthy concentrations of ash and other volcanic gases.
Peterson, Rorik A.; Dean, Ken G.
The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aimed at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides researchers with access to space science models, even if they are not model owners themselves. The second CCMC activity is to support Space Weather forecasting at national Space Weather Forecasting Centers. This second activity involves model evaluations, model transitions to operations, and the development of draft Space Weather forecasting tools. This presentation will focus on the last element. Specifically, we will discuss present capabilities and the potential to derive further tools. These capabilities will be interpreted in the context of a broad-based, bootstrapping activity for modern Space Weather forecasting.
The Techniques Development Laboratory has a project called the Local AFOS MOS Program (LAMP). The purpose of the project is to provide Model Output Statistics (MOS) forecasts to a Weather Service Forecast Office (WSFO) for essentially all locations for wh...
H. R. Glahn
This paper considers forecasting by econometric and time series models using preliminary (or provisional) data. The standard practice is to ignore the distinction between provisional and final data. We call the forecasts that ignore such a distinction naive forecasts, which are generated as projections from a correctly specified model using the most recent estimates of the unobserved final figures. It
Forecasting with macroeconometric models is now a well established, worldwide practice. Forecasters typically report a point estimate for each macroeconomic variable for each period in the future. However, it is very rare that forecasters will also provid...
R. J. Corker R. Ellis S. Holly
This publication presents forecasts of commuter air carrier activity and describes the models designed for forecasting Conterminous United States, Puerto Rico and the Virgin Islands, Hawaii, and individual airport activity. These forecasts take into accou...
H. Medville C. Starry G. Bernstein
Examples of studies which incorporate satellite imagery and other conventional or remotely-sensed data for nowcasting and very short term weather forecasting are presented. One of the techniques merges a cloud-pattern steering technique and a statistical ...
G. S. Forbes K. A. Degroodt R. L. Scheinhartz
The European Centre for Medium Range Weather Forecasts operationally produces medium range forecasts using what is internationally acknowledged as the world-leading global weather forecast model. Future development of this scientifically advanced model relies on the continued availability of experts in the field of meteorological science with high-level software skills. ECMWF therefore has a vested interest in young scientists and University graduates developing the necessary skills in numerical weather prediction, including both scientific and technical aspects. The OpenIFS project at ECMWF maintains a portable version of the ECMWF forecast model (known as IFS) for use in education and research at Universities, National Meteorological Services and other research and education organisations. OpenIFS models can be run on desktop or high performance computers to produce weather forecasts in a similar way to the operational forecasts at ECMWF. ECMWF also provides the Metview desktop application, a modern, graphical, and easy to use tool for analysing and visualising forecasts that is routinely used by scientists and forecasters at ECMWF and other institutions. The combination of Metview with the OpenIFS models has the potential to deliver classroom-friendly tools allowing students to apply their theoretical knowledge to real-world examples using a world-leading weather forecasting model. In this paper we will describe how the OpenIFS model has been used for teaching. We describe the use of Linux based 'virtual machines' pre-packaged on USB sticks that support a technically easy and safe way of providing 'classroom-on-a-stick' learning environments for advanced training in numerical weather prediction. We welcome discussions with interested parties.
Carver, Glenn; Váňa, Filip; Siemen, Stephan; Kertesz, Sandor; Keeley, Sarah
Probabilistic streamflow predictions based on past climate records or meteorological forecasts have drawn much attention in recent years. They are usually incorporated into operational forecasting systems by government agencies and industries to deal with water resources management and regulation problems. This work presents an operational prototype for short- to medium-term ensemble streamflow predictions over Quebec, Canada. The system uses ensemble meteorological forecasts for short-term (up to 7 days) forecasting, transitioning to a stochastic weather generator conditioned on historical data for the period exceeding 7 days. The precipitation and temperature series are then fed into a combination of 32 hydrology models to account for both the meteorological and hydrology modelling uncertainties. A novel post-processing approach was implemented to correct the biases and the under-dispersion of ensemble meteorological forecasts. This post-processing approach links the mean of the ensemble meteorological forecast to parameters of a stochastic weather generator (absolute probability of precipitation and observed precipitation mean in the case of precipitation). The stochastic weather generator is then used to generate unbiased time series with accurate spread. Results show that the post-processed meteorological forecasts displayed skill for a period up to 7 days for both precipitation and temperature. The ensemble streamflow prediction displayed more skill than when using the deterministic forecast or the stochastic weather generator not conditioned on the ensemble meteorological forecasts. To tackle the uncertainty linked to the hydrology model, 4 different models were calibrated with up to 9 different efficiency metrics (for a combination of 32 models/calibrations). Nine different averaging schemes were compared to attribute weights to the 32 combinations.
The best averaging method (Granger-Ramanathan) produced estimates with a much better efficiency than the best performing model, while removing all biases linked to the hydrology modelling.
Chen, Jie; Brissette, François; Arsenault, Richard; Gatien, Philippe; Roy, Pierre-Olivier; Li, Zhi; Turcotte, Richard
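The Granger-Ramanathan averaging singled out above obtains combination weights by an unconstrained least-squares regression of observations on member forecasts. A two-model sketch that solves the normal equations directly (the data are invented; the study combined 32 members):

```python
def granger_ramanathan_weights(f1, f2, obs):
    """Unconstrained least-squares weights for combining two forecast series
    (a Granger-Ramanathan variant without intercept): solve the 2x2 normal
    equations (F'F) w = F'y by hand."""
    a11 = sum(x * x for x in f1)
    a22 = sum(x * x for x in f2)
    a12 = sum(x * y for x, y in zip(f1, f2))
    b1 = sum(x * y for x, y in zip(f1, obs))
    b2 = sum(x * y for x, y in zip(f2, obs))
    det = a11 * a22 - a12 * a12          # assumes the forecasts are not collinear
    w1 = (b1 * a22 - b2 * a12) / det
    w2 = (a11 * b2 - a12 * b1) / det
    return w1, w2

# Usage: observations built as an exact 0.3/0.7 blend are recovered exactly
f1 = [10.0, 12.0, 9.0, 14.0, 11.0]
f2 = [11.0, 10.0, 12.0, 13.0, 9.0]
obs = [0.3 * a + 0.7 * b for a, b in zip(f1, f2)]
w1, w2 = granger_ramanathan_weights(f1, f2, obs)
```

Because the weights are unconstrained, the regression also corrects any common bias in the members, which is consistent with the bias removal reported for the best averaging method.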
In recent years, load forecasting has become one of the main fields of study and research. Short Term Load Forecasting (STLF) is an important part of electrical power system operation and planning. This work investigates the applicability of different approaches; Artificial Neural Networks (ANNs) and hybrid analytical models to forecast residential load in Kingdom of Saudi Arabia (KSA). These two techniques are based on model human modes behavior formulation. These human modes represent social, religious, official occasions and environmental parameters impact. The analysis is carried out on residential areas for three regions in two countries exposed to distinct people activities and weather conditions. The collected data are for Al-Khubar and Yanbu industrial city in KSA, in addition to Seattle, USA to show the validity of the proposed models applied on residential load. For each region, two models are proposed. First model is next hour load forecasting while second model is next day load forecasting. Both models are analyzed using the two techniques. The obtained results for ANN next hour models yield very accurate results for all areas while relatively reasonable results are achieved when using hybrid analytical model. For next day load forecasting, the two approaches yield satisfactory results. Comparative studies were conducted to prove the effectiveness of the models proposed.
Al-Harbi, Ahmad Abdulaziz
The author presents an economic model for selective admissions to postsecondary nursing programs. Primary determinants of the admissions model are employment needs, availability of educational resources, and personal resources (ability and learning potential). As there are more applicants than resources, selective admission practices are…
Although raccoons are widely distributed throughout North America, the raccoon rabies virus variant is enzootic only in the eastern United States, based on current surveillance data. This variant of rabies virus is now responsible for >60% of all cases of animal rabies reported in the United States each year. Ongoing national efforts via an oral rabies vaccination (ORV) program are aimed at preventing the spread of raccoon rabies. However, from an epidemiologic perspective, the relative susceptibility of naïve geographic localities, adjacent to defined enzootic areas, to support an outbreak, is unknown. In the current study, we tested the ability of a spatial risk model to forecast raccoon rabies spread in presumably rabies-free and enzootic areas. Demographic, environmental, and geographical features of three adjacent states (Ohio, West Virginia, and Pennsylvania), which include distinct raccoon rabies free, as well as enzootic areas, were modeled by using a Poisson Regression Model, which had been developed from previous studies of enzootic raccoon rabies in New York State. We estimated susceptibility to raccoon rabies emergence at the census tract level and compared the results with historical surveillance data. Approximately 70% of the disease-free region had moderate to very high susceptibility, compared with 23% in the enzootic region. Areas of high susceptibility for raccoon rabies lie west of current ORV intervention areas, especially in southern Ohio and western West Virginia. Predicted high susceptibility areas matched historical surveillance data. We discuss model implications to the spatial dynamics and spread of raccoon rabies, and its application for designing more efficient disease control interventions. PMID:21995266
Recuenco, Sergio; Blanton, Jesse D; Rupprecht, Charles E
We propose a novel characterization of errors for numerical weather predictions. In its simplest form we decompose the error into a part attributable to phase errors and a remainder. The phase error is represented in the same fashion as a velocity field and is required to vary slowly and smoothly with position. A general distortion representation allows for the displacement and amplification or bias correction of forecast anomalies. Characterizing and decomposing forecast error in this way has two important applications, which we term the assessment application and the objective analysis application. For the assessment application, our approach results in new objective measures of forecast skill which are more in line with subjective measures of forecast skill and which are useful in validating models and diagnosing their shortcomings. With regard to the objective analysis application, meteorological analysis schemes balance forecast error and observational error to obtain an optimal analysis. Presently, representations of the error covariance matrix used to measure the forecast error are severely limited. For the objective analysis application our approach will improve analyses by providing a more realistic measure of the forecast error. We expect, a priori, that our approach should greatly improve the utility of remotely sensed data which have relatively high horizontal resolution, but which are indirectly related to the conventional atmospheric variables. In this project, we are initially focusing on the assessment application, restricted to a realistic but univariate 2-dimensional situation. Specifically, we study the forecast errors of the sea level pressure (SLP) and 500 hPa geopotential height fields for forecasts of the short and medium range. Since the forecasts are generated by the GEOS (Goddard Earth Observing System) data assimilation system with and without ERS 1 scatterometer data, these preliminary studies serve several purposes.
They (1) provide a testbed for the use of the distortion representation of forecast errors, (2) act as one means of validating the GEOS data assimilation system and (3) help to describe the impact of the ERS 1 scatterometer data.
Hoffman, Ross N.; Nehrkorn, Thomas; Grassotti, Christopher
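In its simplest one-dimensional form, the distortion idea above can be sketched as: search for the displacement that best aligns the forecast with the analysis, call that displacement the phase error, and measure what remains after removing it. The grid-point fields below are invented for illustration.

```python
def phase_decompose(forecast, analysis, max_shift=3):
    """Find the integer shift of the forecast that minimizes squared error
    against the analysis; report the shift (phase error) and the residual
    RMSE once that shift is removed. A 1D toy version of the decomposition."""
    n = len(forecast)
    best = None
    for s in range(-max_shift, max_shift + 1):
        idx = [i for i in range(n) if 0 <= i + s < n]   # overlapping points only
        mse = sum((forecast[i + s] - analysis[i]) ** 2 for i in idx) / len(idx)
        if best is None or mse < best[1]:
            best = (s, mse)
    shift, mse = best
    return shift, mse ** 0.5

# A forecast that is the analysis displaced by two grid points
analysis = [0.0, 0.0, 1.0, 3.0, 1.0, 0.0, 0.0, 0.0]
forecast = [0.0, 0.0, 0.0, 0.0, 1.0, 3.0, 1.0, 0.0]
shift, resid_rmse = phase_decompose(forecast, analysis)
```

A conventional point-by-point RMSE would heavily penalize this forecast even though its feature is perfect apart from position; the decomposition attributes the whole error to phase, mirroring the paper's motivation for skill measures closer to subjective judgments. The full method uses a smooth 2D displacement field rather than a single integer shift.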
Hybrid modelling, used for simulation and forecasting of hydrological time series and involving both process-based and data-driven types of models, combines the available domain knowledge and process physics with recent advances in data-driven tools. In this way, complex hydrological processes can be modelled and forecasted by decomposing the problem into several smaller sub-problems, using process-physics-based models where these are most appropriate and data-dictated tools (such as ANNs, time series models or traditional statistics) for the residual data, when necessary. The fitting and forecasting performance of such models has to be explored case by case. So far, only a few attempts to apply various nonlinear time series models within such a framework have been reported in the hydrological modelling literature. This contribution presents results concerning the possibility of using GARCH-type models for such purposes. More specifically, error time series from two hydrological conceptual models were analyzed (applied to time series measured from the Hron and Morava Rivers in Slovakia), concentrating on the improvement of the modelling and forecasting performance of these models. The goal of the investigation was to expand knowledge of the time series modelling of hydrological model error series, with the aim of testing and developing appropriate methods for various time steps from the GARCH family of models. In order to achieve this, the following steps were taken: 1. The presence of heteroscedasticity was verified in the time series. 2. A model from the GARCH family was fitted to the data, comparing the fit with that of an ARMA model. 3. One-step-ahead forecasts from the fitted models were produced and compared. Model properties and performance were thoroughly tested under various conditions corresponding to their future practical applications.
In general, heteroscedasticity was present in the majority of the error time series of the hydrological models. However, the GARCH family of models proved suited to removing it only at the daily time step. The basic GARCH model was not applicable to any of the time series; in all other investigated cases, the EGARCH(1,1) model had to be used. Unlike in econometric time series, where the so-called leverage effect (the series reacts more strongly to negative changes) is present and captured by this model, here the data tend to react more strongly to positive changes. In this particular case it was found that a general property of hydrological processes, namely that the rise of discharge is rainfall driven (a highly nonlinear, chaotic, intermittent process) while the decrease of discharge is ruled by the damping effects of water storage in the driven system (catchment or river reach), is present also in the hydrological model error series. This shows that the modelling and forecasting of floods (pulse-like rising discharge) is a more demanding task than that of droughts (slowly decreasing flows). Even though the GARCH models did show partial improvements in the modelling and forecasting of flows, they still have several serious disadvantages (such as high sensitivity to the chosen fitting period), and their possible further use should be investigated further. These results are of importance with respect to future attempts at modelling error time series of hydrological models in such hybrid frameworks. They underpin the need for a non-mechanistic approach in the case-based analysis of such data and for physical interpretation of statistical modelling results.
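The heteroscedasticity modelling discussed above rests on the GARCH(1,1) conditional variance recursion; a minimal sketch of that recursion on an invented error series follows (the study in fact required EGARCH(1,1), whose log-variance recursion adds an asymmetry term not shown here).

```python
def garch11_variance(errors, omega, alpha, beta):
    """Conditional variance path sigma2_t = omega + alpha * e_{t-1}^2
    + beta * sigma2_{t-1}, started at the unconditional variance.
    Requires alpha + beta < 1 for the unconditional variance to exist."""
    sigma2 = omega / (1.0 - alpha - beta)   # unconditional variance as start value
    path = [sigma2]
    for e in errors[:-1]:
        sigma2 = omega + alpha * e * e + beta * sigma2
        path.append(sigma2)
    return path

# Hypothetical hydrological-model error series (m^3/s): a calm stretch
# followed by a flood-driven burst of large errors
errors = [0.1, -0.2, 0.1, 2.0, -1.8, 1.5, 0.3, -0.1]
path = garch11_variance(errors, omega=0.05, alpha=0.2, beta=0.7)
```

The conditional variance jumps after the burst of large errors and decays afterwards, which is exactly the flood-versus-recession asymmetry the abstract describes; fitting (rather than fixing) omega, alpha and beta is what the ARMA-versus-GARCH comparison in the study involved.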
The Martinec-Rango snowmelt runoff model was applied to two watersheds in the Rio Grande basin, Colorado: the South Fork Rio Grande, a drainage encompassing 216 sq mi without reservoirs or diversions, and the Rio Grande above Del Norte, a drainage encompassing 1,320 sq mi without major reservoirs. The model was successfully applied to both watersheds when run in a simulation mode for the period 1973-79. This period included both high and low runoff seasons. Central to the adaptation of the model to run in a forecast mode was the need to develop a technique to forecast the shape of the snow cover depletion curves between satellite data points. Four separate approaches were investigated: simple linear estimation, multiple regression, parabolic exponential, and type curve. Only the parabolic exponential and type curve methods were run on the South Fork and Rio Grande watersheds for the 1980 runoff season using satellite snow cover updates when available. Although reasonable forecasts were obtained in certain situations, neither method seemed ready for truly operational forecasts, possibly due to a large amount of estimated climatic data for one or two primary base stations during the 1980 season.
Shafer, B.; Jones, E. B.; Frick, D. M. (principal investigators)
Load forecasting is necessary for economic generation of power, economic allocation between plants (unit commitment scheduling), maintenance scheduling, and for system security such as peak-load shaving by power interchange with interconnected utilities. In this paper a fuzzy linear regression model for the summer and winter seasons is developed. The fuzzy estimation problem for the model is turned into a linear
A. M. Al-Kandari; S. A. Soliman; M. E. El-Hawary
In eutrophic sub-tropical coastal waters around Hong Kong and South China, algal blooms (more often called red tides) due to the rapid growth of microscopic phytoplankton are often observed. Under favourable environmental conditions, these blooms can occur and subside over rather short time scales, in the order of days to a few weeks. Very often, these blooms are observed in weakly flushed coastal waters under calm wind conditions, with or without stratification. Based on high-frequency field observations of harmful algal blooms at two coastal mariculture zones in Hong Kong, a mathematical model has been developed to forecast algal blooms. The model accounts for algal growth, decay, settling and vertical turbulent mixing, and adopts the same assumptions as the classical Riley, Stommel and Bumpus model (Riley, G.A., Stommel, H., Bumpus, D.F., 1949. Quantitative ecology of the plankton of the western North Atlantic. Bulletin of the Bingham Oceanographic Collection Yale University 12, 1-169). It is shown that for algal blooms to occur, a vertical stability criterion, E < 4αl²/π², must be satisfied, where E, α and l are the vertical turbulent diffusivity, algal growth rate, and euphotic layer depth respectively. In addition, a minimum nutrient threshold concentration must be reached. Moreover, with a nutrient competition consideration, the type of bloom (caused by motile or non-motile species) can be classified. The model requires as input simple and readily available field measurements of water column transparency and nutrient concentration, and representative maximum algal growth rates of the motile and non-motile species. In addition, with the use of three-dimensional hydrodynamic circulation models, simple relations are derived to estimate the vertical mixing coefficient as a function of tidal range, wind speed, and density stratification.
The model gives a quick assessment of the likelihood of algal bloom occurrence, and has been validated against field observations over a 4-year period. The model helps to explain the observed spatial and temporal patterns of bloom occurrences in relation to the vertical turbulence and nutrient condition. The success of the model points the way to the development of real time management models for disaster mitigation.
Wong, Ken T. M.; Lee, Joseph H. W.; Hodgkiss, I. J.
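The vertical stability criterion above can be evaluated directly: a bloom is possible only when the vertical turbulent diffusivity E falls below 4αl²/π², with α the algal growth rate and l the euphotic layer depth. A minimal sketch, with purely hypothetical numbers and units:

```python
import math

def bloom_possible(E, alpha, l):
    """Vertical stability criterion of the model: growth can outpace vertical
    mixing when E < 4 * alpha * l**2 / pi**2.
    E: vertical turbulent diffusivity, alpha: algal growth rate,
    l: euphotic layer depth (all in consistent units, e.g. m and days)."""
    critical_E = 4.0 * alpha * l ** 2 / math.pi ** 2
    return E < critical_E, critical_E

# Hypothetical calm-weather case: growth rate 1.0 per day, euphotic depth 5 m,
# diffusivity 5 m^2/day estimated from tidal range, wind and stratification
ok, crit = bloom_possible(E=5.0, alpha=1.0, l=5.0)
```

With these illustrative numbers the critical diffusivity is about 10.1 m²/day, so the quiescent water column fails the stability test and a bloom is possible, provided the nutrient threshold is also reached.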
The Earth's Ionosphere-Thermosphere-Electrodynamics (I-T-E) system varies markedly on a range of spatial and temporal scales, and these variations can have adverse effects on human operations and systems. Consequently, there is a need to both mitigate and forecast near-Earth space weather. Following the meteorologists, our goal is to specify and forecast the global I-T-E system with data assimilation models, because they are reliable and the models are already available. Currently, our team has first-principles-based data assimilation models for the ionosphere, ionosphere-plasmasphere, thermosphere, high-latitude ionosphere-electrodynamics, and mid-low latitude ionosphere-electrodynamics. These models assimilate a myriad of different ground- and space-based observations, and there are several data assimilation models for each near-Earth space domain. This enables us to conduct Multimodel Ensemble Data Assimilation of the I-T-E system that can account for different physical modeling assumptions, numerical techniques, and model initialization approaches. The application of ensemble modeling with several different data assimilation models will lead to a paradigm shift in how basic physical processes are studied in near-Earth space, and it should also lead to a significant advance in space weather forecasting.
Schunk, R. W.; Scherliess, L.; Eccles, J. V.; Gardner, L. C.; Sojka, J. J.; Zhu, L.; Pi, X.; Mannucci, A.; Wilson, B. D.; Komjathy, A.; Wang, C.; Rosen, G.; Tobiska, W.; Schaefer, R. K.; Paxton, L. J.
Space research and, consequently, space weather forecasting are immature disciplines. Scientific knowledge accumulates frequently, changing our understanding of how solar eruptions occur and of how they impact targets near or on the Earth, or targets throughout the heliosphere. Along with continuous progress in understanding, space research and forecasting models are advancing rapidly in capability, often providing substantial increases in space weather value over time scales of less than a year. Furthermore, the majority of space environment information available today is, particularly in the solar and heliospheric domains, derived from research missions. An optimal forecasting environment needs to be flexible enough to benefit from this rapid development, and flexible enough to adapt to evolving data sources, many of which may also stem from non-US entities. This presentation will analyze the experiences obtained by developing and operating both a forecasting service for NASA and an experimental forecasting system for Geomagnetically Induced Currents.
Hesse, Michael; Pulkkinen, Antti; Zheng, Yihua; Maddox, Marlo; Berrios, David; Taktakishvili, Sandro; Kuznetsova, Masha; Chulaki, Anna; Lee, Hyesook; Mullinix, Rick; Rastaetter, Lutz
The purpose of this work is to show the application of a new earthquake forecasting model, called the Double Branching Model, at both global and regional scales, for different zones of the globe. The Double Branching Model is time-dependent and assumes that each earthquake can generate other earthquakes, through physical mechanisms acting on different temporal scales. Remarkably, the model can be used to produce probability forecasts and can be tested in a forward perspective. Here we show the forecasting maps obtained for different time-magnitude windows in each target region. Moreover, we compare them with the predictions provided by a spatially variable stationary Poisson process, still widely used for seismic hazard assessment and forecasting purposes. The results presented here were obtained within the CSEP experiment and are currently underway at several testing centers.
Lombardi, A. M.; Marzocchi, W.
... structures and track. Operating on a model grid with data points only 4 kilometers (2.5 miles) ... a new five-day forecast using a 10-kilometer grid and updated conditions. NCAR's primary sponsor ...
Traditional economic analysis methods for manufacturing decisions include only the clearly identified immediate cost and revenue streams. Environmental issues have generally been seen as costs, in the form of waste material losses, conformance tests and pre-discharge treatments. The components of the waste stream, often purchased as raw materials, become liabilities at the "end of the pipe", and their intrinsic material value is seldom recognized. A new mathematical treatment of manufacturing economics is proposed in which the costs of separation are compared with the intrinsic value of the waste materials to show how their recovery can provide an economic advantage to the manufacturer. The model is based on a unique combination of thermodynamic analysis, economic modeling and linear optimization. This paper describes the proposed model and examines case studies in which the changed decision rules have yielded significant savings while protecting the environment. The premise proposed is that by including the value of the waste materials in the profit objective of the firm and applying the appropriate technological solution, manufacturing processes can become closed systems in which losses approach zero and environmental problems are converted into economic savings.
Wells, Wayne E.; Edinbarough, Immanuel A.
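The changed decision rule described above can be sketched as a per-component comparison of intrinsic material value against separation cost; the full model couples this with thermodynamic analysis and linear optimization, which are not reproduced here, and all figures below are invented.

```python
def recovery_decision(components):
    """For each waste-stream component, compare its intrinsic material value
    against the cost of separating it from the stream; recover when value
    exceeds cost, otherwise pay disposal. Returns the plan and the net
    change to the firm's profit objective."""
    plan, net = {}, 0.0
    for name, (value, sep_cost, disposal_cost) in components.items():
        if value > sep_cost:
            plan[name] = "recover"
            net += value - sep_cost      # recovered value enters the objective
        else:
            plan[name] = "dispose"
            net -= disposal_cost         # traditional end-of-pipe cost
    return plan, net

# Invented figures ($/batch): intrinsic value, separation cost, disposal cost
components = {
    "solvent": (120.0, 40.0, 25.0),
    "sludge": (10.0, 60.0, 15.0),
}
plan, net = recovery_decision(components)
```

Including the recovered value in the objective is what turns a waste liability into a contribution to profit; a traditional analysis would book both components purely as disposal costs.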
The Center for Integrated Space Weather Modeling (CISM) has developed three forecast models (FMs) for the Sun-Earth chain. They have been matured to various degrees toward the operational stage. The Sun-Earth FM suite comprises empirical and physical models: the Planetary Equivalent Amplitude (AP-FM), the Solar Wind (SW-FM), and the Geospace (GS-FM) models. We give a brief overview of these forecast models and touch briefly on the associated validation studies. We demonstrate the utility of the models: AP-FM supporting the operations of the AIM (Aeronomy of Ice in the Mesosphere) mission soon after launch; SW-FM providing assistance with the interpretation of the STEREO beacon data; and GS-FM combining model and observed data to characterize the aurora borealis. We will then discuss space weather tools in a more general sense, point out where the current capabilities and shortcomings are, and conclude with a look forward to what areas need improvement to facilitate better real-time forecasts.
Gehmeyr, M.; Baker, D. N.; Millward, G.; Odstrcil, D.
Age–sex-specific population forecasts are derived through stochastic population renewal using forecasts of mortality, fertility and net migration. Functional data models with time series coefficients are used to model age-specific mortality and fertility rates. As detailed migration data are lacking, net migration by age and sex is estimated as the difference between historic annual population data and successive populations one year
Rob J. Hyndman; Heather Booth
The Barcelona Supercomputing Center (BSC) is the National Supercomputer Facility in Spain, hosting MareNostrum, one of the most powerful Supercomputers in Europe. The Earth Sciences Department of BSC operates daily regional dust and air quality forecasts and conducts intensive modelling research for short-term operational prediction. This contribution summarizes the latest developments and current activities in the field of sand and dust storm modelling and forecasting.
Pérez, C.; Baldasano, J. M.; Jiménez-Guerrero, P.; Jorba, O.; Haustein, K.; Cuevas, E.; Basart, S.; Nickovic, S.
Approaches used by linguists to examine the way in which speakers or writers modify their commitment to the propositional content of their utterances are discussed, and it is noted that a frequent criticism is the failure of inexperienced speakers or writers to modulate their utterances properly. This paper considers economic reports and in…
Makaya, Pindi; Bloor, Thomas
Atmospheric monthly forecast is intermediate between medium range forecast, an initial value problem, and seasonal forecast, a boundary value problem. The influence of sea surface temperature (SST) on the atmospheric dynamics in the time range of 10-40 days is still not well understood. As a consequence, there is no common approach for the representation of the SST in a monthly prediction system. At ISAC-CNR in Bologna a monthly ensemble forecasting system is run experimentally once a month, based on the GLOBO model. GLOBO is an atmospheric general circulation model developed at ISAC. The evolution of SST is represented by a simple slab ocean model based on surface flux balance with a relaxation term to climatological SST. Recently, a new definition of the slab ocean model which includes a flux correction term has been implemented to improve the SST simulation. It has been tested in parallel with the operational forecast for some months of 2011. The results show that the globally averaged root mean square forecast error of the SST simulated with the new model is slightly larger than the operational one. However, the ensemble spread of the SST predicted with the new model increases significantly and becomes very similar to the observed SST variability, in particular in the Northern Hemisphere. The atmospheric field differences between the new and operational forecasts show that SST has an impact in the second part of the month, especially in the Southern Hemisphere. The ensemble spread of atmospheric parameters shows a slight increase using the new slab model. However, its impact on the anomaly forecast fields is small.
Rendina, C.; Buzzi, A.; Malguzzi, P.; Mastrangelo, D.; Drofa, O.
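The slab ocean described above balances the net surface heat flux against a relaxation toward climatological SST. A minimal sketch of one such update step follows; the mixed-layer depth, relaxation time scale, and constants are illustrative assumptions, not values from the GLOBO system.

```python
# Minimal slab-ocean SST update: surface flux heating plus relaxation
# toward a climatological SST (illustrative sketch, not GLOBO code).

RHO_W = 1025.0  # seawater density, kg/m^3
CP_W = 3990.0   # seawater specific heat, J/(kg K)

def step_sst(sst, net_flux, sst_clim, depth=50.0, tau_days=30.0, dt=86400.0):
    """Advance SST by one Euler step of length dt (seconds).

    net_flux : net surface heat flux into the ocean, W/m^2
    depth    : assumed mixed-layer depth, m
    tau_days : relaxation time scale toward climatology, days
    """
    heating = net_flux / (RHO_W * CP_W * depth)           # K/s
    relaxation = (sst_clim - sst) / (tau_days * 86400.0)  # K/s
    return sst + dt * (heating + relaxation)
```

A flux-correction term, as implemented in the new slab model, would simply add a prescribed correction flux to `net_flux`.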
One of the main goals in volcanology is to forecast volcanic eruptions. A trenchant forecast should be made before the onset of a volcanic eruption, using the data available at that time, with the aim of mitigating the volcanic risk associated with the volcanic event. In other words, models implemented for forecasting purposes have to take into account the possibility
Luigi Passarelli; Bruno Sanso; Sandri Laura; Warner Marzocchi
As a typical nonlinear and dynamic system, the crude oil price movement is difficult to predict, and its accurate forecasting remains the subject of intense research activity. Recent empirical evidence suggests that the multiscale data characteristics of the price movement are another important stylized fact. The incorporation of a mixture of data characteristics in the time-scale domain during the modelling process can lead to significant performance improvement. This paper proposes a novel morphological component analysis based hybrid methodology for modeling the multiscale heterogeneous characteristics of the price movement in the crude oil markets. Empirical studies in two representative benchmark crude oil markets reveal the existence of a multiscale heterogeneous microdata structure. The significant performance improvement of the proposed algorithm incorporating the heterogeneous data characteristics, against benchmark random walk, ARMA, and SVR models, is also attributed to the innovative methodology proposed to incorporate this important stylized fact during the modelling process. Meanwhile, the work in this paper offers additional insights into the heterogeneous market microstructure, with economically viable interpretations.
Zhu, Qing; Zou, Yingchao; Lai, Kin Keung
Seasonal climates dictate the livelihoods of farmers in developing countries. While farmers in developed countries often have seasonal forecasts on which to base their cropping decisions, developing world farmers usually make plans for the season without such information. Climate change increases the seasonal uncertainty, making things more difficult for farmers. Providing seasonal forecasts to these farmers is seen as a way to help buffer these typically marginal groups from the effects of climate change, though how to do so and the efficacy of such an effort is still uncertain. In Sri Lanka, an effort is underway to provide such forecasts to farmers. The accuracy of these forecasts is likely to have large impacts on how farmers accept and respond to the information they receive. We present an agent-based model to explore how the accuracy of seasonal rainfall forecasts affects the growing decisions and behavior of farmers in Sri Lanka. Using a decision function based on prospect theory, this model simulates farmers' behavior in the face of a wet, dry, or normal forecast. Farmers can either choose to grow paddy rice or plant a cash crop. Prospect theory is used to evaluate outcomes of the growing season; the farmer's memory of the level of success under a certain set of conditions affects next season's decision. Results from this study have implications for policy makers and seasonal forecasters.
Jacobi, J. H.; Nay, J.; Gilligan, J. M.
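The decision function above is based on prospect theory, in which losses loom larger than equal-sized gains relative to a reference point. As a hedged illustration, the standard Tversky-Kahneman value function can be sketched as follows; the parameter values are the classic published estimates, not necessarily those used in this agent-based model.

```python
def prospect_value(outcome, reference=0.0, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value of an outcome relative to a reference point.

    Gains are valued as x**alpha; losses are weighted more heavily
    (loss-aversion coefficient lam) as -lam * (-x)**beta.
    """
    x = outcome - reference
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta
```

With these parameters, a loss of a given size is felt more strongly than a gain of the same size, which is what drives memory-based switching between paddy rice and cash crops in such models.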
The THREDDS Data Server (TDS) is a web server that provides metadata and data access for scientific datasets. It provides OPeNDAP, WCS, HTTP, and netCDF subsetting services for a number of data formats, including netCDF, HDF5, GRIB, and BUFR. The TDS is 100% Java and runs within the Tomcat web server. We have added a new way to serve model data, which takes a collection of Forecast Model Run datasets and constructs a single dataset with a 2D time coordinate (run time, forecast time). In the case of Unidata's server, these are collections of GRIB files, and we deal correctly with missing data records by using the forecast and run dates rather than array indices. The TDS also creates various other "synthetic" datasets from the collection: 1) all data from one analysis run; 2) data with the same forecast offset hour (e.g., all 3-hour forecasts from different runs); 3) data with a constant forecast date (e.g., all data with a forecast/valid time of 2006-08-08T12:00:00Z, from different runs); and 4) the "best" time series, taking the data from the most recent run available. We are currently working with a number of data partners to test and extend this functionality.
This documentation presents an input-output model which has been modified to include the environmental impact of economic operation. In lieu of market prices for the environmental factors, trade-offs with regional income and employment are estimated for use in regional planning. The program is written in FORTRAN IV with single precision for the…
Blaylock, James E.; Jones, Lonnie L.
This paper applies economic theory to an analysis of behavior in the public sector. The model focuses on the division of interest between the public and its political representatives. The division of interest arises because the public officeholder is assumed to act to advance his own interests, and these interests do not coincide automatically with those of his constituents. The
Robert J. Barro
We study the optimal consumption problem in the one-sector model of economic growth under uncertainty. We show the existence of a classical solution of the Hamilton-Jacobi-Bellman equation associated with the stochastic optimization problem, and then give an optimal consumption policy in terms of its solution.
The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting which relies on a coupled snow model (i.e. SNOW17) and rainfall-runoff model (i.e. SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system from input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameters sets (i.e. RFC, DREAM (Differential Evolution Adaptive Metropolis)) as well as a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions to observe the influence of variability in initial conditions on the forecast. The study basin is the North Fork America River Basin (NFARB) located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (i.e. Nash Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (i.e. reliability diagram, discrimination diagram, containing ratio, and Quantile plots) statistics. Our presentation includes comparison of the performance of different optimized parameters and the DA framework as well as assessment of the impact associated with the initial conditions used for streamflow forecasts for the NFARB.
Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.
Two kinematic solar wind models were executed to generate five-day forecasts for each day that a daily magnetogram was available in the odd-numbered years of Solar Cycle 23. This yielded over 1500 forecasts from the Wang-Sheeley-Arge (WSA) and Hakamada-Akasofu-Fry version 2 (HAFv2) models, which are run daily at the NOAA Space Weather Prediction Center and the Air Force Weather Agency, respectively. An extensive evaluation of the models' performance allows an assessment of their value in space weather prediction over representative portions of a complete solar cycle. This was done by comparing model outputs at the L1 point near Earth with in-situ measurements made by solar wind and magnetic field sensors aboard the Advanced Composition Explorer (ACE) and Wind satellites. Comparative forecast-observation difference statistics were computed for the two forecast parameters available from the WSA model: solar wind radial speed and interplanetary magnetic field (IMF) polarity (positive or negative). Statistics were formulated separately by forecast day for each of the study years in order to determine their variance with forecast duration and phase of the solar cycle. The results indicated both similarities and differences between the two models. For example, both exhibit a slowing of the solar wind with increasing forecast duration, and both improve prediction of IMF polarity with increasing solar activity. But WSA shows a reduction in the standard deviation of the forecast-observation difference that depends on study year, while HAFv2 appears to reflect the reduction regardless of the phase of the solar cycle. A number of statistics will be shown that point out the relative strengths and weaknesses of the two models.
Norquist, Donald C.; Meeks, W.
In this paper, taking 5-min high-frequency data of the Shanghai Composite Index as an example, we compare the forecasting performance of HAR-RV, multifractal volatility, realized volatility, realized bipower variation, and their corresponding short-memory models, using a rolling-window forecasting method and the Model Confidence Set (MCS) test, which has been shown to be superior to the SPA test. The empirical results show that, for six loss functions, HAR-RV outperforms the other models. Moreover, to make the conclusions more precise and robust, we use the MCS test to compare the performance of the logarithmic forms of these models, and find that HAR-log(RV) performs better in predicting future volatility. Furthermore, by comparing the HAR-RV and HAR-log(RV) models, we conclude that, in terms of forecasting performance, HAR-log(RV) is the best model among those discussed in this paper.
Ma, Feng; Wei, Yu; Huang, Dengshi; Chen, Yixiang
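The HAR-RV model regresses next-period realized volatility on daily, weekly (5-day), and monthly (22-day) averages of past realized volatility. A minimal ordinary-least-squares sketch of this regression follows; the window lengths are the standard HAR specification, and this is not the authors' code.

```python
import numpy as np

def har_features(rv):
    """Build HAR-RV regressors: daily value, weekly (5-day) and monthly
    (22-day) averages of realized volatility, aligned to predict rv[t+1]."""
    rv = np.asarray(rv, dtype=float)
    rows, target = [], []
    for t in range(21, len(rv) - 1):        # need a full 22-day history
        daily = rv[t]
        weekly = rv[t - 4:t + 1].mean()
        monthly = rv[t - 21:t + 1].mean()
        rows.append([1.0, daily, weekly, monthly])
        target.append(rv[t + 1])
    return np.array(rows), np.array(target)

def fit_har(rv):
    """Estimate HAR-RV coefficients by ordinary least squares."""
    X, y = har_features(rv)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, b_daily, b_weekly, b_monthly]
```

The HAR-log(RV) variant preferred in the paper would apply the same regression to `np.log(rv)`.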
Electrical energy in Brazil depends essentially on streamflow, as hydropower accounts for up to 79% of the total installed electrical capacity. Streamflow forecasts are therefore very important tools for the planning and operation of Brazilian hydroelectric reservoirs. This study evaluated the performance of daily streamflow forecasts from a distributed hydrological model, the Soil and Water Assessment Tool (SWAT), for four reservoirs in the Alto do Rio Doce watershed in southeastern Brazil. The SWAT model was driven with precipitation forecasts from the regional meteorological model MM5. The calibration and validation of SWAT used data from four monitoring stations. The model was run for the 2010-2012 period; the April 2010 to September 2011 period was used for manual calibration, and the remainder for validation. The manual calibration was based on sensitivity tests of the parameters that control surface runoff and groundwater flow, especially surlag and alpha_bf, respectively the surface-runoff lag coefficient and the baseflow recession constant. Daily and monthly Nash-Sutcliffe efficiency, R2, and mean relative error were used to assess the performance of the model. Results showed that the streamflow forecasts were very similar to observations, except in reservoirs with smaller drainage areas, where the model did not simulate the beginning of the flood season (December-February). The streamflow forecasts were strongly dependent on the quality of the precipitation forecasts used.
Given that no correction was applied to the rainfall simulated by the MM5 model over the Alto do Rio Doce watershed, and that no automated calibration method was applied to the parameters of the hydrologic model, we can conclude that driving the SWAT hydrologic model with output from the MM5 atmospheric model for streamflow forecasting is a tool of great potential for the real-time operation of reservoirs.
Silva, J. M.; Saad, S. I.; Palma, G.; Rocha, H.; Palmeira, R. M.; Silva, B. L.; Pessoa, A. A.; Ramos, C. G.; Cecchini, M. A.
This paper proposes a procedure called forecast weight averaging, a specific combination of the forecast weights obtained from different weight-construction methods, aimed at improving the accuracy of pseudo out-of-sample forecasting. It is found that, under certain specified conditions, forecast weight averaging can lower the mean squared forecast error obtained from model averaging. In addition, we show that in a linear and homoskedastic environment, this superior predictive ability of forecast weight averaging holds irrespective of whether the coefficients are tested by the t statistic or the z statistic, provided the significance level is within the 10% range. By theoretical proofs and a simulation study, we show that model averaging methods such as variance model averaging, simple model averaging, and standard error model averaging each produce a mean squared forecast error larger than that of forecast weight averaging. Finally, this result also holds, marginally, when applied to business and economic empirical data sets: the GDP growth rate, the Consumer Price Index (CPI), and the Average Lending Rate (ALR) of Malaysia.
Yin, Yip Chee; Kok-Haur, Ng; Hock-Eam, Lim
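At its core, a forecast combination weights the individual model forecasts, and forecast weight averaging averages the weight vectors produced by different weighting schemes before combining. A toy sketch of that idea follows, assuming each candidate weight vector is simply normalized and then averaged; the authors' actual procedure and conditions are more elaborate.

```python
import numpy as np

def combine_forecasts(forecasts, weights):
    """Weighted combination of model forecasts (one row per model)."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                       # normalize weights to sum to 1
    return w @ np.asarray(forecasts, dtype=float)

def forecast_weight_averaging(forecasts, weight_sets):
    """Average several candidate weight vectors (e.g. produced by
    different weighting schemes), then combine the model forecasts
    with the averaged weights. Illustrative sketch only."""
    avg_w = np.mean(
        [np.asarray(w, dtype=float) / np.sum(w) for w in weight_sets],
        axis=0,
    )
    return combine_forecasts(forecasts, avg_w)
```

For two models and two weighting schemes that each put all mass on a different model, the averaged weights reduce to an equal-weighted (simple model averaging) combination.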
This paper presents a predictability study of the Madden-Julian Oscillation (MJO) that relies on combining empirical model reduction (EMR) with the "past-noise forecasting" (PNF) method. EMR is a data-driven methodology for constructing stochastic low-dimensional models that account for nonlinearity, seasonality and serial correlation in the estimated noise, while PNF constructs an ensemble of forecasts that accounts for interactions between (i) high-frequency variability (noise), estimated here by EMR, and (ii) the low-frequency mode of MJO, as captured by singular spectrum analysis (SSA). A key result is that—compared to an EMR ensemble driven by generic white noise—PNF is able to considerably improve prediction of MJO phase. When forecasts are initiated from weak MJO conditions, the useful skill is of up to 30 days. PNF also significantly improves MJO prediction skill for forecasts that start over the Indian Ocean.
Kondrashov, D.; Chekroun, M. D.; Robertson, A. W.; Ghil, M.
A series of gas hydrate development scenarios were created to assess the range of outcomes predicted for the possible development of the "Eileen" gas hydrate accumulation, North Slope, Alaska. Production forecasts for the "reference case" were built using the 2002 Mallik production tests, mechanistic simulation, and geologic studies conducted by the US Geological Survey. Three additional scenarios were considered: a "downside" scenario, in which no viable production is identified; an "upside" scenario, in which results are better than expected; and, to capture the full range of possible outcomes and balance the downside case, an "extreme upside" scenario in which each well is exceptionally productive. Starting with a representative type-well simulation forecast, field development timing is applied, and the sum of the individual well forecasts creates the field-wide production forecast. This technique is commonly used to schedule large-scale resource plays where drilling schedules are complex and production forecasts must account for many changing parameters. The complementary forecasts of rig count, capital investment, and cash flow can be used in a pre-appraisal assessment of potential commercial viability. Since no significant gas sales are currently possible on the North Slope of Alaska, typical parameters were used to create downside, reference, and upside case forecasts, which predict that from 0 to 71 BM3 (2.5 tcf) of gas may be produced in 20 years, with nearly 283 BM3 (10 tcf) ultimate recovery after 100 years. Outlining a range of possible outcomes enables decision makers to visualize the pace and milestones that will be required to evaluate gas hydrate resource development in the Eileen accumulation. Critical values of peak production rate, time to meaningful production volumes, and the investment required to rule out a downside case are provided. Upside cases identify the potential if both depressurization and thermal stimulation yield positive results.
An "extreme upside" case captures the full potential of unconstrained development with widely spaced wells. The results of this study indicate that recoverable gas hydrate resources may exist in the Eileen accumulation and that it represents a good opportunity for continued research. © 2010 Elsevier Ltd.
Wilson, S. J.; Hunter, R. B.; Collett, T. S.; Hancock, S.; Boswell, R.; Anderson, B. J.
The University of California, San Diego 3D Heliospheric Tomography Model reconstructs the evolution of heliospheric structures and can make forecasts of solar wind density and velocity up to 72 hours into the future. The latest model version, installed and running in real time at the Community Coordinated Modeling Center (CCMC), analyzes scintillations of meter-wavelength radio point sources recorded by the Solar-Terrestrial Environment Laboratory (STELab), together with real-time measurements of solar wind speed and density recorded by the Advanced Composition Explorer (ACE) Solar Wind Electron Proton Alpha Monitor (SWEPAM). The solution is reconstructed using tomographic techniques and a simple kinematic wind model. Since installation, the CCMC has been recording the model forecasts and comparing them with ACE measurements and with forecasts made using other heliospheric models hosted by the CCMC. We report the preliminary results of this validation work and the comparison with alternative models.
MacNeice, Peter; Taktakishvili, Alexandra; Jackson, Bernard; Clover, John; Bisi, Mario; Odstrcil, Dusan
A hierarchical Bayesian model is presented for one-season-ahead forecasts of summer rainfall and streamflow using exogenous climate variables for east central China. The model provides estimates of the posterior forecast probability distribution for 12 rainfall and 2 streamflow stations, considering parameter uncertainty and cross-site correlation. The model has a multi-level structure, with regression coefficients modeled from a common multivariate normal distribution, resulting in partial pooling of information across multiple stations and better representation of parameter and posterior distribution uncertainty. The covariance structure of the residuals across stations is explicitly modeled. Model performance is tested under leave-10-out cross-validation. The frequentist and Bayesian performance metrics used include the receiver operating characteristic, reduction of error, coefficient of efficiency, rank probability skill scores, and coverage by posterior credible intervals. The ability of the model to reliably forecast season-ahead regional summer rainfall and streamflow offers potential for developing adaptive water risk management strategies.
Chen, X.; Hao, Z.; Devineni, N.; Lall, U.
Short-term streamflow forecasting is important in water resources management, especially for operational flow control and risk management. Besides deterministic rainfall-runoff and flow-routing models, stochastic time series models are also in operational use for this purpose. The fitting of such stochastic models is preceded, when suitable, by removing the systematic components of the time series (such as trends and seasonality). Usually the interest of practitioners lies in fitting the stochastic part of the time series model, and removing the systematic components is considered a routine task. However, each deseasonalization method has an effect on the time series analyzed, affecting its autocorrelation structure and thus influencing the subsequent model choice and the fitted model parameters. When choosing an appropriate stochastic model, practitioners often neglect the presence of long-range dependence when considering short-term forecasting. This, however, might affect the forecasts even over a short-term horizon. Autoregressive fractionally integrated moving average (ARFIMA) models are often used in hydrology for modelling time series displaying long-range dependence. In hydrology, wavelets are mostly applied for feature extraction and process description rather than modelling and forecasting. In this work we attempted to improve the deseasonalization step of the modelling process by using wavelet analysis. We proposed to combine an ARFIMA model with a wavelet transform used for deseasonalization. The quality of the model is tested on one- to ten-day-ahead forecasts of mean daily runoff of the Danube River measured at Kienstock in Lower Austria. A comparison with two other models, an ARFIMA model combined with moving-average deseasonalization and a linear wavelet-based model, was performed.
The results of the model comparison showed that the use of wavelets provides a suitable alternative to moving-average deseasonalization. For one- and two-day forecasting horizons the new approach did not improve forecasting performance over the other tested models. However, for longer forecasting horizons, the wavelet deseasonalization-ARFIMA combination outperforms the other two models, thus offering an improvement over the usual moving-average deseasonalization. Since none of the three models was able to remove the autocorrelation from the squared residuals, which usually indicates heteroscedasticity in the time series, the concept of wavelet deseasonalization may be explored further in combination with another suitable model, such as a fractionally integrated generalized autoregressive conditional heteroscedasticity (FIGARCH) model.
Szolgayová, Elena; Arlt, Josef; Blöschl, Günter; Szolgay, Ján
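The moving-average deseasonalization used as the baseline above subtracts a smoothed day-of-year climatology from the daily series before the stochastic model is fitted. A minimal sketch of that step follows; the period and smoothing window are illustrative choices, not those of the study.

```python
import numpy as np

def deseasonalize(series, period=365, window=15):
    """Remove a seasonal mean from a daily series.

    A day-of-year climatology is estimated by averaging all values with
    the same phase, then smoothed with a centered circular moving
    average. Returns (residuals, smoothed climatology)."""
    x = np.asarray(series, dtype=float)
    phases = np.arange(len(x)) % period
    clim = np.array([x[phases == p].mean() for p in range(period)])
    # circular moving-average smoothing of the climatology (odd window)
    kernel = np.ones(window) / window
    padded = np.concatenate([clim[-window:], clim, clim[:window]])
    smooth = np.convolve(padded, kernel, mode="same")[window:window + period]
    return x - smooth[phases], smooth
```

The residual series would then be handed to the stochastic model (e.g. ARFIMA) for fitting and forecasting; the wavelet approach replaces this climatology step with a wavelet-based decomposition.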
Work for this project is towards improving the stream flow forecasts for the NOAA River Forecast Centers (RFC) throughout the U.S. using multi-model capability primarily from the NASA Land Information System and remote sensing data provided by AMSR-E for soil moisture. The RFCs address a range of issues, including peak and low flow predictions as well as river floods and
David Toll; Bailing Li; Zhan Xiwu; Cosgrove Brian
Global change affects alpine ecosystems by, among many effects, altering plant distributions and community composition. However, forecasting alpine vegetation change is challenged by a scarcity of studies observing change in fixed plots spanning decadal time scales. We present in this article a probabilistic modeling approach that forecasts vegetation change on Niwot Ridge, CO using plant abundance data collected from marked
David R. Johnson; Diane Ebert-May; Patrick J. Webber; Craig E. Tweedie
A worldwide forecast of the erythemally effective ultraviolet (UV) radiation is presented. The forecast was established to inform the public about the expected amount of erythemally effective UV radiation for the next day. Besides the irradiance, the daily dose is forecast to enable people to choose the appropriate sun protection tools. Following the UV Index as the measure of global erythemally effective irradiance, the daily dose is expressed in units of UV Index hours. In this study, we have validated the model and the forecast against measurements from broadband UV radiometers of the Robertson-Berger type. The measurements were made on four continents, ranging from the Arctic Circle (67.4 degrees N) to the Antarctic coast (61.1 degrees S). As an additional quality criterion, the frequency of underestimation was taken into account, because the forecast is a radiation protection tool made to avoid overexposure. A value within one minimal erythemal dose (for the most sensitive skin type, type 1) of the observed value was counted as a hit, and greater deviations as underestimation or overestimation. The Austrian forecast model underestimates the daily dose in 3.7% of all cases, of which 1.7% results from the model and 2.0% from the assumed total ozone content. The hit rate was on the order of 40%. PMID:15453822
Schmalwieser, Alois W; Schauberger, Günther; Janouch, Michal; Nunez, Manuel; Koskela, Tapani; Berger, Daniel; Karamanian, Gabriel
Prediction models contain tunable parameters that appear in the parametrization schemes of sub-grid-scale physical processes. This is the case in both conventional and stochastic parametrization schemes. Model development involves setting these parameters to their optimal values. The key question in this specification is the target, or metric, against which model predictive skill is assessed: it is rather easy to tune a model with respect to some selected properties at the expense of performance in other respects. We propose here a possible solution to this general model-tuning problem: a model could be selected based on the growth of the energy norm of the forecast error. We applied the ECHAM5 atmospheric general circulation model at low resolution and tuned four of its cloud and precipitation formation parameters such that the optimized model outperforms the default model in terms of the total energy norm of the forecast error at early medium-range forecasts. The parameter estimation method is based on importance sampling and is targeted at ensemble prediction systems. Interestingly, the optimized model continues to outperform the default model up to the 10-day range, and the improvement is nicely distributed across nearly all model variables. We conclude that the energy norm of the forecast error constitutes a very natural target in model selection; since it is an integral over the entire model domain, it is not selective toward particular model variables, areas, or layers.
Järvinen, H.; Ollinaho, P.; Laine, M.; Solonen, A.; Haario, H.
The accurate prediction of the solar wind velocity has been a major challenge in the space weather community. Previous studies have proposed many empirical and semi-empirical models to forecast the solar wind velocity based on either historical observations, e.g. the persistence model, or instantaneous observations of the sun, e.g. the Wang-Sheeley-Arge model. In this study, we use one-minute WIND data from January 1995 to August 2012 to investigate and compare the performances of four models often used in the literature, here referred to as the null model, the persistence model, the one-solar-rotation-ago model, and the Wang-Sheeley-Arge model. It is found that, measured by root mean square error, the persistence model gives the most accurate predictions within two days. Beyond two days, the Wang-Sheeley-Arge model serves as the best model, though it only slightly outperforms the null model and the one-solar-rotation-ago model. Finally, we apply least-squares regression to linearly combine the null model, the persistence model, and the one-solar-rotation-ago model into a 'general persistence model'. By comparing its performance against the four aforementioned models, it is found that the general persistence model outperforms the other four models within five days. Due to its great simplicity and superb performance, we believe that the general persistence model can serve as a benchmark in the forecasting of solar wind velocity and has the potential to be modified into better models.
Gao, Y.; Ridley, A. J.
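The 'general persistence model' above is a least-squares linear combination of three simple predictors: the climatological mean (null), the value a fixed lag ago (persistence), and the value one solar rotation ago. A sketch of the fitting step follows; the 1-min cadence and 27-day rotation length are assumptions for illustration.

```python
import numpy as np

def fit_general_persistence(v, lag, rotation=27 * 24 * 60):
    """Fit weights for w0*null + w1*persistence + w2*rotation-ago by
    least squares against observed solar wind speed.

    v        : 1-D array of observed speeds (e.g. 1-min cadence)
    lag      : forecast horizon in samples
    rotation : samples per solar rotation (~27 days at 1-min cadence)
    """
    v = np.asarray(v, dtype=float)
    start = max(lag, rotation)
    target = v[start:]
    null = np.full_like(target, v.mean())        # climatological mean
    persist = v[start - lag:len(v) - lag]        # value `lag` samples ago
    rot_ago = v[start - rotation:len(v) - rotation]
    X = np.column_stack([null, persist, rot_ago])
    w, *_ = np.linalg.lstsq(X, target, rcond=None)
    return w
```

Forecasting then amounts to applying the fitted weights to the same three predictors evaluated at the forecast time.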
Different combination methods based on multiple linear regression are explored to identify the conditions that lead to an improvement of seasonal forecast quality when individual operational dynamical systems and a statistical-empirical system are combined. A calibration of the post-processed output is included. The combination methods have been used to merge the ECMWF System 4, the NCEP CFSv2, the Météo-France System 3, and a simple statistical model based on SST lagged regression. The forecast quality was assessed from a deterministic and probabilistic point of view. SSTs averaged over three different tropical regions have been considered: the Niño3.4, the Subtropical Northern Atlantic and Western Tropical Indian SST indices. The forecast quality of these combinations is compared to the forecast quality of a simple multi-model (SMM) where all single models are equally weighted. The results show a large range of behaviours depending on the start date, target month and the index considered. Outperforming the SMM predictions is a difficult task for linear combination methods with the samples currently available in an operational context. The difficulty in the robust estimation of the weights due to the small samples available is one of the reasons that limit the potential benefit of the combination methods that assign unequal weights. However, these combination methods showed the capability to improve the forecast reliability and accuracy in a large proportion of cases. For example, the Forecast Assimilation method proved to be competitive against the SMM while the other combination methods outperformed the SMM when only a small number of forecast systems have skill. Therefore, the weighting does not outperform the SMM when the SMM is very skilful, but it reduces the risk of low skill situations that are found when several single forecast systems have a low skill.
Rodrigues, Luis Ricardo Lage; Doblas-Reyes, Francisco Javier; Coelho, Caio Augusto dos Santos
NASA prefers to land the space shuttle at Kennedy Space Center (KSC). When weather conditions violate Flight Rules at KSC, NASA will usually divert the shuttle landing to Edwards Air Force Base (EAFB) in Southern California. But forecasting surface winds at EAFB is a challenge for the Spaceflight Meteorology Group (SMG) forecasters due to the complex terrain that surrounds EAFB. One particular phenomenon identified by SMG that makes it difficult to forecast the EAFB surface winds is called "wind cycling". This occurs when wind speeds and directions oscillate among towers near the EAFB runway, leading to a challenging deorbit burn forecast for shuttle landings. The large-scale numerical weather prediction models cannot properly resolve the wind field due to their coarse horizontal resolutions, so a properly tuned high-resolution mesoscale model is needed. The Weather Research and Forecasting (WRF) model meets this requirement. The AMU assessed the different WRF model options to determine which configuration best predicted surface wind speed and direction at EAFB. To do so, the AMU compared the WRF model performance using two hot start initializations with the Advanced Research WRF and Non-hydrostatic Mesoscale Model dynamical cores and compared model performance while varying the physics options.
Watson, Leela R.; Bauman, William H., III
The purpose of this article is to compare the accuracy of forecasts for natural gas prices as reported by the Energy Information Administration's Short-Term Energy Outlook (STEO) and the futures market for the period from 1998 to 2003. The analysis tabulates the existing data and develops a statistical comparison of the error between STEO and U.S. wellhead natural gas prices and between Henry Hub and U.S. wellhead spot prices. The results indicate that, on average, Henry Hub is a better predictor of natural gas prices (average error 0.23, standard deviation 1.22) than STEO (average error -0.52, standard deviation 1.36). This analysis suggests that as the futures market continues to report longer forward prices (currently out to five years), it may be of interest to economic modelers to compare the accuracy of their models to the futures market. The authors would especially like to thank Doug Hale of the Energy Information Administration for supporting and reviewing this work.
Wong-Parodi, Gabrielle; Dale, Larry; Lekov, Alex
Conflicts between the goals of having clean air and economic development are widespread. This paper discusses the conceptual and mathematical development of a linear programming optimization model and an iterative solution procedure to determine optimal economic development strategies to promote employment subject to various contexts which limit air pollution carrying capacity. Three cases are formulated: (1) maximizing employment subject to ambient concentration constraints, (2) maximizing employment subject to emissions constraints, and (3) minimizing emissions subject to employment constraints. Empirical relationships using Census and pollutant inventory data describe a conceptual urban system, so that indirect and induced impacts of development strategies are also included. The modeling incorporates both point and nonpoint sources, and is shown to be adaptable for nonreactive emissions.
Muschett, F. Douglas
The use of numerical weather forecast model data as a source of data for soil moisture modelling was tested. Results show that the potential evaporation calculated using the Penman-Monteith equation can be estimated accurately using data obtained from the output of a high resolution numerical atmospheric model (HIRLAM, High Resolution Limited Area Model). The mean bias error was 0.26 mm for a 36-hour sum and the root mean square error was 2.14 mm. The evaporation obtained directly from HIRLAM was systematically smaller because this direct model output represents the real evaporation rather than the potential evaporation. The precipitation forecasts were less accurate. When the accuracy of the parameters required for the calculation of potential evaporation was studied for one station, no serious bias was found. When two different irrigation models (AMBAV and SWAP) were run over one summer using either measured or HIRLAM data as the input, the results given by the models were quite similar regardless of input data source. The largest differences between the model outputs were caused by the formulation of crop and soil characteristics in the irrigation models.
Venäläinen, A.; Salo, T.; Fortelius, C.
An extension of the classical linear Model Output Statistics (MOS) technique is proposed allowing for the post-processing of ensemble forecasts. In this new approach the cost-function on which the least square parameter estimation is based takes into account the presence of errors in both observations and model observables, unlike the classical linear MOS cost-function whose implicit assumption is the absence of errors in the model observables. It allows for the maintenance of an appropriate variability for the corrected forecasts even for long lead times and for providing a framework in which both deterministic and probabilistic forecasts can be corrected. The scheme is successfully tested for ensemble correction in the context of an idealized low-order chaotic system, the Lorenz atmospheric model, in the presence of model errors, and compared with more classical techniques like the Non-homogeneous Gaussian Regression (NGR) method. The potential use of this approach is also briefly discussed.
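The contrast between the classical MOS cost-function and one admitting errors in the model observables can be illustrated with a simple errors-in-variables fit (Deming regression with equal error variances). This is a sketch of the underlying idea on synthetic data, not the authors' ensemble scheme or their Lorenz-model experiment.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setup: a true signal observed with error (y) and
# predicted by a model with its own error (x).  Synthetic data only.
truth = rng.standard_normal(2000)
x = truth + 0.6 * rng.standard_normal(2000)   # model observable + model error
y = truth + 0.6 * rng.standard_normal(2000)   # observation + observation error

Sxx = np.var(x)
Syy = np.var(y)
Sxy = float(np.cov(x, y)[0, 1])

# Classical MOS: ordinary least squares, which implicitly assumes x is
# error-free and therefore damps the slope (and the corrected variance).
b_ols = Sxy / Sxx

# Errors-in-variables fit (Deming regression, equal error variances):
# accounts for errors in both x and y, keeping the slope near 1 and
# preserving an appropriate variability in the corrected forecasts.
b_eiv = (Syy - Sxx + np.sqrt((Syy - Sxx) ** 2 + 4 * Sxy ** 2)) / (2 * Sxy)

print(b_ols, b_eiv)
```

The damped OLS slope is what makes classically corrected forecasts lose variance at long lead times; the errors-in-variables slope does not suffer from that attenuation.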
A model to assess the value of improved information regarding the inventories, productions, exports, and imports of crop on a worldwide basis is discussed. A previously proposed model is interpreted in a stochastic control setting and the underlying assumptions of the model are revealed. In solving the stochastic optimization problem, the Markov programming approach is much more powerful and exact as compared to the dynamic programming-simulation approach of the original model. The convergence of a dual variable Markov programming algorithm is shown to be fast and efficient. A computer program for the general model of multicountry-multiperiod is developed. As an example, the case of one country-two periods is treated and the results are presented in detail. A comparison with the original model results reveals certain interesting aspects of the algorithms and the dependence of the value of information on the incremental cost function.
Mehra, R. K.; Rouhani, R.; Jones, S.; Schick, I.
This paper presents a short-term monthly forecasting model of West Texas Intermediate crude oil spot price using Organization for Economic Cooperation and Development (OECD) petroleum inventory levels.
Many successful technology forecasting models have been developed but few researchers have explored a model that can best predict short product lifecycles. This research studies the forecast accuracy of long and short product lifecycle datasets using simple logistic, Gompertz, and the time-varying extended logistic models. The performance of the models was evaluated using the mean absolute deviation and the root
Charles V. Trappey; Hsin-ying Wu
In this paper, we propose a general probabilistic model for modeling the evolution of demand forecasts, referred to as the Martingale Model of Forecast Evolution (MMFE). We combine the MMFE with a linear programming model of production and distribution planning implemented in a rolling horizon fashion. The resulting simulation methodology is used to analyze safety stock levels for a multi-product/multi-plant
DAVID C. HEATH; PETER L. JACKSON
Two univariate time-series analysis methods have been used to model and forecast the monthly patient volume at the family and community medicine primary health care clinic of King Faisal University, Al-Khobar, Saudi Arabia. Models were based on nine years of data and forecasts made for 2 years. The optimum ARIMA model selected is an autoregressive model of the fourth order
R. E. Abdel-Aal; A. M. Mangoud
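A fourth-order autoregressive fit of this kind can be sketched with conditional least squares. The series below is synthetic (a seasonal monthly count plus noise), not the clinic's data, and the coefficients are illustrative rather than the study's fitted ARIMA model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monthly patient-volume-like series with annual seasonality
# (illustrative only -- not the King Faisal University clinic data).
n = 108                                  # nine years of monthly counts
m = np.arange(n)
y = 2000.0 + 150.0 * np.sin(2 * np.pi * m / 12) + 50.0 * rng.standard_normal(n)

p = 4                                    # AR(4), the order selected in the study
# Lagged design matrix for conditional least squares: y[t] on y[t-1..t-4].
X = np.column_stack(
    [np.ones(n - p)] + [y[p - k : n - k] for k in range(1, p + 1)]
)
coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)

# Iterated one-step-ahead forecasts for the next two years.
hist = list(y)
for _ in range(24):
    lags = hist[-1 : -p - 1 : -1]        # y[t-1], ..., y[t-4]
    hist.append(float(coef[0] + np.dot(coef[1:], lags)))
forecast = np.array(hist[n:])
print(forecast[:3])
```

Iterating the one-step model forward, feeding each forecast back in as a lag, is how a fitted AR model produces a two-year horizon from nine years of history.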
Health forecasting can improve health service provision and individual patient outcomes. Environmental factors are known to impact chronic respiratory conditions such as asthma, but little is known about the extent to which these factors can be used for forecasting. Using weather, air quality and hospital asthma admissions in London (2005-2006), two related negative binomial models were developed and compared with a naive seasonal model. In the first approach, predictive forecasting models were fitted with 7-day averages of each potential predictor, and a subsequent multivariable model was constructed. In the second strategy, an exhaustive search was conducted for the best fitting models among possible combinations of lags (0-14 days) of all the environmental effects on asthma admissions. Three models were considered: a base model (seasonal effects), contrasted with a 7-day average model and a selected lags model (weather and air quality effects). Season is the best predictor of asthma admissions. The 7-day average and seasonal models were trivial to implement. The selected lags model was computationally intensive, but of no real value over the much more easily implemented models. Seasonal factors can predict daily hospital asthma admissions in London, and there is little evidence that additional weather and air quality information would add to forecast accuracy.
Soyiri, Ireneous N; Reidpath, Daniel D; Sarran, Christophe
Describes and evaluates a circuitless flow model for forecasting future student attributes and enrollment projections, compared with a simple Markov chain model. Data from a study of secondary students are used to illustrate the analytical methodologies and to contrast results of the Markov chain and circuitless flow models.
Britney, Robert R.
Authors: Dr. Kenneth L. Judd, Hoover Institution, and Prof. William A. Brock, University of Wisconsin. Current climate models range from General Circulation Models (GCM’s) with millions of degrees of freedom to models with few degrees of freedom. Simple Energy Balance Climate Models (EBCM’s) help us understand the dynamics of GCM’s. The same is true in economics with Computable General Equilibrium Models (CGE’s), where some models are infinite-dimensional multidimensional differential equations but some are simple models. Nordhaus (2007, 2010) couples a simple EBCM with a simple economic model. One- and two-dimensional EBCM’s do better at approximating damages across the globe and positive and negative feedbacks from anthropogenic forcing (North et al. (1981), Wu and North (2007)). A proper coupling of climate and economic systems is crucial for arriving at effective policies. Brock and Xepapadeas (2010) have used Fourier/Legendre based expansions to study the shape of socially optimal carbon taxes over time at the planetary level in the face of damages caused by polar ice cap melt (as discussed by Oppenheimer, 2005), but in only a “one dimensional” EBCM. Economists have used orthogonal polynomial expansions to solve dynamic, forward-looking economic models (Judd, 1992, 1998). This presentation will couple EBCM climate models with basic forward-looking economic models, and examine the effectiveness and scaling properties of alternative solution methods. We will use a two-dimensional EBCM model on the sphere (Wu and North, 2007) and a multicountry, multisector regional model of the economic system. Our aim will be to gain insights into the intertemporal shape of the optimal carbon tax schedule, and its impact on global food production, as modeled by Golub and Hertel (2009). We will initially have limited computing resources and will need to focus on highly aggregated models.
However, this will be more complex than existing models with forward-looking economic modules, and the initial models will help guide the construction of more refined models that can effectively use more powerful computational environments to analyze economic policies related to climate change. REFERENCES Brock, W., Xepapadeas, A., 2010, “An Integration of Simple Dynamic Energy Balance Climate Models and Ramsey Growth Models,” Department of Economics, University of Wisconsin, Madison, and University of Athens. Golub, A., Hertel, T., et al., 2009, “The opportunity cost of land use and the global potential for greenhouse gas mitigation in agriculture and forestry,” RESOURCE AND ENERGY ECONOMICS, 31, 299-319. Judd, K., 1992, “Projection methods for solving aggregate growth models,” JOURNAL OF ECONOMIC THEORY, 58: 410-52. Judd, K., 1998, NUMERICAL METHODS IN ECONOMICS, MIT Press, Cambridge, Mass. Nordhaus, W., 2007, A QUESTION OF BALANCE: ECONOMIC MODELS OF CLIMATE CHANGE, Yale University Press, New Haven, CT. North, G. R., Cahalan, R., Coakley, J., 1981, “Energy balance climate models,” REVIEWS OF GEOPHYSICS AND SPACE PHYSICS, Vol. 19, No. 1, 91-121, February. Wu, W., North, G. R., 2007, “Thermal decay modes of a 2-D energy balance climate model,” TELLUS, 59A, 618-626.
Judd, K.; Brock, W. A.
Reserve growth is recognized as a major component of additions to reserves in most oil provinces around the world, particularly in mature provinces. It takes place as a result of the discovery of new pools/reservoirs and extensions of known pools within existing fields, improved knowledge of reservoirs over time leading to a change in estimates of original oil-in-place, and improvement in recovery factor through the application of new technology, such as enhanced oil recovery methods, horizontal/multilateral drilling, and 4D seismic. A reserve growth study was conducted on oil pools in Alberta, Canada, with the following objectives: 1) evaluate historical oil reserve data in order to assess the potential for future reserve growth; 2) develop reserve growth models/functions to help forecast hydrocarbon volumes; 3) study reserve growth sensitivity to various parameters (for example, pool size, porosity, and oil gravity); and 4) compare reserve growth in oil pools and fields in Alberta with those from other large petroleum provinces around the world. The reported known recoverable oil exclusive of Athabasca oil sands in Alberta increased from 4.5 billion barrels of oil (BBO) in 1960 to 17 BBO in 2005. Some of the pools that were included in the existing database were excluded from the present study for lack of adequate data. Therefore, the known recoverable oil increased from 4.2 to 13.9 BBO over the period from 1960 through 2005, with new discoveries contributing 3.7 BBO and reserve growth adding 6 BBO. This reserve growth took place mostly in pools with more than 125,000 barrels of known recoverable oil. Pools with light oil accounted for most of the total known oil volume, therefore reflecting the overall pool growth. Smaller pools, in contrast, shrank in their total recoverable volumes over the years.
Pools with heavy oil (gravity less than 20° API) make up only a small share (3.8 percent) of the total recoverable oil; they showed a 23-fold growth compared to about 3.5-fold growth in pools with medium oil and 2.2-fold growth in pools with light oil over a fifty-year period. The analysis indicates that pools with high porosity reservoirs (greater than 30 percent porosity) grew more than pools with lower porosity reservoirs which could possibly be attributed to permeability differences between the two types. Reserve growth models for Alberta, Canada, show the growth at field level is almost twice as much as at pool level, possibly because the analysis has evaluated fields with two or more pools with different discovery years. Based on the models, the growth in oil volumes in Alberta pools over the next five-year period (2006-2010) is expected to be about 454 million barrels of oil. Over a twenty-five year period, the cumulative reserve growth in Alberta oil pools has been only 2-fold compared to a 4- to- 5-fold increase in other petroleum producing areas such as Saskatchewan, Volga-Ural, U.S. onshore fields, and U.S. Gulf of Mexico. However, the growth at the field level compares well with that of U.S. onshore fields. In other petroleum provinces, the reserves are reported at field levels rather than at pool levels, the latter basically being the equivalent of individual reservoirs. © 2010 by the Canadian Society of Petroleum Geologists.
Verma, M.; Cook, T.
The availability and efficient use of the feed resources in India are the primary drivers to maximize productivity of Indian livestock. Feed security is vital to the livestock management, extent of use, conservation and productivity enhancement. Assessment and forecasting of livestock feed resources are most important for effective planning and policy making. In the present study, 40 years of data on crop production, land use pattern, rainfall, its deviation from normal, area under crop and yield of crop were collected and modeled to forecast the likely production of feed resources for the next 20 years. The higher order auto-regressive (AR) models were used to develop efficient forecasting models. Use of climatic variables (actual rainfall and its deviation from normal) in combination with non-climatic factors like area under each crop, yield of crop, lag period etc., increased the efficiency of forecasting models. From the best fitting models, the current total dry matter (DM) availability in India was estimated to be 510.6 million tonnes (mt) comprising of 47.2 mt from concentrates, 319.6 mt from crop residues and 143.8 mt from greens. The availability of DM from dry fodder, green fodder and concentrates is forecasted at 409.4, 135.6 and 61.2 mt, respectively, for 2030.
Suresh, K. P.; Kiran, G. Ravi; Giridhar, K.; Sampath, K. T.
The presence of fog and low clouds in the lower atmosphere can have a critical impact on both airborne and ground transport and is often connected with serious accidents. Improved forecasts of fog localization, duration and visibility variations therefore hold immense operational value. Fog is generally a small-scale phenomenon and is mostly affected by local advective transport, radiation, turbulent mixing at the surface, as well as its microphysical structure. Sophisticated three-dimensional fog models, based on advanced microphysical parameterization schemes and high vertical resolution, have already been developed and give promising results. Nevertheless, their computational time is beyond the range of an operational setup. Therefore, mesoscale numerical weather prediction models are generally used for forecasting all kinds of weather situations. In spite of numerous improvements, the large uncertainty of small-scale weather events inherent in deterministic prediction cannot be evaluated adequately. Probabilistic guidance is necessary to assess these uncertainties and give reliable forecasts. In this study, fog forecasts are obtained by a diagnosis scheme similar to the Fog Stability Index (FSI) based on COSMO-DE model outputs. COSMO-DE is the German-focused high-resolution operational weather prediction model of the German Meteorological Service. The FSI and the respective fog occurrence probability are optimized and calibrated with statistical postprocessing in terms of logistic regression. In a second step, the predictor number of the FOGCAST model has been optimized by use of the LASSO method (Least Absolute Shrinkage and Selection Operator). The results will present objective out-of-sample verification based on the Brier score, performed for station data over Germany. Furthermore, the probabilistic fog forecast approach, FOGCAST, serves as a benchmark for the evaluation of more sophisticated 3D fog models.
Several versions have been set up based on different numerical weather prediction systems: (1) COSMO-DE operational forecasts (50 vertical layers, dz_min=20m), (2) COSMO-DE forecasts with different vertical grid setups, (3) COSMO-DE forecasts with the fog microphysics of the one-dimensional fog forecast model PAFOG, and (4) COSMO-FOG forecasts with a very high vertical resolution (60 layers, dz_min=4m) and a one-moment fog microphysics based on the PAFOG model. The results will quantify the impact of vertical grid resolution and the importance of detailed cloud microphysics, considering explicitly the cloud droplet distribution and sedimentation processes.
Masbou, M.; Hacker, M.; Bentzien, S.
The importance of technological forecasting is increasing due to the immense effect of technology on organizations. Accurate forecasts will obviously yield more benefit on shorter time scales. Although there are many studies forecasting a single technological aspect of certain products, such as Flight Speed or Bomber Carriage of air fighters, a general model to forecast overall technological changes
Ercan ÖZTEMEL; M. Batuhan AYHAN
This is the final report for a one-year Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The ultimate goal was to develop a new methodology for macroeconomic modeling applied to national environmental and economic problems. A modeling demonstration and briefings were produced, and significant internal technical support and program interest has been generated. External contacts with DOE's Office of Environmental Management (DOE-EM), US State Department, and the US intelligence community were established. As a result of DOE-EM interest and requests for further development, this research has been redirected to national environmental simulations as a new LDRD project.
Drake, R.H.; Hardie, R.W.; Loose, V.W.; Booth, S.R.
Forecasting of energy demand in emerging markets is one of the most important policy tools used by decision makers all over the world. In Turkey, most of the early studies employed various forms of econometric modeling. However, since the estimated economic and demographic parameters usually deviate from the realizations, time-series forecasting appears to give better results. In this
Volkan ?. Ediger; Sertaç Akar
Electric load forecasting has received an increasing attention over the years by academic and industrial researchers and practitioners due to its major role for the effective and economic operation of power utilities. The aim of this paper is to provide a collective unified survey study on the application of computational intelligence (CI) model-free techniques to the short-term load forecasting of
Spyros G. Tzafestas; Elpida Tzafestas
Numerous observations have shown a general spatial correlation between positive Coulomb failure stress changes due to an earthquake and the locations of aftershocks. However this correlation does not give any indication of the rate from which we can infer the magnitude using the Gutenberg-Richter law. Dieterich's rate- and state-dependent model can be used to obtain a forecast of the observed aftershock rate for the space and time evolution of seismicity caused by stress changes applied to an infinite population of nucleating patches. The seismicity rate changes in this model depend on eight parameters: the stressing rate, the amplitude of the stress perturbation, the physical constitutive properties of faults, the spatial parameters (location and radii of the cells), the start and duration of each of the temporal windows as well as the background seismicity rate. The background seismicity is declustered using the epidemic type aftershock sequence model. We use the 1992 Landers earthquake as a case study, using the Southern California Earthquake Data Centre (SCEDC) catalogue, to examine if Dieterich's rate- and state-dependent model can forecast the aftershock seismicity rate. We perform a systematic study on a range of values on all the parameters to test the forecasting ability of this model. The results obtained suggest variable success in forecasting, when varying the values for the parameters, with the spatial and temporal parameters being the most sensitive. The Omori-Utsu law describes the aftershock rate as a power law in time following the main shock and depends on only three parameters: the aftershock productivity, the elapsed time since the main shock and the constant time shift, all of which can be estimated in the early part of the aftershock sequence and then extrapolated to give a long term rate forecast. All parameters are estimated using maximum likelihood methods.
We compare the Dieterich and the Omori-Utsu forecasts using the Akaike information criterion which appropriately penalises each model for the number of free parameters used in the fit and explore the full spatial distribution of parameters, forecasts and forecast skill. We find that the Omori-Utsu law consistently out-performs the Dieterich model. We then apply the method to other earthquake sequences and assess its usefulness as a real time aftershock forecasting protocol.
De Gaetano, D.; McCloskey, J.; Nalbant, S. S.
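The Omori-Utsu extrapolation that the comparison relies on can be sketched directly. The parameter values below (K, c, p) are illustrative placeholders, not the maximum-likelihood estimates for the Landers sequence or any other catalogue.

```python
import numpy as np

# Modified Omori (Omori-Utsu) aftershock-rate law: n(t) = K / (t + c)**p.
# Illustrative parameter values (K events/day, c days, p), not fitted ones.
K, c, p = 250.0, 0.05, 1.1

def omori_rate(t):
    """Aftershock rate (events per day) t days after the main shock."""
    return K / (t + c) ** p

def omori_count(t):
    """Expected number of aftershocks between time 0 and t days (p != 1),
    i.e. the analytic integral of omori_rate."""
    return K / (1.0 - p) * ((t + c) ** (1.0 - p) - c ** (1.0 - p))

# Estimate from the early sequence, then extrapolate to a long-term forecast:
print(omori_count(10.0), omori_count(100.0))
```

With only three parameters, this law pays a much smaller Akaike penalty (AIC = 2k - 2 ln L) than the eight-parameter Dieterich model, which is part of why it can out-perform it in the paper's comparison.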
When forecasting water levels and river discharge, ensemble weather forecasts are used as meteorological input to hydrologic process models. As hydrologic models are imperfect and the input ensembles tend to be biased and underdispersed, the output ensemble forecasts for river runoff typically are biased and underdispersed, too. Thus, statistical post-processing is required in order to achieve calibrated and sharp predictions. Standard post-processing methods such as Ensemble Model Output Statistics (EMOS) that have their origins in meteorological forecasting are now increasingly being used in hydrologic applications. Here we consider two sub-catchments of River Rhine, for which the forecasting system of the Federal Institute of Hydrology (BfG) uses runoff data that are censored below predefined thresholds. To address this methodological challenge, we develop a censored EMOS method that is tailored to such data. The censored EMOS forecast distribution can be understood as a mixture of a point mass at the censoring threshold and a continuous part based on a truncated normal distribution. Parameter estimates of the censored EMOS model are obtained by minimizing the Continuous Ranked Probability Score (CRPS) over the training dataset. Model fitting on Box-Cox transformed data allows us to take account of the positive skewness of river discharge distributions. In order to achieve realistic forecast scenarios over an entire range of lead-times, there is a need for multivariate extensions. To this end, we smooth the marginal parameter estimates over lead-times. In order to obtain realistic scenarios of discharge evolution over time, the marginal distributions have to be linked with each other. To this end, the multivariate dependence structure can either be adopted from the raw ensemble like in Ensemble Copula Coupling (ECC), or be estimated from observations in a training period. 
The censored EMOS model has been applied to multi-model ensemble forecasts issued on a daily basis over a period of three years. For the two catchments considered, this resulted in well calibrated and sharp forecast distributions over all lead-times from 1 to 114 h. Training observations tended to be better indicators for the dependence structure than the raw ensemble.
Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian
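The censored-normal predictive distribution at the heart of censored EMOS can be sketched as follows: a point mass at the censoring threshold plus a normal tail above it, scored by numerically integrating the CRPS. All parameter values are illustrative, not the fitted BfG/Rhine estimates; operationally the parameters are found by minimizing this CRPS over a training set.

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def censored_cdf(y, mu, sigma, y0):
    """CDF of a normal(mu, sigma) forecast left-censored at threshold y0:
    a point mass P(Y = y0) = Phi((y0 - mu)/sigma), normal tail above."""
    y = np.atleast_1d(np.asarray(y, dtype=float))
    F = np.array([norm_cdf((v - mu) / sigma) for v in y])
    return np.where(y < y0, 0.0, F)

def crps(mu, sigma, y0, obs, grid):
    """CRPS of the censored forecast against observation obs, computed by
    trapezoidal integration of (F(y) - 1{y >= obs})**2 over a fine grid."""
    F = censored_cdf(grid, mu, sigma, y0)
    step = np.where(grid >= obs, 1.0, 0.0)
    integrand = (F - step) ** 2
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(grid)))

grid = np.linspace(-50.0, 150.0, 4001)
print(censored_cdf(20.0, 10.0, 15.0, 20.0))   # point mass at the threshold
print(crps(30.0, 10.0, 20.0, 35.0, grid))
```

The jump of the CDF at the threshold is exactly the forecast probability that runoff stays at or below the censoring level, which is what makes this a mixture of a point mass and a truncated-normal part.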
We proposed a novel characterization of errors for numerical weather predictions. A general distortion representation allows for the displacement and amplification or bias correction of forecast anomalies. Characterizing and decomposing forecast error in this way has several important applications, including model assessment and objective analysis. In this project, we have focused on the assessment application, restricted to a realistic but univariate two-dimensional situation. Specifically, we study the forecast errors of the sea level pressure (SLP), the 500 hPa geopotential height, and the 315 K potential vorticity fields for forecasts of the short and medium range. The forecasts are generated by the Goddard Earth Observing System (GEOS) data assimilation system with and without ERS-1 scatterometer data. A great deal of novel work has been accomplished under the current contract. In broad terms, we have developed and tested an efficient algorithm for determining distortions. The algorithm and constraints are now ready for application to larger data sets to be used to determine the statistics of the distortion as outlined above, and to be applied in data analysis by using GEOS water vapor imagery to correct short-term forecast errors.
Hoffman, Ross N.; Nehrkorn, Thomas; Grassotti, Christopher
The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor–Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most “successful” in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
Lee, Ya-Ting; Turcotte, Donald L.; Holliday, James R.; Sachs, Michael K.; Rundle, John B.; Chen, Chien-Chih; Tiampo, Kristy F.
The goal of the Sensor Management Applied Research Technologies (SMART) On-Demand Modeling project is to develop and demonstrate the readiness of the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) capabilities to integrate both space-based Earth observations and forecast model output into new data acquisition and assimilation strategies. The project is developing sensor web-enabled processing plans to assimilate Atmospheric Infrared Sounding (AIRS) satellite temperature and moisture retrievals into a regional Weather Research and Forecast (WRF) model over the southeastern United States.
Goodman, H. Michael; Conover, Helen; Zavodsky, Bradley; Maskey, Manil; Jedlovec, Gary; Regner, Kathryn; Li, Xiang; Lu, Jessica; Botts, Mike; Berthiau, Gregoire
A new approach to ensemble forecasting of rainfall over India based on daily outputs of four operational numerical weather prediction (NWP) models in the medium-range timescale (up to 5 days) is proposed in this study. Four global models, namely ECMWF, JMA, GFS and UKMO available on real-time basis at India Meteorological Department, New Delhi, are used simultaneously with adequate weights to obtain a multi-model ensemble (MME) technique. In this technique, weights for each NWP model at each grid point are assigned on the basis of unbiased mean absolute error between the bias-corrected forecast and observed rainfall time series of 366 daily values from 3 consecutive southwest monsoon periods (JJAS) of 2008, 2009 and 2010. Apart from MME, a simple ensemble mean (ENSM) forecast is also generated and evaluated. The prediction skill of MME is examined against observed and corresponding outputs of each constituent model during monsoon 2011. The inter-comparison reveals that MME is able to provide more realistic forecast of rainfall over Indian monsoon region by taking the strength of each constituent model. It has been further found that the weighted MME technique has higher skill in predicting daily rainfall compared to ENSM and individual member models. RMSE is found to be lowest in MME forecasts both in magnitude and area coverage. This indicates that fluctuations of day-to-day errors are relatively less in the MME forecast. The inter-comparison of domain-averaged skill scores for different rainfall thresholds further clearly demonstrates that the MME algorithm improves slightly above the ENSM and member models.
Durai, V. R.; Bhardwaj, Rashmi
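The grid-point weighting described in the abstract above can be sketched with synthetic data. The paper derives weights from the unbiased mean absolute error between bias-corrected forecasts and observations; this minimal sketch assumes weights simply proportional to inverse MAE, normalised per grid point. The `mme_forecast` helper, array shapes, and random training data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mme_forecast(forecasts, observations):
    """Per-grid-point weights for a multi-model ensemble (MME).

    forecasts:    (n_models, n_days, n_lat, n_lon) bias-corrected model
                  rainfall over a training period
    observations: (n_days, n_lat, n_lon) observed rainfall
    Returns weights of shape (n_models, n_lat, n_lon), proportional to
    the inverse of each model's mean absolute error at that grid point.
    """
    # MAE of each model at each grid point over the training period
    # (e.g. 366 daily fields from three monsoon seasons).
    mae = np.abs(forecasts - observations[None]).mean(axis=1)
    inv = 1.0 / np.maximum(mae, 1e-6)               # guard against zero error
    return inv / inv.sum(axis=0, keepdims=True)     # normalise per grid point

# Usage: train weights, then combine new daily forecasts; a simple
# ensemble mean (ENSM) is computed alongside for comparison.
rng = np.random.default_rng(0)
train_fc = rng.gamma(2.0, 5.0, size=(4, 366, 10, 10))   # 4 member models
train_obs = rng.gamma(2.0, 5.0, size=(366, 10, 10))
w = mme_forecast(train_fc, train_obs)

new_fc = rng.gamma(2.0, 5.0, size=(4, 10, 10))
mme = (w * new_fc).sum(axis=0)       # weighted MME rainfall field
ensm = new_fc.mean(axis=0)           # unweighted ensemble mean
```

Normalising the inverse-error weights per grid point lets each model dominate only where it has historically been most accurate, which is the intuition behind the MME outperforming the plain ensemble mean.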
Rainfall is probably the most important parameter predicted by numerical weather prediction models, yet the skill of rainfall prediction is the poorest compared with other parameters such as temperature and humidity. In this study, the impact of rainfall assimilation on mesoscale model forecasts is evaluated during the Indian summer monsoon of 2011. The Weather Research and Forecasting (WRF) model and its four-dimensional variational data assimilation system are used to assimilate rainfall retrieved from the Tropical Rainfall Measuring Mission 3B42 product and the Japan Aerospace Exploration Agency Global Satellite Mapping of Precipitation. A total of five experiments are performed daily, with and without assimilation of rainfall data, during the entire month of July 2011. Separate assimilation experiments are performed to assess the sensitivity of the WRF model forecast to strict and less strict quality control. Assimilation of rainfall improves the forecasts of temperature, specific humidity, and wind speed. The domain-averaged improvement parameter of the rainfall forecast also increases over the Indian landmass when compared with the NOAA Climate Prediction Center Morphing technique and India Meteorological Department gridded rainfall.
Kumar, Prashant; Kishtawal, C. M.; Pal, P. K.
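The abstract above reports a domain-averaged improvement parameter without defining it; a common choice, assumed here purely for illustration, is the relative RMSE reduction of the assimilation forecast over a control run, verified against gridded observations. The `improvement_parameter` function and the toy fields are hypothetical, not from the paper.

```python
import numpy as np

def improvement_parameter(ctrl_fc, assim_fc, obs):
    """Domain-averaged improvement of an assimilation forecast over a
    control run, expressed as the relative RMSE reduction in percent.
    Positive values mean assimilation helped; negative means it hurt."""
    rmse_ctrl = np.sqrt(np.mean((ctrl_fc - obs) ** 2))
    rmse_assim = np.sqrt(np.mean((assim_fc - obs) ** 2))
    return 100.0 * (rmse_ctrl - rmse_assim) / rmse_ctrl

# Toy example: assimilation halves the control forecast's error.
obs = np.zeros((10, 10))
ctrl = np.full((10, 10), 2.0)    # control forecast, RMSE = 2
assim = np.full((10, 10), 1.0)   # assimilation forecast, RMSE = 1
print(improvement_parameter(ctrl, assim, obs))  # → 50.0
```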
Forecasting the spatial and temporal distribution of aftershocks is of great importance to earthquake scientists, civil protection authorities and the general public, as these events cause disproportionate damage and consternation relative to their size. At present, there are two main approaches to such forecasts: purely statistical methods based on observations of the initial portions of aftershock sequences, and a physics-based approach built on the Coulomb stress changes caused by the main shock. Here we develop a new method which combines the spatial constraints of the Coulomb model with the statistical power of the STEP (short-term earthquake probability) approach. We test this pseudo-prospectively and retrospectively on the Canterbury sequence against the STEP model and a Coulomb rate-state method, using data from the first 10 days following each main event to forecast the rate of M ≥ 4 events in the following 100 days. We find that in retrospective tests the new model outperforms STEP for two events in the sequence, but this is not the case in pseudo-prospective tests. Further, the Coulomb rate-state approach never performs better than STEP. Our results suggest that incorporating the physical constraints from Coulomb stress changes can increase the forecasting power of statistical models, and they clearly show the importance of good data quality if prospective forecasts are to be implemented in practice.
Steacy, Sandy; Gerstenberger, Matt; Williams, Charles; Rhoades, David; Christophersen, Annemarie
This report presents the results of a modeling study performed with the Livermore Economic Modeling System (EMS), using a number of different methods to model resource-owners' foresight. Qualitatively, the effect of a given scheme is, a priori, known: if f...
R. B. Bell
Mesoscale weather conditions can significantly affect the space launch and landing operations at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS). During the summer months, land-sea interactions that occur across KSC and CCAFS lead to the formation of a sea breeze, which can then spawn deep convection. These convective processes often last 60 minutes or less and pose a significant challenge to the forecasters at the National Weather Service (NWS) Spaceflight Meteorology Group (SMG). The main challenge is that a "GO" forecast for thunderstorms and precipitation is required at the 90 minute deorbit decision for End Of Mission (EOM) and at the 30 minute Return To Launch Site (RTLS) decision at the Shuttle Landing Facility. Convective initiation, timing, and mode also present a forecast challenge for the NWS in Melbourne, FL (MLB). The NWS MLB issues such tactical forecast information as Terminal Aerodrome Forecasts (TAFs), Spot Forecasts for fire weather and hazardous materials incident support, and severe/hazardous weather Watches, Warnings, and Advisories. Lastly, these forecasting challenges can also affect the 45th Weather Squadron (45 WS), which provides comprehensive weather forecasts for shuttle launch, as well as ground operations, at KSC and CCAFS. The need for accurate mesoscale model forecasts to aid in their decision making is crucial. Both the SMG and the NWS MLB are currently implementing the Weather Research and Forecasting Environmental Modeling System (WRF EMS) software into their operations. The WRF EMS software allows users to employ both dynamical cores: the Advanced Research WRF (ARW) and the Non-hydrostatic Mesoscale Model (NMM). There are also data assimilation analysis packages available for the initialization of the WRF model: the Local Analysis and Prediction System (LAPS) and the Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS).
Having a series of initialization options and WRF cores, as well as many options within each core, provides SMG and NWS MLB with considerable flexibility. It also creates challenges, such as determining which configuration options best address specific forecast concerns. The goal of this project is to assess the different configurations available and to determine which configuration will best predict warm season convective initiation in East-Central Florida. Four different combinations of WRF initializations will be run (ADAS-ARW, ADAS-NMM, LAPS-ARW, and LAPS-NMM) at a 4-km resolution over the Florida peninsula and adjacent coastal waters. Five candidate convective initiation days spanning three different flow regimes over East-Central Florida will be examined, as well as two null cases (non-convection days). Each model run will be integrated 12 hours, with three runs per day at 0900, 1200, and 1500 UTC. ADAS analyses will be generated every 30 minutes using Level II Weather Surveillance Radar-1988 Doppler (WSR-88D) data from all Florida radars to verify the convection forecast. These analyses will be run on the same domain as the four model configurations. To quantify model performance, model output will be subjectively compared to the ADAS analyses of convection to determine forecast accuracy. In addition, a subjective comparison of the performance of the ARW using a high-resolution local grid with 2-way nesting, 1-way nesting, and no nesting will be made for select convective initiation cases. The inner grid will cover the East-Central Florida region at a resolution of 1.33 km. The authors will summarize the relative skill of the various WRF configurations and how each configuration behaves relative to the others, as well as determine the best model configuration for predicting warm season convective initiation over East-Central Florida.
Watson, Leela R.; Hoeth, Brian; Blottman, Peter F.
Weather Forecasting is a set of computer-based learning modules that teach students about meteorology from the point of view of learning how to forecast the weather. The modules were designed as the primary teaching resource for a seminar course on weather forecasting at the introductory college level (originally METR 151, later ATMO 151) and can also be used in the laboratory component of an introductory atmospheric science course. The modules assume no prior meteorological knowledge. In addition to text and graphics, the modules include interactive questions and answers designed to reinforce student learning. The module topics are: 1. How to Access Weather Data, 2. How to Read Hourly Weather Observations, 3. The National Collegiate Weather Forecasting Contest, 4. Radiation and the Diurnal Heating Cycle, 5. Factors Affecting Temperature: Clouds and Moisture, 6. Factors Affecting Temperature: Wind and Mixing, 7. Air Masses and Fronts, 8. Forces in the Atmosphere, 9. Air Pressure, Temperature, and Height, 10. Winds and Pressure, 11. The Forecasting Process, 12. Sounding Diagrams, 13. Upper Air Maps, 14. Satellite Imagery, 15. Radar Imagery, 16. Numerical Weather Prediction, 17. NWS Forecast Models, 18. Sources of Model Error, 19. Sea Breezes, Land Breezes, and Coastal Fronts, 20. Soundings, Clouds, and Convection, 21. Snow Forecasting.