A model to forecast peak spreading.
DOT National Transportation Integrated Search
2012-04-01
As traffic congestion increases, the K-factor, defined as the proportion of the 24-hour traffic volume that occurs during the peak hour, may decrease. This behavioral response is known as peak spreading: as congestion grows during the peak travel tim...
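The K-factor defined in this abstract is just the peak-hour share of daily volume; a minimal sketch with hypothetical hourly counts (not data from the study):

```python
# Hypothetical 24 hourly traffic volumes for one day (vehicles/hour).
hourly_volumes = [200, 150, 120, 100, 150, 400, 900, 1500,
                  1300, 900, 800, 850, 900, 850, 900, 1100,
                  1400, 1600, 1200, 800, 600, 500, 400, 300]

daily_total = sum(hourly_volumes)
peak_hour_volume = max(hourly_volumes)

# K-factor: proportion of the 24-hour volume occurring in the peak hour.
k_factor = peak_hour_volume / daily_total
print(f"K-factor: {k_factor:.3f}")
```

Under peak spreading, traffic shifts out of the peak hour into adjacent hours, so `peak_hour_volume` grows more slowly than `daily_total` and the ratio falls.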
Probabilistic computer model of optimal runway turnoffs
NASA Technical Reports Server (NTRS)
Schoen, M. L.; Preston, O. W.; Summers, L. G.; Nelson, B. A.; Vanderlinden, L.; Mcreynolds, M. C.
1985-01-01
Landing delays are currently a problem at major air carrier airports, and many forecasters agree that airport congestion will get worse by the end of the century. It is anticipated that some types of delays can be reduced by an efficient optimal runway exit system allowing the increased approach volumes necessary at congested airports. A computerized Probabilistic Runway Turnoff Model, which locates exits and defines path geometry for a selected maximum occupancy time appropriate for each TERPS aircraft category, is defined. The model includes an algorithm for lateral ride comfort limits.
Algorithm and data support of traffic congestion forecasting in the controlled transport
NASA Astrophysics Data System (ADS)
Dmitriev, S. V.
2015-06-01
The relevance of forecasting traffic congestion in the logistics systems serving major product-movement highways is considered. The concepts of the controlled territory, highway occupancy by vehicles, and parking within the controlled territory are introduced. The technical feasibility of organizing the necessary flow of information on the state of the transport system for its regulation is noted. The sequence of practical implementation of the solution is given. An algorithm for predicting traffic congestion in the controlled transport system is proposed.
Speed and Delay Prediction Models for Planning Applications
DOT National Transportation Integrated Search
1999-01-01
Estimation of vehicle speed and delay is fundamental to many forms of : transportation planning analyses including air quality, long-range travel : forecasting, major investment studies, and congestion management systems. : However, existing planning...
Forecasting the clearance time of freeway accidents
DOT National Transportation Integrated Search
2002-01-01
Freeway congestion is a major and costly problem in many U.S. metropolitan areas. From a traveler's perspective, congestion has costs in terms of longer travel times and lost productivity. From the traffic manager's perspective, congestion causes a f...
Optimized Structure of the Traffic Flow Forecasting Model With a Deep Learning Approach.
Yang, Hao-Fan; Dillon, Tharam S; Chen, Yi-Ping Phoebe
2017-10-01
Forecasting accuracy is an important issue for successful intelligent traffic management, especially in the domain of traffic efficiency and congestion reduction. The dawning of the big data era brings opportunities to greatly improve prediction accuracy. In this paper, we propose a novel model, the stacked autoencoder Levenberg-Marquardt model, a deep neural network architecture aimed at improving forecasting accuracy. The proposed model is designed using the Taguchi method to develop an optimized structure and to learn traffic flow features through layer-by-layer feature granulation with a greedy layerwise unsupervised learning algorithm. It is applied to real-world data collected from the M6 freeway in the U.K. and is compared with three existing traffic predictors. To the best of our knowledge, this is the first time that an optimized structure of a traffic flow forecasting model with a deep learning approach has been presented. The evaluation results demonstrate that the proposed model with an optimized structure has superior performance in traffic flow forecasting.
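A minimal sketch of the greedy layerwise unsupervised pretraining the abstract describes, using plain NumPy autoencoders; the layer sizes, synthetic data, and simple gradient-descent training loop are illustrative assumptions, not the authors' Taguchi-optimized, Levenberg-Marquardt-trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_autoencoder(X, hidden, epochs=200, lr=0.1):
    """Train one autoencoder layer by plain gradient descent (illustrative)."""
    n, d = X.shape
    W1 = rng.normal(0, 0.1, (d, hidden))
    W2 = rng.normal(0, 0.1, (hidden, d))
    for _ in range(epochs):
        H = sigmoid(X @ W1)          # encode
        R = H @ W2                   # linear decode
        err = R - X
        gW2 = H.T @ err / n          # backprop through decoder
        gH = err @ W2.T * H * (1 - H)
        gW1 = X.T @ gH / n           # backprop through encoder
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1

# Synthetic "traffic flow" windows: 8 past readings per sample.
X = rng.random((256, 8))

# Greedy layer-wise unsupervised pretraining: each layer's encoder is
# trained to reconstruct the previous layer's output.
layers = []
inp = X
for hidden in (6, 4):
    W = train_autoencoder(inp, hidden)
    layers.append(W)
    inp = sigmoid(inp @ W)

features = inp  # stacked-encoder representation, ready for a supervised head
print(features.shape)
```

A supervised output layer (the paper uses Levenberg-Marquardt fine-tuning) would then be trained on `features` against the next-interval flow.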
Economic analysis for transmission operation and planning
NASA Astrophysics Data System (ADS)
Zhou, Qun
2011-12-01
Restructuring of the electric power industry has caused dramatic changes in the use of the transmission system. Increasing congestion as well as the necessity of integrating renewable energy introduce new challenges and uncertainties to transmission operation and planning. Accurate short-term congestion forecasting facilitates market traders' bidding and trading activities, while cost sharing and recovery is a major impediment to the long-term transmission investment needed to integrate renewable energy. In this research, a new short-term forecasting algorithm is proposed for predicting congestion, locational marginal prices (LMPs), and other power system variables based on the concept of system patterns. The advantage of this algorithm relative to standard statistical forecasting methods is that structural aspects underlying power market operations are exploited to reduce forecasting error. The advantage relative to previously proposed structural forecasting methods is that data requirements are substantially reduced. Forecasting results based on a NYISO case study demonstrate the feasibility and accuracy of the proposed algorithm. Moreover, a negotiation methodology is developed to guide transmission investment for integrating renewable energy. Built on Nash bargaining theory, the negotiation of investment plans and payment rates can proceed between renewable generation and transmission companies for cost sharing and recovery. The proposed approach is applied to Garver's six-bus system. The numerical results demonstrate the fairness and efficiency of the approach, and hence can be used as guidelines for renewable energy investors. The results also shed light on policy-making for renewable energy subsidies.
A Beacon Transmission Power Control Algorithm Based on Wireless Channel Load Forecasting in VANETs.
Mo, Yuanfu; Yu, Dexin; Song, Jun; Zheng, Kun; Guo, Yajuan
2015-01-01
In a vehicular ad hoc network (VANET), the periodic exchange of single-hop status broadcasts (beacon frames) produces channel loading, which causes channel congestion and induces information conflict problems. To guarantee fairness in beacon transmissions from each node and maximum network connectivity, adjusting the beacon transmission power is an effective method for reducing and preventing channel congestion. In this study, the primary factors that influence wireless channel loading are selected to construct the KF-BCLF, a channel load forecasting algorithm based on a recursive Kalman filter and a multiple regression equation. By pre-adjusting the transmission power based on the forecasted channel load, the channel load is kept within a predefined range, thereby preventing channel congestion. Based on this method, the CLF-BTPC, a transmission power control algorithm, is proposed. To verify the KF-BCLF algorithm, a traffic survey involving the collection of floating car data along a major traffic road in Changchun City was carried out. By comparing the forecasts with the measured channel loads, the proposed KF-BCLF algorithm was shown to be effective. In addition, the CLF-BTPC algorithm was verified by simulating a section of eight-lane highway and a signal-controlled urban intersection. The results of the two verification processes indicate that this distributed CLF-BTPC algorithm can effectively control channel load, prevent channel congestion, and enhance the stability and robustness of wireless beacon transmission in a vehicular network.
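A one-dimensional sketch of the recursive Kalman filter idea behind KF-BCLF: the filter tracks a noisy channel-load series, and the latest estimate serves as the forecast that a power controller compares against a load threshold. The noise variances, synthetic load series, threshold, and power levels are all illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic slowly varying channel load (fraction of busy airtime).
true_load = 0.6 + 0.05 * np.sin(np.linspace(0, 6, 120))
observed = true_load + rng.normal(0, 0.02, true_load.size)

q, r = 1e-4, 4e-4           # process / measurement noise variances (assumed)
x, p = observed[0], 1.0     # state estimate and its variance
estimates = []
for z in observed:
    p = p + q               # predict step
    k = p / (p + r)         # Kalman gain
    x = x + k * (z - x)     # update with the new measurement
    p = (1 - k) * p
    estimates.append(x)

forecast = estimates[-1]    # one-step-ahead load forecast
# A power controller would back off transmit power when the forecast
# exceeds a predefined load threshold (all values hypothetical).
TX_MAX = 20.0               # dBm, hypothetical ceiling
tx_power = TX_MAX if forecast < 0.7 else TX_MAX - 5.0
print(round(float(forecast), 3), tx_power)
```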
Essays in financial transmission rights pricing
NASA Astrophysics Data System (ADS)
Posner, Barry
This work examines issues in the pricing of financial transmission rights in the PJM market region. The US federal government is advocating the creation of large-scale, not-for-profit regional transmission organizations to increase the efficiency of the transmission of electricity. As a non-profit entity, PJM needs to allocate excess revenues collected as congestion rents, and the participants in the transmission markets need to be able to hedge their exposure to congestion rents. For these purposes, PJM has developed an instrument known as the financial transmission right (FTR). This research, utilizing a new data set assembled by the author, looks at two aspects of the FTR market. The first chapter examines the problem of forecasting congestion in a transmission grid. In the PJM FTR system, firms bid in a competitive auction for FTRs that cover a period of one month. The auctions take place in the middle of the previous month; therefore firms have to forecast congestion rents for the period two to six weeks after the auction. The common methods of forecasting congestion are either time-series models or full-information engineering studies. In this research, the author develops a forecasting system that is more economically grounded than a simple time-series model but requires less information than an engineering model. This method is based upon the arbitrage-cost methodology, whereby congestion is calculated as the difference of two non-observable variables: the transmission price difference that would exist in the total absence of transmission capacity between two nodes, and the ability of the existing transmission to reduce that price difference. If the ability to reduce the price difference is greater than the price difference, then the cost of electricity at each node will be the same, and congestion rent will be zero. If transmission capacity limits are binding on the flow of power, then a price difference persists and congestion rents exist.
Three transmission paths in the Delmarva Peninsula were examined. The maximum-likelihood two-way Tobit model developed in Chapter One consistently predicts the expected responses to the independent variables employed, but the model as defined here does a poor job of predicting prices. This is likely due to the inability to include system outages (i.e., short-term changes in the structure of the transmission grid) as variables in the estimation model. The second chapter addresses the behavior of firms in the monthly auctions for FTRs. FTRs are a claim to congestion rent revenues along a certain path within the PJM grid, and are awarded in a uniform-price divisible-goods auction. Firms typically submit a schedule of bids for different amounts of FTR at different prices, akin to a demand curve. A firm bidding too high a price may cause the clearing price of the FTR to be higher than the realized value of the FTR, creating a loss from ownership of the FTR. A firm bidding too low wins no FTRs, depriving itself of the ability to profit from ownership or to hedge against congestion. Several questions concerning firm behavior are addressed in this study. It is found that firms adjust their bids in response to new information obtained from past auctions: they raise or lower bids in accordance with changes in recent FTR prices and payoffs. Firms consistently bid below the value of the FTR (i.e., shade their bids). This adds empirical evidence to the theoretically posited notion that uniform-price auctions are not truth-telling, unlike the second-price auction for a non-divisible good. Firms employ greater bid shading in response to increases in the volatility of both FTR clearing prices and realized FTR values. This validates the notion that firms are risk-averse. It is discovered that better-informed "insider" firms employ structurally different bidding strategies, but these differences do not lead to greater profits.
However, profits do increase as firms gain more experience in these markets, lending credence to the notion that firms learn over time and that markets discipline poorly performing firms by either educating them or driving them out of the market. It is also found that firms that employ complicated bidding strategies enjoy greater profitability than firms which employ simple bidding strategies. A surprising corollary finding is that firm strategies do not converge to a common form, but that different firms continue to employ different strategies, and often move away from the seemingly dominant strategy. Firms can enter this market as either long-buyers or short-sellers, and it is discovered that long and short players display structurally divergent bidding strategies. This is perhaps unsurprising, given that long players can be either hedgers or speculators, but short players are overwhelmingly speculators.
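The arbitrage-cost calculation described in the first chapter reduces to clipping the unrelieved price spread at zero; a toy sketch with hypothetical dollar figures:

```python
# Sketch of the arbitrage-cost idea: congestion rent is the part of the
# hypothetical no-transmission price spread that existing capacity
# cannot relieve. All dollar figures are hypothetical.
def congestion_rent(price_spread, relief_capability):
    """price_spread: nodal price difference absent any transmission ($/MWh).
    relief_capability: spread reduction the existing capacity can deliver."""
    return max(0.0, price_spread - relief_capability)

# Ample capacity: prices equalize across nodes, so rent is zero.
rent_uncongested = congestion_rent(12.0, 15.0)
# Binding capacity: a $4/MWh difference persists as congestion rent.
rent_congested = congestion_rent(12.0, 8.0)
print(rent_uncongested, rent_congested)
```

Both inputs are unobservable in practice, which is why the dissertation estimates them with a two-way Tobit model rather than computing them directly.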
Holt-Winters Forecasting: A Study of Practical Applications for Healthcare Managers
2006-05-25
Front-matter excerpt: Table 1 lists Holt-Winters smoothing parameters and Mean Absolute Percentage Errors for pseudoephedrine prescriptions; Table 2 lists confidence intervals. Figure 1 is a line plot of pseudoephedrine prescriptions forecast using the smoothing parameters. The first data series represents monthly prescriptions of pseudoephedrine, a drug commonly prescribed to relieve nasal congestion and other...
Predicting vehicle fuel consumption patterns using floating vehicle data.
Du, Yiman; Wu, Jianping; Yang, Senyan; Zhou, Liutong
2017-09-01
Energy consumption and air pollution are serious problems in China, so it is important to analyze and predict the fuel consumption of various types of vehicles under different influencing factors. In order to fully describe the relationship between fuel consumption and these factors, massive amounts of floating vehicle data were used. Fuel consumption and congestion patterns were explored from large samples of historical floating vehicle data; driver information and vehicle parameters were examined across different group classifications; and average velocity and average fuel consumption were analyzed in the temporal and spatial dimensions, respectively. A fuel consumption forecasting model was established using a back-propagation neural network: part of the sample set was used to train the forecasting model, and the remaining part was used as input to it.
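A compact sketch of the kind of back-propagation neural network forecaster the abstract describes, with a train/holdout split of the sample set; the synthetic features, made-up fuel-rate target, and network size are illustrative assumptions, not the study's data or architecture:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic floating-car features: [avg speed, accel variance, hour-of-day],
# each scaled to [0, 1]; the fuel-rate target is a made-up function of them.
X = rng.random((400, 3))
y = (8.0 - 4.0 * X[:, 0] + 2.0 * X[:, 1]).reshape(-1, 1)  # roughly L/100 km

# Train on part of the sample set, hold the rest out, as in the paper.
X_tr, X_te, y_tr, y_te = X[:300], X[300:], y[:300], y[300:]

# One-hidden-layer back-propagation network.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    H = np.tanh(X_tr @ W1 + b1)       # forward pass
    pred = H @ W2 + b2
    err = pred - y_tr
    gW2 = H.T @ err / len(X_tr); gb2 = err.mean(0)
    gH = err @ W2.T * (1 - H ** 2)    # backprop through tanh
    gW1 = X_tr.T @ gH / len(X_tr); gb1 = gH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

test_pred = np.tanh(X_te @ W1 + b1) @ W2 + b2
rmse = float(np.sqrt(((test_pred - y_te) ** 2).mean()))
print(f"held-out RMSE: {rmse:.3f}")
```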
The importance of antipersistence for traffic jams
NASA Astrophysics Data System (ADS)
Krause, Sebastian M.; Habel, Lars; Guhr, Thomas; Schreckenberg, Michael
2017-05-01
Universal characteristics of road networks and traffic patterns can help to forecast and control traffic congestion. The antipersistence of traffic flow time series has been found for many data sets, but its relevance for congestion has been overlooked. Based on empirical data from motorways in Germany, we study how antipersistence of traffic flow time series impacts the duration of traffic congestion on a wide range of time scales. We find a large number of short-lasting traffic jams, which implies a large risk of rear-end collisions.
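Antipersistence means successive changes tend to reverse, which shows up as a negative lag-1 autocorrelation of the increments (equivalently, a Hurst exponent below 0.5). A synthetic illustration, not the German motorway data:

```python
import numpy as np

rng = np.random.default_rng(3)
noise = rng.normal(size=5001)
# MA(1) increments with a negative coefficient give an antipersistent walk:
# the theoretical lag-1 autocorrelation is -0.5 / (1 + 0.25) = -0.4.
increments = noise[1:] - 0.5 * noise[:-1]
flow = 1000 + np.cumsum(increments)   # synthetic "traffic flow" series

d = np.diff(flow)
lag1 = float(np.corrcoef(d[:-1], d[1:])[0, 1])
print(f"lag-1 increment autocorrelation: {lag1:.3f}")
```

A negative value like this means an above-average change is typically followed by a below-average one, which is consistent with many short-lived excursions (short traffic jams) rather than long persistent ones.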
A Method for Forecasting the Commercial Air Traffic Schedule in the Future
NASA Technical Reports Server (NTRS)
Long, Dou; Lee, David; Gaier, Eric; Johnson, Jesse; Kostiuk, Peter
1999-01-01
This report presents an integrated set of models that forecasts air carriers' future operations when delays due to limited terminal-area capacity are considered. It models the industry as a whole, avoiding unnecessary details of competition among the carriers. To develop the schedule outputs, we first present a model to forecast unconstrained future flight schedules, based on the assumption of rational behavior by the carriers. We then develop a method to modify the unconstrained schedules, accounting for effects of congestion due to limited NAS capacities. Our underlying assumption is that carriers will modify their operations to keep mean delays within certain limits. We estimate values for those limits from changes in planned block times reflected in the OAG. Our method for modifying schedules takes many means of reducing delays into consideration, albeit some of them indirectly. The direct actions include depeaking, operating in off-hours, and reducing hub airports' operations. Indirect actions include using secondary airports, using larger aircraft, and selecting new hub airports, which, we assume, have already been modeled in the FAA's TAF. Users of our suite of models can substitute an alternative forecast for the TAF.
Cardiac catheterization laboratory inpatient forecast tool: a prospective evaluation
Flanagan, Eleni; Siddiqui, Sauleh; Appelbaum, Jeff; Kasper, Edward K; Levin, Scott
2016-01-01
Objective: To develop and prospectively evaluate a web-based tool that forecasts the daily bed need for admissions from the cardiac catheterization laboratory using routinely available clinical data within electronic medical records (EMRs). Methods: The forecast model was derived using a 13-month retrospective cohort of 6384 catheterization patients. Predictor variables such as demographics, scheduled procedures, and clinical indicators mined from free-text notes were input to a multivariable logistic regression model that predicted the probability of inpatient admission. The model was embedded into a web-based application connected to the local EMR system and used to support bed management decisions. After implementation, the tool was prospectively evaluated for accuracy on a 13-month test cohort of 7029 catheterization patients. Results: The forecast model predicted admission with an area under the receiver operating characteristic curve of 0.722. Daily aggregate forecasts were accurate to within one bed for 70.3% of days and within three beds for 97.5% of days during the prospective evaluation period. The web-based application housing the forecast model was used by cardiology providers in practice to estimate daily admissions from the catheterization laboratory. Discussion: The forecast model identified older age, male gender, invasive procedures, coronary artery bypass grafts, and a history of congestive heart failure as qualities indicating a patient was at increased risk for admission. Diagnostic procedures and less acute clinical indicators decreased patients' risk of admission. Despite the site-specific limitations of the model, these findings were supported by the literature. Conclusion: Data-driven predictive analytics may be used to accurately forecast daily demand for inpatient beds for cardiac catheterization patients. Connecting these analytics to EMR data sources has the potential to provide advanced operational decision support. PMID:26342217
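A sketch of the underlying modeling step: a multivariable logistic regression scored by the area under the ROC curve. The cohort here is synthetic with made-up predictors and coefficients; only the method (logistic fit plus AUC) mirrors the abstract:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-in for the catheterization cohort: two predictors
# (e.g. age and an invasive-procedure flag); all coefficients are made up.
n = 2000
age = rng.normal(65, 10, n)
invasive = rng.integers(0, 2, n)
logit_true = -10.0 + 0.12 * age + 1.0 * invasive
admit = (rng.random(n) < 1 / (1 + np.exp(-logit_true))).astype(float)

# Fit logistic regression by gradient descent on the log-loss.
X = np.column_stack([np.ones(n), (age - 65) / 10, invasive])
w = np.zeros(3)
for _ in range(3000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - admit) / n

scores = X @ w
# AUC = probability a random admitted patient outranks a random non-admitted one.
pos, neg = scores[admit == 1], scores[admit == 0]
auc = float((pos[:, None] > neg[None, :]).mean())
print(f"AUC: {auc:.3f}")
```

Daily aggregate bed forecasts, as in the paper, would then sum the predicted admission probabilities over each day's scheduled patients.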
An Enhanced Convective Forecast (ECF) for the New York TRACON Area
NASA Technical Reports Server (NTRS)
Wheeler, Mark; Stobie, James; Gillen, Robert; Jedlovec, Gary; Sims, Danny
2008-01-01
In an effort to relieve summer-time congestion in the NY Terminal Radar Approach Control (TRACON) area, the FAA is testing an enhanced convective forecast (ECF) product. The test began in June 2008 and is scheduled to run through early September. The ECF is updated every two hours, right before the Air Traffic Control System Command Center (ATCSCC) national planning telcon. It is intended to be used by traffic managers throughout the National Airspace System (NAS) and airline dispatchers to supplement information from the Collaborative Convective Forecast Product (CCFP) and the Corridor Integrated Weather System (CIWS). The ECF begins where the current CIWS forecast ends at 2 hours and extends out to 12 hours. Unlike the CCFP, it is a detailed deterministic forecast with no areal coverage limits. It is created by an ENSCO forecaster using a variety of guidance products, including the Weather Research and Forecast (WRF) model. This is the same version of the WRF that ENSCO runs over the Florida peninsula in support of launch operations at the Kennedy Space Center. For this project, the WRF model domain has been shifted to the Northeastern US. Several products from the NASA SPoRT group are also used by the ENSCO forecaster. In this paper we provide examples of the ECF products and discuss individual cases of traffic management actions using ECF guidance.
Using temporal detrending to observe the spatial correlation of traffic.
Ermagun, Alireza; Chatterjee, Snigdhansu; Levinson, David
2017-01-01
This empirical study sheds light on the spatial correlation of traffic links under different traffic regimes. We mimic the behavior of real traffic by pinpointing the spatial correlation between 140 freeway traffic links in a major sub-network of the Minneapolis-St. Paul freeway system with a grid-like network topology. This topology enables us to juxtapose the positive and negative correlation between links, which has been overlooked in short-term traffic forecasting models. To accurately and reliably measure the correlation between traffic links, we develop an algorithm that eliminates temporal trends in three dimensions: (1) hourly dimension, (2) weekly dimension, and (3) system dimension for each link. The spatial correlation of traffic links exhibits a stronger negative correlation in rush hours, when congestion affects route choice. Although this correlation occurs mostly in parallel links, it is also observed upstream, where travelers receive information and are able to switch to substitute paths. Irrespective of the time-of-day and day-of-week, a strong positive correlation is witnessed between upstream and downstream links. This correlation is stronger in uncongested regimes, as traffic flow passes through consecutive links more quickly and there is no congestion effect to shift or stall traffic. The extracted spatial correlation structure can augment the accuracy of short-term traffic forecasting models.
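A sketch of the three-dimensional detrending idea: subtract the system-wide mean series, then the hour-of-day means, then the day-of-week means, so that remaining correlation between links is not driven by shared temporal cycles. The data and link count here are synthetic assumptions, not the Minneapolis-St. Paul network:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic month of hourly flows for 3 links: a shared daily cycle plus
# a day-of-week effect plus independent link-specific noise.
hours = np.arange(24 * 28)
hod = hours % 24                  # hour-of-day index
dow = (hours // 24) % 7           # day-of-week index
daily = 50 * np.sin(2 * np.pi * hod / 24)
weekly = np.where(dow < 5, 30.0, -40.0)     # weekday vs weekend shift
flows = 500 + daily + weekly + rng.normal(0, 5, (3, hours.size))

def detrend(link, system_mean):
    """Remove system-wide, hourly, and weekly trends from one link's series."""
    x = link - system_mean                   # (3) system dimension
    for h in range(24):                      # (1) hourly dimension
        x[hod == h] -= x[hod == h].mean()
    for d in range(7):                       # (2) weekly dimension
        x[dow == d] -= x[dow == d].mean()
    return x

system_mean = flows.mean(axis=0)
residuals = np.array([detrend(f, system_mean) for f in flows])

# Correlations between residuals now reflect spatial structure rather
# than shared time-of-day or day-of-week cycles.
corr = np.corrcoef(residuals)
print(corr.round(2))
```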
NASA Astrophysics Data System (ADS)
Hart, E. K.; Jacobson, M. Z.; Dvorak, M. J.
2008-12-01
Time series power flow analyses of the California electricity grid are performed with extensive addition of intermittent renewable power. The study focuses on the effects of replacing non-renewable and imported (out-of-state) electricity with wind and solar power on the reliability of the transmission grid. Simulations are performed for specific days chosen throughout the year to capture seasonal fluctuations in load, wind, and insolation. Wind farm expansions and new wind farms are proposed based on regional wind resources and time-dependent wind power output is calculated using a meteorological model and the power curves of specific wind turbines. Solar power is incorporated both as centralized and distributed generation. Concentrating solar thermal plants are modeled using local insolation data and the efficiencies of pre-existing plants. Distributed generation from rooftop PV systems is included using regional insolation data, efficiencies of common PV systems, and census data. The additional power output of these technologies offsets power from large natural gas plants and is balanced for the purposes of load matching largely with hydroelectric power and by curtailment when necessary. A quantitative analysis of the effects of this significant shift in the electricity portfolio of the state of California on power availability and transmission line congestion, using a transmission load-flow model, is presented. A sensitivity analysis is also performed to determine the effects of forecasting errors in wind and insolation on load-matching and transmission line congestion.
Increasing accuracy of vehicle speed measurement in congested traffic over dual-loop sensors.
DOT National Transportation Integrated Search
2014-09-01
Classified vehicle counts are a critical measure for forecasting the health of the roadway infrastructure : and for planning future improvements to the transportation network. Balancing the cost of data : collection with the fidelity of the measureme...
Assessing air quality and climate impacts of future ground freight choice in United States
NASA Astrophysics Data System (ADS)
Liu, L.; Bond, T. C.; Smith, S.; Lee, B.; Ouyang, Y.; Hwang, T.; Barkan, C.; Lee, S.; Daenzer, K.
2013-12-01
The demand for freight transportation has continued to increase due to the growth of domestic and international trade. Emissions from ground freight (truck and railways) account for around 7% of greenhouse gas emissions, 4% of primary particulate matter emissions, and 25% of NOx emissions in the U.S. Freight railways are generally more fuel efficient than trucks and cause less congestion. Freight demand and emissions are affected by many factors, including economic activity, the spatial distribution of demand, freight modal choice and routing decisions, and the technology used in each modal type. This work links these four critical aspects of the freight emission system to project the spatial distribution of emissions and pollutant concentrations from ground freight transport in the U.S. between 2010 and 2050. Macroeconomic scenarios are used to forecast economic activity. The future spatial structure of employment and commodity demand in major metropolitan areas are estimated using spatial models and a shift-share model, respectively. Freight flow concentration and congestion patterns in inter-regional transportation networks are predicted with a four-step freight demand forecasting model. An asymptotic vehicle routing model is also developed to estimate delivery ton-miles for intra-regional freight shipment in metropolitan areas. Projected freight activities are then converted into impacts on air quality and climate. CO2 emissions are determined using a simple model of freight activity and fuel efficiency, and compared with the projected CO2 emissions from the Second Generation Model. Emissions of air pollutants including PM, NOx, and CO are calculated with a vehicle fleet model, SPEW-Trend, which incorporates the dynamic change of technologies. Emissions are projected under three economic scenarios to represent different plausible futures.
Pollutant concentrations are then estimated using tagged chemical tracers in an atmospheric model with the emissions serving as input.
Forecast-based interventions can reduce the health and economic burden of wildfires.
Rappold, Ana G; Fann, Neal L; Crooks, James; Huang, Jin; Cascio, Wayne E; Devlin, Robert B; Diaz-Sanchez, David
2014-09-16
We simulated public health forecast-based interventions during a wildfire smoke episode in rural North Carolina to show the potential of modeled smoke forecasts for reducing the health burden, and found a significant economic benefit of reducing exposures. Daily, countywide intervention advisories were designed to occur when fine particulate matter (PM2.5) from smoke, forecasted 24 or 48 h in advance, was expected to exceed a predetermined threshold. Three different thresholds were considered in simulations, each with three different levels of adherence to the advisories. Interventions were simulated in the adult population susceptible to health exacerbations related to the chronic conditions of asthma and congestive heart failure. Associations between Emergency Department (ED) visits for these conditions and daily PM2.5 concentrations under each intervention were evaluated. Triggering interventions at lower PM2.5 thresholds (≤ 20 μg/m(3)) with good compliance yielded the greatest risk reduction. At the highest threshold level (50 μg/m(3)), interventions were ineffective in reducing health risks at any level of compliance. The economic benefit of effective interventions exceeded $1 M in excess ED visits for asthma and heart failure, $2 M in loss of productivity, $100 K in respiratory conditions in children, and $42 M due to excess mortality.
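The advisory rule reduces to a threshold test on the forecast concentration; a toy sketch with hypothetical daily PM2.5 forecasts showing how a lower threshold triggers more intervention days:

```python
# Hypothetical 24 h-ahead daily PM2.5 forecasts during a smoke episode (ug/m3).
forecast_pm25 = [8, 15, 32, 55, 48, 21, 12, 60, 35, 9]

def advisories(forecasts, threshold):
    """Return the day indices on which an intervention advisory is triggered."""
    return [day for day, f in enumerate(forecasts) if f > threshold]

low = advisories(forecast_pm25, 20)    # sensitive threshold: many advisories
high = advisories(forecast_pm25, 50)   # only the worst smoke days
print(low, high)
```

The study's finding corresponds to the lower-threshold case: more advisory days, combined with good adherence, produced the greatest reduction in exposure-related ED visits.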
Capacity-constrained traffic assignment in networks with residual queues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lam, W.H.K.; Zhang, Y.
2000-04-01
This paper proposes a capacity-constrained traffic assignment model for strategic transport planning in which the steady-state user equilibrium principle is extended to road networks with residual queues. The road-exit capacity and queuing effects can therefore be incorporated into the strategic transport model for traffic forecasting. The proposed model is applicable to congested networks, particularly when the traffic demand exceeds the capacity of the network during the peak period. An efficient solution method is proposed for solving the steady-state traffic assignment problem with residual queues. A simple numerical example is then employed to demonstrate the application of the proposed model and solution method, while an example of a medium-sized arterial highway network in Sioux Falls, South Dakota, is used to test the applicability of the proposed solution to real problems.
Network congestion control algorithm based on Actor-Critic reinforcement learning model
NASA Astrophysics Data System (ADS)
Xu, Tao; Gong, Lina; Zhang, Wei; Li, Xuhong; Wang, Xia; Pan, Wenwen
2018-04-01
Aiming at the network congestion control problem, a congestion control algorithm based on the Actor-Critic reinforcement learning model is designed. By incorporating a genetic algorithm into the congestion control strategy, network congestion can be detected and prevented more effectively. A simulation experiment of the network congestion control algorithm is designed according to the Actor-Critic reinforcement learning model. The simulation experiments verify that the AQM controller can predict the dynamic characteristics of the network system. Moreover, the learning strategy is adopted to optimize network performance, and the dropping probability of packets is adaptively adjusted so as to improve network performance and avoid congestion. Based on these findings, it is concluded that the network congestion control algorithm based on the Actor-Critic reinforcement learning model can effectively avoid the occurrence of TCP network congestion.
Dynamic, stochastic models for congestion pricing and congestion securities.
DOT National Transportation Integrated Search
2010-12-01
This research considers congestion pricing under demand uncertainty. In particular, a robust optimization (RO) approach is applied to optimal congestion pricing problems under user equilibrium. A mathematical model is developed and an analysis perfor...
A Bayesian ridge regression analysis of congestion's impact on urban expressway safety.
Shi, Qi; Abdel-Aty, Mohamed; Lee, Jaeyoung
2016-03-01
With the rapid growth of traffic in urban areas, concerns about congestion and traffic safety have been heightened. This study leveraged both the Automatic Vehicle Identification (AVI) system and the Microwave Vehicle Detection System (MVDS) installed on an expressway in Central Florida to explore how congestion impacts crash occurrence in urban areas. Multiple congestion measures from the two systems were developed. To ensure more precise estimates of congestion's effects, the traffic data were aggregated into peak and non-peak hours. Multicollinearity among traffic parameters was examined. The results showed the presence of multicollinearity, especially during peak hours. In response, ridge regression was introduced to cope with this issue. Poisson models with uncorrelated random effects, correlated random effects, and both correlated random effects and random parameters were constructed within the Bayesian framework. Correlated random effects were shown to significantly enhance model performance. The random parameters model had goodness-of-fit similar to that of the model with only correlated random effects. However, by accounting for the unobserved heterogeneity, more variables were found to be significantly related to crash frequency. The models indicated that congestion increased crash frequency during peak hours, while during non-peak hours it was not a major contributing factor to crashes. Using the random parameters model, the three congestion measures were compared. All congestion indicators had similar effects, while the Congestion Index (CI) derived from MVDS data was a better congestion indicator for safety analysis. Also, analyses showed that segments with higher congestion intensity experienced not only more property damage only (PDO) crashes but also more severe crashes.
In addition, the necessity of incorporating a specific congestion indicator for congestion's effects on safety and of addressing the multicollinearity between explanatory variables was also discussed. By including a specific congestion indicator, the model performance significantly improved. When comparing models with and without ridge regression, the magnitude of the coefficients was altered in the presence of multicollinearity. These conclusions suggest that the use of an appropriate congestion measure and consideration of multicollinearity among the variables would improve the models and our understanding of the effects of congestion on traffic safety. Copyright © 2015 Elsevier Ltd. All rights reserved.
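To illustrate why the authors reach for ridge regression under multicollinearity, the following sketch fits the closed-form ridge estimator on deliberately near-collinear synthetic predictors (illustrative data, not the study's crash dataset). With the penalty, the inflated, unstable coefficients shrink toward a stable solution.

```python
import numpy as np

# Ridge regression under multicollinearity: two nearly identical predictors.
rng = np.random.default_rng(42)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)    # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 1.0 * x1 + 1.0 * x2 + rng.normal(scale=0.5, size=n)

def ridge(X, y, lam):
    """Closed-form ridge estimate (X'X + lam*I)^-1 X'y; lam=0 gives OLS."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

beta_ols = ridge(X, y, 0.0)      # unstable: coefficients can blow up
beta_ridge = ridge(X, y, 10.0)   # penalized: shrunk, near-equal coefficients
```

The sum of the two coefficients is well identified either way, but only ridge keeps the individual coefficients (and hence their interpretation as congestion effects) stable.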
Rappold, Ana G; Cascio, Wayne E; Kilaru, Vasu J; Stone, Susan L; Neas, Lucas M; Devlin, Robert B; Diaz-Sanchez, David
2012-09-24
Background: Characterizing factors which determine susceptibility to air pollution is an important step in understanding the distribution of risk in a population and is critical for setting appropriate policies. We evaluate general and specific measures of community health as modifiers of risk for asthma and congestive heart failure following an episode of acute exposure to wildfire smoke. Methods: A population-based study of emergency department visits and daily concentrations of fine particulate matter during a wildfire in North Carolina was performed. Determinants of community health defined by County Health Rankings were evaluated as modifiers of the relative risk. A total of 40 mostly rural counties were included in the study. These rankings measure factors influencing health (health behaviors, access and quality of clinical care, social and economic factors, and physical environment) as well as the outcomes of health (premature mortality and morbidity). Pollutant concentrations were obtained from a mathematically modeled smoke forecasting system. Estimates of relative risk for emergency department visits were based on Poisson mixed effects regression models applied to daily visit counts. Results: For asthma, the strongest association was observed at lag day 0, with an excess relative risk of 66% (28, 117). For congestive heart failure the excess relative risk was 42% (5, 93). The largest difference in risk was observed after stratifying on the basis of Socio-Economic Factors. The difference in risk between bottom and top ranked counties by Socio-Economic Factors was 85% for asthma and 124% for congestive heart failure. Conclusions: The results indicate that Socio-Economic Factors should be considered as modifying risk factors in air pollution studies and be evaluated in the assessment of air pollution impacts. PMID:23006928
Integrated risk/cost planning models for the US Air Traffic system
NASA Technical Reports Server (NTRS)
Mulvey, J. M.; Zenios, S. A.
1985-01-01
A prototype network planning model for the U.S. Air Traffic control system is described. The model encompasses the dual objectives of managing collision risks and transportation costs where traffic flows can be related to these objectives. The underlying structure is a network graph with nonseparable convex costs; the model is solved efficiently by capitalizing on its intrinsic characteristics. Two specialized algorithms for solving the resulting problems are described: (1) truncated Newton, and (2) simplicial decomposition. The feasibility of the approach is demonstrated using data collected from a control center in the Midwest. Computational results with different computer systems are presented, including a vector supercomputer (CRAY-XMP). The risk/cost model has two primary uses: (1) as a strategic planning tool using aggregate flight information, and (2) as an integrated operational system for forecasting congestion and monitoring (controlling) flow throughout the U.S. In the latter case, access to a supercomputer is required due to the model's enormous size.
Civil Tiltrotor Feasibility Study for the New York and Washington Terminal Areas
NASA Technical Reports Server (NTRS)
Stouffer, Virginia; Johnson, Jesse; Gribko, Joana; Yackovetsky, Robert (Technical Monitor)
2001-01-01
NASA tasked LMI to assess the potential contributions of a yet-undeveloped Civil Tiltrotor aircraft (CTR) in improving capacity in the National Airspace System in all weather conditions. The CTRs studied have assumed operating parameters beyond current CTR capabilities. LMI analyzed CTRs three ways: in fast-time terminal area modeling simulations of New York and Washington, to determine delay and throughput impacts; in the Integrated Noise Model, to determine local environmental impact; and with an economic model, to determine the price viability of a CTR. The fast-time models encompassed a 250 nmi range and included traffic interactions from local airports. Both the fast-time simulation and the noise model assessed impacts from traffic levels projected for 1999, 2007, and 2017. Results: CTRs can reduce terminal area delays due to concrete congestion in all time frames. For maximum effect, the ratio of CTRs to jets and turboprop aircraft at a subject airport should be optimized. The economic model considered US traffic only and forecasted CTR sales beginning in 2010.
Fractal mechanisms and heart rate dynamics. Long-range correlations and their breakdown with disease
NASA Technical Reports Server (NTRS)
Peng, C. K.; Havlin, S.; Hausdorff, J. M.; Mietus, J. E.; Stanley, H. E.; Goldberger, A. L.
1995-01-01
Under healthy conditions, the normal cardiac (sinus) interbeat interval fluctuates in a complex manner. Quantitative analysis using techniques adapted from statistical physics reveals the presence of long-range power-law correlations extending over thousands of heartbeats. This scale-invariant (fractal) behavior suggests that the regulatory system generating these fluctuations is operating far from equilibrium. In contrast, it is found that for subjects at high risk of sudden death (e.g., congestive heart failure patients), these long-range correlations break down. Application of fractal scaling analysis and related techniques provides new approaches to assessing cardiac risk and forecasting sudden cardiac death, as well as motivating development of novel physiologic models of systems that appear to be heterodynamic rather than homeostatic.
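The scaling analysis behind this abstract is detrended fluctuation analysis (DFA), which fits the power-law growth of detrended fluctuations across window sizes. A compact sketch is below, run on white noise (for which the scaling exponent alpha should come out near 0.5; healthy heartbeat series show alpha near 1, and this long-range structure degrades in disease). The box sizes and series length are illustrative choices.

```python
import numpy as np

# Compact detrended fluctuation analysis (DFA) sketch.
def dfa_alpha(x, box_sizes):
    y = np.cumsum(x - np.mean(x))            # integrate the series
    fluctuations = []
    for n in box_sizes:
        n_boxes = len(y) // n
        t = np.arange(n)
        f2 = 0.0
        for b in range(n_boxes):
            seg = y[b * n:(b + 1) * n]
            coef = np.polyfit(t, seg, 1)     # local linear trend
            f2 += np.mean((seg - np.polyval(coef, t)) ** 2)
        fluctuations.append(np.sqrt(f2 / n_boxes))
    # Scaling exponent alpha = slope of log F(n) versus log n.
    slope, _ = np.polyfit(np.log(box_sizes), np.log(fluctuations), 1)
    return slope

rng = np.random.default_rng(0)
alpha = dfa_alpha(rng.normal(size=4000), box_sizes=[4, 8, 16, 32, 64])
```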
Modeling and simulation of consumer response to dynamic pricing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valenzuela, J.; Thimmapuram, P.; Kim, J
2012-08-01
Assessing the impacts of dynamic pricing under the smart grid concept is becoming extremely important for deciding on its full deployment. In this paper, we develop a model that represents the response of consumers to dynamic pricing. In the model, consumers use forecasted day-ahead prices to shift daily energy consumption from hours when the price is expected to be high to hours when the price is expected to be low, while keeping total energy consumption unchanged. We integrate the consumer response model into the Electricity Market Complex Adaptive System (EMCAS), an agent-based model that simulates restructured electricity markets. We explore the impacts of dynamic pricing on price spikes, peak demand, consumer energy bills, power supplier profits, and congestion costs. A simulation of an 11-node test network that includes eight generation companies and five aggregated consumers is performed for a period of 1 month. In addition, we simulate the Korean power system.
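The consumer-response rule described here (shift load from expensive to cheap forecast hours, total energy unchanged) can be sketched minimally as follows. The prices, loads, and the 20% shiftable fraction are illustrative assumptions, not EMCAS parameters.

```python
# Shift a fraction of load from the priciest forecast hour to the cheapest
# one while conserving total daily energy.

def shift_load(load, prices, fraction=0.2):
    """Move `fraction` of the load in the highest-price hour to the
    lowest-price hour; total consumption is unchanged."""
    load = list(load)
    hi = max(range(len(prices)), key=prices.__getitem__)
    lo = min(range(len(prices)), key=prices.__getitem__)
    moved = fraction * load[hi]
    load[hi] -= moved
    load[lo] += moved
    return load

prices = [20, 18, 25, 60, 90, 40]     # $/MWh day-ahead forecast (hypothetical)
load = [5, 5, 6, 8, 9, 7]             # MWh per hour (hypothetical)
new_load = shift_load(load, prices)
```

Under this rule the consumer's bill falls while total energy is conserved, which is exactly the behavioral response whose system-level effects (price spikes, peak demand, congestion costs) the paper then simulates.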
Investment in generation is heavy, but important needs remain
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maize, K.
2007-01-15
Forecasting the direction of the US electric power industry for 2007, much less the distant future, is like defining a velocity vector; doing so requires a direction and speed to delineate progress. In this special report, the paper looks at current industry indicators and draws conclusions based on more than 100 years of experience. To borrow verbatim the title of basketball legend Charles Barkley's book, 'I may be wrong but I doubt it'. The forecast takes into consideration USDOE's National Electric Transmission Congestion Study (August 2006), a summary of industry data prepared by Industrial Info Resources (IIR), and NERC's 2006 Long-Term Reliability Assessment (October 2006). It also reports opinions of industry specialists. 3 figs., 4 tabs.
Signalling and obfuscation for congestion control
NASA Astrophysics Data System (ADS)
Mareček, Jakub; Shorten, Robert; Yu, Jia Yuan
2015-10-01
We aim to reduce the social cost of congestion in many smart city applications. In our model of congestion, agents interact over limited resources after receiving signals from a central agent that observes the state of congestion in real time. Under natural models of agent populations, we develop new signalling schemes and show that by introducing a non-trivial amount of uncertainty in the signals, we reduce the social cost of congestion, i.e., improve social welfare. The signalling schemes are efficient in terms of both communication and computation, and are consistent with past observations of the congestion. Moreover, the resulting population dynamics converge under reasonable assumptions.
NASA Technical Reports Server (NTRS)
Peters, Mark; Boisvert, Ben; Escala, Diego
2009-01-01
Explicit integration of aviation weather forecasts with the National Airspace System (NAS) structure is needed to improve the development and execution of operationally effective weather impact mitigation plans and has become increasingly important due to NAS congestion and associated increases in delay. This article considers several contemporary weather-air traffic management (ATM) integration applications: the use of probabilistic forecasts of visibility at San Francisco, the Route Availability Planning Tool to facilitate departures from the New York airports during thunderstorms, the estimation of en route capacity in convective weather, and the application of mixed-integer optimization techniques to air traffic management when the en route and terminal capacities are varying with time because of convective weather impacts. Our operational experience at San Francisco and New York, coupled with very promising initial results of traffic flow optimizations, suggests that weather-ATM integrated systems warrant significant research and development investment. However, they will need to be refined through rapid prototyping at facilities with supportive operational users. We have discussed key elements of an emerging aviation weather research area: the explicit integration of aviation weather forecasts with NAS structure to improve the effectiveness and timeliness of weather impact mitigation plans. Our insights are based on operational experiences with Lincoln Laboratory-developed integrated weather sensing and processing systems, and derivative early prototypes of explicit ATM decision support tools such as the RAPT in New York City.
The technical components of this effort involve improving meteorological forecast skill, tailoring the forecast outputs to the problem of estimating airspace impacts, developing models to quantify airspace impacts, and prototyping automated tools that assist in the development of objective broad-area ATM strategies, given probabilistic weather forecasts. Lincoln Laboratory studies and prototype demonstrations in this area are helping to define the weather-assimilated decision-making system that is envisioned as a key capability for the multi-agency Next Generation Air Transportation System [1]. The Laboratory's work in this area has involved continuing, operations-based evolution of both weather forecasts and models for weather impacts on the NAS. Our experience has been that the development of usable ATM technologies that address weather impacts must proceed via rapid prototyping at facilities whose users are highly motivated to participate in system evolution.
A research of the community’s opening to the outside world
NASA Astrophysics Data System (ADS)
Xu, Lan; Liu, Xiangzhuo
2017-03-01
Closed residential areas, known as gated communities, fragment the traffic network and result in various degrees of traffic congestion through severed links, dead ends, and T-shaped roads. In order to reveal the mechanism of this congestion, establish an effective evaluation index system, and provide a theoretical basis for the study of traffic congestion, we have researched the factors behind traffic congestion and established a scientific evaluation index system combining experience from home and abroad, based on domestic congestion conditions. Firstly, we take the traffic network as the entry point and establish an evaluation model of road capacity with an AHP-based index system. Secondly, we divide the condition of urban congestion into 5 levels, from congested to smooth. In addition, with VISSIM software, simulations of traffic capacity before and after community opening are carried out. Finally, we put forward reasonable suggestions based on the combination of the models and reality.
NASA Astrophysics Data System (ADS)
Tone, Tetsuya; Kohara, Kazuhiro
We have investigated ways to reduce congestion in a theme park with multi-agents. We constructed a theme park model called Digital Park 1.0 with twenty-three attractions similar in form to Tokyo Disney Sea. We consider not only congestion information (the number of visitors standing in line at each attraction) but also the advantage of a priority boarding pass, like the Fast Pass used at Tokyo Disney Sea. The congestion-information-usage ratio, which reflects the ratio of visitors who behave according to congestion information, was varied from 0% to 100% in both models, with and without the priority boarding pass. The "mean stay time of visitors" is a measure of satisfaction: the smaller the mean stay time, the greater the satisfaction. Here, a short stay time means a short wait time. The results of each simulation are averaged over ten trials. The main results are as follows. (1) As the congestion-information-usage ratio increased, the mean stay time decreased. When 20% of visitors behaved according to congestion information, the mean stay time was reduced by 30%. (2) A priority boarding pass reduced congestion, and mean stay time was reduced by 15%. (3) When visitors used both congestion information and a priority boarding pass, mean stay time was further reduced. When the congestion-information-usage ratio was 20%, mean stay time was reduced by 35%. (4) When the congestion-information-usage ratio was over 50%, the congestion reduction effects reached saturation.
Robust optimization-based DC optimal power flow for managing wind generation uncertainty
NASA Astrophysics Data System (ADS)
Boonchuay, Chanwit; Tomsovic, Kevin; Li, Fangxing; Ongsakul, Weerakorn
2012-11-01
Integrating wind generation into the wider grid causes a number of challenges to traditional power system operation. Given the relatively large wind forecast errors, congestion management tools based on optimal power flow (OPF) need to be improved. In this paper, a robust optimization (RO)-based DCOPF is proposed to determine the optimal generation dispatch and locational marginal prices (LMPs) for a day-ahead competitive electricity market considering the risk of dispatch cost variation. The basic concept is to use the dispatch to hedge against the possibility of reduced or increased wind generation. The proposed RO-based DCOPF is compared with a stochastic non-linear programming (SNP) approach on a modified PJM 5-bus system. Primary test results show that the proposed DCOPF model can provide lower dispatch cost than the SNP approach.
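The hedging idea in this abstract (schedule dispatch against the worst wind realization in a forecast interval) can be shown on a single-bus toy problem. This is an editorial sketch, not the paper's network DCOPF; the cost coefficients and the interval endpoints are assumptions, and the full model would optimize over line flows and produce LMPs.

```python
# Toy robust dispatch: minimize worst-case cost over a wind interval.

def worst_case_cost(g, demand, w_lo, w_hi, c_gen=30.0, c_reserve=120.0):
    """Cost of scheduling conventional generation g against the worst wind
    outcome: shortfalls are covered by expensive reserves, surplus wind is
    curtailed at no cost. Checking both interval endpoints suffices here."""
    costs = []
    for w in (w_lo, w_hi):
        shortfall = max(0.0, demand - g - w)
        costs.append(c_gen * g + c_reserve * shortfall)
    return max(costs)

def robust_dispatch(demand, w_lo, w_hi, g_max=100.0, steps=1000):
    """Grid-search the schedule minimizing the worst-case cost."""
    best = min((worst_case_cost(g_max * i / steps, demand, w_lo, w_hi),
                g_max * i / steps) for i in range(steps + 1))
    return best[1]

g_star = robust_dispatch(demand=80.0, w_lo=10.0, w_hi=40.0)
```

Because reserves are pricier than scheduled generation, the robust schedule covers the low-wind endpoint (here, 80 − 10 = 70 units), illustrating how the dispatch itself hedges against reduced wind.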
Modeling the effect of accessibility and congestion in location choice.
DOT National Transportation Integrated Search
2012-12-01
This study explores the relationship between accessibility and congestion, and their impacts on property values. Three research questions are addressed: (1) What is the relation between accessibility and congestion both regional and neighborhood leve...
Fair and efficient network congestion control based on minority game
NASA Astrophysics Data System (ADS)
Wang, Zuxi; Wang, Wen; Hu, Hanping; Deng, Zhaozhang
2011-12-01
Low link utilization, RTT unfairness, and unfairness in multi-bottleneck networks are problems common to most present network congestion control algorithms. Through an analogy between network congestion control and the "El Farol Bar" problem, we establish a congestion control model based on the minority game (MG) and then present a novel network congestion control algorithm based on the model. Simulation results indicate that the proposed algorithm achieves link utilization close to 100%, a zero packet loss rate, and small queue sizes. In addition, the RTT unfairness and the unfairness of multi-bottleneck networks can be resolved, achieving max-min fairness in multi-bottleneck networks while efficiently weakening the "ping-pong" oscillation caused by overall synchronization.
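For readers unfamiliar with the minority game, the following heavily simplified sketch shows the core mechanism: agents choose between two resources, those on the less-crowded (minority) side win, and losers occasionally switch. This is an editorial toy, not the paper's strategy-scoring MG algorithm; the population size and switching probability are assumptions.

```python
import random

# Minority-game-flavored load balancing between two links.
random.seed(1)
N = 101                                   # odd, so a strict minority exists
choices = [random.randint(0, 1) for _ in range(N)]
history = []

for step in range(300):
    count1 = sum(choices)
    minority = 1 if count1 < N - count1 else 0
    history.append(count1)
    for i in range(N):
        # Agents on the majority (congested) side switch with probability 0.1.
        if choices[i] != minority and random.random() < 0.1:
            choices[i] = 1 - choices[i]

# Average late-time imbalance around the even split N/2.
late = history[150:]
imbalance = sum(abs(c - N / 2) for c in late) / len(late)
```

The population self-organizes near an even split, the game-theoretic analogue of sources sharing a bottleneck fairly without central coordination.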
Impact of congestion on bus operations and costs.
DOT National Transportation Integrated Search
2003-10-01
Traffic congestion in Northern New Jersey imposes substantial operational and monetary penalty on bus service. The purpose of this project was to quantify the additional time and costs due to traffic congestion. A regression model was developed that ...
Continuum modeling of cooperative traffic flow dynamics
NASA Astrophysics Data System (ADS)
Ngoduy, D.; Hoogendoorn, S. P.; Liu, R.
2009-07-01
This paper presents a continuum approach to model the dynamics of cooperative traffic flow. Cooperation is defined in our model in the sense that an equipped vehicle can issue and receive a warning message when there is downstream congestion. Upon receiving the warning message, the (upstream) equipped vehicle will adapt its current desired speed to the speed at the congested area in order to avoid sharp deceleration when approaching the congestion. To model the dynamics of such cooperative systems, a multi-class gas-kinetic theory is extended to capture the adaptation of the desired speed of the equipped vehicle to the speed of the downstream congested traffic. Numerical simulations are carried out to show the influence of the penetration rate of equipped vehicles on traffic flow stability and capacity in a freeway.
NASA Technical Reports Server (NTRS)
Rodionova, Olga; Sridhar, Banavar; Ng, Hok K.
2016-01-01
Air traffic in the North Atlantic oceanic airspace (NAT) experiences very strong winds caused by jet streams. Flying wind-optimal trajectories increases individual flight efficiency, which is advantageous when operating in the NAT. However, as the NAT is highly congested during peak hours, a large number of potential conflicts between flights are detected for the sets of wind-optimal trajectories. Conflict resolution performed at the strategic level of flight planning can significantly reduce the airspace congestion. However, being completed far in advance, strategic planning can only use predicted environmental conditions that may significantly differ from the real conditions experienced further by aircraft. The forecast uncertainties result in uncertainties in conflict prediction, and thus, conflict resolution becomes less efficient. This work considers wind uncertainties in order to improve the robustness of conflict resolution in the NAT. First, the influence of wind uncertainties on conflict prediction is investigated. Then, conflict resolution methods accounting for wind uncertainties are proposed.
Modeling truck traffic volume growth congestion.
DOT National Transportation Integrated Search
2009-05-01
Modeling of the statewide transportation system is an important element in understanding issues and programming of funds to thwart potential congestion. As Alabama grows its manufacturing economy, the number of heavy vehicles traversing its highways ...
Large-scale transportation network congestion evolution prediction using deep learning theory.
Ma, Xiaolei; Yu, Haiyang; Wang, Yunpeng; Wang, Yinhai
2015-01-01
Understanding how congestion at one location can cause ripples throughout a large-scale transportation network is vital for transportation researchers and practitioners to pinpoint traffic bottlenecks for congestion mitigation. Traditional studies rely on either mathematical equations or simulation techniques to model traffic congestion dynamics. However, most of these approaches have limitations, largely due to unrealistic assumptions and cumbersome parameter calibration processes. With the development of Intelligent Transportation Systems (ITS) and the Internet of Things (IoT), transportation data are becoming more and more ubiquitous. This has triggered a series of data-driven research efforts to investigate transportation phenomena. Among them, deep learning theory is considered one of the most promising techniques to tackle tremendous high-dimensional data. This study attempts to extend deep learning theory into large-scale transportation network analysis. A deep Restricted Boltzmann Machine and Recurrent Neural Network architecture is utilized to model and predict traffic congestion evolution based on Global Positioning System (GPS) data from taxis. A numerical study in Ningbo, China is conducted to validate the effectiveness and efficiency of the proposed method. Results show that the prediction accuracy can reach as high as 88% within less than 6 minutes when the model is implemented in a Graphic Processing Unit (GPU)-based parallel computing environment. The predicted congestion evolution patterns can be visualized temporally and spatially through a map-based platform to identify the vulnerable links for proactive congestion mitigation.
Characterizing the tradeoffs and costs associated with transportation congestion in supply chains.
DOT National Transportation Integrated Search
2010-01-21
We consider distribution and location-planning models for supply chains that explicitly : account for traffic congestion effects. The majority of facility location and transportation : planning models in the operations research literature consider fa...
A one-model approach based on relaxed combinations of inputs for evaluating input congestion in DEA
NASA Astrophysics Data System (ADS)
Khodabakhshi, Mohammad
2009-08-01
This paper provides a one-model approach to input congestion based on the input relaxation model developed in data envelopment analysis (e.g. [G.R. Jahanshahloo, M. Khodabakhshi, Suitable combination of inputs for improving outputs in DEA with determining input congestion -- Considering textile industry of China, Applied Mathematics and Computation (1) (2004) 263-273; G.R. Jahanshahloo, M. Khodabakhshi, Determining assurance interval for non-Archimedean element in improving outputs model in DEA, Applied Mathematics and Computation 151 (2) (2004) 501-506; M. Khodabakhshi, A super-efficiency model based on improved outputs in data envelopment analysis, Applied Mathematics and Computation 184 (2) (2007) 695-703; M. Khodabakhshi, M. Asgharian, An input relaxation measure of efficiency in stochastic data analysis, Applied Mathematical Modelling 33 (2009) 2010-2023]). This approach reduces the three problems solved under the two-model approach introduced in the first of the above-mentioned references to two problems, which is certainly important from a computational point of view. The model is applied to a set of data extracted from the ISI database to estimate the input congestion of 12 Canadian business schools.
Autonomous Congestion Control in Delay-Tolerant Networks
NASA Technical Reports Server (NTRS)
Burleigh, Scott; Jennings, Esther; Schoolcraft, Joshua
2006-01-01
Congestion control is an important feature that directly affects network performance. Network congestion may cause loss of data or long delays. Although this problem has been studied extensively in the Internet, the solutions for Internet congestion control do not apply readily to challenged network environments such as Delay Tolerant Networks (DTN) where end-to-end connectivity may not exist continuously and latency can be high. In DTN, end-to-end rate control is not feasible. This calls for congestion control mechanisms where the decisions can be made autonomously with local information only. We use an economic pricing model and propose a rule-based congestion control mechanism where each router can autonomously decide on whether to accept a bundle (data) based on local information such as available storage and the value and risk of accepting the bundle (derived from historical statistics). Preliminary experimental results show that this congestion control mechanism can protect routers from resource depletion without loss of data.
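The local, rule-based acceptance decision described here can be sketched in a few lines. The value/risk quantities, the storage figure, and the congestion weighting are illustrative assumptions, not the authors' actual economic pricing formulas.

```python
# Autonomous bundle-acceptance rule for a DTN router: decide from local
# information only (free storage, estimated value and risk of the bundle).

TOTAL_STORAGE = 1000.0   # router's total bundle storage (hypothetical units)

def accept_bundle(size, free_storage, value, risk, congestion_weight=2.0):
    """Reject bundles that do not fit; otherwise accept only if the bundle's
    estimated value exceeds a risk-based cost that grows as storage fills,
    protecting the router from resource depletion."""
    if size > free_storage:
        return False
    occupancy = 1.0 - free_storage / TOTAL_STORAGE
    cost = risk * (1.0 + congestion_weight * occupancy)
    return value > cost

ok_when_empty = accept_bundle(size=100, free_storage=900, value=1.0, risk=0.5)
ok_when_full = accept_bundle(size=100, free_storage=100, value=1.0, risk=0.5)
```

The same bundle is accepted by a lightly loaded router but refused by a nearly full one: admission becomes stricter as local congestion grows, with no end-to-end coordination required.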
Congestion Pricing for Aircraft Pushback Slot Allocation.
Liu, Lihua; Zhang, Yaping; Liu, Lan; Xing, Zhiwei
2017-01-01
In order to optimize aircraft pushback management during rush hour, aircraft pushback slot allocation based on congestion pricing is explored while considering monetary compensation based on the quality of the surface operations. First, the concept of the "external cost of surface congestion" is proposed, and a quantitative study on the external cost is performed. Then, an aircraft pushback slot allocation model for minimizing the total surface cost is established. An improved discrete differential evolution algorithm is also designed. Finally, a simulation is performed for Xinzheng International Airport using the proposed model. By comparing the pushback slot control strategy based on congestion pricing with other strategies, the advantages of the proposed model and algorithm are highlighted. In addition to reducing delays and optimizing the delay distribution, the model and algorithm are better suited to actual aircraft pushback management during rush hour. Further, it is also observed that they do not result in significant increases in the surface cost. These results confirm the effectiveness and suitability of the proposed model and algorithm.
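A discrete differential-evolution search of the kind mentioned above can be sketched as follows. The surface-cost function (delay plus a quadratic congestion penalty for shared slots) and all parameters are assumptions for illustration; the paper's improved algorithm and cost structure are not reproduced here.

```python
import random

# Illustrative sketch of a discrete differential-evolution search for pushback
# slots. The cost function and parameters are assumptions, not the paper's.

def surface_cost(slots, requested):
    # Delay cost: how far each flight is pushed past its requested slot,
    # plus a quadratic congestion cost when flights share a slot.
    delay = sum(max(0, s - r) for s, r in zip(slots, requested))
    congestion = sum(slots.count(t) ** 2 for t in set(slots))
    return delay + congestion

def de_slots(requested, n_slots, pop=20, gens=200, f=0.8, cr=0.9, seed=1):
    rng = random.Random(seed)
    n = len(requested)
    pop_x = [[rng.randrange(n_slots) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        for i in range(pop):
            a, b, c = rng.sample([x for j, x in enumerate(pop_x) if j != i], 3)
            # Mutate and crossover, then round/clip back onto the slot grid.
            trial = [
                max(0, min(n_slots - 1, round(a[k] + f * (b[k] - c[k]))))
                if rng.random() < cr else pop_x[i][k]
                for k in range(n)
            ]
            if surface_cost(trial, requested) <= surface_cost(pop_x[i], requested):
                pop_x[i] = trial
    return min(pop_x, key=lambda x: surface_cost(x, requested))

best = de_slots(requested=[0, 0, 0, 1, 1, 2], n_slots=6)
print(best, surface_cost(best, [0, 0, 0, 1, 1, 2]))
```

Rounding and clipping the mutant vector is a common way to adapt the continuous DE operator to a discrete slot grid; the paper's "improved" variant presumably does something more sophisticated.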
NASA Astrophysics Data System (ADS)
Mecikalski, John; Smith, Tracy; Weygandt, Stephen
2014-05-01
Latent heating profiles derived from GOES satellite-based cloud-top cooling rates are being assimilated into a retrospective version of the Rapid Refresh system (RAP) being run at the Global Systems Division. Assimilation of these data may help reduce the time lag for convection initiation (CI) in both the RAP model forecasts and in 3-km High Resolution Rapid Refresh (HRRR) model runs that are initialized off of the RAP model grids. These data may also improve both the location and organization of developing convective storm clusters, especially in the nested HRRR runs. These types of improvements are critical for providing better convective storm guidance around busy hub airports and aviation corridor routes, especially in the highly congested Ohio Valley - Northeast - Mid-Atlantic region. Additional work is focusing on assimilating GOES-R CI algorithm cloud-top cooling-based latent heating profiles directly into the HRRR model. Because of the small-scale nature of the convective phenomena depicted in the cloud-top cooling rate data (on the order of 1-4 km scale), direct assimilation of these data in the HRRR may be more effective than assimilation in the RAP. The RAP is an hourly assimilation system developed at NOAA/ESRL and was implemented at NCEP as a NOAA operational model in May 2012. The 3-km HRRR runs hourly out to 15 hours as a nest within the ESRL real-time experimental RAP. The RAP and HRRR both use the WRF ARW model core, and the Gridpoint Statistical Interpolation (GSI) is used within an hourly cycle to assimilate a wide variety of observations (including radar data) to initialize the RAP. Within this modeling framework, the cloud-top cooling rate-based latent heating profiles are applied as prescribed heating during the diabatic forward model integration part of the RAP digital filter initialization (DFI). 
No digital filtering is applied on the 3-km HRRR grid, but similar forward model integration with prescribed heating is used to assimilate information from radar reflectivity, lightning flash density and the satellite based cloud-top cooling rate data. In the current HRRR configuration, 4 15-min cycles of latent heating are applied during a pre-forecast hour of integration. This is followed by a final application of GSI at 3-km to fit the latest conventional observation data. At the conference, results from a 5-day retrospective period (July 5-10, 2012) will be shown, focusing on assessment of data impact for both the RAP and HRRR, as well as the sensitivity to various assimilation parameters, including assumed heating strength. Emphasis will be given to documenting the forecast impacts for aviation applications in the Eastern U.S.
Distributed Aviation Concepts and Technologies
NASA Technical Reports Server (NTRS)
Moore, Mark D.
2008-01-01
Aviation has experienced one hundred years of evolution, resulting in the current air transportation system dominated by commercial airliners in a hub and spoke infrastructure. While the first fifty years involved disruptive technologies that required frequent vehicle adaptation, the second fifty years produced a stable evolutionary optimization of decreasing costs with increasing safety. This optimization has resulted in traits favoring a centralized service model with high vehicle productivity and cost efficiency. However, it may also have resulted in a system that is not sufficiently robust to withstand significant system disturbances. Aviation is currently facing rapid change from issues such as environmental damage, terrorism threat, congestion and capacity limitations, and cost of energy. Currently, these issues are leading to a loss of service for weaker spoke markets. These catalysts and a lack of robustness could result in a loss of service for much larger portions of the aviation market. The impact of other competing transportation services may be equally important as causal factors of change. Highway system forecasts indicate a dramatic slowdown as congestion reaches a point of non-linearly increasing delay. In the next twenty-five years, there is the potential for aviation to transform itself into a more robust, scalable, adaptive, secure, safe, affordable, convenient, efficient and environmentally friendly system. To achieve these characteristics, the new system will likely be based on a distributed model that enables more direct services. Short range travel is already demonstrating itself to be inefficient with a centralized model, providing opportunities for emergent distributed services through air-taxi models. Technologies from the on-demand revolution in computers and communications are now available as major drivers for aviation on-demand adaptation.
Other technologies such as electric propulsion are currently transforming the automobile industry, and will also significantly alter the functionality of future distributed aviation concepts. Many hurdles exist, including technology, regulation, and perception. Aviation has an inherent governmental role not present in other recent on-demand transformations, which may pose a risk of curtailing aviation democratization.
The Influence of Individual Driver Characteristics on Congestion Formation
NASA Astrophysics Data System (ADS)
Wang, Lanjun; Zhang, Hao; Meng, Huadong; Wang, Xiqin
Previous works have pointed out that one of the reasons for the formation of traffic congestion is instability in traffic flow. In this study, we investigate theoretically how the characteristics of individual drivers influence the instability of traffic flow. The discussions are based on the optimal velocity model, which has three parameters related to individual driver characteristics. We specify the mappings between the model parameters and driver characteristics in this study. With linear stability analysis, we obtain a condition for when instability occurs and a constraint on how the model parameters influence the unstable traffic flow. Meanwhile, we also determine how the region of unstable flow densities depends on these parameters. Additionally, the Langevin approach theoretically validates that under the constraint, the macroscopic characteristics of the unstable traffic flow become a mixture of free flow and congestion. All of these results imply that both overly aggressive and overly conservative drivers are capable of triggering traffic congestion.
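The optimal velocity model discussed above has a compact form that is easy to simulate. The sketch below uses the standard Bando-type tanh optimal-velocity function and a sensitivity parameter a as illustrative assumptions; they are not necessarily the paper's parameterization.

```python
import math

# Minimal numerical sketch of the optimal velocity (OV) car-following model:
# dv_i/dt = a * (V(headway_i) - v_i), cars on a ring road.
# The tanh-shaped V(h) and the value of a are standard illustrative choices.

def ov(headway):
    # Optimal velocity function: tanh-shaped, saturating at large headway.
    return math.tanh(headway - 2.0) + math.tanh(2.0)

def simulate(n_cars=20, road=50.0, a=2.0, steps=2000, dt=0.05, kick=0.1):
    x = [road / n_cars * i for i in range(n_cars)]   # positions on a ring road
    v = [ov(road / n_cars)] * n_cars                 # start in uniform flow
    x[0] += kick                                     # small perturbation
    for _ in range(steps):
        acc = []
        for i in range(n_cars):
            headway = (x[(i + 1) % n_cars] - x[i]) % road
            acc.append(a * (ov(headway) - v[i]))     # dv/dt = a (V(h) - v)
        for i in range(n_cars):
            v[i] += acc[i] * dt
            x[i] = (x[i] + v[i] * dt) % road
    return v

v = simulate()
spread = max(v) - min(v)
print("velocity spread after perturbation:", round(spread, 3))
```

The linear stability condition quoted in the literature for this model is V'(h) < a/2 at the uniform headway: sensitive drivers (large a) damp the perturbation, while sluggish ones let it grow into stop-and-go waves, matching the abstract's point that driver characteristics govern instability.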
NASA Technical Reports Server (NTRS)
Arnaout, Georges M.; Bowling, Shannon R.
2011-01-01
Traffic congestion is an ongoing problem of great interest to researchers from different areas in academia. With the emerging technology for inter-vehicle communication, vehicles have the ability to exchange information with predecessors by wireless communication. In this paper, we present an agent-based model of traffic congestion and examine the impact of having CACC (Cooperative Adaptive Cruise Control) embedded vehicle(s) on a highway system consisting of 4 traffic lanes without overtaking. In our model, CACC vehicles adapt their acceleration/deceleration according to vehicle-to-vehicle inter-communication. We analyze the average speed of the cars, the shockwaves, and the evolution of traffic congestion throughout the lifecycle of the model. The study identifies how CACC vehicles affect the dynamics of traffic flow on a complex network and reduce the oscillatory behavior (stop and go) resulting from the acceleration/deceleration of the vehicles.
Analysis of Container Yard Capacity In North TPK Using ARIMA Method
NASA Astrophysics Data System (ADS)
Sirajuddin; Cut Gebrina Hisbach, M.; Ekawati, Ratna; Ade Irman, SM
2018-03-01
The north container terminal, known as North TPK, is a container terminal located in the Indonesia Port Corporation area that serves domestic container loading and unloading. It has 1006 ground slots with a total capacity of 5,544 TEUs, and the maximum throughput of containers is 539,616 TEUs/year. Container throughput in the North TPK is increasing year by year. In 2011-2012, the North TPK container throughput was 165,080 TEUs/year, and in 2015-2016 it reached 213,147 TEUs/year. To avoid congestion and prevent possible losses in the future, this paper analyzes the flow of containers and the level of Yard Occupation Ratio in the North TPK at Tanjung Priok Port. The method used is the Autoregressive Integrated Moving Average (ARIMA) model. ARIMA is a model that ignores independent variables entirely when forecasting. ARIMA results show that in 2016-2017 the total throughput of containers reached 234,006 TEUs/year with a field effectiveness of 43.4%, and in 2017-2018 the total throughput of containers reached 249,417 TEUs/year with a field effectiveness of 46.2%.
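The simplest member of the ARIMA family the study refers to can be sketched without any forecasting library: an AR(1) on first differences, i.e. ARIMA(1,1,0), fitted by ordinary least squares. In the throughput series below only the first and last figures come from the abstract; the intermediate values are illustrative interpolations, not reported data.

```python
# Stdlib-only sketch of an ARIMA(1,1,0) forecast: difference the series,
# regress each difference on the previous one, then roll the model forward.

def arima_110_forecast(series, horizon=1):
    # First-difference the series, then regress d_t on d_{t-1} by OLS.
    d = [b - a for a, b in zip(series, series[1:])]
    x, y = d[:-1], d[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    var = sum((xi - mx) ** 2 for xi in x)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    phi = cov / var if var else 0.0
    c = my - phi * mx
    level, last_d = series[-1], d[-1]
    out = []
    for _ in range(horizon):
        last_d = c + phi * last_d   # forecast next difference
        level += last_d             # integrate back to the level
        out.append(level)
    return out

# Annual throughput, TEUs/year: endpoints from the abstract, middle values
# interpolated purely for illustration.
throughput = [165080, 178000, 190500, 201800, 213147]
print(arima_110_forecast(throughput, horizon=2))
```

A production analysis would of course use a full ARIMA implementation (e.g. statsmodels) with model-order selection; this sketch only shows the difference-then-autoregress mechanics.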
Packet Traffic Dynamics Near Onset of Congestion in Data Communication Network Model
NASA Astrophysics Data System (ADS)
Lawniczak, A. T.; Tang, X.
2006-05-01
The dominant technology of data communication networks is the Packet Switching Network (PSN). It is a complex technology organized as various hierarchical layers according to the International Standard Organization (ISO) Open Systems Interconnect (OSI) Reference Model. The Network Layer of the ISO OSI Reference Model is responsible for delivering packets from their sources to their destinations and for dealing with congestion if it arises in a network. Thus, we focus on this layer and present an abstraction of the Network Layer of the ISO OSI Reference Model. Using this abstraction we investigate how the onset of traffic congestion is affected, for various routing algorithms, by changes in network connection topology. We study how aggregate measures of network performance depend on network connection topology and routing. We explore packet traffic spatio-temporal dynamics near the phase transition point from free flow to congestion for various network connection topologies and routing algorithms. We consider static and adaptive routing. We present selected simulation results.
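The static-versus-adaptive routing contrast studied above can be illustrated concretely: static routing weighs links by hop count alone, while adaptive routing adds the current queue length at the next node. The topology and queue values below are illustrative assumptions.

```python
import heapq

# Sketch of static vs adaptive routing: link weight is either 1 (hop count)
# or 1 + queue length at the next node. Topology and queues are illustrative.

def shortest_path(adj, queues, src, dst, adaptive):
    dist = {src: 0}
    heap = [(0, src, [src])]
    while heap:
        d, u, path = heapq.heappop(heap)
        if u == dst:
            return path
        if d > dist.get(u, float("inf")):
            continue
        for v in adj[u]:
            w = 1 + (queues[v] if adaptive else 0)
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v, path + [v]))
    return None

adj = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
queues = {"A": 0, "B": 9, "C": 0, "D": 0}   # B is congested
print(shortest_path(adj, queues, "A", "D", adaptive=False))  # may pick the congested route
print(shortest_path(adj, queues, "A", "D", adaptive=True))   # routes around B
```

Steering packets around the hot node delays the onset of congestion, which is the kind of routing-dependent behaviour the abstract's simulations explore near the phase transition.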
Autonomous Congestion Control in Delay-Tolerant Networks
NASA Technical Reports Server (NTRS)
Burleigh, Scott C.; Jennings, Esther H.
2005-01-01
Congestion control is an important feature that directly affects network performance. Network congestion may cause loss of data or long delays. Although this problem has been studied extensively in the Internet, the solutions for Internet congestion control do not apply readily to challenged network environments such as Delay Tolerant Networks (DTN) where end-to-end connectivity may not exist continuously and latency can be high. In DTN, end-to-end rate control is not feasible. This calls for congestion control mechanisms where the decisions can be made autonomously with local information only. We use an economic pricing model and propose a rule-based congestion control mechanism where each router can autonomously decide on whether to accept a bundle (data) based on local information such as available storage and the value and risk of accepting the bundle (derived from historical statistics).
Quantum random walks on congested lattices and the effect of dephasing.
Motes, Keith R; Gilchrist, Alexei; Rohde, Peter P
2016-01-27
We consider quantum random walks on congested lattices and contrast them to classical random walks. Congestion is modelled on lattices that contain static defects which reverse the walker's direction. We implement a dephasing process after each step which allows us to smoothly interpolate between classical and quantum random walks as well as study the effect of dephasing on the quantum walk. Our key results show that a quantum walker escapes a finite boundary dramatically faster than a classical walker and that this advantage remains in the presence of heavily congested lattices.
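The quantum-versus-classical contrast described above can be reproduced in miniature. The sketch below runs a discrete-time Hadamard walk on an uncongested line rather than the paper's congested lattices, and implements dephasing as a probabilistic coin measurement after each step; this simplified variant (and all its parameters) is an assumption chosen only to show the interpolation between quantum (p = 0) and classical (p = 1) spreading.

```python
import random, math

# Pure-state Monte Carlo sketch of a discrete-time Hadamard walk on a line
# with dephasing: after each step, with probability p the coin is measured.
# p = 0 gives the quantum walk; p = 1 collapses it to a classical random walk.

def step(state):
    new = {}
    for (x, c), a in state.items():
        # Hadamard coin, then conditional shift: coin 0 moves left, 1 right.
        for c2, h in ((0, 1 / math.sqrt(2)), (1, (1 if c == 0 else -1) / math.sqrt(2))):
            key = (x - 1 if c2 == 0 else x + 1, c2)
            new[key] = new.get(key, 0) + h * a
    return new

def measure_coin(state, rng):
    p1 = sum(abs(a) ** 2 for (x, c), a in state.items() if c == 1)
    outcome = 1 if rng.random() < p1 else 0
    norm = math.sqrt(p1 if outcome == 1 else 1 - p1)
    return {k: a / norm for k, a in state.items() if k[1] == outcome}

def position_sd(p, steps=30, runs=100, seed=0):
    rng = random.Random(seed)
    dist = {}
    n_runs = runs if p > 0 else 1   # p = 0 is deterministic: one run suffices
    for _ in range(n_runs):
        state = {(0, 0): 1.0}
        for _ in range(steps):
            state = step(state)
            if p > 0 and rng.random() < p:
                state = measure_coin(state, rng)
        for (x, c), a in state.items():
            dist[x] = dist.get(x, 0) + abs(a) ** 2 / n_runs
    mean = sum(x * q for x, q in dist.items())
    return math.sqrt(sum(q * (x - mean) ** 2 for x, q in dist.items()))

print("quantum sd:", round(position_sd(0.0), 2))
print("dephased sd:", round(position_sd(1.0), 2))
```

The quantum walk spreads ballistically (standard deviation linear in the number of steps) while the fully dephased walk spreads diffusively, which is the speed advantage that, per the abstract, survives even on heavily congested lattices.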
NASA Technical Reports Server (NTRS)
Pels, Eric; Verhoef, Erik T.
2003-01-01
Conventional economic wisdom suggests that congestion pricing would be an appropriate response to cope with the growing congestion levels currently experienced at many airports. Several characteristics of aviation markets, however, may make naive congestion prices, set equal to the value of marginal travel delays, a non-optimal response. This paper has developed a model of airport pricing that captures a number of these features. The model in particular reflects that airlines typically have market power and are engaged in oligopolistic competition in different sub-markets, and that the part of the travel delays that aircraft impose that is internal to an operator should not be accounted for in congestion tolls. We presented an analytical treatment for a simple bi-nodal symmetric network, which through the use of 'hyper-networks' would be readily applicable to dynamic problems (in discrete time) such as peak versus off-peak differences, and some numerical exercises for the same symmetric network, which were only designed to illustrate the possible comparative static impacts of tolling, in addition to the marginal equilibrium conditions that could be derived for the general model specification. Some main conclusions are that second-best optimal tolls are typically lower than what would be suggested by congestion costs alone and may even be negative, and that the toll as derived by Brueckner (2002) may not lead to an increase in total welfare. While Brueckner (2002) has made clear that congestion tolls at airports may be smaller than expected when congestion costs among aircraft are internal to a firm, our analysis adds that a further downward adjustment may be in order due to market power. The presence of market power (which causes prices to exceed marginal costs) may cause the pure congestion toll to be suboptimal, because the resulting decrease in demand is too high (the pure congestion toll does not take into account the decrease in consumer surplus).
The various downward adjustments in welfare maximizing tolls may well cause the optimal values of these to be negative. Insofar as subsidization is considered unacceptable for whichever reason, our results warn that the most efficient among the non-negative tolls may actually be a zero toll; the pure congestion toll may actually decrease welfare compared to the base case. The model in this paper contains a few simplifying assumptions that may be relaxed in future work. Load factors and aircraft capacity are fixed in this model for simplicity. In a more advanced version of this model, load factors and aircraft capacity can be endogenized. This makes the derivation of the optimality conditions far more complicated, but it should be feasible in a numerical experiment. One can also add a fourth layer to the model, describing the airport's optimization problem. For example, the airport can maximize profits under a cost recovery constraint. The model then deals with interactions between four types of agents. No distinction is made between peak and off-peak traffic in this paper. Finally, the results of the numerical exercise in this paper need to be checked against an asymmetric equilibrium.
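The second-best argument above admits a compact numerical illustration: when the airline has market power, output is already below the social optimum, so adding a "pure" congestion toll can reduce welfare. The linear demand, delay cost, and all parameter values below are assumptions chosen only to reproduce that qualitative conclusion; this is not the paper's model.

```python
# Numerical sketch of the second-best result: with market power, a Pigouvian
# congestion toll can lower welfare relative to no toll. Illustrative numbers.

a, b, c, alpha = 100.0, 1.0, 10.0, 0.5  # demand p = a - b*q, delay cost alpha*q

def welfare(q):
    # Gross travel benefit minus total delay cost minus operating cost.
    return a * q - b * q * q / 2 - alpha * q * q - c * q

def monopoly_q(toll):
    # Users' generalized price is fare + alpha*q, so the airline faces an
    # effective demand slope (b + alpha) and sets marginal revenue = marginal
    # cost: q = (a - c - toll) / (2 * (b + alpha)).
    return max(0.0, (a - c - toll) / (2 * (b + alpha)))

q_first_best = (a - c) / (b + 2 * alpha)     # p = c + marginal congestion cost
pigou_toll = alpha * q_first_best            # external delay priced at the optimum

print("welfare, no toll:    ", welfare(monopoly_q(0.0)))
print("welfare, Pigou toll: ", welfare(monopoly_q(pigou_toll)))
```

With these numbers the monopolist already restricts output below the first best, and the toll restricts it further, cutting welfare; the most efficient non-negative toll here is zero, in line with the abstract's conclusion.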
Evaluation of the public health impacts of traffic congestion: a health risk assessment.
Levy, Jonathan I; Buonocore, Jonathan J; von Stackelberg, Katherine
2010-10-27
Traffic congestion is a significant issue in urban areas in the United States and around the world. Previous analyses have estimated the economic costs of congestion, related to fuel and time wasted, but few have quantified the public health impacts or determined how these impacts compare in magnitude to the economic costs. Moreover, the relative magnitudes of economic and public health impacts of congestion would be expected to vary significantly across urban areas, as a function of road infrastructure, population density, and atmospheric conditions influencing pollutant formation, but this variability has not been explored. In this study, we evaluate the public health impacts of ambient exposures to fine particulate matter (PM2.5) concentrations associated with a business-as-usual scenario of predicted traffic congestion. We evaluate 83 individual urban areas using traffic demand models to estimate the degree of congestion in each area from 2000 to 2030. We link traffic volume and speed data with the MOBILE6 model to characterize emissions of PM2.5 and particle precursors attributable to congestion, and we use a source-receptor matrix to evaluate the impact of these emissions on ambient PM2.5 concentrations. Marginal concentration changes are related to a concentration-response function for mortality, with a value of statistical life approach used to monetize the impacts. We estimate that the monetized value of PM2.5-related mortality attributable to congestion in these 83 cities in 2000 was approximately $31 billion (2007 dollars), as compared with a value of time and fuel wasted of $60 billion. In future years, the economic impacts grow (to over $100 billion in 2030) while the public health impacts decrease to $13 billion in 2020 before increasing to $17 billion in 2030, given increasing population and congestion but lower emissions per vehicle. 
Across cities and years, the public health impacts range from more than an order of magnitude below the economic impacts to in excess of them. Our analyses indicate that the public health impacts of congestion may be significant enough in magnitude, at least in some urban areas, to be considered in future evaluations of the benefits of policies to mitigate congestion.
Traffic jams induced by fluctuation of a leading car.
Nagatani, T
2000-04-01
We present a phase diagram of the different kinds of congested traffic triggered by fluctuation of a leading car in an open system without sources and sinks. Traffic states and density waves are investigated numerically by varying the amplitude of the fluctuation using a car-following model. The phase transitions among free traffic, oscillatory congested traffic, and homogeneous congested traffic occur by fluctuation of a leading car. As the amplitude of the fluctuation increases, the transition between free traffic and oscillatory traffic occurs at lower density and the transition between homogeneous congested traffic and oscillatory traffic occurs at higher density. The oscillatory congested traffic corresponds to the coexisting phase. Also, moving localized clusters appear just above the transition lines.
A Minimax Network Flow Model for Characterizing the Impact of Slot Restrictions
NASA Technical Reports Server (NTRS)
Lee, Douglas W.; Patek, Stephen D.; Alexandrov, Natalia; Bass, Ellen J.; Kincaid, Rex K.
2010-01-01
This paper proposes a model for evaluating long-term measures to reduce congestion at airports in the National Airspace System (NAS). This model is constructed with the goal of assessing the global impacts of congestion management strategies, specifically slot restrictions. We develop the Minimax Node Throughput Problem (MINNTHRU), a multicommodity network flow model that provides insight into air traffic patterns when one minimizes the worst-case operation across all airports in a given network. MINNTHRU is thus formulated as a model where congestion arises from network topology. It reflects not market-driven airline objectives, but those of a regulatory authority seeking a distribution of air traffic beneficial to all airports, in response to congestion management measures. After discussing an algorithm for solving MINNTHRU for moderate-sized (30 nodes) and larger networks, we use this model to study the impacts of slot restrictions on the operation of an entire hub-spoke airport network. For both a small example network and a medium-sized network based on 30 airports in the NAS, we use MINNTHRU to demonstrate that increasing the severity of slot restrictions increases the traffic around unconstrained hub airports as well as the worst-case level of operation over all airports.
Quantum random walks on congested lattices and the effect of dephasing
Motes, Keith R.; Gilchrist, Alexei; Rohde, Peter P.
2016-01-01
We consider quantum random walks on congested lattices and contrast them to classical random walks. Congestion is modelled on lattices that contain static defects which reverse the walker’s direction. We implement a dephasing process after each step which allows us to smoothly interpolate between classical and quantum random walks as well as study the effect of dephasing on the quantum walk. Our key results show that a quantum walker escapes a finite boundary dramatically faster than a classical walker and that this advantage remains in the presence of heavily congested lattices. PMID:26812924
Input-Output Modeling and Control of the Departure Process of Congested Airports
NASA Technical Reports Server (NTRS)
Pujet, Nicolas; Delcaire, Bertrand; Feron, Eric
2003-01-01
A simple queueing model of busy airport departure operations is proposed. This model is calibrated and validated using available runway configuration and traffic data. The model is then used to evaluate preliminary control schemes aimed at alleviating departure traffic congestion on the airport surface. The potential impact of these control strategies on direct operating costs, environmental costs and overall delay is quantified and discussed.
Chand, Sai; Dixit, Vinayak V
2018-03-01
The repercussions from congestion and accidents on major highways can have significant negative impacts on the economy and environment. It is a primary objective of transport authorities to minimize the likelihood of these phenomena taking place, to improve safety and overall network performance. In this study, we use the Hurst Exponent metric from Fractal Theory as a congestion indicator for crash-rate modeling. We analyze one month of traffic speed data at several monitor sites along the M4 motorway in Sydney, Australia and assess congestion patterns with the Hurst Exponent of speed (H_speed). Random Parameters and Latent Class Tobit models were estimated, to examine the effect of congestion on historical crash rates, while accounting for unobserved heterogeneity. Using a latent class modeling approach, the motorway sections were probabilistically classified into two segments, based on the presence of entry and exit ramps. This will allow transportation agencies to implement appropriate safety/traffic countermeasures when addressing accident hotspots or inadequately managed sections of motorway.
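The Hurst exponent used above as a congestion indicator can be estimated with the classic rescaled-range (R/S) method. The window sizes and the synthetic series below are illustrative assumptions; the study's exact estimator may differ.

```python
import math, random

# Sketch of estimating the Hurst exponent of a speed series by the classic
# rescaled-range (R/S) method: compute R/S over windows of several sizes and
# fit the slope of log(R/S) against log(window size).

def hurst_rs(series, window_sizes=(8, 16, 32, 64)):
    pts = []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(series) - n + 1, n):
            w = series[start:start + n]
            mean = sum(w) / n
            dev = [x - mean for x in w]
            cum, s = [], 0.0
            for dval in dev:
                s += dval
                cum.append(s)
            r = max(cum) - min(cum)               # range of cumulative deviations
            sd = math.sqrt(sum(d * d for d in dev) / n)
            if sd > 0:
                rs_values.append(r / sd)
        pts.append((math.log(n), math.log(sum(rs_values) / len(rs_values))))
    # Least-squares slope of log(R/S) vs log(n) estimates H.
    mx = sum(x for x, _ in pts) / len(pts)
    my = sum(y for _, y in pts) / len(pts)
    return sum((x - mx) * (y - my) for x, y in pts) / sum((x - mx) ** 2 for x, _ in pts)

rng = random.Random(42)
white = [rng.gauss(0, 1) for _ in range(512)]          # uncorrelated series: H near 0.5
trending = [sum(white[:i + 1]) for i in range(512)]    # integrated series: H near 1
print(round(hurst_rs(white), 2), round(hurst_rs(trending), 2))
```

Persistent, trending speed series (sustained congestion build-up) yield high H, while rapidly fluctuating speeds yield H near 0.5, which is what makes the exponent usable as a congestion descriptor in crash-rate models.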
Congested traffic states in empirical observations and microscopic simulations
NASA Astrophysics Data System (ADS)
Treiber, Martin; Hennecke, Ansgar; Helbing, Dirk
2000-08-01
We present data from several German freeways showing different kinds of congested traffic forming near road inhomogeneities, specifically lane closings, intersections, or uphill gradients. The states are localized or extended, homogeneous or oscillating. Combined states are observed as well, like the coexistence of moving localized clusters and clusters pinned at road inhomogeneities, or regions of oscillating congested traffic upstream of nearly homogeneous congested traffic. The experimental findings are consistent with a recently proposed theoretical phase diagram for traffic near on-ramps [D. Helbing, A. Hennecke, and M. Treiber, Phys. Rev. Lett. 82, 4360 (1999)]. We simulate these situations with a continuous microscopic single-lane model, the ``intelligent driver model,'' using empirical boundary conditions. All observations, including the coexistence of states, are qualitatively reproduced by describing inhomogeneities with local variations of one model parameter. We show that the results of the microscopic model can be understood by formulating the theoretical phase diagram for bottlenecks in a more general way. In particular, a local drop of the road capacity induced by parameter variations has essentially the same effect as an on-ramp.
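The "intelligent driver model" used in the simulations above has a compact closed form; the parameter values below are typical published choices for highway traffic, not necessarily the ones calibrated in the paper.

```python
import math

# The intelligent driver model (IDM):
#   dv/dt = a * [1 - (v/v0)^delta - (s*/s)^2],
#   s* = s0 + v*T + v*dv / (2*sqrt(a*b))
# Typical illustrative parameters (SI units):
V0, T, A, B, DELTA, S0 = 33.3, 1.6, 0.73, 1.67, 4, 2.0
# V0: desired speed (m/s), T: time headway (s), A/B: accel/decel scales
# (m/s^2), S0: minimum gap (m).

def idm_acceleration(v, gap, dv):
    """Acceleration of a car at speed v, with gap `gap` (m) to its leader,
    approaching at relative speed dv = v - v_leader (m/s)."""
    s_star = S0 + v * T + v * dv / (2 * math.sqrt(A * B))
    return A * (1 - (v / V0) ** DELTA - (s_star / gap) ** 2)

# Free road: large gap, matched speeds -> gentle acceleration toward V0.
print(idm_acceleration(v=20.0, gap=500.0, dv=0.0))
# Closing fast on a slow leader -> strong braking (negative acceleration).
print(idm_acceleration(v=20.0, gap=20.0, dv=10.0))
```

As the abstract notes, varying a single parameter locally (e.g. lowering V0 on an uphill stretch) suffices to create a bottleneck in this model, reproducing the observed congested states.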
Air pollution and health risks due to vehicle traffic.
Zhang, Kai; Batterman, Stuart
2013-04-15
Traffic congestion increases vehicle emissions and degrades ambient air quality, and recent studies have shown excess morbidity and mortality for drivers, commuters and individuals living near major roadways. Presently, our understanding of the air pollution impacts from congestion on roads is very limited. This study demonstrates an approach to characterize risks of traffic for on- and near-road populations. Simulation modeling was used to estimate on- and near-road NO2 concentrations and health risks for freeway and arterial scenarios attributable to traffic for different traffic volumes during rush hour periods. The modeling used emission factors from two different models (Comprehensive Modal Emissions Model and Motor Vehicle Emissions Factor Model version 6.2), an empirical traffic speed-volume relationship, the California Line Source Dispersion Model, an empirical NO2-NOx relationship, estimated travel time changes during congestion, and concentration-response relationships from the literature, which give emergency doctor visits, hospital admissions and mortality attributed to NO2 exposure. An incremental analysis, which expresses the change in health risks for small increases in traffic volume, showed non-linear effects. For a freeway, "U" shaped trends of incremental risks were predicted for on-road populations, and incremental risks are flat at low traffic volumes for near-road populations. For an arterial road, incremental risks increased sharply for both on- and near-road populations as traffic increased. These patterns result from changes in emission factors, the NO2-NOx relationship, the travel delay for the on-road population, and the extended duration of rush hour for the near-road population. This study suggests that health risks from congestion are potentially significant, and that additional traffic can significantly increase risks, depending on the type of road and other factors. 
Further, evaluations of risk associated with congestion must consider travel time, the duration of rush hour, congestion-specific emission estimates, and uncertainties.
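The non-linear incremental effects described above can be sketched by chaining a speed-volume relation to a speed-dependent emission factor. The common BPR functional form and all coefficients below are illustrative assumptions standing in for the study's empirical relationships and emission models.

```python
# Sketch of the incremental analysis: combine a BPR-style speed-volume
# relation with a stylized speed-dependent emission factor and look at the
# marginal emissions of one extra vehicle. All coefficients are illustrative.

def speed(volume, capacity=2000.0, free_flow=100.0):
    # BPR travel-time function converted to speed (km/h).
    return free_flow / (1 + 0.15 * (volume / capacity) ** 4)

def emission_factor(v):
    # Stylized g/veh-km curve: high when crawling, lowest at moderate speed.
    return 50.0 / v + 0.005 * v

def total_emissions(volume):
    return volume * emission_factor(speed(volume))

# Marginal emissions of one more vehicle at light, moderate and heavy traffic:
for q in (500, 1500, 2500):
    marginal = total_emissions(q + 1) - total_emissions(q)
    print(f"volume {q}: marginal emissions ~ {marginal:.2f} g/km")
```

At light traffic the extra vehicle adds roughly its own emissions, but under congestion it also slows everyone else, so the marginal burden grows sharply, which is the mechanism behind the non-linear incremental risks reported above.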
Air pollution and health risks due to vehicle traffic
Zhang, Kai; Batterman, Stuart
2014-01-01
Traffic congestion increases vehicle emissions and degrades ambient air quality, and recent studies have shown excess morbidity and mortality for drivers, commuters and individuals living near major roadways. Presently, our understanding of the air pollution impacts from congestion on roads is very limited. This study demonstrates an approach to characterize risks of traffic for on- and near-road populations. Simulation modeling was used to estimate on- and near-road NO2 concentrations and health risks for freeway and arterial scenarios attributable to traffic for different traffic volumes during rush hour periods. The modeling used emission factors from two different models (Comprehensive Modal Emissions Model and Motor Vehicle Emissions Factor Model version 6.2), an empirical traffic speed–volume relationship, the California Line Source Dispersion Model, an empirical NO2–NOx relationship, estimated travel time changes during congestion, and concentration–response relationships from the literature, which give emergency doctor visits, hospital admissions and mortality attributed to NO2 exposure. An incremental analysis, which expresses the change in health risks for small increases in traffic volume, showed non-linear effects. For a freeway, “U” shaped trends of incremental risks were predicted for on-road populations, and incremental risks are flat at low traffic volumes for near-road populations. For an arterial road, incremental risks increased sharply for both on- and near-road populations as traffic increased. These patterns result from changes in emission factors, the NO2–NOx relationship, the travel delay for the on-road population, and the extended duration of rush hour for the near-road population. This study suggests that health risks from congestion are potentially significant, and that additional traffic can significantly increase risks, depending on the type of road and other factors. 
Further, evaluations of risk associated with congestion must consider travel time, the duration of rush-hour, congestion-specific emission estimates, and uncertainties. PMID:23500830
The Physics of Traffic Congestion and Road Pricing in Transportation Planning
NASA Astrophysics Data System (ADS)
Levinson, David
2010-03-01
This presentation develops congestion theory and congestion pricing theory from its micro-foundations, the interaction of two or more vehicles. Using game theory, with a two-player game it is shown that the emergence of congestion depends on the players' relative valuations of early arrival, late arrival, and journey delay. Congestion pricing can be used as a cooperation mechanism to minimize total costs (if returned to the players). The analysis is then extended to the case of the three-player game, which illustrates congestion as a negative externality imposed on players who do not themselves contribute to it. A multi-agent model of travelers competing to utilize a roadway in time and space is presented. To realize the spillover effect among travelers, N-player games are constructed in which the strategy set includes N+1 strategies. We solve the N-player game (for N = 7) and find Nash equilibria if they exist. This model is compared to the bottleneck model. The results of numerical simulation show that the two models yield identical results in terms of lowest total costs and marginal costs when a social optimum exists. Moving from temporal dynamics to spatial complexity, using consistent agent-based techniques, we model the decision-making processes of users and infrastructure owner/operators to explore the welfare consequence of price competition, capacity choice, and product differentiation on congested transportation networks. Component models include: (1) An agent-based travel demand model wherein each traveler has learning capabilities and unique characteristics (e.g. value of time); (2) Econometric facility provision cost models; and (3) Representations of road authorities making pricing and capacity decisions. Different from small-network equilibrium models in prior literature, this agent-based model is applicable to pricing and investment analyses on large complex networks.
The subsequent economic analysis focuses on the source, evolution, measurement, and impact of product differentiation with heterogeneous users on a mixed ownership network (with tolled and untolled roads). Two types of product differentiation in the presence of toll roads, path differentiation and space differentiation, are defined and measured for a base case and several variants with different types of price and capacity competition and with various degrees of user heterogeneity. The findings favor a fixed-rate road pricing policy compared to complete pricing freedom on toll roads. It is also shown that the relationship between net social benefit and user heterogeneity is not monotonic on a complex network with toll roads.
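The two-player departure-time game described above can be made concrete with a toy payoff structure: each driver chooses an "early" or "peak" departure, arriving early costs e, and sharing the peak slot costs each driver a congestion delay d. The specific cost function and numbers are illustrative assumptions, not the presentation's calibration.

```python
from itertools import product

# Toy two-player departure-time game: whether congestion emerges in
# equilibrium depends on how the early-arrival cost e compares with the
# congestion delay d. Numbers and the cost structure are illustrative.

def cost(mine, other, e, d):
    if mine == "early":
        return e                            # schedule-delay cost of arriving early
    return d if other == "peak" else 0.0    # delay only if the peak slot is shared

def nash_equilibria(e, d):
    strategies = ("early", "peak")
    eq = []
    for s1, s2 in product(strategies, repeat=2):
        best1 = all(cost(s1, s2, e, d) <= cost(t, s2, e, d) for t in strategies)
        best2 = all(cost(s2, s1, e, d) <= cost(t, s1, e, d) for t in strategies)
        if best1 and best2:
            eq.append((s1, s2))
    return eq

# Congestion delay exceeds the cost of peak spreading: drivers separate.
print(nash_equilibria(e=3.0, d=5.0))
# Cheap congestion: both drivers depart at the peak and congest the road.
print(nash_equilibria(e=3.0, d=2.0))
```

Enumerating best responses shows exactly the claim in the abstract: congestion emerges as an equilibrium outcome only for certain relative valuations of early arrival versus journey delay, and a toll of roughly d returned to the players can restore the cooperative outcome.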
NASA Astrophysics Data System (ADS)
Katsura, Yasufumi; Attaviriyanupap, Pathom; Kataoka, Yoshihiko
In this research, the fundamental premises for deregulation of the electric power industry are reevaluated. The authors develop a simple model of a wholesale electricity market with a highly congested network, constructed by simplifying the New York ISO power system and market using available 2004 data and some estimation. Based on this model and historical construction cost data, they study the economic impact of transmission line additions on market participants and the impact of deregulation on power plant additions in a market with transmission congestion. Simulation results show that market signals may fail to elicit appropriate capacity additions and can produce an undesirable cycle of over- and under-construction of capacity.
Evaluation of the public health impacts of traffic congestion: a health risk assessment
2010-01-01
Background Traffic congestion is a significant issue in urban areas in the United States and around the world. Previous analyses have estimated the economic costs of congestion, related to fuel and time wasted, but few have quantified the public health impacts or determined how these impacts compare in magnitude to the economic costs. Moreover, the relative magnitudes of economic and public health impacts of congestion would be expected to vary significantly across urban areas, as a function of road infrastructure, population density, and atmospheric conditions influencing pollutant formation, but this variability has not been explored. Methods In this study, we evaluate the public health impacts of ambient exposures to fine particulate matter (PM2.5) concentrations associated with a business-as-usual scenario of predicted traffic congestion. We evaluate 83 individual urban areas using traffic demand models to estimate the degree of congestion in each area from 2000 to 2030. We link traffic volume and speed data with the MOBILE6 model to characterize emissions of PM2.5 and particle precursors attributable to congestion, and we use a source-receptor matrix to evaluate the impact of these emissions on ambient PM2.5 concentrations. Marginal concentration changes are related to a concentration-response function for mortality, with a value of statistical life approach used to monetize the impacts. Results We estimate that the monetized value of PM2.5-related mortality attributable to congestion in these 83 cities in 2000 was approximately $31 billion (2007 dollars), as compared with a value of time and fuel wasted of $60 billion. In future years, the economic impacts grow (to over $100 billion in 2030) while the public health impacts decrease to $13 billion in 2020 before increasing to $17 billion in 2030, given increasing population and congestion but lower emissions per vehicle. 
Across cities and years, the public health impacts range from more than an order of magnitude smaller than the economic impacts to exceeding them. Conclusions Our analyses indicate that the public health impacts of congestion may be significant enough in magnitude, at least in some urban areas, to be considered in future evaluations of the benefits of policies to mitigate congestion. PMID:20979626
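The monetization step described in the Methods can be sketched as a log-linear concentration-response calculation. All inputs below (population, baseline mortality rate, concentration-response coefficient, VSL) are hypothetical placeholders, not values from the study.

```python
import math

def monetized_mortality(population, baseline_rate, beta, delta_pm25, vsl):
    """Deaths attributable to a marginal PM2.5 change via a log-linear
    concentration-response function, monetized with a value of
    statistical life (VSL)."""
    attributable_fraction = 1.0 - math.exp(-beta * delta_pm25)
    deaths = population * baseline_rate * attributable_fraction
    return deaths * vsl

# Hypothetical inputs: 5M people, 0.8% annual baseline mortality,
# beta per ug/m3 of PM2.5, 1 ug/m3 marginal change, $7.9M VSL.
cost = monetized_mortality(5e6, 0.008, 0.006, 1.0, 7.9e6)
```

Summing such city-level marginal costs over the 83 urban areas is the structure behind the aggregate dollar figures in the Results.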
NASA Astrophysics Data System (ADS)
Behr, Joshua G.; Diaz, Rafael
Non-urgent Emergency Department utilization has been associated with increasing congestion in the flow and treatment of patients and, by extension, affects the quality of care and profitability of the Emergency Department. Interventions designed to divert these populations to more appropriate care settings may be cautiously received by operations managers because of uncertainty about the impact an adopted intervention may have on the two values of congestion and profitability. System Dynamics (SD) modeling and simulation can be used to measure the sensitivity of these two, often competing, values and thus provide an additional layer of information to inform strategic decision making.
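The stock-and-flow core that SD models of ED congestion are built on can be sketched in a few lines. The rates here are illustrative, not calibrated to any Emergency Department.

```python
def simulate_ed_census(arrivals_per_step, service_fraction, steps, census0=0.0):
    """Minimal stock-and-flow sketch: ED census (the stock) accumulates
    an arrival inflow each step and loses a treatment-completion outflow
    proportional to the current census."""
    census = census0
    history = []
    for _ in range(steps):
        census += arrivals_per_step - service_fraction * census
        history.append(census)
    return history
```

With 10 arrivals per step and half the census completing treatment each step, the census converges to the equilibrium 10 / 0.5 = 20; a diversion intervention that removes a share of non-urgent arrivals lowers this equilibrium proportionally, which is the congestion side of the trade-off the abstract describes.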
Acoustic rhinometry in the dog: a novel large animal model for studies of nasal congestion.
Koss, Michael C; Yu, Yongxin; Hey, John A; McLeod, Robbie L
2002-01-01
The aim of this project was to develop and pharmacologically characterize an experimental dog model of nasal congestion in which nasal patency is measured using acoustic rhinometry. Solubilized compound 48/80 (0.3-3.0%) was administered intranasally to thiopental-anesthetized beagle dogs to elicit nasal congestion via localized mast cell degranulation. Compound 48/80-induced effects on parameters of nasal patency were studied in vehicle-treated animals, as well as in the same animals pretreated 2 hours earlier with oral d-pseudoephedrine or chlorpheniramine. Local mast cell degranulation caused a dose-related decrease in nasal cavity volume and minimal cross-sectional area (Amin) together with a highly variable increase in nasal secretions. Maximal responses were seen 90-120 minutes after 48/80 administration. Oral administration of the adrenergic agonist d-pseudoephedrine (3.0 mg/kg) significantly antagonized all of the nasal effects of compound 48/80 (3.0%). In contrast, oral administration of the histamine H1 receptor antagonist chlorpheniramine (10 mg/kg) appeared to reduce the increased nasal secretions but had no effect on the compound 48/80-induced nasal congestion (i.e., volume and Amin). These results show the effectiveness of using acoustic rhinometry in this anesthetized dog model. The observations that compound 48/80-induced nasal congestion was prevented by d-pseudoephedrine pretreatment, but not by chlorpheniramine, suggest that this noninvasive model system may provide an effective tool with which to study the actions of decongestant drugs in preclinical investigations.
NASA Technical Reports Server (NTRS)
Cole, Robert; Wear, Mary; Young, Millennia; Cobel, Christopher; Mason, Sara
2017-01-01
Congestion is commonly reported during spaceflight, and most crewmembers have reported using medications for congestion during International Space Station (ISS) missions. Although congestion has been attributed to fluid shifts during spaceflight, fluid status reaches equilibrium during the first week after launch while congestion continues to be reported throughout long-duration missions. Congestion complaints have anecdotally been reported in relation to ISS CO2 levels; this evaluation was undertaken to determine whether an association exists. METHODS: Reported headaches, congestion symptoms, and CO2 levels were obtained for ISS expeditions 2-31, and time-weighted means and single-point maxima were determined for 24-hour (24hr) and 7-day (7d) periods prior to each weekly private medical conference. Multiple imputation addressed missing data, and logistic regression modeled the relationship between the probability of a reported congestion or headache event and CO2 levels, adjusted for possible confounding covariates. The first seven days of spaceflight were excluded to control for fluid shifts. Data were evaluated to determine the CO2 concentration required to maintain the risk of congestion below 1%, allowing direct comparison with a previously published evaluation of CO2 concentrations and headache. RESULTS: This study confirmed a previously identified significant association between CO2 and headache and also found a significant association between CO2 and congestion. For each 1-mm Hg increase in CO2, the odds of a crewmember reporting congestion doubled. The average 7-day CO2 would need to be maintained below 1.5 mm Hg to keep the risk of congestion below 1%. The predicted probability curves for ISS headache and congestion appear parallel when plotted against ppCO2 levels, with congestion reported at approximately 1 mm Hg lower CO2 than headache.
DISCUSSION: While the cause of congestion is multifactorial, this study showed congestion is associated with CO2 levels on ISS. Data from additional expeditions could be incorporated to further assess this finding. CO2 levels are also associated with reports of headaches on ISS. While it may be expected for astronauts with congestion to also complain of headaches, these two symptoms are commonly mutually exclusive. Furthermore, it is unknown if a temporal CO2 relationship exists between congestion and headache on ISS. CO2 levels were time-weighted for 24hr and 7d, and thus the time course of congestion leading to headache was not assessed; however, congestion could be an early CO2-related symptom when compared to headache. Future studies evaluating the association of CO2-related congestion leading to headache would be difficult due to the relatively stable daily CO2 levels on ISS currently, but a systematic study could be implemented on-orbit if desired.
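The two quantitative findings are mutually consistent under a logistic model: a doubling of odds per 1 mm Hg fixes the slope at ln 2, and anchoring the 1% risk threshold at 1.5 mm Hg fixes the intercept. The coefficients below are reconstructed for illustration from those two reported facts, not the study's fitted values.

```python
import math

def co2_for_risk(beta0, beta1, risk):
    """Invert the logistic model P(congestion) = 1/(1 + exp(-(beta0 + beta1*x)))
    to find the ppCO2 level x (mm Hg) at which risk equals `risk`."""
    return (math.log(risk / (1.0 - risk)) - beta0) / beta1

beta1 = math.log(2.0)                          # odds double per 1 mm Hg
beta0 = math.log(0.01 / 0.99) - 1.5 * beta1    # anchors 1% risk at 1.5 mm Hg
```

By construction `co2_for_risk(beta0, beta1, 0.01)` recovers 1.5 mm Hg, and higher tolerated risks correspond to higher allowable CO2 levels.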
Ohno, Yukako; Hanawa, Haruo; Jiao, Shuang; Hayashi, Yuka; Yoshida, Kaori; Suzuki, Tomoyasu; Kashimura, Takeshi; Obata, Hiroaki; Tanaka, Komei; Watanabe, Tohru; Minamino, Tohru
2015-01-01
Hepcidin is a key regulator of mammalian iron metabolism and is mainly produced by the liver. Hepcidin excess causes iron deficiency and anemia by inhibiting iron absorption from the intestine and iron release from macrophage stores. Anemia frequently complicates heart failure. In heart failure patients, the most frequent histologic finding in the liver is congestion. However, it remains unclear whether liver congestion associated with heart failure influences hepcidin production, thereby contributing to anemia and functional iron deficiency. In this study, we investigated this relationship in clinical and basic studies. In clinical studies of consecutive heart failure patients (n = 320), anemia was a common comorbidity (41%). In heart failure patients without active infection or ongoing cancer (n = 30), the log serum hepcidin concentration of patients with liver congestion was higher than that of those without liver congestion (p = 0.0316). Moreover, in heart failure patients with liver congestion (n = 19), anemia was associated with higher serum hepcidin concentrations, consistent with a type of anemia characterized by hepcidin induction. Subsequently, we produced a rat model of heart failure with liver congestion by injecting monocrotaline, which causes pulmonary hypertension. The monocrotaline-treated rats displayed liver congestion with increased hepcidin expression at 4 weeks after monocrotaline injection, followed by anemia and functional iron deficiency at 5 weeks. We conclude that liver congestion induces hepcidin production, which may result in anemia and functional iron deficiency in some patients with heart failure.
Electricity market pricing, risk hedging and modeling
NASA Astrophysics Data System (ADS)
Cheng, Xu
In this dissertation, we investigate pricing, price-risk hedging/arbitrage, and simplified system modeling for a centralized LMP-based electricity market. In an LMP-based market model, the full AC power flow model and the DC power flow model are the most widely used representations of the transmission system. We investigate the differences in dispatch results, congestion patterns, and LMPs for the two power flow models. An appropriate LMP decomposition scheme to quantify the marginal costs of congestion and real power losses is critical for the implementation of financial risk hedging markets. However, the traditional LMP decomposition depends heavily on the slack bus selection. In this dissertation we propose a slack-independent scheme to break LMP down into energy, congestion, and marginal loss components by analyzing the actual marginal cost of each bus at the optimal solution point. The physical and economic meanings of the marginal effect at each bus provide accurate price information for both congestion and losses, eliminating the slack-dependency of the traditional scheme. With electricity priced at the margin instead of the average value, the market operator typically collects more revenue from power sellers than it pays to power buyers. According to the LMP decomposition results, this revenue surplus is divided into two parts: the congestion charge surplus and the marginal loss revenue surplus. We apply the LMP decomposition results to financial tools, such as the financial transmission right (FTR) and the loss hedging right (LHR), which have been introduced to hedge against price risks associated with congestion and losses, to construct a full price-risk hedging portfolio. The two-settlement market structure and the introduction of financial tools inevitably create market manipulation opportunities.
We investigate several possible market manipulation behaviors by virtual bidding and propose a market monitor approach to identify and quantify such behavior. Finally, the complexity of the power market and size of the transmission grid make it difficult for market participants to efficiently analyze the long-term market behavior. We propose a simplified power system commercial model by simulating the PTDFs of critical transmission bottlenecks of the original system.
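The energy/congestion split of an LMP (losses aside) can be seen in a minimal lossless two-bus system: a cheap generator at bus 1, an expensive generator and all demand at bus 2, one transmission line. The layout, prices, and `two_bus_lmps` helper are an illustrative sketch, not the dissertation's decomposition scheme.

```python
def two_bus_lmps(c_cheap, c_expensive, demand, line_limit):
    """LMPs in a lossless two-bus system with a cheap generator at bus 1
    and an expensive generator plus all demand at bus 2.

    Returns (lmp1, lmp2, congestion_component_at_bus2). With a bus-1
    reference and no losses, LMP = energy component + congestion component.
    """
    flow = min(demand, line_limit)
    if flow < demand:
        # Line congested: the marginal MW at bus 2 must come from the
        # expensive local unit, so prices separate across the line.
        lmp1, lmp2 = c_cheap, c_expensive
    else:
        # Uncongested: one system-wide marginal price.
        lmp1 = lmp2 = c_cheap
    return lmp1, lmp2, lmp2 - lmp1
```

For example, with a $20/MWh and a $50/MWh unit, 100 MW of demand, and an 80 MW line limit, the bus-2 price separates to $50 with a $30 congestion component; raising the limit above demand collapses both prices back to $20.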
NASA Astrophysics Data System (ADS)
Davis, L. C.
2016-06-01
Wirelessly connected vehicles that exchange information about traffic conditions can reduce delays caused by congestion. At a 2-to-1 lane reduction, the improvement in flow past a bottleneck due to traffic with a random mixture of 40% connected vehicles is found to be 52%. Control is based on connected-vehicle-reported velocities near the bottleneck. In response to indications of congestion the connected vehicles, which are also adaptive cruise control vehicles, reduce their speed in slowdown regions. Early lane changes of manually driven vehicles from the terminated lane to the continuous lane are induced by the slowing connected vehicles. Self-organized congestion at the bottleneck is thus delayed or eliminated, depending upon the incoming flow magnitude. For the large majority of vehicles, travel times past the bottleneck are substantially reduced. Control is responsible for delaying the onset of congestion as the incoming flow increases. Adaptive cruise control increases the flow out of the congested state at the bottleneck. The nature of the congested state, when it occurs, appears to be similar under a variety of conditions. Typically 80-100 vehicles are approximately equally distributed between the lanes in the 500 m region prior to the end of the terminated lane. Without the adaptive cruise control capability, connected vehicles can delay the onset of congestion but do not increase the asymptotic flow past the bottleneck. Calculations are done using the Kerner-Klenov stochastic discrete-time model of three-phase traffic theory for manual vehicles. The dynamics of the connected vehicles are given by a conventional adaptive cruise control algorithm plus commanded deceleration. Because time in the model for manual vehicles is discrete (one-second intervals), it is assumed that the acceleration of any vehicle immediately in front of a connected vehicle is constant during the time interval, thereby preserving the computational simplicity and speed of a discrete-time model.
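A conventional ACC law with an overriding commanded deceleration, as the last sentences describe, can be sketched as follows. The gains, desired time gap, and function name are illustrative assumptions, not the paper's parameters.

```python
def acc_acceleration(v, v_lead, gap, t_gap=1.4, k_gap=0.23, k_rel=0.60,
                     a_cmd=None):
    """One update of a conventional adaptive-cruise-control law:
    close the error to the desired time-gap (gap - v * t_gap) and
    match the leader's speed (v_lead - v). A connected-vehicle
    slowdown command a_cmd caps the result from above, forcing
    deceleration in reported slowdown regions."""
    a = k_gap * (gap - v * t_gap) + k_rel * (v_lead - v)
    if a_cmd is not None:
        a = min(a, a_cmd)  # commanded deceleration near the bottleneck
    return a
```

At the equilibrium gap (gap = v * t_gap, matched speeds) the law commands no acceleration; a commanded deceleration of -2 m/s² overrides it, which is the mechanism that induces early lane changes upstream of the lane drop.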
Automated Flight Routing Using Stochastic Dynamic Programming
NASA Technical Reports Server (NTRS)
Ng, Hok K.; Morando, Alex; Grabbe, Shon
2010-01-01
Airspace capacity reduction due to convective weather impedes air traffic flows and causes traffic congestion. This study presents an algorithm, based on stochastic dynamic programming, that reroutes flights in the presence of winds, enroute convective weather, and congested airspace. A stochastic disturbance model incorporates capacity uncertainty into the reroute design process. A trajectory-based airspace demand model is employed for calculating current and future airspace demand. The optimal routes minimize the total expected traveling-time, weather-incursion, and induced-congestion costs. They are compared to weather-avoidance routes calculated using deterministic dynamic programming. The stochastic reroutes have a smaller deviation probability than their deterministic counterparts when both have similar total flight distance. The stochastic rerouting algorithm takes into account all convective weather fields at all severity levels, while the deterministic algorithm only accounts for convective weather systems exceeding a specified level of severity. When the stochastic reroutes are compared to the actual flight routes, they have similar total flight times, and both spend about 1% of travel time crossing congested enroute sectors on average. The actual flight routes induce slightly less traffic congestion than the stochastic reroutes but intercept more severe convective weather.
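The backward-induction core of such a stochastic DP can be sketched on a stage trellis. The blocked-cell penalty structure below is an illustrative simplification of the expected weather and congestion cost terms, not the paper's formulation.

```python
INF = float("inf")  # use INF in step_cost for missing links

def expected_route_cost(step_cost, p_wx, penalty):
    """Backward dynamic programming on a stage trellis.

    step_cost[t][i][j]: travel cost from node i at stage t to node j at
    stage t+1. p_wx[t][i]: probability that node i is impacted by
    convective weather at stage t, adding `penalty` in expectation.
    Returns the expected cost-to-go from each node at stage 0.
    """
    n_stages = len(step_cost)
    n_nodes = len(p_wx[0])
    J = [0.0] * n_nodes  # terminal cost-to-go
    for t in range(n_stages - 1, -1, -1):
        J = [p_wx[t][i] * penalty
             + min(step_cost[t][i][j] + J[j] for j in range(n_nodes))
             for i in range(n_nodes)]
    return J
```

In a one-stage, two-node example where node 0 has a 50% weather probability and a penalty of 10, the expected cost at node 0 carries the 5-unit expected weather cost on top of its cheapest step, which is how the algorithm weighs all severity levels probabilistically rather than excluding cells above a threshold.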
Performance analysis of SS7 congestion controls under sustained overload
NASA Astrophysics Data System (ADS)
Manfield, David R.; Millsteed, Gregory K.; Zukerman, Moshe
1994-04-01
Congestion controls are a key factor in achieving the robust performance required of common channel signaling (CCS) networks in the face of partial network failures and extreme traffic loads, especially as networks become large and carry high traffic volumes. The CCITT recommendations define a number of types of congestion control, and the parameters of these controls must be well set to ensure their efficacy under transient and sustained signaling network overload. The objective of this paper is to present a modeling approach to determining the network parameters that govern the performance of the SS7 congestion controls under sustained overload. Results of the simulation investigation are presented and discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finley, Cathy
2014-04-30
This report contains the results from research aimed at improving short-range (0-6 hour) hub-height wind forecasts in the NOAA weather forecast models through additional data assimilation and model physics improvements for use in wind energy forecasting. Additional meteorological observing platforms including wind profilers, sodars, and surface stations were deployed for this study by NOAA and DOE, and additional meteorological data at or near wind turbine hub height were provided by South Dakota State University and WindLogics/NextEra Energy Resources over a large geographical area in the U.S. Northern Plains for assimilation into NOAA research weather forecast models. The resulting improvements in wind energy forecasts based on the research weather forecast models (with the additional data assimilation and model physics improvements) were examined in many different ways and compared with wind energy forecasts based on the current operational weather forecast models to quantify the forecast improvements important to power grid system operators and wind plant owners/operators participating in energy markets. Two operational weather forecast models (OP_RUC, OP_RAP) and two research weather forecast models (ESRL_RAP, HRRR) were used as the base wind forecasts for generating several different wind power forecasts for the NextEra Energy wind plants in the study area. Power forecasts were generated from the wind forecasts in a variety of ways, from very simple to quite sophisticated, as they might be used by a wide range of both general users and commercial wind energy forecast vendors. The error characteristics of each of these types of forecasts were examined and quantified using bulk error statistics for both the local wind plant and the system aggregate forecasts. The wind power forecast accuracy was also evaluated separately for high-impact wind energy ramp events.
The overall bulk error statistics calculated over the first six hours of the forecasts at both the individual wind plant and at the system-wide aggregate level over the one year study period showed that the research weather model-based power forecasts (all types) had lower overall error rates than the current operational weather model-based power forecasts, both at the individual wind plant level and at the system aggregate level. The bulk error statistics of the various model-based power forecasts were also calculated by season and model runtime/forecast hour as power system operations are more sensitive to wind energy forecast errors during certain times of year and certain times of day. The results showed that there were significant differences in seasonal forecast errors between the various model-based power forecasts. The results from the analysis of the various wind power forecast errors by model runtime and forecast hour showed that the forecast errors were largest during the times of day that have increased significance to power system operators (the overnight hours and the morning/evening boundary layer transition periods), but the research weather model-based power forecasts showed improvement over the operational weather model-based power forecasts at these times.
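The bulk error statistics referred to above are standard aggregates of forecast-minus-observed errors; a minimal version, assuming the usual bias/MAE/RMSE trio rather than the report's exact metric set:

```python
import math

def bulk_error_stats(forecast, observed):
    """Bulk error statistics for paired forecast/observed power series:
    bias (mean error), mean absolute error, and root-mean-square error."""
    errors = [f - o for f, o in zip(forecast, observed)]
    n = len(errors)
    bias = sum(errors) / n
    mae = sum(abs(e) for e in errors) / n
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    return bias, mae, rmse
```

Computing these per season and per model runtime/forecast hour, as the report does, just means grouping the paired series before calling the function.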
Evaluation of statistical models for forecast errors from the HBV model
NASA Astrophysics Data System (ADS)
Engeland, Kolbjørn; Renard, Benjamin; Steinsland, Ingelin; Kolberg, Sjur
2010-04-01
Three statistical models for the forecast errors for inflow into the Langvatn reservoir in Northern Norway have been constructed and tested according to the agreement between (i) the forecast distribution and the observations and (ii) the median values of the forecast distribution and the observations. For the first model, observed and forecasted inflows were transformed by the Box-Cox transformation before a first-order auto-regressive model was constructed for the forecast errors, with parameters conditioned on weather classes. In the second model, the Normal Quantile Transformation (NQT) was applied to observed and forecasted inflows before a similar first-order auto-regressive model was constructed for the forecast errors. In the third model, positive and negative errors were modeled separately: the errors were first NQT-transformed before the mean error values were conditioned on climate, forecasted inflow, and yesterday's error. To test the three models we applied three criteria: (a) the forecast distribution should be reliable; (b) the forecast intervals should be narrow; (c) the median values of the forecast distribution should be close to the observed values. Models 1 and 2 gave almost identical results. The median values improved the forecast, with the Nash-Sutcliffe efficiency increasing from 0.77 for the original forecast to 0.87 for the corrected forecasts. Models 1 and 2 over-estimated the forecast intervals but gave the narrowest intervals. Their main drawback was that their distributions were less reliable than Model 3's. For Model 3 the median values did not fit well since the auto-correlation was not accounted for. Since Model 3 did not benefit from the potential variance reduction that lies in bias estimation and removal, it gave on average wider forecast intervals than the other two models. At the same time Model 3 on average slightly under-estimated the forecast intervals, probably explained by the use of average measures to evaluate the fit.
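The first model's correction step (Box-Cox transform, then a first-order auto-regressive error model) can be sketched as follows. The transform parameter and AR coefficient are illustrative, not the paper's fitted, weather-class-conditioned values.

```python
import math

def boxcox(x, lam):
    """Box-Cox transform applied before fitting the AR(1) error model."""
    return math.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def inv_boxcox(y, lam):
    """Inverse Box-Cox transform, back to inflow units."""
    return math.exp(y) if lam == 0 else (lam * y + 1.0) ** (1.0 / lam)

def corrected_forecast(forecast, prev_obs, prev_forecast, phi, lam=0.3):
    """Median of the predictive distribution under an AR(1) model for
    Box-Cox-transformed forecast errors: shift today's transformed
    forecast by phi times yesterday's transformed error."""
    e_prev = boxcox(prev_obs, lam) - boxcox(prev_forecast, lam)
    y = boxcox(forecast, lam) + phi * e_prev
    return inv_boxcox(y, lam)
```

With phi = 0 the raw forecast is returned unchanged; a positive yesterday's error (observed above forecast) shifts today's corrected median upward, which is the persistence the auto-regressive term exploits.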
Distributive routing and congestion control in wireless multihop ad hoc communication networks
NASA Astrophysics Data System (ADS)
Glauche, Ingmar; Krause, Wolfram; Sollacher, Rudolf; Greiner, Martin
2004-10-01
Due to their inherent complexity, engineered wireless multihop ad hoc communication networks represent a technological challenge. Having no master infrastructure, the nodes have to self-organize in such a way that, for example, network connectivity, good data traffic performance, and robustness are guaranteed. In this contribution the focus is on routing and congestion control. First, random data traffic along shortest-path routes is studied by simulations as well as theoretical modeling. Measures of congestion such as end-to-end time delay and relaxation times are given. A scaling law of the average time delay with respect to network size is revealed and found to depend on the underlying network topology. In the second step, a distributive routing and congestion control scheme is proposed: each node locally propagates its routing cost estimates and information about its congestion state to its neighbors, which then update their respective cost estimates. This allows for a flexible adaptation of end-to-end routes to the overall congestion state of the network. Compared to shortest-path routing, the critical network load is significantly increased.
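The local cost-propagation step can be sketched as a Bellman-Ford-style relaxation in which a neighbor's congestion state inflates its hop cost. The additive congestion weighting and the `relax_costs` helper are illustrative choices, not the paper's exact update rule.

```python
def relax_costs(costs, neighbors, congestion, dest):
    """One round of local propagation: every node except the destination
    re-estimates its cost to `dest` as the best
    'hop cost (1) + neighbor congestion + neighbor estimate'."""
    new = dict(costs)
    for v in neighbors:
        if v == dest:
            continue
        new[v] = min(1.0 + congestion[u] + costs[u] for u in neighbors[v])
    return new

# Illustrative 3-node chain a - b - c, destination c; node b is congested.
neighbors = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
congestion = {"a": 0.0, "b": 2.0, "c": 0.0}
```

Repeated rounds let estimates spread outward from the destination; because b's congestion is charged on the way through it, routes adapt away from loaded nodes when alternatives exist, which is what raises the critical network load.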
Online Analysis of Wind and Solar Part II: Transmission Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian
2012-01-31
To facilitate wider penetration of renewable resources without compromising system reliability, given concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.
DOT National Transportation Integrated Search
2015-07-01
Urban traffic congestion is a problem that plagues many cities in the United States. Testing strategies to alleviate this : congestion is especially challenging due to the difficulty of modeling complex urban traffic networks. However, recent work ha...
The impact of truck repositioning on congestion and pollution in the LA basin.
DOT National Transportation Integrated Search
2011-03-01
Pollution and congestion caused by port related truck traffic is usually estimated based on careful transportation modeling and simulation. In these efforts, however, attention is normally focused on trucks on their way from a terminal at the Los Ang...
Quantifying model uncertainty in seasonal Arctic sea-ice forecasts
NASA Astrophysics Data System (ADS)
Blanchard-Wrigglesworth, Edward; Barthélemy, Antoine; Chevallier, Matthieu; Cullather, Richard; Fučkar, Neven; Massonnet, François; Posey, Pamela; Wang, Wanqiu; Zhang, Jinlun; Ardilouze, Constantin; Bitz, Cecilia; Vernieres, Guillaume; Wallcraft, Alan; Wang, Muyin
2017-04-01
Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or post-processing techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.
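The variance split described above can be sketched as an across-model versus within-model decomposition of a multi-model ensemble. The decomposition mirrors the SIO analysis only loosely and the numbers are hypothetical; `uncertainty_components` is an illustrative helper, not the study's method.

```python
import statistics

def uncertainty_components(forecasts):
    """Split ensemble forecast variance into across-model spread
    (model uncertainty) and mean within-model perturbation spread
    (a proxy for irreducible error growth).

    forecasts: dict mapping model name -> list of ensemble-member values
    for one target quantity (e.g. September sea-ice extent).
    """
    model_means = [statistics.mean(m) for m in forecasts.values()]
    across = statistics.pvariance(model_means)
    within = statistics.mean(statistics.pvariance(m)
                             for m in forecasts.values())
    return across, within
```

Bias-correcting each model (removing its mean offset) shrinks the across-model term while leaving the within-model term intact, which is the mechanism behind the reported order-of-magnitude drop in model uncertainty after post-processing.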
Three essays on access pricing
NASA Astrophysics Data System (ADS)
Sydee, Ahmed Nasim
In the first essay, a theoretical model is developed to determine the time path of optimal access price in the telecommunications industry. Determining the optimal access price is an important issue in the economics of telecommunications. Setting a high access price discourages potential entrants; a low access price, on the other hand, amounts to confiscation of private property because the infrastructure already built by the incumbent is sunk. Furthermore, a low access price does not give the incumbent incentives to maintain the current network and to invest in new infrastructures. Much of the existing literature on access pricing suffers either from the limitations of a static framework or from the assumption that all costs are avoidable. The telecommunications industry is subject to high stranded costs and, therefore, to address this issue a dynamic model is imperative. This essay presents a dynamic model of one-way access pricing in which the compensation involved in deregulatory taking is formalized and then analyzed. The short run adjustment after deregulatory taking has occurred is carried out and discussed. The long run equilibrium is also analyzed. A time path for the Ramsey price is shown as the correct dynamic price of access. In the second essay, a theoretical model is developed to determine the time path of optimal access price for an infrastructure that is characterized by congestion and lumpy investment. Much of the theoretical literature on access pricing of infrastructure prescribes that the access price be set at the marginal cost of the infrastructure. In proposing this rule of access pricing, the conventional analysis assumes that infrastructure investments are infinitely divisible so that it makes sense to talk about the marginal cost of investment. Often it is the case that investments in infrastructure are lumpy and can only be made in large chunks, and this renders the marginal cost concept meaningless. 
In this essay, we formalize a model of access pricing in which the infrastructure is subject to congestion and investments in it are lumpy. To fix ideas, the model is formulated in the context of airport infrastructure investments, which capture both the element of congestion and the lumpiness involved in infrastructure investments. The optimal investment program suggests how many units of capacity should be installed and at which times. Because time is continuous in the model, the discounted cost -- despite the lumpiness of capacity additions -- can be made to vary continuously by varying the time at which a capacity addition is made. The main results that emerge from the analysis can be described as follows. First, the global demand for air travel rises with time and experiences an upward jump whenever a capacity addition is made. Second, the access price is constant and stays at the basic level when the system is not congested. When the system is congested, a congestion surcharge is imposed on top of the basic level, and the congestion surcharge rises with the level of congestion until the next capacity addition is made, at which time the access price takes a downward jump. Third, the individual demand for air travel is constant before congestion sets in and after the last capacity addition takes place. During a time interval in which congestion rises, the individual demand for travel is below the level that prevails when there is no congestion and declines as congestion worsens. The third essay contains a model of access pricing for natural gas transmission pipelines, both when pipeline operators are regulated and when they behave strategically. The high sunk costs involved in building a pipeline network constitute a serious barrier to entry, and competitive behaviour in the transmission pipeline sector cannot be expected.
Most of the economic analyses of access pricing for natural gas transmission pipelines are carried out from the regulatory perspective, and the access price paid by shippers is cost-based. The model formalized here is intended to capture some essential characteristics of networks in which components interact with one another when combined into an integrated system. The model shows how the topology of the network determines the access prices in different components of the network. The general results that emerge from the analysis can be summarized as follows. First, the monopoly power of a pipeline operator is reduced by the entry of a new supply pipeline connected in parallel to the same demand node. When the pipelines are connected in series, the one upstream enjoys a first-mover advantage over the one downstream, and the toll set by the upstream pipeline operator after entry by the downstream pipeline operator will rise above the original monopoly level. The equilibrium prices of natural gas at the various nodes of the network are also discussed. (Abstract shortened by UMI.)
Optimizing Tsunami Forecast Model Accuracy
NASA Astrophysics Data System (ADS)
Whitmore, P.; Nyland, D. L.; Huang, P. Y.
2015-12-01
Recent tsunamis provide a means to determine the accuracy that can be expected of real-time tsunami forecast models. Forecast accuracy using two different tsunami forecast models is compared for seven events since 2006, based on both real-time application and optimized, after-the-fact "forecasts". Lessons learned by comparing the forecast accuracy determined during an event to modified applications of the models after the fact provide improved methods for real-time forecasting of future events. Variables such as source definition, data assimilation, and model scaling factors are examined to optimize forecast accuracy. Forecast accuracy is also compared for direct forward modeling based on earthquake source parameters versus accuracy obtained by assimilating sea level data into the forecast model. Results show that assimilating sea level data into the models increases accuracy by approximately 15% for the events examined.
A short-term ensemble wind speed forecasting system for wind power applications
NASA Astrophysics Data System (ADS)
Baidya Roy, S.; Traiteur, J. J.; Callicutt, D.; Smith, M.
2011-12-01
This study develops an adaptive, blended forecasting system to provide accurate wind speed forecasts 1 hour ahead of time for wind power applications. The system consists of an ensemble of 21 forecasts with different configurations of the Weather Research and Forecasting Single Column Model (WRFSCM) and a persistence model. The ensemble is calibrated against observations for a 2 month period (June-July, 2008) at a potential wind farm site in Illinois using the Bayesian Model Averaging (BMA) technique. The forecasting system is evaluated against observations for August 2008 at the same site. The calibrated ensemble forecasts significantly outperform the forecasts from the uncalibrated ensemble while significantly reducing forecast uncertainty under all environmental stability conditions. The system also generates significantly better forecasts than persistence, autoregressive (AR) and autoregressive moving average (ARMA) models during the morning transition and the diurnal convective regimes. This forecasting system is computationally more efficient than traditional numerical weather prediction models and can generate a calibrated forecast, including model runs and calibration, in approximately 1 minute. Currently, hour-ahead wind speed forecasts are almost exclusively produced using statistical models. However, numerical models have several distinct advantages over statistical models including the potential to provide turbulence forecasts. Hence, there is an urgent need to explore the role of numerical models in short-term wind speed forecasting. This work is a step in that direction and is likely to trigger a debate within the wind speed forecasting community.
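The ensemble calibration step can be illustrated with a simplified stand-in. Real Bayesian Model Averaging fits a mixture model by EM over the training window; the sketch below instead weights each ensemble member by the inverse of its mean squared error against observations, which captures the same idea of rewarding historically accurate members. All numbers and member forecasts are invented for illustration.

```python
# Simplified stand-in for BMA-style ensemble calibration (illustrative only):
# real BMA fits a mixture model by EM; here each member's weight is simply
# proportional to the inverse of its mean squared error on a training window.

def inverse_mse_weights(member_forecasts, observations):
    """member_forecasts: list of per-member forecast lists; observations: list."""
    mses = []
    for member in member_forecasts:
        errs = [(f - o) ** 2 for f, o in zip(member, observations)]
        mses.append(sum(errs) / len(errs))
    inv = [1.0 / (m + 1e-12) for m in mses]
    total = sum(inv)
    return [v / total for v in inv]

def combine(weights, member_values):
    return sum(w * v for w, v in zip(weights, member_values))

# Training window: two members forecasting hourly wind speed (m/s).
obs = [5.0, 6.0, 7.0, 6.5]
members = [[5.2, 6.1, 6.9, 6.6],   # historically accurate member
           [4.0, 5.0, 8.5, 7.5]]   # noisier member
w = inverse_mse_weights(members, obs)
est = combine(w, [6.0, 7.0])       # combined forecast for the next hour
```

With these data the accurate member receives nearly all of the weight, so the combined forecast stays close to its value.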
DOT National Transportation Integrated Search
2015-06-01
Urban traffic congestion is a problem that plagues many cities in the United States. Testing strategies to alleviate this : congestion is especially challenging due to the difficulty of modeling complex urban traffic networks. However, recent work ha...
Impact of traffic congestion on road accidents: a spatial analysis of the M25 motorway in England.
Wang, Chao; Quddus, Mohammed A; Ison, Stephen G
2009-07-01
Traffic congestion and road accidents are two external costs of transport, and the reduction of their impacts is often one of the primary objectives for transport policy makers. The relationship between traffic congestion and road accidents, however, is not well understood and has been little studied. It is speculated that there may be an inverse relationship between traffic congestion and road accidents, and as such this poses a potential dilemma for transport policy makers. This study aims to explore the impact of traffic congestion on the frequency of road accidents using a spatial analysis approach, while controlling for other relevant factors that may affect road accidents. The M25 London orbital motorway, divided into 70 segments, was chosen for this study, and relevant data on road accidents, traffic and road characteristics were collected. A robust technique was developed to map M25 accidents onto its segments. Since existing studies have often used a proxy to measure the level of congestion, this study employed a precise congestion measurement. A series of Poisson-based non-spatial (such as Poisson-lognormal and Poisson-gamma) and spatial (Poisson-lognormal with conditional autoregressive priors) models were used to account for the effects of both heterogeneity and spatial correlation. The results suggest that traffic congestion has little or no impact on the frequency of road accidents on the M25 motorway. All other relevant factors provided results consistent with existing studies.
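The simplest member of the model family above is a plain Poisson regression of segment accident counts on a congestion covariate. The sketch below fits such a toy model by gradient ascent on the Poisson log-likelihood; it omits the lognormal/gamma heterogeneity terms and the conditional autoregressive spatial priors of the paper, and the data are synthetic.

```python
# Toy Poisson regression for segment-level accident counts (a sketch, not the
# paper's Bayesian Poisson-lognormal CAR model): log(rate_i) = b0 + b1 * x_i,
# where x_i stands in for a congestion measure on segment i.
import math

x = list(range(10))                                 # hypothetical congestion index
y = [round(math.exp(0.5 + 0.2 * xi)) for xi in x]   # synthetic accident counts

def log_lik(b0, b1):
    """Poisson log-likelihood (dropping the constant log(y!) terms)."""
    return sum(yi * (b0 + b1 * xi) - math.exp(b0 + b1 * xi)
               for xi, yi in zip(x, y))

b0 = b1 = 0.0
for _ in range(20000):                              # small steps keep ascent stable
    mu = [math.exp(b0 + b1 * xi) for xi in x]
    b0 += 1e-4 * sum(yi - mi for yi, mi in zip(y, mu))
    b1 += 1e-4 * sum(xi * (yi - mi) for xi, yi, mi in zip(x, y, mu))
# b1 should land near the 0.2 used to generate the counts
```

The fitted slope recovers (approximately) the coefficient used to generate the synthetic counts; adding random effects or spatial priors requires a Bayesian sampler.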
Congestion Pricing for Aircraft Pushback Slot Allocation
Zhang, Yaping
2017-01-01
In order to optimize aircraft pushback management during rush hour, aircraft pushback slot allocation based on congestion pricing is explored while considering monetary compensation based on the quality of the surface operations. First, the concept of the “external cost of surface congestion” is proposed, and a quantitative study on the external cost is performed. Then, an aircraft pushback slot allocation model for minimizing the total surface cost is established. An improved discrete differential evolution algorithm is also designed. Finally, a simulation is performed on Xinzheng International Airport using the proposed model. By comparing the pushback slot control strategy based on congestion pricing with other strategies, the advantages of the proposed model and algorithm are highlighted. In addition to reducing delays and optimizing the delay distribution, the model and algorithm are better suited for actual aircraft pushback management during rush hour. Further, it is observed that they do not result in significant increases in the surface cost. These results confirm the effectiveness and suitability of the proposed model and algorithm. PMID:28114429
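The search strategy behind the paper's improved discrete differential evolution can be sketched with the classic continuous DE/rand/1/bin loop on a toy cost. The quadratic cost and target times below are placeholders, not the paper's surface-cost model, and a real slot allocator would operate over discrete permutations.

```python
# Minimal continuous differential evolution (DE/rand/1/bin) on a toy cost,
# illustrating the mutation / crossover / greedy-selection cycle that the
# paper's improved discrete DE performs over pushback slot assignments.
import random

random.seed(1)
target = [3.0, 7.0, 5.0]                        # hypothetical ideal slot times

def cost(v):
    return sum((vi - ti) ** 2 for vi, ti in zip(v, target))

NP, F, CR, D = 12, 0.6, 0.9, len(target)        # population, scale, crossover
pop = [[random.uniform(0, 10) for _ in range(D)] for _ in range(NP)]

for _ in range(200):
    for i in range(NP):
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [a[d] + F * (b[d] - c[d]) for d in range(D)]
        trial = [mutant[d] if random.random() < CR else pop[i][d]
                 for d in range(D)]
        if cost(trial) < cost(pop[i]):          # greedy selection
            pop[i] = trial

best = min(pop, key=cost)
```

After a few hundred generations the best member sits essentially on the target vector; a discrete variant replaces the arithmetic mutation with permutation-preserving moves.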
Qi, Yi; Padiath, Ameena; Zhao, Qun; Yu, Lei
2016-10-01
The Motor Vehicle Emission Simulator (MOVES) quantifies emissions as a function of vehicle modal activities. Hence, the vehicle operating mode distribution is the most vital input for running MOVES at the project level. The preparation of operating mode distributions requires significant effort with respect to data collection and processing. This study develops operating mode distributions for both freeway and arterial facilities under different traffic conditions. For this purpose, we (1) collected and processed geographic information system (GIS) data, (2) developed a model of CO2 emissions and congestion from observations, and (3) implemented the model to evaluate potential emission changes from a hypothetical roadway accident scenario. This study presents a framework by which practitioners can assess emission levels in the development of different strategies for traffic management and congestion mitigation. The paper prepared the primary input required for running MOVES, that is, the operating mode ID distribution, and developed models for estimating emissions for different types of roadways under different congestion levels. The results of this study will provide transportation planners and environmental analysts with methods for quantitatively assessing the air quality impacts of different transportation operation and demand management strategies.
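Building an operating mode distribution from a second-by-second speed trace can be sketched as follows. The mode bins here are simplified placeholders, not the actual MOVES operating-mode ID definitions (which also depend on vehicle-specific power), and the speed trace is invented.

```python
# Sketch: operating-mode distribution from a 1 Hz speed trace.
# Bin boundaries are simplified stand-ins for the MOVES op-mode IDs.

def op_mode(speed_mph, accel_mphps):
    if speed_mph < 1.0:
        return "idle"
    if accel_mphps <= -2.0:
        return "braking"
    if speed_mph < 25.0:
        return "low_speed"
    return "cruise"

def op_mode_distribution(speeds):
    accels = [b - a for a, b in zip(speeds, speeds[1:])]  # mph per second
    modes = [op_mode(s, a) for s, a in zip(speeds[1:], accels)]
    return {m: modes.count(m) / len(modes) for m in set(modes)}

trace = [0, 0, 5, 12, 20, 28, 30, 27, 20, 10, 0]   # hypothetical speed trace
dist = op_mode_distribution(trace)                 # fractions summing to 1
```

The resulting fractions are exactly the kind of project-level input MOVES consumes, except that real op-mode IDs are numbered bins over speed and vehicle-specific power.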
Real-time Volcanic Cloud Products and Predictions for Aviation Alerts
NASA Astrophysics Data System (ADS)
Krotkov, N. A.; Hughes, E. J.; da Silva, A. M., Jr.; Seftor, C. J.; Brentzel, K. W.; Hassinen, S.; Heinrichs, T. A.; Schneider, D. J.; Hoffman, R.; Myers, T.; Flynn, L. E.; Niu, J.; Theys, N.; Brenot, H. H.
2016-12-01
We will discuss progress of the NASA ASP project, which promotes the use of satellite volcanic SO2 (VSO2) and ash (VA) data, and forecasting tools that enhance VA Decision Support Systems (DSS) at the VA Advisory Centers (VAACs) for prompt aviation warnings. The goals are: (1) transition NASA algorithms to NOAA for global NRT processing and integration into DSS at the Washington VAAC for operational users and public dissemination; (2) utilize the Direct Broadcast capability of the Aura and SNPP satellites to process Direct Readout (DR) data at two high-latitude locations, in Finland and Fairbanks, Alaska, to enhance VA DSS in Europe and at USGS's Alaska Volcano Observatory (AVO) and the Alaska VAAC; (3) improve global Eulerian model-based VA/VSO2 forecasting and risk/cost assessments with Metron Aviation. Our global NRT OMI and OMPS data have been fully integrated into the European Support to Aviation Control Service and NOAA operational web sites. We are transitioning OMPS processing to our partners at NOAA/NESDIS for integration into the operational processing environment. NASA's Suomi NPP Ozone Science Team, in conjunction with GSFC's Direct Readout Laboratory (DRL), has implemented Version 2 of the OMPS real-time DR processing package to generate VSO2 and VA products at the Geographic Information Network of Alaska (GINA) and the Finnish Meteorological Institute (FMI). The system provides real-time coverage over some of the most congested airspace and over many of the most active volcanoes in the world. The OMPS real-time capability is now publicly available via DRL's IPOPP package. We use satellite observations to define volcanic source term estimates in the NASA GEOS-5 model, which was updated to allow simulation of VA and VSO2 clouds. Column SO2 observations from SNPP/OMPS provide an initial estimate of the total cloud SO2 mass, and are used with backward transport analysis to make an initial cloud height estimate.
Later VSO2 observations are used to "nudge" the SO2 mass within the model. The GEOS-5 simulations provide qualitative forecasts, which locate the extent of regions hazardous to aviation. Air traffic flow algorithms have been developed by Metron Aviation to use GEOS-5 volcanic simulations to determine the most cost-effective rerouting paths around hazardous volcanic clouds.
DOT National Transportation Integrated Search
1996-01-01
FAST-TRAC: Selecting the most appropriate traffic control strategy for incident congestion management can have a major impact on the extent and duration of the resulting congestion. This research investigated the effectiveness of several contro...
Research on three-phase traffic flow modeling based on interaction range
NASA Astrophysics Data System (ADS)
Zeng, Jun-Wei; Yang, Xu-Gang; Qian, Yong-Sheng; Wei, Xu-Ting
2017-12-01
On the basis of the multiple velocity difference effect (MVDE) model and under short-range interaction, a new three-phase traffic flow model (S-MVDE) is proposed through careful consideration of how the relationship between the speeds of two adjacent cars influences the running state of the rear car. The random slowing rule in the MVDE model is modified to emphasize the influence of the interaction between two vehicles on the probability of deceleration. A single-lane model without a bottleneck structure is simulated under periodic boundary conditions, and it is shown that the traffic flow simulated by the S-MVDE model generates the synchronized flow of three-phase traffic theory. Under open boundary conditions, the model is extended by adding an on-ramp, and the congestion patterns caused by the bottleneck are simulated at different main-road and on-ramp flow rates. These are compared with the congestion patterns observed by Kerner et al., and the results are consistent with the congestion characteristics of three-phase traffic flow theory.
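The model family the S-MVDE extends can be illustrated with a minimal Nagel-Schreckenberg-style cellular automaton on a circular road. This sketch omits the multiple-velocity-difference and speed-adaptation rules that produce synchronized flow in the paper; it shows only the shared skeleton of acceleration, gap-respecting slowdown, random deceleration, and parallel update.

```python
# Minimal Nagel-Schreckenberg-style cellular automaton on a ring road
# (the S-MVDE adds velocity-difference terms on top of rules like these).
import random

random.seed(42)
L, VMAX, P_SLOW = 50, 5, 0.3
pos = list(range(0, 30, 3))            # 10 cars on a ring of 50 cells
vel = [0] * len(pos)

def step(pos, vel):
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])
    new_pos, new_vel = pos[:], vel[:]
    for k, i in enumerate(order):
        j = order[(k + 1) % n]         # car ahead on the ring
        gap = (pos[j] - pos[i] - 1) % L
        v = min(vel[i] + 1, VMAX, gap)  # accelerate, but respect the gap
        if v > 0 and random.random() < P_SLOW:
            v -= 1                      # random slowdown
        new_vel[i] = v
        new_pos[i] = (pos[i] + v) % L
    return new_pos, new_vel

for _ in range(100):
    pos, vel = step(pos, vel)
```

Because every car limits its speed to the gap measured against old positions, the parallel update is collision-free, and jams emerge spontaneously from the random slowdowns.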
NASA Astrophysics Data System (ADS)
Medina, Hanoi; Tian, Di; Srivastava, Puneet; Pelosi, Anna; Chirico, Giovanni B.
2018-07-01
Reference evapotranspiration (ET0) plays a fundamental role in agronomic, forestry, and water resources management. Estimating and forecasting ET0 have long been recognized as a major challenge for researchers and practitioners in these communities. This work explored the potential of multiple leading numerical weather predictions (NWPs) for estimating and forecasting summer ET0 at 101 U.S. Regional Climate Reference Network stations over nine climate regions across the contiguous United States (CONUS). Three leading global NWP model forecasts from the THORPEX Interactive Grand Global Ensemble (TIGGE) dataset were used in this study: the single-model ensemble forecasts from the European Centre for Medium-Range Weather Forecasts (EC), the National Centers for Environmental Prediction Global Forecast System (NCEP), and the United Kingdom Meteorological Office (MO), as well as multi-model ensemble forecasts from combinations of these NWP models. A regression calibration was employed to bias-correct the ET0 forecasts. The impact of individual forecast variables on ET0 forecasts was also evaluated. The results showed that the EC forecasts provided the least error and highest skill and reliability, followed by the MO and NCEP forecasts. The multi-model ensembles constructed from the combination of EC and MO forecasts performed slightly better than the single-model EC forecasts. The regression process greatly improved ET0 forecast performance, particularly for regions with stations near the coast or with complex orography. The performance of the EC forecasts was only slightly influenced by the size of the ensemble, particularly at short lead times. Even with fewer ensemble members, EC still performed better than the other two NWPs. Errors in the radiation forecasts, followed by those in the wind forecasts, had the most detrimental effects on ET0 forecast performance.
On the Inefficiency of Equilibria in Linear Bottleneck Congestion Games
NASA Astrophysics Data System (ADS)
de Keijzer, Bart; Schäfer, Guido; Telelis, Orestis A.
We study the inefficiency of equilibrium outcomes in bottleneck congestion games. These games model situations in which strategic players compete for a limited number of facilities. Each player allocates his weight to a (feasible) subset of the facilities with the goal to minimize the maximum (weight-dependent) latency that he experiences on any of these facilities. We derive upper and (asymptotically) matching lower bounds on the (strong) price of anarchy of linear bottleneck congestion games for a natural load balancing social cost objective (i.e., minimize the maximum latency of a facility). We restrict our studies to linear latency functions. Linear bottleneck congestion games still constitute a rich class of games and generalize, for example, load balancing games with identical or uniformly related machines with or without restricted assignments.
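The load balancing social cost named above can be made concrete on a tiny instance. The example is constructed by the editor, not taken from the paper: with identity latencies l_f(w) = w, the cost of an assignment is the largest total weight on any facility, and a stable assignment can be strictly worse than the optimum, which is what the price of anarchy measures.

```python
# Social cost in a load balancing game: maximum facility load (makespan)
# under linear (identity) latencies. Tiny instance constructed to show a
# Nash equilibrium that is worse than the social optimum.

def makespan(assignment, n_facilities):
    """assignment: list of (weight, facility) pairs."""
    load = [0.0] * n_facilities
    for w, f in assignment:
        load[f] += w
    return max(load)

# Four players with weights 3, 3, 2, 2 on two identical facilities.
# Social optimum pairs {3, 2} with {3, 2}: makespan 5.
opt = makespan([(3.0, 0), (2.0, 0), (3.0, 1), (2.0, 1)], 2)
# {3, 3} vs {2, 2} is a pure Nash equilibrium (any move raises the mover's
# latency: a weight-3 player would face 4 + 3 = 7 > 6), yet makespan is 6.
nash = makespan([(3.0, 0), (3.0, 0), (2.0, 1), (2.0, 1)], 2)
ratio = nash / opt            # inefficiency ratio, in the spirit of the PoA
```

The price of anarchy is the worst such ratio over all instances and equilibria; the paper derives matching upper and lower bounds for the linear bottleneck class.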
Congestion and recreation site demand: a model of demand-induced quality effects
Douglas, Aaron J.; Johnson, Richard L.
1993-01-01
This analysis focuses on problems of estimating the site-specific dollar benefits conferred by outdoor recreation sites in the face of congestion costs. Encounters, crowding effects and congestion costs have often been treated by natural resource economists in a piecemeal fashion. In the current paper, encounters and crowding effects are treated systematically. We emphasize the quantitative impact of congestion costs on site-specific estimates of the benefits conferred by improvements in outdoor recreation sites. The principal analytic conclusion is that techniques that economize on data requirements produce biased estimates of the benefits conferred by site improvements at facilities with significant crowding effects. The principal policy recommendation is that federal and state agencies should collect and store information on visitation rates, encounter levels and congestion costs at various outdoor recreation sites.
Multi-Year Revenue and Expenditure Forecasting for Small Municipal Governments.
1981-03-01
Keywords: management audit; econometric revenue forecast; gap and impact analysis; deterministic expenditure forecast; municipal forecasting; municipal budget form… This report presents a multi-year revenue and expenditure forecasting model for the City of Monterey, California. The Monterey model includes an econometric revenue forecast as well as forecasts based on expert judgment and trend analysis.
Selecting Single Model in Combination Forecasting Based on Cointegration Test and Encompassing Test
Jiang, Chuanjin; Zhang, Jing; Song, Fugen
2014-01-01
Combination forecasting takes all characters of each single forecasting method into consideration, and combines them to form a composite, which increases forecasting accuracy. The existing researches on combination forecasting select single model randomly, neglecting the internal characters of the forecasting object. After discussing the function of cointegration test and encompassing test in the selection of single model, supplemented by empirical analysis, the paper gives the single model selection guidance: no more than five suitable single models can be selected from many alternative single models for a certain forecasting target, which increases accuracy and stability. PMID:24892061
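A standard building block behind combination forecasting is the variance-minimizing weight for combining two models' errors; the paper's contribution is which single models to admit (via cointegration and encompassing tests), not this formula. The sketch below computes the optimal weight w on model 1 that minimizes the variance of w*e1 + (1-w)*e2, using invented error series.

```python
# Variance-minimizing two-model combination weight (Bates-Granger style):
# w = (s22 - s12) / (s11 + s22 - 2*s12), where sij are error (co)variances.

def combo_weight(err1, err2):
    """Weight on model 1 minimizing the variance of w*e1 + (1-w)*e2."""
    n = len(err1)
    m1, m2 = sum(err1) / n, sum(err2) / n
    s11 = sum((e - m1) ** 2 for e in err1) / n
    s22 = sum((e - m2) ** 2 for e in err2) / n
    s12 = sum((a - m1) * (b - m2) for a, b in zip(err1, err2)) / n
    return (s22 - s12) / (s11 + s22 - 2 * s12)

e1 = [0.5, -0.3, 0.4, -0.2, 0.1]    # model 1 forecast errors (small)
e2 = [-1.0, 1.2, -0.8, 0.9, -1.1]   # model 2 forecast errors (noisier)
w = combo_weight(e1, e2)            # most weight goes to the better model
```

Because w = 1 and w = 0 are both feasible, the in-sample combined variance can never exceed that of the better single model, which is why pruning the candidate set (as the paper recommends) matters more than the combination formula itself.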
Congestion based mechanism for route discovery in a V2I-V2V system applying smart devices and IoT.
Parrado, Natalia; Donoso, Yezid
2015-03-31
The Internet of Things (IoT) is a new paradigm in which objects in a specific context can be integrated into traditional communication networks to actively participate in solving a determined problem. The Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) technologies are specific cases of IoT and key enablers for Intelligent Transportation Systems (ITS). V2V and V2I have been widely used to solve different problems associated with transportation in cities, of which the most important is traffic congestion. A high percentage of congestion is usually caused by the inappropriate use of resources in the vehicular infrastructure. In addition, the integration of traffic congestion into decision making for vehicular traffic is a challenge due to its highly dynamic behavior. In this paper, an optimization model for load balancing of the congestion percentage of the streets is formulated. We then explore a fully congestion-oriented route discovery mechanism and propose a communication infrastructure to support it based on V2I and V2V communication. The mechanism is also compared with a modified Dijkstra's approach that reacts to congestion states. Finally, we compare the efficiency of the vehicle's trip with the efficiency of the use of the capacity of the vehicular network.
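The modified-Dijkstra baseline can be sketched as an ordinary shortest-path search whose edge weights are inflated by each street's congestion level, so the chosen route reacts to congestion states. The graph, travel times, and congestion values below are made up, and the penalty form t / (1 - congestion) is an assumed stand-in, not the paper's exact formulation.

```python
# Dijkstra with congestion-inflated edge weights (illustrative baseline).
import heapq

def congested_dijkstra(graph, congestion, source, target):
    """graph[u] = list of (v, free_flow_time); congestion[(u, v)] in [0, 1)."""
    dist = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, t in graph[u]:
            w = t / (1.0 - congestion.get((u, v), 0.0))  # congestion penalty
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return float("inf")

graph = {"A": [("B", 2.0), ("C", 5.0)], "B": [("D", 2.0)],
         "C": [("D", 1.0)], "D": []}
congestion = {("A", "B"): 0.8}            # A->B is heavily congested
t = congested_dijkstra(graph, congestion, "A", "D")
```

With A->B congested, the nominally longer route A->C->D (cost 6) beats A->B->D (inflated cost 12), which is exactly the reactive behavior the comparison baseline exhibits.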
Water balance models in one-month-ahead streamflow forecasting
Alley, William M.
1985-01-01
Techniques are tested that incorporate information from water balance models in making 1-month-ahead streamflow forecasts in New Jersey. The results are compared to those based on simple autoregressive time series models. The relative performance of the models is dependent on the month of the year in question. The water balance models are most useful for forecasts of April and May flows. For the stations in northern New Jersey, the April and May forecasts were made in order of decreasing reliability using the water-balance-based approaches, using the historical monthly means, and using simple autoregressive models. The water balance models were useful to a lesser extent for forecasts during the fall months. For the rest of the year the improvements in forecasts over those obtained using the simpler autoregressive models were either very small or the simpler models provided better forecasts. When using the water balance models, monthly corrections for bias are found to improve minimum mean-square-error forecasts as well as to improve estimates of the forecast conditional distributions.
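The monthly bias correction mentioned above can be sketched directly: estimate the mean forecast error per calendar month from past forecasts, then subtract it from new forecasts for that month. The numbers are invented for illustration.

```python
# Monthly bias correction: subtract the historical mean forecast error
# (forecast minus observed) for the forecast's calendar month.

def monthly_bias(forecasts, observations, months):
    """Mean forecast error per calendar month."""
    bias = {}
    for f, o, m in zip(forecasts, observations, months):
        bias.setdefault(m, []).append(f - o)
    return {m: sum(v) / len(v) for m, v in bias.items()}

def corrected(forecast, month, bias):
    return forecast - bias.get(month, 0.0)

past_f = [10.0, 12.0, 8.0, 9.0]          # past monthly-flow forecasts
past_o = [9.0, 10.0, 8.5, 8.5]           # observed flows
months = [4, 4, 5, 5]                    # April and May
bias = monthly_bias(past_f, past_o, months)
april_corrected = corrected(11.0, 4, bias)   # new April forecast, debiased
```

Here April forecasts run high by 1.5 on average, so a new April forecast of 11.0 is corrected down to 9.5; months with no systematic error are left untouched.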
Evaluation Of Statistical Models For Forecast Errors From The HBV-Model
NASA Astrophysics Data System (ADS)
Engeland, K.; Kolberg, S.; Renard, B.; Stensland, I.
2009-04-01
Three statistical models for the forecast errors for inflow to the Langvatn reservoir in Northern Norway have been constructed and tested according to how well the distribution and median values of the forecast errors fit the observations. For the first model, observed and forecasted inflows were transformed by the Box-Cox transformation before a first-order autoregressive model was constructed for the forecast errors, with parameters conditioned on climatic conditions. In the second model, the Normal Quantile Transformation (NQT) was applied to observed and forecasted inflows before a similar first-order autoregressive model was constructed for the forecast errors. For the last model, positive and negative errors were modeled separately; the errors were first NQT-transformed before a model was constructed in which the mean values were conditioned on climate, forecasted inflow and the previous day's error. To test the three models we applied three criteria: we wanted (a) the median values to be close to the observed values; (b) the forecast intervals to be narrow; (c) the distribution to be correct. The results showed that it is difficult to obtain a correct model for the forecast errors, and that the main challenge is to account for the autocorrelation in the errors. Models 1 and 2 gave similar results, and their main drawback is that the distributions are not correct: the 95% forecast intervals were well identified, but smaller forecast intervals were over-estimated and larger intervals were under-estimated. Model 3 gave a distribution that fits better, but the median values do not fit well since the autocorrelation is not properly accounted for. If the 95% forecast interval is of interest, Model 2 is recommended. If the whole distribution is of interest, Model 3 is recommended.
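The core of Models 1 and 2 is a first-order autoregressive model of the (transformed) forecast errors. The sketch below fits the AR(1) coefficient by least squares and produces a one-step-ahead error prediction; the Box-Cox/NQT transform step is omitted for brevity, and the error series is synthetic.

```python
# AR(1) model of forecast errors: e_t = phi * e_{t-1} + noise.
# Least-squares estimate of phi, then a one-step-ahead error prediction.

def fit_ar1(errors):
    num = sum(a * b for a, b in zip(errors[1:], errors[:-1]))
    den = sum(e * e for e in errors[:-1])
    return num / den

# Synthetic error series with true phi = 0.6 plus a small deterministic wiggle.
errors = [1.0]
for t in range(1, 30):
    errors.append(0.6 * errors[-1] + 0.01 * (-1) ** t)

phi = fit_ar1(errors)
next_error_forecast = phi * errors[-1]   # correction to apply to tomorrow's forecast
```

Predicting tomorrow's error this way, and subtracting it from the raw forecast, is how an AR(1) error model narrows the forecast interval; the difficulty the paper reports is that real errors are not fully captured by this autocorrelation structure.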
Accuracy of short‐term sea ice drift forecasts using a coupled ice‐ocean model
Zhang, Jinlun
2015-01-01
Arctic sea ice drift forecasts of 6 h–9 days for the summer of 2014 are generated using the Marginal Ice Zone Modeling and Assimilation System (MIZMAS); the model is driven by 6 h atmospheric forecasts from the Climate Forecast System (CFSv2). Forecast ice drift speed is compared to drifting buoys and other observational platforms. Forecast positions are compared with actual positions 24 h–8 days since forecast. Forecast results are further compared to those from forecasts generated using an ice velocity climatology driven by multiyear integrations of the same model. The results are presented in the context of scheduling the acquisition of high-resolution images that need to follow buoys or scientific research platforms. RMS errors for ice speed are on the order of 5 km/d for 24–48 h since forecast using the sea ice model, compared with 9 km/d using climatology. Predicted buoy position RMS errors are 6.3 km for 24 h and 14 km for 72 h since forecast. Model biases in ice speed and direction can be reduced by adjusting the air drag coefficient and water turning angle, but the adjustments do not affect verification statistics. This suggests that improved atmospheric forecast forcing may further reduce the forecast errors. The model remains skillful for 8 days. Using the forecast model increases the probability of tracking a target drifting in sea ice with a 10 km × 10 km image from 60 to 95% for a 24 h forecast and from 27 to 73% for a 48 h forecast. PMID:27818852
Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation
NASA Astrophysics Data System (ADS)
Zhao, T.; Cai, X.; Yang, D.
2010-12-01
Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic forecast error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, there are various indices reflecting the magnitude of streamflow forecast uncertainty; meanwhile, few models illustrate the forecast uncertainty evolution process. This research introduces Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecast as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies the reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as the forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as efficient as ESF. 
Moreover, streamflow variability and reservoir capacity can change the magnitude of the effects of forecast uncertainty, but not the relative merit of DSF, DPSF, and ESF. [Figure: schematic diagram of the increase in forecast uncertainty with forecast lead time and the dynamic updating property of real-time streamflow forecasts.]
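The additive form of the Martingale Model of Forecast Evolution can be sketched in a few lines: the forecast for a fixed target period is revised each period by a zero-mean innovation, so the forecast sequence is a martingale and uncertainty shrinks as lead time shortens. The innovation variances below are illustrative, not calibrated to any basin.

```python
# Minimal additive MMFE sketch: forecast updates are zero-mean innovations,
# so the expected revision is zero and the last path value plays the role of
# the realized inflow.
import random

random.seed(0)

def simulate_forecast_path(initial, sigmas):
    """Forecast after each update; the final value stands in for the actual."""
    path = [initial]
    for s in sigmas:
        path.append(path[-1] + random.gauss(0.0, s))
    return path

sigmas = [4.0, 3.0, 2.0, 1.0]        # innovation std dev shrinks toward delivery
updates = []
for _ in range(2000):
    p = simulate_forecast_path(100.0, sigmas)
    updates.extend(b - a for a, b in zip(p, p[1:]))

mean_update = sum(updates) / len(updates)  # should be close to zero
```

Feeding such simulated forecast paths into DP/SDP/SOP operation models is how the paper quantifies the cost of forecast uncertainty as a function of its evolution.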
ERIC Educational Resources Information Center
Moore, Crystal Dea
2010-01-01
A cross-sectional study of 76 family caregivers of older veterans with congestive heart failure utilized the McMaster model of family functioning to examine the impact of family functioning variables (problem solving, communication, roles, affective responsiveness, and affective involvement) on caregiver burden dimensions (relationship burden,…
Integration of Behind-the-Meter PV Fleet Forecasts into Utility Grid System Operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoff, Thomas Hoff; Kankiewicz, Adam
Four major research objectives were completed over the course of this study. Three of the objectives were to evaluate three new state-of-the-art solar irradiance forecasting models. The fourth objective was to improve the California Independent System Operator’s (ISO) load forecasts by integrating behind-the-meter (BTM) PV forecasts. The three new state-of-the-art solar irradiance forecasting models included: the infrared (IR) satellite-based cloud motion vector (CMV) model; the WRF-SolarCA model and variants; and the Optimized Deep Machine Learning (ODML)-training model. The first two forecasting models targeted known weaknesses in current operational solar forecasts. They were benchmarked against existing operational numerical weather prediction (NWP) forecasts, visible satellite CMV forecasts, and measured PV plant power production. The IR CMV, WRF-SolarCA, and ODML-training forecasting models all improved the forecast to a significant degree. Improvements varied depending on time of day, cloudiness index, and geographic location. The fourth objective was to demonstrate that the California ISO’s load forecasts could be improved by integrating BTM PV forecasts. This objective represented the project’s most exciting and applicable gains. Operational BTM forecasts consisting of 200,000+ individual rooftop PV forecasts were delivered into the California ISO’s real-time automated load forecasting (ALFS) environment. They were then evaluated side-by-side with operational load forecasts with no BTM treatment. Overall, ALFS-BTM day-ahead (DA) forecasts performed better than baseline ALFS forecasts when compared to actual load data. Specifically, ALFS-BTM DA forecasts were observed to have the largest reduction of error during the afternoon on cloudy days.
Shorter-term 30-minute-ahead ALFS-BTM forecasts were shown to have less error under all sky conditions, especially during morning periods when traditional load forecasts often experience their largest uncertainties. This work culminated in a GO decision by the California ISO to include zonal BTM forecasts in its operational load forecasting system. The California ISO's Manager of Short Term Forecasting, Jim Blatchford, summarized the research performed in this project with the following quote: "The behind-the-meter (BTM) California ISO region forecasting research performed by Clean Power Research and sponsored by the Department of Energy's SUNRISE program was an opportunity to verify value and demonstrate improved load forecast capability. In 2016, the California ISO will be incorporating the BTM forecast into the Hour Ahead and Day Ahead load models to look for improvements in the overall load forecast accuracy as BTM PV capacity continues to grow."
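The abstract does not spell out how BTM treatment enters the load model. A common formulation, sketched below under that assumption, reconstitutes gross load by adding a BTM PV estimate back to metered load (since rooftop PV masks true demand), then subtracts the PV forecast to recover the net load the grid must serve. All numbers are hypothetical.

```python
def reconstitute_gross_load(metered_load, btm_pv_estimate):
    """Metered load understates true demand by whatever rooftop PV served
    locally; add the PV estimate (MW) back before training a load model."""
    return [m + pv for m, pv in zip(metered_load, btm_pv_estimate)]

def net_load_forecast(gross_load_fcst, btm_pv_fcst):
    """Subtract a behind-the-meter PV forecast (MW) from a gross-load
    forecast (MW), hour by hour, to obtain net load on the grid."""
    return [load - pv for load, pv in zip(gross_load_fcst, btm_pv_fcst)]

# Hypothetical morning-to-afternoon hours (MW)
metered = [20000, 21000, 22500, 23000]
pv_est = [0, 1500, 3500, 4000]
gross = reconstitute_gross_load(metered, pv_est)
net = net_load_forecast(gross, pv_est)  # recovers metered load here
```

In operation the gross-load model would be driven by weather inputs rather than simply inverted, but the add-back/subtract bookkeeping is the essential BTM step.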
NASA Astrophysics Data System (ADS)
Yuan, Kai; Knoop, Victor L.; Hoogendoorn, Serge P.
2017-01-01
On freeways, congestion always leads to a capacity drop: the queue discharge rate is lower than the pre-queue capacity. Our recent research findings indicate that the queue discharge rate increases with the speed in congestion, that is, the capacity drop is strongly correlated with the congestion state. Incorporating this varying capacity drop into a kinematic wave model is essential for assessing the consequences of control strategies. However, to the best of the authors' knowledge, no such model exists. This paper fills the research gap by presenting a Lagrangian kinematic wave model, "Lagrangian" denoting that the new model is solved in Lagrangian coordinates. The new model reproduces the capacity drops accompanying both stop-and-go waves (on homogeneous freeway sections) and standing queues (at nodes) in a network, and can be applied to network operations. In this Lagrangian kinematic wave model, the queue discharge rate (or the capacity drop) is a function of vehicular speed in traffic jams. Four case studies on links as well as at lane-drop and on-ramp nodes show that the Lagrangian kinematic wave model reproduces capacity drops well, consistent with empirical observations.
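The paper's model is not reproduced here, but the core idea of solving a kinematic wave model in Lagrangian (vehicle-following) coordinates can be sketched: each vehicle's speed is read off a fundamental diagram as a function of its spacing to the vehicle ahead. The triangular fundamental diagram and all parameters below are illustrative assumptions, and the speed-dependent queue discharge rate that is the paper's contribution is omitted.

```python
def speed_from_spacing(s, v_max=30.0, s_jam=7.0, w=5.0):
    # Triangular fundamental diagram written in terms of spacing s (m/veh):
    # congested branch v = w * (s / s_jam - 1), capped at free-flow speed v_max.
    return max(0.0, min(v_max, w * (s / s_jam - 1.0)))

def simulate(n_veh=50, dt=0.5, steps=600, stop_start=100, stop_end=200):
    """Lagrangian solution: march each vehicle forward at the speed implied
    by its spacing. A temporary leader stoppage launches a stop-and-go wave."""
    x = [-30.0 * i for i in range(n_veh)]  # initial spacing 30 m, leader first
    for t in range(steps):
        new_x = x[:]
        leader_v = 0.0 if stop_start <= t < stop_end else 30.0
        new_x[0] = x[0] + leader_v * dt
        for i in range(1, n_veh):
            s = x[i - 1] - x[i]  # spacing to the vehicle ahead (old positions)
            new_x[i] = x[i] + speed_from_spacing(s) * dt
        x = new_x
    return x
```

With these parameters the follower speed vanishes at the jam spacing, so vehicle ordering is preserved and the stoppage propagates upstream as a backward-moving wave.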
Forecasting biodiversity in breeding birds using best practices
Taylor, Shawn D.; White, Ethan P.
2018-01-01
Biodiversity forecasts are important for conservation, management, and evaluating how well current models characterize natural systems. While the number of biodiversity forecasts is increasing, there is little information available on how well these forecasts work. Most biodiversity forecasts are not evaluated to determine how well they predict future diversity, fail to account for uncertainty, and do not use time-series data that capture the actual dynamics being studied. We addressed these limitations by using best practices to explore our ability to forecast the species richness of breeding birds in North America. We used hindcasting to evaluate six different modeling approaches for predicting richness. Hindcasts for each method were evaluated annually for a decade at 1,237 sites distributed throughout the continental United States. All models explained more than 50% of the variance in richness, but none of them consistently outperformed a baseline model that predicted constant richness at each site. The best practices implemented in this study directly influenced the forecasts and evaluations. Stacked species distribution models and “naive” forecasts produced poor estimates of uncertainty, and accounting for this resulted in these models dropping in relative performance compared to other models. Accounting for observer effects improved model performance overall, but also changed the rank ordering of models because it did not improve the accuracy of the “naive” model. Considering the forecast horizon revealed that prediction accuracy decreased across all models as the time horizon of the forecast increased. To facilitate the rapid improvement of biodiversity forecasts, we emphasize the value of specific best practices in making forecasts and evaluating forecasting methods. PMID:29441230
Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS)
NASA Astrophysics Data System (ADS)
OConnor, A.; Kirtman, B. P.; Harrison, S.; Gorman, J.
2016-02-01
Current US Navy forecasting systems cannot easily incorporate extended-range forecasts that can improve mission readiness and effectiveness; ensure safety; and reduce cost, labor, and resource requirements. If Navy operational planners had systems that incorporated these forecasts, they could plan missions using more reliable and longer-term weather and climate predictions. Further, using multi-model forecast ensembles instead of single forecasts would produce higher predictive performance. Extended-range multi-model forecast ensembles, such as those available in the North American Multi-Model Ensemble (NMME), are ideal for system integration because of their high-skill predictions; however, even higher-skill predictions can be produced if forecast model ensembles are combined correctly. While many methods for weighting models exist, choosing the best method in a given environment requires expert knowledge of the models and combination methods. We present an innovative approach that uses machine learning to combine extended-range predictions from multi-model forecast ensembles and generate a probabilistic forecast for any region of the globe up to 12 months in advance. Our machine-learning approach uses 30 years of hindcast predictions to learn patterns of forecast model successes and failures. Each model is assigned a weight for each environmental condition, 100 km² region, and day, given any expected environmental information. These weights are then applied to the respective predictions for the region and time of interest to effectively stitch together a single, coherent probabilistic forecast. Our experimental results demonstrate the benefits of our approach in producing extended-range probabilistic forecasts for regions and time periods of interest that are superior, in terms of skill, to individual NMME forecast models and commonly weighted models.
The probabilistic forecast leverages the strengths of three NMME forecast models to predict environmental conditions for an area spanning from San Diego, CA to Honolulu, HI, seven months in advance. Key findings include: weighted combinations of models are strictly better than individual models; machine-learned combinations are better still; and forecasts produced with our approach most often have the highest ranked probability skill score.
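The abstract does not give the learned weighting scheme. A minimal stand-in, assuming inverse-MSE weighting within a single environmental-condition bin, illustrates how per-model weights derived from hindcasts can be normalized and applied to new forecasts. Model names and values are hypothetical.

```python
def inverse_mse_weights(hindcasts, obs):
    """hindcasts: dict model_name -> list of past forecasts aligned with obs.
    Weight each model by the inverse of its hindcast mean squared error,
    then normalize the weights to sum to one."""
    inv = {}
    for name, preds in hindcasts.items():
        mse = sum((p - o) ** 2 for p, o in zip(preds, obs)) / len(obs)
        inv[name] = 1.0 / (mse + 1e-9)  # guard against a perfect hindcast
    total = sum(inv.values())
    return {name: v / total for name, v in inv.items()}

def combine(forecasts, weights):
    # Weighted blend of the models' current forecasts for one region/time.
    return sum(weights[name] * f for name, f in forecasts.items())

# Hypothetical hindcasts for one region and condition bin
obs = [1.0, 2.0, 3.0, 4.0]
hind = {
    "modelA": [1.1, 2.1, 2.9, 4.2],  # small errors -> large weight
    "modelB": [0.0, 3.0, 2.0, 5.0],  # large errors -> small weight
}
w = inverse_mse_weights(hind, obs)
fcst = combine({"modelA": 3.0, "modelB": 5.0}, w)
```

The approach in the paper learns weights per condition, region, and day; this sketch shows only the normalize-and-blend mechanics for one such bin.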
Network traffic behaviour near phase transition point
NASA Astrophysics Data System (ADS)
Lawniczak, A. T.; Tang, X.
2006-03-01
We explore packet traffic dynamics in a data network model near the phase transition point from free flow to congestion. The model of the data network is an abstraction of the Network Layer of the OSI (Open Systems Interconnection) Reference Model of packet-switching networks. The Network Layer is responsible for routing packets across the network from their sources to their destinations and for controlling congestion in data networks. Using the model, we investigate spatio-temporal packet traffic dynamics near the phase transition point for various network connection topologies and for static and adaptive routing algorithms. We present selected simulation results and analyze them.
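The authors' OSI-layer model is not reproduced here, but the free-flow-to-congestion phase transition can be illustrated with a toy ring of routers under static routing: below a critical injection rate (roughly 1/hops per node per step), queues stay short; above it, they grow without bound. All parameters are assumptions for illustration.

```python
import random

def ring_network(n_nodes=50, inject_rate=0.5, steps=2000, seed=1):
    """Minimal packet model: each step, every node receives a new packet with
    probability inject_rate, forwards at most one queued packet to the next
    node, and delivers packets that have travelled `hops` links."""
    random.seed(seed)
    hops = 5
    queues = [[] for _ in range(n_nodes)]
    for _ in range(steps):
        # inject: new packets carry a remaining hop count
        for q in queues:
            if random.random() < inject_rate:
                q.append(hops)
        # forward: one packet per node per step (finite service capacity)
        moved = []
        for i, q in enumerate(queues):
            if q:
                ttl = q.pop(0) - 1
                if ttl > 0:
                    moved.append(((i + 1) % n_nodes, ttl))
        for j, ttl in moved:
            queues[j].append(ttl)
    return sum(len(q) for q in queues) / n_nodes  # mean queue length

q_free = ring_network(inject_rate=0.1)  # below the critical rate 1/hops = 0.2
q_jam = ring_network(inject_rate=0.3)   # above it: backlog grows steadily
```

Each packet demands `hops` units of service, so per-node load is `hops * inject_rate`; the transition sits where that product reaches the unit service capacity.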
NASA Astrophysics Data System (ADS)
Prada, Jose Fernando
Keeping a contingency reserve in power systems is necessary to preserve the security of real-time operations. This work studies two different approaches to the optimal allocation of energy and reserves in the day-ahead generation scheduling process. Part I presents a stochastic security-constrained unit commitment model to co-optimize energy and the locational reserves required to respond to a set of uncertain generation contingencies, using a novel state-based formulation. The model is applied in an offer-based electricity market to allocate contingency reserves throughout the power grid, in order to comply with the N-1 security criterion under transmission congestion. The objective is to minimize expected dispatch and reserve costs, together with post-contingency corrective redispatch costs, modeling the probability of generation failure and the associated post-contingency states. The characteristics of the scheduling problem are exploited to formulate a computationally efficient method, consistent with established operational practices. We simulated the distribution of locational contingency reserves on the IEEE RTS96 system and compared the results with the conventional deterministic method. We found that assigning locational spinning reserves can guarantee an N-1 secure dispatch accounting for transmission congestion at a reasonable extra cost. The simulations also showed little value in allocating downward reserves but sizable operating savings from co-optimizing locational non-spinning reserves. Overall, the results indicate the computational tractability of the proposed method. Part II presents a distributed generation scheduling model to optimally allocate energy and spinning reserves among competing generators in a day-ahead market. The model is based on coordination between individual generators and a market entity.
The proposed method uses forecasting, augmented pricing, and locational signals to induce efficient commitment of generators based on firm posted prices. It is price-based but does not rely on multiple iterations, minimizes information exchange, and simplifies the market clearing process. Simulations of the distributed method performed on a six-bus test system showed that, using an appropriate set of prices, it is possible to emulate the results of a conventional centralized solution without the need to provide make-whole payments to generators. Likewise, they showed that the distributed method can accommodate transactions with different products and complex security constraints.
Software forecasting as it is really done: A study of JPL software engineers
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.
1993-01-01
This paper summarizes the results to date of a Jet Propulsion Laboratory internally funded research task studying the costing process and parameters used by internally recognized software cost estimating experts. Protocol analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and the mental models were found to cluster around three forecasting techniques. Further partitioning of the mental models revealed a clustering of activities that is very suggestive of a forecasting life cycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps; the multiple forecasting steps involved either forecasting software size or an additional effort forecast. Virtually no subjects used risk reduction steps in combination. The results of the analysis include the identification of a core set of well-defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.
Counteracting structural errors in ensemble forecast of influenza outbreaks.
Pei, Sen; Shaman, Jeffrey
2017-10-13
For influenza forecasts generated using dynamical models, forecast inaccuracy is partly attributable to the nonlinear growth of error. As a consequence, quantification of the nonlinear error structure in current forecast models is needed so that this growth can be corrected and forecast skill improved. Here, we inspect the error growth of a compartmental influenza model and find that a robust error structure arises naturally from the nonlinear model dynamics. By counteracting these structural errors, diagnosed using error breeding, we develop a new forecast approach that combines dynamical error correction and statistical filtering techniques. In retrospective forecasts of historical influenza outbreaks for 95 US cities from 2003 to 2014, overall forecast accuracy for outbreak peak timing, peak intensity, and attack rate is substantially improved for predicted lead times up to 10 weeks. This error growth correction method can be generalized to improve the forecast accuracy of other infectious disease dynamical models. Inaccuracy of influenza forecasts based on dynamical models is partly due to nonlinear error growth; here the authors address the error structure of a compartmental influenza model and develop a new, improved forecast approach combining dynamical error correction and statistical filtering techniques.
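Error breeding, the diagnostic named in the abstract, can be sketched on a toy SIR model (the paper uses a more elaborate compartmental influenza model plus statistical filtering, not reproduced here): a perturbed run is integrated alongside a control run, and their difference is periodically rescaled to a fixed amplitude so the growing error structure, the bred vector, can be tracked.

```python
import math

def sir_step(state, beta=0.5, gamma=0.25, dt=0.1):
    # One Euler step of a normalized SIR model.
    s, i, r = state
    new_inf = beta * s * i * dt
    rec = gamma * i * dt
    return (s - new_inf, i + new_inf - rec, r + rec)

def breed(steps=400, rescale_every=20, size=1e-4):
    """Run control and perturbed trajectories; every `rescale_every` steps,
    record the error growth factor and rescale the bred vector to `size`."""
    control = (0.99, 0.01, 0.0)
    perturbed = (0.99 + size, 0.01 - size, 0.0)
    growth = []
    for t in range(1, steps + 1):
        control = sir_step(control)
        perturbed = sir_step(perturbed)
        if t % rescale_every == 0:
            diff = [p - c for p, c in zip(perturbed, control)]
            norm = math.sqrt(sum(d * d for d in diff))
            growth.append(norm / size)  # >1 means the error grew
            # rescale the bred vector back to the initial amplitude
            perturbed = tuple(c + d * size / norm
                              for c, d in zip(control, diff))
    return growth
```

Growth factors above one during the epidemic's rise expose the structural directions in which forecast error amplifies, which is what the correction step then counteracts.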
Adaptive time-variant models for fuzzy-time-series forecasting.
Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching
2010-12-01
Fuzzy time series have been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of the universe of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of the fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series, including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experimental results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.
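The fuzzy-time-series machinery itself is involved, but the adaptive-window idea alone can be sketched with a deliberately simplified stand-in forecaster (a moving average rather than fuzzy rules): candidate window sizes are scored by one-step accuracy on the training series, and the best is kept.

```python
def moving_average_forecast(series, window):
    # One-step-ahead forecast: mean of the last `window` values.
    return sum(series[-window:]) / window

def pick_window(train, candidates=(2, 3, 4, 5)):
    """Choose the analysis window that minimizes one-step MAPE on the
    training series, mimicking the adaptive-window idea (the real ATVF
    model adapts a fuzzy-time-series window, not a moving average)."""
    best, best_err = None, float("inf")
    for w in candidates:
        errs = []
        for t in range(w, len(train)):
            f = moving_average_forecast(train[:t], w)
            errs.append(abs(f - train[t]) / abs(train[t]))
        mape = 100 * sum(errs) / len(errs)
        if mape < best_err:
            best, best_err = w, mape
    return best, best_err
```

On a steadily trending series, shorter windows lag less, so the selection step picks the smallest candidate; on noisy series it would favor longer ones.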
Two approaches to forecast Ebola synthetic epidemics.
Champredon, David; Li, Michael; Bolker, Benjamin M; Dushoff, Jonathan
2018-03-01
We use two modelling approaches to forecast synthetic Ebola epidemics in the context of the RAPIDD Ebola Forecasting Challenge. The first approach is a standard stochastic compartmental model that aims to forecast incidence, hospitalization and deaths among both the general population and health care workers. The second is a model based on the renewal equation with latent variables that forecasts incidence in the whole population only. We describe fitting and forecasting procedures for each model and discuss their advantages and drawbacks. We did not find that one model was consistently better in forecasting than the other. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
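The second model rests on the renewal equation, which the abstract names but does not write out. A minimal sketch, with a hypothetical generation-interval distribution and a fixed reproduction number, projects incidence forward as I_t = R_t * sum_s w_s * I_{t-s}.

```python
def renewal_forecast(incidence, r_t, gen_interval, horizon):
    """Project incidence forward with the renewal equation
    I_t = R_t * sum_s w_s * I_{t-s}, where w is the generation-interval pmf
    (w[0] is the weight at lag 1, w[1] at lag 2, ...)."""
    series = list(incidence)
    for _ in range(horizon):
        force = sum(w * series[-s]
                    for s, w in enumerate(gen_interval, start=1)
                    if s <= len(series))
        series.append(r_t * force)
    return series[len(incidence):]

w = [0.2, 0.5, 0.3]   # hypothetical generation-interval weights (sum to 1)
hist = [10.0, 12.0, 15.0]
proj = renewal_forecast(hist, r_t=1.5, gen_interval=w, horizon=4)
```

In the paper R_t and latent states are fitted to data rather than fixed; this shows only the forward projection step.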
Streamflow forecasts from WRF precipitation for flood early warning in mountain tropical areas
NASA Astrophysics Data System (ADS)
Rogelis, María Carolina; Werner, Micha
2018-02-01
Numerical weather prediction (NWP) models are fundamental to extending forecast lead times beyond the concentration time of a watershed. Particularly for flash flood forecasting in tropical mountainous watersheds, forecast precipitation is required to provide timely warnings. This paper aims to assess the potential of NWP for flood early warning purposes, and the possible improvement that bias correction can provide, in a tropical mountainous area. The paper focuses on the comparison of streamflows obtained from the post-processed precipitation forecasts, particularly ensemble forecasts and their potential for providing skilful flood forecasts. The Weather Research and Forecasting (WRF) model is used to produce precipitation forecasts that are post-processed and used to drive a hydrologic model. Discharge forecasts obtained from the hydrological model are used to assess the skill of the WRF model. The results show that post-processed WRF precipitation adds value to the flood early warning system when compared to zero-precipitation forecasts, although the precipitation forecast used in this analysis showed little added value when compared to climatology. However, the reduction of biases obtained from the post-processed ensembles shows the potential of this method and model to provide usable precipitation forecasts in tropical mountainous watersheds. The need for a more detailed evaluation of the WRF model in the study area is highlighted, particularly the identification of the most suitable parameterisation, given the inability of the model to adequately represent the convective precipitation found in the study area.
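The post-processing method is not detailed in this abstract; quantile mapping is one common choice for bias-correcting NWP precipitation, sketched here empirically: a forecast value is assigned the observed value at the same climatological rank. Sample values are hypothetical.

```python
def quantile_map(value, fcst_sample, obs_sample):
    """Empirical quantile mapping: find the rank of `value` in the forecast
    climatology and return the observation at the same rank."""
    fs = sorted(fcst_sample)
    os_ = sorted(obs_sample)
    # empirical CDF position of `value` among past forecasts
    rank = sum(1 for f in fs if f <= value) / len(fs)
    # map that position into the sorted observations
    idx = max(0, min(len(os_) - 1, int(rank * len(os_)) - 1))
    return os_[idx]

# Toy climatologies: the model systematically doubles observed rainfall (mm)
fcst_clim = [2, 4, 6, 8, 10]
obs_clim = [1, 2, 3, 4, 5]
corrected = quantile_map(6, fcst_clim, obs_clim)
```

With long samples this converges to the usual CDF-matching correction; operational schemes typically fit parametric distributions per season and lead time.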
NASA Astrophysics Data System (ADS)
Shaman, J.; Stieglitz, M.; Zebiak, S.; Cane, M.; Day, J. F.
2002-12-01
We present an ensemble local hydrologic forecast derived from the seasonal forecasts of the International Research Institute for Climate Prediction (IRI). Three-month seasonal forecasts were used to resample historical meteorological conditions and generate ensemble forcing datasets for a TOPMODEL-based hydrology model. Eleven retrospective forecasts were run at a Florida site and a New York site. Forecast skill was assessed for mean-area modeled water table depth (WTD), i.e., near-surface soil wetness conditions, and compared with WTD simulated with observed data. Hydrology model forecast skill was evident at the Florida site but not at the New York site. At the Florida site, persistence of hydrologic conditions and local skill of the IRI seasonal forecast contributed to the local hydrologic forecast skill. This forecast will permit probabilistic prediction of future hydrologic conditions. At the Florida site, we have also quantified the link between modeled WTD (i.e., drought) and the amplification and transmission of St. Louis encephalitis virus (SLEV). We derive an empirical relationship between modeled land surface wetness and levels of SLEV transmission associated with human clinical cases. We then combine the seasonal forecasts of local, modeled WTD with this empirical relationship and produce retrospective probabilistic seasonal forecasts of epidemic SLEV transmission in Florida. Epidemic SLEV transmission forecast skill is demonstrated. These findings will permit real-time forecasts of drought and resultant SLEV transmission in Florida.
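The abstract describes resampling historical meteorological conditions according to a seasonal forecast. One plausible reading, assumed here, is drawing historical analog years with probabilities given by the forecast tercile categories (below/near/above normal); years and probabilities below are illustrative.

```python
import random

def resample_years(years_by_tercile, tercile_probs, n_members, seed=42):
    """Draw historical analog years with probabilities given by a seasonal
    tercile forecast (below/near/above normal), to build ensemble forcings."""
    random.seed(seed)
    categories = ["below", "near", "above"]
    members = []
    for _ in range(n_members):
        cat = random.choices(categories, weights=tercile_probs)[0]
        members.append(random.choice(years_by_tercile[cat]))
    return members

# Hypothetical dry-leaning forecast: 60/30/10 across terciles
years = {"below": [1988, 2007], "near": [1990, 2001], "above": [1997, 2015]}
members = resample_years(years, (0.6, 0.3, 0.1), n_members=100)
```

Each sampled year's historical meteorology would then force one hydrology-model ensemble member.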
Shimizu, Akira; Kobayashi, Akira; Motoyama, Hiroaki; Sakai, Hiroshi; Yamada, Akira; Yoshizawa, Akihiko; Momose, Masanobu; Kadoya, Masumi; Miyagawa, Shin-ichi
2015-09-01
To evaluate the features of hepatic congestion on gadoxetate disodium (Gd-EOB-DTPA)-enhanced magnetic resonance imaging (MRI) and the mechanisms responsible for the radiological findings in a rat model of partial liver congestion. A conventional T1-weighted spin-echo sequence of the liver was performed using a 1.5T magnetic resonance imager with an 80-mm magnetic aperture for animal studies. We induced regional congestion using partial left lateral hepatic vein ligation (n = 5) and evaluated the following in both congestive liver (CL) and noncongestive liver (non-CL): 1) chronological changes in the relative enhancement (RE) up to 60 minutes after Gd-EOB-DTPA administration, and 2) mRNA and protein expression of rat organic anion-transporting polypeptide 1a1 (Oatp1a1). The RE in the CL reached a small peak (18%) at 5 minutes, corresponding to approximately half of the value observed in the non-CL, then slowly decreased in a linear manner thereafter. The degree of RE in the CL was significantly lower than that in the non-CL for up to 30 minutes (P < 0.05). An immunohistological examination showed that Oatp1a1 protein expression was downregulated in the CL. The mRNA level of Oatp1a1 in the CL was significantly upregulated compared with that in control rat liver (P = 0.046), whereas no significant difference was observed between the CL and the non-CL (P = 0.698). The reduced signal intensity in the CL on Gd-EOB-DTPA-enhanced MRI could be explained by the decreased uptake of Gd-EOB-DTPA via the Oatp1a1 protein in the congested area. © 2015 Wiley Periodicals, Inc.
Making the Traffic Operations Case for Congestion Pricing: Operational Impacts of Congestion Pricing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, Shih-Miao; Hu, Patricia S; Davidson, Diane
2011-02-01
Congestion begins when an excess of vehicles occupies a segment of roadway at a given time, resulting in speeds that are significantly slower than normal or 'free-flow' speeds. Congestion often means stop-and-go traffic. The transition occurs when vehicle density (the number of vehicles per mile in a lane) exceeds a critical level. Once traffic enters a state of congestion, recovery, or the time to return to a free-flow state, is lengthy, and during the recovery process delay continues to accumulate. The breakdown in speed and flow greatly impedes the efficient operation of the freeway system, resulting in economic, mobility, environmental, and safety problems. Freeways are designed to function as access-controlled highways characterized by uninterrupted traffic flow, so references to freeway performance relate primarily to the quality of traffic flow or traffic conditions as experienced by users of the freeway. The maximum flow or capacity of a freeway segment is reached while traffic is moving freely. As a result, freeways are most productive when they carry capacity flows at 60 mph, whereas lower speeds impose freeway delay, resulting in bottlenecks. Bottlenecks may be caused by physical disruptions, such as a reduced number of lanes, a change in grade, or an on-ramp with a short merge lane. This type of bottleneck occurs on a predictable or 'recurrent' basis at the same time of day and day of week. Recurrent congestion accounts for 45% of congestion and stems primarily from bottlenecks (40%) as well as inadequate signal timing (5%). Nonrecurring congestion results from crashes, work-zone disruptions, adverse weather conditions, and special events that create surges in demand, and accounts for over 55% of experienced congestion. Figure 1.1 shows that nonrecurring congestion is composed of traffic incidents (25%), severe weather (15%), work zones (10%), and special events (5%).
Between 1995 and 2005, annual peak-period delay per traveler grew by 22% nationally, from 36 hours to 44 hours of delay per year. Peak delay per traveler grew by one-third in medium-size urban areas over the 10-year period. The traffic engineering community has developed an arsenal of integrated tools to mitigate the impacts of congestion on freeway throughput and performance, including pricing of capacity to manage demand for travel. Congestion pricing is a strategy that dynamically matches demand with available capacity. A congestion price is a user fee equal to the added cost imposed on other travelers as a result of the last traveler's entry into the highway network. The concept is based on the idea that motorists should pay for the additional congestion they create when entering a congested road. It calls for fees that vary with the level of congestion, with the price mechanism applied to make travelers more fully aware of the congestion externality they impose on other travelers and on the system itself. The operational rationales for instituting pricing strategies are to improve the efficiency of operations in a corridor and/or to better manage congestion. To this end, the objectives of this project were to: (1) better understand and quantify the impacts of congestion pricing strategies on traffic operations through the study of actual projects, and (2) better understand and quantify those impacts through the use of modeling and other analytical methods. Specifically, the project was to identify credible analytical procedures that FHWA can use to quantify the impacts of various congestion pricing strategies on traffic flow (throughput) and congestion.
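The externality definition above maps directly onto a standard link performance function. A sketch, assuming the common BPR (Bureau of Public Roads) travel-time curve rather than any model used in the project, computes the toll as the marginal delay one additional entrant imposes on all current users, priced at a value of time. All parameter values are illustrative.

```python
def bpr_time(v, t0=10.0, c=2000.0, alpha=0.15, beta=4):
    # BPR link travel time (minutes) at volume v (veh/h) with capacity c.
    return t0 * (1 + alpha * (v / c) ** beta)

def marginal_external_cost(v, t0=10.0, c=2000.0, alpha=0.15, beta=4):
    """Extra minutes imposed on all other users by one more entrant:
    v * dt/dv, with dt/dv differentiated from the BPR function."""
    dt_dv = t0 * alpha * beta * (v / c) ** (beta - 1) / c
    return v * dt_dv

def toll(v, value_of_time=0.5, **kw):
    # Price the externality: minutes of imposed delay times $/min.
    return value_of_time * marginal_external_cost(v, **kw)
```

At volume equal to capacity with these parameters, each entrant imposes 6 extra vehicle-minutes on the stream, so the efficient toll at $0.50/min is $3.00.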
Improving medium-range and seasonal hydroclimate forecasts in the southeast USA
NASA Astrophysics Data System (ADS)
Tian, Di
Accurate hydro-climate forecasts are important for decision making by water managers, agricultural producers, and other stakeholders. Numerical weather prediction models and general circulation models may have potential for improving hydro-climate forecasts at different scales. In this study, forecast analogs of the Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) based on different approaches were evaluated for medium-range reference evapotranspiration (ETo), irrigation scheduling, and urban water demand forecasts in the southeast United States; the Climate Forecast System version 2 (CFSv2) and the North American Multi-Model Ensemble (NMME) were statistically downscaled for seasonal forecasts of ETo, precipitation (P), and 2-m temperature (T2M) at the regional level. The GFS mean temperature (Tmean), relative humidity, and wind speed (Wind) reforecasts combined with the climatology of Reanalysis 2 solar radiation (Rs) produced higher skill than using the direct GFS output only. Constructed analogs showed slightly higher skill than natural analogs for deterministic forecasts. Both irrigation scheduling driven by the GEFS-based ETo forecasts and GEFS-based ETo forecast skill were generally positive up to one week throughout the year. The GEFS improved ETo forecast skill compared to the GFS. The GEFS-based analog forecasts for the input variables of an operational urban water demand model were skillful when applied in the Tampa Bay area. The modified operational models driven by GEFS analog forecasts showed higher forecast skill than the operational model based on persistence. The results for CFSv2 seasonal forecasts showed that maximum temperature (Tmax) and Rs had the greatest influence on ETo. The downscaled Tmax showed the highest predictability, followed by Tmean, minimum temperature (Tmin), Rs, and Wind.
The CFSv2 model could better predict ETo in cold seasons during El Niño-Southern Oscillation (ENSO) events only when the forecast initial condition was in an ENSO state. Downscaled P and T2M forecasts were produced either by directly downscaling the NMME P and T2M output or indirectly by using the NMME forecasts of Niño 3.4 sea surface temperatures to predict local-scale P and T2M. The indirect method generally showed the highest forecast skill, which occurred in cold seasons. The bias-corrected NMME ensemble forecast skill did not outperform that of the best single model.
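The indirect method described above amounts to a simple statistical bridge: fit a regression of the local variable on hindcast Niño 3.4 anomalies, then apply it to a new SST forecast. A sketch, assuming ordinary least squares and hypothetical values, shows the fit-then-apply step.

```python
def ols_fit(x, y):
    # Simple one-predictor ordinary least squares: returns (slope, intercept).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    slope = cov / var
    return slope, my - slope * mx

def indirect_downscale(nino34_hindcasts, local_precip, nino34_forecast):
    """'Indirect' statistical downscaling: regress local precipitation on
    hindcast Nino 3.4 SST anomalies, then apply the fit to a new forecast."""
    slope, intercept = ols_fit(nino34_hindcasts, local_precip)
    return slope * nino34_forecast + intercept

# Hypothetical hindcast anomalies (deg C) and local seasonal precip (mm)
p_fcst = indirect_downscale([-1.0, 0.0, 1.0, 2.0],
                            [80.0, 100.0, 120.0, 140.0],
                            nino34_forecast=1.5)
```

Operational versions would use cross-validated, season-specific fits; the mechanics are unchanged.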
Forecasting Container Throughput at the Doraleh Port in Djibouti through Time Series Analysis
NASA Astrophysics Data System (ADS)
Mohamed Ismael, Hawa; Vandyck, George Kobina
The Doraleh Container Terminal (DCT) in Djibouti has been noted as the most technologically advanced container terminal on the African continent. DCT's strategic location at the crossroads of the main shipping lanes connecting Asia, Africa, and Europe puts it in a unique position to provide important shipping services to vessels plying that route. This paper aims to forecast container throughput at the Doraleh port through time series analysis. A selection of univariate forecasting models was used, namely a triple exponential smoothing model, a grey model, and a linear regression model. Using these three models and their combination, forecasts of container throughput through the Doraleh port were produced. The forecasting results of the three models and the combination forecast were then compared using the commonly used evaluation criteria of mean absolute deviation (MAD) and mean absolute percentage error (MAPE). The study found that the linear regression model was the best prediction method for forecasting container throughput, since its forecast error was the smallest. Based on the regression model, a ten-year forecast of container throughput at DCT was made.
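The evaluation criteria named above are straightforward to compute. The sketch below scores hypothetical throughput forecasts from the three model families with MAD and MAPE and picks the model with the lowest MAPE; all figures and model names are invented for illustration, not the paper's data.

```python
def mad(actual, forecast):
    # Mean absolute deviation between actuals and forecasts.
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def mape(actual, forecast):
    # Mean absolute percentage error, in percent.
    return 100 * sum(abs(a - f) / abs(a)
                     for a, f in zip(actual, forecast)) / len(actual)

def best_model(actual, forecasts):
    """Rank competing forecast series (dict name -> list) by MAPE."""
    return min(forecasts, key=lambda name: mape(actual, forecasts[name]))

# Hypothetical annual throughput (thousand TEU) and three model outputs
actual = [600, 650, 700, 760]
models = {
    "linear_regression": [610, 640, 705, 755],
    "grey_gm11": [590, 660, 720, 780],
    "triple_exp": [620, 635, 690, 730],
}
winner = best_model(actual, models)
```

A combination forecast would simply be a fourth entry in the dict (e.g., the elementwise mean of the three series), scored the same way.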
Convective Weather Avoidance with Uncertain Weather Forecasts
NASA Technical Reports Server (NTRS)
Karahan, Sinan; Windhorst, Robert D.
2009-01-01
Convective weather events have a disruptive impact on air traffic in both terminal-area and en-route airspace. To keep the national air transportation system safe and efficient, it is essential to respond to convective weather events effectively. Traffic flow control initiatives in response to convective weather include ground delay, airborne delay, and miles-in-trail restrictions, as well as tactical and strategic rerouting. Rerouting initiatives can potentially increase traffic density and complexity in regions neighboring the convective weather activity, so there is a need to perform rerouting in an intelligent and efficient way that minimizes its disruptive effects. An important area of research is to study the interaction of in-flight rerouting with traffic congestion or complexity and to develop methods that quantitatively measure this interaction. Furthermore, it is necessary to find rerouting solutions that account for uncertainties in weather forecasts. These are important steps toward managing complexity during rerouting operations, and the paper is motivated by these research questions. An automated system is developed for rerouting air traffic to avoid convective weather regions during the 20-minute to 2-hour time horizon. Such a system is envisioned to work in concert with separation assurance (0 to 20 minutes) and longer-term air traffic management (2 hours and beyond) to provide a more comprehensive solution to complexity and safety management. In this study, weather is dynamic and uncertain; it is represented as regions of airspace that pilots are likely to avoid. Algorithms are implemented in an air traffic simulation environment to support the research study. The algorithms used are deterministic but periodically revise reroutes to account for weather forecast updates.
In contrast to previous studies, convective weather is represented here as regions of airspace that pilots are likely to avoid. The automated system periodically updates forecasts and reassesses rerouting decisions in order to account for changing weather predictions. The main objectives are to reroute flights around convective weather regions and to determine the resulting complexity due to rerouting; the eventual goal is to control and reduce complexity while rerouting flights during the 20-minute to 2-hour planning period. A three-hour simulation is conducted using 4,800 flights in the national airspace. The study compares several metrics against a baseline scenario using the same traffic and weather but with rerouting disabled. The results show that rerouting can have a negative impact on congestion in some sectors, as expected. The rerouting system provides accurate measurements of the resulting complexity in the congested sectors. Furthermore, although rerouting is performed only in the 20-minute to 2-hour range, it results in a 30% reduction in encounters with nowcast weather polygons (100% being the ideal for perfectly predictable and accurate weather). In the simulations, rerouting was performed for the 20-minute to 2-hour flight-time horizon and for the en-route segment of air traffic. The implementation uses CWAM, a set of polygons that represent probabilities of pilot deviation around weather. The algorithms were implemented in a software-based air traffic simulation system, and initial results of the system's performance and effectiveness were encouraging. Simulation results showed that flights rerouted in the 20-minute to 2-hour flight-time horizon had fewer weather encounters in the first 20 minutes than flights that were not rerouted. Some preliminary results also showed that rerouting will increase complexity.
More simulations will be conducted in order to report conclusive results on the effects of rerouting on complexity. Thus, the weather avoidance techniques applied over the 20-minute to 2-hour flight time horizon in the simulation are expected to provide benefits for short-term weather avoidance.
NASA Astrophysics Data System (ADS)
Saharia, M.; Wood, A.; Clark, M. P.; Bennett, A.; Nijssen, B.; Clark, E.; Newman, A. J.
2017-12-01
Most operational streamflow forecasting systems rely on a forecaster-in-the-loop approach in which some parts of the forecast workflow require an experienced human forecaster. But this approach faces challenges surrounding process reproducibility, hindcasting capability, and extension to large domains. The operational hydrologic community is increasingly moving towards `over-the-loop' (completely automated) large-domain simulations, yet recent developments indicate a widespread lack of community knowledge about the strengths and weaknesses of such systems for forecasting. A realistic representation of land surface hydrologic processes is a critical element for improving forecasts, but often comes at the substantial cost of forecast system agility and efficiency. While popular grid-based models support the distributed representation of land surface processes, intermediate-scale Hydrologic Unit Code (HUC)-based modeling could provide a more efficient and process-aligned spatial discretization, reducing the need for tradeoffs between model complexity and critical forecasting requirements such as ensemble methods and comprehensive model calibration. The National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation, and the USACE to implement, assess, and demonstrate real-time, over-the-loop distributed streamflow forecasting for several large western US river basins and regions. In this presentation, we present early results from short- to medium-range hydrologic and streamflow forecasts for the Pacific Northwest (PNW). We employ real-time 1/16th-degree daily ensemble model forcings as well as downscaled Global Ensemble Forecasting System (GEFS) meteorological forecasts. These datasets drive an intermediate-scale configuration of the Structure for Unifying Multiple Modeling Alternatives (SUMMA) model, which represents the PNW using over 11,700 HUCs.
The system produces not only streamflow forecasts (using the MizuRoute channel routing tool) but also distributed model states such as soil moisture and snow water equivalent. We also describe challenges in distributed model-based forecasting, including the application and early results of real-time hydrologic data assimilation.
Forecasting Dust Storms Using the CARMA-Dust Model and MM5 Weather Data
NASA Astrophysics Data System (ADS)
Barnum, B. H.; Winstead, N. S.; Wesely, J.; Hakola, A.; Colarco, P.; Toon, O. B.; Ginoux, P.; Brooks, G.; Hasselbarth, L. M.; Toth, B.; Sterner, R.
2002-12-01
An operational model for the forecast of dust storms in Northern Africa, the Middle East and Southwest Asia has been developed for the United States Air Force Weather Agency (AFWA). The dust forecast model uses the 5th generation Penn State Mesoscale Meteorology Model (MM5) and a modified version of the Colorado Aerosol and Radiation Model for Atmospheres (CARMA). AFWA conducted a 60-day evaluation of the dust model to assess its ability to forecast dust storms for short, medium and long range (72-hour) forecast periods. The study used satellite and ground observations of dust storms to verify the model's effectiveness. Each of the main mesoscale forecast theaters was broken down into smaller sub-regions for detailed analysis. The study found the forecast model was able to forecast dust storms in Saharan Africa and the Sahel region with an average Probability of Detection (POD) exceeding 68%, with a 16% False Alarm Rate (FAR). The Southwest Asian theater had average PODs of 61% with FARs averaging 10%.
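The POD and FAR statistics cited above follow the standard contingency-table definitions; a minimal sketch, using illustrative counts rather than the study's verification data:

```python
# Sketch: computing Probability of Detection (POD) and False Alarm Rate (FAR)
# from a forecast/observation contingency table. The counts below are
# illustrative, not taken from the AFWA evaluation.

def pod_far(hits, misses, false_alarms):
    """POD = hits / (hits + misses); FAR = false alarms / (hits + false alarms)."""
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far

pod, far = pod_far(hits=68, misses=32, false_alarms=13)
```

With these made-up counts, POD is 0.68 and FAR is 13/81, i.e. about 0.16, comparable in magnitude to the Saharan results reported above.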
Research on light rail electric load forecasting based on ARMA model
NASA Astrophysics Data System (ADS)
Huang, Yifan
2018-04-01
The article compares a variety of time series models in light of the characteristics of power load forecasting. Then, a light rail load forecasting model based on the ARMA model is established, and the load of a light rail system is forecast using this model. The prediction results show that the accuracy of the model is high.
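As a hedged illustration of ARMA-style load forecasting, the sketch below fits an AR(1) model (the simplest ARMA special case) to a synthetic load series by least squares; the series and parameters are invented for demonstration, not light rail data:

```python
import numpy as np

# Minimal sketch: fit y_t = c + phi * y_{t-1} to a synthetic load series and
# produce a one-step-ahead forecast. A full ARMA(p, q) fit would also model
# the moving-average part of the residuals.

rng = np.random.default_rng(0)
load = 100 + np.cumsum(rng.normal(0, 1, 200)) * 0.1  # synthetic hourly load

x, y = load[:-1], load[1:]
phi, c = np.polyfit(x, y, 1)       # np.polyfit returns slope first, then intercept
forecast = c + phi * load[-1]      # one-step-ahead load prediction
```

For a persistent series like this one, the estimated phi is close to 1 and the forecast stays near the last observed value.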
NASA Astrophysics Data System (ADS)
Pokhrel, Prafulla; Wang, Q. J.; Robertson, David E.
2013-10-01
Seasonal streamflow forecasts are valuable for planning and allocation of water resources. In Australia, the Bureau of Meteorology employs a statistical method to forecast seasonal streamflows. The method uses predictors that are related to catchment wetness at the start of a forecast period and to climate during the forecast period. For the latter, a predictor is selected among a number of lagged climate indices as candidates to give the "best" model in terms of model performance in cross validation. This study investigates two strategies for further improvement in seasonal streamflow forecasts. The first is to combine, through Bayesian model averaging, multiple candidate models with different lagged climate indices as predictors, to take advantage of the different predictive strengths of the multiple models. The second strategy is to introduce additional candidate models, using rainfall and sea surface temperature predictions from a global climate model as predictors. This is to take advantage of the direct simulations of various dynamic processes. The results show that combining forecasts from multiple statistical models generally yields more skillful forecasts than using only the best model and appears to moderate the worst forecast errors. The use of rainfall predictions from the dynamical climate model marginally improves the streamflow forecasts when viewed over all the study catchments and seasons, but the use of sea surface temperature predictions provides little additional benefit.
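The first strategy, combining candidate models through Bayesian model averaging, amounts to weighting each model's forecast by its relative predictive skill. A minimal sketch, with illustrative forecasts and log-score stand-ins rather than the Bureau's actual candidate models:

```python
import numpy as np

# Sketch of Bayesian model averaging over candidate forecast models.
# The log_skill values stand in for cross-validation log predictive scores;
# all numbers here are invented for illustration.

forecasts = np.array([120.0, 135.0, 128.0])   # candidate seasonal streamflow forecasts
log_skill = np.array([-1.0, -0.5, -2.0])      # stand-in log predictive scores

w = np.exp(log_skill - log_skill.max())        # subtract max for numerical stability
w /= w.sum()                                   # normalized BMA weights
combined = float(w @ forecasts)                # weighted combination of forecasts
```

The combined forecast is a convex combination of the candidates, so it always lies between the smallest and largest member forecasts, which is one reason averaging moderates the worst errors.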
Modeling when and where a secondary accident occurs.
Wang, Junhua; Liu, Boya; Fu, Ting; Liu, Shuo; Stipancic, Joshua
2018-01-31
The occurrence of secondary accidents leads to traffic congestion and road safety issues. Secondary accident prevention has become a major consideration in traffic incident management. This paper investigates the location and time of a potential secondary accident after the occurrence of an initial traffic accident. With accident data and traffic loop data collected over three years from California interstate freeways, a shock wave-based method was introduced to identify secondary accidents. A linear regression model and two machine learning algorithms, a back-propagation neural network (BPNN) and a least squares support vector machine (LSSVM), were implemented to explore the distance and time gap between the initial and secondary accidents using inputs of crash severity, violation category, weather condition, tow-away status, road surface condition, lighting, parties involved, traffic volume, duration, and the shock wave speed generated by the primary accident. From the results, the linear regression model was inadequate in describing the effect of most variables, and its goodness-of-fit and prediction accuracy were relatively poor. In training, the BPNN and LSSVM demonstrated adequate goodness-of-fit, though the BPNN was superior with a higher CORR and lower MSE. The BPNN model also outperformed the LSSVM in time prediction, while both failed to provide adequate distance prediction. Therefore, the BPNN model could be used to forecast the time gap between initial and secondary accidents, which could help decision makers and incident management agencies prevent or reduce secondary collisions.
Forecasting in foodservice: model development, testing, and evaluation.
Miller, J L; Thompson, P A; Orabella, M M
1991-05-01
This study was designed to develop, test, and evaluate mathematical models appropriate for forecasting menu-item production demand in foodservice. Data were collected from residence and dining hall foodservices at Ohio State University. Objectives of the study were to collect, code, and analyze the data; develop and test models using actual operation data; and compare forecasting results with current methods in use. Customer count was forecast using deseasonalized simple exponential smoothing. Menu-item demand was forecast by multiplying the count forecast by a predicted preference statistic. Forecasting models were evaluated using mean squared error, mean absolute deviation, and mean absolute percentage error techniques. All models were more accurate than current methods. A broad spectrum of forecasting techniques could be used by foodservice managers with access to a personal computer and spreadsheet and database-management software. The findings indicate that mathematical forecasting techniques may be effective in foodservice operations to control costs, increase productivity, and maximize profits.
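The count-forecasting step described above can be sketched as follows; the counts, seasonal indices, smoothing constant, and preference statistic are all illustrative assumptions, not the study's data:

```python
# Sketch of the two-step approach: deseasonalize a customer-count series,
# apply simple exponential smoothing, then reseasonalize and multiply by a
# predicted preference statistic to get a menu-item demand forecast.

def simple_exponential_smoothing(series, alpha):
    """Return the smoothed level, which serves as the next-period forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

counts = [400, 420, 410, 430, 415]              # daily customer counts (illustrative)
seasonal_index = [1.1, 0.9, 1.0, 1.05, 0.95]    # assumed day-of-week indices

deseasonalized = [c / s for c, s in zip(counts, seasonal_index)]
base = simple_exponential_smoothing(deseasonalized, alpha=0.3)
next_day_count = base * 1.1                      # reseasonalize with next day's index
item_forecast = next_day_count * 0.25            # assumed preference statistic
```

The preference statistic (here 0.25) is the predicted fraction of customers choosing the menu item, so the item forecast is always a fraction of the count forecast.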
A Comparison of Forecast Error Generators for Modeling Wind and Load Uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Ning; Diao, Ruisheng; Hafen, Ryan P.
2013-12-18
This paper presents four algorithms to generate random forecast error time series: a truncated-normal distribution model, a state-space based Markov model, a seasonal autoregressive moving average (ARMA) model, and a stochastic-optimization based model. The error time series are used to create real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast time series that statistically match historically observed forecasting data sets, for use in variable generation integration studies. A comparison is made using historical DA load forecasts and actual load values to generate new sets of DA forecasts with similar statistical forecast error characteristics. This paper discusses and compares the capabilities of each algorithm to preserve the characteristics of the historical forecast data sets.
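As a hedged sketch of the simplest of the four generators, the code below draws forecast errors from a truncated normal distribution by rejection sampling and adds them to an "actual" series to synthesize a day-ahead forecast series; the bounds and values are illustrative:

```python
import random

# Sketch of a truncated-normal forecast error generator: sample Gaussian
# errors, reject any outside the truncation bounds, then add the accepted
# errors to the actual load to produce a synthetic DA forecast series.
# All parameters below are illustrative assumptions.

def truncated_normal(mu, sigma, lo, hi, rng):
    while True:
        e = rng.gauss(mu, sigma)
        if lo <= e <= hi:
            return e

rng = random.Random(42)
actual_load = [100.0, 102.0, 98.0, 105.0]        # MW, illustrative
errors = [truncated_normal(0.0, 2.0, -5.0, 5.0, rng) for _ in actual_load]
day_ahead_forecast = [a + e for a, e in zip(actual_load, errors)]
```

Matching a historical data set would additionally require calibrating mu, sigma, and the bounds to the observed error distribution, and the other three algorithms in the paper also capture temporal correlation, which this sketch does not.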
NASA Astrophysics Data System (ADS)
Morin, C.; Quattrochi, D. A.; Zavodsky, B.; Case, J.
2015-12-01
Dengue fever (DF) is an important mosquito-transmitted disease that is strongly influenced by meteorological and environmental conditions. Recent research has focused on forecasting DF case numbers based on meteorological data. However, these forecasting tools have generally relied on empirical models that require long DF time series to train. Additionally, their accuracy has been tested retrospectively, using past meteorological data. Consequently, the operational utility of the forecasts is still in question because the error associated with weather and climate forecasts is not reflected in the results. Using up-to-date weekly dengue case numbers for model parameterization and weather forecast data as meteorological input, we produced weekly forecasts of DF cases in San Juan, Puerto Rico. Each week, the past weeks' case counts were used to re-parameterize a process-based DF model driven with updated weather forecast data to generate forecasts of DF case numbers. Real-time weather forecast data were produced using the Weather Research and Forecasting (WRF) numerical weather prediction (NWP) system enhanced with additional high-resolution NASA satellite data. This methodology was conducted in a weekly iterative process, with each DF forecast evaluated using county-level DF cases reported by the Puerto Rico Department of Health. The one-week DF forecasts were accurate, especially considering the two sources of model error. First, weather forecasts were sometimes inaccurate and generally produced lower than observed temperatures. Second, the DF model was often overly influenced by the previous week's DF case numbers, though this phenomenon could be lessened by increasing the number of simulations included in the forecast. Although these results are promising, we would like to develop a methodology to produce longer-range forecasts so that public health workers can better prepare for dengue epidemics.
Combining forecast weights: Why and how?
NASA Astrophysics Data System (ADS)
Yin, Yip Chee; Kok-Haur, Ng; Hock-Eam, Lim
2012-09-01
This paper proposes a procedure called forecast weight averaging, a specific combination of forecast weights obtained from different methods of constructing forecast weights, for the purpose of improving the accuracy of pseudo out-of-sample forecasting. It is found that under certain specified conditions, forecast weight averaging can lower the mean squared forecast error obtained from model averaging. In addition, we show that in a linear and homoskedastic environment, this superior predictive ability of forecast weight averaging holds true irrespective of whether the coefficients are tested by the t statistic or the z statistic, provided the significance level is within the 10% range. By theoretical proofs and simulation study, we have shown that model averaging methods such as variance model averaging, simple model averaging, and standard error model averaging each produce a mean squared forecast error larger than that of forecast weight averaging. Finally, this result also holds true marginally when applied to business and economic empirical data sets: the Gross Domestic Product (GDP) growth rate, the Consumer Price Index (CPI), and the Average Lending Rate (ALR) of Malaysia.
Surveying traffic congestion based on the concept of community structure of complex networks
NASA Astrophysics Data System (ADS)
Ma, Lili; Zhang, Zhanli; Li, Meng
2016-07-01
In this paper, taking the traffic of Beijing city as an instance, we study city traffic states, especially traffic congestion, based on the concept of network community structure. Concretely, using the floating car data (FCD) information of vehicles gained from the intelligent transport system (ITS) of the city, we construct a new traffic network model in which floating cars serve as network nodes and the network is time-varying. It shows that this traffic network has Gaussian degree distributions at different time points. Furthermore, compared with free traffic situations, our simulations show that the traffic network generally has more obvious community structures, with larger values of network fitness, for congested traffic situations, and through the GPSspg web page we show that all of our results are consistent with reality. This indicates that network community structure should be a viable approach for investigating city traffic congestion problems.
A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method.
Yang, Jun-He; Cheng, Ching-Hsue; Chan, Chia-Pan
2017-01-01
Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model based on estimating missing values followed by variable selection to forecast the reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets were concatenated into an integrated research dataset based on the ordering of the data. The proposed time-series forecasting model has three main steps. First, this study uses five imputation methods to handle missing values rather than deleting them directly. Second, it identifies key variables via factor analysis and then deletes unimportant variables sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level, which is compared with the listed methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model, when applied to variable selection with full variables, has better forecasting performance than the listed models. In addition, the experiment shows that the proposed variable selection can help the five forecasting methods used here improve their forecasting capability.
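The variable-selection idea can be illustrated with a simple correlation-ranking sketch on synthetic data; the paper's actual pipeline uses factor analysis and a Random Forest, so this is an assumption-laden simplification:

```python
import numpy as np

# Sketch: rank candidate predictors of water level by absolute correlation
# with the target and keep the top-k before model fitting. The data are
# synthetic; in the synthetic model, "rain" truly drives the level.

rng = np.random.default_rng(1)
n = 100
rain = rng.normal(size=n)
temp = rng.normal(size=n)
noise = rng.normal(size=n)                    # an irrelevant candidate predictor
level = 2.0 * rain + 0.1 * temp + rng.normal(scale=0.5, size=n)

X = {"rain": rain, "temp": temp, "noise": noise}
scores = {k: abs(np.corrcoef(v, level)[0, 1]) for k, v in X.items()}
selected = sorted(scores, key=scores.get, reverse=True)[:2]  # keep top 2
```

Because `rain` carries most of the signal in this synthetic setup, it is ranked first; the selected subset would then be passed to the forecasting model.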
NASA Astrophysics Data System (ADS)
Wood, E. F.; Yuan, X.; Sheffield, J.; Pan, M.; Roundy, J.
2013-12-01
One of the key recommendations of the WCRP Global Drought Information System (GDIS) workshop is to develop an experimental real-time global monitoring and prediction system. While great advances have been made in global drought monitoring based on satellite observations and model reanalysis data, global drought forecasting has lagged, in part due to the limited skill of both climate forecast models and global hydrologic predictions. Having worked on drought monitoring and forecasting over the USA for more than a decade, the Princeton land surface hydrology group is now developing an experimental global drought early warning system based on multiple climate forecast models and a calibrated global hydrologic model. In this presentation, we test its capability in seasonal forecasting of meteorological, agricultural, and hydrologic droughts over major global river basins, using precipitation, soil moisture, and streamflow forecasts respectively. Based on the joint probability distribution between observations from Princeton's global drought monitoring system and model hindcasts and real-time forecasts from the North American Multi-Model Ensemble (NMME) project, we (i) bias correct the monthly precipitation and temperature forecasts from multiple climate forecast models, (ii) downscale them to a daily time scale, and (iii) use them to drive the calibrated VIC model to produce global drought forecasts at a 1-degree resolution. A parallel run using the ESP forecast method, which is based on resampling historical forcings, is also carried out for comparison. Analysis is being conducted over major global river basins, with multiple drought indices that have different time scales and characteristics. The meteorological drought forecast does not carry uncertainty from hydrologic models and can be validated directly against observations, making the validation an 'apples-to-apples' comparison.
Preliminary results for the evaluation of meteorological drought onset hindcasts indicate that climate models increase drought detectability over ESP by 31%-81%. However, less than 30% of global drought onsets can be detected by climate models. The missed drought events are associated with weak ENSO signals and lower potential predictability. Due to the high false-alarm rate of climate models, reliability is more important than sharpness for a skillful probabilistic drought onset forecast. Validations and skill assessments for agricultural and hydrologic drought forecasts are carried out using soil moisture and streamflow output from the VIC land surface model (LSM) forced by a global forcing data set. Given our previous drought forecasting experience over the USA and Africa, validating hydrologic drought forecasts is a significant challenge for a global drought early warning system.
A Wind Forecasting System for Energy Application
NASA Astrophysics Data System (ADS)
Courtney, Jennifer; Lynch, Peter; Sweeney, Conor
2010-05-01
Accurate forecasting of available energy is crucial for the efficient management and use of wind power in the national power grid. With energy output critically dependent upon wind strength, there is a need to reduce the errors associated with wind forecasting. The objective of this research is to produce the best possible wind forecasts for the wind energy industry. To achieve this goal, three methods are being applied. First, a mesoscale numerical weather prediction (NWP) model called WRF (Weather Research and Forecasting) is being used to predict wind values over Ireland. Currently, a grid resolution of 10 km is used, and higher model resolutions are being evaluated to establish whether they are economically viable given the forecast skill improvement they produce. Second, the WRF model is being used in conjunction with ECMWF (European Centre for Medium-Range Weather Forecasts) ensemble forecasts to produce a probabilistic weather forecasting product. Due to the chaotic nature of the atmosphere, a single, deterministic weather forecast can only have limited skill. The ECMWF ensemble methods produce an ensemble of 51 global forecasts, twice a day, by perturbing the initial conditions of a 'control' forecast which is the best estimate of the initial state of the atmosphere. This method provides an indication of the reliability of the forecast and a quantitative basis for probabilistic forecasting. The limitation of ensemble forecasting lies in the fact that the perturbed model runs behave differently under different weather patterns and each model run is equally likely to be closest to the observed weather situation. Models have biases, and involve assumptions about physical processes and forcing factors such as underlying topography. Third, Bayesian Model Averaging (BMA) is being applied to the output from the ensemble forecasts in order to statistically post-process the results and achieve a better wind forecasting system.
BMA is a promising technique that will offer calibrated probabilistic wind forecasts which will be invaluable in wind energy management. In brief, this method turns the ensemble forecasts into a calibrated predictive probability distribution. Each ensemble member is provided with a 'weight' determined by its relative predictive skill over a training period of around 30 days. Verification of data is carried out using observed wind data from operational wind farms. These are then compared to existing forecasts produced by ECMWF and Met Eireann in relation to skill scores. We are developing decision-making models to show the benefits achieved using the data produced by our wind energy forecasting system. An energy trading model will be developed, based on the rules currently used by the Single Electricity Market Operator for energy trading in Ireland. This trading model will illustrate the potential for financial savings by using the forecast data generated by this research.
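In BMA, the calibrated predictive distribution described above is a skill-weighted mixture of densities centered on the ensemble members. A minimal sketch with Gaussian kernels and invented member forecasts and weights:

```python
import math

# Sketch: evaluate a BMA predictive density as a weighted mixture of
# Gaussian kernels centered on ensemble member forecasts. Members, weights,
# and the kernel spread are illustrative; in practice the weights and sigma
# are fitted over a ~30-day training period.

def bma_pdf(x, members, weights, sigma):
    return sum(
        w * math.exp(-0.5 * ((x - m) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
        for m, w in zip(members, weights)
    )

members = [8.0, 9.5, 10.2]     # member wind-speed forecasts (m/s)
weights = [0.2, 0.5, 0.3]      # skill-based BMA weights (sum to 1)
density = bma_pdf(9.0, members, weights, sigma=1.0)
```

Evaluating this density over a grid of wind speeds yields the calibrated predictive distribution from which probabilistic statements (e.g. the chance of exceeding a turbine cut-in speed) can be read off.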
Transmission congestion management in the electricity market
NASA Astrophysics Data System (ADS)
Chen, Yue
2018-04-01
In this paper we mainly discuss how to optimize generation dispatch to decrease the losses on each line, in a safe and economical manner, when transmission congestion occurs on the generation side of the system. We formulate two models: an adjustment model, which calculates the best scheme when the transmission congestion can be eliminated, and a safety margin model, a multi-objective planning problem, for when the congestion cannot be eliminated. We solve the two models for load power demands of 982.4 MW and 1052.8 MW using Lingo and obtain the optimal management scheme.
Time series regression and ARIMAX for forecasting currency flow at Bank Indonesia in Sulawesi region
NASA Astrophysics Data System (ADS)
Suharsono, Agus; Suhartono, Masyitha, Aulia; Anuravega, Arum
2015-12-01
The purpose of the study is to forecast the outflow and inflow of currency at the Indonesian Central Bank, Bank Indonesia (BI), in the Sulawesi Region. The currency outflow and inflow data tend to have a trend pattern which is influenced by calendar variation effects. Therefore, this research applies forecasting methods that can handle calendar variation effects, i.e. Time Series Regression (TSR) and ARIMAX models, and compares their forecast accuracy with an ARIMA model. The best model is selected based on the lowest Root Mean Square Error (RMSE) on the out-of-sample dataset. The results show that ARIMA is the best model for forecasting the currency outflow and inflow at South Sulawesi. The best model for forecasting the currency outflow at Central Sulawesi and Southeast Sulawesi, and for forecasting the currency inflow at South Sulawesi and North Sulawesi, is TSR. Additionally, ARIMAX is the best model for forecasting the currency outflow at North Sulawesi. Hence, the results show that more complex models do not necessarily yield more accurate forecasts than simpler ones.
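The model-selection criterion used here, lowest RMSE on out-of-sample data, can be sketched as follows; the actual and forecast values are illustrative, not the Bank Indonesia series:

```python
import math

# Sketch: compare candidate models by out-of-sample RMSE and keep the one
# with the smallest error. Values are invented for illustration.

def rmse(actual, forecast):
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

actual = [10.0, 12.0, 11.0, 13.0]          # held-out observations
candidates = {
    "ARIMA": [10.5, 11.5, 11.2, 12.8],     # candidate out-of-sample forecasts
    "TSR":   [9.0, 13.0, 10.0, 14.0],
}
best = min(candidates, key=lambda name: rmse(actual, candidates[name]))
```

In this toy comparison the "ARIMA" forecasts track the held-out data more closely, so it is selected; with the real series, the winner varies by region, as the abstract reports.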
Gambling scores for earthquake predictions and forecasts
NASA Astrophysics Data System (ADS)
Zhuang, Jiancang
2010-04-01
This paper presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures that require a regular scheme of forecasts and treat each earthquake equally, regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some of his reputation points. The reference model, which plays the role of the house, determines how many reputation points the forecaster can gain if he succeeds, according to a fair rule, and also takes away the reputation points bet by the forecaster if he loses. This method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model, or the ETAS model and when the reference model is the Poisson model.
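The fair rule can be made concrete: if the reference model assigns probability p0 to the predicted event, a fair bet of r points pays r(1 - p0)/p0 on success (so the expected gain under the reference model is zero) and loses r on failure. A minimal sketch of this bookkeeping, with illustrative numbers:

```python
# Sketch of gambling-score bookkeeping for a single discrete bet.
# Fairness check: under the reference model, p0 * bet * (1 - p0) / p0
# - (1 - p0) * bet = 0, so the house expects neither gain nor loss.

def settle_bet(reputation, bet, p0, event_occurred):
    """Update the forecaster's reputation after one bet against the house."""
    if event_occurred:
        return reputation + bet * (1 - p0) / p0   # fair-odds payout
    return reputation - bet                        # stake is forfeited

r = settle_bet(reputation=100.0, bet=1.0, p0=0.1, event_occurred=True)
```

A rare event under the reference model (small p0) pays out heavily, which is how the score compensates the forecaster for the risk taken.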
NASA Astrophysics Data System (ADS)
Rhee, Jinyoung; Kim, Gayoung; Im, Jungho
2017-04-01
Three regions of Indonesia with different rainfall characteristics were chosen to develop drought forecast models based on machine learning. The 6-month Standardized Precipitation Index (SPI6) was selected as the target variable. The models' forecast skill was compared to the skill of long-range climate forecast models in terms of drought accuracy and regression mean absolute error (MAE). Indonesian droughts are known to be related to El Nino Southern Oscillation (ENSO) variability despite regional differences, as well as to the monsoon, local sea surface temperature (SST), other large-scale atmosphere-ocean interactions such as the Indian Ocean Dipole (IOD) and the South Pacific Convergence Zone (SPCZ), and local factors including topography and elevation. The machine learning models are thus designed to enhance drought forecast skill by combining local and remote SST and remote sensing information reflecting initial drought conditions with the long-range climate forecast model results. A total of 126 machine learning models were developed, covering the three regions of West Java (JB), West Sumatra (SB), and Gorontalo (GO); six long-range climate forecast models (MSC_CanCM3, MSC_CanCM4, NCEP, NASA, PNU, and POAMA) plus one climatology model based on remote sensing precipitation data; and 1- to 6-month lead times. When comparing the machine learning models with the long-range climate forecast models, the West Java and Gorontalo regions showed similar characteristics in terms of drought accuracy. Drought accuracy of the long-range climate forecast models was generally higher than that of the machine learning models at short lead times, but the opposite appeared at longer lead times. For West Sumatra, however, the machine learning models and the long-range climate forecast models showed similar drought accuracy. The machine learning models showed smaller regression errors for all three regions, especially at longer lead times.
Among the three regions, the machine learning models developed for Gorontalo showed the highest drought accuracy and the lowest regression error. West Java showed higher drought accuracy compared to West Sumatra, while West Sumatra showed lower regression error compared to West Java. The lower error in West Sumatra may be because of the smaller sample size used for training and evaluation for the region. Regional differences in forecast skill are determined by the effect of ENSO and the consequent forecast skill of the long-range climate forecast models. While somewhat high in West Sumatra, the relative importance of remote sensing variables was low in most cases. The high importance of the variables based on long-range climate forecast models indicates that the forecast skill of the machine learning models is mostly determined by the forecast skill of the climate models.
Chen, Mei-Chih; Chang, Kaowen
2014-11-06
Many city governments choose to supply more developable land and transportation infrastructure in the hope of attracting people and businesses to their cities. However, like those in Taiwan, major cities worldwide suffer from traffic congestion. This study applies the system thinking logic of the causal loop diagram (CLD) model in the System Dynamics (SD) approach to analyze the issue of traffic congestion and other issues related to roads and land development in Taiwan's cities. Comparing the characteristics of development trends with yearbook data for 2002 to 2013 for all of Taiwan's cities, this study explores the developing phenomenon of unlimited city sprawl and identifies the cause-and-effect relationships among the development trends in traffic congestion, high-density population aggregation in cities, land development, and green land disappearance resulting from city sprawl. This study provides conclusions for the sustainability and development (S&D) of Taiwan's cities. When developing S&D policies, during decision-making processes concerning city planning and land use management, governments should take a holistic view of carrying capacity, with the assistance of system thinking, to clarify the biases in favor of the unlimited development phenomena resulting from city sprawl.
Ma, Yukun; Liu, An; Egodawatta, Prasanna; McGree, James; Goonetilleke, Ashantha
2017-02-01
Toxic metals (TMs) and polycyclic aromatic hydrocarbons (PAHs) in urban stormwater pose a risk to human health, thereby constraining its reuse potential. Based on the hypothesis that stormwater quality is primarily influenced by anthropogenic activities and traffic congestion, the primary focus of the research study was to analyse the impacts on human health risk from TMs and PAHs in urban stormwater and thereby develop a quantitative risk assessment model. The study found that anthropogenic activities and traffic congestion influence the risk posed by TMs and PAHs in stormwater from commercial and residential areas. Motor vehicle related businesses (FVS) and traffic congestion (TC) were identified as two parameters that need to be included as independent variables to improve the model. Based on the study outcomes, approaches for mitigating the risk associated with TMs and PAHs in urban stormwater are discussed. Additionally, a roadmap is presented for the assessment and management of the risk arising from these pollutants. The study outcomes are expected to contribute to reducing the human health risk associated with urban stormwater pollution and thereby enhance its reuse potential.
NASA Astrophysics Data System (ADS)
Treiber, Martin; Kesting, Arne; Helbing, Dirk
2006-07-01
We investigate the adaptation of the time headways in car-following models as a function of the local velocity variance, which is a measure of the inhomogeneity of traffic flow. We apply this mechanism to several car-following models and simulate traffic breakdowns in open systems with an on-ramp as bottleneck and in a closed ring road. Single-vehicle data and one-minute aggregated data generated by several virtual detectors show a semiquantitative agreement with microscopic and flow-density data from the Dutch freeway A9. This includes the observed distributions of the net time headways for free and congested traffic, the velocity variance as a function of density, and the fundamental diagram. The modal value of the time headway distribution is shifted by a factor of about 2 under congested conditions. Macroscopically, this corresponds to the capacity drop at the transition from free to congested traffic. The simulated fundamental diagram shows free, synchronized, and jammed traffic, and a wide scattering in the congested traffic regime. We explain this by a self-organized variance-driven process that leads to the spontaneous formation and decay of long-lived platoons even for a deterministic dynamics on a single lane.
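The variance-driven headway adaptation can be sketched as below; the linear multiplier form and its parameters are assumptions for illustration, not the paper's exact adaptation law:

```python
# Sketch: increase a car-following model's desired time headway T with the
# locally measured velocity variance (the paper's inhomogeneity measure),
# capped at a maximum. The linear form and all parameter values are
# illustrative assumptions.

def adapted_headway(T0, velocity_variance, sensitivity=0.5, T_max=4.0):
    """Return the adapted desired time headway in seconds."""
    return min(T0 * (1.0 + sensitivity * velocity_variance), T_max)

free_flow = adapted_headway(T0=1.2, velocity_variance=0.1)   # nearly homogeneous flow
congested = adapted_headway(T0=1.2, velocity_variance=2.0)   # inhomogeneous flow
```

With these illustrative numbers the congested headway is about twice the free-flow headway, qualitatively matching the factor-of-two shift in the modal time headway reported above.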
Sensitivity of a Simulated Derecho Event to Model Initial Conditions
NASA Astrophysics Data System (ADS)
Wang, Wei
2014-05-01
Since 2003, the MMM division at NCAR has been experimenting with cloud-permitting-scale weather forecasting using the Weather Research and Forecasting (WRF) model. Over the years, we have tested different model physics and tried different initial and boundary conditions. Not surprisingly, we found that the model's forecasts are more sensitive to the initial conditions than to the model physics. In the 2012 real-time experiment, WRF-DART (Data Assimilation Research Testbed) at 15 km was employed to produce initial conditions for a twice-a-day forecast at 3 km. On June 29, this forecast system captured one of the most destructive derecho events on record. In this presentation, we examine forecast sensitivity to different model initial conditions and try to understand the important features that may have contributed to the success of the forecast.
Ability of matrix models to explain the past and predict the future of plant populations.
McEachern, Kathryn; Crone, Elizabeth E.; Ellis, Martha M.; Morris, William F.; Stanley, Amanda; Bell, Timothy; Bierzychudek, Paulette; Ehrlen, Johan; Kaye, Thomas N.; Knight, Tiffany M.; Lesica, Peter; Oostermeijer, Gerard; Quintana-Ascencio, Pedro F.; Ticktin, Tamara; Valverde, Teresa; Williams, Jennifer I.; Doak, Daniel F.; Ganesan, Rengaian; Thorpe, Andrea S.; Menges, Eric S.
2013-01-01
Uncertainty associated with ecological forecasts has long been recognized, but forecast accuracy is rarely quantified. We evaluated how well data on 82 populations of 20 species of plants spanning 3 continents explained and predicted plant population dynamics. We parameterized stage-based matrix models with demographic data from individually marked plants and determined how well these models forecast population sizes observed at least 5 years into the future. Simple demographic models forecasted population dynamics poorly; only 40% of observed population sizes fell within our forecasts' 95% confidence limits. However, these models explained population dynamics during the years in which data were collected; observed changes in population size during the data-collection period were strongly positively correlated with population growth rate. Thus, these models are at least a sound way to quantify population status. Poor forecasts were not associated with the number of individual plants or years of data. We tested whether vital rates were density dependent and found both positive and negative density dependence. However, density dependence was not associated with forecast error. Forecast error was significantly associated with environmental differences between the data collection and forecast periods. To forecast population fates, more detailed models, such as those that project how environments are likely to change and how these changes will affect population dynamics, may be needed. Such detailed models are not always feasible. Thus, it may be wiser to make risk-averse decisions than to expect precise forecasts from models.
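The stage-based matrix projection underlying the study can be sketched in a few lines. The 3-stage matrix below is hypothetical, chosen only to illustrate how the asymptotic growth rate and a 5-year forecast are computed from demographic data.

```python
import numpy as np

# Hypothetical 3-stage (seedling, juvenile, adult) projection matrix:
# each column gives per-capita contributions of that stage to next year's stages.
A = np.array([
    [0.0, 0.0, 4.0],   # fecundity: adults produce seedlings
    [0.3, 0.4, 0.0],   # seedling survival, juvenile stasis
    [0.0, 0.3, 0.9],   # juvenile maturation, adult survival
])

# Asymptotic population growth rate = dominant eigenvalue of A.
lam = max(abs(np.linalg.eigvals(A)))

# Project an observed stage vector forward 5 years, mirroring the study's
# comparison of forecasts against populations re-censused 5+ years later.
n = np.array([50.0, 20.0, 10.0])
for _ in range(5):
    n = A @ n
print(round(lam, 3), round(n.sum(), 1))
```

The study's point is that even when such a matrix explains the observed growth during data collection, the 5-year projection can miss badly if the environment differs between the data-collection and forecast periods.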
Nonlinear time series modeling and forecasting the seismic data of the Hindu Kush region
NASA Astrophysics Data System (ADS)
Khan, Muhammad Yousaf; Mittnik, Stefan
2018-01-01
In this study, we extended the application of linear and nonlinear time series models in the field of earthquake seismology and examined the out-of-sample forecast accuracy of linear Autoregressive (AR), Autoregressive Conditional Duration (ACD), Self-Exciting Threshold Autoregressive (SETAR), Threshold Autoregressive (TAR), Logistic Smooth Transition Autoregressive (LSTAR), Additive Autoregressive (AAR), and Artificial Neural Network (ANN) models for seismic data of the Hindu Kush region. We also extended previous studies by using Vector Autoregressive (VAR) and Threshold Vector Autoregressive (TVAR) models and compared their forecasting accuracy with the linear AR model. Unlike previous studies, which typically specify threshold models using an internal threshold variable, we specified these models with external transition variables and compared their out-of-sample forecasting performance with the linear benchmark AR model. The modeling results show that the time series models used in the present study are capable of capturing the dynamic structure present in the seismic data. The point forecast results indicate that the AR model generally outperforms the nonlinear models. However, in some cases, threshold models with an external threshold variable specification produce more accurate forecasts, indicating that the specification of threshold time series models is of crucial importance. For the raw seismic data, the ACD model does not show improved out-of-sample forecasting performance over the linear AR model. The results indicate that the AR model is the best device for modeling and forecasting the raw seismic data of the Hindu Kush region.
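A two-regime threshold autoregressive model of the kind compared above can be sketched compactly. The coefficients and threshold here are illustrative, not estimates from the Hindu Kush data: the AR coefficient simply switches depending on which side of the threshold the previous value falls.

```python
import random

def simulate_setar(n=500, threshold=0.0, seed=1):
    """Simulate a two-regime SETAR(1) process: the AR coefficient switches
    depending on whether the previous value is above the threshold."""
    random.seed(seed)
    y = [0.0]
    for _ in range(n - 1):
        phi = 0.8 if y[-1] <= threshold else 0.3   # regime-dependent coefficient
        y.append(phi * y[-1] + random.gauss(0, 1))
    return y

def setar_forecast(y_last, threshold=0.0):
    """One-step-ahead point forecast from the (known) SETAR(1) model."""
    phi = 0.8 if y_last <= threshold else 0.3
    return phi * y_last

series = simulate_setar()
print(setar_forecast(series[-1]))
```

Replacing the lagged value of the series itself with an external variable in the regime-switching condition gives the "external transition variable" specification the study emphasises.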
Daily air quality index forecasting with hybrid models: A case in China.
Zhu, Suling; Lian, Xiuyuan; Liu, Haixia; Hu, Jianming; Wang, Yuanyuan; Che, Jinxing
2017-12-01
Air quality is closely related to quality of life. Air pollution forecasting plays a vital role in air pollution warning and control. However, it is difficult to obtain accurate forecasts for air pollution indexes because the original data are non-stationary and chaotic. Existing forecasting methods, such as multiple linear models, autoregressive integrated moving average (ARIMA) and support vector regression (SVR), cannot fully capture the information in series of pollution indexes. Therefore, new effective techniques need to be proposed to forecast air pollution indexes. The main purpose of this research is to develop effective forecasting models for regional air quality indexes (AQI) that address the problems above and enhance forecasting accuracy. Two hybrid models (EMD-SVR-Hybrid and EMD-IMFs-Hybrid) are therefore proposed to forecast AQI data. The main steps of the EMD-SVR-Hybrid model are as follows: the data preprocessing technique EMD (empirical mode decomposition) is utilized to sift the original AQI data into one group of smoother IMFs (intrinsic mode functions) and a noise series, where the IMFs contain the important information (level, fluctuations and others) from the original AQI series. LS-SVR is applied to forecast the sum of the IMFs, and then S-ARIMA (seasonal ARIMA) is employed to forecast the residual sequence of the LS-SVR. In addition, EMD-IMFs-Hybrid first forecasts the IMFs separately via statistical models and sums the forecasting results of the IMFs as EMD-IMFs; then, S-ARIMA is employed to forecast the residuals of EMD-IMFs. To validate the proposed hybrid models, AQI data from June 2014 to August 2015 collected from Xingtai in China are utilized as a test case for the empirical investigation. In terms of several forecasting assessment measures, the AQI forecasting results for Xingtai show that the two proposed hybrid models are superior to ARIMA, SVR, GRNN, EMD-GRNN, Wavelet-GRNN and Wavelet-SVR.
Therefore, the proposed hybrid models can be used as effective and simple tools for air pollution forecasting and warning as well as for management. Copyright © 2017 Elsevier Ltd. All rights reserved.
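The decompose-forecast-recombine idea behind both hybrids can be sketched without specialised libraries. In this dependency-free sketch, a low-order polynomial fit stands in for the smooth sum of IMFs produced by EMD, and a least-squares AR(1) stands in for the S-ARIMA residual model; the real pipeline uses EMD, LS-SVR and S-ARIMA as described above.

```python
import numpy as np

def hybrid_forecast(y, degree=2):
    """Sketch of the decompose-then-forecast idea behind the EMD-based hybrids:
    split the series into a smooth component and a residual, forecast each
    with its own model, then add the two forecasts back together."""
    y = np.asarray(y, dtype=float)
    t = np.arange(len(y))

    # Smooth component: low-order polynomial fit (stand-in for the IMF sum).
    coeffs = np.polyfit(t, y, degree)
    smooth = np.polyval(coeffs, t)
    resid = y - smooth

    # Forecast the smooth part by extrapolating the fitted polynomial.
    smooth_next = np.polyval(coeffs, len(y))

    # Forecast the residual with a least-squares AR(1) coefficient.
    phi = resid[:-1] @ resid[1:] / (resid[:-1] @ resid[:-1] + 1e-12)
    resid_next = phi * resid[-1]

    return smooth_next + resid_next
```

The design point the abstract makes is exactly this separation: the smooth component carries the level and slow fluctuations, while a second model cleans up whatever structure remains in the residuals.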
A Novel Wind Speed Forecasting Model for Wind Farms of Northwest China
NASA Astrophysics Data System (ADS)
Wang, Jian-Zhou; Wang, Yun
2017-01-01
Wind resources are becoming increasingly significant due to their clean and renewable characteristics, and the integration of wind power into existing electricity systems is imminent. To maintain a stable power supply system that takes into account the stochastic nature of wind speed, accurate wind speed forecasting is pivotal. However, no single model can be applied to all cases. Recent studies show that wind speed forecasting errors are approximately 25% to 40% in Chinese wind farms. Presently, hybrid wind speed forecasting models are widely used and have been verified to perform better than conventional single forecasting models, not only in short-term wind speed forecasting but also in long-term forecasting. In this paper, a hybrid forecasting model is developed: the Similar Coefficient Sum (SCS) and Hermite Interpolation are exploited to process the original wind speed data, and an SVM model whose parameters are tuned by an artificial intelligence model is built to make forecasts. The results of case studies show that the MAPE value of the hybrid model varies from 22.96% to 28.87%, and the MAE value varies from 0.47 m/s to 1.30 m/s. The Sign test, Wilcoxon's Signed-Rank test, and Morgan-Granger-Newbold test indicate that the proposed model performs significantly differently from the compared models.
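The two accuracy measures quoted above are straightforward to compute; the observed and forecast wind speeds below are illustrative values, not data from the study.

```python
def mape(actual, forecast):
    """Mean absolute percentage error, as reported for the wind-speed models."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

def mae(actual, forecast):
    """Mean absolute error (here in m/s for wind speed)."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

obs = [5.0, 6.0, 4.0, 8.0]   # observed wind speeds (m/s), illustrative
fc = [4.5, 6.6, 3.8, 7.0]    # model forecasts (m/s), illustrative
print(mape(obs, fc), mae(obs, fc))
```

Note that MAPE is undefined for zero observations, which is one reason wind-speed studies usually report MAE (in m/s) alongside it.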
An Intelligent Decision Support System for Workforce Forecast
2011-01-01
An autoregressive, integrated, moving-average (ARIMA) model was used to forecast the demand for construction skills in Hong Kong. The report also surveys candidate forecasting techniques, including Decision Trees, ARIMA, Rule-Based Forecasting, Segmentation Forecasting, Regression Analysis, Simulation Modeling, Input-Output Models, LP and NLP, and Markovian models.
Deep Learning Based Solar Flare Forecasting Model. I. Results for Line-of-sight Magnetograms
NASA Astrophysics Data System (ADS)
Huang, Xin; Wang, Huaning; Xu, Long; Liu, Jinfu; Li, Rong; Dai, Xinghua
2018-03-01
Solar flares originate from the release of the energy stored in the magnetic field of solar active regions; the triggering mechanism for these flares, however, remains unknown. For this reason, conventional solar flare forecasting is essentially based on the statistical relationship between solar flares and measures extracted from observational data. In the current work, a deep learning method is applied to set up the solar flare forecasting model, in which forecasting patterns can be learned from line-of-sight magnetograms of solar active regions. In order to obtain a large amount of observational data to train the forecasting model and test its performance, a data set is created from line-of-sight magnetograms of active regions observed by SOHO/MDI and SDO/HMI from 1996 April to 2015 October and the corresponding soft X-ray solar flares observed by GOES. The testing results of the forecasting model indicate that (1) the forecasting patterns can be automatically learned from the MDI data and can also be applied to the HMI data; furthermore, these forecasting patterns are robust to the noise in the observational data; (2) the performance of the deep learning forecasting model is not sensitive to the given forecasting periods (6, 12, 24, or 48 hr); (3) the performance of the proposed forecasting model is comparable to that of the state-of-the-art flare forecasting models, even though the duration of the total magnetograms continuously spans 19.5 years. Case analyses demonstrate that the deep learning based solar flare forecasting model pays attention to areas with magnetic polarity-inversion lines or strong magnetic fields in magnetograms of active regions.
Improving inflow forecasting into hydropower reservoirs through a complementary modelling framework
NASA Astrophysics Data System (ADS)
Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K.
2014-10-01
Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and the benefits gained through hydropower generation. Improving hourly reservoir inflow forecasts over a 24 h lead time is considered within the day-ahead (Elspot) market of the Nordic exchange market. We present here a new approach for issuing hourly reservoir inflow forecasts that aims to improve on existing forecasting models that are in place operationally, without needing to modify the pre-existing approach, but instead formulating an additive or complementary model that is independent and captures the structure the existing model may be missing. Besides improving the forecast skill of operational models, the approach estimates the uncertainty in the complementary model structure and produces probabilistic inflow forecasts that contain information suitable for reducing uncertainty in decision-making processes in hydropower system operation. The procedure presented comprises an error model added on top of an unalterable constant-parameter conceptual model, the models being demonstrated with reference to the 207 km² Krinsvatn catchment in central Norway. The structure of the error model is established based on attributes of the residual time series from the conceptual model. Deterministic and probabilistic evaluations revealed an overall significant improvement in forecast accuracy for lead times up to 17 h. Season-based evaluations indicated that the improvement in inflow forecasts varies across seasons, and inflow forecasts in autumn and spring are less successful, with the 95% prediction interval bracketing less than 95% of the observations for lead times beyond 17 h.
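The additive structure of the complementary approach can be sketched as a stochastic error model layered on top of an unaltered base forecast. The AR(1) error form and all parameter values below are illustrative simplifications of the paper's error model, not its actual specification.

```python
import numpy as np

def fit_ar1(residuals):
    """Least-squares AR(1) coefficient of the base model's forecast residuals."""
    r = np.asarray(residuals, dtype=float)
    return (r[:-1] @ r[1:]) / (r[:-1] @ r[:-1])

def complementary_forecast(base_forecast, last_residual, phi, sigma,
                           n_ens=100, seed=0):
    """Add a stochastic AR(1) error model on top of an unaltered base-model
    forecast, yielding an ensemble (probabilistic) inflow forecast."""
    rng = np.random.default_rng(seed)
    ens = []
    for _ in range(n_ens):
        r = last_residual
        member = []
        for q in base_forecast:        # one residual trajectory per member
            r = phi * r + rng.normal(0, sigma)
            member.append(q + r)
        ens.append(member)
    return np.array(ens)

# Base (conceptual-model) inflow forecast in m3/s, illustrative values:
ens = complementary_forecast([10.0, 11.0, 12.0],
                             last_residual=1.5, phi=0.8, sigma=0.3)
lo, hi = np.percentile(ens, [2.5, 97.5], axis=0)   # 95% prediction interval
```

The base model is never touched: all of the correction and all of the uncertainty description live in the complementary layer, which is the design choice the abstract highlights.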
Marcilio, Izabel; Hajat, Shakoor; Gouveia, Nelson
2013-08-01
This study aimed to develop different models to forecast the daily number of patients seeking emergency department (ED) care in a general hospital according to calendar variables and ambient temperature readings and to compare the models in terms of forecasting accuracy. The authors developed and tested six different models of ED patient visits using total daily counts of patient visits to an ED in Sao Paulo, Brazil, from January 1, 2008, to December 31, 2010. The first 33 months of the data set were used to develop the ED patient visits forecasting models (the training set), leaving the last 3 months to measure each model's forecasting accuracy by the mean absolute percentage error (MAPE). Forecasting models were developed using three different time-series analysis methods: generalized linear models (GLM), generalized estimating equations (GEE), and seasonal autoregressive integrated moving average (SARIMA). For each method, models were explored with and without the effect of mean daily temperature as a predictive variable. The daily mean number of ED visits was 389, ranging from 166 to 613. Data showed a weekly seasonal distribution, with the highest patient volumes on Mondays and the lowest on weekends. There was little variation in daily visits by month. GLM and GEE models showed better forecasting accuracy than SARIMA models. For instance, the MAPEs from the GLM and GEE models in the first month of forecasting (October 2010) were 11.5 and 10.8% (models with and without control for the temperature effect, respectively), while the MAPEs from the SARIMA models were 12.8 and 11.7%. For all models, controlling for the effect of temperature resulted in worse or similar forecasting ability compared with models using calendar variables alone, and forecasting accuracy was better for the short-term horizon (7 days in advance) than for the longer term (30 days in advance).
This study indicates that time-series models can be developed to provide forecasts of daily ED patient visits, and forecasting ability was dependent on the type of model employed and the length of the time horizon being predicted. In this setting, GLM and GEE models showed better accuracy than SARIMA models. Including information about ambient temperature in the models did not improve forecasting accuracy. Forecasting models based on calendar variables alone did in general detect patterns of daily variability in ED volume and thus could be used for developing an automated system for better planning of personnel resources. © 2013 by the Society for Academic Emergency Medicine.
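A minimal calendar-only baseline of the kind evaluated above can be sketched as a day-of-week mean model; the visit counts below are illustrative, not the Sao Paulo data.

```python
from collections import defaultdict

def fit_weekday_means(visits):
    """Baseline calendar model: mean ED visits per day of week.
    visits: iterable of (weekday, count) pairs, weekday in 0..6 (Mon=0)."""
    sums, counts = defaultdict(float), defaultdict(int)
    for wd, n in visits:
        sums[wd] += n
        counts[wd] += 1
    return {wd: sums[wd] / counts[wd] for wd in sums}

def forecast(model, weekdays):
    """Forecast daily visit counts for a sequence of weekdays."""
    return [model[wd] for wd in weekdays]

# Illustrative history: Mondays busy, weekends quiet, as the abstract reports.
history = [(0, 500), (0, 520), (5, 300), (5, 320), (6, 280), (6, 300)]
model = fit_weekday_means(history)
print(forecast(model, [0, 5, 6]))
```

The GLM/GEE models in the study are richer than this (they also include month and holiday effects, and optionally temperature), but the weekly cycle captured here is, per the abstract, where most of the predictable variability lives.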
NASA Astrophysics Data System (ADS)
Li, Ji; Chen, Yangbo; Wang, Huanyu; Qin, Jianming; Li, Jie; Chiao, Sen
2017-03-01
Long lead time flood forecasting is very important for large watershed flood mitigation as it provides more time for flood warning and emergency responses. The latest numerical weather forecast models can provide 1-15-day quantitative precipitation forecasting products in grid format, and coupling such products with a distributed hydrological model can produce long lead time watershed flood forecasting products. This paper studied the feasibility of coupling the Liuxihe model with the Weather Research and Forecasting quantitative precipitation forecast (WRF QPF) for large watershed flood forecasting in southern China. The WRF QPF products have three lead times, 24, 48 and 72 h, with a grid resolution of 20 km × 20 km. The Liuxihe model is set up with freely downloaded terrain properties; the model parameters were previously optimized with rain gauge observed precipitation and re-optimized with the WRF QPF. Results show that the WRF QPF is biased relative to the rain gauge precipitation, and a post-processing method is proposed for the WRF QPF products, which improves the flood forecasting capability. With model parameter re-optimization, the model's performance also improves. This suggests that the model parameters should be optimized with the QPF, not the rain gauge precipitation. As the lead time increases, the accuracy of the WRF QPF decreases, as does the flood forecasting capability. Flood forecasting products produced by coupling the Liuxihe model with the WRF QPF provide a good reference for large watershed flood warning due to their long lead time and rational results.
NASA Astrophysics Data System (ADS)
Bennett, J.; David, R. E.; Wang, Q.; Li, M.; Shrestha, D. L.
2016-12-01
Flood forecasting in Australia has historically relied on deterministic forecasting models run only when floods are imminent, with considerable forecaster input and interpretation. These now co-exist with a continually available 7-day streamflow forecasting service (also deterministic) aimed at operational water management applications such as environmental flow releases. The 7-day service is not optimised for flood prediction. We describe progress on developing a system for ensemble streamflow forecasting that is suitable for both flood prediction and water management applications. Precipitation uncertainty is handled through post-processing of Numerical Weather Prediction (NWP) output with a Bayesian rainfall post-processor (RPP). The RPP corrects biases, downscales NWP output, and produces reliable ensemble spread. Ensemble precipitation forecasts are used to force a semi-distributed conceptual rainfall-runoff model. Uncertainty in precipitation forecasts is insufficient to reliably describe streamflow forecast uncertainty, particularly at shorter lead times. We characterise hydrological prediction uncertainty separately with a 4-stage error model. The error model relies on data transformation to ensure residuals are homoscedastic and symmetrically distributed. To ensure streamflow forecasts are accurate and reliable, the residuals are modelled using a mixture-Gaussian distribution with distinct parameters for the rising and falling limbs of the forecast hydrograph. In a case study of the Murray River in south-eastern Australia, we show that ensemble predictions of floods generally have lower errors than deterministic forecasting methods. We also discuss some of the challenges in operationalising short-term ensemble streamflow forecasts in Australia, including meeting the need for accurate predictions across all flow ranges and comparing forecasts generated by event and continuous hydrological models.
Weighting of NMME temperature and precipitation forecasts across Europe
NASA Astrophysics Data System (ADS)
Slater, Louise J.; Villarini, Gabriele; Bradley, A. Allen
2017-09-01
Multi-model ensemble forecasts are obtained by weighting multiple General Circulation Model (GCM) outputs to improve forecast skill and reduce uncertainties. The North American Multi-Model Ensemble (NMME) project facilitates the development of such multi-model forecasting schemes by providing publicly available hindcasts and forecasts online. Here, temperature and precipitation forecasts are enhanced by leveraging the strengths of eight NMME GCMs (CCSM3, CCSM4, CanCM3, CanCM4, CFSv2, GEOS5, GFDL2.1, and FLORb01) across all forecast months and lead times, for four broad climatic European regions: Temperate, Mediterranean, Humid-Continental and Subarctic-Polar. We compare five different approaches to multi-model weighting based on the equally weighted eight single-model ensembles (EW-8), Bayesian updating (BU) of the eight single-model ensembles (BU-8), BU of the 94 model members (BU-94), BU of the principal components of the eight single-model ensembles (BU-PCA-8) and BU of the principal components of the 94 model members (BU-PCA-94). We assess the forecasting skill of these five multi-models and evaluate their ability to predict some of the costliest historical droughts and floods in recent decades. Results indicate that the simplest approach, based on EW-8, preserves model skill but has considerable biases. The BU and BU-PCA approaches reduce the unconditional biases and negative skill in the forecasts considerably, but they can also sometimes diminish the positive skill in the original forecasts. The BU-PCA models tend to produce lower conditional biases than the BU models and have more homogeneous skill than the other multi-models, but with some loss of skill. The use of the 94 NMME model members does not present significant benefits over the use of the 8 single-model ensembles. These findings may provide valuable insights for the development of skillful, operational multi-model forecasting systems.
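The simplest of the five schemes, equal weighting per model (as in EW-8), can be sketched as follows. The point is that each GCM receives equal total weight regardless of how many ensemble members it contributes; the forecast values below are illustrative.

```python
import numpy as np

def equal_weight_ensemble(model_forecasts):
    """EW-style combination: pool the members of each single-model ensemble
    with equal weight per model (not per member), so models with more
    members are not over-represented."""
    n_models = len(model_forecasts)
    pooled, weights = [], []
    for members in model_forecasts:            # one array of members per GCM
        members = np.asarray(members, dtype=float)
        pooled.append(members)
        # Each member carries an equal share of its model's 1/n_models weight.
        weights.append(np.full(len(members), 1.0 / (n_models * len(members))))
    pooled = np.concatenate(pooled)
    weights = np.concatenate(weights)
    return float(np.sum(weights * pooled))     # weighted ensemble mean

# Two hypothetical GCMs with different ensemble sizes forecasting temperature:
print(equal_weight_ensemble([[14.0, 15.0], [16.0, 16.5, 17.5]]))
```

Weighting per member instead of per model corresponds to the 94-member pooling (BU-94 style) that the study found offers no significant benefit over the 8 single-model ensembles.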
A comparison of GLAS SAT and NMC high resolution NOSAT forecasts from 19 and 11 February 1976
NASA Technical Reports Server (NTRS)
Atlas, R.
1979-01-01
A subjective comparison of the Goddard Laboratory for Atmospheric Sciences (GLAS) and the National Meteorological Center (NMC) high resolution model forecasts is presented. Two cases where NMC's operational model in 1976 had serious difficulties in forecasting for the United States were examined. For each of the cases, the GLAS model forecasts from initial conditions which included satellite sounding data were compared directly to the NMC higher resolution model forecasts, from initial conditions which excluded the satellite data. The comparison showed that the GLAS satellite forecasts significantly improved upon the current NMC operational model's predictions in both cases.
Satellite Sounder Data Assimilation for Improving Alaska Region Weather Forecast
NASA Technical Reports Server (NTRS)
Zhu, Jiang; Stevens, E.; Zhang, X.; Zavodsky, B. T.; Heinrichs, T.; Broderson, D.
2014-01-01
A case study and a monthly statistical analysis using sounder data assimilation to improve the Alaska regional weather forecast model are presented. Weather forecasting in Alaska faces challenges as well as opportunities. Alaska has a large land area with multiple types of topography and a long coastline, so weather forecast models must be finely tuned in order to accurately predict weather there. Being at high latitudes provides Alaska greater coverage by polar-orbiting satellites for integration into forecasting models than the lower 48 states. Forecasting marine low stratus clouds is critical to the Alaska aviation and oil industries and is the current focus of the case study. NASA AIRS/CrIS sounder profiles are assimilated into the Alaska regional weather forecast model to improve Arctic marine stratus cloud forecasts. The choice of physics options for the WRF model is discussed, and the preprocessing of AIRS/CrIS sounder data for assimilation is described. Local observation data, satellite data, and global data assimilation data are used to verify and evaluate the forecast results using the Model Evaluation Tools (MET).
Operational hydrological forecasting in Bavaria. Part II: Ensemble forecasting
NASA Astrophysics Data System (ADS)
Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.
2009-04-01
In part I of this study, the operational flood forecasting system in Bavaria and an approach to identify and quantify forecast uncertainty were introduced. The approach is split into the calculation of an empirical 'overall error' from archived forecasts and the calculation of an empirical 'model error' based on hydrometeorological forecast tests in which rainfall observations were used instead of forecasts. The 'model error' can be superimposed on the spread of a hydrometeorological ensemble forecast, especially in upstream catchments where forecast uncertainty depends strongly on the current predictability of the atmosphere. In Bavaria, two meteorological ensemble prediction systems are currently being tested for operational use: the 16-member COSMO-LEPS forecast and a poor man's ensemble composed of DWD GME, DWD Cosmo-EU, NCEP GFS, Aladin-Austria and MeteoSwiss Cosmo-7. The determination of the overall forecast uncertainty depends on the catchment characteristics. 1. Upstream catchments with high influence of the weather forecast: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. b) Corresponding to the characteristics of the meteorological ensemble forecast, each resulting forecast hydrograph can be regarded as equally likely. c) The 'model error' distribution, with parameters dependent on the hydrological case and lead time, is added to each forecast timestep of each ensemble member. d) For each forecast timestep, the overall error distribution (i.e., over the 'model error' distributions of all ensemble members) is calculated. e) From this distribution, the uncertainty range at a desired level (here: the 10% and 90% percentiles) is extracted and drawn as a forecast envelope. f) As the mean or median of an ensemble forecast does not necessarily exhibit a meteorologically sound temporal evolution, a single hydrological forecast termed the 'lead forecast' is chosen and shown in addition to the uncertainty bounds. This can be either an intermediate forecast between the extremes of the ensemble spread or a manually selected forecast based on a meteorologist's advice. 2. Downstream catchments with low influence of the weather forecast: In downstream catchments with strong human impact on discharge (e.g. by reservoir operation) and a large influence of upstream gauge observation quality on forecast quality, the 'overall error' may in most cases be larger than the combination of the 'model error' and the ensemble spread. Therefore, the overall forecast uncertainty bounds are calculated differently: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing; here, the corresponding inflow hydrographs from all upstream catchments must additionally be used. b) As for an upstream catchment, the uncertainty range is determined by combining the 'model error' and the ensemble member forecasts. c) In addition, the 'overall error' is superimposed on the 'lead forecast'. For reasons of consistency, the lead forecast must be based on the same meteorological forecast in the downstream and all upstream catchments. d) From the resulting two uncertainty ranges (one from the ensemble forecast and 'model error', one from the 'lead forecast' and 'overall error'), the envelope is taken as the most prudent uncertainty range. In sum, the uncertainty associated with each forecast run is calculated and communicated to the public in the form of 10% and 90% percentiles. As in part I of this study, the methodology, as well as the usefulness or otherwise of the resulting uncertainty ranges, is presented and discussed using typical examples.
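The upstream-catchment procedure (adding a model-error distribution to each ensemble member, then extracting percentile bounds per timestep) can be sketched numerically. The hydrographs and error samples below are illustrative, and the lead-time dependence of the error distribution is omitted for brevity.

```python
import numpy as np

def forecast_envelope(member_forecasts, error_samples, lower=10, upper=90):
    """Superimpose empirical 'model error' samples on each ensemble member
    and extract the 10%/90% envelope at each forecast timestep."""
    members = np.asarray(member_forecasts, dtype=float)  # (n_members, n_steps)
    errors = np.asarray(error_samples, dtype=float)      # (n_error_samples,)
    # All member+error combinations approximate the overall error distribution.
    combined = members[:, None, :] + errors[None, :, None]
    flat = combined.reshape(-1, members.shape[1])
    return np.percentile(flat, [lower, upper], axis=0)

# Three illustrative member hydrographs (discharge, m3/s) over three timesteps,
# plus three empirical model-error samples:
members = [[10.0, 12.0, 15.0], [11.0, 13.0, 18.0], [9.0, 12.5, 16.0]]
errs = [-1.0, 0.0, 1.0]
lo, hi = forecast_envelope(members, errs)
```

The resulting `lo`/`hi` curves are the 10% and 90% bounds that the Bavarian system draws around the separately chosen 'lead forecast'.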
New Models for Forecasting Enrollments: Fuzzy Time Series and Neural Network Approaches.
ERIC Educational Resources Information Center
Song, Qiang; Chissom, Brad S.
Since university enrollment forecasting is very important, many different methods and models have been proposed by researchers. Two new methods for enrollment forecasting are introduced: (1) the fuzzy time series model; and (2) the artificial neural networks model. Fuzzy time series has been proposed to deal with forecasting problems within a…
NASA Astrophysics Data System (ADS)
Krzyścin, J. W.; Jaroslawski, J.; Sobolewski, P.
2001-10-01
A forecast of the UV index for the following day is presented. The standard approach to UV index modelling is applied, i.e., the clear-sky UV index is multiplied by the UV cloud transmission factor. The input to the clear-sky model (the tropospheric ultraviolet and visible (TUV) model; Madronich, in: M. Tevini (Ed.), Environmental Effects of Ultraviolet Radiation, Lewis Publisher, Boca Raton, 1993, p. 17) consists of the total ozone forecast (by a regression model using the observed and forecasted meteorological variables taken as the initial values of the aviation (AVN) global model and their 24-hour forecasts, respectively) and the aerosol optical depth (AOD) forecast (assumed persistence). The cloud transmission factor forecast is inferred from the 24-h AVN model run for the total (Sun + sky) solar irradiance at noon. The model is validated by comparing the UV index forecasts with observed values, which are derived from the daily pattern of the UV erythemal irradiance measured at Belsk (52°N, 21°E), Poland, by means of the UV Biometer (Model 501A) for the period May-September 1999. Eighty-one percent and 92% of all forecasts fall within the ±1 and ±2 index-unit ranges, respectively. Underestimation of the UV index occurs in only 15% of cases; thus, the model gives the public a high margin of safety for Sun protection. It is found that in ~35% of all cases a more accurate forecast of AOD is needed to estimate the daily maximum of clear-sky irradiance with an error not exceeding 5%. The assumption of persistence of the cloud characteristics appears to be an alternative to the 24-h forecast of the cloud transmission factor in cases when the AVN prognoses are not available.
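The core of the standard approach is a single multiplication, sketched here with a basic validity check on the cloud transmission factor; the numeric inputs are illustrative.

```python
def uv_index_forecast(clear_sky_uvi, cloud_transmission):
    """Standard UV index decomposition used above: forecast UVI equals the
    clear-sky UVI multiplied by a cloud transmission factor in [0, 1]."""
    if not 0.0 <= cloud_transmission <= 1.0:
        raise ValueError("cloud transmission factor must lie in [0, 1]")
    return clear_sky_uvi * cloud_transmission

# E.g. a clear-sky index of 6.0 under roughly 70% cloud transmission:
print(uv_index_forecast(6.0, 0.7))
```

All of the forecasting difficulty then lives in the two factors: the clear-sky UVI (driven by ozone and aerosol forecasts) and the cloud transmission (here taken from the AVN irradiance forecast or assumed persistent).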
Evaluation of streamflow forecast for the National Water Model of U.S. National Weather Service
NASA Astrophysics Data System (ADS)
Rafieeinasab, A.; McCreight, J. L.; Dugger, A. L.; Gochis, D.; Karsten, L. R.; Zhang, Y.; Cosgrove, B.; Liu, Y.
2016-12-01
The National Water Model (NWM), an implementation of the community WRF-Hydro modeling system, is an operational hydrologic forecasting model for the contiguous United States. The model forecasts distributed hydrologic states and fluxes, including soil moisture, snowpack, ET, and ponded water. In particular, the NWM provides streamflow forecasts at more than 2.7 million river reaches for three forecast ranges: short (15 hr), medium (10 days), and long (30 days). In this study, we verify short and medium range streamflow forecasts in the context of the verification of their respective quantitative precipitation forecasts/forcing (QPF), the High Resolution Rapid Refresh (HRRR) and the Global Forecast System (GFS). The streamflow evaluation is performed for the summer of 2016 at more than 6,000 USGS gauges. Both individual forecasts and forecast lead times are examined. Selected case studies of extreme events aim to provide insight into the quality of the NWM streamflow forecasts. A goal of this comparison is to address how much streamflow bias originates from precipitation forcing bias. To this end, precipitation verification is performed over the contributing areas above (and between) assimilated USGS gauge locations. Precipitation verification is based on the aggregated, blended StageIV/StageII data as the "reference truth". We summarize the skill of the streamflow forecasts, their skill relative to the QPF, and make recommendations for improving NWM forecast skill.
NASA Astrophysics Data System (ADS)
Chen, L. C.; Mo, K. C.; Zhang, Q.; Huang, J.
2014-12-01
Drought prediction from monthly to seasonal time scales is of critical importance to disaster mitigation, agricultural planning, and multi-purpose reservoir management. Starting in December 2012, NOAA Climate Prediction Center (CPC) has been providing operational Standardized Precipitation Index (SPI) Outlooks using the North American Multi-Model Ensemble (NMME) forecasts, to support CPC's monthly drought outlooks and briefing activities. The current NMME system consists of six model forecasts from U.S. and Canada modeling centers, including the CFSv2, CM2.1, GEOS-5, CCSM3.0, CanCM3, and CanCM4 models. In this study, we conduct an assessment of the predictive skill of meteorological drought using real-time NMME forecasts for the period from May 2012 to May 2014. The ensemble SPI forecasts are the equally weighted mean of the six model forecasts. Two performance measures, the anomaly correlation coefficient and root-mean-square errors against the observations, are used to evaluate forecast skill. Similar to the assessment based on NMME retrospective forecasts, predictive skill of monthly-mean precipitation (P) forecasts is generally low after the second month and errors vary among models. Although P forecast skill is not large, SPI predictive skill is high and the differences among models are small. The skill mainly comes from the P observations appended to the model forecasts. This factor also contributes to the similarity of SPI prediction among the six models. Still, NMME SPI ensemble forecasts have higher skill than those based on individual models or persistence, and the 6-month SPI forecasts are skillful out to four months. The three major drought events that occurred during the 2012-2014 period (the 2012 Central Great Plains drought, the 2013 Upper Midwest flash drought, and the 2013-2014 California drought) are used as examples to illustrate the system's strength and limitations.
For precipitation-driven drought events, such as the 2012 Central Great Plains drought, NMME SPI forecasts perform well in predicting drought severity and spatial patterns. For fast-developing drought events, such as the 2013 Upper Midwest flash drought, the system failed to capture the onset of the drought.
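Two mechanical details of the system above lend themselves to a short sketch: the equally weighted six-model ensemble mean, and the appending of observed precipitation ahead of the forecast months (the abstract credits this appended segment with most of the early-lead SPI skill). The helper names below are illustrative assumptions, not CPC's code.

```python
import numpy as np

# Illustrative sketch (not CPC's implementation) of two steps described in
# the abstract: (1) the ensemble SPI forecast uses the equally weighted
# mean of the six model forecasts; (2) observed precipitation is appended
# ahead of the forecast months before the SPI accumulation/standardization.

def ensemble_mean(model_forecasts):
    """Equally weighted mean over models; input shape (n_models, n_leads)."""
    return np.mean(np.asarray(model_forecasts, dtype=float), axis=0)

def append_observations(observed_precip, forecast_precip):
    """Prepend observed P to the forecast months, as the abstract describes:
    for an n-month SPI, the first leads are dominated by observed data."""
    return np.concatenate([np.asarray(observed_precip, dtype=float),
                           np.asarray(forecast_precip, dtype=float)])
```

Because the observed segment is identical across all six models, forecasts built on it necessarily agree at short leads, which is the similarity the abstract notes.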
A data-driven multi-model methodology with deep feature selection for short-term wind forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias
With the growing wind penetration into the power system worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by first-layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. This developed multi-model wind forecasting methodology is compared to several benchmarks. The effectiveness of the proposed methodology is evaluated to provide 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network. Numerical results show that compared to the single-algorithm models, the developed multi-model framework with deep feature selection procedure improves the forecasting accuracy by up to 30%.
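The two-layer idea can be illustrated with a minimal stacking sketch. This is a toy on synthetic data, not the authors' methodology: the "first-layer models" here are deliberately different linear fits standing in for heterogeneous ML algorithms, and the second layer blends their forecasts with least-squares weights.

```python
import numpy as np

# Minimal two-layer (stacking) sketch in the spirit of the abstract, on
# synthetic data. Names, the linear first-layer "models", and the
# least-squares blend are illustrative assumptions, not the paper's code.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

# First layer: three different models, each seeing a subset of features,
# standing in for statistically different machine learning algorithms.
subsets = [[0], [1], [0, 1, 2]]
first_layer = [np.linalg.lstsq(X[:, s], y, rcond=None)[0] for s in subsets]
forecasts = np.column_stack([X[:, s] @ w for s, w in zip(subsets, first_layer)])

# Second layer: blend the individual forecasts with least-squares weights.
blend_w = np.linalg.lstsq(forecasts, y, rcond=None)[0]
blended = forecasts @ blend_w

def rmse(a, b):
    """Root-mean-square error between two series."""
    return float(np.sqrt(np.mean((a - b) ** 2)))
```

By construction, the least-squares blend fits the training data at least as well as any single first-layer forecast, which is the motivation for blending statistically different models.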
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-01-01
Contents: Volume 2: Commissioned Papers: Congestion Trends in Metropolitan Areas; Alternative Methods for Measuring Congestion Levels; Potential of Congestion Pricing in the Metropolitan Washington Region; Transportation Pricing and Travel Behavior; Peak Pricing Strategies in Transportation, Utilities, and Telecommunications: Lessons for Road Pricing; Cashing Out Employer-Paid Parking: A Precedent for Congestion Pricing; The New York Region: First in Tolls, Last in Road Pricing; Pricing Urban Roadways: Administrative and Institutional Issues; Equity and Fairness Considerations of Congestion Pricing; The Politics of Congestion Pricing; Institutional and Political Challenges in Implementing Congestion Pricing: Case Study of the San Francisco Bay Area; How Congestion Pricing Came to Be Proposed in the San Diego Region: A Case History; Urban Transportation Congestion Pricing: Effects on Urban Form; Congestion Pricing and Motor Vehicle Emissions: An Initial Review; Private Toll Roads: Acceptability of Congestion Pricing in Southern California; Potential of Next-Generation Technology; Electronic Toll Collection Systems; and Impacts of Congestion Pricing on Transit and Carpool Demand and Supply.
Next-Day Earthquake Forecasts for California
NASA Astrophysics Data System (ADS)
Werner, M. J.; Jackson, D. D.; Kagan, Y. Y.
2008-12-01
We implemented a daily forecast of m > 4 earthquakes for California in a format suitable for testing in community-based earthquake predictability experiments: Regional Earthquake Likelihood Models (RELM) and the Collaboratory for the Study of Earthquake Predictability (CSEP). The forecast is based on near-real time earthquake reports from the ANSS catalog above magnitude 2 and will be available online. The model used to generate the forecasts is based on the Epidemic-Type Earthquake Sequence (ETES) model, a stochastic model of clustered and triggered seismicity. Our particular implementation is based on the earlier work of Helmstetter et al. (2006, 2007), but we extended the forecast to all of California, used more data to calibrate the model and its parameters, and made some modifications. Our forecasts will compete against the Short-Term Earthquake Probabilities (STEP) forecasts of Gerstenberger et al. (2005) and other models in the next-day testing class of the CSEP experiment in California. We illustrate our forecasts with examples and discuss preliminary results.
NASA Astrophysics Data System (ADS)
Lehner, F.; Wood, A.; Llewellyn, D.; Blatchford, D. B.; Goodbody, A. G.; Pappenberger, F.
2017-12-01
Recent studies have documented the influence of increasing temperature on streamflow across the American West, including snow-melt driven rivers such as the Colorado or Rio Grande. At the same time, some basins are reporting decreasing skill in seasonal streamflow forecasts, termed water supply forecasts (WSFs), over the recent decade. While the skill in seasonal precipitation forecasts from dynamical models remains low, their skill in predicting seasonal temperature variations could potentially be harvested for WSFs to account for non-stationarity in regional temperatures. Here, we investigate whether WSF skill can be improved by incorporating seasonal temperature forecasts from dynamical forecasting models (from the North American Multi Model Ensemble and the European Centre for Medium-Range Weather Forecast System 4) into traditional statistical forecast models. We find improved streamflow forecast skill relative to traditional WSF approaches in a majority of headwater locations in the Colorado and Rio Grande basins. Incorporation of temperature into WSFs thus provides a promising avenue to increase the robustness of current forecasting techniques in the face of continued regional warming.
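The core statistical move in the study above is augmenting a traditional regression-based water supply forecast with a seasonal temperature forecast as an extra predictor. The sketch below illustrates that idea on synthetic data (the toy relationship, coefficients, and function names are assumptions, not the authors' model): when flow depends on temperature, the augmented regression fits better than the precipitation-only baseline.

```python
import numpy as np

# Illustrative sketch (synthetic data, not the paper's model): a
# traditional statistical WSF regresses seasonal flow on accumulated
# precipitation; the augmented model adds a seasonal temperature forecast
# as a predictor to account for warming-driven nonstationarity.

rng = np.random.default_rng(1)
n = 60  # years of record
precip = rng.normal(100.0, 15.0, n)      # winter accumulated precipitation
temp_fcst = rng.normal(0.0, 1.0, n)      # seasonal temperature anomaly forecast
# Toy relationship: warmer seasons reduce snowmelt-driven flow.
flow = 2.0 * precip - 8.0 * temp_fcst + rng.normal(0.0, 5.0, n)

def fit_rmse(predictors, target):
    """Least-squares fit with intercept; returns the in-sample RMSE."""
    A = np.column_stack([np.ones(len(target))] + predictors)
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return float(np.sqrt(np.mean((A @ coef - target) ** 2)))

baseline = fit_rmse([precip], flow)               # precipitation only
augmented = fit_rmse([precip, temp_fcst], flow)   # + temperature forecast
```

In this toy setting the augmented model recovers the temperature signal and reduces the residual error, mirroring the improved skill the study reports for temperature-sensitive headwater basins.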
NASA Astrophysics Data System (ADS)
Cobourn, W. Geoffrey
2010-08-01
An enhanced PM2.5 air quality forecast model based on nonlinear regression (NLR) and back-trajectory concentrations has been developed for use in the Louisville, Kentucky metropolitan area. The PM2.5 air quality forecast model is designed for use in the warm season, from May through September, when PM2.5 air quality is more likely to be critical for human health. The enhanced PM2.5 model consists of a basic NLR model, developed for use with an automated air quality forecast system, and an additional parameter based on upwind PM2.5 concentration, called PM24. The PM24 parameter is designed to be determined manually, by synthesizing backward air trajectory and regional air quality information to compute 24-h back-trajectory concentrations. The PM24 parameter may be used by air quality forecasters to adjust the forecast provided by the automated forecast system. In this study of the 2007 and 2008 forecast seasons, the enhanced model performed well using forecasted meteorological data and PM24 as input. The enhanced PM2.5 model was compared with three alternative models, including the basic NLR model, the basic NLR model with a persistence parameter added, and the NLR model with persistence and PM24. The two models that included PM24 were of comparable accuracy. The two models incorporating back-trajectory concentrations had lower mean absolute errors and higher rates of detecting unhealthy PM2.5 concentrations compared to the other models.
NASA Astrophysics Data System (ADS)
Hitaj, Claudia
In this dissertation, I analyze the drivers of wind power development in the United States as well as the relationship between renewable power plant location and transmission congestion and emissions levels. I first examine the role of government renewable energy incentives and access to the electricity grid on investment in wind power plants across counties from 1998-2007. The results indicate that the federal production tax credit, state-level sales tax credit and production incentives play an important role in promoting wind power. In addition, higher wind power penetration levels can be achieved by bringing more parts of the electricity transmission grid under independent system operator regulation. I conclude that state and federal government policies play a significant role in wind power development both by providing financial support and by improving physical and procedural access to the electricity grid. Second, I examine the effect of renewable power plant location on electricity transmission congestion levels and system-wide emissions levels in a theoretical model and a simulation study. A new renewable plant takes the effect of congestion on its own output into account, but ignores the effect of its marginal contribution to congestion on output from existing plants, which results in curtailment of renewable power. Though pricing congestion removes the externality and reduces curtailment, I find that in the absence of a price on emissions, pricing congestion may in some cases actually increase system-wide emissions. The final part of my dissertation deals with an econometric issue that emerged from the empirical analysis of the drivers of wind power. I study the effect of the degree of censoring on random-effects Tobit estimates in finite sample with a particular focus on severe censoring, when the percentage of uncensored observations reaches 1 to 5 percent. 
The results show that the Tobit model performs well even at 5 percent uncensored observations with the bias in the Tobit estimates remaining at or below 5 percent. Under severe censoring (1 percent uncensored observations), large biases appear in the estimated standard errors and marginal effects. These are generally reduced as the sample size increases in both N and T.
Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheung, WanYin; Zhang, Jie; Florita, Anthony
2015-12-08
Uncertainties associated with solar forecasts present challenges to maintain grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts in all parameters with the smallest NRMSE. The NRMSE of solar irradiance forecasts of the ensemble NWP model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that the errors of the forecasted cell temperature attributed only approximately 0.12% to the NRMSE of the power output as opposed to 7.44% from the forecasted solar irradiance.
Forecast first: An argument for groundwater modeling in reverse
White, Jeremy
2017-01-01
Numerical groundwater models are important components of groundwater analyses that are used for making critical decisions related to the management of groundwater resources. In this support role, models are often constructed to serve a specific purpose, that is, to provide insights, through simulation, related to a specific function of a complex aquifer system that cannot be observed directly (Anderson et al. 2015). For any given modeling analysis, several model input datasets must be prepared. Herein, the datasets required to simulate the historical conditions are referred to as the calibration model, and the datasets required to simulate the model's purpose are referred to as the forecast model. Future groundwater conditions or other unobserved aspects of the groundwater system may be simulated by the forecast model; the outputs of interest from the forecast model represent the purpose of the modeling analysis. Unfortunately, the forecast model, needed to simulate the purpose of the modeling analysis, is seemingly an afterthought: calibration is where the majority of time and effort are expended, and calibration is usually completed before the forecast model is even constructed. Herein, I propose a new groundwater modeling workflow, referred to as the "forecast first" workflow, in which the forecast model is constructed at an earlier stage in the modeling analysis and the outputs of interest from the forecast model are evaluated during subsequent tasks in the workflow.
Robustness of disaggregate oil and gas discovery forecasting models
Attanasi, E.D.; Schuenemeyer, J.H.
1989-01-01
The trend in forecasting oil and gas discoveries has been to develop and use models that allow forecasts of the size distribution of future discoveries. From such forecasts, exploration and development costs can more readily be computed. Two classes of these forecasting models are the Arps-Roberts type models and the 'creaming method' models. This paper examines the robustness of the forecasts made by these models when the historical data on which the models are based have been subject to economic upheavals or when historical discovery data are aggregated from areas having widely differing economic structures. Model performance is examined in the context of forecasting discoveries for offshore Texas State and Federal areas. The analysis shows how the model forecasts are limited by information contained in the historical discovery data. Because the Arps-Roberts type models require more regularity in the discovery sequence than the creaming models, prior information had to be introduced into the Arps-Roberts models to accommodate the influence of economic changes. The creaming methods captured the overall decline in discovery size but did not easily allow introduction of exogenous information to compensate for incomplete historical data. Moreover, the predictive log normal distribution associated with the creaming model methods appears to understate the importance of the potential contribution of small fields. © 1989.
Hybrid Forecasting of Daily River Discharges Considering Autoregressive Heteroscedasticity
NASA Astrophysics Data System (ADS)
Szolgayová, Elena Peksová; Danačová, Michaela; Komorniková, Magda; Szolgay, Ján
2017-06-01
It is widely acknowledged that in the hydrological and meteorological communities, there is a continuing need to improve the quality of quantitative rainfall and river flow forecasts. A hybrid (combined deterministic-stochastic) modelling approach is proposed here that combines the advantages offered by modelling the system dynamics with a deterministic model and the deterministic forecasting error series with a data-driven model in parallel. Since the processes to be modelled are generally nonlinear and the model error series may exhibit nonstationarity and heteroscedasticity, GARCH-type nonlinear time series models are considered here. The fitting, forecasting and simulation performance of such models has to be explored on a case-by-case basis. The goal of this paper is to test and develop an appropriate methodology for model fitting and forecasting applicable to daily river discharge forecast error data from the GARCH family of time series models. We concentrated on verifying whether the use of a GARCH-type model is suitable for modelling and forecasting a hydrological model error time series on the Hron and Morava Rivers in Slovakia. For this purpose we verified the presence of heteroscedasticity in the simulation error series of the KLN multilinear flow routing model; then we fitted the GARCH-type models to the data and compared their fit with that of an ARMA-type model. We produced one-step-ahead forecasts from the fitted models and again provided comparisons of the models' performance.
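The stochastic half of the hybrid approach rests on the GARCH(1,1) variance recursion. The hand-rolled sketch below shows that recursion applied to a model-error series (the parameter values and initialization are illustrative assumptions, not values fitted in the paper): the conditional variance responds to recent large errors, which is exactly the heteroscedasticity the authors test for.

```python
import numpy as np

# Illustrative GARCH(1,1) recursion for a hydrological model error series
# e_t (parameters omega, alpha, beta are assumed values, not fitted):
#     sigma2_t = omega + alpha * e_{t-1}^2 + beta * sigma2_{t-1}
# Stationarity requires alpha + beta < 1.

def garch_variance(errors, omega=0.1, alpha=0.15, beta=0.8):
    """Return the conditional variance path; the last entry is the
    one-step-ahead variance forecast for the next error."""
    e = np.asarray(errors, dtype=float)
    sigma2 = np.empty(len(e) + 1)
    sigma2[0] = np.var(e)  # initialize at the unconditional sample variance
    for t in range(len(e)):
        sigma2[t + 1] = omega + alpha * e[t] ** 2 + beta * sigma2[t]
    return sigma2
```

A deterministic routing forecast plus a zero-mean error with this conditional variance yields the widening or narrowing predictive intervals that motivate GARCH over a constant-variance ARMA error model.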
Time Relevance of Convective Weather Forecast for Air Traffic Automation
NASA Technical Reports Server (NTRS)
Chan, William N.
2006-01-01
The Federal Aviation Administration (FAA) is handling nearly 120,000 flights a day through its Air Traffic Management (ATM) system, and air traffic congestion is expected to increase substantially over the next 20 years. Weather-induced impacts to throughput and efficiency are the leading cause of flight delays, accounting for 70% of all delays, with convective weather accounting for 60% of all weather-related delays. To support the Next Generation Air Traffic System goal of operating at 3X current capacity in the NAS, ATC decision support tools are being developed to create advisories to assist controllers in the presence of weather constraints. Initial development of these decision support tools did not integrate information regarding weather constraints such as thunderstorms and relied on an additional system to provide that information. Future Decision Support Tools should move towards an integrated system where weather constraints are factored into the advisory of a Decision Support Tool (DST). Several groups, such as NASA Ames, Lincoln Laboratory, and MITRE, are integrating convective weather data with DSTs. A survey of current convective weather forecast and observation data shows they span a wide range of temporal and spatial resolutions. Short range convective observations can be obtained every 5 mins, with longer range forecasts out to several days updated every 6 hrs. Today, the short range forecasts of less than 2 hours have a temporal resolution of 5 mins. Beyond 2 hours, forecasts have a much lower temporal resolution of typically 1 hour. Spatial resolutions vary from 1 km for short range to 40 km for longer range forecasts. Improving the accuracy of long range convective forecasts is a major challenge. A report published by the National Research Council states that improvements in convective forecasts for the 2 to 6 hour time frame will only be achieved for a limited set of convective phenomena in the next 5 to 10 years.
Improved longer range forecasts will be probabilistic as opposed to the deterministic shorter range forecasts. Despite the known low level of confidence in long range convective forecasts, these data are still useful to a DST routing algorithm: it is better to develop an aircraft route using the best information available than no information. The temporally coarse long range forecast data need to be interpolated to be useful to a DST. A DST uses aircraft trajectory predictions that need to be evaluated for impacts by convective storms. Each time-step of a trajectory prediction needs to be checked against weather data. For the case of coarse temporal data, there needs to be a method to fill in weather data where there is none. Simply using the coarse weather data without any interpolation can result in DST routes that are impacted by regions of strong convection. Increasing the temporal resolution of these data can be achieved, but results in a large dataset that may prove to be an operational challenge to transmit and load into a DST. Currently, it takes about 7 min to retrieve a 7 MB RUC2 forecast file from NOAA at NASA Ames Research Center. A prototype NCWF6 1-hour forecast is about 3 MB in size. A 6-hour NCWF6 forecast with a 1-hr forecast time-step will be about 18 MB (6 x 3 MB), and with a 15-min forecast time-step about 72 MB (24 x 3 MB). Based on the time it takes to retrieve a 7 MB RUC2 forecast, it will take approximately 70 min to retrieve a 6-hour NCWF6 forecast with 15-min time steps. Until those issues are addressed, there is a need for an algorithm that interpolates between these temporally coarse long range forecasts. This paper describes a method of using low temporal resolution probabilistic weather forecasts in a DST. The beginning of the paper is a description of some convective weather forecast and observation products, followed by an example of how weather data are used by a DST.
The subsequent sections describe probabilistic forecasts, followed by a description of a method to use low temporal resolution probabilistic weather forecasts by assigning a relevance value to these data outside of their valid times.
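The notion of a relevance value for forecast data outside its valid time can be sketched as a simple weighting function. The linear ramp and its half-hour width below are illustrative assumptions (the paper does not specify this form here); the point is that a trajectory time-step falling just outside a forecast's valid window still gets partial weight rather than none.

```python
# Hedged sketch of a "time relevance" weighting for coarse forecasts:
# weight 1.0 inside the forecast's valid window, decaying linearly to 0
# outside it. The linear ramp and 30-min width are illustrative choices,
# not the paper's specification.

def time_relevance(t_min: float, valid_start_min: float,
                   valid_end_min: float, ramp_min: float = 30.0) -> float:
    """Relevance of a forecast, valid on [valid_start_min, valid_end_min],
    to a trajectory time-step at t_min (all times in minutes)."""
    if valid_start_min <= t_min <= valid_end_min:
        return 1.0
    gap = min(abs(t_min - valid_start_min), abs(t_min - valid_end_min))
    return max(0.0, 1.0 - gap / ramp_min)
```

A DST could multiply a forecast's convection probability by this weight when checking a trajectory time-step, so temporally coarse forecasts inform routing without being treated as exact at every instant.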
A national-scale seasonal hydrological forecast system: development and evaluation over Britain
NASA Astrophysics Data System (ADS)
Bell, Victoria A.; Davies, Helen N.; Kay, Alison L.; Brookshaw, Anca; Scaife, Adam A.
2017-09-01
Skilful winter seasonal predictions for the North Atlantic circulation and northern Europe have now been demonstrated, and the potential for seasonal hydrological forecasting in the UK is now being explored. One of the techniques being used combines seasonal rainfall forecasts provided by operational weather forecast systems with hydrological modelling tools to provide estimates of seasonal mean river flows up to a few months ahead. The work presented here shows how spatial information contained in a distributed hydrological model typically requiring high-resolution (daily or better) rainfall data can be used to provide an initial condition for a much simpler forecast model tailored to use low-resolution monthly rainfall forecasts. Rainfall forecasts (hindcasts) from the GloSea5 model (1996 to 2009) are used to provide the first assessment of skill in these national-scale flow forecasts. The skill in the combined modelling system is assessed for different seasons and regions of Britain, and compared to what might be achieved using other approaches, such as use of an ensemble of historical rainfall in a hydrological model, or a simple flow persistence forecast. The analysis indicates that only limited forecast skill is achievable for spring and summer seasonal hydrological forecasts; however, autumn and winter flows can be reasonably well forecast using (ensemble mean) rainfall forecasts based on either GloSea5 forecasts or historical rainfall (the preferred type of forecast depends on the region). Flow forecasts using ensemble mean GloSea5 rainfall perform most consistently well across Britain, and provide the most skilful forecasts overall at the 3-month lead time. Much of the skill (64%) in the 1-month ahead seasonal flow forecasts can be attributed to the hydrological initial condition (particularly in regions with a significant groundwater contribution to flows), whereas for the 3-month lead time, GloSea5 forecasts account for ~70% of the forecast skill (mostly in areas of high rainfall to the north and west) and only 30% of the skill arises from hydrological memory (typically groundwater-dominated areas). Given the high spatial heterogeneity in typical patterns of UK rainfall and evaporation, future development of skilful spatially distributed seasonal forecasts could lead to substantial improvements in seasonal flow forecast capability, potentially benefitting practitioners interested in predicting hydrological extremes, not only in the UK but also across Europe.
Congestion control strategy on complex network with privilege traffic
NASA Astrophysics Data System (ADS)
Li, Shi-Bao; He, Ya; Liu, Jian-Hang; Zhang, Zhi-Gang; Huang, Jun-Wei
The congestion control of traffic is one of the most important topics in the study of complex networks. Previous congestion algorithms assume that all network traffic has the same priority, and the privilege of traffic is ignored. In this paper, a privilege and common traffic congestion control routing strategy (PCR) based on the different priorities of traffic is proposed, which can be adapted to cope with different traffic congestion situations. We introduce the concept of privilege traffic into traffic dynamics for the first time and construct a new traffic model that takes requirements with different priorities into account. Besides, a new factor Ui is introduced through theoretical derivation to characterize the interaction between the routing selections of different kinds of traffic; furthermore, Ui is related to the network throughput. Since the joint optimization among different kinds of traffic is accomplished by PCR, the maximum value of Ui can be significantly reduced and the network performance improved noticeably. The simulation results indicate that the network throughput with PCR outperforms that of the other strategies. Moreover, the network capacity is improved by at least 25%. Additionally, the network throughput is also influenced by the amount of privilege traffic and the traffic priority.
Traffic congestion and reliability : trends and advanced strategies for congestion mitigation.
DOT National Transportation Integrated Search
2005-09-01
The report Traffic Congestion and Reliability: Trends and Advanced Strategies for : Congestion Mitigation provides a snapshot of congestion in the United States by : summarizing recent trends in congestion, highlighting the role of travel time : reli...
Bayesian flood forecasting methods: A review
NASA Astrophysics Data System (ADS)
Han, Shasha; Coulibaly, Paulin
2017-08-01
Over the past few decades, floods have been seen as one of the most common and widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, then their negative impacts could be greatly minimized. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretic framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied in flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way for flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. ensemble Bayesian forecasting system, Bayesian multi-model combination) were shown to overcome limitations of a single model or fixed model weights and effectively reduce predictive uncertainty.
In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvements. Future research in the context of Bayesian flood forecasting should be on assimilation of various sources of newly available information and improvement of predictive performance assessment methods.
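The defining feature of the Bayesian approaches reviewed above is that they return a predictive distribution rather than a single number. A minimal conjugate-normal sketch conveys the mechanics (this is a textbook update, not the full BFS; the function name and the interpretation of the inputs are illustrative assumptions): a prior belief about the flood stage is combined with a noisy source of information, and both the mean and the uncertainty are updated.

```python
# Minimal conjugate-normal sketch of Bayesian updating (not the full BFS):
# combine a prior N(prior_mean, prior_var) on a quantity such as river
# stage with one noisy observation or model estimate of variance obs_var.
# Names and usage are illustrative assumptions.

def normal_update(prior_mean: float, prior_var: float,
                  obs: float, obs_var: float) -> tuple[float, float]:
    """Posterior mean and variance after a normal-likelihood update.
    Precision (1/variance) adds; the mean is precision-weighted."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var
```

The posterior variance is always smaller than either input variance, which is the sense in which assimilating new information "reduces predictive uncertainty" in the review's terms.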
NASA Astrophysics Data System (ADS)
Bao, Hongjun; Zhao, Linna
2012-02-01
A coupled atmospheric-hydrologic-hydraulic ensemble flood forecasting model, driven by The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) data, has been developed for flood forecasting over the Huaihe River. The incorporation of numerical weather prediction (NWP) information into flood forecasting systems may increase forecast lead time from a few hours to a few days. A single NWP model forecast from a single forecast center, however, is insufficient, as it involves considerable non-predictable uncertainties and leads to a high number of false alarms. The availability of global ensemble NWP systems through TIGGE offers a new opportunity for flood forecasting. The Xinanjiang model used for hydrological rainfall-runoff modeling and the one-dimensional unsteady flow model applied to channel flood routing are coupled with ensemble weather predictions based on the TIGGE data from the Canadian Meteorological Centre (CMC), the European Centre for Medium-Range Weather Forecasts (ECMWF), the UK Met Office (UKMO), and the US National Centers for Environmental Prediction (NCEP). The developed ensemble flood forecasting model is applied to flood forecasting of the 2007 flood season as a test case. The test case is chosen over the upper reaches of the Huaihe River above Lutaizi station with flood diversion and retarding areas. The input flood discharge hydrograph from the main channel to the flood diversion area is estimated with a fixed split ratio of the main channel discharge. The flood flow inside the flood retarding area is calculated as a reservoir with the water balance method. The Muskingum method is used for flood routing in the flood diversion area. A probabilistic discharge and flood inundation forecast is provided as the end product to study the potential benefits of using the TIGGE ensemble forecasts.
The results demonstrate satisfactory flood forecasting with clear signals of probability of floods up to a few days in advance, and show that TIGGE ensemble forecast data are a promising tool for forecasting of flood inundation, comparable with that driven by raingauge observations.
Price of anarchy is maximized at the percolation threshold.
Skinner, Brian
2015-05-01
When many independent users try to route traffic through a network, the flow can easily become suboptimal as a consequence of congestion of the most efficient paths. The degree of this suboptimality is quantified by the so-called price of anarchy (POA), but so far there are no general rules for when to expect a large POA in a random network. Here I address this question by introducing a simple model of flow through a network with randomly placed congestible and incongestible links. I show that the POA is maximized precisely when the fraction of congestible links matches the percolation threshold of the lattice. Both the POA and the total cost demonstrate critical scaling near the percolation threshold.
A new scoring method for evaluating the performance of earthquake forecasts and predictions
NASA Astrophysics Data System (ADS)
Zhuang, J.
2009-12-01
This study presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecast scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. A fair scoring scheme should reward success in a way that is compatible with the risk taken. Suppose that we have a reference model, usually the Poisson model for ordinary cases or the Omori-Utsu formula for forecasting aftershocks, which gives the probability p0 that at least one event occurs in a given space-time-magnitude window. The forecaster, like a gambler, starts with a certain number of reputation points and bets 1 reputation point on "Yes" or "No" according to his forecast, or bets nothing if he makes no prediction. If the forecaster bets 1 reputation point on "Yes" and loses, his reputation is reduced by 1 point; if his forecast is successful, he is rewarded (1-p0)/p0 reputation points. The quantity (1-p0)/p0 is the return (reward/bet) ratio for bets on "Yes". In this way, if the reference model is correct, the expected return from the bet is 0. This rule also applies to probability forecasts: if p is the occurrence probability given by the forecaster, we can regard the forecaster as splitting 1 reputation point by betting p on "Yes" and 1-p on "No", so that the forecaster's expected pay-off under the reference model is still 0. From the viewpoints of both the reference model and the forecaster, the rule for reward and punishment is fair. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest.
We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and when the reference model is the Poisson model.
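The betting rule described above can be sketched in a few lines. The mirrored return ratio p0/(1-p0) for bets on "No" is an assumption implied by fairness but not stated explicitly in the abstract:

```python
def gambling_score(p0, outcome, bet_yes=True):
    """Pay-off for one 1-point binary bet against a reference model.

    p0      : reference-model probability that an event occurs
    outcome : True if an event actually occurred
    bet_yes : True if the forecaster bet on "Yes"
    """
    if bet_yes:
        # reward (1 - p0)/p0 on success, lose the 1-point stake otherwise
        return (1 - p0) / p0 if outcome else -1.0
    # assumed symmetric rule for "No": return ratio p0/(1 - p0)
    return p0 / (1 - p0) if not outcome else -1.0

def probabilistic_score(p0, p, outcome):
    """Probability forecast p: split 1 point, p on "Yes", (1 - p) on "No"."""
    return (p * gambling_score(p0, outcome, True)
            + (1 - p) * gambling_score(p0, outcome, False))

# If the reference model is correct, the expected pay-off is zero.
p0 = 0.2
ev = p0 * gambling_score(p0, True) + (1 - p0) * gambling_score(p0, False)
```

Checking the expectation under the reference model confirms the fairness property the abstract describes.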
Improving of local ozone forecasting by integrated models.
Gradišar, Dejan; Grašič, Boštjan; Božnar, Marija Zlata; Mlakar, Primož; Kocijan, Juš
2016-09-01
This paper discusses the problem of forecasting maximum ozone concentrations in urban microlocations, where reliable alerting of the local population when thresholds are surpassed is necessary. To improve the forecast, a methodology of integrated models is proposed. The model is based on multilayer perceptron neural networks that use as inputs all available information from the QualeAria air-quality model, the WRF numerical weather prediction model and on-site measurements of meteorology and air pollution. While air-quality and meteorological models cover a large geographical 3-dimensional space, their local resolution is often not satisfactory. On the other hand, empirical methods have the advantage of good local forecasts. In this paper, integrated models are used for improved 1-day-ahead forecasting of the maximum hourly ozone value within each day for representative locations in Slovenia. The WRF meteorological model is used for forecasting meteorological variables and the QualeAria air-quality model for gas concentrations. Their predictions, together with measurements from ground stations, are used as inputs to a neural network. The model validation results show that integrated models noticeably improve ozone forecasts and provide better alert systems.
NASA Astrophysics Data System (ADS)
Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K.
2015-08-01
Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and the benefits gained through hydropower generation. Improving hourly reservoir inflow forecasts over a 24 h lead time is considered within the day-ahead (Elspot) market of the Nordic exchange market. A complementary modelling framework offers an approach for improving real-time forecasting without modifying the pre-existing forecasting model, by instead formulating an independent additive, or complementary, model that captures the structure the existing operational model may be missing. We present here the application of this principle for issuing improved hourly inflow forecasts into hydropower reservoirs over extended lead times, with the parameter estimation procedure reformulated to deal with bias, persistence and heteroscedasticity. The procedure comprises an error model added on top of an unalterable constant-parameter conceptual model, and is applied in the 207 km2 Krinsvatn catchment in central Norway. The structure of the error model is established from attributes of the residual time series of the conceptual model. Besides improving the forecast skill of the operational model, the approach estimates the uncertainty in the complementary model structure and produces probabilistic inflow forecasts that carry information suitable for reducing uncertainty in decision-making in hydropower systems operation. Deterministic and probabilistic evaluations revealed an overall significant improvement in forecast accuracy for lead times up to 17 h. Evaluation of the percentage of observations bracketed by the forecasted 95 % confidence interval indicated that the degree of success in containing 95 % of the observations varies across seasons and hydrologic years.
Evaluation and Applications of the Prediction of Intensity Model Error (PRIME) Model
NASA Astrophysics Data System (ADS)
Bhatia, K. T.; Nolan, D. S.; Demaria, M.; Schumacher, A.
2015-12-01
Forecasters and end users of tropical cyclone (TC) intensity forecasts would greatly benefit from a reliable expectation of model error to counteract the lack of consistency in TC intensity forecast performance. As a first step towards producing error predictions to accompany each TC intensity forecast, Bhatia and Nolan (2013) studied the relationship between synoptic parameters, TC attributes, and forecast errors. In this study, we build on previous results of Bhatia and Nolan (2013) by testing the ability of the Prediction of Intensity Model Error (PRIME) model to forecast the absolute error and bias of four leading intensity models available for guidance in the Atlantic basin. PRIME forecasts are independently evaluated at each 12-hour interval from 12 to 120 hours during the 2007-2014 Atlantic hurricane seasons. The absolute error and bias predictions of PRIME are compared to their respective climatologies to determine their skill. In addition to these results, we will present the performance of the operational version of PRIME run during the 2015 hurricane season. PRIME verification results show that it can reliably anticipate situations where particular models excel, and therefore could lead to a more informed protocol for hurricane evacuations and storm preparations. These positive conclusions suggest that PRIME forecasts also have the potential to lower the error in the original intensity forecasts of each model. As a result, two techniques are proposed to develop a post-processing procedure for a multimodel ensemble based on PRIME. The first approach is to inverse-weight models using PRIME absolute error predictions (higher predicted absolute error corresponds to lower weights). The second multimodel ensemble applies PRIME bias predictions to each model's intensity forecast and the mean of the corrected models is evaluated. 
The forecasts of both of these experimental ensembles are compared to those of the equal-weight ICON ensemble, which currently provides the most reliable forecasts in the Atlantic basin.
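The first proposed ensemble technique, inverse weighting by predicted absolute error, is simple enough to sketch. The intensity forecasts and PRIME-style error predictions below are hypothetical values for illustration only:

```python
def inverse_error_weights(pred_abs_errors):
    """Weight each member model inversely to its predicted absolute error:
    a higher predicted error yields a lower weight."""
    inv = [1.0 / e for e in pred_abs_errors]
    total = sum(inv)
    return [w / total for w in inv]

def weighted_ensemble(forecasts, pred_abs_errors):
    """Weighted mean of the member forecasts."""
    w = inverse_error_weights(pred_abs_errors)
    return sum(wi * fi for wi, fi in zip(w, forecasts))

# Hypothetical intensity forecasts (kt) from four models and predicted
# absolute errors for each; the best-trusted model dominates the mean.
blend = weighted_ensemble([85, 90, 95, 100], [5, 10, 10, 20])
```

The second technique, bias correction followed by an equal-weight mean, would simply subtract each model's predicted bias before averaging.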
Liu, Dong-jun; Li, Li
2015-01-01
PM2.5 is the main influencing factor of haze-fog pollution in China. In this study, the trend of PM2.5 concentration was analyzed based on mathematical models and simulation. A comprehensive forecasting model (CFM) was developed based on combination-forecasting ideas: the Autoregressive Integrated Moving Average (ARIMA) model, Artificial Neural Networks (ANNs) and the Exponential Smoothing Method (ESM) were used to predict the time series of PM2.5 concentration, and their results were combined using weights from the Entropy Weighting Method. The trend of PM2.5 concentration in Guangzhou, China was quantitatively forecasted with the comprehensive forecasting model, the results were compared with those of the three single models, and PM2.5 concentration values for the next ten days were predicted. The comprehensive forecasting model balanced the deviations of the single prediction methods and had better applicability, offering a new prediction method for the air-quality forecasting field. PMID:26110332
Stochastic Model of Seasonal Runoff Forecasts
NASA Astrophysics Data System (ADS)
Krzysztofowicz, Roman; Watada, Leslie M.
1986-03-01
Each year the National Weather Service and the Soil Conservation Service issue a monthly sequence of five (or six) categorical forecasts of the seasonal snowmelt runoff volume. To describe uncertainties in these forecasts for the purposes of optimal decision making, a stochastic model is formulated. It is a discrete-time, finite, continuous-space, nonstationary Markov process. Posterior densities of the actual runoff conditional upon a forecast, and transition densities of forecasts are obtained from a Bayesian information processor. Parametric densities are derived for the process with a normal prior density of the runoff and a linear model of the forecast error. The structure of the model and the estimation procedure are motivated by analyses of forecast records from five stations in the Snake River basin, from the period 1971-1983. The advantages of supplementing the current forecasting scheme with a Bayesian analysis are discussed.
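With a normal prior on the runoff and a linear model of the forecast error, the posterior density produced by the Bayesian information processor is again normal, and the conjugate update can be written directly. The symbols and parameter values here are illustrative, not taken from the paper:

```python
def posterior_runoff(mu, sigma2, a, b, tau2, f):
    """Posterior N(mean, var) of the actual runoff w given a forecast f.

    Assumes a prior w ~ N(mu, sigma2) and a linear forecast-error model
    f = a + b*w + eps with eps ~ N(0, tau2); the forecast then acts as a
    noisy linear observation of w, giving the standard conjugate update.
    """
    prec = 1.0 / sigma2 + b * b / tau2            # posterior precision
    mean = (mu / sigma2 + b * (f - a) / tau2) / prec
    return mean, 1.0 / prec

# Illustrative numbers: prior runoff N(100, 400), unbiased forecast
# (a = 0, b = 1) with error variance 100, observed forecast f = 120.
m, v = posterior_runoff(100.0, 400.0, 0.0, 1.0, 100.0, 120.0)
```

The posterior mean lies between the prior mean and the forecast, weighted by their precisions, and the posterior variance is always smaller than the prior variance.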
Time-Series Forecast Modeling on High-Bandwidth Network Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Wucherl; Sim, Alex
2016-06-24
With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and scheduling of data movements on high-bandwidth networks to accommodate ever-increasing data volumes for large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with the traditional approach, such as the Box-Jenkins methodology, to train the ARIMA model, our forecast model reduces computation time by up to 92.6%. It also shows resilience against abrupt network usage changes. Finally, our forecast model conducts a large number of multi-step forecasts, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.
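As a rough, dependency-free sketch of the STL + ARIMA idea, with per-phase seasonal means standing in for the Loess decomposition and an AR(1) remainder standing in for the full ARIMA model:

```python
def seasonal_ar1_forecast(series, period, steps):
    """Toy stand-in for an STL + ARIMA pipeline: remove a seasonal cycle
    via per-phase means, fit AR(1) to the remainder by lag-1 least
    squares, forecast the remainder, then add the cycle back."""
    n = len(series)
    seasonal = [0.0] * period
    for ph in range(period):
        vals = [series[i] for i in range(ph, n, period)]
        seasonal[ph] = sum(vals) / len(vals)
    resid = [series[i] - seasonal[i % period] for i in range(n)]
    # AR(1) coefficient from (resid[t-1], resid[t]) pairs; the "or 1.0"
    # guards against a zero denominator when the residuals vanish.
    num = sum(resid[t - 1] * resid[t] for t in range(1, n))
    den = sum(r * r for r in resid[:-1]) or 1.0
    phi = num / den
    out, r = [], resid[-1]
    for h in range(1, steps + 1):
        r *= phi                                   # AR(1) decay
        out.append(seasonal[(n - 1 + h) % period] + r)
    return out
```

On a perfectly periodic series the remainder is zero and the forecast simply continues the seasonal cycle; on real SNMP traffic the AR(1) term carries recent deviations forward.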
Model Error Estimation for the CPTEC Eta Model
NASA Technical Reports Server (NTRS)
Tippett, Michael K.; daSilva, Arlindo
1999-01-01
Statistical data assimilation systems require the specification of forecast and observation error statistics. Forecast error is due to model imperfections and differences between the initial condition and the actual state of the atmosphere. Practical four-dimensional variational (4D-Var) methods try to fit the forecast state to the observations and assume that the model error is negligible. Here, with a number of simplifying assumptions, a framework is developed for isolating the model error given the forecast error at two lead times. Two definitions are proposed for the Talagrand ratio tau, the fraction of the forecast error due to model error rather than initial condition error. Data from the CPTEC Eta Model running operationally over South America are used to calculate forecast error statistics and lower bounds for tau.
Research on Nonlinear Time Series Forecasting of Time-Delay NN Embedded with Bayesian Regularization
NASA Astrophysics Data System (ADS)
Jiang, Weijin; Xu, Yusheng; Xu, Yuhui; Wang, Jianmin
Based on the idea of nonlinear prediction by phase-space reconstruction, this paper presents a time-delay BP neural network model whose generalization capability is improved by Bayesian regularization. The model is applied to forecast import and export trade in one industry. The results show that the improved model has excellent generalization capability: it not only learned the historical curve but also efficiently predicted the trend of business. Compared with common forecast evaluations, we conclude that nonlinear forecasting can not only focus on data combination and precision improvement, it can also vividly reflect the nonlinear characteristics of the forecast system. While analyzing the forecasting precision of the model, we give a model judgment by calculating the nonlinear characteristic values of the combined series and the original series, proving that the forecasting model can reasonably capture the dynamic characteristics of the nonlinear system that produced the original series.
A scoping review of malaria forecasting: past work and future directions
Zinszer, Kate; Verma, Aman D; Charland, Katia; Brewer, Timothy F; Brownstein, John S; Sun, Zhuoyu; Buckeridge, David L
2012-01-01
Objectives: There is a growing body of literature on malaria forecasting methods; the objective of our review is to identify and assess methods, including predictors, used to forecast malaria. Design: Scoping review. Two independent reviewers searched information sources, assessed studies for inclusion and extracted data from each study. Information sources: Search strategies were developed and the following databases were searched: CAB Abstracts, EMBASE, Global Health, MEDLINE, ProQuest Dissertations & Theses and Web of Science. Key journals and websites were also manually searched. Eligibility criteria: We included studies that forecasted incidence, prevalence or epidemics of malaria over time. A description of the forecasting model and an assessment of the forecast accuracy of the model were requirements for inclusion. Studies were restricted to human populations and to autochthonous transmission settings. Results: We identified 29 studies that met our inclusion criteria. The forecasting approaches included statistical modelling, mathematical modelling and machine learning methods. Climate-related predictors were used consistently in forecasting models, the most common being rainfall, relative humidity, temperature and the normalised difference vegetation index. Model evaluation was typically based on a reserved portion of data, and accuracy was measured in a variety of ways, including mean squared error and correlation coefficients. We could not compare the forecast accuracy of models across the different studies because the evaluation measures differed.
Conclusions: Applying different forecasting methods to the same data, exploring the predictive ability of non-environmental variables (including transmission-reducing interventions) and using common forecast accuracy measures would allow malaria researchers to compare and improve models and methods, which should improve the quality of malaria forecasting. PMID:23180505
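The two accuracy measures most often reported in the reviewed studies, mean squared error and the correlation coefficient, are straightforward to compute and would serve as the kind of common yardstick the review calls for:

```python
import math

def mse(obs, pred):
    """Mean squared error between observed and predicted series."""
    return sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)

def pearson_r(obs, pred):
    """Pearson correlation coefficient between the two series."""
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    return cov / (so * sp)
```

Both measures are typically evaluated on a reserved (out-of-sample) portion of the data, as the review notes.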
NASA Technical Reports Server (NTRS)
Molthan, Andrew; Case, Jonathan; Venner, Jason; Moreno-Madrinan, Max J.; Delgado, Francisco
2012-01-01
Two projects at NASA Marshall Space Flight Center have collaborated to develop a high resolution weather forecast model for Mesoamerica: the NASA Short-term Prediction Research and Transition (SPoRT) Center, which integrates unique NASA satellite and weather forecast modeling capabilities into the operational weather forecasting community, and NASA's SERVIR Program, which integrates satellite observations, ground-based data, and forecast models to improve disaster response in Central America, the Caribbean, Africa, and the Himalayas.
A study for systematic errors of the GLA forecast model in tropical regions
NASA Technical Reports Server (NTRS)
Chen, Tsing-Chang; Baker, Wayman E.; Pfaendtner, James; Corrigan, Martin
1988-01-01
From the sensitivity studies performed with the Goddard Laboratory for Atmospheres (GLA) analysis/forecast system, it was revealed that the forecast errors in the tropics affect the ability to forecast midlatitude weather in some cases. Apparently, the forecast errors occurring in the tropics can propagate to midlatitudes. Therefore, the systematic error analysis of the GLA forecast system becomes a necessary step in improving the model's forecast performance. The major effort of this study is to examine the possible impact of the hydrological-cycle forecast error on dynamical fields in the GLA forecast system.
An Integrated Enrollment Forecast Model. IR Applications, Volume 15, January 18, 2008
ERIC Educational Resources Information Center
Chen, Chau-Kuang
2008-01-01
Enrollment forecasting is the central component of effective budget and program planning. The integrated enrollment forecast model is developed to achieve a better understanding of the variables affecting student enrollment and, ultimately, to perform accurate forecasts. The transfer function model of the autoregressive integrated moving average…
Can we use Earth Observations to improve monthly water level forecasts?
NASA Astrophysics Data System (ADS)
Slater, L. J.; Villarini, G.
2017-12-01
Dynamical-statistical hydrologic forecasting approaches benefit from different strengths in comparison with traditional hydrologic forecasting systems: they are computationally efficient, can integrate and 'learn' from a broad selection of input data (e.g. General Circulation Model (GCM) forecasts, Earth Observation time series, teleconnection patterns), and can take advantage of recent progress in machine learning (e.g. multi-model blending, post-processing and ensembling techniques). Recent efforts to develop a dynamical-statistical ensemble approach for forecasting seasonal streamflow using both GCM forecasts and changing land cover have shown promising results over the U.S. Midwest. Here, we use climate forecasts from several GCMs of the North American Multi Model Ensemble (NMME) alongside 15-minute stage time series from the National River Flow Archive (NRFA) and land cover classes extracted from the European Space Agency's Climate Change Initiative 300 m annual Global Land Cover time series. With these data, we conduct systematic long-range probabilistic forecasting of monthly water levels in UK catchments over timescales ranging from one to twelve months ahead. We evaluate the improvement in model fit and model forecasting skill that comes from using land cover classes as predictors in the models. This work opens up new possibilities for combining Earth Observation time series with GCM forecasts to predict a variety of hazards from space using data science techniques.
NASA Astrophysics Data System (ADS)
Schepen, Andrew; Zhao, Tongtiegang; Wang, Quan J.; Robertson, David E.
2018-03-01
Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
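The Schaake Shuffle step can be sketched as follows: ensemble members for one variable and lead time are reordered so that their ranks follow those of an equally sized historical sample; using the same historical dates at every lead time links the daily ensemble members across lead times. This is a generic sketch of the published algorithm, not the paper's code:

```python
def schaake_shuffle(ensemble, historical):
    """Reorder ensemble members (one variable, one lead time) so their
    rank order matches that of an equally sized historical sample.
    Applying this with the same historical dates at every lead time
    restores realistic dependence across lead times."""
    n = len(ensemble)
    sorted_fc = sorted(ensemble)
    # rank of each historical value (0 = smallest)
    order = sorted(range(n), key=lambda i: historical[i])
    ranks = [0] * n
    for r, i in enumerate(order):
        ranks[i] = r
    # place the r-th smallest forecast where the historical sample
    # had its r-th smallest value
    return [sorted_fc[ranks[i]] for i in range(n)]
```

The member values are unchanged, only their ordering: the marginal calibrated distribution is preserved while the temporal structure is borrowed from observations.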
A stochastic post-processing method for solar irradiance forecasts derived from NWPs models
NASA Astrophysics Data System (ADS)
Lara-Fanego, V.; Pozo-Vazquez, D.; Ruiz-Arias, J. A.; Santos-Alamillos, F. J.; Tovar-Pescador, J.
2010-09-01
Solar irradiance forecasting is an important area of research for the future of solar-based renewable energy systems. Numerical Weather Prediction models (NWPs) have proved to be a valuable tool for solar irradiance forecasting with lead times up to a few days. Nevertheless, these models show low skill in forecasting solar irradiance under cloudy conditions. Additionally, climatic (seasonally averaged) aerosol loadings are usually assumed in these models, leading to considerable errors in Direct Normal Irradiance (DNI) forecasts during high aerosol load conditions. In this work we propose a post-processing method for Global Horizontal Irradiance (GHI) and DNI forecasts derived from NWPs. Particularly, the method is based on Autoregressive Moving Average models with External Explanatory Variables (ARMAX). These models are applied to the residuals of the NWP forecasts and use as external variables the measured cloud fraction and aerosol loading of the day previous to the forecast. The method is evaluated on a set of one-month-long, three-day-ahead forecasts of GHI and DNI, obtained with the WRF mesoscale atmospheric model, for several locations in Andalusia (southern Spain). The cloud fraction is derived from MSG satellite estimates and the aerosol loading from MODIS platform estimates; both sources of information are readily available at the time of the forecast. Results show a considerable improvement in the forecasting skill of the WRF model using the proposed post-processing method. In particular, the relative improvement (in terms of RMSE) for DNI during summer is about 20%; a similar value is obtained for GHI during winter.
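A minimal stand-in for the ARMAX residual model, reduced here to an ARX(1) with a single exogenous regressor fitted by ordinary least squares; the method and variable set in the paper are richer:

```python
def lstsq(X, y):
    """Ordinary least squares via normal equations and Gaussian
    elimination with partial pivoting (small, well-conditioned systems)."""
    k = len(X[0])
    A = [[sum(row[a] * row[b] for row in X) for b in range(k)] for a in range(k)]
    v = [sum(row[a] * yi for row, yi in zip(X, y)) for a in range(k)]
    for c in range(k):
        piv = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        v[c], v[piv] = v[piv], v[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for cc in range(c, k):
                A[r][cc] -= f * A[c][cc]
            v[r] -= f * v[c]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (v[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

def armax_residual_model(resid, exog):
    """Fit resid[t] = c + a*resid[t-1] + b*exog[t]: an NWP-residual model
    with one lag and one exogenous variable (e.g. previous-day cloud
    fraction), returning the coefficients (c, a, b)."""
    X = [[1.0, resid[t - 1], exog[t]] for t in range(1, len(resid))]
    y = resid[1:]
    return lstsq(X, y)
```

Once fitted, the corrected forecast is the NWP value plus the predicted residual for the target day.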
A comparative analysis of errors in long-term econometric forecasts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tepel, R.
1986-04-01
The growing body of literature that documents forecast accuracy falls generally into two parts. The first is prescriptive and is carried out by modelers who use simulation analysis as a tool for model improvement. These studies are ex post, that is, they make use of known values for exogenous variables and generate an error measure wholly attributable to the model. The second type of analysis is descriptive and seeks to measure errors, identify patterns among errors and variables and compare forecasts from different sources. Most descriptive studies use an ex ante approach, that is, they evaluate model outputs based on estimated (or forecasted) exogenous variables. In this case, it is the forecasting process, rather than the model, that is under scrutiny. This paper uses an ex ante approach to measure errors in forecast series prepared by Data Resources Incorporated (DRI), Wharton Econometric Forecasting Associates (Wharton), and Chase Econometrics (Chase) and to determine if systematic patterns of errors can be discerned between services, types of variables (by degree of aggregation), length of forecast and time at which the forecast is made. Errors are measured as the percent difference between actual and forecasted values for the historical period of 1971 to 1983.
Forecasting daily patient volumes in the emergency department.
Jones, Spencer S; Thomas, Alun; Evans, R Scott; Welch, Shari J; Haug, Peter J; Snow, Gregory L
2008-02-01
Shifts in the supply of and demand for emergency department (ED) resources make the efficient allocation of ED resources increasingly important. Forecasting is a vital activity that guides decision-making in many areas of economic, industrial, and scientific planning, but has gained little traction in the health care industry. There are few studies that explore the use of forecasting methods to predict patient volumes in the ED. The goals of this study are to explore and evaluate the use of several statistical forecasting methods to predict daily ED patient volumes at three diverse hospital EDs and to compare the accuracy of these methods to the accuracy of a previously proposed forecasting method. Daily patient arrivals at three hospital EDs were collected for the period January 1, 2005, through March 31, 2007. The authors evaluated the use of seasonal autoregressive integrated moving average, time series regression, exponential smoothing, and artificial neural network models to forecast daily patient volumes at each facility. Forecasts were made for horizons ranging from 1 to 30 days in advance. The forecast accuracy achieved by the various forecasting methods was compared to the forecast accuracy achieved when using a benchmark forecasting method already available in the emergency medicine literature. All time series methods considered in this analysis provided improved in-sample model goodness of fit. However, post-sample analysis revealed that time series regression models that augment linear regression models by accounting for serial autocorrelation offered only small improvements in terms of post-sample forecast accuracy, relative to multiple linear regression models, while seasonal autoregressive integrated moving average, exponential smoothing, and artificial neural network forecasting models did not provide consistently accurate forecasts of daily ED volumes. 
This study confirms the widely held belief that daily demand for ED services is characterized by seasonal and weekly patterns. The authors compared several time series forecasting methods to a benchmark multiple linear regression model. The results suggest that the existing methodology proposed in the literature, multiple linear regression based on calendar variables, is a reasonable approach to forecasting daily patient volumes in the ED. However, the authors conclude that regression-based models that incorporate calendar variables, account for site-specific special-day effects, and allow for residual autocorrelation provide a more appropriate, informative, and consistently accurate approach to forecasting daily ED patient volumes.
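In its simplest form, a calendar-variable regression with only day-of-week dummies reduces to per-weekday mean volumes, which makes the weekly pattern the benchmark model exploits easy to see. This is a sketch of that reduced case, not the authors' model, which also handles special-day effects and residual autocorrelation:

```python
def weekday_model(volumes, start_weekday=0):
    """Fit the day-of-week-dummies-only calendar regression: the
    least-squares solution is simply the mean volume per weekday.
    start_weekday: weekday index (0-6) of the first observation."""
    buckets = [[] for _ in range(7)]
    for t, v in enumerate(volumes):
        buckets[(start_weekday + t) % 7].append(v)
    return [sum(b) / len(b) if b else None for b in buckets]

def forecast(model, start_weekday, t):
    """Forecast day t (0-indexed from the series start)."""
    return model[(start_weekday + t) % 7]
```

Site-specific special-day effects (holidays, local events) would enter as additional dummy columns in the full regression.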
New product forecasting with limited or no data
NASA Astrophysics Data System (ADS)
Ismai, Zuhaimy; Abu, Noratikah; Sufahani, Suliadi
2016-10-01
In the real world, forecasts are based on historical data, with the assumption that future behaviour will be the same. But how do we forecast when no such data are available? New products or new technologies normally have a limited amount of data available. Knowing that forecasting is valuable for decision making, this paper presents forecasting of new products or new technologies using aggregate diffusion models and a modified Bass model. A newly launched Proton car and its market penetration were chosen to demonstrate the possibility of forecasting sales demand where there is limited or no data available. The model was developed to forecast the diffusion of a new vehicle, or an innovation, in Malaysian society: it represents the level of spread of the new vehicle among a given population as a simple mathematical function of the time elapsed since the introduction of the new product, and forecasts the car sales volume. A procedure for the proposed diffusion model was designed and its parameters were estimated. Results obtained by applying the proposed diffusion model and numerical calculation show that the model is robust and effective for forecasting demand for the new vehicle. The results reveal that the newly developed modified Bass diffusion demand function contributes significantly to forecasting the diffusion of the new Proton car or other new products.
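The standard Bass diffusion function underlying such models can be sketched directly from its closed form; the coefficients p, q and market size m below are illustrative, not the fitted Proton values:

```python
import math

def bass_cdf(t, p, q):
    """Fraction of eventual adopters by time t (Bass, 1969):
    F(t) = (1 - exp(-(p+q)t)) / (1 + (q/p)*exp(-(p+q)t)),
    where p is the coefficient of innovation and q of imitation."""
    e = math.exp(-(p + q) * t)
    return (1 - e) / (1 + (q / p) * e)

def bass_sales(t, m, p, q, dt=1.0):
    """Adopters (e.g. cars sold) in the interval [t, t+dt] for an
    eventual market size m."""
    return m * (bass_cdf(t + dt, p, q) - bass_cdf(t, p, q))

# Illustrative parameters: p = 0.03, q = 0.4, market of 50,000 cars.
first_year = bass_sales(0.0, 50_000, 0.03, 0.4)
```

Because only p, q and m must be estimated, the curve can be fitted from analogy or early sales, which is what makes it usable when historical data are scarce.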
NASA Astrophysics Data System (ADS)
Engeland, Kolbjorn; Steinsland, Ingelin
2014-05-01
This study introduces a methodology for constructing probabilistic inflow forecasts for multiple catchments and lead times, and investigates criteria for evaluating multivariate forecasts. A post-processing approach is used, and a Gaussian model is applied to transformed variables. The post-processing model has two main components: the mean model and the dependency model. The mean model estimates the marginal distributions of forecasted inflow for each catchment and lead time, whereas the dependency model estimates the full multivariate distribution of forecasts, i.e. covariances between catchments and lead times. In operational situations, it is straightforward to use the models to sample inflow ensembles that inherit the dependencies between catchments and lead times. The methodology was tested and demonstrated on the river systems linked to the Ulla-Førre hydropower complex in southern Norway, where simultaneous probabilistic forecasts for five catchments and ten lead times were constructed. The methodology is flexible enough to utilize deterministic flow forecasts from a numerical hydrological model as well as statistical forecasts such as persistence forecasts and sliding-window climatology forecasts, and it accommodates variation in the relative weights of these forecasts with both catchment and lead time. When evaluating predictive performance in the original space using cross-validation, the case study found it important to include the persistence forecast for the initial lead times and the hydrological forecast for medium-term lead times; sliding-window climatology forecasts become more important for the longest lead times. Furthermore, operationally important features in this case study, such as heteroscedasticity and lead-time-varying dependencies between lead times and between catchments, are captured.
Two criteria were used to evaluate the added value of the dependency model. The first was the energy score (ES), a multi-dimensional generalization of the continuous ranked probability score (CRPS). ES was calculated for all lead times and catchments together, for each catchment across all lead times, and for each lead time across all catchments. The second criterion was to use CRPS for forecasted inflows accumulated over several lead times and catchments. The results showed that ES was not very sensitive to a correct covariance structure, whereas CRPS for accumulated flows was more suitable for evaluating the dependency model. This indicates that it is more appropriate to evaluate relevant univariate variables that depend on the dependency structure than to evaluate the multivariate forecast directly.
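A minimal sketch of the ensemble estimator of the energy score discussed above (which reduces to the CRPS estimator in one dimension); the data are synthetic and numpy is assumed:

```python
import numpy as np

def energy_score(ens, obs):
    """Ensemble estimator of the energy score for a d-dimensional forecast.

    ens: (M, d) array of ensemble members; obs: (d,) observation.
    Lower is better; for d = 1 this reduces to the CRPS ensemble estimator."""
    ens, obs = np.asarray(ens, float), np.asarray(obs, float)
    term1 = np.mean(np.linalg.norm(ens - obs, axis=1))
    pair = np.linalg.norm(ens[:, None, :] - ens[None, :, :], axis=2)
    return term1 - 0.5 * pair.mean()

# Synthetic check: a centred ensemble should score better than a biased one.
rng = np.random.default_rng(0)
obs = np.zeros(3)
good = rng.normal(0.0, 0.2, size=(50, 3))   # unbiased members
bad = rng.normal(1.0, 0.2, size=(50, 3))    # systematically biased members
es_good, es_bad = energy_score(good, obs), energy_score(bad, obs)
```

The study's observation that ES is insensitive to the covariance structure can be probed with exactly this kind of estimator by scoring ensembles that differ only in their correlation.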
NASA Astrophysics Data System (ADS)
Siegert, Stefan
2017-04-01
Initialised climate forecasts on seasonal time scales, run several months or even years ahead, are now an integral part of the battery of products offered by climate services world-wide. The availability of seasonal climate forecasts from various modeling centres gives rise to multi-model ensemble forecasts. Post-processing such seasonal-to-decadal multi-model forecasts is challenging 1) because the cross-correlation structure between multiple models and observations can be complicated, 2) because the amount of training data to fit the post-processing parameters is very limited, and 3) because the forecast skill of numerical models tends to be low on seasonal time scales. In this talk I will review new statistical post-processing frameworks for multi-model ensembles. I will focus particularly on Bayesian hierarchical modelling approaches, which are flexible enough to capture commonly made assumptions about collective and model-specific biases of multi-model ensembles. Despite the advances in statistical methodology, it turns out to be very difficult to out-perform the simplest post-processing method, which just recalibrates the multi-model ensemble mean by linear regression. I will discuss reasons for this, which are closely linked to the specific characteristics of seasonal multi-model forecasts. I explore possible directions for improvements, for example using informative priors on the post-processing parameters, and jointly modelling forecasts and observations.
Evaluating probabilistic dengue risk forecasts from a prototype early warning system for Brazil.
Lowe, Rachel; Coelho, Caio As; Barcellos, Christovam; Carvalho, Marilia Sá; Catão, Rafael De Castro; Coelho, Giovanini E; Ramalho, Walter Massa; Bailey, Trevor C; Stephenson, David B; Rodó, Xavier
2016-02-24
Recently, a prototype dengue early warning system was developed to produce probabilistic forecasts of dengue risk three months ahead of the 2014 World Cup in Brazil. Here, we evaluate the categorical dengue forecasts across all microregions in Brazil, using dengue cases reported in June 2014 to validate the model. We also compare the forecast model framework to a null model, based on seasonal averages of previously observed dengue incidence. When considering the ability of the two models to predict high dengue risk across Brazil, the forecast model produced more hits and fewer missed events than the null model, with a hit rate of 57% for the forecast model compared to 33% for the null model. This early warning model framework may be useful to public health services, not only ahead of mass gatherings, but also before the peak dengue season each year, to control potentially explosive dengue epidemics.
Omar, Hani; Hoang, Van Hai; Liu, Duen-Ren
2016-01-01
Enhancing sales and operations planning through forecasting analysis and business intelligence is demanded in many industries and enterprises. Publishing industries usually pick attractive titles and headlines for their stories to increase sales, since popular article titles and headlines can attract readers to buy magazines. In this paper, information retrieval techniques are adopted to extract words from article titles. The popularity measures of article titles are then analyzed by using the search indexes obtained from Google search engine. Backpropagation Neural Networks (BPNNs) have successfully been used to develop prediction models for sales forecasting. In this study, we propose a novel hybrid neural network model for sales forecasting based on the prediction result of time series forecasting and the popularity of article titles. The proposed model uses the historical sales data, popularity of article titles, and the prediction result of a time series, Autoregressive Integrated Moving Average (ARIMA) forecasting method to learn a BPNN-based forecasting model. Our proposed forecasting model is experimentally evaluated by comparing with conventional sales prediction techniques. The experimental result shows that our proposed forecasting method outperforms conventional techniques which do not consider the popularity of title words.
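The hybrid idea (feeding an ARIMA point forecast and a title-popularity index into a backpropagation network) can be sketched on synthetic data. The target blend, network size, and learning rate below are all assumptions, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins (not the paper's data): each sample has an ARIMA point
# forecast and a title-popularity index as inputs; "sales" is the target.
X = rng.uniform(0.0, 1.0, size=(200, 2))      # [arima_forecast, popularity]
y = 0.7 * X[:, [0]] + 0.3 * X[:, [1]]         # assumed noiseless blend

# One-hidden-layer backpropagation network (BPNN) trained by full-batch
# gradient descent on squared error.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.3
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)                  # forward pass
    out = h @ W2 + b2
    err = out - y                             # gradient of 0.5*(out - y)^2
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)        # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

pred = np.tanh(X @ W1 + b1) @ W2 + b2
mse = float(np.mean((pred - y) ** 2))
```

In the paper's framework the first input would come from a fitted ARIMA model and the second from Google search indexes for title words; here both are random placeholders to keep the sketch self-contained.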
Stationarity test with a direct test for heteroskedasticity in exchange rate forecasting models
NASA Astrophysics Data System (ADS)
Khin, Aye Aye; Chau, Wong Hong; Seong, Lim Chee; Bin, Raymond Ling Leh; Teng, Kevin Low Lock
2017-05-01
The global economy has been weakening in recent years, manifested by greater exchange rate volatility on the international commodity market. This study analyzes some prominent exchange rate forecasting models for Malaysian commodity trading: univariate ARIMA, ARCH, and GARCH models, in conjunction with a stationarity test and direct testing of heteroskedasticity in the residual diagnostics. All forecasting models utilized monthly data from 1990 to 2015, a total of 312 observations, used to forecast the exchange rate over both the short and the long term. The forecasting power statistics suggest that the ARIMA (1, 1, 1) model performs more efficiently than the ARCH (1) and GARCH (1, 1) models. For the ex-post forecast, the exchange rate increased from RM 3.50 per USD in January 2015 to RM 4.47 per USD in December 2015 based on the baseline data. For the short-term ex-ante forecast, the analysis indicates a decrease in the exchange rate by June 2016 (RM 4.27 per USD) compared with December 2015. A more appropriate exchange rate forecasting method is vital to aid decision making and planning for sustainable commodity production in the world economy.
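The GARCH(1,1) conditional-variance recursion that the study compares against ARIMA can be sketched as follows; the parameter values are illustrative, not the paper's estimates:

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance recursion of a GARCH(1,1) model:
    sigma2[t] = omega + alpha * returns[t-1]**2 + beta * sigma2[t-1].
    Initialized at the unconditional variance omega / (1 - alpha - beta)."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = omega / (1.0 - alpha - beta)
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

# With no shocks, the conditional variance decays geometrically (rate beta)
# towards omega / (1 - beta); parameter values here are placeholders.
s2 = garch11_variance(np.zeros(200), omega=0.1, alpha=0.1, beta=0.8)
```

The recursion makes explicit why GARCH captures volatility clustering: a large squared return raises next period's variance, which then decays only gradually.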
Mixture EMOS model for calibrating ensemble forecasts of wind speed.
Baran, S; Lerch, S
2016-03-01
Ensemble model output statistics (EMOS) is a statistical tool for post-processing forecast ensembles of weather variables obtained from multiple runs of numerical weather prediction models in order to produce calibrated predictive probability density functions. The EMOS predictive probability density function is given by a parametric distribution with parameters depending on the ensemble forecasts. We propose an EMOS model for calibrating wind speed forecasts based on weighted mixtures of truncated normal (TN) and log-normal (LN) distributions, where model parameters and component weights are estimated by optimizing the values of proper scoring rules over a rolling training period. The new model is tested on wind speed forecasts of the 50-member European Centre for Medium-Range Weather Forecasts ensemble, the 11-member Aire Limitée Adaptation dynamique Développement International-Hungary Ensemble Prediction System ensemble of the Hungarian Meteorological Service, and the eight-member University of Washington mesoscale ensemble, and its predictive performance is compared with that of various benchmark EMOS models based on single parametric families and combinations thereof. The results indicate improved calibration of probabilistic forecasts and improved accuracy of point forecasts in comparison with the raw ensemble and climatological forecasts. The mixture EMOS model significantly outperforms the TN and LN EMOS methods; moreover, it provides better calibrated forecasts than the TN-LN combination model and offers increased flexibility while avoiding covariate selection problems. © 2016 The Authors. Environmetrics published by John Wiley & Sons Ltd.
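A sketch of the TN-LN mixture density at the heart of the model; the parameter values are placeholders (in the paper they are fitted by optimizing proper scoring rules over a rolling window, a step omitted here):

```python
import math

SQRT2PI = math.sqrt(2.0 * math.pi)

def tn_pdf(x, mu, sigma):
    """Density of a normal(mu, sigma) truncated to [0, inf)."""
    if x < 0.0:
        return 0.0
    phi = math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * SQRT2PI)
    mass = 0.5 * (1.0 + math.erf(mu / (sigma * math.sqrt(2.0))))  # P(N >= 0)
    return phi / mass

def ln_pdf(x, mu, sigma):
    """Density of a log-normal with log-scale mu and shape sigma."""
    if x <= 0.0:
        return 0.0
    return math.exp(-0.5 * ((math.log(x) - mu) / sigma) ** 2) / (x * sigma * SQRT2PI)

def mixture_pdf(x, w, mu_tn, s_tn, mu_ln, s_ln):
    """Weighted TN-LN mixture predictive density, with weight w on the TN part."""
    return w * tn_pdf(x, mu_tn, s_tn) + (1.0 - w) * ln_pdf(x, mu_ln, s_ln)
```

Both components are supported on the non-negative half-line, which is what makes the mixture a natural predictive distribution for wind speed.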
NASA Astrophysics Data System (ADS)
Satti, S.; Zaitchik, B. F.; Siddiqui, S.; Badr, H. S.; Shukla, S.; Peters-Lidard, C. D.
2015-12-01
The unpredictable nature of precipitation within the East African (EA) region makes it one of the most vulnerable, food-insecure regions in the world. There is a vital need for forecasts to inform decision makers, both local and regional, and to help formulate the region's climate change adaptation strategies. Here, we present a suite of different seasonal forecast models, both statistical and dynamical, for the EA region. Objective regionalization is performed for EA on the basis of interannual variability in precipitation in both observations and models. This regionalization is applied as the basis for calculating a number of standard skill scores to evaluate each model's forecast accuracy. A dynamically linked Land Surface Model (LSM) is then applied to determine forecasted flows, which drive the Sudanese Hydroeconomic Optimization Model (SHOM). SHOM combines hydrologic, agronomic and economic inputs to determine the optimal decisions that maximize economic benefits along the Sudanese Blue Nile. This modeling sequence is designed to derive the potential added value of information of each forecasting model to agriculture and hydropower management. Each model's forecasting skill score, along with its added value of information, is analyzed in order to compare the performance of the forecasts. This research aims to improve understanding of how the accuracy, lead time, and uncertainty of seasonal forecasts influence their utility to the water resources decision makers who utilize them.
Potential predictability and forecast skill in ensemble climate forecast: the skill-persistence rule
NASA Astrophysics Data System (ADS)
Jin, Y.; Rong, X.; Liu, Z.
2017-12-01
This study investigates the factors that impact the forecast skill for the real world (actual skill) and perfect model (perfect skill) in ensemble climate model forecast with a series of fully coupled general circulation model forecast experiments. It is found that the actual skill of sea surface temperature (SST) in seasonal forecast is substantially higher than the perfect skill on a large part of the tropical oceans, especially the tropical Indian Ocean and the central-eastern Pacific Ocean. The higher actual skill is found to be related to the higher observational SST persistence, suggesting a skill-persistence rule: a higher SST persistence in the real world than in the model could overwhelm the model bias to produce a higher forecast skill for the real world than for the perfect model. The relation between forecast skill and persistence is further examined using a first-order autoregressive model (AR1) analytically for theoretical solutions and numerically for analogue experiments. The AR1 model study shows that the skill-persistence rule is strictly valid in the case of infinite ensemble size, but can be distorted by the sampling error and non-AR1 processes.
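The AR1 analogue used to study the skill-persistence rule can be sketched numerically. The coefficients below are illustrative, and "skill" is measured as the forecast-truth correlation, which for an AR1 process approaches a**lead:

```python
import numpy as np

def ar1_series(a, n, rng):
    """Simulate x_t = a * x_{t-1} + eps_t with unit-variance innovations."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = a * x[t - 1] + rng.normal()
    return x

def persistence_skill(series, lead):
    """Correlation between a persistence-based forecast and the truth at the
    given lead time (damping the forecast does not change the correlation)."""
    return float(np.corrcoef(series[:-lead], series[lead:])[0, 1])

rng = np.random.default_rng(1)
n, lead = 200_000, 3
a_real, a_model = 0.9, 0.7    # assumed: real world more persistent than model

actual_skill = persistence_skill(ar1_series(a_real, n, rng), lead)
perfect_skill = persistence_skill(ar1_series(a_model, n, rng), lead)
```

With these assumed coefficients the "real world" series is more predictable than the "model" series at every lead, illustrating how higher observed persistence can produce actual skill exceeding perfect-model skill.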
Short-Term Global Horizontal Irradiance Forecasting Based on Sky Imaging and Pattern Recognition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Brian S; Feng, Cong; Cui, Mingjian
Accurate short-term forecasting is crucial for solar integration in the power grid. In this paper, a classification forecasting framework based on pattern recognition is developed for 1-hour-ahead global horizontal irradiance (GHI) forecasting. Three sets of models in the forecasting framework are trained by the data partitioned from the preprocessing analysis. The first two sets of models forecast GHI for the first four daylight hours of each day. Then the GHI values in the remaining hours are forecasted by an optimal machine learning model determined based on a weather pattern classification model in the third model set. The weather pattern is determined by a support vector machine (SVM) classifier. The developed framework is validated by the GHI and sky imaging data from the National Renewable Energy Laboratory (NREL). Results show that the developed short-term forecasting framework outperforms the persistence benchmark by 16% in terms of the normalized mean absolute error and 25% in terms of the normalized root mean square error.
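The persistence benchmark and the two normalized error metrics used for validation can be sketched as follows; the GHI values and the normalization constant are synthetic assumptions:

```python
import numpy as np

def nmae(pred, obs, norm):
    """Normalized mean absolute error."""
    return float(np.mean(np.abs(pred - obs)) / norm)

def nrmse(pred, obs, norm):
    """Normalized root mean square error."""
    return float(np.sqrt(np.mean((pred - obs) ** 2)) / norm)

# Persistence benchmark: forecast next-hour GHI as the current hour's value.
ghi = np.array([100., 300., 520., 640., 610., 450., 220.])  # synthetic W/m^2
persist = ghi[:-1]   # forecast for hours 1..6
obs = ghi[1:]        # what was actually observed
norm = 1000.0        # assumed normalization constant (e.g. a reference irradiance)
```

A framework "outperforming persistence by 16% in nMAE" means its nMAE is 16% lower than the value this benchmark produces on the same observations.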
Improving wave forecasting by integrating ensemble modelling and machine learning
NASA Astrophysics Data System (ADS)
O'Donncha, F.; Zhang, Y.; James, S. C.
2017-12-01
Modern smart-grid networks use technologies to instantly relay information on supply and demand to support effective decision making. Integration of renewable-energy resources with these systems demands accurate forecasting of energy production (and demand) capacities. For wave-energy converters, this requires wave-condition forecasting to enable estimates of energy production. Current operational wave forecasting systems exhibit substantial errors with wave-height RMSEs of 40 to 60 cm being typical, which limits the reliability of energy-generation predictions thereby impeding integration with the distribution grid. In this study, we integrate physics-based models with statistical learning aggregation techniques that combine forecasts from multiple, independent models into a single "best-estimate" prediction of the true state. The Simulating Waves Nearshore physics-based model is used to compute wind- and currents-augmented waves in the Monterey Bay area. Ensembles are developed based on multiple simulations perturbing input data (wave characteristics supplied at the model boundaries and winds) to the model. A learning-aggregation technique uses past observations and past model forecasts to calculate a weight for each model. The aggregated forecasts are compared to observation data to quantify the performance of the model ensemble and aggregation techniques. The appropriately weighted ensemble model outperforms an individual ensemble member with regard to forecasting wave conditions.
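One simple form of the learning-aggregation step (weighting each ensemble member by its past accuracy) can be sketched as follows; the inverse-MSE weighting rule and the wave-height numbers are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

def aggregate_weights(past_forecasts, past_obs):
    """Weight each ensemble member inversely to its mean squared error on a
    training window: a simple stand-in for the learning-aggregation step."""
    mse = np.mean((past_forecasts - past_obs) ** 2, axis=1)
    w = 1.0 / (mse + 1e-12)
    return w / w.sum()

# Hypothetical wave-height record (m): member 0 tracks the observations,
# member 1 carries a systematic positive bias.
obs = np.array([1.0, 1.2, 0.9, 1.1])
fcs = np.array([[1.0, 1.1, 0.9, 1.2],
                [1.6, 1.8, 1.5, 1.7]])
w = aggregate_weights(fcs, obs)
combined = w @ fcs          # aggregated "best-estimate" forecast
```

Because weights are learned from past errors, a biased member is automatically down-weighted, which is the mechanism by which the weighted ensemble can beat any individual member.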
An Econometric Model for Forecasting Income and Employment in Hawaii.
ERIC Educational Resources Information Center
Chau, Laurence C.
This report presents the methodology for short-run forecasting of personal income and employment in Hawaii. The econometric model developed in the study is used to make actual forecasts through 1973 of income and employment, with major components forecasted separately. Several sets of forecasts are made, under different assumptions on external…
Pellicori, Pierpaolo; Cleland, John G F; Zhang, Jufen; Kallvikbacka-Bennett, Anna; Urbinati, Alessia; Shah, Parin; Kazmi, Syed; Clark, Andrew L
2016-12-01
Diuretics are the mainstay of treatment for congestion but concerns exist that they adversely affect prognosis. We explored whether the relationship between loop diuretic use and outcome is explained by the underlying severity of congestion amongst patients referred with suspected heart failure. Of 1190 patients, 712 had a left ventricular ejection fraction (LVEF) ≤50 %, 267 had LVEF >50 % with raised plasma NTproBNP (>400 ng/L) and 211 had LVEF >50 % with NTproBNP ≤400 ng/L; respectively, 72 %, 68 % and 37 % of these groups were treated with loop diuretics including 28 %, 29 % and 10 % in doses ≥80 mg furosemide equivalent/day. Compared to patients with cardiac dysfunction (either LVEF ≤50 % or NT-proBNP >400 ng/L) but not taking a loop diuretic, those taking a loop diuretic were older and had more clinical evidence of congestion, renal dysfunction, anaemia and hyponatraemia. During a median follow-up of 934 (IQR: 513-1425) days, 450 patients were hospitalized for HF or died. Patients prescribed loop diuretics had a worse outcome. However, in multi-variable models, clinical, echocardiographic (inferior vena cava diameter), and biochemical (NTproBNP) measures of congestion were strongly associated with an adverse outcome but not the use, or dose, of loop diuretics. Prescription of loop diuretics identifies patients with more advanced features of heart failure and congestion, which may account for their worse prognosis. Further research is needed to clarify the relationship between loop diuretic agents and outcome; imaging and biochemical measures of congestion might be better guides to diuretic dose than symptoms or clinical signs.
A Unified Data Assimilation Strategy for Regional Coupled Atmosphere-Ocean Prediction Systems
NASA Astrophysics Data System (ADS)
Xie, Lian; Liu, Bin; Zhang, Fuqing; Weng, Yonghui
2014-05-01
Improving tropical cyclone (TC) forecasts is a top priority in weather forecasting. Assimilating various observational data to produce better initial conditions for numerical models using advanced data assimilation techniques has been shown to benefit TC intensity forecasts, whereas assimilating the large-scale environmental circulation into regional models by spectral nudging or Scale-Selective Data Assimilation (SSDA) has been demonstrated to improve TC track forecasts. Meanwhile, accounting for various air-sea interaction processes with high-resolution coupled air-sea modelling systems has also been shown to improve TC intensity forecasts. Despite the advances in data assimilation and air-sea coupled models, large errors in TC intensity and track forecasting remain. For example, Hurricane Nate (2011) posed a considerable challenge for the TC operational forecasting community, with very large intensity forecast errors (27, 25, and 40 kts at 48, 72, and 96 h, respectively) for the official forecasts. Considering the slow-moving nature of Hurricane Nate, it is reasonable to hypothesize that air-sea interaction processes played a critical role in the intensity change of the storm, and that accurate representation of upper-ocean dynamics and thermodynamics is necessary to quantitatively describe those processes. Currently, data assimilation techniques are generally applied to hurricane forecasting only in stand-alone atmospheric or oceanic models; in fact, most regional hurricane forecasting models include data assimilation only for improving the initial condition of the atmospheric model. In such a situation, the benefit of adjustments in one model (atmospheric or oceanic) from assimilating observational data can be compromised by errors from the other model.
Thus, unified data assimilation techniques for coupled air-sea modelling systems, which not only simultaneously assimilate atmospheric and oceanic observations into the coupled air-sea modelling system but also nudge the large-scale environmental flow in the regional model towards global model forecasts, are increasingly necessary. In this presentation, we will outline a strategy for an integrated approach to air-sea coupled data assimilation and discuss its benefits and feasibility based on incremental results for selected historical hurricane cases.
A probabilistic drought forecasting framework: A combined dynamical and statistical approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Hongxiang; Moradkhani, Hamid; Zarekarizi, Mahkameh
In order to improve drought forecasting skill, this study develops a probabilistic drought forecasting framework comprised of dynamical and statistical modeling components. The novelty of this study is to use data assimilation to quantify initial condition uncertainty with Monte Carlo ensemble members, rather than relying entirely on the hydrologic model or land surface model to generate a single deterministic initial condition, as currently implemented in operational drought forecasting systems. Next, the initial condition uncertainty quantified through data assimilation is coupled with a newly developed probabilistic drought forecasting model using a copula function. The initial conditions at each forecast start date are sampled from the data assimilation ensembles for forecast initialization. Finally, seasonal drought forecasting products are generated with the updated initial conditions. This study introduces the theory behind the proposed drought forecasting system, with an application in the Columbia River Basin, Pacific Northwest, United States. Results from both synthetic and real case studies suggest that the proposed drought forecasting system significantly improves seasonal drought forecasting skill and can facilitate state drought preparation and declaration, at least three months before the official state drought declaration.
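The copula step can be sketched as conditional sampling: given the current drought-index percentile, draw future percentiles that respect the dependence structure. The bivariate Gaussian copula family, the correlation, and all parameter values below are assumptions (the abstract only says "a copula function"):

```python
import math
import random
from statistics import NormalDist

def copula_conditional_samples(u1, rho, n, seed=0):
    """Sample U2 | U1 = u1 under a bivariate Gaussian copula with correlation
    rho (an assumed family; percentiles are mapped to normal scores and back)."""
    nd = NormalDist()
    rng = random.Random(seed)
    z1 = nd.inv_cdf(u1)                        # percentile -> normal score
    s = math.sqrt(1.0 - rho * rho)
    return [nd.cdf(rho * z1 + s * rng.gauss(0.0, 1.0)) for _ in range(n)]

# Hypothetical use: initial-condition percentile 0.1 (a dry start), rho = 0.8.
samples = copula_conditional_samples(0.1, 0.8, 5000)
# Forecast probability that next season's index falls in the lowest quintile:
p_drought = sum(u < 0.2 for u in samples) / len(samples)
```

A dry initial condition raises the forecast drought probability well above the unconditional 20%, which is exactly the information data-assimilation-based initialization is meant to provide.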
Analysis of Random Drop for Gateway Congestion Control
1989-11-01
…prompted the need for more effective congestion control policies. Currently, no gateway policy is used to relieve and signal congestion, which leads to unfair service to the … Early application of the policy removes the pressure of congestion relief and allows more accurate signaling of congestion. To be used effectively …
Chen, Mei-Chih; Chang, Kaowen
2014-01-01
Many city governments choose to supply more developable land and transportation infrastructure with the hope of attracting people and businesses to their cities. However, like those in Taiwan, major cities worldwide suffer from traffic congestion. This study applies the system thinking logic of the causal loops diagram (CLD) model in the System Dynamics (SD) approach to analyze the issue of traffic congestion and other issues related to roads and land development in Taiwan’s cities. Comparing the characteristics of development trends with yearbook data for 2002 to 2013 for all of Taiwan’s cities, this study explores the developing phenomenon of unlimited city sprawl and identifies the cause and effect relationships in the characteristics of development trends in traffic congestion, high-density population aggregation in cities, land development, and green land disappearance resulting from city sprawl. This study provides conclusions for Taiwan’s cities’ sustainability and development (S&D). When developing S&D policies, during decision making processes concerning city planning and land use management, governments should think with a holistic view of carrying capacity with the assistance of system thinking to clarify the prejudices in favor of the unlimited developing phenomena resulting from city sprawl.
Modeling and analyzing cascading dynamics of the Internet based on local congestion information
NASA Astrophysics Data System (ADS)
Zhu, Qian; Nie, Jianlong; Zhu, Zhiliang; Yu, Hai; Xue, Yang
2018-06-01
Cascading failure has become one of the vital issues in network science. Considering realistic network operational settings, we propose a congestion function to represent the congested extent of a node and construct a local congestion-aware routing strategy with a tunable parameter. We investigate cascading failures on the Internet triggered by deliberate attacks. Simulation results show that the tunable parameter has an optimal value that makes the network achieve a maximum level of robustness. The robustness of the network is positively correlated with the tolerance parameter but negatively correlated with the packet generation rate. In addition, there exists a threshold of the attacked proportion of nodes at which the network reaches its lowest robustness. Moreover, by introducing the concept of time delay for information transmission on the Internet, we find that an increase in the time delay rapidly decreases the robustness of the network. The findings of the paper will be useful for enhancing the robustness of the Internet in the future.
Critical behaviour in charging of electric vehicles
NASA Astrophysics Data System (ADS)
Carvalho, Rui; Buzna, Lubos; Gibbens, Richard; Kelly, Frank
2015-09-01
The increasing penetration of electric vehicles over the coming decades, taken together with the high cost to upgrade local distribution networks and consumer demand for home charging, suggest that managing congestion on low voltage networks will be a crucial component of the electric vehicle revolution and the move away from fossil fuels in transportation. Here, we model the max-flow and proportional fairness protocols for the control of congestion caused by a fleet of vehicles charging on two real-world distribution networks. We show that the system undergoes a continuous phase transition to a congested state as a function of the rate of vehicles plugging to the network to charge. We focus on the order parameter and its fluctuations close to the phase transition, and show that the critical point depends on the choice of congestion protocol. Finally, we analyse the inequality in the charging times as the vehicle arrival rate increases, and show that charging times are considerably more equitable in proportional fairness than in max-flow.
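The continuous transition to a congested state can be illustrated with a fluid (deterministic) approximation, which is an assumption of this sketch rather than the paper's max-flow or proportional-fairness protocols; the capacity value is also a placeholder:

```python
import numpy as np

capacity = 10.0                        # assumed maximum simultaneous charging rate
lams = np.linspace(0.0, 20.0, 81)      # plug-in (arrival) rates to scan

# Fluid approximation: the unserved fraction is zero below capacity and grows
# continuously from zero at lam = capacity, the signature of a continuous
# phase transition into the congested state.
order = np.zeros_like(lams)            # order parameter
mask = lams > capacity
order[mask] = (lams[mask] - capacity) / lams[mask]

critical_index = int(np.argmax(order > 0.0))   # first rate in the congested phase
```

In the paper the critical point depends on the congestion protocol; in this toy version it is simply the network capacity, but the continuous onset of the order parameter is the same qualitative picture.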
Two-step forecast of geomagnetic storm using coronal mass ejection and solar wind condition
Kim, R-S; Moon, Y-J; Gopalswamy, N; Park, Y-D; Kim, Y-H
2014-01-01
To forecast geomagnetic storms, we had examined initially observed parameters of coronal mass ejections (CMEs) and introduced an empirical storm forecast model in a previous study. Now we suggest a two-step forecast considering not only CME parameters observed in the solar vicinity but also solar wind conditions near Earth to improve the forecast capability. We consider the empirical solar wind criteria derived in this study (Bz ≤ −5 nT or Ey ≥ 3 mV/m for t≥ 2 h for moderate storms with minimum Dst less than −50 nT) and a Dst model developed by Temerin and Li (2002, 2006) (TL model). Using 55 CME-Dst pairs during 1997 to 2003, our solar wind criteria produce slightly better forecasts for 31 storm events (90%) than the forecasts based on the TL model (87%). However, the latter produces better forecasts for 24 nonstorm events (88%), while the former correctly forecasts only 71% of them. We then performed the two-step forecast. The results are as follows: (i) for 15 events that are incorrectly forecasted using CME parameters, 12 cases (80%) can be properly predicted based on solar wind conditions; (ii) if we forecast a storm when both CME and solar wind conditions are satisfied (∩), the critical success index becomes higher than that from the forecast using CME parameters alone, however, only 25 storm events (81%) are correctly forecasted; and (iii) if we forecast a storm when either set of these conditions is satisfied (∪), all geomagnetic storms are correctly forecasted.
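The empirical solar-wind criterion quoted above, together with the critical success index used to compare the forecast variants, can be sketched as follows; the contingency counts are illustrative, not the paper's exact table:

```python
def storm_predicted(bz_nT, ey_mVm, duration_h):
    """Empirical solar-wind criterion from the abstract: Bz <= -5 nT or
    Ey >= 3 mV/m, sustained for at least 2 hours."""
    return duration_h >= 2 and (bz_nT <= -5 or ey_mVm >= 3)

def critical_success_index(hits, misses, false_alarms):
    """CSI = hits / (hits + misses + false alarms); correct rejections
    are deliberately excluded from the score."""
    return hits / (hits + misses + false_alarms)

# Illustrative contingency counts (hypothetical, not from the paper's table):
csi = critical_success_index(hits=25, misses=6, false_alarms=3)
```

Because the CSI ignores correct rejections, it rewards the (∩) combination for suppressing false alarms even though that combination misses some storms, which is exactly the trade-off reported in results (ii) and (iii).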
High-resolution weather forecasting is affected by many factors, e.g. model initial conditions, subgrid-scale cumulus convection, and cloud microphysics schemes. Recent 12-km grid studies using the Weather Research and Forecasting (WRF) model have identified the importance of inco...
ERIC Educational Resources Information Center
Fechter, Alan
Obstacles to producing forecasts of the impact of technological change and skill utilization are briefly discussed, and existing models for forecasting manpower requirements are described and analyzed. A survey of current literature reveals a concentration of models for producing long-range national forecasts, but few models for generating…
Regional Air Quality forecAST (RAQAST) Over the U.S
NASA Astrophysics Data System (ADS)
Yoshida, Y.; Choi, Y.; Zeng, T.; Wang, Y.
2005-12-01
A regional chemistry and transport modeling system is used to provide 48-hour forecasts of the concentrations of ozone and its precursors over the United States. The meteorological forecast is conducted using the NCAR/Penn State MM5 model. The regional chemistry and transport model simulates the sources, transport, chemistry, and deposition of 24 chemical tracers. The lateral and upper boundary conditions of trace gas concentrations are specified using the monthly mean output from the global GEOS-CHEM model. The initial and boundary conditions for meteorological fields are taken from the NOAA AVN forecast. The forecast has been operational since August 2003. Model simulations are evaluated using surface, aircraft, and satellite measurements in 'hindcast' mode. The next step is an automated forecast evaluation system.
Demand forecast model based on CRM
NASA Astrophysics Data System (ADS)
Cai, Yuancui; Chen, Lichao
2006-11-01
As the customer-centred management philosophy becomes internalized day by day, forecasting customer demand becomes more and more important. In demand forecasting for customer relationship management, traditional forecast methods face severe limitations because of the large uncertainty in demand, which calls for new models. In this paper, the idea is to forecast demand according to the characteristics of the potential customer and to build the model accordingly. The model first describes customers using a uniform set of multiple indexes. Second, it acquires characteristic customers on the basis of a data warehouse and data mining technology. Finally, it finds the most similar characteristic customer by comparison and forecasts the new customer's demand from that most similar characteristic customer.
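The "most similar characteristic customer" step described above can be sketched as a nearest-neighbour lookup. The index names, the Euclidean distance measure, and the demand values below are illustrative assumptions, not taken from the paper.

```python
import math

def most_similar_customer(new_indexes, characteristic_customers):
    """Return the characteristic customer whose (uniform) index vector is
    closest to the new customer's, by Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(characteristic_customers, key=lambda c: dist(c["indexes"], new_indexes))

# Hypothetical index vectors (e.g. scaled purchase frequency, order size, tenure)
profiles = [
    {"id": "A", "indexes": [0.9, 0.8, 0.7], "monthly_demand": 120},
    {"id": "B", "indexes": [0.2, 0.3, 0.1], "monthly_demand": 15},
]
match = most_similar_customer([0.85, 0.75, 0.6], profiles)
print(match["id"], match["monthly_demand"])  # A 120
```

The new customer's demand forecast is then simply read off (or scaled from) the matched profile's demand.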
NASA Astrophysics Data System (ADS)
Khajehei, Sepideh; Moradkhani, Hamid
2015-04-01
Producing reliable and accurate hydrologic ensemble forecasts is subject to various sources of uncertainty, including meteorological forcing, initial conditions, model structure, and model parameters. Producing reliable and skillful precipitation ensemble forecasts is one approach to reducing the total uncertainty in hydrological applications. Currently, Numerical Weather Prediction (NWP) models are developing ensemble forecasts for various temporal ranges, but it is well established that the raw products from NWP models are biased in both mean and spread. There is therefore a need for methods that can generate reliable ensemble forecasts for hydrological applications. One common technique is to apply statistical procedures to generate an ensemble forecast from NWP-generated single-value forecasts. The procedure is based on the bivariate probability distribution between the observation and the single-value precipitation forecast. However, one assumption of the current method is that a Gaussian distribution fits the marginal distributions of the observed and modeled climate variable. Here, we describe and evaluate a Bayesian approach based on copula functions to develop an ensemble precipitation forecast from the conditional distribution of single-value precipitation forecasts. Copula functions express the multivariate joint distribution of univariate marginal distributions and are presented as an alternative procedure for capturing the uncertainties related to meteorological forcing. Copulas are capable of modeling the joint distribution of two variables with any level of correlation and dependency. This study is conducted over a sub-basin in the Columbia River Basin in the USA, using monthly precipitation forecasts from the Climate Forecast System (CFS) at 0.5° × 0.5° spatial resolution to reproduce the observations.
The verification is conducted on a different period and the superiority of the procedure is compared with Ensemble Pre-Processor approach currently used by National Weather Service River Forecast Centers in USA.
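A minimal sketch of the copula idea, assuming a Gaussian copula and empirical rank transforms (the paper's actual Bayesian formulation and copula family may differ): observed and forecast series are mapped to normal scores, their dependence is estimated, and an ensemble is drawn from the conditional distribution given a new single-value forecast.

```python
import numpy as np
from statistics import NormalDist

nd = NormalDist()
rng = np.random.default_rng(42)

def normal_scores(x):
    """Map data to standard normal scores via empirical ranks (no ties assumed)."""
    ranks = np.argsort(np.argsort(x)) + 1
    return np.array([nd.inv_cdf(r / (len(x) + 1.0)) for r in ranks])

def copula_ensemble(obs, fcst, new_fcst, n_members=100):
    """Gaussian-copula sketch: estimate dependence between observed and
    forecast precipitation in normal-score space, sample the conditional
    normal given a new single-value forecast, then back-transform through
    the observed climatology quantiles."""
    obs, fcst = np.asarray(obs, float), np.asarray(fcst, float)
    rho = np.corrcoef(normal_scores(fcst), normal_scores(obs))[0, 1]
    z_new = nd.inv_cdf((np.sum(fcst <= new_fcst) + 1.0) / (len(fcst) + 2.0))
    z = rng.normal(rho * z_new, np.sqrt(1.0 - rho ** 2), n_members)
    return np.quantile(obs, [nd.cdf(v) for v in z])

# Toy training pairs (mm/month) and two hypothetical new forecasts
obs = np.arange(2.0, 62.0, 2.0)                 # 30 observed values
fcst = obs + np.tile([1.5, -1.5, 0.5], 10)      # correlated single-value forecasts
wet = copula_ensemble(obs, fcst, 50.0, n_members=200)
dry = copula_ensemble(obs, fcst, 10.0, n_members=200)
print(wet.mean() > dry.mean())  # True: wetter forecast -> wetter ensemble
```

The back-transform through observed quantiles means the ensemble honours the observed marginal distribution regardless of the forecast bias, which is the main appeal of the copula construction.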
NASA Astrophysics Data System (ADS)
Pérez, B.; Brouwer, R.; Beckers, J.; Paradis, D.; Balseiro, C.; Lyons, K.; Cure, M.; Sotillo, M. G.; Hackett, B.; Verlaan, M.; Fanjul, E. A.
2012-03-01
ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that makes use of several storm surge or circulation models and near-real-time tide gauge data in the region, with two main goals: 1. providing easy access to existing forecasts, as well as to their performance and model validation, by means of an adequate visualization tool; 2. generating better sea level forecasts, including confidence intervals, by means of the Bayesian Model Average (BMA) technique. The Bayesian Model Average technique generates an overall forecast probability density function (PDF) by taking a weighted average of the individual forecast PDFs; the weights represent the Bayesian likelihood that a model will give the correct forecast and are continuously updated based on the performance of the models during a recent training period. This implies that the technique needs sea level data from tide gauges in near-real time. The system was implemented for the European Atlantic facade (IBIROOS region) and the Western Mediterranean coast, based on the MATROOS visualization tool developed by Deltares. Validation results for the different models and the BMA implementation for the main harbours are presented for these regions, where this kind of activity is performed for the first time. The system is currently operational at Puertos del Estado and has proved useful in detecting calibration problems in some of the circulation models, in identifying the systematic differences between baroclinic and barotropic models for sea level forecasts, and in demonstrating the feasibility of providing an overall probabilistic forecast based on the BMA method.
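The BMA predictive density described above is a weighted mixture of the individual model PDFs. A sketch with Gaussian component PDFs and hypothetical surge forecasts and weights (the operational system estimates weights from a recent training period):

```python
import numpy as np

def bma_forecast_pdf(x, means, sigmas, weights):
    """BMA predictive density: weighted sum of each model's forecast PDF
    (Gaussians here), with weights reflecting recent model performance."""
    x = np.asarray(x, dtype=float)
    pdf = np.zeros_like(x)
    for m, s, w in zip(means, sigmas, weights):
        pdf += w * np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))
    return pdf

# Three hypothetical surge forecasts (cm) with performance-based weights
grid = np.linspace(-50.0, 150.0, 2001)
pdf = bma_forecast_pdf(grid, [40.0, 55.0, 35.0], [10.0, 8.0, 12.0], [0.5, 0.3, 0.2])
print(abs(pdf.sum() * (grid[1] - grid[0]) - 1.0) < 1e-3)  # True: integrates to one
```

Confidence intervals then come directly from quantiles of this mixture rather than from any single model.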
Do quantitative decadal forecasts from GCMs provide decision relevant skill?
NASA Astrophysics Data System (ADS)
Suckling, E. B.; Smith, L. A.
2012-04-01
It is widely held that only physics-based simulation models can capture the dynamics required to provide decision-relevant probabilistic climate predictions. This in itself provides no evidence that predictions from today's GCMs are fit for purpose. Empirical (data-based) models are employed to make probability forecasts on decadal timescales, where it is argued that these 'physics free' forecasts provide a quantitative 'zero skill' target for the evaluation of forecasts based on more complicated models. It is demonstrated that these zero-skill models are competitive with GCMs on decadal scales for probability forecasts evaluated over the last 50 years. Complications of statistical interpretation due to the 'hindcast' nature of this experiment are discussed, as is the relevance of arguments that a lack of hindcast skill is irrelevant because the signal will soon 'come out of the noise'. A lack of decision-relevant quantitative skill does not bring the science-based insights of anthropogenic warming into doubt, but it does call for a clear quantification of limits, as a function of lead time, for the spatial and temporal scales on which decisions based on such model output are expected to prove maladaptive. Failing to do so may risk the credibility of science in support of policy in the long term. The performance of a collection of simulation models is evaluated, having transformed ensembles of point forecasts into probability distributions through the kernel dressing procedure [1], according to a selection of proper skill scores [2], and contrasted with purely data-based empirical models. Data-based models are unlikely to yield realistic forecasts for future climate change if the Earth system moves away from the conditions observed in the past, upon which the models are constructed; in this sense the empirical model defines zero skill. When should a decision-relevant simulation model be expected to significantly outperform such empirical models?
Probability forecasts up to ten years ahead (decadal forecasts) are considered, on both global and regional spatial scales, for surface air temperature. Such decadal forecasts are important not only for providing information on the impacts of near-term climate change, but also from the perspective of climate model validation, as hindcast experiments and a sufficient database of historical observations allow standard forecast verification methods to be used. Simulation models from the ENSEMBLES hindcast experiment [3] are evaluated and contrasted with static forecasts of the observed climatology, persistence forecasts, and simple statistical models called dynamic climatology (DC). It is argued that DC is a more appropriate benchmark in the case of a non-stationary climate. It is found that the ENSEMBLES models do not demonstrate a significant increase in skill relative to the empirical models, even at global scales, over any lead time up to a decade ahead. It is suggested that the construction of, and co-evaluation with, data-based models become a regular component of the reporting of large simulation model forecasts. The methodology presented may easily be adapted to other forecasting experiments and is expected to influence the design of future experiments. The inclusion of comparisons with dynamic climatology and other data-based approaches provides important information to both scientists and decision makers on which aspects of state-of-the-art simulation forecasts are likely to be fit for purpose. [1] J. Bröcker and L. A. Smith. From ensemble forecasts to predictive distributions, Tellus A, 60(4), 663-678 (2007). [2] J. Bröcker and L. A. Smith. Scoring probabilistic forecasts: The importance of being proper, Weather and Forecasting, 22, 382-388 (2006). [3] F. J. Doblas-Reyes, A. Weisheimer, T. N. Palmer, J. M. Murphy and D. Smith. Forecast quality assessment of the ENSEMBLES seasonal-to-decadal stream 2 hindcasts, ECMWF Technical Memorandum, 621 (2010).
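The kernel dressing procedure [1] referenced above turns an ensemble of point forecasts into a continuous predictive distribution by placing a kernel on each member. A minimal Gaussian-kernel sketch (the bandwidth rule here is a generic Silverman-style default, not necessarily the paper's choice):

```python
import numpy as np

def kernel_dress(ensemble, x, sigma=None):
    """Turn a point-forecast ensemble into a continuous PDF by placing a
    Gaussian kernel on each member (generic Silverman-style bandwidth)."""
    ens = np.asarray(ensemble, dtype=float)
    if sigma is None:
        sigma = 1.06 * ens.std(ddof=1) * len(ens) ** -0.2
    z = (np.asarray(x, dtype=float)[:, None] - ens) / sigma
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (len(ens) * sigma * np.sqrt(2.0 * np.pi))

# Four ensemble members (deg C); the dressed PDF integrates to one
x = np.linspace(10.0, 21.0, 1101)
pdf = kernel_dress([14.8, 15.1, 15.4, 15.9], x)
print(abs(pdf.sum() * (x[1] - x[0]) - 1.0) < 1e-3)  # True
```

The resulting PDF can then be evaluated with proper scores and, as the abstract notes, blended with climatology before scoring.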
NASA Astrophysics Data System (ADS)
Yin, Yip Chee; Hock-Eam, Lim
2012-09-01
Our empirical results show that GDP growth rates can be predicted more accurately in continents with fewer large economies than in smaller economies such as Malaysia. This difficulty is very likely positively correlated with subsidy or social security policies. The stage of economic development and the level of competitiveness also appear to have interactive effects on this forecast stability. These results are generally independent of the forecasting procedure. For countries with high stability in their economic growth, forecasting by model selection is better than model averaging. Overall, forecast weight averaging (FWA) is the better forecasting procedure in most countries. FWA also outperforms simple model averaging (SMA) and has the same forecasting ability as Bayesian model averaging (BMA) in almost all countries.
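As a rough illustration of weight-based forecast combination, a sketch using weights inversely proportional to each model's recent squared error. This weighting rule is an assumption for illustration; the paper's exact FWA scheme is not specified in the abstract.

```python
import numpy as np

def weight_average(forecasts, recent_errors):
    """Combine model forecasts with weights inversely proportional to each
    model's recent squared error (illustrative weighting; the paper's exact
    FWA scheme may differ). Weights sum to one."""
    w = 1.0 / np.asarray(recent_errors, dtype=float) ** 2
    w /= w.sum()
    return float(np.dot(w, forecasts)), w

# Three hypothetical growth forecasts (%) and their recent RMSEs
combined, w = weight_average([4.0, 5.0, 6.5], [0.5, 1.0, 2.0])
print(round(combined, 3))  # leans toward the historically most accurate model
```

Simple model averaging would instead use equal weights, so the contrast between the two is just the weight vector.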
Model Forecast Skill and Sensitivity to Initial Conditions in the Seasonal Sea Ice Outlook
NASA Technical Reports Server (NTRS)
Blanchard-Wrigglesworth, E.; Cullather, R. I.; Wang, W.; Zhang, J.; Bitz, C. M.
2015-01-01
We explore the skill of predictions of September Arctic sea ice extent from dynamical models participating in the Sea Ice Outlook (SIO). Forecasts submitted in August, at roughly 2 month lead times, are skillful. However, skill is lower in forecasts submitted to SIO, which began in 2008, than in hindcasts (retrospective forecasts) of the last few decades. The multimodel mean SIO predictions offer slightly higher skill than the single-model SIO predictions, but neither beats a damped persistence forecast at longer than 2 month lead times. The models are largely unsuccessful at predicting each other, indicating a large difference in model physics and/or initial conditions. Motivated by this, we perform an initial condition sensitivity experiment with four SIO models, applying a fixed -1 m perturbation to the initial sea ice thickness. The significant range of the response among the models suggests that different model physics make a significant contribution to forecast uncertainty.
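The damped persistence benchmark mentioned above relaxes the latest observed anomaly toward climatology by the lag autocorrelation. A one-line sketch with hypothetical numbers (not values from the study):

```python
def damped_persistence(latest_anomaly, lag_autocorr, climatology):
    """Damped persistence forecast: relax the latest observed anomaly
    toward climatology by the lag autocorrelation."""
    return climatology + lag_autocorr * latest_anomaly

# Hypothetical numbers: climatological September extent 6.5 million km^2,
# current anomaly -1.2, 2-month-lag autocorrelation 0.6
print(round(damped_persistence(-1.2, 0.6, 6.5), 2))  # 5.78
```

Beating this benchmark is a meaningful bar precisely because it uses no model physics, only the observed persistence of anomalies.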
Wang, Deyun; Wei, Shuai; Luo, Hongyuan; Yue, Chenqiang; Grunder, Olivier
2017-02-15
The randomness, non-stationarity and irregularity of air quality index (AQI) series make AQI forecasting difficult. To enhance forecast accuracy, a novel hybrid forecasting model combining a two-phase decomposition technique and an extreme learning machine (ELM) optimized by the differential evolution (DE) algorithm is developed for AQI forecasting in this paper. In phase I, complementary ensemble empirical mode decomposition (CEEMD) is utilized to decompose the AQI series into a set of intrinsic mode functions (IMFs) with different frequencies; in phase II, in order to further handle the high-frequency IMFs, which would otherwise increase forecast difficulty, variational mode decomposition (VMD) is employed to decompose them into a number of variational modes (VMs). Then, the ELM model optimized by the DE algorithm is applied to forecast all the IMFs and VMs. Finally, the forecast value of each high-frequency IMF is obtained by adding up the forecast results of all corresponding VMs, and the forecast series of the AQI is obtained by aggregating the forecast results of all IMFs. To verify and validate the proposed model, two daily AQI series from July 1, 2014 to June 30, 2016, collected from Beijing and Shanghai in China, are taken as test cases for the empirical study. The experimental results show that the proposed hybrid model based on the two-phase decomposition technique is remarkably superior to all other considered models, with higher forecast accuracy. Copyright © 2016 Elsevier B.V. All rights reserved.
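The extreme learning machine at the core of the hybrid model is a single-hidden-layer network whose input weights are random and whose output weights are solved by least squares. A self-contained sketch on a toy series standing in for one decomposed component (the CEEMD/VMD phases and DE optimization are omitted):

```python
import numpy as np

rng = np.random.default_rng(1)

def elm_fit(X, y, n_hidden=20):
    """Extreme learning machine: random input weights and biases,
    tanh hidden layer, output weights solved by ordinary least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Toy smooth series: predict the next value from the previous three lags
series = np.sin(np.linspace(0.0, 20.0, 200))
X = np.column_stack([series[i:i + 197] for i in range(3)])
y = series[3:]
model = elm_fit(X[:150], y[:150])
mse = float(np.mean((elm_predict(model, X[150:]) - y[150:]) ** 2))
print(mse)  # small out-of-sample error on this smooth series
```

Because only the output layer is trained, fitting is a single linear solve, which is why ELMs are attractive for forecasting many decomposed components cheaply.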
Three-model ensemble wind prediction in southern Italy
NASA Astrophysics Data System (ADS)
Torcasio, Rosa Claudia; Federico, Stefano; Calidonna, Claudia Roberta; Avolio, Elenio; Drofa, Oxana; Landi, Tony Christian; Malguzzi, Piero; Buzzi, Andrea; Bonasoni, Paolo
2016-03-01
Quality of wind prediction is of great importance since a good wind forecast allows the prediction of available wind power, improving the penetration of renewable energies into the energy market. Here, a 1-year (1 December 2012 to 30 November 2013) three-model ensemble (TME) experiment for wind prediction is considered. The models employed, run operationally at the National Research Council - Institute of Atmospheric Sciences and Climate (CNR-ISAC), are RAMS (Regional Atmospheric Modelling System), BOLAM (BOlogna Limited Area Model), and MOLOCH (MOdello LOCale in H coordinates). The area considered for the study is southern Italy and the measurements used for forecast verification are those of the GTS (Global Telecommunication System). Comparison with observations is made every 3 h up to a forecast lead time of 48 h. Results show that the three-model ensemble outperforms the forecast of each individual model. The RMSE improvement compared to the best model is between 22% and 30%, depending on the season. It is also shown that the three-model ensemble outperforms the IFS (Integrated Forecasting System) of the ECMWF (European Centre for Medium-Range Weather Forecasts) for surface wind forecasts. Notably, the three-model ensemble forecast performs better than each unbiased model, showing the added value of the ensemble technique. Finally, the sensitivity of the three-model ensemble RMSE to the length of the training period is analysed.
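The added value of a multi-model ensemble mean can be seen in a toy example where the three models have partially cancelling errors. The numbers below are illustrative, not from the study:

```python
import numpy as np

def rmse(pred, obs):
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(obs)) ** 2)))

# Hypothetical 10-m wind speed observations (m/s) and three model forecasts
obs = np.array([4.0, 6.0, 5.0, 7.0, 3.0])
rams = obs + np.array([0.8, -0.5, 0.9, -0.7, 0.6])
bolam = obs + np.array([-0.6, 0.7, -0.8, 0.5, -0.9])
moloch = obs + np.array([0.5, 0.6, -0.4, -0.6, 0.7])
tme = (rams + bolam + moloch) / 3.0  # three-model ensemble mean
best_single = min(rmse(rams, obs), rmse(bolam, obs), rmse(moloch, obs))
print(rmse(tme, obs) < best_single)  # True: errors partially cancel
```

Averaging reduces the error variance whenever the individual model errors are not perfectly correlated, which is the mechanism behind the 22%-30% RMSE improvement reported above.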
Statistical and dynamical forecast of regional precipitation after mature phase of ENSO
NASA Astrophysics Data System (ADS)
Sohn, S.; Min, Y.; Lee, J.; Tam, C.; Ahn, J.
2010-12-01
While the seasonal predictability of general circulation models (GCMs) has improved, the current model atmosphere in the mid-latitudes does not respond correctly to external forcing such as tropical sea surface temperature (SST), particularly over the East Asian and western North Pacific summer monsoon regions. In addition, the time scale of the prediction scope is considerably limited, and model forecast skill is still very poor beyond two weeks. Although recent studies indicate that coupled-model-based multi-model ensemble (MME) forecasts show better performance, long-lead forecasts exceeding 9 months still show a dramatic decrease in seasonal predictability. This study aims at diagnosing dynamical MME forecasts comprising state-of-the-art one-tier models as well as comparing them with statistical model forecasts, focusing on East Asian summer precipitation predictions after the mature phase of ENSO. The lagged impact of El Nino, as a major climate contributor, on the summer monsoon in model environments is also evaluated in the sense of conditional probabilities. To evaluate the probability forecast skills, the reliability (attributes) diagram and the relative operating characteristics, following the recommendations of the World Meteorological Organization (WMO) Standardized Verification System for Long-Range Forecasts, are used in this study. The results should shed light on the prediction skill of the dynamical models, and also of the statistical model, in forecasting East Asian summer monsoon rainfall with a long lead time.
Multi-RCM ensemble downscaling of global seasonal forecasts (MRED)
NASA Astrophysics Data System (ADS)
Arritt, R. W.
2008-12-01
The Multi-RCM Ensemble Downscaling (MRED) project was recently initiated to address the question, "Can regional climate models provide additional useful information from global seasonal forecasts?" MRED will use a suite of regional climate models to downscale seasonal forecasts produced by the new National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) seasonal forecast system and the NASA GEOS5 system. The initial focus will be on wintertime forecasts in order to evaluate topographic forcing, snowmelt, and the potential usefulness of higher resolution, especially for near-surface fields influenced by high-resolution orography. Each regional model will cover the conterminous US (CONUS) at approximately 32 km resolution, and will perform an ensemble of 15 runs for each year 1982-2003 for the forecast period 1 December - 30 April. MRED will compare individual regional and global forecasts as well as ensemble mean precipitation and temperature forecasts, which are currently being used to drive macroscale land surface models (LSMs), as well as wind, humidity, radiation, and turbulent heat fluxes, which are important for more advanced coupled macroscale hydrologic models. Metrics of ensemble spread will also be evaluated. Extensive analysis will be performed to link improvements in downscaled forecast skill to regional forcings and physical mechanisms. Our overarching goal is to determine what additional skill can be provided by a community ensemble of high-resolution regional models, which we believe will eventually define a strategy for more skillful and useful regional seasonal climate forecasts.
Forecasting the mortality rates of Indonesian population by using neural network
NASA Astrophysics Data System (ADS)
Safitri, Lutfiani; Mardiyati, Sri; Rahim, Hendrisman
2018-03-01
A model that can represent the problem is required for forecasting. One of the models acknowledged by the actuarial community for forecasting mortality rates is the Lee-Carter model. Here, the Lee-Carter model supported by a neural network is used to forecast mortality in Indonesia. The type of neural network used is a feedforward neural network trained with the backpropagation algorithm in the Python programming language. The final result of this study is forecast mortality rates for Indonesia for the next few years.
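The Lee-Carter model expresses log mortality as ln m_{x,t} = a_x + b_x k_t. Independently of the neural network extension, the standard estimation route is the SVD of the centred log-rate matrix. A sketch:

```python
import numpy as np

def fit_lee_carter(log_mx):
    """Estimate the Lee-Carter model ln m_{x,t} = a_x + b_x * k_t:
    a_x is the age-specific mean log rate; b_x and k_t come from the
    first singular vectors of the centred matrix, normalised so that
    sum(b_x) = 1 and sum(k_t) = 0."""
    a = log_mx.mean(axis=1)
    U, S, Vt = np.linalg.svd(log_mx - a[:, None], full_matrices=False)
    b, k = U[:, 0], S[0] * Vt[0]
    s = b.sum()                      # also fixes the SVD sign ambiguity
    b, k = b / s, k * s
    shift = k.mean()
    return a + b * shift, b, k - shift

# Synthetic example: 3 age groups, 4 years, built from known a, b, k
a_true = np.array([-4.0, -3.5, -3.0])
b_true = np.array([0.2, 0.3, 0.5])
k_true = np.array([2.0, 0.0, -2.0, 0.0])
log_mx = a_true[:, None] + b_true[:, None] * k_true[None, :]
a, b, k = fit_lee_carter(log_mx)
print(np.allclose(a[:, None] + np.outer(b, k), log_mx))  # True
```

Forecasting then reduces to extrapolating the single time index k_t, which is where the paper substitutes a neural network for the usual random-walk-with-drift.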
Potential Technologies for Assessing Risk Associated with a Mesoscale Forecast
2015-10-01
American GFS models, and informally applied on the Weather Research and Forecasting (WRF) model. The current CI equation is as follows...Reen B, Penc R. Investigating surface bias errors in the Weather Research and Forecasting (WRF) model using a Geographic Information System (GIS). J...Forecast model (WRF-ARW) with extensions that might include finer terrain resolutions and more detailed representations of the underlying atmospheric
Inanlouganji, Alireza; Reddy, T. Agami; Katipamula, Srinivas
2018-04-13
Forecasting solar irradiation has acquired immense importance in view of the exponential increase in the number of solar photovoltaic (PV) system installations. In this article, analysis results involving statistical and machine-learning techniques to predict solar irradiation for different forecasting horizons are reported. Yearlong typical meteorological year 3 (TMY3) datasets from three cities in the United States with different climatic conditions have been used in this analysis. A simple forecast approach that assumes consecutive days to be identical serves as a baseline model to compare forecasting alternatives. To account for seasonal variability and to capture short-term fluctuations, different variants of the lagged moving average (LMX) model with cloud cover as the input variable are evaluated. Finally, the proposed LMX model is evaluated against an artificial neural network (ANN) model. How the one-hour and 24-hour models can be used in conjunction to predict different short-term rolling horizons is discussed, and this joint application is illustrated for a four-hour rolling horizon forecast scheme. Lastly, the effect of using predicted cloud cover values, instead of measured ones, on the accuracy of the models is assessed. Results show that LMX models do not degrade in forecast accuracy if models are trained with the forecast cloud cover data.
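An LMX-type model regresses irradiation on its own lagged values plus cloud cover as the exogenous input. A least-squares sketch on synthetic data (the article's exact lag structure and model variants are not reproduced here):

```python
import numpy as np

def fit_lmx(irrad, cloud, n_lags=1):
    """Regress irradiation on its own lagged values plus the concurrent
    cloud cover (exogenous input), via ordinary least squares."""
    rows = [[1.0, *irrad[t - n_lags:t], cloud[t]] for t in range(n_lags, len(irrad))]
    coef, *_ = np.linalg.lstsq(np.array(rows), np.asarray(irrad[n_lags:]), rcond=None)
    return coef

def predict_lmx(coef, recent_irrad, cloud_now):
    return float(np.dot(coef, [1.0, *recent_irrad, cloud_now]))

# Synthetic hourly data generated by a known linear relation
cloud = (np.sin(np.arange(50.0)) + 1.0) / 2.0
irrad = [100.0]
for t in range(1, 50):
    irrad.append(20.0 + 0.5 * irrad[-1] - 30.0 * cloud[t])
irrad = np.array(irrad)

coef = fit_lmx(irrad, cloud, n_lags=1)
pred = predict_lmx(coef, [irrad[-1]], 0.2)
print(abs(pred - (20.0 + 0.5 * irrad[-1] - 30.0 * 0.2)) < 1e-6)  # True
```

Feeding forecast rather than measured cloud cover into `cloud_now` is exactly the substitution whose effect the article assesses.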
Skill of Ensemble Seasonal Probability Forecasts
NASA Astrophysics Data System (ADS)
Smith, Leonard A.; Binter, Roman; Du, Hailiang; Niehoerster, Falk
2010-05-01
In operational forecasting, the computational complexity of large simulation models is, ideally, justified by enhanced performance over simpler models. We will consider probability forecasts and contrast the skill of ENSEMBLES-based seasonal probability forecasts of interest to the finance sector (specifically temperature forecasts for Nino 3.4 and the Atlantic Main Development Region (MDR)). The ENSEMBLES model simulations will be contrasted against forecasts from statistical models based on the observations (climatological distributions) and empirical dynamics based on the observations but conditioned on the current state (dynamical climatology). For some start dates, individual ENSEMBLES models yield significant skill even at a lead time of 14 months. The nature of this skill is discussed, and chances for application are noted. Questions surrounding the interpretation of probability forecasts based on these multi-model ensemble simulations are then considered; the distributions considered are formed by kernel dressing the ensemble and blending with the climatology. The sources of apparent (RMS) skill in distributions based on multi-model simulations are discussed, and it is demonstrated that the inclusion of 'zero-skill' models in the long range can improve root-mean-square error scores, casting some doubt on the common justification for the claim that all models should be included in forming an operational probability forecast. It is argued that the rational response varies with lead time.
Optimal topology to minimizing congestion in connected communication complex network
NASA Astrophysics Data System (ADS)
Benyoussef, M.; Ez-Zahraouy, H.; Benyoussef, A.
In this paper, a new model of an interdependent complex network is proposed, based on two assumptions: (i) the capacity of a node depends on its degree, and (ii) the traffic load depends on the distribution of the links in the network. Based on these assumptions, the model proposes a method of connection based not on the node having the highest degree but on the region containing hubs. It is found that the final network exhibits two kinds of degree distribution behavior, depending on the kind and the way of the connection. This study reveals a direct relation between network structure and traffic flow. It is found that pc, the transition point between free flow and the congested phase, depends on the network structure and the degree distribution. Moreover, this new model provides an improvement in traffic compared to the results found in a single network. The same degree distribution behavior found in a BA network and observed in the real world is obtained, except that for this model the transition point between the free phase and the congested phase is much higher than the one observed in a BA network, for both static and dynamic protocols.
Skill of a global seasonal ensemble streamflow forecasting system
NASA Astrophysics Data System (ADS)
Candogan Yossef, Naze; Winsemius, Hessel; Weerts, Albrecht; van Beek, Rens; Bierkens, Marc
2013-04-01
Forecasting of water availability and scarcity is a prerequisite for managing the risks and opportunities caused by the inter-annual variability of streamflow. Reliable seasonal streamflow forecasts are necessary to prepare an appropriate response in disaster relief, management of hydropower reservoirs, water supply, agriculture and navigation. Seasonal hydrological forecasting on a global scale could be valuable especially for developing regions of the world, where effective hydrological forecasting systems are scarce. In this study, we investigate the forecasting skill of the global seasonal streamflow forecasting system FEWS-World, using the global hydrological model PCR-GLOBWB. FEWS-World has been set up within the European Commission 7th Framework Programme project Global Water Scarcity Information Service (GLOWASIS). Skill is assessed in historical simulation mode as well as in retroactive forecasting mode. The assessment in historical simulation mode used a meteorological forcing based on observations from the Climatic Research Unit of the University of East Anglia and the ERA-40 reanalysis of the European Centre for Medium-Range Weather Forecasts (ECMWF). We assessed the skill of the global hydrological model PCR-GLOBWB in reproducing past discharge extremes in 20 large rivers of the world. This preliminary assessment concluded that the prospects for seasonal forecasting with PCR-GLOBWB or comparable models are positive. However, this assessment did not include actual meteorological forecasts, so meteorological forcing errors were not assessed. Yet, in a forecasting setup, the predictive skill of a hydrological forecasting system is affected by errors due to uncertainty from numerical weather prediction models. For the assessment in retroactive forecasting mode, the model is forced with actual ensemble forecasts from the seasonal forecast archives of ECMWF.
Skill is assessed at 78 stations on large river basins across the globe, for all the months of the year and for lead times up to 6 months. The forecasted discharges are compared with observed monthly streamflow records using the ensemble verification measures Brier Skill Score (BSS) and Continuous Ranked Probability Score (CRPS). The eventual goal is to transfer FEWS-World to operational forecasting mode, where the system will use operational seasonal forecasts from ECMWF. The results will be disseminated on the internet, and hopefully provide information that is valuable for users in data and model-poor regions of the world.
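The CRPS used in the verification above can be estimated directly from the ensemble members as E|X - y| - 0.5 E|X - X'|, without fitting a distribution. A sketch:

```python
import numpy as np

def ensemble_crps(members, obs):
    """Sample CRPS of an ensemble forecast:
    CRPS = E|X - y| - 0.5 * E|X - X'| (lower is better)."""
    m = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(m - obs))
    term2 = 0.5 * np.mean(np.abs(m[:, None] - m[None, :]))
    return float(term1 - term2)

# A well-centred ensemble scores lower than a biased one
print(ensemble_crps([9.0, 10.0, 11.0], 10.0))   # ~0.222
print(ensemble_crps([14.0, 15.0, 16.0], 10.0))  # ~4.556
```

For a single deterministic forecast the score reduces to the absolute error, which makes CRPS a natural generalisation for comparing ensembles with observed monthly streamflow.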
NASA Astrophysics Data System (ADS)
Subramanian, Aneesh C.; Palmer, Tim N.
2017-06-01
Stochastic schemes to represent model uncertainty in the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system have helped improve its probabilistic forecast skill over the past decade, both by improving its reliability and by reducing the ensemble mean error. The largest uncertainties in the model arise from the model physics parameterizations. In the tropics, the parameterization of moist convection presents a major challenge for the accurate prediction of weather and climate. Superparameterization is a promising alternative strategy for including the effects of moist convection through explicit turbulent fluxes calculated from a cloud-resolving model (CRM) embedded within a global climate model (GCM). In this paper, we compare the impact of initial random perturbations in embedded CRMs within the ECMWF ensemble prediction system with the stochastically perturbed physical tendency (SPPT) scheme as a way to represent model uncertainty in medium-range tropical weather forecasts. We especially focus on forecasts of tropical convection and dynamics during MJO events in October-November 2011. These are well-studied events for MJO dynamics, as they were also heavily observed during the DYNAMO field campaign. We show that a multiscale ensemble modeling approach helps improve forecasts of certain aspects of tropical convection during the MJO events, while it also tends to deteriorate certain large-scale dynamic fields, with respect to the stochastically perturbed physical tendencies approach that is used operationally at ECMWF.
NASA Astrophysics Data System (ADS)
Nobre, Paulo; Moura, Antonio D.; Sun, Liqiang
2001-12-01
This study presents an evaluation of a seasonal climate forecast done with the International Research Institute for Climate Prediction (IRI) dynamical forecast system (regional model nested into a general circulation model) over northern South America for January-April 1999, encompassing the rainy season over Brazil's Nordeste. The one-way nesting is done in two tiers: first the NCEP Regional Spectral Model (RSM) runs with an 80-km grid mesh forced by outputs of the ECHAM3 atmospheric general circulation model (AGCM); then the RSM runs with a finer grid mesh (20 km) forced by the forecasts generated by the RSM-80. An ensemble of three realizations is run. Lower boundary conditions over the oceans for both the ECHAM and RSM model runs are sea surface temperature forecasts over the tropical oceans. Soil moisture is initialized from ECHAM's fields. The rainfall forecasts generated by the regional model are compared with those of the AGCM and with observations. It is shown that the regional model at 80-km resolution improves upon the AGCM rainfall forecast, reducing both seasonal bias and root-mean-square error. On the other hand, the RSM-20 forecasts presented larger errors, with spatial patterns that resemble those of local topography. The better forecast of the position and width of the intertropical convergence zone (ITCZ) over the tropical Atlantic by the RSM-80 model is one of the principal reasons for its better forecast scores relative to the AGCM. The regional model improved the spatial as well as the temporal details of rainfall distribution, while also presenting the minimum spread among the ensemble members. The statistics of synoptic-scale weather variability on seasonal timescales were best forecast with the regional 80-km model over the Nordeste. The possibility of forecasting the frequency distribution of dry and wet spells within the rainy season is encouraging.
Comparing Perceptions and Measures of Congestion
DOT National Transportation Integrated Search
2012-10-01
People's perception of congestion and the actual measured congestion do not always agree. Measured congestion relates to the delay resulting from field measurements of traffic volume, speed, and travel time. People's perception of congestion ...
Johansson, Michael A; Reich, Nicholas G; Hota, Aditi; Brownstein, John S; Santillana, Mauricio
2016-09-26
Dengue viruses, which infect millions of people per year worldwide, cause large epidemics that strain healthcare systems. Despite diverse efforts to develop forecasting tools including autoregressive time series, climate-driven statistical, and mechanistic biological models, little work has been done to understand the contribution of different components to improved prediction. We developed a framework to assess and compare dengue forecasts produced from different types of models and evaluated the performance of seasonal autoregressive models with and without climate variables for forecasting dengue incidence in Mexico. Climate data did not significantly improve the predictive power of seasonal autoregressive models. Short-term and seasonal autocorrelation were key to improving short-term and long-term forecasts, respectively. Seasonal autoregressive models captured a substantial amount of dengue variability, but better models are needed to improve dengue forecasting. This framework contributes to the sparse literature of infectious disease prediction model evaluation, using state-of-the-art validation techniques such as out-of-sample testing and comparison to an appropriate reference model.
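The seasonal autoregressive structure described above can be sketched, under strong simplifying assumptions (ordinary least squares, one short-term and one seasonal lag, monthly rather than weekly data, no differencing or moving-average terms), as:

```python
import numpy as np

def fit_seasonal_ar(y, lag=1, seasonal_lag=12):
    """Least-squares fit of y_t = a + b*y_{t-lag} + c*y_{t-seasonal_lag}.

    A bare-bones stand-in for the seasonal autoregressive models
    discussed above; the two terms capture the short-term and seasonal
    autocorrelation that the study found key to forecast skill.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    t0 = max(lag, seasonal_lag)
    X = np.column_stack([
        np.ones(n - t0),                           # intercept
        y[t0 - lag:n - lag],                       # short-term lag
        y[t0 - seasonal_lag:n - seasonal_lag],     # seasonal lag
    ])
    coef, *_ = np.linalg.lstsq(X, y[t0:], rcond=None)
    return coef

def forecast_next(y, coef, lag=1, seasonal_lag=12):
    """One-step-ahead forecast from the fitted coefficients."""
    a, b, c = coef
    return a + b * y[-lag] + c * y[-seasonal_lag]
```

With weekly incidence data, as in the Mexico application, the seasonal lag would be 52; the paper's actual models are SARIMA-type and also consider climate covariates.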
A seasonal hydrologic ensemble prediction system for water resource management
NASA Astrophysics Data System (ADS)
Luo, L.; Wood, E. F.
2006-12-01
A seasonal hydrologic ensemble prediction system, developed for the Ohio River basin, has been improved and expanded to several other regions including the Eastern U.S., Africa, and East Asia. The prediction system adopts the traditional Extended Streamflow Prediction (ESP) approach, utilizing the VIC (Variable Infiltration Capacity) hydrological model as the central tool for producing ensemble predictions of soil moisture, snow and streamflow with lead times up to 6 months. VIC is forced by observed meteorology to estimate the hydrological initial condition prior to the forecast, but during the forecast period the atmospheric forcing comes from statistically downscaled seasonal forecasts from dynamic climate models. The system is currently producing real-time seasonal hydrologic forecasts for these regions on a monthly basis. Using hindcasts from a 19-year period (1981-1999), during which seasonal hindcasts from the NCEP Climate Forecast System (CFS) and the European Union DEMETER project are available, we evaluate the performance of the forecast system over our forecast regions. The evaluation shows that the prediction system using the current forecast approach is able to produce reliable and accurate precipitation, soil moisture and streamflow predictions. The overall skill is much higher than that of the traditional ESP. In particular, forecasts based on multiple climate models are more skillful than single-model-based forecasts. This emphasizes the significant need for producing seasonal climate forecasts with multiple climate models for hydrologic applications. Forecasts from this system are expected to provide valuable information about future hydrologic states and associated risks for end users, including the water resource management and financial sectors.
NASA Astrophysics Data System (ADS)
Sinha, T.; Arumugam, S.
2012-12-01
Seasonal streamflow forecasts contingent on climate forecasts can be effectively utilized in updating water management plans and optimizing the generation of hydroelectric power. Streamflow in rainfall-runoff dominated basins depends critically on forecasted precipitation, in contrast to snow-dominated basins, where initial hydrological conditions (IHCs) are more important. Since precipitation forecasts from Atmosphere-Ocean General Circulation Models are available at coarse scale (~2.8° by 2.8°), spatial and temporal downscaling of such forecasts is required to implement land surface models, which typically run on finer spatial and temporal scales. Consequently, multiple sources of error are introduced at various stages in predicting seasonal streamflow. Therefore, in this study, we address the following science questions: 1) How do we attribute the errors in monthly streamflow forecasts to various sources: (i) model errors, (ii) spatio-temporal downscaling, (iii) imprecise initial conditions, (iv) no forecasts, and (v) imprecise forecasts? and 2) How do monthly streamflow forecast errors propagate with different lead times over various seasons? In this study, the Variable Infiltration Capacity (VIC) model is calibrated over the Apalachicola River at Chattahoochee, FL in the southeastern US and implemented with observed 1/8° daily forcings to estimate reference streamflow during 1981 to 2010. The VIC model is then forced with different schemes under updated IHCs prior to the forecasting period to estimate relative mean square errors due to: a) temporal disaggregation, b) spatial downscaling, c) Reverse Ensemble Streamflow Prediction (imprecise IHCs), d) ESP (no forecasts), and e) ECHAM4.5 precipitation forecasts. Finally, error propagation under the different schemes is analyzed with different lead times over different seasons.
NASA Astrophysics Data System (ADS)
O'Brien, Enda; McKinstry, Alastair; Ralph, Adam
2015-04-01
Building on previous work presented at EGU 2013 (http://www.sciencedirect.com/science/article/pii/S1876610213016068), more results are available now from a different wind farm in complex terrain in southwest Ireland. The basic approach is to interpolate wind-speed forecasts from an operational weather forecast model (i.e., HARMONIE in the case of Ireland) to the precise location of each wind turbine, and then use Bayesian model averaging (BMA; with statistical information collected from a prior training period of e.g., 25 days) to remove systematic biases. Bias-corrected wind-speed forecasts (and associated power-generation forecasts) are then provided twice daily (at 5am and 5pm) out to 30 hours, with each forecast validation fed back to BMA for future learning. 30-hr forecasts from the operational Met Éireann HARMONIE model at 2.5km resolution have been validated against turbine SCADA observations since Jan. 2014. An extra high-resolution (0.5km grid-spacing) HARMONIE configuration has been run since Nov. 2014 as an extra member of the forecast "ensemble". A new version of HARMONIE with extra filters designed to stabilize high-resolution configurations has been run since Jan. 2015. Measures of forecast skill and forecast errors will be provided, and the contributions made by the various physical and computational enhancements to HARMONIE will be quantified.
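The training-window idea can only be caricatured here: the toy function below removes a per-member additive bias estimated over the window and weights members by inverse training MSE, whereas actual BMA fits member weights and spread parameters by expectation-maximization. All numbers and array shapes are illustrative:

```python
import numpy as np

def bias_corrected_weighted_mean(train_fc, train_obs, new_fc):
    """Toy stand-in for the training-window bias-removal step above.

    train_fc:  (days, members) forecasts over the training window
    train_obs: (days,) matching turbine observations
    new_fc:    (members,) latest raw forecasts
    Each member gets an additive bias correction from the window and a
    weight proportional to its inverse training MSE. Real BMA instead
    estimates weights and predictive variances by EM.
    """
    train_fc = np.asarray(train_fc, dtype=float)
    train_obs = np.asarray(train_obs, dtype=float)
    bias = train_fc.mean(axis=0) - train_obs.mean()          # per-member bias
    corrected_train = train_fc - bias
    mse = ((corrected_train - train_obs[:, None]) ** 2).mean(axis=0)
    w = 1.0 / np.maximum(mse, 1e-9)                          # inverse-MSE weights
    w /= w.sum()
    return float(np.dot(w, np.asarray(new_fc, dtype=float) - bias))
```

A member that tracks the observations perfectly apart from a constant offset ends up dominating the weighted mean, which is the qualitative behavior one wants from the training step.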
Wind power application research on the fusion of the determination and ensemble prediction
NASA Astrophysics Data System (ADS)
Lan, Shi; Lina, Xu; Yuzhu, Hao
2017-07-01
A fused wind-speed product for wind farms is designed using ensemble prediction wind-speed products from the European Centre for Medium-Range Weather Forecasts (ECMWF) and professional numerical wind-power model products based on Mesoscale Model 5 (MM5) and the Beijing Rapid Update Cycle (BJ-RUC), which are suitable for short-term wind power forecasting and electric dispatch. A single-valued forecast is formed by calculating different ensemble statistics of the Bayesian probabilistic forecast representing the uncertainty of the ECMWF ensemble prediction. An autoregressive integrated moving average (ARIMA) model is used to improve the time resolution of the single-valued forecast, and based on Bayesian model averaging (BMA) and the deterministic numerical model prediction, an optimal wind speed forecasting curve and confidence interval are provided. The results show that the fused forecast clearly improves accuracy relative to the existing numerical forecasting products. Compared with the 0-24 h existing deterministic forecast in the validation period, the mean absolute error (MAE) is decreased by 24.3% and the correlation coefficient (R) is increased by 12.5%. In comparison with the ECMWF ensemble forecast, the MAE is reduced by 11.7% and R is increased by 14.5%. Additionally, the MAE did not grow with forecast lead time.
Forecasting malaria in a highly endemic country using environmental and clinical predictors.
Zinszer, Kate; Kigozi, Ruth; Charland, Katia; Dorsey, Grant; Brewer, Timothy F; Brownstein, John S; Kamya, Moses R; Buckeridge, David L
2015-06-18
Malaria thrives in poor tropical and subtropical countries where local resources are limited. Accurate disease forecasts can provide public and clinical health services with the information needed to implement targeted approaches for malaria control that make effective use of limited resources. The objective of this study was to determine the relevance of environmental and clinical predictors of malaria across different settings in Uganda. Forecasting models were based on health facility data collected by the Uganda Malaria Surveillance Project and satellite-derived rainfall, temperature, and vegetation estimates from 2006 to 2013. Facility-specific forecasting models of confirmed malaria were developed using multivariate autoregressive integrated moving average models and produced weekly forecast horizons over a 52-week forecasting period. The model with the most accurate forecasts varied by site and by forecast horizon. Clinical predictors were retained in the models with the highest predictive power for all facility sites. The average error over the 52 forecasting horizons ranged from 26 to 128%, whereas the cumulative burden forecast error ranged from 2 to 22%. Clinical data, such as drug treatment, could be used to improve the accuracy of malaria predictions in endemic settings when coupled with environmental predictors. Further exploration of malaria forecasting is necessary to improve its accuracy and value in practice, including examining other environmental and intervention predictors, such as insecticide-treated nets.
Real-time Social Internet Data to Guide Forecasting Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Del Valle, Sara Y.
Our goal is to improve decision support by monitoring and forecasting events using social media, mathematical models, and quantified model uncertainty. Our approach is real-time, data-driven forecasting with quantified uncertainty: not just for weather anymore. Information flow from human observations of events through an Internet system and classification algorithms is used to produce quantitatively uncertain forecasts. In summary, we want to develop new tools to extract useful information from Internet data streams, develop new approaches to assimilate real-time information into predictive models, and validate these approaches by forecasting events; our ultimate goal is to develop an event forecasting system using mathematical approaches and heterogeneous data streams.
Use of medium-range numerical weather prediction model output to produce forecasts of streamflow
Clark, M.P.; Hay, L.E.
2004-01-01
This paper examines an archive containing over 40 years of 8-day atmospheric forecasts over the contiguous United States from the NCEP reanalysis project to assess the possibilities for using medium-range numerical weather prediction model output for predictions of streamflow. This analysis shows the biases in the NCEP forecasts to be quite extreme. In many regions, systematic precipitation biases exceed 100% of the mean, with temperature biases exceeding 3 °C. In some locations, biases are even higher. The accuracy of NCEP precipitation and 2-m maximum temperature forecasts is computed by interpolating the NCEP model output for each forecast day to the location of each station in the NWS cooperative network and computing the correlation with station observations. Results show that the accuracy of the NCEP forecasts is rather low in many areas of the country. Most apparent is the generally low skill in precipitation forecasts (particularly in July) and low skill in temperature forecasts in the western United States, the eastern seaboard, and the southern tier of states. These results outline a clear need for additional processing of the NCEP Medium-Range Forecast Model (MRF) output before it is used for hydrologic predictions. Techniques of model output statistics (MOS) are used in this paper to downscale the NCEP forecasts to station locations. Forecasted atmospheric variables (e.g., total column precipitable water, 2-m air temperature) are used as predictors in a forward screening multiple linear regression model to improve forecasts of precipitation and temperature for stations in the National Weather Service cooperative network. This procedure effectively removes all systematic biases in the raw NCEP precipitation and temperature forecasts. MOS guidance also results in substantial improvements in the accuracy of maximum and minimum temperature forecasts throughout the country. For precipitation, forecast improvements were less impressive.
MOS guidance increases the accuracy of precipitation forecasts over the northeastern United States, but overall, the accuracy of MOS-based precipitation forecasts is slightly lower than that of the raw NCEP forecasts. Four basins in the United States were chosen as case studies to evaluate the value of MRF output for predictions of streamflow. Streamflow forecasts using MRF output were generated for one rainfall-dominated basin (Alapaha River at Statenville, Georgia) and three snowmelt-dominated basins (Animas River at Durango, Colorado; East Fork of the Carson River near Gardnerville, Nevada; and Cle Elum River near Roslyn, Washington). Hydrologic model output forced with measured-station data was used as "truth" to focus attention on the hydrologic effects of errors in the MRF forecasts. Eight-day streamflow forecasts produced using the MOS-corrected MRF output as input (MOS) were compared with those produced using the climatic Ensemble Streamflow Prediction (ESP) technique. MOS-based streamflow forecasts showed increased skill in the snowmelt-dominated river basins, where daily variations in streamflow are strongly forced by temperature. In contrast, the skill of MOS forecasts in the rainfall-dominated basin (the Alapaha River) was equivalent to the skill of the ESP forecasts. Further improvements in streamflow forecasts require more accurate local-scale forecasts of precipitation and temperature, more accurate specification of basin initial conditions, and more accurate model simulations of streamflow. © 2004 American Meteorological Society.
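The forward-screening multiple linear regression at the heart of the MOS step can be sketched as a greedy predictor-selection loop; the predictor columns, data, and choice of two predictors below are illustrative, not the study's configuration:

```python
import numpy as np

def forward_screen(X, y, max_pred=2):
    """Forward-screening multiple linear regression, MOS-style.

    Greedily adds the predictor column of X that most reduces the
    residual sum of squares until max_pred predictors are chosen.
    Returns the selected column indices and the fitted coefficients
    (intercept first, then one coefficient per chosen column, in order).
    """
    n, p = X.shape
    chosen = []
    for _ in range(max_pred):
        best, best_sse = None, np.inf
        for j in range(p):
            if j in chosen:
                continue
            A = np.column_stack([np.ones(n), X[:, chosen + [j]]])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            sse = float(((A @ coef - y) ** 2).sum())
            if sse < best_sse:
                best, best_sse = j, sse
        chosen.append(best)
    A = np.column_stack([np.ones(n), X[:, chosen]])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return chosen, coef
```

In the paper's setting the columns of X would be forecasted atmospheric variables (total column precipitable water, 2-m air temperature, etc.) and y the station observation; because the regression targets the observations, the fit also removes systematic bias.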
The Value of Humans in the Operational River Forecasting Enterprise
NASA Astrophysics Data System (ADS)
Pagano, T. C.
2012-04-01
The extent of human control over operational river forecasts, such as by adjusting model inputs and outputs, varies from nearly completely automated systems to those where forecasts are generated after discussion among a group of experts. Historical and real-time data availability, the complexity of hydrologic processes, forecast user needs, and forecasting institution support/resource availability (e.g. computing power, money for model maintenance) influence the character and effectiveness of operational forecasting systems. Automated data quality algorithms, if used at all, are typically very basic (e.g. checks for impossible values); substantial human effort is devoted to cleaning up forcing data using subjective methods. Similarly, although it is an active research topic, nearly all operational forecasting systems struggle to make quantitative use of Numerical Weather Prediction model-based precipitation forecasts, instead relying on the assessment of meteorologists. Conversely, while there is a strong tradition in meteorology of making raw model outputs available to forecast users via the Internet, this is rarely done in hydrology; operational river forecasters express concerns about exposing users to raw guidance, due to the potential for misinterpretation and misuse. However, this limits the ability of users to build their confidence in operational products through their own value-added analyses. Forecasting agencies also struggle with provenance (i.e. documenting the production process and archiving the pieces that went into creating a forecast), although this is necessary for quantifying the benefits of human involvement in forecasting and diagnosing weak links in the forecasting chain. In hydrology, the space between model outputs and final operational products is nearly unstudied by the academic community, although some studies exist in other fields such as meteorology.
A stochastic HMM-based forecasting model for fuzzy time series.
Li, Sheng-Tun; Cheng, Yi-Chung
2010-10-01
Recently, fuzzy time series have attracted more academic attention than traditional time series due to their capability of dealing with the uncertainty and vagueness inherent in the data collected. The formulation of fuzzy relations is one of the key issues affecting forecasting results. Most of the present works adopt IF-THEN rules for relationship representation, which leads to higher computational overhead and rule redundancy. Sullivan and Woodall proposed a Markov-based formulation and a forecasting model to reduce computational overhead; however, its applicability is limited to handling one-factor problems. In this paper, we propose a novel forecasting model based on the hidden Markov model by enhancing Sullivan and Woodall's work to allow handling of two-factor forecasting problems. Moreover, in order to make the nature of conjecture and randomness of forecasting more realistic, the Monte Carlo method is adopted to estimate the outcome. To test the effectiveness of the resulting stochastic model, we conduct two experiments and compare the results with those from other models. The first experiment consists of forecasting the daily average temperature and cloud density in Taipei, Taiwan, and the second experiment is based on the Taiwan Weighted Stock Index by forecasting the exchange rate of the New Taiwan dollar against the U.S. dollar. In addition to improving forecasting accuracy, the proposed model adheres to the central limit theorem, and thus, the result statistically approximates the real mean of the target value being forecast.
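The Markov-based formulation and the Monte Carlo step can be caricatured as follows, with fuzzy sets reduced to integer states and interval midpoints. This is a drastic simplification of the paper's two-factor HMM, shown only to fix the ideas:

```python
import numpy as np

def transition_matrix(states, n_states):
    """Estimate a Markov transition matrix from an observed state
    sequence, in the spirit of Sullivan and Woodall's formulation.
    Rows with no observed transitions fall back to a uniform row."""
    T = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        T[a, b] += 1.0
    rows = T.sum(axis=1, keepdims=True)
    return np.divide(T, rows, out=np.full_like(T, 1.0 / n_states),
                     where=rows > 0)

def monte_carlo_forecast(T, current, midpoints, n_sims=10000, rng=None):
    """Monte Carlo estimate of the next value: sample next states from
    row `current` of T and average the fuzzy-interval midpoints."""
    rng = np.random.default_rng(rng)
    samples = rng.choice(len(midpoints), size=n_sims, p=T[current])
    return float(np.mean(np.asarray(midpoints)[samples]))
```

Because the forecast is an average of many independent samples, the central-limit behavior the abstract mentions appears naturally: the Monte Carlo estimate concentrates around the mean implied by the transition row.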
ERIC Educational Resources Information Center
Zan, Xinxing Anna; Yoon, Sang Won; Khasawneh, Mohammad; Srihari, Krishnaswami
2013-01-01
In an effort to develop a low-cost and user-friendly forecasting model to minimize forecasting error, we have applied average and exponentially weighted return ratios to project undergraduate student enrollment. We tested the proposed forecasting models with different sets of historical enrollment data, such as university-, school-, and…
Evaluation of annual, global seismicity forecasts, including ensemble models
NASA Astrophysics Data System (ADS)
Taroni, Matteo; Zechar, Jeremy; Marzocchi, Warner
2013-04-01
In 2009, the Collaboratory for the Study of Earthquake Predictability (CSEP) initiated a prototype global earthquake forecast experiment. Three models participated in this experiment for 2009, 2010 and 2011; each model forecast the number of earthquakes above magnitude 6 in 1x1 degree cells that span the globe. Here we use likelihood-based metrics to evaluate the consistency of the forecasts with the observed seismicity. We compare model performance with statistical tests and a new method based on the peer-to-peer gambling score. The results of the comparisons are used to build ensemble models that are a weighted combination of the individual models. Notably, in these experiments the ensemble model always performs significantly better than the single best-performing model. Our results indicate the following: i) time-varying forecasts, if not updated after each major shock, may not provide significant advantages with respect to time-invariant models in 1-year forecast experiments; ii) the spatial distribution seems to be the most important feature characterizing the different forecasting performances of the models; iii) the interpretation of consistency tests may be misleading, because some good models may be rejected while trivial models pass; iv) proper ensemble modeling seems to be a valuable procedure for obtaining the best-performing model for practical purposes.
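The likelihood-based metrics referred to here treat each grid cell as an independent Poisson variable. A minimal sketch of the joint log-likelihood (the quantity behind CSEP-style consistency tests) and of a weighted ensemble of per-cell rate forecasts, with made-up rates:

```python
import math

def poisson_loglik(rates, counts):
    """Joint log-likelihood of observed per-cell earthquake counts under
    a gridded rate forecast, assuming independent Poisson cells:
    sum_i [ n_i * ln(lam_i) - lam_i - ln(n_i!) ]."""
    ll = 0.0
    for lam, n in zip(rates, counts):
        ll += n * math.log(lam) - lam - math.lgamma(n + 1)
    return ll

def ensemble_rates(models, weights):
    """Weighted per-cell combination of rate forecasts from several
    models; higher-scoring models would receive larger weights."""
    total = sum(weights)
    return [sum(w * m[i] for m, w in zip(models, weights)) / total
            for i in range(len(models[0]))]
```

Comparing `poisson_loglik` between a candidate and a reference model, per earthquake, is the idea behind information-gain comparisons; the gambling score used in the paper is a different, bet-based construction.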
Data Assimilation at FLUXNET to Improve Models towards Ecological Forecasting (Invited)
NASA Astrophysics Data System (ADS)
Luo, Y.
2009-12-01
Dramatically increased volumes of data from observational and experimental networks such as FLUXNET call for transformation of ecological research to increase its emphasis on quantitative forecasting. Ecological forecasting will also meet the societal need to develop better strategies for natural resource management in a world of ongoing global change. Traditionally, ecological forecasting has been based on process-based models, informed by data in largely ad hoc ways. Although most ecological models incorporate some representation of mechanistic processes, today’s ecological models are generally not adequate to quantify real-world dynamics and provide reliable forecasts with accompanying estimates of uncertainty. A key tool to improve ecological forecasting is data assimilation, which uses data to inform initial conditions and to help constrain a model during simulation to yield results that approximate reality as closely as possible. In an era with dramatically increased availability of data from observational and experimental networks, data assimilation is a key technique that helps convert the raw data into ecologically meaningful products so as to accelerate our understanding of ecological processes, test ecological theory, forecast changes in ecological services, and better serve the society. This talk will use examples to illustrate how data from FLUXNET have been assimilated with process-based models to improve estimates of model parameters and state variables; to quantify uncertainties in ecological forecasting arising from observations, models and their interactions; and to evaluate information contributions of data and model toward short- and long-term forecasting of ecosystem responses to global change.
Development and validation of a regional coupled forecasting system for S2S forecasts
NASA Astrophysics Data System (ADS)
Sun, R.; Subramanian, A. C.; Hoteit, I.; Miller, A. J.; Ralph, M.; Cornuelle, B. D.
2017-12-01
Accurate and efficient forecasting of oceanic and atmospheric circulation is essential for a wide variety of high-impact societal needs, including weather extremes; environmental protection and coastal management; management of fisheries and marine conservation; water resources; and renewable energy. Effective forecasting relies on high model fidelity and accurate initialization of the models with the observed state of the coupled ocean-atmosphere-land system. A regional coupled ocean-atmosphere model, with the Weather Research and Forecasting (WRF) model and the MITgcm ocean model coupled using the ESMF (Earth System Modeling Framework), is developed to resolve mesoscale air-sea feedbacks. The regional coupled model allows oceanic mixed-layer heat and momentum to interact with atmospheric boundary layer dynamics at mesoscale and submesoscale spatiotemporal regimes, thus capturing feedbacks that are otherwise not resolved in coarse-resolution global coupled forecasting systems or in regional uncoupled forecasting systems. The model is tested in two scenarios: the mesoscale-eddy-rich Red Sea and Western Indian Ocean region, and the mesoscale eddies and fronts of the California Current System. Recent studies show evidence for air-sea interactions involving the oceanic mesoscale in these two regions, which can enhance predictability on subseasonal timescales. We will present results from this newly developed regional coupled ocean-atmosphere model for forecasts over the Red Sea region as well as the California Current region. The forecasts will be validated against in situ observations in the region as well as reanalysis fields.
NASA Astrophysics Data System (ADS)
Strader, Anne; Schneider, Max; Schorlemmer, Danijel; Liukis, Maria
2016-04-01
The Collaboratory for the Study of Earthquake Predictability (CSEP) was developed to rigorously test earthquake forecasts retrospectively and prospectively through reproducible, completely transparent experiments within a controlled environment (Zechar et al., 2010). During 2006-2011, thirteen five-year time-invariant prospective earthquake mainshock forecasts developed by the Regional Earthquake Likelihood Models (RELM) working group were evaluated through the CSEP testing center (Schorlemmer and Gerstenberger, 2007). The number, spatial, and magnitude components of the forecasts were compared to the respective observed seismicity components using a set of consistency tests (Schorlemmer et al., 2007, Zechar et al., 2010). In the initial experiment, all but three forecast models passed every test at the 95% significance level, with all forecasts displaying log-likelihoods (L-test) and magnitude distributions (M-test) consistent with the observed seismicity. In the ten-year RELM experiment update, we reevaluate these earthquake forecasts over an eight-year period from 2008 to 2016, to determine the consistency of previous likelihood testing results over longer time intervals. Additionally, we test the Uniform California Earthquake Rupture Forecast (UCERF2), developed by the U.S. Geological Survey (USGS), and the earthquake rate model developed by the California Geological Survey (CGS) and the USGS for the National Seismic Hazard Mapping Program (NSHMP) against the RELM forecasts. Both the UCERF2 and NSHMP forecasts pass all consistency tests, though the Helmstetter et al. (2007) and Shen et al. (2007) models exhibit greater information gain per earthquake according to the T- and W-tests (Rhoades et al., 2011). Though all but three RELM forecasts pass the spatial likelihood test (S-test), multiple forecasts fail the M-test due to overprediction of the number of earthquakes during the target period.
Though there is no significant difference between the UCERF2 and NSHMP models, residual scores show that the NSHMP model is preferred in locations with earthquake occurrence, due to the lower seismicity rates forecasted by the UCERF2 model.
Assessing skill of a global bimonthly streamflow ensemble prediction system
NASA Astrophysics Data System (ADS)
van Dijk, A. I.; Peña-Arancibia, J.; Sheffield, J.; Wood, E. F.
2011-12-01
Ideally, a seasonal streamflow forecasting system might be conceived of as a system that ingests skillful climate forecasts from general circulation models and propagates these through thoroughly calibrated hydrological models that are initialised using hydrometric observations. In practice, there are practical problems with each of these aspects. Instead, we analysed whether a comparatively simple hydrological model-based Ensemble Prediction System (EPS) can provide global bimonthly streamflow forecasts with some skill and, if so, under what circumstances the greatest skill may be expected. The system tested produces ensemble forecasts for each of six annual bimonthly periods based on the previous 30 years of global daily gridded 1° resolution climate variables and an initialised global hydrological model. To incorporate some of the skill derived from ocean conditions, a post-EPS analog method was used to sample from the ensemble based on El Niño Southern Oscillation (ENSO), Indian Ocean Dipole (IOD), North Atlantic Oscillation (NAO) and Pacific Decadal Oscillation (PDO) index values observed prior to the forecast. Forecast skill was assessed through a hindcasting experiment for the period 1979-2008. Potential skill was calculated with reference to a model run with the actual forcing for the forecast period (the 'perfect' model) and was compared to actual forecast skill calculated for each of the six forecast times for an average of 411 Australian and 51 pan-tropical catchments. Significant potential skill in bimonthly forecasts was largely limited to northern regions during the snowmelt period, seasonally wet tropical regions at the transition of wet to dry season, and the Indonesian region where rainfall is well correlated to ENSO. The actual skill was approximately 34-50% of the potential skill. We attribute this primarily to limitations in the model structure, parameterisation and global forcing data.
Use of better climate forecasts and remote sensing observations of initial catchment conditions should help to increase actual skill in future. Future work also could address the potential skill gain from using weather and climate forecasts and from a calibrated and/or alternative hydrological model or model ensemble. The approach and data might be useful as a benchmark for joint seasonal forecasting experiments planned under GEWEX.
Delivering Faster Congestion Feedback with the Mark-Front Strategy
NASA Technical Reports Server (NTRS)
Liu, Chunlei; Jain, Raj
2001-01-01
Computer networks use congestion feedback from routers and destinations to control the transmission load. Delivering timely congestion feedback is essential to the performance of networks: reaction to congestion can be more effective if faster feedback is provided. Current TCP/IP networks use timeouts, duplicate acknowledgement packets (ACKs) and explicit congestion notification (ECN) to deliver congestion feedback, each providing faster feedback than the previous method. In this paper, we propose a mark-front strategy that delivers even faster congestion feedback. With analytical and simulation results, we show that the mark-front strategy reduces the buffer size requirement, improves link efficiency and provides better fairness among users. Keywords: explicit congestion notification, mark-front, congestion control, buffer size requirement, fairness.
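The intuition behind mark-front feedback can be sketched with a toy queue model (illustrative only; the per-packet service time and queue assumptions here are ours, not the paper's analysis):

```python
def feedback_delay(queue_len, service_time, mark_front):
    """Toy model: delay until a congestion mark reaches the receiver.

    With mark-tail, the mark is placed on the newly arrived packet at the
    back of the queue, so it must wait behind queue_len packets.  With
    mark-front, the packet about to depart is marked instead, so the
    feedback leaves after a single service time.
    """
    packets_ahead = 0 if mark_front else queue_len
    return (packets_ahead + 1) * service_time

# Example: 20 queued packets, 1 ms per packet
tail = feedback_delay(20, 0.001, mark_front=False)
front = feedback_delay(20, 0.001, mark_front=True)
```

In this sketch the mark-front feedback arrives 21 times sooner, which is the mechanism behind the reduced buffer requirement claimed in the abstract.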
Moran, Kelly Renee; Fairchild, Geoffrey; Generous, Nicholas; Hickmann, Kyle; Osthus, Dave; Priedhorsky, Reid; Hyman, James; Del Valle, Sara Y.
2016-11-14
Mathematical models, such as those that forecast the spread of epidemics or predict the weather, must overcome the challenges of integrating incomplete and inaccurate data into computer simulations, estimating the probability of multiple possible scenarios, and incorporating changes in human behavior, the pathogen, and environmental factors. In the past three decades, the weather forecasting community has made significant advances in data collection, assimilating heterogeneous data streams into models and communicating the uncertainty of its predictions to the general public. Epidemic modelers are struggling with these same issues in forecasting the spread of emerging diseases, such as Zika virus infection and Ebola virus disease. While weather models rely on physical systems, data from satellites, and weather stations, epidemic models rely on human interactions, multiple data sources such as clinical surveillance and Internet data, and environmental or biological factors that can change pathogen dynamics. We describe some of the similarities and differences between these two fields and how the epidemic modeling community is rising to the challenges posed by forecasting to help anticipate and guide the mitigation of epidemics. We conclude that some of the fundamental differences between these two fields, such as human behavior, make disease forecasting more challenging than weather forecasting.
NASA Astrophysics Data System (ADS)
Ma, Chaoqun; Wang, Tijian; Zang, Zengliang; Li, Zhijin
2018-07-01
Atmospheric chemistry models usually perform badly in forecasting wintertime air pollution because of their uncertainties. Generally, such uncertainties can be decreased effectively by techniques such as data assimilation (DA) and model output statistics (MOS). However, the relative importance and combined effects of the two techniques have not been clarified. Here, a one-month air quality forecast with the Weather Research and Forecasting-Chemistry (WRF-Chem) model was carried out in a virtually operational setup focusing on Hebei Province, China. Meanwhile, three-dimensional variational (3DVar) DA and MOS based on one-dimensional Kalman filtering were implemented separately and simultaneously to investigate their performance in improving the model forecast. Comparison with observations shows that the chemistry forecast with MOS outperforms that with 3DVar DA, which could be seen in all the species tested over the whole 72 forecast hours. Combined use of both techniques does not guarantee a better forecast than MOS only, with the improvements and degradations being small and appearing rather randomly. Results indicate that the implementation of MOS is more suitable than 3DVar DA in improving the operational forecasting ability of WRF-Chem.
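The one-dimensional Kalman filtering used here for MOS can be illustrated with a minimal sketch (the bias-as-state formulation, noise variances and example values are illustrative assumptions, not the paper's configuration):

```python
def kalman_bias_correct(forecasts, observations, q=0.01, r=1.0):
    """Sketch of a one-dimensional Kalman filter that tracks the running
    bias of a forecast (state x ~ forecast minus truth) and removes it
    from subsequent forecasts.  q is the process noise variance and r the
    observation noise variance; both are illustrative values."""
    x, p = 0.0, 1.0          # initial bias estimate and its variance
    corrected = []
    for f, o in zip(forecasts, observations):
        corrected.append(f - x)      # apply the current bias estimate
        p = p + q                    # predict: bias assumed persistent
        k = p / (p + r)              # Kalman gain
        x = x + k * ((f - o) - x)    # update with the newly observed bias
        p = (1 - k) * p
    return corrected

# A forecast with a constant +2 bias converges toward the observations:
obs = [10.0] * 20
fcst = [12.0] * 20
out = kalman_bias_correct(fcst, obs)
```

The first corrected value is unchanged (no bias has been learned yet), while later values approach the observations, which is the essential behaviour of a Kalman-filter MOS.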
A hybrid group method of data handling with discrete wavelet transform for GDP forecasting
NASA Astrophysics Data System (ADS)
Isa, Nadira Mohamed; Shabri, Ani
2013-09-01
This study proposes the application of a hybrid model using the Group Method of Data Handling (GMDH) and the Discrete Wavelet Transform (DWT) in time series forecasting. The objective of this paper is to examine the flexibility of the hybrid GMDH in time series forecasting using Gross Domestic Product (GDP) data. A time series data set is used in this study to demonstrate the effectiveness of the forecasting model. These data are used to forecast through an application aimed at handling real-life time series. The experiment compares the performance of the hybrid model against single models: Wavelet-Linear Regression (WR), Artificial Neural Network (ANN), and conventional GMDH. It is shown that the proposed model can provide a promising alternative technique for GDP forecasting.
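The wavelet decomposition step of such a hybrid can be sketched with a single-level Haar DWT (an illustrative choice of wavelet; the paper's filters are not specified here):

```python
import math

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform: splits a series
    of even length into approximation (low-frequency) and detail
    (high-frequency) coefficients.  A sketch of the decomposition a
    Wavelet-GMDH hybrid would apply before fitting separate models to
    each sub-series."""
    s = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse transform: perfectly reconstructs the original series."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s, (a - d) / s])
    return out
```

In a hybrid scheme, forecasts made on the approximation and detail series are recombined with the inverse transform to yield the final GDP forecast.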
Karpušenkaitė, Aistė; Ruzgas, Tomas; Denafas, Gintaras
2018-05-01
The aim of the study was to create a hybrid forecasting method that could produce more accurate forecasts than the previously used 'pure' time series methods. Those methods had already been tested on total automotive waste, hazardous automotive waste, and total medical waste generation, but demonstrated at least a 6% error rate in different cases, and efforts were made to decrease it further. The newly developed hybrid models used a random start generation method to incorporate the advantages of different time-series methods, which helped increase forecast accuracy by 3%-4% in the hazardous automotive waste and total medical waste generation cases; the new model did not increase the accuracy of total automotive waste generation forecasts. The developed models' ability to produce short- and mid-term forecasts was tested using the prediction horizon.
2014-04-01
The Weather Research and Forecasting (WRF) model is a numerical weather prediction system designed for operational forecasting and atmospheric research. This report examined the WRF model... Keywords: WRF, weather research and forecasting, atmospheric effects.
Fuzzy logic-based analogue forecasting and hybrid modelling of horizontal visibility
NASA Astrophysics Data System (ADS)
Tuba, Zoltán; Bottyán, Zsolt
2018-04-01
Forecasting visibility is one of the greatest challenges in aviation meteorology; at the same time, high-accuracy visibility forecasts can significantly reduce, or make avoidable, weather-related risk in aviation. To improve visibility forecasting, this research links fuzzy logic-based analogue forecasting and post-processed numerical weather prediction model outputs in a hybrid forecast. The performance of the analogue forecasting model was improved by applying the Analytic Hierarchy Process. Then, a linear combination of the two outputs was applied to create an ultra-short-term hybrid visibility prediction that gradually shifts the focus from statistical to numerical products, taking advantage of each during the forecast period. This makes it possible to bring the numerical visibility forecast closer to the observations even when it is initially wrong. Complete verification of the categorical forecasts was carried out; results are also available for persistence and terminal aerodrome forecasts (TAF) for comparison. The average Heidke Skill Score (HSS) of the examined airports shows very similar results for the analogue and hybrid forecasts, even at the end of the forecast period where the weight of the analogue prediction in the final hybrid output is only 0.1-0.2. However, in the case of poor visibility (1000-2500 m), hybrid (0.65) and analogue forecasts (0.64) have a similar average HSS in the first 6 h of the forecast period, and perform better than persistence (0.60) or TAF (0.56). An important achievement is that the hybrid model takes into account the physics and dynamics of the atmosphere through the increasing share of the numerical weather prediction; in spite of this, its performance matches the most effective visibility forecasting methods and does not follow the poor verification results of purely numerical outputs.
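The lead-time-dependent linear combination can be sketched as follows (the linear weight schedule and the 9 h horizon are illustrative assumptions; the actual weights in the paper come from verification statistics):

```python
def hybrid_visibility(analogue, nwp, lead_hours, horizon=9.0):
    """Sketch of the blending described above: the hybrid forecast starts
    close to the statistical (analogue) value and shifts toward the
    numerical (NWP) value as lead time grows.  analogue and nwp are
    visibility forecasts in metres; horizon is the lead time (hours)
    at which the blend becomes purely numerical."""
    w = max(0.0, 1.0 - lead_hours / horizon)   # analogue weight: 1 -> 0
    return w * analogue + (1.0 - w) * nwp
```

At lead 0 the hybrid equals the analogue forecast; at or beyond the horizon it equals the NWP forecast, mirroring the 0.1-0.2 residual analogue weight reported near the end of the forecast period.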
NASA Astrophysics Data System (ADS)
Kozel, Tomas; Stary, Milos
2017-12-01
The main advantage of stochastic forecasting is the fan of possible values it provides, which a deterministic forecasting method cannot give. The future development of a random process is described better by stochastic than by deterministic forecasting, and discharge in a measurement profile can be categorized as a random process. This article covers the construction and application of a forecasting model for a managed large open water reservoir with a supply function. The model is based on neural networks (NN) and zone models, and forecasts values of average monthly flow from input values of average monthly flow, the trained neural network and random numbers. Part of the data was sorted into one moving zone, created around the last measured average monthly flow, and the correlation matrix was assembled only from data belonging to that zone. The model was compiled for forecasts of 1 to 12 months, using from 2 to 11 backward monthly flows (the neural network inputs) for model construction. The data were rid of asymmetry with the help of the Box-Cox rule (Box and Cox, 1964), with the value r found by optimization; in the next step the data were transformed to a standard normal distribution. The data have a monthly step and the forecast is not recurring. A 90-year-long real flow series was used to compile the model: the first 75 years were used for calibration of the model (the input-output relationship matrix), and the last 15 years were used only for validation. The outputs of the model were compared with the real flow series. For the comparison between the real flow series (100% forecast success) and the forecasts, both were applied to the management of an artificial reservoir: the course of water reservoir management using a genetic algorithm (GA) with the real flow series was compared with a fuzzy model using forecasts from the moving-zone model. During the evaluation process, the best zone size was sought.
Results show that the highest number of inputs did not give the best results; the ideal zone size lies in the interval from 25 to 35, where the course of management was almost the same for all values in that interval. The resulting course of management was compared with the course obtained using the GA with the real flow series. The comparison showed that the fuzzy model with forecasted values was able to handle the main malfunctions and the artificial disturbances introduced by the model, as found after the water volumes during management were evaluated. The forecasting model in combination with the fuzzy model provides very good results in the management of a water reservoir with a storage function and can be recommended for this purpose.
Dispersion Modeling Using Ensemble Forecasts Compared to ETEX Measurements.
NASA Astrophysics Data System (ADS)
Straume, Anne Grete; N'dri Koffi, Ernest; Nodop, Katrin
1998-11-01
Numerous numerical models have been developed to predict the long-range transport of hazardous air pollution in connection with accidental releases. When evaluating and improving such a model, it is important to detect uncertainties connected to the meteorological input data. A Lagrangian dispersion model, the Severe Nuclear Accident Program, is used here to investigate the effect of errors in the meteorological input data due to analysis error. An ensemble forecast, produced at the European Centre for Medium-Range Weather Forecasts, is used as model input. The ensemble forecast members are generated by perturbing the initial meteorological fields of the weather forecast. The perturbations are calculated from singular vectors meant to represent possible forecast developments generated by instabilities in the atmospheric flow during the early part of the forecast; the instabilities are generated by errors in the analyzed fields. Puff predictions from the dispersion model, using ensemble forecast input, are compared, and a large spread in the predicted puff evolutions is found. This shows that the quality of the meteorological input data is important for the success of the dispersion model. To evaluate the dispersion model, the calculations are compared with measurements from the European Tracer Experiment. The model predicts the shape and arrival time of the measured puff fairly well, up to 60 h after the start of the release, although the modeled puff is too narrow in the advection direction.
An improved Multimodel Approach for Global Sea Surface Temperature Forecasts
NASA Astrophysics Data System (ADS)
Khan, M. Z. K.; Mehrotra, R.; Sharma, A.
2014-12-01
The concept of ensemble combinations for formulating improved climate forecasts has gained popularity in recent years. However, many climate models share similar physics or modeling processes, which may lead to similar (or strongly correlated) forecasts. Recent approaches for combining forecasts that take into consideration differences in model accuracy over space and time have either ignored the similarity of forecasts among the models or followed a pairwise dynamic combination approach. Here we present a basis for combining model predictions, illustrating the improvements that can be achieved if procedures for factoring in inter-model dependence are utilised. The utility of the approach is demonstrated by combining sea surface temperature (SST) forecasts from five climate models over the period 1960-2005. The variable of interest, the monthly global sea surface temperature anomaly (SSTA) on a 5° × 5° latitude-longitude grid, is predicted three months in advance to demonstrate the utility of the proposed algorithm. Results indicate that the proposed approach offers consistent and significant improvements for the majority of grid points compared to the case where the dependence among the models is ignored. Therefore, the proposed approach of combining multiple models while taking into account their interdependence provides an attractive alternative for obtaining improved climate forecasts. In addition, an approach to combine seasonal forecasts from multiple climate models with varying periods of availability is also demonstrated.
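The effect of accounting for inter-model dependence can be sketched for the two-model case (a textbook minimum-error-variance combination under assumed error statistics, not the paper's algorithm):

```python
def combine_two(f1, f2, v1, v2, c12):
    """Minimum-error-variance combination of two forecasts whose errors
    have variances v1 and v2 and covariance c12.  Setting c12 = 0
    recovers the usual inverse-variance weighting; a positive covariance
    between similar models shifts weight toward the more accurate one."""
    w1 = (v2 - c12) / (v1 + v2 - 2.0 * c12)
    w1 = min(1.0, max(0.0, w1))   # keep the weight in [0, 1]
    return w1 * f1 + (1.0 - w1) * f2
```

With independent, equally accurate models the result is a plain average; introducing a positive error covariance moves the combined forecast toward the better model, which is the intuition behind factoring in inter-model dependence.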
Potential predictability and forecast skill in ensemble climate forecast: a skill-persistence rule
NASA Astrophysics Data System (ADS)
Jin, Yishuai; Rong, Xinyao; Liu, Zhengyu
2017-12-01
This study investigates the relationship between the forecast skill for the real world (actual skill) and for the perfect model (perfect skill) in ensemble climate model forecasts, using a series of fully coupled general circulation model forecast experiments. It is found that the actual skill for sea surface temperature (SST) in seasonal forecasts is substantially higher than the perfect skill over a large part of the tropical oceans, especially the tropical Indian Ocean and the central-eastern Pacific Ocean. The higher actual skill is found to be related to the higher observational SST persistence, suggesting a skill-persistence rule: a higher SST persistence in the real world than in the model can overwhelm the model bias and produce a higher forecast skill for the real world than for the perfect model. The relation between forecast skill and persistence is further demonstrated using a first-order autoregressive (AR1) model, analytically for theoretical solutions and numerically for analogue experiments. The AR1 model study shows that the skill-persistence rule is strictly valid in the case of infinite ensemble size, but can be distorted by sampling errors and non-AR1 processes. This study suggests that the so-called "perfect skill" is model dependent and cannot serve as an accurate estimate of the true upper limit of real-world prediction skill, unless the model can capture at least the persistence property of the observations.
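The AR1 argument can be sketched in a few lines (the persistence values 0.9 and 0.8 are illustrative, not estimates from the paper):

```python
def ar1_skill(rho, lead):
    """For an AR(1) process x[t+1] = rho * x[t] + noise, the correlation
    between a persistence-based forecast and the verifying truth at a
    given lead time is rho ** lead.  This is the toy setting behind the
    skill-persistence rule: if the real world is more persistent than
    the model, the actual skill can exceed the 'perfect' skill."""
    return rho ** lead

# An observed SST persistence of 0.9/month vs a model persistence of 0.8:
actual = ar1_skill(0.9, 3)    # skill measured against observations
perfect = ar1_skill(0.8, 3)   # skill measured within the model world
```

Here the actual skill (0.729) exceeds the perfect skill (0.512) at a 3-month lead, despite the model being "perfect" in its own world.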
Investigating the effect of freeway congestion thresholds on decision-making inputs.
DOT National Transportation Integrated Search
2010-05-01
Congestion threshold is embedded in the congestion definition. Two basic approaches exist in : current practice for setting the congestion threshold. One common approach uses the free-flow or : unimpeded conditions as the congestion threshold. ...
Assessment of an ensemble seasonal streamflow forecasting system for Australia
NASA Astrophysics Data System (ADS)
Bennett, James C.; Wang, Quan J.; Robertson, David E.; Schepen, Andrew; Li, Ming; Michael, Kelvin
2017-11-01
Despite an increasing availability of skilful long-range streamflow forecasts, many water agencies still rely on simple resampled historical inflow sequences (stochastic scenarios) to plan operations over the coming year. We assess a recently developed forecasting system called forecast guided stochastic scenarios (FoGSS) as a skilful alternative to standard stochastic scenarios for the Australian continent. FoGSS uses climate forecasts from a coupled ocean-land-atmosphere prediction system, post-processed with the method of calibration, bridging and merging. Ensemble rainfall forecasts force a monthly rainfall-runoff model, while a staged hydrological error model quantifies and propagates hydrological forecast uncertainty through the forecast lead times. FoGSS is able to generate ensemble streamflow forecasts in the form of monthly time series to a 12-month forecast horizon. FoGSS is tested on 63 Australian catchments that cover a wide range of climates, including 21 ephemeral rivers. In all perennial and many ephemeral catchments, FoGSS provides an effective alternative to resampled historical inflow sequences. FoGSS generally produces skilful forecasts at shorter lead times (< 4 months), and transitions to climatology-like forecasts at longer lead times. Forecasts are generally reliable and unbiased. However, FoGSS does not perform well in very dry catchments (catchments that experience zero flows more than half the time in some months), sometimes producing strongly negative forecast skill and poor reliability. We attempt to improve forecasts through the use of (i) ESP rainfall forcings, (ii) different rainfall-runoff models, and (iii) a Bayesian prior that encourages the error model to return climatology forecasts in months when the rainfall-runoff model performs poorly. Of these, the use of the prior offers the clearest benefit in very dry catchments, where it moderates strongly negative forecast skill and reduces bias in some instances. However, the prior does not remedy poor reliability in very dry catchments. Overall, FoGSS is an attractive alternative to historical inflow sequences in all but the driest catchments. We discuss ways in which forecast reliability in very dry catchments could be improved in future work.
Automation of energy demand forecasting
NASA Astrophysics Data System (ADS)
Siddique, Sanzad
Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometrics models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.
Fuzzy time-series based on Fibonacci sequence for stock price forecasting
NASA Astrophysics Data System (ADS)
Chen, Tai-Liang; Cheng, Ching-Hsue; Jong Teoh, Hia
2007-07-01
Time-series models have been used to make reasonably accurate predictions in areas such as stock price movements, academic enrollments and weather. To improve the forecasting performance of fuzzy time-series models, this paper proposes a new model that incorporates the concept of the Fibonacci sequence, the framework of Song and Chissom's model and the weighted method of Yu's model. This paper employs a 5-year period of TSMC (Taiwan Semiconductor Manufacturing Company) stock price data and a 13-year period of TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) stock index data as experimental datasets. By comparing our forecasting performance with Chen's (Forecasting enrollments based on fuzzy time-series. Fuzzy Sets Syst. 81 (1996) 311-319), Yu's (Weighted fuzzy time-series models for TAIEX forecasting. Physica A 349 (2004) 609-624) and Huarng's (The application of neural networks to forecast fuzzy time series. Physica A 336 (2006) 481-491) models, we conclude that the proposed model surpasses these conventional fuzzy time-series models in accuracy.
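The Fibonacci weighting idea can be sketched as follows (weights only; the fuzzification and rule-matching steps of the full Song-Chissom-type model are omitted):

```python
def fibonacci_weights(n):
    """Normalised Fibonacci weights for the n most recent fuzzy
    relations, so that more recent observations dominate the forecast,
    with the emphasis growing in Fibonacci proportions rather than the
    linear proportions of Yu's weighted model.  Returned oldest-first."""
    fib = [1, 1]
    while len(fib) < n:
        fib.append(fib[-1] + fib[-2])
    total = float(sum(fib[:n]))
    return [f / total for f in fib[:n]]
```

For example, four recent relations receive weights 1/7, 1/7, 2/7 and 3/7, so the newest relation carries almost half the total weight.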
Evaluating probabilistic dengue risk forecasts from a prototype early warning system for Brazil
Lowe, Rachel; Coelho, Caio AS; Barcellos, Christovam; Carvalho, Marilia Sá; Catão, Rafael De Castro; Coelho, Giovanini E; Ramalho, Walter Massa; Bailey, Trevor C; Stephenson, David B; Rodó, Xavier
2016-01-01
Recently, a prototype dengue early warning system was developed to produce probabilistic forecasts of dengue risk three months ahead of the 2014 World Cup in Brazil. Here, we evaluate the categorical dengue forecasts across all microregions in Brazil, using dengue cases reported in June 2014 to validate the model. We also compare the forecast model framework to a null model, based on seasonal averages of previously observed dengue incidence. When considering the ability of the two models to predict high dengue risk across Brazil, the forecast model produced more hits and fewer missed events than the null model, with a hit rate of 57% for the forecast model compared to 33% for the null model. This early warning model framework may be useful to public health services, not only ahead of mass gatherings, but also before the peak dengue season each year, to control potentially explosive dengue epidemics. DOI: http://dx.doi.org/10.7554/eLife.11285.001 PMID:26910315
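The reported hit rates can be reproduced from a simple contingency count (the counts of 100 observed events are illustrative; the paper reports rates, not these raw counts):

```python
def hit_rate(hits, misses):
    """Hit rate (probability of detection) for categorical risk
    forecasts: the fraction of observed high-risk events that the model
    flagged in advance."""
    return hits / float(hits + misses)

# Hypothetical counts chosen to match the reported rates:
forecast_model = hit_rate(57, 43)   # forecast model: 57%
null_model = hit_rate(33, 67)       # seasonal-average null model: 33%
```

A full categorical evaluation would also count false alarms and correct rejections, but the hit rate alone already separates the two models as described above.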
Analysis of vehicular traffic flow in the major areas of Kuala Lumpur utilizing open-traffic
NASA Astrophysics Data System (ADS)
Manogaran, Saargunawathy; Ali, Muhammad; Yusof, Kamaludin Mohamad; Suhaili, Ramdhan
2017-09-01
Vehicular traffic congestion occurs when a large number of drivers crowd the road and traffic does not flow smoothly. Traffic congestion causes chaos on the road and interrupts users' daily activities. Time consumed on the road has many negative effects on productivity, social behavior, the environment and the economy. Congestion worsens and leads to havoc during emergencies such as floods, accidents and road maintenance, when the behavior of traffic flow is unpredictable and uncontrollable. Real-time and historical traffic data are critical inputs for most traffic flow analysis applications. Researchers attempt to predict traffic using simulations, as no exact model of traffic flow exists due to its high complexity. Open Traffic is an open-source platform for traffic data analysis linked to OpenStreetMap (OSM). This research aimed to study and understand the Open Traffic platform. The real-time traffic flow pattern in the Kuala Lumpur area was successfully extracted and analyzed using Open Traffic. It was observed that congestion occurs on every major road in Kuala Lumpur, mostly owing to offices and economic and commercial centers during rush hours; on some roads congestion occurs at night due to tourism activities.
Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Molthan, Andrew L.; Limaye, Ashutosh S.; Srikishen, Jayanthi
2011-01-01
Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community.
Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by geostationary satellite observations processed on virtual machines powered by Nebula.
NASA Astrophysics Data System (ADS)
Jiang, Jiang; Huang, Yuanyuan; Ma, Shuang; Stacy, Mark; Shi, Zheng; Ricciuto, Daniel M.; Hanson, Paul J.; Luo, Yiqi
2018-03-01
The ability to forecast ecological carbon cycling is imperative to land management in a world where past carbon fluxes are no longer a clear guide in the Anthropocene. However, carbon-flux forecasting has not been practiced routinely like numerical weather prediction. This study explored (1) the relative contributions of model forcing data and parameters to uncertainty in forecasting flux- versus pool-based carbon cycle variables and (2) the time points when temperature and CO2 treatments may cause statistically detectable differences in those variables. We developed an online forecasting workflow (Ecological Platform for Assimilation of Data (EcoPAD)), which facilitates iterative data-model integration. EcoPAD automates data transfer from sensor networks, data assimilation, and ecological forecasting. We used the Spruce and Peatland Responses Under Changing Experiments data collected from 2011 to 2014 to constrain the parameters in the Terrestrial Ecosystem Model, forecast carbon cycle responses to elevated CO2 and a gradient of warming from 2015 to 2024, and specify uncertainties in the model output. Our results showed that data assimilation substantially reduces forecasting uncertainties. Interestingly, we found that the stochasticity of future external forcing contributed more to the uncertainty of forecasting future dynamics of C flux-related variables than model parameters. However, the parameter uncertainty primarily contributes to the uncertainty in forecasting C pool-related response variables. Given the uncertainties in forecasting carbon fluxes and pools, our analysis showed that statistically different responses of fast-turnover pools to various CO2 and warming treatments were observed sooner than slow-turnover pools. Our study has identified the sources of uncertainty in model prediction and thus helps to improve ecological carbon cycling forecasts in the future.
The Rise of Complexity in Flood Forecasting: Opportunities, Challenges and Tradeoffs
NASA Astrophysics Data System (ADS)
Wood, A. W.; Clark, M. P.; Nijssen, B.
2017-12-01
Operational flood forecasting is currently undergoing a major transformation. Most national flood forecasting services have relied for decades on lumped, highly calibrated conceptual hydrological models running on local office computing resources, providing deterministic streamflow predictions at gauged river locations that are important to stakeholders and emergency managers. A variety of recent technological advances now make it possible to run complex, high-to-hyper-resolution models for operational hydrologic prediction over large domains, and the US National Weather Service is now attempting to use hyper-resolution models to create new forecast services and products. Yet other `increased-complexity' forecasting strategies also exist that pursue different tradeoffs between model complexity (i.e., spatial resolution, physics) and streamflow forecast system objectives. There is currently a pressing need for a greater understanding in the hydrology community of the opportunities, challenges and tradeoffs associated with these different forecasting approaches, and for a greater participation by the hydrology community in evaluating, guiding and implementing these approaches. Intermediate-resolution forecast systems, for instance, use distributed land surface model (LSM) physics but retain the agility to deploy ensemble methods (including hydrologic data assimilation and hindcast-based post-processing). Fully coupled numerical weather prediction (NWP) systems, another example, use still coarser LSMs to produce ensemble streamflow predictions either at the model scale or after sub-grid scale runoff routing. Based on the direct experience of the authors and colleagues in research and operational forecasting, this presentation describes examples of different streamflow forecast paradigms, from the traditional to the recent hyper-resolution, to illustrate the range of choices facing forecast system developers. 
We also discuss the degree to which the strengths and weaknesses of each strategy map onto the requirements for different types of forecasting services (e.g., flash flooding, river flooding, seasonal water supply prediction).
Do we need demographic data to forecast plant population dynamics?
Tredennick, Andrew T.; Hooten, Mevin B.; Adler, Peter B.
2017-01-01
Rapid environmental change has generated growing interest in forecasts of future population trajectories. Traditional population models built with detailed demographic observations from one study site can address the impacts of environmental change at particular locations, but are difficult to scale up to the landscape and regional scales relevant to management decisions. An alternative is to build models using population-level data that are much easier to collect over broad spatial scales than individual-level data. However, it is unknown whether models built using population-level data adequately capture the effects of density-dependence and environmental forcing that are necessary to generate skillful forecasts. Here, we test the consequences of aggregating individual responses when forecasting the population states (percent cover) and trajectories of four perennial grass species in a semi-arid grassland in Montana, USA. We parameterized two population models for each species, one based on individual-level data (survival, growth and recruitment) and one on population-level data (percent cover), and compared their forecasting accuracy and forecast horizons with and without the inclusion of climate covariates. For both models, we used Bayesian ridge regression to weight the influence of climate covariates for optimal prediction. In the absence of climate effects, we found no significant difference between the forecast accuracy of models based on individual-level data and models based on population-level data. Climate effects were weak, but increased forecast accuracy for two species. Increases in accuracy with climate covariates were similar between model types. In our case study, percent cover models generated forecasts as accurate as those from a demographic model. For the goal of forecasting, models based on aggregated individual-level data may offer a practical alternative to data-intensive demographic models.
Long time series of percent cover data already exist for many plant species. Modelers should exploit these data to predict the impacts of environmental change.
A Global Aerosol Model Forecast for the ACE-Asia Field Experiment
NASA Technical Reports Server (NTRS)
Chin, Mian; Ginoux, Paul; Lucchesi, Robert; Huebert, Barry; Weber, Rodney; Anderson, Tad; Masonis, Sarah; Blomquist, Byron; Bandy, Alan; Thornton, Donald
2003-01-01
We present the results of the aerosol forecast made during the Aerosol Characterization Experiment (ACE-Asia) field experiment in spring 2001, using the Georgia Tech/Goddard Global Ozone Chemistry Aerosol Radiation and Transport (GOCART) model and the meteorological forecast fields from the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The aerosol model forecast provides direct information on aerosol optical thickness and concentrations, enabling effective flight planning, while feedback from measurements constantly evaluates the model, enabling successful model improvements. We verify the model forecast skill by comparing model-predicted total aerosol extinction, dust, sulfate, and SO2 concentrations with those quantities measured by the C-130 aircraft during the ACE-Asia intensive operation period. The GEOS DAS meteorological forecast system shows excellent skill in predicting winds, relative humidity, and temperature for the ACE-Asia experiment area as well as for each individual flight, with skill scores usually above 0.7. The model is also skillful in forecasting pollution aerosols, with most scores above 0.5. The model correctly predicted the dust outbreak events and their trans-Pacific transport, but it consistently missed the high dust concentrations observed in the boundary layer. We attribute this missing dust source to the desertification regions in the Inner Mongolia Province of China, which have developed in recent years but were not included in the model during forecasting. After incorporating the desertification sources, the model is able to reproduce the observed high dust concentrations at low altitudes over the Yellow Sea. Two key elements for a successful aerosol model forecast are correct source locations, which determine where the emissions take place, and realistic forecast winds and convection, which determine where the aerosols are transported.
We demonstrate that our global model can not only account for the large-scale intercontinental transport, but also produce the small-scale spatial and temporal variations that are adequate for aircraft measurements planning.
An application of ensemble/multi model approach for wind power production forecasting
NASA Astrophysics Data System (ADS)
Alessandrini, S.; Pinson, P.; Hagedorn, R.; Decimi, G.; Sperati, S.
2011-02-01
Wind power forecasts for the 3-days-ahead period are becoming increasingly useful and important for reducing the problems of grid integration and energy price trading caused by growing wind power penetration. It is therefore clear that the accuracy of this forecast is one of the most important requirements for a successful application. The wind power forecast applied in this study is based on meteorological models that provide the 3-days-ahead wind data. A Model Output Statistics correction is then performed to reduce systematic errors caused, for instance, by a wrong representation of surface roughness or topography in the meteorological models. For this purpose a Neural Network (NN) has been trained to link the forecasted meteorological data directly to the power data. One wind farm has been examined, located in a mountainous area in the south of Italy (Sicily). First we compare the performance of a prediction based on meteorological data coming from a single model with that obtained by a combination of models (RAMS, ECMWF deterministic, LAMI). It is shown that the multi-model approach reduces the day-ahead normalized RMSE forecast error (normalized by nominal power) by at least 1% compared with the single-model approach. Finally, we have focused on the possibility of using the ensemble prediction system (EPS by ECMWF) to estimate the accuracy of the hourly, three-days-ahead power forecast. Contingency diagrams between the RMSE of the deterministic power forecast and the spread of the ensemble members' wind forecasts have been produced. From this first analysis it seems that the ensemble spread could be used as an indicator of the forecast's accuracy, at least for the first three days ahead.
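The NN step described above, linking forecast meteorology directly to measured power, can be sketched roughly as follows. The power curve, noise levels, and the use of scikit-learn's `MLPRegressor` are all assumptions standing in for the paper's actual network and data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Synthetic training set: forecast wind speed/direction -> measured farm power
n = 500
speed = rng.uniform(0, 25, n)            # forecast wind speed (m/s)
direction = rng.uniform(0, 360, n)       # forecast wind direction (deg)
X = np.column_stack([speed / 25.0,       # scaled speed
                     np.sin(np.radians(direction)),
                     np.cos(np.radians(direction))])
# Idealised power curve (fraction of nominal power) plus noise
power = np.clip((speed / 12.0) ** 3, 0, 1)
power[speed > 22] = 0.0                  # cut-out speed
y = np.clip(power + rng.normal(0, 0.05, n), 0, 1)

# NN maps forecast meteorology straight to power (no explicit power curve)
nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
nn.fit(X[:400], y[:400])

pred = np.clip(nn.predict(X[400:]), 0, 1)
# RMSE is already normalized by nominal power because y is a fraction of it
nrmse = np.sqrt(np.mean((pred - y[400:]) ** 2))
print("normalized RMSE: %.3f" % nrmse)
```

Normalizing the RMSE by nominal power, as in the abstract, makes scores comparable across farms of different sizes.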
NASA Astrophysics Data System (ADS)
Karlovits, G. S.; Villarini, G.; Bradley, A.; Vecchi, G. A.
2014-12-01
Forecasts of seasonal precipitation and temperature can provide information in advance of potentially costly disruptions caused by flood and drought conditions. The consequences of these adverse hydrometeorological conditions may be mitigated through informed planning and response, given useful and skillful forecasts of these conditions. However, the potential value and applicability of these forecasts is unavoidably linked to their forecast quality. In this work we evaluate the skill of four general circulation models (GCMs) from the North American Multi-Model Ensemble (NMME) project in forecasting seasonal precipitation and temperature over the continental United States. The GCMs we consider are the Geophysical Fluid Dynamics Laboratory (GFDL) CM2.1, the NASA Global Modeling and Assimilation Office (NASA-GMAO) GEOS-5, the Center for Ocean-Land-Atmosphere Studies - Rosenstiel School of Marine & Atmospheric Science (COLA-RSMAS) CCSM3, and the Canadian Centre for Climate Modelling and Analysis (CCCma) CanCM4. These models are available at a 1-degree, monthly resolution, with forecast lead times of at least nine months and up to one year. The model ensembles are compared against gridded monthly temperature and precipitation data created by the PRISM Climate Group, which represent the reference observational dataset in this work. Aspects of forecast quality are quantified using a diagnostic skill score decomposition that allows evaluation of the potential skill and of the conditional and unconditional biases associated with these forecasts. Evaluating the decomposed GCM forecast skill over the continental United States, by season and by lead time, allows for a better understanding of the utility of these models for flood and drought predictions. Moreover, it also represents a diagnostic tool that could provide model developers with feedback about the strengths and weaknesses of their models.
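A skill score decomposition of the kind mentioned above can be sketched as follows, here using the classic Murphy-style decomposition of the MSE skill score (reference: observed climatology) into potential skill, conditional bias and unconditional bias. The synthetic forecast/observation pair is illustrative; the abstract does not specify its exact decomposition.

```python
import numpy as np

def skill_decomposition(forecast, observed):
    """Decompose the MSE skill score (climatology reference) as
       SS = r^2 - (r - s_f/s_o)^2 - ((m_f - m_o)/s_o)^2,
    where r^2 is the potential skill and the second and third terms
    are the conditional and unconditional bias penalties."""
    f, o = np.asarray(forecast, float), np.asarray(observed, float)
    r = np.corrcoef(f, o)[0, 1]
    sf, so = f.std(), o.std()
    mf, mo = f.mean(), o.mean()
    potential = r ** 2
    cond_bias = (r - sf / so) ** 2
    uncond_bias = ((mf - mo) / so) ** 2
    return potential, cond_bias, uncond_bias, potential - cond_bias - uncond_bias

rng = np.random.default_rng(2)
obs = rng.normal(15.0, 5.0, 400)             # e.g. seasonal-mean temperature (deg C)
fc = 0.8 * obs + rng.normal(2.0, 2.0, 400)   # biased, imperfect forecast
pot, cb, ub, ss = skill_decomposition(fc, obs)
print("potential skill %.2f, cond. bias %.2f, uncond. bias %.2f, SS %.2f"
      % (pot, cb, ub, ss))
```

The three terms sum exactly to `1 - MSE/var(obs)`, which is what makes the decomposition useful for diagnosing whether low skill stems from correlation, amplitude errors, or mean bias.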
NASA Astrophysics Data System (ADS)
Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Todini, Ezio
2015-04-01
The negative effects of severe flood events are usually countered through structural measures that, however, do not fully eliminate flood risk. Non-structural measures, such as real-time flood forecasting and warning, are also required. Accurate stage/discharge predictions with appropriate forecast lead times are sought by decision-makers for implementing strategies to mitigate the adverse effects of floods. Traditionally, flood forecasting has been approached by using rainfall-runoff and/or flood routing modelling. Indeed, neither type of forecast can be considered a perfect representation of future outcomes, because complete knowledge of the processes involved is lacking (Todini, 2004). Nonetheless, although aware that model forecasts do not perfectly represent future outcomes, decision-makers de facto implicitly treat the forecast of water level/discharge/volume, etc. as "deterministic" and coinciding with what is going to occur. Recently the concept of Predictive Uncertainty (PU) was introduced in hydrology (Krzysztofowicz, 1999), and several uncertainty processors were developed (Todini, 2008). PU is defined as the probability of occurrence of the future realization of a predictand (water level/discharge/volume) conditional on: i) prior observations and knowledge, and ii) the available information on the future value, typically provided by one or more forecast models. Unfortunately, PU has frequently been interpreted as a measure of lack of accuracy rather than as the appropriate tool for taking the best decisions, given one or several models' forecasts. With the aim of shedding light on the benefits of appropriately using PU, a multi-temporal approach based on the MCP approach (Todini, 2008; Coccia and Todini, 2011) is here applied to stage forecasts at sites along the Upper Tiber River.
Specifically, the STAge Forecasting-Rating Curve Model Muskingum-based (STAFOM-RCM) (Barbetta et al., 2014) and the Rating-Curve Model in Real Time (RCM-RT) (Barbetta and Moramarco, 2014) are used to this end. Both models, without using rainfall information, explicitly consider, at each forecast time, the estimate of the lateral contribution along the river reach for which the stage forecast is issued at the downstream end. The analysis is performed for several reaches using different lead times according to the channel length. Barbetta, S., Moramarco, T., Brocca, L., Franchini, M. and Melone, F. 2014. Confidence interval of real-time forecast stages provided by the STAFOM-RCM model: the case study of the Tiber River (Italy). Hydrological Processes, 28(3), 729-743. Barbetta, S. and Moramarco, T. 2014. Real-time flood forecasting by relating local stage and remote discharge. Hydrological Sciences Journal, 59(9), 1656-1674. Coccia, G. and Todini, E. 2011. Recent developments in predictive uncertainty assessment based on the Model Conditional Processor approach. Hydrology and Earth System Sciences, 15, 3253-3274. doi:10.5194/hess-15-3253-2011. Krzysztofowicz, R. 1999. Bayesian theory of probabilistic forecasting via deterministic hydrologic model. Water Resources Research, 35, 2739-2750. Todini, E. 2004. Role and treatment of uncertainty in real-time flood forecasting. Hydrological Processes, 18(14), 2743-2746. Todini, E. 2008. A model conditional processor to assess predictive uncertainty in flood forecasting. International Journal of River Basin Management, 6(2), 123-137.
NASA Astrophysics Data System (ADS)
Soltanzadeh, I.; Azadi, M.; Vakili, G. A.
2011-07-01
Using Bayesian Model Averaging (BMA), an attempt was made to obtain calibrated probabilistic numerical forecasts of 2-m temperature over Iran. The ensemble employs three limited-area models (WRF, MM5 and HRM), with WRF used in five different configurations. Initial and boundary conditions for MM5 and WRF are obtained from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS); for HRM, the initial and boundary conditions come from the analysis of the Global Model Europe (GME) of the German Weather Service. The resulting seven-member ensemble was run for a period of 6 months (December 2008 to May 2009) over Iran. The 48-h raw ensemble outputs were calibrated using the BMA technique for 120 days, using a 40-day training sample of forecasts and the corresponding verification data. The calibrated probabilistic forecasts were assessed using rank histograms and attributes diagrams. Results showed that application of BMA improved the reliability of the raw ensemble. Using the weighted ensemble mean forecast as a deterministic forecast, it was found that the deterministic-style BMA forecasts usually performed better than the best member's deterministic forecast.
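The core of BMA calibration is an EM estimation of member weights and spread over a training sample; the weighted mean then serves as the deterministic-style forecast mentioned above. The sketch below is a heavily simplified Gaussian BMA (common variance, no bias-correction step) on synthetic data, not the paper's seven-member setup.

```python
import numpy as np

rng = np.random.default_rng(3)

# Training sample: K member forecasts and verifying 2-m temperatures
K, n = 3, 200
truth = rng.normal(10, 4, n)
members = np.stack([truth + rng.normal(b, s, n)
                    for b, s in [(0.5, 1.0), (-1.0, 2.0), (0.0, 1.5)]])

# EM for Gaussian BMA: weights w_k and a common spread sigma
w = np.full(K, 1.0 / K)
sigma = 2.0
for _ in range(100):
    # E-step: responsibility of member k for each training case
    dens = np.exp(-0.5 * ((truth - members) / sigma) ** 2) / sigma
    z = w[:, None] * dens
    z /= z.sum(axis=0, keepdims=True)
    # M-step: update weights and spread
    w = z.mean(axis=1)
    sigma = np.sqrt((z * (truth - members) ** 2).sum() / n)

det_forecast = w @ members           # deterministic-style BMA forecast
print("BMA weights:", np.round(w, 2))
print("RMSE best member %.2f, BMA mean %.2f" % (
    min(np.sqrt(np.mean((m - truth) ** 2)) for m in members),
    np.sqrt(np.mean((det_forecast - truth) ** 2))))
```

In the full method the mixture of Gaussians centered on the (bias-corrected) members also yields the calibrated predictive PDF, not just its mean.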
The Changing Science of Urban Transportation Planning
NASA Astrophysics Data System (ADS)
Kloster, Tom
2010-03-01
The last half of the 20th Century was the age of the automobile, and the development of bigger and faster roads defined urban planning for more than 50 years. During this period, transportation planners developed sophisticated behavior models to help predict future travel patterns in an attempt to keep pace with ever-growing congestion and public demand for more roads. By the 1990s, however, it was clear that eliminating congestion with new road capacity was an unattainable outcome, and had unintended effects that were never considered when the automobile era first emerged. Today, public expectations are rapidly evolving beyond ``building our way out'' of congestion, and toward more complex definitions of desired outcomes in transportation planning. In this new century, planners must improve behavior models to predict not only the travel patterns of the future, but also the subsequent environmental, social and public health effects associated with growth and changes in travel behavior, and provide alternative transportation solutions that respond to these broader outcomes.
Evaluation of Flood Forecast and Warning in Elbe river basin - Impact of Forecaster's Strategy
NASA Astrophysics Data System (ADS)
Danhelka, Jan; Vlasak, Tomas
2010-05-01
The Czech Hydrometeorological Institute (CHMI) is responsible for flood forecasting and warning in the Czech Republic. To this end, CHMI operates hydrological forecasting systems and publishes flow forecasts for selected profiles. Flood forecasting and warning is the output of a system that links observation (flow and atmosphere), data processing, weather forecasting (especially NWP QPF), hydrological modeling, and the evaluation and interpretation of model outputs by a forecaster. Forecast users are interested in the final output, without separating the uncertainties of the individual steps of the process described above. Therefore, an evaluation of the final operational forecasts produced by the AquaLog forecasting system during the period 2002 to 2008 was carried out for profiles within the Elbe river basin. The effects of the uncertainties of observation, data processing and, especially, meteorological forecasts were not accounted for separately. Forecasting the exceedance of flood levels (peak over threshold) during the forecasting period was the main criterion, as the forecast of flow increase is of the highest importance. Other evaluation criteria included peak flow and volume differences. In addition, the Nash-Sutcliffe efficiency was computed separately for each time step (1 to 48 h) of the forecasting period to identify its change with lead time. Textual flood warnings are issued for administrative regions to initiate flood protection actions when floods threaten. The flood warning hit rate was evaluated at the regional and national levels. The evaluation found significant differences in model forecast skill between forecasting profiles; in particular, less skill was found at small headwater basins due to the domination of QPF uncertainty in these basins. The average hit rate was 0.34 (miss rate = 0.33, false alarm rate = 0.32). However, the spatial differences found are likely to be influenced also by the different fit of parameter sets (due to different basin characteristics) and, importantly, by the different impact of the human factor.
Results suggest that the practice of interactive model operation, experience and forecasting strategy differ between the responsible forecasting offices. Warning is based on the interpretation of model outputs by a hydrologist-forecaster. The warning hit rate reached 0.60 for the threshold set to the lowest flood stage, of which 0.11 was underestimation of the flood degree (miss 0.22, false alarm 0.28). The critical success index of the model forecast was 0.34, while the same criterion for warning reached 0.55. We attribute this increase not only to the change of scale from a single forecasting point to a region for warning, but partly also to the forecaster's added value. There is no official warning strategy preferred in the Czech Republic (e.g., tolerance of a higher false alarm rate). Therefore the forecaster's decisions and personal strategy are of great importance. Results show quite successful warning for 1st flood level exceedance, over-warning for the 2nd flood level, but under-warning for the 3rd (highest) flood level. This suggests a general forecaster preference for medium-level warnings (the 2nd flood level is legally determined to be the start of the flood and of flood protection activities). In conclusion, the human forecaster's experience and analysis skill increase flood warning performance notably. However, society's preferences should be specifically addressed in the warning strategy definition to support the forecaster's decision-making.
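The categorical verification scores used in this evaluation follow directly from a 2x2 contingency table of warnings versus observed floods. A minimal sketch, with illustrative counts only (the abstract reports rates, not the underlying counts):

```python
def contingency_scores(hits, misses, false_alarms):
    """Categorical verification scores for event warnings:
    POD (hit rate)          = hits / (hits + misses)
    FAR (false alarm ratio) = false_alarms / (hits + false_alarms)
    CSI (critical success)  = hits / (hits + misses + false_alarms)"""
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    csi = hits / (hits + misses + false_alarms)
    return pod, far, csi

# Illustrative counts only, not the CHMI data
pod, far, csi = contingency_scores(hits=60, misses=22, false_alarms=28)
print("POD %.2f  FAR %.2f  CSI %.2f" % (pod, far, csi))
```

The CSI penalizes both misses and false alarms, which is why it is a stricter summary than the hit rate alone and why the warning-versus-model comparison in the abstract is made with it.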
NASA Astrophysics Data System (ADS)
Elmore, K. L.
2016-12-01
The Meteorological Phenomena Identification Near the Ground (mPING) project is an example of a crowd-sourced, citizen science effort to gather data of sufficient quality and quantity needed by new post-processing methods that use machine learning. Transportation and infrastructure are particularly sensitive to precipitation type in winter weather. We extract attributes from operational numerical forecast models and use them in a random forest to generate forecast winter precipitation types. We find that random forests applied to forecast soundings are effective at generating skillful forecasts of surface precipitation type (ptype), with considerably more skill than the current algorithms, especially for ice pellets and freezing rain. We also find that three very different forecast models yield similar overall results, showing that random forests are able to extract essentially equivalent information from different forecast models. We also show that the random forest for each model and each profile type is unique to the particular forecast model, and that random forests developed using a particular model suffer significant degradation when given attributes derived from a different model. This implies that no single algorithm can perform well across all forecast models. Clearly, random forests extract information unavailable to "physically based" methods, because the physical information in the models does not appear as we expect. One interesting result is that results for the classic "warm nose" sounding profile are, by far, the most sensitive to the particular forecast model, but this profile is also the one for which random forests are most skillful. Finally, a method for calibrating probabilities for each ptype using multinomial logistic regression is shown.
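The random-forest ptype classification described above can be sketched as follows. The two sounding attributes, the labeling rule standing in for mPING reports, and the use of scikit-learn are all assumptions; the actual study uses many more attributes from full forecast soundings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)

# Synthetic sounding attributes: surface temperature and warmest
# temperature aloft (deg C), crude stand-ins for full profile attributes
n = 1000
t_sfc = rng.uniform(-10, 8, n)
t_aloft = t_sfc + rng.uniform(-2, 8, n)

# Crude labeling rule standing in for observed mPING ptype reports
ptype = np.where(t_sfc > 1, "rain",
         np.where(t_aloft < 0, "snow",
          np.where(t_sfc > -2, "freezing rain", "ice pellets")))

X = np.column_stack([t_sfc, t_aloft])
rf = RandomForestClassifier(n_estimators=100, random_state=0)
rf.fit(X[:800], ptype[:800])       # train on forecast attributes + reports
acc = rf.score(X[800:], ptype[800:])
print("holdout accuracy: %.2f" % acc)
```

The abstract's cross-model degradation result would correspond here to scoring `rf` on attributes drawn from a differently biased model: the forest learns model-specific structure, not just physics.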
A 30-day-ahead forecast model for grass pollen in north London, United Kingdom.
Smith, Matt; Emberlin, Jean
2006-03-01
A 30-day-ahead forecast method has been developed for grass pollen in north London. The total period of the grass pollen season is covered by eight multiple regression models, each covering a 10-day period running consecutively from 21 May to 8 August. This means that three models were used for each 30-day forecast. The forecast models were produced using grass pollen and environmental data from 1961 to 1999 and tested on data from 2000 and 2002. Model accuracy was judged in two ways: the number of times the forecast model was able to successfully predict the severity (relative to the 1961-1999 dataset as a whole) of grass pollen counts in each of the eight forecast periods on a scale of 1 to 4; the number of times the forecast model was able to predict whether grass pollen counts were higher or lower than the mean. The models achieved 62.5% accuracy in both assessment years when predicting the relative severity of grass pollen counts on a scale of 1 to 4, which equates to six of the eight 10-day periods being forecast correctly. The models attained 87.5% and 100% accuracy in 2000 and 2002, respectively, when predicting whether grass pollen counts would be higher or lower than the mean. Attempting to predict pollen counts during distinct 10-day periods throughout the grass pollen season is a novel approach. The models also employed original methodology in the use of winter averages of the North Atlantic Oscillation to forecast 10-day means of allergenic pollen counts.
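Each of the eight 10-day windows above is served by its own multiple regression; one window can be sketched as below. The two predictors (winter-average NAO and a spring temperature anomaly) and all data are invented for illustration; the paper's exact predictor set is not specified here.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic predictors standing in for the 1961-1999 environmental data:
# winter-average NAO index and a spring temperature anomaly (assumptions)
n_years = 39
nao = rng.normal(0, 1, n_years)
spring_t = rng.normal(0, 1, n_years)
pollen = 50 + 12 * nao - 8 * spring_t + rng.normal(0, 5, n_years)  # 10-day mean count

# Fit one multiple-regression model (one of the eight 10-day windows)
A = np.column_stack([np.ones(n_years), nao, spring_t])
coef, *_ = np.linalg.lstsq(A, pollen, rcond=None)

# Forecast a new season and classify it relative to the training mean
forecast = coef @ [1.0, 0.8, -0.3]
print("forecast 10-day mean: %.1f (training mean %.1f) -> %s than average"
      % (forecast, pollen.mean(), "higher" if forecast > pollen.mean() else "lower"))
```

The paper's severity classes (1 to 4) would be obtained by comparing the same forecast value against quantiles of the 1961-1999 training distribution rather than just its mean.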
DOT National Transportation Integrated Search
2003-05-06
Congestion pricing can potentially reduce congestion by providing incentives for drivers to shift trips to off-peak periods, use less congested routes, or use alternative modes, thereby spreading out demand for available transportation infrastructure...
NASA Astrophysics Data System (ADS)
Mohite, A. R.; Beria, H.; Behera, A. K.; Chatterjee, C.; Singh, R.
2016-12-01
Flood forecasting using hydrological models is an important and cost-effective non-structural flood management measure. For forecasting at short lead times, empirical models using real-time precipitation estimates have proven to be reliable. However, their skill depreciates with increasing lead time. Coupling a hydrologic model with real-time rainfall forecasts issued from numerical weather prediction (NWP) systems can increase the lead time substantially. In this study, we compared 1-5 day precipitation forecasts from the India Meteorological Department (IMD) Multi-Model Ensemble (MME) with European Centre for Medium-Range Weather Forecasts (ECMWF) NWP forecasts for over 86 major river basins in India. We then evaluated the hydrologic utility of these forecasts over the Basantpur catchment (approx. 59,000 km2) of the Mahanadi River basin. Coupled MIKE 11 RR (NAM) and MIKE 11 hydrodynamic (HD) models were used for the development of the flood forecast system (FFS). The RR model was calibrated using IMD station rainfall data. Cross-sections extracted from SRTM 30 were used as input to the MIKE 11 HD model. IMD started issuing operational MME forecasts in 2008, and hence both the statistical and the hydrologic evaluation were carried out for 2008-2014. The performance of the FFS was evaluated using both NWP datasets separately for the year 2011, which was a large flood year in the Mahanadi River basin. We will present figures and metrics for the statistical (threshold-based statistics, skill in terms of correlation and bias) and hydrologic (Nash-Sutcliffe efficiency, mean and peak error statistics) evaluations. The statistical evaluation will be at the pan-India scale for all major river basins, and the hydrologic evaluation will be for the Basantpur catchment of the Mahanadi River basin.
Hybrid model for forecasting time series with trend, seasonal and calendar variation patterns
NASA Astrophysics Data System (ADS)
Suhartono; Rahayu, S. P.; Prastyo, D. D.; Wijayanti, D. G. P.; Juliyanto
2017-09-01
Most monthly time series data in economics and business in Indonesia and other Muslim countries not only contain trend and seasonal patterns, but are also affected by two types of calendar variation effects, i.e. the effect of the number of working or trading days and holiday effects. The purpose of this research is to develop a hybrid model, a combination of several forecasting models, to predict time series that contain trend, seasonal and calendar variation patterns. This hybrid model is a combination of classical models (namely time series regression and ARIMA models) and/or modern methods (an artificial intelligence method, i.e. Artificial Neural Networks). A simulation study was used to show that the proposed procedure for building the hybrid model works well for forecasting time series with trend, seasonal and calendar variation patterns. Furthermore, the proposed hybrid model is applied to forecasting real data, i.e. monthly data on the inflow and outflow of currency at Bank Indonesia. The results show that the hybrid model tends to provide more accurate forecasts than the individual forecasting models. Moreover, this result is in line with the findings of the M3 competition, i.e. that combined models on average provide more accurate forecasts than individual models.
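The hybrid idea above, a classical regression stage for trend, seasonality and calendar effects, followed by a second model on what the regression leaves behind, can be sketched as follows. The synthetic series, the holiday dummy, and the AR(1) second stage (standing in for the paper's ARIMA/ANN stage) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic monthly series with trend, seasonality and a calendar-variation
# effect: a dummy marking the months containing a moving holiday
n = 120
t = np.arange(n)
month = t % 12
holiday = np.zeros(n)
holiday[rng.choice(n, 10, replace=False)] = 1.0
y = 0.5 * t + 10 * np.sin(2 * np.pi * month / 12) + 25 * holiday + rng.normal(0, 2, n)

# Stage 1: time-series regression on trend + seasonal dummies + calendar dummy
season = np.eye(12)[month]
A = np.column_stack([np.ones(n), t, season[:, 1:], holiday])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta

# Stage 2 (the hybrid step): model the residuals, here with a simple AR(1)
phi = np.dot(resid[1:], resid[:-1]) / np.dot(resid[:-1], resid[:-1])
fitted = A @ beta
fitted[1:] += phi * resid[:-1]
rmse_stage1 = np.sqrt(np.mean(resid ** 2))
rmse_hybrid = np.sqrt(np.mean((y - fitted) ** 2))
print("RMSE stage 1: %.2f  hybrid: %.2f" % (rmse_stage1, rmse_hybrid))
```

In the paper's setup the second stage would be an ARIMA model or neural network on the regression residuals; the hybrid pays off exactly when those residuals still carry structure.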
NASA Astrophysics Data System (ADS)
Shair, Syazreen Niza; Yusof, Aida Yuzi; Asmuni, Nurin Haniah
2017-05-01
Coherent mortality forecasting models have recently received increasing attention, particularly in their application to sub-populations. The advantage of coherent models over independent models is the ability to forecast non-divergent mortality for two or more sub-populations. One such coherent model was recently developed by [1] and is known as the product-ratio model. This model is an extension of the functional independent model of [2]. The product-ratio model has been applied in a developed country, Australia [1], and has been extended to a developing nation, Malaysia [3]. While [3] accounted for the coherency of mortality rates between genders and ethnic groups, the coherency between states in Malaysia has never been explored. This paper forecasts the mortality rates of Malaysian sub-populations according to state using the product-ratio coherent model and its independent version, the functional independent model. The forecast accuracies of the two models are evaluated using out-of-sample error measurements: the mean absolute forecast error (MAFE) for age-specific death rates and the mean forecast error (MFE) for life expectancy at birth. We employ Malaysian mortality time series data from 1991 to 2014, segregated by age, gender and state.
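The two out-of-sample error measurements named above are straightforward to compute; a minimal sketch with invented holdout numbers (not the Malaysian data) follows.

```python
import numpy as np

def mafe(actual, forecast):
    """Mean absolute forecast error, averaged over the holdout points."""
    return np.mean(np.abs(np.asarray(actual) - np.asarray(forecast)))

def mfe(actual, forecast):
    """Mean forecast error; its sign shows systematic over/under-forecasting."""
    return np.mean(np.asarray(actual) - np.asarray(forecast))

# Toy out-of-sample comparison: life expectancy at birth over 5 holdout years
actual   = [74.1, 74.3, 74.6, 74.8, 75.0]
coherent = [74.0, 74.4, 74.5, 74.9, 75.2]   # product-ratio model forecast
indep    = [73.6, 73.9, 74.0, 74.2, 74.3]   # functional independent model

print("MFE  coherent %+.2f  independent %+.2f" % (mfe(actual, coherent), mfe(actual, indep)))
print("MAFE coherent  %.2f  independent  %.2f" % (mafe(actual, coherent), mafe(actual, indep)))
```

MAFE summarizes the size of the errors regardless of direction, while MFE keeps the sign, so a positive MFE here would indicate the model systematically under-forecasts life expectancy.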
Understanding the topological characteristics and flow complexity of urban traffic congestion
NASA Astrophysics Data System (ADS)
Wen, Tzai-Hung; Chin, Wei-Chien-Benny; Lai, Pei-Chun
2017-05-01
For a growing number of developing cities, the capacities of streets cannot meet the rapidly growing demand of cars, causing traffic congestion. Understanding the spatial-temporal process of traffic flow and detecting traffic congestion are important issues associated with developing sustainable urban policies to resolve congestion. Therefore, the objective of this study is to propose a flow-based ranking algorithm for investigating traffic demands in terms of the attractiveness of street segments and flow complexity of the street network based on turning probability. Our results show that, by analyzing the topological characteristics of streets and volume data for a small fraction of street segments in Taipei City, the most congested segments of the city were identified successfully. The identified congested segments are significantly close to the potential congestion zones, including the officially announced most congested streets, the segments with slow moving speeds at rush hours, and the areas near significant landmarks. The identified congested segments also captured congestion-prone areas concentrated in the business districts and industrial areas of the city. Identifying the topological characteristics and flow complexity of traffic congestion provides network topological insights for sustainable urban planning, and these characteristics can be used to further understand congestion propagation.
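A flow-based ranking driven by turning probabilities, as described above, can be sketched as a stationary random walk on the street graph. The 4-segment network and its turning probabilities below are invented; the paper derives them from observed volumes on a small fraction of Taipei's segments.

```python
import numpy as np

# Toy street network: segments as nodes, turning probabilities as edges.
# P[i, j] = probability that a vehicle leaving segment i turns onto segment j.
P = np.array([
    [0.0, 0.7, 0.3, 0.0],
    [0.2, 0.0, 0.5, 0.3],
    [0.1, 0.6, 0.0, 0.3],
    [0.0, 0.5, 0.5, 0.0],
])

# Stationary flow share of each segment: the leading left eigenvector of P,
# found by power iteration (a PageRank-style "attractiveness" ranking)
flow = np.full(4, 0.25)
for _ in range(200):
    flow = flow @ P
    flow /= flow.sum()

ranking = np.argsort(flow)[::-1]
print("flow shares:", np.round(flow, 3))
print("most demanded segment:", ranking[0])
```

Segments with high stationary flow share relative to their capacity are the congestion candidates; in this toy network segment 1 attracts the largest share of turning flow.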
Feld, April; Madden-Baer, Rose; McCorkle, Ruth
2016-01-01
The Centers for Medicare and Medicaid Services Innovation Center's Episode-Based Payment initiatives present a large opportunity to reduce cost from waste and variation, and stand to align hospitals, physicians, and postacute providers in a redesign of care that achieves savings and improves quality. Community-based organizations are at the forefront of this care redesign through innovative models of care aimed at bridging gaps in care coordination and reducing hospital readmissions. This article describes a community-based provider's approach to participation under the Bundled Payments for Care Improvement initiative and a 90-day model of care for congestive heart failure in home care.
An application of ensemble/multi model approach for wind power production forecast.
NASA Astrophysics Data System (ADS)
Alessandrini, S.; Decimi, G.; Hagedorn, R.; Sperati, S.
2010-09-01
Three-day-ahead wind power forecasts are becoming increasingly useful and important for easing grid integration and energy price trading as wind power penetration grows, so the accuracy of this forecast is one of the most important requirements for a successful application. The wind power forecast is based on a mesoscale meteorological model that provides the three-day-ahead wind data. A Model Output Statistics correction is then performed to reduce systematic errors caused, for instance, by a poor representation of surface roughness or topography in the meteorological model. The corrected wind data are then used as input to the wind farm power curve to obtain the power forecast. These computations require historical time series of measured wind data (from an anemometer located in the wind farm or on the nacelle) and power data, so that the statistical analysis can be performed on the past. For this purpose a Neural Network (NN) is trained on the past data and then applied to the forecast task. Because anemometer measurements are not always available in a wind farm, a different approach has also been adopted: training the NN to link the forecast meteorological data directly to the power data. The normalized RMSE forecast error is lower in most cases with this second approach. We examined two wind farms, one located on flat terrain in Denmark and one in a mountainous area of southern Italy (Sicily). In both cases we compare the performance of a prediction based on meteorological data from a single model with that obtained using two or more models (RAMS, ECMWF deterministic, LAMI, HIRLAM). It is shown that the multi-model approach reduces the day-ahead normalized RMSE forecast error by at least 1% compared with the single-model approach. Moreover, a deterministic global model (e.g.
the ECMWF deterministic model) seems to reach a level of accuracy similar to that of the mesoscale models (LAMI and RAMS). Finally, we focused on the possibility of using the ensemble model (ECMWF) to estimate the accuracy of the hourly, three-day-ahead power forecast. Contingency diagrams between the RMSE of the deterministic power forecast and the spread of the ensemble members' wind forecasts have been produced. This first analysis suggests that the ensemble spread could be used as an indicator of forecast accuracy, at least for the first day ahead: low spread often corresponds to low forecast error. For longer forecast horizons the correlation between RMSE and ensemble spread decreases, becoming too low to be used for this purpose.
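The reported link between ensemble spread and deterministic forecast error can be checked with a simple correlation diagnostic. A minimal sketch (not the authors' code; the function name, array shapes, and use of Pearson correlation are assumptions):

```python
import numpy as np

def spread_error_correlation(ens_forecasts, det_forecast, observed):
    """Correlate ensemble wind-forecast spread with the absolute error of
    a deterministic power forecast, as an indicator of expected accuracy.
    ens_forecasts: (n_members, n_hours); det_forecast, observed: (n_hours,)."""
    ens = np.asarray(ens_forecasts, float)
    spread = ens.std(axis=0)                        # member spread per hour
    abs_err = np.abs(np.asarray(det_forecast, float)
                     - np.asarray(observed, float))  # deterministic error
    # High correlation suggests spread is a usable accuracy indicator
    return np.corrcoef(spread, abs_err)[0, 1]
```

A correlation near zero at longer lead times would match the abstract's finding that spread stops being informative beyond the first day.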
A Comparison of the Forecast Skills among Three Numerical Models
NASA Astrophysics Data System (ADS)
Lu, D.; Reddy, S. R.; White, L. J.
2003-12-01
Three numerical weather forecast models, MM5, COAMPS and WRF, operated in a joint effort of NOAA HU-NCAS and Jackson State University (JSU) during summer 2003, were chosen to study their forecast skill against observations. The models forecast over the same region with the same initialization, boundary conditions, forecast length and spatial resolution. The AVN global dataset was ingested as initial conditions, and a grid resolution of 27 km was chosen to represent current mesoscale models. Forecasts of 36 h length were performed, with output at 12 h intervals. The key parameters used to evaluate forecast skill include 12 h accumulated precipitation, sea level pressure, wind, surface temperature and dew point. Precipitation is evaluated statistically using conventional skill scores, the Threat Score (TS) and Bias Score (BS), for different threshold values based on 12 h rainfall observations, whereas other statistical measures such as Mean Error (ME), Mean Absolute Error (MAE) and Root Mean Square Error (RMSE) are applied to the other forecast parameters.
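The Threat Score and Bias Score used for the precipitation evaluation are simple contingency-table statistics. A minimal sketch of how they might be computed from paired 12 h totals (function name and inputs are illustrative, not from the study):

```python
def precip_skill_scores(forecast, observed, threshold):
    """Threat Score and Bias Score for precipitation >= threshold.
    TS = hits / (hits + misses + false alarms);
    BS = forecast 'yes' count / observed 'yes' count."""
    hits = false_alarms = misses = 0
    for f, o in zip(forecast, observed):
        f_yes, o_yes = f >= threshold, o >= threshold
        if f_yes and o_yes:
            hits += 1
        elif f_yes:
            false_alarms += 1
        elif o_yes:
            misses += 1
    denom_ts = hits + misses + false_alarms
    denom_bs = hits + misses
    ts = hits / denom_ts if denom_ts else float("nan")
    bs = (hits + false_alarms) / denom_bs if denom_bs else float("nan")
    return ts, bs
```

A TS of 1 is a perfect threshold forecast; BS above 1 indicates over-forecasting the event frequency.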
Load Modeling and Forecasting | Grid Modernization | NREL
NREL's work in load modeling is focused on distributed resources (such as rooftop photovoltaic systems) and changing customer energy use profiles, which require new load models for the distribution system. In addition, NREL researchers are developing load models for individual appliances.
NASA Astrophysics Data System (ADS)
Singhofen, P.
2017-12-01
The National Water Model (NWM) is a remarkable undertaking. The foundation of the NWM is a 1 square kilometer grid which is used for near real-time modeling and flood forecasting of most rivers and streams in the contiguous United States. However, the NWM falls short in highly urbanized areas with complex drainage infrastructure. To overcome these shortcomings, the presenter proposes to leverage existing local hyper-resolution H&H models and adapt the NWM forcing data to them. Gridded near real-time rainfall, short range forecasts (18-hour) and medium range forecasts (10-day) during Hurricane Irma are applied to numerous detailed H&H models in highly urbanized areas of the State of Florida. Coastal and inland models are evaluated. Comparisons of near real-time rainfall data are made with observed gaged data and the ability to predict flooding in advance based on forecast data is evaluated. Preliminary findings indicate that the near real-time rainfall data is consistently and significantly lower than observed data. The forecast data is more promising. For example, the medium range forecast data provides 2 - 3 days advanced notice of peak flood conditions to a reasonable level of accuracy in most cases relative to both timing and magnitude. Short range forecast data provides about 12 - 14 hours advanced notice. Since these are hyper-resolution models, flood forecasts can be made at the street level, providing emergency response teams with valuable information for coordinating and dispatching limited resources.
Utilizing Climate Forecasts for Improving Water and Power Systems Coordination
NASA Astrophysics Data System (ADS)
Arumugam, S.; Queiroz, A.; Patskoski, J.; Mahinthakumar, K.; DeCarolis, J.
2016-12-01
Climate forecasts, typically monthly-to-seasonal precipitation forecasts, are commonly used to develop streamflow forecasts for improving reservoir management. Despite their high skill, temperature forecasts are not often used to develop power demand forecasts alongside streamflow forecasts for improving water and power systems coordination. In this study, we consider a prototype system to analyze the utility of climate forecasts, both precipitation and temperature, for improving water and power systems coordination. The prototype system, a unit-commitment model that schedules power generation from various sources, is considered and its performance is compared with an energy system model having an equivalent reservoir representation. Different skill sets of streamflow forecasts and power demand forecasts are forced on both water and power systems representations to understand the level of model complexity required for utilizing monthly-to-seasonal climate forecasts to improve coordination between these two systems. The analyses also identify various decision-making strategies - forward purchasing of fuel stocks, scheduled maintenance of various power systems and tradeoffs on water appropriation between hydropower and other uses - in the context of various water and power systems configurations. Potential application of such analyses for integrating large power systems with multiple river basins is also discussed.
A comparative study on GM (1,1) and FRMGM (1,1) model in forecasting FBM KLCI
NASA Astrophysics Data System (ADS)
Ying, Sah Pei; Zakaria, Syerrina; Mutalib, Sharifah Sakinah Syed Abd
2017-11-01
The FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBM KLCI) is a group of indexes combined in a standardized way, used to measure the overall Malaysian market over time. Although a composite index can give investors an idea of the stock market, it is hard to predict accurately because it is volatile, so it is necessary to identify the best model to forecast the FBM KLCI. The objective of this study is to determine the more accurate forecasting model between the GM (1,1) model and the Fourier Residual Modification GM (1,1) (FRMGM (1,1)) model for forecasting the FBM KLCI. In this study, actual daily closing data of the FBM KLCI were collected from January 1, 2016 to March 15, 2016. The GM (1,1) model and the FRMGM (1,1) model were used to build the grey model and to test the forecasting power of both models. The Mean Absolute Percentage Error (MAPE) was used as the measure to determine the best model. Values forecast by the FRMGM (1,1) model differ less from the actual values than those of the GM (1,1) model for both in-sample and out-of-sample data, and the MAPE of the FRMGM (1,1) model is lower than that of the GM (1,1) model in both cases. These results show that the FRMGM (1,1) model is better than the GM (1,1) model for forecasting the FBM KLCI.
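The MAPE measure used to rank the two grey models is straightforward to compute. A hedged sketch (skipping zero actual values is one common convention, not necessarily the authors'):

```python
def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent; lower is better.
    Pairs with a zero actual value are skipped to avoid division by zero."""
    pairs = [(a, f) for a, f in zip(actual, forecast) if a != 0]
    return 100.0 * sum(abs((a - f) / a) for a, f in pairs) / len(pairs)
```

Comparing the MAPE of two models on the same in-sample and out-of-sample windows, as the study does, gives a single scalar ranking per window.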
A channel dynamics model for real-time flood forecasting
Hoos, Anne B.; Koussis, Antonis D.; Beale, Guy O.
1989-01-01
A new channel dynamics scheme (alternative system predictor in real time (ASPIRE)), designed specifically for real-time river flow forecasting, is introduced to reduce uncertainty in the forecast. ASPIRE is a storage routing model that limits the influence of catchment model forecast errors to the downstream station closest to the catchment. Comparisons with the Muskingum routing scheme in field tests suggest that the ASPIRE scheme can provide more accurate forecasts, probably because discharge observations are used to a maximum advantage and routing reaches (and model errors in each reach) are uncoupled. Using ASPIRE in conjunction with the Kalman filter did not improve forecast accuracy relative to a deterministic updating procedure. Theoretical analysis suggests that this is due to a large process noise to measurement noise ratio.
Performance of univariate forecasting on seasonal diseases: the case of tuberculosis.
Permanasari, Adhistya Erna; Rambli, Dayang Rohaya Awang; Dominic, P Dhanapal Durai
2011-01-01
Predicting the annual number of disease incidents worldwide is desirable for setting appropriate policy to prevent disease outbreaks. This chapter considers the performance of different forecasting methods in predicting the future number of disease incidents, especially for seasonal diseases. Six forecasting methods, namely linear regression, moving average, decomposition, Holt-Winters, ARIMA, and artificial neural network (ANN), were applied to forecasting monthly tuberculosis data. The model derived met the requirements of a time series with a seasonal pattern and a downward trend. Forecasting performance was compared using the same error measure on the last 5 years of forecast results. The findings indicate that the ARIMA model was the most appropriate model, since it obtained a lower relative error than the other models.
Error models for official mortality forecasts.
Alho, J M; Spencer, B D
1990-09-01
"The Office of the Actuary, U.S. Social Security Administration, produces alternative forecasts of mortality to reflect uncertainty about the future.... In this article we identify the components and assumptions of the official forecasts and approximate them by stochastic parametric models. We estimate parameters of the models from past data, derive statistical intervals for the forecasts, and compare them with the official high-low intervals. We use the models to evaluate the forecasts rather than to develop different predictions of the future. Analysis of data from 1972 to 1985 shows that the official intervals for mortality forecasts for males or females aged 45-70 have approximately a 95% chance of including the true mortality rate in any year. For other ages the chances are much less than 95%." excerpt
NASA Astrophysics Data System (ADS)
Moore, Robert J.; Wells, Steven C.; Cole, Steven J.
2016-04-01
It has been common for flood forecasting systems to be commissioned at a catchment or regional level in response to local priorities and hydrological conditions, leading to variety in system design and model choice. As systems mature and efficiencies of national management are sought, there can be a drive towards system rationalisation, gaining an overview of model performance and consideration of simplification through model-type convergence. Flood forecasting model assessments, whilst overseen at a national level, may be commissioned and managed at a catchment and regional level, take a variety of forms and be large in number. This presents a challenge when an integrated national assessment is required to guide operational use of flood forecasts and plan future investment in flood forecasting models and supporting hydrometric monitoring. This contribution reports on how a nationally consistent framework for flood forecasting model performance has been developed to embrace many past, ongoing and future assessments for local river systems by engineering consultants across England & Wales. The outcome is a Performance Summary for every site model assessed which, on a single page, contains relevant catchment information for context, a selection of overlain forecast and observed hydrographs and a set of performance statistics with associated displays of novel condensed form. One display provides performance comparison with other models that may exist for the site. The performance statistics include skill scores for forecasting events (flow/level threshold crossings) of differing severity/rarity, indicating their probability and likely timing, which have real value in an operational setting. The local models assessed can be of any type and span rainfall-runoff (conceptual and transfer function) and flow routing (hydrological and hydrodynamic) forms. 
Also accommodated by the framework is the national G2G (Grid-to-Grid) distributed hydrological model, providing area-wide coverage across the fluvial rivers of England and Wales, which can be assessed at gauged sites. Thus the performance of the national G2G model forecasts can be directly compared with that from the local models. The Performance Summary for each site model is complemented by a national spatial analysis of model performance stratified by model-type, geographical region and forecast lead-time. The map displays provide an extensive evidence-base that can be interrogated, through a Flood Forecasting Model Performance web portal, to reveal fresh insights into comparative performance across locations, lead-times and models. This work was commissioned by the Environment Agency in partnership with Natural Resources Wales and the Flood Forecasting Centre for England and Wales.
NASA Astrophysics Data System (ADS)
Smith, P. J.; Beven, K.; Panziera, L.
2012-04-01
The issuing of timely flood alerts may be dependent upon the ability to predict future values of water level or discharge at locations where observations are available. Catchments at risk of flash flooding often have a rapid natural response time, typically less than the forecast lead time desired for issuing alerts. This work focuses on the provision of short-range (up to 6 hours lead time) predictions of discharge in small catchments based on utilising radar forecasts to drive a hydrological model. An example analysis based upon the Verzasca catchment (Ticino, Switzerland) is presented. Parsimonious time series models with a mechanistic interpretation (so-called Data-Based Mechanistic models) have been shown to provide reliable, accurate forecasts in many hydrological situations. In this study such a model is developed to predict the discharge at an observed location from observed precipitation data. The model is shown to capture the snow melt response at this site. Observed discharge data are assimilated to improve the forecasts, of up to two hours lead time, that can be generated from observed precipitation. To generate forecasts with greater lead time, ensemble precipitation forecasts are utilised. In this study the Nowcasting ORographic precipitation in the Alps (NORA) product outlined in more detail elsewhere (Panziera et al. Q. J. R. Meteorol. Soc. 2011; DOI:10.1002/qj.878) is utilised. NORA precipitation forecasts are derived from historical analogues based on the radar field and upper atmospheric conditions. As such, they avoid the need to explicitly model the evolution of the rainfall field through, for example, Lagrangian diffusion. The uncertainty in the forecasts is represented by characterisation of the joint distribution of the observed discharge, the discharge forecast using the (in operational conditions unknown) future observed precipitation and that forecast utilising the NORA ensembles.
Constructing the joint distribution in this way allows the full historic record of data at the site to inform the predictive distribution. It is shown that, in part due to the limited availability of forecasts, the uncertainty in the relationship between the NORA based forecasts and other variates dominated the resulting predictive uncertainty.
The development rainfall forecasting using kalman filter
NASA Astrophysics Data System (ADS)
Zulfi, Mohammad; Hasan, Moh.; Dwidja Purnomo, Kosala
2018-04-01
Rainfall forecasting is very interesting for agricultural planning, since rainfall information is useful for decisions about when to plant certain commodities. In this study, rainfall is forecast by ARIMA and Kalman filter methods. The Kalman filter method expresses a time series model in linear state-space form to determine the future forecast, using a recursive solution to minimize error. The rainfall data in this research were clustered by K-means clustering, and the Kalman filter method was then implemented for modelling and forecasting rainfall in each cluster. We used ARIMA (p,d,q) to construct a state space for the Kalman filter model, so we have four groups of data and one model for each group. In conclusion, the Kalman filter method is better than the ARIMA model for rainfall forecasting in each group, as shown by the error of the Kalman filter method being smaller than that of the ARIMA model.
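The ARIMA-in-state-space-with-Kalman-filter idea can be illustrated with the scalar AR(1) case. A minimal sketch, assuming the AR coefficient phi and the noise variances q and r are supplied (in the study they would come from the fitted ARIMA model; here they are assumptions):

```python
import numpy as np

def kalman_ar1_forecast(y, phi, q, r):
    """One-step-ahead forecasts of a series y under the AR(1) state-space
    model x_t = phi * x_{t-1} + w_t (var q), y_t = x_t + v_t (var r),
    with the standard recursive predict/update Kalman equations."""
    x, p = y[0], 1.0                  # initial state estimate and variance
    preds = []
    for obs in y[1:]:
        # predict step: propagate state and variance forward one step
        x, p = phi * x, phi * phi * p + q
        preds.append(x)
        # update step: correct with the newly observed value
        k = p / (p + r)               # Kalman gain
        x = x + k * (obs - x)
        p = (1.0 - k) * p
    return np.array(preds)
```

In the study's setup one such filter would be run per K-means cluster, each with its own fitted parameters.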
When mechanism matters: Bayesian forecasting using models of ecological diffusion
Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.
2017-01-01
Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.
NASA Astrophysics Data System (ADS)
Omi, Takahiro; Ogata, Yosihiko; Hirata, Yoshito; Aihara, Kazuyuki
2015-04-01
Because aftershock occurrences can cause significant seismic risks for a considerable time after the main shock, prospective forecasting of the intermediate-term aftershock activity as soon as possible is important. The epidemic-type aftershock sequence (ETAS) model with the maximum likelihood estimate effectively reproduces general aftershock activity including secondary or higher-order aftershocks and can be employed for the forecasting. However, because we cannot always expect the accurate parameter estimation from incomplete early aftershock data where many events are missing, such forecasting using only a single estimated parameter set (plug-in forecasting) can frequently perform poorly. Therefore, we here propose Bayesian forecasting that combines the forecasts by the ETAS model with various probable parameter sets given the data. By conducting forecasting tests of 1 month period aftershocks based on the first 1 day data after the main shock as an example of the early intermediate-term forecasting, we show that the Bayesian forecasting performs better than the plug-in forecasting on average in terms of the log-likelihood score. Furthermore, to improve forecasting of large aftershocks, we apply a nonparametric (NP) model using magnitude data during the learning period and compare its forecasting performance with that of the Gutenberg-Richter (G-R) formula. We show that the NP forecast performs better than the G-R formula in some cases but worse in other cases. Therefore, robust forecasting can be obtained by employing an ensemble forecast that combines the two complementary forecasts. Our proposed method is useful for a stable unbiased intermediate-term assessment of aftershock probabilities.
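The difference between plug-in and Bayesian forecasting described above amounts to posterior-weighted averaging over probable parameter sets rather than trusting a single estimate. A schematic sketch (the function name and `rate_fn` interface are illustrative, not the ETAS implementation):

```python
import math

def bayesian_rate_forecast(param_sets, log_posts, rate_fn):
    """Bayesian forecast of an expected event count: average the rates
    implied by many probable parameter sets, weighted by their (log)
    posterior, instead of plugging in only the best-fit set."""
    m = max(log_posts)                              # for numerical stability
    w = [math.exp(lp - m) for lp in log_posts]      # unnormalised weights
    total = sum(w)
    return sum(wi * rate_fn(p) for wi, p in zip(w, param_sets)) / total
```

With incomplete early aftershock data, the posterior is broad, so this average hedges against the poorly constrained maximum-likelihood estimate that makes plug-in forecasts fragile.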
Recent Achievements of the Collaboratory for the Study of Earthquake Predictability
NASA Astrophysics Data System (ADS)
Jordan, T. H.; Liukis, M.; Werner, M. J.; Schorlemmer, D.; Yu, J.; Maechling, P. J.; Jackson, D. D.; Rhoades, D. A.; Zechar, J. D.; Marzocchi, W.
2016-12-01
The Collaboratory for the Study of Earthquake Predictability (CSEP) supports a global program to conduct prospective earthquake forecasting experiments. CSEP testing centers are now operational in California, New Zealand, Japan, China, and Europe with 442 models under evaluation. The California testing center, started by SCEC, Sept 1, 2007, currently hosts 30-minute, 1-day, 3-month, 1-year and 5-year forecasts, both alarm-based and probabilistic, for California, the Western Pacific, and worldwide. Our tests are now based on the hypocentral locations and magnitudes of cataloged earthquakes, but we plan to test focal mechanisms, seismic hazard models, ground motion forecasts, and finite rupture forecasts as well. We have increased computational efficiency for high-resolution global experiments, such as the evaluation of the Global Earthquake Activity Rate (GEAR) model, introduced Bayesian ensemble models, and implemented support for non-Poissonian simulation-based forecast models. We are currently developing formats and procedures to evaluate externally hosted forecasts and predictions. CSEP supports the USGS program in operational earthquake forecasting and a DHS project to register and test external forecast procedures from experts outside seismology. We found that earthquakes as small as magnitude 2.5 provide important information on subsequent earthquakes larger than magnitude 5. A retrospective experiment for the 2010-2012 Canterbury earthquake sequence showed that some physics-based and hybrid models outperform catalog-based (e.g., ETAS) models. This experiment also demonstrates the ability of the CSEP infrastructure to support retrospective forecast testing. Current CSEP development activities include adoption of the Comprehensive Earthquake Catalog (ComCat) as an authorized data source, retrospective testing of simulation-based forecasts, and support for additive ensemble methods. 
We describe the open-source CSEP software that is available to researchers as they develop their forecast models. We also discuss how CSEP procedures are being adapted to intensity and ground motion prediction experiments as well as hazard model testing.
Tsunami Forecast Progress Five Years After Indonesian Disaster
NASA Astrophysics Data System (ADS)
Titov, Vasily V.; Bernard, Eddie N.; Weinstein, Stuart A.; Kanoglu, Utku; Synolakis, Costas E.
2010-05-01
Almost five years after the 26 December 2004 Indian Ocean tragedy, tsunami warnings are finally benefiting from decades of research toward effective model-based forecasts. Since the 2004 tsunami, two seminal advances have been (i) deep-ocean tsunami measurements with tsunameters and (ii) their use in accurately forecasting tsunamis after the tsunami has been generated. Using direct measurements of deep-ocean tsunami heights, assimilated into numerical models for specific locations, greatly improves the real-time forecast accuracy over earthquake-derived magnitude estimates of tsunami impact. Since 2003, this method has been used to forecast tsunamis at specific harbors for different events in the Pacific and Indian Oceans. Recent tsunamis illustrated how this technology is being adopted in global tsunami warning operations. The U.S. forecasting system was used by both research and operations to evaluate the tsunami hazard. Tests demonstrated the effectiveness of operational tsunami forecasting using real-time deep-ocean data assimilated into forecast models. Several examples also showed the potential of distributed forecast tools. With IOC and USAID funding, NOAA researchers at PMEL developed the Community Model Interface for Tsunami (ComMIT) tool and distributed it through extensive capacity-building sessions in the Indian Ocean. Over a hundred scientists have been trained in tsunami inundation mapping, leading to the first generation of inundation models for many Indian Ocean shorelines. These same inundation models can also be used for real-time tsunami forecasts, as was demonstrated during several events. Contact Information Vasily V. Titov, Seattle, Washington, USA, 98115
Forearm vasodilatation following release of venous congestion
Caro, C. G.; Foley, T. H.; Sudlow, M. F.
1970-01-01
1. The volume rate of forearm blood flow was measured with a mercury-in-rubber strain gauge, or with a water-filled plethysmograph, from 1 sec after termination of a 2-3 min period of venous congestion. 2. When congesting pressure had been less than 18 mm Hg, average post-congestion flow (five subjects) was constant during approx. 10 sec and not significantly different from resting flow. 3. When congesting pressure had been 30 mm Hg, average post-congestion flow (eight subjects) was 26% higher than resting, during 3-4 sec after release of congestion, but rose to 273% of resting during 4-6 sec after release of congestion. 4. In other studies forearm vascular resistance had been found normal or increased during such venous congestion, and theoretical studies here indicated that passive mechanical factors could not account for the delayed occurrence of high post-congestion flow. 5. It appears, therefore, that the forearm vascular bed dilates actively shortly after release of substantial venous congestion. It would seem more likely that a myogenic mechanism, rather than a metabolic one, is responsible. PMID:5532541
Residential Saudi load forecasting using analytical model and Artificial Neural Networks
NASA Astrophysics Data System (ADS)
Al-Harbi, Ahmad Abdulaziz
In recent years, load forecasting has become one of the main fields of study and research. Short Term Load Forecasting (STLF) is an important part of electrical power system operation and planning. This work investigates the applicability of two approaches, Artificial Neural Networks (ANNs) and hybrid analytical models, for forecasting residential load in the Kingdom of Saudi Arabia (KSA). Both techniques are based on a formulation of human behavior modes, which represent the impact of social, religious, and official occasions and of environmental parameters. The analysis is carried out on residential areas in three regions across two countries subject to distinct human activities and weather conditions: the collected data are for Al-Khubar and Yanbu industrial city in KSA, in addition to Seattle, USA, to show the validity of the proposed models when applied to residential load. For each region, two models are proposed: the first forecasts next-hour load, while the second forecasts next-day load. Both models are analyzed using the two techniques. The ANN next-hour models yield very accurate results for all areas, while relatively reasonable results are achieved with the hybrid analytical model. For next-day load forecasting, the two approaches yield satisfactory results. Comparative studies were conducted to prove the effectiveness of the proposed models.
Liu, Fengchen; Porco, Travis C.; Amza, Abdou; Kadri, Boubacar; Nassirou, Baido; West, Sheila K.; Bailey, Robin L.; Keenan, Jeremy D.; Solomon, Anthony W.; Emerson, Paul M.; Gambhir, Manoj; Lietman, Thomas M.
2015-01-01
Background Trachoma programs rely on guidelines made in large part using expert opinion of what will happen with and without intervention. Large community-randomized trials offer an opportunity to actually compare forecasting methods in a masked fashion. Methods The Program for the Rapid Elimination of Trachoma trials estimated longitudinal prevalence of ocular chlamydial infection from 24 communities treated annually with mass azithromycin. Given antibiotic coverage and biannual assessments from baseline through 30 months, forecasts of the prevalence of infection in each of the 24 communities at 36 months were made by three methods: the sum of 15 experts’ opinion, statistical regression of the square-root-transformed prevalence, and a stochastic hidden Markov model of infection transmission (Susceptible-Infectious-Susceptible, or SIS model). All forecasters were masked to the 36-month results and to the other forecasts. Forecasts of the 24 communities were scored by the likelihood of the observed results and compared using Wilcoxon’s signed-rank statistic. Findings Regression and SIS hidden Markov models had significantly better likelihood than community expert opinion (p = 0.004 and p = 0.01, respectively). All forecasts scored better when perturbed to decrease Fisher’s information. Each individual expert’s forecast was poorer than the sum of experts. Interpretation Regression and SIS models performed significantly better than expert opinion, although all forecasts were overly confident. Further model refinements may score better, although would need to be tested and compared in new masked studies. Construction of guidelines that rely on forecasting future prevalence could consider use of mathematical and statistical models. PMID:26302380
Liu, Fengchen; Porco, Travis C; Amza, Abdou; Kadri, Boubacar; Nassirou, Baido; West, Sheila K; Bailey, Robin L; Keenan, Jeremy D; Solomon, Anthony W; Emerson, Paul M; Gambhir, Manoj; Lietman, Thomas M
2015-08-01
Trachoma programs rely on guidelines made in large part using expert opinion of what will happen with and without intervention. Large community-randomized trials offer an opportunity to actually compare forecasting methods in a masked fashion. The Program for the Rapid Elimination of Trachoma trials estimated longitudinal prevalence of ocular chlamydial infection from 24 communities treated annually with mass azithromycin. Given antibiotic coverage and biannual assessments from baseline through 30 months, forecasts of the prevalence of infection in each of the 24 communities at 36 months were made by three methods: the sum of 15 experts' opinion, statistical regression of the square-root-transformed prevalence, and a stochastic hidden Markov model of infection transmission (Susceptible-Infectious-Susceptible, or SIS model). All forecasters were masked to the 36-month results and to the other forecasts. Forecasts of the 24 communities were scored by the likelihood of the observed results and compared using Wilcoxon's signed-rank statistic. Regression and SIS hidden Markov models had significantly better likelihood than community expert opinion (p = 0.004 and p = 0.01, respectively). All forecasts scored better when perturbed to decrease Fisher's information. Each individual expert's forecast was poorer than the sum of experts. Regression and SIS models performed significantly better than expert opinion, although all forecasts were overly confident. Further model refinements may score better, although would need to be tested and compared in new masked studies. Construction of guidelines that rely on forecasting future prevalence could consider use of mathematical and statistical models. Clinicaltrials.gov NCT00792922.
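The square-root-transform regression forecaster compared in the trachoma study above can be sketched in a few lines; this is an illustration of the general idea (fit a line to sqrt-prevalence, extrapolate, back-transform), not the study's code:

```python
import numpy as np

def sqrt_regression_forecast(times, prevalences, t_future):
    """Forecast prevalence at t_future by linear regression on the
    square-root-transformed prevalence series, then back-transform."""
    t = np.asarray(times, float)
    y = np.sqrt(np.asarray(prevalences, float))   # variance-stabilising transform
    slope, intercept = np.polyfit(t, y, 1)        # least-squares line fit
    pred = slope * t_future + intercept
    return max(pred, 0.0) ** 2                    # back-transform, clipped at zero
```

In the trials this would be fit per community on the baseline-to-30-month assessments and extrapolated to 36 months, with forecasts scored by the likelihood of the observed results.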
Power systems locational marginal pricing in deregulated markets
NASA Astrophysics Data System (ADS)
Wang, Hui-Fung Francis
Since the beginning of the 1990s, the electricity business is transforming from a vertical integrating business to a competitive market operations. The generation, transmission, distribution subsystem of an electricity utility are operated independently as Genco (generation subsystem), Transco (transmission subsystem), and Distco (distribution subsystem). This trend promotes more economical inter- and intra regional transactions to be made by the participating companies and the users of electricity to achieve the intended objectives of deregulation. There are various types of electricity markets that are implemented in the North America in the past few years. However, transmission congestion management becomes a key issue in the electricity market design as more bilateral transactions are traded across long distances competing for scarce transmission resources. It directly alters the traditional concept of energy pricing and impacts the bottom line, revenue and cost of electricity, of both suppliers and buyers. In this research, transmission congestion problem in a deregulated market environment is elucidated by implementing by the Locational Marginal Pricing (LMP) method. With a comprehensive understanding of the LMP method, new mathematical tools will aid electric utilities in exploring new business opportunities are developed and presented in this dissertation. The dissertation focuses on the development of concept of (LMP) forecasting and its implication to the market participants in deregulated market. Specifically, we explore methods of developing fast LMP calculation techniques that are differ from existing LMPs. We also explore and document the usefulness of the proposed LMP in determining electricity pricing of a large scale power system. The developed mathematical tools use of well-known optimization techniques such as linear programming that are support by several flow charts. 
Fast and practical security-constrained unit commitment methods are integral parts of the LMP algorithms. The different components, optimization techniques, unit commitment, power flow analysis, and matrix manipulations for large-scale power systems, are integrated and represented by several new flow charts. The LMP concepts, processes, mathematical models, and their corresponding algorithms have been implemented to study a small six-bus test power system/market as well as the real-size New York power system/market, where transmission congestion is high and the electricity market is deregulated. The simulated results documented in the dissertation are satisfactory and very encouraging when compared with the actual Location-Based Marginal Price (LBMP) results posted by the New York Independent System Operator (ISO). Further research opportunities inspired by this dissertation are also elaborated.
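The congestion-pricing effect the dissertation studies can be sketched with a deliberately minimal two-bus example (all costs and limits below are hypothetical, not taken from the dissertation): while the tie line is uncongested, every bus sees the system marginal cost; once the line binds, the marginal megawatt at the load bus must come from the expensive local unit, and the two locational prices diverge.

```python
def two_bus_lmp(load_b, cheap_cost=20.0, dear_cost=50.0, line_limit=100.0):
    """Locational marginal prices ($/MWh) for a 2-bus system: a cheap
    generator at bus A, an expensive generator and all load at bus B,
    and one tie line A->B with a thermal limit. Dispatch is merit
    order: ship cheap energy over the line until it hits its limit."""
    if load_b <= line_limit:
        # uncongested: the marginal MW anywhere comes from the cheap unit,
        # so both buses see the same price
        return cheap_cost, cheap_cost
    # congested: the next MW at B must come from the expensive local unit
    return cheap_cost, dear_cost

print(two_bus_lmp(80.0))   # uncongested: uniform price at both buses
print(two_bus_lmp(150.0))  # congested: the LMPs separate across the line
```

In a full network the same prices emerge as the duals of the nodal power balance constraints in a security-constrained optimal power flow.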
Evaluating NMME Seasonal Forecast Skill for use in NASA SERVIR Hub Regions
NASA Technical Reports Server (NTRS)
Roberts, J. Brent; Roberts, Franklin R.
2013-01-01
The U.S. National Multi-Model Ensemble seasonal forecasting system is providing hindcast and real-time data streams to be used in assessing and improving seasonal predictive capacity. The coupled forecasts have numerous potential applications, both national and international in scope. The NASA / USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of NMME forecasts specifically for use in driving applications models in hub regions including East Africa, the Hindu Kush-Himalayan (HKH) region and Mesoamerica. A prerequisite for seasonal forecast use in application modeling (e.g. hydrology, agriculture) is bias correction and skill assessment. Efforts to address systematic biases and multi-model combination in support of NASA SERVIR impact modeling requirements will be highlighted. Specifically, quantile-quantile mapping for bias correction has been implemented for all archived NMME hindcasts. Both deterministic and probabilistic skill estimates for raw, bias-corrected, and multi-model ensemble forecasts as a function of forecast lead will be presented for temperature and precipitation. Complementing this statistical assessment will be case studies of significant events, for example, the ability of the NMME forecast suite to anticipate the 2010/2011 drought in the Horn of Africa and its relationship to evolving SST patterns.
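The quantile-quantile mapping mentioned above can be sketched with a toy empirical implementation (the climatologies below are invented; operational NMME bias correction also interpolates between ranks and handles out-of-range values):

```python
import bisect

def quantile_map(x, model_clim, obs_clim):
    """Empirical quantile-quantile bias correction: find the rank of x
    within the model hindcast climatology and return the observed value
    at the same empirical quantile."""
    model_sorted = sorted(model_clim)
    obs_sorted = sorted(obs_clim)
    # empirical quantile of x under the model climatology
    rank = bisect.bisect_left(model_sorted, x)
    rank = min(rank, len(model_sorted) - 1)
    # same-rank value in the observed climatology
    return obs_sorted[rank]

# toy example: the model runs 2 degrees too warm, and the mapping
# removes that systematic bias
model = [22, 23, 24, 25, 26]
obs = [20, 21, 22, 23, 24]
print(quantile_map(24, model, obs))  # 22
```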
Zhuo, Fan; Duan, Hucai
2017-01-01
The data sequence of spectrum sensing results injected from dedicated spectrum sensor nodes (SSNs) and the data traffic from upstream secondary users (SUs) lead to unpredictable data loads in a sensor network-aided cognitive radio ad hoc network (SN-CRN). As a result, network congestion may occur at an SU acting as a fusion center when the offered data load exceeds its available capacity, which degrades network performance. In this paper, we present an effective approach to mitigating congestion at bottlenecked SUs via a proposed distributed power control framework for SSNs over a rectangular-grid-based SN-CRN, aiming to balance resource load and avoid excessive congestion. To achieve this goal, a distributed power control framework for SSNs in the interior tier (IT) and middle tier (MT) is proposed to achieve a tradeoff between channel capacity and energy consumption. In particular, we first devise two pricing factors by considering the stability of local spectrum sensing and the spectrum sensing quality of SSNs. With the aid of these pricing factors, the utility function of the power control problem is formulated by jointly taking into account the revenue of power reduction and the cost of energy consumption for an IT or MT SSN. Combining utility function maximization with a linear differential equation constraint on energy consumption, we further formulate the power control problem as a differential game model under cooperation or noncooperation scenarios, and rigorously obtain the optimal solutions to this game model by employing dynamic programming. Congestion mitigation for bottlenecked SUs is then achieved by alleviating the load on their internal buffers. Simulation results are presented to show the effectiveness of the proposed approach under the rectangular-grid-based SN-CRN scenario. PMID:28914803
Fuzzy State Transition and Kalman Filter Applied in Short-Term Traffic Flow Forecasting
Ming-jun, Deng; Shi-ru, Qu
2015-01-01
Traffic flow is widely recognized as an important parameter for road traffic state forecasting. Fuzzy state transition and the Kalman filter (KF) have been applied in this field separately. Studies show that the former method performs well in forecasting the trend of traffic state variation but always involves several numerical errors, while the latter is good at numerical forecasting but deficient in expressing time hysteresis. This paper proposes an approach that combines a fuzzy state transition model and a KF forecasting model. To exploit the advantages of the two models, a weighted combination model is proposed, with the weight optimized dynamically by minimizing the sum of squared forecasting errors. Real detection data are used to test the efficiency. Results indicate that the method performs well for short-term traffic forecasting. PMID:26779258
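The weighted-combination idea in this abstract can be sketched in closed form: over a fixed window, the weight minimizing the sum of squared errors of w*fuzzy + (1-w)*KF has an analytic least-squares solution (the paper updates the weight dynamically; the data below are invented):

```python
def optimal_weight(fuzzy_pred, kf_pred, observed):
    """Least-squares weight w for the combined forecast
    w*fuzzy + (1-w)*kf, minimizing the sum of squared errors over a
    window. Setting the derivative to zero gives
    w = sum(d*(y - f2)) / sum(d^2) with d = f1 - f2."""
    num = den = 0.0
    for f1, f2, y in zip(fuzzy_pred, kf_pred, observed):
        d = f1 - f2
        num += d * (y - f2)
        den += d * d
    w = num / den if den else 0.5
    return min(1.0, max(0.0, w))   # keep the weight in [0, 1]

fuzzy = [10.0, 12.0, 14.0]
kf    = [11.0, 13.0, 15.0]
obs   = [10.5, 12.5, 14.5]
print(optimal_weight(fuzzy, kf, obs))  # 0.5: symmetric errors, equal weights
```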
Assessing the performance of eight real-time updating models and procedures for the Brosna River
NASA Astrophysics Data System (ADS)
Goswami, M.; O'Connor, K. M.; Bhattarai, K. P.; Shamseldin, A. Y.
2005-10-01
The flow forecasting performance of eight updating models, incorporated in the Galway River Flow Modelling and Forecasting System (GFMFS), was assessed using daily data (rainfall, evaporation and discharge) of the Irish Brosna catchment (1207 km2), considering their one to six days lead-time discharge forecasts. The Perfect Forecast of Input over the Forecast Lead-time scenario was adopted, where required, in place of actual rainfall forecasts. The eight updating models were: (i) the standard linear Auto-Regressive (AR) model, applied to the forecast errors (residuals) of a simulation (non-updating) rainfall-runoff model; (ii) the Neural Network Updating (NNU) model, also using such residuals as input; (iii) the Linear Transfer Function (LTF) model, applied to the simulated and the recently observed discharges; (iv) the Non-linear Auto-Regressive eXogenous-Input Model (NARXM), also a neural network-type structure, but having wide options of using recently observed values of one or more of the three data series, together with non-updated simulated outflows, as inputs; (v) the Parametric Simple Linear Model (PSLM), of LTF-type, using recent rainfall and observed discharge data; (vi) the Parametric Linear perturbation Model (PLPM), also of LTF-type, using recent rainfall and observed discharge data, (vii) n-AR, an AR model applied to the observed discharge series only, as a naïve updating model; and (viii) n-NARXM, a naive form of the NARXM, using only the observed discharge data, excluding exogenous inputs. The five GFMFS simulation (non-updating) models used were the non-parametric and parametric forms of the Simple Linear Model and of the Linear Perturbation Model, the Linearly-Varying Gain Factor Model, the Artificial Neural Network Model, and the conceptual Soil Moisture Accounting and Routing (SMAR) model. 
As the SMAR model performance was found to be the best among these models, in terms of the Nash-Sutcliffe R2 value, both in calibration and in verification, the simulated outflows of this model only were selected for the subsequent exercise of producing updated discharge forecasts. All the eight forms of updating models for producing lead-time discharge forecasts were found to be capable of producing relatively good lead-1 (1-day ahead) forecasts, with R2 values almost 90% or above. However, for higher lead time forecasts, only three updating models, viz., NARXM, LTF, and NNU, were found to be suitable, with lead-6 values of R2 about 90% or higher. Graphical comparisons were made of the lead-time forecasts for the two largest floods, one in the calibration period and the other in the verification period.
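The Nash-Sutcliffe R2 used above as the model-selection criterion is straightforward to compute; a minimal sketch with invented data:

```python
def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe efficiency: 1 minus the ratio of the model's
    squared error to the variance of the observations. 1 is a perfect
    fit; 0 means the model is no better than predicting the observed
    mean; negative values are worse than the mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / var

obs = [2.0, 4.0, 6.0, 8.0]
print(nash_sutcliffe(obs, obs))        # 1.0 (perfect fit)
print(nash_sutcliffe([5.0] * 4, obs))  # 0.0 (mean-value predictor)
```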
Cassagne, E; Caillaud, P D; Besancenot, J P; Thibaudon, M
2007-10-01
Poaceae pollen, along with birch pollen, is among the most allergenic pollen in Europe. It is therefore useful to develop models that can help pollen allergy sufferers. The objective of this study was to construct forecast models that predict the first day with a given level of allergic risk, called here the Starting Date of the Allergic Risk (SDAR). The models derive from four forecast methods (three summing methods and one multiple regression analysis) used in the literature. They were applied to Nancy and Strasbourg from 1988 to 2005 and tested on 2006. The Mean Absolute Error and an actual-forecast-ability test were the parameters used to choose the best models and to assess and compare their accuracy. On the whole, all the models showed comparably good forecast accuracy. All were reliable and were used to forecast the SDAR in 2006, with contrasting results in forecasting precision.
Forecast of dengue incidence using temperature and rainfall.
Hii, Yien Ling; Zhu, Huaiping; Ng, Nawi; Ng, Lee Ching; Rocklöv, Joacim
2012-01-01
An accurate early warning system to predict impending epidemics enhances the effectiveness of preventive measures against dengue fever. The aim of this study was to develop and validate a forecasting model that could predict dengue cases and provide timely early warning in Singapore. We developed a time series Poisson multivariate regression model using weekly mean temperature and cumulative rainfall over the period 2000-2010. Weather data were modeled using piecewise linear spline functions. We analyzed various lag times between dengue and weather variables to identify the optimal dengue forecasting period. Autoregression, seasonality and trend were considered in the model. We validated the model by forecasting dengue cases for week 1 of 2011 up to week 16 of 2012 using weather data alone. Model selection and validation were based on Akaike's Information Criterion, standardized Root Mean Square Error, and residuals diagnoses. A Receiver Operating Characteristics curve was used to analyze the sensitivity of the forecast of epidemics. The optimal period for dengue forecast was 16 weeks. Our model forecasted correctly with errors of 0.3 and 0.32 of the standard deviation of reported cases during the model training and validation periods, respectively. It distinguished outbreaks from non-outbreaks with a sensitivity of 96% (CI = 93-98%) in 2004-2010 and 98% (CI = 95-100%) in 2011. The model predicted the outbreak in 2011 accurately, with less than 3% probability of a false alarm. We have developed a weather-based dengue forecasting model that gives warning 16 weeks in advance of dengue epidemics with high sensitivity and specificity. We demonstrate that models using temperature and rainfall can be simple, precise, and low-cost tools for dengue forecasting, which could be used to enhance decision making on the timing and scale of vector control operations and the utilization of limited resources.
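The piecewise linear spline functions mentioned for the weather covariates can be sketched as a truncated-line basis (the knot positions below are illustrative, not those of the study):

```python
def linear_spline_basis(x, knots):
    """Truncated-line basis for a piecewise linear spline:
    [x, (x-k1)+, (x-k2)+, ...], where (.)+ is the positive part.
    A regression on these columns is linear between knots, with the
    slope allowed to change at each knot, which lets a Poisson model
    capture nonlinear weather effects."""
    return [x] + [max(0.0, x - k) for k in knots]

# weekly mean temperature (deg C) expanded around two illustrative knots
print(linear_spline_basis(30.0, [27.0, 29.0]))  # [30.0, 3.0, 1.0]
```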
NASA Astrophysics Data System (ADS)
Pérez, B.; Brower, R.; Beckers, J.; Paradis, D.; Balseiro, C.; Lyons, K.; Cure, M.; Sotillo, M. G.; Hacket, B.; Verlaan, M.; Alvarez Fanjul, E.
2011-04-01
ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that makes use of existing storm surge and circulation models currently operational in Europe, as well as near-real-time tide gauge data in the region, with the following main goals: - providing easy access to existing forecasts, as well as to their performance and model validation, by means of an adequate visualization tool - generating better forecasts of sea level, including confidence intervals, by means of the Bayesian Model Averaging (BMA) technique. The system was developed and implemented within the ECOOP (C.No. 036355) European Project for the NOOS and IBIROOS regions, based on the MATROOS visualization tool developed by Deltares. Both systems are today operational at Deltares and Puertos del Estado respectively. The Bayesian Model Averaging technique generates an overall forecast probability density function (PDF) by making a weighted average of the individual forecast PDFs; the weights represent the probability that a model will give the correct forecast PDF and are determined and updated operationally based on the performance of the models during a recent training period. This implies that the technique needs sea level data from tide gauges in near-real time. Results of the validation of the different models and of the BMA implementation for the main harbours will be presented for the IBIROOS and Western Mediterranean regions, where this kind of activity has been performed for the first time. The work has proved useful for detecting problems in some of the circulation models not previously well calibrated with sea level data, for identifying the differences between baroclinic and barotropic models for sea level applications, and for confirming the general improvement of the BMA forecasts.
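The BMA combination described above, a weighted mixture of member forecast PDFs, can be sketched as follows (the Gaussian member PDFs and all numbers are illustrative; operational BMA estimates the weights and variances from a training period, typically by EM):

```python
import math

def gauss_pdf(x, mu, sigma):
    """Gaussian density, used here as each member's forecast PDF."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bma_pdf(x, means, sigmas, weights):
    """BMA predictive density: a weighted mixture of the member PDFs.
    The weights reflect each model's recent performance and sum to 1."""
    return sum(w * gauss_pdf(x, m, s)
               for w, m, s in zip(weights, means, sigmas))

def bma_mean(means, weights):
    """The BMA point forecast is the weighted mean of member forecasts."""
    return sum(w * m for w, m in zip(weights, means))

# three surge models forecasting sea level (cm); the historically best
# model carries the largest weight (all values invented)
means, sigmas, weights = [52.0, 55.0, 60.0], [4.0, 5.0, 6.0], [0.5, 0.3, 0.2]
print(bma_mean(means, weights))
```

Confidence intervals then come directly from the mixture density rather than from any single model.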
NASA Astrophysics Data System (ADS)
Hong, Mei; Chen, Xi; Zhang, Ren; Wang, Dong; Shen, Shuanghe; Singh, Vijay P.
2018-04-01
With the objective of tackling the problem of inaccurate long-term El Niño-Southern Oscillation (ENSO) forecasts, this paper develops a new dynamical-statistical forecast model of the sea surface temperature anomaly (SSTA) field. To avoid single initial prediction values, a self-memorization principle is introduced to improve the dynamical reconstruction model, thus making the model more appropriate for describing such chaotic systems as ENSO events. The improved dynamical-statistical model of the SSTA field is used to predict SSTA in the equatorial eastern Pacific and during El Niño and La Niña events. The long-term step-by-step forecast results and cross-validated retroactive hindcast results of time series T1 and T2 are found to be satisfactory, with a Pearson correlation coefficient of approximately 0.80 and a mean absolute percentage error (MAPE) of less than 15 %. The corresponding forecast SSTA field is accurate in that not only is the forecast shape similar to the actual field but also the contour lines are essentially the same. This model can also be used to forecast the ENSO index. The temporal correlation coefficient is 0.8062, and the MAPE value of 19.55 % is small. The difference between forecast results in spring and those in autumn is not high, indicating that the improved model can overcome the spring predictability barrier to some extent. Compared with six mature models published previously, the present model has an advantage in prediction precision and length, and is a novel exploration of the ENSO forecast method.
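The mean absolute percentage error (MAPE) quoted above is simply the average of absolute errors relative to the actual values; a minimal sketch with invented numbers:

```python
def mape(forecast, actual):
    """Mean absolute percentage error (%): the average absolute error
    relative to the actual value. Smaller is better; the abstract
    reports values under 15% for the SSTA series."""
    n = len(actual)
    return 100.0 * sum(abs((f - a) / a) for f, a in zip(forecast, actual)) / n

print(mape([9.0, 11.0], [10.0, 10.0]))
```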
NASA Astrophysics Data System (ADS)
Ma, Feng; Ye, Aizhong; Duan, Qingyun
2017-03-01
An experimental seasonal drought forecasting system is developed based on 29-year (1982-2010) seasonal meteorological hindcasts generated by the climate models from the North American Multi-Model Ensemble (NMME) project. The system makes use of a bias correction and spatial downscaling method and the distributed time-variant gain model (DTVGM), a hydrologic model. DTVGM was calibrated using observed daily hydrological data, and its streamflow simulations achieved Nash-Sutcliffe efficiency values of 0.727 and 0.724 during the calibration (1978-1995) and validation (1996-2005) periods, respectively, at the Danjiangkou reservoir station. The experimental seasonal drought forecasting system (known as NMME-DTVGM) is used to generate seasonal drought forecasts. The forecasts were evaluated against the reference forecasts (i.e., persistence forecast and climatological forecast). The NMME-DTVGM drought forecasts have higher detectability and accuracy and a lower false alarm rate than the reference forecasts at different lead times (from 1 to 4 months) during the cold-dry season. No apparent advantage is shown in drought predictions during the spring and summer seasons because of the long memory of the initial conditions in spring and the lower predictive skill for precipitation in summer. Overall, the NMME-based seasonal drought forecasting system has meaningful skill in predicting drought several months in advance, which can provide critical information for drought preparedness and response planning as well as the sustainable practice of water resource conservation over the basin.
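The detectability and false-alarm metrics used to evaluate the drought forecasts are standard contingency-table scores; a minimal sketch (the counts below are invented):

```python
def detection_scores(hits, misses, false_alarms):
    """Probability of detection (POD) and false alarm ratio (FAR) from
    a 2x2 forecast/observation contingency table:
    POD = hits / observed events, FAR = false alarms / forecast events."""
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far

# e.g. 8 droughts forecast and observed, 2 missed, 4 false alarms
pod, far = detection_scores(8, 2, 4)
print(pod, far)
```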
Research on Urban Road Traffic Congestion Charging Based on Sustainable Development
NASA Astrophysics Data System (ADS)
Ye, Sun
Traffic congestion is a major problem hindering sustainable urban transport development at present. Congestion charging is an effective measure to alleviate urban traffic congestion. The paper first probes several key issues from a theoretical angle, such as the goal, pricing, scope, method, and revenue redistribution of congestion charging. It then reviews congestion charging practice in Singapore and London and concludes that congestion charging should be premised on scientific planning, public support, and public transportation development.
Training the next generation of scientists in Weather Forecasting: new approaches with real models
NASA Astrophysics Data System (ADS)
Carver, Glenn; Váňa, Filip; Siemen, Stephan; Kertesz, Sandor; Keeley, Sarah
2014-05-01
The European Centre for Medium Range Weather Forecasts operationally produces medium range forecasts using what is internationally acknowledged as the world-leading global weather forecast model. Future development of this scientifically advanced model relies on the continued availability of experts in the field of meteorological science with high-level software skills. ECMWF therefore has a vested interest in young scientists and University graduates developing the necessary skills in numerical weather prediction, including both scientific and technical aspects. The OpenIFS project at ECMWF maintains a portable version of the ECMWF forecast model (known as IFS) for use in education and research at Universities, National Meteorological Services and other research and education organisations. OpenIFS models can be run on desktop or high performance computers to produce weather forecasts in a similar way to the operational forecasts at ECMWF. ECMWF also provides the Metview desktop application, a modern, graphical, and easy to use tool for analysing and visualising forecasts that is routinely used by scientists and forecasters at ECMWF and other institutions. The combination of Metview with the OpenIFS models has the potential to deliver classroom-friendly tools allowing students to apply their theoretical knowledge to real-world examples using a world-leading weather forecasting model. In this paper we will describe how the OpenIFS model has been used for teaching. We describe the use of Linux based 'virtual machines' pre-packaged on USB sticks that support a technically easy and safe way of providing 'classroom-on-a-stick' learning environments for advanced training in numerical weather prediction. We welcome discussions with interested parties.
NASA Astrophysics Data System (ADS)
Chaudhuri, S.; Das, D.; Goswami, S.; Das, S. K.
2016-11-01
All India summer monsoon rainfall (AISMR) characteristics play a vital role in policy planning and the national economy of the country. In view of the significant impact of the monsoon system on regional as well as global climate systems, accurate prediction of summer monsoon rainfall has become a challenge. The objective of this study is to develop an adaptive neuro-fuzzy inference system (ANFIS) for long-range forecasting of AISMR. NCEP/NCAR reanalysis data of temperature and of zonal and meridional wind at different pressure levels are taken to construct the input matrix of ANFIS. The membership of the input parameters for AISMR as high, medium or low is estimated with a trapezoidal membership function. The fuzzified standardized input parameters and the de-fuzzified target output are trained with artificial neural network models. The ANFIS forecast of AISMR is compared with non-hybrid multi-layer perceptron (MLP), radial basis function network (RBFN) and multiple linear regression (MLR) models. Forecast error analyses reveal that ANFIS provides the best forecast of AISMR, with a minimum prediction error of 0.076, whereas the errors with the MLP, RBFN and MLR models are 0.22, 0.18 and 0.73 respectively. During validation with observations, ANFIS demonstrates its advantage over these comparative models. Performance of the ANFIS model is verified through different statistical skill scores, which also confirm the aptitude of ANFIS in forecasting AISMR. The forecast skill of ANFIS is also observed to be better than that of the Climate Forecast System version 2. The real-time forecast with ANFIS indicates the possibility of a deficit AISMR (65-75 cm) in 2015.
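The trapezoidal membership function used above to fuzzify the predictors can be sketched directly (the breakpoints below are illustrative, not those of the study):

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function used to fuzzify an input:
    0 outside [a, d], rising linearly on [a, b], equal to 1 on [b, c],
    and falling linearly on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# e.g. membership of a standardized predictor in a "medium" class
print(trapezoid(-0.25, -1.0, -0.5, 0.5, 1.0))  # 1.0 (on the plateau)
print(trapezoid(0.75, -1.0, -0.5, 0.5, 1.0))   # 0.5 (on the falling edge)
```

Each input gets one such membership value per class (high, medium, low), and the fuzzified values feed the neural network stage.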
Yuan, Xing
2016-06-22
This is the second paper of a two-part series on introducing an experimental seasonal hydrological forecasting system over the Yellow River basin in northern China. While the natural hydrological predictability in terms of initial hydrological conditions (ICs) is investigated in a companion paper, the added value from eight North American Multimodel Ensemble (NMME) climate forecast models with a grand ensemble of 99 members is assessed in this paper, with an implicit consideration of human-induced uncertainty in the hydrological models through a post-processing procedure. The forecast skill in terms of anomaly correlation (AC) for 2 m air temperature and precipitation does not necessarily decrease over leads but is dependent on the target month due to a strong seasonality for the climate over the Yellow River basin. As there is more diversity in the model performance for the temperature forecasts than the precipitation forecasts, the grand NMME ensemble mean forecast has consistently higher skill than the best single model up to 6 months for the temperature but up to 2 months for the precipitation. The NMME climate predictions are downscaled to drive the variable infiltration capacity (VIC) land surface hydrological model and a global routing model regionalized over the Yellow River basin to produce forecasts of soil moisture, runoff and streamflow. And the NMME/VIC forecasts are compared with the Ensemble Streamflow Prediction method (ESP/VIC) through 6-month hindcast experiments for each calendar month during 1982–2010. As verified by the VIC offline simulations, the NMME/VIC is comparable to the ESP/VIC for the soil moisture forecasts, and the former has higher skill than the latter only for the forecasts at long leads and for those initialized in the rainy season. The forecast skill for runoff is lower for both forecast approaches, but the added value from NMME/VIC is more obvious, with an increase of the average AC by 0.08–0.2. 
To compare with the observed streamflow, both the hindcasts from NMME/VIC and ESP/VIC are post-processed through a linear regression model fitted by using VIC offline-simulated streamflow. The post-processed NMME/VIC reduces the root mean squared error (RMSE) from the post-processed ESP/VIC by 5–15 %. And the reduction occurs mostly during the transition from wet to dry seasons. As a result, with the consideration of the uncertainty in the hydrological models, the added value from climate forecast models is decreased especially at short leads, suggesting the necessity of improving the large-scale hydrological models in human-intervened river basins.
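The anomaly correlation (AC) skill measure used throughout this record is the Pearson correlation between forecast and observed departures from climatology; a minimal sketch with invented data:

```python
def anomaly_correlation(forecast, observed, climatology):
    """Anomaly correlation: Pearson correlation between forecast and
    observed anomalies (departures from climatology). AC = 1 is a
    perfect anomaly pattern; 0 means no skill in the anomalies."""
    fa = [f - c for f, c in zip(forecast, climatology)]
    oa = [o - c for o, c in zip(observed, climatology)]
    fm = sum(fa) / len(fa)
    om = sum(oa) / len(oa)
    cov = sum((x - fm) * (y - om) for x, y in zip(fa, oa))
    sf = sum((x - fm) ** 2 for x in fa) ** 0.5
    so = sum((y - om) ** 2 for y in oa) ** 0.5
    return cov / (sf * so)

clim = [10.0, 12.0, 14.0, 12.0]   # monthly climatology (invented)
obs  = [11.0, 11.0, 15.0, 13.0]
fcst = [10.5, 11.5, 14.5, 12.5]   # anomalies half the observed size
print(round(anomaly_correlation(fcst, obs, clim), 3))  # 1.0
```

Note that AC rewards getting the anomaly *pattern* right: here the forecast anomalies are half the observed amplitude yet the correlation is perfect.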
Between the Rock and a Hard Place: The CCMC as a Transit Station Between Modelers and Forecasters
NASA Technical Reports Server (NTRS)
Hesse, Michael
2009-01-01
The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aimed at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides researchers the use of space science models, even if they are not model owners themselves. The second CCMC activity is to support space weather forecasting at national Space Weather Forecasting Centers. This second activity involves model evaluations, model transitions to operations, and the development of draft space weather forecasting tools. This presentation will focus on the latter element. Specifically, we will discuss the process of transitioning research models, or information generated by research models, to space weather forecasting organizations. We will analyze successes as well as obstacles to further progress, and we will suggest avenues for increased transitioning success.
Small area population forecasting: some experience with British models.
Openshaw, S; Van Der Knaap, G A
1983-01-01
This study is concerned with the evaluation of various models, including time-series forecasting, extrapolation, and projection procedures, that have been developed to prepare population forecasts for planning purposes. These models are evaluated using data for the Netherlands. "As part of a research project at the Erasmus University, space-time population data has been assembled in a geographically consistent way for the period 1950-1979. These population time series are of sufficient length for the first 20 years to be used to build models and then evaluate the performance of the model for the next 10 years. Some 154 different forecasting models for 832 municipalities have been evaluated. It would appear that the best forecasts are likely to be provided by either a Holt-Winters model, or a ratio-correction model, or a low order exponential-smoothing model." excerpt
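The low-order exponential-smoothing family that performed well in the study can be sketched in its simplest form (alpha and the series below are illustrative):

```python
def exp_smooth_forecast(series, alpha=0.3, horizon=10):
    """Simple (low-order) exponential smoothing: the level is a
    geometrically weighted average of past observations, and the
    flat h-step forecast extrapolates the last level. alpha in (0, 1)
    controls how quickly old data are discounted."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level   # update with each new datum
    return [level] * horizon                       # flat forecast path

pop = [1000, 1010, 1025, 1030, 1045]               # annual counts (invented)
print(exp_smooth_forecast(pop, alpha=0.5, horizon=3))  # [1033.75, 1033.75, 1033.75]
```

Holt-Winters extends this recursion with separate trend and seasonal components, which is why it can outperform the flat forecast on growing municipalities.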
Optimising seasonal streamflow forecast lead time for operational decision making in Australia
NASA Astrophysics Data System (ADS)
Schepen, Andrew; Zhao, Tongtiegang; Wang, Q. J.; Zhou, Senlin; Feikema, Paul
2016-10-01
Statistical seasonal forecasts of 3-month streamflow totals are released in Australia by the Bureau of Meteorology and updated on a monthly basis. The forecasts are often released in the second week of the forecast period, due to the onerous forecast production process. The current service relies on models built using data for complete calendar months, meaning the forecast production process cannot begin until the first day of the forecast period. Somehow, the bureau needs to transition to a service that provides forecasts before the beginning of the forecast period; timelier forecast release will become critical as sub-seasonal (monthly) forecasts are developed. Increasing the forecast lead time to one month ahead is not considered a viable option for Australian catchments that typically lack any predictability associated with snowmelt. The bureau's forecasts are built around Bayesian joint probability models that have antecedent streamflow, rainfall and climate indices as predictors. In this study, we adapt the modelling approach so that forecasts have any number of days of lead time. Daily streamflow and sea surface temperatures are used to develop predictors based on 28-day sliding windows. Forecasts are produced for 23 forecast locations with 0-14- and 21-day lead time. The forecasts are assessed in terms of continuous ranked probability score (CRPS) skill score and reliability metrics. CRPS skill scores, on average, reduce monotonically with increase in days of lead time, although both positive and negative differences are observed. Considering only skilful forecast locations, CRPS skill scores at 7-day lead time are reduced on average by 4 percentage points, with differences largely contained within +5 to -15 percentage points. 
A flexible forecasting system that allows for any number of days of lead time could benefit Australian seasonal streamflow forecast users by allowing more time for forecasts to be disseminated, comprehended and made use of prior to the commencement of a forecast season. The system would allow for forecasts to be updated if necessary.
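The CRPS and CRPS skill score used to assess the forecasts above can be sketched for ensemble forecasts with the standard estimator (the ensembles below are invented):

```python
def crps_ensemble(members, obs):
    """CRPS for an ensemble forecast: mean |member - obs| minus half
    the mean pairwise spread between members. Lower is better; 0 is a
    perfect deterministic hit."""
    m = len(members)
    err = sum(abs(x - obs) for x in members) / m
    spread = sum(abs(x - y) for x in members for y in members) / (2 * m * m)
    return err - spread

def crps_skill(members, ref_members, obs):
    """CRPS skill score versus a reference forecast: 1 - CRPS/CRPS_ref.
    Positive means the forecast beats the reference."""
    return 1.0 - crps_ensemble(members, obs) / crps_ensemble(ref_members, obs)

obs = 100.0
sharp = [98.0, 100.0, 102.0]   # skilful, well-centred ensemble
clim  = [80.0, 100.0, 120.0]   # broad climatology-like reference
print(crps_skill(sharp, clim, obs))  # 0.9: large positive skill
```

In practice the score is averaged over many forecast dates before the skill score is formed, which is how the percentage-point differences quoted above arise.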
DOT National Transportation Integrated Search
1995-01-01
The Virginia Department of Transportation uses a cash flow forecasting model to predict operations expenditures by month. Components of this general forecasting model estimate line items in the VDOT budget. The cash flow model was developed in the ea...
Flood Forecasting in Wales: Challenges and Solutions
NASA Astrophysics Data System (ADS)
How, Andrew; Williams, Christopher
2015-04-01
With steep, fast-responding river catchments, exposed coastal reaches with large tidal ranges, and high population densities in some of the most at-risk areas, flood forecasting in Wales presents many varied challenges. Utilising advances in computing power and learning from best practice within the United Kingdom and abroad has brought significant improvements in recent years; however, many challenges remain. Developments in computing and increased processing power come with a significant price tag; greater numbers of data sources and ensemble feeds bring a better understanding of uncertainty, but the wealth of data needs careful management to ensure a clear message of risk is disseminated; new modelling techniques exploit better and faster computation, but lack the history of record and the experience gained from the continued use of more established forecasting models. As a flood forecasting team we develop coastal and fluvial forecasting models, set them up for operational use and manage the duty role that runs the models in real time. An overview of our current operational flood forecasting system will be presented, along with a discussion of some of the solutions we have in place to address the challenges we face. These include:
• real-time updating of fluvial models
• rainfall forecasting verification
• ensemble forecast data
• longer range forecast data
• contingency models
• offshore to nearshore wave transformation
• calculation of wave overtopping
Why preferring parametric forecasting to nonparametric methods?
Jabot, Franck
2015-05-07
A recent series of papers by Charles T. Perretti and collaborators have shown that nonparametric forecasting methods can outperform parametric methods in noisy nonlinear systems. Such a situation can arise for two main reasons: the instability of parametric inference procedures in chaotic systems, which can lead to biased parameter estimates, and the discrepancy between the real system dynamics and the modeled one, a problem that Perretti and collaborators call "the true model myth". Should ecologists go on using the demanding parametric machinery when trying to forecast the dynamics of complex ecosystems? Or should they rely on the elegant nonparametric approach that appears so promising? It is argued here that ecological forecasting based on parametric models presents two key comparative advantages over nonparametric approaches. First, the likelihood of parametric forecasting failure can be diagnosed thanks to simple Bayesian model checking procedures. Second, when parametric forecasting is diagnosed to be reliable, forecasting uncertainty can be estimated on virtual data generated with the parametric model fitted to the data. In contrast, nonparametric techniques provide forecasts with unknown reliability. This argumentation is illustrated with the simple theta-logistic model that was previously used by Perretti and collaborators to make their point. It should convince ecologists to stick to standard parametric approaches, until methods have been developed to assess the reliability of nonparametric forecasting. Copyright © 2015 Elsevier Ltd. All rights reserved.
Crase, Beth; Liedloff, Adam; Vesk, Peter A; Fukuda, Yusuke; Wintle, Brendan A
2014-08-01
Species distribution models (SDMs) are widely used to forecast changes in the spatial distributions of species and communities in response to climate change. However, spatial autocorrelation (SA) is rarely accounted for in these models, despite its ubiquity in broad-scale ecological data. While spatial autocorrelation in model residuals is known to result in biased parameter estimates and the inflation of type I errors, the influence of unmodeled SA on species' range forecasts is poorly understood. Here we quantify how accounting for SA in SDMs influences the magnitude of range shift forecasts produced by SDMs for multiple climate change scenarios. SDMs were fitted to simulated data with a known autocorrelation structure, and to field observations of three mangrove communities from northern Australia displaying strong spatial autocorrelation. Three modeling approaches were implemented: environment-only models (most frequently applied in species' range forecasts), and two approaches that incorporate SA: autologistic models and residuals autocovariate (RAC) models. Differences in forecasts among modeling approaches and climate scenarios were quantified. While all model predictions at the current time closely matched the actual current distribution of the mangrove communities, under the climate change scenarios environment-only models forecast substantially greater range shifts than models incorporating SA. Furthermore, the magnitude of these differences intensified with increasing increments of climate change across the scenarios. When models do not account for SA, forecasts of species' range shifts indicate more extreme impacts of climate change, compared to models that explicitly account for SA. Therefore, where biological or population processes induce substantial autocorrelation in the distribution of organisms, and this is not modeled, model predictions will be inaccurate.
These results have global importance for conservation efforts as inaccurate forecasts lead to ineffective prioritization of conservation activities and potentially to avoidable species extinctions. © 2014 John Wiley & Sons Ltd.
Wang, Deyun; Liu, Yanling; Luo, Hongyuan; Yue, Chenqiang; Cheng, Sheng
2017-01-01
Accurate PM2.5 concentration forecasting is crucial for protecting public health and the atmospheric environment. However, the intermittent and unstable nature of PM2.5 concentration series makes forecasting a very difficult task. In order to improve the forecast accuracy of PM2.5 concentration, this paper proposes a hybrid model based on wavelet transform (WT), variational mode decomposition (VMD) and a back propagation (BP) neural network optimized by the differential evolution (DE) algorithm. Firstly, WT is employed to disassemble the PM2.5 concentration series into a number of subsets with different frequencies. Secondly, VMD is applied to decompose each subset into a set of variational modes (VMs). Thirdly, the DE-BP model is utilized to forecast all the VMs. Fourthly, the forecast value of each subset is obtained by aggregating the forecast results of all the VMs obtained from VMD decomposition of that subset. Finally, the final forecast series of PM2.5 concentration is obtained by adding up the forecast values of all subsets. Two PM2.5 concentration series, collected from Wuhan and Tianjin, China, respectively, are used to test the effectiveness of the proposed model. The results demonstrate that the proposed model outperforms all the other models considered in this paper. PMID:28704955
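The decompose-forecast-aggregate structure described above can be sketched generically. In this toy sketch (ours, not the authors'), a crude moving-average two-band split stands in for WT/VMD and a persistence rule stands in for the DE-BP network; only the shape of the pipeline is illustrated:

```python
import numpy as np

def two_band_split(series, window=5):
    """Crude stand-in for WT/VMD: split a series into a smooth band and a residual band."""
    kernel = np.ones(window) / window
    low = np.convolve(series, kernel, mode="same")   # smoothed component
    high = series - low                              # residual component
    return low, high

def persistence_forecast(band):
    """Stand-in for the DE-BP forecaster: predict the last observed value."""
    return band[-1]

def hybrid_forecast(series):
    """Forecast each band separately, then sum the band forecasts -- the
    structure of the WT -> VMD -> DE-BP -> aggregate pipeline in the abstract."""
    low, high = two_band_split(np.asarray(series, dtype=float))
    return persistence_forecast(low) + persistence_forecast(high)
```

Because the split is exactly additive, summing the per-band forecasts recovers a consistent forecast of the original series, which is the property the aggregation steps in the paper rely on.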
Nowcasting and Forecasting the Monthly Food Stamps Data in the US Using Online Search Data
Fantazzini, Dean
2014-01-01
We propose the use of Google online search data for nowcasting and forecasting the number of food stamps recipients. We perform a large out-of-sample forecasting exercise with almost 3000 competing models with forecast horizons up to 2 years ahead, and we show that models including Google search data statistically outperform the competing models at all considered horizons. These results also hold under several robustness checks, considering alternative keywords, a falsification test, different out-of-sample periods, directional accuracy and forecasts at the state level. PMID:25369315
Wattad, Malak; Darawsha, Wisam; Solomonica, Amir; Hijazi, Maher; Kaplan, Marielle; Makhoul, Badira F; Abassi, Zaid A; Azzam, Zaher S; Aronson, Doron
2015-04-01
Worsening renal function (WRF) and congestion are inextricably related pathophysiologically, suggesting that WRF occurring in conjunction with persistent congestion would be associated with worse clinical outcome. We studied the interdependence between WRF and persistent congestion in 762 patients with acute decompensated heart failure (HF). WRF was defined as ≥0.3 mg/dl increase in serum creatinine above baseline at any time during hospitalization and persistent congestion as ≥1 sign of congestion at discharge. The primary end point was all-cause mortality with mean follow-up of 15 ± 9 months. Readmission for HF was a secondary end point. Persistent congestion was more common in patients with WRF than in patients with stable renal function (51.0% vs 26.6%, p <0.0001). Both persistent congestion and persistent WRF were significantly associated with mortality (both p <0.0001). There was a strong interaction (p = 0.003) between persistent WRF and congestion, such that the increased risk for mortality occurred predominantly with both WRF and persistent congestion. The adjusted hazard ratio for mortality in patients with persistent congestion as compared with those without was 4.16 (95% confidence interval [CI] 2.20 to 7.86) in patients with WRF and 1.50 (95% CI 1.16 to 1.93) in patients without WRF. In conclusion, persistent congestion is frequently associated with WRF. We have identified a substantial interaction between persistent congestion and WRF such that congestion portends increased mortality particularly when associated with WRF. Copyright © 2015 Elsevier Inc. All rights reserved.
Palm oil price forecasting model: An autoregressive distributed lag (ARDL) approach
NASA Astrophysics Data System (ADS)
Hamid, Mohd Fahmi Abdul; Shabri, Ani
2017-05-01
Palm oil prices fluctuated without any clear trend or cyclical pattern in the last few decades. The instability of food commodity prices causes them to change rapidly over time. This paper attempts to develop an Autoregressive Distributed Lag (ARDL) model for modeling and forecasting the price of palm oil. In order to use ARDL as a forecasting model, this paper modifies the data structure so that only lagged explanatory variables are used to explain the variation in palm oil price. We then compare the performance of this ARDL model with a benchmark model, namely ARIMA, in terms of their comparative forecasting accuracy. This paper also utilizes the ARDL bound testing approach to co-integration in examining the short run and long run relationships between palm oil price and its determinants: production, stock, the price of soybean as a substitute for palm oil, and the price of crude oil. The comparative forecasting accuracy suggests that the ARDL model has better forecasting accuracy than ARIMA.
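Restricted to lagged regressors, an ARDL model is ordinary least squares on a lag matrix, which is what makes it usable for genuine out-of-sample forecasting as described above. A minimal ARDL(1, 1) sketch (ours, with a single illustrative exogenous series rather than the paper's full set of determinants):

```python
import numpy as np

def ardl_fit(y, x, p=1, q=1):
    """Fit a minimal ARDL(p, q): y_t = c + sum_i a_i * y_{t-i} + sum_j b_j * x_{t-j}.

    Only lagged regressors enter the design matrix, so the fitted model can
    produce one-step-ahead forecasts without contemporaneous data.
    """
    y = np.asarray(y, dtype=float)
    x = np.asarray(x, dtype=float)
    start = max(p, q)
    rows = []
    for t in range(start, len(y)):
        rows.append([1.0]
                    + [y[t - i] for i in range(1, p + 1)]
                    + [x[t - j] for j in range(1, q + 1)])
    X = np.array(rows)
    coef, *_ = np.linalg.lstsq(X, y[start:], rcond=None)
    return coef

def ardl_forecast(coef, y, x, p=1, q=1):
    """One-step-ahead forecast from the fitted coefficients and latest lags."""
    z = [1.0] + [y[-i] for i in range(1, p + 1)] + [x[-j] for j in range(1, q + 1)]
    return float(np.dot(coef, z))
```

On noiseless data generated from the model the least-squares fit recovers the coefficients exactly, which is a useful sanity check before applying it to real price series.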
A multivariate time series approach to modeling and forecasting demand in the emergency department.
Jones, Spencer S; Evans, R Scott; Allen, Todd L; Thomas, Alun; Haug, Peter J; Welch, Shari J; Snow, Gregory L
2009-02-01
The goals of this investigation were to study the temporal relationships between the demands for key resources in the emergency department (ED) and the inpatient hospital, and to develop multivariate forecasting models. Hourly data were collected from three diverse hospitals for the year 2006. Descriptive analysis and model fitting were carried out using graphical and multivariate time series methods. Multivariate models were compared to a univariate benchmark model in terms of their ability to provide out-of-sample forecasts of ED census and the demands for diagnostic resources. Descriptive analyses revealed little temporal interaction between the demand for inpatient resources and the demand for ED resources at the facilities considered. Multivariate models provided more accurate forecasts of ED census and of the demands for diagnostic resources. Our results suggest that multivariate time series models can be used to reliably forecast ED patient census; however, forecasts of the demands for diagnostic resources were not sufficiently reliable to be useful in the clinical setting.
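A multivariate time series model of the kind compared above can be sketched as a first-order vector autoregression fitted by least squares (our illustration; the study's actual models and variables are only named, not reproduced):

```python
import numpy as np

def var1_fit(Y):
    """Fit a VAR(1), Y_t = c + A @ Y_{t-1} + e_t, by least squares.

    Y has shape (T, k): T observations of k demand series, e.g. hourly
    ED census alongside diagnostic-resource demand counts."""
    Z = np.hstack([np.ones((len(Y) - 1, 1)), Y[:-1]])   # regressors [1, Y_{t-1}]
    B, *_ = np.linalg.lstsq(Z, Y[1:], rcond=None)       # B stacks c and A, shape (k+1, k)
    return B

def var1_forecast(B, y_last):
    """One-step-ahead forecast for all k series jointly."""
    return np.hstack([1.0, y_last]) @ B
```

The joint fit is what lets demand for one resource inform forecasts of another, which is the advantage the multivariate models showed over the univariate benchmark.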
Influenza forecasting with Google Flu Trends.
Dugas, Andrea Freyer; Jalalpour, Mehdi; Gel, Yulia; Levin, Scott; Torcaso, Fred; Igusa, Takeru; Rothman, Richard E
2013-01-01
We developed a practical influenza forecast model based on real-time, geographically focused, and easy-to-access data, designed to provide individual medical centers with advanced warning of the expected number of influenza cases, thus allowing sufficient time to implement interventions. We also evaluated the effects of incorporating a real-time influenza surveillance system, Google Flu Trends, and meteorological and temporal information on forecast accuracy. Forecast models designed to predict one week in advance were developed from weekly counts of confirmed influenza cases over seven seasons (2004-2011) divided into seven training and out-of-sample verification sets. Forecasting procedures using classical Box-Jenkins, generalized linear models (GLM), and generalized linear autoregressive moving average (GARMA) methods were employed to develop the final model and assess the relative contribution of external variables such as Google Flu Trends, meteorological data, and temporal information. A GARMA(3,0) forecast model with Negative Binomial distribution integrating Google Flu Trends information provided the most accurate influenza case predictions. The model, on average, predicts weekly influenza cases during 7 out-of-sample outbreaks within 7 cases for 83% of estimates. Google Flu Trends data were the only source of external information to provide statistically significant forecast improvements over the base model in four of the seven out-of-sample verification sets. Overall, the p-value for adding this external information to the model is 0.0005. The other exogenous variables did not yield a statistically significant improvement in any of the verification sets. Integer-valued autoregression of influenza cases provides a strong base forecast model, which is enhanced by the addition of Google Flu Trends, confirming the predictive capabilities of search query based syndromic surveillance.
This accessible and flexible forecast model can be used by individual medical centers to provide advanced warning of future influenza cases.
NASA Technical Reports Server (NTRS)
Wolfson, N.; Thomasell, A.; Alperson, Z.; Brodrick, H.; Chang, J. T.; Gruber, A.; Ohring, G.
1984-01-01
The impact of introducing satellite temperature sounding data on a numerical weather prediction model of a national weather service is evaluated. A dry, five-level primitive equation model covering most of the Northern Hemisphere is used for these experiments. Series of parallel forecast runs out to 48 hours are made with three different sets of initial conditions: (1) NOSAT runs, in which only conventional surface and upper air observations are used; (2) SAT runs, in which satellite soundings are added to the conventional data over oceanic regions and North Africa; and (3) ALLSAT runs, in which the conventional upper air observations are replaced by satellite soundings over the entire model domain. The impact on the forecasts is evaluated by three verification methods: the RMS errors in sea level pressure forecasts, systematic errors in sea level pressure forecasts, and errors in subjective forecasts of significant weather elements for a selected portion of the model domain. For the relatively short range of the present forecasts, the major beneficial impacts on the sea level pressure forecasts are found precisely in those areas where the satellite soundings are inserted and where conventional upper air observations are sparse. The RMS and systematic errors are reduced in these regions. The subjective forecasts of significant weather elements are improved with the use of the satellite data. It is found that the ALLSAT forecasts are of a quality comparable to the SAT forecasts.
Global Positioning System (GPS) Precipitable Water in Forecasting Lightning at Spaceport Canaveral
NASA Technical Reports Server (NTRS)
Kehrer, Kristen C.; Graf, Brian; Roeder, William
2006-01-01
This paper evaluates the use of precipitable water (PW) from the Global Positioning System (GPS) in lightning prediction. Additional independent verification of an earlier model is performed. This earlier model used binary logistic regression with the following four predictor variables, optimally selected from a list of 23 candidate predictors: the current precipitable water value for a given time of day, the change in GPS-PW over the past 9 hours, the K-index, and the electric field mill value. This earlier model was not optimized for any specific forecast interval, but showed promise for 6 hour and 1.5 hour forecasts. Two new models were developed and verified, each optimized for an operationally significant forecast interval. The first model was optimized for the 0.5 hour lightning advisories issued by the 45th Weather Squadron. An additional 1.5 hours was allowed for sensor dwell, communication, calculation, analysis, and the advisory decision by the forecaster. The 0.5 hour advisory model therefore became a 2 hour forecast model for lightning within the 45th Weather Squadron advisory areas. The second model was optimized for major ground processing operations supported by the 45th Weather Squadron, which can require lightning forecasts with a lead time of up to 7.5 hours. Using the same 1.5 hour lag as in the other new model, this became a 9 hour forecast model for lightning within 37 km (20 NM) of the 45th Weather Squadron advisory areas. The two new models were built using binary logistic regression from a list of 26 candidate predictor variables: the current GPS-PW value, the change of GPS-PW over 0.5 hour increments from 0.5 to 12 hours, and the K-index. The new 2 hour model found the following four predictors to be statistically significant, listed in decreasing order of contribution to the forecast: the 0.5 hour change in GPS-PW, the 7.5 hour change in GPS-PW, the current GPS-PW value, and the K-index.
The new 9 hour forecast model found the following five independent variables to be statistically significant, listed in decreasing order of contribution to the forecast: the current GPS-PW value, the 8.5 hour change in GPS-PW, the 3.5 hour change in GPS-PW, the 12 hour change in GPS-PW, and the K-index. In both models, the GPS-PW parameters had better correlation with the lightning forecast than the K-index, a widely used thunderstorm index. Possible future improvements to this study are discussed.
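The binary logistic regression underlying both lightning models can be sketched with plain gradient descent. A minimal illustration (ours; the squadron's predictors are only named in the abstract, so the feature columns here are hypothetical stand-ins for quantities such as current GPS-PW and GPS-PW changes):

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=5000):
    """Binary logistic regression fitted by gradient descent on the log-loss.

    X: (n, d) predictor matrix (e.g. GPS-PW features); y: 0/1 lightning labels.
    """
    X = np.hstack([np.ones((len(X), 1)), X])      # prepend intercept column
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))          # predicted lightning probability
        w -= lr * X.T @ (p - y) / len(y)          # gradient of mean log-loss
    return w

def predict_prob(w, x_row):
    """Probability of lightning for one new predictor row."""
    return 1.0 / (1.0 + np.exp(-(w[0] + np.dot(w[1:], x_row))))
```

Thresholding `predict_prob` at an operationally chosen probability would turn the fitted model into a yes/no advisory rule of the kind the forecast intervals above support.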
NASA Astrophysics Data System (ADS)
Dutton, John A.; James, Richard P.; Ross, Jeremy D.
2013-06-01
Seasonal probability forecasts produced with numerical dynamics on supercomputers offer great potential value in managing risk and opportunity created by seasonal variability. The skill and reliability of contemporary forecast systems can be increased by calibration methods that use the historical performance of the forecast system to improve the ongoing real-time forecasts. Two calibration methods are applied to seasonal surface temperature forecasts of the US National Weather Service, the European Centre for Medium Range Weather Forecasts, and to a World Climate Service multi-model ensemble created by combining those two forecasts with Bayesian methods. As expected, the multi-model is somewhat more skillful and more reliable than the original models taken alone. The potential value of the multimodel in decision making is illustrated with the profits achieved in simulated trading of a weather derivative. In addition to examining the seasonal models, the article demonstrates that calibrated probability forecasts of weekly average temperatures for leads of 2-4 weeks are also skillful and reliable. The conversion of ensemble forecasts into probability distributions of impact variables is illustrated with degree days derived from the temperature forecasts. Some issues related to loss of stationarity owing to long-term warming are considered. The main conclusion of the article is that properly calibrated probabilistic forecasts possess sufficient skill and reliability to contribute to effective decisions in government and business activities that are sensitive to intraseasonal and seasonal climate variability.
NASA Astrophysics Data System (ADS)
Frolov, Vladimir; Backhaus, Scott; Chertkov, Misha
2014-10-01
We explore optimization methods for planning the placement, sizing and operations of flexible alternating current transmission system (FACTS) devices installed to relieve transmission grid congestion. We limit our selection of FACTS devices to series compensation (SC) devices that can be represented by modification of the inductance of transmission lines. Our master optimization problem minimizes the l1 norm of the inductance modification subject to the usual line thermal-limit constraints. We develop heuristics that reduce this non-convex optimization to a succession of linear programs (LP) that are accelerated further using cutting plane methods. The algorithm solves an instance of the MatPower Polish Grid model (3299 lines and 2746 nodes) in 40 seconds per iteration on a standard laptop—a speed that allows the sizing and placement of a family of SC devices to correct a large set of anticipated congestions. We observe that our algorithm finds feasible solutions that are always sparse, i.e., SC devices are placed on only a few lines. In a companion manuscript, we demonstrate our approach on realistically sized networks that suffer congestion from a range of causes, including generator retirement. In this manuscript, we focus on the development of our approach, investigate its structure on a small test system subject to congestion from uniform load growth, and demonstrate computational efficiency on a realistically sized network.
Mean field games with congestion
NASA Astrophysics Data System (ADS)
Achdou, Yves; Porretta, Alessio
2018-03-01
We consider a class of systems of time-dependent partial differential equations which arise in mean field type models with congestion. The systems couple a backward viscous Hamilton-Jacobi equation and a forward Kolmogorov equation, both posed in $(0,T)\times (\mathbb{R}^N/\mathbb{Z}^N)$. Because of congestion, and by contrast with simpler cases, the system can never be seen as the optimality conditions of an optimal control problem driven by a partial differential equation. The Hamiltonian vanishes as the density tends to $+\infty$ and may not even be defined in the regions where the density is zero. After giving a suitable definition of weak solutions, we prove existence and uniqueness results under rather general assumptions. No restriction is made on the horizon $T$.
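For orientation, a typical backward-forward system of this type with a congestion-penalized Hamiltonian can be written as follows (a common form from the mean field games literature, shown for illustration; the paper's exact Hamiltonian and assumptions are more general):

```latex
\begin{aligned}
-\partial_t u - \nu \Delta u + \frac{|\nabla u|^{2}}{2\, m^{\alpha}} &= f(x,m)
  && \text{(backward viscous Hamilton--Jacobi)}\\
\partial_t m - \nu \Delta m - \operatorname{div}\!\left(\frac{m\,\nabla u}{m^{\alpha}}\right) &= 0
  && \text{(forward Kolmogorov)}
\end{aligned}
```

with $\alpha > 0$ and periodic boundary conditions on $(0,T)\times(\mathbb{R}^N/\mathbb{Z}^N)$. Here the Hamiltonian $|p|^{2}/(2m^{\alpha})$ vanishes as $m \to +\infty$ and degenerates where $m = 0$, matching the behavior described in the abstract.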
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freedman, Jeffrey M.; Manobianco, John; Schroeder, John
This Final Report presents a comprehensive description, findings, and conclusions for the Wind Forecast Improvement Project (WFIP) -- Southern Study Area (SSA) work led by AWS Truepower (AWST). This multi-year effort, sponsored by the Department of Energy (DOE) and the National Oceanic and Atmospheric Administration (NOAA), focused on improving short-term (15-minute - 6 hour) wind power production forecasts through the deployment of an enhanced observation network of surface and remote sensing instrumentation and the use of a state-of-the-art forecast modeling system. Key findings from the SSA modeling and forecast effort include:
1. The AWST WFIP modeling system produced an overall 10 - 20% improvement in wind power production forecasts over the existing Baseline system, especially during the first three forecast hours;
2. Improvements in ramp forecast skill, particularly for larger up and down ramps;
3. The AWST WFIP data denial experiments showed mixed results in the forecasts incorporating the experimental network instrumentation; however, ramp forecasts showed significant benefit from the additional observations, indicating that the enhanced observations were key to the modeling system's ability to capture phenomena responsible for producing large short-term excursions in power production;
4. The OU CAPS ARPS simulations showed that the additional WFIP instrument data had a small impact on their 3-km forecasts that lasted for the first 5-6 hours, and increasing the vertical model resolution in the boundary layer had a greater impact, also in the first 5 hours; and
5. The TTU simulations were inconclusive as to which assimilation scheme (3DVAR versus EnKF) provided better forecasts, and the additional observations resulted in some improvement to the forecasts in the first 1 - 3 hours.
NASA Astrophysics Data System (ADS)
Higgins, S. M. W.; Du, H. L.; Smith, L. A.
2012-04-01
Ensemble forecasting on a lead time of seconds over several years generates a large forecast-outcome archive, which can be used to evaluate and weight "models". Challenges which arise as the archive becomes smaller are investigated: in weather forecasting one typically has only thousands of forecasts; however, forecasts launched 6 hours apart are not independent of each other, nor is it justified to mix seasons with different dynamics. Seasonal forecasts, as from ENSEMBLES and DEMETER, typically have fewer than 64 unique launch dates; decadal forecasts fewer than eight; and long range climate forecasts arguably none. It is argued that one does not weight "models" so much as entire ensemble prediction systems (EPSs), and that the marginal value of an EPS will depend on the other members in the mix. The impact of using different skill scores is examined in the limits of both very large forecast-outcome archives (thereby evaluating the efficiency of the skill score) and very small forecast-outcome archives (illustrating fundamental limitations due to sampling fluctuations and memory in the physical system being forecast). It is shown that blending with climatology (J. Bröcker and L.A. Smith, Tellus A, 60(4), 663-678, (2008)) tends to increase the robustness of the results; a new kernel dressing methodology (simply ensuring that the expected probability mass tends to lie outside the range of the ensemble) is also illustrated. Fair comparisons using seasonal forecasts from the ENSEMBLES project illustrate the importance of these results with fairly small archives. The robustness of these results across the range of small, moderate and huge archives is demonstrated using imperfect models of perfectly known nonlinear (chaotic) dynamical systems. The implications these results hold for distinguishing the skill of a forecast from its value to a user of the forecast are discussed.
Performance of stochastic approaches for forecasting river water quality.
Ahmad, S; Khan, I H; Parida, B P
2001-12-01
This study analysed water quality data collected from the river Ganges in India from 1981 to 1990 for forecasting using stochastic models. Initially, box and whisker plots and Kendall's tau test were used to identify trends during the study period. Time series plots and cusum charts were used to detect possible interventions in the data. Three approaches to stochastic modelling, which account for the effect of seasonality in different ways, i.e. the multiplicative autoregressive integrated moving average (ARIMA) model, the deseasonalised model and the Thomas-Fiering model, were used to model the observed pattern in water quality. Multiplicative ARIMA models having both nonseasonal and seasonal components were, in general, identified as appropriate. In the deseasonalised modelling approach, lower order ARIMA models were found appropriate for the stochastic component. A set of Thomas-Fiering models was formed for each month for all water quality parameters. These models were then used to forecast future values. The error estimates of forecasts from the three approaches were compared to identify the most suitable approach for reliable forecasts. The deseasonalised modelling approach was recommended for forecasting river water quality parameters.
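The recommended deseasonalised approach can be sketched in a few lines: remove monthly means and variances, fit a low-order AR model to the residual, then re-seasonalise the forecast. A toy sketch (ours, not the authors' implementation):

```python
import numpy as np

def deseasonalise(x, period=12):
    """Standardise each calendar position by its own mean and std -- the
    'deseasonalised' step, after which a low-order AR/ARIMA model is fitted."""
    x = np.asarray(x, dtype=float)
    z = np.empty_like(x)
    means = np.zeros(period)
    stds = np.zeros(period)
    for m in range(period):
        vals = x[m::period]
        means[m] = vals.mean()
        stds[m] = vals.std() or 1.0          # guard against a constant month
        z[m::period] = (vals - means[m]) / stds[m]
    return z, means, stds

def ar1_fit(z):
    """AR(1) coefficient by least squares for the stochastic component."""
    return float(np.dot(z[:-1], z[1:]) / np.dot(z[:-1], z[:-1]))

def forecast_next(x, period=12):
    """One-step forecast: AR(1) in deseasonalised space, then re-seasonalise."""
    z, means, stds = deseasonalise(x, period)
    phi = ar1_fit(z)
    m_next = len(x) % period
    return means[m_next] + stds[m_next] * phi * z[-1]
```

The same skeleton applies to any monthly water quality parameter; only the fitted AR order and the seasonal period change.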
NASA Astrophysics Data System (ADS)
Foster, Kean; Bertacchi Uvo, Cintia; Olsson, Jonas
2018-05-01
Hydropower makes up nearly half of Sweden's electrical energy production. However, the distribution of the water resources is not aligned with demand, as most of the inflows to the reservoirs occur during the spring flood period. This means that carefully planned reservoir management is required to help redistribute water resources to ensure optimal production, and accurate forecasts of the spring flood volume (SFV) are essential for this. The current operational SFV forecasts use a historical ensemble approach where the HBV model is forced with historical observations of precipitation and temperature. In this work we develop and test a multi-model prototype, building on previous work, and evaluate its ability to forecast the SFV in 84 sub-basins in northern Sweden. The hypothesis explored in this work is that a multi-model seasonal forecast system incorporating different modelling approaches is generally more skilful at forecasting the SFV in snow-dominated regions than a forecast system that utilises only one approach. The testing is done using cross-validated hindcasts for the period 1981-2015, and the results are evaluated against both climatology and the current system to determine skill. Both of the multi-model methods considered showed skill over the reference forecasts. The version that combined the historical, dynamical and statistical modelling chains performed better than the other and was chosen for the prototype. The prototype was able to outperform the current operational system 57 % of the time on average and reduce the error in the SFV by ~6 % across all sub-basins and forecast dates.
Forecasting United States heartworm Dirofilaria immitis prevalence in dogs.
Bowman, Dwight D; Liu, Yan; McMahan, Christopher S; Nordone, Shila K; Yabsley, Michael J; Lund, Robert B
2016-10-10
This paper forecasts next year's canine heartworm prevalence in the United States from 16 climate, geographic and societal factors. The forecast's construction and an assessment of its performance are described. The forecast is based on a spatial-temporal conditional autoregressive model fitted to over 31 million antigen heartworm tests conducted in the 48 contiguous United States during 2011-2015. The forecast uses county-level data on 16 predictive factors, including temperature, precipitation, median household income, local forest and surface water coverage, and presence/absence of eight mosquito species. Non-static factors are extrapolated into the forthcoming year with various statistical methods. The fitted model and factor extrapolations are used to estimate next year's regional prevalence. The correlation between the observed and model-estimated county-by-county heartworm prevalence for the 5-year period 2011-2015 is 0.727, demonstrating reasonable model accuracy. The correlation between 2015 observed and forecasted county-by-county heartworm prevalence is 0.940, demonstrating significant skill and showing that heartworm prevalence can be forecasted reasonably accurately. The forecast presented herein can a priori alert veterinarians to areas expected to see higher than normal heartworm activity. The proposed methods may prove useful for forecasting other diseases.
NASA Astrophysics Data System (ADS)
Haiyang, Yu; Yanmei, Liu; Guijun, Yang; Xiaodong, Yang; Dong, Ren; Chenwei, Nie
2014-03-01
To achieve dynamic winter wheat quality monitoring and forecasting over larger-scale regions, the objective of this study was to design and develop a winter wheat quality monitoring and forecasting system using a remote sensing index and environmental factors. The winter wheat quality trend was forecasted before harvest, and quality was monitored after harvest. The traditional quality-vegetation-index remote sensing monitoring and forecasting models were improved. Combined with latitude information, the vegetation index was used to estimate agronomy parameters related to winter wheat quality in the early stages, in order to forecast the quality trend. A combination of rainfall and temperature in May, illumination in late May, soil available nitrogen content, and other environmental factors established the quality monitoring model. Compared with a simple quality-vegetation index, the remote sensing monitoring and forecasting models used in this system achieved greatly improved accuracy. Winter wheat quality was monitored and forecasted based on the above models, and the system was implemented using WebGIS technology. Finally, the operation of the winter wheat quality monitoring system was demonstrated in Beijing in 2010, and the monitoring and forecasting results were output as thematic maps.
Delay-based virtual congestion control in multi-tenant datacenters
NASA Astrophysics Data System (ADS)
Liu, Yuxin; Zhu, Danhong; Zhang, Dong
2018-03-01
With the evolution of cloud computing and virtualization, congestion control in virtual datacenters has become a basic issue for multi-tenant datacenter transmission. To address the fairness conflicts among the heterogeneous congestion control algorithms of multiple tenants, this paper proposes delay-based virtual congestion control, which uniformly translates the tenants' heterogeneous congestion control into delay-based feedback by introducing a translation layer in the hypervisor, modifying the three-way handshake for explicit feedback and packet-loss feedback, and throttling the receive window. Simulation results show that delay-based virtual congestion control can effectively resolve the unfairness among heterogeneous feedback congestion control algorithms.
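The delay-based feedback idea in this abstract can be illustrated with a Vegas-style window update, where the queuing-delay signal (measured RTT minus base RTT) drives the congestion window. This is a generic textbook sketch with invented thresholds, not the paper's hypervisor translation mechanism:

```python
def update_cwnd(cwnd, base_rtt, rtt, alpha=2.0, beta=4.0):
    """Adjust the congestion window from the delay signal rtt - base_rtt."""
    expected = cwnd / base_rtt             # throughput if there were no queuing delay
    actual = cwnd / rtt                    # throughput actually achieved
    diff = (expected - actual) * base_rtt  # estimated packets queued in the network
    if diff < alpha:
        return cwnd + 1                    # little queuing: grow the window
    if diff > beta:
        return cwnd - 1                    # queue building: back off
    return cwnd                            # inside the target band: hold
```

In the spirit of the translation layer, loss or ECN signals arriving from a tenant's stack could be mapped onto an equivalent RTT inflation before applying such a rule, so all tenants effectively react to one delay-based signal.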
Nambe Pueblo Water Budget and Forecasting model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brainard, James Robert
2009-10-01
This report documents the Nambe Pueblo Water Budget and Water Forecasting model. The model has been constructed using Powersim Studio (PS), a software package designed to investigate complex systems where flows and accumulations are central to the system. Here PS has been used as a platform for modeling various aspects of Nambe Pueblo's current and future water use. The model contains three major components: the Water Forecast Component, the Irrigation Scheduling Component, and the Reservoir Model Component. In each of the components, the user can change variables to investigate the impacts of water management scenarios on future water use. The Water Forecast Component includes forecasting for industrial, commercial, and livestock use. Domestic demand is also forecasted based on user-specified current population, population growth rates, and per capita water consumption. Irrigation efficiencies are quantified in the Irrigated Agriculture component using critical information concerning diversion rates, acreages, ditch dimensions, and seepage rates. Results from this section are used in the Water Demand Forecast, Irrigation Scheduling, and Reservoir Model components. The Reservoir Component contains two sections: (1) Storage and Inflow Accumulations by Categories and (2) Release, Diversion and Shortages. Results from both sections are derived from the calibrated Nambe Reservoir model, where historic, pre-dam or above-dam USGS stream flow data are fed into the model and releases are calculated.
Linden, Ariel
2018-05-11
Interrupted time series analysis (ITSA) is an evaluation methodology in which a single treatment unit's outcome is studied serially over time and the intervention is expected to "interrupt" the level and/or trend of that outcome. ITSA is commonly evaluated using methods that may produce biased results if model assumptions are violated. In this paper, treatment effects are instead assessed by using forecasting methods to closely fit the preintervention observations and then forecast the post-intervention trend. A treatment effect may be inferred if the actual post-intervention observations diverge from the forecasts by some specified amount. The forecasting approach is demonstrated using the effect of California's Proposition 99 on reducing cigarette sales. Three forecast models are fit to the preintervention series: linear regression (REG), Holt-Winters (HW) non-seasonal smoothing, and the autoregressive integrated moving average model (ARIMA); forecasts are then generated into the post-intervention period. The actual observations are compared with the forecasts to assess intervention effects. The preintervention data were fit best by HW, followed closely by ARIMA. REG fit the data poorly. The actual post-intervention observations were above the forecasts in HW and ARIMA, suggesting no intervention effect, but below the forecasts in REG (suggesting a treatment effect), thereby raising doubts about any definitive conclusion of a treatment effect. In a single-group ITSA, treatment effects are likely to be biased if the model is misspecified. Therefore, evaluators should consider using forecast models to accurately fit the preintervention data and generate plausible counterfactual forecasts, thereby improving causal inference of treatment effects in single-group ITSA studies. © 2018 John Wiley & Sons, Ltd.
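The forecasting approach to single-group ITSA described above can be sketched in a few lines: fit a model to the pre-intervention observations only, project it into the post-intervention period, and infer an effect only if the actual observations diverge from the forecast. Only the linear-regression (REG) variant is shown, on invented data:

```python
import numpy as np

# Invented outcome series; the paper's application uses cigarette sales data.
pre  = np.array([100.0, 97.0, 95.0, 92.0, 90.0, 88.0])  # pre-intervention observations
post = np.array([80.0, 77.0, 74.0])                     # actual post-intervention values

# Fit REG to the pre-intervention period only
t_pre = np.arange(len(pre))
slope, intercept = np.polyfit(t_pre, pre, 1)

# Forecast the counterfactual into the post-intervention period
t_post = np.arange(len(pre), len(pre) + len(post))
forecast = intercept + slope * t_post

# If actual observations consistently fall below the forecast, infer an effect
divergence = forecast - post
effect_inferred = bool(np.all(divergence > 0))
```

The paper's point is visible even in this toy: the inference depends on how well the chosen model fits the pre-intervention data, so HW or ARIMA forecasts should be generated the same way and compared.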
Seasonal forecasting of discharge for the Raccoon River, Iowa
NASA Astrophysics Data System (ADS)
Slater, Louise; Villarini, Gabriele; Bradley, Allen; Vecchi, Gabriel
2016-04-01
The state of Iowa (central United States) is regularly afflicted by severe natural hazards such as the 2008/2013 floods and the 2012 drought. To improve preparedness for these catastrophic events and allow Iowans to make more informed decisions about the most suitable water management strategies, we have developed a framework for medium- to long-range probabilistic seasonal streamflow forecasting for the Raccoon River at Van Meter, an 8900-km2 catchment located in central-western Iowa. Our flow forecasts use statistical models to predict seasonal discharge for low to high flows, with forecast lead times ranging from one to ten months. Historical measurements of daily discharge are obtained from the U.S. Geological Survey (USGS) at the Van Meter stream gage and used to compute quantile time series from minimum to maximum seasonal flow. The model is forced with basin-averaged total seasonal precipitation records from the PRISM Climate Group and annual row crop production acreage from the U.S. Department of Agriculture's National Agricultural Statistics Service database. For the forecasts, we use corn and soybean production from the previous year (persistence forecast) as a proxy for the impacts of agricultural practices on streamflow. The monthly precipitation forecasts are provided by eight Global Climate Models (GCMs) from the North American Multi-Model Ensemble (NMME), with lead times ranging from 0.5 to 11.5 months and a resolution of 1 decimal degree. Additionally, precipitation from the month preceding each season is used to characterize antecedent soil moisture conditions. The accuracy of our modelled (1927-2015) and forecasted (2001-2015) discharge values is assessed by comparison with the observed USGS data. We explore the sensitivity of forecast skill over the full range of lead times, flow quantiles, forecast seasons, and with each GCM.
Forecast skill is also examined using different formulations of the statistical models, as well as NMME forecast weighting procedures based on the computed potential skill (historical forecast accuracy) of the different GCMs. We find that the models describe the year-to-year variability in streamflow accurately, as well as the overall tendency towards increasing (and more variable) discharge over time. Surprisingly, forecast skill does not decrease markedly with lead time, and high flows tend to be well predicted, suggesting that these forecasts may have considerable practical applications. Further, the seasonal flow forecast accuracy is substantially improved by weighting the contribution of individual GCMs to the forecasts, and also by the inclusion of antecedent precipitation. Our results can provide critical information for adaptation strategies aiming to mitigate the costs and disruptions arising from flood and drought conditions, and allow us to determine how far in advance skillful forecasts can be issued. The availability of these discharge forecasts would have major societal and economic benefits for hydrology and water resources management, agriculture, disaster forecasts and prevention, energy, finance and insurance, food security, policy-making and public authorities, and transportation.
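As a rough illustration of the statistical models described above, seasonal discharge can be regressed on basin-averaged seasonal precipitation and prior-year crop acreage (the persistence proxy). The numbers are invented, and a single ordinary-least-squares fit stands in for the study's quantile-specific formulations:

```python
import numpy as np

precip  = np.array([300.0, 250.0, 400.0, 350.0, 280.0])  # seasonal precipitation (mm)
acreage = np.array([1.00, 1.02, 1.05, 1.08, 1.10])       # prior-year row-crop acreage index
flow    = np.array([55.0, 44.0, 80.0, 68.0, 52.0])       # seasonal discharge (m^3/s)

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones_like(precip), precip, acreage])
coef, *_ = np.linalg.lstsq(X, flow, rcond=None)

# Forecast for a new season: an NMME-style precipitation value plus last year's acreage
x_new = np.array([1.0, 320.0, 1.12])
flow_forecast = float(x_new @ coef)
```

Replacing the observed precipitation predictor with each GCM's forecast, and refitting per flow quantile and season, gives the kind of forecast matrix whose skill the study evaluates.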
Weather Forecaster Understanding of Climate Models
NASA Astrophysics Data System (ADS)
Bol, A.; Kiehl, J. T.; Abshire, W. E.
2013-12-01
Weather forecasters, particularly those in broadcasting, are the primary conduit to the public for information on climate and climate change. However, many weather forecasters remain skeptical of model-based climate projections. To address this issue, The COMET Program developed an hour-long online lesson on how climate models work, targeting an audience of weather forecasters. The module draws on forecasters' pre-existing knowledge of weather, climate, and numerical weather prediction (NWP) models. In order to measure learning outcomes, quizzes were given before and after the lesson. Preliminary results show large learning gains. For all participants who took both pre- and post-tests (n=238), scores improved from 48% to 80%. Similar pre/post improvement occurred for National Weather Service employees (51% to 87%, n=22) and college faculty (50% to 90%, n=7). We believe these results indicate a fundamental misunderstanding among many weather forecasters of (1) the difference between weather and climate models, (2) how researchers use climate models, and (3) how they interpret model results. The quiz results indicate that efforts to educate the public about climate change need to include weather forecasters, a vital link between the research community and the general public.
Skill of ENSEMBLES seasonal re-forecasts for malaria prediction in West Africa
NASA Astrophysics Data System (ADS)
Jones, A. E.; Morse, A. P.
2012-12-01
This study examines the performance of malaria-relevant climate variables from the ENSEMBLES seasonal ensemble re-forecasts for sub-Saharan West Africa, using a dynamic malaria model to transform temperature and rainfall forecasts into simulated malaria incidence and verifying these forecasts against simulations obtained by driving the malaria model with General Circulation Model-derived reanalysis. Two subregions of forecast skill are identified: the highlands of Cameroon, where low temperatures limit simulated malaria during the forecast period and interannual variability in simulated malaria is closely linked to variability in temperature, and northern Nigeria/southern Niger, where simulated malaria variability is strongly associated with rainfall variability during the peak rain months.
A novel hybrid ensemble learning paradigm for tourism forecasting
NASA Astrophysics Data System (ADS)
Shabri, Ani
2015-02-01
In this paper, a hybrid forecasting model based on Empirical Mode Decomposition (EMD) and the Group Method of Data Handling (GMDH) is proposed to forecast tourism demand. This methodology first decomposes the original visitor arrival series into several Intrinsic Mode Function (IMF) components and one residual component using the EMD technique. The IMF components and the residual component are then forecasted separately using GMDH models whose input variables are selected using the Partial Autocorrelation Function (PACF). The final forecasted result for the tourism series is produced by aggregating all the forecasted results. For evaluating the performance of the proposed EMD-GMDH methodology, the monthly data of tourist arrivals from Singapore to Malaysia are used as an illustrative example. Empirical results show that the proposed EMD-GMDH model outperforms the EMD-ARIMA model as well as the GMDH and ARIMA (Autoregressive Integrated Moving Average) models without time series decomposition.
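The decomposition-ensemble workflow (decompose, forecast each component, aggregate) can be sketched as follows. EMD and GMDH are replaced by simple stand-ins, an exponential-moving-average trend split and an AR(1) component model, purely to show the structure; this is not the authors' implementation:

```python
import numpy as np

t = np.arange(60)
arrivals = 1000.0 + 5.0 * t + 50.0 * np.sin(2 * np.pi * t / 12)  # synthetic monthly arrivals

def ema(series, alpha=0.2):
    """Causal exponential smoothing, used here as the smooth-component extractor."""
    out = np.empty_like(series)
    out[0] = series[0]
    for i in range(1, len(series)):
        out[i] = alpha * series[i] + (1 - alpha) * out[i - 1]
    return out

# 1. Decompose into a smooth component and a residual (stand-in for EMD's IMFs)
trend = ema(arrivals)
residual = arrivals - trend

def ar1_forecast(series):
    """One-step AR(1) forecast (stand-in for a per-component GMDH model)."""
    x, y = series[:-1], series[1:]
    phi = np.dot(x - x.mean(), y - y.mean()) / np.dot(x - x.mean(), x - x.mean())
    c = y.mean() - phi * x.mean()
    return c + phi * series[-1]

# 2. Forecast each component separately, then 3. aggregate the component forecasts
forecast = ar1_forecast(trend) + ar1_forecast(residual)
```

The appeal of the paradigm is that each component is smoother or more regular than the raw series, so even simple component models can combine into a competitive aggregate forecast.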
Electricity Load Forecasting Using Support Vector Regression with Memetic Algorithms
Hu, Zhongyi; Bao, Yukun; Xiong, Tao
2013-01-01
Electricity load forecasting is an important issue that is widely explored and examined in the power systems operation literature and in the literature on commercial transactions in electricity markets. Among the existing forecasting models, support vector regression (SVR) has gained much attention. Because the performance of SVR depends strongly on its parameters, this study proposes a firefly algorithm (FA) based memetic algorithm (FA-MA) to appropriately determine the parameters of the SVR forecasting model. In the proposed FA-MA algorithm, the FA is applied to explore the solution space, and pattern search is used to conduct individual learning and thus enhance the exploitation of FA. Experimental results confirm that the proposed FA-MA based SVR model can not only yield more accurate forecasting results than the other four evolutionary-algorithm-based SVR models and three well-known forecasting models but also outperform the hybrid algorithms reported in the related literature. PMID:24459425
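The memetic structure described above, population-based global exploration plus a local "individual learning" step, can be sketched on a toy problem. Random candidates stand in for the firefly moves and a quadratic bowl stands in for the SVR cross-validation error, so only the explore-then-refine pattern is illustrated:

```python
import random

random.seed(42)

def objective(c, gamma):
    """Stand-in for SVR validation error as a function of its two parameters."""
    return (c - 3.0) ** 2 + (gamma - 0.5) ** 2

def pattern_search(c, gamma, step=0.25, iters=60):
    """Local refinement: the memetic 'individual learning' step."""
    best = objective(c, gamma)
    for _ in range(iters):
        moved = False
        for dc, dg in ((step, 0.0), (-step, 0.0), (0.0, step), (0.0, -step)):
            val = objective(c + dc, gamma + dg)
            if val < best:
                best, c, gamma, moved = val, c + dc, gamma + dg, True
        if not moved:
            step /= 2.0  # shrink the stencil when no direction improves
    return c, gamma, best

# Global exploration: random candidates (stand-in for firefly attraction moves),
# each refined by local pattern search before selecting the overall best.
candidates = [(random.uniform(0.0, 10.0), random.uniform(0.0, 2.0)) for _ in range(15)]
results = [pattern_search(c, g) for c, g in candidates]
c_best, gamma_best, err_best = min(results, key=lambda r: r[2])
```

The design point the paper makes is exactly this division of labor: the population search avoids poor basins, while the derivative-free local search squeezes out the remaining error near each candidate.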
A GLM Post-processor to Adjust Ensemble Forecast Traces
NASA Astrophysics Data System (ADS)
Thiemann, M.; Day, G. N.; Schaake, J. C.; Draijer, S.; Wang, L.
2011-12-01
The skill of hydrologic ensemble forecasts has improved in recent years through a better understanding of climate variability, better climate forecasts, and new data assimilation techniques. Such forecasts have been used extensively for probabilistic water supply forecasting, and interest is developing in utilizing them in operational decision making. Hydrologic ensemble forecast members typically have inherent biases in flow timing and volume caused by (1) structural errors in the models used, (2) systematic errors in the data used to calibrate those models, (3) uncertain initial hydrologic conditions, and (4) uncertainties in the forcing datasets. Furthermore, hydrologic models have often not been developed for operational decision points, and ensemble forecasts are thus not always available where needed. A statistical post-processor can be used to address these issues. The post-processor should (1) correct for systematic biases in flow timing and volume, (2) preserve the skill of the available raw forecasts, (3) preserve spatial and temporal correlation as well as the uncertainty in the forecasted flow data, (4) produce adjusted forecast ensembles that represent the variability of the observed hydrograph to be predicted, and (5) preserve individual forecast traces as equally likely. The post-processor should also allow for the translation of available ensemble forecasts to hydrologically similar locations where forecasts are not available. This paper introduces an ensemble post-processor (EPP) developed in support of New York City water supply operations. The EPP employs a general linear model (GLM) to (1) adjust available ensemble forecast traces and (2) create new ensembles for (nearby) locations where only historic flow observations are available. The EPP is calibrated by developing daily and aggregated statistical relationships from historical flow observations and model simulations.
These are then used in operation to obtain the conditional probability density function (PDF) of the observations to be predicted, thus jointly adjusting individual ensemble members. These steps are executed in a normalized transformed space ('z'-space) to account for the strong non-linearity in the flow observations involved. A data window centered on each calibration date is used to minimize impacts from sampling errors and data noise. Testing on datasets from California and New York suggests that the EPP can successfully minimize biases in ensemble forecasts, while preserving the raw forecast skill in a 'days to weeks' forecast horizon and reproducing the variability of climatology for 'weeks to years' forecast horizons.
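The trace-adjustment idea can be approximated with a plain quantile-mapping sketch: each raw member keeps its rank but is mapped from the model's climatology to the observed climatology, removing systematic volume bias while leaving the traces equally likely. The operational EPP instead works in a normal-transformed space with a GLM and calibration-date data windows; this simplified stand-in uses invented data:

```python
import numpy as np

rng = np.random.default_rng(1)
obs_clim  = np.sort(rng.gamma(2.0, 50.0, size=2000))  # observed flow climatology
fcst_clim = np.sort(obs_clim * 1.3 + 20.0)            # model climatology, biased high

def adjust(member):
    """Map a raw member to the observed quantile at its model-climatology rank."""
    p = (np.searchsorted(fcst_clim, member) + 0.5) / (len(fcst_clim) + 1)
    return float(np.quantile(obs_clim, np.clip(p, 0.0, 1.0)))

raw_ensemble = np.array([150.0, 180.0, 240.0, 310.0])  # raw forecast traces for one day
adjusted = np.array([adjust(m) for m in raw_ensemble])
```

Because the mapping is monotone, member ranks (and hence spread ordering) are preserved, which is one of the post-processor requirements listed in the abstract.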
Convective Weather Forecast Accuracy Analysis at Center and Sector Levels
NASA Technical Reports Server (NTRS)
Wang, Yao; Sridhar, Banavar
2010-01-01
This paper presents a detailed convective forecast accuracy analysis at the center and sector levels. The study aims to provide more meaningful forecast verification measures to the aviation community, as well as to obtain useful information leading to improvements in the weather translation capacity models. In general, the vast majority of forecast verification efforts over past decades have been on the calculation of traditional standard verification measure scores over forecast and observation data analyses onto grids. These verification measures, based on binary classification, have been applied in quality assurance of weather forecast products at the national level for many years. Our research focuses on forecasts at the center and sector levels. We first calculate the standard forecast verification measure scores for en-route air traffic centers and sectors, followed by the forecast validation analysis and related verification measures for weather intensities and locations at the center and sector levels. An approach to improve the prediction of sector weather coverage using multiple sector forecasts is then developed. The severe weather intensity assessment was carried out using the correlations between forecast and actual weather observation airspace coverage. The weather forecast accuracy on horizontal location was assessed by examining the forecast errors. The improvement in prediction of weather coverage was determined by the correlation between actual sector weather coverage and prediction. The analysis used observed and forecasted Convective Weather Avoidance Model (CWAM) data collected from June to September 2007. CWAM zero-minute forecast data with aircraft avoidance probabilities of 60% and 80% are used as the actual weather observation. All forecast measurements are based on 30-minute, 60-minute, 90-minute, and 120-minute forecasts with the same avoidance probabilities.
The forecast accuracy analysis for times under one hour showed that the errors in intensity and location for center forecasts are relatively low. For example, 1-hour forecast intensity and horizontal location errors for the ZDC center were about 0.12 and 0.13. However, the correlation between the sector 1-hour forecast and actual weather coverage was weak: for sector ZDC32, about 32% of the total variation of observed weather intensity was unexplained by the forecast, and the sector horizontal location error was about 0.10. The paper also introduces an approach to estimate the sector three-dimensional actual weather coverage by using multiple sector forecasts, which turned out to produce better predictions. Using a Multiple Linear Regression (MLR) model for this approach, the correlations between actual observations and the multiple-sector forecast model prediction improved by several percentage points at the 95% confidence level in comparison with the single sector forecast.
Influenza forecasting in human populations: a scoping review.
Chretien, Jean-Paul; George, Dylan; Shaman, Jeffrey; Chitale, Rohit A; McKenzie, F Ellis
2014-01-01
Forecasts of influenza activity in human populations could help guide key preparedness tasks. We conducted a scoping review to characterize the methodological approaches used and identify research gaps. Adapting the PRISMA methodology for systematic reviews, we searched PubMed, CINAHL, Project Euclid, and the Cochrane Database of Systematic Reviews for publications in English since January 1, 2000 using the terms "influenza AND (forecast* OR predict*)", excluding studies that did not validate forecasts against independent data or incorporate influenza-related surveillance data from the season or pandemic for which the forecasts were applied. We included 35 publications describing population-based (N = 27), medical facility-based (N = 4), and regional or global pandemic spread (N = 4) forecasts. They included areas of North America (N = 15), Europe (N = 14), and/or the Asia-Pacific region (N = 4), or had global scope (N = 3). Forecasting models were statistical (N = 18) or epidemiological (N = 17). Five studies used data assimilation methods to update forecasts with new surveillance data. Models used virological (N = 14), syndromic (N = 13), meteorological (N = 6), internet search query (N = 4), and/or other surveillance data as inputs. Forecasting outcomes and validation metrics varied widely. Two studies compared distinct modeling approaches using common data, 2 assessed model calibration, and 1 systematically incorporated expert input. Of the 17 studies using epidemiological models, 8 included sensitivity analysis. This review suggests the need for good practices in influenza forecasting (e.g., sensitivity analysis); direct comparisons of diverse approaches; assessment of model calibration; integration of subjective expert input; operational research in pilot, real-world applications; and improved mutual understanding among modelers and public health officials.
NASA Astrophysics Data System (ADS)
Sulaiman, M.; El-Shafie, A.; Karim, O.; Basri, H.
2011-10-01
Flood forecasting models are a necessity, as they help in planning for flood events and thus help prevent loss of life and minimize damage. At present, artificial neural networks (ANN) have been successfully applied in river flow and water level forecasting studies. ANN requires historical data to develop a forecasting model. However, long-term historical water level data, such as hourly data, pose two crucial problems in training. The first is that the high volume of data slows the computation process. The second is that training reaches its optimal performance within a few cycles because normal water level data dominate the training set, while the forecasting performance for high water level events remains poor. In this study, the zoning matching approach (ZMA) is used in ANN to accurately monitor flood events in real time by focusing the development of the forecasting model on high water level zones. ZMA is a trial-and-error approach in which several training datasets using high water level data are tested to find the best training dataset for forecasting high water level events. The advantage of ZMA is that relevant knowledge of water level patterns in historical records is used. Importantly, the forecasting model developed based on ZMA achieves high-accuracy forecasting results at 1 to 3 h ahead and satisfactory performance at 6 h. Seven performance measures are adopted in this study to describe the accuracy and reliability of the forecasting model developed.
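The trial-and-error selection of high-water training zones in ZMA can be sketched as follows, with a simple one-step linear model standing in for the ANN and an invented synthetic level series. Candidate training thresholds are scored on a high-water validation zone and the best one retained:

```python
import numpy as np

# Synthetic autocorrelated "hourly water level" series (invented)
rng = np.random.default_rng(7)
shocks = rng.gamma(1.0, 1.0, size=500)
levels = np.empty(500)
levels[0] = 5.0
for i in range(1, 500):
    levels[i] = 0.8 * levels[i - 1] + shocks[i]

x, y = levels[:-1], levels[1:]               # (current level, level one hour ahead)
high_zone = x > np.quantile(x, 0.9)          # validation set: high-water events only

def fit_eval(threshold):
    """Train y = a*x + b on records above `threshold`; score RMSE on the high zone."""
    keep = x > threshold
    a, b = np.polyfit(x[keep], y[keep], 1)
    err = y[high_zone] - (a * x[high_zone] + b)
    return float(np.sqrt(np.mean(err ** 2)))

# Trial and error over candidate training zones, as in ZMA
thresholds = [0.0, float(np.quantile(x, 0.5)), float(np.quantile(x, 0.8))]
scores = [fit_eval(t) for t in thresholds]
best_threshold = thresholds[int(np.argmin(scores))]
```

The point of the approach survives the simplification: by restricting the training set to records near the zone of interest, the model is tuned to the events that matter operationally rather than to the abundant normal-level data.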
Seasonal forecast of St. Louis encephalitis virus transmission, Florida.
Shaman, Jeffrey; Day, Jonathan F; Stieglitz, Marc; Zebiak, Stephen; Cane, Mark
2004-05-01
Disease transmission forecasts can help minimize human and domestic animal health risks by indicating where disease control and prevention efforts should be focused. For disease systems in which weather-related variables affect pathogen proliferation, dispersal, or transmission, the potential for disease forecasting exists. We present a seasonal forecast of St. Louis encephalitis virus transmission in Indian River County, Florida. We derive an empirical relationship between modeled land surface wetness and levels of SLEV transmission in humans. We then use these data to forecast SLEV transmission with a seasonal lead. Forecast skill is demonstrated, and a real-time seasonal forecast of epidemic SLEV transmission is presented. This study demonstrates how weather and climate forecast skill-verification analyses may be applied to test the predictability of an empirical disease forecast model.
Tourism forecasting using modified empirical mode decomposition and group method of data handling
NASA Astrophysics Data System (ADS)
Yahya, N. A.; Samsudin, R.; Shabri, A.
2017-09-01
In this study, a hybrid model using modified Empirical Mode Decomposition (EMD) and the Group Method of Data Handling (GMDH) is proposed for tourism forecasting. This approach reconstructs the intrinsic mode functions (IMFs) produced by EMD using a trial-and-error method. The new component and the remaining IMFs are then predicted separately using the GMDH model. Finally, the forecasted results for each component are aggregated to construct an ensemble forecast. The data used in this experiment are monthly time series of tourist arrivals from China, Thailand and India to Malaysia from 2000 to 2016. The performance of the model is evaluated using the Root Mean Square Error (RMSE) and the Mean Absolute Percentage Error (MAPE), with the conventional GMDH model and the EMD-GMDH model used as benchmarks. Empirical results show that the proposed model produced better forecasts than the benchmark models.
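The two evaluation metrics named above have standard definitions, shown here on invented data:

```python
import numpy as np

def rmse(actual, pred):
    """Root Mean Square Error."""
    actual, pred = np.asarray(actual, float), np.asarray(pred, float)
    return float(np.sqrt(np.mean((actual - pred) ** 2)))

def mape(actual, pred):
    """Mean Absolute Percentage Error, in percent."""
    actual, pred = np.asarray(actual, float), np.asarray(pred, float)
    return float(np.mean(np.abs((actual - pred) / actual)) * 100.0)

arrivals  = [120.0, 130.0, 125.0, 140.0]   # invented monthly tourist arrivals (thousands)
forecasts = [118.0, 133.0, 124.0, 136.0]   # invented model forecasts
```

RMSE penalizes large misses quadratically, while MAPE is scale-free, which is why tourism studies commonly report both.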
Multimodal solutions for large scale evacuations.
DOT National Transportation Integrated Search
2009-12-30
In this research, a multimodal transportation model was developed attending the needs of emergency situations, : and the solutions provided by the model could be used to moderate congestion during such events. : The model incorporated features such a...
NASA Astrophysics Data System (ADS)
LI, Y.; Castelletti, A.; Giuliani, M.
2014-12-01
Over recent years, long-term climate forecasts from global circulation models (GCMs) have demonstrated increasing skill over climatology, thanks to advances in the modelling of coupled ocean-atmosphere dynamics. Improved information from long-term forecasts is expected to be a valuable support to farmers in optimizing farming operations (e.g. crop choice, cropping time) and in coping more effectively with the adverse impacts of climate variability. Yet evaluating how valuable this information can be is not straightforward, and farmers' response must be taken into consideration. Indeed, while long-range forecasts are traditionally evaluated in terms of accuracy by comparing hindcast and observed values, in the context of agricultural systems potentially useful forecast information should alter the stakeholders' expectations, modify their decisions, and ultimately have an impact on their annual benefit. It is therefore more desirable to assess the value of those long-term forecasts via decision-making models, so as to extract direct indications of probable decision outcomes for farmers, i.e. from an end-to-end perspective. In this work, we evaluate the operational value of thirteen state-of-the-art long-range forecast ensembles against climatology forecasts and subjective predictions (i.e. past-year climate and historical average) within an integrated agronomic modeling framework embedding an implicit model of farmers' behavior. The collected ensemble datasets are bias-corrected and downscaled using a stochastic weather generator, in order to address the mismatch in spatio-temporal scale between forecast data from GCMs and the distributed crop simulation model. The agronomic model is first simulated using the forecast information (ex-ante), followed by a second run with the actual climate (ex-post).
Multi-year simulations are performed to account for climate variability, and the value of the different climate forecasts is evaluated against the perfect-foresight scenario based on the expected crop productivity as well as the land-use decisions. Our results show that not all the products generate beneficial effects for farmers and that the forecast errors might be amplified by the farmers' decisions.
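The bias-correction step described above can be illustrated with a minimal empirical quantile-mapping sketch. All names, the synthetic data, and the simple constant bias below are assumptions for illustration; the study's actual correction is coupled with a stochastic weather generator.

```python
import numpy as np

def quantile_map(forecast, hindcast, observed):
    """Map raw forecast values onto the observed climatology by matching
    empirical quantiles of the hindcast distribution (a common
    bias-correction step before feeding GCM output to a crop model)."""
    # Empirical quantile of each forecast value within the hindcast sample
    ranks = np.searchsorted(np.sort(hindcast), forecast, side="right")
    q = np.clip(ranks / len(hindcast), 0.0, 1.0)
    # Read the same quantile off the observed distribution
    return np.quantile(observed, q)

# Toy case: the hindcast runs 2 degrees too warm relative to observations
hindcast = np.random.default_rng(0).normal(22.0, 1.5, 1000)
observed = hindcast - 2.0
corrected = quantile_map(np.array([22.0]), hindcast, observed)
```

For the constant bias above, the mapped value lands near 20.0, i.e. the bias is removed while the value's position in the distribution is preserved.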
Extended Range Prediction of Indian Summer Monsoon: Current status
NASA Astrophysics Data System (ADS)
Sahai, A. K.; Abhilash, S.; Borah, N.; Joseph, S.; Chattopadhyay, R.; S, S.; Rajeevan, M.; Mandal, R.; Dey, A.
2014-12-01
The main focus of this study is to develop a forecast consensus in the extended range prediction (ERP) of monsoon intraseasonal oscillations using a suite of different variants of the Climate Forecast System (CFS) model. In this CFS-based grand MME prediction system (CGMME), the ensemble members are generated by perturbing the initial condition and using different configurations of CFSv2. This is to address the role of different physical mechanisms known to have control on the error growth in the ERP at the 15-20 day time scale. The final formulation of CGMME is based on 21 ensembles of the standalone Global Forecast System (GFS) forced with bias-corrected forecasted SST from CFS, 11 low-resolution CFST126 and 11 high-resolution CFST382. Thus, we develop the multi-model consensus forecast for the ERP of the Indian summer monsoon (ISM) using a suite of different variants of the CFS model. This coordinated international effort has led to the development of specific tailor-made regional forecast products over the Indian region. Skill of deterministic and probabilistic categorical rainfall forecasts, as well as the verification of large-scale low-frequency monsoon intraseasonal oscillations, has been assessed using hindcasts from 2001-2012 during the monsoon season, in which all models are initialized every five days from 16 May to 28 September. The skill of the deterministic forecast from CGMME is better than that of the best participating single-model ensemble configuration (SME). The CGMME approach is believed to quantify the uncertainty in both initial conditions and model formulation. The main improvement is attained in the probabilistic forecast, because of an increase in the ensemble spread, thereby reducing the error due to over-confident ensembles in a single model configuration. For the probabilistic forecast, three tercile ranges are determined by a ranking method based on the percentage of ensemble members from all the participating models that fall in those three categories. 
CGMME further adds value to both deterministic and probabilistic forecasts compared to the raw SMEs, and this better skill probably flows from the larger spread and improved spread-error relationship. The CGMME system is currently capable of generating ER predictions in real time and has successfully delivered its experimental operational ER forecast of the ISM for the last few years.
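The ranking-based tercile probabilities can be sketched as follows: pool all participating models' members and count the fraction falling below, between, or above the climatological terciles. The member values and toy climatology are made up for illustration; this is not the CGMME implementation.

```python
import numpy as np

def tercile_probabilities(ensemble, climatology):
    """Categorical probabilities as the fraction of pooled ensemble members
    falling below, between, or above the climatological terciles."""
    lo, hi = np.quantile(climatology, [1 / 3, 2 / 3])
    ensemble = np.asarray(ensemble, dtype=float)
    p_below = np.mean(ensemble < lo)
    p_above = np.mean(ensemble > hi)
    return p_below, 1.0 - p_below - p_above, p_above

clim = np.arange(1.0, 31.0)             # toy 30-year rainfall climatology
members = [5.0, 8.0, 9.0, 12.0, 25.0]   # pooled multi-model members
probs = tercile_probabilities(members, clim)  # (below, near, above normal)
```

By construction the three probabilities sum to one, and a wider multi-model spread moves mass out of a single over-confident category.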
Selection and Classification Using a Forecast Applicant Pool.
ERIC Educational Resources Information Center
Hendrix, William H.
The document presents a forecast model of the future Air Force applicant pool. By forecasting applicants' quality (means and standard deviations of aptitude scores) and quantity (total number of applicants), a potential enlistee could be compared to the forecasted pool. The data used to develop the model consisted of means, standard deviations, and…
Hysteresis phenomena of the intelligent driver model for traffic flow
NASA Astrophysics Data System (ADS)
Dahui, Wang; Ziqiang, Wei; Ying, Fan
2007-07-01
We present hysteresis phenomena of the intelligent driver model for traffic flow in a circular one-lane roadway. We show that the microscopic structure of traffic flow is dependent on its initial state by plotting the fraction of congested vehicles over the density, which shows a typical hysteresis loop, and by investigating the trajectories of vehicles on the velocity-over-headway plane. We find that the trajectories of vehicles on the velocity-over-headway plane, which usually show a hysteresis loop, include multiple loops. We also point out the relations between these hysteresis loops and the congested jams or high-density clusters in traffic flow.
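The intelligent driver model used in these experiments computes each vehicle's acceleration from its speed, its gap to the leader, and the approach rate. A minimal sketch of the standard IDM acceleration function, with illustrative parameter values rather than those of the paper:

```python
import numpy as np

def idm_acceleration(v, gap, dv, v0=30.0, T=1.5, a=1.0, b=2.0, s0=2.0):
    """Intelligent driver model: acceleration from own speed v,
    bumper-to-bumper gap, and approach rate dv = v - v_leader.
    v0: desired speed, T: time headway, a: max acceleration,
    b: comfortable deceleration, s0: minimum gap (illustrative values)."""
    s_star = s0 + v * T + v * dv / (2.0 * np.sqrt(a * b))  # desired gap
    return a * (1.0 - (v / v0) ** 4 - (s_star / gap) ** 2)

# Free road: huge gap, no approach rate -> near-maximal acceleration
free = idm_acceleration(v=10.0, gap=1e6, dv=0.0)
# Closing in fast on a slow leader -> strong braking
brake = idm_acceleration(v=30.0, gap=10.0, dv=10.0)
```

Integrating this acceleration for many vehicles on a ring road, starting from different initial states, is what produces the velocity-over-headway loops described in the abstract.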
Nature of the Congested Traffic and Quasi-steady States of the General Motor Models
NASA Astrophysics Data System (ADS)
Yang, Bo; Xu, Xihua; Pang, John Z. F.; Monterola, Christopher
2015-03-01
We look at the General Motors (GM) class of microscopic traffic models and analyze some of the universal features of the (multi-)cluster solutions, including the emergence of an intrinsic scale and the quasisoliton dynamics. We show that the GM models can capture the essential physics of real traffic dynamics, especially the phase transition from the free flow to the congested phase, from which the wide moving jams emerge (the F-S-J transition pioneered by B.S. Kerner). In particular, the congested phase can be associated with either the multi-cluster quasi-steady states or their more homogeneous precursor states. In both cases the states can last for a long time, and the narrow clusters will eventually grow and merge, leading to the formation of the wide moving jams. We present a general method to fit the empirical parameters so that both quantitative and qualitative macroscopic empirical features can be reproduced with a minimal GM model. We present numerical results for the traffic dynamics both with and without a bottleneck, including various types of spontaneous and induced ``synchronized flow,'' as well as the evolution of wide moving jams. We also discuss the implications for the nature of different phases in traffic dynamics.
Traffic signal synchronization in the saturated high-density grid road network.
Hu, Xiaojian; Lu, Jian; Wang, Wei; Zhirui, Ye
2015-01-01
Most existing traffic signal synchronization strategies do not perform well in the saturated high-density grid road network (HGRN). Traffic congestion often occurs in the saturated HGRN, and the mobility of the network is difficult to restore. In order to alleviate traffic congestion and to improve traffic efficiency in the network, the study proposes a regional traffic signal synchronization strategy, named the long green and long red (LGLR) traffic signal synchronization strategy. The essence of the strategy is to control the formation and dissipation of queues and to maximize the efficiency of traffic flows at signalized intersections in the saturated HGRN. With this strategy, the same signal control timing plan is used at all signalized intersections in the HGRN, and the straight phase of the control timing plan has a long green time and a long red time. Therefore, continuous traffic flows can be maintained when vehicles travel, and traffic congestion can be alleviated when vehicles stop. Using the strategy, the LGLR traffic signal synchronization model is developed, with the objective of minimizing the number of stops. Finally, the simulation is executed to analyze the performance of the model by comparing it to other models, and the superiority of the LGLR model is evident in terms of delay, number of stops, queue length, and overall performance in the saturated HGRN.
Verification of Ensemble Forecasts for the New York City Operations Support Tool
NASA Astrophysics Data System (ADS)
Day, G.; Schaake, J. C.; Thiemann, M.; Draijer, S.; Wang, L.
2012-12-01
The New York City water supply system operated by the Department of Environmental Protection (DEP) serves nine million people. It covers 2,000 square miles of portions of the Catskill, Delaware, and Croton watersheds, and it includes nineteen reservoirs and three controlled lakes. DEP is developing an Operations Support Tool (OST) to support its water supply operations and planning activities. OST includes historical and real-time data, a model of the water supply system complete with operating rules, and lake water quality models developed to evaluate alternatives for managing turbidity in the New York City Catskill reservoirs. OST will enable DEP to manage turbidity in its unfiltered system while satisfying its primary objective of meeting the City's water supply needs, in addition to considering secondary objectives of maintaining ecological flows, supporting fishery and recreation releases, and mitigating downstream flood peaks. The current version of OST relies on statistical forecasts of flows in the system based on recent observed flows. To improve short-term decision making, plans are being made to transition to National Weather Service (NWS) ensemble forecasts based on hydrologic models that account for short-term weather forecast skill, longer-term climate information, as well as the hydrologic state of the watersheds and recent observed flows. To ensure that the ensemble forecasts are unbiased and that the ensemble spread reflects the actual uncertainty of the forecasts, a statistical model has been developed to post-process the NWS ensemble forecasts to account for hydrologic model error as well as any inherent bias and uncertainty in initial model states, meteorological data and forecasts. The post-processor is designed to produce adjusted ensemble forecasts that are consistent with the DEP historical flow sequences that were used to develop the system operating rules. 
A set of historical hindcasts that is representative of the real-time ensemble forecasts is needed to verify that the post-processed forecasts are unbiased, statistically reliable, and preserve the skill inherent in the "raw" NWS ensemble forecasts. A verification procedure and set of metrics will be presented that provide an objective assessment of ensemble forecasts. The procedure will be applied to both raw ensemble hindcasts and to post-processed ensemble hindcasts. The verification metrics will be used to validate proper functioning of the post-processor and to provide a benchmark for comparison of different types of forecasts. For example, current NWS ensemble forecasts are based on climatology, using each historical year to generate a forecast trace. The NWS Hydrologic Ensemble Forecast System (HEFS) under development will utilize output from both the National Oceanic Atmospheric Administration (NOAA) Global Ensemble Forecast System (GEFS) and the Climate Forecast System (CFS). Incorporating short-term meteorological forecasts and longer-term climate forecast information should provide sharper, more accurate forecasts. Hindcasts from HEFS will enable New York City to generate verification results to validate the new forecasts and further fine-tune system operating rules. Project verification results will be presented for different watersheds across a range of seasons, lead times, and flow levels to assess the quality of the current ensemble forecasts.
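One standard metric that such a verification procedure would include is the sample continuous ranked probability score (CRPS), which rewards ensembles that are both reliable and sharp. A minimal sketch with toy numbers, not project data:

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample CRPS for one ensemble forecast/observation pair:
    E|X - y| - 0.5 * E|X - X'| (lower is better; it reduces to the
    absolute error for a single deterministic member)."""
    x = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(x - obs))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

# A sharp, well-centred ensemble scores better than a biased one
sharp = crps_ensemble([9.0, 10.0, 11.0], obs=10.0)
biased = crps_ensemble([14.0, 15.0, 16.0], obs=10.0)
```

Averaging this score over many hindcast dates gives one number per lead time and flow level, which can benchmark raw against post-processed ensembles.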
Simultaneous calibration of ensemble river flow predictions over an entire range of lead times
NASA Astrophysics Data System (ADS)
Hemri, S.; Fundel, F.; Zappa, M.
2013-10-01
Probabilistic estimates of future water levels and river discharge are usually simulated with hydrologic models using ensemble weather forecasts as main inputs. As hydrologic models are imperfect and the meteorological ensembles tend to be biased and underdispersed, the ensemble forecasts for river runoff typically are biased and underdispersed, too. Thus, in order to achieve both reliable and sharp predictions statistical postprocessing is required. In this work Bayesian model averaging (BMA) is applied to statistically postprocess ensemble runoff raw forecasts for a catchment in Switzerland, at lead times ranging from 1 to 240 h. The raw forecasts have been obtained using deterministic and ensemble forcing meteorological models with different forecast lead time ranges. First, BMA is applied based on mixtures of univariate normal distributions, subject to the assumption of independence between distinct lead times. Then, the independence assumption is relaxed in order to estimate multivariate runoff forecasts over the entire range of lead times simultaneously, based on a BMA version that uses multivariate normal distributions. Since river runoff is a highly skewed variable, Box-Cox transformations are applied in order to achieve approximate normality. Both univariate and multivariate BMA approaches are able to generate well calibrated probabilistic forecasts that are considerably sharper than climatological forecasts. Additionally, multivariate BMA provides a promising approach for incorporating temporal dependencies into the postprocessed forecasts. Its major advantage against univariate BMA is an increase in reliability when the forecast system is changing due to model availability.
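The BMA predictive distribution described here is a weighted mixture of component densities, fitted after a Box-Cox transform to make the skewed runoff approximately Gaussian. A minimal sketch under assumed weights, spreads, and lambda; the actual BMA parameters are estimated from training data.

```python
import numpy as np

def boxcox(x, lam):
    """Box-Cox transform to make skewed runoff approximately Gaussian."""
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def bma_predictive_cdf(y, means, sigmas, weights):
    """BMA predictive CDF: a weighted mixture of normal CDFs, one
    component per ensemble member, evaluated in transformed space."""
    from math import erf, sqrt
    z = [(y - m) / s for m, s in zip(means, sigmas)]
    return sum(w * 0.5 * (1.0 + erf(zi / sqrt(2.0)))
               for w, zi in zip(weights, z))

# Toy example: two members in Box-Cox space (lambda assumed to be 0.2)
members = [boxcox(m, 0.2) for m in (50.0, 80.0)]
p = bma_predictive_cdf(boxcox(65.0, 0.2), members, sigmas=[0.5, 0.5],
                       weights=[0.5, 0.5])  # P(runoff <= 65)
```

The multivariate extension in the paper replaces the independent normal components with multivariate normals across lead times, which is what captures the temporal dependence.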
The impact of self-driving cars on existing transportation networks
NASA Astrophysics Data System (ADS)
Ji, Xiang
2018-04-01
In this paper, considering the usage of self-driving cars, I research the congestion problems of traffic networks at both macro and micro levels. Firstly, the macroscopic mathematical model is established using the Greenshields function, the analytic hierarchy process and Monte Carlo simulation, where the congestion level is divided into five levels according to the average vehicle speed. Roads with obvious congestion are investigated first, and their traffic flow and topology are analyzed. By processing the data, I propose a traffic congestion model. In the model, I assume that half of the non-self-driving cars only take the shortest route and the other half choose their path randomly, while self-driving cars can obtain vehicle density data for each road and choose their path more reasonably. When the traffic density of a path exceeds a specific value, it cannot be selected. To overcome the dimensional differences of the data, I rate the paths by BORDA sorting. A Monte Carlo cellular-automaton simulation is used to obtain negative feedback information on the density of the traffic network, where vehicles are added to the road network one by one. I then analyze the influence of negative feedback information on the path selection of intelligent cars. The conclusion is that increasing the proportion of intelligent vehicles makes the road load more balanced, and self-driving cars can avoid the peak and reduce the degree of road congestion. Combined with other models, the optimal self-driving ratio is about sixty-two percent. From the microscopic aspect, by using the single-lane traffic NS (Nagel-Schreckenberg) rule, another model is established to analyze the road partition scheme. Self-driving traffic is more intelligent, and its cooperation can reduce the random deceleration probability. From the model, I obtain the space-time distribution for different self-driving ratios. 
I also simulate the case of providing a separate lane for self-driving cars and compare it to the former model. It is concluded that a dedicated lane is more efficient in a certain interval; however, offering a separate lane is not recommended. Self-driving also faces the problem of hacker attacks and greater damage after a failure, so when the self-driving ratio is higher than a certain value, the increase in the traffic flow rate is small. In this article, that value is discussed, and the optimal proportion is determined. Finally, I give a nontechnical explanation of the problem.
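The single-lane NS rule referenced above can be sketched as a cellular automaton in which self-driving cars are given a smaller randomization probability than human drivers. The mixed-fleet setup, ring length, and probabilities below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def ns_step(pos, vel, length, vmax, p_random, rng):
    """One parallel update of the single-lane Nagel-Schreckenberg CA on a
    ring: accelerate, brake to the gap ahead, randomize, move. Positions
    are unwrapped (cars cannot overtake, so car i+1 stays the leader of
    car i); per-vehicle randomization probabilities let self-driving cars
    (smaller p) mix with human drivers (larger p)."""
    gaps = (np.roll(pos, -1) - pos - 1) % length  # empty cells to leader
    vel = np.minimum(vel + 1, vmax)               # accelerate
    vel = np.minimum(vel, gaps)                   # brake: avoid collision
    vel = np.where(rng.random(len(vel)) < p_random,
                   np.maximum(vel - 1, 0), vel)   # random slowdown
    return pos + vel, vel                         # move

rng = np.random.default_rng(1)
pos = np.array([0, 10, 20, 30])          # 4 cars on a 40-cell ring
vel = np.zeros(4, dtype=int)
p = np.array([0.3, 0.3, 0.0, 0.0])       # two human, two self-driving
for _ in range(50):
    pos, vel = ns_step(pos, vel, length=40, vmax=5, p_random=p, rng=rng)
```

Sweeping the self-driving fraction in such a simulation and recording the mean flow is one way to reproduce the kind of space-time analysis the abstract describes.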
Modeling and forecasting U.S. sex differentials in mortality.
Carter, L R; Lee, R D
1992-11-01
"This paper examines differentials in observed and forecasted sex-specific life expectancies and longevity in the United States from 1900 to 2065. Mortality models are developed and used to generate long-run forecasts, with confidence intervals that extend recent work by Lee and Carter (1992). These results are compared for forecast accuracy with univariate naive forecasts of life expectancies and those prepared by the Actuary of the Social Security Administration." excerpt
NASA Astrophysics Data System (ADS)
Tavakkoli-Moghaddam, Reza; Vazifeh-Noshafagh, Samira; Taleizadeh, Ata Allah; Hajipour, Vahid; Mahmoudi, Amin
2017-01-01
This article presents a new multi-objective model for a facility location problem with congestion and pricing policies. The model considers situations in which immobile service facilities are congested by a stochastic demand following M/M/m/k queues. The presented model belongs to the class of mixed-integer nonlinear programming models and NP-hard problems. To solve such a hard model, a new multi-objective optimization algorithm based on vibration theory, namely multi-objective vibration damping optimization (MOVDO), is developed. In order to tune the algorithm's parameters, the Taguchi approach using a response metric is implemented. The computational results are compared with those of the non-dominated ranking genetic algorithm and the non-dominated sorting genetic algorithm. The outputs demonstrate the robustness of the proposed MOVDO on large-sized problems.
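The M/M/m/k congestion measures that drive such a model follow in closed form from the standard steady-state probabilities of a queue with m servers and capacity k. A minimal sketch with toy arrival and service rates (not the paper's instances):

```python
import math

def mmmk_metrics(lam, mu, m, k):
    """Steady-state blocking probability and mean number in system for an
    M/M/m/k queue (m servers, total capacity k), the congestion model used
    for an immobile service facility under stochastic demand."""
    a = lam / mu  # offered load
    # Unnormalized state probabilities p_n for n = 0..k
    probs = [a**n / math.factorial(n) if n <= m
             else a**n / (math.factorial(m) * m**(n - m))
             for n in range(k + 1)]
    norm = sum(probs)
    probs = [p / norm for p in probs]
    p_block = probs[k]                            # arrivals lost when full
    l_sys = sum(n * p for n, p in enumerate(probs))
    return p_block, l_sys

# Toy facility: 2 servers, room for 5 customers, heavily loaded
p_block, l_sys = mmmk_metrics(lam=4.0, mu=1.0, m=2, k=5)
```

In a location-pricing model, these two quantities typically enter the objectives as lost-demand and waiting-cost terms, respectively.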
NASA Astrophysics Data System (ADS)
Ouyang, Huei-Tau
2017-07-01
Three types of model for forecasting inundation levels during typhoons were optimized: the linear autoregressive model with exogenous inputs (LARX), the nonlinear autoregressive model with exogenous inputs with wavelet function (NLARX-W) and the nonlinear autoregressive model with exogenous inputs with sigmoid function (NLARX-S). The forecast performance was evaluated by three indices: coefficient of efficiency, error in peak water level and relative time shift. Historical typhoon data were used to establish water-level forecasting models that satisfy all three objectives. A multi-objective genetic algorithm was employed to search for the Pareto-optimal model set that satisfies all three objectives and select the ideal models for the three indices. Findings showed that the optimized nonlinear models (NLARX-W and NLARX-S) outperformed the linear model (LARX). Among the nonlinear models, the optimized NLARX-W model achieved a more balanced performance on the three indices than the NLARX-S models and is recommended for inundation forecasting during typhoons.
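The three evaluation indices can be sketched directly on a toy hydrograph. The definitions below (Nash-Sutcliffe efficiency, signed peak-level error, and peak timing shift in time steps) are a plausible reading; the paper's exact sign conventions may differ.

```python
import numpy as np

def forecast_indices(obs, sim):
    """Three inundation-forecast indices: coefficient of efficiency
    (Nash-Sutcliffe form), error in peak water level, and the time shift
    between forecast and observed peaks (in time steps)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    ce = 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
    peak_err = sim.max() - obs.max()
    time_shift = int(np.argmax(sim)) - int(np.argmax(obs))
    return ce, peak_err, time_shift

obs = [0.0, 1.0, 3.0, 2.0, 1.0]   # toy observed water levels
sim = [0.0, 1.0, 2.0, 3.0, 1.0]   # toy forecast: right peak, one step late
ce, peak_err, shift = forecast_indices(obs, sim)
```

A multi-objective search over model structures then looks for the Pareto set in this three-index space rather than optimizing any single score.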
Forecasting the mortality rates using Lee-Carter model and Heligman-Pollard model
NASA Astrophysics Data System (ADS)
Ibrahim, R. I.; Ngataman, N.; Abrisam, W. N. A. Wan Mohd
2017-09-01
Improvement in life expectancies has driven further declines in mortality. The sustained reduction in mortality rates and its systematic underestimation have been attracting significant interest from researchers in recent years because of their potential impact on population size and structure, social security systems, and (from an actuarial perspective) the life insurance and pensions industry worldwide. Among all forecasting methods, the Lee-Carter model has been widely accepted by the actuarial community, and the Heligman-Pollard model has been widely used by researchers in modelling and forecasting future mortality. Therefore, this paper focuses only on the Lee-Carter model and the Heligman-Pollard model. The main objective of this paper is to investigate how accurately these two models perform using Malaysian data. Since these models involve nonlinear equations that are difficult to solve explicitly, the Matrix Laboratory Version 8.0 (MATLAB 8.0) software is used to estimate the parameters of the models. The Autoregressive Integrated Moving Average (ARIMA) procedure is applied to acquire the forecasted parameters for both models, and the forecasted mortality rates are obtained by using all the values of the forecasted parameters. To investigate the accuracy of the estimation, the forecasted results are compared against actual mortality-rate data. The results indicate that both models provide better results for the male population. However, for the elderly female population, the Heligman-Pollard model seems to underestimate the mortality rates while the Lee-Carter model seems to overestimate them.
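The Lee-Carter fit itself is commonly obtained from a singular value decomposition of the centred log-mortality surface. A minimal sketch on synthetic data (not the Malaysian data or the MATLAB workflow used in the paper):

```python
import numpy as np

def lee_carter_fit(log_mx):
    """Lee-Carter decomposition log m(x,t) = a_x + b_x * k_t, fitted via
    SVD of the age-centred surface, with the usual identifiability
    constraints sum(b_x) = 1 and sum(k_t) = 0."""
    a = log_mx.mean(axis=1)                        # age pattern a_x
    U, s, Vt = np.linalg.svd(log_mx - a[:, None], full_matrices=False)
    b = U[:, 0] / U[:, 0].sum()                    # age sensitivities b_x
    k = s[0] * Vt[0] * U[:, 0].sum()               # period index k_t
    return a, b, k

# Synthetic rank-1 mortality surface with a linear decline in k_t
ages, years = 5, 20
k_true = np.linspace(2.0, -2.0, years)
b_true = np.full(ages, 1.0 / ages)
a_true = np.linspace(-6.0, -2.0, ages)
log_mx = a_true[:, None] + b_true[:, None] * k_true[None, :]
a, b, k = lee_carter_fit(log_mx)
```

In the full workflow, the fitted k_t series is then extrapolated with an ARIMA model (a random walk with drift in the original Lee-Carter paper) to produce the mortality forecast.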
NASA Astrophysics Data System (ADS)
Zheng, Fei; Zhu, Jiang
2017-04-01
How to design a reliable ensemble prediction strategy that accounts for the major uncertainties of a forecasting system is a crucial issue for performing an ensemble forecast. In this study, a new stochastic perturbation technique is developed to improve the prediction skills of the El Niño-Southern Oscillation (ENSO) using an intermediate coupled model. We first estimate and analyze the model uncertainties from the ensemble Kalman filter analysis results obtained by assimilating the observed sea surface temperatures. Then, based on the pre-analyzed properties of the model errors, we develop a zero-mean stochastic model-error model to characterize the model uncertainties mainly induced by the physical processes missing from the original model (e.g., stochastic atmospheric forcing, extra-tropical effects, the Indian Ocean Dipole). Finally, we perturb each member of an ensemble forecast at each step with the developed stochastic model-error model during the 12-month forecasting process, adding the zero-mean perturbations to the physical fields to mimic the presence of missing processes and high-frequency stochastic noise. The impacts of stochastic model-error perturbations on ENSO deterministic predictions are examined by performing two sets of 21-yr hindcast experiments, which are initialized from the same initial conditions and differ only in whether they include the stochastic perturbations. The comparison shows that the stochastic perturbations have a significant effect on improving the ensemble-mean prediction skills during the entire 12-month forecasting process. This improvement occurs mainly because the nonlinear terms in the model can form a positive ensemble-mean contribution from a series of zero-mean perturbations, which reduces the forecasting biases and corrects the forecast through this nonlinear heating mechanism.
Multicomponent ensemble models to forecast induced seismicity
NASA Astrophysics Data System (ADS)
Király-Proag, E.; Gischig, V.; Zechar, J. D.; Wiemer, S.
2018-01-01
In recent years, human-induced seismicity has become a more and more relevant topic due to its economic and social implications. Several models and approaches have been developed to explain the underlying physical processes or to forecast induced seismicity. They range from simple statistical models to coupled numerical models incorporating complex physics. We advocate the need for forecast testing as currently the best method for ascertaining whether models are capable of reasonably accounting for the key governing physical processes. Moreover, operational forecast models are of great interest for supporting on-site decision-making in projects entailing induced earthquakes. We previously introduced a standardized framework following the guidelines of the Collaboratory for the Study of Earthquake Predictability, the Induced Seismicity Test Bench, to test, validate, and rank induced seismicity models. In this study, we describe how to construct multicomponent ensemble models based on Bayesian weightings that deliver more accurate forecasts than individual models in the case of the Basel 2006 and Soultz-sous-Forêts 2004 enhanced geothermal stimulation projects. For this, we examine five calibrated variants of two significantly different model groups: (1) Shapiro and Smoothed Seismicity, based on the seismogenic index, simple modified Omori-law-type seismicity decay, and temporally weighted smoothed seismicity; (2) Hydraulics and Seismicity, based on numerically modelled pore pressure evolution that triggers seismicity using the Mohr-Coulomb failure criterion. We also demonstrate how the individual and ensemble models would perform as part of an operational Adaptive Traffic Light System. Investigating seismicity forecasts based on a range of potential injection scenarios, we use forecast periods of different durations to compute the occurrence probabilities of seismic events M ≥ 3. 
We show that in the case of the Basel 2006 geothermal stimulation the models forecast hazardous levels of seismicity days before the occurrence of felt events.
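The Bayesian weighting behind such an ensemble can be sketched as posterior model probabilities computed from each model's likelihood of the observed seismicity under equal priors. The numbers below are toy values, and the paper's exact weighting scheme may differ.

```python
import numpy as np

def bayesian_model_weights(log_likelihoods):
    """Posterior model weights from per-model log-likelihoods of the
    observed seismicity, assuming equal prior model probabilities
    (log-sum-exp shift for numerical stability)."""
    ll = np.asarray(log_likelihoods, float)
    w = np.exp(ll - ll.max())
    return w / w.sum()

def ensemble_rate(rates, weights):
    """Ensemble forecast as the weighted mix of model event rates."""
    return float(np.dot(weights, rates))

# Toy values for three calibrated model variants
w = bayesian_model_weights([-120.0, -122.0, -130.0])
rate = ensemble_rate([5.0, 8.0, 2.0], w)  # expected events per period
```

Because the weights are renormalized likelihoods, a model that explains the observed catalog only slightly worse still contributes, while clearly inferior variants are effectively down-weighted.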
Wildfire suppression cost forecasts from the US Forest Service
Karen L. Abt; Jeffrey P. Prestemon; Krista M. Gebert
2009-01-01
The US Forest Service and other land-management agencies seek better tools for anticipating future expenditures for wildfire suppression. We developed regression models for forecasting US Forest Service suppression spending at 1-, 2-, and 3-year lead times. We compared these models to another readily available forecast model, the 10-year moving average model,...
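The 10-year moving-average benchmark is simple enough to state directly; a minimal sketch with illustrative (not actual) spending figures:

```python
def moving_average_forecast(spending, window=10):
    """The moving-average benchmark: next year's suppression-spending
    forecast is the mean of the last `window` observed years."""
    if len(spending) < window:
        raise ValueError("need at least `window` years of history")
    return sum(spending[-window:]) / window

# Illustrative annual spending history (billions of dollars, made up)
history = [1.0, 1.1, 0.9, 1.3, 1.6, 1.2, 1.8, 2.1, 1.7, 2.3]
forecast = moving_average_forecast(history)
```

A known weakness of this baseline is that it lags a persistent upward trend, which is one reason regression models with leading indicators can beat it at short lead times.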
Forecasting monthly inflow discharge of the Iffezheim reservoir using data-driven models
NASA Astrophysics Data System (ADS)
Zhang, Qing; Aljoumani, Basem; Hillebrand, Gudrun; Hoffmann, Thomas; Hinkelmann, Reinhard
2017-04-01
River stream flow is an essential element in the field of hydrology, especially for reservoir management, since it defines the input into reservoirs. Forecasting this stream flow plays an important role in short- or long-term planning and management of the reservoir, e.g. optimized reservoir and hydroelectric operation or agricultural irrigation. Highly accurate flow forecasting can significantly reduce economic losses and is always pursued by reservoir operators. Therefore, hydrologic time series forecasting has received tremendous attention from researchers. Many models have been proposed to improve hydrological forecasting. Because most natural phenomena occurring in environmental systems appear to behave in random or probabilistic ways, different cases may need different methods to forecast the inflow and even a unique treatment to improve the forecast accuracy. The purpose of this study is to determine an appropriate model for forecasting monthly inflow to the Iffezheim reservoir in Germany, which is the last of the barrages on the Upper Rhine. Monthly time series of discharges, measured from 1946 to 2001 at the Plittersdorf station, which is located 6 km downstream of the Iffezheim reservoir, were applied. The accuracies of the stochastic models used - the Fiering model and Auto-Regressive Integrated Moving Average (ARIMA) models - are compared with those of Artificial Intelligence (AI) models - a single Artificial Neural Network (ANN) model and Wavelet ANN (WANN) models. The Fiering model is a linear stochastic model used for generating synthetic monthly data. The basic idea in modeling time series using ARIMA is to identify a simple model with as few parameters as possible that provides a good statistical fit to the data. To identify and fit the ARIMA models, a four-phase approach was used: identification, parameter estimation, diagnostic checking, and forecasting. 
An automatic selection criterion, such as the Akaike information criterion, is utilized to enhance this flexible approach to setting up the model. As distinct from the stochastic models, the ANN and its related conjunction method, the Wavelet-ANN (WANN) model, are effective at handling non-linear systems and have been developed with antecedent flows as inputs to forecast up to 12 months of lead time for the Iffezheim reservoir. In the ANN and WANN models, the Feed Forward Back Propagation (FFBP) method is applied. Sigmoid and linear activation functions were used, with several different numbers of neurons, for the hidden layers and for the output layer, respectively. To compare the accuracy of the different models and identify the most suitable model for reliable forecasting, three standard quantitative statistical performance measures are employed: the root mean square error (RMSE), the mean absolute error (MAE) and the determination coefficient (DC). The results reveal that ARIMA (2, 1, 2) performs better than the Fiering, ANN and WANN models. Further, the WANN model is found to be slightly better than the ANN model for forecasting monthly inflow of the Iffezheim reservoir. As a result, by using the ARIMA model, the predicted and observed values agree reasonably well.
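The three performance measures can be sketched as follows, with toy inflow values. DC is interpreted here as the squared Pearson correlation between observed and predicted series, which is an assumption about the paper's definition.

```python
import numpy as np

def skill_scores(obs, pred):
    """RMSE, MAE, and a determination coefficient DC (squared Pearson
    correlation) for comparing inflow-forecast models."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    rmse = np.sqrt(np.mean((obs - pred) ** 2))
    mae = np.mean(np.abs(obs - pred))
    dc = np.corrcoef(obs, pred)[0, 1] ** 2
    return rmse, mae, dc

obs = [100.0, 120.0, 90.0, 110.0]    # toy observed monthly inflows
pred = [105.0, 115.0, 95.0, 105.0]   # toy model predictions
rmse, mae, dc = skill_scores(obs, pred)
```

Reporting several complementary measures matters here because a model can score well on correlation (DC) while carrying a systematic bias that RMSE and MAE expose.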
Discrete post-processing of total cloud cover ensemble forecasts
NASA Astrophysics Data System (ADS)
Hemri, Stephan; Haiden, Thomas; Pappenberger, Florian
2017-04-01
This contribution presents an approach to post-process ensemble forecasts for the discrete and bounded weather variable of total cloud cover. Two methods for discrete statistical post-processing of ensemble predictions are tested. The first approach is based on multinomial logistic regression, the second involves a proportional odds logistic regression model. Applying them to total cloud cover raw ensemble forecasts from the European Centre for Medium-Range Weather Forecasts improves forecast skill significantly. Based on station-wise post-processing of raw ensemble total cloud cover forecasts for a global set of 3330 stations over the period from 2007 to early 2014, the more parsimonious proportional odds logistic regression model proved to slightly outperform the multinomial logistic regression model. Reference: Hemri, S., Haiden, T., & Pappenberger, F. (2016). Discrete post-processing of total cloud cover ensemble forecasts. Monthly Weather Review, 144, 2565-2577.
NASA Technical Reports Server (NTRS)
Balikhin, M. A.; Rodriguez, J. V.; Boynton, R. J.; Walker, S. N.; Aryan, Homayon; Sibeck, D. G.; Billings, S. A.
2016-01-01
Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field B(sub z) observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.
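The Heidke skill score used here for rare high-flux events measures the fraction of correct event/no-event forecasts beyond what random chance would give. A minimal sketch with illustrative contingency-table counts (not GOES verification data):

```python
def heidke_skill_score(hits, false_alarms, misses, correct_negatives):
    """Heidke skill score for a 2x2 event/no-event contingency table:
    1 = perfect forecast, 0 = no skill beyond chance, < 0 = worse than
    chance. Well suited to infrequent events such as very high fluxes."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    # Correct forecasts expected by chance, given the marginal totals
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    return (a + d - expected) / (n - expected)

# Illustrative counts for "flux exceeds alert threshold" days in one year
hss = heidke_skill_score(hits=20, false_alarms=10, misses=5,
                         correct_negatives=330)
```

Because correct negatives dominate for rare events, plain accuracy would look misleadingly high here; the chance correction is what makes the score informative.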
NASA Astrophysics Data System (ADS)
Wood, A. W.; Clark, E.; Mendoza, P. A.; Nijssen, B.; Newman, A. J.; Clark, M. P.; Arnold, J.; Nowak, K. C.
2016-12-01
Many if not most national operational short-to-medium range streamflow prediction systems rely on a forecaster-in-the-loop approach in which some parts of the forecast workflow are automated, but others require the hands-on effort of an experienced human forecaster. This approach evolved out of the need to correct for deficiencies in the models and datasets that were available for forecasting, and often leads to skillful predictions despite the use of relatively simple, conceptual models. On the other hand, the process is not reproducible, which limits opportunities to assess and incorporate process variations, and the effort required to make forecasts in this way is an obstacle to expanding forecast services - e.g., through adding new forecast locations or more frequent forecast updates, running more complex models, or producing forecast ensembles and hindcasts that can support verification. In the last decade, the hydrologic forecasting community has begun to develop more centralized, `over-the-loop' systems. The quality of these new forecast products will depend on their ability to leverage research in areas including earth system modeling, parameter estimation, data assimilation, statistical post-processing, weather and climate prediction, verification, and uncertainty estimation through the use of ensembles. Currently, the operational streamflow forecasting and water management communities have little experience with the strengths and weaknesses of over-the-loop approaches, even as the systems are being rolled out in major operational forecasting centers. There is thus a need both to evaluate these forecasting advances and to demonstrate their potential in a public arena, raising awareness in forecast user communities and development programs alike.
To address this need, the National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation and the US Army Corps of Engineers, using the NCAR 'System for Hydromet Analysis, Research, and Prediction' (SHARP) to implement, assess and demonstrate real-time over-the-loop forecasts. We present early hindcast and verification results from SHARP for short to medium range streamflow forecasts in a number of US case study watersheds.
NASA Astrophysics Data System (ADS)
Badrzadeh, Honey; Sarukkalige, Ranjan; Jayawardena, A. W.
2015-10-01
Reliable river flow forecasts play a key role in flood risk mitigation. Among different approaches to river flow forecasting, data-driven approaches have become increasingly popular in recent years due to their minimal information requirements and their ability to simulate the nonlinear and non-stationary characteristics of hydrological processes. In this study, attempts are made to apply four different types of data-driven approaches, namely traditional artificial neural networks (ANN), adaptive neuro-fuzzy inference systems (ANFIS), wavelet neural networks (WNN), and hybrid ANFIS with multi-resolution analysis using wavelets (WNF). The developed models were applied to real-time flood forecasting at Casino station on the Richmond River, Australia, which is highly prone to flooding. Hourly rainfall and runoff data were used to drive the models, which were used for forecasting with 1, 6, 12, 24, 36 and 48 h lead-times. The performance of the models was further improved by adding upstream river flow data (Wiangaree station) as another effective input. All models perform satisfactorily up to 12 h lead-time. However, the hybrid wavelet-based models significantly outperformed the ANFIS and ANN models in longer lead-time forecasting. The results confirm the robustness of the proposed structure of the hybrid models for real-time runoff forecasting in the study area.
Assessing a 3D smoothed seismicity model of induced earthquakes
NASA Astrophysics Data System (ADS)
Zechar, Jeremy; Király, Eszter; Gischig, Valentin; Wiemer, Stefan
2016-04-01
As more energy exploration and extraction efforts cause earthquakes, it becomes increasingly important to control induced seismicity. Risk management schemes must be improved and should ultimately be based on near-real-time forecasting systems. With this goal in mind, we propose a test bench to evaluate models of induced seismicity based on metrics developed by the CSEP community. To illustrate the test bench, we consider a model based on the so-called seismogenic index and a rate decay; to produce three-dimensional forecasts, we smooth past earthquakes in space and time. We explore four variants of this model using the Basel 2006 and Soultz-sous-Forêts 2004 datasets to make short-term forecasts, test their consistency, and rank the model variants. Our results suggest that such a smoothed seismicity model is useful for forecasting induced seismicity within three days, and giving more weight to recent events improves forecast performance. Moreover, the location of the largest induced earthquake is forecast well by this model. Despite the good spatial performance, the model does not estimate the seismicity rate well: it frequently overestimates during stimulation and during the early post-stimulation period, and it systematically underestimates around shut-in. In this presentation, we also describe a robust estimate of information gain, a modification that can also benefit forecast experiments involving tectonic earthquakes.
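The space-time smoothing at the core of such a model can be sketched as a kernel sum over past events, with an exponential decay giving more weight to recent ones. The kernel widths and the toy catalog below are illustrative assumptions, not the calibrated seismogenic-index model of the study:

```python
import math

# Forecast rate at (x, y, t) by smoothing past events in space and time.
# sigma (km) sets the spatial kernel width; tau (days) sets the time decay.
def forecast_rate(x, y, t, catalog, sigma=0.5, tau=2.0):
    rate = 0.0
    for ex, ey, et in catalog:          # events (x km, y km, t days), et <= t
        d2 = (x - ex) ** 2 + (y - ey) ** 2
        spatial = math.exp(-d2 / (2 * sigma ** 2))      # Gaussian in space
        temporal = math.exp(-(t - et) / tau)            # exponential in time
        rate += spatial * temporal
    return rate

catalog = [(0.0, 0.0, 1.0), (0.2, 0.1, 5.0), (3.0, 3.0, 2.0)]
# Near the recent cluster the forecast rate is higher than far from it
print(forecast_rate(0.1, 0.1, 6.0, catalog) > forecast_rate(3.0, 3.0, 6.0, catalog))
```

Giving more weight to recent events, which the study found improves forecast performance, corresponds to choosing a short decay time tau.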
DOT National Transportation Integrated Search
2012-05-01
The role of the REMI Policy Insight+ model in socioeconomic forecasting and economic impact analysis of transportation projects was assessed. The REMI PI+ model is consistent with the state of the practice in forecasting and impact analysis. REMI P...
Moran, Kelly R; Fairchild, Geoffrey; Generous, Nicholas; Hickmann, Kyle; Osthus, Dave; Priedhorsky, Reid; Hyman, James; Del Valle, Sara Y
2016-12-01
Mathematical models, such as those that forecast the spread of epidemics or predict the weather, must overcome the challenges of integrating incomplete and inaccurate data in computer simulations, estimating the probability of multiple possible scenarios, and incorporating changes in human behavior and/or the pathogen, as well as environmental factors. In the past 3 decades, the weather forecasting community has made significant advances in data collection, assimilating heterogeneous data streams into models, and communicating the uncertainty of their predictions to the general public. Epidemic modelers are struggling with these same issues in forecasting the spread of emerging diseases, such as Zika virus infection and Ebola virus disease. While weather models rely on physical systems, data from satellites, and weather stations, epidemic models rely on human interactions, multiple data sources such as clinical surveillance and Internet data, and environmental or biological factors that can change the pathogen dynamics. We describe some of the similarities and differences between these 2 fields and how the epidemic modeling community is rising to the challenges posed by forecasting to help anticipate and guide the mitigation of epidemics. We conclude that some of the fundamental differences between these 2 fields, such as human behavior, make disease forecasting more challenging than weather forecasting. Published by Oxford University Press for the Infectious Diseases Society of America 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Neural network versus classical time series forecasting models
NASA Astrophysics Data System (ADS)
Nor, Maria Elena; Safuan, Hamizah Mohd; Shab, Noorzehan Fazahiyah Md; Asrul, Mohd; Abdullah, Affendi; Mohamad, Nurul Asmaa Izzati; Lee, Muhammad Hisyam
2017-05-01
Artificial neural networks (ANN) have an advantage in time series forecasting, as they have the potential to solve complex forecasting problems. This is because ANN is a data-driven approach which can be trained to map past values of a time series. In this study, the forecast performance of a neural network and a classical time series forecasting method, namely the seasonal autoregressive integrated moving average model, was compared using gold price data. Moreover, the effect of different data preprocessing on the forecast performance of the neural network was examined. The forecast accuracy was evaluated using mean absolute deviation, root mean square error and mean absolute percentage error. It was found that ANN produced the most accurate forecast when the Box-Cox transformation was used as data preprocessing.
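The Box-Cox preprocessing step and the three error measures named above can be sketched in a few lines. The price values and the choice of lambda below are illustrative assumptions, not the study's data:

```python
import math

def boxcox(y, lam):
    # lam = 0 is the log transform; otherwise (y**lam - 1) / lam
    return [math.log(v) if lam == 0 else (v ** lam - 1) / lam for v in y]

def inv_boxcox(z, lam):
    return [math.exp(v) if lam == 0 else (lam * v + 1) ** (1.0 / lam) for v in z]

def mad(actual, pred):
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def rmse(actual, pred):
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual))

def mape(actual, pred):
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, pred)) / len(actual)

prices = [1200.0, 1210.5, 1198.2, 1225.0]   # a short positive-valued series
z = boxcox(prices, 0)        # train the forecaster on the transformed scale
back = inv_boxcox(z, 0)      # invert before computing MAD/RMSE/MAPE
print(mape(prices, back))    # ~0: the round trip recovers the series
```

Forecasts are made on the transformed scale and inverted back before scoring, so MAD, RMSE and MAPE compare like with like in the original units.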
Rate-based congestion control in networks with smart links, revision. B.S. Thesis - May 1988
NASA Technical Reports Server (NTRS)
Heybey, Andrew Tyrrell
1990-01-01
The author uses a network simulator to explore rate-based congestion control in networks with smart links that can feed back information to tell senders to adjust their transmission rates. This method differs in a very important way from congestion control in which a congested network component just drops packets - the most commonly used method. It is clearly advantageous for the links in the network to communicate with the end users about the network capacity, rather than the users unilaterally picking a transmission rate. The components in the middle of the network, not the end users, have information about the capacity and traffic in the network. The author experiments with three different algorithms for calculating the control rate to feed back to the users. All of the algorithms exhibit problems in the form of large queues when simulated with a configuration modeling the dynamics of a packet-voice system. However, the problems are not with the algorithms themselves, but with the fact that feedback takes time. If the network steady-state utilization is low enough that it can absorb transients in the traffic through it, then the large queues disappear. If the users are modified to start sending slowly, to allow the network to adapt to a new flow without causing congestion, a greater portion of the network's bandwidth can be used.
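The two effects described, feedback that takes time to act and slow-starting senders that keep queues bounded, can be illustrated with a toy single-link model. The capacities and the halfway adjustment rule below are invented for illustration and are not the thesis's three rate-calculation algorithms:

```python
# Toy sketch: one smart link of fixed capacity advertises a fair-share
# rate to each sender; senders move halfway toward it each step.
def simulate(capacity, senders, steps, slow_start=True):
    rates = [0.1 if slow_start else capacity for _ in range(senders)]
    queue = 0.0
    for _ in range(steps):
        fair = capacity / senders          # rate fed back by the link
        arriving = sum(rates)
        queue = max(0.0, queue + arriving - capacity)
        rates = [r + 0.5 * (fair - r) for r in rates]
    return queue, rates

q_slow, _ = simulate(capacity=10.0, senders=4, steps=20)
print(round(q_slow, 3))   # 0.0: slow start keeps the queue empty
```

Starting every sender at full capacity instead leaves a large standing queue before the fed-back fair-share rate takes hold, mirroring the large-queue problem the simulations exhibited.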
2010-09-30
and climate forecasting and use of satellite data assimilation for model evaluation. He is a task leader on another NSF_EPSCoR project for the...1 DISTRIBUTION STATEMENT A: Approved for public release; distribution is unlimited. Data Analysis, Modeling, and Ensemble Forecasting to...observations including remotely sensed data . OBJECTIVES The main objectives of the study are: 1) to further develop, test, and continue twice daily
2011-09-30
forecasting and use of satellite data assimilation for model evaluation (Jiang et al, 2011a). He is a task leader on another NSF EPSCoR project...K. Horvath, R. Belu, 2011a: Application of variational data assimilation to dynamical downscaling of regional wind energy resources in the western...1 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Data Analysis, Modeling, and Ensemble Forecasting to
2004-03-01
predicting future events (Heizer and Render, 1999). Forecasting techniques fall into two major categories, qualitative and quantitative methods...Globemaster III.” Excerpt from website. www.globalsecurity.org/military/systems/aircraft/c-17-history.htm. 2003. Heizer, Jay, and Barry Render ...of the past data used to make the forecast (Heizer, et al., 1999). Explanatory forecasting models assume that the variable being forecasted
The Art and Science of Long-Range Space Weather Forecasting
NASA Technical Reports Server (NTRS)
Hathaway, David H.; Wilson, Robert M.
2006-01-01
Long-range space weather forecasts are akin to seasonal forecasts of terrestrial weather. We don't expect to forecast individual events, but we do hope to forecast the underlying level of activity important for satellite operations and mission planning. Forecasting space weather conditions years or decades into the future has traditionally been based on empirical models of the solar cycle. Models for the shape of the cycle as a function of its amplitude become reliable once the amplitude is well determined - usually two to three years after minimum. Forecasting the amplitude of a cycle well before that time has been more of an art than a science - usually based on cycle statistics and trends. Recent developments in dynamo theory - the theory explaining the generation of the Sun's magnetic field and the solar activity cycle - have now produced models with predictive capabilities. Testing these models with historical sunspot cycle data indicates that these predictions may be highly reliable one, or even two, cycles into the future.
Ensemble-based methods for forecasting census in hospital units
2013-01-01
Background The ability to accurately forecast census counts in hospital departments has considerable implications for hospital resource allocation. In recent years, several different methods have been proposed for forecasting census counts; however, many of these approaches do not use available patient-specific information. Methods In this paper we present an ensemble-based methodology for forecasting the census under a framework that simultaneously incorporates both (i) arrival trends over time and (ii) patient-specific baseline and time-varying information. The proposed model for predicting census has three components, namely: current census count, number of daily arrivals and number of daily departures. To model the number of daily arrivals, we use a seasonality adjusted Poisson Autoregressive (PAR) model where the parameter estimates are obtained via conditional maximum likelihood. The number of daily departures is predicted by modeling the probability of departure from the census using logistic regression models that are adjusted for the amount of time spent in the census and incorporate both patient-specific baseline and time-varying patient-specific covariate information. We illustrate our approach using neonatal intensive care unit (NICU) data collected at Women & Infants Hospital, Providence RI, which consists of 1001 consecutive NICU admissions between April 1st 2008 and March 31st 2009. Results Our results demonstrate statistically significant improved prediction accuracy for 3, 5, and 7 day census forecasts and increased precision of our forecasting model compared to a forecasting approach that ignores patient-specific information. Conclusions Forecasting models that utilize patient-specific baseline and time-varying information make the most of data typically available and have the capacity to substantially improve census forecasts. PMID:23721123
Ensemble-based methods for forecasting census in hospital units.
Koestler, Devin C; Ombao, Hernando; Bender, Jesse
2013-05-30
The ability to accurately forecast census counts in hospital departments has considerable implications for hospital resource allocation. In recent years, several different methods have been proposed for forecasting census counts; however, many of these approaches do not use available patient-specific information. In this paper we present an ensemble-based methodology for forecasting the census under a framework that simultaneously incorporates both (i) arrival trends over time and (ii) patient-specific baseline and time-varying information. The proposed model for predicting census has three components, namely: current census count, number of daily arrivals and number of daily departures. To model the number of daily arrivals, we use a seasonality adjusted Poisson Autoregressive (PAR) model where the parameter estimates are obtained via conditional maximum likelihood. The number of daily departures is predicted by modeling the probability of departure from the census using logistic regression models that are adjusted for the amount of time spent in the census and incorporate both patient-specific baseline and time-varying patient-specific covariate information. We illustrate our approach using neonatal intensive care unit (NICU) data collected at Women & Infants Hospital, Providence RI, which consists of 1001 consecutive NICU admissions between April 1st 2008 and March 31st 2009. Our results demonstrate statistically significant improved prediction accuracy for 3, 5, and 7 day census forecasts and increased precision of our forecasting model compared to a forecasting approach that ignores patient-specific information. Forecasting models that utilize patient-specific baseline and time-varying information make the most of data typically available and have the capacity to substantially improve census forecasts.
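The three-component update described, current census plus Poisson-modeled arrivals minus logistic-modeled departures, can be sketched as an expected-value one-day forecast. All coefficients and the example stay lengths below are illustrative assumptions, not the fitted NICU model:

```python
import math

# One-day census forecast: census + E[arrivals] - E[departures].
def arrivals_rate(prev_arrivals, day_of_week):
    # seasonality-adjusted Poisson AR(1) on the log scale:
    # intercept + AR term on yesterday's count + day-of-week effect
    season = [0.0, 0.1, 0.1, 0.0, 0.0, -0.2, -0.3][day_of_week]
    return math.exp(0.8 + 0.05 * prev_arrivals + season)

def departure_prob(days_in_census):
    # logistic regression on time already spent in the census
    z = -3.0 + 0.15 * days_in_census
    return 1.0 / (1.0 + math.exp(-z))

def one_day_forecast(stays, prev_arrivals, day_of_week):
    lam = arrivals_rate(prev_arrivals, day_of_week)
    expected_departures = sum(departure_prob(s) for s in stays)
    return len(stays) + lam - expected_departures

stays = [2, 5, 30, 12, 1]   # days each current patient has been in the unit
print(round(one_day_forecast(stays, prev_arrivals=3, day_of_week=0), 2))
```

Iterating this update, sampling arrivals and departures rather than taking expectations, would yield the 3-, 5- and 7-day forecast distributions the paper evaluates.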
Development of predictive weather scenarios for early prediction of rice yield in South Korea
NASA Astrophysics Data System (ADS)
Shin, Y.; Cho, J.; Jung, I.
2017-12-01
International grain prices are becoming unstable due to the frequent occurrence of abnormal weather phenomena caused by climate change. Early prediction of grain yield using weather forecast data is important for the stabilization of international grain prices. The APEC Climate Center (APCC) provides seasonal forecast data based on monthly climate prediction models for global seasonal forecasting services. The 3-month and 6-month seasonal forecast data using the multi-model ensemble (MME) technique are provided on its website, ADSS (APCC Data Service System, http://adss.apcc21.org/). The spatial resolution of the seasonal forecast data for each individual model is 2.5°×2.5° (about 250 km) and the time scale is monthly. In this study, we developed customized weather forecast scenarios that combine seasonal forecast data and observational data for use in an early rice yield prediction model. A statistical downscaling method was applied to produce the meteorological input data for the crop model, because the field-scale crop model (ORYZA2000) requires daily weather data. In order to determine whether the forecast data are suitable for the crop model, we produced spatio-temporally downscaled weather scenarios and evaluated their predictability by comparison with observed weather data at 57 ASOS stations in South Korea. The customized weather forecast scenarios can be applied to various fields beyond early rice yield prediction. Acknowledgement This work was carried out with the support of "Cooperative Research Program for Agriculture Science and Technology Development (Project No: PJ012855022017)" Rural Development Administration, Republic of Korea.
NASA Astrophysics Data System (ADS)
Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.
2011-12-01
The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting which relies on a coupled snow model (i.e. SNOW17) and rainfall-runoff model (i.e. SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system from input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameters sets (i.e. RFC, DREAM (Differential Evolution Adaptive Metropolis)) as well as a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions to observe the influence of variability in initial conditions on the forecast. The study basin is the North Fork America River Basin (NFARB) located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (i.e. Nash Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (i.e. reliability diagram, discrimination diagram, containing ratio, and Quantile plots) statistics. Our presentation includes comparison of the performance of different optimized parameters and the DA framework as well as assessment of the impact associated with the initial conditions used for streamflow forecasts for the NFARB.
Using ensembles in water management: forecasting dry and wet episodes
NASA Astrophysics Data System (ADS)
van het Schip-Haverkamp, Tessa; van den Berg, Wim; van de Beek, Remco
2015-04-01
Extreme weather situations such as droughts and extensive precipitation are becoming more frequent, which makes it more important to obtain accurate weather forecasts for the short and long term. Ensembles can provide a solution in terms of scenario forecasts. MeteoGroup uses ensembles in a new forecasting technique which presents a number of weather scenarios for a dynamical water management project, called Water-Rijk, in which water storage and water retention play a large role. The Water-Rijk is part of Park Lingezegen, which is located between Arnhem and Nijmegen in the Netherlands. In collaboration with the University of Wageningen, Alterra and Eijkelkamp, a forecasting system is developed for this area which can provide water boards with a number of weather and hydrology scenarios in order to assist in the decision whether or not water retention or water storage is necessary in the near future. In order to make a forecast for drought and extensive precipitation, the difference 'precipitation - evaporation' is used as a measure of drought in the weather forecasts. In case of an upcoming drought this difference will take larger negative values. In case of a wet episode, this difference will be positive. The Makkink potential evaporation is used, which gives the most accurate potential evaporation values during the summer, when evaporation plays an important role in the availability of surface water. Scenarios are determined by reducing the large number of forecasts in the ensemble to a small number of averaged members, each with its own likelihood of occurrence. For the Water-Rijk project 5 scenario forecasts are calculated: extreme dry, dry, normal, wet and extreme wet. These scenarios are constructed for two forecasting periods, each using its own ensemble technique: up to 48 hours ahead and up to 15 days ahead.
The 48-hour forecast uses an ensemble constructed from forecasts of multiple high-resolution regional models: UKMO's Euro4 model, the ECMWF model, WRF and Hirlam. Using multiple model runs and additional post-processing, an ensemble can be created from non-ensemble models. The 15-day forecast uses the ECMWF Ensemble Prediction System forecast, from which scenarios can be deduced directly. A combination of the ensembles from the two forecasting periods is used in order to have the highest possible resolution of the forecast for the first 48 hours, followed by the lower-resolution long-term forecast.
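The reduction of an ensemble to five weighted scenarios can be sketched with simple quantile binning of the members' cumulative precipitation-minus-evaporation totals. The member values and the quintile rule below are illustrative assumptions; the actual scenario-construction technique may differ:

```python
# Reduce ensemble members (cumulative P - E in mm) to five scenarios by
# sorting members into quintile bins and averaging each bin. Each
# scenario's likelihood is the share of members in its bin.
def five_scenarios(members):
    m = sorted(members)
    n = len(m)
    labels = ["extreme dry", "dry", "normal", "wet", "extreme wet"]
    bounds = [i * n // 5 for i in range(6)]
    out = []
    for lab, lo, hi in zip(labels, bounds, bounds[1:]):
        bin_ = m[lo:hi]
        out.append((lab, sum(bin_) / len(bin_), len(bin_) / n))
    return out

pe = [-42, -35, -30, -12, -8, -5, -1, 0, 3, 6, 9, 14, 18, 25, 33]  # 15 members
for label, mean, prob in five_scenarios(pe):
    print(f"{label:12s} mean P-E = {mean:6.1f} mm  likelihood = {prob:.2f}")
```

Quintile binning fixes each likelihood near 0.2; clustering the members instead would let the likelihoods vary with how the ensemble actually spreads.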
Liu, Yan; Watson, Stella C; Gettings, Jenna R; Lund, Robert B; Nordone, Shila K; Yabsley, Michael J; McMahan, Christopher S
2017-01-01
This paper forecasts the 2016 canine Anaplasma spp. seroprevalence in the United States from eight climate, geographic and societal factors. The forecast's construction and an assessment of its performance are described. The forecast is based on a spatial-temporal conditional autoregressive model fitted to over 11 million Anaplasma spp. seroprevalence test results for dogs conducted in the 48 contiguous United States during 2011-2015. The forecast uses county-level data on eight predictive factors, including annual temperature, precipitation, relative humidity, county elevation, forestation coverage, surface water coverage, population density and median household income. Non-static factors are extrapolated into the forthcoming year with various statistical methods. The fitted model and factor extrapolations are used to estimate next year's regional prevalence. The correlation between the observed and model-estimated county-by-county Anaplasma spp. seroprevalence for the five-year period 2011-2015 is 0.902, demonstrating reasonable model accuracy. The weighted correlation (accounting for different sample sizes) between 2015 observed and forecasted county-by-county Anaplasma spp. seroprevalence is 0.987, exhibiting that the proposed approach can be used to accurately forecast Anaplasma spp. seroprevalence. The forecast presented herein can a priori alert veterinarians to areas expected to see Anaplasma spp. seroprevalence beyond the accepted endemic range. The proposed methods may prove useful for forecasting other diseases.
Evaluation of CMAQ and CAMx Ensemble Air Quality Forecasts during the 2015 MAPS-Seoul Field Campaign
NASA Astrophysics Data System (ADS)
Kim, E.; Kim, S.; Bae, C.; Kim, H. C.; Kim, B. U.
2015-12-01
The performance of air quality forecasts during the 2015 MAPS-Seoul Field Campaign was evaluated. A forecast system has been operated to support the campaign's daily aircraft route decisions for airborne measurements of long-range transported plumes. We utilized two real-time ensemble systems based on the Weather Research and Forecasting (WRF)-Sparse Matrix Operator Kernel Emissions (SMOKE)-Comprehensive Air quality Model with extensions (CAMx) modeling framework and the WRF-SMOKE-Community Multiscale Air Quality (CMAQ) framework over northeastern Asia to simulate PM10 concentrations. The Global Forecast System (GFS) from the National Centers for Environmental Prediction (NCEP) was used to provide meteorological inputs for the forecasts. For an additional set of retrospective simulations, the ERA-Interim Reanalysis from the European Centre for Medium-Range Weather Forecasts (ECMWF) was also utilized to assess forecast uncertainties arising from the meteorological data used. The Model Inter-Comparison Study for Asia (MICS-Asia) and National Institute of Environmental Research (NIER) Clean Air Policy Support System (CAPSS) emission inventories were used for foreign and domestic emissions, respectively. In this study, we evaluate the CMAQ and CAMx model performance during the campaign by comparing the results to the airborne and surface measurements. Contributions of foreign and domestic emissions are estimated using a brute-force method. Analyses of model performance and emissions will be utilized to improve air quality forecasts for the upcoming KORUS-AQ field campaign planned in 2016.
Forecasting of Water Consumptions Expenditure Using Holt-Winter’s and ARIMA
NASA Astrophysics Data System (ADS)
Razali, S. N. A. M.; Rusiman, M. S.; Zawawi, N. I.; Arbin, N.
2018-04-01
This study was carried out to forecast the water consumption expenditure of a Malaysian university, specifically Universiti Tun Hussein Onn Malaysia (UTHM). The proposed Holt-Winter's and Auto-Regressive Integrated Moving Average (ARIMA) models were applied to forecast the water consumption expenditure in Ringgit Malaysia from 2006 until 2014. The two models were compared, with the Mean Absolute Percentage Error (MAPE) and Mean Absolute Deviation (MAD) used as performance measures. It was found that the ARIMA model gave more accurate forecasts, with lower MAPE and MAD values. The analysis showed that an ARIMA(2,1,4) model provides a reasonable forecasting tool for university campus water usage.
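The exponential-smoothing side of such a comparison can be sketched with Holt's linear method (trend only, no seasonal term) scored by the two error measures used. The expenditure figures and smoothing constants below are invented for illustration, not the UTHM data:

```python
# One-step-ahead Holt's linear exponential smoothing, plus MAD and MAPE.
def holt_forecasts(y, alpha=0.5, beta=0.3):
    level, trend = y[0], y[1] - y[0]
    preds = []
    for v in y[1:]:
        preds.append(level + trend)                  # one-step forecast
        new_level = alpha * v + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
    return preds

def mad(a, p):  return sum(abs(x - y) for x, y in zip(a, p)) / len(p)
def mape(a, p): return 100 * sum(abs((x - y) / x) for x, y in zip(a, p)) / len(p)

spend = [210.0, 225.0, 240.0, 238.0, 255.0, 270.0, 284.0, 300.0]  # RM '000
preds = holt_forecasts(spend)
actual = spend[1:]
print(round(mad(actual, preds), 2), round(mape(actual, preds), 2))
```

Fitting the ARIMA(2,1,4) side of the comparison would typically use a library such as statsmodels; MAD and MAPE then rank the two fitted models as in the study.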
NASA Astrophysics Data System (ADS)
Gelfan, Alexander; Moreydo, Vsevolod; Motovilov, Yury; Solomatine, Dimitri P.
2018-04-01
A long-term forecasting ensemble methodology, applied to water inflows into the Cheboksary Reservoir (Russia), is presented. The methodology is based on a version of the semi-distributed hydrological model ECOMAG (ECOlogical Model for Applied Geophysics) that allows for the calculation of an ensemble of inflow hydrographs using two different sets of weather ensembles for the lead time period: observed weather data, constructed on the basis of the Ensemble Streamflow Prediction methodology (ESP-based forecast), and synthetic weather data, simulated by a multi-site weather generator (WG-based forecast). We have studied the following: (1) whether there is any advantage of the developed ensemble forecasts in comparison with the currently issued operational forecasts of water inflow into the Cheboksary Reservoir, and (2) whether there is any noticeable improvement in probabilistic forecasts when using the WG-simulated ensemble compared to the ESP-based ensemble. We have found that for a 35-year period beginning from the reservoir filling in 1982, both continuous and binary model-based ensemble forecasts (issued in the deterministic form) outperform the operational forecasts of the April-June inflow volume actually used and, additionally, provide acceptable forecasts of additional water regime characteristics besides the inflow volume. We have also demonstrated that the model performance measures (in the verification period) obtained from the WG-based probabilistic forecasts, which are based on a large number of possible weather scenarios, appeared to be more statistically reliable than the corresponding measures calculated from the ESP-based forecasts based on the observed weather scenarios.
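The ESP idea behind the first ensemble can be sketched as running one hydrological simulation per historical weather trace, all starting from the current model state. The linear-reservoir toy model and the traces below are illustrative assumptions, not ECOMAG:

```python
# ESP sketch: one model run per historical forcing trace yields an
# ensemble of inflow-volume forecasts from the same initial state.
def run_model(storage, precip_trace, k=0.2):
    # toy linear reservoir: each day a fraction k of storage becomes flow
    volume = 0.0
    for p in precip_trace:
        storage += p
        flow = k * storage
        storage -= flow
        volume += flow
    return volume

historical_traces = [                    # one spring trace per past year
    [5, 0, 2, 8, 1, 0, 3],
    [0, 1, 0, 2, 0, 0, 1],
    [9, 4, 6, 2, 7, 5, 3],
]
state = 40.0                             # current (initial) storage
ensemble = sorted(run_model(state, tr) for tr in historical_traces)
print([round(v, 1) for v in ensemble])
```

A WG-based forecast replaces the fixed historical traces with many synthetic traces from a weather generator, enlarging the ensemble beyond the observed record.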
Seasonal Drought Prediction in East Africa: Can National Multi-Model Ensemble Forecasts Help?
NASA Technical Reports Server (NTRS)
Shukla, Shraddhanand; Roberts, J. B.; Funk, Christopher; Robertson, F. R.; Hoell, Andrew
2015-01-01
The increasing food and water demands of East Africa's growing population are stressing the region's inconsistent water resources and rain-fed agriculture. As recently as 2011, part of this region underwent one of the worst famine events in its history. Timely and skillful drought forecasts at seasonal scale for this region can inform better water and agro-pastoral management decisions, support optimal allocation of the region's water resources, and mitigate socio-economic losses incurred by droughts. However, seasonal drought prediction in this region faces several challenges. Lack of skillful seasonal rainfall forecasts, the focus of this presentation, is one of those major challenges. In the past few decades, major strides have been taken towards improvement of seasonal-scale dynamical climate forecasts. The National Centers for Environmental Prediction's (NCEP) National Multi-model Ensemble (NMME) is one such state-of-the-art dynamical climate forecast system. The NMME incorporates climate forecasts from 6+ fully coupled dynamical models, resulting in 100+ ensemble member forecasts. Recent studies have indicated that in general NMME offers improvement over forecasts from any single model. However, thus far the skill of NMME for forecasting rainfall in a vulnerable region like East Africa has been unexplored. In this presentation we report findings of a comprehensive analysis that examines the strengths and weaknesses of NMME in forecasting rainfall at seasonal scale in East Africa for all three of the prominent seasons for the region (i.e., March-April-May, July-August-September and October-November-December). Simultaneously, we also describe hybrid approaches, which combine statistical approaches with NMME forecasts, to improve rainfall forecast skill in the region when raw NMME forecasts lack skill.
Regional Model Nesting Within GFS Daily Forecasts Over West Africa
NASA Technical Reports Server (NTRS)
Druyan, Leonard M.; Fulakeza, Matthew; Lonergan, Patrick; Worrell, Ruben
2010-01-01
The study uses the RM3, the regional climate model at the Center for Climate Systems Research of Columbia University and the NASA/Goddard Institute for Space Studies (CCSR/GISS). The paper evaluates 30 48-hour RM3 weather forecasts over West Africa during September 2006, made on a 0.5° grid nested within 1° Global Forecast System (GFS) global forecasts. September 2006 was Special Observing Period #3 of the African Monsoon Multidisciplinary Analysis (AMMA). Archived GFS initial conditions and lateral boundary conditions for the simulations, from the US National Weather Service, National Oceanic and Atmospheric Administration, were interpolated four times daily. Precipitation forecasts are validated against Tropical Rainfall Measuring Mission (TRMM) satellite estimates and data from the Famine Early Warning System (FEWS), which includes rain gauge measurements, and circulation forecasts are compared to Reanalysis 2. Performance statistics for the precipitation forecasts include bias, root-mean-square errors and spatial correlation coefficients. The nested regional model forecasts are compared to GFS forecasts to gauge whether nesting provides additional realistic information. They are also compared to RM3 simulations driven by Reanalysis 2, representing high-potential-skill forecasts, to gauge the sensitivity of results to lateral boundary conditions. Nested RM3/GFS forecasts generate excessive moisture advection toward West Africa, which in turn causes prodigious amounts of model precipitation. This problem is corrected by empirical adjustments in the preparation of lateral boundary conditions and initial conditions. The resulting modified simulations improve on the GFS precipitation forecasts, achieving time-space correlations with TRMM of 0.77 on the first day and 0.63 on the second day.
A real-time RM3/GFS precipitation forecast made and posted by the African Centre of Meteorological Applications for Development (ACMAD) in Niamey, Niger is shown.
NASA Astrophysics Data System (ADS)
Wood, Andy; Clark, Elizabeth; Mendoza, Pablo; Nijssen, Bart; Newman, Andy; Clark, Martyn; Nowak, Kenneth; Arnold, Jeffrey
2017-04-01
Many, if not most, national operational streamflow prediction systems rely on a forecaster-in-the-loop approach that requires the hands-on effort of an experienced human forecaster. This approach evolved from the need to correct for long-standing deficiencies in the models and datasets used in forecasting, and the practice often leads to skillful flow predictions despite the use of relatively simple, conceptual models. Yet the 'in-the-loop' forecast process is not reproducible, which limits opportunities to assess and incorporate new techniques systematically, and the effort required to make forecasts in this way is an obstacle to expanding forecast services - e.g., through adding new forecast locations or more frequent forecast updates, running more complex models, or producing forecasts and hindcasts that can support verification. In the last decade, the hydrologic forecasting community has begun to develop more centralized, 'over-the-loop' systems. The quality of these new forecast products will depend on their ability to leverage research in areas including earth system modeling, parameter estimation, data assimilation, statistical post-processing, weather and climate prediction, verification, and uncertainty estimation through the use of ensembles. Currently, many national operational streamflow forecasting and water management communities have little experience with the strengths and weaknesses of over-the-loop approaches, even as such systems are beginning to be deployed operationally in centers such as ECMWF. There is thus a need both to evaluate these forecasting advances and to demonstrate their potential in a public arena, raising awareness in forecast user communities and development programs alike.
To address this need, the US National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation and the US Army Corps of Engineers, using the NCAR 'System for Hydromet Analysis Research and Prediction Applications' (SHARP) to implement, assess and demonstrate real-time over-the-loop ensemble flow forecasts in a range of US watersheds. The system relies on fully ensemble-based techniques, including: a 100-member ensemble of meteorological model forcings and ensemble particle filter data assimilation for initializing watershed states; analog/regression-based downscaling of ensemble weather forecasts from GEFS; and statistical post-processing of ensemble forecast outputs, all of which run in real time within a workflow managed by ECMWF's ecFlow libraries over large US regional domains. We describe SHARP and present early hindcast and verification results for short- to seasonal-range streamflow forecasts in a number of US case study watersheds.
So, Rita; Teakles, Andrew; Baik, Jonathan; Vingarzan, Roxanne; Jones, Keith
2018-05-01
Visibility degradation, one of the most noticeable indicators of poor air quality, can occur despite relatively low levels of particulate matter when the risk to human health is low. The availability of timely and reliable visibility forecasts can provide a more comprehensive understanding of the anticipated air quality conditions to better inform local jurisdictions and the public. This paper describes the development of a visibility forecasting modeling framework, which leverages the existing air quality and meteorological forecasts from Canada's operational Regional Air Quality Deterministic Prediction System (RAQDPS) for the Lower Fraser Valley of British Columbia. A baseline model (GM-IMPROVE) was constructed using the revised IMPROVE algorithm based on unprocessed forecasts from the RAQDPS. Three additional prototypes (UMOS-HYB, GM-MLR, GM-RF) were also developed and assessed for forecast performance of up to 48 hr lead time during various air quality and meteorological conditions. Forecast performance was assessed by examining their ability to provide both numerical and categorical forecasts in the form of 1-hr total extinction and Visual Air Quality Ratings (VAQR), respectively. While GM-IMPROVE generally overestimated extinction more than twofold, it had skill in forecasting the relative species contribution to visibility impairment, including ammonium sulfate and ammonium nitrate. Both statistical prototypes, GM-MLR and GM-RF, performed well in forecasting 1-hr extinction during daylight hours, with correlation coefficients (R) ranging from 0.59 to 0.77. UMOS-HYB, a prototype based on postprocessed air quality forecasts without additional statistical modeling, provided reasonable forecasts during most daylight hours. In terms of categorical forecasts, the best prototype was approximately 75 to 87% correct, when forecasting for a condensed three-category VAQR. 
A case study, focusing on a poor visual air quality yet low Air Quality Health Index episode, illustrated that the statistical prototypes were able to provide timely and skillful visibility forecasts with lead time up to 48 hr. This study describes the development of a visibility forecasting modeling framework, which leverages the existing air quality and meteorological forecasts from Canada's operational Regional Air Quality Deterministic Prediction System. The main applications include tourism and recreation planning, input into air quality management programs, and educational outreach. Visibility forecasts, when supplemented with the existing air quality and health based forecasts, can assist jurisdictions to anticipate the visual air quality impacts as perceived by the public, which can potentially assist in formulating the appropriate air quality bulletins and recommendations.
DOT National Transportation Integrated Search
2014-05-12
This document details the process that the Volpe National Transportation Systems Center (Volpe) used to develop travel forecasting models for the Federal Highway Administration (FHWA). The purpose of these models is to allow FHWA to forecast future c...
THE EMERGENCE OF NUMERICAL AIR QUALITY FORECASTING MODELS AND THEIR APPLICATION
In recent years the U.S. and other nations have begun programs for short-term local through regional air quality forecasting based upon numerical three-dimensional air quality grid models. These numerical air quality forecast (NAQF) models and systems have been developed and test...
A Practical Model for Forecasting New Freshman Enrollment during the Application Period.
ERIC Educational Resources Information Center
Paulsen, Michael B.
1989-01-01
A simple and effective model for forecasting freshman enrollment during the application period is presented step by step. The model requires minimal and readily available information, uses a simple linear regression analysis on a personal computer, and provides updated monthly forecasts. (MSE)
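The forecasting mechanics this abstract describes — a simple linear regression of final enrollment on applications received to date, refit as the application period unfolds — can be sketched in a few lines. The application counts and enrollment figures below are hypothetical, purely to illustrate the monthly-update idea:

```python
import numpy as np

# Hypothetical history: applications received by the end of March (x)
# and final fall enrollment (y) for five past years.
apps_to_date = np.array([520.0, 480.0, 610.0, 570.0, 650.0])
enrolled     = np.array([300.0, 275.0, 355.0, 330.0, 378.0])

# Simple linear regression: enrollment = b0 + b1 * applications-to-date
b1, b0 = np.polyfit(apps_to_date, enrolled, 1)

# Updated forecast once this year's end-of-March count is known;
# refitting with each month's cumulative count gives the monthly updates.
forecast = b0 + b1 * 600.0
```

Rerunning the fit with April, May, ... cumulative counts (each against its own historical column) yields the updated monthly forecasts the abstract mentions.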
Bayesian Population Forecasting: Extending the Lee-Carter Method.
Wiśniowski, Arkadiusz; Smith, Peter W F; Bijak, Jakub; Raymer, James; Forster, Jonathan J
2015-06-01
In this article, we develop a fully integrated and dynamic Bayesian approach to forecast populations by age and sex. The approach embeds the Lee-Carter type models for forecasting the age patterns, with associated measures of uncertainty, of fertility, mortality, immigration, and emigration within a cohort projection model. The methodology may be adapted to handle different data types and sources of information. To illustrate, we analyze time series data for the United Kingdom and forecast the components of population change to the year 2024. We also compare the results obtained from different forecast models for age-specific fertility, mortality, and migration. In doing so, we demonstrate the flexibility and advantages of adopting the Bayesian approach for population forecasting and highlight areas where this work could be extended.
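The Lee-Carter structure the authors embed in their Bayesian framework models log rates as log m(x,t) = a_x + b_x k_t. For orientation only, here is the classic (non-Bayesian) SVD fit on synthetic data; the article's fully Bayesian estimation with uncertainty measures is considerably richer:

```python
import numpy as np

def lee_carter_fit(log_m):
    """Classic Lee-Carter fit of log m(x,t) = a_x + b_x * k_t via SVD.

    log_m: (ages x years) array of log death rates.
    Returns (a, b, k) under the usual constraints sum(b)=1, sum(k)~0.
    """
    a = log_m.mean(axis=1)                        # average age pattern a_x
    U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    scale = U[:, 0].sum()
    b = U[:, 0] / scale                           # normalise so sum(b) = 1
    k = s[0] * Vt[0] * scale                      # rescale so b_x * k_t is unchanged
    return a, b, k

# Synthetic rates: 5 age groups, 20 years, mild noise.
rng = np.random.default_rng(0)
a_true = np.linspace(-6.0, -2.0, 5)
k_true = np.linspace(2.0, -2.0, 20)
log_m = a_true[:, None] + np.outer(np.full(5, 0.2), k_true) \
        + 0.01 * rng.standard_normal((5, 20))

a, b, k = lee_carter_fit(log_m)
```

Forecasting then reduces to extrapolating the period index k_t (classically with a random walk with drift), which is where the Bayesian treatment of uncertainty enters.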
An optimization model for roadway pricing on rural freeways.
DOT National Transportation Integrated Search
2012-02-01
The main objective of rural roadway pricing is revenue generation, rather than elimination of congestion externalities. This report presents a model that provides optimum tolls accounting for pavement deterioration and economic impacts. This model co...
Multimodal Solutions for Large Scale Evacuation
DOT National Transportation Integrated Search
2009-12-30
In this research, a multimodal transportation model was developed attending the needs of emergency situations, and the solutions provided by the model could be used to moderate congestion during such events. The model incorporated features such as la...
Real-time emergency forecasting technique for situation management systems
NASA Astrophysics Data System (ADS)
Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.
2018-05-01
The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision making in situation management systems. Computational models are improved by the Improved Brown's method, which applies fractal dimension to forecast the short time series data received from sensors and control systems. Reliability of the emergency forecasting results is ensured by filtering invalid sensed data using correlation analysis.
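The Improved Brown's method in the article adapts the smoothing using fractal dimension; as orientation, here is the plain Brown's double exponential smoothing it builds on (the fractal-dimension adaptation itself is omitted, and the smoothing constant is an illustrative choice):

```python
def brown_forecast(series, alpha=0.4, horizon=1):
    """Brown's double (linear) exponential smoothing forecast.

    Returns the `horizon`-step-ahead forecast from the end of `series`.
    """
    s1 = s2 = float(series[0])                   # initialise both smoothers
    for y in series[1:]:
        s1 = alpha * y + (1 - alpha) * s1        # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2       # second smoothing
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + horizon * trend
```

In the article's setting, `series` would be the short window of recent sensor readings, and the fractal dimension of that window would drive the choice of `alpha`.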
Design and development of surface rainfall forecast products on GRAPES_MESO model
NASA Astrophysics Data System (ADS)
Zhili, Liu
2016-04-01
In this paper, we design and develop surface rainfall forecast products using precipitation forecasts from the mesoscale GRAPES_MESO model. The horizontal resolution of the GRAPES_MESO model is 10 km × 10 km, with 751 × 501 grid points and 26 vertical levels, covering 70°E-145.15°E, 15°N-64.35°N. We divided the basin into 7 major watersheds, each of which was divided into a number of sub-regions, 95 sub-regions in all. The Thiessen (Tyson) polygon method is adopted in the calculation of surface rainfall. We used the 24-hour forecast precipitation data of the GRAPES_MESO model to calculate surface rainfall: according to the site information and boundary information of the 95 sub-regions, the forecast surface rainfall of each sub-region was calculated. We can provide real-time surface rainfall forecast products every day. We used the method of fuzzy evaluation to carry out a preliminary verification of the surface rainfall forecast products. Results show that the fuzzy scores at the heavy rain, rainstorm and downpour levels were higher, while the fuzzy score at the light rain level was lower; the forecast performance for heavy rain, rainstorm and downpour surface rainfall was better, whereas the rates of missed and false forecasts at the light rainfall level were higher, so its fuzzy scores were lower.
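Once the Thiessen polygon areas are known, the surface (area-averaged) rainfall of a sub-region is an area-weighted mean of the point forecasts. A minimal sketch with made-up numbers:

```python
import numpy as np

def surface_rainfall(point_rain, thiessen_areas):
    """Area-averaged ("surface") rainfall for one sub-region.

    point_rain:     forecast rainfall at each station/grid point (mm)
    thiessen_areas: area of each point's Thiessen polygon (km^2)
    """
    r = np.asarray(point_rain, dtype=float)
    a = np.asarray(thiessen_areas, dtype=float)
    return float((r * a).sum() / a.sum())

# Three points whose polygons cover 40%, 35% and 25% of the sub-region:
print(surface_rainfall([10.0, 20.0, 0.0], [40.0, 35.0, 25.0]))  # 11.0
```

Repeating this over the 95 sub-regions with the 24-hour GRAPES_MESO grids gives one daily surface-rainfall product per sub-region.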
NASA Astrophysics Data System (ADS)
Nanda, Trushnamayee; Beria, Harsh; Sahoo, Bhabagrahi; Chatterjee, Chandranath
2016-04-01
The increasing frequency of hydrologic extremes in a warming climate calls for the development of reliable flood forecasting systems. The unavailability of meteorological parameters in real time, especially in the developing parts of the world, makes it challenging to accurately predict floods, even at short lead times. The satellite-based Tropical Rainfall Measuring Mission (TRMM) provides an alternative where real-time precipitation data are scarce. Moreover, rainfall forecasts from numerical weather prediction models, such as the medium-term forecasts issued by the European Centre for Medium-Range Weather Forecasts (ECMWF), are promising for multistep-ahead flow forecasts. We systematically evaluate these rainfall products over a large catchment in Eastern India (Mahanadi River basin). We found spatially coherent trends, with both the real-time TRMM rainfall and the ECMWF rainfall forecast products overestimating low rainfall events and underestimating high rainfall events; no significant bias was found for medium rainfall events. Another key finding was that these rainfall products captured the phase of the storms quite well but suffered from consistent under-prediction. The utility of the real-time TRMM and ECMWF forecast products is evaluated by rainfall-runoff modeling using different artificial neural network (ANN)-based models up to 3 days ahead. Keywords: TRMM; ECMWF; forecast; ANN; rainfall-runoff modeling
Applications of the gambling score in evaluating earthquake predictions and forecasts
NASA Astrophysics Data System (ADS)
Zhuang, Jiancang; Zechar, Jeremy D.; Jiang, Changsheng; Console, Rodolfo; Murru, Maura; Falcone, Giuseppe
2010-05-01
This study presents a new method, the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecast scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some of his reputation points. The reference model, which plays the role of the house, determines how many reputation points the forecaster can gain if he succeeds, according to a fair rule, and takes away the reputation points bet by the forecaster if he loses. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. For discrete predictions, we apply this method to evaluate the performance of Shebalin's predictions made using the Reverse Tracing of Precursors (RTP) algorithm and of the predictions from the Annual Consultation Meeting on Earthquake Tendency held by the China Earthquake Administration. For the continuous case, we use it to compare the probability forecasts of seismicity in the Abruzzo region before and after the L'Aquila earthquake based on the ETAS model and the PPE model.
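For the discrete case, the betting rule can be sketched as follows. This is one plausible reading of the fair-odds payoff described in the abstract, not necessarily the authors' exact formulation:

```python
def gambling_score(bets, outcomes, ref_probs):
    """Net reputation change over a set of binary predictions.

    bets:      reputation points wagered on each predicted event
    outcomes:  True if the event occurred
    ref_probs: the reference model's probability for each event

    Fair odds: a successful bet of r against reference probability p0
    returns r * (1 - p0) / p0 points; a failed bet loses the r points.
    """
    total = 0.0
    for r, hit, p0 in zip(bets, outcomes, ref_probs):
        total += r * (1 - p0) / p0 if hit else -r
    return total

# One hit and one miss against a reference that deems each event 20% likely:
# the hit pays 4 points, the miss costs 1, net +3.
print(gambling_score([1.0, 1.0], [True, False], [0.2, 0.2]))
```

The fair-odds payoff means a forecaster who merely reproduces the reference model gains nothing on average, so positive net reputation indicates genuine skill beyond the reference.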
NASA Astrophysics Data System (ADS)
Chen, Y.; Li, J.; Xu, H.
2016-01-01
Physically based distributed hydrological models (hereafter PBDHMs) divide the terrain of the whole catchment into a number of grid cells at fine resolution and assign different terrain data and precipitation to different cells. They are regarded as having the potential to improve catchment hydrological process simulation and prediction capability. Early on, physically based distributed hydrological models were assumed to derive their parameters directly from terrain properties, so that no parameter calibration would be needed. Unfortunately, the uncertainties associated with this derivation are very high, which has limited their application in flood forecasting, so parameter optimization may still be necessary. There are two main purposes for this study: the first is to propose a parameter optimization method for physically based distributed hydrological models in catchment flood forecasting using the particle swarm optimization (PSO) algorithm, and to test and improve its performance; the second is to explore the possibility of improving the flood forecasting capability of physically based distributed hydrological models through parameter optimization. In this paper, based on the scalar concept, a general framework for parameter optimization of PBDHMs for catchment flood forecasting is first proposed that could be used for all PBDHMs. Then, with the Liuxihe model, a physically based distributed hydrological model proposed for catchment flood forecasting, as the study model, an improved PSO algorithm is developed for parameter optimization of the Liuxihe model in catchment flood forecasting. The improvements include adoption of a linearly decreasing inertia weight strategy and an arccosine function strategy to adjust the acceleration coefficients.
This method was tested in two catchments of different sizes in southern China, and the results show that the improved PSO algorithm can be used effectively for Liuxihe model parameter optimization and substantially improves the model's capability in catchment flood forecasting, thus showing that parameter optimization is necessary to improve the flood forecasting capability of physically based distributed hydrological models. It was also found that the appropriate particle number and maximum evolution number of the PSO algorithm for Liuxihe model catchment flood forecasting are 20 and 30, respectively.
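The two PSO improvements named above, a linearly decreasing inertia weight and arccosine-shaped acceleration coefficients, can be sketched as schedules over the iteration count. The exact functional forms and constants used with the Liuxihe model are in the paper; these are plausible textbook variants:

```python
import math

def pso_coefficients(t, t_max, w_max=0.9, w_min=0.4, c_max=2.5, c_min=0.5):
    """Iteration-dependent PSO coefficients.

    - inertia weight w falls linearly from w_max to w_min;
    - c1 (cognitive) falls and c2 (social) rises along an arccosine
      curve, moving the swarm from exploration to exploitation.
    Constants and functional forms here are illustrative, not the
    values calibrated for the Liuxihe model.
    """
    frac = t / t_max
    w = w_max - (w_max - w_min) * frac
    g = math.acos(2 * frac - 1) / math.pi   # smooth 1 -> 0 as frac goes 0 -> 1
    c1 = c_min + (c_max - c_min) * g
    c2 = c_min + (c_max - c_min) * (1 - g)
    return w, c1, c2
```

With the maximum evolution number of 30 reported above, `pso_coefficients(t, 30)` would be evaluated once per iteration before the velocity update of each of the 20 particles.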
Multi-RCM ensemble downscaling of global seasonal forecasts (MRED)
NASA Astrophysics Data System (ADS)
Arritt, R.
2009-04-01
Regional climate models (RCMs) have long been used to downscale global climate simulations. In contrast, the ability of RCMs to downscale seasonal climate forecasts has received little attention. The Multi-RCM Ensemble Downscaling (MRED) project was recently initiated to address the question: does dynamical downscaling using RCMs provide additional useful information for seasonal forecasts made by global models? MRED is using a suite of RCMs to downscale seasonal forecasts produced by the National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) seasonal forecast system and the NASA GEOS5 system. The initial focus is on wintertime forecasts in order to evaluate topographic forcing, snowmelt, and the usefulness of higher resolution for near-surface fields influenced by high-resolution orography. Each RCM covers the conterminous U.S. at approximately 32 km resolution, comparable to the scale of the North American Regional Reanalysis (NARR), which will be used to evaluate the models. The forecast ensemble for each RCM comprises 15 members over a period of 22+ years (from 1982 to 2003+) for the forecast period 1 December - 30 April. Each RCM will create a 15-member lagged ensemble by starting on different dates in the preceding November. This results in a 120-member ensemble for each projection (8 RCMs by 15 members per RCM). The RCMs will be continually updated at their lateral boundaries using 6-hourly output from CFS or GEOS5. Hydrometeorological output will be produced in a standard netCDF-based format on a common analysis grid, which simplifies both model intercomparison and the generation of ensembles. MRED will compare individual RCM and global forecasts as well as ensemble-mean precipitation and temperature forecasts, which are currently being used to drive macroscale land surface models (LSMs). Metrics of ensemble spread will also be evaluated.
Extensive process-oriented analysis will be performed to link improvements in downscaled forecast skill to regional forcings and physical mechanisms. Our overarching goal is to determine what additional skill can be provided by a community ensemble of high resolution regional models, which we believe will define a strategy for more skillful and useful regional seasonal climate forecasts.
Communicating uncertainty in hydrological forecasts: mission impossible?
NASA Astrophysics Data System (ADS)
Ramos, Maria-Helena; Mathevet, Thibault; Thielen, Jutta; Pappenberger, Florian
2010-05-01
Cascading uncertainty in meteo-hydrological modelling chains for forecasting and integrated flood risk assessment is an essential step to improve the quality of hydrological forecasts. Although the best methodology to quantify the total predictive uncertainty in hydrology is still debated, there is common agreement that one must avoid uncertainty misrepresentation and miscommunication, as well as misinterpretation of information by users. Several recent studies point out that uncertainty, when properly explained and defined, is no longer unwelcome among emergency response organizations, users of flood risk information and the general public. However, efficient communication of uncertain hydro-meteorological forecasts is far from being a resolved issue. This study focuses on the interpretation and communication of uncertain hydrological forecasts, based on (uncertain) meteorological forecasts and (uncertain) rainfall-runoff modelling approaches, to decision-makers such as operational hydrologists and water managers in charge of flood warning and scenario-based reservoir operation. An overview of the typical flow of uncertainties and risk-based decisions in hydrological forecasting systems is presented. The challenges related to the extraction of meaningful information from probabilistic forecasts and the test of its usefulness in assisting operational flood forecasting are illustrated with the help of two case studies: 1) a study on the use and communication of probabilistic flood forecasting within the European Flood Alert System; 2) a case study on the use of probabilistic forecasts by operational forecasters from the hydroelectricity company EDF in France. These examples show that attention must be paid to initiatives that promote or reinforce the active participation of expert forecasters in the forecasting chain.
The practice of face-to-face forecast briefings, focusing on sharing how forecasters interpret, describe and perceive the model output forecasted scenarios, is essential. We believe that the efficient communication of uncertainty in hydro-meteorological forecasts is not a mission impossible. Questions remaining unanswered in probabilistic hydrological forecasting should not neutralize the goal of such a mission, and the suspense kept should instead act as a catalyst for overcoming the remaining challenges.
A framework for improving a seasonal hydrological forecasting system using sensitivity analysis
NASA Astrophysics Data System (ADS)
Arnal, Louise; Pappenberger, Florian; Smith, Paul; Cloke, Hannah
2017-04-01
Seasonal streamflow forecasts are of great value for the socio-economic sector, for applications such as navigation, flood and drought mitigation, and reservoir management for hydropower generation and water allocation to agriculture and drinking water. However, to date, the performance of dynamical seasonal hydrological forecasting systems (systems based on running seasonal meteorological forecasts through a hydrological model to produce seasonal hydrological forecasts) is still limited in space and time. In this context, ESP (Ensemble Streamflow Prediction) remains an attractive method for seasonal streamflow forecasting, as it relies on forcing a hydrological model (starting from the latest observed or simulated initial hydrological conditions) with historical meteorological observations. This makes it cheaper to run than a standard dynamical seasonal hydrological forecasting system, for which the seasonal meteorological forecasts must first be produced, while still producing skilful forecasts. There is thus a need to focus resources and time on improvements in dynamical seasonal hydrological forecasting systems that will eventually lead to significant improvements in the skill of the streamflow forecasts generated. Sensitivity analyses are a powerful tool that can be used to disentangle the relative contributions of the two main sources of errors in seasonal streamflow forecasts, namely the initial hydrological conditions (IHC; e.g., soil moisture, snow cover, initial streamflow, among others) and the meteorological forcing (MF; i.e., seasonal meteorological forecasts of precipitation and temperature, input to the hydrological model). Sensitivity analyses are, however, most useful if they inform and change current operational practices. To this end, we propose a method to improve the design of a seasonal hydrological forecasting system.
This method is based on sensitivity analyses, informing the forecasters as to which element of the forecasting chain (i.e., IHC or MF) could potentially lead to the highest increase in seasonal hydrological forecasting performance, after each forecast update.
Tracking signal test to monitor an intelligent time series forecasting model
NASA Astrophysics Data System (ADS)
Deng, Yan; Jaraiedi, Majid; Iskander, Wafik H.
2004-03-01
Extensive research has been conducted on the subject of intelligent time series forecasting, including many variations on the use of neural networks. However, investigation of model adequacy over time, after the training process is completed, remains to be fully explored. In this paper we demonstrate how a smoothed-error tracking signal test can be incorporated into a neuro-fuzzy model to monitor the forecasting process and serve as a statistical measure for keeping the forecasting model up to date. The proposed monitoring procedure is effective in detecting nonrandom changes due to model inadequacy, lack of unbiasedness in the estimation of model parameters, or deviations from existing patterns. This powerful detection device will result in improved forecast accuracy in the long run. An example data set is used to demonstrate the application of the proposed method.
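A smoothed-error tracking signal of the kind described is commonly computed as Trigg's ratio of smoothed error to smoothed absolute error. A minimal sketch, where the smoothing constant, threshold and warm-up length are illustrative choices rather than values from the paper:

```python
def tracking_signal_flags(errors, beta=0.1, threshold=0.5, warmup=3):
    """Trigg's smoothed-error tracking signal over a stream of
    one-step forecast errors; flags points where the signal
    suggests systematic bias (nonrandom changes)."""
    e_smooth = mad = 0.0
    flags = []
    for i, e in enumerate(errors):
        e_smooth = beta * e + (1 - beta) * e_smooth   # smoothed error
        mad = beta * abs(e) + (1 - beta) * mad        # smoothed |error|
        ts = e_smooth / mad if mad else 0.0           # tracking signal in [-1, 1]
        flags.append(i >= warmup and abs(ts) > threshold)
    return flags

# Alternating errors look unbiased; a run of positive errors trips the test.
print(tracking_signal_flags([1, -1, 1, -1, 3, 3, 3, 3, 3]))
```

In the paper's setting, a tripped flag would signal that the neuro-fuzzy forecaster has drifted from the data-generating pattern and needs retraining or updating.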
NASA Astrophysics Data System (ADS)
LI, J.; Chen, Y.; Wang, H. Y.
2016-12-01
In large-basin flood forecasting, the forecasting lead time is very important. Advances in numerical weather prediction (NWP) in the past decades provide new inputs to extend the flood forecasting lead time in large rivers. The current challenge is that the uncertainty of quantitative precipitation forecasts (QPF) from these NWP models is still high, so controlling QPF uncertainty is an emerging technical requirement. The Weather Research and Forecasting (WRF) model is one such NWP model, and how to control the QPF uncertainty of WRF is a research topic for many in the meteorological community. In this study, QPF products for the Liujiang River basin, a large river with a drainage area of 56,000 km², were first compared with ground observations from a rain gauge network; the results show that the uncertainty of the WRF QPF is relatively high. A post-processing algorithm that correlates the QPF with the observed precipitation is therefore proposed to remove the systematic bias in the QPF. With this algorithm, the post-processed WRF QPF is close to the ground-observed precipitation in terms of area-averaged precipitation. The precipitation is then coupled with the Liuxihe model, a physically based distributed hydrological model widely used in small-watershed flash flood forecasting. The Liuxihe model works well with gridded precipitation from NWP, can optimize model parameters even when only a few observed hydrological data are available, has very high model resolution to improve performance, and runs with a parallel algorithm on high-performance supercomputers when applied to large rivers. Two flood events in the Liujiang River were collected; one was used to optimize the model parameters and the other to validate the model. The results show that the river flow simulation is improved substantially and could be used for real-time flood forecasting trials aimed at extending the flood forecasting lead time.
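The bias-removal step, regressing observed precipitation on WRF QPF over past events and applying the fitted relation to new forecasts, can be sketched as follows; a simple linear stand-in for the correlation-based correction the abstract describes:

```python
import numpy as np

def remove_qpf_bias(qpf_train, obs_train, qpf_new):
    """Fit obs = a * qpf + b on past events and apply it to new QPF.

    A simple linear stand-in for the correlation-based systematic-bias
    removal described in the abstract.
    """
    a, b = np.polyfit(qpf_train, obs_train, 1)
    return np.maximum(a * np.asarray(qpf_new, dtype=float) + b, 0.0)  # rain >= 0

# If the raw QPF has consistently doubled the gauge rainfall,
# the fitted correction halves new forecasts.
print(remove_qpf_bias([2.0, 4.0, 6.0, 8.0], [1.0, 2.0, 3.0, 4.0], [10.0]))
```

The corrected gridded QPF, rather than the raw WRF output, is what would then be fed to the distributed hydrological model.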
A comparative verification of high resolution precipitation forecasts using model output statistics
NASA Astrophysics Data System (ADS)
van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees
2017-04-01
Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution mesoscale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both the double-penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, as a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the 3 post-processed models, but larger differences for individual lead times. In addition, the Fractions Skill Score is computed using the 3 deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they only perform similarly to or somewhat better than precipitation forecasts from the 2 lower-resolution models, at least in the Netherlands.
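Extended logistic regression makes the predictand threshold itself a predictor, so one fitted equation yields a coherent probability distribution over all thresholds. A sketch with hypothetical coefficients and a scalar predictor (the study's actual predictors are spatial precipitation patterns around each station):

```python
import math

def elr_exceedance(x, q, a=-1.0, b=1.2, c=-0.8):
    """P(precip >= q | predictor x) from one extended logistic
    regression equation: logit(p) = a + b*x + c*sqrt(q).

    Coefficients here are hypothetical, not fitted values from the
    study; the key point is that the threshold q enters the equation,
    so a single fit covers all thresholds consistently.
    """
    z = a + b * x + c * math.sqrt(q)
    return 1.0 / (1.0 + math.exp(-z))

# Exceedance probabilities decrease coherently with the threshold:
for q in (0.1, 1.0, 5.0, 20.0):
    print(q, round(elr_exceedance(2.0, q), 3))
```

Because c is negative and sqrt(q) is monotone, the implied exceedance probabilities can never cross between thresholds, one of ELR's main advantages over fitting a separate logistic regression per threshold.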
NASA Astrophysics Data System (ADS)
Dreano, Denis; Tsiaras, Kostas; Triantafyllou, George; Hoteit, Ibrahim
2017-07-01
Forecasting the state of large marine ecosystems is important for many economic and public health applications. However, advanced three-dimensional (3D) ecosystem models, such as the European Regional Seas Ecosystem Model (ERSEM), are computationally expensive, especially when implemented within an ensemble data assimilation system requiring several parallel integrations. As an alternative to 3D ecological forecasting systems, we propose to implement a set of regional one-dimensional (1D) water-column ecological models that run at a fraction of the computational cost. The 1D model domains are determined using a Gaussian mixture model (GMM)-based clustering method and satellite chlorophyll-a (Chl-a) data. Regionally averaged Chl-a data are assimilated into the 1D models using the singular evolutive interpolated Kalman (SEIK) filter. To laterally exchange information between subregions and improve the forecasting skills, we introduce a new correction step to the assimilation scheme, in which we assimilate a statistical forecast of future Chl-a observations based on information from neighbouring regions. We apply this approach to the Red Sea and show that the assimilative 1D ecological models can forecast surface Chl-a concentration with high accuracy. The statistical assimilation step further improves the forecasting skill by as much as 50%. This general approach of clustering large marine areas and running several interacting 1D ecological models is very flexible. It allows many combinations of clustering, filtering and regression techniques to be used and can be applied to build efficient forecasting systems in other large marine ecosystems.
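The clustering step can be illustrated with a tiny one-dimensional Gaussian mixture fitted by EM; a toy stand-in for the GMM-based delimitation of subregions from satellite Chl-a climatologies (a real application would cluster full per-pixel climatologies with a library implementation):

```python
import numpy as np

def gmm_1d(x, iters=50):
    """Tiny EM for a two-component 1-D Gaussian mixture.

    Returns hard cluster labels and the fitted component means.
    """
    mu = np.array([x.min(), x.max()], dtype=float)   # spread-out init
    var = np.full(2, x.var())
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        dens = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update weights, means and variances
        n = resp.sum(axis=0)
        w = n / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / n
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / n
    return resp.argmax(axis=1), mu

# Two well-separated "regimes" of Chl-a concentration (mg m^-3):
rng = np.random.default_rng(1)
chla = np.concatenate([0.1 + 0.02 * rng.standard_normal(100),
                       1.0 + 0.05 * rng.standard_normal(100)])
labels, means = gmm_1d(chla)
```

Each resulting cluster of pixels would then define the domain of one regional 1D water-column model.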
NASA Astrophysics Data System (ADS)
Tsai, Hsiao-Chung; Chen, Pang-Cheng; Elsberry, Russell L.
2017-04-01
The objective of this study is to evaluate the predictability of extended-range forecasts of tropical cyclones (TCs) in the western North Pacific using reforecasts from the National Centers for Environmental Prediction (NCEP) Global Ensemble Forecast System (GEFS) during 1996-2015, and from the Climate Forecast System (CFS) during 1999-2010. Tsai and Elsberry have demonstrated that an opportunity exists to support hydrological operations by using the extended-range TC formation and track forecasts in the western North Pacific from the ECMWF 32-day ensemble. To demonstrate this potential for the decision-making processes regarding water resource management and hydrological operations in Taiwan's reservoir watershed areas, special attention is given to the skill of the NCEP GEFS and CFS models in predicting the TCs affecting the Taiwan area. The first objective of this study is to analyze the skill of NCEP GEFS and CFS TC forecasts and quantify the forecast uncertainties via verifications of categorical binary forecasts and probabilistic forecasts. The second objective is to investigate the relationships among the large-scale environmental factors [e.g., El Niño Southern Oscillation (ENSO), Madden-Julian Oscillation (MJO), etc.] and the model forecast errors by using the reforecasts. Preliminary results indicate that the skill of the TC activity forecasts based on the raw forecasts can be further improved if the model biases are minimized by utilizing these reforecasts.
Spacebuoy: A University Nanosat Space Weather Mission (III)
2013-10-11
ionospheric forecasting models; specifically, the operational Global Assimilation of Ionospheric Measurements (GAIM) model currently used by the Air Force... Mission Objectives: provide critical space weather data for use in ionospheric forecasting efforts, particularly assimilated data used in the GAIM
NOAA's weather forecasts go hyper-local with next-generation weather model
New model will help forecasters predict a storm's path, timing and intensity better than ever. September 30, 2014. This is a comparison of two weather forecast models looking
Hou, Xianlong; Hodges, Ben R; Feng, Dongyu; Liu, Qixiao
2017-03-15
As oil transport increases in the Texas bays, greater risks of ship collisions will become a challenge, with oil spill accidents as a consequence. To minimize ecological damage and optimize rapid response, emergency managers need to be informed of how fast and where oil will spread as soon as possible after a spill. State-of-the-art operational oil spill forecast modeling systems have taken oil spill response to a new stage. However, uncertainty in the predicted data inputs often compromises the reliability of the forecast results, leading to misdirection in contingency planning. Understanding forecast uncertainty and reliability therefore becomes significant. In this paper, Monte Carlo simulation is implemented to provide parameters to generate forecast probability maps. The oil spill forecast uncertainty is thus quantified by comparing the forecast probability map and the associated hindcast simulation. A HyosPy-based simple statistical model is developed to assess the reliability of an oil spill forecast in terms of belief degree. The technologies developed in this study create a prototype for uncertainty and reliability analysis in numerical oil spill forecast modeling systems, enabling emergency managers to improve the capability of real-time operational oil spill response and impact assessment.
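A forecast probability map of the kind described can be built by counting, per grid cell, the fraction of Monte Carlo members whose oil particles reach it. The grid and trajectories below are invented for illustration, not output of the operational system:

```python
# Sketch of turning Monte Carlo oil-spill realizations into a forecast
# probability map: the probability assigned to a grid cell is the fraction
# of ensemble members whose particles reach it. Cells are (x, y) indices.

def probability_map(member_tracks, nx, ny):
    counts = [[0] * nx for _ in range(ny)]
    for track in member_tracks:               # one realization per member
        hit = {(i, j) for (i, j) in track}    # cells touched by this member
        for i, j in hit:
            counts[j][i] += 1
    m = len(member_tracks)
    return [[c / m for c in row] for row in counts]

members = [
    [(0, 0), (1, 0)],          # member 1 drifts east
    [(0, 0), (1, 0), (1, 1)],  # member 2 drifts east then north
    [(0, 0), (0, 1)],          # member 3 drifts north
]
pmap = probability_map(members, nx=2, ny=2)
print(pmap)  # the release cell (0, 0) is hit by all members -> probability 1.0
```

Comparing such a map against the hindcast track then gives the uncertainty measure described above.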
A Load-Based Temperature Prediction Model for Anomaly Detection
NASA Astrophysics Data System (ADS)
Sobhani, Masoud
Electric load forecasting, as a basic requirement for decision-making in power utilities, has been improved in various aspects in the past decades. Many factors may affect the accuracy of the load forecasts, such as data quality, goodness of the underlying model and load composition. Due to the strong correlation between the input variables (e.g., weather and calendar variables) and the load, the quality of input data plays a vital role in forecasting practices. Even if the forecasting model were able to capture most of the salient features of the load, low-quality input data may result in inaccurate forecasts. Most of the data cleansing efforts in the load forecasting literature have been devoted to the load data. Few studies focused on weather data cleansing for load forecasting. This research proposes an anomaly detection method for the temperature data. The method consists of two components: a load-based temperature prediction model and a detection technique. The effectiveness of the proposed method is demonstrated through two case studies: one based on the data from the Global Energy Forecasting Competition 2014, and the other based on the data published by ISO New England. The results show that by removing the detected observations from the original input data, the final load forecast accuracy is enhanced.
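The two components can be sketched in a few lines: a load-based temperature predictor plus a residual test. The linear inverse model and the threshold rule below are illustrative assumptions, not the paper's exact specification, and all numbers are invented:

```python
# Minimal sketch of a load-based temperature prediction model with a
# residual-based anomaly test. Model form, data and threshold are assumed.

def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / \
        sum((a - mx) ** 2 for a in x)
    return my - b * mx, b

loads = [100, 120, 140, 160, 180]   # MW, hypothetical
temps = [10, 15, 20, 25, 30]        # deg C, hypothetical

a, b = fit_line(loads, temps)
residuals = [t - (a + b * l) for l, t in zip(loads, temps)]
# Residual spread, floored to avoid a zero threshold on a perfect toy fit.
sigma = max((sum(r * r for r in residuals) / len(residuals)) ** 0.5, 0.5)

def is_anomaly(load, reported_temp, k=3.0):
    """Flag a temperature reading far from what the load implies."""
    return abs(reported_temp - (a + b * load)) > k * sigma

print(is_anomaly(150, 22.5))   # consistent with the load -> False
print(is_anomaly(150, 60.0))   # implausibly hot for this load -> True
```

Flagged readings would be removed (or replaced) before the load forecasting model itself is trained.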
NASA Astrophysics Data System (ADS)
DeMott, C. A.; Klingaman, N. P.
2017-12-01
Skillful prediction of the Madden-Julian oscillation (MJO) passage across the Maritime Continent (MC) has important implications for global forecasts of high-impact weather events, such as atmospheric rivers and heat waves. The North American teleconnection response to the MJO is strongest when MJO convection is located in the western Pacific Ocean, but many climate and forecast models are deficient in their simulation of MC-crossing MJO events. Compared to atmosphere-only general circulation models (AGCMs), MJO simulation skill generally improves with the addition of ocean feedbacks in coupled GCMs (CGCMs). Using observations, previous studies have noted that the degree of ocean coupling may vary considerably from one MJO event to the next. The coupling mechanisms may be linked to the presence of ocean Equatorial Rossby waves, the sign and amplitude of Equatorial surface currents, and the upper ocean temperature and salinity profiles. In this study, we assess the role of ocean feedbacks to MJO prediction skill using a subset of CGCMs participating in the Subseasonal-to-Seasonal (S2S) Project database. Oceanic observational and reanalysis datasets are used to characterize the upper ocean background state for observed MJO events that do and do not propagate beyond the MC. The ability of forecast models to capture the oceanic influence on the MJO is first assessed by quantifying SST forecast skill. Next, a set of previously developed air-sea interaction diagnostics is applied to model output to measure the role of SST perturbations on the forecast MJO. The "SST effect" in forecast MJO events is compared to that obtained from reanalysis data. Leveraging all ensemble members of a given forecast helps disentangle oceanic model biases from atmospheric model biases, both of which can influence the expression of ocean feedbacks in coupled forecast systems. Results of this study will help identify areas of needed model improvement for improved MJO forecasts.
Roshani, G H; Karami, A; Salehizadeh, A; Nazemi, E
2017-11-01
The problem of how to precisely measure the volume fractions of oil-gas-water mixtures in a pipeline remains one of the main challenges in the petroleum industry. This paper reports the capability of Radial Basis Function (RBF) networks in forecasting the volume fractions in a gas-oil-water multiphase system. Indeed, in the present research, the volume fractions in the annular three-phase flow are measured based on a dual energy metering system including 152Eu and 137Cs sources and one NaI detector, and then modeled by an RBF model. Since the volume fractions sum to a constant (100%), it is enough for the RBF model to forecast only two volume fractions. In this investigation, three RBF models are employed. The first model is used to forecast the oil and water volume fractions. The next one is utilized to forecast the water and gas volume fractions, and the last one to forecast the gas and oil volume fractions. In the next stage, the numerical data obtained from the MCNP-X code are introduced to the RBF models. Then, the average errors of these three models are calculated and compared. The model with the least error is selected as the best predictive model. Based on the results, the best RBF model forecasts the oil and water volume fractions with a mean relative error of less than 0.5%, which indicates that the RBF model introduced in this study provides an effective mechanism for forecasting the volume fractions.
A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables
NASA Astrophysics Data System (ADS)
Huang, Laura X.; Isaac, George A.; Sheng, Grant
2014-01-01
This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short-range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high-frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from the Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as background gridded data for generating the integrated nowcasts. Seven forecast variables - temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate - are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models at 15 sites, the integrated weighted model was found to produce more accurate forecasts for the seven selected forecast variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
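The multi-category Heidke skill score used for this verification compares the observed proportion correct against the proportion expected by chance from the contingency-table marginals. A short sketch with a made-up 3-category table:

```python
# Multi-category Heidke skill score from a contingency table, as used to
# verify categorical nowcasts. table[i][j] = count of cases observed in
# category i and forecast as category j; the counts below are invented.

def heidke_skill_score(table):
    n = float(sum(sum(row) for row in table))
    k = len(table)
    hit_rate = sum(table[i][i] for i in range(k)) / n
    # Expected proportion correct by chance, from the marginal totals.
    expected = sum(
        sum(table[i]) * sum(row[i] for row in table) for i in range(k)
    ) / n ** 2
    return (hit_rate - expected) / (1.0 - expected)

table = [[30, 5, 1],
         [4, 25, 6],
         [2, 3, 24]]
print(heidke_skill_score(table))  # positive: better than chance
```

A score of 1 means a perfect forecast, 0 means no better than chance, and negative values mean worse than chance.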
NASA Astrophysics Data System (ADS)
Sardinha-Lourenço, A.; Andrade-Campos, A.; Antunes, A.; Oliveira, M. S.
2018-03-01
Recent research on short-term water demand forecasting has shown that models using univariate time series based on historical data are useful and can be combined with other prediction methods to reduce errors. Water demand in drinking water distribution networks is notably repetitive and, under similar meteorological conditions and consumer profiles, allows the development of a heuristic forecast model that, combined with other autoregressive models, can provide reliable forecasts. In this study, a parallel adaptive weighting strategy for forecasting water consumption over the next 24-48 h, using univariate time series of potable water consumption, is proposed. Two Portuguese potable water distribution networks are used as case studies, where the only input data are water consumption and the national calendar. For the development of the strategy, the Autoregressive Integrated Moving Average (ARIMA) method and a short-term forecast heuristic algorithm are used. Simulations with the model showed that, when using a parallel adaptive weighting strategy, the prediction error can be reduced by 15.96% and the average error by 9.20%. This reduction is important for the control and management of water supply systems. The proposed methodology can be extended to other forecast methods, especially given the availability of multiple forecast models.
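One common way to realize such a parallel adaptive weighting is to weight each forecaster in inverse proportion to its recent error; the exact rule in the paper may differ, and the forecasts and errors below are hypothetical:

```python
# Sketch of adaptively weighting two forecasters (e.g., an ARIMA model and
# a heuristic): each model's weight is proportional to the inverse of its
# recent absolute error. Rule and numbers are illustrative assumptions.

def adaptive_weights(recent_errors, eps=1e-9):
    inv = [1.0 / (abs(e) + eps) for e in recent_errors]
    s = sum(inv)
    return [v / s for v in inv]

def combine(forecasts, recent_errors):
    w = adaptive_weights(recent_errors)
    return sum(wi * fi for wi, fi in zip(w, forecasts))

# Hypothetical next-hour demand forecasts (m^3/h) and recent mean errors.
arima_fcst, heuristic_fcst = 105.0, 95.0
errors = [2.0, 6.0]   # the ARIMA model has been 3x more accurate lately
print(combine([arima_fcst, heuristic_fcst], errors))  # pulled toward 105
```

As the relative accuracy of the two models drifts over the day, the weights follow automatically, which is the point of running them in parallel.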
NASA Astrophysics Data System (ADS)
Passarelli, Luigi; Sanso, Bruno; Sandri, Laura; Marzocchi, Warner
2010-05-01
One of the main goals in volcanology is to forecast volcanic eruptions. A genuine forecast must be made before the onset of a volcanic eruption, using only the data available at that time, with the aim of mitigating the volcanic risk associated with the event. In other words, models implemented for forecasting purposes have to provide "forward" forecasts and should avoid merely "retrospective" fitting of the available data. In this perspective, the main idea of the present model is to forecast the next volcanic eruption after the end of the last one, using only the data available at that time. We focus our attention on volcanoes with an open conduit regime and high eruption frequency. We assume a generalization of the classical time predictable model to describe the eruptive behavior of open conduit volcanoes, and we use a Bayesian hierarchical model to make probabilistic forecasts. We apply the model to Kilauea volcano eruptive data and Mt. Etna volcano flank eruption data. The aims of this model are: 1) to test whether or not the Kilauea and Mt. Etna volcanoes follow a time predictable behavior; 2) to discuss the volcanological implications of the inferred time predictable model parameters; 3) to compare the forecast capabilities of this model with other models in the literature. The results obtained using an MCMC sampling algorithm show that both volcanoes follow a time predictable behavior. The inferred values of the time predictable model parameters suggest that the amount of erupted volume could change the dynamics of the magma chamber refilling process during the repose period. The probability gain of this model compared with other models in the literature is appreciably greater than zero. This means that our model produces better forecasts than previous models and could be used in a probabilistic volcanic hazard assessment scheme.
In this perspective, the probabilities of eruption given by our model for Mt. Etna flank eruptions are published on a website and are updated after any change in the eruptive activity.
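The deterministic core of the time predictable model is that the repose after an eruption is proportional to the volume it erupted; the paper wraps this in a Bayesian hierarchical framework, which is omitted here. A toy sketch with invented volumes and dates:

```python
# Toy deterministic core of the time predictable model: repose length is
# proportional to the volume erupted by the preceding event. Volumes,
# reposes and dates below are invented for illustration.

def fit_rate(volumes, reposes):
    """Least-squares slope through the origin for volume = rate * repose."""
    num = sum(v * r for v, r in zip(volumes, reposes))
    den = sum(r * r for r in reposes)
    return num / den   # magma supply rate: volume per unit time

volumes = [10.0, 20.0, 15.0]   # 10^6 m^3, erupted by past events
reposes = [2.0, 4.0, 3.0]      # years until the following eruption

rate = fit_rate(volumes, reposes)
last_volume, last_end = 25.0, 2010.0
print(last_end + last_volume / rate)   # forecast onset of the next eruption
```

In the paper this point estimate is replaced by a full posterior distribution over the onset time, which is what makes the forecast probabilistic.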
NASA Astrophysics Data System (ADS)
Chen, Y.; Li, J.; Xu, H.
2015-10-01
Physically based distributed hydrological models discretize the terrain of the whole catchment into a number of grid cells at fine resolution, assimilate different terrain data and precipitation to different cells, and are regarded as having the potential to improve the simulation and prediction of catchment hydrological processes. In the early stage, physically based distributed hydrological models were assumed to derive model parameters from the terrain properties directly, so there was no need to calibrate model parameters; unfortunately, the uncertainties associated with this parameter derivation are very high, which has limited their application in flood forecasting, so parameter optimization may also be necessary. There are two main purposes for this study: the first is to propose a parameter optimization method for physically based distributed hydrological models in catchment flood forecasting using the PSO algorithm, to test its competence and to improve its performance; the second is to explore the possibility of improving the capability of physically based distributed hydrological models in catchment flood forecasting by parameter optimization. In this paper, based on the scalar concept, a general framework for parameter optimization of PBDHMs for catchment flood forecasting is first proposed that could be used for all PBDHMs. Then, with the Liuxihe model as the study model, which is a physically based distributed hydrological model proposed for catchment flood forecasting, an improved Particle Swarm Optimization (PSO) algorithm is developed for the parameter optimization of the Liuxihe model in catchment flood forecasting; the improvements include adopting a linearly decreasing inertia weight strategy to change the inertia weight, and an arccosine function strategy to adjust the acceleration coefficients.
This method has been tested in two catchments of different sizes in southern China, and the results show that the improved PSO algorithm can be used for Liuxihe model parameter optimization effectively and can largely improve the model capability in catchment flood forecasting, thus proving that parameter optimization is necessary to improve the flood forecasting capability of physically based distributed hydrological models. It has also been found that the appropriate particle number and maximum evolution number of the PSO algorithm for Liuxihe model catchment flood forecasting are 20 and 30, respectively.
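A minimal PSO with the linearly decreasing inertia weight described above can be sketched as follows. The 2-D sphere function stands in for the hydrological objective, fixed acceleration coefficients replace the arccosine schedule for brevity, and the particle count and iteration count mirror the 20/30 values reported:

```python
import random

# Minimal PSO with a linearly decreasing inertia weight, minimizing a 2-D
# sphere function as a stand-in for the flood-forecasting objective.
# Bounds, coefficients and seed are illustrative; the paper's arccosine
# acceleration-coefficient schedule is not reproduced here.

def sphere(x):
    return sum(v * v for v in x)

def pso(obj, dim=2, particles=20, iters=30, w_max=0.9, w_min=0.4,
        c1=2.0, c2=2.0, lo=-5.0, hi=5.0, seed=42):
    rnd = random.Random(seed)
    pos = [[rnd.uniform(lo, hi) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [obj(p) for p in pos]
    g = min(range(particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / (iters - 1)   # linear decrease
        for i in range(particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rnd.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rnd.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = obj(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso(sphere)
print(best_val)   # far below a typical random start on [-5, 5]^2
```

The high early inertia weight favors exploration of the parameter space, while the low late value favors refinement around the best solution found.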
Real-time short-term forecast of water inflow into Bureyskaya reservoir
NASA Astrophysics Data System (ADS)
Motovilov, Yury
2017-04-01
During several recent years, a methodology for operational optimization in hydrosystems, including forecasts of the hydrological situation, has been developed using the Bureya reservoir as an example. Improving the accuracy of forecasts of water inflow into the reservoir for planning of the water and energy regime was one of the main goals of the research. The Bureya River is the second-largest left tributary of the Amur after the Zeya River, with a 70.7 thousand square kilometer watershed and a 723 km-long river course. A variety of natural conditions - from plains in the southern part to northern mountainous areas - determines significant spatio-temporal variability in runoff generation patterns and river regime. The Bureyskaya hydropower plant (HPP), with a watershed area of 65.2 thousand square kilometers, is a key station in the Russian Far Eastern energy system, providing its reliable operation. With a spacious reservoir, the Bureyskaya HPP makes a significant contribution to the protection of the Amur region from catastrophic floods. A physically-based distributed model of runoff generation based on the ECOMAG (ECOlogical Model for Applied Geophysics) hydrological modeling platform has been developed for the Bureya River basin. The model describes processes of interception of rainfall/snowfall by the canopy, snow accumulation and melt, soil freezing and thawing, water infiltration into unfrozen and frozen soil, evapotranspiration, the thermal and water regime of soil, and overland, subsurface, ground and river flow. The governing model equations are derived from integration of the basic hydro- and thermodynamic equations of vertical water and heat transfer in snowpack and frozen/unfrozen soil, horizontal water flow under and over catchment slopes, etc. The model setup for the Bureya River basin included watershed and river network schematization with a GIS module by DEM analysis, meteorological time-series preparation, and model calibration and validation against historical observations.
The results showed good model performance compared to observed inflow data into the Bureya reservoir and high diagnostic potential of the data-modeling system of runoff formation. Using this system, the following flowchart for short-range forecasting of inflow into the Bureyskaya reservoir, with a forecast correction technique using continuously updated hydrometeorological data, has been developed: 1 - daily renewal of the weather observations and forecasts database via the Internet; 2 - daily runoff calculation from the beginning of the current year to the current date; 3 - short-range (up to 7 days) forecast generation based on the weather forecast. The idea underlying the model assimilation of newly obtained hydrometeorological information to adjust short-range hydrological forecasts lies in the assumption of forecast error inertia. The difference between calculated and observed streamflow at the forecast release date is "scattered" with specific weights over the calculated streamflow for the forecast lead time. During 2016 this method of forecasting inflow into the Bureyskaya reservoir up to 7 days ahead was tested in online mode. A satisfactory short-range inflow forecast success rate was obtained. Tests of the developed method have shown strong sensitivity to the results of short-term precipitation forecasts.
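The "scattering" of the release-date error over the lead time can be sketched as a decaying additive correction. The geometric decay profile and all flow values below are assumptions for illustration; the operational weights are not given in the abstract:

```python
# Sketch of the forecast-correction step: the mismatch between modeled and
# observed inflow at the release date is spread over the 7-day lead time
# with decaying weights. The decay profile and numbers are invented.

def corrected_forecast(raw, model_at_release, obs_at_release, decay=0.7):
    error = obs_at_release - model_at_release
    return [f + error * decay ** k for k, f in enumerate(raw)]

raw = [500.0, 520.0, 540.0, 560.0, 580.0, 600.0, 620.0]   # m^3/s, 7 days
fixed = corrected_forecast(raw, model_at_release=480.0, obs_at_release=510.0)
print(fixed[0])   # day 1 absorbs most of the +30 m^3/s correction
```

This embodies the error-inertia assumption: a bias seen today is expected to persist, but with diminishing influence at longer leads.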
A parimutuel gambling perspective to compare probabilistic seismicity forecasts
NASA Astrophysics Data System (ADS)
Zechar, J. Douglas; Zhuang, Jiancang
2014-10-01
Using analogies to gaming, we consider the problem of comparing multiple probabilistic seismicity forecasts. To measure relative model performance, we suggest a parimutuel gambling perspective which addresses shortcomings of other methods such as likelihood ratio, information gain and Molchan diagrams. We describe two variants of the parimutuel approach for a set of forecasts: head-to-head, in which forecasts are compared in pairs, and round table, in which all forecasts are compared simultaneously. For illustration, we compare the 5-yr forecasts of the Regional Earthquake Likelihood Models experiment for M4.95+ seismicity in California.
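A toy version of the round-table variant conveys the parimutuel idea: each model stakes one unit on a bin, and the pot is split in proportion to the probability each model assigned to the observed outcome. The probabilities below are invented, and the paper's actual scoring rules are more elaborate:

```python
# Toy "round table" parimutuel comparison of probabilistic forecasts in a
# single space-time bin. Probabilities are invented for illustration.

def round_table(probabilities, outcome):
    """probabilities[j] = P(event) from model j; outcome is 0 or 1."""
    stakes = [p if outcome else 1.0 - p for p in probabilities]
    pot = len(probabilities)                         # one unit per model
    total = sum(stakes)
    return [pot * s / total - 1.0 for s in stakes]   # net gain per model

gains = round_table([0.8, 0.5, 0.2], outcome=1)
print(gains)   # the model that backed the observed outcome gains most
```

The zero-sum property is what makes the comparison relative: a model can only gain at the expense of the others, which sidesteps the reference-model choice required by likelihood-ratio tests.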
Model-free aftershock forecasts constructed from similar sequences in the past
NASA Astrophysics Data System (ADS)
van der Elst, N.; Page, M. T.
2017-12-01
The basic premise behind aftershock forecasting is that sequences in the future will be similar to those in the past. Forecast models typically use empirically tuned parametric distributions to approximate past sequences, and project those distributions into the future to make a forecast. While parametric models do a good job of describing average outcomes, they are not explicitly designed to capture the full range of variability between sequences, and can suffer from over-tuning of the parameters. In particular, parametric forecasts may produce a high rate of "surprises" - sequences that land outside the forecast range. Here we present a non-parametric forecast method that cuts out the parametric "middleman" between training data and forecast. The method is based on finding past sequences that are similar to the target sequence, and evaluating their outcomes. We quantify similarity as the Poisson probability that the observed event count in a past sequence reflects the same underlying intensity as the observed event count in the target sequence. Event counts are defined in terms of differential magnitude relative to the mainshock. The forecast is then constructed from the distribution of past sequence outcomes, weighted by their similarity. We compare the similarity forecast with the Reasenberg and Jones (RJ95) method, for a set of 2807 global aftershock sequences of M≥6 mainshocks. We implement a sequence-specific RJ95 forecast using a global average prior and Bayesian updating, but do not propagate epistemic uncertainty. The RJ95 forecast is somewhat more precise than the similarity forecast: 90% of observed sequences fall within a factor of two of the median RJ95 forecast value, whereas the fraction is 85% for the similarity forecast. However, the surprise rate is much higher for the RJ95 forecast; 10% of observed sequences fall in the upper 2.5% of the (Poissonian) forecast range. The surprise rate is less than 3% for the similarity forecast.
The similarity forecast may be useful to emergency managers and non-specialists when confidence or expertise in parametric forecasting is lacking. The method makes over-tuning impossible, and minimizes the rate of surprises. At the least, this forecast constitutes a useful benchmark for more precisely tuned parametric forecasts.
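The Poisson similarity weighting can be sketched in a few lines: each past sequence is weighted by the Poisson probability of its early event count under an intensity equal to the target sequence's count. The counts and outcomes below are invented, and a real forecast would retain the full weighted outcome distribution rather than a single summary:

```python
import math

# Sketch of similarity weighting for aftershock forecasts: a past sequence
# with k early events is weighted by the Poisson probability of k given an
# intensity equal to the target's count. All numbers are invented.

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def similarity_forecast(target_count, past_counts, past_outcomes):
    weights = [poisson_pmf(k, target_count) for k in past_counts]
    s = sum(weights)
    # Weighted mean outcome; the real method keeps the full weighted
    # distribution of outcomes, not just its mean.
    return sum(w * o for w, o in zip(weights, past_outcomes)) / s

# Early event counts of hypothetical past sequences, and how many events
# each eventually produced over the forecast horizon.
past_counts = [2, 5, 6, 14]
past_outcomes = [3.0, 8.0, 9.0, 25.0]
print(similarity_forecast(6, past_counts, past_outcomes))
```

Sequences whose early counts match the target dominate the weights, so the forecast is driven by genuinely analogous past behavior rather than a fitted parametric family.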
Forecasting the value-at-risk of Chinese stock market using the HARQ model and extreme value theory
NASA Astrophysics Data System (ADS)
Liu, Guangqiang; Wei, Yu; Chen, Yongfei; Yu, Jiang; Hu, Yang
2018-06-01
Using intraday data of the CSI300 index, this paper discusses value-at-risk (VaR) forecasting of the Chinese stock market from the perspective of high-frequency volatility models. First, we measure the realized volatility (RV) with 5-minute high-frequency returns of the CSI300 index and then model it with the newly introduced heterogeneous autoregressive quarticity (HARQ) model, which can handle the time-varying coefficients of the HAR model. Second, we forecast the out-of-sample VaR of the CSI300 index by combining the HARQ model and extreme value theory (EVT). Finally, using several popular backtesting methods, we compare the VaR forecasting accuracy of HARQ model with other traditional HAR-type models, such as HAR, HAR-J, CHAR, and SHAR. The empirical results show that the novel HARQ model can beat other HAR-type models in forecasting the VaR of the Chinese stock market at various risk levels.
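Once a generalized Pareto distribution (GPD) has been fitted to the losses exceeding a threshold, the EVT step of such a pipeline yields VaR in closed form. The parameter values below are hypothetical, not estimates for the CSI300:

```python
# Sketch of the EVT tail step: closed-form VaR from a generalized Pareto
# fit to exceedances over a threshold u. Parameter values are hypothetical.

def gpd_var(u, sigma, xi, n, n_exceed, p):
    """VaR_p = u + (sigma/xi) * (((n/n_exceed) * (1 - p)) ** (-xi) - 1)."""
    return u + (sigma / xi) * (((n / n_exceed) * (1.0 - p)) ** (-xi) - 1.0)

# 1000 returns, 100 exceedances over a 1% loss threshold, fitted sigma, xi.
print(gpd_var(u=1.0, sigma=0.5, xi=0.2, n=1000, n_exceed=100, p=0.99))
```

In the paper the scale of the innovation distribution comes from the HARQ volatility forecast, so the VaR adapts day by day; here that conditioning is omitted for brevity.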
On the effect of model parameters on forecast objects
NASA Astrophysics Data System (ADS)
Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott
2018-04-01
Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. The field for some quantities generally consists of spatially coherent and disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.
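Ingredient (1), Latin hypercube sampling, guarantees that each of n samples falls in a distinct stratum of every parameter's range. A minimal sketch with placeholder parameter bounds:

```python
import random

# Minimal Latin hypercube sampler: each of n samples lands in a distinct
# stratum of every parameter's range. The bounds are placeholders for
# whichever model parameters are being perturbed.

def latin_hypercube(n, bounds, seed=0):
    rnd = random.Random(seed)
    samples = [[0.0] * len(bounds) for _ in range(n)]
    for d, (lo, hi) in enumerate(bounds):
        strata = list(range(n))
        rnd.shuffle(strata)          # random pairing of strata across dims
        width = (hi - lo) / n
        for i in range(n):
            samples[i][d] = lo + (strata[i] + rnd.random()) * width
    return samples

pts = latin_hypercube(5, [(0.0, 1.0), (10.0, 20.0)])
print(pts)
```

Compared with plain random sampling, this stratification covers each parameter's range evenly with far fewer model runs, which matters when every sample is a full NWP integration.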
Congestion patterns of electric vehicles with limited battery capacity.
Jing, Wentao; Ramezani, Mohsen; An, Kun; Kim, Inhi
2018-01-01
The path choice behavior of battery electric vehicle (BEV) drivers is influenced by the lack of public charging stations, limited battery capacity, range anxiety and long battery charging times. This paper investigates the congestion/flow pattern captured by the stochastic user equilibrium (SUE) traffic assignment problem in transportation networks with BEVs, where BEV paths are restricted by battery capacity. BEV energy consumption is assumed to be a linear function of path length and path travel time, which addresses both the path distance limit problem and the road congestion effect. A mathematical programming model is proposed for the path-based SUE traffic assignment, where the path cost is the sum of the corresponding link costs and a path-specific out-of-energy penalty. We then apply the convergent Lagrangian dual method to transform the original problem into a concave maximization problem and develop a customized gradient projection algorithm to solve it. A column generation procedure is incorporated to generate the path set. Finally, two numerical examples are presented to demonstrate the applicability of the proposed model and the solution algorithm.
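The linear energy model and the out-of-energy penalty combine into a path cost like the following sketch; the coefficients, battery size and penalty magnitude are invented for illustration:

```python
# Toy version of the path cost in the assignment model: travel time plus an
# out-of-energy penalty when the linear energy consumption of a path
# exceeds the battery capacity. All coefficients are invented.

def path_energy(length_km, time_h, alpha=0.15, beta=1.0):
    """Linear energy model: kWh consumed as a function of length and time."""
    return alpha * length_km + beta * time_h

def path_cost(length_km, time_h, battery_kwh, penalty=1000.0):
    cost = time_h
    if path_energy(length_km, time_h) > battery_kwh:
        cost += penalty          # path infeasible for this BEV
    return cost

short_cost = path_cost(50.0, 0.8, battery_kwh=10.0)    # within range
long_cost = path_cost(120.0, 1.5, battery_kwh=10.0)    # drains the battery
print(short_cost, long_cost)
```

Because travel time enters the energy term, congestion itself can render a long path infeasible, which is how the model couples range limits to the equilibrium flow pattern.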
Congestion Prediction Modeling for Quality of Service Improvement in Wireless Sensor Networks
Lee, Ga-Won; Lee, Sung-Young; Huh, Eui-Nam
2014-01-01
Information technology (IT) is pushing ahead with drastic reforms of modern life for the improvement of human welfare. Objects constitute “Information Networks” through smart, self-regulated information gathering that also recognizes and controls current information states in Wireless Sensor Networks (WSNs). Information observed from sensor networks in real time is used to increase quality of life (QoL) in various industries and daily life. One of the key challenges of WSNs is how to achieve lossless data transmission. Although sensor nodes now have enhanced capacities, it is hard to assure lossless and reliable end-to-end data transmission in WSNs due to unstable wireless links and low hardware resources relative to high quality of service (QoS) requirements. We propose a node and path traffic prediction model to predict and minimize congestion. This solution includes prediction of packet generation due to network congestion from both periodic and event data generation. Simulation using NS-2 and Matlab is used to demonstrate the effectiveness of the proposed solution. PMID:24784035
Using Cellular Automata for Parking Recommendations in Smart Environments
Horng, Gwo-Jiun
2014-01-01
In this work, we propose an innovative adaptive recommendation mechanism for smart parking. The cognitive RF module will transmit the vehicle location information and the parking space requirements to the parking congestion computing center (PCCC) when the driver must find a parking space. Moreover, for the parking spaces, we use a cellular automata (CA) model mechanism that can adjust to full and not full parking lot situations. Here, the PCCC can compute the nearest parking lot, the parking lot status and the current or opposite driving direction with the vehicle location information. By considering the driving direction, we can determine when the vehicles must turn around and thus reduce road congestion and speed up finding a parking space. The recommendation will be sent to the drivers through a wireless communication cognitive radio (CR) model after the computation and analysis by the PCCC. The current study evaluates the performance of this approach by conducting computer simulations. The simulation results show the strengths of the proposed smart parking mechanism in terms of avoiding increased congestion and decreasing the time to find a parking space. PMID:25153671
Measurement of nasal patency in anesthetized and conscious dogs.
Koss, Michael C; Yu, Yongxin; Hey, John A; McLeod, Robbie L
2002-02-01
Experiments were undertaken to characterize a noninvasive, chronic model of nasal congestion in which nasal patency is measured using acoustic rhinometry. Compound 48/80 was administered intranasally to elicit nasal congestion in five beagle dogs, either by syringe (0.5 ml) in thiopental sodium-anesthetized animals or as a mist (0.25 ml) in the same animals in the conscious state. Effects of mast cell degranulation on nasal cavity volume as well as on minimal cross-sectional area (A(min)) and intranasal distance to A(min) (D(min)) were studied. Compound 48/80 caused a dose-related decrease in nasal cavity volume and A(min) together with a variable increase in D(min). Maximal responses were seen at 90-120 min. Compound 48/80 was less effective in producing nasal congestion in conscious animals, which also had significantly larger basal nasal cavity volumes. These results demonstrate the utility of acoustic rhinometry for measuring parameters of nasal patency in dogs and suggest that this model may prove useful in studies of the actions of decongestant drugs.
Medium-range fire weather forecasts
J.O. Roads; K. Ueyoshi; S.C. Chen; J. Alpert; F. Fujioka
1991-01-01
The forecast skill of the National Meteorological Center's medium-range forecast (MRF) numerical forecasts of fire weather variables is assessed for the period June 1, 1988 to May 31, 1990. Near-surface virtual temperature, relative humidity, wind speed, and a derived fire weather index (FWI) are forecast well by the MRF model. However, forecast relative humidity has...
A Canonical Ensemble Correlation Prediction Model for Seasonal Precipitation Anomaly
NASA Technical Reports Server (NTRS)
Shen, Samuel S. P.; Lau, William K. M.; Kim, Kyu-Myong; Li, Guilong
2001-01-01
This report describes an optimal ensemble forecasting model for seasonal precipitation and its error estimation. Each individual forecast is based on canonical correlation analysis (CCA) in spectral spaces whose bases are empirical orthogonal functions (EOFs). The optimal weights in the ensemble forecast depend crucially on the mean square error of each individual forecast. An estimate of the mean square error of a CCA prediction is also made using the spectral method. The error is decomposed onto EOFs of the predictand and decreases linearly with the correlation between the predictor and predictand. This new CCA model includes the following features: (1) the use of an area factor, (2) the estimation of prediction error, and (3) the optimal ensemble of multiple forecasts. The new CCA model is applied to seasonal forecasting of the United States precipitation field. The predictor is the sea surface temperature.
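The optimal-ensemble idea above, weighting each member inversely by its mean square error, can be sketched as follows. The member forecasts and MSE values are illustrative, not from the report:

```python
# Hedged sketch: combine individual CCA forecasts with weights
# inversely proportional to each member's mean square error.
# Member forecasts and MSEs below are illustrative stand-ins.

def optimal_weights(mse_list):
    """Weights proportional to 1/MSE, normalized to sum to one."""
    inv = [1.0 / m for m in mse_list]
    total = sum(inv)
    return [w / total for w in inv]

def ensemble_forecast(forecasts, mse_list):
    """MSE-weighted average of the individual forecasts."""
    weights = optimal_weights(mse_list)
    return sum(w * f for w, f in zip(weights, forecasts))

members = [1.8, 2.4, 2.0]   # individual precipitation-anomaly forecasts
mses = [0.5, 1.0, 0.25]     # estimated mean square error of each member
combined = ensemble_forecast(members, mses)
```

The member with the smallest error estimate dominates the combination, which is why the report emphasizes estimating each forecast's MSE before blending.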
Multilayer Stock Forecasting Model Using Fuzzy Time Series
Javedani Sadaei, Hossein; Lee, Muhammad Hisyam
2014-01-01
After reviewing the vast body of literature on using FTS in stock market forecasting, certain deficiencies are distinguished in the hybridization of findings. In addition, the lack of constructive systematic framework, which can be helpful to indicate direction of growth in entire FTS forecasting systems, is outstanding. In this study, we propose a multilayer model for stock market forecasting including five logical significant layers. Every single layer has its detailed concern to assist forecast development by reconciling certain problems exclusively. To verify the model, a set of huge data containing Taiwan Stock Index (TAIEX), National Association of Securities Dealers Automated Quotations (NASDAQ), Dow Jones Industrial Average (DJI), and S&P 500 have been chosen as experimental datasets. The results indicate that the proposed methodology has the potential to be accepted as a framework for model development in stock market forecasts using FTS. PMID:24605058
The Use of Ambient Humidity Conditions to Improve Influenza Forecast
NASA Astrophysics Data System (ADS)
Shaman, J. L.; Kandula, S.; Yang, W.; Karspeck, A. R.
2017-12-01
Laboratory and epidemiological evidence indicate that ambient humidity modulates the survival and transmission of influenza. Here we explore whether including humidity forcing in mathematical models of influenza transmission improves the accuracy of forecasts generated with those models. We generate retrospective forecasts for 95 cities over 10 seasons in the United States and assess both forecast accuracy and error. Overall, we find that humidity forcing improves forecast performance and that forecasts generated using daily climatological humidity forcing generally outperform forecasts that utilize daily observed humidity forcing. These findings hold for predictions of outbreak peak intensity, peak timing, and incidence over 2- and 4-week horizons. The results indicate that use of climatological humidity forcing is warranted for current operational influenza forecasting and provide further evidence that humidity modulates rates of influenza transmission.
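The humidity forcing referred to here typically enters such models by modulating the basic reproduction number with specific humidity. A minimal sketch of that exponential modulation, with illustrative coefficient values rather than the study's fitted parameters:

```python
# Hedged sketch: specific humidity q (kg/kg) modulates the basic
# reproduction number R0, as in humidity-forced SIRS influenza models.
# The coefficients r0_max, r0_min, and a are illustrative assumptions.
import math

def r0_from_humidity(q, r0_max=2.2, r0_min=1.2, a=-180.0):
    """R0 declines exponentially toward r0_min as specific humidity rises."""
    return math.exp(a * q + math.log(r0_max - r0_min)) + r0_min

# Dry winter air (low q) yields higher transmissibility than humid summer air,
# which is the mechanism behind wintertime influenza peaks.
winter_r0 = r0_from_humidity(0.003)
summer_r0 = r0_from_humidity(0.015)
```

Feeding a daily humidity series through such a function gives the time-varying transmission rate that drives the forced SIRS model; the study's finding is that a climatological q series works at least as well as the observed one.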
A Modified LS+AR Model to Improve the Accuracy of the Short-term Polar Motion Prediction
NASA Astrophysics Data System (ADS)
Wang, Z. W.; Wang, Q. X.; Ding, Y. Q.; Zhang, J. J.; Liu, S. S.
2017-03-01
There are two problems with the LS (Least Squares) + AR (AutoRegressive) model in polar motion forecasting: the residuals of the LS fit are reasonable within the fitting interval, but the residuals of the LS extrapolation are poor; and the LS fitting residual sequence is non-linear, so it is unsuitable to establish an AR model for the residual sequence to be forecast based on the residual sequence before the forecast epoch. In this paper, we address these two problems in two steps. First, restrictions are added at the two endpoints of the LS fitting data to fix them on the LS fitting curve, so that the fitted values near the two endpoints are very close to the observed values. Second, we select the interpolation residual sequence of an inward LS fitting curve, which has a variation trend similar to that of the LS extrapolation residual sequence, as the AR modeling object for the residual forecast. Calculation examples show that this solution effectively improves the short-term polar motion prediction accuracy of the LS+AR model. In addition, comparisons with the RLS (Robustified Least Squares)+AR, RLS+ARIMA (AutoRegressive Integrated Moving Average), and LS+ANN (Artificial Neural Network) forecast models confirm the feasibility and effectiveness of the solution for polar motion forecasting. The results, especially for polar motion forecasts at 1-10 days, show that the forecast accuracy of the proposed model is competitive with the best published results.
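The LS+AR split can be illustrated with a deliberately simplified sketch: a linear least-squares trend stands in for the harmonic LS model used in polar motion work, and an AR(1) stands in for the higher-order AR model; neither the endpoint constraints nor the inward-curve selection described above is implemented here.

```python
# Hedged sketch of the LS+AR idea: fit an LS trend, model the fitting
# residuals with AR(1), and add the residual forecast back onto the LS
# extrapolation. A linear trend and AR(1) are simplifying assumptions.

def ls_fit(y):
    """Ordinary least-squares line y ~ a + b*t for t = 0..n-1."""
    n = len(y)
    t_mean = (n - 1) / 2.0
    y_mean = sum(y) / n
    b = (sum((t - t_mean) * (v - y_mean) for t, v in enumerate(y))
         / sum((t - t_mean) ** 2 for t in range(n)))
    return y_mean - b * t_mean, b

def ls_ar_forecast(y, steps=1):
    """LS extrapolation plus an AR(1) forecast of the fitting residuals."""
    a, b = ls_fit(y)
    resid = [v - (a + b * t) for t, v in enumerate(y)]
    num = sum(r0 * r1 for r0, r1 in zip(resid, resid[1:]))
    den = sum(r0 * r0 for r0 in resid[:-1])
    phi = num / den if den else 0.0   # lag-1 AR coefficient
    r, out = resid[-1], []
    for k in range(1, steps + 1):
        r = phi * r
        out.append(a + b * (len(y) - 1 + k) + r)
    return out

history = [1.0 + 2.0 * t for t in range(10)]   # noiseless illustrative series
pred = ls_ar_forecast(history, steps=2)
```

On a noiseless trend the residual term vanishes and the forecast reduces to pure LS extrapolation; on real polar motion data the AR term carries the short-term correction that the paper's modifications aim to improve.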
Trends in the predictive performance of raw ensemble weather forecasts
NASA Astrophysics Data System (ADS)
Hemri, Stephan; Scheuerer, Michael; Pappenberger, Florian; Bogner, Konrad; Haiden, Thomas
2015-04-01
Over the last two decades the paradigm in weather forecasting has shifted from being deterministic to probabilistic. Accordingly, numerical weather prediction (NWP) models have been run increasingly as ensemble forecasting systems. The goal of such ensemble forecasts is to approximate the forecast probability distribution by a finite sample of scenarios. Global ensemble forecast systems, like the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble, are prone to probabilistic biases, and are therefore not reliable. They particularly tend to be underdispersive for surface weather parameters. Hence, statistical post-processing is required in order to obtain reliable and sharp forecasts. In this study we apply statistical post-processing to ensemble forecasts of near-surface temperature, 24-hour precipitation totals, and near-surface wind speed from the global ECMWF model. Our main objective is to evaluate the evolution of the difference in skill between the raw ensemble and the post-processed forecasts. The ECMWF ensemble is under continuous development, and hence its forecast skill improves over time. Parts of these improvements may be due to a reduction of probabilistic bias. Thus, we first hypothesize that the gain by post-processing decreases over time. Based on ECMWF forecasts from January 2002 to March 2014 and corresponding observations from globally distributed stations we generate post-processed forecasts by ensemble model output statistics (EMOS) for each station and variable. Parameter estimates are obtained by minimizing the Continuous Ranked Probability Score (CRPS) over rolling training periods that consist of the n days preceding the initialization dates. Given the higher average skill in terms of CRPS of the post-processed forecasts for all three variables, we analyze the evolution of the difference in skill between raw ensemble and EMOS forecasts. 
The fact that the gap in skill remains almost constant over time, especially for near-surface wind speed, suggests that improvements to the atmospheric model have an effect quite different from what calibration by statistical post-processing is doing. That is, they are increasing potential skill. Thus this study indicates that (a) further model development is important even if one is just interested in point forecasts, and (b) statistical post-processing is important because it will keep adding skill in the foreseeable future.
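The EMOS post-processing described in this study fits a Gaussian predictive distribution whose parameters are chosen by minimizing the average CRPS over a training window. A minimal sketch of the scoring side: the closed-form CRPS of a normal forecast, applied to one EMOS-style predictive distribution. The parameter values `a, b, c, d` are illustrative, not fitted by CRPS minimization.

```python
# Hedged sketch of EMOS verification: predictive mean a + b*(ensemble
# mean), predictive variance c + d*(ensemble variance), scored with the
# closed-form CRPS of a normal distribution. Parameters are illustrative.
import math

def crps_normal(mu, sigma, obs):
    """Closed-form CRPS for an N(mu, sigma^2) forecast and observation obs."""
    z = (obs - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return sigma * (z * (2.0 * cdf - 1.0) + 2.0 * pdf - 1.0 / math.sqrt(math.pi))

ens = [14.2, 15.1, 13.8, 14.9, 15.4]     # raw ensemble temperatures (deg C)
mean = sum(ens) / len(ens)
var = sum((x - mean) ** 2 for x in ens) / len(ens)
a, b, c, d = 0.3, 1.0, 0.4, 1.5          # illustrative EMOS parameters
mu, sigma = a + b * mean, math.sqrt(c + d * var)
score = crps_normal(mu, sigma, obs=15.0)
```

In the full method, `a, b, c, d` are re-estimated each day by minimizing the mean of `crps_normal` over the rolling n-day training period, which is exactly the step that corrects the underdispersion of the raw ensemble.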
Forecasting Hourly Water Demands With Seasonal Autoregressive Models for Real-Time Application
NASA Astrophysics Data System (ADS)
Chen, Jinduan; Boccelli, Dominic L.
2018-02-01
Consumer water demands are not typically measured at temporal or spatial scales adequate to support real-time decision making, and recent approaches for estimating unobserved demands from observed hydraulic measurements are generally not capable of forecasting demands or providing uncertainty information. While time series modeling has shown promise for representing total system demands, these models have generally not been evaluated at spatial scales appropriate for representative real-time modeling. This study investigates the use of a double-seasonal time series model to capture daily and weekly autocorrelations in both total system demands and regionally aggregated demands at a scale that captures demand variability across a distribution system. Emphasis was placed on the ability to forecast demands and quantify uncertainties, with results compared to traditional time series pattern-based demand models as well as nonseasonal and single-seasonal time series models. Additional research included the implementation of an adaptive parameter-estimation scheme to update the time series model when unobserved changes occurred in the system. For two case studies, results showed that (1) for the smaller-scale aggregated water demands, the log-transformed time series model yielded improved forecasts, (2) the double-seasonal model outperformed the other models in terms of forecasting errors, and (3) the adaptive adjustment of parameters during forecasting improved the accuracy of the generated prediction intervals. These results illustrate the capability of time series modeling to forecast both water demands and uncertainty estimates at spatial scales commensurate with real-time modeling applications and provide a foundation for developing a real-time integrated demand-hydraulic model.
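The core of the double-seasonal structure, forecasting from the daily (24 h) and weekly (168 h) lags, can be sketched with fixed weights. The weights below are illustrative placeholders, not estimated seasonal ARIMA coefficients:

```python
# Hedged sketch: one-step hourly demand forecast from the daily (24 h)
# and weekly (168 h) seasonal lags. The fixed weights are illustrative
# stand-ins for estimated double-seasonal model coefficients.
import math

def double_seasonal_forecast(series, w_day=0.6, w_week=0.4):
    """Predict the next hourly value from the same hour yesterday and last week."""
    if len(series) < 168:
        raise ValueError("need at least one week of hourly data")
    return w_day * series[-24] + w_week * series[-168]

# A strictly 24-hour-periodic demand signal is forecast exactly, because
# both seasonal lags land on the same phase of the cycle.
hourly = [100 + 10 * math.sin(2 * math.pi * h / 24) for h in range(24 * 14)]
next_hour = double_seasonal_forecast(hourly)
```

The full model in the paper adds autoregressive and moving-average terms around these seasonal lags and re-estimates the parameters adaptively when the system changes.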
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voisin, Nathalie; Pappenberger, Florian; Lettenmaier, D. P.
2011-08-15
A 10-day globally applicable flood prediction scheme was evaluated using the Ohio River basin as a test site for the period 2003-2007. The Variable Infiltration Capacity (VIC) hydrology model was initialized with European Centre for Medium-Range Weather Forecasts (ECMWF) analysis temperatures and wind, and Tropical Rainfall Monitoring Mission Multi-Satellite Precipitation Analysis (TMPA) precipitation, up to the day of forecast. In forecast mode, the VIC model was then forced with a calibrated and statistically downscaled ECMWF ensemble prediction system (EPS) 10-day ensemble forecast. A parallel setup was used in which ECMWF EPS forecasts were interpolated to the spatial scale of the hydrology model. Each set of forecasts was extended by 5 days using monthly mean climatological variables and zero precipitation in order to account for the effect of initial conditions. The 15-day spatially distributed ensemble runoff forecasts were then routed to four locations in the basin, each with a different drainage area. Surrogates for observed daily runoff and flow were provided by the reference run, specifically the VIC simulation forced with ECMWF analysis fields and TMPA precipitation fields. The flood prediction scheme using the calibrated and downscaled ECMWF EPS forecasts was shown to be more accurate and reliable than the interpolated forecasts for both daily distributed runoff forecasts and daily flow forecasts. Initial and antecedent conditions dominated the flow forecasts for lead times shorter than the time of concentration, depending on the flow forecast amounts and the drainage area sizes. The flood prediction scheme had useful skill for the following 10 days at all sites.
van Baal, Pieter H; Wong, Albert
2012-12-01
Although the effect of time to death (TTD) on health care expenditures (HCE) has been investigated using individual-level data, the most profound implications of TTD have been for the forecasting of macro-level HCE. Here we estimate the TTD model using macro-level data from the Netherlands consisting of mortality rates and age- and gender-specific per capita health expenditures for the years 1981-2007. Forecasts for the years 2008-2020 from this macro-level TTD model were compared to forecasts that excluded TTD. Results revealed that the effect of TTD on HCE in our macro model was similar to that found in micro-econometric studies. As the inclusion of TTD pushed growth rate estimates from unidentified causes upwards, however, the two models' forecasts of HCE for 2008-2020 were similar. We argue that including TTD, if modeled correctly, does not lower forecasts of HCE.
NASA Astrophysics Data System (ADS)
Kumar, Prashant; Gopalan, Kaushik; Shukla, Bipasha Paul; Shyam, Abhineet
2017-11-01
Specifying physically consistent and accurate initial conditions is one of the major challenges of numerical weather prediction (NWP) models. In this study, ground-based global positioning system (GPS) integrated water vapor (IWV) measurements available from the International Global Navigation Satellite Systems (GNSS) Service (IGS) station in Bangalore, India, are used to assess the impact of GPS data on NWP model forecasts over southern India. Two experiments are performed, with and without assimilation of GPS-retrieved IWV observations, during the Indian winter monsoon period (November-December 2012) using a four-dimensional variational (4D-Var) data assimilation method. Assimilation of GPS data improved the model IWV analysis as well as the subsequent forecasts, with a positive impact of ~10% over Bangalore and nearby regions. The 24-h surface temperature forecasts from the Weather Research and Forecasting (WRF) model also improved when compared with observations. Small but significant improvements were found in the rainfall forecasts compared to the control experiments.
Improved Neural Networks with Random Weights for Short-Term Load Forecasting
Lang, Kun; Zhang, Mingyuan; Yuan, Yongbo
2015-01-01
An effective forecasting model for short-term load plays a significant role in promoting the management efficiency of an electric power system. This paper proposes a new forecasting model based on improved neural networks with random weights (INNRW). The key is to introduce a weighting technique for the inputs of the model and use a novel neural network to forecast the daily maximum load. Eight factors are selected as inputs. A mutual information weighting algorithm is then used to allocate different weights to the inputs. The neural network with random weights and kernels (KNNRW) is applied to approximate the nonlinear function between the selected inputs and the daily maximum load, owing to its fast learning speed and good generalization performance. In an application to daily load in Dalian, the results of the proposed INNRW are compared with several previously developed forecasting models. The simulation experiment shows that the proposed model performs best overall in short-term load forecasting. PMID:26629825
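The mutual-information weighting step can be sketched for discretized data. The toy sequences below are illustrative; the paper's eight load-related factors and its exact estimator are not reproduced here:

```python
# Hedged sketch of input weighting: compute discrete mutual information
# between each candidate input and the target, then normalize the MI
# values into input weights. Toy discretized data, for illustration only.
import math
from collections import Counter

def mutual_information(xs, ys):
    """MI (in nats) between two discrete sequences of equal length."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * math.log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

target = [0, 0, 1, 1, 0, 1, 1, 0]
informative = [0, 0, 1, 1, 0, 1, 1, 0]   # tracks the target: high MI
noise = [0, 1, 0, 1, 0, 1, 0, 1]         # independent pattern: zero MI

mis = [mutual_information(f, target) for f in (informative, noise)]
weights = [m / sum(mis) for m in mis]    # normalized input weights
```

Inputs that share no information with the daily maximum load receive near-zero weight, so they contribute little to the downstream KNNRW approximation.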
A Space Weather Forecasting System with Multiple Satellites Based on a Self-Recognizing Network
Tokumitsu, Masahiro; Ishida, Yoshiteru
2014-01-01
This paper proposes a space weather forecasting system at geostationary orbit for high-energy electron flux (>2 MeV). The forecasting model involves multiple sensors on multiple satellites. The sensors interconnect and evaluate each other to predict future conditions at geostationary orbit. The proposed forecasting model is constructed using a dynamic relational network for sensor diagnosis and event monitoring. The sensors of the proposed model are located at different positions in space. The satellites for solar monitoring are equipped with devices for monitoring the interplanetary magnetic field and solar wind speed, while the satellites orbiting near the Earth monitor high-energy electron flux. We investigate forecasting for two typical examples by comparing the performance of two models with different numbers of sensors, and demonstrate prediction by the proposed model against coronal mass ejections and a coronal hole. This paper aims to investigate the possibility of space weather forecasting based on a satellite network with in-situ sensing. PMID:24803190
Impact of archeomagnetic field model data on modern era geomagnetic forecasts
NASA Astrophysics Data System (ADS)
Tangborn, Andrew; Kuang, Weijia
2018-03-01
A series of geomagnetic data assimilation experiments has been carried out to demonstrate the impact of assimilating archeomagnetic data via the CALS3k.4 geomagnetic field model for the period between 10 and 1590 CE. The assimilation continues with the gufm1 model from 1590 to 1990 and the CM4 model from 1990 to 2000 as observations, and comparisons between these models and the geomagnetic forecasts are used to determine an optimal maximum degree for the archeomagnetic observations and to independently estimate errors for these observations. These are compared with an assimilation experiment that uses the uncertainties provided with CALS3k.4. Optimal 20-year forecasts in 1990 are found when the Gauss coefficients up to degree 3 are assimilated. In addition, we demonstrate how a forecast and observation bias correction scheme could be used to reduce bias in modern-era forecasts. Initial experiments show that this approach can reduce modern-era forecast biases by as much as 50%.
[Improved Euler algorithm for a trend forecast model and its application to oil spectrum analysis].
Zheng, Chang-song; Ma, Biao
2009-04-01
Oil atomic spectrometric analysis is one of the most important methods for fault diagnosis and state monitoring of large machinery, and the grey method is well suited to trend forecasting. Using oil atomic spectrometric analysis results together with grey forecast theory, this paper establishes a grey forecast model of the Fe/Cu concentration trend in a power-shift steering transmission. To address a shortcoming of the grey method in trend forecasting, the improved Euler algorithm is introduced for the first time; it resolves a problem of the grey model and avoids the imprecision caused by the old grey model's dependence of forecast values on the first test value. The new method yields more precise forecast values, as shown in the example. Combined with the threshold values of oil atomic spectrometric analysis, the new method was applied to Fe/Cu concentration forecasting and early warning of faults was obtained, so that preventive steps can be taken; the algorithm can also be generalized to state monitoring in industry.
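The grey model underlying this kind of trend forecast is typically GM(1,1), whose whitening equation dx1/dt + a*x1 = b can be integrated with an improved (Heun-type) Euler step. This sketch shows the standard GM(1,1) least-squares parameter estimate and one Heun step; the wear-metal concentration values are illustrative, and this is not the paper's exact algorithm:

```python
# Hedged sketch: grey GM(1,1) parameter estimation plus an improved
# (Heun) Euler integration of the whitening equation dx/dt = b - a*x.
# The concentration readings are illustrative stand-ins for Fe data.

def gm11_fit(x0):
    """Estimate (a, b) of the GM(1,1) whitening equation by least squares."""
    x1, s = [], 0.0
    for v in x0:                      # accumulated generating operation
        s += v
        x1.append(s)
    z = [0.5 * (x1[i] + x1[i + 1]) for i in range(len(x1) - 1)]
    y = x0[1:]                        # regression: x0(k) = -a*z(k) + b
    n = len(z)
    sz, sy = sum(z), sum(y)
    szz = sum(v * v for v in z)
    szy = sum(v * w for v, w in zip(z, y))
    a = -(n * szy - sz * sy) / (n * szz - sz * sz)
    b = (sy + a * sz) / n
    return a, b

def heun_step(x, a, b, h=1.0):
    """Improved Euler (Heun) step for dx/dt = b - a*x."""
    k1 = b - a * x
    k2 = b - a * (x + h * k1)
    return x + 0.5 * h * (k1 + k2)

x0 = [10.0, 11.0, 12.1, 13.3]         # e.g. Fe concentration readings
a, b = gm11_fit(x0)
x1_last = sum(x0)
next_x0 = heun_step(x1_last, a, b) - x1_last   # next-step concentration forecast
```

Comparing `next_x0` against a concentration threshold is the early-warning step the abstract describes.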
The meta-Gaussian Bayesian Processor of forecasts and associated preliminary experiments
NASA Astrophysics Data System (ADS)
Chen, Fajing; Jiao, Meiyan; Chen, Jing
2013-04-01
Public weather services are trending toward providing users with probabilistic weather forecasts, in place of traditional deterministic forecasts. Probabilistic forecasting techniques are continually being improved to optimize available forecasting information. The Bayesian Processor of Forecast (BPF), a new statistical method for probabilistic forecast, can transform a deterministic forecast into a probabilistic forecast according to the historical statistical relationship between observations and forecasts generated by that forecasting system. This technique accounts for the typical forecasting performance of a deterministic forecasting system in quantifying the forecast uncertainty. The meta-Gaussian likelihood model is suitable for a variety of stochastic dependence structures with monotone likelihood ratios. The meta-Gaussian BPF adopting this kind of likelihood model can therefore be applied across many fields, including meteorology and hydrology. The Bayes theorem with two continuous random variables and the normal-linear BPF are briefly introduced. The meta-Gaussian BPF for a continuous predictand using a single predictor is then presented and discussed. The performance of the meta-Gaussian BPF is tested in a preliminary experiment. Control forecasts of daily surface temperature at 0000 UTC at Changsha and Wuhan stations are used as the deterministic forecast data. These control forecasts are taken from ensemble predictions with a 96-h lead time generated by the National Meteorological Center of the China Meteorological Administration, the European Centre for Medium-Range Weather Forecasts, and the US National Centers for Environmental Prediction during January 2008. The results of the experiment show that the meta-Gaussian BPF can transform a deterministic control forecast of surface temperature from any one of the three ensemble predictions into a useful probabilistic forecast of surface temperature. 
These probabilistic forecasts quantify the uncertainty of the control forecast; accordingly, the performance of the probabilistic forecasts differs based on the source of the underlying deterministic control forecasts.
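The normal-linear BPF mentioned above has a closed conjugate form: with a Gaussian prior on the predictand and a linear-Gaussian likelihood for the deterministic forecast, the posterior is again Gaussian. A minimal sketch with illustrative numbers (the station data and fitted likelihood parameters from the experiment are not used):

```python
# Hedged sketch of the normal-linear BPF: prior W ~ N(m, s^2), likelihood
# X | W ~ N(c + d*W, sigma^2); the posterior given forecast x is Gaussian
# with the standard conjugate form. All numeric inputs are illustrative.
import math

def normal_linear_bpf(m, s, c, d, sigma, x):
    """Posterior N(mu, tau^2) of the predictand given deterministic forecast x."""
    prec = 1.0 / s ** 2 + d ** 2 / sigma ** 2
    mu = (m / s ** 2 + d * (x - c) / sigma ** 2) / prec
    tau = math.sqrt(1.0 / prec)
    return mu, tau

# A skilful forecast (small sigma) pulls the posterior toward the forecast
# value, and the posterior spread is always tighter than the prior's.
mu, tau = normal_linear_bpf(m=0.0, s=3.0, c=0.0, d=1.0, sigma=1.5, x=4.0)
```

The meta-Gaussian BPF generalizes this by transforming predictand and forecast to normal scores first, so the same conjugate machinery applies to non-Gaussian variables.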
Auctionable fixed transmission rights for congestion management
NASA Astrophysics Data System (ADS)
Alomoush, Muwaffaq Irsheid
Electric power deregulation has brought a major change to the regulated utility monopoly, reshaping its three components: generation, distribution, and transmission. In this open-access deregulated power market, the transmission network plays a major role, and transmission congestion is a major problem that requires further consideration, especially when an inter-zonal/intra-zonal scheme is implemented. Defining zonal boundaries by engineering studies and experience, or defining a zone as a densely interconnected area (a "lake") with the paths connecting such areas as inter-zonal lines, yields insufficient and fuzzy definitions. Moreover, a congestion problem formulation should take into consideration the interactions between intra-zonal and inter-zonal flows and their effects on power systems. In this thesis, we introduce a procedure for minimizing the number of adjustments to preferred schedules needed to alleviate congestion, and apply control schemes to minimize interactions between zones. In addition, we give the zone definition a concrete criterion based on the Locational Marginal Price (LMP). This concept is used to define congestion zonal boundaries and to decide whether any zone should be merged with another zone or split into new zones. The thesis presents a unified scheme that combines zonal and FTR schemes to manage congestion. This combined scheme is utilized with LMPs to define zonal boundaries more appropriately. The presented scheme gains the best features of the FTR scheme: providing financial certainty, maximizing the efficient use of the system, and making users pay for the actual use of congested paths.
LMPs may give an indication of the impact of wheeling transactions, and calculation and comparison of LMPs with and without wheeling transactions should be adequate criteria for the ISO to approve a transaction, decide to expand the existing system, or retain the system's original structure. The thesis also investigates the impact of wheeling transactions on congestion management and presents a generalized mathematical model for the Fixed Transmission Right (FTR) auction. The auction guarantees FTR availability to all participants on a non-discriminatory basis, in which system users are permitted to buy, sell, and trade FTRs. When FTRs are utilized with LMPs, they increase the efficient use of the transmission system and give a transmission customer advantageous features: a mechanism to offset the extra cost due to congestion charges, financial and operational certainty, and payment for the actual use of congested paths. The thesis also highlights FTR trading in secondary markets to self-arrange access across different paths, create long-term transmission rights, and provide more commercial certainty.
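The hedge that FTRs provide against congestion charges follows directly from LMP differences, and can be sketched in a few lines. The prices and quantities are illustrative:

```python
# Hedged sketch of FTR settlement: a transaction wheeling P MW from
# source to sink pays a congestion charge based on the LMP difference,
# and an FTR on the same path pays its holder that same difference,
# offsetting the charge. Prices and quantities are illustrative.

def congestion_charge(p_mw, lmp_source, lmp_sink):
    """Hourly charge ($/h) for wheeling p_mw from source to sink."""
    return p_mw * (lmp_sink - lmp_source)

def ftr_payoff(ftr_mw, lmp_source, lmp_sink):
    """Hourly payoff ($/h) to the holder of an FTR of ftr_mw on the path."""
    return ftr_mw * (lmp_sink - lmp_source)

charge = congestion_charge(100.0, lmp_source=20.0, lmp_sink=35.0)
hedge = ftr_payoff(100.0, lmp_source=20.0, lmp_sink=35.0)
# An FTR matching the scheduled quantity fully offsets the congestion charge.
```

This offsetting property is the "financial certainty" feature the thesis attributes to the combined zonal/FTR scheme.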
NASA Astrophysics Data System (ADS)
Ali, Mumtaz; Deo, Ravinesh C.; Downs, Nathan J.; Maraseni, Tek
2018-07-01
Forecasting drought by means of the World Meteorological Organization-approved Standardized Precipitation Index (SPI) is a fundamental task in supporting socio-economic initiatives and effectively mitigating climate risk. This study aims to develop a robust drought modelling strategy to forecast multi-scalar SPI in drought-prone regions of Pakistan, where statistically significant lagged combinations of antecedent SPI are used to forecast future SPI. An ensemble Adaptive Neuro-Fuzzy Inference System ('ensemble-ANFIS') is executed via a 10-fold cross-validation procedure on randomly partitioned input-target data. The resulting 10-member ensemble-ANFIS outputs, judged by mean square error and correlation coefficient in the training period, are averaged to attain the optimal forecasts, and the model is benchmarked against the M5 Model Tree and Minimax Probability Machine Regression (MPMR). The results show that the proposed ensemble-ANFIS model's accuracy was notably better (in terms of root mean square and mean absolute error, and the Willmott, Nash-Sutcliffe, and Legates-McCabe indices) for the 6- and 12-month forecasts than for the 3-month forecasts, as verified by the largest proportion of errors registering in the smallest error band. Applying the 10-member simulations, the ensemble-ANFIS model was validated for its ability to forecast the severity (S), duration (D), and intensity (I) of drought, including the error bound. This enabled uncertainty between multi-model runs to be rationalized more efficiently, reducing the forecast error caused by stochasticity in drought behaviour. Through cross-validation at diverse sites, a geographic signature in the modelled uncertainties was also calculated.
Considering the superiority of ensemble-ANFIS approach and its ability to generate uncertainty-based information, the study advocates the versatility of a multi-model approach for drought-risk forecasting and its prime importance for estimating drought properties over confidence intervals to generate better information for strategic decision-making.
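The ensemble step, averaging the 10 fold-wise member forecasts and reporting an uncertainty band from their spread, can be sketched as follows. The member SPI values are illustrative stand-ins for ANFIS outputs, and the Gaussian-style band is an assumption:

```python
# Hedged sketch: average the cross-validation member forecasts and
# report a +/- k*stdev uncertainty band from the member spread.
# Member values are illustrative stand-ins for ensemble-ANFIS outputs.
import statistics

def ensemble_mean_and_band(members, k=1.96):
    """Ensemble-mean forecast with a symmetric uncertainty band."""
    mu = statistics.fmean(members)
    sd = statistics.stdev(members)
    return mu, (mu - k * sd, mu + k * sd)

member_spi = [-1.1, -0.9, -1.0, -1.2, -0.8, -1.0, -1.1, -0.9, -1.0, -1.0]
mean_spi, band = ensemble_mean_and_band(member_spi)
```

Reporting drought properties over such bands, rather than as point values, is what the study means by generating uncertainty-based information for strategic decision-making.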
Initial conditions and ENSO prediction using a coupled ocean-atmosphere model
NASA Astrophysics Data System (ADS)
Larow, T. E.; Krishnamurti, T. N.
1998-01-01
A coupled ocean-atmosphere initialization scheme using Newtonian relaxation has been developed for the Florida State University coupled ocean-atmosphere global general circulation model. The initialization scheme is used to initialize the coupled model for seasonal forecasts of the boreal summers of 1987 and 1988. The atmosphere model is a modified version of the Florida State University global spectral model at resolution T-42. The ocean general circulation model is a slightly modified version of the Hamburg climate group's model described in Latif (1987) and Latif et al. (1993). The coupling is synchronous, with information exchanged every two model hours. Using ECMWF daily atmospheric analyses and observed monthly mean SSTs, two one-year time-dependent Newtonian relaxations were performed with the coupled model prior to conducting the seasonal forecasts: from 1 June 1986 to 1 June 1987 and from 1 June 1987 to 1 June 1988. Newtonian relaxation was applied to the prognostic atmospheric vorticity, divergence, temperature, and dew point depression equations; in the ocean model it was applied to the surface temperature. Two 10-member ensemble integrations were conducted to examine the impact of the coupled initialization on the seasonal forecasts. The initial conditions used for the ensembles were the ocean's final state after the initialization and ECMWF analyses for the atmosphere. Examination of the SST root mean square error and anomaly correlations between observed and forecast SSTs in the Niño-3 and Niño-4 regions for the two seasonal forecasts shows closer agreement for the initialized forecasts than for two 10-member non-initialized ensemble forecasts.
The main conclusion here is that a single forecast with the coupled initialization outperforms, in SST anomaly prediction, against each of the control forecasts (members of the ensemble) which do not include such an initialization, indicating possible importance for the inclusion of the atmosphere during the coupled initialization.
A 30-day forecast experiment with the GISS model and updated sea surface temperatures
NASA Technical Reports Server (NTRS)
Spar, J.; Atlas, R.; Kuo, E.
1975-01-01
The GISS model was used to compute two parallel global 30-day forecasts for January 1974. In one forecast, climatological January sea surface temperatures were used; in the other, observed sea surface temperatures were inserted and updated daily. A comparison of the two forecasts indicated no clear-cut beneficial effect of daily updating of sea surface temperatures. Despite the rapid decay of daily predictability, the model produced a 30-day mean forecast for January 1974 that was generally superior to persistence and climatology when evaluated over either the globe or the Northern Hemisphere, but not over smaller regions.
Use of the Box and Jenkins time series technique in traffic forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nihan, N.L.; Holmesland, K.O.
The use of recently developed time series techniques for short-term traffic volume forecasting is examined. A data set containing monthly volumes on a freeway segment for 1968-76 is used to fit a time series model, and the resultant model is used to forecast volumes for 1977. The forecast volumes are then compared with actual 1977 volumes. Time series techniques can be used to develop highly accurate and inexpensive short-term forecasts. The feasibility of using these models to evaluate the effects of policy changes or other outside impacts is considered. (1 diagram, 1 map, 14 references, 2 tables)
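The core of the Box-Jenkins approach is fitting an autoregressive-type model to a historical series and forecasting forward from it. The sketch below is a deliberately minimal AR(1) illustration on synthetic "monthly volumes"; the full Box-Jenkins procedure (model identification via ACF/PACF, differencing, diagnostic checking) and the actual freeway data are beyond its scope.

```python
import random

# Minimal AR(1) illustration in the Box-Jenkins spirit: estimate the
# coefficient phi from the lag-1 autocovariance of a historical series,
# then issue multi-step forecasts that decay toward the series mean.
# The synthetic series below stands in for the 1968-76 monthly volumes.

random.seed(0)
phi_true, mean = 0.7, 1000.0
series, x = [], 0.0
for _ in range(108):  # 9 years of synthetic monthly data
    x = phi_true * x + random.gauss(0.0, 10.0)
    series.append(mean + x)

# Estimate the mean and the AR(1) coefficient (lag-1 autocorrelation).
m = sum(series) / len(series)
dev = [v - m for v in series]
phi_hat = sum(a * b for a, b in zip(dev[:-1], dev[1:])) / sum(d * d for d in dev)

# Forecasts for the next "year": each step shrinks toward the mean by phi_hat.
last = series[-1] - m
forecasts = [m + (phi_hat ** h) * last for h in range(1, 13)]
```

This mirrors the study's workflow on a toy scale: fit on the historical window, forecast the held-out year, and (in a real application) compare the forecasts against the observed volumes.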
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Kabuchanga, Eric; Zavodsky, Bradley T.; Limaye, Ashutosh S.
2014-01-01
SPoRT/SERVIR/RCMRD/KMS Collaboration: Builds on the strengths of each organization. SPoRT: transition of satellite, modeling, and verification capabilities; SERVIR-Africa/RCMRD: international capacity-building expertise; KMS: operational organization with regional weather forecasting expertise in East Africa. Hypothesis: Improved land-surface initialization over Eastern Africa can lead to better temperature, moisture, and ultimately precipitation forecasts in NWP models. KMS currently initializes the Weather Research and Forecasting (WRF) model with NCEP/Global Forecast System (GFS) model 0.5-deg initial/boundary condition data. The NASA Land Information System (LIS) will provide much higher-resolution land-surface data at a scale more representative of the regional WRF configuration. Future implementation of real-time NESDIS/VIIRS vegetation fraction is expected to further improve land surface representativeness.
Verification of short lead time forecast models: applied to Kp and Dst forecasting
NASA Astrophysics Data System (ADS)
Wintoft, Peter; Wik, Magnus
2016-04-01
In the ongoing EU/H2020 project PROGRESS, models that predict Kp, Dst, and AE from L1 solar wind data will be used as inputs to radiation belt models. The possible lead times from L1 measurements are shorter (tens of minutes to hours) than the typical duration of the physical phenomena to be forecast. Under these circumstances, several common metrics fail to distinguish a forecast model from trivial baselines such as persistence. In this work we explore metrics and approaches for short lead time forecasts and apply them to current Kp and Dst forecast models. This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 637302.
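One standard guard against metrics that reward trivial forecasts is to score a model relative to persistence: a model whose error merely matches "the value now will hold at the lead time" has zero skill. The sketch below illustrates this with a synthetic autocorrelated "Dst-like" series; the series, the toy model, and the lead time are assumptions for illustration only.

```python
import math
import random

# Skill relative to persistence. For a short lead time, a persistence
# forecast of an autocorrelated index is already quite good, so raw RMSE
# alone can make a trivial forecast look skillful.

random.seed(1)
n, lead = 500, 3          # number of samples and lead time in steps (assumed)
dst, x = [], 0.0
for _ in range(n):        # synthetic strongly autocorrelated index
    x = 0.95 * x + random.gauss(0.0, 5.0)
    dst.append(x)

obs = dst[lead:]
persist = dst[:-lead]                              # persistence forecast
model = [o + random.gauss(0.0, 2.0) for o in obs]  # toy model, small error

def rmse(forecast, observed):
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecast, observed))
                     / len(observed))

# Skill score: 1 = perfect, 0 = no better than persistence, < 0 = worse.
skill = 1.0 - rmse(model, obs) / rmse(persist, obs)
```

Reporting skill in this relative form singles out the trivial case directly: any model that cannot beat persistence at the given lead time scores at or below zero.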
Tracking Expected Improvements of Decadal Prediction in Climate Services
NASA Astrophysics Data System (ADS)
Suckling, E.; Thompson, E.; Smith, L. A.
2013-12-01
Physics-based simulation models are ultimately expected to provide the best available (decision-relevant) probabilistic climate predictions, as they can capture the dynamics of the Earth system across a range of situations, including situations for which the observations needed to construct empirical models are scant if not nonexistent. This fact in itself provides neither evidence that predictions from today's Earth System Models will outperform today's empirical models, nor a guide to the space and time scales on which today's model predictions are adequate for a given purpose. Empirical (data-based) models are employed to make probability forecasts on decadal timescales. The skill of these forecasts is contrasted with that of state-of-the-art climate models, and the challenges faced by each approach are discussed. The focus is on providing decision-relevant probability forecasts for decision support. An empirical model, known as Dynamic Climatology, is shown to be competitive with CMIP5 climate models on decadal-scale probability forecasts. Contrasting the skill of simulation models not only with each other but also with empirical models can reveal the space and time scales on which a generation of simulation models exploits its physical basis effectively, and can quantify the ability of such models to add information in the formation of operational forecasts. Difficulties of (i) information contamination, (ii) the interpretation of probabilistic skill, and (iii) artificial skill complicate each modelling approach and are discussed. "Physics-free" empirical models provide fixed, quantitative benchmarks for the evaluation of ever more complex climate models, benchmarks that are not available from (inter)comparisons restricted to complex models alone. At present, empirical models can also provide a background term for blending in the formation of probability forecasts from ensembles of simulation models.
In weather forecasting this role is filled by the climatological distribution, and it can significantly enhance the value of longer lead-time weather forecasts to those who use them. It is suggested that the direct comparison of simulation models with empirical models become a regular component of large model forecast intercomparison and evaluation. This would clarify the extent to which a given generation of state-of-the-art simulation models provides information beyond that available from simpler empirical models, and would also clarify current limitations in using simulation forecasting for decision support. No model-based probability forecast is complete without a quantitative estimate of its own irrelevance; this estimate is likely to increase as a function of lead time. A lack of decision-relevant quantitative skill would not bring the science-based foundation of anthropogenic warming into doubt. Skill merely comparable to that of empirical models does, however, suggest clear limits, as a function of lead time, on the spatial and temporal scales beyond which decisions based on such model output are expected to prove maladaptive. Failing to state such weaknesses of a given generation of simulation models clearly, while clearly stating their strengths and their foundation, risks the credibility of science in support of policy in the long term.
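The blending role described here, mixing an ensemble-based probability forecast with a climatological background term, can be sketched as a convex combination of probability densities. The Gaussian forms, the weight `alpha`, and the numbers below are illustrative assumptions, not quantities from the study.

```python
import math

# Blending a model-based probability forecast with a climatological
# background term:  p_blend = alpha * p_model + (1 - alpha) * p_clim.

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def blended_pdf(x, mu_model, sig_model, mu_clim, sig_clim, alpha):
    return (alpha * gauss_pdf(x, mu_model, sig_model)
            + (1 - alpha) * gauss_pdf(x, mu_clim, sig_clim))

# A sharp but overconfident model forecast versus a broad climatology,
# scored on an outcome the model largely missed.
outcome = 1.5
p_model = gauss_pdf(outcome, 0.0, 0.5)   # model density at the outcome
p_clim = gauss_pdf(outcome, 0.0, 2.0)    # climatological density there
p_blend = blended_pdf(outcome, 0.0, 0.5, 0.0, 2.0, alpha=0.8)

# Ignorance (log) score: lower is better. The climatological term bounds
# the penalty when the model alone gives the outcome very little probability.
ign_model = -math.log2(p_model)
ign_blend = -math.log2(p_blend)
```

This is the sense in which the climatological distribution "adds value": it protects a sharp but fallible ensemble forecast from catastrophic log-score penalties on outcomes the ensemble assigns near-zero probability.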
The use of ambient humidity conditions to improve influenza forecast.
Shaman, Jeffrey; Kandula, Sasikiran; Yang, Wan; Karspeck, Alicia
2017-11-01
Laboratory and epidemiological evidence indicate that ambient humidity modulates the survival and transmission of influenza. Here we explore whether the inclusion of humidity forcing in mathematical models describing influenza transmission improves the accuracy of forecasts generated with those models. We generate retrospective forecasts for 95 cities over 10 seasons in the United States and assess both forecast accuracy and error. Overall, we find that humidity forcing improves forecast performance (at 1-4 lead weeks, 3.8% more peak week and 4.4% more peak intensity forecasts are accurate than with no forcing) and that forecasts generated using daily climatological humidity forcing generally outperform forecasts that utilize daily observed humidity forcing (4.4% and 2.6% respectively). These findings hold for predictions of outbreak peak intensity, peak timing, and incidence over 2- and 4-week horizons. The results indicate that use of climatological humidity forcing is warranted for current operational influenza forecasting.
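In humidity-forced influenza models of this kind, the forcing typically enters through a transmission rate that declines with specific humidity, commonly via an exponential form such as R0(t) = R0min + (R0max - R0min) * exp(-a * q(t)). The sketch below couples that form to a simple SIRS model; the humidity curve and every parameter value are illustrative assumptions, not the paper's calibrated values.

```python
import math

# Sketch of a humidity-forced SIRS model: the basic reproduction number
# declines with specific humidity q(t), so transmission peaks in winter.
# All parameters (R0max, R0min, a, D, L, q(t), initial conditions) are
# assumed for illustration.

R0max, R0min, a = 2.2, 1.2, 180.0   # humidity-forcing parameters (assumed)
D = 4.0        # mean infectious period, days
L = 1460.0     # mean duration of immunity, days
N = 1_000_000.0

def q(t):
    # Toy climatological specific humidity (kg/kg): low in winter, high in summer.
    return 0.008 + 0.006 * math.sin(2 * math.pi * (t - 90) / 365.0)

def r0(t):
    return R0min + (R0max - R0min) * math.exp(-a * q(t))

# Forward-Euler integration of SIRS over one year starting in winter.
S, I = 0.7 * N, 100.0
peak_I, peak_day = 0.0, 0
for day in range(365):
    beta = r0(day) / D
    new_inf = beta * S * I / N
    S += (N - S - I) / L - new_inf   # recovered individuals slowly lose immunity
    I += new_inf - I / D
    if I > peak_I:
        peak_I, peak_day = I, day
# The outbreak grows while humidity is low and subsides as q(t) rises.
```

Swapping the q(t) driver between observed and climatological humidity series is the essence of the comparison the abstract describes; in an operational forecast system the model states would additionally be filtered against surveillance data.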