Software forecasting as it is really done: A study of JPL software engineers
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.
1993-01-01
This paper summarizes the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation among the mental models studied, it was nevertheless possible to identify a core set of cost forecasting activities, and the mental models were found to cluster around three forecasting techniques. Further partitioning of the mental models revealed a clustering of activities that is strongly suggestive of a forecasting life cycle. The forecasting methods identified differed in their use of multiple decomposition steps or multiple forecasting steps; the multiple forecasting steps involved either forecasting software size or producing an additional effort forecast. Virtually no subject combined these with risk reduction steps. The results of the analysis include the identification of a core set of well defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of these results for current individual and institutional practices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
The ECE application forecasts annual costs of preventive and corrective maintenance for budgeting purposes. Features within the application enable users to change the specifications of the model to customize the forecast to best fit their needs and to support "what if" analysis. Based on the user's selections, the ECE model forecasts annual maintenance costs. Preventive maintenance costs include the cost of labor to perform preventive maintenance activities at the specified frequency and labor rate. Corrective maintenance costs include the cost of labor and the cost of replacement parts. The application presents forecasted maintenance costs for the next five years in two tables: costs by year and costs by site.
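The cost logic described above is simple enough to sketch directly. The following minimal Python example is illustrative only; the site data, labor rates, frequencies, and escalation factor are hypothetical placeholders, not values from the ECE application.

```python
# Sketch of an ECE-style maintenance cost forecast (all inputs hypothetical;
# the real application takes user-specified frequencies, rates, and parts costs).

SITES = {
    "Site A": {"pm_hours": 120, "pm_per_year": 4, "cm_hours": 60, "parts": 15000.0},
    "Site B": {"pm_hours": 80,  "pm_per_year": 2, "cm_hours": 90, "parts": 22000.0},
}
LABOR_RATE = 85.0   # $/hour, user-adjustable in the real model
YEARS = 5
ESCALATION = 0.03   # assumed annual escalation, for "what if" analysis

def annual_cost(site, year):
    pm = site["pm_hours"] * site["pm_per_year"] * LABOR_RATE
    cm = site["cm_hours"] * LABOR_RATE + site["parts"]
    return (pm + cm) * (1 + ESCALATION) ** year

# Costs by year and by site, mirroring the application's two output tables.
by_year = {y + 1: sum(annual_cost(s, y) for s in SITES.values()) for y in range(YEARS)}
by_site = {name: sum(annual_cost(s, y) for y in range(YEARS)) for name, s in SITES.items()}

print(by_year)
print(by_site)
```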
NASA Technical Reports Server (NTRS)
Kratochvil, D.; Bowyer, J.; Bhushan, C.; Steinnagel, K.; Kaushal, D.; Al-Kinani, G.
1983-01-01
Voice applications, data applications, video applications, impacted baseline forecasts, market distribution model, net long haul forecasts, trunking earth station definition and costs, trunking space segment cost, trunking entrance/exit links, trunking network costs and crossover distances with terrestrial tariffs, net addressable forecasts, capacity requirements, improving spectrum utilization, satellite system market development, and the 30/20 net accessible market are considered.
Habka, Dany; Mann, David; Landes, Ronald; Soto-Gutierrez, Alejandro
2015-01-01
During the past 20 years liver transplantation has become the definitive treatment for most severe types of liver failure and hepatocellular carcinoma, in both children and adults. In the U.S., roughly 16,000 individuals are on the liver transplant waiting list. Only 38% of them will receive a transplant due to the organ shortage. This paper explores another option: bioengineering an autologous liver graft. We developed a 20-year model projecting future demand for liver transplants, along with costs based on current technology. We compared these cost projections against projected costs to bioengineer autologous liver grafts. The model was divided into: 1) the epidemiology model forecasting the number of wait-listed patients, operated patients and postoperative patients; and 2) the treatment model forecasting costs (pre-transplant-related costs; transplant (admission)-related costs; and 10-year post-transplant-related costs) during the simulation period. The patient population was categorized using the Model for End-Stage Liver Disease score. The number of patients on the waiting list was projected to increase 23% over 20 years while the weighted average treatment costs in the pre-liver transplantation phase were forecast to increase 83% in Year 20. Projected demand for livers will increase 10% in 10 years and 23% in 20 years. Total costs of liver transplantation are forecast to increase 33% in 10 years and 81% in 20 years. By comparison, the projected cost to bioengineer autologous liver grafts is $9.7M based on current catalog prices for iPS-derived liver cells. The model projects a persistent increase in need and cost of donor livers over the next 20 years that’s constrained by a limited supply of donor livers. The number of patients who die while on the waiting list will reflect this ever-growing disparity. Currently, bioengineering autologous liver grafts is cost prohibitive. However, costs will decline rapidly with the introduction of new manufacturing strategies and economies of scale. PMID:26177505
Suppression cost forecasts in advance of wildfire seasons
Jeffrey P. Prestemon; Karen Abt; Krista Gebert
2008-01-01
Approaches for forecasting wildfire suppression costs in advance of a wildfire season are demonstrated for two lead times: fall and spring of the current fiscal year (Oct. 1–Sept. 30). Model functional forms are derived from aggregate expressions of a least cost plus net value change model. Empirical estimates of these models are used to generate advance-of-season...
ERIC Educational Resources Information Center
Zan, Xinxing Anna; Yoon, Sang Won; Khasawneh, Mohammad; Srihari, Krishnaswami
2013-01-01
In an effort to develop a low-cost and user-friendly forecasting model to minimize forecasting error, we have applied average and exponentially weighted return ratios to project undergraduate student enrollment. We tested the proposed forecasting models with different sets of historical enrollment data, such as university-, school-, and…
Wildfire suppression cost forecasts from the US Forest Service
Karen L. Abt; Jeffrey P. Prestemon; Krista M. Gebert
2009-01-01
The US Forest Service and other land-management agencies seek better tools for anticipating future expenditures for wildfire suppression. We developed regression models for forecasting US Forest Service suppression spending at 1-, 2-, and 3-year lead times. We compared these models to another readily available forecast model, the 10-year moving average model,...
NASA Astrophysics Data System (ADS)
Kumar, I.; Josset, L.; e Silva, E. C.; Possas, J. M. C.; Asfora, M. C.; Lall, U.
2017-12-01
Maintaining financial health and sustainability, ensuring adequate supply, and adapting to climate are fundamental challenges faced by water managers. These challenges are worsened in semi-arid regions with socio-economic pressures, seasonal supply of water, and a projected increase in the intensity and frequency of droughts. Probabilistic rainfall forecasts are improving over time, and for water managers they could be key to addressing these challenges. Using forecasts can also help make informed decisions about future infrastructure. The study proposes a model to minimize the cost of water supply (including the cost of deficit) given ensemble forecasts. The model can be applied to seasonal to annual ensemble forecasts to determine the least cost solution. The objective of the model is to evaluate the resiliency and cost associated with supplying water. A case study is conducted on one of the largest reservoirs (Jucazinho) in Pernambuco state, Brazil, and four other reservoirs, which provide water to nineteen municipalities in the Jucazinho system. The state has been in drought since 2011, and the Jucazinho reservoir has been empty since January 2017. Climate adaptation, risk management, and financial sustainability are important to the state, as it is extremely vulnerable to droughts and has seasonal streamflow. The objectives of the case study are, first, to check whether streamflow forecasts help reduce future supply costs by comparing k-nearest neighbor ensemble forecasts with a fixed release policy, and second, to determine the value of future infrastructure, a new source of supply from the Rio São Francisco considered to mitigate drought conditions. The study concludes that using forecasts improves the supply and financial sustainability of water by reducing the cost of failure. It also concludes that additional infrastructure can significantly reduce the risk of failure, but does not guarantee supply during prolonged droughts like the one currently being experienced.
Weight and cost forecasting for advanced manned space vehicles
NASA Technical Reports Server (NTRS)
Williams, Raymond
1989-01-01
A computerized mass and cost estimating methodology for predicting advanced manned space vehicle weights and costs was developed. The user-friendly methodology, designated MERCER (Mass Estimating Relationship/Cost Estimating Relationship), organizes the predictive process according to major vehicle subsystem levels. The study treats forecasting of design, development, test, and evaluation costs as well as flight hardware costs. The methodology consists of a complete set of mass estimating relationships (MERs), which serve as the control components for the model, and cost estimating relationships (CERs), which use MER output as input. To develop this model, numerous MER and CER studies were surveyed and modified where required. Additionally, relationships were regressed from raw data to accommodate the methodology. The models and formulations that estimated the cost of historical vehicles to within 20 percent of the actual cost were selected. The results of the research, along with components of the MERCER program, are reported. On the basis of the analysis, the following conclusions were established: (1) the cost of a spacecraft is best estimated by summing the cost of individual subsystems; (2) no one cost equation can be used for forecasting the cost of all spacecraft; (3) spacecraft cost is highly correlated with its mass; (4) no study surveyed contained sufficient formulations to autonomously forecast the cost and weight of the entire advanced manned vehicle spacecraft program; (5) no user-friendly program was found that linked MERs with CERs to produce spacecraft cost; and (6) the group accumulation weight estimation method (summing the estimated weights of the various subsystems) proved to be a useful method for finding the total weight and cost of a spacecraft.
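The MER-to-CER chaining described above can be illustrated with a short sketch. The power-law forms and all coefficients below are hypothetical stand-ins, not the MERCER relationships themselves; the point is the structure: subsystem MERs produce masses, CERs map each mass to cost, and both are summed per conclusions (1) and (6).

```python
# Hypothetical subsystem MERs: mass (kg) as a power law of a driving
# design parameter, mass = a * x**b. Coefficients are illustrative only.
MERS = {
    "structure": (50.0, 0.9),   # x = pressurized volume (m^3)
    "power":     (30.0, 0.8),   # x = end-of-life power (kW)
    "thermal":   (12.0, 0.7),   # x = rejected heat (kW)
}

# Hypothetical CERs: cost ($M) as a power law of subsystem mass.
CERS = {
    "structure": (0.08, 0.85),
    "power":     (0.15, 0.80),
    "thermal":   (0.10, 0.75),
}

def estimate(design):
    # MER output (mass) feeds the CER; totals are subsystem sums
    # (the "group accumulation" method named in the abstract).
    masses = {k: a * design[k] ** b for k, (a, b) in MERS.items()}
    costs = {k: a * masses[k] ** b for k, (a, b) in CERS.items()}
    return sum(masses.values()), sum(costs.values())

mass, cost = estimate({"structure": 200.0, "power": 25.0, "thermal": 40.0})
print(f"total mass ~ {mass:.0f} kg, total cost ~ ${cost:.1f}M")
```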
A novel single-parameter approach for forecasting algal blooms.
Xiao, Xi; He, Junyu; Huang, Haomin; Miller, Todd R; Christakos, George; Reichwaldt, Elke S; Ghadouani, Anas; Lin, Shengpan; Xu, Xinhua; Shi, Jiyan
2017-01-01
Harmful algal blooms frequently occur globally, and forecasting could constitute an essential proactive strategy for bloom control. To decrease the cost of aquatic environmental monitoring and increase the accuracy of bloom forecasting, a novel single-parameter approach combining wavelet analysis with artificial neural networks (WNN) was developed and verified based on daily online monitoring datasets of algal density in the Siling Reservoir, China and Lake Winnebago, U.S.A. Firstly, a detailed modeling process was illustrated using the forecasting of cyanobacterial cell density in the Chinese reservoir as an example. Three WNN models occupying various prediction time intervals were optimized through model training using an early stopped training approach. All models performed well in fitting historical data and predicting the dynamics of cyanobacterial cell density, with the best model predicting cyanobacteria density one day ahead (r = 0.986 and mean absolute error = 0.103 × 10⁴ cells mL⁻¹). Secondly, the potential of this novel approach was further confirmed by the precise predictions of algal biomass dynamics measured as chl a in both study sites, demonstrating its high performance in forecasting algal blooms, including cyanobacteria as well as other blooming species. Thirdly, the WNN model was compared to current algal forecasting methods (i.e. artificial neural networks, autoregressive integrated moving average model), and was found to be more accurate. In addition, the application of this novel single-parameter approach is cost effective as it requires only a buoy-mounted fluorescent probe, which is merely a fraction (∼15%) of the cost of a typical auto-monitoring system. As such, the newly developed approach presents a promising and cost-effective tool for the future prediction and management of harmful algal blooms.
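As a rough illustration of the wavelet-neural-network idea, the sketch below decomposes an algal density series with a discrete wavelet transform, suppresses the finest detail coefficients, and trains a small neural network on lagged values of the denoised signal to predict one step ahead. It is a minimal sketch assuming the PyWavelets and scikit-learn packages; the wavelet choice, lag count, network size, and synthetic data are arbitrary and are not the settings or data used in the paper.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Synthetic stand-in for a daily algal density series (the paper used
# online monitoring data from Siling Reservoir and Lake Winnebago).
t = np.arange(400)
density = 10 + 5 * np.sin(2 * np.pi * t / 60) + rng.normal(0, 1, t.size)

# Wavelet denoising: decompose, zero the finest detail coefficients,
# reconstruct. 'db4' and level 3 are arbitrary illustrative choices.
coeffs = pywt.wavedec(density, "db4", level=3)
coeffs[-1] = np.zeros_like(coeffs[-1])
smooth = pywt.waverec(coeffs, "db4")[: density.size]

# Lagged features: predict density[t] from the previous 7 denoised values.
LAGS = 7
X = np.column_stack([smooth[i:-(LAGS - i)] for i in range(LAGS)])
y = density[LAGS:]

split = 300  # simple chronological train/test split
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("one-day-ahead MAE:", np.mean(np.abs(pred - y[split:])))
```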
NASA Technical Reports Server (NTRS)
Kratochvil, D.; Bowyer, J.; Bhushan, C.; Steinnagel, K.; Kaushal, D.; Al-Kinani, G.
1983-01-01
Voice applications, data applications, video applications, impacted baseline forecasts, market distribution, potential CPS (customers premises services) user classes, net long haul forecasts, CPS cost analysis, overall satellite forecast, CPS satellite market, Ka-band CPS satellite forecast, nationwide traffic distribution model, and intra-urban topology are discussed.
Forecasting in foodservice: model development, testing, and evaluation.
Miller, J L; Thompson, P A; Orabella, M M
1991-05-01
This study was designed to develop, test, and evaluate mathematical models appropriate for forecasting menu-item production demand in foodservice. Data were collected from residence and dining hall foodservices at Ohio State University. Objectives of the study were to collect, code, and analyze the data; develop and test models using actual operation data; and compare forecasting results with current methods in use. Customer count was forecast using deseasonalized simple exponential smoothing. Menu-item demand was forecast by multiplying the count forecast by a predicted preference statistic. Forecasting models were evaluated using mean squared error, mean absolute deviation, and mean absolute percentage error techniques. All models were more accurate than current methods. A broad spectrum of forecasting techniques could be used by foodservice managers with access to a personal computer and spreadsheet and database management software. The findings indicate that mathematical forecasting techniques may be effective in foodservice operations to control costs, increase productivity, and maximize profits.
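The two-stage forecast described above (a deseasonalized, exponentially smoothed customer count multiplied by a menu-item preference statistic) is easy to sketch. This is a minimal illustration with invented numbers, assuming a weekly (7-day) season; the study's smoothing constants and preference estimates are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic daily customer counts with a weekly pattern (stand-in data).
days = 8 * 7
counts = (500 + 80 * np.sin(2 * np.pi * np.arange(days) / 7)
          + rng.normal(0, 20, days))

SEASON = 7
ALPHA = 0.2  # smoothing constant (illustrative)

# Multiplicative seasonal indices from day-of-week averages.
idx = np.array([counts[d::SEASON].mean() for d in range(SEASON)])
idx /= idx.mean()

# Simple exponential smoothing on the deseasonalized series.
deseason = counts / idx[np.arange(days) % SEASON]
level = deseason[0]
for x in deseason[1:]:
    level = ALPHA * x + (1 - ALPHA) * level

tomorrow = days % SEASON
count_forecast = level * idx[tomorrow]  # reseasonalized count forecast

# Menu-item demand = count forecast * predicted preference statistic.
preferences = {"pasta": 0.35, "stir fry": 0.25, "salad bar": 0.40}  # hypothetical
demand = {item: count_forecast * p for item, p in preferences.items()}
print(round(count_forecast), {k: round(v) for k, v in demand.items()})
```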
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolinger, Mark; Wiser, Ryan; Golove, William
2003-08-13
Against the backdrop of increasingly volatile natural gas prices, renewable energy resources, which by their nature are immune to natural gas fuel price risk, provide a real economic benefit. Unlike many contracts for natural gas-fired generation, renewable generation is typically sold under fixed-price contracts. Assuming that electricity consumers value long-term price stability, a utility or other retail electricity supplier that is looking to expand its resource portfolio (or a policymaker interested in evaluating different resource options) should therefore compare the cost of fixed-price renewable generation to the hedged or guaranteed cost of new natural gas-fired generation, rather than to projected costs based on uncertain gas price forecasts. To do otherwise would be to compare apples to oranges: by their nature, renewable resources carry no natural gas fuel price risk, and if the market values that attribute, then the most appropriate comparison is to the hedged cost of natural gas-fired generation. Nonetheless, utilities and others often compare the costs of renewable to gas-fired generation using as their fuel price input long-term gas price forecasts that are inherently uncertain, rather than long-term natural gas forward prices that can actually be locked in. This practice raises the critical question of how these two price streams compare. If they are similar, then one might conclude that forecast-based modeling and planning exercises are in fact approximating an apples-to-apples comparison, and no further consideration is necessary. If, however, natural gas forward prices systematically differ from price forecasts, then the use of such forecasts in planning and modeling exercises will yield results that are biased in favor of either renewable (if forwards < forecasts) or natural gas-fired generation (if forwards > forecasts). In this report we compare the cost of hedging natural gas price risk through traditional gas-based hedging instruments (e.g., futures, swaps, and fixed-price physical supply contracts) to contemporaneous forecasts of spot natural gas prices, with the purpose of identifying any systematic differences between the two. Although our data set is quite limited, we find that over the past three years, forward gas prices for durations of 2-10 years have been considerably higher than most natural gas spot price forecasts, including the reference case forecasts developed by the Energy Information Administration (EIA). This difference is striking, and implies that resource planning and modeling exercises based on these forecasts over the past three years have yielded results that are biased in favor of gas-fired generation (again, presuming that long-term stability is desirable). As discussed later, these findings have important ramifications for resource planners, energy modelers, and policy-makers.
2000-06-20
smoothing and regression (which includes curve fitting) are two principal forecasting model types utilized in the vast majority of forecasting applications ... model were compared against the VA Office of Policy and Planning forecasting study commissioned with the actuarial firm of Milliman & Robertson (M & R... Application to the Veterans Healthcare System: the development of a model to forecast future VEV needs, utilization, and cost of the Acute Care and
ERIC Educational Resources Information Center
Hoffman, Benjamin B.
Forecasting models for maximizing postsecondary futures and applications of the model are considered. The forecasting of broad human futures has many parallels to human futures in the field of medical prognosis. The concept of "exasperated negative" is used to refer to the suppression of critical information about a negative future with…
The Impact of Implementing a Demand Forecasting System into a Low-Income Country’s Supply Chain
Mueller, Leslie E.; Haidari, Leila A.; Wateska, Angela R.; Phillips, Roslyn J.; Schmitz, Michelle M.; Connor, Diana L.; Norman, Bryan A.; Brown, Shawn T.; Welling, Joel S.; Lee, Bruce Y.
2016-01-01
OBJECTIVE To evaluate the potential impact and value of applications (e.g., adjusting ordering levels, storage capacity, transportation capacity, distribution frequency) of data from demand forecasting systems implemented in a lower-income country's vaccine supply chain with different levels of population change to urban areas. MATERIALS AND METHODS Using our software, HERMES, we generated a detailed discrete event simulation model of Niger's entire vaccine supply chain, including every refrigerator, freezer, transport, personnel, vaccine, cost, and location. We represented the introduction of a demand forecasting system to adjust vaccine ordering that could be implemented with increasing delivery frequencies and/or additions of cold chain equipment (storage and/or transportation) across the supply chain during varying degrees of population movement. RESULTS Implementing a demand forecasting system with increased storage and transport frequency increased the number of successfully administered vaccine doses and lowered the logistics cost per dose by up to 34%. Implementing a demand forecasting system without storage/transport increases actually decreased vaccine availability in certain circumstances. DISCUSSION The potential maximum gains of a demand forecasting system may only be realized if the system is implemented to augment both the supply chain cold storage and transportation. Implementation may have some impact but, in certain circumstances, may hurt delivery. Therefore, implementation of demand forecasting systems with additional storage and transport may be the better approach. Significant decreases in the logistics cost per dose with more administered vaccines support investment in these forecasting systems. CONCLUSION Demand forecasting systems have the potential to greatly improve vaccine demand fulfillment and decrease logistics cost per dose when implemented with storage and transportation increases. Simulation modeling can demonstrate the potential health and economic benefits of supply chain improvements. PMID:27219341
Models for forecasting energy use in the US farm sector
NASA Astrophysics Data System (ADS)
Christensen, L. R.
1981-07-01
Econometric models were developed and estimated for the purpose of forecasting electricity and petroleum demand in US agriculture. A structural approach is pursued which takes account of the fact that the quantity demanded of any one input is a decision made in conjunction with other input decisions. Three different functional forms of varying degrees of complexity are specified for the structural cost function, which describes the cost of production as a function of the level of output and factor prices. Demand for materials (all purchased inputs) is derived from these models. A separate model which breaks this demand up into demand for the four components of materials is used to produce forecasts of electricity and petroleum in a stepwise manner.
Global Positioning System (GPS) Precipitable Water in Forecasting Lightning at Spaceport Canaveral
NASA Technical Reports Server (NTRS)
Kehrer, Kristen; Graf, Brian G.; Roeder, William
2005-01-01
Using meteorological data, focusing on precipitable water (PW), obtained during the 2000-2003 thunderstorm seasons in Central Florida, this paper will (1) assess the skill and accuracy measurements of the current Mazany forecasting tool and (2) provide additional forecasting tools that can be used in predicting lightning. Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) are located in east Central Florida. KSC and CCAFS process and launch manned (NASA Space Shuttle) and unmanned (NASA and Air Force Expendable Launch Vehicles) space vehicles. One of the biggest cost impacts is unplanned launch scrubs due to inclement weather conditions such as thunderstorms. Each launch delay/scrub costs over a quarter million dollars, and the need to land the Shuttle at another landing site and return to KSC costs approximately $1M. Given the amount of time lost and costs incurred, the ability to accurately forecast when lightning will occur can result in significant cost and time savings. All lightning prediction models were developed using binary logistic regression. Lightning is the dependent variable and is binary. The independent variables are the precipitable water (PW) value for a given time of the day, the change in PW over up to 12 hours, the electric field mill value, and the K-index value. In comparing the Mazany model results for the 1999 period B against actual observations for the 2000-2003 thunderstorm seasons, differences were found in the False Alarm Rate (FAR), Probability of Detection (POD), and Hit Rate (H). On average, the False Alarm Rate (FAR) increased by 58%, the Probability of Detection (POD) decreased by 31%, and the Hit Rate decreased by 20%. In comparing the performance of the 6-hour forecast period to the performance of the 1.5-hour forecast period for the Mazany model, the FAR was lower by 15% and the Hit Rate was higher by 7%. However, the POD for the 6-hour forecast period was lower by 16% as compared to the POD of the 1.5-hour forecast period. Neither forecast period performed at the accuracy measures expected. A 2-Hr Forecasting Tool was developed to support a Phase I Lightning Advisory, which requires a 30-minute lead time for predicting lightning.
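A minimal sketch of the modeling and verification approach described above: fit a binary logistic regression of a lightning indicator on the four named predictors, then score it with POD, FAR, and hit rate. The data below are synthetic placeholders, not the KSC/CCAFS observations, and the coefficients used to generate them are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 1000
# Synthetic predictors: PW (cm), 12-h PW change, field mill (V/m), K-index.
X = np.column_stack([
    rng.normal(4.0, 1.0, n),
    rng.normal(0.0, 0.5, n),
    rng.normal(500, 400, n),
    rng.normal(25, 8, n),
])
# Synthetic lightning occurrence loosely tied to the predictors.
logit = -6 + 0.8 * X[:, 0] + 1.0 * X[:, 1] + 0.002 * X[:, 2] + 0.05 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X[:800], y[:800])
pred = model.predict(X[800:])
obs = y[800:]

hits = np.sum(pred & obs)
misses = np.sum(~pred & obs)
false_alarms = np.sum(pred & ~obs)
correct_negs = np.sum(~pred & ~obs)

pod = hits / (hits + misses)                # probability of detection
far = false_alarms / (hits + false_alarms)  # false alarms per warning issued
hit_rate = (hits + correct_negs) / obs.size # fraction of correct forecasts
print(f"POD={pod:.2f} FAR={far:.2f} H={hit_rate:.2f}")
```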
Construction cost forecast model : model documentation and technical notes.
DOT National Transportation Integrated Search
2013-05-01
Construction cost indices are generally estimated with Laspeyres, Paasche, or Fisher indices that allow changes : in the quantities of construction bid items, as well as changes in price to change the cost indices of those items. : These cost indices...
Aggregate Auto Travel Forecasting : State of the Art and Suggestions for Future Research
DOT National Transportation Integrated Search
1976-12-01
The report reviews existing forecasting models of auto vehicle miles of travel (VMT), and presents evidence that such models incorrectly omit time cost and spatial form variables. The omission of these variables biases parameter estimates in existing...
Demonstrating the Alaska Ocean Observing System in Prince William Sound
NASA Astrophysics Data System (ADS)
Schoch, G. Carl; McCammon, Molly
2013-07-01
The Alaska Ocean Observing System and the Oil Spill Recovery Institute developed a demonstration project over a 5-year period in Prince William Sound. The primary goal was to develop a quasi-operational system that delivers weather and ocean information in near real time to diverse user communities. This observing system now consists of atmospheric and oceanic sensors, and a new generation of computer models to numerically simulate and forecast weather, waves, and ocean circulation. A state-of-the-art data management system provides access to these products from one internet portal at http://www.aoos.org. The project culminated in a 2009 field experiment that evaluated the observing system and the performance of the model forecasts. Observations from terrestrial weather stations and weather buoys validated atmospheric circulation forecasts. Observations from wave gages on weather buoys validated forecasts of significant wave heights and periods. There was an emphasis on validation of surface currents forecast by the ocean circulation model for oil spill response and search and rescue applications. During the 18-day field experiment a radar array mapped surface currents and drifting buoys were deployed. Hydrographic profiles at fixed stations, and by autonomous vehicles along transects, were made to acquire measurements through the water column. Terrestrial weather stations were the most reliable and least costly to operate, and in situ ocean sensors were more costly and considerably less reliable. The radar surface current mappers were the least reliable and most costly but provided the assimilation and validation data that most improved ocean circulation forecasts. We describe the setting of Prince William Sound and the various observational platforms and forecast models of the observing system, and discuss recommendations for future development.
Mental Models of Software Forecasting
NASA Technical Reports Server (NTRS)
Hihn, J.; Griesel, A.; Bruno, K.; Fouser, T.; Tausworthe, R.
1993-01-01
The majority of software engineers resist the use of the currently available cost models. One problem is that the mathematical and statistical models that are currently available do not correspond with the mental models of the software engineers. An earlier JPL-funded study (Hihn and Habib-agahi, 1991) found that software engineers prefer to use analogical or analogy-like techniques to derive size and cost estimates, whereas current CERs hide any analogy in the regression equations. In addition, the currently available models depend upon information which is not available during early planning, when the most important forecasts must be made.
NASA Technical Reports Server (NTRS)
Liu, Yuqiong; Weerts, A.; Clark, M.; Hendricks Franssen, H.-J; Kumar, S.; Moradkhani, H.; Seo, D.-J.; Schwanenberg, D.; Smith, P.; van Dijk, A. I. J. M.;
2012-01-01
Data assimilation (DA) holds considerable potential for improving hydrologic predictions, as demonstrated in numerous research studies. However, advances in hydrologic DA research have not been adequately or promptly implemented in operational forecast systems to improve the skill of forecasts for better informed real-world decision making. This is due in part to a lack of mechanisms to properly quantify the uncertainty in observations and forecast models in real-time forecasting situations and to conduct the merging of data and models in a way that is adequately efficient and transparent to operational forecasters. The need for effective DA of useful hydrologic data into the forecast process has become increasingly recognized in recent years. This motivated a hydrologic DA workshop in Delft, the Netherlands, in November 2010, which focused on advancing DA in operational hydrologic forecasting and water resources management. As an outcome of the workshop, this paper reviews, in relevant detail, the current status of DA applications in both hydrologic research and operational practices, and discusses the existing or potential hurdles and challenges in transitioning hydrologic DA research into cost-effective operational forecasting tools, as well as the potential pathways and newly emerging opportunities for overcoming these challenges. Several related aspects are discussed, including (1) theoretical or mathematical aspects of DA algorithms, (2) the estimation of different types of uncertainty, (3) new observations and their objective use in hydrologic DA, (4) the use of DA for real-time control of water resources systems, and (5) the development of community-based, generic DA tools for hydrologic applications. It is recommended that the cost-effective transition of hydrologic DA from research to operations be helped by developing community-based, generic modeling and DA tools or frameworks, and by fostering collaborative efforts among hydrologic modellers, DA developers, and operational forecasters.
Robustness of disaggregate oil and gas discovery forecasting models
Attanasi, E.D.; Schuenemeyer, J.H.
1989-01-01
The trend in forecasting oil and gas discoveries has been to develop and use models that allow forecasts of the size distribution of future discoveries. From such forecasts, exploration and development costs can more readily be computed. Two classes of these forecasting models are the Arps-Roberts type models and the 'creaming method' models. This paper examines the robustness of the forecasts made by these models when the historical data on which the models are based have been subject to economic upheavals or when historical discovery data are aggregated from areas having widely differing economic structures. Model performance is examined in the context of forecasting discoveries for offshore Texas State and Federal areas. The analysis shows how the model forecasts are limited by information contained in the historical discovery data. Because the Arps-Roberts type models require more regularity in the discovery sequence than the creaming models, prior information had to be introduced into the Arps-Roberts models to accommodate the influence of economic changes. The creaming methods captured the overall decline in discovery size but did not easily allow introduction of exogenous information to compensate for incomplete historical data. Moreover, the predictive log normal distribution associated with the creaming model methods appears to understate the importance of the potential contribution of small fields.
Econometrics 101: forecasting demystified
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crow, R.T.
1980-05-01
Forecasting by econometric modeling is described in a commonsense way which omits much of the technical jargon. A trend of continuous growth is no longer an adequate forecasting tool. Today's forecasters must consider rapid changes in price, policies, regulations, capital availability, and the cost of being wrong. A forecasting model is designed by identifying future influences on electricity purchases and quantifying their relationships to each other. A record is produced which can be evaluated and used to make corrections in the models. Residential consumption is used to illustrate how this works and to demonstrate how power consumption is also related to the purchase and use of equipment. While models can quantify behavioral relationships, they cannot account for the impacts of non-price factors because of limited data.
Forecast horizon of multi-item dynamic lot size model with perishable inventory.
Jing, Fuying; Lan, Zirui
2017-01-01
This paper studies a multi-item dynamic lot size problem for perishable products where stock deterioration rates and inventory costs are age-dependent. We explore structural properties in an optimal solution under two cost structures and develop a dynamic programming algorithm to solve the problem in polynomial time when the number of products is fixed. We establish forecast horizon results that can help the operations manager to decide the precise forecast horizon in a rolling decision-making process. Finally, based on a detailed test bed of instances, we obtain useful managerial insights on the impact of deterioration rate and lifetime of products on the length of the forecast horizon.
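To make the dynamic programming idea concrete, here is a minimal Wagner-Whitin-style lot-sizing sketch for a single product, with age-dependent carrying costs rising with inventory age to mimic perishability. It illustrates the general DP recursion only, not the paper's multi-item algorithm; all numbers are invented.

```python
# Minimal single-item dynamic lot-sizing sketch (Wagner-Whitin style).
demand = [40, 50, 10, 70, 30, 60]        # forecast demand per period
SETUP = 100.0                            # fixed ordering cost per order
HOLD = [0.5, 0.8, 1.3, 2.1, 3.4]         # per-unit cost of holding to age 1..5
                                         # (rising with age: perishability proxy)

T = len(demand)
INF = float("inf")
best = [0.0] + [INF] * T   # best[t] = min cost to satisfy periods 0..t-1
prev = [0] * (T + 1)

for t in range(1, T + 1):
    for s in range(t):     # last order placed in period s covers periods s..t-1
        carry = sum(HOLD[j - s - 1] * demand[j] for j in range(s + 1, t))
        cost = best[s] + SETUP + carry
        if cost < best[t]:
            best[t], prev[t] = cost, s

# Recover the ordering periods from the DP backpointers.
orders, t = [], T
while t > 0:
    orders.append(prev[t])
    t = prev[t]
print("min cost:", best[T], "order in periods:", sorted(orders))
```

A forecast horizon result of the kind the paper establishes identifies a horizon beyond which added demand data cannot change the first-period decision, which is what makes this kind of recursion safe to use in a rolling process.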
NASA Technical Reports Server (NTRS)
Rango, A.
1981-01-01
Both LANDSAT and NOAA satellite data were used in improving snowmelt runoff forecasts. When the satellite snow cover data were tested in both empirical seasonal runoff estimation and short-term modeling approaches, a definite potential for reducing forecast error was evident. A cost-benefit analysis run in conjunction with the snow mapping indicated a $36.5 million annual benefit accruing from a one percent improvement in forecast accuracy using the snow cover data for the western United States. The annual cost of employing the system would be $505,000, implying a benefit-cost ratio of roughly 72 to 1. The snow mapping has proven that satellite snow cover data can be used to reduce snowmelt runoff forecast error in a cost-effective manner once all operational satellite data are available within 72 hours after acquisition. Executive summaries of the individual snow mapping projects are presented.
Intermittent Demand Forecasting in a Tertiary Pediatric Intensive Care Unit.
Cheng, Chen-Yang; Chiang, Kuo-Liang; Chen, Meng-Yin
2016-10-01
Forecasts of the demand for medical supplies both directly and indirectly affect the operating costs and the quality of the care provided by health care institutions. Specifically, overestimating demand induces an inventory surplus, whereas underestimating demand possibly compromises patient safety. Uncertainty in forecasting the consumption of medical supplies generates intermittent demand events. The intermittent demand patterns for medical supplies are generally classified as lumpy, erratic, smooth, and slow-moving demand. This study was conducted with the purpose of advancing a tertiary pediatric intensive care unit's efforts to achieve a high level of accuracy in its forecasting of the demand for medical supplies. On this point, several demand forecasting methods were compared in terms of the forecast accuracy of each. The results confirm that applying Croston's method combined with a single exponential smoothing method yields the most accurate results for forecasting lumpy, erratic, and slow-moving demand, whereas the Simple Moving Average (SMA) method is the most suitable for forecasting smooth demand. In addition, when the classification of demand consumption patterns were combined with the demand forecasting models, the forecasting errors were minimized, indicating that this classification framework can play a role in improving patient safety and reducing inventory management costs in health care institutions.
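Croston's method, named above as the best performer for lumpy, erratic, and slow-moving demand, separately smooths nonzero demand sizes and inter-demand intervals and forecasts their ratio. Below is a minimal sketch with invented data; the smoothing constant and demand series are illustrative, not the study's.

```python
import numpy as np

def croston(demand, alpha=0.1):
    """One-step-ahead Croston forecast: smooth nonzero demand sizes (z)
    and inter-demand intervals (p) separately; forecast = z / p."""
    z = p = None
    q = 1  # periods since the last nonzero demand
    forecasts = []
    for d in demand:
        forecasts.append(np.nan if z is None else z / p)
        if d > 0:
            if z is None:          # initialize on the first demand event
                z, p = d, q
            else:
                z = alpha * d + (1 - alpha) * z
                p = alpha * q + (1 - alpha) * p
            q = 1
        else:
            q += 1
    return forecasts

# Intermittent (lumpy) stand-in, e.g. units of a medical supply per day.
demand = [0, 0, 5, 0, 0, 0, 3, 0, 7, 0, 0, 4, 0, 0, 0, 6]
print([None if np.isnan(f) else round(f, 2) for f in croston(demand)])
```

One common way to assign series to the lumpy/erratic/smooth/slow-moving classes the study uses is the Syntetos-Boylan scheme, which thresholds the average inter-demand interval and the squared coefficient of variation of demand sizes; whether the study used exactly that scheme is not stated in the abstract.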
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gagnon, Pieter; Barbose, Galen L.; Stoll, Brady
Misforecasting the adoption of customer-owned distributed photovoltaics (DPV) can have operational and financial implications for utilities; forecasting capabilities can be improved, but generally at a cost. This paper informs this decision space by using a suite of models to explore the capacity expansion and operation of the Western Interconnection over a 15-year period across a wide range of DPV growth rates and misforecast severities. The system costs under a misforecast are compared against the costs under a perfect forecast, to quantify the costs of misforecasting. Using a simplified probabilistic method applied to these modeling results, an analyst can make a first-order estimate of the financial benefit of improving a utility's forecasting capabilities, and thus be better informed about whether to make such an investment. For example, under our base assumptions, a utility with 10 TWh per year of retail electric sales who initially estimates that DPV growth could range from 2% to 7.5% of total generation over the next 15 years could expect total present-value savings of approximately $4 million if they could reduce the severity of misforecasting to within ±25%. Utility resource planners can compare those savings against the costs needed to achieve that level of precision, to guide their decision on whether to make an investment in tools or resources.
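The first-order estimate described above amounts to weighting the misforecast cost penalties by their probabilities under the current and the improved forecasting capability and taking the difference. A toy version of that calculation, with entirely hypothetical penalty values and probabilities (not the paper's modeled results):

```python
# Toy value-of-information calculation (all numbers hypothetical).
# penalty[m]: present-value cost ($M) of a DPV misforecast of relative
# severity m, compared with a perfect forecast.
penalty = {-0.75: 9.0, -0.50: 5.0, -0.25: 2.0, 0.0: 0.0,
           0.25: 1.5, 0.50: 4.0, 0.75: 8.0}

# Probability of each severity before and after investing in better
# forecasting tools (after: severity confined to within +/-25%).
p_before = {-0.75: .10, -0.50: .15, -0.25: .20, 0.0: .10,
            0.25: .20, 0.50: .15, 0.75: .10}
p_after = {-0.25: .30, 0.0: .40, 0.25: .30}

ev_before = sum(p * penalty[m] for m, p in p_before.items())
ev_after = sum(p * penalty[m] for m, p in p_after.items())
print(f"expected savings ~ ${ev_before - ev_after:.1f}M")
```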
NASA Technical Reports Server (NTRS)
Kratochvil, D.; Bowyer, J.; Bhushan, C.; Steinnagel, K.; Kaushal, D.; Al-Kinani, G.
1983-01-01
Development of a forecast of the total domestic telecommunications demand, identification of that portion of the telecommunications demand suitable for transmission by satellite systems, identification of that portion of the satellite market addressable by CPS systems, identification of that portion of the satellite market addressable by Ka-band CPS system, and postulation of a Ka-band CPS network on a nationwide and local level were achieved. The approach employed included the use of a variety of forecasting models, a parametric cost model, a market distribution model and a network optimization model. Forecasts were developed for: 1980, 1990, 2000; voice, data and video services; terrestrial and satellite delivery modes; and C, Ku and Ka-bands.
Forecasting of global solar radiation using ANFIS and ARMAX techniques
NASA Astrophysics Data System (ADS)
Muhammad, Auwal; Gaya, M. S.; Aliyu, Rakiya; Aliyu Abdulkadir, Rabi'u.; Dauda Umar, Ibrahim; Aminu Yusuf, Lukuman; Umar Ali, Mudassir; Khairi, M. T. M.
2018-01-01
The cost of procuring measuring devices, together with maintenance and instrument calibration costs, contributes to the difficulty of forecasting global solar radiation in underdeveloped countries. Most of the available regression and mathematical models do not capture the behavior of global solar radiation well. This paper presents a comparison of the Adaptive Neuro-Fuzzy Inference System (ANFIS) and the Autoregressive Moving Average with eXogenous term (ARMAX) model in forecasting global solar radiation. Full-scale (experimental) data from the Nigerian meteorological agency at Sultan Abubakar III International Airport, Sokoto, were used to validate the models. The simulation results demonstrated that the ANFIS model, having achieved a MAPE of 5.34%, outperformed the ARMAX model. ANFIS could be a valuable tool for forecasting global solar radiation.
Developing a Universal Navy Uniform Adoption Model for Use in Forecasting
2015-12-01
manpower, and allowance data in order to build the model. Once chosen, the best candidate model will be validated against alternate sales data from a... inventory shortage or excess inventory holding costs caused by overestimation. SUBJECT TERMS: demand management, demand forecasting, Defense... Software will be used to identify relationships between uniform sales, time, manpower, and allowance data in order to build the model.
Measuring the effectiveness of earthquake forecasting in insurance strategies
NASA Astrophysics Data System (ADS)
Mignan, A.; Muir-Wood, R.
2009-04-01
Given the difficulty of judging whether the skill of a particular methodology of earthquake forecasts is offset by the inevitable false alarms and missed predictions, it is important to find a means to weigh the successes and failures according to a common currency. Rather than judge subjectively the relative costs and benefits of predictions, we develop a simple method to determine if the use of earthquake forecasts can increase the profitability of active financial risk management strategies employed in standard insurance procedures. Three types of risk management transactions are employed: (1) insurance underwriting, (2) reinsurance purchasing and (3) investment in CAT bonds. For each case premiums are collected based on modelled technical risk costs and losses are modelled for the portfolio in force at the time of the earthquake. A set of predetermined actions follow from the announcement of any change in earthquake hazard, so that, for each earthquake forecaster, the financial performance of an active risk management strategy can be compared with the equivalent passive strategy in which no notice is taken of earthquake forecasts. Overall performance can be tracked through time to determine which strategy gives the best long term financial performance. This will be determined by whether the skill in forecasting the location and timing of a significant earthquake (where loss is avoided) is outweighed by false predictions (when no premium is collected). This methodology is to be tested in California, where catastrophe modeling is reasonably mature and where a number of researchers issue earthquake forecasts.
NASA Astrophysics Data System (ADS)
Saharia, M.; Wood, A.; Clark, M. P.; Bennett, A.; Nijssen, B.; Clark, E.; Newman, A. J.
2017-12-01
Most operational streamflow forecasting systems rely on a forecaster-in-the-loop approach in which some parts of the forecast workflow require an experienced human forecaster. But this approach faces challenges surrounding process reproducibility, hindcasting capability, and extension to large domains. The operational hydrologic community is increasingly moving towards 'over-the-loop' (completely automated) large-domain simulations, yet recent developments indicate a widespread lack of community knowledge about the strengths and weaknesses of such systems for forecasting. A realistic representation of land surface hydrologic processes is a critical element for improving forecasts, but often comes at a substantial cost in forecast system agility and efficiency. While popular grid-based models support the distributed representation of land surface processes, intermediate-scale Hydrologic Unit Code (HUC)-based modeling could provide a more efficient and process-aligned spatial discretization, reducing the need for tradeoffs between model complexity and critical forecasting requirements such as ensemble methods and comprehensive model calibration. The National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation, and the USACE to implement, assess, and demonstrate real-time, over-the-loop distributed streamflow forecasting for several large western US river basins and regions. In this presentation, we present early results from short- to medium-range hydrologic and streamflow forecasts for the Pacific Northwest (PNW). We employ real-time 1/16th-degree daily ensemble model forcings as well as downscaled Global Ensemble Forecasting System (GEFS) meteorological forecasts. These datasets drive an intermediate-scale configuration of the Structure for Unifying Multiple Modeling Alternatives (SUMMA) model, which represents the PNW using over 11,700 HUCs. The system produces not only streamflow forecasts (using the MizuRoute channel routing tool) but also distributed model states such as soil moisture and snow water equivalent. We also describe challenges in distributed model-based forecasting, including the application and early results of real-time hydrologic data assimilation.
Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS)
NASA Astrophysics Data System (ADS)
OConnor, A.; Kirtman, B. P.; Harrison, S.; Gorman, J.
2016-02-01
Current US Navy forecasting systems cannot easily incorporate extended-range forecasts that can improve mission readiness and effectiveness; ensure safety; and reduce cost, labor, and resource requirements. If Navy operational planners had systems that incorporated these forecasts, they could plan missions using more reliable and longer-term weather and climate predictions. Further, using multi-model forecast ensembles instead of single forecasts would produce higher predictive performance. Extended-range multi-model forecast ensembles, such as those available in the North American Multi-Model Ensemble (NMME), are ideal for system integration because of their high-skill predictions; however, even higher-skill predictions can be produced if forecast model ensembles are combined correctly. While many methods for weighting models exist, the best method in a given environment requires expert knowledge of the models and combination methods. We present an innovative approach that uses machine learning to combine extended-range predictions from multi-model forecast ensembles and generate a probabilistic forecast for any region of the globe up to 12 months in advance. Our machine-learning approach uses 30 years of hindcast predictions to learn patterns of forecast model successes and failures. Each model is assigned a weight for each environmental condition, 100 km² region, and day, given any expected environmental information. These weights are then applied to the respective predictions for the region and time of interest to effectively stitch together a single, coherent probabilistic forecast. Our experimental results demonstrate the benefits of our approach to produce extended-range probabilistic forecasts for regions and time periods of interest that are superior, in terms of skill, to individual NMME forecast models and commonly weighted models. The probabilistic forecast leverages the strengths of three NMME forecast models to predict environmental conditions for an area spanning from San Diego, CA to Honolulu, HI, seven months in advance. Key findings include: weighted combinations of models are strictly better than individual models; machine-learned combinations are especially better; and forecasts produced using our approach have the highest rank probability skill score most often.
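One simple instantiation of the per-region model weighting described above is inverse-error weighting learned from hindcasts: weight each ensemble member by the inverse of its historical mean squared error, then combine current predictions with those weights. This sketch is an assumption-laden stand-in for the (unspecified) machine-learning method in COMPASS; the data are synthetic and the spread estimate is deliberately crude.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic hindcasts: 3 models x 360 months for one region.
truth = rng.normal(20, 3, 360)
hindcasts = np.stack([
    truth + rng.normal(0.5, 1.0, 360),   # model A: small bias, low noise
    truth + rng.normal(-2.0, 1.5, 360),  # model B: cold bias
    truth + rng.normal(0.0, 3.0, 360),   # model C: unbiased but noisy
])

# Learn per-model weights from hindcast skill (inverse MSE, normalized).
mse = ((hindcasts - truth) ** 2).mean(axis=1)
w = (1 / mse) / (1 / mse).sum()

# Combine a new set of model forecasts into one probabilistic forecast.
new_fcsts = np.array([21.5, 18.9, 23.0])
mean = w @ new_fcsts
# Crude predictive variance: weighted model disagreement + weighted error.
spread = np.sqrt(w @ (new_fcsts - mean) ** 2 + w @ mse)
print(f"weights={np.round(w, 2)} combined={mean:.1f} +/- {spread:.1f}")
```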
Forecasting daily streamflow using online sequential extreme learning machines
NASA Astrophysics Data System (ADS)
Lima, Aranildo R.; Cannon, Alex J.; Hsieh, William W.
2016-06-01
While nonlinear machine learning methods have been widely used in environmental forecasting, in situations where new data arrive continually, the need to make frequent model updates can become cumbersome and computationally costly. To alleviate this problem, an online sequential learning algorithm for single-hidden-layer feedforward neural networks - the online sequential extreme learning machine (OSELM) - is automatically updated inexpensively as new data arrive (and the new data can then be discarded). OSELM was applied to forecast daily streamflow at two small watersheds in British Columbia, Canada, at lead times of 1-3 days. Predictors used were weather forecast data generated by the NOAA Global Ensemble Forecasting System (GEFS) and local hydro-meteorological observations. OSELM forecasts were tested with daily, monthly or yearly model updates. More frequent updating gave smaller forecast errors, including errors for data above the 90th percentile. Larger datasets used in the initial training of OSELM helped to find better parameters (number of hidden nodes) for the model, yielding better predictions. With online sequential multiple linear regression (OSMLR) as a benchmark, we concluded that OSELM is an attractive approach as it easily outperformed OSMLR in forecast accuracy.
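The OSELM named above trains a single-hidden-layer network whose hidden weights are random and fixed; only the output weights are updated as new data arrive, via a recursive least squares step, so each update is far cheaper than retraining. A compact numpy sketch under those assumptions (node count, chunk size, and data are illustrative, not the paper's configuration):

```python
import numpy as np

rng = np.random.default_rng(4)

def hidden(X, W, b):
    """Random-feature hidden layer with a sigmoid activation."""
    return 1 / (1 + np.exp(-(X @ W + b)))

# Toy streamflow-like regression data: y depends nonlinearly on 3 inputs
# (stand-ins for weather forecast and hydro-meteorological predictors).
X = rng.normal(size=(500, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 - X[:, 2] + rng.normal(0, 0.1, 500)

L = 40  # hidden nodes (illustrative)
W = rng.normal(size=(3, L))
b = rng.normal(size=L)

# Initial batch training on the first 100 samples.
H0 = hidden(X[:100], W, b)
P = np.linalg.inv(H0.T @ H0 + 1e-6 * np.eye(L))  # small ridge term for stability
beta = P @ H0.T @ y[:100]

# Online sequential updates, one chunk of new data at a time;
# each chunk can be discarded after its update.
for start in range(100, 400, 50):
    Hk = hidden(X[start:start + 50], W, b)
    yk = y[start:start + 50]
    K = P @ Hk.T @ np.linalg.inv(np.eye(len(yk)) + Hk @ P @ Hk.T)
    P = P - K @ Hk @ P
    beta = beta + P @ Hk.T @ (yk - Hk @ beta)   # RLS output-weight update

pred = hidden(X[400:], W, b) @ beta
print("holdout RMSE:", np.sqrt(np.mean((pred - y[400:]) ** 2)))
```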
A national econometric forecasting model of the dental sector.
Feldstein, P J; Roehrig, C S
1980-01-01
The Econometric Model of the Dental Sector forecasts a broad range of dental sector variables, including dental care prices; the amount of care produced and consumed; employment of hygienists, dental assistants, and clericals; hours worked by dentists; dental incomes; and the number of dentists. These forecasts are based upon values specified by the user for the various factors which help determine the supply and demand for dental care, such as the size of the population, per capita income, the proportion of the population covered by private dental insurance, the cost of hiring clericals and dental assistants, and relevant government policies. In a test of its reliability, the model forecast dental sector behavior quite accurately for the period 1971 through 1977. PMID:7461974
Analog-Based Postprocessing of Navigation-Related Hydrological Ensemble Forecasts
NASA Astrophysics Data System (ADS)
Hemri, S.; Klein, B.
2017-11-01
Inland waterway transport benefits from probabilistic forecasts of water levels as they allow operators to optimize the ship load and, hence, to minimize transport costs. Probabilistic state-of-the-art hydrologic ensemble forecasts inherit biases and dispersion errors from the atmospheric ensemble forecasts they are driven with. The use of statistical postprocessing techniques like ensemble model output statistics (EMOS) allows for a reduction of these systematic errors by fitting a statistical model based on training data. In this study, training periods for EMOS are selected based on forecast analogs, i.e., historical forecasts that are similar to the forecast to be verified. Due to the strong autocorrelation of water levels, forecast analogs have to be selected based on entire forecast hydrographs in order to guarantee similar hydrograph shapes. Custom-tailored measures of similarity for forecast hydrographs comprise hydrological series distance (SD), the hydrological matching algorithm (HMA), and dynamic time warping (DTW). Verification against observations reveals that EMOS forecasts for water level at three gauges along the river Rhine with training periods selected based on SD, HMA, and DTW compare favorably with reference EMOS forecasts, which are based either on seasonal training periods or on training periods obtained by dividing the hydrological forecast trajectories into runoff regimes.
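A minimal sketch of analog-based training-period selection with dynamic time warping (DTW): compute DTW distances between the current forecast hydrograph and archived forecast hydrographs, pick the k nearest as the training set, and fit a simple Gaussian EMOS on them (linear regression for the mean; a constant spread estimated from the analog residuals). The SD and HMA measures from the paper are not reproduced, and everything below, including the data, is an illustrative stand-in.

```python
import numpy as np

rng = np.random.default_rng(5)

def dtw(a, b):
    """Basic dynamic time warping distance between two series."""
    D = np.full((len(a) + 1, len(b) + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            D[i, j] = abs(a[i - 1] - b[j - 1]) + min(
                D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[-1, -1]

# Synthetic archive: 200 historical 10-day forecast hydrographs (ensemble
# means) and the verifying observations at the end of each horizon.
n, horizon = 200, 10
mean_fc = rng.normal(300, 50, (n, horizon)).cumsum(axis=1) / np.arange(1, horizon + 1)
obs = mean_fc[:, -1] + rng.normal(0, 25, n)

# Current forecast hydrograph; find the k most similar past hydrographs.
current = mean_fc[0] + rng.normal(0, 10, horizon)
k = 30
dists = np.array([dtw(current, h) for h in mean_fc])
analogs = np.argsort(dists)[:k]

# Gaussian EMOS on the analogs: mu = a + b * forecast; constant sigma.
fc_last = mean_fc[analogs, -1]
A = np.column_stack([np.ones(k), fc_last])
a, b = np.linalg.lstsq(A, obs[analogs], rcond=None)[0]
sigma = (obs[analogs] - (a + b * fc_last)).std()

mu = a + b * current[-1]
print(f"predictive water-level proxy: N(mu={mu:.1f}, sigma={sigma:.1f})")
```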
Pathways to designing and running an operational flood forecasting system: an adventure game!
NASA Astrophysics Data System (ADS)
Arnal, Louise; Pappenberger, Florian; Ramos, Maria-Helena; Cloke, Hannah; Crochemore, Louise; Giuliani, Matteo; Aalbers, Emma
2017-04-01
In the design and building of an operational flood forecasting system, a large number of decisions have to be taken. These include technical decisions related to the choice of the meteorological forecasts to be used as input to the hydrological model, the choice of the hydrological model itself (its structure and parameters), the selection of a data assimilation procedure to run in real time, the use (or not) of a post-processor, and the computing environment to run the models and display the outputs. Additionally, a number of trans-disciplinary decisions are involved in the process, such as the way the needs of the users will be considered in the modelling setup and how the forecasts (and their quality) will be efficiently communicated to ensure usefulness and build confidence in the forecasting system. We propose to reflect on the numerous alternative pathways to designing and running an operational flood forecasting system through an adventure game. In this game, the player is the protagonist of an interactive story driven by challenges, exploration and problem-solving. For this presentation, you will have a chance to play this game, acting as the leader of a forecasting team at an operational centre. Your role is to manage the actions of your team and make sequential decisions that impact the design and running of the system in preparation for and during a flood event, and that deal with the consequences of the forecasts issued. Your actions are evaluated by how much they cost you in time, money and credibility. Your aim is to take decisions that will ultimately lead to a good balance between time and money spent, while keeping your credibility high over the whole process. This game was designed to highlight the complexities behind decision-making in an operational forecasting and emergency response context, in terms of the variety of pathways that can be selected as well as the timescale, cost and timing of effective actions.
ERIC Educational Resources Information Center
General Accounting Office, Washington, DC.
This supplementary report identifies and provides individual descriptions and reviews of 71 retirement forecasting models. Composed of appendices, it is intended as a source of more detailed information than that included in the main volume of the report. Appendix I is an introduction. Appendix II contains individual descriptions of 32 models of…
A Comparative Verification of Forecasts from Two Operational Solar Wind Models
2010-12-16
…knowing how much confidence to place on predicted parameters. Cost/benefit information is provided to administrators who decide to sustain or… components of the magnetic field vector in the geocentric solar magnetospheric (GSM) coordinate system at each hour of forecast time.
NASA Astrophysics Data System (ADS)
Karlovits, G. S.; Villarini, G.; Bradley, A.; Vecchi, G. A.
2014-12-01
Forecasts of seasonal precipitation and temperature can provide information in advance of potentially costly disruptions caused by flood and drought conditions. The consequences of these adverse hydrometeorological conditions may be mitigated through informed planning and response, given useful and skillful forecasts of these conditions. However, the potential value and applicability of these forecasts is unavoidably linked to their forecast quality. In this work we evaluate the skill of four global circulation models (GCMs) that are part of the North American Multi-Model Ensemble (NMME) project in forecasting seasonal precipitation and temperature over the continental United States. The GCMs we consider are the Geophysical Fluid Dynamics Laboratory (GFDL) CM2.1, the NASA Global Modeling and Assimilation Office (NASA-GMAO) GEOS-5, the Center for Ocean-Land-Atmosphere Studies - Rosenstiel School of Marine & Atmospheric Science (COLA-RSMAS) CCSM3, and the Canadian Centre for Climate Modeling and Analysis (CCCma) CanCM4. These models provide monthly forecasts at 1-degree resolution, with forecast horizons of at least nine months and up to one year. The model ensembles are compared against gridded monthly temperature and precipitation data created by the PRISM Climate Group, which represent the reference observation dataset in this work. Aspects of forecast quality are quantified using a diagnostic skill score decomposition that allows the evaluation of the potential skill and the conditional and unconditional biases associated with these forecasts. Evaluating the decomposed GCM forecast skill over the continental United States, by season and by lead time, allows for a better understanding of the utility of these models for flood and drought predictions. It also represents a diagnostic tool that could give model developers feedback about the strengths and weaknesses of their models.
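The abstract does not name the decomposition, but a standard choice with exactly these three components (potential skill, conditional bias, unconditional bias) is the Murphy-style decomposition of the MSE skill score; the sketch below is written under that assumption, with placeholder array names.

    import numpy as np

    def skill_decomposition(f, o):
        # Murphy (1988)-style decomposition of the MSE skill score:
        #   SS = r^2 - (r - sf/so)^2 - ((mf - mo)/so)^2
        # r^2 is the potential skill; the two subtracted terms are the
        # conditional and unconditional bias penalties.
        r = np.corrcoef(f, o)[0, 1]
        sf, so = f.std(), o.std()
        potential = r ** 2
        cond_bias = (r - sf / so) ** 2
        uncond_bias = ((f.mean() - o.mean()) / so) ** 2
        ss = potential - cond_bias - uncond_bias
        return ss, potential, cond_bias, uncond_bias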
Uncertainty quantification and optimal decisions
2017-01-01
A mathematical model can be analysed to construct policies for action that are close to optimal for the model. If the model is accurate, such policies will be close to optimal when implemented in the real world. In this paper, the different aspects of an ideal workflow are reviewed: modelling, forecasting, evaluating forecasts, data assimilation and constructing control policies for decision-making. The example of the oil industry is used to motivate the discussion, and other examples, such as weather forecasting and precision agriculture, are used to argue that the same mathematical ideas apply in different contexts. Particular emphasis is placed on (i) uncertainty quantification in forecasting and (ii) how decisions are optimized and made robust to uncertainty in models and judgements. This necessitates full use of the relevant data and, by balancing costs and benefits into the long term, may suggest policies quite different from those relevant to the short term. PMID:28484343
NASA Astrophysics Data System (ADS)
Mohite, A. R.; Beria, H.; Behera, A. K.; Chatterjee, C.; Singh, R.
2016-12-01
Flood forecasting using hydrological models is an important and cost-effective non-structural flood management measure. For forecasting at short lead times, empirical models using real-time precipitation estimates have proven to be reliable. However, their skill depreciates with increasing lead time. Coupling a hydrologic model with real-time rainfall forecasts issued from numerical weather prediction (NWP) systems could increase the lead time substantially. In this study, we compared 1-5 day precipitation forecasts from the India Meteorological Department (IMD) Multi-Model Ensemble (MME) with European Centre for Medium-Range Weather Forecasts (ECMWF) NWP forecasts over 86 major river basins in India. We then evaluated the hydrologic utility of these forecasts over the Basantpur catchment (approx. 59,000 km2) of the Mahanadi River basin. Coupled MIKE 11 RR (NAM) and MIKE 11 hydrodynamic (HD) models were used for the development of the flood forecast system (FFS). The RR model was calibrated using IMD station rainfall data. Cross-sections extracted from SRTM 30 were used as input to the MIKE 11 HD model. IMD started issuing operational MME forecasts in 2008, and hence both the statistical and the hydrologic evaluation were carried out for 2008-2014. The performance of the FFS was evaluated using both NWP datasets separately for the year 2011, a large flood year in the Mahanadi River basin. We will present figures and metrics for the statistical evaluation (threshold-based statistics, skill in terms of correlation and bias) and the hydrologic evaluation (Nash-Sutcliffe efficiency, mean and peak error statistics). The statistical evaluation will be at the pan-India scale for all the major river basins, and the hydrologic evaluation will be for the Basantpur catchment of the Mahanadi River basin.
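For reference, the Nash-Sutcliffe efficiency named as the hydrologic evaluation metric is straightforward to compute; a minimal sketch with placeholder array names:

    import numpy as np

    def nash_sutcliffe(sim, obs):
        # NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
        # 1 is a perfect fit, 0 matches the observed mean, < 0 is worse
        # than simply predicting the mean.
        obs = np.asarray(obs, dtype=float)
        sim = np.asarray(sim, dtype=float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)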
NASA Astrophysics Data System (ADS)
Gass, S. I.
1982-05-01
The theoretical and applied state of the art of oil and gas supply models was discussed. The following areas were addressed: the realities of oil and gas supply, prediction of oil and gas production, problems in oil and gas modeling, resource appraisal procedures, forecasting field size and production, investment and production strategies, estimating cost and production schedules for undiscovered fields, production regulations, resource data, sensitivity analysis of forecasts, econometric analysis of resource depletion, oil and gas finding rates, and various models of oil and gas supply.
Forecast of dengue incidence using temperature and rainfall.
Hii, Yien Ling; Zhu, Huaiping; Ng, Nawi; Ng, Lee Ching; Rocklöv, Joacim
2012-01-01
An accurate early warning system to predict impending epidemics enhances the effectiveness of preventive measures against dengue fever. The aim of this study was to develop and validate a forecasting model that could predict dengue cases and provide timely early warning in Singapore. We developed a time series Poisson multivariate regression model using weekly mean temperature and cumulative rainfall over the period 2000-2010. Weather data were modeled using piecewise linear spline functions. We analyzed various lag times between dengue and weather variables to identify the optimal dengue forecasting period. Autoregression, seasonality and trend were considered in the model. We validated the model by forecasting dengue cases for week 1 of 2011 up to week 16 of 2012 using weather data alone. Model selection and validation were based on Akaike's Information Criterion, standardized Root Mean Square Error, and residual diagnostics. A Receiver Operating Characteristic curve was used to analyze the sensitivity of the forecast of epidemics. The optimal period for dengue forecasting was 16 weeks. Our model forecasted correctly, with errors of 0.3 and 0.32 of the standard deviation of reported cases during the model training and validation periods, respectively. It was sensitive enough to distinguish between outbreak and non-outbreak periods, with a sensitivity of 96% (CI = 93-98%) in 2004-2010 and 98% (CI = 95-100%) in 2011. The model predicted the outbreak in 2011 accurately, with less than a 3% possibility of false alarm. We have developed a weather-based dengue forecasting model that allows warning 16 weeks in advance of dengue epidemics with high sensitivity and specificity. We demonstrate that models using temperature and rainfall can be simple, precise, and low-cost tools for dengue forecasting, which could be used to enhance decision making on the timing and scale of vector control operations and the utilization of limited resources.
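A hedged sketch of the kind of lagged Poisson regression described, using statsmodels. The file name and column names are hypothetical, and the piecewise linear splines used in the study are omitted here for brevity.

    import pandas as pd
    import statsmodels.api as sm

    # Assumed weekly columns: 'cases', 'mean_temp', 'cum_rain'.
    df = pd.read_csv("dengue_weekly.csv")  # hypothetical file
    lag = 16  # the optimal forecast lead found in the study

    X = pd.DataFrame({
        "temp_lag": df["mean_temp"].shift(lag),   # weather 16 weeks earlier
        "rain_lag": df["cum_rain"].shift(lag),
        "cases_lag": df["cases"].shift(1),        # autoregression term
    }).dropna()
    y = df["cases"].loc[X.index]

    model = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
    print(model.summary())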
40 CFR Appendix A to Part 57 - Primary Nonferrous Smelter Order (NSO) Application
Code of Federal Regulations, 2011 CFR
2011-07-01
... Profit and Loss Summary
A.4 Historical Capital Investment Summary
B.1 Pre-Control Revenue Forecast
B.2 Pre-Control Cost Forecast
B.3 Pre-Control Forecast Profit and Loss Summary
B.4 Constant Controls Revenue Forecast
B.5 Constant Controls Cost Forecast
B.6 Constant Controls Forecast Profit and Loss...
Development of S-ARIMA Model for Forecasting Demand in a Beverage Supply Chain
NASA Astrophysics Data System (ADS)
Mircetic, Dejan; Nikolicic, Svetlana; Maslaric, Marinko; Ralevic, Nebojsa; Debelic, Borna
2016-11-01
Demand forecasting is one of the key activities in planning the freight flows in supply chains, and accordingly it is essential for planning and scheduling the logistic activities within an observed supply chain. Accurate demand forecasting models directly influence the decrease of logistics costs, since they provide an assessment of customer demand. Customer demand is a key component for planning all logistic processes in a supply chain, and therefore determining levels of customer demand is of great interest for supply chain managers. In this paper we deal with exactly this kind of problem, and we develop a seasonal Autoregressive Integrated Moving Average (SARIMA) model for forecasting demand patterns of a major product of an observed beverage company. The model is easy to understand, flexible to use and appropriate for assisting the expert in the decision-making process about consumer demand in particular periods.
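A SARIMA model of this kind can be fitted with statsmodels. The sketch below is illustrative only: the input file is hypothetical and the (p,d,q)(P,D,Q)s order is an arbitrary placeholder, since the paper's fitted order is not given here.

    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    # 'units' is assumed to be a monthly sales series for the major product.
    sales = pd.read_csv("beverage_sales.csv", index_col=0,
                        parse_dates=True)["units"]

    # Placeholder order with a 12-month seasonal cycle.
    model = SARIMAX(sales, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12))
    fit = model.fit(disp=False)
    print(fit.forecast(steps=12))  # demand forecast for the next year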
The Economic Value of Air Quality Forecasting
NASA Astrophysics Data System (ADS)
Anderson-Sumo, Tasha
Both long-term and daily air quality forecasts provide an essential component of protecting human health and containing impact costs. According to the American Lung Association, the estimated current annual cost of air pollution related illness in the United States, adjusted for inflation (3% per year), is approximately $152 billion. Many of the risks, such as hospital visits and mortality, are associated with poor air quality days (where the Air Quality Index is greater than 100). Sensitive groups become more susceptible to the resulting conditions, and more accurate forecasts would help them take appropriate precautions. This research focuses on evaluating the utility of air quality forecasting in terms of its potential impacts by building on air quality forecasting and economic metrics. Our analysis includes data collected during the summertime ozone seasons between 2010 and 2012 from air quality models for the Washington, DC/Baltimore, MD region. The metrics relevant to our analysis include: (1) the number of times that a high ozone or particulate matter (PM) episode is correctly forecasted, (2) the number of times that a high ozone or PM episode is forecasted when it does not occur, and (3) the number of times the air quality forecast predicts a cleaner air episode when the air was observed to have high ozone or PM. Our collection of data included available air quality model forecasts of ozone and particulate matter from the U.S. Environmental Protection Agency (EPA)'s AIRNOW as well as observational data of ozone and particulate matter from Clean Air Partners. We compared the performance of the air quality forecasts with the observational data and found that the forecast models perform well for the Baltimore/Washington region and the time interval observed. We estimate the potential savings for the Baltimore/Washington region to be up to 5,905 lives and 5.9 billion dollars per year. This total assumes perfect compliance with poor air quality warnings and perfectly accurate forecasts. Evaluating the economic utility of the forecasts is difficult because not all will comply. Even with a low compliance rate of 5% and an average probability of detection of poor air quality days by the air quality models of 72%, we estimate that the forecasting program saves 412 lives, or 412 million dollars, per year for the region. The totals we found are as great as or greater than those of other typical yearly meteorological hazard programs, such as tornado or hurricane forecasting, and it is clear that the economic value of air quality forecasting in the Baltimore/Washington region is considerable.
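The three metrics enumerated above are the cells of a standard forecast contingency table, from which scores such as the probability of detection follow directly; a minimal sketch with placeholder array names:

    import numpy as np

    def contingency_scores(forecast_bad, observed_bad):
        # forecast_bad / observed_bad: boolean arrays flagging days with
        # AQI > 100. The counts map directly onto metrics (1)-(3) above.
        hits = np.sum(forecast_bad & observed_bad)           # metric (1)
        false_alarms = np.sum(forecast_bad & ~observed_bad)  # metric (2)
        misses = np.sum(~forecast_bad & observed_bad)        # metric (3)
        pod = hits / (hits + misses)                # probability of detection
        far = false_alarms / (hits + false_alarms)  # false alarm ratio
        return pod, far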
Essays on oil price volatility and irreversible investment
NASA Astrophysics Data System (ADS)
Pastor, Daniel J.
In chapter 1, we provide an extensive and systematic evaluation of the relative forecasting performance of several models for the volatility of daily spot crude oil prices. Empirical research over the past decades has uncovered significant gains in forecasting performance of Markov Switching GARCH models over GARCH models for the volatility of financial assets and crude oil futures. We find that, for spot oil price returns, non-switching models perform better in the short run, whereas switching models tend to do better at longer horizons. In chapter 2, I investigate the impact of volatility on firms' irreversible investment decisions using real options theory. Cost incurred in oil drilling is considered sunk cost, thus irreversible. I collect detailed data on onshore, development oil well drilling on the North Slope of Alaska from 2003 to 2014. Volatility is modeled by constructing GARCH, EGARCH, and GJR-GARCH forecasts based on monthly real oil prices, and realized volatility from 5-minute intraday returns of oil futures prices. Using a duration model, I show that oil price volatility generally has a negative relationship with the hazard rate of drilling an oil well both when aggregating all the fields, and in individual fields.
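A hedged sketch of the volatility-modeling step in chapter 1, using the third-party Python arch package. The data file is hypothetical and the authors' exact model specifications are not reproduced here.

    import numpy as np
    from arch import arch_model  # third-party 'arch' package

    prices = np.loadtxt("wti_spot.txt")       # hypothetical daily spot prices
    returns = 100 * np.diff(np.log(prices))   # percent log-returns

    # GARCH(1,1); setting o=1 instead gives the GJR-GARCH variant.
    am = arch_model(returns, vol="GARCH", p=1, q=1)
    res = am.fit(disp="off")
    print(res.forecast(horizon=5).variance.iloc[-1])  # 1- to 5-day variance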
NASA Astrophysics Data System (ADS)
Dreano, Denis; Tsiaras, Kostas; Triantafyllou, George; Hoteit, Ibrahim
2017-07-01
Forecasting the state of large marine ecosystems is important for many economic and public health applications. However, advanced three-dimensional (3D) ecosystem models, such as the European Regional Seas Ecosystem Model (ERSEM), are computationally expensive, especially when implemented within an ensemble data assimilation system requiring several parallel integrations. As an alternative to 3D ecological forecasting systems, we propose to implement a set of regional one-dimensional (1D) water-column ecological models that run at a fraction of the computational cost. The 1D model domains are determined using a Gaussian mixture model (GMM)-based clustering method and satellite chlorophyll-a (Chl-a) data. Regionally averaged Chl-a data are assimilated into the 1D models using the singular evolutive interpolated Kalman (SEIK) filter. To laterally exchange information between subregions and improve the forecasting skill, we introduce a new correction step to the assimilation scheme, in which we assimilate a statistical forecast of future Chl-a observations based on information from neighbouring regions. We apply this approach to the Red Sea and show that the assimilative 1D ecological models can forecast surface Chl-a concentration with high accuracy. The statistical assimilation step further improves the forecasting skill by as much as 50%. This general approach of clustering large marine areas and running several interacting 1D ecological models is very flexible. It allows many combinations of clustering, filtering and regression techniques to be used and can be applied to build efficient forecasting systems in other large marine ecosystems.
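The GMM-based domain clustering step can be illustrated with scikit-learn. This is a minimal sketch: the file name, the feature construction, and the number of components are assumptions, not the paper's configuration.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    # chl: array of shape (n_pixels, n_features) holding, for example, the
    # mean annual cycle of satellite Chl-a at each grid point (hypothetical).
    chl = np.load("chla_features.npy")

    gmm = GaussianMixture(n_components=4, random_state=0).fit(chl)
    labels = gmm.predict(chl)  # subregion assignment for each pixel;
    # one 1D water-column model is then run per subregion.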
Research on regional numerical weather prediction
NASA Technical Reports Server (NTRS)
Kreitzberg, C. W.
1976-01-01
Extension of the predictive power of dynamic weather forecasting to scales below the conventional synoptic or cyclonic scales in the near future is assessed. Lower costs per computation, more powerful computers, and a 100 km mesh over the North American area (with coarser mesh extending beyond it) are noted at present. Doubling the resolution even locally (to 50 km mesh) would entail a 16-fold increase in costs (including vertical resolution and halving the time interval), and constraints on domain size and length of forecast. Boundary conditions would be provided by the surrounding 100 km mesh, and time-varying lateral boundary conditions can be considered to handle moving phenomena. More physical processes to treat, more efficient numerical techniques, and faster computers (improved software and hardware) backing up satellite and radar data could produce further improvements in forecasting in the 1980s. Boundary layer modeling, initialization techniques, and quantitative precipitation forecasting are singled out among key tasks.
NASA Astrophysics Data System (ADS)
Ghonima, M. S.; Yang, H.; Zhong, X.; Ozge, B.; Sahu, D. K.; Kim, C. K.; Babacan, O.; Hanna, R.; Kurtz, B.; Mejia, F. A.; Nguyen, A.; Urquhart, B.; Chow, C. W.; Mathiesen, P.; Bosch, J.; Wang, G.
2015-12-01
One of the main obstacles to high penetrations of solar power is the variable nature of solar power generation. To mitigate variability, grid operators have to schedule additional reliability resources, at considerable expense, to ensure that load requirements are met by generation. Thus, even as the cost of solar PV decreases, the cost of integrating solar power will increase as the penetration of solar resources onto the electric grid increases. There are three principal tools currently available to mitigate variability impacts: (i) flexible generation, (ii) storage, either virtual (demand response) or physical devices, and (iii) solar forecasting. Storage devices are a powerful tool capable of ensuring smooth power output from renewable resources. However, the high cost of storage is prohibitive, and markets are still being designed to leverage their full potential and mitigate their limitations (e.g. empty storage). Solar forecasting provides valuable information on the daily net load profile and upcoming ramps (increasing or decreasing solar power output), thereby giving the grid advance warning to schedule ancillary generation more accurately, or to curtail solar power output. In order to develop solar forecasting as a tool that can be utilized by grid operators, we identified two focus areas: (i) develop solar forecast technology and improve solar forecast accuracy and (ii) develop forecasts that can be incorporated within existing grid planning and operation infrastructure. The first area requires atmospheric science and engineering research, while the second requires detailed knowledge of energy markets and power engineering. Motivated by this background, we will emphasize area (i) in this talk and provide an overview of recent advancements in solar forecasting, especially in two areas: (a) numerical modeling tools for coastal stratocumulus to improve scheduling in the day-ahead California energy market, and (b) development of a sky imager to provide short-term forecasts (0-20 min ahead) to improve optimization and control of equipment on distribution feeders with high penetration of solar. Leveraging such tools, which have seen extensive use in the atmospheric sciences, supports the development of accurate physics-based solar forecast models. Directions for future research are also provided.
Zhao, Xiuli; Asante Antwi, Henry; Yiranbon, Ethel
2014-01-01
The idea of aggregating information is clearly recognizable in the daily lives of all entities, whether as individuals or as groups; since time immemorial, corporate organizations, governments, and individuals as economic agents have aggregated information to formulate decisions. Energy planning represents an investment-decision problem where information needs to be aggregated from credible sources to predict both demand and supply of energy. The available methods vary, ranging from the use of portfolio theory to managing risk and maximizing portfolio performance under a variety of unpredictable economic outcomes. The future demand for energy and the need to use solar energy in order to avoid a future energy crisis in Jiangsu province in China require energy planners in the province to abandon their reliance on traditional, “least-cost,” stand-alone technology cost estimates and instead evaluate conventional and renewable energy supply on the basis of a hybrid of optimization models in order to ensure effective and reliable supply. Our task in this research is to propose measures towards optimal solar energy forecasting by employing a systematic optimization approach based on a hybrid of weather and energy forecast models. After giving an overview of the sustainable energy issues in China, we review and classify the various models that existing studies have used to predict the influence of weather on the output of solar energy production units. Further, we evaluate the performance of an exemplary ensemble model which combines the forecast output of two popular statistical prediction methods using a dynamic weighting factor. PMID:24511292
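The abstract does not specify the dynamic weighting scheme; one common choice is to weight each model by the inverse of its recent error, and the sketch below is written under that assumption, with placeholder names throughout.

    import numpy as np

    def combine(f1, f2, e1, e2, window=30):
        # Combine two point forecasts f1, f2 by weighting each model with
        # the inverse of its recent mean absolute error (e1, e2 are the
        # models' historical error series), so the better-performing
        # model dominates as conditions change.
        w1 = 1.0 / (np.mean(np.abs(e1[-window:])) + 1e-9)
        w2 = 1.0 / (np.mean(np.abs(e2[-window:])) + 1e-9)
        return (w1 * f1 + w2 * f2) / (w1 + w2)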
NASA Astrophysics Data System (ADS)
Kato, Takeyoshi; Sone, Akihito; Shimakage, Toyonari; Suzuoki, Yasuo
A microgrid (MG) is one of the measures for enhancing the high penetration of renewable energy (RE)-based distributed generators (DGs). For constructing an MG economically, optimizing the capacity of controllable DGs against RE-based DGs is essential. Using a numerical simulation model developed from demonstrative studies of an MG that uses a PAFC and a NaS battery as controllable DGs and a photovoltaic power generation system (PVS) as an RE-based DG, this study discusses how the forecast accuracy of PVS output influences capacity optimization and daily operation, evaluated in terms of cost. The main results are as follows. The required capacity of the NaS battery must be increased by 10-40% relative to the ideal situation without forecast error in PVS power output. The influence of forecast error on received grid electricity is not significant on an annual basis, because positive and negative forecast errors vary from day to day. The annual total cost of facilities and operation increases by 2-7% due to the forecast error applied in this study. The impacts of forecast error on facility optimization and on operation optimization are comparable, at a few percent each, implying that forecast accuracy should be improved in terms of both the number of occurrences of large forecast error and the average error.
NASA Technical Reports Server (NTRS)
French, V. (Principal Investigator)
1982-01-01
An evaluation was made of Thompson-type models which use trend terms (as a surrogate for technology) and meteorological variables based on monthly average temperature and total precipitation to forecast and estimate corn yields in Iowa, Illinois, and Indiana. Pooled and unpooled Thompson-type models were compared; neither was found to be consistently superior to the other. Yield reliability indicators show that the models are of limited use for large-area yield estimation. The models are objective and consistent with scientific knowledge. Timely yield forecasts and estimates can be made during the growing season by using normals or long-range weather forecasts. The models are not costly to operate and are easy to use and understand. The model standard errors of prediction do not provide a useful current measure of modeled yield reliability.
Supplier Short Term Load Forecasting Using Support Vector Regression and Exogenous Input
NASA Astrophysics Data System (ADS)
Matijaš, Marin; Vukićević, Milan; Krajcar, Slavko
2011-09-01
In power systems, the task of load forecasting is important for keeping equilibrium between production and consumption. With the liberalization of electricity markets, the task of load forecasting changed because each market participant has to forecast its own load. The consumption of end-consumers is stochastic in nature. Due to competition, suppliers are not in a position to transfer their costs to end-consumers; it is therefore essential to keep the forecasting error as low as possible. Numerous papers investigate load forecasting from the perspective of the grid or production planning; we research forecasting models from the perspective of a supplier. In this paper, we investigate different combinations of exogenous input on simulated supplier loads and show that using points of delivery as a feature for Support Vector Regression leads to lower forecasting error, while adding the customer number in different datasets does the opposite.
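A hedged sketch of a supplier-side SVR forecaster with an exogenous feature, using scikit-learn. The file names, feature set, and hyperparameters are assumptions, not the paper's configuration.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    # X: hourly features such as lagged load, calendar dummies, and the
    # number of points of delivery (the exogenous input studied here);
    # y: the supplier's load. Both are hypothetical arrays.
    X, y = np.load("features.npy"), np.load("load.npy")

    model = make_pipeline(StandardScaler(),
                          SVR(kernel="rbf", C=10.0, epsilon=0.1))
    model.fit(X[:-24], y[:-24])          # train on all but the last day
    print(model.predict(X[-24:]))        # day-ahead load forecast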
Glied, Sherry; Zaylor, Abigail
2015-07-01
The authors assess how Medicare financing and projections of future costs have changed since 2000. They also assess the impact of legislative reforms on the sources and levels of financing and compare cost forecasts made at different times. Although the aging U.S. population and rising health care costs are expected to increase the share of gross domestic product devoted to Medicare, changes made to the program over the past decade have helped stabilize Medicare's financial outlook, even as benefits have been expanded. Long-term forecasting uncertainty should make policymakers and beneficiaries wary of dramatic near-term changes to the program intended to alter its long-term forecast: the range of error associated with cost forecasts rises as the forecast window lengthens. Instead, policymakers should focus on the immediate policy window, taking steps to reduce the current burden of Medicare costs by containing spending today.
A model to forecast data centre infrastructure costs.
NASA Astrophysics Data System (ADS)
Vernet, R.
2015-12-01
The computing needs in the HEP community are increasing steadily, but the current funding situation in many countries is tight. As a consequence, experiments, data centres, and funding agencies have to rationalize resource usage and expenditures. CC-IN2P3 (Lyon, France) provides computing resources to many experiments including LHC, and is a major partner for astroparticle projects like LSST, CTA or Euclid. The financial cost to accommodate all these experiments is substantial and has to be planned well in advance for funding and strategic reasons. In that perspective, leveraging the infrastructure expenses, electric power cost and hardware performance observed at our site over the last years, we have built a model that integrates these data and provides estimates of the investments that would be required to cater to the experiments for the mid-term future. We present how our model is built and the expenditure forecast it produces, taking into account the experiment roadmaps. We also examine the resource growth predicted by our model over the next years assuming a flat-budget scenario.
NASA Technical Reports Server (NTRS)
Kratochvil, D.; Bowyer, J.; Bhushan, C.; Steinnagel, K.; Kaushal, D.; Al-Kinani, G.
1984-01-01
The overall purpose was to forecast the potential United States domestic telecommunications demand for satellite-provided customer premises voice, data and video services through the year 2000, so that this information on service demand would be available to aid in NASA program planning. To accomplish this overall purpose, the following objectives were achieved: (1) development of a forecast of the total domestic telecommunications demand; (2) identification of that portion of the telecommunications demand suitable for transmission by satellite systems; (3) identification of that portion of the satellite market addressable by customer premises service (CPS) systems; (4) identification of that portion of the satellite market addressable by Ka-band CPS systems; and (5) postulation of a Ka-band CPS network on a nationwide and local level. The approach employed included the use of a variety of forecasting models, a parametric cost model, a market distribution model and a network optimization model. Forecasts were developed for: 1980, 1990, and 2000; voice, data and video services; terrestrial and satellite delivery modes; and C, Ku and Ka-bands.
A Machine Learning Framework to Forecast Wave Conditions
NASA Astrophysics Data System (ADS)
Zhang, Y.; James, S. C.; O'Donncha, F.
2017-12-01
Recently, significant effort has been undertaken to quantify and extract wave energy because it is renewable, environmentally friendly, abundant, and often close to population centers. However, a major challenge is the ability to accurately and quickly predict energy production, especially across a 48-hour cycle. Accurate forecasting of wave conditions is a challenging undertaking that typically involves solving the spectral action-balance equation on a discretized grid with high spatial resolution. The nature of the computations typically demands high-performance computing infrastructure. Using a case-study site at Monterey Bay, California, a machine learning framework was trained to replicate numerically simulated wave conditions at a fraction of the typical computational cost. Specifically, the physics-based Simulating WAves Nearshore (SWAN) model, driven by measured wave conditions, nowcast ocean currents, and wind data, was used to generate training data for machine learning algorithms. The model was run between April 1st, 2013 and May 31st, 2017, generating forecasts at three-hour intervals and yielding 11,078 distinct model outputs. SWAN-generated fields of 3,104 wave heights and a characteristic period could be replicated through simple matrix multiplications using the mapping matrices from the machine learning algorithms. In fact, wave-height RMSEs from the machine learning algorithms (9 cm) were smaller than those from the SWAN model-verification exercise, where simulations were compared to buoy wave data within the model domain (>40 cm). The validated machine learning approach, which acts as an accurate surrogate for the SWAN model, can now be used to perform real-time forecasts of wave conditions for the next 48 hours using available forecasted boundary wave conditions, ocean currents, and winds. This solution has obvious applications to wave-energy generation, as accurate wave conditions can be forecasted with over a three-order-of-magnitude reduction in computational expense. The low computational cost (and, by association, low computer-power requirement) means that the machine learning algorithms could be installed on a wave-energy converter as a form of "edge computing", where a device could forecast its own 48-hour energy production.
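Because the replication reduces to matrix multiplications, the surrogate idea can be illustrated with an ordinary least-squares mapping. This is a minimal sketch: the file names and shapes are assumptions, and the paper's actual learning algorithm may differ.

    import numpy as np

    # Training pairs from archived SWAN runs: inputs X (boundary waves,
    # currents, winds), outputs Y (the 3,104 wave heights per forecast).
    X = np.load("swan_inputs.npy")   # shape (11078, n_features), hypothetical
    Y = np.load("swan_outputs.npy")  # shape (11078, 3104), hypothetical

    # Fit the mapping matrix M minimising ||X M - Y||.
    M, *_ = np.linalg.lstsq(X, Y, rcond=None)

    # A new forecast is then a single matrix multiplication, orders of
    # magnitude cheaper than re-running SWAN.
    x_new = X[-1]          # stand-in for new boundary conditions
    y_new = x_new @ M      # predicted field of 3,104 wave heights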
Weighing costs and losses: A decision making game using probabilistic forecasts
NASA Astrophysics Data System (ADS)
Werner, Micha; Ramos, Maria-Helena; Wetterhall, Frederik; Cranston, Michael; van Andel, Schalk-Jan; Pappenberger, Florian; Verkade, Jan
2017-04-01
Probabilistic forecasts are increasingly recognised as an effective and reliable tool to communicate uncertainties. The economic value of probabilistic forecasts has been demonstrated by several authors, showing the benefit of using probabilistic forecasts over deterministic forecasts in several sectors, including flood and drought warning, hydropower, and agriculture. Probabilistic forecasting is also central to the emerging concept of risk-based decision making, and underlies emerging paradigms such as impact-based forecasting. Although the economic value of probabilistic forecasts is easily demonstrated in academic works, its evaluation in practice is more complex. The practical use of probabilistic forecasts requires decision makers to weigh the cost of an appropriate response to a probabilistic warning against the projected loss that would occur if the event forecast becomes reality. In this paper, we present the results of a simple game that aims to explore how decision makers are influenced by the costs required for taking a response and the potential losses they face in case the forecast flood event occurs. Participants play the role of one of three different shop owners. Each type of shop faces losses of quite different magnitude should a flood event occur. The shop owners are presented with several forecasts, each with a probability of a flood event occurring which would inundate their shop and lead to those losses. In response, they have to decide whether to do nothing, raise temporary defences, or relocate their inventory. Each action comes at a cost, and the different shop owners therefore have quite different cost/loss ratios. The game was played on four occasions. Players were attendees of the ensemble hydro-meteorological forecasting session of the 2016 EGU General Assembly, professionals participating in two other conferences related to hydrometeorology, and a group of students. All audiences were familiar with the principles of forecasting and water-related risks, and one of the audiences comprised a group of experts in probabilistic forecasting. Results show that the different shop owners do take the costs of taking action and the potential losses into account in their decisions. Shop owners with a low cost/loss ratio were found to be more inclined to take actions based on the forecasts, though the absolute value of the losses also increased the willingness to take action. Little differentiation was found between the different groups of players.
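The cost/loss trade-off at the heart of the game can be written down in a few lines. The sketch below uses invented numbers for illustration; the game's actual costs, losses, and action effectiveness are not reproduced here.

    def best_action(p, loss, actions):
        # Pick the action minimising expected cost
        #   cost + p * (1 - fraction_avoided) * loss,
        # where p is the forecast probability of the flood event.
        # actions: list of (name, cost, fraction_of_loss_avoided) tuples.
        return min(actions, key=lambda a: a[1] + p * (1 - a[2]) * loss)

    # A shop owner facing a 30% flood probability and a 50,000 potential loss
    # (all numbers illustrative):
    print(best_action(0.3, 50_000, [
        ("do nothing", 0, 0.0),
        ("raise temporary defences", 2_000, 0.7),
        ("relocate inventory", 8_000, 0.95),
    ]))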
Valuing hydrological forecasts for a pumped storage assisted hydro facility
NASA Astrophysics Data System (ADS)
Zhao, Guangzhi; Davison, Matt
2009-07-01
This paper estimates the value of a perfectly accurate short-term hydrological forecast to the operator of a hydroelectric generating facility which can sell its power at time-varying but predictable prices. The expected value of a less accurate forecast will be smaller. We assume a simple random model for water inflows and that the costs of operating the facility, including water charges, will be the same whether or not its operator has inflow forecasts. Thus, the improvement in value from better hydrological prediction results from the increased ability of the forecast-using facility to sell its power at high prices. The value of the forecast is therefore the difference between the sales of a facility operated over some time horizon with a perfect forecast, and the sales of a similar facility operated over the same time horizon with similar water inflows which, though governed by the same random model, cannot be forecast. This paper shows that the value of the forecast is an increasing function of the inflow process variance and quantifies how much the value of this perfect forecast increases with the variance of the water inflow process. Because the lifetime of hydroelectric facilities is long, the small increase observed here can lead to an increase in the profitability of hydropower investments.
Labovitz, Jonathan M; Kominski, Gerald F
2016-05-01
Because value-based care is critical to the success of the Affordable Care Act, we forecasted inpatient costs and the potential impact of podiatric medical care on savings in the diabetic population through improved care quality and decreased resource use during implementation of the health reform initiatives in California. We forecasted enrollment of diabetic adults into Medicaid and subsidized health benefit exchange programs using the California Simulation of Insurance Markets (CalSIM) base model. Amputations and admissions per 1,000 diabetic patients and inpatient costs were based on the California Office of Statewide Health Planning and Development 2009-2011 inpatient discharge files. We evaluated cost in three categories: uncomplicated admissions, amputations during admissions, and discharges to a skilled nursing facility. Total costs and projected savings were calculated by applying the metrics and cost to the projected enrollment. Diabetic patients accounted for 6.6% of those newly eligible for Medicaid or health benefit exchange subsidies, with a 60.8% take-up rate. We project costs to be $24.2 million in the diabetic take-up population from 2014 to 2019. Inpatient costs were 94.3% higher when amputations occurred during the admission and 46.7% higher when patients were discharged to a skilled nursing facility. Meanwhile, 61.0% of costs were attributed to uncomplicated admissions. Podiatric medical services saved 4.1% with a 10% reduction in admissions and amputations, and an additional 1% for every 10% improvement in access to podiatric medical care. When implementing the Affordable Care Act, the inclusion of podiatric medical services on multidisciplinary teams and in chronic-care models featuring prevention helps shift care to ambulatory settings to realize the greatest cost savings.
Analysis of forecasting and inventory control of raw material supplies in PT INDAC INT’L
NASA Astrophysics Data System (ADS)
Lesmana, E.; Subartini, B.; Riaman; Jabar, D. A.
2018-03-01
This study discusses the forecasting of carbon electrode sales data at PT. INDAC INT'L using the Winters and double moving average methods, while the Economic Order Quantity (EOQ) model is used to predict the amount of inventory and the cost required for ordering carbon electrode raw material in the next period. Error analysis of the MAE, MSE, and MAPE shows that the Winters method is the better forecasting method for sales of carbon electrode products. PT. INDAC INT'L is therefore advised to provide products in line with the sales amounts forecast by the Winters method.
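For reference, the classic EOQ formula balances ordering and holding costs, Q* = sqrt(2DS/H), where D is annual demand, S the cost per order, and H the annual holding cost per unit. The sketch below uses illustrative numbers only; the paper's demand comes from the Winters forecast and its costs are company-specific.

    import math

    def eoq(annual_demand, order_cost, holding_cost):
        # Economic Order Quantity: the order size that minimises the sum
        # of annual ordering and holding costs.
        return math.sqrt(2 * annual_demand * order_cost / holding_cost)

    q = eoq(annual_demand=12_000, order_cost=150.0, holding_cost=2.5)
    print(f"order about {q:.0f} electrodes per replenishment")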
NASA Astrophysics Data System (ADS)
Prudden, R.; Arribas, A.; Tomlinson, J.; Robinson, N.
2017-12-01
The Unified Model is a numerical model of the atmosphere used at the UK Met Office (and numerous partner organisations, including the Korean Meteorological Agency, the Australian Bureau of Meteorology and the US Air Force) for both weather and climate applications. Specifically, dynamical models such as the Unified Model are now a central part of weather forecasting. Starting from basic physical laws, these models make it possible to predict events such as storms before they have even begun to form. The Unified Model can be simply described as having two components: one component solves the Navier-Stokes equations (usually referred to as the "dynamics"); the other solves relevant sub-grid physical processes (usually referred to as the "physics"). Running weather forecasts requires substantial computing resources - for example, the UK Met Office operates the largest operational high-performance computer in Europe - and the cost of a typical simulation is split roughly 50% in the "dynamics" and 50% in the "physics". Therefore there is a strong incentive to reduce the cost of weather forecasts, and machine learning is a possible option because, once a machine learning model has been trained, it is often much faster to run than a full simulation. This is the motivation for a technique called model emulation, the idea being to build a fast statistical model which closely approximates a far more expensive simulation. In this paper we discuss the use of machine learning as an emulator to replace the "physics" component of the Unified Model. Various approaches and options will be presented, and the implications for further model development, operational running of forecasting systems, development of data assimilation schemes, and development of ensemble prediction techniques will be discussed.
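A hedged sketch of the emulation idea (not the Met Office implementation): train a regression model on archived input/output pairs of the "physics" scheme, then substitute the trained model at run time. The file names, shapes, and network architecture below are assumptions.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # X: grid-column states passed to the "physics" scheme; Y: the
    # tendencies it returns. Both sampled from archived model runs
    # (hypothetical files).
    X, Y = np.load("physics_in.npy"), np.load("physics_out.npy")

    emulator = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=500)
    emulator.fit(X, Y)

    # At run time the trained emulator stands in for the expensive scheme:
    tendencies = emulator.predict(X[:1])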
Planetary spacecraft cost modeling utilizing labor estimating relationships
NASA Technical Reports Server (NTRS)
Williams, Raymond
1990-01-01
A basic computerized technology is presented for estimating the labor hours and cost of unmanned planetary and lunar programs. The user-friendly methodology, designated Labor Estimating Relationship/Cost Estimating Relationship (LERCER), organizes the forecasting process according to vehicle subsystem levels. The level of input variables required by the model in predicting cost is consistent with pre-Phase A mission analysis. Twenty-one program categories were used in the modeling. To develop the model, numerous LER and CER studies were surveyed and modified when required. The results of the research, along with components of the LERCER program, are reported.
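LERs and CERs of this kind are commonly power laws fitted to historical subsystem data; the sketch below is written under that assumption, and the driver variable, data values, and resulting coefficients are invented for illustration.

    import numpy as np

    # Hypothetical historical subsystem data: labor hours versus a cost
    # driver (here, subsystem mass in kg).
    mass = np.array([40.0, 85.0, 150.0, 320.0])
    hours = np.array([9e3, 1.6e4, 2.4e4, 4.1e4])

    # Fit hours = a * mass**b by linear regression in log space.
    b, log_a = np.polyfit(np.log(mass), np.log(hours), 1)
    a = np.exp(log_a)

    estimate = a * 200.0 ** b  # labor-hour estimate for a 200 kg subsystem
    print(f"hours = {a:.1f} * mass^{b:.2f}; estimate at 200 kg: {estimate:.0f}")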
NASA Astrophysics Data System (ADS)
Martinez, C. J.; Starkweather, S.; Cox, C. J.; Solomon, A.; Shupe, M.
2015-12-01
Radiosondes are balloon-borne meteorological sensors used to acquire profiles of temperature and humidity. Radiosonde data are essential inputs for numerical weather prediction models and are used for climate research, particularly in the creation of reanalysis products. However, radiosonde programs are costly to maintain, in particular in the remote regions of the Arctic (e.g., $440,000/yr at Summit, Greenland), where only 40 of approximately 1000 routine global launches are made. The climate of this data-sparse region is poorly understood, and forecast data assimilation procedures are designed for global applications. Thus, observations may be rejected from the data assimilation because they are too far from the model expectations. For the most cost-efficient deployment of resources and to improve forecasting methods, analyses of the effectiveness of individual radiosonde programs are necessary. Here, we evaluate how radiosondes launched twice daily (0 and 12 UTC) from Summit Station, Greenland (72.58°N, 38.48°W, 3210 masl), influence the European Centre for Medium-Range Weather Forecasts (ECMWF) operational forecasts from June 2013 through May of 2015. A statistical analysis is conducted to determine the impact of the observations on the forecast model, and the meteorological regimes that the model fails to reproduce are identified. Assimilation rates in the inversion layer are lower than in any other part of the troposphere. Above the inversion, assimilation rates range from 85%-100%, 60%-98%, and >99% for temperature, humidity, and wind, respectively. The lowest assimilation rates are found near the surface, possibly associated with biases in the representation of the temperature inversion by the ECMWF model at Summit. Consequently, assimilation rates are lower near the surface during winter, when strong temperature inversions are frequently observed. Our findings benefit the scientific community that uses this information for climatological analysis of the Greenland Ice Sheet, and thus further analysis is warranted.
NASA Technical Reports Server (NTRS)
Manobianco, John; Zack, John W.; Taylor, Gregory E.
1996-01-01
This paper describes the capabilities and operational utility of a version of the Mesoscale Atmospheric Simulation System (MASS) that has been developed to support operational weather forecasting at the Kennedy Space Center (KSC) and Cape Canaveral Air Station (CCAS). The implementation of local, mesoscale modeling systems at KSC/CCAS is designed to provide detailed short-range (less than 24 h) forecasts of winds, clouds, and hazardous weather such as thunderstorms. Short-range forecasting is a challenge for daily operations and for manned and unmanned launches, since KSC/CCAS is located in central Florida, where the weather during the warm season is dominated by mesoscale circulations like the sea breeze. For this application, MASS has been modified to run on a Stardent 3000 workstation. Workstation-based, real-time numerical modeling requires a compromise between the requirement to run the system fast enough that the output can be used before it expires and the desire to improve the simulations by increasing resolution and using more detailed physical parameterizations. It is now feasible to run high-resolution mesoscale models such as MASS on local workstations to provide timely forecasts at a fraction of the cost required to run these models on mainframe supercomputers. MASS has been running in the Applied Meteorology Unit (AMU) at KSC/CCAS since January 1994 for the purpose of system evaluation. In March 1995, the AMU began sending real-time MASS output to the forecasters and meteorologists at CCAS, the Spaceflight Meteorology Group (Johnson Space Center, Houston, Texas), and the National Weather Service (Melbourne, Florida). However, MASS is not yet an operational system. The final decision whether to transition MASS to operational use will depend on a combination of forecaster feedback, the AMU's final evaluation results, and the life-cycle costs of the operational system.
The value of information as applied to the Landsat Follow-on benefit-cost analysis
NASA Technical Reports Server (NTRS)
Wood, D. B.
1978-01-01
An econometric model was run to compare the current forecasting system with a hypothetical (Landsat Follow-on) space-based system. The baseline current system was a hybrid of USDA SRS domestic forecasts and the best known foreign data. The space-based system improved upon the present Landsat through the higher spatial resolution capability of the thematic mapper. This satellite system is a major improvement for foreign forecasts but no better than SRS for domestic forecasts. The benefit analysis concentrated on the use of Landsat Follow-on to forecast world wheat production. Results showed that it was possible to quantify the value of satellite information and that there are significant benefits in more timely and accurate crop condition information.
NASA Astrophysics Data System (ADS)
Matte, Simon; Boucher, Marie-Amélie; Boucher, Vincent; Fortier Filion, Thomas-Charles
2017-06-01
A large effort has been made over the past 10 years to promote the operational use of probabilistic or ensemble streamflow forecasts. Numerous studies have shown that ensemble forecasts are of higher quality than deterministic ones. Many studies also conclude that relying on ensemble rather than deterministic forecasts leads to better decisions in the context of flood mitigation. Hence, it is believed that ensemble forecasts possess a greater economic and social value for both decision makers and the general population. However, the vast majority of, if not all, existing hydro-economic studies rely on a cost-loss ratio framework that assumes a risk-neutral decision maker. To overcome this important flaw, this study borrows from economics and evaluates the economic value of early warning flood systems using the well-known Constant Absolute Risk Aversion (CARA) utility function, which explicitly accounts for the level of risk aversion of the decision maker. This new framework allows for the full exploitation of the information related to a forecast's uncertainty, making it especially suited for the economic assessment of ensemble or probabilistic forecasts. Rather than comparing deterministic and ensemble forecasts, this study focuses on comparing different types of ensemble forecasts. There are multiple ways of assessing and representing forecast uncertainty; consequently, there exist many different means of building an ensemble forecasting system for future streamflow. One such possibility is to dress deterministic forecasts using the statistics of past forecast errors. Such dressing methods are popular among operational agencies because of their simplicity and intuitiveness. Another approach is the use of ensemble meteorological forecasts for precipitation and temperature, which are then provided as inputs to one or many hydrological models. In this study, three concurrent ensemble streamflow forecasting systems are compared: simple statistically dressed deterministic forecasts, forecasts based on meteorological ensembles, and a variant of the latter that also includes an estimation of state variable uncertainty. This comparison takes place for the Montmorency River, a small flood-prone watershed in southern central Quebec, Canada. The assessment of forecasts is performed for lead times of 1 to 5 days, both in terms of forecast quality (relative to the corresponding record of observations) and in terms of economic value, using the new proposed framework based on the CARA utility function. It is found that the economic value of a forecast for a risk-averse decision maker is closely linked to the forecast's reliability in predicting the upper tail of the streamflow distribution. Hence, post-processing forecasts to avoid over-forecasting could help improve both the quality and the value of forecasts.
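For reference, the CARA utility function is commonly written in the standard form below; the abstract does not give the paper's exact parameterisation, so this is the textbook version.

    u(x) = \frac{1 - e^{-\alpha x}}{\alpha}, \qquad \alpha > 0,

where x is the monetary outcome and \alpha is the coefficient of absolute risk aversion. As \alpha \to 0, u(x) \to x, recovering the risk-neutral cost-loss framework as a special case; larger \alpha penalises large losses in the upper tail of the streamflow distribution more heavily, which is why forecast reliability in that tail drives the economic value for a risk-averse decision maker.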
A research model--forecasting incident rates from optimized safety program intervention strategies.
Iyer, P S; Haight, J M; Del Castillo, E; Tink, B W; Hawkins, P W
2005-01-01
INTRODUCTION/PROBLEM: Property damage incidents, workplace injuries, and the safety programs designed to prevent them are expensive aspects of doing business in contemporary industry. The National Safety Council (2002) estimated that workplace injuries cost $146.6 billion per year. Because companies are resource limited, optimizing intervention strategies to decrease incidents with less costly programs can contribute to improved productivity. Systematic data collection methods were employed, and the forecasting ability of a time-lag relationship between interventions and incident rates was studied using various statistical methods (an intervention is expected to have neither an immediate nor an infinitely lasting effect on the incident rate). As a follow-up to the initial work, researchers developed two models designed to forecast incident rates: one based on past incident rate performance and the other on the configuration and level of effort applied to the safety and health program. Researchers compared actual incident performance to the prediction capability of each model over 18 months in the forestry operations at an electricity distribution company and found the models to allow accurate prediction of incident rates. These models potentially have powerful implications as a business-planning tool for human resource allocation and for designing an optimized safety and health intervention program to minimize incidents. Depending on the mathematical relationship, one can determine which interventions to apply, where and how much to apply them, and when to increase or reduce human resource input as indicated by the forecasted performance.
Performance of fuzzy approach in Malaysia short-term electricity load forecasting
NASA Astrophysics Data System (ADS)
Mansor, Rosnalini; Zulkifli, Malina; Yusof, Muhammad Mat; Ismail, Mohd Isfahani; Ismail, Suzilah; Yin, Yip Chee
2014-12-01
Many activities, such as those in the economic, education and manufacturing sectors, would be paralysed by a limited supply of electricity, while a surplus contributes to high operating costs. Therefore, electricity load forecasting is important in order to avoid shortage or excess. Previous findings showed that festive celebrations have an effect on short-term electricity load forecasting. Being a multicultural country, Malaysia has many major festive celebrations, such as Eidul Fitri, Chinese New Year and Deepavali, but these are moving holidays due to their non-fixed dates on the Gregorian calendar. This study emphasises the performance of a fuzzy approach in forecasting electricity load when considering the presence of moving holidays. An Autoregressive Distributed Lag model was estimated using simulated data, incorporating a model simplification concept (manual or automatic), day types (weekday or weekend), public holidays and lags of electricity load. The results indicated that day types, public holidays and several lags of electricity load were significant in the model. Overall, model simplification improves fuzzy performance due to fewer variables and rules.
NASA Astrophysics Data System (ADS)
Efremenko, Vladimir; Belyaevsky, Roman; Skrebneva, Evgeniya
2017-11-01
This article considers the analysis of electric power consumption and the problems of power saving in coal mines. The share of conditionally constant electric power costs for providing safe underground working conditions in coal mines is currently large. The power efficiency of underground coal mining therefore depends on the electric power expenses of the main technological processes and on the size of these conditionally constant costs. An important direction for increasing the power efficiency of coal mining is the forecasting of power consumption and the monitoring of electric power expenses. One of the main approaches to reducing electric power costs is to increase the accuracy of the enterprise's demand bids in the wholesale electricity market. The use of artificial neural networks is proposed for day-ahead forecasting of power consumption with hourly resolution. Hybrid neuro-fuzzy systems built on the principles of fuzzy logic, neural networks, and genetic algorithms are preferable. This model allows accurate short-term forecasts from a small array of input data. A set of input parameters characterizing the mining-geological and technological features of the enterprise is proposed.
NASA Technical Reports Server (NTRS)
1977-01-01
A demonstration experiment is being planned to show that frost and freeze prediction improvements are possible utilizing timely Synchronous Meteorological Satellite temperature measurements and that this information can affect Florida citrus grower operations and decisions. An economic experiment was carried out which will monitor citrus growers' decisions, actions, costs and losses, and meteorological forecasts and actual weather events and will establish the economic benefits of improved temperature forecasts. A summary is given of the economic experiment, the results obtained to date, and the work which still remains to be done. Specifically, the experiment design is described in detail as are the developed data collection methodology and procedures, sampling plan, data reduction techniques, cost and loss models, establishment of frost severity measures, data obtained from citrus growers, National Weather Service, and Federal Crop Insurance Corp., resulting protection costs and crop losses for the control group sample, extrapolation of results of control group to the Florida citrus industry and the method for normalization of these results to a normal or average frost season so that results may be compared with anticipated similar results from test group measurements.
Forecast Inaccuracies in Power Plant Projects From Project Managers' Perspectives
NASA Astrophysics Data System (ADS)
Sanabria, Orlando
Guided by organizational theory, this phenomenological study explored the factors affecting forecast preparation and inaccuracies during the construction of fossil fuel-fired power plants in the United States. Forecast inaccuracies can create financial stress and uncertain profits during the project construction phase. A combination of purposeful and snowball sampling supported the selection of participants. Twenty project managers with over 15 years of experience in power generation and project experience across the United States were interviewed within a 2-month period. From the inductive codification and descriptive analysis, 5 themes emerged: (a) project monitoring, (b) cost control, (c) management review frequency, (d) factors to achieve a precise forecast, and (e) factors causing forecast inaccuracies. The findings of the study showed the factors necessary to achieve a precise forecast includes a detailed project schedule, accurate labor cost estimates, monthly project reviews and risk assessment, and proper utilization of accounting systems to monitor costs. The primary factors reported as causing forecast inaccuracies were cost overruns by subcontractors, scope gaps, labor cost and availability of labor, and equipment and material cost. Results of this study could improve planning accuracy and the effective use of resources during construction of power plants. The study results could contribute to social change by providing a framework to project managers to lessen forecast inaccuracies, and promote construction of power plants that will generate employment opportunities and economic development.
Estimating the cost of production stoppage
NASA Technical Reports Server (NTRS)
Delionback, L. M.
1979-01-01
Estimation model considers learning curve quantities and time of break to forecast losses due to a break in the production schedule. Major parameters capable of predicting costs are the number of units made prior to the production break, the length of the break, and the slope of the learning curve established prior to the break.
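The learning-curve mechanics named here can be sketched directly. In the Crawford formulation, the cost of unit x is c(x) = c1 * x^b with b = ln(slope)/ln(2); the retrogression rule below (a fixed fraction of learning lost during the break) is an assumed illustration, not Delionback's exact model:

```python
import numpy as np

# Sketch of a learning-curve estimate of production-break cost; the
# lost_fraction rule is an illustrative assumption.

def unit_cost(x, first_unit_cost, slope):
    """Crawford learning curve: cost of unit x for a given slope (e.g. 0.90)."""
    b = np.log(slope) / np.log(2.0)
    return first_unit_cost * x ** b

first_unit_cost = 1000.0
slope = 0.90              # 90% learning curve
units_before_break = 50
lost_fraction = 0.4       # assumed share of learning lost during the break

# After the break, production restarts as if fewer units had been built.
effective_units = (1 - lost_fraction) * units_before_break
cost_no_break = unit_cost(units_before_break + 1, first_unit_cost, slope)
cost_after_break = unit_cost(effective_units + 1, first_unit_cost, slope)
print("added cost on next unit:", cost_after_break - cost_no_break)
```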
INTEGRATED PLANNING MODEL - EPA APPLICATIONS
The Integrated Planning Model (IPM) is a multi-regional, dynamic, deterministic linear programming (LP) model of the electric power sector in the continental lower 48 states and the District of Columbia. It provides forecasts up to year 2050 of least-cost capacity expansion, elec...
Sensor network based solar forecasting using a local vector autoregressive ridge framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, J.; Yoo, S.; Heiser, J.
2016-04-04
The significant improvements and falling costs of photovoltaic (PV) technology make solar energy a promising resource, yet the cloud-induced variability of surface solar irradiance inhibits its effective use in grid-tied PV generation. Short-term irradiance forecasting, especially on the minute scale, is critically important for grid system stability and auxiliary power source management. Compared to the trending sky imaging devices, irradiance sensors are inexpensive and easy to deploy, but related forecasting methods have not been well researched. The prominent challenge of applying classic time series models on a network of irradiance sensors is to address their varying spatio-temporal correlations due to local changes in cloud conditions. We propose a local vector autoregressive framework with ridge regularization to forecast irradiance without explicitly determining the wind field or cloud movement. By using local training data, our learned forecast model is adaptive to local cloud conditions, and by using regularization, we overcome the risk of overfitting from the limited training data. Our systematic experimental results showed an average of 19.7% RMSE and 20.2% MAE improvement over the benchmark Persistent Model for 1-5 minute forecasts on a comprehensive 25-day dataset.
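A rough sketch of the core idea, fitting a vector-autoregressive model by ridge regression on a short local training window so it adapts to current cloud conditions, might look as follows; the window length, lag order, and data are assumptions for illustration:

```python
import numpy as np
from sklearn.linear_model import Ridge

# Sketch of a ridge-regularized VAR forecast over an irradiance-sensor
# network; data, window, and lag order are synthetic/illustrative.
rng = np.random.default_rng(0)
n_minutes, n_sensors, p = 600, 5, 3                    # p = VAR lag order
irr = np.cumsum(rng.normal(0, 1, (n_minutes, n_sensors)), axis=0)  # fake data

window = irr[-120:]                                    # local training window
T = len(window)
# Row i of X stacks the p most recent sensor vectors before target row i of Y.
X = np.hstack([window[p - 1 - k : T - 1 - k] for k in range(p)])
Y = window[p:]                                         # targets: next minute

model = Ridge(alpha=10.0).fit(X, Y)                    # ridge fights overfitting
x_now = window[-1 : -p - 1 : -1].reshape(1, -1)        # lags 1..p, newest first
print("1-minute-ahead forecast:", model.predict(x_now)[0])
```

Refitting on a sliding recent window, rather than on all history, is what makes the model "local" in the sense the abstract describes: the coefficients track the prevailing spatio-temporal correlations without ever estimating a wind field.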
Assessment of the Charging Policy in Energy Efficiency of the Enterprise
NASA Astrophysics Data System (ADS)
Shutov, E. A.; E Turukina, T.; Anisimov, T. S.
2017-04-01
The forecasting problem for energy facilities with a power exceeding 670 kW is currently one of the main challenges. Under the rules of the retail electricity market, such customers also pay for deviations of actual energy consumption from the planned value. In compliance with the hierarchical stages of the electricity market, a guaranteeing supplier must respect the interests of distribution and generation companies, which require load leveling. For an industrial enterprise, this is possible only within the technological process, through the implementation of energy-efficient processing chains with adaptive functions and a forecasting tool. In these circumstances, the primary objective of forecasting is to reduce energy consumption costs by taking account of the 24-hour energy cost profile when forming the pumping unit work schedule. A virtual model of a pumping unit with a variable frequency drive is considered. The forecasting tool and the optimizer are integrated into a typical control circuit. An economic assessment of the optimization method was performed.
Evaluating the Impacts of Real-Time Pricing on the Cost and Value of Wind Generation
Sioshansi, Ramteen
2010-05-01
One of the costs associated with integrating wind generation into a power system is the cost of redispatching the system in real-time due to day-ahead wind resource forecast errors. One possible way of reducing these redispatch costs is to introduce demand response in the form of real-time pricing (RTP), which could allow electricity demand to respond to actual real-time wind resource availability using price signals. A day-ahead unit commitment model with day-ahead wind forecasts and a real-time dispatch model with actual wind resource availability are used to estimate system operations in a high wind penetration scenario. System operations are compared to a perfect foresight benchmark, in which actual wind resource availability is known day-ahead. The results show that wind integration costs with fixed demands can be high, both due to real-time redispatch costs and lost load. It is demonstrated that introducing RTP can reduce redispatch costs and eliminate loss of load events. Finally, social surplus with wind generation and RTP is compared to a system with neither, and the results demonstrate that introducing wind and RTP into a market can result in superadditive surplus gains.
Seasonal forecasting of groundwater levels in natural aquifers in the United Kingdom
NASA Astrophysics Data System (ADS)
Mackay, Jonathan; Jackson, Christopher; Pachocka, Magdalena; Brookshaw, Anca; Scaife, Adam
2014-05-01
Groundwater aquifers comprise the world's largest freshwater resource and provide resilience to climate extremes, which could become more frequent under future climate change. Prolonged dry conditions can induce groundwater drought, often characterised by significantly low groundwater levels which may persist for months to years. In contrast, lasting wet conditions can result in anomalously high groundwater levels which result in flooding, potentially at large economic cost. Using computational models to produce groundwater level forecasts allows appropriate management strategies to be considered in advance of extreme events. The majority of groundwater level forecasting studies to date use data-based models, which exploit the long response time of groundwater levels to meteorological drivers and make forecasts based only on the current state of the system. Instead, seasonal meteorological forecasts can be used to drive hydrological models and simulate groundwater levels months into the future. Such approaches have not been used in the past due to a lack of skill in these long-range forecast products. However, systems such as the latest version of the Met Office Global Seasonal Forecast System (GloSea5) are now showing increased skill up to a 3-month lead time. We demonstrate the first groundwater level ensemble forecasting system, using a multi-member ensemble of hindcasts from GloSea5 between 1996 and 2009 to force 21 simple lumped conceptual groundwater models covering most of the UK's major aquifers. We present the results from this hindcasting study and demonstrate that the system can be used to forecast groundwater levels with some skill up to three months into the future.
Airfreight forecasting methodology and results
NASA Technical Reports Server (NTRS)
1978-01-01
A series of econometric behavioral equations was developed to explain and forecast the evolution of airfreight traffic demand for the total U.S. domestic airfreight system, the total U.S. international airfreight system, and the total scheduled international cargo traffic carried by the top 44 foreign airlines. The basic explanatory variables used in these macromodels were the real gross national products of the countries involved and a measure of relative transportation costs. The results of the econometric analysis reveal that the models explain more than 99 percent of the historical evolution of freight traffic. The long term traffic forecasts generated with these models are based on scenarios of the likely economic outlook in the United States and 31 major foreign countries.
Simplification of the Kalman filter for meteorological data assimilation
NASA Technical Reports Server (NTRS)
Dee, Dick P.
1991-01-01
The paper proposes a new statistical method of data assimilation that is based on a simplification of the Kalman filter equations. The forecast error covariance evolution is approximated simply by advecting the mass-error covariance field, deriving the remaining covariances geostrophically, and accounting for external model-error forcing only at the end of each forecast cycle. This greatly reduces the cost of computation of the forecast error covariance. In simulations with a linear, one-dimensional shallow-water model and data generated artificially, the performance of the simplified filter is compared with that of the Kalman filter and the optimal interpolation (OI) method. The simplified filter produces analyses that are nearly optimal, and represents a significant improvement over OI.
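For context, one cycle of the standard Kalman filter is sketched below; the covariance forecast P_f = M P_a M^T + Q is the expensive step that the simplified filter replaces with advection of the mass-error covariance and geostrophically derived cross-covariances. This is the textbook filter, not the paper's simplified scheme:

```python
import numpy as np

# One forecast/analysis cycle of the standard (full) Kalman filter, shown
# for reference; Dee's simplification avoids propagating P_f explicitly.

def kalman_cycle(x_a, P_a, M, Q, H, R, y):
    x_f = M @ x_a                      # state forecast
    P_f = M @ P_a @ M.T + Q            # covariance forecast (the costly step)
    S = H @ P_f @ H.T + R              # innovation covariance
    K = P_f @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_a = x_f + K @ (y - H @ x_f)      # analysis update
    P_a = (np.eye(len(x_a)) - K @ H) @ P_f
    return x_a, P_a

# Tiny example: a 2-variable state observed directly at one point.
x, P = np.zeros(2), np.eye(2)
M, Q = np.array([[1.0, 0.1], [0.0, 1.0]]), 0.01 * np.eye(2)
H, R = np.array([[1.0, 0.0]]), np.array([[0.25]])
x, P = kalman_cycle(x, P, M, Q, H, R, y=np.array([1.0]))
print(x, np.diag(P))
```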
NASA Astrophysics Data System (ADS)
Thompson, R. J.; Cole, D. G.; Wilkinson, P. J.; Shea, M. A.; Smart, D.
1990-11-01
The following subject areas were covered: a probability forecast for geomagnetic activity; cost recovery in solar-terrestrial predictions; magnetospheric specification and forecasting models; a geomagnetic forecast and monitoring system for power system operation; some aspects of predicting magnetospheric storms; some similarities in ionospheric disturbance characteristics in equatorial, mid-latitude, and sub-auroral regions; ionospheric support for low-VHF radio transmission; a new approach to prediction of ionospheric storms; a comparison of the total electron content of the ionosphere around L=4 at low sunspot numbers with the IRI model; the French ionospheric radio propagation predictions; behavior of the F2 layer at mid-latitudes; and the design of modern ionosondes.
Obtaining high-resolution stage forecasts by coupling large-scale hydrologic models with sensor data
NASA Astrophysics Data System (ADS)
Fries, K. J.; Kerkez, B.
2017-12-01
We investigate how "big" quantities of distributed sensor data can be coupled with a large-scale hydrologic model, in particular the National Water Model (NWM), to obtain hyper-resolution forecasts. The recent launch of the NWM provides a great example of how growing computational capacity is enabling a new generation of massive hydrologic models. While the NWM spans an unprecedented spatial extent, there remain many questions about how to improve forecasts at the street level, the resolution at which many stakeholders make critical decisions. Further, the NWM runs on supercomputers, so water managers who may have access to their own high-resolution measurements may not readily be able to assimilate them into the model. To that end, we ask the question: how can the advances of the large-scale NWM be coupled with new local observations to enable hyper-resolution hydrologic forecasts? A methodology is proposed whereby the flow forecasts of the NWM are directly mapped to high-resolution stream levels using Dynamical System Identification. We apply the methodology across a sensor network of 182 gages in Iowa. Approximately one third of these sites have been shown to perform well in high-resolution flood forecasting when coupled with the outputs of the NWM. The quality of these forecasts is characterized using Principal Component Analysis and Random Forests to identify where the NWM may benefit from new sources of local observations. We also discuss how this approach can help municipalities identify where they should place low-cost sensors to benefit most from flood forecasts of the NWM.
2017-01-01
The U.S. Energy Information Administration's Short-Term Energy Outlook (STEO) produces monthly projections of energy supply, demand, trade, and prices over a 13-24 month period. Every January, the forecast horizon is extended through December of the following year. The STEO model is an integrated system of econometric regression equations and identities that link data on the various components of the U.S. energy industry together in order to develop consistent forecasts. The regression equations are estimated and the STEO model is solved using the EViews 9.5 econometric software package from IHS Global Inc. The model consists of various modules specific to each energy resource. All modules provide projections for the United States, and some modules provide more detailed forecasts for different regions of the country.
Pardo, Juan; Zamora-Martínez, Francisco; Botella-Rocamora, Paloma
2015-04-21
Time series forecasting is an important predictive methodology which can be applied to a wide range of problems. In particular, forecasting the indoor temperature permits improved utilization of the HVAC (Heating, Ventilating and Air Conditioning) systems in a home, and thus better energy efficiency. With this purpose, the paper describes how to implement an Artificial Neural Network (ANN) algorithm in a low-cost system-on-chip to develop an autonomous intelligent wireless sensor network. The paper uses a Wireless Sensor Network (WSN) to monitor and forecast the indoor temperature in a smart home, based on low-resource, low-cost microcontroller technology such as the 8051 MCU. An on-line learning approach, based on the Back-Propagation (BP) algorithm for ANNs, has been developed for real-time time series learning. It trains the model with each new datum that arrives to the system, without saving enormous quantities of data to create a historical database as usual, i.e., without previous knowledge. To validate the approach, a simulation study using a Bayesian baseline model was tested against a database from a real application in order to compare performance and accuracy. The core of the paper is a new algorithm, based on the BP one, which is described in detail, and the challenge was how to implement a computationally demanding algorithm in a simple architecture with very few hardware resources.
Pardo, Juan; Zamora-Martínez, Francisco; Botella-Rocamora, Paloma
2015-01-01
Time series forecasting is an important predictive methodology which can be applied to a wide range of problems. In particular, forecasting the indoor temperature permits improved utilization of the HVAC (Heating, Ventilating and Air Conditioning) systems in a home, and thus better energy efficiency. With this purpose, the paper describes how to implement an Artificial Neural Network (ANN) algorithm in a low-cost system-on-chip to develop an autonomous intelligent wireless sensor network. The paper uses a Wireless Sensor Network (WSN) to monitor and forecast the indoor temperature in a smart home, based on low-resource, low-cost microcontroller technology such as the 8051 MCU. An on-line learning approach, based on the Back-Propagation (BP) algorithm for ANNs, has been developed for real-time time series learning. It trains the model with each new datum that arrives to the system, without saving enormous quantities of data to create a historical database as usual, i.e., without previous knowledge. To validate the approach, a simulation study using a Bayesian baseline model was tested against a database from a real application in order to compare performance and accuracy. The core of the paper is a new algorithm, based on the BP one, which is described in detail, and the challenge was how to implement a computationally demanding algorithm in a simple architecture with very few hardware resources. PMID:25905698
Added value of dynamical downscaling of winter seasonal forecasts over North America
NASA Astrophysics Data System (ADS)
Tefera Diro, Gulilat; Sushama, Laxmi
2017-04-01
Skillful seasonal forecasts have enormous potential benefits for socio-economic sectors that are sensitive to weather and climate conditions, as early warning routines could reduce the vulnerability of such sectors. In this study, individual ensemble members of the ECMWF global ensemble seasonal forecasts are dynamically downscaled to produce an ensemble of regional seasonal forecasts over North America using the fifth-generation Canadian Regional Climate Model (CRCM5). CRCM5 forecasts are initialized on November 1st of each year and are integrated for four months for the 1991-2001 period at 0.22 degree resolution to produce a one-month lead-time forecast. The initial conditions for atmospheric variables are obtained from ERA-Interim reanalysis, whereas the initial conditions for the land surface are obtained from a separate ERA-Interim driven CRCM5 simulation with spectral nudging applied to the interior domain. The global and regional ensemble forecasts were then verified to investigate the skill and economic benefits of dynamical downscaling. Results indicate that both the global and regional climate models produce skillful precipitation forecasts over the southern Great Plains and eastern coasts of the U.S. and skillful temperature forecasts over the northern U.S. and most of Canada. In comparison to ECMWF forecasts, CRCM5 forecasts improved the temperature forecast skill over most parts of the domain, but the improvements for precipitation are limited to regions with complex topography, where downscaling improves the frequency of intense daily precipitation. CRCM5 forecasts also yield a better economic value than ECMWF precipitation forecasts for users whose cost-to-loss ratio is smaller than 0.5.
The NASA MERIT program - Developing new concepts for accurate flight planning
NASA Technical Reports Server (NTRS)
Steinberg, R.
1982-01-01
It is noted that the rising cost of aviation fuel has necessitated the development of a new approach to upper air forecasting for flight planning. It is shown that the spatial resolution of the present weather forecast models used in fully automated computer flight planning is an important accuracy-limiting factor, and it is proposed that man be put back into the system, although not in the way he has been used in the past. A new approach is proposed which uses the application of man-computer interactive display techniques to upper air forecasting to retain the fine scale features of the atmosphere inherent in the present data base in order to provide a more accurate and cost effective flight plan. It is pointed out that, as a result of NASA research, the hardware required for this approach already exists.
Weather-based forecasts of California crop yields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lobell, D B; Cahill, K N; Field, C B
2005-09-26
Crop yield forecasts provide useful information to a range of users. Yields for several crops in California are currently forecast based on field surveys and farmer interviews, while for many crops official forecasts do not exist. As broad-scale crop yields are largely dependent on weather, measurements from existing meteorological stations have the potential to provide a reliable, timely, and cost-effective means to anticipate crop yields. We developed weather-based models of state-wide yields for 12 major California crops (wine grapes, lettuce, almonds, strawberries, table grapes, hay, oranges, cotton, tomatoes, walnuts, avocados, and pistachios), and tested their accuracy using cross-validation over the 1980-2003 period. Many crops were forecast with high accuracy, as judged by the percent of yield variation explained by the forecast, the number of yields with correctly predicted direction of yield change, or the number of yields with correctly predicted extreme yields. The most successfully modeled crop was almonds, with 81% of yield variance captured by the forecast. Predictions for most crops relied on weather measurements well before harvest time, allowing for lead times that were longer than existing procedures in many cases.
NASA Astrophysics Data System (ADS)
Gunda, T.; Bazuin, J. T.; Nay, J.; Yeung, K. L.
2017-03-01
Access to seasonal climate forecasts can benefit farmers by allowing them to make more informed decisions about their farming practices. However, it is unclear whether farmers realize these benefits when crop choices available to farmers have different and variable costs and returns; multiple countries have programs that incentivize production of certain crops while other crops are subject to market fluctuations. We hypothesize that the benefits of forecasts on farmer livelihoods will be moderated by the combined impact of differing crop economics and changing climate. Drawing upon methods and insights from both physical and social sciences, we develop a model of farmer decision-making to evaluate this hypothesis. The model dynamics are explored using empirical data from Sri Lanka; primary sources include survey and interview information as well as game-based experiments conducted with farmers in the field. Our simulations show that a farmer using seasonal forecasts has more diversified crop selections, which drive increases in average agricultural income. Increases in income are particularly notable under a drier climate scenario, when a farmer using seasonal forecasts is more likely to plant onions, a crop with higher possible returns. Our results indicate that, when water resources are scarce (i.e. drier climate scenario), farmer incomes could become stratified, potentially compounding existing disparities in farmers’ financial and technical abilities to use forecasts to inform their crop selections. This analysis highlights that while programs that promote production of certain crops may ensure food security in the short-term, the long-term implications of these dynamics need careful evaluation.
A Gaussian Processes Technique for Short-term Load Forecasting with Considerations of Uncertainty
NASA Astrophysics Data System (ADS)
Ohmi, Masataro; Mori, Hiroyuki
In this paper, an efficient method is proposed to deal with short-term load forecasting using Gaussian Processes. Short-term load forecasting plays a key role in smooth power system operation, such as economic load dispatching, unit commitment, etc. Recently, the deregulated and competitive power market has increased the degree of uncertainty. As a result, it is more important to obtain better prediction results to save costs. One of the most important aspects is that power system operators need the upper and lower bounds of the predicted load to deal with uncertainty, while also requiring more accurate predicted values. The proposed method is based on the Bayes model, in which the output is expressed as a distribution rather than a point. To realize the model efficiently, this paper proposes Gaussian Processes, consisting of the Bayesian linear model and a kernel machine, to obtain the distribution of the predicted value. The proposed method is successfully applied to real data for daily maximum load forecasting.
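The practical payoff of a Gaussian-process formulation is that the prediction is a distribution, so upper and lower load bounds fall out directly. A minimal sketch using scikit-learn follows; the data, kernel, and interval width are illustrative assumptions, not the authors' configuration:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Sketch: GP load prediction returning a mean and standard deviation, from
# which operator-facing upper/lower bounds are derived. Data are synthetic.
rng = np.random.default_rng(0)
X = np.arange(100, dtype=float).reshape(-1, 1)                 # day index
y = 500 + 30 * np.sin(X.ravel() * 2 * np.pi / 7) + rng.normal(0, 5, 100)

gp = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(25.0),
                              normalize_y=True).fit(X, y)
X_new = np.array([[100.0], [101.0]])
mean, std = gp.predict(X_new, return_std=True)
print("lower:", mean - 1.96 * std, "upper:", mean + 1.96 * std)  # ~95% band
```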
A simplified real time method to forecast semi-enclosed basins storm surge
NASA Astrophysics Data System (ADS)
Pasquali, D.; Di Risio, M.; De Girolamo, P.
2015-11-01
Semi-enclosed basins are often prone to storm surge events. Indeed, their meteorological exposure, the presence of a large continental shelf, and their shape can lead to a strong sea level set-up. A real-time system aimed at forecasting storm surge may be of great help to protect human activities (i.e. to forecast flooding due to storm surge events), to manage ports, and to safeguard coastal safety. This paper illustrates a simple method able to forecast storm surge events in semi-enclosed basins in real time. The method is based on a mixed approach in which the results obtained by means of a simplified physics-based model with low computational costs are corrected by means of statistical techniques. The proposed method is applied to a point of interest located in the northern part of the Adriatic Sea. The comparison of forecasted levels against observed values shows the satisfactory reliability of the forecasts.
Urban rail transit projects : forecast versus actual ridership and costs. final report
DOT National Transportation Integrated Search
1989-10-01
Substantial errors in forecasting ridership and costs for the ten rail transit projects reviewed in this report put forth the possibility that more accurate forecasts would have led decision-makers to select projects other than those reviewed in thi...
DOT National Transportation Integrated Search
2009-01-01
In 1992, Pickrell published a seminal piece examining the accuracy of ridership forecasts and capital cost estimates for fixed-guideway transit systems in the US. His research created heated discussions in the transit industry regarding the ability o...
Forecasting Pell Program Applications Using Structural Aggregate Models.
ERIC Educational Resources Information Center
Cavin, Edward S.
1995-01-01
Demand for Pell Grant financial aid has become difficult to predict when using the current microsimulation model. This paper proposes an alternative model that uses aggregate data (based on individuals' microlevel decisions and macrodata on family incomes, college costs, and opportunity wages) and avoids some limitations of simple linear models.…
Time series modelling of global mean temperature for managerial decision-making.
Romilly, Peter
2005-07-01
Climate change has important implications for business and economic activity. Effective management of climate change impacts will depend on the availability of accurate and cost-effective forecasts. This paper uses univariate time series techniques to model the properties of a global mean temperature dataset in order to develop a parsimonious forecasting model for managerial decision-making over the short-term horizon. Although the model is estimated on global temperature data, the methodology could also be applied to temperature data at more localised levels. The statistical techniques include seasonal and non-seasonal unit root testing with and without structural breaks, as well as ARIMA and GARCH modelling. A forecasting evaluation shows that the chosen model performs well against rival models. The estimation results confirm the findings of a number of previous studies, namely that global mean temperatures increased significantly throughout the 20th century. The use of GARCH modelling also shows the presence of volatility clustering in the temperature data, and a positive association between volatility and global mean temperature.
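The ARIMA-plus-GARCH recipe described here can be sketched on synthetic data as follows; the model orders are illustrative, and the sketch assumes the statsmodels and arch Python packages rather than whatever software the author used:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

# Sketch: ARIMA for the mean of a temperature series, GARCH(1,1) on the
# residuals to capture volatility clustering. Data and orders are illustrative.
rng = np.random.default_rng(0)
t = np.arange(120)
temp = pd.Series(0.005 * t + rng.normal(0, 0.1, 120))   # trend + noise

arima = ARIMA(temp, order=(1, 1, 1)).fit()              # mean equation
print(arima.forecast(steps=5))                          # short-horizon forecast

resid = arima.resid.dropna()
garch = arch_model(resid, vol="GARCH", p=1, q=1).fit(disp="off")
print(garch.forecast(horizon=5).variance)               # volatility forecast
```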
Cloud Forecasting and 3-D Radiative Transfer Model Validation using Citizen-Sourced Imagery
NASA Astrophysics Data System (ADS)
Gasiewski, A. J.; Heymsfield, A.; Newman Frey, K.; Davis, R.; Rapp, J.; Bansemer, A.; Coon, T.; Folsom, R.; Pfeufer, N.; Kalloor, J.
2017-12-01
Cloud radiative feedback mechanisms are one of the largest sources of uncertainty in global climate models. Variations in local 3-D cloud structure impact the interpretation of NASA CERES and MODIS data for top-of-atmosphere radiation studies over clouds. Much of this uncertainty results from lack of knowledge of cloud vertical and horizontal structure. Surface-based data on 3-D cloud structure from a multi-sensor array of low-latency ground-based cameras can be used to intercompare radiative transfer models based on MODIS and other satellite data with CERES data to improve the 3-D cloud parameterizations. Closely related, forecasting of solar insolation and associated cloud cover on time scales out to 1 hour and with spatial resolution of 100 meters is valuable for stabilizing power grids with high solar photovoltaic penetrations. Data for cloud-advection-based solar insolation forecasting, obtained from a bottom-up perspective with the spatial resolution and latency needed to predict high-ramp-rate events, are strongly correlated with cloud-induced fluctuations. The development of grid management practices for improved integration of renewable solar energy thus also benefits from a multi-sensor camera array. The data needs for both 3-D cloud radiation modelling and solar forecasting are being addressed using a network of low-cost upward-looking visible-light CCD sky cameras positioned at 2 km spacing over an area of 30-60 km in size, acquiring imagery at 30-second intervals. Such cameras can be manufactured in quantity and deployed by citizen volunteers at a marginal cost of $200-400 and operated unattended using existing communications infrastructure. A trial phase to understand the potential utility of up-looking multi-sensor visible imagery is underway within this NASA Citizen Science project. To develop the initial data sets necessary to optimally design a multi-sensor cloud camera array, a team of 100 citizen scientists using self-owned PDA cameras is being organized to collect distributed cloud data sets suitable for MODIS-CERES cloud radiation science and solar forecasting algorithm development. A low-cost and robust sensor design suitable for large-scale fabrication and long-term deployment has been developed during the project prototyping phase.
DEVELOPING SITE-SPECIFIC MODELS FOR FORECASTING BACTERIA LEVELS AT COASTAL BEACHES
The U.S.Beaches Environmental Assessment and Coastal Health Act of 2000 authorizes studies of pathogen indicators in coastal recreation waters that develop appropriate, accurate, expeditious, and cost-effective methods (including predictive models) for quantifying pathogens in co...
Forecast of Frost Days Based on Monthly Temperatures
NASA Astrophysics Data System (ADS)
Castellanos, M. T.; Tarquis, A. M.; Morató, M. C.; Saa-Requejo, A.
2009-04-01
Although frost can cause considerable crop damage and mitigation practices against forecasted frost exist, frost forecasting technologies have not changed for many years. The paper reports a new method to forecast the monthly number of frost days (FD) for several meteorological stations in the Community of Madrid (Spain), based on the successive application of two models. The first is a stochastic model, an autoregressive integrated moving average (ARIMA), that forecasts the monthly minimum absolute temperature (tmin) and the monthly average of minimum temperature (tminav) following the Box-Jenkins methodology. The second model relates these monthly temperatures to the distribution of minimum daily temperature during one month. Three ARIMA models were identified for the time series analyzed, with a seasonal period corresponding to one year. They present the same seasonal behavior (moving average differenced model) and different non-seasonal parts: an autoregressive model (Model 1), a moving average differenced model (Model 2), and an autoregressive and moving average model (Model 3). At the same time, the results point out that minimum daily temperature (tdmin), for the meteorological stations studied, followed a normal distribution each month, with a very similar standard deviation through the years. The standard deviation obtained for each station and each month could be used as a risk index for cold months. The application of Model 1 to predict minimum monthly temperatures showed the best FD forecast. This procedure provides a tool for crop managers and crop insurance companies to assess the risk of frost frequency and intensity, so that they can take steps to mitigate frost damage and estimate the losses that frost would cause. This research was supported by Comunidad de Madrid Research Project 076/92. The cooperation of the Spanish National Meteorological Institute and the Spanish Ministerio de Agricultura, Pesca y Alimentación (MAPA) is gratefully acknowledged.
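The second-stage calculation is simple enough to state concretely: given a forecast monthly mean of daily minima and the (stable) monthly standard deviation, the expected number of frost days follows from the normal distribution of daily minima. A sketch with illustrative numbers:

```python
from scipy.stats import norm

# Sketch of the two-model chain's second stage: expected frost days from a
# normal model of daily minimum temperature. Numbers are illustrative.
tminav_forecast = 2.0   # deg C, monthly mean of daily minima from ARIMA stage
sigma = 3.0             # monthly std. dev. of daily minima (the "risk index")
days_in_month = 31

p_frost_day = norm.cdf(0.0, loc=tminav_forecast, scale=sigma)  # P(tdmin < 0)
expected_fd = days_in_month * p_frost_day
print(f"expected frost days: {expected_fd:.1f}")
```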
NASA Astrophysics Data System (ADS)
Perkins, W. A.; Hakim, G. J.
2016-12-01
In this work, we examine the skill of a new approach to performing climate field reconstructions (CFRs) using a form of online paleoclimate data assimilation (PDA). Many previous studies have foregone climate model forecasts during assimilation due to the computational expense of running coupled global climate models (CGCMs), and the relatively low skill of these forecasts on longer timescales. Here we greatly diminish the computational costs by employing an empirical forecast model (known as a linear inverse model; LIM), which has been shown to have comparable skill to CGCMs. CFRs of annually averaged 2m air temperature anomalies are compared between the Last Millennium Reanalysis framework (no forecasting or "offline"), a persistence forecast, and four LIM forecasting experiments over the instrumental period (1850 - 2000). We test LIM calibrations for observational (Berkeley Earth), reanalysis (20th Century Reanalysis), and CMIP5 climate model (CCSM4 and MPI) data. Generally, we find that the usage of LIM forecasts for online PDA increases reconstruction agreement with the instrumental record for both spatial and global mean temperature (GMT). The detrended GMT skill metrics show the most dramatic increases in skill with coefficient of efficiency (CE) improvements over the no-forecasting benchmark averaging 57%. LIM experiments display a common pattern of spatial field increases in CE skill over northern hemisphere land areas and in the high-latitude North Atlantic - Barents Sea corridor (Figure 1). However, the non-GCM-calibrated LIMs introduce other deficiencies into the spatial skill of these reconstructions, likely due to aspects of the LIM calibration process. Overall, the CMIP5 LIMs have the best performance when considering both spatial fields and GMT. A comparison with the persistence forecast experiment suggests that improvements are associated with the usage of the LIM forecasts, and not simple persistence of temperature anomalies over time. These results show that the use of LIM forecasting can help add further dynamical constraint to CFRs. As we move forward, this will be an important factor in fully utilizing dynamically consistent information from the proxy record while reconstructing the past millennium.
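The LIM at the heart of this approach is compact: the propagator is estimated from lagged and contemporaneous covariances of the state anomalies, G(tau) = C(tau) C(0)^{-1}, and then applied as a forecast. A minimal sketch on synthetic data, where the state stands in for something like EOF coefficients of the temperature field:

```python
import numpy as np

# Minimal LIM sketch: estimate the propagator from covariances, then forecast.
# Synthetic anomalies stand in for the real calibration fields.
rng = np.random.default_rng(0)
n_time, n_state = 150, 10                      # e.g. years x EOF coefficients
x = np.cumsum(rng.normal(size=(n_time, n_state)), axis=0) * 0.1

tau = 1                                        # lag in years
C0 = x[:-tau].T @ x[:-tau] / (n_time - tau)    # contemporaneous covariance
Ctau = x[tau:].T @ x[:-tau] / (n_time - tau)   # lag-tau covariance
G = Ctau @ np.linalg.inv(C0)                   # LIM propagator G(tau)

forecast = G @ x[-1]                           # one-step deterministic forecast
print(forecast[:3])
```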
NASA Astrophysics Data System (ADS)
Mel, Riccardo; Viero, Daniele Pietro; Carniello, Luca; Defina, Andrea; D'Alpaos, Luigi
2014-09-01
Providing reliable and accurate storm surge forecasts is important for a wide range of problems related to coastal environments. In order to adequately support decision-making processes, it has also become increasingly important to be able to estimate the uncertainty associated with the storm surge forecast. The procedure commonly adopted to do this uses the results of a hydrodynamic model forced by a set of different meteorological forecasts; however, this approach requires a considerable, if not prohibitive, computational cost for real-time application. In the present paper we present two simplified methods for estimating the uncertainty affecting storm surge prediction with moderate computational effort. In the first approach we use a computationally fast, statistical tidal model instead of a hydrodynamic numerical model to estimate storm surge uncertainty. The second approach is based on the observation that the uncertainty in the sea level forecast mainly stems from the uncertainty affecting the meteorological fields; this led to the idea of estimating forecast uncertainty via a linear combination of suitable meteorological variances, directly extracted from the meteorological fields. The proposed methods were applied to estimate the uncertainty in the storm surge forecast in the Venice Lagoon. The results clearly show that the uncertainty estimated through a linear combination of suitable meteorological variances closely matches the one obtained using the deterministic approach, and overcomes some intrinsic limitations in the use of a statistical tidal model.
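The second approach amounts to fitting weights so that a linear combination of meteorological variances reproduces the surge-forecast variance obtained from the expensive ensemble procedure, then reusing those weights cheaply in real time. A minimal sketch with synthetic placeholder data:

```python
import numpy as np

# Sketch: fit weights mapping meteorological ensemble variances to surge
# forecast variance. All data below are synthetic placeholders.
rng = np.random.default_rng(0)
n = 200
var_wind = rng.gamma(2.0, 1.0, n)        # ensemble variance of wind forcing
var_press = rng.gamma(2.0, 0.5, n)       # ensemble variance of pressure
surge_var = 0.8 * var_wind + 0.3 * var_press + rng.normal(0, 0.05, n)

A = np.column_stack([var_wind, var_press])
weights, *_ = np.linalg.lstsq(A, surge_var, rcond=None)
print("fitted weights:", weights)        # reused for cheap real-time estimates
```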
Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping
2015-01-01
Background: Biospecimens are essential resources for advancing basic and translational research. However, there are little data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. Methods: To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. Results: A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. Conclusion: A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, and allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting. PMID:26697911
Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping; Moore, Helen M
2015-12-01
Biospecimens are essential resources for advancing basic and translational research. However, there are little data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, and allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting.
Metrics for the Evaluation of the Utility of Air Quality Forecasting
NASA Astrophysics Data System (ADS)
Sumo, T. M.; Stockwell, W. R.
2013-12-01
Global warming is expected to lead to higher levels of air pollution, and therefore the forecasting of both long-term and daily air quality is an important component of assessing the costs of climate change and its impact on human health. The risks associated with poor air quality days (where the Air Pollution Index is greater than 100) include hospital visits and mortality. Accurate air quality forecasting has the potential to allow sensitive groups to take appropriate precautions. This research builds metrics for evaluating the utility of air quality forecasting in terms of its potential impacts. Our analysis of air quality models focuses on the Washington, DC/Baltimore, MD region over the summertime ozone seasons between 2010 and 2012. The metrics relevant to our analysis include: (1) the number of times that a high ozone or particulate matter (PM) episode is correctly forecasted, (2) the number of times that a high ozone or PM episode is forecasted but does not occur, and (3) the number of times when the air quality forecast predicts a cleaner air episode but the air was observed to have high ozone or PM. Our evaluation of the performance of air quality forecasts covers forecasts of ozone and particulate matter, using data available from the U.S. Environmental Protection Agency (EPA)'s AIRNOW. We also examined observational ozone and particulate matter data available from Clean Air Partners. Overall, the forecast models perform well for our region and time interval.
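The three counts listed above form a standard 2x2 forecast contingency table, from which the usual skill ratios follow. A small sketch with illustrative boolean arrays:

```python
import numpy as np

# Sketch: hits, false alarms, and misses for "exceedance day" forecasts,
# plus the derived skill ratios. Arrays are illustrative.
forecast_exceed = np.array([1, 1, 0, 0, 1, 0, 1, 0], dtype=bool)
observed_exceed = np.array([1, 0, 0, 1, 1, 0, 0, 0], dtype=bool)

hits = np.sum(forecast_exceed & observed_exceed)            # (1) correct alerts
false_alarms = np.sum(forecast_exceed & ~observed_exceed)   # (2) forecast, not seen
misses = np.sum(~forecast_exceed & observed_exceed)         # (3) seen, not forecast

pod = hits / (hits + misses)                 # probability of detection
far = false_alarms / (hits + false_alarms)   # false alarm ratio
print(f"hits={hits} false_alarms={false_alarms} misses={misses} "
      f"POD={pod:.2f} FAR={far:.2f}")
```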
Routine High-Resolution Forecasts/Analyses for the Pacific Disaster Center: User Manual
NASA Technical Reports Server (NTRS)
Roads, John; Han, J.; Chen, S.; Burgan, R.; Fujioka, F.; Stevens, D.; Funayama, D.; Chambers, C.; Bingaman, B.; McCord, C.;
2001-01-01
Enclosed herein is our HWCMO user manual. This manual constitutes the final report for our NASA/PDC grant, NASA NAG5-8730, "Routine High Resolution Forecasts/Analysis for the Pacific Disaster Center". Since the beginning of the grant, we have routinely provided experimental high resolution forecasts from the RSM/MSM for the Hawaiian Islands, while working to upgrade the system to include: (1) a more robust input of NCEP analyses directly from NCEP; (2) higher vertical resolution, with increased forecast accuracy; (3) faster delivery of forecast products and extension of initial 1-day forecasts to 2 days; (4) augmentation of our basic meteorological and simplified fire weather forecasts to fire danger and drought forecasts; (5) additional meteorological forecasts with an alternate mesoscale model (MM5); and (6) the feasibility of using our modeling system to work in higher-resolution domains and other regions. In this user manual, we provide a general overview of the operational system and the mesoscale models as well as more detailed descriptions of the models. A detailed description of daily operations and a cost analysis is also provided. Evaluations of the models are included, although it should be noted that model evaluation is a continuing process, and as potential problems are identified, these can be used as the basis for making model improvements. Finally, we include our previously submitted answers to particular PDC questions (Appendix V). All of our initially proposed objectives have basically been met. In fact, a number of useful applications (VOG, air pollution transport) are already utilizing our experimental output, and we believe there are a number of other applications that could make use of our routine forecast/analysis products. Still, work remains to be done to further develop this experimental weather, climate, fire danger and drought prediction system. In short, we would like to be a part of a future PDC team, if at all possible, to further develop and apply the system for the Hawaiian and other Pacific Islands as well as the entire Pacific Basin.
Wang, Shengnan; Petzold, Max; Cao, Junshan; Zhang, Yue; Wang, Weibing
2015-05-01
Few studies in China have focused on direct expenditures for cardiovascular diseases (CVDs), making cost trends for CVDs uncertain. Epidemic modeling and forecasting may be essential for health workers and policy makers to reduce the cost burden of CVDs. To develop a time series model using Box-Jenkins methodology for a 15-year forecasting of CVD hospitalization costs in Shanghai. Daily visits and medical expenditures for CVD hospitalizations between January 1, 2008 and December 31, 2012 were analyzed. Data from 2012 were used for further analyses, including yearly total health expenditures and expenditures per visit for each disease, as well as per-visit-per-year medical costs of each service for CVD hospitalizations. Time series analyses were performed to determine the long-time trend of total direct medical expenditures for CVDs and specific expenditures for each disease, which were used to forecast expenditures until December 31, 2030. From 2008 to 2012, there were increased yearly trends for both hospitalizations (from 250,354 to 322,676) and total costs (from US $ 388.52 to 721.58 million per year in 2014 currency) in Shanghai. Cost per CVD hospitalization in 2012 averaged US $ 2236.29, with the highest being for chronic rheumatic heart diseases (US $ 4710.78). Most direct medical costs were spent on medication. By the end of 2030, the average cost per visit per month for all CVDs was estimated to be US $ 4042.68 (95% CI: US $ 3795.04-4290.31) for all CVDs, and the total health expenditure for CVDs would reach over US $1.12 billion (95% CI: US $ 1.05-1.19 billion) without additional government interventions. Total health expenditures for CVDs in Shanghai are estimated to be higher in the future. These results should be a valuable future resource for both researchers on the economic effects of CVDs and for policy makers.
Direct Medical Costs of Hospitalizations for Cardiovascular Diseases in Shanghai, China
Wang, Shengnan; Petzold, Max; Cao, Junshan; Zhang, Yue; Wang, Weibing
2015-01-01
Abstract Few studies in China have focused on direct expenditures for cardiovascular diseases (CVDs), making cost trends for CVDs uncertain. Epidemic modeling and forecasting may be essential for health workers and policy makers to reduce the cost burden of CVDs. To develop a time series model using Box–Jenkins methodology for a 15-year forecasting of CVD hospitalization costs in Shanghai. Daily visits and medical expenditures for CVD hospitalizations between January 1, 2008 and December 31, 2012 were analyzed. Data from 2012 were used for further analyses, including yearly total health expenditures and expenditures per visit for each disease, as well as per-visit-per-year medical costs of each service for CVD hospitalizations. Time series analyses were performed to determine the long-time trend of total direct medical expenditures for CVDs and specific expenditures for each disease, which were used to forecast expenditures until December 31, 2030. From 2008 to 2012, there were increased yearly trends for both hospitalizations (from 250,354 to 322,676) and total costs (from US $ 388.52 to 721.58 million per year in 2014 currency) in Shanghai. Cost per CVD hospitalization in 2012 averaged US $ 2236.29, with the highest being for chronic rheumatic heart diseases (US $ 4710.78). Most direct medical costs were spent on medication. By the end of 2030, the average cost per visit per month for all CVDs was estimated to be US $ 4042.68 (95% CI: US $ 3795.04–4290.31) for all CVDs, and the total health expenditure for CVDs would reach over US $1.12 billion (95% CI: US $ 1.05–1.19 billion) without additional government interventions. Total health expenditures for CVDs in Shanghai are estimated to be higher in the future. These results should be a valuable future resource for both researchers on the economic effects of CVDs and for policy makers. PMID:25997060
The S-curve for forecasting waste generation in construction projects.
Lu, Weisheng; Peng, Yi; Chen, Xi; Skitmore, Martin; Zhang, Xiaoling
2016-10-01
Forecasting construction waste generation is the yardstick of any effort by policy-makers, researchers, practitioners and the like to manage construction and demolition (C&D) waste. This paper develops and tests an S-curve model to describe cumulative waste generation as a project progresses. Using 37,148 disposal records generated from 138 building projects in Hong Kong in four consecutive years, from January 2011 to June 2015, a wide range of potential S-curve models are examined, and as a result, the formula that best fits the historical data set is found. The S-curve model is then further linked to project characteristics using artificial neural networks (ANNs) so that it can be used to forecast waste generation in future construction projects. It was found that, among the S-curve models, the cumulative logistic distribution is the formula that best fits the historical data. Meanwhile, contract sum, location, public-private nature, and duration can be used to forecast construction waste generation. The study provides contractors with not only an S-curve model to forecast overall waste generation before a project commences, but also a detailed baseline to benchmark and manage waste during the course of construction. The major contribution of this paper is to the body of knowledge in the field of construction waste generation forecasting. By examining it with an S-curve model, the study elevates construction waste management to a level equivalent to project cost management, where the model has already been readily accepted as a standard tool.
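The cumulative-logistic form the study settles on is straightforward to fit to disposal records. A sketch with made-up data follows; the parameterization below is a generic three-parameter logistic, not necessarily the paper's exact formula:

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch: fit a cumulative-logistic S-curve of accumulated waste versus
# project progress. The data points are synthetic.

def s_curve(progress, total_waste, k, midpoint):
    """Cumulative logistic: waste accumulated by a given project progress."""
    return total_waste / (1.0 + np.exp(-k * (progress - midpoint)))

progress = np.linspace(0.05, 1.0, 20)               # fraction of duration elapsed
waste = 900 / (1 + np.exp(-8 * (progress - 0.55)))  # synthetic disposal totals
waste += np.random.default_rng(0).normal(0, 15, 20)

params, _ = curve_fit(s_curve, progress, waste, p0=[1000, 5, 0.5])
print("total, steepness, midpoint:", params)
```

A contractor can then compare actual cumulative disposal against the fitted curve at any stage of the project, which is the benchmarking use the abstract describes.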
Seasonal forecasting of discharge for the Raccoon River, Iowa
NASA Astrophysics Data System (ADS)
Slater, Louise; Villarini, Gabriele; Bradley, Allen; Vecchi, Gabriel
2016-04-01
The state of Iowa (central United States) is regularly afflicted by severe natural hazards such as the 2008/2013 floods and the 2012 drought. To improve preparedness for these catastrophic events and allow Iowans to make more informed decisions about the most suitable water management strategies, we have developed a framework for medium- to long-range probabilistic seasonal streamflow forecasting for the Raccoon River at Van Meter, an 8900-km2 catchment located in central-western Iowa. Our flow forecasts use statistical models to predict seasonal discharge for low to high flows, with lead times ranging from one to ten months. Historical measurements of daily discharge are obtained from the U.S. Geological Survey (USGS) at the Van Meter stream gage and used to compute quantile time series from minimum to maximum seasonal flow. The model is forced with basin-averaged total seasonal precipitation records from the PRISM Climate Group and annual row crop production acreage from the U.S. Department of Agriculture's National Agricultural Statistics Service database. For the forecasts, we use corn and soybean production from the previous year (persistence forecast) as a proxy for the impacts of agricultural practices on streamflow. The monthly precipitation forecasts are provided by eight Global Climate Models (GCMs) from the North American Multi-Model Ensemble (NMME), with lead times ranging from 0.5 to 11.5 months and a resolution of 1 decimal degree. Additionally, precipitation from the month preceding each season is used to characterize antecedent soil moisture conditions. The accuracy of our modelled (1927-2015) and forecasted (2001-2015) discharge values is assessed by comparison with the observed USGS data. We explore the sensitivity of forecast skill over the full range of lead times, flow quantiles, forecast seasons, and with each GCM. Forecast skill is also examined using different formulations of the statistical models, as well as NMME forecast weighting procedures based on the computed potential skill (historical forecast accuracy) of the different GCMs. We find that the models describe the year-to-year variability in streamflow accurately, as well as the overall tendency towards increasing (and more variable) discharge over time. Surprisingly, forecast skill does not decrease markedly with lead time, and high flows tend to be well predicted, suggesting that these forecasts may have considerable practical applications. Further, the seasonal flow forecast accuracy is substantially improved by weighting the contribution of individual GCMs to the forecasts, and also by the inclusion of antecedent precipitation. Our results can provide critical information for adaptation strategies aiming to mitigate the costs and disruptions arising from flood and drought conditions, and allow us to determine how far in advance skillful forecasts can be issued. The availability of these discharge forecasts would have major societal and economic benefits for hydrology and water resources management, agriculture, disaster forecasting and prevention, energy, finance and insurance, food security, policy-making and public authorities, and transportation.
Integrating Solar PV in Utility System Operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, A.; Botterud, A.; Wu, J.
2013-10-31
This study develops a systematic framework for estimating the increase in operating costs due to uncertainty and variability in renewable resources, uses the framework to quantify the integration costs associated with sub-hourly solar power variability and uncertainty, and shows how changes in system operations may affect these costs. Toward this end, we present a statistical method for estimating the required balancing reserves to maintain system reliability, along with a model for commitment and dispatch of the portfolio of thermal and renewable resources at different stages of system operations. We estimate the costs of sub-hourly solar variability, short-term forecast errors, and day-ahead (DA) forecast errors as the difference in production costs between a case with “realistic” PV (i.e., sub-hourly solar variability and uncertainty are fully included in the modeling) and a case with “well behaved” PV (i.e., PV is assumed to have no sub-hourly variability and can be perfectly forecasted). In addition, we highlight current practices that allow utilities to compensate for the issues encountered at the sub-hourly time frame with increased levels of PV penetration. In this analysis we use the analytical framework to simulate utility operations with increasing deployment of PV in a case study of Arizona Public Service Company (APS), a utility in the southwestern United States. In our analysis, we focus on three processes that are important in understanding the management of PV variability and uncertainty in power system operations. First, we represent the decisions made the day before the operating day through a DA commitment model that relies on imperfect DA forecasts of load and wind as well as PV generation. Second, we represent the decisions made by schedulers in the operating day through hour-ahead (HA) scheduling. Peaking units can be committed or decommitted in the HA schedules and online units can be redispatched using forecasts that are improved relative to DA forecasts, but still imperfect. Finally, we represent decisions within the operating hour by schedulers and transmission system operators as real-time (RT) balancing. We simulate the DA and HA scheduling processes with a detailed unit-commitment (UC) and economic dispatch (ED) optimization model. This model creates a least-cost dispatch and commitment plan for the conventional generating units using forecasts and reserve requirements as inputs. We consider only the generation units and load of the utility in this analysis; we do not consider opportunities to trade power with neighboring utilities. We also do not consider provision of reserves from renewables or from demand-side options. We estimate dynamic reserve requirements in order to meet reliability requirements in the RT operations, considering the uncertainty and variability in load, solar PV, and wind resources. Balancing reserve requirements are based on the 2.5th and 97.5th percentile of 1-min deviations from the HA schedule in a previous year. We then simulate RT deployment of balancing reserves using a separate minute-by-minute simulation of deviations from the HA schedules in the operating year. In the simulations we assume that balancing reserves can be fully deployed in 10 min. The minute-by-minute deviations account for HA forecasting errors and the actual variability of the load, wind, and solar generation.
Using these minute-by-minute deviations and deployment of balancing reserves, we evaluate the impact of PV on system reliability through the calculation of the standard reliability metric called Control Performance Standard 2 (CPS2). Broadly speaking, the CPS2 score measures the percentage of 10-min periods in which a balancing area is able to balance supply and demand within a specific threshold. Compliance with the North American Electric Reliability Corporation (NERC) reliability standards requires that the CPS2 score exceed 90% (i.e., the balancing area must maintain adequate balance for 90% of the 10-min periods). The combination of representing DA forecast errors in the DA commitments, using 1-min PV data to simulate RT balancing, and estimating reliability performance through the CPS2 metric, all factors that are important to operating systems with increasing amounts of PV, makes this study unique in its scope.
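The reserve rule described above, covering the band between the 2.5th and 97.5th percentile of 1-min deviations from the hour-ahead schedule, is straightforward to express in code. This is a minimal sketch on synthetic deviations, not APS data.

```python
import numpy as np

def balancing_reserves(deviations_mw):
    """Up/down reserve requirements from 1-min deviations from the HA schedule.

    Follows the study's stated rule: reserves cover the 2.5th-97.5th percentile
    band of historical deviations (load + wind + PV) from the hour-ahead schedule.
    """
    down = np.percentile(deviations_mw, 2.5)   # most negative covered deviation
    up = np.percentile(deviations_mw, 97.5)    # most positive covered deviation
    return abs(down), up

rng = np.random.default_rng(0)
deviations = rng.normal(0.0, 35.0, size=60 * 24 * 365)  # synthetic 1-min deviations, MW
down_req, up_req = balancing_reserves(deviations)
print(f"down-reserve {down_req:.0f} MW, up-reserve {up_req:.0f} MW")
```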
Value of biologic therapy: a forecasting model in three disease areas.
Paramore, L Clark; Hunter, Craig A; Luce, Bryan R; Nordyke, Robert J; Halbert, R J
2010-01-01
Forecast the return on investment (ROI) for advances in biologic therapies in years 2015 and 2030, based upon impact on disease prevalence, morbidity, and mortality for asthma, diabetes, and colorectal cancer. A deterministic, spreadsheet-based, forecasting model was developed based on trends in demographics and disease epidemiology. 'Return' was defined as reductions in disease burden (prevalence, morbidity, mortality) translated into monetary terms; 'investment' was defined as the incremental costs of biologic therapy advances. Data on disease prevalence, morbidity, mortality, and associated costs were obtained from government survey statistics or published literature. Expected impact of advances in biologic therapies was based on expert opinion. Gains in quality-adjusted life years (QALYs) were valued at $100,000 per QALY. The base case analysis, in which reductions in disease prevalence and mortality predicted by the expert panel are not considered, shows the resulting ROIs remain positive for asthma and diabetes but fall below $1 for colorectal cancer. Analysis involving expert panel predictions indicated positive ROI results for all three diseases at both time points, ranging from $207 for each incremental dollar spent on biologic therapies to treat asthma in 2030, to $4 for each incremental dollar spent on biologic therapies to treat colorectal cancer in 2015. If QALYs are not considered, the resulting ROIs remain positive for all three diseases at both time points. Society may expect substantial returns from investments in innovative biologic therapies. These benefits are most likely to be realized in an environment of appropriate use of new molecules. The potential variance between forecasted (from expert opinion) and actual future health outcomes could be significant. Similarly, the forecasted growth in use of biologic therapies relied upon unvalidated market forecasts.
NASA Astrophysics Data System (ADS)
Pulusani, Praneeth R.
As the number of electric vehicles on the road increases, the current power grid infrastructure will not be able to handle the additional load. Some approaches in Smart Grid research attempt to mitigate this, but those approaches alone will not be sufficient; even combined with the traditional solution of increased power production, they can leave the grid insufficient and imbalanced, leading to transformer blowouts, blackouts, blown fuses, and similar failures. The proposed solution supplements the "Smart Grid" to create a more sustainable power grid. To solve the problem, or at least mitigate its magnitude, measures can be taken that depend on weather forecast models. For instance, wind and solar forecasts can be used to create first-order Markov chain models that help predict the availability of additional power at certain times. These models are used in conjunction with the information processing layer and bidirectional signal processing components of electric vehicle charging systems to schedule the amount of energy transferred per time interval at various times. The research was divided into three distinct components: (1) Renewable Energy Supply Forecast Model, (2) Energy Demand Forecast from PEVs, and (3) Renewable Energy Resource Estimation. For the first component, power data from a local wind turbine and weather forecast data from NOAA were used to develop a wind energy forecast model, using a first-order Markov chain model as the foundation. In the second component, additional macro energy demand from PEVs in the Greater Rochester Area was forecast by simulating concurrent driving routes. In the third component, historical data from renewable energy sources were analyzed to estimate the renewable resources needed to offset the energy demand from PEVs. The results from these models and components can be used in smart grid applications for scheduling and delivering energy. Several solutions are discussed to mitigate the problems of overloaded transformers, insufficient energy supply, and higher utility costs.
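A first-order Markov chain of the kind used for the wind energy forecast model can be estimated directly from a discretized power series. The sketch below is a generic illustration; the binning into four availability levels and the sample series are assumptions.

```python
import numpy as np

def fit_markov_chain(states, n_states):
    """Estimate a first-order Markov transition matrix from a discretized series."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    # Row-normalize; fall back to a uniform row for states never visited
    rows = counts.sum(axis=1, keepdims=True)
    return np.where(rows > 0, counts / np.maximum(rows, 1), 1.0 / n_states)

# Hypothetical: wind power output binned into 4 availability levels
series = np.array([0, 1, 1, 2, 3, 2, 1, 0, 1, 2, 2, 3, 3, 2, 1])
P = fit_markov_chain(series, 4)
print(P)  # P[i, j] = probability of moving from level i to level j
```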
Prediction of a service demand using combined forecasting approach
NASA Astrophysics Data System (ADS)
Zhou, Ling
2017-08-01
Forecasting facilitates cutting operational and management costs while ensuring service levels for a logistics service provider. Our case study investigates how to forecast short-term logistics demand for an LTL (less-than-truckload) carrier. A combined approach draws on several forecasting methods simultaneously instead of a single method; it can offset the weakness of one method with the strength of another, improving prediction accuracy. The main issues in combined forecast modeling are how to select the methods to combine and how to determine the weight coefficients among them. The selection principles are that each method should be applicable to the forecasting problem itself, and that the methods should differ from one another in character as much as possible. Based on these principles, exponential smoothing, ARIMA, and a neural network are chosen to form the combined approach, and a least-squares technique is employed to determine the optimal weight coefficients among the forecasting methods. Simulation results show the advantage of the combined approach over the three single methods. The work done in this paper helps managers select prediction methods in practice.
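A minimal sketch of the least-squares weighting step: given in-sample forecasts from the three component methods, solve for the weights that minimize squared error. The numbers are invented; in practice the columns would come from fitted exponential smoothing, ARIMA, and neural network models.

```python
import numpy as np

# Columns: in-sample forecasts from exponential smoothing, ARIMA, and a neural
# network (hypothetical values); y: the observed demand
F = np.array([[102.0,  98.0, 101.0],
              [110.0, 112.0, 108.0],
              [ 95.0,  97.0,  99.0],
              [120.0, 118.0, 123.0]])
y = np.array([100.0, 111.0, 96.0, 121.0])

# Least-squares weights minimizing ||F w - y||^2
w, *_ = np.linalg.lstsq(F, y, rcond=None)
combined = F @ w
print("weights:", w.round(3), "combined forecast:", combined.round(1))
```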
NASA Astrophysics Data System (ADS)
Penn, C. A.; Clow, D. W.; Sexstone, G. A.
2017-12-01
Water supply forecasts are an important tool for water resource managers in areas where surface water is relied on for irrigating agricultural lands and for municipal water supplies. Forecast errors, which correspond to inaccurate predictions of total surface water volume, can lead to misallocated water and productivity loss, costing stakeholders millions of dollars. The objective of this investigation is to provide water resource managers with an improved understanding of the factors contributing to forecast error, and to help increase the accuracy of future forecasts. In many watersheds of the western United States, snowmelt contributes 50-75% of annual surface water flow and controls both the timing and volume of peak flow. Water supply forecasts from the Natural Resources Conservation Service (NRCS), National Weather Service, and similar cooperators use precipitation and snowpack measurements to provide water resource managers with an estimate of seasonal runoff volume. The accuracy of these forecasts can be limited by the available snowpack and meteorological data. In the headwaters of the Rio Grande, NRCS produces January through June monthly Water Supply Outlook Reports. This study evaluates the accuracy of these forecasts since 1990 and examines which factors may contribute to forecast error. The Rio Grande headwaters has experienced recent changes in land cover from bark beetle infestation and a large wildfire, which can affect hydrological processes within the watershed. To investigate trends and possible contributing factors in forecast error, a semi-distributed hydrological model was calibrated and run to simulate daily streamflow for the period 1990-2015. Annual and seasonal watershed and sub-watershed water balance properties were compared with seasonal water supply forecasts. Gridded meteorological datasets were used to assess changes in the timing and volume of spring precipitation events that may contribute to forecast error. Additionally, a spatially distributed, physics-based snow model was used to assess possible effects of land cover change on snowpack properties. Trends in forecast error are variable, while baseline model results show consistent under-prediction in the recent decade, highlighting possible compounding effects of climate and land cover change.
Probabilistic Space Weather Forecasting: a Bayesian Perspective
NASA Astrophysics Data System (ADS)
Camporeale, E.; Chandorkar, M.; Borovsky, J.; Care', A.
2017-12-01
Most space weather forecasts, at both the operational and research level, are not probabilistic in nature. Unfortunately, a prediction that does not provide a confidence level is not very useful in a decision-making scenario. Nowadays, forecast models range from purely data-driven machine learning algorithms to physics-based approximations of first-principle equations (and everything that sits in between). Uncertainties pervade all such models, at every level: from the raw data to finite-precision implementations of numerical methods. The most rigorous way of quantifying the propagation of uncertainties is to embrace a Bayesian probabilistic approach. One of the simplest and most robust machine learning techniques in the Bayesian framework is Gaussian Process regression and classification. Here, we present the application of Gaussian Processes to the problems of forecasting the Dst geomagnetic index, classifying solar wind type, and estimating diffusion parameters in radiation belt modeling. In each of these very diverse problems, the GP approach rigorously provides forecasts in the form of predictive distributions. In turn, these distributions can be used as input for ensemble simulations in order to quantify the amplification of uncertainties. We show that we have achieved excellent results in all of the standard metrics used to evaluate our models, with very modest computational cost.
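A minimal Gaussian Process regression sketch in the spirit described, using scikit-learn; the synthetic predictor/response pair below merely stands in for the solar-wind and Dst data, which are not reproduced here.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical: predict a geomagnetic index from a solar-wind coupling parameter
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))
y = -20 * np.exp(-0.5 * (X[:, 0] - 6) ** 2) + rng.normal(0, 1, 50)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X, y)

X_test = np.linspace(0, 10, 5).reshape(-1, 1)
mean, std = gp.predict(X_test, return_std=True)  # full predictive distribution
print(np.c_[mean, std])  # forecast plus uncertainty, usable as ensemble input
```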
Extensions and applications of ensemble-of-trees methods in machine learning
NASA Astrophysics Data System (ADS)
Bleich, Justin
Ensemble-of-trees algorithms have emerged to the forefront of machine learning due to their ability to generate high forecasting accuracy for a wide array of regression and classification problems. Classic ensemble methodologies such as random forests (RF) and stochastic gradient boosting (SGB) rely on algorithmic procedures to generate fits to data. In contrast, more recent ensemble techniques such as Bayesian Additive Regression Trees (BART) and Dynamic Trees (DT) rely on an underlying Bayesian probability model to generate the fits. These newer probability-model-based approaches show much promise versus their algorithmic counterparts, but also offer substantial room for improvement. The first part of this thesis focuses on methodological advances for ensemble-of-trees techniques, with an emphasis on the more recent Bayesian approaches. In particular, we focus on extensions of BART in four distinct ways. First, we develop a more robust implementation of BART for both research and application. We then develop a principled approach to variable selection for BART, as well as the ability to naturally incorporate prior information on important covariates into the algorithm. Next, we propose a method for handling missing data that relies on the recursive structure of decision trees and does not require imputation. Last, we relax the assumption of homoskedasticity in the BART model to allow for parametric modeling of heteroskedasticity. The second part of this thesis returns to the classic algorithmic approaches in the context of classification problems with asymmetric costs of forecasting errors. First, we consider the performance of RF and SGB more broadly and demonstrate their superiority to logistic regression for applications in criminology with asymmetric costs. Next, we use RF to forecast unplanned hospital readmissions upon patient discharge, taking asymmetric costs into account. Finally, we explore the construction of stable decision trees for forecasts of violence during probation hearings in court systems.
Forecasting the Cost-Effectiveness of Educational Incentives
ERIC Educational Resources Information Center
Abt, Clark C.
1974-01-01
A look at cost-effectiveness as the major characteristic for which to develop a forecasting method, because it encompasses concerns of most educators. It indicates relative costs and relative effectiveness, and provides a rational basis for optimal resource allocation. (Author)
Exploiting Domain Knowledge to Forecast Heating Oil Consumption
NASA Astrophysics Data System (ADS)
Corliss, George F.; Sakauchi, Tsuginosuke; Vitullo, Steven R.; Brown, Ronald H.
2011-11-01
The GasDay laboratory at Marquette University provides forecasts of energy consumption. One such service is the Heating Oil Forecaster, a service for heating oil or propane delivery companies. Accurate forecasts can help reduce the number of trucks and drivers while providing efficient inventory management by stretching the time between deliveries. Accurate forecasts also help retain valuable customers: if a customer runs out of fuel, the delivery service incurs the costs of an emergency delivery and often a service call, and the customer probably changes providers. The basic modeling is simple: fit delivery amounts s_k to cumulative heating degree days (HDD_k = Σ max(0, 60 °F − daily average temperature)), with a wind adjustment, for each delivery period: s_k ≈ ŝ_k = β0 + β1 HDD_k. For the first few deliveries, there is not enough data to provide a reliable estimate of K = 1/β1, so we use Bayesian techniques with priors constructed from historical data. A fresh model is trained for each customer with each delivery, producing daily consumption forecasts using actual and forecast weather until the next delivery. In practice, a delivery may not fill the oil tank if the delivery truck runs out of oil or the automatic shut-off activates prematurely; special outlier detection and recovery based on domain knowledge addresses this and other special cases. The error at each delivery is the difference between that delivery and the aggregate of daily forecasts using actual weather since the preceding delivery. Out-of-sample testing yields MAPE = 21.2% and an average error of 6.0% of tank capacity for Company A; for Company B, the corresponding figures are 31.5% and 6.6%. One heating oil delivery company that uses this forecasting service [1] reported that instances of a customer running out of oil fell from about 250 in 50,000 deliveries per year before contracting for the service to about 10 with it. They delivered slightly more oil with 20% fewer trucks and drivers, citing $250,000 in annual savings in operational costs.
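A sketch of the Bayesian-shrinkage idea behind the per-customer regression: with only a few deliveries, a Bayesian linear model keeps the slope estimate stable. sklearn's BayesianRidge is used here as a stand-in (the service's actual priors come from historical data), and the delivery numbers are invented.

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

# Hypothetical history: cumulative HDD per delivery period vs gallons delivered
hdd = np.array([[310.0], [455.0], [280.0], [512.0]])
gallons = np.array([148.0, 210.0, 139.0, 241.0])

# Bayesian regression shrinks the slope toward its prior when only a few
# deliveries exist, mimicking the paper's use of historically informed priors.
model = BayesianRidge()
model.fit(hdd, gallons)
beta1 = model.coef_[0]
print(f"gallons/HDD = {beta1:.3f}; K = 1/beta1 = {1 / beta1:.2f} HDD per gallon")
```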
Jones, Andrew M; Lomas, James; Moore, Peter T; Rice, Nigel
2016-10-01
We conduct a quasi-Monte-Carlo comparison of the recent developments in parametric and semiparametric regression methods for healthcare costs, both against each other and against standard practice. The population of English National Health Service hospital in-patient episodes for the financial year 2007-2008 (summed for each patient) is randomly divided into two equally sized subpopulations to form an estimation set and a validation set. Evaluating out-of-sample using the validation set, a conditional density approximation estimator shows considerable promise in forecasting conditional means, performing best for accuracy of forecasting and among the best four for bias and goodness of fit. The best performing model for bias is linear regression with square-root-transformed dependent variables, whereas a generalized linear model with square-root link function and Poisson distribution performs best in terms of goodness of fit. Commonly used models utilizing a log-link are shown to perform badly relative to other models considered in our comparison.
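The best goodness-of-fit specification named above (a GLM with square-root link and Poisson distribution) can be fit with statsmodels, as sketched below on synthetic cost data. Note the link class is spelled links.sqrt in older statsmodels releases and links.Sqrt in newer ones; all variable names and values here are illustrative.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical patient-level data: age and a comorbidity flag vs annual cost
rng = np.random.default_rng(1)
age = rng.uniform(20, 90, 500)
comorbid = rng.integers(0, 2, 500)
X = sm.add_constant(np.column_stack([age, comorbid]))
mu_sqrt = 5.0 + 0.3 * age / 10 + 3.0 * comorbid   # linear predictor on sqrt scale
costs = rng.poisson(lam=mu_sqrt ** 2)

# Poisson family with a square-root link, the best goodness-of-fit spec above
family = sm.families.Poisson(link=sm.families.links.Sqrt())
result = sm.GLM(costs, X, family=family).fit()
print(result.params)   # coefficients on the square-root scale
```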
Hospital output forecasts and the cost of empty hospital beds.
Pauly, M V; Wilson, P
1986-01-01
This article investigates the cost incurred when hospitals have different levels of beds to treat a given number of patients. The cost of hospital care is affected by both the forecasted level of admissions and the actual number of admissions. When the relationship between forecasted and actual admissions is held constant, it is found that an empty hospital bed at a typical hospital in Michigan has a relatively low cost, about 13 percent or less of the cost of an occupied bed. However, empty beds in large hospitals do add significantly to cost. If hospital beds are closed, whether by closing beds at hospitals which remain in business or by closing entire hospitals, cost savings are estimated to be small. PMID:3759473
The impact of wind power on electricity prices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brancucci Martinez-Anido, Carlo; Brinkman, Greg; Hodge, Bri-Mathias
This paper investigates the impact of wind power on electricity prices using a production cost model of the Independent System Operator - New England power system. Different scenarios in terms of wind penetration, wind forecasts, and wind curtailment are modeled in order to analyze the impact of wind power on electricity prices for different wind penetration levels and for different levels of wind power visibility and controllability. The analysis concludes that electricity price volatility increases even as electricity prices decrease with increasing wind penetration levels. The impact of wind power on price volatility is larger in the shorter term (5-min compared to hour-to-hour). The results presented show that over-forecasting wind power increases electricity prices while under-forecasting wind power reduces them. The modeling results also show that controlling wind power by allowing curtailment increases electricity prices, and for higher wind penetrations it also reduces their volatility.
Xue, J L; Ma, J Z; Louis, T A; Collins, A J
2001-12-01
As the United States end-stage renal disease (ESRD) program enters the new millennium, the continued growth of the ESRD population poses a challenge for policy makers, health care providers, and financial planners. To assist in future planning for the ESRD program, the growth of patient numbers and Medicare costs was forecasted to the year 2010 by modeling of historical data from 1982 through 1997. A stepwise autoregressive method and exponential smoothing models were used. The forecasting models for ESRD patient numbers demonstrated mean errors of -0.03 to 1.03%, relative to the observed values. The model for Medicare payments demonstrated -0.12% mean error. The R(2) values for the forecasting models ranged from 99.09 to 99.98%. On the basis of trends in patient numbers, this forecast projects average annual growth of the ESRD populations of approximately 4.1% for new patients, 6.4% for long-term ESRD patients, 7.1% for dialysis patients, 6.1% for patients with functioning transplants, and 8.2% for patients on waiting lists for transplants, as well as 7.7% for Medicare expenditures. The numbers of patients with ESRD in 2010 are forecasted to be 129,200 +/- 7742 (95% confidence limits) new patients, 651,330 +/- 15,874 long-term ESRD patients, 520,240 +/- 25,609 dialysis patients, 178,806 +/- 4349 patients with functioning transplants, and 95,550 +/- 5478 patients on waiting lists. The forecasted Medicare expenditures are projected to increase to $28.3 +/- 1.7 billion by 2010. These projections are subject to many factors that may alter the actual growth, compared with the historical patterns. They do, however, provide a basis for discussing the future growth of the ESRD program and how the ESRD community can meet the challenges ahead.
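The exponential smoothing side of the study's toolkit is easy to reproduce in outline. This sketch extrapolates an invented annual patient series with statsmodels (the stepwise autoregressive method is not shown, and the growth rate is an assumption).

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical annual dialysis patient counts, 1982-1997 (thousands)
patients = 60.0 * 1.07 ** np.arange(16)

# Additive-trend exponential smoothing, one of the model classes the study used
fit = ExponentialSmoothing(patients, trend="add").fit()
forecast = fit.forecast(13)   # extrapolate 1998 through 2010
print(np.round(forecast, 1))
```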
Water and Power Systems Co-optimization under a High Performance Computing Framework
NASA Astrophysics Data System (ADS)
Xuan, Y.; Arumugam, S.; DeCarolis, J.; Mahinthakumar, K.
2016-12-01
Water and energy system optimization has traditionally been treated as two separate processes, despite the systems' intrinsic interconnections (e.g., water is used for hydropower generation, and thermoelectric cooling requires a large amount of water withdrawal). Given the challenges of urbanization, technology uncertainty, resource constraints, and the imminent threat of climate change, a cyberinfrastructure is needed to facilitate and expedite research into the complex management of these two systems. To address these issues, we developed a High Performance Computing (HPC) framework for stochastic co-optimization of water and energy resources to inform water allocation and electricity demand. The project aims to improve conjunctive management of water and power systems under climate change by incorporating improved ensemble forecast models of streamflow and power demand. First, by downscaling and spatio-temporally disaggregating multimodel climate forecasts from General Circulation Models (GCMs), temperature and precipitation forecasts are obtained and input into multi-reservoir and power systems models. Extended from Optimus (Optimization Methods for Universal Simulators), the framework drives the multi-reservoir model and the power system model Temoa (Tools for Energy Model Optimization and Analysis), and uses the Particle Swarm Optimization (PSO) algorithm to solve high-dimensional stochastic problems. The utility of climate forecasts for the cost of water and power system operations is assessed and quantified under different forecast scenarios (i.e., no forecast, multimodel forecast, and perfect forecast). Risk management actions and renewable energy deployments will be investigated for the Catawba River basin, an area with adequate hydroclimate prediction skill and a critical basin with 11 reservoirs that supplies water and generates power for both North and South Carolina. Further research using this scalable decision-support framework will elucidate the intricate and interdependent relationship between water and energy systems and enhance the security of these two critical public infrastructures.
How does informational heterogeneity affect the quality of forecasts?
NASA Astrophysics Data System (ADS)
Gualdi, S.; De Martino, A.
2010-01-01
We investigate a toy model of inductive interacting agents aiming to forecast a continuous, exogenous random variable E. Private information on E is spread heterogeneously across agents. Herding turns out to be the preferred forecasting mechanism when heterogeneity is maximal. However, in such conditions aggregating information efficiently is hard even in the presence of learning, as the herding ratio rises significantly above the efficient-market expectation of 1, to values remarkably close to those observed empirically. We also study how different parameters (interaction range, learning rate, cost of information, and score memory) may affect this scenario and improve efficiency in the hard phase.
Olshansky, S J
1988-01-01
Official forecasts of mortality made by the U.S. Office of the Actuary throughout this century have consistently underestimated observed mortality declines. This is due, in part, to their reliance on the static extrapolation of past trends, an atheoretical statistical method that pays scant attention to the behavioral, medical, and social factors contributing to mortality change. A "multiple cause-delay model" more realistically portrays the effects on mortality of the presence of more favorable risk factors at the population level. Such revised assumptions produce large increases in forecasts of the size of the elderly population, and have a dramatic impact on related estimates of population morbidity, disability, and health care costs.
Performance assessment of a Bayesian Forecasting System (BFS) for real-time flood forecasting
NASA Astrophysics Data System (ADS)
Biondi, D.; De Luca, D. L.
2013-02-01
The paper evaluates, for a number of flood events, the performance of a Bayesian Forecasting System (BFS), with the aim of evaluating total uncertainty in real-time flood forecasting. The predictive uncertainty of future streamflow is estimated through the Bayesian integration of two separate processors: the former evaluates the propagation of input uncertainty onto simulated river discharge, while the latter computes the hydrological uncertainty of actual river discharge associated with all other possible sources of error. A stochastic model and a distributed rainfall-runoff model were assumed, respectively, for rainfall and hydrological response simulations. A case study was carried out for a small basin in the Calabria region (southern Italy). The performance assessment of the BFS was performed with verification tools suited to probabilistic forecasts of continuous variables such as streamflow. Graphical tools and scalar metrics were used to evaluate several attributes of the forecast quality of the entire time-varying predictive distributions: calibration, sharpness, accuracy, and the continuous ranked probability score (CRPS). Besides the overall system, which incorporates both sources of uncertainty, other hypotheses resulting from the BFS properties were examined, corresponding to (i) a perfect hydrological model; (ii) a non-informative rainfall forecast for predicting streamflow; and (iii) a perfect input forecast. The results emphasize the importance of using different diagnostic approaches to perform comprehensive analyses of predictive distributions, arriving at a multifaceted view of the attributes of the prediction. For the case study, the selected criteria revealed the interaction of the different sources of error, in particular the crucial role of the hydrological uncertainty processor in compensating, at the cost of wider forecast intervals, for the unreliable and biased predictive distribution resulting from the Precipitation Uncertainty Processor.
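Of the scalar metrics listed, the CRPS is the least standard to compute by hand. A common ensemble estimator is CRPS = E|X − y| − 0.5 E|X − X′|, sketched below on a synthetic streamflow ensemble (the numbers are invented).

```python
import numpy as np

def crps_ensemble(obs, ensemble):
    """CRPS for one observation and an ensemble forecast.

    Estimates CRPS = E|X - y| - 0.5 * E|X - X'| from the ensemble members.
    """
    ensemble = np.asarray(ensemble, dtype=float)
    term1 = np.mean(np.abs(ensemble - obs))
    term2 = 0.5 * np.mean(np.abs(ensemble[:, None] - ensemble[None, :]))
    return term1 - term2

rng = np.random.default_rng(0)
members = rng.normal(120.0, 15.0, size=50)   # streamflow ensemble, m^3/s
print(f"CRPS = {crps_ensemble(110.0, members):.2f} m^3/s")
```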
1984-09-01
Contents include process descriptions covering model building and model management. The system enables a manager to model a wide variety of technology, price, and cost situations without the associated overhead imposed by multiple application-specific systems. The Manager of the National Communications System (NCS) has been tasked by the National Security Telecommunications Policy of 3 August 1983 with ...
Vogl, Matthias; Leidl, Reiner
2016-05-01
The planning of health care management benefits from understanding future trends in demand and costs. For lung diseases in the German national hospital market, we therefore analyze the current structure of care and forecast future trends in key process indicators. We use standardized, patient-level, activity-based costing from a national cost calculation data set of respiratory cases, representing 11.9-14.1% of all cases in the major diagnostic category "respiratory system" from 2006 to 2012. To forecast hospital admissions, length of stay (LOS), and costs, the best-fitting of candidate autoregressive integrated moving average (ARIMA) and exponential smoothing models are used. The number of cases is predicted to increase substantially, from 1.1 million in 2006 to 1.5 million in 2018 (+2.7% each year). LOS is expected to decrease from 7.9 to 6.1 days, and overall costs to increase from 2.7 to 4.5 billion euros (+4.3% each year). Except for lung cancer (-2.3% each year), costs increase for all respiratory disease areas: surgical interventions +9.2% each year, COPD +3.9%, bronchitis and asthma +1.7%, infections +2.0%, respiratory failure +2.6%, and other diagnoses +8.5%. The share of surgical interventions in all costs of respiratory cases increases from 17.8% in 2006 to 30.8% in 2018. Overall costs are expected to rise particularly because of an increasing share of expensive surgical interventions and rare diseases, and because of higher intensive care, operating room, and diagnostics and therapy costs.
NASA Technical Reports Server (NTRS)
Shevell, R. S.; Jones, D. W., Jr.
1973-01-01
The development of a forecast model for short haul air transportation systems in the California Corridor is discussed. The factors which determine the level of air traffic demand are identified. A forecast equation for use in airport utilization analysis is developed. A mathematical model is submitted to show the relationship between population, employment, and income for indicating future air transportation utilization. Diagrams and tables of data are included to support the conclusions reached regarding air transportation economic factors.
AQA - Air Quality model for Austria - Evaluation and Developments
NASA Astrophysics Data System (ADS)
Hirtl, M.; Krüger, B. C.; Baumann-Stanzer, K.; Skomorowski, P.
2009-04-01
The regional weather forecast model ALADIN of the Central Institute for Meteorology and Geodynamics (ZAMG) is used in combination with the chemical transport model CAMx (www.camx.com) to forecast gaseous and particulate air pollution over Europe. The forecasts, produced in cooperation with the University of Natural Resources and Applied Life Sciences in Vienna (BOKU), have been supported by the regional governments since 2005, with the main interest being the prediction of tropospheric ozone. The daily ozone forecasts are evaluated for summer 2008 against observations from about 150 air quality stations in Austria. In 2008 the emission model SMOKE was integrated into the modelling system to calculate biogenic emissions; the anthropogenic emissions are based on the newest EMEP data set as well as on regional inventories for the core domain. The performance of SMOKE is shown for a summer period in 2007. In the framework of COST Action 728, "Enhancing mesoscale meteorological modelling capabilities for air pollution and dispersion applications", multi-model ensembles are used to conduct an international model evaluation. The model calculations of meteorological and concentration fields are compared to measurements on the ensemble platform at the Joint Research Centre (JRC) in Ispra. The results for two episodes in 2006 show the performance of the individual models as well as of the model ensemble.
Newton, Ashley N; Ewer, Sid R
2010-01-01
This study uses longitudinal data on inpatient treatment from the Agency for Healthcare Research and Quality's (AHRQ's) Healthcare Cost and Utilization Project (HCUP) to examine differences in historical trends, and to build future projections, of charges, costs, and lengths of stay (LOS) for inpatient treatment of four of the most prevalent cancer types: breast, colon, lung, and prostate. We stratify our data by hospital ownership type and by the four major cancer types. We use the Kruskal-Wallis (nonparametric ANOVA) test and time series models to analyze variance and build projections, respectively, for mean charges per discharge, mean costs per discharge, mean LOS per discharge, mean charges per day, and mean costs per day. We find that significant differences exist in both the mean charges per discharge and mean charges per day for breast, colon, lung, and prostate cancers, and in the mean LOS per discharge for breast cancer. Additionally, we find that both mean charges and mean costs are forecast to continue increasing while mean LOS is forecast to continue decreasing over the forecast period 2008 to 2012. The methodologies we employ may be used by individual hospital systems, and by health care policy-makers, for various financial planning purposes. Future studies could examine additional financial and nonfinancial variables for these and other cancer types, test for geographic disparities, or focus on procedure-level hospital measures.
European Wintertime Windstorms and their Links to Large-Scale Variability Modes
NASA Astrophysics Data System (ADS)
Befort, D. J.; Wild, S.; Walz, M. A.; Knight, J. R.; Lockwood, J. F.; Thornton, H. E.; Hermanson, L.; Bett, P.; Weisheimer, A.; Leckebusch, G. C.
2017-12-01
Winter storms associated with extreme wind speeds and heavy precipitation are the most costly natural hazard in several European countries. Improved understanding and seasonal forecast skill for winter storms will thus help society, policy-makers, and the (re)insurance industry be better prepared for such events. We first assess the ability of three seasonal forecast ensemble suites, ECMWF System3, ECMWF System4, and GloSea5, to represent extra-tropical windstorms over the Northern Hemisphere. Our results show significant skill for inter-annual variability of windstorm frequency over parts of Europe in two of these forecast suites (ECMWF-S4 and GloSea5), indicating the potential usefulness of current seasonal forecast systems. In a regression model we further derive windstorm variability from the NAO forecast by the seasonal model suites, thereby estimating the suitability of the NAO as the only predictor. We find that the NAO, as the main large-scale mode over Europe, can explain some of the achieved skill and is therefore an important source of variability in the seasonal models. However, the regression model fails to reproduce the skill level of the directly forecast windstorm frequency over large areas of central Europe, suggesting that the seasonal models also capture sources of windstorm variability and predictability other than the NAO. To investigate which other large-scale variability modes steer the interannual variability of windstorms, we develop a statistical model using a Poisson GLM. We find that the Scandinavian Pattern (SCA) in fact explains a larger amount of variability for Central Europe during the 20th century than the NAO. This statistical model skilfully reproduces the interannual variability of windstorm frequency, especially for the British Isles and Central Europe, with correlations up to 0.8.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Huaiguang; Zhang, Yingchen
This paper proposes an approach for distribution system state forecasting that aims to provide accurate and fast state forecasts with an optimal synchrophasor sensor placement (OSSP)-based state estimator and an extreme learning machine (ELM)-based forecaster. Specifically, considering sensor installation cost and measurement error, an OSSP algorithm is proposed to reduce the number of synchrophasor sensors while keeping the whole distribution system numerically and topologically observable. The weighted least squares (WLS)-based system state estimator is then used to produce the training data for the proposed forecaster. Traditionally, the artificial neural network (ANN) and support vector regression (SVR) are widely used in forecasting due to their nonlinear modeling capabilities. However, the ANN carries a heavy computation load and the best parameters for SVR are difficult to obtain. In this paper, the ELM, which overcomes these drawbacks, is used to forecast future system states from historical system states. Testing results show the proposed approach to be effective and accurate.
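An ELM is simple enough to sketch in a few lines: the hidden layer is random and fixed, and only the output weights are solved by least squares, which is what makes it cheap compared with a fully trained ANN. Everything below (dimensions, data) is illustrative.

```python
import numpy as np

def train_elm(X, y, n_hidden=100, seed=0):
    """Extreme learning machine: random hidden layer, least-squares output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights, never trained
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # only the output layer is solved
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Hypothetical: forecast the next system state from the previous estimated state
rng = np.random.default_rng(1)
X = rng.random((200, 8))                          # historical state vectors
y = X @ rng.random(8) + 0.01 * rng.random(200)    # synthetic next-step target
W, b, beta = train_elm(X, y)
print(predict_elm(X[:3], W, b, beta))
```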
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tian, Tian; Chernyakhovskiy, Ilya; Brancucci Martinez-Anido, Carlo
This document is the Spanish version of 'Greening the Grid - Forecasting Wind and Solar Generation, Improving System Operations'. It discusses improving system operations with forecasting of wind and solar generation. By integrating variable renewable energy (VRE) forecasts into system operations, power system operators can anticipate up- and down-ramps in VRE generation in order to cost-effectively balance load and generation in intra-day and day-ahead scheduling. This leads to reduced fuel costs, improved system reliability, and maximum use of renewable resources.
A Copula-Based Conditional Probabilistic Forecast Model for Wind Power Ramps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Brian S; Krishnan, Venkat K; Zhang, Jie
Efficient management of wind ramping characteristics can significantly reduce wind integration costs for balancing authorities. By considering the stochastic dependence of wind power ramp (WPR) features, this paper develops a conditional probabilistic wind power ramp forecast (cp-WPRF) model based on Copula theory. The WPRs dataset is constructed by extracting ramps from a large dataset of historical wind power. Each WPR feature (e.g., rate, magnitude, duration, and start-time) is separately forecasted by considering the coupling effects among different ramp features. To accurately model the marginal distributions with a copula, a Gaussian mixture model (GMM) is adopted to characterize the WPR uncertainty and features. The Canonical Maximum Likelihood (CML) method is used to estimate parameters of the multivariable copula. The optimal copula model is chosen based on the Bayesian information criterion (BIC) from each copula family. Finally, the best conditions-based cp-WPRF model is determined by predictive interval (PI) based evaluation metrics. Numerical simulations on publicly available wind power data show that the developed copula-based cp-WPRF model can predict WPRs with a high level of reliability and sharpness.
7 CFR 1710.205 - Minimum approval requirements for all load forecasts.
Code of Federal Regulations, 2014 CFR
2014-01-01
... computer software applications. RUS will evaluate borrower load forecasts for readability, understanding..., distribution costs, other systems costs, average revenue per kWh, and inflation. Also, a borrower's engineering...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Max; Smith, Sarah J.; Sohn, Michael D.
Fuel cells are both a longstanding and emerging technology for stationary and transportation applications, and their future use will likely be critical for the deep decarbonization of global energy systems. As we look into future applications, a key challenge for policy-makers and technology market forecasters who seek to track and/or accelerate market adoption is the ability to forecast the market costs of fuel cells as technology innovations are incorporated into market products. Specifically, there is a need to estimate technology learning rates, which are rates of cost reduction versus production volume. Unfortunately, no literature exists for forecasting future learning rates for fuel cells. In this paper, we look retrospectively to estimate learning rates for two fuel cell deployment programs: (1) the micro-combined heat and power (CHP) program in Japan, and (2) the Self-Generation Incentive Program (SGIP) in California. These two examples have a relatively broad set of historical market data and thus provide an informative and international comparison of distinct fuel cell technologies and government deployment programs. We develop a generalized procedure for disaggregating experience-curve cost reductions, apply it to separate the Japanese fuel cell micro-CHP market into its constituent components, and derive a range of learning rates that may explain observed market trends. Finally, we explore the differences in the technology development ecosystem and market conditions that may have contributed to the observed differences in cost reduction, and we draw policy observations for the market adoption of future fuel cell technologies. The scientific and policy contributions of this paper are the first comparative experience-curve analysis of past fuel cell technologies in two distinct markets, and the first quantitative comparison of a detailed cost model of fuel cell systems with actual market data. The resulting approach is applicable to analyzing other fuel cell markets and other energy-related technologies, and highlights the data needed for cost modeling and quantitative assessment of key cost reduction components.
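A minimal sketch of the experience-curve estimation underlying learning rates: regress log cost on log cumulative production, then convert the slope to a per-doubling learning rate. The data points below are invented.

```python
import numpy as np

# Hypothetical experience-curve data: cumulative units produced vs unit cost
cum_units = np.array([100, 300, 1000, 3000, 10000], dtype=float)
unit_cost = np.array([12000, 9500, 7400, 6000, 4700], dtype=float)

# Fit log2(cost) = a + b * log2(cum_units); learning rate LR = 1 - 2**b
b, a = np.polyfit(np.log2(cum_units), np.log2(unit_cost), 1)
learning_rate = 1 - 2.0 ** b
print(f"progress ratio = {2 ** b:.3f}, learning rate = {learning_rate:.1%} per doubling")
```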
Robust nonlinear canonical correlation analysis: application to seasonal climate forecasting
NASA Astrophysics Data System (ADS)
Cannon, A. J.; Hsieh, W. W.
2008-02-01
Robust variants of nonlinear canonical correlation analysis (NLCCA) are introduced to improve performance on datasets with low signal-to-noise ratios, for example those encountered when making seasonal climate forecasts. The neural network model architecture of standard NLCCA is kept intact, but the cost functions used to set the model parameters are replaced with more robust variants. The Pearson product-moment correlation in the double-barreled network is replaced by the biweight midcorrelation, and the mean squared error (mse) in the inverse mapping networks can be replaced by the mean absolute error (mae). Robust variants of NLCCA are demonstrated on a synthetic dataset and are used to forecast sea surface temperatures in the tropical Pacific Ocean based on the sea level pressure field. Results suggest that adoption of the biweight midcorrelation can lead to improved performance, especially when a strong, common event exists in both predictor/predictand datasets. Replacing the mse by the mae leads to improved performance on the synthetic dataset, but not on the climate dataset except at the longest lead time, which suggests that the appropriate cost function for the inverse mapping networks is more problem dependent.
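The biweight midcorrelation that replaces the Pearson correlation can be implemented directly from its definition (median-centered, Tukey-biweighted deviations); a sketch with an outlier-contaminated example follows. The tuning constant c = 9 is the conventional choice, and the data are synthetic.

```python
import numpy as np

def biweight_midcorrelation(x, y, c=9.0):
    """Robust correlation that downweights points far from the median."""
    def weighted_dev(v):
        mad = np.median(np.abs(v - np.median(v)))
        u = (v - np.median(v)) / (c * mad + 1e-12)
        w = (1 - u ** 2) ** 2
        w[np.abs(u) >= 1] = 0.0              # zero weight beyond the cutoff
        return (v - np.median(v)) * w
    dx, dy = weighted_dev(x), weighted_dev(y)
    return np.sum(dx * dy) / (np.sqrt(np.sum(dx ** 2)) * np.sqrt(np.sum(dy ** 2)))

rng = np.random.default_rng(2)
a = rng.normal(size=200)
b = a + 0.5 * rng.normal(size=200)
b[:5] = 20.0  # outliers that would badly distort the Pearson correlation
print(f"bicor = {biweight_midcorrelation(a, b):.3f}")
```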
Operational Forecasting and Warning systems for Coastal hazards in Korea
NASA Astrophysics Data System (ADS)
Park, Kwang-Soon; Kwon, Jae-Il; Kim, Jin-Ah; Heo, Ki-Young; Jun, Kicheon
2017-04-01
Coastal hazards caused by both Mother Nature and humans inflict tremendous social, economic, and environmental damage. To mitigate these damages, many countries run operational forecasting or warning systems. Since 2009 the Korea Operational Oceanographic System (KOOS) has been developed under the leadership of the Korea Institute of Ocean Science and Technology (KIOST), and it has been in operation since 2012. KOOS consists of several operational modules of numerical models and real-time observations, and produces the basic forecast variables such as winds, tides, waves, currents, temperature, and salinity. Practical application systems include storm-surge, oil-spill, and search-and-rescue prediction models. In particular, abnormal high waves (swell-like high-height waves) have occurred along the east coast of the Korean peninsula during the winter season owing to local meteorological conditions over the East Sea, causing property damage and the loss of human lives. To improve wave forecast accuracy, even for very local wave characteristics, a numerical wave modeling system based on SWAN was established with a 4D-EnKF data assimilation module, and sensitivity tests have been conducted. For predicting severe waves during typhoons and supporting decisions on ship evacuation, a high-resolution wave forecasting system has been established and calibrated.
Bridge Frost Prediction by Heat and Mass Transfer Methods
NASA Astrophysics Data System (ADS)
Greenfield, Tina M.; Takle, Eugene S.
2006-03-01
Frost on roadways and bridges can present hazardous conditions to motorists, particularly when it occurs in patches or on bridges when adjacent roadways are clear of frost. To minimize materials costs, vehicle corrosion, and negative environmental impacts, frost-suppression chemicals should be applied only when, where, and in the appropriate amounts needed to maintain roadways in a safe condition for motorists. Accurate forecasts of frost onset times, frost intensity, and frost disappearance (e.g., melting or sublimation) are needed to help roadway maintenance personnel decide when, where, and how much frost-suppression chemical to use. A finite-difference algorithm (BridgeT) has been developed that simulates vertical heat transfer in a bridge based on evolving meteorological conditions at its top and bottom as supplied by a weather forecast model. BridgeT simulates bridge temperatures at numerous points within the bridge (including its upper and lower surface) at each time step of the weather forecast model and calculates volume per unit area (i.e., depth) of deposited, melted, or sublimed frost. This model produces forecasts of bridge surface temperature, frost depth, and bridge condition (i.e., dry, wet, icy/snowy). Bridge frost predictions and bridge surface temperature are compared with observed and measured values to assess BridgeT's skill in forecasting bridge frost and associated conditions.
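A heavily simplified version of the vertical heat-transfer computation at the core of such a model: explicit finite differences on a 1-D slab with forecast air temperatures as boundary conditions. The thickness, diffusivity, and temperatures below are assumptions, and BridgeT itself is considerably more elaborate.

```python
import numpy as np

# Minimal 1-D finite-difference conduction through a bridge deck (assumed values)
L, n = 0.30, 31                 # deck thickness (m) and number of grid points
alpha = 7e-7                    # thermal diffusivity of concrete (m^2/s), assumed
dx = L / (n - 1)
dt = 0.4 * dx ** 2 / alpha      # explicit stability requires dt <= 0.5 dx^2/alpha

T = np.full(n, 5.0)             # initial deck temperature (deg C)
for _ in range(int(3600 / dt)):             # march one forecast hour
    T[0], T[-1] = -2.0, 1.0     # forecast air temps at top/bottom boundaries
    T[1:-1] += alpha * dt / dx ** 2 * (T[2:] - 2 * T[1:-1] + T[:-2])

print(f"near-surface node after 1 h: {T[1]:.1f} C (frost risk if below dew point)")
```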
Reconstructing paleoclimate fields using online data assimilation with a linear inverse model
NASA Astrophysics Data System (ADS)
Perkins, Walter A.; Hakim, Gregory J.
2017-05-01
We examine the skill of a new approach to climate field reconstructions (CFRs) using an online paleoclimate data assimilation (PDA) method. Several recent studies have foregone climate model forecasts during assimilation due to the computational expense of running coupled global climate models (CGCMs) and the relatively low skill of these forecasts on longer timescales. Here we greatly diminish the computational cost by employing an empirical forecast model (linear inverse model, LIM), which has been shown to have skill comparable to CGCMs for forecasting annual-to-decadal surface temperature anomalies. We reconstruct annual-average 2 m air temperature over the instrumental period (1850-2000) using proxy records from the PAGES 2k Consortium Phase 1 database; proxy models for estimating proxy observations are calibrated on GISTEMP surface temperature analyses. We compare results for LIMs calibrated using observational (Berkeley Earth), reanalysis (20th Century Reanalysis), and CMIP5 climate model (CCSM4 and MPI) data relative to a control offline reconstruction method. Generally, we find that the usage of LIM forecasts for online PDA increases reconstruction agreement with the instrumental record for both spatial fields and global mean temperature (GMT). Specifically, the coefficient of efficiency (CE) skill metric for detrended GMT increases by an average of 57 % over the offline benchmark. LIM experiments display a common pattern of skill improvement in the spatial fields over Northern Hemisphere land areas and in the high-latitude North Atlantic-Barents Sea corridor. Experiments for non-CGCM-calibrated LIMs reveal region-specific reductions in spatial skill compared to the offline control, likely due to aspects of the LIM calibration process. Overall, the CGCM-calibrated LIMs have the best performance when considering both spatial fields and GMT. A comparison with the persistence forecast experiment suggests that improvements are associated with the linear dynamical constraints of the forecast and not simply persistence of temperature anomalies.
Weather forecasts, users' economic expenses and decision strategies
NASA Technical Reports Server (NTRS)
Carter, G. M.
1972-01-01
Differing decision models and operational characteristics are examined that affect the economic expenses associated with the use of predictive weather information, i.e., the costs of protection and the losses suffered if no protective measures are taken.
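The classic formalization of this protection problem is the cost-loss decision model: protective action is worthwhile whenever the forecast probability of the adverse event exceeds the cost-loss ratio C/L. A toy sketch with invented values:

```python
# Cost-loss decision rule: protect when p_event > C / L
def should_protect(p_event, cost_protect, loss_unprotected):
    return p_event > cost_protect / loss_unprotected

p = 0.30              # forecast probability of the adverse weather event
C, L = 200.0, 1500.0  # cost of protection vs loss if caught unprotected
print(f"C/L = {C / L:.2f}; protect: {should_protect(p, C, L)}")  # 0.13 -> protect
```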
Planning Inmarsat's second generation of spacecraft
NASA Astrophysics Data System (ADS)
Williams, W. P.
1982-09-01
Studies for the next generation of the Inmarsat service are outlined, covering traffic forecasting, communications capacity estimation, space segment design, cost estimation, and financial analysis. Traffic forecasting requires future demand estimates, and a computer model has been developed that estimates demand over the Atlantic, Pacific, and Indian Ocean regions. Communications capacity estimates are based on the traffic estimates: a model converts traffic demand into a required capacity figure for a given area, using the Erlang formula together with additional data such as peak-hour ratios and distribution estimates. Basic space segment technical requirements are outlined (communications payload, transponder arrangements, etc.), and further design studies cover space segment configuration, launcher and spacecraft studies, transmission planning, and earth segment configurations. Cost estimates of proposed design parameters will be performed, but options must be reduced to make construction feasible. Finally, a financial analysis will be carried out to calculate financial returns.
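The Erlang formula mentioned for converting traffic demand into required capacity can be computed with a numerically stable recurrence. The sketch below sizes channels for an assumed peak-hour demand and blocking target; the specific numbers are illustrative, not Inmarsat figures.

```python
def erlang_b(servers, traffic_erlangs):
    """Erlang B blocking probability via the stable recurrence
    B_k = A*B_{k-1} / (k + A*B_{k-1}), with B_0 = 1."""
    B = 1.0
    for k in range(1, servers + 1):
        B = traffic_erlangs * B / (k + traffic_erlangs * B)
    return B

# Hypothetical sizing: channels needed so peak-hour blocking stays under 2%
demand = 45.0   # peak-hour traffic in Erlangs
channels = 1
while erlang_b(channels, demand) > 0.02:
    channels += 1
print(f"{channels} channels for {demand} E at <= 2% blocking")
```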
NASA Technical Reports Server (NTRS)
Kratochvil, D.; Bowyer, J.; Bhushan, C.; Steinnagel, K.; Kaushal, D.; Al-Kinani, G.
1983-01-01
Potential satellite-provided fixed communications services, baseline forecasts, net long haul forecasts, cost analysis, net addressable forecasts, capacity requirements, and satellite system market development are considered.
NASA Astrophysics Data System (ADS)
Wang, Yuanbing; Min, Jinzhong; Chen, Yaodeng; Huang, Xiang-Yu; Zeng, Mingjian; Li, Xin
2017-01-01
This study evaluates the performance of three-dimensional variational (3DVar) assimilation and a hybrid data assimilation system using time-lagged ensembles in a heavy rainfall event. The time-lagged ensembles are constructed by sampling from a moving 3-h time window along a model trajectory, which is economical and easy to implement. The proposed hybrid system introduces flow-dependent error covariance derived from the time-lagged ensemble into the variational cost function without significantly increasing computational cost. Single-observation tests are performed to document the characteristics of the hybrid system, and the sensitivity of precipitation forecasts to the ensemble covariance weight and localization scale is investigated. Additionally, the time-lagged ensemble hybrid (TLEn-Var) is evaluated and compared to ETKF (ensemble transform Kalman filter)-based hybrid assimilation within a continuously cycling framework, through which new hybrid analyses are produced every 3 h over 10 days. The 24-h accumulated precipitation, moisture, and wind are compared between 3DVar and the hybrid assimilation using time-lagged ensembles. Results show that model states and precipitation forecast skill are improved by the hybrid assimilation relative to 3DVar. Simulation of the precipitable water and the structure of the wind are also improved: cyclonic wind increments are generated near the rainfall center, leading to an improved precipitation forecast. This study indicates that hybrid data assimilation using time-lagged ensembles is a viable alternative or supplement for weather service agencies whose computing resources are too limited to run large ensembles.
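A sketch of the core idea: blend a static background covariance with a sample covariance built from time-lagged states. The dimensions, weight alpha, and placeholder matrices are assumptions, and the paper's system also applies localization, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_lag = 50, 7                           # state size; members in the 3-h window

lagged = rng.normal(size=(n_lag, n))       # model states sampled along the trajectory
perturb = lagged - lagged.mean(axis=0)     # time-lagged ensemble perturbations
B_ens = perturb.T @ perturb / (n_lag - 1)  # flow-dependent sample covariance

B_static = np.eye(n)                       # placeholder climatological covariance
alpha = 0.5                                # ensemble weight in the hybrid blend
B_hybrid = (1.0 - alpha) * B_static + alpha * B_ens
print(B_hybrid.shape)                      # covariance entering the cost function
```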
NASA Technical Reports Server (NTRS)
Mauldin, Lemuel E., III
1993-01-01
Travel Forecaster is a menu-driven, easy-to-use computer program that plans, forecasts the cost of, and tracks actual versus planned cost of business-related travel for a division or branch of an organization, and compiles the information into a data base to aid the travel planner. The program's ability to handle multiple trip entries makes it a valuable time-saving device.
With string model to time series forecasting
NASA Astrophysics Data System (ADS)
Pinčák, Richard; Bartoš, Erik
2015-10-01
The overwhelming majority of econometric models applied on a long-term basis in the financial forex market do not work sufficiently well. The reason is that transaction costs and arbitrage opportunities are not included, so the models do not simulate the real financial markets. Analyses are also conducted on aggregated data rather than on the non-equidistant data actually generated, which again is not the real financial case. In this paper, we would like to show a new way to analyze and, moreover, forecast the financial market. We utilize projections of the real exchange rate dynamics onto a string-like topology in the OANDA market. This approach allows us to build stable prediction models for trading in the financial forex market. A real application of the multi-string structures is provided to demonstrate our ideas for solving the problem of robust portfolio selection. A comparison with trend-following strategies was performed, and the stability of the algorithm with respect to transaction costs over long trading periods was confirmed.
NASA Astrophysics Data System (ADS)
Owens, Mathew J.; Riley, Pete
2017-11-01
Long lead-time space-weather forecasting requires accurate prediction of the near-Earth solar wind. The current state of the art uses a coronal model to extrapolate the observed photospheric magnetic field to the upper corona, where it is related to solar wind speed through empirical relations. These near-Sun solar wind and magnetic field conditions provide the inner boundary condition to three-dimensional numerical magnetohydrodynamic (MHD) models of the heliosphere out to 1 AU. This physics-based approach can capture dynamic processes within the solar wind, which affect the resulting conditions in near-Earth space. However, this deterministic approach lacks a quantification of forecast uncertainty. Here we describe a complementary method to exploit the near-Sun solar wind information produced by coronal models and provide a quantitative estimate of forecast uncertainty. By sampling the near-Sun solar wind speed at a range of latitudes about the sub-Earth point, we produce a large ensemble (N = 576) of time series at the base of the Sun-Earth line. Propagating these conditions to Earth by a three-dimensional MHD model would be computationally prohibitive; thus, a computationally efficient one-dimensional "upwind" scheme is used. The variance in the resulting near-Earth solar wind speed ensemble is shown to provide an accurate measure of the forecast uncertainty. Applying this technique over 1996-2016, the upwind ensemble is found to provide a more "actionable" forecast than a single deterministic forecast; potential economic value is increased for all operational scenarios, but particularly when false alarms are important (i.e., where the cost of taking mitigating action is relatively large).
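A minimal sketch of the computationally efficient 1-D upwind propagation described above: solar wind speed is advected from the inner boundary to 1 AU with a first-order upwind update. The grid sizes, synthetic boundary series, and time step are assumptions, not the authors' configuration.

```python
import numpy as np

# 1-D upwind advection of solar wind speed from near the Sun to 1 AU
r = np.linspace(30.0, 215.0, 200) * 6.96e5   # radial grid in km (30 solar radii to ~1 AU)
dr = r[1] - r[0]
dt = 0.5 * dr / 800.0                        # CFL-safe step for speeds up to 800 km/s

v = np.full(r.size, 400.0)                   # initial speed profile (km/s)
boundary = 400.0 + 150.0 * np.sin(np.linspace(0, 4 * np.pi, 2000))  # inner-boundary series

for vb in boundary:                          # one ensemble member's boundary time series
    v[0] = vb
    # First-order upwind update of dv/dt + v dv/dr = 0 (ballistic radial outflow)
    v[1:] = v[1:] - dt * v[1:] * (v[1:] - v[:-1]) / dr

print(f"forecast near-Earth speed: {v[-1]:.0f} km/s")
```

Running this once per latitude-sampled boundary series would yield the ensemble whose variance serves as the forecast-uncertainty estimate.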
A Public-Private-Academic Partnership to Advance Solar Power Forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haupt, Sue Ellen
The National Center for Atmospheric Research (NCAR) is pleased to have led a partnership to advance the state-of-the-science of solar power forecasting by designing, developing, building, deploying, testing, and assessing the SunCast™ Solar Power Forecasting System. The project has included cutting-edge research, testing in several geographically and climatologically diverse high-penetration solar utilities and Independent System Operators (ISOs), and wide dissemination of the research results to raise the bar on solar power forecasting technology. The partners include three other national laboratories, six universities, and industry partners. This public-private-academic team has worked in concert to perform use-inspired research to advance solar power forecasting, addressing both the necessary forecasting technologies and the metrics for evaluating them. The project culminated in a year-long, full-scale demonstration of providing irradiance and power forecasts to utilities and ISOs to use in their operations. The project focused on providing elements of a value chain, beginning with the weather that causes a deviation from clear-sky irradiance and progressing through monitoring and observations, modeling, forecasting, dissemination and communication of the forecasts, interpretation of the forecasts, and decision-making, which produces outcomes that have an economic value. The system has been evaluated using metrics developed specifically for this project, which has provided rich information on model and system performance. Research was accomplished on the very short-range (0-6 hour) Nowcasting system as well as on the longer-term (6-72 hour) forecasting system. The shortest-range forecasts are based on observations in the field. The shortest-range system, built by Brookhaven National Laboratory (BNL) and based on Total Sky Imagers (TSIs), is TSICast, which operates on the shortest time scale with a latency of only a few minutes and forecasts that currently go out to about 15 min. This project has facilitated research in improving the hardware and software so that the new high-definition cameras deployed at multiple nearby locations allow discernment of the clouds at varying levels and advection according to the winds observed at those levels. Improvements over "smart persistence" are about 29% even for these very short forecasts. StatCast is based on pyranometer data measured at the site as well as concurrent meteorological observations and forecasts. StatCast uses regime-dependent artificial intelligence forecasting techniques and has been shown to improve on "smart persistence" forecasts by 15-50%. A second category of short-range forecasting systems employs satellite imagery to discern clouds and their motion, allowing projection of the clouds, and the resulting blockage of irradiance, in time. CIRACast (the system produced by the Cooperative Institute for Research in the Atmosphere [CIRA] at Colorado State University) was already one of the more advanced cloud-motion systems, which is the reason that team was brought to this project. During the project timeframe, the CIRA team was able to advance cloud shadowing, parallax removal, and implementation of better advecting winds at different altitudes. CIRACast shows generally a 25-40% improvement over Smart Persistence between sunrise and approximately 1600 UTC (Coordinated Universal Time).
A second satellite-based system, MADCast (Multi-sensor Advective Diffusive foreCast system), assimilates data from multiple satellite imagers and profilers to construct a fully three-dimensional picture of the cloud field in the dynamic core of WRF, which allows advection of the clouds directly via the Weather Research and Forecasting (WRF) model dynamics. During 2015, MADCast provided at least a 70% improvement over Smart Persistence, with most of that skill derived during partly cloudy conditions. After WRF-Solar™ showed initial success, it was also deployed in nowcasting mode, with coarser runs out to 6 hours made hourly. It provided improvements on the order of 50-60% over Smart Persistence for forecasts up to 1600 UTC. The advantages of WRF-Solar-Nowcasting and MADCast were then blended to develop the new MAD-WRF model, which incorporates the most important features of each of those models, both assimilating satellite cloud fields and using WRF-Solar physics to develop and dissipate clouds. MAE for MAD-WRF forecasts at 3-6 hours is improved over WRF-Solar-Nowcasting by 20%. While all the Nowcasting system components by themselves provide improvement over Smart Persistence, the largest benefit is derived when they are smartly blended together by the Nowcasting Integrator to produce an integrated forecast. The development of WRF-Solar™ under this project has provided the first numerical weather prediction (NWP) model specifically designed to meet the needs of irradiance forecasting. The first augmentation improved the solar tracking algorithm to account for deviations associated with the eccentricity of the Earth's orbit and the obliquity of the Earth. Second, WRF-Solar™ added the direct normal irradiance (DNI) and diffuse (DIF) components from the radiation parameterization to the model output. Third, efficient parameterizations were implemented to either interpolate the irradiance between calls to the expensive radiative transfer parameterization, or to use a fast radiative transfer code that avoids computing three-dimensional heating rates but provides the surface irradiance. Fourth, a new parameterization was developed to improve the representation of absorption and scattering of radiation by aerosols (the aerosol direct effect). Fifth, the aerosols now interact with the cloud microphysics, altering cloud evolution and radiative properties, an effect that has traditionally been implemented only in computationally costly atmospheric chemistry models. A sixth development accounts for the feedbacks that sub-grid-scale clouds produce in shortwave irradiance, as implemented in a shallow cumulus parameterization. Finally, WRF-Solar™ also allows assimilation of infrared irradiances from satellites to determine the three-dimensional cloud field, allowing for an improved initialization of the cloud field that increases the performance of short-range forecasts. We find that WRF-Solar™ can improve clear-sky irradiance prediction by 15-80% over a standard version of WRF, depending on location and cloud conditions. In a formal comparison to the NAM baseline, WRF-Solar™ showed improvements in the day-ahead forecast of 22-42%. The SunCast™ system requires substantial software engineering to blend all of the new model components as well as existing publicly available NWP model runs. To do this we use an expert system for the Nowcasting blender and the Dynamic Integrated foreCast (DICast®) system for the NWP models.
These two systems are then blended; we use an empirical power conversion method to convert the irradiance predictions to power, then apply an analog ensemble (AnEn) approach to further tune the forecast as well as to estimate its uncertainty. The AnEn module decreased RMSE (root mean squared error) by 17% over the blended SunCast™ power forecasts and provided skill in the probabilistic forecast, with a Brier Skill Score of 0.55. In addition, we have also developed a Gridded Atmospheric Forecast System (GRAFS) in parallel, leveraging cost-share funds. An economic evaluation based on production cost modeling in the Public Service Company of Colorado showed that the observed 50% improvement in forecast accuracy will save their customers $819,200 with the projected MW deployment for 2024. Using econometrics, NCAR has scaled this savings to a national level and shown that the annual expected savings for this 50% forecast error reduction ranges from $11M in 2015 to $43M expected in 2040 with increased solar deployment. This amounts to a $455M discounted savings over the 26-year period of analysis.
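The analog ensemble step lends itself to a compact illustration. The sketch below, with synthetic data and invented variable names, finds the k past forecasts most similar to a new deterministic forecast and uses their verifying observations as a calibrated ensemble; an operational AnEn matches on multiple weighted predictors rather than a single value.

```python
import numpy as np

# Sketch of an analog ensemble (AnEn): for a new deterministic forecast, find
# the k most similar past forecasts and treat their verifying observations as
# an ensemble. All data here are synthetic.

rng = np.random.default_rng(2)
hist_fcst = rng.uniform(0, 800, 1000)            # past power forecasts (kW)
hist_obs = hist_fcst + rng.normal(0, 60, 1000)   # what actually verified

def analog_ensemble(new_fcst, k=20):
    """Return the k verifying observations whose forecasts best match new_fcst."""
    idx = np.argsort(np.abs(hist_fcst - new_fcst))[:k]
    return hist_obs[idx]

members = analog_ensemble(500.0)
print(f"calibrated forecast: {members.mean():.0f} kW "
      f"+/- {members.std():.0f} kW (uncertainty from analogs)")
```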
NASA Astrophysics Data System (ADS)
Abernethy, Jennifer A.
Pilots' ability to avoid clear-air turbulence (CAT) during flight affects the safety of the millions of people who fly commercial airlines and other aircraft, and turbulence costs millions of dollars in injuries and aircraft maintenance every year. Forecasting CAT is not straightforward, however; microscale features like the turbulence eddies that affect aircraft (~100 m) are below the current resolution of operational numerical weather prediction (NWP) models, and the only evidence of CAT episodes, until recently, has been sparse, subjective reports from pilots known as PIREPs. To forecast CAT, researchers use a simple weighted sum of top-performing turbulence indicators derived from NWP model outputs (termed diagnostics), weighted by their agreement with current PIREPs. However, a new, quantitative source of observation data is now available: high-density measurements made by sensor equipment and software on aircraft, called in-situ measurements. The main goal of this thesis is to develop new data analysis and processing techniques to apply to the model and new observation data in order to improve CAT forecasting accuracy. This thesis shows that using in-situ data improves forecasting accuracy and that automated machine learning algorithms such as support vector machines (SVM), logistic regression, and random forests can match current performance while eliminating almost all hand-tuning. Feature subset selection is paired with the new algorithms to choose diagnostics that predict well as a group rather than individually. Specializing forecasts and the choice of diagnostics by geographic region further improves accuracy because of the geographic variation in turbulence sources. This work uses random forests to find climatologically relevant regions based on these variations and implements a forecasting system testbed which brings these techniques together to rapidly prototype new, regionalized versions of operational CAT forecasting systems.
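A minimal sketch of the learning approach described, using synthetic stand-ins for the NWP-derived diagnostics and in-situ turbulence labels (scikit-learn assumed available); the real system's diagnostics, labels, and tuning are far more involved.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Sketch: replace a hand-tuned weighted sum of turbulence diagnostics with a
# classifier trained against in-situ observations. Data are synthetic.

rng = np.random.default_rng(3)
n = 2000
X = rng.standard_normal((n, 5))        # 5 hypothetical diagnostics
y = (0.8 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.5, n)) > 1.0  # synthetic CAT label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("hold-out accuracy:", clf.score(X_te, y_te))
print("diagnostic importances:", clf.feature_importances_.round(2))  # cue for group-wise selection
```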
NASA Astrophysics Data System (ADS)
Rochoux, M. C.; Ricci, S.; Lucor, D.; Cuenot, B.; Trouvé, A.
2014-05-01
This paper is the first part in a series of two articles and presents a data-driven wildfire simulator for forecasting wildfire spread scenarios, at a reduced computational cost that is consistent with operational systems. The prototype simulator features the following components: a level-set-based fire propagation solver, FIREFLY, that adopts a regional-scale modeling viewpoint, treats wildfires as surface propagating fronts, and uses a description of the local rate of fire spread (ROS) as a function of environmental conditions based on Rothermel's model; a series of airborne-like observations of the fire front positions; and a data assimilation algorithm based on an ensemble Kalman filter (EnKF) for parameter estimation. This stochastic algorithm partly accounts for the non-linearities between the input parameters of the semi-empirical ROS model and the fire front position, and is sequentially applied to provide a spatially-uniform correction to wind and biomass fuel parameters as observations become available. A wildfire spread simulator combined with an ensemble-based data assimilation algorithm is therefore a promising approach to reduce uncertainties in the forecast position of the fire front and to introduce a paradigm-shift in the wildfire emergency response. In order to reduce the computational cost of the EnKF algorithm, a surrogate model based on a polynomial chaos (PC) expansion is used in place of the forward model FIREFLY in the resulting hybrid PC-EnKF algorithm. The performance of EnKF and PC-EnKF is assessed on synthetically-generated simple configurations of fire spread, as well as on a controlled grassland fire experiment, to provide valuable information and insight on the benefits of the PC-EnKF approach. The results indicate that the proposed PC-EnKF algorithm features similar performance to the standard EnKF algorithm, but at a much reduced computational cost. In particular, the re-analysis and forecast skills of data assimilation strongly relate to the spatial and temporal variability of the errors in the ROS model parameters.
NASA Astrophysics Data System (ADS)
Rochoux, M. C.; Ricci, S.; Lucor, D.; Cuenot, B.; Trouvé, A.
2014-11-01
This paper is the first part in a series of two articles and presents a data-driven wildfire simulator for forecasting wildfire spread scenarios, at a reduced computational cost that is consistent with operational systems. The prototype simulator features the following components: an Eulerian front propagation solver, FIREFLY, that adopts a regional-scale modeling viewpoint, treats wildfires as surface propagating fronts, and uses a description of the local rate of fire spread (ROS) as a function of environmental conditions based on Rothermel's model; a series of airborne-like observations of the fire front positions; and a data assimilation (DA) algorithm based on an ensemble Kalman filter (EnKF) for parameter estimation. This stochastic algorithm partly accounts for the nonlinearities between the input parameters of the semi-empirical ROS model and the fire front position, and is sequentially applied to provide a spatially uniform correction to wind and biomass fuel parameters as observations become available. A wildfire spread simulator combined with an ensemble-based DA algorithm is therefore a promising approach to reduce uncertainties in the forecast position of the fire front and to introduce a paradigm-shift in the wildfire emergency response. In order to reduce the computational cost of the EnKF algorithm, a surrogate model based on a polynomial chaos (PC) expansion is used in place of the forward model FIREFLY in the resulting hybrid PC-EnKF algorithm. The performance of EnKF and PC-EnKF is assessed on synthetically generated simple configurations of fire spread, as well as on a controlled grassland fire experiment, to provide valuable information and insight on the benefits of the PC-EnKF approach. The results indicate that the proposed PC-EnKF algorithm features similar performance to the standard EnKF algorithm, but at a much reduced computational cost. In particular, the re-analysis and forecast skills of DA strongly relate to the spatial and temporal variability of the errors in the ROS model parameters.
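The EnKF parameter-estimation step can be sketched with a toy forward model. Below, the "fire front" is simply position = ROS * time, standing in for FIREFLY; the prior, truth, and observation error are all invented.

```python
import numpy as np

# Sketch of stochastic EnKF parameter estimation: an ensemble of rate-of-spread
# (ROS) guesses is corrected from one observed fire-front position.

rng = np.random.default_rng(4)
n_ens = 50
true_ros = 0.5                                    # m/s, unknown in practice
params = rng.normal(0.8, 0.3, n_ens)              # prior ensemble of ROS values

t_obs, obs_err = 100.0, 5.0                       # front position observed at t = 100 s
y_obs = true_ros * t_obs + rng.normal(0.0, obs_err)

fronts = params * t_obs                           # ensemble forecast of front position
cov_py = np.cov(params, fronts)[0, 1]             # parameter-observation covariance
gain = cov_py / (fronts.var(ddof=1) + obs_err**2) # scalar Kalman gain
perturbed_obs = y_obs + rng.normal(0.0, obs_err, n_ens)
params += gain * (perturbed_obs - fronts)         # stochastic EnKF update

print(f"prior mean ROS 0.80 m/s -> posterior {params.mean():.2f} m/s "
      f"(truth {true_ros} m/s)")
```

In the hybrid PC-EnKF, the line computing `fronts` would call a polynomial chaos surrogate instead of the expensive forward model.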
NASA Astrophysics Data System (ADS)
Baklanov, Alexander; Smith Korsholm, Ulrik; Nuterman, Roman; Mahura, Alexander; Pagh Nielsen, Kristian; Hansen Sass, Bent; Rasmussen, Alix; Zakey, Ashraf; Kaas, Eigil; Kurganskiy, Alexander; Sørensen, Brian; González-Aparicio, Iratxe
2017-08-01
The Environment - High Resolution Limited Area Model (Enviro-HIRLAM) is developed as a fully online integrated numerical weather prediction (NWP) and atmospheric chemical transport (ACT) model for research and forecasting of joint meteorological, chemical and biological weather. The integrated modelling system is developed by the Danish Meteorological Institute (DMI) in collaboration with several European universities. It is the baseline system in the HIRLAM Chemical Branch and is used in several countries and different applications. The development was initiated at DMI more than 15 years ago. The model is based on the HIRLAM NWP model with online integrated pollutant transport and dispersion, chemistry, aerosol dynamics, deposition and atmospheric composition feedbacks. To make the model suitable for chemical weather forecasting in urban areas, the meteorological part was improved by the implementation of urban parameterisations. The dynamical core was improved by implementing a locally mass-conserving semi-Lagrangian numerical advection scheme, which improves forecast accuracy and model performance. The current version (7.2), in comparison with previous versions, has a more advanced and cost-efficient chemistry, an aerosol multi-compound approach, and aerosol feedbacks (direct and semi-direct) on radiation and (first and second indirect effects) on cloud microphysics. Since 2004, Enviro-HIRLAM has been used for different studies, including operational pollen forecasting for Denmark since 2009 and operational forecasting of atmospheric composition with downscaling for China since 2017. Following the main research and development strategy, further model developments will be extended towards the new NWP platform, HARMONIE. Different aspects of the online coupling methodology, the research strategy, possible applications of the modelling system, and fit-for-purpose model configurations for the meteorological and air quality communities are discussed.
NASA Astrophysics Data System (ADS)
Wanders, Niko; Wood, Eric
2016-04-01
Sub-seasonal to seasonal weather and hydrological forecasts have the potential to provide vital information for a variety of water-related decision makers. For example, seasonal forecasts of drought risk can enable farmers to make adaptive choices on crop varieties, labour usage, and technology investments. Seasonal and sub-seasonal predictions can increase preparedness for hydrological extremes that regularly occur in all regions of the world with large impacts on society. We investigated the skill of six seasonal forecast models from the NMME-2 ensemble coupled to two global hydrological models (VIC and PCRGLOBWB) for the period 1982-2012. The 31 years of NMME-2 hindcast data are used, in combination with an ensemble mean and an ESP forecast, to forecast important hydrological variables (e.g. soil moisture, groundwater storage, snow, reservoir levels and river discharge). By using two global hydrological models we are able to quantify both the uncertainty in the meteorological input and the uncertainty introduced by the different hydrological models. We show that the NMME-2 forecast outperforms the ESP forecasts in terms of anomaly correlation and Brier skill score for all forecasted hydrological variables, with low uncertainty in the performance amongst the hydrological models. However, the continuous ranked probability score (CRPS) of the NMME-2 ensemble is inferior to the ESP due to a large spread between the individual ensemble members. We use a cost analysis to show that the damage caused by floods and droughts in large-scale rivers can globally be reduced by 48% (for leads of 1-2 months) to 20% (for leads between 6-9 months) when precautions are taken based on the NMME-2 ensemble instead of an ESP forecast. In collaboration with our local partner in West Africa (AGRHYMET), we looked at the performance of the sub-seasonal forecasts for crop planting dates and the high-flow season in West Africa. We show that the uncertainty in the optimal planting date is reduced from 30 days to 12 days (2.5-month lead) and that the uncertainty in the onset of the high-flow season is reduced from 45 days to 20 days (3-4 month lead). Additionally, we show that snow accumulation and melt onset in the Northern Hemisphere can be forecasted with an uncertainty of 10 days (2.5-month lead). Both the overall skill and the skill found in these last two examples indicate that the new NMME-2 forecast dataset is valuable for sub-seasonal forecast applications. The high temporal resolution (daily), long leads (up to one year) and large hindcast archive enable new sub-seasonal forecasting applications to be explored. We show that the NMME-2 has large potential for sub-seasonal hydrological forecasting and other potential hydrological applications (e.g. reservoir management), which could benefit from these new forecasts.
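Two of the verification scores named above are easy to state in code. The sketch below computes a Brier skill score for a threshold-exceedance event and the empirical CRPS for a synthetic discharge ensemble; all data are invented.

```python
import numpy as np

# Sketch of ensemble verification: Brier skill score for a binary "high flow"
# event, and the empirical CRPS formula E|X - y| - 0.5 * E|X - X'|.

rng = np.random.default_rng(5)
n_days, n_ens = 365, 24
obs = rng.gamma(2.0, 50.0, n_days)                      # observed discharge
ens = obs[:, None] + rng.normal(0, 40, (n_days, n_ens)) # ensemble forecasts

# Brier skill score vs climatology for exceeding a flood threshold
thresh = np.percentile(obs, 90)
p_fcst = (ens > thresh).mean(axis=1)
event = (obs > thresh).astype(float)
bs = np.mean((p_fcst - event) ** 2)
bs_clim = np.mean((event.mean() - event) ** 2)
print("Brier skill score:", 1 - bs / bs_clim)

def crps(members, y):
    """Empirical CRPS for one forecast: accuracy term minus spread term."""
    term1 = np.abs(members - y).mean()
    term2 = np.abs(members[:, None] - members[None, :]).mean()
    return term1 - 0.5 * term2

print("mean CRPS:", np.mean([crps(ens[d], obs[d]) for d in range(n_days)]))
```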
Present and future hydropower scheduling in Statkraft
NASA Astrophysics Data System (ADS)
Bruland, O.
2012-12-01
Statkraft produces close to 40 TWh in an average year and is one of the largest hydropower producers in Europe. For hydropower producers the scheduling of electricity generation is the key to success, and this depends on optimal use of the water resources. The hydrologist and his forecasts, both short and long term, are crucial to this success. The hydrological forecasts in Statkraft, and in most hydropower companies in Scandinavia, are based on lumped models and the HBV concept. But ahead of the hydrological model there is a complex system for collecting, controlling and correcting the data applied in the models and the production scheduling and, equally important, routines for surveillance of the processes and manual intervention. Prior to forecasting, the states in the hydrological models are updated based on observations. When snow is present in the catchments, snow surveys are an important source for model updating. The meteorological forecast is another premise provider to the hydrological forecast, and to get as precise a meteorological forecast as possible Statkraft hires resources from the governmental forecasting center. Their task is to interpret the meteorological situation, describe the uncertainties and, if necessary, use their knowledge and experience to manually correct the forecast in the hydropower production regions. This is one of several forecasts applied further in the scheduling process. Both to be able to compare and evaluate different forecast providers and to ensure that we get the best available forecast, forecasts from different sources are applied. Some of these forecasts have undergone statistical corrections to reduce biases. The uncertainties related to the meteorological forecast have for a long time been approached and described by ensemble forecasts. But the observations used for updating the model also have a related uncertainty, both in the observations themselves and in how well they represent the catchment. Though well known, these uncertainties have thus far been handled superficially. Statkraft has initiated a program called ENKI to approach these issues. One part of this program is to apply distributed models for hydrological forecasting; other parts develop methodologies to handle uncertainties in the observations, the meteorological forecasts and the model itself, and to update the model with this information. Together with energy price expectations and information about the state of the energy production system, the hydrological forecast is input to the next step in the production scheduling, both short and long term. The long-term schedule for reservoir filling is a premise provider to the short-term optimization of water use. The long-term schedule is based on the actual reservoir levels, snow storages and a long history of meteorological observations, and gives an overall schedule at a regional level. Within the regions a more detailed tool is used for short-term optimization of the hydropower production. Each reservoir is scheduled taking into account restrictions in the water courses and the cost of starting and stopping generating units. The value of the water is calculated for each reservoir and reflects the risk of water spillage. This, compared to the energy price, determines whether a unit will run or not. In a gradually more complex energy system with relatively lower regulated capacity this is an increasingly challenging task.
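The run/idle decision described at the end can be caricatured in a few lines. In this sketch the water value, start cost, capacity, and prices are invented numbers; real scheduling also handles watercourse restrictions, reservoir coupling, and uncertainty.

```python
# Sketch of the dispatch rule: a unit runs when the market price exceeds the
# calculated water value of its reservoir, allowing for the cost of a start.

water_value = 32.0      # EUR/MWh: marginal value of stored water (spill risk lowers it)
start_cost = 150.0      # EUR per start of the unit
capacity = 100.0        # MW

hourly_prices = [28, 30, 35, 41, 38, 29]
running = False
for p in hourly_prices:
    margin = (p - water_value) * capacity       # hourly profit vs. saving the water
    if not running and margin > start_cost:     # must also recover the start cost
        running = True
    elif running and margin < 0:
        running = False
    print(f"price {p:>2} EUR/MWh -> {'RUN' if running else 'idle'}")
```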
NASA Astrophysics Data System (ADS)
Barthélémy, S.; Ricci, S.; Morel, T.; Goutal, N.; Le Pape, E.; Zaoui, F.
2018-07-01
In the context of hydrodynamic modeling, the use of 2D models is appropriate in areas where the flow is not one-dimensional (confluence zones, flood plains). Nonetheless, the lack of field data and computational cost constraints limit the extensive use of 2D models for operational flood forecasting. Multi-dimensional coupling offers a solution, with 1D models where the flow is one-dimensional and local 2D models where needed. This solution allows for the representation of complex processes in the 2D models, while the simulated hydraulic state is significantly better than that of a full 1D model. In this study, coupling is implemented between three 1D sub-models and a local 2D model for a confluence on the Adour river (France). A Schwarz algorithm is implemented to guarantee the continuity of the variables at the 1D/2D interfaces, while in situ observations are assimilated in the 1D sub-models to improve results and forecasts in operational mode, as carried out by the French flood forecasting services. An implementation of the coupling and data assimilation (DA) solution with domain decomposition and task/data parallelism is proposed so that it is compatible with operational constraints. The coupling with the 2D model improves the simulated hydraulic state compared to a global 1D model, and DA improves results in both 1D and 2D areas.
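The interface treatment can be illustrated with a toy Schwarz-type iteration in which two stand-in "models" exchange interface values until continuity is reached. The linear maps below are invented placeholders for the 1D and 2D hydraulic codes.

```python
# Sketch of a Schwarz coupling iteration: each subdomain solver is run with
# the other's latest interface value until the exchanged values stop changing.

def solve_1d(h_interface):
    """Toy 1-D model: returns the discharge it sends to the interface."""
    return 0.5 * h_interface + 2.0

def solve_2d(q_interface):
    """Toy 2-D model: returns the water level at the interface."""
    return 0.8 * q_interface + 1.0

h = 0.0                                # initial interface water-level guess
for it in range(50):
    q = solve_1d(h)                    # 1-D model forced by interface level
    h_new = solve_2d(q)                # 2-D model forced by interface discharge
    if abs(h_new - h) < 1e-10:         # continuity achieved at the interface
        break
    h = h_new
print(f"converged after {it} Schwarz iterations: h = {h:.6f}, q = {q:.6f}")
```

Convergence here follows because the composed map is a contraction; in practice convergence of the iteration is a property of the coupled discretizations.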
Replacement Beef Cow Valuation under Data Availability Constraints
Hagerman, Amy D.; Thompson, Jada M.; Ham, Charlotte; Johnson, Kamina K.
2017-01-01
Economists are often tasked with estimating the benefits or costs associated with livestock production losses; however, a lack of available data or the absence of consistent reporting can reduce the accuracy of these valuations. This work looks at three potential estimation techniques for determining the value of replacement beef cows with varying types of market data, used to proxy constrained data availability, and discusses the potential margin of error for each technique. Oklahoma bred replacement cows are valued using hedonic pricing based on Oklahoma bred cow data (a best-case scenario); vector error correction modeling (VECM) based on national cow sales data; and cost of production (COP) based on just a representative enterprise budget and very limited sales data. Each method was then used to perform a within-sample forecast of January to December 2016, and the forecasts are compared with the 2016 monthly observed market prices in Oklahoma using the mean absolute percent error (MAPE). Hedonic pricing methods tend to overvalue for within-sample forecasting but performed best, as measured by MAPE, for high quality cows. The VECM tended to undervalue cows but performed best for younger animals. COP performed well compared with the more data-intensive methods. Examining each method individually across eight representative replacement beef female types, the VECM forecast resulted in a MAPE under 10% for 33% of forecasted months, followed by hedonic pricing at 24% of the forecasted months and COP at 14% of the forecasted months for average quality beef females. For high quality females, the hedonic pricing method worked best, producing a MAPE under 10% in 36% of the forecasted months, followed by the COP method at 21% of months and the VECM at 14% of the forecasted months. These results suggest that livestock valuation method selection is not one-size-fits-all and may need to vary based not only on the data available but also on the characteristics (e.g., quality or age) of the livestock being valued.
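The scoring step is simple enough to sketch. Below, each method's monthly forecasts are compared against observed prices with MAPE; the prices and the systematic over/under-valuation patterns are invented for illustration.

```python
import numpy as np

# Sketch of the within-sample comparison: score each valuation method's
# monthly forecasts against observed prices with mean absolute percent error.

observed = np.array([1450, 1480, 1500, 1520, 1490, 1470,
                     1460, 1440, 1430, 1450, 1465, 1475], dtype=float)

def mape(forecast, actual):
    return 100.0 * np.mean(np.abs((forecast - actual) / actual))

forecasts = {
    "hedonic": observed * 1.06,                    # tends to overvalue
    "VECM":    observed * 0.95,                    # tends to undervalue
    "COP":     observed + np.linspace(-80, 80, 12),
}
for name, f in forecasts.items():
    print(f"{name:8s} MAPE = {mape(f, observed):.1f}%")
```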
Cost drivers and resource allocation in military health care systems.
Fulton, Larry; Lasdon, Leon S; McDaniel, Reuben R
2007-03-01
This study illustrates the feasibility of incorporating technical efficiency considerations in the funding of military hospitals and identifies the primary drivers of hospital costs. Secondary data collected for 24 U.S.-based Army hospitals and medical centers for the years 2001 to 2003 are the basis for this analysis. Technical efficiency was measured by using data envelopment analysis; subsequently, efficiency estimates were included in logarithmic-linear cost models that specified cost as a function of volume, complexity, efficiency, time, and facility type. These logarithmic-linear models were compared against stochastic frontier analysis models. A parsimonious, three-variable, logarithmic-linear model composed of volume, complexity, and efficiency variables exhibited a strong linear relationship with observed costs (R² = 0.98). This model also proved reliable in forecasting (R² = 0.96). Based on our analysis, as much as $120 million might be reallocated to improve the performance of the United States-based Army hospitals evaluated in this study.
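The parsimonious model has the form ln(cost) = b0 + b1 ln(volume) + b2 ln(complexity) + b3 ln(efficiency). The sketch below fits that form to synthetic data by least squares; only the functional form comes from the abstract, and all coefficients and data are invented.

```python
import numpy as np

# Sketch of a three-variable logarithmic-linear cost model fit.

rng = np.random.default_rng(6)
n = 72
volume = rng.uniform(1e3, 5e4, n)
complexity = rng.uniform(0.8, 2.5, n)
efficiency = rng.uniform(0.6, 1.0, n)            # e.g. a DEA score in (0, 1]
ln_cost = (2.0 + 0.9 * np.log(volume) + 0.6 * np.log(complexity)
           - 0.4 * np.log(efficiency) + rng.normal(0, 0.05, n))

X = np.column_stack([np.ones(n), np.log(volume),
                     np.log(complexity), np.log(efficiency)])
beta, *_ = np.linalg.lstsq(X, ln_cost, rcond=None)
resid = ln_cost - X @ beta
r2 = 1 - resid.var() / ln_cost.var()
print("coefficients:", beta.round(3), " R^2 =", round(r2, 3))
```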
Forecasting Construction Cost Index based on visibility graph: A network approach
NASA Astrophysics Data System (ADS)
Zhang, Rong; Ashuri, Baabak; Shyr, Yu; Deng, Yong
2018-03-01
Engineering News-Record (ENR), a professional magazine in the field of global construction engineering, publishes the Construction Cost Index (CCI) every month. Cost estimators and contractors assess projects, arrange budgets, and prepare bids by forecasting CCI. However, fluctuations and uncertainties in CCI lead to irrational estimations now and then. This paper aims at achieving more accurate predictions of CCI with a network approach in which the time series is first converted into a visibility graph and future values are then forecast by link prediction. According to the experimental results, the proposed method shows satisfactory performance, with acceptable error measures. Compared with other methods, the proposed method is easier to implement and is able to forecast CCI with smaller errors. The results indicate that the proposed method is efficient at providing considerably accurate CCI predictions, which will contribute to construction engineering by assisting individuals and organizations in reducing costs and making project schedules.
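The core construction is simple to state: two observations are linked if the straight line between them clears every intermediate observation. A small Python sketch with made-up CCI values follows; the forecasting step (link prediction on the resulting graph) is not reproduced here.

```python
import numpy as np

# Sketch of the natural visibility graph: points (a, y[a]) and (b, y[b]) are
# linked iff every point between them lies strictly below the connecting chord.

def visibility_edges(y):
    """Return the edge list of the natural visibility graph of series y."""
    n, edges = len(y), []
    for a in range(n - 1):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.append((a, b))
    return edges

cci = [4800, 4825, 4810, 4860, 4840, 4875, 4890, 4870]   # made-up index values
edges = visibility_edges(cci)
degree = np.bincount(np.array(edges).ravel(), minlength=len(cci))
print("edges:", edges)
print("node degrees:", degree)   # link prediction on this graph drives the forecast
```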
Hydrologic Forecasting in the 21st Century: Challenges and Directions of Research
NASA Astrophysics Data System (ADS)
Restrepo, P.; Schaake, J.
2009-04-01
Traditionally, the role of the Hydrology program of the National Weather Service has been centered on forecasting floods, in order to minimize loss of life and damage to property, as well as forecasting water levels for navigable rivers and water supply in some areas of the country. A number of factors, including shifting population patterns, widespread drought, and concerns about climate change, have made it imperative to widen the focus to cover forecasting flows ranging from drought to floods and anything in between. Because of these concerns, it is imperative to develop models that rely more on the physical characteristics of the watershed for parameterization and less on historical observations. Furthermore, it is also critical to consider explicitly the sources of uncertainty in the forecasting process, including parameter values, model structure, forcings (both observations and forecasts), initial conditions, and streamflow observations. A consequence of more widespread occurrence of low flows, as a result either of the already evident earlier snowmelt in the Western United States or of the predicted changes in precipitation patterns, is the issue of water quality: lower flows will have higher concentrations of certain pollutants. This paper describes the current projects and future directions of research for hydrologic forecasting in the United States. Ongoing projects on quantitative precipitation and temperature estimates and forecasts, uncertainty modeling by the use of ensembles, data assimilation, verification, and distributed conceptual modeling are reviewed. Broad goals of the research directions are: 1) reliable modeling of the different sources of uncertainty; 2) a more expeditious and cost-effective approach that reduces the effort required in model calibration; 3) improvements in forecast lead time and accuracy; 4) an approach for rapid adjustment of model parameters to account for changes in the watershed, both rapid, as results from forest fires or levee breaches, and slow, as results from watershed deforestation, reforestation, or urban development; 5) an expanded suite of products, including soil moisture and temperature forecasts, and water quality constituents; and 6) a comprehensive verification system to assess the effectiveness of the other five goals. To this end, the research plan places an emphasis on research into models with parameters that can be derived from physical watershed characteristics. Purely physically based models may be unattainable or impractical, and, therefore, models resulting from a combination of physically and conceptually based processes may be required. With respect to the hydrometeorological forcings, the research plan emphasizes the development of improved precipitation estimation techniques through the synthesis of radar, rain gauge, satellite, and numerical weather prediction model output, particularly in those areas where ground-based sensors are inadequate to detect spatial variability in precipitation. Better estimation and forecasting of precipitation are most likely to be achieved by statistical merging of remote-sensor observations and forecasts from high-resolution numerical prediction models. Enhancements to the satellite-based precipitation products will include use of TRMM precipitation data in preparation for information to be supplied by the Global Precipitation Mission satellites not yet deployed.
Because of a growing need for services in water resources, including low-flow forecasts for water supply customers, we will be directing research into coupled surface-groundwater models that will eventually replace the groundwater component of the existing models and will be part of the new generation of models. Finally, the research plan covers the directions of research for probabilistic forecasting using ensembles, data assimilation, and the verification and validation of both deterministic and probabilistic forecasts.
Obesity and severe obesity forecasts through 2030.
Finkelstein, Eric A; Khavjou, Olga A; Thompson, Hope; Trogdon, Justin G; Pan, Liping; Sherry, Bettylou; Dietz, William
2012-06-01
Previous efforts to forecast future trends in obesity applied linear forecasts assuming that the rise in obesity would continue unabated. However, evidence suggests that obesity prevalence may be leveling off. This study presents estimates of adult obesity and severe obesity prevalence through 2030 based on nonlinear regression models. The forecasted results are then used to simulate the savings that could be achieved through modestly successful obesity prevention efforts. The study was conducted in 2009-2010 and used data from the 1990 through 2008 Behavioral Risk Factor Surveillance System (BRFSS). The analysis sample included nonpregnant adults aged ≥ 18 years. The individual-level BRFSS variables were supplemented with state-level variables from the U.S. Bureau of Labor Statistics, the American Chamber of Commerce Research Association, and the Census of Retail Trade. Future obesity and severe obesity prevalence were estimated through regression modeling by projecting trends in explanatory variables expected to influence obesity prevalence. Linear time trend forecasts suggest that by 2030, 51% of the population will be obese. The model estimates a much lower obesity prevalence of 42% and severe obesity prevalence of 11%. If obesity were to remain at 2010 levels, the combined savings in medical expenditures over the next 2 decades would be $549.5 billion. The study estimates a 33% increase in obesity prevalence and a 130% increase in severe obesity prevalence over the next 2 decades. If these forecasts prove accurate, this will further hinder efforts for healthcare cost containment.
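The methodological contrast (an unbounded linear trend versus a leveling-off nonlinear fit) can be illustrated with synthetic prevalence data, as below; the study's actual models project explanatory variables rather than fitting a curve to time alone. scipy is assumed available, and all numbers are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch: a linear trend extrapolates without bound, while a saturating
# (logistic) trend levels off. Prevalence values are synthetic.

years = np.arange(1990, 2009)
prev = 12 + 30 / (1 + np.exp(-0.15 * (years - 2005)))   # leveling-off shape
prev += np.random.default_rng(7).normal(0, 0.4, len(years))

lin = np.polyfit(years, prev, 1)
linear_2030 = np.polyval(lin, 2030)

def logistic(t, lo, hi, k, t0):
    return lo + (hi - lo) / (1 + np.exp(-k * (t - t0)))

p, _ = curve_fit(logistic, years, prev, p0=[10, 50, 0.1, 2005], maxfev=10000)
print(f"linear extrapolation to 2030:   {linear_2030:.1f}%")
print(f"logistic extrapolation to 2030: {logistic(2030, *p):.1f}%  (levels off)")
```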
Shipping emission forecasts and cost-benefit analysis of China ports and key regions' control.
Liu, Huan; Meng, Zhi-Hang; Shang, Yi; Lv, Zhao-Feng; Jin, Xin-Xin; Fu, Ming-Liang; He, Ke-Bin
2018-05-01
China has established Domestic Emission Control Areas (DECAs) for sulphur since 2015 to constrain increasing shipping emissions. However, future DECA policy-making is not well supported, due to a lack of quantitative evaluations. To investigate the effects of current and possible Chinese DECA policies, a model is presented for the forecast of shipping emissions and the evaluation of potential costs and benefits of a DECA policy package set in 2020. It includes a port-level and regional-level projection accounting for shipping trade volume growth, share of ship types, and fuel consumption. The results show that without control measures, both SO2 and particulate matter (PM) emissions are expected to increase by 15.3-61.2% in Jing-Jin-Ji, the Yangtze River Delta, and the Pearl River Delta from 2013 to 2020. However, most emissions can be reduced annually by the establishment of a DECA, depending on the size of the control area and the fuel sulphur content limit. Costs range from 0.667 to 1.561 billion dollars (to control regional shipping emissions) based on current fuel prices. A social cost method shows that the regional control scenarios' benefit-cost ratios vary from 4.3 to 5.1, with large uncertainty. A chemical transport model combined with a health model is used to obtain the monetary health benefits, which are then compared with the results from the social cost method. This study suggests that Chinese DECAs will reduce the projected emissions at a favorable benefit-cost ratio, and furthermore proposes policy combinations that provide highly cost-effective benefits as a reference for future policy-making.
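The projection-and-evaluation logic reduces to simple arithmetic, sketched below. The baseline emissions, growth factor, sulphur contents, social cost per tonne, and fuel premium are all hypothetical numbers, not the paper's inputs.

```python
# Sketch of the port-level projection: scale a 2013 baseline by activity
# growth, apply a fuel-sulphur cap, and form a benefit-cost ratio from a
# social-cost-per-tonne figure.

base_so2_t = 100_000.0           # 2013 SO2 emissions, tonnes (hypothetical region)
growth = 1.40                    # cumulative activity growth 2013 -> 2020
sulphur_cut = 1 - 0.1 / 2.7      # switching 2.7% S fuel to 0.1% S fuel

bau_2020 = base_so2_t * growth                   # business-as-usual emissions
controlled = bau_2020 * (1 - sulphur_cut)        # emissions under the DECA cap
avoided = bau_2020 - controlled

social_cost_per_t = 6_000.0      # USD per tonne of SO2 (hypothetical)
fuel_premium_cost = 0.2e9        # USD/yr extra for low-sulphur fuel (hypothetical)
print(f"avoided SO2: {avoided:,.0f} t; benefit-cost ratio = "
      f"{avoided * social_cost_per_t / fuel_premium_cost:.1f}")
```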
Calibration of Ocean Forcing with satellite Flux Estimates (COFFEE)
NASA Astrophysics Data System (ADS)
Barron, Charlie; Jan, Dastugue; Jackie, May; Rowley, Clark; Smith, Scott; Spence, Peter; Gremes-Cordero, Silvia
2016-04-01
Predicting the evolution of ocean temperature in regional ocean models depends on estimates of surface heat fluxes and upper-ocean processes over the forecast period. Within the COFFEE project (Calibration of Ocean Forcing with satellite Flux Estimates), real-time satellite observations are used to estimate shortwave, longwave, sensible, and latent air-sea heat flux corrections to a background estimate from the prior day's regional or global model forecast. These satellite-corrected fluxes are used to prepare a corrected ocean hindcast and to estimate flux error covariances to project the heat flux corrections for a 3-5 day forecast. In this way, satellite remote sensing is applied not only to inform the initial ocean state but also to mitigate errors in surface heat flux and model representations affecting the distribution of heat in the upper ocean. While traditional assimilation of sea surface temperature (SST) observations re-centers ocean models at the start of each forecast cycle, COFFEE endeavors to appropriately partition and reduce forecast error among the various surface heat flux and ocean dynamics sources. A suite of experiments in the southern California Current demonstrates a range of COFFEE capabilities, showing the impact on forecast error relative to a baseline three-dimensional variational (3DVAR) assimilation using operational global or regional atmospheric forcing. The experiment cases combine different levels of flux calibration with assimilation alternatives: they use the original fluxes, apply full satellite corrections during the forecast period, or extend hindcast corrections into the forecast period. Assimilation is either baseline 3DVAR or standard strong-constraint 4DVAR, with work proceeding to add a 4DVAR expanded to include a weak-constraint treatment of the surface flux errors. The covariance of flux errors is estimated from the recent time series of forecast and calibrated flux terms. While California Current examples are shown, the approach is equally applicable to other regions. These approaches within a 3DVAR application are anticipated to be useful for global and larger regional domains where a full 4DVAR methodology may be cost-prohibitive.
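A heavily simplified sketch of the correction idea follows: a hindcast flux error estimated against satellite data is persisted into the forecast days with an assumed decay factor. The decay factor is my invention for illustration; COFFEE projects corrections using estimated flux error covariances rather than a fixed decay.

```python
import numpy as np

# Sketch: estimate a flux correction from satellite data over the hindcast
# period, then carry it (damped) into the 3-5 day forecast.

rng = np.random.default_rng(9)
model_flux = rng.normal(200.0, 30.0, (10, 10))              # W/m^2, background forecast
satellite_flux = model_flux + rng.normal(15.0, 5.0, (10, 10))  # satellite-based estimate

correction = satellite_flux - model_flux          # hindcast flux correction field
decay = 0.8                                       # assumed per-day persistence of the error
forecast_flux = {d: model_flux + correction * decay**d for d in range(1, 4)}
print({d: float(np.round((forecast_flux[d] - model_flux).mean(), 1))
       for d in forecast_flux})                   # mean applied correction per day
```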
A three-stage birandom program for unit commitment with wind power uncertainty.
Zhang, Na; Li, Weidong; Liu, Rao; Lv, Quan; Sun, Liang
2014-01-01
The integration of large-scale wind power adds significant uncertainty to power system planning and operation. The wind forecast error decreases as the forecast horizon shortens, particularly from one day to several hours ahead. Integrating an intraday unit commitment (UC) adjustment process based on updated ultra-short-term wind forecast information is one way to improve the dispatching results. A novel three-stage UC decision method is presented, in which the day-ahead UC decisions are determined in the first stage, the intraday UC adjustment decisions of sub-fast-start units are determined in the second stage, and the UC decisions of fast-start units and dispatching decisions are determined in the third stage. Accordingly, a three-stage birandom UC model is presented, in which the intraday hours-ahead forecasted wind power is formulated as a birandom variable and the intraday UC adjustment event is formulated as a birandom event. An equilibrium chance constraint is employed to ensure the reliability requirement. A birandom-simulation-based hybrid genetic algorithm is designed to solve the proposed model. Computational results indicate that the proposed model provides UC decisions with lower expected total costs.
Christensen, Torsten Lundgaard; Poulsen, Peter Bo; Holmstrom, Stefan; Walt, John G; Vetrugno, Michele
2005-11-01
Glaucoma is generally managed by decreasing the intraocular pressure (IOP) to a level believed to prevent further damage to the optic disc and loss of visual field. This may be achieved medically or surgically. The objective of this pharmacoeconomic analysis was to investigate the 4-year costs of bimatoprost 0.03% (Lumigan) eye drops as an alternative to filtration surgery (FS) for glaucoma patients on maximum tolerable medical therapy (MTMT). A Markov model was designed using effectiveness and resource use data from a randomized clinical trial (RCT) and expert statements (Delphi panel). The RCT covered 83 patients on MTMT. The model compared bimatoprost with FS. In the bimatoprost arm, patients began treatment with bimatoprost; if the target IOP (-20%) was not reached using medical therapy, the patient proceeded to FS. In the FS arm, FS was performed after the first ophthalmologist visit. Unit costs were obtained from an Italian chart and tariffs review (healthcare sector perspective). The RCT showed that 74.7% of the patients delayed the need for FS by 3 months. The Markov model forecasted that 64.2% of the patients could delay the need for FS by 1 year, and that 34.0% could avoid FS after 4 years. The 4-year cost per patient in the bimatoprost and FS arms was €3438 and €4194, respectively (incremental cost of €755). The major cost drivers in the bimatoprost arm were patients who needed combination therapy or FS if the target IOP was not reached. In the FS arm, the major cost drivers were the initial surgery costs and pressure-lowering medications used as add-on therapy after FS. The analysis shows that over a 4-year horizon bimatoprost is cheaper than FS. In addition, the postponement of FS associated with bimatoprost may have important implications for waiting-list planning.
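A Markov cohort calculation of the kind described can be sketched compactly. All transition probabilities and unit costs below are invented, not the trial's or the Delphi panel's values; a real model would also discount costs and track more states.

```python
# Sketch of a two-state Markov cohort cost model: patients on medical therapy
# may, each cycle, stay controlled or proceed to filtration surgery (FS).

cycles = 16                         # quarterly cycles over 4 years
p_fail = 0.035                      # per-cycle probability of needing FS
cost_med, cost_fs, cost_post = 120.0, 1800.0, 60.0   # per-cycle and one-off costs

on_med, post_fs, total = 1.0, 0.0, 0.0
for _ in range(cycles):
    to_fs = on_med * p_fail                          # cohort fraction moving to FS
    total += on_med * cost_med + to_fs * cost_fs + post_fs * cost_post
    on_med, post_fs = on_med - to_fs, post_fs + to_fs

print(f"medical-therapy arm: {on_med:.1%} still avoiding FS after 4 years, "
      f"expected cost per patient = {total:.0f}")
```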
NASA Astrophysics Data System (ADS)
Li, Hui; Hong, Lu-Yao; Zhou, Qing; Yu, Hai-Jie
2015-08-01
The business failure of numerous companies results in financial crises. The high social costs associated with such crises have led people to search for effective tools for business risk prediction, among which the support vector machine is very effective. Several modelling means, including single-technique modelling, hybrid modelling, and ensemble modelling, have been suggested for forecasting business risk with support vector machines. However, the existing literature seldom focuses on a general modelling frame for business risk prediction, and seldom investigates performance differences among different modelling means. We reviewed research on forecasting business risk with support vector machines, proposed the general assisted prediction modelling frame with hybridisation and ensemble (APMF-WHAE), and finally investigated the use of principal components analysis, support vector machines, random sampling, and group decision under the general frame in forecasting business risk. Under the APMF-WHAE frame with a support vector machine as the base predictive model, four specific predictive models were produced: a pure support vector machine; a hybrid support vector machine involving principal components analysis; a support vector machine ensemble involving random sampling and group decision; and an ensemble of hybrid support vector machines using group decision to integrate various hybrid support vector machines built on variables produced from principal components analysis and samples from random sampling. The experimental results indicate that the hybrid support vector machine and the ensemble of hybrid support vector machines produced dominating performance over the pure support vector machine and the support vector machine ensemble.
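Two of the four models map naturally onto a short scikit-learn sketch (assumed available): a hybrid SVM as a PCA-then-SVC pipeline, and an ensemble of such hybrids via bagging with majority vote. Data are synthetic, and the paper's feature sets and group-decision rule differ in detail.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Sketch: hybrid SVM (PCA feature extraction before the SVM) and an ensemble
# of such hybrids over random subsamples, combined by majority vote.

X, y = make_classification(n_samples=600, n_features=30, n_informative=8,
                           random_state=0)

hybrid = make_pipeline(PCA(n_components=8), SVC())           # hybrid SVM
ensemble = BaggingClassifier(hybrid, n_estimators=25,        # ensemble of hybrids
                             max_samples=0.7, random_state=0)

for name, model in [("hybrid SVM", hybrid), ("ensemble of hybrids", ensemble)]:
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:20s} cross-validated accuracy = {score:.3f}")
```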
Army College Fund Cost-Effectiveness Study
1990-11-01
Section A.2 presents a theory of enlistment supply to provide a basis for specifying the regression model. The model is specified in Section A.3, which... Supplementary materials are included in the final four sections. Section A.6 provides annual trends in the regression model variables. Estimates of the model... millions. A.5. ESTIMATION OF A YOUTH EARNINGS FORECASTING MODEL. Civilian pay is an important explanatory variable in the regression model. Previous...
NASA Technical Reports Server (NTRS)
1977-01-01
A demonstration experiment is being planned to show that frost and freeze prediction improvements are possible utilizing timely Synchronous Meteorological Satellite temperature measurements, and that this information can affect Florida citrus grower operations and decisions so as to significantly reduce the cost of frost and freeze protection and crop losses. The design and implementation of the first phase of an economic experiment, which will monitor citrus growers' decisions, actions, costs and losses, as well as meteorological forecasts and actual weather events, was carried out. The economic experiment was designed to measure the change in annual protection costs and crop losses which are the direct result of improved temperature forecasts. To estimate the benefits that may result from improved temperature forecasting capability, control and test groups were established, with effective separation being accomplished temporally. The control group, utilizing current forecasting capability, was observed during the 1976-77 frost season, and the results are reported. A brief overview is given of the economic experiment, the results obtained to date, and the work which still remains to be done.
Building the Sun4Cast System: Improvements in Solar Power Forecasting
Haupt, Sue Ellen; Kosovic, Branko; Jensen, Tara; ...
2017-06-16
The Sun4Cast system results from a research-to-operations project built on a value chain approach, benefiting electric utilities' customers, society, and the environment by improving state-of-the-science solar power forecasting capabilities. As integration of solar power into the national electric grid rapidly increases, it becomes imperative to improve forecasting of this highly variable renewable resource. Thus, a team of researchers from public, private, and academic sectors partnered to develop and assess a new solar power forecasting system, Sun4Cast. The partnership focused on improving decision-making for utilities and independent system operators, ultimately resulting in improved grid stability and cost savings for consumers. The project followed a value chain approach to determine key research and technology needs to reach the desired results. Sun4Cast integrates various forecasting technologies across a spectrum of temporal and spatial scales to predict surface solar irradiance. Anchoring the system is WRF-Solar, a version of the Weather Research and Forecasting (WRF) numerical weather prediction (NWP) model optimized for solar irradiance prediction. Forecasts from multiple NWP models are blended via the Dynamic Integrated Forecast (DICast) System, the basis of the system beyond about 6 h. For short-range (0-6 h) forecasts, Sun4Cast leverages several observation-based nowcasting technologies, blended via the Nowcasting Expert System Integrator (NESI). The NESI and DICast systems are subsequently blended to produce short- to mid-term irradiance forecasts for solar array locations. The irradiance forecasts are translated into power, with uncertainties quantified using an analog ensemble approach, and are provided to the industry partners for real-time decision-making. The Sun4Cast system ran operationally throughout 2015 and the results were assessed. This paper analyzes the collaborative design process, discusses the project results, and provides recommendations for best-practice solar forecasting.
NASA Technical Reports Server (NTRS)
Rosenberg, Leigh; Hihn, Jairus; Roust, Kevin; Warfield, Keith
2000-01-01
This paper presents an overview of a parametric cost model built at JPL to estimate the costs of future deep space robotic science missions. Due to the recent dramatic changes in JPL business practices brought about by an internal reengineering effort known as Develop New Products (DNP), high-level historic cost data is no longer considered analogous to future missions. Therefore, the historic data is of little value in forecasting costs for projects developed using the DNP process. This has led to the development of an approach for obtaining expert opinion, and for combining actual data with expert opinion, to provide a cost database for future missions. In addition, the DNP cost model uses a maximum of objective cost drivers, which reduces the likelihood of model input error. Version 2 is now under development; it expands the model capabilities, links the model more tightly with key design technical parameters, and is grounded in more rigorous statistical techniques. The challenges faced in building this model are discussed, as well as its background, development approach, status, validation, and future plans.
Quantifying the Economic and Grid Reliability Impacts of Improved Wind Power Forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qin; Martinez-Anido, Carlo Brancucci; Wu, Hongyu
Wind power forecasting is an important tool in power system operations to address variability and uncertainty. Doing so accurately is important for reducing the occurrence and length of curtailment, enhancing market efficiency, and improving the operational reliability of the bulk power system. This research quantifies the value of wind power forecasting improvements in the IEEE 118-bus test system, as modified to emulate the generation mixes of the Midcontinent, California, and New England independent system operator balancing authority areas. To measure the economic value, a commercially available production cost modeling tool was used to simulate the multi-timescale unit commitment (UC) and economic dispatch process for calculating the cost savings and curtailment reductions. To measure the reliability improvements, an in-house tool, FESTIV, was used to calculate the system's area control error and the North American Electric Reliability Corporation Control Performance Standard 2. The approach allowed scientific reproducibility of results and cross-validation of the tools. A total of 270 scenarios were evaluated to accommodate the variation of three factors: generation mix, wind penetration level, and wind forecasting improvement. The modified IEEE 118-bus systems utilized 1 year of data at multiple timescales, including day-ahead UC, 4-hour-ahead UC, and 5-min real-time dispatch. The value of improved wind power forecasting was found to be strongly tied to the conventional generation mix, the existence of energy storage devices, and the penetration level of wind energy. The simulation results demonstrate that wind power forecasting brings clear benefits to power system operations.
NASA Astrophysics Data System (ADS)
Stephenson, S. R.; Babiker, M.; Sandven, S.; Muckenhuber, S.; Korosov, A.; Bobylev, L.; Vesman, A.; Mushta, A.; Demchev, D.; Volkov, V.; Smirnov, K.; Hamre, T.
2015-12-01
Sea ice monitoring and forecasting systems are important tools for minimizing accident risk and environmental impacts of Arctic maritime operations. Satellite data such as synthetic aperture radar (SAR), combined with atmosphere-ice-ocean forecasting models, navigation models and automatic identification system (AIS) transponder data from ships are essential components of such systems. Here we present first results from the SONARC project (project term: 2015-2017), an international multidisciplinary effort to develop novel and complementary ice monitoring and forecasting systems for vessels and offshore platforms in the Arctic. Automated classification methods (Zakhvatkina et al., 2012) are applied to Sentinel-1 dual-polarization SAR images from the Barents and Kara Sea region to identify ice types (e.g. multi-year ice, level first-year ice, deformed first-year ice, new/young ice, open water) and ridges. Short-term (1-3 days) ice drift forecasts are computed from SAR images using feature tracking and pattern tracking methods (Berg & Eriksson, 2014). Ice classification and drift forecast products are combined with ship positions based on AIS data from a selected period of 3-4 weeks to determine optimal vessel speed and routing in ice. Results illustrate the potential of high-resolution SAR data for near-real-time monitoring and forecasting of Arctic ice conditions. Over the next 3 years, SONARC findings will contribute new knowledge about sea ice in the Arctic while promoting safe and cost-effective shipping, domain awareness, resource management, and environmental protection.
NASA Astrophysics Data System (ADS)
Henley, E. M.; Pope, E. C. D.
2017-12-01
This commentary concerns recent work on solar wind forecasting by Owens and Riley (2017). The approach taken makes effective use of tools commonly used in terrestrial weather: notably, the generation of an "ensemble" forecast via a simple model, and the application of a "cost-loss" analysis to the resulting probabilistic information to explore the benefit of the forecast to users with different risk appetites. This commentary aims to highlight these useful techniques to the wider space weather audience and to briefly discuss the general context of applying terrestrial weather approaches to space weather.
The IEA/ORAU Long-Term Global Energy- CO2 Model: Personal Computer Version A84PC
Edmonds, Jae A.; Reilly, John M.; Boden, Thomas A. [CDIAC; Reynolds, S. E. [CDIAC; Barns, D. W.
1995-01-01
The IBM A84PC version of the Edmonds-Reilly model has the capability to calculate both CO2 and CH4 emission estimates by source and region. Population, labor productivity, end-use energy efficiency, income effects, price effects, resource base, technological change in energy production, environmental costs of energy production, market-penetration rate of energy-supply technology, solar and biomass energy costs, synfuel costs, and the number of forecast periods may be interactively inspected and altered, producing a variety of global and regional CO2 and CH4 emission scenarios for 1975 through 2100. Users are strongly encouraged to see our instructions for downloading, installing, and running the model.
Forecast of future aviation fuels: The model
NASA Technical Reports Server (NTRS)
Ayati, M. B.; Liu, C. Y.; English, J. M.
1981-01-01
A conceptual model of the commercial air transportation industry is developed which can be used to predict trends in economics, demand, and consumption. The methodology is based on digraph theory, which considers the interaction of variables and the propagation of changes. Air transportation economics are treated by examining major variables, their relationships, and historic trends, and by calculating regression coefficients. A description of the modeling technique and a compilation of historic airline industry statistics used to determine interaction coefficients are included. Results of model validation show negligible difference between actual and projected values over the 1959 to 1976 period. A limited application of the method presents forecasts of air transportation industry demand, growth, revenue, costs, and fuel consumption to 2020 for two scenarios of future economic growth and energy consumption.
Projecting technology change to improve space technology planning and systems management
NASA Astrophysics Data System (ADS)
Walk, Steven Robert
2011-04-01
Projecting technology performance evolution has been improving over the years. Reliable quantitative forecasting methods have been developed that project the growth, diffusion, and performance of technology in time, including projecting technology substitutions, saturation levels, and performance improvements. These forecasts can be applied at the early stages of space technology planning to better predict available future technology performance, assure the successful selection of technology, and improve technology systems management strategy. Often what is published as a technology forecast is simply scenario planning, usually made by extrapolating current trends into the future, with perhaps some subjective insight added. Typically, the accuracy of such predictions falls rapidly with distance in time. Quantitative technology forecasting (QTF), on the other hand, includes the study of historic data to identify one of or a combination of several recognized universal technology diffusion or substitution patterns. In the same manner that quantitative models of physical phenomena provide excellent predictions of system behavior, so do QTF models provide reliable technological performance trajectories. In practice, a quantitative technology forecast is completed to ascertain with confidence when the projected performance of a technology or system of technologies will occur. Such projections provide reliable time-referenced information when considering cost and performance trade-offs in maintaining, replacing, or migrating a technology, component, or system. This paper introduces various quantitative technology forecasting techniques and illustrates their practical application in space technology and technology systems management.
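To make the QTF idea concrete, here is a minimal sketch (not from the paper) of fitting a Pearl logistic growth curve to historic performance data and extrapolating it forward; the data values and starting guesses are invented for illustration.

```python
# Minimal sketch: fitting a logistic (Pearl) growth curve to historic
# performance data, the core operation in quantitative technology forecasting.
# The data values here are illustrative, not from the paper.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, k, t0):
    """Pearl curve: L = saturation level, k = growth rate, t0 = midpoint."""
    return L / (1.0 + np.exp(-k * (t - t0)))

years = np.arange(2000, 2011, dtype=float)
perf = np.array([3, 4, 6, 9, 14, 20, 28, 36, 43, 48, 52], dtype=float)

# Fit the three parameters; p0 gives rough starting guesses.
(L, k, t0), _ = curve_fit(logistic, years, perf, p0=[60.0, 0.5, 2006.0])

# Project the performance trajectory ten years ahead.
future = np.arange(2011, 2021, dtype=float)
print(dict(zip(future, np.round(logistic(future, L, k, t0), 1))))
```

The fitted saturation level L is what flags an approaching performance ceiling, the cue for planning a technology substitution.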
Environmentally Related Diseases and the Possibility of Valuation of Their Social Costs
Hajok, Ilona; Marchwińska, Ewa; Dziubanek, Grzegorz; Kuraszewska, Bernadeta; Piekut, Agata
2014-01-01
The risk of morbidity from asbestos-related lung cancer was estimated for the general population of Poles as a result of increased exposure to asbestos fibers during the removal of asbestos-cement products, along with the possibility of valuing the social costs related to this risk. The prediction of new incidences was made using a linear regression model. The forecast shows that by the end of 2030 about 3,500 new cases of lung cancer can be expected as a result of past occupational exposure to asbestos which, together with para-occupational exposure, amounts to about 14,000 new cases. The forecast shows an increasing number of asbestos-related lung cancers in Poland and indicates the priority areas where preventive action should be implemented. PMID:25374934
48 CFR 232.072-3 - Cash flow forecasts.
Code of Federal Regulations, 2013 CFR
2013-10-01
... forecasts is a strong indicator of serious managerial deficiencies or potential contract cost or performance... the causes of any differences. (d) Cash flow forecasts must— (1) Show the origin and use of all...
48 CFR 232.072-3 - Cash flow forecasts.
Code of Federal Regulations, 2011 CFR
2011-10-01
... forecasts is a strong indicator of serious managerial deficiencies or potential contract cost or performance... the causes of any differences. (d) Cash flow forecasts must— (1) Show the origin and use of all...
48 CFR 232.072-3 - Cash flow forecasts.
Code of Federal Regulations, 2014 CFR
2014-10-01
... forecasts is a strong indicator of serious managerial deficiencies or potential contract cost or performance... the causes of any differences. (d) Cash flow forecasts must— (1) Show the origin and use of all...
48 CFR 232.072-3 - Cash flow forecasts.
Code of Federal Regulations, 2012 CFR
2012-10-01
... forecasts is a strong indicator of serious managerial deficiencies or potential contract cost or performance... the causes of any differences. (d) Cash flow forecasts must— (1) Show the origin and use of all...
DOT National Transportation Integrated Search
2013-08-01
"Over the last 50 years, advances in the fields of travel behavior research and travel demand forecasting have been : immense, driven by the increasing costs of infrastructure and spatial limitations in areas of high population density : together wit...
Evolutionary Development of the Simulation by Logical Modeling System (SIBYL)
NASA Technical Reports Server (NTRS)
Wu, Helen
1995-01-01
Through the evolutionary development of the Simulation by Logical Modeling System (SIBYL), we have re-engineered the expensive and complex IBM mainframe-based Long-term Hardware Projection Model (LHPM) into a robust, cost-effective computer-based model that is easy to use. We achieved significant cost reductions and improved productivity in preparing long-term forecasts of Space Shuttle Main Engine (SSME) hardware. The LHPM for the SSME is a stochastic simulation model that projects the hardware requirements over 10 years. SIBYL is now the primary modeling tool for developing SSME logistics proposals and the Program Operating Plan (POP) for NASA, and for divisional marketing studies.
Jin, Haomiao; Wu, Shinyi; Di Capua, Paul
2015-09-03
Depression is a common but often undiagnosed comorbid condition of people with diabetes. Mass screening can detect undiagnosed depression but may require significant resources and time. The objectives of this study were 1) to develop a clinical forecasting model that predicts comorbid depression among patients with diabetes and 2) to evaluate a model-based screening policy that saves resources and time by screening only patients considered as depressed by the clinical forecasting model. We trained and validated 4 machine learning models by using data from 2 safety-net clinical trials; we chose the one with the best overall predictive ability as the ultimate model. We compared model-based policy with alternative policies, including mass screening and partial screening, on the basis of depression history or diabetes severity. Logistic regression had the best overall predictive ability of the 4 models evaluated and was chosen as the ultimate forecasting model. Compared with mass screening, the model-based policy can save approximately 50% to 60% of provider resources and time but will miss identifying about 30% of patients with depression. Partial-screening policy based on depression history alone found only a low rate of depression. Two other heuristic-based partial screening policies identified depression at rates similar to those of the model-based policy but cost more in resources and time. The depression prediction model developed in this study has compelling predictive ability. By adopting the model-based depression screening policy, health care providers can use their resources and time better and increase their efficiency in managing their patients with depression.
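As an illustration of the model-based screening policy described above, the following sketch trains a logistic-regression classifier on synthetic data and reports the resource use and missed cases when screening only model-flagged patients; the features and data are hypothetical stand-ins, not the study's dataset.

```python
# Sketch of the model-based screening idea: train a logistic-regression
# classifier, then screen only patients the model flags as likely depressed.
# Feature names and the synthetic data are hypothetical stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))            # e.g. age, HbA1c, comorbidities...
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 1.2).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)

flag = clf.predict(X_te)                  # model-based screening policy
screened = flag.mean()                    # fraction of provider time spent
missed = ((flag == 0) & (y_te == 1)).sum() / max(y_te.sum(), 1)
print(f"screen {screened:.0%} of patients, miss {missed:.0%} of true cases")
```

The trade-off printed at the end mirrors the paper's finding: a large cut in screening effort at the cost of some undetected cases.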
Does NASA SMAP Improve the Accuracy of Power Outage Models?
NASA Astrophysics Data System (ADS)
Quiring, S. M.; McRoberts, D. B.; Toy, B.; Alvarado, B.
2016-12-01
Electric power utilities make critical decisions in the days prior to hurricane landfall that are primarily based on the estimated impact to their service area. For example, utilities must determine how many repair crews to request from other utilities, the amount of material and equipment they will need to make repairs, and where in their geographically expansive service area to station crews and materials. Accurate forecasts of the impact of an approaching hurricane within their service area are critical for utilities in balancing the costs and benefits of different levels of resources. The Hurricane Outage Prediction Model (HOPM) is a family of statistical models that utilize predictions of tropical cyclone wind speed and duration of strong winds, along with power system and environmental variables (e.g., soil moisture, long-term precipitation), to forecast the number and location of power outages. This project assesses whether using NASA SMAP soil moisture improves the accuracy of power outage forecasts as compared to using model-derived soil moisture from NLDAS-2. A sensitivity analysis is employed since there have been very few tropical cyclones making landfall in the United States since SMAP was launched. The HOPM is used to predict power outages for 13 historical tropical cyclones, and the model is run twice, once with NLDAS soil moisture and once with SMAP soil moisture. Our results demonstrate that using SMAP soil moisture can have a significant impact on power outage predictions. SMAP has the potential to enhance the accuracy of power outage forecasts. Improved outage forecasts reduce the duration of power outages, which reduces economic losses and accelerates recovery.
David B. South; Curtis L. VanderSchaaf; Larry D. Teeter
2006-01-01
Some researchers claim that continuously increasing intensive plantation management will increase profits and reduce the unit cost of wood production while others believe in the law of diminishing returns. We developed four hypothetical production models where yield is a function of silvicultural effort. Models that produced unrealistic results were (1) an exponential...
A medical cost estimation with fuzzy neural network of acute hepatitis patients in emergency room.
Kuo, R J; Cheng, W C; Lien, W C; Yang, T J
2015-10-01
Taiwan is an area where chronic hepatitis is endemic. Liver cancer is so common that it has ranked first among cancer mortality rates in Taiwan since the early 1980s. In addition, liver cirrhosis and chronic liver diseases rank sixth or seventh among causes of death. As the active research on hepatitis shows, the disease is not only a health threat but also a huge medical cost for the government. The estimated total number of hepatitis B carriers in the general population aged more than 20 years is 3,067,307. A case record review was therefore conducted of all patients with a diagnosis of acute hepatitis admitted to the Emergency Department (ED) of a well-known teaching hospital in Taipei. The cost of medical resource utilization is defined as the total medical fee. In this study, a fuzzy neural network (FNN) is employed to develop the cost forecasting model. A total of 110 patients met the inclusion criteria. The computational results indicate that the FNN model provides more accurate forecasts than support vector regression (SVR) or an artificial neural network (ANN). In addition, unlike SVR and ANN, the FNN can also provide fuzzy IF-THEN rules for interpretation.
Real-time monitoring and short-term forecasting of drought in Norway
NASA Astrophysics Data System (ADS)
Kwok Wong, Wai; Hisdal, Hege
2013-04-01
Drought is considered to be one of the most costly natural disasters. Drought monitoring and forecasting are thus important for sound water management. In this study hydrological drought characteristics applicable for real-time monitoring and short-term forecasting of drought in Norway were developed. A spatially distributed hydrological model (HBV) implemented in a Web-based GIS framework provides a platform for drought analyses and visualizations. A number of national drought maps can be produced, which is a simple and effective way to communicate drought conditions to decision makers and the public. The HBV model is driven by precipitation and air temperature data. On a daily time step it calculates the water balance for 1 x 1 km2 grid cells characterized by their elevation and land use. Drought duration and areal drought coverage for runoff and subsurface storage (sum of soil moisture and groundwater) were derived. The threshold level method was used to specify drought conditions on a grid cell basis. The daily 10th percentile thresholds were derived from seven-day windows centered on each calendar day over the reference period 1981-2010 (threshold not exceeded 10% of the time). Each individual grid cell was examined to determine if it was below its respective threshold level. Daily drought-stricken areas can then be easily identified when visualized on a map. The drought duration can also be tracked and calculated by a retrospective analysis. Real-time observations from synoptic stations interpolated to a regular grid of 1 km resolution constituted the forcing data for the current situation. Nine-day meteorological forecasts were used as input to the HBV model to obtain short-term hydrological drought forecasts. Downscaled precipitation and temperature fields from two different atmospheric models were applied. The first two days of the forecast period adopted the forecasts from the Unified Model (UM4), while the following seven days were based on the 9-day forecasts from ECMWF. The approach has been tested and is now available on the Web for operational water management.
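A minimal sketch of the threshold level method described above, with synthetic runoff standing in for gridded HBV output: for each calendar day the drought threshold is the 10th percentile of reference-period values pooled from a seven-day window centered on that day.

```python
# Sketch of the threshold level method: for each calendar day, the drought
# threshold is the 10th percentile of reference-period runoff pooled from a
# seven-day window centred on that day. Synthetic data; a real application
# would use gridded HBV output for 1981-2010.
import numpy as np

n_years, n_days = 30, 365
runoff = np.random.default_rng(1).gamma(2.0, 1.0, size=(n_years, n_days))

thresholds = np.empty(n_days)
for d in range(n_days):
    window = [(d + o) % n_days for o in range(-3, 4)]   # 7-day window, wrapped
    thresholds[d] = np.percentile(runoff[:, window], 10)

today = runoff[-1]                        # e.g. the current year's simulation
in_drought = today < thresholds           # grid cell flagged when below Q10
print(f"{in_drought.mean():.0%} of days below the daily Q10 threshold")
```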
NASA Astrophysics Data System (ADS)
Williams, K. A.; Partridge, E. C., III
1984-09-01
Originally envisioned as a means to integrate the many systems found throughout the government, the general mission of the NCS continues to be to ensure the survivability of communications during and subsequent to any national emergency. In order to accomplish this mission, the NCS is an arrangement of heterogeneous telecommunications systems which are provided by their sponsor Federal agencies. The physical components of Federal telecommunications systems and networks include: telephone and digital data switching facilities and primary common-user communications centers; special-purpose local delivery message switching and exchange facilities; government-owned or leased radio systems; and technical control facilities which are under the exclusive control of a government agency. This thesis describes the logical design of a proposed decision support system for use by the National Communications System in forecasting technology, prices, and costs. It is general in nature and includes only those forecasting models which are suitable for computer implementation. Because it is a logical design, it can be coded and applied in many different hardware and/or software configurations.
Sources and dynamics of turbulence in the upper troposphere and lower stratosphere: A review
NASA Astrophysics Data System (ADS)
Sharman, R. D.; Trier, S. B.; Lane, T. P.; Doyle, J. D.
2012-06-01
Turbulence is a well-known hazard to aviation that is responsible for numerous injuries each year, with occasional fatalities, and is the underlying cause of many people's fear of air travel. Not only are turbulence encounters a safety issue, they also result in millions of dollars of operational costs to airlines, leading to increased costs passed on to the consumer. For these reasons, pilots, dispatchers, and air traffic controllers attempt to avoid turbulence wherever possible. Accurate forecasting of aviation-scale turbulence has been hampered in part by a lack of understanding of the underlying dynamical processes. However, more precise observations of turbulence encounters, together with recent research into turbulence generation processes, are helping to elucidate the detailed dynamical processes involved and are laying the foundation for improved turbulence forecasting and avoidance. In this paper we briefly review some of the more important recent observational, theoretical, and modeling results related to turbulence at cruise altitudes for commercial aircraft (i.e., the upper troposphere and lower stratosphere), and their implications for aviation turbulence forecasting.
Chenar, Shima Shamkhali; Deng, Zhiqiang
2018-02-01
This paper presents an artificial intelligence-based model, called the ANN-2Day model, for forecasting, managing, and ultimately eliminating the growing risk of oyster norovirus outbreaks. The ANN-2Day model was developed using the Artificial Neural Network (ANN) Toolbox in MATLAB and 15 years of epidemiological and environmental data for six independent environmental predictors: water temperature, solar radiation, gage height, salinity, wind, and rainfall. It was found that oyster norovirus outbreaks can be forecast with a two-day lead time using the ANN-2Day model and daily data for the six environmental predictors. Forecasting results indicated that the model was capable of reproducing 19 years of historical oyster norovirus outbreaks along the Northern Gulf of Mexico coast with a positive predictive value of 76.82%, a negative predictive value of 100.00%, a sensitivity of 100.00%, a specificity of 99.84%, and an overall accuracy of 99.83%, demonstrating the efficacy of the ANN-2Day model in predicting the risk of norovirus outbreaks to human health. The two-day lead time enables public health agencies and oyster harvesters to plan management interventions and thus makes it possible to shift their daily management and operation from primarily reacting to epidemic incidents of norovirus infection after they have occurred to eliminating (or at least reducing) the risk of costly incidents.
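The skill scores quoted above follow directly from a binary forecast-observation contingency table; a minimal sketch with illustrative vectors (not the study's data) follows.

```python
# Sketch of the skill scores reported for the ANN-2Day model (PPV, NPV,
# sensitivity, specificity, accuracy) from binary outbreak predictions.
# The vectors below are illustrative, not the study's data.
import numpy as np

obs  = np.array([0, 0, 1, 1, 0, 1, 0, 0, 1, 0])   # 1 = outbreak observed
pred = np.array([0, 0, 1, 1, 0, 1, 1, 0, 1, 0])   # 1 = outbreak forecast

tp = np.sum((pred == 1) & (obs == 1))
tn = np.sum((pred == 0) & (obs == 0))
fp = np.sum((pred == 1) & (obs == 0))
fn = np.sum((pred == 0) & (obs == 1))

print("PPV        ", tp / (tp + fp))
print("NPV        ", tn / (tn + fn))
print("sensitivity", tp / (tp + fn))
print("specificity", tn / (tn + fp))
print("accuracy   ", (tp + tn) / len(obs))
```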
DOT National Transportation Integrated Search
2007-05-01
VISUM Online is a traffic management system for processing online traffic data. The system implements both a road network model and a traffic demand model. VISUM Online uses all available real-time and historic data to calculate current and forecaste...
Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas
2015-09-01
Penetration of renewable energy resources, such as wind and solar power, into power systems significantly increases the uncertainties in system operation, stability, and reliability in smart grids. In this paper, nonparametric neural network-based prediction intervals (PIs) are implemented for forecast uncertainty quantification. Instead of a single-level PI, wind power forecast uncertainties are represented as a list of PIs. These PIs are then decomposed into quantiles of wind power. A new scenario generation method is proposed to handle wind power forecast uncertainties. For each hour, an empirical cumulative distribution function (ECDF) is fitted to these quantile points. The Monte Carlo simulation method is used to generate scenarios from the ECDF. The wind power scenarios are then incorporated into a stochastic security-constrained unit commitment (SCUC) model. A heuristic genetic algorithm is utilized to solve the stochastic SCUC problem. Five deterministic and four stochastic case studies incorporating interval forecasts of wind power are implemented. The results of these cases are presented and discussed together. Generation costs and the scheduled and real-time economic dispatch reserves of different unit commitment strategies are compared. The experimental results show that the stochastic model is more robust than the deterministic ones and, thus, decreases the risk in system operations of smart grids.
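A minimal sketch of the scenario-generation step described above, with invented quantile values: an empirical CDF is interpolated through the PI-derived quantile points, and scenarios are drawn by inverse-transform sampling.

```python
# Sketch of the scenario-generation step: for one hour, prediction intervals
# give a set of wind-power quantiles; an empirical CDF is interpolated
# through them and Monte Carlo scenarios are drawn by inverse transform.
# The quantile values are illustrative.
import numpy as np

probs     = np.array([0.05, 0.25, 0.50, 0.75, 0.95])       # quantile levels
quantiles = np.array([120., 180., 230., 290., 380.])       # MW, from the PIs

def sample_scenarios(n, rng=np.random.default_rng(2)):
    # Sampling between the outermost quantile levels avoids extrapolating
    # the tails; a production version would need a tail model.
    u = rng.uniform(probs[0], probs[-1], size=n)
    return np.interp(u, probs, quantiles)    # inverse ECDF by interpolation

scenarios = sample_scenarios(1000)           # inputs to the stochastic SCUC
print(scenarios.mean(), scenarios.std())
```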
NASA Astrophysics Data System (ADS)
Thompson, R. J.; Cole, D. G.; Wilkinson, P. J.; Shea, M. A.; Smart, D.
1990-11-01
Volume 1: The following subject areas are covered: the magnetosphere environment; forecasting magnetically quiet periods; radiation hazards to humans in deep space (a summary with special reference to large solar particle events); solar proton events (review and status); problems of the physics of solar-terrestrial interactions; prediction of solar proton fluxes from x-ray signatures; rhythms in solar activity and the prediction of episodes of large flares; the role of persistence in the 24-hour flare forecast; on the relationship between the observed sunspot number and the number of solar flares; the latitudinal distribution of coronal holes and geomagnetic storms due to coronal holes; and the signatures of flares in the interplanetary medium at 1 AU. Volume 2: The following subject areas are covered: a probability forecast for geomagnetic activity; cost recovery in solar-terrestrial predictions; magnetospheric specification and forecasting models; a geomagnetic forecast and monitoring system for power system operation; some aspects of predicting magnetospheric storms; some similarities in ionospheric disturbance characteristics in equatorial, mid-latitude, and sub-auroral regions; ionospheric support for low-VHF radio transmission; a new approach to prediction of ionospheric storms; a comparison of the total electron content of the ionosphere around L=4 at low sunspot numbers with the IRI model; the French ionospheric radio propagation predictions; behavior of the F2 layer at mid-latitudes; and the design of modern ionosondes.
A cost-effectiveness comparison of existing and Landsat-aided snow water content estimation systems
NASA Technical Reports Server (NTRS)
Sharp, J. M.; Thomas, R. W.
1975-01-01
This study describes how Landsat imagery can be cost-effectively employed to augment an operational hydrologic model. Attention is directed toward the estimation of snow water content, a major predictor variable in the volumetric runoff forecasting model presently used by the California Department of Water Resources. A stratified double sampling scheme is supplemented with qualitative and quantitative analyses of existing operations to develop a comparison between the existing and satellite-aided approaches to snow water content estimation. Results show a decided advantage for the Landsat-aided approach.
Assessment of wind energy potential in Poland
NASA Astrophysics Data System (ADS)
Starosta, Katarzyna; Linkowska, Joanna; Mazur, Andrzej
2014-05-01
The aim of the presentation is to show the suitability of using numerical model wind speed forecasts for wind power industry applications in Poland. In accordance with the guidelines of the European Union, the consumption of wind energy in Poland is rapidly increasing. According to the report of the Energy Regulatory Office from 30 March 2013, the installed capacity of wind power in Poland was 2807 MW from 765 wind power stations. Wind energy is strongly dependent on meteorological conditions. Based on climatological wind speed data, potential energy zones within the area of Poland have been developed (H. Lorenc). They are the first criterion for assessing the location of a wind farm. However, for detailed monitoring of a given wind farm location, prognostic data from numerical model forecasts are necessary. For practical interpretation and further post-processing, verification of the model data is very important. The Polish Institute of Meteorology and Water Management - National Research Institute (IMWM-NRI) runs the operational COSMO model (Consortium for Small-scale Modelling, version 4.8) using two nested domains at horizontal resolutions of 7 km and 2.8 km. The model produces 36-hour and 78-hour forecasts from 00 UTC, for the 2.8 km and 7 km domain resolutions respectively. Numerical forecasts were compared with observations from 60 SYNOP and 3 TEMP stations in Poland, using VERSUS2 (Unified System Verification Survey 2) and the R package. For every zone a set of statistical indices (ME, MAE, RMSE) was calculated. Forecast errors for aerological profiles are shown for the Polish TEMP stations at Wrocław, Legionowo and Łeba. The current studies are connected with the COST Action ES1002 WIRE (Weather Intelligence for Renewable Energies).
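For reference, the three verification indices named above reduce to a few lines; a minimal sketch with placeholder arrays standing in for paired COSMO forecasts and SYNOP observations:

```python
# Sketch of the wind-speed verification statistics (ME, MAE, RMSE) computed
# per energy zone from paired model forecasts and station observations.
# Arrays are illustrative placeholders.
import numpy as np

fcst = np.array([5.2, 7.1, 3.8, 9.0, 6.4])   # model 10 m wind speed, m/s
obs  = np.array([4.8, 7.9, 3.5, 8.1, 6.9])   # SYNOP observations, m/s

err = fcst - obs
me   = err.mean()                  # mean error (bias)
mae  = np.abs(err).mean()
rmse = np.sqrt((err ** 2).mean())
print(f"ME={me:.2f}  MAE={mae:.2f}  RMSE={rmse:.2f} m/s")
```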
Forecasting land cover change impacts on drinking water treatment costs in Minneapolis, Minnesota
Source protection is a critical aspect of drinking water treatment. The benefits of protecting source water quality in reducing drinking water treatment costs are clear. However, forecasting the impacts of environmental change on source water quality and its potential to influenc...
Sukič, Primož; Štumberger, Gorazd
2017-05-13
Clouds moving at a high speed in front of the Sun can cause step changes in the output power of photovoltaic (PV) power plants, which can lead to voltage fluctuations and stability problems in the connected electricity networks. These effects can be reduced effectively by proper short-term cloud passing forecasting and suitable PV power plant output power control. This paper proposes a low-cost Internet of Things (IoT)-based solution for intra-minute cloud passing forecasting. The hardware consists of a Raspberry Pi 3 Model B with a WiFi connection and an OmniVision OV5647 sensor with a mounted wide-angle lens, a circular polarizing (CPL) filter and a neutral density (ND) filter. The completely new algorithm for cloud passing forecasting uses the green and blue colors in the photo to determine the position of the Sun, to recognize the clouds, and to predict their movement. The image processing is performed in several stages, considering selectively only a small part of the photo relevant to the movement of the clouds in the vicinity of the Sun in the next minute. The proposed algorithm is compact, fast and suitable for implementation on low-cost processors with low computation power. The speed of the cloud parts closest to the Sun is used to predict when the clouds will cover the Sun. WiFi communication is used to transmit this data to the PV power plant control system in order to decrease the output power slowly and smoothly.
Sukič, Primož; Štumberger, Gorazd
2017-01-01
Clouds moving at a high speed in front of the Sun can cause step changes in the output power of photovoltaic (PV) power plants, which can lead to voltage fluctuations and stability problems in the connected electricity networks. These effects can be reduced effectively by proper short-term cloud passing forecasting and suitable PV power plant output power control. This paper proposes a low-cost Internet of Things (IoT)-based solution for intra-minute cloud passing forecasting. The hardware consists of a Raspberry Pi 3 Model B with a WiFi connection and an OmniVision OV5647 sensor with a mounted wide-angle lens, a circular polarizing (CPL) filter and a neutral density (ND) filter. The completely new algorithm for cloud passing forecasting uses the green and blue colors in the photo to determine the position of the Sun, to recognize the clouds, and to predict their movement. The image processing is performed in several stages, considering selectively only a small part of the photo relevant to the movement of the clouds in the vicinity of the Sun in the next minute. The proposed algorithm is compact, fast and suitable for implementation on low-cost processors with low computation power. The speed of the cloud parts closest to the Sun is used to predict when the clouds will cover the Sun. WiFi communication is used to transmit this data to the PV power plant control system in order to decrease the output power slowly and smoothly. PMID:28505078
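A much-simplified sketch of the idea, not the published algorithm: classify cloud pixels from the blue and green channels of successive sky images, then estimate cloud motion by aligning the two masks. The threshold value and the brute-force shift search are illustrative assumptions.

```python
# Toy version: cloud pixels from the blue/green ratio of two sky images,
# cloud motion from the integer shift that best aligns the two masks.
# Threshold and search range are invented, not the paper's values.
import numpy as np

def cloud_mask(img):                      # img: H x W x 3, float RGB in [0,1]
    g, b = img[..., 1], img[..., 2]
    return (b / np.maximum(g, 1e-6)) < 1.15   # clouds whiten the sky: b ~ g

def estimate_shift(mask0, mask1, max_px=10):
    """Brute-force search for the integer shift that best aligns two masks."""
    best, best_score = (0, 0), -1.0
    for dy in range(-max_px, max_px + 1):
        for dx in range(-max_px, max_px + 1):
            score = np.mean(np.roll(mask0, (dy, dx), axis=(0, 1)) == mask1)
            if score > best_score:
                best, best_score = (dy, dx), score
    return best                           # pixels moved between frames

rng = np.random.default_rng(3)
img0 = rng.uniform(size=(64, 64, 3))
img1 = np.roll(img0, (2, 3), axis=(0, 1))     # synthetic "next frame"
print(estimate_shift(cloud_mask(img0), cloud_mask(img1)))   # -> (2, 3)
```

Given the per-frame shift and the cloud edge's pixel distance to the Sun, the time to coverage follows by simple division.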
NASA Astrophysics Data System (ADS)
Pierro, Marco; De Felice, Matteo; Maggioni, Enrico; Moser, David; Perotto, Alessandro; Spada, Francesco; Cornaro, Cristina
2017-04-01
The growing photovoltaic generation results in a stochastic variability of the electric demand that could compromise the stability of the grid and increase the amount of energy reserve and the energy imbalance cost. On the regional scale, solar power estimation and forecasting are becoming essential for Distribution System Operators (DSOs), the Transmission System Operator (TSO), energy traders, and aggregators of generation. Indeed, the estimation of regional PV power can be used for PV power supervision and real-time control of the residual load. Mid-term PV power forecasts can be employed for transmission scheduling to reduce energy imbalance and the related cost of penalties, residual load tracking, trading optimization, and secondary energy reserve assessment. In this context, a new upscaling method was developed and used for estimation and mid-term forecasting of the photovoltaic distributed generation in a small area in the north of Italy under the control of a local DSO. The method was based on spatial clustering of the PV fleet and neural network models that input satellite or numerical weather prediction data (centered on cluster centroids) to estimate or predict the regional solar generation. It requires a low computational effort, and very little input information must be provided by users. The power estimation model achieved a RMSE of 3% of installed capacity. Intra-day forecasts (from 1 to 4 hours) obtained a RMSE of 5%-7%, while the one- and two-day forecasts achieved RMSEs of 7% and 7.5%. A model to estimate the forecast error and the prediction intervals was also developed. The photovoltaic production in the considered region provided 6.9% of the electric consumption in 2015. Since this PV penetration is very similar to the one observed at national level (7.9%), this is a good case study to analyse the impact of PV generation on the electric grid and the effects of PV power forecasts on transmission scheduling and on secondary reserve estimation. It appears that, already with 7% PV penetration, the distributed PV generation can have a great impact both on the DSO energy need and on the transmission scheduling capability. Indeed, for some hours of the day in summer time, the photovoltaic generation can provide from 50% to 75% of the energy that the local DSO would otherwise buy from the Italian TSO to cover the electrical demand. Moreover, the mid-term forecast can reduce the annual energy imbalance between the scheduled transmission and the actual one from 10% of the TSO energy supply (without considering the PV forecast) to 2%. Furthermore, it was shown that prediction intervals can be used not only to estimate the probability of a specific PV generation bid on the energy market, but also to reduce the energy reserve predicted for the next day. Two different methods for energy reserve estimation were developed and tested. The first is based on a clear sky model, while the second makes use of the PV prediction intervals with a 95% confidence level. The latter reduces the day-ahead energy reserve by 36% with respect to the clear sky method.
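A minimal sketch of the upscaling idea under stated assumptions: cluster the PV plant locations, then train a neural network mapping NWP irradiance at the cluster centroids to total regional PV power. The shapes and data are synthetic, and the network size is arbitrary.

```python
# Sketch of regional PV upscaling: spatial clustering of the fleet, then a
# neural network from centroid-level NWP irradiance to regional power.
# All data here are synthetic placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(6)
sites = rng.uniform(size=(200, 2))        # lon/lat of the PV fleet
k = 5
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(sites)
centroids = km.cluster_centers_           # NWP inputs are taken at these points

# NWP irradiance forecast at each centroid for 1000 hours (synthetic), and a
# synthetic regional power series loosely tied to the mean irradiance.
nwp = rng.uniform(0, 1000, size=(1000, k))
regional_pv = nwp.mean(axis=1) * 0.15 + rng.normal(0, 10, size=1000)

model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                     random_state=0).fit(nwp[:800], regional_pv[:800])
rmse = np.sqrt(np.mean((model.predict(nwp[800:]) - regional_pv[800:]) ** 2))
print(f"holdout RMSE: {rmse:.1f} (same units as regional power)")
```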
Satellite-based Calibration of Heat Flux at the Ocean Surface
NASA Astrophysics Data System (ADS)
Barron, C. N.; Dastugue, J. M.; May, J. C.; Rowley, C. D.; Smith, S. R.; Spence, P. L.; Gremes-Cordero, S.
2016-02-01
Model forecasts of upper ocean heat content and variability on diurnal to daily scales are highly dependent on estimates of heat flux through the air-sea interface. Satellite remote sensing is applied to not only inform the initial ocean state but also to mitigate errors in surface heat flux and model representations affecting the distribution of heat in the upper ocean. Traditional assimilation of sea surface temperature (SST) observations re-centers ocean models at the start of each forecast cycle. Subsequent evolution depends on estimates of surface heat fluxes and upper-ocean processes over the forecast period. The COFFEE project (Calibration of Ocean Forcing with satellite Flux Estimates) endeavors to correct ocean forecast bias through a responsive error partition among surface heat flux and ocean dynamics sources. A suite of experiments in the southern California Current demonstrates a range of COFFEE capabilities, showing the impact on forecast error relative to a baseline three-dimensional variational (3DVAR) assimilation using Navy operational global or regional atmospheric forcing. COFFEE addresses satellite-calibration of surface fluxes to estimate surface error covariances and links these to the ocean interior. Experiment cases combine different levels of flux calibration with different assimilation alternatives. The cases may use the original fluxes, apply full satellite corrections during the forecast period, or extend hindcast corrections into the forecast period. Assimilation is either baseline 3DVAR or standard strong-constraint 4DVAR, with work proceeding to add a 4DVAR expanded to include a weak constraint treatment of the surface flux errors. Covariance of flux errors is estimated from the recent time series of forecast and calibrated flux terms. While the California Current examples are shown, the approach is equally applicable to other regions. These approaches within a 3DVAR application are anticipated to be useful for global and larger regional domains where a full 4DVAR methodology may be cost-prohibitive.
Forecasting skills of the ensemble hydro-meteorological system for the Po river floods
NASA Astrophysics Data System (ADS)
Ricciardi, Giuseppe; Montani, Andrea; Paccagnella, Tiziana; Pecora, Silvano; Tonelli, Fabrizio
2013-04-01
The Po basin is the largest and most economically important river basin in Italy. Extreme hydrological events, including floods, flash floods and droughts, are expected to become more severe in the near future due to climate change, and the related ground effects are linked with both environmental and social resilience. A Warning Operational Center (WOC) for hydrological event management was created in the Emilia Romagna region. In recent years, the WOC faced challenges in legislation, organization, technology and economics, achieving improvements in forecasting skill and information dissemination. Since 2005, an operational forecasting and modelling system for flood modelling and forecasting has been implemented, aimed at supporting and coordinating flood control and emergency management on the whole Po basin. This system, referred to as FEWSPo, also takes care of the environmental aspects of flood forecasting. The FEWSPo system has reached a very high level of complexity, due to the combination of three different hydrological-hydraulic chains (HEC-HMS/RAS - MIKE11 NAM/HD, Topkapi/Sobek) with several meteorological inputs (forecasted - COSMOI2, COSMOI7, COSMO-LEPS among others - and observed). In this hydrological and meteorological ensemble, the management of the relative predictive uncertainties, which have to be established and communicated to decision makers, is a debated scientific and social challenge. Real-time activities involve professional, modelling and technological aspects, but are also strongly interrelated with organizational and human factors. The authors report a case study using the operational flood forecast hydro-meteorological ensemble provided by the MIKE11 chain fed by COSMO-LEPS EQPF. The basic aim of the proposed approach is to analyse the limits and opportunities of the long-term forecast (with a lead time ranging from 3 to 5 days) for the implementation of low-cost actions, also looking for well-informed decision making and the improvement of flood preparedness and crisis management for basins greater than 1,000 km2.
NASA Astrophysics Data System (ADS)
Moturu, Sai T.; Liu, Huan; Johnson, William G.
Rapidly rising healthcare costs represent one of the major issues plaguing the healthcare system. Data from the Arizona Health Care Cost Containment System, Arizona's Medicaid program, provide a unique opportunity to exploit state-of-the-art machine learning and data mining algorithms to analyze data and provide actionable findings that can aid cost containment. Our work addresses specific challenges in this real-life healthcare application with respect to data imbalance in the process of building predictive risk models for forecasting high-cost patients. We survey the literature and propose novel data mining approaches customized for this compelling application with specific focus on non-random sampling. Our empirical study indicates that the proposed approach is highly effective and can benefit further research on cost containment in the healthcare industry.
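One common non-random sampling tactic for this kind of imbalance, shown here as an illustration rather than the authors' exact scheme, is to keep all of the rare high-cost cases and undersample the majority class before fitting a risk model.

```python
# Sketch of majority-class undersampling for an imbalanced risk model:
# keep all rare positives, draw a fixed multiple of negatives.
# Data, class ratio, and the 1:3 target balance are all illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
X = rng.normal(size=(10000, 8))
y = (rng.uniform(size=10000) < 0.05).astype(int)   # ~5% high-cost patients

pos = np.where(y == 1)[0]
neg = rng.choice(np.where(y == 0)[0], size=3 * len(pos), replace=False)
idx = np.concatenate([pos, neg])                   # rebalanced training set

model = RandomForestClassifier(random_state=0).fit(X[idx], y[idx])
print("training prevalence raised to", y[idx].mean().round(2))
```

Note that rebalancing distorts the predicted probabilities, so scores from such a model are usually recalibrated or used only for ranking patients by risk.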
Economic and environmental costs of regulatory uncertainty for coal-fired power plants.
Patiño-Echeverri, Dalia; Fischbeck, Paul; Kriegler, Elmar
2009-02-01
Uncertainty about the extent and timing of CO2 emissions regulations for the electricity-generating sector exacerbates the difficulty of selecting investment strategies for retrofitting or alternatively replacing existent coal-fired power plants. This may result in inefficient investments imposing economic and environmental costs to society. In this paper, we construct a multiperiod decision model with an embedded multistage stochastic dynamic program minimizing the expected total costs of plant operation, installations, and pollution allowances. We use the model to forecast optimal sequential investment decisions of a power plant operator with and without uncertainty about future CO2 allowance prices. The comparison of the two cases demonstrates that uncertainty on future CO2 emissions regulations might cause significant economic costs and higher air emissions.
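A toy two-stage version of the investment problem helps fix ideas: choose now between retrofitting and waiting, with the CO2 allowance price revealed later. All costs and probabilities below are invented, and the value of waiting emerges from the comparison.

```python
# Toy two-stage decision under CO2 price uncertainty, in the spirit of the
# paper's multistage stochastic program. All numbers are invented.
RETROFIT_COST = 300.0                             # $M
P_HIGH = 0.5                                      # chance of strict regulation
EMISSION_COST = {"high": 500.0, "low": 100.0}     # $M allowance exposure
RESIDUAL = 0.2                                    # exposure left after retrofit

def expected_cost(retrofit_now: bool) -> float:
    if retrofit_now:
        return RETROFIT_COST + RESIDUAL * (
            P_HIGH * EMISSION_COST["high"] + (1 - P_HIGH) * EMISSION_COST["low"])
    # Wait, observe the price, then retrofit only if it pays off.
    high = min(RETROFIT_COST + RESIDUAL * EMISSION_COST["high"],
               EMISSION_COST["high"])
    low = EMISSION_COST["low"]                    # never retrofit when cheap
    return P_HIGH * high + (1 - P_HIGH) * low

for choice in (True, False):
    print("retrofit now" if choice else "wait ", expected_cost(choice))
```

With these numbers, waiting is cheaper (250 vs 360), illustrating the option value that regulatory uncertainty creates; the paper's point is that this same uncertainty can also lock in inefficient, higher-emission outcomes.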
The Use of Scale-Dependent Precision to Increase Forecast Accuracy in Earth System Modelling
NASA Astrophysics Data System (ADS)
Thornes, Tobias; Duben, Peter; Palmer, Tim
2016-04-01
At the current pace of development, it may be decades before the 'exa-scale' computers needed to resolve individual convective clouds in weather and climate models become available to forecasters, and such machines will incur very high power demands. But the resolution could be improved today by switching to more efficient, 'inexact' hardware with which variables can be represented in 'reduced precision'. Currently, all numbers in our models are represented as double-precision floating points - each requiring 64 bits of memory - to minimise rounding errors, regardless of spatial scale. Yet observational and modelling constraints mean that values of atmospheric variables are inevitably known less precisely on smaller scales, suggesting that this may be a waste of computer resources. More accurate forecasts might therefore be obtained by taking a scale-selective approach whereby the precision of variables is gradually decreased at smaller spatial scales to optimise the overall efficiency of the model. To study the effect of reducing precision to different levels on multiple spatial scales, we here introduce a new model atmosphere developed by extending the Lorenz '96 idealised system to encompass three tiers of variables - which represent large-, medium- and small-scale features - for the first time. In this chaotic but computationally tractable system, the 'true' state can be defined by explicitly resolving all three tiers. The abilities of low resolution (single-tier) double-precision models and similar-cost high resolution (two-tier) models in mixed-precision to produce accurate forecasts of this 'truth' are compared. The high resolution models outperform the low resolution ones even when small-scale variables are resolved in half-precision (16 bits). This suggests that using scale-dependent levels of precision in more complicated real-world Earth System models could allow forecasts to be made at higher resolution and with improved accuracy. If adopted, this new paradigm would represent a revolution in numerical modelling that could be of great benefit to the world.
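A minimal sketch of the reduced-precision experiment on the classic single-tier Lorenz '96 system (the paper's three-tier extension and scale-selective precision are omitted): integrate the same initial state in float64 and float16 and measure the divergence.

```python
# Sketch: Lorenz '96 integrated in float64 ("truth") and float16, rounding
# the low-precision state back to 16 bits after every step.
import numpy as np

def tendency(x, F=8.0):
    # dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt=0.05):
    k1 = tendency(x)
    k2 = tendency(x + 0.5 * dt * k1)
    k3 = tendency(x + 0.5 * dt * k2)
    k4 = tendency(x + dt * k3)
    return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

x64 = np.full(40, 8.0); x64[0] += 0.01            # perturbed equilibrium
x16 = x64.astype(np.float16)
for _ in range(200):                              # 10 model time units
    x64 = rk4_step(x64)
    x16 = rk4_step(x16).astype(np.float16)        # round back to 16 bits

print("RMS divergence:", float(np.sqrt(np.mean((x64 - x16) ** 2))))
```

In the paper's scale-selective variant, only the small-scale tier would be held in low precision, with the compute saved reinvested in resolution.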
Demand forecasting for automotive sector in Malaysia by system dynamics approach
NASA Astrophysics Data System (ADS)
Zulkepli, Jafri; Fong, Chan Hwa; Abidin, Norhaslinda Zainal
2015-12-01
In general, Proton, as an automotive company, needs to forecast future car demand to assist decision-making related to capacity expansion planning. Forecasting approaches based on judgemental or subjective factors are normally used to forecast the demand. As a result, the company could overstock, which eventually increases operating costs, or understock, which loses customers. Because the automotive industry involves a high level of complexity and uncertainty, an accurate tool to forecast future automotive demand from the modelling perspective is required. Hence, the main objective of this paper is to forecast the demand of the automotive Proton car industry in Malaysia using a system dynamics approach. Two types of intervention, namely optimistic and pessimistic experiment scenarios, have been tested to determine the capacity expansion that can prevent the company from overstocking. Findings from this study highlight that management needs to expand production under the optimistic scenario, whilst the pessimistic scenario suggests otherwise. Finally, this study could help Proton Edar Sdn. Bhd. (PESB) manage long-term capacity planning in order to meet the future demand for Proton cars.
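A toy stock-flow sketch in the spirit of the scenario testing described above; all rates are invented, and the expansion rate is the policy lever distinguishing the optimistic and pessimistic runs.

```python
# Toy system dynamics sketch: demand and capacity as stocks, growth and
# expansion as flows, with overstock/understock as the outcomes of interest.
# All rates are invented for illustration.
def simulate(years=10, demand0=100.0, growth=0.05, capacity0=90.0,
             expansion_rate=0.08):
    demand, capacity = demand0, capacity0
    for year in range(years):
        unmet = max(demand - capacity, 0.0)      # understock loses customers
        surplus = max(capacity - demand, 0.0)    # overstock raises costs
        print(f"year {year}: demand={demand:.0f} capacity={capacity:.0f} "
              f"unmet={unmet:.0f} surplus={surplus:.0f}")
        demand *= 1 + growth                     # exogenous demand growth
        capacity *= 1 + expansion_rate           # policy lever under test

simulate(expansion_rate=0.08)   # "optimistic" expansion scenario
simulate(expansion_rate=0.0)    # "pessimistic" no-expansion scenario
```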
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coimbra, Carlos F. M.
2016-02-25
In this project we address multiple resource integration challenges associated with increasing levels of solar penetration that arise from the variability and uncertainty in solar irradiance. We will model the SMUD service region as its own balancing region, and develop an integrated, real-time operational tool that takes solar-load forecast uncertainties into consideration and commits optimal energy resources and reserves for intra-hour and intra-day decisions. The primary objectives of this effort are to reduce power system operation cost by committing an appropriate amount of energy resources and reserves, and to provide operators with a prediction of the generation fleet's behavior in real time for realistic PV penetration scenarios. The proposed methodology includes the following steps: clustering analysis of the expected solar variability per region for the SMUD system; day-ahead (DA) and real-time (RT) load forecasts for the entire service area; one year of intra-hour CPR forecasts for cluster centers; one year of smart re-forecasting CPR forecasts in real time for determination of irreducible errors; and uncertainty quantification of the integrated solar-load for both distributed and central-station PV generation (selected locations within the service region).
An econometric simulation model of income and electricity demand in Alaska's Railbelt, 1982-2022
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maddigan, R.J.; Hill, L.J.; Hamblin, D.M.
1987-01-01
This report describes the specification of, and forecasts derived from, the Alaska Railbelt Electricity Load, Macroeconomic (ARELM) model. ARELM was developed as an independent modeling tool for evaluating the need for power from the Susitna Hydroelectric Project, which has been proposed by the Alaska Power Authority. ARELM is an econometric simulation model consisting of 61 equations: 46 behavioral equations and 15 identities. The system includes two components: (1) ARELM-MACRO, a system of equations that simulates the performance of both the total Alaskan and Railbelt macroeconomies, and (2) ARELM-LOAD, which projects electricity-related activity in the Alaskan Railbelt region. The modeling system is block recursive in the sense that forecasts of population, personal income, and employment in the Railbelt derived from ARELM-MACRO are used as explanatory variables in ARELM-LOAD to simulate electricity demand, the real average price of electricity, and the number of customers in the Railbelt. Three scenarios based on assumptions about the future price of crude oil are simulated and documented in the report. The simulations, which do not include the cost-of-power impacts of Susitna-based generation, show that the growth rate in Railbelt electricity load is between 2.5 and 2.7% over the 1982 to 2022 forecast period. The forecasting results are consistent with other projections of load growth in the region using different modeling approaches.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-26
... others for safety assessment, planning, forecasting, cost/benefit analysis, and to target areas for... assessment, planning, forecasting, cost/benefit analysis, and to target areas for research. Respondents... invites public comments about our intention to request the Office of Management and Budget (OMB) approval...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-12
... others for safety assessment, planning, forecasting, cost/benefit analysis, and to target areas for... assessment, planning, forecasting, cost/benefit analysis, and to target areas for research. Respondents... invites public comments about our intention to request the Office of Management and Budget (OMB) approval...
Municipal water consumption forecast accuracy
NASA Astrophysics Data System (ADS)
Fullerton, Thomas M.; Molina, Angel L.
2010-06-01
Municipal water consumption planning is an active area of research because of infrastructure construction and maintenance costs, supply constraints, and water quality assurance. In spite of that, relatively few water forecast accuracy assessments have been completed to date, although some internal documentation may exist as part of the proprietary "grey literature." This study utilizes a data set of previously published municipal consumption forecasts to partially fill that gap in the empirical water economics literature. Previously published municipal water econometric forecasts for three public utilities are examined for predictive accuracy against two random walk benchmarks commonly used in regional analyses. Descriptive metrics used to quantify forecast accuracy include root-mean-square error and Theil inequality statistics. Formal statistical assessments are completed using four-pronged error differential regression F tests. Similar to studies of other metropolitan econometric forecasts in areas with similar demographic and labor market characteristics, model predictive performances for the municipal water aggregates in this effort are mixed for each of the municipalities included in the sample. Given the competitiveness of the benchmarks, analysts should employ care when utilizing econometric forecasts of municipal water consumption for planning purposes, comparing them to recent historical observations and trends to ensure reliability. Comparative results using data from other markets, including regions facing differing labor and demographic conditions, would also be helpful.
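For concreteness, a sketch of the accuracy metrics involved: RMSE and a Theil inequality coefficient (the U1 form is shown; the abstract does not specify which variant), with a naive random-walk benchmark whose forecast is simply the previous observation. The series are illustrative.

```python
# Sketch of forecast accuracy metrics against a random-walk benchmark:
# RMSE and Theil's U1 inequality coefficient. Data are illustrative.
import numpy as np

actual = np.array([102., 108., 105., 111., 118., 115.])
model  = np.array([100., 107., 107., 110., 116., 117.])   # econometric forecast
naive  = np.concatenate(([actual[0]], actual[:-1]))       # random walk

def rmse(f, a):
    return np.sqrt(np.mean((f - a) ** 2))

def theil_u1(f, a):
    return rmse(f, a) / (np.sqrt(np.mean(f ** 2)) + np.sqrt(np.mean(a ** 2)))

for name, f in [("model", model), ("random walk", naive)]:
    print(f"{name:12s} RMSE={rmse(f, actual):.2f}  U1={theil_u1(f, actual):.4f}")
```

If the model's error statistics do not beat the naive column, the econometric forecast adds little over the benchmark, which is exactly the comparison the study formalizes with its F tests.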
Forecasting Financial Priorities for Technology.
ERIC Educational Resources Information Center
Ringle, Martin D.
1997-01-01
Argues that, with technology costs and revenue opportunities changing rapidly, colleges' future financial strategies concerning technology will have to be more agile and adaptable than ever. Presents financial models from 20 independent colleges and universities, and discusses how they have been used to define a financial strategy for technology…
Body Fat Percentage Prediction Using Intelligent Hybrid Approaches
Shao, Yuehjen E.
2014-01-01
Excess body fat often leads to obesity. Obesity is typically associated with serious medical diseases, such as cancer, heart disease, and diabetes. Accordingly, knowing the body fat percentage is an extremely important issue since it affects everyone's health. Although there are several ways to measure the body fat percentage (BFP), the accurate methods are often associated with hassle and/or high costs. Traditional single-stage approaches may use certain body measurements or explanatory variables to predict the BFP. Diverging from existing approaches, this study proposes new intelligent hybrid approaches that use fewer explanatory variables, and the proposed forecasting models are able to effectively predict the BFP. The proposed hybrid models consist of multiple regression (MR), artificial neural network (ANN), multivariate adaptive regression splines (MARS), and support vector regression (SVR) techniques. The first stage of the modeling includes the use of MR and MARS to obtain a smaller but more important set of explanatory variables. In the second stage, the remaining important variables serve as inputs for the other forecasting methods. A real dataset was used to demonstrate the development of the proposed hybrid models. The prediction results revealed that the proposed hybrid schemes outperformed the typical, single-stage forecasting models. PMID:24723804
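A minimal sketch of the two-stage idea, with sklearn's univariate selection standing in for the MR/MARS screening stage (MARS has no standard sklearn implementation); the data are synthetic.

```python
# Sketch of a two-stage hybrid: a first-stage screen keeps only the most
# important explanatory variables, which then feed an SVR forecaster.
# Univariate selection stands in for MR/MARS screening; data are synthetic.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(size=(252, 13))            # e.g. 13 body measurements
y = 0.6 * X[:, 0] - 0.4 * X[:, 3] + 0.1 * rng.normal(size=252)  # synthetic BFP

two_stage = make_pipeline(SelectKBest(f_regression, k=4), SVR())
print("CV R^2:", cross_val_score(two_stage, X, y, cv=5).mean().round(3))
```

The appeal of the two-stage design is practical: fewer retained variables mean fewer measurements to collect per subject without much loss of predictive power.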
A Systems Modeling Approach to Forecast Corn Economic Optimum Nitrogen Rate.
Puntel, Laila A; Sawyer, John E; Barker, Daniel W; Thorburn, Peter J; Castellano, Michael J; Moore, Kenneth J; VanLoocke, Andrew; Heaton, Emily A; Archontoulis, Sotirios V
2018-01-01
Historically crop models have been used to evaluate crop yield responses to nitrogen (N) rates after harvest when it is too late for the farmers to make in-season adjustments. We hypothesize that the use of a crop model as an in-season forecast tool will improve current N decision-making. To explore this, we used the Agricultural Production Systems sIMulator (APSIM) calibrated with long-term experimental data for central Iowa, USA (16-years in continuous corn and 15-years in soybean-corn rotation) combined with actual weather data up to a specific crop stage and historical weather data thereafter. The objectives were to: (1) evaluate the accuracy and uncertainty of corn yield and economic optimum N rate (EONR) predictions at four forecast times (planting time, 6th and 12th leaf, and silking phenological stages); (2) determine whether the use of analogous historical weather years based on precipitation and temperature patterns as opposed to using a 35-year dataset could improve the accuracy of the forecast; and (3) quantify the value added by the crop model in predicting annual EONR and yields using the site-mean EONR and the yield at the EONR to benchmark predicted values. Results indicated that the mean corn yield predictions at planting time (R2 = 0.77) using 35-years of historical weather was close to the observed and predicted yield at maturity (R2 = 0.81). Across all forecasting times, the EONR predictions were more accurate in corn-corn than soybean-corn rotation (relative root mean square error, RRMSE, of 25 vs. 45%, respectively). At planting time, the APSIM model predicted the direction of optimum N rates (above, below or at average site-mean EONR) in 62% of the cases examined (n = 31) with an average error range of ±38 kg N ha−1 (22% of the average N rate). Across all forecast times, prediction error of EONR was about three times higher than yield predictions. The use of the 35-year weather record was better than using selected historical weather years to forecast (RRMSE was on average 3% lower). Overall, the proposed approach of using the crop model as a forecasting tool could improve year-to-year predictability of corn yields and optimum N rates. Further improvements in modeling and set-up protocols are needed toward more accurate forecast, especially for extreme weather years with the most significant economic and environmental cost.
A Systems Modeling Approach to Forecast Corn Economic Optimum Nitrogen Rate
Puntel, Laila A.; Sawyer, John E.; Barker, Daniel W.; Thorburn, Peter J.; Castellano, Michael J.; Moore, Kenneth J.; VanLoocke, Andrew; Heaton, Emily A.; Archontoulis, Sotirios V.
2018-01-01
Historically crop models have been used to evaluate crop yield responses to nitrogen (N) rates after harvest when it is too late for the farmers to make in-season adjustments. We hypothesize that the use of a crop model as an in-season forecast tool will improve current N decision-making. To explore this, we used the Agricultural Production Systems sIMulator (APSIM) calibrated with long-term experimental data for central Iowa, USA (16-years in continuous corn and 15-years in soybean-corn rotation) combined with actual weather data up to a specific crop stage and historical weather data thereafter. The objectives were to: (1) evaluate the accuracy and uncertainty of corn yield and economic optimum N rate (EONR) predictions at four forecast times (planting time, 6th and 12th leaf, and silking phenological stages); (2) determine whether the use of analogous historical weather years based on precipitation and temperature patterns as opposed to using a 35-year dataset could improve the accuracy of the forecast; and (3) quantify the value added by the crop model in predicting annual EONR and yields using the site-mean EONR and the yield at the EONR to benchmark predicted values. Results indicated that the mean corn yield predictions at planting time (R2 = 0.77) using 35-years of historical weather was close to the observed and predicted yield at maturity (R2 = 0.81). Across all forecasting times, the EONR predictions were more accurate in corn-corn than soybean-corn rotation (relative root mean square error, RRMSE, of 25 vs. 45%, respectively). At planting time, the APSIM model predicted the direction of optimum N rates (above, below or at average site-mean EONR) in 62% of the cases examined (n = 31) with an average error range of ±38 kg N ha−1 (22% of the average N rate). Across all forecast times, prediction error of EONR was about three times higher than yield predictions. The use of the 35-year weather record was better than using selected historical weather years to forecast (RRMSE was on average 3% lower). Overall, the proposed approach of using the crop model as a forecasting tool could improve year-to-year predictability of corn yields and optimum N rates. Further improvements in modeling and set-up protocols are needed toward more accurate forecast, especially for extreme weather years with the most significant economic and environmental cost. PMID:29706974
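A small sketch of the relative root mean square error (RRMSE) used to score the EONR and yield forecasts above, with illustrative values.

```python
# Sketch of the RRMSE skill score: RMSE normalized by the observed mean,
# reported as a percentage. Values are illustrative, not the study's data.
import numpy as np

obs  = np.array([165., 190., 140., 210.])   # observed EONR, kg N/ha
pred = np.array([150., 200., 170., 190.])   # forecast EONR, kg N/ha

rmse  = np.sqrt(np.mean((pred - obs) ** 2))
rrmse = 100.0 * rmse / obs.mean()
print(f"RMSE={rmse:.1f} kg N/ha  RRMSE={rrmse:.0f}%")
```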
The Application of Magnesium Alloys in Aircraft Interiors — Changing the Rules
NASA Astrophysics Data System (ADS)
Davis, Bruce
The commercial aircraft market is forecast to grow steadily over the next two decades. Part of this growth is driven by airlines' desire to replace older models in their fleets with newer, more fuel-efficient designs, to realize lower operating costs and to address the rising cost of aviation fuel. As such, aircraft OEMs are setting increasingly demanding mass targets for their new platforms.
Effective Capital Provision Within Government. Methodologies for Right-Sizing Base Infrastructure
2005-01-01
unknown distributions, since they more accurately represent the complexity of real-world problems. Forecasting uncertain future demand flows is critical to...ordering system with no time lags and no additional costs for instantaneous delivery, shortage and holding costs would be eliminated, because the...order a fixed quantity, Q. 4.1.4 Analyzed Time Step: Time is an important dimension in inventory models, since the way the system changes over time affects
Global Disease Monitoring and Forecasting with Wikipedia
Generous, Nicholas; Fairchild, Geoffrey; Deshpande, Alina; Del Valle, Sara Y.; Priedhorsky, Reid
2014-01-01
Infectious disease is a leading threat to public health, economic stability, and other key social structures. Efforts to mitigate these impacts depend on accurate and timely monitoring to measure the risk and progress of disease. Traditional, biologically-focused monitoring techniques are accurate but costly and slow; in response, new techniques based on social internet data, such as social media and search queries, are emerging. These efforts are promising, but important challenges in the areas of scientific peer review, breadth of diseases and countries, and forecasting hamper their operational usefulness. We examine a freely available, open data source for this use: access logs from the online encyclopedia Wikipedia. Using linear models, language as a proxy for location, and a systematic yet simple article selection procedure, we tested 14 location-disease combinations and demonstrate that these data feasibly support an approach that overcomes these challenges. Specifically, our proof-of-concept yields models with r2 up to 0.92, forecasting value up to the 28 days tested, and several pairs of models similar enough to suggest that transferring models from one location to another without re-training is feasible. Based on these preliminary results, we close with a research agenda designed to overcome these challenges and produce a disease monitoring and forecasting system that is significantly more effective, robust, and globally comprehensive than the current state of the art. PMID:25392913
Global disease monitoring and forecasting with Wikipedia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Generous, Nicholas; Fairchild, Geoffrey; Deshpande, Alina
Infectious disease is a leading threat to public health, economic stability, and other key social structures. Efforts to mitigate these impacts depend on accurate and timely monitoring to measure the risk and progress of disease. Traditional, biologically-focused monitoring techniques are accurate but costly and slow; in response, new techniques based on social internet data, such as social media and search queries, are emerging. These efforts are promising, but important challenges in the areas of scientific peer review, breadth of diseases and countries, and forecasting hamper their operational usefulness. We examine a freely available, open data source for this use: access logs from the online encyclopedia Wikipedia. Using linear models, language as a proxy for location, and a systematic yet simple article selection procedure, we tested 14 location-disease combinations and demonstrate that these data feasibly support an approach that overcomes these challenges. Specifically, our proof-of-concept yields models with r2 up to 0.92, forecasting value up to the 28 days tested, and several pairs of models similar enough to suggest that transferring models from one location to another without re-training is feasible. Based on these preliminary results, we close with a research agenda designed to overcome these challenges and produce a disease monitoring and forecasting system that is significantly more effective, robust, and globally comprehensive than the current state of the art.
Global disease monitoring and forecasting with Wikipedia.
Generous, Nicholas; Fairchild, Geoffrey; Deshpande, Alina; Del Valle, Sara Y; Priedhorsky, Reid
2014-11-01
Infectious disease is a leading threat to public health, economic stability, and other key social structures. Efforts to mitigate these impacts depend on accurate and timely monitoring to measure the risk and progress of disease. Traditional, biologically-focused monitoring techniques are accurate but costly and slow; in response, new techniques based on social internet data, such as social media and search queries, are emerging. These efforts are promising, but important challenges in the areas of scientific peer review, breadth of diseases and countries, and forecasting hamper their operational usefulness. We examine a freely available, open data source for this use: access logs from the online encyclopedia Wikipedia. Using linear models, language as a proxy for location, and a systematic yet simple article selection procedure, we tested 14 location-disease combinations and demonstrate that these data feasibly support an approach that overcomes these challenges. Specifically, our proof-of-concept yields models with r2 up to 0.92, forecasting value up to the 28 days tested, and several pairs of models similar enough to suggest that transferring models from one location to another without re-training is feasible. Based on these preliminary results, we close with a research agenda designed to overcome these challenges and produce a disease monitoring and forecasting system that is significantly more effective, robust, and globally comprehensive than the current state of the art.
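The modeling approach described here is essentially ordinary least squares on article access counts. A minimal sketch under stated assumptions — synthetic page-view and case-count series standing in for the real Wikipedia logs and surveillance data, and a 4-week forecast horizon chosen arbitrarily:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
# Hypothetical stand-ins: weekly page views for three selected articles
# and official case counts for one location-disease pair.
weeks = 150
views = rng.poisson(lam=[2000, 800, 300], size=(weeks, 3)).astype(float)
cases = 0.01 * views[:, 0] + 0.02 * views[:, 1] + rng.normal(0, 5, weeks)

lag = 4  # forecast horizon in weeks: predict cases 4 weeks ahead of the views
X, y = views[:-lag], cases[lag:]
split = int(0.7 * len(X))
model = LinearRegression().fit(X[:split], y[:split])
print("r2 =", r2_score(y[split:], model.predict(X[split:])))
```

The paper's r2 of up to 0.92 refers to fits of this general form, with article selection and language-as-location handling doing much of the work.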
Global disease monitoring and forecasting with Wikipedia
Generous, Nicholas; Fairchild, Geoffrey; Deshpande, Alina; ...
2014-11-13
Infectious disease is a leading threat to public health, economic stability, and other key social structures. Efforts to mitigate these impacts depend on accurate and timely monitoring to measure the risk and progress of disease. Traditional, biologically-focused monitoring techniques are accurate but costly and slow; in response, new techniques based on social internet data, such as social media and search queries, are emerging. These efforts are promising, but important challenges in the areas of scientific peer review, breadth of diseases and countries, and forecasting hamper their operational usefulness. We examine a freely available, open data source for this use: access logs from the online encyclopedia Wikipedia. Using linear models, language as a proxy for location, and a systematic yet simple article selection procedure, we tested 14 location-disease combinations and demonstrate that these data feasibly support an approach that overcomes these challenges. Specifically, our proof-of-concept yields models with r2 up to 0.92, forecasting value up to the 28 days tested, and several pairs of models similar enough to suggest that transferring models from one location to another without re-training is feasible. Based on these preliminary results, we close with a research agenda designed to overcome these challenges and produce a disease monitoring and forecasting system that is significantly more effective, robust, and globally comprehensive than the current state of the art.
Towards real-time assimilation of crowdsourced observations in hydrological modeling
NASA Astrophysics Data System (ADS)
Mazzoleni, Maurizio; Verlaan, Martin; Alfonso, Leonardo; Norbiato, Daniele; Monego, Martina; Ferri, Michele; Solomatine, Dimitri
2016-04-01
Continued technological advances have stimulated the spread of low-cost sensors that can be used by citizens to provide crowdsourced observations (CO) of different hydrological variables. An example of such a low-cost sensor is a staff gauge with a QR code, from which people can read the water level and send the measurement via a mobile phone application. The goal of this study is to assess the combined effect of assimilating CO coming from a distributed network of low-cost sensors, together with existing streamflow observations from physical sensors, on the performance of a semi-distributed hydrological model. The methodology is applied to the Bacchiglione catchment, in the north-east of Italy, where an early warning system is used by the Alto Adriatico Water Authority to issue forecasted water levels along the river network, which crosses important cities such as Vicenza and Padua. In this study, forecasted precipitation values are used as input to the hydrological model to estimate the simulated streamflow hydrograph used as the boundary condition for the hydraulic model. Observed precipitation values are used to generate realistic synthetic streamflow values with various characteristics of arrival frequency and accuracy, to simulate CO arriving at irregular time steps. These observations are assimilated into the semi-distributed model using a Kalman filter based method. The results of this study show that CO, asynchronous in time and of variable accuracy, can still improve flood prediction when integrated into hydrological models. When both physical and low-cost sensors are located at the same places, the assimilation of CO gives the same model improvement as the assimilation of physical observations only for a high number of non-intermittent sensors. However, the integration of observations from low-cost sensors and single physical sensors can improve the flood prediction even when only a small number of intermittent CO are available. This study is part of the FP7 European Project WeSenseIt Citizen Water Observatory (http://wesenseit.eu/).
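The assimilation step described above can be illustrated with a deliberately simplified scalar Kalman filter; the study's semi-distributed model and filter variant are more elaborate. Everything below (noise levels, arrival frequencies, the random-walk state model) is an assumption for illustration:

```python
import numpy as np

def kf_assimilate(x0, P0, Q, obs, obs_var, steps):
    """Scalar Kalman filter with identity dynamics: observations (possibly
    crowdsourced, intermittent, with varying accuracy) arrive at irregular
    time steps; obs[t] is None when no measurement is available."""
    x, P, trajectory = x0, P0, []
    for t in range(steps):
        P += Q                                # forecast step (random-walk model)
        if obs[t] is not None:
            K = P / (P + obs_var[t])          # Kalman gain
            x += K * (obs[t] - x)             # update with the innovation
            P *= (1.0 - K)
        trajectory.append(x)
    return np.array(trajectory)

# Hypothetical setup: accurate physical sensor every 6 steps,
# noisier crowdsourced readings at random times in between.
rng = np.random.default_rng(1)
truth = np.cumsum(rng.normal(0, 0.5, 48)) + 10.0
obs, var = [None] * 48, [None] * 48
for t in range(48):
    if t % 6 == 0:
        obs[t], var[t] = truth[t] + rng.normal(0, 0.1), 0.01   # physical sensor
    elif rng.random() < 0.3:
        obs[t], var[t] = truth[t] + rng.normal(0, 0.5), 0.25   # crowdsourced
est = kf_assimilate(x0=10.0, P0=1.0, Q=0.25, obs=obs, obs_var=var, steps=48)
print("RMSE:", np.sqrt(np.mean((est - truth) ** 2)))
```

The per-observation variance is how variable crowdsourced accuracy enters: a noisier reading simply gets a smaller gain.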
Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)
NASA Astrophysics Data System (ADS)
Jordan, T. H.
2010-12-01
The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and they need to convey the epistemic uncertainties in the operational forecasts. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. All operational procedures should be rigorously reviewed by experts in the creation, delivery, and utility of earthquake forecasts. (c) The quality of all operational models should be evaluated for reliability and skill by retrospective testing, and the models should be under continuous prospective testing in a CSEP-type environment against established long-term forecasts and a wide variety of alternative, time-dependent models. (d) Short-term models used in operational forecasting should be consistent with the long-term forecasts used in PSHA. (e) Alert procedures should be standardized to facilitate decisions at different levels of government and among the public, based in part on objective analysis of costs and benefits. (f) In establishing alert procedures, consideration should also be given to the less tangible aspects of value-of-information, such as gains in psychological preparedness and resilience. Authoritative statements of increased risk, even when the absolute probability is low, can provide a psychological benefit to the public by filling information vacuums that can lead to informal predictions and misinformation.
Energy Storage Sizing Taking Into Account Forecast Uncertainties and Receding Horizon Operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Kyri; Hug, Gabriela; Li, Xin
Energy storage systems (ESS) have the potential to be very beneficial for applications such as reducing the ramping of generators, peak shaving, and balancing not only the variability introduced by renewable energy sources, but also the uncertainty introduced by errors in their forecasts. Optimal usage of storage may result in reduced generation costs and an increased use of renewable energy. However, optimally sizing these devices is a challenging problem. This paper aims to provide the tools to optimally size an ESS under the assumption that it will be operated under a model predictive control scheme and that the forecasts of the renewable energy resources include prediction errors. A two-stage stochastic model predictive control is formulated and solved, where the optimal usage of the storage is determined simultaneously with the optimal generation outputs and the size of the storage. Wind forecast errors are taken into account in the optimization problem via probabilistic constraints for which an analytical form is derived. This allows the stochastic optimization problem to be solved directly, without sampling-based approaches, and the storage to be sized to account not only for a wide range of potential scenarios, but also for a wide range of potential forecast errors. In the proposed formulation, we account for the fact that errors in the forecast affect how the device is operated later in the horizon and that a receding horizon scheme is used in operation to optimally use the available storage.
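The key trick mentioned in the abstract — replacing a probabilistic constraint with an analytical form so no sampling is needed — has a standard Gaussian version, sketched below. The numbers and the constraint itself are hypothetical, not the paper's formulation:

```python
from scipy.stats import norm

# Chance constraint: require P(load - wind_error <= dispatchable capacity) >= 1 - eps.
# With Gaussian forecast errors e ~ N(0, sigma^2), this has the deterministic
# equivalent: capacity >= net_load + z * sigma, with z = norm.ppf(1 - eps).
# That analytic reformulation is what avoids sampling-based approaches.
eps, sigma = 0.05, 12.0          # 5% violation risk; 12 MW error std (assumed)
z = norm.ppf(1 - eps)            # ~1.645 for eps = 0.05
net_load = 100.0                 # MW, hypothetical
margin = z * sigma
print(f"required dispatchable capacity: {net_load + margin:.1f} MW "
      f"(reserve margin {margin:.1f} MW)")
```

In the sizing problem, a margin like this appears inside the optimization at every stage of the horizon, coupling the storage size to the tolerated violation probability.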
The impact of underwater glider observations in the forecast of Hurricane Gonzalo (2014)
NASA Astrophysics Data System (ADS)
Goni, G. J.; Domingues, R. M.; Kim, H. S.; Halliwell, G. R., Jr.; Bringas, F.; Morell, J. M.; Pomales, L.; Baltes, R.
2017-12-01
The tropical Atlantic basin is one of seven global regions where tropical cyclones (TC) are commonly observed to originate and intensify from June to November. On average, approximately 12 TCs travel through the region every year, frequently affecting coastal and highly populated areas. In an average year, 2 to 3 of them are categorized as intense hurricanes. Given the appropriate atmospheric conditions, TC intensification has been linked to ocean conditions, such as increased ocean heat content and enhanced salinity stratification near the surface. While errors in hurricane track forecasts have been reduced in recent years, errors in intensity forecasts remain mostly unchanged. Several studies have indicated that the use of in situ observations has the potential to improve the representation of the ocean to correctly initialize coupled hurricane intensity forecast models. However, a sustained in situ ocean observing system in the tropical North Atlantic Ocean and Caribbean Sea dedicated to measuring subsurface thermal and salinity fields in support of TC intensity studies and forecasts has yet to be implemented. Autonomous technologies offer new and cost-effective opportunities to accomplish this objective. We highlight here a partnership effort that utilizes underwater gliders to better understand air-sea processes during high-wind events and is particularly geared towards improving hurricane intensity forecasts. Results are presented for Hurricane Gonzalo (2014), where glider observations obtained in the tropical Atlantic: helped provide an accurate description of the upper ocean conditions, including the presence of a low-salinity barrier layer; allowed a detailed analysis of the upper ocean response to the hurricane-force winds of Gonzalo; improved the initialization of the ocean in a coupled ocean-atmosphere numerical model; and, together with observations from other ocean observing platforms, substantially reduced the error in intensity forecasts using the HYCOM-HWRF model. Data collected by this project are transmitted in real time to the Global Telecommunication System, distributed through the institutional web pages, by the IOOS Glider Data Assembly Center, and by NCEI, and assimilated in real-time numerical weather forecast models.
Fast Kalman Filter for Random Walk Forecast model
NASA Astrophysics Data System (ADS)
Saibaba, A.; Kitanidis, P. K.
2013-12-01
Kalman filtering is a fundamental tool in statistical time series analysis for understanding the dynamics of large systems for which limited, noisy observations are available. However, standard implementations of the Kalman filter are prohibitively expensive, requiring O(N^2) memory and O(N^3) computation, where N is the dimension of the state variable. In this work, we focus on the random walk forecast model, which assumes the state transition matrix is the identity. This model is frequently adopted when data are acquired at a timescale faster than the dynamics of the state variables and there is considerable uncertainty as to the physics governing the state evolution. We derive an efficient representation for the a priori and a posteriori estimate covariance matrices as a weighted sum of two contributions - the process noise covariance matrix and a low-rank term containing eigenvectors from a generalized eigenvalue problem (GEP), which combines information from the noise covariance matrix and the data. We describe an efficient algorithm to update the weights of the above terms and to compute the eigenmodes of the GEP. The resulting algorithm for the Kalman filter with the random walk forecast model scales as O(N) or O(N log N), both in memory and computational cost. This opens up the possibility of real-time adaptive experimental design and optimal control in systems of much larger dimension than was previously feasible. For a small number of measurements (~300-400), this procedure can be made numerically exact. However, as the number of measurements increases, for several choices of measurement operators and noise covariance matrices, the spectrum of the GEP decays rapidly and we are justified in retaining only the dominant eigenmodes. We discuss tradeoffs between accuracy and computational cost. The resulting algorithms are applied to an example application from ray-based travel time tomography.
A Capacity Forecast Model for Volatile Data in Maintenance Logistics
NASA Astrophysics Data System (ADS)
Berkholz, Daniel
2009-05-01
Maintenance, repair and overhaul (MRO) processes are elaborate and complex. Rising demands on these after-sales services require reliable production planning and control methods, particularly for maintaining valuable capital goods. Downtimes lead to high costs, and an inability to meet delivery due dates results in severe contract penalties. Predicting the required capacities for maintenance orders in advance is often difficult because part conditions remain unknown until the goods are actually inspected. This planning uncertainty results in extensive capital tie-up through rising stock levels across the whole MRO network. The article outlines an approach to capacity planning when maintenance forecast data are volatile. It focuses on the development of prerequisites for a reliable capacity planning model. This enables a quick response to maintenance orders by employing appropriate measures. The information gained through the model is then systematically applied to forecast both personnel capacities and the demand for spare parts. The improved planning reliability can support MRO service providers in shortening delivery times and reducing stock levels in order to enhance the performance of their maintenance logistics.
A study on characteristics of retrospective optimal interpolation with WRF testbed
NASA Astrophysics Data System (ADS)
Kim, S.; Noh, N.; Lim, G.
2012-12-01
This study presents the application of retrospective optimal interpolation (ROI) with the Weather Research and Forecasting (WRF) model. Song et al. (2009) proposed the ROI method, an optimal interpolation (OI) scheme that gradually assimilates observations over the analysis window for a variance-minimum estimate of the atmospheric state at the initial time of the window. Song and Lim (2011) improved the method by incorporating eigen-decomposition and covariance inflation. The ROI method assimilates data at post-analysis times using a perturbation method (Errico and Raeder, 1999) and requires no adjoint model. In this study, the ROI method is applied to the WRF model to validate the algorithm and to investigate its capability. The computational cost of ROI can be reduced thanks to the eigen-decomposition of the background error covariance. Using the background error covariance in eigen-space, a single-profile assimilation experiment is performed. The difference between forecast errors with and without assimilation grows clearly as time passes, indicating that assimilation improves the forecast. The characteristics and strengths/weaknesses of the ROI method are investigated by conducting experiments with other data assimilation methods.
Solar Resource Assessment with Sky Imagery and a Virtual Testbed for Sky Imager Solar Forecasting
NASA Astrophysics Data System (ADS)
Kurtz, Benjamin Bernard
In recent years, ground-based sky imagers have emerged as a promising tool for forecasting solar energy on short time scales (0 to 30 minutes ahead). Following the development of sky imager hardware and algorithms at UC San Diego, we present three new or improved algorithms for sky imager forecasting and forecast evaluation. First, we present an algorithm for measuring irradiance with a sky imager. Sky imager forecasts are often used in conjunction with other instruments for measuring irradiance, so this has the potential to decrease instrumentation costs and logistical complexity. In particular, the forecast algorithm itself often relies on knowledge of the current irradiance which can now be provided directly from the sky images. Irradiance measurements are accurate to within about 10%. Second, we demonstrate a virtual sky imager testbed that can be used for validating and enhancing the forecast algorithm. The testbed uses high-quality (but slow) simulations to produce virtual clouds and sky images. Because virtual cloud locations are known, much more advanced validation procedures are possible with the virtual testbed than with measured data. In this way, we are able to determine that camera geometry and non-uniform evolution of the cloud field are the two largest sources of forecast error. Finally, with the assistance of the virtual sky imager testbed, we develop improvements to the cloud advection model used for forecasting. The new advection schemes are 10-20% better at short time horizons.
A multiscale forecasting method for power plant fleet management
NASA Astrophysics Data System (ADS)
Chen, Hongmei
In recent years the electric power industry has been challenged by a high level of uncertainty and volatility brought on by deregulation and globalization. A power producer must minimize life cycle cost while meeting stringent safety and regulatory requirements and fulfilling customer demand for high reliability. In response, a more sophisticated system-level decision-making process, backed by a more accurate forecasting support system, has been created to manage diverse and often widely dispersed generation units as a single, easily scaled and deployed fleet and to fully utilize a power producer's critical assets. The process takes into account the time horizon of each major decision action taken in a power plant and develops methods for sharing information between them. These decisions are highly interrelated, and no optimal operation can be achieved without sharing information across the overall process. The process includes a forecasting system to provide information for planning under uncertainty. A new forecasting method is proposed, which combines several modeling techniques at the different time-scales of the forecast objects. It can not only take advantage of the abundant historical data but also account for the impact of pertinent driving forces from the external business environment, achieving more accurate forecasting results. Block bootstrap is then utilized to measure the bias in the estimate of the expected life cycle cost that will actually be needed to run the business of a power plant in the long term. Finally, scenario analysis is used to provide a composite picture of future developments for decision making and strategic planning. The decision-making process is applied to a typical power producer chosen to represent challenging customer demand during high-demand periods. The process enhances system excellence by providing more accurate market information, evaluating the impact of the external business environment, and considering cross-scale interactions between decision actions. Along with this process, system operation strategies, maintenance schedules, and capacity expansion plans that guide the operation of the power plant are optimally identified, and the total life cycle costs are estimated.
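The block bootstrap mentioned above resamples contiguous blocks rather than individual points, so serial dependence in the cost series is preserved while the sampling distribution of an estimator (here, the mean) is built up. A minimal sketch with a synthetic autocorrelated series; the block length and data are assumptions for illustration:

```python
import numpy as np

def block_bootstrap_mean(series, block_len, n_boot, rng):
    """Moving-block bootstrap: resample contiguous blocks to preserve
    autocorrelation, then estimate the sampling distribution of the mean.
    The replicate mean minus the sample mean estimates the bias."""
    n = len(series)
    starts = n - block_len + 1
    reps = []
    for _ in range(n_boot):
        idx = rng.integers(0, starts, size=int(np.ceil(n / block_len)))
        sample = np.concatenate([series[i:i + block_len] for i in idx])[:n]
        reps.append(sample.mean())
    return np.array(reps)

rng = np.random.default_rng(3)
# Hypothetical autocorrelated annual-cost series (arbitrary units)
costs = 100 + 0.1 * np.cumsum(rng.normal(0, 2, 120))
reps = block_bootstrap_mean(costs, block_len=12, n_boot=2000, rng=rng)
print("bias estimate:", reps.mean() - costs.mean(),
      "| 95% interval:", np.percentile(reps, [2.5, 97.5]))
```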
A combined road weather forecast system to prevent road ice formation in the Adige Valley (Italy)
NASA Astrophysics Data System (ADS)
Di Napoli, Claudia; Piazza, Andrea; Antonacci, Gianluca; Todeschini, Ilaria; Apolloni, Roberto; Pretto, Ilaria
2016-04-01
Road ice is a dangerous meteorological hazard to a nation's transportation system and economy. By reducing the friction between pavement and vehicle tyres, ice formation on pavements increases accident risk and travel times, posing a serious threat to road users' safety and the running of economic activities. Keeping roads clear and open is therefore essential, especially in mountainous areas where ice is likely to form during the winter period. Winter road maintenance helps to restore road efficiency and safety, and its benefits are up to 8 times the costs sustained for anti-icing strategies [1]. However, the optimization of maintenance costs and the reduction of the environmental damage from over-salting demand further improvements. These can be achieved by reliable road weather forecasts, and in particular by the prediction of road surface temperatures (RSTs). RST is one of the most important parameters in determining road surface conditions. It is well known from the literature that ice forms on pavements in high-humidity conditions when RSTs are below 0°C. We have therefore implemented an automatic forecast system to predict critical RSTs on a test route along the complex terrain of the Adige Valley, in the Italian Alps. The system considers two physical models, each computing heat and energy fluxes between the road and the atmosphere. The first is Reuter's radiative cooling model, which predicts RSTs at sunrise as a function of surface temperatures at sunset and the time elapsed since then [2]. The second is METRo (Model of the Environment and Temperature of Roads), a road weather forecast software package which also considers heat conduction through the road material [3]. We have applied the forecast system to a network of road weather stations (road weather information system, RWIS) installed on the test route [4]. Road and atmospheric observations from the RWIS have been used as initial conditions for both METRo and Reuter's model. In METRo, observations have also been coupled to meteorological forecasts from the ECMWF numerical prediction model. Overnight RST minima have then been estimated automatically in nowcast mode. In this presentation we show and discuss results and performance for the 2014-2015 and 2015-2016 winter seasons. Using evaluation indexes we demonstrate that combining METRo and Reuter's models into a single forecast system improves bias and accuracy by about 0.5°C. This study is supported by the LIFE11 ENV/IT/000002 CLEAN-ROADS project. The project aims to assess the environmental impact of salt de-icers in the Trentino mountain region by supporting winter road management operations with meteorological information. [1] Thornes J.E. and Stephenson D.B., Meteorological Applications, 8:307 (2001) [2] Reuter H., Tellus, 3:141 (1951) [3] Crevier L.P. and Delage Y., Journal of Applied Meteorology, 40:2026 (2001) [4] Pretto I. et al., SIRWEC 2014 conference proceedings, ID:0019 (2014)
The costs of climate change: ecosystem services and wildland fires
In this paper we use Habitat Equivalency Analysis (HEA) to monetize the avoided ecosystem services losses due to climate change-induced wildland fires in the U.S. Specifically, we use the U.S. Forest Service’s MC1 dynamic global vegetation model to forecast changes in wildland fi...
A Comparative Verification of Forecasts from Two Operational Solar Wind Models (Postprint)
2012-02-08
much confidence to place on predicted parameters. Cost/benefit information is provided to administrators who decide to sustain or replace existing...magnetic field magnitude and three components of the magnetic field vector in the geocentric solar magnetospheric (GSM) coordinate system at each hour of
Probabilistic Predictions of PM2.5 Using a Novel Ensemble Design for the NAQFC
NASA Astrophysics Data System (ADS)
Kumar, R.; Lee, J. A.; Delle Monache, L.; Alessandrini, S.; Lee, P.
2017-12-01
Poor air quality (AQ) in the U.S. is estimated to cause about 60,000 premature deaths, with costs of $100B-$150B annually. To reduce such losses, the National AQ Forecasting Capability (NAQFC) at the National Oceanic and Atmospheric Administration (NOAA) produces forecasts of ozone, particulate matter less than 2.5 μm in diameter (PM2.5), and other pollutants so that advance notice and warning can be issued to help individuals and communities limit exposure and reduce air pollution-caused health problems. The current NAQFC, based on the U.S. Environmental Protection Agency Community Multi-scale AQ (CMAQ) modeling system, provides only deterministic AQ forecasts and does not quantify the uncertainty associated with the predictions, which can be large due to the chaotic nature of the atmosphere and the nonlinearity of atmospheric chemistry. This project aims to take NAQFC a step further in the direction of probabilistic AQ prediction by exploring and quantifying the potential value of ensemble predictions of PM2.5, perturbing three key aspects of PM2.5 modeling: the meteorology, the emissions, and the CMAQ secondary organic aerosol formulation. This presentation focuses on the impact of meteorological variability, which is represented by three members of NOAA's Short-Range Ensemble Forecast (SREF) system that were down-selected by hierarchical cluster analysis. These three SREF members provide the physics configurations and initial/boundary conditions for the Weather Research and Forecasting (WRF) model runs that generate the output variables required for driving CMAQ that are missing from operational SREF output. We conducted WRF runs for January, April, July, and October 2016 to capture seasonal changes in meteorology. Emissions of trace gases and aerosols were estimated with the Sparse Matrix Operator Kernel Emissions (SMOKE) system using the WRF output. The WRF and SMOKE output drive a 3-member CMAQ mini-ensemble of once-daily, 48-h PM2.5 forecasts for the same four months. The CMAQ mini-ensemble is evaluated against both observations and the current operational deterministic NAQFC products, and analyzed to assess the impact of meteorological biases on PM2.5 variability. Quantifying the PM2.5 prediction uncertainty is expected to be a key factor in supporting cost-effective decision-making while protecting public health.
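The down-selection of ensemble members by hierarchical cluster analysis can be sketched generically: summarize each member by a few forecast features, cluster, and keep one representative per cluster. The features, member count, and selection rule below are assumptions for illustration, not the project's actual configuration:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(6)
# Hypothetical: 21 ensemble members summarized by domain-mean 2-m temperature (K),
# wind speed (m/s), and boundary-layer height (m) forecasts.
members = rng.normal(size=(21, 3)) * [2.0, 1.5, 300.0] + [285.0, 6.0, 900.0]
z = (members - members.mean(0)) / members.std(0)        # standardize features
labels = fcluster(linkage(z, method="ward"), t=3, criterion="maxclust")
# Keep one representative per cluster (first member here, for brevity;
# the actual selection criterion may differ).
picks = [int(np.where(labels == k)[0][0]) for k in sorted(set(labels))]
print("representative members:", picks)
```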
Economic assessment of flood forecasts for a risk-averse decision-maker
NASA Astrophysics Data System (ADS)
Matte, Simon; Boucher, Marie-Amélie; Boucher, Vincent; Fortier-Filion, Thomas-Charles
2017-04-01
A large effort has been made over the past 10 years to promote the operational use of probabilistic or ensemble streamflow forecasts. It has also been suggested in past studies that ensemble forecasts might possess a greater economic value than deterministic forecasts. However, the vast majority of recent hydro-economic literature is based on the cost-loss ratio framework, which is appealing for its simplicity and intuitiveness. One important drawback of the cost-loss ratio is that it implicitly assumes a risk-neutral decision maker. By definition, a risk-neutral individual is indifferent to forecast sharpness: as long as forecasts agree with observations on average, the risk-neutral individual is satisfied. A risk-averse individual, however, is sensitive to the level of precision (sharpness) of forecasts. This person is willing to pay to increase his or her certainty about future events. In fact, this is how insurance companies operate: the probability of seeing one's house burn down is relatively low, so the expected cost related to such an event is also low. However, people are willing to buy insurance to avoid the risk, however small, of losing everything. Similarly, in a context where people's safety and property are at stake, the typical decision maker is more risk-averse than risk-neutral. Consequently, the cost-loss ratio is not the most appropriate tool to assess the economic value of flood forecasts. This presentation describes a more realistic framework for assessing the economic value of such forecasts for flood mitigation purposes. Borrowing from economics, the Constant Absolute Risk Aversion (CARA) utility function is the central tool of this new framework. Utility functions allow explicitly accounting for the level of risk aversion of the decision maker and fully exploiting the information related to ensemble forecast uncertainty. Three concurrent ensemble streamflow forecasting systems are compared in terms of quality (comparison with observed values) and in terms of their economic value. This assessment is performed for lead times of one to five days. The three systems are: (1) simple statistically dressed deterministic forecasts, (2) forecasts based on meteorological ensembles, and (3) a variant of the latter that also includes an estimation of state-variable uncertainty. The comparison takes place on the Montmorency River, a small flood-prone watershed in south central Quebec, Canada. The results show that forecast quality, as assessed by well-known tools such as the Continuous Ranked Probability Score or the reliability diagram, does not necessarily translate directly into economic value, especially if the decision maker is not risk-neutral. In addition, results show that the economic value of forecasts for a risk-averse decision maker is very much influenced by the most extreme members of ensemble forecasts (the upper tail of the predictive distributions). This study provides a new basis for further improving our comprehension of the complex interactions between forecast uncertainty, risk aversion and decision-making.
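The CARA framework lends itself to a compact illustration: with utility U(x) = -exp(-a x)/a, a risky payoff X is valued by its certainty equivalent CE = -(1/a) ln E[exp(-a X)], which penalizes spread more as the risk-aversion coefficient a grows. The payoff series below are invented to show the mechanics, not results from the Montmorency study:

```python
import numpy as np

def certainty_equivalent(outcomes, a):
    """CARA utility U(x) = -exp(-a*x)/a; the certainty equivalent of a risky
    monetary outcome X is CE = -(1/a) * ln(E[exp(-a*X)]). A risk-averse
    decision maker (a > 0) values a forecast system by the CE it attains."""
    return -np.log(np.mean(np.exp(-a * np.asarray(outcomes)))) / a

# Hypothetical net payoffs (k$) of acting on two forecasting systems over events:
deterministic = np.array([12.0, -30.0, 15.0, 10.0, -25.0, 14.0])  # occasional big losses
ensemble      = np.array([8.0, -5.0, 9.0, 7.0, -4.0, 8.0])        # milder losses
for a in (0.001, 0.05, 0.2):                                      # rising risk aversion
    print(a, certainty_equivalent(deterministic, a),
          certainty_equivalent(ensemble, a))
```

As a approaches zero the certainty equivalent tends to the plain mean (the risk-neutral, cost-loss view); as a grows, the system with milder worst cases is valued increasingly higher, which is the abstract's central point.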
76 FR 9696 - Equipment Price Forecasting in Energy Conservation Standards Analysis
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-22
... for particular efficiency design options, an empirical experience curve fit to the available data may be used to forecast future costs of such design option technologies. If a statistical evaluation indicates a low level of confidence in estimates of the design option cost trend, this method should not be...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-27
... assessment, planning, forecasting, cost/benefit analysis, and to target areas for research. DATES: Written..., forecasting, cost/benefit analysis, and to target areas for research. Respondents: Approximately 83,500 owners... invites public comments about our intention to request the Office of Management and Budget (OMB) approval...
A triangular climate-based decision model to forecast crop anomalies in Kenya
NASA Astrophysics Data System (ADS)
Guimarães Nobre, G.; Davenport, F.; Veldkamp, T.; Jongman, B.; Funk, C. C.; Husak, G. J.; Ward, P.; Aerts, J.
2017-12-01
By the end of 2017, the world is expected to experience unprecedented demands for food assistance: across 45 countries, some 81 million people will face a food security crisis. Prolonged droughts in Eastern Africa are playing a major role in these crises. To mitigate famine risk and save lives, government bodies and international donor organisations are increasingly building up efforts to resolve conflicts and secure humanitarian relief. Disaster-relief and financing organizations traditionally focus on emergency response, providing aid after an extreme drought event, instead of taking actions in advance based on early warning. One of the reasons for this approach is that the seasonal risk information provided by early warning systems is often considered highly uncertain. Overcoming the reluctance to act on early warnings greatly relies on understanding the risk of acting in vain and assessing the cost-effectiveness of early actions. This research develops a triangular climate-based decision model for multiple seasonal time scales to forecast strong anomalies in crop yield shortages in Kenya using causal discovery algorithms and fast-and-frugal decision trees. This triangular decision model (1) estimates the causality and strength of the relationship between crop yields and hydroclimatological predictors (extracted from the Famine Early Warning Systems Network's data archive) during the crop growing season; (2) provides probabilistic forecasts of crop yield shortages at multiple time scales before the harvesting season; and (3) evaluates the cost-effectiveness of different financial mechanisms for responding to early warning indicators of crop yield shortages obtained from the model. Furthermore, we reflect on how such a model complements and advances the current state-of-the-art FEWS NET system, and examine its potential application to improve the management of agricultural risks in Kenya.
NASA Astrophysics Data System (ADS)
Krietemeyer, Andreas; ten Veldhuis, Marie-claire; van de Giesen, Nick
2017-04-01
Recent research has shown that assimilation of Precipitable Water Vapor (PWV) measurements into numerical weather prediction models improves the quality of rainfall now- and forecasting. Local PWV fluctuations may be related to water vapor increases in the lower troposphere which lead to deep convection. Prior studies show that about 20 minutes before rain occurs, the amount of water vapor in the atmosphere at 1 km height increases. Monitoring the small-scale temporal and spatial variability of PWV is therefore crucial for improving now- and forecasting of convective storms, which are typically critical for urban stormwater systems. One established technique for obtaining PWV measurements in the atmosphere is to exploit signal delays from GNSS satellites to dual-frequency receivers on the ground. Existing dual-frequency receiver networks typically have inter-station distances on the order of tens of kilometers, which is not sufficiently dense to capture the small-scale PWV variations. In this study, we will add low-cost, single-frequency GNSS receivers to an existing dual-frequency receiver network to obtain an inter-station distance of about 1 km in the Rotterdam area (Netherlands). The aim is to investigate the spatial variability of PWV in the atmosphere at this scale. We use the surrounding dual-frequency network (distributed over a radius of approximately 25 km) to apply an ionospheric delay model that accounts for the delay in the ionosphere (50-1000 km altitude), which cannot be eliminated by single-frequency receivers. The results are validated by co-aligning a single-frequency receiver with a dual-frequency receiver. In the next steps, we will investigate how the network's high temporal and increased spatial resolution can help improve high-resolution rainfall forecasts. The anticipated forecast improvements will be evaluated against high-resolution rainfall estimates from a polarimetric X-band rainfall radar installed in the city of Rotterdam.
Integrated risk/cost planning models for the US Air Traffic system
NASA Technical Reports Server (NTRS)
Mulvey, J. M.; Zenios, S. A.
1985-01-01
A prototype network planning model for the U.S. Air Traffic control system is described. The model encompasses the dual objectives of managing collision risks and transportation costs where traffic flows can be related to these objectives. The underlying structure is a network graph with nonseparable convex costs; the model is solved efficiently by capitalizing on its intrinsic characteristics. Two specialized algorithms for solving the resulting problems are described: (1) truncated Newton, and (2) simplicial decomposition. The feasibility of the approach is demonstrated using data collected from a control center in the Midwest. Computational results with different computer systems are presented, including a vector supercomputer (CRAY-XMP). The risk/cost model has two primary uses: (1) as a strategic planning tool using aggregate flight information, and (2) as an integrated operational system for forecasting congestion and monitoring (controlling) flow throughout the U.S. In the latter case, access to a supercomputer is required due to the model's enormous size.
Forecasting US ivacaftor outcomes and cost in cystic fibrosis patients with the G551D mutation.
Dilokthornsakul, Piyameth; Hansen, Ryan N; Campbell, Jonathan D
2016-06-01
Ivacaftor, a breakthrough treatment for cystic fibrosis (CF) patients with the G551D genetic mutation, lacks long-term clinical and cost projections. This study forecasted outcomes and cost by comparing ivacaftor plus usual care versus usual care alone. A lifetime Markov model was developed from a US payer perspective. The model consisted of five health states: 1) forced expiratory volume in 1 s (FEV1) % pred ≥70%, 2) 40%≤ FEV1 % pred <70%, 3) FEV1 % pred <40%, 4) lung transplantation and 5) death. All inputs were extracted from published literature. Budget impact was also estimated. We estimated ivacaftor's improvement in outcomes compared with a non-CF referent population. Ivacaftor was associated with 18.25 (95% credible interval (CrI) 13.71-22.20) additional life-years and 15.03 (95% CrI 11.13-18.73) additional quality-adjusted life-years (QALYs). Ivacaftor was associated with improvements in survival and QALYs equivalent to 68% and 56%, respectively, of the survival and QALY gaps between CF usual care and non-CF peers. The incremental lifetime cost was $3,374,584. The budget impact was $0.087 per member per month. Ivacaftor increased life-years and QALYs in CF patients with the G551D mutation, and moved morbidity and mortality closer to those of their non-CF peers. Ivacaftor costs much more than usual care, but comes at a relatively limited budget impact. Copyright ©ERS 2016.
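The five-state Markov structure described above can be sketched as a simple cohort simulation; the transition probabilities, utilities, costs, discount rate, and horizon below are placeholders, not the study's calibrated inputs:

```python
import numpy as np

# States: FEV1>=70%, 40-70%, <40%, transplant, death (all numbers hypothetical).
P = np.array([                       # annual transition probabilities (rows sum to 1)
    [0.90, 0.08, 0.01, 0.00, 0.01],
    [0.05, 0.85, 0.07, 0.01, 0.02],
    [0.00, 0.05, 0.83, 0.06, 0.06],
    [0.00, 0.00, 0.00, 0.92, 0.08],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])
utility = np.array([0.85, 0.75, 0.60, 0.70, 0.0])    # QALY weight per state-year
cost = np.array([20e3, 35e3, 60e3, 150e3, 0.0])      # annual cost per state ($)

def run_cohort(P, utility, cost, horizon=60, disc=0.03):
    state = np.array([1.0, 0.0, 0.0, 0.0, 0.0])      # cohort starts in mildest state
    qalys = dollars = 0.0
    for t in range(horizon):
        d = (1 + disc) ** -t                         # discount factor for cycle t
        qalys += d * state @ utility
        dollars += d * state @ cost
        state = state @ P                            # advance the cohort one cycle
    return qalys, dollars

print(run_cohort(P, utility, cost))
```

Running the same cohort twice, with treatment-modified transition probabilities and drug cost added, yields the incremental QALYs and lifetime cost the abstract reports.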
Economic evaluation of a solar hot-water-system
NASA Technical Reports Server (NTRS)
1981-01-01
Analysis shows economic benefits at six representative sites using actual data from Tempe, Arizona and San Diego, California installations. Model is two-tank cascade water heater with flat-plate collector array for single-family residences. Performances are forecast for Albuquerque, New Mexico; Fort Worth, Texas; Madison, Wisconsin; and Washington, D.C. Costs are compared to net energy savings using variables for each site's environmental conditions, loads, fuel costs, and other economic factors; uncertainty analysis is included.
Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment
NASA Astrophysics Data System (ADS)
Jordan, T. H.; the International Commission on Earthquake Forecasting for Civil Protection
2011-12-01
Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated for reliability and skill by retrospective testing, and the models should be under continuous prospective testing against long-term forecasts and alternative time-dependent models. (d) Short-term models used in operational forecasting should be consistent with the long-term forecasts used in probabilistic seismic hazard analysis. (e) Alert procedures should be standardized to facilitate decisions at different levels of government, based in part on objective analysis of costs and benefits. (f) In establishing alert protocols, consideration should also be given to the less tangible aspects of value-of-information, such as gains in psychological preparedness and resilience. Authoritative statements of increased risk, even when the absolute probability is low, can provide a psychological benefit to the public by filling information vacuums that lead to informal predictions and misinformation. Formal OEF procedures based on probabilistic forecasting appropriately separate hazard estimation by scientists from the decision-making role of civil protection authorities. The prosecution of seven Italian scientists on manslaughter charges stemming from their actions before the L'Aquila earthquake makes clear why this separation should be explicit in defining OEF protocols.
Microgrid optimal scheduling considering impact of high penetration wind generation
NASA Astrophysics Data System (ADS)
Alanazi, Abdulaziz
The objective of this thesis is to study the impact of high-penetration wind energy on the economic and reliable operation of microgrids. Wind power is variable, i.e., constantly changing, and nondispatchable, i.e., it cannot be controlled by the microgrid controller. Accurate forecasting of wind power is therefore an essential task for studying its impact on microgrid operation. Two commonly used forecasting methods, the Autoregressive Integrated Moving Average (ARIMA) and the Artificial Neural Network (ANN), are used in this thesis to improve wind power forecasting. The forecasting error is calculated using the Mean Absolute Percentage Error (MAPE) and is reduced using the ANN. The wind forecast is further used in the microgrid optimal scheduling problem. The microgrid optimal scheduling is performed by developing a viable model for security-constrained unit commitment (SCUC) based on mixed-integer linear programming (MILP). The proposed SCUC is solved for various wind penetration levels, and the relationship between the total cost and the wind power penetration is established. In order to reduce microgrid power transfer fluctuations, an additional constraint that controls time-based fluctuations is proposed and added to the SCUC formulation. The impact of the constraint on the microgrid SCUC results is tested and validated with numerical analysis. Finally, the applicability of the proposed models is demonstrated through numerical simulations.
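The ARIMA-plus-MAPE part of the workflow is straightforward to sketch with statsmodels; the synthetic wind series, model order, and train/test split below are illustrative assumptions (the thesis's ANN stage is omitted):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(4)
# Hypothetical hourly wind-power series (MW), persistent around a 40 MW mean.
wind = np.empty(500)
wind[0] = 40.0
for t in range(1, 500):
    wind[t] = 40.0 + 0.9 * (wind[t - 1] - 40.0) + rng.normal(0, 3.0)

train, test = wind[:480], wind[480:]
fit = ARIMA(train, order=(1, 0, 1)).fit()             # ARIMA(p,d,q) chosen ad hoc
fcst = fit.forecast(steps=len(test))                  # 20-hour-ahead forecast

mape = np.mean(np.abs((test - fcst) / test)) * 100.0  # Mean Absolute Percentage Error
print(f"MAPE: {mape:.2f}%")
```

In the thesis's setup, a forecast like this (refined by the ANN) becomes the wind-availability input to the MILP-based SCUC.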
A Case Study of the Impact of AIRS Temperature Retrievals on Numerical Weather Prediction
NASA Technical Reports Server (NTRS)
Reale, O.; Atlas, R.; Jusem, J. C.
2004-01-01
Large errors in numerical weather prediction are often associated with explosive cyclogenesis. Most studies focus on the under-forecasting error, i.e., cases of rapidly developing cyclones that are poorly predicted in numerical models. However, the over-forecasting error (i.e., predicting an explosively developing cyclone that does not occur in reality) is a very common error that severely impacts the forecasting skill of all models and may also carry economic costs when associated with operational forecasting: unnecessary precautions taken by marine activities can result in severe economic loss. Moreover, frequent over-forecasting can undermine confidence in operational weather forecasting. Therefore, it is important to understand and reduce predictions of extreme weather associated with explosive cyclones that do not actually develop. In this study we choose a very prominent case of over-forecasting error in the northwestern Pacific. A 960 hPa cyclone develops in less than 24 hours in the 5-day forecast, with a deepening rate of about 30 hPa in one day. The cyclone is not present in the analyses and is thus a case of severe over-forecasting. Assimilating AIRS data largely eliminates the error. By following the propagation of the anomaly that generates the spurious cyclone, it is found that a small mid-tropospheric geopotential height negative anomaly over the northern part of the Indian subcontinent in the initial conditions propagates eastward, is amplified by orography, and generates a very intense jet streak in the subtropical jet stream, with consequent explosive cyclogenesis over the Pacific. The AIRS assimilation eliminates this anomaly, which may have been caused by erroneous upper-air data, and represents the jet stream more correctly. The energy associated with the jet is distributed over a much broader area and, as a consequence, a multiple but much more moderate cyclogenesis is observed.
Uncertainty analysis of geothermal energy economics
NASA Astrophysics Data System (ADS)
Sener, Adil Caner
This dissertation explores geothermal energy economics by assessing and quantifying the uncertainties associated with the nature of geothermal energy and energy investments overall. The study introduces a stochastic geothermal cost model and a valuation approach for different geothermal power plant development scenarios. The Monte Carlo simulation technique is employed to obtain probability distributions of geothermal energy development costs and project net present values. A stochastic cost model with an incorporated dependence structure is defined and compared with a model in which the random variables are treated as independent inputs. One goal of the study is to shed light on the long-standing problem of modeling dependence between random input variables, which is addressed here with the method of copulas. The study focuses on four main types of geothermal power generation technologies and introduces a stochastic levelized cost model for each technology. Moreover, we also compare the levelized costs of natural gas combined cycle and coal-fired power plants with those of geothermal power plants. The input data used in the model rely on cost data recently reported by government agencies and non-profit organizations, such as the Department of Energy, the national laboratories, the California Energy Commission and the Geothermal Energy Association. The second part of the study introduces the stochastic discounted cash flow valuation model for the geothermal technologies analyzed in the first phase. In this phase of the study, the Integrated Planning Model (IPM) software was used to forecast the revenue streams of geothermal assets under different price and regulation scenarios. These results are then combined to create a stochastic revenue forecast for the power plants. The uncertainties in gas prices and environmental regulations are modeled and their potential impacts are captured in the valuation model. Finally, the study compares the probability distributions of development cost and project value and discusses the market penetration potential of geothermal power generation. There is recent worldwide interest in geothermal utilization projects. There are several reasons for the recent popularity of geothermal energy, including the increasing volatility of fossil fuel prices, the need for domestic energy sources, approaching carbon emission limitations and state renewable energy standards, the increasing need for baseload units, and new technology that makes geothermal energy more attractive for power generation. It is our hope that this study will contribute to the recent progress of geothermal energy by shedding light on the uncertainty of geothermal energy project costs.
2007-01-01
focus on identifying growth by income and housing costs. These and other models are focused on the city itself and deal with growth over the course...2. This model employs a set of econometric models to project future population, households, and employment. The landscape is gridded into one... model in LEAM (LEAMecon) forecasts changes in output, employment and income over time based on changes in the market, technology, productivity and
Prediction of Weather Impacted Airport Capacity using Ensemble Learning
NASA Technical Reports Server (NTRS)
Wang, Yao Xun
2011-01-01
Ensemble learning with the Bagging Decision Tree (BDT) model was used to assess the impact of weather on airport capacities at selected high-demand airports in the United States. The ensemble bagging decision tree models were developed and validated using the Federal Aviation Administration (FAA) Aviation System Performance Metrics (ASPM) data and weather forecasts at these airports. The study examines the performance of the BDT, along with a traditional single Support Vector Machine (SVM), for airport runway configuration selection and airport arrival rate (AAR) prediction during weather impacts. Testing of these models was accomplished using observed weather, weather forecasts, and airport operation information at the chosen airports. The experimental results show that the ensemble method is more accurate than a single SVM classifier. The airport capacity ensemble method presented here can be used as a decision support model that helps air traffic flow management cope with weather-impacted airport capacity in order to reduce costs and increase safety.
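A minimal scikit-learn sketch of the BDT-versus-SVM comparison; the weather features, synthetic labels, and hyperparameters are invented stand-ins for the ASPM data, not the study's setup. Note that scikit-learn's BaggingClassifier uses a decision tree as its default base learner:

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(5)
n = 2000
# Hypothetical features: [visibility_mi, wind_kt, ceiling_100ft, precip_flag]
X = np.column_stack([
    rng.uniform(0.25, 10.0, n), rng.uniform(0.0, 40.0, n),
    rng.uniform(1.0, 250.0, n), rng.integers(0, 2, n).astype(float),
])
# Synthetic label: reduced arrival-rate regime under low visibility/ceiling or high wind
y = ((X[:, 0] < 3.0) | (X[:, 2] < 10.0) | (X[:, 1] > 25.0)).astype(int)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
bdt = BaggingClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)  # bagged trees
svm = SVC().fit(Xtr, ytr)
print("BDT accuracy:", accuracy_score(yte, bdt.predict(Xte)),
      "| SVM accuracy:", accuracy_score(yte, svm.predict(Xte)))
```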
Forecasting and evaluating patterns of energy development in southwestern Wyoming
Garman, Steven L.
2015-01-01
The effects of future oil and natural gas development in southwestern Wyoming on wildlife populations are topical to conservation of the sagebrush steppe ecosystem. To aid in understanding these potential effects, the U.S. Geological Survey developed an Energy Footprint simulation model that forecasts the amount and pattern of energy development under different assumptions of development rates and well-drilling methods. The simulated disturbance patterns produced by the footprint model are used to assess the potential effects on wildlife habitat and populations. A goal of this modeling effort is to use measures of energy production (number of simulated wells), well-pad and road-surface disturbance, and potential effects on wildlife to identify build-out designs that minimize the physical and ecological footprint of energy development for different levels of energy production and development costs.
Forecasting staffing needs for productivity management in hospital laboratories.
Pang, C Y; Swint, J M
1985-12-01
Daily and weekly prediction models are developed to help forecast hospital laboratory work load for the entire laboratory and individual sections of the laboratory. The models are tested using historical data obtained from hospital census and laboratory log books of a 90-bed southwestern hospital. The results indicate that the predictor variables account for 50%, 81%, 56%, and 82% of the daily work load variation for chemistry, hematology, and microbiology sections, and for the entire laboratory, respectively. Equivalent results for the weekly model are 53%, 72%, 12%, and 78% for the same respective sections. On the basis of the predicted work load, staffing assessment is made and a productivity monitoring system constructed. The purpose of such a system is to assist laboratory management in efforts to utilize laboratory manpower in a more efficient and cost-effective manner.
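A hedged sketch of a prediction model of this general type: ordinary least squares relating daily workload to census-derived predictors, followed by a staffing conversion. All variable names, coefficients, and the workload-to-FTE factor are illustrative assumptions, not the paper's values.

```python
# Hypothetical illustration: daily lab workload regression and staffing estimate.
import numpy as np

rng = np.random.default_rng(1)
days = 365
census = rng.poisson(70, days)          # daily census of a ~90-bed hospital
admissions = rng.poisson(12, days)      # daily admissions
workload = 400 + 12 * census + 30 * admissions + rng.normal(0, 150, days)

X = np.column_stack([np.ones(days), census, admissions])
beta, *_ = np.linalg.lstsq(X, workload, rcond=None)
pred = X @ beta
r2 = 1 - ((workload - pred) ** 2).sum() / ((workload - workload.mean()) ** 2).sum()

# Staffing: assume 1 workload unit = 1 tech-minute and 8-hour shifts.
fte = pred / (60 * 8)
print(f"R^2 = {r2:.2f}; mean staffing need = {fte.mean():.1f} FTE/day")
```

With these assumed coefficients the R² lands near 0.5, the same order as the daily chemistry figure reported above.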
NASA Astrophysics Data System (ADS)
Habert, J.; Ricci, S.; Le Pape, E.; Thual, O.; Piacentini, A.; Goutal, N.; Jonville, G.; Rochoux, M.
2016-01-01
This paper presents a data-driven hydrodynamic simulator based on a 1-D hydraulic solver dedicated to flood forecasting with lead times of 1 h up to 24 h. The goal of the study is to reduce uncertainties in the hydraulic model and thus provide more reliable simulations and forecasts in real time for operational use by the national hydrometeorological flood forecasting center in France. Previous studies have shown that sequential assimilation of water level or discharge data makes it possible to adjust the inflows to the hydraulic network, resulting in a significant improvement of the discharge while leaving the water level state imperfect. Two strategies are proposed here to improve the water level-discharge relation in the model. First, a modeling strategy improves the description of the river bed geometry using topographic and bathymetric measurements. Second, an inverse modeling strategy locally corrects friction coefficients in the river bed and the flood plain through the assimilation of in situ water level measurements. This approach is based on an Extended Kalman filter algorithm that sequentially assimilates data to infer first the upstream and lateral inflows and then the friction coefficients. It provides a time-varying correction of the hydrological boundary conditions and hydraulic parameters. The merits of both strategies are demonstrated on the Marne catchment in France for eight validation flood events, and the January 2004 flood event is used as an illustrative example throughout the paper. The Nash-Sutcliffe criterion for water level is improved from 0.135 to 0.832 for a 12-h forecast lead time with the data assimilation strategy. These developments have been implemented at the SAMA SPC (local flood forecasting service in the Haute-Marne French department) and used for operational forecasting since 2013. They were shown to provide an efficient tool for evaluating flood risk and to improve the flood early warning system. Complementing the deterministic forecast of the hydraulic state, an uncertainty range is estimated using off-line and on-line diagnostics. The possibilities of further extending the control vector while limiting the computational cost and the equifinality problem are finally discussed.
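A generic sketch of the sequential correction idea, not the operational code: one Extended Kalman filter analysis step that nudges a friction coefficient (Strickler-type K_s) toward an observed water level, with a finite-difference Jacobian standing in for the hydraulic solver's sensitivity. The stage-friction relation and all numbers are assumptions.

```python
# Hypothetical illustration: EKF update of a friction coefficient from a level obs.
import numpy as np

def h_model(ks):
    # Toy stage-friction relation standing in for the 1-D hydraulic solver:
    # a smoother bed (higher K_s) yields a lower water level.
    return 6.0 * (30.0 / ks) ** 0.6

def ekf_update(ks, P, h_obs, R=0.05**2, eps=0.1):
    H = (h_model(ks + eps) - h_model(ks - eps)) / (2 * eps)  # linearized sensitivity
    S = H * P * H + R                                        # innovation variance
    K = P * H / S                                            # Kalman gain
    return ks + K * (h_obs - h_model(ks)), (1 - K * H) * P

ks, P = 30.0, 25.0                      # prior friction and its variance (assumed)
for h_obs in (6.4, 6.3, 6.35):          # synthetic in situ water levels (m)
    ks, P = ekf_update(ks, P, h_obs)
print(f"updated K_s = {ks:.1f}, variance = {P:.2f}")
```

The paper's control vector is richer (inflows first, then friction in the river bed and flood plain), but each analysis step has this same gain-times-innovation form.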
The 30/20 GHz fixed communications systems service demand assessment. Volume 3: Annex
NASA Technical Reports Server (NTRS)
Gamble, R. B.; Seltzer, H. R.; Speter, K. M.; Westheimer, M.
1979-01-01
A review of studies forecasting the communication market in the United States is given. The applicability of these forecasts to assessment of demand for the 30/20 GHz fixed communications system is analyzed. Costs for the 30/20 satellite trunking systems are presented and compared with the cost of terrestrial communications.
Spatially explicit forecasts of large wildland fire probability and suppression costs for California
Haiganoush Preisler; Anthony L. Westerling; Krista M. Gebert; Francisco Munoz-Arriola; Thomas P. Holmes
2011-01-01
In the last decade, increases in fire activity and suppression expenditures have caused budgetary problems for federal land management agencies. Spatial forecasts of upcoming fire activity and costs have the potential to help reduce expenditures, and increase the efficiency of suppression efforts, by enabling them to focus resources where they have the greatest effect...
Time-varying loss forecast for an earthquake scenario in Basel, Switzerland
NASA Astrophysics Data System (ADS)
Herrmann, Marcus; Zechar, Jeremy D.; Wiemer, Stefan
2014-05-01
When an unexpected earthquake occurs, people suddenly want advice on how to cope with the situation. The 2009 L'Aquila quake highlighted the significance of public communication and pushed the use of scientific methods to drive alternative risk mitigation strategies. For instance, van Stiphout et al. (2010) suggested a new approach for objective short-term evacuation decisions: probabilistic risk forecasting combined with cost-benefit analysis. In the present work, we apply this approach to an earthquake sequence that simulated a repeat of the 1356 Basel earthquake, one of the most damaging events in Central Europe. A recent development to benefit society in case of an earthquake is probabilistic forecasting of aftershock occurrence, but seismic risk delivers a more direct expression of the socio-economic impact. To forecast the seismic risk on short timescales, we translate aftershock probabilities into time-varying seismic hazard and combine this with time-invariant loss estimation. Compared with van Stiphout et al. (2010), we use an advanced aftershock forecasting model and detailed settlement data, which allow spatial forecasts and settlement-specific decision-making. We quantify the risk forecast probabilistically in terms of human loss. For instance, one minute after the M6.6 mainshock, the probability for an individual to die within the next 24 hours is 41,000 times higher than the long-term average; but the absolute value remains minor, at 0.04%. The final cost-benefit analysis adds value beyond a pure statistical approach: it provides objective statements that may justify evacuations. To deliver supportive information in a simple form, we propose a warning approach in terms of alarm levels. Our results do not justify evacuations prior to the M6.6 mainshock, but they do in certain districts afterwards. The ability to forecast the short-term seismic risk at any time (and, with sufficient data, anywhere) is the first step of personal decision-making and raising risk awareness among the public. Reference: Van Stiphout, T., S. Wiemer, and W. Marzocchi (2010). 'Are short-term evacuations warranted? Case of the 2009 L'Aquila earthquake'. Geophysical Research Letters 37.6, pp. 1-5. url: http://onlinelibrary.wiley.com/doi/10.1029/2009GL042352/abstract.
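A back-of-envelope sketch of the decision logic, in the spirit of the cost-benefit rule cited above: evacuate when the expected avoided loss exceeds the evacuation cost. Only the 41,000 probability gain and the 0.04% absolute probability come from the abstract; the cost and value-of-statistical-life figures are assumed placeholders.

```python
# Hypothetical illustration: evacuation decision via probabilistic risk x cost-benefit.
baseline_p24 = 1e-8          # assumed long-term daily probability of dying
gain = 41_000                # probability gain one minute after the M6.6 mainshock
p24 = baseline_p24 * gain    # ~4e-4, i.e. the 0.04% quoted above

cost_evacuation = 100.0      # EUR per person per day, assumed
value_stat_life = 2.0e6      # EUR, assumed
expected_avoided_loss = p24 * value_stat_life

print(f"P(death, 24 h) = {p24:.2%}")
print("evacuate?", expected_avoided_loss > cost_evacuation)
```

The same inequality, evaluated per district with settlement-specific probabilities, is what turns a probabilistic forecast into the alarm levels the authors propose.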
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parks, K.; Wan, Y. H.; Wiener, G.
2011-10-01
The focus of this report is the wind forecasting system developed during this contract period, with results of performance through the end of 2010. The report is intentionally high-level, with technical details disseminated at various conferences and in academic papers. At the end of 2010, Xcel Energy managed the output of 3372 megawatts of installed wind energy. The wind plants span three operating companies, serving customers in eight states, and three market structures. The great majority of the wind energy is contracted through power purchase agreements (PPAs). The remainder is utility owned, Qualifying Facilities (QF), distributed resources (i.e., 'behind the meter'), or merchant entities within Xcel Energy's Balancing Authority footprints. Regardless of the contractual or ownership arrangements, the output of the wind energy is balanced by Xcel Energy's generation resources, which include fossil, nuclear, and hydro-based facilities that are owned or contracted via PPAs. These facilities are committed and dispatched or bid into day-ahead and real-time markets by Xcel Energy's Commercial Operations department. Wind energy complicates the short- and long-term planning goals of least-cost, reliable operations. Due to the uncertainty of wind energy production, the inherently suboptimal commitment and dispatch associated with imperfect wind forecasts drive up costs. For example, a gas combined cycle unit may be turned on, or committed, in anticipation of low winds. In reality, winds stayed high, forcing this unit and others to run, or be dispatched, at sub-optimal loading positions. In addition, commitment decisions are frequently irreversible due to minimum up and down time constraints. That is, a dispatcher lives with inefficient decisions made in prior periods. In general, uncertainty contributes to conservative operations - committing more units and keeping them on longer than may have been necessary for purposes of maintaining reliability. The downside is higher costs. In organized electricity markets, units that are committed for reliability reasons are paid their offer price even when prevailing market prices are lower. Often, these uplift charges are allocated to the market participants that caused the inefficient dispatch in the first place. Thus, wind energy facilities are burdened with their share of costs proportional to their forecast errors. For Xcel Energy, wind energy uncertainty costs manifest differently depending on the specific market structures. In the Public Service of Colorado (PSCo), inefficient commitment and dispatch caused by wind uncertainty increases fuel costs. Wind resources participating in the Midwest Independent System Operator (MISO) footprint make substantial payments in the real-time markets to true-up their day-ahead positions and are additionally burdened with deviation charges called a Revenue Sufficiency Guarantee (RSG) to cover out-of-market costs associated with operations. Southwest Public Service (SPS) wind plants both cause commitment inefficiencies and are charged Southwest Power Pool (SPP) imbalance payments due to wind uncertainty and variability. Wind energy forecasting helps mitigate these costs. Wind integration studies for the PSCo and Northern States Power (NSP) operating companies have projected increasing costs due to forecast error as more wind is installed on the system. It follows that reducing forecast error would reduce these costs.
This is echoed by large-scale studies in neighboring regions and states that have recommended adoption of state-of-the-art wind forecasting tools in day-ahead and real-time planning and operations. Further, Xcel Energy concluded that reducing the normalized mean absolute error by one percent would have reduced costs in 2008 by over $1 million annually in PSCo alone. The value of reducing forecast error prompted Xcel Energy to make substantial investments in wind energy forecasting research and development.
Pricing a Protest: Forecasting the Dynamics of Civil Unrest Activity in Social Media.
Goode, Brian J; Krishnan, Siddharth; Roan, Michael; Ramakrishnan, Naren
2015-01-01
Online social media activity can often be a precursor to disruptive events such as protests, strikes, and "occupy" movements. We have observed that such civil unrest can galvanize supporters through social networks and help recruit activists to their cause. Understanding the dynamics of social network cascades and extrapolating their future growth will enable an analyst to detect or forecast major societal events. Existing work has primarily used structural and temporal properties of cascades to predict their future behavior. But factors like societal pressure, alignment of individual interests with broader causes, and perception of expected benefits also affect protest participation in social media. Here we develop an analysis framework using a differential game theoretic approach to characterize the cost of participating in a cascade, and demonstrate how we can combine such cost features with classical properties to forecast the future behavior of cascades. Using data from Twitter, we illustrate the effectiveness of our models on the "Brazilian Spring" and Venezuelan protests that occurred in June 2013 and November 2013, respectively. We demonstrate how our framework captures both qualitative and quantitative aspects of how these uprisings manifest through the lens of tweet volume on Twitter social media.
NASA Technical Reports Server (NTRS)
Andrastek, D. A.
1976-01-01
The objectives of this phase of the study were (1) to assess the 10-year operating cost trends of the local service airlines operating in the 1965 through 1974 period, (2) to glean from these trends the technological and operational parameters which were impacted most significantly by the transition to newer pure jet, short haul transports, and affected by changing fuel prices and cost of living indices, and (3) to develop, construct, and evaluate an operating cost forecasting model which would incorporate those factors which best predicted airline total operating cost behavior over that 10-year period.
2015-03-26
acquisition programs' cost and schedule. Many prior studies have focused on the overall cost of programs (the cost estimate at completion (EAC)) (Smoker ...regression (Smoker, 2011), the Kalman Filter Forecasting Method (Kim, 2007), and analysis of the Integrated Master Schedule (IMS). All of the...A study by Smoker demonstrated this technique by first regressing the BCWP against months and the same approach for BAC (2011). In that study
Optimal Day-Ahead Scheduling of a Hybrid Electric Grid Using Weather Forecasts
2013-12-01
Keywords: day-ahead scheduling, weather forecast, wind power, photovoltaic power. ...cost can be reached by accurately anticipating the future renewable power productions. This thesis suggests the use of weather forecasts to establish day-ahead...
Lu, Wei-Zhen; Wang, Wen-Jian; Wang, Xie-Kang; Yan, Sui-Hang; Lam, Joseph C
2004-09-01
The forecasting of air pollutant trends has received much attention in recent years. It is an important and popular topic in environmental science, as concerns have been raised about the health impacts caused by unacceptable ambient air pollutant levels. Of greatest concern are metropolitan cities like Hong Kong. In Hong Kong, respirable suspended particulates (RSP), nitrogen oxides (NOx), and nitrogen dioxide (NO2) are major air pollutants due to the dominant usage of diesel fuel by commercial vehicles and buses. Hence, the study of the influence and the trends relating to these pollutants is extremely significant to public health and the image of the city. The use of neural network techniques to predict trends relating to air pollutants is regarded as a reliable and cost-effective method for the task of prediction. The work reported here involves developing an improved neural network model that combines the principal component analysis technique and the radial basis function network and forecasts pollutant tendencies based on a recorded database. Compared with general neural network models, the proposed model features a simpler network architecture, a faster training speed, and a more satisfactory prediction performance. The improved model was evaluated with hourly time series of RSP, NOx, and NO2 concentrations monitored at the Mong Kok Roadside Gaseous Monitoring Station in Hong Kong during the year 2000 and proved to be effective. The model developed is a potential tool for forecasting air quality parameters and is superior to traditional neural network methods.
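A sketch of the model family described above on synthetic data, assuming sklearn: PCA compresses the inputs, then a radial basis function network (k-means centres, Gaussian features, linear readout) does the regression. This is not the authors' code, and the latent-factor data generator merely stands in for the lagged pollutant series.

```python
# Hypothetical illustration: PCA + RBF-network forecaster.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n, d, k = 2000, 24, 5
latent = rng.normal(size=(n, k))                       # hidden pollution drivers
X = latent @ rng.normal(size=(k, d)) + 0.3 * rng.normal(size=(n, d))
y = np.sin(latent[:, 0]) + 0.5 * latent[:, 1] ** 2 + 0.1 * rng.normal(size=n)

Z = StandardScaler().fit_transform(PCA(n_components=8).fit_transform(X))
centres = KMeans(n_clusters=20, n_init=10, random_state=0).fit(Z).cluster_centers_

def rbf_features(Z, centres, gamma=0.1):
    d2 = ((Z[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)                         # Gaussian basis activations

model = Ridge(alpha=1e-2).fit(rbf_features(Z, centres), y)
print("in-sample R^2:", r2_score(y, model.predict(rbf_features(Z, centres))))
```

The appeal the abstract points to, a small architecture and fast training, comes from fitting only the linear readout once the centres are fixed.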
Costs of vaccine programs across 94 low- and middle-income countries.
Portnoy, Allison; Ozawa, Sachiko; Grewal, Simrun; Norman, Bryan A; Rajgopal, Jayant; Gorham, Katrin M; Haidari, Leila A; Brown, Shawn T; Lee, Bruce Y
2015-05-07
While new mechanisms such as advance market commitments and co-financing policies of the GAVI Alliance are allowing low- and middle-income countries to gain access to vaccines faster than ever, understanding the full scope of vaccine program costs is essential to ensure adequate resource mobilization. This costing analysis examines the vaccine costs, supply chain costs, and service delivery costs of immunization programs for routine immunization and for supplemental immunization activities (SIAs) for vaccines related to 18 antigens in 94 countries across the decade 2011-2020. Vaccine costs were calculated using GAVI price forecasts for GAVI-eligible countries, and assumptions from the PAHO Revolving Fund and UNICEF for middle-income countries not supported by the GAVI Alliance. Vaccine introductions and coverage levels were projected primarily based on GAVI's Adjusted Demand Forecast. Supply chain costs, including the costs of transportation, storage, and labor, were estimated by developing a mechanistic model using data generated by the HERMES discrete event simulation models. Service delivery costs were abstracted from comprehensive multi-year plans for the majority of GAVI-eligible countries, and regression analysis was conducted to extrapolate costs to additional countries. The analysis shows that the delivery of the full vaccination program across 94 countries would cost a total of $62 billion (95% uncertainty range: $43-$87 billion) over the decade, including $51 billion ($34-$73 billion) for routine immunization and $11 billion ($7-$17 billion) for SIAs. More than half of these costs stem from service delivery at $34 billion ($21-$51 billion), with an additional $24 billion ($13-$41 billion) in vaccine costs and $4 billion ($3-$5 billion) in supply chain costs. The findings present the global costs of attaining the goals envisioned during the Decade of Vaccines to prevent millions of deaths by 2020 through more equitable access to existing vaccines for people in all communities. By projecting the full costs of immunization programs, our findings may help garner greater country and donor commitments toward adequate resource mobilization and efficient allocation. As service delivery costs have increasingly become the main driver of vaccination program costs, it is essential to pay additional consideration to health systems strengthening. Copyright © 2015 Elsevier Ltd. All rights reserved.
Proposed reliability cost model
NASA Technical Reports Server (NTRS)
Delionback, L. M.
1973-01-01
The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between the technical disciplines on one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends upon the use of a series of subsystem-oriented CERs, and sometimes possible CTRs, in devising a suitable cost-effective policy.
Forecasting the future burden of opioids for osteoarthritis.
Ackerman, I N; Zomer, E; Gilmartin-Thomas, J F-M; Liew, D
2018-03-01
To quantify the current national burden of opioids for osteoarthritis (OA) pain in Australia in terms of number of dispensed opioid prescriptions and associated costs, and to forecast the likely burden to the year 2030/31. Epidemiological modelling. Published data were obtained on rates of opioid prescribing for people with OA and national OA prevalence projections. Trends in opioid dispensing from 2006 to 2016, and average costs for common opioid subtypes were obtained from the Pharmaceutical Benefits Scheme and Medicare Australia Statistics. Using these inputs, a model was developed to estimate the likely number of dispensed opioid prescriptions and costs to the public healthcare system by 2030/31. In 2015/16, an estimated 1.1 million opioid prescriptions were dispensed in Australia for 403,954 people with OA (of a total 2.2 million Australians with OA). Based on recent dispensing trends and OA prevalence projections, the number of dispensed opioid prescriptions is expected to nearly triple to 3,032,332 by 2030/31, for an estimated 562,610 people with OA. The estimated cost to the Australian healthcare system was $AUD25.2 million in 2015/16, rising to $AUD72.4 million by 2030/31. OA-related opioid dispensing and associated costs are set to increase substantially in Australia from 2015/16 to 2030/31. Use of opioids for OA pain is concerning given joint disease chronicity and the risk of adverse events, particularly among older people. These projections represent a conservative estimate of the full financial burden given additional costs associated with opioid-related harms and out-of-pocket costs borne by patients. Copyright © 2017 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
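The projection arithmetic implied by the quoted figures can be checked directly; the script below uses only numbers from the abstract, and the final comparison is a rough consistency check rather than the authors' method.

```python
# Consistency check on the figures quoted above.
rx_2016, people_2016 = 1_100_000, 403_954      # scripts and OA users, 2015/16
rx_2031, people_2031 = 3_032_332, 562_610      # projected, 2030/31

print(rx_2016 / people_2016)   # ~2.7 scripts per treated person per year
print(rx_2031 / people_2031)   # ~5.4: the per-person dispensing rate roughly doubles

# Scaling the 2015/16 cost by script volume gives ~AUD 69.5m, close to the
# AUD 72.4m forecast; the gap plausibly reflects shifts in the opioid mix.
print(25.2e6 * rx_2031 / rx_2016)
```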
NASA Astrophysics Data System (ADS)
Thiboult, A.; Anctil, F.
2015-10-01
Forecast reliability and accuracy are prerequisites for successful hydrological applications. This aim may be attained by using data assimilation techniques such as the popular ensemble Kalman filter (EnKF). Despite its recognized capacity to enhance forecasting by creating a new set of initial conditions, implementation tests have mostly been carried out with a single model and few catchments, leading to case-specific conclusions. This paper performs extensive testing to assess ensemble bias and reliability on 20 conceptual lumped models and 38 catchments in the Province of Québec with perfect meteorological forecast forcing. The study confirms that the EnKF is a powerful tool for short-range forecasting but also that it requires a more subtle setting than is frequently recommended. The success of the updating procedure depends to a great extent on the specification of the hyper-parameters. In the implementation of the EnKF, the identification of the hyper-parameters is very unintuitive if the model error is not explicitly accounted for, and best estimates of forcing and observation error lead to overconfident forecasts. It is shown that performance is also related to the choice of updated state variables, and that not all state variables should systematically be updated. Additionally, the improvement over the open-loop scheme depends on the watershed and hydrological model structure, as some models exhibit poor compatibility with EnKF updating. Thus, it is not possible to conclude in detail on a single ideal implementation; conclusions drawn from a single event, catchment, or model are likely to be misleading, since transferring hyper-parameters from one case to another may be hazardous. Finally, achieving reliability and low bias jointly is a daunting challenge, as the optimization of one score is done at the cost of the other.
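For readers unfamiliar with the update being tested, here is a compact stochastic (perturbed-observation) EnKF analysis step in generic numpy; it is a sketch, not the paper's implementation, and the state dimensions and observation operator are illustrative.

```python
# Hypothetical illustration: one stochastic EnKF analysis step.
import numpy as np

def enkf_update(E, y, H, obs_std, rng):
    """E: (n_state, n_ens) forecast ensemble; y: (n_obs,) observation."""
    n_obs, n_ens = len(y), E.shape[1]
    A = E - E.mean(axis=1, keepdims=True)             # state anomalies
    HE = H @ E
    HA = HE - HE.mean(axis=1, keepdims=True)          # observation-space anomalies
    P_yy = HA @ HA.T / (n_ens - 1) + obs_std**2 * np.eye(n_obs)
    P_xy = A @ HA.T / (n_ens - 1)
    K = P_xy @ np.linalg.inv(P_yy)                    # Kalman gain
    Y = y[:, None] + obs_std * rng.normal(size=(n_obs, n_ens))  # perturbed obs
    return E + K @ (Y - HE)

rng = np.random.default_rng(0)
E = rng.normal(50.0, 10.0, size=(3, 50))    # e.g. 3 model stores, 50 members
H = np.array([[0.2, 0.2, 0.6]])             # toy mapping from stores to discharge
E_a = enkf_update(E, np.array([62.0]), H, obs_std=2.0, rng=rng)
print("posterior mean:", E_a.mean(axis=1))
```

The hyper-parameter question the paper raises lives in obs_std and the implicit forcing perturbations: set them too small and P_yy collapses, giving the kind of overconfident forecasts described above.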
Demand forecasting for automotive sector in Malaysia by system dynamics approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zulkepli, Jafri, E-mail: zhjafri@uum.edu.my; Abidin, Norhaslinda Zainal, E-mail: nhaslinda@uum.edu.my; Fong, Chan Hwa, E-mail: hfchan7623@yahoo.com
Proton, as an automotive company, needs to forecast future car demand to assist decision-making related to capacity expansion planning. A forecasting approach based on judgemental or subjective factors is normally used to forecast demand. As a result, the company may overstock, which eventually increases operating costs, or understock, which loses customers. Because the automotive industry involves a high level of complexity and uncertainty, an accurate modelling tool to forecast future automotive demand is required. Hence, the main objective of this paper is to forecast the demand of the Proton car industry in Malaysia using a system dynamics approach. Two intervention scenarios, optimistic and pessimistic, were tested to determine the capacity expansion that can prevent the company from overstocking. Findings from this study highlight that management needs to expand production under the optimistic scenario, whilst the pessimistic scenario suggests otherwise. Finally, this study could help Proton Edar Sdn. Bhd (PESB) manage long-term capacity planning in order to meet future demand for Proton cars.
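A toy stock-and-flow loop of the kind a system dynamics model formalizes: demand drives production, inventory feeds back into both production and the capacity-expansion decision. Every parameter below is an illustrative assumption, not PESB data.

```python
# Hypothetical illustration: a minimal system dynamics loop for car demand.
import numpy as np

months = 60
demand = 8000 + 50 * np.arange(months)      # assumed rising demand (units/month)
capacity, inventory = 9000.0, 5000.0
inv_target, adj_time = 10000.0, 6.0         # months to close the inventory gap

for t in range(months):
    production = min(capacity, demand[t] + (inv_target - inventory) / adj_time)
    inventory += production - demand[t]     # stock integrates the net flow
    if inventory < 0.5 * inv_target:        # optimistic scenario: expand capacity
        capacity *= 1.01

print(f"final capacity = {capacity:.0f} units/month, inventory = {inventory:.0f}")
```

Swapping the expansion rule for a no-expansion (pessimistic) branch and comparing inventory trajectories mirrors the two scenario experiments described above.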
Forecasting land cover change impacts on drinking water treatment costs in Minneapolis, Minnesota
NASA Astrophysics Data System (ADS)
Woznicki, S. A.; Wickham, J.
2017-12-01
Source protection is a critical aspect of drinking water treatment. The benefits of protecting source water quality in reducing drinking water treatment costs are clear. However, forecasting the impacts of environmental change on source water quality and its potential to influence future treatment processes is lacking. The drinking water treatment plant in Minneapolis, MN has recognized that land cover change threatens water quality in its source watershed, the Upper Mississippi River Basin (UMRB). Over 1,000 km2 of forests, wetlands, and grasslands in the UMRB were lost to agriculture from 2008-2013. This trend, coupled with a projected population increase of one million people in Minnesota by 2030, concerns drinking water treatment plant operators in Minneapolis with respect to meeting future demand for clean water in the UMRB. The objective of this study is to relate land cover change (forest and wetland loss, agricultural expansion, urbanization) to changes in treatment costs for the Minneapolis, MN drinking water utility. To do this, we first developed a framework to determine the relationship between land cover change and water quality in the context of recent historical changes and projected future changes in land cover. Next, we coupled a watershed model, the Soil and Water Assessment Tool (SWAT), to projections of land cover change from the FOREcasting SCEnarios of Land-use Change (FORE-SCE) model for the mid-21st century. Using historical Minneapolis drinking water treatment data (chemical usage and costs), source water quality in the UMRB was linked to changes in treatment requirements as a function of projected future land cover change. These analyses will quantify the value of natural landscapes in protecting drinking water quality and future treatment process requirements. In addition, our study provides the Minneapolis drinking water utility with information critical to its planning and capital improvement process.
Estimating the budget impact of orphan drugs in Sweden and France 2013–2020
2014-01-01
Background The growth in expenditure on orphan medicinal products (OMP) across Europe has been identified as a concern. Estimates of future expenditure in Europe have suggested that OMPs could account for a significant proportion of total pharmaceutical expenditure in some countries, but few of these forecasts have been well validated. This analysis aims to establish a robust forecast of the future budget impact of OMPs on the healthcare systems in Sweden and France. Methods A dynamic forecasting model was created to estimate the budget impact of OMPs in Sweden and France between 2013 and 2020. The model used historical data on OMP designation and approval rates to predict the number of new OMPs coming to the market. Average OMP sales were estimated for each year post-launch by regression analysis of historical sales data. Total forecast sales were compared with expected sales of all pharmaceuticals in each country to quantify the relative budget impact. Results The model predicts that by 2020, 152 OMPs will have marketing authorization in Europe. The base case OMP budget impacts are forecast to grow from 2.7% in Sweden and 3.2% in France of total drug expenditure in 2013 to 4.1% in Sweden and 4.9% in France by 2020. The principal driver of expenditure growth is the number of new OMPs obtaining OMP designation. This is tempered by the slowing success rate for new approvals and the loss of intellectual property protection on existing orphan medicines. Given the forward-looking nature of the analysis, uncertainty exists around model parameters and sensitivity analysis found peak year budget impact varying between 2% and 11%. Conclusion The budget impact of OMPs in Sweden and France is likely to remain sustainable over time and a relatively small proportion of total pharmaceutical expenditure. This forecast could be affected by changes in the success rate for OMP approvals, average cost of OMPs, and the type of companies developing OMPs. PMID:24524281
Estimating the budget impact of orphan drugs in Sweden and France 2013-2020.
Hutchings, Adam; Schey, Carina; Dutton, Richard; Achana, Felix; Antonov, Karolina
2014-02-13
The growth in expenditure on orphan medicinal products (OMP) across Europe has been identified as a concern. Estimates of future expenditure in Europe have suggested that OMPs could account for a significant proportion of total pharmaceutical expenditure in some countries, but few of these forecasts have been well validated. This analysis aims to establish a robust forecast of the future budget impact of OMPs on the healthcare systems in Sweden and France. A dynamic forecasting model was created to estimate the budget impact of OMPs in Sweden and France between 2013 and 2020. The model used historical data on OMP designation and approval rates to predict the number of new OMPs coming to the market. Average OMP sales were estimated for each year post-launch by regression analysis of historical sales data. Total forecast sales were compared with expected sales of all pharmaceuticals in each country to quantify the relative budget impact. The model predicts that by 2020, 152 OMPs will have marketing authorization in Europe. The base case OMP budget impacts are forecast to grow from 2.7% in Sweden and 3.2% in France of total drug expenditure in 2013 to 4.1% in Sweden and 4.9% in France by 2020. The principal driver of expenditure growth is the number of new OMPs obtaining OMP designation. This is tempered by the slowing success rate for new approvals and the loss of intellectual property protection on existing orphan medicines. Given the forward-looking nature of the analysis, uncertainty exists around model parameters and sensitivity analysis found peak year budget impact varying between 2% and 11%. The budget impact of OMPs in Sweden and France is likely to remain sustainable over time and a relatively small proportion of total pharmaceutical expenditure. This forecast could be affected by changes in the success rate for OMP approvals, average cost of OMPs, and the type of companies developing OMPs.
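A schematic of the forecasting model both records describe: project new approvals per year, give each launch cohort an average sales curve by year post-launch, and sum across cohorts. The approval rate and sales curve below are assumptions, not the fitted values from the paper.

```python
# Hypothetical illustration: cohort-based OMP budget impact forecast.
import numpy as np

years = np.arange(2013, 2021)
new_omps = np.full(len(years), 9)                 # assumed approvals per year
# Assumed average annual sales (EUR m) by year since launch (a fitted curve
# in the paper, via regression on historical sales).
sales_curve = np.array([5, 15, 30, 45, 55, 60, 62, 63])

total = np.zeros(len(years))
for i in range(len(years)):           # cohort launched in year 2013 + i
    for j in range(i, len(years)):    # contributes sales in every later year
        total[j] += new_omps[i] * sales_curve[j - i]

for yr, t in zip(years, total):
    print(yr, f"{t:.0f} EUR m")
```

Dividing each year's total by forecast all-pharmaceutical sales yields budget-impact percentages of the kind the abstract reports.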
Ensemble data assimilation in the Red Sea: sensitivity to ensemble selection and atmospheric forcing
NASA Astrophysics Data System (ADS)
Toye, Habib; Zhan, Peng; Gopalakrishnan, Ganesh; Kartadikaria, Aditya R.; Huang, Huang; Knio, Omar; Hoteit, Ibrahim
2017-07-01
We present our efforts to build an ensemble data assimilation and forecasting system for the Red Sea. The system consists of the high-resolution Massachusetts Institute of Technology general circulation model (MITgcm) to simulate ocean circulation and the Data Research Testbed (DART) for ensemble data assimilation. DART has been configured to integrate all members of an ensemble adjustment Kalman filter (EAKF) in parallel, based on which we adapted the ensemble operations in DART to use an invariant ensemble, i.e., an ensemble Optimal Interpolation (EnOI) algorithm. This approach requires only a single forward model integration in the forecast step and therefore saves substantial computational cost. To deal with the strong seasonal variability of the Red Sea, the EnOI ensemble is then seasonally selected from a climatology of long-term model outputs. Observations of remote sensing sea surface height (SSH) and sea surface temperature (SST) are assimilated every 3 days. Real-time atmospheric fields from the National Center for Environmental Prediction (NCEP) and the European Centre for Medium-Range Weather Forecasts (ECMWF) are used as forcing in different assimilation experiments. We investigate the behaviors of the EAKF and (seasonal) EnOI and compare their performances for assimilating and forecasting the circulation of the Red Sea. We further assess the sensitivity of the assimilation system to various filtering parameters (ensemble size, inflation) and atmospheric forcing.
Real Time Volcanic Cloud Products and Predictions for Aviation Alerts
NASA Technical Reports Server (NTRS)
Krotkov, Nickolay A.; Habib, Shahid; da Silva, Arlindo; Hughes, Eric; Yang, Kai; Brentzel, Kelvin; Seftor, Colin; Li, Jason Y.; Schneider, David; Guffanti, Marianne;
2014-01-01
Volcanic eruptions can inject significant amounts of sulfur dioxide (SO2) and volcanic ash into the atmosphere, posing a substantial risk to aviation safety. Ingesting near-real-time and Direct Readout satellite volcanic cloud data is vital for improving the reliability of volcanic ash forecasts and mitigating the effects of volcanic eruptions on aviation and the economy. NASA volcanic products from the Ozone Monitoring Instrument (OMI) aboard the Aura satellite have been incorporated into the Decision Support Systems of many operational agencies. With the Aura mission approaching its 10th anniversary, there is an urgent need to replace OMI data with those from the next-generation operational NASA/NOAA Suomi National Polar-orbiting Partnership (SNPP) satellite. The data provided from these instruments are being incorporated into forecasting models to provide quantitative ash forecasts for air traffic management. This study demonstrates the feasibility of the volcanic near-real-time and Direct Readout data products from the new Ozone Mapping and Profiler Suite (OMPS) ultraviolet sensor onboard SNPP for monitoring and forecasting volcanic clouds. The transition of NASA data production to our operational partners is outlined. Satellite observations are used to constrain volcanic cloud simulations and improve estimates of eruption parameters, resulting in more accurate forecasts. This is demonstrated for the 2012 eruption of Copahue. Volcanic eruptions are modeled using the Goddard Earth Observing System, Version 5 (GEOS-5) and the Goddard Chemistry Aerosol and Radiation Transport (GOCART) model. A hindcast of the disruptive eruption of Iceland's Eyjafjallajokull is used to estimate aviation re-routing costs using Metron Aviation's ATM Tools.
Accounting for groundwater in stream fish thermal habitat responses to climate change
Snyder, Craig D.; Hitt, Nathaniel P.; Young, John A.
2015-01-01
Forecasting climate change effects on aquatic fauna and their habitat requires an understanding of how water temperature responds to changing air temperature (i.e., thermal sensitivity). Previous efforts to forecast climate effects on brook trout habitat have generally assumed uniform air-water temperature relationships over large areas that cannot account for groundwater inputs and other processes that operate at finer spatial scales. We developed regression models that accounted for groundwater influences on thermal sensitivity from measured air-water temperature relationships within forested watersheds in eastern North America (Shenandoah National Park, USA, 78 sites in 9 watersheds). We used these reach-scale models to forecast climate change effects on stream temperature and brook trout thermal habitat, and compared our results to previous forecasts based upon large-scale models. Observed stream temperatures were generally less sensitive to air temperature than previously assumed, and we attribute this to the moderating effect of shallow groundwater inputs. Predicted groundwater temperatures from air-water regression models corresponded well to observed groundwater temperatures elsewhere in the study area. Predictions of brook trout future habitat loss derived from our fine-grained models were far less pessimistic than those from prior models developed at coarser spatial resolutions. However, our models also revealed spatial variation in thermal sensitivity within and among catchments resulting in a patchy distribution of thermally suitable habitat. Habitat fragmentation due to thermal barriers therefore may have an increasingly important role for trout population viability in headwater streams. Our results demonstrate that simple adjustments to air-water temperature regression models can provide a powerful and cost-effective approach for predicting future stream temperatures while accounting for effects of groundwater.
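A minimal version of the reach-scale building block: regress paired water temperatures on air temperatures at one site; the slope is that site's thermal sensitivity, and a slope well below 1 flags groundwater buffering. Data and parameters below are synthetic assumptions.

```python
# Hypothetical illustration: air-water regression and thermal sensitivity.
import numpy as np

rng = np.random.default_rng(3)
t_air = rng.uniform(-5, 30, 200)                 # paired mean air temps (degC)
slope_true, intercept_true = 0.45, 4.0           # a groundwater-buffered site
t_water = intercept_true + slope_true * t_air + rng.normal(0, 0.8, 200)

slope, intercept = np.polyfit(t_air, t_water, 1)
print(f"thermal sensitivity = {slope:.2f} degC/degC (1.0 = no buffering)")
# Under +4 degC of air warming, projected stream warming is slope * 4, not 4,
# which is why the fine-grained forecasts above are less pessimistic.
```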
NASA Astrophysics Data System (ADS)
Skamarock, W. C.
2015-12-01
One of the major problems in atmospheric model applications is the representation of deep convection within the models; explicit simulation of deep convection on fine meshes performs much better than sub-grid parameterized deep convection on coarse meshes. Unfortunately, the high cost of explicit convective simulation has meant it has only been used to down-scale global simulations in weather prediction and regional climate applications, typically using traditional one-way interactive nesting technology. We have been performing real-time weather forecast tests using a global non-hydrostatic atmospheric model (the Model for Prediction Across Scales, MPAS) that employs a variable-resolution unstructured Voronoi horizontal mesh (nominally hexagons) to span hydrostatic to nonhydrostatic scales. The smoothly varying Voronoi mesh eliminates many downscaling problems encountered using traditional one- or two-way grid nesting. Our test weather forecasts cover two periods - the 2015 Spring Forecast Experiment conducted at the NOAA Storm Prediction Center during the month of May in which we used a 50-3 km mesh, and the PECAN field program examining nocturnal convection over the US during the months of June and July in which we used a 15-3 km mesh. An important aspect of this modeling system is that the model physics be scale-aware, particularly the deep convection parameterization. These MPAS simulations employ the Grell-Freitas scale-aware convection scheme. Our test forecasts show that the scheme produces a gradual transition in the deep convection, from the deep unstable convection being handled entirely by the convection scheme on the coarse mesh regions (dx > 15 km), to the deep convection being almost entirely explicit on the 3 km NA region of the meshes. We will present results illustrating the performance of critical aspects of the MPAS model in these tests.
Forecasting Lightning Threat using Cloud-resolving Model Simulations
NASA Technical Reports Server (NTRS)
McCaul, E. W., Jr.; Goodman, S. J.; LaCasse, K. M.; Cecil, D. J.
2009-01-01
As numerical forecasts capable of resolving individual convective clouds become more common, it is of interest to see if quantitative forecasts of lightning flash rate density are possible, based on fields computed by the numerical model. Previous observational research has shown robust relationships between observed lightning flash rates and inferred updraft and large precipitation ice fields in the mixed phase regions of storms, and that these relationships might allow simulated fields to serve as proxies for lightning flash rate density. It is shown in this paper that two simple proxy fields do indeed provide reasonable and cost-effective bases for creating time-evolving maps of predicted lightning flash rate density, judging from a series of diverse simulation case study events in North Alabama for which Lightning Mapping Array data provide ground truth. One method is based on the product of upward velocity and the mixing ratio of precipitating ice hydrometeors, modeled as graupel only, in the mixed phase region of storms at the -15 °C level, while the second method is based on the vertically integrated amounts of ice hydrometeors in each model grid column. Each method can be calibrated by comparing domainwide statistics of the peak values of simulated flash rate proxy fields against domainwide peak total lightning flash rate density data from observations. Tests show that the first method is able to capture much of the temporal variability of the lightning threat, while the second method does a better job of depicting the areal coverage of the threat. A blended solution is designed to retain most of the temporal sensitivity of the first method, while adding the improved spatial coverage of the second. Weather Research and Forecast Model simulations of selected North Alabama cases show that this model can distinguish the general character and intensity of most convective events, and that the proposed methods show promise as a means of generating quantitatively realistic fields of lightning threat. However, because models tend to have more difficulty in correctly predicting the instantaneous placement of storms, forecasts of the detailed location of the lightning threat based on single simulations can be in error. Although these model shortcomings presently limit the precision of lightning threat forecasts from individual runs of current generation models, the techniques proposed herein should continue to be applicable as newer and more accurate physically-based model versions, physical parameterizations, initialization techniques and ensembles of cloud-allowing forecasts become available.
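The two proxies lend themselves to a very small schematic; the calibration constants below are placeholders for the domain-wide calibration against Lightning Mapping Array peak flash rates described above.

```python
# Hypothetical illustration: the two lightning-threat proxy fields and a blend.
import numpy as np

def proxy1(w_m15, q_graupel_m15, k1=1.0):
    """Flash rate ~ updraft speed (m/s) times graupel mixing ratio (kg/kg)
    at the -15 degC level; tracks temporal variability well."""
    return k1 * w_m15 * q_graupel_m15

def proxy2(rho, q_ice, dz, k2=1.0):
    """Flash rate ~ vertically integrated ice (kg/m^2) per grid column;
    depicts areal coverage better."""
    return k2 * (rho * q_ice * dz).sum(axis=0)

def blended(f1, f2, w1=0.95):
    """Keep proxy 1's time sensitivity, add proxy 2's spatial coverage."""
    return w1 * f1 + (1.0 - w1) * f2

w = np.array([8.0, 15.0])          # updraft at -15 degC, two columns (assumed)
qg = np.array([1e-3, 3e-3])        # graupel mixing ratio (assumed)
print(proxy1(w, qg))
```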
NASA Astrophysics Data System (ADS)
Bogner, Konrad; Monhart, Samuel; Liniger, Mark; Spirig, Christoph; Jordan, Fred; Zappa, Massimiliano
2015-04-01
In recent years, large progress has been achieved in the operational prediction of floods and hydrological droughts with lead times of up to ten days. Both the public and the private sectors currently use probabilistic runoff forecasts to monitor water resources and take action when critical conditions are expected. The use of extended-range predictions with lead times exceeding 10 days is not yet established. The hydropower sector in particular might benefit greatly from using hydrometeorological forecasts for the next 15 to 60 days in order to optimize the operations and the revenues from their watersheds, dams, captions, turbines and pumps. The new Swiss Competence Centers in Energy Research (SCCER) aim at boosting research related to energy issues in Switzerland. The objective of HEPS4POWER is to demonstrate that operational extended-range hydrometeorological forecasts have the potential to become very valuable tools for fine-tuning the production of energy from hydropower systems. The project team covers a specific system-oriented value chain, starting from the collection and forecasting of meteorological data (MeteoSwiss), leading to the operational application of state-of-the-art hydrological models (WSL), and terminating with experience in data presentation and power production forecasts for end-users (e-dric.ch). The first task of HEPS4POWER will be the downscaling and post-processing of ensemble extended-range meteorological forecasts (EPS). The goal is to provide well-tailored forecasts of a probabilistic nature that are statistically reliable and localized at the catchment or even station level. The hydrology-related task will consist of feeding the post-processed meteorological forecasts into a HEPS using a multi-model approach, implementing models of different complexity. Also in the case of the hydrological ensemble predictions, post-processing techniques need to be tested in order to improve the quality of the forecasts against observed discharge. The analysis should be specifically oriented to the maximisation of hydroelectricity production. Thus, verification metrics should include economic measures like cost-loss approaches. The final step will include the transfer of the HEPS system to several hydropower systems, the connection with energy market prices, and the development of probabilistic multi-reservoir production and management optimization guidelines. The baseline model chain yielding three-day forecasts established for a hydropower system in southern Switzerland will be presented alongside the work plan to achieve seasonal ensemble predictions.
2012-08-01
between fat score (Helms and Drury 1960) and the condition index (R2 = 0.56, P < 0.001). A condition index of zero corresponds to zero fat stores or...where bird-wildlife/aircraft collisions threaten lives and cost millions of dollars in damages to aircraft infrastructure every year. By identifying...from bird-aircraft strikes (Dolbeer 2006). In the United States, collisions between aircraft and wildlife cost the aviation industry over $600
Miyakawa, Tomoki; Satoh, Masaki; Miura, Hiroaki; Tomita, Hirofumi; Yashiro, Hisashi; Noda, Akira T.; Yamada, Yohei; Kodama, Chihiro; Kimoto, Masahide; Yoneyama, Kunio
2014-01-01
Global cloud/cloud system-resolving models are perceived to perform well in the prediction of the Madden–Julian Oscillation (MJO), a huge eastward-propagating atmospheric pulse that dominates intraseasonal variation of the tropics and affects the entire globe. However, owing to model complexity, detailed analysis is limited by computational power. Here we carry out a simulation series using a recently developed supercomputer, which enables the statistical evaluation of the MJO prediction skill of a costly new-generation model in a manner similar to operational forecast models. We estimate the current MJO predictability of the model as 27 days by conducting simulations including all winter MJO cases identified during 2003–2012. The simulated precipitation patterns associated with different MJO phases compare well with observations. An MJO case captured in a recent intensive observation is also well reproduced. Our results reveal that the global cloud-resolving approach is effective in understanding the MJO and in providing month-long tropical forecasts. PMID:24801254
Miyakawa, Tomoki; Satoh, Masaki; Miura, Hiroaki; Tomita, Hirofumi; Yashiro, Hisashi; Noda, Akira T; Yamada, Yohei; Kodama, Chihiro; Kimoto, Masahide; Yoneyama, Kunio
2014-05-06
Global cloud/cloud system-resolving models are perceived to perform well in the prediction of the Madden-Julian Oscillation (MJO), a huge eastward-propagating atmospheric pulse that dominates intraseasonal variation of the tropics and affects the entire globe. However, owing to model complexity, detailed analysis is limited by computational power. Here we carry out a simulation series using a recently developed supercomputer, which enables the statistical evaluation of the MJO prediction skill of a costly new-generation model in a manner similar to operational forecast models. We estimate the current MJO predictability of the model as 27 days by conducting simulations including all winter MJO cases identified during 2003-2012. The simulated precipitation patterns associated with different MJO phases compare well with observations. An MJO case captured in a recent intensive observation is also well reproduced. Our results reveal that the global cloud-resolving approach is effective in understanding the MJO and in providing month-long tropical forecasts.
Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game
NASA Astrophysics Data System (ADS)
Arnal, Louise; Ramos, Maria-Helena; Coughlan, Erin; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk-Jan; Pappenberger, Florian
2016-04-01
Forecast uncertainty is a twofold issue, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic forecasts over deterministic forecasts for a diversity of activities in the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partly to the difficulty of transforming the probability of occurrence of an event into a binary decision. The setup and the results of a risk-based decision-making experiment, designed as a game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?", will be presented. The game was played at several workshops in 2015, including during this session at the EGU conference in 2015, and a total of 129 worksheets were collected and analysed. The aim of this experiment was to contribute to the understanding of the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game showed that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performance as decision-makers. Balancing avoided costs and the cost (or the benefit) of having forecasts available for making decisions is not straightforward, even in a simplified game situation, and is a topic that deserves more attention from the hydrological forecasting community in the future.
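The binary-decision difficulty the abstract mentions has a standard framing, the cost-loss model, which also underlies willingness-to-pay reasoning: with protection cost C and avoidable loss L, acting pays off when the forecast probability p exceeds C/L. The numbers below are illustrative, not taken from the game.

```python
# Hypothetical illustration: cost-loss decision rule for a probabilistic forecast.
def should_protect(p, cost=20_000.0, loss=100_000.0):
    """Act when expected avoided loss p*L exceeds protection cost C."""
    return p * loss > cost

for p in (0.1, 0.2, 0.5):
    print(f"p = {p}: protect = {should_protect(p)}")   # threshold is C/L = 0.2
```

A rational player's willingness-to-pay for the forecast is then bounded by the expected-cost difference between deciding with and without it.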
Coupling Radar Rainfall to Hydrological Models for Water Abstraction Management
NASA Astrophysics Data System (ADS)
Asfaw, Alemayehu; Shucksmith, James; Smith, Andrea; MacDonald, Ken
2015-04-01
The impacts of climate change and growing water use are likely to put considerable pressure on water resources and the environment. In the UK, a reform to surface water abstraction policy has recently been proposed which aims to increase the efficiency of using available water resources whilst minimising impacts on the aquatic environment. Key aspects of this reform include the consideration of dynamic rather than static abstraction licensing as well as introducing water trading concepts. Dynamic licensing will permit varying levels of abstraction dependent on environmental conditions (i.e. river flow and quality). The practical implementation of an effective dynamic abstraction strategy requires suitable flow forecasting techniques to inform abstraction asset management. Potentially, the predicted availability of water resources within a catchment can be coupled to predicted demand and current storage to inform a cost-effective water resource management strategy which minimises environmental impacts. The aim of this work is to use a historical analysis of a UK case study catchment to compare potential water resource availability under a modelled dynamic abstraction scenario informed by a flow forecasting model, against observed abstraction under a conventional abstraction regime. The work also demonstrates the impacts of modelling uncertainties on the accuracy of predicted water availability over a range of forecast lead times. The study utilised the conceptual rainfall-runoff model PDM (Probability-Distributed Model, developed by the Centre for Ecology & Hydrology) set up in the Dove River catchment (UK), using 1 km2 resolution radar rainfall as input and 15 min resolution gauged flow data for calibration and validation. Data assimilation procedures are implemented to improve flow predictions using observed flow data. Uncertainties in the radar rainfall data used in the model are quantified using an artificial statistical error model described by a Gaussian distribution and propagated through the model to assess their influence on the forecast flow uncertainty. Furthermore, the effects of uncertainties at different forecast lead times on potential abstraction strategies are assessed. The results show that over a 10-year period, an average of approximately 70 ML/d of potential water is missed in the study catchment under a conventional abstraction regime. This indicates considerable potential for the use of flow forecasting models to effectively implement advanced abstraction management and more efficiently utilise available water resources in the study catchment.
Post-processing of global model output to forecast point rainfall
NASA Astrophysics Data System (ADS)
Hewson, Tim; Pillosu, Fatima
2016-04-01
ECMWF (the European Centre for Medium range Weather Forecasts) has recently embarked upon a new project to post-process gridbox rainfall forecasts from its ensemble prediction system, to provide probabilistic forecasts of point rainfall. The new post-processing strategy relies on understanding how different rainfall generation mechanisms lead to different degrees of sub-grid variability in rainfall totals. We use a number of simple global model parameters, such as the convective rainfall fraction, to anticipate the sub-grid variability, and then post-process each ensemble forecast into a pdf (probability density function) for a point-rainfall total. The final forecast will comprise the sum of the different pdfs from all ensemble members. The post-processing is essentially a re-calibration exercise, which needs only rainfall totals from standard global reporting stations (and forecasts) to train it. High density observations are not needed. This presentation will describe results from the initial 'proof of concept' study, which has been remarkably successful. Reference will also be made to other useful outcomes of the work, such as gaining insights into systematic model biases in different synoptic settings. The special case of orographic rainfall will also be discussed. Work ongoing this year will also be described. This involves further investigations of which model parameters can provide predictive skill, and will then move on to development of an operational system for predicting point rainfall across the globe. The main practical benefit of this system will be a greatly improved capacity to predict extreme point rainfall, and thereby provide early warnings, for the whole world, of flash flood potential for lead times that extend beyond day 5. This will be incorporated into the suite of products output by GLOFAS (the GLObal Flood Awareness System) which is hosted at ECMWF. As such this work offers a very cost-effective approach to satisfying user needs right around the world. This field has hitherto relied on using very expensive high-resolution ensembles; by their very nature these can only run over small regions, and only for lead times up to about 2 days.
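The combination step has a simple mathematical form, a mixture of per-member pdfs. In the sketch below the gamma parameters are assumed stand-ins; in the system described above they would be derived from each member's gridbox total and predictors such as the convective rainfall fraction.

```python
# Hypothetical illustration: point-rainfall forecast as a mixture of member pdfs.
import numpy as np
from scipy import stats

members = [stats.gamma(a=2.0, scale=s) for s in (2.0, 3.0, 5.0)]  # assumed pdfs

x = np.linspace(0.0, 50.0, 501)                               # point rainfall (mm)
forecast_pdf = np.mean([m.pdf(x) for m in members], axis=0)   # equal-weight mixture
p_gt_20mm = np.mean([m.sf(20.0) for m in members])            # P(rain > 20 mm)
print(f"P(point rainfall > 20 mm) = {p_gt_20mm:.3f}")
```

Exceedance probabilities like this one are what would feed flash-flood early warnings at lead times beyond day 5.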
Forecasting fluid milk and cheese demands for the next decade.
Schmit, T M; Kaiser, H M
2006-12-01
Predictions of future market demands and farm prices for dairy products are important determinants in developing marketing strategies and farm-production planning decisions. The objective of this report was to use current aggregate forecast data, combined with existing econometric models of demand and supply, to forecast retail demands for fluid milk and cheese and the supply and price of farm milk over the next decade. In doing so, we can investigate whether projections of population and consumer food-spending patterns will extend or alter current consumption trends and examine the implications of future generic advertising strategies for dairy products. To conduct the forecast simulations and appropriately allocate the farm milk supply to various uses, we used a partial equilibrium model of the US domestic dairy sector that segmented the industry into retail, wholesale, and farm markets. Model simulation results indicated that declines in retail per capita demand would persist but at a reduced rate from years past and that retail per capita demand for cheese would continue to grow and strengthen over the next decade. These predictions rely on expected changes in the size of populations of various ages, races, and ethnicities and on existing patterns of spending on food at home and away from home. The combined effect of these forecasted changes in demand levels was reflected in annualized growth in the total farm-milk supply that was similar to growth realized during the past few years. Although we expect nominal farm milk prices to increase over the next decade, we expect real prices (relative to assumed growth in feed costs) to remain relatively stable and show no increase until the end of the forecast period. Supplemental industry model simulations also suggested that net losses in producer revenues would result if only nominal levels of generic advertising spending were maintained in forthcoming years. In fact, if real generic advertising expenditures are increased relative to 2005 levels, returns to the investment in generic advertising can be improved. Specifically, each additional real dollar invested in generic advertising for fluid milk and cheese products over the forecast period would result in an additional 5.61 dollars in producer revenues.
NASA Astrophysics Data System (ADS)
Brightwell, David A.
2008-04-01
This dissertation examines three facets of U.S. energy use and policy. First, I examine the Gulf Coast petroleum refining industry to determine the structure of the industry. Using the duality between cost-minimization and production functions, I estimate the demand for labor to determine the underlying production function. The results indicate that refineries have become more capital intensive due to the relative price increase of labor. The industry has consolidated in response to higher labor costs and costs of environmental compliance. Next, I examine oil production in the United States. An empirical model based on the theoretical framework of Pindyck is used to estimate production. This model differs from previous research by using state level data rather than national level data. The results indicate that the production elasticity with respect to reserves and the price elasticity of supply are both inelastic in the long run. The implication of these findings is that policies designed to increase domestic production through subsidies, tax breaks, or royalty reductions will likely provide little additional oil. We simulate production under three scenarios. In the most extreme scenario, prices double between 2005 and 2030 while reserves increase by 50%. Under this scenario, oil production in 2030 is approximately the same as the 2005 level. The third essay estimates demand for fossil fuels in the U.S. and uses these estimates to forecast CO2 emissions. The results indicate that there is almost no substitution from one fossil fuel to another and that all three fossil fuels are inelastic in the long run. Additionally, all three fuels respond differently to changes in GDP. The result of the differing elasticities with respect to GDP is that the energy mix has changed over time. The implication for forecasting CO2 emissions is that models that cannot distinguish changes in the energy mix are not effective in forecasting CO2 emissions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finley, Cathy
2014-04-30
This report contains the results from research aimed at improving short-range (0-6 hour) hub-height wind forecasts in the NOAA weather forecast models through additional data assimilation and model physics improvements for use in wind energy forecasting. Additional meteorological observing platforms, including wind profilers, sodars, and surface stations, were deployed for this study by NOAA and DOE, and additional meteorological data at or near wind turbine hub height were provided by South Dakota State University and WindLogics/NextEra Energy Resources over a large geographical area in the U.S. Northern Plains for assimilation into NOAA research weather forecast models. The resulting improvements in wind energy forecasts based on the research weather forecast models (with the additional data assimilation and model physics improvements) were examined in many different ways and compared with wind energy forecasts based on the current operational weather forecast models, to quantify the forecast improvements important to power grid system operators and wind plant owners/operators participating in energy markets. Two operational weather forecast models (OP_RUC, OP_RAP) and two research weather forecast models (ESRL_RAP, HRRR) were used as the base wind forecasts for generating several different wind power forecasts for the NextEra Energy wind plants in the study area. Power forecasts were generated from the wind forecasts in a variety of ways, from very simple to quite sophisticated, as they might be used by a wide range of both general users and commercial wind energy forecast vendors. The error characteristics of each of these types of forecasts were examined and quantified using bulk error statistics for both the local wind plant and the system aggregate forecasts. Wind power forecast accuracy was also evaluated separately for high-impact wind energy ramp events. The bulk error statistics, calculated over the first six hours of the forecasts at both the individual wind plant and the system-wide aggregate level over the one-year study period, showed that the research weather model-based power forecasts (all types) had lower overall error rates than the current operational weather model-based power forecasts. The bulk error statistics of the various model-based power forecasts were also calculated by season and by model runtime/forecast hour, as power system operations are more sensitive to wind energy forecast errors during certain times of year and certain times of day. The results showed significant differences in seasonal forecast errors between the various model-based power forecasts. The analysis by model runtime and forecast hour showed that forecast errors were largest during the times of day with increased significance to power system operators (the overnight hours and the morning/evening boundary layer transition periods), but the research weather model-based power forecasts showed improvement over the operational weather model-based power forecasts at these times.
Evaluation of statistical models for forecast errors from the HBV model
NASA Astrophysics Data System (ADS)
Engeland, Kolbjørn; Renard, Benjamin; Steinsland, Ingelin; Kolberg, Sjur
2010-04-01
Three statistical models for the forecast errors for inflow into the Langvatn reservoir in Northern Norway have been constructed and tested according to the agreement between (i) the forecast distribution and the observations and (ii) median values of the forecast distribution and the observations. For the first model, observed and forecasted inflows were transformed by the Box-Cox transformation before a first-order auto-regressive model was constructed for the forecast errors. The parameters were conditioned on weather classes. In the second model, the Normal Quantile Transformation (NQT) was applied to observed and forecasted inflows before a similar first-order auto-regressive model was constructed for the forecast errors. For the third model, positive and negative errors were modeled separately. The errors were first NQT-transformed before conditioning the mean error values on climate, forecasted inflow and yesterday's error. To test the three models we applied three criteria: we wanted (a) the forecast distribution to be reliable; (b) the forecast intervals to be narrow; (c) the median values of the forecast distribution to be close to the observed values. Models 1 and 2 gave almost identical results. The median values improved the forecast, with the Nash-Sutcliffe efficiency increasing from 0.77 for the original forecast to 0.87 for the corrected forecasts. Models 1 and 2 over-estimated the forecast intervals but gave the narrowest intervals. Their main drawback was that their distributions are less reliable than that of Model 3. For Model 3 the median values did not fit well, since the auto-correlation was not accounted for. Since Model 3 did not benefit from the potential variance reduction that lies in bias estimation and removal, it gave on average wider forecast intervals than the other two models. At the same time, Model 3 on average slightly under-estimated the forecast intervals, probably explained by the use of average measures to evaluate the fit.
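As a concrete illustration of the first model's structure (a Box-Cox transform followed by a first-order auto-regressive model for the forecast errors), here is a minimal Python sketch on synthetic data; the inflow generators, noise levels and the least-squares AR fit are invented for the example, and the conditioning on weather classes is omitted:

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

# Synthetic inflows and raw forecasts (m^3/s); values are invented.
rng = np.random.default_rng(0)
obs = rng.gamma(shape=3.0, scale=10.0, size=500)
fcst = obs * rng.lognormal(mean=0.0, sigma=0.15, size=500)

# Transform both series with a common Box-Cox lambda fitted on the observations.
obs_bc, lam = stats.boxcox(obs)
fcst_bc = stats.boxcox(fcst, lmbda=lam)

# First-order auto-regressive model for the transformed forecast errors,
#   e_t = c + phi * e_{t-1} + eps_t, fitted by least squares.
err = obs_bc - fcst_bc
phi, c = np.polyfit(err[:-1], err[1:], 1)

# One-step-ahead corrected forecast, back-transformed to flow units.
corrected = inv_boxcox(fcst_bc[1:] + c + phi * err[:-1], lam)
```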
NASA Astrophysics Data System (ADS)
Mainardi Fan, Fernando; Schwanenberg, Dirk; Alvarado, Rodolfo; Assis dos Reis, Alberto; Naumann, Steffi; Collischonn, Walter
2016-04-01
Hydropower is the most important electricity source in Brazil. During recent years, it accounted for 60% to 70% of the total electric power supply. Marginal costs of hydropower are lower than for thermal power plants; therefore, there is a strong economic motivation to maximize its share. On the other hand, hydropower depends on the availability of water, which has a natural variability. Its extremes lead to the risks of power production deficits during droughts and safety issues in the reservoir and downstream river reaches during flood events. One building block of the proper management of hydropower assets is the short-term forecast of reservoir inflows as input for an online, event-based optimization of the release strategy. While deterministic forecasts and optimization schemes are the established techniques for short-term reservoir management, the use of probabilistic ensemble forecasts and stochastic optimization techniques is receiving growing attention, and a number of studies have shown their benefits. The present work shows one of the first hindcasting and closed-loop control experiments for a multi-purpose hydropower reservoir in a tropical region in Brazil. The case study is the hydropower project (HPP) Três Marias, located in southeast Brazil. The HPP reservoir is operated with two main objectives: (i) hydroelectricity generation and (ii) flood control at Pirapora City, located 120 km downstream of the dam. In the experiments, precipitation forecasts based on observed data, as well as deterministic and probabilistic forecasts with 50 ensemble members from the ECMWF, are used as forcing of the MGB-IPH hydrological model to generate streamflow forecasts over a period of 2 years. The online optimization relies on a deterministic and a multi-stage stochastic version of a model predictive control scheme. Results for the perfect forecasts show the potential benefit of the online optimization and indicate a desired forecast lead time of 30 days. In comparison, the use of actual forecasts with shorter lead times of up to 15 days shows the practical benefit of actual operational data. It appears that the use of stochastic optimization combined with ensemble forecasts leads to a significantly higher level of flood protection without compromising the HPP's energy production.
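The closed-loop scheme itself (model predictive control driven by MGB-IPH forecasts) is not spelled out in the abstract; as a rough, deterministic stand-in for the kind of event-based release optimization involved, the toy linear program below schedules releases over a forecast horizon subject to storage bounds and a downstream flood cap. All quantities are invented, and the real formulation is stochastic and multi-objective:

```python
import numpy as np
from scipy.optimize import linprog

T = 10                        # horizon (days), illustrative
inflow = np.full(T, 80.0)     # forecast inflows (hm^3/day), illustrative
s0, s_max, r_max = 500.0, 900.0, 120.0   # initial/max storage, turbine capacity
flood_q = 100.0               # downstream flood threshold (hm^3/day)

# Maximize total release (a crude proxy for energy) subject to the storage
# staying within [0, s_max]; flood control enters as a hard cap on releases.
c = -np.ones(T)
A_cum = np.tril(np.ones((T, T)))           # cumulative-release matrix
b_hi = s0 + np.cumsum(inflow)              # storage_t >= 0    ->  A_cum r <= b_hi
b_lo = s0 + np.cumsum(inflow) - s_max      # storage_t <= s_max -> -A_cum r <= -b_lo
res = linprog(c,
              A_ub=np.vstack([A_cum, -A_cum]),
              b_ub=np.concatenate([b_hi, -b_lo]),
              bounds=[(0.0, min(r_max, flood_q))] * T)
print(res.x)                               # release schedule over the horizon
```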
NASA Technical Reports Server (NTRS)
Mauldin, L. E.
1994-01-01
Business travel planning within an organization is often a time-consuming task. Travel Forecaster is a menu-driven, easy-to-use program which plans, forecasts cost, and tracks actual vs. planned cost for business-related travel of a division or branch of an organization and compiles this information into a database to aid the travel planner. The program's ability to handle multiple trip entries makes it a valuable time-saving device. Travel Forecaster takes full advantage of relational data base properties so that information that remains constant, such as per diem rates and airline fares (which are unique for each city), needs entering only once. A typical entry would include selection with the mouse of the traveler's name and destination city from pop-up lists, and typed entries for number of travel days and purpose of the trip. Multiple persons can be selected from the pop-up lists and multiple trips are accommodated by entering the number of days by each appropriate month on the entry form. An estimated travel cost is not required of the user as it is calculated by a Fourth Dimension formula. With this information, the program can produce output of trips by month with subtotal and total cost for either organization or sub-entity of an organization; or produce outputs of trips by month with subtotal and total cost for international-only travel. It will also provide monthly and cumulative formats of planned vs. actual outputs in data or graph form. Travel Forecaster users can do custom queries to search and sort information in the database, and it can create custom reports with the user-friendly report generator. Travel Forecaster 1.1 is a database program for use with Fourth Dimension Runtime 2.1.1. It requires a Macintosh Plus running System 6.0.3 or later, 2Mb of RAM and a hard disk. The standard distribution medium for this package is one 3.5 inch 800K Macintosh format diskette. Travel Forecaster was developed in 1991. Macintosh is a registered trademark of Apple Computer, Inc. Fourth Dimension is a registered trademark of Acius, Inc.
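The cost formula computed inside Fourth Dimension is not reproduced in this description; a hypothetical Python analogue of a per-diem-plus-airfare estimate (the rates and the miscellaneous allowance are invented) might be:

```python
def trip_cost(days, per_diem, airfare, misc_rate=0.05):
    """Estimate trip cost as days at the destination's per-diem rate plus
    round-trip airfare, with a small miscellaneous allowance on top."""
    return round((days * per_diem + airfare) * (1.0 + misc_rate), 2)

print(trip_cost(days=4, per_diem=150.0, airfare=420.0))  # 1071.0
```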
Sampling strategies based on singular vectors for assimilated models in ocean forecasting systems
NASA Astrophysics Data System (ADS)
Fattorini, Maria; Brandini, Carlo; Ortolani, Alberto
2016-04-01
Meteorological and oceanographic models need observations not only as a ground truth to verify the quality of the models, but also to keep the model forecast error acceptable: through data assimilation techniques, which merge measured and modelled data, the natural divergence of numerical solutions from reality can be reduced or controlled and a more reliable solution - called the analysis - is computed. Although this concept is valid in general, its application, especially in oceanography, raises many problems for three main reasons: the difficulty ocean models have in reaching an acceptable state of equilibrium, the high cost of measurements, and the difficulty of carrying them out. The performance of data assimilation procedures depends on the particular observation network in use, well beyond the background quality and the assimilation method used. In this study we will present some results concerning the great impact of the dataset configuration, in particular measurement positions, on the overall forecasting reliability of an ocean model. The aim is to identify operational criteria to support the design of marine observation networks at regional scale. In order to identify the observation network able to minimize the forecast error, a methodology based on Singular Vector Decomposition of the tangent linear model is proposed. Such a method can give strong indications on the local error dynamics. In addition, to avoid redundancy in the information contained in the data, a minimal distance among data positions has been chosen on the basis of a spatial correlation analysis of the hydrodynamic fields under investigation. This methodology has been applied to the choice of data positions starting from simplified models, like an idealized double-gyre model and a quasi-geostrophic one. Model configurations and data assimilation are based on available ROMS routines, where a variational assimilation algorithm (4D-Var) is included as part of the code. These first applications have provided encouraging results in terms of increased predictability time and reduced forecast error, also improving the quality of the analysis used to recover the real circulation patterns from a first guess quite far from the real state.
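A minimal sketch of the singular-vector idea, with a random matrix standing in for the real tangent linear propagator (which would come from linearizing the ocean model): the leading right singular vector marks where initial errors grow fastest, and hence where observations pay off most:

```python
import numpy as np

# Toy tangent-linear propagator M mapping initial perturbations to forecast
# perturbations; in practice M comes from the linearized ocean model.
rng = np.random.default_rng(1)
n = 50                                    # state dimension (grid points), illustrative
M = rng.standard_normal((n, n)) / np.sqrt(n)

U, s, Vt = np.linalg.svd(M)
growth = s[0]                             # largest amplification over the window
v1 = Vt[0]                                # optimal initial perturbation pattern
candidate_sites = np.argsort(np.abs(v1))[::-1][:5]   # grid points to observe first
print(growth, candidate_sites)
```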
The System of Inventory Forecasting in PT. XYZ by using the Method of Holt Winter Multiplicative
NASA Astrophysics Data System (ADS)
Shaleh, W.; Rasim; Wahyudin
2018-01-01
PT. XYZ currently relies only on manual bookkeeping to predict sales and inventory of goods. If the inventory prediction is too large, the cost of production swells and the invested capital becomes less efficient. Vice versa, if the inventory prediction is too small, it impacts consumers, who are forced to wait for the desired product. In this era of globalization, the development of computer technology has therefore become a very important part of every business plan. Almost all companies, both large and small, use computer technology. By utilizing computer technology, people can save time in solving complex business problems. For companies, computer technology has become indispensable for enhancing the business services they manage, but systems and technologies are not limited to the distribution model and data processing: the existing system must be able to analyze the possibilities of future company capabilities. Therefore, the company must be able to forecast conditions and circumstances, whether for inventory of goods, workforce, or the profits to be obtained. For this forecast, total sales data from December 2014 to December 2016 are calculated with the Holt-Winters method, a time-series prediction technique (the multiplicative seasonal method) suited to seasonal data with rising and falling patterns, which comprises four equations: level smoothing, trend smoothing, seasonal smoothing, and forecasting. From the research conducted, the error value in the form of MAPE is below 1%, so it can be concluded that forecasting with the Holt-Winters multiplicative method is accurate.
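The four equations named above (level, trend and seasonal smoothing plus forecasting) can be sketched directly; the smoothing constants and the synthetic sales series below are invented rather than taken from the paper:

```python
import numpy as np

def holt_winters_multiplicative(y, m, alpha, beta, gamma, h):
    """Holt-Winters with additive trend and multiplicative seasonality."""
    level = y[:m].mean()
    trend = (y[m:2 * m].mean() - y[:m].mean()) / m
    season = list(y[:m] / level)
    for t in range(m, len(y)):
        prev = level
        level = alpha * y[t] / season[t - m] + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
        season.append(gamma * y[t] / level + (1 - gamma) * season[t - m])
    return np.array([(level + k * trend) * season[len(y) + k - 1 - m]
                     for k in range(1, h + 1)])

# Synthetic monthly sales with a yearly cycle (m = 12); values are invented.
t = np.arange(36)
sales = (100.0 + 2.0 * t) * (1.0 + 0.3 * np.sin(2.0 * np.pi * t / 12.0))
fc = holt_winters_multiplicative(sales[:24], m=12,
                                 alpha=0.3, beta=0.1, gamma=0.2, h=12)
mape = 100.0 * np.mean(np.abs((sales[24:] - fc) / sales[24:]))
print(f"MAPE = {mape:.2f}%")
```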
42 CFR 417.572 - Budget and enrollment forecast and interim reports.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 3 2010-10-01 2010-10-01 false Budget and enrollment forecast and interim reports... PLANS, AND HEALTH CARE PREPAYMENT PLANS Medicare Payment: Cost Basis § 417.572 Budget and enrollment forecast and interim reports. (a) Annual submittal. The HMO or CMP must submit an annual operating budget...
NASA Astrophysics Data System (ADS)
Kuzma, H. A.; Golubkova, A.; Eklund, C.
2015-12-01
Nevada has the second largest output of geothermal energy in the United States (after California), with 14 major power plants producing over 425 megawatts of electricity, meeting 7% of the state's total energy needs. A number of wells, particularly older ones, have shown significant temperature and pressure declines over their lifetimes, adversely affecting economic returns. Production declines are almost universal in the oil and gas (O&G) industry. BetaZi (BZ) is a proprietary algorithm which uses a physiostatistical model to forecast production from the past history of O&G wells and to generate "type curves" which are used to estimate the production of undrilled wells. Although BZ was designed and calibrated for O&G, it is a general-purpose diffusion equation solver, capable of modeling complex fluid dynamics in multi-phase systems. In this pilot study, it is applied directly to the temperature data from five Nevada geothermal fields. With the data appropriately normalized, BZ is shown to accurately predict temperature declines. The figure shows several examples of BZ forecasts using historic data from the Steamboat Hills field near Reno. BZ forecasts were made on temperature data on a normalized scale (blue), with two years of data held out for blind testing (yellow). The forecast is returned in terms of percentiles of probability (red), with the median forecast marked (solid green). Actual production is expected to fall within the red bounds 80% of the time. Blind tests such as these are used to verify that the probabilistic forecast can be trusted. BZ is also used to compute an accurate type temperature profile for wells that have yet to be drilled. These forecasts can be combined with estimated costs to evaluate the economics and risks of a project or potential capital investment. It is remarkable that an algorithm developed for oil and gas can accurately predict temperature in geothermal wells without significant recasting.
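BetaZi itself is proprietary, so nothing of its physiostatistical model can be shown here; as a generic stand-in, the classic Arps hyperbolic decline curve illustrates the same workflow of fitting production history and blind-testing a forecast on held-out data. All values below are synthetic:

```python
import numpy as np
from scipy.optimize import curve_fit

def arps_hyperbolic(t, qi, di, b):
    """Arps hyperbolic decline: q(t) = qi / (1 + b*di*t)**(1/b)."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

t = np.arange(60.0)                       # months on production
rng = np.random.default_rng(2)
q = arps_hyperbolic(t, 900.0, 0.08, 0.9) * rng.lognormal(0.0, 0.05, t.size)

popt, _ = curve_fit(arps_hyperbolic, t, q, p0=[800.0, 0.1, 0.5],
                    bounds=([1.0, 1e-4, 0.01], [5000.0, 1.0, 2.0]))
blind = arps_hyperbolic(np.arange(60.0, 84.0), *popt)   # 2-year hold-out forecast
```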
NASA Astrophysics Data System (ADS)
Brook, Anna; Polinova, Maria; Housh, Mashor
2016-04-01
Agriculture and agricultural landscapes are increasingly under pressure to meet the demands of a constantly increasing human population and globally changing food patterns. At the same time, there is rising concern that climate change will harm agriculture and food security in many regions of the world (Nelson et al., 2009). Facing those threats, the majority of Mediterranean countries have chosen irrigated agriculture. For crop plants, water is one of the most important inputs: it drives crop growth and production and ensures the efficiency of other inputs (e.g. seeds, fertilizers and pesticides), but its use competes with other local sectors (e.g. industry, urban human use). Thus, well-timed availability of water is vital to agriculture for ensured yields. The increasing demand for irrigation has necessitated optimal irrigation scheduling techniques that coordinate the timing and amount of irrigation to optimally manage water use in agricultural systems. The irrigation scheduling problem can be challenging, as farmers try to balance the conflicting objectives of maximizing their yield while minimizing irrigation water use. Another challenge in the irrigation scheduling problem is attributed to the uncertain factors involved in the plant growth process during the growing season. Most notable are the climatic factors such as evapotranspiration and rainfall; these add a third objective from the farmer's perspective, namely minimizing the risk associated with these uncertain factors. Nevertheless, advances in weather forecasting have reduced the uncertainty associated with future climatic data. Thus, climatic forecasts can be reliably employed to guide an optimal irrigation schedule when coupled with stochastic optimization models (Housh et al., 2012). Many studies have concluded that optimal irrigation decisions can provide substantial economic value over conventional irrigation decisions (Wang and Cai, 2009). These studies have only incorporated short-term (weekly) forecasts, missing the potential benefit of mid-term (seasonal) climate forecasts. The latest progress in new data acquisition technologies (mainly in the field of Earth observation by remote sensing and imaging spectroscopy systems), as well as state-of-the-art achievements in geographical information systems (GIS), computer science, and climate and climate-impact modelling, enables the development of both integrated modelling and realistic spatial simulations. The present method uses field spectroscopy technology to keep the field under constant monitoring. The majority of previously developed decision support systems use satellite remote sensing data that provide very limited capabilities (conventional and basic parameters). The alternative is the more progressive technology of hyperspectral airborne or ground-based imagery data, which provides an exhaustive description of the field; this alternative, however, is known to be very costly and complex. As such, we will present a low-cost imaging spectroscopy technology, supported by detailed, fine-resolution field spectroscopy, as a cost-effective option for near-field real-time monitoring. In order to solve the soil water balance and to predict the irrigation water volume, a pedological survey is carried out in the evaluation study areas. Remote sensing and field spectroscopy were applied to integrate continuous feedback from the field (e.g. soil moisture, organic/inorganic carbon, nitrogen, salinity, fertilizers, sulphuric acid, texture; crop water stress, plant stage, LAI, chlorophyll, biomass, yield prediction applying PROSPECT+SAIL; Fraction of Absorbed Photosynthetically Active Radiation, FAPAR), estimated from remote sensing information, to minimize the errors associated with the crop simulation process. A stochastic optimization model will be formulated that takes into account both mid-term seasonal probabilistic climate predictions and short-term weekly forecasts. In order to optimize water resource use, the irrigation scheduling will be defined using a simulation model of the soil-plant-atmosphere system (e.g. the SWAP model, Van Dam et al., 2008). The use of this tool is necessary to: i) take into account the soil spatial variability; ii) predict the system behaviour under the forecasted climate; iii) define the optimized irrigation water volumes. Given this knowledge in the three domains of optimization under uncertainty, spectroscopy/remote sensing and climate forecasting, an integrated framework for deriving optimal irrigation decisions will be presented. References: Nelson, Gerald C., et al. Climate change: Impact on agriculture and costs of adaptation. Vol. 21. Intl Food Policy Res Inst, 2009. Housh, Mashor, Avi Ostfeld, and Uri Shamir. "Seasonal multi-year optimal management of quantities and salinities in regional water supply systems." Environmental Modelling & Software 37 (2012): 55-67. Wang, Dingbao, and Ximing Cai. "Irrigation scheduling - Role of weather forecasting and farmers' behavior." Journal of Water Resources Planning and Management 135.5 (2009): 364-372. Van Dam, J. C., et al. SWAP version 3.2: Theory description and user manual. No. 1649. Wageningen, The Netherlands: Alterra, 2008.
Gaussian process regression for forecasting battery state of health
NASA Astrophysics Data System (ADS)
Richardson, Robert R.; Osborne, Michael A.; Howey, David A.
2017-07-01
Accurately predicting the future capacity and remaining useful life of batteries is necessary to ensure reliable system operation and to minimise maintenance costs. The complex nature of battery degradation has meant that mechanistic modelling of capacity fade has thus far remained intractable; however, with the advent of cloud-connected devices, data from cells in various applications is becoming increasingly available, and the feasibility of data-driven methods for battery prognostics is increasing. Here we propose Gaussian process (GP) regression for forecasting battery state of health, and highlight various advantages of GPs over other data-driven and mechanistic approaches. GPs are a type of Bayesian non-parametric method, and hence can model complex systems whilst handling uncertainty in a principled manner. Prior information can be exploited by GPs in a variety of ways: explicit mean functions can be used if the functional form of the underlying degradation model is available, and multiple-output GPs can effectively exploit correlations between data from different cells. We demonstrate the predictive capability of GPs for short-term and long-term (remaining useful life) forecasting on a selection of capacity vs. cycle datasets from lithium-ion cells.
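A minimal sketch of the approach with scikit-learn, on synthetic capacity-fade data (the kernel choice and fade curve are illustrative; the paper's explicit mean functions and multiple-output GPs are not reproduced here):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel

# Synthetic normalized capacity vs. cycle number; values are invented.
rng = np.random.default_rng(3)
cycles = np.arange(0.0, 500.0, 10.0)
capacity = (1.0 - 2e-4 * cycles - 1e-7 * cycles**2
            + rng.normal(0.0, 0.003, cycles.size))

kernel = ConstantKernel(1.0) * RBF(length_scale=100.0) + WhiteKernel(1e-5)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(cycles.reshape(-1, 1), capacity)

# Forecast future capacity with an uncertainty band; far from the data the
# plain RBF kernel reverts to the mean, which is why explicit degradation
# mean functions help for long-horizon (remaining useful life) forecasts.
future = np.arange(500.0, 800.0, 10.0).reshape(-1, 1)
mean, std = gp.predict(future, return_std=True)
```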
Predicting and adapting to the agricultural impacts of large-scale drought (Invited)
NASA Astrophysics Data System (ADS)
Elliott, J. W.; Glotter, M.; Best, N.; Ruane, A. C.; Boote, K.; Hatfield, J.; Jones, J.; Rosenzweig, C.; Smith, L. A.; Foster, I.
2013-12-01
The impact of drought on agriculture is an important socioeconomic consequence of climate extremes. Drought affects millions of people globally each year, causing an average of $6-8 billion in damage annually in the U.S. alone. The 1988 U.S. drought is estimated to have cost $79 billion in 2013 dollars, behind only Hurricane Katrina as the most costly U.S. climate-related disaster in recent decades. The 2012 U.S. drought is expected to cost about $30 billion. Droughts and heat waves accounted for 12% of all billion-dollar disaster events in the U.S. from 1980-2011 but almost one quarter of total monetary damages. To make matters worse, the frequency and severity of large-scale droughts in important agricultural regions are expected to increase as temperatures rise and precipitation patterns shift, leading some researchers to suggest that extended drought will harm more people than any other climate-related impact, specifically in the area of food security. Improved understanding and forecasts of drought would have both immediate and long-term implications for the global economy and food security. We show that mechanistic agricultural models, applied in novel ways, can reproduce historical crop yield anomalies, especially in seasons for which drought is the overriding factor. With more accurate observations and forecasts for temperature and precipitation, the accuracy and lead times of drought impact predictions could be improved further. We provide evidence that changes in agricultural technologies and management have reduced system-level drought sensitivity in US maize production in recent decades, adaptations that could be applied elsewhere. This work suggests a new approach to modeling, monitoring, and forecasting drought impacts on agriculture. [Figure: simulated and observed national average maize yield (t/ha), 1979-2012, with the observed linear trend and the November 2012 USDA estimate; shading marks the central 95% and 75% of the resampled forecast error distribution; companion panels show the June-August Palmer Z-Index by US climate division for 1988 and 2012.]
Quantifying model uncertainty in seasonal Arctic sea-ice forecasts
NASA Astrophysics Data System (ADS)
Blanchard-Wrigglesworth, Edward; Barthélemy, Antoine; Chevallier, Matthieu; Cullather, Richard; Fučkar, Neven; Massonnet, François; Posey, Pamela; Wang, Wanqiu; Zhang, Jinlun; Ardilouze, Constantin; Bitz, Cecilia; Vernieres, Guillaume; Wallcraft, Alan; Wang, Muyin
2017-04-01
Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or post-processing techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.
Forecast of long term coal supply and mining conditions: Model documentation and results
NASA Technical Reports Server (NTRS)
1980-01-01
A coal industry model was developed to support the Jet Propulsion Laboratory in its investigation of advanced underground coal extraction systems. The model documentation includes the programming for the coal mining cost models and an accompanying users' manual, and a guide to reading model output. The methodology used in assembling the transportation, demand, and coal reserve components of the model are also described. Results presented for 1986 and 2000, include projections of coal production patterns and marginal prices, differentiated by coal sulfur content.
Optimizing Tsunami Forecast Model Accuracy
NASA Astrophysics Data System (ADS)
Whitmore, P.; Nyland, D. L.; Huang, P. Y.
2015-12-01
Recent tsunamis provide a means to determine the accuracy that can be expected of real-time tsunami forecast models. Forecast accuracy using two different tsunami forecast models is compared for seven events since 2006, based on both real-time application and optimized, after-the-fact "forecasts". Lessons learned by comparing the forecast accuracy determined during an event to modified applications of the models after the fact provide improved methods for real-time forecasting of future events. Variables such as source definition, data assimilation, and model scaling factors are examined to optimize forecast accuracy. Forecast accuracy is also compared for direct forward modeling based on earthquake source parameters versus accuracy obtained by assimilating sea level data into the forecast model. Results show that assimilating sea level data into the models increases accuracy by approximately 15% for the events examined.
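One simple way to fold sea level data into a forecast, in the spirit of the scaling factors mentioned above, is to least-squares fit the amplitudes of precomputed unit-source waveforms to an observed gauge record; the two-source setup and all series below are synthetic illustrations, not the models compared in the study:

```python
import numpy as np

# Columns of G: precomputed model waveforms at a gauge for two unit sources.
rng = np.random.default_rng(4)
t = np.linspace(0.0, 3600.0, 361)
G = np.column_stack([np.sin(2 * np.pi * t / 1200.0),
                     np.sin(2 * np.pi * t / 900.0 + 0.5)])
obs = G @ np.array([1.8, 0.6]) + rng.normal(0.0, 0.05, t.size)

# Least-squares source scaling against the observed sea-level anomaly.
alpha, *_ = np.linalg.lstsq(G, obs, rcond=None)
assimilated_forecast = G @ alpha
```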
A measurement system for large, complex software programs
NASA Technical Reports Server (NTRS)
Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.
1994-01-01
This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
NASA Astrophysics Data System (ADS)
Nyboer, John
Issues related to the reduction of greenhouse gases are encumbered with uncertainties for decision makers. Unfortunately, conventional analytical tools generate widely divergent forecasts of the effects of actions designed to mitigate these emissions. "Bottom-up" models show the costs of reducing emissions attained through the penetration of efficient technologies to be low or negative. In contrast, more aggregate "top-down" models show costs of reduction to be high. The methodological approaches of the different models used to simulate energy consumption generate, in part, the divergence found in model outputs. To address this uncertainty and bring convergence, I use a technology-explicit model that simulates turnover of equipment stock as a function of detailed data on equipment costs and stock characteristics and of verified behavioural data related to equipment acquisition and retrofitting. Such detail can inform the decision maker of the effects of actions to reduce greenhouse gases due to changes in (1) technology stocks, (2) products or services, or (3) the mix of fuels used. This thesis involves two main components: (1) the development of a quantitative model to analyse energy demand and (2) the application of this tool to a policy issue, abatement of CO2 emissions. The analysis covers all of Canada by sector (8 industrial subsectors, residential, commercial) and region. An electricity supply model to provide local electricity prices supplemented the quantitative model. Forecasts of growth and structural change were provided by national macroeconomic models. Seven different simulations were applied to each sector in each region, including a base case run and three runs simulating emissions charges of $75/tonne, $150/tonne and $225/tonne CO2. The analysis reveals that there is significant variation in the costs and quantity of emissions reduction by sector and region. Aggregated results show that Canada can meet both stabilisation targets (1990 levels of emissions by 2000) and reduction targets (20% less than 1990 by 2010), but the cost of meeting reduction targets exceeds $225/tonne. After a review of the results, I provide several reasons for concluding that the costs are overestimated and the emissions reduction underestimated. I also provide several future research options.
A short-term ensemble wind speed forecasting system for wind power applications
NASA Astrophysics Data System (ADS)
Baidya Roy, S.; Traiteur, J. J.; Callicutt, D.; Smith, M.
2011-12-01
This study develops an adaptive, blended forecasting system to provide accurate wind speed forecasts 1 hour ahead of time for wind power applications. The system consists of an ensemble of 21 forecasts with different configurations of the Weather Research and Forecasting Single Column Model (WRFSCM) and a persistence model. The ensemble is calibrated against observations for a 2 month period (June-July, 2008) at a potential wind farm site in Illinois using the Bayesian Model Averaging (BMA) technique. The forecasting system is evaluated against observations for August 2008 at the same site. The calibrated ensemble forecasts significantly outperform the forecasts from the uncalibrated ensemble while significantly reducing forecast uncertainty under all environmental stability conditions. The system also generates significantly better forecasts than persistence, autoregressive (AR) and autoregressive moving average (ARMA) models during the morning transition and the diurnal convective regimes. This forecasting system is computationally more efficient than traditional numerical weather prediction models and can generate a calibrated forecast, including model runs and calibration, in approximately 1 minute. Currently, hour-ahead wind speed forecasts are almost exclusively produced using statistical models. However, numerical models have several distinct advantages over statistical models including the potential to provide turbulence forecasts. Hence, there is an urgent need to explore the role of numerical models in short-term wind speed forecasting. This work is a step in that direction and is likely to trigger a debate within the wind speed forecasting community.
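A compact sketch of the Gaussian BMA calibration step with an expectation-maximization loop (common spread across members, no bias correction, synthetic data throughout; the study's 21-member WRFSCM/persistence ensemble is not reproduced):

```python
import numpy as np
from scipy.stats import norm

def bma_em(F, y, iters=200):
    """Fit BMA weights and a common Gaussian spread to member forecasts
    F (T x K) against observations y (T,) via expectation-maximization."""
    T, K = F.shape
    w, sigma = np.full(K, 1.0 / K), y.std()
    for _ in range(iters):
        dens = norm.pdf(y[:, None], loc=F, scale=sigma) * w      # T x K
        z = dens / dens.sum(axis=1, keepdims=True)               # responsibilities
        w = z.mean(axis=0)
        sigma = np.sqrt((z * (y[:, None] - F) ** 2).sum() / T)
    return w, sigma

# Synthetic 3-member ensemble of hourly wind speeds (m/s); values invented.
rng = np.random.default_rng(5)
truth = 8.0 + rng.normal(0.0, 1.5, 720)
F = truth[:, None] + rng.normal([0.5, -1.0, 0.2], [1.0, 2.0, 1.5], (720, 3))
w, sigma = bma_em(F, truth)
print(w, sigma)   # weights favour the sharper, less biased members
```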
NASA Astrophysics Data System (ADS)
Sun, Congcong; Wang, Zhijie; Liu, Sanming; Jiang, Xiuchen; Sheng, Gehao; Liu, Tianyu
2017-05-01
Wind power has the advantages of being clean and non-polluting, and the development of bundled wind-thermal generation power systems (BWTGSs) is one of the important means to improve the wind power accommodation rate and implement "clean alternatives" on the generation side. A two-stage optimization strategy for BWTGSs considering wind speed forecasting results and load characteristics is proposed. By taking short-term wind speed forecasting results on the generation side and load characteristics on the demand side into account, a two-stage optimization model for BWTGSs is formulated. Using an environmental benefit index of BWTGSs as the objective function, and supply-demand balance and generator operation as the constraints, the first-stage optimization model is developed with chance-constrained programming theory. Using the operation cost of BWTGSs as the objective function, the second-stage optimization model is developed with a greedy algorithm. An improved PSO algorithm is employed to solve the model, and a numerical test verifies the effectiveness of the proposed strategy.
Wille, Eberhard; Scholze, Jürgen; Alegria, Eduardo; Ferri, Claudio; Langham, Sue; Stevens, Warren; Jeffries, David; Uhl-Hochgraeber, Kerstin
2011-06-01
The presence of metabolic syndrome in patients with hypertension significantly increases the risk of cardiovascular disease, type 2 diabetes and mortality. Our aim is to estimate the economic burden to the health service of metabolic syndrome (MetS) in patients with hypertension and its consequences, in three European countries in 2008, and to forecast the future economic burden in 2020 using projected demographic estimates and assumptions about the growth of MetS. An age-, sex- and risk group-structured prevalence-based cost-of-illness model was developed using the United States Adult Treatment Panel III criteria of the National Cholesterol Education Program to define MetS. Data sources included published information and public-use databases on disease prevalence, incidence of cardiovascular events, prevalence of type 2 diabetes, treatment patterns and cost of management in Germany, Spain and Italy. The economic burden to the health service of MetS in patients with hypertension has been estimated at €24,427 million, €1,900 million and €4,877 million in Germany, Spain and Italy, respectively, and is forecast to rise by 59%, 179% and 157%, respectively, by 2020. The largest components of costs included the management of prevalent type 2 diabetes and incident cardiovascular events. Mean annual costs per hypertensive patient were around three-fold higher in subjects with MetS compared to those without, and rose incrementally with the number of additional MetS components present. In conclusion, the presence of MetS in patients with hypertension significantly inflates the economic burden, and costs are likely to increase in the future due to an aging population and an increase in the prevalence of components of MetS.
Data Driven Ionospheric Modeling in Relation to Space Weather: Percent Cloud Coverage
NASA Astrophysics Data System (ADS)
Tulunay, Y.; Senalp, E. T.; Tulunay, E.
2009-04-01
Since 1990, a small group at METU has been developing data-driven models in order to forecast some critical system parameters related to near-Earth space processes. The background on the subject supports new achievements, which contributed to the COST 724 activities and will contribute to the new ES0803 activities. This work describes one of the outstanding contributions, namely the forecasting of meteorological parameters by considering the probable influence of cosmic rays (CR) and sunspot numbers (SSN). The data-driven method is generic and applicable to many near-Earth space processes, including ionospheric/plasmaspheric interactions. It is believed that the EURIPOS initiative would be useful in supplying a wide range of reliable data to the models developed. Quantification of the physical mechanisms which causally link Space Weather to the Earth's weather has been a challenging task. On this basis, the percent cloud coverage (%CC) and cloud top temperatures (CTT) were forecast one month ahead of time between geographic coordinates of (22.5°N; 57.5°N) and (7.5°W; 47.5°E) at 96 grid locations, covering the years 1983 to 2000, using the Middle East Technical University Fuzzy Neural Network Model (METU-FNN-M) [Tulunay, 2008]. The near-Earth space variability at several different time scales arises from a number of separate factors, and the physics of the variations cannot be modeled due to the lack of current information about the parameters of several natural processes. CR are shielded by the magnetosphere to a certain extent, but they can modulate the low-level cloud cover. METU-FNN-M was developed, trained and applied for forecasting the %CC and CTT by considering the history of those meteorological variables; Cloud Optical Depth (COD); the Ionization (I) value, formulated and computed using CR data and CTT; SSN; temporal variables; and defuzzified cloudiness. The temporal and spatial variables and the cut-off rigidity are used to compute the defuzzified cloudiness. The forecast %CC and CTT values at uniformly spaced grids over the region of interest are used for mapping by Bezier surfaces. The major advantage of the fuzzy model is that it uses its inputs and the expert knowledge in coordination. Long-term cloud analysis was performed on a region with differences in terms of atmospheric activity, in order to show the generalization capability. Global and local parameters of the process were considered. Both CR flux and SSN reflect the influence of Space Weather on the general planetary situation, but the other parameters in the inputs of the model reflect the local situation. Error and correlation analyses on the forecast and observed parameters were performed. The correlations between the forecast and observed parameters are very promising. The model supports the dependence of the cloud formation process on CR fluxes. The one-month-ahead forecast values of the model can also be used as inputs to other models which forecast other local or global parameters, in order to further test the hypothesis on possible link(s) between Space Weather and the Earth's weather. The model-based, theoretical and numerical works mentioned are promising and have potential for future research and development. References: Tulunay, Y., E.T. Şenalp, Ş. Öz, L.I. Dorman, E. Tulunay, S.S. Menteş and M.E. Akcan (2008), A Fuzzy Neural Network Model to Forecast the Percent Cloud Coverage and Cloud Top Temperature Maps, Ann. Geophys., 26(12), 3945-3954.
Evaluating space weather forecasts of geomagnetic activity from a user perspective
NASA Astrophysics Data System (ADS)
Thomson, A. W. P.
2000-12-01
Decision Theory can be used as a tool for discussing the relative costs of complacency and false alarms with users of space weather forecasts. We describe a new metric for the value of space weather forecasts, derived from Decision Theory. In particular we give equations for the level of accuracy that a forecast must exceed in order to be useful to a specific customer. The technique is illustrated by simplified example forecasts for global geomagnetic activity and for geophysical exploration and power grid management in the British Isles.
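The classic cost-loss framing behind such a metric: a user pays C to protect, or risks a loss L if an event strikes unprotected, so acting on a probability forecast p is worthwhile only when p exceeds C/L, and a forecast system is useful to that customer only if its accuracy beats this threshold behaviour. A toy numerical version (C, L and p are invented, not the paper's equations):

```python
C, L = 10_000.0, 150_000.0     # cost of mitigation vs. loss if the storm hits
threshold = C / L              # act when forecast probability exceeds ~0.067

def expected_expense(p_event, act):
    """Expected cost of acting (always pay C) vs. not acting (risk p*L)."""
    return C if act else p_event * L

p = 0.12                       # forecast probability of damaging geomagnetic activity
print(expected_expense(p, act=p > threshold))   # acting costs 10000.0 < 18000.0
```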
Considering inventory distributions in a stochastic periodic inventory routing system
NASA Astrophysics Data System (ADS)
Yadollahi, Ehsan; Aghezzaf, El-Houssaine
2017-07-01
Dealing with the stochasticity of parameters is one of the critical issues in business and industry nowadays. Supply chain planners have difficulties in forecasting the stochastic parameters of a distribution system. Customers' demand rates during their lead time are one of these parameters. In addition, holding a large inventory at the retailers is costly and inefficient. To cover the uncertainty of forecasting demand rates, researchers have proposed the use of safety stock to avoid stock-outs. However, finding the precise level of safety stock depends on forecasting the statistical distribution of demand rates and their variations in different settings over the planning horizon. In this paper the demand rate distributions and their parameters are taken into account for each time period in a stochastic periodic inventory routing problem (SPIRP). An analysis of the resulting statistical distribution of the inventory and safety stock levels is provided to measure the effects of input parameters on the output indicators. Different values of the coefficient of variation are applied to the customers' demand rate in the optimization model. The outcome of the deterministic equivalent model of the SPIRP is simulated in the form of an illustrative case.
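The textbook buffer rule underlying the safety-stock discussion sets the buffer from the demand-rate distribution over the lead time; a minimal sketch with an invented service level, demand spread and lead time:

```python
import math

def safety_stock(z, sigma_demand, lead_time):
    """Safety stock for i.i.d. per-period demand: SS = z * sigma_d * sqrt(L)."""
    return z * sigma_demand * math.sqrt(lead_time)

# ~95% cycle service level (z = 1.645), demand sd 40 units/period, L = 4 periods
print(safety_stock(1.645, 40.0, 4.0))   # 131.6 units
```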
Cb-LIKE - Thunderstorm forecasts up to six hours with fuzzy logic
NASA Astrophysics Data System (ADS)
Köhler, Martin; Tafferner, Arnold
2016-04-01
Thunderstorms, with their accompanying effects like heavy rain, hail, or downdrafts, cause delays and flight cancellations and therefore high additional costs for airlines and airport operators. A reliable thunderstorm forecast up to several hours ahead could give decision makers in air traffic more time to react appropriately to possible storm cells and initiate adequate countermeasures. To provide the required forecasts, Cb-LIKE (Cumulonimbus-LIKElihood) has been developed at the DLR (Deutsches Zentrum für Luft- und Raumfahrt) Institute of Atmospheric Physics. The new algorithm is an automated system which designates areas of possible thunderstorm development using model data from the COSMO-DE weather model, run by the German Meteorological Service (DWD). A newly developed "Best-Member-Selection" method allows the automatic selection of the particular model run of a time-lagged COSMO-DE model ensemble which best matches the current thunderstorm situation, ensuring that the best available data basis is used for the calculation of the Cb-LIKE thunderstorm forecasts. Altogether there are four different modes for the selection of the best member. Four atmospheric parameters from the model output (CAPE, vertical wind velocity, radar reflectivity and cloud top temperature) are used within the algorithm. A newly developed fuzzy logic system combines the model parameters and calculates a thunderstorm indicator within a value range of 12 up to 88 for each grid point of the model domain for the following six hours, in one-hour intervals. The higher the indicator value, the more the model parameters imply the development of thunderstorms. The quality of the Cb-LIKE thunderstorm forecasts was evaluated by extensive verification using a neighborhood verification approach and multi-event contingency tables. The verification was performed for the whole summer period of 2012. On the basis of a deterministic object comparison with heavy precipitation cells observed by the radar-based thunderstorm tracking algorithm Rad-TRAM, several verification scores like BIAS, POD, FAR and CSI were calculated to identify possible advantages of the new algorithm. The presentation illustrates in detail the concept of the Cb-LIKE algorithm with regard to the fuzzy logic system and the Best-Member-Selection. Additionally, some case studies and the most important results of the verification will be shown. The implementation of the forecasts into the DLR WxFUSION system, a user-oriented forecasting system for air traffic, will also be covered.
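A toy version of the fuzzy combination step: piecewise-linear memberships for the four parameters, averaged and mapped onto the 12-88 indicator range. The breakpoints and equal weights are invented for illustration; Cb-LIKE's actual rule base is not reproduced here:

```python
import numpy as np

def ramp(x, lo, hi):
    """Membership rising linearly from 0 at lo to 1 at hi."""
    return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

def cb_indicator(cape, w_up, refl, ctt):
    """Combine CAPE (J/kg), updraught (m/s), reflectivity (dBZ) and cloud
    top temperature (degC) into a thunderstorm indicator in [12, 88]."""
    mu = np.mean([ramp(cape, 300.0, 2000.0),
                  ramp(w_up, 0.5, 5.0),
                  ramp(refl, 20.0, 45.0),
                  ramp(-ctt, 40.0, 60.0)])   # colder tops -> higher membership
    return 12.0 + 76.0 * mu

print(cb_indicator(cape=1500.0, w_up=3.0, refl=40.0, ctt=-55.0))  # ~65
```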
NASA Astrophysics Data System (ADS)
Meißner, Dennis; Klein, Bastian; Ionita, Monica; Hemri, Stephan; Rademacher, Silke
2017-04-01
Inland waterway transport (IWT) is an important commercial sector that is significantly vulnerable to hydrological impacts. River ice and floods limit the availability of the waterway network and may cause considerable damage to waterway infrastructure. Low flows significantly affect IWT's operational efficiency, usually several months a year, due to the close correlation of (low) water levels / water depths with (high) transport costs. Therefore, "navigation-related" hydrological forecasts focusing on the specific requirements of water-bound transport (relevant forecast locations, target parameters, skill characteristics etc.) play a major role in mitigating IWT's vulnerability to hydro-meteorological impacts. In light of continuing transport growth within the European Union, hydrological forecasts for the waterways are essential to stimulate more consistent use of the free capacity IWT still offers. An overview of the current operational and pre-operational forecasting systems for the German waterways, predicting water levels, discharges and river ice thickness on various time scales, will be presented. While short-term (deterministic) forecasts have a long tradition in navigation-related forecasting, (probabilistic) forecasting services offering extended lead times are not yet well established and are still the subject of current research and development activities (e.g. within the EU projects EUPORIAS and IMPREX). The focus is on improving technical aspects as well as on exploring adequate ways of disseminating and communicating probabilistic forecast information. For the German stretch of the River Rhine, one of the most frequented inland waterways worldwide, the existing deterministic forecast scheme has been extended by ensemble forecasts combined with statistical post-processing modules applying EMOS (Ensemble Model Output Statistics) and ECC (Ensemble Copula Coupling), in order to generate water level predictions up to 10 days ahead and to estimate their predictive uncertainty properly. Additionally, for the key locations on the international waterways Rhine, Elbe and Danube, three competing forecast approaches are currently being tested in a pre-operational set-up to generate monthly to seasonal (up to 3 months) forecasts: (1) the well-known Ensemble Streamflow Prediction approach (an ensemble based on historical meteorology), (2) coupling hydrological models with post-processed outputs from ECMWF's general circulation model (System 4), and (3) a purely statistical approach based on the stable relationship (teleconnection) of global or regional oceanic, climate and hydrological data with river flows. The current, still pre-operational results reveal valuable predictability of water levels and streamflow at monthly up to seasonal time scales along the larger rivers used as waterways in Germany. Last but not least, insight will be given into the technical set-up of the aforementioned forecasting systems operated at the Federal Institute of Hydrology, which are based on a Delft-FEWS application, focusing on the step-wise extension of the former system by integrating new components in order to meet the growing needs of the customers and to improve and extend the forecast portfolio for waterway users.
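As a sketch of the Gaussian EMOS post-processing step mentioned for the Rhine forecasts: the predictive distribution is N(a + b*ensmean, c + d*ensvar), with the four coefficients fitted by maximum likelihood on past forecast-observation pairs. The data and starting values below are invented:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def fit_emos(ens_mean, ens_var, y):
    """Fit N(a + b*mean, c + d*var) by maximum likelihood (Gaussian EMOS)."""
    def nll(p):
        a, b, c, d = p
        s2 = np.maximum(c + d * ens_var, 1e-6)    # keep the variance positive
        return -norm.logpdf(y, a + b * ens_mean, np.sqrt(s2)).sum()
    return minimize(nll, x0=[0.0, 1.0, 1.0, 1.0], method="Nelder-Mead").x

# Synthetic water-level ensemble statistics vs. observations (cm); invented.
rng = np.random.default_rng(8)
truth = 300.0 + rng.normal(0.0, 50.0, 400)
ens_mean = truth + 10.0 + rng.normal(0.0, 20.0, 400)   # biased, noisy ensemble
ens_var = rng.uniform(100.0, 900.0, 400)
a, b, c, d = fit_emos(ens_mean, ens_var, truth)
```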
Marques-Toledo, Cecilia de Almeida; Degener, Carolin Marlen; Vinhal, Livia; Coelho, Giovanini; Meira, Wagner; Codeço, Claudia Torres; Teixeira, Mauro Martins
2017-07-01
Infectious diseases are a leading threat to public health. Accurate and timely monitoring of disease risk and progress can reduce their impact. Mentioning a disease in social networks is correlated with physician visits by patients, and can be used to estimate disease activity. Dengue is the fastest growing mosquito-borne viral disease, with an estimated annual incidence of 390 million infections, of which 96 million manifest clinically. The Dengue burden is likely to increase in the future owing to trends toward increased urbanization, scarce water supplies and, possibly, environmental change. The epidemiological dynamics of Dengue are complex and difficult to predict, partly due to costly and slow surveillance systems. In this study, we aimed to quantitatively assess the usefulness of data acquired from Twitter for the early detection and monitoring of Dengue epidemics, at both country and city level, on a weekly basis. We evaluated and demonstrated the potential of tweet-based modeling for Dengue estimation and forecasting, in comparison with other available web-based data sources, Google Trends and Wikipedia access logs. We also studied the factors that might influence the goodness-of-fit of the model. We built a simple model based on tweets that was able to 'nowcast', i.e. estimate disease numbers in the same week, but also 'forecast' disease in future weeks. At the country level, tweets are strongly associated with Dengue cases, and can estimate present and future Dengue cases up to 8 weeks in advance. At the city level, tweets are also useful for estimating Dengue activity. Our model can be applied successfully to small and less developed cities, suggesting a robust construction, even though it may be influenced by the incidence of the disease, the local activity on Twitter, and social factors, including the human development index and internet access. The association of tweets with Dengue cases is valuable for assisting traditional Dengue surveillance in real time and at low cost. Tweets are able to successfully nowcast, i.e. estimate Dengue in the present week, and also forecast, i.e. predict Dengue up to 8 weeks in the future, both at country and city level, with high estimation capacity.
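A minimal version of such a nowcast/forecast model is a lagged regression of reported cases on tweet counts; the synthetic weekly series and the tweets-only predictor set below are illustrative, not the study's fitted model:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic weekly dengue cases and geolocated dengue tweets; invented.
rng = np.random.default_rng(6)
weeks = 156
cases = 200.0 + 150.0 * np.sin(2 * np.pi * np.arange(weeks) / 52) \
        + rng.normal(0.0, 20.0, weeks)
tweets = 0.5 * cases + rng.normal(0.0, 15.0, weeks)    # tweets track cases

def fit_model(h):
    """Regress cases h weeks ahead on this week's tweet count."""
    X = tweets[:weeks - h].reshape(-1, 1)
    return LinearRegression().fit(X, cases[h:])

nowcast = fit_model(0)     # estimate cases in the same week
forecast8 = fit_model(8)   # predict cases 8 weeks ahead
```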
ERIC Educational Resources Information Center
Hudson, Barclay M.
Descriptions of models for policy analysis in future studies are presented. Separate sections of the paper focus on the need for appropriate technologies of social science in future studies, a description of "compact policy assessment" (CPA), and a comparison of two CPA methods, Compass and Delphi. Compact policy assessment refers to any low-cost,…
Assimilating uncertain, dynamic and intermittent streamflow observations in hydrological models
NASA Astrophysics Data System (ADS)
Mazzoleni, Maurizio; Alfonso, Leonardo; Chacon-Hurtado, Juan; Solomatine, Dimitri
2015-09-01
Catastrophic floods cause significant socio-economic losses. Non-structural measures, such as real-time flood forecasting, can potentially reduce flood risk. To this end, data assimilation methods have been used to improve flood forecasts by integrating static ground observations, and in some cases also remote sensing observations, within water models. Current hydrologic and hydraulic research considers assimilation of observations coming from traditional, static sensors. At the same time, low-cost mobile sensors and mobile communication devices are also becoming increasingly available. The main goal and innovation of this study is to demonstrate the usefulness of assimilating uncertain streamflow observations that are dynamic in space and intermittent in time, in the context of two different semi-distributed hydrological model structures. The developed method is applied to the Brue basin, where the dynamic observations are imitated by synthetic observations of discharge. The results of this study show how model structures and sensor locations affect the assimilation of streamflow observations in different ways. In addition, it demonstrates that assimilating such uncertain observations from dynamic sensors can provide model improvements similar to those obtained from streamflow observations coming from a non-optimal network of static physical sensors. This is a potential application of recent efforts to build citizen observatories of water, which can make citizens an active part of information capture, evaluation and communication, simultaneously helping to improve model-based flood forecasting.
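A minimal sketch (not the paper's code) of the core idea of assimilating intermittent observations: a scalar Kalman filter in which the analysis update is simply skipped at times when no mobile-sensor reading is available, so the state evolves on the model alone between observations.

```python
import numpy as np

def assimilate(x0, P0, obs, obs_var, model_var, phi=0.9):
    """obs[t] is a float or None (sensor absent); returns analysis states."""
    x, P, out = x0, P0, []
    for y in obs:
        x, P = phi * x, phi**2 * P + model_var    # model forecast step
        if y is not None:                         # observation available?
            K = P / (P + obs_var)                 # Kalman gain
            x, P = x + K * (y - x), (1 - K) * P   # analysis update
        out.append(x)
    return np.array(out)

# Discharge-like toy series with gaps where the mobile sensor was elsewhere.
states = assimilate(10.0, 4.0, [12.0, None, None, 9.5, None, 11.0],
                    obs_var=1.0, model_var=0.5)
print(states)
```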
A High Resolution Tropical Cyclone Power Outage Forecasting Model for the Continental United States
NASA Astrophysics Data System (ADS)
Pino, J. V.; Quiring, S. M.; Guikema, S.; Shashaani, S.; Linger, S.; Backhaus, S.
2017-12-01
Tropical cyclones cause extensive damage to the power infrastructure system throughout the United States. This damage can leave millions without power for extended periods of time, as most recently seen with Hurricane Matthew (2016). Accurate and timely prediction of power outages is essential for utility companies, emergency management agencies, and governmental organizations. Here we present a high-resolution (250 m x 250 m) hurricane power outage model for the United States. The model uses only publicly-available data to make predictions. It uses forecasts of storm variables such as maximum 3-second wind gust, duration of strong winds > 20 m s-1, soil moisture, and precipitation. It also incorporates static environmental variables such as elevation characteristics, land cover type, population density, tree species data, and root zone depth. A web tool was established for use by the Department of Energy (DOE) so that the model can be used for real-time outage forecasting or for synthetic tropical cyclones as an exercise in emergency management. This web tool provides DOE decision-makers with high-impact analytic results and products that can be disseminated to federal, local, and state agencies. The results then aid utility companies in their pre- and post-storm activities, thus decreasing restoration times and lowering costs.
Worldwide satellite market demand forecast
NASA Technical Reports Server (NTRS)
Bowyer, J. M.; Frankfort, M.; Steinnagel, K. M.
1981-01-01
The forecast is for the years 1981 - 2000 with benchmark years at 1985, 1990 and 2000. Two types of markets are considered for this study: Hardware (worldwide total) - satellites, earth stations and control facilities (includes replacements and spares); and non-hardware (addressable by U.S. industry) - planning, launch, turnkey systems and operations. These markets were examined for the INTELSAT System (international systems and domestic and regional systems using leased transponders) and domestic and regional systems. Forecasts were determined for six worldwide regions encompassing 185 countries using actual costs for existing equipment and engineering estimates of costs for advanced systems. Most likely (conservative growth rate estimates) and optimistic (mid range growth rate estimates) scenarios were employed for arriving at the forecasts, which are presented in constant 1980 U.S. dollars. The worldwide satellite market demand forecast predicts that the market between 1981 and 2000 will range from $35 to $50 billion. Approximately one-half of the world market, $16 to $20 billion, will be generated in the United States.
Worldwide satellite market demand forecast
NASA Astrophysics Data System (ADS)
Bowyer, J. M.; Frankfort, M.; Steinnagel, K. M.
1981-06-01
The forecast is for the years 1981 - 2000 with benchmark years at 1985, 1990 and 2000. Two types of markets are considered for this study: Hardware (worldwide total) - satellites, earth stations and control facilities (includes replacements and spares); and non-hardware (addressable by U.S. industry) - planning, launch, turnkey systems and operations. These markets were examined for the INTELSAT System (international systems and domestic and regional systems using leased transponders) and domestic and regional systems. Forecasts were determined for six worldwide regions encompassing 185 countries using actual costs for existing equipment and engineering estimates of costs for advanced systems. Most likely (conservative growth rate estimates) and optimistic (mid range growth rate estimates) scenarios were employed for arriving at the forecasts, which are presented in constant 1980 U.S. dollars. The worldwide satellite market demand forecast predicts that the market between 1981 and 2000 will range from $35 to $50 billion. Approximately one-half of the world market, $16 to $20 billion, will be generated in the United States.
Using risk-adjustment models to identify high-cost risks.
Meenan, Richard T; Goodman, Michael J; Fishman, Paul A; Hornbrook, Mark C; O'Keeffe-Rosetti, Maureen C; Bachman, Donald J
2003-11-01
We examine the ability of various publicly available risk models to identify high-cost individuals and enrollee groups using multi-HMO administrative data. Five risk-adjustment models (the Global Risk-Adjustment Model [GRAM], Diagnostic Cost Groups [DCGs], Adjusted Clinical Groups [ACGs], RxRisk, and Prior-expense) were estimated on a multi-HMO administrative data set of 1.5 million individual-level observations for 1995-1996. Models produced distributions of individual-level annual expense forecasts for comparison to actual values. Prespecified "high-cost" thresholds were set within each distribution. The area under the receiver operating characteristic curve (AUC) for "high-cost" prevalences of 1% and 0.5% was calculated, as was the proportion of "high-cost" dollars correctly identified. Results are based on a separate 106,000-observation validation dataset. For "high-cost" prevalence targets of 1% and 0.5%, ACGs, DCGs, GRAM, and Prior-expense are very comparable in overall discrimination (AUCs, 0.83-0.86). Given a 0.5% prevalence target and a 0.5% prediction threshold, DCGs, GRAM, and Prior-expense captured $963,000 (approximately 3%) more "high-cost" sample dollars than other models. DCGs captured the most "high-cost" dollars among enrollees with asthma, diabetes, and depression; predictive performance among demographic groups (Medicaid members, members over 64, and children under 13) varied across models. Risk models can efficiently identify enrollees who are likely to generate future high costs and who could benefit from case management. The dollar value of improved prediction performance of the most accurate risk models should be meaningful to decision-makers and encourage their broader use for identifying high costs.
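A hedged sketch of the evaluation logic described above: score enrollees with a risk model, flag the top 0.5% of predictions as "high-cost", and compute both the AUC and the share of high-cost dollars captured. The data are synthetic and sklearn's roc_auc_score stands in for the paper's AUC computation.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 100_000
risk_score = rng.lognormal(7.0, 1.0, n)           # predicted annual expense
actual = risk_score * rng.lognormal(0.0, 0.8, n)  # noisy realized expense

threshold = np.quantile(actual, 0.995)            # 0.5% high-cost prevalence
is_high_cost = (actual >= threshold).astype(int)

print("AUC:", roc_auc_score(is_high_cost, risk_score))

# Share of high-cost dollars captured by the top 0.5% of predictions:
flagged = risk_score >= np.quantile(risk_score, 0.995)
captured = actual[flagged & (is_high_cost == 1)].sum()
print("high-cost dollars captured:", captured / actual[is_high_cost == 1].sum())
```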
The prediction of engineering cost for green buildings based on information entropy
NASA Astrophysics Data System (ADS)
Liang, Guoqiang; Huang, Jinglian
2018-03-01
Green building is a developing trend in the world building industry, and construction costs are an essential consideration in building construction. It is therefore necessary to investigate the problem of cost prediction in green building. On the basis of analyzing the costs of green building, this paper proposes a forecasting method for actual cost in green building based on information entropy and provides the forecasting working procedure. Using the probability density obtained from statistical data, such as labor costs, material costs, machinery costs, administration costs, profits, and risk costs from a unit project quotation, situations that lead to cost variations between budgeted cost and actual cost in construction can be predicted by estimating the information entropy of budgeted cost and actual cost. The results of this article have practical significance for cost control of green building, and the method proposed here can be generalized and applied to a variety of other aspects of building management.
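A minimal illustration (assumed, not from the paper) of the entropy comparison: estimate Shannon entropy from empirical budgeted and actual cost distributions; a larger actual-cost entropy signals more dispersed, harder-to-control cost outcomes.

```python
import numpy as np

def shannon_entropy(samples, bins=20):
    """Entropy of an empirical cost distribution, in nats."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(2)
budgeted = rng.normal(100.0, 5.0, 5000)  # tight planned unit-cost quotation
actual = rng.normal(103.0, 12.0, 5000)   # realized costs: shifted and wider

print("H(budgeted):", shannon_entropy(budgeted))
print("H(actual):  ", shannon_entropy(actual))
```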
NASA Astrophysics Data System (ADS)
Pillosu, F. M.; Hewson, T.; Mazzetti, C.
2017-12-01
Prediction of local extreme rainfall has historically been the remit of nowcasting and high resolution limited area modelling, which represent only limited areas, may not be spatially accurate, give reasonable results only for limited lead times (<2 days) and become prohibitively expensive at global scale. ECMWF/EFAS/GLOFAS have developed a novel, cost-effective and physically-based statistical post-processing software ("ecPoint-Rainfall, ecPR", operational in 2017) that uses ECMWF Ensemble (ENS) output to deliver global probabilistic rainfall forecasts for points up to day 10. Firstly, ecPR applies a new notion of "remote calibration", which 1) allows us to replicate a multi-centennial training period using only one year of data, and 2) provides forecasts for anywhere in the world. Secondly, the software applies an understanding of how different rainfall generation mechanisms lead to different degrees of sub-grid variability in rainfall totals, and of where biases in the model can be improved upon. A long-term verification has shown that the post-processed rainfall has better reliability and resolution at every lead time compared with ENS, and for large totals, ecPR outputs have the same skill at day 5 that the raw ENS has at day 1 (ROC area metric). ecPR could be used as input for hydrological models if its probabilistic output is modified according to the input requirements of hydrological models. Indeed, ecPR does not provide information on where the highest total is likely to occur inside the gridbox, nor on the spatial distribution of rainfall values nearby. "Scenario forecasts" could be a solution. They are derived by locating the rainfall peak in sensitive positions (e.g. urban areas), and then redistributing the remaining quantities in the gridbox, modifying traditional spatial correlation characterization methodologies (e.g. variogram analysis) to take account, for instance, of the type of rainfall forecast (stratiform, convective). Such an approach could be a turning point in the field of medium-range global real-time riverine flood forecasts. This presentation will illustrate for ecPR 1) system calibration, 2) operational implementation, 3) long-term verification, 4) future developments, and 5) early ideas for the application of ecPR outputs in hydrological models.
Strategies to reduce the complexity of hydrologic data assimilation for high-dimensional models
NASA Astrophysics Data System (ADS)
Hernandez, F.; Liang, X.
2017-12-01
Probabilistic forecasts in the geosciences offer invaluable information by allowing to estimate the uncertainty of predicted conditions (including threats like floods and droughts). However, while forecast systems based on modern data assimilation algorithms are capable of producing multi-variate probability distributions of future conditions, the computational resources required to fully characterize the dependencies between the model's state variables render their applicability impractical for high-resolution cases. This occurs because of the quadratic space complexity of storing the covariance matrices that encode these dependencies and the cubic time complexity of performing inference operations with them. In this work we introduce two complementary strategies to reduce the size of the covariance matrices that are at the heart of Bayesian assimilation methods—like some variants of (ensemble) Kalman filters and of particle filters—and variational methods. The first strategy involves the optimized grouping of state variables by clustering individual cells of the model into "super-cells." A dynamic fuzzy clustering approach is used to take into account the states (e.g., soil moisture) and forcings (e.g., precipitation) of each cell at each time step. The second strategy consists in finding a compressed representation of the covariance matrix that still encodes the most relevant information but that can be more efficiently stored and processed. A learning and a belief-propagation inference algorithm are developed to take advantage of this modified low-rank representation. The two proposed strategies are incorporated into OPTIMISTS, a state-of-the-art hybrid Bayesian/variational data assimilation algorithm, and comparative streamflow forecasting tests are performed using two watersheds modeled with the Distributed Hydrology Soil Vegetation Model (DHSVM). Contrasts are made between the efficiency gains and forecast accuracy losses of each strategy used in isolation, and of those achieved through their coupling. We expect these developments to help catalyze improvements in the predictive accuracy of large-scale forecasting operations by lowering the costs of deploying advanced data assimilation techniques.
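A minimal sketch of the second strategy under stated assumptions: compress a state covariance matrix to a rank-k factorization via its leading eigenpairs, so storage drops from O(n^2) to O(nk). This is not the OPTIMISTS code; it only illustrates the low-rank representation the abstract describes.

```python
import numpy as np

def compress_covariance(P, k):
    """Return a low-rank factor F with P ~ F @ F.T (k leading eigenpairs)."""
    vals, vecs = np.linalg.eigh(P)        # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:k]      # keep the k largest
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

rng = np.random.default_rng(3)
A = rng.normal(size=(200, 10))
P = A @ A.T + 1e-3 * np.eye(200)          # near-low-rank covariance (n = 200)

F = compress_covariance(P, k=10)          # 200 x 10 factor instead of 200 x 200
print("relative error:", np.linalg.norm(P - F @ F.T) / np.linalg.norm(P))
```

The same factor F can be carried through Kalman-type updates, which is what makes inference cheaper than working with the full matrix.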
PLS Road surface temperature forecast for susceptibility of ice occurrence
NASA Astrophysics Data System (ADS)
Marchetti, Mario; Khalifa, Abderrhamen; Bues, Michel
2014-05-01
Winter maintenance relies on many operational tools that monitor atmospheric and pavement physical parameters. Among them, road weather information systems (RWIS) and thermal mapping are most used by services in charge of managing infrastructure networks. Data from RWIS and thermal mapping are considered as inputs for forecasting physical numerical models, commonly in place since the 80s. These numerical models require an accurate description of the infrastructure, such as pavement layers and sub-layers, along with many meteorological parameters, such as air temperature and global and infrared radiation. The description is sometimes only partially known, and meteorological data are only monitored at specific spots. On the other hand, thermal mapping is now an easy, reliable and cost-effective way to monitor road surface temperature (RST) and many meteorological parameters all along the routes of infrastructure networks, including with a whole fleet of vehicles in the specific cases of roads or airports. The technique uses infrared thermometry to measure RST and atmospheric probes for air temperature, relative humidity, wind speed and global radiation, both at high-resolution intervals, to identify sections of the road network prone to ice occurrence. However, measurements are time-consuming, and the data from thermal mapping are one input among others to establish the forecast. The idea was to build a reliable forecast solely on data from thermal mapping. Previous work has established the interest of using principal component analysis (PCA) on the basis of a reduced number of thermal fingerprints. The work presented here focuses on the use of partial least-squares regression (PLS) to build an RST forecast with air temperature measurements. Roads with various environments, weather conditions (mainly clear and cloudy) and seasons were monitored over several months to generate an appropriate number of samples. The study was conducted to determine the minimum number of samples needed for a reliable forecast, considering that inputs for numerical models do not exceed five thermal fingerprints. Results have shown that the PLS model could achieve an R² of 0.9562, an RMSEP of 1.34 and a bias of -0.66. The same model applied to forecast past events indicates an average difference between measurements and forecasts of 0.20 °C. The advantage of such an approach is its potential application not only to winter events, but also to extreme summer ones such as urban heat islands.
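A hedged sketch of the PLS step: regress road surface temperature on thermal-fingerprint and air-temperature predictors with scikit-learn's PLSRegression, then report R², RMSEP and bias as in the abstract. The data and variable names are synthetic and illustrative only.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n, p = 300, 8                             # samples x predictors (fingerprints, air T, ...)
X = rng.normal(size=(n, p))
rst = X @ rng.normal(size=p) + rng.normal(0, 0.5, n)  # road surface temperature

X_tr, X_te, y_tr, y_te = train_test_split(X, rst, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()

print("R2:   ", pls.score(X_te, y_te))
print("RMSEP:", np.sqrt(np.mean((pred - y_te) ** 2)))
print("bias: ", np.mean(pred - y_te))
```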
[Forecast of costs of ecodependent cancer treatment for the development of management decisions].
Krasovskiy, V O
2014-01-01
A methodical approach for the probabilistic forecasting and differentiation of treatment costs for ecodependent cancer cases has been elaborated. The approach is useful in organizing medical aid for cancer patients, in developing management decisions to reduce the occupational load on the population, and in solving problems of compensating the population for economic and social losses caused by industrial plants.
Swainson's Thrushes do not show strong wind selectivity prior to crossing the Gulf of Mexico.
Bolus, Rachel T; Diehl, Robert H; Moore, Frank R; Deppe, Jill L; Ward, Michael P; Smolinsky, Jaclyn; Zenzal, Theodore J
2017-10-27
During long-distance fall migrations, nocturnally migrating Swainson's Thrushes often stop on the northern Gulf of Mexico coast before flying across the Gulf. To minimize energetic costs, trans-Gulf migrants should stop over when they encounter crosswinds or headwinds, and depart with supportive tailwinds. However, time-constrained migrants should be less selective, balancing the costs of headwinds against the benefits of continuing their migrations. To test the hypotheses that birds select supportive winds and that selectivity is mediated by seasonal time constraints, we examined whether local winds affected Swainson's Thrushes' arrival and departure at Ft. Morgan, Alabama, USA at annual, seasonal, and nightly time scales. Additionally, migrants could benefit from forecasting future wind conditions, crossing on nights when winds are consistently supportive across the Gulf, thereby avoiding the potentially lethal consequences of depleting their energetic reserves over water. To test whether birds forecast, we developed a movement model, calculated to what extent departure winds were predictive of future Gulf winds, and tested whether birds responded to predictability. Swainson's Thrushes were only slightly selective and did not appear to forecast. By following the simple rule of avoiding only the strongest headwinds at departure, Swainson's Thrushes could survive the 1500 km flight between Alabama and Veracruz, Mexico.
Forecasting Global Point Rainfall using ECMWF's Ensemble Forecasting System
NASA Astrophysics Data System (ADS)
Pillosu, Fatima; Hewson, Timothy; Zsoter, Ervin; Baugh, Calum
2017-04-01
ECMWF (the European Centre for Medium range Weather Forecasts), in collaboration with the EFAS (European Flood Awareness System) and GLOFAS (GLObal Flood Awareness System) teams, has developed a new operational system that post-processes grid box rainfall forecasts from its ensemble forecasting system to provide global probabilistic point-rainfall predictions. The project attains higher forecasting skill by applying an understanding of how different rainfall generation mechanisms lead to different degrees of sub-grid variability in rainfall totals. In turn this approach facilitates identification of cases in which very localized extreme totals are much more likely. This approach also aims to improve the rainfall input required in different hydro-meteorological applications. Flash flood forecasting, in particular in urban areas, is a good example. In flash flood scenarios precipitation is typically characterised by high spatial variability and response times are short. In this case, to move beyond radar-based nowcasting, the classical approach has been to use very high resolution hydro-meteorological models. Of course these models are valuable, but they can represent only very limited areas, may not be spatially accurate and may give reasonable results only for limited lead times. On the other hand, our method uses a very cost-effective approach to downscale global rainfall forecasts to a point scale. It needs only rainfall totals from standard global reporting stations and forecasts over a relatively short period to train it, and it can give good results even up to day 5. For these reasons we believe that this approach better satisfies user needs around the world. This presentation aims to describe two phases of the project: The first phase, already completed, is the implementation of this new system to provide 6- and 12-hourly point-rainfall accumulation probabilities. To do this we use a limited number of physically relevant global model parameters (i.e. convective precipitation ratio, speed of steering winds, CAPE - Convective Available Potential Energy - and solar radiation), alongside the rainfall forecasts themselves, to define the "weather types" that in turn define the expected sub-grid variability. The calibration and computational strategy intrinsic to the system will be illustrated. The quality of the global point rainfall forecasts is also illustrated by analysing recent case studies in which extreme totals and a greatly elevated flash flood risk could be foreseen some days in advance, and especially by a longer-term verification that arises out of retrospective global point rainfall forecasting for 2016. The second phase, currently in development, is focussing on the relationships with other relevant geographical aspects, for instance orography and coastlines. Preliminary results will be presented. These are promising but need further study to fully understand their impact on the spatial distribution of point rainfall totals.
NASA Astrophysics Data System (ADS)
Liu, Zhiquan; Liu, Quanhua; Lin, Hui-Chuan; Schwartz, Craig S.; Lee, Yen-Huei; Wang, Tijian
2011-12-01
Assimilation of the Moderate Resolution Imaging Spectroradiometer (MODIS) total aerosol optical depth (AOD) retrieval products (at 550 nm wavelength) from both Terra and Aqua satellites have been developed within the National Centers for Environmental Prediction (NCEP) Gridpoint Statistical Interpolation (GSI) three-dimensional variational (3DVAR) data assimilation system. This newly developed algorithm allows, in a one-step procedure, the analysis of 3-D mass concentration of 14 aerosol variables from the Goddard Chemistry Aerosol Radiation and Transport (GOCART) module. The Community Radiative Transfer Model (CRTM) was extended to calculate AOD using GOCART aerosol variables as input. Both the AOD forward model and corresponding Jacobian model were developed within the CRTM and used in the 3DVAR minimization algorithm to compute the AOD cost function and its gradient with respect to 3-D aerosol mass concentration. The impact of MODIS AOD data assimilation was demonstrated by application to a dust storm from 17 to 24 March 2010 over East Asia. The aerosol analyses initialized Weather Research and Forecasting/Chemistry (WRF/Chem) model forecasts. Results indicate that assimilating MODIS AOD substantially improves aerosol analyses and subsequent forecasts when compared to MODIS AOD, independent AOD observations from the Aerosol Robotic Network (AERONET) and Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) instrument, and surface PM10 (particulate matter with diameters less than 10 μm) observations. The newly developed AOD data assimilation system can serve as a tool to improve simulations of dust storms and general air quality analyses and forecasts.
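A minimal 3DVAR sketch under stated assumptions: the standard incremental cost function J(x) = 1/2 (x - xb)' B⁻¹ (x - xb) + 1/2 (Hx - y)' R⁻¹ (Hx - y) and its gradient, with a linear operator H standing in for the CRTM AOD forward and Jacobian models described above. All matrices and data here are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

n, m = 50, 10                             # state and observation sizes
rng = np.random.default_rng(5)
B = np.eye(n) * 0.5                       # background error covariance
R = np.eye(m) * 0.1                       # observation error covariance
H = rng.normal(size=(m, n)) / np.sqrt(n)  # linearized forward operator (stand-in for CRTM)
xb = rng.normal(size=n)                   # background aerosol state (first guess)
y = H @ (xb + rng.normal(0, 0.3, n))      # synthetic AOD observations

Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)

def J(x):
    db, do = x - xb, H @ x - y
    return 0.5 * db @ Binv @ db + 0.5 * do @ Rinv @ do

def gradJ(x):
    return Binv @ (x - xb) + H.T @ Rinv @ (H @ x - y)

xa = minimize(J, xb, jac=gradJ, method="L-BFGS-B").x  # analysis state
print("J(xb) =", J(xb), " J(xa) =", J(xa))
```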
Forecasting Lightning Threat using Cloud-Resolving Model Simulations
NASA Technical Reports Server (NTRS)
McCaul, Eugene W., Jr.; Goodman, Steven J.; LaCasse, Katherine M.; Cecil, Daniel J.
2008-01-01
Two new approaches are proposed and developed for making time and space dependent, quantitative short-term forecasts of lightning threat, and a blend of these approaches is devised that capitalizes on the strengths of each. The new methods are distinctive in that they are based entirely on the ice-phase hydrometeor fields generated by regional cloud-resolving numerical simulations, such as those produced by the WRF model. These methods are justified by established observational evidence linking aspects of the precipitating ice hydrometeor fields to total flash rates. The methods are straightforward and easy to implement, and offer an effective near-term alternative to the incorporation of complex and costly cloud electrification schemes into numerical models. One method is based on upward fluxes of precipitating ice hydrometeors in the mixed phase region at the -15 °C level, while the second method is based on the vertically integrated amounts of ice hydrometeors in each model grid column. Each method can be calibrated by comparing domain-wide statistics of the peak values of simulated flash rate proxy fields against domain-wide peak total lightning flash rate density data from observations. Tests show that the first method is able to capture much of the temporal variability of the lightning threat, while the second method does a better job of depicting the areal coverage of the threat. Our blended solution is designed to retain most of the temporal sensitivity of the first method, while adding the improved spatial coverage of the second. Exploratory tests for selected North Alabama cases show that, because WRF can distinguish the general character of most convective events, our methods show promise as a means of generating quantitatively realistic fields of lightning threat. However, because the models tend to have more difficulty in predicting the instantaneous placement of storms, forecasts of the detailed location of the lightning threat based on single simulations can be in error. Although these model shortcomings presently limit the precision of lightning threat forecasts from individual runs of current generation models, the techniques proposed herein should continue to be applicable as newer and more accurate physically-based model versions, physical parameterizations, initialization techniques and ensembles of forecasts become available.
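A schematic sketch (assumed, not the paper's code) of the two proxies and their blend: threat1 from the upward ice flux at the -15 °C level, threat2 from vertically integrated ice, each scaled by a calibration constant that would in practice come from matching peak observed flash rates.

```python
import numpy as np

def lightning_threat(w_m15, q_ice_m15, q_ice_column, dz, c1, c2, alpha=0.5):
    """w_m15: updraft at -15 C (m/s); q_ice_*: ice mixing ratios (kg/kg)."""
    threat1 = c1 * w_m15 * q_ice_m15                 # upward graupel/ice flux proxy
    threat2 = c2 * (q_ice_column * dz).sum(axis=0)   # vertically integrated ice proxy
    return alpha * threat1 + (1 - alpha) * threat2   # blended threat field

rng = np.random.default_rng(6)
nz, ny, nx = 20, 4, 4                                # toy model grid
w = rng.uniform(0, 10, (ny, nx))
q15 = rng.uniform(0, 5e-3, (ny, nx))
qcol = rng.uniform(0, 5e-3, (nz, ny, nx))
print(lightning_threat(w, q15, qcol, dz=500.0, c1=1.0, c2=10.0).round(2))
```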
NASA Astrophysics Data System (ADS)
Medina, Hanoi; Tian, Di; Srivastava, Puneet; Pelosi, Anna; Chirico, Giovanni B.
2018-07-01
Reference evapotranspiration (ET0) plays a fundamental role in agronomic, forestry, and water resources management. Estimating and forecasting ET0 have long been recognized as a major challenge for researchers and practitioners in these communities. This work explored the potential of multiple leading numerical weather predictions (NWPs) for estimating and forecasting summer ET0 at 101 U.S. Regional Climate Reference Network stations over nine climate regions across the contiguous United States (CONUS). Three leading global NWP model forecasts from the THORPEX Interactive Grand Global Ensemble (TIGGE) dataset were used in this study, including the single model ensemble forecasts from the European Centre for Medium-Range Weather Forecasts (EC), the National Centers for Environmental Prediction Global Forecast System (NCEP), and the United Kingdom Meteorological Office forecasts (MO), as well as multi-model ensemble forecasts from the combinations of these NWP models. A regression calibration was employed to bias correct the ET0 forecasts. The impact of individual forecast variables on ET0 forecasts was also evaluated. The results showed that the EC forecasts provided the least error and highest skill and reliability, followed by the MO and NCEP forecasts. The multi-model ensembles constructed from the combination of EC and MO forecasts provided slightly better performance than the single model EC forecasts. The regression process greatly improved ET0 forecast performance, particularly for regions with stations near the coast or with complex orography. The performance of EC forecasts was only slightly influenced by the size of the ensemble, particularly at short lead times. Even with fewer ensemble members, EC still performed better than the other two NWPs. Errors in the radiation forecasts, followed by those in the wind, had the most detrimental effects on ET0 forecast performance.
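A minimal sketch of the regression calibration described above, under the assumption that it takes the common linear form obs = a + b * raw fitted on a training window and applied to later forecasts; the coefficients and data are illustrative.

```python
import numpy as np

def calibrate(raw_train, obs_train):
    """Least-squares fit obs = a + b * raw; returns a correction function."""
    b, a = np.polyfit(raw_train, obs_train, 1)
    return lambda raw: a + b * raw

rng = np.random.default_rng(7)
raw = rng.uniform(2, 8, 200)                     # raw ensemble-mean ET0 (mm/day)
obs = 0.9 * raw + 0.6 + rng.normal(0, 0.3, 200)  # station-based ET0

correct = calibrate(raw[:150], obs[:150])        # train on the first 150 days
rmse_raw = np.sqrt(np.mean((raw[150:] - obs[150:]) ** 2))
rmse_cal = np.sqrt(np.mean((correct(raw[150:]) - obs[150:]) ** 2))
print(f"RMSE raw {rmse_raw:.2f} -> calibrated {rmse_cal:.2f}")
```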
Status of Air Quality in Central California and Needs for Further Study
NASA Astrophysics Data System (ADS)
Tanrikulu, S.; Beaver, S.; Soong, S.; Tran, C.; Jia, Y.; Matsuoka, J.; McNider, R. T.; Biazar, A. P.; Palazoglu, A.; Lee, P.; Wang, J.; Kang, D.; Aneja, V. P.
2012-12-01
Ozone and PM2.5 levels frequently exceed NAAQS in central California (CC). Additional emission reductions are needed to attain and maintain the standards there. Agencies are developing cost-effective emission control strategies along with complementary incentive programs to reduce emissions when exceedances are forecasted. These approaches require accurate modeling and forecasting capabilities. A variety of models have been rigorously applied (MM5, WRF, CMAQ, CAMx) over CC. Despite the vast amount of land-based measurements from special field programs and significant effort, models have historically exhibited marginal performance. Satellite data may improve model performance by: establishing IC/BC over outlying areas of the modeling domain having unknown conditions; enabling FDDA over the Pacific Ocean to characterize important marine inflows and pollutant outflows; and filling in the gaps of the land-based monitoring network. BAAQMD, in collaboration with the NASA AQAST, plans to conduct four studies that include satellite-based data in CC air quality analysis and modeling: The first project enhances and refines weather patterns, especially aloft, impacting summer ozone formation. Surface analyses were unable to characterize the strong attenuating effect of the complex terrain to steer marine winds impinging on the continent. The dense summer clouds and fog over the Pacific Ocean form spatial patterns that can be related to the downstream air flows through polluted areas. The goal of this project is to explore, characterize, and quantify these relationships using cloud cover data. Specifically, cloud agreement statistics will be developed using satellite data and model clouds. Model skin temperature predictions will be compared to both MODIS and GOES skin temperatures. The second project evaluates and improves the initial and simulated fields of meteorological models that provide inputs to air quality models. The study will attempt to determine whether a cloud dynamical adjustment developed by UAHuntsville can improve model performance for maritime stratus and whether a moisture adjustment scheme in the Pleim-Xiu boundary layer scheme can use satellite data in place of coarse surface air temperature measurements. The goal is to improve meteorological model performance that leads to improved air quality model performance. The third project evaluates and improves forecasting skills of the National Air Quality Forecasting Model in CC by using land-based routine measurements as well as satellite data. Local forecasts are mostly based on surface meteorological and air quality measurements and weather charts provided by NWS. The goal is to improve the average accuracy in forecasting exceedances, which is around 60%. The fourth project uses satellite data for monitoring trends in fine particulate matter (PM2.5) in the San Francisco Bay Area. It evaluates the effectiveness of a rule adopted in 2008 that restricts household wood burning on days forecasted to have high PM2.5 levels. The goal is to complement current analyses based on surface data covering the largest sub-regions and population centers. The overall goal is to use satellite data to overcome limitations of land-based measurements. The outcomes will be further conceptual understanding of pollutant formation, improved regulatory model performance, and better optimized forecasting programs.
Araz, Ozgur M; Bentley, Dan; Muelleman, Robert L
2014-09-01
Emergency department (ED) visits increase during influenza seasons. It is essential to identify statistically significant correlates in order to develop an accurate forecasting model for ED visits. Forecasting influenza-like-illness (ILI)-related ED visits can significantly help in developing robust resource management strategies at EDs. We first performed correlation analyses to understand temporal correlations between several predictors of ILI-related ED visits. We used the data available for Douglas County, the biggest county in Nebraska, for Omaha, the biggest city in the state, and for a major hospital in Omaha. The data set included total and positive influenza test results from the hospital (ie, Antigen rapid (Ag) and Respiratory Syncytial Virus Infection (RSV) tests); an Internet-based influenza surveillance system data, that is, Google Flu Trends, for both Nebraska and Omaha; total ED visits in Douglas County attributable to ILI; and ILI surveillance network data for Douglas County and Nebraska as the predictors, and data for the hospital's ILI-related ED visits as the dependent variable. We used Seasonal Autoregressive Integrated Moving Average and Holt-Winters methods with 3 linear regression models to forecast ILI-related ED visits at the hospital and evaluated model performances by comparing the root mean square errors (RMSEs). Because of strong positive correlations with ILI-related ED visits between 2008 and 2012, we validated the use of Google Flu Trends data as a predictor in an ED influenza surveillance tool. Of the 5 forecasting models we tested, linear regression models performed significantly better when Google Flu Trends data were included as a predictor. Regression models including Google Flu Trends data as a predictor variable have lower RMSE, and the lowest is achieved when all other variables are also included in the model in our forecasting experiments for the first 5 weeks of 2013 (with RMSE = 57.61). Google Flu Trends data statistically improve the performance of predicting ILI-related ED visits in Douglas County, and this result can be generalized to other communities. Timely and accurate estimates of ED volume during the influenza season, as well as during pandemic outbreaks, can help hospitals plan their ED resources accordingly and lower their costs by optimizing supplies and staffing, and can improve service quality by decreasing ED wait times and overcrowding.
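A hedged sketch of the model comparison the abstract describes: a seasonal ARIMA baseline versus a linear regression that adds a Google-Flu-Trends-like exogenous predictor, compared by RMSE on a 5-week holdout. The series are synthetic; this is not the study's data or code.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(8)
weeks = 156
season = 10 * np.sin(2 * np.pi * np.arange(weeks) / 52)
gft = season + rng.normal(0, 1, weeks)              # Google Flu Trends proxy
ed_visits = 50 + 2 * gft + rng.normal(0, 3, weeks)  # ILI-related ED visits

train, test = slice(0, weeks - 5), slice(weeks - 5, weeks)

sarima = SARIMAX(ed_visits[train], order=(1, 0, 0),
                 seasonal_order=(1, 0, 0, 52)).fit(disp=False)
f_sarima = sarima.forecast(5)

b, a = np.polyfit(gft[train], ed_visits[train], 1)
f_reg = a + b * gft[test]                           # regression with GFT predictor

rmse = lambda f: np.sqrt(np.mean((f - ed_visits[test]) ** 2))
print("RMSE SARIMA:", rmse(f_sarima), " RMSE regression+GFT:", rmse(f_reg))
```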
Prediction of Winter Storm Tracks and Intensities Using the GFDL fvGFS Model
NASA Astrophysics Data System (ADS)
Rees, S.; Boaggio, K.; Marchok, T.; Morin, M.; Lin, S. J.
2017-12-01
The GFDL Finite-Volume Cubed-Sphere Dynamical core (FV3) is coupled to a modified version of the Global Forecast System (GFS) physics and initial conditions, to form the fvGFS model. This model is similar to the one being implemented as the next-generation operational weather model for the NWS, which is also FV3-powered. Much work has been done to verify fvGFS tropical cyclone prediction, but little has been done to verify winter storm prediction. These costly and dangerous storms impact parts of the U.S. every year. To verify winter storms we ran the NCEP operational cyclone tracker, developed at GFDL, on semi-real-time 13 km horizontal resolution fvGFS forecasts. We have found that fvGFS compares well to the operational GFS in storm track and intensity, though often predicts slightly higher intensities. This presentation will show the track and intensity verification from the past two winter seasons and explore possible reasons for bias.
Multi-Year Revenue and Expenditure Forecasting for Small Municipal Governments.
1981-03-01
Keywords: management audit; econometric revenue forecast; gap and impact analysis; deterministic expenditure forecast; municipal forecasting; municipal budget formulation. ...together with a multi-year revenue and expenditure forecasting model for the City of Monterey, California. The Monterey model includes an econometric revenue forecast, a forecast based on the econometric model, and a forecast based on expert judgment and trend analysis.
NASA Technical Reports Server (NTRS)
Sepehry-Fard, F.; Coulthard, Maurice H.
1995-01-01
The process of predicting the values of maintenance time-dependent variable parameters, such as mean time between failures (MTBF), over time must be one that will not in turn introduce uncontrolled deviation in the results of the ILS analysis, such as life cycle costs, spares calculation, etc. A minor deviation in the values of maintenance time-dependent variable parameters such as MTBF over time will have a significant impact on logistics resource demands, International Space Station availability, and maintenance support costs. There are two types of parameters in the logistics and maintenance world: a. fixed and b. variable. Fixed parameters, such as cost per man hour, are relatively easy to predict and forecast. These parameters normally follow a linear path and do not change randomly. However, the variable parameters subject to study in this report, such as MTBF, do not follow a linear path; they normally fall within the distribution curves discussed in this publication. The very challenging task then becomes the utilization of statistical techniques to accurately forecast future non-linear time-dependent variable arisings and events with a high confidence level. This, in turn, will translate into tremendous cost savings and improved availability all around.
Timetable of an operational flood forecasting system
NASA Astrophysics Data System (ADS)
Liechti, Katharina; Jaun, Simon; Zappa, Massimiliano
2010-05-01
At present a new underground part of Zurich main station is under construction. For this purpose the runoff capacity of the river Sihl, which passes beneath the main station, is reduced by 40%. If a flood threatens, the construction site is evacuated and gates can be opened for full runoff capacity to prevent greater damage. However, flooding the construction site, even in a controlled way, entails costs and delays. Evacuating the construction site at Zurich main station takes about 2 to 4 hours, and opening the gates takes another 1 to 2 hours each. In the upper part of the 336 km2 Sihl catchment lies the Sihl lake, a reservoir. It belongs to the Swiss Railway Company, which uses it for hydropower production. This lake can act as a retention basin for about 46% of the Sihl catchment. Lowering the lake level to gain retention capacity, and therewith safety, entails a direct loss for the Railway Company. To calculate the needed retention volume and the water to be released ahead of unfavourable weather conditions, forecasts with a minimum lead time of 2 to 3 days are needed. Since the catchment is rather small, this can only be realised through the use of meteorological forecast data. Thus the management of the construction site depends on accurate forecasts to base its decisions on. Therefore an operational hydrological ensemble prediction system (HEPS) was introduced in September 2008 by the Swiss Federal Institute for Forest, Snow and Landscape Research (WSL). It delivers daily discharge forecasts with a time horizon of 5 days. The meteorological forecasts are provided by MeteoSwiss and stem from the operational limited-area COSMO-LEPS, which downscales the ECMWF ensemble prediction system to a spatial resolution of 7 km. Additional meteorological data for model calibration and initialisation (air temperature, precipitation, water vapour pressure, global radiation, wind speed and sunshine duration) and radar data are also provided by MeteoSwiss. Additional meteorological and hydrological observations are provided by a hydropower company, the Canton of Zurich and the Federal Office for the Environment (FOEN). The hydrological forecast is calculated by the semi-distributed hydrological model PREVAH (Precipitation-Runoff-EVapotranspiration-HRU-related Model) and is further processed by the hydraulic model FLORIS. Finally the forecasts and alerts, along with additional meteorological and hydrological observations and forecasts from collaborating institutions, are sent to a webserver accessible to decision makers. We will document the setup of our operational flood forecasting system, evaluate its performance and show how the collaboration and communication between science and practice, including all the different interests, works for this particular example.
Selecting Single Model in Combination Forecasting Based on Cointegration Test and Encompassing Test
Jiang, Chuanjin; Zhang, Jing; Song, Fugen
2014-01-01
Combination forecasting takes all the characteristics of each single forecasting method into consideration and combines them to form a composite, which increases forecasting accuracy. Existing research on combination forecasting selects single models randomly, neglecting the internal characteristics of the forecasting object. After discussing the function of the cointegration test and the encompassing test in the selection of single models, supplemented by empirical analysis, the paper gives the following single-model selection guidance: no more than five suitable single models should be selected from the many alternative single models for a certain forecasting target, which increases accuracy and stability. PMID: 24892061
Selecting single model in combination forecasting based on cointegration test and encompassing test.
Jiang, Chuanjin; Zhang, Jing; Song, Fugen
2014-01-01
Combination forecasting takes all the characteristics of each single forecasting method into consideration and combines them to form a composite, which increases forecasting accuracy. Existing research on combination forecasting selects single models randomly, neglecting the internal characteristics of the forecasting object. After discussing the function of the cointegration test and the encompassing test in the selection of single models, supplemented by empirical analysis, the paper gives the following single-model selection guidance: no more than five suitable single models should be selected from the many alternative single models for a certain forecasting target, which increases accuracy and stability.
Optimizing Microgrid Architecture on Department of Defense Installations
2014-09-01
Acronyms: PPA, power purchase agreement; PV, photovoltaic; QDR, Quadrennial Defense Review; SNL, Sandia National Laboratory; SPIDERS, Smart Power Infrastructure... ...a MILP that dispatches fuel-based generators with consideration to an ensemble of forecasted inputs from renewable power sources, subject to physical... [Figure fragment: wind power project costs by region, 2012 projects, from [30].] Weather forecasts are often presented as a single prediction.
Water balance models in one-month-ahead streamflow forecasting
Alley, William M.
1985-01-01
Techniques are tested that incorporate information from water balance models in making 1-month-ahead streamflow forecasts in New Jersey. The results are compared to those based on simple autoregressive time series models. The relative performance of the models is dependent on the month of the year in question. The water balance models are most useful for forecasts of April and May flows. For the stations in northern New Jersey, the April and May forecasts were made in order of decreasing reliability using the water-balance-based approaches, using the historical monthly means, and using simple autoregressive models. The water balance models were useful to a lesser extent for forecasts during the fall months. For the rest of the year the improvements in forecasts over those obtained using the simpler autoregressive models were either very small or the simpler models provided better forecasts. When using the water balance models, monthly corrections for bias are found to improve minimum mean-square-error forecasts as well as to improve estimates of the forecast conditional distributions.
Evaluation Of Statistical Models For Forecast Errors From The HBV-Model
NASA Astrophysics Data System (ADS)
Engeland, K.; Kolberg, S.; Renard, B.; Stensland, I.
2009-04-01
Three statistical models for the forecast errors for inflow to the Langvatn reservoir in Northern Norway have been constructed and tested according to how well the distribution and median values of the forecast errors fit the observations. For the first model, observed and forecasted inflows were transformed by the Box-Cox transformation before a first-order autoregressive model was constructed for the forecast errors, with parameters conditioned on climatic conditions. In the second model, the Normal Quantile Transformation (NQT) was applied to observed and forecasted inflows before a similar first-order autoregressive model was constructed for the forecast errors. In the last model, positive and negative errors were modeled separately: the errors were first NQT-transformed, and then a model was constructed in which the mean values were conditioned on climate, forecasted inflow and the previous day's error. To test the three models we applied three criteria: we wanted a) the median values to be close to the observed values; b) the forecast intervals to be narrow; c) the distribution to be correct. The results showed that it is difficult to obtain a correct model for the forecast errors, and that the main challenge is to account for the auto-correlation in the errors. Models 1 and 2 gave similar results, and their main drawback is that the distributions are not correct. The 95% forecast intervals were well identified, but smaller forecast intervals were over-estimated, and larger intervals were under-estimated. Model 3 gave a distribution that fits better, but the median values do not fit well since the auto-correlation is not properly accounted for. If the 95% forecast interval is of interest, Model 2 is recommended. If the whole distribution is of interest, Model 3 is recommended.
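A minimal sketch of the first error model under stated assumptions: Box-Cox transform observed and forecasted inflows, then fit an AR(1) model to the transformed forecast errors (the climate conditioning of the parameters is omitted for brevity).

```python
import numpy as np

def boxcox(x, lam=0.3):
    """Box-Cox transform; lam is an assumed, illustrative parameter."""
    return (x**lam - 1) / lam if lam != 0 else np.log(x)

rng = np.random.default_rng(9)
obs = rng.gamma(4.0, 25.0, 400)             # observed inflows
fcst = obs * rng.lognormal(0.0, 0.15, 400)  # forecasted inflows

err = boxcox(fcst) - boxcox(obs)            # transformed forecast errors
e0, e1 = err[:-1], err[1:]
phi = (e0 @ e1) / (e0 @ e0)                 # AR(1) coefficient (least squares)
sigma = np.std(e1 - phi * e0)               # innovation standard deviation
print(f"AR(1): e_t = {phi:.2f} e_(t-1) + N(0, {sigma:.2f}^2)")
```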
Chance-Constrained Day-Ahead Hourly Scheduling in Distribution System Operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Huaiguang; Zhang, Yingchen; Muljadi, Eduard
This paper proposes a two-step approach for day-ahead hourly scheduling in distribution system operation, which considers two operation costs: the operation cost at the substation level and at the feeder level. In the first step, the objective is to minimize the electric power purchased from the day-ahead market using stochastic optimization. Historical data of day-ahead hourly electric power consumption are used to provide forecast results with a forecasting error, which is represented by a chance constraint and formulated into a deterministic form using a Gaussian mixture model (GMM). In the second step, the objective is to minimize the system loss. Considering the nonconvexity of the three-phase balanced AC optimal power flow problem in distribution systems, a second-order cone program (SOCP) is used to relax the problem. Then, a distributed optimization approach is built based on the alternating direction method of multipliers (ADMM). The results show the validity and effectiveness of the method.
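A minimal sketch of how a forecast-error chance constraint becomes deterministic: with Gaussian errors, P(load <= purchase) >= 1 - eps reduces to purchase >= mu + z(1-eps) * sigma. The paper uses a Gaussian mixture model; a single Gaussian is shown here for brevity (a GMM version would numerically invert the mixture CDF instead).

```python
from scipy.stats import norm

mu, sigma = 120.0, 8.0   # assumed forecast mean and std of hourly load (MW)
eps = 0.05               # allowed violation probability

purchase = mu + norm.ppf(1 - eps) * sigma  # deterministic equivalent
print(f"day-ahead purchase >= {purchase:.1f} MW")
```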
Short-term load and wind power forecasting using neural network-based prediction intervals.
Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas
2014-02-01
Electrical power systems are evolving from today's centralized bulk systems to more decentralized systems. Penetrations of renewable energies, such as wind and solar power, significantly increase the level of uncertainty in power systems. Accurate load forecasting becomes more complex, yet more important for management of power systems. Traditional methods for generating point forecasts of load demands cannot properly handle uncertainties in system operations. To quantify potential uncertainties associated with forecasts, this paper implements a neural network (NN)-based method for the construction of prediction intervals (PIs). A newly introduced method, called lower upper bound estimation (LUBE), is applied and extended to develop PIs using NN models. A new problem formulation is proposed, which translates the primary multiobjective problem into a constrained single-objective problem. Compared with the cost function, this new formulation is closer to the primary problem and has fewer parameters. Particle swarm optimization (PSO) integrated with the mutation operator is used to solve the problem. Electrical demands from Singapore and New South Wales (Australia), as well as wind power generation from Capital Wind Farm, are used to validate the PSO-based LUBE method. Comparative results show that the proposed method can construct higher quality PIs for load and wind power generation forecasts in a short time.
Advancing solar energy forecasting through the underlying physics
NASA Astrophysics Data System (ADS)
Yang, H.; Ghonima, M. S.; Zhong, X.; Ozge, B.; Kurtz, B.; Wu, E.; Mejia, F. A.; Zamora, M.; Wang, G.; Clemesha, R.; Norris, J. R.; Heus, T.; Kleissl, J. P.
2017-12-01
As solar power comprises an increasingly large portion of the energy generation mix, the ability to accurately forecast solar photovoltaic generation becomes increasingly important. Due to the variability of solar power caused by cloud cover, knowledge of both the magnitude and timing of expected solar power production ahead of time facilitates the integration of solar power onto the electric grid by reducing electricity generation from traditional ancillary generators such as gas and oil power plants, as well as decreasing the ramping of all generators, reducing start and shutdown costs, and minimizing solar power curtailment, thereby providing annual economic value. The time scales involved in both the energy markets and solar variability range from intra-hour to several days ahead. This wide range of time horizons led to the development of a multitude of techniques, with each offering unique advantages in specific applications. For example, sky imagery provides site-specific forecasts on the minute-scale. Statistical techniques including machine learning algorithms are commonly used in the intra-day forecast horizon for regional applications, while numerical weather prediction models can provide mesoscale forecasts on both the intra-day and days-ahead time scale. This talk will provide an overview of the challenges unique to each technique and highlight the advances in their ongoing development which come alongside advances in the fundamental physics underneath.
Accuracy of short‐term sea ice drift forecasts using a coupled ice‐ocean model
Zhang, Jinlun
2015-01-01
Arctic sea ice drift forecasts of 6 h–9 days for the summer of 2014 are generated using the Marginal Ice Zone Modeling and Assimilation System (MIZMAS); the model is driven by 6 h atmospheric forecasts from the Climate Forecast System (CFSv2). Forecast ice drift speed is compared to drifting buoys and other observational platforms. Forecast positions are compared with actual positions 24 h–8 days since forecast. Forecast results are further compared to those from the forecasts generated using an ice velocity climatology driven by multiyear integrations of the same model. The results are presented in the context of scheduling the acquisition of high‐resolution images that need to follow buoys or scientific research platforms. RMS errors for ice speed are on the order of 5 km/d for 24–48 h since forecast using the sea ice model compared with 9 km/d using climatology. Predicted buoy position RMS errors are 6.3 km for 24 h and 14 km for 72 h since forecast. Model biases in ice speed and direction can be reduced by adjusting the air drag coefficient and water turning angle, but the adjustments do not affect verification statistics. This suggests that improved atmospheric forecast forcing may further reduce the forecast errors. The model remains skillful for 8 days. Using the forecast model increases the probability of tracking a target drifting in sea ice with a 10 km × 10 km image from 60 to 95% for a 24 h forecast and from 27 to 73% for a 48 h forecast. PMID:27818852
Value-based resource management: a model for best value nursing care.
Caspers, Barbara A; Pickard, Beth
2013-01-01
With the health care environment shifting to a value-based payment system, Catholic Health Initiatives nursing leadership spearheaded an initiative with 14 hospitals to establish best nursing care at a lower cost. The implementation of technology-enabled business processes at point of care led to a new model for best value nursing care: Value-Based Resource Management. The new model integrates clinical patient data from the electronic medical record and embeds the new information in care team workflows for actionable real-time decision support and predictive forecasting. The participating hospitals reported increased patient satisfaction and cost savings in the reduction of overtime and improvement in length of stay management. New data generated by the initiative on nursing hours and cost by patient and by population (Medicare severity diagnosis-related groups), and patient health status outcomes across the acute care continuum expanded business intelligence for a value-based population health system.
Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation
NASA Astrophysics Data System (ADS)
Zhao, T.; Cai, X.; Yang, D.
2010-12-01
Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic forecast error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, there are various indices reflecting the magnitude of streamflow forecast uncertainty; meanwhile, few models illustrate the forecast uncertainty evolution process. This research introduces the Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecasts as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as the forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as efficient as ESF. Moreover, streamflow variability and reservoir capacity can change the magnitude of the effects of forecast uncertainty, but not the relative merit of DSF, DPSF, and ESF. [Figure: schematic diagram of the increase in forecast uncertainty with forecast lead time and the dynamic updating property of real-time streamflow forecasts.]
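A minimal sketch of the Martingale Model of Forecast Evolution (MMFE): the forecast of a fixed target period is updated at each stage by independent zero-mean increments, so the expected forecast never changes (the martingale property) while the forecast itself disperses toward the eventual realization. The variances here are illustrative.

```python
import numpy as np

def mmfe_paths(f0, stage_vars, n_paths, seed=0):
    """Simulate MMFE forecast evolution; stage_vars are update variances."""
    rng = np.random.default_rng(seed)
    updates = rng.normal(0.0, np.sqrt(stage_vars), (n_paths, len(stage_vars)))
    return f0 + np.cumsum(updates, axis=1)  # forecast after each update

paths = mmfe_paths(f0=100.0, stage_vars=[25.0, 16.0, 9.0, 4.0], n_paths=10000)
print("mean by stage:", paths.mean(axis=0).round(2))  # stays near 100 (martingale)
print("std  by stage:", paths.std(axis=0).round(2))   # grows as information arrives
```

The shrinking stage variances mimic the property noted above: the remaining uncertainty about the target flow decreases as the lead time shortens.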
DOE Office of Scientific and Technical Information (OSTI.GOV)
STADLER, MICHAEL; MASHAYEKH, SALMAN; DEFOREST, NICHOLAS
The ODC Microgrid Controller is an optimization-based model predictive microgrid controller (MPMC) that minimizes operation cost (and/or CO2 emissions) in a grid-connected microgrid. It is composed of several modules, including a) forecasting, b) optimization, c) data exchange, and d) power balancing modules. In the presence of a multi-layered control system architecture, these modules reside in the supervisory control layer.
A probabilistic analysis of silicon cost
NASA Technical Reports Server (NTRS)
Reiter, L. J.
1983-01-01
Silicon materials costs represent both a cost driver and an area where improvement can be made in the manufacture of photovoltaic modules. The costs of three processes for the production of low-cost silicon being developed under the U.S. Department of Energy's (DOE) National Photovoltaic Program are analyzed. The approach is based on probabilistic inputs and makes use of two models developed at the Jet Propulsion Laboratory: SIMRAND (SIMulation of Research ANd Development) and IPEG (Improved Price Estimating Guidelines). The approach, assumptions, and limitations are detailed, along with a verification of the cost analysis methodology. Results, presented in the form of cumulative probability distributions for silicon cost, indicate that there is a 55% chance of reaching the DOE target of $16/kg for silicon material. This is a technically achievable cost based on expert forecasts of the results of ongoing research and development and does not imply any market price for a given year.
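The probabilistic style of result quoted above (a cumulative probability of meeting a cost target) can be reproduced with a generic Monte Carlo sketch. The input distributions below are invented placeholders; the actual SIMRAND/IPEG parameterizations are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical probabilistic inputs for one silicon process (triangular
# distributions: left, mode, right). These are illustrative assumptions.
yield_frac = rng.triangular(0.70, 0.85, 0.95, n)   # process yield
energy_cost = rng.triangular(4.0, 6.0, 9.0, n)     # $/kg of silicon
other_cost = rng.triangular(5.0, 8.0, 14.0, n)     # $/kg (labor, capital, ...)

cost_per_kg = (energy_cost + other_cost) / yield_frac

# Probability of meeting a cost target, analogous in form to the
# "55% chance of reaching $16/kg" result in the abstract.
target = 16.0
print(f"P(cost <= ${target}/kg) = {(cost_per_kg <= target).mean():.2f}")
```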
Kaye, I David; Adrados, Murillo; Karia, Raj J; Protopsaltis, Themistocles S; Bosco, Joseph A
2017-11-01
Observational database review. To determine the effect of patient severity of illness (SOI) on the cost of spine surgery among New York state hospitals. National health care spending has risen at an unsustainable rate, with musculoskeletal care, and spine surgery in particular, accounting for a significant portion of this expenditure. In an effort towards cost containment, health care payers are exploring novel payment models, some of which reward cost savings but penalize excessive spending. To mitigate risk to health care institutions, accurate cost forecasting is essential. No studies have evaluated the effect of SOI on costs within spine surgery. The New York State Hospital Inpatient Cost Transparency Database was reviewed to determine the costs of 69,831 hospital discharges between 2009 and 2011 comprising the 3 most commonly performed spine surgeries in the state. These costs were then analyzed in the context of the specific all-patient-refined diagnosis-related group (DRG) SOI modifier to determine this index's effect on overall costs. Overall, hospital-reported cost increases with the patient's SOI class, and patients with worse baseline health incur greater hospital costs (P<0.001). Moreover, these costs are increasingly variable for each worsening SOI class (P<0.001). This trend of increasing costs is persistent for all 3 DRGs across all 3 years studied (2009-2011), within each of the 7 New York state regions, and occurs irrespective of the hospital's teaching status or size. Using the 3M all-patient-refined DRG SOI index as a measure of the patient's health status, a significant increase in the cost of spine surgery for patients with a higher SOI index was found. This study confirms the greater cost and variability of spine surgery for sicker patients and illustrates the inherent unpredictability in cost forecasting and budgeting for these same patients.
Integration of Behind-the-Meter PV Fleet Forecasts into Utility Grid System Operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoff, Thomas; Kankiewicz, Adam
Four major research objectives were completed over the course of this study. Three of the objectives were to evaluate three new, state-of-the-art solar irradiance forecasting models. The fourth objective was to improve the California Independent System Operator's (ISO) load forecasts by integrating behind-the-meter (BTM) PV forecasts. The three new solar irradiance forecasting models included: the infrared (IR) satellite-based cloud motion vector (CMV) model; the WRF-SolarCA model and variants; and the Optimized Deep Machine Learning (ODML)-training model. The first two forecasting models targeted known weaknesses in current operational solar forecasts. They were benchmarked against existing operational numerical weather prediction (NWP) forecasts, visible satellite CMV forecasts, and measured PV plant power production. The IR CMV, WRF-SolarCA, and ODML-training forecasting models all improved the forecast to a significant degree. Improvements varied depending on time of day, cloudiness index, and geographic location. The fourth objective was to demonstrate that the California ISO's load forecasts could be improved by integrating BTM PV forecasts. This objective represented the project's most exciting and applicable gains. Operational BTM forecasts consisting of 200,000+ individual rooftop PV forecasts were delivered into the California ISO's real-time automated load forecasting (ALFS) environment. They were then evaluated side-by-side with operational load forecasts with no BTM treatment. Overall, ALFS-BTM day-ahead (DA) forecasts performed better than baseline ALFS forecasts when compared to actual load data. Specifically, ALFS-BTM DA forecasts were observed to have the largest reduction of error during the afternoon on cloudy days. Shorter-term 30-minute-ahead ALFS-BTM forecasts were shown to have less error under all sky conditions, especially during the morning time periods when traditional load forecasts often experience their largest uncertainties. This work culminated in a GO decision by the California ISO to include zonal BTM forecasts in its operational load forecasting system. The California ISO's Manager of Short Term Forecasting, Jim Blatchford, summarized the research performed in this project with the following quote: “The behind-the-meter (BTM) California ISO region forecasting research performed by Clean Power Research and sponsored by the Department of Energy's SUNRISE program was an opportunity to verify value and demonstrate improved load forecast capability. In 2016, the California ISO will be incorporating the BTM forecast into the Hour Ahead and Day Ahead load models to look for improvements in the overall load forecast accuracy as BTM PV capacity continues to grow.”
The Best of Both Worlds: Developing a Hybrid Data System for the ASF DAAC
NASA Astrophysics Data System (ADS)
Arko, S. A.; Buechler, B.; Wolf, V. G.
2017-12-01
The Alaska Satellite Facility (ASF) at the University of Alaska Fairbanks hosts the NASA Distributed Active Archive Center (DAAC) specializing in synthetic aperture radar (SAR). Historically, the ASF DAAC has hosted hardware on-premises and developed DAAC-specific software to operate, manage, and maintain the DAAC data system. In the past year, ASF DAAC has been moving many of the standard DAAC operations into the Amazon Web Services (AWS) cloud. This includes data ingest, basic pre-processing, archiving, and distribution within the AWS environment. While the cloud offers nearly unbounded capacity for expansion and a great host of services, it can also bring unexpected and unplanned costs, which can be difficult to forecast even with historic data usage patterns and models for future usage. In an effort to maximize the effectiveness of the DAAC data system while still managing and accurately forecasting costs, ASF DAAC has developed a hybrid, cloud and on-premises, data system. The goal of this project is to make extensive use of the AWS cloud and, when appropriate, utilize on-premises resources to help constrain costs. This hybrid system mimics a cloud environment on-premises using Kubernetes container orchestration so that software can run in either location with little change. Combined with a hybrid data storage architecture, the new data system makes use of the great capacity of the cloud while maintaining an on-premises option. This presentation will describe the development of the hybrid data system, including the micro-services architecture and design, the container orchestration, and hybrid storage. Additionally, we will highlight the lessons learned through the development process and cost forecasting for current and future SAR-mission operations, and provide a discussion of the pros and cons of hybrid architectures versus all-cloud deployments. This development effort has led to a system that is capable and flexible for the future while allowing ASF DAAC to continue supporting the SAR community with the highest level of services.
NASA Astrophysics Data System (ADS)
Fobair, Richard C., II
This research presents a model for forecasting the number of jobs created in the energy efficiency retrofit (EER) supply chain as a result of an investment in upgrading residential buildings in Florida. The investigation examined material supply chains stretching from mining to project installation for three product types: insulation, windows/doors, and heating, ventilating, and air conditioning (HVAC) systems. Outputs from the model are provided at the project, sales, manufacturing, and mining levels. The model utilizes reverse-estimation to forecast the number of jobs that result from an investment. Reverse-estimation is a process that deconstructs a total investment into its constituent parts. In this research, an investment is deconstructed into profit, overhead, and hard costs for each level of the supply chain and over multiple iterations of inter-industry exchanges. The model translates an investment amount, the type of work, and the method of contracting into a prediction of the number of jobs created. The deconstruction process utilizes data from the U.S. Economic Census. At each supply chain level, the cost of labor is converted into full-time equivalent (FTE) jobs (i.e., equivalent to 40 hours per week for 52 weeks) utilizing loaded labor rates and a typical employee mix. The model is sensitive to adjustable variables, such as the percentage of work performed per type of product, allocation of worker time per skill level, annual hours for FTE calculations, wage rate, and benefits. This research provides several new insights into job creation. First, it provides definitions that can be used for future research on jobs in supply chains related to energy efficiency. Second, it provides a methodology for future investigators to calculate the jobs in a supply chain resulting from an investment in energy efficiency upgrades to a building. The methodology used in this research is unique because it examines gross employment at the sub-industry level for specific commodities; most research on employment examines the net employment change (job creation less job destruction) at the level of regions, industries, and the aggregate economy. Third, it provides a forecast of the number of jobs for an investment in energy efficiency over the entire supply chain for the selected industries, and the job factors for major levels of the supply chain.
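A minimal sketch of the reverse-estimation mechanics described above: deconstruct an investment into profit, overhead, and hard costs at each supply-chain level, convert the labor share of hard cost into FTE jobs, and pass the non-labor remainder down the chain as the next level's purchase. Every share and rate below is an illustrative assumption, not the study's calibrated value.

```python
# Reverse-estimation sketch: investment -> jobs per supply-chain level.
investment = 1_000_000.0  # retrofit investment in dollars (illustrative)

# (level name, profit share, overhead share,
#  labor share of hard cost, loaded labor rate $/hour) -- all assumed.
levels = [
    ("project installation", 0.10, 0.15, 0.45, 35.0),
    ("sales/distribution",   0.08, 0.12, 0.30, 30.0),
    ("manufacturing",        0.07, 0.10, 0.35, 28.0),
    ("mining/raw materials", 0.06, 0.08, 0.25, 26.0),
]

FTE_HOURS = 40 * 52  # full-time equivalent: 40 h/week for 52 weeks

remaining = investment
for name, profit, overhead, labor_share, rate in levels:
    hard_cost = remaining * (1.0 - profit - overhead)
    labor_cost = hard_cost * labor_share
    jobs = labor_cost / (rate * FTE_HOURS)
    print(f"{name:22s}: hard cost ${hard_cost:,.0f}, {jobs:.1f} FTE jobs")
    # The non-labor hard cost becomes the purchase at the next level down.
    remaining = hard_cost - labor_cost
```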
Brownstein, John S; Chu, Shuyu; Marathe, Achla; Marathe, Madhav V; Nguyen, Andre T; Paolotti, Daniela; Perra, Nicola; Perrotta, Daniela; Santillana, Mauricio; Swarup, Samarth; Tizzoni, Michele; Vespignani, Alessandro; Vullikanti, Anil Kumar S; Wilson, Mandy L; Zhang, Qian
2017-11-01
Influenza outbreaks affect millions of people every year and their surveillance is usually carried out in developed countries through a network of sentinel doctors who report the weekly number of Influenza-like Illness cases observed among the visited patients. Monitoring and forecasting the evolution of these outbreaks supports decision makers in designing effective interventions and allocating resources to mitigate their impact. This paper describes the existing participatory surveillance approaches that have been used for modeling and forecasting of the seasonal influenza epidemic, and how they can help strengthen real-time epidemic science and provide a more rigorous understanding of epidemic conditions. We describe three different participatory surveillance systems, WISDM (Widely Internet Sourced Distributed Monitoring), Influenzanet and Flu Near You (FNY), and show how modeling and simulation can be or has been combined with participatory disease surveillance to: i) measure the non-response bias in a participatory surveillance sample using WISDM; and ii) nowcast and forecast influenza activity in different parts of the world (using Influenzanet and Flu Near You). WISDM-based results measure the participatory and sample bias for three epidemic metrics, i.e. attack rate, peak infection rate, and time-to-peak, and find the participatory bias to be the largest component of the total bias. The Influenzanet platform shows that digital participatory surveillance data combined with a realistic data-driven epidemiological model can provide both short-term and long-term forecasts of epidemic intensities, and the ground truth data lie within the 95 percent confidence intervals for most weeks. The statistical accuracy of the ensemble forecasts increases as the season progresses. The Flu Near You platform shows that participatory surveillance data provide accurate short-term flu activity forecasts and influenza activity predictions. The correlation of the HealthMap Flu Trends estimates with the observed CDC ILI rates is 0.99 for 2013-2015. Additional data sources lead to an error reduction of about 40% when compared to the estimates of the model that only incorporates CDC historical information. While the advantages of participatory surveillance, compared to traditional surveillance, include its timeliness, lower costs, and broader reach, it is limited by a lack of control over the characteristics of the population sample. Modeling and simulation can help overcome this limitation as well as provide real-time and long-term forecasting of influenza activity in data-poor parts of the world.
Forecasting wildland fire behavior using high-resolution large-eddy simulations
NASA Astrophysics Data System (ADS)
Munoz-Esparza, D.; Kosovic, B.; Jimenez, P. A.; Anderson, A.; DeCastro, A.; Brown, B.
2016-12-01
Wildland fires are responsible for large socio-economic impacts. Fires affect the environment, damage structures, threaten lives, cause health issues, and involve large suppression costs. These impacts can be mitigated via accurate fire spread forecast to inform the incident management team. To this end, the state of Colorado is funding the development of the Colorado Fire Prediction System (CO-FPS). The system is based on the Weather Research and Forecasting (WRF) model enhanced with a fire behavior module (WRF-Fire). Realistic representation of wildland fire behavior requires explicit representation of small scale weather phenomena to properly account for coupled atmosphere-wildfire interactions. Moreover, transport and dispersion of biomass burning emissions from wildfires is controlled by turbulent processes in the atmospheric boundary layer, which are difficult to parameterize and typically lead to large errors when simplified source estimation and injection height methods are used. Therefore, we utilize turbulence-resolving large-eddy simulations at a resolution of 111 m to forecast fire spread and smoke distribution using a coupled atmosphere-wildfire model. This presentation will describe our improvements to the level-set based fire-spread algorithm in WRF-Fire and an evaluation of the operational system using 12 wildfire events that occurred in Colorado in 2016, as well as other historical fires. In addition, the benefits of explicit representation of turbulence for smoke transport and dispersion will be demonstrated.
New drug adoption models: a review and assessment of future needs.
Agrawal, M; Calantone, R J
1995-01-01
New drug products today are the key to survival in the pharmaceutical industry. However, the new product development process in the pharmaceutical industry also happens to be one of the riskiest and most expensive undertakings because of the huge research and development costs involved. Consequently, market forecasting of new pharmaceutical products takes on added importance if the formidable investments are to be recovered. New drug adoption models provide the marketer with a means to assess new product potential. Although several adoption models are available in the marketing literature for assessing the potential of common consumer goods, the unique characteristics of the prescription drug market make it necessary to examine the current state of pharmaceutical innovations. The purpose of this study, therefore, is to: (1) review new drug adoption models in the pharmaceutical literature, (2) evaluate the existing models of new drug adoption using the ten criteria for a good model as prescribed by Zaltman and Wallendorf (1983), and (3) provide an overall assessment and a 'prescription' for better forecasting of new drug products.
Forecasting biodiversity in breeding birds using best practices
Taylor, Shawn D.; White, Ethan P.
2018-01-01
Biodiversity forecasts are important for conservation, management, and evaluating how well current models characterize natural systems. While the number of forecasts for biodiversity is increasing, there is little information available on how well these forecasts work. Most biodiversity forecasts are not evaluated to determine how well they predict future diversity, fail to account for uncertainty, and do not use time-series data that capture the actual dynamics being studied. We addressed these limitations by using best practices to explore our ability to forecast the species richness of breeding birds in North America. We used hindcasting to evaluate six different modeling approaches for predicting richness. Hindcasts for each method were evaluated annually for a decade at 1,237 sites distributed throughout the continental United States. All models explained more than 50% of the variance in richness, but none of them consistently outperformed a baseline model that predicted constant richness at each site. The best practices implemented in this study directly influenced the forecasts and evaluations. Stacked species distribution models and “naive” forecasts produced poor estimates of uncertainty, and accounting for this resulted in these models dropping in relative performance compared to other models. Accounting for observer effects improved model performance overall, but also changed the rank ordering of models because it did not improve the accuracy of the “naive” model. Considering the forecast horizon revealed that prediction accuracy decreased across all models as the time horizon of the forecast increased. To facilitate the rapid improvement of biodiversity forecasts, we emphasize the value of specific best practices in making forecasts and evaluating forecasting methods.
Forecasting Medicaid Expenditures for Antipsychotic Medications.
Slade, Eric P; Simoni-Wastila, Linda
2015-07-01
The ongoing transition from use of mostly branded to mostly generic second-generation antipsychotic medications could bring about a substantial reduction in Medicaid expenditures for antipsychotic medications, a change with critical implications for formulary restrictions on second-generation antipsychotics in Medicaid. This study provided a forecast of the impact of generics on Medicaid expenditures for antipsychotic medications. Quarterly (N=816) state-level aggregate data on outpatient antipsychotic prescriptions in Medicaid between 2008 and 2011 were drawn from the Medicaid state drug utilization database. Annual numbers of prescriptions, expenditures, and cost per prescription were constructed for each antipsychotic medication. Forecasts of antipsychotic expenditures in calendar years 2016 and 2019 were developed on the basis of the estimated percentage reduction in Medicaid expenditures for risperidone, the only second-generation antipsychotic available generically throughout the study period. Two models of savings from generic risperidone use were estimated, one based on constant risperidone prices and the other based on variable risperidone prices. The sensitivity of the expenditure forecast to expected changes in Medicaid enrollment was also examined. In the main model, annual Medicaid expenditures for antipsychotics were forecasted to decrease by $1,794 million (48.8%) by 2016 and by $2,814 million (76.5%) by 2019. Adjustment for variable prices of branded medications and changes in Medicaid enrollment only moderately affected the magnitude of these reductions. Within five years, antipsychotic expenditures in Medicaid may decline to less than half their current levels. Such a spending reduction warrants a reassessment of the continued necessity of formulary restrictions for second-generation antipsychotics in Medicaid.
Model documentation renewable fuels module of the National Energy Modeling System
NASA Astrophysics Data System (ADS)
1995-06-01
This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it relates to the production of the 1995 Annual Energy Outlook (AEO95) forecasts. The report catalogs and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described. The RFM consists of six analytical submodules that represent each of the major renewable energy resources -- wood, municipal solid waste (MSW), solar energy, wind energy, geothermal energy, and alcohol fuels. The RFM also reads in hydroelectric facility capacities and capacity factors from a data file for use by the NEMS Electricity Market Module (EMM). The purpose of the RFM is to define the technological, cost, and resource size characteristics of renewable energy technologies. These characteristics are used to compute a levelized cost to be competed against other similarly derived costs from other energy sources and technologies. The competition of these energy sources over the NEMS time horizon determines the market penetration of these renewable energy technologies. The characteristics include available energy capacity, capital costs, fixed operating costs, variable operating costs, capacity factor, heat rate, construction lead time, and fuel product price.
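The levelization step described above (converting a technology's capital and operating characteristics into a single competable cost) can be sketched in a few lines. This is a generic simplified levelized-cost calculation under assumed constant annual output and a fixed charge rate, not the RFM's actual formulation; all numbers in the example are illustrative.

```python
# Simplified levelized cost sketch (constant output, fixed charge rate).
def levelized_cost(capital_cost, fixed_om, variable_om, fuel_cost,
                   capacity_kw, capacity_factor, fixed_charge_rate):
    """Return a $/kWh levelized cost for a generic generating technology."""
    annual_kwh = capacity_kw * capacity_factor * 8760
    annual_capital = capital_cost * capacity_kw * fixed_charge_rate
    annual_fixed = fixed_om * capacity_kw
    return (annual_capital + annual_fixed) / annual_kwh + variable_om + fuel_cost

# Illustrative wind plant: $1,200/kW capital, $30/kW-yr fixed O&M,
# $0.005/kWh variable O&M, no fuel, 35% capacity factor, 10% FCR.
print(f"{levelized_cost(1200, 30, 0.005, 0.0, 1000, 0.35, 0.10):.3f} $/kWh")
```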
Flexible reserve markets for wind integration
NASA Astrophysics Data System (ADS)
Fernandez, Alisha R.
The increased interconnection of variable generation has motivated the use of improved forecasting to more accurately predict future production, with the aim of lowering total system balancing costs when the expected output exceeds or falls short of the actual output. Forecasts are imperfect, and the forecast errors associated with utility-scale generation from variable generators require new balancing capabilities that cannot be handled by existing ancillary services. Our work focuses on strategies for integrating large amounts of wind generation under a flex reserve market, a market that would be called upon for short-term energy services during an under- or oversupply of wind generation to maintain electric grid reliability. The flex reserve market would be utilized for time intervals that fall between the current ancillary services markets: longer than the second-to-second energy services used for maintaining system frequency, and shorter than the reserve capacity services that are called upon for several minutes up to an hour during an unexpected contingency on the grid. In our work, the wind operator would access the flex reserve market as an energy service to correct for unanticipated forecast errors, akin to paying the participating generators to increase generation during a shortfall or to decrease generation during an excess of wind generation. Such a market does not currently exist in the Mid-Atlantic United States. The Pennsylvania-New Jersey-Maryland Interconnection (PJM) is the Mid-Atlantic electric grid case study used to examine whether a flex reserve market can be utilized for integrating large capacities of wind generation in a low-cost manner for those providing, purchasing, and dispatching these short-term balancing services. The following work consists of three studies. The first examines the ability of a hydroelectric facility to provide short-term forecast error balancing services via a flex reserve market, identifying the operational constraints that inhibit a multi-purpose dam facility from meeting the desired flexible energy demand. The second study transitions from the hydroelectric facility as the decision maker providing flex reserve services to the wind plant as the decision maker purchasing these services. In this second study, methods for allocating the costs of flex reserve services under different wind policy scenarios are explored that aggregate farms into different groupings to identify the least-cost strategy for balancing the costs of hourly day-ahead forecast errors. The least-cost strategy may differ between an individual wind plant and the system operator, and it is highly sensitive to the cost allocation and aggregation schemes; the latter may also cause cross-subsidies in the cost of balancing wind forecast errors among the different wind farms. The third study builds from the second, with the objective of quantifying the amount of flex reserves needed for balancing future forecast errors using a probabilistic approach (quantile regression) to estimating future forecast errors. The results further examine the usefulness of separate flexible markets PJM could use for balancing oversupply and undersupply events, similar to the regulation up and down markets used in Europe. These three studies provide the following results and insights for large-scale wind integration, using actual PJM wind farm data that describe the markets and generators within PJM.
• Chapter 2 provides an in-depth analysis of the valuable, yet highly constrained, energy services that multi-purpose hydroelectric facilities can provide, though the opportunity cost of providing these services can result in large deviations from reservoir policies with minimal revenue gain compared to dedicating the whole of dam capacity to providing day-ahead, baseload generation.
• Chapter 3 quantifies the system-wide efficiency gains and the distributive effects of PJM's decision to act as a single balancing authority, which means that it procures ancillary services across its entire footprint simultaneously. This can be contrasted with the Midwest Independent System Operator (MISO), which has several balancing authorities operating under its footprint.
• Chapter 4 uses probabilistic methods to estimate the uncertainty in the forecast errors and the quantity of energy needed to balance these forecast errors at a given percentile (a minimal illustration follows below). Current practice is to use a point forecast that describes the conditional expectation of the dependent variable at each time step. The approach here uses quantile regression to describe the relationship between the independent variable and the conditional quantiles (equivalently, the percentiles) of the dependent variable. An estimate of the conditional density is performed, which captures the covariate relationship between the sign of the forecast errors (negative for too much wind generation and positive for too little wind generation) and the wind power forecast. This additional knowledge may be incorporated in the decision process to more accurately schedule day-ahead wind generation bids, and it provides an example of using separate markets for balancing an oversupply and undersupply of generation. Such methods are currently used for coordinating large footprints of wind generation in Europe.
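As a minimal illustration of the Chapter 4 idea, conditional quantiles of the forecast error can be estimated with any quantile-loss regressor; the sketch below uses scikit-learn's gradient boosting with quantile loss on synthetic data standing in for PJM forecasts. The heteroscedastic error model and all numbers are assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)

# Synthetic stand-in for PJM data: day-ahead wind power forecast (MW) and
# the realized forecast error (negative = oversupply, positive = shortfall).
forecast = rng.uniform(0, 1000, 2000)
error = rng.normal(0.05 * forecast, 0.10 * forecast + 20)  # heteroscedastic

X = forecast.reshape(-1, 1)

# Estimate conditional quantiles of the error given the forecast level;
# the spread between the quantiles sizes the flex reserve requirement.
for q in (0.05, 0.50, 0.95):
    model = GradientBoostingRegressor(loss="quantile", alpha=q).fit(X, error)
    at_500 = model.predict([[500.0]])[0]
    print(f"q={q:.2f}: conditional error at 500 MW forecast = {at_500:.1f} MW")
```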
Technology requirements for communication satellites in the 1980's
NASA Technical Reports Server (NTRS)
Burtt, J. E.; Moe, C. R.; Elms, R. V.; Delateur, L. A.; Sedlacek, W. C.; Younger, G. G.
1973-01-01
The key technology requirements are defined for meeting the forecasted demands for communication satellite services in the 1985 to 1995 time frame. Evaluation is made of needs for services and of the technical and functional requirements for providing services. The future growth capabilities of the terrestrial telephone network, cable television, and satellite networks are forecasted. The impact of spacecraft technology and of booster performance and costs upon communication satellite costs is analyzed. Systems analysis techniques are used to determine functional requirements and the sensitivities of technology improvements for reducing the costs of meeting requirements. Recommended development plans and funding levels are presented, as well as the possible cost saving for communications satellites in the post-1985 era.
[Macro-economic calculation of spending versus micro-economic follow-up of costs of breast cancer].
Borella, L; Paraponaris, A
2002-12-01
In the healthcare field, the ability to make economic forecasts requires knowledge of the costs of caring for major diseases. In the case of a semi-chronic condition like cancer, this cost covers all the episodes of care associated with a patient. An evaluation of a macro-economic method of calculating the costs of treating non-metastatic cancer, covering all hospital episodes, is proposed. This method is based entirely on the use of annual hospital activity databases, linked to data concerning the incidence of cancer. It allows us to obtain the global cost of care for a neoplasm of a particular site, without the need to reconstruct the whole care pathway of the patients. The model was assessed by comparing its own results, in the particular case of breast cancer, to those obtained from a micro-economic follow-up of 115 patients. Data for the macro-economic calculation were extracted from the national French hospital database for the year 1999 and from cancer incidence data. The prospective study was done in 1995, in a comprehensive cancer centre. The macro-economic calculation leads to a cost of 14,555 Euro for primary breast cancer. Prospective follow-up showed a cost of 14,350 Euro (data corrected, 1999 value). With a difference of 1%, the two results were clearly consistent, although a higher level of divergence (from 1 to 15%) was noticed in the comparison between therapeutic techniques. The accuracy and reliability of the results were evaluated. This method may be extended to all types of neoplasms. It cannot be used instead of follow-up studies for cost-efficacy or cost-severity analysis, but may be of interest beyond economic forecasts in the field of payment per pathology.
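The core of the macro-economic calculation reduces to a simple ratio under a steady-state assumption (one year of hospital activity approximates one incident cohort's full care pathway). A minimal sketch, with purely illustrative figures rather than the study's data:

```python
# Macro-economic cost sketch: annual hospital cost of all stays coded to a
# cancer site divided by annual incident cases, assuming steady state.
annual_hospital_cost = 600_000_000.0  # all breast-cancer stays, one year (EUR, assumed)
annual_incident_cases = 42_000        # incidence for the same year (assumed)

cost_per_primary_case = annual_hospital_cost / annual_incident_cases
print(f"global cost per incident case: {cost_per_primary_case:,.0f} EUR")
```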
Staid, Andrea; Watson, Jean-Paul; Wets, Roger J.-B.; ...
2017-07-11
Forecasts of available wind power are critical in key electric power systems operations planning problems, including economic dispatch and unit commitment. Such forecasts are necessarily uncertain, limiting the reliability and cost effectiveness of operations planning models based on a single deterministic or “point” forecast. A common approach to address this limitation involves the use of a number of probabilistic scenarios, each specifying a possible trajectory of wind power production, with associated probability. We present and analyze a novel method for generating probabilistic wind power scenarios, leveraging available historical information in the form of forecasted and corresponding observed wind power time series. We estimate non-parametric forecast error densities, specifically using epi-spline basis functions, allowing us to capture the skewed and non-parametric nature of error densities observed in real-world data. We then describe a method to generate probabilistic scenarios from these basis functions that allows users to control for the degree to which extreme errors are captured. We compare the performance of our approach to the current state-of-the-art considering publicly available data associated with the Bonneville Power Administration, analyzing aggregate production of a number of wind farms over a large geographic region. Finally, we discuss the advantages of our approach in the context of specific power systems operations planning problems: stochastic unit commitment and economic dispatch. Here, our methodology is embodied in the joint Sandia – University of California Davis Prescient software package for assessing and analyzing stochastic operations strategies.
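The scenario-generation pipeline (estimate a non-parametric error density, then sample errors around a point forecast) can be sketched compactly. The paper uses epi-spline basis functions; the sketch below substitutes a Gaussian kernel density estimate as a simple stand-in, and all data are synthetic.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)

# Historical forecast errors (observed - forecast), skewed as in real data;
# a synthetic stand-in for the BPA series used in the paper.
errors = rng.gamma(shape=2.0, scale=50.0, size=5000) - 100.0

# Non-parametric density estimate (epi-splines in the paper; Gaussian KDE
# is used here only as a convenient stand-in).
density = gaussian_kde(errors)

# Generate equally weighted probabilistic scenarios: sample errors and add
# them to a deterministic point-forecast trajectory.
point_forecast = np.array([300.0, 320.0, 310.0, 290.0])  # MW, 4 periods
n_scenarios = 10
sampled = density.resample(size=n_scenarios * len(point_forecast), seed=7)
scenarios = point_forecast + sampled.reshape(n_scenarios, len(point_forecast))
print(scenarios.round(1))
```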
Algorithm aversion: people erroneously avoid algorithms after seeing them err.
Dietvorst, Berkeley J; Simmons, Joseph P; Massey, Cade
2015-02-01
Research shows that evidence-based algorithms more accurately predict the future than do human forecasters. Yet when forecasters are deciding whether to use a human forecaster or a statistical algorithm, they often choose the human forecaster. This phenomenon, which we call algorithm aversion, is costly, and it is important to understand its causes. We show that people are especially averse to algorithmic forecasters after seeing them perform, even when they see them outperform a human forecaster. This is because people more quickly lose confidence in algorithmic than human forecasters after seeing them make the same mistake. In 5 studies, participants either saw an algorithm make forecasts, a human make forecasts, both, or neither. They then decided whether to tie their incentives to the future predictions of the algorithm or the human. Participants who saw the algorithm perform were less confident in it, and less likely to choose it over an inferior human forecaster. This was true even among those who saw the algorithm outperform the human.
Counteracting structural errors in ensemble forecast of influenza outbreaks.
Pei, Sen; Shaman, Jeffrey
2017-10-13
For influenza forecasts generated using dynamical models, forecast inaccuracy is partly attributable to the nonlinear growth of error. As a consequence, quantification of the nonlinear error structure in current forecast models is needed so that this growth can be corrected and forecast skill improved. Here, we inspect the error growth of a compartmental influenza model and find that a robust error structure arises naturally from the nonlinear model dynamics. By counteracting these structural errors, diagnosed using error breeding, we develop a new forecast approach that combines dynamical error correction and statistical filtering techniques. In retrospective forecasts of historical influenza outbreaks for 95 US cities from 2003 to 2014, overall forecast accuracy for outbreak peak timing, peak intensity and attack rate, are substantially improved for predicted lead times up to 10 weeks. This error growth correction method can be generalized to improve the forecast accuracy of other infectious disease dynamical models.
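Error breeding, the diagnostic named above, is conceptually simple: integrate a control run and a perturbed run, periodically rescale the difference to its initial size, and the surviving perturbation aligns with the fastest-growing error direction. A minimal sketch on a toy SIR-like model (not the paper's humidity-forced model; all parameters assumed):

```python
import numpy as np

def sir_step(state, beta=0.5, gamma=0.25, dt=0.1):
    """One forward-Euler step of a simple SIR model (s, i fractions)."""
    s, i = state
    ds = -beta * s * i
    di = beta * s * i - gamma * i
    return np.array([s + ds * dt, i + di * dt])

control = np.array([0.99, 0.01])
bred = control + np.array([1e-4, -1e-4])   # initial seed perturbation
norm0 = np.linalg.norm(bred - control)

for cycle in range(20):
    for _ in range(50):                    # breeding interval
        control = sir_step(control)
        bred = sir_step(bred)
    diff = bred - control
    growth = np.linalg.norm(diff) / norm0
    bred = control + diff * (norm0 / np.linalg.norm(diff))  # rescale
    print(f"cycle {cycle:2d}: error growth factor = {growth:.3f}")
```

Growth factors above 1 indicate epochs where small state errors amplify, which is exactly where a structural correction has the most leverage.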
NASA Technical Reports Server (NTRS)
1985-01-01
The results of detailed cost estimates and economic analysis performed on the updated Model 101 configuration of the general purpose Aft Cargo Carrier (ACC) are given. The objective of this economic analysis is to provide the National Aeronautics and Space Administration (NASA) with information on the economics of using the ACC on the Space Transportation System (STS). The detailed cost estimates for the ACC are presented by a work breakdown structure (WBS) to ensure that all elements of cost are considered in the economic analysis and related subsystem trades. Costs reported by WBS provide NASA with a basis for comparing competing designs and provide detailed cost information that can be used to forecast phase C/D planning for new projects or programs derived from preliminary conceptual design studies. The scope covers all STS and STS/ACC launch vehicle cost impacts for delivering payloads to a 160 NM low Earth orbit (LEO).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yishen; Zhou, Zhi; Liu, Cong
2016-08-01
As more wind power and other renewable resources are being integrated into the electric power grid, the forecast uncertainty brings operational challenges for the power system operators. In this report, different operational strategies for uncertainty management are presented and evaluated. A comprehensive and consistent simulation framework is developed to analyze the performance of different reserve policies and scheduling techniques under uncertainty in wind power. Numerical simulations are conducted on a modified version of the IEEE 118-bus system with a 20% wind penetration level, comparing deterministic, interval, and stochastic unit commitment strategies. The results show that stochastic unit commitment provides a reliable schedule without large increases in operational costs. Moreover, decomposition techniques, such as load shift factor and Benders decomposition, can help in overcoming the computational obstacles to stochastic unit commitment and enable the use of a larger scenario set to represent forecast uncertainty. In contrast, deterministic and interval unit commitment tend to give higher system costs as more reserves are being scheduled to address forecast uncertainty. However, these approaches require a much lower computational effort. Choosing a proper lower bound for the forecast uncertainty is important for balancing reliability and system operational cost in deterministic and interval unit commitment. Finally, we find that the introduction of zonal reserve requirements improves reliability, but at the expense of higher operational costs.
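To make the stochastic unit commitment idea concrete, here is a deliberately tiny two-stage toy (not the report's 118-bus model): commit a single thermal unit before wind is known, then dispatch per scenario, minimizing commitment cost plus expected dispatch and load-shedding cost. It is sketched with the PuLP modeling library; all capacities, costs, and scenario probabilities are invented.

```python
import pulp

load = 100.0  # MW to serve in every scenario (assumed)
# scenario name -> (wind output MW, probability) -- illustrative values
scenarios = {"low": (30.0, 0.3), "mid": (60.0, 0.5), "high": (90.0, 0.2)}

prob = pulp.LpProblem("toy_stochastic_uc", pulp.LpMinimize)
u = pulp.LpVariable("commit", cat="Binary")            # first-stage decision
p = {s: pulp.LpVariable(f"p_{s}", lowBound=0, upBound=80.0)
     for s in scenarios}                               # thermal dispatch
shed = {s: pulp.LpVariable(f"shed_{s}", lowBound=0) for s in scenarios}

fixed_cost, var_cost, shed_cost = 500.0, 20.0, 1000.0  # $ (assumed)
# Objective: commitment cost + expected second-stage cost.
prob += fixed_cost * u + pulp.lpSum(
    pr * (var_cost * p[s] + shed_cost * shed[s])
    for s, (_, pr) in scenarios.items())

for s, (wind, _) in scenarios.items():
    prob += p[s] + wind + shed[s] >= load   # power balance per scenario
    prob += p[s] <= 80.0 * u                # dispatch only if committed

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("commit:", int(u.value()), {s: p[s].value() for s in scenarios})
```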
Adaptive time-variant models for fuzzy-time-series forecasting.
Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching
2010-12-01
Fuzzy time series have been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of the universe of discourse, the content of the forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of the fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series, including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experimental results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy compared to other fuzzy-time-series forecasting models.
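The adaptive-window idea itself is separable from the fuzzy machinery: choose the analysis window size that minimizes training error, then forecast with it. The sketch below illustrates only that selection step, using a plain moving-average forecaster as a stand-in for the ATVF model; the data are synthetic.

```python
import numpy as np

def window_forecast(series, w):
    """One-step-ahead forecasts using the mean of the last w points."""
    return np.array([series[i - w:i].mean() for i in range(w, len(series))])

rng = np.random.default_rng(3)
series = np.cumsum(rng.normal(0, 1, 300)) + 100   # synthetic "index" series
train = series[:200]

# Adapt the window size on the training phase: pick w minimizing the mean
# absolute one-step error, mirroring the accuracy-driven adaptation above.
best_w = min(range(2, 30),
             key=lambda w: np.abs(window_forecast(train, w) - train[w:]).mean())
print("window chosen on training data:", best_w)
```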
Two approaches to forecast Ebola synthetic epidemics.
Champredon, David; Li, Michael; Bolker, Benjamin M; Dushoff, Jonathan
2018-03-01
We use two modelling approaches to forecast synthetic Ebola epidemics in the context of the RAPIDD Ebola Forecasting Challenge. The first approach is a standard stochastic compartmental model that aims to forecast incidence, hospitalization and deaths among both the general population and health care workers. The second is a model based on the renewal equation with latent variables that forecasts incidence in the whole population only. We describe fitting and forecasting procedures for each model and discuss their advantages and drawbacks. We did not find that one model was consistently better in forecasting than the other.
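The renewal-equation approach mentioned above projects incidence as a weighted sum of recent incidence. A minimal sketch under strong simplifications (fixed reproduction number, known generation-interval distribution, no latent variables or fitting, invented numbers):

```python
import numpy as np

# Renewal equation: I(t) = R * sum_s g(s) * I(t - s).
g = np.array([0.1, 0.3, 0.4, 0.2])      # generation interval, days 1..4 (assumed)
R = 1.5                                  # assumed reproduction number

incidence = [5.0, 7.0, 9.0, 12.0]        # observed daily case counts (assumed)
for _ in range(10):                      # 10-day forecast
    past = incidence[-len(g):][::-1]     # most recent first, aligned with g
    incidence.append(R * float(np.dot(g, past)))

print(np.round(incidence[-10:], 1))
```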
Streamflow forecasts from WRF precipitation for flood early warning in mountain tropical areas
NASA Astrophysics Data System (ADS)
Rogelis, María Carolina; Werner, Micha
2018-02-01
Numerical weather prediction (NWP) models are fundamental to extend forecast lead times beyond the concentration time of a watershed. Particularly for flash flood forecasting in tropical mountainous watersheds, forecast precipitation is required to provide timely warnings. This paper aims to assess the potential of NWP for flood early warning purposes, and the possible improvement that bias correction can provide, in a tropical mountainous area. The paper focuses on the comparison of streamflows obtained from the post-processed precipitation forecasts, particularly the comparison of ensemble forecasts and their potential in providing skilful flood forecasts. The Weather Research and Forecasting (WRF) model is used to produce precipitation forecasts that are post-processed and used to drive a hydrologic model. Discharge forecasts obtained from the hydrological model are used to assess the skill of the WRF model. The results show that post-processed WRF precipitation adds value to the flood early warning system when compared to zero-precipitation forecasts, although the precipitation forecast used in this analysis showed little added value when compared to climatology. However, the reduction of biases obtained from the post-processed ensembles show the potential of this method and model to provide usable precipitation forecasts in tropical mountainous watersheds. The need for more detailed evaluation of the WRF model in the study area is highlighted, particularly the identification of the most suitable parameterisation, due to the inability of the model to adequately represent the convective precipitation found in the study area.
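One common way to implement the bias-correction step discussed above is empirical quantile mapping: map each raw forecast value through the forecast climatology's CDF onto the observed climatology. This is a generic post-processing sketch, not necessarily the exact method used in the paper; all data are synthetic.

```python
import numpy as np

def quantile_map(forecast, fcst_clim, obs_clim):
    """Empirical quantile mapping of forecast values onto observations."""
    ranks = np.searchsorted(np.sort(fcst_clim), forecast) / len(fcst_clim)
    return np.quantile(obs_clim, np.clip(ranks, 0.0, 1.0))

rng = np.random.default_rng(5)
obs_clim = rng.gamma(2.0, 5.0, 1000)          # observed precipitation (mm)
fcst_clim = 0.7 * rng.gamma(2.0, 5.0, 1000)   # model underestimates totals

raw = np.array([4.0, 10.0, 25.0])             # raw WRF-style forecasts (mm)
print(quantile_map(raw, fcst_clim, obs_clim)) # bias-corrected values
```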
Paradis, Pierre Emmanuel; Latrémouille-Viau, Dominick; Moore, Yuliya; Mishagina, Natalia; Lafeuille, Marie-Hélène; Lefebvre, Patrick; Gaudig, Maren; Duh, Mei Sheng
2009-07-01
To explore the effects of generic substitution of the antiepileptic drug (AED) topiramate (Topamax) in Canada; to convert observed Canadian costs into the settings of France, Germany, Italy, and the United Kingdom (UK); and to forecast the economic impact of generic topiramate entry in these four European countries. Health claims from the Régie de l'assurance maladie du Québec (RAMQ) plan (1/2006-9/2008) and IMS Health data (1998-2008) were used. Patients with epilepsy and ≥2 topiramate dispensings were selected. An open-cohort design was used to classify observation into mutually exclusive periods of branded versus generic use of topiramate. Canadian healthcare utilization and costs (2007 CAN$/person-year) were compared between periods using multivariate models. Annualized per-patient costs (2007 euro or 2007 pound sterling/person-year) were converted using Canadian utilization rates, European prices, and service-use ratios. A non-parametric bootstrap served to assess the statistical significance of cost differences. The topiramate market following generic entry (09/2009-09/2010) was forecast using autoregressive models based on the European experience. The economic impact of generic topiramate entry was estimated for each country. A total of 1164 patients (mean age: 39.8 years, 61.7% female) were observed for 2.6 years on average. After covariate adjustment, generic-use periods were associated with increased pharmacy dispensings (other AEDs: +0.95/person-year, non-AEDs: +12.28/person-year, p < 0.001), hospitalizations (+0.08/person-year, p = 0.015), and lengths of hospital stays (+0.51 days/person-year, p < 0.001). Adjusted costs, excluding topiramate, were CAN$1060/person-year higher during generic use (p = 0.005). Converted per-patient costs excluding topiramate were significantly higher for generic relative to brand periods in all European countries (adjusted cost differences per person-year: 706-815 euro, p < 0.001 for all comparisons). System-wide costs were projected to increase by 3.5 to 24.4% one year after generic entry. Study limitations include the absence of indirect costs, possible claim inaccuracies, and IMS data limitations. Higher health costs were projected for the four European countries from the Canadian experience following the generic entry of topiramate.
Development of WRF-ROI system by incorporating eigen-decomposition
NASA Astrophysics Data System (ADS)
Kim, S.; Noh, N.; Song, H.; Lim, G.
2011-12-01
This study presents the development of the WRF-ROI system, an implementation of Retrospective Optimal Interpolation (ROI) in the Weather Research and Forecasting model (WRF). ROI is a data assimilation algorithm introduced by Song et al. (2009) and Song and Lim (2009). The formulation of ROI is similar to that of Optimal Interpolation (OI), but ROI iteratively assimilates an observation set at a post-analysis time into a prior analysis, potentially providing high-quality reanalysis data. The ROI method assimilates data at the post-analysis time using a perturbation method (Errico and Raeder, 1999) without an adjoint model. In a previous study, the ROI method was applied to the Lorenz 40-variable model (Lorenz, 1996) to validate the algorithm and to investigate its capability. It is therefore necessary to apply the ROI method to a more realistic and complicated model framework such as WRF. In this research, the reduced-rank formulation of ROI is used instead of a reduced-resolution method. Computational costs can be reduced through the eigen-decomposition of the background error covariance in the reduced-rank method. When a single profile of observations is assimilated in the WRF-ROI system incorporating eigen-decomposition, the analysis error tends to be reduced compared with the background error. The difference between forecast errors with and without assimilation grows steadily over time, indicating that assimilation improves the forecast.
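The reduced-rank trick named above amounts to eigen-decomposing the background error covariance B and keeping only the leading modes, so analysis updates are computed in a low-dimensional subspace. A minimal numpy sketch with a synthetic covariance (the real B would come from the model, and the retained rank is an assumed choice):

```python
import numpy as np

n, rank = 200, 10
rng = np.random.default_rng(11)
L = rng.normal(size=(n, 40))
B = L @ L.T / 40                          # synthetic covariance (n x n)

eigvals, eigvecs = np.linalg.eigh(B)      # eigenpairs in ascending order
idx = np.argsort(eigvals)[::-1][:rank]    # indices of the leading modes
E, lam = eigvecs[:, idx], eigvals[idx]

B_reduced = E @ np.diag(lam) @ E.T        # low-rank approximation of B
frac = lam.sum() / eigvals.sum()
rel_err = np.linalg.norm(B - B_reduced) / np.linalg.norm(B)
print(f"variance captured by {rank} modes: {frac:.1%}, "
      f"relative approximation error: {rel_err:.2f}")
```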
NASA Astrophysics Data System (ADS)
Shaman, J.; Stieglitz, M.; Zebiak, S.; Cane, M.; Day, J. F.
2002-12-01
We present an ensemble local hydrologic forecast derived from the seasonal forecasts of the International Research Institute for Climate Prediction (IRI). Three-month seasonal forecasts were used to resample historical meteorological conditions and generate ensemble forcing datasets for a TOPMODEL-based hydrology model. Eleven retrospective forecasts were run at a Florida and a New York site. Forecast skill was assessed for mean area modeled water table depth (WTD), i.e. near-surface soil wetness conditions, and compared with WTD simulated with observed data. Hydrology model forecast skill was evident at the Florida site but not at the New York site. At the Florida site, persistence of hydrologic conditions and local skill of the IRI seasonal forecast contributed to the local hydrologic forecast skill. This forecast will permit probabilistic prediction of future hydrologic conditions. At the Florida site, we have also quantified the link between modeled WTD (i.e. drought) and the amplification and transmission of St. Louis Encephalitis virus (SLEV). We derive an empirical relationship between modeled land surface wetness and levels of SLEV transmission associated with human clinical cases. We then combine the seasonal forecasts of local, modeled WTD with this empirical relationship and produce retrospective probabilistic seasonal forecasts of epidemic SLEV transmission in Florida. Epidemic SLEV transmission forecast skill is demonstrated. These findings will permit real-time forecasting of drought and resultant SLEV transmission in Florida.
Wildland Fire Forecasting: Predicting Wildfire Behavior, Growth, and Feedbacks on Weather
NASA Astrophysics Data System (ADS)
Coen, J. L.
2005-12-01
Recent developments in wildland fire research models have produced increasingly complex representations of fire behavior. The cost has been increased computational requirements. When operational constraints are included, such as the need to produce such forecasts faster than real time, the challenge becomes a balance of how much complexity (with corresponding gains in realism) and accuracy can be achieved in producing the quantities of interest while meeting the specified operational constraints. Current field tools are calculator- or Palm-Pilot-based algorithms such as BEHAVE and BEHAVE Plus that produce timely estimates of instantaneous fire spread rates, flame length, and fire intensity at a point using readily estimated inputs of fuel model, terrain slope, and atmospheric wind speed at a point. At the cost of requiring a PC and slower calculation, FARSITE represents two-dimensional fire spread and adds capabilities including a parameterized representation of crown fire ignition. This work describes how a coupled atmosphere-fire model previously used as a research tool has been adapted for production of real-time forecasts of fire growth and its interactions with weather over a domain focusing on Colorado during summer 2004. The coupled atmosphere-wildland fire-environment (CAWFE) model is composed of a 3-dimensional atmospheric prediction model that has been two-way coupled with an empirical fire spread model. The models are connected in that atmospheric conditions (and fuel conditions influenced by the atmosphere) affect the rate and direction of fire propagation, which releases sensible and latent heat (i.e. thermal and water vapor fluxes) to the atmosphere that in turn alter the winds and atmospheric structure around the fire. Thus, it can represent time- and spatially-varying weather and the fire's feedbacks on the atmosphere, which are at the heart of sudden changes in fire behavior and examples of extreme fire behavior such as blow-ups, which are not predictable with current tools. Although this work shows that it is possible to perform more detailed simulations in real time, fire behavior forecasting remains a challenging problem. This is due to challenges in weather prediction, particularly at the fine spatial and temporal scales considered "nowcasting" (0-6 hrs), uncertainties in fire behavior even with known meteorological conditions, limitations in quantitative datasets on fuel properties such as fuel loading, and verification. This work describes efforts to advance these capabilities with input from remote sensing data on fuel characteristics, dynamic steering, and object-based verification with remotely sensed fire perimeters.
Energy Consumption Forecasting Using Semantic-Based Genetic Programming with Local Search Optimizer.
Castelli, Mauro; Trujillo, Leonardo; Vanneschi, Leonardo
2015-01-01
Energy consumption forecasting (ECF) is an important policy issue in today's economies. An accurate ECF has great benefits for electric utilities, since both negative and positive errors lead to increased operating costs. The paper proposes a semantic-based genetic programming framework to address the ECF problem. In particular, we propose a system that finds (quasi-)perfect solutions with high probability and that generates models able to produce near-optimal predictions also on unseen data. The framework blends a recently developed version of genetic programming that integrates semantic genetic operators with a local search method. The main idea in combining semantic genetic programming and a local searcher is to couple the exploration ability of the former with the exploitation ability of the latter. Experimental results confirm the suitability of the proposed method in predicting energy consumption. In particular, the system produces a lower error with respect to the existing state-of-the-art techniques used on the same dataset. More importantly, this case study has shown that including a local searcher in the geometric semantic genetic programming system can speed up the search process and can result in fitter models that are able to produce an accurate forecast also on unseen data.
Improving medium-range and seasonal hydroclimate forecasts in the southeast USA
NASA Astrophysics Data System (ADS)
Tian, Di
Accurate hydro-climate forecasts are important for decision making by water managers, agricultural producers, and other stakeholders. Numerical weather prediction models and general circulation models may have potential for improving hydro-climate forecasts at different scales. In this study, forecast analogs of the Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) based on different approaches were evaluated for medium-range reference evapotranspiration (ETo), irrigation scheduling, and urban water demand forecasts in the southeast United States; the Climate Forecast System version 2 (CFSv2) and the North American National Multi-Model Ensemble (NMME) were statistically downscaled for seasonal forecasts of ETo, precipitation (P), and 2-m temperature (T2M) at the regional level. The GFS mean temperature (Tmean), relative humidity, and wind speed (Wind) reforecasts combined with the climatology of Reanalysis 2 solar radiation (Rs) produced higher skill than using the direct GFS output only. Constructed analogs showed slightly higher skill than natural analogs for deterministic forecasts. Both irrigation scheduling driven by the GEFS-based ETo forecasts and GEFS-based ETo forecast skill were generally positive up to one week throughout the year. The GEFS improved ETo forecast skill compared to the GFS. The GEFS-based analog forecasts for the input variables of an operational urban water demand model were skillful when applied in the Tampa Bay area. The modified operational models driven by GEFS analog forecasts showed higher forecast skill than the operational model based on persistence. The results for CFSv2 seasonal forecasts showed that maximum temperature (Tmax) and Rs had the greatest influence on ETo. The downscaled Tmax showed the highest predictability, followed by Tmean, Tmin, Rs, and Wind. The CFSv2 model could better predict ETo in cold seasons only when the forecast initial condition fell within an El Nino Southern Oscillation (ENSO) event. Downscaled P and T2M forecasts were produced by directly downscaling the NMME P and T2M output, or indirectly by using the NMME forecasts of Nino3.4 sea surface temperatures to predict local-scale P and T2M. The indirect method generally showed the highest forecast skill, which occurs in cold seasons. The bias-corrected NMME ensemble forecast skill did not outperform the best single model.
Forecasting Container Throughput at the Doraleh Port in Djibouti through Time Series Analysis
NASA Astrophysics Data System (ADS)
Mohamed Ismael, Hawa; Vandyck, George Kobina
The Doraleh Container Terminal (DCT) located in Djibouti has been noted as the most technologically advanced container terminal on the African continent. DCT's strategic location at the crossroads of the main shipping lanes connecting Asia, Africa and Europe puts it in a unique position to provide important shipping services to vessels plying that route. This paper aims to forecast container throughput at the Doraleh Container Terminal through time series analysis. A selection of univariate forecasting models was used, namely the Triple Exponential Smoothing, Grey and Linear Regression models. Using these three models and their combination, forecasts of container throughput through the Doraleh port were produced. The forecasting results of the three models and of the combination forecast were then compared using the commonly applied evaluation criteria of Mean Absolute Deviation (MAD) and Mean Absolute Percentage Error (MAPE). The study found that the Linear Regression model was the best prediction method for forecasting container throughput, since its forecast error was the smallest. Based on the regression model, a ten (10) year forecast of container throughput at DCT was made.
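For reference, the two evaluation criteria named above, plus an equal-weight combination forecast, can be computed as in the following sketch; the throughput figures are invented placeholders, not DCT data.

```python
import numpy as np

def mad(actual, forecast):
    # Mean Absolute Deviation of the forecast errors
    return np.mean(np.abs(np.asarray(actual) - np.asarray(forecast)))

def mape(actual, forecast):
    # Mean Absolute Percentage Error, in percent
    actual, forecast = np.asarray(actual), np.asarray(forecast)
    return np.mean(np.abs((actual - forecast) / actual)) * 100

actual = np.array([520, 560, 610, 660])        # hypothetical throughput values
f_regression = np.array([515, 566, 605, 668])  # hypothetical model outputs
f_grey = np.array([530, 548, 620, 645])
f_combined = (f_regression + f_grey) / 2       # simple equal-weight combination

for name, f in [("regression", f_regression), ("grey", f_grey),
                ("combined", f_combined)]:
    print(f"{name}: MAD={mad(actual, f):.1f}, MAPE={mape(actual, f):.2f}%")
```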
Group Prenatal Care: A Financial Perspective.
Rowley, Rebecca A; Phillips, Lindsay E; O'Dell, Lisa; Husseini, Racha El; Carpino, Sarah; Hartman, Scott
2016-01-01
Multiple studies have demonstrated improved perinatal outcomes for group prenatal care (GPC) when compared to traditional prenatal care. Benefits of GPC include lower rates of prematurity and low birth weight, fewer cesarean deliveries, improved breastfeeding outcomes and improved maternal satisfaction with care. However, the outpatient financial costs of running a GPC program are not well established. This study involved the creation of a financial model that forecasted costs and revenues for prenatal care groups with various numbers of participants based on numerous variables, including patient population, payor mix, patient show rates, staffing mix, supply usage and overhead costs. The model was developed for use in an urban underserved practice. Adjusted revenue per pregnancy in this model was found to be $989.93 for traditional care and $1080.69 for GPC. Cost neutrality for GPC was achieved when each group enrolled an average of 10.652 women with an enriched staffing model or 4.801 women when groups were staffed by a single nurse and single clinician. Mathematical cost-benefit modeling in an urban underserved practice demonstrated that GPC can be not only financially sustainable but possibly a net income generator for the outpatient clinic. Use of this model could offer maternity care practices an important tool for demonstrating the financial practicality of GPC.
Forecasting Dust Storms Using the CARMA-Dust Model and MM5 Weather Data
NASA Astrophysics Data System (ADS)
Barnum, B. H.; Winstead, N. S.; Wesely, J.; Hakola, A.; Colarco, P.; Toon, O. B.; Ginoux, P.; Brooks, G.; Hasselbarth, L. M.; Toth, B.; Sterner, R.
2002-12-01
An operational model for the forecast of dust storms in Northern Africa, the Middle East and Southwest Asia has been developed for the United States Air Force Weather Agency (AFWA). The dust forecast model uses the 5th-generation Penn State Mesoscale Meteorology Model (MM5) and a modified version of the Colorado Aerosol and Radiation Model for Atmospheres (CARMA). AFWA conducted a 60-day evaluation of the dust model to assess its ability to forecast dust storms for short-, medium- and long-range (72-hour) forecast periods. The study used satellite and ground observations of dust storms to verify the model's effectiveness. Each of the main mesoscale forecast theaters was broken down into smaller sub-regions for detailed analysis. The study found the model was able to forecast dust storms in Saharan Africa and the Sahel region with an average Probability of Detection (POD) exceeding 68%, with a 16% False Alarm Rate (FAR). The Southwest Asian theater had average PODs of 61%, with FARs averaging 10%.
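The two verification scores quoted above can be computed from a standard forecast/observation contingency table, as in this hedged sketch (counts are illustrative, not the AFWA evaluation data).

```python
# Contingency-table counts: hits (forecast & observed), false alarms
# (forecast, not observed), misses (observed, not forecast).

def pod(hits, misses):
    """Probability of Detection: fraction of observed events that were forecast."""
    return hits / (hits + misses)

def far(hits, false_alarms):
    """False Alarm Rate (ratio): fraction of forecast events that did not occur."""
    return false_alarms / (hits + false_alarms)

# illustrative counts only, not the study's data
hits, misses, false_alarms = 68, 32, 13
print(f"POD = {pod(hits, misses):.2f}, FAR = {far(hits, false_alarms):.2f}")
```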
NASA Technical Reports Server (NTRS)
Sharp, J. M.; Thomas, R. W.
1975-01-01
How LANDSAT imagery can be cost-effectively employed to augment an operational hydrologic model is described. Attention is directed toward the estimation of snow water content, a major predictor variable in the volumetric runoff forecasting model. A stratified double sampling scheme is supplemented with qualitative and quantitative analyses of existing operations to develop a comparison between the existing and satellite-aided approaches to snow water content estimation. Results show a decided advantage for the LANDSAT-aided approach.
A prediction model to forecast the cost impact from a break in the production schedule
NASA Technical Reports Server (NTRS)
Delionback, L. M.
1977-01-01
The losses experienced after a break or stoppage in the sequence of a production cycle present an extremely complex situation involving numerous variables, some of uncertain quantity and quality. There are no discrete formulas to define the losses during a gap in production. The techniques employed therefore amount to a prediction or forecast of the losses that take place, based on the conditions that exist in the production environment. Parameters such as learning curve slope, number of predecessor units, and the length of time the production sequence is halted are used in formulating a prediction model. The pertinent current publications on this subject are few in number, but are reviewed to provide an understanding of the problem. Example problems are illustrated together with appropriate trend curves to show the approach, and solved problems are given to show the application of the models to actual production breaks.
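A hedged sketch of the kind of model the abstract describes follows: unit cost on a log-linear learning curve, with a production break eroding accumulated learning. The linear loss-of-learning function and the 36-month full-forget horizon are illustrative assumptions, not the report's formulas.

```python
import math

def unit_cost(first_unit_cost, unit_number, slope):
    # log-linear learning curve; e.g. slope=0.85 means an 85% curve
    b = math.log(slope) / math.log(2)
    return first_unit_cost * unit_number ** b

def restart_unit_cost(first_unit_cost, predecessors, slope, break_months,
                      full_forget_months=36.0):
    # assumed: learning lost grows linearly with break length, capped at 1
    learning_lost = min(break_months / full_forget_months, 1.0)
    # retained experience is equivalent to having built fewer units
    effective_units = max(1.0, predecessors * (1.0 - learning_lost))
    return unit_cost(first_unit_cost, effective_units, slope)

before = unit_cost(1000.0, 50, 0.85)               # cost of unit 50, no break
after = restart_unit_cost(1000.0, 50, 0.85, 12.0)  # restart after a 12-month gap
print(f"pre-break: {before:.0f}, post-break restart: {after:.0f}")
```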
2007-03-01
terror attacks led people to believe that flying was unsafe. The Breusch-Pagan/Cook-Weisberg test for heteroskedasticity was performed to find... would yield an increase of 13.8 million revenue passenger miles. As with the previous model, the Breusch-Pagan/Cook-Weisberg test for...
Public Investment and the Goal of Providing Universal Access to Primary Education by 2015 in Kenya
ERIC Educational Resources Information Center
Omwami, Edith Mukudi; Omwami, Raymond K.
2010-01-01
The authors use population census data to project school enrolment for Kenya. They also employ current education sector budget and national revenue base statistics to model the sector budget and to forecast the revenue base growth required to sustain universal primary education (UPE). The 2003 fiscal year unit cost of education is used as the base…
Research on light rail electric load forecasting based on ARMA model
NASA Astrophysics Data System (ADS)
Huang, Yifan
2018-04-01
The article compares a variety of time series models in light of the characteristics of power load forecasting. A light rail load forecasting model based on the ARMA model is then established, and the electric load of a light rail system is forecasted with it. The prediction results show that the accuracy of the model's predictions is high.
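A minimal sketch of such an ARMA-based load forecast, here using the statsmodels library on a synthetic hourly series; the (2, 0, 1) order and the data are placeholders, not the paper's fitted model.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# synthetic hourly load with a daily cycle plus noise (toy data)
rng = np.random.default_rng(0)
hours = np.arange(24 * 30)
load = 50 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, hours.size)

model = ARIMA(load, order=(2, 0, 1))   # ARMA(2,1) == ARIMA with d=0
fit = model.fit()
forecast = fit.forecast(steps=24)      # next day's hourly load
print(forecast[:6])
```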
Parr, Nick; Li, Jackie; Tickle, Leonie
2016-07-01
The economic implications of increasing life expectancy are important concerns for governments in developed countries. The aims of this study were as follows: (i) to forecast mortality for 14 developed countries from 2010 to 2050, using the Poisson Common Factor Model; (ii) to project the effects of the forecast mortality patterns on support ratios; and (iii) to calculate labour force participation increases which could offset these effects. The forecast gains in life expectancy correlate negatively with current fertility. Pre-2050 support ratios are projected to fall most in Japan and east-central and southern Europe, and least in Sweden and Australia. A post-2050 recovery is projected for most east-central and southern European countries. The increases in labour force participation needed to counterbalance the effects of mortality improvement are greatest for Japan, Poland, and the Czech Republic, and least for the USA, Canada, Netherlands, and Sweden. The policy implications are discussed.
NASA Astrophysics Data System (ADS)
Wu, Guocan; Zheng, Xiaogu; Dan, Bo
2016-04-01
Shallow soil moisture observations are assimilated into the Common Land Model (CoLM) to estimate the soil moisture in different layers. The forecast error is inflated to improve the accuracy of the analysis state, and a water balance constraint is adopted to reduce the water budget residual in the assimilation procedure. The experimental results illustrate that adaptive forecast error inflation can reduce the analysis error, while the proper inflation layer can be selected based on the -2 log-likelihood function of the innovation statistic. The water balance constraint substantially reduces the water budget residual, at a small cost in assimilation accuracy. The assimilation scheme can potentially be applied to assimilate remote sensing data.
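The inflation-selection criterion named above can be illustrated in a scalar toy setting: choose the inflation factor that minimizes the -2 log-likelihood of the innovation (observation-minus-forecast) statistic. Everything in this sketch is a simplified assumption; real assimilation systems work with full covariance matrices.

```python
import numpy as np

def neg2_log_likelihood(lam, innovations, hph, r):
    # innovation variance under inflation factor lam: lam * HPH^T + R (scalar)
    var = lam * hph + r
    return np.sum(np.log(var) + innovations ** 2 / var)

def best_inflation(innovations, hph, r, grid=np.linspace(1.0, 3.0, 201)):
    # grid search for the inflation factor minimizing -2 log-likelihood
    scores = [neg2_log_likelihood(l, innovations, hph, r) for l in grid]
    return grid[int(np.argmin(scores))]

rng = np.random.default_rng(1)
true_lambda = 1.8
innov = rng.normal(0.0, np.sqrt(true_lambda * 0.4 + 0.1), size=200)
print(best_inflation(innov, hph=0.4, r=0.1))   # should land near 1.8
```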
NASA Astrophysics Data System (ADS)
Pokhrel, Prafulla; Wang, Q. J.; Robertson, David E.
2013-10-01
Seasonal streamflow forecasts are valuable for planning and allocation of water resources. In Australia, the Bureau of Meteorology employs a statistical method to forecast seasonal streamflows. The method uses predictors that are related to catchment wetness at the start of a forecast period and to climate during the forecast period. For the latter, a predictor is selected among a number of lagged climate indices as candidates to give the "best" model in terms of model performance in cross validation. This study investigates two strategies for further improvement in seasonal streamflow forecasts. The first is to combine, through Bayesian model averaging, multiple candidate models with different lagged climate indices as predictors, to take advantage of the different predictive strengths of the multiple models. The second strategy is to introduce additional candidate models, using rainfall and sea surface temperature predictions from a global climate model as predictors, to take advantage of the direct simulations of various dynamic processes. The results show that combining forecasts from multiple statistical models generally yields more skillful forecasts than using only the best model and appears to moderate the worst forecast errors. The use of rainfall predictions from the dynamical climate model marginally improves the streamflow forecasts when viewed over all the study catchments and seasons, but the use of sea surface temperature predictions provides little additional benefit.
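A hedged sketch of the first strategy: combine candidate models by Bayesian model averaging, with weights proportional to each model's predictive likelihood in cross validation. The weighting scheme and numbers below are illustrative, not the Bureau's operational method.

```python
import numpy as np

def bma_weights(cv_errors, sigma):
    # cv_errors: (n_models, n_cv_points) out-of-sample errors per candidate model
    loglik = -0.5 * np.sum((cv_errors / sigma) ** 2, axis=1)
    w = np.exp(loglik - loglik.max())   # subtract max for numerical stability
    return w / w.sum()

cv_errors = np.array([[0.3, -0.2, 0.4],    # model using lagged climate index A
                      [0.6, -0.5, 0.7],    # ... index B
                      [0.2,  0.3, -0.1]])  # ... index C
forecasts = np.array([102.0, 95.0, 99.0])  # each model's seasonal flow forecast
w = bma_weights(cv_errors, sigma=0.5)
print("weights:", np.round(w, 3), "combined:", float(w @ forecasts))
```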
A Sensor Driven Probabilistic Method for Enabling Hyper Resolution Flood Simulations
NASA Astrophysics Data System (ADS)
Fries, K. J.; Salas, F.; Kerkez, B.
2016-12-01
A reduction in the cost of sensors and wireless communications is now enabling researchers and local governments to make flow, stage and rain measurements at locations that are not covered by existing USGS or state networks. We ask the question: how should these new sources of densified, street-level sensor measurements be used to make improved forecasts using the National Water Model (NWM)? Assimilating these data "into" the NWM can be challenging due to computational complexity, as well as the heterogeneity of sensor and other input data. Instead, we introduce a machine learning and statistical framework that layers these data "on top" of the NWM outputs to improve high-resolution hydrologic and hydraulic forecasting. By generalizing our approach into a post-processing framework, a rapidly repeatable blueprint is generated for decision makers who want to improve local forecasts by coupling sensor data with the NWM. We present preliminary results based on case studies in highly instrumented watersheds in the US. Through the use of statistical learning tools and hydrologic routing schemes, we demonstrate the ability of our approach to improve forecasts while simultaneously characterizing bias and uncertainty in the NWM.
Short-term forecasting of turbidity in trunk main networks.
Meyers, Gregory; Kapelan, Zoran; Keedwell, Edward
2017-11-01
Water discolouration is an increasingly important and expensive issue due to rising customer expectations, tighter regulatory demands and ageing Water Distribution Systems (WDSs) in the UK and abroad. This paper presents a new turbidity forecasting methodology capable of aiding operational staff and enabling proactive management strategies. The turbidity forecasting methodology developed here is completely data-driven and does not require a hydraulic or water-quality network model, which is expensive to build and maintain. The methodology is tested and verified on a real trunk main network with observed turbidity measurement data. Results obtained show that the methodology can detect if discolouration material is mobilised, estimate if sufficient turbidity will be generated to exceed a preselected threshold and approximate how long the material will take to reach the downstream meter. Classification-based forecasts of turbidity can be reliably made up to 5 h ahead, although at the expense of increased false alarm rates. The methodology presented here could be used as an early warning system enabling a multitude of cost-beneficial proactive management strategies to be implemented as an alternative to expensive trunk mains cleaning programs.
Reservoir optimisation using El Niño information. Case study of Daule Peripa (Ecuador)
NASA Astrophysics Data System (ADS)
Gelati, Emiliano; Madsen, Henrik; Rosbjerg, Dan
2010-05-01
The optimisation of water resources systems requires the ability to produce runoff scenarios that are consistent with available climatic information. We approach stochastic runoff modelling with a Markov-modulated autoregressive model with exogenous input, which belongs to the class of Markov-switching models. The model assumes runoff parameterisation to be conditioned on a hidden climatic state following a Markov chain, whose state transition probabilities depend on climatic information. This approach allows stochastic modeling of non-stationary runoff, as runoff anomalies are described by a mixture of autoregressive models with exogenous input, each one corresponding to a climate state. We calibrate the model on the inflows of the Daule Peripa reservoir located in western Ecuador, where the occurrence of El Niño leads to anomalously heavy rainfall caused by positive sea surface temperature anomalies along the coast. El Niño - Southern Oscillation (ENSO) information is used to condition the runoff parameterisation. Inflow predictions are realistic, especially at the occurrence of El Niño events. The Daule Peripa reservoir serves a hydropower plant and a downstream water supply facility. Using historical ENSO records, synthetic monthly inflow scenarios are generated for the period 1950-2007. These scenarios are used as input to perform stochastic optimisation of the reservoir rule curves with a multi-objective Genetic Algorithm (MOGA). The optimised rule curves are assumed to be the reservoir base policy. ENSO standard indices are currently forecasted at monthly time scale with nine-month lead time. These forecasts are used to perform stochastic optimisation of reservoir releases at each monthly time step according to the following procedure: (i) nine-month inflow forecast scenarios are generated using ENSO forecasts; (ii) a MOGA is set up to optimise the upcoming nine monthly releases; (iii) the optimisation is carried out by simulating the releases on the inflow forecasts, and by applying the base policy on a subsequent synthetic inflow scenario in order to account for long-term costs; (iv) the optimised release for the first month is implemented; (v) the state of the system is updated and (i), (ii), (iii), and (iv) are iterated for the following time step. The results highlight the advantages of using a climate-driven stochastic model to produce inflow scenarios and forecasts for reservoir optimisation, showing potential improvements with respect to the current management. Dynamic programming was used to find the best possible release time series given the inflow observations, in order to benchmark any possible operational improvement.
Karin L. Riley; Crystal Stonesifer; Haiganoush Preisler; Dave Calkin
2014-01-01
Can fire potential forecasts assist with pre-positioning of fire suppression resources, which could result in a cost savings to the United States government? Here, we present a preliminary assessment of the 7-Day Fire Potential Outlook forecasts made by the Predictive Services program. We utilized historical fire occurrence data and archived forecasts to assess how...
Solar Photovoltaic and Liquid Natural Gas Opportunities for Command Naval Region Hawaii
2014-12-01
... the forecasted LS diesel price and the forecasted LNG delivered-to-the-power-plant cost. The forecast for LS diesel by FGE from year 2020–2030 is seen...
A Comparison of Forecast Error Generators for Modeling Wind and Load Uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Ning; Diao, Ruisheng; Hafen, Ryan P.
2013-12-18
This paper presents four algorithms to generate random forecast error time series: a truncated-normal distribution model, a state-space based Markov model, a seasonal autoregressive moving average (ARMA) model, and a stochastic-optimization based model. The error time series are used to create real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast time series that statistically match historically observed forecasting data sets, for use in variable generation integration studies. A comparison is made using historical DA load forecasts and actual load values to generate new sets of DA forecasts with similar statistical forecast error characteristics. The paper discusses and compares the capabilities of each algorithm to preserve the characteristics of the historical forecast data sets.
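As an example of the first generator in the list, the sketch below draws forecast errors from a truncated normal distribution; in practice the moments and bounds would be fitted to historical forecast-minus-actual data, and all parameters here are placeholders.

```python
import numpy as np
from scipy.stats import truncnorm

def truncated_normal_errors(mu, sigma, lower, upper, n, seed=0):
    # truncnorm takes bounds standardized to the unit normal
    a, b = (lower - mu) / sigma, (upper - mu) / sigma
    return truncnorm.rvs(a, b, loc=mu, scale=sigma, size=n, random_state=seed)

# e.g. day-ahead wind forecast error, bounded at +/-30% of rated capacity
errors = truncated_normal_errors(mu=0.0, sigma=0.08, lower=-0.3, upper=0.3,
                                 n=8760)
actual = 0.4 + 0.1 * np.sin(np.linspace(0.0, 50.0, 8760))  # toy normalized output
da_forecast = np.clip(actual + errors, 0.0, 1.0)           # synthetic DA series
print(da_forecast[:5])
```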
NASA Astrophysics Data System (ADS)
Morin, C.; Quattrochi, D. A.; Zavodsky, B.; Case, J.
2015-12-01
Dengue fever (DF) is an important mosquito-transmitted disease that is strongly influenced by meteorological and environmental conditions. Recent research has focused on forecasting DF case numbers based on meteorological data. However, these forecasting tools have generally relied on empirical models that require long DF time series to train. Additionally, their accuracy has been tested retrospectively, using past meteorological data. Consequently, the operational utility of the forecasts is still in question, because the errors associated with weather and climate forecasts are not reflected in the results. Using up-to-date weekly dengue case numbers for model parameterization and weather forecast data as meteorological input, we produced weekly forecasts of DF cases in San Juan, Puerto Rico. Each week, the past weeks' case counts were used to re-parameterize a process-based DF model driven with updated weather forecast data to generate forecasts of DF case numbers. Real-time weather forecast data were produced using the Weather Research and Forecasting (WRF) numerical weather prediction (NWP) system, enhanced with additional high-resolution NASA satellite data. This methodology was conducted in a weekly iterative process, with each DF forecast evaluated against county-level DF cases reported by the Puerto Rico Department of Health. The one-week DF forecasts were accurate, especially considering the two sources of model error. First, weather forecasts were sometimes inaccurate and generally produced lower than observed temperatures. Second, the DF model was often overly influenced by the previous week's DF case numbers, though this phenomenon could be lessened by increasing the number of simulations included in the forecast. Although these results are promising, we would like to develop a methodology to produce longer-range forecasts so that public health workers can better prepare for dengue epidemics.
Combining forecast weights: Why and how?
NASA Astrophysics Data System (ADS)
Yin, Yip Chee; Kok-Haur, Ng; Hock-Eam, Lim
2012-09-01
This paper proposes a procedure called forecast weight averaging, a specific combination of forecast weights obtained from different methods of constructing forecast weights, for the purpose of improving the accuracy of pseudo out-of-sample forecasting. It is found that under certain specified conditions, forecast weight averaging can lower the mean squared forecast error obtained from model averaging. In addition, we show that in a linear and homoskedastic environment, this superior predictive ability of forecast weight averaging holds true irrespective of whether the coefficients are tested by the t statistic or the z statistic, provided the significance level is within the 10% range. By theoretical proofs and a simulation study, we show that model averaging methods such as variance model averaging, simple model averaging and standard error model averaging each produce a mean squared forecast error larger than that of forecast weight averaging. Finally, this result also holds true, marginally, when applied to empirical business and economic data sets: the Gross Domestic Product (GDP) growth rate, Consumer Price Index (CPI) and Average Lending Rate (ALR) of Malaysia.
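The mechanics of forecast weight averaging can be sketched as follows: compute weight vectors under several schemes, average them, and combine the model forecasts with the averaged weights. The inverse-MSE, AIC-style and equal-weight schemes below are illustrative stand-ins for the methods analyzed in the paper.

```python
import numpy as np

def inverse_mse_weights(mse):
    # weights inversely proportional to out-of-sample MSE
    w = 1.0 / np.asarray(mse)
    return w / w.sum()

def aic_weights(aic):
    # Akaike-style weights from information-criterion differences
    d = np.asarray(aic) - np.min(aic)
    w = np.exp(-0.5 * d)
    return w / w.sum()

mse = [1.2, 0.9, 1.5]               # per-model out-of-sample MSE (toy)
aic = [210.0, 208.5, 214.0]         # per-model AIC (toy)
schemes = [inverse_mse_weights(mse), aic_weights(aic),
           np.full(3, 1.0 / 3.0)]   # equal weights
w_avg = np.mean(schemes, axis=0)    # forecast weight averaging

forecasts = np.array([4.1, 3.8, 4.6])   # each model's GDP growth forecast, %
print("averaged weights:", np.round(w_avg, 3),
      "combined forecast:", float(w_avg @ forecasts))
```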
Quantifying the Value of Satellite Imagery in Agriculture and other Sectors
NASA Astrophysics Data System (ADS)
Brown, M. E.; Abbott, P. C.; Escobar, V. M.
2013-12-01
This study focused on quantifying the commercial value of satellite remote sensing for agriculture. Commercial value from satellite imagery arises when improved information leads to better economic decisions. We identified five areas of application of remote sensing to agriculture with this potential: crop management (precision agriculture), insurance, real estate assessment, crop forecasting, and environmental monitoring. These applications can be divided between public information (crop forecasting) and those that may generate private commercial value (crop management), with both public and private information dimensions in some categories. Public information applications of remote sensing have been more successful in the past, and are likely to generate more economic value in the future. Several issues have limited realization of the potential to generate private value from remote sensing in agriculture: the scale of use is small, because the high cost of acquiring and interpreting large images has limited their cost-effectiveness for individual farmers. Insurance, environmental monitoring, and crop management services offered by cooperatives or consultants may be cases that overcome this limitation. The greatest opportunities for potential commercial value from agriculture are probably in the crop forecasting area, especially where agricultural statistics services are not as well developed, since public market information benefits a broad range of economic actors, not limited to the countries where forecasts are made. We estimate here the value from components of USDA's World Agricultural Supply and Demand Estimates (WASDE) forecasts for corn, indicating potential value increases in the range of $60 to $240 million if improved satellite-based information enhances those forecasts. The research was conducted by agricultural economists at Purdue University, and will be the basis for further evaluation of the use of satellite data within the NASA Carbon Monitoring System (CMS). A general evaluation framework to determine the usefulness of the CMS products to various users and to the broader community interested in managing carbon is shown in Figure 2. The first step in conducting such an analysis is to develop an understanding of the history, institutions, behaviors and other factors setting the context of an application which CMS data products inform. Decision makers are identified (who may become early adopters), and the alternative decisions they might take are elaborated. Economic models informed by biophysical models would then predict the outcome of the engagement. The new information must then be linked to a revised decision, and that decision in turn must lead to better economic or social outcomes on average. The value of the information is estimated as the predicted increase in economic surplus (profit, cost, consumer welfare) or social outcome that is a direct result of that revised decision. Alternative Monte Carlo simulations would estimate averages of key outcomes under alternative circumstances, such as differing regulations or better data, hence capturing consequences of the changes induced. These approaches will be described in the context of NASA and satellite data.
Provincial Variation of Cochlear Implantation Surgical Volumes and Cost in Canada.
Crowson, Matthew G; Chen, Joseph M; Tucci, Debara
2017-01-01
Objectives To investigate provincial cochlear implantation (CI) annual volume and cost trends. Study Design Database analysis. Setting National surgical volume and cost database. Subjects and Methods Aggregate-level provincial CI volumes and cost data for adult and pediatric CI surgery from 2005 to 2014 were obtained from the Canadian Institute for Health Information. Population-level aging forecast estimates were obtained from the Ontario Ministry of Finance and Statistics Canada. Linear fit, analysis of variance, and Tukey's analyses were utilized to compare variances and means. Results The national volume of annual CI procedures is forecasted to increase by <30 per year (R² = 0.88). Ontario has the highest mean annual CI volume (282; 95% confidence interval, 258-308), followed by Alberta (92.0; 95% confidence interval, 66.3-118), which are significantly higher than all other provinces (P < .05 for each). Ontario's annual CI procedure volume is forecasted to increase by <11 per year (R² = 0.62). Newfoundland and Nova Scotia have the highest CI procedures per 100,000 residents as compared with all other provinces (P < .05). Alberta, Newfoundland, and Manitoba have the highest estimated implantation cost of all provinces (P < .05). Conclusions Historical trends of CI forecast modest national volume growth. Potential bottlenecks include provincial funding and access to surgical expertise. The proportion of older adult patients who may benefit from a CI will rise, and there may be insufficient capacity to meet this need. Delayed access to CI for pediatric patients is also a concern, given recent reports of long wait times for CI surgery.
NASA Astrophysics Data System (ADS)
Declair, Stefan; Saint-Drenan, Yves-Marie; Potthast, Roland
2017-04-01
Determining the amount of weather-dependent renewable energy is a demanding task for transmission system operators (TSOs), and wind and photovoltaic (PV) prediction errors require the use of reserve power, which generates costs and can, in extreme cases, endanger the security of supply. In the project EWeLiNE, funded by the German government, the German Weather Service and the Fraunhofer Institute on Wind Energy and Energy System Technology are developing innovative weather and power forecasting models and tools for grid integration of weather-dependent renewable energy. The key part of energy prediction process chains is the numerical weather prediction (NWP) system. Irradiation forecasts from NWP systems are, however, subject to several sources of error. For PV power prediction, weaknesses of the NWP model in correctly forecasting, e.g., low stratus, absorption of condensed water or aerosol optical depth are the main sources of errors. Inaccurate radiation schemes (e.g., the two-stream parametrization) are also a known deficit of NWP systems with regard to irradiation forecasts. To mitigate errors like these, the latest observations can be used in a pre-processing technique called data assimilation (DA). In DA, not only are the initial fields provided, but the model is also synchronized with reality, the observations, and hence forecast errors are reduced. Besides conventional observation networks such as radiosondes, synoptic observations or air reports of wind, pressure and humidity, the number of observations measuring meteorological information indirectly by means of remote sensing, such as satellite radiances, radar reflectivities or GPS slant delays, is increasing strongly. The numerous PV plants installed in Germany potentially represent a dense meteorological network assessing irradiation through their power measurements. Forecast accuracy may thus be enhanced by extending the observations in the assimilation with this new source of information. PV power plants can provide information on clouds, aerosol optical depth or low stratus in the manner of remote sensing: the power output is strongly dependent on perturbations along the slant between the sun's position and the PV panel. Since these data are not limited to the vertical column above or below the detector, they may complement satellite data and compensate for weaknesses in the radiation scheme. In this contribution, the DA technique used (Local Ensemble Transform Kalman Filter, LETKF) is briefly sketched. Furthermore, the computation of the model power equivalents is described, and first results are presented and discussed.
Advanced, Cost-Based Indices for Forecasting the Generation of Photovoltaic Power
NASA Astrophysics Data System (ADS)
Bracale, Antonio; Carpinelli, Guido; Di Fazio, Annarita; Khormali, Shahab
2014-01-01
Distribution systems are undergoing significant changes as they evolve toward the grids of the future, known as smart grids (SGs). SGs promise to facilitate large-scale penetration of distributed generation using renewable energy sources (RESs), encourage the efficient use of energy, reduce system losses, and improve the quality of power. Photovoltaic (PV) systems have become one of the most promising RESs due to the expected cost reduction and the increased efficiency of PV panels and interfacing converters. The ability to forecast power-production information accurately and reliably is of primary importance for the appropriate management of an SG and for making decisions in the energy market. Several forecasting methods have been proposed, and many indices have been used to quantify the accuracy of forecasts of PV power production. Unfortunately, the indices that have been used have deficiencies and usually do not directly account for the economic consequences of forecasting errors in the framework of liberalized electricity markets. In this paper, advanced, more accurate indices are proposed that account directly for the economic consequences of forecasting errors. The proposed indices are also compared to the most frequently used indices in order to demonstrate their improved capability. The comparisons are based on results obtained using a forecasting method based on an artificial neural network, chosen because it is deemed one of the most promising methods available for forecasting PV power. Numerical applications considering an actual PV plant are also presented to provide evidence of the forecasting performance of all of the indices considered.
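The contrast the paper draws can be sketched by comparing a standard accuracy index (RMSE) with an index that prices under- and over-forecasting asymmetrically, as a liberalized market would; the prices and data below are invented, not the paper's index definitions.

```python
import numpy as np

def rmse(actual, forecast):
    return float(np.sqrt(np.mean((np.asarray(actual) - np.asarray(forecast)) ** 2)))

def economic_cost(actual, forecast, price_short=60.0, price_long=25.0):
    # positive error = over-forecast (producer is short energy, pays price_short);
    # negative error = under-forecast (producer is long, settles at price_long)
    err = np.asarray(forecast) - np.asarray(actual)
    cost = np.where(err > 0, err * price_short, -err * price_long)
    return float(cost.sum())

actual = np.array([0.0, 1.2, 3.5, 4.0, 2.1, 0.3])    # MWh per hour, toy PV day
forecast = np.array([0.0, 1.5, 3.1, 4.4, 1.8, 0.2])
print("RMSE:", round(rmse(actual, forecast), 3),
      "economic cost:", round(economic_cost(actual, forecast), 1))
```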
NASA Astrophysics Data System (ADS)
Prahara, Eduardi; Suangga, Made; Lutfi Ansori, Ahmad
2017-12-01
This study aims to determine the potential of passenger cars diverting from the national road to the on-construction Cisumdawu Toll Road. The study was conducted by a traffic count survey followed by a roadside interview survey. The Stated Preference method was used to analyse the trip forecasting value. The mode choice model between the planned new mode (Cisumdawu Toll Road) and the current intercity road is U_JT - U_JR = 0.1079 - 0.507726 x1 - 0.8953764 x2 for Cileunyi - Sumedang, and U_JT - U_JR = 0.0790 - 0.301341 x1 - 0.548446 x2 for Sumedang - Cileunyi. Multiple linear regression analysis was used to obtain the forecast of private vehicles that divert to the new toll road (Cisumdawu Toll Road). Trip characteristics such as trip origin and destination, type of trip, occupation, salary, and others become motives for respondents to choose the new mode. The forecast shares preferring to divert to the toll road, in terms of the value of cost and time, are 74.11% and 86.62% respectively for Cileunyi - Sumedang, and 69.60% and 76.48% respectively for Sumedang - Cileunyi. These results are relatively high compared to the toll planning document. Expected impacts include lower overall fuel consumption, lower pollution and, more importantly, a decrease in the maintenance cost of the national road.
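Assuming the fitted utility differences feed a standard binary logit choice model (the abstract does not state the link function, so this is an assumption), the toll-road share can be computed as below; the attribute values x1 and x2 are purely illustrative, since their units and coding are not given.

```python
import math

def p_toll(x1, x2):
    # utility difference for Cileunyi - Sumedang, coefficients from the abstract;
    # x1 and x2 are the cost and time attributes of the stated-preference design
    du = 0.1079 - 0.507726 * x1 - 0.8953764 * x2   # U_JT - U_JR
    return 1.0 / (1.0 + math.exp(-du))             # binary logit share

# hypothetical attribute values, for illustration only
print(f"toll-road share: {p_toll(x1=-0.5, x2=-0.8):.2%}")
```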
A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method.
Yang, Jun-He; Cheng, Ching-Hsue; Chan, Chia-Pan
2017-01-01
Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model that estimates missing values and then applies variable selection to forecast a reservoir's water level. The study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015; the two datasets were concatenated, ordered by date, into an integrated research dataset. The proposed time-series forecasting model has three foci. First, the study uses five imputation methods to handle missing values rather than simply deleting them. Second, key variables are identified via factor analysis, and unimportant variables are then deleted sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level, which is compared with the listed methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model with variable selection has better forecasting performance than the listed models using the full set of variables. In addition, the experiment shows that the proposed variable selection can help the five forecasting methods used here to improve their forecasting capability.
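A minimal sketch of the final modeling step: a Random Forest regressor mapping lagged, already-imputed predictors to the next day's water level. The synthetic data, lags and features are illustrative, not the Shimen Reservoir dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
rain = rng.gamma(2.0, 5.0, n)                        # daily rainfall, toy series
level = 240 + np.cumsum(0.02 * rain - 0.1) + rng.normal(0, 0.3, n)

# features: yesterday's level and the last two days' rainfall
X = np.column_stack([level[1:-1], rain[1:-1], rain[:-2]])
y = level[2:]                                        # target: today's level

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:-50], y[:-50])                          # hold out the last 50 days
print("holdout MAE:",
      float(np.mean(np.abs(model.predict(X[-50:]) - y[-50:]))))
```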
NASA Astrophysics Data System (ADS)
Wood, E. F.; Yuan, X.; Sheffield, J.; Pan, M.; Roundy, J.
2013-12-01
One of the key recommendations of the WCRP Global Drought Information System (GDIS) workshop is to develop an experimental real-time global monitoring and prediction system. While great advances have been made in global drought monitoring based on satellite observations and model reanalysis data, global drought forecasting has lagged, in part due to the limited skill of both climate forecast models and global hydrologic predictions. Having worked on drought monitoring and forecasting over the USA for more than a decade, the Princeton land surface hydrology group is now developing an experimental global drought early warning system based on multiple climate forecast models and a calibrated global hydrologic model. In this presentation, we test its capability in seasonal forecasting of meteorological, agricultural and hydrologic droughts over major global river basins, using precipitation, soil moisture and streamflow forecasts respectively. Based on the joint probability distribution between observations from Princeton's global drought monitoring system and model hindcasts and real-time forecasts from the North American Multi-Model Ensemble (NMME) project, we (i) bias-correct the monthly precipitation and temperature forecasts from multiple climate forecast models, (ii) downscale them to a daily time scale, and (iii) use them to drive the calibrated VIC model to produce global drought forecasts at a 1-degree resolution. A parallel run using the ESP forecast method, which is based on resampling historical forcings, is also carried out for comparison. Analysis is conducted over major global river basins, with multiple drought indices that have different time scales and characteristics. The meteorological drought forecast does not carry uncertainty from hydrologic models and can be validated directly against observations, making the validation an 'apples-to-apples' comparison. Preliminary results for the evaluation of meteorological drought onset hindcasts indicate that climate models increase drought detectability over ESP by 31%-81%. However, less than 30% of global drought onsets can be detected by climate models. The missed drought events are associated with weak ENSO signals and lower potential predictability. Due to the high false alarm rate from climate models, reliability is more important than sharpness for a skillful probabilistic drought onset forecast. Validations and skill assessments for agricultural and hydrologic drought forecasts are carried out using soil moisture and streamflow output from the VIC land surface model (LSM) forced by a global forcing data set. Given our previous drought forecasting experience over the USA and Africa, validating hydrologic drought forecasting is a significant challenge for a global drought early warning system.
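Step (i), the bias correction, is often done by quantile mapping; the sketch below shows the empirical version under that assumption (the abstract does not name the exact method), applied in practice per variable for each month, lead time and grid cell.

```python
import numpy as np

def quantile_map(forecast, hindcast, observed):
    # rank the new forecast within the model's hindcast climatology...
    q = np.searchsorted(np.sort(hindcast), forecast) / len(hindcast)
    # ...and read off the same quantile of the observed climatology
    return np.quantile(observed, np.clip(q, 0.0, 1.0))

rng = np.random.default_rng(2)
hindcast = rng.gamma(4.0, 20.0, 300)   # model monthly precip, biased wet (toy)
observed = rng.gamma(4.0, 15.0, 300)   # observed monthly precip (toy)
print(quantile_map(110.0, hindcast, observed))   # bias-corrected forecast, mm
```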
A Wind Forecasting System for Energy Application
NASA Astrophysics Data System (ADS)
Courtney, Jennifer; Lynch, Peter; Sweeney, Conor
2010-05-01
Accurate forecasting of available energy is crucial for the efficient management and use of wind power in the national power grid. With energy output critically dependent upon wind strength, there is a need to reduce the errors associated with wind forecasting. The objective of this research is to get the best possible wind forecasts for the wind energy industry. To achieve this goal, three methods are being applied. First, a mesoscale numerical weather prediction (NWP) model called WRF (Weather Research and Forecasting) is being used to predict wind values over Ireland. Currently, a grid resolution of 10 km is used, and higher model resolutions are being evaluated to establish whether they are economically viable given the forecast skill improvement they produce. Second, the WRF model is being used in conjunction with ECMWF (European Centre for Medium-Range Weather Forecasts) ensemble forecasts to produce a probabilistic weather forecasting product. Due to the chaotic nature of the atmosphere, a single, deterministic weather forecast can only have limited skill. The ECMWF ensemble methods produce an ensemble of 51 global forecasts, twice a day, by perturbing the initial conditions of a 'control' forecast which is the best estimate of the initial state of the atmosphere. This method provides an indication of the reliability of the forecast and a quantitative basis for probabilistic forecasting. The limitation of ensemble forecasting lies in the fact that the perturbed model runs behave differently under different weather patterns, and each model run is equally likely to be closest to the observed weather situation. Models have biases, and involve assumptions about physical processes and forcing factors such as underlying topography. Third, Bayesian Model Averaging (BMA) is being applied to the output from the ensemble forecasts in order to statistically post-process the results and achieve a better wind forecasting system. BMA is a promising technique that will offer calibrated probabilistic wind forecasts which will be invaluable in wind energy management. In brief, this method turns the ensemble forecasts into a calibrated predictive probability distribution, with each ensemble member given a 'weight' determined by its relative predictive skill over a training period of around 30 days. Verification is carried out using observed wind data from operational wind farms, and the forecasts are compared to existing forecasts produced by ECMWF and Met Eireann in terms of skill scores. We are developing decision-making models to show the benefits achieved using the data produced by our wind energy forecasting system. An energy trading model will be developed, based on the rules currently used by the Single Electricity Market Operator for energy trading in Ireland. This trading model will illustrate the potential for financial savings by using the forecast data generated by this research.
The Escalating Costs of Higher Education.
ERIC Educational Resources Information Center
Kirshstein, Rita J.; And Others
This congressionally mandated study of the escalating cost of higher education focuses on: (1) identifying the cost of obtaining a higher education and determining how that cost has changed from 1976-77 to 1987-88; (2) determining specific causes of such cost changes; (3) forecasting the future cost of obtaining a higher education; (4) evaluating…
Large Scale Skill in Regional Climate Modeling and the Lateral Boundary Condition Scheme
NASA Astrophysics Data System (ADS)
Veljović, K.; Rajković, B.; Mesinger, F.
2009-04-01
Several points are made concerning the somewhat controversial issue of regional climate modeling: should a regional climate model (RCM) be expected to maintain the large-scale skill of the driver global model that is supplying its lateral boundary condition (LBC)? Given that this is normally desired, is it able to do so without help via the fairly popular large-scale nudging? Specifically, without such nudging, will the RCM kinetic energy necessarily decrease with time compared to that of the driver model or analysis data, as suggested by a study using the Regional Atmospheric Modeling System (RAMS)? Finally, can the lateral boundary condition scheme make a difference: is the almost universally used but somewhat costly relaxation scheme necessary for desirable RCM performance? Experiments are made to explore these questions, running the Eta model in two versions differing in the lateral boundary scheme used. One of these schemes is the traditional relaxation scheme; the other is the Eta model scheme, in which information is used at the outermost boundary only, and not all variables are prescribed at the outflow boundary. Forecast lateral boundary conditions are used, and results are verified against the analyses. Thus, the skill of the two RCM forecasts can be and is compared not only against each other but also against that of the driver global forecast. A novel verification method is used in the manner of customary precipitation verification, in that the forecast spatial wind speed distribution is verified against analyses by calculating bias-adjusted equitable threat scores and bias scores for wind speeds greater than chosen thresholds. In this way, focusing on a high wind speed value in the upper troposphere, we suggest that verification of large-scale features can be done in a manner that may be more physically meaningful than verification via spectral decomposition, which is a standard RCM verification method. The results we have at this point are somewhat limited in view of the integrations having been done only for 10-day forecasts. Even so, one should note that they are among the very few done using forecast, as opposed to reanalysis or analysis, global driving data. Our results suggest that (1) running the Eta as an RCM, no significant loss of large-scale kinetic energy with time seems to take place; (2) no disadvantage from using the Eta LBC scheme compared to the relaxation scheme is seen, while enjoying the advantage of the scheme being significantly less demanding than relaxation, given that it needs driver model fields at the outermost domain boundary only; and (3) the Eta RCM skill in forecasting large scales, with no large-scale nudging, seems to be just about the same as that of the driver model, or, in the terminology of Castro et al., the Eta RCM does not lose the "value of the large scale" which exists in the larger global analyses used for the initial condition and for verification.
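The bias-adjusted scoring described above builds on the standard equitable threat score and bias score, which for one wind-speed threshold can be computed from grid-point counts as in this sketch (counts are illustrative, not the study's data).

```python
def ets(hits, false_alarms, misses, total):
    # equitable threat score: threat score corrected for chance-expected hits
    forecast_yes = hits + false_alarms
    observed_yes = hits + misses
    hits_random = forecast_yes * observed_yes / total
    return (hits - hits_random) / (hits + false_alarms + misses - hits_random)

def bias_score(hits, false_alarms, misses):
    # ratio of forecast to observed event area; >1 means over-forecasting
    return (hits + false_alarms) / (hits + misses)

h, fa, m, total = 420, 130, 150, 10000   # grid points above one threshold (toy)
print(f"ETS = {ets(h, fa, m, total):.3f}, bias = {bias_score(h, fa, m):.2f}")
```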
Time series regression and ARIMAX for forecasting currency flow at Bank Indonesia in Sulawesi region
NASA Astrophysics Data System (ADS)
Suharsono, Agus; Suhartono, Masyitha, Aulia; Anuravega, Arum
2015-12-01
The purpose of the study is to forecast the outflow and inflow of currency at the Indonesian central bank, Bank Indonesia (BI), in the Sulawesi Region. The currency outflow and inflow data tend to have a trend pattern which is influenced by calendar variation effects. Therefore, this research focuses on applying forecasting methods that can handle calendar variation effects, i.e. Time Series Regression (TSR) and ARIMAX models, and comparing their forecast accuracy with an ARIMA model. The best model is selected based on the lowest Root Mean Square Error (RMSE) on the out-of-sample dataset. The results show that ARIMA is the best model for forecasting the currency outflow and inflow at South Sulawesi, whereas the best model for forecasting the currency outflow at Central Sulawesi and Southeast Sulawesi, and for forecasting the currency inflow at South Sulawesi and North Sulawesi, is TSR. Additionally, ARIMAX is the best model for forecasting the currency outflow at North Sulawesi. Hence, the results show that more complex models do not necessarily yield more accurate forecasts than simpler ones.
Gambling scores for earthquake predictions and forecasts
NASA Astrophysics Data System (ADS)
Zhuang, Jiancang
2010-04-01
This paper presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular scheme of forecasts and treat each earthquake equally regardless of magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some of his reputation points. The reference model, which plays the role of the house, determines how many reputation points the forecaster gains if he succeeds, according to a fair rule, and takes away the reputation points bet by the forecaster if he loses. The method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and when the reference model is the Poisson model.
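The fair rule can be made concrete as follows: if the reference model assigns probability p0 to the predicted event, a bet of r reputation points fairly pays r(1 - p0)/p0 on success and loses r on failure, so the expected gain under the reference model is zero. A minimal sketch of this bookkeeping (the discrete case only; function names are hypothetical):

```python
def gambling_score_update(reputation, bet, p0, event_occurred):
    # fair payoff: expected gain under the reference ("house") model is
    # p0 * bet * (1 - p0) / p0 - (1 - p0) * bet = 0
    if event_occurred:
        return reputation + bet * (1.0 - p0) / p0
    return reputation - bet

rep = 100.0
# forecaster bets 10 points on an event the reference model deems unlikely
rep = gambling_score_update(rep, bet=10.0, p0=0.2, event_occurred=True)
print(rep)   # 140.0: a risky, correct alarm is rewarded strongly
```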
NASA Astrophysics Data System (ADS)
Rhee, Jinyoung; Kim, Gayoung; Im, Jungho
2017-04-01
Three regions of Indonesia with different rainfall characteristics were chosen to develop drought forecast models based on machine learning. The 6-month Standardized Precipitation Index (SPI6) was selected as the target variable. The models' forecast skill was compared to the skill of long-range climate forecast models in terms of drought accuracy and regression mean absolute error (MAE). Indonesian droughts are known to be related to El Niño Southern Oscillation (ENSO) variability, despite regional differences, as well as to the monsoon, local sea surface temperature (SST), other large-scale atmosphere-ocean interactions such as the Indian Ocean Dipole (IOD) and the Southern Pacific Convergence Zone (SPCZ), and local factors including topography and elevation. The machine learning models are thus designed to enhance drought forecast skill by combining local and remote SST and remote sensing information, reflecting initial drought conditions, with the long-range climate forecast model results. A total of 126 machine learning models were developed, covering the three regions of West Java (JB), West Sumatra (SB), and Gorontalo (GO); six long-range climate forecast models (MSC_CanCM3, MSC_CanCM4, NCEP, NASA, PNU, POAMA) plus one climatology model based on remote sensing precipitation data; and 1- to 6-month lead times. Comparing the machine learning models with the long-range climate forecast models, the West Java and Gorontalo regions showed similar characteristics in terms of drought accuracy. Drought accuracy of the long-range climate forecast models was generally higher than that of the machine learning models at short lead times, but the opposite appeared at longer lead times. For West Sumatra, however, the machine learning models and the long-range climate forecast models showed similar drought accuracy. The machine learning models showed smaller regression errors for all three regions, especially at longer lead times. Among the three regions, the machine learning models developed for Gorontalo showed the highest drought accuracy and the lowest regression error. West Java showed higher drought accuracy compared to West Sumatra, while West Sumatra showed lower regression error compared to West Java. The lower error in West Sumatra may be because of the smaller sample size used for training and evaluation for the region. Regional differences in forecast skill are determined by the effect of ENSO and the resulting forecast skill of the long-range climate forecast models. While somewhat high in West Sumatra, the relative importance of remote sensing variables was mostly low. The high importance of the variables based on long-range climate forecast models indicates that the forecast skill of the machine learning models is mostly determined by the forecast skill of the climate models.
Sensitivity of a Simulated Derecho Event to Model Initial Conditions
NASA Astrophysics Data System (ADS)
Wang, Wei
2014-05-01
Since 2003, the MMM division at NCAR has been experimenting with cloud-permitting scale weather forecasting using the Weather Research and Forecasting (WRF) model. Over the years, we have tested different model physics and tried different initial and boundary conditions. Not surprisingly, we found that the model's forecasts are more sensitive to the initial conditions than to the model physics. In the 2012 real-time experiment, WRF-DART (Data Assimilation Research Testbed) at 15 km was employed to produce initial conditions for a twice-a-day forecast at 3 km. On June 29, this forecast system captured one of the most destructive derecho events on record. In this presentation, we examine the forecast sensitivity to different model initial conditions and try to understand the important features that may have contributed to the success of the forecast.
A multi-period optimization model for energy planning with CO2 emission consideration.
Mirzaesmaeeli, H; Elkamel, A; Douglas, P L; Croiset, E; Gupta, M
2010-05-01
A novel deterministic multi-period mixed-integer linear programming (MILP) model for the power generation planning of electric systems is described and evaluated in this paper. The model is developed with the objective of determining the optimal mix of energy supply sources and pollutant mitigation options that meet a specified electricity demand and CO2 emission targets at minimum cost. Several time-dependent parameters are included in the model formulation; they include forecasted energy demand, fuel price variability, construction lead time, conservation initiatives, and increases in fixed operational and maintenance costs over time. The developed model is applied to two case studies. The objective of the case studies is to examine the economic, structural, and environmental effects that would result if the electricity sector were required to reduce its CO2 emissions to a specified limit.
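A toy single-period instance of the MILP idea, choosing a generation mix that meets demand under a CO2 cap, can be written with the PuLP library as below; the technologies, costs and cap are invented, and the real model adds multiple periods, integer build decisions, lead times and more.

```python
import pulp

techs = {          # $/MWh cost, tCO2/MWh emissions (illustrative numbers)
    "coal": (30.0, 0.95),
    "gas": (45.0, 0.45),
    "wind": (60.0, 0.0),
}
demand_mwh = 1000.0
co2_cap = 400.0    # tonnes

prob = pulp.LpProblem("generation_planning", pulp.LpMinimize)
gen = {t: pulp.LpVariable(f"gen_{t}", lowBound=0) for t in techs}

prob += pulp.lpSum(techs[t][0] * gen[t] for t in techs)              # total cost
prob += pulp.lpSum(gen[t] for t in techs) == demand_mwh              # meet demand
prob += pulp.lpSum(techs[t][1] * gen[t] for t in techs) <= co2_cap   # CO2 cap

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({t: gen[t].value() for t in techs}, "cost:", pulp.value(prob.objective))
```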
Integrated Mode Choice, Small Aircraft Demand, and Airport Operations Model User's Guide
NASA Technical Reports Server (NTRS)
Yackovetsky, Robert E. (Technical Monitor); Dollyhigh, Samuel M.
2004-01-01
A mode choice model that generates on-demand air travel forecasts at a set of GA airports based on changes in economic characteristics, vehicle performance characteristics such as speed and cost, and demographic trends has been integrated with a model that generates itinerant aircraft operations by airplane category at a set of 3227 airports. Numerous intermediate outputs can be generated, such as the number of additional trips diverted from automobiles and scheduled air by the improved performance and cost of on-demand air vehicles. The total number of transported passenger miles that are diverted is also available. From these results, the number of new aircraft needed to service the increased demand can be calculated. Output from the models discussed is in a format suitable for generating the origin and destination traffic flow between the 3227 airports based on solutions to a gravity model.
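A hedged sketch of the gravity-model distribution step mentioned at the end: flows proportional to productions and attractions and inversely to a power of distance, with rows rescaled to match origin totals. The exponent and data are illustrative, not the model's calibrated values.

```python
import numpy as np

def gravity_flows(productions, attractions, distances, beta=2.0):
    # T_ij ~ P_i * A_j / d_ij^beta
    raw = np.outer(productions, attractions) / distances ** beta
    np.fill_diagonal(raw, 0.0)                     # no intra-airport trips
    # scale each row so origin totals match the forecast productions
    return raw * (productions / raw.sum(axis=1))[:, None]

P = np.array([500.0, 300.0, 200.0])               # trips produced per airport
A = np.array([400.0, 350.0, 250.0])               # attractiveness per airport
d = np.array([[1.0, 120.0, 300.0],
              [120.0, 1.0, 200.0],
              [300.0, 200.0, 1.0]])               # distances, toy values
print(np.round(gravity_flows(P, A, d), 1))
```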
Dilokthornsakul, Piyameth; Patidar, Mausam; Campbell, Jonathan D
2017-12-01
To forecast lifetime outcomes and costs of lumacaftor/ivacaftor combination therapy in patients with cystic fibrosis (CF) homozygous for the phe508del mutation, a lifetime Markov model was developed from a US payer perspective. The model included five health states: 1) mild lung disease (percent predicted forced expiratory volume in 1 second [FEV1] > 70%), 2) moderate lung disease (40% ≤ FEV1 ≤ 70%), 3) severe lung disease (FEV1 < 40%), 4) lung transplantation, and 5) death. All inputs were derived from the published literature. We estimated lumacaftor/ivacaftor's improvement in outcomes compared with a non-CF referent population as well as CF-specific mortality estimates. Lumacaftor/ivacaftor was associated with an additional 2.91 life-years (95% credible interval 2.55-3.56) and an additional 2.42 quality-adjusted life-years (QALYs) (95% credible interval 2.10-2.98). Lumacaftor/ivacaftor was associated with improvements in survival and QALYs equivalent to 27.6% and 20.7%, respectively, of the survival and QALY gaps between CF usual care and their non-CF peers. The incremental lifetime cost was $2,632,249. Lumacaftor/ivacaftor increased life-years and QALYs in CF patients with the homozygous phe508del mutation and moved morbidity and mortality closer to those of their non-CF peers, but it came at a higher cost.
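The mechanics of such a five-state Markov cohort model can be sketched as below; the transition matrix, state utilities, costs, horizon and discount rate are invented placeholders, purely to show how lifetime QALYs and costs are accumulated.

```python
import numpy as np

states = ["mild", "moderate", "severe", "transplant", "death"]
T = np.array([                      # rows: from-state, cols: to-state (annual)
    [0.90, 0.08, 0.01, 0.00, 0.01],
    [0.05, 0.85, 0.07, 0.01, 0.02],
    [0.00, 0.05, 0.82, 0.06, 0.07],
    [0.00, 0.00, 0.00, 0.92, 0.08],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])
utility = np.array([0.88, 0.74, 0.57, 0.62, 0.0])   # QALY weight per state (toy)
cost = np.array([25e3, 45e3, 90e3, 150e3, 0.0])     # annual cost per state (toy)

cohort = np.array([1.0, 0.0, 0.0, 0.0, 0.0])        # all start in mild disease
qalys = total_cost = 0.0
for year in range(60):                              # lifetime horizon
    disc = 1.03 ** -year                            # 3% annual discounting
    qalys += disc * float(cohort @ utility)
    total_cost += disc * float(cohort @ cost)
    cohort = cohort @ T                             # advance the cohort one cycle
print(round(qalys, 2), round(total_cost))
```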
System-of-Systems Technology-Portfolio-Analysis Tool
NASA Technical Reports Server (NTRS)
O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne
2012-01-01
Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.
Ability of matrix models to explain the past and predict the future of plant populations.
McEachern, Kathryn; Crone, Elizabeth E.; Ellis, Martha M.; Morris, William F.; Stanley, Amanda; Bell, Timothy; Bierzychudek, Paulette; Ehrlen, Johan; Kaye, Thomas N.; Knight, Tiffany M.; Lesica, Peter; Oostermeijer, Gerard; Quintana-Ascencio, Pedro F.; Ticktin, Tamara; Valverde, Teresa; Williams, Jennifer I.; Doak, Daniel F.; Ganesan, Rengaian; Thorpe, Andrea S.; Menges, Eric S.
2013-01-01
Uncertainty associated with ecological forecasts has long been recognized, but forecast accuracy is rarely quantified. We evaluated how well data on 82 populations of 20 species of plants spanning 3 continents explained and predicted plant population dynamics. We parameterized stage-based matrix models with demographic data from individually marked plants and determined how well these models forecast population sizes observed at least 5 years into the future. Simple demographic models forecasted population dynamics poorly; only 40% of observed population sizes fell within our forecasts' 95% confidence limits. However, these models explained population dynamics during the years in which data were collected; observed changes in population size during the data-collection period were strongly positively correlated with population growth rate. Thus, these models are at least a sound way to quantify population status. Poor forecasts were not associated with the number of individual plants or years of data. We tested whether vital rates were density dependent and found both positive and negative density dependence. However, density dependence was not associated with forecast error. Forecast error was significantly associated with environmental differences between the data collection and forecast periods. To forecast population fates, more detailed models, such as those that project how environments are likely to change and how these changes will affect population dynamics, may be needed. Such detailed models are not always feasible. Thus, it may be wiser to make risk-averse decisions than to expect precise forecasts from models.
Economic benefits of improved meteorological forecasts - The construction industry
NASA Technical Reports Server (NTRS)
Bhattacharyya, R. K.; Greenberg, J. S.
1976-01-01
Estimates are made of the potential economic benefits accruing to particular industries from timely utilization of satellite-derived six-hour weather forecasts, and of economic penalties resulting from failure to utilize such forecasts in day-to-day planning. The cost estimate study is centered on the U.S. construction industry, with results simplified to yes/no 6-hr forecasts on thunderstorm activity and work/no work decisions. Effects of weather elements (thunderstorms, snow and sleet) on various construction operations are indicated. Potential dollar benefits for other industries, including air transportation and other forms of transportation, are diagrammed for comparison. Geosynchronous satellites such as STORMSAT, SEOS, and SMS/GOES are considered as sources of the forecast data.
Nonlinear time series modeling and forecasting the seismic data of the Hindu Kush region
NASA Astrophysics Data System (ADS)
Khan, Muhammad Yousaf; Mittnik, Stefan
2018-01-01
In this study, we extended the application of linear and nonlinear time series models in the field of earthquake seismology and examined the out-of-sample forecast accuracy of linear Autoregressive (AR), Autoregressive Conditional Duration (ACD), Self-Exciting Threshold Autoregressive (SETAR), Threshold Autoregressive (TAR), Logistic Smooth Transition Autoregressive (LSTAR), Additive Autoregressive (AAR), and Artificial Neural Network (ANN) models for seismic data of the Hindu Kush region. We also extended previous studies by using Vector Autoregressive (VAR) and Threshold Vector Autoregressive (TVAR) models and compared their forecasting accuracy with the linear AR model. Unlike previous studies that typically specify threshold models using an internal threshold variable, we specified these models with external transition variables and compared their out-of-sample forecasting performance with the linear benchmark AR model. The modeling results show that the time series models used in the present study are capable of capturing the dynamic structure present in the seismic data. The point forecast results indicate that the AR model generally outperforms the nonlinear models. However, in some cases, threshold models with external threshold variable specifications produce more accurate forecasts, indicating that the specification of threshold time series models is of crucial importance. For raw seismic data, the ACD model does not show an improved out-of-sample forecasting performance over the linear AR model. The results indicate that the AR model is the best device for modeling and forecasting the raw seismic data of the Hindu Kush region.
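A minimal sketch of the linear AR benchmark used in such comparisons, assuming statsmodels and a synthetic stand-in for the seismic series:

    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg

    rng = np.random.default_rng(0)
    y = rng.gamma(2.0, 10.0, size=300)     # synthetic stand-in for a seismic series

    train, test = y[:250], y[250:]
    res = AutoReg(train, lags=4).fit()     # linear AR(4) benchmark

    # Out-of-sample point forecasts over the 50-step holdout.
    pred = res.predict(start=len(train), end=len(y) - 1)
    rmse = np.sqrt(np.mean((pred - test) ** 2))
    print(f"out-of-sample RMSE: {rmse:.2f}")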
Daily air quality index forecasting with hybrid models: A case in China.
Zhu, Suling; Lian, Xiuyuan; Liu, Haixia; Hu, Jianming; Wang, Yuanyuan; Che, Jinxing
2017-12-01
Air quality is closely related to quality of life, and air pollution forecasting plays a vital role in air pollution warning and control. However, it is difficult to attain accurate forecasts for air pollution indexes because the original data are non-stationary and chaotic. Existing forecasting methods, such as multiple linear models, autoregressive integrated moving average (ARIMA) and support vector regression (SVR), cannot fully capture the information in series of pollution indexes, so new, effective techniques are needed. The main purpose of this research is to develop effective forecasting models for regional air quality indexes (AQI) that address the problems above and enhance forecasting accuracy. Two hybrid models (EMD-SVR-Hybrid and EMD-IMFs-Hybrid) are therefore proposed to forecast AQI data. The main steps of the EMD-SVR-Hybrid model are as follows: the data preprocessing technique EMD (empirical mode decomposition) is used to sift the original AQI data into one group of smoother IMFs (intrinsic mode functions) and a noise series, where the IMFs contain the important information (level, fluctuations and others) from the original AQI series. LS-SVR is applied to forecast the sum of the IMFs, and then S-ARIMA (seasonal ARIMA) is employed to forecast the residual sequence of the LS-SVR. The EMD-IMFs-Hybrid model instead forecasts the IMFs separately via statistical models and sums the IMF forecasts as EMD-IMFs; S-ARIMA is then employed to forecast the residuals of EMD-IMFs. To validate the proposed hybrid models, AQI data from June 2014 to August 2015 collected at Xingtai in China are used as a test case for the empirical evaluation. In terms of several forecasting assessment measures, the AQI forecasting results for Xingtai show that the two proposed hybrid models are superior to ARIMA, SVR, GRNN, EMD-GRNN, Wavelet-GRNN and Wavelet-SVR. The proposed hybrid models can therefore be used as effective and simple tools for air pollution forecasting, warning, and management. Copyright © 2017 Elsevier Ltd. All rights reserved.
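The decomposition-then-forecast idea can be sketched as below, assuming the PyEMD (EMD-signal) package and scikit-learn; the S-ARIMA residual-correction step of the paper is omitted, and the synthetic series and SVR settings are illustrative only.

    import numpy as np
    from PyEMD import EMD              # assumes the EMD-signal package is installed
    from sklearn.svm import SVR

    rng = np.random.default_rng(1)
    t = np.arange(400)
    aqi = 80 + 20 * np.sin(2 * np.pi * t / 30) + rng.normal(0, 8, t.size)  # synthetic AQI

    imfs = EMD().emd(aqi)              # intrinsic mode functions plus residue

    # Treat the highest-frequency IMF as noise (one simple convention) and
    # forecast the sum of the remaining modes with an SVR on lagged values.
    denoised = imfs[1:].sum(axis=0)
    lags = 7
    X = np.column_stack([denoised[i:-(lags - i)] for i in range(lags)])
    y = denoised[lags:]

    svr = SVR(C=10.0, epsilon=0.1).fit(X[:-50], y[:-50])
    pred = svr.predict(X[-50:])
    mape = np.mean(np.abs((pred - y[-50:]) / y[-50:])) * 100
    print(f"holdout MAPE: {mape:.1f}%")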
NASA Astrophysics Data System (ADS)
Lim, S.; Park, S. K.; Zupanski, M.
2015-04-01
Since air quality forecasting is related to both chemistry and meteorology, a coupled atmosphere-chemistry data assimilation (DA) system is essential for it. Ozone (O3) plays an important role in chemical reactions and is usually assimilated in chemical DA. In tropical cyclones (TCs), O3 usually shows a lower concentration inside the eyewall and an elevated concentration around the eye, impacting atmospheric as well as chemical variables. To identify the impact of O3 observations on TC structure, including atmospheric and chemical information, we employed the Weather Research and Forecasting model coupled with Chemistry (WRF-Chem) with an ensemble-based DA algorithm - the maximum likelihood ensemble filter (MLEF). For a TC case that occurred over East Asia, our results indicate that the ensemble forecast is reasonable, accompanied by larger background state uncertainty over the TC and also over eastern China. Similarly, the assimilation of O3 observations impacts atmospheric and chemical variables near the TC and over eastern China. The strongest impact on air quality in the lower troposphere was over China, likely due to pollution advection. In the vicinity of the TC, however, the strongest adjustment of chemical variables occurred at higher levels. The impact on atmospheric variables was similar both over China and near the TC. The analysis results are validated using several measures that include the cost function, root-mean-squared error with respect to observations, and degrees of freedom for signal (DFS). All measures indicate a positive impact of DA on the analysis - the cost function and root-mean-squared error decreased by 16.9% and 8.87%, respectively. In particular, the DFS indicates a strong positive impact of observations in the TC area, with a weaker maximum over northeast China.
A Novel Wind Speed Forecasting Model for Wind Farms of Northwest China
NASA Astrophysics Data System (ADS)
Wang, Jian-Zhou; Wang, Yun
2017-01-01
Wind resources are becoming increasingly significant due to their clean and renewable characteristics, and the integration of wind power into existing electricity systems is imminent. To maintain a stable power supply system that takes into account the stochastic nature of wind speed, accurate wind speed forecasting is pivotal. However, no single model can be applied to all cases. Recent studies show that wind speed forecasting errors are approximately 25% to 40% in Chinese wind farms. Presently, hybrid wind speed forecasting models are widely used and have been verified to perform better than conventional single forecasting models, not only in short-term wind speed forecasting but also in long-term forecasting. In this paper, a hybrid forecasting model is developed: the Similar Coefficient Sum (SCS) and Hermite Interpolation are exploited to process the original wind speed data, and an SVM model whose parameters are tuned by an artificial intelligence model is built to make forecasts. The results of case studies show that the MAPE value of the hybrid model varies from 22.96% to 28.87%, and the MAE value varies from 0.47 m/s to 1.30 m/s. The Sign test, Wilcoxon's Signed-Rank test, and Morgan-Granger-Newbold test indicate that the proposed model differs significantly from the compared models.
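The two error measures reported above are straightforward to compute; a minimal sketch with made-up wind speeds:

    import numpy as np

    def mape(obs, pred):
        # Mean absolute percentage error, as reported above (22.96%-28.87%).
        obs, pred = np.asarray(obs, float), np.asarray(pred, float)
        return np.mean(np.abs((obs - pred) / obs)) * 100.0

    def mae(obs, pred):
        # Mean absolute error in the units of the series (m/s for wind speed).
        obs, pred = np.asarray(obs, float), np.asarray(pred, float)
        return np.mean(np.abs(obs - pred))

    # Toy check with invented wind speeds (m/s):
    obs, pred = [5.0, 6.2, 7.1], [4.6, 6.8, 7.0]
    print(f"MAPE: {mape(obs, pred):.1f}%  MAE: {mae(obs, pred):.2f} m/s")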
An Introduction to the NCHEMS Costing and Data Management System. Technical Report No. 55.
ERIC Educational Resources Information Center
Haight, Mike; Martin, Ron
The NCHEMS Costing and Data Management System is designed to assist institutions in the implementation of cost studies. There are at least two kinds of cost studies: historical cost studies which display cost-related data that reflect actual events over a specific prior time period, and predictive cost studies which forecast costs that will be…
Data-driven agent-based modeling, with application to rooftop solar adoption
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Haifeng; Vorobeychik, Yevgeniy; Letchford, Joshua
Agent-based modeling is commonly used for studying complex system properties emergent from interactions among many agents. We present a novel data-driven agent-based modeling framework applied to forecasting individual and aggregate residential rooftop solar adoption in San Diego county. Our first step is to learn a model of individual agent behavior from combined data of individual adoption characteristics and property assessment. We then construct an agent-based simulation with the learned model embedded in artificial agents, and proceed to validate it using a holdout sequence of collective adoption decisions. We demonstrate that the resulting agent-based model successfully forecasts solar adoption trends and provides a meaningful quantification of uncertainty about its predictions. We utilize our model to optimize two classes of policies aimed at spurring solar adoption: one that subsidizes the cost of adoption, and another that gives away free systems to low-income households. We find that the optimal policies derived for the latter class are significantly more efficacious, whereas the policies similar to the current California Solar Initiative incentive scheme appear to have a limited impact on overall adoption trends.
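A toy version of the data-driven loop, with a hypothetical logistic adoption model standing in for the learned individual-level model and peer influence as the emergent feedback:

    import numpy as np

    rng = np.random.default_rng(42)
    n_agents, n_quarters = 1000, 12

    # Hypothetical per-agent covariate standing in for adoption/property data.
    income = rng.normal(0.0, 1.0, n_agents)
    peer_weight, subsidy = 1.5, 0.3          # invented policy/behavior parameters

    adopted = np.zeros(n_agents, dtype=bool)
    trend = []
    for q in range(n_quarters):
        peer_share = adopted.mean()
        # Learned-model stand-in: logistic probability of adopting this quarter.
        logit = -4.0 + 0.8 * income + peer_weight * peer_share + subsidy
        p = 1.0 / (1.0 + np.exp(-logit))
        adopted |= (~adopted) & (rng.random(n_agents) < p)
        trend.append(int(adopted.sum()))
    print("cumulative adopters per quarter:", trend)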
An Intelligent Decision Support System for Workforce Forecast
2011-01-01
An autoregressive integrated moving-average (ARIMA) model was used to forecast the demand for construction skills in Hong Kong. The report surveys candidate forecasting techniques, including decision trees, ARIMA, rule-based forecasting, segmentation forecasting, regression analysis, simulation modeling, input-output models, LP and NLP, and Markovian models; among the selection criteria quoted is use of a method "when results are needed as a set of easily interpretable rules".
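A minimal ARIMA demand forecast of the kind surveyed here, assuming statsmodels and a synthetic monthly series:

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(7)
    # Synthetic monthly skill-demand series with drift (stand-in data).
    demand = 1000 + np.cumsum(rng.normal(5, 40, 120))

    res = ARIMA(demand, order=(1, 1, 1)).fit()   # ARIMA(1,1,1)
    print(res.forecast(steps=12))                # 12-month-ahead demand forecast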
NASA Astrophysics Data System (ADS)
Tommasi, D.; Stock, C. A.
2016-02-01
It is well established that environmental fluctuations affect the productivity of numerous fish stocks. Recent advances in the prediction capability of dynamical global forecast systems, such as the state-of-the-art NOAA Geophysical Fluid Dynamics Laboratory (GFDL) CM2.5-FLOR model, allow for climate predictions of fisheries-relevant variables at temporal scales relevant to the fishery management decision-making process. We demonstrate that the GFDL FLOR model produces skillful seasonal SST anomaly predictions over the continental shelf, where most of the global fish yield is generated. The availability of skillful SST projections at this "fishery relevant" scale raises the potential for better constrained estimates of future fish biomass and improved harvest decisions. We assessed the utility of seasonal SST coastal shelf predictions for fisheries management using the case study of Pacific sardine. This fishery was selected because it is one of the few to already incorporate SST into its harvest guideline, and it shows a robust recruitment-SST relationship. We quantified the effectiveness of management under the status quo harvest guideline (HG) and under alternative HGs incorporating future information at different levels of uncertainty. The usefulness of forecast SST to management was dependent on forecast uncertainty. If the standard deviation of the SST anomaly forecast residuals was less than 0.65, the alternative HG produced higher long-term yield and stock biomass, and reduced the probability of either catch or stock biomass falling below management-set threshold values, compared to the status quo. By contrast, the probability of biomass falling to extremely low values increased relative to the status quo for all alternative HGs except in the case of perfectly known future SST. To safeguard against such low-probability but costly events, a harvest cutoff biomass also has to be implemented in the HG.
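The harvest-guideline comparison can be caricatured with a Monte Carlo sketch; the stock-recruit function, harvest rates, and thresholds below are invented for illustration and are not the study's operating model.

    import numpy as np

    rng = np.random.default_rng(6)

    def simulate(use_sst, sigma_fcst, years=500):
        # Toy biomass dynamics with SST-driven recruitment; all rates hypothetical.
        b, total_yield, low_years = 1.0, 0.0, 0
        for _ in range(years):
            sst = rng.normal(0.0, 1.0)                       # SST anomaly
            recruit = 0.5 * b * np.exp(0.4 * sst - 0.2 * b)  # invented stock-recruit curve
            fcst = sst + rng.normal(0.0, sigma_fcst)         # imperfect SST forecast
            rate = 0.15 + (0.05 * fcst if use_sst else 0.0)  # SST-informed harvest rate
            rate = float(np.clip(rate, 0.05, 0.30))
            catch = rate * b
            b = max(b + recruit - catch - 0.3 * b, 0.01)     # natural mortality 0.3
            total_yield += catch
            low_years += b < 0.2                             # biomass below threshold
        return total_yield / years, low_years / years

    for sd in (0.3, 0.65, 1.2):                              # forecast residual sd
        print(sd, "SST-informed:", simulate(True, sd), "status quo:", simulate(False, sd))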
Potentialities of ensemble strategies for flood forecasting over the Milano urban area
NASA Astrophysics Data System (ADS)
Ravazzani, Giovanni; Amengual, Arnau; Ceppi, Alessandro; Homar, Víctor; Romero, Romu; Lombardi, Gabriele; Mancini, Marco
2016-08-01
Analysis of ensemble forecasting strategies, which can provide tangible backing for flood early warning procedures and mitigation measures over the Mediterranean region, is one of the fundamental motivations of the international HyMeX programme. Here, we examine two severe hydrometeorological episodes that affected the Milano urban area and for which the complex flood protection system of the city did not completely succeed. Indeed, flood damage has increased exponentially during the last 60 years, due to industrial and urban development. Improvement of the Milano flood control system therefore requires a synergy between structural and non-structural approaches. First, we examine how land-use changes due to urban development have altered the hydrological response to intense rainfall. Second, we test a flood forecasting system which comprises the Flash-flood Event-based Spatially distributed rainfall-runoff Transformation, including Water Balance (FEST-WB) and the Weather Research and Forecasting (WRF) models. Deep moist convection and extreme precipitation are difficult to forecast accurately, due to uncertainties arising from the numerical weather prediction (NWP) physical parameterizations and high sensitivity to misrepresentation of the atmospheric state; however, two hydrological ensemble prediction systems (HEPS) have been designed to explicitly cope with uncertainties in the initial and lateral boundary conditions (IC/LBCs) and physical parameterizations of the NWP model. No substantial differences in skill have been found between the two ensemble strategies when considering an enhanced diversity of IC/LBCs for the perturbed initial conditions ensemble. Furthermore, no additional benefits have been found by considering more frequent LBCs in a mixed physics ensemble, as the ensemble spread seems to be reduced. These findings could help to design the most appropriate ensemble strategies ahead of these hydrometeorological extremes, given the computational cost of running such advanced HEPSs for operational purposes.
Deep Learning Based Solar Flare Forecasting Model. I. Results for Line-of-sight Magnetograms
NASA Astrophysics Data System (ADS)
Huang, Xin; Wang, Huaning; Xu, Long; Liu, Jinfu; Li, Rong; Dai, Xinghua
2018-03-01
Solar flares originate from the release of energy stored in the magnetic field of solar active regions; the triggering mechanism for these flares, however, remains unknown. For this reason, conventional solar flare forecasting is essentially based on the statistical relationship between solar flares and measures extracted from observational data. In the current work, a deep learning method is applied to build the solar flare forecasting model, in which forecasting patterns can be learned from line-of-sight magnetograms of solar active regions. In order to obtain a large amount of observational data to train the forecasting model and test its performance, a data set is created from line-of-sight magnetograms of active regions observed by SOHO/MDI and SDO/HMI from 1996 April to 2015 October and corresponding soft X-ray solar flares observed by GOES. The testing results of the forecasting model indicate that (1) the forecasting patterns can be learned automatically from the MDI data and can also be applied to the HMI data; furthermore, these forecasting patterns are robust to the noise in the observational data; (2) the performance of the deep learning forecasting model is not sensitive to the given forecasting period (6, 12, 24, or 48 hr); (3) the performance of the proposed forecasting model is comparable to that of state-of-the-art flare forecasting models, even though the magnetogram record spans 19.5 years continuously. Case analyses demonstrate that the deep learning based solar flare forecasting model pays attention to areas with the magnetic polarity-inversion line or the strong magnetic field in magnetograms of active regions.
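A toy convolutional classifier in the spirit of the model described, assuming PyTorch; the architecture, random "magnetograms", and labels are stand-ins, not the authors' network or data.

    import torch
    import torch.nn as nn

    # Stand-in data: 64x64 "magnetograms" with binary flare/no-flare labels.
    x = torch.randn(128, 1, 64, 64)
    y = torch.randint(0, 2, (128,)).float()

    net = nn.Sequential(
        nn.Conv2d(1, 8, 5, stride=2), nn.ReLU(),
        nn.Conv2d(8, 16, 5, stride=2), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(16, 1))                      # logit for P(flare)

    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()
    for epoch in range(5):                     # tiny demonstration loop
        opt.zero_grad()
        loss = loss_fn(net(x).squeeze(1), y)
        loss.backward()
        opt.step()
    print("final training loss:", float(loss))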
Improving inflow forecasting into hydropower reservoirs through a complementary modelling framework
NASA Astrophysics Data System (ADS)
Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K.
2014-10-01
Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and the benefits gained through hydropower generation. Improving hourly reservoir inflow forecasts over a 24 h lead-time is considered within the day-ahead (Elspot) market of the Nordic exchange market. We present here a new approach for issuing hourly reservoir inflow forecasts that aims to improve on existing operational forecasting models without modifying the pre-existing approach, instead formulating an additive, complementary model that is independent and captures the structure the existing model may be missing. Besides improving the forecast skill of operational models, the approach estimates the uncertainty in the complementary model structure and produces probabilistic inflow forecasts that contain information suitable for reducing uncertainty in decision-making processes in hydropower systems operation. The procedure presented comprises an error model added on top of an unalterable constant-parameter conceptual model; the models are demonstrated with reference to the 207 km2 Krinsvatn catchment in central Norway. The structure of the error model is established based on attributes of the residual time series from the conceptual model. Deterministic and probabilistic evaluations revealed an overall significant improvement in forecast accuracy for lead-times up to 17 h. Season-based evaluations indicated that the improvement in inflow forecasts varies across seasons; inflow forecasts in autumn and spring are less successful, with the 95% prediction interval bracketing less than 95% of the observations for lead-times beyond 17 h.
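The complementary-model idea can be sketched as an AR(1) error model fitted to the residuals of a fixed (unalterable) base model; everything below is an illustrative stand-in, not the paper's formulation.

    import numpy as np

    rng = np.random.default_rng(12)
    n = 500
    obs = 10 + 3 * np.sin(np.arange(n) / 20) + rng.normal(0, 0.8, n)  # inflow stand-in
    base = 10 + 2.4 * np.sin(np.arange(n) / 20)    # unalterable conceptual model (biased)

    resid = obs - base
    # Complementary error model: AR(1) fitted to the conceptual model's residuals.
    phi = np.corrcoef(resid[:-1], resid[1:])[0, 1]
    sigma = np.std(resid[1:] - phi * resid[:-1])

    # Probabilistic one-step-ahead forecast at time t: base forecast plus
    # propagated residual plus sampled error, giving an ensemble.
    t = 400
    ens = base[t] + phi * resid[t - 1] + rng.normal(0, sigma, 1000)
    print("90%% interval: %.2f to %.2f | observed: %.2f" %
          (np.percentile(ens, 5), np.percentile(ens, 95), obs[t]))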
Marcilio, Izabel; Hajat, Shakoor; Gouveia, Nelson
2013-08-01
This study aimed to develop different models to forecast the daily number of patients seeking emergency department (ED) care in a general hospital, according to calendar variables and ambient temperature readings, and to compare the models in terms of forecasting accuracy. The authors developed and tested six different models of ED patient visits using total daily counts of patient visits to an ED in Sao Paulo, Brazil, from January 1, 2008, to December 31, 2010. The first 33 months of the data set were used to develop the ED patient visits forecasting models (the training set), leaving the last 3 months to measure each model's forecasting accuracy by the mean absolute percentage error (MAPE). Forecasting models were developed using three different time-series analysis methods: generalized linear models (GLM), generalized estimating equations (GEE), and seasonal autoregressive integrated moving average (SARIMA). For each method, models were explored with and without the effect of mean daily temperature as a predictive variable. The daily mean number of ED visits was 389, ranging from 166 to 613. Data showed a weekly seasonal distribution, with the highest patient volumes on Mondays and the lowest on weekends. There was little variation in daily visits by month. GLM and GEE models showed better forecasting accuracy than SARIMA models. For instance, the MAPEs from GLM models and GEE models in the first month of forecasting (October 2010) were 11.5% and 10.8% (models with and without control for the temperature effect, respectively), while the MAPEs from SARIMA models were 12.8% and 11.7%. For all models, controlling for the effect of temperature resulted in worse or similar forecasting ability compared with models using calendar variables alone, and forecasting accuracy was better for the short-term horizon (7 days in advance) than for the longer term (30 days in advance). This study indicates that time-series models can be developed to provide forecasts of daily ED patient visits, and forecasting ability was dependent on the type of model employed and the length of the time horizon being predicted. In this setting, GLM and GEE models showed better accuracy than SARIMA models. Including information about ambient temperature in the models did not improve forecasting accuracy. Forecasting models based on calendar variables alone in general detected patterns of daily variability in ED volume and thus could be used for developing an automated system for better planning of personnel resources. © 2013 by the Society for Academic Emergency Medicine.
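A minimal sketch of the calendar-variable approach, assuming statsmodels and pandas: a Poisson GLM with day-of-week and month terms fitted to a synthetic series that mimics the Monday peak and weekend dip described above, evaluated by MAPE on a 90-day holdout.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    dates = pd.date_range("2008-01-01", "2010-12-31", freq="D")
    dow = dates.dayofweek
    # Synthetic daily ED counts: Monday peak, weekend dip, mean near 389.
    mu = 389 + 60 * (dow == 0) - 80 * (dow >= 5)
    df = pd.DataFrame({"visits": rng.poisson(mu),
                       "dow": dow.astype(str),
                       "month": dates.month.astype(str)})

    train, test = df.iloc[:-90], df.iloc[-90:]
    glm = smf.glm("visits ~ C(dow) + C(month)", data=train,
                  family=sm.families.Poisson()).fit()
    pred = glm.predict(test)
    mape = np.abs((test["visits"] - pred) / test["visits"]).mean() * 100
    print("holdout MAPE: %.1f%%" % mape)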
NASA Astrophysics Data System (ADS)
Li, Ji; Chen, Yangbo; Wang, Huanyu; Qin, Jianming; Li, Jie; Chiao, Sen
2017-03-01
Long-lead-time flood forecasting is very important for large watershed flood mitigation as it provides more time for flood warning and emergency responses. The latest numerical weather forecast models can provide 1-15-day quantitative precipitation forecast products in grid format, and coupling such products with a distributed hydrological model can produce long-lead-time watershed flood forecasts. This paper studied the feasibility of coupling the Liuxihe model with the Weather Research and Forecasting quantitative precipitation forecast (WRF QPF) for large watershed flood forecasting in southern China. The WRF QPF products have three lead times, 24, 48 and 72 h, with a grid resolution of 20 km × 20 km. The Liuxihe model is set up with freely downloaded terrain properties; the model parameters were first optimized with rain gauge observed precipitation, and then re-optimized with the WRF QPF. Results show that the WRF QPF is biased relative to the rain gauge precipitation, and a post-processing method is proposed for the WRF QPF products that improves the flood forecasting capability. With model parameter re-optimization, the model's performance also improves, suggesting that the model parameters should be optimized with the QPF rather than the rain gauge precipitation. As lead time increases, the accuracy of the WRF QPF decreases, as does the flood forecasting capability. Flood forecasting products produced by coupling the Liuxihe model with the WRF QPF provide a good reference for large watershed flood warning because of their long lead time and reasonable results.
NASA Astrophysics Data System (ADS)
Bennett, J.; David, R. E.; Wang, Q.; Li, M.; Shrestha, D. L.
2016-12-01
Flood forecasting in Australia has historically relied on deterministic forecasting models run only when floods are imminent, with considerable forecaster input and interpretation. These now co-exist with a continually available 7-day streamflow forecasting service (also deterministic) aimed at operational water management applications such as environmental flow releases. The 7-day service is not optimised for flood prediction. We describe progress on developing a system for ensemble streamflow forecasting that is suitable for both flood prediction and water management applications. Precipitation uncertainty is handled through post-processing of Numerical Weather Prediction (NWP) output with a Bayesian rainfall post-processor (RPP). The RPP corrects biases, downscales NWP output, and produces reliable ensemble spread. Ensemble precipitation forecasts are used to force a semi-distributed conceptual rainfall-runoff model. Uncertainty in precipitation forecasts is insufficient to reliably describe streamflow forecast uncertainty, particularly at shorter lead-times, so we characterise hydrological prediction uncertainty separately with a 4-stage error model. The error model relies on data transformation to ensure residuals are homoscedastic and symmetrically distributed. To ensure streamflow forecasts are accurate and reliable, the residuals are modelled using a mixture-Gaussian distribution with distinct parameters for the rising and falling limbs of the forecast hydrograph. In a case study of the Murray River in south-eastern Australia, we show that ensemble flood predictions generally have lower errors than deterministic forecasting methods. We also discuss some of the challenges in operationalising short-term ensemble streamflow forecasts in Australia, including meeting the need for accurate predictions across all flow ranges and comparing forecasts generated by event and continuous hydrological models.
Liu, Yaoze; Theller, Lawrence O; Pijanowski, Bryan C; Engel, Bernard A
2016-05-15
The adverse impacts of urbanization and climate change on hydrology and water quality can be mitigated by applying green infrastructure practices. In this study, the impacts of land use change and climate change on hydrology and water quality in the 153.2 km(2) Trail Creek watershed located in northwest Indiana were estimated using the Long-Term Hydrologic Impact Assessment-Low Impact Development 2.1 (L-THIA-LID 2.1) model for the following environmental concerns: runoff volume, Total Suspended Solids (TSS), Total Phosphorous (TP), Total Kjeldahl Nitrogen (TKN), and Nitrate+Nitrite (NOx). Using a recent 2001 land use map and 2050 land use forecasts, we found that land use change resulted in increased runoff volume and pollutant loads (8.0% to 17.9% increase). Climate change reduced runoff and nonpoint source pollutant loads (5.6% to 10.2% reduction). The 2050 forecasted land use with current rainfall resulted in the largest runoff volume and pollutant loads. The optimal selection and placement of green infrastructure practices using L-THIA-LID 2.1 model were conducted. Costs of applying green infrastructure were estimated using the L-THIA-LID 2.1 model considering construction, maintenance, and opportunity costs. To attain the same runoff volume and pollutant loads as in 2001 land uses for 2050 land uses, the runoff volume, TSS, TP, TKN, and NOx for 2050 needed to be reduced by 10.8%, 14.4%, 13.1%, 15.2%, and 9.0%, respectively. The corresponding annual costs of implementing green infrastructure to achieve the goals were $2.1, $0.8, $1.6, $1.9, and $0.8 million, respectively. Annual costs of reducing 2050 runoff volume/pollutant loads were estimated, and results show green infrastructure annual cost greatly increased for larger reductions in runoff volume and pollutant loads. During optimization, the most cost-efficient green infrastructure practices were selected and implementation levels increased for greater reductions of runoff and nonpoint source pollutants. Copyright © 2016 Elsevier B.V. All rights reserved.
Weighting of NMME temperature and precipitation forecasts across Europe
NASA Astrophysics Data System (ADS)
Slater, Louise J.; Villarini, Gabriele; Bradley, A. Allen
2017-09-01
Multi-model ensemble forecasts are obtained by weighting multiple General Circulation Model (GCM) outputs to heighten forecast skill and reduce uncertainties. The North American Multi-Model Ensemble (NMME) project facilitates the development of such multi-model forecasting schemes by providing publicly-available hindcasts and forecasts online. Here, temperature and precipitation forecasts are enhanced by leveraging the strengths of eight NMME GCMs (CCSM3, CCSM4, CanCM3, CanCM4, CFSv2, GEOS5, GFDL2.1, and FLORb01) across all forecast months and lead times, for four broad climatic European regions: Temperate, Mediterranean, Humid-Continental and Subarctic-Polar. We compare five different approaches to multi-model weighting based on the equally weighted eight single-model ensembles (EW-8), Bayesian updating (BU) of the eight single-model ensembles (BU-8), BU of the 94 model members (BU-94), BU of the principal components of the eight single-model ensembles (BU-PCA-8) and BU of the principal components of the 94 model members (BU-PCA-94). We assess the forecasting skill of these five multi-models and evaluate their ability to predict some of the costliest historical droughts and floods in recent decades. Results indicate that the simplest approach based on EW-8 preserves model skill, but has considerable biases. The BU and BU-PCA approaches reduce the unconditional biases and negative skill in the forecasts considerably, but they can also sometimes diminish the positive skill in the original forecasts. The BU-PCA models tend to produce lower conditional biases than the BU models and have more homogeneous skill than the other multi-models, but with some loss of skill. The use of 94 NMME model members does not present significant benefits over the use of the 8 single model ensembles. These findings may provide valuable insights for the development of skillful, operational multi-model forecasting systems.
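The Bayesian updating schemes are beyond a short sketch, but the equally weighted baseline (EW-8) and a simple skill-based weighting can be illustrated; the inverse-MSE weights below are a stand-in, not the paper's BU method, and in practice would be fitted on a training period rather than the verification sample.

    import numpy as np

    rng = np.random.default_rng(0)
    truth = rng.normal(0, 1, 200)                     # stand-in verifying anomalies
    # Eight pseudo-GCM hindcasts: truth plus model-specific bias and noise.
    models = np.stack([truth + rng.normal(b, s, 200)
                       for b, s in zip(np.linspace(-0.5, 0.5, 8),
                                       np.linspace(0.5, 1.5, 8))])

    ew8 = models.mean(axis=0)                         # equally weighted ensemble
    mse = ((models - truth) ** 2).mean(axis=1)
    w = (1.0 / mse) / (1.0 / mse).sum()               # illustrative skill-based weights
    weighted = (w[:, None] * models).sum(axis=0)

    for name, f in [("EW-8", ew8), ("weighted", weighted)]:
        print(name, "RMSE:", np.sqrt(((f - truth) ** 2).mean()).round(3))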
NWP model forecast skill optimization via closure parameter variations
NASA Astrophysics Data System (ADS)
Järvinen, H.; Ollinaho, P.; Laine, M.; Solonen, A.; Haario, H.
2012-04-01
We present results of a novel approach to tuning the predictive skill of numerical weather prediction (NWP) models. These models contain tunable parameters which appear in the parameterization schemes of sub-grid-scale physical processes. The current practice is to specify the numerical parameter values manually, based on expert knowledge. We recently developed a concept and method (QJRMS 2011) for on-line estimation of NWP model parameters via closure parameter variations. The method, called EPPES ("Ensemble prediction and parameter estimation system"), utilizes the ensemble prediction infrastructure for parameter estimation in a very cost-effective way: practically no new computations are introduced. The approach provides an algorithmic decision-making tool for model parameter optimization in operational NWP. In EPPES, statistical inference about the NWP model tunable parameters is made by (i) generating an ensemble of predictions so that each member uses different model parameter values, drawn from a proposal distribution, and (ii) feeding back the relative merits of the parameter values to the proposal distribution, based on evaluation of a suitable likelihood function against verifying observations. In this presentation, the method is first illustrated in low-order numerical tests using a stochastic version of the Lorenz-95 model, which effectively emulates the principal features of ensemble prediction systems. The EPPES method correctly detects the unknown and wrongly specified parameter values and leads to improved forecast skill. Second, results with an ensemble prediction system emulator, based on the ECHAM5 atmospheric GCM, show that the model tuning capability of EPPES scales up to realistic models and ensemble prediction systems. Finally, preliminary results of EPPES in the context of the ECMWF forecasting system are presented.
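A schematic importance-sampling analogue of the EPPES update loop (sample parameters from a proposal, score them against verification, re-weight the proposal); the quadratic "skill" function and all numbers are invented for illustration and this is not the published algorithm.

    import numpy as np

    rng = np.random.default_rng(5)

    def skill(theta):
        # Stand-in likelihood: forecast skill peaks at the (unknown) truth [1.0, 2.0].
        return np.exp(-0.5 * np.sum((theta - np.array([1.0, 2.0])) ** 2, axis=1))

    mean, cov = np.zeros(2), np.eye(2) * 4.0          # initial proposal distribution
    for cycle in range(20):                           # one cycle ~ one ensemble forecast
        members = rng.multivariate_normal(mean, cov, size=50)  # parameter draws
        w = skill(members)
        w /= w.sum()
        mean = w @ members                            # importance-weighted mean update
        diff = members - mean
        cov = (w[:, None] * diff).T @ diff + 1e-3 * np.eye(2)  # weighted covariance
    print("estimated parameters:", mean.round(2))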
A comparison of GLAS SAT and NMC high resolution NOSAT forecasts from 19 and 11 February 1976
NASA Technical Reports Server (NTRS)
Atlas, R.
1979-01-01
A subjective comparison of the Goddard Laboratory for Atmospheric Sciences (GLAS) and the National Meteorological Center (NMC) high resolution model forecasts is presented. Two cases where NMC's operational model in 1976 had serious difficulties in forecasting for the United States were examined. For each of the cases, the GLAS model forecasts from initial conditions which included satellite sounding data were compared directly to the NMC higher resolution model forecasts, from initial conditions which excluded the satellite data. The comparison showed that the GLAS satellite forecasts significantly improved upon the current NMC operational model's predictions in both cases.
NASA Astrophysics Data System (ADS)
Kim, Shin-Woo; Noh, Nam-Kyu; Lim, Gyu-Ho
2013-04-01
This study introduces retrospective optimal interpolation (ROI) and its application with the Weather Research and Forecasting (WRF) model. Song et al. (2009) proposed the ROI method, an optimal interpolation (OI) that gradually assimilates observations over the analysis window to obtain a variance-minimum estimate of the atmospheric state at the initial time of the window. The assimilation window of the ROI algorithm is gradually increased, similar to that of quasi-static variational assimilation (QSVA; Pires et al., 1996). Unlike QSVA, however, ROI assimilates data at post-analysis times using a perturbation method (Verlaan and Heemink, 1997) without an adjoint model. Song and Lim (2011) improved the method by incorporating eigen-decomposition and covariance inflation. The computational cost of ROI can be reduced through eigen-decomposition of the background error covariance, which concentrates the ROI analyses on the error variances of the governing eigenmodes by transforming the control variables into eigenspace. A total energy norm is used for the normalization of each control variable. In this study, the ROI method is applied to the WRF model in an Observing System Simulation Experiment (OSSE) to validate the algorithm and investigate its capability. Horizontal wind, pressure, potential temperature, and water vapor mixing ratio are used as control variables and observations. First, a single-profile assimilation experiment is performed. Subsequently, OSSEs are performed using a virtual observing system consisting of synop, ship, and sonde data. The difference between forecast errors with and without assimilation grows markedly with time, indicating that assimilation by ROI improves the forecast. The characteristics and strengths/weaknesses of the ROI method are also investigated through experiments with the 3D-Var (three-dimensional variational) and 4D-Var (four-dimensional variational) methods. At the initial time, ROI produces a larger forecast error than 4D-Var. However, the difference between the two decreases gradually with time, and ROI shows a clearly better result (i.e., smaller forecast error) than 4D-Var beyond the 9-hour forecast.
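The optimal interpolation building block that ROI applies repeatedly as its window grows can be sketched in a few lines; the three-variable state, covariances, and observations below are illustrative.

    import numpy as np

    # Minimal optimal interpolation (OI) analysis step (illustrative 3-variable state).
    xb = np.array([1.0, 2.0, 3.0])              # background state
    B = np.array([[1.0, 0.5, 0.0],
                  [0.5, 1.0, 0.5],
                  [0.0, 0.5, 1.0]])             # background error covariance
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0]])             # observe variables 1 and 3
    R = np.eye(2) * 0.25                        # observation error covariance
    y = np.array([1.4, 2.5])                    # observations

    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)    # gain matrix
    xa = xb + K @ (y - H @ xb)                      # variance-minimum analysis
    print("analysis:", xa.round(3))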
NASA Astrophysics Data System (ADS)
Bliefernicht, Jan; Seidel, Jochen; Salack, Seyni; Waongo, Moussa; Laux, Patrick; Kunstmann, Harald
2017-04-01
Seasonal precipitation forecasts are a crucial source of information for early warning of hydro-meteorological extremes in West Africa. However, the current seasonal forecasting system used by the West African weather services in the framework of the West African Climate Outlook Forum (PRESAO) is limited to probabilistic precipitation forecasts of 1-month lead time. To improve this provision, we use an ensemble-based quantile-quantile transformation for bias correction of precipitation forecasts provided by a global seasonal ensemble prediction system, the Climate Forecast System Version 2 (CFS2). The statistical technique eliminates systematic differences between global forecasts and observations while retaining the potential to preserve the signal from the model. It also has the advantage that it can be easily implemented at national weather services with limited capacity. The technique is used to generate probabilistic forecasts of monthly and seasonal precipitation amount and other precipitation indices useful for early warning of large-scale drought and floods in West Africa. The evaluation of the technique is done using CFS hindcasts (1982 to 2009) in cross-validation mode to determine the performance of the precipitation forecasts at several lead times, focusing on drought and flood events over the Volta and Niger basins. In addition, operational forecasts provided by PRESAO are analyzed from 1998 to 2015. The precipitation forecasts are compared to low-skill reference forecasts generated from gridded observations (i.e. GPCC, CHIRPS) and a novel in-situ gauge database from national observation networks (see Poster EGU2017-10271). The forecasts are evaluated using state-of-the-art verification techniques to determine specific quality attributes of probabilistic forecasts such as reliability, accuracy and skill. In addition, cost-loss approaches are used to determine the value of probabilistic forecasts for multiple users in warning situations. The outcomes of the hindcast experiment for the Volta basin illustrate that the statistical technique can clearly improve the CFS precipitation forecasts, with the potential to provide skillful and valuable early precipitation warnings for large-scale drought and flood situations several months ahead. In this presentation we give a detailed overview of the ensemble-based quantile-quantile transformation, its validation and verification, and the possibilities of this technique to complement PRESAO. We also highlight the performance of this technique for extremes such as the Sahel drought of the 1980s, in comparison with the various reference data sets (e.g. CFS2, PRESAO, observational data sets) used in this study.
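An empirical quantile-quantile transformation of the kind described can be sketched as follows; the training climatologies are synthetic stand-ins.

    import numpy as np

    def quantile_map(fcst, train_fcst, train_obs):
        # Empirical quantile-quantile transformation: map each forecast value to
        # the observed value at the same quantile of the training distributions.
        ranks = np.searchsorted(np.sort(train_fcst), fcst) / len(train_fcst)
        ranks = np.clip(ranks, 0.0, 1.0)
        return np.quantile(train_obs, ranks)

    rng = np.random.default_rng(2)
    train_obs = rng.gamma(2.0, 50.0, 1000)                   # stand-in monthly precipitation (mm)
    train_fcst = 0.6 * train_obs + rng.normal(0, 20, 1000)   # systematically biased hindcasts
    print(quantile_map(np.array([40.0, 120.0]), train_fcst, train_obs))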
Satellite Sounder Data Assimilation for Improving Alaska Region Weather Forecast
NASA Technical Reports Server (NTRS)
Zhu, Jiang; Stevens, E.; Zhang, X.; Zavodsky, B. T.; Heinrichs, T.; Broderson, D.
2014-01-01
A case study and a monthly statistical analysis using sounder data assimilation to improve the Alaska regional weather forecast model are presented. Weather forecasting in Alaska faces challenges as well as opportunities. Alaska is a large region with multiple types of topography and an extensive coastal area, so weather forecast models must be finely tuned to predict Alaska weather accurately. At the same time, its high-latitude location gives Alaska greater coverage from polar-orbiting satellites for integration into forecasting models than the lower 48 states. Forecasting marine low stratus clouds is critical to the Alaska aviation and oil industries and is the current focus of the case study. NASA AIRS/CrIS sounder profile data are assimilated into the Alaska regional weather forecast model to improve Arctic marine stratus cloud forecasts. The choice of physical options for the WRF model is discussed, and the preprocessing of AIRS/CrIS sounder data for data assimilation is described. Local observation data, satellite data, and global data assimilation data are used to verify and evaluate the forecast results using the Model Evaluation Tools (MET).
Operational hydrological forecasting in Bavaria. Part II: Ensemble forecasting
NASA Astrophysics Data System (ADS)
Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.
2009-04-01
In part I of this study, the operational flood forecasting system in Bavaria and an approach to identify and quantify forecast uncertainty were introduced. The approach is split into the calculation of an empirical 'overall error' from archived forecasts and the calculation of an empirical 'model error' based on hydrometeorological forecast tests in which rainfall observations were used instead of forecasts. The 'model error' can, especially in upstream catchments where forecast uncertainty depends strongly on the current predictability of the atmosphere, be superimposed on the spread of a hydrometeorological ensemble forecast. In Bavaria, two meteorological ensemble prediction systems are currently being tested for operational use: the 16-member COSMO-LEPS forecast and a poor man's ensemble composed of DWD GME, DWD Cosmo-EU, NCEP GFS, Aladin-Austria and MeteoSwiss Cosmo-7. The determination of the overall forecast uncertainty depends on the catchment characteristics:
1. Upstream catchments with high influence of the weather forecast. a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. b) Corresponding to the characteristics of the meteorological ensemble forecast, each resulting forecast hydrograph can be regarded as equally likely. c) The 'model error' distribution, with parameters dependent on the hydrological case and lead time, is added to each forecast timestep of each ensemble member. d) For each forecast timestep, the overall error distribution (i.e., over the 'model error' distributions of all ensemble members) is calculated. e) From this distribution, the uncertainty range at a desired level (here: the 10% and 90% percentiles) is extracted and drawn as a forecast envelope. f) As the mean or median of an ensemble forecast does not necessarily exhibit a meteorologically sound temporal evolution, a single hydrological forecast termed the 'lead forecast' is chosen and shown in addition to the uncertainty bounds. This can be either an intermediate forecast between the extremes of the ensemble spread or a manually selected forecast based on a meteorologist's advice.
2. Downstream catchments with low influence of the weather forecast. In downstream catchments with strong human impact on discharge (e.g., by reservoir operation) and large influence of upstream gauge observation quality on forecast quality, the 'overall error' may in most cases be larger than the combination of the 'model error' and the ensemble spread. Therefore, the overall forecast uncertainty bounds are calculated differently: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing; here, the corresponding inflow hydrographs from all upstream catchments must also be used. b) As for an upstream catchment, the uncertainty range is determined by combining the 'model error' and the ensemble member forecasts. c) In addition, the 'overall error' is superimposed on the 'lead forecast'. For reasons of consistency, the lead forecast must be based on the same meteorological forecast in the downstream and all upstream catchments. d) From the resulting two uncertainty ranges (one from the ensemble forecast and 'model error', one from the 'lead forecast' and 'overall error'), the envelope is taken as the most prudent uncertainty range.
In sum, the uncertainty associated with each forecast run is calculated and communicated to the public in the form of 10% and 90% percentiles.
As in part I of this study, the methodology, as well as the usefulness (or lack thereof) of the resulting uncertainty ranges, will be presented and discussed using typical examples.
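A sketch of the envelope construction for the upstream case (steps a-e above): superimpose a lead-time-dependent 'model error' on each ensemble member, pool the samples, and extract the 10% and 90% percentiles. The error growth and discharge numbers are invented.

    import numpy as np

    rng = np.random.default_rng(8)
    n_members, n_steps = 16, 48
    # Synthetic 16-member discharge ensemble (m3/s) over a 48 h horizon.
    ensemble = 50 + np.cumsum(rng.normal(0.5, 2.0, (n_members, n_steps)), axis=1)

    # Illustrative 'model error': normal, with spread growing with lead time.
    lead_sd = 2.0 + 0.2 * np.arange(n_steps)
    samples = ensemble[:, None, :] + rng.normal(0, lead_sd, (n_members, 500, n_steps))
    pooled = samples.reshape(-1, n_steps)

    lo, hi = np.percentile(pooled, [10, 90], axis=0)   # forecast envelope
    print("envelope at +24 h: %.1f to %.1f m3/s" % (lo[23], hi[23]))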
New Models for Forecasting Enrollments: Fuzzy Time Series and Neural Network Approaches.
ERIC Educational Resources Information Center
Song, Qiang; Chissom, Brad S.
Since university enrollment forecasting is very important, many different methods and models have been proposed by researchers. Two new methods for enrollment forecasting are introduced: (1) the fuzzy time series model; and (2) the artificial neural networks model. Fuzzy time series has been proposed to deal with forecasting problems within a…
NASA Astrophysics Data System (ADS)
Krzyścin, J. W.; Jaroslawski, J.; Sobolewski, P.
2001-10-01
A forecast of the UV index for the following day is presented. The standard approach to UV index modelling is applied, i.e., the clear-sky UV index is multiplied by the UV cloud transmission factor. The input to the clear-sky model (the tropospheric ultraviolet and visible (TUV) model; Madronich, in: M. Tevini (Ed.), Environmental Effects of Ultraviolet Radiation, Lewis Publisher, Boca Raton, 1993, p. 17) consists of the total ozone forecast (from a regression model using observed and forecasted meteorological variables, taken respectively as the initial values of the aviation (AVN) global model and its 24-hour forecasts) and the aerosol optical depth (AOD) forecast (assumed persistence). The cloud transmission factor forecast is inferred from the 24-h AVN model run for the total (Sun+sky) solar irradiance at noon. The model is validated by comparing the UV index forecasts with observed values, derived from the daily pattern of UV erythemal irradiance measured at Belsk (52°N, 21°E), Poland, with a UV Biometer Solar Model 501A, for the period May-September 1999. Eighty-one percent and 92% of all forecasts fall within ±1 and ±2 index units, respectively. Underestimation of the UV index occurs in only 15% of cases; thus, the model offers a high margin of safety for public sun protection. It is found that in ~35% of all cases a more accurate AOD forecast is needed to estimate the daily maximum of clear-sky irradiance with an error not exceeding 5%. The assumption of persistence of cloud characteristics appears to be an alternative to the 24-h forecast of the cloud transmission factor when the AVN prognoses are not available.
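The standard approach reduces to a one-line product; a minimal sketch with hypothetical next-day inputs:

    # UVI forecast = clear-sky UV index (from total ozone and AOD) x cloud transmission.
    def uv_index_forecast(clear_sky_uvi, cloud_transmission):
        return clear_sky_uvi * cloud_transmission

    # Hypothetical next-day inputs: a TUV-like clear-sky UVI of 6.4 and a cloud
    # transmission factor of 0.75 inferred from the forecast noon solar irradiance.
    print(round(uv_index_forecast(6.4, 0.75), 1))   # -> 4.8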
Evaluation of streamflow forecast for the National Water Model of U.S. National Weather Service
NASA Astrophysics Data System (ADS)
Rafieeinasab, A.; McCreight, J. L.; Dugger, A. L.; Gochis, D.; Karsten, L. R.; Zhang, Y.; Cosgrove, B.; Liu, Y.
2016-12-01
The National Water Model (NWM), an implementation of the community WRF-Hydro modeling system, is an operational hydrologic forecasting model for the contiguous United States. The model forecasts distributed hydrologic states and fluxes, including soil moisture, snowpack, ET, and ponded water. In particular, the NWM provides streamflow forecasts at more than 2.7 million river reaches for three forecast ranges: short (15 hr), medium (10 days), and long (30 days). In this study, we verify short- and medium-range streamflow forecasts alongside the verification of their respective quantitative precipitation forecasts/forcings (QPF), the High Resolution Rapid Refresh (HRRR) and the Global Forecast System (GFS). The streamflow evaluation is performed for the summer of 2016 at more than 6,000 USGS gauges; both individual forecasts and forecast lead times are examined, and selected case studies of extreme events aim to provide insight into the quality of the NWM streamflow forecasts. A goal of this comparison is to address how much streamflow bias originates from precipitation forcing bias. To this end, precipitation verification is performed over the contributing areas above (and between assimilated) USGS gauge locations, based on the aggregated, blended StageIV/StageII data as the "reference truth". We summarize the skill of the streamflow forecasts and their skill relative to the QPF, and make recommendations for improving NWM forecast skill.
Cost-effectiveness Analysis of Vascular Access Referral Policies in CKD.
Shechter, Steven M; Chandler, Talon; Skandari, M Reza; Zalunardo, Nadia
2017-09-01
The optimal timing of vascular access referral for patients with chronic kidney disease who may need hemodialysis (HD) is a pressing question in nephrology. Current referral policies have not been rigorously compared with respect to costs and benefits and do not consider patient-specific factors such as age. Monte Carlo simulation model. Patients with chronic kidney disease, referred to a multidisciplinary kidney clinic in a universal health care system. Cost-effectiveness analysis, payer perspective, lifetime horizon. The following vascular access referral policies are considered: central venous catheter (CVC) only, arteriovenous fistula (AVF) or graft (AVG) referral upon HD initiation, AVF (or AVG) referral when HD is forecast to begin within 12 (or 3 for AVG) months, AVF (or AVG) referral when estimated glomerular filtration rate is <15 (or <10 for AVG) mL/min/1.73 m2. Incremental cost-effectiveness ratios (ICERs, in 2014 US dollars per quality-adjusted life-year [QALY] gained). The ICER of AVF (AVG) referral within 12 (3) months of forecasted HD initiation, compared to using only a CVC, is ∼$105k/QALY ($101k/QALY) at a population level (HD costs included). Pre-HD AVF or AVG referral dominates delaying referral until HD initiation. The ICER of pre-HD referral increases with patient age. Results are most sensitive to erythropoietin costs, ongoing HD costs, and patients' utilities for HD. When ongoing HD costs are excluded from the analysis, pre-HD AVF dominates both pre-HD AVG and CVC-only policies. Literature-based estimates for HD, AVF, and AVG utilities are limited. The cost-effectiveness of vascular access referral is largely driven by the annual costs of HD, erythropoietin costs, and access-specific utilities. Further research is needed in the field of dialysis-related quality of life to inform decision making regarding vascular access referral. Copyright © 2017 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
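The ICER itself is a one-line computation; the cost and QALY totals below are illustrative values chosen to reproduce the ~$105k/QALY figure, not outputs of the study's simulation.

    def icer(cost_new, qaly_new, cost_ref, qaly_ref):
        # Incremental cost-effectiveness ratio in $/QALY gained.
        return (cost_new - cost_ref) / (qaly_new - qaly_ref)

    # Hypothetical lifetime discounted totals for one simulated cohort:
    # pre-HD AVF referral vs CVC-only (numbers are illustrative only).
    print(f"${icer(610_000, 5.2, 505_000, 4.2):,.0f}/QALY")   # -> $105,000/QALY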
Calibration of limited-area ensemble precipitation forecasts for hydrological predictions
NASA Astrophysics Data System (ADS)
Diomede, Tommaso; Marsigli, Chiara; Montani, Andrea; Nerozzi, Fabrizio; Paccagnella, Tiziana
2015-04-01
The main objective of this study is to investigate the impact of calibration on limited-area ensemble precipitation forecasts used to drive discharge predictions up to 5 days in advance. A reforecast dataset spanning 30 years, based on the Consortium for Small Scale Modeling Limited-Area Ensemble Prediction System (COSMO-LEPS), was used for testing the calibration strategy. Three calibration techniques were applied: quantile-to-quantile mapping, linear regression, and analogs. The performance of these methodologies was evaluated in terms of statistical scores for the precipitation forecasts operationally provided by COSMO-LEPS in the years 2003-2007 over Germany, Switzerland, and the Emilia-Romagna region (northern Italy). The analog-based method appeared preferable because of its capability to correct position errors and spread deficiencies. A suitable spatial domain for the analog search can help handle model spatial errors as systematic errors. However, the performance of the analog-based method may degrade when only a limited training dataset is available, so a sensitivity test on the length of the training dataset over which the analog search is performed was carried out. The quantile-to-quantile mapping and linear regression methods were less effective, mainly because the forecast-analysis relation was not strong for the available training dataset. A comparison between calibration based on the deterministic reforecast and calibration based on the full operational ensemble used as the training dataset was also considered, with the aim of evaluating whether reforecasts are really worthwhile for calibration, given their remarkable computational cost. The calibration process was then verified by coupling ensemble precipitation forecasts with a distributed rainfall-runoff model. This test was carried out for a medium-sized catchment in Emilia-Romagna and showed a beneficial impact of the analog-based method on reducing missed events in discharge predictions.
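An analog-based calibration can be sketched as follows: find the k historical (reforecast) cases most similar to the current forecast and use their observed outcomes as the calibrated ensemble. The archive below is synthetic.

    import numpy as np

    def analog_ensemble(current_fcst, hist_fcsts, hist_obs, k=20):
        # Analog-based calibration: the k most similar historical forecasts
        # contribute their observed outcomes as ensemble members.
        dist = np.abs(hist_fcsts - current_fcst)
        idx = np.argsort(dist)[:k]
        return hist_obs[idx]

    rng = np.random.default_rng(9)
    hist_fcsts = rng.gamma(2.0, 5.0, 3000)                   # 30-yr reforecast archive stand-in
    hist_obs = hist_fcsts * rng.lognormal(0.0, 0.4, 3000)    # correlated observed precipitation
    members = analog_ensemble(12.0, hist_fcsts, hist_obs)
    print("calibrated ensemble mean: %.1f mm" % members.mean())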
Analysis of user cost and service trade-offs in transit and paratransit services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louviere, J.; Kocur, G.
1979-08-01
The Xenia Model Transit Service served as a test of several alternative transit services operated in a small city setting. Research was designed to test a new method for assessing user tradeoffs in costs and service based on attitudinal methods. Termed direct response assessment, the methods were developed in psychology and have been extended to application in utility assessment. A tradeoff survey was administered as part of a home interview survey. Data from the tradeoff survey were used to develop separate equations for each sample respondent to explain and describe their tradeoffs over transit fare, travel time, walk distance, type of service, and headway. An aggregate equation was also developed, assuming that all respondents shared common tradeoffs. These equations were employed to retrospectively predict changes in transit system patronage since system inception in 1974. Both sets of models performed well, producing forecasts that were in the same direction and range of experience, although magnitudes were somewhat different. Coefficients of the individual tradeoff equations were then analyzed to see if they could be predicted on the basis of interpersonal characteristics of the respondents. Results indicated that differences in coefficients could be attributed to some differences in individuals such as income and auto ownership. Overall results were promising for policy evaluation and forecasting.
NASA Astrophysics Data System (ADS)
Chen, L. C.; Mo, K. C.; Zhang, Q.; Huang, J.
2014-12-01
Drought prediction from monthly to seasonal time scales is of critical importance to disaster mitigation, agricultural planning, and multi-purpose reservoir management. Starting in December 2012, NOAA Climate Prediction Center (CPC) has been providing operational Standardized Precipitation Index (SPI) Outlooks using the North American Multi-Model Ensemble (NMME) forecasts, to support CPC's monthly drought outlooks and briefing activities. The current NMME system consists of six model forecasts from U.S. and Canada modeling centers, including the CFSv2, CM2.1, GEOS-5, CCSM3.0, CanCM3, and CanCM4 models. In this study, we conduct an assessment of the predictive skill of meteorological drought using real-time NMME forecasts for the period from May 2012 to May 2014. The ensemble SPI forecasts are the equally weighted mean of the six model forecasts. Two performance measures, the anomaly correlation coefficient and root-mean-square error against the observations, are used to evaluate forecast skill. Similar to the assessment based on NMME retrospective forecasts, predictive skill of monthly-mean precipitation (P) forecasts is generally low after the second month, and errors vary among models. Although P forecast skill is not large, SPI predictive skill is high and the differences among models are small. The skill mainly comes from the P observations appended to the model forecasts. This factor also contributes to the similarity of SPI prediction among the six models. Still, NMME SPI ensemble forecasts have higher skill than those based on individual models or persistence, and the 6-month SPI forecasts are skillful out to four months. The three major drought events of the 2012-2014 period (the 2012 Central Great Plains drought, the 2013 Upper Midwest flash drought, and the 2013-2014 California drought) are used as examples to illustrate the system's strengths and limitations. For precipitation-driven drought events, such as the 2012 Central Great Plains drought, NMME SPI forecasts perform well in predicting drought severity and spatial patterns. For fast-developing drought events, such as the 2013 Upper Midwest flash drought, the system failed to capture the onset of the drought.
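The two verification measures named above can be written compactly; the following is a minimal sketch with toy arrays, not the study's verification code.

```python
import numpy as np

def anomaly_correlation(fcst, obs, clim):
    """Anomaly correlation coefficient (ACC): Pearson correlation between
    forecast and observed anomalies taken about a common climatology."""
    fa, oa = fcst - clim, obs - clim
    return np.sum(fa * oa) / np.sqrt(np.sum(fa ** 2) * np.sum(oa ** 2))

def rmse(fcst, obs):
    """Root-mean-square error against the observations."""
    return np.sqrt(np.mean((fcst - obs) ** 2))

clim = np.array([1.0, 1.0, 1.0, 1.0])
obs = np.array([0.4, 1.5, 2.0, 0.8])
fcst = np.array([0.6, 1.2, 1.7, 0.9])
print(anomaly_correlation(fcst, obs, clim), rmse(fcst, obs))
```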
A data-driven multi-model methodology with deep feature selection for short-term wind forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias
With the growing wind penetration into power systems worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by the first-layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks, and its effectiveness is evaluated for 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network. Numerical results show that, compared to single-algorithm models, the developed multi-model framework with the deep feature selection procedure improves forecasting accuracy by up to 30%.
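A generic two-layer blend of this kind can be sketched with scikit-learn's stacking ensemble; this illustrates the idea only, with synthetic features standing in for the paper's deep-feature-selected inputs and an arbitrary choice of first-layer learners.

```python
import numpy as np
from sklearn.ensemble import StackingRegressor, RandomForestRegressor, GradientBoostingRegressor
from sklearn.linear_model import Ridge
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 6))    # stand-in features (lagged speeds, met variables)
y = 2 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.3, size=500)

# Layer 1: statistically different learners; Layer 2: a blending model
# trained (via internal cross-validation) on their out-of-fold forecasts.
blend = StackingRegressor(
    estimators=[("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
                ("gbm", GradientBoostingRegressor(random_state=0)),
                ("svr", SVR(C=10.0))],
    final_estimator=Ridge(alpha=1.0), cv=5)
blend.fit(X[:400], y[:400])
print(blend.score(X[400:], y[400:]))   # held-out R^2 of the blended forecast
```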
Cost for the treatment of actinic keratosis on the rise in Australia
Perera, Eshini; McGuigan, Sean; Sinclair, Rodney
2014-01-01
Objectives: To report the burden and cost of actinic keratosis (AK) treatment in Australia and to forecast the number of AK treatments and the associated costs to 2020. Design and setting: A retrospective study of data obtained from Medicare Australia for AK treated by cryotherapy between 1 January 1994 and 31 December 2012, by year and by state or territory. Results: The total number of AK cryotherapy treatments increased from 247,515 in 1994 to 643,622 in 2012, and we estimate that the number of treatments will increase to 831,952 (95% CI, 676,919 to 986,987) by 2020. The total Medicare Benefits Schedule (MBS) benefits paid out for AK in 2012 were $19.6 million, and we forecast that this will increase to $24.7 million by 2020 (without inflation). Conclusion: The number of AK cryotherapy treatments increased by 160% between 1994 and 2012. We forecast that the number of treatments will increase by 30% between 2012 and 2020. The rates of non-melanoma skin cancer (NMSC) and AK appear to be increasing at the same rate. During the period 2010 to 2015, AK treatments are anticipated to increase by 17.8%, which follows a similar trend to published data forecasting an increase in NMSC treatments of 22.3%. PMID:25309734
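The abstract does not state the extrapolation model used; one simple, commonly used approach for such an annual series is an ordinary least-squares trend with a prediction interval, sketched below on synthetic counts that merely mimic the reported 1994 and 2012 endpoints.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical annual treatment counts: a noisy linear series that only mimics
# the reported 1994 and 2012 endpoints (the Medicare data are not reproduced).
years = np.arange(1994, 2013)
rng = np.random.default_rng(2)
counts = np.linspace(247_515, 643_622, years.size) + rng.normal(scale=15_000, size=years.size)

fit = sm.OLS(counts, sm.add_constant(years)).fit()
X_new = np.column_stack([np.ones(1), [2020]])            # design row for year 2020
pred = fit.get_prediction(X_new)
print(pred.predicted_mean, pred.conf_int(obs=True, alpha=0.05))  # forecast and 95% PI
```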
Test operation of a real-time tsunami inundation forecast system using actual data observed by S-net
NASA Astrophysics Data System (ADS)
Suzuki, W.; Yamamoto, N.; Miyoshi, T.; Aoi, S.
2017-12-01
If tsunami inundation information can be forecast rapidly and stably before a large tsunami strikes, the information would effectively help people realize the impending danger and the necessity of evacuation. Toward that goal, we have developed a prototype system to perform real-time tsunami inundation forecasts for Chiba prefecture, eastern Japan, using offshore ocean-bottom pressure data observed by the seafloor observation network for earthquakes and tsunamis along the Japan Trench (S-net) (Aoi et al., 2015, AGU). Because tsunami inundation simulation carries a large computational cost, we employ a database approach, searching pre-calculated tsunami scenarios that reasonably explain the observed S-net pressure data based on the multi-index method (Yamamoto et al., 2016, EPS). The scenario search is repeated regularly, rather than being triggered by the occurrence of a tsunami event, and the forecast information is generated from the selected scenarios that meet the criterion. Test operation of the prototype system using actual observation data started in April 2017, and the performance and behavior of the system during non-tsunami periods have been examined. We find that the treatment of noise affecting the observed data is the main issue to be solved to improve the system. Even when the observed pressure data are filtered to extract tsunami signals, ordinary background noise or unusually large noise, such as that from high ocean waves during storms, affects the comparison between the observed and scenario data. Because of this noise, tsunami scenarios are sometimes selected and a tsunami is forecast even though no tsunami event has actually occurred. In most cases, the scenarios selected because of noise have fault models in the regions along the Kurile or Izu-Bonin Trenches, far from the S-net region, or fault models below land. Based on parallel operation of the forecast system with a different scenario-search condition and examination of the fault models, we are improving the stability and performance of the forecast system. This work was supported by the Council for Science, Technology and Innovation (CSTI), Cross-ministerial Strategic Innovation Promotion Program (SIP), "Enhancement of societal resiliency against natural disasters" (Funding agency: JST).
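The multi-index scenario search itself is described in Yamamoto et al. (2016); as a loose illustration of the database idea only, the sketch below ranks pre-computed scenarios by RMS misfit to the observed pressure records and refuses a match when the signal is noise-like, echoing the storm-noise failure mode discussed above.

```python
import numpy as np

def select_scenarios(observed, scenario_bank, top_k=10, noise_floor=0.02):
    """Rank pre-calculated scenarios by RMS misfit to the observed pressure
    records (the operational system uses a multi-index criterion instead).
    A noise-like observation, e.g. storm-generated pressure fluctuations,
    returns no match rather than a spurious far-field fault scenario."""
    if np.std(observed) < noise_floor:
        return np.array([], dtype=int)        # signal indistinguishable from noise
    misfit = np.sqrt(np.mean((scenario_bank - observed) ** 2, axis=1))
    return np.argsort(misfit)[:top_k]

bank = np.random.default_rng(3).normal(size=(1000, 240))   # 1000 scenarios x 240 samples
obs = bank[42] + np.random.default_rng(4).normal(scale=0.05, size=240)
print(select_scenarios(obs, bank)[:3])                     # scenario 42 ranks first
```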
A Pro-active Real-time Forecasting and Decision Support System for Daily Management of Marine Works
NASA Astrophysics Data System (ADS)
Bollen, Mark; Leyssen, Gert; Smets, Steven; De Wachter, Tom
2016-04-01
Marine works involving turbidity-generating activities (e.g., dredging, dredge spoil placement) can generate environmental stress in and around a project area in the form of sediment plumes causing light reduction and sedimentation. If these works are situated near sensitive habitats such as sea-grass beds or coral reefs, near sensitive human activities such as aquaculture farms or water intakes, or if contaminants are present in the water or soil, environmental scrutiny is advised. Environmental regulations can impose limitations on these activities in the form of turbidity thresholds, spill budgets, and contaminant levels. Breaching environmental regulations can result in increased monitoring, adaptation of the work planning and production rates, and ultimately in a (temporary) stop of activities, all of which entail time and cost impacts for a contractor and/or client. Sediment plume behaviour is governed by the dredging process, soil properties and ambient conditions (currents, water depth) and can be modelled. Usually this is done during the preparatory EIA phase of a project, for estimation of environmental impact based on climatic scenarios. An operational forecasting tool has been developed to adapt marine work schedules to real-time circumstances and thus avoid exceedance of critical threshold levels at sensitive areas. The forecasting system is based on a Python-based workflow manager with a MySQL database and a Django frontend web tool for user interaction and visualisation of the model results. The core consists of a numerical hydrodynamic model with a sediment transport module (Mike21 from DHI). This model is driven by space- and time-varying wind fields and wave boundary conditions, and by turbidity inputs (suspended sediment source terms) based on marine works production rates and soil properties. The resulting threshold analysis allows the operator to identify potential impact at the sensitive areas and instigate an adaptation of the marine work schedule if needed. In order to use this toolbox in real-time situations and facilitate forecasting of impacts of planned dredge works, the following operational online functionalities are implemented:
• Automated fetching and preparation of the input data, including 7-day forecast wind and wave fields, real-time measurements, and user-defined turbidity inputs based on scheduled marine works.
• Automated forecast generation, with user-configurable scenarios running in parallel.
• Export and conversion of the model results, time series and maps, into a standardized format (netcdf).
• Automatic analysis and processing of model results, including the calculation of indicator turbidity values and the exceedance analysis of threshold levels at the different sensitive areas (see the sketch after this entry). Data assimilation with the real-time on-site turbidity measurements is implemented in this threshold analysis.
• Pre-programmed generation of animated sediment plumes, specific charts and pdf reports to allow rapid interpretation of the model results by the operators, facilitating decision making in operational planning.
The performed marine works, resulting from the work schedule proposed by the forecasting system, are evaluated by a threshold analysis on the validated turbidity measurements at the sensitive sites. This learning loop allows a check of the system in order to evaluate forecast and model uncertainties.
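As promised in the list above, here is a minimal sketch of a threshold-exceedance check of the kind such a system might run on forecast turbidity at a sensitive site; the function name, threshold and minimum duration are illustrative assumptions, not the system's actual logic.

```python
import numpy as np

def exceedance_report(turbidity, threshold, min_duration=3):
    """Flag periods where forecast turbidity at a sensitive site stays above
    an environmental threshold for at least `min_duration` time steps."""
    above = turbidity > threshold
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_duration:
                events.append((start, i - 1))
            start = None
    if start is not None and len(above) - start >= min_duration:
        events.append((start, len(above) - 1))
    return events

series = np.array([3, 8, 9, 11, 12, 4, 3, 13, 14, 15, 16, 5], dtype=float)
print(exceedance_report(series, threshold=10.0))   # [(7, 10)]: one sustained event
```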
Next-Day Earthquake Forecasts for California
NASA Astrophysics Data System (ADS)
Werner, M. J.; Jackson, D. D.; Kagan, Y. Y.
2008-12-01
We implemented a daily forecast of m > 4 earthquakes for California in a format suitable for testing in community-based earthquake predictability experiments: the Regional Earthquake Likelihood Models (RELM) project and the Collaboratory for the Study of Earthquake Predictability (CSEP). The forecast is based on near-real-time earthquake reports from the ANSS catalog above magnitude 2 and will be available online. The model used to generate the forecasts is based on the Epidemic-Type Earthquake Sequence (ETES) model, a stochastic model of clustered and triggered seismicity. Our particular implementation is based on the earlier work of Helmstetter et al. (2006, 2007), but we extended the forecast to all of California, used more data to calibrate the model and its parameters, and made some modifications. Our forecasts will compete against the Short-Term Earthquake Probabilities (STEP) forecasts of Gerstenberger et al. (2005) and other models in the next-day testing class of the CSEP experiment in California. We illustrate our forecasts with examples and discuss preliminary results.
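For orientation, a generic ETAS/ETES-style conditional intensity combines a background rate with magnitude-boosted Omori-law aftershock terms; the sketch below uses illustrative parameter values, not the calibration described in the abstract.

```python
import numpy as np

def etes_rate(t, event_times, event_mags, mu=0.2, K=0.05,
              alpha=1.0, c=0.01, p=1.2, m0=2.0):
    """Clustered-seismicity rate at time t (days): a background rate mu plus
    Omori-law contributions from all earlier events, each boosted
    exponentially by its magnitude (generic ETAS/ETES form; parameter
    values here are illustrative only)."""
    past = event_times < t
    triggers = K * np.exp(alpha * (event_mags[past] - m0)) \
               * (t - event_times[past] + c) ** (-p)
    return mu + triggers.sum()

times = np.array([0.0, 0.5, 0.6])   # days
mags = np.array([5.0, 3.2, 2.8])
print(etes_rate(1.0, times, mags))  # rate is still elevated one day after an M5
```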
NASA Astrophysics Data System (ADS)
Lehner, F.; Wood, A.; Llewellyn, D.; Blatchford, D. B.; Goodbody, A. G.; Pappenberger, F.
2017-12-01
Recent studies have documented the influence of increasing temperature on streamflow across the American West, including snowmelt-driven rivers such as the Colorado or Rio Grande. At the same time, some basins are reporting decreasing skill in seasonal streamflow forecasts, termed water supply forecasts (WSFs), over the past decade. While the skill of seasonal precipitation forecasts from dynamical models remains low, their skill in predicting seasonal temperature variations could potentially be harvested for WSFs to account for non-stationarity in regional temperatures. Here, we investigate whether WSF skill can be improved by incorporating seasonal temperature forecasts from dynamical forecasting models (from the North American Multi-Model Ensemble and the European Centre for Medium-Range Weather Forecasts System 4) into traditional statistical forecast models. We find improved streamflow forecast skill relative to traditional WSF approaches in a majority of headwater locations in the Colorado and Rio Grande basins. Incorporation of temperature into WSFs thus provides a promising avenue to increase the robustness of current forecasting techniques in the face of continued regional warming.
NASA Astrophysics Data System (ADS)
Cobourn, W. Geoffrey
2010-08-01
An enhanced PM2.5 air quality forecast model based on nonlinear regression (NLR) and back-trajectory concentrations has been developed for use in the Louisville, Kentucky metropolitan area. The PM2.5 air quality forecast model is designed for use in the warm season, from May through September, when PM2.5 air quality is more likely to be critical for human health. The enhanced PM2.5 model consists of a basic NLR model, developed for use with an automated air quality forecast system, and an additional parameter based on upwind PM2.5 concentration, called PM24. The PM24 parameter is designed to be determined manually, by synthesizing backward air trajectory and regional air quality information to compute 24-h back-trajectory concentrations. The PM24 parameter may be used by air quality forecasters to adjust the forecast provided by the automated forecast system. In this study of the 2007 and 2008 forecast seasons, the enhanced model performed well using forecasted meteorological data and PM24 as input. The enhanced PM2.5 model was compared with three alternative models: the basic NLR model, the basic NLR model with a persistence parameter added, and the NLR model with both persistence and PM24. The two models that included PM24 were of comparable accuracy. The two models incorporating back-trajectory concentrations had lower mean absolute errors and higher rates of detecting unhealthy PM2.5 concentrations compared to the other models.
Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheung, WanYin; Zhang, Jie; Florita, Anthony
2015-12-08
Uncertainties associated with solar forecasts present challenges to maintaining grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts with the smallest NRMSE in all parameters. The NRMSE of the solar irradiance forecasts of the ensemble model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that errors in the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.
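One common definition of the NRMSE metric mentioned above (normalization conventions vary, and the paper's may differ) is:

```python
import numpy as np

def nrmse(forecast, observed):
    """Root-mean-square error normalized by the observed range; other
    conventions normalize by the mean or by installed capacity."""
    rmse = np.sqrt(np.mean((forecast - observed) ** 2))
    return rmse / (observed.max() - observed.min())

obs = np.array([0.0, 12.0, 31.0, 44.0, 38.0, 9.0])    # kW, toy power series
fcst = np.array([0.0, 15.0, 28.0, 47.0, 35.0, 12.0])
print(nrmse(fcst, obs))
```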
Forecast first: An argument for groundwater modeling in reverse
White, Jeremy
2017-01-01
Numerical groundwater models are important components of groundwater analyses that are used for making critical decisions related to the management of groundwater resources. In this support role, models are often constructed to serve a specific purpose, that is, to provide insights, through simulation, related to a specific function of a complex aquifer system that cannot be observed directly (Anderson et al. 2015). For any given modeling analysis, several model input datasets must be prepared. Herein, the datasets required to simulate the historical conditions are referred to as the calibration model, and the datasets required to simulate the model's purpose are referred to as the forecast model. Future groundwater conditions or other unobserved aspects of the groundwater system may be simulated by the forecast model; the outputs of interest from the forecast model represent the purpose of the modeling analysis. Unfortunately, the forecast model, needed to simulate the purpose of the modeling analysis, is seemingly an afterthought: calibration is where the majority of time and effort are expended, and calibration is usually completed before the forecast model is even constructed. Herein, I am proposing a new groundwater modeling workflow, referred to as the "forecast first" workflow, where the forecast model is constructed at an earlier stage in the modeling analysis and the outputs of interest from the forecast model are evaluated during subsequent tasks in the workflow.
Hybrid Forecasting of Daily River Discharges Considering Autoregressive Heteroscedasticity
NASA Astrophysics Data System (ADS)
Szolgayová, Elena Peksová; Danačová, Michaela; Komorniková, Magda; Szolgay, Ján
2017-06-01
It is widely acknowledged in the hydrological and meteorological communities that there is a continuing need to improve the quality of quantitative rainfall and river flow forecasts. A hybrid (combined deterministic-stochastic) modelling approach is proposed here that combines, in parallel, the advantages of modelling the system dynamics with a deterministic model and the deterministic forecasting error series with a data-driven model. Since the processes to be modelled are generally nonlinear and the model error series may exhibit nonstationarity and heteroscedasticity, GARCH-type nonlinear time series models are considered here. The fitting, forecasting and simulation performance of such models has to be explored on a case-by-case basis. The goal of this paper is to test and develop an appropriate methodology for model fitting and forecasting, applicable to daily river discharge forecast error data, from the GARCH family of time series models. We concentrated on verifying whether a GARCH-type model is suitable for modelling and forecasting a hydrological model error time series on the Hron and Morava Rivers in Slovakia. For this purpose we verified the presence of heteroscedasticity in the simulation error series of the KLN multilinear flow routing model; we then fitted GARCH-type models to the data and compared their fit with that of an ARMA-type model. We produced one-step-ahead forecasts from the fitted models and again compared the models' performance.
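A GARCH(1,1) fit to a model-error series of this kind can be sketched with the Python `arch` package; the simulated residual series below is a stand-in for the KLN routing-model errors, and the AR(1) mean term is an assumption for illustration.

```python
import numpy as np
from arch import arch_model   # pip install arch

# Simulate a GARCH(1,1)-like error series standing in for routing-model residuals.
rng = np.random.default_rng(5)
e = rng.normal(size=2000)
resid = np.empty(2000)
sigma2 = 0.1
for t in range(2000):
    sigma2 = 0.05 + 0.1 * (resid[t - 1] ** 2 if t else 0.0) + 0.85 * sigma2
    resid[t] = np.sqrt(sigma2) * e[t]

# AR(1) mean with GARCH(1,1) conditional variance, fitted to the error series.
am = arch_model(resid, mean="AR", lags=1, vol="GARCH", p=1, q=1)
res = am.fit(disp="off")
print(res.forecast(horizon=1).variance.iloc[-1])   # one-step-ahead error variance
```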
Minimal Residual Disease Evaluation in Childhood Acute Lymphoblastic Leukemia: An Economic Analysis
Gajic-Veljanoski, O.; Pham, B.; Pechlivanoglou, P.; Krahn, M.; Higgins, Caroline; Bielecki, Joanna
2016-01-01
Background Minimal residual disease (MRD) testing by higher-performance techniques such as flow cytometry and polymerase chain reaction (PCR) can be used to detect the proportion of remaining leukemic cells in bone marrow or peripheral blood during and after the first phases of chemotherapy in children with acute lymphoblastic leukemia (ALL). The results of MRD testing are used to reclassify these patients and guide changes in treatment according to their future risk of relapse. We conducted a systematic review of the economic literature, a cost-effectiveness analysis, and a budget-impact analysis to ascertain the cost-effectiveness and economic impact of MRD testing by flow cytometry for management of childhood precursor B-cell ALL in Ontario. Methods A systematic literature search (1998-2014) identified studies that examined the incremental cost-effectiveness of MRD testing by either flow cytometry or PCR. We developed a lifetime state-transition (Markov) microsimulation model to quantify the cost-effectiveness of MRD testing followed by risk-directed therapy relative to no MRD testing, and to estimate its marginal effect on health outcomes and costs. Model input parameters were based on the literature, expert opinion, and data from the Pediatric Oncology Group of Ontario Networked Information System. Using predictions from our Markov model, we estimated the 1-year cost burden of MRD testing versus no testing and forecasted its economic impact over 3 and 5 years. Results In a base-case cost-effectiveness analysis, compared with no testing, MRD testing by flow cytometry at the end of induction and consolidation was associated with an increased discounted survival of 0.0958 quality-adjusted life-years (QALYs) and increased discounted costs of $4,180, yielding an incremental cost-effectiveness ratio (ICER) of $43,613/QALY gained. After accounting for parameter uncertainty, MRD testing was associated with an ICER of $50,249/QALY gained. In the budget-impact analysis, the 1-year cost expenditure for MRD testing by flow cytometry in newly diagnosed patients with precursor B-cell ALL was estimated at $340,760. We forecasted that the province would have to pay approximately $1.3 million over 3 years and $2.4 million over 5 years for MRD testing by flow cytometry in this population. Conclusions Compared with no testing, MRD testing by flow cytometry in newly diagnosed patients with precursor B-cell ALL represents good value for money at commonly used willingness-to-pay thresholds of $50,000/QALY and $100,000/QALY. PMID:27099644
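The reported base case is simply the ratio of the incremental discounted cost to the incremental discounted QALYs; as a quick check with the rounded figures above:

```python
# Base-case ICER from the figures reported in the abstract; the small
# difference from the published $43,613/QALY reflects rounding of the inputs.
delta_cost = 4_180.0     # incremental discounted cost ($)
delta_qalys = 0.0958     # incremental discounted QALYs
print(f"ICER = ${delta_cost / delta_qalys:,.0f}/QALY gained")   # ~ $43,633/QALY
```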
A national-scale seasonal hydrological forecast system: development and evaluation over Britain
NASA Astrophysics Data System (ADS)
Bell, Victoria A.; Davies, Helen N.; Kay, Alison L.; Brookshaw, Anca; Scaife, Adam A.
2017-09-01
Skilful winter seasonal predictions for the North Atlantic circulation and northern Europe have now been demonstrated, and the potential for seasonal hydrological forecasting in the UK is now being explored. One of the techniques being used combines seasonal rainfall forecasts provided by operational weather forecast systems with hydrological modelling tools to provide estimates of seasonal mean river flows up to a few months ahead. The work presented here shows how spatial information contained in a distributed hydrological model typically requiring high-resolution (daily or better) rainfall data can be used to provide an initial condition for a much simpler forecast model tailored to use low-resolution monthly rainfall forecasts. Rainfall forecasts (hindcasts) from the GloSea5 model (1996 to 2009) are used to provide the first assessment of skill in these national-scale flow forecasts. The skill in the combined modelling system is assessed for different seasons and regions of Britain, and compared to what might be achieved using other approaches, such as use of an ensemble of historical rainfall in a hydrological model or a simple flow persistence forecast. The analysis indicates that only limited forecast skill is achievable for spring and summer seasonal hydrological forecasts; however, autumn and winter flows can be reasonably well forecast using (ensemble mean) rainfall forecasts based on either GloSea5 forecasts or historical rainfall (the preferred type of forecast depends on the region). Flow forecasts using ensemble-mean GloSea5 rainfall perform most consistently well across Britain, and provide the most skilful forecasts overall at the 3-month lead time. Much of the skill (64%) in the 1-month-ahead seasonal flow forecasts can be attributed to the hydrological initial condition (particularly in regions with a significant groundwater contribution to flows), whereas for the 3-month-ahead lead time, GloSea5 forecasts account for ~70% of the forecast skill (mostly in areas of high rainfall to the north and west) and only 30% of the skill arises from hydrological memory (typically groundwater-dominated areas). Given the high spatial heterogeneity in typical patterns of UK rainfall and evaporation, future development of skilful spatially distributed seasonal forecasts could lead to substantial improvements in seasonal flow forecast capability, potentially benefitting practitioners interested in predicting hydrological extremes, not only in the UK but also across Europe.
Remote sensing validation through SOOP technology: implementation of Spectra system
NASA Astrophysics Data System (ADS)
Piermattei, Viviana; Madonia, Alice; Bonamano, Simone; Consalvi, Natalizia; Caligiore, Aurelio; Falcone, Daniela; Puri, Pio; Sarti, Fabio; Spaccavento, Giovanni; Lucarini, Diego; Pacci, Giacomo; Amitrano, Luigi; Iacullo, Salvatore; D'Andrea, Salvatore; Marcelli, Marco
2017-04-01
The development of low-cost instrumentation plays a key role in marine environmental studies and represents one of the most innovative aspects of marine research. The availability of low-cost technologies allows the realization of extended observatory networks for the study of marine phenomena through an integrated approach merging observations, remote sensing and operational oceanography. Marine services and practical applications critically depend on the availability of large amounts of data collected with sufficiently dense spatial and temporal sampling. This issue directly influences the robustness both of ocean forecasting models and of remote sensing observations through data assimilation and validation processes, particularly in the biological domain. For this reason it is necessary to develop cheap, small and integrated smart sensors, which could serve both satellite data validation and forecasting-model data assimilation, as well as support early warning systems for environmental pollution control and prevention. This is particularly true in coastal areas, which are subjected to multiple anthropic pressures. Moreover, coastal waters can be classified as Case 2 waters, where the optical properties of inorganic suspended matter and chromophoric dissolved organic matter must be considered and separated from the chlorophyll a contribution. Due to the high costs of mooring systems, research vessels, measurement platforms and instrumentation, a major effort was dedicated to the design, development and realization of a new low-cost mini-FerryBox system: Spectra. Thanks to its modularity and user-friendly operation, Spectra can acquire continuous in situ measurements of temperature, conductivity, turbidity, and chlorophyll a and chromophoric dissolved organic matter (CDOM) fluorescence from voluntary vessels, even with non-specialized operators (Marcelli et al., 2014; 2016). This work shows the preliminary application of this technology to remote sensing data validation.
NASA Astrophysics Data System (ADS)
Kobos, Peter Holmes
This dissertation analyzes the current and potential future costs of renewable energy technology from an institutional perspective. The central hypothesis is that reliable technology cost forecasting can be achieved through standard and modified experience curves implemented in a dynamic simulation model. Additionally, drawing upon region-specific institutional lessons highlights the role of market, social, and political institutions throughout an economy. Socio-political influences and government policy pathways drive resource allocation decisions that may be predominately influenced by factors other than those considered in a traditional market-driven, mechanistic approach. Learning in economic systems as a research topic is an attractive complement to the notion of institutional pathways. The economic implications of learning by doing, as first outlined by Arrow (1962), highlight decreasing production costs as individuals, or more generally the firm, become more familiar with a production process. The standard approach in the literature has been to employ a common experience curve where cumulative production is the only independent variable affecting costs. This dissertation develops a two-factor experience curve, adding research, development and demonstration (RD&D) expenditures as a second variable. To illustrate the concept in the context of energy planning, two-factor experience curves are developed for wind energy technology and solar photovoltaic (PV) modules under different assumptions on learning rates for cumulative capacity and the knowledge stock (a function of past RD&D efforts). Additionally, a one-factor experience curve and cost trajectory scenarios are developed for concentrated solar power and geothermal energy technology, respectively. Cost forecasts are then developed for all four of these technologies in a dynamic simulation model. Combining the theoretical framework of learning by doing with the fields of organizational learning and institutional economics, this dissertation argues that the current state of renewable energy technology costs is largely due to past production efforts (learning by doing) and RD&D efforts (learning by searching) in these global industries. This cost pathway, however, may be altered through several policy process feedback mechanisms, including targeted RD&D expenditures, maintenance of RD&D to promote learning effects, and financial incentive programs that support energy production from renewable energy technologies.
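A two-factor experience curve of the kind described can be written as unit cost falling as a power law in both cumulative capacity and the RD&D knowledge stock; the sketch below uses illustrative learning rates, not the dissertation's estimates.

```python
import numpy as np

def two_factor_cost(cum_capacity, knowledge_stock, c0, lbd_rate=0.10, lbs_rate=0.05):
    """Unit cost under a two-factor experience curve: cost falls with both
    cumulative capacity (learning by doing) and the RD&D knowledge stock
    (learning by searching). Learning rates here are illustrative only."""
    a = -np.log2(1 - lbd_rate)    # elasticity implied by a 10% LBD learning rate
    b = -np.log2(1 - lbs_rate)    # elasticity implied by a 5% LBS learning rate
    return c0 * cum_capacity ** (-a) * knowledge_stock ** (-b)

# Doubling both cumulative capacity and the knowledge stock (relative to the
# reference point) cuts unit cost by 1 - 0.90 * 0.95 = 14.5%:
print(two_factor_cost(2.0, 2.0, c0=1000.0))   # ~ 855.0
```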
Bayesian flood forecasting methods: A review
NASA Astrophysics Data System (ADS)
Han, Shasha; Coulibaly, Paulin
2017-08-01
Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecast in advance, their negative impacts could be greatly minimized. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied in flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to estimate floods: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvement. Future research in the context of Bayesian flood forecasting should focus on the assimilation of various sources of newly available information and on the improvement of predictive performance assessment methods.
Flexible NO(x) abatement from power plants in the eastern United States.
Sun, Lin; Webster, Mort; McGaughey, Gary; McDonald-Buller, Elena C; Thompson, Tammy; Prinn, Ronald; Ellerman, A Denny; Allen, David T
2012-05-15
Emission controls that provide incentives for maximizing reductions in emissions of ozone precursors on days when ozone concentrations are highest have the potential to be cost-effective ozone management strategies. Conventional prescriptive emissions controls or cap-and-trade programs treat all emissions alike regardless of when they occur, despite the fact that contributions to ozone formation may vary. In contrast, a time-differentiated approach targets emissions reductions on forecasted high-ozone days without imposing additional costs on lower-ozone days. This work examines simulations of such dynamic air quality management strategies for NO(x) emissions from electric generating units. Results from a model of day-specific NO(x) pricing applied to the Pennsylvania-New Jersey-Maryland (PJM) portion of the northeastern U.S. electrical grid demonstrate (i) that sufficient flexibility in electricity generation is available to allow power production to be switched from high- to low-NO(x)-emitting facilities, (ii) that the emission price required to induce EGUs to change their strategies for power generation is competitive with other control costs, (iii) that dispatching strategies, which can change the spatial and temporal distribution of emissions, lead to ozone concentration reductions comparable to other control technologies, and (iv) that air quality forecasting is sufficiently accurate to allow EGUs to adapt their power generation strategies.
NASA Astrophysics Data System (ADS)
Bao, Hongjun; Zhao, Linna
2012-02-01
A coupled atmospheric-hydrologic-hydraulic ensemble flood forecasting model, driven by The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) data, has been developed for flood forecasting over the Huaihe River. The incorporation of numerical weather prediction (NWP) information into flood forecasting systems may increase forecast lead time from a few hours to a few days. A single NWP model forecast from a single forecast center, however, is insufficient, as it involves considerable non-predictable uncertainty and leads to a high number of false alarms. The availability of global ensemble NWP systems through TIGGE offers a new opportunity for flood forecasting. The Xinanjiang model used for hydrological rainfall-runoff modeling and the one-dimensional unsteady flow model applied to channel flood routing are coupled with ensemble weather predictions based on the TIGGE data from the Canadian Meteorological Centre (CMC), the European Centre for Medium-Range Weather Forecasts (ECMWF), the UK Met Office (UKMO), and the US National Centers for Environmental Prediction (NCEP). The developed ensemble flood forecasting model is applied to the 2007 flood season as a test case, chosen over the upper reaches of the Huaihe River above Lutaizi station, which include flood diversion and retarding areas. The input flood discharge hydrograph from the main channel to the flood diversion area is estimated with a fixed split ratio of the main-channel discharge. The flood flow inside the flood retarding area is calculated as a reservoir with the water balance method. The Muskingum method is used for flood routing in the flood diversion area. A probabilistic discharge and flood inundation forecast is provided as the end product to study the potential benefits of using the TIGGE ensemble forecasts. The results demonstrate satisfactory flood forecasting with clear signals of the probability of floods up to a few days in advance, and show that TIGGE ensemble forecast data are a promising tool for forecasting flood inundation, comparable with forecasts driven by raingauge observations.
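The Muskingum routing step mentioned above has a standard recursive form; a minimal sketch (with illustrative K, X and time step, not the study's calibrated values) is:

```python
def muskingum_route(inflow, K=12.0, X=0.2, dt=6.0):
    """Route an inflow hydrograph through a reach with the Muskingum method
    (K: storage time constant, X: weighting factor, dt: time step; hours)."""
    denom = 2 * K * (1 - X) + dt
    c0 = (dt - 2 * K * X) / denom
    c1 = (dt + 2 * K * X) / denom
    c2 = (2 * K * (1 - X) - dt) / denom     # c0 + c1 + c2 = 1 (mass balance)
    outflow = [inflow[0]]                   # assume an initial steady state
    for i in range(1, len(inflow)):
        outflow.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * outflow[-1])
    return outflow

hydrograph = [100, 300, 680, 500, 400, 310, 230, 160, 110, 100]   # m^3/s
print([round(q) for q in muskingum_route(hydrograph)])            # attenuated, lagged peak
```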
Economic analysis for transmission operation and planning
NASA Astrophysics Data System (ADS)
Zhou, Qun
2011-12-01
Restructuring of the electric power industry has caused dramatic changes in the use of transmission system. The increasing congestion conditions as well as the necessity of integrating renewable energy introduce new challenges and uncertainties to transmission operation and planning. Accurate short-term congestion forecasting facilitates market traders in bidding and trading activities. Cost sharing and recovery issue is a major impediment for long-term transmission investment to integrate renewable energy. In this research, a new short-term forecasting algorithm is proposed for predicting congestion, LMPs, and other power system variables based on the concept of system patterns. The advantage of this algorithm relative to standard statistical forecasting methods is that structural aspects underlying power market operations are exploited to reduce the forecasting error. The advantage relative to previously proposed structural forecasting methods is that data requirements are substantially reduced. Forecasting results based on a NYISO case study demonstrate the feasibility and accuracy of the proposed algorithm. Moreover, a negotiation methodology is developed to guide transmission investment for integrating renewable energy. Built on Nash Bargaining theory, the negotiation of investment plans and payment rate can proceed between renewable generation and transmission companies for cost sharing and recovery. The proposed approach is applied to Garver's six bus system. The numerical results demonstrate fairness and efficiency of the approach, and hence can be used as guidelines for renewable energy investors. The results also shed light on policy-making of renewable energy subsidies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qin; Wu, Hongyu; Florita, Anthony R.
2016-11-11
The value of improving wind power forecasting accuracy at different electricity market operation timescales was analyzed by simulating the IEEE 118-bus test system, modified to emulate the generation mixes of the Midcontinent, California, and New England independent system operator balancing authority areas. The wind power forecasting improvement methodology and error analysis for the data set were elaborated. Production cost simulation was conducted on the three emulated systems with a total of 480 scenarios, considering the impacts of different generation technologies, wind penetration levels, and wind power forecasting improvement timescales. The static operational flexibility of the three systems was compared through the diversity of the generation mix, the percentage of must-run baseload generators, and the available ramp rates and minimum generation levels. The dynamic operational flexibility was evaluated by the real-time upward and downward ramp capacity. Simulation results show that the generation resource mix plays a crucial role in evaluating the value of improved wind power forecasting at different timescales. In addition, the changes in annual operational electricity generation costs were mostly influenced by the dominant resource in the system. Lastly, the impacts of pumped-storage resources, generation ramp rates, and system minimum generation level requirements on the value of improved wind power forecasting were also analyzed.
A new scoring method for evaluating the performance of earthquake forecasts and predictions
NASA Astrophysics Data System (ADS)
Zhuang, J.
2009-12-01
This study presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. A fair scoring scheme should reward success in a way that is compatible with the risk taken. Suppose that we have a reference model, usually the Poisson model for ordinary cases or the Omori-Utsu formula for forecasting aftershocks, which gives a probability p0 that at least one event occurs in a given space-time-magnitude window. The forecaster, like a gambler, starts with a certain number of reputation points and bets 1 reputation point on "Yes" or "No" according to his forecast, or bets nothing if he makes an NA-prediction. If the forecaster bets 1 reputation point on "Yes" and loses, the number of his reputation points is reduced by 1; if his forecast is successful, he is rewarded (1-p0)/p0 reputation points. The quantity (1-p0)/p0 is the return (reward/bet) ratio for bets on "Yes". In this way, if the reference model is correct, the expected return from this bet is 0. This rule also applies to probability forecasts. Suppose that p is the occurrence probability of an earthquake given by the forecaster. We can regard the forecaster as splitting 1 reputation point by betting p on "Yes" and 1-p on "No". In this way, the forecaster's expected pay-off based on the reference model is still 0. From the viewpoints of both the reference model and the forecaster, the rule for reward and punishment is fair. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and the reference model is the Poisson model.
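The betting rule translates directly into code; a minimal sketch of the per-window reputation change (with the fair "No"-bet return implied by the same zero-expected-return argument, since the abstract only states the "Yes" case) is:

```python
def gambling_score(p0, event_occurred, bet):
    """Reputation change for a single space-time-magnitude window.
    p0: reference-model probability that at least one event occurs.
    Rewards are set so the expected gain is zero if the reference is true;
    the "No"-bet return p0/(1-p0) is inferred by symmetry, not quoted."""
    if bet == "yes":
        return (1 - p0) / p0 if event_occurred else -1.0
    if bet == "no":
        return p0 / (1 - p0) if not event_occurred else -1.0
    return 0.0                                   # NA-prediction: nothing staked

# A successful alarm in a window where the Poisson reference gives p0 = 0.1:
print(gambling_score(0.1, event_occurred=True, bet="yes"))   # +9 reputation points
```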
Oil Production Capacity Expansion Costs for the Persian Gulf
1996-01-01
Provides estimates of development and operating costs for various size fields in countries surrounding the Persian Gulf. In addition, a forecast of the required reserve development and associated costs to meet the expected demand through the year 2010 is presented.
Improving of local ozone forecasting by integrated models.
Gradišar, Dejan; Grašič, Boštjan; Božnar, Marija Zlata; Mlakar, Primož; Kocijan, Juš
2016-09-01
This paper discusses the problem of forecasting maximum ozone concentrations in urban microlocations, where reliable alerting of the local population when thresholds are surpassed is necessary. To improve the forecast, a methodology of integrated models is proposed. The model is based on multilayer perceptron neural networks that take as inputs all available information from the QualeAria air-quality model, the WRF numerical weather prediction model, and on-site measurements of meteorology and air pollution. While air-quality and meteorological models cover a large three-dimensional geographical space, their local resolution is often not satisfactory. Empirical methods, on the other hand, have the advantage of good local forecasts. In this paper, integrated models are used for improved 1-day-ahead forecasting of the maximum hourly ozone value within each day for representative locations in Slovenia. The WRF meteorological model is used for forecasting meteorological variables and the QualeAria air-quality model for gas concentrations. Their predictions, together with measurements from ground stations, are used as inputs to a neural network. The model validation results show that integrated models noticeably improve ozone forecasts and provide better alert systems.
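A minimal sketch of such an integrated model with a multilayer perceptron (synthetic features standing in for the QualeAria, WRF and station inputs; scikit-learn rather than the authors' toolchain) might look like:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in feature matrix: columns would be QualeAria ozone fields, WRF
# meteorology (temperature, wind, radiation) and on-site measurements.
rng = np.random.default_rng(6)
X = rng.normal(size=(600, 8))
y = 60 + 15 * X[:, 0] + 8 * np.maximum(X[:, 1], 0) + rng.normal(scale=5, size=600)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                   random_state=0))
model.fit(X[:500], y[:500])
print(model.score(X[500:], y[500:]))   # R^2 of daily-maximum ozone on held-out days
```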
NASA Astrophysics Data System (ADS)
Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K.
2015-08-01
Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and the benefits gained through hydropower generation. Improving hourly reservoir inflow forecasts over a 24 h lead time is considered within the day-ahead (Elspot) market of the Nordic power exchange. A complementary modelling framework offers an approach for improving real-time forecasting without needing to modify the pre-existing forecasting model, by instead formulating an independent additive or complementary model that captures the structure the existing operational model may be missing. We present here the application of this principle for issuing improved hourly inflow forecasts into hydropower reservoirs over extended lead times, with the parameter estimation procedure reformulated to deal with bias, persistence and heteroscedasticity. The procedure comprises an error model added on top of an unalterable constant-parameter conceptual model, and is applied in the 207 km2 Krinsvatn catchment in central Norway. The structure of the error model is established based on attributes of the residual time series from the conceptual model. Besides improving the forecast skill of the operational model, the approach estimates the uncertainty in the complementary model structure and produces probabilistic inflow forecasts that carry suitable information for reducing uncertainty in decision-making processes in hydropower system operation. Deterministic and probabilistic evaluations revealed an overall significant improvement in forecast accuracy for lead times up to 17 h. Evaluation of the percentage of observations bracketed by the forecast 95% confidence interval indicated that the degree of success in containing 95% of the observations varies across seasons and hydrologic years.
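As a loose illustration of the complementary idea (not the paper's error-model structure, which also handles heteroscedasticity), an additive AR(1) error model can propagate the last known forecast error forward and correct the conceptual model's output without touching the conceptual model itself:

```python
import numpy as np

def complementary_forecast(base_forecast, last_error, phi, bias):
    """Correct an unalterable conceptual model's inflow forecast with an
    additive error model: persistent errors are propagated forward with an
    AR(1) structure (phi and bias would be estimated from historical
    residuals; values here are illustrative)."""
    corrected, err = [], last_error
    for q in base_forecast:
        err = bias + phi * err           # expected residual at this lead time
        corrected.append(q + err)
    return np.array(corrected)

base = np.array([50.0, 52.0, 55.0, 60.0])    # hourly inflow forecast, m^3/s
print(complementary_forecast(base, last_error=-4.0, phi=0.8, bias=0.5))
```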
Evaluation and Applications of the Prediction of Intensity Model Error (PRIME) Model
NASA Astrophysics Data System (ADS)
Bhatia, K. T.; Nolan, D. S.; Demaria, M.; Schumacher, A.
2015-12-01
Forecasters and end users of tropical cyclone (TC) intensity forecasts would greatly benefit from a reliable expectation of model error to counteract the lack of consistency in TC intensity forecast performance. As a first step towards producing error predictions to accompany each TC intensity forecast, Bhatia and Nolan (2013) studied the relationship between synoptic parameters, TC attributes, and forecast errors. In this study, we build on previous results of Bhatia and Nolan (2013) by testing the ability of the Prediction of Intensity Model Error (PRIME) model to forecast the absolute error and bias of four leading intensity models available for guidance in the Atlantic basin. PRIME forecasts are independently evaluated at each 12-hour interval from 12 to 120 hours during the 2007-2014 Atlantic hurricane seasons. The absolute error and bias predictions of PRIME are compared to their respective climatologies to determine their skill. In addition to these results, we will present the performance of the operational version of PRIME run during the 2015 hurricane season. PRIME verification results show that it can reliably anticipate situations where particular models excel, and therefore could lead to a more informed protocol for hurricane evacuations and storm preparations. These positive conclusions suggest that PRIME forecasts also have the potential to lower the error in the original intensity forecasts of each model. As a result, two techniques are proposed to develop a post-processing procedure for a multimodel ensemble based on PRIME. The first approach is to inverse-weight models using PRIME absolute error predictions (higher predicted absolute error corresponds to lower weights). The second multimodel ensemble applies PRIME bias predictions to each model's intensity forecast and the mean of the corrected models is evaluated. The forecasts of both of these experimental ensembles are compared to those of the equal-weight ICON ensemble, which currently provides the most reliable forecasts in the Atlantic basin.
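The first proposed post-processing technique, inverse-error weighting, is straightforward; a minimal sketch with made-up intensity forecasts and PRIME-style error predictions:

```python
import numpy as np

def inverse_error_ensemble(forecasts, predicted_abs_errors, eps=1e-6):
    """Combine intensity forecasts with weights inversely proportional to
    each model's predicted absolute error, so that a higher predicted
    error yields a lower weight (the first approach described above)."""
    w = 1.0 / (np.asarray(predicted_abs_errors) + eps)
    w /= w.sum()
    return float(np.dot(w, forecasts))

models = np.array([95.0, 105.0, 88.0, 100.0])     # kt intensity forecasts
prime_errs = np.array([6.0, 12.0, 9.0, 10.0])     # predicted absolute errors (kt)
print(inverse_error_ensemble(models, prime_errs)) # leans toward the lowest-error model
```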
Liu, Dong-jun; Li, Li
2015-01-01
PM2.5 is the main factor influencing haze-fog pollution in China. In this study, the trend of PM2.5 concentration was first analyzed from a qualitative point of view based on mathematical models and simulation. A comprehensive forecasting model (CFM) was then developed based on combination forecasting ideas. The Autoregressive Integrated Moving Average (ARIMA) model, Artificial Neural Networks (ANNs) and the Exponential Smoothing Method (ESM) were used to predict the time series of PM2.5 concentration. The results of the comprehensive forecasting model were obtained by combining the results of the three methods using weights from the Entropy Weighting Method. The trend of PM2.5 concentration in Guangzhou, China was quantitatively forecast with the comprehensive forecasting model, the results were compared with those of the three single models, and PM2.5 concentrations for the next ten days were predicted. The comprehensive forecasting model balanced the deviations of the single prediction methods and had better applicability, offering a new prediction method for the air-quality forecasting field. PMID:26110332
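A common form of the Entropy Weighting Method assigns larger weights to methods whose normalized scores carry more information; the sketch below (with invented error values, and possibly differing from the paper's exact construction) illustrates the weight computation:

```python
import numpy as np

def entropy_weights(score_matrix):
    """Entropy Weight Method: columns (forecasting methods) whose normalized
    scores vary more across samples carry more information and receive
    larger weights. score_matrix: rows = evaluation days, cols = methods."""
    p = score_matrix / score_matrix.sum(axis=0)
    k = 1.0 / np.log(score_matrix.shape[0])
    entropy = -k * np.sum(p * np.log(p + 1e-12), axis=0)
    w = 1.0 - entropy
    return w / w.sum()

# Invented daily scores for ARIMA, ANN and ESM on five validation days:
scores = np.array([[4., 6., 5.], [3., 7., 6.], [5., 5., 4.], [4., 8., 5.], [3., 6., 6.]])
w = entropy_weights(scores)
print(w)   # combination weights; CFM forecast = w[0]*arima + w[1]*ann + w[2]*esm
```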
Assimilation of Wave Imaging Radar Observations for Real-time Wave-by-Wave Forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, Alexandra; Haller, Merrick; Walker, David
This project addressed Topic 3: "Wave Measurement Instrumentation for Feed Forward Controls" under FOA number DE-FOA-0000971. The overall goal of the program was to develop a phase-resolving wave forecasting technique for application to the active control of Wave Energy Conversion (WEC) devices. We have developed an approach that couples a wave-imaging marine radar with a phase-resolving linear wave model for real-time wave field reconstruction and forward propagation of the wave field in space and time. The scope of the project was to develop and assess the performance of this novel forecasting system. Specific project goals were as follows:
• Develop and verify a fast, GPU-based (Graphical Processing Unit) wave propagation model suitable for phase-resolved computation of nearshore wave transformation over variable bathymetry.
• Compare the accuracy and speed of the wave model against a deep-water model in their ability to predict wave field transformation in the intermediate water depths (50 to 70 m) typical of planned WEC sites.
• Develop and implement a variational assimilation algorithm that can ingest wave-imaging radar observations and estimate the time-varying wave conditions offshore of the domain of interest, such that the observed wave field is best reconstructed throughout the domain, and then use this to produce model forecasts for a given WEC location.
• Collect wave-resolving marine radar data, along with relevant in situ wave data, at a suitable wave energy test site, apply the algorithm to the field data, assess performance, and identify any necessary improvements.
• Develop a production cost estimate that addresses the affordability of the wave forecasting technology and include it in the Final Report.
The developed forecasting algorithm ("Wavecast") was evaluated for both speed and accuracy against a substantial synthetic dataset. Early in the project, performance tests definitively demonstrated that the system was capable of forecasting in real time, as the GPU-based wave model backbone was very computationally efficient. The data assimilation algorithm was developed on a polar grid domain to match the sampling characteristics of the observation system (wave-imaging marine radar). For verification purposes, a substantial set of synthetic wave data (i.e., forward runs of the wave model) was generated to serve as ground truth for the reconstructions and forecasts produced by Wavecast. For these synthetic cases, Wavecast demonstrated very good accuracy; for example, typical forecast correlation coefficients were between 0.84 and 0.95 when compared to the input data. Dependencies on shadowing, observational noise, and forecast horizon were also identified. During the second year of the project, a short field deployment was conducted to assess forecast accuracy under field conditions. For this, a radar was installed on a fishing vessel and observations were collected at the South Energy Test Site (SETS) off the coast of Newport, OR. At the SETS site, simultaneous in situ wave observations were also available owing to an ongoing field project funded separately. Unfortunately, the position and heading information available for the fishing vessel was not of sufficient accuracy to validate the forecast in a phase-resolving sense. Instead, a spectral comparison was made between the Wavecast forecast and the data from the in situ wave buoy.
Although the wave and wind conditions during the field test were complex, the comparison showed a promising reconstruction of the wave spectral shape, with both peaks in the bimodal spectrum represented. However, the total reconstructed spectral energy (across all directions and frequencies) was limited to 44% of the observed spectrum. Overall, wave-by-wave forecasting using a data assimilation approach based on wave imaging radar observations and a physics-based wave model shows promise for short-term phase-resolved predictions. Two recommendations for future work follow. First, additional focused field campaigns are recommended for algorithm validation; such a campaign should be long enough to capture a range of wave conditions relevant to the target application and WEC site, and it will be crucial to ensure that the vessel of choice has high-accuracy position and heading instrumentation (commercially available, but not standard on commercial fishing vessels). Second, the model physics in the wave model backbone should be expanded to include some nonlinear effects; specifically, the third-order correction to the wave speed due to amplitude dispersion would be the next step toward more accurately representing the phase speeds of large-amplitude waves.
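The forward-propagation step of such a system rests on linear wave theory: each spectral component advances at a phase speed set by the dispersion relation ω² = gk·tanh(kh). The sketch below is a minimal one-dimensional illustration of that idea, not the project's Wavecast code; the depth, domain, initial wave field, and the assumption of unidirectional propagation are all ours.

```python
# Minimal sketch: phase-resolved forward propagation with a linear wave model.
# Assumes all components travel in the +x direction; parameters are illustrative.
import numpy as np

g = 9.81    # gravitational acceleration [m/s^2]
h = 60.0    # water depth [m], intermediate depth as at the test sites
L = 2048.0  # domain length [m]
n = 512     # grid points
dx = L / n
x = np.arange(n) * dx

# Illustrative initial condition: a few random-phase swell components.
rng = np.random.default_rng(0)
eta0 = sum(a * np.cos(2 * np.pi * x / lam + rng.uniform(0, 2 * np.pi))
           for a, lam in [(0.5, 150.0), (0.3, 90.0), (0.2, 60.0)])

def propagate(eta, dt):
    """Advance the wave field by dt using linear dispersion in Fourier space."""
    k = 2 * np.pi * np.fft.rfftfreq(n, d=dx)    # wavenumbers [rad/m]
    omega = np.sqrt(g * k * np.tanh(k * h))     # linear dispersion relation
    spec = np.fft.rfft(eta)
    return np.fft.irfft(spec * np.exp(-1j * omega * dt), n=n)

eta_forecast = propagate(eta0, dt=30.0)  # wave field 30 s ahead
```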
Stochastic Model of Seasonal Runoff Forecasts
NASA Astrophysics Data System (ADS)
Krzysztofowicz, Roman; Watada, Leslie M.
1986-03-01
Each year the National Weather Service and the Soil Conservation Service issue a monthly sequence of five (or six) categorical forecasts of the seasonal snowmelt runoff volume. To describe uncertainties in these forecasts for the purposes of optimal decision making, a stochastic model is formulated. It is a discrete-time, finite, continuous-space, nonstationary Markov process. Posterior densities of the actual runoff conditional upon a forecast, and transition densities of forecasts are obtained from a Bayesian information processor. Parametric densities are derived for the process with a normal prior density of the runoff and a linear model of the forecast error. The structure of the model and the estimation procedure are motivated by analyses of forecast records from five stations in the Snake River basin, from the period 1971-1983. The advantages of supplementing the current forecasting scheme with a Bayesian analysis are discussed.
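As a concrete illustration of the normal-linear case described above: with a normal prior on runoff, W ~ N(μ, σ²), and a forecast modeled as F = aW + b + ε with ε ~ N(0, τ²), the posterior of runoff given a forecast is again normal and available in closed form. The sketch below implements that conjugate update; the parameter values are illustrative, not estimates from the Snake River records.

```python
# Minimal sketch of the normal-linear Bayesian processor: prior W ~ N(mu, sigma^2),
# forecast F = a*W + b + eps, eps ~ N(0, tau^2). Parameter values are assumptions.
import numpy as np

def posterior_runoff(f, mu, sigma, a, b, tau):
    """Posterior mean and standard deviation of runoff W given forecast value f."""
    prec = 1.0 / sigma**2 + a**2 / tau**2                    # posterior precision
    mean = (mu / sigma**2 + a * (f - b) / tau**2) / prec     # precision-weighted mean
    return mean, np.sqrt(1.0 / prec)

# Example: prior mean runoff 500, a slightly biased and noisy forecast of 620.
m, s = posterior_runoff(f=620.0, mu=500.0, sigma=120.0, a=1.0, b=20.0, tau=80.0)
```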
Time-Series Forecast Modeling on High-Bandwidth Network Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Wucherl; Sim, Alex
2016-06-24
With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate ever-increasing data volumes for large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with a traditional approach such as the Box-Jenkins methodology for training the ARIMA model, our forecast model reduces computation time by up to 92.6%. It also shows resilience against abrupt changes in network usage. Finally, our forecast model conducts a large number of multi-step forecasts, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.
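A minimal sketch of the STL-plus-ARIMA pattern described above, using statsmodels: decompose the series, model the deseasonalized part with ARIMA, and add the seasonal cycle back onto the multi-step forecast. The synthetic hourly series, period, and ARIMA order are illustrative stand-ins for the paper's SNMP measurements and tuned model orders.

```python
# Sketch of STL + ARIMA forecasting on a synthetic utilization series.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL
from statsmodels.tsa.arima.model import ARIMA

# Synthetic hourly utilization with a daily (24 h) cycle plus noise.
rng = np.random.default_rng(1)
t = np.arange(24 * 60)
y = pd.Series(50 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 3, t.size))

stl = STL(y, period=24).fit()                   # trend / seasonal / residual
deseason = y - stl.seasonal                     # remove the seasonal component
res = ARIMA(deseason, order=(1, 1, 1)).fit()    # model the deseasonalized series

steps = 24
fc = res.forecast(steps=steps)                  # multi-step forecast
seasonal_next = stl.seasonal.iloc[-24:].to_numpy()  # repeat the last seasonal cycle
forecast = fc.to_numpy() + seasonal_next[:steps]
```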
Model Error Estimation for the CPTEC Eta Model
NASA Technical Reports Server (NTRS)
Tippett, Michael K.; daSilva, Arlindo
1999-01-01
Statistical data assimilation systems require the specification of forecast and observation error statistics. Forecast error is due to model imperfections and to differences between the initial condition and the actual state of the atmosphere. Practical four-dimensional variational (4D-Var) methods try to fit the forecast state to the observations and assume that the model error is negligible. Here, with a number of simplifying assumptions, a framework is developed for isolating the model error given the forecast error at two lead times. Two definitions are proposed for the Talagrand ratio tau, the fraction of the forecast error due to model error rather than initial condition error. Data from the CPTEC Eta Model running operationally over South America are used to calculate forecast error statistics and lower bounds for tau.
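The paper's exact definitions of tau are not reproduced here, but the underlying idea can be illustrated with a toy variance decomposition: if model error and initial-condition error are uncorrelated and each grows by an assumed factor between the two lead times, the two error variances can be separated from sample forecast-error variances at those leads. The growth factors below are purely illustrative assumptions, not the paper's derivation.

```python
# Toy sketch: isolating model error from forecast error variances at two leads.
import numpy as np

def split_error_variance(v1, v2, ic_growth=3.0, model_growth=2.0):
    """Solve v1 = v_ic + v_m and v2 = ic_growth*v_ic + model_growth*v_m
    for initial-condition (v_ic) and model (v_m) error variances."""
    A = np.array([[1.0, 1.0], [ic_growth, model_growth]])
    v_ic, v_m = np.linalg.solve(A, np.array([v1, v2]))
    return v_ic, v_m

# Example: sample forecast error variances at leads t1 and t2.
v_ic, v_m = split_error_variance(v1=1.0, v2=2.4)
tau_t2 = 2.0 * v_m / 2.4   # fraction of error variance at t2 due to model error
```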
Section on Observed Impacts on El Nino
NASA Technical Reports Server (NTRS)
Rosenzweig, Cynthia
2000-01-01
Agricultural applications of El Nino forecasts are already underway in some countries and need to be evaluated or re-evaluated. For example, in Peru, El Nino forecasts have been incorporated into national planning for the agricultural sector, and areas planted with rice and cotton (cotton being the more drought-tolerant crop) are adjusted accordingly. How well are these and other such programs working? Such evaluations will contribute to the governmental and intergovernmental institutions, including the Inter-American Institute for Global Change Research and the US National Oceanic and Atmospheric Administration, that are fostering programs to aid the effective use of forecasts. As El Nino climate forecasting grows out of research mode into operational mode, the research focus shifts to include the design of appropriate modes of utilization. Awareness of and sensitivity to the costs of prediction errors also grow. For example, one major forecasting model failed to predict the very large El Nino event of 1997, when Pacific sea-surface temperatures were the highest on record. Although simple correlations between El Nino events and crop yields may be suggestive, more sophisticated work is needed to understand the subtleties of the interplay among the global climate system, regional climate patterns, and local agricultural systems. Honesty about the limitations of a forecast is essential, especially when human livelihoods are at stake. An end-to-end analysis links tools and expertise from the full sequence of ENSO cause-and-effect processes. Representatives from many disciplines are needed to achieve insights, e.g., oceanographers and atmospheric scientists who predict El Nino events, climatologists who drive global climate models with sea-surface temperature predictions, agronomists who translate regional climate connections into crop yield forecasts, and economists who analyze market adjustments to the vagaries of climate and determine the value of climate forecasts. Methods include historical studies to understand past patterns and to test hindcasts of the prediction tools, crop modeling, spatial analysis, and remote sensing. This research involves expanding, deepening, and applying the understanding of physical climate to the fields of agronomy and social science, and the reciprocal understanding of crop growth and farm economics to climatology. Delivery of a regional climate forecast with no information about how the forecast was derived limits its effectiveness. Explanation of a region's major climate driving forces helps place a seasonal forecast in context. A useful approach is then to show historical responses to previous El Nino events, along with projections, with uncertainty intervals, of crop response from dynamic process crop growth models. Regional forecasts should be updated with real-time weather conditions. Since every El Nino event is different, it is important to track, report, and advise on each new event as it unfolds. The stability of human enterprises depends on understanding both the potentialities and the limits of predictability. Farmers rely on past experience to anticipate and respond to fluctuations in the biophysical systems on which their livelihoods depend. Now scientists are improving their ability to predict some major elements of climate variability.
The improvements in the reliability of El Nino forecasts are encouraging, but seasonal forecasts for agriculture are not, and probably never will be, completely infallible, owing to the chaotic nature of the climate system. Uncertainties proliferate as we extend beyond Pacific sea-surface temperatures to climate teleconnections and agricultural outcomes. The goal of this research is to shed as clear a light as possible on these inherent uncertainties and thus to contribute to the development of appropriate responses to El Nino and other seasonal forecasts for a range of stakeholders, which ultimately includes food consumers everywhere.
Research on Nonlinear Time Series Forecasting of Time-Delay NN Embedded with Bayesian Regularization
NASA Astrophysics Data System (ADS)
Jiang, Weijin; Xu, Yusheng; Xu, Yuhui; Wang, Jianmin
Based on the idea of nonlinear prediction via phase-space reconstruction, this paper presents a time-delay BP neural network model whose generalization capability is improved by Bayesian regularization. The model is applied to forecasting import and export trade in one industry. The results show that the improved model has excellent generalization capability: it not only learns the historical curve but also efficiently predicts business trends. Compared with common forecast evaluations, we conclude that nonlinear forecasting can do more than combine data and improve precision; it can also reflect the nonlinear characteristics of the system being forecast. In analyzing the forecasting precision of the model, we assess it by calculating the nonlinear characteristic values of the combined and original series, showing that the forecasting model reasonably captures the dynamic characteristics of the nonlinear system that produced the original series.
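As a rough illustration of the time-delay architecture (not the authors' model), the sketch below builds a phase-space-style delay embedding and fits a small neural network, using an L2 weight penalty as a simple stand-in for full Bayesian regularization; the series, embedding dimension, and hyper-parameters are all assumed.

```python
# Sketch: delay embedding + small regularized MLP for one-step-ahead forecasting.
import numpy as np
from sklearn.neural_network import MLPRegressor

def delay_embed(series, d):
    """Build (X, y) pairs: d lagged values -> next value (phase-space style)."""
    X = np.array([series[i:i + d] for i in range(len(series) - d)])
    return X, series[d:]

rng = np.random.default_rng(2)
t = np.arange(300)
series = np.sin(0.2 * t) + 0.1 * rng.normal(size=t.size)  # toy noisy series

X, y = delay_embed(series, d=6)
model = MLPRegressor(hidden_layer_sizes=(8,), alpha=0.1,  # alpha = L2 penalty
                     max_iter=5000, random_state=0).fit(X[:-50], y[:-50])
pred = model.predict(X[-50:])                             # out-of-sample check
```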
A scoping review of malaria forecasting: past work and future directions
Zinszer, Kate; Verma, Aman D; Charland, Katia; Brewer, Timothy F; Brownstein, John S; Sun, Zhuoyu; Buckeridge, David L
2012-01-01
Objectives: There is a growing body of literature on malaria forecasting methods, and the objective of our review is to identify and assess methods, including predictors, used to forecast malaria. Design: Scoping review. Two independent reviewers searched information sources, assessed studies for inclusion and extracted data from each study. Information sources: Search strategies were developed and the following databases were searched: CAB Abstracts, EMBASE, Global Health, MEDLINE, ProQuest Dissertations & Theses and Web of Science. Key journals and websites were also manually searched. Eligibility criteria for included studies: We included studies that forecasted incidence, prevalence or epidemics of malaria over time. A description of the forecasting model and an assessment of the forecast accuracy of the model were requirements for inclusion. Studies were restricted to human populations and to autochthonous transmission settings. Results: We identified 29 different studies that met our inclusion criteria for this review. The forecasting approaches included statistical modelling, mathematical modelling and machine learning methods. Climate-related predictors were used consistently in forecasting models, with the most common predictors being rainfall, relative humidity, temperature and the normalised difference vegetation index. Model evaluation was typically based on a reserved portion of data and accuracy was measured in a variety of ways including mean-squared error and correlation coefficients. We could not compare the forecast accuracy of models from the different studies as the evaluation measures differed across the studies. Conclusions: Applying different forecasting methods to the same data, exploring the predictive ability of non-environmental variables, including transmission reducing interventions and using common forecast accuracy measures will allow malaria researchers to compare and improve models and methods, which should improve the quality of malaria forecasting. PMID:23180505
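The evaluation pattern the review describes, scoring forecasts against a reserved portion of the data with mean-squared error and a correlation coefficient, can be captured in a few lines; the sketch below is a generic illustration, not code from any reviewed study.

```python
# Sketch: common forecast accuracy measures computed on a reserved test set.
import numpy as np

def evaluate_forecast(y_true, y_pred):
    mse = float(np.mean((y_true - y_pred) ** 2))
    corr = float(np.corrcoef(y_true, y_pred)[0, 1])
    return {"mse": mse, "correlation": corr}

# e.g. reserve the final 20% of a case-count series for evaluation:
cases = np.array([120, 95, 80, 130, 160, 140, 110, 90, 100, 150], float)
split = int(0.8 * len(cases))
train, test = cases[:split], cases[split:]
naive_pred = np.full_like(test, train[-1])       # placeholder forecast
scores = evaluate_forecast(test, naive_pred)
```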
NASA Technical Reports Server (NTRS)
Molthan, Andrew; Case, Jonathan; Venner, Jason; Moreno-Madrinan, Max J.; Delgado, Francisco
2012-01-01
Two projects at NASA Marshall Space Flight Center have collaborated to develop a high-resolution weather forecast model for Mesoamerica: the NASA Short-term Prediction Research and Transition (SPoRT) Center, which integrates unique NASA satellite and weather forecast modeling capabilities into the operational weather forecasting community, and NASA's SERVIR Program, which integrates satellite observations, ground-based data, and forecast models to improve disaster response in Central America, the Caribbean, Africa, and the Himalayas.
A study for systematic errors of the GLA forecast model in tropical regions
NASA Technical Reports Server (NTRS)
Chen, Tsing-Chang; Baker, Wayman E.; Pfaendtner, James; Corrigan, Martin
1988-01-01
Sensitivity studies performed with the Goddard Laboratory for Atmospheres (GLA) analysis/forecast system revealed that forecast errors in the tropics can, in some cases, degrade the ability to forecast midlatitude weather; apparently, forecast errors occurring in the tropics can propagate to midlatitudes. Systematic error analysis of the GLA forecast system is therefore a necessary step in improving the model's forecast performance. The major effort of this study is to examine the possible impact of hydrological-cycle forecast errors on dynamical fields in the GLA forecast system.
Lu, Wei-Zhen; Wang, Wen-Jian
2005-04-01
Monitoring and forecasting of air quality parameters are important topics of atmospheric and environmental research today, owing to the health impacts of exposure to the air pollutants present in urban air. Accurate models for air pollutant prediction are needed because such models allow forecasting and diagnosing potential compliance or non-compliance over both short and long terms. Artificial neural networks (ANNs) are regarded as a reliable and cost-effective method for such tasks and have produced some promising results to date. Although ANNs have attracted much attention from environmental researchers, their inherent drawbacks, e.g., local minima, over-fitting during training, poor generalization performance, and the difficulty of determining an appropriate network architecture, impede their practical application. The support vector machine (SVM), a novel type of learning machine based on statistical learning theory, can be used for regression and time-series prediction and has been reported to perform well. The work presented in this paper examines the feasibility of applying SVM to predict air pollutant levels in advancing time series, based on the monitored air pollutant database of the Hong Kong downtown area; the functional characteristics of SVM are investigated at the same time. Experimental comparisons between the SVM model and the classical radial basis function (RBF) network demonstrate that the SVM is superior to the conventional RBF network in predicting air quality parameters over different time horizons and has better generalization performance than the RBF model.
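A minimal sketch of the general approach, next-step pollutant prediction from lagged measurements with an RBF-kernel support vector regressor: the synthetic series, lag window, and hyper-parameters are illustrative assumptions rather than the study's settings.

```python
# Sketch: SVM regression (RBF kernel) for next-step air-pollutant prediction.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
t = np.arange(500)
conc = 40 + 15 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 4, t.size)  # toy hourly series

lags = 24
X = np.array([conc[i:i + lags] for i in range(len(conc) - lags)])  # lagged inputs
y = conc[lags:]                                                    # next-step target

svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
svr.fit(X[:-48], y[:-48])
y_hat = svr.predict(X[-48:])   # last two days held out for comparison
```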
An Integrated Enrollment Forecast Model. IR Applications, Volume 15, January 18, 2008
ERIC Educational Resources Information Center
Chen, Chau-Kuang
2008-01-01
Enrollment forecasting is the central component of effective budget and program planning. The integrated enrollment forecast model is developed to achieve a better understanding of the variables affecting student enrollment and, ultimately, to perform accurate forecasts. The transfer function model of the autoregressive integrated moving average…
Can we use Earth Observations to improve monthly water level forecasts?
NASA Astrophysics Data System (ADS)
Slater, L. J.; Villarini, G.
2017-12-01
Dynamical-statistical hydrologic forecasting approaches offer different strengths compared with traditional hydrologic forecasting systems: they are computationally efficient, can integrate and 'learn' from a broad selection of input data (e.g., General Circulation Model (GCM) forecasts, Earth Observation time series, teleconnection patterns), and can take advantage of recent progress in machine learning (e.g., multi-model blending, post-processing and ensembling techniques). Recent efforts to develop a dynamical-statistical ensemble approach for forecasting seasonal streamflow using both GCM forecasts and changing land cover have shown promising results over the U.S. Midwest. Here, we use climate forecasts from several GCMs of the North American Multi Model Ensemble (NMME) alongside 15-minute stage time series from the National River Flow Archive (NRFA) and land cover classes extracted from the European Space Agency's Climate Change Initiative 300 m annual Global Land Cover time series. With these data, we conduct systematic long-range probabilistic forecasting of monthly water levels in UK catchments over timescales ranging from one to twelve months ahead. We evaluate the improvement in model fit and model forecasting skill that comes from using land cover classes as predictors in the models. This work opens up new possibilities for combining Earth Observation time series with GCM forecasts to predict a variety of hazards from space using data science techniques.
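As a toy illustration of the dynamical-statistical idea, regressing a monthly water level on a GCM precipitation forecast plus a slowly varying land-cover predictor: everything below (the data, the predictor names, the linear form) is an assumed placeholder, not the authors' model.

```python
# Sketch: statistical water-level forecast driven by GCM and land-cover predictors.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
n_months = 240
gcm_precip = rng.gamma(2.0, 30.0, n_months)      # NMME-style precip forecast [mm]
urban_frac = np.linspace(0.10, 0.18, n_months)   # slowly changing land cover
level = 0.01 * gcm_precip + 4.0 * urban_frac + rng.normal(0, 0.2, n_months)

X = np.column_stack([gcm_precip, urban_frac])
model = LinearRegression().fit(X[:-12], level[:-12])  # train on all but last year
forecast = model.predict(X[-12:])                     # 1-12 months ahead
```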
NASA Astrophysics Data System (ADS)
Schepen, Andrew; Zhao, Tongtiegang; Wang, Quan J.; Robertson, David E.
2018-03-01
Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
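The Schaake Shuffle itself is simple to state: at each lead time, reorder the forecast ensemble so its rank structure matches that of a set of historical trajectories, which restores realistic temporal correlation across the daily ensemble members. A minimal sketch, with illustrative data:

```python
# Sketch of the Schaake Shuffle for connecting ensemble members across lead times.
import numpy as np

def schaake_shuffle(fcst, hist):
    """fcst, hist: arrays of shape (n_members, n_leads). Returns the forecast
    values rearranged so each member follows the rank pattern of one
    historical trajectory at every lead time."""
    out = np.empty_like(fcst)
    for j in range(fcst.shape[1]):
        ranks = hist[:, j].argsort().argsort()   # rank of each hist member at lead j
        out[:, j] = np.sort(fcst[:, j])[ranks]   # matching forecast quantile
    return out

# Example with illustrative ensembles (100 members, 14 daily lead times):
rng = np.random.default_rng(5)
shuffled = schaake_shuffle(rng.gamma(2, 5, (100, 14)), rng.gamma(2, 5, (100, 14)))
```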
Forecasting of indirect consumables for a Job Shop
NASA Astrophysics Data System (ADS)
Shakeel, M.; Khan, S.; Khan, W. A.
2016-08-01
A job shop has an arrangement in which similar machines (direct consumables) are grouped together and use indirect consumables to produce a product. Indirect consumables include hack saw blades, emery paper, paint brushes, etc. A job shop serves various orders at any given time, so for its optimal operation forecasting is required to predict the demand for direct and indirect consumables. Forecasting is also needed to manage lead times, optimize inventory costs, and avoid stock-outs. The objective of this research is to obtain forecasts for indirect consumables. The paper shows how a job shop can manage its indirect consumables more accurately by establishing a new forecasting technique, resulting in more profitable use of the job shop by multiple users.
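The abstract does not specify the forecasting technique, so as a neutral placeholder the sketch below applies simple exponential smoothing to the monthly demand for one indirect consumable; the figures are made up.

```python
# Sketch: simple exponential smoothing for indirect-consumable demand.
def exp_smooth_forecast(demand, alpha=0.3):
    """One-step-ahead forecast: the level is a weighted running average in which
    recent observations carry weight alpha."""
    level = demand[0]
    for d in demand[1:]:
        level = alpha * d + (1 - alpha) * level
    return level

blades_per_month = [42, 39, 51, 47, 44, 58, 50, 46]   # illustrative demand history
next_month = exp_smooth_forecast(blades_per_month)    # forecast for next month
```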