Worldwide satellite market demand forecast
NASA Technical Reports Server (NTRS)
Bowyer, J. M.; Frankfort, M.; Steinnagel, K. M.
1981-01-01
The forecast is for the years 1981 - 2000 with benchmark years at 1985, 1990 and 2000. Two typs of markets are considered for this study: Hardware (worldwide total) - satellites, earth stations and control facilities (includes replacements and spares); and non-hardware (addressable by U.S. industry) - planning, launch, turnkey systems and operations. These markets were examined for the INTELSAT System (international systems and domestic and regional systems using leased transponders) and domestic and regional systems. Forecasts were determined for six worldwide regions encompassing 185 countries using actual costs for existing equipment and engineering estimates of costs for advanced systems. Most likely (conservative growth rate estimates) and optimistic (mid range growth rate estimates) scenarios were employed for arriving at the forecasts which are presented in constant 1980 U.S. dollars. The worldwide satellite market demand forecast predicts that the market between 181 and 2000 will range from $35 to $50 billion. Approximately one-half of the world market, $16 to $20 billion, will be generated in the United States.
Worldwide satellite market demand forecast
NASA Astrophysics Data System (ADS)
Bowyer, J. M.; Frankfort, M.; Steinnagel, K. M.
1981-06-01
The forecast is for the years 1981 - 2000 with benchmark years at 1985, 1990 and 2000. Two typs of markets are considered for this study: Hardware (worldwide total) - satellites, earth stations and control facilities (includes replacements and spares); and non-hardware (addressable by U.S. industry) - planning, launch, turnkey systems and operations. These markets were examined for the INTELSAT System (international systems and domestic and regional systems using leased transponders) and domestic and regional systems. Forecasts were determined for six worldwide regions encompassing 185 countries using actual costs for existing equipment and engineering estimates of costs for advanced systems. Most likely (conservative growth rate estimates) and optimistic (mid range growth rate estimates) scenarios were employed for arriving at the forecasts which are presented in constant 1980 U.S. dollars. The worldwide satellite market demand forecast predicts that the market between 181 and 2000 will range from $35 to $50 billion. Approximately one-half of the world market, $16 to $20 billion, will be generated in the United States.
Forecasting the mortality rates using Lee-Carter model and Heligman-Pollard model
NASA Astrophysics Data System (ADS)
Ibrahim, R. I.; Ngataman, N.; Abrisam, W. N. A. Wan Mohd
2017-09-01
Improvement in life expectancies has driven further declines in mortality. The sustained reduction in mortality rates and its systematic underestimation has been attracting the significant interest of researchers in recent years because of its potential impact on population size and structure, social security systems, and (from an actuarial perspective) the life insurance and pensions industry worldwide. Among all forecasting methods, the Lee-Carter model has been widely accepted by the actuarial community and Heligman-Pollard model has been widely used by researchers in modelling and forecasting future mortality. Therefore, this paper only focuses on Lee-Carter model and Heligman-Pollard model. The main objective of this paper is to investigate how accurately these two models will perform using Malaysian data. Since these models involves nonlinear equations that are explicitly difficult to solve, the Matrix Laboratory Version 8.0 (MATLAB 8.0) software will be used to estimate the parameters of the models. Autoregressive Integrated Moving Average (ARIMA) procedure is applied to acquire the forecasted parameters for both models as the forecasted mortality rates are obtained by using all the values of forecasted parameters. To investigate the accuracy of the estimation, the forecasted results will be compared against actual data of mortality rates. The results indicate that both models provide better results for male population. However, for the elderly female population, Heligman-Pollard model seems to underestimate to the mortality rates while Lee-Carter model seems to overestimate to the mortality rates.
Statistical control in hydrologic forecasting.
H.G. Wilm
1950-01-01
With rapidly growing development and uses of water, a correspondingly great demand has developed for advance estimates of the volumes or rates of flow which are supplied by streams. Therefore much attention is being devoted to hydrologic forecasting, and numerous methods have been tested in efforts to make increasingly reliable estimates of future supplies.
van Baal, Pieter H; Wong, Albert
2012-12-01
Although the effect of time to death (TTD) on health care expenditures (HCE) has been investigated using individual level data, the most profound implications of TTD have been for the forecasting of macro-level HCE. Here we estimate the TTD model using macro-level data from the Netherlands consisting of mortality rates and age- and gender-specific per capita health expenditures for the years 1981-2007. Forecasts for the years 2008-2020 of this macro-level TTD model were compared to forecasts that excluded TTD. Results revealed that the effect of TTD on HCE in our macro model was similar to those found in micro-econometric studies. As the inclusion of TTD pushed growth rate estimates from unidentified causes upwards, however, the two models' forecasts of HCE for the 2008-2020 were similar. We argue that including TTD, if modeled correctly, does not lower forecasts of HCE. Copyright © 2012 Elsevier B.V. All rights reserved.
Forecasting the mortality rates of Malaysian population using Heligman-Pollard model
NASA Astrophysics Data System (ADS)
Ibrahim, Rose Irnawaty; Mohd, Razak; Ngataman, Nuraini; Abrisam, Wan Nur Azifah Wan Mohd
2017-08-01
Actuaries, demographers and other professionals have always been aware of the critical importance of mortality forecasting due to declining trend of mortality and continuous increases in life expectancy. Heligman-Pollard model was introduced in 1980 and has been widely used by researchers in modelling and forecasting future mortality. This paper aims to estimate an eight-parameter model based on Heligman and Pollard's law of mortality. Since the model involves nonlinear equations that are explicitly difficult to solve, the Matrix Laboratory Version 7.0 (MATLAB 7.0) software will be used in order to estimate the parameters. Statistical Package for the Social Sciences (SPSS) will be applied to forecast all the parameters according to Autoregressive Integrated Moving Average (ARIMA). The empirical data sets of Malaysian population for period of 1981 to 2015 for both genders will be considered, which the period of 1981 to 2010 will be used as "training set" and the period of 2011 to 2015 as "testing set". In order to investigate the accuracy of the estimation, the forecast results will be compared against actual data of mortality rates. The result shows that Heligman-Pollard model fit well for male population at all ages while the model seems to underestimate the mortality rates for female population at the older ages.
Smirnova, Alexandra; deCamp, Linda; Chowell, Gerardo
2017-05-02
Deterministic and stochastic methods relying on early case incidence data for forecasting epidemic outbreaks have received increasing attention during the last few years. In mathematical terms, epidemic forecasting is an ill-posed problem due to instability of parameter identification and limited available data. While previous studies have largely estimated the time-dependent transmission rate by assuming specific functional forms (e.g., exponential decay) that depend on a few parameters, here we introduce a novel approach for the reconstruction of nonparametric time-dependent transmission rates by projecting onto a finite subspace spanned by Legendre polynomials. This approach enables us to effectively forecast future incidence cases, the clear advantage over recovering the transmission rate at finitely many grid points within the interval where the data are currently available. In our approach, we compare three regularization algorithms: variational (Tikhonov's) regularization, truncated singular value decomposition (TSVD), and modified TSVD in order to determine the stabilizing strategy that is most effective in terms of reliability of forecasting from limited data. We illustrate our methodology using simulated data as well as case incidence data for various epidemics including the 1918 influenza pandemic in San Francisco and the 2014-2015 Ebola epidemic in West Africa.
Using strain rates to forecast seismic hazards
Evans, Eileen
2017-01-01
One essential component in forecasting seismic hazards is observing the gradual accumulation of tectonic strain accumulation along faults before this strain is suddenly released as earthquakes. Typically, seismic hazard models are based on geologic estimates of slip rates along faults and historical records of seismic activity, neither of which records actively accumulating strain. But this strain can be estimated by geodesy: the precise measurement of tiny position changes of Earth’s surface, obtained from GPS, interferometric synthetic aperture radar (InSAR), or a variety of other instruments.
NASA Technical Reports Server (NTRS)
Raymond, William H.; Olson, William S.; Callan, Geary
1995-01-01
In this study, diabatic forcing, and liquid water assimilation techniques are tested in a semi-implicit hydrostatic regional forecast model containing explicit representations of grid-scale cloud water and rainwater. Diabatic forcing, in conjunction with diabatic contributions in the initialization, is found to help the forecast retain the diabatic signal found in the liquid water or heating rate data, consequently reducing the spinup time associated with grid-scale precipitation processes. Both observational Special Sensor Microwave/Imager (SSM/I) and model-generated data are used. A physical retrieval method incorporating SSM/I radiance data is utilized to estimate the 3D distribution of precipitating storms. In the retrieval method the relationship between precipitation distributions and upwelling microwave radiances is parameterized, based upon cloud ensemble-radiative model simulations. Regression formulae relating vertically integrated liquid and ice-phase precipitation amounts to latent heating rates are also derived from the cloud ensemble simulations. Thus, retrieved SSM/I precipitation structures can be used in conjunction with the regression-formulas to infer the 3D distribution of latent heating rates. These heating rates are used directly in the forecast model to help initiate Tropical Storm Emily (21 September 1987). The 14-h forecast of Emily's development yields atmospheric precipitation water contents that compare favorably with coincident SSM/I estimates.
Entropy Econometrics for combining regional economic forecasts: A Data-Weighted Prior Estimator
NASA Astrophysics Data System (ADS)
Fernández-Vázquez, Esteban; Moreno, Blanca
2017-10-01
Forecast combination has been studied in econometrics for a long time, and the literature has shown the superior performance of forecast combination over individual predictions. However, there is still controversy on which is the best procedure to specify the forecast weights. This paper explores the possibility of using a procedure based on Entropy Econometrics, which allows setting the weights for the individual forecasts as a mixture of different alternatives. In particular, we examine the ability of the Data-Weighted Prior Estimator proposed by Golan (J Econom 101(1):165-193, 2001) to combine forecasting models in a context of small sample sizes, a relative common scenario when dealing with time series for regional economies. We test the validity of the proposed approach using a simulation exercise and a real-world example that aims at predicting gross regional product growth rates for a regional economy. The forecasting performance of the Data-Weighted Prior Estimator proposed is compared with other combining methods. The simulation results indicate that in scenarios of heavily ill-conditioned datasets the approach suggested dominates other forecast combination strategies. The empirical results are consistent with the conclusions found in the numerical experiment.
NASA Astrophysics Data System (ADS)
Wardah, T.; Abu Bakar, S. H.; Bardossy, A.; Maznorizan, M.
2008-07-01
SummaryFrequent flash-floods causing immense devastation in the Klang River Basin of Malaysia necessitate an improvement in the real-time forecasting systems being used. The use of meteorological satellite images in estimating rainfall has become an attractive option for improving the performance of flood forecasting-and-warning systems. In this study, a rainfall estimation algorithm using the infrared (IR) information from the Geostationary Meteorological Satellite-5 (GMS-5) is developed for potential input in a flood forecasting system. Data from the records of GMS-5 IR images have been retrieved for selected convective cells to be trained with the radar rain rate in a back-propagation neural network. The selected data as inputs to the neural network, are five parameters having a significant correlation with the radar rain rate: namely, the cloud-top brightness-temperature of the pixel of interest, the mean and the standard deviation of the temperatures of the surrounding five by five pixels, the rate of temperature change, and the sobel operator that indicates the temperature gradient. In addition, three numerical weather prediction (NWP) products, namely the precipitable water content, relative humidity, and vertical wind, are also included as inputs. The algorithm is applied for the areal rainfall estimation in the upper Klang River Basin and compared with another technique that uses power-law regression between the cloud-top brightness-temperature and radar rain rate. Results from both techniques are validated against previously recorded Thiessen areal-averaged rainfall values with coefficient correlation values of 0.77 and 0.91 for the power-law regression and the artificial neural network (ANN) technique, respectively. An extra lead time of around 2 h is gained when the satellite-based ANN rainfall estimation is coupled with a rainfall-runoff model to forecast a flash-flood event in the upper Klang River Basin.
Evaluation of flash-flood discharge forecasts in complex terrain using precipitation
Yates, D.; Warner, T.T.; Brandes, E.A.; Leavesley, G.H.; Sun, Jielun; Mueller, C.K.
2001-01-01
Operational prediction of flash floods produced by thunderstorm (convective) precipitation in mountainous areas requires accurate estimates or predictions of the precipitation distribution in space and time. The details of the spatial distribution are especially critical in complex terrain because the watersheds are generally small in size, and small position errors in the forecast or observed placement of the precipitation can distribute the rain over the wrong watershed. In addition to the need for good precipitation estimates and predictions, accurate flood prediction requires a surface-hydrologic model that is capable of predicting stream or river discharge based on the precipitation-rate input data. Different techniques for the estimation and prediction of convective precipitation will be applied to the Buffalo Creek, Colorado flash flood of July 1996, where over 75 mm of rain from a thunderstorm fell on the watershed in less than 1 h. The hydrologic impact of the precipitation was exacerbated by the fact that a significant fraction of the watershed experienced a wildfire approximately two months prior to the rain event. Precipitation estimates from the National Weather Service's operational Weather Surveillance Radar-Doppler 1988 and the National Center for Atmospheric Research S-band, research, dual-polarization radar, colocated to the east of Denver, are compared. In addition, very short range forecasts from a convection-resolving dynamic model, which is initialized variationally using the radar reflectivity and Doppler winds, are compared with forecasts from an automated-algorithmic forecast system that also employs the radar data. The radar estimates of rain rate, and the two forecasting systems that employ the radar data, have degraded accuracy by virtue of the fact that they are applied in complex terrain. Nevertheless, the radar data and forecasts from the dynamic model and the automated algorithm could be operationally useful for input to surface-hydrologic models employed for flood warning. Precipitation data provided by these various techniques at short time scales and at fine spatial resolutions are employed as detailed input to a distributed-parameter hydrologic model for flash-flood prediction and analysis. With the radar-based precipitation estimates employed as input, the simulated flood discharge was similar to that observed. The dynamic-model precipitation forecast showed the most promise in providing a significant discharge-forecast lead time. The algorithmic system's precipitation forecast did not demonstrate as much skill, but the associated discharge forecast would still have been sufficient to have provided an alert of impending flood danger.
Assessing a 3D smoothed seismicity model of induced earthquakes
NASA Astrophysics Data System (ADS)
Zechar, Jeremy; Király, Eszter; Gischig, Valentin; Wiemer, Stefan
2016-04-01
As more energy exploration and extraction efforts cause earthquakes, it becomes increasingly important to control induced seismicity. Risk management schemes must be improved and should ultimately be based on near-real-time forecasting systems. With this goal in mind, we propose a test bench to evaluate models of induced seismicity based on metrics developed by the CSEP community. To illustrate the test bench, we consider a model based on the so-called seismogenic index and a rate decay; to produce three-dimensional forecasts, we smooth past earthquakes in space and time. We explore four variants of this model using the Basel 2006 and Soultz-sous-Forêts 2004 datasets to make short-term forecasts, test their consistency, and rank the model variants. Our results suggest that such a smoothed seismicity model is useful for forecasting induced seismicity within three days, and giving more weight to recent events improves forecast performance. Moreover, the location of the largest induced earthquake is forecast well by this model. Despite the good spatial performance, the model does not estimate the seismicity rate well: it frequently overestimates during stimulation and during the early post-stimulation period, and it systematically underestimates around shut-in. In this presentation, we also describe a robust estimate of information gain, a modification that can also benefit forecast experiments involving tectonic earthquakes.
Validation of the CME Geomagnetic Forecast Alerts Under the COMESEP Alert System
NASA Astrophysics Data System (ADS)
Dumbović, Mateja; Srivastava, Nandita; Rao, Yamini K.; Vršnak, Bojan; Devos, Andy; Rodriguez, Luciano
2017-08-01
Under the European Union 7th Framework Programme (EU FP7) project Coronal Mass Ejections and Solar Energetic Particles (COMESEP, http://comesep.aeronomy.be), an automated space weather alert system has been developed to forecast solar energetic particles (SEP) and coronal mass ejection (CME) risk levels at Earth. The COMESEP alert system uses the automated detection tool called Computer Aided CME Tracking (CACTus) to detect potentially threatening CMEs, a drag-based model (DBM) to predict their arrival, and a CME geoeffectiveness tool (CGFT) to predict their geomagnetic impact. Whenever CACTus detects a halo or partial halo CME and issues an alert, the DBM calculates its arrival time at Earth and the CGFT calculates its geomagnetic risk level. The geomagnetic risk level is calculated based on an estimation of the CME arrival probability and its likely geoeffectiveness, as well as an estimate of the geomagnetic storm duration. We present the evaluation of the CME risk level forecast with the COMESEP alert system based on a study of geoeffective CMEs observed during 2014. The validation of the forecast tool is made by comparing the forecasts with observations. In addition, we test the success rate of the automatic forecasts (without human intervention) against the forecasts with human intervention using advanced versions of the DBM and CGFT (independent tools available at the Hvar Observatory website, http://oh.geof.unizg.hr). The results indicate that the success rate of the forecast in its current form is unacceptably low for a realistic operation system. Human intervention improves the forecast, but the false-alarm rate remains unacceptably high. We discuss these results and their implications for possible improvement of the COMESEP alert system.
Forecasting seasonal outbreaks of influenza.
Shaman, Jeffrey; Karspeck, Alicia
2012-12-11
Influenza recurs seasonally in temperate regions of the world; however, our ability to predict the timing, duration, and magnitude of local seasonal outbreaks of influenza remains limited. Here we develop a framework for initializing real-time forecasts of seasonal influenza outbreaks, using a data assimilation technique commonly applied in numerical weather prediction. The availability of real-time, web-based estimates of local influenza infection rates makes this type of quantitative forecasting possible. Retrospective ensemble forecasts are generated on a weekly basis following assimilation of these web-based estimates for the 2003-2008 influenza seasons in New York City. The findings indicate that real-time skillful predictions of peak timing can be made more than 7 wk in advance of the actual peak. In addition, confidence in those predictions can be inferred from the spread of the forecast ensemble. This work represents an initial step in the development of a statistically rigorous system for real-time forecast of seasonal influenza.
Forecasting seasonal outbreaks of influenza
Shaman, Jeffrey; Karspeck, Alicia
2012-01-01
Influenza recurs seasonally in temperate regions of the world; however, our ability to predict the timing, duration, and magnitude of local seasonal outbreaks of influenza remains limited. Here we develop a framework for initializing real-time forecasts of seasonal influenza outbreaks, using a data assimilation technique commonly applied in numerical weather prediction. The availability of real-time, web-based estimates of local influenza infection rates makes this type of quantitative forecasting possible. Retrospective ensemble forecasts are generated on a weekly basis following assimilation of these web-based estimates for the 2003–2008 influenza seasons in New York City. The findings indicate that real-time skillful predictions of peak timing can be made more than 7 wk in advance of the actual peak. In addition, confidence in those predictions can be inferred from the spread of the forecast ensemble. This work represents an initial step in the development of a statistically rigorous system for real-time forecast of seasonal influenza. PMID:23184969
Uncertainty in forecasts of long-run economic growth.
Christensen, P; Gillingham, K; Nordhaus, W
2018-05-22
Forecasts of long-run economic growth are critical inputs into policy decisions being made today on the economy and the environment. Despite its importance, there is a sparse literature on long-run forecasts of economic growth and the uncertainty in such forecasts. This study presents comprehensive probabilistic long-run projections of global and regional per-capita economic growth rates, comparing estimates from an expert survey and a low-frequency econometric approach. Our primary results suggest a median 2010-2100 global growth rate in per-capita gross domestic product of 2.1% per year, with a standard deviation (SD) of 1.1 percentage points, indicating substantially higher uncertainty than is implied in existing forecasts. The larger range of growth rates implies a greater likelihood of extreme climate change outcomes than is currently assumed and has important implications for social insurance programs in the United States.
Estimating the budget impact of orphan drugs in Sweden and France 2013–2020
2014-01-01
Background The growth in expenditure on orphan medicinal products (OMP) across Europe has been identified as a concern. Estimates of future expenditure in Europe have suggested that OMPs could account for a significant proportion of total pharmaceutical expenditure in some countries, but few of these forecasts have been well validated. This analysis aims to establish a robust forecast of the future budget impact of OMPs on the healthcare systems in Sweden and France. Methods A dynamic forecasting model was created to estimate the budget impact of OMPs in Sweden and France between 2013 and 2020. The model used historical data on OMP designation and approval rates to predict the number of new OMPs coming to the market. Average OMP sales were estimated for each year post-launch by regression analysis of historical sales data. Total forecast sales were compared with expected sales of all pharmaceuticals in each country to quantify the relative budget impact. Results The model predicts that by 2020, 152 OMPs will have marketing authorization in Europe. The base case OMP budget impacts are forecast to grow from 2.7% in Sweden and 3.2% in France of total drug expenditure in 2013 to 4.1% in Sweden and 4.9% in France by 2020. The principal driver of expenditure growth is the number of new OMPs obtaining OMP designation. This is tempered by the slowing success rate for new approvals and the loss of intellectual property protection on existing orphan medicines. Given the forward-looking nature of the analysis, uncertainty exists around model parameters and sensitivity analysis found peak year budget impact varying between 2% and 11%. Conclusion The budget impact of OMPs in Sweden and France is likely to remain sustainable over time and a relatively small proportion of total pharmaceutical expenditure. This forecast could be affected by changes in the success rate for OMP approvals, average cost of OMPs, and the type of companies developing OMPs. PMID:24524281
Estimating the budget impact of orphan drugs in Sweden and France 2013-2020.
Hutchings, Adam; Schey, Carina; Dutton, Richard; Achana, Felix; Antonov, Karolina
2014-02-13
The growth in expenditure on orphan medicinal products (OMP) across Europe has been identified as a concern. Estimates of future expenditure in Europe have suggested that OMPs could account for a significant proportion of total pharmaceutical expenditure in some countries, but few of these forecasts have been well validated. This analysis aims to establish a robust forecast of the future budget impact of OMPs on the healthcare systems in Sweden and France. A dynamic forecasting model was created to estimate the budget impact of OMPs in Sweden and France between 2013 and 2020. The model used historical data on OMP designation and approval rates to predict the number of new OMPs coming to the market. Average OMP sales were estimated for each year post-launch by regression analysis of historical sales data. Total forecast sales were compared with expected sales of all pharmaceuticals in each country to quantify the relative budget impact. The model predicts that by 2020, 152 OMPs will have marketing authorization in Europe. The base case OMP budget impacts are forecast to grow from 2.7% in Sweden and 3.2% in France of total drug expenditure in 2013 to 4.1% in Sweden and 4.9% in France by 2020. The principal driver of expenditure growth is the number of new OMPs obtaining OMP designation. This is tempered by the slowing success rate for new approvals and the loss of intellectual property protection on existing orphan medicines. Given the forward-looking nature of the analysis, uncertainty exists around model parameters and sensitivity analysis found peak year budget impact varying between 2% and 11%. The budget impact of OMPs in Sweden and France is likely to remain sustainable over time and a relatively small proportion of total pharmaceutical expenditure. This forecast could be affected by changes in the success rate for OMP approvals, average cost of OMPs, and the type of companies developing OMPs.
Reither, Eric N; Olshansky, S Jay; Yang, Yang
2011-08-01
Traditional methods of projecting population health statistics, such as estimating future death rates, can give inaccurate results and lead to inferior or even poor policy decisions. A new "three-dimensional" method of forecasting vital health statistics is more accurate because it takes into account the delayed effects of the health risks being accumulated by today's younger generations. Applying this forecasting technique to the US obesity epidemic suggests that future death rates and health care expenditures could be far worse than currently anticipated. We suggest that public policy makers adopt this more robust forecasting tool and redouble efforts to develop and implement effective obesity-related prevention programs and interventions.
Earthquake Rate Models for Evolving Induced Seismicity Hazard in the Central and Eastern US
NASA Astrophysics Data System (ADS)
Llenos, A. L.; Ellsworth, W. L.; Michael, A. J.
2015-12-01
Injection-induced earthquake rates can vary rapidly in space and time, which presents significant challenges to traditional probabilistic seismic hazard assessment methodologies that are based on a time-independent model of mainshock occurrence. To help society cope with rapidly evolving seismicity, the USGS is developing one-year hazard models for areas of induced seismicity in the central and eastern US to forecast the shaking due to all earthquakes, including aftershocks which are generally omitted from hazards assessments (Petersen et al., 2015). However, the spatial and temporal variability of the earthquake rates make them difficult to forecast even on time-scales as short as one year. An initial approach is to use the previous year's seismicity rate to forecast the next year's seismicity rate. However, in places such as northern Oklahoma the rates vary so rapidly over time that a simple linear extrapolation does not accurately forecast the future, even when the variability in the rates is modeled with simulations based on an Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988) to account for earthquake clustering. Instead of relying on a fixed time period for rate estimation, we explore another way to determine when the earthquake rate should be updated. This approach could also objectively identify new areas where the induced seismicity hazard model should be applied. We will estimate the background seismicity rate by optimizing a single set of ETAS aftershock triggering parameters across the most active induced seismicity zones -- Oklahoma, Guy-Greenbrier, the Raton Basin, and the Azle-Dallas-Fort Worth area -- with individual background rate parameters in each zone. The full seismicity rate, with uncertainties, can then be estimated using ETAS simulations and changes in rate can be detected by applying change point analysis in ETAS transformed time with methods already developed for Poisson processes.
Time-dependent earthquake forecasting: Method and application to the Italian region
NASA Astrophysics Data System (ADS)
Chan, C.; Sorensen, M. B.; Grünthal, G.; Hakimhashemi, A.; Heidbach, O.; Stromeyer, D.; Bosse, C.
2009-12-01
We develop a new approach for time-dependent earthquake forecasting and apply it to the Italian region. In our approach, the seismicity density is represented by a bandwidth function as a smoothing Kernel in the neighboring region of earthquakes. To consider the fault-interaction-based forecasting, we calculate the Coulomb stress change imparted by each earthquake in the study area. From this, the change of seismicity rate as a function of time can be estimated by the concept of rate-and-state stress transfer. We apply our approach to the region of Italy and earthquakes that occurred before 2003 to generate the seismicity density. To validate our approach, we compare our estimated seismicity density with the distribution of earthquakes with M≥3.8 after 2004. A positive correlation is found and all of the examined earthquakes locate in the area of the highest 66 percentile of seismicity density in the study region. Furthermore, the seismicity density corresponding to the epicenter of the 2009 April 6, Mw = 6.3, L’Aquila earthquake is in the area of the highest 5 percentile. For the time-dependent seismicity rate change, we estimate the rate-and-state stress transfer imparted by the M≥5.0 earthquakes occurred in the past 50 years. It suggests that the seismicity rate has increased at the locations of 65% of the examined earthquakes. Applying this approach to the L’Aquila sequence by considering seven M≥5.0 aftershocks as well as the main shock, not only spatial but also temporal forecasting of the aftershock distribution is significant.
Novel methodology for pharmaceutical expenditure forecast.
Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher
2014-01-01
The value appreciation of new drugs across countries today features a disruption that is making the historical data that are used for forecasting pharmaceutical expenditure poorly reliable. Forecasting methods rarely addressed uncertainty. The objective of this project was to propose a methodology to perform pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the 'EU Pharmaceutical expenditure forecast'; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings of generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Clinical expected benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis. This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and discount price, distribution chains, and the time to reach peak sales for new drugs; 2) estimating the statistical distribution of the budget impact; and 3) testing different pricing and reimbursement policy decisions on health expenditures. This methodology was independent of historical data and appeared to be highly flexible and adapted to test robustness and provide probabilistic analysis to support policy decision making.
Stochastic Forecasting of Labor Supply and Population: An Integrated Model.
Fuchs, Johann; Söhnlein, Doris; Weber, Brigitte; Weber, Enzo
2018-01-01
This paper presents a stochastic model to forecast the German population and labor supply until 2060. Within a cohort-component approach, our population forecast applies principal components analysis to birth, mortality, emigration, and immigration rates, which allows for the reduction of dimensionality and accounts for correlation of the rates. Labor force participation rates are estimated by means of an econometric time series approach. All time series are forecast by stochastic simulation using the bootstrap method. As our model also distinguishes between German and foreign nationals, different developments in fertility, migration, and labor participation could be predicted. The results show that even rising birth rates and high levels of immigration cannot break the basic demographic trend in the long run. An important finding from an endogenous modeling of emigration rates is that high net migration in the long run will be difficult to achieve. Our stochastic perspective suggests therefore a high probability of substantially decreasing the labor supply in Germany.
Giovannelli, J; Loury, P; Lainé, M; Spaccaferri, G; Hubert, B; Chaud, P
2015-05-01
To describe and evaluate the forecasts of the load that pandemic A(H1N1)2009 influenza would have on the general practitioners (GP) and hospital care systems, especially during its peak, in the Nord-Pas-de-Calais (NPDC) region, France. Modelling study. The epidemic curve was modelled using an assumption of normal distribution of cases. The values for the forecast parameters were estimated from a literature review of observed data from the Southern hemisphere and French Overseas Territories, where the pandemic had already occurred. Two scenarios were considered, one realistic, the other pessimistic, enabling the authors to evaluate the 'reasonable worst case'. Forecasts were then assessed by comparing them with observed data in the NPDC region--of 4 million people. The realistic scenarios forecasts estimated 300,000 cases, 1500 hospitalizations, 225 intensive care units (ICU) admissions for the pandemic wave; 115 hospital beds and 45 ICU beds would be required per day during the peak. The pessimistic scenario's forecasts were 2-3 times higher than the realistic scenario's forecasts. Observed data were: 235,000 cases, 1585 hospitalizations, 58 ICU admissions; and a maximum of 11.6 ICU beds per day. The realistic scenario correctly estimated the temporal distribution of GP and hospitalized cases but overestimated the number of cases admitted to ICU. Obtaining more robust data for parameters estimation--particularly the rate of ICU admission among the population that the authors recommend to use--may provide better forecasts. Copyright © 2015 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Hongrui; Wang, Cheng; Wang, Ying; Gao, Xiong; Yu, Chen
2017-06-01
This paper presents a Bayesian approach using Metropolis-Hastings Markov Chain Monte Carlo algorithm and applies this method for daily river flow rate forecast and uncertainty quantification for Zhujiachuan River using data collected from Qiaotoubao Gage Station and other 13 gage stations in Zhujiachuan watershed in China. The proposed method is also compared with the conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of daily flow rate, it performs over the conventional MLE method on uncertainty quantification, providing relatively narrower reliable interval than the MLE confidence interval and thus more precise estimation by using the related information from regional gage stations. The Bayesian MCMC method might be more favorable in the uncertainty analysis and risk management.
Comparison between GSTAR and GSTAR-Kalman Filter models on inflation rate forecasting in East Java
NASA Astrophysics Data System (ADS)
Rahma Prillantika, Jessica; Apriliani, Erna; Wahyuningsih, Nuri
2018-03-01
Up to now, we often find data which have correlation between time and location. This data also known as spatial data. Inflation rate is one type of spatial data because it is not only related to the events of the previous time, but also has relevance to the other location or elsewhere. In this research, we do comparison between GSTAR model and GSTAR-Kalman Filter to get prediction which have small error rate. Kalman Filter is one estimator that estimates state changes due to noise from white noise. The final result shows that Kalman Filter is able to improve the GSTAR forecast result. This is shown through simulation results in the form of graphs and clarified with smaller RMSE values.
Validation of the CME Geomagnetic forecast alerts under COMESEP alert system
NASA Astrophysics Data System (ADS)
Dumbovic, Mateja; Srivastava, Nandita; Khodia, Yamini; Vršnak, Bojan; Devos, Andy; Rodriguez, Luciano
2017-04-01
An automated space weather alert system has been developed under the EU FP7 project COMESEP (COronal Mass Ejections and Solar Energetic Particles: http://comesep.aeronomy.be) to forecast solar energetic particles (SEP) and coronal mass ejection (CME) risk levels at Earth. COMESEP alert system uses automated detection tool CACTus to detect potentially threatening CMEs, drag-based model (DBM) to predict their arrival and CME geo-effectiveness tool (CGFT) to predict their geomagnetic impact. Whenever CACTus detects a halo or partial halo CME and issues an alert, DBM calculates its arrival time at Earth and CGFT calculates its geomagnetic risk level. Geomagnetic risk level is calculated based on an estimation of the CME arrival probability and its likely geo-effectiveness, as well as an estimate of the geomagnetic-storm duration. We present the evaluation of the CME risk level forecast with COMESEP alert system based on a study of geo-effective CMEs observed during 2014. The validation of the forecast tool is done by comparing the forecasts with observations. In addition, we test the success rate of the automatic forecasts (without human intervention) against the forecasts with human intervention using advanced versions of DBM and CGFT (self standing tools available at Hvar Observatory website: http://oh.geof.unizg.hr). The results implicate that the success rate of the forecast is higher with human intervention and using more advanced tools. This work has received funding from the European Commission FP7 Project COMESEP (263252). We acknowledge the support of Croatian Science Foundation under the project 6212 „Solar and Stellar Variability".
Estimation of Eddy Dissipation Rates from Mesoscale Model Simulations
NASA Technical Reports Server (NTRS)
Ahmad, Nashat N.; Proctor, Fred H.
2012-01-01
The Eddy Dissipation Rate is an important metric for representing the intensity of atmospheric turbulence and is used as an input parameter for predicting the decay of aircraft wake vortices. In this study, the forecasts of eddy dissipation rates obtained from the current state-of-the-art mesoscale model are evaluated for terminal area applications. The Weather Research and Forecast mesoscale model is used to simulate the planetary boundary layer at high horizontal and vertical mesh resolutions. The Bougeault-Lacarrer and the Mellor-Yamada-Janji schemes implemented in the Weather Research and Forecast model are evaluated against data collected during the National Aeronautics and Space Administration s Memphis Wake Vortex Field Experiment. Comparisons with other observations are included as well.
1992 five year battery forecast
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amistadi, D.
1992-12-01
Five-year trends for automotive and industrial batteries are projected. Topic covered include: SLI shipments; lead consumption; automotive batteries (5-year annual growth rates); industrial batteries (standby power and motive power); estimated average battery life by area/country for 1989; US motor vehicle registrations; replacement battery shipments; potential lead consumption in electric vehicles; BCI recycling rates for lead-acid batteries; US average car/light truck battery life; channels of distribution; replacement battery inventory end July; 2nd US battery shipment forecast.
Error models for official mortality forecasts.
Alho, J M; Spencer, B D
1990-09-01
"The Office of the Actuary, U.S. Social Security Administration, produces alternative forecasts of mortality to reflect uncertainty about the future.... In this article we identify the components and assumptions of the official forecasts and approximate them by stochastic parametric models. We estimate parameters of the models from past data, derive statistical intervals for the forecasts, and compare them with the official high-low intervals. We use the models to evaluate the forecasts rather than to develop different predictions of the future. Analysis of data from 1972 to 1985 shows that the official intervals for mortality forecasts for males or females aged 45-70 have approximately a 95% chance of including the true mortality rate in any year. For other ages the chances are much less than 95%." excerpt
NASA Astrophysics Data System (ADS)
Garcin, Matthieu
2017-10-01
Hurst exponents depict the long memory of a time series. For human-dependent phenomena, as in finance, this feature may vary in the time. It justifies modelling dynamics by multifractional Brownian motions, which are consistent with time-dependent Hurst exponents. We improve the existing literature on estimating time-dependent Hurst exponents by proposing a smooth estimate obtained by variational calculus. This method is very general and not restricted to the sole Hurst framework. It is globally more accurate and easier than other existing non-parametric estimation techniques. Besides, in the field of Hurst exponents, it makes it possible to make forecasts based on the estimated multifractional Brownian motion. The application to high-frequency foreign exchange markets (GBP, CHF, SEK, USD, CAD, AUD, JPY, CNY and SGD, all against EUR) shows significantly good forecasts. When the Hurst exponent is higher than 0.5, what depicts a long-memory feature, the accuracy is higher.
NASA Astrophysics Data System (ADS)
Nomura, Shunichi; Ogata, Yosihiko
2016-04-01
We propose a Bayesian method of probability forecasting for recurrent earthquakes of inland active faults in Japan. Renewal processes with the Brownian Passage Time (BPT) distribution are applied for over a half of active faults in Japan by the Headquarters for Earthquake Research Promotion (HERP) of Japan. Long-term forecast with the BPT distribution needs two parameters; the mean and coefficient of variation (COV) for recurrence intervals. The HERP applies a common COV parameter for all of these faults because most of them have very few specified paleoseismic events, which is not enough to estimate reliable COV values for respective faults. However, different COV estimates are proposed for the same paleoseismic catalog by some related works. It can make critical difference in forecast to apply different COV estimates and so COV should be carefully selected for individual faults. Recurrence intervals on a fault are, on the average, determined by the long-term slip rate caused by the tectonic motion but fluctuated by nearby seismicities which influence surrounding stress field. The COVs of recurrence intervals depend on such stress perturbation and so have spatial trends due to the heterogeneity of tectonic motion and seismicity. Thus we introduce a spatial structure on its COV parameter by Bayesian modeling with a Gaussian process prior. The COVs on active faults are correlated and take similar values for closely located faults. It is found that the spatial trends in the estimated COV values coincide with the density of active faults in Japan. We also show Bayesian forecasts by the proposed model using Markov chain Monte Carlo method. Our forecasts are different from HERP's forecast especially on the active faults where HERP's forecasts are very high or low.
Wang, Hongrui; Wang, Cheng; Wang, Ying; ...
2017-04-05
This paper presents a Bayesian approach using Metropolis-Hastings Markov Chain Monte Carlo algorithm and applies this method for daily river flow rate forecast and uncertainty quantification for Zhujiachuan River using data collected from Qiaotoubao Gage Station and other 13 gage stations in Zhujiachuan watershed in China. The proposed method is also compared with the conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of daily flow rate, it performs over the conventional MLE method on uncertainty quantification, providing relatively narrower reliable interval than the MLEmore » confidence interval and thus more precise estimation by using the related information from regional gage stations. As a result, the Bayesian MCMC method might be more favorable in the uncertainty analysis and risk management.« less
Hossain, Md. Kamrul; Kamil, Anton Abdulbasah; Baten, Md. Azizul; Mustafa, Adli
2012-01-01
The objective of this paper is to apply the Translog Stochastic Frontier production model (SFA) and Data Envelopment Analysis (DEA) to estimate efficiencies over time and the Total Factor Productivity (TFP) growth rate for Bangladeshi rice crops (Aus, Aman and Boro) throughout the most recent data available comprising the period 1989–2008. Results indicate that technical efficiency was observed as higher for Boro among the three types of rice, but the overall technical efficiency of rice production was found around 50%. Although positive changes exist in TFP for the sample analyzed, the average growth rate of TFP for rice production was estimated at almost the same levels for both Translog SFA with half normal distribution and DEA. Estimated TFP from SFA is forecasted with ARIMA (2, 0, 0) model. ARIMA (1, 0, 0) model is used to forecast TFP of Aman from DEA estimation. PMID:23077500
Validation of Satellite-based Rainfall Estimates for Severe Storms (Hurricanes & Tornados)
NASA Astrophysics Data System (ADS)
Nourozi, N.; Mahani, S.; Khanbilvardi, R.
2005-12-01
Severe storms such as hurricanes and tornadoes cause devastating damages, almost every year, over a large section of the United States. More accurate forecasting intensity and track of a heavy storm can help to reduce if not to prevent its damages to lives, infrastructure, and economy. Estimating accurate high resolution quantitative precipitation (QPE) from a hurricane, required to improve the forecasting and warning capabilities, is still a challenging problem because of physical characteristics of the hurricane even when it is still over the ocean. Satellite imagery seems to be a valuable source of information for estimating and forecasting heavy precipitation and also flash floods, particularly for over the oceans where the traditional ground-based gauge and radar sources cannot provide any information. To improve the capability of a rainfall retrieval algorithm for estimating QPE of severe storms, its product is evaluated in this study. High (hourly 4km x 4km) resolutions satellite infrared-based rainfall products, from the NESDIS Hydro-Estimator (HE) and also PERSIANN (Precipitation Estimation from Remotely Sensed Information using an Artificial Neural Networks) algorithms, have been tested against NEXRAD stage-IV and rain gauge observations in this project. Three strong hurricanes: Charley (category 4), Jeanne (category 3), and Ivan (category 3) that caused devastating damages over Florida in the summer 2004, have been considered to be investigated. Preliminary results demonstrate that HE tends to underestimate rain rates when NEXRAD shows heavy storm (rain rates greater than 25 mm/hr) and to overestimate when NEXRAD gives low rainfall amounts, but PERSIANN tends to underestimate rain rates, in general.
Potential use of multiple surveillance data in the forecast of hospital admissions
Lau, Eric H.Y.; Ip, Dennis K.M.; Cowling, Benjamin J.
2013-01-01
Objective This paper describes the potential use of multiple influenza surveillance data to forecast hospital admissions for respiratory diseases. Introduction A sudden surge in hospital admissions in public hospital during influenza peak season has been a challenge to healthcare and manpower planning. In Hong Kong, the timing of influenza peak seasons are variable and early short-term indication of possible surge may facilitate preparedness which could be translated into strategies such as early discharge or reallocation of extra hospital beds. In this study we explore the potential use of multiple routinely collected syndromic data in the forecast of hospital admissions. Methods A multivariate dynamic linear time series model was fitted to multiple syndromic data including influenza-like illness (ILI) rates among networks of public and private general practitioners (GP), and school absenteeism rates, plus drop-in fever count data from designated flu clinics (DFC) that were created during the pandemic. The latent process derived from the model has been used as a measure of the influenza activity [1]. We compare the cross-correlations between estimated influenza level based on multiple surveillance data and GP ILI data, versus accident and emergency hospital admissions with principal diagnoses of respiratory diseases and pneumonia & influenza (P&I). Results The estimated influenza activity has higher cross-correlation with respiratory and P&I admissions (ρ=0.66 and 0.73 respectively) compared to that of GP ILI rates (Table 1). Cross correlations drop distinctly after lag 2 for both estimated influenza activity and GP ILI rates. Conclusions The use of a multivariate method to integrate information from multiple sources of influenza surveillance data may have the potential to improve forecasting of admission surge of respiratory diseases.
Predicting spatio-temporal failure in large scale observational and micro scale experimental systems
NASA Astrophysics Data System (ADS)
de las Heras, Alejandro; Hu, Yong
2006-10-01
Forecasting has become an essential part of modern thought, but the practical limitations still are manifold. We addressed future rates of change by comparing models that take into account time, and models that focus more on space. Cox regression confirmed that linear change can be safely assumed in the short-term. Spatially explicit Poisson regression, provided a ceiling value for the number of deforestation spots. With several observed and estimated rates, it was decided to forecast using the more robust assumptions. A Markov-chain cellular automaton thus projected 5-year deforestation in the Amazonian Arc of Deforestation, showing that even a stable rate of change would largely deplete the forest area. More generally, resolution and implementation of the existing models could explain many of the modelling difficulties still affecting forecasting.
Uncertainty estimation of long-range ensemble forecasts of snowmelt flood characteristics
NASA Astrophysics Data System (ADS)
Kuchment, L.
2012-04-01
Long-range forecasts of snowmelt flood characteristics with the lead time of 2-3 months have important significance for regulation of flood runoff and mitigation of flood damages at almost all large Russian rivers At the same time, the application of current forecasting techniques based on regression relationships between the runoff volume and the indexes of river basin conditions can lead to serious errors in forecasting resulted in large economic losses caused by wrong flood regulation. The forecast errors can be caused by complicated processes of soil freezing and soil moisture redistribution, too high rate of snow melt, large liquid precipitation before snow melt. or by large difference of meteorological conditions during the lead-time periods from climatologic ones. Analysis of economic losses had shown that the largest damages could, to a significant extent, be avoided if the decision makers had an opportunity to take into account predictive uncertainty and could use more cautious strategies in runoff regulation. Development of methodology of long-range ensemble forecasting of spring/summer floods which is based on distributed physically-based runoff generation models has created, in principle, a new basis for improving hydrological predictions as well as for estimating their uncertainty. This approach is illustrated by forecasting of the spring-summer floods at the Vyatka River and the Seim River basins. The application of the physically - based models of snowmelt runoff generation give a essential improving of statistical estimates of the deterministic forecasts of the flood volume in comparison with the forecasts obtained from the regression relationships. These models had been used also for the probabilistic forecasts assigning meteorological inputs during lead time periods from the available historical daily series, and from the series simulated by using a weather generator and the Monte Carlo procedure. The weather generator consists of the stochastic models of daily temperature and precipitation. The performance of the probabilistic forecasts were estimated by the ranked probability skill scores. The application of Monte Carlo simulations using weather generator has given better results then using the historical meteorological series.
Inference and forecast of H7N9 influenza in China, 2013 to 2015.
Li, Ruiyun; Bai, Yuqi; Heaney, Alex; Kandula, Sasikiran; Cai, Jun; Zhao, Xuyi; Xu, Bing; Shaman, Jeffrey
2017-02-16
The recent emergence of A(H7N9) avian influenza poses a significant challenge to public health in China and around the world; however, understanding of the transmission dynamics and progression of influenza A(H7N9) infection in domestic poultry, as well as spillover transmission to humans, remains limited. Here, we develop a mathematical model-Bayesian inference system which combines a simple epidemic model and data assimilation method, and use it in conjunction with data on observed human influenza A(H7N9) cases from 19 February 2013 to 19 September 2015 to estimate key epidemiological parameters and to forecast infection in both poultry and humans. Our findings indicate a high outbreak attack rate of 33% among poultry but a low rate of chicken-to-human spillover transmission. In addition, we generated accurate forecasts of the peak timing and magnitude of human influenza A(H7N9) cases. This work demonstrates that transmission dynamics within an avian reservoir can be estimated and that real-time forecast of spillover avian influenza in humans is possible. This article is copyright of The Authors, 2017.
Potential barge transportation for inbound corn and grain
DOT National Transportation Integrated Search
1997-12-31
This research develops a model for estimating future barge and rail rates for decision making. The Box-Jenkins and the Regression Analysis with ARIMA errors forecasting methods were used to develop appropriate models for determining future rates. A s...
Alwee, Razana; Hj Shamsuddin, Siti Mariyam; Sallehuddin, Roselina
2013-01-01
Crimes forecasting is an important area in the field of criminology. Linear models, such as regression and econometric models, are commonly applied in crime forecasting. However, in real crimes data, it is common that the data consists of both linear and nonlinear components. A single model may not be sufficient to identify all the characteristics of the data. The purpose of this study is to introduce a hybrid model that combines support vector regression (SVR) and autoregressive integrated moving average (ARIMA) to be applied in crime rates forecasting. SVR is very robust with small training data and high-dimensional problem. Meanwhile, ARIMA has the ability to model several types of time series. However, the accuracy of the SVR model depends on values of its parameters, while ARIMA is not robust to be applied to small data sets. Therefore, to overcome this problem, particle swarm optimization is used to estimate the parameters of the SVR and ARIMA models. The proposed hybrid model is used to forecast the property crime rates of the United State based on economic indicators. The experimental results show that the proposed hybrid model is able to produce more accurate forecasting results as compared to the individual models. PMID:23766729
Alwee, Razana; Shamsuddin, Siti Mariyam Hj; Sallehuddin, Roselina
2013-01-01
Crimes forecasting is an important area in the field of criminology. Linear models, such as regression and econometric models, are commonly applied in crime forecasting. However, in real crimes data, it is common that the data consists of both linear and nonlinear components. A single model may not be sufficient to identify all the characteristics of the data. The purpose of this study is to introduce a hybrid model that combines support vector regression (SVR) and autoregressive integrated moving average (ARIMA) to be applied in crime rates forecasting. SVR is very robust with small training data and high-dimensional problem. Meanwhile, ARIMA has the ability to model several types of time series. However, the accuracy of the SVR model depends on values of its parameters, while ARIMA is not robust to be applied to small data sets. Therefore, to overcome this problem, particle swarm optimization is used to estimate the parameters of the SVR and ARIMA models. The proposed hybrid model is used to forecast the property crime rates of the United State based on economic indicators. The experimental results show that the proposed hybrid model is able to produce more accurate forecasting results as compared to the individual models.
Large earthquake rates from geologic, geodetic, and seismological perspectives
NASA Astrophysics Data System (ADS)
Jackson, D. D.
2017-12-01
Earthquake rate and recurrence information comes primarily from geology, geodesy, and seismology. Geology gives the longest temporal perspective, but it reveals only surface deformation, relatable to earthquakes only with many assumptions. Geodesy is also limited to surface observations, but it detects evidence of the processes leading to earthquakes, again subject to important assumptions. Seismology reveals actual earthquakes, but its history is too short to capture important properties of very large ones. Unfortunately, the ranges of these observation types barely overlap, so that integrating them into a consistent picture adequate to infer future prospects requires a great deal of trust. Perhaps the most important boundary is the temporal one at the beginning of the instrumental seismic era, about a century ago. We have virtually no seismological or geodetic information on large earthquakes before then, and little geological information after. Virtually all-modern forecasts of large earthquakes assume some form of equivalence between tectonic- and seismic moment rates as functions of location, time, and magnitude threshold. That assumption links geology, geodesy, and seismology, but it invokes a host of other assumptions and incurs very significant uncertainties. Questions include temporal behavior of seismic and tectonic moment rates; shape of the earthquake magnitude distribution; upper magnitude limit; scaling between rupture length, width, and displacement; depth dependence of stress coupling; value of crustal rigidity; and relation between faults at depth and their surface fault traces, to name just a few. In this report I'll estimate the quantitative implications for estimating large earthquake rate. Global studies like the GEAR1 project suggest that surface deformation from geology and geodesy best show the geography of very large, rare earthquakes in the long term, while seismological observations of small earthquakes best forecasts moderate earthquakes up to about magnitude 7. Regional forecasts for a few decades, like those in UCERF3, could be improved by calibrating tectonic moment rate to past seismicity rates. Century-long forecasts must be speculative. Estimates of maximum magnitude and rate of giant earthquakes over geologic time scales require more than science.
Mean Bias in Seasonal Forecast Model and ENSO Prediction Error.
Kim, Seon Tae; Jeong, Hye-In; Jin, Fei-Fei
2017-07-20
This study uses retrospective forecasts made using an APEC Climate Center seasonal forecast model to investigate the cause of errors in predicting the amplitude of El Niño Southern Oscillation (ENSO)-driven sea surface temperature variability. When utilizing Bjerknes coupled stability (BJ) index analysis, enhanced errors in ENSO amplitude with forecast lead times are found to be well represented by those in the growth rate estimated by the BJ index. ENSO amplitude forecast errors are most strongly associated with the errors in both the thermocline slope response and surface wind response to forcing over the tropical Pacific, leading to errors in thermocline feedback. This study concludes that upper ocean temperature bias in the equatorial Pacific, which becomes more intense with increasing lead times, is a possible cause of forecast errors in the thermocline feedback and thus in ENSO amplitude.
The Hawaiian Volcano Observatory's current approach to forecasting lava flow hazards (Invited)
NASA Astrophysics Data System (ADS)
Kauahikaua, J. P.
2013-12-01
Hawaiian Volcanoes are best known for their frequent basaltic eruptions, which typically start with fast-moving channelized `a`a flows fed by high eruptions rates. If the flows continue, they generally transition into pahoehoe flows, fed by lower eruption rates, after a few days to weeks. Kilauea Volcano's ongoing eruption illustrates this--since 1986, effusion at Kilauea has mostly produced pahoehoe. The current state of lava flow simulation is quite advanced, but the simplicity of the models mean that they are most appropriately used during the first, most vigorous, days to weeks of an eruption - during the effusion of `a`a flows. Colleagues at INGV in Catania have shown decisively that MAGFLOW simulations utilizing satellite-derived eruption rates can be effective at estimating hazards during the initial periods of an eruption crisis. However, the algorithms do not simulate the complexity of pahoehoe flows. Forecasts of lava flow hazards are the most common form of volcanic hazard assessments made in Hawai`i. Communications with emergency managers over the last decade have relied on simple steepest-descent line maps, coupled with empirical lava flow advance rate information, to portray the imminence of lava flow hazard to nearby communities. Lavasheds, calculated as watersheds, are used as a broader context for the future flow paths and to advise on the utility of diversion efforts, should they be contemplated. The key is to communicate the uncertainty of any approach used to formulate a forecast and, if the forecast uses simple tools, these communications can be fairly straightforward. The calculation of steepest-descent paths and lavasheds relies on the accuracy of the digital elevation model (DEM) used, so the choice of DEM is critical. In Hawai`i, the best choice is not the most recent but is a 1980s-vintage 10-m DEM--more recent LIDAR and satellite radar DEM are referenced to the ellipsoid and include vegetation effects. On low-slope terrain, steepest descent lines calculated on a geoid-based DEM may differ significantly from those calculated on an ellipsoid-based DEM. Good estimates of lava flow advance rates can be obtained from empirical compilations of historical advance rates of Hawaiian lava flows. In this way, rates appropriate for observed flow types (`a`a or pahoehoe, channelized or not) can be applied. Eruption rate is arguably the most important factor, while slope is also significant for low eruption rates. Eruption rate, however, remains the most difficult parameter to estimate during an active eruption. The simplicity of the HVO approach is its major benefit. How much better can lava-flow advance be forecast for all types of lava flows? Will the improvements outweigh the increased uncertainty propagated through the simulation calculations? HVO continues to improve and evaluate its lava flow forecasting tools to provide better hazard assessments to emergency personnel.
NASA Astrophysics Data System (ADS)
Yu, Zhang; Xiaohui, Song; Jianfang, Li; Fei, Gao
2017-05-01
Cable overheating will lead to the cable insulation level reducing, speed up the cable insulation aging, even easy to cause short circuit faults. Cable overheating risk identification and warning is nessesary for distribution network operators. Cable overheating risk warning method based on impedance parameter estimation is proposed in the paper to improve the safty and reliability operation of distribution network. Firstly, cable impedance estimation model is established by using least square method based on the data from distribiton SCADA system to improve the impedance parameter estimation accuracy. Secondly, calculate the threshold value of cable impedance based on the historical data and the forecast value of cable impedance based on the forecasting data in future from distribiton SCADA system. Thirdly, establish risks warning rules library of cable overheating, calculate the cable impedance forecast value and analysis the change rate of impedance, and then warn the overheating risk of cable line based on the overheating risk warning rules library according to the variation relationship between impedance and line temperature rise. Overheating risk warning method is simulated in the paper. The simulation results shows that the method can identify the imedance and forecast the temperature rise of cable line in distribution network accurately. The result of overheating risk warning can provide decision basis for operation maintenance and repair.
Forecasting Cause-Specific Mortality in Korea up to Year 2032.
Yun, Jae-Won; Son, Mia
2016-08-01
Forecasting cause-specific mortality can help estimate the future burden of diseases and provide a clue for preventing diseases. Our objective was to forecast the mortality for causes of death in the future (2013-2032) based on the past trends (1983-2012) in Korea. The death data consisted of 12 major causes of death from 1983 to 2012 and the population data consisted of the observed and estimated populations (1983-2032) in Korea. The modified age-period-cohort model with an R-based program, nordpred software, was used to forecast future mortality. Although the age-standardized rates for the world standard population for both sexes are expected to decrease from 2008-2012 to 2028-2032 (males: -31.4%, females: -32.3%), the crude rates are expected to increase (males: 46.3%, females: 33.4%). The total number of deaths is also estimated to increase (males: 52.7%, females: 41.9%). Additionally, the largest contribution to the overall change in deaths was the change in the age structures. Several causes of death are projected to increase in both sexes (cancer, suicide, heart diseases, pneumonia and Alzheimer's disease), while others are projected to decrease (cerebrovascular diseases, liver diseases, diabetes mellitus, traffic accidents, chronic lower respiratory diseases, and pulmonary tuberculosis). Cancer is expected to be the highest cause of death for both the 2008-2012 and 2028-2032 time periods in Korea. To reduce the disease burden, projections of the future cause-specific mortality should be used as fundamental data for developing public health policies.
Novel methodology for pharmaceutical expenditure forecast
Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher
2014-01-01
Background and objective The value appreciation of new drugs across countries today features a disruption that is making the historical data that are used for forecasting pharmaceutical expenditure poorly reliable. Forecasting methods rarely addressed uncertainty. The objective of this project was to propose a methodology to perform pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the ‘EU Pharmaceutical expenditure forecast’; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). Methods 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings of generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Clinical expected benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis. Results This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and discount price, distribution chains, and the time to reach peak sales for new drugs; 2) estimating the statistical distribution of the budget impact; and 3) testing different pricing and reimbursement policy decisions on health expenditures. Conclusions This methodology was independent of historical data and appeared to be highly flexible and adapted to test robustness and provide probabilistic analysis to support policy decision making. PMID:27226843
The Role of Type and Source of Uncertainty on the Processing of Climate Models Projections.
Benjamin, Daniel M; Budescu, David V
2018-01-01
Scientists agree that the climate is changing due to human activities, but there is less agreement about the specific consequences and their timeline. Disagreement among climate projections is attributable to the complexity of climate models that differ in their structure, parameters, initial conditions, etc. We examine how different sources of uncertainty affect people's interpretation of, and reaction to, information about climate change by presenting participants forecasts from multiple experts. Participants viewed three types of sets of sea-level rise projections: (1) precise, but conflicting ; (2) imprecise , but agreeing, and (3) hybrid that were both conflicting and imprecise. They estimated the most likely sea-level rise, provided a range of possible values and rated the sets on several features - ambiguity, credibility, completeness, etc. In Study 1, everyone saw the same hybrid set. We found that participants were sensitive to uncertainty between sources, but not to uncertainty about which model was used. The impacts of conflict and imprecision were combined for estimation tasks and compromised for feature ratings . Estimates were closer to the experts' original projections, and sets were rated more favorably under imprecision. Estimates were least consistent with (narrower than) the experts in the hybrid condition, but participants rated the conflicting set least favorably. In Study 2, we investigated the hybrid case in more detail by creating several distinct interval sets that combine conflict and imprecision. Two factors drive perceptual differences: overlap - the structure of the forecast set (whether intersecting, nested, tangent, or disjoint) - and a symmetry - the balance of the set. Estimates were primarily driven by asymmetry, and preferences were primarily driven by overlap. Asymmetric sets were least consistent with the experts: estimated ranges were narrower, and estimates of the most likely value were shifted further below the set mean. Intersecting and nested sets were rated similarly to imprecision, and ratings of disjoint and tangent sets were rated like conflict. Our goal was to determine which underlying factors of information sets drive perceptions of uncertainty in consistent, predictable ways. The two studies lead us to conclude that perceptions of agreement require intersection and balance, and overly precise forecasts lead to greater perceptions of disagreement and a greater likelihood of the public discrediting and misinterpreting information.
The Role of Type and Source of Uncertainty on the Processing of Climate Models Projections
Benjamin, Daniel M.; Budescu, David V.
2018-01-01
Scientists agree that the climate is changing due to human activities, but there is less agreement about the specific consequences and their timeline. Disagreement among climate projections is attributable to the complexity of climate models that differ in their structure, parameters, initial conditions, etc. We examine how different sources of uncertainty affect people’s interpretation of, and reaction to, information about climate change by presenting participants forecasts from multiple experts. Participants viewed three types of sets of sea-level rise projections: (1) precise, but conflicting; (2) imprecise, but agreeing, and (3) hybrid that were both conflicting and imprecise. They estimated the most likely sea-level rise, provided a range of possible values and rated the sets on several features – ambiguity, credibility, completeness, etc. In Study 1, everyone saw the same hybrid set. We found that participants were sensitive to uncertainty between sources, but not to uncertainty about which model was used. The impacts of conflict and imprecision were combined for estimation tasks and compromised for feature ratings. Estimates were closer to the experts’ original projections, and sets were rated more favorably under imprecision. Estimates were least consistent with (narrower than) the experts in the hybrid condition, but participants rated the conflicting set least favorably. In Study 2, we investigated the hybrid case in more detail by creating several distinct interval sets that combine conflict and imprecision. Two factors drive perceptual differences: overlap – the structure of the forecast set (whether intersecting, nested, tangent, or disjoint) – and asymmetry – the balance of the set. Estimates were primarily driven by asymmetry, and preferences were primarily driven by overlap. Asymmetric sets were least consistent with the experts: estimated ranges were narrower, and estimates of the most likely value were shifted further below the set mean. Intersecting and nested sets were rated similarly to imprecision, and ratings of disjoint and tangent sets were rated like conflict. Our goal was to determine which underlying factors of information sets drive perceptions of uncertainty in consistent, predictable ways. The two studies lead us to conclude that perceptions of agreement require intersection and balance, and overly precise forecasts lead to greater perceptions of disagreement and a greater likelihood of the public discrediting and misinterpreting information. PMID:29636717
Greenberg, L; Cultice, J M
1997-01-01
OBJECTIVE: The Health Resources and Services Administration's Bureau of Health Professions developed a demographic utilization-based model of physician specialty requirements to explore the consequences of a broad range of scenarios pertaining to the nation's health care delivery system on need for physicians. DATA SOURCE/STUDY SETTING: The model uses selected data primarily from the National Center for Health Statistics, the American Medical Association, and the U.S. Bureau of Census. Forecasts are national estimates. STUDY DESIGN: Current (1989) utilization rates for ambulatory and inpatient medical specialty services were obtained for the population according to age, gender, race/ethnicity, and insurance status. These rates are used to estimate specialty-specific total service utilization expressed in patient care minutes for future populations and converted to physician requirements by applying per-physician productivity estimates. DATA COLLECTION/EXTRACTION METHODS: Secondary data were analyzed and put into matrixes for use in the mainframe computer-based model. Several missing data points, e.g., for HMO-enrolled populations, were extrapolated from available data by the project's contractor. PRINCIPAL FINDINGS: The authors contend that the Bureau's demographic utilization model represents improvements over other data-driven methodologies that rely on staffing ratios and similar supply-determined bases for estimating requirements. The model's distinct utility rests in offering national-level physician specialty requirements forecasts. Images Figure 1 PMID:9018213
Ji, Eun Sook; Park, Kyu-Hyun
2012-12-01
This study was conducted to evaluate methane (CH4) and nitrous oxide (N2O) emissions from livestock agriculture in 16 local administrative districts of Korea from 1990 to 2030. National Inventory Report used 3 yr averaged livestock population but this study used 1 yr livestock population to find yearly emission fluctuations. Extrapolation of the livestock population from 1990 to 2009 was used to forecast future livestock population from 2010 to 2030. Past (yr 1990 to 2009) and forecasted (yr 2010 to 2030) averaged enteric CH4 emissions and CH4 and N2O emissions from manure treatment were estimated. In the section of enteric fermentation, forecasted average CH4 emissions from 16 local administrative districts were estimated to increase by 4%-114% compared to that of the past except for Daejeon (-63%), Seoul (-36%) and Gyeonggi (-7%). As for manure treatment, forecasted average CH4 emissions from the 16 local administrative districts were estimated to increase by 3%-124% compared to past average except for Daejeon (-77%), Busan (-60%), Gwangju (-48%) and Seoul (-8%). For manure treatment, forecasted average N2O emissions from the 16 local administrative districts were estimated to increase by 10%-153% compared to past average CH4 emissions except for Daejeon (-60%), Seoul (-4.0%), and Gwangju (-0.2%). With the carbon dioxide equivalent emissions (CO2-Eq), forecasted average CO2-Eq from the 16 local administrative districts were estimated to increase by 31%-120% compared to past average CH4 emissions except Daejeon (-65%), Seoul (-24%), Busan (-18%), Gwangju (-8%) and Gyeonggi (-1%). The decreased CO2-Eq from 5 local administrative districts was only 34 kt, which was insignificantly small compared to increase of 2,809 kt from other 11 local administrative districts. Annual growth rates of enteric CH4 emissions, CH4 and N2O emissions from manure management in Korea from 1990 to 2009 were 1.7%, 2.6%, and 3.2%, respectively. The annual growth rate of total CO2-Eq was 2.2%. Efforts by the local administrative offices to improve the accuracy of activity data are essential to improve GHG inventories. Direct measurements of GHG emissions from enteric fermentation and manure treatment systems will further enhance the accuracy of the GHG data. (Key Words: Greenhouse Gas, Methane, Nitrous Oxide, Carbon Dioxide Equivalent Emission, Climate Change).
On the adaptive daily forecasting of seismic aftershock hazard
NASA Astrophysics Data System (ADS)
Ebrahimian, Hossein; Jalayer, Fatemeh; Asprone, Domenico; Lombardi, Anna Maria; Marzocchi, Warner; Prota, Andrea; Manfredi, Gaetano
2013-04-01
Post-earthquake ground motion hazard assessment is a fundamental initial step towards time-dependent seismic risk assessment for buildings in a post main-shock environment. Therefore, operative forecasting of seismic aftershock hazard forms a viable support basis for decision-making regarding search and rescue, inspection, repair, and re-occupation in a post main-shock environment. Arguably, an adaptive procedure for integrating the aftershock occurrence rate together with suitable ground motion prediction relations is key to Probabilistic Seismic Aftershock Hazard Assessment (PSAHA). In the short-term, the seismic hazard may vary significantly (Jordan et al., 2011), particularly after the occurrence of a high magnitude earthquake. Hence, PSAHA requires a reliable model that is able to track the time evolution of the earthquake occurrence rates together with suitable ground motion prediction relations. This work focuses on providing adaptive daily forecasts of the mean daily rate of exceeding various spectral acceleration values (the aftershock hazard). Two well-established earthquake occurrence models suitable for daily seismicity forecasts associated with the evolution of an aftershock sequence, namely, the modified Omori's aftershock model and the Epidemic Type Aftershock Sequence (ETAS) are adopted. The parameters of the modified Omori model are updated on a daily basis using Bayesian updating and based on the data provided by the ongoing aftershock sequence based on the methodology originally proposed by Jalayer et al. (2011). The Bayesian updating is used also to provide sequence-based parameter estimates for a given ground motion prediction model, i.e. the aftershock events in an ongoing sequence are exploited in order to update in an adaptive manner the parameters of an existing ground motion prediction model. As a numerical example, the mean daily rates of exceeding specific spectral acceleration values are estimated adaptively for the L'Aquila 2009 aftershock catalog. The parameters of the modified Omori model are estimated in an adaptive manner using the Bayesian updating based on the aftershock events that had already taken place at each day elapsed and using the Italian generic sequence (Lolli and Gasperini 2003) as prior information. For the ETAS model, the real-time daily forecast of the spatio-temporal evolution of the L'Aquila sequence provided for the Italian Civil Protection for managing the emergency (Marzocchi and Lombardi, 2009) is utilized. Moreover, the parameters of the ground motion prediction relation proposed by Sabetta and Pugliese (1996) are updated adaptively and on a daily basis using Bayesian updating based on the ongoing aftershock sequence. Finally, the forecasted daily rates of exceeding (first-mode) spectral acceleration values are compared with observed rates of exceedance calculated based on the wave-forms that have actually taken place. References Jalayer, F., Asprone, D., Prota, A., Manfredi, G. (2011). A decision support system for post-earthquake reliability assessment of structures subjected to after-shocks: an application to L'Aquila earthquake, 2009. Bull. Earthquake Eng. 9(4) 997-1014. Jordan, T.H., Chen Y-T., Gasparini P., Madariaga R., Main I., Marzocchi W., Papadopoulos G., Sobolev G., Yamaoka K., and J. Zschau (2011). Operational earthquake forecasting: State of knowledge and guidelines for implementation, Ann. Geophys. 54(4) 315-391, doi 10.4401/ag-5350. Lolli, B., and P. Gasperini (2003). Aftershocks hazard in Italy part I: estimation of time-magnitude distribution model parameters and computation of probabilities of occurrence. Journal of Seismology 7(2) 235-257. Marzocchi, W., and A.M. Lombardi (2009). Real-time forecasting following a damaging earthquake, Geophys. Res. Lett. 36, L21302, doi: 10.1029/2009GL040233. Sabetta F., A. Pugliese (1996) Estimation of response spectra and simulation of nonstationary earthquake ground motions. Bull Seismol Soc Am 86(2) 337-352.
NASA Astrophysics Data System (ADS)
Zavodsky, B.; Le Roy, A.; Smith, M. R.; Case, J.
2016-12-01
In support of NASA's recently launched GPM `core' satellite, the NASA-SPoRT project is leveraging experience in research-to-operations transitions and training to provide feedback on the operational utility of GPM products. Thus far, SPoRT has focused on evaluating the Level 2 GPROF passive microwave and IMERG rain rate estimates. Formal evaluations with end-users have occurred, as well as internal evaluations of the datasets. One set of end users for these products is National Weather Service Forecast Offices (WFOs) and National Weather Service River Forecast Centers (RFCs), comprising forecasters and hydrologists. SPoRT has hosted a series of formal assessments to determine uses and utility of these datasets for NWS operations at specific offices. Forecasters primarily have used Level 2 swath rain rates to observe rainfall in otherwise data-void regions and to confirm model QPF for their nowcasting or short-term forecasting. Hydrologists have been evaluating both the Level 2 rain rates and the IMERG rain rates, including rain rate accumulations derived from IMERG; hydrologists have used these data to supplement gauge data for post-event analysis as well as for longer-term forecasting. Results from specific evaluations will be presented. Another evaluation of the GPM passive microwave rain rates has been in using the data within other products that are currently transitioned to end-users, rather than as stand-alone observations. For example, IMERG Early data is being used as a forcing mechanism in the NASA Land Information System (LIS) for real-time soil moisture product over eastern Africa. IMERG is providing valuable precipitation information to LIS in an otherwise data-void region. Results and caveats will briefly be discussed. A third application of GPM data is using the IMERG Late and Final products for model verification in remote regions where high-quality gridded precipitation fields are not readily available. These datasets can now be used to verify NWP model forecasts over Eastern Africa using the SPoRT-MET scripts verification package, a wrapper around the NCAR Model Evaluation Toolkit (MET) verification software.
Effective Presentation of Metabolic Rate Information for Lunar Extravehicular Activity (EVA)
NASA Technical Reports Server (NTRS)
Mackin, Michael A.; Gonia, Philip; Lombay-Gonzalez, Jose
2010-01-01
During human exploration of the lunar surface, a suited crewmember needs effective and accurate information about consumable levels remaining in their life support system. The information must be presented in a manner that supports real-time consumable monitoring and route planning. Since consumable usage is closely tied to metabolic rate, the lunar suit must estimate metabolic rate from life support sensors, such as oxygen tank pressures, carbon dioxide partial pressure, and cooling water inlet and outlet temperatures. To provide adequate warnings that account for traverse time for a crewmember to return to a safe haven, accurate forecasts of consumable depletion rates are required. The forecasts must be presented to the crewmember in a straightforward, effective manner. In order to evaluate methods for displaying consumable forecasts, a desktop-based simulation of a lunar Extravehicular Activity (EVA) has been developed for the Constellation lunar suite s life-support system. The program was used to compare the effectiveness of several different data presentation methods.
NASA Astrophysics Data System (ADS)
Bell, Andrew F.; Naylor, Mark; Heap, Michael J.; Main, Ian G.
2011-08-01
Power-law accelerations in the mean rate of strain, earthquakes and other precursors have been widely reported prior to material failure phenomena, including volcanic eruptions, landslides and laboratory deformation experiments, as predicted by several theoretical models. The Failure Forecast Method (FFM), which linearizes the power-law trend, has been routinely used to forecast the failure time in retrospective analyses; however, its performance has never been formally evaluated. Here we use synthetic and real data, recorded in laboratory brittle creep experiments and at volcanoes, to show that the assumptions of the FFM are inconsistent with the error structure of the data, leading to biased and imprecise forecasts. We show that a Generalized Linear Model method provides higher-quality forecasts that converge more accurately to the eventual failure time, accounting for the appropriate error distributions. This approach should be employed in place of the FFM to provide reliable quantitative forecasts and estimate their associated uncertainties.
Validating induced seismicity forecast models—Induced Seismicity Test Bench
NASA Astrophysics Data System (ADS)
Király-Proag, Eszter; Zechar, J. Douglas; Gischig, Valentin; Wiemer, Stefan; Karvounis, Dimitrios; Doetsch, Joseph
2016-08-01
Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. In this study, we propose an Induced Seismicity Test Bench to test and rank such models; this test bench can be used for model development, model selection, and ensemble model building. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models: Shapiro and Smoothed Seismicity (SaSS) and Hydraulics and Seismics (HySei). These models incorporate a different mix of physics-based elements and stochastic representation of the induced sequences. Our results show that neither model is fully superior to the other. Generally, HySei forecasts the seismicity rate better after shut-in but is only mediocre at forecasting the spatial distribution. On the other hand, SaSS forecasts the spatial distribution better and gives better seismicity rate estimates before shut-in. The shut-in phase is a difficult moment for both models in both reservoirs: the models tend to underpredict the seismicity rate around, and shortly after, shut-in.
NASA Astrophysics Data System (ADS)
Hiemer, S.; Woessner, J.; Basili, R.; Danciu, L.; Giardini, D.; Wiemer, S.
2014-08-01
We present a time-independent gridded earthquake rate forecast for the European region including Turkey. The spatial component of our model is based on kernel density estimation techniques, which we applied to both past earthquake locations and fault moment release on mapped crustal faults and subduction zone interfaces with assigned slip rates. Our forecast relies on the assumption that the locations of past seismicity is a good guide to future seismicity, and that future large-magnitude events occur more likely in the vicinity of known faults. We show that the optimal weighted sum of the corresponding two spatial densities depends on the magnitude range considered. The kernel bandwidths and density weighting function are optimized using retrospective likelihood-based forecast experiments. We computed earthquake activity rates (a- and b-value) of the truncated Gutenberg-Richter distribution separately for crustal and subduction seismicity based on a maximum likelihood approach that considers the spatial and temporal completeness history of the catalogue. The final annual rate of our forecast is purely driven by the maximum likelihood fit of activity rates to the catalogue data, whereas its spatial component incorporates contributions from both earthquake and fault moment-rate densities. Our model constitutes one branch of the earthquake source model logic tree of the 2013 European seismic hazard model released by the EU-FP7 project `Seismic HAzard haRmonization in Europe' (SHARE) and contributes to the assessment of epistemic uncertainties in earthquake activity rates. We performed retrospective and pseudo-prospective likelihood consistency tests to underline the reliability of our model and SHARE's area source model (ASM) using the testing algorithms applied in the collaboratory for the study of earthquake predictability (CSEP). We comparatively tested our model's forecasting skill against the ASM and find a statistically significant better performance for testing periods of 10-20 yr. The testing results suggest that our model is a viable candidate model to serve for long-term forecasting on timescales of years to decades for the European region.
NASA Technical Reports Server (NTRS)
Meng, Huan; Ferraro, Ralph; Kongoli, Cezar; Yan, Banghua; Zavodsky, Bradley; Zhao, Limin; Dong, Jun; Wang, Nai-Yu
2015-01-01
(AMSU), Microwave Humidity Sounder (MHS) and Advance Technology Microwave Sounder (ATMS). ATMS is the follow-on sensor to AMSU and MHS. Currently, an AMSU and MHS based land snowfall rate (SFR) product is running operationally at NOAA/NESDIS. Based on the AMSU/MHS SFR, an ATMS SFR algorithm has also been developed. The algorithm performs retrieval in three steps: snowfall detection, retrieval of cloud properties, and estimation of snow particle terminal velocity and snowfall rate. The snowfall detection component utilizes principal component analysis and a logistic regression model. It employs a combination of temperature and water vapor sounding channels to detect the scattering signal from falling snow and derives the probability of snowfall. Cloud properties are retrieved using an inversion method with an iteration algorithm and a two-stream radiative transfer model. A method adopted to calculate snow particle terminal velocity. Finally, snowfall rate is computed by numerically solving a complex integral. The SFR products are being used mainly in two communities: hydrology and weather forecast. Global blended precipitation products traditionally do not include snowfall derived from satellites because such products were not available operationally in the past. The ATMS and AMSU/MHS SFR now provide the winter precipitation information for these blended precipitation products. Weather forecasters mainly rely on radar and station observations for snowfall forecast. The SFR products can fill in gaps where no conventional snowfall data are available to forecasters. The products can also be used to confirm radar and gauge snowfall data and increase forecasters' confidence in their prediction.
Forecasting Cause-Specific Mortality in Korea up to Year 2032
2016-01-01
Forecasting cause-specific mortality can help estimate the future burden of diseases and provide a clue for preventing diseases. Our objective was to forecast the mortality for causes of death in the future (2013-2032) based on the past trends (1983-2012) in Korea. The death data consisted of 12 major causes of death from 1983 to 2012 and the population data consisted of the observed and estimated populations (1983-2032) in Korea. The modified age-period-cohort model with an R-based program, nordpred software, was used to forecast future mortality. Although the age-standardized rates for the world standard population for both sexes are expected to decrease from 2008-2012 to 2028-2032 (males: -31.4%, females: -32.3%), the crude rates are expected to increase (males: 46.3%, females: 33.4%). The total number of deaths is also estimated to increase (males: 52.7%, females: 41.9%). Additionally, the largest contribution to the overall change in deaths was the change in the age structures. Several causes of death are projected to increase in both sexes (cancer, suicide, heart diseases, pneumonia and Alzheimer’s disease), while others are projected to decrease (cerebrovascular diseases, liver diseases, diabetes mellitus, traffic accidents, chronic lower respiratory diseases, and pulmonary tuberculosis). Cancer is expected to be the highest cause of death for both the 2008-2012 and 2028-2032 time periods in Korea. To reduce the disease burden, projections of the future cause-specific mortality should be used as fundamental data for developing public health policies. PMID:27478326
An Analysis on the Unemployment Rate in the Philippines: A Time Series Data Approach
NASA Astrophysics Data System (ADS)
Urrutia, J. D.; Tampis, R. L.; E Atienza, JB
2017-03-01
This study aims to formulate a mathematical model for forecasting and estimating unemployment rate in the Philippines. Also, factors which can predict the unemployment is to be determined among the considered variables namely Labor Force Rate, Population, Inflation Rate, Gross Domestic Product, and Gross National Income. Granger-causal relationship and integration among the dependent and independent variables are also examined using Pairwise Granger-causality test and Johansen Cointegration Test. The data used were acquired from the Philippine Statistics Authority, National Statistics Office, and Bangko Sentral ng Pilipinas. Following the Box-Jenkins method, the formulated model for forecasting the unemployment rate is SARIMA (6, 1, 5) × (0, 1, 1)4 with a coefficient of determination of 0.79. The actual values are 99 percent identical to the predicted values obtained through the model, and are 72 percent closely relative to the forecasted ones. According to the results of the regression analysis, Labor Force Rate and Population are the significant factors of unemployment rate. Among the independent variables, Population, GDP, and GNI showed to have a granger-causal relationship with unemployment. It is also found that there are at least four cointegrating relations between the dependent and independent variables.
Probabilistic short-term forecasting of eruption rate at Kīlauea Volcano using a physics-based model
NASA Astrophysics Data System (ADS)
Anderson, K. R.
2016-12-01
Deterministic models of volcanic eruptions yield predictions of future activity conditioned on uncertainty in the current state of the system. Physics-based eruption models are well-suited for deterministic forecasting as they can relate magma physics with a wide range of observations. Yet, physics-based eruption forecasting is strongly limited by an inadequate understanding of volcanic systems, and the need for eruption models to be computationally tractable. At Kīlauea Volcano, Hawaii, episodic depressurization-pressurization cycles of the magma system generate correlated, quasi-exponential variations in ground deformation and surface height of the active summit lava lake. Deflations are associated with reductions in eruption rate, or even brief eruptive pauses, and thus partly control lava flow advance rates and associated hazard. Because of the relatively well-understood nature of Kīlauea's shallow magma plumbing system, and because more than 600 of these events have been recorded to date, they offer a unique opportunity to refine a physics-based effusive eruption forecasting approach and apply it to lava eruption rates over short (hours to days) time periods. A simple physical model of the volcano ascribes observed data to temporary reductions in magma supply to an elastic reservoir filled with compressible magma. This model can be used to predict the evolution of an ongoing event, but because the mechanism that triggers events is unknown, event durations are modeled stochastically from previous observations. A Bayesian approach incorporates diverse data sets and prior information to simultaneously estimate uncertain model parameters and future states of the system. Forecasts take the form of probability distributions for eruption rate or cumulative erupted volume at some future time. Results demonstrate the significant uncertainties that still remain even for short-term eruption forecasting at a well-monitored volcano - but also the value of a physics-based, mixed deterministic-probabilistic eruption forecasting approach in reducing and quantifying these uncertainties.
Policy issues and data communications for NASA earth observation missions until 1985
NASA Technical Reports Server (NTRS)
Corte, A. B.; Warren, C. J.
1975-01-01
The series of LANDSAT sensors with the highest potential data rates of the missions were examined. An examination of LANDSAT imagery uses shows that relatively few require transmission of the full resolution data on a repetitive quasi real time basis. Accuracy of global crop size forecasting can possibly be improved through information derived from LANDSAT imagery. A current forecasting experiment uses the imagery for crop area estimation only, yield being derived from other data sources.
NASA Astrophysics Data System (ADS)
Khade, Vikram; Kurian, Jaison; Chang, Ping; Szunyogh, Istvan; Thyng, Kristen; Montuoro, Raffaele
2017-05-01
This paper demonstrates the potential of ocean ensemble forecasting in the Gulf of Mexico (GoM). The Bred Vector (BV) technique with one week rescaling frequency is implemented on a 9 km resolution version of the Regional Ocean Modelling System (ROMS). Numerical experiments are carried out by using the HYCOM analysis products to define the initial conditions and the lateral boundary conditions. The growth rates of the forecast uncertainty are estimated to be about 10% of initial amplitude per week. By carrying out ensemble forecast experiments with and without perturbed surface forcing, it is demonstrated that in the coastal regions accounting for uncertainties in the atmospheric forcing is more important than accounting for uncertainties in the ocean initial conditions. In the Loop Current region, the initial condition uncertainties, are the dominant source of the forecast uncertainty. The root-mean-square error of the Lagrangian track forecasts at the 15-day forecast lead time can be reduced by about 10 - 50 km using the ensemble mean Eulerian forecast of the oceanic flow for the computation of the tracks, instead of the single-initial-condition Eulerian forecast.
Impact of TRMM and SSM/I-derived Precipitation and Moisture Data on the GEOS Global Analysis
NASA Technical Reports Server (NTRS)
Hou, Arthur Y.; Zhang, Sara Q.; daSilva, Arlindo M.; Olson, William S.
1999-01-01
Current global analyses contain significant errors in primary hydrological fields such as precipitation, evaporation, and related cloud and moisture in the tropics. The Data Assimilation Office at NASA's Goddard Space Flight Center has been exploring the use of space-based rainfall and total precipitable water (TPW) estimates to constrain these hydrological parameters in the Goddard Earth Observing System (GEOS) global data assimilation system. We present results showing that assimilating the 6-hour averaged rain rates and TPW estimates from the Tropical Rainfall Measuring Mission (TRMM) and Special Sensor Microwave/Imager (SSM/I) instruments improves not only the precipitation and moisture estimates but also reduce state-dependent systematic errors in key climate parameters directly linked to convection such as the outgoing longwave radiation, clouds, and the large-scale circulation. The improved analysis also improves short-range forecasts beyond 1 day, but the impact is relatively modest compared with improvements in the time-averaged analysis. The study shows that, in the presence of biases and other errors of the forecast model, improving the short-range forecast is not necessarily prerequisite for improving the assimilation as a climate data set. The full impact of a given type of observation on the assimilated data set should not be measured solely in terms of forecast skills.
Prospective Tests of Southern California Earthquake Forecasts
NASA Astrophysics Data System (ADS)
Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.; Kagan, Y. Y.; Helmstetter, A.; Wiemer, S.; Field, N.
2004-12-01
We are testing earthquake forecast models prospectively using likelihood ratios. Several investigators have developed such models as part of the Southern California Earthquake Center's project called Regional Earthquake Likelihood Models (RELM). Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. Here we describe the testing procedure and present preliminary results. Forecasts are expressed as the yearly rate of earthquakes within pre-specified bins of longitude, latitude, magnitude, and focal mechanism parameters. We test models against each other in pairs, which requires that both forecasts in a pair be defined over the same set of bins. For this reason we specify a standard "menu" of bins and ground rules to guide forecasters in using common descriptions. One menu category includes five-year forecasts of magnitude 5.0 and larger. Contributors will be requested to submit forecasts in the form of a vector of yearly earthquake rates on a 0.1 degree grid at the beginning of the test. Focal mechanism forecasts, when available, are also archived and used in the tests. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.1 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. Tests are based on the log likelihood scores derived from the probability that future earthquakes would occur where they do if a given forecast were true [Kagan and Jackson, J. Geophys. Res.,100, 3,943-3,959, 1995]. For each pair of forecasts, we compute alpha, the probability that the first would be wrongly rejected in favor of the second, and beta, the probability that the second would be wrongly rejected in favor of the first. Computing alpha and beta requires knowing the theoretical distribution of likelihood scores under each hypothesis, which we estimate by simulations. In this scheme, each forecast is given equal status; there is no "null hypothesis" which would be accepted by default. Forecasts and test results will be archived and posted on the RELM web site. Major problems under discussion include how to treat aftershocks, which clearly violate the variable-rate Poissonian hypotheses that we employ, and how to deal with the temporal variations in catalog completeness that follow large earthquakes.
Wildland fire probabilities estimated from weather model-deduced monthly mean fire danger indices
Haiganoush K. Preisler; Shyh-Chin Chen; Francis Fujioka; John W. Benoit; Anthony L. Westerling
2008-01-01
The National Fire Danger Rating System indices deduced from a regional simulation weather model were used to estimate probabilities and numbers of large fire events on monthly and 1-degree grid scales. The weather model simulations and forecasts are ongoing experimental products from the Experimental Climate Prediction Center at the Scripps Institution of Oceanography...
Diabatic heating rate estimates from European Centre for Medium-Range Weather Forecasts analyses
NASA Technical Reports Server (NTRS)
Christy, John R.
1991-01-01
Vertically integrated diabatic heating rate estimates (H) calculated from 32 months of European Center for Medium-Range Weather Forecasts daily analyses (May 1985-December 1987) are determined as residuals of the thermodynamic equation in pressure coordinates. Values for global, hemispheric, zonal, and grid point H are given as they vary over the time period examined. The distribution of H is compared with previous results and with outgoing longwave radiation (OLR) measurements. The most significant negative correlations between H and OLR occur for (1) tropical and Northern-Hemisphere mid-latitude oceanic areas and (2) zonal and hemispheric mean values for periods less than 90 days. Largest positive correlations are seen in periods greater than 90 days for the Northern Hemispheric mean and continental areas of North Africa, North America, northern Asia, and Antarctica. The physical basis for these relationships is discussed. An interyear comparison between 1986 and 1987 reveals the ENSO signal.
Interevent times in a new alarm-based earthquake forecasting model
NASA Astrophysics Data System (ADS)
Talbi, Abdelhak; Nanjo, Kazuyoshi; Zhuang, Jiancang; Satake, Kenji; Hamdache, Mohamed
2013-09-01
This study introduces a new earthquake forecasting model that uses the moment ratio (MR) of the first to second order moments of earthquake interevent times as a precursory alarm index to forecast large earthquake events. This MR model is based on the idea that the MR is associated with anomalous long-term changes in background seismicity prior to large earthquake events. In a given region, the MR statistic is defined as the inverse of the index of dispersion or Fano factor, with MR values (or scores) providing a biased estimate of the relative regional frequency of background events, here termed the background fraction. To test the forecasting performance of this proposed MR model, a composite Japan-wide earthquake catalogue for the years between 679 and 2012 was compiled using the Japan Meteorological Agency catalogue for the period between 1923 and 2012, and the Utsu historical seismicity records between 679 and 1922. MR values were estimated by sampling interevent times from events with magnitude M ≥ 6 using an earthquake random sampling (ERS) algorithm developed during previous research. Three retrospective tests of M ≥ 7 target earthquakes were undertaken to evaluate the long-, intermediate- and short-term performance of MR forecasting, using mainly Molchan diagrams and optimal spatial maps obtained by minimizing forecasting error defined by miss and alarm rate addition. This testing indicates that the MR forecasting technique performs well at long-, intermediate- and short-term. The MR maps produced during long-term testing indicate significant alarm levels before 15 of the 18 shallow earthquakes within the testing region during the past two decades, with an alarm region covering about 20 per cent (alarm rate) of the testing region. The number of shallow events missed by forecasting was reduced by about 60 per cent after using the MR method instead of the relative intensity (RI) forecasting method. At short term, our model succeeded in forecasting the occurrence region of the 2011 Mw 9.0 Tohoku earthquake, whereas the RI method did not. Cases where a period of quiescent seismicity occurred before the target event often lead to low MR scores, meaning that the target event was not predicted and indicating that our model could be further improved by taking into account quiescent periods in the alarm strategy.
The Economic Value of Air Quality Forecasting
NASA Astrophysics Data System (ADS)
Anderson-Sumo, Tasha
Both long-term and daily air quality forecasts provide an essential component to human health and impact costs. According the American Lung Association, the estimated current annual cost of air pollution related illness in the United States, adjusted for inflation (3% per year), is approximately $152 billion. Many of the risks such as hospital visits and morality are associated with poor air quality days (where the Air Quality Index is greater than 100). Groups such as sensitive groups become more susceptible to the resulting conditions and more accurate forecasts would help to take more appropriate precautions. This research focuses on evaluating the utility of air quality forecasting in terms of its potential impacts by building on air quality forecasting and economical metrics. Our analysis includes data collected during the summertime ozone seasons between 2010 and 2012 from air quality models for the Washington, DC/Baltimore, MD region. The metrics that are relevant to our analysis include: (1) The number of times that a high ozone or particulate matter (PM) episode is correctly forecasted, (2) the number of times that high ozone or PM episode is forecasted when it does not occur and (3) the number of times when the air quality forecast predicts a cleaner air episode when the air was observed to have high ozone or PM. Our collection of data included available air quality model forecasts of ozone and particulate matter data from the U.S. Environmental Protection Agency (EPA)'s AIRNOW as well as observational data of ozone and particulate matter from Clean Air Partners. We evaluated the performance of the air quality forecasts with that of the observational data and found that the forecast models perform well for the Baltimore/Washington region and the time interval observed. We estimate the potential amount for the Baltimore/Washington region accrues to a savings of up to 5,905 lives and 5.9 billion dollars per year. This total assumes perfect compliance with bad air quality warning and forecast air quality forecasts. There is a difficulty presented with evaluating the economic utility of the forecasts. All may not comply and even with a low compliance rate of 5% and 72% as the average probability of detection of poor air quality days by the air quality models, we estimate that the forecasting program saves 412 lives or 412 million dollars per year for the region. The totals we found are great or greater than other typical yearly meteorological hazard programs such as tornado or hurricane forecasting and it is clear that the economic value of air quality forecasting in the Baltimore/Washington region is vital.
NASA Astrophysics Data System (ADS)
Sokol, Zbyněk; Mejsnar, Jan; Pop, Lukáš; Bližňák, Vojtěch
2017-09-01
A new method for the probabilistic nowcasting of instantaneous rain rates (ENS) based on the ensemble technique and extrapolation along Lagrangian trajectories of current radar reflectivity is presented. Assuming inaccurate forecasts of the trajectories, an ensemble of precipitation forecasts is calculated and used to estimate the probability that rain rates will exceed a given threshold in a given grid point. Although the extrapolation neglects the growth and decay of precipitation, their impact on the probability forecast is taken into account by the calibration of forecasts using the reliability component of the Brier score (BS). ENS forecasts the probability that the rain rates will exceed thresholds of 0.1, 1.0 and 3.0 mm/h in squares of 3 km by 3 km. The lead times were up to 60 min, and the forecast accuracy was measured by the BS. The ENS forecasts were compared with two other methods: combined method (COM) and neighbourhood method (NEI). NEI considered the extrapolated values in the square neighbourhood of 5 by 5 grid points of the point of interest as ensemble members, and the COM ensemble was comprised of united ensemble members of ENS and NEI. The results showed that the calibration technique significantly improves bias of the probability forecasts by including additional uncertainties that correspond to neglected processes during the extrapolation. In addition, the calibration can also be used for finding the limits of maximum lead times for which the forecasting method is useful. We found that ENS is useful for lead times up to 60 min for thresholds of 0.1 and 1 mm/h and approximately 30 to 40 min for a threshold of 3 mm/h. We also found that a reasonable size of the ensemble is 100 members, which provided better scores than ensembles with 10, 25 and 50 members. In terms of the BS, the best results were obtained by ENS and COM, which are comparable. However, ENS is better calibrated and thus preferable.
A Bayesian Assessment of Seismic Semi-Periodicity Forecasts
NASA Astrophysics Data System (ADS)
Nava, F.; Quinteros, C.; Glowacka, E.; Frez, J.
2016-01-01
Among the schemes for earthquake forecasting, the search for semi-periodicity during large earthquakes in a given seismogenic region plays an important role. When considering earthquake forecasts based on semi-periodic sequence identification, the Bayesian formalism is a useful tool for: (1) assessing how well a given earthquake satisfies a previously made forecast; (2) re-evaluating the semi-periodic sequence probability; and (3) testing other prior estimations of the sequence probability. A comparison of Bayesian estimates with updated estimates of semi-periodic sequences that incorporate new data not used in the original estimates shows extremely good agreement, indicating that: (1) the probability that a semi-periodic sequence is not due to chance is an appropriate estimate for the prior sequence probability estimate; and (2) the Bayesian formalism does a very good job of estimating corrected semi-periodicity probabilities, using slightly less data than that used for updated estimates. The Bayesian approach is exemplified explicitly by its application to the Parkfield semi-periodic forecast, and results are given for its application to other forecasts in Japan and Venezuela.
NASA Astrophysics Data System (ADS)
Gass, S. I.
1982-05-01
The theoretical and applied state of the art of oil and gas supply models was discussed. The following areas were addressed: the realities of oil and gas supply, prediction of oil and gas production, problems in oil and gas modeling, resource appraisal procedures, forecasting field size and production, investment and production strategies, estimating cost and production schedules for undiscovered fields, production regulations, resource data, sensitivity analysis of forecasts, econometric analysis of resource depletion, oil and gas finding rates, and various models of oil and gas supply.
NASA Astrophysics Data System (ADS)
Webley, P. W.; Dehn, J.; Mastin, L. G.; Steensen, T. S.
2011-12-01
Volcanic ash plumes and the dispersing clouds into the atmosphere are a hazard for local populations as well as for the aviation industry. Volcanic ash transport and dispersion (VATD) models, used to forecast the movement of these hazardous ash emissions, require eruption source parameters (ESP) such as plume height, eruption rate and duration. To estimate mass eruption rate, empirical relationships with observed plume height have been applied. Theoretical relationships defined by Morton et al. (1956) and Wilson et al. (1976) use default values for the environmental lapse rate (ELR), thermal efficiency, density of ash, specific heat capacity, initial temperature of the erupted material and final temperature of the material. Each volcano, based on its magma type, has a different density, specific heat capacity and initial eruptive temperature compared to these default parameters, and local atmospheric conditions can produce a very different ELR. Our research shows that a relationship between plume height and mass eruption rate can be defined for each eruptive event for each volcano. Additionally, using the one-dimensional modeling program, Plumeria, our analysis assesses the importance of factors such as vent diameter and eruption velocity on the relationship between the eruption rate and measured plume height. Coupling such a tool with a VATD model should improve pre-eruptive forecasts of ash emissions downwind and lead to improvements in ESP data that VATD models use for operational volcanic ash cloud forecasting.
A research model--forecasting incident rates from optimized safety program intervention strategies.
Iyer, P S; Haight, J M; Del Castillo, E; Tink, B W; Hawkins, P W
2005-01-01
INTRODUCTION/PROBLEM: Property damage incidents, workplace injuries, and safety programs designed to prevent them, are expensive aspects of doing business in contemporary industry. The National Safety Council (2002) estimated that workplace injuries cost $146.6 billion per year. Because companies are resource limited, optimizing intervention strategies to decrease incidents with less costly programs can contribute to improved productivity. Systematic data collection methods were employed and the forecasting ability of a time-lag relationship between interventions and incident rates was studied using various statistical methods (an intervention is not expected to have an immediate nor infinitely lasting effect on the incident rate). As a follow up to the initial work, researchers developed two models designed to forecast incident rates. One is based on past incident rate performance and the other on the configuration and level of effort applied to the safety and health program. Researchers compared actual incident performance to the prediction capability of each model over 18 months in the forestry operations at an electricity distribution company and found the models to allow accurate prediction of incident rates. These models potentially have powerful implications as a business-planning tool for human resource allocation and for designing an optimized safety and health intervention program to minimize incidents. Depending on the mathematical relationship, one can determine what interventions, where and how much to apply them, and when to increase or reduce human resource input as determined by the forecasted performance.
Performance of Trajectory Models with Wind Uncertainty
NASA Technical Reports Server (NTRS)
Lee, Alan G.; Weygandt, Stephen S.; Schwartz, Barry; Murphy, James R.
2009-01-01
Typical aircraft trajectory predictors use wind forecasts but do not account for the forecast uncertainty. A method for generating estimates of wind prediction uncertainty is described and its effect on aircraft trajectory prediction uncertainty is investigated. The procedure for estimating the wind prediction uncertainty relies uses a time-lagged ensemble of weather model forecasts from the hourly updated Rapid Update Cycle (RUC) weather prediction system. Forecast uncertainty is estimated using measures of the spread amongst various RUC time-lagged ensemble forecasts. This proof of concept study illustrates the estimated uncertainty and the actual wind errors, and documents the validity of the assumed ensemble-forecast accuracy relationship. Aircraft trajectory predictions are made using RUC winds with provision for the estimated uncertainty. Results for a set of simulated flights indicate this simple approach effectively translates the wind uncertainty estimate into an aircraft trajectory uncertainty. A key strength of the method is the ability to relate uncertainty to specific weather phenomena (contained in the various ensemble members) allowing identification of regional variations in uncertainty.
Stock price prediction using geometric Brownian motion
NASA Astrophysics Data System (ADS)
Farida Agustini, W.; Restu Affianti, Ika; Putri, Endah RM
2018-03-01
Geometric Brownian motion is a mathematical model for predicting the future price of stock. The phase that done before stock price prediction is determine stock expected price formulation and determine the confidence level of 95%. On stock price prediction using geometric Brownian Motion model, the algorithm starts from calculating the value of return, followed by estimating value of volatility and drift, obtain the stock price forecast, calculating the forecast MAPE, calculating the stock expected price and calculating the confidence level of 95%. Based on the research, the output analysis shows that geometric Brownian motion model is the prediction technique with high rate of accuracy. It is proven with forecast MAPE value ≤ 20%.
ERIC Educational Resources Information Center
Cuadra, Ernesto; Crouch, Luis
Student promotion, repetition, and dropout rates constitute the basic data needed to forecast future enrollment and new resources. Information on student flow is significantly related to policy formulation aimed at improving internal efficiency, because dropping out and grade repetition increase per pupil cost, block access to eligible school-age…
NASA Astrophysics Data System (ADS)
Boichu, Marie; Clarisse, Lieven; Khvorostyanov, Dmitry; Clerbaux, Cathy
2014-04-01
Forecasting the dispersal of volcanic clouds during an eruption is of primary importance, especially for ensuring aviation safety. As volcanic emissions are characterized by rapid variations of emission rate and height, the (generally) high level of uncertainty in the emission parameters represents a critical issue that limits the robustness of volcanic cloud dispersal forecasts. An inverse modeling scheme, combining satellite observations of the volcanic cloud with a regional chemistry-transport model, allows reconstructing this source term at high temporal resolution. We demonstrate here how a progressive assimilation of freshly acquired satellite observations, via such an inverse modeling procedure, allows for delivering robust sulfur dioxide (SO2) cloud dispersal forecasts during the eruption. This approach provides a computationally cheap estimate of the expected location and mass loading of volcanic clouds, including the identification of SO2-rich parts.
NASA Astrophysics Data System (ADS)
Verkade, J. S.; Brown, J. D.; Davids, F.; Reggiani, P.; Weerts, A. H.
2017-12-01
Two statistical post-processing approaches for estimation of predictive hydrological uncertainty are compared: (i) 'dressing' of a deterministic forecast by adding a single, combined estimate of both hydrological and meteorological uncertainty and (ii) 'dressing' of an ensemble streamflow forecast by adding an estimate of hydrological uncertainty to each individual streamflow ensemble member. Both approaches aim to produce an estimate of the 'total uncertainty' that captures both the meteorological and hydrological uncertainties. They differ in the degree to which they make use of statistical post-processing techniques. In the 'lumped' approach, both sources of uncertainty are lumped by post-processing deterministic forecasts using their verifying observations. In the 'source-specific' approach, the meteorological uncertainties are estimated by an ensemble of weather forecasts. These ensemble members are routed through a hydrological model and a realization of the probability distribution of hydrological uncertainties (only) is then added to each ensemble member to arrive at an estimate of the total uncertainty. The techniques are applied to one location in the Meuse basin and three locations in the Rhine basin. Resulting forecasts are assessed for their reliability and sharpness, as well as compared in terms of multiple verification scores including the relative mean error, Brier Skill Score, Mean Continuous Ranked Probability Skill Score, Relative Operating Characteristic Score and Relative Economic Value. The dressed deterministic forecasts are generally more reliable than the dressed ensemble forecasts, but the latter are sharper. On balance, however, they show similar quality across a range of verification metrics, with the dressed ensembles coming out slightly better. Some additional analyses are suggested. Notably, these include statistical post-processing of the meteorological forecasts in order to increase their reliability, thus increasing the reliability of the streamflow forecasts produced with ensemble meteorological forcings.
Statistical earthquake focal mechanism forecasts
NASA Astrophysics Data System (ADS)
Kagan, Yan Y.; Jackson, David D.
2014-04-01
Forecasts of the focal mechanisms of future shallow (depth 0-70 km) earthquakes are important for seismic hazard estimates and Coulomb stress, and other models of earthquake occurrence. Here we report on a high-resolution global forecast of earthquake rate density as a function of location, magnitude and focal mechanism. In previous publications we reported forecasts of 0.5° spatial resolution, covering the latitude range from -75° to +75°, based on the Global Central Moment Tensor earthquake catalogue. In the new forecasts we have improved the spatial resolution to 0.1° and the latitude range from pole to pole. Our focal mechanism estimates require distance-weighted combinations of observed focal mechanisms within 1000 km of each gridpoint. Simultaneously, we calculate an average rotation angle between the forecasted mechanism and all the surrounding mechanisms, using the method of Kagan & Jackson proposed in 1994. This average angle reveals the level of tectonic complexity of a region and indicates the accuracy of the prediction. The procedure becomes problematical where longitude lines are not approximately parallel, and where shallow earthquakes are so sparse that an adequate sample spans very large distances. North or south of 75°, the azimuths of points 1000 km away may vary by about 35°. We solved this problem by calculating focal mechanisms on a plane tangent to the Earth's surface at each forecast point, correcting for the rotation of the longitude lines at the locations of earthquakes included in the averaging. The corrections are negligible between -30° and +30° latitude, but outside that band uncorrected rotations can be significantly off. Improved forecasts at 0.5° and 0.1° resolution are posted at http://eq.ess.ucla.edu/kagan/glob_gcmt_index.html.
Load Forecasting in Electric Utility Integrated Resource Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carvallo, Juan Pablo; Larsen, Peter H.; Sanstad, Alan H
Integrated resource planning (IRP) is a process used by many vertically-integrated U.S. electric utilities to determine least-cost/risk supply and demand-side resources that meet government policy objectives and future obligations to customers and, in many cases, shareholders. Forecasts of energy and peak demand are a critical component of the IRP process. There have been few, if any, quantitative studies of IRP long-run (planning horizons of two decades) load forecast performance and its relationship to resource planning and actual procurement decisions. In this paper, we evaluate load forecasting methods, assumptions, and outcomes for 12 Western U.S. utilities by examining and comparing plansmore » filed in the early 2000s against recent plans, up to year 2014. We find a convergence in the methods and data sources used. We also find that forecasts in more recent IRPs generally took account of new information, but that there continued to be a systematic over-estimation of load growth rates during the period studied. We compare planned and procured resource expansion against customer load and year-to-year load growth rates, but do not find a direct relationship. Load sensitivities performed in resource plans do not appear to be related to later procurement strategies even in the presence of large forecast errors. These findings suggest that resource procurement decisions may be driven by other factors than customer load growth. Our results have important implications for the integrated resource planning process, namely that load forecast accuracy may not be as important for resource procurement as is generally believed, that load forecast sensitivities could be used to improve the procurement process, and that management of load uncertainty should be prioritized over more complex forecasting techniques.« less
Paul, Susannah; Mgbere, Osaro; Arafat, Raouf; Yang, Biru; Santos, Eunice
2017-01-01
Objective The objective was to forecast and validate prediction estimates of influenza activity in Houston, TX using four years of historical influenza-like illness (ILI) from three surveillance data capture mechanisms. Background Using novel surveillance methods and historical data to estimate future trends of influenza-like illness can lead to early detection of influenza activity increases and decreases. Anticipating surges gives public health professionals more time to prepare and increase prevention efforts. Methods Data was obtained from three surveillance systems, Flu Near You, ILINet, and hospital emergency center (EC) visits, with diverse data capture mechanisms. Autoregressive integrated moving average (ARIMA) models were fitted to data from each source for week 27 of 2012 through week 26 of 2016 and used to forecast influenza-like activity for the subsequent 10 weeks. Estimates were then compared to actual ILI percentages for the same period. Results Forecasted estimates had wide confidence intervals that crossed zero. The forecasted trend direction differed by data source, resulting in lack of consensus about future influenza activity. ILINet forecasted estimates and actual percentages had the least differences. ILINet performed best when forecasting influenza activity in Houston, TX. Conclusion Though the three forecasted estimates did not agree on the trend directions, and thus, were considered imprecise predictors of long-term ILI activity based on existing data, pooling predictions and careful interpretations may be helpful for short term intervention efforts. Further work is needed to improve forecast accuracy considering the promise forecasting holds for seasonal influenza prevention and control, and pandemic preparedness.
NASA Astrophysics Data System (ADS)
Boué, A.; Lesage, P.; Cortés, G.; Valette, B.; Reyes-Dávila, G.; Arámbula-Mendoza, R.; Budi-Santoso, A.
2016-11-01
Most attempts of deterministic eruption forecasting are based on the material Failure Forecast Method (FFM). This method assumes that a precursory observable, such as the rate of seismic activity, can be described by a simple power law which presents a singularity at a time close to the eruption onset. Until now, this method has been applied only in a small number of cases, generally for forecasts in hindsight. In this paper, a rigorous Bayesian approach of the FFM designed for real-time applications is applied. Using an automatic recognition system, seismo-volcanic events are detected and classified according to their physical mechanism and time series of probability distributions of the rates of events are calculated. At each time of observation, a Bayesian inversion provides estimations of the exponent of the power law and of the time of eruption, together with their probability density functions. Two criteria are defined in order to evaluate the quality and reliability of the forecasts. Our automated procedure has allowed the analysis of long, continuous seismic time series: 13 years from Volcán de Colima, Mexico, 10 years from Piton de la Fournaise, Reunion Island, France, and several months from Merapi volcano, Java, Indonesia. The new forecasting approach has been applied to 64 pre-eruptive sequences which present various types of dominant seismic activity (volcano-tectonic or long-period events) and patterns of seismicity with different level of complexity. This has allowed us to test the FFM assumptions, to determine in which conditions the method can be applied, and to quantify the success rate of the forecasts. 62% of the precursory sequences analysed are suitable for the application of FFM and half of the total number of eruptions are successfully forecast in hindsight. In real-time, the method allows for the successful forecast of 36% of all the eruptions considered. Nevertheless, real-time forecasts are successful for 83% of the cases that fulfil the reliability criteria. Therefore, good confidence on the method is obtained when the reliability criteria are met.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gagnon, Pieter; Barbose, Galen L.; Stoll, Brady
Misforecasting the adoption of customer-owned distributed photovoltaics (DPV) can have operational and financial implications for utilities; forecasting capabilities can be improved, but generally at a cost. This paper informs this decision-space by using a suite of models to explore the capacity expansion and operation of the Western Interconnection over a 15-year period across a wide range of DPV growth rates and misforecast severities. The system costs under a misforecast are compared against the costs under a perfect forecast, to quantify the costs of misforecasting. Using a simplified probabilistic method applied to these modeling results, an analyst can make a first-ordermore » estimate of the financial benefit of improving a utility’s forecasting capabilities, and thus be better informed about whether to make such an investment. For example, under our base assumptions, a utility with 10 TWh per year of retail electric sales who initially estimates that DPV growth could range from 2% to 7.5% of total generation over the next 15 years could expect total present-value savings of approximately $4 million if they could reduce the severity of misforecasting to within ±25%. Utility resource planners can compare those savings against the costs needed to achieve that level of precision, to guide their decision on whether to make an investment in tools or resources.« less
Morabito, Marco; Pavlinic, Daniela Z; Crisci, Alfonso; Capecchi, Valerio; Orlandini, Simone; Mekjavic, Igor B
2011-07-01
Military and civil defense personnel are often involved in complex activities in a variety of outdoor environments. The choice of appropriate clothing ensembles represents an important strategy to establish the success of a military mission. The main aim of this study was to compare the known clothing insulation of the garment ensembles worn by soldiers during two winter outdoor field trials (hike and guard duty) with the estimated optimal clothing thermal insulations recommended to maintain thermoneutrality, assessed by using two different biometeorological procedures. The overall aim was to assess the applicability of such biometeorological procedures to weather forecast systems, thereby developing a comprehensive biometeorological tool for military operational forecast purposes. Military trials were carried out during winter 2006 in Pokljuka (Slovenia) by Slovene Armed Forces personnel. Gastrointestinal temperature, heart rate and environmental parameters were measured with portable data acquisition systems. The thermal characteristics of the clothing ensembles worn by the soldiers, namely thermal resistance, were determined with a sweating thermal manikin. Results showed that the clothing ensemble worn by the military was appropriate during guard duty but generally inappropriate during the hike. A general under-estimation of the biometeorological forecast model in predicting the optimal clothing insulation value was observed and an additional post-processing calibration might further improve forecast accuracy. This study represents the first step in the development of a comprehensive personalized biometeorological forecast system aimed at improving recommendations regarding the optimal thermal insulation of military garment ensembles for winter activities.
NASA Astrophysics Data System (ADS)
Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Todini, Ezio
2015-04-01
The negative effects of severe flood events are usually contrasted through structural measures that, however, do not fully eliminate flood risk. Non-structural measures, such as real-time flood forecasting and warning, are also required. Accurate stage/discharge future predictions with appropriate forecast lead-time are sought by decision-makers for implementing strategies to mitigate the adverse effects of floods. Traditionally, flood forecasting has been approached by using rainfall-runoff and/or flood routing modelling. Indeed, both types of forecasts, cannot be considered perfectly representing future outcomes because of lacking of a complete knowledge of involved processes (Todini, 2004). Nonetheless, although aware that model forecasts are not perfectly representing future outcomes, decision makers are de facto implicitly assuming the forecast of water level/discharge/volume, etc. as "deterministic" and coinciding with what is going to occur. Recently the concept of Predictive Uncertainty (PU) was introduced in hydrology (Krzysztofowicz, 1999), and several uncertainty processors were developed (Todini, 2008). PU is defined as the probability of occurrence of the future realization of a predictand (water level/discharge/volume) conditional on: i) prior observations and knowledge, ii) the available information obtained on the future value, typically provided by one or more forecast models. Unfortunately, PU has been frequently interpreted as a measure of lack of accuracy rather than the appropriate tool allowing to take the most appropriate decisions, given a model or several models' forecasts. With the aim to shed light on the benefits for appropriately using PU, a multi-temporal approach based on the MCP approach (Todini, 2008; Coccia and Todini, 2011) is here applied to stage forecasts at sites along the Upper Tiber River. Specifically, the STAge Forecasting-Rating Curve Model Muskingum-based (STAFOM-RCM) (Barbetta et al., 2014) along with the Rating-Curve Model in Real Time (RCM-RT) (Barbetta and Moramarco, 2014) are used to this end. Both models without considering rainfall information explicitly considers, at each time of forecast, the estimate of lateral contribution along the river reach for which the stage forecast is performed at downstream end. The analysis is performed for several reaches using different lead times according to the channel length. Barbetta, S., Moramarco, T., Brocca, L., Franchini, M. and Melone, F. 2014. Confidence interval of real-time forecast stages provided by the STAFOM-RCM model: the case study of the Tiber River (Italy). Hydrological Processes, 28(3),729-743. Barbetta, S. and Moramarco, T. 2014. Real-time flood forecasting by relating local stage and remote discharge. Hydrological Sciences Journal, 59(9 ), 1656-1674. Coccia, G. and Todini, E. 2011. Recent developments in predictive uncertainty assessment based on the Model Conditional Processor approach. Hydrology and Earth System Sciences, 15, 3253-3274. doi:10.5194/hess-15-3253-2011. Krzysztofowicz, R. 1999. Bayesian theory of probabilistic forecasting via deterministic hydrologic model, Water Resour. Res., 35, 2739-2750. Todini, E. 2004. Role and treatment of uncertainty in real-time flood forecasting. Hydrological Processes 18(14), 2743_2746. Todini, E. 2008. A model conditional processor to assess predictive uncertainty in flood forecasting. Intl. J. River Basin Management, 6(2): 123-137.
Short-term forecasts gain in accuracy. [Regression technique using ''Box-Jenkins'' analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Box-Jenkins time-series models offer accuracy for short-term forecasts that compare with large-scale macroeconomic forecasts. Utilities need to be able to forecast peak demand in order to plan their generating, transmitting, and distribution systems. This new method differs from conventional models by not assuming specific data patterns, but by fitting available data into a tentative pattern on the basis of auto-correlations. Three types of models (autoregressive, moving average, or mixed autoregressive/moving average) can be used according to which provides the most appropriate combination of autocorrelations and related derivatives. Major steps in choosing a model are identifying potential models, estimating the parametersmore » of the problem, and running a diagnostic check to see if the model fits the parameters. The Box-Jenkins technique is well suited for seasonal patterns, which makes it possible to have as short as hourly forecasts of load demand. With accuracy up to two years, the method will allow electricity price-elasticity forecasting that can be applied to facility planning and rate design. (DCK)« less
Sufficient Forecasting Using Factor Models
Fan, Jianqing; Xue, Lingzhou; Yao, Jiawei
2017-01-01
We consider forecasting a single time series when there is a large number of predictors and a possible nonlinear effect. The dimensionality was first reduced via a high-dimensional (approximate) factor model implemented by the principal component analysis. Using the extracted factors, we develop a novel forecasting method called the sufficient forecasting, which provides a set of sufficient predictive indices, inferred from high-dimensional predictors, to deliver additional predictive power. The projected principal component analysis will be employed to enhance the accuracy of inferred factors when a semi-parametric (approximate) factor model is assumed. Our method is also applicable to cross-sectional sufficient regression using extracted factors. The connection between the sufficient forecasting and the deep learning architecture is explicitly stated. The sufficient forecasting correctly estimates projection indices of the underlying factors even in the presence of a nonparametric forecasting function. The proposed method extends the sufficient dimension reduction to high-dimensional regimes by condensing the cross-sectional information through factor models. We derive asymptotic properties for the estimate of the central subspace spanned by these projection directions as well as the estimates of the sufficient predictive indices. We further show that the natural method of running multiple regression of target on estimated factors yields a linear estimate that actually falls into this central subspace. Our method and theory allow the number of predictors to be larger than the number of observations. We finally demonstrate that the sufficient forecasting improves upon the linear forecasting in both simulation studies and an empirical study of forecasting macroeconomic variables. PMID:29731537
Kelly, Scott P; Anderson, William F; Rosenberg, Philip S; Cook, Michael B
2017-11-18
Metastatic prostate cancer (PCA) remains a highly lethal malignancy in the USA. As prostate-specific antigen testing declines nationally, detailed assessment of current age- and race-specific incidence trends and quantitative forecasts are needed. To evaluate the current trends of metastatic PCA by age and race, and forecast the number of new cases (annual burden) and future trends. We derived incidence data for men aged ≥45 yr who were diagnosed with metastatic PCA from the population-based Surveillance, Epidemiology, and End Results registries. We examined the current trends of metastatic PCA from 2004 to 2014, and forecast the annual burden and incidence rates by age and race for 2015-2025, using age-period-cohort models and population projections. We also examined alternative forecasts (2012-2025) using trends prior to the revised screening guidelines issued in 2012. Metastatic PCA, steadily declining from 2004 to 2007 by 1.45%/yr, began to increase by 0.58%/yr after 2008, which accelerated to 2.74%/yr following the 2012 United States Preventive Services Task Force recommendations-a pattern that was magnified among men aged ≤69 yr and white men. Forecasts project the incidence to increase by 1.03%/yr through 2025, with men aged 45-54 yr (2.29%/yr) and 55-69 yr (1.53%/yr) increasing more rapidly. Meanwhile, the annual burden is expected to increase 42% by 2025. Our forecasts estimated an additional 15 891 metastatic cases from 2015 to 2025 compared with alternative forecasts using trends prior to 2012. The recent uptick in metastatic PCA rates has resulted in forecasts that project increasing rates through 2025, particularly among men aged ≤69 yr. Moreover, racial disparities are expected to persist and the annual burden will increase considerably. The impact of the prior and current PCA screening recommendations on metastatic PCA rates requires continued examination. In this report, we assessed how the incidence of metastatic prostate cancer has changed over recent years, and forecast future incidence trends and the number of new cases expected each year. We found that the incidence of metastatic prostate cancer has been increasing more rapidly since 2012, resulting in a rise in both future incidence and the number of new cases by 2025. Future incidence rates and the number of new cases were reduced in alternative forecasts using data prior to the 2012 United States Preventive Services Task Force (USPSTF) recommendations against prostate-specific antigen (PSA) testing for prostate cancer. There is a need for additional research that examines whether national declines in PSA testing contributed to increases in rates of metastatic disease. The incidence of metastatic disease in black men is still expected to occur at considerably higher rates compared with that in white men. Published by Elsevier B.V.
Forecasting extinction risk with nonstationary matrix models.
Gotelli, Nicholas J; Ellison, Aaron M
2006-02-01
Matrix population growth models are standard tools for forecasting population change and for managing rare species, but they are less useful for predicting extinction risk in the face of changing environmental conditions. Deterministic models provide point estimates of lambda, the finite rate of increase, as well as measures of matrix sensitivity and elasticity. Stationary matrix models can be used to estimate extinction risk in a variable environment, but they assume that the matrix elements are randomly sampled from a stationary (i.e., non-changing) distribution. Here we outline a method for using nonstationary matrix models to construct realistic forecasts of population fluctuation in changing environments. Our method requires three pieces of data: (1) field estimates of transition matrix elements, (2) experimental data on the demographic responses of populations to altered environmental conditions, and (3) forecasting data on environmental drivers. These three pieces of data are combined to generate a series of sequential transition matrices that emulate a pattern of long-term change in environmental drivers. Realistic estimates of population persistence and extinction risk can be derived from stochastic permutations of such a model. We illustrate the steps of this analysis with data from two populations of Sarracenia purpurea growing in northern New England. Sarracenia purpurea is a perennial carnivorous plant that is potentially at risk of local extinction because of increased nitrogen deposition. Long-term monitoring records or models of environmental change can be used to generate time series of driver variables under different scenarios of changing environments. Both manipulative and natural experiments can be used to construct a linking function that describes how matrix parameters change as a function of the environmental driver. This synthetic modeling approach provides quantitative estimates of extinction probability that have an explicit mechanistic basis.
Testing hypotheses of earthquake occurrence
NASA Astrophysics Data System (ADS)
Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.
2003-12-01
We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would be also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter. For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of the second, and beta, the probability that the second would be wrongly rejected in favor of the first. Computing alpha and beta requires knowing the theoretical distribution of likelihood scores under each hypothesis, which we will estimate by simulations. Each forecast is given equal status; there is no "null hypothesis" which would be accepted by default. Forecasts and test results would be archived and posted on the RELM web site. The same methods can be applied to any region with adequate monitoring and sufficient earthquakes. If fewer than ten events are forecasted, the likelihood tests may not give definitive results. The tests do force certain requirements on the forecast models. Because the tests are based on absolute rates, stress models must be explicit about how stress increments affect past seismicity rates. Aftershocks of triggered events must be accounted for. Furthermore, the tests are sensitive to magnitude, so forecast models must specify the magnitude distribution of triggered events. Models should account for probable errors in magnitude and location by appropriate smoothing of the probabilities, as the tests will be "cold hearted:" near misses won't count.
NASA Astrophysics Data System (ADS)
Norbeck, J. H.; Rubinstein, J. L.
2018-04-01
The earthquake activity in Oklahoma and Kansas that began in 2008 reflects the most widespread instance of induced seismicity observed to date. We develop a reservoir model to calculate the hydrologic conditions associated with the activity of 902 saltwater disposal wells injecting into the Arbuckle aquifer. Estimates of basement fault stressing conditions inform a rate-and-state friction earthquake nucleation model to forecast the seismic response to injection. Our model replicates many salient features of the induced earthquake sequence, including the onset of seismicity, the timing of the peak seismicity rate, and the reduction in seismicity following decreased disposal activity. We present evidence for variable time lags between changes in injection and seismicity rates, consistent with the prediction from rate-and-state theory that seismicity rate transients occur over timescales inversely proportional to stressing rate. Given the efficacy of the hydromechanical model, as confirmed through a likelihood statistical test, the results of this study support broader integration of earthquake physics within seismic hazard analysis.
NASA Astrophysics Data System (ADS)
Wani, Omar; Beckers, Joost V. L.; Weerts, Albrecht H.; Solomatine, Dimitri P.
2017-08-01
A non-parametric method is applied to quantify residual uncertainty in hydrologic streamflow forecasting. This method acts as a post-processor on deterministic model forecasts and generates a residual uncertainty distribution. Based on instance-based learning, it uses a k nearest-neighbour search for similar historical hydrometeorological conditions to determine uncertainty intervals from a set of historical errors, i.e. discrepancies between past forecast and observation. The performance of this method is assessed using test cases of hydrologic forecasting in two UK rivers: the Severn and Brue. Forecasts in retrospect were made and their uncertainties were estimated using kNN resampling and two alternative uncertainty estimators: quantile regression (QR) and uncertainty estimation based on local errors and clustering (UNEEC). Results show that kNN uncertainty estimation produces accurate and narrow uncertainty intervals with good probability coverage. Analysis also shows that the performance of this technique depends on the choice of search space. Nevertheless, the accuracy and reliability of uncertainty intervals generated using kNN resampling are at least comparable to those produced by QR and UNEEC. It is concluded that kNN uncertainty estimation is an interesting alternative to other post-processors, like QR and UNEEC, for estimating forecast uncertainty. Apart from its concept being simple and well understood, an advantage of this method is that it is relatively easy to implement.
Application of SeaWinds Scatterometer and TMI-SSM/I Rain Rates to Hurricane Analysis and Forecasting
NASA Technical Reports Server (NTRS)
Atlas, Robert; Hou, Arthur; Reale, Oreste
2004-01-01
Results provided by two different assimilation methodologies involving data from passive and active space-borne microwave instruments are presented. The impact of the precipitation estimates produced by the TRMM Microwave Imager (TMI) and Special Sensor Microwave/Imager (SSM/I) in a previously developed 1D variational continuous assimilation algorithm for assimilating tropical rainfall is shown on two hurricane cases. Results on the impact of the SeaWinds scatterometer on the intensity and track forecast of a mid-Atlantic hurricane are also presented. This work is the outcome of a collaborative effort between NASA and NOAA and indicates the substantial improvement in tropical cyclone forecasting that can result from the assimilation of space-based data in global atmospheric models.
International Aftershock Forecasting: Lessons from the Gorkha Earthquake
NASA Astrophysics Data System (ADS)
Michael, A. J.; Blanpied, M. L.; Brady, S. R.; van der Elst, N.; Hardebeck, J.; Mayberry, G. C.; Page, M. T.; Smoczyk, G. M.; Wein, A. M.
2015-12-01
Following the M7.8 Gorhka, Nepal, earthquake of April 25, 2015 the USGS issued a series of aftershock forecasts. The initial impetus for these forecasts was a request from the USAID Office of US Foreign Disaster Assistance to support their Disaster Assistance Response Team (DART) which coordinated US Government disaster response, including search and rescue, with the Government of Nepal. Because of the possible utility of the forecasts to people in the region and other response teams, the USGS released these forecasts publicly through the USGS Earthquake Program web site. The initial forecast used the Reasenberg and Jones (Science, 1989) model with generic parameters developed for active deep continental regions based on the Garcia et al. (BSSA, 2012) tectonic regionalization. These were then updated to reflect a lower productivity and higher decay rate based on the observed aftershocks, although relying on teleseismic observations, with a high magnitude-of-completeness, limited the amount of data. After the 12 May M7.3 aftershock, the forecasts used an Epidemic Type Aftershock Sequence model to better characterize the multiple sources of earthquake clustering. This model provided better estimates of aftershock uncertainty. These forecast messages were crafted based on lessons learned from the Christchurch earthquake along with input from the U.S. Embassy staff in Kathmandu. Challenges included how to balance simple messaging with forecasts over a variety of time periods (week, month, and year), whether to characterize probabilities with words such as those suggested by the IPCC (IPCC, 2010), how to word the messages in a way that would translate accurately into Nepali and not alarm the public, and how to present the probabilities of unlikely but possible large and potentially damaging aftershocks, such as the M7.3 event, which had an estimated probability of only 1-in-200 for the week in which it occurred.
An earthquake rate forecast for Europe based on smoothed seismicity and smoothed fault contribution
NASA Astrophysics Data System (ADS)
Hiemer, Stefan; Woessner, Jochen; Basili, Roberto; Wiemer, Stefan
2013-04-01
The main objective of project SHARE (Seismic Hazard Harmonization in Europe) is to develop a community-based seismic hazard model for the Euro-Mediterranean region. The logic tree of earthquake rupture forecasts comprises several methodologies including smoothed seismicity approaches. Smoothed seismicity thus represents an alternative concept to express the degree of spatial stationarity of seismicity and provides results that are more objective, reproducible, and testable. Nonetheless, the smoothed-seismicity approach suffers from the common drawback of being generally based on earthquake catalogs alone, i.e. the wealth of knowledge from geology is completely ignored. We present a model that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults and subductions. The result is mainly driven by the data, being independent of subjective delineation of seismic source zones. The core parts of our model are two distinct location probability densities: The first is computed by smoothing past seismicity (using variable kernel smoothing to account for varying data density). The second is obtained by smoothing fault moment rate contributions. The fault moment rates are calculated by summing the moment rate of each fault patch on a fully parameterized and discretized fault as available from the SHARE fault database. We assume that the regional frequency-magnitude distribution of the entire study area is well known and estimate the a- and b-value of a truncated Gutenberg-Richter magnitude distribution based on a maximum likelihood approach that considers the spatial and temporal completeness history of the seismic catalog. The two location probability densities are linearly weighted as a function of magnitude assuming that (1) the occurrence of past seismicity is a good proxy to forecast occurrence of future seismicity and (2) future large-magnitude events occur more likely in the vicinity of known faults. Consequently, the underlying location density of our model depends on the magnitude. We scale the density with the estimated a-value in order to construct a forecast that specifies the earthquake rate in each longitude-latitude-magnitude bin. The model is intended to be one branch of SHARE's logic tree of rupture forecasts and provides rates of events in the magnitude range of 5 <= m <= 8.5 for the entire region of interest and is suitable for comparison with other long-term models in the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP).
NASA Astrophysics Data System (ADS)
Omi, Takahiro; Ogata, Yosihiko; Hirata, Yoshito; Aihara, Kazuyuki
2015-04-01
Because aftershock occurrences can cause significant seismic risks for a considerable time after the main shock, prospective forecasting of the intermediate-term aftershock activity as soon as possible is important. The epidemic-type aftershock sequence (ETAS) model with the maximum likelihood estimate effectively reproduces general aftershock activity including secondary or higher-order aftershocks and can be employed for the forecasting. However, because we cannot always expect the accurate parameter estimation from incomplete early aftershock data where many events are missing, such forecasting using only a single estimated parameter set (plug-in forecasting) can frequently perform poorly. Therefore, we here propose Bayesian forecasting that combines the forecasts by the ETAS model with various probable parameter sets given the data. By conducting forecasting tests of 1 month period aftershocks based on the first 1 day data after the main shock as an example of the early intermediate-term forecasting, we show that the Bayesian forecasting performs better than the plug-in forecasting on average in terms of the log-likelihood score. Furthermore, to improve forecasting of large aftershocks, we apply a nonparametric (NP) model using magnitude data during the learning period and compare its forecasting performance with that of the Gutenberg-Richter (G-R) formula. We show that the NP forecast performs better than the G-R formula in some cases but worse in other cases. Therefore, robust forecasting can be obtained by employing an ensemble forecast that combines the two complementary forecasts. Our proposed method is useful for a stable unbiased intermediate-term assessment of aftershock probabilities.
United States Geological Survey fire science: fire danger monitoring and forecasting
Eidenshink, Jeff C.; Howard, Stephen M.
2012-01-01
Each day, the U.S. Geological Survey produces 7-day forecasts for all Federal lands of the distributions of number of ignitions, number of fires above a given size, and conditional probabilities of fires growing larger than a specified size. The large fire probability map is an estimate of the likelihood that ignitions will become large fires. The large fire forecast map is a probability estimate of the number of fires on federal lands exceeding 100 acres in the forthcoming week. The ignition forecast map is a probability estimate of the number of fires on Federal land greater than 1 acre in the forthcoming week. The extreme event forecast is the probability estimate of the number of fires on Federal land that may exceed 5,000 acres in the forthcoming week.
Impact of hindcast length on estimates of seasonal climate predictability.
Shi, W; Schaller, N; MacLeod, D; Palmer, T N; Weisheimer, A
2015-03-16
It has recently been argued that single-model seasonal forecast ensembles are overdispersive, implying that the real world is more predictable than indicated by estimates of so-called perfect model predictability, particularly over the North Atlantic. However, such estimates are based on relatively short forecast data sets comprising just 20 years of seasonal predictions. Here we study longer 40 year seasonal forecast data sets from multimodel seasonal forecast ensemble projects and show that sampling uncertainty due to the length of the hindcast periods is large. The skill of forecasting the North Atlantic Oscillation during winter varies within the 40 year data sets with high levels of skill found for some subperiods. It is demonstrated that while 20 year estimates of seasonal reliability can show evidence of overdispersive behavior, the 40 year estimates are more stable and show no evidence of overdispersion. Instead, the predominant feature on these longer time scales is underdispersion, particularly in the tropics. Predictions can appear overdispersive due to hindcast length sampling errorLonger hindcasts are more robust and underdispersive, especially in the tropicsTwenty hindcasts are an inadequate sample size to assess seasonal forecast skill.
DOT National Transportation Integrated Search
2009-01-01
In 1992, Pickrell published a seminal piece examining the accuracy of ridership forecasts and capital cost estimates for fixed-guideway transit systems in the US. His research created heated discussions in the transit industry regarding the ability o...
Fire danger rating over Mediterranean Europe based on fire radiative power derived from Meteosat
NASA Astrophysics Data System (ADS)
Pinto, Miguel M.; DaCamara, Carlos C.; Trigo, Isabel F.; Trigo, Ricardo M.; Feridun Turkman, K.
2018-02-01
We present a procedure that allows the operational generation of daily forecasts of fire danger over Mediterranean Europe. The procedure combines historical information about radiative energy released by fire events with daily meteorological forecasts, as provided by the Satellite Application Facility for Land Surface Analysis (LSA SAF) and the European Centre for Medium-Range Weather Forecasts (ECMWF). Fire danger is estimated based on daily probabilities of exceedance of daily energy released by fires occurring at the pixel level. Daily probability considers meteorological factors by means of the Canadian Fire Weather Index (FWI) and is estimated using a daily model based on a generalized Pareto distribution. Five classes of fire danger are then associated with daily probability estimated by the daily model. The model is calibrated using 13 years of data (2004-2016) and validated against the period of January-September 2017. Results obtained show that about 72 % of events releasing daily energy above 10 000 GJ belong to the extreme
class of fire danger, a considerably high fraction that is more than 1.5 times the values obtained when using the currently operational Fire Danger Forecast module of the European Forest Fire Information System (EFFIS) or the Fire Risk Map (FRM) product disseminated by the LSA SAF. Besides assisting in wildfire management, the procedure is expected to help in decision making on prescribed burning within the framework of agricultural and forest management practices.
NWS Operational Requirements for Ensemble-Based Hydrologic Forecasts
NASA Astrophysics Data System (ADS)
Hartman, R. K.
2008-12-01
Ensemble-based hydrologic forecasts have been developed and issued by National Weather Service (NWS) staff at River Forecast Centers (RFCs) for many years. Used principally for long-range water supply forecasts, only the uncertainty associated with weather and climate have been traditionally considered. As technology and societal expectations of resource managers increase, the use and desire for risk-based decision support tools has also increased. These tools require forecast information that includes reliable uncertainty estimates across all time and space domains. The development of reliable uncertainty estimates associated with hydrologic forecasts is being actively pursued within the United States and internationally. This presentation will describe the challenges, components, and requirements for operational hydrologic ensemble-based forecasts from the perspective of a NOAA/NWS River Forecast Center.
Forecasting peaks of seasonal influenza epidemics.
Nsoesie, Elaine; Mararthe, Madhav; Brownstein, John
2013-06-21
We present a framework for near real-time forecast of influenza epidemics using a simulation optimization approach. The method combines an individual-based model and a simple root finding optimization method for parameter estimation and forecasting. In this study, retrospective forecasts were generated for seasonal influenza epidemics using web-based estimates of influenza activity from Google Flu Trends for 2004-2005, 2007-2008 and 2012-2013 flu seasons. In some cases, the peak could be forecasted 5-6 weeks ahead. This study adds to existing resources for influenza forecasting and the proposed method can be used in conjunction with other approaches in an ensemble framework.
A stochastic HMM-based forecasting model for fuzzy time series.
Li, Sheng-Tun; Cheng, Yi-Chung
2010-10-01
Recently, fuzzy time series have attracted more academic attention than traditional time series due to their capability of dealing with the uncertainty and vagueness inherent in the data collected. The formulation of fuzzy relations is one of the key issues affecting forecasting results. Most of the present works adopt IF-THEN rules for relationship representation, which leads to higher computational overhead and rule redundancy. Sullivan and Woodall proposed a Markov-based formulation and a forecasting model to reduce computational overhead; however, its applicability is limited to handling one-factor problems. In this paper, we propose a novel forecasting model based on the hidden Markov model by enhancing Sullivan and Woodall's work to allow handling of two-factor forecasting problems. Moreover, in order to make the nature of conjecture and randomness of forecasting more realistic, the Monte Carlo method is adopted to estimate the outcome. To test the effectiveness of the resulting stochastic model, we conduct two experiments and compare the results with those from other models. The first experiment consists of forecasting the daily average temperature and cloud density in Taipei, Taiwan, and the second experiment is based on the Taiwan Weighted Stock Index by forecasting the exchange rate of the New Taiwan dollar against the U.S. dollar. In addition to improving forecasting accuracy, the proposed model adheres to the central limit theorem, and thus, the result statistically approximates to the real mean of the target value being forecast.
NASA Astrophysics Data System (ADS)
Gochis, D. J.; Busto, J.; Howard, K.; Mickey, J.; Deems, J. S.; Painter, T. H.; Richardson, M.; Dugger, A. L.; Karsten, L. R.; Tang, L.
2015-12-01
Scarcity of spatially- and temporally-continuous observations of precipitation and snowpack conditions in remote mountain watersheds results in fundamental limitations in water supply forecasting. These limitationsin observational capabilities can result in strong biases in total snowmelt-driven runoff amount, the elevational distribution of runoff, river basin tributary contributions to total basin runoff and, equally important for water management, the timing of runoff. The Upper Rio Grande River basin in Colorado and New Mexico is one basin where observational deficiencies are hypothesized to have significant adverse impacts on estimates of snowpack melt-out rates and on water supply forecasts. We present findings from a coordinated observational-modeling study within Upper Rio Grande River basin whose aim was to quanitfy the impact enhanced precipitation, meteorological and snowpack measurements on the simulation and prediction of snowmelt driven streamflow. The Rio Grande SNOwpack and streamFLOW (RIO-SNO-FLOW) Prediction Project conducted enhanced observing activities during the 2014-2015 water year. Measurements from a gap-filling, polarimetric radar (NOXP) and in-situ meteorological and snowpack measurement stations were assimilated into the WRF-Hydro modeling framework to provide continuous analyses of snowpack and streamflow conditions. Airborne lidar estimates of snowpack conditions from the NASA Airborne Snow Observatory during mid-April and mid-May were used as additional independent validations against the various model simulations and forecasts of snowpack conditions during the melt-out season. Uncalibrated WRF-Hydro model performance from simulations and forecasts driven by enhanced observational analyses were compared against results driven by currently operational data inputs. Precipitation estimates from the NOXP research radar validate significantly better against independent in situ observations of precipitation and snow-pack increases. Correcting the operational NLDAS2 forcing data with the experimental observations led to significant improvements in the seasonal accumulation and ablation of mountain snowpack and ultimately led to marked improvement in model simulated streamflow as compared with streamflow observations.
NASA Astrophysics Data System (ADS)
Moore, A. W.; Bock, Y.; Geng, J.; Gutman, S. I.; Laber, J. L.; Morris, T.; Offield, D. G.; Small, I.; Squibb, M. B.
2012-12-01
We describe a system under development for generating ultra-low latency tropospheric delay and precipitable water vapor (PWV) estimates in situ at a prototype network of geodetic GPS sites in southern California, and demonstrating their utility in forecasting severe storms commonly associated with flooding and debris flow events along the west coast of North America through infusion of this meteorological data at NOAA National Weather Service (NWS) Forecast Offices and the NOAA Earth System Research Laboratory (ESRL). The first continuous geodetic GPS network was established in southern California in the early 1990s and much of it was converted to real-time (latency <1s) high-rate (1Hz) mode over the following decades. GPS stations are multi-purpose and can also provide estimates of tropospheric zenith delays, which can be converted into mm-accuracy PWV using collocated pressure and temperature measurements, the basis for GPS meteorology (Bevis et al. 1992, 1994; Duan et al. 1996) as implemented by NOAA with a nationwide distribution of about 300 GPS-Met stations providing PW estimates at subhourly resolution currently used in operational weather forecasting in the U.S. We improve upon the current paradigm of transmitting large quantities of raw data back to a central facility for processing into higher-order products. By operating semi-autonomously, each station will provide low-latency, high-fidelity and compact data products within the constraints of the narrow communications bandwidth that often occurs in the aftermath of natural disasters. The onsite ambiguity-resolved precise point positioning solutions are enabled by a power-efficient, low-cost, plug-in Geodetic Module for fusion of data from in situ sensors including GPS and a low-cost MEMS meteorological sensor package. The decreased latency (~5 minutes) PW estimates will provide the detailed knowledge of the distribution and magnitude of PW that NWS forecasters require to monitor and predict severe winter storms, landfalling atmospheric rivers, and summer thunderstorms associated with the North American monsoon. On the national level, the ESRL will evaluate the utility of ultra-low resolution GNSS observations to improve NOAA's warning and forecast capabilities. The overall objective is to better forecast, assess, and mitigate natural hazards through the flow of information from multiple geodetic stations to scientists, mission planners, decision makers, and first responders.
NASA Technical Reports Server (NTRS)
Pauwels, V. R. N.; DeLannoy, G. J. M.; Hendricks Franssen, H.-J.; Vereecken, H.
2013-01-01
In this paper, we present a two-stage hybrid Kalman filter to estimate both observation and forecast bias in hydrologic models, in addition to state variables. The biases are estimated using the discrete Kalman filter, and the state variables using the ensemble Kalman filter. A key issue in this multi-component assimilation scheme is the exact partitioning of the difference between observation and forecasts into state, forecast bias and observation bias updates. Here, the error covariances of the forecast bias and the unbiased states are calculated as constant fractions of the biased state error covariance, and the observation bias error covariance is a function of the observation prediction error covariance. In a series of synthetic experiments, focusing on the assimilation of discharge into a rainfall-runoff model, it is shown that both static and dynamic observation and forecast biases can be successfully estimated. The results indicate a strong improvement in the estimation of the state variables and resulting discharge as opposed to the use of a bias-unaware ensemble Kalman filter. Furthermore, minimal code modification in existing data assimilation software is needed to implement the method. The results suggest that a better performance of data assimilation methods should be possible if both forecast and observation biases are taken into account.
Bayesian analyses of seasonal runoff forecasts
NASA Astrophysics Data System (ADS)
Krzysztofowicz, R.; Reese, S.
1991-12-01
Forecasts of seasonal snowmelt runoff volume provide indispensable information for rational decision making by water project operators, irrigation district managers, and farmers in the western United States. Bayesian statistical models and communication frames have been researched in order to enhance the forecast information disseminated to the users, and to characterize forecast skill from the decision maker's point of view. Four products are presented: (i) a Bayesian Processor of Forecasts, which provides a statistical filter for calibrating the forecasts, and a procedure for estimating the posterior probability distribution of the seasonal runoff; (ii) the Bayesian Correlation Score, a new measure of forecast skill, which is related monotonically to the ex ante economic value of forecasts for decision making; (iii) a statistical predictor of monthly cumulative runoffs within the snowmelt season, conditional on the total seasonal runoff forecast; and (iv) a framing of the forecast message that conveys the uncertainty associated with the forecast estimates to the users. All analyses are illustrated with numerical examples of forecasts for six gauging stations from the period 1971 1988.
Brownstein, John S; Chu, Shuyu; Marathe, Achla; Marathe, Madhav V; Nguyen, Andre T; Paolotti, Daniela; Perra, Nicola; Perrotta, Daniela; Santillana, Mauricio; Swarup, Samarth; Tizzoni, Michele; Vespignani, Alessandro; Vullikanti, Anil Kumar S; Wilson, Mandy L; Zhang, Qian
2017-11-01
Influenza outbreaks affect millions of people every year and its surveillance is usually carried out in developed countries through a network of sentinel doctors who report the weekly number of Influenza-like Illness cases observed among the visited patients. Monitoring and forecasting the evolution of these outbreaks supports decision makers in designing effective interventions and allocating resources to mitigate their impact. Describe the existing participatory surveillance approaches that have been used for modeling and forecasting of the seasonal influenza epidemic, and how they can help strengthen real-time epidemic science and provide a more rigorous understanding of epidemic conditions. We describe three different participatory surveillance systems, WISDM (Widely Internet Sourced Distributed Monitoring), Influenzanet and Flu Near You (FNY), and show how modeling and simulation can be or has been combined with participatory disease surveillance to: i) measure the non-response bias in a participatory surveillance sample using WISDM; and ii) nowcast and forecast influenza activity in different parts of the world (using Influenzanet and Flu Near You). WISDM-based results measure the participatory and sample bias for three epidemic metrics i.e. attack rate, peak infection rate, and time-to-peak, and find the participatory bias to be the largest component of the total bias. The Influenzanet platform shows that digital participatory surveillance data combined with a realistic data-driven epidemiological model can provide both short-term and long-term forecasts of epidemic intensities, and the ground truth data lie within the 95 percent confidence intervals for most weeks. The statistical accuracy of the ensemble forecasts increase as the season progresses. The Flu Near You platform shows that participatory surveillance data provide accurate short-term flu activity forecasts and influenza activity predictions. The correlation of the HealthMap Flu Trends estimates with the observed CDC ILI rates is 0.99 for 2013-2015. Additional data sources lead to an error reduction of about 40% when compared to the estimates of the model that only incorporates CDC historical information. While the advantages of participatory surveillance, compared to traditional surveillance, include its timeliness, lower costs, and broader reach, it is limited by a lack of control over the characteristics of the population sample. Modeling and simulation can help overcome this limitation as well as provide real-time and long-term forecasting of influenza activity in data-poor parts of the world. ©John S Brownstein, Shuyu Chu, Achla Marathe, Madhav V Marathe, Andre T Nguyen, Daniela Paolotti, Nicola Perra, Daniela Perrotta, Mauricio Santillana, Samarth Swarup, Michele Tizzoni, Alessandro Vespignani, Anil Kumar S Vullikanti, Mandy L Wilson, Qian Zhang. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 01.11.2017.
NASA Astrophysics Data System (ADS)
Regonda, Satish Kumar; Seo, Dong-Jun; Lawrence, Bill; Brown, James D.; Demargne, Julie
2013-08-01
We present a statistical procedure for generating short-term ensemble streamflow forecasts from single-valued, or deterministic, streamflow forecasts produced operationally by the U.S. National Weather Service (NWS) River Forecast Centers (RFCs). The resulting ensemble streamflow forecast provides an estimate of the predictive uncertainty associated with the single-valued forecast to support risk-based decision making by the forecasters and by the users of the forecast products, such as emergency managers. Forced by single-valued quantitative precipitation and temperature forecasts (QPF, QTF), the single-valued streamflow forecasts are produced at a 6-h time step nominally out to 5 days into the future. The single-valued streamflow forecasts reflect various run-time modifications, or "manual data assimilation", applied by the human forecasters in an attempt to reduce error from various sources in the end-to-end forecast process. The proposed procedure generates ensemble traces of streamflow from a parsimonious approximation of the conditional multivariate probability distribution of future streamflow given the single-valued streamflow forecast, QPF, and the most recent streamflow observation. For parameter estimation and evaluation, we used a multiyear archive of the single-valued river stage forecast produced operationally by the NWS Arkansas-Red River Basin River Forecast Center (ABRFC) in Tulsa, Oklahoma. As a by-product of parameter estimation, the procedure provides a categorical assessment of the effective lead time of the operational hydrologic forecasts for different QPF and forecast flow conditions. To evaluate the procedure, we carried out hindcasting experiments in dependent and cross-validation modes. The results indicate that the short-term streamflow ensemble hindcasts generated from the procedure are generally reliable within the effective lead time of the single-valued forecasts and well capture the skill of the single-valued forecasts. For smaller basins, however, the effective lead time is significantly reduced by short basin memory and reduced skill in the single-valued QPF.
Cohen, Justin M; Singh, Inder; O'Brien, Megan E
2008-01-01
Background An accurate forecast of global demand is essential to stabilize the market for artemisinin-based combination therapy (ACT) and to ensure access to high-quality, life-saving medications at the lowest sustainable prices by avoiding underproduction and excessive overproduction, each of which can have negative consequences for the availability of affordable drugs. A robust forecast requires an understanding of the resources available to support procurement of these relatively expensive antimalarials, in particular from the Global Fund, at present the single largest source of ACT funding. Methods Predictive regression models estimating the timing and rate of disbursements from the Global Fund to recipient countries for each malaria grant were derived using a repeated split-sample procedure intended to avoid over-fitting. Predictions were compared against actual disbursements in a group of validation grants, and forecasts of ACT procurement extrapolated from disbursement predictions were evaluated against actual procurement in two sub-Saharan countries. Results Quarterly forecasts were correlated highly with actual smoothed disbursement rates (r = 0.987, p < 0.0001). Additionally, predicted ACT procurement, extrapolated from forecasted disbursements, was correlated strongly with actual ACT procurement supported by two grants from the Global Fund's first (r = 0.945, p < 0.0001) and fourth (r = 0.938, p < 0.0001) funding rounds. Conclusion This analysis derived predictive regression models that successfully forecasted disbursement patterning for individual Global Fund malaria grants. These results indicate the utility of this approach for demand forecasting of ACT and, potentially, for other commodities procured using funding from the Global Fund. Further validation using data from other countries in different regions and environments will be necessary to confirm its generalizability. PMID:18831742
Extreme geomagnetic storms: Probabilistic forecasts and their uncertainties
Riley, Pete; Love, Jeffrey J.
2017-01-01
Extreme space weather events are low-frequency, high-risk phenomena. Estimating their rates of occurrence, as well as their associated uncertainties, is difficult. In this study, we derive statistical estimates and uncertainties for the occurrence rate of an extreme geomagnetic storm on the scale of the Carrington event (or worse) occurring within the next decade. We model the distribution of events as either a power law or lognormal distribution and use (1) Kolmogorov-Smirnov statistic to estimate goodness of fit, (2) bootstrapping to quantify the uncertainty in the estimates, and (3) likelihood ratio tests to assess whether one distribution is preferred over another. Our best estimate for the probability of another extreme geomagnetic event comparable to the Carrington event occurring within the next 10 years is 10.3% 95% confidence interval (CI) [0.9,18.7] for a power law distribution but only 3.0% 95% CI [0.6,9.0] for a lognormal distribution. However, our results depend crucially on (1) how we define an extreme event, (2) the statistical model used to describe how the events are distributed in intensity, (3) the techniques used to infer the model parameters, and (4) the data and duration used for the analysis. We test a major assumption that the data represent time stationary processes and discuss the implications. If the current trends persist, suggesting that we are entering a period of lower activity, our forecasts may represent upper limits rather than best estimates.
Forecasting Foreign Currency Exchange Rates for Air Force Budgeting
2015-03-26
FORECASTING FOREIGN CURRENCY EXCHANGE RATES FOR AIR FORCE BUDGETING THESIS MARCH 2015...States. AFIT-ENV-MS-15-M-178 FORECASTING FOREIGN CURRENCY EXCHANGE RATES FOR AIR FORCE BUDGETING THESIS Presented to the Faculty...FORECASTING FOREIGN CURRENCY EXCHANGE RATES FOR AIR FORCE BUDGETING Nicholas R. Gardner, BS Captain, USAF Committee Membership: Lt Col Jonathan
On the Specification of Smoke Injection Heights for Aerosol Forecasting
NASA Astrophysics Data System (ADS)
da Silva, A.; Schaefer, C.; Randles, C. A.
2014-12-01
The proper forecasting of biomass burning (BB) aerosols in global or regional transport models requires not only the specification of emission rates with sufficient temporal resolution but also the injection layers of such emissions. While current near realtime biomass burning inventories such as GFAS, QFED, FINN, GBBEP and FLAMBE provide such emission rates, it is left for each modeling system to come up with its own scheme for distributing these emissions in the vertical. A number of operational aerosol forecasting models deposits BB emissions in the near surface model layers, relying on the model's parameterization of turbulent and convective transport to determine the vertical mass distribution of BB aerosols. Despite their simplicity such schemes have been relatively successful reproducing the vertical structure of BB aerosols, except for those large fires that produce enough buoyancy to puncture the PBL and deposit the smoke at higher layers. Plume Rise models such as the so-called 'Freitas model', parameterize this sub-grid buoyancy effect, but require the specification of fire size and heat fluxes, none of which is readily available in near real-time from current remotely-sensed products. In this talk we will introduce a bayesian algorithm for estimating file size and heat fluxes from MODIS brightness temperatures. For small to moderate fires the Freitas model driven by these heat flux estimates produces plume tops that are highly correlated with the GEOS-5 model estimate of PBL height. Comparison to MINX plume height estimates from MISR indicates moderate skill of this scheme predicting the injection height of large fires. As an alternative, we make use of OMPS UV aerosol index data in combination with estimates of Overshooting Convective Tops (from MODIS and Geo-stationary satellites) to detect PyCu events and specify the BB emission vertical mass distribution in such cases. We will present a discussion of case studies during the SEAC4RS field campaign in August-September 2013.
A Canonical Ensemble Correlation Prediction Model for Seasonal Precipitation Anomaly
NASA Technical Reports Server (NTRS)
Shen, Samuel S. P.; Lau, William K. M.; Kim, Kyu-Myong; Li, Guilong
2001-01-01
This report describes an optimal ensemble forecasting model for seasonal precipitation and its error estimation. Each individual forecast is based on the canonical correlation analysis (CCA) in the spectral spaces whose bases are empirical orthogonal functions (EOF). The optimal weights in the ensemble forecasting crucially depend on the mean square error of each individual forecast. An estimate of the mean square error of a CCA prediction is made also using the spectral method. The error is decomposed onto EOFs of the predictand and decreases linearly according to the correlation between the predictor and predictand. This new CCA model includes the following features: (1) the use of area-factor, (2) the estimation of prediction error, and (3) the optimal ensemble of multiple forecasts. The new CCA model is applied to the seasonal forecasting of the United States precipitation field. The predictor is the sea surface temperature.
NASA Technical Reports Server (NTRS)
Raymond, William H.; Olson, William S.
1990-01-01
Delay in the spin-up of precipitation early in numerical atmospheric forecasts is a deficiency correctable by diabatic initialization combined with diabatic forcing. For either to be effective requires some knowledge of the magnitude and vertical placement of the latent heating fields. Until recently the best source of cloud and rain water data was the remotely sensed vertical integrated precipitation rate or liquid water content. Vertical placement of the condensation remains unknown. Some information about the vertical distribution of the heating rates and precipitating liquid water and ice can be obtained from retrieval techniques that use a physical model of precipitating clouds to refine and improve the interpretation of the remotely sensed data. A description of this procedure and an examination of its 3-D liquid water products, along with improved modeling methods that enhance or speed-up storm development is discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gagnon, Pieter J
Misforecasting the adoption of customer-owned distributed photovoltaics (DPV) can have operational and financial implications for utilities - forecasting capabilities can be improved, but generally at a cost. This paper informs this decision-space by quantifying the costs of misforecasting across a wide range of DPV growth rates and misforecast severities. Using a simplified probabilistic method presented within, an analyst can make a first-order estimate of the financial benefit of improving a utility's forecasting capabilities, and thus be better informed about whether to make such an investment. For example, we show that a utility with 10 TWh per year of retail electricmore » sales who initially estimates that the increase in DPV's contribution to total generation could range from 2 to 7.5 percent over the next 15 years could expect total present-value savings of approximately 4 million dollars if they could keep the severity of successive five-year misforecasts within plus or minus 25 percent. We also have more general discussions about how misforecasting DPV impacts the buildout and operation of the bulk power system - for example, we observed that misforecasting DPV most strongly influenced the amount of utility-scale PV that gets built, due to the similarity in the energy and capacity services offered by the two solar technologies.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gagnon, Pieter J; Stoll, Brady; Mai, Trieu T
Misforecasting the adoption of customer-owned distributed photovoltaics (DPV) can have operational and financial implications for utilities - forecasting capabilities can be improved, but generally at a cost.This paper informs this decision-space by quantifying the costs of misforecasting across a wide range of DPV growth rates and misforecast severities. Using a simplified probabilistic method presented within, an analyst can make a first-order estimate of the financial benefit of improving a utility's forecasting capabilities, and thus be better informed about whether to make such an investment. For example, we show that a utility with 10 TWh per year of retail electric salesmore » who initially estimates that the increase in DPV's contribution to total generation could range from 2 percent to 7.5 percent over the next 15 years could expect total present-value savings of approximately $4 million if they could keep the severity of successive five-year misforecasts within +/- 25 percent. We also have more general discussions about how misforecasting DPV impacts the buildout and operation of the bulk power system - for example, we observed that misforecasting DPV most strongly influenced the amount of utility-scale PV that gets built, due to the similarity in the energy and capacity services offered by the two solar technologies.« less
Uncertainty forecasts improve weather-related decisions and attenuate the effects of forecast error.
Joslyn, Susan L; LeClerc, Jared E
2012-03-01
Although uncertainty is inherent in weather forecasts, explicit numeric uncertainty estimates are rarely included in public forecasts for fear that they will be misunderstood. Of particular concern are situations in which precautionary action is required at low probabilities, often the case with severe events. At present, a categorical weather warning system is used. The work reported here tested the relative benefits of several forecast formats, comparing decisions made with and without uncertainty forecasts. In three experiments, participants assumed the role of a manager of a road maintenance company in charge of deciding whether to pay to salt the roads and avoid a potential penalty associated with icy conditions. Participants used overnight low temperature forecasts accompanied in some conditions by uncertainty estimates and in others by decision advice comparable to categorical warnings. Results suggested that uncertainty information improved decision quality overall and increased trust in the forecast. Participants with uncertainty forecasts took appropriate precautionary action and withheld unnecessary action more often than did participants using deterministic forecasts. When error in the forecast increased, participants with conventional forecasts were reluctant to act. However, this effect was attenuated by uncertainty forecasts. Providing categorical decision advice alone did not improve decisions. However, combining decision advice with uncertainty estimates resulted in the best performance overall. The results reported here have important implications for the development of forecast formats to increase compliance with severe weather warnings as well as other domains in which one must act in the face of uncertainty. PsycINFO Database Record (c) 2012 APA, all rights reserved.
Forecasting production in Liquid Rich Shale plays
NASA Astrophysics Data System (ADS)
Nikfarman, Hanieh
Production from Liquid Rich Shale (LRS) reservoirs is taking center stage in the exploration and production of unconventional reservoirs. Production from the low and ultra-low permeability LRS plays is possible only through multi-fractured horizontal wells (MFHW's). There is no existing workflow that is applicable to forecasting multi-phase production from MFHW's in LRS plays. This project presents a practical and rigorous workflow for forecasting multiphase production from MFHW's in LRS reservoirs. There has been much effort in developing workflows and methodology for forecasting in tight/shale plays in recent years. The existing workflows, however, are applicable only to single phase flow, and are primarily used in shale gas plays. These methodologies do not apply to the multi-phase flow that is inevitable in LRS plays. To account for complexities of multiphase flow in MFHW's the only available technique is dynamic modeling in compositional numerical simulators. These are time consuming and not practical when it comes to forecasting production and estimating reserves for a large number of producers. A workflow was developed, and validated by compositional numerical simulation. The workflow honors physics of flow, and is sufficiently accurate while practical so that an analyst can readily apply it to forecast production and estimate reserves in a large number of producers in a short period of time. To simplify the complex multiphase flow in MFHW, the workflow divides production periods into an initial period where large production and pressure declines are expected, and the subsequent period where production decline may converge into a common trend for a number of producers across an area of interest in the field. Initial period assumes the production is dominated by single-phase flow of oil and uses the tri-linear flow model of Erdal Ozkan to estimate the production history. Commercial software readily available can simulate flow and forecast production in this period. In the subsequent Period, dimensionless rate and dimensionless time functions are introduced that help identify transition from initial period into subsequent period. The production trends in terms of the dimensionless parameters converge for a range of rock permeability and stimulation intensity. This helps forecast production beyond transition to the end of life of well. This workflow is applicable to single fluid system.
Stationarity test with a direct test for heteroskedasticity in exchange rate forecasting models
NASA Astrophysics Data System (ADS)
Khin, Aye Aye; Chau, Wong Hong; Seong, Lim Chee; Bin, Raymond Ling Leh; Teng, Kevin Low Lock
2017-05-01
Global economic has been decreasing in the recent years, manifested by the greater exchange rates volatility on international commodity market. This study attempts to analyze some prominent exchange rate forecasting models on Malaysian commodity trading: univariate ARIMA, ARCH and GARCH models in conjunction with stationarity test on residual diagnosis direct testing of heteroskedasticity. All forecasting models utilized the monthly data from 1990 to 2015. Given a total of 312 observations, the data used to forecast both short-term and long-term exchange rate. The forecasting power statistics suggested that the forecasting performance of ARIMA (1, 1, 1) model is more efficient than the ARCH (1) and GARCH (1, 1) models. For ex-post forecast, exchange rate was increased from RM 3.50 per USD in January 2015 to RM 4.47 per USD in December 2015 based on the baseline data. For short-term ex-ante forecast, the analysis results indicate a decrease in exchange rate on 2016 June (RM 4.27 per USD) as compared with 2015 December. A more appropriate forecasting method of exchange rate is vital to aid the decision-making process and planning on the sustainable commodities' production in the world economy.
Forecasting Tehran stock exchange volatility; Markov switching GARCH approach
NASA Astrophysics Data System (ADS)
Abounoori, Esmaiel; Elmi, Zahra (Mila); Nademi, Younes
2016-03-01
This paper evaluates several GARCH models regarding their ability to forecast volatility in Tehran Stock Exchange (TSE). These include GARCH models with both Gaussian and fat-tailed residual conditional distribution, concerning their ability to describe and forecast volatility from 1-day to 22-day horizon. Results indicate that AR(2)-MRSGARCH-GED model outperforms other models at one-day horizon. Also, the AR(2)-MRSGARCH-GED as well as AR(2)-MRSGARCH-t models outperform other models at 5-day horizon. In 10 day horizon, three models of AR(2)-MRSGARCH outperform other models. Concerning 22 day forecast horizon, results indicate no differences between MRSGARCH models with that of standard GARCH models. Regarding Risk management out-of-sample evaluation (95% VaR), a few models seem to provide reasonable and accurate VaR estimates at 1-day horizon, with a coverage rate close to the nominal level. According to the risk management loss functions, there is not a uniformly most accurate model.
Evaluation of statistical models for forecast errors from the HBV model
NASA Astrophysics Data System (ADS)
Engeland, Kolbjørn; Renard, Benjamin; Steinsland, Ingelin; Kolberg, Sjur
2010-04-01
SummaryThree statistical models for the forecast errors for inflow into the Langvatn reservoir in Northern Norway have been constructed and tested according to the agreement between (i) the forecast distribution and the observations and (ii) median values of the forecast distribution and the observations. For the first model observed and forecasted inflows were transformed by the Box-Cox transformation before a first order auto-regressive model was constructed for the forecast errors. The parameters were conditioned on weather classes. In the second model the Normal Quantile Transformation (NQT) was applied on observed and forecasted inflows before a similar first order auto-regressive model was constructed for the forecast errors. For the third model positive and negative errors were modeled separately. The errors were first NQT-transformed before conditioning the mean error values on climate, forecasted inflow and yesterday's error. To test the three models we applied three criterions: we wanted (a) the forecast distribution to be reliable; (b) the forecast intervals to be narrow; (c) the median values of the forecast distribution to be close to the observed values. Models 1 and 2 gave almost identical results. The median values improved the forecast with Nash-Sutcliffe R eff increasing from 0.77 for the original forecast to 0.87 for the corrected forecasts. Models 1 and 2 over-estimated the forecast intervals but gave the narrowest intervals. Their main drawback was that the distributions are less reliable than Model 3. For Model 3 the median values did not fit well since the auto-correlation was not accounted for. Since Model 3 did not benefit from the potential variance reduction that lies in bias estimation and removal it gave on average wider forecasts intervals than the two other models. At the same time Model 3 on average slightly under-estimated the forecast intervals, probably explained by the use of average measures to evaluate the fit.
Cost for the treatment of actinic keratosis on the rise in Australia
Perera, Eshini; McGuigan, Sean; Sinclair, Rodney
2014-01-01
Objectives: To report the burden and cost of actinic keratosis (AK) treatment in Australia and to forecast the number of AK treatments and the associated costs to 2020. Design and setting: A retrospective study of data obtained from medicare Australia for AK treated by cryotherapy between 1 January 1994 and 31 December 2012, by year and by state or territory. Results: The total number of AK cryotherapy treatments increased from 247,515 in 1994 to 643,622 in 2012, and we estimate that the number of treatments will increase to 831,952 (95% CI 676,919 to 986,987) by 2020. The total Medicare Benefits Schedule (MBS) benefits paid out for AK in 2012 was $19.6 million and we forecast that this will increase to $24.7 million by 2020 (without inflation). Conclusion: The number of AK cryotherapy treatments increased by 160% between 1994 and 2012. we forecast that the number of treatments will increase by 30% between 2012 and 2020. The rates of non-melanoma skin cancer (NMSC) and AK appear to be increasing at the same rate. During the period 2010 to 2015 AK is anticipated to increase by 17.8% which follows a similar trend to published data that forecasts an increase in NMSC treatments of 22.3%. PMID:25309734
NASA Astrophysics Data System (ADS)
Le Bihan, Guillaume; Payrastre, Olivier; Gaume, Eric; Moncoulon, David; Pons, Frédéric
2017-11-01
Up to now, flash flood monitoring and forecasting systems, based on rainfall radar measurements and distributed rainfall-runoff models, generally aimed at estimating flood magnitudes - typically discharges or return periods - at selected river cross sections. The approach presented here goes one step further by proposing an integrated forecasting chain for the direct assessment of flash flood possible impacts on inhabited areas (number of buildings at risk in the presented case studies). The proposed approach includes, in addition to a distributed rainfall-runoff model, an automatic hydraulic method suited for the computation of flood extent maps on a dense river network and over large territories. The resulting catalogue of flood extent maps is then combined with land use data to build a flood impact curve for each considered river reach, i.e. the number of inundated buildings versus discharge. These curves are finally used to compute estimated impacts based on forecasted discharges. The approach has been extensively tested in the regions of Alès and Draguignan, located in the south of France, where well-documented major flash floods recently occurred. The article presents two types of validation results. First, the automatically computed flood extent maps and corresponding water levels are tested against rating curves at available river gauging stations as well as against local reference or observed flood extent maps. Second, a rich and comprehensive insurance claim database is used to evaluate the relevance of the estimated impacts for some recent major floods.
Forecasting generation of urban solid waste in developing countries--a case study in Mexico.
Buenrostro, O; Bocco, G; Vence, J
2001-01-01
Based on a study of the composition of urban solid waste (USW) and of socioeconomic variables in Morelia, Mexico, generation rates were estimated. In addition, the generation of residential solid waste (RSW) and nonresidential solid waste (NRSW) was forecasted by means of a multiple linear regression (MLR) analysis. For residential sources, the independent variables analyzed were monthly wages, persons per dwelling, age, and educational level of the heads of the household. For nonresidential sources, variables analyzed were number of employees, area of facilities, number of working days, and working hours per day. The forecasted values for residential waste were similar to those observed. This approach may be applied to areas in which available data are scarce, and in which there is an urgent need for the planning of adequate management of USW.
Calibration of Ocean Forcing with satellite Flux Estimates (COFFEE)
NASA Astrophysics Data System (ADS)
Barron, Charlie; Jan, Dastugue; Jackie, May; Rowley, Clark; Smith, Scott; Spence, Peter; Gremes-Cordero, Silvia
2016-04-01
Predicting the evolution of ocean temperature in regional ocean models depends on estimates of surface heat fluxes and upper-ocean processes over the forecast period. Within the COFFEE project (Calibration of Ocean Forcing with satellite Flux Estimates, real-time satellite observations are used to estimate shortwave, longwave, sensible, and latent air-sea heat flux corrections to a background estimate from the prior day's regional or global model forecast. These satellite-corrected fluxes are used to prepare a corrected ocean hindcast and to estimate flux error covariances to project the heat flux corrections for a 3-5 day forecast. In this way, satellite remote sensing is applied to not only inform the initial ocean state but also to mitigate errors in surface heat flux and model representations affecting the distribution of heat in the upper ocean. While traditional assimilation of sea surface temperature (SST) observations re-centers ocean models at the start of each forecast cycle, COFFEE endeavors to appropriately partition and reduce among various surface heat flux and ocean dynamics sources. A suite of experiments in the southern California Current demonstrates a range of COFFEE capabilities, showing the impact on forecast error relative to a baseline three-dimensional variational (3DVAR) assimilation using operational global or regional atmospheric forcing. Experiment cases combine different levels of flux calibration with assimilation alternatives. The cases use the original fluxes, apply full satellite corrections during the forecast period, or extend hindcast corrections into the forecast period. Assimilation is either baseline 3DVAR or standard strong-constraint 4DVAR, with work proceeding to add a 4DVAR expanded to include a weak constraint treatment of the surface flux errors. Covariance of flux errors is estimated from the recent time series of forecast and calibrated flux terms. While the California Current examples are shown, the approach is equally applicable to other regions. These approaches within a 3DVAR application are anticipated to be useful for global and larger regional domains where a full 4DVAR methodology may be cost-prohibitive.
NASA Astrophysics Data System (ADS)
Mel, Riccardo; Viero, Daniele Pietro; Carniello, Luca; Defina, Andrea; D'Alpaos, Luigi
2014-09-01
Providing reliable and accurate storm surge forecasts is important for a wide range of problems related to coastal environments. In order to adequately support decision-making processes, it also become increasingly important to be able to estimate the uncertainty associated with the storm surge forecast. The procedure commonly adopted to do this uses the results of a hydrodynamic model forced by a set of different meteorological forecasts; however, this approach requires a considerable, if not prohibitive, computational cost for real-time application. In the present paper we present two simplified methods for estimating the uncertainty affecting storm surge prediction with moderate computational effort. In the first approach we use a computationally fast, statistical tidal model instead of a hydrodynamic numerical model to estimate storm surge uncertainty. The second approach is based on the observation that the uncertainty in the sea level forecast mainly stems from the uncertainty affecting the meteorological fields; this has led to the idea to estimate forecast uncertainty via a linear combination of suitable meteorological variances, directly extracted from the meteorological fields. The proposed methods were applied to estimate the uncertainty in the storm surge forecast in the Venice Lagoon. The results clearly show that the uncertainty estimated through a linear combination of suitable meteorological variances nicely matches the one obtained using the deterministic approach and overcomes some intrinsic limitations in the use of a statistical tidal model.
An investigation into incident duration forecasting for FleetForward
DOT National Transportation Integrated Search
2000-08-01
Traffic condition forecasting is the process of estimating future traffic conditions based on current and archived data. Real-time forecasting is becoming an important tool in Intelligent Transportation Systems (ITS). This type of forecasting allows ...
The MSFC Solar Activity Future Estimation (MSAFE) Model
NASA Technical Reports Server (NTRS)
Suggs, Ron
2017-01-01
The Natural Environments Branch of the Engineering Directorate at Marshall Space Flight Center (MSFC) provides solar cycle forecasts for NASA space flight programs and the aerospace community. These forecasts provide future statistical estimates of sunspot number, solar radio 10.7 cm flux (F10.7), and the geomagnetic planetary index, Ap, for input to various space environment models. For example, many thermosphere density computer models used in spacecraft operations, orbital lifetime analysis, and the planning of future spacecraft missions require as inputs the F10.7 and Ap. The solar forecast is updated each month by executing MSAFE using historical and the latest month's observed solar indices to provide estimates for the balance of the current solar cycle. The forecasted solar indices represent the 13-month smoothed values consisting of a best estimate value stated as a 50 percentile value along with approximate +/- 2 sigma values stated as 95 and 5 percentile statistical values. This presentation will give an overview of the MSAFE model and the forecast for the current solar cycle.
Adnan, Tassha Hilda; Hashim, Nadiah Hanis; Mohan, Kirubashni; Kim Liong, Ang; Ahmad, Ghazali; Bak Leong, Goh; Bavanandan, Sunita; Haniff, Jamaiyah
2017-01-01
Background. The incidence of patients with end-stage renal disease (ESRD) requiring dialysis has been growing rapidly in Malaysia from 18 per million population (pmp) in 1993 to 231 pmp in 2013. Objective. To forecast the incidence and prevalence of ESRD patients who will require dialysis treatment in Malaysia until 2040. Methodology. Univariate forecasting models using the number of new and current dialysis patients, by the Malaysian Dialysis and Transplant Registry from 1993 to 2013 were used. Four forecasting models were evaluated, and the model with the smallest error was selected for the prediction. Result. ARIMA (0, 2, 1) modeling with the lowest error was selected to predict both the incidence (RMSE = 135.50, MAPE = 2.85, and MAE = 87.71) and the prevalence (RMSE = 158.79, MAPE = 1.29, and MAE = 117.21) of dialysis patients. The estimated incidences of new dialysis patients in 2020 and 2040 are 10,208 and 19,418 cases, respectively, while the estimated prevalence is 51,269 and 106,249 cases. Conclusion. The growth of ESRD patients on dialysis in Malaysia can be expected to continue at an alarming rate. Effective steps to address and curb further increase in new patients requiring dialysis are urgently needed, in order to mitigate the expected financial and health catastrophes associated with the projected increase of such patients. PMID:28348890
Hourly Wind Speed Interval Prediction in Arid Regions
NASA Astrophysics Data System (ADS)
Chaouch, M.; Ouarda, T.
2013-12-01
The long and extended warm and dry summers, the low rate of rain and humidity are the main factors that explain the increase of electricity consumption in hot arid regions. In such regions, the ventilating and air-conditioning installations, that are typically the most energy-intensive among energy consumption activities, are essential for securing healthy, safe and suitable indoor thermal conditions for building occupants and stored materials. The use of renewable energy resources such as solar and wind represents one of the most relevant solutions to overcome the increase of the electricity demand challenge. In the recent years, wind energy is gaining more importance among the researchers worldwide. Wind energy is intermittent in nature and hence the power system scheduling and dynamic control of wind turbine requires an estimate of wind energy. Accurate forecast of wind speed is a challenging task for the wind energy research field. In fact, due to the large variability of wind speed caused by the unpredictable and dynamic nature of the earth's atmosphere, there are many fluctuations in wind power production. This inherent variability of wind speed is the main cause of the uncertainty observed in wind power generation. Furthermore, producing wind power forecasts might be obtained indirectly by modeling the wind speed series and then transforming the forecasts through a power curve. Wind speed forecasting techniques have received substantial attention recently and several models have been developed. Basically two main approaches have been proposed in the literature: (1) physical models such as Numerical Weather Forecast and (2) statistical models such as Autoregressive integrated moving average (ARIMA) models, Neural Networks. While the initial focus in the literature has been on point forecasts, the need to quantify forecast uncertainty and communicate the risk of extreme ramp events has led to an interest in producing probabilistic forecasts. In short term context, probabilistic forecasts might be more relevant than point forecasts for the planner to build scenarios In this paper, we are interested in estimating predictive intervals of the hourly wind speed measures in few cities in United Arab emirates (UAE). More precisely, given a wind speed time series, our target is to forecast the wind speed at any specific hour during the day and provide in addition an interval with the coverage probability 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mbamalu, G.A.N.; El-Hawary, M.E.
The authors propose suboptimal least squares or IRWLS procedures for estimating the parameters of a seasonal multiplicative AR model encountered during power system load forecasting. The proposed method involves using an interactive computer environment to estimate the parameters of a seasonal multiplicative AR process. The method comprises five major computational steps. The first determines the order of the seasonal multiplicative AR process, and the second uses the least squares or the IRWLS to estimate the optimal nonseasonal AR model parameters. In the third step one obtains the intermediate series by back forecast, which is followed by using the least squaresmore » or the IRWLS to estimate the optimal season AR parameters. The final step uses the estimated parameters to forecast future load. The method is applied to predict the Nova Scotia Power Corporation's 168 lead time hourly load. The results obtained are documented and compared with results based on the Box and Jenkins method.« less
Estimating debt capacity of New York State Health facilities.
Hogan, A J
1985-01-01
A measure of the capacity to take on new debt is developed for health facilities. This measure is a function of the current financial position of the facility, future financial market conditions (interest rates and bond/loan maturities), and a policy variable (the debt service coverage ratio) to be set by state health policy makers. The quality of this measure was shown to depend on the quality of current health facility financial accounting data, on the quality of forecasts of interest rates and future cashflow, and on the appropriateness of the criterion debt service coverage ratio. Some of the limitations of the estimate are discussed. Consideration of the debt capacity estimate serves to highlight some crucial issues in imposing capital expenditure limits, namely the interrelationships between financial viability, interest rates and access to capital markets.
Space-Time Earthquake Rate Models for One-Year Hazard Forecasts in Oklahoma
NASA Astrophysics Data System (ADS)
Llenos, A. L.; Michael, A. J.
2017-12-01
The recent one-year seismic hazard assessments for natural and induced seismicity in the central and eastern US (CEUS) (Petersen et al., 2016, 2017) rely on earthquake rate models based on declustered catalogs (i.e., catalogs with foreshocks and aftershocks removed), as is common practice in probabilistic seismic hazard analysis. However, standard declustering can remove over 90% of some induced sequences in the CEUS. Some of these earthquakes may still be capable of causing damage or concern (Petersen et al., 2015, 2016). The choices of whether and how to decluster can lead to seismicity rate estimates that vary by up to factors of 10-20 (Llenos and Michael, AGU, 2016). Therefore, in order to improve the accuracy of hazard assessments, we are exploring ways to make forecasts based on full, rather than declustered, catalogs. We focus on Oklahoma, where earthquake rates began increasing in late 2009 mainly in central Oklahoma and ramped up substantially in 2013 with the expansion of seismicity into northern Oklahoma and southern Kansas. We develop earthquake rate models using the space-time Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988; Ogata, AISM, 1998; Zhuang et al., JASA, 2002), which characterizes both the background seismicity rate as well as aftershock triggering. We examine changes in the model parameters over time, focusing particularly on background rate, which reflects earthquakes that are triggered by external driving forces such as fluid injection rather than other earthquakes. After the model parameters are fit to the seismicity data from a given year, forecasts of the full catalog for the following year can then be made using a suite of 100,000 ETAS model simulations based on those parameters. To evaluate this approach, we develop pseudo-prospective yearly forecasts for Oklahoma from 2013-2016 and compare them with the observations using standard Collaboratory for the Study of Earthquake Predictability tests for consistency.
Impact of TRMM and SSM/I Rainfall Assimilation on Global Analysis and QPF
NASA Technical Reports Server (NTRS)
Hou, Arthur; Zhang, Sara; Reale, Oreste
2002-01-01
Evaluation of QPF skills requires quantitatively accurate precipitation analyses. We show that assimilation of surface rain rates derived from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager and Special Sensor Microwave/Imager (SSM/I) improves quantitative precipitation estimates (QPE) and many aspects of global analyses. Short-range forecasts initialized with analyses with satellite rainfall data generally yield significantly higher QPF threat scores and better storm track predictions. These results were obtained using a variational procedure that minimizes the difference between the observed and model rain rates by correcting the moist physics tendency of the forecast model over a 6h assimilation window. In two case studies of Hurricanes Bonnie and Floyd, synoptic analysis shows that this procedure produces initial conditions with better-defined tropical storm features and stronger precipitation intensity associated with the storm.
Using seismic and tilt measurements simultaneously to forecast eruptions of silicic volcanoes
NASA Astrophysics Data System (ADS)
Neuberg, Jurgen; Collinson, Amy; Mothes, Patricia
2016-04-01
Independent interpretations of seismic swarms and tilt measurement on active silicic volcanoes have been successfully used to assess their eruption potential. Swarms of low-frequency seismic events have been associated with brittle failure or stick-slip motion of magma during ascent and have been used to estimate qualitatively the magma ascent rate which typically accelerates before lava dome collapses. Tilt signals are extremely sensitive indicators for volcano deformation and have been often modelled and interpreted as inflation or deflation of a shallow magma reservoir. Here we show that tilt in many cases does not represent inflation or deflation but is directly linked to magma ascent rate.This talk aims to combine these two independent observations, seismicity and deformation, to design and implement a forecasting tool that can be deployed in volcano observatories on an operational level.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei, Max; Smith, Sarah J.; Sohn, Michael D.
Fuel cells are both a longstanding and emerging technology for stationary and transportation applications, and their future use will likely be critical for the deep decarbonization of global energy systems. As we look into future applications, a key challenge for policy-makers and technology market forecasters who seek to track and/or accelerate their market adoption is the ability to forecast market costs of the fuel cells as technology innovations are incorporated into market products. Specifically, there is a need to estimate technology learning rates, which are rates of cost reduction versus production volume. Unfortunately, no literature exists for forecasting future learningmore » rates for fuel cells. In this paper, we look retrospectively to estimate learning rates for two fuel cell deployment programs: (1) the micro-combined heat and power (CHP) program in Japan, and (2) the Self-Generation Incentive Program (SGIP) in California. These two examples have a relatively broad set of historical market data and thus provide an informative and international comparison of distinct fuel cell technologies and government deployment programs. We develop a generalized procedure for disaggregating experience-curve cost-reductions in order to disaggregate the Japanese fuel cell micro-CHP market into its constituent components, and we derive and present a range of learning rates that may explain observed market trends. Finally, we explore the differences in the technology development ecosystem and market conditions that may have contributed to the observed differences in cost reduction and draw policy observations for the market adoption of future fuel cell technologies. The scientific and policy contributions of this paper are the first comparative experience curve analysis of past fuel cell technologies in two distinct markets, and the first quantitative comparison of a detailed cost model of fuel cell systems with actual market data. The resulting approach is applicable to analyzing other fuel cell markets and other energy-related technologies, and highlights the data needed for cost modeling and quantitative assessment of key cost reduction components.« less
Study on SOC wavelet analysis for LiFePO4 battery
NASA Astrophysics Data System (ADS)
Liu, Xuepeng; Zhao, Dongmei
2017-08-01
Improving the prediction accuracy of SOC can reduce the complexity of the conservative and control strategy of the strategy such as the scheduling, optimization and planning of LiFePO4 battery system. Based on the analysis of the relationship between the SOC historical data and the external stress factors, the SOC Estimation-Correction Prediction Model based on wavelet analysis is established. Using wavelet neural network prediction model is of high precision to achieve forecast link, external stress measured data is used to update parameters estimation in the model, implement correction link, makes the forecast model can adapt to the LiFePO4 battery under rated condition of charge and discharge the operating point of the variable operation area. The test results show that the method can obtain higher precision prediction model when the input and output of LiFePO4 battery are changed frequently.
Snow mass and river flows modelled using GRACE total water storage observations
NASA Astrophysics Data System (ADS)
Wang, S.
2017-12-01
Snow mass and river flow measurements are difficult and less accurate in cold regions due to the hash environment. Floods in cold regions are commonly a result of snowmelt during the spring break-up. Flooding is projected to increase with climate change in many parts of the world. Forecasting floods from snowmelt remains a challenge due to scarce and quality issues in basin-scale snow observations and lack of knowledge for cold region hydrological processes. This study developed a model for estimating basin-level snow mass (snow water equivalent SWE) and river flows using the total water storage (TWS) observations from the Gravity Recovery and Climate Experiment (GRACE) satellite mission. The SWE estimation is based on mass balance approach which is independent of in situ snow gauge observations, thus largely eliminates the limitations and uncertainties with traditional in situ or remote sensing snow estimates. The model forecasts river flows by simulating surface runoff from snowmelt and the corresponding baseflow from groundwater discharge. Snowmelt is predicted using a temperature index model. Baseflow is predicted using a modified linear reservoir model. The model also quantifies the hysteresis between the snowmelt and the streamflow rates, or the lump time for water travel in the basin. The model was applied to the Red River Basin, the Mackenzie River Basin, and the Hudson Bay Lowland Basins in Canada. The predicted river flows were compared with the observed values at downstream hydrometric stations. The results were also compared to that for the Lower Fraser River obtained in a separate study to help better understand the roles of environmental factors in determining flood and their variations with different hydroclimatic conditions. This study advances the applications of space-based time-variable gravity measurements in cold region snow mass estimation, river flow and flood forecasting. It demonstrates a relatively simple method that only needs GRACE TWS and temperature data for river flow or flood forecasting. The model can be particularly useful for regions with spare observation networks, and can be used in combination with other available methods to help improve the accuracy in river flow and flood forecasting over cold regions.
Chen, Yeh-Hsin; Schwartz, Joel D.; Rood, Richard B.; O’Neill, Marie S.
2014-01-01
Background: Heat wave and health warning systems are activated based on forecasts of health-threatening hot weather. Objective: We estimated heat–mortality associations based on forecast and observed weather data in Detroit, Michigan, and compared the accuracy of forecast products for predicting heat waves. Methods: We derived and compared apparent temperature (AT) and heat wave days (with heat waves defined as ≥ 2 days of daily mean AT ≥ 95th percentile of warm-season average) from weather observations and six different forecast products. We used Poisson regression with and without adjustment for ozone and/or PM10 (particulate matter with aerodynamic diameter ≤ 10 μm) to estimate and compare associations of daily all-cause mortality with observed and predicted AT and heat wave days. Results: The 1-day-ahead forecast of a local operational product, Revised Digital Forecast, had about half the number of false positives compared with all other forecasts. On average, controlling for heat waves, days with observed AT = 25.3°C were associated with 3.5% higher mortality (95% CI: –1.6, 8.8%) than days with AT = 8.5°C. Observed heat wave days were associated with 6.2% higher mortality (95% CI: –0.4, 13.2%) than non–heat wave days. The accuracy of predictions varied, but associations between mortality and forecast heat generally tended to overestimate heat effects, whereas associations with forecast heat waves tended to underestimate heat wave effects, relative to associations based on observed weather metrics. Conclusions: Our findings suggest that incorporating knowledge of local conditions may improve the accuracy of predictions used to activate heat wave and health warning systems. Citation: Zhang K, Chen YH, Schwartz JD, Rood RB, O’Neill MS. 2014. Using forecast and observed weather data to assess performance of forecast products in identifying heat waves and estimating heat wave effects on mortality. Environ Health Perspect 122:912–918; http://dx.doi.org/10.1289/ehp.1306858 PMID:24833618
NASA Astrophysics Data System (ADS)
Harty, T. M.; Lorenzo, A.; Holmgren, W.; Morzfeld, M.
2017-12-01
The irradiance incident on a solar panel is the main factor in determining the power output of that panel. For this reason, accurate global horizontal irradiance (GHI) estimates and forecasts are critical when determining the optimal location for a solar power plant, forecasting utility scale solar power production, or forecasting distributed, behind the meter rooftop solar power production. Satellite images provide a basis for producing the GHI estimates needed to undertake these objectives. The focus of this work is to combine satellite derived GHI estimates with ground sensor measurements and an advection model. The idea is to use accurate but sparsely distributed ground sensors to improve satellite derived GHI estimates which can cover large areas (the size of a city or a region of the United States). We use a Bayesian framework to perform the data assimilation, which enables us to produce irradiance forecasts and associated uncertainties which incorporate both satellite and ground sensor data. Within this framework, we utilize satellite images taken from the GOES-15 geostationary satellite (available every 15-30 minutes) as well as ground data taken from irradiance sensors and rooftop solar arrays (available every 5 minutes). The advection model, driven by wind forecasts from a numerical weather model, simulates cloud motion between measurements. We use the Local Ensemble Transform Kalman Filter (LETKF) to perform the data assimilation. We present preliminary results towards making such a system useful in an operational context. We explain how localization and inflation in the LETKF, perturbations of wind-fields, and random perturbations of the advection model, affect the accuracy of our estimates and forecasts. We present experiments showing the accuracy of our forecasted GHI over forecast-horizons of 15 mins to 1 hr. The limitations of our approach and future improvements are also discussed.
Expected Rate of Return on the Personal Investment in Education of No-Fee Preservice Students
ERIC Educational Resources Information Center
Zhang, Xuemin
2013-01-01
Return on personal investment is an important factor affecting the decision to invest in education. This article analyzes the personal education costs of no-fee preservice students, estimates and forecasts the return on their personal education investment, and compares the costs and benefits of for-fee preservice students and nonteaching students.…
Cargo/Logistics Airlift System Study (CLASS), Executive Summary
NASA Technical Reports Server (NTRS)
Norman, J. M.; Henderson, R. D.; Macey, F. C.; Tuttle, R. P.
1978-01-01
The current air cargo system is analyzed along with advanced air cargo systems studies. A forecast of advanced air cargo system demand is presented with cost estimates. It is concluded that there is a need for a dedicated advance air cargo system, and with application of advanced technology, reductions of 45% in air freight rates may be achieved.
NASA Astrophysics Data System (ADS)
Versini, Pierre-Antoine; Sempere-Torres, Daniel
2010-05-01
Important damages occur in small headwater catchments when they are hit by severe storms with complex spatio-temporal structure, sometimes resulting in flash floods. As these catchments are mostly not covered by sensor networks, it is difficult to forecast these floods. This is particularly true for road submersions. These are major concerns for flood event managers. The use of Quantitative Precipitation Estimates and Forecasts (QPE/QPF) especially based on radar measurements could particularly be adequate to evaluate rainfall-induced risks. Although their characteristic time and space scales would make them suitable for flash flood modelling, the impact of their uncertainties remain uncertain and have to be evaluated. The Gard region (France) has been chosen as case study. This area is frequently affected by severe flash floods and different kinds of rainfall observations are available in real time: radar rainfall estimates and nowcasts from METEO FRANCE and the CALAMAR system from SPC (state authority in charge of flood forecasting). An application devoted to the road network, has also been recently developed for this region. It combines distributed hydro-meteorological very short range forecasts and vulnerability analysis to provide warnings of road submersions. The first results demonstrate that it is technically possible to provide distributed short-term forecasts for a large number of sites. The study also demonstrates that a reliable estimation of the spatial distribution of rainfall is essential. For this reason, the road submersion warning system can be used to evaluate the quality of rainfall estimates and nowcasts. The warning system has been tested on the specific storm of the 29-30 September 2007. During this event, more than 300mm dropped on the South part of the Gard and many roads were submerged. Each of the mentioned rainfall datasets (i.e. estimates and nowcasts) was available in real time. They have been used to forecast the exact location of road submersions and the results have been compared to the effective road submersions actually occurred during the event as listed by the emergency services. The results confirm that the road submersion warning system represents a promising tool for anticipating and quantifying the consequences of storm events at ground. It rates the submersion risk with an acceptable level of accuracy and a reasonable false alarm ratio. It demonstrates also the quality of high spatial and temporal resolution radar rainfall data in real time, and the possibility to use them despite their uncertainties. However because of the quality of rainfall nowcasts falls drastically with time, it is not often sufficient to provide valuable information for lead times exceeding one hour.
Parameter estimation and forecasting for multiplicative log-normal cascades.
Leövey, Andrés E; Lux, Thomas
2012-04-01
We study the well-known multiplicative log-normal cascade process in which the multiplication of Gaussian and log normally distributed random variables yields time series with intermittent bursts of activity. Due to the nonstationarity of this process and the combinatorial nature of such a formalism, its parameters have been estimated mostly by fitting the numerical approximation of the associated non-Gaussian probability density function to empirical data, cf. Castaing et al. [Physica D 46, 177 (1990)]. More recently, alternative estimators based upon various moments have been proposed by Beck [Physica D 193, 195 (2004)] and Kiyono et al. [Phys. Rev. E 76, 041113 (2007)]. In this paper, we pursue this moment-based approach further and develop a more rigorous generalized method of moments (GMM) estimation procedure to cope with the documented difficulties of previous methodologies. We show that even under uncertainty about the actual number of cascade steps, our methodology yields very reliable results for the estimated intermittency parameter. Employing the Levinson-Durbin algorithm for best linear forecasts, we also show that estimated parameters can be used for forecasting the evolution of the turbulent flow. We compare forecasting results from the GMM and Kiyono et al.'s procedure via Monte Carlo simulations. We finally test the applicability of our approach by estimating the intermittency parameter and forecasting of volatility for a sample of financial data from stock and foreign exchange markets.
Graphic comparison of reserve-growth models for conventional oil and accumulation
Klett, T.R.
2003-01-01
The U.S. Geological Survey (USGS) periodically assesses crude oil, natural gas, and natural gas liquids resources of the world. The assessment procedure requires estimated recover-able oil and natural gas volumes (field size, cumulative production plus remaining reserves) in discovered fields. Because initial reserves are typically conservative, subsequent estimates increase through time as these fields are developed and produced. The USGS assessment of petroleum resources makes estimates, or forecasts, of the potential additions to reserves in discovered oil and gas fields resulting from field development, and it also estimates the potential fully developed sizes of undiscovered fields. The term ?reserve growth? refers to the commonly observed upward adjustment of reserve estimates. Because such additions are related to increases in the total size of a field, the USGS uses field sizes to model reserve growth. Future reserve growth in existing fields is a major component of remaining U.S. oil and natural gas resources and has therefore become a necessary element of U.S. petroleum resource assessments. Past and currently proposed reserve-growth models compared herein aid in the selection of a suitable set of forecast functions to provide an estimate of potential additions to reserves from reserve growth in the ongoing National Oil and Gas Assessment Project (NOGA). Reserve growth is modeled by construction of a curve that represents annual fractional changes of recoverable oil and natural gas volumes (for fields and reservoirs), which provides growth factors. Growth factors are used to calculate forecast functions, which are sets of field- or reservoir-size multipliers. Comparisons of forecast functions were made based on datasets used to construct the models, field type, modeling method, and length of forecast span. Comparisons were also made between forecast functions based on field-level and reservoir- level growth, and between forecast functions based on older and newer data. The reserve-growth model used in the 1995 USGS National Assessment and the model currently used in the NOGA project provide forecast functions that yield similar estimates of potential additions to reserves. Both models are based on the Oil and Gas Integrated Field File from the Energy Information Administration (EIA), but different vintages of data (from 1977 through 1991 and 1977 through 1996, respectively). The model based on newer data can be used in place of the previous model, providing similar estimates of potential additions to reserves. Fore-cast functions for oil fields vary little from those for gas fields in these models; therefore, a single function may be used for both oil and gas fields, like that used in the USGS World Petroleum Assessment 2000. Forecast functions based on the field-level reserve growth model derived from the NRG Associates databases (from 1982 through 1998) differ from those derived from EIA databases (from 1977 through 1996). However, the difference may not be enough to preclude the use of the forecast functions derived from NRG data in place of the forecast functions derived from EIA data. Should the model derived from NRG data be used, separate forecast functions for oil fields and gas fields must be employed. The forecast function for oil fields from the model derived from NRG data varies significantly from that for gas fields, and a single function for both oil and gas fields may not be appropriate.
Murray, Christopher J L; Laakso, Thomas; Shibuya, Kenji; Hill, Kenneth; Lopez, Alan D
2007-09-22
Global efforts have increased the accuracy and timeliness of estimates of under-5 mortality; however, these estimates fail to use all data available, do not use transparent and reproducible methods, do not distinguish predictions from measurements, and provide no indication of uncertainty around point estimates. We aimed to develop new reproducible methods and reanalyse existing data to elucidate detailed time trends. We merged available databases, added to them when possible, and then applied Loess regression to estimate past trends and forecast to 2015 for 172 countries. We developed uncertainty estimates based on different model specifications and estimated levels and trends in neonatal, post-neonatal, and childhood mortality. Global under-5 mortality has fallen from 110 (109-110) per 1000 in 1980 to 72 (70-74) per 1000 in 2005. Child deaths worldwide have decreased from 13.5 (13.4-13.6) million in 1980 to an estimated 9.7 (9.5-10.0) million in 2005. Global under-5 mortality is expected to decline by 27% from 1990 to 2015, substantially less than the target of Millennium Development Goal 4 (MDG4) of a 67% decrease. Several regions in Latin America, north Africa, the Middle East, Europe, and southeast Asia have had consistent annual rates of decline in excess of 4% over 35 years. Global progress on MDG4 is dominated by slow reductions in sub-Saharan Africa, which also has the slowest rates of decline in fertility. Globally, we are not doing a better job of reducing child mortality now than we were three decades ago. Further improvements in the quality and timeliness of child-mortality measurements should be possible by more fully using existing datasets and applying standard analytical strategies.
Economic benefits of improved meteorological forecasts - The construction industry
NASA Technical Reports Server (NTRS)
Bhattacharyya, R. K.; Greenberg, J. S.
1976-01-01
Estimates are made of the potential economic benefits accruing to particular industries from timely utilization of satellite-derived six-hour weather forecasts, and of economic penalties resulting from failure to utilize such forecasts in day-to-day planning. The cost estimate study is centered on the U.S. construction industry, with results simplified to yes/no 6-hr forecasts on thunderstorm activity and work/no work decisions. Effects of weather elements (thunderstorms, snow and sleet) on various construction operations are indicated. Potential dollar benefits for other industries, including air transportation and other forms of transportation, are diagrammed for comparison. Geosynchronous satellites such as STORMSAT, SEOS, and SMS/GOES are considered as sources of the forecast data.
NASA Astrophysics Data System (ADS)
Medina, Hanoi; Tian, Di; Srivastava, Puneet; Pelosi, Anna; Chirico, Giovanni B.
2018-07-01
Reference evapotranspiration (ET0) plays a fundamental role in agronomic, forestry, and water resources management. Estimating and forecasting ET0 have long been recognized as a major challenge for researchers and practitioners in these communities. This work explored the potential of multiple leading numerical weather predictions (NWPs) for estimating and forecasting summer ET0 at 101 U.S. Regional Climate Reference Network stations over nine climate regions across the contiguous United States (CONUS). Three leading global NWP model forecasts from THORPEX Interactive Grand Global Ensemble (TIGGE) dataset were used in this study, including the single model ensemble forecasts from the European Centre for Medium-Range Weather Forecasts (EC), the National Centers for Environmental Prediction Global Forecast System (NCEP), and the United Kingdom Meteorological Office forecasts (MO), as well as multi-model ensemble forecasts from the combinations of these NWP models. A regression calibration was employed to bias correct the ET0 forecasts. Impact of individual forecast variables on ET0 forecasts were also evaluated. The results showed that the EC forecasts provided the least error and highest skill and reliability, followed by the MO and NCEP forecasts. The multi-model ensembles constructed from the combination of EC and MO forecasts provided slightly better performance than the single model EC forecasts. The regression process greatly improved ET0 forecast performances, particularly for the regions involving stations near the coast, or with a complex orography. The performance of EC forecasts was only slightly influenced by the size of the ensemble members, particularly at short lead times. Even with less ensemble members, EC still performed better than the other two NWPs. Errors in the radiation forecasts, followed by those in the wind, had the most detrimental effects on the ET0 forecast performances.
Forecasting the mortality rates of Indonesian population by using neural network
NASA Astrophysics Data System (ADS)
Safitri, Lutfiani; Mardiyati, Sri; Rahim, Hendrisman
2018-03-01
A model that can represent a problem is required in conducting a forecasting. One of the models that has been acknowledged by the actuary community in forecasting mortality rate is the Lee-Certer model. Lee Carter model supported by Neural Network will be used to calculate mortality forecasting in Indonesia. The type of Neural Network used is feedforward neural network aligned with backpropagation algorithm in python programming language. And the final result of this study is mortality rate in forecasting Indonesia for the next few years
A Decision Support System for effective use of probability forecasts
NASA Astrophysics Data System (ADS)
De Kleermaeker, Simone; Verkade, Jan
2013-04-01
Often, water management decisions are based on hydrological forecasts. These forecasts, however, are affected by inherent uncertainties. It is increasingly common for forecasting agencies to make explicit estimates of these uncertainties and thus produce probabilistic forecasts. Associated benefits include the decision makers' increased awareness of forecasting uncertainties and the potential for risk-based decision-making. Also, a stricter separation of responsibilities between forecasters and decision maker can be made. However, simply having probabilistic forecasts available is not sufficient to realise the associated benefits. Additional effort is required in areas such as forecast visualisation and communication, decision making in uncertainty and forecast verification. Also, revised separation of responsibilities requires a shift in institutional arrangements and responsibilities. A recent study identified a number of additional issues related to the effective use of probability forecasts. When moving from deterministic to probability forecasting, a dimension is added to an already multi-dimensional problem; this makes it increasingly difficult for forecast users to extract relevant information from a forecast. A second issue is that while probability forecasts provide a necessary ingredient for risk-based decision making, other ingredients may not be present. For example, in many cases no estimates of flood damage, of costs of management measures and of damage reduction are available. This paper presents the results of the study, including some suggestions for resolving these issues and the integration of those solutions in a prototype decision support system (DSS). A pathway for further development of the DSS is outlined.
Applications of Principled Search Methods in Climate Influences and Mechanisms
NASA Technical Reports Server (NTRS)
Glymour, Clark
2005-01-01
Forest and grass fires cause economic losses in the billions of dollars in the U.S. alone. In addition, boreal forests constitute a large carbon store; it has been estimated that, were no burning to occur, an additional 7 gigatons of carbon would be sequestered in boreal soils each century. Effective wildfire suppression requires anticipation of locales and times for which wildfire is most probable, preferably with a two to four week forecast, so that limited resources can be efficiently deployed. The United States Forest Service (USFS), and other experts and agencies have developed several measures of fire risk combining physical principles and expert judgment, and have used them in automated procedures for forecasting fire risk. Forecasting accuracies for some fire risk indices in combination with climate and other variables have been estimated for specific locations, with the value of fire risk index variables assessed by their statistical significance in regressions. In other cases, the MAPSS forecasts [23, 241 for example, forecasting accuracy has been estimated only by simulated data. We describe alternative forecasting methods that predict fire probability by locale and time using statistical or machine learning procedures trained on historical data, and we give comparative assessments of their forecasting accuracy for one fire season year, April- October, 2003, for all U.S. Forest Service lands. Aside from providing an accuracy baseline for other forecasting methods, the results illustrate the interdependence between the statistical significance of prediction variables and the forecasting method used.
Real-time numerical forecast of global epidemic spreading: case study of 2009 A/H1N1pdm.
Tizzoni, Michele; Bajardi, Paolo; Poletto, Chiara; Ramasco, José J; Balcan, Duygu; Gonçalves, Bruno; Perra, Nicola; Colizza, Vittoria; Vespignani, Alessandro
2012-12-13
Mathematical and computational models for infectious diseases are increasingly used to support public-health decisions; however, their reliability is currently under debate. Real-time forecasts of epidemic spread using data-driven models have been hindered by the technical challenges posed by parameter estimation and validation. Data gathered for the 2009 H1N1 influenza crisis represent an unprecedented opportunity to validate real-time model predictions and define the main success criteria for different approaches. We used the Global Epidemic and Mobility Model to generate stochastic simulations of epidemic spread worldwide, yielding (among other measures) the incidence and seeding events at a daily resolution for 3,362 subpopulations in 220 countries. Using a Monte Carlo Maximum Likelihood analysis, the model provided an estimate of the seasonal transmission potential during the early phase of the H1N1 pandemic and generated ensemble forecasts for the activity peaks in the northern hemisphere in the fall/winter wave. These results were validated against the real-life surveillance data collected in 48 countries, and their robustness assessed by focusing on 1) the peak timing of the pandemic; 2) the level of spatial resolution allowed by the model; and 3) the clinical attack rate and the effectiveness of the vaccine. In addition, we studied the effect of data incompleteness on the prediction reliability. Real-time predictions of the peak timing are found to be in good agreement with the empirical data, showing strong robustness to data that may not be accessible in real time (such as pre-exposure immunity and adherence to vaccination campaigns), but that affect the predictions for the attack rates. The timing and spatial unfolding of the pandemic are critically sensitive to the level of mobility data integrated into the model. Our results show that large-scale models can be used to provide valuable real-time forecasts of influenza spreading, but they require high-performance computing. The quality of the forecast depends on the level of data integration, thus stressing the need for high-quality data in population-based models, and of progressive updates of validated available empirical knowledge to inform these models.
Stochastic demographic forecasting.
Lee, R D
1992-11-01
"This paper describes a particular approach to stochastic population forecasting, which is implemented for the U.S.A. through 2065. Statistical time series methods are combined with demographic models to produce plausible long run forecasts of vital rates, with probability distributions. The resulting mortality forecasts imply gains in future life expectancy that are roughly twice as large as those forecast by the Office of the Social Security Actuary.... Resulting stochastic forecasts of the elderly population, elderly dependency ratios, and payroll tax rates for health, education and pensions are presented." excerpt
Evaluation of Flood Forecast and Warning in Elbe river basin - Impact of Forecaster's Strategy
NASA Astrophysics Data System (ADS)
Danhelka, Jan; Vlasak, Tomas
2010-05-01
Czech Hydrometeorological Institute (CHMI) is responsible for flood forecasting and warning in the Czech Republic. To meet that issue CHMI operates hydrological forecasting systems and publish flow forecast in selected profiles. Flood forecast and warning is an output of system that links observation (flow and atmosphere), data processing, weather forecast (especially NWP's QPF), hydrological modeling and modeled outputs evaluation and interpretation by forecaster. Forecast users are interested in final output without separating uncertainties of separate steps of described process. Therefore an evaluation of final operational forecasts was done for profiles within Elbe river basin produced by AquaLog forecasting system during period 2002 to 2008. Effects of uncertainties of observation, data processing and especially meteorological forecasts were not accounted separately. Forecast of flood levels exceedance (peak over the threshold) during forecasting period was the main criterion as flow increase forecast is of the highest importance. Other evaluation criteria included peak flow and volume difference. In addition Nash-Sutcliffe was computed separately for each time step (1 to 48 h) of forecasting period to identify its change with the lead time. Textual flood warnings are issued for administrative regions to initiate flood protection actions in danger of flood. Flood warning hit rate was evaluated at regions level and national level. Evaluation found significant differences of model forecast skill between forecasting profiles, particularly less skill was evaluated at small headwater basins due to domination of QPF uncertainty in these basins. The average hit rate was 0.34 (miss rate = 0.33, false alarm rate = 0.32). However its explored spatial difference is likely to be influenced also by different fit of parameters sets (due to different basin characteristics) and importantly by different impact of human factor. Results suggest that the practice of interactive model operation, experience and forecasting strategy differs between responsible forecasting offices. Warning is based on model outputs interpretation by hydrologists-forecaster. Warning hit rate reached 0.60 for threshold set to lowest flood stage of which 0.11 was underestimation of flood degree (miss 0.22, false alarm 0.28). Critical success index of model forecast was 0.34, while the same criteria for warning reached 0.55. We assume that the increase accounts not only to change of scale from single forecasting point to region for warning, but partly also to forecaster's added value. There is no official warning strategy preferred in the Czech Republic (f.e. tolerance towards higher false alarm rate). Therefore forecaster decision and personal strategy is of great importance. Results show quite successful warning for 1st flood level exceedance, over-warning for 2nd flood level, but under-warning for 3rd (highest) flood level. That suggests general forecaster's preference of medium level warning (2nd flood level is legally determined to be the start of the flood and flood protection activities). In conclusion human forecaster's experience and analysis skill increases flood warning performance notably. However society preference should be specifically addressed in the warning strategy definition to support forecaster's decision making.
Model-free aftershock forecasts constructed from similar sequences in the past
NASA Astrophysics Data System (ADS)
van der Elst, N.; Page, M. T.
2017-12-01
The basic premise behind aftershock forecasting is that sequences in the future will be similar to those in the past. Forecast models typically use empirically tuned parametric distributions to approximate past sequences, and project those distributions into the future to make a forecast. While parametric models do a good job of describing average outcomes, they are not explicitly designed to capture the full range of variability between sequences, and can suffer from over-tuning of the parameters. In particular, parametric forecasts may produce a high rate of "surprises" - sequences that land outside the forecast range. Here we present a non-parametric forecast method that cuts out the parametric "middleman" between training data and forecast. The method is based on finding past sequences that are similar to the target sequence, and evaluating their outcomes. We quantify similarity as the Poisson probability that the observed event count in a past sequence reflects the same underlying intensity as the observed event count in the target sequence. Event counts are defined in terms of differential magnitude relative to the mainshock. The forecast is then constructed from the distribution of past sequences outcomes, weighted by their similarity. We compare the similarity forecast with the Reasenberg and Jones (RJ95) method, for a set of 2807 global aftershock sequences of M≥6 mainshocks. We implement a sequence-specific RJ95 forecast using a global average prior and Bayesian updating, but do not propagate epistemic uncertainty. The RJ95 forecast is somewhat more precise than the similarity forecast: 90% of observed sequences fall within a factor of two of the median RJ95 forecast value, whereas the fraction is 85% for the similarity forecast. However, the surprise rate is much higher for the RJ95 forecast; 10% of observed sequences fall in the upper 2.5% of the (Poissonian) forecast range. The surprise rate is less than 3% for the similarity forecast. The similarity forecast may be useful to emergency managers and non-specialists when confidence or expertise in parametric forecasting may be lacking. The method makes over-tuning impossible, and minimizes the rate of surprises. At the least, this forecast constitutes a useful benchmark for more precisely tuned parametric forecasts.
Baik, Inkyung
2018-06-01
There are few studies that forecast the future prevalence of obesity based on the predicted prevalence model including contributing factors. The present study aimed to identify factors associated with obesity and construct forecasting models including significant contributing factors to estimate the 2020 and 2030 prevalence of obesity and abdominal obesity. Panel data from the Korea National Health and Nutrition Examination Survey and national statistics from the Korean Statistical Information Service were used for the analysis. The study subjects were 17,685 male and 24,899 female adults aged 19 years or older. The outcome variables were the prevalence of obesity (body mass index ≥ 25 kg/m 2 ) and abdominal obesity (waist circumference ≥ 90 cm for men and ≥ 85 cm for women). Stepwise logistic regression analysis was used to select significant variables from potential exposures. The survey year, age, marital status, job status, income status, smoking, alcohol consumption, sleep duration, psychological factors, dietary intake, and fertility rate were found to contribute to the prevalence of obesity and abdominal obesity. Based on the forecasting models including these variables, the 2020 and 2030 estimates for obesity prevalence were 47% and 62% for men and 32% and 37% for women, respectively. The present study suggested an increased prevalence of obesity and abdominal obesity in 2020 and 2030. Lifestyle factors were found to be significantly associated with the increasing trend in obesity prevalence and, therefore, they may require modification to prevent the rising trend.
Estimation Methods for Non-Homogeneous Regression - Minimum CRPS vs Maximum Likelihood
NASA Astrophysics Data System (ADS)
Gebetsberger, Manuel; Messner, Jakob W.; Mayr, Georg J.; Zeileis, Achim
2017-04-01
Non-homogeneous regression models are widely used to statistically post-process numerical weather prediction models. Such regression models correct for errors in mean and variance and are capable to forecast a full probability distribution. In order to estimate the corresponding regression coefficients, CRPS minimization is performed in many meteorological post-processing studies since the last decade. In contrast to maximum likelihood estimation, CRPS minimization is claimed to yield more calibrated forecasts. Theoretically, both scoring rules used as an optimization score should be able to locate a similar and unknown optimum. Discrepancies might result from a wrong distributional assumption of the observed quantity. To address this theoretical concept, this study compares maximum likelihood and minimum CRPS estimation for different distributional assumptions. First, a synthetic case study shows that, for an appropriate distributional assumption, both estimation methods yield to similar regression coefficients. The log-likelihood estimator is slightly more efficient. A real world case study for surface temperature forecasts at different sites in Europe confirms these results but shows that surface temperature does not always follow the classical assumption of a Gaussian distribution. KEYWORDS: ensemble post-processing, maximum likelihood estimation, CRPS minimization, probabilistic temperature forecasting, distributional regression models
Transportation Sector Model of the National Energy Modeling System. Volume 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-01-01
This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model. The NEMS Transportation Model comprises a series of semi-independent models which address different aspects of the transportation sector. The primary purpose of this model is to provide mid-term forecasts of transportation energy demand by fuel type including, but not limited to, motor gasoline, distillate, jet fuel, and alternative fuels (such as CNG) not commonly associated with transportation. Themore » current NEMS forecast horizon extends to the year 2010 and uses 1990 as the base year. Forecasts are generated through the separate consideration of energy consumption within the various modes of transport, including: private and fleet light-duty vehicles; aircraft; marine, rail, and truck freight; and various modes with minor overall impacts, such as mass transit and recreational boating. This approach is useful in assessing the impacts of policy initiatives, legislative mandates which affect individual modes of travel, and technological developments. The model also provides forecasts of selected intermediate values which are generated in order to determine energy consumption. These elements include estimates of passenger travel demand by automobile, air, or mass transit; estimates of the efficiency with which that demand is met; projections of vehicle stocks and the penetration of new technologies; and estimates of the demand for freight transport which are linked to forecasts of industrial output. Following the estimation of energy demand, TRAN produces forecasts of vehicular emissions of the following pollutants by source: oxides of sulfur, oxides of nitrogen, total carbon, carbon dioxide, carbon monoxide, and volatile organic compounds.« less
Structural and parameteric uncertainty quantification in cloud microphysics parameterization schemes
NASA Astrophysics Data System (ADS)
van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.; Martinkus, C.
2017-12-01
Atmospheric model parameterization schemes employ approximations to represent the effects of unresolved processes. These approximations are a source of error in forecasts, caused in part by considerable uncertainty about the optimal value of parameters within each scheme -- parameteric uncertainty. Furthermore, there is uncertainty regarding the best choice of the overarching structure of the parameterization scheme -- structrual uncertainty. Parameter estimation can constrain the first, but may struggle with the second because structural choices are typically discrete. We address this problem in the context of cloud microphysics parameterization schemes by creating a flexible framework wherein structural and parametric uncertainties can be simultaneously constrained. Our scheme makes no assuptions about drop size distribution shape or the functional form of parametrized process rate terms. Instead, these uncertainties are constrained by observations using a Markov Chain Monte Carlo sampler within a Bayesian inference framework. Our scheme, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), has flexibility to predict various sets of prognostic drop size distribution moments as well as varying complexity of process rate formulations. We compare idealized probabilistic forecasts from versions of BOSS with varying levels of structural complexity. This work has applications in ensemble forecasts with model physics uncertainty, data assimilation, and cloud microphysics process studies.
Florida Model Information eXchange System (MIXS).
DOT National Transportation Integrated Search
2013-08-01
Transportation planning largely relies on travel demand forecasting, which estimates the number and type of vehicles that will use a roadway at some point in the future. Forecasting estimates are made by computer models that use a wide variety of dat...
Parameter estimation and forecasting for multiplicative log-normal cascades
NASA Astrophysics Data System (ADS)
Leövey, Andrés E.; Lux, Thomas
2012-04-01
We study the well-known multiplicative log-normal cascade process in which the multiplication of Gaussian and log normally distributed random variables yields time series with intermittent bursts of activity. Due to the nonstationarity of this process and the combinatorial nature of such a formalism, its parameters have been estimated mostly by fitting the numerical approximation of the associated non-Gaussian probability density function to empirical data, cf. Castaing [Physica DPDNPDT0167-278910.1016/0167-2789(90)90035-N 46, 177 (1990)]. More recently, alternative estimators based upon various moments have been proposed by Beck [Physica DPDNPDT0167-278910.1016/j.physd.2004.01.020 193, 195 (2004)] and Kiyono [Phys. Rev. EPLEEE81539-375510.1103/PhysRevE.76.041113 76, 041113 (2007)]. In this paper, we pursue this moment-based approach further and develop a more rigorous generalized method of moments (GMM) estimation procedure to cope with the documented difficulties of previous methodologies. We show that even under uncertainty about the actual number of cascade steps, our methodology yields very reliable results for the estimated intermittency parameter. Employing the Levinson-Durbin algorithm for best linear forecasts, we also show that estimated parameters can be used for forecasting the evolution of the turbulent flow. We compare forecasting results from the GMM and Kiyono 's procedure via Monte Carlo simulations. We finally test the applicability of our approach by estimating the intermittency parameter and forecasting of volatility for a sample of financial data from stock and foreign exchange markets.
Extended Kalman Filter framework for forecasting shoreline evolution
Long, Joseph; Plant, Nathaniel G.
2012-01-01
A shoreline change model incorporating both long- and short-term evolution is integrated into a data assimilation framework that uses sparse observations to generate an updated forecast of shoreline position and to estimate unobserved geophysical variables and model parameters. Application of the assimilation algorithm provides quantitative statistical estimates of combined model-data forecast uncertainty which is crucial for developing hazard vulnerability assessments, evaluation of prediction skill, and identifying future data collection needs. Significant attention is given to the estimation of four non-observable parameter values and separating two scales of shoreline evolution using only one observable morphological quantity (i.e. shoreline position).
NASA Astrophysics Data System (ADS)
Yi, J.; Choi, C.
2014-12-01
Rainfall observation and forecasting using remote sensing such as RADAR(Radio Detection and Ranging) and satellite images are widely used to delineate the increased damage by rapid weather changeslike regional storm and flash flood. The flood runoff was calculated by using adaptive neuro-fuzzy inference system, the data driven models and MAPLE(McGill Algorithm for Precipitation Nowcasting by Lagrangian Extrapolation) forecasted precipitation data as the input variables.The result of flood estimation method using neuro-fuzzy technique and RADAR forecasted precipitation data was evaluated by comparing it with the actual data.The Adaptive Neuro Fuzzy method was applied to the Chungju Reservoir basin in Korea. The six rainfall events during the flood seasons in 2010 and 2011 were used for the input data.The reservoir inflow estimation results were comparedaccording to the rainfall data used for training, checking and testing data in the model setup process. The results of the 15 models with the combination of the input variables were compared and analyzed. Using the relatively larger clustering radius and the biggest flood ever happened for training data showed the better flood estimation in this study.The model using the MAPLE forecasted precipitation data showed better result for inflow estimation in the Chungju Reservoir.
Satellite-based Calibration of Heat Flux at the Ocean Surface
NASA Astrophysics Data System (ADS)
Barron, C. N.; Dastugue, J. M.; May, J. C.; Rowley, C. D.; Smith, S. R.; Spence, P. L.; Gremes-Cordero, S.
2016-02-01
Model forecasts of upper ocean heat content and variability on diurnal to daily scales are highly dependent on estimates of heat flux through the air-sea interface. Satellite remote sensing is applied to not only inform the initial ocean state but also to mitigate errors in surface heat flux and model representations affecting the distribution of heat in the upper ocean. Traditional assimilation of sea surface temperature (SST) observations re-centers ocean models at the start of each forecast cycle. Subsequent evolution depends on estimates of surface heat fluxes and upper-ocean processes over the forecast period. The COFFEE project (Calibration of Ocean Forcing with satellite Flux Estimates) endeavors to correct ocean forecast bias through a responsive error partition among surface heat flux and ocean dynamics sources. A suite of experiments in the southern California Current demonstrates a range of COFFEE capabilities, showing the impact on forecast error relative to a baseline three-dimensional variational (3DVAR) assimilation using Navy operational global or regional atmospheric forcing. COFFEE addresses satellite-calibration of surface fluxes to estimate surface error covariances and links these to the ocean interior. Experiment cases combine different levels of flux calibration with different assimilation alternatives. The cases may use the original fluxes, apply full satellite corrections during the forecast period, or extend hindcast corrections into the forecast period. Assimilation is either baseline 3DVAR or standard strong-constraint 4DVAR, with work proceeding to add a 4DVAR expanded to include a weak constraint treatment of the surface flux errors. Covariance of flux errors is estimated from the recent time series of forecast and calibrated flux terms. While the California Current examples are shown, the approach is equally applicable to other regions. These approaches within a 3DVAR application are anticipated to be useful for global and larger regional domains where a full 4DVAR methodology may be cost-prohibitive.
NASA Astrophysics Data System (ADS)
Cranston, Michael; Speight, Linda; Maxey, Richard; Tavendale, Amy; Buchanan, Peter
2015-04-01
One of the main challenges for the flood forecasting community remains the provision of reliable early warnings of surface (or pluvial) flooding. The Scottish Flood Forecasting Service has been developing approaches for forecasting the risk of surface water flooding including capitalising on the latest developments in quantitative precipitation forecasting from the Met Office. A probabilistic Heavy Rainfall Alert decision support tool helps operational forecasters assess the likelihood of surface water flooding against regional rainfall depth-duration estimates from MOGREPS-UK linked to historical short-duration flooding in Scotland. The surface water flood risk is communicated through the daily Flood Guidance Statement to emergency responders. A more recent development is an innovative risk-based hydrometeorological approach that links 24-hour ensemble rainfall forecasts through a hydrological model (Grid-to-Grid) to a library of impact assessments (Speight et al., 2015). The early warning tool - FEWS Glasgow - presents the risk of flooding to people, property and transport across a 1km grid over the city of Glasgow with a lead time of 24 hours. Communication of the risk was presented in a bespoke surface water flood forecast product designed based on emergency responder requirements and trialled during the 2014 Commonwealth Games in Glasgow. The development of new approaches to surface water flood forecasting are leading to improved methods of communicating the risk and better performance in early warning with a reduction in false alarm rates with summer flood guidance in 2014 (67%) compared to 2013 (81%) - although verification of instances of surface water flooding remains difficult. However the introduction of more demanding hydrometeorological capabilities with associated greater levels of uncertainty does lead to an increased demand on operational flood forecasting skills and resources. Speight, L., Cole, S.J., Moore, R.J., Pierce, C., Wright, B., Golding, B., Cranston, M., Tavendale, A., Ghimire, S., and Dhondia, J. (2015) Developing surface water flood forecasting capabilities in Scotland: an operational pilot for the 2014 Commonwealth Games in Glasgow. Journal of Flood Risk Management, In Press.
Against all odds -- Probabilistic forecasts and decision making
NASA Astrophysics Data System (ADS)
Liechti, Katharina; Zappa, Massimiliano
2015-04-01
In the city of Zurich (Switzerland) the setting is such that the damage potential due to flooding of the river Sihl is estimated to about 5 billion US dollars. The flood forecasting system that is used by the administration for decision making runs continuously since 2007. It has a time horizon of max. five days and operates at hourly time steps. The flood forecasting system includes three different model chains. Two of those are run by the deterministic NWP models COSMO-2 and COSMO-7 and one is driven by the probabilistic NWP COSMO-Leps. The model chains are consistent since February 2010, so five full years are available for the evaluation for the system. The system was evaluated continuously and is a very nice example to present the added value that lies in probabilistic forecasts. The forecasts are available on an online-platform to the decision makers. Several graphical representations of the forecasts and forecast-history are available to support decision making and to rate the current situation. The communication between forecasters and decision-makers is quite close. To put it short, an ideal situation. However, an event or better put a non-event in summer 2014 showed that the knowledge about the general superiority of probabilistic forecasts doesn't necessarily mean that the decisions taken in a specific situation will be based on that probabilistic forecast. Some years of experience allow gaining confidence in the system, both for the forecasters and for the decision-makers. Even if from the theoretical point of view the handling during crisis situation is well designed, a first event demonstrated that the dialog with the decision-makers still lacks of exercise during such situations. We argue, that a false alarm is a needed experience to consolidate real-time emergency procedures relying on ensemble predictions. A missed event would probably also fit, but, in our case, we are very happy not to report about this option.
A simple approach to measure transmissibility and forecast incidence.
Nouvellet, Pierre; Cori, Anne; Garske, Tini; Blake, Isobel M; Dorigatti, Ilaria; Hinsley, Wes; Jombart, Thibaut; Mills, Harriet L; Nedjati-Gilani, Gemma; Van Kerkhove, Maria D; Fraser, Christophe; Donnelly, Christl A; Ferguson, Neil M; Riley, Steven
2018-03-01
Outbreaks of novel pathogens such as SARS, pandemic influenza and Ebola require substantial investments in reactive interventions, with consequent implementation plans sometimes revised on a weekly basis. Therefore, short-term forecasts of incidence are often of high priority. In light of the recent Ebola epidemic in West Africa, a forecasting exercise was convened by a network of infectious disease modellers. The challenge was to forecast unseen "future" simulated data for four different scenarios at five different time points. In a similar method to that used during the recent Ebola epidemic, we estimated current levels of transmissibility, over variable time-windows chosen in an ad hoc way. Current estimated transmissibility was then used to forecast near-future incidence. We performed well within the challenge and often produced accurate forecasts. A retrospective analysis showed that our subjective method for deciding on the window of time with which to estimate transmissibility often resulted in the optimal choice. However, when near-future trends deviated substantially from exponential patterns, the accuracy of our forecasts was reduced. This exercise highlights the urgent need for infectious disease modellers to develop more robust descriptions of processes - other than the widespread depletion of susceptible individuals - that produce non-exponential patterns of incidence. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Forecasting the short-term passenger flow on high-speed railway with neural networks.
Xie, Mei-Quan; Li, Xia-Miao; Zhou, Wen-Liang; Fu, Yan-Bing
2014-01-01
Short-term passenger flow forecasting is an important component of transportation systems. The forecasting result can be applied to support transportation system operation and management such as operation planning and revenue management. In this paper, a divide-and-conquer method based on neural network and origin-destination (OD) matrix estimation is developed to forecast the short-term passenger flow in high-speed railway system. There are three steps in the forecasting method. Firstly, the numbers of passengers who arrive at each station or depart from each station are obtained from historical passenger flow data, which are OD matrices in this paper. Secondly, short-term passenger flow forecasting of the numbers of passengers who arrive at each station or depart from each station based on neural network is realized. At last, the OD matrices in short-term time are obtained with an OD matrix estimation method. The experimental results indicate that the proposed divide-and-conquer method performs well in forecasting the short-term passenger flow on high-speed railway.
Uncertainty Forecasts Improve Weather-Related Decisions and Attenuate the Effects of Forecast Error
ERIC Educational Resources Information Center
Joslyn, Susan L.; LeClerc, Jared E.
2012-01-01
Although uncertainty is inherent in weather forecasts, explicit numeric uncertainty estimates are rarely included in public forecasts for fear that they will be misunderstood. Of particular concern are situations in which precautionary action is required at low probabilities, often the case with severe events. At present, a categorical weather…
NASA Astrophysics Data System (ADS)
Pierro, Marco; De Felice, Matteo; Maggioni, Enrico; Moser, David; Perotto, Alessandro; Spada, Francesco; Cornaro, Cristina
2017-04-01
The growing photovoltaic generation results in a stochastic variability of the electric demand that could compromise the stability of the grid and increase the amount of energy reserve and the energy imbalance cost. On regional scale, solar power estimation and forecast is becoming essential for Distribution System Operators, Transmission System Operator, energy traders, and aggregators of generation. Indeed the estimation of regional PV power can be used for PV power supervision and real time control of residual load. Mid-term PV power forecast can be employed for transmission scheduling to reduce energy imbalance and related cost of penalties, residual load tracking, trading optimization, secondary energy reserve assessment. In this context, a new upscaling method was developed and used for estimation and mid-term forecast of the photovoltaic distributed generation in a small area in the north of Italy under the control of a local DSO. The method was based on spatial clustering of the PV fleet and neural networks models that input satellite or numerical weather prediction data (centered on cluster centroids) to estimate or predict the regional solar generation. It requires a low computational effort and very few input information should be provided by users. The power estimation model achieved a RMSE of 3% of installed capacity. Intra-day forecast (from 1 to 4 hours) obtained a RMSE of 5% - 7% while the one and two days forecast achieve to a RMSE of 7% and 7.5%. A model to estimate the forecast error and the prediction intervals was also developed. The photovoltaic production in the considered region provided the 6.9% of the electric consumption in 2015. Since the PV penetration is very similar to the one observed at national level (7.9%), this is a good case study to analyse the impact of PV generation on the electric grid and the effects of PV power forecast on transmission scheduling and on secondary reserve estimation. It appears that, already with 7% of PV penetration, the distributed PV generation could have a great impact both on the DSO energy need and on the transmission scheduling capability. Indeed, for some hours of the days in summer time, the photovoltaic generation can provide from 50% to 75% of the energy that the local DSO should buy from Italian TSO to cover the electrical demand. Moreover, mid-term forecast can reduce the annual energy imbalance between the scheduled transmission and the actual one from 10% of the TSO energy supply (without considering the PV forecast) to 2%. Furthermore, it was shown that prediction intervals could be used not only to estimate the probability of a specific PV generation bid on the energy market, but also to reduce the energy reserve predicted for the next day. Two different methods for energy reserve estimation were developed and tested. The first is based on a clear sky model while the second makes use of the PV prediction intervals with the 95% of confidence level. The latter reduces the amount of the day-ahead energy reserve of 36% with respect the clear sky method.
A nudging data assimilation algorithm for the identification of groundwater pumping
NASA Astrophysics Data System (ADS)
Cheng, Wei-Chen; Kendall, Donald R.; Putti, Mario; Yeh, William W.-G.
2009-08-01
This study develops a nudging data assimilation algorithm for estimating unknown pumping from private wells in an aquifer system using measured data of hydraulic head. The proposed algorithm treats the unknown pumping as an additional sink term in the governing equation of groundwater flow and provides a consistent physical interpretation for pumping rate identification. The algorithm identifies the unknown pumping and, at the same time, reduces the forecast error in hydraulic heads. We apply the proposed algorithm to the Las Posas Groundwater Basin in southern California. We consider the following three pumping scenarios: constant pumping rates, spatially varying pumping rates, and temporally varying pumping rates. We also study the impact of head measurement errors on the proposed algorithm. In the case study we seek to estimate the six unknown pumping rates from private wells using head measurements from four observation wells. The results show an excellent rate of convergence for pumping estimation. The case study demonstrates the applicability, accuracy, and efficiency of the proposed data assimilation algorithm for the identification of unknown pumping in an aquifer system.
A nudging data assimilation algorithm for the identification of groundwater pumping
NASA Astrophysics Data System (ADS)
Cheng, W.; Kendall, D. R.; Putti, M.; Yeh, W. W.
2008-12-01
This study develops a nudging data assimilation algorithm for estimating unknown pumping from private wells in an aquifer system using measurement data of hydraulic head. The proposed algorithm treats the unknown pumping as an additional sink term in the governing equation of groundwater flow and provides a consistently physical interpretation for pumping rate identification. The algorithm identifies unknown pumping and, at the same time, reduces the forecast error in hydraulic heads. We apply the proposed algorithm to the Las Posas Groundwater Basin in southern California. We consider the following three pumping scenarios: constant pumping rate, spatially varying pumping rates, and temporally varying pumping rates. We also study the impact of head measurement errors on the proposed algorithm. In the case study, we seek to estimate the six unknown pumping rates from private wells using head measurements from four observation wells. The results show excellent rate of convergence for pumping estimation. The case study demonstrates the applicability, accuracy, and efficiency of the proposed data assimilation algorithm for the identification of unknown pumping in an aquifer system.
Hasan, Md Tanvir; Soares Magalhaes, Ricardo J; Williams, Gail M; Mamun, Abdullah A
2015-07-01
To estimate the average annual rates of reduction of stunting, underweight and wasting for the period 1996 to 2011, and to evaluate whether Bangladesh will be expected to achieve the target of Millennium Development Goal 1C of reducing the prevalence of underweight by half by 2015. We used five nationwide, cross-sectional, Demographic and Health Survey data sets to estimate prevalence of undernutrition defined by stunting, underweight and wasting among children under 5 years of age using the WHO child growth standards. We then computed the average annual rates of reduction of prevalence of undernutrition using the formula derived by UNICEF. Finally, we projected the prevalence of undernutrition for the year 2015 using the estimated average annual rates of reduction. Nationwide covering Bangladesh. Children under 5 years of age (n 28,941). The prevalence of stunting decreased by 18.8% (from 60.0% to 41.2%), underweight by 16.0% (from 52.2% to 36.2%) and wasting by 5.1% (from 20.6% to 15.5%) during 1996 to 2011. The overall average annual rates of reduction were 2.84%, 2.69 % and 2.47%, respectively, for stunting, underweight and wasting. We forecast that in 2015, the prevalence of stunting, underweight and wasting will be 36.7%, 32.5% and 14.0%, respectively, at the national level. The prevalence of undernutrition is likely to remain high in rural areas, in the Sylhet division and in the poorest wealth quintile. Bangladesh is likely to achieve the Millennium Development Goal 1C target of reducing the prevalence of underweight by half by 2015. However, it is falling behind in reducing stunting and further investment is needed to reduce individual, household and environmental determinants of stunting in Bangladesh.
Forecasting United States heartworm Dirofilaria immitis prevalence in dogs.
Bowman, Dwight D; Liu, Yan; McMahan, Christopher S; Nordone, Shila K; Yabsley, Michael J; Lund, Robert B
2016-10-10
This paper forecasts next year's canine heartworm prevalence in the United States from 16 climate, geographic and societal factors. The forecast's construction and an assessment of its performance are described. The forecast is based on a spatial-temporal conditional autoregressive model fitted to over 31 million antigen heartworm tests conducted in the 48 contiguous United States during 2011-2015. The forecast uses county-level data on 16 predictive factors, including temperature, precipitation, median household income, local forest and surface water coverage, and presence/absence of eight mosquito species. Non-static factors are extrapolated into the forthcoming year with various statistical methods. The fitted model and factor extrapolations are used to estimate next year's regional prevalence. The correlation between the observed and model-estimated county-by-county heartworm prevalence for the 5-year period 2011-2015 is 0.727, demonstrating reasonable model accuracy. The correlation between 2015 observed and forecasted county-by-county heartworm prevalence is 0.940, demonstrating significant skill and showing that heartworm prevalence can be forecasted reasonably accurately. The forecast presented herein can a priori alert veterinarians to areas expected to see higher than normal heartworm activity. The proposed methods may prove useful for forecasting other diseases.
Error Estimation of An Ensemble Statistical Seasonal Precipitation Prediction Model
NASA Technical Reports Server (NTRS)
Shen, Samuel S. P.; Lau, William K. M.; Kim, Kyu-Myong; Li, Gui-Long
2001-01-01
This NASA Technical Memorandum describes an optimal ensemble canonical correlation forecasting model for seasonal precipitation. Each individual forecast is based on the canonical correlation analysis (CCA) in the spectral spaces whose bases are empirical orthogonal functions (EOF). The optimal weights in the ensemble forecasting crucially depend on the mean square error of each individual forecast. An estimate of the mean square error of a CCA prediction is made also using the spectral method. The error is decomposed onto EOFs of the predictand and decreases linearly according to the correlation between the predictor and predictand. Since new CCA scheme is derived for continuous fields of predictor and predictand, an area-factor is automatically included. Thus our model is an improvement of the spectral CCA scheme of Barnett and Preisendorfer. The improvements include (1) the use of area-factor, (2) the estimation of prediction error, and (3) the optimal ensemble of multiple forecasts. The new CCA model is applied to the seasonal forecasting of the United States (US) precipitation field. The predictor is the sea surface temperature (SST). The US Climate Prediction Center's reconstructed SST is used as the predictor's historical data. The US National Center for Environmental Prediction's optimally interpolated precipitation (1951-2000) is used as the predictand's historical data. Our forecast experiments show that the new ensemble canonical correlation scheme renders a reasonable forecasting skill. For example, when using September-October-November SST to predict the next season December-January-February precipitation, the spatial pattern correlation between the observed and predicted are positive in 46 years among the 50 years of experiments. The positive correlations are close to or greater than 0.4 in 29 years, which indicates excellent performance of the forecasting model. The forecasting skill can be further enhanced when several predictors are used.
Using Analog Ensemble to generate spatially downscaled probabilistic wind power forecasts
NASA Astrophysics Data System (ADS)
Delle Monache, L.; Shahriari, M.; Cervone, G.
2017-12-01
We use the Analog Ensemble (AnEn) method to generate probabilistic 80-m wind power forecasts. We use data from the NCEP GFS ( 28 km resolution) and NCEP NAM (12 km resolution). We use forecasts data from NAM and GFS, and analysis data from NAM which enables us to: 1) use a lower-resolution model to create higher-resolution forecasts, and 2) use a higher-resolution model to create higher-resolution forecasts. The former essentially increases computing speed and the latter increases forecast accuracy. An aggregated model of the former can be compared against the latter to measure the accuracy of the AnEn spatial downscaling. The AnEn works by taking a deterministic future forecast and comparing it with past forecasts. The model searches for the best matching estimates within the past forecasts and selects the predictand value corresponding to these past forecasts as the ensemble prediction for the future forecast. Our study is based on predicting wind speed and air density at more than 13,000 grid points in the continental US. We run the AnEn model twice: 1) estimating 80-m wind speed by using predictor variables such as temperature, pressure, geopotential height, U-component and V-component of wind, 2) estimating air density by using predictors such as temperature, pressure, and relative humidity. We use the air density values to correct the standard wind power curves for different values of air density. The standard deviation of the ensemble members (i.e. ensemble spread) will be used as the degree of difficulty to predict wind power at different locations. The value of the correlation coefficient between the ensemble spread and the forecast error determines the appropriateness of this measure. This measure is prominent for wind farm developers as building wind farms in regions with higher predictability will reduce the real-time risks of operating in the electricity markets.
Using Landslide Failure Forecast Models in Near Real Time: the Mt. de La Saxe case-study
NASA Astrophysics Data System (ADS)
Manconi, Andrea; Giordan, Daniele
2014-05-01
Forecasting the occurrence of landslide phenomena in space and time is a major scientific challenge. The approaches used to forecast landslides mainly depend on the spatial scale analyzed (regional vs. local), the temporal range of forecast (long- vs. short-term), as well as the triggering factor and the landslide typology considered. By focusing on short-term forecast methods for large, deep seated slope instabilities, the potential time of failure (ToF) can be estimated by studying the evolution of the landslide deformation over time (i.e., strain rate) provided that, under constant stress conditions, landslide materials follow creep mechanism before reaching rupture. In the last decades, different procedures have been proposed to estimate ToF by considering simplified empirical and/or graphical methods applied to time series of deformation data. Fukuzono, 1985 proposed a failure forecast method based on the experience performed during large scale laboratory experiments, which were aimed at observing the kinematic evolution of a landslide induced by rain. This approach, known also as the inverse-velocity method, considers the evolution over time of the inverse value of the surface velocity (v) as an indicator of the ToF, by assuming that failure approaches while 1/v tends to zero. Here we present an innovative method to aimed at achieving failure forecast of landslide phenomena by considering near-real-time monitoring data. Starting from the inverse velocity theory, we analyze landslide surface displacements on different temporal windows, and then apply straightforward statistical methods to obtain confidence intervals on the time of failure. Our results can be relevant to support the management of early warning systems during landslide emergency conditions, also when the predefined displacement and/or velocity thresholds are exceeded. In addition, our statistical approach for the definition of confidence interval and forecast reliability can be applied also to different failure forecast methods. We applied for the first time the herein presented approach in near real time during the emergency scenario relevant to the reactivation of the La Saxe rockslide, a large mass wasting menacing the population of Courmayeur, northern Italy, and the important European route E25. We show how the application of simplified but robust forecast models can be a convenient method to manage and support early warning systems during critical situations. References: Fukuzono T. (1985), A New Method for Predicting the Failure Time of a Slope, Proc. IVth International Conference and Field Workshop on Landslides, Tokyo.
Spatial Distribution of the Coefficient of Variation for the Paleo-Earthquakes in Japan
NASA Astrophysics Data System (ADS)
Nomura, S.; Ogata, Y.
2015-12-01
Renewal processes, point prccesses in which intervals between consecutive events are independently and identically distributed, are frequently used to describe this repeating earthquake mechanism and forecast the next earthquakes. However, one of the difficulties in applying recurrent earthquake models is the scarcity of the historical data. Most studied fault segments have few, or only one observed earthquake that often have poorly constrained historic and/or radiocarbon ages. The maximum likelihood estimate from such a small data set can have a large bias and error, which tends to yield high probability for the next event in a very short time span when the recurrence intervals have similar lengths. On the other hand, recurrence intervals at a fault depend on the long-term slip rate caused by the tectonic motion in average. In addition, recurrence times are also fluctuated by nearby earthquakes or fault activities which encourage or discourage surrounding seismicity. These factors have spatial trends due to the heterogeneity of tectonic motion and seismicity. Thus, this paper introduces a spatial structure on the key parameters of renewal processes for recurrent earthquakes and estimates it by using spatial statistics. Spatial variation of mean and variance parameters of recurrence times are estimated in Bayesian framework and the next earthquakes are forecasted by Bayesian predictive distributions. The proposal model is applied for recurrent earthquake catalog in Japan and its result is compared with the current forecast adopted by the Earthquake Research Committee of Japan.
Constraints on the FRB rate at 700-900 MHz
NASA Astrophysics Data System (ADS)
Connor, Liam; Lin, Hsiu-Hsien; Masui, Kiyoshi; Oppermann, Niels; Pen, Ue-Li; Peterson, Jeffrey B.; Roman, Alexander; Sievers, Jonathan
2016-07-01
Estimating the all-sky rate of fast radio bursts (FRBs) has been difficult due to small-number statistics and the fact that they are seen by disparate surveys in different regions of the sky. In this paper we provide limits for the FRB rate at 800 MHz based on the only burst detected at frequencies below 1.4 GHz, FRB 110523. We discuss the difficulties in rate estimation, particularly in providing an all-sky rate above a single fluence threshold. We find an implied rate between 700 and 900 MHz that is consistent with the rate at 1.4 GHz, scaling to 6.4^{+29.5}_{-5.0} × 10^3 sky-1 d-1 for an HTRU-like survey. This is promising for upcoming experiments below a GHz like CHIME and UTMOST, for which we forecast detection rates. Given 110523's discovery at 32σ with nothing weaker detected, down to the threshold of 8σ, we find consistency with a Euclidean flux distribution but disfavour steep distributions, ruling out γ > 2.2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendes, J.; Bessa, R.J.; Keko, H.
Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highlymore » dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty. Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios (with spatial and/or temporal dependence). Statistical approaches to uncertainty forecasting basically consist of estimating the uncertainty based on observed forecasting errors. Quantile regression (QR) is currently a commonly used approach in uncertainty forecasting. In Chapter 3, we propose new statistical approaches to the uncertainty estimation problem by employing kernel density forecast (KDF) methods. We use two estimators in both offline and time-adaptive modes, namely, the Nadaraya-Watson (NW) and Quantilecopula (QC) estimators. We conduct detailed tests of the new approaches using QR as a benchmark. One of the major issues in wind power generation are sudden and large changes of wind power output over a short period of time, namely ramping events. In Chapter 4, we perform a comparative study of existing definitions and methodologies for ramp forecasting. We also introduce a new probabilistic method for ramp event detection. The method starts with a stochastic algorithm that generates wind power scenarios, which are passed through a high-pass filter for ramp detection and estimation of the likelihood of ramp events to happen. The report is organized as follows: Chapter 2 presents the results of the application of ITL training criteria to deterministic WPF; Chapter 3 reports the study on probabilistic WPF, including new contributions to wind power uncertainty forecasting; Chapter 4 presents a new method to predict and visualize ramp events, comparing it with state-of-the-art methodologies; Chapter 5 briefly summarizes the main findings and contributions of this report.« less
What might we learn from climate forecasts?
Smith, Leonard A.
2002-01-01
Most climate models are large dynamical systems involving a million (or more) variables on big computers. Given that they are nonlinear and not perfect, what can we expect to learn from them about the earth's climate? How can we determine which aspects of their output might be useful and which are noise? And how should we distribute resources between making them “better,” estimating variables of true social and economic interest, and quantifying how good they are at the moment? Just as “chaos” prevents accurate weather forecasts, so model error precludes accurate forecasts of the distributions that define climate, yielding uncertainty of the second kind. Can we estimate the uncertainty in our uncertainty estimates? These questions are discussed. Ultimately, all uncertainty is quantified within a given modeling paradigm; our forecasts need never reflect the uncertainty in a physical system. PMID:11875200
Survey of air cargo forecasting techniques
NASA Technical Reports Server (NTRS)
Kuhlthan, A. R.; Vermuri, R. S.
1978-01-01
Forecasting techniques currently in use in estimating or predicting the demand for air cargo in various markets are discussed with emphasis on the fundamentals of the different forecasting approaches. References to specific studies are cited when appropriate. The effectiveness of current methods is evaluated and several prospects for future activities or approaches are suggested. Appendices contain summary type analyses of about 50 specific publications on forecasting, and selected bibliographies on air cargo forecasting, air passenger demand forecasting, and general demand and modalsplit modeling.
California motor vehicle stock, travel, and fuel forecasts documents
DOT National Transportation Integrated Search
2005-12-30
This is the twenty-first of a series of California Motor Vehicle Stock, Travel and Fuel : Forecast (MVSTAFF) reports. These reports provide historical estimates and forecasts of : the number of registered vehicles, miles of travel, fuel consumption, ...
Low Streamflow Forcasting using Minimum Relative Entropy
NASA Astrophysics Data System (ADS)
Cui, H.; Singh, V. P.
2013-12-01
Minimum relative entropy spectral analysis is derived in this study, and applied to forecast streamflow time series. Proposed method extends the autocorrelation in the manner that the relative entropy of underlying process is minimized so that time series data can be forecasted. Different prior estimation, such as uniform, exponential and Gaussian assumption, is taken to estimate the spectral density depending on the autocorrelation structure. Seasonal and nonseasonal low streamflow series obtained from Colorado River (Texas) under draught condition is successfully forecasted using proposed method. Minimum relative entropy determines spectral of low streamflow series with higher resolution than conventional method. Forecasted streamflow is compared to the prediction using Burg's maximum entropy spectral analysis (MESA) and Configurational entropy. The advantage and disadvantage of each method in forecasting low streamflow is discussed.
New Data from EPA's Exposure Forecasting (ExpoCast) Project (ISES meeting)
The health risks posed by the chemicals in our environment depends on both chemical hazard and exposure. However, relatively few chemicals have estimates of exposure intake, limiting risk estimations for thousands of chemicals. The U.S. EPA Exposure Forecasting (ExpoCast) projec...
Projected electric power demands for the Potomac Electric Power Company. Volume 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Estomin, S.; Kahal, M.
1984-03-01
This three-volume report presents the results of an econometric forecast of peak and electric power demands for the Potomac Electric Power Company (PEPCO) through the year 2002. Volume I describes the methodology, the results of the econometric estimations, the forecast assumptions and the calculated forecasts of peak demand and energy usage. Separate sets of models were developed for the Maryland Suburbs (Montgomery and Prince George's counties), the District of Columbia and Southern Maryland (served by a wholesale customer of PEPCO). For each of the three jurisdictions, energy equations were estimated for residential and commercial/industrial customers for both summer and wintermore » seasons. For the District of Columbia, summer and winter equations for energy sales to the federal government were also estimated. Equations were also estimated for street lighting and energy losses. Noneconometric techniques were employed to forecast energy sales to the Northern Virginia suburbs, Metrorail and federal government facilities located in Maryland.« less
Forecasting the stochastic demand for inpatient care: the case of the Greek national health system.
Boutsioli, Zoe
2010-08-01
The aim of this study is to estimate the unexpected demand of Greek public hospitals. A multivariate model with four explanatory variables is used. These are as follows: the weekend effect, the duty effect, the summer holiday and the official holiday. The method of the ordinary least squares is used to estimate the impact of these variables on the daily hospital emergency admissions series. The forecasted residuals of hospital regressions for each year give the estimated stochastic demand. Daily emergency admissions decline during weekends, summer months and official holidays, and increase on duty hospital days. Stochastic hospital demand varies both among hospitals and over the five-year time period under investigation. Variations among hospitals are larger than time variations. Hospital managers and health policy-makers can be availed by forecasting the future flows of emergent patients. The benefit can be both at managerial and economical level. More advanced models including additional daily variables such as the weather forecasts could provide more accurate estimations.
Improving the Representation of Snow Crystal Properties Within a Single-Moment Microphysics Scheme
NASA Technical Reports Server (NTRS)
Molthan, Andrew L.; Petersen, Walter A.; Case, Jonathan L.; Dembek, S. R.
2010-01-01
As computational resources continue their expansion, weather forecast models are transitioning to the use of parameterizations that predict the evolution of hydrometeors and their microphysical processes, rather than estimating the bulk effects of clouds and precipitation that occur on a sub-grid scale. These parameterizations are referred to as single-moment, bulk water microphysics schemes, as they predict the total water mass among hydrometeors in a limited number of classes. Although the development of single moment microphysics schemes have often been driven by the need to predict the structure of convective storms, they may also provide value in predicting accumulations of snowfall. Predicting the accumulation of snowfall presents unique challenges to forecasters and microphysics schemes. In cases where surface temperatures are near freezing, accumulated depth often depends upon the snowfall rate and the ability to overcome an initial warm layer. Precipitation efficiency relates to the dominant ice crystal habit, as dendrites and plates have relatively large surface areas for the accretion of cloud water and ice, but are only favored within a narrow range of ice supersaturation and temperature. Forecast models and their parameterizations must accurately represent the characteristics of snow crystal populations, such as their size distribution, bulk density and fall speed. These properties relate to the vertical distribution of ice within simulated clouds, the temperature profile through latent heat release, and the eventual precipitation rate measured at the surface. The NASA Goddard, single-moment microphysics scheme is available to the operational forecast community as an option within the Weather Research and Forecasting (WRF) model. The NASA Goddard scheme predicts the occurrence of up to six classes of water mass: vapor, cloud ice, cloud water, rain, snow and either graupel or hail.
The integrated process rates (IPR) estimated by the Eta-CMAQ model at grid cells along the trajectory of the air mass transport path were analyzed to quantitatively investigate the relative importance of physical and chemical processes for O3 formation and evolution ov...
2014 Gulf of Mexico Hypoxia Forecast
Scavia, Donald; Evans, Mary Anne; Obenour, Dan
2014-01-01
The Gulf of Mexico annual summer hypoxia forecasts are based on average May total nitrogen loads from the Mississippi River basin for that year. The load estimate, recently released by USGS, is 4,761 metric tons per day. Based on that estimate, we predict the area of this summer’s hypoxic zone to be 14,000 square kilometers (95% credible interval, 8,000 to 20,000) – an “average year”. Our forecast hypoxic volume is 50 km3 (95% credible interval, 20 to 77).
Lee-Carter state space modeling: Application to the Malaysia mortality data
NASA Astrophysics Data System (ADS)
Zakiyatussariroh, W. H. Wan; Said, Z. Mohammad; Norazan, M. R.
2014-06-01
This article presents an approach that formalizes the Lee-Carter (LC) model as a state space model. Maximum likelihood through Expectation-Maximum (EM) algorithm was used to estimate the model. The methodology is applied to Malaysia's total population mortality data. Malaysia's mortality data was modeled based on age specific death rates (ASDR) data from 1971-2009. The fitted ASDR are compared to the actual observed values. However, results from the comparison of the fitted and actual values between LC-SS model and the original LC model shows that the fitted values from the LC-SS model and original LC model are quite close. In addition, there is not much difference between the value of root mean squared error (RMSE) and Akaike information criteria (AIC) from both models. The LC-SS model estimated for this study can be extended for forecasting ASDR in Malaysia. Then, accuracy of the LC-SS compared to the original LC can be further examined by verifying the forecasting power using out-of-sample comparison.
NASA Astrophysics Data System (ADS)
Addawe, Rizavel C.; Addawe, Joel M.; Magadia, Joselito C.
2016-10-01
Accurate forecasting of dengue cases would significantly improve epidemic prevention and control capabilities. This paper attempts to provide useful models in forecasting dengue epidemic specific to the young and adult population of Baguio City. To capture the seasonal variations in dengue incidence, this paper develops a robust modeling approach to identify and estimate seasonal autoregressive integrated moving average (SARIMA) models in the presence of additive outliers. Since the least squares estimators are not robust in the presence of outliers, we suggest a robust estimation based on winsorized and reweighted least squares estimators. A hybrid algorithm, Differential Evolution - Simulated Annealing (DESA), is used to identify and estimate the parameters of the optimal SARIMA model. The method is applied to the monthly reported dengue cases in Baguio City, Philippines.
Satellite-driven modeling approach for monitoring lava flow hazards during the 2017 Etna eruption
NASA Astrophysics Data System (ADS)
Del Negro, C.; Bilotta, G.; Cappello, A.; Ganci, G.; Herault, A.; Zago, V.
2017-12-01
The integration of satellite data and modeling represents an efficient strategy that may provide immediate answers to the main issues raised at the onset of a new effusive eruption. Satellite-based thermal remote sensing of hotspots related to effusive activity can effectively provide a variety of products suited to timing, locating, and tracking the radiant character of lava flows. Hotspots show the location and occurrence of eruptive events (vents). Discharge rate estimates may indicate the current intensity (effusion rate) and potential magnitude (volume). High-spatial resolution multispectral satellite data can complement field observations for monitoring the front position (length) and extension of flows (area). Physics-based models driven, or validated, by satellite-derived parameters are now capable of fast and accurate forecast of lava flow inundation scenarios (hazard). Here, we demonstrate the potential of the integrated application of satellite remote-sensing techniques and lava flow models during the 2017 effusive eruption at Mount Etna in Italy. This combined approach provided insights into lava flow field evolution by supplying detailed views of flow field construction (e.g., the opening of ephemeral vents) that were useful for more accurate and reliable forecasts of eruptive activity. Moreover, we gave a detailed chronology of the lava flow activity based on field observations and satellite images, assessed the potential extent of impacted areas, mapped the evolution of lava flow field, and executed hazard projections. The underside of this combination is the high sensitivity of lava flow inundation scenarios to uncertainties in vent location, discharge rate, and other parameters, which can make interpreting hazard forecasts difficult during an effusive crisis. However, such integration at last makes timely forecasts of lava flow hazards during effusive crises possible at the great majority of volcanoes for which no monitoring exists.
NASA Astrophysics Data System (ADS)
Rahmat, R. F.; Nasution, F. R.; Seniman; Syahputra, M. F.; Sitompul, O. S.
2018-02-01
Weather is condition of air in a certain region at a relatively short period of time, measured with various parameters such as; temperature, air preasure, wind velocity, humidity and another phenomenons in the atmosphere. In fact, extreme weather due to global warming would lead to drought, flood, hurricane and other forms of weather occasion, which directly affects social andeconomic activities. Hence, a forecasting technique is to predict weather with distinctive output, particullary mapping process based on GIS with information about current weather status in certain cordinates of each region with capability to forecast for seven days afterward. Data used in this research are retrieved in real time from the server openweathermap and BMKG. In order to obtain a low error rate and high accuracy of forecasting, the authors use Bayesian Model Averaging (BMA) method. The result shows that the BMA method has good accuracy. Forecasting error value is calculated by mean square error shows (MSE). The error value emerges at minumum temperature rated at 0.28 and maximum temperature rated at 0.15. Meanwhile, the error value of minimum humidity rates at 0.38 and the error value of maximum humidity rates at 0.04. Afterall, the forecasting error rate of wind speed is at 0.076. The lower the forecasting error rate, the more optimized the accuracy is.
Forecasting overhaul or replacement intervals based on estimated system failure intensity
NASA Astrophysics Data System (ADS)
Gannon, James M.
1994-12-01
System reliability can be expressed in terms of the pattern of failure events over time. Assuming a nonhomogeneous Poisson process and Weibull intensity function for complex repairable system failures, the degree of system deterioration can be approximated. Maximum likelihood estimators (MLE's) for the system Rate of Occurrence of Failure (ROCOF) function are presented. Evaluating the integral of the ROCOF over annual usage intervals yields the expected number of annual system failures. By associating a cost of failure with the expected number of failures, budget and program policy decisions can be made based on expected future maintenance costs. Monte Carlo simulation is used to estimate the range and the distribution of the net present value and internal rate of return of alternative cash flows based on the distributions of the cost inputs and confidence intervals of the MLE's.
NASA Astrophysics Data System (ADS)
Kotsuki, Shunji; Terasaki, Koji; Yashiro, Hasashi; Tomita, Hirofumi; Satoh, Masaki; Miyoshi, Takemasa
2017-04-01
This study aims to improve precipitation forecasts from numerical weather prediction (NWP) models through effective use of satellite-derived precipitation data. Kotsuki et al. (2016, JGR-A) successfully improved the precipitation forecasts by assimilating the Japan Aerospace eXploration Agency (JAXA)'s Global Satellite Mapping of Precipitation (GSMaP) data into the Nonhydrostatic Icosahedral Atmospheric Model (NICAM) at 112-km horizontal resolution. Kotsuki et al. mitigated the non-Gaussianity of the precipitation variables by the Gaussian transform method for observed and forecasted precipitation using the previous 30-day precipitation data. This study extends the previous study by Kotsuki et al. and explores an online estimation of model parameters using ensemble data assimilation. We choose two globally-uniform parameters, one is the cloud-to-rain auto-conversion parameter of the Berry's scheme for large scale condensation and the other is the relative humidity threshold of the Arakawa-Schubert cumulus parameterization scheme. We perform the online-estimation of the two model parameters with an ensemble transform Kalman filter by assimilating the GSMaP precipitation data. The estimated parameters improve the analyzed and forecasted mixing ratio in the lower troposphere. Therefore, the parameter estimation would be a useful technique to improve the NWP models and their forecasts. This presentation will include the most recent progress up to the time of the symposium.
Bayesian flood forecasting methods: A review
NASA Astrophysics Data System (ADS)
Han, Shasha; Coulibaly, Paulin
2017-08-01
Over the past few decades, floods have been seen as one of the most common and largely distributed natural disasters in the world. If floods could be accurately forecasted in advance, then their negative impacts could be greatly minimized. It is widely recognized that quantification and reduction of uncertainty associated with the hydrologic forecast is of great importance for flood estimation and rational decision making. Bayesian forecasting system (BFS) offers an ideal theoretic framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation method, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review on Bayesian forecasting approaches applied in flood forecasting from 1999 till now. The review starts with an overview of fundamentals of BFS and recent advances in BFS, followed with BFS application in river stage forecasting and real-time flood forecasting, then move to a critical analysis by evaluating advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses the future research direction in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way for flood estimation, it considers all sources of uncertainties and produces a predictive distribution of the river stage, river discharge or runoff, thus gives more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. ensemble Bayesian forecasting system, Bayesian multi-model combination) were shown to overcome limitations of single model or fixed model weight and effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvements. Future research in the context of Bayesian flood forecasting should be on assimilation of various sources of newly available information and improvement of predictive performance assessment methods.
NASA Astrophysics Data System (ADS)
Cooper, Elizabeth; Dance, Sarah; Garcia-Pintado, Javier; Nichols, Nancy; Smith, Polly
2017-04-01
Timely and accurate inundation forecasting provides vital information about the behaviour of fluvial flood water, enabling mitigating actions to be taken by residents and emergency services. Data assimilation is a powerful mathematical technique for combining forecasts from hydrodynamic models with observations to produce a more accurate forecast. We discuss the effect of both domain size and channel friction parameter estimation on observation impact in data assimilation for inundation forecasting. Numerical shallow water simulations are carried out in a simple, idealized river channel topography. Data assimilation is performed using an Ensemble Transform Kalman Filter (ETKF) and synthetic observations of water depth in identical twin experiments. We show that reinitialising the numerical inundation model with corrected water levels after an assimilation can cause an initialisation shock if a hydrostatic assumption is made, leading to significant degradation of the forecast for several hours immediately following an assimilation. We demonstrate an effective and novel method for dealing with this. We find that using data assimilation to combine observations of water depth with forecasts from a hydrodynamic model corrects the forecast very effectively at time of the observations. In agreement with other authors we find that the corrected forecast then moves quickly back to the open loop forecast which does not take the observations into account. Our investigations show that the time taken for the forecast to decay back to the open loop case depends on the length of the domain of interest when only water levels are corrected. This is because the assimilation corrects water depths in all parts of the domain, even when observations are only available in one area. Error growth in the forecast step then starts at the upstream part of the domain and propagates downstream. The impact of the observations is therefore longer-lived in a longer domain. We have found that the upstream-downstream pattern of error growth can be due to incorrect friction parameter specification, rather than errors in inflow as shown elsewhere. Our results show that joint state-parameter estimation can recover accurate values for the parameter controlling channel friction processes in the model, even when observations of water level are only available on part of the flood plain. Correcting water levels and the channel friction parameter together leads to a large improvement in the forecast water levels at all simulation times. The impact of the observations is therefore much greater when the channel friction parameter is corrected along with water levels. We find that domain length effects disappear for joint state-parameter estimation.
NASA Astrophysics Data System (ADS)
Brown, James; Seo, Dong-Jun
2010-05-01
Operational forecasts of hydrometeorological and hydrologic variables often contain large uncertainties, for which ensemble techniques are increasingly used. However, the utility of ensemble forecasts depends on the unbiasedness of the forecast probabilities. We describe a technique for quantifying and removing biases from ensemble forecasts of hydrometeorological and hydrologic variables, intended for use in operational forecasting. The technique makes no a priori assumptions about the distributional form of the variables, which is often unknown or difficult to model parametrically. The aim is to estimate the conditional cumulative distribution function (ccdf) of the observed variable given a (possibly biased) real-time ensemble forecast from one or several forecasting systems (multi-model ensembles). The technique is based on Bayesian optimal linear estimation of indicator variables, and is analogous to indicator cokriging (ICK) in geostatistics. By developing linear estimators for the conditional expectation of the observed variable at many thresholds, ICK provides a discrete approximation of the full ccdf. Since ICK minimizes the conditional error variance of the indicator expectation at each threshold, it effectively minimizes the Continuous Ranked Probability Score (CRPS) when infinitely many thresholds are employed. However, the ensemble members used as predictors in ICK, and other bias-correction techniques, are often highly cross-correlated, both within and between models. Thus, we propose an orthogonal transform of the predictors used in ICK, which is analogous to using their principal components in the linear system of equations. This leads to a well-posed problem in which a minimum number of predictors are used to provide maximum information content in terms of the total variance explained. The technique is used to bias-correct precipitation ensemble forecasts from the NCEP Global Ensemble Forecast System (GEFS), for which independent validation results are presented. Extension to multimodel ensembles from the NCEP GFS and Short Range Ensemble Forecast (SREF) systems is also proposed.
The propagation of wind errors through ocean wave hindcasts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holthuijsen, L.H.; Booij, N.; Bertotti, L.
1996-08-01
To estimate uncertainties in wave forecast and hindcasts, computations have been carried out for a location in the Mediterranean Sea using three different analyses of one historic wind field. These computations involve a systematic sensitivity analysis and estimated wind field errors. This technique enables a wave modeler to estimate such uncertainties in other forecasts and hindcasts if only one wind analysis is available.
Fuzzy neural network technique for system state forecasting.
Li, Dezhi; Wang, Wilson; Ismail, Fathy
2013-10-01
In many system state forecasting applications, the prediction is performed based on multiple datasets, each corresponding to a distinct system condition. The traditional methods dealing with multiple datasets (e.g., vector autoregressive moving average models and neural networks) have some shortcomings, such as limited modeling capability and opaque reasoning operations. To tackle these problems, a novel fuzzy neural network (FNN) is proposed in this paper to effectively extract information from multiple datasets, so as to improve forecasting accuracy. The proposed predictor consists of both autoregressive (AR) nodes modeling and nonlinear nodes modeling; AR models/nodes are used to capture the linear correlation of the datasets, and the nonlinear correlation of the datasets are modeled with nonlinear neuron nodes. A novel particle swarm technique [i.e., Laplace particle swarm (LPS) method] is proposed to facilitate parameters estimation of the predictor and improve modeling accuracy. The effectiveness of the developed FNN predictor and the associated LPS method is verified by a series of tests related to Mackey-Glass data forecast, exchange rate data prediction, and gear system prognosis. Test results show that the developed FNN predictor and the LPS method can capture the dynamics of multiple datasets effectively and track system characteristics accurately.
Short-term forecasting of turbidity in trunk main networks.
Meyers, Gregory; Kapelan, Zoran; Keedwell, Edward
2017-11-01
Water discolouration is an increasingly important and expensive issue due to rising customer expectations, tighter regulatory demands and ageing Water Distribution Systems (WDSs) in the UK and abroad. This paper presents a new turbidity forecasting methodology capable of aiding operational staff and enabling proactive management strategies. The turbidity forecasting methodology developed here is completely data-driven and does not require hydraulic or water quality network model that is expensive to build and maintain. The methodology is tested and verified on a real trunk main network with observed turbidity measurement data. Results obtained show that the methodology can detect if discolouration material is mobilised, estimate if sufficient turbidity will be generated to exceed a preselected threshold and approximate how long the material will take to reach the downstream meter. Classification based forecasts of turbidity can be reliably made up to 5 h ahead although at the expense of increased false alarm rates. The methodology presented here could be used as an early warning system that can enable a multitude of cost beneficial proactive management strategies to be implemented as an alternative to expensive trunk mains cleaning programs. Copyright © 2017 Elsevier Ltd. All rights reserved.
A Data-Driven Approach for Daily Real-Time Estimates and Forecasts of Near-Surface Soil Moisture
NASA Technical Reports Server (NTRS)
Koster, Randal D.; Reichle, Rolf H.; Mahanama, Sarith P. P.
2017-01-01
NASAs Soil Moisture Active Passive (SMAP) mission provides global surface soil moisture retrievals with a revisit time of 2-3 days and a latency of 24 hours. Here, to enhance the utility of the SMAP data, we present an approach for improving real-time soil moisture estimates (nowcasts) and for forecasting soil moisture several days into the future. The approach, which involves using an estimate of loss processes (evaporation and drainage) and precipitation to evolve the most recent SMAP retrieval forward in time, is evaluated against subsequent SMAP retrievals themselves. The nowcast accuracy over the continental United States (CONUS) is shown to be markedly higher than that achieved with the simple yet common persistence approach. The accuracy of soil moisture forecasts, which rely on precipitation forecasts rather than on precipitation measurements, is reduced relative to nowcast accuracy but is still significantly higher than that obtained through persistence.
NASA Astrophysics Data System (ADS)
Sinha, T.; Arumugam, S.
2012-12-01
Seasonal streamflow forecasts contingent on climate forecasts can be effectively utilized in updating water management plans and optimize generation of hydroelectric power. Streamflow in the rainfall-runoff dominated basins critically depend on forecasted precipitation in contrast to snow dominated basins, where initial hydrological conditions (IHCs) are more important. Since precipitation forecasts from Atmosphere-Ocean-General Circulation Models are available at coarse scale (~2.8° by 2.8°), spatial and temporal downscaling of such forecasts are required to implement land surface models, which typically runs on finer spatial and temporal scales. Consequently, multiple sources are introduced at various stages in predicting seasonal streamflow. Therefore, in this study, we addresses the following science questions: 1) How do we attribute the errors in monthly streamflow forecasts to various sources - (i) model errors, (ii) spatio-temporal downscaling, (iii) imprecise initial conditions, iv) no forecasts, and (iv) imprecise forecasts? and 2) How does monthly streamflow forecast errors propagate with different lead time over various seasons? In this study, the Variable Infiltration Capacity (VIC) model is calibrated over Apalachicola River at Chattahoochee, FL in the southeastern US and implemented with observed 1/8° daily forcings to estimate reference streamflow during 1981 to 2010. The VIC model is then forced with different schemes under updated IHCs prior to forecasting period to estimate relative mean square errors due to: a) temporally disaggregation, b) spatial downscaling, c) Reverse Ensemble Streamflow Prediction (imprecise IHCs), d) ESP (no forecasts), and e) ECHAM4.5 precipitation forecasts. Finally, error propagation under different schemes are analyzed with different lead time over different seasons.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Huaiguang; Zhang, Yingchen
This paper proposes an approach for distribution system state forecasting, which aims to provide an accurate and high speed state forecasting with an optimal synchrophasor sensor placement (OSSP) based state estimator and an extreme learning machine (ELM) based forecaster. Specifically, considering the sensor installation cost and measurement error, an OSSP algorithm is proposed to reduce the number of synchrophasor sensor and keep the whole distribution system numerically and topologically observable. Then, the weighted least square (WLS) based system state estimator is used to produce the training data for the proposed forecaster. Traditionally, the artificial neural network (ANN) and support vectormore » regression (SVR) are widely used in forecasting due to their nonlinear modeling capabilities. However, the ANN contains heavy computation load and the best parameters for SVR are difficult to obtain. In this paper, the ELM, which overcomes these drawbacks, is used to forecast the future system states with the historical system states. The proposed approach is effective and accurate based on the testing results.« less
Forecasting the Short-Term Passenger Flow on High-Speed Railway with Neural Networks
Xie, Mei-Quan; Li, Xia-Miao; Zhou, Wen-Liang; Fu, Yan-Bing
2014-01-01
Short-term passenger flow forecasting is an important component of transportation systems. The forecasting result can be applied to support transportation system operation and management such as operation planning and revenue management. In this paper, a divide-and-conquer method based on neural network and origin-destination (OD) matrix estimation is developed to forecast the short-term passenger flow in high-speed railway system. There are three steps in the forecasting method. Firstly, the numbers of passengers who arrive at each station or depart from each station are obtained from historical passenger flow data, which are OD matrices in this paper. Secondly, short-term passenger flow forecasting of the numbers of passengers who arrive at each station or depart from each station based on neural network is realized. At last, the OD matrices in short-term time are obtained with an OD matrix estimation method. The experimental results indicate that the proposed divide-and-conquer method performs well in forecasting the short-term passenger flow on high-speed railway. PMID:25544838
Empirical prediction intervals improve energy forecasting
Kaack, Lynn H.; Apt, Jay; Morgan, M. Granger; McSharry, Patrick
2017-01-01
Hundreds of organizations and analysts use energy projections, such as those contained in the US Energy Information Administration (EIA)’s Annual Energy Outlook (AEO), for investment and policy decisions. Retrospective analyses of past AEO projections have shown that observed values can differ from the projection by several hundred percent, and thus a thorough treatment of uncertainty is essential. We evaluate the out-of-sample forecasting performance of several empirical density forecasting methods, using the continuous ranked probability score (CRPS). The analysis confirms that a Gaussian density, estimated on past forecasting errors, gives comparatively accurate uncertainty estimates over a variety of energy quantities in the AEO, in particular outperforming scenario projections provided in the AEO. We report probabilistic uncertainties for 18 core quantities of the AEO 2016 projections. Our work frames how to produce, evaluate, and rank probabilistic forecasts in this setting. We propose a log transformation of forecast errors for price projections and a modified nonparametric empirical density forecasting method. Our findings give guidance on how to evaluate and communicate uncertainty in future energy outlooks. PMID:28760997
Nonmechanistic forecasts of seasonal influenza with iterative one-week-ahead distributions.
Brooks, Logan C; Farrow, David C; Hyun, Sangwon; Tibshirani, Ryan J; Rosenfeld, Roni
2018-06-15
Accurate and reliable forecasts of seasonal epidemics of infectious disease can assist in the design of countermeasures and increase public awareness and preparedness. This article describes two main contributions we made recently toward this goal: a novel approach to probabilistic modeling of surveillance time series based on "delta densities", and an optimization scheme for combining output from multiple forecasting methods into an adaptively weighted ensemble. Delta densities describe the probability distribution of the change between one observation and the next, conditioned on available data; chaining together nonparametric estimates of these distributions yields a model for an entire trajectory. Corresponding distributional forecasts cover more observed events than alternatives that treat the whole season as a unit, and improve upon multiple evaluation metrics when extracting key targets of interest to public health officials. Adaptively weighted ensembles integrate the results of multiple forecasting methods, such as delta density, using weights that can change from situation to situation. We treat selection of optimal weightings across forecasting methods as a separate estimation task, and describe an estimation procedure based on optimizing cross-validation performance. We consider some details of the data generation process, including data revisions and holiday effects, both in the construction of these forecasting methods and when performing retrospective evaluation. The delta density method and an adaptively weighted ensemble of other forecasting methods each improve significantly on the next best ensemble component when applied separately, and achieve even better cross-validated performance when used in conjunction. We submitted real-time forecasts based on these contributions as part of CDC's 2015/2016 FluSight Collaborative Comparison. Among the fourteen submissions that season, this system was ranked by CDC as the most accurate.
Obesity and severe obesity forecasts through 2030.
Finkelstein, Eric A; Khavjou, Olga A; Thompson, Hope; Trogdon, Justin G; Pan, Liping; Sherry, Bettylou; Dietz, William
2012-06-01
Previous efforts to forecast future trends in obesity applied linear forecasts assuming that the rise in obesity would continue unabated. However, evidence suggests that obesity prevalence may be leveling off. This study presents estimates of adult obesity and severe obesity prevalence through 2030 based on nonlinear regression models. The forecasted results are then used to simulate the savings that could be achieved through modestly successful obesity prevention efforts. The study was conducted in 2009-2010 and used data from the 1990 through 2008 Behavioral Risk Factor Surveillance System (BRFSS). The analysis sample included nonpregnant adults aged ≥ 18 years. The individual-level BRFSS variables were supplemented with state-level variables from the U.S. Bureau of Labor Statistics, the American Chamber of Commerce Research Association, and the Census of Retail Trade. Future obesity and severe obesity prevalence were estimated through regression modeling by projecting trends in explanatory variables expected to influence obesity prevalence. Linear time trend forecasts suggest that by 2030, 51% of the population will be obese. The model estimates a much lower obesity prevalence of 42% and severe obesity prevalence of 11%. If obesity were to remain at 2010 levels, the combined savings in medical expenditures over the next 2 decades would be $549.5 billion. The study estimates a 33% increase in obesity prevalence and a 130% increase in severe obesity prevalence over the next 2 decades. If these forecasts prove accurate, this will further hinder efforts for healthcare cost containment. Copyright © 2012 Elsevier Inc. All rights reserved.
GPS Estimates of Integrated Precipitable Water Aid Weather Forecasters
NASA Technical Reports Server (NTRS)
Moore, Angelyn W.; Gutman, Seth I.; Holub, Kirk; Bock, Yehuda; Danielson, David; Laber, Jayme; Small, Ivory
2013-01-01
Global Positioning System (GPS) meteorology provides enhanced density, low-latency (30-min resolution), integrated precipitable water (IPW) estimates to NOAA NWS (National Oceanic and Atmospheric Adminis tration Nat ional Weather Service) Weather Forecast Offices (WFOs) to provide improved model and satellite data verification capability and more accurate forecasts of extreme weather such as flooding. An early activity of this project was to increase the number of stations contributing to the NOAA Earth System Research Laboratory (ESRL) GPS meteorology observing network in Southern California by about 27 stations. Following this, the Los Angeles/Oxnard and San Diego WFOs began using the enhanced GPS-based IPW measurements provided by ESRL in the 2012 and 2013 monsoon seasons. Forecasters found GPS IPW to be an effective tool in evaluating model performance, and in monitoring monsoon development between weather model runs for improved flood forecasting. GPS stations are multi-purpose, and routine processing for position solutions also yields estimates of tropospheric zenith delays, which can be converted into mm-accuracy PWV (precipitable water vapor) using in situ pressure and temperature measurements, the basis for GPS meteorology. NOAA ESRL has implemented this concept with a nationwide distribution of more than 300 "GPSMet" stations providing IPW estimates at sub-hourly resolution currently used in operational weather models in the U.S.
NASA Astrophysics Data System (ADS)
Lopez, Patricia; Verkade, Jan; Weerts, Albrecht; Solomatine, Dimitri
2014-05-01
Hydrological forecasting is subject to many sources of uncertainty, including those originating in initial state, boundary conditions, model structure and model parameters. Although uncertainty can be reduced, it can never be fully eliminated. Statistical post-processing techniques constitute an often used approach to estimate the hydrological predictive uncertainty, where a model of forecast error is built using a historical record of past forecasts and observations. The present study focuses on the use of the Quantile Regression (QR) technique as a hydrological post-processor. It estimates the predictive distribution of water levels using deterministic water level forecasts as predictors. This work aims to thoroughly verify uncertainty estimates using the implementation of QR that was applied in an operational setting in the UK National Flood Forecasting System, and to inter-compare forecast quality and skill in various, differing configurations of QR. These configurations are (i) 'classical' QR, (ii) QR constrained by a requirement that quantiles do not cross, (iii) QR derived on time series that have been transformed into the Normal domain (Normal Quantile Transformation - NQT), and (iv) a piecewise linear derivation of QR models. The QR configurations are applied to fourteen hydrological stations on the Upper Severn River with different catchments characteristics. Results of each QR configuration are conditionally verified for progressively higher flood levels, in terms of commonly used verification metrics and skill scores. These include Brier's probability score (BS), the continuous ranked probability score (CRPS) and corresponding skill scores as well as the Relative Operating Characteristic score (ROCS). Reliability diagrams are also presented and analysed. The results indicate that none of the four Quantile Regression configurations clearly outperforms the others.
Forecasting Device Effectiveness. Volume 2. Procedures
1985-06-01
about individual tasks is used to sup- port the fotul ratings. DEFT III. At this level of analysis, the analyst uses 11 rating scales to estimate...real world or are they sust’ined at unreal levels in the training environment? The third scale r3tes how much practice the trainee will have in the...real world ’ I., F. MIMflMfl MM/ MMPMMIMMHrnMN MHIMMIMIHMN/fMhMMM flMhlMlflMMfMM l!"MMt~MWfl l o 20 1) 40 5(0 6() 70 8O 9.’ 1, ,, 0 = None; the
NASA Astrophysics Data System (ADS)
Dong, Sheng; Chi, Kun; Zhang, Qiyi; Zhang, Xiangdong
2012-03-01
Compared with traditional real-time forecasting, this paper proposes a Grey Markov Model (GMM) to forecast the maximum water levels at hydrological stations in the estuary area. The GMM combines the Grey System and Markov theory into a higher precision model. The GMM takes advantage of the Grey System to predict the trend values and uses the Markov theory to forecast fluctuation values, and thus gives forecast results involving two aspects of information. The procedure for forecasting annul maximum water levels with the GMM contains five main steps: 1) establish the GM (1, 1) model based on the data series; 2) estimate the trend values; 3) establish a Markov Model based on relative error series; 4) modify the relative errors caused in step 2, and then obtain the relative errors of the second order estimation; 5) compare the results with measured data and estimate the accuracy. The historical water level records (from 1960 to 1992) at Yuqiao Hydrological Station in the estuary area of the Haihe River near Tianjin, China are utilized to calibrate and verify the proposed model according to the above steps. Every 25 years' data are regarded as a hydro-sequence. Eight groups of simulated results show reasonable agreement between the predicted values and the measured data. The GMM is also applied to the 10 other hydrological stations in the same estuary. The forecast results for all of the hydrological stations are good or acceptable. The feasibility and effectiveness of this new forecasting model have been proved in this paper.
Page, Morgan T.; Van Der Elst, Nicholas; Hardebeck, Jeanne L.; Felzer, Karen; Michael, Andrew J.
2016-01-01
Following a large earthquake, seismic hazard can be orders of magnitude higher than the long‐term average as a result of aftershock triggering. Because of this heightened hazard, emergency managers and the public demand rapid, authoritative, and reliable aftershock forecasts. In the past, U.S. Geological Survey (USGS) aftershock forecasts following large global earthquakes have been released on an ad hoc basis with inconsistent methods, and in some cases aftershock parameters adapted from California. To remedy this, the USGS is currently developing an automated aftershock product based on the Reasenberg and Jones (1989) method that will generate more accurate forecasts. To better capture spatial variations in aftershock productivity and decay, we estimate regional aftershock parameters for sequences within the García et al. (2012) tectonic regions. We find that regional variations for mean aftershock productivity reach almost a factor of 10. We also develop a method to account for the time‐dependent magnitude of completeness following large events in the catalog. In addition to estimating average sequence parameters within regions, we develop an inverse method to estimate the intersequence parameter variability. This allows for a more complete quantification of the forecast uncertainties and Bayesian updating of the forecast as sequence‐specific information becomes available.
Rainfall Estimation over the Nile Basin using an Adapted Version of the SCaMPR Algorithm
NASA Astrophysics Data System (ADS)
Habib, E. H.; Kuligowski, R. J.; Elshamy, M. E.; Ali, M. A.; Haile, A.; Amin, D.; Eldin, A.
2011-12-01
Management of Egypt's Aswan High Dam is critical not only for flood control on the Nile but also for ensuring adequate water supplies for most of Egypt since rainfall is scarce over the vast majority of its land area. However, reservoir inflow is driven by rainfall over Sudan, Ethiopia, Uganda, and several other countries from which routine rain gauge data are sparse. Satellite-derived estimates of rainfall offer a much more detailed and timely set of data to form a basis for decisions on the operation of the dam. A single-channel infrared algorithm is currently in operational use at the Egyptian Nile Forecast Center (NFC). This study reports on the adaptation of a multi-spectral, multi-instrument satellite rainfall estimation algorithm (Self-Calibrating Multivariate Precipitation Retrieval, SCaMPR) for operational application over the Nile Basin. The algorithm uses a set of rainfall predictors from multi-spectral Infrared cloud top observations and self-calibrates them to a set of predictands from Microwave (MW) rain rate estimates. For application over the Nile Basin, the SCaMPR algorithm uses multiple satellite IR channels recently available to NFC from the Spinning Enhanced Visible and Infrared Imager (SEVIRI). Microwave rain rates are acquired from multiple sources such as SSM/I, SSMIS, AMSU, AMSR-E, and TMI. The algorithm has two main steps: rain/no-rain separation using discriminant analysis, and rain rate estimation using stepwise linear regression. We test two modes of algorithm calibration: real-time calibration with continuous updates of coefficients with newly coming MW rain rates, and calibration using static coefficients that are derived from IR-MW data from past observations. We also compare the SCaMPR algorithm to other global-scale satellite rainfall algorithms (e.g., 'Tropical Rainfall Measuring Mission (TRMM) and other sources' (TRMM-3B42) product, and the National Oceanographic and Atmospheric Administration Climate Prediction Center (NOAA-CPC) CMORPH product. The algorithm has several potential future applications such as: improving the performance accuracy of hydrologic forecasting models over the Nile Basin, and utilizing the enhanced rainfall datasets and better-calibrated hydrologic models to assess the impacts of climate change on the region's water availability.
Kim, Tae-gu; Kang, Young-sig; Lee, Hyung-won
2011-01-01
To begin a zero accident campaign for industry, the first thing is to estimate the industrial accident rate and the zero accident time systematically. This paper considers the social and technical change of the business environment after beginning the zero accident campaign through quantitative time series analysis methods. These methods include sum of squared errors (SSE), regression analysis method (RAM), exponential smoothing method (ESM), double exponential smoothing method (DESM), auto-regressive integrated moving average (ARIMA) model, and the proposed analytic function method (AFM). The program is developed to estimate the accident rate, zero accident time and achievement probability of an efficient industrial environment. In this paper, MFC (Microsoft Foundation Class) software of Visual Studio 2008 was used to develop a zero accident program. The results of this paper will provide major information for industrial accident prevention and be an important part of stimulating the zero accident campaign within all industrial environments.
DOT National Transportation Integrated Search
1995-01-01
The Virginia Department of Transportation uses a cash flow forecasting model to predict operations expenditures by month. Components of this general forecasting model estimate line items in the VDOT budget. The cash flow model was developed in the ea...
Daily Report, Supplement, East Europe.
1993-06-29
corporate income tax , which the finance minister has promised, will have no economy-stimulating effect whatsoever. [J.E.] In your opinion, will it...not followed the more optimistic forecasts. But the cut in the rate of corporate income tax , and in the case of individual income tax the higher... corporate income tax will probably fall short of the estimate. Momentarily, rev- enue from corporate income tax is higher this year than it was
An Operational System for Surveillance and Ecological Forecasting of West Nile Virus Outbreaks
NASA Astrophysics Data System (ADS)
Wimberly, M. C.; Davis, J. K.; Vincent, G.; Hess, A.; Hildreth, M. B.
2017-12-01
Mosquito-borne disease surveillance has traditionally focused on tracking human cases along with the abundance and infection status of mosquito vectors. For many of these diseases, vector and host population dynamics are also sensitive to climatic factors, including temperature fluctuations and the availability of surface water for mosquito breeding. Thus, there is a potential to strengthen surveillance and predict future outbreaks by monitoring environmental risk factors using broad-scale sensor networks that include earth-observing satellites. The South Dakota Mosquito Information System (SDMIS) project combines entomological surveillance with gridded meteorological data from NASA's North American Land Data Assimilation System (NLDAS) to generate weekly risk maps for West Nile virus (WNV) in the north-central United States. Critical components include a mosquito infection model that smooths the noisy infection rate and compensates for unbalanced sampling, and a human infection model that combines the entomological risk estimates with lagged effects of meteorological variables from the North American Land Data Assimilation System (NLDAS). Two types of forecasts are generated: long-term forecasts of statewide risk extending through the entire WNV season, and short-term forecasts of the geographic pattern of WNV risk in the upcoming week. Model forecasts are connected to public health actions through decision support matrices that link predicted risk levels to a set of phased responses. In 2016, the SDMIS successfully forecast an early start to the WNV season and a large outbreak of WNV cases following several years of low transmission. An evaluation of the 2017 forecasts will also be presented. Our experiences with the SDMIS highlight several important lessons that can inform future efforts at disease early warning. These include the value of integrating climatic models with recent observations of infection, the critical role of automated workflows to facilitate the timely integration of multiple data streams, the need for effective synthesis and visualization of forecasts, and the importance of linking forecasts to specific public health responses.
A statistical approach to quasi-extinction forecasting.
Holmes, Elizabeth Eli; Sabo, John L; Viscido, Steven Vincent; Fagan, William Fredric
2007-12-01
Forecasting population decline to a certain critical threshold (the quasi-extinction risk) is one of the central objectives of population viability analysis (PVA), and such predictions figure prominently in the decisions of major conservation organizations. In this paper, we argue that accurate forecasting of a population's quasi-extinction risk does not necessarily require knowledge of the underlying biological mechanisms. Because of the stochastic and multiplicative nature of population growth, the ensemble behaviour of population trajectories converges to common statistical forms across a wide variety of stochastic population processes. This paper provides a theoretical basis for this argument. We show that the quasi-extinction surfaces of a variety of complex stochastic population processes (including age-structured, density-dependent and spatially structured populations) can be modelled by a simple stochastic approximation: the stochastic exponential growth process overlaid with Gaussian errors. Using simulated and real data, we show that this model can be estimated with 20-30 years of data and can provide relatively unbiased quasi-extinction risk with confidence intervals considerably smaller than (0,1). This was found to be true even for simulated data derived from some of the noisiest population processes (density-dependent feedback, species interactions and strong age-structure cycling). A key advantage of statistical models is that their parameters and the uncertainty of those parameters can be estimated from time series data using standard statistical methods. In contrast for most species of conservation concern, biologically realistic models must often be specified rather than estimated because of the limited data available for all the various parameters. Biologically realistic models will always have a prominent place in PVA for evaluating specific management options which affect a single segment of a population, a single demographic rate, or different geographic areas. However, for forecasting quasi-extinction risk, statistical models that are based on the convergent statistical properties of population processes offer many advantages over biologically realistic models.
Combining forecast weights: Why and how?
NASA Astrophysics Data System (ADS)
Yin, Yip Chee; Kok-Haur, Ng; Hock-Eam, Lim
2012-09-01
This paper proposes a procedure called forecast weight averaging which is a specific combination of forecast weights obtained from different methods of constructing forecast weights for the purpose of improving the accuracy of pseudo out of sample forecasting. It is found that under certain specified conditions, forecast weight averaging can lower the mean squared forecast error obtained from model averaging. In addition, we show that in a linear and homoskedastic environment, this superior predictive ability of forecast weight averaging holds true irrespective whether the coefficients are tested by t statistic or z statistic provided the significant level is within the 10% range. By theoretical proofs and simulation study, we have shown that model averaging like, variance model averaging, simple model averaging and standard error model averaging, each produces mean squared forecast error larger than that of forecast weight averaging. Finally, this result also holds true marginally when applied to business and economic empirical data sets, Gross Domestic Product (GDP growth rate), Consumer Price Index (CPI) and Average Lending Rate (ALR) of Malaysia.
A Model For Rapid Estimation of Economic Loss
NASA Astrophysics Data System (ADS)
Holliday, J. R.; Rundle, J. B.
2012-12-01
One of the loftier goals in seismic hazard analysis is the creation of an end-to-end earthquake prediction system: a "rupture to rafters" work flow that takes a prediction of fault rupture, propagates it with a ground shaking model, and outputs a damage or loss profile at a given location. So far, the initial prediction of an earthquake rupture (either as a point source or a fault system) has proven to be the most difficult and least solved step in this chain. However, this may soon change. The Collaboratory for the Study of Earthquake Predictability (CSEP) has amassed a suite of earthquake source models for assorted testing regions worldwide. These models are capable of providing rate-based forecasts for earthquake (point) sources over a range of time horizons. Furthermore, these rate forecasts can be easily refined into probabilistic source forecasts. While it's still difficult to fully assess the "goodness" of each of these models, progress is being made: new evaluation procedures are being devised and earthquake statistics continue to accumulate. The scientific community appears to be heading towards a better understanding of rupture predictability. Ground shaking mechanics are better understood, and many different sophisticated models exists. While these models tend to be computationally expensive and often regionally specific, they do a good job at matching empirical data. It is perhaps time to start addressing the third step in the seismic hazard prediction system. We present a model for rapid economic loss estimation using ground motion (PGA or PGV) and socioeconomic measures as its input. We show that the model can be calibrated on a global scale and applied worldwide. We also suggest how the model can be improved and generalized to non-seismic natural disasters such as hurricane and severe wind storms.
Using Socioeconomic Data to Calibrate Loss Estimates
NASA Astrophysics Data System (ADS)
Holliday, J. R.; Rundle, J. B.
2013-12-01
One of the loftier goals in seismic hazard analysis is the creation of an end-to-end earthquake prediction system: a "rupture to rafters" work flow that takes a prediction of fault rupture, propagates it with a ground shaking model, and outputs a damage or loss profile at a given location. So far, the initial prediction of an earthquake rupture (either as a point source or a fault system) has proven to be the most difficult and least solved step in this chain. However, this may soon change. The Collaboratory for the Study of Earthquake Predictability (CSEP) has amassed a suite of earthquake source models for assorted testing regions worldwide. These models are capable of providing rate-based forecasts for earthquake (point) sources over a range of time horizons. Furthermore, these rate forecasts can be easily refined into probabilistic source forecasts. While it's still difficult to fully assess the "goodness" of each of these models, progress is being made: new evaluation procedures are being devised and earthquake statistics continue to accumulate. The scientific community appears to be heading towards a better understanding of rupture predictability. Ground shaking mechanics are better understood, and many different sophisticated models exists. While these models tend to be computationally expensive and often regionally specific, they do a good job at matching empirical data. It is perhaps time to start addressing the third step in the seismic hazard prediction system. We present a model for rapid economic loss estimation using ground motion (PGA or PGV) and socioeconomic measures as its input. We show that the model can be calibrated on a global scale and applied worldwide. We also suggest how the model can be improved and generalized to non-seismic natural disasters such as hurricane and severe wind storms.
Forecasting urban water demand: A meta-regression analysis.
Sebri, Maamar
2016-12-01
Water managers and planners require accurate water demand forecasts over the short-, medium- and long-term for many purposes. These range from assessing water supply needs over spatial and temporal patterns to optimizing future investments and planning future allocations across competing sectors. This study surveys the empirical literature on the urban water demand forecasting using the meta-analytical approach. Specifically, using more than 600 estimates, a meta-regression analysis is conducted to identify explanations of cross-studies variation in accuracy of urban water demand forecasting. Our study finds that accuracy depends significantly on study characteristics, including demand periodicity, modeling method, forecasting horizon, model specification and sample size. The meta-regression results remain robust to different estimators employed as well as to a series of sensitivity checks performed. The importance of these findings lies in the conclusions and implications drawn out for regulators and policymakers and for academics alike. Copyright © 2016. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Song, Yiliao; Qin, Shanshan; Qu, Jiansheng; Liu, Feng
2015-10-01
The issue of air quality regarding PM pollution levels in China is a focus of public attention. To address that issue, to date, a series of studies is in progress, including PM monitoring programs, PM source apportionment, and the enactment of new ambient air quality index standards. However, related research concerning computer modeling for PM future trends estimation is rare, despite its significance to forecasting and early warning systems. Thereby, a study regarding deterministic and interval forecasts of PM is performed. In this study, data on hourly and 12 h-averaged air pollutants are applied to forecast PM concentrations within the Yangtze River Delta (YRD) region of China. The characteristics of PM emissions have been primarily examined and analyzed using different distribution functions. To improve the distribution fitting that is crucial for estimating PM levels, an artificial intelligence algorithm is incorporated to select the optimal parameters. Following that step, an ANF model is used to conduct deterministic forecasts of PM. With the identified distributions and deterministic forecasts, different levels of PM intervals are estimated. The results indicate that the lognormal or gamma distributions are highly representative of the recorded PM data with a goodness-of-fit R2 of approximately 0.998. Furthermore, the results of the evaluation metrics (MSE, MAPE and CP, AW) also show high accuracy within the deterministic and interval forecasts of PM, indicating that this method enables the informative and effective quantification of future PM trends.
Evaluation Of Statistical Models For Forecast Errors From The HBV-Model
NASA Astrophysics Data System (ADS)
Engeland, K.; Kolberg, S.; Renard, B.; Stensland, I.
2009-04-01
Three statistical models for the forecast errors for inflow to the Langvatn reservoir in Northern Norway have been constructed and tested according to how well the distribution and median values of the forecasts errors fit to the observations. For the first model observed and forecasted inflows were transformed by the Box-Cox transformation before a first order autoregressive model was constructed for the forecast errors. The parameters were conditioned on climatic conditions. In the second model the Normal Quantile Transformation (NQT) was applied on observed and forecasted inflows before a similar first order autoregressive model was constructed for the forecast errors. For the last model positive and negative errors were modeled separately. The errors were first NQT-transformed before a model where the mean values were conditioned on climate, forecasted inflow and yesterday's error. To test the three models we applied three criterions: We wanted a) the median values to be close to the observed values; b) the forecast intervals to be narrow; c) the distribution to be correct. The results showed that it is difficult to obtain a correct model for the forecast errors, and that the main challenge is to account for the auto-correlation in the errors. Model 1 and 2 gave similar results, and the main drawback is that the distributions are not correct. The 95% forecast intervals were well identified, but smaller forecast intervals were over-estimated, and larger intervals were under-estimated. Model 3 gave a distribution that fits better, but the median values do not fit well since the auto-correlation is not properly accounted for. If the 95% forecast interval is of interest, Model 2 is recommended. If the whole distribution is of interest, Model 3 is recommended.
Learning-based Wind Estimation using Distant Soundings for Unguided Aerial Delivery
NASA Astrophysics Data System (ADS)
Plyler, M.; Cahoy, K.; Angermueller, K.; Chen, D.; Markuzon, N.
2016-12-01
Delivering unguided, parachuted payloads from aircraft requires accurate knowledge of the wind field inside an operational zone. Usually, a dropsonde released from the aircraft over the drop zone gives a more accurate wind estimate than a forecast. Mission objectives occasionally demand releasing the dropsonde away from the drop zone, but still require accuracy and precision. Barnes interpolation and many other assimilation methods do poorly when the forecast error is inconsistent in a forecast grid. A machine learning approach can better leverage non-linear relations between different weather patterns and thus provide a better wind estimate at the target drop zone when using data collected up to 100 km away. This study uses the 13 km resolution Rapid Refresh (RAP) dataset available through NOAA and subsamples to an area around Yuma, AZ and up to approximately 10km AMSL. RAP forecast grids are updated with simulated dropsondes taken from analysis (historical weather maps). We train models using different data mining and machine learning techniques, most notably boosted regression trees, that can accurately assimilate the distant dropsonde. The model takes a forecast grid and simulated remote dropsonde data as input and produces an estimate of the wind stick over the drop zone. Using ballistic winds as a defining metric, we show our data driven approach does better than Barnes interpolation under some conditions, most notably when the forecast error is different between the two locations, on test data previously unseen by the model. We study and evaluate the model's performance depending on the size, the time lag, the drop altitude, and the geographic location of the training set, and identify parameters most contributing to the accuracy of the wind estimation. This study demonstrates a new approach for assimilating remotely released dropsondes, based on boosted regression trees, and shows improvement in wind estimation over currently used methods.
The economic impact of NASA R and D spending Appendices
NASA Technical Reports Server (NTRS)
Evans, M. K.
1976-01-01
Seven appendices related to a previous report on the economic impact of NASA R and D spending were presented. They dealt with: (1) theoretical and empirical development of aggregate production functions, (2) the calculation of the time series for the rate of technological progress, (3) the calculation of the industry mix variable, (4) the estimation of distributed lags, (5) the estimation of the equations for gamma, (6) a ten-year forecast of the U.S. economy, (7) simulations of the macroeconomic model for increases in NASA R and D spending of $1.0, $.0.5, and 0.1 billions.
NASA Astrophysics Data System (ADS)
Zidikheri, Meelis J.; Lucas, Christopher; Potts, Rodney J.
2017-08-01
Airborne volcanic ash is a hazard to aviation. There is an increasing demand for quantitative forecasts of ash properties such as ash mass load to allow airline operators to better manage the risks of flying through airspace likely to be contaminated by ash. In this paper we show how satellite-derived mass load information at times prior to the issuance of the latest forecast can be used to estimate various model parameters that are not easily obtained by other means such as the distribution of mass of the ash column at the volcano. This in turn leads to better forecasts of ash mass load. We demonstrate the efficacy of this approach using several case studies.
Gershon, Andrea; Thiruchelvam, Deva; Moineddin, Rahim; Zhao, Xiu Yan; Hwee, Jeremiah; To, Teresa
2017-06-01
Knowing trends in and forecasting hospitalization and emergency department visit rates for chronic obstructive pulmonary disease (COPD) can enable health care providers, hospitals, and health care decision makers to plan for the future. We conducted a time-series analysis using health care administrative data from the Province of Ontario, Canada, to determine previous trends in acute care hospitalization and emergency department visit rates for COPD and then to forecast future rates. Individuals aged 35 years and older with physician-diagnosed COPD were identified using four universal government health administrative databases and a validated case definition. Monthly COPD hospitalization and emergency department visit rates per 1,000 people with COPD were determined from 2003 to 2014 and then forecasted to 2024 using autoregressive integrated moving average models. Between 2003 and 2014, COPD prevalence increased from 8.9 to 11.1%. During that time, there were 274,951 hospitalizations and 290,482 emergency department visits for COPD. After accounting for seasonality, we found that monthly COPD hospitalization and emergency department visit rates per 1,000 individuals with COPD remained stable. COPD prevalence was forecasted to increase to 12.7% (95% confidence interval [CI], 11.4-14.1) by 2024, whereas monthly COPD hospitalization and emergency department visit rates per 1,000 people with COPD were forecasted to remain stable at 2.7 (95% CI, 1.6-4.4) and 3.7 (95% CI, 2.3-5.6), respectively. Forecasted age- and sex-stratified rates were also stable. COPD hospital and emergency department visit rates per 1,000 people with COPD have been stable for more than a decade and are projected to remain stable in the near future. Given increasing COPD prevalence, this means notably more COPD health service use in the future.
NASA Astrophysics Data System (ADS)
Yuchi, Weiran; Yao, Jiayun; McLean, Kathleen E.; Stull, Roland; Pavlovic, Radenko; Davignon, Didier; Moran, Michael D.; Henderson, Sarah B.
2016-11-01
Fine particulate matter (PM2.5) generated by forest fires has been associated with a wide range of adverse health outcomes, including exacerbation of respiratory diseases and increased risk of mortality. Due to the unpredictable nature of forest fires, it is challenging for public health authorities to reliably evaluate the magnitude and duration of potential exposures before they occur. Smoke forecasting tools are a promising development from the public health perspective, but their widespread adoption is limited by their inherent uncertainties. Observed measurements from air quality monitoring networks and remote sensing platforms are more reliable, but they are inherently retrospective. It would be ideal to reduce the uncertainty in smoke forecasts by integrating any available observations. This study takes spatially resolved PM2.5 estimates from an empirical model that integrates air quality measurements with satellite data, and averages them with PM2.5 predictions from two smoke forecasting systems. Two different indicators of population respiratory health are then used to evaluate whether the blending improved the utility of the smoke forecasts. Among a total of six models, including two single forecasts and four blended forecasts, the blended estimates always performed better than the forecast values alone. Integrating measured observations into smoke forecasts could improve public health preparedness for smoke events, which are becoming more frequent and intense as the climate changes.
Liu, Yan; Watson, Stella C; Gettings, Jenna R; Lund, Robert B; Nordone, Shila K; Yabsley, Michael J; McMahan, Christopher S
2017-01-01
This paper forecasts the 2016 canine Anaplasma spp. seroprevalence in the United States from eight climate, geographic and societal factors. The forecast's construction and an assessment of its performance are described. The forecast is based on a spatial-temporal conditional autoregressive model fitted to over 11 million Anaplasma spp. seroprevalence test results for dogs conducted in the 48 contiguous United States during 2011-2015. The forecast uses county-level data on eight predictive factors, including annual temperature, precipitation, relative humidity, county elevation, forestation coverage, surface water coverage, population density and median household income. Non-static factors are extrapolated into the forthcoming year with various statistical methods. The fitted model and factor extrapolations are used to estimate next year's regional prevalence. The correlation between the observed and model-estimated county-by-county Anaplasma spp. seroprevalence for the five-year period 2011-2015 is 0.902, demonstrating reasonable model accuracy. The weighted correlation (accounting for different sample sizes) between 2015 observed and forecasted county-by-county Anaplasma spp. seroprevalence is 0.987, exhibiting that the proposed approach can be used to accurately forecast Anaplasma spp. seroprevalence. The forecast presented herein can a priori alert veterinarians to areas expected to see Anaplasma spp. seroprevalence beyond the accepted endemic range. The proposed methods may prove useful for forecasting other diseases.
Why preferring parametric forecasting to nonparametric methods?
Jabot, Franck
2015-05-07
A recent series of papers by Charles T. Perretti and collaborators have shown that nonparametric forecasting methods can outperform parametric methods in noisy nonlinear systems. Such a situation can arise because of two main reasons: the instability of parametric inference procedures in chaotic systems which can lead to biased parameter estimates, and the discrepancy between the real system dynamics and the modeled one, a problem that Perretti and collaborators call "the true model myth". Should ecologists go on using the demanding parametric machinery when trying to forecast the dynamics of complex ecosystems? Or should they rely on the elegant nonparametric approach that appears so promising? It will be here argued that ecological forecasting based on parametric models presents two key comparative advantages over nonparametric approaches. First, the likelihood of parametric forecasting failure can be diagnosed thanks to simple Bayesian model checking procedures. Second, when parametric forecasting is diagnosed to be reliable, forecasting uncertainty can be estimated on virtual data generated with the fitted to data parametric model. In contrast, nonparametric techniques provide forecasts with unknown reliability. This argumentation is illustrated with the simple theta-logistic model that was previously used by Perretti and collaborators to make their point. It should convince ecologists to stick to standard parametric approaches, until methods have been developed to assess the reliability of nonparametric forecasting. Copyright © 2015 Elsevier Ltd. All rights reserved.
2017-07-01
forecasts and observations on a common grid, which enables the application a number of different spatial verification methods that reveal various...forecasts of continuous meteorological variables using categorical and object-based methods . White Sands Missile Range (NM): Army Research Laboratory (US... Research version of the Weather Research and Forecasting Model adapted for generating short-range nowcasts and gridded observations produced by the
An experimental system for flood risk forecasting and monitoring at global scale
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Alfieri, Lorenzo; Kalas, Milan; Lorini, Valerio; Salamon, Peter
2017-04-01
Global flood forecasting and monitoring systems are nowadays a reality and are being applied by a wide range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk based forecasting, combining streamflow estimations with expected inundated areas and flood impacts. Finally, emerging technologies such as crowdsourcing and social media monitoring can play a crucial role in flood disaster management and preparedness. Here, we present some recent advances of an experimental procedure for near-real time flood mapping and impact assessment. The procedure translates in near real-time the daily streamflow forecasts issued by the Global Flood Awareness System (GloFAS) into event-based flood hazard maps, which are then combined with exposure and vulnerability information at global scale to derive risk forecast. Impacts of the forecasted flood events are evaluated in terms of flood prone areas, potential economic damage, and affected population, infrastructures and cities. To increase the reliability of our forecasts we propose the integration of model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification and correction of impact forecasts. Finally, we present the results of preliminary tests which show the potential of the proposed procedure in supporting emergency response and management.
Improving wave forecasting by integrating ensemble modelling and machine learning
NASA Astrophysics Data System (ADS)
O'Donncha, F.; Zhang, Y.; James, S. C.
2017-12-01
Modern smart-grid networks use technologies to instantly relay information on supply and demand to support effective decision making. Integration of renewable-energy resources with these systems demands accurate forecasting of energy production (and demand) capacities. For wave-energy converters, this requires wave-condition forecasting to enable estimates of energy production. Current operational wave forecasting systems exhibit substantial errors with wave-height RMSEs of 40 to 60 cm being typical, which limits the reliability of energy-generation predictions thereby impeding integration with the distribution grid. In this study, we integrate physics-based models with statistical learning aggregation techniques that combine forecasts from multiple, independent models into a single "best-estimate" prediction of the true state. The Simulating Waves Nearshore physics-based model is used to compute wind- and currents-augmented waves in the Monterey Bay area. Ensembles are developed based on multiple simulations perturbing input data (wave characteristics supplied at the model boundaries and winds) to the model. A learning-aggregation technique uses past observations and past model forecasts to calculate a weight for each model. The aggregated forecasts are compared to observation data to quantify the performance of the model ensemble and aggregation techniques. The appropriately weighted ensemble model outperforms an individual ensemble member with regard to forecasting wave conditions.
Forecast model applications of retrieved three dimensional liquid water fields
NASA Technical Reports Server (NTRS)
Raymond, William H.; Olson, William S.
1990-01-01
Forecasts are made for tropical storm Emily using heating rates derived from the SSM/I physical retrievals described in chapters 2 and 3. Average values of the latent heating rates from the convective and stratiform cloud simulations, used in the physical retrieval, are obtained for individual 1.1 km thick vertical layers. Then, the layer-mean latent heating rates are regressed against the slant path-integrated liquid and ice precipitation water contents to determine the best fit two parameter regression coefficients for each layer. The regression formulae and retrieved precipitation water contents are utilized to infer the vertical distribution of heating rates for forecast model applications. In the forecast model, diabatic temperature contributions are calculated and used in a diabatic initialization, or in a diabatic initialization combined with a diabatic forcing procedure. Our forecasts show that the time needed to spin-up precipitation processes in tropical storm Emily is greatly accelerated through the application of the data.
Simultaneous calibration of ensemble river flow predictions over an entire range of lead times
NASA Astrophysics Data System (ADS)
Hemri, S.; Fundel, F.; Zappa, M.
2013-10-01
Probabilistic estimates of future water levels and river discharge are usually simulated with hydrologic models using ensemble weather forecasts as main inputs. As hydrologic models are imperfect and the meteorological ensembles tend to be biased and underdispersed, the ensemble forecasts for river runoff typically are biased and underdispersed, too. Thus, in order to achieve both reliable and sharp predictions statistical postprocessing is required. In this work Bayesian model averaging (BMA) is applied to statistically postprocess ensemble runoff raw forecasts for a catchment in Switzerland, at lead times ranging from 1 to 240 h. The raw forecasts have been obtained using deterministic and ensemble forcing meteorological models with different forecast lead time ranges. First, BMA is applied based on mixtures of univariate normal distributions, subject to the assumption of independence between distinct lead times. Then, the independence assumption is relaxed in order to estimate multivariate runoff forecasts over the entire range of lead times simultaneously, based on a BMA version that uses multivariate normal distributions. Since river runoff is a highly skewed variable, Box-Cox transformations are applied in order to achieve approximate normality. Both univariate and multivariate BMA approaches are able to generate well calibrated probabilistic forecasts that are considerably sharper than climatological forecasts. Additionally, multivariate BMA provides a promising approach for incorporating temporal dependencies into the postprocessed forecasts. Its major advantage against univariate BMA is an increase in reliability when the forecast system is changing due to model availability.
NASA Astrophysics Data System (ADS)
Yin, Yip Chee; Hock-Eam, Lim
2012-09-01
Our empirical results show that we can predict GDP growth rate more accurately in continent with fewer large economies, compared to smaller economies like Malaysia. This difficulty is very likely positively correlated with subsidy or social security policies. The stage of economic development and level of competiveness also appears to have interactive effects on this forecast stability. These results are generally independent of the forecasting procedures. Countries with high stability in their economic growth, forecasting by model selection is better than model averaging. Overall forecast weight averaging (FWA) is a better forecasting procedure in most countries. FWA also outperforms simple model averaging (SMA) and has the same forecasting ability as Bayesian model averaging (BMA) in almost all countries.
Market-based demand forecasting promotes informed strategic financial planning.
Beech, A J
2001-11-01
Market-based demand forecasting is a method of estimating future demand for a healthcare organization's services by using a broad range of data that describe the nature of demand within the organization's service area. Such data include the primary and secondary service areas, the service-area populations by various demographic groupings, discharge utilization rates, market size, and market share by service line and organizationwide. Based on observable market dynamics, strategic planners can make a variety of explicit assumptions about future trends regarding these data to develop scenarios describing potential future demand. Financial planners then can evaluate each scenario to determine its potential effect on selected financial and operational measures, such as operating margin, days cash on hand, and debt-service coverage, and develop a strategic financial plan that covers a range of contingencies.
Optimal interpolation and the Kalman filter. [for analysis of numerical weather predictions
NASA Technical Reports Server (NTRS)
Cohn, S.; Isaacson, E.; Ghil, M.
1981-01-01
The estimation theory of stochastic-dynamic systems is described and used in a numerical study of optimal interpolation. The general form of data assimilation methods is reviewed. The Kalman-Bucy, KB filter, and optimal interpolation (OI) filters are examined for effectiveness in performance as gain matrices using a one-dimensional form of the shallow-water equations. Control runs in the numerical analyses were performed for a ten-day forecast in concert with the OI method. The effects of optimality, initialization, and assimilation were studied. It was found that correct initialization is necessary in order to localize errors, especially near boundary points. Also, the use of small forecast error growth rates over data-sparse areas was determined to offset inaccurate modeling of correlation functions near boundaries.
A stochastic post-processing method for solar irradiance forecasts derived from NWPs models
NASA Astrophysics Data System (ADS)
Lara-Fanego, V.; Pozo-Vazquez, D.; Ruiz-Arias, J. A.; Santos-Alamillos, F. J.; Tovar-Pescador, J.
2010-09-01
Solar irradiance forecast is an important area of research for the future of the solar-based renewable energy systems. Numerical Weather Prediction models (NWPs) have proved to be a valuable tool for solar irradiance forecasting with lead time up to a few days. Nevertheless, these models show low skill in forecasting the solar irradiance under cloudy conditions. Additionally, climatic (averaged over seasons) aerosol loading are usually considered in these models, leading to considerable errors for the Direct Normal Irradiance (DNI) forecasts during high aerosols load conditions. In this work we propose a post-processing method for the Global Irradiance (GHI) and DNI forecasts derived from NWPs. Particularly, the methods is based on the use of Autoregressive Moving Average with External Explanatory Variables (ARMAX) stochastic models. These models are applied to the residuals of the NWPs forecasts and uses as external variables the measured cloud fraction and aerosol loading of the day previous to the forecast. The method is evaluated for a set one-moth length three-days-ahead forecast of the GHI and DNI, obtained based on the WRF mesoscale atmospheric model, for several locations in Andalusia (Southern Spain). The Cloud fraction is derived from MSG satellite estimates and the aerosol loading from the MODIS platform estimates. Both sources of information are readily available at the time of the forecast. Results showed a considerable improvement of the forecasting skill of the WRF model using the proposed post-processing method. Particularly, relative improvement (in terms of the RMSE) for the DNI during summer is about 20%. A similar value is obtained for the GHI during the winter.
Quantifying automobile refinishing VOC air emissions - a methodology with estimates and forecasts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, S.P.; Rubick, C.
1996-12-31
Automobile refinishing coatings (referred to as paints), paint thinners, reducers, hardeners, catalysts, and cleanup solvents used during their application, contain volatile organic compounds (VOCs) which are precursors to ground level ozone formation. Some of these painting compounds create hazardous air pollutants (HAPs) which are toxic. This paper documents the methodology, data sets, and the results of surveys (conducted in the fall of 1995) used to develop revised per capita emissions factors for estimating and forecasting the VOC air emissions from the area source category of automobile refinishing. Emissions estimates, forecasts, trends, and reasons for these trends are presented. Future emissionsmore » inventory (EI) challenges are addressed in light of data availability and information networks.« less
The case for probabilistic forecasting in hydrology
NASA Astrophysics Data System (ADS)
Krzysztofowicz, Roman
2001-08-01
That forecasts should be stated in probabilistic, rather than deterministic, terms has been argued from common sense and decision-theoretic perspectives for almost a century. Yet most operational hydrological forecasting systems produce deterministic forecasts and most research in operational hydrology has been devoted to finding the 'best' estimates rather than quantifying the predictive uncertainty. This essay presents a compendium of reasons for probabilistic forecasting of hydrological variates. Probabilistic forecasts are scientifically more honest, enable risk-based warnings of floods, enable rational decision making, and offer additional economic benefits. The growing demand for information about risk and the rising capability to quantify predictive uncertainties create an unparalleled opportunity for the hydrological profession to dramatically enhance the forecasting paradigm.
Operational Earthquake Forecasting of Aftershocks for New England
NASA Astrophysics Data System (ADS)
Ebel, J.; Fadugba, O. I.
2015-12-01
Although the forecasting of mainshocks is not possible, recent research demonstrates that probabilistic forecasts of expected aftershock activity following moderate and strong earthquakes is possible. Previous work has shown that aftershock sequences in intraplate regions behave similarly to those in California, and thus the operational aftershocks forecasting methods that are currently employed in California can be adopted for use in areas of the eastern U.S. such as New England. In our application, immediately after a felt earthquake in New England, a forecast of expected aftershock activity for the next 7 days will be generated based on a generic aftershock activity model. Approximately 24 hours after the mainshock, the parameters of the aftershock model will be updated using the observed aftershock activity observed to that point in time, and a new forecast of expected aftershock activity for the next 7 days will be issued. The forecast will estimate the average number of weak, felt aftershocks and the average expected number of aftershocks based on the aftershock statistics of past New England earthquakes. The forecast also will estimate the probability that an earthquake that is stronger than the mainshock will take place during the next 7 days. The aftershock forecast will specify the expected aftershocks locations as well as the areas over which aftershocks of different magnitudes could be felt. The system will use web pages, email and text messages to distribute the aftershock forecasts. For protracted aftershock sequences, new forecasts will be issued on a regular basis, such as weekly. Initially, the distribution system of the aftershock forecasts will be limited, but later it will be expanded as experience with and confidence in the system grows.
NASA Astrophysics Data System (ADS)
Wood, A. W.; Clark, E.; Mendoza, P. A.; Nijssen, B.; Newman, A. J.; Clark, M. P.; Arnold, J.; Nowak, K. C.
2016-12-01
Many if not most national operational short-to-medium range streamflow prediction systems rely on a forecaster-in-the-loop approach in which some parts of the forecast workflow are automated, but others require the hands-on-effort of an experienced human forecaster. This approach evolved out of the need to correct for deficiencies in the models and datasets that were available for forecasting, and often leads to skillful predictions despite the use of relatively simple, conceptual models. On the other hand, the process is not reproducible, which limits opportunities to assess and incorporate process variations, and the effort required to make forecasts in this way is an obstacle to expanding forecast services - e.g., though adding new forecast locations or more frequent forecast updates, running more complex models, or producing forecast ensembles and hindcasts that can support verification. In the last decade, the hydrologic forecasting community has begun to develop more centralized, `over-the-loop' systems. The quality of these new forecast products will depend on their ability to leverage research in areas including earth system modeling, parameter estimation, data assimilation, statistical post-processing, weather and climate prediction, verification, and uncertainty estimation through the use of ensembles. Currently, the operational streamflow forecasting and water management communities have little experience with the strengths and weaknesses of over-the-loop approaches, even as the systems are being rolled out in major operational forecasting centers. There is thus a need both to evaluate these forecasting advances and to demonstrate their potential in a public arena, raising awareness in forecast user communities and development programs alike. To address this need, the National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation and the US Army Corps of Engineers, using the NCAR 'System for Hydromet Analysis, Research, and Prediction' (SHARP) to implement, assess and demonstrate real-time over-the-loop forecasts. We present early hindcast and verification results from SHARP for short to medium range streamflow forecasts in a number of US case study watersheds.
NASA Astrophysics Data System (ADS)
Matter, M. A.; Garcia, L. A.; Fontane, D. G.
2005-12-01
Accuracy of water supply forecasts has improved for some river basins in the western U.S.A. by integrating knowledge of climate teleconnections, such as El Niño/Southern Oscillation (ENSO), into forecasting routines, but in other basins, such as the Colorado River Basin (CRB), forecast accuracy has declined (Pagano et al. 2004). Longer lead time and more accurate seasonal forecasts, particularly during floods or drought, could help reduce uncertainty and risk in decision-making and lengthen the period for planning more efficient and effective strategies for water use and ecosystem management. The goal of this research is to extend the lead time for snowmelt hydrograph estimation by 4-6 months (from spring to the preceding fall), and at the same time increase the accuracy of snowmelt runoff estimates in the Upper CRB (UCRB). We hypothesize that: (1) UCRB snowpack accumulation and melt are driven by large scale climate modes, including ENSO, PDO and AMO, that establish by fall and persist into early spring; (2) forecast analysis may begin in the fall prior to the start of the primary snow accumulation period and when energy to change the climate system is decreasing; and (3) between fall and early spring, streamflow hydrographs will amplify precipitation and temperature signals, and thus will evolve characteristically in response to wet, dry or average hydroclimatic conditions. Historical in situ records from largely unregulated river reaches and undeveloped time periods of the UCRB are used to test this hypothesis. Preliminary results show that, beginning in the fall (e.g., October or November) streamflow characteristics, including magnitude, rate of change and variability, as well as timing and magnitude of fall/early winter and late winter/early spring season flow volumes, are directly correlated with the magnitude of the upcoming snowmelt runoff (or annual basin yield). The use of climate teleconnections to determine characteristic streamflow responses in the UCRB advances understanding of atmosphere/land surface processes and interactions in complex terrain and subsequent effects on snowpack development and runoff (i.e., water supply), and may be used to improve seasonal forecast accuracy and extend lead time to develop more efficient and effective management strategies for water resources and ecosystems.
Disaggregating residential water demand for improved forecasts and decision making
NASA Astrophysics Data System (ADS)
Woodard, G.; Brookshire, D.; Chermak, J.; Krause, K.; Roach, J.; Stewart, S.; Tidwell, V.
2003-04-01
Residential water demand is the product of population and per capita demand. Estimates of per capita demand often are based on econometric models of demand, usually based on time series data of demand aggregated at the water provider level. Various studies have examined the impact of such factors as water pricing, weather, and income, with many other factors and details of water demand remaining unclear. Impacts of water conservation programs often are estimated using simplistic engineering calculations. Partly as a result of this, policy discussions regarding water demand management often focus on water pricing, water conservation, and growth control. Projecting water demand is often a straight-forward, if fairly uncertain process of forecasting population and per capita demand rates. SAHRA researchers are developing improved forecasts of residential water demand by disaggregating demand to the level of individuals, households, and specific water uses. Research results based on high-resolution water meter loggers, household-level surveys, economic experiments and recent census data suggest that changes in wealth, household composition, and individual behavior may affect demand more than changes in population or the stock of landscape plants, water-using appliances and fixtures, generally considered the primary determinants of demand. Aging populations and lower fertility rates are dramatically reducing household size, thereby increasing the number of households and residences for a given population. Recent prosperity and low interest rates have raised home ownership rates to unprecented levels. These two trends are leading to increased per capita outdoor water demand. Conservation programs have succeeded in certain areas, such as promoting drought-tolerant native landscaping, but have failed in other areas, such as increasing irrigation efficiency or curbing swimming pool water usage. Individual behavior often is more important than the household's stock of water-using fixtures, and ranges from hedonism (installing pools and whirlpool tubs) to satisficing (adjusting irrigation timers only twice per year) to acting on deeply-held conservation ethics in ways that not only fail any benefit-cost test, but are discouraged, or even illegal (reuse of gray water and black water). Research findings are being captured in dynamic simulation models that integrate social and natural science to create tools to assist water resource managers in providing sustainable water supplies and improving residential water demand forecasts. These models feature simple, graphical user interfaces and output screens that provide decision makers with visual, easy-to-understand information at the basin level. The models reveal connections between various supply and demand components, and highlight direct impacts and feedback mechanisms associated with various policy options.
Weight and cost forecasting for advanced manned space vehicles
NASA Technical Reports Server (NTRS)
Williams, Raymond
1989-01-01
A mass and cost estimating computerized methology for predicting advanced manned space vehicle weights and costs was developed. The user friendly methology designated MERCER (Mass Estimating Relationship/Cost Estimating Relationship) organizes the predictive process according to major vehicle subsystem levels. Design, development, test, evaluation, and flight hardware cost forecasting is treated by the study. This methodology consists of a complete set of mass estimating relationships (MERs) which serve as the control components for the model and cost estimating relationships (CERs) which use MER output as input. To develop this model, numerous MER and CER studies were surveyed and modified where required. Additionally, relationships were regressed from raw data to accommodate the methology. The models and formulations which estimated the cost of historical vehicles to within 20 percent of the actual cost were selected. The result of the research, along with components of the MERCER Program, are reported. On the basis of the analysis, the following conclusions were established: (1) The cost of a spacecraft is best estimated by summing the cost of individual subsystems; (2) No one cost equation can be used for forecasting the cost of all spacecraft; (3) Spacecraft cost is highly correlated with its mass; (4) No study surveyed contained sufficient formulations to autonomously forecast the cost and weight of the entire advanced manned vehicle spacecraft program; (5) No user friendly program was found that linked MERs with CERs to produce spacecraft cost; and (6) The group accumulation weight estimation method (summing the estimated weights of the various subsystems) proved to be a useful method for finding total weight and cost of a spacecraft.
Estimation of Uncertainties in Stage-Discharge Curve for an Experimental Himalayan Watershed
NASA Astrophysics Data System (ADS)
Kumar, V.; Sen, S.
2016-12-01
Various water resource projects developed on rivers originating from the Himalayan region, the "Water Tower of Asia", plays an important role on downstream development. Flow measurements at the desired river site are very critical for river engineers and hydrologists for water resources planning and management, flood forecasting, reservoir operation and flood inundation studies. However, an accurate discharge assessment of these mountainous rivers is costly, tedious and frequently dangerous to operators during flood events. Currently, in India, discharge estimation is linked to stage-discharge relationship known as rating curve. This relationship would be affected by a high degree of uncertainty. Estimating the uncertainty of rating curve remains a relevant challenge because it is not easy to parameterize. Main source of rating curve uncertainty are errors because of incorrect discharge measurement, variation in hydraulic conditions and depth measurement. In this study our objective is to obtain best parameters of rating curve that fit the limited record of observations and to estimate uncertainties at different depth obtained from rating curve. The rating curve parameters of standard power law are estimated for three different streams of Aglar watershed located in lesser Himalayas by maximum-likelihood estimator. Quantification of uncertainties in the developed rating curves is obtained from the estimate of variances and covariances of the rating curve parameters. Results showed that the uncertainties varied with catchment behavior with error varies between 0.006-1.831 m3/s. Discharge uncertainty in the Aglar watershed streams significantly depend on the extent of extrapolation outside the range of observed water levels. Extrapolation analysis confirmed that more than 15% for maximum discharges and 5% for minimum discharges are not strongly recommended for these mountainous gauging sites.
Assessing the Value of Frost Forecasts to Orchardists: A Dynamic Decision-Making Approach.
NASA Astrophysics Data System (ADS)
Katz, Richard W.; Murphy, Allan H.; Winkler, Robert L.
1982-04-01
The methodology of decision analysis is used to investigate the economic value of frost (i.e., minimum temperature) forecasts to orchardists. First, the fruit-frost situation and previous studies of the value of minimum temperature forecasts in this context are described. Then, after a brief overview of decision analysis, a decision-making model for the fruit-frost problem is presented. The model involves identifying the relevant actions and events (or outcomes), specifying the effect of taking protective action, and describing the relationships among temperature, bud loss, and yield loss. A bivariate normal distribution is used to model the relationship between forecast and observed temperatures, thereby characterizing the quality of different types of information. Since the orchardist wants to minimize expenses (or maximize payoffs) over the entire frost-protection season and since current actions and outcomes at any point in the season are related to both previous and future actions and outcomes, the decision-making problem is inherently dynamic in nature. As a result, a class of dynamic models known as Markov decision processes is considered. A computational technique called dynamic programming is used in conjunction with these models to determine the optimal actions and to estimate the value of meteorological information.Some results concerning the value of frost forecasts to orchardists in the Yakima Valley of central Washington are presented for the cases of red delicious apples, bartlett pears, and elberta peaches. Estimates of the parameter values in the Markov decision process are obtained from relevant physical and economic data. Twenty years of National Weather Service forecast and observed temperatures for the Yakima key station are used to estimate the quality of different types of information, including perfect forecasts, current forecasts, and climatological information. The orchardist's optimal actions over the frost-protection season and the expected expenses associated with the use of such information are determined using a dynamic programming algorithm. The value of meteorological information is defined as the difference between the expected expense for the information of interest and the expected expense for climatological information. Over the entire frost-protection season, the value estimates (in 1977 dollars) for current forecasts were $808 per acre for red delicious apples, $492 per acre for bartlett pears, and $270 per acre for elberta peaches. These amounts account for 66, 63, and 47%, respectively, of the economic value associated with decisions based on perfect forecasts. Varying the quality of the minimum temperature forecasts reveals that the relationship between the accuracy and value of such forecasts is nonlinear and that improvements in current forecasts would not be as significant in terms of economic value as were comparable improvements in the past.Several possible extensions of this study of the value of frost forecasts to orchardists are briefly described. Finally, the application of the dynamic model formulated in this paper to other decision-making problems involving the use of meteorological information is mentioned.
Using Time-Phased Casualty Estimates to Determine Medical Resupply Requirements
2006-09-18
calculated from the list of tasks. The RSVP-planned MTF laydown would be replaced by the reporting MTF with a known location. One advantage of...Another advantage is the ability to adapt quickly to changing requirements. Supplies that are used at a faster than initially forecast rate will...Officer ( GMO ) Platforms. San Diego, Calif: Naval Health Research Center; 2001. Technical Report No. 01-18. 5. Galarneau MR, Pang G, Konoske PJ
2013 Gulf of Mexico Hypoxia Forecast
Scavia, Donald; Evans, Mary Anne; Obenour, Dan
2013-01-01
The Gulf of Mexico annual summer hypoxia forecasts are based on average May total nitrogen loads from the Mississippi River basin for that year. The load estimate, recently released by USGS, is 7,316 metric tons per day. Based on that estimate, we predict the area of this summer’s hypoxic zone to be 18,900 square kilometers (95% credible interval, 13,400 to 24,200), the 7th largest reported and about the size of New Jersey. Our forecast hypoxic volume is 74.5 km3 (95% credible interval, 51.5 to 97.0), also the 7th largest on record.
Pecha, Petr; Šmídl, Václav
2016-11-01
A stepwise sequential assimilation algorithm is proposed based on an optimisation approach for recursive parameter estimation and tracking of radioactive plume propagation in the early stage of a radiation accident. Predictions of the radiological situation in each time step of the plume propagation are driven by an existing short-term meteorological forecast and the assimilation procedure manipulates the model parameters to match the observations incoming concurrently from the terrain. Mathematically, the task is a typical ill-posed inverse problem of estimating the parameters of the release. The proposed method is designated as a stepwise re-estimation of the source term release dynamics and an improvement of several input model parameters. It results in a more precise determination of the adversely affected areas in the terrain. The nonlinear least-squares regression methodology is applied for estimation of the unknowns. The fast and adequately accurate segmented Gaussian plume model (SGPM) is used in the first stage of direct (forward) modelling. The subsequent inverse procedure infers (re-estimates) the values of important model parameters from the actual observations. Accuracy and sensitivity of the proposed method for real-time forecasting of the accident propagation is studied. First, a twin experiment generating noiseless simulated "artificial" observations is studied to verify the minimisation algorithm. Second, the impact of the measurement noise on the re-estimated source release rate is examined. In addition, the presented method can be used as a proposal for more advanced statistical techniques using, e.g., importance sampling. Copyright © 2016 Elsevier Ltd. All rights reserved.
Kinyoki, Damaris K; Berkley, James A; Moloney, Grainne M; Odundo, Elijah O; Kandala, Ngianga-Bakwin; Noor, Abdisalan M
2016-07-28
Stunting among children under five years old is associated with long-term effects on cognitive development, school achievement, economic productivity in adulthood and maternal reproductive outcomes. Accurate estimation of stunting and tools to forecast risk are key to planning interventions. We estimated the prevalence and distribution of stunting among children under five years in Somalia from 2007 to 2010 and explored the role of environmental covariates in its forecasting. Data from household nutritional surveys in Somalia from 2007 to 2010 with a total of 1,066 clusters covering 73,778 children were included. We developed a Bayesian hierarchical space-time model to forecast stunting by using the relationship between observed stunting and environmental covariates in the preceding years. We then applied the model coefficients to environmental covariates in subsequent years. To determine the accuracy of the forecasting, we compared this model with a model that used data from all the years with the corresponding environmental covariates. Rainfall (OR = 0.994, 95 % Credible interval (CrI): 0.993, 0.995) and vegetation cover (OR = 0.719, 95 % CrI: 0.603, 0.858) were significant in forecasting stunting. The difference in estimates of stunting using the two approaches was less than 3 % in all the regions for all forecast years. Stunting in Somalia is spatially and temporally heterogeneous. Rainfall and vegetation are major drivers of these variations. The use of environmental covariates for forecasting of stunting is a potentially useful and affordable tool for planning interventions to reduce the high burden of malnutrition in Somalia.
Staid, Andrea; Watson, Jean -Paul; Wets, Roger J. -B.; ...
2017-07-11
Forecasts of available wind power are critical in key electric power systems operations planning problems, including economic dispatch and unit commitment. Such forecasts are necessarily uncertain, limiting the reliability and cost effectiveness of operations planning models based on a single deterministic or “point” forecast. A common approach to address this limitation involves the use of a number of probabilistic scenarios, each specifying a possible trajectory of wind power production, with associated probability. We present and analyze a novel method for generating probabilistic wind power scenarios, leveraging available historical information in the form of forecasted and corresponding observed wind power timemore » series. We estimate non-parametric forecast error densities, specifically using epi-spline basis functions, allowing us to capture the skewed and non-parametric nature of error densities observed in real-world data. We then describe a method to generate probabilistic scenarios from these basis functions that allows users to control for the degree to which extreme errors are captured.We compare the performance of our approach to the current state-of-the-art considering publicly available data associated with the Bonneville Power Administration, analyzing aggregate production of a number of wind farms over a large geographic region. Finally, we discuss the advantages of our approach in the context of specific power systems operations planning problems: stochastic unit commitment and economic dispatch. Here, our methodology is embodied in the joint Sandia – University of California Davis Prescient software package for assessing and analyzing stochastic operations strategies.« less
Forecasting Hourly Water Demands With Seasonal Autoregressive Models for Real-Time Application
NASA Astrophysics Data System (ADS)
Chen, Jinduan; Boccelli, Dominic L.
2018-02-01
Consumer water demands are not typically measured at temporal or spatial scales adequate to support real-time decision making, and recent approaches for estimating unobserved demands using observed hydraulic measurements are generally not capable of forecasting demands and uncertainty information. While time series modeling has shown promise for representing total system demands, these models have generally not been evaluated at spatial scales appropriate for representative real-time modeling. This study investigates the use of a double-seasonal time series model to capture daily and weekly autocorrelations to both total system demands and regional aggregated demands at a scale that would capture demand variability across a distribution system. Emphasis was placed on the ability to forecast demands and quantify uncertainties with results compared to traditional time series pattern-based demand models as well as nonseasonal and single-seasonal time series models. Additional research included the implementation of an adaptive-parameter estimation scheme to update the time series model when unobserved changes occurred in the system. For two case studies, results showed that (1) for the smaller-scale aggregated water demands, the log-transformed time series model resulted in improved forecasts, (2) the double-seasonal model outperformed other models in terms of forecasting errors, and (3) the adaptive adjustment of parameters during forecasting improved the accuracy of the generated prediction intervals. These results illustrate the capabilities of time series modeling to forecast both water demands and uncertainty estimates at spatial scales commensurate for real-time modeling applications and provide a foundation for developing a real-time integrated demand-hydraulic model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staid, Andrea; Watson, Jean -Paul; Wets, Roger J. -B.
Forecasts of available wind power are critical in key electric power systems operations planning problems, including economic dispatch and unit commitment. Such forecasts are necessarily uncertain, limiting the reliability and cost effectiveness of operations planning models based on a single deterministic or “point” forecast. A common approach to address this limitation involves the use of a number of probabilistic scenarios, each specifying a possible trajectory of wind power production, with associated probability. We present and analyze a novel method for generating probabilistic wind power scenarios, leveraging available historical information in the form of forecasted and corresponding observed wind power timemore » series. We estimate non-parametric forecast error densities, specifically using epi-spline basis functions, allowing us to capture the skewed and non-parametric nature of error densities observed in real-world data. We then describe a method to generate probabilistic scenarios from these basis functions that allows users to control for the degree to which extreme errors are captured.We compare the performance of our approach to the current state-of-the-art considering publicly available data associated with the Bonneville Power Administration, analyzing aggregate production of a number of wind farms over a large geographic region. Finally, we discuss the advantages of our approach in the context of specific power systems operations planning problems: stochastic unit commitment and economic dispatch. Here, our methodology is embodied in the joint Sandia – University of California Davis Prescient software package for assessing and analyzing stochastic operations strategies.« less
NASA Technical Reports Server (NTRS)
Wolff, David B.; Fisher, Brad L.
2010-01-01
Space-borne microwave sensors provide critical rain information used in several global multi-satellite rain products, which in turn are used for a variety of important studies, including landslide forecasting, flash flood warning, data assimilation, climate studies, and validation of model forecasts of precipitation. This study employs four years (2003-2006) of satellite data to assess the relative performance and skill of SSM/I (F13, F14 and F15), AMSU-B (N15, N16 and N17), AMSR-E (Aqua) and the TRMM Microwave Imager (TMI) in estimating surface rainfall based on direct instantaneous comparisons with ground-based rain estimates from Tropical Rainfall Measuring Mission (TRMM) Ground Validation (GV) sites at Kwajalein, Republic of the Marshall Islands (KWAJ) and Melbourne, Florida (MELB). The relative performance of each of these satellite estimates is examined via comparisons with space- and time-coincident GV radar-based rain rate estimates. Because underlying surface terrain is known to affect the relative performance of the satellite algorithms, the data for MELB was further stratified into ocean, land and coast categories using a 0.25 terrain mask. Of all the satellite estimates compared in this study, TMI and AMSR-E exhibited considerably higher correlations and skills in estimating/observing surface precipitation. While SSM/I and AMSU-B exhibited lower correlations and skills for each of the different terrain categories, the SSM/I absolute biases trended slightly lower than AMSRE over ocean, where the observations from both emission and scattering channels were used in the retrievals. AMSU-B exhibited the least skill relative to GV in all of the relevant statistical categories, and an anomalous spike was observed in the probability distribution functions near 1.0 mm/hr. This statistical artifact appears to be related to attempts by algorithm developers to include some lighter rain rates, not easily detectable by its scatter-only frequencies. AMSU-B, however, agreed well with GV when the matching data was analyzed on monthly scales. These results signal developers of global rainfall products, such as the TRMM Multi-Satellite Precipitation Analysis (TMPA), and the Climate Data Center s Morphing (CMORPH) technique, that care must be taken when incorporating data from these input satellite estimates in order to provide the highest quality estimates in their products. 3
NASA Technical Reports Server (NTRS)
Wolff, David B.; Fisher, Brad L.
2011-01-01
Space-borne microwave sensors provide critical rain information used in several global multi-satellite rain products, which in turn are used for a variety of important studies, including landslide forecasting, flash flood warning, data assimilation, climate studies, and validation of model forecasts of precipitation. This study employs four years (2003-2006) of satellite data to assess the relative performance and skill of SSM/I (F13, F14 and F15), AMSU-B (N15, N16 and N17), AMSR-E (Aqua) and the TRMM Microwave Imager (TMI) in estimating surface rainfall based on direct instantaneous comparisons with ground-based rain estimates from Tropical Rainfall Measuring Mission (TRMM) Ground Validation (GV) sites at Kwajalein, Republic of the Marshall Islands (KWAJ) and Melbourne, Florida (MELB). The relative performance of each of these satellite estimates is examined via comparisons with space- and time-coincident GV radar-based rain rate estimates. Because underlying surface terrain is known to affect the relative performance of the satellite algorithms, the data for MELB was further stratified into ocean, land and coast categories using a 0.25deg terrain mask. Of all the satellite estimates compared in this study, TMI and AMSR-E exhibited considerably higher correlations and skills in estimating/observing surface precipitation. While SSM/I and AMSU-B exhibited lower correlations and skills for each of the different terrain categories, the SSM/I absolute biases trended slightly lower than AMSR-E over ocean, where the observations from both emission and scattering channels were used in the retrievals. AMSU-B exhibited the least skill relative to GV in all of the relevant statistical categories, and an anomalous spike was observed in the probability distribution functions near 1.0 mm/hr. This statistical artifact appears to be related to attempts by algorithm developers to include some lighter rain rates, not easily detectable by its scatter-only frequencies. AMSU-B, however, agreed well with GV when the matching data was analyzed on monthly scales. These results signal developers of global rainfall products, such as the TRMM Multi-Satellite Precipitation Analysis (TMPA), and the Climate Data Center s Morphing (CMORPH) technique, that care must be taken when incorporating data from these input satellite estimates in order to provide the highest quality estimates in their products.
Estimating time-based instantaneous total mortality rate based on the age-structured abundance index
NASA Astrophysics Data System (ADS)
Wang, Yingbin; Jiao, Yan
2015-05-01
The instantaneous total mortality rate ( Z) of a fish population is one of the important parameters in fisheries stock assessment. The estimation of Z is crucial to fish population dynamics analysis, abundance and catch forecast, and fisheries management. A catch curve-based method for estimating time-based Z and its change trend from catch per unit effort (CPUE) data of multiple cohorts is developed. Unlike the traditional catch-curve method, the method developed here does not need the assumption of constant Z throughout the time, but the Z values in n continuous years are assumed constant, and then the Z values in different n continuous years are estimated using the age-based CPUE data within these years. The results of the simulation analyses show that the trends of the estimated time-based Z are consistent with the trends of the true Z, and the estimated rates of change from this approach are close to the true change rates (the relative differences between the change rates of the estimated Z and the true Z are smaller than 10%). Variations of both Z and recruitment can affect the estimates of Z value and the trend of Z. The most appropriate value of n can be different given the effects of different factors. Therefore, the appropriate value of n for different fisheries should be determined through a simulation analysis as we demonstrated in this study. Further analyses suggested that selectivity and age estimation are also two factors that can affect the estimated Z values if there is error in either of them, but the estimated change rates of Z are still close to the true change rates. We also applied this approach to the Atlantic cod ( Gadus morhua) fishery of eastern Newfoundland and Labrador from 1983 to 1997, and obtained reasonable estimates of time-based Z.
Assessment of GNSS-based height data of multiple ships for measuring and forecasting great tsunamis
NASA Astrophysics Data System (ADS)
Inazu, Daisuke; Waseda, Takuji; Hibiya, Toshiyuki; Ohta, Yusaku
2016-12-01
Ship height positioning by the Global Navigation Satellite System (GNSS) was investigated for measuring and forecasting great tsunamis. We first examined GNSS height-positioning data of a navigating vessel. If we use the kinematic precise point positioning (PPP) method, tsunamis greater than 10-1 m will be detected by ship height positioning. Based on Automatic Identification System (AIS) data, we found that tens of cargo ships and tankers are usually identified to navigate over the Nankai Trough, southwest Japan. We assumed that a future Nankai Trough great earthquake tsunami will be observed by the kinematic PPP height positioning of an AIS-derived ship distribution, and examined the tsunami forecast capability of the offshore tsunami measurements based on the PPP-based ship height. A method to estimate the initial tsunami height distribution using offshore tsunami observations was used for forecasting. Tsunami forecast tests were carried out using simulated tsunami data by the PPP-based ship height of 92 cargo ships/tankers, and by currently operating deep-sea pressure and Global Positioning System (GPS) buoy observations at 71 stations over the Nankai Trough. The forecast capability using the PPP-based height of the 92 ships was shown to be comparable to or better than that using the operating offshore observatories at the 71 stations. We suppose that, immediately after the occurrence of a great earthquake, stations receiving successive ship information (AIS data) along certain areas of the coast would fail to acquire ship data due to strong ground shaking, especially near the epicenter. Such a situation would significantly deteriorate the tsunami-forecast capability using ship data. On the other hand, operational real-time analysis of seismic/geodetic data would be carried out for estimating a tsunamigenic fault model. Incorporating the seismic/geodetic fault model estimation into the tsunami forecast above possibly compensates for the deteriorated forecast capability.
Verification of operational solar flare forecast: Case of Regional Warning Center Japan
NASA Astrophysics Data System (ADS)
Kubo, Yûki; Den, Mitsue; Ishii, Mamoru
2017-08-01
In this article, we discuss a verification study of an operational solar flare forecast in the Regional Warning Center (RWC) Japan. The RWC Japan has been issuing four-categorical deterministic solar flare forecasts for a long time. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated some conventional scalar verification measures with 95% confidence intervals. We also estimated a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitely that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose to use a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecast, a set of proposed verification measures is a frequency bias for bias, proportion correct and critical success index for accuracy, probability of detection for discrimination, false alarm ratio for reliability, Peirce skill score for forecast skill, and symmetric extremal dependence index for association. For multi-categorical forecast, we propose a set of verification measures as marginal distributions of forecast and observation for bias, proportion correct for accuracy, correlation coefficient and joint probability distribution for association, the likelihood distribution for discrimination, the calibration distribution for reliability and resolution, and the Gandin-Murphy-Gerrity score and judgment skill score for skill.
NOAA Propagation Database Value in Tsunami Forecast Guidance
NASA Astrophysics Data System (ADS)
Eble, M. C.; Wright, L. M.
2016-02-01
The National Oceanic and Atmospheric Administration (NOAA) Center for Tsunami Research (NCTR) has developed a tsunami forecasting capability that combines a graphical user interface with data ingestion and numerical models to produce estimates of tsunami wave arrival times, amplitudes, current or water flow rates, and flooding at specific coastal communities. The capability integrates several key components: deep-ocean observations of tsunamis in real-time, a basin-wide pre-computed propagation database of water level and flow velocities based on potential pre-defined seismic unit sources, an inversion or fitting algorithm to refine the tsunami source based on the observations during an event, and tsunami forecast models. As tsunami waves propagate across the ocean, observations from the deep ocean are automatically ingested into the application in real-time to better define the source of the tsunami itself. Since passage of tsunami waves over a deep ocean reporting site is not immediate, we explore the value of the NOAA propagation database in providing placeholder forecasts in advance of deep ocean observations. The propagation database consists of water elevations and flow velocities pre-computed for 50 x 100 [km] unit sources in a continuous series along all known ocean subduction zones. The 2011 Japan Tohoku tsunami is presented as the case study
Classifying Volcanic Activity Using an Empirical Decision Making Algorithm
NASA Astrophysics Data System (ADS)
Junek, W. N.; Jones, W. L.; Woods, M. T.
2012-12-01
Detection and classification of developing volcanic activity is vital to eruption forecasting. Timely information regarding an impending eruption would aid civil authorities in determining the proper response to a developing crisis. In this presentation, volcanic activity is characterized using an event tree classifier and a suite of empirical statistical models derived through logistic regression. Forecasts are reported in terms of the United States Geological Survey (USGS) volcano alert level system. The algorithm employs multidisciplinary data (e.g., seismic, GPS, InSAR) acquired by various volcano monitoring systems and source modeling information to forecast the likelihood that an eruption, with a volcanic explosivity index (VEI) > 1, will occur within a quantitatively constrained area. Logistic models are constructed from a sparse and geographically diverse dataset assembled from a collection of historic volcanic unrest episodes. Bootstrapping techniques are applied to the training data to allow for the estimation of robust logistic model coefficients. Cross validation produced a series of receiver operating characteristic (ROC) curves with areas ranging between 0.78-0.81, which indicates the algorithm has good predictive capabilities. The ROC curves also allowed for the determination of a false positive rate and optimum detection for each stage of the algorithm. Forecasts for historic volcanic unrest episodes in North America and Iceland were computed and are consistent with the actual outcome of the events.
NASA Astrophysics Data System (ADS)
Shair, Syazreen Niza; Yusof, Aida Yuzi; Asmuni, Nurin Haniah
2017-05-01
Coherent mortality forecasting models have recently received increasing attention particularly in their application to sub-populations. The advantage of coherent models over independent models is the ability to forecast a non-divergent mortality for two or more sub-populations. One of the coherent models was recently developed by [1] known as the product-ratio model. This model is an extension version of the functional independent model from [2]. The product-ratio model has been applied in a developed country, Australia [1] and has been extended in a developing nation, Malaysia [3]. While [3] accounted for coherency of mortality rates between gender and ethnic group, the coherency between states in Malaysia has never been explored. This paper will forecast the mortality rates of Malaysian sub-populations according to states using the product ratio coherent model and its independent version— the functional independent model. The forecast accuracies of two different models are evaluated using the out-of-sample error measurements— the mean absolute forecast error (MAFE) for age-specific death rates and the mean forecast error (MFE) for the life expectancy at birth. We employ Malaysian mortality time series data from 1991 to 2014, segregated by age, gender and states.
Probabilistic rainfall warning system with an interactive user interface
NASA Astrophysics Data System (ADS)
Koistinen, Jarmo; Hohti, Harri; Kauhanen, Janne; Kilpinen, Juha; Kurki, Vesa; Lauri, Tuomo; Nurmi, Pertti; Rossi, Pekka; Jokelainen, Miikka; Heinonen, Mari; Fred, Tommi; Moisseev, Dmitri; Mäkelä, Antti
2013-04-01
A real time 24/7 automatic alert system is in operational use at the Finnish Meteorological Institute (FMI). It consists of gridded forecasts of the exceedance probabilities of rainfall class thresholds in the continuous lead time range of 1 hour to 5 days. Nowcasting up to six hours applies ensemble member extrapolations of weather radar measurements. With 2.8 GHz processors using 8 threads it takes about 20 seconds to generate 51 radar based ensemble members in a grid of 760 x 1226 points. Nowcasting exploits also lightning density and satellite based pseudo rainfall estimates. The latter ones utilize convective rain rate (CRR) estimate from Meteosat Second Generation. The extrapolation technique applies atmospheric motion vectors (AMV) originally developed for upper wind estimation with satellite images. Exceedance probabilities of four rainfall accumulation categories are computed for the future 1 h and 6 h periods and they are updated every 15 minutes. For longer forecasts exceedance probabilities are calculated for future 6 and 24 h periods during the next 4 days. From approximately 1 hour to 2 days Poor man's Ensemble Prediction System (PEPS) is used applying e.g. the high resolution short range Numerical Weather Prediction models HIRLAM and AROME. The longest forecasts apply EPS data from the European Centre for Medium Range Weather Forecasts (ECMWF). The blending of the ensemble sets from the various forecast sources is performed applying mixing of accumulations with equal exceedance probabilities. The blending system contains a real time adaptive estimator of the predictability of radar based extrapolations. The uncompressed output data are written to file for each member, having total size of 10 GB. Ensemble data from other sources (satellite, lightning, NWP) are converted to the same geometry as the radar data and blended as was explained above. A verification system utilizing telemetering rain gauges has been established. Alert dissemination e.g. for citizens and professional end users applies SMS messages and, in near future, smartphone maps. The present interactive user interface facilitates free selection of alert sites and two warning thresholds (any rain, heavy rain) at any location in Finland. The pilot service was tested by 1000-3000 users during summers 2010 and 2012. As an example of dedicated end-user services gridded exceedance scenarios (of probabilities 5 %, 50 % and 90 %) of hourly rainfall accumulations for the next 3 hours have been utilized as an online input data for the influent model at the Greater Helsinki Wastewater Treatment Plant.
NASA Astrophysics Data System (ADS)
Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent
2014-05-01
The Short-Term Ensemble Prediction System (STEPS) is a probabilistic precipitation nowcasting scheme developed at the Australian Bureau of Meteorology in collaboration with the UK Met Office. In order to account for the multiscaling nature of rainfall structures, the radar field is decomposed into an 8 levels multiplicative cascade using a Fast Fourier Transform. The cascade is advected using the velocity field estimated with optical flow and evolves stochastically according to a hierarchy of auto-regressive processes. This allows reproducing the empirical observation that the rate of temporal evolution of the small scales is faster than the large scales. The uncertainty in radar rainfall measurement and the unknown future development of the velocity field are also considered by stochastic modelling in order to reflect their typical spatial and temporal variability. Recently, a 4 years national research program has been initiated by the University of Leuven, the Royal Meteorological Institute (RMI) of Belgium and 3 other partners: PLURISK ("forecasting and management of extreme rainfall induced risks in the urban environment"). The project deals with the nowcasting of rainfall and subsequent urban inundations, as well as socio-economic risk quantification, communication, warning and prevention. At the urban scale it is widely recognized that the uncertainty of hydrological and hydraulic models is largely driven by the input rainfall estimation and forecast uncertainty. In support to the PLURISK project the RMI aims at integrating STEPS in the current operational deterministic precipitation nowcasting system INCA-BE (Integrated Nowcasting through Comprehensive Analysis). This contribution will illustrate examples of STEPS ensemble and probabilistic nowcasts for a few selected case studies of stratiform and convective rain in Belgium. The paper focuses on the development of STEPS products for potential hydrological users and a preliminary verification of the nowcasts, especially to analyze the spatial distribution of forecast errors. The analysis of nowcast biases reveals the locations where the convective initiation, rainfall growth and decay processes significantly reduce the forecast accuracy, but also points out the need for improving the radar-based quantitative precipitation estimation product that is used both to generate and verify the nowcasts. The collection of fields of verification statistics is implemented using an online update strategy, which potentially enables the system to learn from forecast errors as the archive of nowcasts grows. The study of the spatial or temporal distribution of nowcast errors is a key step to convey to the users an overall estimation of the nowcast accuracy and to drive future model developments.
Qader, Sarchil Hama; Dash, Jadunandan; Atkinson, Peter M
2018-02-01
Crop production and yield estimation using remotely sensed data have been studied widely, but such information is generally scarce in arid and semi-arid regions. In these regions, inter-annual variation in climatic factors (such as rainfall) combined with anthropogenic factors (such as civil war) pose major risks to food security. Thus, an operational crop production estimation and forecasting system is required to help decision-makers to make early estimates of potential food availability. Data from NASA's MODIS with official crop statistics were combined to develop an empirical regression-based model to forecast winter wheat and barley production in Iraq. The study explores remotely sensed indices representing crop productivity over the crop growing season to find the optimal correlation with crop production. The potential of three different remotely sensed indices, and information related to the phenology of crops, for forecasting crop production at the governorate level was tested and their results were validated using the leave-one-year-out approach. Despite testing several methodological approaches, and extensive spatio-temporal analysis, this paper depicts the difficulty in estimating crop yield on an annual base using current satellite low-resolution data. However, more precise estimates of crop production were possible. The result of the current research implies that the date of the maximum vegetation index (VI) offered the most accurate forecast of crop production with an average R 2 =0.70 compared to the date of MODIS EVI (Avg R 2 =0.68) and a NPP (Avg R 2 =0.66). When winter wheat and barley production were forecasted using NDVI, EVI and NPP and compared to official statistics, the relative error ranged from -20 to 20%, -45 to 28% and -48 to 22%, respectively. The research indicated that remotely sensed indices could characterize and forecast crop production more accurately than simple cropping area, which was treated as a null model against which to evaluate the proposed approach. Copyright © 2017 Elsevier B.V. All rights reserved.
Forecasting outbreaks of the Douglas-fir tussock moth from lower crown cocoon samples.
Richard R. Mason; Donald W. Scott; H. Gene Paul
1993-01-01
A predictive technique using a simple linear regression was developed to forecast the midcrown density of small tussock moth larvae from estimates of cocoon density in the previous generation. The regression estimator was derived from field samples of cocoons and larvae taken from a wide range of nonoutbreak tussock moth populations. The accuracy of the predictions was...
Modelling eWork in Europe: Estimates, Models and Forecasts from the EMERGENCE Project. IES Report.
ERIC Educational Resources Information Center
Bates, P.; Huws, U.
A study combined results of a survey of employers in 18 European countries to establish the extent to which they are currently using eWork with European official statistics to develop models, estimates, and forecasts of the numbers of eWorkers in Europe. These four types of "individual" eWork were identified: telehomeworking;…
enter or select the go button to submit request City, St Go Science Research and Collaboration Hydrology River Forecasts, January 2002 AMS Short Course on Quantitative Precipitation Estimation and Forecasting
Operational foreshock forecasting: Fifteen years after
NASA Astrophysics Data System (ADS)
Ogata, Y.
2010-12-01
We are concerned with operational forecasting of the probability that events are foreshocks of a forthcoming earthquake that is significantly larger (mainshock). Specifically, we define foreshocks as the preshocks substantially smaller than the mainshock by a magnitude gap of 0.5 or larger. The probability gain of foreshock forecast is extremely high compare to long-term forecast by renewal processes or various alarm-based intermediate-term forecasts because of a large event’s low occurrence rate in a short period and a narrow target region. Thus, it is desired to establish operational foreshock probability forecasting as seismologists have done for aftershocks. When a series of earthquakes occurs in a region, we attempt to discriminate foreshocks from a swarm or mainshock-aftershock sequence. Namely, after real time identification of an earthquake cluster using methods such as the single-link algorithm, the probability is calculated by applying statistical features that discriminate foreshocks from other types of clusters, by considering the events' stronger proximity in time and space and tendency towards chronologically increasing magnitudes. These features were modeled for probability forecasting and the coefficients of the model were estimated in Ogata et al. (1996) for the JMA hypocenter data (M≧4, 1926-1993). Currently, fifteen years has passed since the publication of the above-stated work so that we are able to present the performance and validation of the forecasts (1994-2009) by using the same model. Taking isolated events into consideration, the probability of the first events in a potential cluster being a foreshock vary in a range between 0+% and 10+% depending on their locations. This conditional forecasting performs significantly better than the unconditional (average) foreshock probability of 3.7% throughout Japan region. Furthermore, when we have the additional events in a cluster, the forecast probabilities range more widely from nearly 0% to about 40% depending on the discrimination features among the events in the cluster. This conditional forecasting further performs significantly better than the unconditional foreshock probability of 7.3%, which is the average probability of the plural events in the earthquake clusters. Indeed, the frequency ratios of the actual foreshocks are consistent with the forecasted probabilities. Reference: Ogata, Y., Utsu, T. and Katsura, K. (1996). Statistical discrimination of foreshocks from other earthquake clusters, Geophys. J. Int. 127, 17-30.
Inclusion of biomass burning in WRF-Chem: Impact of wildfires on weather forecasts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grell, G. A.; Freitas, Saulo; Stuefer, Martin
2011-06-06
A plume rise algorithm for wildfires was included in WRF-Chem, and applied to look at the impact of intense wildfires during the 2004 Alaska wildfire season on weather forecasts using model resolutions of 10km and 2km. Biomass burning emissions were estimated using a biomass burning emissions model. In addition, a 1-D, time-dependent cloud model was used online in WRF-Chem to estimate injection heights as well as the final emission rates. It was shown that with the inclusion of the intense wildfires of the 2004 fire season in the model simulations, the interaction of the aerosols with the atmospheric radiation ledmore » to significant modifications of vertical profiles of temperature and moisture in cloud-free areas. On the other hand, when clouds were present, the high concentrations of fine aerosol (PM2.5) and the resulting large numbers of Cloud Condensation Nuclei (CCN) had a strong impact on clouds and microphysics, with decreased precipitation coverage and precipitation amounts during the first 12 hours of the integration, but significantly stronger storms during the afternoon hours.« less
NASA Astrophysics Data System (ADS)
Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K.
2015-08-01
Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and benefits gained through hydropower generation. Improving hourly reservoir inflow forecasts over a 24 h lead time is considered within the day-ahead (Elspot) market of the Nordic exchange market. A complementary modelling framework presents an approach for improving real-time forecasting without needing to modify the pre-existing forecasting model, but instead formulating an independent additive or complementary model that captures the structure the existing operational model may be missing. We present here the application of this principle for issuing improved hourly inflow forecasts into hydropower reservoirs over extended lead times, and the parameter estimation procedure reformulated to deal with bias, persistence and heteroscedasticity. The procedure presented comprises an error model added on top of an unalterable constant parameter conceptual model. This procedure is applied in the 207 km2 Krinsvatn catchment in central Norway. The structure of the error model is established based on attributes of the residual time series from the conceptual model. Besides improving forecast skills of operational models, the approach estimates the uncertainty in the complementary model structure and produces probabilistic inflow forecasts that entrain suitable information for reducing uncertainty in the decision-making processes in hydropower systems operation. Deterministic and probabilistic evaluations revealed an overall significant improvement in forecast accuracy for lead times up to 17 h. Evaluation of the percentage of observations bracketed in the forecasted 95 % confidence interval indicated that the degree of success in containing 95 % of the observations varies across seasons and hydrologic years.
A Copula-Based Conditional Probabilistic Forecast Model for Wind Power Ramps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Brian S; Krishnan, Venkat K; Zhang, Jie
Efficient management of wind ramping characteristics can significantly reduce wind integration costs for balancing authorities. By considering the stochastic dependence of wind power ramp (WPR) features, this paper develops a conditional probabilistic wind power ramp forecast (cp-WPRF) model based on Copula theory. The WPRs dataset is constructed by extracting ramps from a large dataset of historical wind power. Each WPR feature (e.g., rate, magnitude, duration, and start-time) is separately forecasted by considering the coupling effects among different ramp features. To accurately model the marginal distributions with a copula, a Gaussian mixture model (GMM) is adopted to characterize the WPR uncertaintymore » and features. The Canonical Maximum Likelihood (CML) method is used to estimate parameters of the multivariable copula. The optimal copula model is chosen based on the Bayesian information criterion (BIC) from each copula family. Finally, the best conditions based cp-WPRF model is determined by predictive interval (PI) based evaluation metrics. Numerical simulations on publicly available wind power data show that the developed copula-based cp-WPRF model can predict WPRs with a high level of reliability and sharpness.« less
Data Assimilation at FLUXNET to Improve Models towards Ecological Forecasting (Invited)
NASA Astrophysics Data System (ADS)
Luo, Y.
2009-12-01
Dramatically increased volumes of data from observational and experimental networks such as FLUXNET call for transformation of ecological research to increase its emphasis on quantitative forecasting. Ecological forecasting will also meet the societal need to develop better strategies for natural resource management in a world of ongoing global change. Traditionally, ecological forecasting has been based on process-based models, informed by data in largely ad hoc ways. Although most ecological models incorporate some representation of mechanistic processes, today’s ecological models are generally not adequate to quantify real-world dynamics and provide reliable forecasts with accompanying estimates of uncertainty. A key tool to improve ecological forecasting is data assimilation, which uses data to inform initial conditions and to help constrain a model during simulation to yield results that approximate reality as closely as possible. In an era with dramatically increased availability of data from observational and experimental networks, data assimilation is a key technique that helps convert the raw data into ecologically meaningful products so as to accelerate our understanding of ecological processes, test ecological theory, forecast changes in ecological services, and better serve the society. This talk will use examples to illustrate how data from FLUXNET have been assimilated with process-based models to improve estimates of model parameters and state variables; to quantify uncertainties in ecological forecasting arising from observations, models and their interactions; and to evaluate information contributions of data and model toward short- and long-term forecasting of ecosystem responses to global change.
Projecting 21st century coastal cliff retreat in Southern California
NASA Astrophysics Data System (ADS)
Limber, P. W.; Barnard, P.; Erikson, L. H.; Vitousek, S.
2016-12-01
In California, sea level is expected to rise over 1 m by 2100, with extreme projections approaching 3 m. Sea level rise (SLR) increases the frequency, severity, and duration of wave impacts on coastal cliffs, potentially accelerating cliff retreat rates. To assess the future risk to cliff-top infrastructure, densely populated Southern California cities like Los Angeles and San Diego require estimates of coastal retreat over long time (multi-decadal) and large spatial (>100 km) scales. We developed a suite of eight coastal cliff retreat models, ranging in complexity from empirical 1-D representations of cliff response to wave impacts to more intricate 2-D process-based models integrated with artificial neural networks. The ensemble produces a comprehensive estimate of time-averaged coastal cliff retreat with uncertainty, is applicable to different geological environments, and is flexible in application depending on processing power, available data, and/or available time (e.g. if processing power and time are limited, the fast 1-D models can be used as a `rapid assessment' tool). Global-to-local nested wave models provided the hindcasts (1980-2010) and forecasts (2010-2100) used to force the models, and waves were applied in combination with eight SLR scenarios ranging from 0.25 m to 2 m. In the more detailed models, tides, non-tidal residuals, and storm surge were included for the hindcast and forecast periods. For model calibration, a new automated cliff edge extraction routine was used to estimate historical cliff retreat rates from LiDAR data. Initial model application to Southern California suggests that 1 m of SLR during the 21st century will cause cliff retreat rates to increase on average by over 50% relative to historical rates. Model results also demonstrate how small-scale, episodic cliff failure events can coalesce through time into spatially uniform, long-term cliff retreat signals.
Bayesian Processor of Output for Probabilistic Quantitative Precipitation Forecasting
NASA Astrophysics Data System (ADS)
Krzysztofowicz, R.; Maranzano, C. J.
2006-05-01
The Bayesian Processor of Output (BPO) is a new, theoretically-based technique for probabilistic forecasting of weather variates. It processes output from a numerical weather prediction (NWP) model and optimally fuses it with climatic data in order to quantify uncertainty about a predictand. The BPO is being tested by producing Probabilistic Quantitative Precipitation Forecasts (PQPFs) for a set of climatically diverse stations in the contiguous U.S. For each station, the PQPFs are produced for the same 6-h, 12-h, and 24-h periods up to 84- h ahead for which operational forecasts are produced by the AVN-MOS (Model Output Statistics technique applied to output fields from the Global Spectral Model run under the code name AVN). The inputs into the BPO are estimated as follows. The prior distribution is estimated from a (relatively long) climatic sample of the predictand; this sample is retrieved from the archives of the National Climatic Data Center. The family of the likelihood functions is estimated from a (relatively short) joint sample of the predictor vector and the predictand; this sample is retrieved from the same archive that the Meteorological Development Laboratory of the National Weather Service utilized to develop the AVN-MOS system. This talk gives a tutorial introduction to the principles and procedures behind the BPO, and highlights some results from the testing: a numerical example of the estimation of the BPO, and a comparative verification of the BPO forecasts and the MOS forecasts. It concludes with a list of demonstrated attributes of the BPO (vis- à-vis the MOS): more parsimonious definitions of predictors, more efficient extraction of predictive information, better representation of the distribution function of predictand, and equal or better performance (in terms of calibration and informativeness).
NASA Astrophysics Data System (ADS)
Rossa, Andrea M.; Laudanna Del Guerra, Franco; Borga, Marco; Zanon, Francesco; Settin, Tommaso; Leuenberger, Daniel
2010-11-01
SummaryThis study aims to assess the feasibility of assimilating carefully checked radar rainfall estimates into a numerical weather prediction (NWP) to extend the forecasting lead time for an extreme flash flood. The hydro-meteorological modeling chain includes the convection-permitting NWP model COSMO-2 and a coupled hydrological-hydraulic model. Radar rainfall estimates are assimilated into the NWP model via the latent heat nudging method. The study is focused on 26 September 2007 extreme flash flood which impacted the coastal area of North-eastern Italy around Venice. The hydro-meteorological modeling system is implemented over the 90 km2 Dese river basin draining to the Venice Lagoon. The radar rainfall observations are carefully checked for artifacts, including rain-induced signal attenuation, by means of physics-based correction procedures and comparison with a dense network of raingauges. The impact of the radar rainfall estimates in the assimilation cycle of the NWP model is very significant. The main individual organized convective systems are successfully introduced into the model state, both in terms of timing and localization. Also, high-intensity incorrectly localized precipitation is correctly reduced to about the observed levels. On the other hand, the highest rainfall intensities computed after assimilation underestimate the observed values by 20% and 50% at a scale of 20 km and 5 km, respectively. The positive impact of assimilating radar rainfall estimates is carried over into the free forecast for about 2-5 h, depending on when the forecast was started. The positive impact is larger when the main mesoscale convective system is present in the initial conditions. The improvements in the precipitation forecasts are propagated to the river flow simulations, with an extension of the forecasting lead time up to 3 h.
Ensemble forecast of human West Nile virus cases and mosquito infection rates
NASA Astrophysics Data System (ADS)
Defelice, Nicholas B.; Little, Eliza; Campbell, Scott R.; Shaman, Jeffrey
2017-02-01
West Nile virus (WNV) is now endemic in the continental United States; however, our ability to predict spillover transmission risk and human WNV cases remains limited. Here we develop a model depicting WNV transmission dynamics, which we optimize using a data assimilation method and two observed data streams, mosquito infection rates and reported human WNV cases. The coupled model-inference framework is then used to generate retrospective ensemble forecasts of historical WNV outbreaks in Long Island, New York for 2001-2014. Accurate forecasts of mosquito infection rates are generated before peak infection, and >65% of forecasts accurately predict seasonal total human WNV cases up to 9 weeks before the past reported case. This work provides the foundation for implementation of a statistically rigorous system for real-time forecast of seasonal outbreaks of WNV.
Ensemble forecast of human West Nile virus cases and mosquito infection rates.
DeFelice, Nicholas B; Little, Eliza; Campbell, Scott R; Shaman, Jeffrey
2017-02-24
West Nile virus (WNV) is now endemic in the continental United States; however, our ability to predict spillover transmission risk and human WNV cases remains limited. Here we develop a model depicting WNV transmission dynamics, which we optimize using a data assimilation method and two observed data streams, mosquito infection rates and reported human WNV cases. The coupled model-inference framework is then used to generate retrospective ensemble forecasts of historical WNV outbreaks in Long Island, New York for 2001-2014. Accurate forecasts of mosquito infection rates are generated before peak infection, and >65% of forecasts accurately predict seasonal total human WNV cases up to 9 weeks before the past reported case. This work provides the foundation for implementation of a statistically rigorous system for real-time forecast of seasonal outbreaks of WNV.
An experimental system for flood risk forecasting at global scale
NASA Astrophysics Data System (ADS)
Alfieri, L.; Dottori, F.; Kalas, M.; Lorini, V.; Bianchi, A.; Hirpa, F. A.; Feyen, L.; Salamon, P.
2016-12-01
Global flood forecasting and monitoring systems are nowadays a reality and are being applied by an increasing range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk based forecasts, combining streamflow estimations with expected inundated areas and flood impacts. To this end, we have developed an experimental procedure for near-real time flood mapping and impact assessment based on the daily forecasts issued by the Global Flood Awareness System (GloFAS). The methodology translates GloFAS streamflow forecasts into event-based flood hazard maps based on the predicted flow magnitude and the forecast lead time and a database of flood hazard maps with global coverage. Flood hazard maps are then combined with exposure and vulnerability information to derive flood risk. Impacts of the forecasted flood events are evaluated in terms of flood prone areas, potential economic damage, and affected population, infrastructures and cities. To further increase the reliability of the proposed methodology we integrated model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification of impact forecasts. The preliminary tests provided good results and showed the potential of the developed real-time operational procedure in helping emergency response and management. In particular, the link with social media is crucial for improving the accuracy of impact predictions.
Liu, Yan; Watson, Stella C.; Gettings, Jenna R.; Lund, Robert B.; Nordone, Shila K.; McMahan, Christopher S.
2017-01-01
This paper forecasts the 2016 canine Anaplasma spp. seroprevalence in the United States from eight climate, geographic and societal factors. The forecast’s construction and an assessment of its performance are described. The forecast is based on a spatial-temporal conditional autoregressive model fitted to over 11 million Anaplasma spp. seroprevalence test results for dogs conducted in the 48 contiguous United States during 2011–2015. The forecast uses county-level data on eight predictive factors, including annual temperature, precipitation, relative humidity, county elevation, forestation coverage, surface water coverage, population density and median household income. Non-static factors are extrapolated into the forthcoming year with various statistical methods. The fitted model and factor extrapolations are used to estimate next year’s regional prevalence. The correlation between the observed and model-estimated county-by-county Anaplasma spp. seroprevalence for the five-year period 2011–2015 is 0.902, demonstrating reasonable model accuracy. The weighted correlation (accounting for different sample sizes) between 2015 observed and forecasted county-by-county Anaplasma spp. seroprevalence is 0.987, exhibiting that the proposed approach can be used to accurately forecast Anaplasma spp. seroprevalence. The forecast presented herein can a priori alert veterinarians to areas expected to see Anaplasma spp. seroprevalence beyond the accepted endemic range. The proposed methods may prove useful for forecasting other diseases. PMID:28738085
Economic indicators selection for crime rates forecasting using cooperative feature selection
NASA Astrophysics Data System (ADS)
Alwee, Razana; Shamsuddin, Siti Mariyam Hj; Salleh Sallehuddin, Roselina
2013-04-01
Features selection in multivariate forecasting model is very important to ensure that the model is accurate. The purpose of this study is to apply the Cooperative Feature Selection method for features selection. The features are economic indicators that will be used in crime rate forecasting model. The Cooperative Feature Selection combines grey relational analysis and artificial neural network to establish a cooperative model that can rank and select the significant economic indicators. Grey relational analysis is used to select the best data series to represent each economic indicator and is also used to rank the economic indicators according to its importance to the crime rate. After that, the artificial neural network is used to select the significant economic indicators for forecasting the crime rates. In this study, we used economic indicators of unemployment rate, consumer price index, gross domestic product and consumer sentiment index, as well as data rates of property crime and violent crime for the United States. Levenberg-Marquardt neural network is used in this study. From our experiments, we found that consumer price index is an important economic indicator that has a significant influence on the violent crime rate. While for property crime rate, the gross domestic product, unemployment rate and consumer price index are the influential economic indicators. The Cooperative Feature Selection is also found to produce smaller errors as compared to Multiple Linear Regression in forecasting property and violent crime rates.
Physics-Based and Statistical Forecasting in Slowly Stressed Environments
NASA Astrophysics Data System (ADS)
Segou, M.; Deschamps, A.
2013-12-01
We perform a retrospective forecasting experiment between 1995-2012, comparing the predictive power of physics-based and statistical models in Corinth Gulf (Central Greece), which is the fastest continental rift in the world with extension rates 11-15 mm/yr, but also at least three times lower than the motion accommodated by the San Andreas Fault System (~40 mm/yr). The seismicity of the western Corinth gulf has been characterized by significant historical events (1817 M6.6, 1861 M6.7, 1889 M7.0) whereas the modern instrumental catalog (post-1964) reveals one major event, the 1995 Aigio M6.4 (15/06/1995) together with several periods of increased microseismic activity, usually lasting few months and possibly related with fluid diffusion. We examine six predictive models, three based on the combination of Coulomb stress changes and rate-and-state theory (CRS), two epidemic type aftershock sequence (ETAS) models and one hybrid CRS-ETAS (h-ETAS) model. We investigate whether the above forecast models can adequately describe the episodic swarm activity within the gulf. Even though Corinth gulf has been studied extensively in the past there is still today a debate whether earthquake activity is related with the existence of either a shallow dipping structure or steeply dipping normal faults. In the light of the above statement, two CRS realization are based on resolving Coulomb stress changes on specified receiver faults, expressing the aforementioned structural models, whereas the third CRS model uses optimally-oriented for failure planes. In our CRS implementation we account for stress changes following all major ruptures within our testing phase with M greater than 4.5. We also estimate fault constitutive parameters from modeling the response to major earthquakes at the vicinity of the gulf (Ασ=0.2, stressing rate 0.02 bar/yr). The ETAS parameters are taken as the maximum likelihood estimates derived from stochastic declustering of the modern seismicity catalog with minimum triggering magnitude M2.5. We implement likelihood tests to evaluate our forecasts for their spatial consistency and for the total amount of predicted versus observed events with M greater than 3.0 in 10-day time intervals in two distinct evaluation phases. The first evaluation phase focuses on the Aigio 1995 aftershock sequence (15/06/1995, M6.4) whereas the second covers the period between September 2006-May 2007, characterized for the intense swarm activity.We find that (1) geology based CRS models are preferred over optimally oriented planes (2) CRS models are consistent forecasters (60-70%) of transient seismicity, having in most cases comparable performance with ETAS models (3) swarms are not triggered by static stress changes of preceding local events.
Benefits of seasonal forecasts of crop yields
NASA Astrophysics Data System (ADS)
Sakurai, G.; Okada, M.; Nishimori, M.; Yokozawa, M.
2017-12-01
Major factors behind recent fluctuations in food prices include increased biofuel production and oil price fluctuations. In addition, several extreme climate events that reduced worldwide food production coincided with upward spikes in food prices. The stabilization of crop yields is one of the most important tasks to stabilize food prices and thereby enhance food security. Recent development of technologies related to crop modeling and seasonal weather forecasting has made it possible to forecast future crop yields for maize and soybean. However, the effective use of these technologies remains limited. Here we present the potential benefits of seasonal crop-yield forecasts on a global scale for choice of planting day. For this purpose, we used a model (PRYSBI-2) that can well replicate past crop yields both for maize and soybean. This model system uses a Bayesian statistical approach to estimate the parameters of a basic process-based model of crop growth. The spatial variability of model parameters was considered by estimating the posterior distribution of the parameters from historical yield data by using the Markov-chain Monte Carlo (MCMC) method with a resolution of 1.125° × 1.125°. The posterior distributions of model parameters were estimated for each spatial grid with 30 000 MCMC steps of 10 chains each. By using this model and the estimated parameter distributions, we were able to estimate not only crop yield but also levels of associated uncertainty. We found that the global average crop yield increased about 30% as the result of the optimal selection of planting day and that the seasonal forecast of crop yield had a large benefit in and near the eastern part of Brazil and India for maize and the northern area of China for soybean. In these countries, the effects of El Niño and Indian Ocean dipole are large. The results highlight the importance of developing a system to forecast global crop yields.
Post-processing of multi-model ensemble river discharge forecasts using censored EMOS
NASA Astrophysics Data System (ADS)
Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian
2014-05-01
When forecasting water levels and river discharge, ensemble weather forecasts are used as meteorological input to hydrologic process models. As hydrologic models are imperfect and the input ensembles tend to be biased and underdispersed, the output ensemble forecasts for river runoff typically are biased and underdispersed, too. Thus, statistical post-processing is required in order to achieve calibrated and sharp predictions. Standard post-processing methods such as Ensemble Model Output Statistics (EMOS) that have their origins in meteorological forecasting are now increasingly being used in hydrologic applications. Here we consider two sub-catchments of River Rhine, for which the forecasting system of the Federal Institute of Hydrology (BfG) uses runoff data that are censored below predefined thresholds. To address this methodological challenge, we develop a censored EMOS method that is tailored to such data. The censored EMOS forecast distribution can be understood as a mixture of a point mass at the censoring threshold and a continuous part based on a truncated normal distribution. Parameter estimates of the censored EMOS model are obtained by minimizing the Continuous Ranked Probability Score (CRPS) over the training dataset. Model fitting on Box-Cox transformed data allows us to take account of the positive skewness of river discharge distributions. In order to achieve realistic forecast scenarios over an entire range of lead-times, there is a need for multivariate extensions. To this end, we smooth the marginal parameter estimates over lead-times. In order to obtain realistic scenarios of discharge evolution over time, the marginal distributions have to be linked with each other. To this end, the multivariate dependence structure can either be adopted from the raw ensemble like in Ensemble Copula Coupling (ECC), or be estimated from observations in a training period. The censored EMOS model has been applied to multi-model ensemble forecasts issued on a daily basis over a period of three years. For the two catchments considered, this resulted in well calibrated and sharp forecast distributions over all lead-times from 1 to 114 h. Training observations tended to be better indicators for the dependence structure than the raw ensemble.
NASA Astrophysics Data System (ADS)
Marín, Julio C.; Pozo, Diana; Curé, Michel
2015-01-01
In this work, we describe a method to estimate the precipitable water vapor (PWV) from Geostationary Observational Environmental Satellite (GOES) data at high altitude sites. The method was applied at Atacama Pathfinder Experiment (APEX) and Cerro Toco sites, located above 5000 m altitude in the Chajnantor plateau, in the north of Chile. It was validated using GOES-12 satellite data over the range 0-1.2 mm since submillimeter/millimeter astronomical observations are only useful within this PWV range. The PWV estimated from GOES and the Final Analyses (FNL) at APEX for 2007 and 2009 show root mean square error values of 0.23 mm and 0.36 mm over the ranges 0-0.4 mm and 0.4-1.2 mm, respectively. However, absolute relative errors of 51% and 33% were shown over these PWV ranges, respectively. We recommend using high-resolution thermodynamic profiles from the Global Forecast System (GFS) model to estimate the PWV from GOES data since they are available every three hours and at an earlier time than the FNL data. The estimated PWV from GOES/GFS agrees better with the observed PWV at both sites during night time. The largest errors are shown during daytime. Short-term PWV forecasts were implemented at both sites, applying a simple persistence method to the PWV estimated from GOES/GFS. The 12 h and 24 h PWV forecasts evaluated from August to October 2009 indicates that 25% of them show a very good agreement with observations whereas 50% of them show reasonably good agreement with observations. Transmission uncertainties calculated for PWV estimations and forecasts over the studied sites are larger over the range 0-0.4 mm than over the range 0.4-1.2 mm. Thus, the method can be used over the latter interval with more confidence.
Jung, Chan Sik; Koh, Sang-Hyun; Nam, Youngwoo; Ahn, Jeong Joon; Lee, Cha Young; Choi, Won I L
2015-08-01
Monochamus saltuarius Gebler is a vector that transmits the pine wood nematode, Bursaphelenchus xylophilus, to Korean white pine, Pinus koraiensis, in Korea. To reduce the damage caused by this nematode in pine forests, timely control measures are needed to suppress the cerambycid beetle population. This study sought to construct a forecasting model to predict beetle emergence based on spring temperature. Logs of Korean white pine were infested with M. saltuarius in 2009, and the infested logs were overwintered. In February 2010, infested logs were then moved into incubators held at constant temperature conditions of 16, 20, 23, 25, 27, 30 or 34°C until all adults had emerged. The developmental rate of the beetles was estimated by linear and nonlinear equations and a forecasting model for emergence of the beetle was constructed by pooling data based on normalized developmental rate. The lower threshold temperature for development was 8.3°C. The forecasting model relatively well predicted the emergence pattern of M. saltuarius collected from four areas in northern Republic of Korea. The median emergence dates predicted by the model were 2.2-5.9 d earlier than the observed median dates. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Fraser, Grant; Rohde, Ken; Silburn, Mark
2017-08-01
Dissolved inorganic nitrogen (DIN) movement from Australian sugarcane farms is believed to be a major cause of crown-of-thorns starfish outbreaks which have reduced the Great Barrier Reef coral cover by ~21% (1985-2012). We develop a daily model of DIN concentration in runoff based on >200 field monitored runoff events. Runoff DIN concentrations were related to nitrogen fertiliser application rates and decreased after application with time and cumulative rainfall. Runoff after liquid fertiliser applications had higher initial DIN concentrations, though these concentrations diminished more rapidly in comparison to granular fertiliser applications. The model was validated using an independent field dataset and provided reasonable estimates of runoff DIN concentrations based on a number of modelling efficiency score results. The runoff DIN concentration model was combined with a water balance cropping model to investigate temporal aspects of sugarcane fertiliser management. Nitrogen fertiliser application in December (start of wet season) had the highest risk of DIN movement, and this was further exacerbated in years with a climate forecast for 'wet' seasonal conditions. The potential utility of a climate forecasting system to predict forthcoming wet months and hence DIN loss risk is demonstrated. Earlier fertiliser application or reducing fertiliser application rates in seasons with a wet climate forecast may markedly reduce runoff DIN loads; however, it is recommended that these findings be tested at a broader scale.
Case Studies of Forecasting Ionospheric Total Electron Content
NASA Astrophysics Data System (ADS)
Mannucci, A. J.; Meng, X.; Verkhoglyadova, O. P.; Tsurutani, B.; McGranaghan, R. M.
2017-12-01
We report on medium-range forecast-mode runs of ionosphere-thermosphere coupled models that calculate ionospheric total electron content (TEC), focusing on low-latitude daytime conditions. A medium-range forecast-mode run refers to simulations that are driven by inputs that can be predicted 2-3 days in advance, for example based on simulations of the solar wind. We will present results from a weak geomagnetic storm caused by a high-speed solar wind stream on June 29, 2012. Simulations based on the Global Ionosphere Thermosphere Model (GITM) and the Thermosphere Ionosphere Electrodynamic General Circulation Model (TIEGCM) significantly over-estimate TEC in certain low latitude daytime regions, compared to TEC maps based on observations. We will present the results from a more intense coronal mass ejection (CME) driven storm where the simulations are closer to observations. We compare high latitude data sets to model inputs, such as auroral boundary and convection patterns, to assess the degree to which poorly estimated high latitude drivers may be the largest cause of discrepancy between simulations and observations. Our results reveal many factors that can affect the accuracy of forecasts, including the fidelity of empirical models used to estimate high latitude precipitation patterns, or observation proxies for solar EUV spectra, such as the F10.7 index. Implications for forecasts with few-day lead times are discussed
Traffic model for the satellite component of UMTS
NASA Technical Reports Server (NTRS)
Hu, Y. F.; Sheriff, R. E.
1995-01-01
An algorithm for traffic volume estimation for satellite mobile communications systems has been developed. This algorithm makes use of worldwide databases for demographic and economic data. In order to provide for such an estimation, the effects of competing services have been considered so that likely market demand can be forecasted. Different user groups of the predicted market have been identified according to expectations in the quality of services and mobility requirement. The number of users for different user groups are calculated taking into account the gross potential market, the penetration rate of the identified services and the profitability to provide such services via satellite.
Freeway travel-time estimation and forecasting.
DOT National Transportation Integrated Search
2012-09-01
This project presents a microsimulation-based framework for generating short-term forecasts of travel time on freeway corridors. The microsimulation model that is developed (GTsim), replicates freeway capacity drop and relaxation phenomena critical f...
[Medium-term forecast of solar cosmic rays radiation risk during a manned Mars mission].
Petrov, V M; Vlasov, A G
2006-01-01
Medium-term forecasting radiation hazard from solar cosmic rays will be vital in a manned Mars mission. Modern methods of space physics lack acceptable reliability in medium-term forecasting the SCR onset and parameters. The proposed estimation of average radiation risk from SCR during the manned Mars mission is made with the use of existing SCR fluence and spectrum models and correlation of solar particle event frequency with predicted Wolf number. Radiation risk is considered an additional death probability from acute radiation reactions (ergonomic component) or acute radial disease in flight. The algorithm for radiation risk calculation is described and resulted risk levels for various periods of the 23-th solar cycle are presented. Applicability of this method to advance forecasting and possible improvements are being investigated. Recommendations to the crew based on risk estimation are exemplified.
Brief Report: Forecasting the Economic Burden of Autism in 2015 and 2025 in the United States
ERIC Educational Resources Information Center
Leigh, J. Paul; Du, Juan
2015-01-01
Few US estimates of the economic burden of autism spectrum disorders (ASD) are available and none provide estimates for 2015 and 2025. We forecast annual direct medical, direct non-medical, and productivity costs combined will be $268 billion (range $162-$367 billion; 0.884-2.009% of GDP) for 2015 and $461 billion (range $276-$1011 billion;…
Stochastic Model of Seasonal Runoff Forecasts
NASA Astrophysics Data System (ADS)
Krzysztofowicz, Roman; Watada, Leslie M.
1986-03-01
Each year the National Weather Service and the Soil Conservation Service issue a monthly sequence of five (or six) categorical forecasts of the seasonal snowmelt runoff volume. To describe uncertainties in these forecasts for the purposes of optimal decision making, a stochastic model is formulated. It is a discrete-time, finite, continuous-space, nonstationary Markov process. Posterior densities of the actual runoff conditional upon a forecast, and transition densities of forecasts are obtained from a Bayesian information processor. Parametric densities are derived for the process with a normal prior density of the runoff and a linear model of the forecast error. The structure of the model and the estimation procedure are motivated by analyses of forecast records from five stations in the Snake River basin, from the period 1971-1983. The advantages of supplementing the current forecasting scheme with a Bayesian analysis are discussed.
Phytoremediation removal rates of benzene, toluene, and chlorobenzene.
Limmer, Matt A; Wilson, Jordan; Westenberg, David; Lee, Amy; Siegman, Mark; Burken, Joel G
2018-06-07
Phytoremediation is a sustainable remedial approach, although performance efficacy is rarely reported. In this study, we assessed a phytoremediation plot treating benzene, toluene, and chlorobenzene. A comparison of the calculated phytoremediation removal rate with estimates of onsite contaminant mass was used to forecast cleanup periods. The investigation demonstrated that substantial microbial degradation was occurring in the subsurface. Estimates of transpiration indicated that the trees planted were removing approximately 240,000 L of water per year. This large quantity of water removal implies substantial removal of contaminant due to large amounts of contaminants in the groundwater; however, these contaminants extensively sorb to the soil, resulting in large quantities of contaminant mass in the subsurface. The total estimate of subsurface contaminant mass was also complicated by the presence of non-aqueous phase liquids (NAPL), additional contaminant masses that were difficult to quantify. These uncertainties of initial contaminant mass at the site result in large uncertainty in the cleanup period, although mean estimates are on the order of decades. Collectively, the model indicates contaminant removal rates on the order of 10 -2 -10 0 kg/tree/year. The benefit of the phytoremediation system is relatively sustainable cleanup over the long periods necessary due to the presence of NAPL.
NASA Technical Reports Server (NTRS)
Wolff, David B.; Fisher, Brad L.
2008-01-01
Space-borne microwave sensors provide critical rain information used in several global multi-satellite rain products, which in turn are used for a variety of important studies, including landslide forecasting, flash flood warning, data assimilation, climate studies, and validation of model forecast of precipitation. This study employs four years (2003-2006) of satellite data to assess the relative performance and skill of SSM/I (F13, F14 and F15), AMSU-B (N15, N16 and N17), AMSR-E (AQUA) and the TRMM Microwave Imager (TMI) in estimating surface rainfall based on direct instantaneous comparison with ground-based rain estimates from Tropical Rainfall Measuring Mission (TRMM) Ground Validation (GV) sites at Kwajalein, Republic of the Marshall Islands (KWAJ) and Melbourne, Florida (MELB). The relative performance of each of these satellites is examined via comparisons with GV radar-based rain rate estimates. Because underlying surface terrain is known to affect the relative performance of the satellite algorithms, the data for MELB was further stratified into ocean, land and coast categories using a 0.25 terrain mask. Of all the satellite estimates compared in this study, TMI and AMSR-E exhibited considerably higher correlations and skills in estimating/observing surface precipitation. While SSM/I and AMSU-B exhibited lower correlations and skills for each of the different terrain categories, the SSM/I absolute biases trended slightly lower than AMSRE over ocean, where the observations from both emission and scattering channels were used in the retrievals. AMSU-B exhibited the least skill relative to GV in all of the relevant statistical categories, and an anomalous spike was observed in the probability distribution functions near 1.0 mm hr-1. This statistical artifact appears to be related to attempts by algorithm developers to include some lighter rain rates, not easily detectable by its scatter-only frequencies. AMSU-B, however, agreed well with GV when the matching data was analyzed on monthly scales. These results signal developers of global rainfall products, such as the TRMM Multi-Satellite Precipitation Analysis (TMPA), and the Climate Data Center s Morphing (CMORPH) technique, that care must be taken when incorporating data from these input satellite estimates in order to provide the highest quality estimates in their products.
Soybean Crop Area Estimation and Mapping in Mato Grosso State, Brazil
NASA Astrophysics Data System (ADS)
Gusso, A.; Ducati, J. R.
2012-07-01
Evaluation of the MODIS Crop Detection Algorithm (MCDA) procedure for estimating historical planted soybean crop areas was done on fields in Mato Grosso State, Brazil. MCDA is based on temporal profiles of EVI (Enhanced Vegetation Index) derived from satellite data of the MODIS (Moderate Resolution Imaging Spectroradiometer) imager, and was previously developed for soybean area estimation in Rio Grande do Sul State, Brazil. According to the MCDA approach, in Mato Grosso soybean area estimates can be provided in December (1st forecast), using images from the sowing period, and in February (2nd forecast), using images from sowing and maximum crop development period. The results obtained by the MCDA were compared with Brazilian Institute of Geography and Statistics (IBGE) official estimates of soybean area at municipal level. Coefficients of determination were between 0.93 and 0.98, indicating a good agreement, and also the suitability of MCDA to estimations performed in Mato Grosso State. On average, the MCDA results explained 96% of the variation of the data estimated by the IBGE. In this way, MCDA calibration was able to provide annual thematic soybean maps, forecasting the planted area in the State, with results which are comparable to the official agricultural statistics.
Earthquake focal mechanism forecasting in Italy for PSHA purposes
NASA Astrophysics Data System (ADS)
Roselli, Pamela; Marzocchi, Warner; Mariucci, Maria Teresa; Montone, Paola
2018-01-01
In this paper, we put forward a procedure that aims to forecast focal mechanism of future earthquakes. One of the primary uses of such forecasts is in probabilistic seismic hazard analysis (PSHA); in fact, aiming at reducing the epistemic uncertainty, most of the newer ground motion prediction equations consider, besides the seismicity rates, the forecast of the focal mechanism of the next large earthquakes as input data. The data set used to this purpose is relative to focal mechanisms taken from the latest stress map release for Italy containing 392 well-constrained solutions of events, from 1908 to 2015, with Mw ≥ 4 and depths from 0 down to 40 km. The data set considers polarity focal mechanism solutions until to 1975 (23 events), whereas for 1976-2015, it takes into account only the Centroid Moment Tensor (CMT)-like earthquake focal solutions for data homogeneity. The forecasting model is rooted in the Total Weighted Moment Tensor concept that weighs information of past focal mechanisms evenly distributed in space, according to their distance from the spatial cells and magnitude. Specifically, for each cell of a regular 0.1° × 0.1° spatial grid, the model estimates the probability to observe a normal, reverse, or strike-slip fault plane solution for the next large earthquakes, the expected moment tensor and the related maximum horizontal stress orientation. These results will be available for the new PSHA model for Italy under development. Finally, to evaluate the reliability of the forecasts, we test them with an independent data set that consists of some of the strongest earthquakes with Mw ≥ 3.9 occurred during 2016 in different Italian tectonic provinces.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carson, K.S.
The presence of overpopulation or unsustainable population growth may place pressure on the food and water supplies of countries in sensitive areas of the world. Severe air or water pollution may place additional pressure on these resources. These pressures may generate both internal and international conflict in these areas as nations struggle to provide for their citizens. Such conflicts may result in United States intervention, either unilaterally, or through the United Nations. Therefore, it is in the interests of the United States to identify potential areas of conflict in order to properly train and allocate forces. The purpose of thismore » research is to forecast the probability of conflict in a nation as a function of it s environmental conditions. Probit, logit and ordered probit models are employed to forecast the probability of a given level of conflict. Data from 95 countries are used to estimate the models. Probability forecasts are generated for these 95 nations. Out-of sample forecasts are generated for an additional 22 nations. These probabilities are then used to rank nations from highest probability of conflict to lowest. The results indicate that the dependence of a nation`s economy on agriculture, the rate of deforestation, and the population density are important variables in forecasting the probability and level of conflict. These results indicate that environmental variables do play a role in generating or exacerbating conflict. It is unclear that the United States military has any direct role in mitigating the environmental conditions that may generate conflict. A more important role for the military is to aid in data gathering to generate better forecasts so that the troops are adequntely prepared when conflicts arises.« less
NASA Technical Reports Server (NTRS)
Molthan, Andrew L.
2011-01-01
Increases in computing resources have allowed for the utilization of high-resolution weather forecast models capable of resolving cloud microphysical and precipitation processes among varying numbers of hydrometeor categories. Several microphysics schemes are currently available within the Weather Research and Forecasting (WRF) model, ranging from single-moment predictions of precipitation content to double-moment predictions that include a prediction of particle number concentrations. Each scheme incorporates several assumptions related to the size distribution, shape, and fall speed relationships of ice crystals in order to simulate cold-cloud processes and resulting precipitation. Field campaign data offer a means of evaluating the assumptions present within each scheme. The Canadian CloudSat/CALIPSO Validation Project (C3VP) represented collaboration among the CloudSat, CALIPSO, and NASA Global Precipitation Measurement mission communities, to observe cold season precipitation processes relevant to forecast model evaluation and the eventual development of satellite retrievals of cloud properties and precipitation rates. During the C3VP campaign, widespread snowfall occurred on 22 January 2007, sampled by aircraft and surface instrumentation that provided particle size distributions, ice water content, and fall speed estimations along with traditional surface measurements of temperature and precipitation. In this study, four single-moment and two double-moment microphysics schemes were utilized to generate hypothetical WRF forecasts of the event, with C3VP data used in evaluation of their varying assumptions. Schemes that incorporate flexibility in size distribution parameters and density assumptions are shown to be preferable to fixed constants, and that a double-moment representation of the snow category may be beneficial when representing the effects of aggregation. These results may guide forecast centers in optimal configurations of their forecast models for winter weather and identify best practices present within these various schemes.
On the predictability of outliers in ensemble forecasts
NASA Astrophysics Data System (ADS)
Siegert, S.; Bröcker, J.; Kantz, H.
2012-03-01
In numerical weather prediction, ensembles are used to retrieve probabilistic forecasts of future weather conditions. We consider events where the verification is smaller than the smallest, or larger than the largest ensemble member of a scalar ensemble forecast. These events are called outliers. In a statistically consistent K-member ensemble, outliers should occur with a base rate of 2/(K+1). In operational ensembles this base rate tends to be higher. We study the predictability of outlier events in terms of the Brier Skill Score and find that forecast probabilities can be calculated which are more skillful than the unconditional base rate. This is shown analytically for statistically consistent ensembles. Using logistic regression, forecast probabilities for outlier events in an operational ensemble are calculated. These probabilities exhibit positive skill which is quantitatively similar to the analytical results. Possible causes of these results as well as their consequences for ensemble interpretation are discussed.
Mohammed, Emad A; Naugler, Christopher
2017-01-01
Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.
Mohammed, Emad A.; Naugler, Christopher
2017-01-01
Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996
Uniform California earthquake rupture forecast, version 2 (UCERF 2)
Field, E.H.; Dawson, T.E.; Felzer, K.R.; Frankel, A.D.; Gupta, V.; Jordan, T.H.; Parsons, T.; Petersen, M.D.; Stein, R.S.; Weldon, R.J.; Wills, C.J.
2009-01-01
The 2007 Working Group on California Earthquake Probabilities (WGCEP, 2007) presents the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2). This model comprises a time-independent (Poisson-process) earthquake rate model, developed jointly with the National Seismic Hazard Mapping Program and a time-dependent earthquake-probability model, based on recent earthquake rates and stress-renewal statistics conditioned on the date of last event. The models were developed from updated statewide earthquake catalogs and fault deformation databases using a uniform methodology across all regions and implemented in the modular, extensible Open Seismic Hazard Analysis framework. The rate model satisfies integrating measures of deformation across the plate-boundary zone and is consistent with historical seismicity data. An overprediction of earthquake rates found at intermediate magnitudes (6.5 ??? M ???7.0) in previous models has been reduced to within the 95% confidence bounds of the historical earthquake catalog. A logic tree with 480 branches represents the epistemic uncertainties of the full time-dependent model. The mean UCERF 2 time-dependent probability of one or more M ???6.7 earthquakes in the California region during the next 30 yr is 99.7%; this probability decreases to 46% for M ???7.5 and to 4.5% for M ???8.0. These probabilities do not include the Cascadia subduction zone, largely north of California, for which the estimated 30 yr, M ???8.0 time-dependent probability is 10%. The M ???6.7 probabilities on major strike-slip faults are consistent with the WGCEP (2003) study in the San Francisco Bay Area and the WGCEP (1995) study in southern California, except for significantly lower estimates along the San Jacinto and Elsinore faults, owing to provisions for larger multisegment ruptures. Important model limitations are discussed.
Minimally invasive surgery: national trends in adoption and future directions for hospital strategy.
Tsui, Charlotte; Klein, Rachel; Garabrant, Matthew
2013-07-01
Surgeons have rapidly adopted minimally invasive surgical (MIS) techniques for a wide range of applications since the first laparoscopic appendectomy was performed in 1983. At the helm of this MIS shift has been laparoscopy, with robotic surgery also gaining ground in a number of areas. Researchers estimated national volumes, growth forecasts, and MIS adoption rates for the following procedures: cholecystectomy, appendectomy, gastric bypass, ventral hernia repair, colectomy, prostatectomy, tubal ligation, hysterectomy, and myomectomy. MIS adoption rates are based on secondary research, interviews with clinicians and administrators involved in MIS, and a review of clinical literature, where available. Overall volume estimates and growth forecasts are sourced from The Advisory Board Company's national demand model which provides current and future utilization rate projections for inpatient and outpatient services. The model takes into account demographics (growth and aging of the population) as well as non demographic factors such as inpatient to outpatient shift, increase in disease prevalence, technological advancements, coverage expansion, and changing payment models. Surgeons perform cholecystectomy, a relatively simple procedure, laparoscopically in 96 % of the cases. Use of the robot as a tool in laparoscopy is gaining traction in general surgery and seeing particular growth within colorectal surgery. Surgeons use robotic surgery in 15 % of colectomy cases, far behind that of prostatectomy but similar to that of hysterectomy, which have robotic adoption rates of 90 and 20 %, respectively. Surgeons are using minimally invasive surgical techniques, primarily laparoscopy and robotic surgery, to perform procedures that were previously done as open surgery. As risk-based pressures mount, hospital executives will increasingly scrutinize the cost of new technology and the impact it has on patient outcomes. These changing market dynamics may thwart the expansion of new surgical techniques and heighten emphasis on competency standards.
Web-Based Real Time Earthquake Forecasting and Personal Risk Management
NASA Astrophysics Data System (ADS)
Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.
2012-12-01
Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities that has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ('seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1o. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests. We show how the Reliability and ROC tests allow us to judge data completeness and estimate error. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges and pitfalls in serving up these datasets over the web.
Global Earthquake Activity Rate models based on version 2 of the Global Strain Rate Map
NASA Astrophysics Data System (ADS)
Bird, P.; Kreemer, C.; Kagan, Y. Y.; Jackson, D. D.
2013-12-01
Global Earthquake Activity Rate (GEAR) models have usually been based on either relative tectonic motion (fault slip rates and/or distributed strain rates), or on smoothing of seismic catalogs. However, a hybrid approach appears to perform better than either parent, at least in some retrospective tests. First, we construct a Tectonic ('T') forecast of shallow (≤ 70 km) seismicity based on global plate-boundary strain rates from version 2 of the Global Strain Rate Map. Our approach is the SHIFT (Seismic Hazard Inferred From Tectonics) method described by Bird et al. [2010, SRL], in which the character of the strain rate tensor (thrusting and/or strike-slip and/or normal) is used to select the most comparable type of plate boundary for calibration of the coupled seismogenic lithosphere thickness and corner magnitude. One difference is that activity of offshore plate boundaries is spatially smoothed using empirical half-widths [Bird & Kagan, 2004, BSSA] before conversion to seismicity. Another is that the velocity-dependence of coupling in subduction and continental-convergent boundaries [Bird et al., 2009, BSSA] is incorporated. Another forecast component is the smoothed-seismicity ('S') forecast model of [Kagan & Jackson, 1994, JGR; Kagan & Jackson, 2010, GJI], which was based on optimized smoothing of the shallow part of the GCMT catalog, years 1977-2004. Both forecasts were prepared for threshold magnitude 5.767. Then, we create hybrid forecasts by one of 3 methods: (a) taking the greater of S or T; (b) simple weighted-average of S and T; or (c) log of the forecast rate is a weighted average of the logs of S and T. In methods (b) and (c) there is one free parameter, which is the fractional contribution from S. All hybrid forecasts are normalized to the same global rate. Pseudo-prospective tests for 2005-2012 (using versions of S and T calibrated on years 1977-2004) show that many hybrid models outperform both parents (S and T), and that the optimal weight on S is in the neighborhood of 5/8. This is true whether forecast performance is scored by Kagan's [2009, GJI] I1 information score, or by the S-test of Zechar & Jordan [2010, BSSA]. These hybrids also score well (0.97) in the ASS-test of Zechar & Jordan [2008, GJI] with respect to prior relative intensity.
Socioeconomic Forecasting Model for the Tri-County Regional Planning Commission
DOT National Transportation Integrated Search
1997-01-01
Socioeconomic data is a critical input to transportation planning and travel demand forecasting. Accurate estimates of existing population, incomes, employment and other socioeconomic characteristics are necessary for meaningful calibration of a trav...
Foreign currency rate forecasting using neural networks
NASA Astrophysics Data System (ADS)
Pandya, Abhijit S.; Kondo, Tadashi; Talati, Amit; Jayadevappa, Suryaprasad
2000-03-01
Neural networks are increasingly being used as a forecasting tool in many forecasting problems. This paper discusses the application of neural networks in predicting daily foreign exchange rates between the USD, GBP as well as DEM. We approach the problem from a time-series analysis framework - where future exchange rates are forecasted solely using past exchange rates. This relies on the belief that the past prices and future prices are very close related, and interdependent. We present the result of training a neural network with historical USD-GBP data. The methodology used in explained, as well as the training process. We discuss the selection of inputs to the network, and present a comparison of using the actual exchange rates and the exchange rate differences as inputs. Price and rate differences are the preferred way of training neural network in financial applications. Results of both approaches are present together for comparison. We show that the network is able to learn the trends in the exchange rate movements correctly, and present the results of the prediction over several periods of time.
NASA Technical Reports Server (NTRS)
French, V. (Principal Investigator)
1982-01-01
An evaluation was made of Thompson-Type models which use trend terms (as a surrogate for technology), meteorological variables based on monthly average temperature, and total precipitation to forecast and estimate corn yields in Iowa, Illinois, and Indiana. Pooled and unpooled Thompson-type models were compared. Neither was found to be consistently superior to the other. Yield reliability indicators show that the models are of limited use for large area yield estimation. The models are objective and consistent with scientific knowledge. Timely yield forecasts and estimates can be made during the growing season by using normals or long range weather forecasts. The models are not costly to operate and are easy to use and understand. The model standard errors of prediction do not provide a useful current measure of modeled yield reliability.
NASA Astrophysics Data System (ADS)
Franz, K. J.; Bowman, A. L.; Hogue, T. S.; Kim, J.; Spies, R.
2011-12-01
In the face of a changing climate, growing populations, and increased human habitation in hydrologically risky locations, both short- and long-range planners increasingly require robust and reliable streamflow forecast information. Current operational forecasting utilizes watershed-scale, conceptual models driven by ground-based (commonly point-scale) observations of precipitation and temperature and climatological potential evapotranspiration (PET) estimates. The PET values are derived from historic pan evaporation observations and remain static from year-to-year. The need for regional dynamic PET values is vital for improved operational forecasting. With the advent of satellite remote sensing and the adoption of a more flexible operational forecast system by the National Weather Service, incorporation of advanced data products is now more feasible than in years past. In this study, we will test a previously developed satellite-derived PET product (UCLA MODIS-PET) in the National Weather Service forecast models and compare the model results to current methods. The UCLA MODIS-PET method is based on the Priestley-Taylor formulation, is driven with MODIS satellite products, and produces a daily, 250m PET estimate. The focus area is eight headwater basins in the upper Midwest U.S. There is a need to develop improved forecasting methods for this region that are able to account for climatic and landscape changes more readily and effectively than current methods. This region is highly flood prone yet sensitive to prolonged dry periods in late summer and early fall, and is characterized by a highly managed landscape, which has drastically altered the natural hydrologic cycle. Our goal is to improve model simulations, and thereby, the initial conditions prior to the start of a forecast through the use of PET values that better reflect actual watershed conditions. The forecast models are being tested in both distributed and lumped mode.
Updating of states in operational hydrological models
NASA Astrophysics Data System (ADS)
Bruland, O.; Kolberg, S.; Engeland, K.; Gragne, A. S.; Liston, G.; Sand, K.; Tøfte, L.; Alfredsen, K.
2012-04-01
Operationally the main purpose of hydrological models is to provide runoff forecasts. The quality of the model state and the accuracy of the weather forecast together with the model quality define the runoff forecast quality. Input and model errors accumulate over time and may leave the model in a poor state. Usually model states can be related to observable conditions in the catchment. Updating of these states, knowing their relation to observable catchment conditions, influence directly the forecast quality. Norway is internationally in the forefront in hydropower scheduling both on short and long terms. The inflow forecasts are fundamental to this scheduling. Their quality directly influence the producers profit as they optimize hydropower production to market demand and at the same time minimize spill of water and maximize available hydraulic head. The quality of the inflow forecasts strongly depends on the quality of the models applied and the quality of the information they use. In this project the focus has been to improve the quality of the model states which the forecast is based upon. Runoff and snow storage are two observable quantities that reflect the model state and are used in this project for updating. Generally the methods used can be divided in three groups: The first re-estimates the forcing data in the updating period; the second alters the weights in the forecast ensemble; and the third directly changes the model states. The uncertainty related to the forcing data through the updating period is due to both uncertainty in the actual observation and to how well the gauging stations represent the catchment both in respect to temperatures and precipitation. The project looks at methodologies that automatically re-estimates the forcing data and tests the result against observed response. Model uncertainty is reflected in a joint distribution of model parameters estimated using the Dream algorithm.
Traffic forecasting report : 2007.
DOT National Transportation Integrated Search
2008-05-01
This is the sixth edition of the Traffic Forecasting Report (TFR). This edition of the TFR contains the latest (predominantly 2007) forecasting/modeling data as follows: : Functional class average traffic volume growth rates and trends : Vehi...
Evaluation of induced seismicity forecast models in the Induced Seismicity Test Bench
NASA Astrophysics Data System (ADS)
Király, Eszter; Gischig, Valentin; Zechar, Jeremy; Doetsch, Joseph; Karvounis, Dimitrios; Wiemer, Stefan
2016-04-01
Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. Here, we propose an Induced Seismicity Test Bench to test and rank such models. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models that incorporate a different mix of physical understanding and stochastic representation of the induced sequences: Shapiro in Space (SiS) and Hydraulics and Seismics (HySei). SiS is based on three pillars: the seismicity rate is computed with help of the seismogenic index and a simple exponential decay of the seismicity; the magnitude distribution follows the Gutenberg-Richter relation; and seismicity is distributed in space based on smoothing seismicity during the learning period with 3D Gaussian kernels. The HySei model describes seismicity triggered by pressure diffusion with irreversible permeability enhancement. Our results show that neither model is fully superior to the other. HySei forecasts the seismicity rate well, but is only mediocre at forecasting the spatial distribution. On the other hand, SiS forecasts the spatial distribution well but not the seismicity rate. The shut-in phase is a difficult moment for both models in both reservoirs: the models tend to underpredict the seismicity rate around, and shortly after, shut-in. Ensemble models that combine HySei's rate forecast with SiS's spatial forecast outperform each individual model.
Near-real-time Estimation and Forecast of Total Precipitable Water in Europe
NASA Astrophysics Data System (ADS)
Bartholy, J.; Kern, A.; Barcza, Z.; Pongracz, R.; Ihasz, I.; Kovacs, R.; Ferencz, C.
2013-12-01
Information about the amount and spatial distribution of atmospheric water vapor (or total precipitable water) is essential for understanding weather and the environment including the greenhouse effect, the climate system with its feedbacks and the hydrological cycle. Numerical weather prediction (NWP) models need accurate estimations of water vapor content to provide realistic forecasts including representation of clouds and precipitation. In the present study we introduce our research activity for the estimation and forecast of atmospheric water vapor in Central Europe using both observations and models. The Eötvös Loránd University (Hungary) operates a polar orbiting satellite receiving station in Budapest since 2002. This station receives Earth observation data from polar orbiting satellites including MODerate resolution Imaging Spectroradiometer (MODIS) Direct Broadcast (DB) data stream from satellites Terra and Aqua. The received DB MODIS data are automatically processed using freely distributed software packages. Using the IMAPP Level2 software total precipitable water is calculated operationally using two different methods. Quality of the TPW estimations is a crucial question for further application of the results, thus validation of the remotely sensed total precipitable water fields is presented using radiosonde data. In a current research project in Hungary we aim to compare different estimations of atmospheric water vapor content. Within the frame of the project we use a NWP model (DBCRAS; Direct Broadcast CIMSS Regional Assimilation System numerical weather prediction software developed by the University of Wisconsin, Madison) to forecast TPW. DBCRAS uses near real time Level2 products from the MODIS data processing chain. From the wide range of the derived Level2 products the MODIS TPW parameter found within the so-called mod07 results (Atmospheric Profiles Product) and the cloud top pressure and cloud effective emissivity parameters from the so-called mod06 results (Cloud Product) are assimilated twice a day (at 00 and 12 UTC) by DBCRAS. DBCRAS creates 72 hours long weather forecasts with 48 km horizontal resolution. DBCRAS is operational at the University since 2009 which means that by now sufficient data is available for the verification of the model. In the present study verification results for the DBCRAS total precipitable water forecasts are presented based on analysis data from the European Centre for Medium-Range Weather Forecasts (ECMWF). Numerical indices are calculated to quantify the performance of DBCRAS. During a limited time period DBCRAS was also ran without assimilating MODIS products which means that there is possibility to quantify the effect of assimilating MODIS physical products on the quality of the forecasts. For this limited time period verification indices are compared to decide whether MODIS data improves forecast quality or not.
Sampri, Alexia; Sypsa, Karla; Tsagarakis, Konstantinos P
2018-01-01
Background With the internet’s penetration and use constantly expanding, this vast amount of information can be employed in order to better assess issues in the US health care system. Google Trends, a popular tool in big data analytics, has been widely used in the past to examine interest in various medical and health-related topics and has shown great potential in forecastings, predictions, and nowcastings. As empirical relationships between online queries and human behavior have been shown to exist, a new opportunity to explore the behavior toward asthma—a common respiratory disease—is present. Objective This study aimed at forecasting the online behavior toward asthma and examined the correlations between queries and reported cases in order to explore the possibility of nowcasting asthma prevalence in the United States using online search traffic data. Methods Applying Holt-Winters exponential smoothing to Google Trends time series from 2004 to 2015 for the term “asthma,” forecasts for online queries at state and national levels are estimated from 2016 to 2020 and validated against available Google query data from January 2016 to June 2017. Correlations among yearly Google queries and between Google queries and reported asthma cases are examined. Results Our analysis shows that search queries exhibit seasonality within each year and the relationships between each 2 years’ queries are statistically significant (P<.05). Estimated forecasting models for a 5-year period (2016 through 2020) for Google queries are robust and validated against available data from January 2016 to June 2017. Significant correlations were found between (1) online queries and National Health Interview Survey lifetime asthma (r=–.82, P=.001) and current asthma (r=–.77, P=.004) rates from 2004 to 2015 and (2) between online queries and Behavioral Risk Factor Surveillance System lifetime (r=–.78, P=.003) and current asthma (r=–.79, P=.002) rates from 2004 to 2014. The correlations are negative, but lag analysis to identify the period of response cannot be employed until short-interval data on asthma prevalence are made available. Conclusions Online behavior toward asthma can be accurately predicted, and significant correlations between online queries and reported cases exist. This method of forecasting Google queries can be used by health care officials to nowcast asthma prevalence by city, state, or nationally, subject to future availability of daily, weekly, or monthly data on reported cases. This method could therefore be used for improved monitoring and assessment of the needs surrounding the current population of patients with asthma. PMID:29530839
Forecasting Construction Cost Index based on visibility graph: A network approach
NASA Astrophysics Data System (ADS)
Zhang, Rong; Ashuri, Baabak; Shyr, Yu; Deng, Yong
2018-03-01
Engineering News-Record (ENR), a professional magazine in the field of global construction engineering, publishes Construction Cost Index (CCI) every month. Cost estimators and contractors assess projects, arrange budgets and prepare bids by forecasting CCI. However, fluctuations and uncertainties of CCI cause irrational estimations now and then. This paper aims at achieving more accurate predictions of CCI based on a network approach in which time series is firstly converted into a visibility graph and future values are forecasted relied on link prediction. According to the experimental results, the proposed method shows satisfactory performance since the error measures are acceptable. Compared with other methods, the proposed method is easier to implement and is able to forecast CCI with less errors. It is convinced that the proposed method is efficient to provide considerably accurate CCI predictions, which will make contributions to the construction engineering by assisting individuals and organizations in reducing costs and making project schedules.
Real-time data for estimating a forward-looking interest rate rule of the ECB.
Bletzinger, Tilman; Wieland, Volker
2017-12-01
The purpose of the data presented in this article is to use it in ex post estimations of interest rate decisions by the European Central Bank (ECB), as it is done by Bletzinger and Wieland (2017) [1]. The data is of quarterly frequency from 1999 Q1 until 2013 Q2 and consists of the ECB's policy rate, inflation rate, real output growth and potential output growth in the euro area. To account for forward-looking decision making in the interest rate rule, the data consists of expectations about future inflation and output dynamics. While potential output is constructed based on data from the European Commission's annual macro-economic database, inflation and real output growth are taken from two different sources both provided by the ECB: the Survey of Professional Forecasters and projections made by ECB staff. Careful attention was given to the publication date of the collected data to ensure a real-time dataset only consisting of information which was available to the decision makers at the time of the decision.
The Betting Odds Rating System: Using soccer forecasts to forecast soccer.
Wunderlich, Fabian; Memmert, Daniel
2018-01-01
Betting odds are frequently found to outperform mathematical models in sports related forecasting tasks, however the factors contributing to betting odds are not fully traceable and in contrast to rating-based forecasts no straightforward measure of team-specific quality is deducible from the betting odds. The present study investigates the approach of combining the methods of mathematical models and the information included in betting odds. A soccer forecasting model based on the well-known ELO rating system and taking advantage of betting odds as a source of information is presented. Data from almost 15.000 soccer matches (seasons 2007/2008 until 2016/2017) are used, including both domestic matches (English Premier League, German Bundesliga, Spanish Primera Division and Italian Serie A) and international matches (UEFA Champions League, UEFA Europe League). The novel betting odds based ELO model is shown to outperform classic ELO models, thus demonstrating that betting odds prior to a match contain more relevant information than the result of the match itself. It is shown how the novel model can help to gain valuable insights into the quality of soccer teams and its development over time, thus having a practical benefit in performance analysis. Moreover, it is argued that network based approaches might help in further improving rating and forecasting methods.
The Betting Odds Rating System: Using soccer forecasts to forecast soccer
Memmert, Daniel
2018-01-01
Betting odds are frequently found to outperform mathematical models in sports related forecasting tasks, however the factors contributing to betting odds are not fully traceable and in contrast to rating-based forecasts no straightforward measure of team-specific quality is deducible from the betting odds. The present study investigates the approach of combining the methods of mathematical models and the information included in betting odds. A soccer forecasting model based on the well-known ELO rating system and taking advantage of betting odds as a source of information is presented. Data from almost 15.000 soccer matches (seasons 2007/2008 until 2016/2017) are used, including both domestic matches (English Premier League, German Bundesliga, Spanish Primera Division and Italian Serie A) and international matches (UEFA Champions League, UEFA Europe League). The novel betting odds based ELO model is shown to outperform classic ELO models, thus demonstrating that betting odds prior to a match contain more relevant information than the result of the match itself. It is shown how the novel model can help to gain valuable insights into the quality of soccer teams and its development over time, thus having a practical benefit in performance analysis. Moreover, it is argued that network based approaches might help in further improving rating and forecasting methods. PMID:29870554
Yao, Yibin; Shan, Lulu; Zhao, Qingzhi
2017-09-29
Global Navigation Satellite System (GNSS) can effectively retrieve precipitable water vapor (PWV) with high precision and high-temporal resolution. GNSS-derived PWV can be used to reflect water vapor variation in the process of strong convection weather. By studying the relationship between time-varying PWV and rainfall, it can be found that PWV contents increase sharply before raining. Therefore, a short-term rainfall forecasting method is proposed based on GNSS-derived PWV. Then the method is validated using hourly GNSS-PWV data from Zhejiang Continuously Operating Reference Station (CORS) network of the period 1 September 2014 to 31 August 2015 and its corresponding hourly rainfall information. The results show that the forecasted correct rate can reach about 80%, while the false alarm rate is about 66%. Compared with results of the previous studies, the correct rate is improved by about 7%, and the false alarm rate is comparable. The method is also applied to other three actual rainfall events of different regions, different durations, and different types. The results show that the method has good applicability and high accuracy, which can be used for rainfall forecasting, and in the future study, it can be assimilated with traditional weather forecasting techniques to improve the forecasted accuracy.
Hoerger, Michael; Quirk, Stuart W.; Chapman, Benjamin P.; Duberstein, Paul R.
2011-01-01
Emerging research has examined individual differences in affective forecasting; however, we are aware of no published study to date linking psychopathology symptoms to affective forecasting problems. Pitting cognitive theory against depressive realism theory, we examined whether dysphoria was associated with negatively biased affective forecasts or greater accuracy. Participants (n = 325) supplied predicted and actual emotional reactions for three days surrounding an emotionally-evocative relational event, Valentine’s Day. Predictions were made a month prior to the holiday. Consistent with cognitive theory, we found evidence for a dysphoric forecasting bias – the tendency of individuals in dysphoric states to overpredict negative emotional reactions to future events. The dysphoric forecasting bias was robust across ratings of positive and negative affect, forecasts for pleasant and unpleasant scenarios, continuous and categorical operationalizations of dysphoria, and three time points of observation. Similar biases were not observed in analyses examining the independent effects of anxiety and hypomania. Findings provide empirical evidence for the long assumed influence of depressive symptoms on future expectations. The present investigation has implications for affective forecasting studies examining information processing constructs, decision making, and broader domains of psychopathology. PMID:22397734
Hoerger, Michael; Quirk, Stuart W; Chapman, Benjamin P; Duberstein, Paul R
2012-01-01
Emerging research has examined individual differences in affective forecasting; however, we are aware of no published study to date linking psychopathology symptoms to affective forecasting problems. Pitting cognitive theory against depressive realism theory, we examined whether dysphoria was associated with negatively biased affective forecasts or greater accuracy. Participants (n=325) supplied predicted and actual emotional reactions for three days surrounding an emotionally evocative relational event, Valentine's Day. Predictions were made a month prior to the holiday. Consistent with cognitive theory, we found evidence for a dysphoric forecasting bias-the tendency of individuals in dysphoric states to overpredict negative emotional reactions to future events. The dysphoric forecasting bias was robust across ratings of positive and negative affect, forecasts for pleasant and unpleasant scenarios, continuous and categorical operationalisations of dysphoria, and three time points of observation. Similar biases were not observed in analyses examining the independent effects of anxiety and hypomania. Findings provide empirical evidence for the long-assumed influence of depressive symptoms on future expectations. The present investigation has implications for affective forecasting studies examining information-processing constructs, decision making, and broader domains of psychopathology.
Early Transition and Use of VIIRS and GOES-R Products by NWS Forecast Offices
NASA Technical Reports Server (NTRS)
Fuell, Kevin K.; Smith, Mathew; Jedlovec, Gary
2012-01-01
The Visible Infrared Imaging Radiometer Suite (VIIRS) on the NPOESS Preparatory Project (NPP) satellite, part of the Joint Polar Satellite System (JPSS), and the ABI and GLM sensors scheduled for the GOES-R geostationary satellite will bring advanced observing capabilities to the operational weather community. The NASA Short-term Prediction Research and Transition (SPoRT) project at Marshall Space Flight Center has been facilitating the use of real-time experimental and research satellite data by NWS Weather Forecast Offices (WFOs) for a number of years to demonstrate the planned capabilities of future sensors to address particular forecast challenges through improve situational awareness and short-term weather forecasts. For the NOAA GOES-R Proving Ground (PG) activity, SPoRT is developing and disseminating selected GOES-R proxy products to collaborating WFOs and National Centers. SPoRT developed the a pseudo-Geostationary Lightning Mapper product and helped in the transition of the Algorithm Working Group (AWG) Convective Initiation (CI) proxy product for the Hazardous Weather Testbed (HWT) Spring Experiment,. Along with its partner WFOs, SPoRT is evaluating MODIS/GOES Hybrid products, which brings ABI-like data sets from existing NASA instrumentation in front of the forecaster for everyday use. The Hybrid uses near real-time MODIS imagery to demonstrate future ABI capabilities, while utilizing standard GOES imagery to provide the temporal frequency of geostationary imagery expected by operational forecasters. In addition, SPoRT is collaborating with the GOES-R hydrology AWG to transition a baseline proxy product for rainfall rate / quantitative precipitation estimate (QPE) to the OCONUS regions. For VIIRS, SPoRT is demonstrating multispectral observing capabilities and the utility of low-light channels not previously available on operational weather satellites to address a variety of weather forecast challenges. This presentation will discuss the results of transitioning these products to collaborating WFOs throughout the country.
NASA Astrophysics Data System (ADS)
Wood, Andy; Clark, Elizabeth; Mendoza, Pablo; Nijssen, Bart; Newman, Andy; Clark, Martyn; Nowak, Kenneth; Arnold, Jeffrey
2017-04-01
Many if not most national operational streamflow prediction systems rely on a forecaster-in-the-loop approach that require the hands-on-effort of an experienced human forecaster. This approach evolved from the need to correct for long-standing deficiencies in the models and datasets used in forecasting, and the practice often leads to skillful flow predictions despite the use of relatively simple, conceptual models. Yet the 'in-the-loop' forecast process is not reproducible, which limits opportunities to assess and incorporate new techniques systematically, and the effort required to make forecasts in this way is an obstacle to expanding forecast services - e.g., though adding new forecast locations or more frequent forecast updates, running more complex models, or producing forecast and hindcasts that can support verification. In the last decade, the hydrologic forecasting community has begun develop more centralized, 'over-the-loop' systems. The quality of these new forecast products will depend on their ability to leverage research in areas including earth system modeling, parameter estimation, data assimilation, statistical post-processing, weather and climate prediction, verification, and uncertainty estimation through the use of ensembles. Currently, many national operational streamflow forecasting and water management communities have little experience with the strengths and weaknesses of over-the-loop approaches, even as such systems are beginning to be deployed operationally in centers such as ECMWF. There is thus a need both to evaluate these forecasting advances and to demonstrate their potential in a public arena, raising awareness in forecast user communities and development programs alike. To address this need, the US National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation and the US Army Corps of Engineers, using the NCAR 'System for Hydromet Analysis Research and Prediction Applications' (SHARP) to implement, assess and demonstrate real-time over-the-loop ensemble flow forecasts in a range of US watersheds. The system relies on fully ensemble techniques, including: an 100-member ensemble of meteorological model forcings and an ensemble particle filter data assimilation for initializing watershed states; analog/regression-based downscaling of ensemble weather forecasts from GEFS; and statistical post-processing of ensemble forecast outputs, all of which run in real-time within a workflow managed by ECWMF's ecFlow libraries over large US regional domains. We describe SHARP and present early hindcast and verification results for short to seasonal range streamflow forecasts in a number of US case study watersheds.
A new method for determining the optimal lagged ensemble
DelSole, T.; Tippett, M. K.; Pegion, K.
2017-01-01
Abstract We propose a general methodology for determining the lagged ensemble that minimizes the mean square forecast error. The MSE of a lagged ensemble is shown to depend only on a quantity called the cross‐lead error covariance matrix, which can be estimated from a short hindcast data set and parameterized in terms of analytic functions of time. The resulting parameterization allows the skill of forecasts to be evaluated for an arbitrary ensemble size and initialization frequency. Remarkably, the parameterization also can estimate the MSE of a burst ensemble simply by taking the limit of an infinitely small interval between initialization times. This methodology is applied to forecasts of the Madden Julian Oscillation (MJO) from version 2 of the Climate Forecast System version 2 (CFSv2). For leads greater than a week, little improvement is found in the MJO forecast skill when ensembles larger than 5 days are used or initializations greater than 4 times per day. We find that if the initialization frequency is too infrequent, important structures of the lagged error covariance matrix are lost. Lastly, we demonstrate that the forecast error at leads ≥10 days can be reduced by optimally weighting the lagged ensemble members. The weights are shown to depend only on the cross‐lead error covariance matrix. While the methodology developed here is applied to CFSv2, the technique can be easily adapted to other forecast systems. PMID:28580050
Short-range quantitative precipitation forecasting using Deep Learning approaches
NASA Astrophysics Data System (ADS)
Akbari Asanjan, A.; Yang, T.; Gao, X.; Hsu, K. L.; Sorooshian, S.
2017-12-01
Predicting short-range quantitative precipitation is very important for flood forecasting, early flood warning and other hydrometeorological purposes. This study aims to improve the precipitation forecasting skills using a recently developed and advanced machine learning technique named Long Short-Term Memory (LSTM). The proposed LSTM learns the changing patterns of clouds from Cloud-Top Brightness Temperature (CTBT) images, retrieved from the infrared channel of Geostationary Operational Environmental Satellite (GOES), using a sophisticated and effective learning method. After learning the dynamics of clouds, the LSTM model predicts the upcoming rainy CTBT events. The proposed model is then merged with a precipitation estimation algorithm termed Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) to provide precipitation forecasts. The results of merged LSTM with PERSIANN are compared to the results of an Elman-type Recurrent Neural Network (RNN) merged with PERSIANN and Final Analysis of Global Forecast System model over the states of Oklahoma, Florida and Oregon. The performance of each model is investigated during 3 storm events each located over one of the study regions. The results indicate the outperformance of merged LSTM forecasts comparing to the numerical and statistical baselines in terms of Probability of Detection (POD), False Alarm Ratio (FAR), Critical Success Index (CSI), RMSE and correlation coefficient especially in convective systems. The proposed method shows superior capabilities in short-term forecasting over compared methods.
Simulating Glacial Outburst Lake Releases for Suicide Basin, Mendenhall Glacier, Juneau, Alaska
NASA Astrophysics Data System (ADS)
Jacobs, A. B.; Moran, T.; Hood, E. W.
2017-12-01
Glacial Lake outbursts from Suicide Basin are recent phenomenon first characterized in 2011. The 2014 event resulted in record river stage and moderate flooding on the Mendenhall River in Juneau. Recognizing that these events can adversely impact residential areas of Juneau's Mendenhall Valley, the Alaska-Pacific River Forecast Center developed a real-time modeling technique capable of forecasting the timing and magnitude of the flood-wave crest due to releases from Suicide Basin. The 2014 event was estimated at about 37,000 acre feet with water levels cresting within 36 hours from the time the flood wave hit Mendenhall Lake. Given the magnitude of possible impacts to the public, accurate hydrological forecasting is essential for public safety and Emergency Managers. However, the data needed to effectively forecast magnitudes of specific jökulhlaup events are limited. Estimating this event as related to river stage depended upon three variables: 1) the timing of the lag between Suicide Basin water level declines and the related rise of Mendenhall Lake, 2) continuous monitoring of Mendenhall Lake water levels, and 3) estimating the total water volume stored in Suicide Basin. Real-time modeling of the event utilized a Time of Concentration hydrograph with independent power equations representing the rising and falling limbs of the hydrograph. The initial accuracy of the model — as forecasted about 24 hours prior to crest — resulted in an estimated crest within 0.5 feet of the actual with a timing error of about six hours later than the actual crest.
Remote Sensing and River Discharge Forecasting for Major Rivers in South Asia (Invited)
NASA Astrophysics Data System (ADS)
Webster, P. J.; Hopson, T. M.; Hirpa, F. A.; Brakenridge, G. R.; De-Groeve, T.; Shrestha, K.; Gebremichael, M.; Restrepo, P. J.
2013-12-01
The South Asia is a flashpoint for natural disasters particularly flooding of the Indus, Ganges, and Brahmaputra has profound societal impacts for the region and globally. The 2007 Brahmaputra floods affecting India and Bangladesh, the 2008 avulsion of the Kosi River in India, the 2010 flooding of the Indus River in Pakistan and the 2013 Uttarakhand exemplify disasters on scales almost inconceivable elsewhere. Their frequent occurrence of floods combined with large and rapidly growing populations, high levels of poverty and low resilience, exacerbate the impact of the hazards. Mitigation of these devastating hazards are compounded by limited flood forecast capability, lack of rain/gauge measuring stations and forecast use within and outside the country, and transboundary data sharing on natural hazards. Here, we demonstrate the utility of remotely-derived hydrologic and weather products in producing skillful flood forecasting information without reliance on vulnerable in situ data sources. Over the last decade a forecast system has been providing operational probabilistic forecasts of severe flooding of the Brahmaputra and Ganges Rivers in Bangldesh was developed (Hopson and Webster 2010). The system utilizes ECMWF weather forecast uncertainty information and ensemble weather forecasts, rain gauge and satellite-derived precipitation estimates, together with the limited near-real-time river stage observations from Bangladesh. This system has been expanded to Pakistan and has successfully forecast the 2010-2012 flooding (Shrestha and Webster 2013). To overcome the in situ hydrological data problem, recent efforts in parallel with the numerical modeling have utilized microwave satellite remote sensing of river widths to generate operational discharge advective-based forecasts for the Ganges and Brahmaputra. More than twenty remotely locations upstream of Bangldesh were used to produce stand-alone river flow nowcasts and forecasts at 1-15 days lead time. showing that satellite-based flow estimates are a useful source of dynamical surface water information in data-scarce regions and that they could be used for model calibration and data assimilation purposes in near-time hydrologic forecast applications (Hirpa et al. 2013). More recent efforts during this year's monsoon season are optimally combining these different independent sources of river forecast information along with archived flood inundation imagery of the Dartmouth Flood Observatory to improve the visualization and overall skill of the ongoing CFAB ensemble weather forecast-based flood forecasting system within the unique context of the ongoing flood forecasting efforts for Bangladesh.
Monthly ENSO Forecast Skill and Lagged Ensemble Size
DelSole, T.; Tippett, M.K.; Pegion, K.
2018-01-01
Abstract The mean square error (MSE) of a lagged ensemble of monthly forecasts of the Niño 3.4 index from the Climate Forecast System (CFSv2) is examined with respect to ensemble size and configuration. Although the real‐time forecast is initialized 4 times per day, it is possible to infer the MSE for arbitrary initialization frequency and for burst ensembles by fitting error covariances to a parametric model and then extrapolating to arbitrary ensemble size and initialization frequency. Applying this method to real‐time forecasts, we find that the MSE consistently reaches a minimum for a lagged ensemble size between one and eight days, when four initializations per day are included. This ensemble size is consistent with the 8–10 day lagged ensemble configuration used operationally. Interestingly, the skill of both ensemble configurations is close to the estimated skill of the infinite ensemble. The skill of the weighted, lagged, and burst ensembles are found to be comparable. Certain unphysical features of the estimated error growth were tracked down to problems with the climatology and data discontinuities. PMID:29937973
Monthly ENSO Forecast Skill and Lagged Ensemble Size
NASA Astrophysics Data System (ADS)
Trenary, L.; DelSole, T.; Tippett, M. K.; Pegion, K.
2018-04-01
The mean square error (MSE) of a lagged ensemble of monthly forecasts of the Niño 3.4 index from the Climate Forecast System (CFSv2) is examined with respect to ensemble size and configuration. Although the real-time forecast is initialized 4 times per day, it is possible to infer the MSE for arbitrary initialization frequency and for burst ensembles by fitting error covariances to a parametric model and then extrapolating to arbitrary ensemble size and initialization frequency. Applying this method to real-time forecasts, we find that the MSE consistently reaches a minimum for a lagged ensemble size between one and eight days, when four initializations per day are included. This ensemble size is consistent with the 8-10 day lagged ensemble configuration used operationally. Interestingly, the skill of both ensemble configurations is close to the estimated skill of the infinite ensemble. The skill of the weighted, lagged, and burst ensembles are found to be comparable. Certain unphysical features of the estimated error growth were tracked down to problems with the climatology and data discontinuities.
On the skill of various ensemble spread estimators for probabilistic short range wind forecasting
NASA Astrophysics Data System (ADS)
Kann, A.
2012-05-01
A variety of applications ranging from civil protection associated with severe weather to economical interests are heavily dependent on meteorological information. For example, a precise planning of the energy supply with a high share of renewables requires detailed meteorological information on high temporal and spatial resolution. With respect to wind power, detailed analyses and forecasts of wind speed are of crucial interest for the energy management. Although the applicability and the current skill of state-of-the-art probabilistic short range forecasts has increased during the last years, ensemble systems still show systematic deficiencies which limit its practical use. This paper presents methods to improve the ensemble skill of 10-m wind speed forecasts by combining deterministic information from a nowcasting system on very high horizontal resolution with uncertainty estimates from a limited area ensemble system. It is shown for a one month validation period that a statistical post-processing procedure (a modified non-homogeneous Gaussian regression) adds further skill to the probabilistic forecasts, especially beyond the nowcasting range after +6 h.
Water balance models in one-month-ahead streamflow forecasting
Alley, William M.
1985-01-01
Techniques are tested that incorporate information from water balance models in making 1-month-ahead streamflow forecasts in New Jersey. The results are compared to those based on simple autoregressive time series models. The relative performance of the models is dependent on the month of the year in question. The water balance models are most useful for forecasts of April and May flows. For the stations in northern New Jersey, the April and May forecasts were made in order of decreasing reliability using the water-balance-based approaches, using the historical monthly means, and using simple autoregressive models. The water balance models were useful to a lesser extent for forecasts during the fall months. For the rest of the year the improvements in forecasts over those obtained using the simpler autoregressive models were either very small or the simpler models provided better forecasts. When using the water balance models, monthly corrections for bias are found to improve minimum mean-square-error forecasts as well as to improve estimates of the forecast conditional distributions.
NASA Astrophysics Data System (ADS)
Curceac, S.; Ternynck, C.; Ouarda, T.
2015-12-01
Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach is in expressing the data as curves. In the present work, the focus is on daily forecasting and the functional observations (curves) express the daily measurements of the above mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting based on the bandwidth obtained by a cross validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayessian (BIC) Information Criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
NASA Astrophysics Data System (ADS)
Demaria, E. M.; Valdes, J. B.; Wi, S.; Serrat-Capdevila, A.; Valdés-Pineda, R.; Durcik, M.
2016-12-01
In under-instrumented basins around the world, accurate and timely forecasts of river streamflows have the potential of assisting water and natural resource managers in their management decisions. The Upper Zambezi river basin is the largest basin in southern Africa and its water resources are critical to sustainable economic growth and poverty reduction in eight riparian countries. We present a real-time streamflow forecast for the basin using a multi-model-multi-satellite approach that allows accounting for model and input uncertainties. Three distributed hydrologic models with different levels of complexity: VIC, HYMOD_DS, and HBV_DS are setup at a daily time step and a 0.25 degree spatial resolution for the basin. The hydrologic models are calibrated against daily observed streamflows at the Katima-Mulilo station using a Genetic Algorithm. Three real-time satellite products: Climate Prediction Center's morphing technique (CMORPH), Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN), and Tropical Rainfall Measuring Mission (TRMM-3B42RT) are bias-corrected with daily CHIRPS estimates. Uncertainty bounds for predicted flows are estimated with the Inverse Variance Weighting method. Because concentration times in the basin range from a few days to more than a week, we include the use of precipitation forecasts from the Global Forecasting System (GFS) to predict daily streamflows in the basin with a 10-days lead time. The skill of GFS-predicted streamflows is evaluated and the usefulness of the forecasts for short term water allocations is presented.
Forecasting the Change of Renal Stone Occurrence Rates in Astronauts
NASA Technical Reports Server (NTRS)
Myers, J.; Goodenow, D.; Gokoglu, S.; Kassemi, M.
2016-01-01
Changes in urine chemistry, during and post flight, potentially increases the risk of renal stones in astronauts. Although much is known about the effects of space flight on urine chemistry, no inflight incidences of renal stones in US astronauts exists and the question How much does this risk change with space flight? remains difficult to accurately quantify. In this discussion, we tackle this question utilizing a combination of deterministic and probabilistic modeling that implements the physics behind free stone growth and agglomeration, speciation of urine chemistry and published observations of population renal stone incidences to estimate changes in the rate of renal stone occurrence.
Sensitivity of Forecast Skill to Different Objective Analysis Schemes
NASA Technical Reports Server (NTRS)
Baker, W. E.
1979-01-01
Numerical weather forecasts are characterized by rapidly declining skill in the first 48 to 72 h. Recent estimates of the sources of forecast error indicate that the inaccurate specification of the initial conditions contributes substantially to this error. The sensitivity of the forecast skill to the initial conditions is examined by comparing a set of real-data experiments whose initial data were obtained with two different analysis schemes. Results are presented to emphasize the importance of the objective analysis techniques used in the assimilation of observational data.
A study comparison of two system model performance in estimated lifted index over Indonesia.
NASA Astrophysics Data System (ADS)
lestari, Juliana tri; Wandala, Agie
2018-05-01
Lifted index (LI) is one of atmospheric stability indices that used for thunderstorm forecasting. Numerical weather Prediction Models are essential for accurate weather forecast these day. This study has completed the attempt to compare the two NWP models these are Weather Research Forecasting (WRF) model and Global Forecasting System (GFS) model in estimates LI at 20 locations over Indonesia and verified the result with observation. Taylor diagram was used to comparing the models skill with shown the value of standard deviation, coefficient correlation and Root mean square error (RMSE). This study using the dataset on 00.00 UTC and 12.00 UTC during mid-March to Mid-April 2017. From the sample of LI distributions, both models have a tendency to overestimated LI value in almost all region in Indonesia while the WRF models has the better ability to catch the LI pattern distribution with observation than GFS model has. The verification result shows how both WRF and GFS model have such a weak relationship with observation except Eltari meteorologi station that its coefficient correlation reach almost 0.6 with the low RMSE value. Mean while WRF model have a better performance than GFS model. This study suggest that estimated LI of WRF model can provide the good performance for Thunderstorm forecasting over Indonesia in the future. However unsufficient relation between output models and observation in the certain location need a further investigation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xuejun; Tang, Qiuhong; Liu, Xingcai
Real-time monitoring and predicting drought development with several months in advance is of critical importance for drought risk adaptation and mitigation. In this paper, we present a drought monitoring and seasonal forecasting framework based on the Variable Infiltration Capacity (VIC) hydrologic model over Southwest China (SW). The satellite precipitation data are used to force VIC model for near real-time estimate of land surface hydrologic conditions. As initialized with satellite-aided monitoring, the climate model-based forecast (CFSv2_VIC) and ensemble streamflow prediction (ESP)-based forecast (ESP_VIC) are both performed and evaluated through their ability in reproducing the evolution of the 2009/2010 severe drought overmore » SW. The results show that the satellite-aided monitoring is able to provide reasonable estimate of forecast initial conditions (ICs) in a real-time manner. Both of CFSv2_VIC and ESP_VIC exhibit comparable performance against the observation-based estimates for the first month, whereas the predictive skill largely drops beyond 1-month. Compared to ESP_VIC, CFSv2_VIC shows better performance as indicated by the smaller ensemble range. This study highlights the value of this operational framework in generating near real-time ICs and giving a reliable prediction with 1-month ahead, which has great implications for drought risk assessment, preparation and relief.« less
Nambe Pueblo Water Budget and Forecasting model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brainard, James Robert
2009-10-01
This report documents The Nambe Pueblo Water Budget and Water Forecasting model. The model has been constructed using Powersim Studio (PS), a software package designed to investigate complex systems where flows and accumulations are central to the system. Here PS has been used as a platform for modeling various aspects of Nambe Pueblo's current and future water use. The model contains three major components, the Water Forecast Component, Irrigation Scheduling Component, and the Reservoir Model Component. In each of the components, the user can change variables to investigate the impacts of water management scenarios on future water use. The Watermore » Forecast Component includes forecasting for industrial, commercial, and livestock use. Domestic demand is also forecasted based on user specified current population, population growth rates, and per capita water consumption. Irrigation efficiencies are quantified in the Irrigated Agriculture component using critical information concerning diversion rates, acreages, ditch dimensions and seepage rates. Results from this section are used in the Water Demand Forecast, Irrigation Scheduling, and the Reservoir Model components. The Reservoir Component contains two sections, (1) Storage and Inflow Accumulations by Categories and (2) Release, Diversion and Shortages. Results from both sections are derived from the calibrated Nambe Reservoir model where historic, pre-dam or above dam USGS stream flow data is fed into the model and releases are calculated.« less
Aggregate Resource Inventory and Needs Forecast Study : Final Report
DOT National Transportation Integrated Search
2002-09-01
This study identified and inventoried ODOT-owned and leased aggregate sites throughout the state, assessing the : quality and estimated quantity of material. In addition, an aggregate needs forecast was prepared, projecting that : 60,801,320 Mg of ag...
NASA Astrophysics Data System (ADS)
Liang, Zhongmin; Li, Yujie; Hu, Yiming; Li, Binquan; Wang, Jun
2017-06-01
Accurate and reliable long-term forecasting plays an important role in water resources management and utilization. In this paper, a hybrid model called SVR-HUP is presented to predict long-term runoff and quantify the prediction uncertainty. The model is created based on three steps. First, appropriate predictors are selected according to the correlations between meteorological factors and runoff. Second, a support vector regression (SVR) model is structured and optimized based on the LibSVM toolbox and a genetic algorithm. Finally, using forecasted and observed runoff, a hydrologic uncertainty processor (HUP) based on a Bayesian framework is used to estimate the posterior probability distribution of the simulated values, and the associated uncertainty of prediction was quantitatively analyzed. Six precision evaluation indexes, including the correlation coefficient (CC), relative root mean square error (RRMSE), relative error (RE), mean absolute percentage error (MAPE), Nash-Sutcliffe efficiency (NSE), and qualification rate (QR), are used to measure the prediction accuracy. As a case study, the proposed approach is applied in the Han River basin, South Central China. Three types of SVR models are established to forecast the monthly, flood season and annual runoff volumes. The results indicate that SVR yields satisfactory accuracy and reliability at all three scales. In addition, the results suggest that the HUP cannot only quantify the uncertainty of prediction based on a confidence interval but also provide a more accurate single value prediction than the initial SVR forecasting result. Thus, the SVR-HUP model provides an alternative method for long-term runoff forecasting.
Magnetogram Forecast: An All-Clear Space Weather Forecasting System
NASA Technical Reports Server (NTRS)
Barghouty, Nasser; Falconer, David
2015-01-01
Solar flares and coronal mass ejections (CMEs) are the drivers of severe space weather. Forecasting the probability of their occurrence is critical in improving space weather forecasts. The National Oceanic and Atmospheric Administration (NOAA) currently uses the McIntosh active region category system, in which each active region on the disk is assigned to one of 60 categories, and uses the historical flare rates of that category to make an initial forecast that can then be adjusted by the NOAA forecaster. Flares and CMEs are caused by the sudden release of energy from the coronal magnetic field by magnetic reconnection. It is believed that the rate of flare and CME occurrence in an active region is correlated with the free energy of an active region. While the free energy cannot be measured directly with present observations, proxies of the free energy can instead be used to characterize the relative free energy of an active region. The Magnetogram Forecast (MAG4) (output is available at the Community Coordinated Modeling Center) was conceived and designed to be a databased, all-clear forecasting system to support the operational goals of NASA's Space Radiation Analysis Group. The MAG4 system automatically downloads nearreal- time line-of-sight Helioseismic and Magnetic Imager (HMI) magnetograms on the Solar Dynamics Observatory (SDO) satellite, identifies active regions on the solar disk, measures a free-energy proxy, and then applies forecasting curves to convert the free-energy proxy into predicted event rates for X-class flares, M- and X-class flares, CMEs, fast CMEs, and solar energetic particle events (SPEs). The forecast curves themselves are derived from a sample of 40,000 magnetograms from 1,300 active region samples, observed by the Solar and Heliospheric Observatory Michelson Doppler Imager. Figure 1 is an example of MAG4 visual output
NASA Astrophysics Data System (ADS)
Bhargava, K.; Kalnay, E.; Carton, J.; Yang, F.
2017-12-01
Systematic forecast errors, arising from model deficiencies, form a significant portion of the total forecast error in weather prediction models like the Global Forecast System (GFS). While much effort has been expended to improve models, substantial model error remains. The aim here is to (i) estimate the model deficiencies in the GFS that lead to systematic forecast errors, (ii) implement an online correction (i.e., within the model) scheme to correct GFS following the methodology of Danforth et al. [2007] and Danforth and Kalnay [2008, GRL]. Analysis Increments represent the corrections that new observations make on, in this case, the 6-hr forecast in the analysis cycle. Model bias corrections are estimated from the time average of the analysis increments divided by 6-hr, assuming that initial model errors grow linearly and first ignoring the impact of observation bias. During 2012-2016, seasonal means of the 6-hr model bias are generally robust despite changes in model resolution and data assimilation systems, and their broad continental scales explain their insensitivity to model resolution. The daily bias dominates the sub-monthly analysis increments and consists primarily of diurnal and semidiurnal components, also requiring a low dimensional correction. Analysis increments in 2015 and 2016 are reduced over oceans, which is attributed to improvements in the specification of the SSTs. These results encourage application of online correction, as suggested by Danforth and Kalnay, for mean, seasonal and diurnal and semidiurnal model biases in GFS to reduce both systematic and random errors. As the error growth in the short-term is still linear, estimated model bias corrections can be added as a forcing term in the model tendency equation to correct online. Preliminary experiments with GFS, correcting temperature and specific humidity online show reduction in model bias in 6-hr forecast. This approach can then be used to guide and optimize the design of sub-grid scale physical parameterizations, more accurate discretization of the model dynamics, boundary conditions, radiative transfer codes, and other potential model improvements which can then replace the empirical correction scheme. The analysis increments also provide guidance in testing new physical parameterizations.
James, Eric P.; Benjamin, Stanley G.; Marquis, Melinda
2016-10-28
A new gridded dataset for wind and solar resource estimation over the contiguous United States has been derived from hourly updated 1-h forecasts from the National Oceanic and Atmospheric Administration High-Resolution Rapid Refresh (HRRR) 3-km model composited over a three-year period (approximately 22 000 forecast model runs). The unique dataset features hourly data assimilation, and provides physically consistent wind and solar estimates for the renewable energy industry. The wind resource dataset shows strong similarity to that previously provided by a Department of Energy-funded study, and it includes estimates in southern Canada and northern Mexico. The solar resource dataset represents anmore » initial step towards application-specific fields such as global horizontal and direct normal irradiance. This combined dataset will continue to be augmented with new forecast data from the advanced HRRR atmospheric/land-surface model.« less
Anthropogenic range contractions bias species climate change forecasts
NASA Astrophysics Data System (ADS)
Faurby, Søren; Araújo, Miguel B.
2018-03-01
Forecasts of species range shifts under climate change most often rely on ecological niche models, in which characterizations of climate suitability are highly contingent on the species range data used. If ranges are far from equilibrium under current environmental conditions, for instance owing to local extinctions in otherwise suitable areas, modelled environmental suitability can be truncated, leading to biased estimates of the effects of climate change. Here we examine the impact of such biases on estimated risks from climate change by comparing models of the distribution of North American mammals based on current ranges with ranges accounting for historical information on species ranges. We find that estimated future diversity, almost everywhere, except in coastal Alaska, is drastically underestimated unless the full historical distribution of the species is included in the models. Consequently forecasts of climate change impacts on biodiversity for many clades are unlikely to be reliable without acknowledging anthropogenic influences on contemporary ranges.
Software forecasting as it is really done: A study of JPL software engineers
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.
1993-01-01
This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol Analysis and Markov process modeling were used to capture software engineer's forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and it was also found that the mental models cluster around three forecasting techniques. Further partitioning of the mental models revealed clustering of activities, that is very suggestive of a forecasting lifecycle. The different forecasting methods identified were based on the use of multiple-decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or an additional effort forecast. Virtually no subject used risk reduction steps in combination. The results of the analysis include: the identification of a core set of well defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.
NASA Astrophysics Data System (ADS)
Versini, Pierre-Antoine
2012-01-01
SummaryImportant damages occur in small headwater catchments when they are hit by severe storms with complex spatio-temporal structure, sometimes resulting in flash floods. As these catchments are mostly not covered by sensor networks, it is difficult to forecast these floods. This is particularly true for road submersions, representing major concerns for flood event managers. The use of Quantitative Precipitation Estimates and Forecasts (QPE/QPF) especially based on radar measurements could particularly be adequate to evaluate rainfall-induced risks. Although their characteristic time and space scales would make them suitable for flash flood modelling, the impact of their uncertainties remain uncertain and have to be evaluated. The Gard region (France) has been chosen as case study. This area is frequently affected by severe flash floods, and an application devoted to the road network has also been recently developed for the North part of this region. This warning system combines distributed hydro-meteorological modelling and susceptibility analysis to provide warnings of road inundations. The warning system has been tested on the specific storm of the 29-30 September 2007. During this event, around 200 mm dropped on the South part of the Gard and many roads were submerged. Radar-based QPE and QPF have been used to forecast the exact location of road submersions and the results have been compared to the effective road submersions actually occurred during the event as listed by the emergency services. Used on an area it has not been calibrated, the results confirm that the road submersion warning system represents a promising tool for anticipating and quantifying the consequences of storm events at ground. It rates the submersion risk with an acceptable level of accuracy and demonstrates also the quality of high spatial and temporal resolution radar rainfall data in real time, and the possibility to use them despite their uncertainties. However because of the quality of rainfall forecasts falls drastically with time, it is not often sufficient to provide valuable information for lead times exceeding 1 h.
Lai, C.; Tsay, T.-K.; Chien, C.-H.; Wu, I.-L.
2009-01-01
Researchers at the Hydroinformatic Research and Development Team (HIRDT) of the National Taiwan University undertook a project to create a real time flood forecasting model, with an aim to predict the current in the Tamsui River Basin. The model was designed based on deterministic approach with mathematic modeling of complex phenomenon, and specific parameter values operated to produce a discrete result. The project also devised a rainfall-stage model that relates the rate of rainfall upland directly to the change of the state of river, and is further related to another typhoon-rainfall model. The geographic information system (GIS) data, based on precise contour model of the terrain, estimate the regions that were perilous to flooding. The HIRDT, in response to the project's progress, also devoted their application of a deterministic model to unsteady flow of thermodynamics to help predict river authorities issue timely warnings and take other emergency measures.
Software reliability: Additional investigations into modeling with replicated experiments
NASA Technical Reports Server (NTRS)
Nagel, P. M.; Schotz, F. M.; Skirvan, J. A.
1984-01-01
The effects of programmer experience level, different program usage distributions, and programming languages are explored. All these factors affect performance, and some tentative relational hypotheses are presented. An analytic framework for replicated and non-replicated (traditional) software experiments is presented. A method of obtaining an upper bound on the error rate of the next error is proposed. The method was validated empirically by comparing forecasts with actual data. In all 14 cases the bound exceeded the observed parameter, albeit somewhat conservatively. Two other forecasting methods are proposed and compared to observed results. Although demonstrated relative to this framework that stages are neither independent nor exponentially distributed, empirical estimates show that the exponential assumption is nearly valid for all but the extreme tails of the distribution. Except for the dependence in the stage probabilities, Cox's model approximates to a degree what is being observed.
NASA Astrophysics Data System (ADS)
Rasim; Junaeti, E.; Wirantika, R.
2018-01-01
Accurate forecasting for the sale of a product depends on the forecasting method used. The purpose of this research is to build motorcycle sales forecasting application using Fuzzy Time Series method combined with interval determination using automatic clustering algorithm. Forecasting is done using the sales data of motorcycle sales in the last ten years. Then the error rate of forecasting is measured using Means Percentage Error (MPE) and Means Absolute Percentage Error (MAPE). The results of forecasting in the one-year period obtained in this study are included in good accuracy.
Performance of time-series methods in forecasting the demand for red blood cell transfusion.
Pereira, Arturo
2004-05-01
Planning the future blood collection efforts must be based on adequate forecasts of transfusion demand. In this study, univariate time-series methods were investigated for their performance in forecasting the monthly demand for RBCs at one tertiary-care, university hospital. Three time-series methods were investigated: autoregressive integrated moving average (ARIMA), the Holt-Winters family of exponential smoothing models, and one neural-network-based method. The time series consisted of the monthly demand for RBCs from January 1988 to December 2002 and was divided into two segments: the older one was used to fit or train the models, and the younger to test for the accuracy of predictions. Performance was compared across forecasting methods by calculating goodness-of-fit statistics, the percentage of months in which forecast-based supply would have met the RBC demand (coverage rate), and the outdate rate. The RBC transfusion series was best fitted by a seasonal ARIMA(0,1,1)(0,1,1)(12) model. Over 1-year time horizons, forecasts generated by ARIMA or exponential smoothing laid within the +/- 10 percent interval of the real RBC demand in 79 percent of months (62% in the case of neural networks). The coverage rate for the three methods was 89, 91, and 86 percent, respectively. Over 2-year time horizons, exponential smoothing largely outperformed the other methods. Predictions by exponential smoothing laid within the +/- 10 percent interval of real values in 75 percent of the 24 forecasted months, and the coverage rate was 87 percent. Over 1-year time horizons, predictions of RBC demand generated by ARIMA or exponential smoothing are accurate enough to be of help in the planning of blood collection efforts. For longer time horizons, exponential smoothing outperforms the other forecasting methods.
Geist, Eric L.; Titov, Vasily V.; Arcas, Diego; Pollitz, Fred F.; Bilek, Susan L.
2007-01-01
Results from different tsunami forecasting and hazard assessment models are compared with observed tsunami wave heights from the 26 December 2004 Indian Ocean tsunami. Forecast models are based on initial earthquake information and are used to estimate tsunami wave heights during propagation. An empirical forecast relationship based only on seismic moment provides a close estimate to the observed mean regional and maximum local tsunami runup heights for the 2004 Indian Ocean tsunami but underestimates mean regional tsunami heights at azimuths in line with the tsunami beaming pattern (e.g., Sri Lanka, Thailand). Standard forecast models developed from subfault discretization of earthquake rupture, in which deep- ocean sea level observations are used to constrain slip, are also tested. Forecast models of this type use tsunami time-series measurements at points in the deep ocean. As a proxy for the 2004 Indian Ocean tsunami, a transect of deep-ocean tsunami amplitudes recorded by satellite altimetry is used to constrain slip along four subfaults of the M >9 Sumatra–Andaman earthquake. This proxy model performs well in comparison to observed tsunami wave heights, travel times, and inundation patterns at Banda Aceh. Hypothetical tsunami hazard assessments models based on end- member estimates for average slip and rupture length (Mw 9.0–9.3) are compared with tsunami observations. Using average slip (low end member) and rupture length (high end member) (Mw 9.14) consistent with many seismic, geodetic, and tsunami inversions adequately estimates tsunami runup in most regions, except the extreme runup in the western Aceh province. The high slip that occurred in the southern part of the rupture zone linked to runup in this location is a larger fluctuation than expected from standard stochastic slip models. In addition, excess moment release (∼9%) deduced from geodetic studies in comparison to seismic moment estimates may generate additional tsunami energy, if the exponential time constant of slip is less than approximately 1 hr. Overall, there is significant variation in assessed runup heights caused by quantifiable uncertainty in both first-order source parameters (e.g., rupture length, slip-length scaling) and spatiotemporal complexity of earthquake rupture.
Identification and synthetic modeling of factors affecting American black duck populations
Conroy, Michael J.; Miller, Mark W.; Hines, James E.
2002-01-01
We reviewed the literature on factors potentially affecting the population status of American black ducks (Anas rupribes). Our review suggests that there is some support for the influence of 4 major, continental-scope factors in limiting or regulating black duck populations: 1) loss in the quantity or quality of breeding habitats; 2) loss in the quantity or quality of wintering habitats; 3) harvest, and 4) interactions (competition, hybridization) with mallards (Anas platyrhychos) during the breeding and/or wintering periods. These factors were used as the basis of an annual life cycle model in which reproduction rates and survival rates were modeled as functions of the above factors, with parameters of the model describing the strength of these relationships. Variation in the model parameter values allows for consideration of scientific uncertainty as to the degree each of these factors may be contributing to declines in black duck populations, and thus allows for the investigation of the possible effects of management (e.g., habitat improvement, harvest reductions) under different assumptions. We then used available, historical data on black duck populations (abundance, annual reproduction rates, and survival rates) and possible driving factors (trends in breeding and wintering habitats, harvest rates, and abundance of mallards) to estimate model parameters. Our estimated reproduction submodel included parameters describing negative density feedback of black ducks, positive influence of breeding habitat, and negative influence of mallard densities; our survival submodel included terms for positive influence of winter habitat on reproduction rates, and negative influences of black duck density (i.e., compensation to harvest mortality). Individual models within each group (reproduction, survival) involved various combinations of these factors, and each was given an information theoretic weight for use in subsequent prediction. The reproduction model with highest AIC weight (0.70) predicted black duck age ratios increasing as a function of decreasing mallard abundance and increasing acreage of breeding habitat; all models considered involved negative density dependence for black ducks. The survival model with highest AIC weight (0.51) predicted nonharvest survival increasing as a function of increasing acreage of wintering habitat and decreasing harvest rates (additive mortality); models involving compensatory mortality effects received ≈0.12 total weight, vs. 0.88 for additive models. We used the combined model, together with our historical data set, to perform a series of 1-year population forecasts, similar to those that might be performed under adaptive management. Initial model forecasts over-predicted observed breeding populations by ≈25%. Least-squares calibration reduced the bias to ≈0.5% under prediction. After calibration, model-averaged predictions over the 16 alternative models (4 reproduction × 4 survival, weighted by AIC model weights) explained 67% of the variation in annual breeding population abundance for black ducks, suggesting that it might have utility as a predictive tool in adaptive management. We investigated the effects of statistical uncertainty in parameter values on predicted population growth rates for the combined annual model, via sensitivity analyses. Parameter sensitivity varied in relation to the parameter values over the estimated confidence intervals, and in relation to harvest rates and mallard abundance. Forecasts of black duck abundance were extremely sensitive to variation in parameter values for the coefficients for breeding and wintering habitat effects. Model-averaged forecasts of black duck abundance were also sensitive to changes in harvest rate and mallard abundance, with rapid declines in black duck abundance predicted for a range of harvest rates and mallard abundance higher than current levels of either factor, but easily envisaged, particularly given current rates of growth for mallard populations. Because of concerns about sensitivity to habitat coefficients, and particularly in light of deficiencies in the historical data used to estimate these parameters, we developed a simplified model that excludes habitat effects. We also developed alternative models involving a calibration adjustment for reproduction rates, survival rates, or neither. Calibration of survival rates performed best (AIC weight 0.59, % BIAS = -0.280, R2=0.679), with reproduction calibration somewhat inferior (AIC weight 0.41, % BIAS = -0.267, R2=0.672); models without calibration received virtually no AIC weight and were discarded. We recommend that the simplified model set (4 biological models × 2 alternative calibration factors) be retained as the best working set of alternative models for research and management. Finally, we provide some preliminary guidance for the development of adaptive harvest management for black ducks, using our working set of models.
Yang, Wan; Karspeck, Alicia; Shaman, Jeffrey
2014-01-01
A variety of filtering methods enable the recursive estimation of system state variables and inference of model parameters. These methods have found application in a range of disciplines and settings, including engineering design and forecasting, and, over the last two decades, have been applied to infectious disease epidemiology. For any system of interest, the ideal filter depends on the nonlinearity and complexity of the model to which it is applied, the quality and abundance of observations being entrained, and the ultimate application (e.g. forecast, parameter estimation, etc.). Here, we compare the performance of six state-of-the-art filter methods when used to model and forecast influenza activity. Three particle filters—a basic particle filter (PF) with resampling and regularization, maximum likelihood estimation via iterated filtering (MIF), and particle Markov chain Monte Carlo (pMCMC)—and three ensemble filters—the ensemble Kalman filter (EnKF), the ensemble adjustment Kalman filter (EAKF), and the rank histogram filter (RHF)—were used in conjunction with a humidity-forced susceptible-infectious-recovered-susceptible (SIRS) model and weekly estimates of influenza incidence. The modeling frameworks, first validated with synthetic influenza epidemic data, were then applied to fit and retrospectively forecast the historical incidence time series of seven influenza epidemics during 2003–2012, for 115 cities in the United States. Results suggest that when using the SIRS model the ensemble filters and the basic PF are more capable of faithfully recreating historical influenza incidence time series, while the MIF and pMCMC do not perform as well for multimodal outbreaks. For forecast of the week with the highest influenza activity, the accuracies of the six model-filter frameworks are comparable; the three particle filters perform slightly better predicting peaks 1–5 weeks in the future; the ensemble filters are more accurate predicting peaks in the past. PMID:24762780
A prospective earthquake forecast experiment in the western Pacific
NASA Astrophysics Data System (ADS)
Eberhard, David A. J.; Zechar, J. Douglas; Wiemer, Stefan
2012-09-01
Since the beginning of 2009, the Collaboratory for the Study of Earthquake Predictability (CSEP) has been conducting an earthquake forecast experiment in the western Pacific. This experiment is an extension of the Kagan-Jackson experiments begun 15 years earlier and is a prototype for future global earthquake predictability experiments. At the beginning of each year, seismicity models make a spatially gridded forecast of the number of Mw≥ 5.8 earthquakes expected in the next year. For the three participating statistical models, we analyse the first two years of this experiment. We use likelihood-based metrics to evaluate the consistency of the forecasts with the observed target earthquakes and we apply measures based on Student's t-test and the Wilcoxon signed-rank test to compare the forecasts. Overall, a simple smoothed seismicity model (TripleS) performs the best, but there are some exceptions that indicate continued experiments are vital to fully understand the stability of these models, the robustness of model selection and, more generally, earthquake predictability in this region. We also estimate uncertainties in our results that are caused by uncertainties in earthquake location and seismic moment. Our uncertainty estimates are relatively small and suggest that the evaluation metrics are relatively robust. Finally, we consider the implications of our results for a global earthquake forecast experiment.
Statistical models for estimating daily streamflow in Michigan
Holtschlag, D.J.; Salehi, Habib
1992-01-01
Statistical models for estimating daily streamflow were analyzed for 25 pairs of streamflow-gaging stations in Michigan. Stations were paired by randomly choosing a station operated in 1989 at which 10 or more years of continuous flow data had been collected and at which flow is virtually unregulated; a nearby station was chosen where flow characteristics are similar. Streamflow data from the 25 randomly selected stations were used as the response variables; streamflow data at the nearby stations were used to generate a set of explanatory variables. Ordinary-least squares regression (OLSR) equations, autoregressive integrated moving-average (ARIMA) equations, and transfer function-noise (TFN) equations were developed to estimate the log transform of flow for the 25 randomly selected stations. The precision of each type of equation was evaluated on the basis of the standard deviation of the estimation errors. OLSR equations produce one set of estimation errors; ARIMA and TFN models each produce l sets of estimation errors corresponding to the forecast lead. The lead-l forecast is the estimate of flow l days ahead of the most recent streamflow used as a response variable in the estimation. In this analysis, the standard deviation of lead l ARIMA and TFN forecast errors were generally lower than the standard deviation of OLSR errors for l < 2 days and l < 9 days, respectively. Composite estimates were computed as a weighted average of forecasts based on TFN equations and backcasts (forecasts of the reverse-ordered series) based on ARIMA equations. The standard deviation of composite errors varied throughout the length of the estimation interval and generally was at maximum near the center of the interval. For comparison with OLSR errors, the mean standard deviation of composite errors were computed for intervals of length 1 to 40 days. The mean standard deviation of length-l composite errors were generally less than the standard deviation of the OLSR errors for l < 32 days. In addition, the composite estimates ensure a gradual transition between periods of estimated and measured flows. Model performance among stations of differing model error magnitudes were compared by computing ratios of the mean standard deviation of the length l composite errors to the standard deviation of OLSR errors. The mean error ratio for the set of 25 selected stations was less than 1 for intervals l < 32 days. Considering the frequency characteristics of the length of intervals of estimated record in Michigan, the effective mean error ratio for intervals < 30 days was 0.52. Thus, for intervals of estimation of 1 month or less, the error of the composite estimate is substantially lower than error of the OLSR estimate.
Skill in Precipitation Forecasting in the National Weather Service.
NASA Astrophysics Data System (ADS)
Charba, Jerome P.; Klein, William H.
1980-12-01
All known long-term records of forecasting performance for different types of precipitation forecasts in the National Weather Service were examined for relative skill and secular trends in skill. The largest upward trends were achieved by local probability of precipitation (PoP) forecasts for the periods 24-36 h and 36-48 h after 0000 and 1200 GMT. Over the last 13 years, the skill of these forecasts has improved at an average rate of 7.2% per 10-year interval. Over the same period, improvement has been smaller in local PoP skill in the 12-24 h range (2.0% per 10 years) and in the accuracy of "Yea/No" forecasts of measurable precipitation. The overall trend in accuracy of centralized quantitative precipitation forecasts of 0.5 in and 1.0 in has been slightly upward at the 0-24 h range and strongly upward at the 24-48 h range. Most of the improvement in these forecasts has been achieved from the early 1970s to the present. Strong upward accuracy trends in all types of precipitation forecasts within the past eight years are attributed primarily to improvements in numerical and statistical centralized guidance forecasts.The skill and accuracy of both measurable and quantitative precipitation forecasts is 35-55% greater during the cool season than during the warm season. Also, the secular rate of improvement of the cool season precipitation forecasts is 50-110% greater than that of the warm season. This seasonal difference in performance reflects the relative difficulty of forecasting predominantly stratiform precipitation of the cool season and convective precipitation of the warm season.
Kim, Moon H.; Morlock, Scott E.; Arihood, Leslie D.; Kiesler, James L.
2011-01-01
Near-real-time and forecast flood-inundation mapping products resulted from a pilot study for an 11-mile reach of the White River in Indianapolis. The study was done by the U.S. Geological Survey (USGS), Indiana Silver Jackets hazard mitigation taskforce members, the National Weather Service (NWS), the Polis Center, and Indiana University, in cooperation with the City of Indianapolis, the Indianapolis Museum of Art, the Indiana Department of Homeland Security, and the Indiana Department of Natural Resources, Division of Water. The pilot project showed that it is technically feasible to create a flood-inundation map library by means of a two-dimensional hydraulic model, use a map from the library to quickly complete a moderately detailed local flood-loss estimate, and automatically run the hydraulic model during a flood event to provide the maps and flood-damage information through a Web graphical user interface. A library of static digital flood-inundation maps was created by means of a calibrated two-dimensional hydraulic model. Estimated water-surface elevations were developed for a range of river stages referenced to a USGS streamgage and NWS flood forecast point colocated within the study reach. These maps were made available through the Internet in several formats, including geographic information system, Keyhole Markup Language, and Portable Document Format. A flood-loss estimate was completed for part of the study reach by using one of the flood-inundation maps from the static library. The Federal Emergency Management Agency natural disaster-loss estimation program HAZUS-MH, in conjunction with local building information, was used to complete a level 2 analysis of flood-loss estimation. A Service-Oriented Architecture-based dynamic flood-inundation application was developed and was designed to start automatically during a flood, obtain near real-time and forecast data (from the colocated USGS streamgage and NWS flood forecast point within the study reach), run the two-dimensional hydraulic model, and produce flood-inundation maps. The application used local building data and depth-damage curves to estimate flood losses based on the maps, and it served inundation maps and flood-loss estimates through a Web-based graphical user interface.
A new approach to the convective parameterization of the regional atmospheric model BRAMS
NASA Astrophysics Data System (ADS)
Dos Santos, A. F.; Freitas, S. R.; de Campos Velho, H. F.; Luz, E. F.; Gan, M. A.; de Mattos, J. Z.; Grell, G. A.
2013-05-01
The summer characteristics of January 2010 was performed using the atmospheric model Brazilian developments on the Regional Atmospheric Modeling System (BRAMS). The convective parameterization scheme of Grell and Dévényi was used to represent clouds and their interaction with the large scale environment. As a result, the precipitation forecasts can be combined in several ways, generating a numerical representation of precipitation and atmospheric heating and moistening rates. The purpose of this study was to generate a set of weights to compute a best combination of the hypothesis of the convective scheme. It is an inverse problem of parameter estimation and the problem is solved as an optimization problem. To minimize the difference between observed data and forecasted precipitation, the objective function was computed with the quadratic difference between five simulated precipitation fields and observation. The precipitation field estimated by the Tropical Rainfall Measuring Mission satellite was used as observed data. Weights were obtained using the firefly algorithm and the mass fluxes of each closure of the convective scheme were weighted generating a new set of mass fluxes. The results indicated the better skill of the model with the new methodology compared with the old ensemble mean calculation.
A model to assess the Mars Telecommunications Network relay robustness
NASA Technical Reports Server (NTRS)
Girerd, Andre R.; Meshkat, Leila; Edwards, Charles D., Jr.; Lee, Charles H.
2005-01-01
The relatively long mission durations and compatible radio protocols of current and projected Mars orbiters have enabled the gradual development of a heterogeneous constellation providing proximity communication services for surface assets. The current and forecasted capability of this evolving network has reached the point that designers of future surface missions consider complete dependence on it. Such designers, along with those architecting network requirements, have a need to understand the robustness of projected communication service. A model has been created to identify the robustness of the Mars Network as a function of surface location and time. Due to the decade-plus time horizon considered, the network will evolve, with emerging productive nodes and nodes that cease or fail to contribute. The model is a flexible framework to holistically process node information into measures of capability robustness that can be visualized for maximum understanding. Outputs from JPL's Telecom Orbit Analysis Simulation Tool (TOAST) provide global telecom performance parameters for current and projected orbiters. Probabilistic estimates of orbiter fuel life are derived from orbit keeping burn rates, forecasted maneuver tasking, and anomaly resolution budgets. Orbiter reliability is estimated probabilistically. A flexible scheduling framework accommodates the projected mission queue as well as potential alterations.
Model Error Estimation for the CPTEC Eta Model
NASA Technical Reports Server (NTRS)
Tippett, Michael K.; daSilva, Arlindo
1999-01-01
Statistical data assimilation systems require the specification of forecast and observation error statistics. Forecast error is due to model imperfections and differences between the initial condition and the actual state of the atmosphere. Practical four-dimensional variational (4D-Var) methods try to fit the forecast state to the observations and assume that the model error is negligible. Here with a number of simplifying assumption, a framework is developed for isolating the model error given the forecast error at two lead-times. Two definitions are proposed for the Talagrand ratio tau, the fraction of the forecast error due to model error rather than initial condition error. Data from the CPTEC Eta Model running operationally over South America are used to calculate forecast error statistics and lower bounds for tau.
Multi-parametric variational data assimilation for hydrological forecasting
NASA Astrophysics Data System (ADS)
Alvarado-Montero, R.; Schwanenberg, D.; Krahe, P.; Helmke, P.; Klein, B.
2017-12-01
Ensemble forecasting is increasingly applied in flow forecasting systems to provide users with a better understanding of forecast uncertainty and consequently to take better-informed decisions. A common practice in probabilistic streamflow forecasting is to force deterministic hydrological model with an ensemble of numerical weather predictions. This approach aims at the representation of meteorological uncertainty but neglects uncertainty of the hydrological model as well as its initial conditions. Complementary approaches use probabilistic data assimilation techniques to receive a variety of initial states or represent model uncertainty by model pools instead of single deterministic models. This paper introduces a novel approach that extends a variational data assimilation based on Moving Horizon Estimation to enable the assimilation of observations into multi-parametric model pools. It results in a probabilistic estimate of initial model states that takes into account the parametric model uncertainty in the data assimilation. The assimilation technique is applied to the uppermost area of River Main in Germany. We use different parametric pools, each of them with five parameter sets, to assimilate streamflow data, as well as remotely sensed data from the H-SAF project. We assess the impact of the assimilation in the lead time performance of perfect forecasts (i.e. observed data as forcing variables) as well as deterministic and probabilistic forecasts from ECMWF. The multi-parametric assimilation shows an improvement of up to 23% for CRPS performance and approximately 20% in Brier Skill Scores with respect to the deterministic approach. It also improves the skill of the forecast in terms of rank histogram and produces a narrower ensemble spread.
NASA Astrophysics Data System (ADS)
van Dijk, Albert I. J. M.; Peña-Arancibia, Jorge L.; Wood, Eric F.; Sheffield, Justin; Beck, Hylke E.
2013-05-01
Ideally, a seasonal streamflow forecasting system would ingest skilful climate forecasts and propagate these through calibrated hydrological models initialized with observed catchment conditions. At global scale, practical problems exist in each of these aspects. For the first time, we analyzed theoretical and actual skill in bimonthly streamflow forecasts from a global ensemble streamflow prediction (ESP) system. Forecasts were generated six times per year for 1979-2008 by an initialized hydrological model and an ensemble of 1° resolution daily climate estimates for the preceding 30 years. A post-ESP conditional sampling method was applied to 2.6% of forecasts, based on predictive relationships between precipitation and 1 of 21 climate indices prior to the forecast date. Theoretical skill was assessed against a reference run with historic forcing. Actual skill was assessed against streamflow records for 6192 small (<10,000 km2) catchments worldwide. The results show that initial catchment conditions provide the main source of skill. Post-ESP sampling enhanced skill in equatorial South America and Southeast Asia, particularly in terms of tercile probability skill, due to the persistence and influence of the El Niño Southern Oscillation. Actual skill was on average 54% of theoretical skill but considerably more for selected regions and times of year. The realized fraction of the theoretical skill probably depended primarily on the quality of precipitation estimates. Forecast skill could be predicted as the product of theoretical skill and historic model performance. Increases in seasonal forecast skill are likely to require improvement in the observation of precipitation and initial hydrological conditions.
NASA Astrophysics Data System (ADS)
Fehlmann, Michael; Gascón, Estíbaliz; Rohrer, Mario; Schwarb, Manfred; Stoffel, Markus
2018-05-01
The snowfall limit has important implications for different hazardous processes associated with prolonged or heavy precipitation such as flash floods, rain-on-snow events and freezing precipitation. To increase preparedness and to reduce risk in such situations, early warning systems are frequently used to monitor and predict precipitation events at different temporal and spatial scales. However, in alpine and pre-alpine valleys, the estimation of the snowfall limit remains rather challenging. In this study, we characterize uncertainties related to snowfall limit for different lead times based on local measurements of a vertically pointing micro rain radar (MRR) and a disdrometer in the Zulg valley, Switzerland. Regarding the monitoring, we show that the interpolation of surface temperatures tends to overestimate the altitude of the snowfall limit and can thus lead to highly uncertain estimates of liquid precipitation in the catchment. This bias is much smaller in the Integrated Nowcasting through Comprehensive Analysis (INCA) system, which integrates surface station and remotely sensed data as well as outputs of a numerical weather prediction model. To reduce systematic error, we perform a bias correction based on local MRR measurements and thereby demonstrate the added value of such measurements for the estimation of liquid precipitation in the catchment. Regarding the nowcasting, we show that the INCA system provides good estimates up to 6 h ahead and is thus considered promising for operational hydrological applications. Finally, we explore the medium-range forecasting of precipitation type, especially with respect to rain-on-snow events. We show for a selected case study that the probability for a certain precipitation type in an ensemble-based forecast is more persistent than the respective type in the high-resolution forecast (HRES) of the European Centre for Medium Range Weather Forecasts Integrated Forecasting System (ECMWF IFS). In this case study, the ensemble-based forecast could be used to anticipate such an event up to 7-8 days ahead, whereas the use of the HRES is limited to a lead time of 4-5 days. For the different lead times investigated, we point out possibilities of considering uncertainties in snowfall limit and precipitation type estimates so as to increase preparedness to risk situations.
PONS2train: tool for testing the MLP architecture and local traning methods for runoff forecast
NASA Astrophysics Data System (ADS)
Maca, P.; Pavlasek, J.; Pech, P.
2012-04-01
The purpose of presented poster is to introduce the PONS2train developed for runoff prediction via multilayer perceptron - MLP. The software application enables the implementation of 12 different MLP's transfer functions, comparison of 9 local training algorithms and finally the evaluation the MLP performance via 17 selected model evaluation metrics. The PONS2train software is written in C++ programing language. Its implementation consists of 4 classes. The NEURAL_NET and NEURON classes implement the MLP, the CRITERIA class estimates model evaluation metrics and for model performance evaluation via testing and validation datasets. The DATA_PATTERN class prepares the validation, testing and calibration datasets. The software application uses the LAPACK, BLAS and ARMADILLO C++ linear algebra libraries. The PONS2train implements the first order local optimization algorithms: standard on-line and batch back-propagation with learning rate combined with momentum and its variants with the regularization term, Rprop and standard batch back-propagation with variable momentum and learning rate. The second order local training algorithms represents: the Levenberg-Marquardt algorithm with and without regularization and four variants of scaled conjugate gradients. The other important PONS2train features are: the multi-run, the weight saturation control, early stopping of trainings, and the MLP weights analysis. The weights initialization is done via two different methods: random sampling from uniform distribution on open interval or Nguyen Widrow method. The data patterns can be transformed via linear and nonlinear transformation. The runoff forecast case study focuses on PONS2train implementation and shows the different aspects of the MLP training, the MLP architecture estimation, the neural network weights analysis and model uncertainty estimation.
Incidence of lip cancer in the male Norwegian agricultural population.
Nordby, K C; Andersen, A; Kristensen, P
2004-08-01
To explore lip cancer (LC) associations with work environmental exposures in a record-linkage study of Norwegian farmers. We hypothesize immunosuppressive substances (e.g. mycotoxins, pesticides) to influence LC incidence. A cohort of 131,243 male Norwegian farmers born 1925-1971 was established by cross-linkage of national registers and followed up through 1999 for incident LC, (ICD-7 site 140) in the Cancer Registry of Norway. Farm production data from agricultural censuses 1969-1979 and meteorological data on solar radiation and fungal forecasts (events of wet and temperate conditions known to favour fungal growth and mycotoxin formation) served as exposure proxies. Adjusted rate ratios (RR) and 95% confidence intervals (CI) were estimated using Poisson regression. We identified 108 LC cases (rate 4.4 per 100,000 person-years). We found LC to be moderately associated with horses on the farm (RR = 1.6, CI = 1.0-2.4), construction work employment (RR = 1.7, CI = 1.1-2.6), pesticide use (RR = 0.7, CI = 0.4-1.0), grain production (RR = 1.3, CI = 0.9-2.1) and increasing levels of fungal forecasts (RR = 1.6, CI = 0.9-2.8 in the highest two quartiles). Moderate associations of LC with grain production and fungal forecasts and the negative association with pesticide could possibly be explained by exposure to immunosuppressive mycotoxins. Some of the associations observed could be explained by solar exposure. Copyright 2004 Kluwer Academic Publishers
Global forecasts of urban expansion to 2030 and direct impacts on biodiversity and carbon pools.
Seto, Karen C; Güneralp, Burak; Hutyra, Lucy R
2012-10-02
Urban land-cover change threatens biodiversity and affects ecosystem productivity through loss of habitat, biomass, and carbon storage. However, despite projections that world urban populations will increase to nearly 5 billion by 2030, little is known about future locations, magnitudes, and rates of urban expansion. Here we develop spatially explicit probabilistic forecasts of global urban land-cover change and explore the direct impacts on biodiversity hotspots and tropical carbon biomass. If current trends in population density continue and all areas with high probabilities of urban expansion undergo change, then by 2030, urban land cover will increase by 1.2 million km(2), nearly tripling the global urban land area circa 2000. This increase would result in considerable loss of habitats in key biodiversity hotspots, with the highest rates of forecasted urban growth to take place in regions that were relatively undisturbed by urban development in 2000: the Eastern Afromontane, the Guinean Forests of West Africa, and the Western Ghats and Sri Lanka hotspots. Within the pan-tropics, loss in vegetation biomass from areas with high probability of urban expansion is estimated to be 1.38 PgC (0.05 PgC yr(-1)), equal to ∼5% of emissions from tropical deforestation and land-use change. Although urbanization is often considered a local issue, the aggregate global impacts of projected urban expansion will require significant policy changes to affect future growth trajectories to minimize global biodiversity and vegetation carbon losses.
Global forecasts of urban expansion to 2030 and direct impacts on biodiversity and carbon pools
Seto, Karen C.; Güneralp, Burak; Hutyra, Lucy R.
2012-01-01
Urban land-cover change threatens biodiversity and affects ecosystem productivity through loss of habitat, biomass, and carbon storage. However, despite projections that world urban populations will increase to nearly 5 billion by 2030, little is known about future locations, magnitudes, and rates of urban expansion. Here we develop spatially explicit probabilistic forecasts of global urban land-cover change and explore the direct impacts on biodiversity hotspots and tropical carbon biomass. If current trends in population density continue and all areas with high probabilities of urban expansion undergo change, then by 2030, urban land cover will increase by 1.2 million km2, nearly tripling the global urban land area circa 2000. This increase would result in considerable loss of habitats in key biodiversity hotspots, with the highest rates of forecasted urban growth to take place in regions that were relatively undisturbed by urban development in 2000: the Eastern Afromontane, the Guinean Forests of West Africa, and the Western Ghats and Sri Lanka hotspots. Within the pan-tropics, loss in vegetation biomass from areas with high probability of urban expansion is estimated to be 1.38 PgC (0.05 PgC yr−1), equal to ∼5% of emissions from tropical deforestation and land-use change. Although urbanization is often considered a local issue, the aggregate global impacts of projected urban expansion will require significant policy changes to affect future growth trajectories to minimize global biodiversity and vegetation carbon losses. PMID:22988086
Global Positioning System (GPS) Precipitable Water in Forecasting Lightning at Spaceport Canaveral
NASA Technical Reports Server (NTRS)
Kehrer, Kristen; Graf, Brian G.; Roeder, William
2005-01-01
Using meteorology data, focusing on precipitable water (PW), obtained during the 2000-2003 thunderstorm seasons in Central Florida, this paper will, one, assess the skill and accuracy measurements of the current Mazany forecasting tool and, two, provide additional forecasting tools that can be used in predicting lightning. Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) are located in east Central Florida. KSC and CCAFS process and launch manned (NASA Space Shuttle) and unmanned (NASA and Air Force Expendable Launch Vehicles) space vehicles. One of the biggest cost impacts is unplanned launch scrubs due to inclement weather conditions such as thunderstorms. Each launch delay/scrub costs over a quarter million dollars, and the need to land the Shuttle at another landing site and return to KSC costs approximately $ 1M. Given the amount of time lost and costs incurred, the ability to accurately forecast (predict) when lightning will occur can result in significant cost and time savings. All lightning prediction models were developed using binary logistic regression. Lightning is the dependent variable and is binary. The independent variables are the Precipitable Water (PW) value for a given time of the day, the change in PW up to 12 hours, the electric field mill value, and the K-index value. In comparing the Mazany model results for the 1999 period B against actual observations for the 2000-2003 thunderstorm seasons, differences were found in the False Alarm Rate (FAR), Probability of Detection (POD) and Hit Rate (H). On average, the False Alarm Rate (FAR) increased by 58%, the Probability of Detection (POD) decreased by 31% and the Hit Rate decreased by 20%. In comparing the performance of the 6 hour forecast period to the performance of the 1.5 hour forecast period for the Mazany model, the FAR was lower by 15% and the Hit Rate was higher by 7%. However, the POD for the 6 hour forecast period was lower by 16% as compared to the POD of the 1.5 hour forecast period. Neither forecast period performed at the accuracy measures expected. A 2-Hr Forecasting Tool was developed to support a Phase I Lightning Advisory, which requires a 30-minute lead time for predicting lightning.
SEE rate estimation based on diffusion approximation of charge collection
NASA Astrophysics Data System (ADS)
Sogoyan, Armen V.; Chumakov, Alexander I.; Smolin, Anatoly A.
2018-03-01
The integral rectangular parallelepiped (IRPP) method remains the main approach to single event rate (SER) prediction for aerospace systems, despite the growing number of issues impairing method's validity when applied to scaled technology nodes. One of such issues is uncertainty in parameters extraction in the IRPP method, which can lead to a spread of several orders of magnitude in the subsequently calculated SER. The paper presents an alternative approach to SER estimation based on diffusion approximation of the charge collection by an IC element and geometrical interpretation of SEE cross-section. In contrast to the IRPP method, the proposed model includes only two parameters which are uniquely determined from the experimental data for normal incidence irradiation at an ion accelerator. This approach eliminates the necessity of arbitrary decisions during parameter extraction and, thus, greatly simplifies calculation procedure and increases the robustness of the forecast.
Estimating the greenhouse gas benefits of forestry projects: A Costa Rican Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Busch, Christopher; Sathaye, Jayant; Sanchez Azofeifa, G. Arturo
If the Clean Development Mechanism proposed under the Kyoto Protocol is to serve as an effective means for combating global climate change, it will depend upon reliable estimates of greenhouse gas benefits. This paper sketches the theoretical basis for estimating the greenhouse gas benefits of forestry projects and suggests lessons learned based on a case study of Costa Rica's Protected Areas Project, which is a 500,000 hectare effort to reduce deforestation and enhance reforestation. The Protected Areas Project in many senses advances the state of the art for Clean Development Mechanism-type forestry projects, as does the third-party verification work ofmore » SGS International Certification Services on the project. Nonetheless, sensitivity analysis shows that carbon benefit estimates for the project vary widely based on the imputed deforestation rate in the baseline scenario, e.g. the deforestation rate expected if the project were not implemented. This, along with a newly available national dataset that confirms other research showing a slower rate of deforestation in Costa Rica, suggests that the use of the 1979--1992 forest cover data originally as the basis for estimating carbon savings should be reconsidered. When the newly available data is substituted, carbon savings amount to 8.9 Mt (million tones) of carbon, down from the original estimate of 15.7 Mt. The primary general conclusion is that project developers should give more attention to the forecasting land use and land cover change scenarios underlying estimates of greenhouse gas benefits.« less
United States Data Center Energy Usage Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shehabi, Arman; Smith, Sarah; Sartor, Dale
This report estimates historical data center electricity consumption back to 2000, relying on previous studies and historical shipment data, and forecasts consumption out to 2020 based on new trends and the most recent data available. Figure ES-1 provides an estimate of total U.S. data center electricity use (servers, storage, network equipment, and infrastructure) from 2000-2020. In 2014, data centers in the U.S. consumed an estimated 70 billion kWh, representing about 1.8% of total U.S. electricity consumption. Current study results show data center electricity consumption increased by about 4% from 2010-2014, a large shift from the 24% percent increase estimated frommore » 2005-2010 and the nearly 90% increase estimated from 2000-2005. Energy use is expected to continue slightly increasing in the near future, increasing 4% from 2014-2020, the same rate as the past five years. Based on current trend estimates, U.S. data centers are projected to consume approximately 73 billion kWh in 2020.« less
Burris, Lucy; Skagen, Susan K.
2013-01-01
Playa wetlands on the west-central Great Plains of North America are vulnerable to sediment infilling from upland agriculture, putting at risk several important ecosystem services as well as essential habitats and food resources of diverse wetland-dependent biota. Climate predictions for this semi-arid area indicate reduced precipitation which may alter rates of erosion, runoff, and sedimentation of playas. We forecasted erosion rates, sediment depths, and resultant playa wetland depths across the west-central Great Plains and examined the relative roles of land use context and projected changes in precipitation in the sedimentation process. We estimated erosion with the Revised Universal Soil Loss Equation (RUSLE) using historic values and downscaled precipitation predictions from three general circulation models and three emissions scenarios. We calibrated RUSLE results using field sediment measurements. RUSLE is appealing for regional scale modeling because it uses climate forecasts with monthly resolution and other widely available values including soil texture, slope and land use. Sediment accumulation rates will continue near historic levels through 2070 and will be sufficient to cause most playas (if not already filled) to fill with sediment within the next 100 years in the absence of mitigation. Land use surrounding the playa, whether grassland or tilled cropland, is more influential in sediment accumulation than climate-driven precipitation change.
Estimation and prediction of origin-destination matrices for I-66.
DOT National Transportation Integrated Search
2011-09-01
This project uses the Box-Jenkins time-series technique to model and forecast the traffic flows and then : uses the flow forecasts to predict the origin-destination matrices. First, a detailed analysis was conducted : to investigate the best data cor...
DOT National Transportation Integrated Search
2009-02-01
This working paper describes a group of techniques for disaggregating origin-destination tables : for truck forecasting that makes explicit use of observed traffic on a network. Six models within : the group are presented, each of which uses nonlinea...
Aggregate Auto Travel Forecasting : State of the Art and Suggestions for Future Research
DOT National Transportation Integrated Search
1976-12-01
The report reviews existing forecasting models of auto vehicle miles of travel (VMT), and presents evidence that such models incorrectly omit time cost and spatial form variables. The omission of these variables biases parameter estimates in existing...
Forecast and virtual weather driven plant disease risk modeling system
USDA-ARS?s Scientific Manuscript database
We describe a system in use and development that leverages public weather station data, several spatialized weather forecast types, leaf wetness estimation, generic plant disease models, and online statistical evaluation. Convergent technological developments in all these areas allow, with funding f...
Do We Know Whether Researchers and Reviewers are Estimating Risk and Benefit Accurately?
Hey, Spencer Phillips; Kimmelman, Jonathan
2016-10-01
Accurate estimation of risk and benefit is integral to good clinical research planning, ethical review, and study implementation. Some commentators have argued that various actors in clinical research systems are prone to biased or arbitrary risk/benefit estimation. In this commentary, we suggest the evidence supporting such claims is very limited. Most prior work has imputed risk/benefit beliefs based on past behavior or goals, rather than directly measuring them. We describe an approach - forecast analysis - that would enable direct and effective measure of the quality of risk/benefit estimation. We then consider some objections and limitations to the forecasting approach. © 2016 John Wiley & Sons Ltd.
Ability of matrix models to explain the past and predict the future of plant populations.
McEachern, Kathryn; Crone, Elizabeth E.; Ellis, Martha M.; Morris, William F.; Stanley, Amanda; Bell, Timothy; Bierzychudek, Paulette; Ehrlen, Johan; Kaye, Thomas N.; Knight, Tiffany M.; Lesica, Peter; Oostermeijer, Gerard; Quintana-Ascencio, Pedro F.; Ticktin, Tamara; Valverde, Teresa; Williams, Jennifer I.; Doak, Daniel F.; Ganesan, Rengaian; Thorpe, Andrea S.; Menges, Eric S.
2013-01-01
Uncertainty associated with ecological forecasts has long been recognized, but forecast accuracy is rarely quantified. We evaluated how well data on 82 populations of 20 species of plants spanning 3 continents explained and predicted plant population dynamics. We parameterized stage-based matrix models with demographic data from individually marked plants and determined how well these models forecast population sizes observed at least 5 years into the future. Simple demographic models forecasted population dynamics poorly; only 40% of observed population sizes fell within our forecasts' 95% confidence limits. However, these models explained population dynamics during the years in which data were collected; observed changes in population size during the data-collection period were strongly positively correlated with population growth rate. Thus, these models are at least a sound way to quantify population status. Poor forecasts were not associated with the number of individual plants or years of data. We tested whether vital rates were density dependent and found both positive and negative density dependence. However, density dependence was not associated with forecast error. Forecast error was significantly associated with environmental differences between the data collection and forecast periods. To forecast population fates, more detailed models, such as those that project how environments are likely to change and how these changes will affect population dynamics, may be needed. Such detailed models are not always feasible. Thus, it may be wiser to make risk-averse decisions than to expect precise forecasts from models.
Ability of matrix models to explain the past and predict the future of plant populations.
Crone, Elizabeth E; Ellis, Martha M; Morris, William F; Stanley, Amanda; Bell, Timothy; Bierzychudek, Paulette; Ehrlén, Johan; Kaye, Thomas N; Knight, Tiffany M; Lesica, Peter; Oostermeijer, Gerard; Quintana-Ascencio, Pedro F; Ticktin, Tamara; Valverde, Teresa; Williams, Jennifer L; Doak, Daniel F; Ganesan, Rengaian; McEachern, Kathyrn; Thorpe, Andrea S; Menges, Eric S
2013-10-01
Uncertainty associated with ecological forecasts has long been recognized, but forecast accuracy is rarely quantified. We evaluated how well data on 82 populations of 20 species of plants spanning 3 continents explained and predicted plant population dynamics. We parameterized stage-based matrix models with demographic data from individually marked plants and determined how well these models forecast population sizes observed at least 5 years into the future. Simple demographic models forecasted population dynamics poorly; only 40% of observed population sizes fell within our forecasts' 95% confidence limits. However, these models explained population dynamics during the years in which data were collected; observed changes in population size during the data-collection period were strongly positively correlated with population growth rate. Thus, these models are at least a sound way to quantify population status. Poor forecasts were not associated with the number of individual plants or years of data. We tested whether vital rates were density dependent and found both positive and negative density dependence. However, density dependence was not associated with forecast error. Forecast error was significantly associated with environmental differences between the data collection and forecast periods. To forecast population fates, more detailed models, such as those that project how environments are likely to change and how these changes will affect population dynamics, may be needed. Such detailed models are not always feasible. Thus, it may be wiser to make risk-averse decisions than to expect precise forecasts from models. © 2013 Society for Conservation Biology.
Forecasting financial asset processes: stochastic dynamics via learning neural networks.
Giebel, S; Rainer, M
2010-01-01
Models for financial asset dynamics usually take into account their inherent unpredictable nature by including a suitable stochastic component into their process. Unknown (forward) values of financial assets (at a given time in the future) are usually estimated as expectations of the stochastic asset under a suitable risk-neutral measure. This estimation requires the stochastic model to be calibrated to some history of sufficient length in the past. Apart from inherent limitations, due to the stochastic nature of the process, the predictive power is also limited by the simplifying assumptions of the common calibration methods, such as maximum likelihood estimation and regression methods, performed often without weights on the historic time series, or with static weights only. Here we propose a novel method of "intelligent" calibration, using learning neural networks in order to dynamically adapt the parameters of the stochastic model. Hence we have a stochastic process with time dependent parameters, the dynamics of the parameters being themselves learned continuously by a neural network. The back propagation in training the previous weights is limited to a certain memory length (in the examples we consider 10 previous business days), which is similar to the maximal time lag of autoregressive processes. We demonstrate the learning efficiency of the new algorithm by tracking the next-day forecasts for the EURTRY and EUR-HUF exchange rates each.
NASA Astrophysics Data System (ADS)
Fobair, Richard C., II
This research presents a model for forecasting the numbers of jobs created in the energy efficiency retrofit (EER) supply chain resulting from an investment in upgrading residential buildings in Florida. This investigation examined material supply chains stretching from mining to project installation for three product types: insulation, windows/doors, and heating, ventilating, and air conditioning (HVAC) systems. Outputs from the model are provided for the project, sales, manufacturing, and mining level. The model utilizes reverse-estimation to forecast the numbers of jobs that result from an investment. Reverse-estimation is a process that deconstructs a total investment into its constituent parts. In this research, an investment is deconstructed into profit, overhead, and hard costs for each level of the supply chain and over multiple iterations of inter-industry exchanges. The model processes an investment amount, the type of work and method of contracting into a prediction of the number of jobs created. The deconstruction process utilizes data from the U.S. Economic Census. At each supply chain level, the cost of labor is reconfigured into full-time equivalent (FTE) jobs (i.e. equivalent to 40 hours per week for 52 weeks) utilizing loaded labor rates and a typical employee mix. The model is sensitive to adjustable variables, such as percentage of work performed per type of product, allocation of worker time per skill level, annual hours for FTE calculations, wage rate, and benefits. This research provides several new insights into job creation. First, it provides definitions that can be used for future research on jobs in supply chains related to energy efficiency. Second, it provides a methodology for future investigators to calculate jobs in a supply chain resulting from an investment in energy efficiency upgrades to a building. The methodology used in this research is unique because it examines gross employment at the sub-industry level for specific commodities. Most research on employment examines the net employment change (job creation less job destruction) at levels for regions, industries, and the aggregate economy. Third, it provides a forecast of the numbers of jobs for an investment in energy efficiency over the entire supply chain for the selected industries and the job factors for major levels of the supply chain.
Planning Inmarsat's second generation of spacecraft
NASA Astrophysics Data System (ADS)
Williams, W. P.
1982-09-01
The next generation of studies of the Inmarsat service are outlined, such as traffic forecasting studies, communications capacity estimates, space segment design, cost estimates, and financial analysis. Traffic forecasting will require future demand estimates, and a computer model has been developed which estimates demand over the Atlantic, Pacific, and Indian ocean regions. Communications estimates are based on traffic estimates, as a model converts traffic demand into a required capacity figure for a given area. The Erlang formula is used, requiring additional data such as peak hour ratios and distribution estimates. Basic space segment technical requirements are outlined (communications payload, transponder arrangements, etc), and further design studies involve such areas as space segment configuration, launcher and spacecraft studies, transmission planning, and earth segment configurations. Cost estimates of proposed design parameters will be performed, but options must be reduced to make construction feasible. Finally, a financial analysis will be carried out in order to calculate financial returns.
Rate/state Coulomb stress transfer model for the CSEP Japan seismicity forecast
NASA Astrophysics Data System (ADS)
Toda, Shinji; Enescu, Bogdan
2011-03-01
Numerous studies retrospectively found that seismicity rate jumps (drops) by coseismic Coulomb stress increase (decrease). The Collaboratory for the Study of Earthquake Prediction (CSEP) instead provides us an opportunity for prospective testing of the Coulomb hypothesis. Here we adapt our stress transfer model incorporating rate and state dependent friction law to the CSEP Japan seismicity forecast. We demonstrate how to compute the forecast rates of large shocks in 2009 using the large earthquakes during the past 120 years. The time dependent impact of the coseismic stress perturbations explains qualitatively well the occurrence of the recent moderate size shocks. Such ability is partly similar to that of statistical earthquake clustering models. However, our model differs from them as follows: the off-fault aftershock zones can be simulated using finite fault sources; the regional areal patterns of triggered seismicity are modified by the dominant mechanisms of the potential sources; the imparted stresses due to large earthquakes produce stress shadows that lead to a reduction of the forecasted number of earthquakes. Although the model relies on several unknown parameters, it is the first physics based model submitted to the CSEP Japan test center and has the potential to be tuned for short-term earthquake forecasts.
Kusev, Petko; van Schaik, Paul; Tsaneva-Atanasova, Krasimira; Juliusson, Asgeir; Chater, Nick
2018-01-01
When attempting to predict future events, people commonly rely on historical data. One psychological characteristic of judgmental forecasting of time series, established by research, is that when people make forecasts from series, they tend to underestimate future values for upward trends and overestimate them for downward ones, so-called trend-damping (modeled by anchoring on, and insufficient adjustment from, the average of recent time series values). Events in a time series can be experienced sequentially (dynamic mode), or they can also be retrospectively viewed simultaneously (static mode), not experienced individually in real time. In one experiment, we studied the influence of presentation mode (dynamic and static) on two sorts of judgment: (a) predictions of the next event (forecast) and (b) estimation of the average value of all the events in the presented series (average estimation). Participants' responses in dynamic mode were anchored on more recent events than in static mode for all types of judgment but with different consequences; hence, dynamic presentation improved prediction accuracy, but not estimation. These results are not anticipated by existing theoretical accounts; we develop and present an agent-based model-the adaptive anchoring model (ADAM)-to account for the difference between processing sequences of dynamically and statically presented stimuli (visually presented data). ADAM captures how variation in presentation mode produces variation in responses (and the accuracy of these responses) in both forecasting and judgment tasks. ADAM's model predictions for the forecasting and judgment tasks fit better with the response data than a linear-regression time series model. Moreover, ADAM outperformed autoregressive-integrated-moving-average (ARIMA) and exponential-smoothing models, while neither of these models accounts for people's responses on the average estimation task. Copyright © 2017 The Authors. Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
Short-term ensemble radar rainfall forecasts for hydrological applications
NASA Astrophysics Data System (ADS)
Codo de Oliveira, M.; Rico-Ramirez, M. A.
2016-12-01
Flooding is a very common natural disaster around the world, putting local population and economy at risk. Forecasting floods several hours ahead and issuing warnings are of main importance to permit proper response in emergency situations. However, it is important to know the uncertainties related to the rainfall forecasting in order to produce more reliable forecasts. Nowcasting models (short-term rainfall forecasts) are able to produce high spatial and temporal resolution predictions that are useful in hydrological applications. Nonetheless, they are subject to uncertainties mainly due to the nowcasting model used, errors in radar rainfall estimation, temporal development of the velocity field and to the fact that precipitation processes such as growth and decay are not taken into account. In this study an ensemble generation scheme using rain gauge data as a reference to estimate radars errors is used to produce forecasts with up to 3h lead-time. The ensembles try to assess in a realistic way the residual uncertainties that remain even after correction algorithms are applied in the radar data. The ensembles produced are compered to a stochastic ensemble generator. Furthermore, the rainfall forecast output was used as an input in a hydrodynamic sewer network model and also in hydrological model for catchments of different sizes in north England. A comparative analysis was carried of how was carried out to assess how the radar uncertainties propagate into these models. The first named author is grateful to CAPES - Ciencia sem Fronteiras for funding this PhD research.
Short-term solar activity forecasting
NASA Technical Reports Server (NTRS)
Xie-Zhen, C.; Ai-Di, Z.
1979-01-01
A method of forecasting the level of activity of every active region on the surface of the Sun within one to three days is proposed in order to estimate the possibility of the occurrence of ionospheric disturbances and proton events. The forecasting method is a probability process based on statistics. In many of the cases, the accuracy in predicting the short term solar activity was in the range of 70%, although there were many false alarms.
Jeff Prestemon; David T. Butry; Douglas S. Thomas
2016-01-01
Research shows that some categories of human-ignited wildfires may be forecastable, owing to their temporal clustering, with the possibility that resources could be predeployed to help reduce the incidence of such wildfires. We estimated several kinds of incendiary and other human-ignited wildfire forecast models at the weekly time step for tribal land units in the...
NASA Astrophysics Data System (ADS)
Aoi, S.; Yamamoto, N.; Suzuki, W.; Hirata, K.; Nakamura, H.; Kunugi, T.; Kubo, T.; Maeda, T.
2015-12-01
In the 2011 Tohoku earthquake, in which huge tsunami claimed a great deal of lives, the initial tsunami forecast based on hypocenter information estimated using seismic data on land were greatly underestimated. From this lesson, NIED is now constructing S-net (Seafloor Observation Network for Earthquakes and Tsunamis along the Japan Trench) which consists of 150 ocean bottom observatories with seismometers and pressure gauges (tsunamimeters) linked by fiber optic cables. To take full advantage of S-net, we develop a new methodology of real-time tsunami inundation forecast using ocean bottom observation data and construct a prototype system that implements the developed forecasting method for the Pacific coast of Chiba prefecture (Sotobo area). We employ a database-based approach because inundation is a strongly non-linear phenomenon and its calculation costs are rather heavy. We prepare tsunami scenario bank in advance, by constructing the possible tsunami sources, and calculating the tsunami waveforms at S-net stations, coastal tsunami heights and tsunami inundation on land. To calculate the inundation for target Sotobo area, we construct the 10-m-mesh precise elevation model with coastal structures. Based on the sensitivities analyses, we construct the tsunami scenario bank that efficiently covers possible tsunami scenarios affecting the Sotobo area. A real-time forecast is carried out by selecting several possible scenarios which can well explain real-time tsunami data observed at S-net from tsunami scenario bank. An advantage of our method is that tsunami inundations are estimated directly from the actual tsunami data without any source information, which may have large estimation errors. In addition to the forecast system, we develop Web services, APIs, and smartphone applications and brush them up through social experiments to provide the real-time tsunami observation and forecast information in easy way to understand toward urging people to evacuate.
NASA Astrophysics Data System (ADS)
Kolotii, Andrii; Kussul, Nataliia; Skakun, Sergii; Shelestov, Andrii; Ostapenko, Vadim; Oliinyk, Tamara
2015-04-01
Efficient and timely crop monitoring and yield forecasting are important tasks for ensuring of stability and sustainable economic development [1]. As winter crops pay prominent role in agriculture of Ukraine - the main focus of this study is concentrated on winter wheat. In our previous research [2, 3] it was shown that usage of biophysical parameters of crops such as FAPAR (derived from Geoland-2 portal as for SPOT Vegetation data) is far more efficient for crop yield forecasting to NDVI derived from MODIS data - for available data. In our current work efficiency of usage such biophysical parameters as LAI, FAPAR, FCOVER (derived from SPOT Vegetation and PROBA-V data at resolution of 1 km and simulated within WOFOST model) and NDVI product (derived from MODIS) for winter wheat monitoring and yield forecasting is estimated. As the part of crop monitoring workflow (vegetation anomaly detection, vegetation indexes and products analysis) and yield forecasting SPIRITS tool developed by JRC is used. Statistics extraction is done for landcover maps created in SRI within FP-7 SIGMA project. Efficiency of usage satellite based and modelled with WOFOST model biophysical products is estimated. [1] N. Kussul, S. Skakun, A. Shelestov, O. Kussul, "Sensor Web approach to Flood Monitoring and Risk Assessment", in: IGARSS 2013, 21-26 July 2013, Melbourne, Australia, pp. 815-818. [2] F. Kogan, N. Kussul, T. Adamenko, S. Skakun, O. Kravchenko, O. Kryvobok, A. Shelestov, A. Kolotii, O. Kussul, and A. Lavrenyuk, "Winter wheat yield forecasting in Ukraine based on Earth observation, meteorological data and biophysical models," International Journal of Applied Earth Observation and Geoinformation, vol. 23, pp. 192-203, 2013. [3] Kussul O., Kussul N., Skakun S., Kravchenko O., Shelestov A., Kolotii A, "Assessment of relative efficiency of using MODIS data to winter wheat yield forecasting in Ukraine", in: IGARSS 2013, 21-26 July 2013, Melbourne, Australia, pp. 3235 - 3238.
Parametric decadal climate forecast recalibration (DeFoReSt 1.0)
NASA Astrophysics Data System (ADS)
Pasternack, Alexander; Bhend, Jonas; Liniger, Mark A.; Rust, Henning W.; Müller, Wolfgang A.; Ulbrich, Uwe
2018-01-01
Near-term climate predictions such as decadal climate forecasts are increasingly being used to guide adaptation measures. For near-term probabilistic predictions to be useful, systematic errors of the forecasting systems have to be corrected. While methods for the calibration of probabilistic forecasts are readily available, these have to be adapted to the specifics of decadal climate forecasts including the long time horizon of decadal climate forecasts, lead-time-dependent systematic errors (drift) and the errors in the representation of long-term changes and variability. These features are compounded by small ensemble sizes to describe forecast uncertainty and a relatively short period for which typically pairs of reforecasts and observations are available to estimate calibration parameters. We introduce the Decadal Climate Forecast Recalibration Strategy (DeFoReSt), a parametric approach to recalibrate decadal ensemble forecasts that takes the above specifics into account. DeFoReSt optimizes forecast quality as measured by the continuous ranked probability score (CRPS). Using a toy model to generate synthetic forecast observation pairs, we demonstrate the positive effect on forecast quality in situations with pronounced and limited predictability. Finally, we apply DeFoReSt to decadal surface temperature forecasts from the MiKlip prototype system and find consistent, and sometimes considerable, improvements in forecast quality compared with a simple calibration of the lead-time-dependent systematic errors.
NASA Astrophysics Data System (ADS)
Gagnon, Patrick; Rousseau, Alain N.; Charron, Dominique; Fortin, Vincent; Audet, René
2017-11-01
Several businesses and industries rely on rainfall forecasts to support their day-to-day operations. To deal with the uncertainty associated with rainfall forecast, some meteorological organisations have developed products, such as ensemble forecasts. However, due to the intensive computational requirements of ensemble forecasts, the spatial resolution remains coarse. For example, Environment and Climate Change Canada's (ECCC) Global Ensemble Prediction System (GEPS) data is freely available on a 1-degree grid (about 100 km), while those of the so-called High Resolution Deterministic Prediction System (HRDPS) are available on a 2.5-km grid (about 40 times finer). Potential users are then left with the option of using either a high-resolution rainfall forecast without uncertainty estimation and/or an ensemble with a spectrum of plausible rainfall values, but at a coarser spatial scale. The objective of this study was to evaluate the added value of coupling the Gibbs Sampling Disaggregation Model (GSDM) with ECCC products to provide accurate, precise and consistent rainfall estimates at a fine spatial resolution (10-km) within a forecast framework (6-h). For 30, 6-h, rainfall events occurring within a 40,000-km2 area (Québec, Canada), results show that, using 100-km aggregated reference rainfall depths as input, statistics of the rainfall fields generated by GSDM were close to those of the 10-km reference field. However, in forecast mode, GSDM outcomes inherit of the ECCC forecast biases, resulting in a poor performance when GEPS data were used as input, mainly due to the inherent rainfall depth distribution of the latter product. Better performance was achieved when the Regional Deterministic Prediction System (RDPS), available on a 10-km grid and aggregated at 100-km, was used as input to GSDM. Nevertheless, most of the analyzed ensemble forecasts were weakly consistent. Some areas of improvement are identified herein.
NASA Astrophysics Data System (ADS)
Engeland, Kolbjorn; Steinsland, Ingelin
2014-05-01
This study introduces a methodology for the construction of probabilistic inflow forecasts for multiple catchments and lead times, and investigates criterions for evaluation of multi-variate forecasts. A post-processing approach is used, and a Gaussian model is applied for transformed variables. The post processing model has two main components, the mean model and the dependency model. The mean model is used to estimate the marginal distributions for forecasted inflow for each catchment and lead time, whereas the dependency models was used to estimate the full multivariate distribution of forecasts, i.e. co-variances between catchments and lead times. In operational situations, it is a straightforward task to use the models to sample inflow ensembles which inherit the dependencies between catchments and lead times. The methodology was tested and demonstrated in the river systems linked to the Ulla-Førre hydropower complex in southern Norway, where simultaneous probabilistic forecasts for five catchments and ten lead times were constructed. The methodology exhibits sufficient flexibility to utilize deterministic flow forecasts from a numerical hydrological model as well as statistical forecasts such as persistent forecasts and sliding window climatology forecasts. It also deals with variation in the relative weights of these forecasts with both catchment and lead time. When evaluating predictive performance in original space using cross validation, the case study found that it is important to include the persistent forecast for the initial lead times and the hydrological forecast for medium-term lead times. Sliding window climatology forecasts become more important for the latest lead times. Furthermore, operationally important features in this case study such as heteroscedasticity, lead time varying between lead time dependency and lead time varying between catchment dependency are captured. Two criterions were used for evaluating the added value of the dependency model. The first one was the Energy score (ES) that is a multi-dimensional generalization of continuous rank probability score (CRPS). ES was calculated for all lead-times and catchments together, for each catchment across all lead times and for each lead time across all catchments. The second criterion was to use CRPS for forecasted inflows accumulated over several lead times and catchments. The results showed that ES was not very sensitive to correct covariance structure, whereas CRPS for accumulated flows where more suitable for evaluating the dependency model. This indicates that it is more appropriate to evaluate relevant univariate variables that depends on the dependency structure then to evaluate the multivariate forecast directly.
Lee, Jared A.; Hacker, Joshua P.; Monache, Luca Delle; ...
2016-08-03
A current barrier to greater deployment of offshore wind turbines is the poor quality of numerical weather prediction model wind and turbulence forecasts over open ocean. The bulk of development for atmospheric boundary layer (ABL) parameterization schemes has focused on land, partly due to a scarcity of observations over ocean. The 100-m FINO1 tower in the North Sea is one of the few sources worldwide of atmospheric profile observations from the sea surface to turbine hub height. These observations are crucial to developing a better understanding and modeling of physical processes in the marine ABL. In this paper we usemore » the WRF single column model (SCM), coupled with an ensemble Kalman filter from the Data Assimilation Research Testbed (DART), to create 100-member ensembles at the FINO1 location. The goal of this study is to determine the extent to which model parameter estimation can improve offshore wind forecasts. Combining two datasets that provide lateral forcing for the SCM and two methods for determining z 0, the time-varying sea-surface roughness length, we conduct four WRF-SCM/DART experiments over the October-December 2006 period. The two methods for determining z 0 are the default Fairall-adjusted Charnock formulation in WRF, and using parameter estimation techniques to estimate z 0 in DART. Using DART to estimate z 0 is found to reduce 1-h forecast errors of wind speed over the Charnock-Fairall z 0 ensembles by 4%–22%. Finally, however, parameter estimation of z 0 does not simultaneously reduce turbulent flux forecast errors, indicating limitations of this approach and the need for new marine ABL parameterizations.« less
Flood Nowcasting With Linear Catchment Models, Radar and Kalman Filters
NASA Astrophysics Data System (ADS)
Pegram, Geoff; Sinclair, Scott
A pilot study using real time rainfall data as input to a parsimonious linear distributed flood forecasting model is presented. The aim of the study is to deliver an operational system capable of producing flood forecasts, in real time, for the Mgeni and Mlazi catchments near the city of Durban in South Africa. The forecasts can be made at time steps which are of the order of a fraction of the catchment response time. To this end, the model is formulated in Finite Difference form in an equation similar to an Auto Regressive Moving Average (ARMA) model; it is this formulation which provides the required computational efficiency. The ARMA equation is a discretely coincident form of the State-Space equations that govern the response of an arrangement of linear reservoirs. This results in a functional relationship between the reservoir response con- stants and the ARMA coefficients, which guarantees stationarity of the ARMA model. Input to the model is a combined "Best Estimate" spatial rainfall field, derived from a combination of weather RADAR and Satellite rainfield estimates with point rain- fall given by a network of telemetering raingauges. Several strategies are employed to overcome the uncertainties associated with forecasting. Principle among these are the use of optimal (double Kalman) filtering techniques to update the model states and parameters in response to current streamflow observations and the application of short term forecasting techniques to provide future estimates of the rainfield as input to the model.
An operational procedure for rapid flood risk assessment in Europe
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Kalas, Milan; Salamon, Peter; Bianchi, Alessandra; Alfieri, Lorenzo; Feyen, Luc
2017-07-01
The development of methods for rapid flood mapping and risk assessment is a key step to increase the usefulness of flood early warning systems and is crucial for effective emergency response and flood impact mitigation. Currently, flood early warning systems rarely include real-time components to assess potential impacts generated by forecasted flood events. To overcome this limitation, this study describes the benchmarking of an operational procedure for rapid flood risk assessment based on predictions issued by the European Flood Awareness System (EFAS). Daily streamflow forecasts produced for major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in terms of flood-prone areas, economic damage and affected population, infrastructures and cities.An extensive testing of the operational procedure has been carried out by analysing the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-based and report-based flood extent data, while modelled estimates of economic damage and affected population are compared against ground-based estimations. Finally, we evaluate the skill of risk estimates derived from EFAS flood forecasts with different lead times and combinations of probabilistic forecasts. Results highlight the potential of the real-time operational procedure in helping emergency response and management.
Projections of the current and future disease burden of hepatitis C virus infection in Malaysia.
McDonald, Scott A; Dahlui, Maznah; Mohamed, Rosmawati; Naning, Herlianna; Shabaruddin, Fatiha Hana; Kamarulzaman, Adeeba
2015-01-01
The prevalence of hepatitis C virus (HCV) infection in Malaysia has been estimated at 2.5% of the adult population. Our objective, satisfying one of the directives of the WHO Framework for Global Action on Viral Hepatitis, was to forecast the HCV disease burden in Malaysia using modelling methods. An age-structured multi-state Markov model was developed to simulate the natural history of HCV infection. We tested three historical incidence scenarios that would give rise to the estimated prevalence in 2009, and calculated the incidence of cirrhosis, end-stage liver disease, and death, and disability-adjusted life-years (DALYs) under each scenario, to the year 2039. In the baseline scenario, current antiviral treatment levels were extended from 2014 to the end of the simulation period. To estimate the disease burden averted under current sustained virological response rates and treatment levels, the baseline scenario was compared to a counterfactual scenario in which no past or future treatment is assumed. In the baseline scenario, the projected disease burden for the year 2039 is 94,900 DALYs/year (95% credible interval (CrI): 77,100 to 124,500), with 2,002 (95% CrI: 1340 to 3040) and 540 (95% CrI: 251 to 1,030) individuals predicted to develop decompensated cirrhosis and hepatocellular carcinoma, respectively, in that year. Although current treatment practice is estimated to avert a cumulative total of 2,200 deaths from DC or HCC, a cumulative total of 63,900 HCV-related deaths is projected by 2039. The HCV-related disease burden is already high and is forecast to rise steeply over the coming decades under current levels of antiviral treatment. Increased governmental resources to improve HCV screening and treatment rates and to reduce transmission are essential to address the high projected HCV disease burden in Malaysia.
Projections of the Current and Future Disease Burden of Hepatitis C Virus Infection in Malaysia
McDonald, Scott A.; Dahlui, Maznah; Mohamed, Rosmawati; Naning, Herlianna; Shabaruddin, Fatiha Hana; Kamarulzaman, Adeeba
2015-01-01
Background The prevalence of hepatitis C virus (HCV) infection in Malaysia has been estimated at 2.5% of the adult population. Our objective, satisfying one of the directives of the WHO Framework for Global Action on Viral Hepatitis, was to forecast the HCV disease burden in Malaysia using modelling methods. Methods An age-structured multi-state Markov model was developed to simulate the natural history of HCV infection. We tested three historical incidence scenarios that would give rise to the estimated prevalence in 2009, and calculated the incidence of cirrhosis, end-stage liver disease, and death, and disability-adjusted life-years (DALYs) under each scenario, to the year 2039. In the baseline scenario, current antiviral treatment levels were extended from 2014 to the end of the simulation period. To estimate the disease burden averted under current sustained virological response rates and treatment levels, the baseline scenario was compared to a counterfactual scenario in which no past or future treatment is assumed. Results In the baseline scenario, the projected disease burden for the year 2039 is 94,900 DALYs/year (95% credible interval (CrI): 77,100 to 124,500), with 2,002 (95% CrI: 1340 to 3040) and 540 (95% CrI: 251 to 1,030) individuals predicted to develop decompensated cirrhosis and hepatocellular carcinoma, respectively, in that year. Although current treatment practice is estimated to avert a cumulative total of 2,200 deaths from DC or HCC, a cumulative total of 63,900 HCV-related deaths is projected by 2039. Conclusions The HCV-related disease burden is already high and is forecast to rise steeply over the coming decades under current levels of antiviral treatment. Increased governmental resources to improve HCV screening and treatment rates and to reduce transmission are essential to address the high projected HCV disease burden in Malaysia. PMID:26042425
Evaluation of Pollen Apps Forecasts: The Need for Quality Control in an eHealth Service.
Bastl, Katharina; Berger, Uwe; Kmenta, Maximilian
2017-05-08
Pollen forecasts are highly valuable for allergen avoidance and thus raising the quality of life of persons concerned by pollen allergies. They are considered as valuable free services for the public. Careful scientific evaluation of pollen forecasts in terms of accurateness and reliability has not been available till date. The aim of this study was to analyze 9 mobile apps, which deliver pollen information and pollen forecasts, with a focus on their accurateness regarding the prediction of the pollen load in the grass pollen season 2016 to assess their usefulness for pollen allergy sufferers. The following number of apps was evaluated for each location: 3 apps for Vienna (Austria), 4 apps for Berlin (Germany), and 1 app each for Basel (Switzerland) and London (United Kingdom). All mobile apps were freely available. Today's grass pollen forecast was compared throughout the defined grass pollen season at each respective location with measured grass pollen concentrations. Hit rates were calculated for the exact performance and for a tolerance in a range of ±2 and ±4 pollen per cubic meter. In general, for most apps, hit rates score around 50% (6 apps). It was found that 1 app showed better results, whereas 3 apps performed less well. Hit rates increased when calculated with tolerances for most apps. In contrast, the forecast for the "readiness to flower" for grasses was performed at a sufficiently accurate level, although only two apps provided such a forecast. The last of those forecasts coincided with the first moderate grass pollen load on the predicted day or 3 days after and performed even from about a month before well within the range of 3 days. Advertisement was present in 3 of the 9 analyzed apps, whereas an imprint mentioning institutions with experience in pollen forecasting was present in only three other apps. The quality of pollen forecasts is in need of improvement, and quality control for pollen forecasts is recommended to avoid potential harm to pollen allergy sufferers due to inadequate forecasts. The inclusion of information on reliability of provided forecasts and a similar handling regarding probabilistic weather forecasts should be considered. ©Katharina Bastl, Uwe Berger, Maximilian Kmenta. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 08.05.2017.
Evaluation of Pollen Apps Forecasts: The Need for Quality Control in an eHealth Service
Berger, Uwe; Kmenta, Maximilian
2017-01-01
Background Pollen forecasts are highly valuable for allergen avoidance and thus raising the quality of life of persons concerned by pollen allergies. They are considered as valuable free services for the public. Careful scientific evaluation of pollen forecasts in terms of accurateness and reliability has not been available till date. Objective The aim of this study was to analyze 9 mobile apps, which deliver pollen information and pollen forecasts, with a focus on their accurateness regarding the prediction of the pollen load in the grass pollen season 2016 to assess their usefulness for pollen allergy sufferers. Methods The following number of apps was evaluated for each location: 3 apps for Vienna (Austria), 4 apps for Berlin (Germany), and 1 app each for Basel (Switzerland) and London (United Kingdom). All mobile apps were freely available. Today’s grass pollen forecast was compared throughout the defined grass pollen season at each respective location with measured grass pollen concentrations. Hit rates were calculated for the exact performance and for a tolerance in a range of ±2 and ±4 pollen per cubic meter. Results In general, for most apps, hit rates score around 50% (6 apps). It was found that 1 app showed better results, whereas 3 apps performed less well. Hit rates increased when calculated with tolerances for most apps. In contrast, the forecast for the “readiness to flower” for grasses was performed at a sufficiently accurate level, although only two apps provided such a forecast. The last of those forecasts coincided with the first moderate grass pollen load on the predicted day or 3 days after and performed even from about a month before well within the range of 3 days. Advertisement was present in 3 of the 9 analyzed apps, whereas an imprint mentioning institutions with experience in pollen forecasting was present in only three other apps. Conclusions The quality of pollen forecasts is in need of improvement, and quality control for pollen forecasts is recommended to avoid potential harm to pollen allergy sufferers due to inadequate forecasts. The inclusion of information on reliability of provided forecasts and a similar handling regarding probabilistic weather forecasts should be considered. PMID:28483740
NASA Astrophysics Data System (ADS)
Ismail, A.; Hassan, Noor I.
2013-09-01
Cancer is one of the principal causes of death in Malaysia. This study was performed to determine the pattern of rate of cancer deaths at a public hospital in Malaysia over an 11 year period from year 2001 to 2011, to determine the best fitted model of forecasting the rate of cancer deaths using Univariate Modeling and to forecast the rates for the next two years (2012 to 2013). The medical records of the death of patients with cancer admitted at this Hospital over 11 year's period were reviewed, with a total of 663 cases. The cancers were classified according to 10th Revision International Classification of Diseases (ICD-10). Data collected include socio-demographic background of patients such as registration number, age, gender, ethnicity, ward and diagnosis. Data entry and analysis was accomplished using SPSS 19.0 and Minitab 16.0. The five Univariate Models used were Naïve with Trend Model, Average Percent Change Model (ACPM), Single Exponential Smoothing, Double Exponential Smoothing and Holt's Method. The overall 11 years rate of cancer deaths showed that at this hospital, Malay patients have the highest percentage (88.10%) compared to other ethnic groups with males (51.30%) higher than females. Lung and breast cancer have the most number of cancer deaths among gender. About 29.60% of the patients who died due to cancer were aged 61 years old and above. The best Univariate Model used for forecasting the rate of cancer deaths is Single Exponential Smoothing Technique with alpha of 0.10. The forecast for the rate of cancer deaths shows a horizontally or flat value. The forecasted mortality trend remains at 6.84% from January 2012 to December 2013. All the government and private sectors and non-governmental organizations need to highlight issues on cancer especially lung and breast cancers to the public through campaigns using mass media, media electronics, posters and pamphlets in the attempt to decrease the rate of cancer deaths in Malaysia.
Transportation economics and energy
NASA Astrophysics Data System (ADS)
Soltani Sobh, Ali
The overall objective of this research is to study the impacts of technology improvement including fuel efficiency increment, extending the use of natural gas vehicle and electric vehicles on key parameters of transportation. In the first chapter, a simple economic analysis is used in order to demonstrate the adoption rate of natural gas vehicles as an alternative fuel vehicle. The effect of different factors on adoption rate of commuters is calculated in sensitivity analysis. In second chapter the VMT is modeled and forecasted under influence of CNG vehicles in different scenarios. The VMT modeling is based on the time series data for Washington State. In order to investigate the effect of population growth on VMT, the per capita model is also developed. In third chapter the effect of fuel efficiency improvement on fuel tax revenue and greenhouse emission is examined. The model is developed based on time series data of Washington State. The rebound effect resulted from fuel efficiency improvement is estimated and is considered in fuel consumption forecasting. The reduction in fuel tax revenue and greenhouse gas (GHG) emissions as two outcomes of lower fuel consumption are computed. In addition, the proper fuel tax rate to restitute the revenue is suggested. In the fourth chapter effective factors on electric vehicles (EV) adoption is discussed. The constructed model is aggregated binomial logit share model that estimates the modal split between EV and conventional vehicles for different states over time. Various factors are incorporated in the utility function as explanatory variables in order to quantify their effect on EV adoption choices. The explanatory variables include income, VMT, electricity price, gasoline price, urban area and number of EV stations.
A multifactor approach to forecasting Romanian gross domestic product (GDP) in the short run.
Armeanu, Daniel; Andrei, Jean Vasile; Lache, Leonard; Panait, Mirela
2017-01-01
The purpose of this paper is to investigate the application of a generalized dynamic factor model (GDFM) based on dynamic principal components analysis to forecasting short-term economic growth in Romania. We have used a generalized principal components approach to estimate a dynamic model based on a dataset comprising 86 economic and non-economic variables that are linked to economic output. The model exploits the dynamic correlations between these variables and uses three common components that account for roughly 72% of the information contained in the original space. We show that it is possible to generate reliable forecasts of quarterly real gross domestic product (GDP) using just the common components while also assessing the contribution of the individual variables to the dynamics of real GDP. In order to assess the relative performance of the GDFM to standard models based on principal components analysis, we have also estimated two Stock-Watson (SW) models that were used to perform the same out-of-sample forecasts as the GDFM. The results indicate significantly better performance of the GDFM compared with the competing SW models, which empirically confirms our expectations that the GDFM produces more accurate forecasts when dealing with large datasets.
Snowmelt runoff modeling in simulation and forecasting modes with the Martinec-Mango model
NASA Technical Reports Server (NTRS)
Shafer, B.; Jones, E. B.; Frick, D. M. (Principal Investigator)
1982-01-01
The Martinec-Rango snowmelt runoff model was applied to two watersheds in the Rio Grande basin, Colorado-the South Fork Rio Grande, a drainage encompassing 216 sq mi without reservoirs or diversions and the Rio Grande above Del Norte, a drainage encompassing 1,320 sq mi without major reservoirs. The model was successfully applied to both watersheds when run in a simulation mode for the period 1973-79. This period included both high and low runoff seasons. Central to the adaptation of the model to run in a forecast mode was the need to develop a technique to forecast the shape of the snow cover depletion curves between satellite data points. Four separate approaches were investigated-simple linear estimation, multiple regression, parabolic exponential, and type curve. Only the parabolic exponential and type curve methods were run on the South Fork and Rio Grande watersheds for the 1980 runoff season using satellite snow cover updates when available. Although reasonable forecasts were obtained in certain situations, neither method seemed ready for truly operational forecasts, possibly due to a large amount of estimated climatic data for one or two primary base stations during the 1980 season.
A multifactor approach to forecasting Romanian gross domestic product (GDP) in the short run
Armeanu, Daniel; Lache, Leonard; Panait, Mirela
2017-01-01
The purpose of this paper is to investigate the application of a generalized dynamic factor model (GDFM) based on dynamic principal components analysis to forecasting short-term economic growth in Romania. We have used a generalized principal components approach to estimate a dynamic model based on a dataset comprising 86 economic and non-economic variables that are linked to economic output. The model exploits the dynamic correlations between these variables and uses three common components that account for roughly 72% of the information contained in the original space. We show that it is possible to generate reliable forecasts of quarterly real gross domestic product (GDP) using just the common components while also assessing the contribution of the individual variables to the dynamics of real GDP. In order to assess the relative performance of the GDFM to standard models based on principal components analysis, we have also estimated two Stock-Watson (SW) models that were used to perform the same out-of-sample forecasts as the GDFM. The results indicate significantly better performance of the GDFM compared with the competing SW models, which empirically confirms our expectations that the GDFM produces more accurate forecasts when dealing with large datasets. PMID:28742100
Forecasting the remaining reservoir capacity in the Laurentian Great Lakes watershed
NASA Astrophysics Data System (ADS)
Alighalehbabakhani, Fatemeh; Miller, Carol J.; Baskaran, Mark; Selegean, James P.; Barkach, John H.; Dahl, Travis; Abkenar, Seyed Mohsen Sadatiyan
2017-12-01
Sediment accumulation behind a dam is a significant factor in reservoir operation and watershed management. There are many dams located within the Laurentian Great Lakes watershed whose operations have been adversely affected by excessive reservoir sedimentation. Reservoir sedimentation effects include reduction of flood control capability and limitations to both water supply withdrawals and power generation due to reduced reservoir storage. In this research, the sediment accumulation rates of twelve reservoirs within the Great Lakes watershed were evaluated using the Soil and Water Assessment Tool (SWAT). The estimated sediment accumulation rates by SWAT were compared to estimates relying on radionuclide dating of sediment cores and bathymetric survey methods. Based on the sediment accumulation rate, the remaining reservoir capacity for each study site was estimated. Evaluation of the anthropogenic impacts including land use change and dam construction on the sediment yield were assessed in this research. The regression analysis was done on the current and pre-European settlement sediment yield for the modeled watersheds to predict the current and natural sediment yield in un-modeled watersheds. These eleven watersheds are in the state of Indiana, Michigan, Ohio, New York, and Wisconsin.
Forecasting European Droughts using the North American Multi-Model Ensemble (NMME)
NASA Astrophysics Data System (ADS)
Thober, Stephan; Kumar, Rohini; Samaniego, Luis; Sheffield, Justin; Schäfer, David; Mai, Juliane
2015-04-01
Soil moisture droughts have the potential to diminish crop yields causing economic damage or even threatening the livelihood of societies. State-of-the-art drought forecasting systems incorporate seasonal meteorological forecasts to estimate future drought conditions. Meteorological forecasting skill (in particular that of precipitation), however, is limited to a few weeks because of the chaotic behaviour of the atmosphere. One of the most important challenges in drought forecasting is to understand how the uncertainty in the atmospheric forcings (e.g., precipitation and temperature) is further propagated into hydrologic variables such as soil moisture. The North American Multi-Model Ensemble (NMME) provides the latest collection of a multi-institutional seasonal forecasting ensemble for precipitation and temperature. In this study, we analyse the skill of NMME forecasts for predicting European drought events. The monthly NMME forecasts are downscaled to daily values to force the mesoscale hydrological model (mHM). The mHM soil moisture forecasts obtained with the forcings of the dynamical models are then compared against those obtained with the Ensemble Streamflow Prediction (ESP) approach. ESP recombines historical meteorological forcings to create a new ensemble forecast. Both forecasts are compared against reference soil moisture conditions obtained using observation based meteorological forcings. The study is conducted for the period from 1982 to 2009 and covers a large part of the Pan-European domain (10°W to 40°E and 35°N to 55°N). Results indicate that NMME forecasts are better at predicting the reference soil moisture variability as compared to ESP. For example, NMME explains 50% of the variability in contrast to only 31% by ESP at a six-month lead time. The Equitable Threat Skill Score (ETS), which combines the hit and false alarm rates, is analysed for drought events using a 0.2 threshold of a soil moisture percentile index. On average, the NMME based ensemble forecasts have consistently higher skill than the ESP based ones (ETS of 13% as compared to 5% at a six-month lead time). Additionally, the ETS ensemble spread of NMME forecasts is considerably narrower than that of ESP; the lower boundary of the NMME ensemble spread coincides most of the time with the ensemble median of ESP. Among the NMME models, NCEP-CFSv2 outperforms the other models in terms of ETS most of the time. Removing the three worst performing models does not deteriorate the ensemble performance (neither in skill nor in spread), but would substantially reduce the computational resources required in an operational forecasting system. For major European drought events (e.g., 1990, 1992, 2003, and 2007), NMME forecasts tend to underestimate area under drought and drought magnitude during times of drought development. During drought recovery, this underestimation is weaker for area under drought or even reversed into an overestimation for drought magnitude. This indicates that the NMME models are too wet during drought development and too dry during drought recovery. In summary, soil moisture drought forecasts by NMME are more skillful than those of an ESP based approach. However, they still show systematic biases in reproducing the observed drought dynamics during drought development and recovery.
NASA Astrophysics Data System (ADS)
Tanioka, Yuichiro
2017-04-01
After tsunami disaster due to the 2011 Tohoku-oki great earthquake, improvement of the tsunami forecast has been an urgent issue in Japan. National Institute of Disaster Prevention is installing a cable network system of earthquake and tsunami observation (S-NET) at the ocean bottom along the Japan and Kurile trench. This cable system includes 125 pressure sensors (tsunami meters) which are separated by 30 km. Along the Nankai trough, JAMSTEC already installed and operated the cable network system of seismometers and pressure sensors (DONET and DONET2). Those systems are the most dense observation network systems on top of source areas of great underthrust earthquakes in the world. Real-time tsunami forecast has depended on estimation of earthquake parameters, such as epicenter, depth, and magnitude of earthquakes. Recently, tsunami forecast method has been developed using the estimation of tsunami source from tsunami waveforms observed at the ocean bottom pressure sensors. However, when we have many pressure sensors separated by 30km on top of the source area, we do not need to estimate the tsunami source or earthquake source to compute tsunami. Instead, we can initiate a tsunami simulation from those dense tsunami observed data. Observed tsunami height differences with a time interval at the ocean bottom pressure sensors separated by 30 km were used to estimate tsunami height distribution at a particular time. In our new method, tsunami numerical simulation was initiated from those estimated tsunami height distribution. In this paper, the above method is improved and applied for the tsunami generated by the 2011 Tohoku-oki great earthquake. Tsunami source model of the 2011 Tohoku-oki great earthquake estimated using observed tsunami waveforms, coseimic deformation observed by GPS and ocean bottom sensors by Gusman et al. (2012) is used in this study. The ocean surface deformation is computed from the source model and used as an initial condition of tsunami simulation. By assuming that this computed tsunami is a real tsunami and observed at ocean bottom sensors, new tsunami simulation is carried out using the above method. The station distribution (each station is separated by 15 min., about 30 km) observed tsunami waveforms which were actually computed from the source model. Tsunami height distributions are estimated from the above method at 40, 80, and 120 seconds after the origin time of the earthquake. The Near-field Tsunami Inundation forecast method (Gusman et al. 2014) was used to estimate the tsunami inundation along the Sanriku coast. The result shows that the observed tsunami inundation was well explained by those estimated inundation. This also shows that it takes about 10 minutes to estimate the tsunami inundation from the origin time of the earthquake. This new method developed in this paper is very effective for a real-time tsunami forecast.
2014-12-01
anticyclone. Vertical wind shear was low, while a moderate level of upper level diffluence existed. The minimum sea level pressure ( SLP ) was estimated...pre-Sinlaku disturbance. At this time, JTWC estimated maximum surface level winds to be 15 to 20 kt, with a SLP near 1005 hPa. 17 Figure 11...poleward side of the circulation. Surface winds had increased to near 23 kt as the SLP continued to fall to 1004 hPa. JTWC forecasters upgraded the
NASA Astrophysics Data System (ADS)
Abunama, Taher; Othman, Faridah
2017-06-01
Analysing the fluctuations of wastewater inflow rates in sewage treatment plants (STPs) is essential to guarantee a sufficient treatment of wastewater before discharging it to the environment. The main objectives of this study are to statistically analyze and forecast the wastewater inflow rates into the Bandar Tun Razak STP in Kuala Lumpur, Malaysia. A time series analysis of three years’ weekly influent data (156weeks) has been conducted using the Auto-Regressive Integrated Moving Average (ARIMA) model. Various combinations of ARIMA orders (p, d, q) have been tried to select the most fitted model, which was utilized to forecast the wastewater inflow rates. The linear regression analysis was applied to testify the correlation between the observed and predicted influents. ARIMA (3, 1, 3) model was selected with the highest significance R-square and lowest normalized Bayesian Information Criterion (BIC) value, and accordingly the wastewater inflow rates were forecasted to additional 52weeks. The linear regression analysis between the observed and predicted values of the wastewater inflow rates showed a positive linear correlation with a coefficient of 0.831.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Jared A.; Hacker, Joshua P.; Monache, Luca Delle
A current barrier to greater deployment of offshore wind turbines is the poor quality of numerical weather prediction model wind and turbulence forecasts over open ocean. The bulk of development for atmospheric boundary layer (ABL) parameterization schemes has focused on land, partly due to a scarcity of observations over ocean. The 100-m FINO1 tower in the North Sea is one of the few sources worldwide of atmospheric profile observations from the sea surface to turbine hub height. These observations are crucial to developing a better understanding and modeling of physical processes in the marine ABL. In this paper we usemore » the WRF single column model (SCM), coupled with an ensemble Kalman filter from the Data Assimilation Research Testbed (DART), to create 100-member ensembles at the FINO1 location. The goal of this study is to determine the extent to which model parameter estimation can improve offshore wind forecasts. Combining two datasets that provide lateral forcing for the SCM and two methods for determining z 0, the time-varying sea-surface roughness length, we conduct four WRF-SCM/DART experiments over the October-December 2006 period. The two methods for determining z 0 are the default Fairall-adjusted Charnock formulation in WRF, and using parameter estimation techniques to estimate z 0 in DART. Using DART to estimate z 0 is found to reduce 1-h forecast errors of wind speed over the Charnock-Fairall z 0 ensembles by 4%–22%. Finally, however, parameter estimation of z 0 does not simultaneously reduce turbulent flux forecast errors, indicating limitations of this approach and the need for new marine ABL parameterizations.« less
Evaluation and economic value of winter weather forecasts
NASA Astrophysics Data System (ADS)
Snyder, Derrick W.
State and local highway agencies spend millions of dollars each year to deploy winter operation teams to plow snow and de-ice roadways. Accurate and timely weather forecast information is critical for effective decision making. Students from Purdue University partnered with the Indiana Department of Transportation to create an experimental winter weather forecast service for the 2012-2013 winter season in Indiana to assist in achieving these goals. One forecast product, an hourly timeline of winter weather hazards produced daily, was evaluated for quality and economic value. Verification of the forecasts was performed with data from the Rapid Refresh numerical weather model. Two objective verification criteria were developed to evaluate the performance of the timeline forecasts. Using both criteria, the timeline forecasts had issues with reliability and discrimination, systematically over-forecasting the amount of winter weather that was observed while also missing significant winter weather events. Despite these quality issues, the forecasts still showed significant, but varied, economic value compared to climatology. Economic value of the forecasts was estimated to be 29.5 million or 4.1 million, depending on the verification criteria used. Limitations of this valuation system are discussed and a framework is developed for more thorough studies in the future.
A comparative analysis of errors in long-term econometric forecasts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tepel, R.
1986-04-01
The growing body of literature that documents forecast accuracy falls generally into two parts. The first is prescriptive and is carried out by modelers who use simulation analysis as a tool for model improvement. These studies are ex post, that is, they make use of known values for exogenous variables and generate an error measure wholly attributable to the model. The second type of analysis is descriptive and seeks to measure errors, identify patterns among errors and variables and compare forecasts from different sources. Most descriptive studies use an ex ante approach, that is, they evaluate model outputs based onmore » estimated (or forecasted) exogenous variables. In this case, it is the forecasting process, rather than the model, that is under scrutiny. This paper uses an ex ante approach to measure errors in forecast series prepared by Data Resources Incorporated (DRI), Wharton Econometric Forecasting Associates (Wharton), and Chase Econometrics (Chase) and to determine if systematic patterns of errors can be discerned between services, types of variables (by degree of aggregation), length of forecast and time at which the forecast is made. Errors are measured as the percent difference between actual and forecasted values for the historical period of 1971 to 1983.« less
A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method.
Yang, Jun-He; Cheng, Ching-Hsue; Chan, Chia-Pan
2017-01-01
Reservoirs are important for households and impact the national economy. This paper proposed a time-series forecasting model based on estimating a missing value followed by variable selection to forecast the reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets are concatenated into an integrated dataset based on ordering of the data as a research dataset. The proposed time-series forecasting model summarily has three foci. First, this study uses five imputation methods to directly delete the missing value. Second, we identified the key variable via factor analysis and then deleted the unimportant variables sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level. This was done to compare with the listing method under the forecasting error. These experimental results indicate that the Random Forest forecasting model when applied to variable selection with full variables has better forecasting performance than the listing model. In addition, this experiment shows that the proposed variable selection can help determine five forecast methods used here to improve the forecasting capability.
Forecast horizon of multi-item dynamic lot size model with perishable inventory.
Jing, Fuying; Lan, Zirui
2017-01-01
This paper studies a multi-item dynamic lot size problem for perishable products where stock deterioration rates and inventory costs are age-dependent. We explore structural properties in an optimal solution under two cost structures and develop a dynamic programming algorithm to solve the problem in polynomial time when the number of products is fixed. We establish forecast horizon results that can help the operation manager to decide the precise forecast horizon in a rolling decision-making process. Finally, based on a detailed test bed of instance, we obtain useful managerial insights on the impact of deterioration rate and lifetime of products on the length of forecast horizon.
Forecast horizon of multi-item dynamic lot size model with perishable inventory
Jing, Fuying
2017-01-01
This paper studies a multi-item dynamic lot size problem for perishable products where stock deterioration rates and inventory costs are age-dependent. We explore structural properties in an optimal solution under two cost structures and develop a dynamic programming algorithm to solve the problem in polynomial time when the number of products is fixed. We establish forecast horizon results that can help the operation manager to decide the precise forecast horizon in a rolling decision-making process. Finally, based on a detailed test bed of instance, we obtain useful managerial insights on the impact of deterioration rate and lifetime of products on the length of forecast horizon. PMID:29125856
Using Bayes Model Averaging for Wind Power Forecasts
NASA Astrophysics Data System (ADS)
Preede Revheim, Pål; Beyer, Hans Georg
2014-05-01
For operational purposes predictions of the forecasts of the lumped output of groups of wind farms spread over larger geographic areas will often be of interest. A naive approach is to make forecasts for each individual site and sum them up to get the group forecast. It is however well documented that a better choice is to use a model that also takes advantage of spatial smoothing effects. It might however be the case that some sites tends to more accurately reflect the total output of the region, either in general or for certain wind directions. It will then be of interest giving these a greater influence over the group forecast. Bayesian model averaging (BMA) is a statistical post-processing method for producing probabilistic forecasts from ensembles. Raftery et al. [1] show how BMA can be used for statistical post processing of forecast ensembles, producing PDFs of future weather quantities. The BMA predictive PDF of a future weather quantity is a weighted average of the ensemble members' PDFs, where the weights can be interpreted as posterior probabilities and reflect the ensemble members' contribution to overall forecasting skill over a training period. In Revheim and Beyer [2] the BMA procedure used in Sloughter, Gneiting and Raftery [3] were found to produce fairly accurate PDFs for the future mean wind speed of a group of sites from the single sites wind speeds. However, when the procedure was attempted applied to wind power it resulted in either problems with the estimation of the parameters (mainly caused by longer consecutive periods of no power production) or severe underestimation (mainly caused by problems with reflecting the power curve). In this paper the problems that arose when applying BMA to wind power forecasting is met through two strategies. First, the BMA procedure is run with a combination of single site wind speeds and single site wind power production as input. This solves the problem with longer consecutive periods where the input data does not contain information, but it has the disadvantage of nearly doubling the number of model parameters to be estimated. Second, the BMA procedure is run with group mean wind power as the response variable instead of group mean wind speed. This also solves the problem with longer consecutive periods without information in the input data, but it leaves the power curve to also be estimated from the data. [1] Raftery, A. E., et al. (2005). Using Bayesian Model Averaging to Calibrate Forecast Ensembles. Monthly Weather Review, 133, 1155-1174. [2]Revheim, P. P. and H. G. Beyer (2013). Using Bayesian Model Averaging for wind farm group forecasts. EWEA Wind Power Forecasting Technology Workshop,Rotterdam, 4-5 December 2013. [3]Sloughter, J. M., T. Gneiting and A. E. Raftery (2010). Probabilistic Wind Speed Forecasting Using Ensembles and Bayesian Model Averaging. Journal of the American Statistical Association, Vol. 105, No. 489, 25-35
Flash floods in June and July 2009 in the Czech Republic
NASA Astrophysics Data System (ADS)
Sercl, Petr; Danhelka, Jan; Tyl, Radovan
2010-05-01
Several flash floods occurred in the territory of the Czech Republic during the last decade of June and beginning of July 2009. These events caused vast economic damage and unfortunately there were also 15 fatalities. The complete evaluation of flash floods from the point of view of its meteorological cause, hydrological development and impacts was done under the responsibility of Ministry of Environment of the Czech Republic. Czech Hydrometeorological Institute (CHMI) coordinated this project. The results of the project contain several concrete proposals to reduce the threat of flash floods in the Czech Republic. The proposals were focused on possible future improvements of CHMI forecasting service activities including all other parts of Flood prevention and protection system in the Czech Republic. The synoptic cause of floods was the extraordinary long (12 days is longest in more than 60 years history) presence of eastern cyclonic situation over the Central Europe bringing warm, moist and unstable air masses from Mediterranean and Black Sea area. Very intensive thunderstorms accompanied by torrential rain occurred almost daily. Storm cells were organized in train effect and crossed repeatedly the same places within several hours. The extremity of the flood events was also influenced by soil saturation due to daily occurrence of rainstorms. The peak flows exceeded significantly 100-year of recurrence time in many sites. The observed and mainly unobserved catchments were affected. The detailed fields of rainfall amounts were gained from the adjusted meteorological radar observation. All of the available rainfall measurements at the climatological and rain gage stations were used for the adjustment. Hydraulic and rainfall-runoff models were used to evaluate the hydrological response. It was proved again, that the outputs from currently used meteorological forecasting models are not sufficient for a reliable local forecast of the strong convective storms and their possible consequences - flash floods. Within the frame of the research project SP/1c4/16/07 "Implementation of new techniques for stream flow forecasting tools" (project period 2007-2011, funded by Ministry of Environment) a forecasting system for the estimation of runoff response to torrential rainfall has been developed. CN value automatic update based on antecedent precipitation is used to estimate possible runoff from storm. Ten minutes radar rainfall estimates and COTREC based nowcasting serve as meteorological input. Results of 2009 events hindcast are presented. It proved the underestimation of rainfall by raw radar data and thus the need for real time adjustment of radar estimates based on rain gauge data. The main output from presented forecasting system is an estimation of flash flood risk. Risk estimation is based on exceeding 3 defined thresholds defined as ratios between the estimated peak flow and theoretical 100-year flood on particular basin. The procedures mentioned above were being developed during the period 2008-2009. Intensive testing is expected by CHMI forecasting offices during 2010-2011.
NASA Technical Reports Server (NTRS)
Wu, Xiaohua; Diak, George R.; Hayden, Cristopher M.; Young, John A.
1995-01-01
These observing system simulation experiments investigate the assimilation of satellite-observed water vapor and cloud liquid water data in the initialization of a limited-area primitive equations model with the goal of improving short-range precipitation forecasts. The assimilation procedure presented includes two aspects: specification of an initial cloud liquid water vertical distribution and diabatic initialization. The satellite data is simulated for the next generation of polar-orbiting satellite instruments, the Advanced Microwave Sounding Unit (AMSU) and the High-Resolution Infrared Sounder (HIRS), which are scheduled to be launched on the NOAA-K satellite in the mid-1990s. Based on cloud-top height and total column cloud liquid water amounts simulated for satellite data a diagnostic method is used to specify an initial cloud water vertical distribution and to modify the initial moisture distribution in cloudy areas. Using a diabatic initialization procedure, the associated latent heating profiles are directly assimilated into the numerical model. The initial heating is estimated by time averaging the latent heat release from convective and large-scale condensation during the early forecast stage after insertion of satellite-observed temperature, water vapor, and cloud water formation. The assimilation of satellite-observed moisture and cloud water, together withy three-mode diabatic initialization, significantly alleviates the model precipitation spinup problem, especially in the first 3 h of the forecast. Experimental forecasts indicate that the impact of satellite-observed temperature and water vapor profiles and cloud water alone in the initialization procedure shortens the spinup time for precipitation rates by 1-2 h and for regeneration of the areal coverage by 3 h. The diabatic initialization further reduces the precipitation spinup time (compared to adiabatic initialization) by 1 h.
Chowell, Gerardo; Viboud, Cécile
2016-10-01
The increasing use of mathematical models for epidemic forecasting has highlighted the importance of designing models that capture the baseline transmission characteristics in order to generate reliable epidemic forecasts. Improved models for epidemic forecasting could be achieved by identifying signature features of epidemic growth, which could inform the design of models of disease spread and reveal important characteristics of the transmission process. In particular, it is often taken for granted that the early growth phase of different growth processes in nature follow early exponential growth dynamics. In the context of infectious disease spread, this assumption is often convenient to describe a transmission process with mass action kinetics using differential equations and generate analytic expressions and estimates of the reproduction number. In this article, we carry out a simulation study to illustrate the impact of incorrectly assuming an exponential-growth model to characterize the early phase (e.g., 3-5 disease generation intervals) of an infectious disease outbreak that follows near-exponential growth dynamics. Specifically, we assess the impact on: 1) goodness of fit, 2) bias on the growth parameter, and 3) the impact on short-term epidemic forecasts. Designing transmission models and statistical approaches that more flexibly capture the profile of epidemic growth could lead to enhanced model fit, improved estimates of key transmission parameters, and more realistic epidemic forecasts.
An empirical method for estimating travel times for wet volcanic mass flows
Pierson, Thomas C.
1998-01-01
Travel times for wet volcanic mass flows (debris avalanches and lahars) can be forecast as a function of distance from source when the approximate flow rate (peak discharge near the source) can be estimated beforehand. The near-source flow rate is primarily a function of initial flow volume, which should be possible to estimate to an order of magnitude on the basis of geologic, geomorphic, and hydrologic factors at a particular volcano. Least-squares best fits to plots of flow-front travel time as a function of distance from source provide predictive second-degree polynomial equations with high coefficients of determination for four broad size classes of flow based on near-source flow rate: extremely large flows (>1 000 000 m3/s), very large flows (10 000–1 000 000 m3/s), large flows (1000–10 000 m3/s), and moderate flows (100–1000 m3/s). A strong nonlinear correlation that exists between initial total flow volume and flow rate for "instantaneously" generated debris flows can be used to estimate near-source flow rates in advance. Differences in geomorphic controlling factors among different flows in the data sets have relatively little effect on the strong nonlinear correlations between travel time and distance from source. Differences in flow type may be important, especially for extremely large flows, but this could not be evaluated here. At a given distance away from a volcano, travel times can vary by approximately an order of magnitude depending on flow rate. The method can provide emergency-management officials a means for estimating time windows for evacuation of communities located in hazard zones downstream from potentially hazardous volcanoes.
A New Tool for Forecasting Solar Drivers of Severe Space Weather
NASA Technical Reports Server (NTRS)
Adams, J. H.; Falconer, D.; Barghouty, A. F.; Khazanov, I.; Moore, R.
2010-01-01
This poster describes a tool that is designed to forecast solar drivers for severe space weather. Since most severe space weather is driven by Solar flares and Coronal Mass Ejections (CMEs) - the strongest of these originate in active regions and are driven by the release of coronal free magnetic energy and There is a positive correlation between an active region's free magnetic energy and the likelihood of flare and CME production therefore we can use this positive correlation as the basis of our empirical space weather forecasting tool. The new tool takes a full disk Michelson Doppler Imager (MDI) magnetogram, identifies strong magnetic field areas, identifies these with NOAA active regions, and measures a free-magnetic-energy proxy. It uses an empirically derived forecasting function to convert the free-magnetic-energy proxy to an expected event rate. It adds up the expected event rates from all active regions on the disk to forecast the expected rate and probability of each class of events -- X-class flares, X&M class flares, CMEs, fast CMEs, and solar particle events (SPEs).
Rapid changes in the range limits of Scots pine 4000 years ago
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gear, A.J.; Huntley, B.
Paleoecological data provide estimates of response rates to past climate changes. Fossil Pinus sylvestris stumps in far northern Scotland demonstrate former presence of pine trees where conventional pollen evidence of pine forests is lacking. Radiocarbon, dendrochronological, and fine temporal-resolution palynological data show that pine forest were present for about four centuries some 4,000 years ago; the forests expanded and then retreated rapidly some 70 to 80 kilometers. Despite the rapidity of this response to climate change, it occurred at rates slower by an order of magnitude than those necessary to maintain equilibrium with forecast climate changes attributed to the greenhousemore » effect.« less
NASA Astrophysics Data System (ADS)
Norbeck, J. H.; Rubinstein, J. L.
2017-12-01
The earthquake activity in Oklahoma and Kansas that began in 2008 reflects the most widespread instance of induced seismicity observed to date. In this work, we demonstrate that the basement fault stressing conditions that drive seismicity rate evolution are related directly to the operational history of 958 saltwater disposal wells completed in the Arbuckle aquifer. We developed a fluid pressurization model based on the assumption that pressure changes are dominated by reservoir compressibility effects. Using injection well data, we established a detailed description of the temporal and spatial variability in stressing conditions over the 21.5-year period from January 1995 through June 2017. With this stressing history, we applied a numerical model based on rate-and-state friction theory to generate seismicity rate forecasts across a broad range of spatial scales. The model replicated the onset of seismicity, the timing of the peak seismicity rate, and the reduction in seismicity following decreased disposal activity. The behavior of the induced earthquake sequence was consistent with the prediction from rate-and-state theory that the system evolves toward a steady seismicity rate depending on the ratio between the current and background stressing rates. Seismicity rate transients occurred over characteristic timescales inversely proportional to stressing rate. We found that our hydromechanical earthquake rate model outperformed observational and empirical forecast models for one-year forecast durations over the period 2008 through 2016.
NASA Astrophysics Data System (ADS)
Dahl, Mads-Peter; Colleuille, Hervé; Boje, Søren; Sund, Monica; Krøgli, Ingeborg; Devoli, Graziella
2015-04-01
The Norwegian Water Resources and Energy Directorate (NVE) runs a national early warning system (EWS) for shallow landslides in Norway. Slope failures included in the EWS are debris slides, debris flows, debris avalanches and slush flows. The EWS has been operational on national scale since 2013 and consists of (a) quantitative landslide thresholds and daily hydro-meteorological prognosis; (b) daily qualitative expert evaluation of prognosis / additional data in decision to determine warning levels; (c) publication of warning levels through various custom build internet platforms. The effectiveness of an EWS depends on both the quality of forecasts being issued, and the communication of forecasts to the public. In this analysis a preliminary evaluation of landslide forecasts from the Norwegian EWS within the period 2012-2014 is presented. Criteria for categorizing forecasts as correct, missed events or false alarms are discussed and concrete examples of forecasts falling into the latter two categories are presented. The evaluation show a rate of correct forecasts exceeding 90%. However correct forecast categorization is sometimes difficult, particularly due to poorly documented landslide events. Several challenges has to be met in the process of further lowering rates of missed events of false alarms in the EWS. Among others these include better implementation of susceptibility maps in landslide forecasting, more detailed regionalization of hydro-meteorological landslide thresholds, improved prognosis on precipitation, snowmelt and soil water content as well as the build-up of more experience among the people performing landslide forecasting.
Volcanic Eruption Forecasts From Accelerating Rates of Drumbeat Long-Period Earthquakes
NASA Astrophysics Data System (ADS)
Bell, Andrew F.; Naylor, Mark; Hernandez, Stephen; Main, Ian G.; Gaunt, H. Elizabeth; Mothes, Patricia; Ruiz, Mario
2018-02-01
Accelerating rates of quasiperiodic "drumbeat" long-period earthquakes (LPs) are commonly reported before eruptions at andesite and dacite volcanoes, and promise insights into the nature of fundamental preeruptive processes and improved eruption forecasts. Here we apply a new Bayesian Markov chain Monte Carlo gamma point process methodology to investigate an exceptionally well-developed sequence of drumbeat LPs preceding a recent large vulcanian explosion at Tungurahua volcano, Ecuador. For more than 24 hr, LP rates increased according to the inverse power law trend predicted by material failure theory, and with a retrospectively forecast failure time that agrees with the eruption onset within error. LPs resulted from repeated activation of a single characteristic source driven by accelerating loading, rather than a distributed failure process, showing that similar precursory trends can emerge from quite different underlying physics. Nevertheless, such sequences have clear potential for improving forecasts of eruptions at Tungurahua and analogous volcanoes.
Arima model and exponential smoothing method: A comparison
NASA Astrophysics Data System (ADS)
Wan Ahmad, Wan Kamarul Ariffin; Ahmad, Sabri
2013-04-01
This study shows the comparison between Autoregressive Moving Average (ARIMA) model and Exponential Smoothing Method in making a prediction. The comparison is focused on the ability of both methods in making the forecasts with the different number of data sources and the different length of forecasting period. For this purpose, the data from The Price of Crude Palm Oil (RM/tonne), Exchange Rates of Ringgit Malaysia (RM) in comparison to Great Britain Pound (GBP) and also The Price of SMR 20 Rubber Type (cents/kg) with three different time series are used in the comparison process. Then, forecasting accuracy of each model is measured by examinethe prediction error that producedby using Mean Squared Error (MSE), Mean Absolute Percentage Error (MAPE), and Mean Absolute deviation (MAD). The study shows that the ARIMA model can produce a better prediction for the long-term forecasting with limited data sources, butcannot produce a better prediction for time series with a narrow range of one point to another as in the time series for Exchange Rates. On the contrary, Exponential Smoothing Method can produce a better forecasting for Exchange Rates that has a narrow range of one point to another for its time series, while itcannot produce a better prediction for a longer forecasting period.
A Real-time Irrigation Forecasting System in Jiefangzha Irrigation District, China
NASA Astrophysics Data System (ADS)
Cong, Z.
2015-12-01
In order to improve the irrigation efficiency, we need to know when and how much to irrigate in real time. If we know the soil moisture content at this time, we can forecast the soil moisture content in the next days based on the rainfall forecasting and the crop evapotranspiration forecasting. Then the irrigation should be considered when the forecasting soil moisture content reaches to a threshold. Jiefangzha Irrigation District, a part of Hetao Irrigation District, is located in Inner Mongolia, China. The irrigated area of this irrigation district is about 140,000 ha mainly planting wheat, maize and sunflower. The annual precipitation is below 200mm, so the irrigation is necessary and the irrigation water comes from the Yellow river. We set up 10 sites with 4 TDR sensors at each site (20cm, 40cm, 60cm and 80cm depth) to monitor the soil moisture content. The weather forecasting data are downloaded from the website of European Centre for Medium-Range Weather Forecasts (ECMWF). The reference evapotranspiration is estimated based on FAO-Blaney-Criddle equation with only the air temperature from ECMWF. Then the crop water requirement is forecasted by the crop coefficient multiplying the reference evapotranspiration. Finally, the soil moisture content is forecasted based on soil water balance with the initial condition is set as the monitoring soil moisture content. When the soil moisture content reaches to a threshold, the irrigation warning will be announced. The irrigation mount can be estimated through three ways: (1) making the soil moisture content be equal to the field capacity; (2) making the soil moisture saturated; or (3) according to the irrigation quota. The forecasting period is 10 days. The system is developed according to B2C model with Java language. All the databases and the data analysis are carried out in the server. The customers can log in the website with their own username and password then get the information about the irrigation forecasting and other information about the irrigation. This system can be expanded in other irrigation districts. In future, it is even possible to upgrade the system for the mobile user.
Probabilistic forecasting of extreme weather events based on extreme value theory
NASA Astrophysics Data System (ADS)
Van De Vyver, Hans; Van Schaeybroeck, Bert
2016-04-01
Extreme events in weather and climate such as high wind gusts, heavy precipitation or extreme temperatures are commonly associated with high impacts on both environment and society. Forecasting extreme weather events is difficult, and very high-resolution models are needed to describe explicitly extreme weather phenomena. A prediction system for such events should therefore preferably be probabilistic in nature. Probabilistic forecasts and state estimations are nowadays common in the numerical weather prediction community. In this work, we develop a new probabilistic framework based on extreme value theory that aims to provide early warnings up to several days in advance. We consider the combined events when an observation variable Y (for instance wind speed) exceeds a high threshold y and its corresponding deterministic forecasts X also exceeds a high forecast threshold y. More specifically two problems are addressed:} We consider pairs (X,Y) of extreme events where X represents a deterministic forecast, and Y the observation variable (for instance wind speed). More specifically two problems are addressed: Given a high forecast X=x_0, what is the probability that Y>y? In other words: provide inference on the conditional probability: [ Pr{Y>y|X=x_0}. ] Given a probabilistic model for Problem 1, what is the impact on the verification analysis of extreme events. These problems can be solved with bivariate extremes (Coles, 2001), and the verification analysis in (Ferro, 2007). We apply the Ramos and Ledford (2009) parametric model for bivariate tail estimation of the pair (X,Y). The model accommodates different types of extremal dependence and asymmetry within a parsimonious representation. Results are presented using the ensemble reforecast system of the European Centre of Weather Forecasts (Hagedorn, 2008). Coles, S. (2001) An Introduction to Statistical modelling of Extreme Values. Springer-Verlag.Ferro, C.A.T. (2007) A probability model for verifying deterministic forecasts of extreme events. Wea. Forecasting {22}, 1089-1100.Hagedorn, R. (2008) Using the ECMWF reforecast dataset to calibrate EPS forecasts. ECMWF Newsletter, {117}, 8-13.Ramos, A., Ledford, A. (2009) A new class of models for bivariate joint tails. J.R. Statist. Soc. B {71}, 219-241.
Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets
NASA Astrophysics Data System (ADS)
Cifter, Atilla
2011-06-01
This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in generalized Pareto distribution, and in the second stage, EVT is applied with a wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that the wavelet-based extreme value theory increases predictive performance of financial forecasting according to number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and this new model can be used by financial institutions as well.
Forecasting induced seismicity rate and Mmax using calibrated numerical models
NASA Astrophysics Data System (ADS)
Dempsey, D.; Suckale, J.
2016-12-01
At Groningen, The Netherlands, several decades of induced seismicity from gas extraction has culminated in a M 3.6 event (mid 2012). From a public safety and commercial perspective, it is desirable to anticipate future seismicity outcomes at Groningen. One way to quantify earthquake risk is Probabilistic Seismic Hazard Analysis (PSHA), which requires an estimate of the future seismicity rate and its magnitude frequency distribution (MFD). This approach is effective at quantifying risk from tectonic events because the seismicity rate, once measured, is almost constant over timescales of interest. In contrast, rates of induced seismicity vary significantly over building lifetimes, largely in response to changes in injection or extraction. Thus, the key to extending PSHA to induced earthquakes is to estimate future changes of the seismicity rate in response to some proposed operating schedule. Numerical models can describe the physical link between fluid pressure, effective stress change, and the earthquake process (triggering and propagation). However, models with predictive potential of individual earthquakes face the difficulty of characterizing specific heterogeneity - stress, strength, roughness, etc. - at locations of interest. Modeling catalogs of earthquakes provides a means of averaging over this uncertainty, focusing instead on the collective features of the seismicity, e.g., its rate and MFD. The model we use incorporates fluid pressure and stress changes to describe nucleation and crack-like propagation of earthquakes on stochastically characterized 1D faults. This enables simulation of synthetic catalogs of induced seismicity from which the seismicity rate, location and MFD are extracted. A probability distribution for Mmax - the largest event in some specified time window - is also computed. Because the model captures the physics linking seismicity to changes in the reservoir, earthquake observations and operating information can be used to calibrate a model at a specific site (or, ideally, many models). This restricts analysis of future seismicity to likely parameter sets and provides physical justification for linking operational changes to subsequent seismicity. To illustrate these concepts, a recent study of prior and forecast seismicity at Groningen will be presented.
Model documentation report: Residential sector demand module of the national energy modeling system
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This report documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Residential Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, and FORTRAN source code. This reference document provides a detailed description for energy analysts, other users, and the public. The NEMS Residential Sector Demand Module is currently used for mid-term forecasting purposes and energy policy analysis over the forecast horizon of 1993 through 2020. The model generates forecasts of energy demand for the residential sector by service, fuel, and Census Division. Policy impacts resulting from new technologies,more » market incentives, and regulatory changes can be estimated using the module. 26 refs., 6 figs., 5 tabs.« less
Using base rates and correlational data to supplement clinical risk assessments.
Davis, Jaya; Sorensen, Jon R
2013-01-01
The current study is a partial replication of previous studies designed to estimate the level of risk posed by capital murder defendants. The study draws on data describing the behavior of nearly 2,000 incarcerated capital murderers to forecast violence propensity among defendants sentenced to life imprisonment. Logistic regression is used to model various violence outcomes, relying on the following predictors: age, educational attainment, prior imprisonment, and gang affiliation. This exercise is designed to illustrate how actuarial data may be used to anchor individualized clinical assessments of risk in capital murder trials.
NASA Technical Reports Server (NTRS)
Owens, Andrew; De Weck, Olivier L.; Stromgren, Chel; Goodliff, Kandyce; Cirillo, William
2017-01-01
Future crewed missions to Mars present a maintenance logistics challenge that is unprecedented in human spaceflight. Mission endurance – defined as the time between resupply opportunities – will be significantly longer than previous missions, and therefore logistics planning horizons are longer and the impact of uncertainty is magnified. Maintenance logistics forecasting typically assumes that component failure rates are deterministically known and uses them to represent aleatory uncertainty, or uncertainty that is inherent to the process being examined. However, failure rates cannot be directly measured; rather, they are estimated based on similarity to other components or statistical analysis of observed failures. As a result, epistemic uncertainty – that is, uncertainty in knowledge of the process – exists in failure rate estimates that must be accounted for. Analyses that neglect epistemic uncertainty tend to significantly underestimate risk. Epistemic uncertainty can be reduced via operational experience; for example, the International Space Station (ISS) failure rate estimates are refined using a Bayesian update process. However, design changes may re-introduce epistemic uncertainty. Thus, there is a tradeoff between changing a design to reduce failure rates and operating a fixed design to reduce uncertainty. This paper examines the impact of epistemic uncertainty on maintenance logistics requirements for future Mars missions, using data from the ISS Environmental Control and Life Support System (ECLS) as a baseline for a case study. Sensitivity analyses are performed to investigate the impact of variations in failure rate estimates and epistemic uncertainty on spares mass. The results of these analyses and their implications for future system design and mission planning are discussed.
Real-time Mainshock Forecast by Statistical Discrimination of Foreshock Clusters
NASA Astrophysics Data System (ADS)
Nomura, S.; Ogata, Y.
2016-12-01
Foreshock discremination is one of the most effective ways for short-time forecast of large main shocks. Though many large earthquakes accompany their foreshocks, discreminating them from enormous small earthquakes is difficult and only probabilistic evaluation from their spatio-temporal features and magnitude evolution may be available. Logistic regression is the statistical learning method best suited to such binary pattern recognition problems where estimates of a-posteriori probability of class membership are required. Statistical learning methods can keep learning discreminating features from updating catalog and give probabilistic recognition of forecast in real time. We estimated a non-linear function of foreshock proportion by smooth spline bases and evaluate the possibility of foreshocks by the logit function. In this study, we classified foreshocks from earthquake catalog by the Japan Meteorological Agency by single-link clustering methods and learned spatial and temporal features of foreshocks by the probability density ratio estimation. We use the epicentral locations, time spans and difference in magnitudes for learning and forecasting. Magnitudes of main shocks are also predicted our method by incorporating b-values into our method. We discuss the spatial pattern of foreshocks from the classifier composed by our model. We also implement a back test to validate predictive performance of the model by this catalog.
Rainfall Estimation over the Nile Basin using Multi-Spectral, Multi- Instrument Satellite Techniques
NASA Astrophysics Data System (ADS)
Habib, E.; Kuligowski, R.; Sazib, N.; Elshamy, M.; Amin, D.; Ahmed, M.
2012-04-01
Management of Egypt's Aswan High Dam is critical not only for flood control on the Nile but also for ensuring adequate water supplies for most of Egypt since rainfall is scarce over the vast majority of its land area. However, reservoir inflow is driven by rainfall over Sudan, Ethiopia, Uganda, and several other countries from which routine rain gauge data are sparse. Satellite- derived estimates of rainfall offer a much more detailed and timely set of data to form a basis for decisions on the operation of the dam. A single-channel infrared (IR) algorithm is currently in operational use at the Egyptian Nile Forecast Center (NFC). In this study, the authors report on the adaptation of a multi-spectral, multi-instrument satellite rainfall estimation algorithm (Self- Calibrating Multivariate Precipitation Retrieval, SCaMPR) for operational application by NFC over the Nile Basin. The algorithm uses a set of rainfall predictors that come from multi-spectral Infrared cloud top observations and self-calibrate them to a set of predictands that come from the more accurate, but less frequent, Microwave (MW) rain rate estimates. For application over the Nile Basin, the SCaMPR algorithm uses multiple satellite IR channels that have become recently available to NFC from the Spinning Enhanced Visible and Infrared Imager (SEVIRI). Microwave rain rates are acquired from multiple sources such as the Special Sensor Microwave/Imager (SSM/I), the Special Sensor Microwave Imager and Sounder (SSMIS), the Advanced Microwave Sounding Unit (AMSU), the Advanced Microwave Scanning Radiometer on EOS (AMSR-E), and the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI). The algorithm has two main steps: rain/no-rain separation using discriminant analysis, and rain rate estimation using stepwise linear regression. We test two modes of algorithm calibration: real- time calibration with continuous updates of coefficients with newly coming MW rain rates, and calibration using static coefficients that are derived from IR-MW data from past observations. We also compare the SCaMPR algorithm to other global-scale satellite rainfall algorithms (e.g., 'Tropical Rainfall Measuring Mission (TRMM) and other sources' (TRMM-3B42) product, and the National Oceanographic and Atmospheric Administration Climate Prediction Center (NOAA-CPC) CMORPH product. The algorithm has several potential future applications such as: improving the performance accuracy of hydrologic forecasting models over the Nile Basin, and utilizing the enhanced rainfall datasets and better-calibrated hydrologic models to assess the impacts of climate change on the region's water availability using global circulation models and regional climate models.
NASA Technical Reports Server (NTRS)
Balikhin, M. A.; Rodriguez, J. V.; Boynton, R. J.; Walker, S. N.; Aryan, Homayon; Sibeck, D. G.; Billings, S. A.
2016-01-01
Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field B(sub z) observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.
Balikhin, M A; Rodriguez, J V; Boynton, R J; Walker, S N; Aryan, H; Sibeck, D G; Billings, S A
2016-01-01
Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB 3 GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB 3 GEO forecasts use solar wind density and interplanetary magnetic field B z observations at L1.The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB 3 GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB 3 GEO forecast.
Mavragani, Amaryllis; Sampri, Alexia; Sypsa, Karla; Tsagarakis, Konstantinos P
2018-03-12
With the internet's penetration and use constantly expanding, this vast amount of information can be employed in order to better assess issues in the US health care system. Google Trends, a popular tool in big data analytics, has been widely used in the past to examine interest in various medical and health-related topics and has shown great potential in forecastings, predictions, and nowcastings. As empirical relationships between online queries and human behavior have been shown to exist, a new opportunity to explore the behavior toward asthma-a common respiratory disease-is present. This study aimed at forecasting the online behavior toward asthma and examined the correlations between queries and reported cases in order to explore the possibility of nowcasting asthma prevalence in the United States using online search traffic data. Applying Holt-Winters exponential smoothing to Google Trends time series from 2004 to 2015 for the term "asthma," forecasts for online queries at state and national levels are estimated from 2016 to 2020 and validated against available Google query data from January 2016 to June 2017. Correlations among yearly Google queries and between Google queries and reported asthma cases are examined. Our analysis shows that search queries exhibit seasonality within each year and the relationships between each 2 years' queries are statistically significant (P<.05). Estimated forecasting models for a 5-year period (2016 through 2020) for Google queries are robust and validated against available data from January 2016 to June 2017. Significant correlations were found between (1) online queries and National Health Interview Survey lifetime asthma (r=-.82, P=.001) and current asthma (r=-.77, P=.004) rates from 2004 to 2015 and (2) between online queries and Behavioral Risk Factor Surveillance System lifetime (r=-.78, P=.003) and current asthma (r=-.79, P=.002) rates from 2004 to 2014. The correlations are negative, but lag analysis to identify the period of response cannot be employed until short-interval data on asthma prevalence are made available. Online behavior toward asthma can be accurately predicted, and significant correlations between online queries and reported cases exist. This method of forecasting Google queries can be used by health care officials to nowcast asthma prevalence by city, state, or nationally, subject to future availability of daily, weekly, or monthly data on reported cases. This method could therefore be used for improved monitoring and assessment of the needs surrounding the current population of patients with asthma. ©Amaryllis Mavragani, Alexia Sampri, Karla Sypsa, Konstantinos P Tsagarakis. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 12.03.2018.
NASA Astrophysics Data System (ADS)
Segou, Margarita
2014-05-01
Corinth Gulf (Central Greece) is the fastest continental rift in the world with extension rates 11-15 mm/yr with diverse seismic deformation including earthquakes with M greater than 6.0, several periods of increased microseismic activity, usually lasting few months and possibly related with fluid diffusion, and swarm episodes lasting few days. In this study I perform a retrospective forecast experiment between 1995-2012, focusing on the comparison between physics-based and statistical models for short term time classes. Even though Corinth gulf has been studied extensively in the past there is still today a debate whether earthquake activity is related with the existence of either a shallow dipping structure or steeply dipping normal faults. In the light of the above statement, two CRS realization are based on resolving Coulomb stress changes on specified receiver faults, expressing the aforementioned structural models, whereas the third CRS model uses optimally-oriented for failure planes. The CRS implementation accounts for stress changes following all major ruptures with M greater than 4.5 within the testing phase. I also estimate fault constitutive parameters from modeling the response to major earthquakes at the vicinity of the gulf (Aσ=0.2, stressing rate app. 0.02 bar/yr). The generic ETAS parameters are taken as the maximum likelihood estimates derived from the stochastic declustering of the modern seismicity catalog (1995-2012) with minimum triggering magnitude M2.5. I test whether the generic ETAS can efficiently describe the aftershock spatio-temporal clustering but also the evolution of swarm episodes and microseismicity. For the reason above, I implement likelihood tests to evaluate the forecasts for their spatial consistency and for the total amount of predicted versus observed events with M greater than 3.0 in 10-day time windows during three distinct evaluation phases; the first evaluation phase focuses on the Aigio 1995 aftershock sequence (15/06/1995, M6.4), the second covers the period between September 2006-May 2007, characterized for its intense microseismicity, and the third is related with the May 2013 swarm. The conclusions support that (1) geology based CRS models are preferred over optimally oriented planes (2) CRS models are consistent forecasters (60-70%) of transient seismicity, having in most cases comparable performance with ETAS models (3) microseismicity and swarms are not triggered by static stress changes of preceding local events with magnitude M greater than 4.5 and (4) the generic ETAS model can efficiently describe the recent swarm episode. The findings of this study have a number of important implications for future short-term forecasting and time-dependent hazard within Corinth Gulf.
Predictability of extremes in non-linear hierarchically organized systems
NASA Astrophysics Data System (ADS)
Kossobokov, V. G.; Soloviev, A.
2011-12-01
Understanding the complexity of non-linear dynamics of hierarchically organized systems progresses to new approaches in assessing hazard and risk of the extreme catastrophic events. In particular, a series of interrelated step-by-step studies of seismic process along with its non-stationary though self-organized behaviors, has led already to reproducible intermediate-term middle-range earthquake forecast/prediction technique that has passed control in forward real-time applications during the last two decades. The observed seismic dynamics prior to and after many mega, great, major, and strong earthquakes demonstrate common features of predictability and diverse behavior in course durable phase transitions in complex hierarchical non-linear system of blocks-and-faults of the Earth lithosphere. The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easy tractable analytical models, which leads to widespread practice of their deceptive application. The consequences of underestimation of seismic hazard propagate non-linearly into inflicted underestimation of risk and, eventually, into unexpected societal losses due to earthquakes and associated phenomena (i.e., collapse of buildings, landslides, tsunamis, liquefaction, etc.). The studies aimed at forecast/prediction of extreme events (interpreted as critical transitions) in geophysical and socio-economical systems include: (i) large earthquakes in geophysical systems of the lithosphere blocks-and-faults, (ii) starts and ends of economic recessions, (iii) episodes of a sharp increase in the unemployment rate, (iv) surge of the homicides in socio-economic systems. These studies are based on a heuristic search of phenomena preceding critical transitions and application of methodologies of pattern recognition of infrequent events. Any study of rare phenomena of highly complex origin, by their nature, implies using problem oriented methods, which design breaks the limits of classical statistical or econometric applications. The unambiguously designed forecast/prediction algorithms of the "yes or no" variety, analyze the observable quantitative integrals and indicators available to a given date, then provides unambiguous answer to the question whether a critical transition should be expected or not in the next time interval. Since the predictability of an originating non-linear dynamical system is limited in principle, the probabilistic component of forecast/prediction algorithms is represented by the empirical probabilities of alarms, on one side, and failures-to-predict, on the other, estimated on control sets achieved in the retro- and prospective experiments. Predicting in advance is the only decisive test of forecast/predictions and the relevant on-going experiments are conducted in the case seismic extremes, recessions, and increases of unemployment rate. The results achieved in real-time testing keep being encouraging and confirm predictability of the extremes.
The Value, Protocols, and Scientific Ethics of Earthquake Forecasting
NASA Astrophysics Data System (ADS)
Jordan, Thomas H.
2013-04-01
Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. They should provide public sources of information on short-term probabilities that are authoritative, scientific, open, and timely. Alert procedures should be negotiated with end-users to facilitate decisions at different levels of society, based in part on objective analysis of costs and benefits but also on less tangible aspects of value-of-information, such as gains in psychological preparedness and resilience. Unfortunately, in most countries, operational forecasting systems do not conform to such high standards, and earthquake scientists are often called upon to advise the public in roles that exceed their civic authority, expertise in risk communication, and situational knowledge. Certain ethical principles are well established; e.g., announcing unreliable predictions in public forums should be avoided, because bad information can be dangerous. But what are the professional responsibilities of earthquake scientists during seismic crises, especially when the public information through official channels is thought to be inadequate or incorrect? How much should these responsibilities be discounted in the face of personal liability? How should scientists contend with highly uncertain forecasts? To what degree should the public be involved in controversies about forecasting results? No simple answers to these questions can be offered, but the need for answers can be reduced by improving operational forecasting systems. This will require more substantial, and more trustful, collaborations between scientists, civil authorities, and public stakeholders.
Forecasts of forest conditions
Robert Huggett; David N. Wear; Ruhong Li; John Coulston; Shan Liu
2013-01-01
Key FindingsAmong the five forest management types, only planted pine is expected to increase in area. In 2010 planted pine comprised 19 percent of southern forests. By 2060, planted pine is forecasted to comprise somewhere between 24 and 36 percent of forest area.Although predicted rates of change vary, all forecasts reveal...
A Systems Modeling Approach to Forecast Corn Economic Optimum Nitrogen Rate.
Puntel, Laila A; Sawyer, John E; Barker, Daniel W; Thorburn, Peter J; Castellano, Michael J; Moore, Kenneth J; VanLoocke, Andrew; Heaton, Emily A; Archontoulis, Sotirios V
2018-01-01
Historically crop models have been used to evaluate crop yield responses to nitrogen (N) rates after harvest when it is too late for the farmers to make in-season adjustments. We hypothesize that the use of a crop model as an in-season forecast tool will improve current N decision-making. To explore this, we used the Agricultural Production Systems sIMulator (APSIM) calibrated with long-term experimental data for central Iowa, USA (16-years in continuous corn and 15-years in soybean-corn rotation) combined with actual weather data up to a specific crop stage and historical weather data thereafter. The objectives were to: (1) evaluate the accuracy and uncertainty of corn yield and economic optimum N rate (EONR) predictions at four forecast times (planting time, 6th and 12th leaf, and silking phenological stages); (2) determine whether the use of analogous historical weather years based on precipitation and temperature patterns as opposed to using a 35-year dataset could improve the accuracy of the forecast; and (3) quantify the value added by the crop model in predicting annual EONR and yields using the site-mean EONR and the yield at the EONR to benchmark predicted values. Results indicated that the mean corn yield predictions at planting time ( R 2 = 0.77) using 35-years of historical weather was close to the observed and predicted yield at maturity ( R 2 = 0.81). Across all forecasting times, the EONR predictions were more accurate in corn-corn than soybean-corn rotation (relative root mean square error, RRMSE, of 25 vs. 45%, respectively). At planting time, the APSIM model predicted the direction of optimum N rates (above, below or at average site-mean EONR) in 62% of the cases examined ( n = 31) with an average error range of ±38 kg N ha -1 (22% of the average N rate). Across all forecast times, prediction error of EONR was about three times higher than yield predictions. The use of the 35-year weather record was better than using selected historical weather years to forecast (RRMSE was on average 3% lower). Overall, the proposed approach of using the crop model as a forecasting tool could improve year-to-year predictability of corn yields and optimum N rates. Further improvements in modeling and set-up protocols are needed toward more accurate forecast, especially for extreme weather years with the most significant economic and environmental cost.
A Systems Modeling Approach to Forecast Corn Economic Optimum Nitrogen Rate
Puntel, Laila A.; Sawyer, John E.; Barker, Daniel W.; Thorburn, Peter J.; Castellano, Michael J.; Moore, Kenneth J.; VanLoocke, Andrew; Heaton, Emily A.; Archontoulis, Sotirios V.
2018-01-01
Historically crop models have been used to evaluate crop yield responses to nitrogen (N) rates after harvest when it is too late for the farmers to make in-season adjustments. We hypothesize that the use of a crop model as an in-season forecast tool will improve current N decision-making. To explore this, we used the Agricultural Production Systems sIMulator (APSIM) calibrated with long-term experimental data for central Iowa, USA (16-years in continuous corn and 15-years in soybean-corn rotation) combined with actual weather data up to a specific crop stage and historical weather data thereafter. The objectives were to: (1) evaluate the accuracy and uncertainty of corn yield and economic optimum N rate (EONR) predictions at four forecast times (planting time, 6th and 12th leaf, and silking phenological stages); (2) determine whether the use of analogous historical weather years based on precipitation and temperature patterns as opposed to using a 35-year dataset could improve the accuracy of the forecast; and (3) quantify the value added by the crop model in predicting annual EONR and yields using the site-mean EONR and the yield at the EONR to benchmark predicted values. Results indicated that the mean corn yield predictions at planting time (R2 = 0.77) using 35-years of historical weather was close to the observed and predicted yield at maturity (R2 = 0.81). Across all forecasting times, the EONR predictions were more accurate in corn-corn than soybean-corn rotation (relative root mean square error, RRMSE, of 25 vs. 45%, respectively). At planting time, the APSIM model predicted the direction of optimum N rates (above, below or at average site-mean EONR) in 62% of the cases examined (n = 31) with an average error range of ±38 kg N ha−1 (22% of the average N rate). Across all forecast times, prediction error of EONR was about three times higher than yield predictions. The use of the 35-year weather record was better than using selected historical weather years to forecast (RRMSE was on average 3% lower). Overall, the proposed approach of using the crop model as a forecasting tool could improve year-to-year predictability of corn yields and optimum N rates. Further improvements in modeling and set-up protocols are needed toward more accurate forecast, especially for extreme weather years with the most significant economic and environmental cost. PMID:29706974
Real-time localization of mobile device by filtering method for sensor fusion
NASA Astrophysics Data System (ADS)
Fuse, Takashi; Nagara, Keita
2017-06-01
Most of the applications with mobile devices require self-localization of the devices. GPS cannot be used in indoor environment, the positions of mobile devices are estimated autonomously by using IMU. Since the self-localization is based on IMU of low accuracy, and then the self-localization in indoor environment is still challenging. The selflocalization method using images have been developed, and the accuracy of the method is increasing. This paper develops the self-localization method without GPS in indoor environment by integrating sensors, such as IMU and cameras, on mobile devices simultaneously. The proposed method consists of observations, forecasting and filtering. The position and velocity of the mobile device are defined as a state vector. In the self-localization, observations correspond to observation data from IMU and camera (observation vector), forecasting to mobile device moving model (system model) and filtering to tracking method by inertial surveying and coplanarity condition and inverse depth model (observation model). Positions of a mobile device being tracked are estimated by system model (forecasting step), which are assumed as linearly moving model. Then estimated positions are optimized referring to the new observation data based on likelihood (filtering step). The optimization at filtering step corresponds to estimation of the maximum a posterior probability. Particle filter are utilized for the calculation through forecasting and filtering steps. The proposed method is applied to data acquired by mobile devices in indoor environment. Through the experiments, the high performance of the method is confirmed.
The Use of Ambient Humidity Conditions to Improve Influenza Forecast
NASA Astrophysics Data System (ADS)
Shaman, J. L.; Kandula, S.; Yang, W.; Karspeck, A. R.
2017-12-01
Laboratory and epidemiological evidence indicate that ambient humidity modulates the survival and transmission of influenza. Here we explore whether the inclusion of humidity forcing in mathematical models describing influenza transmission improves the accuracy of forecasts generated with those models. We generate retrospective forecasts for 95 cities over 10 seasons in the United States and assess both forecast accuracy and error. Overall, we find that humidity forcing improves forecast performance and that forecasts generated using daily climatological humidity forcing generally outperform forecasts that utilize daily observed humidity forcing. These findings hold for predictions of outbreak peak intensity, peak timing, and incidence over 2- and 4-week horizons. The results indicate that use of climatological humidity forcing is warranted for current operational influenza forecast and provide further evidence that humidity modulates rates of influenza transmission.
Forecasting the future burden of opioids for osteoarthritis.
Ackerman, I N; Zomer, E; Gilmartin-Thomas, J F-M; Liew, D
2018-03-01
To quantify the current national burden of opioids for osteoarthritis (OA) pain in Australia in terms of number of dispensed opioid prescriptions and associated costs, and to forecast the likely burden to the year 2030/31. Epidemiological modelling. Published data were obtained on rates of opioid prescribing for people with OA and national OA prevalence projections. Trends in opioid dispensing from 2006 to 2016, and average costs for common opioid subtypes were obtained from the Pharmaceutical Benefits Scheme and Medicare Australia Statistics. Using these inputs, a model was developed to estimate the likely number of dispensed opioid prescriptions and costs to the public healthcare system by 2030/31. In 2015/16, an estimated 1.1 million opioid prescriptions were dispensed in Australia for 403,954 people with OA (of a total 2.2 million Australians with OA). Based on recent dispensing trends and OA prevalence projections, the number of dispensed opioid prescriptions is expected to nearly triple to 3,032,332 by 2030/31, for an estimated 562,610 people with OA. The estimated cost to the Australian healthcare system was $AUD25.2 million in 2015/16, rising to $AUD72.4 million by 2030/31. OA-related opioid dispensing and associated costs are set to increase substantially in Australia from 2015/16 to 2030/31. Use of opioids for OA pain is concerning given joint disease chronicity and the risk of adverse events, particularly among older people. These projections represent a conservative estimate of the full financial burden given additional costs associated with opioid-related harms and out-of-pocket costs borne by patients. Copyright © 2017 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
The trade-off between hospital cost and quality of care. An exploratory empirical analysis.
Morey, R C; Fine, D J; Loree, S W; Retzlaff-Roberts, D L; Tsubakitani, S
1992-08-01
The debate concerning quality of care in hospitals, its "value" and affordability, is increasingly of concern to providers, consumers, and purchasers in the United States and elsewhere. We undertook an exploratory study to estimate the impact on hospital-wide costs if quality-of-care levels were varied. To do so, we obtained costs and service output data regarding 300 U.S. hospitals, representing approximately a 5% cross section of all hospitals operating in 1983; both inpatient and outpatient services were included. The quality-of-care measure used for the exploratory analysis was the ratio of actual deaths in the hospital for the year in question to the forecasted number of deaths for the hospital; the hospital mortality forecaster had earlier (and elsewhere) been built from analyses of 6 million discharge abstracts, and took into account each hospital's actual individual admissions, including key patient descriptors for each admission. Such adjusted death rates have increasingly been used as potential indicators of quality, with recent research lending support for the viability of that linkage. The authors then utilized the economic construct of allocative efficiency relying on "best practices" concepts and peer groupings, built using the "envelopment" philosophy of Data Envelopment Analysis and Pareto efficiency. These analytical techniques estimated the efficiently delivered costs required to meet prespecified levels of quality of care. The marginal additional cost per each death deferred in 1983 was estimated to be approximately $29,000 (in 1990 dollars) for the average efficient hospital. Also, over a feasible range, a 1% increase in the level of quality of care delivered was estimated to increase hospital cost by an average of 1.34%. This estimated elasticity of quality on cost also increased with the number of beds in the hospital.
Bayesian methods in reliability
NASA Astrophysics Data System (ADS)
Sander, P.; Badoux, R.
1991-11-01
The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
Wheat productivity estimates using LANDSAT data. [Michigan
NASA Technical Reports Server (NTRS)
Nalepka, R. F.; Colwell, J. (Principal Investigator); Rice, D. P.
1977-01-01
The author has identified the following significant results. An initial demonstration was made of the capability to make direct production forecasts for winter wheat using early season LANDSAT data. The approach offers the potential to make production forecasts quickly and simply, possibly avoiding some of the complexities of alternate procedures.
Defining conservation priorities using fragmentation forecasts
David Wear; John Pye; Kurt H. Riitters
2004-01-01
Methods are developed for forecasting the effects of population and economic growth on the distribution of interior forest habitat. An application to the southeastern United States shows that models provide significant explanatory power with regard to the observed distribution of interior forest. Estimates for economic and biophysical variables are significant and...
Validation of Seasonal Forecast of Indian Summer Monsoon Rainfall
NASA Astrophysics Data System (ADS)
Das, Sukanta Kumar; Deb, Sanjib Kumar; Kishtawal, C. M.; Pal, Pradip Kumar
2015-06-01
The experimental seasonal forecast of Indian summer monsoon (ISM) rainfall during June through September using Community Atmosphere Model (CAM) version 3 has been carried out at the Space Applications Centre Ahmedabad since 2009. The forecasts, based on a number of ensemble members (ten minimum) of CAM, are generated in several phases and updated on regular basis. On completion of 5 years of experimental seasonal forecasts in operational mode, it is required that the overall validation or correctness of the forecast system is quantified and that the scope is assessed for further improvements of the forecast over time, if any. The ensemble model climatology generated by a set of 20 identical CAM simulations is considered as the model control simulation. The performance of the forecast has been evaluated by assuming the control simulation as the model reference. The forecast improvement factor shows positive improvements, with higher values for the recent forecasted years as compared to the control experiment over the Indian landmass. The Taylor diagram representation of the Pearson correlation coefficient (PCC), standard deviation and centered root mean square difference has been used to demonstrate the best PCC, in the order of 0.74-0.79, recorded for the seasonal forecast made during 2013. Further, the bias score of different phases of experiment revealed the fact that the ISM rainfall forecast is affected by overestimation in predicting the low rain-rate (less than 7 mm/day), but by underestimation in the medium and high rain-rate (higher than 11 mm/day). Overall, the analysis shows significant improvement of the ISM forecast over the last 5 years, viz. 2009-2013, due to several important modifications that have been implemented in the forecast system. The validation exercise has also pointed out a number of shortcomings in the forecast system; these will be addressed in the upcoming years of experiments to improve the quality of the ISM prediction.
Latent fluctuation periods and long-term forecasting of the level of Markakol lake
NASA Astrophysics Data System (ADS)
Madibekov, A. S.; Babkin, A. V.; Musakulkyzy, A.; Cherednichenko, A. V.
2018-01-01
The analysis of time series of the level of Markakol Lake by the method of “Periodicities” reveals in its variations the harmonics with the periods of 12 and 14 years, respectively. The verification forecasts of the lake level by the trend tendency and by its combination with these sinusoids were computed with the lead time of 5 and 10 years. The estimation of the forecast results by the new independent data permitted to conclude that forecasts by the combination of the sinusoids and trend tendency are better than by the trend tendency only. They are no worse than the mean value prediction.
A simple Lagrangian forecast system with aviation forecast potential
NASA Technical Reports Server (NTRS)
Petersen, R. A.; Homan, J. H.
1983-01-01
A trajectory forecast procedure is developed which uses geopotential tendency fields obtained from a simple, multiple layer, potential vorticity conservative isentropic model. This model can objectively account for short-term advective changes in the mass field when combined with fine-scale initial analyses. This procedure for producing short-term, upper-tropospheric trajectory forecasts employs a combination of a detailed objective analysis technique, an efficient mass advection model, and a diagnostically proven trajectory algorithm, none of which require extensive computer resources. Results of initial tests are presented, which indicate an exceptionally good agreement for trajectory paths entering the jet stream and passing through an intensifying trough. It is concluded that this technique not only has potential for aiding in route determination, fuel use estimation, and clear air turbulence detection, but also provides an example of the types of short range forecasting procedures which can be applied at local forecast centers using simple algorithms and a minimum of computer resources.
The role of ensemble post-processing for modeling the ensemble tail
NASA Astrophysics Data System (ADS)
Van De Vyver, Hans; Van Schaeybroeck, Bert; Vannitsem, Stéphane
2016-04-01
The past decades the numerical weather prediction community has witnessed a paradigm shift from deterministic to probabilistic forecast and state estimation (Buizza and Leutbecher, 2015; Buizza et al., 2008), in an attempt to quantify the uncertainties associated with initial-condition and model errors. An important benefit of a probabilistic framework is the improved prediction of extreme events. However, one may ask to what extent such model estimates contain information on the occurrence probability of extreme events and how this information can be optimally extracted. Different approaches have been proposed and applied on real-world systems which, based on extreme value theory, allow the estimation of extreme-event probabilities conditional on forecasts and state estimates (Ferro, 2007; Friederichs, 2010). Using ensemble predictions generated with a model of low dimensionality, a thorough investigation is presented quantifying the change of predictability of extreme events associated with ensemble post-processing and other influencing factors including the finite ensemble size, lead time and model assumption and the use of different covariates (ensemble mean, maximum, spread...) for modeling the tail distribution. Tail modeling is performed by deriving extreme-quantile estimates using peak-over-threshold representation (generalized Pareto distribution) or quantile regression. Common ensemble post-processing methods aim to improve mostly the ensemble mean and spread of a raw forecast (Van Schaeybroeck and Vannitsem, 2015). Conditional tail modeling, on the other hand, is a post-processing in itself, focusing on the tails only. Therefore, it is unclear how applying ensemble post-processing prior to conditional tail modeling impacts the skill of extreme-event predictions. This work is investigating this question in details. Buizza, Leutbecher, and Isaksen, 2008: Potential use of an ensemble of analyses in the ECMWF Ensemble Prediction System, Q. J. R. Meteorol. Soc. 134: 2051-2066.Buizza and Leutbecher, 2015: The forecast skill horizon, Q. J. R. Meteorol. Soc. 141: 3366-3382.Ferro, 2007: A probability model for verifying deterministic forecasts of extreme events. Weather and Forecasting 22 (5), 1089-1100.Friederichs, 2010: Statistical downscaling of extreme precipitation events using extreme value theory. Extremes 13, 109-132.Van Schaeybroeck and Vannitsem, 2015: Ensemble post-processing using member-by-member approaches: theoretical aspects. Q.J.R. Meteorol. Soc., 141: 807-818.
Post-processing method for wind speed ensemble forecast using wind speed and direction
NASA Astrophysics Data System (ADS)
Sofie Eide, Siri; Bjørnar Bremnes, John; Steinsland, Ingelin
2017-04-01
Statistical methods are widely applied to enhance the quality of both deterministic and ensemble NWP forecasts. In many situations, like wind speed forecasting, most of the predictive information is contained in one variable in the NWP models. However, in statistical calibration of deterministic forecasts it is often seen that including more variables can further improve forecast skill. For ensembles this is rarely taken advantage of, mainly due to that it is generally not straightforward how to include multiple variables. In this study, it is demonstrated how multiple variables can be included in Bayesian model averaging (BMA) by using a flexible regression method for estimating the conditional means. The method is applied to wind speed forecasting at 204 Norwegian stations based on wind speed and direction forecasts from the ECMWF ensemble system. At about 85 % of the sites the ensemble forecasts were improved in terms of CRPS by adding wind direction as predictor compared to only using wind speed. On average the improvements were about 5 %, but mainly for moderate to strong wind situations. For weak wind speeds adding wind direction had more or less neutral impact.
Evaluation of NU-WRF Rainfall Forecasts for IFloodS
NASA Technical Reports Server (NTRS)
Wu, Di; Peters-Lidard, Christa; Tao, Wei-Kuo; Petersen, Walter
2016-01-01
The Iowa Flood Studies (IFloodS) campaign was conducted in eastern Iowa as a pre- GPM-launch campaign from 1 May to 15 June 2013. During the campaign period, real time forecasts are conducted utilizing NASA-Unified Weather Research and Forecasting (NU-WRF) model to support the everyday weather briefing. In this study, two sets of the NU-WRF rainfall forecasts are evaluated with Stage IV and Multi-Radar Multi-Sensor (MRMS) Quantitative Precipitation Estimation (QPE), with the objective to understand the impact of Land Surface initialization on the predicted precipitation. NU-WRF is also compared with North American Mesoscale Forecast System (NAM) 12 kilometer forecast. In general, NU-WRF did a good job at capturing individual precipitation events. NU-WRF is also able to replicate a better rainfall spatial distribution compare with NAM. Further sensitivity tests show that the high-resolution makes a positive impact on rainfall forecast. The two sets of NU-WRF simulations produce very close rainfall characteristics. The Land surface initialization do not show significant impact on short term rainfall forecast, and it is largely due to the soil conditions during the field campaign period.
NASA Technical Reports Server (NTRS)
Koster, Randal D.; Walker, Gregory K.; Mahanama, Sarith P.; Reichle, Rolf H.
2013-01-01
Offline simulations over the conterminous United States (CONUS) with a land surface model are used to address two issues relevant to the forecasting of large-scale seasonal streamflow: (i) the extent to which errors in soil moisture initialization degrade streamflow forecasts, and (ii) the extent to which a realistic increase in the spatial resolution of forecasted precipitation would improve streamflow forecasts. The addition of error to a soil moisture initialization field is found to lead to a nearly proportional reduction in streamflow forecast skill. The linearity of the response allows the determination of a lower bound for the increase in streamflow forecast skill achievable through improved soil moisture estimation, e.g., through satellite-based soil moisture measurements. An increase in the resolution of precipitation is found to have an impact on large-scale streamflow forecasts only when evaporation variance is significant relative to the precipitation variance. This condition is met only in the western half of the CONUS domain. Taken together, the two studies demonstrate the utility of a continental-scale land surface modeling system as a tool for addressing the science of hydrological prediction.
NASA Astrophysics Data System (ADS)
Clark, Elizabeth; Wood, Andy; Nijssen, Bart; Mendoza, Pablo; Newman, Andy; Nowak, Kenneth; Arnold, Jeffrey
2017-04-01
In an automated forecast system, hydrologic data assimilation (DA) performs the valuable function of correcting raw simulated watershed model states to better represent external observations, including measurements of streamflow, snow, soil moisture, and the like. Yet the incorporation of automated DA into operational forecasting systems has been a long-standing challenge due to the complexities of the hydrologic system, which include numerous lags between state and output variations. To help demonstrate that such methods can succeed in operational automated implementations, we present results from the real-time application of an ensemble particle filter (PF) for short-range (7 day lead) ensemble flow forecasts in western US river basins. We use the System for Hydromet Applications, Research and Prediction (SHARP), developed by the National Center for Atmospheric Research (NCAR) in collaboration with the University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation. SHARP is a fully automated platform for short-term to seasonal hydrologic forecasting applications, incorporating uncertainty in initial hydrologic conditions (IHCs) and in hydrometeorological predictions through ensemble methods. In this implementation, IHC uncertainty is estimated by propagating an ensemble of 100 temperature and precipitation time series through conceptual and physically-oriented models. The resulting ensemble of derived IHCs exhibits a broad range of possible soil moisture and snow water equivalent (SWE) states. The PF selects and/or weights and resamples the IHCs that are most consistent with external streamflow observations, and uses the particles to initialize a streamflow forecast ensemble driven by ensemble precipitation and temperature forecasts downscaled from the Global Ensemble Forecast System (GEFS). We apply this method in real-time for several basins in the western US that are important for water resources management, and perform a hindcast experiment to evaluate the utility of PF-based data assimilation on streamflow forecasts skill. This presentation describes findings, including a comparison of sequential and non-sequential particle weighting methods.
Replacement Beef Cow Valuation under Data Availability Constraints
Hagerman, Amy D.; Thompson, Jada M.; Ham, Charlotte; Johnson, Kamina K.
2017-01-01
Economists are often tasked with estimating the benefits or costs associated with livestock production losses; however, lack of available data or absence of consistent reporting can reduce the accuracy of these valuations. This work looks at three potential estimation techniques for determining the value for replacement beef cows with varying types of market data to proxy constrained data availability and discusses the potential margin of error for each technique. Oklahoma bred replacement cows are valued using hedonic pricing based on Oklahoma bred cow data—a best case scenario—vector error correction modeling (VECM) based on national cow sales data and cost of production (COP) based on just a representative enterprise budget and very limited sales data. Each method was then used to perform a within-sample forecast of 2016 January to December, and forecasts are compared with the 2016 monthly observed market prices in Oklahoma using the mean absolute percent error (MAPE). Hedonic pricing methods tend to overvalue for within-sample forecasting but performed best, as measured by MAPE for high quality cows. The VECM tended to undervalue cows but performed best for younger animals. COP performed well, compared with the more data intensive methods. Examining each method individually across eight representative replacement beef female types, the VECM forecast resulted in a MAPE under 10% for 33% of forecasted months, followed by hedonic pricing at 24% of the forecasted months and COP at 14% of the forecasted months for average quality beef females. For high quality females, the hedonic pricing method worked best producing a MAPE under 10% in 36% of the forecasted months followed by the COP method at 21% of months and the VECM at 14% of the forecasted months. These results suggested that livestock valuation method selection was not one-size-fits-all and may need to vary based not only on the data available but also on the characteristics (e.g., quality or age) of the livestock being valued. PMID:29164141
Forecasting of the electrical actuators condition using stator’s current signals
NASA Astrophysics Data System (ADS)
Kruglova, T. N.; Yaroshenko, I. V.; Rabotalov, N. N.; Melnikov, M. A.
2017-02-01
This article describes a forecasting method for electrical actuators realized through the combination of Fourier transformation and neural network techniques. The method allows finding the value of diagnostic functions in the iterating operating cycle and the number of operational cycles in time before the BLDC actuator fails. For forecasting of the condition of the actuator, we propose a hierarchical structure of the neural network aiming to reduce the training time of the neural network and improve estimation accuracy.
Soil moisture data as a constraint for groundwater recharge estimation
NASA Astrophysics Data System (ADS)
Mathias, Simon A.; Sorensen, James P. R.; Butler, Adrian P.
2017-09-01
Estimating groundwater recharge rates is important for water resource management studies. Modeling approaches to forecast groundwater recharge typically require observed historic data to assist calibration. It is generally not possible to observe groundwater recharge rates directly. Therefore, in the past, much effort has been invested to record soil moisture content (SMC) data, which can be used in a water balance calculation to estimate groundwater recharge. In this context, SMC data is measured at different depths and then typically integrated with respect to depth to obtain a single set of aggregated SMC values, which are used as an estimate of the total water stored within a given soil profile. This article seeks to investigate the value of such aggregated SMC data for conditioning groundwater recharge models in this respect. A simple modeling approach is adopted, which utilizes an emulation of Richards' equation in conjunction with a soil texture pedotransfer function. The only unknown parameters are soil texture. Monte Carlo simulation is performed for four different SMC monitoring sites. The model is used to estimate both aggregated SMC and groundwater recharge. The impact of conditioning the model to the aggregated SMC data is then explored in terms of its ability to reduce the uncertainty associated with recharge estimation. Whilst uncertainty in soil texture can lead to significant uncertainty in groundwater recharge estimation, it is found that aggregated SMC is virtually insensitive to soil texture.
Gharbi, M; Moore, L S P; Gilchrist, M; Thomas, C P; Bamford, K; Brannigan, E T; Holmes, A H
2015-08-01
This study aimed to forecast the incidence rate of carbapenem resistance and to assess the impact of an antimicrobial stewardship intervention using routine antimicrobial consumption surveillance data. Following an outbreak of OXA-48-producing Klebsiella pneumoniae (January 2008-April 2010) in a renal cohort in London, a forecasting ARIMA model was derived using meropenem consumption data [defined daily dose per 100 occupied bed-days (DDD/100OBD)] from 2005-2014 as a predictor of the incidence rate of OXA-48-producing organisms (number of new cases/year/100,000OBD). Interrupted times series assessed the impact of meropenem consumption restriction as part of the outbreak control. Meropenem consumption at lag -1 year (the preceding year), highly correlated with the incidence of OXA-48-producing organisms (r=0.71; P=0.005), was included as a predictor within the forecasting model. The number of cases/100,000OBD for 2014-2015 was estimated to be 4.96 (95% CI 2.53-7.39). Analysis of meropenem consumption pre- and post-intervention demonstrated an increase of 7.12 DDD/100OBD/year (95% CI 2.97-11.27; P<0.001) in the 4 years preceding the intervention, but a decrease thereafter. The change in slope was -9.11 DDD/100OBD/year (95% CI -13.82 to -4.39). Analysis of alternative antimicrobials showed a significant increase in amikacin consumption post-intervention from 0.54 to 3.41 DDD/100OBD/year (slope +0.72, 95% CI 0.29-1.15; P=0.01). Total antimicrobials significantly decreased from 176.21 to 126.24 DDD/100OBD/year (P=0.05). Surveillance of routinely collected antimicrobial consumption data may provide a key warning indicator to anticipate increased incidence of carbapenem-resistant organisms. Further validation using real-time data is needed. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
Improved Rainfall Estimates and Predictions for 21st Century Drought Early Warning
NASA Technical Reports Server (NTRS)
Funk, Chris; Peterson, Pete; Shukla, Shraddhanand; Husak, Gregory; Landsfeld, Marty; Hoell, Andrew; Pedreros, Diego; Roberts, J. B.; Robertson, F. R.; Tadesse, Tsegae;
2015-01-01
As temperatures increase, the onset and severity of droughts is likely to become more intense. Improved tools for understanding, monitoring and predicting droughts will be a key component of 21st century climate adaption. The best drought monitoring systems will bring together accurate precipitation estimates with skillful climate and weather forecasts. Such systems combine the predictive power inherent in the current land surface state with the predictive power inherent in low frequency ocean-atmosphere dynamics. To this end, researchers at the Climate Hazards Group (CHG), in collaboration with partners at the USGS and NASA, have developed i) a long (1981-present) quasi-global (50degS-50degN, 180degW-180degE) high resolution (0.05deg) homogenous precipitation data set designed specifically for drought monitoring, ii) tools for understanding and predicting East African boreal spring droughts, and iii) an integrated land surface modeling (LSM) system that combines rainfall observations and predictions to provide effective drought early warning. This talk briefly describes these three components. Component 1: CHIRPS The Climate Hazards group InfraRed Precipitation with Stations (CHIRPS), blends station data with geostationary satellite observations to provide global near real time daily, pentadal and monthly precipitation estimates. We describe the CHIRPS algorithm and compare CHIRPS and other estimates to validation data. The CHIRPS is shown to have high correlation, low systematic errors (bias) and low mean absolute errors. Component 2: Hybrid statistical-dynamic forecast strategies East African droughts have increased in frequency, but become more predictable as Indo- Pacific SST gradients and Walker circulation disruptions intensify. We describe hybrid statistical-dynamic forecast strategies that are far superior to the raw output of coupled forecast models. These forecasts can be translated into probabilities that can be used to generate bootstrapped ensembles describing future climate conditions. Component 3: Assimilation using LSMs CHIRPS rainfall observations (component 1) and bootstrapped forecast ensembles (component 2) can be combined using LSMs to predict soil moisture deficits. We evaluate the skill such a system in East Africa, and demonstrate results for 2013.
Evaluation of Satellite and Model Precipitation Products Over Turkey
NASA Astrophysics Data System (ADS)
Yilmaz, M. T.; Amjad, M.
2017-12-01
Satellite-based remote sensing, gauge stations, and models are the three major platforms to acquire precipitation dataset. Among them satellites and models have the advantage of retrieving spatially and temporally continuous and consistent datasets, while the uncertainty estimates of these retrievals are often required for many hydrological studies to understand the source and the magnitude of the uncertainty in hydrological response parameters. In this study, satellite and model precipitation data products are validated over various temporal scales (daily, 3-daily, 7-daily, 10-daily and monthly) using in-situ measured precipitation observations from a network of 733 gauges from all over the Turkey. Tropical Rainfall Measurement Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) 3B42 version 7 and European Center of Medium-Range Weather Forecast (ECMWF) model estimates (daily, 3-daily, 7-daily and 10-daily accumulated forecast) are used in this study. Retrievals are evaluated for their mean and standard deviation and their accuracies are evaluated via bias, root mean square error, error standard deviation and correlation coefficient statistics. Intensity vs frequency analysis and some contingency table statistics like percent correct, probability of detection, false alarm ratio and critical success index are determined using daily time-series. Both ECMWF forecasts and TRMM observations, on average, overestimate the precipitation compared to gauge estimates; wet biases are 10.26 mm/month and 8.65 mm/month, respectively for ECMWF and TRMM. RMSE values of ECMWF forecasts and TRMM estimates are 39.69 mm/month and 41.55 mm/month, respectively. Monthly correlations between Gauges-ECMWF, Gauges-TRMM and ECMWF-TRMM are 0.76, 0.73 and 0.81, respectively. The model and the satellite error statistics are further compared against the gauges error statistics based on inverse distance weighting (IWD) analysis. Both the model and satellite data have less IWD errors (14.72 mm/month and 10.75 mm/month, respectively) compared to gauges IWD error (21.58 mm/month). These results show that, on average, ECMWF forecast data have higher skill than TRMM observations. Overall, both ECMWF forecast data and TRMM observations show good potential for catchment scale hydrological analysis.
US industrial battery forecast
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollingsworth, V. III
1996-09-01
Last year was strong year for the US industrial battery market with growth in all segments. Sales of industrial batteries in North America grew 19.2% in 1995, exceeding last year`s forecasted growth rate of 11.6%. The results of the recently completed BCI Membership Survey forecast 1996 sales to be up 10.5%, and to continue to increase at a 10.4% compound annual rate through the year 2000. This year`s survey includes further detail on the stationary battery market with the inclusion of less than 25 Ampere-Hour batteries for the first time.
NASA Technical Reports Server (NTRS)
Mauldin, L. E.
1994-01-01
Business travel planning within an organization is often a time-consuming task. Travel Forecaster is a menu-driven, easy-to-use program which plans, forecasts cost, and tracks actual vs. planned cost for business-related travel of a division or branch of an organization and compiles this information into a database to aid the travel planner. The program's ability to handle multiple trip entries makes it a valuable time-saving device. Travel Forecaster takes full advantage of relational data base properties so that information that remains constant, such as per diem rates and airline fares (which are unique for each city), needs entering only once. A typical entry would include selection with the mouse of the traveler's name and destination city from pop-up lists, and typed entries for number of travel days and purpose of the trip. Multiple persons can be selected from the pop-up lists and multiple trips are accommodated by entering the number of days by each appropriate month on the entry form. An estimated travel cost is not required of the user as it is calculated by a Fourth Dimension formula. With this information, the program can produce output of trips by month with subtotal and total cost for either organization or sub-entity of an organization; or produce outputs of trips by month with subtotal and total cost for international-only travel. It will also provide monthly and cumulative formats of planned vs. actual outputs in data or graph form. Travel Forecaster users can do custom queries to search and sort information in the database, and it can create custom reports with the user-friendly report generator. Travel Forecaster 1.1 is a database program for use with Fourth Dimension Runtime 2.1.1. It requires a Macintosh Plus running System 6.0.3 or later, 2Mb of RAM and a hard disk. The standard distribution medium for this package is one 3.5 inch 800K Macintosh format diskette. Travel Forecaster was developed in 1991. Macintosh is a registered trademark of Apple Computer, Inc. Fourth Dimension is a registered trademark of Acius, Inc.
NASA Astrophysics Data System (ADS)
Haiyang, Yu; Yanmei, Liu; Guijun, Yang; Xiaodong, Yang; Dong, Ren; Chenwei, Nie
2014-03-01
To achieve dynamic winter wheat quality monitoring and forecasting in larger scale regions, the objective of this study was to design and develop a winter wheat quality monitoring and forecasting system by using a remote sensing index and environmental factors. The winter wheat quality trend was forecasted before the harvest and quality was monitored after the harvest, respectively. The traditional quality-vegetation index from remote sensing monitoring and forecasting models were improved. Combining with latitude information, the vegetation index was used to estimate agronomy parameters which were related with winter wheat quality in the early stages for forecasting the quality trend. A combination of rainfall in May, temperature in May, illumination at later May, the soil available nitrogen content and other environmental factors established the quality monitoring model. Compared with a simple quality-vegetation index, the remote sensing monitoring and forecasting model used in this system get greatly improved accuracy. Winter wheat quality was monitored and forecasted based on the above models, and this system was completed based on WebGIS technology. Finally, in 2010 the operation process of winter wheat quality monitoring system was presented in Beijing, the monitoring and forecasting results was outputted as thematic maps.
Moisture Forecast Bias Correction in GEOS DAS
NASA Technical Reports Server (NTRS)
Dee, D.
1999-01-01
Data assimilation methods rely on numerous assumptions about the errors involved in measuring and forecasting atmospheric fields. One of the more disturbing of these is that short-term model forecasts are assumed to be unbiased. In case of atmospheric moisture, for example, observational evidence shows that the systematic component of errors in forecasts and analyses is often of the same order of magnitude as the random component. we have implemented a sequential algorithm for estimating forecast moisture bias from rawinsonde data in the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The algorithm is designed to remove the systematic component of analysis errors and can be easily incorporated in an existing statistical data assimilation system. We will present results of initial experiments that show a significant reduction of bias in the GEOS DAS moisture analyses.
Total probabilities of ensemble runoff forecasts
NASA Astrophysics Data System (ADS)
Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian
2017-04-01
Ensemble forecasting has a long history from meteorological modelling, as an indication of the uncertainty of the forecasts. However, it is necessary to calibrate and post-process the ensembles as the they often exhibit both bias and dispersion errors. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these methods (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters varying in space and time, while giving a spatially and temporally consistent output. However, their method is computationally complex for our larger number of stations, which makes it unsuitable for our purpose. Our post-processing method of the ensembles is developed in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu), where we are making forecasts for whole Europe, and based on observations from around 700 catchments. As the target is flood forecasting, we are also more interested in improving the forecast skill for high-flows rather than in a good prediction of the entire flow regime. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different meteorological forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we will now post-process all model outputs to estimate the total probability, the post-processed mean and uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, but we are adding a spatial penalty in the calibration process to force a spatial correlation of the parameters. The penalty takes distance, stream-connectivity and size of the catchment areas into account. This can in some cases have a slight negative impact on the calibration error, but avoids large differences between parameters of nearby locations, whether stream connected or not. The spatial calibration also makes it easier to interpolate the post-processing parameters to uncalibrated locations. We also look into different methods for handling the non-normal distributions of runoff data and the effect of different data transformations on forecasts skills in general and for floods in particular. Berrocal, V. J., Raftery, A. E. and Gneiting, T.: Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts, Mon. Weather Rev., 135(4), 1386-1402, doi:10.1175/MWR3341.1, 2007. Engeland, K. and Steinsland, I.: Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times, Water Resour. Res., 50(1), 182-197, doi:10.1002/2012WR012757, 2014. Gneiting, T., Raftery, A. E., Westveld, A. H. and Goldman, T.: Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation, Mon. Weather Rev., 133(5), 1098-1118, doi:10.1175/MWR2904.1, 2005. Hemri, S., Fundel, F. and Zappa, M.: Simultaneous calibration of ensemble river flow predictions over an entire range of lead times, Water Resour. Res., 49(10), 6744-6755, doi:10.1002/wrcr.20542, 2013. Raftery, A. E., Gneiting, T., Balabdaoui, F. and Polakowski, M.: Using Bayesian Model Averaging to Calibrate Forecast Ensembles, Mon. Weather Rev., 133(5), 1155-1174, doi:10.1175/MWR2906.1, 2005.
A forecasting method to reduce estimation bias in self-reported cell phone data.
Redmayne, Mary; Smith, Euan; Abramson, Michael J
2013-01-01
There is ongoing concern that extended exposure to cell phone electromagnetic radiation could be related to an increased risk of negative health effects. Epidemiological studies seek to assess this risk, usually relying on participants' recalled use, but recall is notoriously poor. Our objectives were primarily to produce a forecast method, for use by such studies, to reduce estimation bias in the recalled extent of cell phone use. The method we developed, using Bayes' rule, is modelled with data we collected in a cross-sectional cluster survey exploring cell phone user-habits among New Zealand adolescents. Participants recalled their recent extent of SMS-texting and retrieved from their provider the current month's actual use-to-date. Actual use was taken as the gold standard in the analyses. Estimation bias arose from a large random error, as observed in all cell phone validation studies. We demonstrate that this seriously exaggerates upper-end forecasts of use when used in regression models. This means that calculations using a regression model will lead to underestimation of heavy-users' relative risk. Our Bayesian method substantially reduces estimation bias. In cases where other studies' data conforms to our method's requirements, application should reduce estimation bias, leading to a more accurate relative risk calculation for mid-to-heavy users.
Gusso, Anibal; Arvor, Damien; Ducati, Jorge Ricardo; Veronez, Mauricio Roberto; da Silveira, Luiz Gonzaga
2014-01-01
Estimations of crop area were made based on the temporal profiles of the Enhanced Vegetation Index (EVI) obtained from moderate resolution imaging spectroradiometer (MODIS) images. Evaluation of the ability of the MODIS crop detection algorithm (MCDA) to estimate soybean crop areas was performed for fields in the Mato Grosso state, Brazil. Using the MCDA approach, soybean crop area estimations can be provided for December (first forecast) using images from the sowing period and for February (second forecast) using images from the sowing period and the maximum crop development period. The area estimates were compared to official agricultural statistics from the Brazilian Institute of Geography and Statistics (IBGE) and from the National Company of Food Supply (CONAB) at different crop levels from 2000/2001 to 2010/2011. At the municipality level, the estimates were highly correlated, with R (2) = 0.97 and RMSD = 13,142 ha. The MCDA was validated using field campaign data from the 2006/2007 crop year. The overall map accuracy was 88.25%, and the Kappa Index of Agreement was 0.765. By using pre-defined parameters, MCDA is able to provide the evolution of annual soybean maps, forecast of soybean cropping areas, and the crop area expansion in the Mato Grosso state.
Verification of forecast ensembles in complex terrain including observation uncertainty
NASA Astrophysics Data System (ADS)
Dorninger, Manfred; Kloiber, Simon
2017-04-01
Traditionally, verification means to verify a forecast (ensemble) with the truth represented by observations. The observation errors are quite often neglected arguing that they are small when compared to the forecast error. In this study as part of the MesoVICT (Mesoscale Verification Inter-comparison over Complex Terrain) project it will be shown, that observation errors have to be taken into account for verification purposes. The observation uncertainty is estimated from the VERA (Vienna Enhanced Resolution Analysis) and represented via two analysis ensembles which are compared to the forecast ensemble. For the whole study results from COSMO-LEPS provided by Arpae-SIMC Emilia-Romagna are used as forecast ensemble. The time period covers the MesoVICT core case from 20-22 June 2007. In a first step, all ensembles are investigated concerning their distribution. Several tests have been executed (Kolmogorov-Smirnov-Test, Finkelstein-Schafer Test, Chi-Square Test etc.) showing no exact mathematical distribution. So the main focus is on non-parametric statistics (e.g. Kernel density estimation, Boxplots etc.) and also the deviation between "forced" normal distributed data and the kernel density estimations. In a next step the observational deviations due to the analysis ensembles are analysed. In a first approach scores are multiple times calculated with every single ensemble member from the analysis ensemble regarded as "true" observation. The results are presented as boxplots for the different scores and parameters. Additionally, the bootstrapping method is also applied to the ensembles. These possible approaches to incorporating observational uncertainty into the computation of statistics will be discussed in the talk.
Valuing hydrological forecasts for a pumped storage assisted hydro facility
NASA Astrophysics Data System (ADS)
Zhao, Guangzhi; Davison, Matt
2009-07-01
SummaryThis paper estimates the value of a perfectly accurate short-term hydrological forecast to the operator of a hydro electricity generating facility which can sell its power at time varying but predictable prices. The expected value of a less accurate forecast will be smaller. We assume a simple random model for water inflows and that the costs of operating the facility, including water charges, will be the same whether or not its operator has inflow forecasts. Thus, the improvement in value from better hydrological prediction results from the increased ability of the forecast using facility to sell its power at high prices. The value of the forecast is therefore the difference between the sales of a facility operated over some time horizon with a perfect forecast, and the sales of a similar facility operated over the same time horizon with similar water inflows which, though governed by the same random model, cannot be forecast. This paper shows that the value of the forecast is an increasing function of the inflow process variance and quantifies how much the value of this perfect forecast increases with the variance of the water inflow process. Because the lifetime of hydroelectric facilities is long, the small increase observed here can lead to an increase in the profitability of hydropower investments.
Marques-Toledo, Cecilia de Almeida; Degener, Carolin Marlen; Vinhal, Livia; Coelho, Giovanini; Meira, Wagner; Codeço, Claudia Torres; Teixeira, Mauro Martins
2017-07-01
Infectious diseases are a leading threat to public health. Accurate and timely monitoring of disease risk and progress can reduce their impact. Mentioning a disease in social networks is correlated with physician visits by patients, and can be used to estimate disease activity. Dengue is the fastest growing mosquito-borne viral disease, with an estimated annual incidence of 390 million infections, of which 96 million manifest clinically. Dengue burden is likely to increase in the future owing to trends toward increased urbanization, scarce water supplies and, possibly, environmental change. The epidemiological dynamic of Dengue is complex and difficult to predict, partly due to costly and slow surveillance systems. In this study, we aimed to quantitatively assess the usefulness of data acquired by Twitter for the early detection and monitoring of Dengue epidemics, both at country and city level at a weekly basis. Here, we evaluated and demonstrated the potential of tweets modeling for Dengue estimation and forecast, in comparison with other available web-based data, Google Trends and Wikipedia access logs. Also, we studied the factors that might influence the goodness-of-fit of the model. We built a simple model based on tweets that was able to 'nowcast', i.e. estimate disease numbers in the same week, but also 'forecast' disease in future weeks. At the country level, tweets are strongly associated with Dengue cases, and can estimate present and future Dengue cases until 8 weeks in advance. At city level, tweets are also useful for estimating Dengue activity. Our model can be applied successfully to small and less developed cities, suggesting a robust construction, even though it may be influenced by the incidence of the disease, the activity of Twitter locally, and social factors, including human development index and internet access. Tweets association with Dengue cases is valuable to assist traditional Dengue surveillance at real-time and low-cost. Tweets are able to successfully nowcast, i.e. estimate Dengue in the present week, but also forecast, i.e. predict Dengue at until 8 weeks in the future, both at country and city level with high estimation capacity.
Seasonal fire danger forecasts for the USA
J. Roads; F. Fujioka; S. Chen; R. Burgan
2005-01-01
The Scripps Experimental Climate Prediction Center has been making experimental, near-real-time, weekly to seasonal fire danger forecasts for the past 5 years. US fire danger forecasts and validations are based on standard indices from the National Fire Danger Rating System (DFDRS), which include the ignition component (IC), energy release component (ER), burning...
A petroleum discovery-rate forecast revisited-The problem of field growth
Drew, L.J.; Schuenemeyer, J.H.
1992-01-01
A forecast of the future rates of discovery of crude oil and natural gas for the 123,027-km2 Miocene/Pliocene trend in the Gulf of Mexico was made in 1980. This forecast was evaluated in 1988 by comparing two sets of data: (1) the actual versus the forecasted number of fields discovered, and (2) the actual versus the forecasted volumes of crude oil and natural gas discovered with the drilling of 1,820 wildcat wells along the trend between January 1, 1977, and December 31, 1985. The forecast specified that this level of drilling would result in the discovery of 217 fields containing 1.78 billion barrels of oil equivalent; however, 238 fields containing 3.57 billion barrels of oil equivalent were actually discovered. This underestimation is attributed to biases introduced by field growth and, to a lesser degree, the artificially low, pre-1970's price of natural gas that prevented many smaller gas fields from being brought into production at the time of their discovery; most of these fields contained less than 50 billion cubic feet of producible natural gas. ?? 1992 Oxford University Press.
Forecast Verification: Identification of small changes in weather forecasting skill
NASA Astrophysics Data System (ADS)
Weatherhead, E. C.; Jensen, T. L.
2017-12-01
Global and regonal weather forecasts have improved over the past seven decades most often because of small, incrmental improvements. The identificaiton and verification of forecast improvement due to proposed small changes in forecasting can be expensive and, if not carried out efficiently, can slow progress in forecasting development. This presentation will look at the skill of commonly used verification techniques and show how the ability to detect improvements can depend on the magnitude of the improvement, the number of runs used to test the improvement, the location on the Earth and the statistical techniques used. For continuous variables, such as temperture, wind and humidity, the skill of a forecast can be directly compared using a pair-wise statistical test that accommodates the natural autocorrelation and magnitude of variability. For discrete variables, such as tornado outbreaks, or icing events, the challenges is to reduce the false alarm rate while improving the rate of correctly identifying th discrete event. For both continuus and discrete verification results, proper statistical approaches can reduce the number of runs needed to identify a small improvement in forecasting skill. Verification within the Next Generation Global Prediction System is an important component to the many small decisions needed to make stat-of-the-art improvements to weather forecasting capabilities. The comparison of multiple skill scores with often conflicting results requires not only appropriate testing, but also scientific judgment to assure that the choices are appropriate not only for improvements in today's forecasting capabilities, but allow improvements that will come in the future.
Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment
NASA Astrophysics Data System (ADS)
Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection
2011-12-01
Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated for reliability and skill by retrospective testing, and the models should be under continuous prospective testing against long-term forecasts and alternative time-dependent models. (d) Short-term models used in operational forecasting should be consistent with the long-term forecasts used in probabilistic seismic hazard analysis. (e) Alert procedures should be standardized to facilitate decisions at different levels of government, based in part on objective analysis of costs and benefits. (f) In establishing alert protocols, consideration should also be given to the less tangible aspects of value-of-information, such as gains in psychological preparedness and resilience. Authoritative statements of increased risk, even when the absolute probability is low, can provide a psychological benefit to the public by filling information vacuums that lead to informal predictions and misinformation. Formal OEF procedures based on probabilistic forecasting appropriately separate hazard estimation by scientists from the decision-making role of civil protection authorities. The prosecution of seven Italian scientists on manslaughter charges stemming from their actions before the L'Aquila earthquake makes clear why this separation should be explicit in defining OEF protocols.
NASA Astrophysics Data System (ADS)
Strader, Anne; Schneider, Max; Schorlemmer, Danijel; Liukis, Maria
2016-04-01
The Collaboratory for the Study of Earthquake Predictability (CSEP) was developed to rigorously test earthquake forecasts retrospectively and prospectively through reproducible, completely transparent experiments within a controlled environment (Zechar et al., 2010). During 2006-2011, thirteen five-year time-invariant prospective earthquake mainshock forecasts developed by the Regional Earthquake Likelihood Models (RELM) working group were evaluated through the CSEP testing center (Schorlemmer and Gerstenberger, 2007). The number, spatial, and magnitude components of the forecasts were compared to the respective observed seismicity components using a set of consistency tests (Schorlemmer et al., 2007, Zechar et al., 2010). In the initial experiment, all but three forecast models passed every test at the 95% significance level, with all forecasts displaying consistent log-likelihoods (L-test) and magnitude distributions (M-test) with the observed seismicity. In the ten-year RELM experiment update, we reevaluate these earthquake forecasts over an eight-year period from 2008-2016, to determine the consistency of previous likelihood testing results over longer time intervals. Additionally, we test the Uniform California Earthquake Rupture Forecast (UCERF2), developed by the U.S. Geological Survey (USGS), and the earthquake rate model developed by the California Geological Survey (CGS) and the USGS for the National Seismic Hazard Mapping Program (NSHMP) against the RELM forecasts. Both the UCERF2 and NSHMP forecasts pass all consistency tests, though the Helmstetter et al. (2007) and Shen et al. (2007) models exhibit greater information gain per earthquake according to the T- and W- tests (Rhoades et al., 2011). Though all but three RELM forecasts pass the spatial likelihood test (S-test), multiple forecasts fail the M-test due to overprediction of the number of earthquakes during the target period. Though there is no significant difference between the UCERF2 and NSHMP models, residual scores show that the NSHMP model is preferred in locations with earthquake occurrence, due to the lower seismicity rates forecasted by the UCERF2 model.
Optimal Scaling of Aftershock Zones using Ground Motion Forecasts
NASA Astrophysics Data System (ADS)
Wilson, John Max; Yoder, Mark R.; Rundle, John B.
2018-02-01
The spatial distribution of aftershocks following major earthquakes has received significant attention due to the shaking hazard these events pose for structures and populations in the affected region. Forecasting the spatial distribution of aftershock events is an important part of the estimation of future seismic hazard. A simple spatial shape for the zone of activity has often been assumed in the form of an ellipse having semimajor axis to semiminor axis ratio of 2.0. However, since an important application of these calculations is the estimation of ground shaking hazard, an effective criterion for forecasting future aftershock impacts is to use ground motion prediction equations (GMPEs) in addition to the more usual approach of using epicentral or hypocentral locations. Based on these ideas, we present an aftershock model that uses self-similarity and scaling relations to constrain parameters as an option for such hazard assessment. We fit the spatial aspect ratio to previous earthquake sequences in the studied regions, and demonstrate the effect of the fitting on the likelihood of post-disaster ground motion forecasts for eighteen recent large earthquakes. We find that the forecasts in most geographic regions studied benefit from this optimization technique, while some are better suited to the use of the a priori aspect ratio.
Gambling score in earthquake prediction analysis
NASA Astrophysics Data System (ADS)
Molchan, G.; Romashkova, L.
2011-03-01
The number of successes and the space-time alarm rate are commonly used to characterize the strength of an earthquake prediction method and the significance of prediction results. It has been recently suggested to use a new characteristic to evaluate the forecaster's skill, the gambling score (GS), which incorporates the difficulty of guessing each target event by using different weights for different alarms. We expand parametrization of the GS and use the M8 prediction algorithm to illustrate difficulties of the new approach in the analysis of the prediction significance. We show that the level of significance strongly depends (1) on the choice of alarm weights, (2) on the partitioning of the entire alarm volume into component parts and (3) on the accuracy of the spatial rate measure of target events. These tools are at the disposal of the researcher and can affect the significance estimate. Formally, all reasonable GSs discussed here corroborate that the M8 method is non-trivial in the prediction of 8.0 ≤M < 8.5 events because the point estimates of the significance are in the range 0.5-5 per cent. However, the conservative estimate 3.7 per cent based on the number of successes seems preferable owing to two circumstances: (1) it is based on relative values of the spatial rate and hence is more stable and (2) the statistic of successes enables us to construct analytically an upper estimate of the significance taking into account the uncertainty of the spatial rate measure.
Hubbell, Stephen P; He, Fangliang; Condit, Richard; Borda-de-Agua, Luís; Kellner, James; Ter Steege, Hans
2008-08-12
New roads, agricultural projects, logging, and mining are claiming an ever greater area of once-pristine Amazonian forest. The Millennium Ecosystems Assessment (MA) forecasts the extinction of a large fraction of Amazonian tree species based on projected loss of forest cover over the next several decades. How accurate are these estimates of extinction rates? We use neutral theory to estimate the number, relative abundance, and range size of tree species in the Amazon metacommunity and estimate likely tree-species extinctions under published optimistic and nonoptimistic Amazon scenarios. We estimate that the Brazilian portion of the Amazon Basin has (or had) 11,210 tree species that reach sizes >10 cm DBH (stem diameter at breast height). Of these, 3,248 species have population sizes >1 million individuals, and, ignoring possible climate-change effects, almost all of these common species persist under both optimistic and nonoptimistic scenarios. At the rare end of the abundance spectrum, however, neutral theory predicts the existence of approximately 5,308 species with <10,000 individuals each that are expected to suffer nearly a 50% extinction rate under the nonoptimistic deforestation scenario and an approximately 37% loss rate even under the optimistic scenario. Most of these species have small range sizes and are highly vulnerable to local habitat loss. In ensembles of 100 stochastic simulations, we found mean total extinction rates of 20% and 33% of tree species in the Brazilian Amazon under the optimistic and nonoptimistic scenarios, respectively.
NASA Astrophysics Data System (ADS)
Boisserie, Marie
The goal of this dissertation research is to produce empirical soil moisture initial conditions (soil moisture analysis) and investigate its impact on the short-term (2 weeks) to subseasonal (2 months) forecasting skill of 2-m air temperature and precipitation. Because of soil moisture has a long memory and plays a role in controlling the surface water and energy budget, an accurate soil moisture analysis is today widely recognized as having the potential to increase summertime climate forecasting skill. However, because of a lack of global observations of soil moisture, there has been no scientific consensus on the importance of the contribution of a soil moisture initialization as close to the truth as possible to climate forecasting skill. In this study, the initial conditions are generated using a Precipitation Assimilation Reanalysis (PAR) technique to produce a soil moisture analysis. This technique consists mainly of nudging precipitation in the atmosphere component of a land-atmosphere model by adjusting the vertical air humidity profile based on the difference between the rate of the model-derived precipitation rate and the observed rate. The unique aspects of the PAR technique are the following: (1) based on the PAR technique, the soil moisture analysis is generated using a coupled land-atmosphere forecast model; therefore, no bias between the initial conditions and the forecast model (spinup problem) is encountered; and (2) the PAR technique is physically consistent; the surface and radiative fluxes remains in conjunction with the soil moisture analysis. To our knowledge, there has been no attempt to use a physically consistent soil moisture land assimilation system into a land-atmosphere model in a coupled mode. The effect of the PAR technique on the model soil moisture estimates is evaluated using the Global Soil Wetness Project Phase 2 (GSWP-2) multimodel analysis product (used as a proxy for global soil moisture observations) and actual in-situ observations from the state of Illinois. The results show that overall the PAR technique is effective; across most of the globe, the seasonal and anomaly variability of the model soil moisture estimates well reproduce the values of GSWP-2 in the top 1.5 m soil layer; by comparing to in-situ observations in Illinois, we find that the seasonal and anomaly soil moisture variability is also well represented deep into the soil. Therefore, in this study, we produce a new global soil moisture analysis dataset that can be used for many land surface studies (crop modeling, water resource management, soil erosion, etc.). Then, the contribution of the resulting soil moisture analysis (used as initial conditions) on air temperature and precipitation forecasts are investigated. For this, we follow the experimental set up of a model intercomparison study over the time period 1986-1995, the Global Land-Atmosphere Coupling Experiment second phase (GLACE-2), in which the FSU/COAPS climate model has participated. The results of the summertime air temperature forecasts show a significant increase in skill across most of the U.S. at short-term to subseasonal time scales. No increase in summertime precipitation forecasting skill is found at short-term to subseasonal time scales between 1986 and 1995, except for the anomalous drought year of 1988. We also analyze the forecasts of two extreme hydrological events, the 1988 U.S. drought and the 1993 U.S. flood. In general, the comparison of these two extreme hydrological event forecasts shows greater improvement for the summertime of 1988 than that of 1993, suggesting that soil moisture contributes more to the development of a drought than a flood. This result is consistent with Dirmeyer and Brubaker [1999] and Weaver et al. [2009]. By analyzing the evaporative sources of these two extreme events using the back-trajectory methodology of Dirmeyer and Brubaker [1999], we find similar results as this latter paper; the soil moisture-precipitation feedback mechanism seems to play a greater role during the drought year of 1988 than the flood year of 1993. Finally, the accuracy of this soil moisture initialization depends upon the quality of the precipitation dataset that is assimilated. Because of the lack of observed precipitation at a high temporal resolution (3-hourly) for the study period (1986-1995), a reanalysis product is used for precipitation assimilation in this study. It is important to keep in mind that precipitation data in reanalysis sometimes differ significantly from observations since precipitation is often not assimilated into the reanalysis model. In order to investigate that aspect, a similar analysis to that we performed in this study could be done using the 3-hourly Tropical Rainfall Measuring Mission (TRMM) dataset available for a the time period 1998-present. Then, since the TRMM dataset is a fully observational dataset, we expect the soil moisture initialization to be improved over that obtained in this study, which, in turn, may further increase the forecast skill.
A medical cost estimation with fuzzy neural network of acute hepatitis patients in emergency room.
Kuo, R J; Cheng, W C; Lien, W C; Yang, T J
2015-10-01
Taiwan is an area where chronic hepatitis is endemic. Liver cancer is so common that it has been ranked first among cancer mortality rates since the early 1980s in Taiwan. Besides, liver cirrhosis and chronic liver diseases are the sixth or seventh in the causes of death. Therefore, as shown by the active research on hepatitis, it is not only a health threat, but also a huge medical cost for the government. The estimated total number of hepatitis B carriers in the general population aged more than 20 years old is 3,067,307. Thus, a case record review was conducted from all patients with diagnosis of acute hepatitis admitted to the Emergency Department (ED) of a well-known teaching-oriented hospital in Taipei. The cost of medical resource utilization is defined as the total medical fee. In this study, a fuzzy neural network is employed to develop the cost forecasting model. A total of 110 patients met the inclusion criteria. The computational results indicate that the FNN model can provide more accurate forecasts than the support vector regression (SVR) or artificial neural network (ANN). In addition, unlike SVR and ANN, FNN can also provide fuzzy IF-THEN rules for interpretation. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Poveda, G.; Pineda, K.
2009-12-01
Clear-cut evidences of global environmental change in Colombia are discussed for diverse hydro-climatic records, and illustrated herein for increasing minimum temperature and decreasing annual maximum river flows records. As a consequence, eight tropical glaciers disappeared from the Colombian Andes during the 20th century, and the remaining six have experienced alarming retreat rates during the last decade. Here we report an updated estimation of retreat rates in the six remaining glacierized mountain ranges of Colombia for the period 1987-2007, using Landsat TM and TM+ imagery. Analyses are performed using detailed pre-processing, processing and post-processing satellite imagery techniques. Alarming retreat rates are confirmed in the studied glaciers, with an overall area shrinkage from 60 km2 in 2002, to 55.4 km2 in 2003, to less than 45 km2 in 2007. Assuming such linear loss rate (~3 km2 per year), for the near and medium term, the total collapse of the Colombian glaciers can be foreseen by 2022, but diverse physical mechanisms discussed herein would exacerbate the shrinkage processes, thus prompting us to forecast a much earlier deadline by the late 2010-2020 decade, long before the 100 years foreseen by the 2007 IPCC Fourth Assessment Report. This forecast demands detailed monitoring studies of mass and energy balances. Our updated estimations of Colombia's glacier retreat rates posse serious challenges for highly valuable ecosystem services, including water supply of several large cities and hundreds of rural settlements along the Colombian Andes, but also for cheap and renewable hydropower generation which provides 80% of Colombia's demand. Also, the identified changes threaten the survivability of unique and fragile ecosystems like paramos and cloud forests, in turn contributing to exacerbate social unrest and ongoing environmental problems in the tropical Andes which have been identified as the most critical hotspot for biodiversity on Earth. Colombia requires support from the global adaptation fund to develop research, and to design policies, strategies and tools to cope with these urgent social and environmental threats.
Linking seasonal climate forecasts with crop models in Iberian Peninsula
NASA Astrophysics Data System (ADS)
Capa, Mirian; Ines, Amor; Baethgen, Walter; Rodriguez-Fonseca, Belen; Han, Eunjin; Ruiz-Ramos, Margarita
2015-04-01
Translating seasonal climate forecasts into agricultural production forecasts could help to establish early warning systems and to design crop management adaptation strategies that take advantage of favorable conditions or reduce the effect of adverse conditions. In this study, we use seasonal rainfall forecasts and crop models to improve predictability of wheat yield in the Iberian Peninsula (IP). Additionally, we estimate economic margins and production risks associated with extreme scenarios of seasonal rainfall forecast. This study evaluates two methods for disaggregating seasonal climate forecasts into daily weather data: 1) a stochastic weather generator (CondWG), and 2) a forecast tercile resampler (FResampler). Both methods were used to generate 100 (with FResampler) and 110 (with CondWG) weather series/sequences for three scenarios of seasonal rainfall forecasts. Simulated wheat yield is computed with the crop model CERES-wheat (Ritchie and Otter, 1985), which is included in Decision Support System for Agrotechnology Transfer (DSSAT v.4.5, Hoogenboom et al., 2010). Simulations were run at two locations in northeastern Spain where the crop model was calibrated and validated with independent field data. Once simulated yields were obtained, an assessment of farmer's gross margin for different seasonal climate forecasts was accomplished to estimate production risks under different climate scenarios. This methodology allows farmers to assess the benefits and risks of a seasonal weather forecast in IP prior to the crop growing season. The results of this study may have important implications on both, public (agricultural planning) and private (decision support to farmers, insurance companies) sectors. Acknowledgements Research by M. Capa-Morocho has been partly supported by a PICATA predoctoral fellowship of the Moncloa Campus of International Excellence (UCM-UPM) and MULCLIVAR project (CGL2012-38923-C02-02) References Hoogenboom, G. et al., 2010. The Decision Support System for Agrotechnology Transfer (DSSAT).Version 4.5 [CD-ROM].University of Hawaii, Honolulu, Hawaii. Ritchie, J.T., Otter, S., 1985. Description and performanceof CERES-Wheat: a user-oriented wheat yield model. In: ARS Wheat Yield Project. ARS-38.Natl Tech Info Serv, Springfield, Missouri, pp. 159-175.
Forecasting malaria in a highly endemic country using environmental and clinical predictors.
Zinszer, Kate; Kigozi, Ruth; Charland, Katia; Dorsey, Grant; Brewer, Timothy F; Brownstein, John S; Kamya, Moses R; Buckeridge, David L
2015-06-18
Malaria thrives in poor tropical and subtropical countries where local resources are limited. Accurate disease forecasts can provide public and clinical health services with the information needed to implement targeted approaches for malaria control that make effective use of limited resources. The objective of this study was to determine the relevance of environmental and clinical predictors of malaria across different settings in Uganda. Forecasting models were based on health facility data collected by the Uganda Malaria Surveillance Project and satellite-derived rainfall, temperature, and vegetation estimates from 2006 to 2013. Facility-specific forecasting models of confirmed malaria were developed using multivariate autoregressive integrated moving average models and produced weekly forecast horizons over a 52-week forecasting period. The model with the most accurate forecasts varied by site and by forecast horizon. Clinical predictors were retained in the models with the highest predictive power for all facility sites. The average error over the 52 forecasting horizons ranged from 26 to 128% whereas the cumulative burden forecast error ranged from 2 to 22%. Clinical data, such as drug treatment, could be used to improve the accuracy of malaria predictions in endemic settings when coupled with environmental predictors. Further exploration of malaria forecasting is necessary to improve its accuracy and value in practice, including examining other environmental and intervention predictors, including insecticide-treated nets.
76 FR 9696 - Equipment Price Forecasting in Energy Conservation Standards Analysis
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-22
... for particular efficiency design options, an empirical experience curve fit to the available data may be used to forecast future costs of such design option technologies. If a statistical evaluation indicates a low level of confidence in estimates of the design option cost trend, this method should not be...
NASA Astrophysics Data System (ADS)
Bulatov, S. V.
2018-05-01
The article considers the method of short-term combined forecasting, which includes theoretical and experimental estimates of the need for details of units and assemblies, which allows obtaining the optimum number of spare parts necessary for rolling stock operation without downtime in repair areas.
The Field Production of Water for Injection
1985-12-01
L/day Bedridden Patient 0.75 L/day Average Diseased Patient 0.50 L/day e (There is no feasible methodology to forecast the number of procedures per... Bedridden Patient 0.75 All Diseased Patients 0.50 An estimate of the liters/day needed may be calculated based on a forecasted patient stream, including