Sample records for forecasting methodologies details

  1. Alternative Methods of Base Level Demand Forecasting for Economic Order Quantity Items,

    DTIC Science & Technology

    1975-12-01

    Note: Adaptive Single Exponential Smoothing ... Choosing the Smoothing Constant ... methodology used in the study, an analysis of results, and a detailed summary. Chapter I, Methodology, contains a description of the data ... Chapter IV, Detailed Summary, presents a detailed summary of the findings, lists the limitations inherent in the research methodology, and
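The single exponential smoothing approach named in this record can be sketched in a few lines (the demand series and smoothing constant below are illustrative, not from the study):

```python
def single_exponential_smoothing(demand, alpha=0.2):
    """One-step-ahead forecasts: F[t] = alpha*D[t-1] + (1-alpha)*F[t-1]."""
    forecasts = [demand[0]]  # initialize with the first observation
    for t in range(1, len(demand)):
        forecasts.append(alpha * demand[t - 1] + (1 - alpha) * forecasts[t - 1])
    return forecasts
```

An "adaptive" variant, as the table of contents suggests, would additionally adjust `alpha` on the fly from the running forecast errors.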

  2. Evaluation of Wind Power Forecasts from the Vermont Weather Analytics Center and Identification of Improvements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Optis, Michael; Scott, George N.; Draxl, Caroline

    The goal of this analysis was to assess the wind power forecast accuracy of the Vermont Weather Analytics Center (VTWAC) forecast system and to identify potential improvements to the forecasts. Based on the analysis at Georgia Mountain, the following recommendations for improving forecast performance were made: (1) resolve the significant negative forecast bias in February-March 2017 (50% underprediction on average); (2) improve the ability of the forecast model to capture the strong diurnal cycle of wind power; (3) add the ability of the forecast model to assess internal wake losses, particularly at sites where strong diurnal shifts in wind direction are present. Data availability and quality limited the robustness of this forecast assessment. A more thorough analysis would be possible given a longer period of record for the data (at least one full year), detailed supervisory control and data acquisition (SCADA) data for each wind plant, and more detailed information on the forecast system input data and methodologies.

  3. Quality Assessment of the Cobel-Isba Numerical Forecast System of Fog and Low Clouds

    NASA Astrophysics Data System (ADS)

    Bergot, Thierry

    2007-06-01

    Short-term forecasting of fog is a difficult issue which can have a large societal impact. Fog appears in the surface boundary layer and is driven by the interactions between the land surface and the lower layers of the atmosphere. These interactions are still not well parameterized in current operational NWP models, so a new methodology based on local observations, an adaptive assimilation scheme and a local numerical model is tested. The proposed numerical forecast method for foggy conditions was run for three years at Paris-CdG international airport. This test over a long period allows an in-depth evaluation of forecast quality. The study demonstrates that detailed 1-D models, including detailed physical parameterizations and high vertical resolution, can reasonably represent the major features of the life cycle of fog (onset, development and dissipation) up to +6 h. The error in the forecast onset and burn-off times is typically 1 h. The major weakness of the methodology is related to the evolution of low clouds (stratus lowering). Even when the occurrence of fog is well forecast, the value of the horizontal visibility is only crudely forecast. Improvements in the microphysical parameterization and in the translation algorithm converting NWP prognostic variables into a corresponding horizontal visibility seem necessary to accurately forecast the visibility.
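The "translation algorithm" mentioned at the end of this record, which converts an NWP prognostic variable such as cloud liquid water content into a horizontal visibility, is typically an empirical power law combined with a Koschmieder-type contrast threshold. A minimal sketch of such a diagnostic; the coefficients are illustrative, not those of the Cobel-Isba system:

```python
def visibility_km(lwc_g_m3, a=144.7, b=0.88, contrast=3.912):
    """Diagnose visibility (km) from liquid water content (g/m^3).

    Extinction is modeled as a * LWC**b (per km); visibility then follows
    from a Koschmieder-type relation vis = contrast / extinction.
    """
    if lwc_g_m3 <= 0:
        return float("inf")  # no fog water: unlimited visibility
    extinction = a * lwc_g_m3 ** b
    return contrast / extinction
```

Denser fog (more liquid water) yields a shorter diagnosed visibility, which is the qualitative behavior the record's translation step must capture.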

  4. Documentation of volume 3 of the 1978 Energy Information Administration annual report to congress

    NASA Astrophysics Data System (ADS)

    1980-02-01

    In a preliminary overview of the projection process, the relationship between energy prices, supply, and demand is addressed. Topics treated in detail include a description of energy economic interactions, assumptions regarding world oil prices, and energy modeling in the long term beyond 1995. Subsequent sections present the general approach and methodology underlying the forecasts, and define and describe the alternative projection series and their associated assumptions. Short term forecasting, midterm forecasting, long term forecasting of petroleum, coal, and gas supplies are included. The role of nuclear power as an energy source is also discussed.

  5. Multilayer Stock Forecasting Model Using Fuzzy Time Series

    PubMed Central

    Javedani Sadaei, Hossein; Lee, Muhammad Hisyam

    2014-01-01

    After reviewing the vast body of literature on using FTS in stock market forecasting, certain deficiencies are identified in how existing findings have been hybridized. In addition, the lack of a constructive, systematic framework that could indicate the direction of growth for FTS forecasting systems as a whole is notable. In this study, we propose a multilayer model for stock market forecasting comprising five logically significant layers. Each layer has its own detailed concern and assists forecast development by resolving certain problems exclusively. To verify the model, a large dataset covering the Taiwan Stock Index (TAIEX), the National Association of Securities Dealers Automated Quotations (NASDAQ), the Dow Jones Industrial Average (DJI), and the S&P 500 was chosen as the experimental data. The results indicate that the proposed methodology has the potential to be accepted as a framework for model development in stock market forecasting using FTS. PMID:24605058

  6. Increasing the temporal resolution of direct normal solar irradiance forecasted series

    NASA Astrophysics Data System (ADS)

    Fernández-Peruchena, Carlos M.; Gastón, Martin; Schroedter-Homscheidt, Marion; Marco, Isabel Martínez; Casado-Rubio, José L.; García-Moya, José Antonio

    2017-06-01

    A detailed knowledge of the solar resource is a critical point in the design and control of Concentrating Solar Power (CSP) plants. In particular, accurate forecasting of solar irradiance is essential for the efficient operation of solar thermal power plants, the management of energy markets, and the widespread implementation of this technology. Numerical weather prediction (NWP) models are commonly used for solar radiation forecasting. In the ECMWF deterministic forecasting system, all forecast parameters are commercially available worldwide at 3-hourly intervals. Unfortunately, as Direct Normal solar Irradiance (DNI) exhibits a great variability due to the dynamic effects of passing clouds, 3-h time resolution is insufficient for accurate simulations of CSP plants due to their nonlinear response to DNI, governed by various thermal inertias due to their complex response characteristics. DNI series of hourly or sub-hourly frequency resolution are normally used for an accurate modeling and analysis of transient processes in CSP technologies. In this context, the objective of this study is to propose a methodology for generating synthetic DNI time series at 1-h (or higher) temporal resolution from 3-h DNI series. The methodology is based upon patterns as being defined with help of the clear-sky envelope approach together with a forecast of maximum DNI value, and it has been validated with high quality measured DNI data.
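One simple way to realize the clear-sky-envelope idea described in this record is to interpolate the clear-sky index (DNI divided by clear-sky DNI) between the 3-h values and rescale by an hourly clear-sky curve. This is a hedged sketch of the general approach, not the authors' validated algorithm; all values are illustrative:

```python
def upsample_dni(dni_3h, clearsky_3h, clearsky_1h):
    """Refine 3-hourly DNI to hourly resolution using a clear-sky envelope.

    dni_3h, clearsky_3h: values at 3-h steps (same length n);
    clearsky_1h: hourly clear-sky DNI, length 3*(n-1)+1.
    """
    # clear-sky index at the 3-h anchor points
    kc = [d / c if c > 0 else 0.0 for d, c in zip(dni_3h, clearsky_3h)]
    hourly = []
    for i in range(len(kc) - 1):
        for j in range(3):                      # interpolate kc within each block
            w = j / 3.0
            k = (1 - w) * kc[i] + w * kc[i + 1]
            hourly.append(k * clearsky_1h[3 * i + j])
    hourly.append(kc[-1] * clearsky_1h[-1])     # final anchor point
    return hourly
```

The paper's method additionally constrains the result with a forecast of the maximum DNI value; stochastic cloud-passage effects below the 3-h scale are not reproduced by a deterministic interpolation like this one.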

  7. El Niño-Southern Oscillation and the seasonal predictability of

    Science.gov Websites

    ... relationships and can be utilized to provide seasonal forecasts of tropical cyclones. Details of methodologies ... thunderstorm systems (called mesoscale convective complexes [MCCs]) often produce an inertially stable, warm ... they considered hurricanes and intense hurricanes that occurred anywhere within these water boundaries ...

  8. Methodology for Air Quality Forecast Downscaling from Regional- to Street-Scale

    NASA Astrophysics Data System (ADS)

    Baklanov, Alexander; Nuterman, Roman; Mahura, Alexander; Amstrup, Bjarne; Hansen Saas, Bent; Havskov Sørensen, Jens; Lorenzen, Thomas; Weismann, Jakob

    2010-05-01

    The most serious air pollution events occur in cities, where high population density combines with high air pollution, e.g. from vehicles. The pollutants can lead to serious human health problems, including asthma, irritation of the lungs, bronchitis, pneumonia, decreased resistance to respiratory infections, and premature death. In particular, air pollution is associated with increases in cardiovascular disease and lung cancer. In 2000, the WHO estimated that between 2.5% and 11% of total annual deaths are caused by exposure to air pollution. However, European-scale air quality models are not suited for local forecasts, as their grid cells are typically of the order of 5 to 10 km and they generally lack a detailed representation of urban effects. Two suites are used in the framework of the EC FP7 project MACC (Monitoring of Atmosphere Composition and Climate) to demonstrate how downscaling from the European MACC ensemble to local-scale air quality forecasts will be carried out: one illustrates capabilities for the city of Copenhagen (Denmark); the second focuses on the city of Bucharest (Romania). This work is devoted to the first suite and addresses methodological aspects of downscaling from the regional scale (Europe/Denmark) to the urban scale (Copenhagen), and from the urban scale down to the street scale. The first results of downscaling according to the proposed methodology are presented, and the potential for downscaling European air quality forecasts by operating urban and street-level forecast models is evaluated. This will strongly support continuous improvement of the regional forecast modelling systems for air quality in Europe, and underline clear perspectives for future regional air quality core and downstream services for end-users. At the end of the MACC project, requirements on how to downscale European air quality forecasts to the city and street levels with different approaches will be formulated.

  9. NMME Monthly / Seasonal Forecasts for NASA SERVIR Applications Science

    NASA Astrophysics Data System (ADS)

    Robertson, F. R.; Roberts, J. B.

    2014-12-01

    This work details the use of North American Multi-Model Ensemble (NMME) experimental forecasts as drivers for Decision Support Systems (DSSs) in the NASA / USAID initiative SERVIR (a Spanish acronym meaning "to serve"). SERVIR integrates satellite observations, ground-based data and forecast models to monitor and forecast environmental changes and to improve response to natural disasters. Through the use of DSSs whose "front ends" are physically based models, the SERVIR activity provides a natural testbed to determine the extent to which NMME monthly-to-seasonal projections enable scientists, educators, project managers and policy implementers in developing countries to better use probabilistic outlooks of seasonal hydrologic anomalies in assessing agricultural / food security impacts, water availability, and risk to societal infrastructure. The multi-model NMME framework provides a "best practices" approach to probabilistic forecasting. The NMME forecasts are generated at a resolution coarser than that required to support DSS models; downscaling in both space and time is therefore necessary. The methodology adopted here applies model output statistics (MOS): NMME ensemble monthly projections of sea-surface temperature (SST) and precipitation from 30 years of hindcasts are related to observations of precipitation and temperature for target regions. Since raw model forecasts are well known to have structural biases, a cross-validated multivariate regression methodology (canonical correlation analysis, CCA) is used to link the model-projected states, as predictors, to the predictands of the target region. The target regions include a number of basins in East and South Africa as well as the Ganges / Brahmaputra / Meghna basin complex. The MOS approach addresses spatial downscaling; temporal disaggregation of monthly seasonal forecasts is achieved through a tercile bootstrapping approach. We interpret the results of these studies, the levels of skill by several metrics, and key uncertainties.
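The tercile-based probabilistic product described in this record can be sketched as follows: the hindcast climatology defines the below/near/above-normal boundaries, and the forecast ensemble is binned against them. This is a schematic illustration, not the SERVIR production code, and the numbers are made up:

```python
import statistics

def tercile_probabilities(hindcast, ensemble):
    """Probability of below/near/above-normal conditions from an ensemble."""
    lo, hi = statistics.quantiles(hindcast, n=3)  # climatological terciles
    counts = [0, 0, 0]
    for member in ensemble:
        if member < lo:
            counts[0] += 1          # below normal
        elif member <= hi:
            counts[1] += 1          # near normal
        else:
            counts[2] += 1          # above normal
    return [c / len(ensemble) for c in counts]
```

The record's tercile bootstrapping additionally resamples historical sub-monthly sequences consistent with the forecast tercile, which this sketch does not attempt.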

  10. NMME Monthly / Seasonal Forecasts for NASA SERVIR Applications Science

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Roberts, Jason B.

    2014-01-01

    This work details the use of North American Multi-Model Ensemble (NMME) experimental forecasts as drivers for Decision Support Systems (DSSs) in the NASA / USAID initiative SERVIR (a Spanish acronym meaning "to serve"). SERVIR integrates satellite observations, ground-based data and forecast models to monitor and forecast environmental changes and to improve response to natural disasters. Through the use of DSSs whose "front ends" are physically based models, the SERVIR activity provides a natural testbed to determine the extent to which NMME monthly-to-seasonal projections enable scientists, educators, project managers and policy implementers in developing countries to better use probabilistic outlooks of seasonal hydrologic anomalies in assessing agricultural / food security impacts, water availability, and risk to societal infrastructure. The multi-model NMME framework provides a "best practices" approach to probabilistic forecasting. The NMME forecasts are generated at a resolution coarser than that required to support DSS models; downscaling in both space and time is therefore necessary. The methodology adopted here applies model output statistics (MOS): NMME ensemble monthly projections of sea-surface temperature (SST) and precipitation from 30 years of hindcasts are related to observations of precipitation and temperature for target regions. Since raw model forecasts are well known to have structural biases, a cross-validated multivariate regression methodology (canonical correlation analysis, CCA) is used to link the model-projected states, as predictors, to the predictands of the target region. The target regions include a number of basins in East and South Africa as well as the Ganges / Brahmaputra / Meghna basin complex. The MOS approach addresses spatial downscaling; temporal disaggregation of monthly seasonal forecasts is achieved through a tercile bootstrapping approach. We interpret the results of these studies, the levels of skill by several metrics, and key uncertainties.

  11. Outside users payload model

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The outside users payload model, which continues a series of documents and replaces and supersedes the July 1984 edition, is presented. The time period covered by this model is 1985 through 2000. The following sections are included: (1) definition of the scope of the model; (2) discussion of the methodology used; (3) overview of total demand; (4) summary of the estimated market segmentation by launch vehicle; (5) summary of the estimated market segmentation by user type; (6) details of the STS market forecast; (7) summary of transponder trends; (8) model overview by mission category; and (9) detailed mission models. All known non-NASA, non-DOD reimbursable payloads forecast to be flown by non-Soviet-bloc countries are included in this model, with the exception of Spacelab payloads and small self-contained payloads. Certain DOD-sponsored or cosponsored payloads are included if they are reimbursable launches.

  12. Model documentation report: Residential sector demand module of the national energy modeling system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This report documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Residential Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, and FORTRAN source code. This reference document provides a detailed description for energy analysts, other users, and the public. The NEMS Residential Sector Demand Module is currently used for mid-term forecasting purposes and energy policy analysis over the forecast horizon of 1993 through 2020. The model generates forecasts of energy demand for the residential sector by service, fuel, and Census Division. Policy impacts resulting from new technologies, market incentives, and regulatory changes can be estimated using the module. 26 refs., 6 figs., 5 tabs.

  13. Model documentation Natural Gas Transmission and Distribution Model of the National Energy Modeling System. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-02-26

    The Natural Gas Transmission and Distribution Model (NGTDM) of the National Energy Modeling System is developed and maintained by the Energy Information Administration (EIA), Office of Integrated Analysis and Forecasting. This report documents the archived version of the NGTDM that was used to produce the natural gas forecasts presented in the Annual Energy Outlook 1996 (DOE/EIA-0383(96)). The purpose of this report is to provide a reference document for model analysts, users, and the public that defines the objectives of the model, describes its basic approach, and provides detail on the methodology employed. Previously this report represented Volume I of a two-volume set; Volume II reported on model performance, detailing convergence criteria and properties, results of sensitivity testing, comparison of model outputs with the literature and/or other model results, and major unresolved issues.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendes, J.; Bessa, R.J.; Keko, H.

    Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state of the art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty.
    Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios (with spatial and/or temporal dependence). Statistical approaches to uncertainty forecasting basically consist of estimating the uncertainty based on observed forecasting errors. Quantile regression (QR) is currently a commonly used approach in uncertainty forecasting. In Chapter 3, we propose new statistical approaches to the uncertainty estimation problem by employing kernel density forecast (KDF) methods. We use two estimators in both offline and time-adaptive modes, namely, the Nadaraya-Watson (NW) and Quantile-copula (QC) estimators. We conduct detailed tests of the new approaches using QR as a benchmark. One of the major issues in wind power generation is sudden and large changes of wind power output over a short period of time, namely ramping events. In Chapter 4, we perform a comparative study of existing definitions and methodologies for ramp forecasting. We also introduce a new probabilistic method for ramp event detection. The method starts with a stochastic algorithm that generates wind power scenarios, which are passed through a high-pass filter for ramp detection and estimation of the likelihood of ramp events. The report is organized as follows: Chapter 2 presents the results of the application of ITL training criteria to deterministic WPF; Chapter 3 reports the study on probabilistic WPF, including new contributions to wind power uncertainty forecasting; Chapter 4 presents a new method to predict and visualize ramp events, comparing it with state-of-the-art methodologies; Chapter 5 briefly summarizes the main findings and contributions of this report.
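The core statistical idea in this record, estimating uncertainty from observed forecasting errors, can be illustrated with plain empirical quantiles. (Quantile regression and the kernel density estimators named above are the report's actual methods; this unconditional stand-in is for illustration only.)

```python
import statistics

def error_quantile_band(errors, point_forecast, n_quantiles=10):
    """Predictive quantiles: point forecast plus quantiles of past errors.

    errors: history of (observed - forecast) values for similar conditions.
    """
    qs = statistics.quantiles(errors, n=n_quantiles, method="inclusive")
    return [point_forecast + q for q in qs]
```

A conditional method such as quantile regression would instead let these quantiles vary with, e.g., the predicted power level or weather regime.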

  15. A Delphi Forecast of Technology in Education.

    ERIC Educational Resources Information Center

    Robinson, Burke E.

    The forecast reported here surveys expected utilization levels, organizational structures, and values concerning technology in education in 1990. The focus is upon educational technology and forecasting methodology; televised instruction, computer-assisted instruction (CAI), and information services are considered. The methodology employed…

  16. A data-driven multi-model methodology with deep feature selection for short-term wind forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias

    With the growing wind penetration into power systems worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by the first-layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks, and its effectiveness is evaluated for 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network. Numerical results show that, compared to single-algorithm models, the developed multi-model framework with the deep feature selection procedure improves forecasting accuracy by up to 30%.
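The two-layer structure described in this record, individual first-layer forecasts combined by a second-layer blender, can be sketched with toy base models (persistence and a moving average) and a grid-searched blending weight. The real work uses multiple machine learning algorithms in both layers; everything below is an illustrative stand-in:

```python
def persistence(series, t):
    """First-layer model 1: last observed value."""
    return series[t - 1]

def moving_average(series, t, k=3):
    """First-layer model 2: mean of the last k observations."""
    window = series[max(0, t - k):t]
    return sum(window) / len(window)

def blend_weight(series, start=3):
    """Second layer: choose w minimizing squared error of the blend
    w*persistence + (1-w)*moving_average on the training series."""
    best_w, best_err = 0.0, float("inf")
    for i in range(101):
        w = i / 100
        err = sum(
            (w * persistence(series, t)
             + (1 - w) * moving_average(series, t) - series[t]) ** 2
            for t in range(start, len(series))
        )
        if err < best_err:
            best_w, best_err = w, err
    return best_w
```

On a steadily trending series the blend learns to favor persistence; the paper's blending step plays the same role across its first-layer machine learning models.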

  17. Model documentation report: Transportation sector model of the National Energy Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-03-01

    This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model. This document serves three purposes. First, it is a reference document providing a detailed description of TRAN for model analysts, users, and the public. Second, this report meets the legal requirements of the Energy Information Administration (EIA) to provide adequate documentation in support of its statistical and forecast reports (Public Law 93-275, 57(b)(1)). Third, it permits continuity in model development by providing documentation from which energy analysts can undertake model enhancements, data updates, and parameter refinements.

  18. Worldwide transportation/energy demand, 1975-2000. Revised Variflex model projections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayres, R.U.; Ayres, L.W.

    1980-03-01

    The salient features of the transportation-energy relationships that characterize the world of 1975 are reviewed, and worldwide (34 countries) long-range transportation demand by mode to the year 2000 is projected. A worldwide model is used to estimate future energy demand for transportation. Projections made by the forecasting model indicate that in the year 2000, every region will be more dependent on petroleum for the transportation sector than it was in 1975. This report is intended to highlight certain trends and to suggest areas for further investigation. Forecast methodology and model output are described in detail in the appendices. The report is one of a series addressing transportation energy consumption; it supplants and replaces an earlier version published in October 1978 (ORNL/Sub-78/13536/1).

  19. Technology requirements for future Earth-to-geosynchronous orbit transportation systems. Volume 3: Appendices

    NASA Technical Reports Server (NTRS)

    Caluori, V. A.; Conrad, R. T.; Jenkins, J. C.

    1980-01-01

    Technological requirements and forecasts of rocket engine parameters and launch vehicles for future Earth-to-geosynchronous-orbit transportation systems are presented. The parametric performance, weight, and envelope data for the LOX/CH4, fuel-cooled, staged combustion cycle and the hydrogen-cooled, expander bleed cycle engine concepts are discussed. The costing methodology and ground rules used to develop the engine study are summarized. The weight-estimating methodology for winged launch vehicles is described, and summary data used to evaluate and compare weight data for dedicated and integrated O2/H2 subsystems for the SSTO, HLLV and POTV are presented. Detailed weights, comparisons, and weight-scaling equations are provided.

  20. Short-term energy outlook. Volume 2. Methodology

    NASA Astrophysics Data System (ADS)

    1983-05-01

    Recent changes in forecasting methodology for nonutility distillate fuel oil demand and for the near-term petroleum forecasts are discussed. The accuracy of previous short-term forecasts of most of the major energy sources published in the last 13 issues of the Outlook is evaluated. Macroeconomic and weather assumptions are included in this evaluation. Energy forecasts for 1983 are compared. Structural change in US petroleum consumption, the use of appropriate weather data in energy demand modeling, and petroleum inventories, imports, and refinery runs are discussed.

  1. On the effect of model parameters on forecast objects

    NASA Astrophysics Data System (ADS)

    Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott

    2018-04-01

    Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. The field for some quantities generally consists of spatially coherent and disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.
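Ingredient (1) of the methodology above, Latin hypercube sampling of model parameters, can be sketched in pure Python (parameter bounds here are illustrative):

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """Draw n_samples points, one per equal-probability stratum per dimension.

    bounds: list of (low, high) ranges, one per model parameter.
    """
    rng = random.Random(seed)
    columns = []
    for lo, hi in bounds:
        # one point in each of n_samples equal sub-intervals, then shuffled
        pts = [lo + (hi - lo) * (i + rng.random()) / n_samples
               for i in range(n_samples)]
        rng.shuffle(pts)
        columns.append(pts)
    return list(zip(*columns))  # n_samples tuples of parameter values
```

Each parameter is thus sampled evenly across its range with far fewer model runs than a full grid would require, which is why the method suits expensive NWP sensitivity studies.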

  2. Arab energy: prospects to 2000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1982-01-01

    The energy situation of 21 Arab countries for the period between 1960 and 2000 is examined. Attempts to forecast the demand and supply of energy in the Arab world for 1985, 1990 and 2000 are discussed. Following a description of the methodology employed, crude petroleum, petroleum production, natural gas and electricity are explored in detail. The national programs of the Arab countries for electric-power generation include conventional thermal electricity, hydroelectricity, nuclear power, solar energy, biomass conversion, and geothermal and wind energy. 23 references.

  3. Economic consequences of improved temperature forecasts: An experiment with the Florida citrus growers (control group results). [weather forecasting

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A demonstration experiment is being planned to show that frost and freeze prediction improvements are possible using timely Synchronous Meteorological Satellite temperature measurements, and that this information can affect Florida citrus growers' operations and decisions. An economic experiment was carried out that will monitor citrus growers' decisions, actions, costs and losses, together with meteorological forecasts and actual weather events, and will establish the economic benefits of improved temperature forecasts. A summary is given of the economic experiment, the results obtained to date, and the work which still remains to be done. Specifically, the following are described in detail: the experiment design; the data collection methodology and procedures; the sampling plan; data reduction techniques; cost and loss models; the establishment of frost severity measures; data obtained from citrus growers, the National Weather Service, and the Federal Crop Insurance Corp.; the resulting protection costs and crop losses for the control group sample; the extrapolation of control group results to the Florida citrus industry; and the method for normalizing these results to a normal or average frost season so that they may be compared with anticipated similar results from test group measurements.

  4. Long-term ensemble forecast of snowmelt inflow into the Cheboksary Reservoir under two different weather scenarios

    NASA Astrophysics Data System (ADS)

    Gelfan, Alexander; Moreydo, Vsevolod; Motovilov, Yury; Solomatine, Dimitri P.

    2018-04-01

    A long-term forecasting ensemble methodology, applied to water inflows into the Cheboksary Reservoir (Russia), is presented. The methodology is based on a version of the semi-distributed hydrological model ECOMAG (ECOlogical Model for Applied Geophysics) that allows for the calculation of an ensemble of inflow hydrographs using two different sets of weather ensembles for the lead time period: observed weather data, constructed on the basis of the Ensemble Streamflow Prediction methodology (ESP-based forecast), and synthetic weather data, simulated by a multi-site weather generator (WG-based forecast). We have studied the following: (1) whether there is any advantage of the developed ensemble forecasts in comparison with the currently issued operational forecasts of water inflow into the Cheboksary Reservoir, and (2) whether there is any noticeable improvement in probabilistic forecasts when using the WG-simulated ensemble compared to the ESP-based ensemble. We have found that for a 35-year period beginning from the reservoir filling in 1982, both continuous and binary model-based ensemble forecasts (issued in the deterministic form) outperform the operational forecasts of the April-June inflow volume actually used and, additionally, provide acceptable forecasts of additional water regime characteristics besides the inflow volume. We have also demonstrated that the model performance measures (in the verification period) obtained from the WG-based probabilistic forecasts, which are based on a large number of possible weather scenarios, appeared to be more statistically reliable than the corresponding measures calculated from the ESP-based forecasts based on the observed weather scenarios.
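The ESP-based branch of the methodology above resamples historical weather to drive the hydrological model from the current state. Schematically, with a toy runoff model standing in for ECOMAG and made-up numbers:

```python
def esp_ensemble(initial_state, historical_precip, runoff_coeff=0.6):
    """One inflow-volume member per historical weather scenario.

    initial_state: current basin condition (e.g., snow/soil storage proxy);
    historical_precip: one seasonal precipitation total per historical year.
    """
    return [initial_state + runoff_coeff * p for p in historical_precip]

def exceedance_probability(ensemble, threshold):
    """Fraction of members at or above a flood-relevant threshold."""
    return sum(1 for v in ensemble if v >= threshold) / len(ensemble)
```

The WG-based branch would replace the historical scenarios with a much larger set of synthetic weather sequences from a stochastic weather generator, which is what makes its probability estimates statistically more reliable.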

  5. A versatile data-visualization application for the Norwegian flood forecasting service

    NASA Astrophysics Data System (ADS)

    Kobierska, Florian; Langsholt, Elin G.; Hamududu, Byman H.; Engeland, Kolbjørn

    2017-04-01

    - General motivation A graphical user interface has been developed to visualize multi-model hydrological forecasts at the flood forecasting service of the Norwegian Water Resources and Energy Directorate (NVE). It is based on the R 'shiny' package, with which interactive web applications can be quickly prototyped. The app queries multiple data sources, building a comprehensive infographics dashboard for the decision maker. - Main features of the app The visualization application comprises several tabs, each built with a different functionality and focus. A map of forecast stations (based on the 'leaflet' package) gives a rapid overview of the flood situation and simultaneously serves as a station selector. The map selection is linked to multi-panel forecast plots that can present input, state or runoff parameters. Another tab focuses on past model performance and calibration runs. - Software design choices The application was programmed with a focus on flexibility regarding data sources. The parsing of text-based model results was explicitly separated from the app (in the separate R package 'NVEDATA'), so that the app only loads standardized RData binary files. We focused on re-usability in other contexts by structuring the app into specific 'shiny' modules. The code was bundled into an R package, which is available on GitHub. - Documentation efforts A documentation website is under development. For easier collaboration, we chose to host it on the 'GitHub Pages' branch of the repository and build it automatically with a continuous integration service. The aim is to gather all information about the flood forecasting methodology at NVE in one location, encompassing details on each hydrological model used as well as the documentation of the data-visualization application. - Outlook for further development The ability to select a group of stations by filtering a table (e.g. past performance, past major flood events, catchment parameters) and to export the selection to the forecast tab could be of interest for detailed model analysis. The design choices for this app were motivated by a need for extensibility and modularity, and those qualities will be tested and improved as new datasets need integrating into this tool.

  6. Methodological Problems in the Forecasting of Education

    ERIC Educational Resources Information Center

    Kostanian, S. L.

    1978-01-01

    Examines how forecasting of educational development in the Soviet Union can be coordinated with forecasts of scientific and technical progress. Predicts that the efficiency of social forecasting will increase when more empirical data on macro- and micro-processes is collected. (Author/DB)

  7. Louisiana Airport System Plan aviation activity forecasts 1990-2010.

    DOT National Transportation Integrated Search

    1991-07-01

    This report documents the methodology used to develop the aviation activity forecasts prepared as a part of the update to the Louisiana Airport System Plan and provides Louisiana aviation forecasts for the years 1990 to 2010. In general, the forecast...

  8. An Econometric Model for Forecasting Income and Employment in Hawaii.

    ERIC Educational Resources Information Center

    Chau, Laurence C.

    This report presents the methodology for short-run forecasting of personal income and employment in Hawaii. The econometric model developed in the study is used to make actual forecasts through 1973 of income and employment, with major components forecasted separately. Several sets of forecasts are made, under different assumptions on external…

  9. Geothermal energy employment and requirements 1977-1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-12-01

    An assessment of the manpower needs of the geothermal industry is presented. The specific objectives were to: derive a baseline estimate of the manpower involved in geothermal activities; determine whether there is any current or impending likelihood of skill shortages; forecast future employment in the geothermal industry; conduct a technology assessment to ascertain the possibility of a sudden breakthrough; and suggest alternatives commensurate with the findings. The methodology for fulfilling these objectives is described, detailed results for each objective are presented, and the alternatives suggested on the basis of the study's findings are summarized.

  10. How bootstrap can help in forecasting time series with more than one seasonal pattern

    NASA Astrophysics Data System (ADS)

    Cordeiro, Clara; Neves, M. Manuela

    2012-09-01

    The search for the future is an appealing challenge in time series analysis. The diversity of forecasting methodologies is inevitable and still expanding. Exponential smoothing methods are the launch platform for modelling and forecasting in time series analysis. Recently this methodology has been combined with bootstrapping, revealing a good performance. The Boot.EXPOS algorithm, which combines exponential smoothing and bootstrap methodologies, has shown promising results for forecasting time series with one seasonal pattern. For series with more than one seasonal pattern, the double seasonal Holt-Winters and exponential smoothing methods were developed. The new challenge was to combine these seasonal methods with the bootstrap and to carry over a resampling scheme similar to the one used in the Boot.EXPOS procedure. The performance of this partnership is illustrated for some well-known data sets available in standard software.
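    The pairing of exponential smoothing with residual resampling that underlies Boot.EXPOS can be sketched in miniature. The sketch below uses plain simple exponential smoothing on a synthetic non-seasonal series (the actual procedure, and its double seasonal Holt-Winters extension, are considerably richer); all names and numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def ses_fit(y, alpha):
    # Simple exponential smoothing: final level and one-step residuals.
    level, resid = y[0], []
    for obs in y[1:]:
        resid.append(obs - level)                  # one-step-ahead error
        level = alpha * obs + (1 - alpha) * level  # level update
    return level, np.array(resid)

y = np.cumsum(rng.normal(0.1, 1.0, 200)) + 50.0    # toy non-seasonal series
alpha = 0.3
point_fc, resid = ses_fit(y, alpha)
resid = resid - resid.mean()                       # centre before resampling

# Bootstrap step (in the spirit of Boot.EXPOS, simplified): resample
# residuals, rebuild a synthetic series, refit, collect its forecast.
B, forecasts = 500, []
for _ in range(B):
    lvl, boot_y = y[0], [y[0]]
    for _ in range(1, len(y)):
        obs = lvl + rng.choice(resid)              # synthetic observation
        boot_y.append(obs)
        lvl = alpha * obs + (1 - alpha) * lvl
    fc, _ = ses_fit(np.array(boot_y), alpha)       # flat h-step SES forecast
    forecasts.append(fc)

print(f"point forecast: {point_fc:.2f}")
print(f"bootstrap 90% interval: {np.percentile(forecasts, [5, 95]).round(2)}")
```

    The distribution of refitted forecasts gives prediction intervals without a Gaussian error assumption, which is the main attraction of combining the bootstrap with smoothing methods.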

  11. An experimental system for flood risk forecasting at global scale

    NASA Astrophysics Data System (ADS)

    Alfieri, L.; Dottori, F.; Kalas, M.; Lorini, V.; Bianchi, A.; Hirpa, F. A.; Feyen, L.; Salamon, P.

    2016-12-01

    Global flood forecasting and monitoring systems are nowadays a reality and are being applied by an increasing range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk based forecasts, combining streamflow estimations with expected inundated areas and flood impacts. To this end, we have developed an experimental procedure for near-real time flood mapping and impact assessment based on the daily forecasts issued by the Global Flood Awareness System (GloFAS). The methodology translates GloFAS streamflow forecasts into event-based flood hazard maps based on the predicted flow magnitude and the forecast lead time and a database of flood hazard maps with global coverage. Flood hazard maps are then combined with exposure and vulnerability information to derive flood risk. Impacts of the forecasted flood events are evaluated in terms of flood prone areas, potential economic damage, and affected population, infrastructures and cities. To further increase the reliability of the proposed methodology we integrated model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification of impact forecasts. The preliminary tests provided good results and showed the potential of the developed real-time operational procedure in helping emergency response and management. In particular, the link with social media is crucial for improving the accuracy of impact predictions.

  12. Diabatic forcing and initialization with assimilation of cloud and rain water in a forecast model: Methodology

    NASA Technical Reports Server (NTRS)

    Raymond, William H.; Olson, William S.; Callan, Geary

    1990-01-01

    The focus of this part of the investigation is to find one or more general modeling techniques that will help reduce the time taken by numerical forecast models to initiate or spin up precipitation processes and enhance storm intensity. If the conventional data base could explain the atmospheric mesoscale flow in detail, then much of our problem would be eliminated. But the data base is primarily synoptic scale, requiring that a solution be sought either in nonconventional data, in methods to initialize mesoscale circulations, or in ways of retaining between forecasts the model-generated mesoscale dynamics and precipitation fields. All three methods are investigated. The initialization and assimilation of explicit cloud and rain water quantities computed from conservation equations in a mesoscale regional model are examined. The physical processes include condensation, evaporation, autoconversion, accretion, and the removal of rainwater by fallout. How to initialize the explicit liquid water calculations in numerical models and how to retain information about precipitation processes during the 4-D assimilation cycle are important issues that are addressed. The explicit cloud calculations were purposely kept simple so that different initialization techniques could be easily and economically tested. Precipitation spin-up processes associated with three different types of weather phenomena are examined. Our findings show that diabatic initialization, alone or in combination with a new diabatic forcing procedure, works effectively to enhance the spin-up of precipitation in a mesoscale numerical weather prediction forecast. Also, the retention of cloud and rain water during the analysis phase of the 4-D data assimilation procedure is shown to be valuable. Without detailed observations, the vertical placement of the diabatic heating remains a critical problem.

  13. Metric optimisation for analogue forecasting by simulated annealing

    NASA Astrophysics Data System (ADS)

    Bliefernicht, J.; Bárdossy, A.

    2009-04-01

    It is well known that weather patterns tend to recur from time to time. This property of the atmosphere is used by analogue forecasting techniques. They have a long history in weather forecasting, and there are many applications predicting hydrological variables at the local scale for different lead times. The basic idea of the technique is to identify past weather situations which are similar (analogue) to the predicted one and to take the local conditions of the analogues as the forecast. However, the forecast performance of the analogue method depends on user-defined criteria such as the choice of the distance function and the size of the predictor domain. In this study we propose a new methodology for optimising both criteria by minimising the forecast error with simulated annealing. The performance of the methodology is demonstrated for the probability forecast of daily areal precipitation. It is compared with a traditional analogue forecasting algorithm, which is used operationally as an element of a hydrological forecasting system. The study is performed for several meso-scale catchments located in the Rhine basin in Germany. The methodology is validated by a jack-knife method in a perfect prognosis framework for a period of 48 years (1958-2005). The predictor variables are derived from the NCEP/NCAR reanalysis data set. The Brier skill score and the economic value are determined to evaluate the forecast skill and value of the technique. We present the concept of the optimisation algorithm and the outcome of the comparison, and also demonstrate how a decision maker should apply a probability forecast to maximise the economic benefit derived from it.
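    A stripped-down version of the approach can be sketched as follows: a weighted distance over a hypothetical gridded predictor selects k analogue days, the analogues' precipitation yields an exceedance probability, and simulated annealing tunes the metric weights against the Brier score. The real study optimises the distance function and predictor domain against observed daily areal precipitation; everything below is synthetic and illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy archive: a coarse predictor field (e.g. pressure at 25 grid points)
# and observed daily areal precipitation for each archive day.
n_days, n_grid = 1000, 25
fields = rng.normal(size=(n_days, n_grid))
precip = np.maximum(0.0, fields[:, :5].mean(axis=1) * 3 + rng.normal(size=n_days))

def analogue_prob(day, weights, k=30, threshold=1.0):
    # Weighted Euclidean distance: the weights define the metric
    # being optimised.
    d = np.sqrt(((fields - fields[day]) ** 2 * weights).sum(axis=1))
    analogues = np.argsort(d)[1:k + 1]     # skip the day itself (jack-knife)
    return (precip[analogues] > threshold).mean()

def brier(weights, days):
    obs = (precip[days] > 1.0).astype(float)
    p = np.array([analogue_prob(t, weights) for t in days])
    return ((p - obs) ** 2).mean()

# Simulated annealing over the metric weights (heavily simplified).
days = rng.choice(n_days, 50, replace=False)
w = np.ones(n_grid)
score, temp = brier(w, days), 1.0
for _ in range(100):
    w_new = np.clip(w + rng.normal(0.0, 0.1, n_grid), 0.0, None)
    s_new = brier(w_new, days)
    if s_new < score or rng.random() < np.exp((score - s_new) / temp):
        w, score = w_new, s_new           # accept better (or lucky) move
    temp *= 0.97                          # cool the annealing schedule

print(f"Brier score after optimisation: {score:.3f}")
```

    In the optimised metric, grid points that carry predictive information receive larger weights, which is how the annealing effectively learns both the distance function and the useful part of the predictor domain.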

  14. Short-term forecasting of turbidity in trunk main networks.

    PubMed

    Meyers, Gregory; Kapelan, Zoran; Keedwell, Edward

    2017-11-01

    Water discolouration is an increasingly important and expensive issue due to rising customer expectations, tighter regulatory demands and ageing Water Distribution Systems (WDSs) in the UK and abroad. This paper presents a new turbidity forecasting methodology capable of aiding operational staff and enabling proactive management strategies. The turbidity forecasting methodology developed here is completely data-driven and does not require a hydraulic or water quality network model, which would be expensive to build and maintain. The methodology is tested and verified on a real trunk main network with observed turbidity measurement data. Results obtained show that the methodology can detect if discolouration material is mobilised, estimate whether sufficient turbidity will be generated to exceed a preselected threshold, and approximate how long the material will take to reach the downstream meter. Classification-based forecasts of turbidity can be reliably made up to 5 h ahead, although at the expense of increased false alarm rates. The methodology presented here could be used as an early warning system enabling a multitude of cost-beneficial proactive management strategies as an alternative to expensive trunk main cleaning programs.

  15. Using the Random Nearest Neighbor Data Mining Method to Extract Maximum Information Content from Weather Forecasts from Multiple Predictors of Weather and One Predictand (Low-Level Turbulence)

    DTIC Science & Technology

    2014-10-30

    Force Weather Agency (AFWA) WRF 15-km atmospheric model forecast data and low-level turbulence. Archives of historical model data forecast predictors...Relationships between WRF model predictors and PIREPS were developed using the new data mining methodology. The new methodology was inspired...convection. Predictors of turbulence were collected from the AFWA WRF 15km model, and corresponding PIREPS (the predictand) were collected between 2013

  16. [50 years of the methodology of weather forecasting for medicine].

    PubMed

    Grigor'ev, K I; Povazhnaia, E L

    2014-01-01

    The materials reported in the present article illustrate, in historical perspective, the possibilities of weather forecasting for medical purposes. The main characteristics of the relevant organizational and methodological approaches to meteoprophylaxis based on standard medical weather forecasts are presented. Emphasis is laid on the priority of the domestic medical school in developing the principles of diagnostics and treatment of meteosensitivity and meteotropic complications in patients presenting with various diseases, with special reference to their age-related characteristics.

  17. A novel hybrid ensemble learning paradigm for tourism forecasting

    NASA Astrophysics Data System (ADS)

    Shabri, Ani

    2015-02-01

    In this paper, a hybrid forecasting model based on Empirical Mode Decomposition (EMD) and the Group Method of Data Handling (GMDH) is proposed to forecast tourism demand. The methodology first decomposes the original visitor arrival series into several Intrinsic Mode Function (IMF) components and one residual component using the EMD technique. The IMF components and the residual component are then forecast separately using GMDH models whose input variables are selected with the Partial Autocorrelation Function (PACF). The final forecast for the tourism series is produced by aggregating all the component forecasts. To evaluate the performance of the proposed EMD-GMDH methodology, monthly data of tourist arrivals from Singapore to Malaysia are used as an illustrative example. Empirical results show that the proposed EMD-GMDH model outperforms the EMD-ARIMA model as well as the GMDH and ARIMA (Autoregressive Integrated Moving Average) models without time series decomposition.

  18. Past speculations of the future: a review of the methods used for forecasting emerging health technologies

    PubMed Central

    Doos, Lucy; Packer, Claire; Ward, Derek; Simpson, Sue; Stevens, Andrew

    2016-01-01

    Objectives: Forecasting can support rational decision-making around the introduction and use of emerging health technologies and prevent investment in technologies that have limited long-term potential. However, forecasting methods need to be credible. We performed a systematic search to identify the methods used in forecasting studies to predict future health technologies within a 3–20-year timeframe. Identification and retrospective assessment of such methods potentially offer a route to more reliable prediction. Design: Systematic search of the literature to identify studies reporting on methods of forecasting in healthcare. Participants: People are not needed in this study. Data sources: The authors searched MEDLINE, EMBASE, PsychINFO and grey literature sources, and included articles published in English that reported their methods and a list of identified technologies. Main outcome measure: Studies reporting methods used to predict future health technologies within a 3–20-year timeframe with an identified list of individual healthcare technologies. Commercially sponsored reviews, long-term futurology studies (with over 20-year timeframes) and speculative editorials were excluded. Results: 15 studies met our inclusion criteria. The majority of studies (13/15) consulted experts either alone or in combination with other methods such as literature searching. Only 2 studies used more complex forecasting tools such as scenario building. Conclusions: The methodological fundamentals of formal 3–20-year prediction are consistent but vary in details. Further research needs to be conducted to ascertain if the predictions made were accurate and whether accuracy varies by the methods used or by the types of technologies identified. PMID:26966060

  19. A time series model: First-order integer-valued autoregressive (INAR(1))

    NASA Astrophysics Data System (ADS)

    Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.

    2017-07-01

    Nonnegative integer-valued time series arise in many applications. The first-order integer-valued autoregressive model, INAR(1), is constructed with the binomial thinning operator to model nonnegative integer-valued time series. INAR(1) depends on the process one period before. The model parameters can be estimated by Conditional Least Squares (CLS). The specification of INAR(1) follows that of AR(1). Forecasting in INAR(1) uses a median or a Bayesian forecasting methodology. The median forecast is the least integer s for which the cumulative distribution function (CDF) up to s is greater than or equal to 0.5. The Bayesian methodology forecasts h steps ahead by generating the model parameter and the innovation-term parameter using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), and then finding the least integer s for which the CDF up to s is greater than or equal to u, where u is a value drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 until April 2016.
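    The ingredients named in the abstract (binomial thinning, CLS estimation, and the median forecast) can be sketched end to end. The series and parameter values below are simulated, not the Penjaringan pneumonia data, and the Poisson innovation is one common choice rather than the paper's confirmed specification.

```python
import math
import numpy as np

rng = np.random.default_rng(3)

# Simulate an INAR(1) count series: X_t = alpha o X_{t-1} + eps_t,
# where "o" is binomial thinning and eps_t ~ Poisson(lam).
alpha_true, lam_true, n = 0.5, 2.0, 500
x = np.empty(n, dtype=int)
x[0] = 4
for t in range(1, n):
    survivors = rng.binomial(x[t - 1], alpha_true)   # binomial thinning
    x[t] = survivors + rng.poisson(lam_true)

# Conditional least squares: E[X_t | X_{t-1}] = alpha*X_{t-1} + lam,
# so (alpha, lam) follow from regressing X_t on X_{t-1}.
prev, curr = x[:-1].astype(float), x[1:].astype(float)
alpha_hat = (((prev - prev.mean()) * (curr - curr.mean())).mean()
             / ((prev - prev.mean()) ** 2).mean())
lam_hat = curr.mean() - alpha_hat * prev.mean()

def median_forecast(last, alpha, lam, smax=100):
    # One-step conditional law: Binomial(last, alpha) + Poisson(lam).
    # The median forecast is the least s with CDF(s) >= 0.5.
    binom = [math.comb(last, k) * alpha**k * (1 - alpha)**(last - k)
             for k in range(last + 1)]
    pois = [math.exp(-lam) * lam**j / math.factorial(j) for j in range(smax)]
    cdf = 0.0
    for s in range(smax):
        cdf += sum(binom[k] * pois[s - k] for k in range(min(last, s) + 1))
        if cdf >= 0.5:
            return s
    return smax

print(f"CLS estimates: alpha={alpha_hat:.2f}, lambda={lam_hat:.2f}")
print(f"median one-step forecast from X_n={x[-1]}: "
      f"{median_forecast(int(x[-1]), alpha_hat, lam_hat)}")
```

    The Bayesian variant described in the abstract replaces the CLS point estimates with ARMS-within-Gibbs draws of the parameters and reads the forecast off the CDF at a Uniform(0,1) level u instead of 0.5.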

  20. Action-based flood forecasting for triggering humanitarian action

    NASA Astrophysics Data System (ADS)

    Coughlan de Perez, Erin; van den Hurk, Bart; van Aalst, Maarten K.; Amuron, Irene; Bamanya, Deus; Hauser, Tristan; Jongma, Brenden; Lopez, Ana; Mason, Simon; Mendler de Suarez, Janot; Pappenberger, Florian; Rueth, Alexandra; Stephens, Elisabeth; Suarez, Pablo; Wagemaker, Jurjen; Zsoter, Ervin

    2016-09-01

    Too often, credible scientific early warning information of increased disaster risk does not result in humanitarian action. With financial resources tilted heavily towards response after a disaster, disaster managers have limited incentive and ability to process complex scientific data, including uncertainties. These incentives are beginning to change, with the advent of several new forecast-based financing systems that provide funding based on a forecast of an extreme event. Given the changing landscape, here we demonstrate a method to select and use appropriate forecasts for specific humanitarian disaster prevention actions, even in a data-scarce location. This action-based forecasting methodology takes into account the parameters of each action, such as action lifetime, when verifying a forecast. Forecasts are linked with action based on an understanding of (1) the magnitude of previous flooding events and (2) the willingness to act "in vain" for specific actions. This is applied in the context of the Uganda Red Cross Society forecast-based financing pilot project, with forecasts from the Global Flood Awareness System (GloFAS). Using this method, we define the "danger level" of flooding, and we select the probabilistic forecast triggers that are appropriate for specific actions. Results from this methodology can be applied globally across hazards and fed into a financing system that ensures that automatic, pre-funded early action will be triggered by forecasts.
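    The "willingness to act in vain" above is commonly formalised as a cost-loss ratio: acting is worthwhile whenever the forecast probability of exceeding the danger level exceeds the ratio of action cost to avoidable loss. The sketch below uses that simplification with hypothetical actions, costs and ensemble numbers, and omits refinements the paper discusses such as action lifetime.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical early actions, each with a cost C of acting and a loss L
# that the action avoids if the flood actually occurs.
actions = {
    "distribute purification tablets": {"cost": 1.0, "loss": 10.0},
    "evacuate livestock":              {"cost": 4.0, "loss": 10.0},
}

def trigger(prob_exceed_danger, cost, loss):
    # Cost-loss rule: act when expected avoided loss exceeds the cost,
    # i.e. when the exceedance probability is above C/L.
    return prob_exceed_danger > cost / loss

# Toy 51-member ensemble of peak discharge; the danger level would be
# set from the magnitude of previous flooding events.
members = rng.gamma(3.0, 400.0, size=51)
danger_level = 2000.0
p = float((members > danger_level).mean())

for name, a in actions.items():
    decision = "TRIGGER" if trigger(p, a["cost"], a["loss"]) else "wait"
    print(f"P(exceed danger) = {p:.2f}  ->  {name}: {decision}")
```

    Cheap actions with large avoidable losses therefore fire at low forecast probabilities, while expensive actions wait for more confident forecasts, which is exactly the action-specific trigger selection the methodology formalises.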

  1. A new method for determining the optimal lagged ensemble

    PubMed Central

    DelSole, T.; Tippett, M. K.; Pegion, K.

    2017-01-01

    We propose a general methodology for determining the lagged ensemble that minimizes the mean square error (MSE) of the forecast. The MSE of a lagged ensemble is shown to depend only on a quantity called the cross-lead error covariance matrix, which can be estimated from a short hindcast data set and parameterized in terms of analytic functions of time. The resulting parameterization allows the skill of forecasts to be evaluated for an arbitrary ensemble size and initialization frequency. Remarkably, the parameterization also can estimate the MSE of a burst ensemble simply by taking the limit of an infinitely small interval between initialization times. This methodology is applied to forecasts of the Madden-Julian Oscillation (MJO) from version 2 of the Climate Forecast System (CFSv2). For leads greater than a week, little improvement is found in MJO forecast skill when lagged ensembles longer than 5 days or initialization frequencies greater than 4 times per day are used. We find that if the initialization frequency is too infrequent, important structures of the lagged error covariance matrix are lost. Lastly, we demonstrate that the forecast error at leads ≥10 days can be reduced by optimally weighting the lagged ensemble members. The weights are shown to depend only on the cross-lead error covariance matrix. While the methodology developed here is applied to CFSv2, the technique can be easily adapted to other forecast systems. PMID:28580050
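    For unbiased members, minimising the MSE of a weighted lagged ensemble w'Cw subject to the weights summing to one gives w proportional to C⁻¹1, where C is the cross-lead error covariance matrix, consistent with the abstract's statement that the optimal weights depend only on C. The covariance matrix below is synthetic and illustrative, not estimated from CFSv2 hindcasts.

```python
import numpy as np

# Synthetic cross-lead error covariance matrix C for a 4-member lagged
# ensemble: older initializations have larger, mutually correlated errors.
leads = np.array([10, 11, 12, 13])            # effective lead of each member
sig = 1.0 + 0.15 * (leads - leads[0])         # error std grows with lead
rho = 0.7 ** np.abs(np.subtract.outer(leads, leads))
C = rho * np.outer(sig, sig)

# Minimise w'Cw subject to sum(w) = 1: w is proportional to C^{-1} 1.
ones = np.ones(len(leads))
w = np.linalg.solve(C, ones)
w /= w.sum()

mse_weighted = w @ C @ w
mse_equal = (ones / 4) @ C @ (ones / 4)       # plain lagged-ensemble mean
print(f"optimal weights: {w.round(3)}")
print(f"MSE equal = {mse_equal:.3f}, weighted = {mse_weighted:.3f}")
```

    By construction the weighted MSE can never exceed that of the equal-weight lagged mean, and the gap grows as the older members' errors become larger or more correlated.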

  2. Approaches to Forecasting Demands for Library Network Services. Report No. 10.

    ERIC Educational Resources Information Center

    Kang, Jong Hoa

    The problem of forecasting monthly demands for library network services is considered in terms of using forecasts as inputs to policy analysis models, and in terms of using forecasts to aid in the making of budgeting and staffing decisions. Box-Jenkins time-series methodology, adaptive filtering, and regression approaches are examined and compared…

  3. Using volcanic tremor for eruption forecasting at White Island volcano (Whakaari), New Zealand

    NASA Astrophysics Data System (ADS)

    Chardot, Lauriane; Jolly, Arthur D.; Kennedy, Ben M.; Fournier, Nicolas; Sherburn, Steven

    2015-09-01

    Eruption forecasting is a challenging task because of the inherent complexity of volcanic systems. Despite remarkable efforts to develop complex models in order to explain volcanic processes prior to eruptions, the material Failure Forecast Method (FFM) is one of the very few techniques that can provide a forecast time for an eruption. However, the method requires testing and automation before being used as a real-time eruption forecasting tool at a volcano. We developed an automatic algorithm to issue forecasts from volcanic tremor increase episodes recorded by Real-time Seismic Amplitude Measurement (RSAM) at one station and optimised this algorithm for the period August 2011-January 2014, which comprises the recent unrest period at White Island volcano (Whakaari), New Zealand. A detailed residual analysis was paramount to select the most appropriate model explaining the RSAM time evolutions. In a hindsight simulation, four out of the five small eruptions reported during this period occurred within a failure window forecast by our optimised algorithm, and the probability of an eruption on a day within a failure window was 0.21, which is 37 times higher than the probability of having an eruption on any day during the same period (0.0057). Moreover, the forecasts were issued prior to the eruptions by a few hours, which is important from an emergency management point of view. Whereas the RSAM time evolutions preceding these four eruptions have a similar goodness-of-fit with the FFM, their spectral characteristics are different. The duration-amplitude distributions of the precursory tremor episodes support the hypothesis that several processes were likely occurring prior to these eruptions. We propose that slow rock failure and fluid flow processes are plausible candidates for the tremor source of these episodes. This hindsight exercise can be useful for future real-time implementation of the FFM at White Island. A similar methodology could also be tested at other volcanoes even if only a limited network is available.
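    The FFM itself is not spelled out in the abstract; in its classic form (Voight's relation with exponent p = 2) the inverse of the precursor rate, here 1/RSAM, decays linearly in time, and the forecast failure time is where the fitted line crosses zero. The tremor series below is synthetic and the numbers are illustrative, not White Island data.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic accelerating tremor: for the common FFM exponent p = 2,
# RSAM grows like 1/(t_f - t), so inverse RSAM decays linearly to zero.
t = np.arange(0.0, 9.0, 0.25)                  # hours of observation
t_fail_true = 10.0
rsam = 50.0 / (t_fail_true - t) * (1.0 + rng.normal(0.0, 0.02, t.size))

# Inverse-rate forecast: fit a line to 1/RSAM and extrapolate to zero.
slope, intercept = np.polyfit(t, 1.0 / rsam, 1)
t_forecast = -intercept / slope

print(f"forecast failure time: {t_forecast:.2f} h (true: {t_fail_true} h)")
```

    An automated implementation like the one described would additionally have to detect tremor-increase episodes, check residuals to pick the best-fitting model, and convert the extrapolated crossing time into a failure window rather than a single instant.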

  4. The SPoRT-WRF: Evaluating the Impact of NASA Datasets on Convective Forecasts

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley; Kozlowski, Danielle; Case, Jonathan; Molthan, Andrew

    2012-01-01

    The Short-term Prediction Research and Transition (SPoRT) Center seeks to improve short-term, regional weather forecasts using unique NASA products and capabilities. SPoRT has developed a unique, real-time configuration of the NASA Unified Weather Research and Forecasting (WRF) model (ARW core) that integrates all SPoRT modeling research data: (1) the 2-km SPoRT Sea Surface Temperature (SST) composite, (2) 3-km LIS output with 1-km Greenness Vegetation Fractions (GVFs), and (3) 45-km AIRS retrieved profiles. This real-time forecast was transitioned to NOAA's Hazardous Weather Testbed (HWT) as a deterministic model in the Experimental Forecast Program (EFP). Feedback from forecasters and participants, together with internal evaluation of the SPoRT-WRF, shows a cool, dry bias that appears to suppress convection, likely related to the methodology for assimilating AIRS profiles. Version 2 of the SPoRT-WRF will premiere at the 2012 EFP and will include NASA physics, a cycling data assimilation methodology, better coverage of precipitation forcing, and new GVFs.

  5. Modeled Forecasts of Dengue Fever in San Juan, Puerto Rico Using NASA Satellite Enhanced Weather Forecasts

    NASA Astrophysics Data System (ADS)

    Morin, C.; Quattrochi, D. A.; Zavodsky, B.; Case, J.

    2015-12-01

    Dengue fever (DF) is an important mosquito-transmitted disease that is strongly influenced by meteorological and environmental conditions. Recent research has focused on forecasting DF case numbers based on meteorological data. However, these forecasting tools have generally relied on empirical models that require long DF time series to train. Additionally, their accuracy has been tested retrospectively, using past meteorological data. Consequently, the operational utility of the forecasts is still in question because the errors associated with weather and climate forecasts are not reflected in the results. Using up-to-date weekly dengue case numbers for model parameterization and weather forecast data as meteorological input, we produced weekly forecasts of DF cases in San Juan, Puerto Rico. Each week, the preceding weeks' case counts were used to re-parameterize a process-based DF model driven with updated weather forecast data to generate forecasts of DF case numbers. Real-time weather forecast data were produced using the Weather Research and Forecasting (WRF) numerical weather prediction (NWP) system enhanced with additional high-resolution NASA satellite data. This methodology was conducted in a weekly iterative process, with each DF forecast evaluated using county-level DF cases reported by the Puerto Rico Department of Health. The one-week DF forecasts were accurate, especially considering the two sources of model error. First, weather forecasts were sometimes inaccurate and generally produced lower-than-observed temperatures. Second, the DF model was often overly influenced by the previous week's DF case numbers, though this phenomenon could be lessened by increasing the number of simulations included in the forecast. Although these results are promising, we would like to develop a methodology to produce longer-range forecasts so that public health workers can better prepare for dengue epidemics.

  6. Bayesian Hierarchical Models to Augment the Mediterranean Forecast System

    DTIC Science & Technology

    2010-09-30

    In part 2 (Bonazzi et al., 2010), the impact of the ensemble forecast methodology based on MFS-Wind-BHM perturbations is documented. Forecast...absence of dt data stage inputs, the forecast impact of MFS-Error-BHM is neutral. Experiments are underway now to introduce dt back into the MFS-Error...BHM and quantify forecast impacts at MFS. MFS-SuperEnsemble-BHM We have assembled all needed datasets and completed algorithmic development

  7. Ensemble Nonlinear Autoregressive Exogenous Artificial Neural Networks for Short-Term Wind Speed and Power Forecasting.

    PubMed

    Men, Zhongxian; Yee, Eugene; Lien, Fue-Sang; Yang, Zhiling; Liu, Yongqian

    2014-01-01

    Short-term wind speed and wind power forecasts (for a 72 h period) are obtained using a nonlinear autoregressive exogenous artificial neural network (ANN) methodology which incorporates either numerical weather prediction or high-resolution computational fluid dynamics wind field information as an exogenous input. An ensemble approach is used to combine the predictions from many candidate ANNs in order to provide improved forecasts for wind speed and power, along with the associated uncertainties in these forecasts. More specifically, the ensemble ANN is used to quantify the uncertainties arising from the network weight initialization and from the unknown structure of the ANN. All members forming the ensemble of neural networks were trained using an efficient particle swarm optimization algorithm. The results of the proposed methodology are validated using wind speed and wind power data obtained from an operational wind farm located in Northern China. The assessment demonstrates that this methodology for wind speed and power forecasting generally provides an improvement in predictive skills when compared to the practice of using an "optimal" weight vector from a single ANN while providing additional information in the form of prediction uncertainty bounds.

  8. Ensemble Nonlinear Autoregressive Exogenous Artificial Neural Networks for Short-Term Wind Speed and Power Forecasting

    PubMed Central

    Lien, Fue-Sang; Yang, Zhiling; Liu, Yongqian

    2014-01-01

    Short-term wind speed and wind power forecasts (for a 72 h period) are obtained using a nonlinear autoregressive exogenous artificial neural network (ANN) methodology which incorporates either numerical weather prediction or high-resolution computational fluid dynamics wind field information as an exogenous input. An ensemble approach is used to combine the predictions from many candidate ANNs in order to provide improved forecasts for wind speed and power, along with the associated uncertainties in these forecasts. More specifically, the ensemble ANN is used to quantify the uncertainties arising from the network weight initialization and from the unknown structure of the ANN. All members forming the ensemble of neural networks were trained using an efficient particle swarm optimization algorithm. The results of the proposed methodology are validated using wind speed and wind power data obtained from an operational wind farm located in Northern China. The assessment demonstrates that this methodology for wind speed and power forecasting generally provides an improvement in predictive skills when compared to the practice of using an “optimal” weight vector from a single ANN while providing additional information in the form of prediction uncertainty bounds. PMID:27382627

  9. Past speculations of the future: a review of the methods used for forecasting emerging health technologies.

    PubMed

    Doos, Lucy; Packer, Claire; Ward, Derek; Simpson, Sue; Stevens, Andrew

    2016-03-10

    Forecasting can support rational decision-making around the introduction and use of emerging health technologies and prevent investment in technologies with limited long-term potential. However, forecasting methods need to be credible. We performed a systematic search of the literature to identify the methods used in forecasting studies to predict future health technologies within a 3-20-year timeframe; identification and retrospective assessment of such methods potentially offer a route to more reliable prediction. No human participants were involved. The authors searched MEDLINE, EMBASE, PsycINFO and grey literature sources, and included articles published in English that reported both their methods and a list of identified technologies. Eligible studies reported the methods used to predict future health technologies within a 3-20-year timeframe together with an identified list of individual healthcare technologies; commercially sponsored reviews, long-term futurology studies (with over 20-year timeframes) and speculative editorials were excluded. Fifteen studies met the inclusion criteria. The majority (13/15) consulted experts, either alone or in combination with other methods such as literature searching; only two studies used more complex forecasting tools such as scenario building. The methodological fundamentals of formal 3-20-year prediction are consistent but vary in detail. Further research is needed to ascertain whether the predictions made were accurate and whether accuracy varies by the methods used or by the types of technologies identified.

  10. Baseline and target values for regional and point PV power forecasts: Toward improved solar forecasting

    DOE PAGES

    Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan; ...

    2015-11-10

    Accurate solar photovoltaic (PV) power forecasting allows utilities to reliably utilize solar resources on their systems. However, to truly measure the improvements that any new solar forecasting methods provide, it is important to develop a methodology for determining baseline and target values for the accuracy of solar forecasting at different spatial and temporal scales. This paper aims at developing a framework to derive baseline and target values for a suite of generally applicable, value-based, and custom-designed solar forecasting metrics. The work was informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models in combination with a radiative transfer model. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of PV power output. The proposed reserve-based methodology is a reasonable and practical approach that can be used to assess the economic benefits gained from improvements in accuracy of solar forecasting. Lastly, the financial baseline and targets can be translated back to forecasting accuracy metrics and requirements, which will guide research on solar forecasting improvements toward the areas that are most beneficial to power systems operations.
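
    The persistence baseline mentioned here is simple enough to state in code (an illustrative sketch with invented data, not the paper's metric suite): a forecast skill score then measures any new method against that baseline.

```python
def persistence_forecast(series, horizon):
    """Baseline: repeat the last observed value over the forecast horizon."""
    return [series[-1]] * horizon

def rmse(forecast, observed):
    return (sum((f - o) ** 2 for f, o in zip(forecast, observed))
            / len(observed)) ** 0.5

def skill_score(rmse_model, rmse_baseline):
    """1 = perfect, 0 = no better than the baseline, negative = worse."""
    return 1.0 - rmse_model / rmse_baseline

history = [310.0, 305.0, 298.0]        # hypothetical PV output (kW)
baseline = persistence_forecast(history, 3)
```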

  11. Adaptation of Mesoscale Weather Models to Local Forecasting

    NASA Technical Reports Server (NTRS)

    Manobianco, John T.; Taylor, Gregory E.; Case, Jonathan L.; Dianic, Allan V.; Wheeler, Mark W.; Zack, John W.; Nutter, Paul A.

    2003-01-01

    Methodologies have been developed for (1) configuring mesoscale numerical weather-prediction models for execution on high-performance computer workstations to make short-range weather forecasts for the vicinity of the Kennedy Space Center (KSC) and the Cape Canaveral Air Force Station (CCAFS) and (2) evaluating the performances of the models as configured. These methodologies have been implemented as part of a continuing effort to improve weather forecasting in support of operations of the U.S. space program. The models, methodologies, and results of the evaluations also have potential value for commercial users who could benefit from tailoring their operations and/or marketing strategies based on accurate predictions of local weather. More specifically, the purpose of developing the methodologies for configuring the models to run on computers at KSC and CCAFS is to provide accurate forecasts of winds, temperature, and such specific thunderstorm-related phenomena as lightning and precipitation. The purpose of developing the evaluation methodologies is to maximize the utility of the models by providing users with assessments of the capabilities and limitations of the models. The models used in this effort thus far include the Mesoscale Atmospheric Simulation System (MASS), the Regional Atmospheric Modeling System (RAMS), and the National Centers for Environmental Prediction Eta Model (Eta for short). The configuration of the MASS and RAMS is designed to run the models at very high spatial resolution and incorporate local data to resolve fine-scale weather features. Model preprocessors were modified to incorporate surface, ship, buoy, and rawinsonde data as well as data from local wind towers, wind profilers, and conventional or Doppler radars. The overall evaluation of the MASS, Eta, and RAMS was designed to assess the utility of these mesoscale models for satisfying the weather-forecasting needs of the U.S. space program.
The evaluation methodology includes objective and subjective verification methodologies. Objective (e.g., statistical) verification of point forecasts is a stringent measure of model performance, but when used alone, it is not usually sufficient for quantifying the value of the overall contribution of the model to the weather-forecasting process. This is especially true for mesoscale models with enhanced spatial and temporal resolution that may be capable of predicting meteorologically consistent, though not necessarily accurate, fine-scale weather phenomena. Therefore, subjective (phenomenological) evaluation, focusing on selected case studies and specific weather features, such as sea breezes and precipitation, has been performed to help quantify the added value that cannot be inferred solely from objective evaluation.

  12. A Delphi forecast of technology in education

    NASA Technical Reports Server (NTRS)

    Robinson, B. E.

    1973-01-01

    The results are reported of a Delphi forecast of the utilization and social impacts of large-scale educational telecommunications technology. The focus is on both forecasting methodology and educational technology. The various methods of forecasting used by futurists are analyzed from the perspective of the most appropriate method for a prognosticator of educational technology, and review and critical analysis are presented of previous forecasts and studies. Graphic responses, summarized comments, and a scenario of education in 1990 are presented.

  13. Financial options methodology for analyzing investments in new technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wenning, B.D.

    1994-12-31

    The evaluation of investments in longer term research and development in emerging technologies, because of the nature of such subjects, must address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.

  14. Financial options methodology for analyzing investments in new technology

    NASA Technical Reports Server (NTRS)

    Wenning, B. D.

    1995-01-01

    The evaluation of investments in longer term research and development in emerging technologies, because of the nature of such subjects, must address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options evaluation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.
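
    The option-valuation idea in this abstract can be illustrated with the standard Black-Scholes call formula read as a real option (all numbers invented for illustration; the paper does not publish its inputs): the present value of the project's cash flows plays the role of the stock price and the investment cost the strike.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def real_option_value(S, K, r, sigma, T):
    """Black-Scholes call value read as a real option.
    S: PV of expected project cash flows, K: investment cost,
    r: risk-free rate, sigma: cash-flow volatility, T: years to decide."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

value = real_option_value(S=90.0, K=100.0, r=0.05, sigma=0.4, T=3.0)
```

The point the abstract makes falls out directly: conventional NPV here is 90 - 100 < 0, yet the option to invest later is worth a clearly positive amount, so a strict present-value rule would wrongly reject the opportunity.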

  15. Long- Range Forecasting Of The Onset Of Southwest Monsoon Winds And Waves Near The Horn Of Africa

    DTIC Science & Technology

    2017-12-01

    Summary of climate analysis and long-range forecast methodology: prior theses by Heidt (2006) and Lemke (2010) used methods similar to the authors'. Table-of-contents excerpts: II. Data and Methods; Analysis and Forecast Methods; Predictand Selection.

  16. Forecasting--A Systematic Modeling Methodology. Paper No. 489.

    ERIC Educational Resources Information Center

    Mabert, Vincent A.; Radcliffe, Robert C.

    In an attempt to bridge the gap between academic understanding and practical business use, the Box-Jenkins technique of time series analysis for forecasting future events is presented with a minimum of mathematical notation. The method is presented in three stages: a discussion of traditional forecasting techniques, focusing on traditional…

  17. A Short-Term Forecasting Procedure for Institution Enrollments.

    ERIC Educational Resources Information Center

    Pfitzner, Charles Barry

    1987-01-01

    Applies the Box-Jenkins time series methodology to enrollment data for the Virginia community college system. Describes the enrollment data set, the Box-Jenkins approach, and the forecasting results. Discusses the value of one-quarter ahead enrollment forecasts and implications for practice. Provides a technical discussion of the model. (DMM)
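
    The Box-Jenkins cycle (identify, estimate, forecast) can be shown in miniature with an AR(1) model on synthetic data (illustrative only; the paper fits richer models to real enrollment series):

```python
import random
import statistics

def fit_ar1(series):
    """Yule-Walker estimate of the AR(1) coefficient and the series mean."""
    mean = statistics.fmean(series)
    dev = [x - mean for x in series]
    phi = (sum(dev[t] * dev[t - 1] for t in range(1, len(dev)))
           / sum(d * d for d in dev))
    return mean, phi

def forecast_ar1(series, mean, phi, steps):
    """Iterated one-step forecasts; they revert toward the mean."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = mean + phi * (last - mean)
        out.append(last)
    return out

random.seed(1)
series = [0.0]
for _ in range(400):                         # synthetic AR(1) with phi = 0.8
    series.append(0.8 * series[-1] + random.gauss(0, 1.0))
mean, phi = fit_ar1(series)
next_quarter = forecast_ar1(series, mean, phi, steps=1)
```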

  18. Novel methodology for pharmaceutical expenditure forecast.

    PubMed

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    The way new drugs are valued across countries is currently being disrupted, which makes the historical data used for forecasting pharmaceutical expenditure poorly reliable. Moreover, forecasting methods have rarely addressed uncertainty. The objective of this project was to propose a methodology to perform pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the 'EU Pharmaceutical expenditure forecast'; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings of generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Clinical expected benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis. 
This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and discount price, distribution chains, and the time to reach peak sales for new drugs; 2) estimating the statistical distribution of the budget impact; and 3) testing different pricing and reimbursement policy decisions on health expenditures. This methodology was independent of historical data and appeared to be highly flexible and adapted to test robustness and provide probabilistic analysis to support policy decision making.
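
    The probabilistic sensitivity analysis in step 5 amounts to Monte Carlo propagation of uncertain inputs; a hypothetical sketch (distributions and figures invented, not the EU study's):

```python
import random
import statistics

random.seed(42)
originator_sales = 100.0          # illustrative annual sales before patent loss

def one_draw():
    """Sample uncertain inputs and return the implied annual savings."""
    discount = random.uniform(0.3, 0.7)       # generic price discount
    penetration = random.betavariate(8, 2)    # share switching to the generic
    return originator_sales * penetration * discount

savings = sorted(one_draw() for _ in range(10_000))
central = statistics.median(savings)
low, high = savings[249], savings[9749]       # ~95% interval
```

This yields the "statistical distribution of the budget impact" the abstract refers to, rather than a single point estimate.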

  19. Client-Friendly Forecasting: Seasonal Runoff Predictions Using Out-of-the-Box Indices

    NASA Astrophysics Data System (ADS)

    Weil, P.

    2013-12-01

    For more than a century, statistical relationships have been recognized between atmospheric conditions at locations separated by thousands of miles, referred to as teleconnections. Some of the recognized teleconnections provide useful information about expected hydrologic conditions, so certain records of atmospheric conditions are quantified and published as hydroclimate indices. Certain hydroclimate indices can serve as strong leading indicators of climate patterns over North America and can be used to make skillful forecasts of seasonal runoff. The methodology described here creates a simple-to-use model that utilizes easily accessed data to make forecasts of April through September runoff months before the runoff season begins. For this project, forecasting models were developed for two snowmelt-driven river systems in Colorado and Wyoming. In addition to the global hydroclimate indices, the methodology uses several local hydrologic variables including the previous year's drought severity, headwater snow water equivalent and the reservoir contents for the major reservoirs in each basin. To improve the skill of the forecasts, logistic regression is used to develop a model that provides the likelihood that a year will fall into the upper, middle or lower tercile of historical flows. Categorical forecasting has two major advantages over modeling of specific flow amounts: (1) with fewer possible outcomes, models tend to have better predictive skill, and (2) categorical models are very useful to clients and agencies with specific flow thresholds that dictate major changes in water resources management. The resulting methodology and functional forecasting model product is highly portable, applicable to many major river systems and easily explained to a non-technical audience.
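
    The tercile logistic model can be sketched as follows (coefficients are invented for illustration, and the mirrored-symmetry assumption is the sketch's, not the study's): a snowpack predictor is mapped to probabilities for the lower, middle and upper tercile of historical April-September flows.

```python
import math

def p_upper_tercile(swe_pct_of_normal, b0=-6.69, b1=0.06):
    """Logistic probability that runoff lands in the upper tercile."""
    z = b0 + b1 * swe_pct_of_normal
    return 1.0 / (1.0 + math.exp(-z))

def tercile_outlook(swe_pct_of_normal):
    p_hi = p_upper_tercile(swe_pct_of_normal)
    p_lo = p_upper_tercile(200.0 - swe_pct_of_normal)  # mirrored (assumed)
    p_mid = max(0.0, 1.0 - p_hi - p_lo)
    return {"lower": p_lo, "middle": p_mid, "upper": p_hi}

wet_year = tercile_outlook(140.0)      # 140% of normal snow water equivalent
normal_year = tercile_outlook(100.0)
```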

  20. Toward the assimilation of biogeochemical data in the CMEMS BIOMER coupled physical-biogeochemical operational system

    NASA Astrophysics Data System (ADS)

    Lamouroux, Julien; Testut, Charles-Emmanuel; Lellouche, Jean-Michel; Perruche, Coralie; Paul, Julien

    2017-04-01

    The operational production of data-assimilated biogeochemical state of the ocean is one of the challenging core projects of the Copernicus Marine Environment Monitoring Service. In that framework - and with the April 2018 CMEMS V4 release as a target - Mercator Ocean is in charge of improving the realism of its global ¼° BIOMER coupled physical-biogeochemical (NEMO/PISCES) simulations, analyses and re-analyses, and to develop an effective capacity to routinely estimate the biogeochemical state of the ocean, through the implementation of biogeochemical data assimilation. Primary objectives are to enhance the time representation of the seasonal cycle in the real time and reanalysis systems, and to provide a better control of the production in the equatorial regions. The assimilation of BGC data will rely on a simplified version of the SEEK filter, where the error statistics do not evolve with the model dynamics. The associated forecast error covariances are based on the statistics of a collection of 3D ocean state anomalies. The anomalies are computed from a multi-year numerical experiment (free run without assimilation) with respect to a running mean in order to estimate the 7-day scale error on the ocean state at a given period of the year. These forecast error covariances rely thus on a fixed-basis seasonally variable ensemble of anomalies. This methodology, which is currently implemented in the "blue" component of the CMEMS operational forecast system, is now under adaptation to be applied to the biogeochemical part of the operational system. Regarding observations - and as a first step - the system shall rely on the CMEMS GlobColour Global Ocean surface chlorophyll concentration products, delivered in NRT. The objective of this poster is to provide a detailed overview of the implementation of the aforementioned data assimilation methodology in the CMEMS BIOMER forecasting system. 
Focus shall be put on (1) the assessment of the capabilities of this data assimilation methodology to provide satisfying statistics of the model variability errors (through space-time analysis of dedicated representers of satellite surface Chla observations), (2) the dedicated features of the data assimilation configuration that have been implemented so far (e.g. log-transformation of the analysis state, multivariate Chlorophyll-Nutrient control vector, etc.) and (3) the assessment of the performances of this future operational data assimilation configuration.

  1. Applications of Bayesian Procrustes shape analysis to ensemble radar reflectivity nowcast verification

    NASA Astrophysics Data System (ADS)

    Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang

    2016-07-01

    This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
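
    The geometric core of the Procrustes approach, aligning one shape onto another by translation and rotation, has a closed form in 2-D (a minimal sketch on toy outlines, not the paper's Bayesian formulation):

```python
import math

def procrustes_align(shape, target):
    """Translate and rotate `shape` (list of (x, y)) onto `target`."""
    n = len(shape)
    cx = sum(p[0] for p in shape) / n
    cy = sum(p[1] for p in shape) / n
    tx = sum(p[0] for p in target) / n
    ty = sum(p[1] for p in target) / n
    a = [(x - cx, y - cy) for x, y in shape]    # centred shapes
    b = [(x - tx, y - ty) for x, y in target]
    num = sum(ax * by - ay * bx for (ax, ay), (bx, by) in zip(a, b))
    den = sum(ax * bx + ay * by for (ax, ay), (bx, by) in zip(a, b))
    th = math.atan2(num, den)                   # optimal rotation angle
    rot = [(ax * math.cos(th) - ay * math.sin(th),
            ax * math.sin(th) + ay * math.cos(th)) for ax, ay in a]
    return [(x + tx, y + ty) for x, y in rot]

target = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]
th0 = 0.5                            # rotate the target to make a test shape
shape = [(x * math.cos(th0) - y * math.sin(th0),
          x * math.sin(th0) + y * math.cos(th0)) for x, y in target]
aligned = procrustes_align(shape, target)
```

Averaging such aligned outlines, rather than raw pixels, is what lets the ensemble mean keep its morphological features instead of smearing them out.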

  2. Forecasting of UV-Vis absorbance time series using artificial neural networks combined with principal component analysis.

    PubMed

    Plazas-Nossa, Leonardo; Hofer, Thomas; Gruber, Günter; Torres, Andres

    2017-02-01

    This work proposes a methodology for the forecasting of online water quality data provided by UV-Vis spectrometry. Therefore, a combination of principal component analysis (PCA) to reduce the dimensionality of a data set and artificial neural networks (ANNs) for forecasting purposes was used. The results obtained were compared with those obtained by using discrete Fourier transform (DFT). The proposed methodology was applied to four absorbance time series data sets comprising a total of 5705 UV-Vis spectra. Absolute percentage errors obtained by applying the proposed PCA/ANN methodology vary between 10% and 13% for all four study sites. In general terms, the results obtained were hardly generalizable, as they appeared to be highly dependent on specific dynamics of the water system; however, some trends can be outlined. PCA/ANN methodology gives better results than PCA/DFT forecasting procedure by using a specific spectra range for the following conditions: (i) for Salitre wastewater treatment plant (WWTP) (first hour) and Graz West R05 (first 18 min), from the last part of UV range to all visible range; (ii) for Gibraltar pumping station (first 6 min) for all UV-Vis absorbance spectra; and (iii) for San Fernando WWTP (first 24 min) for all of UV range to middle part of visible range.
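
    The PCA-then-forecast pipeline can be sketched with a power-iteration PCA on toy data (illustrative stand-ins, not UV-Vis spectra; the paper forecasts the reduced scores with ANNs, whereas this sketch simply persists the last score):

```python
import random

def first_pc(data, iters=200):
    """Mean vector and first principal component via power iteration."""
    dim = len(data[0])
    mean = [sum(row[j] for row in data) / len(data) for j in range(dim)]
    centered = [[row[j] - mean[j] for j in range(dim)] for row in data]
    v = [1.0] * dim
    for _ in range(iters):                  # v <- X^T (X v), renormalized
        w = [sum(sum(r[j] * v[j] for j in range(dim)) * r[k]
                 for r in centered) for k in range(dim)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return mean, v

def forecast_next(data, mean, v):
    """Persist the latest PC score and reconstruct a full observation."""
    dim = len(mean)
    score = sum((data[-1][j] - mean[j]) * v[j] for j in range(dim))
    return [mean[j] + score * v[j] for j in range(dim)]

random.seed(3)
data = [[t / 10 + random.gauss(0, 0.1) for _ in range(3)] for t in range(60)]
mean_vec, pc1 = first_pc(data)
forecast = forecast_next(data, mean_vec, pc1)
```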

  3. Net-zero Building Cluster Simulations and On-line Energy Forecasting for Adaptive and Real-Time Control and Decisions

    NASA Astrophysics Data System (ADS)

    Li, Xiwang

    Buildings consume about 41.1% of primary energy and 74% of the electricity in the U.S. Moreover, it is estimated by the National Energy Technology Laboratory that more than 1/4 of the 713 GW of U.S. electricity demand in 2010 could be dispatchable if only buildings could respond to that dispatch through advanced building energy control and operation strategies and smart grid infrastructure. In this study, it is envisioned that neighboring buildings will have the tendency to form a cluster, an open cyber-physical system to exploit the economic opportunities provided by a smart grid, distributed power generation, and storage devices. Through optimized demand management, these building clusters will then reduce overall primary energy consumption and peak time electricity consumption, and be more resilient to power disruptions. Therefore, this project seeks to develop a Net-zero building cluster simulation testbed and high fidelity energy forecasting models for adaptive and real-time control and decision making strategy development that can be used in a Net-zero building cluster. The following research activities are summarized in this thesis: 1) Development of a building cluster emulator for building cluster control and operation strategy assessment. 2) Development of a novel building energy forecasting methodology using active system identification and data fusion techniques; this methodology includes a systematic approach for building energy system characteristic evaluation, system excitation and model adaptation, and is compared with other literature-reported building energy forecasting methods. 3) Development of the high fidelity on-line building cluster energy forecasting models, which include energy forecasting models for buildings, PV panels, batteries and ice tank thermal storage systems. 4) A small-scale real building validation study to verify the performance of the developed building energy forecasting methodology. 
The outcomes of this thesis can be used for building cluster energy forecasting model development and model based control and operation optimization. The thesis concludes with a summary of the key outcomes of this research, as well as a list of recommendations for future work.

  4. Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times

    NASA Astrophysics Data System (ADS)

    Engeland, Kolbjorn; Steinsland, Ingelin

    2014-05-01

    This study introduces a methodology for the construction of probabilistic inflow forecasts for multiple catchments and lead times, and investigates criteria for evaluating multivariate forecasts. A post-processing approach is used, and a Gaussian model is applied for transformed variables. The post-processing model has two main components, the mean model and the dependency model. The mean model is used to estimate the marginal distributions for forecasted inflow for each catchment and lead time, whereas the dependency model was used to estimate the full multivariate distribution of forecasts, i.e. co-variances between catchments and lead times. In operational situations, it is a straightforward task to use the models to sample inflow ensembles which inherit the dependencies between catchments and lead times. The methodology was tested and demonstrated in the river systems linked to the Ulla-Førre hydropower complex in southern Norway, where simultaneous probabilistic forecasts for five catchments and ten lead times were constructed. The methodology exhibits sufficient flexibility to utilize deterministic flow forecasts from a numerical hydrological model as well as statistical forecasts such as persistent forecasts and sliding window climatology forecasts. It also deals with variation in the relative weights of these forecasts with both catchment and lead time. When evaluating predictive performance in original space using cross validation, the case study found that it is important to include the persistent forecast for the initial lead times and the hydrological forecast for medium-term lead times. Sliding window climatology forecasts become more important for the latest lead times. Furthermore, operationally important features in this case study such as heteroscedasticity, lead-time-varying between-lead-time dependency and lead-time-varying between-catchment dependency are captured. 
Two criteria were used for evaluating the added value of the dependency model. The first was the energy score (ES), a multi-dimensional generalization of the continuous ranked probability score (CRPS). ES was calculated for all lead times and catchments together, for each catchment across all lead times, and for each lead time across all catchments. The second criterion was to use CRPS for forecasted inflows accumulated over several lead times and catchments. The results showed that ES was not very sensitive to a correct covariance structure, whereas CRPS for accumulated flows was more suitable for evaluating the dependency model. This indicates that it is more appropriate to evaluate relevant univariate variables that depend on the dependency structure than to evaluate the multivariate forecast directly.
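
    The evaluation rests on the empirical CRPS for an ensemble, which is short enough to state exactly (toy numbers, not the Ulla-Førre data); the energy score is its multivariate generalization with Euclidean norms in place of absolute values.

```python
def crps_ensemble(members, obs):
    """Empirical CRPS: E|X - y| - 0.5 * E|X - X'| over ensemble members."""
    n = len(members)
    term1 = sum(abs(x - obs) for x in members) / n
    term2 = sum(abs(a - b) for a in members for b in members) / (2 * n * n)
    return term1 - term2
```

Scoring inflows accumulated over lead times or catchments with this univariate CRPS is what exposes errors in the dependency model, since the distribution of a sum depends on the covariances between its terms.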

  5. Educational Forecasting Methodologies: State of the Art, Trends, and Highlights.

    ERIC Educational Resources Information Center

    Hudson, Barclay; Bruno, James

    This overview of both quantitative and qualitative methods of educational forecasting is introduced by a discussion of a general typology of forecasting methods. In each of the following sections, discussion follows the same general format: a number of basic approaches are identified (e.g. extrapolation, correlation, systems modelling), and each…

  6. Semi-nonparametric VaR forecasts for hedge funds during the recent crisis

    NASA Astrophysics Data System (ADS)

    Del Brio, Esther B.; Mora-Valencia, Andrés; Perote, Javier

    2014-05-01

    The need to provide accurate value-at-risk (VaR) forecasting measures has triggered an important literature in econophysics. Although these accurate VaR models and methodologies are particularly demanded for hedge fund managers, there exist few articles specifically devoted to implement new techniques in hedge fund returns VaR forecasting. This article advances in these issues by comparing the performance of risk measures based on parametric distributions (the normal, Student’s t and skewed-t), semi-nonparametric (SNP) methodologies based on Gram-Charlier (GC) series and the extreme value theory (EVT) approach. Our results show that normal-, Student’s t- and skewed t-based methodologies fail to forecast hedge fund VaR, whilst SNP and EVT approaches succeed in doing so accurately. We extend these results to the multivariate framework by providing an explicit formula for the GC copula and its density that encompasses the Gaussian copula and accounts for non-linear dependences. We show that the VaR obtained by the meta GC accurately captures portfolio risk and outperforms regulatory VaR estimates obtained through the meta Gaussian and Student’s t distributions.
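
    A close relative of the Gram-Charlier approach, the Cornish-Fisher expansion, shows in a few lines how skewness and excess kurtosis pull the VaR quantile away from the Gaussian one (an illustrative sketch on simulated returns, not the authors' estimator):

```python
import random
import statistics

def cornish_fisher_var(returns, z=-2.326):
    """99% VaR with a Cornish-Fisher skewness/kurtosis correction."""
    n = len(returns)
    mu = statistics.fmean(returns)
    sd = statistics.pstdev(returns)
    skew = sum(((x - mu) / sd) ** 3 for x in returns) / n
    exk = sum(((x - mu) / sd) ** 4 for x in returns) / n - 3.0
    z_cf = (z + (z ** 2 - 1) * skew / 6
              + (z ** 3 - 3 * z) * exk / 24
              - (2 * z ** 3 - 5 * z) * skew ** 2 / 36)
    return -(mu + z_cf * sd)          # loss reported as a positive number

random.seed(7)
returns = [random.gauss(0, 0.01) for _ in range(5000)]
var99 = cornish_fisher_var(returns)
```

For the near-Gaussian sample above the correction is small and the result stays close to 2.326 standard deviations; for fat-tailed hedge fund returns, the higher-moment terms push VaR materially beyond the Gaussian estimate.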

  7. Evaluating the Contribution of NASA Remotely-Sensed Data Sets on a Convection-Allowing Forecast Model

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley T.; Case, Jonathan L.; Molthan, Andrew L.

    2012-01-01

    The Short-term Prediction Research and Transition (SPoRT) Center is a collaborative partnership between NASA and operational forecasting partners, including a number of National Weather Service forecast offices. SPoRT provides real-time NASA products and capabilities to help its partners address specific operational forecast challenges. One challenge that forecasters face is using guidance from local and regional deterministic numerical models configured at convection-allowing resolution to help assess a variety of mesoscale/convective-scale phenomena such as sea-breezes, local wind circulations, and mesoscale convective weather potential on a given day. While guidance from convection-allowing models has proven valuable in many circumstances, the potential exists for model improvements by incorporating more representative land-water surface datasets, and by assimilating retrieved temperature and moisture profiles from hyper-spectral sounders. In order to help increase the accuracy of deterministic convection-allowing models, SPoRT produces real-time, 4-km CONUS forecasts using a configuration of the Weather Research and Forecasting (WRF) model (hereafter SPoRT-WRF) that includes unique NASA products and capabilities including 4-km resolution soil initialization data from the Land Information System (LIS), 2-km resolution SPoRT SST composites over oceans and large water bodies, high-resolution real-time Green Vegetation Fraction (GVF) composites derived from the Moderate-resolution Imaging Spectroradiometer (MODIS) instrument, and retrieved temperature and moisture profiles from the Atmospheric Infrared Sounder (AIRS) and Infrared Atmospheric Sounding Interferometer (IASI). NCAR's Model Evaluation Tools (MET) verification package is used to generate statistics of model performance compared to in situ observations and rainfall analyses for three months during the summer of 2012 (June-August). 
Detailed analyses of specific severe weather outbreaks during the summer will be presented to assess the potential added-value of the SPoRT datasets and data assimilation methodology compared to a WRF configuration without the unique datasets and data assimilation.

  8. Short term load forecasting of anomalous load using hybrid soft computing methods

    NASA Astrophysics Data System (ADS)

    Rasyid, S. A.; Abdullah, A. G.; Mulyadi, Y.

    2016-04-01

    Load forecast accuracy directly affects how economically generation can be scheduled. Electricity consumption on holidays tends not to follow the load pattern of a normal day; such days are defined as anomalous loads. In this paper, a hybrid ANN-particle swarm method is proposed to improve the accuracy of forecasting the anomalous loads that often occur on holidays. The proposed methodology was used to forecast half-hourly electricity demand for power systems in the Indonesian National Electricity Market in the West Java region. Experiments were conducted with various learning rates and learning data inputs, and performance was validated against real data from the national electricity company. The results show that the proposed approach is very effective for short-term forecasting of anomalous loads, and the hybrid ANN-particle swarm model is relatively simple and easy for engineers to use as an analysis tool.
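
    The particle-swarm half of the hybrid can be sketched on a toy problem (illustrative, not the authors' code): the swarm tunes the two weights of a linear predictor, standing in for ANN weights, by minimizing squared forecast error.

```python
import random

random.seed(0)
xs = [i / 10 for i in range(50)]
ys = [3 + 2 * x for x in xs]          # synthetic "load" curve y = 3 + 2x

def loss(w):
    return sum((w[0] + w[1] * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

n_particles, iters = 20, 200
pos = [[random.uniform(-10, 10), random.uniform(-10, 10)]
       for _ in range(n_particles)]
vel = [[0.0, 0.0] for _ in range(n_particles)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=loss)[:]

for _ in range(iters):
    for i in range(n_particles):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (0.7 * vel[i][d]                         # inertia
                         + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                         + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
            pos[i][d] += vel[i][d]
        if loss(pos[i]) < loss(pbest[i]):
            pbest[i] = pos[i][:]
            if loss(pbest[i]) < loss(gbest):
                gbest = pbest[i][:]
```

In the hybrid method the same swarm update would move full ANN weight vectors instead of two coefficients.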

  9. Online learning algorithm for time series forecasting suitable for low cost wireless sensor networks nodes.

    PubMed

    Pardo, Juan; Zamora-Martínez, Francisco; Botella-Rocamora, Paloma

    2015-04-21

    Time series forecasting is an important predictive methodology that can be applied to a wide range of problems. In particular, forecasting the indoor temperature permits improved utilization of a home's HVAC (Heating, Ventilating and Air Conditioning) systems and thus better energy efficiency. With that purpose, this paper describes how to implement an Artificial Neural Network (ANN) algorithm on a low-cost system-on-chip to develop an autonomous intelligent wireless sensor network. A Wireless Sensor Network (WSN) is used to monitor and forecast the indoor temperature in a smart home, based on low-resource, low-cost microcontroller technology such as the 8051 MCU. An online learning approach, based on the Back-Propagation (BP) algorithm for ANNs, has been developed for real-time time series learning. It trains the model with each new data point that arrives, without saving enormous quantities of data to build a historical database, i.e., without prior knowledge. To validate the approach, a simulation study against a Bayesian baseline model was carried out on data from a real application, in order to assess performance and accuracy. The core of the paper is a new algorithm, based on BP, which is described in detail; the challenge was to implement a computationally demanding algorithm in a simple architecture with very few hardware resources.
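
    The core idea, updating the network with each newly arrived sample and then discarding it so no historical database is kept, can be sketched as below. This is a minimal stand-in, not the authors' algorithm: the two-neuron network, learning rate, and simulated temperature stream are all assumptions.

```python
import math
import random

random.seed(1)

# 1 input -> 2 tanh hidden units -> 1 linear output; weights in plain lists
w1 = [random.uniform(-0.5, 0.5) for _ in range(2)]
b1 = [0.0, 0.0]
w2 = [random.uniform(-0.5, 0.5) for _ in range(2)]
b2 = 0.0
LR = 0.05

def online_step(x, y):
    """One back-propagation update using only the newest (x, y) sample."""
    global b2
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(2)]
    e = sum(w2[j] * h[j] for j in range(2)) + b2 - y   # prediction error
    for j in range(2):
        gh = e * w2[j] * (1.0 - h[j] ** 2)   # gradient reaching hidden unit j
        w2[j] -= LR * e * h[j]
        w1[j] -= LR * gh * x
        b1[j] -= LR * gh
    b2 -= LR * e
    return abs(e)

# Simulated indoor-temperature stream; each sample is used once, then dropped
temps = [20.0 + 2.0 * math.sin(2 * math.pi * t / 48) for t in range(500)]
errors = [online_step((temps[t] - 20) / 2, (temps[t + 1] - 20) / 2)
          for t in range(len(temps) - 1)]

early = sum(errors[:50]) / 50    # average error while the model is still naive
late = sum(errors[-50:]) / 50    # average error after on-line adaptation
```

    Comparing `early` with `late` shows the model adapting as the stream flows past, the property that makes the scheme fit on an 8051-class microcontroller.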

  10. Online Learning Algorithm for Time Series Forecasting Suitable for Low Cost Wireless Sensor Networks Nodes

    PubMed Central

    Pardo, Juan; Zamora-Martínez, Francisco; Botella-Rocamora, Paloma

    2015-01-01

    Time series forecasting is an important predictive methodology which can be applied to a wide range of problems. Particularly, forecasting the indoor temperature permits an improved utilization of the HVAC (Heating, Ventilating and Air Conditioning) systems in a home and thus a better energy efficiency. With such purpose the paper describes how to implement an Artificial Neural Network (ANN) algorithm in a low cost system-on-chip to develop an autonomous intelligent wireless sensor network. The present paper uses a Wireless Sensor Networks (WSN) to monitor and forecast the indoor temperature in a smart home, based on low resources and cost microcontroller technology as the 8051MCU. An on-line learning approach, based on Back-Propagation (BP) algorithm for ANNs, has been developed for real-time time series learning. It performs the model training with every new data that arrive to the system, without saving enormous quantities of data to create a historical database as usual, i.e., without previous knowledge. Consequently to validate the approach a simulation study through a Bayesian baseline model have been tested in order to compare with a database of a real application aiming to see the performance and accuracy. The core of the paper is a new algorithm, based on the BP one, which has been described in detail, and the challenge was how to implement a computational demanding algorithm in a simple architecture with very few hardware resources. PMID:25905698

  11. Forecasting Natural Gas Prices Using Wavelets, Time Series, and Artificial Neural Networks

    PubMed Central

    2015-01-01

    Following the unconventional gas revolution, the forecasting of natural gas prices has become increasingly important because the association of these prices with those of crude oil has weakened. With this as motivation, we propose some modified hybrid models in which various combinations of the wavelet approximation, detail components, autoregressive integrated moving average, generalized autoregressive conditional heteroskedasticity, and artificial neural network models are employed to predict natural gas prices. We also emphasize the boundary problem in wavelet decomposition, and compare results that consider the boundary problem case with those that do not. The empirical results show that our suggested approach can handle the boundary problem, such that it facilitates the extraction of the appropriate forecasting results. The performance of the wavelet-hybrid approach was superior in all cases, whereas the application of detail components in the forecasting was only able to yield a small improvement in forecasting performance. Therefore, forecasting with only an approximation component would be acceptable, in consideration of forecasting efficiency. PMID:26539722
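
    The approximation and detail components this abstract builds on come from a discrete wavelet transform. A minimal sketch of one decomposition level, using the Haar wavelet for brevity rather than whatever basis the authors chose, looks like this; a series of dyadic length sidesteps the boundary problem the paper discusses.

```python
# One decomposition level with the (orthonormal) Haar wavelet
def haar_dwt(x):
    """Split an even-length series into approximation and detail halves."""
    s = 2 ** 0.5
    a = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    return a, d

def haar_idwt(a, d):
    """Reconstruct the original series exactly from (a, d)."""
    s = 2 ** 0.5
    x = []
    for ai, di in zip(a, d):
        x.extend([(ai + di) / s, (ai - di) / s])
    return x

# Toy "price" series of dyadic length, so no boundary extension is needed
prices = [3.0, 3.2, 3.1, 3.5, 3.4, 3.8, 3.6, 4.0]
approx, detail = haar_dwt(prices)
recon = haar_idwt(approx, detail)
```

    In the hybrid models, an ARIMA/GARCH or ANN forecaster would then be fitted to the approximation (and optionally detail) components and the forecasts recombined through the inverse transform.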

  12. Forecasting Natural Gas Prices Using Wavelets, Time Series, and Artificial Neural Networks.

    PubMed

    Jin, Junghwan; Kim, Jinsoo

    2015-01-01

    Following the unconventional gas revolution, the forecasting of natural gas prices has become increasingly important because the association of these prices with those of crude oil has weakened. With this as motivation, we propose some modified hybrid models in which various combinations of the wavelet approximation, detail components, autoregressive integrated moving average, generalized autoregressive conditional heteroskedasticity, and artificial neural network models are employed to predict natural gas prices. We also emphasize the boundary problem in wavelet decomposition, and compare results that consider the boundary problem case with those that do not. The empirical results show that our suggested approach can handle the boundary problem, such that it facilitates the extraction of the appropriate forecasting results. The performance of the wavelet-hybrid approach was superior in all cases, whereas the application of detail components in the forecasting was only able to yield a small improvement in forecasting performance. Therefore, forecasting with only an approximation component would be acceptable, in consideration of forecasting efficiency.

  13. Assessment and forecasting of lightning potential and its effect on launch operations at Cape Canaveral Air Force Station and John F. Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Weems, J.; Wyse, N.; Madura, J.; Secrist, M.; Pinder, C.

    1991-01-01

    Lightning plays a pivotal role in the operation decision process for space and ballistic launches at Cape Canaveral Air Force Station and Kennedy Space Center. Lightning forecasts are the responsibility of Detachment 11, 4th Weather Wing's Cape Canaveral Forecast Facility. These forecasts are important to daily ground processing as well as launch countdown decisions. The methodology and equipment used to forecast lightning are discussed. Impact on a recent mission is summarized.

  14. A temporal-spatial postprocessing model for probabilistic run-off forecast. With a case study from Ulla-Førre with five catchments and ten lead times

    NASA Astrophysics Data System (ADS)

    Engeland, K.; Steinsland, I.

    2012-04-01

    This work is driven by the needs of next-generation short-term optimization methodology for hydropower production. Stochastic optimization is about to be introduced, i.e., optimizing when both the available resource (water) and its utility (prices) are uncertain. In this paper we focus on the available resource, water, where uncertainty mainly comes from uncertainty in future runoff. When optimizing a water system, all catchments and several lead times have to be considered simultaneously. Depending on the system of hydropower reservoirs, this might be a set of headwater catchments, a system of upstream/downstream reservoirs where water used in one catchment/dam arrives in a lower catchment perhaps days later, or a combination of both. The aim of this paper is therefore to construct a simultaneous probabilistic forecast for several catchments and lead times, i.e., to provide a predictive distribution for the forecasts. Stochastic optimization methods need samples/ensembles of runoff forecasts as input, so it should also be possible to sample from our probabilistic forecast. A post-processing approach is taken, using an error model based on a Box-Cox (power) transformation and a temporal-spatial copula model. It accounts for both between-catchment and between-lead-time dependencies. In operational use it is straightforward to sample runoff ensembles from this model, and they inherit the catchment and lead-time dependencies. The methodology is tested and demonstrated on the Ulla-Førre river system, where simultaneous probabilistic forecasts for five catchments and ten lead times are constructed. The methodology is flexible enough to model operationally important features of this case study, such as heteroscedasticity, lead-time-varying temporal dependency, and lead-time-varying inter-catchment dependency. The model is evaluated using CRPS for the marginal predictive distributions and the energy score for the joint predictive distribution. It is tested against a deterministic runoff forecast, a climatology forecast, and a persistence forecast, and is found to be the better probabilistic forecast for lead times greater than two. From an operational point of view the results are interesting because the between-catchment dependency becomes stronger at longer lead times.
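
    The first step of the post-processing chain, transforming skewed runoff values toward Gaussianity before fitting the temporal-spatial copula, can be illustrated with the Box-Cox transform alone. The runoff values and the choice lambda = 0.3 below are hypothetical.

```python
import math

def boxcox(y, lam):
    """Box-Cox transform of a positive value; lam = 0 is the log transform."""
    return math.log(y) if lam == 0 else (y ** lam - 1.0) / lam

def inv_boxcox(z, lam):
    """Back-transform to the original (runoff) space."""
    return math.exp(z) if lam == 0 else (lam * z + 1.0) ** (1.0 / lam)

# Hypothetical runoff values and a hypothetical lambda
runoff = [12.0, 30.5, 7.2, 55.0]
lam = 0.3
z = [boxcox(q, lam) for q in runoff]     # transformed values for error modeling
back = [inv_boxcox(v, lam) for v in z]   # round-trips to the original values
```

    In the paper the copula is fitted to errors in this transformed space; sampled ensemble members are then mapped back through the inverse transform, which guarantees positive runoff.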

  15. Improving Global Forecast System of extreme precipitation events with regional statistical model: Application of quantile-based probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Shastri, Hiteshri; Ghosh, Subimal; Karmakar, Subhankar

    2017-02-01

    Forecasting of extreme precipitation events at a regional scale is of high importance due to their severe impacts on society. The impacts are stronger in urban regions due to high flood potential as well as high population density, leading to high vulnerability. Although significant scientific improvements have taken place in global weather forecasting models, they are still not adequate at a regional scale (e.g., for an urban region), with high false-alarm and low detection rates. There is a need to improve weather forecast skill at a local scale with a probabilistic outcome. Here we develop a methodology based on quantile regression, in which reliably simulated variables from the Global Forecast System are used as predictors and different quantiles of rainfall are generated for each set of predictors. We apply this method to a flood-prone coastal city of India, Mumbai, which has experienced severe floods in recent years. We find significant improvements in the forecast, with high detection and skill scores. We apply the methodology to 10 ensemble members of the Global Ensemble Forecast System and find a reduction in the ensemble uncertainty of precipitation across realizations with respect to the original precipitation forecasts. We validate our model for the monsoon seasons of 2006 and 2007, which are independent of the training/calibration data set used in the study. We find promising results and emphasize the value of implementing such data-driven methods for better probabilistic forecasts at an urban scale, primarily for early flood warning.
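
    A quantile-regression forecast of the kind described can be sketched with a linear model fitted by subgradient descent on the pinball loss. Everything concrete here, the synthetic predictor/rainfall pairs, tau = 0.9, and the learning rate, is an assumption for illustration; the study uses GFS-simulated variables as predictors.

```python
import random

random.seed(2)

def pinball_step(w, b, x, y, tau, lr=0.01):
    """One subgradient step on the pinball (quantile) loss for y ~ w*x + b."""
    g = tau if y >= w * x + b else tau - 1.0   # negative loss gradient w.r.t. prediction
    return w + lr * g * x, b + lr * g

# Synthetic predictor/rainfall pairs (hypothetical; noise uniform on [0, 4])
data = [(x, 2.0 * x + random.uniform(0.0, 4.0))
        for x in (random.uniform(0.0, 5.0) for _ in range(2000))]

w90, b90 = 0.0, 0.0
for _ in range(30):                  # a few passes over the "training period"
    for x, y in data:
        w90, b90 = pinball_step(w90, b90, x, y, tau=0.9)

# The fitted line should sit near the 0.9 quantile of rainfall given x
cover = sum(y <= w90 * x + b90 for x, y in data) / len(data)
```

    Repeating the fit for several values of tau yields the family of rainfall quantiles the abstract describes, i.e., a probabilistic forecast conditioned on the large-scale predictors.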

  16. Deciding the Future: A Forecast of Responsibilities of Secondary Teachers of English, 1970-2000 AD.

    ERIC Educational Resources Information Center

    Farrell, Edmund J.

    This document is a slightly revised version of the author's Ph.D. dissertation, "A Forecast of Responsibilities of Secondary Teachers of English 1970-2000 A.D., with Implications for Teacher Education" (ED 049 253). A study in two parts: Part I presents the need for future planning in education and briefly discusses methodologies for forecasting the…

  17. STATUS AND PROGRESS IN PARTICULATE MATTER FORECASTING: INITIAL APPLICATION OF THE ETA- CMAQ FORECAST MODEL

    EPA Science Inventory

    This presentation reviews the status and progress in forecasting particulate matter distributions. The shortcomings in representation of particulate matter formation in current atmospheric chemistry/transport models are presented based on analyses and detailed comparisons with me...

  18. Resolution of Probabilistic Weather Forecasts with Application in Disease Management.

    PubMed

    Hughes, G; McRoberts, N; Burnett, F J

    2017-02-01

    Predictive systems in disease management often incorporate weather data among the disease risk factors, and sometimes this comes in the form of forecast weather data rather than observed weather data. In such cases, it is useful to have an evaluation of the operational weather forecast, in addition to the evaluation of the disease forecasts provided by the predictive system. Typically, weather forecasts and disease forecasts are evaluated using different methodologies. However, the information theoretic quantity expected mutual information provides a basis for evaluating both kinds of forecast. Expected mutual information is an appropriate metric for the average performance of a predictive system over a set of forecasts. Both relative entropy (a divergence, measuring information gain) and specific information (an entropy difference, measuring change in uncertainty) provide a basis for the assessment of individual forecasts.
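
    The evaluation metric named in this abstract, expected mutual information between forecast and outcome, is straightforward to compute from a joint probability table. The two 2x2 tables below are hypothetical warning/outbreak frequencies, not data from the paper.

```python
import math

def mutual_information(joint):
    """Expected mutual information (bits) between a binary forecast and a
    binary event, computed from joint probabilities joint[f][e]."""
    pf = [sum(row) for row in joint]                             # P(forecast)
    pe = [sum(joint[f][e] for f in range(2)) for e in range(2)]  # P(event)
    mi = 0.0
    for f in range(2):
        for e in range(2):
            if joint[f][e] > 0:
                mi += joint[f][e] * math.log2(joint[f][e] / (pf[f] * pe[e]))
    return mi

# Hypothetical joint frequencies (rows: warning yes/no; cols: outbreak yes/no)
informative = [[0.15, 0.05],
               [0.05, 0.75]]
independent = [[0.04, 0.16],
               [0.16, 0.64]]   # forecast carries no information about the event
```

    An independent forecast yields zero bits gained, while the informative warning table above yields roughly 0.29 bits, the kind of average-performance number the authors use to compare weather and disease forecasts on a common footing.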

  19. Methodology for the Assessment of the Macroeconomic Impacts of Stricter CAFE Standards - Addendum

    EIA Publications

    2002-01-01

    This assessment of the economic impacts of Corporate Average Fuel Economy (CAFE) standards marks the first time the Energy Information Administration has used the new direct linkage of the DRI-WEFA Macroeconomic Model to the National Energy Modeling System (NEMS) in a policy setting. This methodology assures an internally consistent solution between the energy market concepts forecast by NEMS and the aggregate economy as forecast by the DRI-WEFA Macroeconomic Model of the U.S. Economy.

  20. Neural network based short-term load forecasting using weather compensation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, T.W.S.; Leung, C.T.

    This paper presents a novel technique for electric load forecasting based on neural weather compensation. The proposed method is a nonlinear generalization of Box and Jenkins approach for nonstationary time-series prediction. A weather compensation neural network is implemented for one-day ahead electric load forecasting. The weather compensation neural network can accurately predict the change of actual electric load consumption from the previous day. The results, based on Hong Kong Island historical load demand, indicate that this methodology is capable of providing a more accurate load forecast with a 0.9% reduction in forecast error.

  1. 2018 one‐year seismic hazard forecast for the central and eastern United States from induced and natural earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Rukstales, Kenneth S.; McNamara, Daniel E.; Williams, Robert A.; Shumway, Allison; Powers, Peter; Earle, Paul; Llenos, Andrea L.; Michael, Andrew J.; Rubinstein, Justin L.; Norbeck, Jack; Cochran, Elizabeth S.

    2018-01-01

    This article describes the U.S. Geological Survey (USGS) 2018 one-year probabilistic seismic hazard forecast for the central and eastern United States from induced and natural earthquakes. For consistency, the updated 2018 forecast is developed using the same probabilistic seismicity-based methodology as applied in the two previous forecasts. Rates of earthquakes of M≥3.0 across the United States grew rapidly between 2008 and 2015 but have steadily declined over the past 3 years, especially in areas of Oklahoma and southern Kansas where fluid injection has decreased. The seismicity pattern in 2017 was complex, with earthquakes more spatially dispersed than in the previous years. Some areas of west-central Oklahoma experienced increased activity rates where industrial activity increased. Earthquake rates in Oklahoma (429 earthquakes of M≥3 and 4 of M≥4), the Raton basin (Colorado/New Mexico border, six earthquakes of M≥3), and the New Madrid seismic zone (11 earthquakes of M≥3) continue to be higher than historical levels. Almost all of these earthquakes occurred within the highest-hazard regions of the 2017 forecast. Even though rates declined over the past 3 years, the short-term hazard for damaging ground shaking across much of Oklahoma remains at high levels due to continuing high rates of smaller earthquakes that are still hundreds of times higher than at any time in the state's history. Fine details and variability between the 2016-2018 forecasts are obscured by significant uncertainties in the input model. These short-term hazard levels are similar to those of active regions in California. During 2017, M≥3 earthquakes also occurred in or near Ohio, West Virginia, Missouri, Kentucky, Tennessee, Arkansas, Illinois, Oklahoma, Kansas, Colorado, New Mexico, Utah, and Wyoming.

  2. Quantifying automobile refinishing VOC air emissions - a methodology with estimates and forecasts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, S.P.; Rubick, C.

    1996-12-31

    Automobile refinishing coatings (referred to as paints), together with the paint thinners, reducers, hardeners, catalysts, and cleanup solvents used during their application, contain volatile organic compounds (VOCs), which are precursors to ground-level ozone formation. Some of these painting compounds create hazardous air pollutants (HAPs), which are toxic. This paper documents the methodology, data sets, and results of surveys (conducted in the fall of 1995) used to develop revised per capita emissions factors for estimating and forecasting the VOC air emissions from the area source category of automobile refinishing. Emissions estimates, forecasts, trends, and reasons for these trends are presented. Future emissions inventory (EI) challenges are addressed in light of data availability and information networks.

  3. Potential Vorticity Analysis of Low Level Thunderstorm Dynamics in an Idealized Supercell Simulation

    DTIC Science & Technology

    2009-03-01

    Severe Weather, Supercell, Weather Research and Forecasting Model, Advanced WRF ... Advanced Research WRF Model ... Data, Model Setup, and Methodology ... 03/11/2006 GFS model run. Top row: 11/12Z initialization. Middle row: 12 hour forecast valid at 12/00Z. Bottom row: 24 hour forecast valid at

  4. Post-processing Seasonal Precipitation Forecasts via Integrating Climate Indices and the Analog Approach

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Zhang, Y.; Wood, A.; Lee, H. S.; Wu, L.; Schaake, J. C.

    2016-12-01

    Seasonal precipitation forecasts are a primary driver for seasonal streamflow prediction that is critical for a range of water resources applications, such as reservoir operations and drought management. However, it is well known that seasonal precipitation forecasts from climate models are often biased and also too coarse in spatial resolution for hydrologic applications. Therefore, post-processing procedures such as downscaling and bias correction are often needed. In this presentation, we discuss results from a recent study that applies a two-step methodology to downscale and correct the ensemble mean precipitation forecasts from the Climate Forecast System (CFS). First, CFS forecasts are downscaled and bias corrected using monthly reforecast analogs: we identify past precipitation forecasts that are similar to the current forecast, and then use the finer-scale observational analysis fields from the corresponding dates to represent the post-processed ensemble forecasts. Second, we construct the posterior distribution of forecast precipitation from the post-processed ensemble by integrating climate indices: a correlation analysis is performed to identify dominant climate indices for the study region, which are then used to weight the analysis analogs selected in the first step using a Bayesian approach. The methodology is applied to the California Nevada River Forecast Center (CNRFC) and the Middle Atlantic River Forecast Center (MARFC) regions for 1982-2015, using the North American Land Data Assimilation System (NLDAS-2) precipitation as the analysis. The results from cross-validation show that, with the analog approach alone, the post-processed CFS precipitation forecasts are considerably more skillful than the raw CFS forecasts. Integrating climate indices can further improve the skill if the number of ensemble members considered is large enough; however, the improvement is generally limited to the first couple of months when compared against climatology.
Impacts of various factors such as ensemble size, lead time, and choice of climate indices will also be discussed.
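
    The first step of the two-step methodology, selecting past forecasts that resemble the current one and taking the corresponding analysis fields as the post-processed ensemble, reduces to a nearest-neighbour search. The 4-gridpoint fields below are made-up numbers; the study matches monthly CFS reforecasts against NLDAS-2 analyses.

```python
def find_analogs(current_fc, past_fcs, past_obs, k=3):
    """Rank past forecast fields by RMSE distance to the current forecast and
    return the k corresponding observed (analysis) fields as the ensemble."""
    def rmse(a, b):
        return (sum((u - v) ** 2 for u, v in zip(a, b)) / len(a)) ** 0.5
    ranked = sorted(range(len(past_fcs)),
                    key=lambda i: rmse(past_fcs[i], current_fc))
    return [past_obs[i] for i in ranked[:k]]

# Toy 4-gridpoint precipitation fields (made-up values)
past_fcs = [[1, 2, 1, 0], [5, 6, 4, 3], [0, 1, 0, 0], [5, 5, 5, 4]]
past_obs = [[2, 2, 1, 1], [6, 7, 5, 3], [0, 0, 0, 0], [6, 6, 5, 5]]
ensemble = find_analogs([5, 6, 5, 3], past_fcs, past_obs, k=2)
```

    The second step of the study would then weight these selected analogs by climate-index similarity in a Bayesian update, which the sketch omits.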

  5. Flare forecasting at the Met Office Space Weather Operations Centre

    NASA Astrophysics Data System (ADS)

    Murray, S. A.; Bingham, S.; Sharpe, M.; Jackson, D. R.

    2017-04-01

    The Met Office Space Weather Operations Centre produces 24/7/365 space weather guidance, alerts, and forecasts to a wide range of government and commercial end-users across the United Kingdom. Solar flare forecasts are one of its products, which are issued multiple times a day in two forms: forecasts for each active region on the solar disk over the next 24 h and full-disk forecasts for the next 4 days. Here the forecasting process is described in detail, as well as first verification of archived forecasts using methods commonly used in operational weather prediction. Real-time verification available for operational flare forecasting use is also described. The influence of human forecasters is highlighted, with human-edited forecasts outperforming original model results and forecasting skill decreasing over longer forecast lead times.

  6. How can we deal with ANN in flood forecasting? As a simulation model or updating kernel!

    NASA Astrophysics Data System (ADS)

    Hassan Saddagh, Mohammad; Javad Abedini, Mohammad

    2010-05-01

    Flood forecasting and early warning, as a non-structural measure for flood control, is often considered the most effective and suitable way to mitigate the damage and human loss caused by floods. Forecast results, which are the output of hydrologic, hydraulic, and/or black-box models, should be accurate in both flood magnitude and timing, especially at long lead times. The application of artificial neural networks (ANNs) to flood forecasting has received extensive attention in recent years due to their capability to capture the dynamics inherent in complex processes, including floods. However, results obtained from running a plain ANN as a simulation model show a dramatic reduction in performance indices as lead time increases. This paper monitors those performance indices for flood forecasting and early warning using two different methodologies. The first method employs a multilayer neural network, trained with the back-propagation scheme, to forecast the output hydrograph of a hypothetical river for forecast lead times up to 6.0 hr; the second uses the 1D hydrodynamic MIKE11 model as the forecasting model and a multilayer neural network as an updating kernel, and assesses the performance indices relative to the ANN alone as lead time increases. Results, presented in both graphical and tabular format, indicate the superiority of MIKE11 coupled with an ANN updating kernel over the ANN simulation model alone. While the plain ANN produces more accurate results at short lead times, its errors grow rapidly at longer lead times; the second methodology provides more accurate and reliable results for longer forecast lead times.

  7. FHWA travel analysis framework : development of VMT forecasting models for use by the Federal Highway Administration

    DOT National Transportation Integrated Search

    2014-05-12

    This document details the process that the Volpe National Transportation Systems Center (Volpe) used to develop travel forecasting models for the Federal Highway Administration (FHWA). The purpose of these models is to allow FHWA to forecast future c...

  8. Transition from Research to Operations: Assessing Value of Experimental Forecast Products within the NWSFO Environment

    NASA Technical Reports Server (NTRS)

    Lapenta, William M.; Wohlman, Richard; Bradshaw, Tom; Burks, Jason; Jedlovec, Gary; Goodman, Steve; Darden, Chris; Meyer, Paul

    2003-01-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center seeks to accelerate the infusion of NASA Earth Science Enterprise (ESE) observations, data assimilation and modeling research into NWS forecast operations and decision-making. To meet long-term program expectations, it is not sufficient simply to give forecasters sophisticated workstations or new forecast products without fully assessing the ways in which they will be utilized. Close communication must be established between the research and operational communities so that developers have a complete understanding of user needs. In turn, forecasters must obtain a more comprehensive knowledge of the modeling and sensing tools available to them. A major goal of the SPoRT Program is to develop metrics and conduct assessment studies with NWS forecasters to evaluate the impacts and benefits of ESE experimental products on forecast skill. At a glance the task seems relatively straightforward. However, performing assessments of experimental products in an operational environment is demanding. Given the tremendous time constraints placed on NWS forecasters, it is imperative that forecaster input be obtained in a concise, unobtrusive manner. Great care must also be taken to ensure that forecasters understand their participation will eventually benefit them and WFO operations in general. Two requirements of the assessment plan developed under the SPoRT activity are that it 1) can be implemented within the WFO environment and 2) provides tangible results for BOTH the research and operational communities. Supplemental numerical quantitative precipitation forecasts (QPF) were chosen as the first experimental SPoRT product to be evaluated during a Pilot Assessment Program conducted 1 May 2003 within the Huntsville AL National Weather Service Forecast Office. Forecast time periods were broken up into six-hour bins ranging from zero to twenty-four hours. Data were made available for display in AWIPS on an operational basis so they could be efficiently incorporated into the forecast process. The methodology used to assess the value of experimental QPFs compared to available operational products is best described as a three-tier approach involving both forecasters and research scientists. Tier-one is a web-based survey completed by duty forecasters on the aviation and public desks. The survey compiles information on how the experimental product was used in the forecast decision-making process. Up to 6 responses per twenty-four hours can be compiled during a precipitation event. Tier-two consists of an event post-mortem and experimental product assessment performed daily by the NASA/NWS Liaison. Tier-three is a detailed breakdown/analysis of specific events targeted by either the NWS SOO or SPoRT team members. The task is performed by both NWS and NASA research scientists and may be conducted once every couple of months. The findings from the Pilot Assessment Program will be reported at the meeting.

  9. Influenza forecasting in human populations: a scoping review.

    PubMed

    Chretien, Jean-Paul; George, Dylan; Shaman, Jeffrey; Chitale, Rohit A; McKenzie, F Ellis

    2014-01-01

    Forecasts of influenza activity in human populations could help guide key preparedness tasks. We conducted a scoping review to characterize the methodological approaches used and identify research gaps. Adapting the PRISMA methodology for systematic reviews, we searched PubMed, CINAHL, Project Euclid, and Cochrane Database of Systematic Reviews for publications in English since January 1, 2000 using the terms "influenza AND (forecast* OR predict*)", excluding studies that did not validate forecasts against independent data or incorporate influenza-related surveillance data from the season or pandemic for which the forecasts were applied. We included 35 publications describing population-based (N = 27), medical facility-based (N = 4), and regional or global pandemic spread (N = 4) forecasts. They included areas of North America (N = 15), Europe (N = 14), and/or Asia-Pacific region (N = 4), or had global scope (N = 3). Forecasting models were statistical (N = 18) or epidemiological (N = 17). Five studies used data assimilation methods to update forecasts with new surveillance data. Models used virological (N = 14), syndromic (N = 13), meteorological (N = 6), internet search query (N = 4), and/or other surveillance data as inputs. Forecasting outcomes and validation metrics varied widely. Two studies compared distinct modeling approaches using common data, 2 assessed model calibration, and 1 systematically incorporated expert input. Of the 17 studies using epidemiological models, 8 included sensitivity analysis. This review suggests the need for good practices in influenza forecasting (e.g., sensitivity analysis); direct comparisons of diverse approaches; assessment of model calibration; integration of subjective expert input; operational research in pilot, real-world applications; and improved mutual understanding among modelers and public health officials.

  10. Influenza Forecasting in Human Populations: A Scoping Review

    PubMed Central

    Chretien, Jean-Paul; George, Dylan; Shaman, Jeffrey; Chitale, Rohit A.; McKenzie, F. Ellis

    2014-01-01

    Forecasts of influenza activity in human populations could help guide key preparedness tasks. We conducted a scoping review to characterize the methodological approaches used and identify research gaps. Adapting the PRISMA methodology for systematic reviews, we searched PubMed, CINAHL, Project Euclid, and Cochrane Database of Systematic Reviews for publications in English since January 1, 2000 using the terms “influenza AND (forecast* OR predict*)”, excluding studies that did not validate forecasts against independent data or incorporate influenza-related surveillance data from the season or pandemic for which the forecasts were applied. We included 35 publications describing population-based (N = 27), medical facility-based (N = 4), and regional or global pandemic spread (N = 4) forecasts. They included areas of North America (N = 15), Europe (N = 14), and/or Asia-Pacific region (N = 4), or had global scope (N = 3). Forecasting models were statistical (N = 18) or epidemiological (N = 17). Five studies used data assimilation methods to update forecasts with new surveillance data. Models used virological (N = 14), syndromic (N = 13), meteorological (N = 6), internet search query (N = 4), and/or other surveillance data as inputs. Forecasting outcomes and validation metrics varied widely. Two studies compared distinct modeling approaches using common data, 2 assessed model calibration, and 1 systematically incorporated expert input. Of the 17 studies using epidemiological models, 8 included sensitivity analysis. This review suggests the need for good practices in influenza forecasting (e.g., sensitivity analysis); direct comparisons of diverse approaches; assessment of model calibration; integration of subjective expert input; operational research in pilot, real-world applications; and improved mutual understanding among modelers and public health officials. PMID:24714027

  11. Improving GEFS Weather Forecasts for Indian Monsoon with Statistical Downscaling

    NASA Astrophysics Data System (ADS)

    Agrawal, Ankita; Salvi, Kaustubh; Ghosh, Subimal

    2014-05-01

    Weather forecasting has always been a challenging research problem, yet one of paramount importance, as it serves as a key input in planning for the immediate future. Short-range rainfall forecasts influence a wide range of entities, from the agricultural industry to the general public. Accurate forecasts help minimize possible damage by enabling pre-decided plans of action, and hence it is necessary to gauge the quality of forecasts, which may vary with the complexity of the weather state and regional parameters. Indian Summer Monsoon Rainfall (ISMR) is a perfect arena in which to assess forecast quality, not only because of the intricacy of its spatial and temporal patterns, but also because of the damage that poor forecasts can cause to the Indian economy through the agricultural industry. The present study is undertaken with two objectives: assessing the ability of the Global Ensemble Forecast System (GEFS) to predict ISMR over central India, and assessing the skill of a statistical downscaling technique in adding value to the predictions by bringing them closer to the observed target dataset. GEFS is a global numerical weather prediction system providing forecasts of different climate variables at fine resolution (0.5 degree and 1 degree). GEFS shows good skill in predicting many climate variables but performs poorly for Indian summer monsoon rainfall, as is evident from the very low to negative correlations between predicted and observed rainfall. Towards the second objective, a statistical relationship is established between the reasonably well-predicted GEFS climate variables and observed rainfall. The GEFS predictors are treated with multicollinearity and dimensionality-reduction techniques, namely principal component analysis (PCA) and the least absolute shrinkage and selection operator (LASSO).
A statistical relationship is established between the principal components and observed rainfall over the training period, and predictions are obtained for the testing period. Validation shows a marked improvement in the correlation coefficient between observed and predicted data (from 0.25 to 0.55). The results speak in favour of the statistical downscaling methodology, which shows the capability to reduce the gap between observations and predictions. A detailed study applying different downscaling techniques is required to quantify the improvements in predictions.
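
    The downscaling chain described above (standardize predictors, extract principal components, regress observed rainfall on them over a training period, then predict over a testing period) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation; the factor structure of the toy predictors and all variable names are assumptions.

```python
import numpy as np

def pca_regression_downscale(X_train, y_train, X_test, n_pc=3):
    """Regress the target on the leading principal components of the
    standardized predictors; return predictions for the test period."""
    mu, sd = X_train.mean(axis=0), X_train.std(axis=0)
    Z = (X_train - mu) / sd                       # standardize predictors
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    W = Vt[:n_pc].T                               # PC loading vectors
    P = Z @ W                                     # PC scores (training)
    A = np.column_stack([np.ones(len(P)), P])     # regression design matrix
    beta, *_ = np.linalg.lstsq(A, y_train, rcond=None)
    Pt = ((X_test - mu) / sd) @ W                 # PC scores (testing)
    return np.column_stack([np.ones(len(Pt)), Pt]) @ beta

rng = np.random.default_rng(0)
f = rng.normal(size=200)                          # shared large-scale signal
X = 0.3 * rng.normal(size=(200, 10))              # 10 toy predictors
X[:, :5] += f[:, None]                            # half of them carry the signal
y = f + 0.1 * rng.normal(size=200)                # "observed rainfall" proxy
yhat = pca_regression_downscale(X[:150], y[:150], X[150:])
r = np.corrcoef(yhat, y[150:])[0, 1]              # skill on the test period
```

    A LASSO step could replace the plain least-squares regression here to select among the components, as in the study.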

  12. Trends in methodological differences

    Treesearch

    Daniel J. Stynes; Malcolm I. Bevins; Tommy L. Brown

    1980-01-01

    Inconsistency in data collection has confounded attempts to identify and forecast outdoor recreation trends. Problems are highlighted through an evaluation of the methods employed in national outdoor recreation participation surveys and projections. Recommendations are advanced for improving data collection, trend measurement, and forecasting within outdoor recreation...

  13. Ecological Forecasting: Microbial Contamination and Atmospheric Loadings of Nutrients to Land and Water

    EPA Science Inventory

    The development of ecological forecasts, namely, methodologies to predict the chemical, biological, and physical changes in terrestrial and aquatic ecosystems is desirable so that effective strategies for reducing the adverse impacts of human activities and extreme natural events...

  14. Potential Technologies for Assessing Risk Associated with a Mesoscale Forecast

    DTIC Science & Technology

    2015-10-01

    American GFS models, and informally applied on the Weather Research and Forecasting ( WRF ) model. The current CI equation is as follows...Reen B, Penc R. Investigating surface bias errors in the Weather Research and Forecasting ( WRF ) model using a Geographic Information System (GIS). J...Forecast model ( WRF -ARW) with extensions that might include finer terrain resolutions and more detailed representations of the underlying atmospheric

  15. Ability of matrix models to explain the past and predict the future of plant populations.

    USGS Publications Warehouse

    McEachern, Kathryn; Crone, Elizabeth E.; Ellis, Martha M.; Morris, William F.; Stanley, Amanda; Bell, Timothy; Bierzychudek, Paulette; Ehrlen, Johan; Kaye, Thomas N.; Knight, Tiffany M.; Lesica, Peter; Oostermeijer, Gerard; Quintana-Ascencio, Pedro F.; Ticktin, Tamara; Valverde, Teresa; Williams, Jennifer I.; Doak, Daniel F.; Ganesan, Rengaian; Thorpe, Andrea S.; Menges, Eric S.

    2013-01-01

    Uncertainty associated with ecological forecasts has long been recognized, but forecast accuracy is rarely quantified. We evaluated how well data on 82 populations of 20 species of plants spanning 3 continents explained and predicted plant population dynamics. We parameterized stage-based matrix models with demographic data from individually marked plants and determined how well these models forecast population sizes observed at least 5 years into the future. Simple demographic models forecasted population dynamics poorly; only 40% of observed population sizes fell within our forecasts' 95% confidence limits. However, these models explained population dynamics during the years in which data were collected; observed changes in population size during the data-collection period were strongly positively correlated with population growth rate. Thus, these models are at least a sound way to quantify population status. Poor forecasts were not associated with the number of individual plants or years of data. We tested whether vital rates were density dependent and found both positive and negative density dependence. However, density dependence was not associated with forecast error. Forecast error was significantly associated with environmental differences between the data collection and forecast periods. To forecast population fates, more detailed models, such as those that project how environments are likely to change and how these changes will affect population dynamics, may be needed. Such detailed models are not always feasible. Thus, it may be wiser to make risk-averse decisions than to expect precise forecasts from models.
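
    The stage-based matrix projections evaluated here work by repeatedly multiplying a stage-abundance vector by a projection matrix; the asymptotic growth rate is the dominant eigenvalue of that matrix. A minimal sketch with a hypothetical 3-stage matrix (all entries invented for illustration):

```python
import numpy as np

# Hypothetical 3-stage projection matrix (seedling, juvenile, adult).
# Top row: fecundities; sub-diagonal: survival/growth; diagonal: stasis.
A = np.array([[0.0, 0.5, 2.0],
              [0.3, 0.4, 0.0],
              [0.0, 0.3, 0.85]])

n0 = np.array([50.0, 20.0, 10.0])   # initial counts per stage

def project(A, n0, t):
    """Project total population size t years forward."""
    n, totals = n0.copy(), [n0.sum()]
    for _ in range(t):
        n = A @ n
        totals.append(n.sum())
    return np.array(totals)

totals = project(A, n0, 20)
lam = np.abs(np.linalg.eigvals(A)).max()   # dominant eigenvalue (growth rate)
ratio = totals[-1] / totals[-2]            # year-on-year change converges to lam
```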

  16. Ability of matrix models to explain the past and predict the future of plant populations.

    PubMed

    Crone, Elizabeth E; Ellis, Martha M; Morris, William F; Stanley, Amanda; Bell, Timothy; Bierzychudek, Paulette; Ehrlén, Johan; Kaye, Thomas N; Knight, Tiffany M; Lesica, Peter; Oostermeijer, Gerard; Quintana-Ascencio, Pedro F; Ticktin, Tamara; Valverde, Teresa; Williams, Jennifer L; Doak, Daniel F; Ganesan, Rengaian; McEachern, Kathryn; Thorpe, Andrea S; Menges, Eric S

    2013-10-01

    Uncertainty associated with ecological forecasts has long been recognized, but forecast accuracy is rarely quantified. We evaluated how well data on 82 populations of 20 species of plants spanning 3 continents explained and predicted plant population dynamics. We parameterized stage-based matrix models with demographic data from individually marked plants and determined how well these models forecast population sizes observed at least 5 years into the future. Simple demographic models forecasted population dynamics poorly; only 40% of observed population sizes fell within our forecasts' 95% confidence limits. However, these models explained population dynamics during the years in which data were collected; observed changes in population size during the data-collection period were strongly positively correlated with population growth rate. Thus, these models are at least a sound way to quantify population status. Poor forecasts were not associated with the number of individual plants or years of data. We tested whether vital rates were density dependent and found both positive and negative density dependence. However, density dependence was not associated with forecast error. Forecast error was significantly associated with environmental differences between the data collection and forecast periods. To forecast population fates, more detailed models, such as those that project how environments are likely to change and how these changes will affect population dynamics, may be needed. Such detailed models are not always feasible. Thus, it may be wiser to make risk-averse decisions than to expect precise forecasts from models. © 2013 Society for Conservation Biology.

  17. Flood forecasting using non-stationarity in a river with tidal influence - a feasibility study

    NASA Astrophysics Data System (ADS)

    Killick, Rebecca; Kretzschmar, Ann; Ilic, Suzi; Tych, Wlodek

    2017-04-01

    Flooding is the most common natural hazard causing damage, disruption and loss of life worldwide. Despite improvements in the modelling and forecasting of water levels and flood inundation (Kretzschmar et al., 2014; Hoitink and Jay, 2016), there are still large discrepancies between predictions and observations, particularly during storm events when accurate predictions are most important. Many models exist for forecasting river levels (Smith et al., 2013; Leedal et al., 2013); however, they commonly assume that the errors in the data are independent, stationary and normally distributed. This is generally not the case, especially during storm events, suggesting that existing models are not describing the drivers of river level in an appropriate fashion. Further challenges exist in the lower sections of a river, which are influenced by both river and tidal flows and their interaction, and there is scope for improvement in prediction. This paper investigates the use of a powerful statistical technique to adaptively forecast river levels by modelling the process as locally stationary. The proposed methodology takes information on both upstream and downstream river levels and incorporates meteorological information (rainfall forecasts) and tidal levels when required to forecast river levels at a specified location. Using this approach, a single model will be capable of predicting water levels in both tidal and non-tidal river reaches. In this pilot project, the methodology of Smith et al. (2013), using harmonic tidal analysis and data-based mechanistic modelling, is compared with the methodology developed by Killick et al. (2016), which utilises data-driven wavelet decomposition to account for the information contained in the upstream and downstream river data to forecast a non-stationary time series. Preliminary modelling has been carried out using the tidal stretch of the River Lune in North-west England and initial results are presented here.
Future work includes expanding the methodology to forecast river levels at a network of locations simultaneously. References: Hoitink, A. J. F., and Jay, D. A. (2016), Tidal river dynamics: implications for deltas, Rev. Geophys., 54, 240-272. Killick, R., Knight, M., Nason, G. P., and Eckley, I. A. (2016), The local partial autocorrelation function and its application to the forecasting of locally stationary time series, submitted. Kretzschmar, A., Tych, W., and Chappell, N. A. (2014), Reversing hydrology: estimation of sub-hourly rainfall time-series from streamflow, Environ. Modell. Softw., 60, 290-301. Leedal, D., Weerts, A. H., Smith, P. J., and Beven, K. J. (2013), Application of data-based mechanistic modelling for flood forecasting at multiple locations in the Eden catchment in the National Flood Forecasting System (England and Wales), Hydrol. Earth Syst. Sci., 17(1), 177-185. Smith, P., Beven, K., Horsburgh, K., Hardaker, P., and Collier, C. (2013), Data-based mechanistic modelling of tidally affected river reaches for flood warning purposes: an example on the River Dee, UK, Q. J. R. Meteorol. Soc., 139(671), 340-349.

  18. A hybrid approach EMD-HW for short-term forecasting of daily stock market time series data

    NASA Astrophysics Data System (ADS)

    Awajan, Ahmad Mohd; Ismail, Mohd Tahir

    2017-08-01

    Recently, time series forecasting has attracted considerable attention in the analysis of financial time series data, specifically stock market indices. Stock market forecasting remains a challenging area of financial time-series forecasting. In this study, a hybrid methodology combining Empirical Mode Decomposition with the Holt-Winters method (EMD-HW) is used to improve forecasting performance for financial time series. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation. Moreover, EMD-HW has relatively high accuracy and offers a new forecasting method for time series. Daily stock market time series data from 11 countries are used to demonstrate the forecasting performance of the proposed EMD-HW. Based on three forecast accuracy measures, the results indicate that EMD-HW outperforms the traditional Holt-Winters forecasting method.
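
    The decompose-forecast-recombine idea behind EMD-HW can be sketched as follows. Implementing EMD itself is beyond a short example, so a crude smooth/residual split stands in for the decomposition, and Holt's linear method (no seasonal term) stands in for Holt-Winters; each component is forecast separately and the forecasts are summed.

```python
import numpy as np

def holt(y, alpha=0.5, beta=0.3, h=5):
    """Holt's linear exponential smoothing; returns an h-step forecast."""
    level, trend = y[0], y[1] - y[0]
    for t in range(1, len(y)):
        prev = level
        level = alpha * y[t] + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
    return level + trend * np.arange(1, h + 1)

def decompose(y, span=10):
    """Crude moving-average smooth/residual split (stand-in for EMD)."""
    pad = np.r_[np.full(span - 1, y[0]), y]
    smooth = np.convolve(pad, np.ones(span) / span, mode="valid")
    return smooth, y - smooth

t = np.arange(120.0)
y = 0.5 * t + 5 * np.sin(2 * np.pi * t / 12)   # toy "stock index"
smooth, resid = decompose(y)
forecast = holt(smooth) + holt(resid)          # forecast each part, then sum
```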

  19. Metrics for Evaluating the Accuracy of Solar Power Forecasting: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, J.; Hodge, B. M.; Florita, A.

    2013-10-01

    Forecasting solar energy generation is a challenging task due to the variety of solar power systems and weather regimes encountered. Forecast inaccuracies can result in substantial economic losses and power system reliability issues. This paper presents a suite of generally applicable and value-based metrics for solar forecasting for a comprehensive set of scenarios (i.e., different time horizons, geographic locations, applications, etc.). In addition, a comprehensive framework is developed to analyze the sensitivity of the proposed metrics to three types of solar forecasting improvements using a design-of-experiments methodology, in conjunction with response surface and sensitivity analysis methods. The results show that the developed metrics can efficiently evaluate the quality of solar forecasts and assess the economic and reliability impacts of improved solar forecasting.
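
    Metrics of the kind such a suite proposes can be illustrated with a few common choices (the specific metrics and the persistence baseline below are generic examples, not necessarily the paper's full set):

```python
import numpy as np

def solar_forecast_metrics(obs, fc):
    """Basic accuracy metrics for a power forecast vs. observations."""
    err = fc - obs
    return {
        "MBE": err.mean(),                   # mean bias error
        "MAE": np.abs(err).mean(),           # mean absolute error
        "RMSE": np.sqrt((err ** 2).mean()),  # root mean square error
    }

def skill_vs_persistence(obs, fc):
    """Skill score: 1 - RMSE(forecast) / RMSE(persistence baseline)."""
    rmse_fc = np.sqrt(((fc[1:] - obs[1:]) ** 2).mean())
    rmse_per = np.sqrt(((obs[:-1] - obs[1:]) ** 2).mean())
    return 1.0 - rmse_fc / rmse_per

obs = np.array([1.0, 2.0, 3.0, 4.0])   # observed power (arbitrary units)
fc = np.array([1.1, 1.9, 3.2, 4.3])    # forecast power
m = solar_forecast_metrics(obs, fc)
```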

  20. Community-based early warning systems for flood risk mitigation in Nepal

    NASA Astrophysics Data System (ADS)

    Smith, Paul J.; Brown, Sarah; Dugar, Sumit

    2017-03-01

    This paper focuses on the use of community-based early warning systems for flood resilience in Nepal. The first part of the work outlines the evolution and current status of these community-based systems, highlighting the limited lead times currently available for early warning. The second part of the paper focuses on the development of a robust operational flood forecasting methodology for use by the Nepal Department of Hydrology and Meteorology (DHM) to enhance early warning lead times. The methodology uses data-based physically interpretable time series models and data assimilation to generate probabilistic forecasts, which are presented in a simple visual tool. The approach is designed to work in situations of limited data availability with an emphasis on sustainability and appropriate technology. The successful application of the forecast methodology to the flood-prone Karnali River basin in western Nepal is outlined, increasing lead times from 2-3 to 7-8 h. The challenges faced in communicating probabilistic forecasts to the last mile of the existing community-based early warning systems across Nepal are discussed. The paper concludes with an assessment of the applicability of this approach in basins and countries beyond Karnali and Nepal and an overview of key lessons learnt from this initiative.

  1. Population forecasts for Bangladesh, using a Bayesian methodology.

    PubMed

    Mahsin, Md; Hossain, Syed Shahadat

    2012-12-01

    Population projection for many developing countries can be quite a challenging task for demographers, mostly owing to the lack of sufficient reliable data. The objective of this paper is to present an overview of the existing methods for population forecasting and to propose an alternative based on Bayesian statistics, which combines formal statistical inference with expert judgement. The analysis has been made using the Markov Chain Monte Carlo (MCMC) technique for Bayesian methodology available with the software WinBUGS. Convergence diagnostic techniques available with the WinBUGS software have been applied to ensure the convergence of the chains necessary for the implementation of MCMC. The Bayesian approach allows for the use of observed data and expert judgements by means of appropriate priors, and more realistic population forecasts, along with associated uncertainty, have been possible.
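
    The core MCMC machinery that WinBUGS automates can be illustrated with a hand-written random-walk Metropolis sampler on a deliberately simple conjugate model (Poisson counts with a Gamma prior), so the sampled posterior mean can be checked against the analytic answer. The model and all numbers are illustrative, not the paper's population model.

```python
import numpy as np

def log_post(lam, y, a=2.0, b=1.0):
    """Log posterior (up to a constant): Gamma(a, b) prior x Poisson likelihood."""
    if lam <= 0:
        return -np.inf
    return (a - 1 + y.sum()) * np.log(lam) - (b + len(y)) * lam

def metropolis(y, n_iter=20000, step=0.3, seed=1):
    """Random-walk Metropolis; returns draws after discarding burn-in."""
    rng = np.random.default_rng(seed)
    lam, lp = 1.0, log_post(1.0, y)
    draws = np.empty(n_iter)
    for i in range(n_iter):
        prop = lam + step * rng.normal()
        lp_prop = log_post(prop, y)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
            lam, lp = prop, lp_prop
        draws[i] = lam
    return draws[n_iter // 4:]

rng = np.random.default_rng(0)
y = rng.poisson(3.0, size=20)                      # simulated count data
post_mean = metropolis(y).mean()
analytic = (2.0 + y.sum()) / (1.0 + len(y))        # conjugate posterior mean
```

    Convergence diagnostics (trace plots, multiple chains) would be applied to the draws in the same spirit as the WinBUGS diagnostics mentioned above.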

  2. Bayesian Population Forecasting: Extending the Lee-Carter Method.

    PubMed

    Wiśniowski, Arkadiusz; Smith, Peter W F; Bijak, Jakub; Raymer, James; Forster, Jonathan J

    2015-06-01

    In this article, we develop a fully integrated and dynamic Bayesian approach to forecast populations by age and sex. The approach embeds the Lee-Carter type models for forecasting the age patterns, with associated measures of uncertainty, of fertility, mortality, immigration, and emigration within a cohort projection model. The methodology may be adapted to handle different data types and sources of information. To illustrate, we analyze time series data for the United Kingdom and forecast the components of population change to the year 2024. We also compare the results obtained from different forecast models for age-specific fertility, mortality, and migration. In doing so, we demonstrate the flexibility and advantages of adopting the Bayesian approach for population forecasting and highlight areas where this work could be extended.
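
    The Lee-Carter component embedded in such an approach models log death rates as log m(x,t) = a_x + b_x k_t, classically fitted by SVD and forecast with a random walk with drift for k_t. A compact sketch on synthetic rank-1 data (the mortality surface below is invented so the fit can be verified exactly); the Bayesian treatment in the article replaces this point estimation with full posterior inference.

```python
import numpy as np

def lee_carter_fit(log_m):
    """Fit log m(x,t) = a_x + b_x k_t via SVD (input: ages x years)."""
    a = log_m.mean(axis=1)                          # age pattern
    U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    b, k = U[:, 0], s[0] * Vt[0]
    c = b.sum()                                     # normalization: sum(b) = 1
    return a, b / c, k * c

def forecast_k(k, h):
    """Random walk with drift for the period index k_t."""
    drift = (k[-1] - k[0]) / (len(k) - 1)
    return k[-1] + drift * np.arange(1, h + 1)

# Synthetic mortality surface with known a, b, k (5 ages, 10 years).
a_true = np.linspace(-8.0, -2.0, 5)
b_true = np.array([0.35, 0.30, 0.20, 0.10, 0.05])   # sums to 1
k_true = np.linspace(2.0, -2.0, 10)                 # sums to 0
log_m = a_true[:, None] + np.outer(b_true, k_true)

a_fit, b_fit, k_fit = lee_carter_fit(log_m)
kf = forecast_k(k_fit, 3)                           # 3-step-ahead k_t
```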

  3. Factors that affect implementation of a nurse staffing directive: results from a qualitative multi-case evaluation.

    PubMed

    Robinson, Claire H; Annis, Ann M; Forman, Jane; Krein, Sarah L; Yankey, Nicholas; Duffy, Sonia A; Taylor, Beth; Sales, Anne E

    2016-08-01

    To assess implementation of the Veterans Health Administration staffing methodology directive. In 2010 the Veterans Health Administration promulgated a staffing methodology directive for inpatient nursing units to address staffing and budget forecasting. A qualitative multi-case evaluation approach assessed staffing methodology implementation. Semi-structured telephone interviews were conducted from March - June 2014 with Nurse Executives and their teams at 21 facilities. Interviews focused on the budgeting process, implementation experiences, use of data, leadership support, and training. An implementation score was created for each facility using a 4-point rating scale. The scores were used to select three facilities (low, medium and high implementation) for more detailed case studies. After analysing interview summaries, the evaluation team developed a four domain scoring structure: (1) integration of staffing methodology into budget development; (2) implementation of the Directive elements; (3) engagement of leadership and staff; and (4) use of data to support the staffing methodology process. The high implementation facility had leadership understanding and endorsement of staffing methodology, confidence in and ability to work with data, and integration of staffing methodology results into the budgeting process. The low implementation facility reported poor leadership engagement and little understanding of data sources and interpretation. Implementation varies widely across facilities. Implementing staffing methodology in facilities with complex and changing staffing needs requires substantial commitment at all organizational levels especially for facilities that have traditionally relied on historical levels to budget for staffing. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  4. Obtaining high-resolution stage forecasts by coupling large-scale hydrologic models with sensor data

    NASA Astrophysics Data System (ADS)

    Fries, K. J.; Kerkez, B.

    2017-12-01

    We investigate how "big" quantities of distributed sensor data can be coupled with a large-scale hydrologic model, in particular the National Water Model (NWM), to obtain hyper-resolution forecasts. The recent launch of the NWM provides a great example of how growing computational capacity is enabling a new generation of massive hydrologic models. While the NWM spans an unprecedented spatial extent, many questions remain about how to improve forecasts at the street level, the resolution at which many stakeholders make critical decisions. Further, the NWM runs on supercomputers, so water managers who have access to their own high-resolution measurements may not readily be able to assimilate them into the model. To that end, we ask the question: how can the advances of the large-scale NWM be coupled with new local observations to enable hyper-resolution hydrologic forecasts? A methodology is proposed whereby the flow forecasts of the NWM are directly mapped to high-resolution stream levels using Dynamical System Identification. We apply the methodology across a sensor network of 182 gages in Iowa. Of these sites, approximately one third have been shown to perform well in high-resolution flood forecasting when coupled with the outputs of the NWM. The quality of these forecasts is characterized using Principal Component Analysis and Random Forests to identify where the NWM may benefit from new sources of local observations. We also discuss how this approach can help municipalities identify where they should place low-cost sensors to most benefit from flood forecasts of the NWM.
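
    "Dynamical System Identification" covers a family of methods; a simple stand-in that conveys the flow-to-stage mapping idea is a first-order ARX model fit by least squares, stage_t = a·stage_{t-1} + b·flow_t + c. The coefficients and data below are synthetic, purely to show the mechanics, not the authors' fitted model.

```python
import numpy as np

def fit_arx(stage, flow):
    """Least-squares fit of stage_t = a*stage_{t-1} + b*flow_t + c."""
    A = np.column_stack([stage[:-1], flow[1:], np.ones(len(stage) - 1)])
    theta, *_ = np.linalg.lstsq(A, stage[1:], rcond=None)
    return theta

def simulate(theta, s0, flow):
    """Run the fitted model forward from an initial stage."""
    a, b, c = theta
    s = [s0]
    for q in flow[1:]:
        s.append(a * s[-1] + b * q + c)
    return np.array(s)

rng = np.random.default_rng(0)
flow = np.abs(rng.normal(50.0, 10.0, size=200))   # NWM-like discharge series
stage = np.empty(200)
stage[0] = 1.0
for t in range(1, 200):                           # synthetic "true" stage
    stage[t] = 0.8 * stage[t - 1] + 0.01 * flow[t] + 0.2
theta = fit_arx(stage, flow)
```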

  5. Novel methodology for pharmaceutical expenditure forecast

    PubMed Central

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    Background and objective The way new drugs are valued across countries is undergoing a disruption that makes the historical data used for forecasting pharmaceutical expenditure poorly reliable. Forecasting methods have rarely addressed uncertainty. The objective of this project was to propose a methodology for pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the ‘EU Pharmaceutical expenditure forecast’; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). Methods 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings from generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Expected clinical benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis.
Results This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and discount price, distribution chains, and the time to reach peak sales for new drugs; 2) estimating the statistical distribution of the budget impact; and 3) testing different pricing and reimbursement policy decisions on health expenditures. Conclusions This methodology was independent of historical data and appeared to be highly flexible and adapted to test robustness and provide probabilistic analysis to support policy decision making. PMID:27226843

  6. Assimilation of GPM GMI Rainfall Product with WRF GSI

    NASA Technical Reports Server (NTRS)

    Li, Xuanli; Mecikalski, John; Zavodsky, Bradley

    2015-01-01

    The Global Precipitation Measurement (GPM) mission is an international mission to provide next-generation observations of rain and snow worldwide. GPM builds on the Tropical Rainfall Measuring Mission (TRMM) legacy, while its core observatory extends the observations to higher latitudes. The GPM observations can help advance our understanding of precipitation microphysics and storm structures. Launched on February 27th, 2014, the GPM core observatory carries advanced instruments that can be used to quantify when, where, and how much it rains or snows around the world. The use of GPM data in numerical modeling work is therefore a new area and will have a broad impact in both the research and operational communities. The goal of this research is to examine the methodology of assimilating GPM retrieved products. The data assimilation system used in this study is the community Gridpoint Statistical Interpolation (GSI) system for the Weather Research and Forecasting (WRF) model developed by the Developmental Testbed Center (DTC). The community GSI system runs in an independent environment, yet is functionally equivalent to the systems run at operational centers. In collaboration with the NASA Short-term Prediction Research and Transition (SPoRT) Center, this research explores regional assimilation of the GPM products through case studies. Our presentation will highlight our recent effort on the assimilation of the GPM product 2AGPROFGMI, the retrieved GPM Microwave Imager (GMI) rainfall-rate data, for initializing a real convective storm. WRF model simulations and storm-scale data assimilation experiments will be examined, emphasizing both model initialization and short-term forecasts of precipitation fields and processes. In addition, discussion will be provided on the development of enhanced assimilation procedures in the GSI system with respect to other GPM products.
Further details of the data assimilation methodology, preliminary results, and tests of the impact of GPM data on precipitation forecasts will be presented at the conference.

  7. Measuring the effectiveness of earthquake forecasting in insurance strategies

    NASA Astrophysics Data System (ADS)

    Mignan, A.; Muir-Wood, R.

    2009-04-01

    Given the difficulty of judging whether the skill of a particular methodology of earthquake forecasts is offset by the inevitable false alarms and missed predictions, it is important to find a means to weigh the successes and failures according to a common currency. Rather than judge subjectively the relative costs and benefits of predictions, we develop a simple method to determine if the use of earthquake forecasts can increase the profitability of active financial risk management strategies employed in standard insurance procedures. Three types of risk management transactions are employed: (1) insurance underwriting, (2) reinsurance purchasing and (3) investment in CAT bonds. For each case premiums are collected based on modelled technical risk costs and losses are modelled for the portfolio in force at the time of the earthquake. A set of predetermined actions follow from the announcement of any change in earthquake hazard, so that, for each earthquake forecaster, the financial performance of an active risk management strategy can be compared with the equivalent passive strategy in which no notice is taken of earthquake forecasts. Overall performance can be tracked through time to determine which strategy gives the best long term financial performance. This will be determined by whether the skill in forecasting the location and timing of a significant earthquake (where loss is avoided) is outweighed by false predictions (when no premium is collected). This methodology is to be tested in California, where catastrophe modeling is reasonably mature and where a number of researchers issue earthquake forecasts.

  8. Economic consequences of improved temperature forecasts: An experiment with the Florida citrus growers (an update of control group results)

    NASA Technical Reports Server (NTRS)

    Braen, C.

    1978-01-01

    The economic experiment, the results obtained to date, and the work which still remains to be done are summarized. Specifically, the experiment design is described in detail, as are the data collection methodology and procedures, the sampling plan, the data reduction techniques, the cost and loss models, the establishment of frost severity measures, and the data obtained from citrus growers, the National Weather Service, and the Federal Crop Insurance Corporation. The resulting protection costs and crop losses for the control group sample, the extrapolation of control group results to the Florida citrus industry, and the method for normalizing these results to a normal or average frost season (so that they may be compared with anticipated similar results from test group measurements) are discussed.

  9. Development of a methodology for the assessment of sea level rise impacts on Florida's transportation modes and infrastructure : [summary].

    DOT National Transportation Integrated Search

    2012-01-01

    In Florida, low elevations can make transportation infrastructure in coastal and low-lying areas potentially vulnerable to sea level rise (SLR). Because global SLR forecasts lack precision at local or regional scales, SLR forecasts or scenarios for p...

  10. Developing the Capacity of Farmers to Understand and Apply Seasonal Climate Forecasts through Collaborative Learning Processes

    ERIC Educational Resources Information Center

    Cliffe, Neil; Stone, Roger; Coutts, Jeff; Reardon-Smith, Kathryn; Mushtaq, Shahbaz

    2016-01-01

    Purpose: This paper documents and evaluates collaborative learning processes aimed at developing farmers' knowledge, skills and aspirations to use seasonal climate forecasting (SCF). Methodology: Thirteen workshops conducted in 2012 engaged over 200 stakeholders across Australian sugar production regions. Workshop design promoted participant…

  11. Forecast Occupational Supply: A Methodological Handbook.

    ERIC Educational Resources Information Center

    McKinlay, Bruce; Johnson, Lowell E.

    Greater concern with unemployment in recent years has increased the need for accurate forecasting of future labor market requirements, in order to plan for vocational education and other manpower programs. However, past emphasis has been placed on labor demand, rather than supply, even though either side by itself is useless in determining skill…

  12. Post-Secondary Enrolment Forecasting with Traditional and Cross Pressure-Impact Methodologies.

    ERIC Educational Resources Information Center

    Hoffman, Bernard B.

    A model for forecasting postsecondary enrollment, the PDEM-1, is considered, which combines the traditional with a cross-pressure impact decision-making model. The model is considered in relation to its background, assumptions, survey instrument, model conception, applicability to educational environments, and implementation difficulties. The…

  13. 77 FR 1761 - Self-Regulatory Organizations; NYSE Arca, Inc.; Order Granting Approval of Proposed Rule Change...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-11

    ... quantitative research and evaluation process that forecasts economic excess sector returns (over/under the... proprietary SectorSAM quantitative research and evaluation process. \\8\\ The following convictions constitute... Allocation Methodology'' (``SectorSAM''), which is a proprietary quantitative analysis, to forecast each...

  14. Hydrological Forecasting Practices in Brazil

    NASA Astrophysics Data System (ADS)

    Fan, Fernando; Paiva, Rodrigo; Collischonn, Walter; Ramos, Maria-Helena

    2016-04-01

    This work reviews current hydrological and flood forecasting practices in Brazil, including the main forecast applications, the different techniques currently employed, and the institutions involved in forecast generation. A brief overview of Brazil is provided, covering aspects of its geography, climate, hydrology and flood hazards. A general discussion of Brazilian practices in short- and medium-range hydrological forecasting is presented. Detailed examples of hydrological forecasting systems that are operational or in a research/pre-operational phase using the large-scale hydrological model MGB-IPH are also presented. Finally, some suggestions are given on how forecasting practices in Brazil can be understood today, and on the perspectives for the future.

  15. Forecasting daily passenger traffic volumes in the Moscow metro

    NASA Astrophysics Data System (ADS)

    Ivanov, V. V.; Osetrov, E. S.

    2018-01-01

    In this paper we have developed a methodology for the medium-term prediction of daily passenger traffic volumes in the Moscow metro. It includes three options for the forecast: (1) artificial neural networks (ANNs), (2) singular-spectrum analysis as implemented in the Caterpillar-SSA package, and (3) a combination of the ANN and Caterpillar-SSA approaches. The methods and algorithms allow medium-term forecasting of passenger traffic flows in the Moscow metro with reasonable accuracy.
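
    The Caterpillar-SSA step of such a pipeline can be illustrated with a minimal sketch of basic singular-spectrum analysis (the window length, rank, and synthetic weekly series below are illustrative assumptions, not details from the paper):

    ```python
    import numpy as np

    def ssa_reconstruct(x, window, rank):
        """Rank-truncated SSA: embed the series in a trajectory (Hankel)
        matrix, keep the leading singular components, and recover a
        smoothed series by diagonal averaging."""
        n = len(x)
        k = n - window + 1
        # Columns of the trajectory matrix are lagged windows of the series.
        X = np.column_stack([x[i:i + window] for i in range(k)])
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Diagonal averaging (Hankelization) back to a 1-D series.
        rec = np.zeros(n)
        counts = np.zeros(n)
        for i in range(window):
            for j in range(k):
                rec[i + j] += Xr[i, j]
                counts[i + j] += 1
        return rec / counts

    # Illustrative series: a constant base load plus a weekly cycle.
    t = np.arange(120)
    daily = 100.0 + 20.0 * np.sin(2 * np.pi * t / 7)
    trend = ssa_reconstruct(daily, window=28, rank=3)
    ```

    Forecasting then proceeds by extrapolating the reconstructed components, or, as in option (3) above, by feeding them to an ANN.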

  16. A methodology for reduced order modeling and calibration of the upper atmosphere

    NASA Astrophysics Data System (ADS)

    Mehta, Piyush M.; Linares, Richard

    2017-10-01

    Atmospheric drag is the largest source of uncertainty in accurately predicting the orbits of satellites in low Earth orbit (LEO). Accurately predicting drag for objects that traverse LEO is critical to space situational awareness. Atmospheric models used for orbital drag calculations can be characterized as either empirical or physics-based (first-principles based). Empirical models are fast to evaluate but offer limited real-time predictive/forecasting ability, while physics-based models offer greater predictive/forecasting ability but require dedicated parallel computational resources. In addition, either type of model requires calibration with accurate data. This paper presents a new methodology, based on proper orthogonal decomposition, for developing a quasi-physical, predictive, reduced order model that combines the speed of empirical models with the predictive/forecasting capabilities of physics-based models. The methodology reduces the high dimensionality of physics-based models while maintaining their capabilities. We develop the methodology using the Naval Research Laboratory's Mass Spectrometer Incoherent Scatter model and show that the diurnal and seasonal variations can be captured using a small number of modes and parameters. We also present calibration of the reduced order model using CHAMP and GRACE accelerometer-derived densities. Results show that the method performs well for modeling and calibration of the upper atmosphere.
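
    The core POD step, building a low-dimensional basis from model snapshots, can be sketched as follows (the synthetic snapshot matrix and the 99% energy threshold are illustrative assumptions; the paper works with density fields from the atmospheric model):

    ```python
    import numpy as np

    def pod_basis(snapshots, energy=0.99):
        """Proper orthogonal decomposition of a snapshot matrix.

        snapshots: (n_states, n_snapshots) array, e.g. gridded density
        fields stacked column-wise. Returns the mean state and the
        smallest set of modes capturing the requested energy fraction."""
        mean = snapshots.mean(axis=1, keepdims=True)
        U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
        energy_frac = np.cumsum(s**2) / np.sum(s**2)
        r = int(np.searchsorted(energy_frac, energy)) + 1
        return mean, U[:, :r]

    def project(snapshots, mean, modes):
        return modes.T @ (snapshots - mean)      # reduced coordinates

    def reconstruct(coeffs, mean, modes):
        return mean + modes @ coeffs

    # Synthetic "atmosphere": two spatial patterns modulated in time.
    x = np.linspace(0, 1, 200)
    t = np.linspace(0, 10, 50)
    snaps = (np.outer(np.sin(np.pi * x), np.cos(t))
             + 0.3 * np.outer(np.sin(3 * np.pi * x), np.sin(2 * t)))
    mean, modes = pod_basis(snaps)
    approx = reconstruct(project(snaps, mean, modes), mean, modes)
    ```

    The reduced coordinates, rather than the full state, are then what a predictive or calibrated model needs to evolve in time.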

  17. Forecasting waste compositions: A case study on plastic waste of electronic display housings.

    PubMed

    Peeters, Jef R; Vanegas, Paul; Kellens, Karel; Wang, Feng; Huisman, Jaco; Dewulf, Wim; Duflou, Joost R

    2015-12-01

    Because of the rapid succession of technological developments, the architecture and material composition of many products used in daily life have drastically changed over the last decades. As a result, well-adjusted recycling technologies need to be developed and installed to cope with these evolutions. This is essential to guarantee continued access to materials and to reduce the ecological impact of our material consumption. However, limited information is currently available on the material composition of arising waste streams and even less on how these waste streams will evolve. Therefore, this paper presents a methodology to forecast trends in the material composition of waste streams. To demonstrate the applicability and value of the proposed methodology, it is applied to forecast the evolution of plastic housing waste from flat panel display (FPD) TVs, FPD monitors, cathode ray tube (CRT) TVs and CRT monitors. The results of the presented forecasts indicate that a wide variety of plastic types and additives, such as flame retardants, are found in housings of similar products. The presented case study demonstrates that the proposed methodology allows the identification of trends in the evolution of the material composition of waste streams. In addition, it is demonstrated that the recycling sector will need to adapt its processes to deal with the increasing complexity of plastics of end-of-life electronic displays while respecting relevant directives. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Forecasting of wet snow avalanche activity: Proof of concept and operational implementation

    NASA Astrophysics Data System (ADS)

    Gobiet, Andreas; Jöbstl, Lisa; Rieder, Hannes; Bellaire, Sascha; Mitterer, Christoph

    2017-04-01

    State-of-the-art tools for the operational assessment of avalanche danger include field observations, recordings from automatic weather stations, meteorological analyses and forecasts, and recently also indices derived from snowpack models. In particular, an index for identifying the onset of wet-snow avalanche cycles (the LWCindex) has been demonstrated to be useful. However, its value for operational avalanche forecasting is currently limited, since detailed, physically based snowpack models are usually driven by meteorological data from automatic weather stations only and therefore have no prognostic ability. Since avalanche risk management relies heavily on timely information and early warnings, many avalanche services in Europe now issue forecasts for the following days instead of the traditional assessment of the current avalanche danger. In this context, the prognostic operation of detailed snowpack models has recently been the subject of extensive research. In this study a new, observationally constrained setup for forecasting the onset of wet-snow avalanche cycles with the detailed snow cover model SNOWPACK is presented and evaluated. Based on data from weather stations and different numerical weather prediction models, we demonstrate that forecasts of the LWCindex as an indicator of wet-snow avalanche cycles can be useful for operational warning services, but are so far not reliable enough to be used as a single warning tool without considering other factors. Therefore, further development currently focuses on improving the forecasts by applying ensemble techniques and suitable post-processing approaches to the output of numerical weather prediction models. In parallel, the prognostic meteo-snow model chain has been used operationally by two regional avalanche warning services in Austria since winter 2016/2017. Experiences from the first operational season and first results from current model developments will be reported.

  19. Assimilation of Dual-Polarimetric Radar Observations with WRF GSI

    NASA Technical Reports Server (NTRS)

    Li, Xuanli; Mecikalski, John; Fehnel, Traci; Zavodsky, Bradley; Srikishen, Jayanthi

    2014-01-01

    Dual-polarimetric (dual-pol) radar transmits both horizontally and vertically polarized radio wave pulses. From the two reflected power returns, more accurate estimates of liquid and solid cloud and precipitation can be derived. The upgrade of the traditional NWS WSR-88D radars to include dual-pol capabilities will soon be completed for the entire NEXRAD network, so the dual-pol radar network will have a broad impact on both the research and operational communities. The assimilation of dual-pol radar data is especially challenging, as few guidelines have been provided by previous research. Our goal is to examine how best to use dual-pol radar data to improve severe storm forecasts and forecast initialization. In recent years, the Developmental Testbed Center (DTC) has released the community Gridpoint Statistical Interpolation (GSI) DA system for the Weather Research and Forecasting (WRF) model. The community GSI system runs in an independent environment, yet is functionally equivalent to the systems run at operational centers. In collaboration with the NASA Short-term Prediction Research and Transition (SPoRT) Center, this study explores regional assimilation of the dual-pol variables from WSR-88D radars for real storm cases. Our presentation will highlight our recent effort to incorporate horizontal reflectivity (ZH), differential reflectivity (ZDR), specific differential phase (KDP), and radial velocity (VR) data for initializing convective storms, with a significant focus on an improved representation of hydrometeor fields. In addition, we will discuss the development of enhanced assimilation procedures in the GSI system with respect to dual-pol variables. 
Beyond the dual-pol variable assimilation procedure developed within the GSI framework, high-resolution (~1 km) WRF model simulations and storm-scale data assimilation experiments will be examined, emphasizing both model initialization and short-term forecasts of precipitation fields and processes. Further details of the data assimilation methodology, the impact of different dual-pol variables, and the influence on precipitation forecasts will be presented at the conference.

  20. The Use of Factorial Forecasting to Predict Public Response

    ERIC Educational Resources Information Center

    Weiss, David J.

    2012-01-01

    Policies that call for members of the public to change their behavior fail if people don't change; predictions of whether the requisite changes will take place are needed prior to implementation. I propose to solve the prediction problem with Factorial Forecasting, a version of functional measurement methodology that employs group designs. Aspects…

  1. The Field Production of Water for Injection

    DTIC Science & Technology

    1985-12-01

    L/day Bedridden Patient 0.75 L/day Average Diseased Patient 0.50 L/day e (There is no feasible methodology to forecast the number of procedures per... Bedridden Patient 0.75 All Diseased Patients 0.50 An estimate of the liters/day needed may be calculated based on a forecasted patient stream, including

  2. Smoothing Forecasting Methods for Academic Library Circulations: An Evaluation and Recommendation.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.; Forys, John W., Jr.

    1986-01-01

    Circulation time-series data from 50 midwest academic libraries were used to test 110 variants of 8 smoothing forecasting methods. Data and methodologies and illustrations of two recommended methods--the single exponential smoothing method and Brown's one-parameter linear exponential smoothing method--are given. Eight references are cited. (EJS)
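
    The two recommended methods are simple enough to state in a few lines; a sketch assuming the standard textbook formulations (the smoothing constant and initialization are conventional choices, not values from the study):

    ```python
    def single_exponential(series, alpha):
        """Single exponential smoothing: the next-period forecast is a
        weighted blend of the latest observation and the prior forecast."""
        forecast = series[0]          # conventional initialization
        out = []
        for y in series:
            out.append(forecast)      # out[t] is the forecast for period t
            forecast = alpha * y + (1 - alpha) * forecast
        return out

    def brown_linear(series, alpha):
        """Brown's one-parameter linear exponential smoothing, which adds
        a trend estimate via a second smoothing pass over the first."""
        s1 = s2 = series[0]
        out = []
        for y in series:
            level = 2 * s1 - s2
            trend = alpha / (1 - alpha) * (s1 - s2)
            out.append(level + trend)     # one-step-ahead forecast
            s1 = alpha * y + (1 - alpha) * s1
            s2 = alpha * s1 + (1 - alpha) * s2
        return out
    ```

    On circulation data, the smoothing constant would be chosen by minimizing one-step-ahead forecast error over a holdout period.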

  3. Toward a science of tumor forecasting for clinical oncology

    DOE PAGES

    Yankeelov, Thomas E.; Quaranta, Vito; Evans, Katherine J.; ...

    2015-03-15

    We propose that the quantitative cancer biology community make a concerted effort to apply lessons from weather forecasting to develop an analogous methodology for predicting and evaluating tumor growth and treatment response. Currently, the time course of tumor response is not predicted; instead, response is only assessed post hoc by physical examination or imaging methods. This fundamental practice within clinical oncology limits both optimization of a treatment regimen for an individual patient and real-time determination of whether the choice was in fact appropriate. This is especially frustrating at a time when a panoply of molecularly targeted therapies is available, and precision genetic or proteomic analyses of tumors are an established reality. By learning from the methods of weather and climate modeling, we submit that the forecasting power of biophysical and biomathematical modeling can be harnessed to hasten the arrival of a field of predictive oncology. Furthermore, with a successful methodology for tumor forecasting, it should be possible to integrate large tumor-specific datasets of varied types and effectively defeat cancer one patient at a time.

  4. Towards a Science of Tumor Forecasting for Clinical Oncology

    PubMed Central

    Yankeelov, Thomas E.; Quaranta, Vito; Evans, Katherine J.; Rericha, Erin C.

    2015-01-01

    We propose that the quantitative cancer biology community make a concerted effort to apply lessons from weather forecasting to develop an analogous methodology for predicting and evaluating tumor growth and treatment response. Currently, the time course of tumor response is not predicted; instead, response is only assessed post hoc by physical exam or imaging methods. This fundamental practice within clinical oncology limits both optimization of a treatment regimen for an individual patient and real-time determination of whether the choice was in fact appropriate. This is especially frustrating at a time when a panoply of molecularly targeted therapies is available, and precision genetic or proteomic analyses of tumors are an established reality. By learning from the methods of weather and climate modeling, we submit that the forecasting power of biophysical and biomathematical modeling can be harnessed to hasten the arrival of a field of predictive oncology. With a successful methodology towards tumor forecasting, it should be possible to integrate large tumor-specific datasets of varied types, and effectively defeat cancer one patient at a time. PMID:25592148

  5. Toward a science of tumor forecasting for clinical oncology.

    PubMed

    Yankeelov, Thomas E; Quaranta, Vito; Evans, Katherine J; Rericha, Erin C

    2015-03-15

    We propose that the quantitative cancer biology community make a concerted effort to apply lessons from weather forecasting to develop an analogous methodology for predicting and evaluating tumor growth and treatment response. Currently, the time course of tumor response is not predicted; instead, response is only assessed post hoc by physical examination or imaging methods. This fundamental practice within clinical oncology limits both optimization of a treatment regimen for an individual patient and real-time determination of whether the choice was in fact appropriate. This is especially frustrating at a time when a panoply of molecularly targeted therapies is available, and precision genetic or proteomic analyses of tumors are an established reality. By learning from the methods of weather and climate modeling, we submit that the forecasting power of biophysical and biomathematical modeling can be harnessed to hasten the arrival of a field of predictive oncology. With a successful methodology toward tumor forecasting, it should be possible to integrate large tumor-specific datasets of varied types and effectively defeat cancer one patient at a time. ©2015 American Association for Cancer Research.

  6. Synoptic scale forecast skill and systematic errors in the MASS 2.0 model. [Mesoscale Atmospheric Simulation System

    NASA Technical Reports Server (NTRS)

    Koch, S. E.; Skillman, W. C.; Kocin, P. J.; Wetzel, P. J.; Brill, K. F.

    1985-01-01

    The synoptic scale performance characteristics of MASS 2.0 are determined by comparing filtered 12-24 hr model forecasts to same-case forecasts made by the National Meteorological Center's synoptic-scale Limited-area Fine Mesh model. Characteristics of the two systems are contrasted, and the analysis methodology used to determine statistical skill scores and systematic errors is described. The overall relative performance of the two models in the sample is documented, and important systematic errors uncovered are presented.

  7. Consumption trend analysis in the industrial sector: Existing forecasts

    NASA Astrophysics Data System (ADS)

    1981-08-01

    The Gas Research Institute (GRI) is engaged in medium- to long-range research and development in various sectors of the economy that depend on gas-using technologies and equipment. To assess the potential demand for natural gas in the industrial sector, forecasts available from private and public sources were compared and analyzed. More than 20 projections were examined, and 10 of the most appropriate long-range demand forecasts were analyzed and compared with respect to the assumptions, methodologies and criteria on which each was based.

  8. CARA: Cognitive Architecture for Reasoning About Adversaries

    DTIC Science & Technology

    2012-01-20

    synthesis approach taken here the KIDS principle (Keep It Descriptive, Stupid) applies, and agents and organizations are profiled in great detail...developed two algorithms to make forecasts about adversarial behavior. We developed game-theoretical approaches to reason about group behavior. We...to automatically make forecasts about group behavior together with methods to quantify the uncertainty inherent in such forecasts; • Developed

  9. A hybrid wavelet transform based short-term wind speed forecasting approach.

    PubMed

    Wang, Jujie

    2014-01-01

    It is important to improve the accuracy of wind speed forecasting for wind park management and wind power utilization. In this paper, a novel hybrid approach known as WTT-TNN is proposed for wind speed forecasting. In the first step of the approach, a wavelet transform technique (WTT) is used to decompose the wind speed series into an approximate scale and several detailed scales. In the second step, a two-hidden-layer neural network (TNN) is used to predict the approximate scale and each detailed scale, respectively. To find the optimal network architecture, the partial autocorrelation function is adopted to determine the number of neurons in the input layer, and an experimental simulation determines the number of neurons within each hidden layer in the TNN modeling process. Afterwards, the final prediction is obtained as the sum of these scale-wise predictions. In this study, the WTT is employed to extract the different patterns of the wind speed and make forecasting easier. To evaluate the performance of the proposed approach, it is applied to forecast wind speed in the Hexi Corridor of China. Simulation results in four different cases show that the proposed method increases wind speed forecasting accuracy.

  10. A Hybrid Wavelet Transform Based Short-Term Wind Speed Forecasting Approach

    PubMed Central

    Wang, Jujie

    2014-01-01

    It is important to improve the accuracy of wind speed forecasting for wind park management and wind power utilization. In this paper, a novel hybrid approach known as WTT-TNN is proposed for wind speed forecasting. In the first step of the approach, a wavelet transform technique (WTT) is used to decompose the wind speed series into an approximate scale and several detailed scales. In the second step, a two-hidden-layer neural network (TNN) is used to predict the approximate scale and each detailed scale, respectively. To find the optimal network architecture, the partial autocorrelation function is adopted to determine the number of neurons in the input layer, and an experimental simulation determines the number of neurons within each hidden layer in the TNN modeling process. Afterwards, the final prediction is obtained as the sum of these scale-wise predictions. In this study, the WTT is employed to extract the different patterns of the wind speed and make forecasting easier. To evaluate the performance of the proposed approach, it is applied to forecast wind speed in the Hexi Corridor of China. Simulation results in four different cases show that the proposed method increases wind speed forecasting accuracy. PMID:25136699
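
    The abstract does not specify which wavelet the WTT uses; a one-level Haar transform is the simplest illustration of the decompose-predict-sum idea (the per-scale TNN predictors are omitted here, and the sample speeds are illustrative):

    ```python
    import numpy as np

    def haar_decompose(x):
        """One-level Haar wavelet transform: split a series into an
        approximation (smooth) scale and a detail scale."""
        x = np.asarray(x, dtype=float)
        approx = (x[0::2] + x[1::2]) / np.sqrt(2)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)
        return approx, detail

    def haar_reconstruct(approx, detail):
        """Invert the one-level Haar transform exactly."""
        x = np.empty(2 * len(approx))
        x[0::2] = (approx + detail) / np.sqrt(2)
        x[1::2] = (approx - detail) / np.sqrt(2)
        return x

    speeds = np.array([5.1, 5.4, 6.0, 7.2, 6.8, 6.1, 5.5, 5.0])
    a, d = haar_decompose(speeds)
    assert np.allclose(haar_reconstruct(a, d), speeds)
    ```

    In the hybrid scheme, each scale would be forecast separately and the scale-wise forecasts summed via the inverse transform.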

  11. A method for the determination of potentially profitable service patterns for commuter air carriers

    NASA Technical Reports Server (NTRS)

    Ransone, R. K.; Kuhlthau, A. R.; Deptula, D. A.

    1975-01-01

    A methodology for estimating market conception was developed as part of the short-haul air transportation program. It is based on an analysis of actual documents that provide a record of known travel history. Applying this methodology, a forecast was made of the demand for an air feeder service between Charlottesville, Virginia and Dulles International Airport. Local business travel vouchers and local travel agent records were selected to provide the documentation. The market was determined to be profitable for an 8-passenger Cessna 402B aircraft flying a 2-hour daily service pattern designed to mesh as closely as possible with the connecting schedules at Dulles. The Charlottesville-Dulles air feeder service market conception forecast and its methodology are documented.

  12. Methodology of risk assessment of loss of water resources due to climate changes

    NASA Astrophysics Data System (ADS)

    Israfilov, Yusif; Israfilov, Rauf; Guliyev, Hatam; Afandiyev, Galib

    2016-04-01

    For the sustainable development and rational management of the water resources of the Republic of Azerbaijan, it is essential to forecast their changes under different climate change scenarios and to assess the possible risks of losing portions of these resources. The major part of the Azerbaijani territory lies in an arid climate, and the vast majority of its water is used in national economic production. Given the overall scarcity of water resources, optimal use of conditioned groundwater and surface water is of great strategic importance to the country's economy. Low annual precipitation, high evaporation and complex natural and hydrogeological conditions hinder the sustainable formation of conditioned ground and surface water resources. In addition, fresh water resources are unequally distributed across the Azerbaijani territory. This imbalance in the overall water budget creates tension in the rational use of fresh water in various sectors of the national economy, especially in agriculture, and consequently in the food security of the republic. The fresh water resources of the republic depend directly on climatic factors: 75-85% of the stratum-pore groundwater resources of the piedmont plains and the fracture-vein water of the mountain regions are formed by the infiltration of rainfall and condensate water. Changes in climate parameters alter the hydrological cycle of the hydrosphere and, as a rule, are reflected in these resources. Forecasting changes in water resources under different climate change scenarios with regional mathematical models allowed us to estimate the extent of this relationship and to improve the quality of decisions. 
At the same time, additional data are needed for risk assessment and management of water resource reduction: for detailed analysis, for forecasting the quantitative and qualitative parameters of the resources, and for optimizing water use. In this regard, we have developed a risk assessment methodology that includes a statistical fuzzy analysis of the "probability-consequences" relationship and a classification of probabilities and consequences by degree of severity and risk. The methodology supports the practical use of the obtained results and provides effective help in sustainable development, in reducing the risk associated with the optimal use of the republic's water resources and, consequently, in the national strategy of economic development.

  13. A forecast of bridge engineering, 1980-2000.

    DOT National Transportation Integrated Search

    1979-01-01

    A three-pronged study was undertaken to forecast the nature of bridge engineering and construction for the years 1980 to 2000. First, the history of bridge engineering was explored to extrapolate likely future developments. Second, a detailed questio...

  14. Short-term volcano-tectonic earthquake forecasts based on a moving mean recurrence time algorithm: the El Hierro seismo-volcanic crisis experience

    NASA Astrophysics Data System (ADS)

    García, Alicia; De la Cruz-Reyna, Servando; Marrero, José M.; Ortiz, Ramón

    2016-05-01

    Under certain conditions, volcano-tectonic (VT) earthquakes may pose significant hazards to people living in or near active volcanic regions, especially on volcanic islands; however, hazard arising from VT activity caused by localized volcanic sources is rarely addressed in the literature. The evolution of VT earthquakes resulting from a magmatic intrusion shows some orderly behaviour that may allow the occurrence and magnitude of major events to be forecast. Thus governmental decision makers can be supplied with warnings of the increased probability of larger-magnitude earthquakes on the short-term timescale. We present here a methodology for forecasting the occurrence of large-magnitude VT events during volcanic crises; it is based on a mean recurrence time (MRT) algorithm that translates the Gutenberg-Richter distribution parameter fluctuations into time windows of increased probability of a major VT earthquake. The MRT forecasting algorithm was developed after observing a repetitive pattern in the seismic swarm episodes occurring between July and November 2011 at El Hierro (Canary Islands). From then on, this methodology has been applied to the consecutive seismic crises registered at El Hierro, achieving a high success rate in the real-time forecasting, within 10-day time windows, of volcano-tectonic earthquakes.
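
    The MRT algorithm itself translates Gutenberg-Richter parameter fluctuations into warning windows; the b-value estimation step it rests on can be sketched with the standard Aki/Utsu maximum-likelihood estimator (the synthetic catalogue below is illustrative, not El Hierro data):

    ```python
    import math
    import random

    def b_value(magnitudes, mc, dm=0.1):
        """Aki/Utsu maximum-likelihood estimate of the Gutenberg-Richter
        b-value for events at or above completeness magnitude mc, with
        the standard half-bin correction dm for binned magnitudes."""
        mags = [m for m in magnitudes if m >= mc]
        mean_m = sum(mags) / len(mags)
        return math.log10(math.e) / (mean_m - (mc - dm / 2))

    # Synthetic catalogue with a true b-value of 1.0 above mc = 1.0:
    # magnitudes above mc are exponentially distributed with rate b*ln(10).
    random.seed(42)
    beta = 1.0 * math.log(10)
    catalog = [1.0 + random.expovariate(beta) for _ in range(50000)]
    b = b_value(catalog, mc=1.0, dm=0.0)   # continuous magnitudes: dm = 0
    ```

    Monitoring this estimate in a moving window over an evolving swarm is what lets fluctuations be turned into time windows of increased probability of a larger event.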

  15. Applications of Machine Learning to Downscaling and Verification

    NASA Astrophysics Data System (ADS)

    Prudden, R.

    2017-12-01

    Downscaling, sometimes known as super-resolution, means converting model data into a more detailed local forecast. It is a problem which could be highly amenable to machine learning approaches, provided that sufficient historical forecast data and observations are available. It is also closely linked to the subject of verification, since improving a forecast requires a way to measure that improvement. This talk will describe some early work towards downscaling Met Office ensemble forecasts, and discuss how the output may be usefully evaluated.

  16. Oxygenate Supply/Demand Balances in the Short-Term Integrated Forecasting Model (Short-Term Energy Outlook Supplement March 1998)

    EIA Publications

    1998-01-01

    The blending of oxygenates, such as fuel ethanol and methyl tertiary butyl ether (MTBE), into motor gasoline has increased dramatically in the last few years because of the oxygenated and reformulated gasoline programs. Because of the significant role oxygenates now have in petroleum product markets, the Short-Term Integrated Forecasting System (STIFS) was revised to include supply and demand balances for fuel ethanol and MTBE. The STIFS model is used for producing forecasts in the Short-Term Energy Outlook. A review of the historical data sources and forecasting methodology for oxygenate production, imports, inventories, and demand is presented in this report.

  17. Short-Term fo F2 Forecast: Present Day State of Art

    NASA Astrophysics Data System (ADS)

    Mikhailov, A. V.; Depuev, V. H.; Depueva, A. H.

    An analysis of the F2-layer short-term forecast problem has been carried out. Both objective and methodological problems currently prevent reliable F2-layer forecasts from being issued. An empirical approach based on statistical methods may be recommended for practical use. A forecast method based on a new aeronomic index (a proxy), AI, has been proposed and tested on 64 selected severe storm events. The method provides acceptable prediction accuracy for both strongly disturbed and quiet conditions. The problems with predicting F2-layer quiet-time disturbances, as well as some other unsolved problems, are discussed.

  18. Impacts of Short-Term Solar Power Forecasts in System Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibanez, Eduardo; Krad, Ibrahim; Hodge, Bri-Mathias

    2016-05-05

    Solar generation is experiencing exponential growth in power systems worldwide and, along with wind power, is posing new challenges to power system operations. Those challenges are characterized by an increase in system variability and uncertainty across many time scales: from days down to hours, minutes, and seconds. Much of the research in the area has focused on the effect of solar forecasting across hours or days. This paper presents a methodology to capture the effect of short-term forecasting strategies and analyzes the economic and reliability implications of utilizing a simple, yet effective forecasting method for solar PV in intra-day operations.
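
    The abstract does not spell out the forecasting method; simple and "smart" persistence are the usual short-term solar PV baselines and illustrate what "simple, yet effective" can mean in practice (the function names and numbers below are illustrative assumptions):

    ```python
    def persistence_forecast(history, horizon):
        """Naive persistence: forecast the latest observation forward."""
        return [history[-1]] * horizon

    def smart_persistence(history, clear_sky, horizon):
        """'Smart' persistence for solar PV: persist the clear-sky index
        (measured power / clear-sky power) rather than raw power, so the
        forecast still follows the deterministic diurnal ramp.
        `clear_sky` must cover both the history and the horizon."""
        now = len(history) - 1
        k = history[-1] / clear_sky[now]
        return [k * clear_sky[now + 1 + h] for h in range(horizon)]

    # Morning ramp-up example (MW): power rises with the sun.
    obs = [120.0, 220.0, 310.0]
    cs = [150.0, 250.0, 350.0, 420.0, 450.0]   # clear-sky power curve
    naive = persistence_forecast(obs, 2)       # flat forecast
    smart = smart_persistence(obs, cs, 2)      # follows the ramp
    ```

    The gap between such baselines and more sophisticated forecasts is what an operational cost-benefit study of short-term forecasting quantifies.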

  19. Forecasting daily meteorological time series using ARIMA and regression models

    NASA Astrophysics Data System (ADS)

    Murat, Małgorzata; Malinowska, Iwona; Gos, Magdalena; Krzyszczak, Jaromir

    2018-04-01

    Daily air temperature and precipitation time series recorded between January 1, 1980 and December 31, 2010 at four European sites (Jokioinen, Dikopshof, Lleida and Lublin) from different climatic zones were modeled and forecasted. Our forecasting used the Box-Jenkins and Holt-Winters seasonal autoregressive integrated moving-average methods, the autoregressive integrated moving-average with external regressors in the form of Fourier terms, and time series regression including trend and seasonality components, implemented with R software. It was demonstrated that the obtained models are able to capture the dynamics of the time series data and to produce sensible forecasts.
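
    The external-regressor idea, a linear trend plus Fourier terms for the seasonal cycle, can be sketched with ordinary least squares (a simplified stand-in for the R-based ARIMA-with-regressors models used in the paper; the synthetic temperature series is illustrative):

    ```python
    import numpy as np

    def fit_fourier_trend(y, period, n_harmonics=2):
        """Least-squares regression of a daily series on a linear trend
        plus Fourier terms for the seasonal cycle."""
        t = np.arange(len(y))
        cols = [np.ones(len(y)), t.astype(float)]
        for k in range(1, n_harmonics + 1):
            cols.append(np.sin(2 * np.pi * k * t / period))
            cols.append(np.cos(2 * np.pi * k * t / period))
        X = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coef

    def predict_fourier_trend(coef, t, period, n_harmonics=2):
        """Evaluate the fitted trend+seasonal model at time index t."""
        row = [1.0, float(t)]
        for k in range(1, n_harmonics + 1):
            row.append(np.sin(2 * np.pi * k * t / period))
            row.append(np.cos(2 * np.pi * k * t / period))
        return float(np.dot(coef, row))

    # Synthetic "daily temperature": linear trend plus an annual cycle.
    t = np.arange(730)
    temp = 8.0 + 0.002 * t + 9.0 * np.sin(2 * np.pi * t / 365.25)
    coef = fit_fourier_trend(temp, period=365.25)
    ```

    In the ARIMA-with-regressors setup, the residuals of this regression would additionally be modeled with an ARMA process.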

  20. Earthquake Forecasting Methodology Catalogue - A collection and comparison of the state-of-the-art in earthquake forecasting and prediction methodologies

    NASA Astrophysics Data System (ADS)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2015-04-01

    Earthquake forecasting and prediction has been one of the key struggles of modern geosciences for the last few decades. A large number of approaches covering various time periods have been developed for different locations around the world. A categorization and review of more than 20 new and old methods was undertaken to develop a state-of-the-art catalogue of forecasting algorithms and methodologies. The methods have been categorized into time-independent, time-dependent and hybrid methods, where the last group comprises methods that use additional data beyond historical earthquake statistics. Such a categorization distinguishes pure statistical approaches, in which historical earthquake data are the only direct data source, from algorithms that incorporate further information, e.g. spatial data on fault distributions, or physical models such as static triggering, to indicate future earthquakes. Furthermore, the location of application has been taken into account to identify methods that can be applied, e.g., in tectonically active regions like California or in less active continental regions. In general, most of the methods cover well-known high-seismicity regions like Italy, Japan or California. Many more elements have been reviewed, including the application of established theories and methods, e.g. for the determination of the completeness magnitude, or whether the modified Omori law was used. Target temporal scales are identified, as well as the publication history. All these aspects have been reviewed and catalogued to provide an easy-to-use tool for the development of earthquake forecasting algorithms and to give an overview of the state of the art.

  1. Present and future hydropower scheduling in Statkraft

    NASA Astrophysics Data System (ADS)

    Bruland, O.

    2012-12-01

    Statkraft produces close to 40 TWh in an average year and is one of the largest hydropower producers in Europe. For hydropower producers, the scheduling of electricity generation is the key to success, and this depends on optimal use of the water resources. The hydrologist and his forecasts, both short and long term, are crucial to this success. The hydrological forecasts in Statkraft, and in most hydropower companies in Scandinavia, are based on lumped models and the HBV concept. But upstream of the hydrological model lies a complex system for collecting, controlling and correcting the data applied in the models and the production scheduling and, equally important, routines for surveillance of the processes and manual intervention. Prior to forecasting, the states of the hydrological models are updated based on observations. When snow is present in the catchments, snow surveys are an important source for model updating. The meteorological forecast is another premise provider to the hydrological forecast, and to obtain meteorological forecasts that are as precise as possible Statkraft hires resources from the governmental forecasting centre. Their task is to interpret the meteorological situation, describe the uncertainties and, if necessary, use their knowledge and experience to manually correct the forecast in the hydropower production regions. This is one of several forecasts applied further in the scheduling process. Both to be able to compare and evaluate different forecast providers and to ensure that the best available forecast is obtained, forecasts from different sources are applied. Some of these forecasts have undergone statistical corrections to reduce biases. The uncertainties related to the meteorological forecast have long been approached and described by ensemble forecasts. But the observations used for updating the model also have a related uncertainty, both in the observations themselves and in how well they represent the catchment. Though well known, these uncertainties have thus far been handled superficially. Statkraft has initiated a program called ENKI to approach these issues. One part of this program is to apply distributed models for hydrological forecasting; other parts develop methodologies to handle uncertainties in the observations, the meteorological forecasts and the model itself, and to update the model with this information. Together with energy price expectations and information about the state of the energy production system, the hydrological forecast is input to the next step in the production scheduling, both short and long term. The long-term schedule for reservoir filling is a premise provider to the short-term optimization of water use. The long-term schedule is based on the actual reservoir levels, snow storages and a long history of meteorological observations, and gives an overall schedule at a regional level. Within the regions, a more detailed tool is used for short-term optimization of the hydropower production. Each reservoir is scheduled taking into account restrictions in the water courses and the cost of starting and stopping generating units. The value of the water is calculated for each reservoir and reflects the risk of water spillage. This value, compared to the energy price, determines whether a unit will run or not. In a gradually more complex energy system with relatively less regulated capacity, this is an increasingly challenging task.

  2. Model documentation report: Commercial Sector Demand Module of the National Energy Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-01-01

    This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Commercial Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components. The NEMS Commercial Sector Demand Module is a simulation tool based upon economic and engineering relationships that models commercial sector energy demands at the nine Census Division level of detail for eleven distinct categories of commercial buildings. Commercial equipment selections are performed for the major fuels of electricity, natural gas, and distillate fuel, for the major services of space heating, space cooling, water heating, ventilation, cooking, refrigeration, and lighting. The algorithm also models demand for the minor fuels of residual oil, liquefied petroleum gas, steam coal, motor gasoline, and kerosene, the renewable fuel sources of wood and municipal solid waste, and the minor service of office equipment. Section 2 of this report discusses the purpose of the model, detailing its objectives, primary input and output quantities, and the relationship of the Commercial Module to the other modules of the NEMS system. Section 3 of the report describes the rationale behind the model design, providing insights into further assumptions utilized in the model development process to this point. Section 3 also reviews alternative commercial sector modeling methodologies drawn from existing literature, providing a comparison to the chosen approach. Section 4 details the model structure, using graphics and text to illustrate model flows and key computations.

  3. On the skill of various ensemble spread estimators for probabilistic short range wind forecasting

    NASA Astrophysics Data System (ADS)

    Kann, A.

    2012-05-01

    A variety of applications, ranging from civil protection associated with severe weather to economic interests, are heavily dependent on meteorological information. For example, precise planning of an energy supply with a high share of renewables requires detailed meteorological information at high temporal and spatial resolution. With respect to wind power, detailed analyses and forecasts of wind speed are of crucial interest for energy management. Although the applicability and skill of state-of-the-art probabilistic short range forecasts have increased in recent years, ensemble systems still show systematic deficiencies which limit their practical use. This paper presents methods to improve the ensemble skill of 10-m wind speed forecasts by combining deterministic information from a nowcasting system at very high horizontal resolution with uncertainty estimates from a limited area ensemble system. It is shown for a one-month validation period that a statistical post-processing procedure (a modified non-homogeneous Gaussian regression) adds further skill to the probabilistic forecasts, especially beyond the nowcasting range after +6 h.
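
    Non-homogeneous Gaussian regression fits a predictive normal distribution whose mean depends on the ensemble mean and whose variance depends on the ensemble spread. The sketch below is a rough illustration on synthetic data, not the paper's modified NGR: a proper fit would minimize the CRPS, whereas here a two-step least-squares shortcut is used, and all variable names and coefficients are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ensemble forecasts: the "true" wind speed depends linearly on the
# ensemble mean, with error variance that grows with ensemble spread
# (the premise behind non-homogeneous Gaussian regression).
n = 5000
ens_mean = rng.uniform(2, 12, n)
ens_var = rng.uniform(0.5, 4.0, n)
obs = 1.0 + 0.8 * ens_mean + rng.normal(0.0, np.sqrt(0.3 + 0.9 * ens_var))

# Step 1: fit the mean model mu = a + b * ens_mean by ordinary least squares.
X = np.column_stack([np.ones(n), ens_mean])
a, b = np.linalg.lstsq(X, obs, rcond=None)[0]

# Step 2: fit the variance model sigma^2 = c + d * ens_var by regressing
# squared residuals on the ensemble variance (a moment-based shortcut;
# full NGR would minimize the CRPS of the predictive Gaussian instead).
resid2 = (obs - (a + b * ens_mean)) ** 2
V = np.column_stack([np.ones(n), ens_var])
c, d = np.linalg.lstsq(V, resid2, rcond=None)[0]

# The calibrated probabilistic forecast is then N(a + b*x_mean, c + d*x_var).
print(a, b, c, d)
```

    On the synthetic data the recovered coefficients should sit close to the generating values (1.0, 0.8, 0.3, 0.9), illustrating why the variance term lets the ensemble spread inform forecast uncertainty.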

  4. System learning approach to assess sustainability and forecast trends in regional dynamics: The San Luis Basin study, Colorado, U.S.A.

    EPA Science Inventory

    This paper presents a methodology that combines the power of an Artificial Neural Network and Information Theory to forecast variables describing the condition of a regional system. The novelty and strength of this approach is in the application of Fisher information, a key metho...

  5. Health Sciences Libraries Forecasting Information Service Trends for Researchers: Models Applicable to All Academic Libraries

    ERIC Educational Resources Information Center

    Cain, Timothy J.; Cheek, Fern M.; Kupsco, Jeremy; Hartel, Lynda J.; Getselman, Anna

    2016-01-01

    To better understand the value of current information services and to forecast the evolving information and data management needs of researchers, a study was conducted at two research-intensive universities. The methodology and planning framework applied by health science librarians at Emory University and The Ohio State University focused on…

  6. Detecting and assessing Saharan dust contribution to PM10 loads: A pilot study within the EU-Life+10 project DIAPASON

    NASA Astrophysics Data System (ADS)

    Gobbi, Gian Paolo; Barnaba, Francesca; Bolignano, Andrea; Costabile, Francesca; Di Liberto, Luca; Dionisi, Davide; Drewnick, Frank; Lucarelli, Franco; Manigrasso, Maurizio; Nava, Silvia; Sauvage, Laurent; Sozzi, Roberto; Struckmeier, Caroline; Wille, Holger

    2015-04-01

    The EC LIFE+2010 DIAPASON Project (Desert dust Impact on Air quality through model-Predictions and Advanced Sensors ObservatioNs, www.diapason-life.eu) intends to contribute new methodologies to assess the contribution of advected Saharan dust to the local PM loads recorded in Europe. To this goal, automated Polarization Lidar-Ceilometers (PLCs) were prototyped within DIAPASON to certify the presence of Saharan dust plumes and to support evaluating their mass loadings in the lowermost atmosphere. The whole process also involves operational dust forecasts, as well as satellite and in-situ observations. The Project's demonstration is implemented in the pilot region of Rome (Central Italy), where three networked DIAPASON PLCs started, in October 2013, a year-round, 24 h/day monitoring of altitude-resolved aerosol backscatter and depolarization profiles. Two intensive observational periods (IOPs) involving chemical analysis and detailed physical characterization of aerosol samples were also carried out in this year-long campaign, namely in Fall 2013 and Spring 2014. These allowed for an extensive interpretation of the PLC observations, highlighting important synergies between the PLC and the in situ data. The presentation will address the capabilities of the employed PLCs, the agreement of observations with model forecasts of dust advections, retrievals of aerosol properties, and the methodologies developed to detect Saharan advections and to evaluate their contribution to PM10 mass. This latter task is intended to provide suggestions on possible improvements to the current EC Guidelines (2011) on this matter. In fact, specific Guidelines are delivered by the European Commission to provide the Member States with a common method to assess the Saharan dust contribution to the currently legislated PM-related Air Quality metrics. The DIAPASON experience shows that improvements can be proposed to make the current EC Methodology more robust and flexible. The methodology DIAPASON recommends has been designed and validated taking advantage of the PLC observations, and highlights the benefits of the operational use of such systems in routine Air Quality applications. Concurrently, PLC activities are contributing to the COST Action "TOPROF", a European effort aiming at the setup and operational use of Lidar-Ceilometer networks for meteorological and safety purposes.

  7. Multi-hazard risk analysis related to hurricanes

    NASA Astrophysics Data System (ADS)

    Lin, Ning

    Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies in predicting hurricane hazards are investigated. In particular, the Weather Research and Forecasting model (WRF), with the Geophysical Fluid Dynamics Laboratory (GFDL) hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (ADCIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions, such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events and conduct statistical analysis. Estimates of hurricane landfall probability and hazards are combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied. 
An innovative windborne debris risk model is developed based on the theory of Poisson random measure, substantiated by a large amount of empirical data. An advanced vulnerability assessment methodology is then developed, by integrating this debris risk model and a component-based pressure damage model, to predict storm-specific or annual damage to coastal residential neighborhoods. The uniqueness of this vulnerability model lies in its detailed description of the interaction between wind pressure and windborne debris effects over periods of strong winds, which is a major mechanism leading to structural failures during hurricanes.

  8. Multi-scale landslide hazard assessment: Advances in global and regional methodologies

    NASA Astrophysics Data System (ADS)

    Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Hong, Yang

    2010-05-01

    The increasing availability of remotely sensed surface data and precipitation provides a unique opportunity to explore how smaller-scale landslide susceptibility and hazard assessment methodologies may be applicable at larger spatial scales. This research first considers an emerging satellite-based global algorithm framework, which evaluates how landslide susceptibility and satellite-derived rainfall estimates can forecast potential landslide conditions. An analysis of this algorithm using a newly developed global landslide inventory catalog suggests that forecasting errors are geographically variable due to improper weighting of surface observables, the resolution of the current susceptibility map, and limitations in the availability of landslide inventory data. These methodological and data limitation issues can be more thoroughly assessed at the regional level, where available higher resolution landslide inventories can be used to empirically derive relationships between surface variables and landslide occurrence. The regional empirical model shows improvement over the global framework in advancing near real-time landslide forecasting efforts; however, many uncertainties and assumptions surrounding such a methodology decrease the functionality and utility of this system. This research seeks to improve upon this initial concept by exploring the potential opportunities and methodological structure needed to advance larger-scale landslide hazard forecasting and make it more of an operational reality. Sensitivity analysis of the surface and rainfall parameters in the preliminary algorithm indicates that surface data resolution and the interdependency of variables must be more appropriately quantified at local and regional scales. 
Additionally, integrating available surface parameters must be approached in a more theoretical, physically-based manner to better represent the physical processes underlying slope instability and landslide initiation. Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of applying a more quantitative hydrological model to larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Due to the significant role that data and methodological uncertainties play in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework in order to better resolve the contribution and limitations of model inputs and to more effectively communicate the model skill for improved landslide hazard assessment.

  9. Adaptive Regulation of the Northern California Reservoir System for Water, Energy, and Environmental Management

    NASA Astrophysics Data System (ADS)

    Georgakakos, A. P.; Kistenmacher, M.; Yao, H.; Georgakakos, K. P.

    2014-12-01

    The 2014 National Climate Assessment of the US Global Change Research Program emphasizes that water resources managers and planners in most US regions will have to cope with new risks, vulnerabilities, and opportunities, and recommends the development of adaptive capacity to effectively respond to the new water resources planning and management challenges. In the face of these challenges, adaptive reservoir regulation is becoming all the more necessary. Water resources management in Northern California relies on the coordinated operation of several multi-objective reservoirs on the Trinity, Sacramento, American, Feather, and San Joaquin Rivers. To be effective, reservoir regulation must be able to (a) account for forecast uncertainty; (b) assess changing tradeoffs among water uses and regions; (c) adjust management policies as conditions change; and (d) evaluate the socio-economic and environmental benefits and risks of forecasts and policies for each region and for the system as a whole. The Integrated Forecast and Reservoir Management (INFORM) prototype demonstration project, operated in Northern California through the collaboration of several forecast and management agencies, has shown that decision support systems (DSS) with these attributes add value to stakeholder decision processes compared to current, less flexible management practices. Key features of the INFORM DSS include: (a) dynamically downscaled operational forecasts and climate projections that maintain the spatio-temporal coherence of the downscaled land surface forcing fields within synoptic scales; (b) use of ensemble forecast methodologies for reservoir inflows; (c) assessment of relevant tradeoffs among water uses on regional and local scales; (d) development and evaluation of dynamic reservoir policies with explicit consideration of hydro-climatic forecast uncertainties; and (e) focus on stakeholder information needs. This article discusses the INFORM integrated design concept, underlying methodologies, and selected applications with the California water resources system.

  10. Estimation and prediction of origin-destination matrices for I-66.

    DOT National Transportation Integrated Search

    2011-09-01

    This project uses the Box-Jenkins time-series technique to model and forecast the traffic flows and then uses the flow forecasts to predict the origin-destination matrices. First, a detailed analysis was conducted to investigate the best data cor...

  11. Improving 7-Day Forecast Skill by Assimilation of Retrieved AIRS Temperature Profiles

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Rosenberg, Bob

    2016-01-01

    We conducted a new set of data assimilation experiments covering the period January 1 to February 29, 2016 using the GEOS-5 DAS. Our experiments assimilate all data used operationally by GMAO (Control) with some modifications. Significant improvement in Global and Southern Hemisphere Extra-tropical 7-day forecast skill was obtained when we assimilated AIRS quality-controlled temperature profiles in place of observed AIRS radiances, and did not assimilate CrIS/ATMS radiances, radiosonde temperature profiles, or aircraft temperatures. This new methodology neither improved nor degraded 7-day Northern Hemisphere Extra-tropical forecast skill. We are conducting experiments aimed at further improving Northern Hemisphere Extra-tropical forecast skill.

  12. New forecasting methodology indicates more disease and earlier mortality ahead for today's younger Americans.

    PubMed

    Reither, Eric N; Olshansky, S Jay; Yang, Yang

    2011-08-01

    Traditional methods of projecting population health statistics, such as estimating future death rates, can give inaccurate results and lead to inferior or even poor policy decisions. A new "three-dimensional" method of forecasting vital health statistics is more accurate because it takes into account the delayed effects of the health risks being accumulated by today's younger generations. Applying this forecasting technique to the US obesity epidemic suggests that future death rates and health care expenditures could be far worse than currently anticipated. We suggest that public policy makers adopt this more robust forecasting tool and redouble efforts to develop and implement effective obesity-related prevention programs and interventions.

  13. Next Day Price Forecasting in Deregulated Market by Combination of Artificial Neural Network and ARIMA Time Series Models

    NASA Astrophysics Data System (ADS)

    Areekul, Phatchakorn; Senjyu, Tomonobu; Urasaki, Naomitsu; Yona, Atsushi

    Electricity price forecasting is becoming increasingly relevant to power producers and consumers in the new competitive electric power markets when planning bidding strategies in order to maximize their benefits and utilities, respectively. This paper proposes a method to predict hourly electricity prices for next-day electricity markets using a combined ARIMA-ANN methodology. The proposed method is examined on the Australian National Electricity Market (NEM), New South Wales region, for the year 2006. A comparison of the forecasting performance of the ARIMA, ANN and combined (ARIMA-ANN) models is presented. Empirical results indicate that the ARIMA-ANN model can improve price forecasting accuracy.
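
    ARIMA-ANN hybrids typically follow a two-stage pattern: a linear time-series model captures the linear component, and a neural network is trained on its residuals to pick up what the linear model missed. The sketch below is a minimal stand-in for that pattern, not the paper's method: it assumes an AR(2) in place of a full ARIMA, a tiny hand-rolled one-hidden-layer network in place of a production ANN, and a synthetic series; every parameter is illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "price" series with a linear AR part plus a mild nonlinearity.
n = 600
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.6 * y[t-1] - 0.2 * y[t-2] + 0.3 * np.sin(y[t-1]) + rng.normal(0, 0.1)

p = 2  # AR order (stands in for the ARIMA stage)

# Stage 1: linear model fitted by least squares on lagged values.
X = np.column_stack([np.ones(n - p), y[p-1:-1], y[p-2:-2]])  # 1, y[t-1], y[t-2]
target = y[p:]
coef = np.linalg.lstsq(X, target, rcond=None)[0]
linear_pred = X @ coef
resid = target - linear_pred

# Stage 2: one-hidden-layer network on lagged residuals (stands in for the ANN),
# trained with plain batch gradient descent on the mean squared error.
q = 2
R = np.column_stack([resid[q-1:-1], resid[q-2:-2]])
r_target = resid[q:]
W1 = rng.normal(0, 0.5, (q, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, 8); b2 = 0.0
lr = 0.05
losses = []
for _ in range(2000):
    h = np.tanh(R @ W1 + b1)
    err = (h @ W2 + b2) - r_target
    losses.append(float(np.mean(err ** 2)))
    gW2 = h.T @ err / len(R); gb2 = err.mean()
    gh = np.outer(err, W2) * (1 - h ** 2)
    gW1 = R.T @ gh / len(R); gb1 = gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

# Hybrid forecast = linear forecast + predicted residual correction.
h = np.tanh(R @ W1 + b1)
hybrid = linear_pred[q:] + (h @ W2 + b2)
rmse_linear = np.sqrt(np.mean((target[q:] - linear_pred[q:]) ** 2))
rmse_hybrid = np.sqrt(np.mean((target[q:] - hybrid) ** 2))
print(rmse_linear, rmse_hybrid)
```

    The design choice mirrors the hybrid's rationale: the linear stage is cheap and interpretable, while the nonlinear stage only has to model structure the linear stage provably cannot represent.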

  14. A global flash flood forecasting system

    NASA Astrophysics Data System (ADS)

    Baugh, Calum; Pappenberger, Florian; Wetterhall, Fredrik; Hewson, Tim; Zsoter, Ervin

    2016-04-01

    The sudden and devastating nature of flash flood events means it is imperative to provide early warnings such as those derived from Numerical Weather Prediction (NWP) forecasts. Currently such systems exist at basin, national and continental scales in Europe, North America and Australia, but rely on high resolution NWP forecasts or rainfall-radar nowcasting, neither of which has global coverage. To produce global flash flood forecasts, this work investigates the possibility of using forecasts from a global NWP system. In particular we: (i) discuss how global NWP can be used for flash flood forecasting, including its strengths and weaknesses; (ii) demonstrate how a robust evaluation can be performed given the rarity of the event; (iii) highlight the challenges and opportunities in communicating flash flood uncertainty to decision makers; and (iv) explore future developments which would significantly improve global flash flood forecasting. The proposed forecast system uses ensemble surface runoff forecasts from the ECMWF H-TESSEL land surface scheme. A flash flood index is generated using the ERIC (Enhanced Runoff Index based on Climatology) methodology [Raynaud et al., 2014]. This global methodology is applied to a series of flash floods across southern Europe. Results from the system are compared against warnings produced using the higher resolution COSMO-LEPS limited area model. The global system is evaluated by comparing forecasted warning locations against a flash flood database of media reports created in partnership with floodlist.com. To deal with the lack of objectivity in media reports, we carefully assess the suitability of different skill scores and apply spatial uncertainty thresholds to the observations. To communicate the uncertainties of the flash flood system output, we experiment with a dynamic region-growing algorithm. 
This automatically clusters regions of similar return-period exceedance probabilities, thus presenting the at-risk areas at a spatial resolution appropriate to the NWP system. We then demonstrate how these warning areas could eventually complement existing global systems such as the Global Flood Awareness System (GloFAS) to give warnings of flash floods. This work demonstrates the possibility of creating a global flash flood forecasting system based on forecasts from existing global NWP systems. Future developments, in post-processing for example, will need to address an under-prediction bias for extreme point rainfall that is innate to current-generation global models.
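
    The idea of growing warning regions from a grid of exceedance probabilities can be illustrated with a plain flood-fill that clusters contiguous cells passing a threshold. This is a deliberately simple stand-in: the 4-connectivity, the fixed threshold, and the toy grid are all assumptions, and the dynamic region-growing algorithm described above is presumably more sophisticated.

```python
from collections import deque

import numpy as np

def grow_regions(prob, threshold=0.5):
    """Cluster 4-connected grid cells whose exceedance probability is at
    least `threshold` into contiguous warning regions (simple flood fill)."""
    labels = np.zeros(prob.shape, dtype=int)
    current = 0
    for i in range(prob.shape[0]):
        for j in range(prob.shape[1]):
            if prob[i, j] >= threshold and labels[i, j] == 0:
                current += 1            # start a new warning region
                labels[i, j] = current
                queue = deque([(i, j)])
                while queue:            # breadth-first flood fill
                    a, b = queue.popleft()
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        x, y = a + da, b + db
                        if (0 <= x < prob.shape[0] and 0 <= y < prob.shape[1]
                                and prob[x, y] >= threshold
                                and labels[x, y] == 0):
                            labels[x, y] = current
                            queue.append((x, y))
    return labels

# Two separate high-probability patches should yield two warning regions.
p = np.zeros((5, 5))
p[0:2, 0:2] = 0.8
p[3:5, 3:5] = 0.9
regions = grow_regions(p)
print(regions.max())
```

    Presenting warnings as clustered regions rather than individual cells matches the stated goal of communicating risk at a spatial resolution appropriate to the NWP system.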

  15. Concepts and Methodology for Labour Market Forecasts by Occupation and Qualification in the Context of a Flexible Labour Market.

    ERIC Educational Resources Information Center

    Borghans, Lex; de Grip, Andries; Heijke, Hans

    The problem of planning and making labor market forecasts by occupation and qualification in the context of a constantly changing labor market was examined. The examination focused on the following topics: assumptions, benefits, and pitfalls of the labor requirement model of projecting future imbalances between labor supply and demand for certain…

  16. Major challenges for correlational ecological niche model projections to future climate conditions.

    PubMed

    Peterson, A Townsend; Cobos, Marlon E; Jiménez-García, Daniel

    2018-06-20

    Species-level forecasts of distributional potential and likely distributional shifts, in the face of changing climates, have become popular in the literature in the past 20 years. Many refinements have been made to the methodology over the years, and the result has been an approach that considers multiple sources of variation in geographic predictions, and how that variation translates into both specific predictions and uncertainty in those predictions. Although numerous previous reviews and overviews of this field have pointed out a series of assumptions and caveats associated with the methodology, three aspects of the methodology have important impacts but have not been treated previously in detail. Here, we assess those three aspects: (1) effects of niche truncation on model transfers to future climate conditions, (2) effects of model selection procedures on future-climate transfers of ecological niche models, and (3) relative contributions of several factors (replicate samples of point data, general circulation models, representative concentration pathways, and alternative model parameterizations) to overall variance in model outcomes. Overall, the view is one of caution: although resulting predictions are fascinating and attractive, this paradigm has pitfalls that may bias and limit confidence in niche model outputs as regards the implications of climate change for species' geographic distributions. © 2018 New York Academy of Sciences.

  17. The UCERF3 grand inversion: Solving for the long‐term rate of ruptures in a fault system

    USGS Publications Warehouse

    Page, Morgan T.; Field, Edward H.; Milner, Kevin; Powers, Peter M.

    2014-01-01

    We present implementation details, testing, and results from a new inversion‐based methodology, known colloquially as the “grand inversion,” developed for the Uniform California Earthquake Rupture Forecast (UCERF3). We employ a parallel simulated annealing algorithm to solve for the long‐term rate of all ruptures that extend through the seismogenic thickness on major mapped faults in California while simultaneously satisfying available slip‐rate, paleoseismic event‐rate, and magnitude‐distribution constraints. The inversion methodology enables the relaxation of fault segmentation and allows for the incorporation of multifault ruptures, which are needed to remove magnitude‐distribution misfits that were present in the previous model, UCERF2. The grand inversion is more objective than past methodologies, as it eliminates the need to prescriptively assign rupture rates. It also provides a means to easily update the model as new data become available. In addition to UCERF3 model results, we present verification of the grand inversion, including sensitivity tests, tuning of equation set weights, convergence metrics, and a synthetic test. These tests demonstrate that while individual rupture rates are poorly resolved by the data, integrated quantities such as magnitude–frequency distributions and, most importantly, hazard metrics, are much more robust.
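
    The grand inversion's parallel simulated annealing over thousands of rupture rates is beyond a short example, but its core idea, perturbing non-negative rupture rates to satisfy slip-rate and event-rate constraints in a least-squares sense, can be sketched on a toy system. The constraint matrix, cooling schedule, and step size below are illustrative assumptions, not UCERF3 values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy fault system: 4 hypothetical ruptures, 3 data constraints
# (think of two slip-rate rows and one paleoseismic event-rate row).
A = np.array([[1.0, 1.0, 0.0, 0.5],
              [0.0, 1.0, 1.0, 0.5],
              [1.0, 0.0, 1.0, 1.0]])
d = np.array([2.0, 1.5, 2.5])

def misfit(x):
    """Sum of squared residuals of the constraint equations A x = d."""
    return float(np.sum((A @ x - d) ** 2))

# Simulated annealing over non-negative rupture rates.
x = np.ones(4)
e = misfit(x)
best, best_e = x.copy(), e
T = 1.0
for step in range(20000):
    T *= 0.9995                                    # geometric cooling
    prop = x.copy()
    i = rng.integers(4)
    prop[i] = max(0.0, prop[i] + rng.normal(0, 0.1))  # perturb one rate
    e_new = misfit(prop)
    # Accept improvements always; accept worse states with Boltzmann probability.
    if e_new < e or rng.random() < np.exp(-(e_new - e) / T):
        x, e = prop, e_new
        if e < best_e:
            best, best_e = x.copy(), e
print(best, best_e)
```

    Because the toy system is underdetermined (4 unknowns, 3 constraints), many rate vectors fit the data equally well, which echoes the abstract's point that individual rupture rates are poorly resolved while integrated quantities are robust.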

  18. Multidimensional k-nearest neighbor model based on EEMD for financial time series forecasting

    NASA Astrophysics Data System (ADS)

    Zhang, Ningning; Lin, Aijing; Shang, Pengjian

    2017-07-01

    In this paper, we propose a new two-stage methodology that combines ensemble empirical mode decomposition (EEMD) with a multidimensional k-nearest neighbor model (MKNN) in order to forecast the closing price and high price of stocks simultaneously. The modified k-nearest neighbors (KNN) algorithm is finding increasingly wide application in prediction across many fields. Empirical mode decomposition (EMD) decomposes a nonlinear and non-stationary signal into a series of intrinsic mode functions (IMFs); however, it cannot accurately reveal the characteristic information of the signal as a result of mode mixing. Ensemble empirical mode decomposition (EEMD), an improved version of EMD, resolves this weakness by adding white noise to the original data. With EEMD, components with true physical meaning can be extracted from the time series. Utilizing the advantages of EEMD and MKNN, the proposed EEMD-MKNN model attains high predictive precision for short-term forecasting. Moreover, we extend this methodology to two dimensions to forecast the closing price and high price of four stock indices (NAS, S&P500, DJI and STI) at the same time. The results indicate that the proposed EEMD-MKNN model has a higher forecast precision than the EMD-KNN, KNN and ARIMA methods.
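
    The k-nearest-neighbor forecasting stage can be sketched independently of the decomposition: embed the series in delay vectors, find the k historical vectors closest to the most recent one, and average their successors. In the full EEMD-MKNN scheme this would presumably be applied per IMF (EEMD implementations exist in libraries such as PyEMD) and the per-IMF forecasts recombined; the sketch below shows only the KNN stage, with illustrative embedding and neighborhood parameters.

```python
import numpy as np

def knn_forecast(series, m=3, k=5):
    """Forecast the next value by averaging the successors of the k
    historical delay vectors nearest to the most recent one."""
    s = np.asarray(series, dtype=float)
    # Delay vectors [s[t-m+1], ..., s[t]] for every t with a known successor.
    emb = np.array([s[t - m + 1:t + 1] for t in range(m - 1, len(s) - 1)])
    succ = s[m:]                      # successor of each delay vector
    query = s[-m:]                    # the current (most recent) delay vector
    dist = np.linalg.norm(emb - query, axis=1)
    nearest = np.argsort(dist)[:k]
    return succ[nearest].mean()

# Usage on a noiseless periodic series: the forecast continues the cycle.
t = np.arange(400)
x = np.sin(2 * np.pi * t / 50)
pred = knn_forecast(x, m=4, k=3)
print(pred)
```

    On a clean periodic signal the nearest delay vectors are exact period repeats, so the averaged successors reproduce the next point of the cycle; on noisy IMFs the averaging acts as a local smoother.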

  19. Parameter estimation and forecasting for multiplicative log-normal cascades.

    PubMed

    Leövey, Andrés E; Lux, Thomas

    2012-04-01

    We study the well-known multiplicative log-normal cascade process in which the multiplication of Gaussian and log-normally distributed random variables yields time series with intermittent bursts of activity. Due to the nonstationarity of this process and the combinatorial nature of such a formalism, its parameters have been estimated mostly by fitting the numerical approximation of the associated non-Gaussian probability density function to empirical data, cf. Castaing et al. [Physica D 46, 177 (1990)]. More recently, alternative estimators based upon various moments have been proposed by Beck [Physica D 193, 195 (2004)] and Kiyono et al. [Phys. Rev. E 76, 041113 (2007)]. In this paper, we pursue this moment-based approach further and develop a more rigorous generalized method of moments (GMM) estimation procedure to cope with the documented difficulties of previous methodologies. We show that even under uncertainty about the actual number of cascade steps, our methodology yields very reliable results for the estimated intermittency parameter. Employing the Levinson-Durbin algorithm for best linear forecasts, we also show that the estimated parameters can be used for forecasting the evolution of the turbulent flow. We compare forecasting results from the GMM and Kiyono et al.'s procedures via Monte Carlo simulations. We finally test the applicability of our approach by estimating the intermittency parameter and forecasting volatility for a sample of financial data from stock and foreign exchange markets.
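
    A much simpler moment-based estimator in the spirit of the approaches cited above can be illustrated on a single multiplicative level: for Y = exp(w)·ε with w ~ N(0, λ²) and ε a standard Gaussian, Var(log|Y|) = λ² + π²/8, since log|ε| has variance π²/8. This yields a one-line estimator of the intermittency parameter; it is a toy illustration, not the paper's GMM procedure.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a multiplicative log-normal random variable: Y = exp(w) * eps,
# with w ~ N(0, lam2) the cascade amplitude and eps ~ N(0, 1).
lam2_true = 0.25            # intermittency parameter lambda^2
n = 200_000
w = rng.normal(0.0, np.sqrt(lam2_true), n)
eps = rng.normal(0.0, 1.0, n)
y = np.exp(w) * eps

# Moment-based estimator: Var(log|Y|) = lam2 + Var(log|eps|),
# and for a standard Gaussian, Var(log|eps|) = pi^2 / 8.
lam2_hat = np.var(np.log(np.abs(y))) - np.pi ** 2 / 8
print(lam2_hat)
```

    The full GMM approach generalizes this idea by matching several moments at once and weighting them by their covariance, which is what makes it robust to an unknown number of cascade steps.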

  20. Identifying needs for streamflow forecasting in the Incomati basin, Southern Africa

    NASA Astrophysics Data System (ADS)

    Sunday, Robert; Werner, Micha; Masih, Ilyas; van der Zaag, Pieter

    2013-04-01

    Despite being widely recognised as efficient tools in the operational management of water resources, rainfall and streamflow forecasts are currently not utilised in water management practice in the Incomati Basin in Southern Africa. Although there have been initiatives for forecasting streamflow in the Sabie and Crocodile sub-basins, their outputs have found little use because of scepticism about the accuracy and reliability of the information, or about its relevance to the needs of the water managers. The process of improving these forecasts is underway, but as yet the actual forecast needs are unclear and the scope of the ongoing initiatives remains very limited. In this study, questionnaires and focus group interviews were used to establish the need for, potential use, benefit and required accuracy of rainfall and streamflow forecasts in the Incomati Basin. Thirty-five interviews were conducted with professionals engaged in the water sector, and detailed discussions were held with water institutions, including the Inkomati Catchment Management Agency (ICMA), the Komati Basin Water Authority (KOBWA), the South African Weather Service (SAWS), water managers, dam operators, water experts, farmers and other water users in the Basin. Survey results show that about 97% of the respondents receive weather forecasts. In contrast to expectations, only 5% have access to streamflow forecasts. Within the weather forecast, the most important variables were considered to be rainfall and temperature at daily and weekly time scales. Moreover, forecasts of global climatic indices such as El Niño or La Niña were neither received nor demanded. There was limited demand for and/or awareness of flood and drought forecasts, including information on their linkages with global climatic indices. 
While the majority of respondents indicated the need for and indeed used the weather forecast, its provision, communication and interpretation were in general found to lack detail and clarity. In some cases this was attributed to the limited time and space allotted in media such as television and newspapers. Major uses of the weather forecast were in personal planning, i.e., travelling (29%) and dressing (23%). Usefulness in the water sector was reported for water allocation (23%), farming (11%) and flood monitoring (9%), but the forecast was considered a factor with only minor influence on actual decision making in operational water management, mainly due to the uncertainty of the weather forecast, differences in time scale, and institutional arrangements. In the instances where streamflow forecasts were received (5% of cases), they were not applied in decision making due to high uncertainty. Moreover, dam operators indicated that weekly streamflow forecasts would be very important for releasing water for agriculture, but this was not the format in which forecasts were available to them. Generally, users affirmed the accuracy and benefits of weather forecasts and had no major concerns about the impacts of wrong forecasts. However, respondents indicated the need to improve the accuracy and accessibility of the forecast. Likewise, water managers expressed the need for both rainfall and flow forecasts but indicated that they face hindrances due to financial and human resource constraints. This shows that there is a need to strengthen water-related forecasts and their use in the basin. This can be done through collaboration among forecasting and water organisations such as the SAWS, research institutions, and users like the ICMA, KOBWA and farmers. Collaboration between the meteorology and water resources sectors is important to establish consistent forecast information. 
The forecasts themselves should be detailed and user-specific to ensure they are indeed used and answer the needs of the users.

  1. Time-Series Forecast Modeling on High-Bandwidth Network Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Sim, Alex

    With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate the ever-increasing data volume of large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with traditional approaches such as the Box-Jenkins methodology for training the ARIMA model, our forecast model reduces computation time by up to 92.6%. It also shows resilience against abrupt changes in network usage. Finally, our forecast model produces a large number of multi-step forecasts, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.
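
    As a rough illustration of the decompose-then-model idea, the sketch below substitutes per-phase means for STL's Loess smoothing and a least-squares AR(1) for the full ARIMA; it is a deliberately crude stand-in, not the authors' pipeline, and the function name and parameters are hypothetical.

```python
import numpy as np

def seasonal_ar1_forecast(y, period, steps):
    """Subtract per-phase seasonal means (crude 'decomposition'),
    fit AR(1) to the remainder by least squares, then forecast the
    remainder and re-add the seasonal component."""
    y = np.asarray(y, dtype=float)
    phases = np.arange(len(y)) % period
    seasonal = np.array([y[phases == p].mean() for p in range(period)])
    resid = y - seasonal[phases]
    # AR(1) coefficient by least squares on lagged residuals.
    phi = np.dot(resid[:-1], resid[1:]) / np.dot(resid[:-1], resid[:-1])
    out, r = [], resid[-1]
    for h in range(1, steps + 1):
        r *= phi  # residual decays geometrically under AR(1)
        out.append(r + seasonal[(len(y) + h - 1) % period])
    return np.array(out)
```

    Separating the strongly periodic (e.g., diurnal) component from the stochastic remainder is what lets a cheap autoregressive fit stand in for a full Box-Jenkins identification, which is the source of the computation-time savings the abstract reports.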

  2. An Ensemble-Based Forecasting Framework to Optimize Reservoir Releases

    NASA Astrophysics Data System (ADS)

    Ramaswamy, V.; Saleh, F.

    2017-12-01

    The increasing frequency of extreme precipitation events is stressing the need to manage water resources on shorter timescales. Short-term management of water resources becomes proactive when inflow forecasts are available and this information can be effectively used in the control strategy. This work investigates the utility of short-term hydrological ensemble forecasts for operational decision making during extreme weather events. An advanced automated hydrologic prediction framework integrating a regional-scale hydrologic model, GIS datasets and the meteorological ensemble predictions from the European Centre for Medium-Range Weather Forecasts (ECMWF) was coupled to an implicit multi-objective dynamic programming model to optimize releases from a water supply reservoir. The proposed methodology was evaluated by retrospectively forecasting the inflows to the Oradell reservoir in the Hackensack River basin in New Jersey during the extreme hydrologic event Hurricane Irene. Additionally, the flexibility of the forecasting framework was investigated by forecasting the inflows from a moderate rainfall event to provide important perspectives on using the framework to assist reservoir operations during moderate events. The proposed forecasting framework seeks to provide a flexible, assistive tool to alleviate the complexity of operational decision-making.

  3. Time-Series Forecast Modeling on High-Bandwidth Network Measurements

    DOE PAGES

    Yoo, Wucherl; Sim, Alex

    2016-06-24

    With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate the ever-increasing data volume of large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with traditional approaches such as the Box-Jenkins methodology for training the ARIMA model, our forecast model reduces computation time by up to 92.6%. It also shows resilience against abrupt changes in network usage. Finally, our forecast model produces a large number of multi-step forecasts, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.

  4. A review and update of the Virginia Department of Transportation cash flow forecasting model.

    DOT National Transportation Integrated Search

    1996-01-01

    This report details the research done to review and update components of the VDOT cash flow forecasting model. Specifically, the study updated the monthly factors submodel used to predict payments on construction contracts. For the other submodel rev...

  5. Basic principles, methodology, and applications of remote sensing in agriculture

    NASA Technical Reports Server (NTRS)

    Moreira, M. A. (Principal Investigator); Deassuncao, G. V.

    1984-01-01

    The basic principles of remote sensing applied to agriculture and the methods used in data analysis are described. Emphasis is placed on the importance of developing a methodology to support crop forecasting, on the basic concepts of spectral signatures of vegetation, on the methodology for utilizing LANDSAT data in agriculture, and on the agricultural remote sensing program applications of INPE (Institute for Space Research).

  6. Prioritization Methodology for Chemical Replacement

    NASA Technical Reports Server (NTRS)

    Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.

    1993-01-01

    This project serves to define an appropriate methodology for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results are being implemented as a guideline for consideration for current NASA propulsion systems.

  7. Testing an innovative framework for flood forecasting, monitoring and mapping in Europe

    NASA Astrophysics Data System (ADS)

    Dottori, Francesco; Kalas, Milan; Lorini, Valerio; Wania, Annett; Pappenberger, Florian; Salamon, Peter; Ramos, Maria Helena; Cloke, Hannah; Castillo, Carlos

    2017-04-01

    Between May and June 2016, France was hit by severe floods, particularly in the Loire and Seine river basins. In this work, we use this case study to test an innovative framework for flood forecasting, mapping and monitoring. More specifically, the system integrates in real time two components of the Copernicus Emergency Management Service, namely the European Flood Awareness System and satellite-based Rapid Mapping, with new procedures for rapid risk assessment and for social media and news monitoring. We explore in detail the performance of each component of the system, demonstrating the improvements with respect to stand-alone flood forecasting and monitoring systems. We show how the performance of the forecasting component can be refined using real-time feedback from social media monitoring to identify which areas were flooded, to evaluate the flood intensity, and therefore to correct impact estimations. Moreover, we show how the integration with impact forecasts and social media monitoring can improve the timeliness and efficiency of satellite-based emergency mapping, and reduce the chances of missing areas where flooding is already happening. These results illustrate how the new integrated approach leads to better and earlier decision making and a timely evaluation of impacts.

  8. Air Quality Forecasts Using the NASA GEOS Model

    NASA Technical Reports Server (NTRS)

    Keller, Christoph A.; Knowland, K. Emma; Nielsen, Jon E.; Orbe, Clara; Ott, Lesley; Pawson, Steven; Saunders, Emily; Duncan, Bryan; Follette-Cook, Melanie; Liu, Junhua; hide

    2018-01-01

    We provide an introduction to a new high-resolution (0.25 degree) global composition forecast produced by NASA's Global Modeling and Assimilation Office. The NASA Goddard Earth Observing System version 5 (GEOS-5) model has been expanded to provide global near-real-time forecasts of atmospheric composition at a horizontal resolution of 0.25 degrees (25 km). Previously, this combination of detailed chemistry and resolution was only provided by regional models. This system combines the operational GEOS-5 weather forecasting model with the state-of-the-science GEOS-Chem chemistry module (version 11) to provide detailed chemical analyses of a wide range of air pollutants such as ozone, carbon monoxide, nitrogen oxides, and fine particulate matter (PM2.5). These are currently the highest-resolution publicly available global composition forecasts. Evaluation and validation of modeled trace gases and aerosols against surface and satellite observations will be presented for constituents relevant to health-based air quality standards. Comparisons of modeled trace gases and aerosols against satellite observations show that the model produces realistic concentrations of atmospheric constituents in the free troposphere. Model comparisons against surface observations highlight the model's capability to capture the diurnal variability of air pollutants under a variety of meteorological conditions. The GEOS-5 composition forecasting system offers a new tool for scientists and the public health community, and is being developed jointly with several government and non-profit partners. Potential applications include air quality warnings, flight campaign planning and exposure studies using the archived analysis fields.

  9. The definitive analysis of the Bendandi's methodology performed with a specific software

    NASA Astrophysics Data System (ADS)

    Ballabene, Adriano; Pescerelli Lagorio, Paola; Georgiadis, Teodoro

    2015-04-01

    The presentation aims to clarify the "Bendandi method", supposed in the past to be able to forecast earthquakes and never explicitly explained to posterity by the geophysicist from Faenza. The geoethical implications of Bendandi's forecasts, and of the speculation about possible earthquakes inferred from supposed "Bendandian" methodologies, surfaced in previous years through social alarms over predicted earthquakes that never happened but were widely publicised by the media, following some 'well informed' non-conventional scientists. The analysis was conducted through an extensive search of the 'Raffaele Bendandi' archive at the Geophysical Observatory of Faenza, and the forecasts were analysed using specially developed software, called the "Bendandiano Dashboard", which can reproduce the planetary configurations reported in the graphs made by the Italian geophysicist. This analysis should serve to clarify definitively the basis of Bendandi's calculations, as well as to prevent future unwarranted warnings issued on the basis of supposed prophecies and illusory legacy documents.

  10. Model documentation, Coal Market Module of the National Energy Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This report documents the objectives and the conceptual and methodological approach used in the development of the National Energy Modeling System's (NEMS) Coal Market Module (CMM) used to develop the Annual Energy Outlook 1998 (AEO98). This report catalogues and describes the assumptions, methodology, estimation techniques, and source code of CMM's two submodules. These are the Coal Production Submodule (CPS) and the Coal Distribution Submodule (CDS). CMM provides annual forecasts of prices, production, and consumption of coal for NEMS. In general, the CDS integrates the supply inputs from the CPS to satisfy demands for coal from exogenous demand models. The international area of the CDS forecasts annual world coal trade flows from major supply to major demand regions and provides annual forecasts of US coal exports for input to NEMS. Specifically, the CDS receives minemouth prices produced by the CPS, demand and other exogenous inputs from other NEMS components, and provides delivered coal prices and quantities to the NEMS economic sectors and regions.

  11. Comparative Analysis of NOAA REFM and SNB3GEO Tools for the Forecast of the Fluxes of High-Energy Electrons at GEO

    NASA Technical Reports Server (NTRS)

    Balikhin, M. A.; Rodriguez, J. V.; Boynton, R. J.; Walker, S. N.; Aryan, Homayon; Sibeck, D. G.; Billings, S. A.

    2016-01-01

    Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field B(sub z) observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.
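
    The Heidke skill score used in both studies above measures categorical forecast accuracy relative to random chance. A minimal implementation for the 2x2 (event/no-event) case, using the standard contingency-table formula, might look like this (illustrative only, not the authors' verification code):

```python
def heidke_skill_score(hits, false_alarms, misses, correct_negatives):
    """Heidke skill score for a 2x2 contingency table:
    1 = perfect forecast, 0 = no skill beyond chance, negative = worse
    than chance. Uses the standard formula
    HSS = 2(ad - bc) / [(a+c)(c+d) + (a+b)(b+d)]."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    return 2.0 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
```

    Because it discounts correct forecasts that would occur by chance, the score is well suited to rare, high-flux events, where naive accuracy is dominated by correct negatives.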

  12. Comparative analysis of NOAA REFM and SNB3GEO tools for the forecast of the fluxes of high-energy electrons at GEO.

    PubMed

    Balikhin, M A; Rodriguez, J V; Boynton, R J; Walker, S N; Aryan, H; Sibeck, D G; Billings, S A

    2016-01-01

    Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field Bz observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.

  13. Use of Data to Improve Seasonal-to-Interannual Forecasts Simulated by Intermediate Coupled Models

    NASA Technical Reports Server (NTRS)

    Perigaud, C.; Cassou, C.; Dewitte, B.; Fu, L-L.; Neelin, J.

    1999-01-01

    This paper provides a detailed illustration that it can be much more beneficial for ENSO forecasting to use data to improve the model parameterizations rather than to modify the initial conditions to gain in consistency with the simulated coupled system.

  14. Monthly forecasting of agricultural pests in Switzerland

    NASA Astrophysics Data System (ADS)

    Hirschi, M.; Dubrovsky, M.; Spirig, C.; Samietz, J.; Calanca, P.; Weigel, A. P.; Fischer, A. M.; Rotach, M. W.

    2012-04-01

    Given the repercussions of pests and diseases on agricultural production, detailed forecasting tools have been developed to simulate the degree of infestation depending on actual weather conditions. The life cycle of pests is most successfully predicted if the micro-climate of the immediate environment (habitat) of the causative organisms can be simulated. Sub-seasonal pest forecasts therefore require weather information for the relevant habitats and the appropriate time scale. The pest forecasting system SOPRA (www.sopra.info) currently in operation in Switzerland relies on such detailed weather information, using hourly weather observations up to the day the forecast is issued, but only a climatology for the forecasting period. Here, we aim at improving the skill of SOPRA forecasts by transforming the weekly information provided by ECMWF monthly forecasts (MOFCs) into hourly weather series as required for the prediction of upcoming life phases of the codling moth, the major insect pest in apple orchards worldwide. Due to the probabilistic nature of operational monthly forecasts and the limited spatial and temporal resolution, their information needs to be post-processed for use in a pest model. In this study, we developed a statistical downscaling approach for MOFCs that includes the following steps: (i) application of a stochastic weather generator to generate a large pool of daily weather series consistent with the climate at a specific location, (ii) a subsequent re-sampling of weather series from this pool to optimally represent the evolution of the weekly MOFC anomalies, and (iii) a final extension to hourly weather series suitable for the pest forecasting model. Results show a clear improvement in the forecast skill of occurrences of upcoming codling moth life phases when incorporating MOFCs as compared to the operational pest forecasting system. 
This is true both in terms of root mean squared errors and of the continuous rank probability scores of the probabilistic forecasts vs. the mean absolute errors of the deterministic system. Also, the application of the climate conserving recalibration (CCR, Weigel et al. 2009) technique allows for successful correction of the under-confidence in the forecasted occurrences of codling moth life phases. Reference: Weigel, A. P.; Liniger, M. A. & Appenzeller, C. (2009). Seasonal Ensemble Forecasts: Are Recalibrated Single Models Better than Multimodels? Mon. Wea. Rev., 137, 1460-1479.
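
    The continuous ranked probability score used above to verify the probabilistic forecasts has a simple empirical form for a finite ensemble: the mean absolute error of the members minus half the mean pairwise member spread. The sketch below is an illustrative implementation of that standard estimator, not the authors' verification code:

```python
import numpy as np

def crps_ensemble(members, obs):
    """Empirical CRPS for an ensemble forecast: lower is better, and
    for a single-member 'ensemble' it reduces to the absolute error,
    which is what makes CRPS comparable to the MAE of a deterministic
    forecast."""
    x = np.asarray(members, dtype=float)
    term1 = np.abs(x - obs).mean()                      # member error
    term2 = 0.5 * np.abs(x[:, None] - x[None, :]).mean()  # member spread
    return term1 - term2
```

    The spread term rewards sharp, well-calibrated ensembles, which is why CCR-style recalibration of an under-confident ensemble improves this score.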

  15. Forecasting Effects of MISO Actions: An ABM Methodology

    DTIC Science & Technology

    2013-12-01

    process of opinion change for polio vaccination in Uttar Pradesh, India. His analysis combines word-of-mouth and mass media broadcasting for agent...this abstraction is an appropriate comparison between treatments, rather than actual forecasting of specific levels of rebellion or anti-government...effect of breadth upon grievance. There is insufficient evidence to show that this effect differs between treatments. Breadth and campaign type

  16. Determining and Forecasting Savings from Competing Previously Sole Source/Noncompetitive Contracts

    DTIC Science & Technology

    1978-10-01

    SUMMARY A. BACKGROUND. Within the defense market, it is difficult to isolate, identify and quantify the impact of competition on acquisition costs...63 C. FORECASTING METHODOLOGY .................. 7 D. COMPETITION INDEX ........................ 77 E. USE AS A FORECASTING TOOL... program is still active. e. From this projection, calculate the actual total contract price commencing with the buy-out competition by multiplying the

  17. A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events

    NASA Astrophysics Data System (ADS)

    Kholodovsky, V.

    2017-12-01

    Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often independently used to model those phenomena. To better assess the performance of climate models, a variety of spatial forecast verification methods have been developed. However, the spatial verification metrics widely used to compare mean states in most cases lack an adequate theoretical justification for benchmarking extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying threshold distributions seasonally, monthly and annually. The method offers users the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.

  18. Computer-Aided Analysis of Patents for Product Technology Maturity Forecasting

    NASA Astrophysics Data System (ADS)

    Liang, Yanhong; Gan, Dequan; Guo, Yingchun; Zhang, Peng

    Product technology maturity forecasting is vital for any enterprise seeking to seize opportunities for innovation and remain competitive over the long term. The Theory of Inventive Problem Solving (TRIZ) is acknowledged both as a systematic methodology for innovation and as a powerful tool for technology forecasting. Based on TRIZ, the state of the art in assessing the technology maturity of products and its limits of application are discussed. Applying text mining and patent analysis technologies, this paper proposes a computer-aided approach to product technology maturity forecasting that can overcome the shortcomings of current methods.

  19. Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott

    1995-01-01

    This methodology serves to define a system for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.
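
    Numerically, a QFD-style prioritization matrix of this kind reduces to weighted sums of candidate ratings against the criteria (environment, cost, safety, reliability, programmatics). The toy sketch below illustrates that reduction; the criterion weights, rating scale, and candidate names are invented for illustration, not NASA's actual matrix:

```python
def prioritize(weights, ratings):
    """Rank candidates by weighted sum of their criterion ratings,
    highest priority first (semiquantitative QFD-style scoring)."""
    scores = {name: sum(w * r for w, r in zip(weights, rs))
              for name, rs in ratings.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

    For example, with criterion weights [5, 3, 1] a candidate rated highly on the heaviest criterion outranks one rated highly on a lighter criterion, which is exactly the trade-off the QFD matrix is meant to make explicit.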

  20. 'Emerging technologies for the changing global market' - Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt

    1993-01-01

    This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.

  1. Real-time forecasting of the April 11, 2012 Sumatra tsunami

    USGS Publications Warehouse

    Wang, Dailin; Becker, Nathan C.; Walsh, David; Fryer, Gerard J.; Weinstein, Stuart A.; McCreery, Charles S.

    2012-01-01

    The April 11, 2012, magnitude 8.6 earthquake off the northern coast of Sumatra generated a tsunami that was recorded at sea-level stations as far as 4800 km from the epicenter and at four ocean bottom pressure sensors (DARTs) in the Indian Ocean. The governments of India, Indonesia, Sri Lanka, Thailand, and Maldives issued tsunami warnings for their coastlines. The United States' Pacific Tsunami Warning Center (PTWC) issued an Indian Ocean-wide Tsunami Watch Bulletin in its role as an Interim Service Provider for the region. Using an experimental real-time tsunami forecast model (RIFT), PTWC produced a series of tsunami forecasts during the event that were based on rapidly derived earthquake parameters, including initial location and Mwp magnitude estimates and the W-phase centroid moment tensor solutions (W-phase CMTs) obtained at PTWC and at the U. S. Geological Survey (USGS). We discuss the real-time forecast methodology and how successive, real-time tsunami forecasts using the latest W-phase CMT solutions improved the accuracy of the forecast.

  2. Forecasting inundation from debris flows that grow during travel, with application to the Oregon Coast Range, USA

    USGS Publications Warehouse

    Reid, Mark E.; Coe, Jeffrey A.; Brien, Dianne

    2016-01-01

    Many debris flows increase in volume as they travel downstream, enhancing their mobility and hazard. Volumetric growth can result from diverse physical processes, such as channel sediment entrainment, stream bank collapse, adjacent landsliding, hillslope erosion and rilling, and coalescence of multiple debris flows; incorporating these varied phenomena into physics-based debris-flow models is challenging. As an alternative, we embedded effects of debris-flow growth into an empirical/statistical approach to forecast potential inundation areas within digital landscapes in a GIS framework. Our approach used an empirical debris-growth function to account for the effects of growth phenomena. We applied this methodology to a debris-flow-prone area in the Oregon Coast Range, USA, where detailed mapping revealed areas of erosion and deposition along paths of debris flows that occurred during a large storm in 1996. Erosion was predominant in stream channels with slopes > 5°. Using pre- and post-event aerial photography, we derived upslope contributing area and channel-length growth factors. Our method reproduced the observed inundation patterns produced by individual debris flows; it also generated reproducible, objective potential inundation maps for entire drainage networks. These maps better matched observations than those using previous methods that focus on proximal or distal regions of a drainage network.

  3. Forecasting Chikungunya spread in the Americas via data-driven empirical approaches.

    PubMed

    Escobar, Luis E; Qiao, Huijie; Peterson, A Townsend

    2016-02-29

    Chikungunya virus (CHIKV) is endemic to Africa and Asia, but the Asian genotype invaded the Americas in 2013. The fast increase of human infections in the American epidemic emphasized the urgency of developing detailed predictions of case numbers and the potential geographic spread of this disease. We developed a simple model incorporating cases generated locally and cases imported from other countries, and forecasted transmission hotspots at the level of countries and at finer scales, in terms of ecological features. By late January 2015, >1.2 M CHIKV cases were reported from the Americas, with country-level prevalences between nil and more than 20 %. In the early stages of the epidemic, exponential growth in case numbers was common; later, however, poor and uneven reporting became more common, in a phenomenon we term "surveillance fatigue." Economic activity of countries was not associated with prevalence, but diverse social factors may be linked to surveillance effort and reporting. Our model predictions were initially quite inaccurate, but improved markedly as more data accumulated within the Americas. The data-driven methodology explored in this study provides an opportunity to generate descriptive and predictive information on spread of emerging diseases in the short-term under simple models based on open-access tools and data that can inform early-warning systems and public health intelligence.
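
    A "locally generated plus imported cases" model of the kind described can be caricatured as a linear recursion: each step, existing cases seed a proportional number of new local cases and imported cases are added. The following is an illustrative toy with hypothetical parameter names, not the authors' fitted model:

```python
def forecast_cases(initial_cases, imported, r_local, steps):
    """Iterate cases_{t+1} = r_local * cases_t + imports_t.
    `imported` lists per-step imported cases; steps beyond its length
    assume no further importation."""
    cases = [float(initial_cases)]
    for t in range(steps):
        inflow = imported[t] if t < len(imported) else 0.0
        cases.append(r_local * cases[-1] + inflow)
    return cases
```

    With r_local > 1 the local term produces the exponential early-epidemic growth the abstract describes, while the import term lets transmission start in a country before any local chain exists.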

  4. Improving of local ozone forecasting by integrated models.

    PubMed

    Gradišar, Dejan; Grašič, Boštjan; Božnar, Marija Zlata; Mlakar, Primož; Kocijan, Juš

    2016-09-01

    This paper discusses the problem of forecasting maximum ozone concentrations at urban microlocations, where reliable alerting of the local population when thresholds are surpassed is necessary. To improve the forecast, a methodology of integrated models is proposed. The model is based on multilayer perceptron neural networks that take as inputs all available information from the QualeAria air-quality model, the WRF numerical weather prediction model, and onsite measurements of meteorology and air pollution. While air-quality and meteorological models cover a large geographical 3-dimensional space, their local resolution is often not satisfactory. On the other hand, empirical methods have the advantage of good local forecasts. In this paper, integrated models are used for improved 1-day-ahead forecasting of the maximum hourly ozone value within each day for representative locations in Slovenia. The WRF meteorological model is used for forecasting meteorological variables and the QualeAria air-quality model for gas concentrations. Their predictions, together with measurements from ground stations, are used as inputs to a neural network. The model validation results show that integrated models noticeably improve ozone forecasts and provide better alert systems.

  5. Forecasting Dust Storms Using the CARMA-Dust Model and MM5 Weather Data

    NASA Astrophysics Data System (ADS)

    Barnum, B. H.; Winstead, N. S.; Wesely, J.; Hakola, A.; Colarco, P.; Toon, O. B.; Ginoux, P.; Brooks, G.; Hasselbarth, L. M.; Toth, B.; Sterner, R.

    2002-12-01

    An operational model for the forecast of dust storms in Northern Africa, the Middle East, and Southwest Asia has been developed for the United States Air Force Weather Agency (AFWA). The dust forecast model uses the 5th-generation Penn State Mesoscale Meteorology Model (MM5) and a modified version of the Colorado Aerosol and Radiation Model for Atmospheres (CARMA). AFWA conducted a 60-day evaluation of the dust model to assess its ability to forecast dust storms for short-, medium-, and long-range (72-hour) forecast periods. The study used satellite and ground observations of dust storms to verify the model's effectiveness. Each of the main mesoscale forecast theaters was broken down into smaller sub-regions for detailed analysis. The study found the forecast model was able to forecast dust storms in Saharan Africa and the Sahel region with an average Probability of Detection (POD) exceeding 68%, with a 16% False Alarm Rate (FAR). The Southwest Asian theater had average PODs of 61%, with FARs averaging 10%.
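    The POD and FAR statistics quoted above come from a standard 2x2 forecast/observation contingency table. A minimal sketch, with hypothetical event counts chosen to reproduce the Saharan figures, is:

```python
# Standard categorical verification scores from a 2x2 contingency table:
#   hits         = event forecast and observed
#   misses       = event observed but not forecast
#   false_alarms = event forecast but not observed

def pod_far(hits, misses, false_alarms):
    """POD = hits / (hits + misses); FAR = false_alarms / (hits + false_alarms)."""
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far

# Hypothetical counts for one sub-region over a 60-day evaluation:
pod, far = pod_far(hits=68, misses=32, false_alarms=13)
print(f"POD = {pod:.0%}, FAR = {far:.0%}")  # prints: POD = 68%, FAR = 16%
```

Note that some literature defines FAR relative to all forecasts of the event (as here) while the closely related false alarm *ratio* vs. *rate* distinction varies by author; the study's exact convention is not stated in the abstract.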

  6. Soil thermal dynamics, snow cover, and frozen depth under five temperature treatments in an ombrotrophic bog: Constrained forecast with data assimilation: Forecast With Data Assimilation

    DOE PAGES

    Huang, Yuanyuan; Jiang, Jiang; Ma, Shuang; ...

    2017-08-18

    We report that accurate simulation of soil thermal dynamics is essential for realistic prediction of soil biogeochemical responses to climate change. To facilitate ecological forecasting at the Spruce and Peatland Responses Under Climatic and Environmental change site, we incorporated a soil temperature module into the Terrestrial ECOsystem (TECO) model by accounting for the surface energy budget, snow dynamics, and heat transfer among soil layers and during freeze-thaw events. We conditioned TECO with detailed soil temperature and snow depth observations through data assimilation before the model was used for forecasting. The constrained model reproduced variations in observed temperature from different soil layers, the magnitude of snow depth, the timing of snowfall and snowmelt, and the range of frozen depth. The conditioned TECO forecasted probabilistic distributions of soil temperature dynamics in six soil layers, snow, and frozen depths under temperature treatments of +0.0, +2.25, +4.5, +6.75, and +9.0°C. Air warming caused stronger elevation of soil temperature during summer than winter due to winter snow and ice, and soil temperature increased more in shallow soil layers in summer in response to air warming. Whole-ecosystem warming (peat + air warming) generally reduced snow and frozen depths. The accuracy of forecasted snow and frozen depths relied on the precision of the weather forcing: uncertainty is smaller for forecasting soil temperature but large for snow and frozen depths. Finally, timely and effective soil thermal forecasts, constrained through data assimilation that combines process-based understanding and detailed observations, provide boundary conditions for better predictions of future biogeochemical cycles.

  8. Statistical security for Social Security.

    PubMed

    Soneji, Samir; King, Gary

    2012-08-01

    The financial viability of Social Security, the single largest U.S. government program, depends on accurate forecasts of the solvency of its intergenerational trust fund. We begin by detailing information necessary for replicating the Social Security Administration's (SSA's) forecasting procedures, which until now has been unavailable in the public domain. We then offer a way to improve the quality of these procedures via age- and sex-specific mortality forecasts. The most recent SSA mortality forecasts were based on the best available technology at the time, which was a combination of linear extrapolation and qualitative judgments. Unfortunately, linear extrapolation excludes known risk factors and is inconsistent with long-standing demographic patterns, such as the smoothness of age profiles. Modern statistical methods typically outperform even the best qualitative judgments in these contexts. We show how to use such methods, enabling researchers to forecast using far more information, such as the known risk factors of smoking and obesity and known demographic patterns. Including this extra information makes a substantial difference. For example, by improving only mortality forecasting methods, we predict three fewer years of net surplus, $730 billion less in Social Security Trust Funds, and program costs that are 0.66% greater for projected taxable payroll by 2031 compared with SSA projections. More important than specific numerical estimates are the advantages of transparency, replicability, reduction of uncertainty, and what may be the resulting lower vulnerability to the politicization of program forecasts. In addition, by offering with this article software and detailed replication information, we hope to marshal the efforts of the research community to include ever more informative inputs and to continue to reduce uncertainties in Social Security forecasts.

  9. Network bandwidth utilization forecast model on high bandwidth networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wuchert; Sim, Alex

    With the increasing number of geographically distributed scientific collaborations and the growing scale of data sizes, it has become more challenging for users to achieve the best possible network performance on a shared network. We have developed a forecast model to predict expected bandwidth utilization for high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate the ever-increasing data volume of large-scale scientific data applications. A univariate model is developed with STL and ARIMA on SNMP path utilization data. Compared with a traditional approach such as the Box-Jenkins methodology, our forecast model reduces computation time by 83.2%. It also shows resilience against abrupt changes in network usage. The accuracy of the forecast model is within the standard deviation of the monitored measurements.
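    As a loose illustration of the univariate time-series component described above (the paper's actual pipeline combines STL decomposition with ARIMA, neither of which is reproduced here), an AR(1) model fitted by ordinary least squares gives a one-step-ahead forecast from a synthetic utilization series:

```python
# Hedged sketch: one-step-ahead forecast from an AR(1) model fitted by OLS.
# The utilization values are synthetic; a real system would fit STL + ARIMA
# (e.g. via statsmodels) to SNMP path-utilization data.

def fit_ar1(series):
    """Fit x[t] = a + b * x[t-1] by least squares; return (a, b)."""
    x_prev = series[:-1]
    x_next = series[1:]
    n = len(x_prev)
    mean_p = sum(x_prev) / n
    mean_n = sum(x_next) / n
    cov = sum((p - mean_p) * (q - mean_n) for p, q in zip(x_prev, x_next))
    var = sum((p - mean_p) ** 2 for p in x_prev)
    b = cov / var
    a = mean_n - b * mean_p
    return a, b

def forecast_next(series):
    """One-step-ahead AR(1) forecast from the last observation."""
    a, b = fit_ar1(series)
    return a + b * series[-1]

# Synthetic path-utilization series (fractions of link capacity):
util = [0.42, 0.45, 0.44, 0.48, 0.51, 0.50, 0.53, 0.55]
print(round(forecast_next(util), 3))
```

An AR(1) is the simplest ARIMA(1,0,0) special case; the point is only to show the fit-then-extrapolate structure, not to match the paper's accuracy or its 83.2% speedup claim.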

  11. Insights on the impact of systematic model errors on data assimilation performance in changing catchments

    NASA Astrophysics Data System (ADS)

    Pathiraja, S.; Anghileri, D.; Burlando, P.; Sharma, A.; Marshall, L.; Moradkhani, H.

    2018-03-01

    The global prevalence of rapid and extensive land use change necessitates hydrologic modelling methodologies capable of handling non-stationarity. This is particularly true in the context of Hydrologic Forecasting using Data Assimilation. Data Assimilation has been shown to dramatically improve forecast skill in hydrologic and meteorological applications, although such improvements are conditional on using bias-free observations and model simulations. A hydrologic model calibrated to a particular set of land cover conditions has the potential to produce biased simulations when the catchment is disturbed. This paper sheds new light on the impacts of bias or systematic errors in hydrologic data assimilation, in the context of forecasting in catchments with changing land surface conditions and a model calibrated to pre-change conditions. We posit that in such cases, the impact of systematic model errors on assimilation or forecast quality is dependent on the inherent prediction uncertainty that persists even in pre-change conditions. Through experiments on a range of catchments, we develop a conceptual relationship between total prediction uncertainty and the impacts of land cover changes on the hydrologic regime to demonstrate how forecast quality is affected when using state estimation Data Assimilation with no modifications to account for land cover changes. This work shows that systematic model errors as a result of changing or changed catchment conditions do not always necessitate adjustments to the modelling or assimilation methodology, for instance through re-calibration of the hydrologic model, time varying model parameters or revised offline/online bias estimation.

  12. Parameter estimation and forecasting for multiplicative log-normal cascades

    NASA Astrophysics Data System (ADS)

    Leövey, Andrés E.; Lux, Thomas

    2012-04-01

    We study the well-known multiplicative log-normal cascade process in which the multiplication of Gaussian and log-normally distributed random variables yields time series with intermittent bursts of activity. Due to the nonstationarity of this process and the combinatorial nature of such a formalism, its parameters have mostly been estimated by fitting the numerical approximation of the associated non-Gaussian probability density function to empirical data, cf. Castaing [Physica D 46, 177 (1990)]. More recently, alternative estimators based upon various moments have been proposed by Beck [Physica D 193, 195 (2004)] and Kiyono [Phys. Rev. E 76, 041113 (2007)]. In this paper, we pursue this moment-based approach further and develop a more rigorous generalized method of moments (GMM) estimation procedure to cope with the documented difficulties of previous methodologies. We show that even under uncertainty about the actual number of cascade steps, our methodology yields very reliable results for the estimated intermittency parameter. Employing the Levinson-Durbin algorithm for best linear forecasts, we also show that the estimated parameters can be used for forecasting the evolution of the turbulent flow. We compare forecasting results from the GMM and Kiyono's procedures via Monte Carlo simulations. We finally test the applicability of our approach by estimating the intermittency parameter and forecasting volatility for a sample of financial data from stock and foreign exchange markets.

  13. Developing a robust methodology for assessing the value of weather/climate services

    NASA Astrophysics Data System (ADS)

    Krijnen, Justin; Golding, Nicola; Buontempo, Carlo

    2016-04-01

    Increasingly, scientists involved in providing weather and climate services are expected to demonstrate the value of their work for end users in order to justify the costs of developing and delivering these services. This talk will outline different approaches that can be used to assess the socio-economic benefits of weather and climate services, including, among others, willingness to pay and avoided costs. The advantages and limitations of these methods will be discussed and relevant case-studies will be used to illustrate each approach. The choice of valuation method may be influenced by different factors, such as resource and time constraints and the end purposes of the study. In addition, there are important methodological differences which will affect the value assessed. For instance the ultimate value of a weather/climate forecast to a decision-maker will not only depend on forecast accuracy but also on other factors, such as how the forecast is communicated to and consequently interpreted by the end-user. Thus, excluding these additional factors may result in inaccurate socio-economic value estimates. In order to reduce the inaccuracies in this valuation process we propose an approach that assesses how the initial weather/climate forecast information can be incorporated within the value chain of a given sector, taking into account value gains and losses at each stage of the delivery process. By this we aim to more accurately depict the socio-economic benefits of a weather/climate forecast to decision-makers.

  14. Climate Prediction Sees Future Despite Chaos: Researchers Outside NASA use NCCS Resources for Studies

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The air on this mostly sunny January day is crisp and the wind is blustery. The morning's National Weather Service 6-hour forecast had accurately predicted these conditions for the Baltimore-Washington area, and the 2-3 day extended outlook was almost perfect. The previous week, the National Centers for Environmental Prediction (NCEP) 6-10 day temperature and precipitation outlook for the general trends of the region was correct as well. However, no forecast could have predicted specific details about this day: 28.5°F in sunshine bright enough for dark sunglasses, and wind strong enough to blow off a hat. Such details are impossible to foresee with any accuracy and are outside the scope of routine weather prediction. Equally difficult is accurately forecasting weather beyond about 2 weeks.

  15. Improvements in approaches to forecasting and evaluation techniques

    NASA Astrophysics Data System (ADS)

    Weatherhead, Elizabeth

    2014-05-01

    The US is embarking on an experiment to make significant and sustained improvements in weather forecasting. The effort stems from a series of community conversations that recognized the rapid advancements in observations, modeling and computing techniques in the academic, governmental and private sectors. The new directions and initial efforts will be summarized, including information on possibilities for international collaboration. Most new projects are scheduled to start in the last half of 2014. Several advancements include ensemble forecasting with global models, and new sharing of computing resources. Newly developed techniques for evaluating weather forecast models will be presented in detail. The approaches use statistical techniques that incorporate pair-wise comparisons of forecasts with observations and account for daily auto-correlation to assess appropriate uncertainty in forecast changes. Some of the new projects allow for international collaboration, particularly on the research components of the projects.

  16. Development of a Neural Network-Based Renewable Energy Forecasting Framework for Process Industries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Soobin; Ryu, Jun-Hyung; Hodge, Bri-Mathias

    2016-06-25

    This paper presents a neural network-based forecasting framework for photovoltaic power (PV) generation as a decision-supporting tool to employ renewable energies in the process industry. The applicability of the proposed framework is illustrated by comparing its performance against other methodologies such as linear and nonlinear time series modelling approaches. A case study of an actual PV power plant in South Korea is presented.

  17. QPF verification using different radar-based analyses: a case study

    NASA Astrophysics Data System (ADS)

    Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.

    2009-09-01

    Verification of QPF in NWP models has always been challenging, not only for deciding which scores best quantify a particular skill of a model but also for choosing the most appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can provide conclusions that are not in agreement with those obtained by the "eyeball" method; consequently, QPF can provide valuable information to forecasters in spite of poor scores. On the other hand, there are difficulties in knowing the "truth," so different results can be achieved depending on the procedures used to obtain the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain better knowledge of the skills of a forecasting system. In particular, a short-range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied for a local convective precipitation event that took place in the NE Iberian Peninsula on 3 October 2008. For this purpose, a variety of verification methods (dichotomous, recalibration, and object-oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.

  18. Forecasting influenza in Hong Kong with Google search queries and statistical model fusion.

    PubMed

    Xu, Qinneng; Gel, Yulia R; Ramirez Ramirez, L Leticia; Nezafati, Kusha; Zhang, Qingpeng; Tsui, Kwok-Leung

    2017-01-01

    The objective of this study is to investigate the predictive utility of online social media and web search queries, particularly Google search data, to forecast new cases of influenza-like illness (ILI) in general outpatient clinics (GOPC) in Hong Kong. To mitigate the impact of sensitivity to self-excitement (i.e., fickle media interest) and other artifacts of online social media data, our approach fuses multiple offline and online data sources. Four individual models: generalized linear model (GLM), least absolute shrinkage and selection operator (LASSO), autoregressive integrated moving average (ARIMA), and deep learning (DL) with feedforward neural networks (FNN), are employed to forecast ILI-GOPC both one week and two weeks in advance. The covariates include Google search queries, meteorological data, and previously recorded offline ILI. To our knowledge, this is the first study that introduces deep learning methodology into the surveillance of infectious diseases and investigates its predictive utility. Furthermore, to exploit the strengths of the individual forecasting models, we use statistical model fusion via Bayesian model averaging (BMA), which allows a systematic integration of multiple forecast scenarios. For each model, an adaptive approach is used to capture the recent relationship between ILI and the covariates. DL with FNN appears to deliver the most competitive predictive performance among the four individual models considered. Combining all four models in a comprehensive BMA framework further improves such predictive evaluation metrics as root mean squared error (RMSE) and mean absolute predictive error (MAPE). Nevertheless, DL with FNN remains the preferred method for predicting the locations of influenza peaks. The proposed approach can be viewed as a feasible alternative for forecasting ILI in Hong Kong or other countries where ILI has no constant seasonal trend and influenza data resources are limited. The proposed methodology is easily tractable and computationally efficient.
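    The model-fusion step can be sketched as a weighted average of the individual forecasts. True BMA weights each model by its posterior probability; the error-based exponential weighting below is a simplified stand-in, and all forecast values and error figures are hypothetical:

```python
# Simplified stand-in for BMA-style forecast fusion: models with lower
# historical error receive exponentially larger weights. Not the paper's
# actual BMA implementation; all numbers are illustrative.
import math

def fusion_weights(train_errors):
    """Turn per-model mean squared errors into normalized weights
    (lower error -> higher weight)."""
    scores = [math.exp(-0.5 * e) for e in train_errors]
    total = sum(scores)
    return [s / total for s in scores]

def fused_forecast(forecasts, weights):
    """Convex combination of the individual model forecasts."""
    return sum(w * f for w, f in zip(weights, forecasts))

# Hypothetical one-week-ahead ILI forecasts from GLM, LASSO, ARIMA, DL:
forecasts = [120.0, 135.0, 128.0, 131.0]
train_mse = [4.0, 2.5, 3.0, 1.5]  # assumed historical errors per model
w = fusion_weights(train_mse)
print([round(x, 3) for x in w], round(fused_forecast(forecasts, w), 1))
```

Because the weights are normalized and non-negative, the fused value always lies within the span of the individual forecasts, which is one reason combination methods tend to be robust to any single model's failure.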

  19. Air Quality Forecasts Using the NASA GEOS Model: A Unified Tool from Local to Global Scales

    NASA Technical Reports Server (NTRS)

    Knowland, E. Emma; Keller, Christoph; Nielsen, J. Eric; Orbe, Clara; Ott, Lesley; Pawson, Steven; Saunders, Emily; Duncan, Bryan; Cook, Melanie; Liu, Junhua

    2017-01-01

    We provide an introduction to a new high-resolution (0.25 degree) global composition forecast produced by NASA's Global Modeling and Assimilation Office. The NASA Goddard Earth Observing System version 5 (GEOS-5) model has been expanded to provide global near-real-time forecasts of atmospheric composition at a horizontal resolution of 0.25 degrees (approximately 25 km). Previously, this combination of detailed chemistry and resolution was only provided by regional models. The system combines the operational GEOS-5 weather forecasting model with the state-of-the-science GEOS-Chem chemistry module (version 11) to provide detailed chemical analysis of a wide range of air pollutants such as ozone, carbon monoxide, nitrogen oxides, and fine particulate matter (PM2.5). The forecasts have the highest resolution of any current, publicly available global composition forecast. Evaluation and validation of modeled trace gases and aerosols against surface and satellite observations will be presented for constituents relevant to health-based air quality standards. Comparisons of modeled trace gases and aerosols against satellite observations show that the model produces realistic concentrations of atmospheric constituents in the free troposphere. Model comparisons against surface observations highlight the model's capability to capture the diurnal variability of air pollutants under a variety of meteorological conditions. The GEOS-5 composition forecasting system offers a new tool for scientists and the public health community, and is being developed jointly with several government and non-profit partners. Potential applications include air quality warnings, flight campaign planning, and exposure studies using the archived analysis fields.

  20. Why didn't Box-Jenkins win (again)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pack, D.J.; Downing, D.J.

    This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time-series length, (2) time-series information (autocorrelation) content, (3) time-series outliers or structural changes, (4) averaging results over time series, and (5) forecast time origin choice. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time-series extrapolations will frequently fail regardless of the methodology employed to produce them.

  1. Novel Methods in Disease Biogeography: A Case Study with Heterosporosis

    PubMed Central

    Escobar, Luis E.; Qiao, Huijie; Lee, Christine; Phelps, Nicholas B. D.

    2017-01-01

    Disease biogeography is currently a promising field to complement epidemiology, and ecological niche modeling theory and methods are a key component. Applying the concepts and tools of ecological niche modeling to disease biogeography and epidemiology will therefore provide biologically sound and analytically robust descriptive and predictive analyses of disease distributions. As a case study, we explored the ecologically important fish disease heterosporosis, a relatively poorly understood disease caused by the intracellular microsporidian parasite Heterosporis sutherlandae. We explored two novel ecological niche modeling methods, the minimum-volume ellipsoid (MVE) and the Marble algorithm, which were used to reconstruct the fundamental and the realized ecological niche of H. sutherlandae, respectively. Additionally, we assessed how the management of occurrence reports can impact the output of the models. Ecological niche models were able to reconstruct a proxy of the fundamental and realized niche for this aquatic parasite, identifying specific areas suitable for heterosporosis. We found that the conceptual and methodological advances in ecological niche modeling provide accessible tools to update the current practices of spatial epidemiology. However, careful data curation and a detailed understanding of the algorithm employed are critical for a clear definition of the assumptions implicit in the modeling process and to ensure biologically sound forecasts. In this paper, we show how sensitive the MVE is to the input data, while the Marble algorithm may provide detailed forecasts with a minimum of parameters. We showed that exploring algorithms of different natures, such as environmental clusters, climatic envelopes, and logistic regressions (e.g., Marble, MVE, and Maxent), provides different scenarios of potential distribution. Thus, no single algorithm should be used for disease mapping. Instead, different algorithms should be employed for a more informed and complete understanding of the pathogen or parasite in question. PMID:28770215

  2. The role of futures forecasts in recreation: some applications in the third nationwide outdoor recreation plan

    Treesearch

    Meg Maguire; Dana R. Younger

    1980-01-01

    This paper provides a quick glimpse into the theoretical applicability and importance of futures forecasting techniques in recreation policy planning. The paper also details contemporary socioeconomic trends affecting recreation, current recreation participation patterns and anticipated social changes which will alter public recreation experiences as developed in the...

  3. Requirement definition of passenger motor transport enterprises for spare parts by method of short-term combined forecasting

    NASA Astrophysics Data System (ADS)

    Bulatov, S. V.

    2018-05-01

    The article considers a method of short-term combined forecasting that includes theoretical and experimental estimates of the need for parts for units and assemblies, which allows obtaining the optimal number of spare parts necessary for rolling stock operation without downtime in repair areas.

  4. Ensemble forecasting for renewable energy applications - status and current challenges for their generation and verification

    NASA Astrophysics Data System (ADS)

    Pinson, Pierre

    2016-04-01

    The operational management of renewable energy generation in power systems and electricity markets requires forecasts in various forms, e.g., deterministic or probabilistic, continuous or categorical, depending upon the decision process at hand. Such forecasts may also be necessary at various spatial and temporal scales, from high temporal resolutions (on the order of minutes) and very localized extents for an offshore wind farm, to coarser temporal resolutions (hours) covering a whole country for day-ahead power scheduling problems. As of today, weather predictions are a common input to forecasting methodologies for renewable energy generation. Since for most decision processes optimal decisions can only be made by accounting for forecast uncertainties, ensemble predictions and density forecasts are increasingly seen as the product of choice. After discussing some of the basic approaches to obtaining ensemble forecasts of renewable power generation, it will be argued that generating space-time trajectories of renewable power production may or may not necessitate post-processing of ensemble forecasts for the relevant weather variables. Example approaches and test-case applications will be covered, e.g., the Horns Rev offshore wind farm in Denmark and gridded forecasts for the whole of continental Europe. Eventually, we will illustrate some of the limitations of current frameworks for forecast verification, which make it difficult to fully assess the quality of post-processing approaches used to obtain renewable energy predictions.

  5. Reliable estimates of predictive uncertainty for an Alpine catchment using a non-parametric methodology

    NASA Astrophysics Data System (ADS)

    Matos, José P.; Schaefli, Bettina; Schleiss, Anton J.

    2017-04-01

    Uncertainty affects hydrological modelling efforts from the very measurements (or forecasts) that serve as inputs to the more or less inaccurate predictions that are produced. Uncertainty is truly inescapable in hydrology and yet, due to the theoretical and technical hurdles associated with its quantification, it is at times still neglected or estimated only qualitatively. In recent years the scientific community has made a significant effort towards quantifying this hydrologic prediction uncertainty. Despite this, most of the developed methodologies can be computationally demanding, are complex from a theoretical point of view, require substantial expertise to be employed, and are constrained by a number of assumptions about the model error distribution. These assumptions limit the reliability of many methods in case of errors that show particular cases of non-normality, heteroscedasticity, or autocorrelation. The present contribution builds on a non-parametric data-driven approach that was developed for uncertainty quantification in operational (real-time) forecasting settings. The approach is based on the concept of Pareto optimality and can be used as a standalone forecasting tool or as a postprocessor. By virtue of its non-parametric nature and a general operating principle, it can be applied directly and with ease to predictions of streamflow, water stage, or even accumulated runoff. Also, it is a methodology capable of coping with high heteroscedasticity and seasonal hydrological regimes (e.g. snowmelt and rainfall driven events in the same catchment). Finally, the training and operation of the model are very fast, making it a tool particularly adapted to operational use. To illustrate its practical use, the uncertainty quantification method is coupled with a process-based hydrological model to produce statistically reliable forecasts for an Alpine catchment located in Switzerland. 
Results are presented and discussed in terms of their reliability and resolution.
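The paper's Pareto-optimality machinery is not reproduced here, but the core non-parametric idea can be sketched in a few lines (all numbers invented): prediction bounds are read directly from empirical quantiles of past forecast errors, with no distributional assumption.

```python
import numpy as np

# Synthetic stand-in for archived forecast errors (observed minus predicted);
# in practice these would come from a training period, possibly binned by
# flow level to handle heteroscedasticity.
rng = np.random.default_rng(0)
errors = rng.normal(0.0, 5.0, size=1000)

# Non-parametric 90% prediction interval: empirical 5th/95th error quantiles
# added to a new point forecast.
lo, hi = np.quantile(errors, [0.05, 0.95])
point_forecast = 120.0  # m^3/s, illustrative
interval = (point_forecast + lo, point_forecast + hi)
```

Because the interval is built from observed errors rather than a fitted error law, skewed or heavy-tailed residuals are handled for free, which is the key operational appeal of non-parametric postprocessors.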

  6. PAI-OFF: A new proposal for online flood forecasting in flash flood prone catchments

    NASA Astrophysics Data System (ADS)

    Schmitz, G. H.; Cullmann, J.

    2008-10-01

Summary: The Process Modelling and Artificial Intelligence for Online Flood Forecasting (PAI-OFF) methodology combines the reliability of physically based hydrologic/hydraulic modelling with the operational advantages of artificial intelligence: extremely low computation times and straightforward operation. The basic principle of the methodology is to portray process models by means of artificial neural networks (ANNs). We propose to train ANN flood forecasting models with synthetic data that reflect the possible range of storm events. To this end, establishing PAI-OFF first requires setting up a physically based hydrologic model of the considered catchment and - optionally, if backwater effects have a significant impact on the flow regime - a hydrodynamic flood routing model of the river reach in question. Both models are subsequently used for simulating all meaningful, flood-relevant storm scenarios obtained from a catchment-specific meteorological data analysis. This provides a database of corresponding input/output vectors, which is then completed by generally available hydrological and meteorological data characterizing the catchment state prior to each storm event. This database subsequently serves for training both a polynomial neural network (PoNN) - portraying the rainfall-runoff process - and a multilayer neural network (MLFN), which mirrors the hydrodynamic flood wave propagation in the river. These two ANN models replace the hydrological and hydrodynamic models in the operational mode. After presenting the theory, we apply PAI-OFF - essentially consisting of the coupled "hydrologic" PoNN and "hydrodynamic" MLFN - to the Freiberger Mulde catchment in the Erzgebirge (Ore Mountains) in East Germany (3000 km²). Both the demonstrated computational efficiency and the prediction reliability underline the potential of the new PAI-OFF methodology for online flood forecasting.
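A toy illustration of the core surrogate idea: run a process model over synthetic scenarios, then fit a cheap model to its input/output pairs for fast operational use. The simulator, features, and data below are all invented, and an ordinary least-squares polynomial stands in for the PoNN.

```python
import numpy as np

# Stand-in for a physically based rainfall-runoff model (assumed, simplified):
# a nonlinear response to storm depth and antecedent soil moisture.
def process_model(rain, soil_moisture):
    return 0.8 * rain * soil_moisture + 0.05 * rain**2

rng = np.random.default_rng(1)
rain = rng.uniform(0, 50, 500)      # synthetic storm depths (mm)
sm = rng.uniform(0.1, 1.0, 500)     # antecedent soil moisture (-)
runoff = process_model(rain, sm)    # "offline" simulations build the database

# Polynomial-feature surrogate fitted by least squares to the database.
X = np.column_stack([rain, sm, rain * sm, rain**2, np.ones_like(rain)])
coef, *_ = np.linalg.lstsq(X, runoff, rcond=None)

# Operational mode: the surrogate answers instantly for a new event.
x_new = np.array([30.0, 0.6, 30.0 * 0.6, 30.0**2, 1.0])
pred = x_new @ coef
true = process_model(30.0, 0.6)
```

The expensive simulations happen once, offline, over the scenario database; online forecasting is a single matrix-vector product, which is exactly the operational advantage the abstract describes.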

  7. AROME-Arctic: New operational NWP model for the Arctic region

    NASA Astrophysics Data System (ADS)

    Süld, Jakob; Dale, Knut S.; Myrland, Espen; Batrak, Yurii; Homleid, Mariken; Valkonen, Teresa; Seierstad, Ivar A.; Randriamampianina, Roger

    2016-04-01

In the frame of the EU-funded project ACCESS (Arctic Climate Change, Economy and Society), MET Norway aimed 1) to describe the present monitoring and forecasting capabilities in the Arctic; and 2) to identify the key factors limiting those capabilities and to give recommendations on key areas for improvement. We have observed that NWP forecast quality is lower in the Arctic than in regions further south. Earlier research indicated that one of the factors behind this is the composition of the observing system in the Arctic, in particular the scarceness of conventional observations. To further assess possible strategies for alleviating the situation and propose scenarios for a future Arctic observing system, we have performed a set of experiments to gain more detailed insight into the contribution of the components of the present observing system in a regional state-of-the-art non-hydrostatic NWP model using the AROME physics (Seity et al., 2011) at 2.5 km horizontal resolution - AROME-Arctic. Our observing system experiments showed that conventional observations (SYNOP, buoys) can play an important role in correcting the surface state of the model, but that the present upper-air conventional observations (radiosondes, aircraft) in the area are too scarce to have a significant effect on forecasts. We demonstrate that satellite sounding data play an important role in improving forecast quality. This is the case for satellite temperature sounding data (AMSU-A, IASI) as well as satellite moisture sounding data (AMSU-B/MHS, IASI). With these sets of observations, AROME-Arctic clearly performs better in forecasting extreme events such as polar lows. For more details see the presentation by Randriamampianina et al. in this session.
The encouraging performance of AROME-Arctic led us to implement it, with more observations and improved settings, in daily runs with the objective of replacing our current operational Arctic mesoscale HIRLAM (High Resolution Limited Area Model) NWP model. This presentation will discuss in detail the operational implementation of the AROME-Arctic model together with post-processing methods. Planned services in the Arctic region covered by the model, such as online weather forecasting (yr.no) and tracking of polar lows (barentswatch.no), are also included.

  8. Communicating uncertainty in hydrological forecasts: mission impossible?

    NASA Astrophysics Data System (ADS)

    Ramos, Maria-Helena; Mathevet, Thibault; Thielen, Jutta; Pappenberger, Florian

    2010-05-01

Cascading uncertainty in meteo-hydrological modelling chains for forecasting and integrated flood risk assessment is an essential step to improve the quality of hydrological forecasts. Although the best methodology to quantify the total predictive uncertainty in hydrology is still debated, there is common agreement that one must avoid uncertainty misrepresentation and miscommunication, as well as misinterpretation of information by users. Several recent studies point out that uncertainty, when properly explained and defined, is no longer unwelcome among emergency response organizations, users of flood risk information and the general public. However, efficient communication of uncertain hydro-meteorological forecasts is far from being a resolved issue. This study focuses on the interpretation and communication of uncertain hydrological forecasts based on (uncertain) meteorological forecasts and (uncertain) rainfall-runoff modelling approaches to decision-makers such as operational hydrologists and water managers in charge of flood warning and scenario-based reservoir operation. An overview of the typical flow of uncertainties and risk-based decisions in hydrological forecasting systems is presented. The challenges related to the extraction of meaningful information from probabilistic forecasts and the test of its usefulness in assisting operational flood forecasting are illustrated with the help of two case studies: 1) a study on the use and communication of probabilistic flood forecasting within the European Flood Alert System; 2) a case study on the use of probabilistic forecasts by operational forecasters from the hydroelectricity company EDF in France. These examples show that attention must be paid to initiatives that promote or reinforce the active participation of expert forecasters in the forecasting chain.
The practice of face-to-face forecast briefings, focused on sharing how forecasters interpret, describe and perceive the forecasted model output scenarios, is essential. We believe that the efficient communication of uncertainty in hydro-meteorological forecasts is not a mission impossible. The questions that remain unanswered in probabilistic hydrological forecasting should not neutralize the goal of such a mission; rather, they should act as a catalyst for overcoming the remaining challenges.

  9. A simple Lagrangian forecast system with aviation forecast potential

    NASA Technical Reports Server (NTRS)

    Petersen, R. A.; Homan, J. H.

    1983-01-01

    A trajectory forecast procedure is developed which uses geopotential tendency fields obtained from a simple, multiple layer, potential vorticity conservative isentropic model. This model can objectively account for short-term advective changes in the mass field when combined with fine-scale initial analyses. This procedure for producing short-term, upper-tropospheric trajectory forecasts employs a combination of a detailed objective analysis technique, an efficient mass advection model, and a diagnostically proven trajectory algorithm, none of which require extensive computer resources. Results of initial tests are presented, which indicate an exceptionally good agreement for trajectory paths entering the jet stream and passing through an intensifying trough. It is concluded that this technique not only has potential for aiding in route determination, fuel use estimation, and clear air turbulence detection, but also provides an example of the types of short range forecasting procedures which can be applied at local forecast centers using simple algorithms and a minimum of computer resources.

  10. Financial forecasts accuracy in Brazil's social security system.

    PubMed

    Silva, Carlos Patrick Alves da; Puty, Claudio Alberto Castelo Branco; Silva, Marcelino Silva da; Carvalho, Solon Venâncio de; Francês, Carlos Renato Lisboa

    2017-01-01

    Long-term social security statistical forecasts produced and disseminated by the Brazilian government aim to provide accurate results that would serve as background information for optimal policy decisions. These forecasts are being used as support for the government's proposed pension reform that plans to radically change the Brazilian Constitution insofar as Social Security is concerned. However, the reliability of official results is uncertain since no systematic evaluation of these forecasts has ever been published by the Brazilian government or anyone else. This paper aims to present a study of the accuracy and methodology of the instruments used by the Brazilian government to carry out long-term actuarial forecasts. We base our research on an empirical and probabilistic analysis of the official models. Our empirical analysis shows that the long-term Social Security forecasts are systematically biased in the short term and have significant errors that render them meaningless in the long run. Moreover, the low level of transparency in the methods impaired the replication of results published by the Brazilian Government and the use of outdated data compromises forecast results. In the theoretical analysis, based on a mathematical modeling approach, we discuss the complexity and limitations of the macroeconomic forecast through the computation of confidence intervals. We demonstrate the problems related to error measurement inherent to any forecasting process. We then extend this exercise to the computation of confidence intervals for Social Security forecasts. This mathematical exercise raises questions about the degree of reliability of the Social Security forecasts.
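The abstract's point about confidence intervals can be made concrete with a minimal Monte Carlo sketch (the growth rate, its standard error, and the horizon below are invented numbers, not the official model's): small annual forecast errors compound into a wide band over decades.

```python
import numpy as np

# Assumed: annual revenue growth forecast at 2% with a 1-percentage-point
# standard error, independent across years. Sample paths and compound them.
rng = np.random.default_rng(42)
years, mu, sigma = 30, 0.02, 0.01
growth = rng.normal(mu, sigma, size=(100_000, years))
level = np.prod(1.0 + growth, axis=1)        # cumulative factor after 30 years
ci_lo, ci_hi = np.quantile(level, [0.025, 0.975])
```

Even a modest 1-pp annual uncertainty leaves the 30-year level known only to within roughly ±10%, which is the kind of widening interval the authors argue should accompany official long-run projections.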

  11. Financial forecasts accuracy in Brazil’s social security system

    PubMed Central

    2017-01-01

    Long-term social security statistical forecasts produced and disseminated by the Brazilian government aim to provide accurate results that would serve as background information for optimal policy decisions. These forecasts are being used as support for the government’s proposed pension reform that plans to radically change the Brazilian Constitution insofar as Social Security is concerned. However, the reliability of official results is uncertain since no systematic evaluation of these forecasts has ever been published by the Brazilian government or anyone else. This paper aims to present a study of the accuracy and methodology of the instruments used by the Brazilian government to carry out long-term actuarial forecasts. We base our research on an empirical and probabilistic analysis of the official models. Our empirical analysis shows that the long-term Social Security forecasts are systematically biased in the short term and have significant errors that render them meaningless in the long run. Moreover, the low level of transparency in the methods impaired the replication of results published by the Brazilian Government and the use of outdated data compromises forecast results. In the theoretical analysis, based on a mathematical modeling approach, we discuss the complexity and limitations of the macroeconomic forecast through the computation of confidence intervals. We demonstrate the problems related to error measurement inherent to any forecasting process. We then extend this exercise to the computation of confidence intervals for Social Security forecasts. This mathematical exercise raises questions about the degree of reliability of the Social Security forecasts. PMID:28859172

  12. The Value of Seasonal Climate Forecasts in Managing Energy Resources.

    NASA Astrophysics Data System (ADS)

    Brown Weiss, Edith

    1982-04-01

Research and interviews with officials of the United States energy industry and a systems analysis of decision making in a natural gas utility lead to the conclusion that seasonal climate forecasts would have only limited value in fine-tuning the management of energy supply, even if the forecasts were more reliable and detailed than at present. On the other hand, reliable forecasts could be useful to state and local governments both as a signal to adopt long-term measures to increase the efficiency of energy use and to initiate short-term measures to reduce energy demand in anticipation of a weather-induced energy crisis. To be useful for these purposes, state governments would need better data on energy demand patterns and available energy supplies, staff competent to interpret climate forecasts, and greater incentive to conserve. The use of seasonal climate forecasts is not likely to be constrained by fear of legal action by those claiming to be injured by a possible incorrect forecast.

  13. Verification of National Weather Service spot forecasts using surface observations

    NASA Astrophysics Data System (ADS)

    Lammers, Matthew Robert

    Software has been developed to evaluate National Weather Service spot forecasts issued to support prescribed burns and early-stage wildfires. Fire management officials request spot forecasts from National Weather Service Weather Forecast Offices to provide detailed guidance as to atmospheric conditions in the vicinity of planned prescribed burns as well as wildfires that do not have incident meteorologists on site. This open source software with online display capabilities is used to examine an extensive set of spot forecasts of maximum temperature, minimum relative humidity, and maximum wind speed from April 2009 through November 2013 nationwide. The forecast values are compared to the closest available surface observations at stations installed primarily for fire weather and aviation applications. The accuracy of the spot forecasts is compared to those available from the National Digital Forecast Database (NDFD). Spot forecasts for selected prescribed burns and wildfires are used to illustrate issues associated with the verification procedures. Cumulative statistics for National Weather Service County Warning Areas and for the nation are presented. Basic error and accuracy metrics for all available spot forecasts and the entire nation indicate that the skill of the spot forecasts is higher than that available from the NDFD, with the greatest improvement for maximum temperature and the least improvement for maximum wind speed.
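The verification metrics involved are simple enough to sketch (station values here are invented; the real software compares thousands of spot and NDFD forecasts against fire-weather and aviation stations):

```python
import numpy as np

# Invented sample: observed maximum temperature at five stations, with the
# matched spot forecast and NDFD forecast for each.
obs  = np.array([31.0, 28.5, 33.2, 30.1, 27.8])
spot = np.array([30.5, 29.0, 32.8, 30.5, 28.2])
ndfd = np.array([29.5, 30.2, 31.9, 31.4, 26.5])

def bias(fcst, obs):
    # mean error: positive means the forecast runs warm
    return float(np.mean(fcst - obs))

def mae(fcst, obs):
    # mean absolute error: overall accuracy regardless of sign
    return float(np.mean(np.abs(fcst - obs)))

spot_mae, ndfd_mae = mae(spot, obs), mae(ndfd, obs)
```

With these invented numbers the spot forecasts beat the NDFD, mirroring (but not reproducing) the study's nationwide finding.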

  14. Propagation of uncertainties through the oil spill model MEDSLIK-II: operational application to the Black Sea

    NASA Astrophysics Data System (ADS)

    Liubartseva, Svitlana; Coppini, Giovanni; Ciliberti, Stefania Angela; Lecci, Rita

    2017-04-01

In operational oil spill modeling, MEDSLIK-II (De Dominicis et al., 2013) focuses on the reliability of the oil drift and fate predictions routinely fed by operational oceanographic and atmospheric forecasting chains. Uncertainty calculations enhance oil spill forecast efficiency, supplying probability maps that quantify the propagation of various uncertainties. Recently, we have developed a methodology that allows users to evaluate the variability of oil drift forecasts caused by uncertain data on the initial oil spill conditions (Liubartseva et al., 2016). One of the key methodological aspects is a reasonable choice of the way parameters are perturbed. In the case of the starting oil spill location and time, these scalars can be treated as independent random parameters. If we want to perturb the underlying ocean currents and wind, we have to deal with deterministic vector parameters. To a first approximation, we suggest rolling forecasts as a set of perturbed ocean currents and wind. This approach does not require any extra hydrodynamic calculations, and it is quick enough to be performed in web-based applications. The capabilities of the proposed methodology are explored using the Black Sea Forecasting System (BSFS) recently implemented by Ciliberti et al. (2016) for the Copernicus Marine Environment Monitoring Service (http://marine.copernicus.eu/services-portfolio/access-to-products). The BSFS horizontal resolution is 1/36° in the zonal and 1/27° in the meridional direction (ca. 3 km). The vertical domain discretization comprises 31 unevenly spaced vertical levels. Atmospheric wind data are provided by European Centre for Medium-Range Weather Forecasts (ECMWF) forecasts at 1/8° (ca. 12.5 km) horizontal and 6-hour temporal resolution. A great variety of probability patterns controlled by different underlying flows is represented, including the cyclonic Rim Current, flow bifurcations in anticyclonic eddies (e.g., Sevastopol and Batumi), northwestern shelf circulation, etc.
Uncertainty imprints in the oil mass balance components are also analyzed. This work is conducted in the framework of the REACT Project funded by Fondazione CON IL SUD/Brains2South. References Ciliberti, S.A., Peneva, E., Storto, A., Kandilarov, R., Lecci, R., Yang, C., Coppini, G., Masina, S., Pinardi, N., 2016. Implementation of Black Sea numerical model based on NEMO and 3DVAR data assimilation scheme for operational forecasting, Geophys. Res. Abs., 18, EGU2016-16222. De Dominicis, M., Pinardi, N., Zodiatis, G., Lardner, R., 2013. MEDSLIK-II, a Lagrangian marine surface oil spill model for short term forecasting-Part 1: Theory, Geosci. Model Dev., 6, 1851-1869. Liubartseva, S., Coppini, G., Pinardi, N., De Dominicis, M., Lecci, R., Turrisi, G., Cretì, S., Martinelli, S., Agostini, P., Marra, P., Palermo, F., 2016. Decision support system for emergency management of oil spill accidents in the Mediterranean Sea, Nat. Hazards Earth Syst. Sci., 16, 2009-2020.
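The probability maps described above can be sketched generically (all physics and numbers below are invented; the member-specific perturbation merely stands in for the "rolling forecasts" used as perturbed currents and wind): run an ensemble of particle drifts and bin the end positions into an empirical oil-presence probability map.

```python
import numpy as np

# Toy ensemble drift: each member perturbs the mean transport; particles
# also receive small random-walk displacements each step.
rng = np.random.default_rng(5)
n_members, n_particles, steps = 20, 200, 48
u, v = 0.2, 0.1                          # mean net drift over the window (toy units)
end_x = np.empty((n_members, n_particles))
end_y = np.empty((n_members, n_particles))
for m in range(n_members):
    du, dv = rng.normal(0.0, 0.05, 2)    # member-specific transport perturbation
    x = np.zeros(n_particles)
    y = np.zeros(n_particles)
    for _ in range(steps):
        x += (u + du) / steps + rng.normal(0.0, 0.01, n_particles)
        y += (v + dv) / steps + rng.normal(0.0, 0.01, n_particles)
    end_x[m], end_y[m] = x, y

# Fraction of all particles ending in each cell: an empirical probability map.
H, _, _ = np.histogram2d(end_x.ravel(), end_y.ravel(),
                         bins=20, range=[[-0.5, 1.0], [-0.5, 1.0]])
prob_map = H / (n_members * n_particles)
```

Because no extra hydrodynamic runs are needed (the perturbations reuse existing forecasts), this kind of calculation is cheap enough for web-based delivery, as the abstract notes.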

  15. Forecasting Maintenance Shortcomings of a Planned Equipment Density Listing in Support of Expeditionary Missions

    DTIC Science & Technology

    2017-06-01

    importantly, it examines the methodology used to build the class IX block embarked on ship prior to deployment. The class IX block is defined as a repository...compared to historical data to evaluate model and simulation outputs. This thesis provides recommendations on improving the methodology implemented in...improving the level of organic support available to deployed units. More importantly, it examines the methodology used to build the class IX block

  16. ADAPTATION AND APPLICATION OF THE COMMUNITY MULTISCALE AIR QUALITY (CMAQ) MODELING SYSTEM FOR REAL-TIME AIR QUALITY FORECASTING DURING THE SUMMER OF 2004

    EPA Science Inventory

    The ability to forecast local and regional air pollution events is challenging since the processes governing the production and sustenance of atmospheric pollutants are complex and often non-linear. Comprehensive atmospheric models, by representing in as much detail as possible t...

  17. The Second NWRA Flare-Forecasting Comparison Workshop: Methods Compared and Methodology

    NASA Astrophysics Data System (ADS)

    Leka, K. D.; Barnes, G.; the Flare Forecasting Comparison Group

    2013-07-01

The Second NWRA Workshop to compare methods of solar flare forecasting was held 2-4 April 2013 in Boulder, CO. This is a follow-on to the First NWRA Workshop on Flare Forecasting Comparison, also known as the "All-Clear Forecasting Workshop", held in 2009 jointly with NASA/SRAG and NOAA/SWPC. For this most recent workshop, many researchers who are active in the field participated, and diverse methods were represented in terms of both the characterization of the Sun and the statistical approaches used to create a forecast. A standard dataset was created for this investigation, using data from the Solar Dynamics Observatory/Helioseismic and Magnetic Imager (SDO/HMI) vector magnetic field HARP series. For each HARP on each day, 6 hours of data were used, allowing for nominal time-series analysis to be included in the forecasts. We present here a summary of the forecasting methods that participated and the standardized dataset that was used. Funding for the workshop and the data analysis was provided by NASA/Living with a Star contract NNH09CE72C and NASA/Guest Investigator contract NNH12CG10C.

  18. Randomly correcting model errors in the ARPEGE-Climate v6.1 component of CNRM-CM: applications for seasonal forecasts

    NASA Astrophysics Data System (ADS)

    Batté, Lauriane; Déqué, Michel

    2016-06-01

Stochastic methods are increasingly used in global coupled model climate forecasting systems to account for model uncertainties. In this paper, we describe in more detail the stochastic dynamics technique introduced by Batté and Déqué (2012) in the ARPEGE-Climate atmospheric model. We present new results with an updated version of CNRM-CM using ARPEGE-Climate v6.1, and show that the technique can be used both as a means of analyzing model error statistics and accounting for model inadequacies in a seasonal forecasting framework. The perturbations are designed as corrections of model drift errors estimated from a preliminary weakly nudged re-forecast run over an extended reference period of 34 boreal winter seasons. A detailed statistical analysis of these corrections is provided, and shows that they are mainly made of intra-month variance, thereby justifying their use as in-run perturbations of the model in seasonal forecasts. However, the interannual and systematic error correction terms cannot be neglected. Time correlation of the errors is limited, but some consistency is found between the errors of up to 3 consecutive days. These findings encourage us to test several settings of the random draws of perturbations in seasonal forecast mode. Perturbations are drawn randomly but consistently for all three prognostic variables perturbed. We explore the impact of using monthly mean perturbations throughout a given forecast month in a first ensemble re-forecast (SMM, for stochastic monthly means), and test the use of 5-day sequences of perturbations in a second ensemble re-forecast (S5D, for stochastic 5-day sequences). Both experiments are compared in the light of a REF reference ensemble with initial perturbations only. Results in terms of forecast quality are contrasted depending on the region and variable of interest, but very few areas exhibit a clear degradation of forecasting skill with the introduction of stochastic dynamics.
We highlight some positive impacts of the method, mainly on Northern Hemisphere extra-tropics. The 500 hPa geopotential height bias is reduced, and improvements project onto the representation of North Atlantic weather regimes. A modest impact on ensemble spread is found over most regions, which suggests that this method could be complemented by other stochastic perturbation techniques in seasonal forecasting mode.

  19. A probabilistic analysis of silicon cost

    NASA Technical Reports Server (NTRS)

    Reiter, L. J.

    1983-01-01

Silicon materials costs represent both a cost driver and an area where improvement can be made in the manufacture of photovoltaic modules. The cost of silicon from three processes for the production of low-cost silicon being developed under the U.S. Department of Energy's (DOE) National Photovoltaic Program is analyzed. The approach is based on probabilistic inputs and makes use of two models developed at the Jet Propulsion Laboratory: SIMRAND (SIMulation of Research ANd Development) and IPEG (Improved Price Estimating Guidelines). The approach, assumptions, and limitations are detailed, along with a verification of the cost analysis methodology. Results, presented in the form of cumulative probability distributions for silicon cost, indicate that there is a 55% chance of reaching the DOE target of $16/kg for silicon material. This is a technically achievable cost based on expert forecasts of the results of ongoing research and development and does not imply any market price for a given year.
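The SIMRAND-style output can be mimicked with a generic Monte Carlo sketch (the distributions, parameters, and cost model below are all invented; only the form of the result, a cumulative probability of meeting a cost target, mirrors the study):

```python
import numpy as np

# Invented uncertain inputs: process yield and an energy-cost contribution.
rng = np.random.default_rng(3)
n = 50_000
yield_frac = rng.triangular(0.6, 0.8, 0.95, n)   # process yield (fraction)
energy_cost = rng.normal(6.0, 1.5, n)            # $/kg contribution
base_cost = 8.0                                  # $/kg, fixed part

# Cost per sample; the empirical CDF of this array is the "cumulative
# probability distribution for silicon cost" in the abstract's sense.
cost = base_cost / yield_frac + energy_cost
p_target = float(np.mean(cost <= 16.0))          # P(cost meets $16/kg target)
```

Reading a target probability off the empirical CDF, rather than quoting a single point estimate, is what makes the analysis probabilistic.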

  20. The 30/20 GHz fixed communications systems service demand assessment. Volume 3: Appendices

    NASA Technical Reports Server (NTRS)

    Gabriszeski, T.; Reiner, P.; Rogers, J.; Terbo, W.

    1979-01-01

The market analysis of voice, video, and data 18/30 GHz communications systems services and satellite transmission services is discussed. Detailed calculations, computer displays of traffic, survey questionnaires, and detailed service forecasts are presented.

  1. Earthquake cycles and physical modeling of the process leading up to a large earthquake

    NASA Astrophysics Data System (ADS)

    Ohnaka, Mitiyasu

    2004-08-01

A thorough discussion is made on what the rational constitutive law for earthquake ruptures ought to be from the standpoint of the physics of rock friction and fracture, on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology for forecasting large earthquakes, the entire process of one cycle for a typical large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate- and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II) and the phase of rupture nucleation at the critical stage where an adequate amount of elastic strain energy has been stored (phase III). Phase II plays a critical role in physical modeling of intermediate-term forecasting, and phase III in physical modeling of short-term (immediate) forecasting. The seismogenic layer and individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling in order to construct realistic, unified scenarios with predictive capability. The scenario presented may be significant and useful as a necessary first step in establishing a methodology for forecasting large earthquakes.

  2. Forecasting Influenza Epidemics in Hong Kong.

    PubMed

    Yang, Wan; Cowling, Benjamin J; Lau, Eric H Y; Shaman, Jeffrey

    2015-07-01

    Recent advances in mathematical modeling and inference methodologies have enabled development of systems capable of forecasting seasonal influenza epidemics in temperate regions in real-time. However, in subtropical and tropical regions, influenza epidemics can occur throughout the year, making routine forecast of influenza more challenging. Here we develop and report forecast systems that are able to predict irregular non-seasonal influenza epidemics, using either the ensemble adjustment Kalman filter or a modified particle filter in conjunction with a susceptible-infected-recovered (SIR) model. We applied these model-filter systems to retrospectively forecast influenza epidemics in Hong Kong from January 1998 to December 2013, including the 2009 pandemic. The forecast systems were able to forecast both the peak timing and peak magnitude for 44 epidemics in 16 years caused by individual influenza strains (i.e., seasonal influenza A(H1N1), pandemic A(H1N1), A(H3N2), and B), as well as 19 aggregate epidemics caused by one or more of these influenza strains. Average forecast accuracies were 37% (for both peak timing and magnitude) at 1-3 week leads, and 51% (peak timing) and 50% (peak magnitude) at 0 lead. Forecast accuracy increased as the spread of a given forecast ensemble decreased; the forecast accuracy for peak timing (peak magnitude) increased up to 43% (45%) for H1N1, 93% (89%) for H3N2, and 53% (68%) for influenza B at 1-3 week leads. These findings suggest that accurate forecasts can be made at least 3 weeks in advance for subtropical and tropical regions.
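The SIR model named in the abstract is standard; a minimal forward integration (parameters invented, and with the EAKF/particle-filter machinery omitted) shows the peak timing and peak magnitude that the filter-based systems are tuned to predict:

```python
# Simple Euler integration of the susceptible-infected-recovered (SIR) model,
# tracking the epidemic peak. Parameters here are illustrative, not fitted.
def sir_run(S0, I0, N, beta, gamma, days, dt=0.1):
    S, I = S0, I0
    peak_I, peak_t = I0, 0.0
    steps = int(days / dt)
    for k in range(steps):
        dS = -beta * S * I / N          # new infections deplete susceptibles
        dI = beta * S * I / N - gamma * I  # infections minus recoveries
        S += dS * dt
        I += dI * dt
        if I > peak_I:
            peak_I, peak_t = I, (k + 1) * dt
    return peak_I, peak_t

# Invented example: R0 = beta/gamma = 2 in a population of 10,000.
peak_I, peak_t = sir_run(S0=9990.0, I0=10.0, N=10000.0,
                         beta=0.5, gamma=0.25, days=120)
```

In the forecast systems, a filter repeatedly adjusts (S, I, beta, gamma) against incoming surveillance data and this forward run is what projects the adjusted state into the future.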

  3. Forecasting Influenza Epidemics in Hong Kong

    PubMed Central

    Yang, Wan; Cowling, Benjamin J.; Lau, Eric H. Y.; Shaman, Jeffrey

    2015-01-01

    Recent advances in mathematical modeling and inference methodologies have enabled development of systems capable of forecasting seasonal influenza epidemics in temperate regions in real-time. However, in subtropical and tropical regions, influenza epidemics can occur throughout the year, making routine forecast of influenza more challenging. Here we develop and report forecast systems that are able to predict irregular non-seasonal influenza epidemics, using either the ensemble adjustment Kalman filter or a modified particle filter in conjunction with a susceptible-infected-recovered (SIR) model. We applied these model-filter systems to retrospectively forecast influenza epidemics in Hong Kong from January 1998 to December 2013, including the 2009 pandemic. The forecast systems were able to forecast both the peak timing and peak magnitude for 44 epidemics in 16 years caused by individual influenza strains (i.e., seasonal influenza A(H1N1), pandemic A(H1N1), A(H3N2), and B), as well as 19 aggregate epidemics caused by one or more of these influenza strains. Average forecast accuracies were 37% (for both peak timing and magnitude) at 1-3 week leads, and 51% (peak timing) and 50% (peak magnitude) at 0 lead. Forecast accuracy increased as the spread of a given forecast ensemble decreased; the forecast accuracy for peak timing (peak magnitude) increased up to 43% (45%) for H1N1, 93% (89%) for H3N2, and 53% (68%) for influenza B at 1-3 week leads. These findings suggest that accurate forecasts can be made at least 3 weeks in advance for subtropical and tropical regions. PMID:26226185

  4. Flight Departure Delay and Rerouting Under Uncertainty in En Route Convective Weather

    NASA Technical Reports Server (NTRS)

    Mukherjee, Avijit; Grabbe, Shon; Sridhar, Banavar

    2011-01-01

    Delays caused by uncertainty in weather forecasts can be reduced by improving traffic flow management decisions. This paper presents a methodology for traffic flow management under uncertainty in convective weather forecasts. An algorithm for assigning departure delays and reroutes to aircraft is presented. Departure delay and route assignment are executed at multiple stages, during which, updated weather forecasts and flight schedules are used. At each stage, weather forecasts up to a certain look-ahead time are treated as deterministic and flight scheduling is done to mitigate the impact of weather on four-dimensional flight trajectories. Uncertainty in weather forecasts during departure scheduling results in tactical airborne holding of flights. The amount of airborne holding depends on the accuracy of forecasts as well as the look-ahead time included in the departure scheduling. The weather forecast look-ahead time is varied systematically within the experiments performed in this paper to analyze its effect on flight delays. Based on the results, longer look-ahead times cause higher departure delays and additional flying time due to reroutes. However, the amount of airborne holding necessary to prevent weather incursions reduces when the forecast look-ahead times are higher. For the chosen day of traffic and weather, setting the look-ahead time to 90 minutes yields the lowest total delay cost.
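A toy expected-cost calculation (costs, distributions, and candidate look-ahead times all invented; this is not the paper's scheduling algorithm) illustrates the tradeoff the abstract describes: short look-aheads shift delay into expensive airborne holding, long ones into cheap but certain ground delay.

```python
import numpy as np

# Invented scenario: true weather-clearing time is uncertain; a flight held
# on the ground for `lookahead` minutes then departs, and any weather still
# present on arrival is absorbed as airborne holding.
rng = np.random.default_rng(11)
clear_time = np.clip(rng.normal(60.0, 15.0, 10_000), 0.0, None)  # minutes

GROUND, AIR = 1.0, 3.0   # relative cost per minute: airborne holding is pricier

def expected_cost(lookahead):
    ground = GROUND * lookahead                              # certain ground delay
    air = AIR * np.maximum(clear_time - lookahead, 0.0)      # residual holding
    return float(np.mean(ground + air))

costs = {la: expected_cost(la) for la in (30, 60, 90)}
```

With these particular numbers the middle look-ahead wins; the paper's 90-minute optimum depends on its own cost model, traffic, and weather data.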

  5. An experimental system for flood risk forecasting and monitoring at global scale

    NASA Astrophysics Data System (ADS)

    Dottori, Francesco; Alfieri, Lorenzo; Kalas, Milan; Lorini, Valerio; Salamon, Peter

    2017-04-01

    Global flood forecasting and monitoring systems are nowadays a reality and are being applied by a wide range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk-based forecasting, combining streamflow estimations with expected inundated areas and flood impacts. In addition, emerging technologies such as crowdsourcing and social media monitoring can play a crucial role in flood disaster management and preparedness. Here, we present some recent advances of an experimental procedure for near-real-time flood mapping and impact assessment. The procedure translates in near real-time the daily streamflow forecasts issued by the Global Flood Awareness System (GloFAS) into event-based flood hazard maps, which are then combined with exposure and vulnerability information at global scale to derive risk forecasts. Impacts of the forecasted flood events are evaluated in terms of flood-prone areas, potential economic damage, and affected population, infrastructures and cities. To increase the reliability of our forecasts we propose the integration of model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification and correction of impact forecasts. Finally, we present the results of preliminary tests which show the potential of the proposed procedure in supporting emergency response and management.

  6. Increased performance in the short-term water demand forecasting through the use of a parallel adaptive weighting strategy

    NASA Astrophysics Data System (ADS)

    Sardinha-Lourenço, A.; Andrade-Campos, A.; Antunes, A.; Oliveira, M. S.

    2018-03-01

    Recent research on short-term water demand forecasting has shown that models using univariate time series based on historical data are useful and can be combined with other prediction methods to reduce errors. Water demand in drinking water distribution networks is largely repetitive: under similar meteorological conditions and consumer profiles, this regularity allows the development of a heuristic forecast model that, combined with other autoregressive models, can provide reliable forecasts. In this study, a parallel adaptive weighting strategy for forecasting water consumption over the next 24-48 h, using univariate time series of potable water consumption, is proposed. Two Portuguese potable water distribution networks are used as case studies, where the only input data are water consumption and the national calendar. For the development of the strategy, the Autoregressive Integrated Moving Average (ARIMA) method and a short-term forecast heuristic algorithm are used. Simulations with the model showed that, when using a parallel adaptive weighting strategy, the prediction error can be reduced by 15.96% and the average error by 9.20%. This reduction is important in the control and management of water supply systems. The proposed methodology can be extended to other forecast methods, especially given the availability of multiple forecast models.
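
    The core of a parallel adaptive weighting strategy can be illustrated with a short Python sketch: two forecasts (for example, an ARIMA output and a heuristic output) are blended with weights updated from each model's recent errors. The inverse-error weighting rule below is an illustrative assumption, not necessarily the scheme used in the paper.

```python
def adaptive_weights(errors_a, errors_b, eps=1e-9):
    """Weights inversely proportional to each model's recent mean absolute error."""
    mae_a = sum(abs(e) for e in errors_a) / len(errors_a)
    mae_b = sum(abs(e) for e in errors_b) / len(errors_b)
    inv_a, inv_b = 1.0 / (mae_a + eps), 1.0 / (mae_b + eps)
    total = inv_a + inv_b
    return inv_a / total, inv_b / total

def combine(forecast_a, forecast_b, w_a, w_b):
    """Weighted blend of two forecast series of equal length."""
    return [w_a * a + w_b * b for a, b in zip(forecast_a, forecast_b)]

# Hypothetical recent errors: model A has been twice as accurate as model B
w_a, w_b = adaptive_weights([1.0, 2.0], [2.0, 4.0])
blend = combine([100.0, 110.0], [120.0, 130.0], w_a, w_b)
```

    Recomputing the weights each forecast cycle is what makes the strategy "adaptive": whichever model has tracked recent consumption better dominates the blend.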

  7. Forecast Based Financing for Managing Weather and Climate Risks to Reduce Potential Disaster Impacts

    NASA Astrophysics Data System (ADS)

    Arrighi, J.

    2017-12-01

    There is a critical window of time to reduce potential impacts of a disaster after a forecast for heightened risk is issued and before an extreme event occurs. The concept of Forecast-based Financing focuses on this window of opportunity. Through advanced preparation during system set-up, tailored methodologies are used to 1) analyze a range of potential extreme event forecasts, 2) identify emergency preparedness measures that can be taken when factoring in forecast lead time and inherent uncertainty, and 3) develop standard operating procedures that are agreed on and tied to guaranteed funding sources to facilitate emergency measures led by the Red Cross or government actors when preparedness measures are triggered. This presentation will focus on a broad overview of the current state of theory and approaches used in developing forecast-based financing systems, with a specific focus on hydrologic events; case studies of success and challenges in the various contexts where this approach is being piloted; and what is on the horizon to be further explored and developed from a research perspective as the application of this approach continues to expand.

  8. Does money matter in inflation forecasting?

    NASA Astrophysics Data System (ADS)

    Binner, J. M.; Tino, P.; Tepper, J.; Anderson, R.; Jones, B.; Kendall, G.

    2010-11-01

    This paper provides the most comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely, recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best fitting US inflation forecasting models and are then compared to forecasts from a naïve random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists' long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies.
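
    The recursive flavor of such predictors can be illustrated with an ordinary (linear, scalar) recursive least squares update with a forgetting factor, a simplified stand-in for the kernel recursive least squares used in the paper; the data and forgetting factor below are invented for illustration.

```python
def rls_update(theta, p, x, y, lam=0.99):
    """One recursive least squares update for a scalar model y ~ theta * x.
    lam < 1 is a forgetting factor, giving the predictor finite effective memory."""
    k = p * x / (lam + x * p * x)      # gain
    theta = theta + k * (y - theta * x)
    p = (p - k * x * p) / lam          # covariance update
    return theta, p

# Noisy observations of y = 2 * x, fitted one sample at a time
theta, p = 0.0, 1000.0
for x, y in [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9), (4.0, 8.05)]:
    theta, p = rls_update(theta, p, x, y)
```

    The kernel version replaces the scalar product with kernel evaluations over a dictionary of past inputs, which is what lets it capture the nonlinear autoregressive structure the study reports.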

  9. Forecasting cyanobacteria dominance in Canadian temperate lakes.

    PubMed

    Persaud, Anurani D; Paterson, Andrew M; Dillon, Peter J; Winter, Jennifer G; Palmer, Michelle; Somers, Keith M

    2015-03-15

    Predictive models based on broad scale, spatial surveys typically identify nutrients and climate as the most important predictors of cyanobacteria abundance; however these models generally have low predictive power because at smaller geographic scales numerous other factors may be equally or more important. At the lake level, for example, the ability to forecast cyanobacteria dominance is of tremendous value to lake managers as they can use such models to communicate exposure risks associated with recreational and drinking water use, and possible exposure to algal toxins, in advance of bloom occurrence. We used detailed algal, limnological and meteorological data from two temperate lakes in south-central Ontario, Canada to determine the factors that are closely linked to cyanobacteria dominance, and to develop easy to use models to forecast cyanobacteria biovolume. For Brandy Lake (BL), the strongest and most parsimonious model for forecasting % cyanobacteria biovolume (% CB) included water column stability, hypolimnetic TP, and % cyanobacteria biovolume two weeks prior. For Three Mile Lake (TML), the best model for forecasting % CB included water column stability, hypolimnetic TP concentration, and 7-d mean wind speed. The models for forecasting % CB in BL and TML are fundamentally different in their lag periods (BL = lag 1 model and TML = lag 2 model) and in some predictor variables despite the close proximity of the study lakes. We speculate that three main factors (nutrient concentrations, water transparency and lake morphometry) may have contributed to differences in the models developed, and may account for variation observed in models derived from large spatial surveys. Our results illustrate that while forecast models can be developed to determine when cyanobacteria will dominate within two temperate lakes, the models require detailed, lake-specific calibration to be effective as risk-management tools. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Routine High-Resolution Forecasts/Analyses for the Pacific Disaster Center: User Manual

    NASA Technical Reports Server (NTRS)

    Roads, John; Han, J.; Chen, S.; Burgan, R.; Fujioka, F.; Stevens, D.; Funayama, D.; Chambers, C.; Bingaman, B.; McCord, C.; et al.

    2001-01-01

    Enclosed herein is our HWCMO user manual. This manual constitutes the final report for our NASA/PDC grant, NASA NAG5-8730, "Routine High Resolution Forecasts/Analysis for the Pacific Disaster Center". Since the beginning of the grant, we have routinely provided experimental high resolution forecasts from the RSM/MSM for the Hawaiian Islands, while working to upgrade the system to include: (1) a more robust input of NCEP analyses directly from NCEP; (2) higher vertical resolution, with increased forecast accuracy; (3) faster delivery of forecast products and extension of initial 1-day forecasts to 2 days; (4) augmentation of our basic meteorological and simplified fire-weather forecasts to fire-danger and drought forecasts; (5) additional meteorological forecasts with an alternate mesoscale model (MM5); and (6) the feasibility of using our modeling system in higher-resolution domains and other regions. In this user manual, we provide a general overview of the operational system and the mesoscale models as well as more detailed descriptions of the models. A detailed description of daily operations and a cost analysis is also provided. Evaluations of the models are included, although model evaluation is a continuing process; as potential problems are identified, they can serve as the basis for model improvements. Finally, we include our previously submitted answers to particular PDC questions (Appendix V). All of our initially proposed objectives have basically been met. In fact, a number of useful applications (VOG, air pollution transport) are already utilizing our experimental output and we believe there are a number of other applications that could make use of our routine forecast/analysis products. Still, work remains to be done to further develop this experimental weather, climate, fire danger and drought prediction system.
In short, we would like to be a part of a future PDC team, if at all possible, to further develop and apply the system for the Hawaiian and other Pacific Islands as well as the entire Pacific Basin.

  11. High-Density Liquid-State Machine Circuitry for Time-Series Forecasting.

    PubMed

    Rosselló, Josep L; Alomar, Miquel L; Morro, Antoni; Oliver, Antoni; Canals, Vincent

    2016-08-01

    Spiking neural networks (SNN) are the latest generation of neural networks, designed to mimic the real behavior of biological neurons. Although most research in this area is done through software applications, it is in hardware implementations that the intrinsic parallelism of these computing systems is most efficiently exploited. Liquid state machines (LSM) have arisen as a strategic technique for implementing recurrent SNN designs with a simple learning methodology. In this work, we show a new low-cost methodology to implement high-density LSM by using Boolean gates. The proposed method is based on the use of probabilistic computing concepts to reduce hardware requirements, thus considerably increasing the neuron count per chip. The result is a highly functional system that is applied to high-speed time series forecasting.
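
    The probabilistic computing idea the authors exploit can be sketched in Python: if two values in [0, 1] are encoded as independent random bit-streams, a single AND gate per bit pair computes their product, replacing a costly hardware multiplier. This is a generic stochastic computing illustration, not the paper's circuit.

```python
import random

def to_stream(p, n, rng):
    """Encode probability p as a random bit-stream of length n."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def stochastic_multiply(pa, pb, n=100_000, seed=42):
    """Estimate pa * pb with one AND gate per bit pair of two independent streams."""
    rng = random.Random(seed)
    sa = to_stream(pa, n, rng)
    sb = to_stream(pb, n, rng)
    return sum(a & b for a, b in zip(sa, sb)) / n

est = stochastic_multiply(0.6, 0.5)  # close to 0.6 * 0.5 = 0.30
```

    The accuracy-versus-stream-length trade-off is the price of this scheme: longer bit-streams mean smaller variance but slower computation, which is why hardware implementations run many streams in parallel.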

  12. Quantifying probabilities of eruptions at Mount Etna (Sicily, Italy).

    NASA Astrophysics Data System (ADS)

    Brancato, Alfonso

    2010-05-01

    One of the major goals of modern volcanology is to set up sound risk-based decision-making in land-use planning and emergency management. Volcanic hazard must be managed with reliable estimates of quantitative long- and short-term eruption forecasting, but the large number of observables involved in a volcanic process suggests that a probabilistic approach could be a suitable tool in forecasting. The aim of this work is to quantify the probabilistic estimate of vent location for a suitable lava flow hazard assessment at Mt. Etna volcano, through the application of the code named BET (Marzocchi et al., 2004, 2008). The BET_EF model is based on the event tree philosophy assessed by Newhall and Hoblitt (2002), further developing the concept of vent location, epistemic uncertainties, and a fuzzy approach for monitoring measurements. A Bayesian event tree is a specialized branching graphical representation of events in which individual branches are alternative steps from a general prior event, evolving into increasingly specific subsequent states. The event tree thus attempts to graphically display all relevant possible outcomes of volcanic unrest in progressively higher levels of detail. The procedure is set to estimate an a priori probability distribution based upon theoretical knowledge, to accommodate it by using past data, and to modify it further by using current monitoring data. For long-term forecasting, an a priori model dealing with the present tectonic and volcanic structure of Mt. Etna is considered. The model is mainly based on past vent location and fracture location datasets (the 20th century eruptive history of the volcano). Considering the variation of these datasets through time, and their relationship with the structural setting of the volcano, we are also able to define an a posteriori probability map for the next vent opening.
    For short-term vent opening hazard assessment, monitoring has a leading role, primarily based on seismological and volcanological data, integrated with strain, geochemical, gravimetric and magnetic parameters. In the code, it is necessary to fix an appropriate forecasting time window. On open-conduit volcanoes such as Mt. Etna, a forecast time window of a month (as fixed in other applications worldwide) seems unduly long, because variations in the state of the volcano (a significant variation of a specific monitoring parameter could occur on a time scale shorter than the forecasting time window) are expected on shorter time scales (hours, days or weeks). This leads to setting a week as the forecasting time window, consistent with the number of weeks in which unrest has been experienced. The short-term vent opening hazard assessment will be estimated during an unrest phase; the testing case (the July 2001 eruption) will include all the monitoring parameters collected at Mt. Etna during the six months preceding the eruption. The monitoring role has been assessed by eliciting more than 50 parameters, including seismic activity, ground deformation, geochemistry, gravity and magnetism, distributed across the first three nodes of the procedure. Parameter values describe Mt. Etna's activity and become more detailed through the code, particularly in time units. The methodology allows all assumptions and thresholds to be clearly identified and provides a rational means for their revision if new data or information become available. References Newhall C.G. and Hoblitt R.P.; 2002: Constructing event trees for volcanic crises, Bull. Volcanol., 64, 3-20, doi: 10.1007/s0044500100173. Marzocchi W., Sandri L., Gasparini P., Newhall C. and Boschi E.; 2004: Quantifying probabilities of volcanic events: The example of volcanic hazard at Mount Vesuvius, J. Geophys. Res., 109, B11201, doi:10.1029/2004JB00315U. Marzocchi W., Sandri, L. 
and Selva, J.; 2008: BET_EF: a probabilistic tool for long- and short-term eruption forecasting, Bull. Volcanol., 70, 623 - 632, doi: 10.1007/s00445-007-0157-y.

  13. Forecast horizon of multi-item dynamic lot size model with perishable inventory.

    PubMed

    Jing, Fuying; Lan, Zirui

    2017-01-01

    This paper studies a multi-item dynamic lot size problem for perishable products where stock deterioration rates and inventory costs are age-dependent. We explore structural properties of an optimal solution under two cost structures and develop a dynamic programming algorithm to solve the problem in polynomial time when the number of products is fixed. We establish forecast horizon results that can help the operations manager to decide the precise forecast horizon in a rolling decision-making process. Finally, based on a detailed test bed of instances, we obtain useful managerial insights on the impact of deterioration rate and lifetime of products on the length of the forecast horizon.
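
    As a point of reference, the classical single-item, non-perishable dynamic lot size problem (Wagner-Whitin) is solvable by a short dynamic program; the paper extends this kind of recursion to multiple perishable items with age-dependent costs. The demands and costs below are arbitrary example values.

```python
def wagner_whitin(demand, setup_cost, hold_cost):
    """Minimum total cost when an order placed in period j covers demand for
    periods j..t-1; best[t] is the cheapest way to satisfy periods 0..t-1."""
    n = len(demand)
    best = [0.0] + [float("inf")] * n
    for t in range(1, n + 1):
        for j in range(t):  # j = period of the last order
            holding = sum(hold_cost * (k - j) * demand[k] for k in range(j, t))
            best[t] = min(best[t], best[j] + setup_cost + holding)
    return best[n]

# Three periods of demand: order in period 0 (covering periods 0-1)
# and again in period 2 is cheapest here: 50 + 10 + 50 = 110
cost = wagner_whitin([20, 10, 30], setup_cost=50.0, hold_cost=1.0)
```

    Age-dependent deterioration adds a per-period loss to the holding term and caps how far ahead an order can usefully cover, which is exactly what shortens or lengthens the forecast horizon studied in the paper.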

  14. Forecast horizon of multi-item dynamic lot size model with perishable inventory

    PubMed Central

    Jing, Fuying

    2017-01-01

    This paper studies a multi-item dynamic lot size problem for perishable products where stock deterioration rates and inventory costs are age-dependent. We explore structural properties of an optimal solution under two cost structures and develop a dynamic programming algorithm to solve the problem in polynomial time when the number of products is fixed. We establish forecast horizon results that can help the operations manager to decide the precise forecast horizon in a rolling decision-making process. Finally, based on a detailed test bed of instances, we obtain useful managerial insights on the impact of deterioration rate and lifetime of products on the length of the forecast horizon. PMID:29125856

  15. US industrial battery forecast

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollingsworth, V. III

    1996-09-01

    Last year was a strong year for the US industrial battery market, with growth in all segments. Sales of industrial batteries in North America grew 19.2% in 1995, exceeding the previous year's forecasted growth rate of 11.6%. The results of the recently completed BCI Membership Survey forecast 1996 sales to be up 10.5%, and to continue to increase at a 10.4% compound annual rate through the year 2000. This year's survey includes further detail on the stationary battery market with the inclusion of less than 25 Ampere-hour batteries for the first time.

  16. Application of empirical mode decomposition with local linear quantile regression in financial time series forecasting.

    PubMed

    Jaber, Abobaker M; Ismail, Mohd Tahir; Altaher, Alsaidi M

    2014-01-01

    This paper mainly forecasts the daily closing price of stock markets. We propose a two-stage technique that combines empirical mode decomposition (EMD) with the nonparametric method of local linear quantile regression (LLQ). We use the proposed technique, EMD-LLQ, to forecast two stock index time series. Detailed experiments are implemented for the proposed method, in which the EMD-LLQ, EMD, and Holt-Winters methods are compared. The proposed EMD-LLQ model is determined to be superior to the EMD and Holt-Winters methods in predicting stock closing prices.

  17. Global Impacts and Regional Actions: Preparing for the 1997-98 El Niño.

    NASA Astrophysics Data System (ADS)

    Buizer, James L.; Foster, Josh; Lund, David

    2000-09-01

    It has been estimated that severe El Niño-related flooding and droughts in Africa, Latin America, North America, and Southeast Asia resulted in more than 22 000 lives lost and in excess of $36 billion in damages during 1997-98. As one of the most severe events this century, the 1997-98 El Niño was unique not only in terms of physical magnitude, but also in terms of human response. This response was made possible by recent advances in climate-observing and forecasting systems, creation and dissemination of forecast information by institutions such as the International Research Institute for Climate Prediction and NOAA's Climate Prediction Center, and individuals in climate-sensitive sectors willing to act on forecast information by incorporating it into their decision-making. The supporting link between the forecasts and their practical application was a product of efforts by several national and international organizations, and a primary focus of the United States National Oceanic and Atmospheric Administration Office of Global Programs (NOAA/OGP).NOAA/OGP over the last decade has supported pilot projects in Latin America, the Caribbean, the South Pacific, Southeast Asia, and Africa to improve transfer of forecast information to climate sensitive sectors, study linkages between climate and human health, and distribute climate information products in certain areas. Working with domestic and international partners, NOAA/OGP helped organize a total of 11 Climate Outlook Fora around the world during the 1997-98 El Niño. At each Outlook Forum, climatologists and meteorologists created regional, consensus-based, seasonal precipitation forecasts and representatives from climate-sensitive sectors discussed options for applying forecast information. 
    Additional ongoing activities during 1997-98 included research programs focused on the social and economic impacts of climate change and the regional manifestations of global-scale climate variations and their effect on decision-making in climate-sensitive sectors in the United States. The overall intent of NOAA/OGP's activities was to make experimental forecast information broadly available to potential users, and to foster a learning process on how seasonal-to-interannual forecasts could be applied in sectors susceptible to climate variability. This process allowed users to explore the capabilities and limitations of climate forecasts currently available, and forecast producers to receive feedback on the utility of their products. Through activities in which NOAA/OGP and its partners were involved, it became clear that further application of forecast information will be aided by improved forecast accuracy and detail, creation of common validation techniques, continued training in forecast generation and application, alternate methods for presenting forecast information, and a systematic strategy for creation and dissemination of forecast products.

  18. How Hydroclimate Influences the Effectiveness of Particle Filter Data Assimilation of Streamflow in Initializing Short- to Medium-range Streamflow Forecasts

    NASA Astrophysics Data System (ADS)

    Clark, E.; Wood, A.; Nijssen, B.; Clark, M. P.

    2017-12-01

    Short- to medium-range (1- to 7-day) streamflow forecasts are important for flood control operations and for issuing potentially life-saving flood warnings. In the U.S., the National Weather Service River Forecast Centers (RFCs) issue such forecasts in real time, depending heavily on a manual data assimilation (DA) approach. Forecasters adjust model inputs, states, parameters and outputs based on experience and consideration of a range of supporting real-time information. Achieving high-quality forecasts from new automated, centralized forecast systems will depend critically on the adequacy of automated DA approaches to make analogous corrections to the forecasting system. Such approaches would further enable systematic evaluation of real-time flood forecasting methods and strategies. Toward this goal, we have implemented a real-time Sequential Importance Resampling particle filter (SIR-PF) approach to assimilate observed streamflow into simulated initial hydrologic conditions (states) for initializing ensemble flood forecasts. Assimilating streamflow alone in SIR-PF improves simulated streamflow and soil moisture during the model spin-up period prior to a forecast, with consequent benefits for forecasts. Nevertheless, it only consistently limits error in simulated snow water equivalent during the snowmelt season and in basins where precipitation falls primarily as snow. We examine how the simulated initial conditions with and without SIR-PF propagate into 1- to 7-day ensemble streamflow forecasts. Forecasts are evaluated in terms of reliability and skill over a 10-year period from 2005 to 2015. The focus of this analysis is on how interactions between hydroclimate and SIR-PF performance impact forecast skill. To this end, we examine forecasts for 5 hydroclimatically diverse basins in the western U.S. Some of these basins receive most of their precipitation as snow, others as rain. 
Some freeze throughout the mid-winter while others experience significant mid-winter melt events. We describe the methodology and present seasonal and inter-basin variations in DA-enhanced forecast skill.
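
    One assimilation step of a sequential importance resampling particle filter like SIR-PF can be sketched as follows: particles (candidate model states) are weighted by the likelihood of the observed streamflow and then resampled in proportion to those weights. The Gaussian observation error and the numbers below are illustrative assumptions, not the study's configuration.

```python
import math
import random

def resample(particles, weights, rng):
    """Multinomial resampling: draw particles in proportion to their weights."""
    total = sum(weights)
    probs = [w / total for w in weights]
    return rng.choices(particles, weights=probs, k=len(particles))

def pf_update(particles, obs, obs_sigma, rng):
    """Weight each particle by a Gaussian likelihood of the observation, then resample."""
    weights = [math.exp(-0.5 * ((p - obs) / obs_sigma) ** 2) for p in particles]
    return resample(particles, weights, rng)

rng = random.Random(0)
particles = [rng.uniform(0.0, 100.0) for _ in range(500)]  # prior streamflow states
posterior = pf_update(particles, obs=42.0, obs_sigma=5.0, rng=rng)
mean = sum(posterior) / len(posterior)
```

    After resampling, the surviving states concentrate near the observation; in a hydrologic model those corrected states (soil moisture, snow, storages) are what initialize the next ensemble forecast.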

  19. Economic Models for Projecting Industrial Capacity for Defense Production: A Review

    DTIC Science & Technology

    1983-02-01

    macroeconomic forecast to establish the level of civilian final demand; all use the DoD Bridge Table to allocate budget category outlays to industries. Civilian...output table.’ 3. Macroeconomic Assumptions and the Prediction of Final Demand All input-output models require as a starting point a prediction of final... macroeconomic fore- cast of GNP and its components and (2) a methodology to transform these forecast values of consumption, investment, exports, etc. into

  20. [A method for forecasting the seasonal dynamic of malaria in the municipalities of Colombia].

    PubMed

    Velásquez, Javier Oswaldo Rodríguez

    2010-03-01

    To develop a methodology for forecasting the seasonal dynamic of malaria outbreaks in the municipalities of Colombia, epidemiologic ranges were defined by multiples of 50 cases for the six municipalities with the highest incidence, 25 cases for the municipalities that ranked 10th and 11th by incidence, 10 for the municipality that ranked 193rd, and 5 for the municipality that ranked 402nd. The specific probability values for each epidemiologic range appearing in each municipality, as well as the S/k value--the ratio between entropy (S) and the Boltzmann constant (k)--were calculated for each three-week set, along with the differences in this ratio divided by the consecutive sets of weeks. These mathematical ratios were used to determine the values for forecasting the case dynamic, which were compared with the actual epidemiologic data from the period 2003-2007. The probability of the epidemiologic ranges appearing ranged from 0.019 to 1.00, while the differences in the S/k ratio between the sets of consecutive weeks ranged from 0.23 to 0.29. Three ratios were established to determine whether the dynamic corresponded to an outbreak. These ratios were corroborated with real epidemiological data from 810 Colombian municipalities. This methodology allows us to forecast the malaria case dynamic and outbreaks in the municipalities of Colombia and can be used in planning interventions and public health policies.
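
    The S/k ratio used in this method is the dimensionless Shannon/Boltzmann entropy of the range probabilities. A minimal sketch, with made-up probabilities rather than the Colombian surveillance data:

```python
import math

def entropy_over_k(probabilities):
    """S/k = -sum(p * ln p) over the occupied epidemiologic ranges."""
    return -sum(p * math.log(p) for p in probabilities if p > 0)

# Hypothetical range probabilities for two consecutive three-week sets
s_week1 = entropy_over_k([0.5, 0.3, 0.2])
s_week2 = entropy_over_k([0.7, 0.2, 0.1])
delta = s_week1 - s_week2  # the week-to-week difference the method tracks
```

    In the methodology above, such differences between consecutive sets of weeks are the quantities compared against empirically derived thresholds to flag an approaching outbreak.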

  1. Stress-based aftershock forecasts made within 24 h post-mainshock: Expected north San Francisco Bay area seismicity changes after the 2014 M = 6.0 West Napa earthquake

    USGS Publications Warehouse

    Parsons, Thomas E.; Segou, Margaret; Sevilgen, Volkan; Milner, Kevin; Field, Edward; Toda, Shinji; Stein, Ross S.

    2014-01-01

    We calculate stress changes resulting from the M = 6.0 West Napa earthquake on north San Francisco Bay area faults. The earthquake ruptured within a series of long faults that pose significant hazard to the Bay area, and we are thus concerned with potential increases in the probability of a large earthquake through stress transfer. We conduct this exercise as a prospective test because the skill of stress-based aftershock forecasting methodology is inconclusive. We apply three methods: (1) generalized mapping of regional Coulomb stress change, (2) stress changes resolved on Uniform California Earthquake Rupture Forecast faults, and (3) a mapped rate/state aftershock forecast. All calculations were completed within 24 h after the main shock and were made without benefit of known aftershocks, which will be used to evaluate the prospective forecast. All methods suggest that we should expect heightened seismicity on parts of the southern Rodgers Creek, northern Hayward, and Green Valley faults.

  2. Stress-based aftershock forecasts made within 24 h post-mainshock: Expected north San Francisco Bay area seismicity changes after the 2014 M = 6.0 West Napa earthquake

    NASA Astrophysics Data System (ADS)

    Parsons, Tom; Segou, Margaret; Sevilgen, Volkan; Milner, Kevin; Field, Edward; Toda, Shinji; Stein, Ross S.

    2014-12-01

    We calculate stress changes resulting from the M = 6.0 West Napa earthquake on north San Francisco Bay area faults. The earthquake ruptured within a series of long faults that pose significant hazard to the Bay area, and we are thus concerned with potential increases in the probability of a large earthquake through stress transfer. We conduct this exercise as a prospective test because the skill of stress-based aftershock forecasting methodology is inconclusive. We apply three methods: (1) generalized mapping of regional Coulomb stress change, (2) stress changes resolved on Uniform California Earthquake Rupture Forecast faults, and (3) a mapped rate/state aftershock forecast. All calculations were completed within 24 h after the main shock and were made without benefit of known aftershocks, which will be used to evaluate the prospective forecast. All methods suggest that we should expect heightened seismicity on parts of the southern Rodgers Creek, northern Hayward, and Green Valley faults.
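
    The quantity behind all three methods is the Coulomb failure stress change resolved on a receiver fault. A minimal sketch of the standard formula, with an assumed effective friction coefficient and illustrative stress values (not the study's calculations):

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """ΔCFS = Δτ + μ' Δσn. Positive values bring a receiver fault closer to failure.
    d_shear:  shear stress change in the fault's slip direction (MPa)
    d_normal: normal stress change, unclamping (tension) positive (MPa)
    mu_eff:   effective friction coefficient (assumed value here)"""
    return d_shear + mu_eff * d_normal

dcfs = coulomb_stress_change(d_shear=0.05, d_normal=0.02)  # MPa
# A commonly cited triggering threshold is around 0.01 MPa (0.1 bar)
triggering = dcfs >= 0.01
```

    Mapping this scalar over a grid, or resolving it onto the geometry and rake of each catalogued fault, yields methods (1) and (2) above; method (3) feeds the stress change into a rate/state seismicity model.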

  3. Forecasting Temporal Dynamics of Cutaneous Leishmaniasis in Northeast Brazil

    PubMed Central

    Lewnard, Joseph A.; Jirmanus, Lara; Júnior, Nivison Nery; Machado, Paulo R.; Glesby, Marshall J.; Ko, Albert I.; Carvalho, Edgar M.; Schriefer, Albert; Weinberger, Daniel M.

    2014-01-01

    Introduction Cutaneous leishmaniasis (CL) is a vector-borne disease of increasing importance in northeastern Brazil. It is known that sandflies, which spread the causative parasites, have weather-dependent population dynamics. Routinely-gathered weather data may be useful for anticipating disease risk and planning interventions. Methodology/Principal Findings We fit time series models using meteorological covariates to predict CL cases in a rural region of Bahía, Brazil from 1994 to 2004. We used the models to forecast CL cases for the period 2005 to 2008. Models accounting for meteorological predictors reduced mean squared error in one, two, and three month-ahead forecasts by up to 16% relative to forecasts from a null model accounting only for temporal autocorrelation. Significance These outcomes suggest CL risk in northeastern Brazil might be partially dependent on weather. Responses to forecasted CL epidemics may include bolstering clinical capacity and disease surveillance in at-risk areas. Ecological mechanisms by which weather influences CL risk merit future research attention as public health intervention targets. PMID:25356734

  4. Adjusting Wavelet-based Multiresolution Analysis Boundary Conditions for Robust Long-term Streamflow Forecasting Model

    NASA Astrophysics Data System (ADS)

    Maslova, I.; Ticlavilca, A. M.; McKee, M.

    2012-12-01

    There has been an increased interest in wavelet-based streamflow forecasting models in recent years. Often overlooked in this approach are the circularity assumptions of the wavelet transform. We propose a novel technique for minimizing the wavelet decomposition boundary condition effect to produce long-term, up to 12 months ahead, forecasts of streamflow. A simulation study is performed to evaluate the effects of different wavelet boundary rules using synthetic and real streamflow data. A hybrid wavelet-multivariate relevance vector machine model is developed for forecasting the streamflow in real-time for Yellowstone River, Uinta Basin, Utah, USA. The inputs of the model utilize only the past monthly streamflow records. They are decomposed into components formulated in terms of wavelet multiresolution analysis. It is shown that the model accuracy can be increased by using the wavelet boundary rule introduced in this study. This long-term streamflow modeling and forecasting methodology would enable better decision-making and managing water availability risk.
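The boundary-condition issue can be seen even in a one-level Haar transform: when the series length forces padding, different extension rules change the coefficients at the edge. A pure-numpy illustration follows; the two padding rules shown are generic textbook choices, not the specific rule proposed in the paper:

```python
import numpy as np

def haar_level1(x, boundary="periodic"):
    """One-level Haar decomposition with an explicit boundary rule.

    An odd-length series is padded to even length before pairing
    samples, so the choice of extension only affects the right edge.
    """
    x = np.asarray(x, dtype=float)
    if len(x) % 2 == 1:
        if boundary == "periodic":
            x = np.append(x, x[0])      # wrap around to the start
        elif boundary == "symmetric":
            x = np.append(x, x[-1])     # reflect the last sample
        else:
            raise ValueError(boundary)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass (trend)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass (fluctuation)
    return approx, detail

# Odd-length synthetic "streamflow": the boundary rule changes only the
# coefficients formed from the final, padded pair.
flow = np.array([3.0, 4.0, 6.0, 5.0, 9.0])
a_per, d_per = haar_level1(flow, "periodic")
a_sym, d_sym = haar_level1(flow, "symmetric")
```

Interior coefficients agree across rules; only the edge coefficient differs, which is exactly the effect the paper's boundary-rule selection targets.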

  5. Modeling and forecasting rainfall patterns of southwest monsoons in North-East India as a SARIMA process

    NASA Astrophysics Data System (ADS)

    Narasimha Murthy, K. V.; Saravana, R.; Vijaya Kumar, K.

    2018-02-01

    Weather forecasting is an important issue in the field of meteorology all over the world. The pattern and amount of rainfall are the essential factors that affect agricultural systems. India experiences the Southwest monsoon season for four months, from June to September. The present paper describes an empirical study for modeling and forecasting the time series of Southwest monsoon rainfall patterns in North-East India. The Box-Jenkins Seasonal Autoregressive Integrated Moving Average (SARIMA) methodology has been adopted for model identification, diagnostic checking and forecasting for this region. The study has shown that the SARIMA (0, 1, 1)(1, 0, 1)₄ model is appropriate for analyzing and forecasting future rainfall patterns. The Analysis of Means (ANOM) is a useful alternative to the analysis of variance (ANOVA) for comparing groups of treatments to study the variations and critical comparisons of rainfall patterns in different months of the season.
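The fitted SARIMA (0, 1, 1)(1, 0, 1)₄ structure can be written as (1 - ΦB⁴)(1 - B)y_t = (1 + θB)(1 + ΘB⁴)ε_t. A minimal numpy sketch of simulating such a process shows how the seasonal and nonseasonal terms combine; the coefficient values here are hypothetical, since the paper's estimates are not reproduced in this record:

```python
import numpy as np

def simulate_sarima_011_101_4(n, theta=0.4, Theta=0.3, Phi=0.6, seed=0):
    """Simulate (1 - Phi*B^4)(1 - B) y_t = (1 + theta*B)(1 + Theta*B^4) eps_t.

    Working with w_t = y_t - y_{t-1} (the nonseasonal difference):
    w_t = Phi*w_{t-4} + eps_t + theta*eps_{t-1}
          + Theta*eps_{t-4} + theta*Theta*eps_{t-5}
    """
    rng = np.random.default_rng(seed)
    eps = rng.normal(size=n + 5)
    w = np.zeros(n + 5)
    for t in range(5, n + 5):
        w[t] = (Phi * w[t - 4] + eps[t] + theta * eps[t - 1]
                + Theta * eps[t - 4] + theta * Theta * eps[t - 5])
    return np.cumsum(w[5:])        # integrate differences back to y_t

rain = simulate_sarima_011_101_4(200)
```

In practice such a model would be fitted with a library such as `statsmodels.tsa.statespace.SARIMAX(y, order=(0, 1, 1), seasonal_order=(1, 0, 1, 4))`; the hand-rolled simulator above only makes the lag structure explicit.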

  6. Forecast of the United States telecommunications demand through the year 2000

    NASA Astrophysics Data System (ADS)

    Kratochvil, D.

    1984-01-01

    The telecommunications forecasts considered in the present investigation were developed in studies conducted by Kratochvil et al. (1983). The overall purpose of these studies was to forecast the potential U.S. domestic telecommunications demand for satellite-provided fixed communications voice, data, and video services through the year 2000, so that this information on service demand would be available to aid in NASA communications program planning. Aspects of forecasting methodology are discussed, taking into account forecasting activity flow, specific services and selected techniques, and an event/trend cross-impact model. Events, or market determinant factors, which are very likely to occur by 1995 and 2005, are presented in a table. It is found that the demand for telecommunications in general, and for satellite telecommunications in particular, will increase significantly between now and the year 2000. The required satellite capacity will surpass both the potential and actual capacities in the early 1990s, indicating a need for Ka-band at that time.

  7. Hardware-software complex of informing passengers of forecasted route transport arrival at stop

    NASA Astrophysics Data System (ADS)

    Pogrebnoy, V. Yu; Pushkarev, M. I.; Fadeev, A. S.

    2017-02-01

    The paper presents a hardware-software complex for informing passengers of the forecast arrival of route transport. A client-server architecture of the forecasting information system is presented, and an electronic information board prototype is described. The scheme of information transfer and processing is illustrated and described, from the receipt of navigation telemetry data from a transport vehicle to the display of the forecast arrival time of public transport at the stop on the electronic board. Methods and algorithms for determining a transport vehicle's current location in the city route network are considered in detail. The proposed model for forecasting a transport vehicle's arrival time at a stop is described. The result is applied in Tomsk for forecasting and displaying arrival-time information at stops.

  8. An expert system-based approach to prediction of year-to-year climatic variations in the North Atlantic region

    NASA Astrophysics Data System (ADS)

    Rodionov, S. N.; Martin, J. H.

    1999-07-01

    A novel approach to climate forecasting on an interannual time scale is described. The approach is based on concepts and techniques from artificial intelligence and expert systems. The suitability of this approach to climate diagnostics and forecasting problems and its advantages compared with conventional forecasting techniques are discussed. The article highlights some practical aspects of the development of climatic expert systems (CESs) and describes an implementation of such a system for the North Atlantic (CESNA). Particular attention is paid to the content of CESNA's knowledge base and those conditions that make climatic forecasts one to several years in advance possible. A detailed evaluation of the quality of the experimental real-time forecasts made by CESNA for the winters of 1995-1996, 1996-1997 and 1997-1998 is presented.

  9. Emerging technologies for the changing global market

    NASA Technical Reports Server (NTRS)

    Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt

    1993-01-01

    This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi-quantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration in current NASA propulsion systems.

  10. Data Mining for Financial Applications

    NASA Astrophysics Data System (ADS)

    Kovalerchuk, Boris; Vityaev, Evgenii

    This chapter describes Data Mining in finance by discussing financial tasks, specifics of methodologies and techniques in this Data Mining area. It includes time dependence, data selection, forecast horizon, measures of success, quality of patterns, hypothesis evaluation, problem ID, method profile, attribute-based and relational methodologies. The second part of the chapter discusses Data Mining models and practice in finance. It covers use of neural networks in portfolio management, design of interpretable trading rules and discovering money laundering schemes using decision rules and relational Data Mining methodology.

  11. Vulnerability Assessment Using LIDAR Data in Silang-Sta Rosa Subwatershed, Philippines

    NASA Astrophysics Data System (ADS)

    Bragais, M. A.; Magcale-Macandog, D. B.; Arizapa, J. L.; Manalo, K. M.

    2016-10-01

    Silang-Sta. Rosa Subwatershed is experiencing rapid urbanization. Its downstream area is already urbanized, and development is moving fast upstream. With the rapid conversion of pervious to impervious areas and the increased frequency of intense rainfall events, the downstream portion of the watershed is at risk of flood hazard. The widely used freeware HEC-RAS (Hydrologic Engineering Center - River Analysis System) model was used to implement a 2D unsteady flow analysis and develop a flood hazard map. The LiDAR-derived digital elevation model (DEM) with 1-m resolution provided the detailed terrain that is vital for producing a reliable flood extent map for use in an early warning system. With detailed information from the simulation, such as the areas to be flooded and the predicted depth and duration of flooding, specific flood forecasting and mitigation plans can now be provided even at the community level. The methodology of combining 2D unsteady flow modelling with a high-resolution DEM can be replicated in neighbouring watersheds, especially those that are not yet urbanized, so that their development can be guided to be flood-hazard resilient. LGUs all over the country will benefit from having a high-resolution flood hazard map.

  12. Airfreight forecasting methodology and results

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A series of econometric behavioral equations was developed to explain and forecast the evolution of airfreight traffic demand for the total U.S. domestic airfreight system, the total U.S. international airfreight system, and the total scheduled international cargo traffic carried by the top 44 foreign airlines. The basic explanatory variables used in these macromodels were the real gross national products of the countries involved and a measure of relative transportation costs. The results of the econometric analysis reveal that the models explain more than 99 percent of the historical evolution of freight traffic. The long term traffic forecasts generated with these models are based on scenarios of the likely economic outlook in the United States and 31 major foreign countries.
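Econometric demand equations of the kind described are commonly specified as log-log regressions, so the coefficients are elasticities with respect to GNP and relative transportation cost. The numpy sketch below illustrates that specification with invented elasticities; it does not reproduce the paper's actual model or estimates:

```python
import numpy as np

# Hypothetical elasticities: demand rises with GNP, falls with cost
a_true, b_true, c_true = 1.0, 1.8, -0.7

rng = np.random.default_rng(1)
gnp = rng.uniform(1.0, 5.0, size=30)     # index of real GNP
cost = rng.uniform(0.5, 2.0, size=30)    # relative transport cost index
traffic = np.exp(a_true + b_true * np.log(gnp) + c_true * np.log(cost))

# OLS in log space: ln(traffic) = a + b*ln(GNP) + c*ln(cost)
X = np.column_stack([np.ones_like(gnp), np.log(gnp), np.log(cost)])
coef, *_ = np.linalg.lstsq(X, np.log(traffic), rcond=None)
a_hat, b_hat, c_hat = coef
```

With noiseless synthetic data the regression recovers the elasticities exactly, which is why log-log forms can "explain more than 99 percent" of smooth historical series.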

  13. The FASTER Approach: A New Tool for Calculating Real-Time Tsunami Flood Hazards

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Cross, A.; Johnson, L.; Miller, K.; Nicolini, T.; Whitmore, P.

    2014-12-01

    In the aftermath of the 2010 Chile and 2011 Japan tsunamis that struck the California coastline, emergency managers requested that the state tsunami program provide more detailed information about the flood potential of distant-source tsunamis well ahead of their arrival time. The main issue is that existing tsunami evacuation plans call for evacuation of the predetermined "worst-case" tsunami evacuation zone (typically at a 30- to 50-foot elevation) during any "Warning"-level event; the alternative is to not call an evacuation at all. A solution providing more detailed information for secondary evacuation zones has been the development of tsunami evacuation "playbooks" that plan for tsunami scenarios of various sizes and source locations. To determine a recommended level of evacuation during a distant-source tsunami, an analytical tool called the "FASTER" approach has been developed, an acronym for the factors that influence the tsunami flood hazard for a community: Forecast Amplitude, Storm, Tides, Error in forecast, and Run-up potential. Within the first couple of hours after a tsunami is generated, the National Tsunami Warning Center provides tsunami forecast amplitudes and arrival times for approximately 60 coastal locations in California. At the same time, the regional NOAA Weather Forecast Offices in the state calculate the forecast coastal storm and tidal conditions that will influence tsunami flooding. To add conservatism to the calculated tsunami flood potential, an error factor of 30% of the forecast amplitude is included, based on observed forecast errors during recent events, along with a site-specific run-up factor calculated from the existing state tsunami modeling database. The factors are added together into a cumulative FASTER flood-potential value for the first five hours of tsunami activity and used to select the appropriate tsunami-phase evacuation "playbook," which is provided to each coastal community shortly after the forecast is issued.
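The additive combination of FASTER factors can be sketched in a few lines. This is a hedged illustration only: the treatment of run-up as a multiplier on the forecast amplitude, and all example numbers, are assumptions, not the operational tool's exact formula:

```python
def faster_flood_potential(forecast_amp_m, storm_m, tide_m, runup_factor,
                           error_frac=0.30):
    """Sketch of a cumulative FASTER flood-potential value.

    forecast_amp_m : tsunami forecast amplitude (m)
    storm_m, tide_m: forecast storm and tidal water-level contributions (m)
    runup_factor   : hypothetical site-specific multiplier on the amplitude
    error_frac     : conservatism for forecast error (30% per the abstract)
    """
    error_m = error_frac * forecast_amp_m   # forecast-error allowance
    runup_m = runup_factor * forecast_amp_m # site-specific run-up allowance
    return forecast_amp_m + storm_m + tide_m + error_m + runup_m

# Illustrative inputs: 1.2 m forecast amplitude, 0.3 m storm, 0.9 m tide
total = faster_flood_potential(1.2, 0.3, 0.9, runup_factor=0.5)
```

In an operational setting this value would be computed hourly over the first five hours and compared against playbook evacuation thresholds.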

  14. High-Resolution Hydrological Sub-Seasonal Forecasting for Water Resources Management Over Europe

    NASA Astrophysics Data System (ADS)

    Wood, E. F.; Wanders, N.; Pan, M.; Sheffield, J.; Samaniego, L. E.; Thober, S.; Kumar, R.; Prudhomme, C.; Houghton-Carr, H.

    2017-12-01

    For decision-making at the sub-seasonal and seasonal time scale, hydrological forecasts with a high temporal and spatial resolution are required by water managers. So far such forecasts have been unavailable due to 1) lack of availability of meteorological seasonal forecasts, 2) coarse temporal resolution of meteorological seasonal forecasts, requiring temporal downscaling, 3) lack of consistency between observations and seasonal forecasts, requiring bias-correction. The EDgE (End-to-end Demonstrator for improved decision making in the water sector in Europe) project commissioned by the ECMWF (C3S) created a unique dataset of hydrological seasonal forecasts derived from four global climate models (CanCM4, FLOR-B01, ECMF, LFPW) in combination with four global hydrological models (PCR-GLOBWB, VIC, mHM, Noah-MP), resulting in 208 forecasts for any given day. The forecasts provide a daily temporal and 5-km spatial resolution, and are bias corrected against E-OBS meteorological observations. The forecasts are communicated to stakeholders via Sectoral Climate Impact Indicators (SCIIs), created in collaboration with the end-user community of the EDgE project (e.g. the percentage of ensemble realizations above the 10th percentile of monthly river flow, or below the 90th). Results show skillful forecasts for discharge from 3 months to 6 months (the latter for northern Europe, due to snow); for soil moisture up to three months, due to precipitation forecast skill and short initial-condition memory; and for groundwater greater than 6 months (lowest skill in western Europe). The SCIIs are effective in communicating both forecast skill and uncertainty. Overall the new system provides an unprecedented ensemble for seasonal forecasts with significant skill over Europe to support water management. The consistency in both the GCM forecasts and the LSM parameterization ensures a stable and reliable forecast framework and methodology, even if additional GCMs or LSMs are added in the future.

  15. Streamflow forecasts from WRF precipitation for flood early warning in mountain tropical areas

    NASA Astrophysics Data System (ADS)

    Rogelis, María Carolina; Werner, Micha

    2018-02-01

    Numerical weather prediction (NWP) models are fundamental to extend forecast lead times beyond the concentration time of a watershed. Particularly for flash flood forecasting in tropical mountainous watersheds, forecast precipitation is required to provide timely warnings. This paper aims to assess the potential of NWP for flood early warning purposes, and the possible improvement that bias correction can provide, in a tropical mountainous area. The paper focuses on the comparison of streamflows obtained from the post-processed precipitation forecasts, particularly the comparison of ensemble forecasts and their potential in providing skilful flood forecasts. The Weather Research and Forecasting (WRF) model is used to produce precipitation forecasts that are post-processed and used to drive a hydrologic model. Discharge forecasts obtained from the hydrological model are used to assess the skill of the WRF model. The results show that post-processed WRF precipitation adds value to the flood early warning system when compared to zero-precipitation forecasts, although the precipitation forecast used in this analysis showed little added value when compared to climatology. However, the reduction of biases obtained from the post-processed ensembles show the potential of this method and model to provide usable precipitation forecasts in tropical mountainous watersheds. The need for more detailed evaluation of the WRF model in the study area is highlighted, particularly the identification of the most suitable parameterisation, due to the inability of the model to adequately represent the convective precipitation found in the study area.
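The record does not spell out the post-processing method used; a common bias-correction technique for NWP precipitation is empirical quantile mapping, sketched below in pure numpy. The synthetic climatologies and the assumed dry-bias factor are illustrative, not the paper's data:

```python
import numpy as np

def quantile_map(forecast, obs_clim, fcst_clim):
    """Map each forecast value to the observed-climatology value at the
    same empirical quantile (empirical quantile mapping)."""
    fcst_sorted = np.sort(fcst_clim)
    obs_sorted = np.sort(obs_clim)
    # Quantile of each forecast value within the forecast climatology
    q = np.searchsorted(fcst_sorted, forecast, side="right") / len(fcst_sorted)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(obs_sorted, q)

rng = np.random.default_rng(2)
obs_clim = rng.gamma(2.0, 3.0, size=1000)   # "observed" rainfall climatology
fcst_clim = 0.5 * obs_clim                  # model output with a dry bias
corrected = quantile_map(fcst_clim, obs_clim, fcst_clim)
```

After mapping, the corrected forecasts share the observed distribution, removing the systematic bias while preserving the rank ordering of wet and dry events.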

  16. Meteorological Support in Scientific Ballooning

    NASA Technical Reports Server (NTRS)

    Schwantes, Chris; Mullenax, Robert

    2016-01-01

    The weather affects every portion of a scientific balloon mission, from payload integration to launch, float, and impact and recovery. Forecasting for these missions is very specialized and unique in many aspects. CSBF Meteorology incorporates data from NWS/NCEP, as well as several international meteorological organizations, and NCAR. This presentation will detail the tools used and specifics on how CSBF Meteorology produces its forecasts.

  17. Meteorological Support in Scientific Ballooning

    NASA Technical Reports Server (NTRS)

    Schwantes, Chris; Mullenax, Robert

    2017-01-01

    The weather affects every portion of a scientific balloon mission, from payload integration to launch, float, and impact and recovery. Forecasting for these missions is very specialized and unique in many aspects. CSBF Meteorology incorporates data from NWS/NCEP, as well as several international meteorological organizations, and NCAR. This presentation will detail the tools used and specifics on how CSBF Meteorology produces its forecasts.

  18. Estimates of emergency operating capacity in U.S. manufacturing industries: 1994--2005

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belzer, D.B.

    1997-02-01

    To develop integrated policies for mobilization preparedness, planners require estimates and projections of available productive capacity during national emergency conditions. This report develops projections of national emergency operating capacity (EOC) for 458 US manufacturing industries at the 4-digit Standard Industrial Classification (SIC) level. These measures are intended for use in planning models that are designed to predict the demands for detailed industry sectors that would occur under conditions such as a military mobilization or a major national disaster. This report is part of an ongoing series of studies prepared by the Pacific Northwest National Laboratory to support mobilization planning studies of the Federal Emergency Management Agency/US Department of Defense (FEMA/DOD). Earlier sets of EOC estimates were developed in 1985 and 1991. This study presents estimates of EOC through 2005. As in the 1991 study, projections of capacity were based upon extrapolations of equipment capital stocks. The methodology uses time series regression models based on industry data to obtain a response function of industry capital stock to levels of industrial output. The distributed lag coefficients of these response functions are then used with projected outputs to extrapolate the 1994 level of EOC. Projections of industrial outputs were taken from the intermediate-term forecast of the US economy prepared by INFORUM (Interindustry Forecasting Model, University of Maryland) in the spring of 1996.

  19. Type- and Subtype-Specific Influenza Forecast.

    PubMed

    Kandula, Sasikiran; Yang, Wan; Shaman, Jeffrey

    2017-03-01

    Prediction of the growth and decline of infectious disease incidence has advanced considerably in recent years. As these forecasts improve, their public health utility should increase, particularly as interventions are developed that make explicit use of forecast information. It is the task of the research community to increase the content and improve the accuracy of these infectious disease predictions. Presently, operational real-time forecasts of total influenza incidence are produced at the municipal and state level in the United States. These forecasts are generated using ensemble simulations depicting local influenza transmission dynamics, which have been optimized prior to forecast with observations of influenza incidence and data assimilation methods. Here, we explore whether forecasts targeted to predict influenza by type and subtype during 2003-2015 in the United States were more or less accurate than forecasts targeted to predict total influenza incidence. We found that forecasts separated by type/subtype generally produced more accurate predictions and, when summed, produced more accurate predictions of total influenza incidence. These findings indicate that monitoring influenza by type and subtype not only provides more detailed observational content but supports more accurate forecasting. More accurate forecasting can help officials better respond to and plan for current and future influenza activity.

  20. Using forecast modelling to evaluate treatment effects in single-group interrupted time series analysis.

    PubMed

    Linden, Ariel

    2018-05-11

    Interrupted time series analysis (ITSA) is an evaluation methodology in which a single treatment unit's outcome is studied serially over time and the intervention is expected to "interrupt" the level and/or trend of that outcome. ITSA is commonly evaluated using methods which may produce biased results if model assumptions are violated. In this paper, treatment effects are alternatively assessed by using forecasting methods to closely fit the preintervention observations and then forecast the post-intervention trend. A treatment effect may be inferred if the actual post-intervention observations diverge from the forecasts by some specified amount. The forecasting approach is demonstrated using the effect of California's Proposition 99 for reducing cigarette sales. Three forecast models are fit to the preintervention series: linear regression (REG), Holt-Winters (HW) non-seasonal smoothing, and autoregressive moving average (ARIMA). Forecasts are then generated into the post-intervention period, and the actual observations are compared with the forecasts to assess intervention effects. The preintervention data were fit best by HW, followed closely by ARIMA; REG fit the data poorly. The actual post-intervention observations were above the forecasts in HW and ARIMA, suggesting no intervention effect, but below the forecasts in REG (suggesting a treatment effect), thereby raising doubts about any definitive conclusion of a treatment effect. In a single-group ITSA, treatment effects are likely to be biased if the model is misspecified. Therefore, evaluators should consider using forecast models to accurately fit the preintervention data and generate plausible counterfactual forecasts, thereby improving causal inference of treatment effects in single-group ITSA studies.
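The REG variant of this approach can be sketched in a few lines: fit a trend to the preintervention observations, extrapolate it as the counterfactual forecast, and read the treatment effect from the divergence. The synthetic data and the effect size below are hypothetical, and the HW and ARIMA variants are omitted:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(40)
# Synthetic outcome: declining trend plus noise, with an extra drop
# after a hypothetical intervention at t = 30
y = 100.0 - 1.5 * t + rng.normal(0.0, 1.0, size=40)
y[30:] -= 8.0                          # hypothetical treatment effect

pre_t, pre_y = t[:30], y[:30]
coef = np.polyfit(pre_t, pre_y, deg=1)  # REG: linear fit to pre-period
forecast = np.polyval(coef, t[30:])     # counterfactual forecast

gap = y[30:] - forecast                 # divergence = inferred effect
```

A persistent negative gap relative to the forecast (here around -8) is the signal of a treatment effect; if the gap stays within the forecast's noise band, no effect is inferred.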

  1. Price of gasoline: forecasting comparisons. [Box-Jenkins, econometric, and regression methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bopp, A.E.; Neri, J.A.

    Gasoline prices are simulated using three popular forecasting methodologies: a Box-Jenkins type method, an econometric method, and a regression method. One-period-ahead and 18-period-ahead comparisons are made. For the one-period-ahead horizon, a Box-Jenkins type time-series model simulates best, although all do well. However, for the 18-period simulation, the econometric and regression methods perform substantially better than the Box-Jenkins formulation. A rationale for and implications of these results are discussed. 11 references.

  2. Environmental noise forecasting based on support vector machine

    NASA Astrophysics Data System (ADS)

    Fu, Yumei; Zan, Xinwu; Chen, Tianyi; Xiang, Shihan

    2018-01-01

    As an important pollution source, noise has long been a focus of research. In recent years especially, noise pollution's harm to the human environment has made it a very active research topic. Noise monitoring technologies and monitoring systems are applied in environmental noise testing, measurement and evaluation, but research on environmental noise forecasting remains weak. In this paper, a real-time environmental noise monitoring system is introduced briefly. The system operates in Mianyang City, Sichuan Province, where it monitors and collects environmental noise data from more than 20 enterprises in the district. Based on this large amount of noise data, forecasting with the Support Vector Machine (SVM) is studied in detail. Compared with time series and artificial neural network forecasting models, the SVM forecasting model offers advantages such as a smaller required data size and higher precision and stability. Noise forecasts based on the SVM can provide an important and accurate reference for the prevention and control of environmental noise.
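Before an SVM (or any regressor) can forecast a noise series, the series must be recast as supervised pairs of lagged inputs and next-step targets. A minimal numpy sketch of that step, with a synthetic noise-level series; the lag count of 8 is an arbitrary choice, not from the paper:

```python
import numpy as np

def lag_matrix(series, n_lags):
    """Turn a time series into (X, y) supervised pairs:
    X[j] holds the n_lags values before step j + n_lags, y[j] the next one."""
    series = np.asarray(series, dtype=float)
    X = np.column_stack([series[i:len(series) - n_lags + i]
                         for i in range(n_lags)])
    y = series[n_lags:]
    return X, y

# Synthetic hourly noise levels (dB) with a slow oscillation
noise_db = 60.0 + 5.0 * np.sin(np.arange(100) / 6.0)
X, y = lag_matrix(noise_db, n_lags=8)
# X and y can now be fed to any regressor,
# e.g. sklearn.svm.SVR(kernel="rbf").fit(X, y)
```

The paper's comparison of SVM against time-series and neural-network models would use this same feature construction with different downstream learners.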

  3. Assessing probabilistic predictions of ENSO phase and intensity from the North American Multimodel Ensemble

    NASA Astrophysics Data System (ADS)

    Tippett, Michael K.; Ranganathan, Meghana; L'Heureux, Michelle; Barnston, Anthony G.; DelSole, Timothy

    2017-05-01

    Here we examine the skill of three-, five-, and seven-category monthly ENSO probability forecasts (1982-2015) from single and multi-model ensemble integrations of the North American Multimodel Ensemble (NMME) project. Three-category forecasts are typical and provide probabilities for the ENSO phase (El Niño, La Niña or neutral). Additional forecast categories indicate the likelihood of ENSO conditions being weak, moderate or strong. The level of skill observed for differing numbers of forecast categories can help to determine the appropriate degree of forecast precision. However, the dependence of the skill score itself on the number of forecast categories must be taken into account. For reliable forecasts with the same quality, the ranked probability skill score (RPSS) is fairly insensitive to the number of categories, while the logarithmic skill score (LSS) is an information measure and increases as categories are added. The ignorance skill score decreases to zero as forecast categories are added, regardless of skill level. For all models, forecast formats and skill scores, the northern spring predictability barrier explains much of the dependence of skill on target month and forecast lead. RPSS values for monthly ENSO forecasts show little dependence on the number of categories. However, the LSS of multimodel ensemble forecasts with five and seven categories show statistically significant advantages over the three-category forecasts for the targets and leads that are least affected by the spring predictability barrier. These findings indicate that current prediction systems are capable of providing more detailed probabilistic forecasts of ENSO phase and amplitude than are typically provided.
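The RPSS mentioned above compares cumulative forecast probabilities against the observed category and references the result to climatology. A small numpy sketch with invented three-category ENSO forecasts (the probabilities and outcomes are illustrative only):

```python
import numpy as np

def rps(prob_forecast, obs_category):
    """Ranked probability score for one categorical forecast:
    squared distance between forecast and observed cumulative distributions."""
    cdf_f = np.cumsum(prob_forecast)
    cdf_o = np.cumsum(np.eye(len(prob_forecast))[obs_category])
    return np.sum((cdf_f - cdf_o) ** 2)

def rpss(forecasts, observations, climatology):
    """RPSS = 1 - mean RPS of forecasts / mean RPS of climatology."""
    rps_f = np.mean([rps(f, o) for f, o in zip(forecasts, observations)])
    rps_c = np.mean([rps(climatology, o) for o in observations])
    return 1.0 - rps_f / rps_c

# Three ENSO categories: 0 = La Nina, 1 = neutral, 2 = El Nino
clim = np.array([1/3, 1/3, 1/3])
fcsts = [np.array([0.7, 0.2, 0.1]), np.array([0.1, 0.2, 0.7])]
obs = [0, 2]                       # what actually happened
skill = rpss(fcsts, obs, clim)
```

Because the RPS penalizes distance in the ordered category space, sharp forecasts that place mass in the right tail score well, and the climatology reference makes RPSS comparable across category counts.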

  4. A Bias and Variance Analysis for Multistep-Ahead Time Series Forecasting.

    PubMed

    Ben Taieb, Souhaib; Atiya, Amir F

    2016-01-01

    Multistep-ahead forecasts can either be produced recursively by iterating a one-step-ahead time series model or directly by estimating a separate model for each forecast horizon. In addition, there are other strategies; some of them combine aspects of both aforementioned concepts. In this paper, we present a comprehensive investigation into the bias and variance behavior of multistep-ahead forecasting strategies. We provide a detailed review of the different multistep-ahead strategies. Subsequently, we perform a theoretical study that derives the bias and variance for a number of forecasting strategies. Finally, we conduct a Monte Carlo experimental study that compares and evaluates the bias and variance performance of the different strategies. From the theoretical and the simulation studies, we analyze the effect of different factors, such as the forecast horizon and the time series length, on the bias and variance components, and on the different multistep-ahead strategies. Several lessons are learned, and recommendations are given concerning the advantages, disadvantages, and best conditions of use of each strategy.
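The two basic strategies compared in the paper can be sketched on a synthetic AR(1) series: the recursive strategy iterates a one-step model, while the direct strategy fits a separate regression for each horizon. This is a pure-numpy sketch; the model order, coefficient, and horizon are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic AR(1) series: x_t = 0.8 * x_{t-1} + noise
x = np.zeros(300)
for t in range(1, 300):
    x[t] = 0.8 * x[t - 1] + rng.normal()

train, H = x[:290], 3

# Recursive strategy: fit a one-step AR(1), then iterate it H times
phi = np.linalg.lstsq(train[:-1, None], train[1:], rcond=None)[0][0]
rec = [train[-1]]
for _ in range(H):
    rec.append(phi * rec[-1])
rec_forecasts = rec[1:]

# Direct strategy: fit a separate regression of x_{t+h} on x_t per horizon h
dir_forecasts = []
for h in range(1, H + 1):
    beta = np.linalg.lstsq(train[:-h, None], train[h:], rcond=None)[0][0]
    dir_forecasts.append(beta * train[-1])
```

For a correctly specified linear model the two strategies agree in expectation (the direct coefficient estimates phi**h); their bias and variance diverge under misspecification, which is the trade-off the paper analyzes.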

  5. Extreme Wind, Rain, Storm Surge, and Flooding: Why Hurricane Impacts are Difficult to Forecast?

    NASA Astrophysics Data System (ADS)

    Chen, S. S.

    2017-12-01

    The 2017 hurricane season is estimated as one of the costliest in U.S. history. The damage and devastation caused by Hurricane Harvey in Houston, Irma in Florida, and Maria in Puerto Rico are distinctly different in nature. The complexity of hurricane impacts from extreme wind, rain, storm surge, and flooding presents a major challenge in hurricane forecasting. A detailed comparison of the storm impacts from Harvey, Irma, and Maria will be presented using observations and a state-of-the-art new-generation coupled atmosphere-wave-ocean hurricane forecast model. The author will also provide an overview of what we can expect in terms of advancement in science and technology that can help improve hurricane impact forecasts in the near future.

  6. Volcanic Eruption Forecasts From Accelerating Rates of Drumbeat Long-Period Earthquakes

    NASA Astrophysics Data System (ADS)

    Bell, Andrew F.; Naylor, Mark; Hernandez, Stephen; Main, Ian G.; Gaunt, H. Elizabeth; Mothes, Patricia; Ruiz, Mario

    2018-02-01

    Accelerating rates of quasiperiodic "drumbeat" long-period earthquakes (LPs) are commonly reported before eruptions at andesite and dacite volcanoes, and promise insights into the nature of fundamental preeruptive processes and improved eruption forecasts. Here we apply a new Bayesian Markov chain Monte Carlo gamma point process methodology to investigate an exceptionally well-developed sequence of drumbeat LPs preceding a recent large vulcanian explosion at Tungurahua volcano, Ecuador. For more than 24 hr, LP rates increased according to the inverse power law trend predicted by material failure theory, and with a retrospectively forecast failure time that agrees with the eruption onset within error. LPs resulted from repeated activation of a single characteristic source driven by accelerating loading, rather than a distributed failure process, showing that similar precursory trends can emerge from quite different underlying physics. Nevertheless, such sequences have clear potential for improving forecasts of eruptions at Tungurahua and analogous volcanoes.
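The inverse power law trend of material failure theory (taken here with exponent p = 1, the classical special case) implies that the reciprocal of the event rate falls linearly to zero at the failure time, suggesting a simple forecast: fit a line to 1/rate and extrapolate its zero crossing. A noiseless numpy illustration with invented numbers, not the Tungurahua data:

```python
import numpy as np

t_f, k = 24.0, 30.0               # true failure time (h) and rate constant
t = np.linspace(0.0, 20.0, 50)    # observation window before the eruption
rate = k / (t_f - t)              # accelerating LP event rate, r(t) = k/(t_f - t)

# Inverse-rate linearisation: 1/rate = (t_f - t)/k is a straight line in t,
# so extrapolating its zero crossing estimates the failure (eruption) time.
slope, intercept = np.polyfit(t, 1.0 / rate, deg=1)
t_f_hat = -intercept / slope
```

Real sequences require the Bayesian treatment the paper develops (uncertain p, count noise, and quasiperiodic event timing), but the linearisation conveys why accelerating drumbeat rates carry forecast information.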

  7. Forecasting influenza in Hong Kong with Google search queries and statistical model fusion

    PubMed Central

    Ramirez Ramirez, L. Leticia; Nezafati, Kusha; Zhang, Qingpeng; Tsui, Kwok-Leung

    2017-01-01

    Background The objective of this study is to investigate the predictive utility of online social media and web search queries, particularly Google search data, for forecasting new cases of influenza-like illness (ILI) in general outpatient clinics (GOPC) in Hong Kong. To mitigate the impact of sensitivity to self-excitement (i.e., fickle media interest) and other artifacts of online social media data, our approach fuses multiple offline and online data sources. Methods Four individual models: a generalized linear model (GLM), the least absolute shrinkage and selection operator (LASSO), an autoregressive integrated moving average (ARIMA) model, and deep learning (DL) with feedforward neural networks (FNN) are employed to forecast ILI-GOPC both one week and two weeks in advance. The covariates include Google search queries, meteorological data, and previously recorded offline ILI. To our knowledge, this is the first study that introduces deep learning methodology into the surveillance of infectious diseases and investigates its predictive utility. Furthermore, to exploit the strengths of each individual forecasting model, we use statistical model fusion via Bayesian model averaging (BMA), which allows a systematic integration of multiple forecast scenarios. For each model, an adaptive approach is used to capture the recent relationship between ILI and the covariates. Results DL with FNN appears to deliver the most competitive predictive performance among the four individual models considered. Combining all four models in a comprehensive BMA framework further improves such predictive evaluation metrics as root mean squared error (RMSE) and mean absolute predictive error (MAPE). Nevertheless, DL with FNN remains the preferred method for predicting the locations of influenza peaks. Conclusions The proposed approach can be viewed as a feasible alternative for forecasting ILI in Hong Kong or other countries where ILI has no constant seasonal trend and influenza data resources are limited. The proposed methodology is easily tractable and computationally efficient. PMID:28464015
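The BMA fusion step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the weights are approximated from each model's training-error variance under a Gaussian likelihood, and all numbers are invented for the example.

```python
import numpy as np

def bma_weights(train_errors):
    """Approximate BMA weights from each model's training errors,
    using a Gaussian-likelihood approximation: models with smaller
    error variance receive exponentially larger weight."""
    errs = np.asarray(train_errors, dtype=float)      # shape (n_models, n_weeks)
    n = errs.shape[1]
    sigma2 = errs.var(axis=1) + 1e-12                 # per-model error variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    w = np.exp(loglik - loglik.max())                 # stabilised softmax
    return w / w.sum()

def fuse(forecasts, weights):
    """Weighted average of the individual model forecasts."""
    return float(np.asarray(weights) @ np.asarray(forecasts, dtype=float))

# toy example: the model with the smallest past errors dominates the mix
errors = [[0.1, -0.2, 0.1], [1.0, -1.2, 0.9], [2.0, 1.5, -1.8], [0.9, 1.1, -1.0]]
weights = bma_weights(errors)
fused = fuse([4.1, 5.0, 6.2, 4.8], weights)
```

In a full BMA treatment the weights would come from posterior model probabilities estimated on a training window; the variance-based proxy above only conveys the mechanics of the combination.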

  8. Alteration of Box-Jenkins methodology by implementing genetic algorithm method

    NASA Astrophysics Data System (ADS)

    Ismail, Zuhaimy; Maarof, Mohd Zulariffin Md; Fadzli, Mohammad

    2015-02-01

    A time series is a set of values observed sequentially through time. The Box-Jenkins methodology is a systematic method of identifying, fitting, checking and using integrated autoregressive moving average time series models for forecasting. The Box-Jenkins method is appropriate for medium-to-long time series (at least 50 observations). When modelling such series, the difficulty lies in choosing the correct model order at the identification stage and in obtaining accurate parameter estimates. This paper presents the development of a Genetic Algorithm heuristic method for solving the identification and estimation problems in Box-Jenkins modelling. Data on international tourist arrivals to Malaysia were used to illustrate the effectiveness of the proposed method. The forecasts generated by the proposed model outperformed those of the single traditional Box-Jenkins model.
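A minimal sketch of the idea, under simplifying assumptions that are not the paper's: the full ARIMA(p,d,q) likelihood is replaced by a least-squares AR(p) fit scored by AIC, and a tiny genetic algorithm (tournament selection plus mutation) searches over the order. Population size, generations, and mutation rate are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def ar_aic(x, p):
    """AIC of a least-squares AR(p) fit; stands in here for the full
    ARIMA(p,d,q) likelihood of the Box-Jenkins identification step."""
    if p == 0:
        rss = np.sum((x - x.mean()) ** 2)
    else:
        X = np.column_stack([np.ones(len(x) - p)]
                            + [x[p - k - 1 : len(x) - k - 1] for k in range(p)])
        y = x[p:]
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
    n = len(x)
    return n * np.log(rss / n + 1e-12) + 2 * (p + 1)

def ga_select_order(x, max_p=8, pop=12, gens=15):
    """Tiny genetic algorithm over the AR order: tournament selection
    plus +/-1 mutations; returns the order with the lowest AIC found."""
    popu = rng.integers(0, max_p + 1, size=pop)
    for _ in range(gens):
        fit = np.array([ar_aic(x, int(p)) for p in popu])
        i, j = rng.integers(0, pop, size=(2, pop))    # random tournaments
        popu = np.where(fit[i] < fit[j], popu[i], popu[j])
        m = rng.random(pop) < 0.3                     # mutate ~30% of genes
        popu = np.clip(popu + m * rng.choice([-1, 1], size=pop), 0, max_p)
    fit = np.array([ar_aic(x, int(p)) for p in popu])
    return int(popu[np.argmin(fit)])

# demo: the GA should recover a nontrivial order for an AR(2) series
rng_data = np.random.default_rng(1)
noise = rng_data.normal(size=300)
x = np.zeros(300)
for t in range(2, 300):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + noise[t]
best_p = ga_select_order(x)
```

In the paper the chromosome would also encode the differencing and moving-average orders and the parameter estimates; restricting the search to the AR order keeps the sketch short while showing the selection/mutation loop.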

  9. Indicators of Student Flow Rates in Honduras: An Assessment of an Alternative Methodology, with Two Methodologies for Estimating Student Flow Rates. BRIDGES Research Report No. 6.

    ERIC Educational Resources Information Center

    Cuadra, Ernesto; Crouch, Luis

    Student promotion, repetition, and dropout rates constitute the basic data needed to forecast future enrollment and new resources. Information on student flow is significantly related to policy formulation aimed at improving internal efficiency, because dropping out and grade repetition increase per pupil cost, block access to eligible school-age…

  10. Statistical post-processing of seasonal multi-model forecasts: Why is it so hard to beat the multi-model mean?

    NASA Astrophysics Data System (ADS)

    Siegert, Stefan

    2017-04-01

    Initialised climate forecasts on seasonal time scales, run several months or even years ahead, are now an integral part of the battery of products offered by climate services world-wide. The availability of seasonal climate forecasts from various modeling centres gives rise to multi-model ensemble forecasts. Post-processing such seasonal-to-decadal multi-model forecasts is challenging 1) because the cross-correlation structure between multiple models and observations can be complicated, 2) because the amount of training data to fit the post-processing parameters is very limited, and 3) because the forecast skill of numerical models tends to be low on seasonal time scales. In this talk I will review new statistical post-processing frameworks for multi-model ensembles. I will focus particularly on Bayesian hierarchical modelling approaches, which are flexible enough to capture commonly made assumptions about collective and model-specific biases of multi-model ensembles. Despite the advances in statistical methodology, it turns out to be very difficult to outperform the simplest post-processing method, which just recalibrates the multi-model ensemble mean by linear regression. I will discuss reasons for this, which are closely linked to the specific characteristics of seasonal multi-model forecasts. I will explore possible directions for improvements, for example using informative priors on the post-processing parameters, and jointly modelling forecasts and observations.
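The hard-to-beat baseline mentioned above is simple enough to sketch directly; this is a generic illustration with invented numbers, not the talk's code: regress past observations on the multi-model ensemble mean and apply the fitted line to new forecasts.

```python
import numpy as np

def recalibrate(mm_mean_train, obs_train, mm_mean_new):
    """The hard-to-beat baseline: recalibrate the multi-model ensemble
    mean against past observations by simple linear regression."""
    A = np.column_stack([np.ones(len(mm_mean_train)), mm_mean_train])
    (a, b), *_ = np.linalg.lstsq(A, np.asarray(obs_train, dtype=float),
                                 rcond=None)
    return a + b * np.asarray(mm_mean_new, dtype=float)

# toy example: an ensemble mean with a systematic bias and damped amplitude
truth = np.linspace(10.0, 20.0, 30)
ens_mean = 0.5 * truth + 2.0          # biased, under-dispersive ensemble mean
corrected = recalibrate(ens_mean[:20], truth[:20], ens_mean[20:])
```

With only a handful of hindcast years to fit `a` and `b`, even this two-parameter model is close to the limit of what the training data can support, which is one reason richer schemes struggle to beat it.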

  11. Hybrid Forecasting of Daily River Discharges Considering Autoregressive Heteroscedasticity

    NASA Astrophysics Data System (ADS)

    Szolgayová, Elena Peksová; Danačová, Michaela; Komorniková, Magda; Szolgay, Ján

    2017-06-01

    It is widely acknowledged in the hydrological and meteorological communities that there is a continuing need to improve the quality of quantitative rainfall and river flow forecasts. A hybrid (combined deterministic-stochastic) modelling approach is proposed here that combines, in parallel, the advantages of modelling the system dynamics with a deterministic model and the deterministic forecast error series with a data-driven model. Since the processes to be modelled are generally nonlinear and the model error series may exhibit nonstationarity and heteroscedasticity, GARCH-type nonlinear time series models are considered here. The fitting, forecasting and simulation performance of such models has to be explored on a case-by-case basis. The goal of this paper is to test and develop an appropriate methodology for model fitting and forecasting, applicable to daily river discharge forecast error data, from the GARCH family of time series models. We concentrated on verifying whether the use of a GARCH-type model is suitable for modelling and forecasting a hydrological model error time series on the Hron and Morava Rivers in Slovakia. For this purpose we verified the presence of heteroscedasticity in the simulation error series of the KLN multilinear flow routing model; we then fitted GARCH-type models to the data and compared their fit with that of an ARMA-type model. We produced one-step-ahead forecasts from the fitted models and compared the models' performance.
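The first step above, verifying the presence of heteroscedasticity in an error series, is commonly done with Engle's ARCH-LM test, which can be sketched as follows. This is a generic illustration on simulated data, not the paper's KLN error series.

```python
import numpy as np

def arch_lm(resid, lags=5):
    """Engle's ARCH-LM statistic: regress squared residuals on their own
    lags; n * R^2 is approximately chi-squared(lags) under the null of
    homoscedasticity, so large values indicate ARCH/GARCH effects."""
    e2 = np.asarray(resid, dtype=float) ** 2
    X = np.column_stack([e2[lags - k - 1 : len(e2) - k - 1] for k in range(lags)])
    y = e2[lags:]
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    ss_res = np.sum((y - A @ beta) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return len(y) * (1.0 - ss_res / ss_tot)

# demo: a GARCH(1,1)-simulated series versus plain iid noise
rng = np.random.default_rng(4)
z = rng.normal(size=2000)
sigma2 = np.ones(2000)
e = np.zeros(2000)
for t in range(1, 2000):
    sigma2[t] = 0.1 + 0.2 * e[t - 1] ** 2 + 0.7 * sigma2[t - 1]
    e[t] = np.sqrt(sigma2[t]) * z[t]
stat_garch = arch_lm(e)    # should be large: variance clusters
stat_iid = arch_lm(z)      # should be small: no ARCH effect
```

Only once this test rejects homoscedasticity is fitting a GARCH-type model to the error series worthwhile; otherwise an ARMA-type model of the errors is usually sufficient.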

  12. A novel hybrid forecasting model for PM₁₀ and SO₂ daily concentrations.

    PubMed

    Wang, Ping; Liu, Yong; Qin, Zuodong; Zhang, Guisheng

    2015-02-01

    Air-quality forecasting in urban areas is difficult because of the uncertainties in describing both the emission and meteorological fields. The use of incomplete information in the training phase restricts practical air-quality forecasting. In this paper, we propose a hybrid artificial neural network and a hybrid support vector machine, which effectively enhance the forecasting accuracy of the artificial neural network (ANN) and support vector machine (SVM) by revising the error term of the traditional methods. The hybrid methodology can be described in two stages. First, we applied the ANN or SVM forecasting system with historical data and exogenous parameters, such as meteorological variables. Then, the forecasting target was revised by the Taylor expansion forecasting model using the residual information of the error term from the previous stage. The innovation of this approach is that it makes full use of the residual information even when the input variables are incomplete. The proposed method was evaluated by experiments using a 2-year dataset of daily PM₁₀ (particles with a diameter of 10 μm or less) and SO₂ (sulfur dioxide) concentrations from four air pollution monitoring stations located in Taiyuan, China. The theoretical analysis and experimental results demonstrate that the forecasting accuracy of the proposed model is very promising. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. A Hybrid Model for Predicting the Prevalence of Schistosomiasis in Humans of Qianjiang City, China

    PubMed Central

    Wang, Ying; Lu, Zhouqin; Tian, Lihong; Tan, Li; Shi, Yun; Nie, Shaofa; Liu, Li

    2014-01-01

    Backgrounds/Objective Schistosomiasis is still a major public health problem in China, despite the fact that the government has implemented a series of strategies to prevent and control the spread of the parasitic disease. Advanced warning and reliable forecasting can help policymakers to adjust and implement strategies more effectively, which will lead to the control and elimination of schistosomiasis. Our aim is to explore the application of a hybrid forecasting model to track the trends of the prevalence of schistosomiasis in humans, which provides a methodological basis for predicting and detecting schistosomiasis infection in endemic areas. Methods A hybrid approach combining the autoregressive integrated moving average (ARIMA) model and the nonlinear autoregressive neural network (NARNN) model was used to forecast the prevalence of schistosomiasis for the next four years. Forecasting performance was compared between the hybrid ARIMA-NARNN model and the single ARIMA or single NARNN model. Results The modelling mean square error (MSE), mean absolute error (MAE) and mean absolute percentage error (MAPE) of the ARIMA-NARNN model were 0.1869×10⁻⁴, 0.0029 and 0.0419, with corresponding testing errors of 0.9375×10⁻⁴, 0.0081 and 0.9064, respectively. These error values generated with the hybrid model were all lower than those obtained from the single ARIMA or NARNN model. The forecast values were 0.75%, 0.80%, 0.76% and 0.77% for the next four years, showing no downward trend. Conclusion The hybrid model has high prediction accuracy for the prevalence of schistosomiasis, which provides a methodological basis for future schistosomiasis monitoring and control strategies in the study area. It is worth attempting to utilize the hybrid detection scheme in other schistosomiasis-endemic areas and for other infectious diseases. PMID:25119882
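The decomposition behind such ARIMA-NARNN hybrids is: fit a linear model, then fit a nonlinear autoregression to its residuals, and sum the two forecasts. A minimal sketch, with a cubic polynomial standing in for the NARNN and a synthetic series standing in for the prevalence data:

```python
import numpy as np

def fit_ar(y, p=2):
    """Linear component (stands in for the ARIMA part of the hybrid)."""
    X = np.column_stack([np.ones(len(y) - p)]
                        + [y[p - k - 1 : len(y) - k - 1] for k in range(p)])
    beta, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return beta, y[p:] - X @ beta                 # coefficients, residuals

def fit_residual_model(res, degree=3):
    """Nonlinear autoregression on the linear model's residuals; a cubic
    polynomial in the lagged residual stands in for the NARNN here."""
    return np.polyfit(res[:-1], res[1:], degree)

# toy series with a mild nonlinearity the linear part cannot fully capture
rng = np.random.default_rng(2)
noise = rng.normal(scale=0.1, size=300)
y = np.zeros(300)
for t in range(2, 300):
    y[t] = 0.5 * y[t - 1] - 0.2 * y[t - 2] + 0.3 * np.tanh(y[t - 1]) + noise[t]

beta, res = fit_ar(y)
coeffs = fit_residual_model(res)
hybrid_resid = res[1:] - np.polyval(coeffs, res[:-1])
```

By least-squares optimality the second stage can never fit the residuals worse than ignoring them, which is the intuition for why the hybrid's in-sample errors were uniformly lower than either single model's.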

  14. Enviro-HIRLAM online integrated meteorology-chemistry modelling system: strategy, methodology, developments and applications (v7.2)

    NASA Astrophysics Data System (ADS)

    Baklanov, Alexander; Smith Korsholm, Ulrik; Nuterman, Roman; Mahura, Alexander; Pagh Nielsen, Kristian; Hansen Sass, Bent; Rasmussen, Alix; Zakey, Ashraf; Kaas, Eigil; Kurganskiy, Alexander; Sørensen, Brian; González-Aparicio, Iratxe

    2017-08-01

    The Environment - High Resolution Limited Area Model (Enviro-HIRLAM) is developed as a fully online integrated numerical weather prediction (NWP) and atmospheric chemical transport (ACT) model for research and forecasting of joint meteorological, chemical and biological weather. The integrated modelling system is developed by the Danish Meteorological Institute (DMI) in collaboration with several European universities. It is the baseline system in the HIRLAM Chemical Branch and is used in several countries and different applications. The development was initiated at DMI more than 15 years ago. The model is based on the HIRLAM NWP model with online integrated pollutant transport and dispersion, chemistry, aerosol dynamics, deposition and atmospheric composition feedbacks. To make the model suitable for chemical weather forecasting in urban areas, the meteorological part was improved by the implementation of urban parameterisations. The dynamical core was improved by implementing a locally mass-conserving semi-Lagrangian numerical advection scheme, which improves forecast accuracy and model performance. The current version (7.2), in comparison with previous versions, has a more advanced and cost-efficient chemistry, an aerosol multi-compound approach, and aerosol feedbacks (direct and semi-direct) on radiation and (first and second indirect effects) on cloud microphysics. Since 2004, Enviro-HIRLAM has been used for different studies, including operational pollen forecasting for Denmark since 2009 and operational forecasting of atmospheric composition with downscaling for China since 2017. Following the main research and development strategy, further model developments will be extended towards the new NWP platform - HARMONIE. Different aspects of online coupling methodology, research strategy and possible applications of the modelling system, and fit-for-purpose model configurations for the meteorological and air quality communities are discussed.

  15. Forecasting carbon dioxide emissions based on a hybrid of mixed data sampling regression model and back propagation neural network in the USA.

    PubMed

    Zhao, Xin; Han, Meng; Ding, Lili; Calin, Adrian Cantemir

    2018-01-01

    The accurate forecast of carbon dioxide emissions is critical for policy makers to take proper measures to establish a low carbon society. This paper discusses a hybrid of the mixed data sampling (MIDAS) regression model and BP (back propagation) neural network (MIDAS-BP model) to forecast carbon dioxide emissions. Such analysis uses mixed frequency data to study the effects of quarterly economic growth on annual carbon dioxide emissions. The forecasting ability of MIDAS-BP is remarkably better than MIDAS, ordinary least square (OLS), polynomial distributed lags (PDL), autoregressive distributed lags (ADL), and auto-regressive moving average (ARMA) models. The MIDAS-BP model is suitable for forecasting carbon dioxide emissions for both the short and longer term. This research is expected to influence the methodology for forecasting carbon dioxide emissions by improving the forecast accuracy. Empirical results show that economic growth has both negative and positive effects on carbon dioxide emissions that last 15 quarters. Carbon dioxide emissions are also affected by their own change within 3 years. Therefore, there is a need for policy makers to explore an alternative way to develop the economy, especially applying new energy policies to establish a low carbon society.
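The core of a MIDAS regression is the weight function that maps many high-frequency (here quarterly) lags onto one low-frequency (annual) regressor. A minimal sketch using the common exponential Almon weighting; the parameter values are illustrative, not the paper's:

```python
import numpy as np

def exp_almon(theta1, theta2, n_lags):
    """Exponential Almon weights: a parsimonious way to let an annual
    response load on many quarterly lags in a MIDAS regression."""
    k = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * k + theta2 * k ** 2)
    return w / w.sum()

def midas_term(x_quarterly, weights):
    """Collapse the most recent quarterly observations into one
    annual-frequency regressor."""
    n = len(weights)
    return float(weights @ np.asarray(x_quarterly, dtype=float)[-n:])

w = exp_almon(0.1, -0.05, 8)     # weights peak early, then decay smoothly
agg = midas_term([1.0] * 8, w)   # weighted sum of the last 8 quarters
```

In the hybrid MIDAS-BP scheme, terms like `agg` built from quarterly GDP growth would feed a back-propagation network rather than a linear regression, which is where the nonlinear gain over plain MIDAS comes from.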

  16. Wind Energy Management System Integration Project Incorporating Wind Generation and Load Forecast Uncertainties into Power Grid Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Huang, Zhenyu; Etingov, Pavel V.

    2010-09-01

    The power system balancing process, which includes the scheduling, real-time dispatch (load following) and regulation processes, is traditionally based on deterministic models. Since conventional generation needs time to be committed and dispatched to a desired megawatt level, the scheduling and load following processes use load and wind power production forecasts to achieve a future balance between conventional generation and energy storage on the one side, and system load, intermittent resources (such as wind and solar generation) and scheduled interchange on the other side. Although in real life the forecasting procedures imply some uncertainty around the load and wind forecasts (caused by forecast errors), only their mean values are actually used in the generation dispatch and commitment procedures. Since the actual load and intermittent generation can deviate from their forecasts, it becomes increasingly unclear (especially with the increasing penetration of renewable resources) whether the system will actually be able to meet the conventional generation requirements within the look-ahead horizon, what additional balancing efforts will be needed as we get closer to real time, and what additional costs will be incurred by those needs. In order to improve system control performance characteristics, maintain system reliability, and minimize expenses related to the system balancing functions, it becomes necessary to incorporate the predicted uncertainty ranges into the scheduling, load following and, to some extent, the regulation processes. It is also important to address the uncertainty problem comprehensively, by including all sources of uncertainty (load, intermittent generation, generators’ forced outages, etc.) into consideration. All aspects of uncertainty, such as the imbalance size (which is the same as the capacity needed to mitigate the imbalance) and the generation ramping requirement, must be taken into account. 
These unique features make this work a significant step forward toward the objective of incorporating wind, solar, load, and other uncertainties into power system operations. In this report, a new methodology to predict the uncertainty ranges for the required balancing capacity, ramping capability and ramp duration is presented. Uncertainties created by system load forecast errors, wind and solar forecast errors, and generation forced outages are taken into account. The uncertainty ranges are evaluated for different confidence levels of having the actual generation requirements within the corresponding limits. The methodology helps to identify system balancing reserve requirements based on desired system performance levels, identify system “breaking points” where the generation system becomes unable to follow the generation requirement curve with the user-specified probability level, and determine the time remaining until these potential events. The approach includes three stages: statistical and actual data acquisition, statistical analysis of retrospective information, and prediction of future grid balancing requirements for specified time horizons and confidence intervals. Assessment of the capacity and ramping requirements is performed using a specially developed probabilistic algorithm based on a histogram analysis incorporating all sources of uncertainty and parameters of a continuous (wind forecast and load forecast errors) and discrete (forced generator outages and failures to start up) nature. Preliminary simulations using California Independent System Operator (California ISO) real-life data have shown the effectiveness of the proposed approach. A tool developed based on the new methodology described in this report will be integrated with the California ISO systems. Contractual work is currently in place to integrate the tool with the AREVA EMS system.
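The histogram-based convolution of continuous and discrete uncertainty sources can be sketched empirically: resample each error distribution, sum the draws, and read off percentiles for the requested confidence level. All sample values below are invented for the illustration.

```python
import numpy as np

def balancing_range(load_err, wind_err, outage_mw, conf=0.95,
                    n=100_000, seed=0):
    """Empirical uncertainty range for the required balancing capacity:
    resample each error histogram, convolve by summation, and read off
    the percentiles for the requested confidence level."""
    rng = np.random.default_rng(seed)
    total = (rng.choice(load_err, n) + rng.choice(wind_err, n)
             + rng.choice(outage_mw, n))
    lo, hi = np.percentile(total, [50 * (1 - conf), 50 * (1 + conf)])
    return lo, hi

# synthetic error samples (MW): centred forecast errors plus rare outages
rng = np.random.default_rng(1)
load_e = rng.normal(0.0, 50.0, 1000)       # load forecast errors
wind_e = rng.normal(0.0, 120.0, 1000)      # wind forecast errors
outages = rng.choice([0.0, 400.0], size=1000, p=[0.97, 0.03])  # forced outages
lo, hi = balancing_range(load_e, wind_e, outages)
```

A "breaking point" in the report's sense would then be a look-ahead interval where `hi` exceeds the dispatchable capacity available at the chosen confidence level.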

  17. Potential economic value of drought information to support early warning in Africa

    NASA Astrophysics Data System (ADS)

    Quiroga, S.; Iglesias, A.; Diz, A.; Garrote, L.

    2012-04-01

    We present a methodology to estimate the economic value of advanced climate information for food production in Africa under climate change scenarios. The results aim to facilitate better choices in water resources management. The methodology includes four sequential steps. First, two contrasting management strategies (with and without early warning) are defined. Second, the associated impacts of the management actions are estimated by calculating the effect of drought on crop productivity under climate change scenarios. Third, the optimal management option is calculated as a function of the drought information and the risk aversion of potential information users. Finally, we use these optimal management simulations to compute the economic value of enhanced water allocation rules to support stable food production in Africa. Our results show how a timely response to climate variations can help reduce losses in food production. The proposed framework is developed within the Dewfora project (Early warning and forecasting systems to predict climate related drought vulnerability and risk in Africa), which aims to improve knowledge on drought forecasting, warning and mitigation, advance the understanding of climate-related vulnerability to drought, and develop a prototype operational forecasting system.
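The expected-loss comparison at the heart of steps one to three can be sketched with a toy decision calculation; the probabilities and loss figures below are invented, and a risk-neutral user stands in for the paper's risk-aversion function.

```python
def value_of_warning(p_drought, loss_drought, loss_mitigated, cost):
    """Expected-loss comparison of the two management strategies:
    act on the warning (pay `cost`, cap drought losses at
    `loss_mitigated`) versus manage without early warning."""
    loss_without = p_drought * loss_drought
    loss_with = cost + p_drought * loss_mitigated
    # a rational user only acts when acting lowers the expected loss,
    # so the information is never worth less than zero
    return loss_without - min(loss_without, loss_with)

# a credible warning of a likely drought is worth acting on...
value_likely = value_of_warning(p_drought=0.3, loss_drought=100.0,
                                loss_mitigated=20.0, cost=5.0)
# ...while for a very unlikely drought the mitigation cost is not justified
value_unlikely = value_of_warning(p_drought=0.01, loss_drought=100.0,
                                  loss_mitigated=20.0, cost=5.0)
```

Repeating this calculation over climate-change scenarios and over users with different risk aversion is what turns a forecast skill statement into an economic value statement.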

  18. Assessing the Effects of Climate Variability on Orange Yield in Florida to Reduce Production Forecast Errors

    NASA Astrophysics Data System (ADS)

    Concha Larrauri, P.

    2015-12-01

    Orange production in Florida has experienced a decline over the past decade. Hurricanes in 2004 and 2005 greatly affected production, almost to the same degree as the strong freezes of the 1980s. The spread of citrus greening disease after the hurricanes has also contributed to the reduction in orange production in Florida. The occurrence of hurricanes and diseases cannot easily be predicted, but the additional effects of climate on orange yield can be studied and incorporated into existing production forecasts that are based on physical surveys, such as the October Citrus forecast issued every year by the USDA. Specific climate variables occurring before and after the October forecast is issued can have impacts on flowering, orange drop rates, growth, and maturation, and can contribute to the forecast error. Here we present a methodology to incorporate local climate variables to predict the error in the USDA's orange production forecast, and we study the local effects of climate on yield in different counties in Florida. This information can help farmers anticipate what to expect during the orange production cycle, and can help supply chain managers better plan their strategies.

  19. Intra-Hour Dispatch and Automatic Generator Control Demonstration with Solar Forecasting - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coimbra, Carlos F. M.

    2016-02-25

    In this project we address multiple resource integration challenges associated with increasing levels of solar penetration that arise from the variability and uncertainty in solar irradiance. We will model the SMUD service region as its own balancing region, and develop an integrated, real-time operational tool that takes solar-load forecast uncertainties into consideration and commits optimal energy resources and reserves for intra-hour and intra-day decisions. The primary objectives of this effort are to reduce power system operation cost by committing the appropriate amount of energy resources and reserves, as well as to provide operators a prediction of the generation fleet’s behavior in real time for realistic PV penetration scenarios. The proposed methodology includes the following steps: clustering analysis on the expected solar variability per region for the SMUD system, day-ahead (DA) and real-time (RT) load forecasts for the entire service area, 1 year of intra-hour CPR forecasts for cluster centers, 1 year of smart re-forecasting CPR forecasts in real time for determination of irreducible errors, and uncertainty quantification of the integrated solar-load for both distributed and central-station (selected locations within the service region) PV generation.

  20. Constraints on Rational Model Weighting, Blending and Selecting when Constructing Probability Forecasts given Multiple Models

    NASA Astrophysics Data System (ADS)

    Higgins, S. M. W.; Du, H. L.; Smith, L. A.

    2012-04-01

    Ensemble forecasting on a lead time of seconds over several years generates a large forecast-outcome archive, which can be used to evaluate and weight "models". Challenges which arise as the archive becomes smaller are investigated: in weather forecasting one typically has only thousands of forecasts; however, those launched 6 hours apart are not independent of each other, nor is it justified to mix seasons with different dynamics. Seasonal forecasts, as from ENSEMBLES and DEMETER, typically have fewer than 64 unique launch dates; decadal forecasts fewer than eight; and long range climate forecasts arguably none. It is argued that one does not weight "models" so much as entire ensemble prediction systems (EPSs), and that the marginal value of an EPS will depend on the other members in the mix. The impact of using different skill scores is examined in the limits of both very large forecast-outcome archives (thereby evaluating the efficiency of the skill score) and very small forecast-outcome archives (illustrating fundamental limitations due to sampling fluctuations and memory in the physical system being forecast). It is shown that blending with climatology (J. Bröcker and L.A. Smith, Tellus A, 60(4), 663-678, (2008)) tends to increase the robustness of the results; a new kernel dressing methodology (simply ensuring that the expected probability mass tends to lie outside the range of the ensemble) is also illustrated. Fair comparisons using seasonal forecasts from the ENSEMBLES project are used to illustrate the importance of these results with fairly small archives. The robustness of these results across the range of small, moderate and huge archives is demonstrated using imperfect models of perfectly known nonlinear (chaotic) dynamical systems. The implications these results hold for distinguishing the skill of a forecast from its value to a user of the forecast are discussed.
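The Bröcker-Smith style blending with climatology can be sketched for a binary event: mix the forecast probability with the climatological probability and pick the mixing weight that maximises the mean log score on past outcomes. The forecast and outcome values below are invented to show an overconfident forecast being tempered.

```python
import numpy as np

def blend_alpha(p_fc, p_clim, outcomes, grid=None):
    """Choose the blending weight alpha maximising the mean log score of
    alpha * forecast + (1 - alpha) * climatology on past binary outcomes."""
    if grid is None:
        grid = np.linspace(0.0, 1.0, 101)
    p_fc, p_clim, y = (np.asarray(v, dtype=float)
                       for v in (p_fc, p_clim, outcomes))
    best_a, best_score = 0.0, -np.inf
    for a in grid:
        p = a * p_fc + (1.0 - a) * p_clim
        prob_of_outcome = np.where(y == 1, p, 1.0 - p)   # prob assigned to what happened
        score = np.mean(np.log(np.clip(prob_of_outcome, 1e-12, None)))
        if score > best_score:
            best_a, best_score = a, score
    return best_a

# an overconfident forecast (always 0.99) over a small archive of 10 outcomes
y = np.array([1] * 9 + [0])
alpha = blend_alpha(np.full(10, 0.99), np.full(10, 0.5), y)
```

With a tiny archive the optimal weight sits strictly between 0 and 1: full trust in the overconfident system is punished by the single bust, which is the mechanism by which blending increases robustness.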

  1. Environmental forecasting and turbulence modeling

    NASA Astrophysics Data System (ADS)

    Hunt, J. C. R.

    This review describes the fundamental assumptions and current methodologies of the two main kinds of environmental forecast; the first is valid for a limited period of time into the future and over a limited space-time ‘target’, and is largely determined by the initial and preceding state of the environment, such as the weather or pollution levels, up to the time when the forecast is issued and by its state at the edges of the region being considered; the second kind provides statistical information over long periods of time and/or over large space-time targets, so that they only depend on the statistical averages of the initial and ‘edge’ conditions. Environmental forecasts depend on the various ways that models are constructed. These range from those based on the ‘reductionist’ methodology (i.e., the combination of separate, scientifically based, models for the relevant processes) to those based on statistical methodologies, using a mixture of data and scientifically based empirical modeling. These are, as a rule, focused on specific quantities required for the forecast. The persistence and predictability of events associated with environmental and turbulent flows and the reasons for variation in the accuracy of their forecasts (of the first and second kinds) are now better understood and better modeled. This has partly resulted from using analogous results of disordered chaotic systems, and using the techniques of calculating ensembles of realizations, ideally involving several different models, so as to incorporate in the probabilistic forecasts a wider range of possible events. The rationale for such an approach needs to be developed. However, other insights have resulted from the recognition of the ordered, though randomly occurring, nature of the persistent motions in these flows, whose scales range from those of synoptic weather patterns (whether storms or ‘blocked’ anticyclones) to small scale vortices. 
These eigenstates can be predicted from the reductionist models or may be modeled specifically, for example, in terms of ‘self-organized’ critical phenomena. It is noted that in certain applications of turbulence modeling its methods are beginning to resemble those of environmental simulations, because of the trend to introduce ‘on-line’ controls of turbulent flows in advanced engineering fluid systems. In real-time simulations, for both local environmental processes and these engineering systems, maximum information is needed about the likely flow patterns in order to optimize both the assimilation of limited real-time data and the use of limited real-time computing capacity. It is concluded that philosophical studies of how scientific models develop and of the concept of determinism in science are helpful in considering these complex issues.

  2. Forecasting production in Liquid Rich Shale plays

    NASA Astrophysics Data System (ADS)

    Nikfarman, Hanieh

    Production from Liquid Rich Shale (LRS) reservoirs is taking center stage in the exploration and production of unconventional reservoirs. Production from the low and ultra-low permeability LRS plays is possible only through multi-fractured horizontal wells (MFHW's). There is no existing workflow that is applicable to forecasting multi-phase production from MFHW's in LRS plays. This project presents a practical and rigorous workflow for forecasting multiphase production from MFHW's in LRS reservoirs. There has been much effort in developing workflows and methodology for forecasting in tight/shale plays in recent years. The existing workflows, however, are applicable only to single phase flow, and are primarily used in shale gas plays. These methodologies do not apply to the multi-phase flow that is inevitable in LRS plays. To account for complexities of multiphase flow in MFHW's the only available technique is dynamic modeling in compositional numerical simulators. These are time consuming and not practical when it comes to forecasting production and estimating reserves for a large number of producers. A workflow was developed, and validated by compositional numerical simulation. The workflow honors physics of flow, and is sufficiently accurate while practical so that an analyst can readily apply it to forecast production and estimate reserves in a large number of producers in a short period of time. To simplify the complex multiphase flow in MFHW, the workflow divides production periods into an initial period where large production and pressure declines are expected, and the subsequent period where production decline may converge into a common trend for a number of producers across an area of interest in the field. Initial period assumes the production is dominated by single-phase flow of oil and uses the tri-linear flow model of Erdal Ozkan to estimate the production history. 
Readily available commercial software can simulate flow and forecast production in this period. In the subsequent period, dimensionless rate and dimensionless time functions are introduced that help identify the transition from the initial period into the subsequent period. The production trends in terms of the dimensionless parameters converge for a range of rock permeability and stimulation intensity. This helps forecast production beyond the transition to the end of the well's life. The workflow is applicable to a single-fluid system.

  3. A hybrid spatiotemporal drought forecasting model for operational use

    NASA Astrophysics Data System (ADS)

    Vasiliades, L.; Loukas, A.

    2010-09-01

    Drought forecasting plays an important role in the planning and management of natural resources and water resource systems in a river basin. Early and timely forecasting of a drought event can help decision-makers take proactive measures and set out drought mitigation strategies to alleviate the impacts of drought. Spatiotemporal data mining is the extraction of unknown and implicit knowledge, structures, spatiotemporal relationships, or patterns not explicitly stored in spatiotemporal databases. As one such data mining technique, forecasting is widely used to predict the unknown future based upon patterns hidden in current and past data. This study develops a hybrid spatiotemporal scheme for integrated spatial and temporal forecasting. Temporal forecasting is achieved using feed-forward neural networks, and the temporal forecasts are extended to the spatial dimension using a spatial recurrent neural network model. The methodology is demonstrated for an operational meteorological drought index, the Standardized Precipitation Index (SPI), calculated at multiple timescales. Forty-eight precipitation stations and 18 independent precipitation stations, located in the Pinios river basin in the Thessaly region of Greece, were used for the development and spatiotemporal validation of the hybrid scheme. Several quantitative temporal and spatial statistical indices were considered for the performance evaluation of the models. Furthermore, qualitative statistical criteria based on contingency tables between observed and forecasted drought episodes were calculated. The results show that the lead time of forecasting for operational use depends on the SPI timescale. The hybrid spatiotemporal drought forecasting model could be used operationally for forecasting up to three months ahead for short SPI timescales (e.g. 3-6 months) and up to six months ahead for large SPI timescales (e.g. 24 months). These findings could be useful in developing a drought preparedness plan for the region.
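The SPI itself can be sketched compactly: fit a gamma distribution to accumulated precipitation and map each value to a standard-normal score. This illustration fits the gamma by the method of moments and uses the Wilson-Hilferty cube-root approximation in place of the exact incomplete-gamma transform; operational SPI codes also treat zero-precipitation months separately, which is omitted here.

```python
import numpy as np

def spi(precip):
    """Standardized Precipitation Index: fit a gamma distribution to
    accumulated precipitation by the method of moments, then map each
    value to a standard-normal score via the Wilson-Hilferty cube-root
    approximation to the gamma-to-normal transform."""
    x = np.asarray(precip, dtype=float)
    mean, var = x.mean(), x.var()
    shape = mean ** 2 / var                    # gamma shape parameter k
    c = 1.0 / (9.0 * shape)
    return ((x / mean) ** (1.0 / 3.0) - (1.0 - c)) / np.sqrt(c)

# gamma-like synthetic monthly precipitation accumulations
rng = np.random.default_rng(3)
sample = rng.gamma(shape=2.0, scale=3.0, size=5000)
scores = spi(sample)   # approximately standard normal: ~0 is normal,
                       # below about -1 indicates drought conditions
```

SPI values computed this way at 3, 6 or 24 month accumulation windows are what the neural networks in the study take as inputs and targets.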

  4. Application of SeaWinds Scatterometer and TMI-SSM/I Rain Rates to Hurricane Analysis and Forecasting

    NASA Technical Reports Server (NTRS)

    Atlas, Robert; Hou, Arthur; Reale, Oreste

    2004-01-01

    Results provided by two different assimilation methodologies involving data from passive and active space-borne microwave instruments are presented. The impact of the precipitation estimates produced by the TRMM Microwave Imager (TMI) and Special Sensor Microwave/Imager (SSM/I) in a previously developed 1D variational continuous assimilation algorithm for assimilating tropical rainfall is shown on two hurricane cases. Results on the impact of the SeaWinds scatterometer on the intensity and track forecast of a mid-Atlantic hurricane are also presented. This work is the outcome of a collaborative effort between NASA and NOAA and indicates the substantial improvement in tropical cyclone forecasting that can result from the assimilation of space-based data in global atmospheric models.

  5. An evolving-requirements technology assessment process for advanced propulsion concepts

    NASA Astrophysics Data System (ADS)

    McClure, Erin Kathleen

    The following dissertation investigates the development of a methodology suitable for the evaluation of advanced propulsion concepts. At early stages of development, both the future performance of these concepts and their requirements are highly uncertain, making it difficult to forecast their future value. Developing advanced propulsion concepts requires a huge investment of resources. The methodology was developed to enhance decision-makers' understanding of the concepts, so that they could mitigate the risks associated with developing such concepts. A systematic methodology to identify potential advanced propulsion concepts and assess their robustness is necessary to reduce the risk of developing advanced propulsion concepts. Existing advanced design methodologies have evaluated the robustness of technologies or concepts to variations in requirements, but they are not suitable for evaluating a large number of dissimilar concepts. Variations in requirements have been shown to impact the development of advanced propulsion concepts, and any method designed to evaluate these concepts must incorporate the possible variations of the requirements into the assessment. In order to do so, a methodology was formulated to account for two aspects of the problem. First, it had to systematically identify a probabilistic distribution for the future requirements. Such a distribution would allow decision-makers to quantify the uncertainty introduced by variations in requirements. Second, it had to be able to assess the robustness of the propulsion concepts as a function of that distribution. This dissertation describes these enabling elements in depth and proceeds to synthesize them into a new method, the Evolving Requirements Technology Assessment (ERTA).
As a proof of concept, the ERTA method was used to evaluate and compare advanced propulsion systems that will be capable of powering a hurricane-tracking, High Altitude, Long Endurance (HALE) unmanned aerial vehicle (UAV). The use of the ERTA methodology to assess HALE UAV propulsion concepts demonstrated that potential variations in requirements do significantly impact the assessment and selection of propulsion concepts. The proof of concept also demonstrated that traditional forecasting techniques, such as cross-impact analysis, could be used to forecast the requirements for advanced propulsion concepts probabilistically. "Fitness", a measure of relative goodness, was used to evaluate the concepts. Finally, stochastic optimizations were used to evaluate the propulsion concepts across the range of requirement sets that were considered.
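    The cross-impact idea mentioned above can be sketched as a small Monte Carlo exercise: each future event has a baseline probability, the occurrence of one event shifts the odds of others, and repeated sampling yields an empirical probability distribution over requirement scenarios. The event names and all numbers below are hypothetical illustrations, not the dissertation's inputs.

    ```python
    import random

    # Toy cross-impact analysis over two hypothetical requirement events.
    base_prob = {"long_endurance_req": 0.5, "heavy_payload_req": 0.4}
    # impact[a][b]: additive shift applied to P(b) if event a has occurred
    impact = {"long_endurance_req": {"heavy_payload_req": +0.2},
              "heavy_payload_req": {"long_endurance_req": -0.1}}

    def sample_scenario(rng):
        occurred = {}
        for ev in base_prob:                     # evaluate events in fixed order
            p = base_prob[ev]
            for prior, hit in occurred.items():  # apply earlier outcomes' impacts
                if hit:
                    p += impact.get(prior, {}).get(ev, 0.0)
            occurred[ev] = rng.random() < min(max(p, 0.0), 1.0)
        return frozenset(e for e, hit in occurred.items() if hit)

    def scenario_distribution(n=10000, seed=1):
        """Empirical probability of each requirement scenario."""
        rng = random.Random(seed)
        counts = {}
        for _ in range(n):
            s = sample_scenario(rng)
            counts[s] = counts.get(s, 0) + 1
        return {s: c / n for s, c in counts.items()}
    ```

    The resulting distribution over scenarios is the kind of probabilistic requirements input against which concept robustness could then be evaluated.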

  6. High Resolution Forecasts in the Florida Straits: Predicting the Modulations of the Florida Current and Connectivity Around South Florida and Cuba

    NASA Astrophysics Data System (ADS)

    Kourafalou, V.; Kang, H.; Perlin, N.; Le Henaff, M.; Lamkin, J. T.

    2016-02-01

    Connectivity around the South Florida coastal regions and between South Florida and Cuba is largely influenced by a) local coastal processes and b) circulation in the Florida Straits, which is controlled by the larger scale Florida Current variability. Prediction of the physical connectivity is a necessary component for several activities that require ocean forecasts, such as oil spill response, fisheries research, and search and rescue. This requires a predictive system that can accommodate the intense coastal to offshore interactions and the linkages to the complex regional circulation. The Florida Straits, South Florida and Florida Keys Hybrid Coordinate Ocean Model is such a regional ocean predictive system, covering a large area over the Florida Straits and the adjacent land areas, representing both coastal and oceanic processes. The real-time ocean forecast system is high resolution (~900 m), embedded in larger scale predictive models. It includes detailed coastal bathymetry, high resolution/high frequency atmospheric forcing and provides 7-day forecasts, updated daily (see: http://coastalmodeling.rsmas.miami.edu/). The unprecedented high resolution and coastal details of this system provide value added on global forecasts through downscaling and allow a variety of applications. Examples will be presented, focusing on the period of a 2015 fisheries cruise around the coastal areas of Cuba, where model predictions helped guide the measurements on biophysical connectivity, under intense variability of the mesoscale eddy field and subsequent Florida Current meandering.

  7. Nonlinear problems in data-assimilation : Can synchronization help?

    NASA Astrophysics Data System (ADS)

    Tribbia, J. J.; Duane, G. S.

    2009-12-01

    Over the past several years, operational weather centers have initiated ensemble prediction and assimilation techniques to estimate the error covariance of forecasts in the short and the medium range. The ensemble techniques used are based on linear methods and have been shown to be a useful indicator of skill in the linear range, where forecast errors are small relative to climatological variance. While this advance has been impressive, there are still ad hoc aspects of its use in practice, such as the need for covariance inflation, which are troubling. Furthermore, to be of utility in the nonlinear range, an ensemble assimilation and prediction method must be capable of giving probabilistic information for the situation where a probability density forecast becomes multi-modal. A prototypical, simplest example of such a situation is the planetary-wave regime transition, where the pdf is bimodal. Our recent research shows how the inconsistencies and extensions of linear methodology can be consistently treated using the paradigm of synchronization, which views the problems of assimilation and forecasting as that of optimizing the forecast model state with respect to the future evolution of the atmosphere.

  8. Forecasting the absolute and relative shortage of physicians in Japan using a system dynamics model approach

    PubMed Central

    2013-01-01

    Background In Japan, a shortage of physicians, who serve a key role in healthcare provision, has been pointed out as a major medical issue. The healthcare workforce policy planner should consider future dynamic changes in physician numbers. The purpose of this study was to propose a physician supply forecasting methodology by applying system dynamics modeling to estimate future absolute and relative numbers of physicians. Method We constructed a forecasting model using a system dynamics approach. Forecasting the number of physician was performed for all clinical physician and OB/GYN specialists. Moreover, we conducted evaluation of sufficiency for the number of physicians and sensitivity analysis. Result & conclusion As a result, it was forecast that the number of physicians would increase during 2008–2030 and the shortage would resolve at 2026 for all clinical physicians. However, the shortage would not resolve for the period covered. This suggests a need for measures for reconsidering the allocation system of new entry physicians to resolve maldistribution between medical departments, in addition, for increasing the overall number of clinical physicians. PMID:23981198
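    The system dynamics core of such a supply forecast can be sketched as a single stock-and-flow loop: the physician stock is fed by new entries and drained by retirements, and the shortage resolves in the first year supply meets demand. All rates and initial values below are hypothetical, not the paper's estimates.

    ```python
    # Minimal stock-and-flow sketch of physician supply (system dynamics style).
    def simulate_supply(stock, entries_per_year, retirement_rate,
                        demand, demand_growth, start=2008, end=2030):
        """Yearly (year, supply, demand) trajectory over the forecast horizon."""
        history = []
        year = start
        while year <= end:
            history.append((year, stock, demand))
            stock = stock + entries_per_year - retirement_rate * stock
            demand = demand * (1 + demand_growth)
            year += 1
        return history

    def shortage_resolution_year(history):
        """First year in which supply meets demand, or None if never."""
        for year, stock, demand in history:
            if stock >= demand:
                return year
        return None  # shortage persists for the period covered
    ```

    A sensitivity analysis in this framework amounts to re-running the simulation while perturbing the entry and retirement rates and observing how the resolution year shifts.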

  9. Forecasting the absolute and relative shortage of physicians in Japan using a system dynamics model approach.

    PubMed

    Ishikawa, Tomoki; Ohba, Hisateru; Yokooka, Yuki; Nakamura, Kozo; Ogasawara, Katsuhiko

    2013-08-27

    In Japan, a shortage of physicians, who serve a key role in healthcare provision, has been pointed out as a major medical issue. Healthcare workforce policy planners should consider future dynamic changes in physician numbers. The purpose of this study was to propose a physician supply forecasting methodology by applying system dynamics modeling to estimate future absolute and relative numbers of physicians. We constructed a forecasting model using a system dynamics approach. Forecasting the number of physicians was performed for all clinical physicians and for OB/GYN specialists. Moreover, we evaluated the sufficiency of physician numbers and conducted a sensitivity analysis. As a result, it was forecast that the number of physicians would increase during 2008-2030 and that the shortage would resolve by 2026 for all clinical physicians. However, the shortage of OB/GYN specialists would not resolve within the period covered. This suggests a need for measures to reconsider the allocation system for new entry physicians to resolve maldistribution between medical departments and, in addition, to increase the overall number of clinical physicians.

  10. An Initial Assessment of the Impact of CYGNSS Ocean Surface Wind Assimilation on Navy Global and Mesoscale Numerical Weather Prediction

    NASA Astrophysics Data System (ADS)

    Baker, N. L.; Tsu, J.; Swadley, S. D.

    2017-12-01

    We assess the impact of assimilation of CYclone Global Navigation Satellite System (CYGNSS) ocean surface winds observations into the NAVGEM[i] global and COAMPS®[ii] mesoscale numerical weather prediction (NWP) systems. Both NAVGEM and COAMPS® used the NRL 4DVar assimilation system NAVDAS-AR[iii]. Long term monitoring of the NAVGEM Forecast Sensitivity Observation Impact (FSOI) indicates that the forecast error reduction for ocean surface wind vectors (ASCAT and WindSat) are significantly larger than for SSMIS wind speed observations. These differences are larger than can be explained by simply two pieces of information (for wind vectors) versus one (wind speed). To help understand these results, we conducted a series of Observing System Experiments (OSEs) to compare the assimilation of ASCAT wind vectors with the equivalent (computed) ASCAT wind speed observations. We found that wind vector assimilation was typically 3 times more effective at reducing the NAVGEM forecast error, with a higher percentage of beneficial observations. These results suggested that 4DVar, in the absence of an additional nonlinear outer loop, has limited ability to modify the analysis wind direction. We examined several strategies for assimilating CYGNSS ocean surface wind speed observations. In the first approach, we assimilated CYGNSS as wind speed observations, following the same methodology used for SSMIS winds. The next two approaches converted CYGNSS wind speed to wind vectors, using NAVGEM sea level pressure fields (following Holton, 1979), and using NAVGEM 10-m wind fields with the AER Variational Analysis Method. Finally, we compared these methods to CYGNSS wind speed assimilation using multiple outer loops with NAVGEM Hybrid 4DVar. Results support the earlier studies suggesting that NAVDAS-AR wind speed assimilation is sub-optimal. We present detailed results from multi-month NAVGEM assimilation runs along with case studies using COAMPS®. 
Comparisons include the fit of analyses and forecasts with in-situ observations and analyses from other NWP centers (e.g. ECMWF and GFS). [i] NAVy Global Environmental Model [ii] COAMPS® is a registered trademark of the Naval Research Laboratory for the Navy's Coupled Ocean Atmosphere Mesoscale Prediction System. [iii] NRL Atmospheric Variational Data Assimilation System

  11. System learning approach to assess sustainability and ...

    EPA Pesticide Factsheets

    This paper presents a methodology that combines the power of an Artificial Neural Network and Information Theory to forecast variables describing the condition of a regional system. The novelty and strength of this approach is in the application of Fisher information, a key method in Information Theory, to preserve trends in the historical data and prevent overfitting of projections. The methodology was applied to demographic, environmental, food and energy consumption, and agricultural production in the San Luis Basin regional system in Colorado, U.S.A. These variables are important for tracking conditions in human and natural systems. However, available data are often so far out of date that they limit the ability to manage these systems. Results indicate that the approaches developed provide viable tools for forecasting outcomes with the aim of assisting management toward sustainable trends. This methodology is also applicable for modeling different scenarios in other dynamic systems. Indicators are indispensable for tracking conditions in human and natural systems; however, available data are sometimes far out of date, limiting the ability to gauge system status. Techniques like regression and simulation are not sufficient, because system characteristics have to be modeled in ways that risk oversimplification of complex dynamics. This work presents a methodology combining the power of an Artificial Neural Network and Information Theory to capture patterns in a real dynamic system.
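    The Fisher information piece of such an approach can be illustrated with a simplified sliding-window index (a sketch of the general technique, not the EPA implementation): values in each window are binned into discrete states, state probabilities are estimated, and FI ≈ 4 Σ (√p_{i+1} − √p_i)² over adjacent bins, so an orderly regime concentrated in few states scores high while a flat distribution scores zero.

    ```python
    from math import sqrt

    # Simplified discrete Fisher-information-style index for a window of values.
    def fisher_index(values, n_bins=5):
        lo, hi = min(values), max(values)
        width = (hi - lo) / n_bins or 1.0   # degenerate constant window -> 1 bin
        counts = [0] * n_bins
        for v in values:
            b = min(int((v - lo) / width), n_bins - 1)
            counts[b] += 1
        p = [c / len(values) for c in counts]
        # FI ~ 4 * sum of squared increments of the probability amplitude sqrt(p)
        return 4 * sum((sqrt(p[i + 1]) - sqrt(p[i])) ** 2
                       for i in range(n_bins - 1))

    def windowed_fisher(series, window=8, n_bins=5):
        """Index value for each sliding window; a sustained drop flags a regime change."""
        return [fisher_index(series[i:i + window], n_bins)
                for i in range(len(series) - window + 1)]
    ```

    A steady system produces a flat, high index; a fall in the windowed index over time is the kind of signal the methodology uses to flag movement away from a sustainable regime.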

  12. Making large amounts of meteorological plots easily accessible to users

    NASA Astrophysics Data System (ADS)

    Lamy-Thepaut, Sylvie; Siemen, Stephan; Sahin, Cihan; Raoult, Baudouin

    2015-04-01

    The European Centre for Medium-Range Weather Forecasts (ECMWF) is an international organisation providing its member organisations with forecasts in the medium time range of 3 to 15 days, and some longer-range forecasts for up to a year ahead, with varying degrees of detail. As part of its mission, ECMWF generates an increasing number of forecast data products for its users. To support the work of forecasters and researchers and to let them make best use of ECMWF forecasts, the Centre also provides tools and interfaces to visualise their products. This allows users to make use of and explore forecasts without having to transfer large amounts of raw data. This is especially true for products based on ECMWF's 50-member ensemble forecast, where some specific processing and visualisation are applied to extract information. Every day, thousands of raw data fields are pushed to ECMWF's interactive web charts application called ecCharts, and thousands of products are processed and pushed to ECMWF's institutional web site. ecCharts provides a highly interactive application to display and manipulate recent numerical forecasts for forecasters in national weather services and ECMWF's commercial customers. With ecCharts, forecasters are able to explore ECMWF's medium-range forecasts in far greater detail than has previously been possible on the web, and this as soon as the forecast becomes available. All ecCharts products are also available through a machine-to-machine web map service based on the OGC Web Map Service (WMS) standard. The ECMWF institutional web site provides access to a large number of graphical products. It was entirely redesigned last year. It now shares the same infrastructure as ECMWF's ecCharts and can benefit from some ecCharts functionalities, for example the dashboard. The dashboard, initially developed for ecCharts, allows users to organise their own collection of products depending on their workflow, and is being further developed.
In its first implementation, it presents the user's products in a single interface with fast access to the original product, and possibilities of synchronous animations between them. But its functionalities are being extended to give users the freedom to collect not only ecCharts 2D maps and graphs, but also other ECMWF Web products such as monthly and seasonal products, scores, and observation monitoring. The dashboard will play a key role in helping the user to interpret the large amount of information that ECMWF is providing. This talk will present examples of how the new user interface can organise complex meteorological maps and graphs and show the new possibilities users have gained by using the web as a medium.

  13. Reservoir studies with geostatistics to forecast performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, R.W.; Behrens, R.A.; Emanuel, A.S.

    1991-05-01

    In this paper, example geostatistics and streamtube applications are presented for a waterflood and a CO{sub 2} flood in two low-permeability sandstone reservoirs. The hybrid approach of combining fine vertical resolution in cross-sectional models with streamtubes resulted in models that showed water channeling and provided realistic performance estimates. Results indicate that the combination of detailed geostatistical cross sections and fine-grid streamtube models offers a systematic approach for realistic performance forecasts.

  14. Urban Seismic Hazard Mapping for Memphis, Shelby County, Tennessee

    USGS Publications Warehouse

    Gomberg, Joan

    2006-01-01

    Earthquakes cannot be predicted, but scientists can forecast how strongly the ground is likely to shake as a result of an earthquake. Seismic hazard maps provide one way of conveying such forecasts. The U.S. Geological Survey (USGS), which produces seismic hazard maps for the Nation, is now engaged in developing more detailed maps for vulnerable urban areas. The first set of these maps is now available for Memphis, Tennessee.

  15. Verifying and Postprocessing the Ensemble Spread-Error Relationship

    NASA Astrophysics Data System (ADS)

    Hopson, Tom; Knievel, Jason; Liu, Yubao; Roux, Gregory; Wu, Wanli

    2013-04-01

    With the increased utilization of ensemble forecasts in weather and hydrologic applications, there is a need to verify their benefit over less expensive deterministic forecasts. One such potential benefit of ensemble systems is their capacity to forecast their own forecast error through the ensemble spread-error relationship. The paper begins by revisiting the limitations of the Pearson correlation alone in assessing this relationship. Next, we introduce two new metrics to consider in assessing the utility of an ensemble's varying dispersion. We argue there are two aspects of an ensemble's dispersion that should be assessed. First, and perhaps more fundamentally: is there enough variability in the ensemble's dispersion to justify the maintenance of an expensive ensemble prediction system (EPS), irrespective of whether the EPS is well-calibrated or not? To diagnose this, the factor that controls the theoretical upper limit of the spread-error correlation can be useful. Second, does the variable dispersion of an ensemble relate to a variable expectation of forecast error? Representing the spread-error correlation in relation to its theoretical limit can provide a simple diagnostic of this attribute. A context for these concepts is provided by assessing two operational ensembles: 30-member Western US temperature forecasts for the U.S. Army Test and Evaluation Command and 51-member Brahmaputra River flow forecasts of the Climate Forecast and Applications Project for Bangladesh. Both of these systems utilize a postprocessing technique based on quantile regression (QR) under a step-wise forward selection framework, leading to ensemble forecasts with both good reliability and sharpness. In addition, the methodology utilizes the ensemble's ability to self-diagnose forecast instability to produce calibrated forecasts with informative skill-spread relationships.
We will describe both ensemble systems briefly, review the steps used to calibrate the ensemble forecast, and present verification statistics using error-spread metrics, along with figures from operational ensemble forecasts before and after calibration.
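    The basic spread-error diagnostic discussed above can be illustrated with synthetic data (a sketch of the general idea, not the paper's QR postprocessing): for each forecast case, pair the ensemble spread with the ensemble-mean absolute error, then compute the Pearson correlation between the two. Even a perfectly reliable ensemble attains only a limited correlation, which is why correlation alone is a weak verification measure.

    ```python
    import random
    from statistics import mean

    def pearson(xs, ys):
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs) ** 0.5
        vy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (vx * vy)

    def spread_error_correlation(cases, rng, members=30):
        """Spread-error Pearson correlation for a synthetic, reliable ensemble."""
        spreads, errors = [], []
        for _ in range(cases):
            sigma = rng.uniform(0.5, 3.0)             # case-dependent true spread
            ens = [rng.gauss(0.0, sigma) for _ in range(members)]
            truth = rng.gauss(0.0, sigma)             # truth drawn from same pdf
            m = mean(ens)
            spreads.append(mean((x - m) ** 2 for x in ens) ** 0.5)
            errors.append(abs(m - truth))
        return pearson(spreads, errors)
    ```

    Even with the truth drawn from the forecast distribution itself, the correlation settles well below 1, illustrating the theoretical upper limit the abstract refers to.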

  16. Corridor-based forecasts of work-zone impacts for freeways.

    DOT National Transportation Integrated Search

    2011-08-09

    This project developed an analysis methodology and associated software implementation for the evaluation of : significant work zone impacts on freeways in North Carolina. The FREEVAL-WZ software tool allows the analyst : to predict the operational im...

  17. Transportation Sector Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model.

  18. Evaluation of flash-flood discharge forecasts in complex terrain using precipitation

    USGS Publications Warehouse

    Yates, D.; Warner, T.T.; Brandes, E.A.; Leavesley, G.H.; Sun, Jielun; Mueller, C.K.

    2001-01-01

    Operational prediction of flash floods produced by thunderstorm (convective) precipitation in mountainous areas requires accurate estimates or predictions of the precipitation distribution in space and time. The details of the spatial distribution are especially critical in complex terrain because the watersheds are generally small in size, and small position errors in the forecast or observed placement of the precipitation can distribute the rain over the wrong watershed. In addition to the need for good precipitation estimates and predictions, accurate flood prediction requires a surface-hydrologic model that is capable of predicting stream or river discharge based on the precipitation-rate input data. Different techniques for the estimation and prediction of convective precipitation will be applied to the Buffalo Creek, Colorado flash flood of July 1996, where over 75 mm of rain from a thunderstorm fell on the watershed in less than 1 h. The hydrologic impact of the precipitation was exacerbated by the fact that a significant fraction of the watershed experienced a wildfire approximately two months prior to the rain event. Precipitation estimates from the National Weather Service's operational Weather Surveillance Radar-1988 Doppler (WSR-88D) and the National Center for Atmospheric Research S-band research dual-polarization radar, colocated to the east of Denver, are compared. In addition, very short range forecasts from a convection-resolving dynamic model, which is initialized variationally using the radar reflectivity and Doppler winds, are compared with forecasts from an automated-algorithmic forecast system that also employs the radar data. The radar estimates of rain rate, and the two forecasting systems that employ the radar data, have degraded accuracy by virtue of the fact that they are applied in complex terrain.
Nevertheless, the radar data and forecasts from the dynamic model and the automated algorithm could be operationally useful for input to surface-hydrologic models employed for flood warning. Precipitation data provided by these various techniques at short time scales and at fine spatial resolutions are employed as detailed input to a distributed-parameter hydrologic model for flash-flood prediction and analysis. With the radar-based precipitation estimates employed as input, the simulated flood discharge was similar to that observed. The dynamic-model precipitation forecast showed the most promise in providing a significant discharge-forecast lead time. The algorithmic system's precipitation forecast did not demonstrate as much skill, but the associated discharge forecast would still have been sufficient to have provided an alert of impending flood danger.

  19. Satellite based Ocean Forecasting, the SOFT project

    NASA Astrophysics Data System (ADS)

    Stemmann, L.; Tintoré, J.; Moneris, S.

    2003-04-01

    The knowledge of future oceanic conditions would have enormous impact on human marine related areas. For such reasons, a number of international efforts are being carried out to obtain reliable and manageable ocean forecasting systems. Among the possible techniques that can be used to estimate the near-future states of the ocean, an ocean forecasting system based on satellite imagery is being developed through the Satellite-based Ocean ForecasTing (SOFT) project. SOFT, established by the European Commission, considers the development of a forecasting system of the ocean space-time variability based on satellite data by using Artificial Intelligence techniques. This system will be merged with numerical simulation approaches, via assimilation techniques, to get a hybrid SOFT-numerical forecasting system of improved performance. The results of the project will provide efficient forecasting of sea-surface temperature structures, currents, dynamic height, and biological activity associated with chlorophyll fields. All these quantities could give valuable information on the planning and management of human activities in marine environments such as navigation, fisheries, pollution control, or coastal management. A detailed identification of present or new needs and potential end-users concerned by such an operational tool is being performed. The project will study solutions adapted to these specific needs.

  20. A 30-day-ahead forecast model for grass pollen in north London, United Kingdom.

    PubMed

    Smith, Matt; Emberlin, Jean

    2006-03-01

    A 30-day-ahead forecast method has been developed for grass pollen in north London. The total period of the grass pollen season is covered by eight multiple regression models, each covering a 10-day period running consecutively from 21 May to 8 August. This means that three models were used for each 30-day forecast. The forecast models were produced using grass pollen and environmental data from 1961 to 1999 and tested on data from 2000 and 2002. Model accuracy was judged in two ways: the number of times the forecast model was able to successfully predict the severity (relative to the 1961-1999 dataset as a whole) of grass pollen counts in each of the eight forecast periods on a scale of 1 to 4; the number of times the forecast model was able to predict whether grass pollen counts were higher or lower than the mean. The models achieved 62.5% accuracy in both assessment years when predicting the relative severity of grass pollen counts on a scale of 1 to 4, which equates to six of the eight 10-day periods being forecast correctly. The models attained 87.5% and 100% accuracy in 2000 and 2002, respectively, when predicting whether grass pollen counts would be higher or lower than the mean. Attempting to predict pollen counts during distinct 10-day periods throughout the grass pollen season is a novel approach. The models also employed original methodology in the use of winter averages of the North Atlantic Oscillation to forecast 10-day means of allergenic pollen counts.
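    A toy version of one such regression model can make the approach concrete: an ordinary least-squares fit relating a winter average of the North Atlantic Oscillation (NAO) index to the 10-day mean grass pollen count of one forecast period. All numbers below are invented for illustration; the paper's eight models use more predictors and decades of data.

    ```python
    # Hypothetical one-predictor sketch of a 10-day-period pollen model.
    def ols_fit(x, y):
        """Simple linear regression: returns (intercept, slope)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        slope = (sum(xi * yi for xi, yi in zip(x, y)) - n * mx * my) / \
                (sum(xi * xi for xi in x) - n * mx * mx)
        return my - slope * mx, slope

    # Invented training data: winter NAO average vs. 10-day mean pollen count
    winter_nao = [-1.2, 0.3, 1.1, -0.5, 0.8, 1.6, -0.9]
    pollen_10d = [48.0, 35.0, 27.0, 44.0, 30.0, 22.0, 46.0]
    b0, b1 = ols_fit(winter_nao, pollen_10d)

    def forecast_pollen(nao):
        """Months-ahead pollen forecast from the preceding winter's NAO average."""
        return b0 + b1 * nao
    ```

    Because the winter NAO is known months before the pollen season, a fit like this is what allows a 30-day-ahead (or longer) lead time without needing in-season observations.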

  1. Ensemble-based methods for forecasting census in hospital units

    PubMed Central

    2013-01-01

    Background The ability to accurately forecast census counts in hospital departments has considerable implications for hospital resource allocation. In recent years several different methods have been proposed for forecasting census counts; however, many of these approaches do not use available patient-specific information. Methods In this paper we present an ensemble-based methodology for forecasting the census under a framework that simultaneously incorporates both (i) arrival trends over time and (ii) patient-specific baseline and time-varying information. The proposed model for predicting census has three components, namely: current census count, number of daily arrivals and number of daily departures. To model the number of daily arrivals, we use a seasonality adjusted Poisson Autoregressive (PAR) model where the parameter estimates are obtained via conditional maximum likelihood. The number of daily departures is predicted by modeling the probability of departure from the census using logistic regression models that are adjusted for the amount of time spent in the census and incorporate both patient-specific baseline and time-varying patient-specific covariate information. We illustrate our approach using neonatal intensive care unit (NICU) data collected at Women & Infants Hospital, Providence RI, which consists of 1001 consecutive NICU admissions between April 1st 2008 and March 31st 2009. Results Our results demonstrate statistically significant improved prediction accuracy for 3, 5, and 7 day census forecasts and increased precision of our forecasting model compared to a forecasting approach that ignores patient-specific information. Conclusions Forecasting models that utilize patient-specific baseline and time-varying information make the most of data typically available and have the capacity to substantially improve census forecasts. PMID:23721123
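    The three-component structure described above can be sketched directly: tomorrow's expected census = today's census + expected arrivals − expected departures, with a seasonality-adjusted Poisson autoregressive mean for arrivals and a logistic departure probability that grows with length of stay. All coefficients below are hypothetical, not estimates from the NICU data.

    ```python
    import math

    # Hypothetical seasonality-adjusted PAR mean for daily arrivals.
    def par_arrival_mean(prev_arrivals, day_of_week,
                         b0=0.8, b_ar=0.3, b_weekend=-0.4):
        weekend = 1.0 if day_of_week in (5, 6) else 0.0
        return math.exp(b0 + b_ar * math.log(prev_arrivals + 1)
                        + b_weekend * weekend)

    # Hypothetical logistic model: departure odds rise with length of stay.
    def departure_prob(length_of_stay, a0=-3.0, a_los=0.08):
        z = a0 + a_los * length_of_stay
        return 1.0 / (1.0 + math.exp(-z))

    def forecast_census(census_los, prev_arrivals, day_of_week):
        """census_los: current patients' lengths of stay (days).
        Returns the expected census one day ahead."""
        expected_departures = sum(departure_prob(los) for los in census_los)
        expected_arrivals = par_arrival_mean(prev_arrivals, day_of_week)
        return len(census_los) + expected_arrivals - expected_departures
    ```

    Because departure probabilities are evaluated per patient, the forecast automatically uses the patient-specific information that count-only methods discard.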

  2. Ensemble-based methods for forecasting census in hospital units.

    PubMed

    Koestler, Devin C; Ombao, Hernando; Bender, Jesse

    2013-05-30

    The ability to accurately forecast census counts in hospital departments has considerable implications for hospital resource allocation. In recent years several different methods have been proposed for forecasting census counts; however, many of these approaches do not use available patient-specific information. In this paper we present an ensemble-based methodology for forecasting the census under a framework that simultaneously incorporates both (i) arrival trends over time and (ii) patient-specific baseline and time-varying information. The proposed model for predicting census has three components, namely: current census count, number of daily arrivals and number of daily departures. To model the number of daily arrivals, we use a seasonality adjusted Poisson Autoregressive (PAR) model where the parameter estimates are obtained via conditional maximum likelihood. The number of daily departures is predicted by modeling the probability of departure from the census using logistic regression models that are adjusted for the amount of time spent in the census and incorporate both patient-specific baseline and time-varying patient-specific covariate information. We illustrate our approach using neonatal intensive care unit (NICU) data collected at Women & Infants Hospital, Providence RI, which consists of 1001 consecutive NICU admissions between April 1st 2008 and March 31st 2009. Our results demonstrate statistically significant improved prediction accuracy for 3, 5, and 7 day census forecasts and increased precision of our forecasting model compared to a forecasting approach that ignores patient-specific information. Forecasting models that utilize patient-specific baseline and time-varying information make the most of data typically available and have the capacity to substantially improve census forecasts.

  3. Development of a multi-ensemble Prediction Model for China

    NASA Astrophysics Data System (ADS)

    Brasseur, G. P.; Bouarar, I.; Petersen, A. K.

    2016-12-01

    As part of the EU-sponsored Panda and MarcoPolo Projects, a multi-model prediction system including 7 models has been developed. Most regional models use global air quality predictions provided by the Copernicus Atmospheric Monitoring Service and downscale the forecast at relatively high spatial resolution in eastern China. The paper will describe the forecast system and show examples of forecasts produced for several Chinese urban areas and displayed on a web site developed by the Dutch Meteorological service. A discussion on the accuracy of the predictions based on a detailed validation process using surface measurements from the Chinese monitoring network will be presented.

  4. Research and Development for Technology Evolution Potential Forecasting System

    NASA Astrophysics Data System (ADS)

    Gao, Changqing; Cao, Shukun; Wang, Yuzeng; Ai, Changsheng; Ze, Xiangbo

    Technology forecasting is a powerful tool that helps enterprises secure their future. The evolutionary potential radar plot is a necessary step in several valuable methods that help technology managers choose the right technical strategy. A software system for Technology Evolution Potential Forecasting (TEPF) with automatic radar plot drawing is introduced in this paper. The framework of the system and the data structure describing the concrete evolution pattern are illustrated in detail, and the algorithm for radar plot drawing is presented. A referenced case study shows that the TEPF system is an effective tool in the technology strategy analysis process.

  5. SEASAT economic assessment. Volume 3: Offshore oil and natural gas industry case study and generalization

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The economic benefits of improved ocean condition, weather and ice forecasts by SEASAT satellites to the exploration, development and production of oil and natural gas in the offshore regions are considered. The results of case studies which investigate the effects of forecast accuracy on offshore operations in the North Sea, the Celtic Sea, and the Gulf of Mexico are reported. A methodology for generalizing the results to other geographic regions of offshore oil and natural gas exploration and development is described.

  6. Seasonal forecasting of dolphinfish distribution in eastern Australia to aid recreational fishers and managers

    NASA Astrophysics Data System (ADS)

    Brodie, Stephanie; Hobday, Alistair J.; Smith, James A.; Spillman, Claire M.; Hartog, Jason R.; Everett, Jason D.; Taylor, Matthew D.; Gray, Charles A.; Suthers, Iain M.

    2017-06-01

    Seasonal forecasting of environmental conditions and marine species distribution has been used as a decision support tool in commercial and aquaculture fisheries. These tools may also be applicable to species targeted by the recreational fisheries sector, a sector that is increasing its use of marine resources and making important economic and social contributions to coastal communities around the world. Here, a seasonal forecast of the habitat and density of dolphinfish (Coryphaena hippurus), based on sea surface temperatures, was developed for the east coast of New South Wales (NSW), Australia. Two prototype forecast products were created: geographic spatial forecasts of dolphinfish habitat and a latitudinal summary identifying the location of fish density peaks. The less detailed latitudinal summary was created to limit the resolution of habitat information, to prevent potential resource over-exploitation by fishers in the absence of total catch controls. The forecast dolphinfish habitat model was accurate at the start of the annual dolphinfish migration in NSW (December), but other months (January - May) showed poor performance due to spatial and temporal variability in the catch data used in model validation. Habitat forecasts for December were useful up to five months ahead, with performance decreasing as forecasts were made further into the future. The continued development and sound application of seasonal forecasts will help fishery industries cope with future uncertainty and promote dynamic and sustainable marine resource management.

  7. The Schaake shuffle: A method for reconstructing space-time variability in forecasted precipitation and temperature fields

    USGS Publications Warehouse

    Clark, M.R.; Gangopadhyay, S.; Hay, L.; Rajagopalan, B.; Wilby, R.

    2004-01-01

    A number of statistical methods that are used to provide local-scale ensemble forecasts of precipitation and temperature do not contain realistic spatial covariability between neighboring stations or realistic temporal persistence for subsequent forecast lead times. To demonstrate this point, output from a global-scale numerical weather prediction model is used in a stepwise multiple linear regression approach to downscale precipitation and temperature to individual stations located in and around four study basins in the United States. Output from the forecast model is downscaled for lead times up to 14 days. Residuals in the regression equation are modeled stochastically to provide 100 ensemble forecasts. The precipitation and temperature ensembles from this approach have a poor representation of the spatial variability and temporal persistence. The spatial correlations for downscaled output are considerably lower than observed spatial correlations at short forecast lead times (e.g., less than 5 days) when there is high accuracy in the forecasts. At longer forecast lead times, the downscaled spatial correlations are close to zero. Similarly, the observed temporal persistence is only partly present at short forecast lead times. A method is presented for reordering the ensemble output in order to recover the space-time variability in precipitation and temperature fields. In this approach, the ensemble members for a given forecast day are ranked and matched with the rank of precipitation and temperature data from days randomly selected from similar dates in the historical record. The ensembles are then reordered to correspond to the original order of the selection of historical data. Using this approach, the observed intersite correlations, intervariable correlations, and the observed temporal persistence are almost entirely recovered. This reordering methodology also has applications for recovering the space-time variability in modeled streamflow. © 2004 American Meteorological Society.
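    The reordering step at the heart of the Schaake shuffle can be sketched for a single station and lead time as follows (a minimal illustration assuming equal numbers of ensemble members and historical days; the function name is ours):

```python
import numpy as np

def schaake_shuffle(ensemble, historical):
    """Reorder ensemble members so their rank structure matches a
    historical sequence (one station/variable/lead time at a time).

    ensemble   : 1-D array of n ensemble forecast values
    historical : 1-D array of n observations from similar dates
    Applying this with the SAME historical days across stations and
    lead times restores the observed space-time rank correlations.
    """
    ensemble = np.asarray(ensemble, dtype=float)
    # rank of each historical day within its own sequence
    hist_ranks = np.argsort(np.argsort(historical))
    # give the k-th smallest ensemble value to the position whose
    # historical observation was the k-th smallest
    return np.sort(ensemble)[hist_ranks]

members = schaake_shuffle([2.0, 9.0, 5.0], [10.0, 30.0, 20.0])
# historical ranks are (0, 2, 1), so members becomes [2.0, 9.0, 5.0]
```

    The values of the ensemble are untouched (marginal distributions are preserved); only their ordering across members changes, which is what recovers the intersite and intervariable correlations.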

  8. Objective calibration of numerical weather prediction models

    NASA Astrophysics Data System (ADS)

    Voudouri, A.; Khain, P.; Carmona, I.; Bellprat, O.; Grazzini, F.; Avgoustoglou, E.; Bettems, J. M.; Kaufmann, P.

    2017-07-01

    Numerical weather prediction (NWP) and climate models use parameterization schemes for physical processes, which often include free or poorly constrained parameters. Model developers normally calibrate the values of these parameters subjectively to improve the agreement of forecasts with available observations, a procedure referred to as expert tuning. A practicable objective multivariate calibration method built on a quadratic meta-model (MM), previously applied to a regional climate model (RCM), has been shown to be at least as good as expert tuning. Based on these results, this study presents an approach for implementing the methodology in an NWP model. Challenges in transferring the methodology from RCM to NWP are not restricted to the higher resolution and different time scales: the sensitivity of NWP model quality to the model parameter space has to be clarified, and the overall procedure has to be optimized in terms of the computing resources required to calibrate an NWP model. Three free model parameters, mainly affecting turbulence parameterization schemes, were selected with respect to their influence on variables associated with daily forecasts, such as daily minimum and maximum 2 m temperature as well as 24 h accumulated precipitation. Preliminary results indicate that the calibration is both affordable in terms of computing resources and meaningful in terms of improved forecast quality. In addition, the proposed methodology has the advantage of being a replicable procedure that can be applied when an updated model version is launched and/or the same model implementation is customized for different climatological areas.
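    The quadratic meta-model idea can be illustrated in one dimension: run the model at a few trial parameter values, fit a parabola to the resulting error scores, and take its minimizer as the calibrated value. This is a one-parameter sketch only (the paper's MM is multivariate), and the numbers below are synthetic:

```python
import numpy as np

def calibrate_one_parameter(values, scores):
    """Fit a quadratic meta-model score(p) ~ a*p^2 + b*p + c to
    (parameter, error-score) pairs from calibration runs, then return
    the parameter value minimizing the fitted error."""
    a, b, c = np.polyfit(values, scores, 2)
    if a <= 0:
        raise ValueError("meta-model has no interior minimum")
    return -b / (2 * a)

# three trial runs of the model, with a synthetic error score that is
# lowest near p = 0.4
best = calibrate_one_parameter([0.1, 0.4, 0.9], [0.95, 0.50, 1.75])
```

    In practice the quadratic surface is fitted over several parameters jointly, so interactions between parameters are captured by the cross terms of the multivariate quadratic.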

  9. Operational early warning of shallow landslides in Norway: Evaluation of landslide forecasts and associated challenges

    NASA Astrophysics Data System (ADS)

    Dahl, Mads-Peter; Colleuille, Hervé; Boje, Søren; Sund, Monica; Krøgli, Ingeborg; Devoli, Graziella

    2015-04-01

    The Norwegian Water Resources and Energy Directorate (NVE) runs a national early warning system (EWS) for shallow landslides in Norway. Slope failures covered by the EWS are debris slides, debris flows, debris avalanches and slush flows. The EWS has been operational on a national scale since 2013 and consists of (a) quantitative landslide thresholds and daily hydro-meteorological prognoses; (b) daily qualitative expert evaluation of prognoses and additional data in the decision to determine warning levels; and (c) publication of warning levels through various custom-built internet platforms. The effectiveness of an EWS depends both on the quality of the forecasts being issued and on the communication of forecasts to the public. In this analysis a preliminary evaluation of landslide forecasts from the Norwegian EWS within the period 2012-2014 is presented. Criteria for categorizing forecasts as correct, missed events or false alarms are discussed, and concrete examples of forecasts falling into the latter two categories are presented. The evaluation shows a rate of correct forecasts exceeding 90%. However, correct forecast categorization is sometimes difficult, particularly due to poorly documented landslide events. Several challenges have to be met in the process of further lowering the rates of missed events and false alarms in the EWS. Among others, these include better implementation of susceptibility maps in landslide forecasting, more detailed regionalization of hydro-meteorological landslide thresholds, improved prognoses for precipitation, snowmelt and soil water content, as well as the build-up of more experience among the people performing landslide forecasting.

  10. Results from the Centers for Disease Control and Prevention's Predict the 2013-2014 Influenza Season Challenge.

    PubMed

    Biggerstaff, Matthew; Alper, David; Dredze, Mark; Fox, Spencer; Fung, Isaac Chun-Hai; Hickmann, Kyle S; Lewis, Bryan; Rosenfeld, Roni; Shaman, Jeffrey; Tsou, Ming-Hsiang; Velardi, Paola; Vespignani, Alessandro; Finelli, Lyn

    2016-07-22

    Early insights into the timing of the start, peak, and intensity of the influenza season could be useful in planning influenza prevention and control activities. To encourage development and innovation in influenza forecasting, the Centers for Disease Control and Prevention (CDC) organized a challenge to predict the 2013-14 United States influenza season. Challenge contestants were asked to forecast the start, peak, and intensity of the 2013-2014 influenza season at the national level and at any or all Health and Human Services (HHS) region level(s). The challenge ran from December 1, 2013 to March 27, 2014; contestants were required to submit 9 biweekly forecasts at the national level to be eligible. The selection of the winner was based on expert evaluation of the methodology used to make the prediction and on the accuracy of the prediction as judged against the U.S. Outpatient Influenza-like Illness Surveillance Network (ILINet). Nine teams submitted 13 forecasts for all required milestones. The first forecast was due on December 2, 2013; 3/13 forecasts received correctly predicted the start of the influenza season within one week, 1/13 predicted the peak within 1 week, 3/13 predicted the peak ILINet percentage within 1%, and 4/13 predicted the season duration within 1 week. For the prediction due on December 19, 2013, the number of forecasts that correctly predicted the peak week increased to 2/13, the peak percentage to 6/13, and the duration of the season to 6/13. As the season progressed, the forecasts became more stable and closer to the season milestones. Forecasting has become technically feasible, but further efforts are needed to improve forecast accuracy so that policy makers can reliably use these predictions. CDC and challenge contestants plan to build upon the methods developed during this contest to improve the accuracy of influenza forecasts.

  11. Forecasting Non-Stationary Diarrhea, Acute Respiratory Infection, and Malaria Time-Series in Niono, Mali

    PubMed Central

    Medina, Daniel C.; Findley, Sally E.; Guindo, Boubacar; Doumbia, Seydou

    2007-01-01

    Background Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with diarrhea, acute respiratory infection, and malaria. With the increasing awareness that the aforementioned infectious diseases impose an enormous burden on developing countries, public health programs therein could benefit from parsimonious general-purpose forecasting methods to enhance infectious disease intervention. Unfortunately, these disease time-series often i) suffer from non-stationarity; ii) exhibit large inter-annual plus seasonal fluctuations; and, iii) require disease-specific tailoring of forecasting methods. Methodology/Principal Findings In this longitudinal retrospective (01/1996–06/2004) investigation, diarrhea, acute respiratory infection of the lower tract, and malaria consultation time-series are fitted with a general-purpose econometric method, namely the multiplicative Holt-Winters, to produce contemporaneous on-line forecasts for the district of Niono, Mali. This method accommodates seasonal, as well as inter-annual, fluctuations and produces reasonably accurate median 2- and 3-month horizon forecasts for these non-stationary time-series, i.e., 92% of the 24 time-series forecasts generated (2 forecast horizons, 3 diseases, and 4 age categories = 24 time-series forecasts) have mean absolute percentage errors circa 25%. 
Conclusions/Significance The multiplicative Holt-Winters forecasting method: i) performs well across diseases with dramatically distinct transmission modes and hence it is a strong general-purpose forecasting method candidate for non-stationary epidemiological time-series; ii) obliquely captures prior non-linear interactions between climate and the aforementioned disease dynamics thus, obviating the need for more complex disease-specific climate-based parametric forecasting methods in the district of Niono; furthermore, iii) readily decomposes time-series into seasonal components thereby potentially assisting with programming of public health interventions, as well as monitoring of disease dynamics modification. Therefore, these forecasts could improve infectious diseases management in the district of Niono, Mali, and elsewhere in the Sahel. PMID:18030322
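    The multiplicative Holt-Winters recursions behind this method can be sketched in a few lines. This is a generic textbook implementation, not the authors' code; the smoothing constants and the simple first-season initialization are illustrative assumptions:

```python
def holt_winters_mul(y, m, alpha=0.3, beta=0.1, gamma=0.2, horizon=3):
    """Multiplicative Holt-Winters: an additive level-plus-trend
    component is multiplied by seasonal factors. m is the season
    length (12 for monthly data). Initialization is deliberately
    simple (the first season defines the seasonal factors); production
    code would fit alpha/beta/gamma to the data.
    """
    season_mean = sum(y[:m]) / m
    seasonals = [v / season_mean for v in y[:m]]
    level = season_mean
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / m ** 2
    for t in range(m, len(y)):
        s = seasonals[t % m]
        last_level = level
        # level: deseasonalized observation blended with prior level+trend
        level = alpha * (y[t] / s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonals[t % m] = gamma * (y[t] / level) + (1 - gamma) * s
    return [(level + h * trend) * seasonals[(len(y) + h - 1) % m]
            for h in range(1, horizon + 1)]

# a perfectly periodic monthly-style series (period 4 for brevity)
forecasts = holt_winters_mul([10, 20, 30, 40] * 3, m=4, horizon=4)
```

    On a perfectly periodic series the forecasts simply reproduce the seasonal cycle; on real consultation counts the level and trend terms track the inter-annual drift while the seasonal factors capture the within-year pattern.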

  12. Traffic flow forecasting using approximate nearest neighbor nonparametric regression

    DOT National Transportation Integrated Search

    2000-12-01

    The purpose of this research is to enhance nonparametric regression (NPR) for use in real-time systems by first reducing execution time using advanced data structures and imprecise computations and then developing a methodology for applying NPR. Due ...

  13. A principal component regression model to forecast airborne concentration of Cupressaceae pollen in the city of Granada (SE Spain), during 1995-2006.

    PubMed

    Ocaña-Peinado, Francisco M; Valderrama, Mariano J; Bouzas, Paula R

    2013-05-01

    The problem of developing a forecast of atmospheric cypress pollen levels two weeks ahead is tackled in this paper by developing a principal component multiple regression model involving several climatic variables. The efficacy of the proposed model is validated by means of an application to real data on Cupressaceae pollen concentration in the city of Granada (southeast Spain). The model was applied to data from 11 consecutive years (1995-2005), with 2006 being used to validate the forecasts. Based on the work of different authors, factors such as temperature, humidity, hours of sun and wind speed were incorporated into the model. This methodology explains approximately 75-80% of the variability in the airborne Cupressaceae pollen concentration.
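    The principal component regression approach can be sketched generically: standardize the climatic predictors, project them onto the leading principal components, and regress the pollen response on the component scores. This is a sketch of the technique, not the authors' exact model; the standardization and component count are assumptions:

```python
import numpy as np

def pcr_fit_predict(X, y, X_new, n_components=2):
    """Principal component regression: project standardized predictors
    onto the leading PCs, regress y on the scores, predict for X_new."""
    mu, sd = X.mean(axis=0), X.std(axis=0)
    Z = (X - mu) / sd
    # principal directions from the SVD of the standardized predictors
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    W = Vt[:n_components].T                     # loadings (p x k)
    scores = Z @ W                              # component scores
    A = np.column_stack([np.ones(len(y)), scores])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    Z_new = (np.asarray(X_new) - mu) / sd
    return np.column_stack([np.ones(len(X_new)), Z_new @ W]) @ coef

# illustrative use on synthetic data where y is exactly linear in X;
# with all components retained, PCR coincides with ordinary least squares
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
y = X @ np.array([1.0, 2.0, 0.5, -1.0])
pred = pcr_fit_predict(X, y, X, n_components=4)
```

    Truncating `n_components` below the number of predictors is what controls the collinearity among correlated climatic variables, at the cost of some bias.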

  14. How seasonal forecast could help a decision maker: an example of climate service for water resource management

    NASA Astrophysics Data System (ADS)

    Viel, Christian; Beaulant, Anne-Lise; Soubeyroux, Jean-Michel; Céron, Jean-Pierre

    2016-04-01

    The FP7 project EUPORIAS was a great opportunity for the climate community to co-design with stakeholders some original and innovative climate services at seasonal time scales. In this framework, Météo-France proposed a prototype that aimed to provide water resource managers with tailored information to better anticipate the coming season. It is based on a forecasting system, built on a refined hydrological suite, forced by a coupled seasonal forecast model. It particularly delivers probabilistic river flow predictions for river basins across the French territory. This paper presents the work we have done with "EPTB Seine Grands Lacs" (EPTB SGL), an institutional stakeholder in charge of managing 4 large reservoirs on the upper Seine Basin. First, we present the co-design phase, that is, the translation of classical climate outputs into several indices relevant to the stakeholder's decision-making process (DMP). Second, we detail the evaluation of the impact of the forecasts on the DMP. This evaluation is based on an experiment carried out in collaboration with the stakeholder. Concretely, EPTB SGL replayed some past decisions in three different contexts: without any forecast, with forecast A and with forecast B. One of forecasts A and B contained real seasonal forecasts; the other contained only random forecasts drawn from the past climate. This placebo experiment, realised as a blind test, allowed us to calculate promising skill scores for the DMP based on seasonal forecasts, in comparison both to a classical approach based on climatology and to EPTB SGL's current practice.

  15. Improving Forecasts for Water Management

    NASA Astrophysics Data System (ADS)

    Arumugam, Sankar; Wood, Andy; Rajagopalan, Balaji; Schaake, John

    2014-01-01

    Recent advances in seasonal to interannual hydroclimate predictions provide an opportunity for developing a proactive approach toward water management. This motivated a recent AGU Chapman Conference (see program details at http://chapman.agu.org/watermanagement/). Approximately 85 participants from the United States, Oceania, Asia, Europe, and South America presented and discussed the current state of successes, challenges, and opportunities in seasonal to interannual hydroclimate forecasts and water management, and a number of key messages emerged.

  16. SWIFT Observations in the Arctic Sea State DRI

    DTIC Science & Technology

    2015-09-30

    to understand the role of waves and sea state in the Arctic Ocean, such that forecast models are improved and a robust climatology is defined...OBJECTIVES The objectives are to: develop a sea state climatology for the Arctic Ocean, improve wave forecasting in the presence of sea ice, improve...experiment, coordination of remote sensing products, and analysis of climatology. A detailed cruise plan has been written, including a table of the remote

  17. Retirement Forecasting. Technical Descriptions of Cost, Decision and Income Models. Volume 2. Report to the Chairman, Subcommittee on Social Security and Income Maintenance Programs, Committee on Finance, United States Senate.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC.

    This supplementary report identifies and provides individual descriptions and reviews of 71 retirement forecasting models. Composed of appendices, it is intended as a source of more detailed information than that included in the main volume of the report. Appendix I is an introduction. Appendix II contains individual descriptions of 32 models of…

  18. Mesoscale Modeling, Forecasting and Remote Sensing Research.

    DTIC Science & Technology

    Remote sensing, cyclonic scale diagnostic studies and mesoscale numerical modeling and forecasting are summarized. Mechanisms involved in the release of potential instability are discussed and simulated quantitatively, giving particular attention to the convective formulation. The basic mesoscale model is documented, including the equations, boundary condition, finite differences and initialization through an idealized frontal zone. Results of tests including a three dimensional test with real data, tests of convective/mesoscale interaction and tests with a detailed

  19. Evaluating Downscaling Methods for Seasonal Climate Forecasts over East Africa

    NASA Technical Reports Server (NTRS)

    Roberts, J. Brent; Robertson, Franklin R.; Bosilovich, Michael; Lyon, Bradfield; Funk, Chris

    2013-01-01

    The U.S. National Multi-Model Ensemble seasonal forecasting system is providing hindcast and real-time data streams to be used in assessing and improving seasonal predictive capacity. The NASA / USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of NMME forecasts specifically for use in impact modeling within hub regions including East Africa, the Hindu Kush-Himalayan (HKH) region and Mesoamerica. One of the participating models in NMME is the NASA Goddard Earth Observing System (GEOS5). This work will present an intercomparison of downscaling methods using the GEOS5 seasonal forecasts of temperature and precipitation over East Africa. The current seasonal forecasting system provides monthly averaged forecast anomalies. These anomalies must be spatially downscaled and temporally disaggregated for use in application modeling (e.g. hydrology, agriculture). There are several available downscaling methodologies that can be implemented to accomplish this goal. Selected methods include both a non-homogenous hidden Markov model and an analogue based approach. A particular emphasis will be placed on quantifying the ability of different methods to capture the intermittency of precipitation within both the short and long rain seasons. Further, the ability to capture spatial covariances will be assessed. Both probabilistic and deterministic skill measures will be evaluated over the hindcast period.

  20. Forecasting of Radiation Belts: Results From the PROGRESS Project.

    NASA Astrophysics Data System (ADS)

    Balikhin, M. A.; Arber, T. D.; Ganushkina, N. Y.; Walker, S. N.

    2017-12-01

    The overall goal of the PROGRESS project, funded within the framework of the EU Horizon 2020 programme, is to combine first-principles models with systems science methodologies to achieve reliable forecasts of the geo-space particle radiation environment. PROGRESS incorporates three themes: the propagation of the solar wind to L1, the forecast of geomagnetic indices, and the forecast of fluxes of energetic electrons within the magnetosphere. One of the important aspects of the PROGRESS project is the development of statistical wave models for magnetospheric waves that affect the dynamics of energetic electrons, such as lower band chorus, hiss and equatorial noise. The error reduction ratio (ERR) concept has been used to optimise the set of solar wind and geomagnetic parameters used to organise the statistical wave models for these emissions. The resulting sets of parameters and statistical wave models will be presented and discussed. However, the ERR analysis also indicates that the combination of solar wind and geomagnetic parameters accounts for only part of the variance of the emissions under investigation (lower band chorus, hiss and equatorial noise). In addition, advances in the forecast of fluxes of energetic electrons achieved by the PROGRESS project, exploiting empirical models and the first-principles IMPTAM model, are presented.

  1. Evaluating Downscaling Methods for Seasonal Climate Forecasts over East Africa

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Roberts, J. Brent; Bosilovich, Michael; Lyon, Bradfield

    2013-01-01

    The U.S. National Multi-Model Ensemble seasonal forecasting system is providing hindcast and real-time data streams to be used in assessing and improving seasonal predictive capacity. The NASA / USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of NMME forecasts specifically for use in impact modeling within hub regions including East Africa, the Hindu Kush-Himalayan (HKH) region and Mesoamerica. One of the participating models in NMME is the NASA Goddard Earth Observing System (GEOS5). This work will present an intercomparison of downscaling methods using the GEOS5 seasonal forecasts of temperature and precipitation over East Africa. The current seasonal forecasting system provides monthly averaged forecast anomalies. These anomalies must be spatially downscaled and temporally disaggregated for use in application modeling (e.g. hydrology, agriculture). There are several available downscaling methodologies that can be implemented to accomplish this goal. Selected methods include both a non-homogenous hidden Markov model and an analogue based approach. A particular emphasis will be placed on quantifying the ability of different methods to capture the intermittency of precipitation within both the short and long rain seasons. Further, the ability to capture spatial covariances will be assessed. Both probabilistic and deterministic skill measures will be evaluated over the hindcast period.

  2. Projected electric power demands for the Potomac Electric Power Company. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estomin, S.; Kahal, M.

    1984-03-01

    This three-volume report presents the results of an econometric forecast of peak and electric power demands for the Potomac Electric Power Company (PEPCO) through the year 2002. Volume I describes the methodology, the results of the econometric estimations, the forecast assumptions and the calculated forecasts of peak demand and energy usage. Separate sets of models were developed for the Maryland Suburbs (Montgomery and Prince George's counties), the District of Columbia and Southern Maryland (served by a wholesale customer of PEPCO). For each of the three jurisdictions, energy equations were estimated for residential and commercial/industrial customers for both summer and wintermore » seasons. For the District of Columbia, summer and winter equations for energy sales to the federal government were also estimated. Equations were also estimated for street lighting and energy losses. Noneconometric techniques were employed to forecast energy sales to the Northern Virginia suburbs, Metrorail and federal government facilities located in Maryland.« less

  3. A cross impact methodology for the assessment of US telecommunications system with application to fiber optics development: Executive summary

    NASA Technical Reports Server (NTRS)

    Martino, J. P.; Lenz, R. C., Jr.; Chen, K. L.

    1979-01-01

    A cross impact model of the U.S. telecommunications system was developed. For this model, it was necessary to prepare forecasts of the major segments of the telecommunications system, such as satellites, telephone, TV, CATV, radio broadcasting, etc. In addition, forecasts were prepared of the traffic generated by a variety of new or expanded services, such as electronic check clearing and point of sale electronic funds transfer. Finally, the interactions among the forecasts were estimated (the cross impacts). Both the forecasts and the cross impacts were used as inputs to the cross impact model, which could then be used to stimulate the future growth of the entire U.S. telecommunications system. By varying the inputs, technology changes or policy decisions with regard to any segment of the system could be evaluated in the context of the remainder of the system. To illustrate the operation of the model, a specific study was made of the deployment of fiber optics, throughout the telecommunications system.

  4. Forecasting malaria incidence based on monthly case reports and environmental factors in Karuzi, Burundi, 1997–2003

    PubMed Central

    Gomez-Elipe, Alberto; Otero, Angel; van Herp, Michel; Aguirre-Jaime, Armando

    2007-01-01

    Background The objective of this work was to develop a model to predict malaria incidence in an area of unstable transmission by studying the association between environmental variables and disease dynamics. Methods The study was carried out in Karuzi, a province in the Burundi highlands, using time series of monthly notifications of malaria cases from local health facilities, data from rain and temperature records, and the normalized difference vegetation index (NDVI). Using autoregressive integrated moving average (ARIMA) methodology, a model showing the relation between monthly notifications of malaria cases and the environmental variables was developed. Results The best forecasting model (R2adj = 82%, p < 0.0001 and 93% forecasting accuracy in the range ± 4 cases per 100 inhabitants) included the NDVI, mean maximum temperature, rainfall and number of malaria cases in the preceding month. Conclusion This model is a simple and useful tool for producing reasonably reliable forecasts of the malaria incidence rate in the study area. PMID:17892540
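    The structure of such a model, with last month's cases and lagged environmental covariates as predictors, can be sketched as a plain linear regression. This is a deliberate simplification of the ARIMA approach in the abstract; the one-month lag and the synthetic coefficients below are assumptions for illustration only:

```python
import numpy as np

def fit_lagged_model(cases, ndvi, tmax, rain):
    """Regress this month's cases on last month's cases and last
    month's environmental covariates -- a simplified stand-in for an
    ARIMA model with environmental regressors."""
    y = np.asarray(cases[1:], dtype=float)
    A = np.column_stack([
        np.ones(len(y)),
        cases[:-1],                       # autoregressive term
        ndvi[:-1], tmax[:-1], rain[:-1],  # lagged environment
    ])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_next(coef, cases, ndvi, tmax, rain):
    """One-month-ahead forecast from the latest observations."""
    x = np.array([1.0, cases[-1], ndvi[-1], tmax[-1], rain[-1]])
    return float(x @ coef)

# synthetic illustration: cases follow a known lagged relation, so the
# fit should recover the generating coefficients
rng = np.random.default_rng(1)
ndvi, tmax, rain = rng.random(60), rng.random(60), rng.random(60)
cases = np.zeros(60)
for t in range(1, 60):
    cases[t] = 2 + 0.5 * cases[t-1] + 3 * ndvi[t-1] + tmax[t-1] + 0.25 * rain[t-1]
coef = fit_lagged_model(cases, ndvi, tmax, rain)
```

    A full ARIMA treatment would additionally difference the series and model the error autocorrelation, but the lagged-regressor structure above is the part that carries the environmental signal.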

  5. A cross impact methodology for the assessment of US telecommunications system with application to fiber optics development, volume 1

    NASA Technical Reports Server (NTRS)

    Martino, J. P.; Lenz, R. C., Jr.; Chen, K. L.; Kahut, P.; Sekely, R.; Weiler, J.

    1979-01-01

    A cross impact model of the U.S. telecommunications system was developed. It was necessary to prepare forecasts of the major segments of the telecommunications system, such as satellites, telephone, TV, CATV, radio broadcasting, etc. In addition, forecasts were prepared of the traffic generated by a variety of new or expanded services, such as electronic check clearing and point of sale electronic funds transfer. Finally, the interactions among the forecasts were estimated (the cross impact). Both the forecasts and the cross impacts were used as inputs to the cross impact model, which could then be used to stimulate the future growth of the entire U.S. telecommunications system. By varying the inputs, technology changes or policy decisions with regard to any segment of the system could be evaluated in the context of the remainder of the system. To illustrate the operation of the model, a specific study was made of the deployment of fiber optics throughout the telecommunications system.

  6. Forecasting Japanese encephalitis incidence from historical morbidity patterns: Statistical analysis with 27 years of observation in Assam, India.

    PubMed

    Handique, Bijoy K; Khan, Siraj A; Mahanta, J; Sudhakar, S

    2014-09-01

    Japanese encephalitis (JE) is one of the dreaded mosquito-borne viral diseases, mostly prevalent in south Asian countries including India. Early warning of the disease in terms of disease intensity is crucial for taking adequate and appropriate intervention measures. The present study was carried out in Dibrugarh district in the state of Assam, located in the northeastern region of India, to assess the accuracy of selected forecasting methods based on historical morbidity patterns of JE incidence during the past 22 years (1985-2006). Four selected forecasting methods, viz. seasonal average (SA), seasonal adjustment with the last three observations (SAT), a modified method adjusting for long-term and cyclic trend (MSAT), and autoregressive integrated moving average (ARIMA), were employed and the accuracy of each was assessed. The forecasting methods were validated over five consecutive years, 2007-2012. The method utilising seasonal adjustment with long-term and cyclic trend emerged as the best of the four selected forecasting methods and outperformed even the statistically more advanced ARIMA method. The peak of disease incidence could effectively be predicted with all the methods, but there are significant variations in the magnitude of forecast errors among the selected methods. As expected, variation in forecasts at the primary health centre (PHC) level is wide compared to that of district-level forecasts. The study showed that the adopted forecasting techniques could reasonably forecast the intensity of JE cases at PHC level without considering external variables.
    The results indicate that understanding the long-term and cyclic trend of disease intensity will improve the accuracy of the forecasts, but the forecast models need to be made more robust to explain sudden variations in disease intensity, with detailed analysis of parasite and host population dynamics.
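    The simpler of these methods can be sketched directly. The SA rule is as described; the SAT-style adjustment shown here is an illustrative guess at how the last three observations might rescale the seasonal average, not the authors' exact formula:

```python
def seasonal_average(history, month):
    """SA: the forecast for a calendar month is the mean of all past
    values observed in that month. history: list of (year, month, cases)."""
    vals = [c for (_, m, c) in history if m == month]
    return sum(vals) / len(vals)

def seasonal_adjusted(history, month):
    """SAT-style adjustment (illustrative): scale the seasonal average
    by the ratio of the last three observations to their own seasonal
    averages, so recent intensity shifts the forecast up or down."""
    last3 = history[-3:]
    ratio = sum(c for (_, _, c) in last3) / max(
        sum(seasonal_average(history, m) for (_, m, _) in last3), 1e-9)
    return seasonal_average(history, month) * ratio

# two years of a two-month toy series
history = [(2000, 1, 10), (2000, 2, 5), (2001, 1, 20), (2001, 2, 5)]
sa = seasonal_adjusted(history, 1)  # recent months ran above average
```

    In the toy data, month 1 averages 15 cases, and the last three observations sum to 30 against a seasonal expectation of 25, so the adjusted forecast is pulled upward to 18.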

  7. Ocean Heat Content Reveals Secrets of Fish Migrations

    PubMed Central

    Luo, Jiangang; Ault, Jerald S.; Shay, Lynn K.; Hoolihan, John P.; Prince, Eric D.; Brown, Craig A.; Rooker, Jay R.

    2015-01-01

    For centuries, the mechanisms surrounding spatially complex animal migrations have intrigued scientists and the public. We present a new methodology using ocean heat content (OHC), a habitat metric that is normally a fundamental part of hurricane intensity forecasting, to estimate movements and migration of satellite-tagged marine fishes. Previous satellite-tagging research on fishes, which used archival depth, temperature and light data for geolocation, has been too coarse to resolve detailed ocean habitat utilization. We combined tag data with OHC estimated from ocean circulation and transport models in an optimization framework that substantially improved geolocation accuracy over SST-based tracks. The OHC-based movement track provided the first quantitative evidence that many of the tagged highly migratory fishes displayed affinities for ocean fronts and eddies. The OHC method provides a new quantitative tool for studying dynamic use of ocean habitats, migration processes and responses to environmental changes by fishes; it further improves ocean animal tracking and extends satellite-based animal tracking data to other potential physical, ecological, and fisheries applications. PMID:26484541

  8. Wind speed time series reconstruction using a hybrid neural genetic approach

    NASA Astrophysics Data System (ADS)

    Rodriguez, H.; Flores, J. J.; Puig, V.; Morales, L.; Guerra, A.; Calderon, F.

    2017-11-01

    Currently, electric energy is used in practically all modern human activities. Most of the energy produced comes from fossil fuels, causing irreversible damage to the environment. Lately, there has been an effort by nations to produce energy using clean methods, such as solar and wind energy, among others. Wind energy is one of the cleanest alternatives. However, wind speed is not constant, which makes the planning and operation of electric power systems difficult. Knowing in advance the amount of raw material (wind speed) available for energy production allows the energy to be generated by the power plant to be estimated, helping maintenance planning, operational management, and operational cost optimization. For these reasons, forecasting wind speed becomes a necessary task. The forecasting process uses past observations of the variable to forecast (wind speed). To measure wind speed, weather stations use devices called anemometers, but due to poor maintenance, connection errors, or natural wear, these may produce false or missing data. In this work, a hybrid methodology is proposed that uses a compact genetic algorithm with an artificial neural network to reconstruct wind speed time series. The proposed methodology reconstructs the time series using an ANN defined by a compact genetic algorithm.
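    The gap-filling task can be sketched with a deliberately simplified stand-in: an autoregressive model fitted by ordinary least squares plays the role of the paper's ANN, and the fit is direct rather than evolved by a compact genetic algorithm. The data are synthetic, not anemometer records:

```python
import numpy as np

def fill_gaps_ar(series, order=3):
    """Fill NaN gaps in a wind-speed series with an autoregressive model
    fitted by least squares on the fully observed segments (a simplified
    stand-in for the ANN + compact-GA reconstruction)."""
    x = np.asarray(series, dtype=float).copy()
    # Build the lagged design matrix from positions where all lags are observed.
    rows, targets = [], []
    for t in range(order, len(x)):
        window = x[t - order:t + 1]
        if not np.isnan(window).any():
            rows.append(window[:-1])
            targets.append(window[-1])
    coef, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    # Reconstruct missing values sequentially from the preceding samples.
    for t in range(order, len(x)):
        if np.isnan(x[t]):
            x[t] = x[t - order:t] @ coef
    return x

# Hypothetical hourly wind speeds with a 10-hour sensor outage.
t = np.arange(200)
speed = 8 + 2 * np.sin(2 * np.pi * t / 24)
speed[90:100] = np.nan
reconstructed = fill_gaps_ar(speed, order=3)
```

    A neural network trained by an evolutionary search can capture nonlinear structure this linear sketch cannot, which is the motivation for the hybrid approach.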

  9. Forecasting daily patient volumes in the emergency department.

    PubMed

    Jones, Spencer S; Thomas, Alun; Evans, R Scott; Welch, Shari J; Haug, Peter J; Snow, Gregory L

    2008-02-01

    Shifts in the supply of and demand for emergency department (ED) resources make the efficient allocation of ED resources increasingly important. Forecasting is a vital activity that guides decision-making in many areas of economic, industrial, and scientific planning, but has gained little traction in the health care industry. There are few studies that explore the use of forecasting methods to predict patient volumes in the ED. The goals of this study are to explore and evaluate the use of several statistical forecasting methods to predict daily ED patient volumes at three diverse hospital EDs and to compare the accuracy of these methods to the accuracy of a previously proposed forecasting method. Daily patient arrivals at three hospital EDs were collected for the period January 1, 2005, through March 31, 2007. The authors evaluated the use of seasonal autoregressive integrated moving average, time series regression, exponential smoothing, and artificial neural network models to forecast daily patient volumes at each facility. Forecasts were made for horizons ranging from 1 to 30 days in advance. The forecast accuracy achieved by the various forecasting methods was compared to the forecast accuracy achieved when using a benchmark forecasting method already available in the emergency medicine literature. All time series methods considered in this analysis provided improved in-sample model goodness of fit. However, post-sample analysis revealed that time series regression models that augment linear regression models by accounting for serial autocorrelation offered only small improvements in terms of post-sample forecast accuracy, relative to multiple linear regression models, while seasonal autoregressive integrated moving average, exponential smoothing, and artificial neural network forecasting models did not provide consistently accurate forecasts of daily ED volumes. 
This study confirms the widely held belief that daily demand for ED services is characterized by seasonal and weekly patterns. The authors compared several time series forecasting methods to a benchmark multiple linear regression model. The results suggest that the existing methodology proposed in the literature, multiple linear regression based on calendar variables, is a reasonable approach to forecasting daily patient volumes in the ED. However, the authors conclude that regression-based models that incorporate calendar variables, account for site-specific special-day effects, and allow for residual autocorrelation provide a more appropriate, informative, and consistently accurate approach to forecasting daily ED patient volumes.
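    The benchmark approach the authors endorse, linear regression on calendar variables, can be sketched as follows; the volumes are simulated, and the model omits the special-day and residual-autocorrelation terms a production version would include:

```python
import numpy as np

def fit_calendar_model(volumes, start_weekday=0):
    """Fit daily ED volumes with day-of-week effects: a minimal version of
    the multiple-linear-regression benchmark on calendar variables."""
    n = len(volumes)
    weekdays = (np.arange(n) + start_weekday) % 7
    X = np.zeros((n, 7))
    X[np.arange(n), weekdays] = 1.0          # one dummy per weekday
    beta, *_ = np.linalg.lstsq(X, np.asarray(volumes, float), rcond=None)
    return beta                              # estimated mean volume per weekday

# Hypothetical daily arrivals: busier Mondays, quieter weekends.
rng = np.random.default_rng(1)
true_means = np.array([120, 100, 95, 95, 100, 85, 80], dtype=float)
volumes = np.concatenate([true_means + rng.normal(0, 5, 7) for _ in range(52)])
beta = fit_calendar_model(volumes)
```

    Forecasting a future day then amounts to looking up its weekday coefficient, optionally adjusted for site-specific special days and modeled residual autocorrelation.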

  10. Non-linear forecasting in high-frequency financial time series

    NASA Astrophysics Data System (ADS)

    Strozzi, F.; Zaldívar, J. M.

    2005-08-01

    A new methodology based on state space reconstruction techniques has been developed for trading in financial markets. The methodology has been tested using 18 high-frequency foreign exchange time series. The results are in apparent contradiction with the efficient market hypothesis which states that no profitable information about future movements can be obtained by studying the past prices series. In our (off-line) analysis positive gain may be obtained in all those series. The trading methodology is quite general and may be adapted to other financial time series. Finally, the steps for its on-line application are discussed.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This report documents the objectives and the conceptual and methodological approach used in the development of the Coal Production Submodule (CPS). It provides a description of the CPS for model analysts and the public. The Coal Market Module provides annual forecasts of prices, production, and consumption of coal.

  12. Experimental droughts with rainout shelters: A methodological review

    USDA-ARS?s Scientific Manuscript database

    Forecast increases in the frequency, intensity and duration of droughts with climate change may have extreme and extensive ecological consequences. There are currently hundreds of published, ongoing and new drought experiments worldwide aimed to assess ecosystem sensitivities to drought and identify...

  13. First Coast Guard district traffic model report

    DOT National Transportation Integrated Search

    1997-11-01

    The purpose of this report was to describe the methodology used in developing the First Coast Guard District (CGD1) Traffic Model and to document the potential National Distress System (NDS) voice and data traffic forecasted for the year 2001. The ND...

  14. The Devil is in the Concepts: Lessons Learned from World War II Planning Staffs for Transitioning from Conceptual to Detailed Planning

    DTIC Science & Technology

    2017-05-25

    the planning process. Current US Army doctrine links conceptual planning to the Army Design Methodology and detailed planning to the Military...Decision Making Process. By associating conceptual and detailed planning with doctrinal methodologies, it is easy to regard the transition as a set period...plans into detailed directives resulting in changes to the operational environment. 15. SUBJECT TERMS Design; Army Design Methodology; Conceptual

  15. Advances in Landslide Hazard Forecasting: Evaluation of Global and Regional Modeling Approach

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia B.; Adler, Robert; Hone, Yang; Kumar, Sujay; Peters-Lidard, Christa; Lerner-Lam, Arthur

    2010-01-01

    A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that exhibit a high potential for landslide activity by combining a calculation of landslide susceptibility with satellite-derived rainfall estimates. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale landslide forecasting efforts, it requires several modifications before it can be fully realized as an operational tool. The evaluation finds that landslide forecasting may be more feasible at a regional scale. This study draws upon a prior work's recommendations to develop a new approach for considering landslide susceptibility and forecasting at the regional scale. This case study uses a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America: Guatemala, Honduras, El Salvador and Nicaragua. A regional susceptibility map is calculated from satellite and surface datasets using a statistical methodology. The susceptibility map is tested with a regional rainfall intensity-duration triggering relationship and results are compared to the global algorithm framework for the Hurricane Mitch event. The statistical results suggest that this regional investigation provides one plausible way to approach some of the data and resolution issues identified in the global assessment, providing more realistic landslide forecasts for this case study. Evaluation of landslide hazards for this extreme event helps to identify several potential improvements of the algorithm framework, but also highlights several remaining challenges for the algorithm assessment, transferability and performance accuracy. Evaluation challenges include representation errors from comparing susceptibility maps of different spatial resolutions, biases in event-based landslide inventory data, and limited nonlandslide event data for more comprehensive evaluation. 
Additional factors that may improve algorithm performance accuracy include incorporating additional triggering factors such as tectonic activity, anthropogenic impacts and soil moisture into the algorithm calculation. Despite these limitations, the methodology presented in this regional evaluation is both straightforward to calculate and easy to interpret, making results transferable between regions and allowing findings to be placed within an inter-comparison framework. The regional algorithm scenario represents an important step in advancing regional and global-scale landslide hazard assessment and forecasting.

  16. Distortion Representation of Forecast Errors for Model Skill Assessment and Objective Analysis

    NASA Technical Reports Server (NTRS)

    Hoffman, Ross N.; Nehrkorn, Thomas; Grassotti, Christopher

    1996-01-01

    We study a novel characterization of errors for numerical weather predictions. In its simplest form we decompose the error into a part attributable to phase errors and a remainder. The phase error is represented in the same fashion as a velocity field and will be required to vary slowly and smoothly with position. A general distortion representation allows for the displacement and a bias correction of forecast anomalies. In brief, the distortion is determined by minimizing the objective function by varying the displacement and bias correction fields. In the present project we use a global or hemispheric domain, and spherical harmonics to represent these fields. In this project we are initially focusing on the assessment application, restricted to a realistic but univariate 2-dimensional situation. Specifically we study the forecast errors of the 500 hPa geopotential height field for forecasts of the short and medium range. The forecasts are those of the Goddard Earth Observing System data assimilation system. Results presented show that the methodology works, that a large part of the total error may be explained by a distortion limited to triangular truncation at wavenumber 10, and that the remaining residual error contains mostly small spatial scales.

  17. Forecasting Epidemics Through Nonparametric Estimation of Time-Dependent Transmission Rates Using the SEIR Model.

    PubMed

    Smirnova, Alexandra; deCamp, Linda; Chowell, Gerardo

    2017-05-02

    Deterministic and stochastic methods relying on early case incidence data for forecasting epidemic outbreaks have received increasing attention during the last few years. In mathematical terms, epidemic forecasting is an ill-posed problem due to instability of parameter identification and limited available data. While previous studies have largely estimated the time-dependent transmission rate by assuming specific functional forms (e.g., exponential decay) that depend on a few parameters, here we introduce a novel approach for the reconstruction of nonparametric time-dependent transmission rates by projecting onto a finite subspace spanned by Legendre polynomials. This approach enables us to forecast future incidence cases effectively, a clear advantage over recovering the transmission rate at finitely many grid points within the interval where the data are currently available. In our approach, we compare three regularization algorithms: variational (Tikhonov's) regularization, truncated singular value decomposition (TSVD), and modified TSVD in order to determine the stabilizing strategy that is most effective in terms of reliability of forecasting from limited data. We illustrate our methodology using simulated data as well as case incidence data for various epidemics including the 1918 influenza pandemic in San Francisco and the 2014-2015 Ebola epidemic in West Africa.
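    The core representation, a transmission rate expanded in a finite Legendre basis, can be sketched as follows. The coefficients here are illustrative, not estimates from the paper:

```python
import numpy as np
from numpy.polynomial import legendre

def transmission_rate(coeffs, t, t_max):
    """Evaluate a nonparametric transmission rate beta(t) represented by
    coefficients in the Legendre basis on [0, t_max]: the time axis is
    mapped to [-1, 1], where the Legendre polynomials are orthogonal."""
    s = 2.0 * np.asarray(t, float) / t_max - 1.0   # map [0, t_max] -> [-1, 1]
    return legendre.legval(s, coeffs)

# A smooth, slowly decaying rate built from the first three Legendre modes
# (hypothetical coefficients for illustration).
coeffs = np.array([0.4, -0.15, 0.05])
t = np.linspace(0, 60, 61)
beta = transmission_rate(coeffs, t, t_max=60)
```

    In the full method these coefficients would be recovered from incidence data via a regularized inversion (Tikhonov, TSVD, or modified TSVD), then plugged into the SEIR equations to project cases forward.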

  18. 2017 One‐year seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Shumway, Allison; McNamara, Daniel E.; Williams, Robert; Llenos, Andrea L.; Ellsworth, William L.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2017-01-01

    We produce a one‐year 2017 seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes that updates the 2016 one‐year forecast; this map is intended to provide information to the public and to facilitate the development of induced seismicity forecasting models, methods, and data. The 2017 hazard model applies the same methodology and input logic tree as the 2016 forecast, but with an updated earthquake catalog. We also evaluate the 2016 seismic‐hazard forecast to improve future assessments. The 2016 forecast indicated high seismic hazard (greater than 1% probability of potentially damaging ground shaking in one year) in five focus areas: Oklahoma–Kansas, the Raton basin (Colorado/New Mexico border), north Texas, north Arkansas, and the New Madrid Seismic Zone. During 2016, several damaging induced earthquakes occurred in Oklahoma within the highest hazard region of the 2016 forecast; all of the 21 moment magnitude (M) ≥4 and 3 M≥5 earthquakes occurred within the highest hazard area in the 2016 forecast. Outside the Oklahoma–Kansas focus area, two earthquakes with M≥4 occurred near Trinidad, Colorado (in the Raton basin focus area), but no earthquakes with M≥2.7 were observed in the north Texas or north Arkansas focus areas. Several observations of damaging ground‐shaking levels were also recorded in the highest hazard region of Oklahoma. The 2017 forecasted seismic rates are lower in regions of induced activity due to lower rates of earthquakes in 2016 compared with 2015, which may be related to decreased wastewater injection caused by regulatory actions or by a decrease in unconventional oil and gas production. Nevertheless, the 2017 forecasted hazard is still significantly elevated in Oklahoma compared to the hazard calculated from seismicity before 2009.

  19. Monitoring and seasonal forecasting of meteorological droughts

    NASA Astrophysics Data System (ADS)

    Dutra, Emanuel; Pozzi, Will; Wetterhall, Fredrik; Di Giuseppe, Francesca; Magnusson, Linus; Naumann, Gustavo; Barbosa, Paulo; Vogt, Jurgen; Pappenberger, Florian

    2015-04-01

    Near-real time drought monitoring can provide decision makers valuable information for use in several areas, such as water resources management, or international aid. Unfortunately, a major constraint in current drought outlooks is the lack of reliable monitoring capability for observed precipitation globally in near-real time. Furthermore, drought monitoring systems require a long record of past observations to provide mean climatological conditions. We address these constraints by developing a novel drought monitoring approach in which monthly mean precipitation is derived from short-range ECMWF probabilistic forecasts and then merged with the long-term precipitation climatology of the Global Precipitation Climatology Centre (GPCC) dataset. Merging the two makes available a real-time global precipitation product from which the Standardized Precipitation Index (SPI) can be estimated and used for global or regional drought monitoring work. This approach provides stability in that it bypasses problems of latency (lags) in having local rain-gauge measurements available in real time, or lags in satellite precipitation products. Seasonal drought forecasts can also be prepared using the same methodology, based upon two data sources used to provide initial conditions (GPCC and the ECMWF ERA-Interim reanalysis, ERAI) combined with either the current ECMWF seasonal forecast or a climatology based upon ensemble forecasts. Verification of the forecasts as a function of lead time revealed a reduced impact on skill for: (i) long lead times using different initial conditions, and (ii) short lead times using different precipitation forecasts. The memory effect of initial conditions was found to be 1 month lead time for the SPI-3, 3 to 4 months for the SPI-6 and 5 months for the SPI-12. Results show that dynamical forecasts of precipitation provide added value, with skill similar to or better than climatological forecasts. 
In some cases, particularly for long SPI time scales, it is very difficult to improve on the use of climatological forecasts. However, results presented regionally and globally pinpoint several regions in the world where drought onset forecasting is feasible and skilful.
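    A distribution-free sketch of the SPI calculation at the heart of this monitoring approach is shown below. The operational SPI typically fits a gamma distribution to the accumulation record rather than using empirical ranks, and the rainfall totals here are invented:

```python
from statistics import NormalDist

def spi_empirical(precip_totals):
    """Empirical Standardized Precipitation Index: rank each accumulation
    against the record and map its plotting-position probability through
    the inverse standard normal (a distribution-free simplification of
    the usual gamma-based SPI fit)."""
    n = len(precip_totals)
    order = sorted(range(n), key=lambda i: precip_totals[i])
    nd = NormalDist()
    spi = [0.0] * n
    for rank, i in enumerate(order):
        p = (rank + 0.5) / n          # Hazen plotting position in (0, 1)
        spi[i] = nd.inv_cdf(p)
    return spi

# Hypothetical 3-month rainfall totals (mm) for one calendar season.
totals = [210, 180, 95, 240, 160, 130, 60, 200, 175, 150]
spi = spi_empirical(totals)       # driest year maps to the lowest SPI
```

    SPI-3, SPI-6, and SPI-12 differ only in the accumulation window fed into this standardization, which is why their memory of initial conditions grows with the time scale.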

  20. A multi-source data assimilation framework for flood forecasting: Accounting for runoff routing lags

    NASA Astrophysics Data System (ADS)

    Meng, S.; Xie, X.

    2015-12-01

    In flood forecasting practice, model performance is usually degraded by various sources of uncertainty, including uncertainties from input data, model parameters, model structures, and output observations. Data assimilation is a useful methodology to reduce uncertainties in flood forecasting. For short-term flood forecasting, an accurate estimate of the initial soil moisture condition improves forecasting performance, and the time delay of runoff routing is another important factor affecting it. Moreover, observations of hydrological variables (including ground observations and satellite observations) are becoming easily available, so the reliability of short-term flood forecasting can be improved by assimilating multi-source data. The objective of this study is to develop a multi-source data assimilation framework for real-time flood forecasting. In this framework, the first step assimilates up-layer soil moisture observations to update the model state and generated runoff using the ensemble Kalman filter (EnKF), and the second step assimilates discharge observations to update the model state and runoff within a fixed time window using the ensemble Kalman smoother (EnKS). The smoothing technique is adopted to account for the runoff routing lag. Assimilating soil moisture and discharge observations in this framework is expected to improve flood forecasting. To distinguish the effectiveness of this dual-step assimilation framework, we designed a dual-EnKF algorithm in which the observed soil moisture and discharge are assimilated separately without accounting for the runoff routing lag. The results show that the multi-source data assimilation framework can effectively improve flood forecasting, especially when the runoff routing has a distinct time lag. 
Thus, this new data assimilation framework holds great potential for operational flood forecasting by merging observations from ground measurements and remote sensing retrievals.
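    The first, EnKF-based step can be sketched generically as a single analysis update. The two-variable state, the observation operator, and all numbers below are illustrative, not taken from the study:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_std, H):
    """One ensemble Kalman filter analysis step: nudge each state member
    toward a perturbed observation using sample covariances.
    `ensemble` is (n_members, n_state); H maps state to observation space."""
    rng = np.random.default_rng(42)
    X = np.asarray(ensemble, float)
    n, _ = X.shape
    Y = X @ H.T                                   # predicted observations
    X_mean, Y_mean = X.mean(0), Y.mean(0)
    Pxy = (X - X_mean).T @ (Y - Y_mean) / (n - 1)
    Pyy = (Y - Y_mean).T @ (Y - Y_mean) / (n - 1) + obs_std**2 * np.eye(len(obs))
    K = Pxy @ np.linalg.inv(Pyy)                  # Kalman gain
    obs_pert = obs + rng.normal(0, obs_std, size=(n, len(obs)))
    return X + (obs_pert - Y) @ K.T

# Toy state [soil moisture, runoff]; only soil moisture is observed.
rng = np.random.default_rng(0)
ens = rng.normal([0.30, 5.0], [0.05, 1.0], size=(100, 2))
H = np.array([[1.0, 0.0]])
analysis = enkf_update(ens, obs=np.array([0.38]), obs_std=0.01, H=H)
```

    The second step replaces this filter update with a smoother that also adjusts states inside a past time window, which is what lets the framework account for the runoff routing lag.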

  1. Ensemble Flow Forecasts for Risk Based Reservoir Operations of Lake Mendocino in Mendocino County, California: A Framework for Objectively Leveraging Weather and Climate Forecasts in a Decision Support Environment

    NASA Astrophysics Data System (ADS)

    Delaney, C.; Hartman, R. K.; Mendoza, J.; Whitin, B.

    2017-12-01

    Forecast informed reservoir operations (FIRO) is a methodology that incorporates short to mid-range precipitation and flow forecasts to inform the flood operations of reservoirs. The Ensemble Forecast Operations (EFO) alternative is a probabilistic approach of FIRO that incorporates ensemble streamflow predictions (ESPs) made by NOAA's California-Nevada River Forecast Center (CNRFC). With the EFO approach, release decisions are made to manage forecasted risk of reaching critical operational thresholds. A water management model was developed for Lake Mendocino, a 111,000 acre-foot reservoir located near Ukiah, California, to evaluate the viability of the EFO alternative to improve water supply reliability but not increase downstream flood risk. Lake Mendocino is a dual use reservoir, which is owned and operated for flood control by the United States Army Corps of Engineers and is operated for water supply by the Sonoma County Water Agency. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has suffered from water supply reliability issues since 2007. The EFO alternative was simulated using a 26-year (1985-2010) ESP hindcast generated by the CNRFC. The ESP hindcast was developed using Global Ensemble Forecast System version 10 precipitation reforecasts processed with the Hydrologic Ensemble Forecast System to generate daily reforecasts of 61 flow ensemble members for a 15-day forecast horizon. Model simulation results demonstrate that the EFO alternative may improve water supply reliability for Lake Mendocino yet not increase flood risk for downstream areas. The developed operations framework can directly leverage improved skill in the second week of the forecast and is extendable into the S2S time domain given the demonstration of improved skill through a reliable reforecast of adequate historical duration and consistent with operationally available numerical weather predictions.
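    The risk-based release logic of the EFO alternative can be sketched schematically: compute, across ensemble members, the forecasted probability of exceeding a critical storage threshold, and release pre-emptively when it exceeds a tolerance. The thresholds, storage figures, and synthetic inflow ensemble below are invented for illustration:

```python
import numpy as np

def release_decision(storage_af, ensemble_inflows_af, flood_pool_af,
                     risk_tolerance=0.10):
    """Schematic EFO-style risk check: the fraction of ensemble members
    whose projected storage exceeds the flood pool is the forecasted risk;
    if it exceeds the tolerance, release enough that the median member
    stays within the pool."""
    projected = storage_af + np.asarray(ensemble_inflows_af, float).sum(axis=1)
    risk = float(np.mean(projected > flood_pool_af))
    if risk > risk_tolerance:
        release = max(0.0, float(np.median(projected)) - flood_pool_af)
    else:
        release = 0.0
    return risk, release

# 61 hypothetical 15-day inflow traces (acre-feet per day), mimicking the
# 61-member, 15-day ensemble structure described above.
rng = np.random.default_rng(3)
inflows = rng.gamma(shape=2.0, scale=300.0, size=(61, 15))
risk, release = release_decision(
    storage_af=105_000, ensemble_inflows_af=inflows, flood_pool_af=111_000)
```

    The operational rules are of course more involved (downstream flow constraints, ramping limits, multi-day release schedules), but the core idea is managing forecasted risk rather than reacting to observed inflow alone.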

  2. Real-time forecasting of an epidemic using a discrete time stochastic model: a case study of pandemic influenza (H1N1-2009).

    PubMed

    Nishiura, Hiroshi

    2011-02-16

    Real-time forecasting of epidemics, especially those based on a likelihood-based approach, is understudied. This study aimed to develop a simple method that can be used for the real-time epidemic forecasting. A discrete time stochastic model, accounting for demographic stochasticity and conditional measurement, was developed and applied as a case study to the weekly incidence of pandemic influenza (H1N1-2009) in Japan. By imposing a branching process approximation and by assuming the linear growth of cases within each reporting interval, the epidemic curve is predicted using only two parameters. The uncertainty bounds of the forecasts are computed using chains of conditional offspring distributions. The quality of the forecasts made before the epidemic peak appears largely to depend on obtaining valid parameter estimates. The forecasts of both weekly incidence and final epidemic size greatly improved at and after the epidemic peak with all the observed data points falling within the uncertainty bounds. Real-time forecasting using the discrete time stochastic model with its simple computation of the uncertainty bounds was successful. Because of the simplistic model structure, the proposed model has the potential to additionally account for various types of heterogeneity, time-dependent transmission dynamics and epidemiological details. The impact of such complexities on forecasting should be explored when the data become available as part of the disease surveillance.
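    The two-parameter branching-process projection can be sketched as follows, with Poisson offspring supplying stochastic uncertainty bounds; the counts and growth rate are illustrative, not estimates from the H1N1-2009 data:

```python
import numpy as np

def forecast_incidence(last_count, growth_rate, horizon, n_paths=2000, seed=7):
    """Project weekly incidence forward with a Poisson branching process:
    each week's cases produce Poisson(growth_rate) offspring on average.
    Percentiles across simulated paths give the uncertainty bounds."""
    rng = np.random.default_rng(seed)
    paths = np.empty((n_paths, horizon), dtype=np.int64)
    current = np.full(n_paths, last_count, dtype=np.int64)
    for week in range(horizon):
        # The sum of Poisson offspring over `current` cases is
        # Poisson(growth_rate * current).
        current = rng.poisson(growth_rate * current)
        paths[:, week] = current
    median = np.median(paths, axis=0)
    lower, upper = np.percentile(paths, [2.5, 97.5], axis=0)
    return median, lower, upper

median, lower, upper = forecast_incidence(last_count=120, growth_rate=1.3,
                                          horizon=4)
```

    In the study's framework the growth parameter would itself be estimated in real time from the observed epidemic curve, so forecast quality hinges on how well it is identified before the peak.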

  3. An Enhanced Convective Forecast (ECF) for the New York TRACON Area

    NASA Technical Reports Server (NTRS)

    Wheeler, Mark; Stobie, James; Gillen, Robert; Jedlovec, Gary; Sims, Danny

    2008-01-01

    In an effort to relieve summer-time congestion in the NY Terminal Radar Approach Control (TRACON) area, the FAA is testing an enhanced convective forecast (ECF) product. The test began in June 2008 and is scheduled to run through early September. The ECF is updated every two hours, right before the Air Traffic Control System Command Center (ATCSCC) national planning telcon. It is intended to be used by traffic managers throughout the National Airspace System (NAS) and airline dispatchers to supplement information from the Collaborative Convective Forecast Product (CCFP) and the Corridor Integrated Weather System (CIWS). The ECF begins where the current CIWS forecast ends at 2 hours and extends out to 12 hours. Unlike the CCFP, it is a detailed deterministic forecast with no areal coverage limits. It is created by an ENSCO forecaster using a variety of guidance products, including the Weather Research and Forecasting (WRF) model. This is the same version of the WRF that ENSCO runs over the Florida peninsula in support of launch operations at the Kennedy Space Center. For this project, the WRF model domain has been shifted to the Northeastern US. Several products from the NASA SPoRT group are also used by the ENSCO forecaster. In this paper we will provide examples of the ECF products and discuss individual cases of traffic management actions using ECF guidance.

  4. Forecast Inaccuracies in Power Plant Projects From Project Managers' Perspectives

    NASA Astrophysics Data System (ADS)

    Sanabria, Orlando

    Guided by organizational theory, this phenomenological study explored the factors affecting forecast preparation and inaccuracies during the construction of fossil fuel-fired power plants in the United States. Forecast inaccuracies can create financial stress and uncertain profits during the project construction phase. A combination of purposeful and snowball sampling supported the selection of participants. Twenty project managers with over 15 years of experience in power generation and project experience across the United States were interviewed within a 2-month period. From the inductive codification and descriptive analysis, 5 themes emerged: (a) project monitoring, (b) cost control, (c) management review frequency, (d) factors to achieve a precise forecast, and (e) factors causing forecast inaccuracies. The findings of the study showed that the factors necessary to achieve a precise forecast include a detailed project schedule, accurate labor cost estimates, monthly project reviews and risk assessments, and proper utilization of accounting systems to monitor costs. The primary factors reported as causing forecast inaccuracies were cost overruns by subcontractors, scope gaps, labor cost and availability of labor, and equipment and material cost. Results of this study could improve planning accuracy and the effective use of resources during construction of power plants. The study results could contribute to social change by providing a framework for project managers to lessen forecast inaccuracies, and promote construction of power plants that will generate employment opportunities and economic development.

  5. Adapting National Water Model Forecast Data to Local Hyper-Resolution H&H Models During Hurricane Irma

    NASA Astrophysics Data System (ADS)

    Singhofen, P.

    2017-12-01

    The National Water Model (NWM) is a remarkable undertaking. The foundation of the NWM is a 1 square kilometer grid which is used for near real-time modeling and flood forecasting of most rivers and streams in the contiguous United States. However, the NWM falls short in highly urbanized areas with complex drainage infrastructure. To overcome these shortcomings, the presenter proposes to leverage existing local hyper-resolution H&H models and adapt the NWM forcing data to them. Gridded near real-time rainfall, short range forecasts (18-hour) and medium range forecasts (10-day) during Hurricane Irma are applied to numerous detailed H&H models in highly urbanized areas of the State of Florida. Coastal and inland models are evaluated. Comparisons of near real-time rainfall data are made with observed gaged data and the ability to predict flooding in advance based on forecast data is evaluated. Preliminary findings indicate that the near real-time rainfall data is consistently and significantly lower than observed data. The forecast data is more promising. For example, the medium range forecast data provides 2 - 3 days advanced notice of peak flood conditions to a reasonable level of accuracy in most cases relative to both timing and magnitude. Short range forecast data provides about 12 - 14 hours advanced notice. Since these are hyper-resolution models, flood forecasts can be made at the street level, providing emergency response teams with valuable information for coordinating and dispatching limited resources.

  6. Training Guide for Severe Weather Forecasters

    DTIC Science & Technology

    1979-11-01

    that worked very well for the example forecast is used to show the importance of parameter intensities and the actual thought processes that go into the...simplify the explanation of the complete level analysis. This entire process will be repeated for the 700 mb and 500 mb levels. Details in Figures 1 through...parameters of moderate to strong intensity must occur, in the same place at the same time. A description of what constitutes a weak, moderate, or strong

  7. Air Traffic Forecasting at the Port Authority of New York and New Jersey

    NASA Technical Reports Server (NTRS)

    Augustine, J. G.

    1972-01-01

    Procedures for conducting air traffic forecasts with specific application to the Port Authority of New York and New Jersey are discussed. The procedure relates air travel growth to detailed socio-economic and demographic characteristics of the U.S. population rather than to aggregate economic data such as Gross National Product, personal income, and industrial production. Charts are presented to show the relationship between various selected characteristics and the use of air transportation facilities.

  8. Comparison of Observation Impacts in Two Forecast Systems using Adjoint Methods

    NASA Technical Reports Server (NTRS)

    Gelaro, Ronald; Langland, Rolf; Todling, Ricardo

    2009-01-01

    An experiment is being conducted to compare directly the impact of all assimilated observations on short-range forecast errors in different operational forecast systems. We use the adjoint-based method developed by Langland and Baker (2004), which allows these impacts to be efficiently calculated. This presentation describes preliminary results for a "baseline" set of observations, including both satellite radiances and conventional observations, used by the Navy/NOGAPS and NASA/GEOS-5 forecast systems for the month of January 2007. In each system, about 65% of the total reduction in 24-h forecast error is provided by satellite observations, although the impact of rawinsonde, aircraft, land, and ship-based observations remains significant. Only a small majority (50-55%) of all observations assimilated improve the forecast, while the rest degrade it. It is found that most of the total forecast error reduction comes from observations with moderate-size innovations providing small to moderate impacts, not from outliers with very large positive or negative innovations. In a global context, the relative impacts of the major observation types are fairly similar in each system, although regional differences in observation impact can be significant. Of particular interest is the fact that while satellite radiances have a large positive impact overall, they degrade the forecast in certain locations common to both systems, especially over land and ice surfaces. Ongoing comparisons of this type, with results expected from other operational centers, should lead to more robust conclusions about the impacts of the various components of the observing system as well as about the strengths and weaknesses of the methodologies used to assimilate them.
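    The adjoint-based impact measure of Langland and Baker can be sketched as follows (schematic notation: the error-weighting matrix C and gain matrix K are the symbols commonly used for this method, not details taken from this abstract):

```latex
% Quadratic forecast error norm (C: energy weighting, x^t: verifying analysis):
%   e(x^f) = (x^f - x^t)^T C (x^f - x^t)
% Approximate impact of the innovations y - H(x_b) on forecast error,
% obtained with the adjoint (transpose) of the gain matrix K:
\delta e = e\left(x^f_a\right) - e\left(x^f_b\right)
         \approx \left\langle\, y - H(x_b),\;
           K^{T}\left( \frac{\partial e_a}{\partial x^f_a}
                     + \frac{\partial e_b}{\partial x^f_b} \right) \right\rangle
```

    A negative contribution to δe means an observation reduced the 24-h forecast error; summing contributions per observation type yields aggregate figures like the 65% quoted above.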

  9. Potential for malaria seasonal forecasting in Africa

    NASA Astrophysics Data System (ADS)

    Tompkins, Adrian; Di Giuseppe, Francesca; Colon-Gonzalez, Felipe; Namanya, Didas; Friday, Agabe

    2014-05-01

    As monthly and seasonal dynamical prediction systems have improved their skill in the tropics over recent years, there is now the potential to use these forecasts to drive dynamical malaria modelling systems to provide early warnings in epidemic and meso-endemic regions. We outline a new pilot operational system that has been developed at ECMWF and ICTP. It uses a precipitation bias correction methodology to seamlessly join the monthly ensemble prediction system (EPS) and seasonal (System 4) forecast systems of ECMWF. The resulting temperature and rainfall forecasts for Africa are then used to drive the recently developed ICTP malaria model known as VECTRI. The resulting coupled system of ECMWF climate forecasts and VECTRI thus produces predictions of malaria prevalence rates and transmission intensity across Africa. The forecasts are filtered to highlight the regions and months in which the system has particular value due to high year-to-year variability. In addition to epidemic areas, these also include meso- and hyper-endemic regions which undergo considerable variability in the onset months. We demonstrate the limits of the forecast skill as a function of lead time, showing that for many areas the dynamical system can add one to two months of additional warning time to a system based on environmental monitoring. We then evaluate the past forecasts against district-level case data in Uganda and show that when interventions can be discounted, the system can show significant skill at predicting interannual variability in transmission intensity up to 3 or 4 months ahead at the district scale. The prospects for an operational implementation will be briefly discussed.
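    A precipitation bias correction of this kind is often implemented as quantile mapping; the sketch below (synthetic gamma-distributed rainfall with invented parameters, not the ECMWF configuration) shows how two forecast systems with different biases can be mapped onto a common observed climatology so that their streams can be joined:

```python
import numpy as np

def quantile_map(fcst, fcst_clim, obs_clim):
    """Map forecast values onto the observed climatology: replace each
    value by the observation with the same climatological quantile."""
    q = np.searchsorted(np.sort(fcst_clim), fcst) / len(fcst_clim)
    return np.quantile(np.sort(obs_clim), np.clip(q, 0.0, 1.0))

rng = np.random.default_rng(4)
obs_clim = rng.gamma(2.0, 5.0, 5000)   # observed daily rainfall climatology
eps_clim = rng.gamma(2.0, 7.0, 5000)   # monthly-system hindcasts (wet bias)
sea_clim = rng.gamma(2.0, 3.5, 5000)   # seasonal-system hindcasts (dry bias)

eps_fcst = rng.gamma(2.0, 7.0, 300)    # new forecasts from each system
sea_fcst = rng.gamma(2.0, 3.5, 300)

# After correction both systems share the observed climatology, so the
# two forecast streams can be joined without a jump at the seam
eps_corr = quantile_map(eps_fcst, eps_clim, obs_clim)
sea_corr = quantile_map(sea_fcst, sea_clim, obs_clim)
print(round(eps_corr.mean(), 1), round(sea_corr.mean(), 1))
```

    The same mapping applied to both systems removes their relative bias, which is what allows a "seamless" monthly-to-seasonal forecast to drive a single impact model.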

  10. EpiCaster: An Integrated Web Application For Situation Assessment and Forecasting of Global Epidemics

    PubMed Central

    Deodhar, Suruchi; Bisset, Keith; Chen, Jiangzhuo; Barrett, Chris; Wilson, Mandy; Marathe, Madhav

    2016-01-01

    Public health decision makers need access to high resolution situation assessment tools for understanding the extent of various epidemics in different regions of the world. In addition, they need insights into the future course of epidemics by way of forecasts. Such forecasts are essential for planning the allocation of limited resources and for implementing several policy-level and behavioral intervention strategies. The need for such forecasting systems became evident in the wake of the recent Ebola outbreak in West Africa. We have developed EpiCaster, an integrated Web application for situation assessment and forecasting of various epidemics, such as Flu and Ebola, that are prevalent in different regions of the world. Using EpiCaster, users can assess the magnitude and severity of different epidemics at highly resolved spatio-temporal levels. EpiCaster provides time-varying heat maps and graphical plots to view trends in the disease dynamics. EpiCaster also allows users to visualize data gathered through surveillance mechanisms, such as Google Flu Trends (GFT) and the World Health Organization (WHO). The forecasts provided by EpiCaster are generated using different epidemiological models, and the users can select the models through the interface to filter the corresponding forecasts. EpiCaster also allows the users to study epidemic propagation in the presence of a number of intervention strategies specific to certain diseases. Here we describe the modeling techniques, methodologies and computational infrastructure that EpiCaster relies on to support large-scale predictive analytics for situation assessment and forecasting of global epidemics. PMID:27796009

  11. Methodology for Designing Operational Banking Risks Monitoring System

    NASA Astrophysics Data System (ADS)

    Kostjunina, T. N.

    2018-05-01

    The research looks at principles of designing an information system for monitoring operational banking risks. A proposed design methodology enables one to automate processes of collecting data on information security incidents in the banking network, serving as the basis for an integrated approach to the creation of an operational risk management system. The system can operate remotely ensuring tracking and forecasting of various operational events in the bank network. A structure of a content management system is described.

  12. Physician supply forecast: better than peering in a crystal ball?

    PubMed Central

    Roberfroid, Dominique; Leonard, Christian; Stordeur, Sabine

    2009-01-01

    Background Anticipating physician supply to tackle future health challenges is a crucial but complex task for policy planners. A number of forecasting tools are available, but the methods, advantages and shortcomings of such tools are not straightforward and not always well appraised. Therefore this paper had two objectives: to present a typology of existing forecasting approaches and to analyse the methodology-related issues. Methods A literature review was carried out in electronic databases Medline-Ovid, Embase and ERIC. Concrete examples of planning experiences in various countries were analysed. Results Four main forecasting approaches were identified. The supply projection approach defines the necessary inflow to maintain or to reach in the future an arbitrary predefined level of service offer. The demand-based approach estimates the quantity of health care services used by the population in the future to project physician requirements. The needs-based approach involves defining and predicting health care deficits so that they can be addressed by an adequate workforce. Benchmarking health systems with similar populations and health profiles is the last approach. These different methods can be combined to perform a gap analysis. The methodological challenges of such projections are numerous: most often static models are used and their uncertainty is not assessed; valid and comprehensive data to feed into the models are often lacking; and a rapidly evolving environment affects the likelihood of projection scenarios. As a result, the internal and external validity of the projections included in our review appeared limited. Conclusion There is no single accepted approach to forecasting physician requirements. The value of projections lies in their utility in identifying the current and emerging trends to which policy-makers need to respond. 
A genuine gap analysis, an effective monitoring of key parameters and comprehensive workforce planning are key elements to improving the usefulness of physician supply projections. PMID:19216772

  13. Convective Weather Forecast Accuracy Analysis at Center and Sector Levels

    NASA Technical Reports Server (NTRS)

    Wang, Yao; Sridhar, Banavar

    2010-01-01

    This paper presents a detailed convective forecast accuracy analysis at the center and sector levels. The study is aimed at providing more meaningful forecast verification measures to the aviation community, as well as obtaining useful information leading to improvements in the weather translation capacity models. In general, the vast majority of forecast verification efforts over past decades have been on the calculation of traditional standard verification measure scores over forecast and observation data analyses onto grids. These verification measures, based on binary classification, have been applied in quality assurance of weather forecast products at the national level for many years. Our research focuses on forecasts at the center and sector levels. We calculate the standard forecast verification measure scores for en-route air traffic centers and sectors first, followed by conducting the forecast validation analysis and related verification measures for weather intensities and locations at the center and sector levels. An approach to improve the prediction of sector weather coverage by multiple sector forecasts is then developed. The severe weather intensity assessment was carried out by using the correlations between forecast and actual weather observation airspace coverage. The weather forecast accuracy on horizontal location was assessed by examining the forecast errors. The improvement in prediction of weather coverage was determined by the correlation between actual sector weather coverage and prediction. The analysis uses observed and forecasted Convective Weather Avoidance Model (CWAM) data collected from June to September 2007. CWAM zero-minute forecast data with aircraft avoidance probabilities of 60% and 80% are used as the actual weather observation. All forecast measurements are based on 30-minute, 60-minute, 90-minute, and 120-minute forecasts with the same avoidance probabilities.
The forecast accuracy analysis for lead times under one hour showed that the errors in intensity and location for center forecasts are relatively low. For example, 1-hour forecast intensity and horizontal location errors for the ZDC center were about 0.12 and 0.13. However, the correlation between sector 1-hour forecasts and actual weather coverage was weak: for sector ZDC32, about 32% of the total variation of observed weather intensity was left unexplained by the forecast, and the sector horizontal location error was about 0.10. The paper also introduces an approach to estimate the sector three-dimensional actual weather coverage by using multiple sector forecasts, which turned out to produce better predictions. Using a Multiple Linear Regression (MLR) model for this approach, the correlations between actual observations and the multiple sector forecast model prediction improved by several percent at the 95% confidence level in comparison with single sector forecasts.
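    The multiple-sector idea can be illustrated with ordinary least squares: regress a sector's observed weather coverage on the forecasts of several sectors rather than on its own forecast alone (synthetic data; the coefficients and sector layout are invented for illustration, not taken from the CWAM study):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy data: forecast weather coverage for a target sector and two
# neighbouring sectors (columns of F), plus the observed coverage
n = 500
F = rng.uniform(0, 1, size=(n, 3))          # [own, neighbor1, neighbor2]
obs = 0.6 * F[:, 0] + 0.25 * F[:, 1] + 0.1 * F[:, 2] + rng.normal(0, 0.05, n)

# Single-sector baseline: correlate observation with the own forecast only
r_single = np.corrcoef(F[:, 0], obs)[0, 1]

# Multiple linear regression over all three sector forecasts
X = np.column_stack([np.ones(n), F])        # add intercept column
beta, *_ = np.linalg.lstsq(X, obs, rcond=None)
pred = X @ beta
r_multi = np.corrcoef(pred, obs)[0, 1]
print(round(r_single, 3), round(r_multi, 3))
```

    Because convective weather drifts across sector boundaries, neighbouring forecasts carry information about the target sector, which is why the multi-sector correlation exceeds the single-sector one.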

  14. Aging in America in the Twenty-first Century: Demographic Forecasts from the MacArthur Foundation Research Network on an Aging Society

    PubMed Central

    Olshansky, S Jay; Goldman, Dana P; Zheng, Yuhui; Rowe, John W

    2009-01-01

    Context: The aging of the baby boom generation, the extension of life, and progressive increases in disability-free life expectancy have generated a dramatic demographic transition in the United States. Official government forecasts may, however, have inadvertently underestimated life expectancy, which would have major policy implications, since small differences in forecasts of life expectancy produce very large differences in the number of people surviving to an older age. This article presents a new set of population and life expectancy forecasts for the United States, focusing on transitions that will take place by midcentury. Methods: Forecasts were made with a cohort-components methodology, based on the premise that the risk of death will be influenced in the coming decades by accelerated advances in biomedical technology that either delay the onset and age progression of major fatal diseases or that slow the aging process itself. Findings: Results indicate that the current forecasts of the U.S. Social Security Administration and U.S. Census Bureau may underestimate the rise in life expectancy at birth for men and women combined, by 2050, from 3.1 to 7.9 years. Conclusions: The cumulative outlays for Medicare and Social Security could be higher by $3.2 to $8.3 trillion relative to current government forecasts. This article discusses the implications of these results regarding the benefits and costs of an aging society and the prospect that health disparities could attenuate some of these changes. PMID:20021588
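    The cohort-components logic can be sketched in a few lines: survive a cohort through age-specific death rates, and let an assumed annual rate of mortality improvement shift those rates for a future year (all numbers below are illustrative, not the article's calibration):

```python
import numpy as np

# Toy cohort-components sketch: Gompertz-like age-specific death rates
# (illustrative parameters) and an assumed annual mortality improvement
ages = np.arange(0, 111)
m0 = 0.00005 * np.exp(0.095 * ages)          # baseline death rates by age
improvement = 0.015                           # assumed 1.5%/yr decline

def life_expectancy(m):
    """Rough period life expectancy at birth: sum of survivorship."""
    survival = np.cumprod(1 - np.clip(m, 0, 1))
    return survival.sum()

e0_now = life_expectancy(m0)
e0_2050 = life_expectancy(m0 * (1 - improvement) ** 40)  # rates 40 yrs out
print(round(e0_now, 1), round(e0_2050, 1))
```

    The article's point follows directly from this structure: a modest change in the assumed improvement rate compounds over decades, so small differences in forecast life expectancy translate into very large differences in survivors at old ages.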

  15. VMT Mix Modeling for Mobile Source Emissions Forecasting: Formulation and Empirical Application

    DOT National Transportation Integrated Search

    2000-05-01

    The purpose of the current report is to propose and implement a methodology for obtaining improved link-specific vehicle miles of travel (VMT) mix values compared to those obtained from existent methods. Specifically, the research is developing a fra...

  16. A methodology for incorporating fuel price impacts into short-term transit ridership forecasts.

    DOT National Transportation Integrated Search

    2009-08-01

    Anticipating changes to public transportation ridership demand is important to planning for and meeting : service goals and maintaining system viability. These changes may occur in the short- or long-term; : extensive academic work has focused on bet...

  17. Methodology to estimate particulate matter emissions from certified commercial aircraft engines.

    DOT National Transportation Integrated Search

    2009-01-01

    Today, about one-fourth of U.S. commercial service airports, : including 41 of the busiest 50, are either in nonattainment : or maintenance areas per the National Ambient : Air Quality Standards. U.S. aviation activity is forecasted : to triple by 20...

  18. DEVELOPMENT AND EVALUATION OF PM 2.5 SOURCE APPORTIONMENT METHODOLOGIES

    EPA Science Inventory

    The receptor model called Positive Matrix Factorization (PMF) has been extensively used to apportion sources of ambient fine particulate matter (PM2.5), but the accuracy of source apportionment results currently remains unknown. In addition, air quality forecast model...

  19. Projecting long term medical spending growth.

    PubMed

    Borger, Christine; Rutherford, Thomas F; Won, Gregory Y

    2008-01-01

    We present a dynamic general equilibrium model of the U.S. economy and the medical sector in which the adoption of new medical treatments is endogenous and the demand for medical services is conditional on the state of technology. We use this model to prepare 75-year medical spending forecasts and a projection of the Medicare actuarial balance, and we compare our results to those obtained from a method that has been used by government actuaries. Our baseline forecast predicts slower health spending growth in the long run and a lower Medicare actuarial deficit relative to the previous projection methodology.

  20. Assessing and forecasting population health: integrating knowledge and beliefs in a comprehensive framework.

    PubMed

    Van Meijgaard, Jeroen; Fielding, Jonathan E; Kominski, Gerald F

    2009-01-01

    A comprehensive population health-forecasting model has the potential to interject new and valuable information about the future health status of the population based on current conditions, socioeconomic and demographic trends, and potential changes in policies and programs. Our Health Forecasting Model uses a continuous-time microsimulation framework to simulate individuals' lifetime histories by using birth, risk exposures, disease incidence, and death rates to mark changes in the state of the individual. The model generates a reference forecast of future health in California, including details on physical activity, obesity, coronary heart disease, all-cause mortality, and medical expenditures. We use the model to answer specific research questions, inform debate on important policy issues in public health, support community advocacy, and provide analysis on the long-term impact of proposed changes in policies and programs, thus informing stakeholders at all levels and supporting decisions that can improve the health of populations.

  1. Using NCAR Yellowstone for PhotoVoltaic Power Forecasts with Artificial Neural Networks and an Analog Ensemble

    NASA Astrophysics Data System (ADS)

    Cervone, G.; Clemente-Harding, L.; Alessandrini, S.; Delle Monache, L.

    2016-12-01

    A methodology based on Artificial Neural Networks (ANN) and an Analog Ensemble (AnEn) is presented to generate 72-hour deterministic and probabilistic forecasts of power generated by photovoltaic (PV) power plants, using input from a numerical weather prediction model and computed astronomical variables. ANN and AnEn are used individually and in combination to generate forecasts for three solar power plants located in Italy. The computational scalability of the proposed solution is tested using synthetic data simulating 4,450 PV power stations. The NCAR Yellowstone supercomputer is employed to test the parallel implementation of the proposed solution, ranging from 1 node (32 cores) to 4,450 nodes (141,140 cores). Results show that a combined AnEn + ANN solution yields the best results, and that the proposed solution is well suited for massive-scale computation.
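    The analog-ensemble step can be sketched as follows: for each new NWP forecast, find the k most similar historical forecasts and use their matching power observations as ensemble members (synthetic data; the similarity metric and predictor set are simplifications, not the paper's configuration):

```python
import numpy as np

def analog_ensemble(hist_fcst, hist_obs, new_fcst, k=20):
    """For each new forecast, find the k most similar historical
    forecasts (Euclidean distance over predictor variables) and
    return the matching observed power values as ensemble members."""
    members = np.empty((len(new_fcst), k))
    for i, f in enumerate(new_fcst):
        d = np.linalg.norm(hist_fcst - f, axis=1)   # similarity to history
        idx = np.argsort(d)[:k]                     # k best analogs
        members[i] = hist_obs[idx]
    return members

# Toy example: 200 historical (forecast, observation) pairs
rng = np.random.default_rng(0)
hist_fcst = rng.uniform(0, 1, size=(200, 3))        # 3 NWP predictors
hist_obs = hist_fcst.sum(axis=1) / 3                # stand-in PV power
ens = analog_ensemble(hist_fcst, hist_obs, hist_fcst[:5], k=10)
det = ens.mean(axis=1)                              # deterministic forecast
print(det.shape)  # → (5,)
```

    The member spread gives the probabilistic forecast; the embarrassingly parallel per-station search is what makes the method scale well on a machine like Yellowstone.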

  2. MASTER: a model to improve and standardize clinical breakpoints for antimicrobial susceptibility testing using forecast probabilities.

    PubMed

    Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael

    2017-09-01

    The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of < 0.1% applying current EUCAST CBPs. Error rates were > 0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to < 0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability.
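    The core error-rate calculation can be sketched by simulation: draw true diameters from a two-component normal mixture, add methodological noise, and count isolates classified on the wrong side of the breakpoint, with and without a ZMU (all distribution parameters below are invented for illustration, not the paper's fitted values):

```python
import numpy as np

rng = np.random.default_rng(2)

# True inhibition-zone diameters: resistant and susceptible populations
# modeled as a two-component normal mixture (illustrative means/sds)
n = 100_000
true_d = np.where(rng.random(n) < 0.4,
                  rng.normal(12, 1.5, n),    # resistant population
                  rng.normal(24, 2.0, n))    # susceptible population

meas_sd = 1.2                                # methodological variation
obs_d = true_d + rng.normal(0, meas_sd, n)   # observed diameters

cbp = 18.0                                   # clinical breakpoint (mm)
true_sus = true_d >= cbp
obs_sus = obs_d >= cbp
err_rate = np.mean(true_sus != obs_sus)      # categorization error rate

# A zone of methodological uncertainty (ZMU) around the CBP: diameters
# inside it are reported as uninterpretable rather than classified
zmu = (obs_d > cbp - 2) & (obs_d < cbp + 2)
err_rate_zmu = np.mean((true_sus != obs_sus) & ~zmu)
print(err_rate, err_rate_zmu)
```

    Widening the ZMU trades a lower error rate against more unreportable isolates, which is exactly the trade-off that made the amoxicillin/clavulanic acid ZMU (41% of isolates) impractical.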

  3. Operational hydrological forecasting in Bavaria. Part II: Ensemble forecasting

    NASA Astrophysics Data System (ADS)

    Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.

    2009-04-01

    In part I of this study, the operational flood forecasting system in Bavaria and an approach to identify and quantify forecast uncertainty were introduced. The approach is split into the calculation of an empirical 'overall error' from archived forecasts and the calculation of an empirical 'model error' based on hydrometeorological forecast tests, where rainfall observations were used instead of forecasts. The 'model error' can be superimposed on the spread of a hydrometeorological ensemble forecast, especially in upstream catchments where forecast uncertainty depends strongly on the current predictability of the atmosphere. In Bavaria, two meteorological ensemble prediction systems are currently tested for operational use: the 16-member COSMO-LEPS forecast and a poor man's ensemble composed of DWD GME, DWD Cosmo-EU, NCEP GFS, Aladin-Austria and MeteoSwiss Cosmo-7. The determination of the overall forecast uncertainty depends on the catchment characteristics: 1. Upstream catchments with high influence of the weather forecast: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. b) Corresponding to the characteristics of the meteorological ensemble forecast, each resulting forecast hydrograph can be regarded as equally likely. c) The 'model error' distribution, with parameters dependent on the hydrological case and lead time, is added to each forecast timestep of each ensemble member. d) For each forecast timestep, the overall error distribution (i.e. over the 'model error' distributions of all ensemble members) is calculated. e) From this distribution, the uncertainty range at a desired level (here: the 10% and 90% percentiles) is extracted and drawn as a forecast envelope. f) As the mean or median of an ensemble forecast does not necessarily exhibit a meteorologically sound temporal evolution, a single hydrological forecast termed 'lead forecast' is chosen and shown in addition to the uncertainty bounds. 
This can be either an intermediate forecast between the extremes of the ensemble spread or a manually selected forecast based on a meteorologist's advice. 2. Downstream catchments with low influence of the weather forecast: In downstream catchments with strong human impact on discharge (e.g. by reservoir operation) and large influence of upstream gauge observation quality on forecast quality, the 'overall error' may in most cases be larger than the combination of the 'model error' and an ensemble spread. Therefore, the overall forecast uncertainty bounds are calculated differently: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. Here, additionally, the corresponding inflow hydrographs from all upstream catchments must be used. b) As for an upstream catchment, the uncertainty range is determined by combination of the 'model error' and the ensemble member forecasts. c) In addition, the 'overall error' is superimposed on the 'lead forecast'. For reasons of consistency, the lead forecast must be based on the same meteorological forecast in the downstream and all upstream catchments. d) From the resulting two uncertainty ranges (one from the ensemble forecast and 'model error', one from the 'lead forecast' and 'overall error'), the envelope is taken as the most prudent uncertainty range. In sum, the uncertainty associated with each forecast run is calculated and communicated to the public in the form of 10% and 90% percentiles. As in part I of this study, the methodology as well as the usefulness (or uselessness) of the resulting uncertainty ranges will be presented and discussed with typical examples.
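    The upstream-catchment steps c)–e) can be sketched as follows: superimpose a lead-time-dependent 'model error' distribution on every ensemble member, pool the samples per timestep, and extract the 10% and 90% percentiles as the forecast envelope (synthetic hydrographs and error parameters, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: 16 ensemble discharge forecasts over 48 hourly lead times
n_members, n_steps = 16, 48
ens = 100 + 20 * rng.standard_normal((n_members, n_steps)).cumsum(axis=1) * 0.1

# Assumed 'model error' std dev, growing with lead time
model_err_std = 5 + 0.5 * np.arange(n_steps)

# Superimpose the model-error distribution on every member by sampling
n_samples = 200
samples = (ens[:, None, :] +
           model_err_std * rng.standard_normal((n_members, n_samples, n_steps)))
pooled = samples.reshape(-1, n_steps)    # overall error distribution per step

lo = np.percentile(pooled, 10, axis=0)   # 10% forecast envelope
hi = np.percentile(pooled, 90, axis=0)   # 90% forecast envelope
print(lo.shape)
```

    Because both the ensemble spread and the model-error standard deviation grow with lead time, the envelope widens toward the end of the forecast horizon, matching the behaviour communicated to the public.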

  4. Using High Resolution Model Data to Improve Lightning Forecasts across Southern California

    NASA Astrophysics Data System (ADS)

    Capps, S. B.; Rolinski, T.

    2014-12-01

    Dry lightning often results in a significant number of fire starts in areas where the vegetation is dry and continuous. Meteorologists from the USDA Forest Service Predictive Services' program in Riverside, California are tasked to provide southern and central California's fire agencies with fire potential outlooks. Logistic regression equations were developed by these meteorologists several years ago, which forecast probabilities of lightning as well as lightning amounts, out to seven days across southern California. These regression equations were developed using ten years of historical gridded data from the Global Forecast System (GFS) model on a coarse scale (0.5 degree resolution), correlated with historical lightning strike data. These equations do a reasonably good job of capturing a lightning episode (3-5 consecutive days or greater of lightning), but perform poorly regarding more detailed information such as exact location and amounts. It is postulated that the inadequacies in resolving the finer details of episodic lightning events are due to the coarse resolution of the GFS data, along with limited predictors. Stability parameters, such as the Lifted Index (LI), the Total Totals index (TT), and Convective Available Potential Energy (CAPE), along with Precipitable Water (PW), are the only parameters being considered as predictors. It is hypothesized that the statistical forecasts will benefit from higher resolution data both in training and in implementing the statistical model. We have dynamically downscaled NCEP FNL (Final) reanalysis data using the Weather Research and Forecasting model (WRF) to 3 km spatial and hourly temporal resolution across a decade. This dataset will be used to evaluate the contribution of additional predictors at higher vertical, spatial, and temporal resolution to the success of the statistical model. 
If successful, we will implement an operational dynamically downscaled GFS forecast product to generate predictors for the resulting statistical lightning model. This data will help fire agencies be better prepared to pre-deploy resources in advance of these events. Specific information regarding duration, amount, and location will be especially valuable.
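    The statistical model described above is a logistic regression of lightning occurrence on stability and moisture predictors; a minimal sketch (synthetic data; the coefficients are invented, not the operational fit):

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy predictors at grid points: CAPE (J/kg), lifted index, precipitable
# water (mm); the "true" coefficients below are purely illustrative
n = 2000
cape = rng.uniform(0, 3000, n)
li = rng.uniform(-8, 4, n)
pw = rng.uniform(5, 45, n)

logit = -4.0 + 0.0012 * cape - 0.25 * li + 0.05 * pw
p_true = 1 / (1 + np.exp(-logit))
y = (rng.random(n) < p_true).astype(float)   # lightning occurrence (0/1)

# Fit logistic regression by gradient descent on standardized predictors
X = np.column_stack([np.ones(n), cape, li, pw])
X[:, 1:] = (X[:, 1:] - X[:, 1:].mean(0)) / X[:, 1:].std(0)
w = np.zeros(4)
for _ in range(3000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / n             # negative log-likelihood gradient

p_hat = 1 / (1 + np.exp(-X @ w))
print(round(p_hat.mean(), 2), round(y.mean(), 2))
```

    Higher-resolution WRF output would enter this framework simply as additional (or better-resolved) predictor columns, which is the improvement the abstract proposes to test.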

  5. Workstation-Based Real-Time Mesoscale Modeling Designed for Weather Support to Operations at the Kennedy Space Center and Cape Canaveral Air Station

    NASA Technical Reports Server (NTRS)

    Manobianco, John; Zack, John W.; Taylor, Gregory E.

    1996-01-01

    This paper describes the capabilities and operational utility of a version of the Mesoscale Atmospheric Simulation System (MASS) that has been developed to support operational weather forecasting at the Kennedy Space Center (KSC) and Cape Canaveral Air Station (CCAS). The implementation of local, mesoscale modeling systems at KSC/CCAS is designed to provide detailed short-range (less than 24 h) forecasts of winds, clouds, and hazardous weather such as thunderstorms. Short-range forecasting is a challenge for daily operations and manned and unmanned launches, since KSC/CCAS is located in central Florida, where the weather during the warm season is dominated by mesoscale circulations like the sea breeze. For this application, MASS has been modified to run on a Stardent 3000 workstation. Workstation-based, real-time numerical modeling requires a compromise between the requirement to run the system fast enough that the output can be used before it expires and the desire to improve the simulations by increasing resolution and using more detailed physical parameterizations. It is now feasible to run high-resolution mesoscale models such as MASS on local workstations to provide timely forecasts at a fraction of the cost required to run these models on mainframe supercomputers. MASS has been running in the Applied Meteorology Unit (AMU) at KSC/CCAS since January 1994 for the purpose of system evaluation. In March 1995, the AMU began sending real-time MASS output to the forecasters and meteorologists at CCAS, the Spaceflight Meteorology Group (Johnson Space Center, Houston, Texas), and the National Weather Service (Melbourne, Florida). However, MASS is not yet an operational system. The final decision whether to transition MASS to operational use will depend on a combination of forecaster feedback, the AMU's final evaluation results, and the life-cycle costs of the operational system.

  6. Economic analysis for transmission operation and planning

    NASA Astrophysics Data System (ADS)

    Zhou, Qun

    2011-12-01

    Restructuring of the electric power industry has caused dramatic changes in the use of the transmission system. Increasing congestion as well as the necessity of integrating renewable energy introduce new challenges and uncertainties to transmission operation and planning. Accurate short-term congestion forecasting facilitates market traders in bidding and trading activities. The cost sharing and recovery issue is a major impediment to long-term transmission investment to integrate renewable energy. In this research, a new short-term forecasting algorithm is proposed for predicting congestion, LMPs, and other power system variables based on the concept of system patterns. The advantage of this algorithm relative to standard statistical forecasting methods is that structural aspects underlying power market operations are exploited to reduce the forecasting error. The advantage relative to previously proposed structural forecasting methods is that data requirements are substantially reduced. Forecasting results based on a NYISO case study demonstrate the feasibility and accuracy of the proposed algorithm. Moreover, a negotiation methodology is developed to guide transmission investment for integrating renewable energy. Built on Nash bargaining theory, the negotiation of investment plans and payment rates can proceed between renewable generation and transmission companies for cost sharing and recovery. The proposed approach is applied to Garver's six-bus system. The numerical results demonstrate the fairness and efficiency of the approach, which can hence serve as a guideline for renewable energy investors. The results also shed light on policy-making for renewable energy subsidies.
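    The Nash bargaining step can be illustrated directly: choose the split of the joint surplus that maximizes the product of both parties' gains over their disagreement payoffs (toy numbers, not from the study):

```python
import numpy as np

# Nash bargaining over the split of a joint surplus: a renewable
# generator and a transmission company share the net benefit of a
# new line (all dollar figures are assumptions for illustration)
surplus = 10.0             # total net benefit of the line, $M/yr
d_gen, d_tx = 1.0, 2.0     # disagreement payoffs if no line is built

shares = np.linspace(0, 1, 10001)
u_gen = d_gen + shares * surplus           # generator payoff per split
u_tx = d_tx + (1 - shares) * surplus       # transmission company payoff
product = (u_gen - d_gen) * (u_tx - d_tx)  # Nash product of the gains
best = shares[np.argmax(product)]
print(best)  # equal split of the *gains*, regardless of d_gen, d_tx
```

    In the paper's richer setting the payoffs depend on investment plans and payment rates rather than a fixed surplus, but the same Nash-product maximization picks the agreement point.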

  7. The value of improved wind power forecasting: Grid flexibility quantification, ramp capability analysis, and impacts of electricity market operation timescales

    DOE PAGES

    Wang, Qin; Wu, Hongyu; Florita, Anthony R.; ...

    2016-11-11

    The value of improving wind power forecasting accuracy at different electricity market operation timescales was analyzed by simulating the IEEE 118-bus test system as modified to emulate the generation mixes of the Midcontinent, California, and New England independent system operator balancing authority areas. The wind power forecasting improvement methodology and error analysis for the data set were elaborated. Production cost simulation was conducted on the three emulated systems with a total of 480 scenarios, considering the impacts of different generation technologies, wind penetration levels, and wind power forecasting improvement timescales. The static operational flexibility of the three systems was compared through the diversity of generation mix, the percentage of must-run baseload generators, as well as the available ramp rate and the minimum generation levels. The dynamic operational flexibility was evaluated by the real-time upward and downward ramp capacity. Simulation results show that the generation resource mix plays a crucial role in evaluating the value of improved wind power forecasting at different timescales. In addition, the changes in annual operational electricity generation costs were mostly influenced by the dominant resource in the system. Lastly, the impacts of pumped-storage resources, generation ramp rates, and system minimum generation level requirements on the value of improved wind power forecasting were also analyzed.

  9. Unorganized machines for seasonal streamflow series forecasting.

    PubMed

    Siqueira, Hugo; Boccato, Levy; Attux, Romis; Lyra, Christiano

    2014-05-01

    Modern unorganized machines--extreme learning machines and echo state networks--provide an elegant balance between processing capability and mathematical simplicity, circumventing the difficulties associated with the conventional training approaches of feedforward/recurrent neural networks (FNNs/RNNs). This work performs a detailed investigation of the applicability of unorganized architectures to the problem of seasonal streamflow series forecasting, considering scenarios associated with four Brazilian hydroelectric plants and four distinct prediction horizons. Experimental results indicate the pertinence of these models to the focused task.
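    As a concrete illustration of the extreme learning machine idea mentioned above, the sketch below fits only a linear output layer on top of a fixed random hidden layer. The toy seasonal series, lag structure, and ridge regularization are illustrative assumptions, not the authors' experimental setup.

```python
# Minimal extreme learning machine (ELM) for one-step-ahead forecasting.
# Toy seasonal series and all hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "seasonal streamflow": a period-12 cycle plus noise.
t = np.arange(300)
series = 10 + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, t.size)

# Lagged design matrix: predict x[t] from the previous 12 values.
lags = 12
X = np.column_stack([series[i:i - lags] for i in range(lags)])
y = series[lags:]
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize inputs

# ELM: input weights are random and never trained; only the output
# layer is fit, here by ridge regression in closed form.
n_hidden = 50
W = rng.normal(size=(lags, n_hidden)) / np.sqrt(lags)
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)
beta = np.linalg.solve(H.T @ H + 1e-3 * np.eye(n_hidden), H.T @ y)

pred = H @ beta
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
```

    The appeal noted in the abstract is visible here: training reduces to one linear solve, avoiding the iterative backpropagation of conventional FNN/RNN training.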

  10. Preliminary Cost Benefit Assessment of Systems for Detection of Hazardous Weather. Volume I,

    DTIC Science & Technology

    1981-07-01

    not be sufficient for adequate stream flow forecasting, it has important potential for real-time flash flood warning. This was illustrated by the 1977...provide a finer spatial resolution of the gridded data. See Table 9. The results of a demonstration of the real-time capabilities of a radar-man system ...detailed real-time measurement capabilities and scope for quantitative forecasting is most likely to provide the degree of lead time required if maximum

  11. 7 CFR 1710.300 - General.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the forecast, including the methodology used to project loads, rates, revenue, power costs, operating expenses, plant additions, and other factors having a material effect on the balance sheet and on financial... regional office will consult with the Power Supply Division in the case of generation projects for...

  12. Research needs for developing a commodity-driven freight modeling approach.

    DOT National Transportation Integrated Search

    2003-01-01

    It is well known that better freight forecasting models and data are needed, but the literature does not clearly indicate which components of the modeling methodology are most in need of improvement, which is a critical need in an era of limited rese...

  13. Commercial Demand Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Commercial Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.

  14. Developing a Simulated-Person Methodology Workshop: An Experiential Education Initiative for Educators and Simulators

    ERIC Educational Resources Information Center

    Peisachovich, Eva Hava; Nelles, L. J.; Johnson, Samantha; Nicholson, Laura; Gal, Raya; Kerr, Barbara; Celia, Popovic; Epstein, Iris; Da Silva, Celina

    2017-01-01

    Numerous forecasts suggest that professional-competence development depends on human encounters. Interaction between organizations, tasks, and individual providers influence human behaviour, affect organizations' or systems' performance, and are a key component of professional-competence development. Further, insufficient or ineffective…

  15. Day-Ahead Crude Oil Price Forecasting Using a Novel Morphological Component Analysis Based Model

    PubMed Central

    Zhu, Qing; Zou, Yingchao; Lai, Kin Keung

    2014-01-01

    As a typical nonlinear and dynamic system, the crude oil price movement is difficult to predict, and its accurate forecasting remains the subject of intense research activity. Recent empirical evidence suggests that multiscale data characteristics in the price movement are another important stylized fact. Incorporating a mixture of data characteristics in the time scale domain during the modelling process can lead to significant performance improvement. This paper proposes a novel morphological component analysis based hybrid methodology for modeling the multiscale heterogeneous characteristics of the price movement in the crude oil markets. Empirical studies in two representative benchmark crude oil markets reveal the existence of a multiscale heterogeneous microdata structure. The significant performance improvement of the proposed algorithm over benchmark random walk, ARMA, and SVR models is attributed to the innovative methodology's incorporation of this important stylized fact during the modelling process. Meanwhile, work in this paper offers additional insights into the heterogeneous market microstructure with economically viable interpretations. PMID:25061614

  16. Thermal sensation prediction by soft computing methodology.

    PubMed

    Jović, Srđan; Arsić, Nebojša; Vilimonović, Jovana; Petković, Dalibor

    2016-12-01

    Thermal comfort in open urban areas is an important factor from an environmental point of view, so demands for suitable thermal comfort need to be met during urban planning and design. Thermal comfort can be modeled from climatic parameters and other factors, but these factors vary throughout the year and the day; an algorithm is therefore needed to predict thermal comfort from the input variables. The prediction results could be used to plan when urban areas are used. Since this is a highly nonlinear task, this investigation applied a soft computing methodology to predict thermal comfort. The main goal was to apply an extreme learning machine (ELM) to forecast physiological equivalent temperature (PET) values, with temperature, pressure, wind speed, and irradiance used as inputs. The prediction results are compared with several benchmark models; based on the results, ELM can be used effectively for forecasting PET. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Day-ahead crude oil price forecasting using a novel morphological component analysis based model.

    PubMed

    Zhu, Qing; He, Kaijian; Zou, Yingchao; Lai, Kin Keung

    2014-01-01

    As a typical nonlinear and dynamic system, the crude oil price movement is difficult to predict, and its accurate forecasting remains the subject of intense research activity. Recent empirical evidence suggests that multiscale data characteristics in the price movement are another important stylized fact. Incorporating a mixture of data characteristics in the time scale domain during the modelling process can lead to significant performance improvement. This paper proposes a novel morphological component analysis based hybrid methodology for modeling the multiscale heterogeneous characteristics of the price movement in the crude oil markets. Empirical studies in two representative benchmark crude oil markets reveal the existence of a multiscale heterogeneous microdata structure. The significant performance improvement of the proposed algorithm over benchmark random walk, ARMA, and SVR models is attributed to the innovative methodology's incorporation of this important stylized fact during the modelling process. Meanwhile, work in this paper offers additional insights into the heterogeneous market microstructure with economically viable interpretations.

  18. Three-Month Real-Time Dengue Forecast Models: An Early Warning System for Outbreak Alerts and Policy Decision Support in Singapore.

    PubMed

    Shi, Yuan; Liu, Xu; Kok, Suet-Yheng; Rajarethinam, Jayanthi; Liang, Shaohong; Yap, Grace; Chong, Chee-Seng; Lee, Kim-Sung; Tan, Sharon S Y; Chin, Christopher Kuan Yew; Lo, Andrew; Kong, Waiming; Ng, Lee Ching; Cook, Alex R

    2016-09-01

    With its tropical rainforest climate, rapid urbanization, and changing demography and ecology, Singapore experiences endemic dengue; the last large outbreak in 2013 culminated in 22,170 cases. In the absence of a vaccine on the market, vector control is the key approach for prevention. We sought to forecast the evolution of dengue epidemics in Singapore to provide early warning of outbreaks and to facilitate the public health response to moderate an impending outbreak. We developed a set of statistical models using least absolute shrinkage and selection operator (LASSO) methods to forecast the weekly incidence of dengue notifications over a 3-month time horizon. This forecasting tool used a variety of data streams and was updated weekly, including recent case data, meteorological data, vector surveillance data, and population-based national statistics. The forecasting methodology was compared with alternative approaches that have been proposed to model dengue case data (seasonal autoregressive integrated moving average and step-down linear regression) by fielding them on the 2013 dengue epidemic, the largest on record in Singapore. Operationally useful forecasts were obtained at a 3-month lag using the LASSO-derived models. Based on the mean average percentage error, the LASSO approach provided more accurate forecasts than the other methods we assessed. We demonstrate its utility in Singapore's dengue control program by providing a forecast of the 2013 outbreak for advance preparation of outbreak response. Statistical models built using machine learning methods such as LASSO have the potential to markedly improve forecasting techniques for recurrent infectious disease outbreaks such as dengue. Shi Y, Liu X, Kok SY, Rajarethinam J, Liang S, Yap G, Chong CS, Lee KS, Tan SS, Chin CK, Lo A, Kong W, Ng LC, Cook AR. 2016. Three-month real-time dengue forecast models: an early warning system for outbreak alerts and policy decision support in Singapore. 
Environ Health Perspect 124:1369-1375; http://dx.doi.org/10.1289/ehp.1509981.
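    The LASSO machinery behind such models can be sketched in a few lines. The coordinate-descent solver and synthetic predictors below are illustrative assumptions; the operational system combined recent case, meteorological, vector surveillance, and population data streams.

```python
# Hedged sketch of LASSO regression via coordinate descent with
# soft-thresholding; data are synthetic, not Singapore dengue inputs.
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """L1-penalized least squares: min 0.5*||y - Xw||^2 + lam*||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual that excludes feature j's current contribution.
            r = y - X @ w + X[:, j] * w[j]
            rho = X[:, j] @ r
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))          # e.g. 8 candidate lagged predictors
true_w = np.array([1.5, 0.0, 0.0, -2.0, 0.0, 0.0, 0.0, 0.0])
y = X @ true_w + rng.normal(0, 0.1, 200)

w = lasso_cd(X, y, lam=10.0)           # L1 penalty zeroes weak predictors
```

    The L1 penalty drives irrelevant coefficients to (near) zero, which is why LASSO doubles as a variable-selection step when many candidate data streams are available.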

  19. Prototype methodology for obtaining cloud seeding guidance from HRRR model data

    NASA Astrophysics Data System (ADS)

    Dawson, N.; Blestrud, D.; Kunkel, M. L.; Waller, B.; Ceratto, J.

    2017-12-01

    Weather model data, along with real-time observations, are critical for determining whether atmospheric conditions are prime for super-cooled liquid water during cloud seeding operations. Cloud seeding groups can either use operational forecast models or run their own model on a computer cluster. A custom weather model provides the most flexibility but is also expensive. For programs with smaller budgets, openly available operational forecasting models are the de facto method for obtaining forecast data. The new High-Resolution Rapid Refresh (HRRR) model (3 x 3 km grid size), developed by the Earth System Research Laboratory (ESRL), provides hourly model runs with 18 forecast hours per run. While the model cannot be fine-tuned for a specific area or edited to provide cloud-seeding-specific output, its output is openly available on a near-real-time basis. This presentation focuses on a prototype methodology for using HRRR model data to create maps that aid near-real-time cloud seeding decision making. The R programming language is used to run a script on a Windows® desktop/laptop computer, either on a schedule (such as every half hour) or manually. The latest HRRR model run is downloaded from NOAA's Operational Model Archive and Distribution System (NOMADS). A GRIB-filter service, provided by NOMADS, is used to obtain surface and mandatory pressure level data for a subset domain, which greatly cuts down on the amount of data transferred. A set of criteria identified by the Idaho Power Atmospheric Science Group is then used to create guidance maps. These criteria include atmospheric stability (lapse rates), dew point depression, air temperature, and wet bulb temperature. The maps highlight potential areas where super-cooled liquid water may exist, reasons why cloud seeding should not be attempted, and wind speed at flight level.
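    The decision logic described above amounts to intersecting per-gridpoint threshold tests. A minimal sketch, with placeholder thresholds (not the Idaho Power group's actual criteria) and random fields standing in for HRRR output:

```python
# Go/no-go guidance mask from gridded fields. Threshold values are
# placeholders, not the Idaho Power Atmospheric Science Group criteria.
import numpy as np

rng = np.random.default_rng(2)
shape = (40, 40)                                 # toy 2-D model grid
temp_c = rng.uniform(-20, 10, shape)             # air temperature, deg C
dewpoint_dep_c = rng.uniform(0, 15, shape)       # dew point depression, deg C
lapse_rate = rng.uniform(2, 9, shape)            # layer lapse rate, K/km

# Candidate cells for super-cooled liquid water: subfreezing but not
# too cold, near-saturated, and not strongly stable.
mask = (
    (temp_c < 0.0) & (temp_c > -15.0)
    & (dewpoint_dep_c < 3.0)
    & (lapse_rate > 4.0)
)
frac = float(mask.mean())                        # fraction of grid flagged
```

    In the real workflow the boolean mask would be drawn over a map, with the individual failed criteria reported as the reasons seeding is not advised in a given area.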

  20. An Objective Verification of the North American Mesoscale Model for Kennedy Space Center and Cape Canaveral Air Force Station

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III

    2010-01-01

    The 45th Weather Squadron (45 WS) Launch Weather Officers use the 12-km resolution North American Mesoscale (NAM) model (MesoNAM) text and graphical product forecasts extensively to support launch weather operations. However, the actual performance of the model at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) has not been measured objectively. In order to have tangible evidence of model performance, the 45 WS tasked the Applied Meteorology Unit to conduct a detailed statistical analysis of model output compared to observed values. The model products are provided to the 45 WS by ACTA, Inc. and include hourly forecasts from 0 to 84 hours based on model initialization times of 00, 06, 12 and 18 UTC. The objective analysis compared the MesoNAM forecast winds, temperature and dew point, as well as the changes in these parameters over time, to the observed values from the sensors in the KSC/CCAFS wind tower network. Objective statistics will give the forecasters knowledge of the model's strengths and weaknesses, which will result in improved forecasts for operations.
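    An objective comparison of this kind typically reduces to a few summary statistics per parameter and forecast hour. A minimal sketch, with synthetic values standing in for MesoNAM forecasts and tower observations:

```python
# Objective verification of forecasts against observations: bias, MAE,
# RMSE. Synthetic values stand in for MesoNAM output and tower data.
import numpy as np

rng = np.random.default_rng(3)
obs = 20 + 5 * rng.standard_normal(500)              # observed temperatures
fcst = obs + 1.0 + 2.0 * rng.standard_normal(500)    # biased, noisy forecast

err = fcst - obs
bias = float(err.mean())                   # systematic offset
mae = float(np.abs(err).mean())            # average error magnitude
rmse = float(np.sqrt((err ** 2).mean()))   # penalizes large misses more
```

    Stratifying such statistics by initialization time and forecast hour is what turns the raw comparison into usable guidance on the model's strengths and weaknesses.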

  1. Application research for 4D technology in flood forecasting and evaluation

    NASA Astrophysics Data System (ADS)

    Li, Ziwei; Liu, Yutong; Cao, Hongjie

    1998-08-01

    In order to monitor the regions of China where disastrous floods occur frequently, satisfy the strong need of provincial governments for high-accuracy disaster monitoring and assessment data, and improve the efficiency of disaster relief, methods for flood forecasting and evaluation using satellite and aerial remotely sensed imagery together with ground monitoring data were researched under the Ninth Five-Year National Key Technologies Programme. An effective and practicable flood forecasting and evaluation system was established, with DongTing Lake selected as the test site. Modern digital photogrammetry, remote sensing, and GIS technology were used in this system; disastrous floods can be forecasted and losses evaluated based on a '4D' (DEM -- Digital Elevation Model, DOQ -- Digital Orthophoto Quads, DRG -- Digital Raster Graph, DTI -- Digital Thematic Information) disaster background database. The methods for gathering and establishing the '4D' disaster environment background database, the application of '4D' background data to flood forecasting and evaluation, and experimental results for the DongTing Lake test site are described in detail in this paper.

  2. Advancing Data Assimilation in Operational Hydrologic Forecasting: Progresses, Challenges, and Emerging Opportunities

    NASA Technical Reports Server (NTRS)

    Liu, Yuqiong; Weerts, A.; Clark, M.; Hendricks Franssen, H.-J; Kumar, S.; Moradkhani, H.; Seo, D.-J.; Schwanenberg, D.; Smith, P.; van Dijk, A. I. J. M.; hide

    2012-01-01

    Data assimilation (DA) holds considerable potential for improving hydrologic predictions as demonstrated in numerous research studies. However, advances in hydrologic DA research have not been adequately or timely implemented in operational forecast systems to improve the skill of forecasts for better informed real-world decision making. This is due in part to a lack of mechanisms to properly quantify the uncertainty in observations and forecast models in real-time forecasting situations and to conduct the merging of data and models in a way that is adequately efficient and transparent to operational forecasters. The need for effective DA of useful hydrologic data into the forecast process has become increasingly recognized in recent years. This motivated a hydrologic DA workshop in Delft, the Netherlands in November 2010, which focused on advancing DA in operational hydrologic forecasting and water resources management. As an outcome of the workshop, this paper reviews, in relevant detail, the current status of DA applications in both hydrologic research and operational practices, and discusses the existing or potential hurdles and challenges in transitioning hydrologic DA research into cost-effective operational forecasting tools, as well as the potential pathways and newly emerging opportunities for overcoming these challenges. Several related aspects are discussed, including (1) theoretical or mathematical aspects in DA algorithms, (2) the estimation of different types of uncertainty, (3) new observations and their objective use in hydrologic DA, (4) the use of DA for real-time control of water resources systems, and (5) the development of community-based, generic DA tools for hydrologic applications. 
    It is recommended that a cost-effective transition of hydrologic DA from research to operations be supported by developing community-based, generic modeling and DA tools or frameworks, and by fostering collaborative efforts among hydrologic modellers, DA developers, and operational forecasters.

  3. FUSION++: A New Data Assimilative Model for Electron Density Forecasting

    NASA Astrophysics Data System (ADS)

    Bust, G. S.; Comberiate, J.; Paxton, L. J.; Kelly, M.; Datta-Barua, S.

    2014-12-01

    There is a continuing need within the operational space weather community, both civilian and military, for accurate, robust data assimilative specifications and forecasts of the global electron density field, as well as derived RF application product specifications and forecasts obtained from the electron density field. The spatial scales of interest range from a hundred to a few thousand kilometers horizontally (synoptic large scale structuring) and meters to kilometers (small scale structuring that cause scintillations). RF space weather applications affected by electron density variability on these scales include navigation, communication and geo-location of RF frequencies ranging from 100's of Hz to GHz. For many of these applications, the necessary forecast time periods range from nowcasts to 1-3 hours. For more "mission planning" applications, necessary forecast times can range from hours to days. In this paper we present a new ionosphere-thermosphere (IT) specification and forecast model being developed at JHU/APL based upon the well-known data assimilation algorithms Ionospheric Data Assimilation Four Dimensional (IDA4D) and Estimating Model Parameters from Ionospheric Reverse Engineering (EMPIRE). This new forecast model, "Forward Update Simple IONosphere model Plus IDA4D Plus EMPIRE (FUSION++), ingests data from observations related to electron density, winds, electric fields and neutral composition and provides improved specification and forecast of electron density. In addition, the new model provides improved specification of winds, electric fields and composition. We will present a short overview and derivation of the methodology behind FUSION++, some preliminary results using real observational sources, example derived RF application products such as HF bi-static propagation, and initial comparisons with independent data sources for validation.

  4. Intercomparison of air quality data using principal component analysis, and forecasting of PM₁₀ and PM₂.₅ concentrations using artificial neural networks, in Thessaloniki and Helsinki.

    PubMed

    Voukantsis, Dimitris; Karatzas, Kostas; Kukkonen, Jaakko; Räsänen, Teemu; Karppinen, Ari; Kolehmainen, Mikko

    2011-03-01

    In this paper we propose a methodology consisting of specific computational intelligence methods, i.e. principal component analysis and artificial neural networks, in order to inter-compare air quality and meteorological data and to forecast the concentration levels of environmental parameters of interest (air pollutants). We apply these methods to data monitored in the urban areas of Thessaloniki and Helsinki, in Greece and Finland, respectively. For this purpose, we applied the principal component analysis method to inter-compare the patterns of air pollution in the two selected cities. We then proceeded with the development of air quality forecasting models for both studied areas. On this basis, we formulated and employed a novel hybrid scheme in the selection of input variables for the forecasting models, involving a combination of linear regression and artificial neural network (multi-layer perceptron) models. The latter were used for forecasting the daily mean concentrations of PM₁₀ and PM₂.₅ for the next day. Results demonstrated an index of agreement between measured and modelled daily averaged PM₁₀ concentrations of between 0.80 and 0.85, while the kappa index for the forecasting of daily averaged PM₁₀ concentrations reached 60% for both cities. Compared with previous corresponding studies, these statistical parameters indicate improved performance in air quality forecasting. It was also found that the performance of the models for forecasting daily mean PM₁₀ concentrations was not substantially different between the two cities, despite the major differences of the two urban environments under consideration. Copyright © 2011 Elsevier B.V. All rights reserved.
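    The principal component step used for such an inter-comparison can be sketched with a plain SVD on standardized variables; the synthetic data here stand in for the monitored pollutant and meteorological records:

```python
# PCA via SVD on standardized variables, for comparing multivariate
# air quality patterns. Synthetic data stand in for monitoring records.
import numpy as np

rng = np.random.default_rng(4)
n, p = 365, 6                                # days x measured variables
latent = rng.standard_normal((n, 2))         # two underlying drivers
load = rng.standard_normal((2, p))
X = latent @ load + 0.1 * rng.standard_normal((n, p))

Z = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize each variable
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)          # variance fraction per PC
scores = Z @ Vt.T                            # daily component scores
```

    Comparing the loadings (`Vt`) fitted separately to each city's data is the basis of the pattern inter-comparison; the leading component scores can then serve as candidate inputs to the neural forecasting models.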

  5. A retrospective streamflow ensemble forecast for an extreme hydrologic event: a case study of Hurricane Irene on the Hudson River basin

    NASA Astrophysics Data System (ADS)

    Saleh, Firas; Ramaswamy, Venkatsundar; Georgas, Nickitas; Blumberg, Alan F.; Pullen, Julie

    2016-07-01

    This paper investigates the uncertainties in hourly streamflow ensemble forecasts for an extreme hydrological event using a hydrological model forced with short-range ensemble weather prediction models. A state-of-the-art, automated, short-term hydrologic prediction framework was implemented using GIS and a regional scale hydrological model (HEC-HMS). The hydrologic framework was applied to the Hudson River basin (~36,000 km2) in the United States using gridded precipitation data from the National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) and was validated against streamflow observations from the United States Geological Survey (USGS). Finally, 21 precipitation ensemble members of the latest Global Ensemble Forecast System (GEFS/R) were forced into HEC-HMS to generate a retrospective streamflow ensemble forecast for an extreme hydrological event, Hurricane Irene. The work shows that ensemble stream discharge forecasts provide improved predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared with deterministic forecasts. The uncertainties in weather inputs may result in false warnings and missed river flooding events, reducing the potential to effectively mitigate flood damage. The findings demonstrate how errors in the ensemble median streamflow forecast and time of peak, as well as the ensemble spread (uncertainty), are reduced 48 h pre-event by utilizing the ensemble framework. The methodology and implications of this work benefit efforts of short-term streamflow forecasts at regional scales, notably regarding the peak timing of an extreme hydrologic event when combined with a flood threshold exceedance diagram. Although the modeling framework was implemented on the Hudson River basin, it is flexible and applicable in other parts of the world where atmospheric reanalysis products and streamflow data are available.
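    The ensemble summaries discussed above (median trace, spread, peak timing, threshold exceedance) are straightforward to compute once the 21 member hydrographs exist. A sketch with synthetic Gaussian-shaped hydrographs standing in for the GEFS/R-forced HEC-HMS output:

```python
# Summarizing a 21-member streamflow ensemble: median trace, spread,
# peak timing, and threshold-exceedance probability. The Gaussian-shaped
# hydrographs are synthetic stand-ins for model output.
import numpy as np

rng = np.random.default_rng(5)
hours = np.arange(96)
members = []
for _ in range(21):
    peak_t = 48 + rng.normal(0, 6)            # uncertain time of peak
    peak_q = 3000 * rng.lognormal(0, 0.2)     # uncertain peak flow, m3/s
    members.append(peak_q * np.exp(-0.5 * ((hours - peak_t) / 10.0) ** 2))
ens = np.array(members)                        # shape: (members, hours)

median_flow = np.median(ens, axis=0)
spread = np.percentile(ens, 90, axis=0) - np.percentile(ens, 10, axis=0)
peak_hours = hours[np.argmax(ens, axis=1)]     # per-member peak timing
p_exceed = float((ens.max(axis=1) > 2500.0).mean())  # flood-threshold prob.
```

    The exceedance fraction is the quantity a flood threshold exceedance diagram displays over lead time; narrowing `spread` as the event approaches is the behavior the case study documents.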

  6. Real-time forecasting of an epidemic using a discrete time stochastic model: a case study of pandemic influenza (H1N1-2009)

    PubMed Central

    2011-01-01

    Background Real-time forecasting of epidemics, especially forecasting based on a likelihood approach, is understudied. This study aimed to develop a simple method that can be used for real-time epidemic forecasting. Methods A discrete time stochastic model, accounting for demographic stochasticity and conditional measurement, was developed and applied as a case study to the weekly incidence of pandemic influenza (H1N1-2009) in Japan. By imposing a branching process approximation and by assuming the linear growth of cases within each reporting interval, the epidemic curve is predicted using only two parameters. The uncertainty bounds of the forecasts are computed using chains of conditional offspring distributions. Results The quality of the forecasts made before the epidemic peak appears largely to depend on obtaining valid parameter estimates. The forecasts of both weekly incidence and final epidemic size greatly improved at and after the epidemic peak, with all the observed data points falling within the uncertainty bounds. Conclusions Real-time forecasting using the discrete time stochastic model with its simple computation of the uncertainty bounds was successful. Because of the simple model structure, the proposed model has the potential to additionally account for various types of heterogeneity, time-dependent transmission dynamics and epidemiological details. The impact of such complexities on forecasting should be explored when the data become available as part of the disease surveillance. PMID:21324153
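    A stripped-down version of the projection step can be written with Poisson offspring per case; the reproduction number, seed count, and horizon below are illustrative assumptions, and the paper's conditional-measurement machinery is omitted:

```python
# Minimal discrete-time branching-process projection of weekly cases,
# with uncertainty bounds taken from simulated chains. R, the seed
# count, and the horizon are illustrative assumptions.
import math
import random

random.seed(6)

def poisson(lam):
    """Knuth's Poisson sampler, to keep the sketch dependency-free."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= L:
            return k - 1

def project(cases_now, R, weeks, n_sims=2000):
    """Median and central 95% bounds of cases after `weeks` generations."""
    finals = []
    for _ in range(n_sims):
        c = cases_now
        for _ in range(weeks):
            c = sum(poisson(R) for _ in range(c))  # Poisson offspring
        finals.append(c)
    finals.sort()
    return (finals[n_sims // 2],
            finals[int(0.025 * n_sims)],
            finals[int(0.975 * n_sims)])

median, lo, hi = project(cases_now=50, R=1.2, weeks=3)
```

    With 50 current cases and R = 1.2, the chains center near 50 x 1.2^3 ≈ 86 cases three generations out, with the simulated 2.5%/97.5% quantiles playing the role of the paper's uncertainty bounds.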

  7. Foreign currency rate forecasting using neural networks

    NASA Astrophysics Data System (ADS)

    Pandya, Abhijit S.; Kondo, Tadashi; Talati, Amit; Jayadevappa, Suryaprasad

    2000-03-01

    Neural networks are increasingly being used as a forecasting tool in many forecasting problems. This paper discusses the application of neural networks to predicting daily foreign exchange rates between the USD, GBP, and DEM. We approach the problem from a time-series analysis framework, where future exchange rates are forecasted solely from past exchange rates. This relies on the belief that past prices and future prices are closely related and interdependent. We present the results of training a neural network with historical USD-GBP data. The methodology used is explained, as well as the training process. We discuss the selection of inputs to the network and present a comparison of using the actual exchange rates versus the exchange rate differences as inputs. Price and rate differences are the preferred way of training neural networks in financial applications. Results of both approaches are presented together for comparison. We show that the network is able to learn the trends in the exchange rate movements correctly, and we present the results of prediction over several periods of time.
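    The point about preferring rate differences over raw levels as network inputs can be illustrated directly; the random-walk series below is a synthetic stand-in, not the historical USD-GBP data:

```python
# Building lagged-difference inputs for an exchange-rate network.
# The random-walk "rate" is a synthetic stand-in for USD-GBP data.
import numpy as np

rng = np.random.default_rng(8)
rate = 1.5 + np.cumsum(0.002 * rng.standard_normal(1000))  # toy daily rate

diffs = np.diff(rate)                  # increments are roughly stationary
lags = 5
X = np.column_stack([diffs[i:i - lags] for i in range(lags)])  # NN inputs
y = diffs[lags:]                       # target: the next increment

# Raw levels wander with the walk; differences stay centered near zero,
# which keeps network inputs in a stable numeric range for training.
```

    Any regression model, neural or otherwise, can then be fit to `(X, y)`; forecast levels are recovered by cumulatively adding predicted differences back onto the last observed rate.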

  8. Nonlinear Prediction Model for Hydrologic Time Series Based on Wavelet Decomposition

    NASA Astrophysics Data System (ADS)

    Kwon, H.; Khalil, A.; Brown, C.; Lall, U.; Ahn, H.; Moon, Y.

    2005-12-01

    Traditionally, forecasting and characterization of hydrologic systems have been performed using many techniques. Stochastic linear methods such as AR and ARIMA, and nonlinear ones such as tools based on statistical learning theory, have been extensively used. The difficulty common to all methods is determining the information and predictors sufficient and necessary for a successful prediction. Relationships between hydrologic variables are often highly nonlinear and interrelated across temporal scales. A new hybrid approach is proposed for the simulation of hydrologic time series that combines the wavelet transform with a nonlinear model, drawing on the merits of both. The wavelet transform is adopted to decompose a hydrologic nonlinear process into a set of mono-component signals, which are then simulated by the nonlinear model. The hybrid methodology is formulated to improve the accuracy of long-term forecasting. The proposed hybrid model yields much better results in terms of capturing and reproducing the time-frequency properties of the system at hand, and prediction results are promising when compared to traditional univariate time series models. An application demonstrating the plausibility of the proposed methodology is provided, and the results indicate that a wavelet-based time series model can simulate and forecast hydrologic variables reasonably well. This will ultimately serve the purpose of integrated water resources planning and management.
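    The decompose-model-recombine idea can be sketched with a one-level Haar transform and a trivial AR(1) per component; both the synthetic series and the AR(1) component models are simplifying assumptions standing in for the paper's nonlinear models.

```python
# Wavelet-hybrid sketch: one-level Haar decomposition, a simple model
# per component, and recombination. Series and models are illustrative.
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(256)
x = np.sin(2 * np.pi * t / 32) + 0.2 * rng.standard_normal(256)

# One-level Haar transform: orthonormal pairwise averages/differences.
a = (x[0::2] + x[1::2]) / np.sqrt(2)      # approximation (smooth) part
d = (x[0::2] - x[1::2]) / np.sqrt(2)      # detail (rough) part

# Perfect-reconstruction check for the transform.
recon = np.empty_like(x)
recon[0::2] = (a + d) / np.sqrt(2)
recon[1::2] = (a - d) / np.sqrt(2)

def ar1_coef(s):
    """Least-squares AR(1) coefficient of a zero-ish-mean series."""
    return float(np.dot(s[1:], s[:-1]) / np.dot(s[:-1], s[:-1]))

phi_a, phi_d = ar1_coef(a), ar1_coef(d)
# One-step forecast of each component, recombined to the signal domain.
next_a, next_d = phi_a * a[-1], phi_d * d[-1]
x_next_even = (next_a + next_d) / np.sqrt(2)
x_next_odd = (next_a - next_d) / np.sqrt(2)
```

    The smooth component is far more predictable (its AR coefficient is close to 1) than the noisy detail component, which is the leverage the hybrid decomposition approach exploits.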

  9. Proceedings of the Workshop on Transportation/Urban Form Interactions held at Cambridge, MA. on August 14-15, 1978

    DOT National Transportation Integrated Search

    1979-06-01

    Contents: A form of utility function for the UMOT model; An analysis of transportation/land use interactions; Toward a methodology to shape urban structure; Approaches for improving urban travel forecasts; Quasi-dynamic urban location models with end...

  10. Methodologies used to estimate and forecast vehicle miles traveled (VMT) : final report.

    DOT National Transportation Integrated Search

    2016-07-01

    Vehicle miles traveled (VMT) is a measure used in transportation planning for a variety of purposes. It measures the amount of travel for all vehicles in a geographic region over a given period of time, typically a one-year period. VMT is calculated ...

  11. How Many Will Choose? School Choice and Student Enrollment Planning.

    ERIC Educational Resources Information Center

    Chan, Tak C.

    1993-01-01

    Enrollment planning is the basis of all school system planning. Focuses on assessing the impact of a choice plan on student enrollment planning. Issues involved include home schooling, school employees' choice, and private kindergarten programs. Administrators are advised to evaluate existing forecasting methodologies. (MLF)

  12. Mixed Single/Double Precision in OpenIFS: A Detailed Study of Energy Savings, Scaling Effects, Architectural Effects, and Compilation Effects

    NASA Astrophysics Data System (ADS)

    Fagan, Mike; Dueben, Peter; Palem, Krishna; Carver, Glenn; Chantry, Matthew; Palmer, Tim; Schlacter, Jeremy

    2017-04-01

    It has been shown that a mixed precision approach that judiciously replaces double precision with single precision calculations can speed up global simulations. In particular, a mixed precision variation of the Integrated Forecast System (IFS) of the European Centre for Medium-Range Weather Forecasts (ECMWF) showed virtually the same quality of model results as the standard double precision version (Vana et al., Single precision in weather forecasting models: An evaluation with the IFS, Monthly Weather Review, in print). In this study, we perform detailed measurements of savings in computing time and energy using a mixed precision variation of the OpenIFS model, analogous to the IFS variation used in Vana et al. We (1) present energy measurements for simulations in single and double precision using Intel's RAPL technology, (2) conduct a scaling study to quantify the effects of increasing model resolution on both energy dissipation and computing cycles, (3) analyze the differences between single-core and multicore processing, and (4) compare the effects of different compiler technologies on the mixed precision OpenIFS code; in particular, we compare Intel icc/ifort with GNU gcc/gfortran.

  13. A comparison of the domestic satellite communications forecast to the year 2000

    NASA Technical Reports Server (NTRS)

    Poley, W. A.; Lekan, J. F.; Salzman, J. A.; Stevenson, S. M.

    1983-01-01

    The methodologies and results of three NASA-sponsored market demand assessment studies are presented and compared. Forecasts of future satellite addressable traffic (both trunking and customer premises services) were developed for the three main service categories of voice, data and video and subcategories thereof for the benchmark years of 1980, 1990 and 2000. The contractor results are presented on a service by service basis in two formats: equivalent 36 MHz transponders and basic transmission units (voice: half-voice circuits, data: megabits per second and video: video channels). It is shown that while considerable differences exist at the service category level, the overall forecasts by the two contractors are quite similar. ITT estimates the total potential satellite market to be 3594 transponders in the year 2000 with data service comprising 54 percent of this total. The WU outlook for the same time period is 2779 transponders with voice services accounting for 66 percent of the total.

  14. Comparative analysis of operational forecasts versus actual weather conditions in airline flight planning, volume 1

    NASA Technical Reports Server (NTRS)

    Keitz, J. F.

    1982-01-01

    The impact of more timely and accurate weather data on airline flight planning, with emphasis on fuel savings, is studied. This volume of the report discusses the results of Task 1 of the four major tasks included in the study. Task 1 compares flight plans based on forecasts with plans based on the verifying analysis from 33 days during the summer and fall of 1979. The comparisons show that: (1) potential fuel savings conservatively estimated at between 1.2 and 2.5 percent could result from using more timely and accurate weather data in flight planning and route selection; (2) the Suitland forecast generally underestimates wind speeds; and (3) the track selection methodology of many airlines operating on the North Atlantic may not be optimal, resulting in their selecting other than the optimum North Atlantic Organized Track about 50 percent of the time.

  15. Assessing the Predictability of Convection using Ensemble Data Assimilation of Simulated Radar Observations in an LETKF system

    NASA Astrophysics Data System (ADS)

    Lange, Heiner; Craig, George

    2014-05-01

    This study uses the Local Ensemble Transform Kalman Filter (LETKF) to perform storm-scale Data Assimilation of simulated Doppler radar observations into the non-hydrostatic, convection-permitting COSMO model. In perfect model experiments (OSSEs), it is investigated how the limited predictability of convective storms affects precipitation forecasts. The study compares a fine analysis scheme with small RMS errors to a coarse scheme that allows for errors in position, shape and occurrence of storms in the ensemble. The coarse scheme uses superobservations, a coarser grid for analysis weights, a larger localization radius and a larger observation error that allow a broadening of the Gaussian error statistics. Three-hour forecasts of convective systems (with typical lifetimes exceeding 6 hours) from the detailed analyses of the fine scheme are found to be advantageous over those of the coarse scheme during the first 1-2 hours, with respect to the predicted storm positions. After 3 hours in the convective regime used here, the forecast quality of the two schemes appears indiscernible, judging by RMSE and verification methods for rain fields and objects. It is concluded that, for operational assimilation systems, the analysis scheme might not necessarily need to be detailed to the grid scale of the model. Depending on the forecast lead time, and on the presence of orographic or synoptic forcing that enhance the predictability of storm occurrences, analyses from a coarser scheme might suffice.
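
    The analysis step underlying such systems can be illustrated with a drastically simplified ensemble Kalman filter update. The sketch below is a stochastic EnKF with perturbed observations, not the LETKF itself: it omits localization and the ensemble transform, and all names and dimensions are illustrative assumptions.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err, H, seed=0):
    """One stochastic EnKF analysis step (perturbed observations).
    ensemble: (n_ens, n_state); obs: (n_obs,); H: (n_obs, n_state)."""
    rng = np.random.default_rng(seed)
    n_ens = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)             # state anomalies
    Y = X @ H.T                                      # observation-space anomalies
    R = np.eye(len(obs)) * obs_err ** 2
    Pyy = Y.T @ Y / (n_ens - 1) + R                  # innovation covariance
    Pxy = X.T @ Y / (n_ens - 1)                      # cross covariance
    K = Pxy @ np.linalg.inv(Pyy)                     # Kalman gain
    perturbed = obs + rng.normal(0, obs_err, (n_ens, len(obs)))
    return ensemble + (perturbed - ensemble @ H.T) @ K.T
```

    After the update, the ensemble mean moves toward the observation and the spread contracts, which is the basic mechanism a localized, transformed filter such as the LETKF refines.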

  16. Generating short-term probabilistic wind power scenarios via nonparametric forecast error density estimators

    DOE PAGES

    Staid, Andrea; Watson, Jean-Paul; Wets, Roger J.-B.; ...

    2017-07-11

    Forecasts of available wind power are critical in key electric power systems operations planning problems, including economic dispatch and unit commitment. Such forecasts are necessarily uncertain, limiting the reliability and cost effectiveness of operations planning models based on a single deterministic or “point” forecast. A common approach to address this limitation involves the use of a number of probabilistic scenarios, each specifying a possible trajectory of wind power production, with associated probability. We present and analyze a novel method for generating probabilistic wind power scenarios, leveraging available historical information in the form of forecasted and corresponding observed wind power time series. We estimate non-parametric forecast error densities, specifically using epi-spline basis functions, allowing us to capture the skewed and non-parametric nature of error densities observed in real-world data. We then describe a method to generate probabilistic scenarios from these basis functions that allows users to control the degree to which extreme errors are captured. We compare the performance of our approach to the current state of the art, considering publicly available data associated with the Bonneville Power Administration, analyzing aggregate production of a number of wind farms over a large geographic region. Finally, we discuss the advantages of our approach in the context of specific power systems operations planning problems: stochastic unit commitment and economic dispatch. Our methodology is embodied in the joint Sandia – University of California Davis Prescient software package for assessing and analyzing stochastic operations strategies.
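
    Setting aside the epi-spline machinery, the basic scenario-generation idea (sample error trajectories consistent with history and add them to the point forecast) can be sketched with a plain empirical resample of historical errors. The function name and array shapes below are assumptions for illustration, not the Prescient implementation.

```python
import numpy as np

def generate_scenarios(point_forecast, hist_forecast, hist_actual,
                       n_scenarios=100, seed=0):
    """Sample wind power scenarios by bootstrapping historical forecast
    errors.  The paper fits smooth epi-spline densities to these errors;
    here a plain empirical resample stands in for that density estimate."""
    rng = np.random.default_rng(seed)
    errors = np.asarray(hist_actual) - np.asarray(hist_forecast)   # (n_hist, T)
    idx = rng.integers(0, errors.shape[0], size=n_scenarios)       # resampled days
    scenarios = np.asarray(point_forecast) + errors[idx]           # (n_scen, T)
    return np.clip(scenarios, 0.0, None)    # wind power cannot be negative
```

    A smooth density estimator would additionally let the user thicken or thin the tails, which is the "control over extreme errors" the abstract refers to.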

  18. Extending to seasonal scales the current usage of short range weather forecasts and climate projections for water management in Spain

    NASA Astrophysics Data System (ADS)

    Rodriguez-Camino, Ernesto; Voces, José; Sánchez, Eroteida; Navascues, Beatriz; Pouget, Laurent; Roldan, Tamara; Gómez, Manuel; Cabello, Angels; Comas, Pau; Pastor, Fernando; Concepción García-Gómez, M.°; José Gil, Juan; Gil, Delfina; Galván, Rogelio; Solera, Abel

    2016-04-01

    This presentation first briefly describes the current use of weather forecasts and climate projections delivered by AEMET for water management in Spain. The potential use of seasonal climate predictions for water management, in particular of dams, is then discussed in more depth, drawing on a pilot experience carried out by a multidisciplinary group coordinated by AEMET and the DG for Water of Spain. This initiative is being developed in the framework of the national implementation of the GFCS and the European project EUPORIAS. The main components of this experience include meteorological and hydrological observations, and an empirical seasonal forecasting technique that provides an ensemble of water reservoir inflows. These forecasted inflows feed a prediction model for the dam state that has been adapted for this purpose. The full system is being tested retrospectively, over several decades, for selected water reservoirs located in different Spanish river basins. The assessment includes an objective verification of the probabilistic seasonal forecasts using standard metrics, and an evaluation of the potential social and economic benefits, with special attention to drought and flooding conditions. The methodology for incorporating these seasonal predictions into the decision-making process is being developed in close collaboration with the final users participating in this pilot experience.

  19. Evaluation of the 29-km Eta Model for Weather Support to the United States Space Program

    NASA Technical Reports Server (NTRS)

    Manobianco, John; Nutter, Paul

    1997-01-01

    The Applied Meteorology Unit (AMU) conducted a year-long evaluation of NCEP's 29-km mesoscale Eta (meso-eta) weather prediction model in order to identify added value to forecast operations in support of the United States space program. The evaluation was stratified over warm and cool seasons and considered both objective and subjective verification methodologies. Objective verification results generally indicate that meso-eta model point forecasts at selected stations exhibit minimal error growth in terms of RMS errors and are reasonably unbiased. Conversely, results from the subjective verification demonstrate that model forecasts of developing weather events such as thunderstorms, sea breezes, and cold fronts are not always as accurate as implied by the seasonal error statistics. Sea-breeze case studies reveal that the model generates a dynamically consistent thermally direct circulation over the Florida peninsula, although at a larger scale than observed. Thunderstorm verification reveals that the meso-eta model is capable of predicting areas of organized convection, particularly during the late afternoon hours, but is not capable of forecasting individual thunderstorms. Verification of cold fronts during the cool season reveals that the model is capable of forecasting a majority of cold frontal passages through east central Florida to within ±1 h of the observed frontal passage.
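
    The objective part of such an evaluation reduces to computing error statistics over matched forecast/observation pairs. A minimal sketch of the bias and RMSE referred to above (illustrative, not the AMU code):

```python
import numpy as np

def verify_point_forecasts(forecast, observed):
    """Objective point-forecast verification: bias (mean error) and RMSE
    over matched forecast/observation pairs."""
    f, o = np.asarray(forecast, float), np.asarray(observed, float)
    err = f - o
    return {"bias": float(err.mean()),
            "rmse": float(np.sqrt((err ** 2).mean()))}
```

    A bias near zero with small RMSE is what "reasonably unbiased with minimal error growth" means in practice; the subjective verification in the study catches event-level failures these aggregates hide.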

  20. Flash flood forecasting using simplified hydrological models, radar rainfall forecasts and data assimilation

    NASA Astrophysics Data System (ADS)

    Smith, P. J.; Beven, K.; Panziera, L.

    2012-04-01

    The issuing of timely flood alerts may be dependent upon the ability to predict future values of water level or discharge at locations where observations are available. Catchments at risk of flash flooding often have a rapid natural response time, typically less than the forecast lead time desired for issuing alerts. This work focuses on the provision of short-range (up to 6 hours lead time) predictions of discharge in small catchments based on utilising radar forecasts to drive a hydrological model. An example analysis based upon the Verzasca catchment (Ticino, Switzerland) is presented. Parsimonious time series models with a mechanistic interpretation (so-called Data-Based Mechanistic models) have been shown to provide reliable, accurate forecasts in many hydrological situations. In this study such a model is developed to predict the discharge at an observed location from observed precipitation data. The model is shown to capture the snowmelt response at this site. Observed discharge data is assimilated to improve the forecasts, of up to two hours lead time, that can be generated from observed precipitation. To generate forecasts with greater lead time, ensemble precipitation forecasts are utilised. In this study the Nowcasting ORographic precipitation in the Alps (NORA) product, outlined in more detail elsewhere (Panziera et al. Q. J. R. Meteorol. Soc. 2011; DOI:10.1002/qj.878), is utilised. NORA precipitation forecasts are derived from historical analogues based on the radar field and upper atmospheric conditions. As such, they avoid the need to explicitly model the evolution of the rainfall field through, for example, Lagrangian diffusion. The uncertainty in the forecasts is represented by characterisation of the joint distribution of the observed discharge, the discharge forecast using the (unknown in operational conditions) future observed precipitation, and that forecast utilising the NORA ensembles.
Constructing the joint distribution in this way allows the full historic record of data at the site to inform the predictive distribution. It is shown that, in part due to the limited availability of forecasts, the uncertainty in the relationship between the NORA based forecasts and other variates dominated the resulting predictive uncertainty.
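
    The flavor of a first-order Data-Based Mechanistic forecast can be sketched in a few lines. The model below recursively applies q[t+1] = a*q[t] + b*r[t] starting from the last assimilated discharge observation; the parameter values and names are illustrative assumptions, not the calibrated Verzasca model.

```python
def dbm_forecast(discharge, rain, a=0.9, b=0.05, lead=6):
    """Sketch of a first-order transfer-function (Data-Based Mechanistic
    style) discharge forecast, recursed `lead` steps ahead.
    discharge: observed series (last value is assimilated);
    rain: forecast rainfall for the lead-time steps (e.g. one NORA member)."""
    q = float(discharge[-1])                    # assimilate latest observation
    path = []
    for k in range(lead):
        r = rain[k] if k < len(rain) else 0.0   # future rain, zero if unknown
        q = a * q + b * r                       # storage recession + rain input
        path.append(q)
    return path
```

    Running this once per ensemble rainfall member yields the spread of discharge trajectories from which the predictive distribution is characterised.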

  1. Automated Discovery and Modeling of Sequential Patterns Preceding Events of Interest

    NASA Technical Reports Server (NTRS)

    Rohloff, Kurt

    2010-01-01

    The integration of emerging data manipulation technologies has enabled a paradigm shift in practitioners' abilities to understand and anticipate events of interest in complex systems. Example events of interest include outbreaks of socio-political violence in nation-states. Rather than relying on human-centric modeling efforts that are limited by the availability of subject-matter experts (SMEs), automated data processing technologies have enabled the development of innovative automated complex system modeling and predictive analysis technologies. We introduce one such emerging modeling technology: the sequential pattern methodology. We have applied the sequential pattern methodology to automatically identify patterns of observed behavior that precede outbreaks of socio-political violence such as riots, rebellions and coups in nation-states. The sequential pattern methodology is a groundbreaking approach to automated complex system model discovery because it generates easily interpretable patterns from direct observations of sampled factor data, supporting a deeper understanding of societal behaviors while remaining tolerant of observation noise and missing data. The discovered patterns are simple to interpret and mimic a human analyst's identification of trends in temporal data. Discovered patterns also provide an automated forecasting ability: we discuss an example of using discovered patterns coupled with a rich data environment to forecast various types of socio-political violence in nation-states.
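
    The core matching step, checking whether an ordered pattern of observations completes within a window before a target event, can be sketched as a toy counter. Everything here (symbol names, the window mechanism) is an illustrative assumption, not the deployed system.

```python
def pattern_precedes_event(events, pattern, targets, window=5):
    """Count how often `pattern` (an ordered subsequence of symbols)
    completes within `window` steps before a target event."""
    hits = 0
    for i, e in enumerate(events):
        if e not in targets:
            continue
        segment = events[max(0, i - window):i]   # lookback window before event
        j = 0
        for s in segment:                        # ordered subsequence match
            if j < len(pattern) and s == pattern[j]:
                j += 1
        if j == len(pattern):
            hits += 1
    return hits
```

    Counting pattern completions before events (versus completions overall) gives the kind of interpretable precursor statistic the methodology produces.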

  2. Fuzzy Multi-Objective Transportation Planning with Modified S-Curve Membership Function

    NASA Astrophysics Data System (ADS)

    Peidro, D.; Vasant, P.

    2009-08-01

    In this paper, the S-curve membership function methodology is applied to a transportation planning decision (TPD) problem. An interactive method for solving multi-objective TPD problems with fuzzy goals, available supply and forecast demand is developed. The proposed method attempts to simultaneously minimize the total production and transportation costs and the total delivery time, with reference to budget constraints, available supply and machine capacities at each source, as well as forecast demand and warehouse space constraints at each destination. In an industrial case, we compare the performance of S-curve membership functions, representing uncertainty in goals and constraints in TPD problems, with that of linear membership functions.
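
    A modified S-curve membership function of the logistic form mu = B / (1 + C·exp(alpha·t)) is straightforward to implement. The sketch below normalizes the decision variable to [0, 1] between the fully-satisfied and fully-violated bounds and uses the commonly quoted alpha = 13.813; treat all parameter values as illustrative defaults rather than the paper's exact calibration.

```python
import math

def s_curve_membership(x, x_a, x_b, B=1.0, C=0.001, alpha=13.813):
    """Modified S-curve membership: ~1 at x_a, ~0 at x_b, with a smooth
    logistic decay in between (a nonlinear alternative to a linear ramp)."""
    if x <= x_a:
        return 1.0
    if x >= x_b:
        return 0.0
    t = (x - x_a) / (x_b - x_a)             # normalize to [0, 1]
    return B / (1.0 + C * math.exp(alpha * t))
```

    Unlike a linear membership function, the S-curve changes degree of satisfaction slowly near the bounds and quickly in the middle, which is the behavior compared against linear memberships in the industrial case.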

  3. NASA Lewis Research Center Futuring Workshop

    NASA Technical Reports Server (NTRS)

    Boroush, Mark; Stover, John; Thomas, Charles

    1987-01-01

    On October 21 and 22, 1986, the Futures Group ran a two-day Futuring Workshop on the premises of NASA Lewis Research Center. The workshop had four main goals: to acquaint participants with the general history of technology forecasting; to familiarize participants with the range of forecasting methodologies; to acquaint participants with the range of applicability, strengths, and limitations of each method; and to offer participants some hands-on experience by working through both judgmental and quantitative case studies. Among the topics addressed during this workshop were: information sources; judgmental techniques; quantitative techniques; merger of judgment with quantitative measurement; data collection methods; and dealing with uncertainty.

  4. Systematic survey of high-resolution b value imaging along Californian faults: Inference on asperities

    NASA Astrophysics Data System (ADS)

    Tormann, T.; Wiemer, S.; Mignan, A.

    2014-03-01

    Understanding and forecasting earthquake occurrences is presumably linked to understanding the stress distribution in the Earth's crust. This cannot be measured instrumentally with useful coverage. However, the size distribution of earthquakes, quantified by the Gutenberg-Richter b value, is possibly a proxy for differential stress conditions and could thereby act as a crude stress meter wherever seismicity is observed. In this study, we improve the methodology of b value imaging for application to a high-resolution 3-D analysis of a complex fault network. In particular, we develop a distance-dependent sampling algorithm and introduce a linearity measure to restrict our output to those regions where the magnitude distribution strictly follows a power law. We assess the catalog completeness along the fault traces using the Bayesian Magnitude of Completeness method and systematically image b values for 243 major fault segments in California. We identify and report b value structures, revisiting previously published features, e.g., the Parkfield asperity, and documenting additional anomalies, e.g., along the San Andreas and Northridge faults. Combining local b values with local earthquake productivity rates, we derive probability maps for the annual potential of one or more M6 events as indicated by the microseismicity of the last three decades. We present a physical concept of how different stressing conditions along a fault surface may lead to b value variation and explain nonlinear frequency-magnitude distributions. Detailed spatial b value information and its physical interpretation can advance our understanding of earthquake occurrence and ideally lead to improved forecasting ability.
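
    The b value itself is typically estimated with the Aki (1965) maximum-likelihood formula, with Utsu's correction for magnitude binning. A minimal sketch (the standard estimator, not the authors' imaging code, which adds the sampling and linearity machinery described above):

```python
import math

def b_value_mle(magnitudes, mc, dm=0.1):
    """Maximum-likelihood Gutenberg-Richter b value for events at or
    above the completeness magnitude mc, with Utsu's correction dm/2
    for magnitude binning: b = log10(e) / (mean(M) - (mc - dm/2))."""
    mags = [m for m in magnitudes if m >= mc]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))
```

    In a b-value imaging study this estimator is applied to the local sample of earthquakes around each grid node, so the spatial map is only as good as the local completeness estimate mc.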

  5. Battery Energy Storage State-of-Charge Forecasting: Models, Optimization, and Accuracy

    DOE PAGES

    Rosewater, David; Ferreira, Summer; Schoenwald, David; ...

    2018-01-25

    Battery energy storage systems (BESS) are a critical technology for integrating high penetration renewable power on an intelligent electrical grid. As limited energy restricts the steady-state operational state-of-charge (SoC) of storage systems, SoC forecasting models are used to determine feasible charge and discharge schedules that supply grid services. Smart grid controllers use SoC forecasts to optimize BESS schedules to make grid operation more efficient and resilient. This study presents three advances in BESS state-of-charge forecasting. First, two forecasting models are reformulated to be conducive to parameter optimization. Second, a new method for selecting optimal parameter values based on operational data is presented. Last, a new framework for quantifying model accuracy is developed that enables a comparison between models, systems, and parameter selection methods. The accuracies achieved by both models, on two example battery systems, with each method of parameter selection are then compared in detail. The results of this analysis suggest variation in the suitability of these models for different battery types and applications. Finally, the proposed model formulations, optimization methods, and accuracy assessment framework can be used to improve the accuracy of SoC forecasts enabling better control over BESS charge/discharge schedules.
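
    A minimal SoC forecasting model and a brute-force parameter fit convey the structure of the problem. The single-efficiency integrator and grid search below are assumptions for illustration, not the paper's model formulations or optimization method.

```python
import numpy as np

def soc_forecast(soc0, power, capacity_kwh, eff=0.95, dt_h=1.0):
    """Forward SoC model: integrate charge/discharge power with a single
    round-trip-style efficiency parameter, clipped to [0, 1]."""
    soc = [soc0]
    for p in power:                        # p > 0 charges, p < 0 discharges
        step = (eff * p if p > 0 else p / eff) * dt_h / capacity_kwh
        soc.append(min(1.0, max(0.0, soc[-1] + step)))
    return soc

def fit_efficiency(soc_obs, power, capacity_kwh, grid=None):
    """Pick the efficiency minimizing squared SoC forecast error on
    operational data -- a brute-force stand-in for parameter optimization."""
    grid = grid if grid is not None else np.linspace(0.80, 1.00, 21)
    def sse(eff):
        pred = soc_forecast(soc_obs[0], power, capacity_kwh, eff)
        return sum((a - b) ** 2 for a, b in zip(pred, soc_obs))
    return min(grid, key=sse)
```

    Reformulating a model so that its error is a smooth function of parameters, as the paper does, lets a proper optimizer replace this grid search.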

  7. A system approach to the long-term forecasting of climate data in the Baikal region

    NASA Astrophysics Data System (ADS)

    Abasov, N.; Berezhnykh, T.

    2003-04-01

    The Angara River, running from Lake Baikal with a cascade of hydropower plants built on it, plays a particular role in the economy of the region. Given the high variability of water inflow into the rivers and lakes (long low-water periods and catastrophic floods), which is due to the climatic peculiarities of water resource formation, long-term forecasting is developed and applied to decrease risk at hydropower plants. The methodology and methods of long-term forecasting of natural-climatic processes employ ideas of the research schools of Academician I.P. Druzhinin and Prof. A.P. Reznikhov and consist in a detailed investigation of cause-effect relations, the identification of physical analogs, and their application to formalized methods of long-term forecasting. These are divided into qualitative methods (the background method; the method of analogs based on solar activity), probabilistic methods, and approximative methods (analog-similarity relations; a discrete-continuous model). These forecasting methods have been implemented as the analytical tools of the information-forecasting software "GIPSAR", which provides some elements of artificial intelligence. Background forecasts of the runoff of the Ob, Yenisei, and Angara Rivers in the south of Siberia are based on space-time regularities revealed by taking account of the phase shifts in the occurrence of secular maxima and minima on integral-difference curves of long-term hydrological processes in the objects compared. Solar activity plays an essential role in investigations of global variations of climatic processes. Its consideration in the method of superposed epochs has led to the conclusion that a low-water period in the actual inflow to Lake Baikal is more probable on the increasing branch of the 11-year solar activity cycle. A high-water period is more probable on the decreasing branch of solar activity, from the 2nd to the 5th year after its maximum.
    The probabilistic method of forecasting (a year in advance) is based on the property of alternation of series of years with increase and decrease in the observed indicators (characteristic indices) of natural processes. Most of the series (98.4-99.6%) consist of one to three years. The forecasting problem is divided into two parts: (1) a qualitative forecast of the probability that the current series will either continue or be replaced by a new series during the next year, based on the frequency characteristics of series of years with increase or decrease of the forecasted sequence; (2) a quantitative estimate of the forecasted value, in the form of a curve of conditional frequencies, made on the basis of intra-sequence interrelations among hydrometeorological elements by differentiating them with respect to series of years of increase or decrease, constructing particular curves of conditional frequencies of the runoff for each expected variant of series development, and subsequently constructing a generalized curve. Approximative learning methods form forecasted trajectories of the studied process indices for a long-term perspective. The method of analog-similarity relations is based on the fact that long observation periods reveal similarities in the character of variability of indices for some fragments of the sequence x(t) by definite criteria. The idea of the method is to estimate the similarity of such fragments of the sequence, which are called analogs. The method applies multistage optimization of external parameters (e.g. the number of iterations of sliding averaging needed to decompose the sequence into two components: a smoothed one with isolated periodic oscillations, and a residual, random one). The method is applicable to forecast terms ranging from the current term up to the double solar cycle.
    Using a special integration procedure, it selects the terms with the best results for the given optimization subsample. Several optimal parameter vectors obtained in this way are tested on an examination (verifying) subsample. If the procedure is successful, the forecast is made immediately by integration of the several best solutions. Peculiarities of forecasting extreme processes: methods of long-term forecasting allow sufficiently reliable forecasts within the interval from x_min + Δ₁ to x_max - Δ₂ (i.e. in the interval of medium index values). Meanwhile, in the intervals close to the extremes, the reliability of forecasts is substantially lower. While for medium values the statistics of the 100-year sequence give acceptable results, owing to a sufficiently large number of revealed analogs corresponding to prognostic samples, for extreme values the situation is quite different, first of all because of the scarcity of statistical data. Decreasing the values Δ₁, Δ₂ → 0 (by including them among the optimization parameters of the considered forecasting methods) could be one way to improve the reliability of forecasts. This approach has been partially realized in the method of analog-similarity relations, giving the possibility to form a range of possible forecasted trajectories between two bounds, from the minimum possible trajectory to the maximum possible one. Reliability of long-term forecasts: both the methodology and the methods considered above have been realized in the information-forecasting system "GIPSAR".
    The system includes tools implementing several forecasting methods, analysis of initial and forecasted information, a developed database, a set of tools for verification of algorithms, additional information on the algorithms of statistical processing of sequences (sliding averaging, integral-difference curves, etc.), aids to organize the input of initial information (in its various forms), and aids to prepare output prognostic documents. Risk management: the normal functioning of the Angara cascade is periodically interrupted by risks of two types in the Baikal, Bratsk, and Ust-Ilimsk reservoirs: long low-water periods and sudden periods of extremely high water levels. For example, the low-water periods observed in the reservoirs of the Angara cascade can be classified under four risk categories: (1) acceptable (negligible reduction of electric power generation by hydropower plants; some difficulty in meeting environmental and navigation requirements); (2) significant (substantial reduction of electric power generation by hydropower plants; some restriction on water releases for navigation; violation of environmental requirements in some years); (3) emergency (large losses in electric power generation; limited electricity supply to large consumers; significant restriction of water releases for navigation; threat of exposure of drinking water intake works; violation of environmental requirements for a number of years); (4) catastrophic (energy crisis; social crisis; exposure of drinking water intake works; termination of navigation; environmental catastrophe). Management of energy systems consists of operative, many-year regulation and perspective planning, and has to take into account the analysis of operative data (water reserves in reservoirs), long-term statistics and relations among natural processes, and forecasts: short-term (for a day, week, decade), long-term and/or super-long-term (from a month to several decades).
    Such natural processes as water inflow to reservoirs and air temperatures during heating periods depend in turn on external factors: prevailing types of atmospheric circulation, intensity of the 11- and 22-year cycles of solar activity, volcanic activity, interaction between the ocean and atmosphere, etc. Until recently, despite the established scientific schools on long-term forecasting (I.P. Druzhinin, A.P. Reznikhov), energy system management has been based only on specially drawn dispatching schedules and long-term hydrometeorological forecasts, without use of prospective forecast indices. Inserting a parallel forecast block (based on the analysis of data on natural processes and special forecasting methods) into the scheme can largely smooth the unfavorable consequences of the impact of natural processes on the sustainable development of energy systems, and especially on their safe operation. However, the requirements on the reliability and accuracy of long-term forecasts then increase significantly. The considered approach to long-term forecasting can be used to predict mean winter and summer air temperatures, droughts, and forest fires.
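
    The analog-similarity idea at the heart of such forecasts can be reduced to a few lines: find the historical fragment closest to the most recent window of the series and return the values that followed it. The sketch below is a bare-bones illustration, not the GIPSAR implementation with its multistage parameter optimization.

```python
import numpy as np

def analog_forecast(series, window=10, horizon=5):
    """Analog-similarity forecast: locate the historical fragment with
    minimum RMS distance to the latest `window` values and return the
    `horizon` values that followed it."""
    x = np.asarray(series, float)
    recent = x[-window:]
    best_i, best_d = None, np.inf
    for i in range(len(x) - window - horizon):      # candidate fragments
        d = np.sqrt(np.mean((x[i:i + window] - recent) ** 2))
        if d < best_d:
            best_i, best_d = i, d
    return x[best_i + window : best_i + window + horizon]
```

    Retaining the several closest analogs instead of the single best one, and spanning their continuations, gives the range of possible trajectories (from minimum to maximum) that the method uses near the extremes.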

  8. The Experimental Regional Ensemble Forecast System (ExREF): Its Use in NWS Forecast Operations and Preliminary Verification

    NASA Technical Reports Server (NTRS)

    Reynolds, David; Rasch, William; Kozlowski, Daniel; Burks, Jason; Zavodsky, Bradley; Bernardet, Ligia; Jankov, Isidora; Albers, Steve

    2014-01-01

    The Experimental Regional Ensemble Forecast (ExREF) system is a tool for the development and testing of new Numerical Weather Prediction (NWP) methodologies. ExREF is run in near-realtime by the Global Systems Division (GSD) of the NOAA Earth System Research Laboratory (ESRL) and its products are made available through a website, an ftp site, and via the Unidata Local Data Manager (LDM). The ExREF domain covers most of North America and has 9-km horizontal grid spacing. The ensemble has eight members, all employing WRF-ARW. The ensemble uses a variety of initial conditions from LAPS and the Global Forecasting System (GFS) and multiple boundary conditions from the GFS ensemble. Additionally, a diversity of physical parameterizations is used to increase ensemble spread and to account for the uncertainty in forecasting extreme precipitation events. ExREF has been a component of the Hydrometeorology Testbed (HMT) NWP suite in the 2012-2013 and 2013-2014 winters. A smaller domain covering just the West Coast was created to minimize bandwidth consumption for the NWS. This smaller domain has been and is being distributed to the National Weather Service (NWS) Weather Forecast Office and the California Nevada River Forecast Center in Sacramento, California, where it is ingested into the Advanced Weather Interactive Processing System (AWIPS I and II) to provide guidance on the forecasting of extreme precipitation events. This paper will review the cooperative effort by NOAA ESRL, NASA SPoRT (Short-term Prediction Research and Transition Center), and the NWS to facilitate the ingest and display of ExREF data utilizing the AWIPS I and II D2D and GFE (Graphical Forecast Editor) software.
    Within GFE is a very useful verification software package called BoiVer that allows the NWS to use the River Forecast Center's 4-km gridded QPE to compare the 6-hr QPF from all operational NWP models, along with the ExREF mean 6-hr QPF, so that forecasters can build confidence in using ExREF when preparing their rainfall forecasts. Preliminary results will be presented.
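
    Verification of this kind, comparing a model's 6-hr QPF against the RFC's 4-km gridded QPE, reduces to a handful of gridded scores. A minimal sketch of such a comparison (the function, field layout, and the 1 mm threshold are illustrative assumptions, not the BoiVer interface):

```python
import numpy as np

def qpf_verification(qpf, qpe):
    """Simple verification scores for one 6-hr accumulation period:
    forecast (QPF) vs. observed gridded analysis (QPE), both in mm."""
    qpf, qpe = np.asarray(qpf, float), np.asarray(qpe, float)
    bias = qpf.mean() - qpe.mean()              # mean error (mm)
    rmse = np.sqrt(np.mean((qpf - qpe) ** 2))   # root-mean-square error
    # Frequency bias for a 1 mm "measurable precipitation" threshold
    freq_bias = (qpf >= 1.0).sum() / max((qpe >= 1.0).sum(), 1)
    return {"bias": bias, "rmse": rmse, "freq_bias": freq_bias}

# Toy 2x2 "grids" standing in for the real 4-km fields
scores = qpf_verification([[2.0, 0.0], [4.0, 1.0]], [[1.0, 0.0], [5.0, 1.0]])
```

    Comparing such scores across models and against the ExREF ensemble mean is what lets forecasters calibrate their trust in each guidance source.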

  9. Uncertainty estimation of long-range ensemble forecasts of snowmelt flood characteristics

    NASA Astrophysics Data System (ADS)

    Kuchment, L.

    2012-04-01

    Long-range forecasts of snowmelt flood characteristics with a lead time of 2-3 months have important significance for the regulation of flood runoff and the mitigation of flood damage on almost all large Russian rivers. At the same time, the application of current forecasting techniques based on regression relationships between runoff volume and indexes of river basin conditions can lead to serious forecast errors, resulting in large economic losses caused by incorrect flood regulation. Forecast errors can be caused by complicated processes of soil freezing and soil moisture redistribution, an excessively high snowmelt rate, large liquid precipitation before snowmelt, or large departures of meteorological conditions during the lead-time period from climatological ones. Analysis of economic losses has shown that the largest damages could, to a significant extent, be avoided if decision makers had an opportunity to take predictive uncertainty into account and could use more cautious strategies in runoff regulation. The development of a methodology for long-range ensemble forecasting of spring/summer floods based on distributed physically-based runoff generation models has created, in principle, a new basis for improving hydrological predictions as well as for estimating their uncertainty. This approach is illustrated by forecasting of the spring-summer floods in the Vyatka River and Seim River basins. The application of the physically-based models of snowmelt runoff generation gives an essential improvement in the statistical scores of the deterministic forecasts of flood volume in comparison with the forecasts obtained from the regression relationships. These models were also used for probabilistic forecasts, assigning meteorological inputs during the lead-time period from the available historical daily series and from series simulated using a weather generator and a Monte Carlo procedure. 
    The weather generator consists of stochastic models of daily temperature and precipitation. The performance of the probabilistic forecasts was estimated by ranked probability skill scores. Monte Carlo simulation using the weather generator gave better results than using the historical meteorological series.
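
    The ranked probability skill score used here compares the cumulative forecast probabilities over ordered categories (e.g. low/medium/high flood volume) against the observed category, relative to a climatological baseline. A minimal sketch (the three-category split and the example numbers are illustrative assumptions):

```python
import numpy as np

def rps(probs, obs_cat):
    """Ranked probability score: squared distance between the cumulative
    forecast distribution and the observed step-function distribution."""
    cdf_f = np.cumsum(np.asarray(probs, float))
    cdf_o = np.zeros_like(cdf_f)
    cdf_o[obs_cat:] = 1.0            # observation fell in category obs_cat
    return float(np.sum((cdf_f - cdf_o) ** 2))

def rpss(probs, clim_probs, obs_cat):
    """Skill relative to climatology: 1 = perfect, 0 = no improvement."""
    return 1.0 - rps(probs, obs_cat) / rps(clim_probs, obs_cat)

# Three flood-volume categories (low / medium / high), observed = medium
clim = [1/3, 1/3, 1/3]               # climatological forecast
sharp = [0.1, 0.8, 0.1]              # sharper ensemble-based forecast
score = rpss(sharp, clim, obs_cat=1) # positive: beats climatology
```

    Averaging such scores over many forecast-observation pairs gives the skill estimates the abstract refers to.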

  10. Using a cross correlation technique to refine the accuracy of the Failure Forecast Method: Application to Soufrière Hills volcano, Montserrat

    NASA Astrophysics Data System (ADS)

    Salvage, R. O.; Neuberg, J. W.

    2016-09-01

    Prior to many volcanic eruptions, an acceleration in seismicity has been observed, suggesting the potential for this as a forecasting tool. The Failure Forecast Method (FFM) relates an accelerating precursor to the timing of failure by an empirical power law, with failure being defined in this context as the onset of an eruption. Previous applications of the FFM have used a wide variety of accelerating time series, often generating questionable forecasts with large misfits between data and the forecast, as well as the generation of a number of different forecasts from the same data series. Here, we show an alternative approach applying the FFM in combination with a cross correlation technique which identifies seismicity from a single active source mechanism and location at depth. Isolating a single system at depth avoids additional uncertainties introduced by averaging data over a number of different accelerating phenomena, and consequently reduces the misfit between the data and the forecast. Similar seismic waveforms were identified in the precursory accelerating seismicity to dome collapses at Soufrière Hills volcano, Montserrat in June 1997, July 2003 and February 2010. These events were specifically chosen since they represent a spectrum of collapse scenarios at this volcano. The cross correlation technique generates a five-fold increase in the number of seismic events which could be identified from continuous seismic data rather than using triggered data, thus providing a more holistic understanding of the ongoing seismicity at the time. The use of similar seismicity as a forecasting tool for collapses in 1997 and 2003 greatly improved the forecasted timing of the dome collapse, as well as improving the confidence in the forecast, thereby outperforming the classical application of the FFM. 
We suggest that focusing on a single active seismic system at depth allows a more accurate forecast of some of the major dome collapses from the ongoing eruption at Soufrière Hills volcano, and provides a simple addition to the well-used methodology of the FFM.
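
    In its classical form (Voight's empirical law with exponent alpha = 2), the FFM predicts that the inverse of the precursory event rate decays linearly toward zero at the time of failure, so the forecast is the zero crossing of a straight-line fit to the inverse rate. A minimal sketch of that classical application (the synthetic data are an illustrative assumption):

```python
import numpy as np

def ffm_failure_time(times, rates):
    """Classical FFM with alpha = 2: fit a line to the inverse event
    rate and extrapolate it to zero to forecast the failure time."""
    inv = 1.0 / np.asarray(rates, float)
    slope, intercept = np.polyfit(np.asarray(times, float), inv, 1)
    return -intercept / slope        # t at which 1/rate reaches zero

# Synthetic accelerating sequence: rate = 1 / (10 - t) events per day,
# constructed so that the built-in failure time is t = 10
t = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
r = 1.0 / (10.0 - t)
tf = ffm_failure_time(t, r)          # recovers t = 10.0
```

    Restricting the input rate series to a single cross-correlated event family, as the authors do, reduces the scatter of `inv` about the fitted line and hence the spread of `tf`.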

  11. Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.

    ERIC Educational Resources Information Center

    Moore, Gwendolyn B.; And Others

    The report describes three advanced technologies--robotics, artificial intelligence, and computer simulation--and identifies the ways in which they might contribute to special education. A hybrid methodology was employed to identify existing technology and forecast future needs. Following this framework, each of the technologies is defined,…

  12. Agri-Manpower Forecasting and Educational Planning

    ERIC Educational Resources Information Center

    Ramarao, D.; Agrawal, Rashmi; Rao, B. V. L. N.; Nanda, S. K.; Joshi, Girish P.

    2014-01-01

    Purpose: Developing countries need to plan growth or expansion of education so as to provide required trained manpower for different occupational sectors. The paper assesses supply and demand of professional manpower in Indian agriculture and the demands are translated into educational requirements. Methodology: The supply is assessed from the…

  13. 76 FR 30605 - Assessment and Collection of Regulatory Fees For Fiscal Year 2011

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-26

    ... the Wireless Telecommunications Bureau's Numbering Resource Utilization Forecast and Annual CMRS... compute their fee payment using the standard methodology \\32\\ that is currently in place for CMRS Wireless... Commission, Regulatory Fees Fact Sheet: What You Owe--Commercial Wireless Services for FY 2010 at 1 (released...

  14. Using GIS Tools and Environmental Scanning to Forecast Industry Workforce Needs

    ERIC Educational Resources Information Center

    Gaertner, Elaine; Fleming, Kevin; Marquez, Michelle

    2009-01-01

    The Centers of Excellence (COE) provide regional workforce data on high growth, high demand industries and occupations for use by community colleges in program planning and resource enhancement. This article discusses the environmental scanning research methodology and its application to data-driven decision making in community college program…

  15. An Implementing Strategy for Improving Wildland Fire Environmental Literacy

    NASA Astrophysics Data System (ADS)

    McCalla, M. R.; Andrus, D.; Barnett, K.

    2007-12-01

    Wildland fire is any planned or unplanned fire which occurs in wildland ecosystems. Wildland fires affect millions of acres annually in the U.S. An average of 5.4 million acres a year were burned in the U.S. between 1995 and 2004, approximately 142 percent of the average burned area between 1984 and 1994. In 2005 alone, Federal agencies spent nearly $1 billion on fire suppression and state and local agencies contributed millions more. Many Americans prefer to live and vacation in relatively remote surroundings (i.e., woods and rangelands). These choices offer many benefits, but they also present significant risks. Most of North America is fire-prone and every day developed areas and home sites are extending further into natural wildlands, which increases the chances of catastrophic fire. In addition, an abundance of accumulated biomass in forests and rangelands and persistent drought conditions are contributing to larger, costlier wildland fires. To effectively prevent, manage, suppress, respond to, and recover from wildland fires, fire managers and other communities impacted by wildland fires (e.g., the business community; healthcare providers; federal, state, and local policymakers; the media; the public, etc.) need timely, accurate, and detailed wildland fire weather and climate information to support their decision-making activities. But what are the wildland fire weather and climate data, products, and information, as well as information dissemination technologies, needed to reach out and promote wildland fire environmental literacy in these communities? The Office of the Federal Coordinator for Meteorological Services and Supporting Research (OFCM) conducted a comprehensive review and assessment of weather and climate needs of providers and users in their wildland fire and fuels management activities. 
The assessment has nine focus areas, one of which is environmental literacy (e.g., education, training, outreach, partnering, and collaboration). The OFCM model for promoting wildland fire environmental literacy, the model's component parts, as well as an implementing strategy to execute the model will be presented. That is, the presentation will lay out the framework and methodology which the OFCM used to systematically define the wildland fire weather and climate education and outreach needs through interdepartmental collaboration within the OFCM coordinating infrastructure. A key element of the methodology is to improve the overall understanding and use of wildland fire forecast and warning climate and weather products and to exploit current and emerging technologies to improve the dissemination of customer-tailored forecast and warning information and products to stakeholders and users. Thus, the framework and methodology define the method used to determine the target public, private, and academic sector audiences. The methodology also identifies the means for determining the optimal channels, formats, and content for informing end users in time for effective action to be taken.

  16. The IDEA model: A single equation approach to the Ebola forecasting challenge.

    PubMed

    Tuite, Ashleigh R; Fisman, David N

    2018-03-01

    Mathematical modeling is increasingly accepted as a tool that can inform disease control policy in the face of emerging infectious diseases, such as the 2014-2015 West African Ebola epidemic, but little is known about the relative performance of alternate forecasting approaches. The RAPIDD Ebola Forecasting Challenge (REFC) tested the ability of eight mathematical models to generate useful forecasts in the face of simulated Ebola outbreaks. We used a simple, phenomenological single-equation model (the "IDEA" model), which relies only on case counts, in the REFC. Model fits were performed using a maximum likelihood approach. We found that the model performed reasonably well relative to other more complex approaches, with performance metrics ranked on average 4th or 5th among participating models. IDEA appeared better suited to long- than short-term forecasts, and could be fit using nothing but reported case counts. Several limitations were identified, including difficulty in identifying epidemic peak (even retrospectively), unrealistically precise confidence intervals, and difficulty interpolating daily case counts when using a model scaled to epidemic generation time. More realistic confidence intervals were generated when case counts were assumed to follow a negative binomial, rather than Poisson, distribution. Nonetheless, IDEA represents a simple phenomenological model, easily implemented in widely available software packages that could be used by frontline public health personnel to generate forecasts with accuracy that approximates that which is achieved using more complex methodologies. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.
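
    The IDEA (Incidence Decay with Exponential Adjustment) model describes incident cases at epidemic generation t as I(t) = (R0 / (1+d)^t)^t, with basic reproduction number R0 and discount factor d. The paper fits it by maximum likelihood; a simpler log-linear least-squares sketch shows the model's structure (the synthetic outbreak data are an assumption for illustration):

```python
import numpy as np

def idea_incidence(t, r0, d):
    """IDEA model: incident cases at epidemic generation t."""
    return (r0 / (1.0 + d) ** t) ** t

def fit_idea(generations, cases):
    """Log-linear least-squares fit: ln I = t*ln(R0) - t^2*ln(1+d).
    (A sketch; the REFC submission used maximum likelihood instead.)"""
    t = np.asarray(generations, float)
    y = np.log(np.asarray(cases, float))
    X = np.column_stack([t, -t * t])       # design matrix for (ln R0, ln(1+d))
    (a, b), *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.exp(a)), float(np.exp(b) - 1.0)   # R0, d

gens = [1, 2, 3, 4, 5, 6]
counts = [idea_incidence(t, 2.0, 0.05) for t in gens]  # synthetic outbreak
r0_hat, d_hat = fit_idea(gens, counts)                 # recovers ~2.0, ~0.05
```

    Because the model is linear in (ln R0, ln(1+d)) after taking logs, nothing beyond reported case counts per generation is needed, which is what makes the approach usable by frontline personnel.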

  17. Dynamical Downscaling of Seasonal Climate Prediction over Nordeste Brazil with ECHAM3 and NCEP's Regional Spectral Models at IRI.

    NASA Astrophysics Data System (ADS)

    Nobre, Paulo; Moura, Antonio D.; Sun, Liqiang

    2001-12-01

    This study presents an evaluation of a seasonal climate forecast made with the International Research Institute for Climate Prediction (IRI) dynamical forecast system (regional model nested into a general circulation model) over northern South America for January-April 1999, encompassing the rainy season over Brazil's Nordeste. The one-way nesting is done in two tiers: first the NCEP Regional Spectral Model (RSM) runs with an 80-km grid mesh forced by the ECHAM3 atmospheric general circulation model (AGCM) outputs; then the RSM runs with a finer grid mesh (20 km) forced by the forecasts generated by the RSM-80. An ensemble of three realizations is performed. Lower boundary conditions over the oceans for both the ECHAM and RSM model runs are sea surface temperature forecasts over the tropical oceans. Soil moisture is initialized from ECHAM's inputs. The rainfall forecasts generated by the regional model are compared with those of the AGCM and with observations. It is shown that the regional model at 80-km resolution improves upon the AGCM rainfall forecast, reducing both seasonal bias and root-mean-square error. On the other hand, the RSM-20 forecasts presented larger errors, with spatial patterns that resemble those of the local topography. The better forecast of the position and width of the intertropical convergence zone (ITCZ) over the tropical Atlantic by the RSM-80 model is one of the principal reasons for the better forecast scores of the RSM-80 relative to the AGCM. The regional model improved the spatial as well as the temporal details of the rainfall distribution, while also presenting the minimum spread among the ensemble members. The statistics of synoptic-scale weather variability on seasonal timescales were best forecast with the regional 80-km model over the Nordeste. The possibility of forecasting the frequency distribution of dry and wet spells within the rainy season is encouraging.

  18. An Operational Short-Term Forecasting System for Regional Hydropower Management

    NASA Astrophysics Data System (ADS)

    Gronewold, A.; Labuhn, K. A.; Calappi, T. J.; MacNeil, A.

    2017-12-01

    The Niagara River is the natural outlet of Lake Erie and drains four of the five Great Lakes. The river is used to move commerce and is home to both sport fishing and tourism industries. It also provides nearly 5 million kilowatts of hydropower for approximately 3.9 million homes. Due to a complex international treaty and the necessity of balancing water needs for an extensive tourism industry, the power entities operating on the river require detailed and accurate short-term river flow forecasts to maximize power output. A new forecast system is being evaluated that takes advantage of several previously independent components including the NOAA Lake Erie Operational Forecast System (LEOFS), a previously developed HEC-RAS model, input from the New York Power Authority (NYPA) and Ontario Power Generation (OPG), and lateral flow forecasts for some of the tributaries provided by the NOAA Northeast River Forecast Center (NERFC). The Corps of Engineers updated the HEC-RAS model of the upper Niagara River to use the output forcing from LEOFS and a planned Grass Island Pool elevation provided by the power entities. The entire system has been integrated at the NERFC; it will be run multiple times per day with results provided to the Niagara River Control Center operators. The new model helps improve discharge forecasts by better accounting for dynamic conditions on Lake Erie. LEOFS captures seiche events on the lake that are often several meters of displacement from still water level. These seiche events translate into flow spikes that HEC-RAS routes downstream. Knowledge of the peak arrival time helps improve operational decisions at the Grass Island Pool. This poster will compare and contrast results from the existing operational flow forecast and the new integrated LEOFS/HEC-RAS forecast. This additional model will supply the Niagara River Control Center operators with multiple forecasts of flow to help improve forecasting under a wider variety of conditions.

  19. The Impact of Implementing a Demand Forecasting System into a Low-Income Country’s Supply Chain

    PubMed Central

    Mueller, Leslie E.; Haidari, Leila A.; Wateska, Angela R.; Phillips, Roslyn J.; Schmitz, Michelle M.; Connor, Diana L.; Norman, Bryan A.; Brown, Shawn T.; Welling, Joel S.; Lee, Bruce Y.

    2016-01-01

    OBJECTIVE To evaluate the potential impact and value of applications (e.g., ordering levels, storage capacity, transportation capacity, distribution frequency) of data from demand forecasting systems implemented in a lower-income country’s vaccine supply chain with different levels of population change to urban areas. MATERIALS AND METHODS Using our software, HERMES, we generated a detailed discrete event simulation model of Niger’s entire vaccine supply chain, including every refrigerator, freezer, transport, personnel, vaccine, cost, and location. We represented the introduction of a demand forecasting system to adjust vaccine ordering that could be implemented with increasing delivery frequencies and/or additions of cold chain equipment (storage and/or transportation) across the supply chain during varying degrees of population movement. RESULTS Implementing a demand forecasting system with increased storage and transport frequency increased the number of successfully administered vaccine doses and lowered the logistics cost per dose by up to 34%. Implementing a demand forecasting system without storage/transport increases actually decreased vaccine availability in certain circumstances. DISCUSSION The potential maximum gains of a demand forecasting system may only be realized if the system is implemented together with augmented supply chain cold storage and transportation. Implementation alone may have some impact but, in certain circumstances, may hurt delivery. Therefore, implementation of demand forecasting systems with additional storage and transport may be the better approach. Significant decreases in the logistics cost per dose with more administered vaccines support investment in these forecasting systems. CONCLUSION Demand forecasting systems have the potential to greatly improve vaccine demand fulfillment and decrease logistics cost/dose when implemented with storage and transportation increases. 
Simulation modeling can demonstrate the potential health and economic benefits of supply chain improvements. PMID:27219341

  20. A new forecast presentation tool for offshore contractors

    NASA Astrophysics Data System (ADS)

    Jørgensen, M.

    2009-09-01

    Contractors working offshore are often very sensitive to both sea and weather conditions, and it is essential that they have easy access to reliable information on coming conditions to enable planning of when to start or shut down offshore operations to avoid loss of life and materials. The Danish Meteorological Institute (DMI), in cooperation with business partners in the field, recently developed a new application to accommodate that need. The "Marine Forecast Service" is a browser-based forecast presentation tool. It provides an interface for the user to enable easy and quick access to all relevant meteorological and oceanographic forecasts and observations for a given area of interest. Each customer gains access to the application via a standard login/password procedure. Once logged in, the user can inspect animated forecast maps of parameters such as wind, gust, wave height, swell, and current, among others. Supplementing the general maps, the user can choose to look at forecast graphs for each of the locations where the user is running operations. These forecast graphs can also be overlaid with the user's own in situ observations, if such exist. Furthermore, the data from the graphs can be exported as data files that the customer can use in his own applications as he desires. As part of the application, a forecaster's view of the current and near-future weather situation is also presented to the user, adding further value to the information presented through maps and graphs. Among other features of the product, animated radar and satellite images could be mentioned. Finally, the application provides the possibility of a "second opinion" through traditional weather charts from another recognized provider of weather forecasts. The presentation will provide more detailed insights into the contents of the application as well as some of the experiences with the product.

  1. Nonmechanistic forecasts of seasonal influenza with iterative one-week-ahead distributions.

    PubMed

    Brooks, Logan C; Farrow, David C; Hyun, Sangwon; Tibshirani, Ryan J; Rosenfeld, Roni

    2018-06-15

    Accurate and reliable forecasts of seasonal epidemics of infectious disease can assist in the design of countermeasures and increase public awareness and preparedness. This article describes two main contributions we made recently toward this goal: a novel approach to probabilistic modeling of surveillance time series based on "delta densities", and an optimization scheme for combining output from multiple forecasting methods into an adaptively weighted ensemble. Delta densities describe the probability distribution of the change between one observation and the next, conditioned on available data; chaining together nonparametric estimates of these distributions yields a model for an entire trajectory. Corresponding distributional forecasts cover more observed events than alternatives that treat the whole season as a unit, and improve upon multiple evaluation metrics when extracting key targets of interest to public health officials. Adaptively weighted ensembles integrate the results of multiple forecasting methods, such as delta density, using weights that can change from situation to situation. We treat selection of optimal weightings across forecasting methods as a separate estimation task, and describe an estimation procedure based on optimizing cross-validation performance. We consider some details of the data generation process, including data revisions and holiday effects, both in the construction of these forecasting methods and when performing retrospective evaluation. The delta density method and an adaptively weighted ensemble of other forecasting methods each improve significantly on the next best ensemble component when applied separately, and achieve even better cross-validated performance when used in conjunction. We submitted real-time forecasts based on these contributions as part of CDC's 2015/2016 FluSight Collaborative Comparison. Among the fourteen submissions that season, this system was ranked by CDC as the most accurate.
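
    The adaptively weighted ensemble treats the weights across forecasting methods as parameters estimated from held-out performance. One standard way to sketch the idea (an illustration, not the authors' exact cross-validation procedure) is EM on the probabilities each component assigned to the events that were actually observed:

```python
import numpy as np

def ensemble_weights(component_scores, iters=200):
    """EM estimate of mixture weights. component_scores[i, j] is the
    probability component j assigned to observed event i (e.g. collected
    across cross-validation folds)."""
    s = np.asarray(component_scores, float)
    w = np.full(s.shape[1], 1.0 / s.shape[1])   # start with equal weights
    for _ in range(iters):
        resp = w * s                            # E-step: responsibilities
        resp /= resp.sum(axis=1, keepdims=True)
        w = resp.mean(axis=0)                   # M-step: re-estimate weights
    return w

# Component 0 consistently assigns more probability to what happened,
# so the estimated weight shifts almost entirely onto it.
w = ensemble_weights([[0.9, 0.1]] * 10)
```

    Making the `component_scores` matrix depend on the situation (season week, region, data revision status) is what makes such a weighting "adaptive" rather than fixed.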

  2. The impact of implementing a demand forecasting system into a low-income country's supply chain.

    PubMed

    Mueller, Leslie E; Haidari, Leila A; Wateska, Angela R; Phillips, Roslyn J; Schmitz, Michelle M; Connor, Diana L; Norman, Bryan A; Brown, Shawn T; Welling, Joel S; Lee, Bruce Y

    2016-07-12

    To evaluate the potential impact and value of applications (e.g. adjusting ordering levels, storage capacity, transportation capacity, distribution frequency) of data from demand forecasting systems implemented in a lower-income country's vaccine supply chain with different levels of population change to urban areas. Using our software, HERMES, we generated a detailed discrete event simulation model of Niger's entire vaccine supply chain, including every refrigerator, freezer, transport, personnel, vaccine, cost, and location. We represented the introduction of a demand forecasting system to adjust vaccine ordering that could be implemented with increasing delivery frequencies and/or additions of cold chain equipment (storage and/or transportation) across the supply chain during varying degrees of population movement. Implementing a demand forecasting system with increased storage and transport frequency increased the number of successfully administered vaccine doses and lowered the logistics cost per dose by up to 34%. Implementing a demand forecasting system without storage/transport increases actually decreased vaccine availability in certain circumstances. The potential maximum gains of a demand forecasting system may only be realized if the system is implemented together with augmented supply chain cold storage and transportation. Implementation alone may have some impact but, in certain circumstances, may hurt delivery. Therefore, implementation of demand forecasting systems with additional storage and transport may be the better approach. Significant decreases in the logistics cost per dose with more administered vaccines support investment in these forecasting systems. Demand forecasting systems have the potential to greatly improve vaccine demand fulfilment, and decrease logistics cost/dose when implemented with storage and transportation increases. 
Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Surface Water and Flood Extent Mapping, Monitoring, and Modeling Products and Services for the SERVIR Regions

    NASA Technical Reports Server (NTRS)

    Anderson, Eric

    2016-01-01

    SERVIR is a joint NASA - US Agency for International Development (USAID) project to improve environmental decision-making using Earth observations and geospatial technologies. A common need identified among the SERVIR regions has been improved information for disaster risk reduction, specifically surface water and flood extent mapping, monitoring, and forecasting. Of the 70 SERVIR products (active, complete, and in development), 4 are related to surface water and flood extent mapping, monitoring, or forecasting. Visit http://www.servircatalog.net for more product details.

  4. Forecast and analysis of the ratio of electric energy to terminal energy consumption for global energy internet

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Zhong, Ming; Cheng, Ling; Jin, Lu; Shen, Si

    2018-02-01

    Against the background of building a global energy internet, forecasting and analysing the ratio of electric energy to terminal energy consumption has both theoretical and practical significance. This paper first analyses the factors influencing the ratio of electric energy to terminal energy consumption and then applies a combination method to forecast and analyse the global proportion of electric energy. A cointegration model for the proportion of electric energy is constructed using influencing factors such as an electricity price index, GDP, economic structure, energy use efficiency, and total population. Finally, the paper derives a projection of the proportion of electric energy from a combination-forecasting model based on multiple linear regression, trend analysis, and the variance-covariance method. The projection describes the development trend of the proportion of electric energy over 2017-2050, and the proportion in 2050 is analysed in detail using scenario analysis.
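
    The variance-covariance combination method chooses weights that minimize the variance of the combined forecast error subject to the weights summing to one, i.e. w = Σ⁻¹1 / (1ᵀΣ⁻¹1), where Σ is the covariance matrix of the individual methods' historical errors. A minimal sketch with illustrative error series (not the paper's data):

```python
import numpy as np

def combination_weights(errors):
    """Minimum-variance combination weights from historical forecast
    errors; errors[:, j] holds the error series of individual method j."""
    cov = np.cov(np.asarray(errors, float), rowvar=False)
    ones = np.ones(cov.shape[0])
    inv = np.linalg.solve(cov, ones)       # Sigma^-1 * 1
    return inv / (ones @ inv)

# Method 0 has smaller error variance than method 1 (and zero covariance
# with it by construction), so it receives the larger weight.
errs = np.array([[1.0, 2.0], [-1.0, -2.0], [1.0, -2.0], [-1.0, 2.0]])
w = combination_weights(errs)              # -> [0.8, 0.2]
```

    The combined projection is then simply the weighted sum of the individual methods' forecasts (here, regression and trend analysis) for each year.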

  5. Nonlinear forecasting analysis of inflation-deflation patterns of an active caldera (Campi Flegrei, Italy)

    USGS Publications Warehouse

    Cortini, M.; Barton, C.C.

    1993-01-01

    The ground level in Pozzuoli, Italy, at the center of the Campi Flegrei caldera, has been monitored by tide gauges. Previous work suggests that the dynamics of the Campi Flegrei system, as reconstructed from the tide gauge record, is chaotic and low dimensional. According to this suggestion, in spite of the complexity of the system, at a time scale of days the ground motion is driven by a deterministic mechanism with few degrees of freedom; however, the interactions of the system may never be describable in full detail. New analysis of the tide gauge record using Nonlinear Forecasting confirms low-dimensional chaos in the ground elevation record at Campi Flegrei and suggests that Nonlinear Forecasting could be a useful tool in volcanic surveillance. -from Authors
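
    Nonlinear Forecasting in this sense follows the Sugihara-May approach: embed the series in delay coordinates, find the nearest historical analogues of the current state, and predict from their successors; forecast skill that decays with prediction horizon is the signature of low-dimensional chaos rather than noise. A minimal one-step sketch (the embedding dimension, neighbour count, and toy series are illustrative assumptions):

```python
import numpy as np

def nn_forecast(series, E=3, k=4):
    """One-step nearest-neighbour forecast on an E-dimensional
    delay embedding of the series."""
    x = np.asarray(series, float)
    lib = np.array([x[i:i + E] for i in range(len(x) - E)])  # delay vectors
    succ = x[E:]                      # one-step successor of each vector
    target = x[-E:]                   # current state (not in the library)
    dist = np.linalg.norm(lib - target, axis=1)
    idx = np.argsort(dist)[:k]        # k nearest historical analogues
    wts = np.exp(-dist[idx] / max(dist[idx[0]], 1e-12))
    return float(np.sum(wts * succ[idx]) / np.sum(wts))

# A noiseless periodic record is forecast exactly from its analogues.
cycle = [0.0, 1.0, 0.0, -1.0] * 5
pred = nn_forecast(cycle)             # next value in the cycle: 0.0
```

    Applied to the tide gauge record, the relevant diagnostic is how the correlation between such predictions and the observations falls off as the forecast horizon is extended.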

  6. Imaging Girls: Visual Methodologies and Messages for Girls' Education

    ERIC Educational Resources Information Center

    Magno, Cathryn; Kirk, Jackie

    2008-01-01

    This article describes the use of visual methodologies to examine images of girls used by development agencies to portray and promote their work in girls' education, and provides a detailed discussion of three report cover images. It details the processes of methodology and tool development for the visual analysis and presents initial 'readings'…

  7. Utility of flood warning systems for emergency management

    NASA Astrophysics Data System (ADS)

    Molinari, Daniela; Ballio, Francesco; Menoni, Scira

    2010-05-01

    The presentation is focused on a simple and crucial question for warning systems: are flood and hydrological modelling and forecasting helpful in managing flood events? It is well known that a warning process can be invalidated by inadequate forecasts, so the accuracy and robustness of the forecast model is a key issue for any flood warning procedure. One question, however, still arises from this perspective: when can forecasts be considered adequate? According to Murphy (1993, Wea. Forecasting 8, 281-293), forecasts hold no intrinsic value; they acquire value through their ability to influence the decisions made by their users. Moreover, the value of a forecast depends on the particular problem at stake, showing, in this way, a multifaceted nature. As a result, forecast verification should not be seen as a universal process; instead, it should be tailored to the particular context in which the forecasts are implemented. This presentation focuses on warning problems in mountain regions, where the short lead times distinctive of flood events make the provision of adequate forecasts particularly significant. In this context, the quality of a forecast is linked to its capability to reduce the impact of a flood by improving the correctness of the decision about issuing (or not issuing) a warning, as well as of the implementation of a proper set of actions aimed at lowering potential flood damage. The present study evaluates the performance of a real flood forecasting system from this perspective. In detail, a back analysis of past flood events and available verification tools have been implemented. The final objective was to evaluate the system's ability to support appropriate decisions with respect not only to the flood characteristics but also to the peculiarities of the area at risk and to the uncertainty of the forecasts. This meant also considering flood damages and forecast uncertainty among the decision variables. 
    Last but not least, the presentation explains how the procedure implemented in the case study could support the definition of a proper warning rule.
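
    Murphy's point, that forecasts acquire value only through the decisions they influence, is often made concrete with the static cost-loss model: a user pays cost C to protect against a potential loss L, and the value of a warning system then depends jointly on its hit rate, its false-alarm rate, the event frequency, and the user's C/L ratio. A minimal sketch in the style of the standard relative-value score (the numbers are illustrative, not from the case study):

```python
def relative_value(h, f, s, r):
    """Relative economic value of a warning system for a user with
    cost/loss ratio r, given hit rate h, false-alarm rate f, and event
    base rate s. 1 = perfect forecasts; <= 0 = no better than climatology."""
    e_forecast = f * (1 - s) * r + h * s * r + (1 - h) * s  # expense / loss
    e_climate = min(r, s)     # best of "always protect" / "never protect"
    e_perfect = s * r
    return (e_climate - e_forecast) / (e_climate - e_perfect)

# A perfect system realizes all of the attainable value for this user...
v_perfect = relative_value(h=1.0, f=0.0, s=0.1, r=0.05)
# ...while a no-skill system (h == f) is worth less than climatology.
v_noskill = relative_value(h=0.5, f=0.5, s=0.1, r=0.05)
```

    Because the score depends on r, the same forecast system can be valuable for one user of the warning chain and worthless for another, which is exactly why the authors argue that verification must be tailored to the decision context.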

  8. Promoting Interests in Atmospheric Science at a Liberal Arts Institution

    NASA Astrophysics Data System (ADS)

    Roussev, S.; Sherengos, P. M.; Limpasuvan, V.; Xue, M.

    2007-12-01

    Coastal Carolina University (CCU) students in Computer Science participated in a project to set up an operational weather forecast for the local community. The project involved the construction of two computing clusters and the automation of daily forecasting. Funded by NSF-MRI, two high-performance clusters were successfully established to run the University of Oklahoma's Advanced Regional Prediction System (ARPS). Daily weather predictions are made over South Carolina and North Carolina at 3-km horizontal resolution (roughly 1.9 miles) using initial and boundary condition data provided by UNIDATA. At this high resolution, the model is cloud-resolving, thus providing a detailed picture of heavy thunderstorms and precipitation. Forecast results are displayed on CCU's website (https://marc.coastal.edu/HPC) to complement observations at the National Weather Service in Wilmington, N.C. Present efforts include providing forecasts at 1-km resolution (or finer), comparisons with other models such as the Weather Research and Forecasting (WRF) model, and the examination of local phenomena (such as waterspouts and tornadoes). Through these activities the students learn about shell scripting, cluster operating systems, and web design. More importantly, students are introduced to Atmospheric Science, the processes involved in making weather forecasts, and the interpretation of their forecasts. Simulations generated by the forecasts will be integrated into the content of CCU courses such as Fluid Dynamics, Atmospheric Sciences, Atmospheric Physics, and Remote Sensing. Operated jointly by the departments of Applied Physics and Computer Science, the clusters are expected to be used by CCU faculty and students for future research and inquiry-based projects in Computer Science, Applied Physics, and Marine Science.

  9. Forecasting human exposure to atmospheric pollutants in Portugal - A modelling approach

    NASA Astrophysics Data System (ADS)

    Borrego, C.; Sá, E.; Monteiro, A.; Ferreira, J.; Miranda, A. I.

    2009-12-01

    Air pollution has become one of the main environmental concerns because of its known impact on human health. Aiming to inform the population about the air they are breathing, several air quality modelling systems have been developed and tested, allowing the assessment and forecast of ambient air pollution levels in many countries. However, every day an individual is exposed to different concentrations of atmospheric pollutants as he/she moves between different outdoor and indoor places (the so-called microenvironments). Therefore, a more effective way to protect the population from the health risks caused by air pollution should be based on exposure rather than on ambient concentration estimates. The objective of the present study is to develop a methodology to forecast the exposure of the Portuguese population based on the air quality forecasting system available and validated for Portugal since 2005. In addition, a long-term evaluation of human exposure estimates is obtained from one year of this forecasting system's application. A hypothetical 50% emission reduction scenario has also been designed and studied as a contribution to assessing the impact of emission reduction strategies on human exposure. To estimate population exposure, the forecast results of the air quality modelling system MM5-CHIMERE have been combined with the population spatial distribution over Portugal and the population's time-activity patterns, i.e. the fraction of the day spent in specific indoor and outdoor places. The population characterization concerning age, work, type of occupation and related time spent was obtained from national census data and available surveys performed by the National Institute of Statistics. 
A daily exposure estimation module has been developed, gathering all these data and using empirical indoor/outdoor relations from the literature to calculate the indoor concentrations in each of the microenvironments considered, namely home, office/school, and other indoor spaces (leisure venues such as shopping areas, gyms, theatres/cinemas and restaurants). The results show how this modelling system can be useful to anticipate air pollution episodes and to estimate their effects on human health on a long-term basis. The two metropolitan areas of Porto and Lisbon are identified as the most critical in Portugal in terms of air pollution effects on human health, from both long-term and short-term perspectives. The coexistence of high concentration values and high population density is the key factor for these stressed areas. Regarding the 50% emission reduction scenario, the model results differ significantly between the two pollutants: there is a small overall reduction in individual exposure values for PM10 (<10 μg m⁻³ h), but for O3, in contrast, there is an extended area where exposure values increase with emission reduction. This detailed knowledge is a prerequisite for developing effective policies to reduce the foreseen adverse impact of air pollution on human health and to act in time.
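    The core of such an exposure module can be sketched as a time-weighted sum over microenvironments, where the ambient concentration is scaled by an empirical indoor/outdoor (I/O) ratio for each place. A minimal illustration follows; the function name, I/O ratios and time fractions are invented for this example, not the study's calibrated values.

```python
# Hypothetical sketch of a daily exposure estimate: ambient concentration is
# scaled by empirical indoor/outdoor (I/O) ratios per microenvironment, then
# weighted by the fraction of the day spent there. Numbers are illustrative.

def daily_exposure(ambient, microenvironments):
    """Time-weighted daily exposure (same units as the ambient concentration).

    ambient: outdoor concentration (e.g. ug/m3)
    microenvironments: list of (time_fraction, io_ratio) pairs;
    the time fractions must sum to 1.
    """
    assert abs(sum(f for f, _ in microenvironments) - 1.0) < 1e-9
    return sum(f * r * ambient for f, r in microenvironments)

# Example: 60% of the day at home (I/O = 0.7), 30% at office/school (0.6),
# 10% outdoors (1.0), with ambient PM10 of 40 ug/m3.
exposure = daily_exposure(40.0, [(0.6, 0.7), (0.3, 0.6), (0.1, 1.0)])
```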

  10. Sampling strategies based on singular vectors for assimilated models in ocean forecasting systems

    NASA Astrophysics Data System (ADS)

    Fattorini, Maria; Brandini, Carlo; Ortolani, Alberto

    2016-04-01

    Meteorological and oceanographic models need observations, not only as a ground truth against which to verify model quality, but also to keep the model forecast error acceptable: through data assimilation techniques, which merge measured and modelled data, the natural divergence of numerical solutions from reality can be reduced or controlled, and a more reliable solution - called the analysis - is computed. Although this concept is valid in general, its application, especially in oceanography, raises many problems for three main reasons: the difficulty ocean models have in reaching an acceptable state of equilibrium, the high cost of measurements, and the difficulty of carrying them out. The performance of data assimilation procedures depends on the particular observation network in use, well beyond the background quality and the assimilation method employed. In this study we present some results concerning the great impact of the dataset configuration, in particular the measurement positions, on the overall forecasting reliability of an ocean model. The aim is to identify operational criteria to support the design of marine observation networks at regional scale. In order to identify the observation network that minimizes the forecast error, a methodology based on the Singular Value Decomposition of the tangent linear model is proposed. Such a method can give strong indications on the local error dynamics. In addition, to avoid redundancy in the information contained in the data, a minimal distance among data positions has been chosen on the basis of a spatial correlation analysis of the hydrodynamic fields under investigation. This methodology has been applied to the choice of data positions starting from simplified models, such as an idealized double-gyre model and a quasi-geostrophic one. 
Model configurations and data assimilation are based on available ROMS routines, in which a variational assimilation algorithm (4D-Var) is included as part of the code. These first applications have provided encouraging results in terms of increased predictability time and reduced forecast error, also improving the quality of the analysis used to recover the real circulation patterns from a first guess quite far from the real state.
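    The intuition behind singular-vector-based placement can be sketched in a few lines: the leading singular vector of the error-growth operator points at the state components where perturbations amplify fastest, so an observation is placed at the component with the largest magnitude. The toy matrix and power-iteration solver below are illustrative only; the study itself works with the tangent linear model of the ocean circulation.

```python
# Power iteration on a small, hand-made symmetric "error growth" matrix:
# the dominant vector's largest component marks the best observation site.
# The matrix is invented for illustration, not taken from the study.

def leading_vector(M, iters=200):
    n = len(M)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Toy symmetric growth matrix: perturbations grow fastest in component 1.
M = [[1.0, 0.2, 0.0],
     [0.2, 3.0, 0.1],
     [0.0, 0.1, 0.5]]
v = leading_vector(M)
best_site = max(range(len(v)), key=lambda i: abs(v[i]))
```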

  11. Development of a model-based flood emergency management system in Yujiang River Basin, South China

    NASA Astrophysics Data System (ADS)

    Zeng, Yong; Cai, Yanpeng; Jia, Peng; Mao, Jiansu

    2014-06-01

    Flooding is the most frequent disaster in China. It affects people's lives and property, causing considerable economic loss. Flood forecasting and reservoir operation are important in flood emergency management. Although great progress has been achieved in flood forecasting and reservoir operation in China through the use of computing, networking, and geographic information system technologies, the prediction accuracy of models is often unsatisfactory due to the unavailability of real-time monitoring data. Moreover, real-time flood control scenario analysis is not effective in many regions and can seldom provide an online decision support function. This paper introduces a decision support system for real-time flood forecasting in the Yujiang River Basin, South China (DSS-YRB), based on hydrological and hydraulic mathematical models. The conceptual framework and detailed components of the proposed DSS-YRB are illustrated; the system employs real-time rainfall data conversion, model-driven hydrologic forecasting, model calibration, data assimilation methods, and reservoir operational scenario analysis. Its multi-tiered architecture offers great flexibility, portability, reusability, and reliability. The case study results show that the development and application of a decision support system for real-time flood forecasting and operation are beneficial for flood control.

  12. An Objective Verification of the North American Mesoscale Model for Kennedy Space Center and Cape Canaveral Air Force Station

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III

    2010-01-01

    The 45th Weather Squadron (45 WS) Launch Weather Officers (LWOs) use the 12-km resolution North American Mesoscale (NAM) model (MesoNAM) text and graphical product forecasts extensively to support launch weather operations. However, the actual performance of the model at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) has not been measured objectively. In order to have tangible evidence of model performance, the 45 WS tasked the Applied Meteorology Unit (AMU; Bauman et al., 2004) to conduct a detailed statistical analysis of model output compared to observed values. The model products are provided to the 45 WS by ACTA, Inc. and include hourly forecasts from 0 to 84 hours based on model initialization times of 00, 06, 12 and 18 UTC. The objective analysis compared the MesoNAM forecast winds, temperature (T) and dew point (Td), as well as the changes in these parameters over time, to the observed values from the sensors in the KSC/CCAFS wind tower network shown in Table 1. These objective statistics give the forecasters knowledge of the model's strengths and weaknesses, which will result in improved forecasts for operations.
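    The two most common statistics in such a verification, bias (mean error) and root-mean-square error, are simple to compute. The sketch below uses invented temperature values; the actual AMU analysis stratified statistics by initialization time, lead hour, and sensor.

```python
# Sketch of forecast-verification statistics: bias and RMSE of forecast
# values against observations. Data are illustrative, not MesoNAM output.

def bias_and_rmse(forecast, observed):
    errors = [f - o for f, o in zip(forecast, observed)]
    n = len(errors)
    bias = sum(errors) / n                       # mean error
    rmse = (sum(e * e for e in errors) / n) ** 0.5
    return bias, rmse

fcst = [24.1, 25.0, 26.3, 27.8]   # forecast temperature, deg C
obs = [23.5, 25.2, 25.9, 27.0]    # tower observations, deg C
b, r = bias_and_rmse(fcst, obs)
```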

  13. Importance and use of correlational research.

    PubMed

    Curtis, Elizabeth A; Comiskey, Catherine; Dempsey, Orla

    2016-07-01

    The importance of correlational research has been reported in the literature, yet few research texts discuss the design in any detail. The aims are to discuss important issues and considerations in correlational research, and to suggest ways to avert potential problems during the preparation and application of the design. This article targets the gap identified in the literature regarding correlational research design. Specifically, it discusses the importance and purpose of correlational research, and its application, analysis and interpretation, with contextualisation to nursing and health research. Findings from correlational research can be used to determine prevalence and relationships among variables, and to forecast events from current data and knowledge. In spite of its many uses, prudence is required when using the methodology and analysing data. To assist researchers in reducing mistakes, important issues are singled out for discussion and several options are put forward for analysing data. Correlational research is widely used, and this paper should be particularly useful for novice nurse researchers. Furthermore, findings generated from correlational research can be used, for example, to inform decision-making and to improve or initiate health-related activities or change.
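    The statistic underlying most correlational designs is the Pearson correlation coefficient. As a concrete illustration, it can be computed from scratch (any statistics package provides the same result; the data here are made up).

```python
# Pearson's r from first principles: covariance of the two variables
# divided by the product of their standard deviations.

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson_r([1, 2, 3, 4, 5], [2, 4, 6, 8, 10])  # perfectly linear pair
```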

  14. Improving Forecasts Through Realistic Uncertainty Estimates: A Novel Data Driven Method for Model Uncertainty Quantification in Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S. D.; Moradkhani, H.; Marshall, L. A.; Sharma, A.; Geenens, G.

    2016-12-01

    Effective combination of model simulations and observations through Data Assimilation (DA) depends heavily on uncertainty characterisation. Many traditional methods for quantifying model uncertainty in DA require some level of subjectivity (by way of tuning parameters or by assuming Gaussian statistics). Furthermore, the focus is typically on only estimating the first and second moments. We propose a data-driven methodology to estimate the full distributional form of model uncertainty, i.e. the transition density p(xt|xt-1). All sources of uncertainty associated with the model simulations are considered collectively, without needing to devise stochastic perturbations for individual components (such as model input, parameter and structural uncertainty). A training period is used to derive the distribution of errors in observed variables conditioned on hidden states. Errors in hidden states are estimated from the conditional distribution of observed variables using non-linear optimization. The theory behind the framework and case study applications are discussed in detail. Results demonstrate improved predictions and more realistic uncertainty bounds compared to a standard perturbation approach.
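    A loose, simplified illustration of the data-driven idea: collect model-minus-observation errors over a training period, then attach their empirical quantiles to a new simulation as non-Gaussian uncertainty bounds. This sketch ignores the paper's conditioning on hidden states and its optimization step; all numbers are invented.

```python
# Empirical uncertainty bounds from a training-period error sample,
# with no Gaussian assumption. Purely illustrative.

def empirical_bounds(errors, lower_q=0.05, upper_q=0.95):
    s = sorted(errors)
    n = len(s)
    lo = s[int(lower_q * (n - 1))]   # simple empirical quantile lookup
    hi = s[int(upper_q * (n - 1))]
    return lo, hi

train_errors = [-1.2, -0.4, 0.1, 0.3, -0.8, 0.9, 1.5, -0.2, 0.6, 0.0]
lo, hi = empirical_bounds(train_errors)
prediction = 12.0
interval = (prediction + lo, prediction + hi)   # data-driven bounds
```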

  15. Perspectives on model forecasts of the 2014-2015 Ebola epidemic in West Africa: lessons and the way forward.

    PubMed

    Chowell, Gerardo; Viboud, Cécile; Simonsen, Lone; Merler, Stefano; Vespignani, Alessandro

    2017-03-01

    The unprecedented impact of, and modeling efforts associated with, the 2014-2015 Ebola epidemic in West Africa provide a unique opportunity to document the performance and caveats of the forecasting approaches used in near-real time to generate evidence and guide policy. A number of international academic groups developed and parameterized mathematical models of disease spread to forecast the trajectory of the outbreak. These modeling efforts often relied on limited epidemiological data to derive the key transmission and severity parameters needed to calibrate mechanistic models. Here, we provide a perspective on some of the challenges and lessons drawn from these efforts, focusing on (1) data availability and the accuracy of early forecasts; (2) the ability of different models to capture the profile of early growth dynamics in local outbreaks and the importance of reactive behavior changes and case clustering; (3) challenges in forecasting the long-term epidemic impact very early in the outbreak; and (4) ways to move forward. We conclude that rapid availability of aggregated population-level data and detailed information on a subset of transmission chains is crucial to characterize transmission patterns, while ensemble-forecasting approaches could limit the uncertainty of any individual model. We believe that coordinated forecasting efforts, combined with rapid dissemination of disease predictions and underlying epidemiological data in shared online platforms, will be critical in optimizing the response to current and future infectious disease emergencies.

  16. Improving Timeliness of Winter Wheat Production Forecast in United States of America, Ukraine and China Using MODIS Data and NCAR Growing Degree Day

    NASA Astrophysics Data System (ADS)

    Vermote, E.; Franch, B.; Becker-Reshef, I.; Claverie, M.; Huang, J.; Zhang, J.; Sobrino, J. A.

    2014-12-01

    Wheat is the most important cereal crop traded on international markets, and winter wheat constitutes approximately 80% of global wheat production. Thus, accurate and timely forecasts of its production are critical for informing agricultural policies and investments, as well as for increasing market efficiency and stability. Becker-Reshef et al. (2010) used an empirical generalized model for forecasting winter wheat production. Their approach combined BRDF-corrected daily surface reflectance from the Moderate Resolution Imaging Spectroradiometer (MODIS) Climate Modeling Grid (CMG) with detailed official crop statistics and crop type masks. It is based on the relationship between the Normalized Difference Vegetation Index (NDVI) at the peak of the growing season, the percent wheat within the CMG pixel, and the final yields. This method predicts the yield approximately one month to six weeks prior to harvest. In this study, we include Growing Degree Day (GDD) information extracted from NCEP/NCAR reanalysis data in order to improve the winter wheat production forecast by increasing the timeliness of the forecasts while conserving the accuracy of the original model. We apply this modified model to three major wheat-producing countries - the United States of America, Ukraine and China - from 2001 to 2012. We show that a reliable forecast can be made one to one and a half months prior to the peak NDVI (i.e., two to two and a half months prior to harvest) while conserving an accuracy of 10% in the production forecast.
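    The empirical core of such a model is a regression relating peak-season NDVI to final yield, which is then applied to a new season's NDVI. A least-squares sketch follows; all NDVI and yield numbers are made up for illustration and are not the published regression.

```python
# Ordinary least-squares line relating historical peak NDVI to yield,
# then used to forecast yield from a new NDVI value. Illustrative data.

def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

ndvi_peak = [0.55, 0.60, 0.65, 0.70]   # historical peak-season NDVI
yield_t_ha = [2.8, 3.1, 3.4, 3.7]      # historical yields, t/ha
a, b = fit_line(ndvi_peak, yield_t_ha)
forecast = a * 0.62 + b                # yield forecast for NDVI = 0.62
```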

  17. Energy management of a university campus utilizing short-term load forecasting with an artificial neural network

    NASA Astrophysics Data System (ADS)

    Palchak, David

    Electrical load forecasting is a tool that has been utilized by distribution designers and operators as a means for resource planning and generation dispatch. The techniques employed in these predictions are proving useful in the growing market of consumer, or end-user, participation in electrical energy consumption. These predictions are based on exogenous variables, such as weather, and time variables, such as day of week and time of day as well as prior energy consumption patterns. The participation of the end-user is a cornerstone of the Smart Grid initiative presented in the Energy Independence and Security Act of 2007, and is being made possible by the emergence of enabling technologies such as advanced metering infrastructure. The optimal application of the data provided by an advanced metering infrastructure is the primary motivation for the work done in this thesis. The methodology for using this data in an energy management scheme that utilizes a short-term load forecast is presented. The objective of this research is to quantify opportunities for a range of energy management and operation cost savings of a university campus through the use of a forecasted daily electrical load profile. The proposed algorithm for short-term load forecasting is optimized for Colorado State University's main campus, and utilizes an artificial neural network that accepts weather and time variables as inputs. The performance of the predicted daily electrical load is evaluated using a number of error measurements that seek to quantify the best application of the forecast. The energy management presented utilizes historical electrical load data from the local service provider to optimize the time of day that electrical loads are being managed. Finally, the utilization of forecasts in the presented energy management scenario is evaluated based on cost and energy savings.
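    The forward pass of the kind of network described above can be sketched in a few lines: weather and time inputs, one hidden layer with tanh activations, and a linear output for the load. The weights below are fixed, made-up numbers; in the thesis they would be learned from campus load, weather, and time-of-day data.

```python
import math

# Minimal one-hidden-layer feed-forward sketch for short-term load
# forecasting. Weights and the MW scale are illustrative only.

def predict_load(inputs, w_hidden, b_hidden, w_out, b_out):
    hidden = [math.tanh(sum(w * x for w, x in zip(ws, inputs)) + b)
              for ws, b in zip(w_hidden, b_hidden)]
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out

# Inputs: [normalized temperature, hour-of-day / 24]
x = [0.6, 0.5]
w_h = [[0.8, -0.4], [0.3, 0.9]]   # hidden-layer weights (2 neurons)
b_h = [0.1, -0.2]                 # hidden-layer biases
w_o = [5.0, 3.0]                  # output weights, MW scale
load = predict_load(x, w_h, b_h, w_o, b_out=10.0)
```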

  18. Real-time Ensemble Flow Forecasts for a 2017 Mock Operation Test Trial of Forecast Informed Reservoir Operations for Lake Mendocino in Mendocino County, California

    NASA Astrophysics Data System (ADS)

    Delaney, C.; Mendoza, J.; Jasperse, J.; Hartman, R. K.; Whitin, B.; Kalansky, J.

    2017-12-01

    Forecast informed reservoir operations (FIRO) is a methodology that incorporates short to mid-range precipitation and flow forecasts to inform the flood operations of reservoirs. The Ensemble Forecast Operations (EFO) alternative is a probabilistic approach of FIRO that incorporates 15-day ensemble streamflow predictions (ESPs) made by NOAA's California-Nevada River Forecast Center (CNRFC). With the EFO approach, release decisions are made to manage forecasted risk of reaching critical operational thresholds. A water management model was developed for Lake Mendocino, a 111,000 acre-foot reservoir located near Ukiah, California, to conduct a mock operation test trial of the EFO alternative for 2017. Lake Mendocino is a dual use reservoir, which is owned and operated for flood control by the United States Army Corps of Engineers and is operated for water supply by the Sonoma County Water Agency. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has suffered from water supply reliability issues since 2007. The operational trial utilized real-time ESPs prepared by the CNRFC and observed flow information to simulate hydrologic conditions in Lake Mendocino and a 50-mile downstream reach of the Russian River to the City of Healdsburg. Results of the EFO trial demonstrate a 6% increase in reservoir storage at the end of trial period (May 10) relative to observed conditions. Additionally, model results show no increase in flows above flood stage for points downstream of Lake Mendocino. Results of this investigation and other studies demonstrate that the EFO alternative may be a viable flood control operations approach for Lake Mendocino and warrants further investigation through additional modeling and analysis.
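    The risk logic of an EFO scheme reduces to a simple calculation: the forecasted risk is the fraction of ensemble members that push the reservoir past a critical threshold, and a release is triggered when that risk exceeds a tolerance. The member values, threshold, and tolerance below are invented for illustration, not CNRFC or Lake Mendocino numbers.

```python
# Fraction of ensemble members exceeding a critical storage threshold,
# compared against a risk tolerance to trigger a release. Illustrative data.

def forecast_risk(ensemble_peaks, threshold):
    exceed = sum(1 for p in ensemble_peaks if p > threshold)
    return exceed / len(ensemble_peaks)

# 10 simulated member peak-storage values (thousand acre-feet)
peaks = [104, 108, 112, 99, 116, 110, 103, 118, 107, 101]
risk = forecast_risk(peaks, threshold=111.0)   # hypothetical flood pool
release_now = risk > 0.10                      # e.g. a 10% risk tolerance
```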

  19. Smart Irrigation From Soil Moisture Forecast Using Satellite And Hydro -Meteorological Modelling

    NASA Astrophysics Data System (ADS)

    Corbari, Chiara; Mancini, Marco; Ravazzani, Giovanni; Ceppi, Alessandro; Salerno, Raffaele; Sobrino, Josè

    2017-04-01

    Increased water demand and climate change impacts have recently enhanced the need to improve water resources management, even in areas which traditionally have an abundant supply of water. The highest consumption of water is devoted to irrigation for agricultural production, so it is here that efforts have to be focused to study possible interventions. The SIM project, funded by the EU in the framework of the WaterWorks2014 - Water Joint Programming Initiative, aims at developing an operational tool for real-time forecasting of crop irrigation water requirements to support parsimonious water management and to optimize irrigation scheduling, providing real-time and forecasted soil moisture behavior at high spatial and temporal resolutions with forecast horizons from a few days up to thirty days. This study discusses advances in coupling a satellite-driven soil water balance model with meteorological forecasts as support for precision irrigation, comparing case studies in Italy, the Netherlands, China and Spain, characterized by different climatic conditions, water availability, crop types, irrigation techniques and water distribution rules. Here, applications are presented for two operating vegetable farms in the South of Italy, where semi-arid climatic conditions hold, and for two maize fields in Northern Italy, a more water-rich environment with flood irrigation. The system combines state-of-the-art mathematical models and new technologies for environmental monitoring, merging ground-observed data with Earth observations. The methodological approach is discussed, comparing forecast system outputs over a reanalysis period with observed soil moisture and crop water needs, demonstrating the reliability of the forecasting system and its benefits. The real-time visualization of the implemented system is also presented through web dashboards.
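    A soil water balance of this kind can be sketched as a daily "bucket" update: soil moisture gains rainfall, loses crop evapotranspiration, and irrigation is triggered when the (forecast) moisture drops below a threshold. All parameters below are invented for illustration, not the SIM project's calibrated values.

```python
# Daily bucket soil-water-balance sketch with an irrigation trigger.
# Units could be mm of stored water; all numbers are illustrative.

def simulate(s0, rain, et, capacity, trigger, dose):
    s, irrigations = s0, []
    for day, (p, e) in enumerate(zip(rain, et)):
        s = min(capacity, s + p - e)   # water balance, capped at capacity
        if s < trigger:                # forecast says: irrigate today
            s += dose
            irrigations.append(day)
        s = max(0.0, s)                # moisture cannot go negative
    return s, irrigations

final, days = simulate(s0=60.0, rain=[0, 5, 0, 0, 0],
                       et=[4, 4, 5, 5, 5],
                       capacity=100.0, trigger=50.0, dose=20.0)
```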

  20. Transportation Sector Model of the National Energy Modeling System. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-01-01

    This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model. The NEMS Transportation Model comprises a series of semi-independent models which address different aspects of the transportation sector. The primary purpose of this model is to provide mid-term forecasts of transportation energy demand by fuel type including, but not limited to, motor gasoline, distillate, jet fuel, and alternative fuels (such as CNG) not commonly associated with transportation. The current NEMS forecast horizon extends to the year 2010 and uses 1990 as the base year. Forecasts are generated through the separate consideration of energy consumption within the various modes of transport, including: private and fleet light-duty vehicles; aircraft; marine, rail, and truck freight; and various modes with minor overall impacts, such as mass transit and recreational boating. This approach is useful in assessing the impacts of policy initiatives, legislative mandates which affect individual modes of travel, and technological developments. The model also provides forecasts of selected intermediate values which are generated in order to determine energy consumption. These elements include estimates of passenger travel demand by automobile, air, or mass transit; estimates of the efficiency with which that demand is met; projections of vehicle stocks and the penetration of new technologies; and estimates of the demand for freight transport which are linked to forecasts of industrial output. 
Following the estimation of energy demand, TRAN produces forecasts of vehicular emissions of the following pollutants by source: oxides of sulfur, oxides of nitrogen, total carbon, carbon dioxide, carbon monoxide, and volatile organic compounds.

  1. Time Series Analysis for Forecasting Hospital Census: Application to the Neonatal Intensive Care Unit

    PubMed Central

    Capan, Muge; Hoover, Stephen; Jackson, Eric V.; Paul, David; Locke, Robert

    2016-01-01

    Summary Background Accurate prediction of future patient census in hospital units is essential for patient safety, health outcomes, and resource planning. Forecasting census in the Neonatal Intensive Care Unit (NICU) is particularly challenging due to limited ability to control the census and clinical trajectories. The fixed average census approach, which uses the average census from the previous year, is a forecasting alternative used in clinical practice, but has limitations due to census variations. Objective Our objectives are to: (i) analyze the daily NICU census at a single health care facility and develop census forecasting models, (ii) explore models with and without patient data characteristics obtained at the time of admission, and (iii) evaluate the accuracy of the models compared with the fixed average census approach. Methods We used five years of retrospective daily NICU census data for model development (January 2008 – December 2012, N=1827 observations) and one year of data for validation (January – December 2013, N=365 observations). Best-fitting ARIMA and linear regression models were applied to various 7-day prediction periods and compared using error statistics. Results The census showed a slightly increasing linear trend. Best-fitting models included a non-seasonal model, ARIMA(1,0,0); seasonal ARIMA models, ARIMA(1,0,0)x(1,1,2)7 and ARIMA(2,1,4)x(1,1,2)14; as well as a seasonal linear regression model. The proposed forecasting models resulted in an average 36.49% improvement in forecasting accuracy compared with the fixed average census approach. Conclusions Time series models provide higher prediction accuracy under different census conditions compared with the fixed average census approach. The presented methodology is easily applicable in clinical practice, can be generalized to other care settings, supports short- and long-term census forecasting, and informs staff resource planning. PMID:27437040
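    The simplest member of the model family compared here is an AR(1): tomorrow's census deviation from the mean is a coefficient phi times today's deviation. The sketch below estimates phi by least squares on a made-up census series and produces a 7-day forecast; the paper's seasonal ARIMA models extend this same idea.

```python
# AR(1) fit and multi-step forecast from scratch. The census series is
# invented for illustration, not the study's NICU data.

def fit_ar1(series):
    m = sum(series) / len(series)
    d = [x - m for x in series]                        # deviations from mean
    phi = sum(a * b for a, b in zip(d[1:], d[:-1])) / \
          sum(a * a for a in d[:-1])                   # least-squares slope
    return m, phi

def forecast_ar1(m, phi, last, steps):
    out, x = [], last - m
    for _ in range(steps):
        x = phi * x                                    # decay toward the mean
        out.append(m + x)
    return out

census = [38, 40, 41, 39, 42, 44, 43, 41, 40, 42]      # daily census (toy)
mean, phi = fit_ar1(census)
week_ahead = forecast_ar1(mean, phi, last=census[-1], steps=7)
```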

  2. Hybrid vs Adaptive Ensemble Kalman Filtering for Storm Surge Forecasting

    NASA Astrophysics Data System (ADS)

    Altaf, M. U.; Raboudi, N.; Gharamti, M. E.; Dawson, C.; McCabe, M. F.; Hoteit, I.

    2014-12-01

    Recent storm surge events caused by hurricanes in the Gulf of Mexico have motivated efforts to accurately forecast water levels. Toward this goal, a parallel architecture has been implemented based on a high-resolution storm surge model, ADCIRC. However, the accuracy of the model depends notably on the quality and recentness of the input data (mainly winds and bathymetry), the model parameters (e.g. wind and bottom drag coefficients), and the resolution of the model grid. Given all these uncertainties in the system, the challenge is to build an efficient prediction system capable of providing accurate forecasts sufficiently ahead of time for the authorities to evacuate the areas at risk. We have developed an ensemble-based data assimilation system to frequently assimilate available data into the ADCIRC model in order to improve its accuracy. In this contribution we study and analyze the performance of different ensemble Kalman filter methodologies for efficient short-range storm surge forecasting, the aim being to produce the most accurate forecasts at the lowest possible computing cost. Using Hurricane Ike meteorological data to force the ADCIRC model over a domain including the Gulf of Mexico coastline, we implement and compare the forecasts of the standard EnKF, a hybrid EnKF and an adaptive EnKF. The last two schemes have been introduced as efficient tools for enhancing the behavior of the EnKF when implemented with small ensembles by exploiting information from a static background covariance matrix. Covariance inflation and localization are implemented in all these filters. Our results suggest that both the hybrid and the adaptive approaches provide significantly better forecasts than the standard EnKF, even when implemented with much smaller ensembles.
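    The analysis step shared by all these filters can be illustrated on a scalar state: the Kalman gain, built from the ensemble spread, pulls each member toward a (perturbed) observation. The sketch below is a stochastic EnKF update for a single water-level value with invented numbers, not ADCIRC output, and omits inflation and localization.

```python
import random

# Scalar stochastic EnKF analysis step: gain K = P / (P + R), applied to
# each member with a perturbed observation. Illustrative numbers only.

def enkf_update(ensemble, obs, obs_var, rng):
    m = sum(ensemble) / len(ensemble)
    p = sum((x - m) ** 2 for x in ensemble) / (len(ensemble) - 1)  # forecast var
    k = p / (p + obs_var)                                          # Kalman gain
    return [x + k * (obs + rng.gauss(0, obs_var ** 0.5) - x)
            for x in ensemble]

rng = random.Random(42)
prior = [1.8, 2.2, 2.6, 1.4, 2.0]   # forecast water levels, m (toy ensemble)
posterior = enkf_update(prior, obs=3.0, obs_var=0.04, rng=rng)
```

With a forecast variance of 0.2 and observation variance of 0.04, the gain is about 0.83, so the posterior members sit much closer to the observation than the prior mean of 2.0 m.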

  3. Time Series Analysis for Forecasting Hospital Census: Application to the Neonatal Intensive Care Unit.

    PubMed

    Capan, Muge; Hoover, Stephen; Jackson, Eric V; Paul, David; Locke, Robert

    2016-01-01

    Accurate prediction of future patient census in hospital units is essential for patient safety, health outcomes, and resource planning. Forecasting census in the Neonatal Intensive Care Unit (NICU) is particularly challenging due to limited ability to control the census and clinical trajectories. The fixed average census approach, which uses the average census from the previous year, is a forecasting alternative used in clinical practice, but has limitations due to census variations. Our objectives are to: (i) analyze the daily NICU census at a single health care facility and develop census forecasting models, (ii) explore models with and without patient data characteristics obtained at the time of admission, and (iii) evaluate the accuracy of the models compared with the fixed average census approach. We used five years of retrospective daily NICU census data for model development (January 2008 - December 2012, N=1827 observations) and one year of data for validation (January - December 2013, N=365 observations). Best-fitting ARIMA and linear regression models were applied to various 7-day prediction periods and compared using error statistics. The census showed a slightly increasing linear trend. Best-fitting models included a non-seasonal model, ARIMA(1,0,0); seasonal ARIMA models, ARIMA(1,0,0)x(1,1,2)7 and ARIMA(2,1,4)x(1,1,2)14; as well as a seasonal linear regression model. The proposed forecasting models resulted in an average 36.49% improvement in forecasting accuracy compared with the fixed average census approach. Time series models provide higher prediction accuracy under different census conditions compared with the fixed average census approach. The presented methodology is easily applicable in clinical practice, can be generalized to other care settings, supports short- and long-term census forecasting, and informs staff resource planning.

  4. Weapons and Tactics Instructor Course 2-16 Sleep and Performance Study

    DTIC Science & Technology

    2017-03-01

    assessments showed a significant increase in self-reported fatigue as the course progressed. This thesis outlines a detailed methodology and lessons learned for follow-on studies of performance as a result of insufficient sleep.

  5. Linking seasonal climate forecasts with crop models in Iberian Peninsula

    NASA Astrophysics Data System (ADS)

    Capa, Mirian; Ines, Amor; Baethgen, Walter; Rodriguez-Fonseca, Belen; Han, Eunjin; Ruiz-Ramos, Margarita

    2015-04-01

    Translating seasonal climate forecasts into agricultural production forecasts could help to establish early warning systems and to design crop management adaptation strategies that take advantage of favorable conditions or reduce the effect of adverse ones. In this study, we use seasonal rainfall forecasts and crop models to improve the predictability of wheat yield in the Iberian Peninsula (IP). Additionally, we estimate the economic margins and production risks associated with extreme scenarios of the seasonal rainfall forecast. This study evaluates two methods for disaggregating seasonal climate forecasts into daily weather data: 1) a stochastic weather generator (CondWG), and 2) a forecast tercile resampler (FResampler). Both methods were used to generate 100 (with FResampler) and 110 (with CondWG) weather series/sequences for three scenarios of seasonal rainfall forecasts. Simulated wheat yield is computed with the crop model CERES-Wheat (Ritchie and Otter, 1985), which is included in the Decision Support System for Agrotechnology Transfer (DSSAT v4.5; Hoogenboom et al., 2010). Simulations were run at two locations in northeastern Spain where the crop model was calibrated and validated with independent field data. Once simulated yields were obtained, an assessment of farmers' gross margins for different seasonal climate forecasts was carried out to estimate production risks under different climate scenarios. This methodology allows farmers to assess the benefits and risks of a seasonal weather forecast in the IP prior to the crop growing season. The results of this study may have important implications for both the public (agricultural planning) and private (decision support to farmers, insurance companies) sectors. Acknowledgements Research by M. Capa-Morocho has been partly supported by a PICATA predoctoral fellowship of the Moncloa Campus of International Excellence (UCM-UPM) and the MULCLIVAR project (CGL2012-38923-C02-02). References Hoogenboom, G. et al., 2010. 
The Decision Support System for Agrotechnology Transfer (DSSAT). Version 4.5 [CD-ROM]. University of Hawaii, Honolulu, Hawaii. Ritchie, J.T., Otter, S., 1985. Description and performance of CERES-Wheat: a user-oriented wheat yield model. In: ARS Wheat Yield Project. ARS-38. Natl Tech Info Serv, Springfield, Missouri, pp. 159-175.
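    The tercile-resampling idea (FResampler) can be sketched compactly: classify historical seasons into terciles by rainfall, then draw analogue years with the forecast's tercile probabilities. This is a minimal sketch with hypothetical climatology values, not the authors' implementation:

```python
import random

def tercile_resample(hist_rainfall, probs, n=100, seed=1):
    """Sample analogue years according to forecast tercile probabilities.

    hist_rainfall: dict year -> seasonal rainfall total
    probs: (p_below, p_normal, p_above) from the seasonal forecast
    Returns n years whose observed daily weather would drive the crop model.
    """
    years = sorted(hist_rainfall, key=hist_rainfall.get)  # driest to wettest
    k = len(years) // 3
    terciles = [years[:k], years[k:2 * k], years[2 * k:]]
    rng = random.Random(seed)
    return [rng.choice(terciles[rng.choices([0, 1, 2], weights=probs)[0]])
            for _ in range(n)]

# toy climatology: 30 seasons with increasing rainfall
hist = {1986 + i: 200.0 + 15 * i for i in range(30)}
sampled = tercile_resample(hist, probs=(0.6, 0.3, 0.1), n=100)
```

    With a 60% "below normal" forecast, roughly 60 of the 100 sampled years come from the dry tercile, so the crop model is driven mostly by dry analogue seasons.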

  6. Seasonal forecasting for water resource management: the example of CNR Genissiat dam on the Rhone River in France

    NASA Astrophysics Data System (ADS)

    Dommanget, Etienne; Bellier, Joseph; Ben Daoud, Aurélien; Graff, Benjamin

    2014-05-01

    Compagnie Nationale du Rhône (CNR) has been granted the concession to operate the Rhone River from the Swiss border to the Mediterranean Sea since 1933 and carries out three interdependent missions: navigation, irrigation and hydropower production. Nowadays, CNR generates one quarter of France's hydropower electricity. The convergence of public and private interests around optimizing the management of water resources throughout the French Rhone valley led CNR to develop hydrological models dedicated to seasonal discharge forecasting. Indeed, seasonal forecasting is a major issue for CNR and for water resource management, in order to optimize long-term investments of the produced electricity, plan dam maintenance operations and anticipate low-water periods. Seasonal forecasting models have been developed for the Genissiat dam. With an installed capacity of 420 MW, Genissiat dam is the first of CNR's 19 hydropower plants. Discharge forecasting at Genissiat dam is strategic since its inflows contribute 20% of the total average Rhone discharge and consequently 40% of the total Rhone hydropower production. Forecasts are based on statistical hydrological models. Discharges on the main Rhone tributaries upstream of Genissiat dam are forecast 1 to 6 months ahead using multiple linear regressions. The input data of these regressions are identified depending on river hydrological regimes and on the period of the year. For the melting season, from spring to summer, snow water equivalent (SWE) data are of major importance. SWE data are calculated from the Crocus model (Météo France) and the SLF's model (Switzerland). CNR hydro-meteorological forecasters assess meteorological trends in precipitation for the coming months. These trends are used to stochastically generate precipitation scenarios that complement the regression data set. 
This probabilistic approach builds a decision-making support for CNR's water resource management team and provides them with seasonal forecasts and their confidence intervals. After a presentation of the CNR methodology, results for the years 2011 and 2013 will illustrate the ability of CNR's seasonal forecasting models. These years are of particular interest for water resource management since they were, respectively, unusually dry and unusually snowy. Model performance will be assessed against historical climatology using the CRPS skill score.
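    The regression step can be illustrated in miniature. Below is a hedged single-predictor sketch (SWE as the only input; the operational models use multiple predictors per tributary and period), with made-up numbers:

```python
def fit_line(swe, discharge):
    """Least-squares line: seasonal discharge as a function of snow water equivalent."""
    n = len(swe)
    mx, my = sum(swe) / n, sum(discharge) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(swe, discharge))
             / sum((x - mx) ** 2 for x in swe))
    return my - slope * mx, slope  # intercept, slope

# hypothetical training pairs: basin-average SWE (mm) vs melt-season discharge (m3/s)
swe = [120.0, 180.0, 90.0, 210.0, 150.0]
q = [260.0, 380.0, 200.0, 440.0, 320.0]
a, b = fit_line(swe, q)
forecast = a + b * 160.0  # forecast for a season with 160 mm of SWE
```

    Perturbing the SWE input with the stochastically generated precipitation scenarios, as described above, would turn this point forecast into an ensemble with a confidence interval.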

  7. Post-processing of multi-model ensemble river discharge forecasts using censored EMOS

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian

    2014-05-01

    When forecasting water levels and river discharge, ensemble weather forecasts are used as meteorological input to hydrologic process models. As hydrologic models are imperfect and the input ensembles tend to be biased and underdispersed, the output ensemble forecasts for river runoff typically are biased and underdispersed, too. Thus, statistical post-processing is required in order to achieve calibrated and sharp predictions. Standard post-processing methods such as Ensemble Model Output Statistics (EMOS) that have their origins in meteorological forecasting are now increasingly being used in hydrologic applications. Here we consider two sub-catchments of River Rhine, for which the forecasting system of the Federal Institute of Hydrology (BfG) uses runoff data that are censored below predefined thresholds. To address this methodological challenge, we develop a censored EMOS method that is tailored to such data. The censored EMOS forecast distribution can be understood as a mixture of a point mass at the censoring threshold and a continuous part based on a truncated normal distribution. Parameter estimates of the censored EMOS model are obtained by minimizing the Continuous Ranked Probability Score (CRPS) over the training dataset. Model fitting on Box-Cox transformed data allows us to take account of the positive skewness of river discharge distributions. In order to achieve realistic forecast scenarios over an entire range of lead-times, there is a need for multivariate extensions. To this end, we smooth the marginal parameter estimates over lead-times. In order to obtain realistic scenarios of discharge evolution over time, the marginal distributions have to be linked with each other. To this end, the multivariate dependence structure can either be adopted from the raw ensemble like in Ensemble Copula Coupling (ECC), or be estimated from observations in a training period. 
The censored EMOS model has been applied to multi-model ensemble forecasts issued on a daily basis over a period of three years. For the two catchments considered, this resulted in well calibrated and sharp forecast distributions over all lead-times from 1 to 114 h. Training observations tended to be better indicators for the dependence structure than the raw ensemble.
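    The censored forecast distribution described above, a point mass at the censoring threshold plus a normal tail, can be sketched directly. This is a toy illustration with hypothetical parameters; the CRPS is computed here by brute-force numerical integration rather than the closed form one would use for operational parameter fitting:

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def censored_cdf(z, mu, sigma, c):
    """CDF of max(Y, c) with Y ~ N(mu, sigma^2): jump of size
    Phi((c - mu) / sigma) at the censoring threshold c, normal tail above."""
    return 0.0 if z < c else norm_cdf((z - mu) / sigma)

def crps(mu, sigma, c, obs, lo=-40.0, hi=60.0, steps=2000):
    """CRPS by numerical integration of (F(z) - 1{z >= obs})^2 dz."""
    h = (hi - lo) / steps
    return sum((censored_cdf(lo + (i + 0.5) * h, mu, sigma, c)
                - (1.0 if lo + (i + 0.5) * h >= obs else 0.0)) ** 2 * h
               for i in range(steps))

point_mass = norm_cdf((0.0 - 2.0) / 1.5)  # mass at the threshold for mu=2, sigma=1.5, c=0
good = crps(mu=5.0, sigma=1.5, c=0.0, obs=5.0)
bad = crps(mu=12.0, sigma=1.5, c=0.0, obs=5.0)
```

    Minimizing the mean of such CRPS values over a training set, with mu and sigma expressed as affine functions of the raw ensemble statistics, is the essence of the EMOS fitting step.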

  8. The use of real-time off-site observations as a methodology for increasing forecast skill in prediction of large wind power ramps one or more hours ahead of their impact on a wind plant.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin Wilde, Principal Investigator

    2012-12-31

    ABSTRACT Application of Real-Time Offsite Measurements in Improved Short-Term Wind Ramp Prediction Skill. Improved forecasting performance immediately preceding wind ramp events is of preeminent concern to most wind energy companies, system operators, and balancing authorities. The value of near real-time hub height-level wind data and more general meteorological measurements to short-term wind power forecasting is well understood. For some sites, access to onsite measured wind data - even historical - can reduce forecast error in the short-range to medium-range horizons by as much as 50%. Unfortunately, valuable free-stream wind measurements on tall towers are not typically available at most wind plants, thereby forcing wind forecasters to rely upon wind measurements below hub height and/or turbine nacelle anemometry. Free-stream measurements can be appropriately scaled to hub-height levels, using existing empirically-derived relationships that account for surface roughness and turbulence. But there is large uncertainty in these relationships for a given time of day and state of the boundary layer. Alternatively, forecasts can rely entirely on turbine anemometry measurements, though such measurements are themselves subject to wake effects that are not stationary. The void in free-stream hub-height level measurements of wind can be filled by remote sensing (e.g., sodar, lidar, and radar). However, the expense of such equipment may not be sustainable. There is a growing market for traditional anemometry on tall tower networks, maintained by third parties to the forecasting process (i.e., independent of forecasters and the forecast users). This study examines the value of offsite tall-tower data from the WINDataNOW Technology network for short-horizon wind power predictions at a wind farm in northern Montana. 
The presentation shall describe successful physical and statistical techniques for applying these data and the practicality of their application in an operational setting. It shall be demonstrated that, when used properly, the real-time offsite measurements materially improve wind ramp capture and prediction statistics compared to traditional wind forecasting techniques and to a simple persistence model.
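    One simple way to exploit offsite tower data, sketched here as an assumption rather than the study's actual technique, is to estimate the advection lag between an upwind tower and the plant and use the lagged tower signal as a short-horizon predictor of onsite wind. A sketch on synthetic data:

```python
import math

def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / math.sqrt(sum((x - mx) ** 2 for x in xs)
                           * sum((y - my) ** 2 for y in ys))

def best_lag(offsite, onsite, max_lag=6):
    """Travel time (in steps) maximizing correlation between the upstream
    tower and the plant; shifting the offsite series by this lag gives a
    crude ramp forecast one or more hours ahead."""
    return max(range(1, max_lag + 1),
               key=lambda lag: corr(offsite[:len(offsite) - lag], onsite[lag:]))

# synthetic hourly winds: the plant sees the tower's air three hours later
tower = [8.0 + 4.0 * math.sin(0.4 * t) for t in range(48)]
plant = tower[:3] + tower[:-3]  # placeholder spin-up, then the advected signal
lag = best_lag(tower, plant)
```

    In this synthetic case the recovered lag is three hours, so a ramp observed at the tower now can be issued as an alert for the plant three hours ahead.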

  9. Three-Month Real-Time Dengue Forecast Models: An Early Warning System for Outbreak Alerts and Policy Decision Support in Singapore

    PubMed Central

    Shi, Yuan; Liu, Xu; Kok, Suet-Yheng; Rajarethinam, Jayanthi; Liang, Shaohong; Yap, Grace; Chong, Chee-Seng; Lee, Kim-Sung; Tan, Sharon S.Y.; Chin, Christopher Kuan Yew; Lo, Andrew; Kong, Waiming; Ng, Lee Ching; Cook, Alex R.

    2015-01-01

    Background: With its tropical rainforest climate, rapid urbanization, and changing demography and ecology, Singapore experiences endemic dengue; the last large outbreak in 2013 culminated in 22,170 cases. In the absence of a vaccine on the market, vector control is the key approach for prevention. Objectives: We sought to forecast the evolution of dengue epidemics in Singapore to provide early warning of outbreaks and to facilitate the public health response to moderate an impending outbreak. Methods: We developed a set of statistical models using least absolute shrinkage and selection operator (LASSO) methods to forecast the weekly incidence of dengue notifications over a 3-month time horizon. This forecasting tool used a variety of data streams and was updated weekly, including recent case data, meteorological data, vector surveillance data, and population-based national statistics. The forecasting methodology was compared with alternative approaches that have been proposed to model dengue case data (seasonal autoregressive integrated moving average and step-down linear regression) by fielding them on the 2013 dengue epidemic, the largest on record in Singapore. Results: Operationally useful forecasts were obtained at a 3-month lag using the LASSO-derived models. Based on the mean average percentage error, the LASSO approach provided more accurate forecasts than the other methods we assessed. We demonstrate its utility in Singapore’s dengue control program by providing a forecast of the 2013 outbreak for advance preparation of outbreak response. Conclusions: Statistical models built using machine learning methods such as LASSO have the potential to markedly improve forecasting techniques for recurrent infectious disease outbreaks such as dengue. Citation: Shi Y, Liu X, Kok SY, Rajarethinam J, Liang S, Yap G, Chong CS, Lee KS, Tan SS, Chin CK, Lo A, Kong W, Ng LC, Cook AR. 2016. 
Three-month real-time dengue forecast models: an early warning system for outbreak alerts and policy decision support in Singapore. Environ Health Perspect 124:1369–1375; http://dx.doi.org/10.1289/ehp.1509981 PMID:26662617
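    The LASSO fitting at the core of such models can be sketched with plain cyclic coordinate descent and soft-thresholding. The toy data below are hypothetical and stand in for the case, meteorological and vector-surveillance predictors:

```python
import math

def soft(x, t):
    """Soft-thresholding operator used in LASSO coordinate descent."""
    return math.copysign(max(abs(x) - t, 0.0), x)

def lasso(X, y, lam, iters=100):
    """Minimize 0.5 * ||y - Xw||^2 + lam * ||w||_1 by cyclic coordinate descent."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            r = [y[i] - sum(w[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            w[j] = soft(rho, lam) / sum(X[i][j] ** 2 for i in range(n))
    return w

# toy predictors: recent case counts (informative) and an irrelevant covariate
X = [[1.0, 1.0], [2.0, -1.0], [3.0, 1.0], [4.0, -1.0]]
y = [2.0, 4.0, 6.0, 8.0]  # incidence depends on column 0 only
w = lasso(X, y, lam=1.0)
```

    The L1 penalty drives the irrelevant coefficient exactly to zero, which is why LASSO is attractive when many candidate data streams are on offer and only a few carry signal at each forecast horizon.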

  10. The weather roulette: assessing the economic value of seasonal wind speed predictions

    NASA Astrophysics Data System (ADS)

    Christel, Isadora; Cortesi, Nicola; Torralba-Fernandez, Veronica; Soret, Albert; Gonzalez-Reviriego, Nube; Doblas-Reyes, Francisco

    2016-04-01

    Climate prediction is an emerging and highly innovative research area. For the wind energy sector, predicting the future variability of wind resources over the coming weeks or seasons is especially relevant to quantify operation and maintenance logistic costs or to inform energy trading decisions with potential cost savings and/or economic benefits. Recent advances in climate predictions have already shown that probabilistic forecasting can improve the current prediction practices, which are based on the use of retrospective climatology and the assumption that what happened in the past is the best estimation of future conditions. Energy decision makers now have access to this new set of climate services, but are they willing to use them? Our aim is to properly explain the potential economic benefits of adopting probabilistic predictions, compared with the current practice, by using the weather roulette methodology (Hagedorn & Smith, 2009). This methodology is a diagnostic tool created to inform in a more intuitive and relevant way about the skill and usefulness of a forecast in the decision-making process, by providing an economically and financially oriented assessment of the benefits of using a particular forecast system. We have selected a region relevant to energy stakeholders where the predictions of the EUPORIAS climate service prototype for the energy sector (RESILIENCE) are skillful. In this region, we have applied the weather roulette to compare the overall prediction success of RESILIENCE's predictions and climatology, expressing it as an effective interest rate, an economic term that is easier for energy stakeholders to understand.
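    One common formulation of the weather roulette, in the spirit of Hagedorn & Smith (2009) though the accounting details here are assumptions rather than the prototype's exact implementation, stakes capital across tercile categories in proportion to the forecast probabilities with odds set by climatology; the effective interest rate is then the geometric-mean growth per round minus one:

```python
import math

def effective_interest_rate(fcst, clim, outcomes):
    """Weather-roulette style rate: per-round capital growth equals
    p_fcst / p_clim of the verifying category; return the geometric-mean
    growth minus 1, i.e. an interest rate per forecast round."""
    log_growth = sum(math.log(f[o] / c[o])
                     for f, c, o in zip(fcst, clim, outcomes))
    return math.exp(log_growth / len(outcomes)) - 1.0

# hypothetical seasonal wind-speed forecasts (below / normal / above tercile)
fcst = [[0.6, 0.2, 0.2], [0.2, 0.5, 0.3], [0.6, 0.2, 0.2], [0.1, 0.3, 0.6]]
clim = [[1 / 3, 1 / 3, 1 / 3]] * 4
outcomes = [0, 1, 0, 2]  # verifying tercile each season
rate = effective_interest_rate(fcst, clim, outcomes)
```

    A positive rate means a bettor following the forecasts grows their capital against a climatology bookmaker, which is the intuitive sense in which the forecast "beats" current practice.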

  11. Assessment of precursory information in seismo-electromagnetic phenomena

    NASA Astrophysics Data System (ADS)

    Han, P.; Hattori, K.; Zhuang, J.

    2017-12-01

    Previous statistical studies showed that there were correlations between seismo-electromagnetic phenomena and sizeable earthquakes in Japan. In this study, utilizing Molchan's error diagram, we evaluate whether these phenomena contain precursory information and discuss how they can be used in short-term forecasting of large earthquake events. In practice, for a given series of precursory signals and related earthquake events, each prediction strategy is characterized by the leading time of alarms, the length of the alarm window, the alarm radius (area) and the magnitude. The leading time is the time length between a detected anomaly and its following alarm, and the alarm window is the duration that an alarm lasts. The alarm radius and magnitude are the maximum predictable distance and minimum predictable magnitude of earthquake events, respectively. We introduce the modified probability gain (PG') and the probability difference (D') to quantify the forecasting performance and to explore the optimal prediction parameters for a given electromagnetic observation. The above methodology is first applied to ULF magnetic data and GPS-TEC data. The results show that the earthquake predictions based on electromagnetic anomalies are significantly better than random guesses, indicating the data contain potentially useful precursory information. Meanwhile, we reveal the optimal prediction parameters for both observations. The methodology proposed in this study could also be applied to other pre-earthquake phenomena to find out whether they contain precursory information, and on this basis to explore the optimal alarm parameters for practical short-term forecasting.
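    A point on the Molchan error diagram, and the associated probability gain, can be computed as follows. The alarm logic and numbers are illustrative, not the authors' exact parameterization:

```python
def molchan_point(anomaly, events, threshold, window):
    """One point on the Molchan error diagram: declare an alarm of length
    `window` after each anomaly exceeding `threshold`; return (tau, nu) =
    fraction of time under alarm and miss rate. The probability gain of the
    strategy relative to random guessing is then (1 - nu) / tau."""
    T = len(anomaly)
    alarm = [False] * T
    for t, a in enumerate(anomaly):
        if a >= threshold:
            for u in range(t, min(t + window, T)):
                alarm[u] = True
    tau = sum(alarm) / T
    nu = 1.0 - sum(1 for e in events if alarm[e]) / len(events)
    return tau, nu

# synthetic record: anomalies appear two steps before each of the two events
anomaly = [0.0] * 100
anomaly[8] = anomaly[48] = 1.0
tau, nu = molchan_point(anomaly, events=[10, 50], threshold=0.5, window=5)
gain = (1.0 - nu) / tau
```

    Sweeping the threshold and window traces the full error diagram; a strategy with precursory information sits well below the diagonal nu = 1 - tau, i.e. has a gain much larger than 1.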

  12. Using the Firefly optimization method to weight an ensemble of rainfall forecasts from the Brazilian developments on the Regional Atmospheric Modeling System (BRAMS)

    NASA Astrophysics Data System (ADS)

    dos Santos, A. F.; Freitas, S. R.; de Mattos, J. G. Z.; de Campos Velho, H. F.; Gan, M. A.; da Luz, E. F. P.; Grell, G. A.

    2013-09-01

    In this paper we consider an optimization problem applying the metaheuristic Firefly algorithm (FY) to weight an ensemble of rainfall forecasts from daily precipitation simulations with the Brazilian developments on the Regional Atmospheric Modeling System (BRAMS) over South America during January 2006. The method is addressed as a parameter estimation problem to weight the ensemble of precipitation forecasts carried out using different options of the convective parameterization scheme. Ensemble simulations were performed using different choices of closures, representing different formulations of dynamic control (the modulation of convection by the environment) in a deep convection scheme. The optimization problem is solved as an inverse problem of parameter estimation. The application and validation of the methodology is carried out using daily precipitation fields, defined over South America and obtained by merging remote sensing estimations with rain gauge observations. The quadratic difference between the model and observed data was used as the objective function to determine the best combination of the ensemble members to reproduce the observations. To reduce the model rainfall biases, the set of weights determined by the algorithm is used to weight members of an ensemble of model simulations in order to compute a new precipitation field that represents the observed precipitation as closely as possible. The validation of the methodology is carried out using classical statistical scores. The algorithm has produced the best combination of the weights, resulting in a new precipitation field closest to the observations.
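    As a toy illustration of the objective being optimized, the weights can be found by brute force over the weight simplex. The paper uses the Firefly metaheuristic for this search, which scales to many members; the exhaustive search below only works for a handful:

```python
def best_weights(members, obs, step=0.05):
    """Exhaustive search over the 3-member weight simplex for the combination
    minimizing the quadratic misfit to observations (the objective function;
    the Firefly metaheuristic performs this search in the paper)."""
    best_err, best_w = float("inf"), None
    n = round(1.0 / step)
    for i in range(n + 1):
        for j in range(n + 1 - i):
            w = (i * step, j * step, (n - i - j) * step)
            err = sum((sum(wm * m[t] for wm, m in zip(w, members)) - obs[t]) ** 2
                      for t in range(len(obs)))
            if err < best_err:
                best_err, best_w = err, w
    return best_w

# toy daily-rain members at two grid points; observations match member 1 exactly
members = [[1.0, 1.0], [3.0, 4.0], [6.0, 2.0]]
obs = [3.0, 4.0]
w = best_weights(members, obs)
```

    The recovered weights put all mass on the member that reproduces the observations, which is the behavior one hopes the metaheuristic approximates on the full South America precipitation fields.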

  13. How do I know if my forecasts are better? Using benchmarks in hydrological ensemble prediction

    NASA Astrophysics Data System (ADS)

    Pappenberger, F.; Ramos, M. H.; Cloke, H. L.; Wetterhall, F.; Alfieri, L.; Bogner, K.; Mueller, A.; Salamon, P.

    2015-03-01

    The skill of a forecast can be assessed by comparing the relative proximity of both the forecast and a benchmark to the observations. Example benchmarks include climatology or a naïve forecast. Hydrological ensemble prediction systems (HEPS) are currently transforming the hydrological forecasting environment, but in this new field there is little information to guide researchers and operational forecasters on how benchmarks can best be used to evaluate their probabilistic forecasts. In this study, it is identified that the calculated forecast skill can vary depending on the benchmark selected and that the selection of a benchmark for determining forecasting system skill is sensitive to a number of hydrological and system factors. A benchmark intercomparison experiment is then undertaken using the continuous ranked probability score (CRPS), a reference forecasting system and a suite of 23 different methods to derive benchmarks. The benchmarks are assessed within the operational set-up of the European Flood Awareness System (EFAS) to determine those that are 'toughest to beat' and so give the most robust discrimination of forecast skill, particularly for the spatial average fields that EFAS relies upon. Evaluating against an observed discharge proxy, the benchmark with the most utility for EFAS, and the one that best avoids naïve skill across different hydrological situations, is found to be meteorological persistency. This benchmark uses the latest meteorological observations of precipitation and temperature to drive the hydrological model. Hydrological long-term average benchmarks, which are currently used in EFAS, are very easily beaten by the forecasting system and their use produces much naïve skill. When decomposed into seasons, the advanced meteorological benchmarks, which make use of meteorological observations from the past 20 years at the same calendar date, have the most skill discrimination. 
They are also good at discriminating skill in low flows and for all catchment sizes. Simpler meteorological benchmarks are particularly useful for high flows. Recommendations for EFAS are to move to routine use of meteorological persistency, an advanced meteorological benchmark, and a simple meteorological benchmark in order to provide a robust evaluation of forecast skill. This work provides the first comprehensive evidence on how benchmarks can be used in the evaluation of skill in probabilistic hydrological forecasts and which benchmarks are most useful for skill discrimination and avoidance of naïve skill in a large-scale HEPS. It is recommended that all HEPS use the evidence and methodology provided here to evaluate which benchmarks to employ, so that forecasters can trust their skill evaluation and be confident that their forecasts are indeed better.
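    Skill relative to a benchmark is typically summarized as a CRPS skill score, 1 - CRPS_forecast / CRPS_benchmark. A minimal sketch using the empirical (ensemble) CRPS, with hypothetical discharge numbers:

```python
def crps_ensemble(members, obs):
    """Empirical CRPS: mean |member - obs| minus half the mean pairwise spread."""
    m = len(members)
    t1 = sum(abs(x - obs) for x in members) / m
    t2 = sum(abs(a - b) for a in members for b in members) / (2.0 * m * m)
    return t1 - t2

def skill_score(fcst_members, bench_members, obs_series):
    """CRPSS of the forecast relative to a benchmark: 1 - CRPS_f / CRPS_b.
    Positive means the forecast beats the benchmark; zero or below means
    the apparent skill is naïve."""
    cf = sum(crps_ensemble(f, o) for f, o in zip(fcst_members, obs_series))
    cb = sum(crps_ensemble(b, o) for b, o in zip(bench_members, obs_series))
    return 1.0 - cf / cb

# toy discharge: forecast ensembles hug the obs; benchmark is a wide climatology
obs = [10.0, 12.0]
fcst = [[9.0, 11.0], [11.0, 13.0]]
bench = [[5.0, 15.0], [5.0, 15.0]]
s = skill_score(fcst, bench, obs)
```

    The paper's point is visible even in this toy: the score s depends entirely on which `bench` is plugged in, so a weak benchmark inflates apparent skill.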

  14. Technical Processing Librarians in the 1980's: Current Trends and Future Forecasts.

    ERIC Educational Resources Information Center

    Kennedy, Gail

    1980-01-01

    This review of recent and anticipated advances in library automation technology and methodology includes a review of the effects of OCLC, MARC formatting, AACR2, and increasing costs, as well as predictions of the impact on library technical processing of networking, expansion of automation, minicomputers, specialized reference services, and…

  15. Application of Classification Methods for Forecasting Mid-Term Power Load Patterns

    NASA Astrophysics Data System (ADS)

    Piao, Minghao; Lee, Heon Gyu; Park, Jin Hyoung; Ryu, Keun Ho

    An automated methodology based on data mining techniques is presented for the prediction of customer load patterns in long-duration load profiles. The proposed approach consists of three stages: (i) data preprocessing: noise and outliers are removed and the continuous attribute-valued features are transformed to discrete values; (ii) cluster analysis: k-means clustering is used to create load pattern classes and the representative load profile for each class; and (iii) classification: several supervised learning methods are evaluated in order to select a suitable prediction method. Following the proposed methodology, power load measured by an AMR (automatic meter reading) system, as well as customer indexes, were used as inputs for clustering. The output of clustering was the set of representative load profiles (or classes). In order to evaluate the forecasting of load patterns, the classification methods were applied to a set of high-voltage customers of the Korean power system, with class labels derived from clustering and other features used as input to build the classifiers. Lastly, the results of our experiments are presented.
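    Stage (ii) can be sketched with a plain k-means over daily load profiles. The toy profiles and initialization below are illustrative assumptions, not the study's AMR data:

```python
def kmeans(profiles, k, iters=20):
    """Plain k-means on load profiles; initial centers are the first k
    profiles (the toy data below is interleaved so they differ)."""
    centers = [list(p) for p in profiles[:k]]
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in profiles:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        centers = [[sum(col) / len(cl) for col in zip(*cl)] if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers, clusters

# toy 24-hour profiles: a low-usage class and a high-usage class, interleaved
low = [[1.0 + 0.1 * i] * 24 for i in range(5)]
high = [[10.0 + 0.1 * i] * 24 for i in range(5)]
profiles = [p for pair in zip(low, high) for p in pair]
centers, clusters = kmeans(profiles, k=2)
```

    The resulting centers are the "representative load profiles" of stage (ii); in stage (iii) each customer's cluster label becomes the target that the supervised classifiers learn to predict from customer indexes.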

  16. Probabilistic population aging

    PubMed Central

    2017-01-01

    We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio, and make the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675

  17. Development and Application of Advanced Weather Prediction Technologies for the Wind Energy Industry (Invited)

    NASA Astrophysics Data System (ADS)

    Mahoney, W. P.; Wiener, G.; Liu, Y.; Myers, W.; Johnson, D.

    2010-12-01

    Wind energy decision makers are required to make critical judgments on a daily basis with regard to energy generation, distribution, demand, storage, and integration. Accurate knowledge of the present and future state of the atmosphere is vital in making these decisions. As wind energy portfolios expand, this forecast problem is taking on new urgency because wind forecast inaccuracies frequently lead to substantial economic losses and constrain the national expansion of renewable energy. Improved weather prediction and precise spatial analysis of small-scale weather events are crucial for renewable energy management. In early 2009, the National Center for Atmospheric Research (NCAR) began a collaborative project with Xcel Energy Services, Inc. to perform research and develop technologies to improve Xcel Energy's ability to increase the amount of wind energy in their generation portfolio. The agreement and scope of work was designed to provide highly detailed, localized wind energy forecasts to enable Xcel Energy to more efficiently integrate electricity generated from wind into the power grid. The wind prediction technologies are designed to help Xcel Energy operators make critical decisions about powering down traditional coal and natural gas-powered plants when sufficient wind energy is predicted. The wind prediction technologies have been designed to cover Xcel Energy wind resources spanning a region from Wisconsin to New Mexico. The goal of the project is not only to improve Xcel Energy’s wind energy prediction capabilities, but also to make technological advancements in wind and wind energy prediction, expand our knowledge of boundary layer meteorology, and share the results across the renewable energy industry. 
To generate wind energy forecasts, NCAR is incorporating observations of current atmospheric conditions from a variety of sources including satellites, aircraft, weather radars, ground-based weather stations, wind profilers, and even wind sensors on individual wind turbines. The information is utilized by several technologies including: a) the Weather Research and Forecasting (WRF) model, which generates finely detailed simulations of future atmospheric conditions, b) the Real-Time Four-Dimensional Data Assimilation System (RTFDDA), which performs continuous data assimilation providing the WRF model with continuous updates of the initial atmospheric state, c) the Dynamic Integrated Forecast System (DICast®), which statistically optimizes the forecasts using all predictors, and d) a suite of wind-to-power algorithms that convert wind speed to power for a wide range of wind farms with varying real-time data availability capabilities. In addition to these core wind energy prediction capabilities, NCAR implemented a high-resolution (10 km grid increment) 30-member ensemble RTFDDA prediction system that provides information on the expected range of wind power over a 72-hour forecast period covering Xcel Energy’s service areas. This talk will include descriptions of these capabilities and report on several topics including initial results of next-day forecasts and nowcasts of wind energy ramp events, influence of local observations on forecast skill, and overall lessons learned to date.
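    A wind-to-power algorithm in its simplest form interpolates a turbine power curve. The curve below is hypothetical, and a production converter would also handle cut-out behavior, air density and farm-level effects such as wakes:

```python
def power_from_speed(speed, curve):
    """Piecewise-linear wind-to-power conversion from a (speed m/s, power kW)
    table; speeds below cut-in map to the first entry, above the table to the
    last."""
    pts = sorted(curve)
    if speed <= pts[0][0]:
        return pts[0][1]
    for (s0, p0), (s1, p1) in zip(pts, pts[1:]):
        if speed <= s1:
            return p0 + (p1 - p0) * (speed - s0) / (s1 - s0)
    return pts[-1][1]

# hypothetical turbine: cut-in 3 m/s, rated 1500 kW at 12 m/s
curve = [(3.0, 0.0), (12.0, 1500.0), (25.0, 1500.0)]
p = power_from_speed(7.5, curve)
```

    The steepness of the curve between cut-in and rated speed is why small wind-speed forecast errors can translate into large power errors, the central difficulty behind ramp forecasting.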

  18. Operational hydrological forecasting in Bavaria. Part I: Forecast uncertainty

    NASA Astrophysics Data System (ADS)

    Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.

    2009-04-01

    In Bavaria, operational flood forecasting has been established since the disastrous flood of 1999. Nowadays, forecasts based on information from about 700 raingauges and 600 rivergauges are calculated and issued for nearly 100 rivergauges. With the added experience of the 2002 and 2005 floods, awareness grew that the standard deterministic forecast, which neglects the uncertainty associated with each forecast, is misleading, creating a false feeling of unambiguousness. As a consequence, a system to identify, quantify and communicate the sources and magnitude of forecast uncertainty has been developed, which will be presented in part I of this study. In this system, the use of ensemble meteorological forecasts plays a key role, which will be presented in part II. Developing the system, several constraints stemming from the range of hydrological regimes and operational requirements had to be met: Firstly, operational time constraints obviate the variation of all components of the modeling chain as would be done in a full Monte Carlo simulation. Therefore, an approach was chosen where only the most relevant sources of uncertainty were dynamically considered while the others were jointly accounted for by static error distributions from offline analysis. Secondly, the dominant sources of uncertainty vary over the wide range of forecasted catchments: In alpine headwater catchments, typically of a few hundred square kilometers in size, rainfall forecast uncertainty is the key factor for forecast uncertainty, with a magnitude dynamically changing with the prevailing predictability of the atmosphere. In lowland catchments encompassing several thousands of square kilometers, forecast uncertainty in the desired range (usually up to two days) is mainly dependent on upstream gauge observation quality, routing and unpredictable human impact such as reservoir operation. 
The determination of forecast uncertainty comprised the following steps: a) From comparison of gauge observations and several years of archived forecasts, overall empirical error distributions, termed 'overall error', were derived for each gauge for a range of relevant forecast lead times. b) The error distributions vary strongly with the hydrometeorological situation, therefore a subdivision into the hydrological cases 'low flow', 'rising flood', 'flood' and 'flood recession' was introduced. c) For the sake of numerical compression, theoretical distributions were fitted to the empirical distributions using the method of moments. Here, the normal distribution was generally best suited. d) Further data compression was achieved by representing the distribution parameters as a function (second-order polynomial) of lead time. In general, the 'overall error' obtained from the above procedure is most useful in regions where large human impact occurs and where the influence of the meteorological forecast is limited. In upstream regions, however, forecast uncertainty is strongly dependent on the current predictability of the atmosphere, which is contained in the spread of an ensemble forecast. Including this dynamically in the hydrological forecast uncertainty estimation requires prior elimination of the contribution of the weather forecast to the 'overall error'. This was achieved by calculating long series of hydrometeorological forecast tests, where rainfall observations were used instead of forecasts. The resulting error distribution is termed 'model error' and can be applied to hydrological ensemble forecasts, where ensemble rainfall forecasts are used as forcing. The concept will be illustrated by examples (good and bad ones) covering a wide range of catchment sizes, hydrometeorological regimes and qualities of hydrological model calibration. The methodology to combine the static and dynamic shares of uncertainty will be presented in part II of this study.
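    Steps c) and d) can be sketched as a method-of-moments normal fit per lead time, with the spread then compressed into a second-order polynomial of lead time. For brevity this sketch uses an exact quadratic through three points rather than a fitted polynomial, and all numbers are hypothetical:

```python
import math

def fit_normal(errors):
    """Method-of-moments fit of a normal distribution to forecast errors:
    the sample mean and (population) standard deviation."""
    n = len(errors)
    mu = sum(errors) / n
    return mu, math.sqrt(sum((e - mu) ** 2 for e in errors) / n)

def quadratic_through(p0, p1, p2):
    """Second-order polynomial in lead time through three (lead, sigma)
    points (Lagrange form), compressing the per-lead-time parameters."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    def f(x):
        return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
                + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
                + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))
    return f

mu6, s6 = fit_normal([-1.0, 0.0, 1.0, 2.0])  # toy 6 h forecast errors (m3/s)
sigma = quadratic_through((6.0, 1.0), (24.0, 3.0), (48.0, 8.0))
```

    Calling sigma(lead_time) then yields the error spread at any lead time from just three stored coefficients per gauge and hydrological case, which is the "numerical compression" the abstract describes.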

  19. Windblown Dust Deposition Forecasting and Spread of Contamination around Mine Tailings.

    PubMed

    Stovern, Michael; Guzmán, Héctor; Rine, Kyle P; Felix, Omar; King, Matthew; Ela, Wendell P; Betterton, Eric A; Sáez, Avelino Eduardo

    2016-02-01

    Wind erosion, transport and deposition of windblown dust from anthropogenic sources, such as mine tailings impoundments, can have significant effects on the surrounding environment. The lack of vegetation and the vertical protrusion of the mine tailings above the neighboring terrain make the tailings susceptible to wind erosion. Modeling the erosion, transport and deposition of particulate matter from mine tailings is a challenge for many reasons, including heterogeneity of the soil surface, vegetative canopy coverage, dynamic meteorological conditions and topographic influences. In this work, a previously developed Deposition Forecasting Model (DFM) that is specifically designed to model the transport of particulate matter from mine tailings impoundments is verified using dust collection and topsoil measurements. The DFM is initialized using data from an operational Weather Research and Forecasting (WRF) model. The forecast deposition patterns are compared to dust collected by inverted-disc samplers and determined through gravimetric, chemical composition and lead isotopic analysis. The DFM is capable of predicting dust deposition patterns from the tailings impoundment to the surrounding area. The methodology and approach employed in this work can be generalized to other contaminated sites from which dust transport to the local environment can be assessed as a potential route for human exposure.

  20. Windblown Dust Deposition Forecasting and Spread of Contamination around Mine Tailings

    PubMed Central

    Stovern, Michael; Guzmán, Héctor; Rine, Kyle P.; Felix, Omar; King, Matthew; Ela, Wendell P.; Betterton, Eric A.; Sáez, Avelino Eduardo

    2017-01-01

    Wind erosion, transport and deposition of windblown dust from anthropogenic sources, such as mine tailings impoundments, can have significant effects on the surrounding environment. The lack of vegetation and the vertical protrusion of the mine tailings above the neighboring terrain make the tailings susceptible to wind erosion. Modeling the erosion, transport and deposition of particulate matter from mine tailings is a challenge for many reasons, including heterogeneity of the soil surface, vegetative canopy coverage, dynamic meteorological conditions and topographic influences. In this work, a previously developed Deposition Forecasting Model (DFM) that is specifically designed to model the transport of particulate matter from mine tailings impoundments is verified using dust collection and topsoil measurements. The DFM is initialized using data from an operational Weather Research and Forecasting (WRF) model. The forecast deposition patterns are compared to dust collected by inverted-disc samplers and determined through gravimetric, chemical composition and lead isotopic analysis. The DFM is capable of predicting dust deposition patterns from the tailings impoundment to the surrounding area. The methodology and approach employed in this work can be generalized to other contaminated sites from which dust transport to the local environment can be assessed as a potential route for human exposure. PMID:29082035

  1. Hybrid methodology for tuberculosis incidence time-series forecasting based on ARIMA and a NAR neural network.

    PubMed

    Wang, K W; Deng, C; Li, J P; Zhang, Y Y; Li, X Y; Wu, M C

    2017-04-01

    Tuberculosis (TB) affects people globally and is being reconsidered as a serious public health problem in China. Reliable forecasting is useful for the prevention and control of TB. This study proposes a hybrid model combining autoregressive integrated moving average (ARIMA) with a nonlinear autoregressive (NAR) neural network for forecasting the incidence of TB from January 2007 to March 2016. Prediction performance was compared between the hybrid model and the ARIMA model. The best-fit hybrid model combined an ARIMA (3,1,0) × (0,1,1)₁₂ model with a NAR neural network with four delays and 12 neurons in the hidden layer. The ARIMA-NAR hybrid model, which exhibited lower mean square error, mean absolute error, and mean absolute percentage error (0.2209, 0.1373, and 0.0406, respectively) in modelling performance, produced more accurate forecasts of TB incidence than the ARIMA model alone. This study shows that developing and applying the ARIMA-NAR hybrid model is an effective method to fit the linear and nonlinear patterns of time-series data, and this model could be helpful in the prevention and control of TB.
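    The hybrid decomposition can be illustrated with a deliberately simplified sketch: a least-squares AR(1) stands in for the full ARIMA(3,1,0) × (0,1,1)₁₂ component, and a one-parameter quadratic autoregression on the residuals stands in for the NAR network with four delays and 12 hidden neurons. None of this reproduces the paper's model; it only shows how the linear and nonlinear parts are combined.

```python
def fit_ar1(y):
    """Least-squares AR(1) without intercept: y[t] ~ a * y[t-1]."""
    num = sum(y[t] * y[t - 1] for t in range(1, len(y)))
    den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    return num / den

def fit_quadratic_nar(e):
    """One-parameter stand-in for the NAR network: e[t] ~ b * e[t-1]**2,
    fitted by least squares to the linear model's residual series."""
    num = sum(e[t] * e[t - 1] ** 2 for t in range(1, len(e)))
    den = sum(e[t - 1] ** 4 for t in range(1, len(e)))
    return num / den

def hybrid_forecast(y, a, b):
    """One-step hybrid forecast: linear part plus the nonlinear correction
    evaluated on the most recent residual."""
    resid = [y[t] - a * y[t - 1] for t in range(1, len(y))]
    return a * y[-1] + b * resid[-1] ** 2

# Toy monthly incidence series (invented numbers, not the TB data).
y = [1.0, 0.8, 0.7, 0.55, 0.47]
a = fit_ar1(y)
resid = [y[t] - a * y[t - 1] for t in range(1, len(y))]
b = fit_quadratic_nar(resid)
print(hybrid_forecast(y, a, b))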

  2. Bayesian Probabilistic Projections of Life Expectancy for All Countries

    PubMed Central

    Raftery, Adrian E.; Chunn, Jennifer L.; Gerland, Patrick; Ševčíková, Hana

    2014-01-01

    We propose a Bayesian hierarchical model for producing probabilistic forecasts of male period life expectancy at birth for all the countries of the world from the present to 2100. Such forecasts would be an input to the production of probabilistic population projections for all countries, which is currently being considered by the United Nations. To evaluate the method, we did an out-of-sample cross-validation experiment, fitting the model to the data from 1950–1995, and using the estimated model to forecast for the subsequent ten years. The ten-year predictions had a mean absolute error of about 1 year, about 40% less than the current UN methodology. The probabilistic forecasts were calibrated, in the sense that (for example) the 80% prediction intervals contained the truth about 80% of the time. We illustrate our method with results from Madagascar (a typical country with steadily improving life expectancy), Latvia (a country that has had a mortality crisis), and Japan (a leading country). We also show aggregated results for South Asia, a region with eight countries. Free publicly available R software packages called bayesLife and bayesDem are available to implement the method. PMID:23494599
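    The calibration criterion quoted above ("80% prediction intervals contained the truth about 80% of the time") can be checked mechanically: take the central 80% interval of each forecast's sample distribution and count how often the verifying value falls inside. A self-contained sketch with synthetic normal forecasts, not the paper's data or its bayesLife implementation:

```python
import random

def central_interval(samples, level=0.8):
    """Central prediction interval from empirical quantiles of Monte Carlo
    samples of the forecast distribution."""
    s = sorted(samples)
    lo = s[int((1 - level) / 2 * (len(s) - 1))]
    hi = s[int((1 + level) / 2 * (len(s) - 1))]
    return lo, hi

def empirical_coverage(cases, level=0.8):
    """Fraction of (samples, truth) cases whose interval contains the truth."""
    hits = 0
    for samples, truth in cases:
        lo, hi = central_interval(samples, level)
        hits += lo <= truth <= hi
    return hits / len(cases)

# Perfectly calibrated toy forecasts: the truth is drawn from the same
# distribution as the forecast samples, so coverage should sit near 0.8.
rng = random.Random(1)
cases = []
for _ in range(2000):
    mu = rng.uniform(50, 85)  # hypothetical life expectancy level
    samples = [rng.gauss(mu, 2.0) for _ in range(400)]
    cases.append((samples, rng.gauss(mu, 2.0)))
print(round(empirical_coverage(cases), 3))  # close to 0.8
```

    A miscalibrated forecast (e.g. samples drawn with too small a spread) would show coverage well below the nominal level under the same check.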

  3. New technique for ensemble dressing combining Multimodel SuperEnsemble and precipitation PDF

    NASA Astrophysics Data System (ADS)

    Cane, D.; Milelli, M.

    2009-09-01

    The Multimodel SuperEnsemble technique (Krishnamurti et al., Science 285, 1548-1550, 1999) is a postprocessing method for the estimation of weather forecast parameters that reduces direct model output errors. It differs from other ensemble analysis techniques by using an adequate weighting of the input forecast models to obtain a combined estimate of meteorological parameters. Weights are calculated by least-squares minimization of the difference between the model and the observed field during a so-called training period. Although it can be applied successfully to continuous parameters like temperature, humidity, wind speed and mean sea level pressure (Cane and Milelli, Meteorologische Zeitschrift, 15, 2, 2006), the Multimodel SuperEnsemble also gives good results when applied to precipitation, a parameter that is quite difficult to handle with standard post-processing methods. Here we present our methodology for Multimodel precipitation forecasts, applied to a wide spectrum of results over the very dense non-GTS weather station network of Piemonte. We will focus particularly on an accurate statistical method for bias correction and on ensemble dressing in agreement with the observed forecast-conditioned precipitation PDF. Acknowledgement: this work is supported by the Italian Civil Defence Department.
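    The core of the technique, least-squares weights computed over a training period, can be sketched for two input models (operational systems use more). Following the standard SuperEnsemble formulation, anomalies are taken with respect to training-period means; the numbers below are invented for illustration.

```python
def superensemble(train_f1, train_f2, train_obs, f1, f2):
    """Two-model SuperEnsemble: least-squares weights on training anomalies
    (2x2 normal equations solved by Cramer's rule), with the forecast
    reconstructed around the observed training mean."""
    m1 = sum(train_f1) / len(train_f1)
    m2 = sum(train_f2) / len(train_f2)
    mo = sum(train_obs) / len(train_obs)
    a1 = [v - m1 for v in train_f1]
    a2 = [v - m2 for v in train_f2]
    ao = [v - mo for v in train_obs]
    s11 = sum(x * x for x in a1)
    s22 = sum(x * x for x in a2)
    s12 = sum(x * y for x, y in zip(a1, a2))
    s1o = sum(x * y for x, y in zip(a1, ao))
    s2o = sum(x * y for x, y in zip(a2, ao))
    det = s11 * s22 - s12 * s12
    w1 = (s1o * s22 - s2o * s12) / det
    w2 = (s11 * s2o - s12 * s1o) / det
    return mo + w1 * (f1 - m1) + w2 * (f2 - m2), (w1, w2)

# Constructed so the observations are exactly 0.3*model1 + 0.7*model2:
# the training step recovers those weights.
pred, weights = superensemble([1, 2, 3, 5], [2, 1, 4, 3],
                              [1.7, 1.3, 3.7, 3.6], 4.0, 2.0)
print(pred, weights)
```

    Note that the weights are unconstrained (they need not be positive or sum to one), which is what distinguishes the SuperEnsemble from a simple ensemble mean.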

  4. Ensemble-sensitivity Analysis Based Observation Targeting for Mesoscale Convection Forecasts and Factors Influencing Observation-Impact Prediction

    NASA Astrophysics Data System (ADS)

    Hill, A.; Weiss, C.; Ancell, B. C.

    2017-12-01

    The basic premise of observation targeting is that additional observations, when gathered and assimilated with a numerical weather prediction (NWP) model, will produce a more accurate forecast related to a specific phenomenon. Ensemble-sensitivity analysis (ESA; Ancell and Hakim 2007; Torn and Hakim 2008) is a tool capable of accurately estimating the proper location of targeted observations in areas that have initial model uncertainty and large error growth, as well as predicting the reduction of forecast variance due to the assimilated observation. ESA relates an ensemble of NWP model forecasts, specifically an ensemble of scalar forecast metrics, linearly to earlier model states. A thorough investigation is presented to determine how different factors of the forecast process affect our ability to successfully target new observations for mesoscale convection forecasts. Our primary goals for this work are to determine: (1) If targeted observations have more positive impact than non-targeted (i.e. randomly chosen) observations; (2) If there are lead-time constraints to targeting for convection; (3) How inflation, localization, and the assimilation filter influence impact prediction and realized results; (4) If there exist differences between targeted observations at the surface versus aloft; and (5) How physics errors and nonlinearity may augment observation impacts. Ten cases of dryline-initiated convection between 2011 and 2013 are simulated within a simplified OSSE framework and presented here. Ensemble simulations are produced from a cycling system that utilizes the Weather Research and Forecasting (WRF) model v3.8.1 within the Data Assimilation Research Testbed (DART). A "truth" (nature) simulation is produced by supplying a 3-km WRF run with GFS analyses and integrating the model forward 90 hours, from the beginning of ensemble initialization through the end of the forecast. 
Target locations for surface and radiosonde observations are computed 6, 12, and 18 hours into the forecast based on a chosen scalar forecast response metric (e.g., maximum reflectivity at convection initiation). A variety of experiments are designed to achieve the aforementioned goals and will be presented, along with their results, detailing the feasibility of targeting for mesoscale convection forecasts.
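    The ensemble-sensitivity calculation itself is a simple linear regression across members: the sensitivity of a scalar forecast metric J to an earlier state variable x is cov(J, x)/var(x), and the expected reduction of the metric's variance from assimilating an observation of x with error variance σ² follows as cov(J, x)²/(var(x) + σ²) (after Ancell and Hakim 2007). A minimal sketch with made-up member values:

```python
def ensemble_sensitivity(J, x):
    """ESA regression coefficient: dJ/dx ~ cov(J, x) / var(x),
    estimated across ensemble members with sample (n-1) normalization."""
    n = len(J)
    Jm, xm = sum(J) / n, sum(x) / n
    cov = sum((a - Jm) * (b - xm) for a, b in zip(J, x)) / (n - 1)
    var = sum((b - xm) ** 2 for b in x) / (n - 1)
    return cov / var

def expected_variance_reduction(J, x, obs_err_var):
    """Predicted reduction of forecast-metric variance if an observation of
    x with error variance obs_err_var were assimilated at that point."""
    n = len(J)
    Jm, xm = sum(J) / n, sum(x) / n
    cov = sum((a - Jm) * (b - xm) for a, b in zip(J, x)) / (n - 1)
    var = sum((b - xm) ** 2 for b in x) / (n - 1)
    return cov ** 2 / (var + obs_err_var)

# Four hypothetical members: metric J varies linearly with state variable x.
x = [1.0, 2.0, 3.0, 4.0]
J = [2.0, 4.0, 6.0, 8.0]
print(ensemble_sensitivity(J, x), expected_variance_reduction(J, x, 1.0))
```

    Targeting then amounts to evaluating the variance-reduction field over candidate observation locations and choosing the maxima.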

  5. Statistical Post-Processing of Wind Speed Forecasts to Estimate Relative Economic Value

    NASA Astrophysics Data System (ADS)

    Courtney, Jennifer; Lynch, Peter; Sweeney, Conor

    2013-04-01

    The objective of this research is to obtain the best possible wind speed forecasts for the wind energy industry by using an optimal combination of well-established forecasting and post-processing methods. We start with the ECMWF 51-member ensemble prediction system (EPS), which is underdispersive and hence uncalibrated. We aim to produce wind speed forecasts that are more accurate and calibrated than the EPS. The 51 members of the EPS are clustered into 8 weighted representative members (RMs), chosen to minimize the within-cluster spread while maximizing the inter-cluster spread. The forecasts are then downscaled using two limited area models, WRF and COSMO, at two resolutions, 14 km and 3 km. This process creates four distinguishable ensembles which are used as input to statistical post-processing methods requiring multi-model forecasts. Two such methods are presented here. The first, Bayesian Model Averaging, has been proven to provide more calibrated and accurate wind speed forecasts than the ECMWF EPS using this multi-model input data. The second, heteroscedastic censored regression, is also showing promising results. We compare the two post-processing methods, applied to a year of hindcast wind speed data around Ireland, using an array of deterministic and probabilistic verification techniques, such as MAE, CRPS, probability integral transforms and verification rank histograms, to show which method provides the most accurate and calibrated forecasts. However, the value of a forecast to an end-user cannot be fully quantified by just the accuracy and calibration measurements mentioned, as the relationship between skill and value is complex. Capturing the full potential of the forecast benefits also requires detailed knowledge of the end-users' weather-sensitive decision-making processes and, most importantly, the economic impact a forecast will have on their income. 
Finally, we present the continuous relative economic value of both post-processing methods to identify which is more beneficial to the wind energy industry of Ireland.
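    Among the probabilistic scores mentioned, the CRPS for an ensemble has a compact sample estimator, E|X − y| − ½ E|X − X′|. A minimal stdlib version with toy values, not the hindcast data:

```python
def crps_ensemble(members, obs):
    """Sample CRPS for an ensemble forecast: mean absolute error of the
    members against the observation, minus half the mean absolute
    difference between all member pairs (the ensemble spread term)."""
    m = len(members)
    t1 = sum(abs(x - obs) for x in members) / m
    t2 = sum(abs(a - b) for a in members for b in members) / (2 * m * m)
    return t1 - t2

# For a single member the spread term vanishes and CRPS reduces to the MAE.
print(crps_ensemble([3.0], 1.0))       # plain absolute error
print(crps_ensemble([0.0, 2.0], 1.0))  # spread is rewarded when it brackets obs
```

    Lower is better; averaging this score over many forecast/observation pairs gives the aggregate CRPS used in the comparison described above.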

  6. Do we need demographic data to forecast plant population dynamics?

    USGS Publications Warehouse

    Tredennick, Andrew T.; Hooten, Mevin B.; Adler, Peter B.

    2017-01-01

    Rapid environmental change has generated growing interest in forecasts of future population trajectories. Traditional population models built with detailed demographic observations from one study site can address the impacts of environmental change at particular locations, but are difficult to scale up to the landscape and regional scales relevant to management decisions. An alternative is to build models using population-level data that are much easier to collect over broad spatial scales than individual-level data. However, it is unknown whether models built using population-level data adequately capture the effects of density dependence and environmental forcing that are necessary to generate skillful forecasts. Here, we test the consequences of aggregating individual responses when forecasting the population states (percent cover) and trajectories of four perennial grass species in a semi-arid grassland in Montana, USA. We parameterized two population models for each species, one based on individual-level data (survival, growth and recruitment) and one on population-level data (percent cover), and compared their forecasting accuracy and forecast horizons with and without the inclusion of climate covariates. For both models, we used Bayesian ridge regression to weight the influence of climate covariates for optimal prediction. In the absence of climate effects, we found no significant difference between the forecast accuracy of models based on individual-level data and models based on population-level data. Climate effects were weak, but increased forecast accuracy for two species. Increases in accuracy with climate covariates were similar between model types. In our case study, percent cover models generated forecasts as accurate as those from a demographic model. For the goal of forecasting, models based on aggregated individual-level data may offer a practical alternative to data-intensive demographic models. 
Long time series of percent cover data already exist for many plant species. Modelers should exploit these data to predict the impacts of environmental change.
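    The role of the ridge penalty, shrinking weak climate covariates toward zero for better out-of-sample prediction, is easiest to see in the one-covariate closed form. This is ordinary ridge regression with a fixed penalty, a simplified stand-in for the Bayesian ridge used in the study:

```python
def ridge_weight(x, y, lam):
    """One-covariate ridge regression (no intercept):
    w = sum(x*y) / (sum(x^2) + lam).
    Larger lam shrinks the covariate's estimated influence toward zero."""
    return sum(a * b for a, b in zip(x, y)) / (sum(a * a for a in x) + lam)

# Invented covariate/response pairs: with lam = 0 this is ordinary least
# squares; increasing lam progressively discounts the covariate.
x = [1.0, 2.0]
y = [2.0, 4.0]
for lam in (0.0, 1.0, 5.0):
    print(lam, ridge_weight(x, y, lam))
```

    In the Bayesian variant the penalty is not fixed by hand but inferred from the data, which is how the study lets each climate covariate earn (or lose) its weight.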

  7. The need for conducting forensic analysis of decommissioned bridges.

    DOT National Transportation Integrated Search

    2014-01-01

    A limiting factor in current bridge management programs is a lack of detailed knowledge of bridge deterioration mechanisms and processes. The current state of the art is to predict future condition using statistical forecasting models based upon ...

  8. Data Assimilation and Regional Forecasts Using Atmospheric InfraRed Sounder (AIRS) Profiles

    NASA Technical Reports Server (NTRS)

    Chou, Shih-Hung; Zavodsky, Bradley; Jedlovec, Gary

    2009-01-01

    In data sparse regions, remotely-sensed observations can be used to improve analyses, which in turn should lead to better forecasts. One such source comes from the Atmospheric Infrared Sounder (AIRS), which, together with the Advanced Microwave Sounding Unit (AMSU), provides temperature and moisture profiles with an accuracy comparable to that of radiosondes. The purpose of this paper is to describe a procedure to optimally assimilate AIRS thermodynamic profiles--obtained from the version 5.0 Earth Observing System (EOS) science team retrieval algorithm--into a regional configuration of the Weather Research and Forecasting (WRF) model using WRF-Var. The paper focuses on development of background error covariances for the regional domain and background field type, a methodology for ingesting AIRS profiles as separate over-land and over-water retrievals with different error characteristics, and utilization of level-by-level quality indicators to select only the highest quality data. The assessment of the impact of the AIRS profiles on WRF-Var analyses will focus on intelligent use of the quality indicators, optimized tuning of the WRF-Var, and comparison of analysis soundings to radiosondes. The analyses will be used to conduct a month-long series of regional forecasts over the continental U.S. The long-term impact of AIRS profiles on forecasts will be assessed against verifying radiosonde and stage IV precipitation data.

  9. Data Assimilation and Regional Forecasts using Atmospheric InfraRed Sounder (AIRS) Profiles

    NASA Technical Reports Server (NTRS)

    Zabodsky, Brad; Chou, Shih-Hung; Jedlovec, Gary J.

    2009-01-01

    In data sparse regions, remotely-sensed observations can be used to improve analyses, which in turn should lead to better forecasts. One such source comes from the Atmospheric Infrared Sounder (AIRS), which, together with the Advanced Microwave Sounding Unit (AMSU), provides temperature and moisture profiles with an accuracy comparable to that of radiosondes. The purpose of this poster is to describe a procedure to optimally assimilate AIRS thermodynamic profiles, obtained from the version 5.0 Earth Observing System (EOS) science team retrieval algorithm, into a regional configuration of the Weather Research and Forecasting (WRF) model using WRF-Var. The poster focuses on development of background error covariances for the regional domain and background field type, a methodology for ingesting AIRS profiles as separate over-land and over-water retrievals with different error characteristics, and utilization of level-by-level quality indicators to select only the highest quality data. The assessment of the impact of the AIRS profiles on WRF-Var analyses will focus on intelligent use of the quality indicators, optimized tuning of the WRF-Var, and comparison of analysis soundings to radiosondes. The analyses are used to conduct a month-long series of regional forecasts over the continental U.S. The long-term impact of AIRS profiles on forecasts will be assessed against NAM analyses and stage IV precipitation data.

  10. FPGA-Based Stochastic Echo State Networks for Time-Series Forecasting.

    PubMed

    Alomar, Miquel L; Canals, Vincent; Perez-Mora, Nicolas; Martínez-Moll, Víctor; Rosselló, Josep L

    2016-01-01

    Hardware implementation of artificial neural networks (ANNs) allows exploiting the inherent parallelism of these systems. Nevertheless, they require a large amount of resources in terms of area and power dissipation. Recently, Reservoir Computing (RC) has arisen as a strategic technique to design recurrent neural networks (RNNs) with simple learning capabilities. In this work, we show a new approach to implement RC systems with digital gates. The proposed method is based on the use of probabilistic computing concepts to reduce the hardware required to implement different arithmetic operations. The result is the development of a highly functional system with low hardware resources. The presented methodology is applied to chaotic time-series forecasting.
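    The probabilistic-computing idea underlying the paper, encoding a value as the ON-fraction of a random bitstream so that a single AND gate multiplies two values, can be simulated in software. This is a sketch of the general stochastic-computing principle, not the authors' FPGA design:

```python
import random

def to_stream(p, n, rng):
    """Encode probability p in [0, 1] as a random bitstream of length n:
    each bit is 1 with probability p."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def stochastic_multiply(p1, p2, n=100_000, seed=42):
    """Approximate p1 * p2 by AND-ing two independent bitstreams and
    counting the ON-fraction; in hardware a single AND gate replaces a
    full multiplier."""
    rng = random.Random(seed)
    s1 = to_stream(p1, n, rng)
    s2 = to_stream(p2, n, rng)
    return sum(a & b for a, b in zip(s1, s2)) / n

print(stochastic_multiply(0.5, 0.6))  # close to 0.30
```

    The accuracy scales with stream length (standard error roughly 1/sqrt(n)), which is the area-versus-precision trade-off the paper exploits when building the reservoir's arithmetic from probabilistic gates.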

  11. FPGA-Based Stochastic Echo State Networks for Time-Series Forecasting

    PubMed Central

    Alomar, Miquel L.; Canals, Vincent; Perez-Mora, Nicolas; Martínez-Moll, Víctor; Rosselló, Josep L.

    2016-01-01

    Hardware implementation of artificial neural networks (ANNs) allows exploiting the inherent parallelism of these systems. Nevertheless, they require a large amount of resources in terms of area and power dissipation. Recently, Reservoir Computing (RC) has arisen as a strategic technique to design recurrent neural networks (RNNs) with simple learning capabilities. In this work, we show a new approach to implement RC systems with digital gates. The proposed method is based on the use of probabilistic computing concepts to reduce the hardware required to implement different arithmetic operations. The result is the development of a highly functional system with low hardware resources. The presented methodology is applied to chaotic time-series forecasting. PMID:26880876

  12. Evaluating the Impact of AIRS Observations on Regional Forecasts at the SPoRT Center

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley

    2011-01-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center collaborates with operational partners of different sizes and operational goals to improve forecasts through targeted projects and data sets. Modeling and data assimilation (DA) activities focus on demonstrating the utility of NASA data sets and capabilities within operational systems. SPoRT has successfully assimilated Atmospheric Infrared Sounder (AIRS) radiance and profile data. A collaborative project is underway with the Joint Center for Satellite Data Assimilation (JCSDA) to use AIRS profiles to better understand the impact of AIRS radiances assimilated within the Gridpoint Statistical Interpolation (GSI) system, in hopes of engaging the operational DA community in a reassessment of assimilation methodologies to more effectively assimilate hyperspectral radiances.

  13. AgRISTARS: Foreign commodity production forecasting. Minutes of the annual formal project manager's review, including preliminary technical review reports of FY80 experiments. [wheat/barley and corn/soybean experiments

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The U.S./Canada wheat/barley exploratory experiment is discussed with emphasis on labeling, machine processing using P1A, and the crop calendar. Classification and the simulated aggregation test used in the U.S. corn/soybean exploratory experiment are also considered. Topics covered regarding the foreign commodity production forecasting project include the acquisition, handling, and processing of both U.S. and foreign agricultural data, as well as meteorological data. The accuracy assessment methodology, multicrop sampling and aggregation technology development, frame development, the yield project interface, and classification for area estimation are also examined.

  14. Performance evaluation of contrast-detail in full field digital mammography systems using ideal (Hotelling) observer vs. conventional automated analysis of CDMAM images for quality control of contrast-detail characteristics.

    PubMed

    Delakis, Ioannis; Wise, Robert; Morris, Lauren; Kulama, Eugenia

    2015-11-01

    The purpose of this work was to evaluate the contrast-detail performance of full field digital mammography (FFDM) systems using ideal (Hotelling) observer Signal-to-Noise Ratio (SNR) methodology and to ascertain whether it can be considered an alternative to the conventional, automated analysis of CDMAM phantom images. Five FFDM units currently used in the national breast screening programme were evaluated, which differed with respect to age, detector, Automatic Exposure Control (AEC) and target/filter combination. Contrast-detail performance was analysed using CDMAM and ideal observer SNR methodology. The ideal observer SNR was calculated for input signal originating from gold discs of varying thicknesses and diameters, and then used to estimate the threshold gold thickness for each diameter as per CDMAM analysis. The variability of both methods and the dependence of CDMAM analysis on phantom manufacturing discrepancies were also investigated. Results from both CDMAM and ideal observer methodologies were informative differentiators of FFDM systems' contrast-detail performance, displaying comparable patterns with respect to the FFDM systems' type and age. CDMAM results suggested higher threshold gold thickness values compared with the ideal observer methodology, especially for small-diameter details, which can be attributed to the behaviour of the CDMAM phantom used in this study. In addition, ideal observer methodology results showed lower variability than CDMAM results. The ideal observer SNR methodology can provide a useful metric of the FFDM systems' contrast-detail characteristics and could be considered a surrogate for conventional, automated analysis of CDMAM images. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
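    For a known signal Δs in Gaussian noise with covariance K, the ideal (Hotelling) observer SNR reduces to SNR² = Δsᵀ K⁻¹ Δs. A two-pixel toy version of that formula (a real calculation runs over full detector regions of interest; the numbers here are purely illustrative):

```python
def hotelling_snr(ds, K):
    """Ideal-observer SNR for a two-pixel signal ds with 2x2 noise
    covariance K: SNR^2 = ds^T K^{-1} ds (K inverted in closed form)."""
    det = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    Kinv = [[K[1][1] / det, -K[0][1] / det],
            [-K[1][0] / det, K[0][0] / det]]
    snr_sq = sum(ds[i] * Kinv[i][j] * ds[j]
                 for i in range(2) for j in range(2))
    return snr_sq ** 0.5

# Uncorrelated unit-variance noise: SNR is just the signal's Euclidean norm.
print(hotelling_snr([3.0, 4.0], [[1.0, 0.0], [0.0, 1.0]]))
# Correlated noise lowers the detectability of a uniform signal.
print(hotelling_snr([1.0, 1.0], [[2.0, 1.0], [1.0, 2.0]]))
```

    In a CDMAM-style analysis one would scale Δs with the gold-disc thickness and report, per disc diameter, the thickness at which this SNR crosses a chosen detectability criterion.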

  15. Ensemble Downscaling of Winter Seasonal Forecasts: The MRED Project

    NASA Astrophysics Data System (ADS)

    Arritt, R. W.; Mred Team

    2010-12-01

    The Multi-Regional climate model Ensemble Downscaling (MRED) project is a multi-institutional project that is producing large ensembles of downscaled winter seasonal forecasts from coupled atmosphere-ocean seasonal prediction models. Eight regional climate models are each downscaling 15-member ensembles from the National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) and the new NASA seasonal forecast system based on the GEOS5 atmospheric model coupled with the MOM4 ocean model. This produces 240-member ensembles, i.e., 8 regional models x 15 global ensemble members x 2 global models, for each winter season (December-April) of 1982-2003. Results to date show that combined global-regional downscaled forecasts have greatest skill for seasonal precipitation anomalies during strong El Niño events such as 1982-83 and 1997-98. Ensemble means of area-averaged seasonal precipitation for the regional models generally track the corresponding results for the global model, though there is considerable inter-model variability amongst the regional models. For seasons and regions where area mean precipitation is accurately simulated, the regional models bring added value by extracting greater spatial detail from the global forecasts, mainly due to better resolution of terrain in the regional models. Our results also emphasize that an ensemble approach is essential to realizing the added value from the combined global-regional modeling system.

  16. INSPIRE Project (IoNospheric Sounding for Pre-seismic anomalies Identification REsearch): Main Results and Future Prospects

    NASA Astrophysics Data System (ADS)

    Pulinets, S. A.; Andrzej, K.; Hernandez-Pajares, M.; Cherniak, I.; Zakharenkova, I.; Rothkaehl, H.; Davidenko, D.

    2017-12-01

    The INSPIRE project is dedicated to the study of physical processes and their effects in the ionosphere that could be identified as earthquake precursors, together with a detailed description of the methodology for defining ionospheric pre-seismic anomalies. It was initiated by ESA and carried out by an international consortium. The physical mechanisms generating ionospheric pre-seismic anomalies, from the ground up to ionospheric altitudes, were formulated within the framework of the Lithosphere-Atmosphere-Ionosphere-Magnetosphere Coupling (LAIMC) model (Pulinets et al., 2015). A general algorithm for the identification of ionospheric precursors was formalized, which also takes into account the external Space Weather factors able to generate false alarms. The importance of a special stable pattern called the "precursor mask", based on the self-similarity of pre-seismic ionospheric variations, was highlighted. The role of expert judgment in interpreting pre-seismic anomalies when issuing a seismic warning is important as well. The performance of the LAIMC seismo-ionospheric effect detection algorithm has been demonstrated using the L'Aquila 2009 earthquake as a case study. The results of the INSPIRE project have demonstrated that ionospheric anomalies registered before strong earthquakes could be used as reliable precursors. A detailed classification of the pre-seismic anomalies in different regions of the ionosphere was presented, and their signatures as detected by ground- and satellite-based instruments were described, which clarified the methodology for identifying precursors from multi-instrument ionospheric measurements. A configuration for a dedicated multi-observation experiment and satellite payload was proposed for the future implementation of the INSPIRE project results. 
    In this regard the multi-instrument set can be divided into two groups: space equipment and ground-based support, which could be used for real-time monitoring. Together with the scientific and technical tasks, a set of political, logistical and administrative problems (including certification of the approaches by the seismological community and legal procedures by governmental authorities) must be resolved before real earthquake forecasting can be put into practice.

  17. Short-Range Prediction of Monsoon Precipitation by NCMRWF Regional Unified Model with Explicit Convection

    NASA Astrophysics Data System (ADS)

    Mamgain, Ashu; Rajagopal, E. N.; Mitra, A. K.; Webster, S.

    2018-03-01

    There are increasing efforts towards the prediction of high-impact weather systems and the understanding of related dynamical and physical processes. High-resolution numerical model simulations can be used directly to model the impact in fine-scale detail, and improvements in forecast accuracy can help in disaster-management planning and execution. The National Centre for Medium Range Weather Forecasting (NCMRWF) has implemented a high-resolution regional unified modeling system with explicit convection, embedded within a coarser-resolution global model with parameterized convection. The model configurations are based on the UK Met Office seamless unified modeling system. Recent land use/land cover data (2012-2013) obtained from the Indian Space Research Organisation (ISRO) are also used in the model simulations. Results based on a month of short-range forecasts from both the global and regional models over India indicate that convection-permitting simulations by the high-resolution regional model are able to reduce the dry bias over the southern parts of the West Coast and the monsoon trough zone, with more intense rainfall mainly towards the northern parts of the monsoon trough zone. The regional model with explicit convection has significantly improved the phase of the diurnal cycle of rainfall as compared to the global model. Results from two monsoon depression cases during the study period show substantial improvement in the details of the rainfall pattern. Many rainfall categories defined for operational forecast purposes by Indian forecasters are also well represented in the convection-permitting high-resolution simulations. For statistics of the number of days within the rain categories between `No-Rain' and `Heavy Rain', the regional model outperforms the global model in all ranges. In the very heavy and extremely heavy categories, the regional simulations overestimate the number of rainfall days. 
    The global model with parameterized convection, by contrast, tends to overestimate light-rainfall days and underestimate heavy-rain days compared to the observations.

  18. Development of Regional Power Sector Coal Fuel Costs (Prices) for the Short-Term Energy Outlook (STEO) Model

    EIA Publications

    2017-01-01

    The U.S. Energy Information Administration's Short-Term Energy Outlook (STEO) produces monthly projections of energy supply, demand, trade, and prices over a 13-24 month period. Every January, the forecast horizon is extended through December of the following year. The STEO model is an integrated system of econometric regression equations and identities that link data on the various components of the U.S. energy industry together in order to develop consistent forecasts. The regression equations are estimated and the STEO model is solved using the EViews 9.5 econometric software package from IHS Global Inc. The model consists of various modules specific to each energy resource. All modules provide projections for the United States, and some modules provide more detailed forecasts for different regions of the country.

  19. The quality and value of seasonal precipitation forecasts for an early warning of large-scale droughts and floods in West Africa

    NASA Astrophysics Data System (ADS)

    Bliefernicht, Jan; Seidel, Jochen; Salack, Seyni; Waongo, Moussa; Laux, Patrick; Kunstmann, Harald

    2017-04-01

    Seasonal precipitation forecasts are a crucial source of information for early warning of hydro-meteorological extremes in West Africa. However, the current seasonal forecasting system used by the West African weather services in the framework of the West African Climate Outlook Forum (PRESAO) is limited to probabilistic precipitation forecasts with a 1-month lead time. To improve this provision, we use an ensemble-based quantile-quantile transformation for bias correction of precipitation forecasts provided by a global seasonal ensemble prediction system, the Climate Forecast System Version 2 (CFS2). The statistical technique eliminates systematic differences between global forecasts and observations, with the potential to preserve the signal from the model. It also has the advantage that it can be implemented easily at national weather services with limited capacity. The technique is used to generate probabilistic forecasts of monthly and seasonal precipitation amounts and other precipitation indices useful for early warning of large-scale droughts and floods in West Africa. The evaluation is done using CFS hindcasts (1982 to 2009) in cross-validation mode to determine the performance of the precipitation forecasts for several lead times, focusing on drought and flood events over the Volta and Niger basins. In addition, operational forecasts provided by PRESAO are analyzed from 1998 to 2015. The precipitation forecasts are compared to low-skill reference forecasts generated from gridded observations (e.g., GPCC, CHIRPS) and a novel in-situ gauge database from national observation networks (see Poster EGU2017-10271). The forecasts are evaluated using state-of-the-art verification techniques to determine specific quality attributes of probabilistic forecasts such as reliability, accuracy and skill. In addition, cost-loss approaches are used to determine the value of probabilistic forecasts for multiple users in warning situations. The outcomes of the hindcast experiment for the Volta basin illustrate that the statistical technique can clearly improve the CFS precipitation forecasts, with the potential to provide skillful and valuable early precipitation warnings for large-scale drought and flood situations several months ahead. In this presentation we give a detailed overview of the ensemble-based quantile-quantile transformation, its validation and verification, and the possibilities of this technique to complement PRESAO. We also highlight the performance of the technique for extremes such as the Sahel drought of the 1980s, and compare it against the various reference data sets (e.g., CFS2, PRESAO, observational data sets) used in this study.
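
    As a rough illustration of the kind of bias correction involved (the paper's ensemble-based variant is more elaborate), an empirical quantile-quantile mapping can be sketched as follows; all data below are synthetic:

```python
import numpy as np

def quantile_map(fcst, model_clim, obs_clim):
    """Map forecast values through the model CDF into observed quantiles.

    fcst       : forecast values to correct
    model_clim : historical model (hindcast) values defining the model CDF
    obs_clim   : historical observations defining the target distribution
    """
    model_sorted = np.sort(model_clim)
    obs_sorted = np.sort(obs_clim)
    # Empirical non-exceedance probability of each forecast value
    # under the model climatology.
    p = np.searchsorted(model_sorted, fcst, side="right") / len(model_sorted)
    p = np.clip(p, 1.0 / len(obs_sorted), 1.0)
    # Invert the observed empirical CDF at those probabilities.
    idx = np.ceil(p * len(obs_sorted)).astype(int) - 1
    return obs_sorted[idx]

# Toy example: the model systematically overpredicts rainfall by a factor of 2.
rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 10.0, size=1000)
model = 2.0 * rng.gamma(2.0, 10.0, size=1000)
corrected = quantile_map(model, model, obs)
```

    Mapping the hindcast through its own climatology, as here, recovers the observed distribution; in forecast mode the same mapping is applied to new model output.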

  20. Calibration of Ocean Forcing with satellite Flux Estimates (COFFEE)

    NASA Astrophysics Data System (ADS)

    Barron, Charlie; Jan, Dastugue; Jackie, May; Rowley, Clark; Smith, Scott; Spence, Peter; Gremes-Cordero, Silvia

    2016-04-01

    Predicting the evolution of ocean temperature in regional ocean models depends on estimates of surface heat fluxes and upper-ocean processes over the forecast period. Within the COFFEE project (Calibration of Ocean Forcing with satellite Flux Estimates), real-time satellite observations are used to estimate shortwave, longwave, sensible, and latent air-sea heat flux corrections to a background estimate from the prior day's regional or global model forecast. These satellite-corrected fluxes are used to prepare a corrected ocean hindcast and to estimate flux error covariances that project the heat flux corrections into a 3-5 day forecast. In this way, satellite remote sensing is applied not only to inform the initial ocean state but also to mitigate errors in surface heat fluxes and in model representations affecting the distribution of heat in the upper ocean. While traditional assimilation of sea surface temperature (SST) observations re-centers ocean models at the start of each forecast cycle, COFFEE endeavors to appropriately partition and reduce forecast error among the various surface heat flux and ocean dynamics sources. A suite of experiments in the southern California Current demonstrates a range of COFFEE capabilities, showing the impact on forecast error relative to a baseline three-dimensional variational (3DVAR) assimilation using operational global or regional atmospheric forcing. Experiment cases combine different levels of flux calibration with assimilation alternatives: the cases use the original fluxes, apply full satellite corrections during the forecast period, or extend hindcast corrections into the forecast period. Assimilation is either baseline 3DVAR or standard strong-constraint 4DVAR, with work proceeding on a 4DVAR expanded to include a weak-constraint treatment of the surface flux errors. Covariance of flux errors is estimated from the recent time series of forecast and calibrated flux terms. Although the examples shown are from the California Current, the approach is equally applicable to other regions. These approaches within a 3DVAR application are anticipated to be useful for global and larger regional domains where a full 4DVAR methodology may be cost-prohibitive.
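
    The covariance-estimation step (building flux error statistics from a recent time series of forecast and calibrated fluxes) can be sketched as a simple sample-covariance estimate; the component choices and all numbers below are hypothetical:

```python
import numpy as np

# Hypothetical 30-day time series of "true" (satellite-calibrated) and
# forecast fluxes (W m^-2) for two components: shortwave and latent heat.
rng = np.random.default_rng(1)
calibrated = rng.normal([200.0, -100.0], 20.0, size=(30, 2))
forecast = calibrated + rng.multivariate_normal(
    mean=[5.0, -3.0],                      # systematic flux bias
    cov=[[25.0, 10.0], [10.0, 16.0]],      # correlated flux errors
    size=30)

errors = forecast - calibrated             # forecast-minus-calibrated residuals
bias = errors.mean(axis=0)                 # mean correction to project forward
cov = np.cov(errors, rowvar=False)         # flux error covariance estimate
```

    The estimated bias and covariance would then be used to extend hindcast flux corrections into the forecast window.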

  1. Satellite-based Calibration of Heat Flux at the Ocean Surface

    NASA Astrophysics Data System (ADS)

    Barron, C. N.; Dastugue, J. M.; May, J. C.; Rowley, C. D.; Smith, S. R.; Spence, P. L.; Gremes-Cordero, S.

    2016-02-01

    Model forecasts of upper ocean heat content and variability on diurnal to daily scales are highly dependent on estimates of heat flux through the air-sea interface. Satellite remote sensing is applied not only to inform the initial ocean state but also to mitigate errors in surface heat fluxes and in model representations affecting the distribution of heat in the upper ocean. Traditional assimilation of sea surface temperature (SST) observations re-centers ocean models at the start of each forecast cycle. Subsequent evolution depends on estimates of surface heat fluxes and upper-ocean processes over the forecast period. The COFFEE project (Calibration of Ocean Forcing with satellite Flux Estimates) endeavors to correct ocean forecast bias through a responsive error partition among surface heat flux and ocean dynamics sources. A suite of experiments in the southern California Current demonstrates a range of COFFEE capabilities, showing the impact on forecast error relative to a baseline three-dimensional variational (3DVAR) assimilation using Navy operational global or regional atmospheric forcing. COFFEE addresses satellite calibration of surface fluxes to estimate surface error covariances and links these to the ocean interior. Experiment cases combine different levels of flux calibration with different assimilation alternatives. The cases may use the original fluxes, apply full satellite corrections during the forecast period, or extend hindcast corrections into the forecast period. Assimilation is either baseline 3DVAR or standard strong-constraint 4DVAR, with work proceeding on a 4DVAR expanded to include a weak-constraint treatment of the surface flux errors. Covariance of flux errors is estimated from the recent time series of forecast and calibrated flux terms. Although the examples shown are from the California Current, the approach is equally applicable to other regions. These approaches within a 3DVAR application are anticipated to be useful for global and larger regional domains where a full 4DVAR methodology may be cost-prohibitive.

  2. Prediction of ENSO episodes using canonical correlation analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnston, A.G.; Ropelewski, C.F.

    Canonical correlation analysis (CCA) is explored as a multivariate linear statistical methodology with which to forecast fluctuations of the El Nino/Southern Oscillation (ENSO) in real time. CCA is capable of identifying critical sequences of predictor patterns that tend to evolve into subsequent patterns that can be used to form a forecast. The CCA model is used to forecast the 3-month mean sea surface temperature (SST) in several regions of the tropical Pacific and Indian oceans for projection times of 0 to 4 seasons beyond the immediately forthcoming season. The predictor variables, representing the climate situation in the four consecutive 3-month periods ending at the time of the forecast, are (1) quasi-global seasonal mean sea level pressure (SLP) and (2) SST in the predicted regions themselves. Forecast skill is estimated using cross-validation, and persistence is used as the primary skill control measure. Results indicate that a large region in the eastern equatorial Pacific (120°-170°W) has the highest overall predictability, with excellent skill realized for winter forecasts made at the end of summer. CCA outperforms persistence in this region under most conditions, and does noticeably better with SST included as a predictor in addition to SLP. It is demonstrated that better forecast performance at the longer lead times would be obtained if some significantly earlier (i.e., up to 4 years) predictor data were included, because the ability to predict the lower-frequency ENSO phase changes would increase. The good performance of the current system at shorter lead times appears to be based largely on the ability to predict ENSO evolution for events already in progress. The forecasting of the eastern tropical Pacific SST using CCA is now done routinely on a monthly basis for 0-, 1-, and 2-season leads at the Climate Analysis Center.
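
    The core CCA computation can be sketched compactly; a minimal numpy implementation of canonical correlations between two centered fields via SVD whitening (toy data, not the paper's SLP/SST fields) might look like:

```python
import numpy as np

def cca(X, Y):
    """Canonical correlations between column-centered fields X and Y."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    # Whiten each field with its thin SVD, then correlate the subspaces:
    # the canonical correlations are the singular values of Ux^T Uy.
    Ux, _, _ = np.linalg.svd(X, full_matrices=False)
    Uy, _, _ = np.linalg.svd(Y, full_matrices=False)
    s = np.linalg.svd(Ux.T @ Uy, compute_uv=False)
    return np.clip(s, 0.0, 1.0)  # largest correlation first

# Toy predictor and predictand fields sharing one common mode in time.
rng = np.random.default_rng(2)
t = rng.normal(size=100)                       # shared time series
X = np.outer(t, [1.0, -0.5, 0.3]) + 0.1 * rng.normal(size=(100, 3))
Y = np.outer(t, [0.8, 0.4]) + 0.1 * rng.normal(size=(100, 2))
rho = cca(X, Y)
```

    In practice the fields are first truncated with EOFs (principal components) before CCA, which this sketch omits.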

  3. Updating of states in operational hydrological models

    NASA Astrophysics Data System (ADS)

    Bruland, O.; Kolberg, S.; Engeland, K.; Gragne, A. S.; Liston, G.; Sand, K.; Tøfte, L.; Alfredsen, K.

    2012-04-01

    Operationally, the main purpose of hydrological models is to provide runoff forecasts. The quality of the model state and the accuracy of the weather forecast, together with the model quality, define the runoff forecast quality. Input and model errors accumulate over time and may leave the model in a poor state. Usually model states can be related to observable conditions in the catchment, and updating these states, knowing their relation to observable catchment conditions, directly influences the forecast quality. Norway is at the international forefront of hydropower scheduling on both short and long time horizons, and inflow forecasts are fundamental to this scheduling. Their quality directly influences the producers' profit, as producers optimize hydropower production against market demand while minimizing spill of water and maximizing available hydraulic head. The quality of the inflow forecasts strongly depends on the quality of the models applied and of the information they use. In this project the focus has been on improving the quality of the model states on which the forecast is based. Runoff and snow storage are two observable quantities that reflect the model state and are used in this project for updating. Generally, the methods used can be divided into three groups: the first re-estimates the forcing data in the updating period; the second alters the weights in the forecast ensemble; and the third directly changes the model states. The uncertainty in the forcing data through the updating period is due both to uncertainty in the actual observations and to how well the gauging stations represent the catchment with respect to temperature and precipitation. The project looks at methodologies that automatically re-estimate the forcing data and tests the results against the observed response. Model uncertainty is reflected in a joint distribution of model parameters estimated using the DREAM algorithm.
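
    Of the three groups of updating methods, the second (altering the weights in the forecast ensemble) can be sketched as a simple likelihood re-weighting against an observed runoff value; all numbers below are hypothetical:

```python
import numpy as np

# Hypothetical ensemble of simulated runoff (m^3/s) at the forecast start,
# together with the observed runoff at the same time.
sim_runoff = np.array([12.0, 15.5, 9.8, 14.2, 20.1])
obs_runoff = 14.0
obs_sigma = 1.5          # assumed observation error (m^3/s)

# Gaussian likelihood of each member given the observation ...
w = np.exp(-0.5 * ((sim_runoff - obs_runoff) / obs_sigma) ** 2)
w /= w.sum()             # ... normalized to updated ensemble weights

updated_forecast = float(np.dot(w, sim_runoff))  # weighted ensemble mean
```

    Members close to the observation dominate the updated forecast, while distant members are effectively discounted.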

  4. Short-term load forecasting of power system

    NASA Astrophysics Data System (ADS)

    Xu, Xiaobin

    2017-05-01

    To ensure that power system optimization rests on a sound basis, load forecasting accuracy must be improved. Power system load forecasting starts from accurate statistical and survey data on the history and current state of electricity consumption and uses scientific methods to predict the future trend and governing patterns of power load. Short-term load forecasting is the basis of power system operation and analysis, and is of great significance for unit commitment, economic dispatch, and safety checks. This paper therefore examines power system load forecasting in detail. First, data from 2012 to 2014 are used to build a partial least squares regression model of the relationship between daily maximum load, daily minimum load, daily average load, and individual meteorological factors; by inspecting the histogram of regression coefficients, daily maximum temperature, daily minimum temperature, and daily average temperature are selected as the meteorological factors for improving load forecasting accuracy. Second, ignoring climate effects, a time series model is used to predict the 2015 load data: the 2009-2014 load data are organized, and the data for the corresponding periods of 2015 are forecast from the previous six years. The criterion for prediction accuracy is the standard deviation of the prediction results relative to the average load of the previous six years. Finally, taking climate effects into account, a BP neural network model is used to predict the 2015 data, and the forecast results are optimized on the basis of the time series model.
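
    The regression step can be sketched with ordinary least squares standing in for partial least squares (all data and coefficients below are synthetic, not the paper's):

```python
import numpy as np

# Hypothetical daily records: [max temp, min temp, mean temp] (deg C)
# and the corresponding daily maximum load (MW).
rng = np.random.default_rng(3)
temps = rng.uniform([25, 15, 20], [38, 26, 32], size=(60, 3))
load = 900.0 + 12.0 * temps[:, 0] + 5.0 * temps[:, 2] + rng.normal(0, 10, 60)

# Regress load on the meteorological factors (intercept via a ones column).
A = np.column_stack([np.ones(len(temps)), temps])
coef, *_ = np.linalg.lstsq(A, load, rcond=None)

# Predict tomorrow's maximum load for an assumed forecast of the factors.
tomorrow = np.array([1.0, 36.0, 24.0, 30.0])
pred = float(tomorrow @ coef)
```

    Partial least squares would additionally project the correlated temperature factors onto a few latent components before regressing, which matters when predictors are nearly collinear.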

  5. The Impact of the Assimilation of AIRS Radiance Measurements on Short-term Weather Forecasts

    NASA Technical Reports Server (NTRS)

    McCarty, Will; Jedlovec, Gary; Miller, Timothy L.

    2009-01-01

    Advanced spaceborne instruments have the ability to improve the horizontal and vertical characterization of temperature and water vapor in the atmosphere through the explicit use of hyperspectral thermal infrared radiance measurements. The incorporation of these measurements into a data assimilation system provides a means to continuously characterize a three-dimensional, instantaneous atmospheric state necessary for the time integration of numerical weather forecasts. Measurements from the National Aeronautics and Space Administration (NASA) Atmospheric Infrared Sounder (AIRS) are incorporated into the gridpoint statistical interpolation (GSI) three-dimensional variational (3D-Var) assimilation system to provide improved initial conditions for use in a mesoscale modeling framework mimicking that of the operational North American Mesoscale (NAM) model. The methodologies for the incorporation of the measurements into the system are presented. Though the measurements have been shown to have a positive impact in global modeling systems, the measurements are further constrained in this system as the model top is physically lower than the global systems and there is no ozone characterization in the background state. For a study period, the measurements are shown to have positive impact on both the analysis state as well as subsequently spawned short-term (0-48 hr) forecasts, particularly in forecasted geopotential height and precipitation fields. At 48 hr, height anomaly correlations showed an improvement in forecast skill of 2.3 hours relative to a system without the AIRS measurements. Similarly, the equitable threat and bias scores of precipitation forecasts of 25 mm (6 hr)^-1 were shown to be improved by 8% and 7%, respectively.
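
    The verification scores mentioned (equitable threat score and frequency bias) follow directly from a 2x2 contingency table; a minimal sketch with hypothetical counts:

```python
import numpy as np

def ets_and_bias(hits, false_alarms, misses, correct_negatives):
    """Equitable threat score and frequency bias from a 2x2 contingency table."""
    total = hits + false_alarms + misses + correct_negatives
    # Hits expected by chance for a random forecast with the same marginals.
    hits_random = (hits + false_alarms) * (hits + misses) / total
    ets = (hits - hits_random) / (hits + false_alarms + misses - hits_random)
    bias = (hits + false_alarms) / (hits + misses)
    return ets, bias

# Hypothetical counts for a 25 mm / 6 hr precipitation threshold.
ets, bias = ets_and_bias(hits=30, false_alarms=20, misses=15,
                         correct_negatives=935)
```

    An ETS of 1 is a perfect forecast and 0 is no better than chance; a frequency bias above 1 means the event is forecast more often than observed.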

  6. Assimilating InSAR Maps of Water Vapor to Improve Heavy Rainfall Forecasts: A Case Study With Two Successive Storms

    NASA Astrophysics Data System (ADS)

    Mateus, Pedro; Miranda, Pedro M. A.; Nico, Giovanni; Catalão, João; Pinto, Paulo; Tomé, Ricardo

    2018-04-01

    Very high resolution precipitable water vapor maps obtained by the Sentinel-1A synthetic aperture radar (SAR), using the SAR interferometry (InSAR) technique, are here shown to have a positive impact on the performance of severe weather forecasts. A case study of deep convection which affected the city of Adra, Spain, on 6-7 September 2015, is successfully forecasted by the Weather Research and Forecasting model initialized with InSAR data assimilated by the three-dimensional variational technique, with improved space and time distributions of precipitation, as observed by the local weather radar and rain gauges. This case study is exceptional because it consisted of two severe events 12 hr apart, with a timing that allows for the assimilation of both the ascending and descending satellite images, one for the initialization of each event. The same methodology applied to the network of Global Navigation Satellite System observations in Iberia, at the same times, failed to reproduce the observed precipitation, although it too improved the forecast skill, in a more modest way. The impact of precipitable water vapor data is shown to result from a direct increment of convective available potential energy, associated with important adjustments in the low-level wind field, favoring its release in deep convection. It is suggested that InSAR images, complemented by dense Global Navigation Satellite System data, may provide a new source of water vapor data for weather forecasting, since their sampling frequency could reach the subdaily scale by merging different SAR platforms, or when future geosynchronous radar missions become operational.

  7. Accelerated Aging in Electrolytic Capacitors for Prognostics

    NASA Technical Reports Server (NTRS)

    Celaya, Jose R.; Kulkarni, Chetan; Saha, Sankalita; Biswas, Gautam; Goebel, Kai Frank

    2012-01-01

    The focus of this work is the analysis of different degradation phenomena based on thermal overstress and electrical overstress accelerated aging systems, and the use of accelerated aging techniques for prognostics algorithm development. Results on thermal overstress and electrical overstress experiments are presented. In addition, preliminary results toward the development of physics-based degradation models are presented, focusing on the electrolyte evaporation failure mechanism. An empirical degradation model based on percentage capacitance loss under electrical overstress is presented and used in: (i) a Bayesian implementation of model-based prognostics using a discrete Kalman filter for health state estimation, and (ii) a dynamic system representation of the degradation model for forecasting and remaining useful life (RUL) estimation. A leave-one-out validation methodology is used to assess the validity of the approach under the small sample size constraint. The results observed for the RUL estimation are consistent throughout the validation tests, comparing relative accuracy and prediction error. It has been observed that the inaccuracy of the model in representing the change in degradation behavior observed at the end of the test data is consistent throughout the validation tests, indicating the need for a more detailed degradation model or for an algorithm that could estimate model parameters on-line. The need for more sophisticated degradation models is further supported by the degradation behavior observed under different stress intensities with rest periods: the current degradation model does not represent the capacitance recovery over rest periods following an accelerated aging stress period.
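
    A minimal sketch of discrete-Kalman-filter health state estimation with an RUL forecast, in the spirit of the approach described above; the linear loss-rate degradation model, noise settings, and failure threshold are all assumed for illustration:

```python
import numpy as np

# Track the state [capacitance loss (%), loss rate (%/step)] with a
# discrete Kalman filter; the linear-growth model is hypothetical.
F = np.array([[1.0, 1.0], [0.0, 1.0]])     # state transition (one aging step)
H = np.array([[1.0, 0.0]])                 # only the loss itself is measured
Q = np.diag([1e-4, 1e-5])                  # process noise
R = np.array([[0.05]])                     # measurement noise

x = np.array([0.0, 0.1])                   # initial loss and rate guess
P = np.eye(2)                              # initial state covariance

rng = np.random.default_rng(4)
true_loss = 0.2 * np.arange(1, 41)         # hidden degradation trend
for z in true_loss + rng.normal(0, 0.2, 40):
    # Predict step.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update step with the noisy capacitance-loss measurement z.
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ (np.atleast_1d(z) - H @ x)
    P = (np.eye(2) - K @ H) @ P

# RUL: aging steps until the loss crosses an assumed 20% failure threshold.
rul = (20.0 - x[0]) / x[1]
```

    Extrapolating the filtered loss rate to a failure threshold is the simplest possible RUL rule; it cannot, as the abstract notes for the empirical model, capture recovery during rest periods.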

  8. Cognitive methodology for forecasting oil and gas industry using pattern-based neural information technologies

    NASA Astrophysics Data System (ADS)

    Gafurov, O.; Gafurov, D.; Syryamkin, V.

    2018-05-01

    The paper analyses the field of computer science formed at the intersection of artificial intelligence, mathematical statistics, and database theory, referred to as "Data Mining" (knowledge discovery in data). The theory of neural networks is applied along with classical methods of mathematical analysis and numerical simulation. The paper describes the technique protected by a patent of the Russian Federation for the invention "A Method for Determining Location of Production Wells during the Development of Hydrocarbon Fields" [1-3] and implemented using the geoinformation system NeuroInformGeo; the technique has no analogues in Russian or international practice. The paper gives an example comparing a forecast of oil reservoir quality made by a geophysicist interpreter using standard methods with a forecast made using this technology. The technical result is increased efficiency, effectiveness, and environmental compatibility of mineral deposit development, and the discovery of a new oil deposit.

  9. Near-field hazard assessment of March 11, 2011 Japan Tsunami sources inferred from different methods

    USGS Publications Warehouse

    Wei, Y.; Titov, V.V.; Newman, A.; Hayes, G.; Tang, L.; Chamberlin, C.

    2011-01-01

    The tsunami source is the origin of the subsequent transoceanic water waves, and thus the most critical component in modern tsunami forecast methodology. Although impractical to quantify directly, a tsunami source can be estimated by different methods based on a variety of measurements provided by deep-ocean tsunameters, seismometers, GPS, and other advanced instruments, some in real time, some in post-real time. Here we assess different sources of the devastating March 11, 2011 Japan tsunami by model-data comparison for generation, propagation, and inundation in the near field of Japan. This comparative study furthers understanding of the advantages and shortcomings of different methods that may potentially be used in real-time warning and forecasting of tsunami hazards, especially in the near field. The model study also highlights the critical role of deep-ocean tsunami measurements in high-quality tsunami forecasts; their combination with land GPS measurements may lead to better understanding of both earthquake mechanisms and the tsunami generation process. © 2011 MTS.

  10. Time series modelling of global mean temperature for managerial decision-making.

    PubMed

    Romilly, Peter

    2005-07-01

    Climate change has important implications for business and economic activity. Effective management of climate change impacts will depend on the availability of accurate and cost-effective forecasts. This paper uses univariate time series techniques to model the properties of a global mean temperature dataset in order to develop a parsimonious forecasting model for managerial decision-making over the short-term horizon. Although the model is estimated on global temperature data, the methodology could also be applied to temperature data at more localised levels. The statistical techniques include seasonal and non-seasonal unit root testing with and without structural breaks, as well as ARIMA and GARCH modelling. A forecasting evaluation shows that the chosen model performs well against rival models. The estimation results confirm the findings of a number of previous studies, namely that global mean temperatures increased significantly throughout the 20th century. The use of GARCH modelling also shows the presence of volatility clustering in the temperature data, and a positive association between volatility and global mean temperature.
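
    The volatility clustering that the paper's GARCH modelling detects can be illustrated with a minimal simulation of a GARCH(1,1) process; the parameters below are hypothetical, not fitted to the paper's temperature data:

```python
import numpy as np

# Simulate a GARCH(1,1) process: the conditional variance sigma2[t]
# depends on the previous squared shock and the previous variance.
omega, alpha, beta = 0.05, 0.1, 0.85       # stationary: alpha + beta < 1
rng = np.random.default_rng(5)
n = 2000
eps = np.empty(n)
sigma2 = np.empty(n)
sigma2[0] = omega / (1.0 - alpha - beta)   # unconditional variance
eps[0] = rng.normal(0.0, np.sqrt(sigma2[0]))
for t in range(1, n):
    sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    eps[t] = rng.normal(0.0, np.sqrt(sigma2[t]))

def lag1_corr(x):
    """Lag-1 autocorrelation of a series."""
    return float(np.corrcoef(x[:-1], x[1:])[0, 1])

# Volatility clustering: squared residuals are autocorrelated even
# though the residuals themselves are (close to) white noise.
clustering = lag1_corr(eps ** 2)
whiteness = lag1_corr(eps)
```

    In the paper's setting, eps would be the residuals of the fitted ARIMA temperature model rather than a simulated series.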

  11. A combination of HARMONIE short time direct normal irradiance forecasts and machine learning: The #hashtdim procedure

    NASA Astrophysics Data System (ADS)

    Gastón, Martín; Fernández-Peruchena, Carlos; Körnich, Heiner; Landelius, Tomas

    2017-06-01

    The present work describes a first version of a new procedure to forecast Direct Normal Irradiance (DNI): #hashtdim, which combines ground information with numerical weather predictions. The system is aimed at generating predictions for the very short term. It combines the outputs of the Numerical Weather Prediction model HARMONIE with an adaptive methodology based on machine learning. The DNI predictions are generated at 15-minute and hourly temporal resolutions and are updated every 3 hours. Each update offers forecasts for the next 12 hours; the first nine hours are generated at 15-minute temporal resolution, while the last three hours have hourly temporal resolution. The system is evaluated at a site in southern Spain with an operational BSRN station (the PSA station). The #hashtdim procedure has been implemented in the framework of the Direct Normal Irradiance Nowcasting methods for optimized operation of concentrating solar technologies (DNICast) project, under the European Union's Seventh Framework Programme for research, technological development and demonstration.

  12. Electron Flux Models for Different Energies at Geostationary Orbit

    NASA Technical Reports Server (NTRS)

    Boynton, R. J.; Balikhin, M. A.; Sibeck, D. G.; Walker, S. N.; Billings, S. A.; Ganushkina, N.

    2016-01-01

    Forecast models were derived for energetic electrons at all energy ranges sampled by the third-generation Geostationary Operational Environmental Satellites (GOES). These models were based on Multi-Input Single-Output Nonlinear Autoregressive Moving Average with Exogenous inputs (NARMAX) methodologies. The model inputs include the solar wind velocity, density and pressure, the fraction of time that the interplanetary magnetic field (IMF) was southward, the IMF contribution of a solar wind-magnetosphere coupling function proposed by Boynton et al. (2011b), and the Dst index. As such, this study has deduced five new 1 h resolution models for the low-energy electrons measured by GOES (30-50 keV, 50-100 keV, 100-200 keV, 200-350 keV, and 350-600 keV) and extended the existing >800 keV and >2 MeV geostationary Earth orbit electron flux models to forecast at a 1 h resolution. All of these models were shown to provide accurate forecasts, with prediction efficiencies ranging between 66.9% and 82.3%.
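
    Full NARMAX structure detection is beyond a short sketch, but the flavor of a lagged input-output model can be shown with a NARX-style least-squares fit on synthetic data; all coefficients, lags, and the prediction-efficiency measure below are illustrative assumptions:

```python
import numpy as np

# Synthetic hourly driver (solar wind speed, km/s) and a lagged
# log-flux-like response, standing in for the full NARMAX model.
rng = np.random.default_rng(6)
n = 500
v = 400.0 + 100.0 * rng.random(n)
flux = np.empty(n)
flux[:2] = 5.0
for t in range(2, n):
    flux[t] = (0.6 * flux[t - 1] + 0.005 * v[t - 1] + 0.002 * v[t - 2]
               + rng.normal(0, 0.05))

# NARX-style design matrix: lagged output and lagged exogenous inputs.
A = np.column_stack([flux[1:-1], v[1:-1], v[:-2]])
theta, *_ = np.linalg.lstsq(A, flux[2:], rcond=None)
one_step = A @ theta                    # 1 h ahead predictions

# Prediction efficiency (%): variance explained by the 1-step forecast.
pe = 100.0 * (1.0 - np.var(flux[2:] - one_step) / np.var(flux[2:]))
```

    A real NARMAX identification would additionally select which lagged and nonlinear terms to keep (structure detection) and model the noise terms, rather than fixing the regressors in advance.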

  13. National Severe Storms Forecast Center

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The principal mission of the National Severe Storms Forecast Center (NSSFC) is to maintain a continuous watch of weather developments that are capable of producing severe local storms, including tornadoes, and to prepare and issue messages designated as either Weather Outlooks or Tornado or Severe Thunderstorm Watches for dissemination to the public and aviation services. In addition to its assigned responsibility at the national level, the NSSFC is involved in a number of programs at the regional and local levels. Subsequent subsections and paragraphs describe the NSSFC, its users, inputs, outputs, interfaces, capabilities, workload, problem areas, and future plans in more detail.

  14. A Modeling and Verification Study of Summer Precipitation Systems Using NASA Surface Initialization Datasets

    NASA Technical Reports Server (NTRS)

    Jonathan L. Case; Kumar, Sujay V.; Srikishen, Jayanthi; Jedlovec, Gary J.

    2010-01-01

    One of the most challenging weather forecast problems in the southeastern U.S. is daily summertime pulse-type convection. During the summer, atmospheric flow and forcing are generally weak in this region; thus, convection typically initiates in response to local forcing along sea/lake breezes, and other discontinuities often related to horizontal gradients in surface heating rates. Numerical simulations of pulse convection usually have low skill, even in local predictions at high resolution, due to the inherent chaotic nature of these precipitation systems. Forecast errors can arise from assumptions within parameterization schemes, model resolution limitations, and uncertainties in both the initial state of the atmosphere and land surface variables such as soil moisture and temperature. For this study, it is hypothesized that high-resolution, consistent representations of surface properties such as soil moisture, soil temperature, and sea surface temperature (SST) are necessary to better simulate the interactions between the surface and atmosphere, and ultimately improve predictions of summertime pulse convection. This paper describes a sensitivity experiment using the Weather Research and Forecasting (WRF) model. Interpolated land and ocean surface fields from a large-scale model are replaced with high-resolution datasets provided by unique NASA assets in an experimental simulation: the Land Information System (LIS) and Moderate Resolution Imaging Spectroradiometer (MODIS) SSTs. The LIS is run in an offline mode for several years at the same grid resolution as the WRF model to provide compatible land surface initial conditions in an equilibrium state. The MODIS SSTs provide detailed analyses of SSTs over the oceans and large lakes compared to current operational products. 
The WRF model runs initialized with the LIS+MODIS datasets result in a reduction in the overprediction of rainfall areas; however, the skill is almost equally as low in both experiments using traditional verification methodologies. Output from object-based verification within NCAR's Meteorological Evaluation Tools reveals that the WRF runs initialized with LIS+MODIS data consistently generated precipitation objects that better matched observed precipitation objects, especially at higher precipitation intensities. The LIS+MODIS runs produced on average a 4% increase in matched precipitation areas and a simultaneous 4% decrease in unmatched areas during three months of daily simulations.

  15. Extensions and applications of ensemble-of-trees methods in machine learning

    NASA Astrophysics Data System (ADS)

    Bleich, Justin

    Ensemble-of-trees algorithms have emerged at the forefront of machine learning due to their ability to generate high forecasting accuracy for a wide array of regression and classification problems. Classic ensemble methodologies such as random forests (RF) and stochastic gradient boosting (SGB) rely on algorithmic procedures to generate fits to data. In contrast, more recent ensemble techniques such as Bayesian Additive Regression Trees (BART) and Dynamic Trees (DT) rely on an underlying Bayesian probability model to generate the fits. These new probability model-based approaches show much promise versus their algorithmic counterparts, but also offer substantial room for improvement. The first part of this thesis focuses on methodological advances for ensemble-of-trees techniques, with an emphasis on the more recent Bayesian approaches. In particular, we focus on extensions of BART in four distinct ways. First, we develop a more robust implementation of BART for both research and application. We then develop a principled approach to variable selection for BART, as well as the ability to naturally incorporate prior information on important covariates into the algorithm. Next, we propose a method for handling missing data that relies on the recursive structure of decision trees and does not require imputation. Last, we relax the assumption of homoskedasticity in the BART model to allow for parametric modeling of heteroskedasticity. The second part of this thesis returns to the classic algorithmic approaches in the context of classification problems with asymmetric costs of forecasting errors. First, we consider the performance of RF and SGB more broadly and demonstrate their superiority to logistic regression for applications in criminology with asymmetric costs. Next, we use RF to forecast unplanned hospital readmissions upon patient discharge with asymmetric costs taken into account.
Finally, we explore the construction of stable decision trees for forecasts of violence during probation hearings in court systems.
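
    As an illustration of classification with asymmetric costs (the data, scores, and 5:1 cost ratio below are hypothetical, not from the thesis), the decision threshold on a classifier's scores can be tuned to minimize expected cost rather than error rate:

```python
import numpy as np

# A missed high-risk case (false negative) is assumed 5x as costly
# as a false alarm (false positive).
cost_fn, cost_fp = 5.0, 1.0

rng = np.random.default_rng(7)
y = (rng.random(1000) < 0.2).astype(int)            # 20% positives
# Hypothetical model scores: informative but imperfect.
scores = np.clip(0.5 * y + rng.normal(0.3, 0.2, 1000), 0.0, 1.0)

def expected_cost(th):
    """Total asymmetric cost of classifying at threshold th."""
    pred = scores >= th
    fn = np.sum((pred == 0) & (y == 1))
    fp = np.sum((pred == 1) & (y == 0))
    return cost_fn * fn + cost_fp * fp

thresholds = np.linspace(0.01, 0.99, 99)
costs = np.array([expected_cost(th) for th in thresholds])
best = float(thresholds[np.argmin(costs)])
```

    Because false negatives are weighted more heavily, the cost-minimizing threshold sits below the one that would minimize plain misclassification error, trading extra false alarms for fewer misses.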

  16. Improving global flood risk awareness through collaborative research: Id-Lab

    NASA Astrophysics Data System (ADS)

    Weerts, A.; Zijderveld, A.; Cumiskey, L.; Buckman, L.; Verlaan, M.; Baart, F.

    2015-12-01

    Scientific and end-user collaboration on operational flood risk modelling and forecasting requires an environment where scientists and end-users can physically work together and demonstrate, enhance and learn about new tools, methods and models for forecasting and warning purposes. Deltares has therefore built a real-time demonstration, training and research infrastructure (an 'operational' room with an ICT backend). This research infrastructure supports several functions: (1) real-time response and disaster management, (2) training, (3) collaborative research, and (4) demonstration. The infrastructure will be used for a mixture of these functions on a regular basis by Deltares and by scientists and end users alike, including universities, research institutes, consultants, governments and aid agencies. It facilitates emergency advice and support during international and national disasters caused by rainfall, tropical cyclones or tsunamis. It hosts research flood and storm surge forecasting systems at global, continental and regional scales. It facilitates internal and external training for emergency and disaster management, including user training for forecasting systems such as the Delft-FEWS forecasting platform. The facility is expected to inspire and initiate creative innovations by bringing together experts from various organizations. The room hosts interactive modelling developments, participatory workshops and stakeholder meetings. State-of-the-art tools, models and software applied across the globe are available and on display within the facility. We will present the Id-Lab in detail, with particular focus on the global operational forecasting systems GLOFFIS (Global Flood Forecasting Information System) and GLOSSIS (Global Storm Surge Information System).

  17. Seasonal scale water deficit forecasting in Africa and the Middle East using NASA's Land Information System (LIS)

    NASA Astrophysics Data System (ADS)

    Peters-Lidard, C. D.; Arsenault, K. R.; Shukla, S.; Getirana, A.; McNally, A.; Koster, R. D.; Zaitchik, B. F.; Badr, H. S.; Roningen, J. M.; Kumar, S.; Funk, C. C.

    2017-12-01

    A seamless and effective water deficit monitoring and early warning system is critical for assessing food security in Africa and the Middle East. In this presentation, we report on the ongoing development and validation of a seasonal scale water deficit forecasting system based on NASA's Land Information System (LIS) and seasonal climate forecasts. First, our presentation will focus on the implementation and validation of drought and water availability monitoring products in the region. Next, it will focus on evaluating drought and water availability forecasts. Finally, details will be provided of our ongoing collaboration with end-user partners in the region (e.g., USAID's Famine Early Warning Systems Network, FEWS NET), on formulating meaningful early warning indicators, effective communication and seamless dissemination of the products through NASA's web-services. The water deficit forecasting system thus far incorporates NASA GMAO's Catchment and the Noah Multi-Physics (MP) LSMs. In addition, the LSMs' surface and subsurface runoff are routed through the Hydrological Modeling and Analysis Platform (HyMAP) to simulate surface water dynamics. To establish a climatology from 1981-2015, the two LSMs are driven by NASA/GMAO's Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2), and the USGS and UCSB Climate Hazards Group InfraRed Precipitation with Station (CHIRPS) daily rainfall dataset. Comparison of the models' energy and hydrological budgets with independent observations suggests that major droughts are well-reflected in the climatology. The system uses seasonal climate forecasts from NASA's GEOS-5 (the Goddard Earth Observing System Model-5) and NCEP's Climate Forecast System-2, and it produces forecasts of soil moisture, ET and streamflow out to 6 months in the future. Forecasts of those variables are formulated in terms of indicators to provide forecasts of drought and water availability in the region. 
Current work suggests that for the Blue Nile basin, (1) the combination of GEOS-5 and CFSv2 is equivalent in skill to the full North American Multimodel Ensemble (NMME); and (2) the seasonal water deficit forecasting system skill for both soil moisture and streamflow anomalies is greater than the standard Ensemble Streamflow Prediction (ESP) approach.
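The abstract does not give the indicator formulas used by the system; a minimal sketch of one common choice — ranking a forecast soil-moisture value against a 1981-2015 climatology to obtain a percentile-based drought indicator — might look as follows. The function name, the numbers, and the 20th-percentile drought threshold are illustrative assumptions, not the system's actual definitions.

```python
import numpy as np

def soil_moisture_percentile(forecast, climatology):
    """Rank a forecast soil-moisture value against a historical
    climatology (e.g., 1981-2015) and return its percentile.
    Values in the lowest ~20th percentile are commonly flagged
    as drought-indicative."""
    climatology = np.asarray(climatology, dtype=float)
    # Fraction of climatological values at or below the forecast value
    return 100.0 * np.mean(climatology <= forecast)

# Illustrative climatology of soil moisture (m3/m3) for one grid cell
clim = np.array([0.12, 0.18, 0.22, 0.25, 0.27, 0.30, 0.33])
pct = soil_moisture_percentile(0.15, clim)
if pct < 20.0:
    print(f"percentile {pct:.1f}: drought indicator triggered")
```

The same ranking can be applied to forecast streamflow or ET anomalies, turning raw model output into the kind of categorical early-warning indicator the abstract describes.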

  18. A methodology for long range prediction of air transportation

    NASA Technical Reports Server (NTRS)

    Ayati, M. B.; English, J. M.

    1980-01-01

    The paper describes a methodology for long-term projection of aircraft fuel requirements. A new treatment of the social and economic factors shaping the future aviation industry is presented, providing an estimate of predicted fuel usage; it includes air traffic forecasts and the lead times for producing new engines and aircraft types. An air transportation model is then developed in terms of an abstracted set of variables that represent the entire aircraft industry at a macroscale. The model was evaluated by testing its output variables against historical data from the past decades.
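The abstract gives no equations; a minimal sketch of the macro-scale idea — fitting a relation between an aggregate industry variable and fuel usage on historical data, then projecting it forward from a traffic forecast — might look as follows. The variables, the numbers, and the linear form are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical historical record (illustrative numbers, not from the paper):
# annual air traffic (1e9 revenue passenger-km) and fuel usage (1e9 litres)
traffic = np.array([460.0, 560.0, 620.0, 710.0, 850.0])
fuel    = np.array([38.0,  44.0,  47.0,  52.0,  60.0])

# Fit a macro-scale linear relation fuel = a * traffic + b on the
# historical record, mirroring the paper's evaluation of model output
# against historical data
a, b = np.polyfit(traffic, fuel, deg=1)

# Project fuel requirements from a forecast of the traffic variable
projected_traffic = 1400.0
projected_fuel = a * projected_traffic + b
```

A real model of this kind would carry more abstracted variables (fleet efficiency, engine lead times), but the fit-on-history, project-forward structure is the same.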

  19. MyOcean Central Information System - Achievements and Perspectives

    NASA Astrophysics Data System (ADS)

    de Dianous, Rémi; Jolibois, Tony; Besnard, Sophie

    2015-04-01

    MyOcean (http://www.myocean.eu) is providing a pre-operational service for forecasts, analysis and expertise on ocean currents, temperature, salinity, sea level, primary ecosystems and ice coverage. Since 2009, three successive projects (MyOcean-I, MyOcean-II and MyOcean Follow-on) have been designed to prepare and lead the demonstration phases of the future Copernicus Marine Environment Monitoring Service. The main goal of these projects was to build a system of systems offering users a unique access point to European oceanographic data. Reaching this goal at the European level with 59 partners from 28 countries was a real challenge: initially, each local system had its own human processes and methodology, and its own interfaces for production and dissemination. At the end of MyOcean Follow-on, any user can connect to one web portal, browse an interactive catalogue of products and services, use one login to access all data disseminated through harmonized interfaces in a common format, and contact a single centralized service desk. In this organization the central information system plays a key role. The production of observation and forecasting data is done by 48 Production Units (PU); product download and visualisation are hosted by 26 Dissemination Units (DU). All these products and associated services are gathered in a single system that hides the intricate distributed organization of PUs and DUs. This central system will be presented in detail, notably the architectural and technological choices that were made and why, and the lessons learned over the years the system has been running, taking into account internal and external feedback. Perspectives will then be presented to sketch the future of such a system in the next Marine Copernicus Service, which is meant to be fully operational from 2015 onwards.
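The distributed-but-unified design described above can be sketched abstractly: many Production Units register their products with one central catalogue, and users resolve a product through a single access point without ever knowing which PU produced it or which DU serves it. The class and field names here are illustrative assumptions, not MyOcean's actual interfaces.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Product:
    """One catalogue entry: which PU produced it, which DU serves it."""
    product_id: str
    production_unit: str
    dissemination_url: str

class CentralCatalogue:
    """Single access point hiding the distributed PU/DU organization."""
    def __init__(self):
        self._index = {}

    def register(self, product: Product):
        # Each PU registers its products with the central system
        self._index[product.product_id] = product

    def resolve(self, product_id: str) -> str:
        # The user asks for one id and gets one harmonized endpoint,
        # regardless of which of the many DUs actually hosts the data
        return self._index[product_id].dissemination_url

catalogue = CentralCatalogue()
catalogue.register(Product("SST_GLOBAL", "PU-07", "https://du12.example/sst"))
catalogue.register(Product("SSH_MED", "PU-23", "https://du03.example/ssh"))
print(catalogue.resolve("SST_GLOBAL"))  # user needs no PU/DU knowledge
```

The real system adds authentication, a browsable catalogue and a service desk on top, but the key architectural move is the same: one index in front of many independent producers.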

  20. Using Multispectral False Color Imaging to Characterize Tropical Cyclone Structure and Environment

    NASA Astrophysics Data System (ADS)

    Cossuth, J.; Bankert, R.; Richardson, K.; Surratt, M. L.

    2016-12-01

    The Naval Research Laboratory's (NRL) tropical cyclone (TC) web page (http://www.nrlmry.navy.mil/TC.html) has provided nearly two decades of near-real-time access to TC-centric images and products for TC forecasters and enthusiasts around the world. In particular, the microwave imager and sounder data featured on the site reveal hydrometeor structure inside the storm, providing key details beyond the cloud-top information available from visible and infrared channels. New research efforts toward improving TC analysis techniques and advancing the utility of the NRL TC web page are presented. This work demonstrates results, as well as the methodology used, in developing new automated, objective satellite-based TC structure and intensity guidance and enhanced data-fusion imagery products that aim to bolster and streamline TC forecast operations. The presentation focuses on the creation and interpretation of false-color RGB composite imagery that leverages the different emissive and scattering properties of atmospheric ice, liquid, and vapor water, as well as ocean surface roughness, as seen by microwave radiometers. Specifically, a combination of near-real-time data and a standardized digital database of global TCs in microwave imagery from 1987-2012 is employed as a climatology of TC structures. The broad range of TC structures, from pinhole eyes through multiple-eyewall configurations, is characterized as resolved by passive microwave sensors. The extraction of these characteristic features from historical data also lends itself to statistical analysis. For example, histograms of brightness temperature distributions allow a rigorous examination of how structural features are conveyed in image products, enabling a better choice of colors and breakpoints as they relate to physical features. 
Such climatological work also suggests steps to better inform the near-real time application of upcoming satellite datasets to TC analyses.
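The abstract does not specify which channels enter the composite; a minimal sketch of the general technique — scaling passive-microwave brightness temperatures into the red, green, and blue channels of a false-color image — is given below. The channel selection (89-GHz H/V and 37-GHz V) and the scaling range are illustrative assumptions, not the product's actual recipe.

```python
import numpy as np

def microwave_rgb(tb89h, tb89v, tb37v, tb_min=180.0, tb_max=280.0):
    """Build a false-color RGB composite from passive-microwave
    brightness temperatures (K). Ice scattering in deep convection
    depresses 89-GHz brightness temperatures, so eyewalls and
    rainbands stand out against the warmer, emissive background."""
    def scale(tb):
        # Map the chosen Tb range linearly onto [0, 1], clipping outliers
        return np.clip((tb - tb_min) / (tb_max - tb_min), 0.0, 1.0)
    # Stack the scaled channels into an (ny, nx, 3) RGB image
    return np.dstack([scale(tb89h), scale(tb89v), scale(tb37v)])
```

In a climatological application like the one described, the `tb_min`/`tb_max` breakpoints would be chosen from histograms of the 1987-2012 brightness temperature distributions rather than fixed by hand.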
