Novel methodology for pharmaceutical expenditure forecast
Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher
2014-01-01
Background and objective The value appreciation of new drugs across countries today features a disruption that makes the historical data used for forecasting pharmaceutical expenditure unreliable. Forecasting methods have rarely addressed uncertainty. The objective of this project was to propose a methodology for pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the ‘EU Pharmaceutical expenditure forecast’; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). Methods 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings from generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Expected clinical benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis.
Results This methodology proved effective by 1) identifying the main parameters driving variations in pharmaceutical expenditure forecasts across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, biosimilar penetration and price discount, distribution chains, and the time to reach peak sales for new drugs; 2) estimating the statistical distribution of the budget impact; and 3) testing different pricing and reimbursement policy decisions on health expenditures. Conclusions This methodology was independent of historical data and proved highly flexible, well suited to testing robustness and providing probabilistic analysis to support policy decision making. PMID:27226843
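The probabilistic sensitivity analysis in step 5 above can be illustrated with a minimal Monte Carlo sketch. This is a generic illustration, not the authors' actual model; the discount and penetration ranges below are hypothetical.

```python
import random

def simulate_generic_savings(originator_sales, n_draws=10_000, seed=42):
    """Monte Carlo sketch: savings from generic entry when the price
    discount and penetration rate are uncertain (hypothetical ranges)."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        discount = rng.uniform(0.30, 0.70)     # assumed generic price discount
        penetration = rng.uniform(0.40, 0.90)  # assumed generic volume share
        draws.append(originator_sales * discount * penetration)
    draws.sort()
    return {
        "mean": sum(draws) / n_draws,
        "p5": draws[int(0.05 * n_draws)],   # 5th percentile of savings
        "p95": draws[int(0.95 * n_draws)],  # 95th percentile of savings
    }
```

Reporting the mean together with the 5th and 95th percentiles gives the kind of statistical distribution of the budget impact referred to in the results.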
A data-driven multi-model methodology with deep feature selection for short-term wind forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias
With the growing wind penetration into power systems worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by the first-layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks. The effectiveness of the proposed methodology is evaluated by providing 1-hour-ahead wind speed forecasts at seven locations of the Surface Radiation network. Numerical results show that, compared to single-algorithm models, the developed multi-model framework with the deep feature selection procedure improves forecasting accuracy by up to 30%.
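The two-layer idea — individual first-layer forecasts blended by a second layer — can be sketched in a few lines. This is a toy illustration assuming simple persistence and moving-average first-layer models with inverse-error blending weights; the paper's actual models and blending algorithm are more sophisticated.

```python
def mae(errors):
    """Mean absolute error of a list of forecast errors."""
    return sum(abs(e) for e in errors) / len(errors)

def blended_forecast(series, train=8):
    """Layer 1: individual one-step-ahead forecasts (persistence and a
    3-step moving average). Layer 2: blend them with weights inversely
    proportional to each model's validation MAE."""
    models = {
        "persistence": lambda h: h[-1],
        "moving_avg": lambda h: sum(h[-3:]) / 3,
    }
    # Validate each first-layer model on rolling one-step-ahead forecasts.
    errors = {name: [] for name in models}
    for t in range(train, len(series)):
        for name, f in models.items():
            errors[name].append(series[t] - f(series[:t]))
    # Inverse-error weights (epsilon avoids division by zero).
    weights = {name: 1.0 / (mae(errs) + 1e-9) for name, errs in errors.items()}
    total = sum(weights.values())
    # Layer 2: weighted blend of the next-step forecasts.
    return sum((w / total) * models[name](series) for name, w in weights.items())
```

On a steadily rising series the blend lands between the persistence and moving-average forecasts, closer to whichever model validated better.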
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coimbra, Carlos F. M.
2016-02-25
In this project we address multiple resource integration challenges associated with increasing levels of solar penetration that arise from the variability and uncertainty in solar irradiance. We will model the SMUD service region as its own balancing region, and develop an integrated, real-time operational tool that takes solar-load forecast uncertainties into consideration and commits optimal energy resources and reserves for intra-hour and intra-day decisions. The primary objectives of this effort are to reduce power system operation costs by committing the appropriate amount of energy resources and reserves, and to provide operators a prediction of the generation fleet’s behavior in real time for realistic PV penetration scenarios. The proposed methodology includes the following steps: clustering analysis on the expected solar variability per region for the SMUD system; day-ahead (DA) and real-time (RT) load forecasts for the entire service area; 1 year of intra-hour CPR forecasts for cluster centers; 1 year of smart re-forecasting CPR forecasts in real time for determination of irreducible errors; and uncertainty quantification of integrated solar-load for both distributed and central-station PV generation (selected locations within the service region).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cormier, Dallas; Edra, Sherwin; Espinoza, Michael
This project enabled utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real-time simulation of distributed power generation within utility grids, with emphasis on potential applications in day-ahead (market) and real-time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing load and generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers, but extends across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate direct operational cost savings for energy marketing in day-ahead generation commitments, real-time operations, load forecasting (at an aggregate system level for day ahead), demand response, long-term planning (asset management), distribution operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.
The impact of wind power on electricity prices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brancucci Martinez-Anido, Carlo; Brinkman, Greg; Hodge, Bri-Mathias
This paper investigates the impact of wind power on electricity prices using a production cost model of the Independent System Operator - New England power system. Different scenarios in terms of wind penetration, wind forecasts, and wind curtailment are modeled in order to analyze the impact of wind power on electricity prices for different wind penetration levels and for different levels of wind power visibility and controllability. The analysis concludes that electricity price volatility increases even as electricity prices decrease with increasing wind penetration levels. The impact of wind power on price volatility is larger in the shorter term (5-min compared to hour-to-hour). The results presented show that over-forecasting wind power increases electricity prices while under-forecasting wind power reduces them. The modeling results also show that controlling wind power by allowing curtailment increases electricity prices, and for higher wind penetrations it also reduces their volatility.
Solar energy market penetration models - Science or number mysticism
NASA Technical Reports Server (NTRS)
Warren, E. H., Jr.
1980-01-01
The forecast market potential of a solar technology is an important factor determining its R&D funding. Since solar energy market penetration models are the method used to forecast market potential, they have a pivotal role in a solar technology's development. This paper critiques the applicability of the most common solar energy market penetration models. It is argued that the assumptions underlying the foundations of rigorously developed models, or the absence of a reasonable foundation for the remaining models, restrict their applicability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qin; Wu, Hongyu; Florita, Anthony R.
2016-11-11
The value of improving wind power forecasting accuracy at different electricity market operation timescales was analyzed by simulating the IEEE 118-bus test system as modified to emulate the generation mixes of the Midcontinent, California, and New England independent system operator balancing authority areas. The wind power forecasting improvement methodology and error analysis for the data set were elaborated. Production cost simulation was conducted on the three emulated systems with a total of 480 scenarios, considering the impacts of different generation technologies, wind penetration levels, and wind power forecasting improvement timescales. The static operational flexibility of the three systems was compared through the diversity of generation mix, the percentage of must-run baseload generators, as well as the available ramp rate and the minimum generation levels. The dynamic operational flexibility was evaluated by the real-time upward and downward ramp capacity. Simulation results show that the generation resource mix plays a crucial role in evaluating the value of improved wind power forecasting at different timescales. In addition, the changes in annual operational electricity generation costs were mostly influenced by the dominant resource in the system. Lastly, the impacts of pumped-storage resources, generation ramp rates, and system minimum generation level requirements on the value of improved wind power forecasting were also analyzed.
Microgrid optimal scheduling considering impact of high penetration wind generation
NASA Astrophysics Data System (ADS)
Alanazi, Abdulaziz
The objective of this thesis is to study the impact of high-penetration wind energy on the economic and reliable operation of microgrids. Wind power is variable, i.e., constantly changing, and nondispatchable, i.e., it cannot be controlled by the microgrid controller. Thus, accurate forecasting of wind power is essential for studying its impacts on microgrid operation. Two commonly used forecasting methods, Autoregressive Integrated Moving Average (ARIMA) and Artificial Neural Network (ANN), are used in this thesis to improve wind power forecasting. The forecasting error is measured using the Mean Absolute Percentage Error (MAPE) and is reduced using the ANN. The wind forecast is further used in the microgrid optimal scheduling problem. The microgrid optimal scheduling is performed by developing a viable model for security-constrained unit commitment (SCUC) based on a mixed-integer linear programming (MILP) method. The proposed SCUC is solved for various wind penetration levels and the relationship between the total cost and wind power penetration is found. In order to reduce microgrid power transfer fluctuations, an additional constraint is proposed and added to the SCUC formulation. The new constraint controls the time-based fluctuations. The impact of the constraint on microgrid SCUC results is tested and validated with numerical analysis. Finally, the applicability of the proposed models is demonstrated through numerical simulations.
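The MAPE metric used above to score the forecasts is straightforward to compute. This is the standard textbook definition, not code from the thesis.

```python
def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent. Undefined when any
    actual value is zero, so that case is rejected up front."""
    assert len(actual) == len(forecast) and all(a != 0 for a in actual)
    return 100.0 / len(actual) * sum(
        abs((a - f) / a) for a, f in zip(actual, forecast)
    )
```

For example, forecasting 90 against an actual of 100 and 220 against an actual of 200 gives two 10% errors, so the MAPE is 10%.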
Transportation Sector Model of the National Energy Modeling System. Volume 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-01-01
This report documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model. The NEMS Transportation Model comprises a series of semi-independent models which address different aspects of the transportation sector. The primary purpose of this model is to provide mid-term forecasts of transportation energy demand by fuel type including, but not limited to, motor gasoline, distillate, jet fuel, and alternative fuels (such as CNG) not commonly associated with transportation. The current NEMS forecast horizon extends to the year 2010 and uses 1990 as the base year. Forecasts are generated through the separate consideration of energy consumption within the various modes of transport, including: private and fleet light-duty vehicles; aircraft; marine, rail, and truck freight; and various modes with minor overall impacts, such as mass transit and recreational boating. This approach is useful in assessing the impacts of policy initiatives, legislative mandates which affect individual modes of travel, and technological developments. The model also provides forecasts of selected intermediate values which are generated in order to determine energy consumption. These elements include estimates of passenger travel demand by automobile, air, or mass transit; estimates of the efficiency with which that demand is met; projections of vehicle stocks and the penetration of new technologies; and estimates of the demand for freight transport which are linked to forecasts of industrial output.
Following the estimation of energy demand, TRAN produces forecasts of vehicular emissions of the following pollutants by source: oxides of sulfur, oxides of nitrogen, total carbon, carbon dioxide, carbon monoxide, and volatile organic compounds.
Stochastic Multi-Timescale Power System Operations With Variable Wind Generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Hongyu; Krad, Ibrahim; Florita, Anthony
This paper describes a novel set of stochastic unit commitment and economic dispatch models that consider stochastic loads and variable generation at multiple operational timescales. The stochastic model includes four distinct stages: stochastic day-ahead security-constrained unit commitment (SCUC), stochastic real-time SCUC, stochastic real-time security-constrained economic dispatch (SCED), and deterministic automatic generation control (AGC). These sub-models are integrated together such that they are continually updated with decisions passed from one to another. The progressive hedging algorithm (PHA) is applied to solve the stochastic models to maintain the computational tractability of the proposed models. Comparative case studies with deterministic approaches are conducted in low-wind and high-wind penetration scenarios to highlight the advantages of the proposed methodology, one with perfect forecasts and the other with current state-of-the-art but imperfect deterministic forecasts. The effectiveness of the proposed method is evaluated with sensitivity tests using both economic and reliability metrics to provide a broader view of its impact.
Wind Power Forecasting Error Distributions: An International Comparison; Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, B. M.; Lew, D.; Milligan, M.
2012-09-01
Wind power forecasting is expected to be an important enabler for greater penetration of wind power into electricity systems. Because no wind forecasting system is perfect, a thorough understanding of the errors that do occur can be critical to system operation functions, such as the setting of operating reserve levels. This paper provides an international comparison of the distribution of wind power forecasting errors from operational systems, based on real forecast data. The paper concludes with an assessment of similarities and differences between the errors observed in different locations.
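Comparisons of forecast-error distributions of this kind typically examine bias, spread, and tail weight. The sketch below shows such summary statistics; it is illustrative only, not the paper's analysis.

```python
import statistics

def error_distribution_summary(errors):
    """Summary statistics for a sample of forecast errors: bias (mean),
    spread (population stdev), and excess kurtosis. Positive excess
    kurtosis indicates fatter tails than a normal distribution, i.e.
    more frequent extreme forecast misses. Assumes a non-degenerate
    sample (stdev > 0)."""
    n = len(errors)
    mean = statistics.fmean(errors)
    sd = statistics.pstdev(errors)
    kurt = sum(((e - mean) / sd) ** 4 for e in errors) / n - 3.0
    return {"bias": mean, "stdev": sd, "excess_kurtosis": kurt}
```

Operating reserve levels are often set from the tails of these distributions, which is why tail weight matters as much as the standard deviation.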
A Delphi Forecast of Technology in Education.
ERIC Educational Resources Information Center
Robinson, Burke E.
The forecast reported here surveys expected utilization levels, organizational structures, and values concerning technology in education in 1990. The focus is upon educational technology and forecasting methodology; televised instruction, computer-assisted instruction (CAI), and information services are considered. The methodology employed…
NASA Astrophysics Data System (ADS)
Ghonima, M. S.; Yang, H.; Zhong, X.; Ozge, B.; Sahu, D. K.; Kim, C. K.; Babacan, O.; Hanna, R.; Kurtz, B.; Mejia, F. A.; Nguyen, A.; Urquhart, B.; Chow, C. W.; Mathiesen, P.; Bosch, J.; Wang, G.
2015-12-01
One of the main obstacles to high penetrations of solar power is the variable nature of solar power generation. To mitigate variability, grid operators have to schedule additional reliability resources, at considerable expense, to ensure that load requirements are met by generation. Thus, even as the cost of solar PV decreases, the cost of integrating solar power will increase as the penetration of solar resources onto the electric grid increases. There are three principal tools currently available to mitigate variability impacts: (i) flexible generation; (ii) storage, either virtual (demand response) or physical devices; and (iii) solar forecasting. Storage devices are a powerful tool capable of ensuring smooth power output from renewable resources. However, the high cost of storage is prohibitive, and markets are still being designed to leverage their full potential and mitigate their limitations (e.g. empty storage). Solar forecasting provides valuable information on the daily net load profile and upcoming ramps (increasing or decreasing solar power output), thereby giving the grid advance warning to schedule ancillary generation more accurately, or to curtail solar power output. In order to develop solar forecasting as a tool that can be utilized by grid operators, we identified two focus areas: (i) developing solar forecast technology and improving solar forecast accuracy, and (ii) developing forecasts that can be incorporated within existing grid planning and operation infrastructure. The first area requires atmospheric science and engineering research, while the second requires detailed knowledge of energy markets and power engineering. Motivated by this background, we will emphasize area (i) in this talk and provide an overview of recent advancements in solar forecasting, especially in two areas: (a) Numerical modeling tools for coastal stratocumulus to improve scheduling in the day-ahead California energy market.
(b) Development of a sky imager to provide short term forecasts (0-20 min ahead) to improve optimization and control of equipment on distribution feeders with high penetration of solar. Leveraging such tools that have seen extensive use in the atmospheric sciences supports the development of accurate physics-based solar forecast models. Directions for future research are also provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Huang, Zhenyu; Etingov, Pavel V.
2010-09-01
The power system balancing process, which includes the scheduling, real-time dispatch (load following), and regulation processes, is traditionally based on deterministic models. Since conventional generation needs time to be committed and dispatched to a desired megawatt level, the scheduling and load following processes use load and wind power production forecasts to achieve a future balance between conventional generation and energy storage on the one side, and system load, intermittent resources (such as wind and solar generation), and scheduled interchange on the other side. Although in real life the forecasting procedures imply some uncertainty around the load and wind forecasts (caused by forecast errors), only their mean values are actually used in the generation dispatch and commitment procedures. Since the actual load and intermittent generation can deviate from their forecasts, it becomes increasingly unclear (especially with the increasing penetration of renewable resources) whether the system would actually be able to meet the conventional generation requirements within the look-ahead horizon, what additional balancing efforts would be needed as we get closer to real time, and what additional costs would be incurred by those needs. In order to improve the system control performance characteristics, maintain system reliability, and minimize expenses related to the system balancing functions, it becomes necessary to incorporate the predicted uncertainty ranges into the scheduling, load following, and, to some extent, the regulation processes. It is also important to address the uncertainty problem comprehensively, by including all sources of uncertainty (load, intermittent generation, generators’ forced outages, etc.) into consideration. All aspects of uncertainty, such as the imbalance size (which is the same as the capacity needed to mitigate the imbalance) and the generation ramping requirement, must be taken into account.
The latter unique features make this work a significant step forward toward the objective of incorporating wind, solar, load, and other uncertainties into power system operations. In this report, a new methodology to predict the uncertainty ranges for the required balancing capacity, ramping capability, and ramp duration is presented. Uncertainties created by system load forecast errors, wind and solar forecast errors, and generation forced outages are taken into account. The uncertainty ranges are evaluated for different confidence levels of having the actual generation requirements within the corresponding limits. The methodology helps to identify the system balancing reserve requirement based on desired system performance levels, identify system “breaking points”, where the generation system becomes unable to follow the generation requirement curve with the user-specified probability level, and determine the time remaining until these potential events. The approach includes three stages: statistical and actual data acquisition, statistical analysis of retrospective information, and prediction of future grid balancing requirements for specified time horizons and confidence intervals. Assessment of the capacity and ramping requirements is performed using a specially developed probabilistic algorithm based on a histogram analysis incorporating all sources of uncertainty and parameters of a continuous (wind forecast and load forecast errors) and discrete (forced generator outages and failures to start up) nature. Preliminary simulations using California Independent System Operator (California ISO) real-life data have shown the effectiveness of the proposed approach. A tool developed based on the new methodology described in this report will be integrated with the California ISO systems. Contractual work is currently in place to integrate the tool with the AREVA EMS system.
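The histogram-and-percentile idea — sizing balancing capacity so that the actual imbalance is covered at a chosen confidence level — can be sketched as a nearest-rank empirical quantile. This is an illustration of the core idea only; the report's full algorithm also treats ramping requirements and discrete outage events.

```python
def balancing_capacity_requirement(imbalance_samples, confidence=0.95):
    """Return the balancing capacity that covers the observed net
    imbalance samples (load + renewable forecast errors, outages, etc.)
    at the given confidence level, via the nearest-rank quantile."""
    s = sorted(imbalance_samples)
    # Index of the desired empirical quantile, clamped to the last sample.
    k = min(len(s) - 1, int(confidence * len(s)))
    return s[k]
```

Raising the confidence level moves the requirement into the tail of the imbalance histogram, which is exactly where the "breaking points" described above appear.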
Improved Modeling Tools Development for High Penetration Solar
DOE Office of Scientific and Technical Information (OSTI.GOV)
Washom, Byron; Meagher, Kevin
2014-12-11
One of the significant objectives of the High Penetration Solar research is to help the DOE understand, anticipate, and minimize grid operation impacts as more solar resources are added to the electric power system. For Task 2.2, an effective, reliable approach to predicting solar energy availability for energy generation forecasts using the University of California, San Diego (UCSD) Sky Imager technology has been demonstrated. Granular cloud and ramp forecasts for the next 5 to 20 minutes over an area of 10 square miles were developed. Sky images taken every 30 seconds are processed to determine cloud locations and cloud motion vectors, yielding future cloud shadow locations relative to distributed generation or utility solar power plants in the area. The performance of the method depends on cloud characteristics. On days with more advective cloud conditions, the developed method outperforms persistence forecasts by up to 30% (based on mean absolute error). On days with dynamic conditions, the method performs worse than persistence. Sky imagers hold promise for ramp forecasting and ramp mitigation in conjunction with inverter controls and energy storage. The pre-commercial Sky Imager solar forecasting algorithm was documented with licensing information and was a SunShot website highlight.
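The "up to 30% over persistence" figure above is measured against the standard persistence benchmark, in which the forecast for the next step is simply the current value. A sketch of that skill metric (a generic definition, not the project's evaluation code):

```python
def mae(pred, obs):
    """Mean absolute error between paired predictions and observations."""
    return sum(abs(p - o) for p, o in zip(pred, obs)) / len(obs)

def skill_vs_persistence(actual, forecast):
    """Fractional MAE improvement over the persistence benchmark
    (forecast for t+1 = value at t). A value of 0.30 corresponds to the
    'up to 30%' improvement quoted above; 0 means no better than
    persistence, negative means worse."""
    persistence_mae = mae(actual[:-1], actual[1:])  # persistence errors
    return 1.0 - mae(forecast[1:], actual[1:]) / persistence_mae
```

A perfect forecast scores 1.0, and feeding persistence itself into the metric scores 0.0, which makes the scale easy to interpret.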
NASA Astrophysics Data System (ADS)
Sadyś, Magdalena; Skjøth, Carsten Ambelas; Kennedy, Roy
2016-04-01
High concentration levels of Ganoderma spp. spores were observed in Worcester, UK, during 2006-2010. These basidiospores are known to cause sensitization due to their allergen content and small dimensions, which enable them to penetrate the lower part of the human respiratory tract. Establishing a link between symptoms of sensitization and exposure to Ganoderma spp. and other basidiospores is challenging due to the lack of information regarding spore concentrations in the air. Hence, aerobiological monitoring should be conducted and, if possible, extended with the construction of forecast models. The daily mean concentration of allergenic Ganoderma spp. spores in the atmosphere of Worcester was measured using a 7-day volumetric spore sampler over five consecutive years. The relationships between the presence of spores in the air and weather parameters were examined. Forecast models were constructed for Ganoderma spp. spores using advanced statistical techniques, i.e. multivariate regression trees and artificial neural networks. Dew point temperature, along with maximum temperature, was the most important factor influencing the presence of spores in the air of Worcester. Based on these two major factors and several others of lesser importance, thresholds for certain levels of fungal spore concentration, i.e. low (0-49 s m^-3), moderate (50-99 s m^-3), high (100-149 s m^-3) and very high (>150 s m^-3), could be designated. Despite some deviation in the results obtained by the artificial neural networks, the authors achieved an accurate forecasting model (correlations between observed and predicted values varied from r_s = 0.57 to r_s = 0.68).
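The concentration thresholds designated above map directly to a small categorization function. This is a sketch; the handling of the boundary at exactly 150 spores per cubic metre is an assumption, since the abstract gives the very-high class only as values above 150.

```python
def spore_level(concentration):
    """Classify a daily mean Ganoderma spp. spore concentration
    (spores per cubic metre) into the categories from the abstract:
    low (0-49), moderate (50-99), high (100-149), very high (assumed
    to start at 150)."""
    if concentration < 50:
        return "low"
    if concentration < 100:
        return "moderate"
    if concentration < 150:
        return "high"
    return "very high"
```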
Comparing Vertical Distributions of Water Vapor Flux within Two Landfalling Atmospheric Rivers
NASA Astrophysics Data System (ADS)
Rutz, J. J.; Lavers, D. A.
2015-12-01
The West Coast of North America is frequently impacted by atmospheric rivers (ARs), regions of intense horizontal water vapor transport that often produce heavy rain, flooding, and landslides when they interact with near-coastal mountains. Recently, studies have shown that ARs penetrate farther inland on many occasions, with indications that the vertical distribution of vapor transport within the ARs may play a key role in this penetration (Alexander et al. 2015; Rutz et al. 2015). We hypothesize that the amount of near-coastal precipitation and the likelihood of AR penetration farther inland may be inversely linked by vertical distributions of vapor fluxes before, during, and after landfall. To explore whether differing vertical distributions of transport explain differing precipitation and penetration outcomes, we compare two landfalling ARs that had very similar spatial extents and rates of vertically integrated (total) vapor transport, but which nonetheless produced very different amounts of precipitation over northern California. The vertical distribution of water vapor flux, specific humidity, and wind speed during these two ARs are examined along several transects using cross-sectional analyses of the Climate Forecast System Reanalysis with a horizontal resolution of ~0.5° (~63 km) and a sigma-pressure hybrid coordinate at 64 vertical levels. In addition, we pursue similar analyses of forecasts from the NCEP Global Ensemble Forecast System GEFS to assess whether numerical weather prediction models accurately represent these distributions. Finally, we calculate backward trajectories from within each AR to examine whether or not the origins of their respective air parcels play a role in the resulting vertical distribution of water vapor flux. The results have major implications for two problems in weather prediction: (1) the near-coastal precipitation associated with landfalling ARs and (2) the likelihood of AR penetration farther inland.
Short-term energy outlook. Volume 2. Methodology
NASA Astrophysics Data System (ADS)
1983-05-01
Recent changes in forecasting methodology for nonutility distillate fuel oil demand and for the near-term petroleum forecasts are discussed. The accuracy of previous short-term forecasts of most of the major energy sources published in the last 13 issues of the Outlook is evaluated. Macroeconomic and weather assumptions are included in this evaluation. Energy forecasts for 1983 are compared. Structural change in US petroleum consumption, the use of appropriate weather data in energy demand modeling, and petroleum inventories, imports, and refinery runs are discussed.
On the effect of model parameters on forecast objects
NASA Astrophysics Data System (ADS)
Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott
2018-04-01
Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. For some quantities the field generally consists of spatially coherent and disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.
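Ingredient (1) above, Latin hypercube sampling of the model parameters, can be sketched in a few lines. The helper name `latin_hypercube` and the example bounds are ours, not from the paper:

```python
import numpy as np

# Illustrative sketch of Latin hypercube sampling over given
# [low, high] parameter ranges: each parameter's range is split
# into n_samples equal strata, with exactly one draw per stratum.

def latin_hypercube(bounds, n_samples, rng=None):
    rng = rng or np.random.default_rng(0)
    bounds = np.asarray(bounds, dtype=float)       # shape (n_params, 2)
    n_params = bounds.shape[0]
    # one uniform draw inside each stratum, then shuffle the strata
    # independently for each parameter
    u = (rng.random((n_samples, n_params)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_params):
        rng.shuffle(u[:, j])
    return bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])

# e.g. two hypothetical model parameters with different ranges
samples = latin_hypercube([[0.0, 1.0], [10.0, 20.0]], n_samples=8)
```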
NASA Astrophysics Data System (ADS)
Gelfan, Alexander; Moreydo, Vsevolod; Motovilov, Yury; Solomatine, Dimitri P.
2018-04-01
A long-term forecasting ensemble methodology, applied to water inflows into the Cheboksary Reservoir (Russia), is presented. The methodology is based on a version of the semi-distributed hydrological model ECOMAG (ECOlogical Model for Applied Geophysics) that allows for the calculation of an ensemble of inflow hydrographs using two different sets of weather ensembles for the lead time period: observed weather data, constructed on the basis of the Ensemble Streamflow Prediction methodology (ESP-based forecast), and synthetic weather data, simulated by a multi-site weather generator (WG-based forecast). We have studied the following: (1) whether there is any advantage of the developed ensemble forecasts in comparison with the currently issued operational forecasts of water inflow into the Cheboksary Reservoir, and (2) whether there is any noticeable improvement in probabilistic forecasts when using the WG-simulated ensemble compared to the ESP-based ensemble. We have found that for a 35-year period beginning from the reservoir filling in 1982, both continuous and binary model-based ensemble forecasts (issued in the deterministic form) outperform the operational forecasts of the April-June inflow volume actually used and, additionally, provide acceptable forecasts of additional water regime characteristics besides the inflow volume. We have also demonstrated that the model performance measures (in the verification period) obtained from the WG-based probabilistic forecasts, which are based on a large number of possible weather scenarios, appeared to be more statistically reliable than the corresponding measures calculated from the ESP-based forecasts based on the observed weather scenarios.
Methodological Problems in the Forecasting of Education
ERIC Educational Resources Information Center
Kostanian, S. L.
1978-01-01
Examines how forecasting of educational development in the Soviet Union can be coordinated with forecasts of scientific and technical progress. Predicts that the efficiency of social forecasting will increase when more empirical data on macro- and micro-processes is collected. (Author/DB)
Louisiana Airport System Plan aviation activity forecasts 1990-2010.
DOT National Transportation Integrated Search
1991-07-01
This report documents the methodology used to develop the aviation activity forecasts prepared as a part of the update to the Louisiana Airport System Plan and provides Louisiana aviation forecasts for the years 1990 to 2010. In general, the forecast...
Applications of a shadow camera system for energy meteorology
NASA Astrophysics Data System (ADS)
Kuhn, Pascal; Wilbert, Stefan; Prahl, Christoph; Garsche, Dominik; Schüler, David; Haase, Thomas; Ramirez, Lourdes; Zarzalejo, Luis; Meyer, Angela; Blanc, Philippe; Pitz-Paal, Robert
2018-02-01
Downward-facing shadow cameras might play a major role in future energy meteorology. Shadow cameras directly image shadows on the ground from an elevated position. They are used to validate other systems (e.g. all-sky imager based nowcasting systems, cloud speed sensors or satellite forecasts) and can potentially provide short term forecasts for solar power plants. Such forecasts are needed for electricity grids with high penetrations of renewable energy and can help to optimize plant operations. In this publication, two key applications of shadow cameras are briefly presented.
An Econometric Model for Forecasting Income and Employment in Hawaii.
ERIC Educational Resources Information Center
Chau, Laurence C.
This report presents the methodology for short-run forecasting of personal income and employment in Hawaii. The econometric model developed in the study is used to make actual forecasts through 1973 of income and employment, with major components forecasted separately. Several sets of forecasts are made, under different assumptions on external…
Analysis of Carbon Policies for Electricity Networks with High Penetration of Green Generation
NASA Astrophysics Data System (ADS)
Feijoo, Felipe A.
In recent decades, climate change has become one of the most crucial challenges for humanity. Climate change has a direct correlation with global warming, caused mainly by greenhouse gas (GHG) emissions. The U.S. Environmental Protection Agency (EPA) estimates that carbon dioxide accounts for approximately 82% of GHG emissions. Unfortunately, the energy sector is the main producer of carbon dioxide, with China and the U.S. as the highest emitters. Therefore, there is a strong (positive) correlation between energy production, global warming, and climate change. Stringent carbon emissions reduction targets have been established in order to reduce the impacts of GHG. Achieving these emissions reduction goals will require the implementation of policies such as cap-and-trade and carbon taxes, together with the transformation of the electricity grid into a smarter system with high green energy penetration. However, the consideration of policies solely in view of carbon emissions reduction may adversely impact other market outcomes such as electricity prices and consumption. In this dissertation, a two-layer mathematical-statistical framework is presented that serves to develop carbon policies that reduce emissions levels while minimizing the negative impacts on other market outcomes. The bottom layer of the two-layer model comprises a bi-level optimization problem. The top layer comprises a statistical model and a Pareto analysis. Two related but different problems are studied under this methodology. The first problem looks into the design of cap-and-trade policies for deregulated electricity markets that satisfy the interests of different market constituents. Via the second problem, it is demonstrated how the framework can be used to obtain levels of carbon emissions reduction while minimizing the negative impact on electricity demand and maximizing green penetration from microgrids.
In the aforementioned studies, forecasts for electricity prices and production cost are considered. Thus, this dissertation also presents a new forecast model that can be easily integrated into the two-layer framework. It is demonstrated in this dissertation that the proposed framework can be utilized by policy-makers, power companies, consumers, and market regulators in developing emissions policy decisions, bidding strategies, market regulations, and electricity dispatch strategies.
How bootstrap can help in forecasting time series with more than one seasonal pattern
NASA Astrophysics Data System (ADS)
Cordeiro, Clara; Neves, M. Manuela
2012-09-01
The search for the future is an appealing challenge in time series analysis. The diversity of forecasting methodologies is inevitable and still expanding. Exponential smoothing methods are the launch platform for modelling and forecasting in time series analysis. Recently this methodology has been combined with bootstrapping, revealing good performance. The Boot.EXPOS algorithm, using exponential smoothing and bootstrap methodologies, has shown promising results for forecasting time series with one seasonal pattern. For the case of more than one seasonal pattern, the double seasonal Holt-Winters methods and the corresponding exponential smoothing methods were developed. The new challenge was to combine these seasonal methods with the bootstrap and carry over a resampling scheme similar to that used in the Boot.EXPOS procedure. The performance of this partnership is illustrated for some well-known data sets available in software.
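A minimal sketch of the general idea behind combining exponential smoothing with the bootstrap: fit a smoother, then resample its residuals with replacement to build a forecast distribution. The smoothing constant, series, and structure below are illustrative assumptions, not the published Boot.EXPOS algorithm:

```python
import random

# Sketch: simple exponential smoothing plus a residual bootstrap for
# the one-step-ahead forecast distribution. Illustrative only.

def ses_fit(series, alpha=0.3):
    """Simple exponential smoothing; returns final level and one-step residuals."""
    level = series[0]
    residuals = []
    for y in series[1:]:
        residuals.append(y - level)          # one-step-ahead forecast error
        level = alpha * y + (1 - alpha) * level
    return level, residuals

def bootstrap_forecast(series, n_boot=500, alpha=0.3, seed=42):
    """One-step-ahead bootstrap sample: point forecast + resampled residual."""
    rng = random.Random(seed)
    level, residuals = ses_fit(series, alpha)
    return [level + rng.choice(residuals) for _ in range(n_boot)]

series = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119]
fcasts = bootstrap_forecast(series)
```

Quantiles of `fcasts` then give bootstrap prediction intervals around the point forecast.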
Quantifying the Economic and Grid Reliability Impacts of Improved Wind Power Forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qin; Martinez-Anido, Carlo Brancucci; Wu, Hongyu
Wind power forecasting is an important tool in power system operations to address variability and uncertainty. Doing so accurately is important to reducing the occurrence and length of curtailment, enhancing market efficiency, and improving the operational reliability of the bulk power system. This research quantifies the value of wind power forecasting improvements in the IEEE 118-bus test system as modified to emulate the generation mixes of the Midcontinent, California, and New England independent system operator balancing authority areas. To measure the economic value, a commercially available production cost modeling tool was used to simulate the multi-timescale unit commitment (UC) and economic dispatch process for calculating the cost savings and curtailment reductions. To measure the reliability improvements, an in-house tool, FESTIV, was used to calculate the system's area control error and the North American Electric Reliability Corporation Control Performance Standard 2. The approach allowed scientific reproducibility of results and cross-validation of the tools. A total of 270 scenarios were evaluated to accommodate the variation of three factors: generation mix, wind penetration level, and wind forecasting improvements. The modified IEEE 118-bus systems utilized 1 year of data at multiple timescales, including the day-ahead UC, 4-hour-ahead UC, and 5-min real-time dispatch. The value of improved wind power forecasting was found to be strongly tied to the conventional generation mix, the existence of energy storage devices, and the penetration level of wind energy. The simulation results demonstrate that wind power forecasting brings clear benefits to power system operations.
An experimental system for flood risk forecasting at global scale
NASA Astrophysics Data System (ADS)
Alfieri, L.; Dottori, F.; Kalas, M.; Lorini, V.; Bianchi, A.; Hirpa, F. A.; Feyen, L.; Salamon, P.
2016-12-01
Global flood forecasting and monitoring systems are nowadays a reality and are being applied by an increasing range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk based forecasts, combining streamflow estimations with expected inundated areas and flood impacts. To this end, we have developed an experimental procedure for near-real time flood mapping and impact assessment based on the daily forecasts issued by the Global Flood Awareness System (GloFAS). The methodology translates GloFAS streamflow forecasts into event-based flood hazard maps based on the predicted flow magnitude and the forecast lead time and a database of flood hazard maps with global coverage. Flood hazard maps are then combined with exposure and vulnerability information to derive flood risk. Impacts of the forecasted flood events are evaluated in terms of flood prone areas, potential economic damage, and affected population, infrastructures and cities. To further increase the reliability of the proposed methodology we integrated model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification of impact forecasts. The preliminary tests provided good results and showed the potential of the developed real-time operational procedure in helping emergency response and management. In particular, the link with social media is crucial for improving the accuracy of impact predictions.
ASC-AD penetration modeling FY05 status report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kistler, Bruce L.; Ostien, Jakob T.; Chiesa, Michael L.
2006-04-01
Sandia currently lacks a high-fidelity method for predicting loads on, and the subsequent structural response of, earth penetrating weapons. This project seeks to test, debug, improve, and validate methodologies for modeling earth penetration. Results of this project will allow us to optimize and certify designs for the B61-11, Robust Nuclear Earth Penetrator (RNEP), PEN-X, and future nuclear and conventional penetrator systems. Since this is an ASC Advanced Deployment project, the primary goal of the work is to test, debug, verify, and validate new Sierra (and Nevada) tools. Also, since this project is part of the V&V program within ASC, uncertainty quantification (UQ), optimization using DAKOTA [1], and sensitivity analysis are an integral part of the work. This project evaluates, verifies, and validates new constitutive models, penetration methodologies, and Sierra/Nevada codes. In FY05 the project focused mostly on PRESTO [2] using the Spherical Cavity Expansion (SCE) [3,4] and PRESTO Lagrangian analysis with a preformed hole (Pen-X) methodologies. Modeling penetration tests using PRESTO with a pilot hole was also attempted to evaluate constitutive models. Future years' work would include the Alegra/SHISM [5] and Alegra/EP (Earth Penetration) methodologies when they are ready for validation testing. Constitutive models such as Soil-and-Foam, the Sandia Geomodel [6], and the K&C Concrete model [7] were also tested and evaluated. This report is submitted to satisfy annual documentation requirements for the ASC Advanced Deployment program. It summarizes FY05 work performed in the Penetration Mechanical Response (ASC-APPS) and Penetration Mechanics (ASC-V&V) projects. A single report is written to document the two projects because of the significant amount of technical overlap.
Metric optimisation for analogue forecasting by simulated annealing
NASA Astrophysics Data System (ADS)
Bliefernicht, J.; Bárdossy, A.
2009-04-01
It is well known that weather patterns tend to recur from time to time. This property of the atmosphere is used by analogue forecasting techniques. They have a long history in weather forecasting, and there are many applications predicting hydrological variables at the local scale for different lead times. The basic idea of the technique is to identify past weather situations which are similar (analogue) to the predicted one and to take the local conditions of the analogues as the forecast. But the forecast performance of the analogue method depends on user-defined criteria like the choice of the distance function and the size of the predictor domain. In this study we propose a new methodology of optimising both criteria by minimising the forecast error with simulated annealing. The performance of the methodology is demonstrated for the probability forecast of daily areal precipitation. It is compared with a traditional analogue forecasting algorithm, which is used operationally as an element of a hydrological forecasting system. The study is performed for several meso-scale catchments located in the Rhine basin in Germany. The methodology is validated by a jack-knife method in a perfect-prognosis framework for a period of 48 years (1958-2005). The predictor variables are derived from the NCEP/NCAR reanalysis data set. The Brier skill score and the economic value are determined to evaluate the forecast skill and value of the technique. In this presentation we describe the concept of the optimisation algorithm and the outcome of the comparison. It will also be demonstrated how a decision maker should apply a probability forecast to maximise the economic benefit from it.
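The optimisation idea above can be sketched with a generic simulated-annealing loop that tunes the weights of a weighted-Euclidean distance. The toy objective below stands in for the real analogue forecast error, and every name and constant here is an illustrative assumption, not the authors' setup:

```python
import math
import random

def forecast_error(weights):
    """Placeholder objective: in the paper this would be the error of the
    analogue forecast produced with this distance metric."""
    target = [0.6, 0.1, 0.3]      # invented 'ideal' predictor weights
    return sum((w - t) ** 2 for w, t in zip(weights, target))

def anneal(objective, n_dims=3, steps=5000, t0=1.0, seed=1):
    rng = random.Random(seed)
    x = [1.0 / n_dims] * n_dims                  # start from equal weights
    best, best_err = list(x), objective(x)
    for k in range(steps):
        temp = t0 * (1 - k / steps) + 1e-9       # linear cooling schedule
        cand = [max(0.0, xi + rng.gauss(0, 0.05)) for xi in x]
        s = sum(cand) or 1.0
        cand = [c / s for c in cand]             # keep weights normalised
        d = objective(cand) - objective(x)
        if d < 0 or rng.random() < math.exp(-d / temp):
            x = cand                             # accept (always if better)
            if objective(x) < best_err:
                best, best_err = list(x), objective(x)
    return best, best_err

weights, err = anneal(forecast_error)
```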
Short-term forecasting of turbidity in trunk main networks.
Meyers, Gregory; Kapelan, Zoran; Keedwell, Edward
2017-11-01
Water discolouration is an increasingly important and expensive issue due to rising customer expectations, tighter regulatory demands and ageing Water Distribution Systems (WDSs) in the UK and abroad. This paper presents a new turbidity forecasting methodology capable of aiding operational staff and enabling proactive management strategies. The turbidity forecasting methodology developed here is completely data-driven and does not require a hydraulic or water quality network model, which is expensive to build and maintain. The methodology is tested and verified on a real trunk main network with observed turbidity measurement data. Results obtained show that the methodology can detect if discolouration material is mobilised, estimate if sufficient turbidity will be generated to exceed a preselected threshold, and approximate how long the material will take to reach the downstream meter. Classification-based forecasts of turbidity can be reliably made up to 5 h ahead, although at the expense of increased false alarm rates. The methodology presented here could be used as an early warning system that can enable a multitude of cost-beneficial proactive management strategies to be implemented as an alternative to expensive trunk mains cleaning programs.
2014-10-30
Force Weather Agency (AFWA) WRF 15-km atmospheric model forecast data and low-level turbulence. Archives of historical model data forecast predictors... Relationships between WRF model predictors and PIREPS were developed using the new data mining methodology. The new methodology was inspired... convection. Predictors of turbulence were collected from the AFWA WRF 15-km model, and corresponding PIREPS (the predictand) were collected between 2013
[50 years of the methodology of weather forecasting for medicine].
Grigor'ev, K I; Povazhnaia, E L
2014-01-01
The materials reported in the present article illustrate the possibilities of weather forecasting for medical purposes from a historical perspective. The main characteristics of the relevant organizational and methodological approaches to meteoprophylaxis based on standard medical weather forecasts are presented. The emphasis is laid on the priority of the domestic medical school in developing the principles of diagnostics and treatment of meteosensitivity and meteotropic complications in patients presenting with various diseases, with special reference to their age-related characteristics.
A novel hybrid ensemble learning paradigm for tourism forecasting
NASA Astrophysics Data System (ADS)
Shabri, Ani
2015-02-01
In this paper, a hybrid forecasting model based on Empirical Mode Decomposition (EMD) and the Group Method of Data Handling (GMDH) is proposed to forecast tourism demand. This methodology first decomposes the original visitor arrival series into several Intrinsic Mode Function (IMF) components and one residual component by the EMD technique. Then, the IMF components and the residual component are forecasted separately using the GMDH model, whose input variables are selected using the Partial Autocorrelation Function (PACF). The final forecast for the tourism series is produced by aggregating all the forecasted results. To evaluate the performance of the proposed EMD-GMDH methodology, monthly data on tourist arrivals from Singapore to Malaysia are used as an illustrative example. Empirical results show that the proposed EMD-GMDH model outperforms the EMD-ARIMA as well as the GMDH and ARIMA (Autoregressive Integrated Moving Average) models without time series decomposition.
A time series model: First-order integer-valued autoregressive (INAR(1))
NASA Astrophysics Data System (ADS)
Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.
2017-07-01
Nonnegative integer-valued time series arise in many applications. The first-order Integer-valued AutoRegressive (INAR(1)) model is constructed with the binomial thinning operator to model nonnegative integer-valued time series. INAR(1) depends on one preceding period of the process. The parameter of the model can be estimated by Conditional Least Squares (CLS). The specification of INAR(1) follows that of AR(1). Forecasting in INAR(1) uses a median or a Bayesian forecasting methodology. The median methodology finds the least integer s whose cumulative distribution function (CDF) value is at least 0.5. The Bayesian methodology produces h-step-ahead forecasts by generating the model parameter and the innovation parameter using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), then finding the least integer s whose CDF value is at least u, where u is drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 to April 2016.
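The median forecasting rule described above can be sketched for an INAR(1) process with binomial thinning and, as an assumption here, Poisson innovations: the one-step-ahead forecast is the least integer s whose conditional CDF reaches 0.5.

```python
import math

# Sketch of INAR(1) median forecasting. X_t = alpha ∘ X_{t-1} + eps_t,
# where ∘ is binomial thinning and eps_t ~ Poisson(lam) (our assumption).

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def pois_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

def inar1_pmf(k, x_prev, alpha, lam):
    """P(X_t = k | X_{t-1} = x_prev): survivors of thinning + innovation."""
    return sum(binom_pmf(j, x_prev, alpha) * pois_pmf(k - j, lam)
               for j in range(min(k, x_prev) + 1))

def median_forecast(x_prev, alpha, lam):
    """Least integer s with conditional CDF(s) >= 0.5."""
    cdf, s = 0.0, 0
    while True:
        cdf += inar1_pmf(s, x_prev, alpha, lam)
        if cdf >= 0.5:
            return s
        s += 1
```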
Action-based flood forecasting for triggering humanitarian action
NASA Astrophysics Data System (ADS)
Coughlan de Perez, Erin; van den Hurk, Bart; van Aalst, Maarten K.; Amuron, Irene; Bamanya, Deus; Hauser, Tristan; Jongma, Brenden; Lopez, Ana; Mason, Simon; Mendler de Suarez, Janot; Pappenberger, Florian; Rueth, Alexandra; Stephens, Elisabeth; Suarez, Pablo; Wagemaker, Jurjen; Zsoter, Ervin
2016-09-01
Too often, credible scientific early warning information of increased disaster risk does not result in humanitarian action. With financial resources tilted heavily towards response after a disaster, disaster managers have limited incentive and ability to process complex scientific data, including uncertainties. These incentives are beginning to change, with the advent of several new forecast-based financing systems that provide funding based on a forecast of an extreme event. Given the changing landscape, here we demonstrate a method to select and use appropriate forecasts for specific humanitarian disaster prevention actions, even in a data-scarce location. This action-based forecasting methodology takes into account the parameters of each action, such as action lifetime, when verifying a forecast. Forecasts are linked with action based on an understanding of (1) the magnitude of previous flooding events and (2) the willingness to act "in vain" for specific actions. This is applied in the context of the Uganda Red Cross Society forecast-based financing pilot project, with forecasts from the Global Flood Awareness System (GloFAS). Using this method, we define the "danger level" of flooding, and we select the probabilistic forecast triggers that are appropriate for specific actions. Results from this methodology can be applied globally across hazards and fed into a financing system that ensures that automatic, pre-funded early action will be triggered by forecasts.
A new method for determining the optimal lagged ensemble
DelSole, T.; Tippett, M. K.; Pegion, K.
2017-01-01
We propose a general methodology for determining the lagged ensemble that minimizes the mean square forecast error (MSE). The MSE of a lagged ensemble is shown to depend only on a quantity called the cross-lead error covariance matrix, which can be estimated from a short hindcast data set and parameterized in terms of analytic functions of time. The resulting parameterization allows the skill of forecasts to be evaluated for an arbitrary ensemble size and initialization frequency. Remarkably, the parameterization can also estimate the MSE of a burst ensemble simply by taking the limit of an infinitely small interval between initialization times. This methodology is applied to forecasts of the Madden-Julian Oscillation (MJO) from version 2 of the Climate Forecast System (CFSv2). For leads greater than a week, little improvement is found in the MJO forecast skill for lagged windows longer than 5 days or initializations more frequent than 4 times per day. We find that if the initialization frequency is too infrequent, important structures of the lagged error covariance matrix are lost. Lastly, we demonstrate that the forecast error at leads ≥10 days can be reduced by optimally weighting the lagged ensemble members. The weights are shown to depend only on the cross-lead error covariance matrix. While the methodology developed here is applied to CFSv2, the technique can be easily adapted to other forecast systems. PMID:28580050
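The last point above has a standard closed form: minimizing the weighted-mean error variance w'Sw subject to the weights summing to one gives w = S⁻¹1 / (1'S⁻¹1), where S is the cross-lead error covariance matrix. A small sketch with an invented covariance matrix (the numbers are illustrative, not from the paper):

```python
import numpy as np

# Optimal weights for a lagged ensemble mean: minimize w' S w s.t. sum(w)=1,
# where S is the cross-lead error covariance matrix. Solution: S^-1 1, renormalised.

def optimal_lag_weights(S):
    ones = np.ones(S.shape[0])
    w = np.linalg.solve(S, ones)
    return w / w.sum()

# Invented example: member 0 (most recent initialization) has the lowest
# error variance, so it should receive the largest weight.
S = np.array([[1.0, 0.3, 0.2],
              [0.3, 1.5, 0.4],
              [0.2, 0.4, 2.0]])
w = optimal_lag_weights(S)
```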
Approaches to Forecasting Demands for Library Network Services. Report No. 10.
ERIC Educational Resources Information Center
Kang, Jong Hoa
The problem of forecasting monthly demands for library network services is considered in terms of using forecasts as inputs to policy analysis models, and in terms of using forecasts to aid in the making of budgeting and staffing decisions. Box-Jenkins time-series methodology, adaptive filtering, and regression approaches are examined and compared…
Pharmaceutical expenditure forecast model to support health policy decision making.
Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Mzoughi, Olfa; Toumi, Mondher
2014-01-01
With constant incentives for healthcare payers to contain their pharmaceutical budgets, modelling the impact of policy decisions has become critical. The objective of this project was to test the impact of various policy decisions on the pharmaceutical budget (developed for the European Commission for the project 'European Union (EU) Pharmaceutical expenditure forecast' - http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). A model was built to assess policy scenarios' impact on the pharmaceutical budgets of seven member states of the EU, namely France, Germany, Greece, Hungary, Poland, Portugal, and the United Kingdom. The following scenarios were tested: expanding the UK policies to the EU, changing time to market access, modifying generic price and penetration, and shifting the distribution chain of biosimilars (retail/hospital). Applying the UK policy resulted in dramatic savings for Germany (10 times the base-case forecast) and substantial additional savings for France and Portugal (2 and 4 times the base-case forecast, respectively). Delaying time to market was found to be a very powerful tool to reduce pharmaceutical expenditure. Applying the EU transparency directive (6-month process for pricing and reimbursement) increased pharmaceutical expenditure for all countries (from 1.1 to 4 times the base-case forecast), except in Germany (additional savings). Decreasing the price of generics and boosting the penetration rate, as well as shifting the distribution of biosimilars through the hospital chain, were also key methods to reduce pharmaceutical expenditure. Changing the reimbursement rate to 100% in all countries led to an important increase in the pharmaceutical budget. Forecasting pharmaceutical expenditure is a critical exercise to inform policy decision makers. The most important levers identified by the model on the pharmaceutical budget were generic and biosimilar prices, penetration rate, and distribution.
Reducing, even slightly, the prices of generics had a major impact on savings. However, very aggressive pricing of generic and biosimilar products might make this market unattractive and be counterproductive. Of note, delaying time to access for innovative products was also identified as an effective lever to increase savings but might not be a desirable policy for breakthrough products. Increasing patient financial contributions, either directly or indirectly via their private insurance, is a more likely scenario than expanding national pharmaceutical expenditure coverage.
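The generic-entry lever discussed above is, at its core, driven by three inputs: the originator's pre-expiry sales, the generic price discount, and the generic penetration rate. A toy sketch of that arithmetic; the formula's exact form and all numbers are illustrative assumptions, not the model's specification:

```python
# Toy sketch of post-patent-expiry savings: the generic share saves the
# discount on its volume, and any brand price cut saves on the remainder.
# All parameters and the formula itself are illustrative assumptions.

def generic_savings(originator_sales, discount, penetration, brand_price_drop=0.0):
    """Annual saving after patent expiry (same currency units as sales)."""
    generic_part = originator_sales * penetration * discount
    brand_part = originator_sales * (1 - penetration) * brand_price_drop
    return generic_part + brand_part

# e.g. a EUR 100m originator, 60% generic penetration at a 40% discount,
# plus a 10% price cut on the remaining brand share
saving = generic_savings(100e6, discount=0.40, penetration=0.60, brand_price_drop=0.10)
```

With these invented inputs the saving is 24m from the generic share plus 4m from the brand price cut, illustrating why discount and penetration dominate the forecast.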
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hudgins, Andrew P.; Waight, Jim; Grover, Shailendra
OMNETRIC Corp., Duke Energy, CPS Energy, and the University of Texas at San Antonio (UTSA) created a project team to execute the project 'OpenFMB Reference Architecture Demonstration.' The project included development and demonstration of concepts that will enable the electric utility grid to host larger penetrations of renewable resources. The project concept calls for the aggregation of renewable resources and loads into microgrids and the control of these microgrids with an implementation of the OpenFMB Reference Architecture. The production of power from the renewable resources that are appearing on the grid today is very closely linked to the weather. Themore » difficulty of forecasting the weather, which is well understood, leads to difficulty in forecasting the production of renewable resources. The current state of the art in forecasting the power production from renewables (solar PV and wind) are accuracies in the range of 12-25 percent NMAE. In contrast the demand for electricity aggregated to the system level, is easier to predict. The state of the art of demand forecasting done, 24 hours ahead, is about 2-3% MAPE. Forecasting the load to be supplied from conventional resources (demand minus generation from renewable resources) is thus very hard to forecast. This means that even a few hours before the time of consumption, there can be considerable uncertainty over what must be done to balance supply and demand. Adding to the problem of difficulty of forecasting, is the reality of the variability of the actual production of power from renewables. Due to the variability of wind speeds and solar insolation, the actual output of power from renewable resources can vary significantly over a short period of time. Gusts of winds result is variation of power output of wind turbines. The shadows of clouds moving over solar PV arrays result in the variation of power production of the array. This compounds the problem of balancing supply and demand in real time. 
Establishing a control system that can manage distribution systems with large penetrations of renewable resources is difficult due to two major issues: (1) the lack of standardization and interoperability between the vast array of equipment in operation and on the market, most of which use different and proprietary means of communication and (2) the magnitude of the network and the information it generates and consumes. The objective of this project is to provide the industry with a design concept and tools that will enable the electric power grid to overcome these barriers and support a larger penetration of clean energy from renewable resources.
The SPoRT-WRF: Evaluating the Impact of NASA Datasets on Convective Forecasts
NASA Technical Reports Server (NTRS)
Zavodsky, Bradley; Kozlowski, Danielle; Case, Jonathan; Molthan, Andrew
2012-01-01
Short-term Prediction Research and Transition (SPoRT) seeks to improve short-term, regional weather forecasts using unique NASA products and capabilities. SPoRT has developed a unique, real-time configuration of the NASA Unified Weather Research and Forecasting (WRF) model (ARW core) that integrates all SPoRT modeling research data: (1) the 2-km SPoRT Sea Surface Temperature (SST) composite, (2) 3-km LIS output with 1-km Greenness Vegetation Fractions (GVFs), and (3) 45-km AIRS retrieved profiles. This real-time forecast was transitioned to NOAA's Hazardous Weather Testbed (HWT) as a deterministic model for the Experimental Forecast Program (EFP). Feedback from forecasters/participants and internal evaluation of the SPoRT-WRF show a cool, dry bias that appears to suppress convection, likely related to the methodology for assimilation of AIRS profiles. Version 2 of the SPoRT-WRF will premiere at the 2012 EFP and include NASA physics, a cycling data assimilation methodology, better coverage of precipitation forcing, and new GVFs.
NASA Astrophysics Data System (ADS)
Morin, C.; Quattrochi, D. A.; Zavodsky, B.; Case, J.
2015-12-01
Dengue fever (DF) is an important mosquito-transmitted disease that is strongly influenced by meteorological and environmental conditions. Recent research has focused on forecasting DF case numbers based on meteorological data. However, these forecasting tools have generally relied on empirical models that require long DF time series to train. Additionally, their accuracy has been tested retrospectively, using past meteorological data. Consequently, the operational utility of the forecasts is still in question because the errors associated with weather and climate forecasts are not reflected in the results. Using up-to-date weekly dengue case numbers for model parameterization and weather forecast data as meteorological input, we produced weekly forecasts of DF cases in San Juan, Puerto Rico. Each week, the past weeks' case counts were used to re-parameterize a process-based DF model driven with updated weather forecast data to generate forecasts of DF case numbers. Real-time weather forecast data were produced using the Weather Research and Forecasting (WRF) numerical weather prediction (NWP) system, enhanced using additional high-resolution NASA satellite data. This methodology was conducted in a weekly iterative process, with each DF forecast being evaluated using county-level DF cases reported by the Puerto Rico Department of Health. The one-week DF forecasts were accurate, especially considering the two sources of model error. First, weather forecasts were sometimes inaccurate and generally produced lower than observed temperatures. Second, the DF model was often overly influenced by the previous week's DF case numbers, though this phenomenon could be lessened by increasing the number of simulations included in the forecast. Although these results are promising, we would like to develop a methodology to produce longer-range forecasts so that public health workers can better prepare for dengue epidemics.
Bayesian Hierarchical Models to Augment the Mediterranean Forecast System
2010-09-30
In part 2 (Bonazzi et al., 2010), the impact of the ensemble forecast methodology based on MFS-Wind-BHM perturbations is documented. Forecast...absence of dt data stage inputs, the forecast impact of MFS-Error-BHM is neutral. Experiments are underway now to introduce dt back into the MFS-Error...BHM and quantify forecast impacts at MFS. MFS-SuperEnsemble-BHM We have assembled all needed datasets and completed algorithmic development
Men, Zhongxian; Yee, Eugene; Lien, Fue-Sang; Yang, Zhiling; Liu, Yongqian
2014-01-01
Short-term wind speed and wind power forecasts (for a 72 h period) are obtained using a nonlinear autoregressive exogenous artificial neural network (ANN) methodology which incorporates either numerical weather prediction or high-resolution computational fluid dynamics wind field information as an exogenous input. An ensemble approach is used to combine the predictions from many candidate ANNs in order to provide improved forecasts for wind speed and power, along with the associated uncertainties in these forecasts. More specifically, the ensemble ANN is used to quantify the uncertainties arising from the network weight initialization and from the unknown structure of the ANN. All members forming the ensemble of neural networks were trained using an efficient particle swarm optimization algorithm. The results of the proposed methodology are validated using wind speed and wind power data obtained from an operational wind farm located in Northern China. The assessment demonstrates that this methodology for wind speed and power forecasting generally provides an improvement in predictive skills when compared to the practice of using an "optimal" weight vector from a single ANN while providing additional information in the form of prediction uncertainty bounds.
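The ensemble idea in this abstract, namely many members that differ only in initialization, whose spread yields uncertainty bounds, can be sketched with a toy stand-in. The autoregressive members with randomly perturbed coefficients below are a hypothetical substitute for the paper's PSO-trained NARX networks, used only to show how ensemble spread becomes a prediction interval:

```python
import numpy as np

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 8 * np.pi, 200)) * 4 + 8  # toy wind-speed series

def fit_member(series, order, rng):
    # Least-squares autoregressive fit; the random perturbation mimics the
    # member-to-member variability of differently initialized networks.
    X = np.column_stack([series[i:len(series) - order + i] for i in range(order)])
    y = series[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef + rng.normal(0.0, 0.01, size=coef.shape)

members = [fit_member(series, 3, rng) for _ in range(20)]
preds = np.array([series[-3:] @ c for c in members])  # one-step member forecasts
mean, spread = preds.mean(), preds.std()              # point forecast + uncertainty
lower, upper = mean - 2 * spread, mean + 2 * spread   # simple prediction bounds
```

The ensemble mean plays the role of the combined forecast, and the member spread supplies the uncertainty bounds that a single "optimal" network cannot provide.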
The combined value of wind and solar power forecasting improvements and electricity storage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Bri-Mathias; Brancucci Martinez-Anido, Carlo; Wang, Qin
As the penetration rates of variable renewable energy increase, the value of power systems operation flexibility technology options, such as renewable energy forecasting improvements and electricity storage, is also assumed to increase. In this work, we examine the value of these two technologies, when used independently and concurrently, for two real case studies that represent the generation mixes for the California and Midcontinent Independent System Operators (CAISO and MISO). Since both technologies provide additional system flexibility, they reduce operational costs and renewable curtailment for both generation mixes under study. Interestingly, the relative impacts are quite similar when both technologies are used together. Though both flexibility options can solve some of the same issues that arise with high penetration levels of renewables, they do not seem to significantly increase or decrease the economic potential of the other technology.
Zhang, Jie; Hodge, Bri -Mathias; Lu, Siyuan; ...
2015-11-10
Accurate solar photovoltaic (PV) power forecasting allows utilities to reliably utilize solar resources on their systems. However, to truly measure the improvements that any new solar forecasting methods provide, it is important to develop a methodology for determining baseline and target values for the accuracy of solar forecasting at different spatial and temporal scales. This paper aims at developing a framework to derive baseline and target values for a suite of generally applicable, value-based, and custom-designed solar forecasting metrics. The work was informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models in combination with a radiative transfer model. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of PV power output. The proposed reserve-based methodology is a reasonable and practical approach that can be used to assess the economic benefits gained from improvements in accuracy of solar forecasting. Lastly, the financial baseline and targets can be translated back to forecasting accuracy metrics and requirements, which will guide research on solar forecasting improvements toward the areas that are most beneficial to power systems operations.
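Persistence models, one of the baselines named above, are trivial to implement. The sketch below is illustrative only (the paper's baseline also combines NWP output with a radiative transfer model, which is not reproduced here); it shows a naive persistence forecast and the RMSE used to score any candidate against it:

```python
import numpy as np

def persistence_forecast(history, horizon):
    """Naive persistence baseline: every future value equals the last observation."""
    return np.full(horizon, float(history[-1]))

def rmse(actual, forecast):
    """Root-mean-square error between observed and forecast values."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return float(np.sqrt(np.mean((actual - forecast) ** 2)))
```

A new forecasting method only adds value where its error falls measurably below such a baseline, which is exactly the gap the paper's baseline/target framework quantifies.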
Adaptation of Mesoscale Weather Models to Local Forecasting
NASA Technical Reports Server (NTRS)
Manobianco, John T.; Taylor, Gregory E.; Case, Jonathan L.; Dianic, Allan V.; Wheeler, Mark W.; Zack, John W.; Nutter, Paul A.
2003-01-01
Methodologies have been developed for (1) configuring mesoscale numerical weather-prediction models for execution on high-performance computer workstations to make short-range weather forecasts for the vicinity of the Kennedy Space Center (KSC) and the Cape Canaveral Air Force Station (CCAFS) and (2) evaluating the performances of the models as configured. These methodologies have been implemented as part of a continuing effort to improve weather forecasting in support of operations of the U.S. space program. The models, methodologies, and results of the evaluations also have potential value for commercial users who could benefit from tailoring their operations and/or marketing strategies based on accurate predictions of local weather. More specifically, the purpose of developing the methodologies for configuring the models to run on computers at KSC and CCAFS is to provide accurate forecasts of winds, temperature, and such specific thunderstorm-related phenomena as lightning and precipitation. The purpose of developing the evaluation methodologies is to maximize the utility of the models by providing users with assessments of the capabilities and limitations of the models. The models used in this effort thus far include the Mesoscale Atmospheric Simulation System (MASS), the Regional Atmospheric Modeling System (RAMS), and the National Centers for Environmental Prediction Eta Model (Eta for short). The configuration of the MASS and RAMS is designed to run the models at very high spatial resolution and incorporate local data to resolve fine-scale weather features. Model preprocessors were modified to incorporate surface, ship, buoy, and rawinsonde data as well as data from local wind towers, wind profilers, and conventional or Doppler radars. The overall evaluation of the MASS, Eta, and RAMS was designed to assess the utility of these mesoscale models for satisfying the weather-forecasting needs of the U.S. space program.
The evaluation methodology includes objective and subjective verification methodologies. Objective (e.g., statistical) verification of point forecasts is a stringent measure of model performance, but when used alone, it is not usually sufficient for quantifying the value of the overall contribution of the model to the weather-forecasting process. This is especially true for mesoscale models with enhanced spatial and temporal resolution that may be capable of predicting meteorologically consistent, though not necessarily accurate, fine-scale weather phenomena. Therefore, subjective (phenomenological) evaluation, focusing on selected case studies and specific weather features, such as sea breezes and precipitation, has been performed to help quantify the added value that cannot be inferred solely from objective evaluation.
NASA Astrophysics Data System (ADS)
Pierro, Marco; De Felice, Matteo; Maggioni, Enrico; Moser, David; Perotto, Alessandro; Spada, Francesco; Cornaro, Cristina
2017-04-01
The growing photovoltaic generation results in a stochastic variability of the electric demand that could compromise the stability of the grid and increase the amount of energy reserve and the energy imbalance cost. On a regional scale, solar power estimation and forecasting are becoming essential for Distribution System Operators, Transmission System Operators, energy traders, and aggregators of generation. Indeed, the estimation of regional PV power can be used for PV power supervision and real-time control of the residual load. Mid-term PV power forecasts can be employed for transmission scheduling to reduce energy imbalance and the related cost of penalties, residual load tracking, trading optimization, and secondary energy reserve assessment. In this context, a new upscaling method was developed and used for estimation and mid-term forecasting of the photovoltaic distributed generation in a small area in the north of Italy under the control of a local DSO. The method was based on spatial clustering of the PV fleet and on neural network models that take satellite or numerical weather prediction data (centered on cluster centroids) as input to estimate or predict the regional solar generation. It requires a low computational effort, and very little input information needs to be provided by users. The power estimation model achieved an RMSE of 3% of installed capacity. Intra-day forecasts (from 1 to 4 hours ahead) obtained an RMSE of 5%-7%, while the one- and two-day forecasts achieved an RMSE of 7% and 7.5%, respectively. A model to estimate the forecast error and the prediction intervals was also developed. The photovoltaic production in the considered region provided 6.9% of the electric consumption in 2015. Since the PV penetration is very similar to the one observed at national level (7.9%), this is a good case study to analyse the impact of PV generation on the electric grid and the effects of PV power forecasts on transmission scheduling and on secondary reserve estimation.
It appears that, already at 7% PV penetration, distributed PV generation can have a great impact both on the DSO energy need and on the transmission scheduling capability. Indeed, for some hours of the day in summer time, the photovoltaic generation can provide from 50% to 75% of the energy that the local DSO would otherwise buy from the Italian TSO to cover the electrical demand. Moreover, mid-term forecasts can reduce the annual energy imbalance between the scheduled transmission and the actual one from 10% of the TSO energy supply (without considering the PV forecast) to 2%. Furthermore, it was shown that prediction intervals can be used not only to estimate the probability of a specific PV generation bid on the energy market, but also to reduce the energy reserve predicted for the next day. Two different methods for energy reserve estimation were developed and tested. The first is based on a clear sky model, while the second makes use of the PV prediction intervals at the 95% confidence level. The latter reduces the amount of the day-ahead energy reserve by 36% with respect to the clear sky method.
Scientific breakthroughs necessary for the commercial success of renewable energy (Invited)
NASA Astrophysics Data System (ADS)
Sharp, J.
2010-12-01
In recent years the wind energy industry has grown at an unprecedented rate, and in certain regions has attained significant penetration into the power infrastructure. This growth has been both a result of, and a precursor to, significant advances in the science and business of wind energy. But as a result of this growth and increasing penetration, further advances and breakthroughs will become increasingly important. These advances will be required in a number of different aspects of wind energy, including: resource assessment, operations and performance analysis, forecasting, and the impacts of increased wind energy development. Resource assessment has benefited from the development of tools specifically designed for this purpose. Despite this, the atmosphere is often portrayed in an extremely simplified manner by these tools. New methodologies should rely upon more sophisticated application of the physics of fluid flows. There will need to be an increasing reliance on, and acceptance of, improved measurement techniques (remote sensing, volume rather than point measurements, etc.), and more sophisticated and higher-resolution numerical methods for micrositing. The goals of resource assessment will have to include a better understanding of the variability and forecastability of potential sites. Operational and performance analysis is vital to quantifying how well all aspects of the business are being carried out. Operational wind farms generate large amounts of meteorological and mechanical data. Data mining and detailed analysis of these data have proven invaluable in shedding light upon poorly understood aspects of the science and industry. Future analysis will need to be even more rigorous and creative. Worthy topics of study include the impact of turbine wakes upon downstream turbine performance, how to utilize operational data to improve resource assessment and forecasting, and what the impacts of large-scale wind energy development might be.
Forecasting is an area in which there have been great advances, and yet even greater advances will be required in the future. Until recently, the scale of wind energy made forecasting relatively unimportant - something that could be handled by automated systems augmented with limited observations. Recently, however, the use of human forecasting teams and specialized observation networks has greatly advanced the state of the art. Further advances will need to include dense networks of observations, providing timely and reliable observations over a much deeper layer of the boundary layer. High resolution rapid refresh models incorporating these observations via data assimilation should advance the state of the art further. Finally, understanding potential impacts of increasing wind energy development is an area where there has been significant interest lately. Preliminary studies have raised concerns of possible unintended climatological consequences upon downwind areas. A policy breakthrough was the inclusion of language into SB 1462, providing for research into these concerns. Advances will be required in the areas of transmission system improvements. The generation of large amounts of wind energy itself will impact the energy infrastructure, and will require breakthroughs within all of the topics above, and thus be a breakthrough in its own right.
A Delphi forecast of technology in education
NASA Technical Reports Server (NTRS)
Robinson, B. E.
1973-01-01
The results are reported of a Delphi forecast of the utilization and social impacts of large-scale educational telecommunications technology. The focus is on both forecasting methodology and educational technology. The various methods of forecasting used by futurists are analyzed from the perspective of the most appropriate method for a prognosticator of educational technology, and review and critical analysis are presented of previous forecasts and studies. Graphic responses, summarized comments, and a scenario of education in 1990 are presented.
Financial options methodology for analyzing investments in new technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenning, B.D.
1994-12-31
The evaluation of investments in longer-term research and development in emerging technologies, because of the nature of such subjects, must address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.
Financial options methodology for analyzing investments in new technology
NASA Technical Reports Server (NTRS)
Wenning, B. D.
1995-01-01
The evaluation of investments in longer-term research and development in emerging technologies, because of the nature of such subjects, must address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.
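The options valuation methodology described in these two records treats a deferrable R&D investment like a financial call option. As a minimal sketch, assuming Black-Scholes dynamics (which the abstracts do not specify), the present value of project cash flows plays the role of the underlying price and the investment cost the strike:

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, T, r, sigma):
    """European call value. In a real-options reading, S is the present value
    of project cash flows, K the investment cost, sigma the cash-flow
    volatility, and T the time the investment can be deferred."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)
```

Even a project with zero conventional NPV (S = K) carries positive option value when cash flows are uncertain and the decision can be deferred, which is exactly the strategic value the abstracts argue present-value methodology neglects.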
Long- Range Forecasting Of The Onset Of Southwest Monsoon Winds And Waves Near The Horn Of Africa
2017-12-01
SUMMARY OF CLIMATE ANALYSIS AND LONG-RANGE FORECAST METHODOLOGY: Prior theses from Heidt (2006) and Lemke (2010) used methods similar to ours. (The remainder of this excerpt consists of table-of-contents fragments: II. Data and Methods; D. Analysis and Forecast Methods; 1. Predictand Selection.)
Forecasting--A Systematic Modeling Methodology. Paper No. 489.
ERIC Educational Resources Information Center
Mabert, Vincent A.; Radcliffe, Robert C.
In an attempt to bridge the gap between academic understanding and practical business use, the Box-Jenkins technique of time series analysis for forecasting future events is presented with a minimum of mathematical notation. The method is presented in three stages: a discussion of traditional forecasting techniques, focusing on traditional…
A Short-Term Forecasting Procedure for Institution Enrollments.
ERIC Educational Resources Information Center
Pfitzner, Charles Barry
1987-01-01
Applies the Box-Jenkins time series methodology to enrollment data for the Virginia community college system. Describes the enrollment data set, the Box-Jenkins approach, and the forecasting results. Discusses the value of one-quarter ahead enrollment forecasts and implications for practice. Provides a technical discussion of the model. (DMM)
Client-Friendly Forecasting: Seasonal Runoff Predictions Using Out-of-the-Box Indices
NASA Astrophysics Data System (ADS)
Weil, P.
2013-12-01
For more than a century, statistical relationships have been recognized between atmospheric conditions at locations separated by thousands of miles, referred to as teleconnections. Some of the recognized teleconnections provide useful information about expected hydrologic conditions, so certain records of atmospheric conditions are quantified and published as hydroclimate indices. Certain hydroclimate indices can serve as strong leading indicators of climate patterns over North America and can be used to make skillful forecasts of seasonal runoff. The methodology described here creates a simple-to-use model that utilizes easily accessed data to make forecasts of April through September runoff months before the runoff season begins. For this project, forecasting models were developed for two snowmelt-driven river systems in Colorado and Wyoming. In addition to the global hydroclimate indices, the methodology uses several local hydrologic variables including the previous year's drought severity, headwater snow water equivalent, and the reservoir contents for the major reservoirs in each basin. To improve the skill of the forecasts, logistic regression is used to develop a model that provides the likelihood that a year will fall into the upper, middle, or lower tercile of historical flows. Categorical forecasting has two major advantages over modeling of specific flow amounts: (1) with fewer prediction outcomes, models tend to have better predictive skill, and (2) categorical models are very useful to clients and agencies with specific flow thresholds that dictate major changes in water resources management. The resulting methodology and functional forecasting model product is highly portable, applicable to many major river systems, and easily explained to a non-technical audience.
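The tercile-classification scheme in this abstract maps naturally onto multinomial logistic regression. The sketch below uses synthetic predictors as hypothetical stand-ins for a hydroclimate index and peak snow-water equivalent (the real model also uses drought severity and reservoir contents), and is not the authors' fitted model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 90  # hypothetical years of record
# Two synthetic predictors standing in for a hydroclimate index and
# headwater snow-water equivalent.
X = np.column_stack([rng.normal(size=n), rng.normal(size=n)])
runoff = 0.8 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(scale=0.5, size=n)

# Label each year by its historical tercile: 0 = lower, 1 = middle, 2 = upper.
terciles = np.digitize(runoff, np.quantile(runoff, [1 / 3, 2 / 3]))

model = LogisticRegression(max_iter=1000).fit(X, terciles)
probs = model.predict_proba([[0.5, 1.2]])[0]  # P(lower), P(middle), P(upper)
```

The forecast is the probability vector itself, which is directly usable by clients whose operations change at specific flow thresholds.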
NASA Astrophysics Data System (ADS)
Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang
2016-07-01
This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
Plazas-Nossa, Leonardo; Hofer, Thomas; Gruber, Günter; Torres, Andres
2017-02-01
This work proposes a methodology for the forecasting of online water quality data provided by UV-Vis spectrometry. Therefore, a combination of principal component analysis (PCA) to reduce the dimensionality of a data set and artificial neural networks (ANNs) for forecasting purposes was used. The results obtained were compared with those obtained by using the discrete Fourier transform (DFT). The proposed methodology was applied to four absorbance time series data sets composed of a total of 5705 UV-Vis spectra. Absolute percentage errors obtained by applying the proposed PCA/ANN methodology vary between 10% and 13% for all four study sites. In general terms, the results obtained were hardly generalizable, as they appeared to be highly dependent on the specific dynamics of the water system; however, some trends can be outlined. The PCA/ANN methodology gives better results than the PCA/DFT forecasting procedure using a specific spectra range under the following conditions: (i) for the Salitre wastewater treatment plant (WWTP) (first hour) and Graz West R05 (first 18 min), from the last part of the UV range through the entire visible range; (ii) for the Gibraltar pumping station (first 6 min), for the full UV-Vis absorbance spectra; and (iii) for the San Fernando WWTP (first 24 min), from the full UV range to the middle part of the visible range.
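The PCA stage of the PCA/ANN pipeline (compress each spectrum to a few scores, forecast the scores, then back-project to full spectra) can be sketched with a plain SVD. The data below are a toy stand-in; the paper's actual spectra, ANN architecture, and site-specific tuning are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy stand-in for a UV-Vis absorbance time series: 200 time steps x 220 wavelengths.
spectra = rng.normal(size=(200, 220)) * 0.05 + np.linspace(1.0, 0.2, 220)

mean = spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(spectra - mean, full_matrices=False)

k = 5                                   # retained principal components
scores = (spectra - mean) @ Vt[:k].T    # low-dimensional series a forecaster would model
reconstructed = scores @ Vt[:k] + mean  # back-projection to full spectra
```

Forecasting the k score series instead of all 220 wavelength channels is what makes the ANN step tractable; the back-projection turns predicted scores into predicted spectra.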
NASA Astrophysics Data System (ADS)
Li, Xiwang
Buildings consume about 41.1% of primary energy and 74% of the electricity in the U.S. Moreover, it is estimated by the National Energy Technology Laboratory that more than 1/4 of the 713 GW of U.S. electricity demand in 2010 could be dispatchable if only buildings could respond to that dispatch through advanced building energy control and operation strategies and smart grid infrastructure. In this study, it is envisioned that neighboring buildings will have the tendency to form a cluster, an open cyber-physical system to exploit the economic opportunities provided by a smart grid, distributed power generation, and storage devices. Through optimized demand management, these building clusters will then reduce overall primary energy consumption and peak-time electricity consumption, and be more resilient to power disruptions. Therefore, this project seeks to develop a Net-zero building cluster simulation testbed and high-fidelity energy forecasting models for adaptive and real-time control and decision-making strategy development that can be used in a Net-zero building cluster. The following research activities are summarized in this thesis: 1) development of a building cluster emulator for building cluster control and operation strategy assessment; 2) development of a novel building energy forecasting methodology using active system identification and data fusion techniques, including a systematic approach for building energy system characteristic evaluation, system excitation, and model adaptation, with the developed methodology compared against other literature-reported building energy forecasting methods; 3) development of high-fidelity on-line building cluster energy forecasting models, which include energy forecasting models for buildings, PV panels, batteries, and ice tank thermal storage systems; and 4) a small-scale real-building validation study to verify the performance of the developed building energy forecasting methodology.
The outcomes of this thesis can be used for building cluster energy forecasting model development and model based control and operation optimization. The thesis concludes with a summary of the key outcomes of this research, as well as a list of recommendations for future work.
NASA Astrophysics Data System (ADS)
Engeland, Kolbjorn; Steinsland, Ingelin
2014-05-01
This study introduces a methodology for the construction of probabilistic inflow forecasts for multiple catchments and lead times, and investigates criterions for evaluation of multi-variate forecasts. A post-processing approach is used, and a Gaussian model is applied for transformed variables. The post processing model has two main components, the mean model and the dependency model. The mean model is used to estimate the marginal distributions for forecasted inflow for each catchment and lead time, whereas the dependency models was used to estimate the full multivariate distribution of forecasts, i.e. co-variances between catchments and lead times. In operational situations, it is a straightforward task to use the models to sample inflow ensembles which inherit the dependencies between catchments and lead times. The methodology was tested and demonstrated in the river systems linked to the Ulla-Førre hydropower complex in southern Norway, where simultaneous probabilistic forecasts for five catchments and ten lead times were constructed. The methodology exhibits sufficient flexibility to utilize deterministic flow forecasts from a numerical hydrological model as well as statistical forecasts such as persistent forecasts and sliding window climatology forecasts. It also deals with variation in the relative weights of these forecasts with both catchment and lead time. When evaluating predictive performance in original space using cross validation, the case study found that it is important to include the persistent forecast for the initial lead times and the hydrological forecast for medium-term lead times. Sliding window climatology forecasts become more important for the latest lead times. Furthermore, operationally important features in this case study such as heteroscedasticity, lead time varying between lead time dependency and lead time varying between catchment dependency are captured. 
Two criteria were used for evaluating the added value of the dependency model. The first was the energy score (ES), a multi-dimensional generalization of the continuous ranked probability score (CRPS). ES was calculated for all lead times and catchments together, for each catchment across all lead times, and for each lead time across all catchments. The second criterion was to use CRPS for forecasted inflows accumulated over several lead times and catchments. The results showed that ES was not very sensitive to a correct covariance structure, whereas CRPS for accumulated flows was more suitable for evaluating the dependency model. This indicates that it is more appropriate to evaluate relevant univariate variables that depend on the dependency structure than to evaluate the multivariate forecast directly.
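Both scores have standard sample-based estimators that can be computed directly from ensemble members. A minimal sketch (not the authors' code; function names and the toy values are illustrative):

```python
import math

def crps_ensemble(members, obs):
    """Sample-based CRPS: E|X - y| - 0.5 * E|X - X'| (lower is better)."""
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(a - b) for a in members for b in members) / (m * m)
    return term1 - 0.5 * term2

def energy_score(members, obs):
    """Energy score: multivariate generalization of CRPS; members are vectors."""
    def dist(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    m = len(members)
    term1 = sum(dist(x, obs) for x in members) / m
    term2 = sum(dist(a, b) for a in members for b in members) / (m * m)
    return term1 - 0.5 * term2
```

A perfect, sharp ensemble scores zero; as the abstract notes, applying CRPS to inflows accumulated over catchments and lead times probes the dependency structure more sharply than ES applied to the raw multivariate forecast.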
Educational Forecasting Methodologies: State of the Art, Trends, and Highlights.
ERIC Educational Resources Information Center
Hudson, Barclay; Bruno, James
This overview of both quantitative and qualitative methods of educational forecasting is introduced by a discussion of a general typology of forecasting methods. In each of the following sections, discussion follows the same general format: a number of basic approaches are identified (e.g. extrapolation, correlation, systems modelling), and each…
Semi-nonparametric VaR forecasts for hedge funds during the recent crisis
NASA Astrophysics Data System (ADS)
Del Brio, Esther B.; Mora-Valencia, Andrés; Perote, Javier
2014-05-01
The need to provide accurate value-at-risk (VaR) forecasting measures has triggered an important literature in econophysics. Although accurate VaR models and methodologies are particularly in demand among hedge fund managers, few articles are specifically devoted to implementing new techniques in hedge fund VaR forecasting. This article advances these issues by comparing the performance of risk measures based on parametric distributions (the normal, Student's t and skewed t), semi-nonparametric (SNP) methodologies based on Gram-Charlier (GC) series, and the extreme value theory (EVT) approach. Our results show that the normal-, Student's t- and skewed t-based methodologies fail to forecast hedge fund VaR, whilst the SNP and EVT approaches do so accurately. We extend these results to the multivariate framework by providing an explicit formula for the GC copula and its density that encompasses the Gaussian copula and accounts for non-linear dependences. We show that the VaR obtained from the meta-GC accurately captures portfolio risk and outperforms regulatory VaR estimates obtained through the meta-Gaussian and Student's t distributions.
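As a hedged illustration of how higher-moment corrections enter VaR, the Cornish-Fisher expansion below adjusts the Gaussian quantile for sample skewness and excess kurtosis. It is related in spirit to Gram-Charlier-based SNP methods but is not the authors' model; all names and numbers are illustrative:

```python
from statistics import NormalDist, mean, stdev

def cornish_fisher_var(returns, alpha=0.01):
    """VaR with a Cornish-Fisher quantile correction for skewness/kurtosis.

    Illustrative only: related in spirit to Gram-Charlier expansions, not the
    paper's exact SNP specification. Returns a positive loss figure."""
    mu, sd = mean(returns), stdev(returns)
    n = len(returns)
    z = NormalDist().inv_cdf(alpha)
    s = sum(((r - mu) / sd) ** 3 for r in returns) / n          # sample skewness
    k = sum(((r - mu) / sd) ** 4 for r in returns) / n - 3.0    # excess kurtosis
    z_cf = (z + (z**2 - 1) * s / 6 + (z**3 - 3 * z) * k / 24
            - (2 * z**3 - 5 * z) * s**2 / 36)
    return -(mu + sd * z_cf)
```

For fat-tailed return series the corrected quantile pushes the 1% VaR further into the tail than the plain Gaussian estimate, which is the failure mode the paper documents for normal-based methodologies.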
Short term load forecasting of anomalous load using hybrid soft computing methods
NASA Astrophysics Data System (ADS)
Rasyid, S. A.; Abdullah, A. G.; Mulyadi, Y.
2016-04-01
Load forecast accuracy has a direct impact on how economical generation costs are. Electricity consumption on holidays tends not to follow the load pattern of a normal day; such days are defined here as anomalous load days. In this paper, a hybrid ANN-particle swarm method is proposed to improve the accuracy of forecasting the anomalous loads that often occur on holidays. The proposed methodology has been used to forecast the half-hourly electricity demand for power systems in the Indonesian National Electricity Market in the West Java region. Experiments were conducted by testing various learning rates and learning data inputs. The performance of the methodology was validated with real data from the national electricity company. The results show that the proposed approach is very effective for short-term load forecasting in the case of anomalous loads, and the hybrid ANN-particle swarm model is relatively simple and easy for engineers to use as an analysis tool.
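The hybrid idea, using particle swarm optimization in place of (or alongside) backpropagation to fit network weights, can be sketched on a toy one-neuron problem. This is illustrative only; the paper's network topology, training data, and PSO settings are not reproduced here:

```python
import math, random

def pso_minimize(loss, dim, n_particles=30, iters=300, seed=1):
    """Minimal global-best particle swarm optimizer (illustrative settings)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal bests
    pbest_val = [loss(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # global best
    w, c1, c2 = 0.7, 1.5, 1.5                        # inertia, cognitive, social
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = loss(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# Toy "network": one tanh neuron, fitted to reproduce y = tanh(0.8*x + 0.3)
data = [(x / 10.0, math.tanh(0.8 * (x / 10.0) + 0.3)) for x in range(-10, 11)]
mse = lambda p: sum((math.tanh(p[0] * x + p[1]) - y) ** 2 for x, y in data) / len(data)
params, err = pso_minimize(mse, dim=2)
```

In the hybrid scheme described by the paper, `loss` would be the forecast error of a full load-forecasting network over the training set rather than this toy fit.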
New product forecasting with limited or no data
NASA Astrophysics Data System (ADS)
Ismai, Zuhaimy; Abu, Noratikah; Sufahani, Suliadi
2016-10-01
In the real world, forecasts are usually based on historical data, with the assumption that future behaviour will be the same. But how do we forecast when no such data are available? New products or new technologies normally have only a limited amount of data available. Knowing that forecasting is valuable for decision making, this paper presents forecasting of new products or new technologies using aggregate diffusion models and a modified Bass model. A newly launched Proton car and its market penetration were chosen to demonstrate the possibility of forecasting sales demand where there is limited or no data available. The model was developed to forecast the diffusion of a new vehicle, or an innovation, in Malaysian society. It represents the level of spread of the new vehicle among a given set of the society in terms of a simple mathematical function of the time elapsed since the introduction of the new product, and forecasts the car sales volume. A procedure for the proposed diffusion model was designed and its parameters were estimated. Results obtained by applying the proposed diffusion model and numerical calculation show that the model is robust and effective for forecasting demand for the new vehicle. The results reveal that the newly developed modified Bass diffusion demand function contributes significantly to forecasting the diffusion of the new Proton car or a new product.
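The Bass model underlying such forecasts needs only a market size and two behavioural parameters (innovation coefficient p and imitation coefficient q), which is why it can be applied with little or no sales history. A sketch of the classic closed form (the p and q defaults are common illustrative values from the diffusion literature, not the paper's fitted estimates):

```python
import math

def bass_cumulative(t, p, q):
    """Fraction of eventual adopters by time t (p: innovation, q: imitation)."""
    e = math.exp(-(p + q) * t)
    return (1 - e) / (1 + (q / p) * e)

def bass_sales(t, m, p=0.03, q=0.38):
    """Per-period sales forecast for market size m: new adopters in period t."""
    return m * (bass_cumulative(t, p, q) - bass_cumulative(t - 1, p, q))
```

With q > p the sales curve rises to a peak and then declines, the familiar diffusion shape a new-vehicle forecast would trace; fitting p, q, and m to early Proton sales would be the estimation step the paper describes.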
Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheung, WanYin; Zhang, Jie; Florita, Anthony
2015-12-08
Uncertainties associated with solar forecasts present challenges to maintaining grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors in the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded the forecasts with the smallest NRMSE in all parameters. The NRMSE of the solar irradiance forecasts of the ensemble model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that errors in the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.
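NRMSE conventions vary (normalization by observed range, mean, or plant capacity), and the preprint summary does not say which is used; the sketch below normalizes by the observed range as an assumption:

```python
import math

def nrmse(forecast, actual):
    """RMSE normalized by the observed range (one common convention; the
    preprint may normalize differently, e.g. by capacity or mean)."""
    n = len(actual)
    rmse = math.sqrt(sum((f - a) ** 2 for f, a in zip(forecast, actual)) / n)
    return rmse / (max(actual) - min(actual))
```

Because the normalizer changes the scale of the result, comparisons such as the 28.10% reduction quoted above are only meaningful when every model is scored under the same convention.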
DOT National Transportation Integrated Search
2011-02-01
The New Hampshire Department of Transportation Pavement Management Section's scope of work includes monitoring, evaluating, and sometimes forecasting the condition of New Hampshire's 4,560 miles of roadway network in order to provide guidance o...
Pharmaceutical expenditure forecast model to support health policy decision making
Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Mzoughi, Olfa; Toumi, Mondher
2014-01-01
Background and objective With constant incentives for healthcare payers to contain their pharmaceutical budgets, modelling the impact of policy decisions has become critical. The objective of this project was to test the impact of various policy decisions on the pharmaceutical budget (developed for the European Commission for the project ‘European Union (EU) Pharmaceutical expenditure forecast’ – http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). Methods A model was built to assess the impact of policy scenarios on the pharmaceutical budgets of seven member states of the EU, namely France, Germany, Greece, Hungary, Poland, Portugal, and the United Kingdom. The following scenarios were tested: expanding the UK policies to the EU, changing time to market access, modifying generic price and penetration, and shifting the distribution chain of biosimilars (retail/hospital). Results Applying the UK policy resulted in dramatic savings for Germany (10 times the base case forecast) and substantial additional savings for France and Portugal (2 and 4 times the base case forecast, respectively). Delaying time to market was found to be a very powerful tool to reduce pharmaceutical expenditure. Applying the EU transparency directive (a 6-month process for pricing and reimbursement) increased pharmaceutical expenditure for all countries (from 1.1 to 4 times the base case forecast), except Germany (additional savings). Decreasing the price of generics and boosting their penetration rate, as well as shifting the distribution of biosimilars to the hospital chain, were also key ways to reduce pharmaceutical expenditure. Changing the reimbursement rate to 100% in all countries led to an important increase in the pharmaceutical budget. Conclusions Forecasting pharmaceutical expenditure is a critical exercise to inform policy decision makers.
The most important levers on the pharmaceutical budget identified by the model were generic and biosimilar prices, penetration rates, and distribution. Reducing the prices of generics, even slightly, had a major impact on savings. However, very aggressive pricing of generic and biosimilar products might make this market unattractive and could be counterproductive. Notably, delaying time to access for innovative products was also identified as an effective lever to increase savings, but might not be a desirable policy for breakthrough products. Increasing patient financial contributions, either directly or indirectly via their private insurance, is a more likely scenario than expanding national pharmaceutical expenditure coverage. PMID:27226830
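As a toy illustration (not the Commission's model) of how the generic-price and penetration levers combine, post-patent savings can be approximated as the generic discount applied to the volume that switches, plus any mandated price cut on the remaining brand volume. All names and numbers below are hypothetical:

```python
def generic_entry_savings(originator_sales, penetration, generic_discount,
                          brand_price_cut=0.0):
    """Simplified annual savings after loss of exclusivity (illustrative only).

    Volume that shifts to generics is bought at a discounted price; the
    remaining brand volume may also face a mandated price cut."""
    generic_part = originator_sales * penetration * generic_discount
    brand_part = originator_sales * (1 - penetration) * brand_price_cut
    return generic_part + brand_part
```

Even this toy form shows why small changes in discount or penetration move total savings substantially, which matches the model's finding that these are the dominant levers.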
NASA Technical Reports Server (NTRS)
Weems, J.; Wyse, N.; Madura, J.; Secrist, M.; Pinder, C.
1991-01-01
Lightning plays a pivotal role in the operation decision process for space and ballistic launches at Cape Canaveral Air Force Station and Kennedy Space Center. Lightning forecasts are the responsibility of Detachment 11, 4th Weather Wing's Cape Canaveral Forecast Facility. These forecasts are important to daily ground processing as well as launch countdown decisions. The methodology and equipment used to forecast lightning are discussed. Impact on a recent mission is summarized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Hantao; Li, Fangxing; Fang, Xin
Our paper deals with extended-term energy storage (ES) arbitrage problems to maximize annual revenue in deregulated power systems with high wind power penetration. The conventional ES arbitrage model takes locational marginal prices (LMPs) as an input and is unable to account for the impacts of ES operations on system LMPs. This paper proposes a bi-level ES arbitrage model, in which the upper level maximizes the ES arbitrage revenue and the lower level simulates the market clearing process considering wind power and ES. The bi-level model is formulated as a mathematical program with equilibrium constraints (MPEC) and then recast into a mixed-integer linear program (MILP) using strong duality theory. Wind power fluctuations are characterized by a GARCH forecast model, and the forecast error is modeled by forecast-bin-based Beta distributions. Case studies are performed on a modified PJM 5-bus system and an IEEE 118-bus system with a weekly time horizon over an annual term to show the validity of the proposed bi-level model. The results from the conventional model and the bi-level model are compared under different ES power and energy ratings, and also under various load and wind penetration levels.
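For contrast with the bi-level formulation, the "conventional" price-taker arbitrage problem the authors compare against can be solved by simple dynamic programming over state-of-charge levels once LMPs are fixed inputs. A minimal sketch (integer-MWh discretization, illustrative parameters):

```python
def arbitrage_revenue(prices, e_max, p_max, eff=0.9):
    """Price-taker storage arbitrage by dynamic programming over integer-MWh
    state-of-charge (SOC) levels. Unlike the paper's bi-level model, prices
    here are fixed inputs and unaffected by the storage's own bids."""
    levels = int(e_max) + 1             # SOC levels 0..e_max in 1-MWh steps
    NEG = float("-inf")
    best = [NEG] * levels
    best[0] = 0.0                       # storage starts empty
    for price in prices:                # one decision per market interval
        nxt = [NEG] * levels
        for soc in range(levels):
            if best[soc] == NEG:
                continue
            for delta in range(-int(p_max), int(p_max) + 1):  # MWh bought (+) / sold (-)
                soc2 = soc + delta
                if 0 <= soc2 < levels:
                    cash = -price * delta if delta > 0 else price * (-delta) * eff
                    if best[soc] + cash > nxt[soc2]:
                        nxt[soc2] = best[soc] + cash
        best = nxt
    return max(best)
```

The bi-level model replaces the fixed `prices` input with a lower-level market-clearing problem, so that large charge/discharge decisions feed back into the LMPs.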
Cui, Hantao; Li, Fangxing; Fang, Xin; ...
2017-10-04
NASA Astrophysics Data System (ADS)
Engeland, K.; Steinsland, I.
2012-04-01
This work is driven by the needs of next-generation short-term optimization methodology for hydro power production. Stochastic optimization is about to be introduced, i.e. optimizing when the available resources (water) and utility (prices) are uncertain. In this paper we focus on the available resource, water, where uncertainty mainly comes from uncertainty in future runoff. When optimizing a water system, all catchments and several lead times have to be considered simultaneously. Depending on the system of hydropower reservoirs, it might be a set of headwater catchments, a system of upstream/downstream reservoirs where water used from one catchment/dam arrives in a lower catchment perhaps days later, or a combination of both. The aim of this paper is therefore to construct a simultaneous probabilistic forecast for several catchments and lead times, i.e. to provide a predictive distribution for the forecasts. Stochastic optimization methods need samples/ensembles of runoff forecasts as input; hence, it should also be possible to sample from our probabilistic forecast. A post-processing approach is taken, and an error model based on the Box-Cox power transform and a temporal-spatial copula model is used. It accounts for both between-catchment and between-lead-time dependencies. In operational use it is straightforward to sample runoff ensembles from these models that inherit the catchment and lead time dependencies. The methodology is tested and demonstrated in the Ulla-Førre river system, and simultaneous probabilistic forecasts for five catchments and ten lead times are constructed. The methodology has enough flexibility to model operationally important features in this case study, such as heteroscedasticity, lead-time-varying temporal dependency and lead-time-varying inter-catchment dependency. Our model is evaluated using CRPS for the marginal predictive distributions and the energy score for the joint predictive distribution.
It is tested against a deterministic runoff forecast, a climatology forecast and a persistent forecast, and is found to be the better probabilistic forecast for lead times greater than two. From an operational point of view the results are interesting, as the between-catchment dependency gets stronger with longer lead times.
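The Box-Cox transform at the heart of the error model maps skewed, non-negative runoff onto a scale where a Gaussian error model is more defensible. A minimal sketch (the λ value in the test is illustrative; in the paper it would be estimated from data):

```python
import math

def box_cox(x, lam):
    """Box-Cox power transform; tends to log(x) as lam -> 0."""
    return math.log(x) if lam == 0 else (x ** lam - 1) / lam

def box_cox_inv(y, lam):
    """Inverse transform, used to map Gaussian samples back to runoff space."""
    return math.exp(y) if lam == 0 else (lam * y + 1) ** (1 / lam)
```

Operationally, ensembles are sampled from the Gaussian copula model in transformed space and passed through `box_cox_inv` to obtain runoff ensembles that inherit the catchment and lead-time dependencies.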
Quality Assessment of the Cobel-Isba Numerical Forecast System of Fog and Low Clouds
NASA Astrophysics Data System (ADS)
Bergot, Thierry
2007-06-01
Short-term forecasting of fog is a difficult issue that can have a large societal impact. Fog appears in the surface boundary layer and is driven by the interactions between the land surface and the lower layers of the atmosphere. These interactions are still not well parameterized in current operational NWP models, so a new methodology based on local observations, an adaptive assimilation scheme and a local numerical model is tested. The proposed numerical method for forecasting foggy conditions was run for three years at Paris-CdG international airport. This test over a long period allows an in-depth evaluation of forecast quality. The study demonstrates that detailed 1-D models, including detailed physical parameterizations and high vertical resolution, can reasonably represent the major features of the life cycle of fog (onset, development and dissipation) up to +6 h. The error on the forecast onset and burn-off times is typically 1 h. The major weakness of the methodology is related to the evolution of low clouds (stratus lowering). Even when the occurrence of fog is well forecasted, the value of the horizontal visibility is only crudely forecasted. Improvements in the microphysical parameterization and in the translation algorithm converting NWP prognostic variables into a corresponding horizontal visibility seem necessary to accurately forecast the value of the visibility.
Small Business Programs: Benefits, Barriers, Bridges and Critical Success Factors
2009-05-01
Decreased product development time cycles • Product, process and technology innovation • Joint marketing and advertising • Access to new markets or... Increased joint marketing and advertising, Yuva (2005); increased penetration into new markets, Yuva (2005), Terrill (2007); improved forecasting and response
NASA Astrophysics Data System (ADS)
Kato, Takeyoshi; Sone, Akihito; Shimakage, Toyonari; Suzuoki, Yasuo
A microgrid (MG) is one of the measures for enabling high penetration of renewable energy (RE)-based distributed generators (DGs). For constructing a MG economically, optimizing the capacity of controllable DGs against RE-based DGs is essential. Using a numerical simulation model developed from demonstrative studies of a MG with a PAFC and a NaS battery as controllable DGs and a photovoltaic power generation system (PVS) as a RE-based DG, this study discusses the influence of the forecast accuracy of PVS output on capacity optimization and daily operation, evaluated in terms of cost. The main results are as follows. The required capacity of the NaS battery must be increased by 10-40% relative to the ideal situation with no forecast error in PVS power output. The influence of forecast error on the received grid electricity would not be significant on an annual basis, because positive and negative forecast errors vary from day to day. The annual total cost of facility and operation increases by 2-7% due to the forecast error applied in this study. The impact of forecast error on facility optimization and on operation optimization is almost the same, at a few per cent each, implying that forecast accuracy should be improved in terms of both the number of occasions with large forecast error and the average error.
NASA Astrophysics Data System (ADS)
Shastri, Hiteshri; Ghosh, Subimal; Karmakar, Subhankar
2017-02-01
Forecasting of extreme precipitation events at a regional scale is of high importance due to their severe impacts on society. The impacts are stronger in urban regions due to high flood potential as well as high population density, leading to high vulnerability. Although significant scientific improvements have been made in global models for weather forecasting, they are still not adequate at a regional scale (e.g., for an urban region), with high false alarms and low detection. There has been a need to improve weather forecast skill at a local scale with a probabilistic outcome. Here we develop a methodology with quantile regression, in which reliably simulated variables from the Global Forecast System are used as predictors and different quantiles of rainfall are generated corresponding to that set of predictors. We apply this method to a flood-prone coastal city of India, Mumbai, which has experienced severe floods in recent years. We find significant improvements in the forecast, with high detection and skill scores. We apply the methodology to 10 ensemble members of the Global Ensemble Forecast System and find a reduction in the ensemble uncertainty of precipitation across realizations with respect to the original precipitation forecasts. We validate our model for the monsoon seasons of 2006 and 2007, which are independent of the training/calibration data set used in the study. We find promising results and emphasize implementing such data-driven methods for better probabilistic forecasts at an urban scale, primarily for early flood warning.
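Quantile regression fits a conditional quantile by minimizing the pinball (quantile) loss. The sketch below uses plain subgradient descent with a single predictor, whereas the study presumably uses a richer predictor set and solver; names and settings are illustrative:

```python
import random

def pinball_loss(y, yhat, tau):
    """Quantile (pinball) loss: asymmetric penalty targeted at quantile tau."""
    d = y - yhat
    return tau * d if d >= 0 else (tau - 1) * d

def fit_quantile(xs, ys, tau, lr=0.01, epochs=2000, seed=0):
    """Linear quantile regression y ~ a*x + b by stochastic subgradient descent."""
    rng = random.Random(seed)
    a, b = 0.0, 0.0
    for _ in range(epochs):
        i = rng.randrange(len(xs))
        x, y = xs[i], ys[i]
        # Subgradient of the pinball loss w.r.t. the prediction a*x + b
        grad = -tau if y - (a * x + b) >= 0 else (1 - tau)
        a -= lr * grad * x
        b -= lr * grad
    return a, b
```

Fitting several values of tau (e.g. 0.05, 0.5, 0.95) to the same predictors yields the spread of rainfall quantiles that turns a point forecast into a probabilistic one.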
Deciding the Future: A Forecast of Responsibilities of Secondary Teachers of English, 1970-2000 AD.
ERIC Educational Resources Information Center
Farrell, Edmund J.
This document is a slightly revised version of the author's Ph.D. dissertation, "A Forecast of Responsibilities of Secondary Teachers of English 1970-2000 A.D., with Implications for Teacher Education" (ED 049 253). A study in two parts, Part I presents the need for future planning in education and discusses briefly methodologies for forecasting the…
Resolution of Probabilistic Weather Forecasts with Application in Disease Management.
Hughes, G; McRoberts, N; Burnett, F J
2017-02-01
Predictive systems in disease management often incorporate weather data among the disease risk factors, and sometimes this comes in the form of forecast weather data rather than observed weather data. In such cases, it is useful to have an evaluation of the operational weather forecast, in addition to the evaluation of the disease forecasts provided by the predictive system. Typically, weather forecasts and disease forecasts are evaluated using different methodologies. However, the information theoretic quantity expected mutual information provides a basis for evaluating both kinds of forecast. Expected mutual information is an appropriate metric for the average performance of a predictive system over a set of forecasts. Both relative entropy (a divergence, measuring information gain) and specific information (an entropy difference, measuring change in uncertainty) provide a basis for the assessment of individual forecasts.
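Expected mutual information between forecast and outcome can be computed directly from a joint probability table. A minimal sketch in nats (the 2×2 tables in the test are illustrative, not from the paper):

```python
import math

def mutual_information(joint):
    """Expected mutual information (in nats) between forecast and outcome.

    `joint` is a 2-D table of joint probabilities p(forecast, outcome);
    rows are forecast categories, columns are observed outcomes."""
    px = [sum(row) for row in joint]              # marginal over forecasts
    py = [sum(col) for col in zip(*joint)]        # marginal over outcomes
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log(p / (px[i] * py[j]))
    return mi
```

An uninformative forecast (forecast independent of outcome) scores zero; a perfectly discriminating binary forecast scores log 2, which is why the same metric can rank both weather and disease forecasts.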
Methodology for the Assessment of the Macroeconomic Impacts of Stricter CAFE Standards - Addendum
2002-01-01
This assessment of the economic impacts of Corporate Average Fuel Economy (CAFE) standards marks the first time the Energy Information Administration has used the new direct linkage of the DRI-WEFA Macroeconomic Model to the National Energy Modeling System (NEMS) in a policy setting. This methodology assures an internally consistent solution between the energy market concepts forecast by NEMS and the aggregate economy as forecast by the DRI-WEFA Macroeconomic Model of the U.S. Economy.
Neural network based short-term load forecasting using weather compensation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chow, T.W.S.; Leung, C.T.
This paper presents a novel technique for electric load forecasting based on neural weather compensation. The proposed method is a nonlinear generalization of the Box-Jenkins approach for nonstationary time-series prediction. A weather compensation neural network is implemented for one-day-ahead electric load forecasting. The weather compensation neural network can accurately predict the change in actual electric load consumption from the previous day. The results, based on Hong Kong Island historical load demand, indicate that this methodology is capable of providing a more accurate load forecast, with a 0.9% reduction in forecast error.
Quantifying automobile refinishing VOC air emissions - a methodology with estimates and forecasts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, S.P.; Rubick, C.
1996-12-31
Automobile refinishing coatings (referred to as paints), together with the paint thinners, reducers, hardeners, catalysts, and cleanup solvents used during their application, contain volatile organic compounds (VOCs), which are precursors to ground-level ozone formation. Some of these painting compounds create hazardous air pollutants (HAPs), which are toxic. This paper documents the methodology, data sets, and results of surveys (conducted in the fall of 1995) used to develop revised per capita emission factors for estimating and forecasting VOC air emissions from the area source category of automobile refinishing. Emission estimates, forecasts, trends, and reasons for these trends are presented. Future emissions inventory (EI) challenges are addressed in light of data availability and information networks.
Potential Vorticity Analysis of Low Level Thunderstorm Dynamics in an Idealized Supercell Simulation
2009-03-01
Keywords: severe weather, supercell, Weather Research and Forecasting (WRF) model, Advanced Research WRF. Contents include data, model setup, and methodology; a figure shows the 03/11/2006 GFS model run (top row: 11/12Z initialization; middle row: 12-hour forecast valid at 12/00Z; bottom row: 24-hour forecast valid at…)
NASA Astrophysics Data System (ADS)
Liu, Y.; Zhang, Y.; Wood, A.; Lee, H. S.; Wu, L.; Schaake, J. C.
2016-12-01
Seasonal precipitation forecasts are a primary driver for seasonal streamflow prediction, which is critical for a range of water resources applications, such as reservoir operations and drought management. However, it is well known that seasonal precipitation forecasts from climate models are often biased and also too coarse in spatial resolution for hydrologic applications. Therefore, post-processing procedures such as downscaling and bias correction are often needed. In this presentation, we discuss results from a recent study that applies a two-step methodology to downscale and correct the ensemble-mean precipitation forecasts from the Climate Forecast System (CFS). First, CFS forecasts are downscaled and bias-corrected using monthly reforecast analogs: we identify past precipitation forecasts that are similar to the current forecast, and then use the finer-scale observational analysis fields from the corresponding dates to represent the post-processed ensemble forecasts. Second, we construct the posterior distribution of forecast precipitation from the post-processed ensemble by integrating climate indices: a correlation analysis is performed to identify dominant climate indices for the study region, which are then used to weight the analysis analogs selected in the first step using a Bayesian approach. The methodology is applied to the California Nevada River Forecast Center (CNRFC) and the Middle Atlantic River Forecast Center (MARFC) regions for 1982-2015, using the North American Land Data Assimilation System (NLDAS-2) precipitation as the analysis. The results from cross-validation show that the post-processed CFS precipitation forecasts are considerably more skillful than the raw CFS, even with the analog approach alone. Integrating climate indices can further improve the skill if the number of ensemble members considered is large enough; however, the improvement is generally limited to the first couple of months when compared against climatology.
Impacts of various factors such as ensemble size, lead time, and choice of climate indices will also be discussed.
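The first (analog) step can be sketched as a nearest-neighbour search in forecast space followed by averaging the matching analysis fields. This toy version uses scalar forecasts; the actual method matches monthly fields and then reweights the selected analogs with climate indices:

```python
def analog_postprocess(current_fcst, past_fcsts, past_analyses, k=3):
    """Analog post-processing sketch: find the k past forecasts most similar
    to the current forecast and return the mean of their observed analyses."""
    ranked = sorted(range(len(past_fcsts)),
                    key=lambda i: abs(past_fcsts[i] - current_fcst))
    picks = ranked[:k]
    return sum(past_analyses[i] for i in picks) / k
```

Returning the k analog analyses individually, instead of their mean, yields the post-processed ensemble that the Bayesian reweighting step then operates on.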
Wind Power Forecasting Error Frequency Analyses for Operational Power System Studies: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Florita, A.; Hodge, B. M.; Milligan, M.
2012-08-01
The examination of wind power forecasting errors is crucial for optimal unit commitment and economic dispatch of power systems with significant wind power penetrations. This scheduling process includes both renewable and nonrenewable generators, and the incorporation of wind power forecasts will become increasingly important as wind fleets constitute a larger portion of generation portfolios. This research considers the Western Wind and Solar Integration Study database of wind power forecasts and numerical actualizations. This database comprises more than 30,000 locations spread over the western United States, with a total wind power capacity of 960 GW. Error analyses for individual sites and for specific balancing areas are performed using the database, quantifying the fit to theoretical distributions through goodness-of-fit metrics. Insights into wind-power forecasting error distributions are established for various levels of temporal and spatial resolution, contrasts made among the frequency distribution alternatives, and recommendations put forth for harnessing the results. Empirical data are used to produce more realistic site-level forecasts than previously employed, such that higher resolution operational studies are possible. This research feeds into a larger work of renewable integration through the links wind power forecasting has with various operational issues, such as stochastic unit commitment and flexible reserve level determination.
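Goodness-of-fit of a candidate error distribution can be quantified with, for example, a one-sample Kolmogorov-Smirnov distance between the empirical error CDF and the fitted distribution. A minimal sketch against a fitted normal (the study fits several distribution families; this is not its code):

```python
from statistics import NormalDist, mean, stdev

def ks_statistic(errors):
    """One-sample Kolmogorov-Smirnov distance between the empirical CDF of
    forecast errors and a normal distribution fitted by moments
    (lower = better fit)."""
    dist = NormalDist(mean(errors), stdev(errors))
    xs = sorted(errors)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = dist.cdf(x)
        # Compare the model CDF against the empirical CDF just before and after x
        d = max(d, abs((i + 1) / n - cdf), abs(i / n - cdf))
    return d
```

Running the same statistic against alternative fitted families (e.g. heavier-tailed ones) is the kind of contrast among frequency-distribution alternatives the preprint describes.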
How can we deal with ANN in flood forecasting? As a simulation model or updating kernel!
NASA Astrophysics Data System (ADS)
Hassan Saddagh, Mohammad; Javad Abedini, Mohammad
2010-05-01
Flood forecasting and early warning, as a non-structural measure for flood control, is often considered the most effective and suitable alternative for mitigating the damage and human loss caused by floods. Forecast results, which are the output of hydrologic, hydraulic and/or black-box models, should secure the accuracy of flood values and timing, especially for long lead times. The application of artificial neural networks (ANNs) in flood forecasting has received extensive attention in recent years due to their capability to capture the dynamics inherent in complex processes, including floods. However, results obtained from executing a plain ANN as a simulation model demonstrate a dramatic reduction in performance indices as lead time increases. This paper is intended to monitor the performance indices for flood forecasting and early warning using two different methodologies. While the first method employs a multilayer neural network trained using a back-propagation scheme to forecast the output hydrograph of a hypothetical river for various forecast lead times up to 6.0 hr, the second method uses the 1D hydrodynamic MIKE11 model as the forecasting model and a multilayer neural network as an updating kernel, monitoring and assessing the performance indices relative to the ANN alone as lead time increases. Results presented in both graphical and tabular format indicate the superiority of MIKE11 coupled with the ANN as an updating kernel over the ANN as a simulation model alone. While the plain ANN produces more accurate results for short lead times, its errors increase rapidly for longer lead times. The second methodology provides more accurate and reliable results for longer forecast lead times.
NASA Astrophysics Data System (ADS)
Sone, Akihito; Kato, Takeyoshi; Shimakage, Toyonari; Suzuoki, Yasuo
A microgrid (MG) is one of the measures for enabling high penetration of renewable energy (RE)-based distributed generators (DGs). If a number of MGs are controlled to maintain a predetermined electricity demand, including RE-based DGs as negative demand, they would contribute to the supply-demand balancing of the whole electric power system. For constructing a MG economically, optimizing the capacity of controllable DGs against RE-based DGs is essential. Using a numerical simulation model developed from a demonstrative study of a MG with a PAFC and a NaS battery as controllable DGs and a photovoltaic power generation system (PVS) as a RE-based DG, this study discusses the influence of the forecast accuracy of PVS output on capacity optimization. Three forecast cases with different accuracy are compared. The main results are as follows. Even with no forecast error over each 30-min interval, as in the ideal forecast method, the required capacity of the NaS battery reaches about 40% of the PVS capacity in order to mitigate the instantaneous forecast error within 30 min. The required capacity to compensate for the forecast error is doubled with the actual forecast method. The influence of forecast error can be reduced by adjusting the scheduled power output of controllable DGs according to the weather forecast. Moreover, the required capacity can be reduced significantly if imbalance in the MG's balancing control is acceptable for a few per cent of periods, because periods of large forecast error are not frequent.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Siyuan; Hwang, Youngdeok; Khabibrakhmanov, Ildar
With increasing penetration of solar and wind energy in the total energy supply mix, the pressing need for accurate energy forecasting has become well recognized. Here we report the development of a machine-learning-based model blending approach for statistically combining multiple meteorological models to improve the accuracy of solar/wind power forecasts. Importantly, we demonstrate that, in addition to the parameters to be predicted (such as solar irradiance and power), including additional atmospheric state parameters which collectively define weather situations as machine learning input provides further enhanced accuracy for the blended result. Functional analysis of variance shows that the error of an individual model has substantial dependence on the weather situation. The machine-learning approach effectively reduces such situation-dependent error and thus produces more accurate results than conventional multi-model ensemble approaches based on simplistic equally or unequally weighted model averaging. Validation results over an extended period of time show over 30% improvement in solar irradiance/power forecast accuracy compared to forecasts based on the best individual model.
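The core idea of situation-dependent blending can be sketched very simply: weight each model by its inverse historical error, computed separately per weather situation, instead of using one fixed weight set. This is a toy stand-in for the paper's machine-learning approach; the situation labels and error values are hypothetical.

```python
# Hedged sketch of situation-dependent model blending (not the paper's
# actual algorithm): per-situation inverse-error weights.

def blend_weights(errors_by_situation):
    """errors_by_situation: {situation: [rmse_model1, rmse_model2, ...]}"""
    weights = {}
    for situ, errs in errors_by_situation.items():
        inv = [1.0 / e for e in errs]
        s = sum(inv)
        weights[situ] = [v / s for v in inv]   # normalize to sum to 1
    return weights

def blend(forecasts, situation, weights):
    return sum(w * f for w, f in zip(weights[situation], forecasts))

# Hypothetical historical RMSEs for two models in two weather situations
hist = {"clear": [10.0, 40.0], "cloudy": [50.0, 25.0]}
w = blend_weights(hist)
# Under clear skies model 1 dominates the blend; under clouds model 2 does.
```

A fixed equal-weight average cannot make this per-situation switch, which is the gap the blended approach exploits.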
Influenza forecasting in human populations: a scoping review.
Chretien, Jean-Paul; George, Dylan; Shaman, Jeffrey; Chitale, Rohit A; McKenzie, F Ellis
2014-01-01
Forecasts of influenza activity in human populations could help guide key preparedness tasks. We conducted a scoping review to characterize the methodological approaches used and identify research gaps. Adapting the PRISMA methodology for systematic reviews, we searched PubMed, CINAHL, Project Euclid, and Cochrane Database of Systematic Reviews for publications in English since January 1, 2000 using the terms "influenza AND (forecast* OR predict*)", excluding studies that did not validate forecasts against independent data or incorporate influenza-related surveillance data from the season or pandemic for which the forecasts were applied. We included 35 publications describing population-based (N = 27), medical facility-based (N = 4), and regional or global pandemic spread (N = 4) forecasts. They included areas of North America (N = 15), Europe (N = 14), and/or Asia-Pacific region (N = 4), or had global scope (N = 3). Forecasting models were statistical (N = 18) or epidemiological (N = 17). Five studies used data assimilation methods to update forecasts with new surveillance data. Models used virological (N = 14), syndromic (N = 13), meteorological (N = 6), internet search query (N = 4), and/or other surveillance data as inputs. Forecasting outcomes and validation metrics varied widely. Two studies compared distinct modeling approaches using common data, 2 assessed model calibration, and 1 systematically incorporated expert input. Of the 17 studies using epidemiological models, 8 included sensitivity analysis. This review suggests need for use of good practices in influenza forecasting (e.g., sensitivity analysis); direct comparisons of diverse approaches; assessment of model calibration; integration of subjective expert input; operational research in pilot, real-world applications; and improved mutual understanding among modelers and public health officials.
Three-model ensemble wind prediction in southern Italy
NASA Astrophysics Data System (ADS)
Torcasio, Rosa Claudia; Federico, Stefano; Calidonna, Claudia Roberta; Avolio, Elenio; Drofa, Oxana; Landi, Tony Christian; Malguzzi, Piero; Buzzi, Andrea; Bonasoni, Paolo
2016-03-01
Quality of wind prediction is of great importance since a good wind forecast allows the prediction of available wind power, improving the penetration of renewable energies into the energy market. Here, a 1-year (1 December 2012 to 30 November 2013) three-model ensemble (TME) experiment for wind prediction is considered. The models employed, run operationally at National Research Council - Institute of Atmospheric Sciences and Climate (CNR-ISAC), are RAMS (Regional Atmospheric Modelling System), BOLAM (BOlogna Limited Area Model), and MOLOCH (MOdello LOCale in H coordinates). The area considered for the study is southern Italy and the measurements used for the forecast verification are those of the GTS (Global Telecommunication System). Comparison with observations is made every 3 h up to 48 h of forecast lead time. Results show that the three-model ensemble outperforms the forecast of each individual model. The RMSE improvement compared to the best model is between 22 and 30 %, depending on the season. It is also shown that the three-model ensemble outperforms the IFS (Integrated Forecasting System) of the ECMWF (European Centre for Medium-Range Weather Forecast) for the surface wind forecasts. Notably, the three-model ensemble forecast performs better than each unbiased model, showing the added value of the ensemble technique. Finally, the sensitivity of the three-model ensemble RMSE to the length of the training period is analysed.
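The basic TME comparison can be sketched in a few lines: average the member forecasts and compare the ensemble RMSE with that of the best member. The data below are illustrative, not the study's, but they show the mechanism by which compensating member errors cancel in the mean.

```python
import math

# Sketch of the multi-model ensemble idea (illustrative data): the
# ensemble mean often beats every individual member in RMSE because
# member errors of opposite sign partially cancel.

def rmse(pred, obs):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

obs = [5.0, 7.0, 6.0, 9.0]
m1  = [4.0, 8.0, 5.0, 10.0]    # members with partly opposing biases
m2  = [6.0, 6.5, 7.0, 8.0]
m3  = [5.5, 7.5, 5.5, 9.5]
ens = [(a + b + c) / 3 for a, b, c in zip(m1, m2, m3)]

best_member = min(rmse(m, obs) for m in (m1, m2, m3))
improvement = 100 * (best_member - rmse(ens, obs)) / best_member
```

The 22-30% seasonal improvement the study reports is a real-data analogue of `improvement` here.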
Trends in methodological differences
Daniel J. Stynes; Malcolm I. Bevins; Tommy L. Brown
1980-01-01
Inconsistency in data collection has confounded attempts to identify and forecast outdoor recreation trends. Problems are highlighted through an evaluation of the methods employed in national outdoor recreation participation surveys and projections. Recommendations are advanced for improving data collection, trend measurement, and forecasting within outdoor recreation...
The development of ecological forecasts, namely, methodologies to predict the chemical, biological, and physical changes in terrestrial and aquatic ecosystems is desirable so that effective strategies for reducing the adverse impacts of human activities and extreme natural events...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Xiangqi; Zhang, Yingchen
This paper presents an optimal voltage control methodology with coordination among different voltage-regulating resources, including controllable loads, distributed energy resources such as energy storage and photovoltaics (PV), and utility voltage-regulating devices such as voltage regulators and capacitors. The proposed methodology could effectively tackle the overvoltage and voltage regulation device distortion problems brought by high penetrations of PV to improve grid operation reliability. A voltage-load sensitivity matrix and voltage-regulator sensitivity matrix are used to deploy the resources along the feeder to achieve the control objectives. Mixed-integer nonlinear programming is used to solve the formulated optimization control problem. The methodology has been tested on the IEEE 123-feeder test system, and the results demonstrate that the proposed approach could actively tackle the voltage problem brought about by high penetrations of PV and improve the reliability of distribution system operation.
Possible Improvements of the ACE Diversity Interchange Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etingov, Pavel V.; Zhou, Ning; Makarov, Yuri V.
2010-07-26
The North American Electric Reliability Corporation (NERC) grid is operated by about 131 balancing authorities (BAs). Within each BA, operators are responsible for managing the imbalance caused by both load and wind. As wind penetration levels increase, the challenge of managing power variation increases. Working independently, a balancing area with limited regulating/load-following generation and high wind power penetration faces significant challenges. The benefits of BA cooperation and consolidation increase when there is significant wind energy penetration. To explore the benefits of BA cooperation, this paper investigates an ACE-sharing approach. A technology called ACE diversity interchange (ADI) is already in use in the western interconnection. A new methodology extending ADI is proposed in the paper. The proposed advanced ADI overcomes some limitations of conventional ADI. Simulations using real statistical data from CAISO and BPA have shown high performance of the proposed advanced ADI methodology.
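The benefit of ACE sharing can be illustrated with a toy netting rule: opposite-sign area control errors (ACEs) across BAs partially cancel, so each BA regulates only its share of the system-wide imbalance. The proportional allocation below is my own simplification for illustration, not NERC's or the paper's actual ADI algorithm.

```python
# Hedged sketch of the idea behind ACE diversity interchange: net
# opposite-sign ACEs across BAs, allocating the cancellation
# proportionally. Illustrative only, not the actual ADI rules.

def adi_adjust(aces_mw):
    pos = sum(a for a in aces_mw if a > 0)
    neg = -sum(a for a in aces_mw if a < 0)
    netted = min(pos, neg)          # MW of imbalance that cancels internally
    adjusted = []
    for a in aces_mw:
        if a > 0:
            adjusted.append(a - netted * a / pos)
        elif a < 0:
            adjusted.append(a + netted * (-a) / neg)
        else:
            adjusted.append(a)
    return adjusted

# Three BAs: +100 MW, -60 MW, +20 MW -> 60 MW cancels across BAs
adj = adi_adjust([100.0, -60.0, 20.0])
```

The total system imbalance is unchanged, but the sum of absolute ACEs each BA must regulate drops from 180 MW to 60 MW, which is the regulation saving diversity interchange targets.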
Flood forecasting using non-stationarity in a river with tidal influence - a feasibility study
NASA Astrophysics Data System (ADS)
Killick, Rebecca; Kretzschmar, Ann; Ilic, Suzi; Tych, Wlodek
2017-04-01
Flooding is the most common natural hazard causing damage, disruption and loss of life worldwide. Despite improvements in modelling and forecasting of water levels and flood inundation (Kretzschmar et al., 2014; Hoitink and Jay, 2016), there are still large discrepancies between predictions and observations, particularly during storm events, when accurate predictions are most important. Many models exist for forecasting river levels (Smith et al., 2013; Leedal et al., 2013); however, they commonly assume that the errors in the data are independent, stationary and normally distributed. This is generally not the case, especially during storm events, suggesting that existing models are not describing the drivers of river level in an appropriate fashion. Further challenges exist in the lower sections of a river, which are influenced by both river and tidal flows and their interaction, and there is scope for improvement in prediction. This paper investigates the use of a powerful statistical technique to adaptively forecast river levels by modelling the process as locally stationary. The proposed methodology takes information on both upstream and downstream river levels and incorporates meteorological information (rainfall forecasts) and tidal levels when required to forecast river levels at a specified location. Using this approach, a single model will be capable of predicting water levels in both tidal and non-tidal river reaches. In this pilot project, the methodology of Smith et al. (2013) using harmonic tidal analysis and data based mechanistic modelling is compared with the methodology developed by Killick et al. (2016) utilising data-driven wavelet decomposition to account for the information contained in the upstream and downstream river data to forecast a non-stationary time-series. Preliminary modelling has been carried out using the tidal stretch of the River Lune in North-west England and initial results are presented here.
Future work includes expanding the methodology to forecast river levels at a network of locations simultaneously. References Hoitink, A. J. F., and D. A. Jay (2016), Tidal river dynamics: Implications for deltas, Rev. Geophys., 54, 240-272 Killick, R., Knight, M., Nason, G.P., Eckley, I.A. (2016) The Local Partial Autocorrelation Function and its Application to the Forecasting of Locally Stationary Time Series. Submitted Kretzschmar, Ann and Tych, Wlodek and Chappell, Nick A (2014) Reversing hydrology: estimation of sub-hourly rainfall time-series from streamflow. Env. Modell Softw., 60. pp. 290-301 D. Leedal, A. H. Weerts, P. J. Smith, & K. J. Beven. (2013). Application of data-based mechanistic modelling for flood forecasting at multiple locations in the Eden catchment in the National Flood Forecasting System (England and Wales). HESS, 17(1), 177-185. Smith, P., Beven, K., Horsburgh, K., Hardaker, P., & Collier, C. (2013). Data-based mechanistic modelling of tidally affected river reaches for flood warning purposes: An example on the River Dee, UK. , Q.J.R. Meteorol. Soc. 139(671), 340-349.
A hybrid approach EMD-HW for short-term forecasting of daily stock market time series data
NASA Astrophysics Data System (ADS)
Awajan, Ahmad Mohd; Ismail, Mohd Tahir
2017-08-01
Recently, time series forecasting has attracted considerable attention in the analysis of financial time series data, specifically within the stock market index. Stock market forecasting is a challenging area of financial time-series forecasting. In this study, a hybrid methodology combining Empirical Mode Decomposition with the Holt-Winter method (EMD-HW) is used to improve forecasting performance on financial time series. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW has relatively high accuracy and offers a new forecasting method for time series. Daily stock market time series data from 11 countries are used to show the forecasting performance of the proposed EMD-HW. Based on three forecast accuracy measures, the results indicate that the forecasting performance of EMD-HW is superior to the traditional Holt-Winter forecasting method.
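The Holt(-Winters) half of the hybrid can be sketched with Holt's linear-trend smoothing applied to a single component series; the EMD decomposition step is omitted here for brevity, and the smoothing parameters are illustrative, not the paper's fitted values.

```python
# Hedged sketch of Holt's linear (double exponential) smoothing, the
# trend-capable core of Holt-Winters; in EMD-HW this would be applied
# to each decomposed component. Alpha/beta are illustrative.

def holt_forecast(series, alpha=0.5, beta=0.3, horizon=1):
    level, trend = series[1], series[1] - series[0]   # common initialization
    for y in series[2:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + (h + 1) * trend for h in range(horizon)]

# A noiseless linear trend is extrapolated exactly
fc = holt_forecast([10.0, 12.0, 14.0, 16.0, 18.0], horizon=2)
```

Because each intrinsic mode function from EMD is smoother than the raw price series, this smoother has an easier job per component, which is the intuition behind the hybrid's accuracy gain.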
Metrics for Evaluating the Accuracy of Solar Power Forecasting: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, J.; Hodge, B. M.; Florita, A.
2013-10-01
Forecasting solar energy generation is a challenging task due to the variety of solar power systems and weather regimes encountered. Forecast inaccuracies can result in substantial economic losses and power system reliability issues. This paper presents a suite of generally applicable and value-based metrics for solar forecasting for a comprehensive set of scenarios (i.e., different time horizons, geographic locations, applications, etc.). In addition, a comprehensive framework is developed to analyze the sensitivity of the proposed metrics to three types of solar forecasting improvements using a design of experiments methodology, in conjunction with response surface and sensitivity analysis methods. The results show that the developed metrics can efficiently evaluate the quality of solar forecasts, and assess the economic and reliability impact of improved solar forecasting.
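A few of the standard deterministic-forecast metrics that such a suite builds on can be written down directly; the paper's full metric set is broader and value-based, so treat this as background only. The sample values are made up.

```python
import math

# Standard point-forecast error metrics used throughout solar
# forecasting evaluation (illustrative inputs).

def mbe(pred, obs):   # mean bias error: systematic over/under-prediction
    return sum(p - o for p, o in zip(pred, obs)) / len(obs)

def mae(pred, obs):   # mean absolute error
    return sum(abs(p - o) for p, o in zip(pred, obs)) / len(obs)

def rmse(pred, obs):  # root-mean-square error: penalizes large misses
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

pred = [500.0, 650.0, 300.0]   # forecast irradiance, W/m^2
obs  = [480.0, 700.0, 310.0]   # measured irradiance, W/m^2
```

RMSE always equals or exceeds MAE, and the gap between them is itself informative: it grows when errors are dominated by a few large misses, e.g. on partly cloudy days.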
Community-based early warning systems for flood risk mitigation in Nepal
NASA Astrophysics Data System (ADS)
Smith, Paul J.; Brown, Sarah; Dugar, Sumit
2017-03-01
This paper focuses on the use of community-based early warning systems for flood resilience in Nepal. The first part of the work outlines the evolution and current status of these community-based systems, highlighting the limited lead times currently available for early warning. The second part of the paper focuses on the development of a robust operational flood forecasting methodology for use by the Nepal Department of Hydrology and Meteorology (DHM) to enhance early warning lead times. The methodology uses data-based physically interpretable time series models and data assimilation to generate probabilistic forecasts, which are presented in a simple visual tool. The approach is designed to work in situations of limited data availability with an emphasis on sustainability and appropriate technology. The successful application of the forecast methodology to the flood-prone Karnali River basin in western Nepal is outlined, increasing lead times from 2-3 to 7-8 h. The challenges faced in communicating probabilistic forecasts to the last mile of the existing community-based early warning systems across Nepal is discussed. The paper concludes with an assessment of the applicability of this approach in basins and countries beyond Karnali and Nepal and an overview of key lessons learnt from this initiative.
Methodology for Air Quality Forecast Downscaling from Regional- to Street-Scale
NASA Astrophysics Data System (ADS)
Baklanov, Alexander; Nuterman, Roman; Mahura, Alexander; Amstrup, Bjarne; Hansen Saas, Bent; Havskov Sørensen, Jens; Lorenzen, Thomas; Weismann, Jakob
2010-05-01
The most serious air pollution events occur in cities, where high population density coincides with high air pollution, e.g. from vehicles. The pollutants can lead to serious human health problems, including asthma, irritation of the lungs, bronchitis, pneumonia, decreased resistance to respiratory infections, and premature death. In particular, air pollution is associated with increases in cardiovascular disease and lung cancer. In 2000, WHO estimated that between 2.5% and 11% of total annual deaths are caused by exposure to air pollution. However, European-scale air quality models are not suited for local forecasts, as their grid cells are typically of the order of 5 to 10 km and they generally lack detailed representation of urban effects. Two suites are used in the framework of the EC FP7 project MACC (Monitoring of Atmosphere Composition and Climate) to demonstrate how downscaling from the European MACC ensemble to local-scale air quality forecasts will be carried out: one will illustrate capabilities for the city of Copenhagen (Denmark); the second will focus on the city of Bucharest (Romania). This work is devoted to the first suite, addressing methodological aspects of downscaling from the regional (European/Denmark) to the urban scale (Copenhagen), and from the urban scale down to the street scale. The first results of downscaling according to the proposed methodology are presented. The potential for downscaling of European air quality forecasts by operating urban and street-level forecast models is evaluated. This will bring strong support for continuous improvement of the regional forecast modelling systems for air quality in Europe, and underline clear perspectives for the future regional air quality core and downstream services for end-users. At the end of the MACC project, requirements on "how-to-do" downscaling of European air-quality forecasts to the city and street levels with different approaches will be formulated.
Population forecasts for Bangladesh, using a Bayesian methodology.
Mahsin, Md; Hossain, Syed Shahadat
2012-12-01
Population projection for many developing countries can be quite a challenging task for demographers, mostly due to the lack of enough reliable data. The objective of this paper is to present an overview of the existing methods for population forecasting and to propose an alternative based on Bayesian statistics, combining the formality of statistical inference with expert judgement. The analysis has been made using the Markov chain Monte Carlo (MCMC) technique for Bayesian methodology available with the WinBUGS software. Convergence diagnostic techniques available with WinBUGS have been applied to ensure the convergence of the chains necessary for the implementation of MCMC. The Bayesian approach allows for the use of observed data and expert judgements by means of appropriate priors, and more realistic population forecasts, along with associated uncertainty, have been possible.
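The MCMC machinery behind such a forecast can be illustrated with a hand-rolled Metropolis sampler for a single growth-rate parameter with a normal prior and normal likelihood. The paper uses WinBUGS and a full demographic model; the data, prior, and likelihood below are purely hypothetical.

```python
import math
import random

# Toy Metropolis sampler for a growth-rate parameter r (%), combining
# a normal prior with a normal likelihood on hypothetical annual
# growth observations. Illustrative only, not the paper's model.

random.seed(1)
data = [1.8, 2.1, 1.9, 2.2, 2.0]   # hypothetical observed growth rates (%)

def log_post(r, prior_mean=1.5, prior_sd=1.0, sigma=0.3):
    lp = -0.5 * ((r - prior_mean) / prior_sd) ** 2          # log prior
    lp += sum(-0.5 * ((y - r) / sigma) ** 2 for y in data)  # log likelihood
    return lp

samples, r = [], 1.5
for _ in range(5000):
    prop = r + random.gauss(0.0, 0.2)                 # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(r):
        r = prop                                       # accept
    samples.append(r)

posterior_mean = sum(samples[1000:]) / len(samples[1000:])  # drop burn-in
```

The posterior mean lands between the prior mean (1.5) and the data mean (2.0), pulled toward the data because the likelihood here is much more informative than the prior; convergence diagnostics of the kind WinBUGS provides would check that the chain has mixed.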
Automated pavement analysis in Missouri using ground penetrating radar
DOT National Transportation Integrated Search
2003-02-01
Current geotechnical procedures for monitoring the condition of roadways are time consuming and can be disruptive to traffic, often requiring extensive invasive procedures (e.g., coring). Ground penetrating radar (GPR) technology offers a methodology...
Bayesian Population Forecasting: Extending the Lee-Carter Method.
Wiśniowski, Arkadiusz; Smith, Peter W F; Bijak, Jakub; Raymer, James; Forster, Jonathan J
2015-06-01
In this article, we develop a fully integrated and dynamic Bayesian approach to forecast populations by age and sex. The approach embeds the Lee-Carter type models for forecasting the age patterns, with associated measures of uncertainty, of fertility, mortality, immigration, and emigration within a cohort projection model. The methodology may be adapted to handle different data types and sources of information. To illustrate, we analyze time series data for the United Kingdom and forecast the components of population change to the year 2024. We also compare the results obtained from different forecast models for age-specific fertility, mortality, and migration. In doing so, we demonstrate the flexibility and advantages of adopting the Bayesian approach for population forecasting and highlight areas where this work could be extended.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Optis, Michael; Scott, George N.; Draxl, Caroline
The goal of this analysis was to assess the wind power forecast accuracy of the Vermont Weather Analytics Center (VTWAC) forecast system and to identify potential improvements to the forecasts. Based on the analysis at Georgia Mountain, the following recommendations for improving forecast performance were made: 1) resolve the significant negative forecast bias in February-March 2017 (50% underprediction on average); 2) improve the ability of the forecast model to capture the strong diurnal cycle of wind power; 3) add the ability for the forecast model to assess internal wake loss, particularly at sites where strong diurnal shifts in wind direction are present. Data availability and quality limited the robustness of this forecast assessment. A more thorough analysis would be possible given a longer period of record for the data (at least one full year), detailed supervisory control and data acquisition data for each wind plant, and more detailed information on the forecast system input data and methodologies.
Obtaining high-resolution stage forecasts by coupling large-scale hydrologic models with sensor data
NASA Astrophysics Data System (ADS)
Fries, K. J.; Kerkez, B.
2017-12-01
We investigate how "big" quantities of distributed sensor data can be coupled with a large-scale hydrologic model, in particular the National Water Model (NWM), to obtain hyper-resolution forecasts. The recent launch of the NWM provides a great example of how growing computational capacity is enabling a new generation of massive hydrologic models. While the NWM spans an unprecedented spatial extent, there remain many questions about how to improve forecasts at the street level, the resolution at which many stakeholders make critical decisions. Further, the NWM runs on supercomputers, so water managers who have access to their own high-resolution measurements may not readily be able to assimilate them into the model. To that end, we ask: how can the advances of the large-scale NWM be coupled with new local observations to enable hyper-resolution hydrologic forecasts? A methodology is proposed whereby the flow forecasts of the NWM are directly mapped to high-resolution stream levels using Dynamical System Identification. We apply the methodology across a sensor network of 182 gages in Iowa. Of these sites, approximately one third have been shown to perform well in high-resolution flood forecasting when coupled with the outputs of the NWM. The quality of these forecasts is characterized using Principal Component Analysis and Random Forests to identify where the NWM may benefit from new sources of local observations. We also discuss how this approach can help municipalities identify where they should place low-cost sensors to benefit most from flood forecasts of the NWM.
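The flow-to-stage mapping step can be illustrated with a static least-squares fit between modelled discharge and locally observed stage. The paper uses Dynamical System Identification, a dynamic model; this regression only conveys the coupling idea, and the discharge/stage values are hypothetical.

```python
# Hedged stand-in for the NWM-to-sensor coupling: fit a linear relation
# between modelled discharge and observed stage at one gage (the paper's
# actual method, Dynamical System Identification, is dynamic).

def fit_linear(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx   # slope, intercept

# Hypothetical NWM discharge forecasts (m^3/s) and local sensor stage (m)
q = [10.0, 20.0, 30.0, 40.0]
h = [1.1, 1.9, 3.1, 3.9]
slope, intercept = fit_linear(q, h)
predicted_stage = slope * 25.0 + intercept   # stage forecast for q = 25
```

Once fitted from historical pairs, the mapping turns every coarse NWM flow forecast into a street-relevant stage forecast at that gage, which is the hyper-resolution step the paper evaluates across its 182 sites.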
Ground penetrating radar study--phase I : technology review and evaluation.
DOT National Transportation Integrated Search
2006-08-01
In December 2005 Mississippi Department of Transportation (MDOT) initiated State Study No. 182 on review : and evaluation of ground penetrating radar (GPR) technology. This Phase I study has reviewed GPR equipment : and data interpretation methodolog...
Measuring the effectiveness of earthquake forecasting in insurance strategies
NASA Astrophysics Data System (ADS)
Mignan, A.; Muir-Wood, R.
2009-04-01
Given the difficulty of judging whether the skill of a particular methodology of earthquake forecasts is offset by the inevitable false alarms and missed predictions, it is important to find a means to weigh the successes and failures according to a common currency. Rather than judge subjectively the relative costs and benefits of predictions, we develop a simple method to determine if the use of earthquake forecasts can increase the profitability of active financial risk management strategies employed in standard insurance procedures. Three types of risk management transactions are employed: (1) insurance underwriting, (2) reinsurance purchasing and (3) investment in CAT bonds. For each case premiums are collected based on modelled technical risk costs and losses are modelled for the portfolio in force at the time of the earthquake. A set of predetermined actions follow from the announcement of any change in earthquake hazard, so that, for each earthquake forecaster, the financial performance of an active risk management strategy can be compared with the equivalent passive strategy in which no notice is taken of earthquake forecasts. Overall performance can be tracked through time to determine which strategy gives the best long term financial performance. This will be determined by whether the skill in forecasting the location and timing of a significant earthquake (where loss is avoided) is outweighed by false predictions (when no premium is collected). This methodology is to be tested in California, where catastrophe modeling is reasonably mature and where a number of researchers issue earthquake forecasts.
Battery Energy Storage State-of-Charge Forecasting: Models, Optimization, and Accuracy
Rosewater, David; Ferreira, Summer; Schoenwald, David; ...
2018-01-25
Battery energy storage systems (BESS) are a critical technology for integrating high penetration renewable power on an intelligent electrical grid. As limited energy restricts the steady-state operational state-of-charge (SoC) of storage systems, SoC forecasting models are used to determine feasible charge and discharge schedules that supply grid services. Smart grid controllers use SoC forecasts to optimize BESS schedules to make grid operation more efficient and resilient. This study presents three advances in BESS state-of-charge forecasting. First, two forecasting models are reformulated to be conducive to parameter optimization. Second, a new method for selecting optimal parameter values based on operational data is presented. Last, a new framework for quantifying model accuracy is developed that enables a comparison between models, systems, and parameter selection methods. The accuracies achieved by both models, on two example battery systems, with each method of parameter selection are then compared in detail. The results of this analysis suggest variation in the suitability of these models for different battery types and applications. Finally, the proposed model formulations, optimization methods, and accuracy assessment framework can be used to improve the accuracy of SoC forecasts enabling better control over BESS charge/discharge schedules.
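A minimal SoC forecasting model of the kind such parameter optimization targets is an efficiency-weighted integration of the power schedule, clipped to physical bounds. The capacity and efficiency parameters below are illustrative placeholders, not values from the paper.

```python
# Hedged sketch of a simple SoC forecast model: integrate a
# charge(+)/discharge(-) power schedule with one-way efficiencies.
# Parameters (capacity, efficiencies) are illustrative.

def forecast_soc(soc0, schedule_kw, dt_h=1.0, cap_kwh=100.0,
                 eta_c=0.95, eta_d=0.95):
    """Return the SoC trajectory (fractions 0..1) for a power schedule."""
    soc, traj = soc0, []
    for p in schedule_kw:
        if p >= 0:                        # charging: losses before storage
            soc += eta_c * p * dt_h / cap_kwh
        else:                             # discharging: losses after storage
            soc += p * dt_h / (eta_d * cap_kwh)
        soc = min(max(soc, 0.0), 1.0)     # clip to physical bounds
        traj.append(soc)
    return traj

# Start half full, charge 20 kW for two hours, then discharge 40 kW
traj = forecast_soc(0.5, [20.0, 20.0, -40.0])
```

Fitting `eta_c`, `eta_d`, and `cap_kwh` to operational data is the kind of parameter selection the study formalizes, and forecast accuracy is then judged by how closely `traj` tracks the measured SoC.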
DOT National Transportation Integrated Search
2012-01-01
In Florida, low elevations can make transportation infrastructure in coastal and low-lying areas potentially vulnerable to sea level rise (SLR). Because global SLR forecasts lack precision at local or regional scales, SLR forecasts or scenarios for p...
ERIC Educational Resources Information Center
Cliffe, Neil; Stone, Roger; Coutts, Jeff; Reardon-Smith, Kathryn; Mushtaq, Shahbaz
2016-01-01
Purpose: This paper documents and evaluates collaborative learning processes aimed at developing farmer's knowledge, skills and aspirations to use seasonal climate forecasting (SCF). Methodology: Thirteen workshops conducted in 2012 engaged over 200 stakeholders across Australian sugar production regions. Workshop design promoted participant…
Forecast Occupational Supply: A Methodological Handbook.
ERIC Educational Resources Information Center
McKinlay, Bruce; Johnson, Lowell E.
Greater concern with unemployment in recent years has increased the need for accurate forecasting of future labor market requirements, in order to plan for vocational education and other manpower programs. However, past emphasis has been placed on labor demand, rather than supply, even though either side by itself is useless in determining skill…
Post-Secondary Enrolment Forecasting with Traditional and Cross Pressure-Impact Methodologies.
ERIC Educational Resources Information Center
Hoffman, Bernard B.
A model for forecasting postsecondary enrollment, the PDEM-1, is considered, which combines the traditional with a cross-pressure impact decision-making model. The model is considered in relation to its background, assumptions, survey instrument, model conception, applicability to educational environments, and implementation difficulties. The…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-11
... quantitative research and evaluation process that forecasts economic excess sector returns (over/under the... proprietary SectorSAM quantitative research and evaluation process. \\8\\ The following convictions constitute... Allocation Methodology'' (``SectorSAM''), which is a proprietary quantitative analysis, to forecast each...
Documentation of volume 3 of the 1978 Energy Information Administration annual report to congress
NASA Astrophysics Data System (ADS)
1980-02-01
In a preliminary overview of the projection process, the relationship between energy prices, supply, and demand is addressed. Topics treated in detail include a description of energy economic interactions, assumptions regarding world oil prices, and energy modeling in the long term beyond 1995. Subsequent sections present the general approach and methodology underlying the forecasts, and define and describe the alternative projection series and their associated assumptions. Short term forecasting, midterm forecasting, long term forecasting of petroleum, coal, and gas supplies are included. The role of nuclear power as an energy source is also discussed.
Forecasting daily passenger traffic volumes in the Moscow metro
NASA Astrophysics Data System (ADS)
Ivanov, V. V.; Osetrov, E. S.
2018-01-01
In this paper we have developed a methodology for the medium-term prediction of daily volumes of passenger traffic in the Moscow metro. It includes three options for the forecast: (1) based on artificial neural networks (ANNs), (2) singular-spectrum analysis implemented in the Caterpillar-SSA package, and (3) a combination of the ANN and Caterpillar-SSA approaches. The methods and algorithms allow medium-term forecasting of passenger traffic flows in the Moscow metro with reasonable accuracy.
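The singular-spectrum step in option (2) can be sketched in a few lines. This is a generic SSA decomposition on synthetic data, not the Caterpillar-SSA package itself; the weekly cycle, window length, and component count below are illustrative assumptions.

```python
import numpy as np

def ssa_reconstruct(series, window, n_components):
    """Basic singular-spectrum analysis: embed the series in a
    trajectory (Hankel) matrix, decompose it by SVD, and reconstruct
    from the leading components via diagonal averaging."""
    x = np.asarray(series, dtype=float)
    K = x.size - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(K)])
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    approx = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(n_components))
    recon = np.zeros_like(x)
    counts = np.zeros_like(x)
    for j in range(K):                 # diagonal averaging back to a series
        recon[j:j + window] += approx[:, j]
        counts[j:j + window] += 1
    return recon / counts

# Demo on a noisy periodic "traffic" signal with a weekly cycle.
rng = np.random.default_rng(0)
t = np.arange(280)
clean = np.sin(2 * np.pi * t / 7)
noisy = clean + 0.3 * rng.standard_normal(t.size)
smooth = ssa_reconstruct(noisy, window=28, n_components=2)
```

A forecast would then continue the reconstructed components with a linear recurrence; the sketch stops at the decomposition, which is the part the abstract describes.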
Increasing the temporal resolution of direct normal solar irradiance forecasted series
NASA Astrophysics Data System (ADS)
Fernández-Peruchena, Carlos M.; Gastón, Martin; Schroedter-Homscheidt, Marion; Marco, Isabel Martínez; Casado-Rubio, José L.; García-Moya, José Antonio
2017-06-01
A detailed knowledge of the solar resource is a critical point in the design and control of Concentrating Solar Power (CSP) plants. In particular, accurate forecasting of solar irradiance is essential for the efficient operation of solar thermal power plants, the management of energy markets, and the widespread implementation of this technology. Numerical weather prediction (NWP) models are commonly used for solar radiation forecasting. In the ECMWF deterministic forecasting system, all forecast parameters are commercially available worldwide at 3-hourly intervals. Unfortunately, as Direct Normal solar Irradiance (DNI) exhibits great variability due to the dynamic effects of passing clouds, a 3-h time resolution is insufficient for accurate simulations of CSP plants, whose response to DNI is nonlinear and governed by various thermal inertias. DNI series of hourly or sub-hourly resolution are normally used for accurate modeling and analysis of transient processes in CSP technologies. In this context, the objective of this study is to propose a methodology for generating synthetic DNI time series at 1-h (or higher) temporal resolution from 3-h DNI series. The methodology is based upon patterns defined with the help of the clear-sky envelope approach together with a forecast of the maximum DNI value, and it has been validated against high-quality measured DNI data.
A methodology for reduced order modeling and calibration of the upper atmosphere
NASA Astrophysics Data System (ADS)
Mehta, Piyush M.; Linares, Richard
2017-10-01
Atmospheric drag is the largest source of uncertainty in accurately predicting the orbit of satellites in low Earth orbit (LEO). Accurately predicting drag for objects that traverse LEO is critical to space situational awareness. Atmospheric models used for orbital drag calculations can be characterized either as empirical or physics-based (first principles based). Empirical models are fast to evaluate but offer limited real-time predictive/forecasting ability, while physics-based models offer greater predictive/forecasting ability but require dedicated parallel computational resources. Also, calibration with accurate data is required for either type of model. This paper presents a new methodology, based on proper orthogonal decomposition, for the development of a quasi-physical, predictive, reduced order model that combines the speed of empirical models with the predictive/forecasting capabilities of physics-based models. The methodology is developed to reduce the high dimensionality of physics-based models while maintaining their capabilities. We develop the methodology using the Naval Research Lab's Mass Spectrometer Incoherent Scatter model and show that the diurnal and seasonal variations can be captured using a small number of modes and parameters. We also present calibration of the reduced order model using the CHAMP and GRACE accelerometer-derived densities. Results show that the method performs well for modeling and calibration of the upper atmosphere.
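At its core, the proper-orthogonal-decomposition step described above is an SVD of a mean-removed snapshot matrix. The sketch below uses synthetic density-like snapshots rather than output of the NRLMSISE model; the grid sizes, mode shapes, and 99% energy cutoff are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "density" snapshots: two smooth spatial modes with
# time-varying coefficients, plus small noise (stand-in for model output).
x = np.linspace(0, 1, 200)          # spatial grid
t = np.linspace(0, 10, 80)          # snapshot times
snapshots = (np.outer(np.sin(np.pi * x), np.sin(t))
             + 0.3 * np.outer(np.sin(3 * np.pi * x), np.cos(2 * t))
             + 0.01 * rng.standard_normal((200, 80)))

# POD = SVD of the mean-removed snapshot matrix.
mean = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)

# Number of modes needed to capture 99% of the centered energy.
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99) + 1)

# Reduced-order reconstruction with r modes.
recon = mean + U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
rel_err = np.linalg.norm(recon - snapshots) / np.linalg.norm(snapshots)
```

The point of the reduction is that `r` is tiny compared to the full state dimension, so the time evolution can then be modeled and calibrated in the low-dimensional coefficient space.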
An application of ensemble/multi model approach for wind power production forecasting
NASA Astrophysics Data System (ADS)
Alessandrini, S.; Pinson, P.; Hagedorn, R.; Decimi, G.; Sperati, S.
2011-02-01
Wind power forecasts for the 3-day-ahead period are becoming increasingly useful and important for reducing grid-integration problems and for energy price trading as wind power penetration increases. The accuracy of this forecast is therefore one of the most important requirements for a successful application. The wind power forecast applied in this study is based on meteorological models that provide the 3-day-ahead wind data. A Model Output Statistics (MOS) correction is then performed to reduce systematic errors caused, for instance, by a wrong representation of surface roughness or topography in the meteorological models. For this purpose, a Neural Network (NN) was trained to link the forecasted meteorological data directly to the power data. One wind farm, located in a mountainous area of southern Italy (Sicily), was examined. First, we compare the performance of a prediction based on meteorological data coming from a single model with that obtained by a combination of models (RAMS, ECMWF deterministic, LAMI). It is shown that the multi-model approach reduces the day-ahead normalized RMSE forecast error (normalized by nominal power) by at least 1% compared to the single-model approach. Finally, we focused on the possibility of using the ensemble prediction system (EPS by ECMWF) to estimate the accuracy of the hourly, three-day-ahead power forecast. Contingency diagrams between the RMSE of the deterministic power forecast and the spread of the ensemble members' wind forecasts were produced. From this first analysis it seems that the ensemble spread could be used as an indicator of forecast accuracy, at least for the first three days ahead.
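The MOS correction described above can be illustrated with a polynomial regression stand-in for the neural network: observed power is regressed directly on the (biased) forecast predictor. The toy power curve, bias, and noise levels below are assumptions for illustration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training set: NWP-forecast wind speed (m/s) with a systematic
# bias, and observed farm power (as a fraction of nominal power).
true_speed = rng.uniform(3, 15, 500)
fcst_speed = 1.1 * true_speed + 0.8 + rng.normal(0, 0.5, 500)  # biased NWP
power = np.clip((true_speed - 3) / 9, 0, 1)                    # toy power curve

# MOS step: regress observed power on the forecast predictor
# (a quadratic stand-in for the neural network used in the study).
X = np.column_stack([np.ones_like(fcst_speed), fcst_speed, fcst_speed**2])
coef, *_ = np.linalg.lstsq(X, power, rcond=None)

raw_power = np.clip((fcst_speed - 3) / 9, 0, 1)   # no MOS: power curve on raw NWP
mos_power = np.clip(X @ coef, 0, 1)

nrmse = lambda p: np.sqrt(np.mean((p - power) ** 2))  # normalized by nominal power
```

Because the regression absorbs the systematic NWP bias, the normalized RMSE of the MOS-corrected power is lower than that of the uncorrected conversion, which is the effect the study exploits.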
Forecasting waste compositions: A case study on plastic waste of electronic display housings.
Peeters, Jef R; Vanegas, Paul; Kellens, Karel; Wang, Feng; Huisman, Jaco; Dewulf, Wim; Duflou, Joost R
2015-12-01
Because of the rapid succession of technological developments, the architecture and material composition of many products used in daily life have drastically changed over the last decades. As a result, well-adjusted recycling technologies need to be developed and installed to cope with these evolutions. This is essential to guarantee continued access to materials and to reduce the ecological impact of our material consumption. However, limited information is currently available on the material composition of arising waste streams and even less on how these waste streams will evolve. Therefore, this paper presents a methodology to forecast trends in the material composition of waste streams. To demonstrate the applicability and value of the proposed methodology, it is applied to forecast the evolution of plastic housing waste from flat panel display (FPD) TVs, FPD monitors, cathode ray tube (CRT) TVs and CRT monitors. The results of the presented forecasts indicate that a wide variety of plastic types and additives, such as flame retardants, are found in housings of similar products. The presented case study demonstrates that the proposed methodology allows the identification of trends in the evolution of the material composition of waste streams. In addition, it is demonstrated that the recycling sector will need to adapt its processes to deal with the increasing complexity of plastics of end-of-life electronic displays while respecting relevant directives. Copyright © 2015 Elsevier Ltd. All rights reserved.
NMME Monthly / Seasonal Forecasts for NASA SERVIR Applications Science
NASA Astrophysics Data System (ADS)
Robertson, F. R.; Roberts, J. B.
2014-12-01
This work details use of the North American Multi-Model Ensemble (NMME) experimental forecasts as drivers for Decision Support Systems (DSSs) in the NASA / USAID initiative, SERVIR (a Spanish acronym meaning "to serve"). SERVIR integrates satellite observations, ground-based data and forecast models to monitor and forecast environmental changes and to improve response to natural disasters. Through the use of DSSs whose "front ends" are physically based models, the SERVIR activity provides a natural testbed to determine the extent to which NMME monthly to seasonal projections enable scientists, educators, project managers and policy implementers in developing countries to better use probabilistic outlooks of seasonal hydrologic anomalies in assessing agricultural / food security impacts, water availability, and risk to societal infrastructure. The multi-model NMME framework provides a "best practices" approach to probabilistic forecasting. The NMME forecasts are generated at a resolution more coarse than that required to support DSS models; downscaling in both space and time is necessary. The methodology adopted here applies model output statistics (MOS): we use NMME ensemble monthly projections of sea-surface temperature (SST) and precipitation from 30 years of hindcasts together with observations of precipitation and temperature for target regions. Since raw model forecasts are well known to have structural biases, a cross-validated multivariate regression methodology (CCA) is used to link the model-projected states as predictors to the predictands of the target region. The target regions include a number of basins in East and South Africa as well as the Ganges / Brahmaputra / Meghna basin complex. The MOS approach addresses spatial downscaling. Temporal disaggregation of monthly seasonal forecasts is achieved through a tercile bootstrapping approach. We interpret the results of these studies, the levels of skill by several metrics, and key uncertainties.
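The tercile outlook underlying forecasts like these can be sketched directly: count ensemble members against climatological tercile boundaries. This is a generic illustration, not SERVIR's actual code; the Gaussian climatology and the +1 standard-deviation shift of the ensemble are assumptions.

```python
import numpy as np

def tercile_probs(ensemble, climatology):
    """Probabilistic tercile outlook: the fraction of ensemble members
    in the below-normal / near-normal / above-normal categories, with
    category boundaries taken from the climatological terciles."""
    lo, hi = np.percentile(climatology, [100 / 3, 200 / 3])
    e = np.asarray(ensemble, dtype=float)
    below = float(np.mean(e < lo))
    above = float(np.mean(e > hi))
    return below, 1.0 - below - above, above

rng = np.random.default_rng(7)
climo = rng.normal(0.0, 1.0, 30 * 12)   # 30 years of monthly anomalies
members = rng.normal(1.0, 1.0, 24)      # a warm-shifted 24-member ensemble
b, n, a = tercile_probs(members, climo)
```

A shifted ensemble loads probability into the above-normal category, which is exactly the kind of categorical outlook the DSSs consume.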
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samaan, Nader A.; Makarov, Yuri V.; Nguyen, Tony B.
2017-05-07
The study described in this chapter demonstrates the benefits of BA consolidation with the help of a detailed WECC system model and advanced methodology, which is also described in this chapter. The study aims to determine the potential savings in production cost and reduction in balancing reserve requirements in the WECC system. The study has found that effective use of the diversity in load and variable generation over a wide area can indeed help to achieve significant savings. The implementation cost for the consolidation was beyond the scope of this study. The analysis was performed for two different scenarios of VG penetration: 11% (8% wind and 3% solar) and 33% (24% wind and 9% solar) of WECC projected energy demand in 2020. In analysis of balancing reserves, the objective was to determine the reduction in balancing reserve requirements due to BA consolidation, in terms of required capacity and ramp-rates. Hour-ahead and 10-minute ahead forecast errors for load, wind, and solar were simulated. In addition, 1-minute resolution load, wind and solar data were used to derive balancing reserve requirements, i.e., load-following and regulation requirements, for each individual BA and for the consolidated BA (CBA). The reduction in balancing reserves was determined by calculating the difference between total reserve requirements that need to be carried by different BAs if they operate individually, and reserve requirements that need to be carried by the CBA. The study results show that the consolidated WECC system would have about a 50% overall reduction in balancing reserves for the 11% penetration scenario and a 65% reduction for the 33% penetration scenario in comparison with total reserve requirements that need to be carried by different BAs if they operate individually.
The Use of Factorial Forecasting to Predict Public Response
ERIC Educational Resources Information Center
Weiss, David J.
2012-01-01
Policies that call for members of the public to change their behavior fail if people don't change; predictions of whether the requisite changes will take place are needed prior to implementation. I propose to solve the prediction problem with Factorial Forecasting, a version of functional measurement methodology that employs group designs. Aspects…
The Field Production of Water for Injection
1985-12-01
Water-for-injection planning factors: bedridden patient, 0.75 L/day; average diseased patient, 0.50 L/day. (There is no feasible methodology to forecast the number of procedures per...) An estimate of the liters/day needed may be calculated based on a forecasted patient stream, including
Smoothing Forecasting Methods for Academic Library Circulations: An Evaluation and Recommendation.
ERIC Educational Resources Information Center
Brooks, Terrence A.; Forys, John W., Jr.
1986-01-01
Circulation time-series data from 50 midwest academic libraries were used to test 110 variants of 8 smoothing forecasting methods. Data and methodologies and illustrations of two recommended methods--the single exponential smoothing method and Brown's one-parameter linear exponential smoothing method--are given. Eight references are cited. (EJS)
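The two recommended methods are simple enough to state as code. A minimal sketch, assuming monthly circulation counts; the series and the smoothing parameter α are illustrative, not values from the study.

```python
def ses_forecast(series, alpha):
    """Single exponential smoothing: the final level is the
    one-step-ahead forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

def brown_forecast(series, alpha, horizon=1):
    """Brown's one-parameter linear exponential smoothing: smooth the
    series twice with the same alpha, then extrapolate level + trend."""
    s1 = s2 = series[0]  # single- and double-smoothed values
    for x in series[1:]:
        s1 = alpha * x + (1 - alpha) * s1
        s2 = alpha * s1 + (1 - alpha) * s2
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + horizon * trend

# Hypothetical monthly circulation counts.
circ = [120, 125, 123, 130, 128, 135, 140, 138]
next_month_ses = ses_forecast(circ, 0.3)
next_month_brown = brown_forecast(circ, 0.3)
```

On trending data, Brown's method extrapolates the trend while single exponential smoothing lags it, which is the trade-off the evaluation weighs.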
Toward a science of tumor forecasting for clinical oncology
Yankeelov, Thomas E.; Quaranta, Vito; Evans, Katherine J.; ...
2015-03-15
We propose that the quantitative cancer biology community make a concerted effort to apply lessons from weather forecasting to develop an analogous methodology for predicting and evaluating tumor growth and treatment response. Currently, the time course of tumor response is not predicted; instead, response is only assessed post hoc by physical examination or imaging methods. This fundamental practice within clinical oncology limits optimization of a treatment regimen for an individual patient, as well as to determine in real time whether the choice was in fact appropriate. This is especially frustrating at a time when a panoply of molecularly targeted therapies is available, and precision genetic or proteomic analyses of tumors are an established reality. By learning from the methods of weather and climate modeling, we submit that the forecasting power of biophysical and biomathematical modeling can be harnessed to hasten the arrival of a field of predictive oncology. Furthermore, with a successful methodology toward tumor forecasting, it should be possible to integrate large tumor-specific datasets of varied types and effectively defeat cancer, one patient at a time.
NASA Technical Reports Server (NTRS)
Koch, S. E.; Skillman, W. C.; Kocin, P. J.; Wetzel, P. J.; Brill, K. F.
1985-01-01
The synoptic scale performance characteristics of MASS 2.0 are determined by comparing filtered 12-24 hr model forecasts to same-case forecasts made by the National Meteorological Center's synoptic-scale Limited-area Fine Mesh model. Characteristics of the two systems are contrasted, and the analysis methodology used to determine statistical skill scores and systematic errors is described. The overall relative performance of the two models in the sample is documented, and important systematic errors uncovered are presented.
Consumption trend analysis in the industrial sector: Existing forecasts
NASA Astrophysics Data System (ADS)
1981-08-01
The Gas Research Institute (GRI) is engaged in medium- to long-range research and development in various sectors of the economy that depend on gas-using technologies and equipment. To assess the potential demand for natural gas in the industrial sector, forecasts available from private and public sources were compared and analyzed. More than 20 projections were examined, and 10 of the most appropriate long-range demand forecasts were analyzed and compared with respect to the various assumptions, methodologies and criteria on which each was based.
A method for the determination of potentially profitable service patterns for commuter air carriers
NASA Technical Reports Server (NTRS)
Ransone, R. K.; Kuhlthau, A. R.; Deptula, D. A.
1975-01-01
A methodology for estimating market conception was developed as a part of the short-haul air transportation program. It is based upon an analysis of actual documents which provide a record of known travel history. Applying this methodology, a forecast was made of the demand for an air feeder service between Charlottesville, Virginia and Dulles International Airport. Local business travel vouchers and local travel agent records were selected to provide the documentation. The market was determined to be profitable for an 8-passenger Cessna 402B aircraft flying a 2-hour daily service pattern designed to mesh to the best extent possible with the connecting schedules at Dulles. The Charlottesville - Dulles air feeder service market conception forecast and its methodology are documented.
NASA Astrophysics Data System (ADS)
García, Alicia; De la Cruz-Reyna, Servando; Marrero, José M.; Ortiz, Ramón
2016-05-01
Under certain conditions, volcano-tectonic (VT) earthquakes may pose significant hazards to people living in or near active volcanic regions, especially on volcanic islands; however, hazard arising from VT activity caused by localized volcanic sources is rarely addressed in the literature. The evolution of VT earthquakes resulting from a magmatic intrusion shows some orderly behaviour that may allow the occurrence and magnitude of major events to be forecast. Thus governmental decision makers can be supplied with warnings of the increased probability of larger-magnitude earthquakes on the short-term timescale. We present here a methodology for forecasting the occurrence of large-magnitude VT events during volcanic crises; it is based on a mean recurrence time (MRT) algorithm that translates the Gutenberg-Richter distribution parameter fluctuations into time windows of increased probability of a major VT earthquake. The MRT forecasting algorithm was developed after observing a repetitive pattern in the seismic swarm episodes occurring between July and November 2011 at El Hierro (Canary Islands). From then on, this methodology has been applied to the consecutive seismic crises registered at El Hierro, achieving a high success rate in the real-time forecasting, within 10-day time windows, of volcano-tectonic earthquakes.
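The Gutenberg-Richter parameter tracking at the heart of this approach can be sketched with the standard Aki maximum-likelihood b-value estimator in a sliding window; a drop in b then flags a window of elevated probability. This is a crude stand-in for the paper's mean-recurrence-time algorithm, and the catalogue, window length, and threshold below are assumptions.

```python
import math
import random

def b_value(mags, m_min):
    """Aki maximum-likelihood estimate of the Gutenberg-Richter b-value."""
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)

def flag_windows(mags, m_min, win=50, b_thresh=0.8):
    """Sliding-window b-value; a low value flags a window of increased
    probability of a larger event."""
    return [b_value(mags[i:i + win], m_min) < b_thresh
            for i in range(len(mags) - win + 1)]

# Synthetic catalogue: a quiet period (b = 1.0) followed by an
# intrusion-like swarm with lower b (b = 0.6); magnitudes above m_min
# follow the exponential form of the Gutenberg-Richter law.
random.seed(42)
m_min = 1.0
quiet = [m_min + random.expovariate(1.0 * math.log(10)) for _ in range(300)]
swarm = [m_min + random.expovariate(0.6 * math.log(10)) for _ in range(100)]
flags = flag_windows(quiet + swarm, m_min)
```

In the synthetic catalogue the swarm windows are the ones that get flagged, mirroring how fluctuations of the distribution parameter are translated into time windows of increased probability.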
1998-01-01
The blending of oxygenates, such as fuel ethanol and methyl tertiary butyl ether (MTBE), into motor gasoline has increased dramatically in the last few years because of the oxygenated and reformulated gasoline programs. Because of the significant role oxygenates now have in petroleum product markets, the Short-Term Integrated Forecasting System (STIFS) was revised to include supply and demand balances for fuel ethanol and MTBE. The STIFS model is used for producing forecasts in the Short-Term Energy Outlook. A review of the historical data sources and forecasting methodology for oxygenate production, imports, inventories, and demand is presented in this report.
Short-Term fo F2 Forecast: Present Day State of Art
NASA Astrophysics Data System (ADS)
Mikhailov, A. V.; Depuev, V. H.; Depueva, A. H.
An analysis of the F2-layer short-term forecast problem has been performed. Both objective and methodological problems currently prevent reliable F2-layer forecasts from being issued. An empirical approach based on statistical methods may be recommended for practical use. A forecast method based on a new aeronomic index (a proxy), AI, has been proposed and tested over 64 selected severe storm events. The method provides acceptable prediction accuracy for both strongly disturbed and quiet conditions. The problems with the prediction of F2-layer quiet-time disturbances, as well as some other unsolved problems, are discussed.
Impacts of Short-Term Solar Power Forecasts in System Operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ibanez, Eduardo; Krad, Ibrahim; Hodge, Bri-Mathias
2016-05-05
Solar generation is experiencing an exponential growth in power systems worldwide and, along with wind power, is posing new challenges to power system operations. Those challenges are characterized by an increase of system variability and uncertainty across many time scales: from days, down to hours, minutes, and seconds. Much of the research in the area has focused on the effect of solar forecasting across hours or days. This paper presents a methodology to capture the effect of short-term forecasting strategies and analyzes the economic and reliability implications of utilizing a simple, yet effective forecasting method for solar PV in intra-day operations.
Forecasting daily meteorological time series using ARIMA and regression models
NASA Astrophysics Data System (ADS)
Murat, Małgorzata; Malinowska, Iwona; Gos, Magdalena; Krzyszczak, Jaromir
2018-04-01
The daily air temperature and precipitation time series recorded between January 1, 1980 and December 31, 2010 at four European sites (Jokioinen, Dikopshof, Lleida and Lublin) from different climatic zones were modeled and forecasted. In our forecasting we used Box-Jenkins and Holt-Winters seasonal autoregressive integrated moving-average (ARIMA) models, ARIMA with external regressors in the form of Fourier terms, and time series regression including trend and seasonality components, implemented with R software. It was demonstrated that the obtained models are able to capture the dynamics of the time series data and to produce sensible forecasts.
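The Fourier-regressor idea can be sketched without R: fit a harmonic regression with trend on part of a daily temperature series and forecast the rest by least squares. The synthetic series below is a stand-in for the station data; the amplitude, phase, noise level, and number of harmonics are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic daily mean temperature with an annual cycle plus noise.
days = np.arange(3 * 365)
temp = (8.0 + 10.0 * np.sin(2 * np.pi * days / 365.25 - 1.9)
        + rng.normal(0, 2.0, days.size))

def fourier_design(t, period=365.25, K=2):
    """Design matrix with intercept, trend and K Fourier harmonics,
    the same kind of terms used as external regressors in ARIMA models."""
    cols = [np.ones_like(t, dtype=float), t.astype(float)]
    for k in range(1, K + 1):
        cols.append(np.sin(2 * np.pi * k * t / period))
        cols.append(np.cos(2 * np.pi * k * t / period))
    return np.column_stack(cols)

# Fit on the first two years, forecast the third.
train, hold = days < 2 * 365, days >= 2 * 365
beta, *_ = np.linalg.lstsq(fourier_design(days[train]), temp[train], rcond=None)
forecast = fourier_design(days[hold]) @ beta
rmse = float(np.sqrt(np.mean((forecast - temp[hold]) ** 2)))
```

In the full methodology the ARIMA part then models the serial correlation of the residuals; the harmonic regression alone already captures the seasonal dynamics.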
NASA Astrophysics Data System (ADS)
Schaefer, Andreas; Daniell, James; Wenzel, Friedemann
2015-04-01
Earthquake forecasting and prediction has been one of the key struggles of modern geosciences for the last few decades. A large number of approaches for various time periods have been developed for different locations around the world. A categorization and review of more than 20 new and old methods was undertaken to develop a state-of-the-art catalogue of forecasting algorithms and methodologies. The different methods have been categorized into time-independent, time-dependent and hybrid methods, where the last group represents methods in which data beyond historical earthquake statistics have been used. It is necessary to distinguish in this way between purely statistical approaches, where historical earthquake data are the only direct data source, and algorithms which incorporate further information, e.g. spatial data of fault distributions, or which incorporate physical models such as static triggering to indicate future earthquakes. Furthermore, the location of application has been taken into account to identify methods which can be applied, e.g., in active tectonic regions like California or in less active continental regions. In general, most of the methods cover well-known high-seismicity regions like Italy, Japan or California. Many more elements have been reviewed, including the application of established theories and methods, e.g. for the determination of the completeness magnitude or whether the modified Omori law was used or not. Target temporal scales are identified as well as the publication history. All these different aspects have been reviewed and catalogued to provide an easy-to-use tool for the development of earthquake forecasting algorithms and to give an overview of the state of the art.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curry, Judith
This project addressed the challenge of providing weather and climate information to support the operation, management and planning for wind-energy systems. The need for forecast information is extending to longer projection windows with increasing penetration of wind power into the grid and also with diminishing reserve margins to meet peak loads during significant weather events. Maintenance planning and natural gas trading is being influenced increasingly by anticipation of wind generation on timescales of weeks to months. Future scenarios on decadal time scales are needed to support assessment of wind farm siting, government planning, long-term wind purchase agreements and the regulatory environment. The challenge of making wind forecasts on these longer time scales is associated with a wide range of uncertainties in general circulation and regional climate models that make them unsuitable for direct use in the design and planning of wind-energy systems. To address this challenge, CFAN has developed a hybrid statistical/dynamical forecasting scheme for delivering probabilistic forecasts on time scales from one day to seven months using what is arguably the best forecasting system in the world (European Centre for Medium Range Weather Forecasting, ECMWF). The project also provided a framework to assess future wind power through developing scenarios of interannual to decadal climate variability and change. The Phase II research has successfully developed an operational wind power forecasting system for the U.S., which is being extended to Europe and possibly Asia.
Advanced, Cost-Based Indices for Forecasting the Generation of Photovoltaic Power
NASA Astrophysics Data System (ADS)
Bracale, Antonio; Carpinelli, Guido; Di Fazio, Annarita; Khormali, Shahab
2014-01-01
Distribution systems are undergoing significant changes as they evolve toward the grids of the future, which are known as smart grids (SGs). The perspective of SGs is to facilitate large-scale penetration of distributed generation using renewable energy sources (RESs), encourage the efficient use of energy, reduce systems' losses, and improve the quality of power. Photovoltaic (PV) systems have become one of the most promising RESs due to the expected cost reduction and the increased efficiency of PV panels and interfacing converters. The ability to forecast power-production information accurately and reliably is of primary importance for the appropriate management of an SG and for making decisions relative to the energy market. Several forecasting methods have been proposed, and many indices have been used to quantify the accuracy of the forecasts of PV power production. Unfortunately, the indices that have been used have deficiencies and usually do not directly account for the economic consequences of forecasting errors in the framework of liberalized electricity markets. In this paper, advanced, more accurate indices are proposed that account directly for the economic consequences of forecasting errors. The proposed indices also were compared to the most frequently used indices in order to demonstrate their different, improved capability. The comparisons were based on the results obtained using a forecasting method based on an artificial neural network. This method was chosen because it was deemed to be one of the most promising methods available due to its capability for forecasting PV power. Numerical applications also are presented that considered an actual PV plant to provide evidence of the forecasting performances of all of the indices that were considered.
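The gap between a symmetric statistical index and a cost-based one can be shown in a few lines: two forecasts with identical RMSE can have very different economic consequences when under- and over-production are settled at different imbalance prices. The prices and series below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def rmse(forecast, actual):
    """Symmetric statistical accuracy index."""
    return float(np.sqrt(np.mean((forecast - actual) ** 2)))

def imbalance_cost(forecast, actual, price_short=60.0, price_long=20.0):
    """Cost-based index: under-production (actual below forecast) is
    bought back at a high imbalance price, over-production is sold at a
    low one. Prices in EUR/MWh are illustrative."""
    err = actual - forecast
    cost = np.where(err < 0, -err * price_short, err * price_long)
    return float(np.mean(cost))

actual = np.zeros(4)
f1 = np.array([1.0, -1.0, 1.0, -1.0])   # errors alternate in sign
f2 = np.array([1.0, 1.0, 1.0, 1.0])     # always over-forecasts
```

Here `f1` and `f2` have the same RMSE of 1, yet `f2` is strictly more expensive because every error falls on the costly side, which is exactly the deficiency of the standard indices that the proposed cost-based indices address.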
This paper presents a methodology that combines the power of an Artificial Neural Network and Information Theory to forecast variables describing the condition of a regional system. The novelty and strength of this approach is in the application of Fisher information, a key metho...
ERIC Educational Resources Information Center
Cain, Timothy J.; Cheek, Fern M.; Kupsco, Jeremy; Hartel, Lynda J.; Getselman, Anna
2016-01-01
To better understand the value of current information services and to forecast the evolving information and data management needs of researchers, a study was conducted at two research-intensive universities. The methodology and planning framework applied by health science librarians at Emory University and The Ohio State University focused on…
Integrating Solar PV in Utility System Operations: Analytical Framework and Arizona Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Jing; Botterud, Audun; Mills, Andrew
2015-06-01
A systematic framework is proposed to estimate the impact on operating costs due to uncertainty and variability in renewable resources. The framework quantifies the integration costs associated with subhourly variability and uncertainty as well as day-ahead forecasting errors in solar PV (photovoltaics) power. A case study illustrates how changes in system operations may affect these costs for a utility in the southwestern United States (Arizona Public Service Company). We conduct an extensive sensitivity analysis under different assumptions about balancing reserves, system flexibility, fuel prices, and forecasting errors. We find that high solar PV penetrations may lead to operational challenges, particularly during low-load and high solar periods. Increased system flexibility is essential for minimizing integration costs and maintaining reliability. In a set of sensitivity cases where such flexibility is provided, in part, by flexible operations of nuclear power plants, the estimated integration costs vary between $1.0 and $4.4/MWh-PV for a PV penetration level of 17%. The integration costs are primarily due to higher needs for hour-ahead balancing reserves to address the increased sub-hourly variability and uncertainty in the PV resource. (C) 2015 Elsevier Ltd. All rights reserved.
Multi-scale landslide hazard assessment: Advances in global and regional methodologies
NASA Astrophysics Data System (ADS)
Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Hong, Yang
2010-05-01
The increasing availability of remotely sensed surface data and precipitation provides a unique opportunity to explore how smaller-scale landslide susceptibility and hazard assessment methodologies may be applicable at larger spatial scales. This research first considers an emerging satellite-based global algorithm framework, which evaluates how the landslide susceptibility and satellite derived rainfall estimates can forecast potential landslide conditions. An analysis of this algorithm using a newly developed global landslide inventory catalog suggests that forecasting errors are geographically variable due to improper weighting of surface observables, resolution of the current susceptibility map, and limitations in the availability of landslide inventory data. These methodological and data limitation issues can be more thoroughly assessed at the regional level, where available higher resolution landslide inventories can be applied to empirically derive relationships between surface variables and landslide occurrence. The regional empirical model shows improvement over the global framework in advancing near real-time landslide forecasting efforts; however, there are many uncertainties and assumptions surrounding such a methodology that decrease the functionality and utility of this system. This research seeks to improve upon this initial concept by exploring the potential opportunities and methodological structure needed to advance larger-scale landslide hazard forecasting and make it more of an operational reality. Sensitivity analysis of the surface and rainfall parameters in the preliminary algorithm indicates that surface data resolution and the interdependency of variables must be more appropriately quantified at local and regional scales. Additionally, integrating available surface parameters must be approached in a more theoretical, physically-based manner to better represent the physical processes underlying slope instability and landslide initiation.
Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of applying a more quantitative hydrological model to larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Due to the significant role that data and methodological uncertainties play in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework in order to better resolve the contribution and limitations of model inputs and to more effectively communicate the model skill for improved landslide hazard assessment.
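The decision rule at the core of such satellite-based algorithms can be sketched compactly: a cell is flagged when a susceptibility index is high and satellite rainfall exceeds an intensity-duration threshold of the common power-law form I = a * D^-b. The coefficients below follow Caine's widely cited global curve; the susceptibility cutoff is an assumption for illustration.

```python
# Minimal sketch of a global landslide nowcast rule: flag a cell when its
# susceptibility index exceeds a cutoff AND the storm's mean rainfall
# intensity exceeds a power-law intensity-duration threshold. The a, b
# coefficients follow Caine's global curve; the 0.7 cutoff is hypothetical.

def rainfall_threshold(duration_h, a=14.82, b=0.39):
    """Critical mean intensity (mm/h) for a storm of given duration (h)."""
    return a * duration_h ** (-b)

def landslide_nowcast(susceptibility, rain_mm, duration_h, suscept_cut=0.7):
    intensity = rain_mm / duration_h
    return susceptibility >= suscept_cut and intensity > rainfall_threshold(duration_h)

# 60 mm over 6 h (10 mm/h) on a highly susceptible cell exceeds the
# ~7.4 mm/h threshold, so the cell is flagged.
print(landslide_nowcast(0.85, rain_mm=60.0, duration_h=6.0))
```

The geographic variability of errors noted in the abstract corresponds to both inputs here being imperfect: the susceptibility weighting and the satellite rainfall estimate.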
Multilayer Stock Forecasting Model Using Fuzzy Time Series
Javedani Sadaei, Hossein; Lee, Muhammad Hisyam
2014-01-01
After reviewing the vast body of literature on using FTS in stock market forecasting, certain deficiencies are distinguished in the hybridization of findings. In addition, the lack of a constructive, systematic framework that could indicate the direction of growth for entire FTS forecasting systems is conspicuous. In this study, we propose a multilayer model for stock market forecasting comprising five logically significant layers. Each layer has its own detailed concern and assists forecast development by resolving certain problems exclusively. To verify the model, a large set of data covering the Taiwan Stock Index (TAIEX), National Association of Securities Dealers Automated Quotations (NASDAQ), Dow Jones Industrial Average (DJI), and S&P 500 was chosen as the experimental datasets. The results indicate that the proposed methodology has the potential to be accepted as a framework for model development in stock market forecasts using FTS. PMID:24605058
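The base layer of most FTS stock forecasters is a first-order fuzzy time series model in the style of Chen: partition the universe of discourse into intervals, fuzzify each observation, build fuzzy logical relationship groups, and defuzzify by averaging interval midpoints. A compact sketch (with made-up prices, and the padding and interval count as assumptions):

```python
# Sketch of a first-order fuzzy-time-series forecaster (Chen-style):
# interval partition, fuzzy logical relationship groups (FLRGs), and
# midpoint defuzzification. The multilayer model in the paper adds
# further layers on top of a base forecaster like this one.

def chen_fts_forecast(series, n_intervals=5, pad=10.0):
    lo, hi = min(series) - pad, max(series) + pad
    width = (hi - lo) / n_intervals
    mid = [lo + (i + 0.5) * width for i in range(n_intervals)]

    def fuzzify(x):                      # index of the interval containing x
        return min(int((x - lo) / width), n_intervals - 1)

    states = [fuzzify(x) for x in series]
    flrg = {}                            # FLRG: state A_i -> successor states
    for cur, nxt in zip(states, states[1:]):
        flrg.setdefault(cur, set()).add(nxt)

    last = states[-1]
    succ = flrg.get(last)
    if not succ:                         # unseen state: fall back to itself
        return mid[last]
    return sum(mid[j] for j in succ) / len(succ)

prices = [105, 109, 120, 118, 125, 131, 129, 140, 138, 132]
print(round(chen_fts_forecast(prices), 2))   # -> 139.0
```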
Impact of Land-Sea Thermal Contrast on Inland Penetration of Sea Fog over The Yellow Sea
NASA Astrophysics Data System (ADS)
Lee, H. Y.; Chang, E. C.
2017-12-01
Sea fog can be classified into cold sea fog, which occurs when the sea surface temperature (SST) is colder than the sea air temperature (SAT), and warm sea fog, which occurs when the SST is warmer than the SAT. We simulated two sea fog events over the Yellow Sea, which is surrounded by the Korean Peninsula and mainland China, using the Weather Research and Forecasting (WRF) model. Our first aim is to understand the contributions of the major factors in sea fog formation. The two events are designated as cold and warm types, and cooling and moistening rates are calculated using bulk aerodynamic methods. In both cases, cooling and moistening by turbulent fluxes play an important role in condensation, either favorably or unfavorably. However, longwave radiative cooling is as strong as, or stronger than, turbulent cooling, suggesting it is the most decisive factor in sea fog formation regardless of type. Our second purpose is to understand the inland penetration of sea fog in terms of the land-sea thermal contrast (TC), investigated through sensitivity tests of SST and land skin temperature (LST). In the SST tests, increasing the SST increases the upward turbulent heat fluxes, raising the SAT and evaporating cloud water; this response is common to both events. Changing the SST also changes the TC and may thereby affect the inland penetration of sea fog, but once the cloud water over the sea evaporates, the penetration effect cannot be fully isolated. As a remedy for this limitation, the LST is modified instead of the SST, minimizing the evaporation effect while maintaining an equivalent TC. In the cold sea fog case, the land air temperature (LAT) is warmer than the SAT; decreasing the LAT weakens the TC and favors inland penetration. In the warm sea fog case, the LAT is colder than the SAT; decreasing the LAT intensifies the TC and blocks the penetration. Although our study focuses mainly on the TC, the results offer a new perspective that should be helpful for forecasting visibility in coastal areas.
Short-term load and wind power forecasting using neural network-based prediction intervals.
Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas
2014-02-01
Electrical power systems are evolving from today's centralized bulk systems to more decentralized systems. Penetrations of renewable energies, such as wind and solar power, significantly increase the level of uncertainty in power systems. Accurate load forecasting becomes more complex, yet more important for management of power systems. Traditional methods for generating point forecasts of load demands cannot properly handle uncertainties in system operations. To quantify potential uncertainties associated with forecasts, this paper implements a neural network (NN)-based method for the construction of prediction intervals (PIs). A newly introduced method, called lower upper bound estimation (LUBE), is applied and extended to develop PIs using NN models. A new problem formulation is proposed, which translates the primary multiobjective problem into a constrained single-objective problem. Compared with the cost function, this new formulation is closer to the primary problem and has fewer parameters. Particle swarm optimization (PSO) integrated with the mutation operator is used to solve the problem. Electrical demands from Singapore and New South Wales (Australia), as well as wind power generation from Capital Wind Farm, are used to validate the PSO-based LUBE method. Comparative results show that the proposed method can construct higher quality PIs for load and wind power generation forecasts in a short time.
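The two quantities that LUBE-style prediction-interval methods trade off can be computed directly: the PI coverage probability (PICP, how often the target falls inside the interval) and the PI normalized average width (PINAW). The neural network training itself (PSO over network weights) is omitted here; the bounds and loads below are hand-made illustrative numbers.

```python
# Sketch of the interval-quality measures optimized by LUBE-style methods:
# PICP (coverage) and PINAW (normalized width). A good PI has high PICP
# with low PINAW; optimizing both is the multiobjective problem the paper
# reformulates as a constrained single-objective one.

def picp(y, lower, upper):
    """Fraction of targets falling inside their prediction interval."""
    return sum(l <= t <= u for t, l, u in zip(y, lower, upper)) / len(y)

def pinaw(y, lower, upper):
    """Average interval width, normalized by the target range."""
    r = max(y) - min(y)
    return sum(u - l for l, u in zip(lower, upper)) / (len(y) * r)

load = [310, 295, 320, 340, 330]           # MW, illustrative demands
low  = [300, 285, 300, 325, 320]           # hypothetical NN lower bounds
high = [325, 305, 330, 350, 345]           # hypothetical NN upper bounds

print(picp(load, low, high))               # 1.0 -> all targets covered
print(round(pinaw(load, low, high), 3))
```

Widening every interval drives PICP toward 1 but inflates PINAW, which is exactly the tension the constrained formulation resolves.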
Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas
2015-09-01
Penetration of renewable energy resources, such as wind and solar power, into power systems significantly increases the uncertainties on system operation, stability, and reliability in smart grids. In this paper, the nonparametric neural network-based prediction intervals (PIs) are implemented for forecast uncertainty quantification. Instead of a single level PI, wind power forecast uncertainties are represented in a list of PIs. These PIs are then decomposed into quantiles of wind power. A new scenario generation method is proposed to handle wind power forecast uncertainties. For each hour, an empirical cumulative distribution function (ECDF) is fitted to these quantile points. The Monte Carlo simulation method is used to generate scenarios from the ECDF. Then the wind power scenarios are incorporated into a stochastic security-constrained unit commitment (SCUC) model. The heuristic genetic algorithm is utilized to solve the stochastic SCUC problem. Five deterministic and four stochastic case studies incorporated with interval forecasts of wind power are implemented. The results of these cases are presented and discussed together. Generation costs, and the scheduled and real-time economic dispatch reserves of different unit commitment strategies are compared. The experimental results show that the stochastic model is more robust than deterministic ones and, thus, decreases the risk in system operations of smart grids.
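The scenario-generation step described above (fit an ECDF to quantile points for each hour, then sample it by Monte Carlo) can be sketched as follows. The quantile values are illustrative, not taken from the Capital Wind Farm data, and linear interpolation between quantile points is an assumed choice of ECDF fit.

```python
# Sketch of scenario generation from interval forecasts: build an inverse
# empirical CDF by linear interpolation between wind power quantile points,
# then draw scenarios by inverse-transform sampling.
import random

def inverse_ecdf(probs, values):
    """Return F^-1 built by linear interpolation between quantile points."""
    def quantile(u):
        if u <= probs[0]:
            return values[0]
        for (p0, v0), (p1, v1) in zip(zip(probs, values),
                                      zip(probs[1:], values[1:])):
            if u <= p1:
                return v0 + (v1 - v0) * (u - p0) / (p1 - p0)
        return values[-1]
    return quantile

probs = [0.05, 0.25, 0.50, 0.75, 0.95]     # levels derived from the PIs
power = [12.0, 20.0, 26.0, 33.0, 45.0]     # MW quantiles for one hour

random.seed(7)
q = inverse_ecdf(probs, power)
scenarios = [q(random.random()) for _ in range(1000)]
print(round(sum(scenarios) / len(scenarios), 1))
```

Each such scenario trace would then feed the stochastic SCUC model as one realization of wind power.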
Alternative Methods of Base Level Demand Forecasting for Economic Order Quantity Items,
1975-12-01
Note ... 21; Adaptive Single Exponential Smoothing ... 21; Choosing the Smoothing Constant... methodology used in the study, an analysis of results, and a detailed summary. Chapter I, Methodology, contains a description of the data, a... Chapter IV, Detailed Summary, presents a detailed summary of the findings, lists the limitations inherent in the research methodology, and
NASA Astrophysics Data System (ADS)
Georgakakos, A. P.; Kistenmacher, M.; Yao, H.; Georgakakos, K. P.
2014-12-01
The 2014 National Climate Assessment of the US Global Change Research Program emphasizes that water resources managers and planners in most US regions will have to cope with new risks, vulnerabilities, and opportunities, and recommends the development of adaptive capacity to effectively respond to the new water resources planning and management challenges. In the face of these challenges, adaptive reservoir regulation is becoming all the more necessary. Water resources management in Northern California relies on the coordinated operation of several multi-objective reservoirs on the Trinity, Sacramento, American, Feather, and San Joaquin Rivers. To be effective, reservoir regulation must be able to (a) account for forecast uncertainty; (b) assess changing tradeoffs among water uses and regions; (c) adjust management policies as conditions change; and (d) evaluate the socio-economic and environmental benefits and risks of forecasts and policies for each region and for the system as a whole. The Integrated Forecast and Reservoir Management (INFORM) prototype demonstration project operated in Northern California through the collaboration of several forecast and management agencies has shown that decision support systems (DSS) with these attributes add value to stakeholder decision processes compared to current, less flexible management practices.
Key features of the INFORM DSS include: (a) dynamically downscaled operational forecasts and climate projections that maintain the spatio-temporal coherence of the downscaled land surface forcing fields within synoptic scales; (b) use of ensemble forecast methodologies for reservoir inflows; (c) assessment of relevant tradeoffs among water uses on regional and local scales; (d) development and evaluation of dynamic reservoir policies with explicit consideration of hydro-climatic forecast uncertainties; and (e) focus on stakeholder information needs. This article discusses the INFORM integrated design concept, underlying methodologies, and selected applications with the California water resources system.
Improving 7-Day Forecast Skill by Assimilation of Retrieved AIRS Temperature Profiles
NASA Technical Reports Server (NTRS)
Susskind, Joel; Rosenberg, Bob
2016-01-01
We conducted a new set of data assimilation experiments covering the period January 1 to February 29, 2016 using the GEOS-5 DAS. Our experiments assimilate all data used operationally by GMAO (Control) with some modifications. Significant improvement in Global and Southern Hemisphere Extra-tropical 7-day forecast skill was obtained when we assimilated AIRS Quality Controlled temperature profiles in place of observed AIRS radiances and did not assimilate CrIS/ATMS radiances, radiosonde temperature profiles, or aircraft temperatures. This new methodology neither improved nor degraded 7-day Northern Hemisphere Extra-tropical forecast skill. We are conducting experiments aimed at further improving Northern Hemisphere Extra-tropical forecast skill.
Reither, Eric N; Olshansky, S Jay; Yang, Yang
2011-08-01
Traditional methods of projecting population health statistics, such as estimating future death rates, can give inaccurate results and lead to inferior or even poor policy decisions. A new "three-dimensional" method of forecasting vital health statistics is more accurate because it takes into account the delayed effects of the health risks being accumulated by today's younger generations. Applying this forecasting technique to the US obesity epidemic suggests that future death rates and health care expenditures could be far worse than currently anticipated. We suggest that public policy makers adopt this more robust forecasting tool and redouble efforts to develop and implement effective obesity-related prevention programs and interventions.
NASA Astrophysics Data System (ADS)
Areekul, Phatchakorn; Senjyu, Tomonobu; Urasaki, Naomitsu; Yona, Atsushi
Electricity price forecasting is becoming increasingly relevant to power producers and consumers in the new competitive electric power markets as they plan bidding strategies to maximize their benefits and utilities, respectively. This paper proposes a method to predict hourly electricity prices for next-day electricity markets using a combined ARIMA and artificial neural network (ANN) methodology. The proposed method is examined on the Australian National Electricity Market (NEM), New South Wales region, for the year 2006. The forecasting performance of the ARIMA, ANN, and combined ARIMA-ANN models is compared. Empirical results indicate that the ARIMA-ANN model improves price forecasting accuracy.
A global flash flood forecasting system
NASA Astrophysics Data System (ADS)
Baugh, Calum; Pappenberger, Florian; Wetterhall, Fredrik; Hewson, Tim; Zsoter, Ervin
2016-04-01
The sudden and devastating nature of flash flood events means it is imperative to provide early warnings such as those derived from Numerical Weather Prediction (NWP) forecasts. Currently such systems exist on basin, national and continental scales in Europe, North America and Australia but rely on high resolution NWP forecasts or rainfall-radar nowcasting, neither of which have global coverage. To produce global flash flood forecasts this work investigates the possibility of using forecasts from a global NWP system. In particular we: (i) discuss how global NWP can be used for flash flood forecasting and discuss strengths and weaknesses; (ii) demonstrate how a robust evaluation can be performed given the rarity of the event; (iii) highlight the challenges and opportunities in communicating flash flood uncertainty to decision makers; and (iv) explore future developments which would significantly improve global flash flood forecasting. The proposed forecast system uses ensemble surface runoff forecasts from the ECMWF H-TESSEL land surface scheme. A flash flood index is generated using the ERIC (Enhanced Runoff Index based on Climatology) methodology [Raynaud et al., 2014]. This global methodology is applied to a series of flash floods across southern Europe. Results from the system are compared against warnings produced using the higher resolution COSMO-LEPS limited area model. The global system is evaluated by comparing forecasted warning locations against a flash flood database of media reports created in partnership with floodlist.com. To deal with the lack of objectivity in media reports we carefully assess the suitability of different skill scores and apply spatial uncertainty thresholds to the observations. To communicate the uncertainties of the flash flood system output we experiment with a dynamic region-growing algorithm. 
This automatically clusters regions of similar return period exceedance probabilities, thus presenting the at-risk areas at a spatial resolution appropriate to the NWP system. We then demonstrate how these warning areas could eventually complement existing global systems such as the Global Flood Awareness System (GloFAS), to give warnings of flash floods. This work demonstrates the possibility of creating a global flash flood forecasting system based on forecasts from existing global NWP systems. Future developments, in post-processing for example, will need to address an under-prediction bias for extreme point rainfall that is innate to current-generation global models.
ERIC Educational Resources Information Center
Borghans, Lex; de Grip, Andries; Heijke, Hans
The problem of planning and making labor market forecasts by occupation and qualification in the context of a constantly changing labor market was examined. The examination focused on the following topics: assumptions, benefits, and pitfalls of the labor requirement model of projecting future imbalances between labor supply and demand for certain…
Multidimensional k-nearest neighbor model based on EEMD for financial time series forecasting
NASA Astrophysics Data System (ADS)
Zhang, Ningning; Lin, Aijing; Shang, Pengjian
2017-07-01
In this paper, we propose a new two-stage methodology that combines the ensemble empirical mode decomposition (EEMD) with a multidimensional k-nearest neighbor model (MKNN) in order to forecast the closing price and high price of stocks simultaneously. The modified k-nearest neighbors (KNN) algorithm has increasingly wide application in prediction across many fields. Empirical mode decomposition (EMD) decomposes a nonlinear and non-stationary signal into a series of intrinsic mode functions (IMFs); however, it cannot reveal characteristic information of the signal with much accuracy as a result of mode mixing. Ensemble empirical mode decomposition (EEMD), an improved variant of EMD, is therefore used to resolve this weakness by adding white noise to the original data. With EEMD, components with true physical meaning can be extracted from the time series. Utilizing the advantages of EEMD and MKNN, the proposed EEMD-MKNN model has high predictive precision for short-term forecasting. Moreover, we extend this methodology to the two-dimensional case to forecast the closing price and high price of four stock indices (NAS, S&P 500, DJI, and STI) at the same time. The results indicate that the proposed EEMD-MKNN model has a higher forecast precision than the EMD-KNN, KNN, and ARIMA methods.
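The MKNN stage alone can be sketched briefly (the EEMD decomposition is omitted here for brevity): each day is represented by a delay vector spanning both the closing and the high price, the k nearest historical vectors are found by Euclidean distance, and the forecast is the mean of their next-day values. Prices, embedding length m, and k are illustrative assumptions.

```python
# Sketch of a multidimensional KNN forecaster over two coupled series
# (close, high): embed both into one delay vector, match against history,
# and average the successors of the k nearest matches.
import math

def mknn_forecast(close, high, m=3, k=3):
    n = len(close)
    def vec(t):  # delay vector ending at day t, both dimensions interleaved
        return [x for i in range(t - m + 1, t + 1) for x in (close[i], high[i])]
    target = vec(n - 1)
    cands = []
    for t in range(m - 1, n - 1):        # every history vector with a successor
        d = math.dist(vec(t), target)
        cands.append((d, close[t + 1], high[t + 1]))
    cands.sort(key=lambda c: c[0])
    near = cands[:k]
    return (sum(c[1] for c in near) / k, sum(c[2] for c in near) / k)

close = [10.0, 10.4, 10.2, 10.8, 11.0, 10.7, 11.2, 11.5, 11.3, 11.8]
high  = [10.3, 10.6, 10.5, 11.1, 11.2, 11.0, 11.5, 11.8, 11.6, 12.1]
c_hat, h_hat = mknn_forecast(close, high)
print(round(c_hat, 2), round(h_hat, 2))
```

In the full two-stage method this forecaster would be applied to the EEMD components rather than the raw prices.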
Parameter estimation and forecasting for multiplicative log-normal cascades.
Leövey, Andrés E; Lux, Thomas
2012-04-01
We study the well-known multiplicative log-normal cascade process in which the multiplication of Gaussian and log normally distributed random variables yields time series with intermittent bursts of activity. Due to the nonstationarity of this process and the combinatorial nature of such a formalism, its parameters have been estimated mostly by fitting the numerical approximation of the associated non-Gaussian probability density function to empirical data, cf. Castaing et al. [Physica D 46, 177 (1990)]. More recently, alternative estimators based upon various moments have been proposed by Beck [Physica D 193, 195 (2004)] and Kiyono et al. [Phys. Rev. E 76, 041113 (2007)]. In this paper, we pursue this moment-based approach further and develop a more rigorous generalized method of moments (GMM) estimation procedure to cope with the documented difficulties of previous methodologies. We show that even under uncertainty about the actual number of cascade steps, our methodology yields very reliable results for the estimated intermittency parameter. Employing the Levinson-Durbin algorithm for best linear forecasts, we also show that estimated parameters can be used for forecasting the evolution of the turbulent flow. We compare forecasting results from the GMM and Kiyono et al.'s procedure via Monte Carlo simulations. We finally test the applicability of our approach by estimating the intermittency parameter and forecasting of volatility for a sample of financial data from stock and foreign exchange markets.
Time-Series Forecast Modeling on High-Bandwidth Network Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Wucherl; Sim, Alex
With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of the resource utilization and scheduling of data movements on high-bandwidth networks to accommodate ever increasing data volume for large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with a traditional approach such as the Box-Jenkins methodology to train the ARIMA model, our forecast model reduces computation time by up to 92.6%. It also shows resilience against abrupt network usage changes. Finally, our forecast model conducts a large number of multi-step forecasts, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.
An Ensemble-Based Forecasting Framework to Optimize Reservoir Releases
NASA Astrophysics Data System (ADS)
Ramaswamy, V.; Saleh, F.
2017-12-01
Increasing frequency of extreme precipitation events are stressing the need to manage water resources on shorter timescales. Short-term management of water resources becomes proactive when inflow forecasts are available and this information can be effectively used in the control strategy. This work investigates the utility of short term hydrological ensemble forecasts for operational decision making during extreme weather events. An advanced automated hydrologic prediction framework integrating a regional scale hydrologic model, GIS datasets and the meteorological ensemble predictions from the European Center for Medium Range Weather Forecasting (ECMWF) was coupled to an implicit multi-objective dynamic programming model to optimize releases from a water supply reservoir. The proposed methodology was evaluated by retrospectively forecasting the inflows to the Oradell reservoir in the Hackensack River basin in New Jersey during the extreme hydrologic event, Hurricane Irene. Additionally, the flexibility of the forecasting framework was investigated by forecasting the inflows from a moderate rainfall event to provide important perspectives on using the framework to assist reservoir operations during moderate events. The proposed forecasting framework seeks to provide a flexible, assistive tool to alleviate the complexity of operational decision-making.
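How ensemble inflow forecasts can drive a release optimization is sketched below as a small backward dynamic program over discretized storage levels: for each state, pick the release minimizing expected cost across the ensemble members, treated stage by stage. This is an intentional simplification of the paper's implicit multi-objective model, and all capacities, penalties, and inflow traces are illustrative.

```python
# Sketch of ensemble-driven reservoir release optimization: backward DP on
# expected cost-to-go, where the expectation at each stage is over three
# equally likely forecast inflow traces (hypothetical numbers throughout).

CAP, DEMAND = 10, 3                     # storage capacity, supply target
RELEASES = range(0, 7)
SPILL_PEN, SHORT_PEN = 5.0, 2.0         # per unit of spill / unmet demand

ensemble = [                             # three equally likely inflow traces
    [2, 5, 7, 4],
    [3, 6, 9, 5],
    [1, 3, 4, 2],
]
T = len(ensemble[0])

def step(storage, release, inflow):
    s = storage - release + inflow
    spill = max(0, s - CAP)
    cost = SPILL_PEN * spill + SHORT_PEN * max(0, DEMAND - release)
    return min(max(s, 0), CAP), cost

value = [0.0] * (CAP + 1)               # terminal cost-to-go
policy = []
for t in reversed(range(T)):
    new_value, decisions = [], []
    for s0 in range(CAP + 1):
        best = None
        for r in RELEASES:
            if r > s0 + min(tr[t] for tr in ensemble):
                continue                 # infeasible under the driest member
            exp_cost = 0.0
            for tr in ensemble:
                s1, c = step(s0, r, tr[t])
                exp_cost += (c + value[s1]) / len(ensemble)
            if best is None or exp_cost < best[0]:
                best = (exp_cost, r)
        new_value.append(best[0]); decisions.append(best[1])
    value, policy = new_value, [decisions] + policy

print(policy[0][5])   # recommended release at t=0 with storage level 5
```

The flood-versus-supply tension in the abstract appears here as the spill and shortage penalties pulling the optimal release in opposite directions.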
Basic principles, methodology, and applications of remote sensing in agriculture
NASA Technical Reports Server (NTRS)
Moreira, M. A. (Principal Investigator); Deassuncao, G. V.
1984-01-01
The basic principles of remote sensing applied to agriculture and the methods used in data analysis are described. Emphasis is placed on the importance of developing a methodology that may help crop forecasting, basic concepts of spectral signatures of vegetation, the methodology of LANDSAT data utilization in agriculture, and the agricultural applications of the remote sensing program of INPE (Institute for Space Research).
Prioritization Methodology for Chemical Replacement
NASA Technical Reports Server (NTRS)
Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.
1993-01-01
This project serves to define an appropriate methodology for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results are being implemented as a guideline for consideration for current NASA propulsion systems.
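The QFD-style prioritization can be sketched as a weighted scoring matrix: each replacement candidate is rated against the weighted criteria named in the abstract and ranked by total score. The criteria weights, candidate names, and ratings below are illustrative, not the project's actual values.

```python
# Sketch of a semiquantitative QFD-style prioritization matrix: weighted
# criteria, 1-9 ratings per candidate, descending rank by total score.

criteria = {"environmental": 0.30, "cost": 0.20, "safety": 0.25,
            "reliability": 0.15, "programmatic": 0.10}

candidates = {     # ratings on a 1 (poor) to 9 (excellent) scale
    "solvent A": {"environmental": 9, "cost": 3, "safety": 7,
                  "reliability": 5, "programmatic": 6},
    "solvent B": {"environmental": 5, "cost": 8, "safety": 6,
                  "reliability": 7, "programmatic": 7},
}

def score(ratings):
    return sum(criteria[c] * ratings[c] for c in criteria)

ranked = sorted(candidates, key=lambda name: score(candidates[name]),
                reverse=True)
for name in ranked:
    print(name, round(score(candidates[name]), 2))
```

The weighting step is what lets the matrix balance environmental mandates against cost, safety, reliability, and programmatic implications in a single ranking.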
The definitive analysis of the Bendandi's methodology performed with a specific software
NASA Astrophysics Data System (ADS)
Ballabene, Adriano; Pescerelli Lagorio, Paola; Georgiadis, Teodoro
2015-04-01
The presentation aims to clarify the "Bendandi method", supposed in the past to be able to forecast earthquakes but never explicitly explained to posterity by the geophysicist from Faenza. The geoethics implications of Bendandi's forecasts, and of the speculation about possible earthquakes inferred from supposed "Bendandian" methodologies, arose in previous years from the social alarm during predicted occurrences of earthquakes that never happened but were widely spread by the media, following some 'well informed' non-conventional scientists. The analysis was conducted through an extensive literature search of the 'Raffaele Bendandi' archive at the Geophysical Observatory of Faenza, and the forecasts were analyzed using specially developed software, called the "Bendandiano Dashboard", which can reproduce the planetary configurations reported in the graphs made by the Italian geophysicist. This analysis should serve to clarify definitively the basis of Bendandi's calculations, as well as to prevent future unwarranted warnings issued on the basis of supposed prophecies and illusory legacy documents.
Model documentation, Coal Market Module of the National Energy Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This report documents the objectives and the conceptual and methodological approach used in the development of the National Energy Modeling System's (NEMS) Coal Market Module (CMM) used to develop the Annual Energy Outlook 1998 (AEO98). This report catalogues and describes the assumptions, methodology, estimation techniques, and source code of CMM's two submodules. These are the Coal Production Submodule (CPS) and the Coal Distribution Submodule (CDS). CMM provides annual forecasts of prices, production, and consumption of coal for NEMS. In general, the CDS integrates the supply inputs from the CPS to satisfy demands for coal from exogenous demand models. The international area of the CDS forecasts annual world coal trade flows from major supply to major demand regions and provides annual forecasts of US coal exports for input to NEMS. Specifically, the CDS receives minemouth prices produced by the CPS, demand and other exogenous inputs from other NEMS components, and provides delivered coal prices and quantities to the NEMS economic sectors and regions.
NASA Technical Reports Server (NTRS)
Balikhin, M. A.; Rodriguez, J. V.; Boynton, R. J.; Walker, S. N.; Aryan, Homayon; Sibeck, D. G.; Billings, S. A.
2016-01-01
Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field B(sub z) observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.
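The Heidke skill score used to compare the two models on rare high-flux events can be computed from a 2x2 contingency table of forecast versus observed event days: hits a, false alarms b, misses c, and correct negatives d. The counts below are made up for illustration, not taken from the GOES 13 comparison.

```python
# Sketch of the Heidke skill score (HSS) on a 2x2 contingency table:
# accuracy relative to random chance, 1 for a perfect forecast, 0 for
# no skill beyond chance.

def heidke(a, b, c, d):
    """a=hits, b=false alarms, c=misses, d=correct negatives."""
    num = 2 * (a * d - b * c)
    den = (a + c) * (c + d) + (a + b) * (b + d)
    return num / den

print(round(heidke(a=18, b=7, c=5, d=70), 3))   # -> 0.671
```

Because high-flux days are infrequent, the chance correction in HSS matters: plain accuracy would look good even for a forecaster that never issues an alert.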
Forecasting Effects of MISO Actions: An ABM Methodology
2013-12-01
process of opinion change for polio vaccination in Uttar Pradesh, India. His analysis combines word-of-mouth and mass media broadcasting for agent...this abstraction is an appropriate comparison between treatments, rather than actual forecasting of specific levels of rebellion or anti-government...effect of breadth upon grievance. There is insufficient evidence to show that this effect differs between treatments. Breadth and campaign type
Determining and Forecasting Savings from Competing Previously Sole Source/Noncompetitive Contracts
1978-10-01
SUMMARY A. BACKGROUND. Within the defense market, it is difficult to isolate, identify and quantify the impact of competition on acquisition costs... C. FORECASTING METHODOLOGY... D. COMPETITION INDEX... E. USE AS A FORECASTING TOOL... e. From this projection, calculate the actual total contract price commencing with the buy-out competition by multiplying the
High-Penetration Photovoltaic Planning Methodologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, David Wenzhong; Muljadi, Eduard; Tian, Tian
The main objective of this report is to provide an overview of select U.S. utility methodologies for performing high-penetration photovoltaic (HPPV) system planning and impact studies. This report covers the Federal Energy Regulatory Commission's orders related to photovoltaic (PV) power system interconnection, particularly the interconnection processes for the Large Generation Interconnection Procedures and Small Generation Interconnection Procedures. In addition, it includes U.S. state interconnection standards and procedures. The procedures used by these regulatory bodies consider the impacts of HPPV power plants on the networks. Technical interconnection requirements for HPPV voltage regulation include aspects of power monitoring, grounding, synchronization, connection to the overall distribution system, back-feeds, disconnecting means, abnormal operating conditions, and power quality. This report provides a summary of mitigation strategies to minimize the impact of HPPV. Recommendations and revisions to the standards may take place as the penetration level of renewables on the grid increases and new technologies develop in future years.
A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events
NASA Astrophysics Data System (ADS)
Kholodovsky, V.
2017-12-01
Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As the climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often independently used to model those phenomena. To better assess the performance of climate models, a variety of spatial forecast verification methods have been developed. However, spatial verification metrics that are widely used in comparing mean states, in most cases, do not have an adequate theoretical justification to benchmark extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers users the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.
Computer-Aided Analysis of Patents for Product Technology Maturity Forecasting
NASA Astrophysics Data System (ADS)
Liang, Yanhong; Gan, Dequan; Guo, Yingchun; Zhang, Peng
Product technology maturity forecasting is vital for any enterprise seeking to seize opportunities for innovation and stay competitive over the long term. The Theory of Inventive Problem Solving (TRIZ) is acknowledged both as a systematic methodology for innovation and as a powerful tool for technology forecasting. Based on TRIZ, the state of the art in product technology maturity forecasting and the limits of its application are discussed. With the application of text mining and patent analysis technologies, this paper proposes a computer-aided approach for product technology maturity forecasting. It can overcome the shortcomings of the current methods.
Prioritization methodology for chemical replacement
NASA Technical Reports Server (NTRS)
Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott
1995-01-01
This methodology serves to define a system for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.
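A QFD-style prioritization of this kind reduces to a weighted relationship matrix: each candidate is scored against each criterion, scores are multiplied by criterion weights, and candidates are ranked by the weighted sum. The sketch below is hypothetical; the criteria weights, candidate names, and scores are invented for illustration and are not taken from the NASA study:

```python
# Hypothetical criterion weights and candidate scores on a 1-9 QFD-style scale.
criteria_weights = {"environmental": 5, "cost": 3, "safety": 9,
                    "reliability": 7, "programmatic": 3}

candidates = {
    "solvent_A": {"environmental": 7, "cost": 5, "safety": 3,
                  "reliability": 7, "programmatic": 5},
    "solvent_B": {"environmental": 3, "cost": 9, "safety": 7,
                  "reliability": 5, "programmatic": 3},
}

def qfd_score(scores, weights):
    # Weighted sum across criteria, as in one row of a QFD relationship matrix.
    return sum(weights[c] * scores[c] for c in weights)

# Rank candidates from highest to lowest weighted score.
ranking = sorted(candidates,
                 key=lambda k: qfd_score(candidates[k], criteria_weights),
                 reverse=True)
```

With these illustrative numbers, solvent_B scores 149 against solvent_A's 141, so it ranks first; in practice the weights themselves would come from the environmental, cost, safety, reliability, and programmatic assessment the abstract describes.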
NASA Technical Reports Server (NTRS)
Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt
1993-01-01
This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.
Real-time forecasting of the April 11, 2012 Sumatra tsunami
Wang, Dailin; Becker, Nathan C.; Walsh, David; Fryer, Gerard J.; Weinstein, Stuart A.; McCreery, Charles S.
2012-01-01
The April 11, 2012, magnitude 8.6 earthquake off the northern coast of Sumatra generated a tsunami that was recorded at sea-level stations as far as 4800 km from the epicenter and at four ocean bottom pressure sensors (DARTs) in the Indian Ocean. The governments of India, Indonesia, Sri Lanka, Thailand, and Maldives issued tsunami warnings for their coastlines. The United States' Pacific Tsunami Warning Center (PTWC) issued an Indian Ocean-wide Tsunami Watch Bulletin in its role as an Interim Service Provider for the region. Using an experimental real-time tsunami forecast model (RIFT), PTWC produced a series of tsunami forecasts during the event that were based on rapidly derived earthquake parameters, including initial location and Mwp magnitude estimates and the W-phase centroid moment tensor solutions (W-phase CMTs) obtained at PTWC and at the U. S. Geological Survey (USGS). We discuss the real-time forecast methodology and how successive, real-time tsunami forecasts using the latest W-phase CMT solutions improved the accuracy of the forecast.
Improving of local ozone forecasting by integrated models.
Gradišar, Dejan; Grašič, Boštjan; Božnar, Marija Zlata; Mlakar, Primož; Kocijan, Juš
2016-09-01
This paper discusses the problem of forecasting maximum ozone concentrations in urban microlocations, where reliable alerting of the local population when thresholds have been surpassed is necessary. To improve the forecast, a methodology of integrated models is proposed. The model is based on multilayer perceptron neural networks that use as inputs all available information from the QualeAria air-quality model, the WRF numerical weather prediction model, and on-site measurements of meteorology and air pollution. While air-quality and meteorological models cover a large geographical 3-dimensional space, their local resolution is often not satisfactory. On the other hand, empirical methods have the advantage of good local forecasts. In this paper, integrated models are used for improved 1-day-ahead forecasting of the maximum hourly value of ozone within each day for representative locations in Slovenia. The WRF meteorological model is used for forecasting meteorological variables and the QualeAria air-quality model for gas concentrations. Their predictions, together with measurements from ground stations, are used as inputs to a neural network. The model validation results show that integrated models noticeably improve ozone forecasts and provide better alert systems.
Network bandwidth utilization forecast model on high bandwidth networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Wuchert; Sim, Alex
With the increasing number of geographically distributed scientific collaborations and the scale of the data size growth, it has become more challenging for users to achieve the best possible network performance on a shared network. We have developed a forecast model to predict expected bandwidth utilization for high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and scheduling of data movements on high-bandwidth networks to accommodate the ever-increasing data volume of large-scale scientific data applications. A univariate model is developed with STL and ARIMA on SNMP path utilization data. Compared with a traditional approach such as the Box-Jenkins methodology, our forecast model reduces computation time by 83.2%. It also shows resilience against abrupt network usage change. The accuracy of the forecast model is within the standard deviation of the monitored measurements.
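The abstract's pipeline decomposes the utilization series (STL) and models the remainder (ARIMA). As a rough illustration of that decompose-then-model idea only, not the authors' implementation, here is a toy seasonal-means-plus-AR(1) forecaster in pure Python; real STL uses loess smoothing and ARIMA fits a full (p, d, q) model:

```python
def seasonal_ar1_forecast(y, period, steps=1):
    """Toy stand-in for an STL+ARIMA pipeline: subtract per-phase seasonal
    means, fit AR(1) to the remainder by least squares, forecast additively."""
    n = len(y)
    # Seasonal component: the mean of each phase of the cycle.
    seasonal = []
    for s in range(period):
        vals = [y[i] for i in range(s, n, period)]
        seasonal.append(sum(vals) / len(vals))
    resid = [y[i] - seasonal[i % period] for i in range(n)]
    # AR(1) on the remainder: r_t = phi * r_{t-1}, slope through the origin.
    num = sum(resid[i - 1] * resid[i] for i in range(1, n))
    den = sum(r * r for r in resid[:-1])
    phi = num / den if den else 0.0
    # Recombine: seasonal component plus decayed AR(1) remainder.
    out, r = [], resid[-1]
    for h in range(1, steps + 1):
        r = phi * r
        out.append(seasonal[(n + h - 1) % period] + r)
    return out
```

On a purely periodic series the remainder vanishes and the forecast simply replays the seasonal pattern; on real SNMP data the AR term carries the short-range persistence.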
NASA Astrophysics Data System (ADS)
Pathiraja, S.; Anghileri, D.; Burlando, P.; Sharma, A.; Marshall, L.; Moradkhani, H.
2018-03-01
The global prevalence of rapid and extensive land use change necessitates hydrologic modelling methodologies capable of handling non-stationarity. This is particularly true in the context of Hydrologic Forecasting using Data Assimilation. Data Assimilation has been shown to dramatically improve forecast skill in hydrologic and meteorological applications, although such improvements are conditional on using bias-free observations and model simulations. A hydrologic model calibrated to a particular set of land cover conditions has the potential to produce biased simulations when the catchment is disturbed. This paper sheds new light on the impacts of bias or systematic errors in hydrologic data assimilation, in the context of forecasting in catchments with changing land surface conditions and a model calibrated to pre-change conditions. We posit that in such cases, the impact of systematic model errors on assimilation or forecast quality is dependent on the inherent prediction uncertainty that persists even in pre-change conditions. Through experiments on a range of catchments, we develop a conceptual relationship between total prediction uncertainty and the impacts of land cover changes on the hydrologic regime to demonstrate how forecast quality is affected when using state estimation Data Assimilation with no modifications to account for land cover changes. This work shows that systematic model errors as a result of changing or changed catchment conditions do not always necessitate adjustments to the modelling or assimilation methodology, for instance through re-calibration of the hydrologic model, time varying model parameters or revised offline/online bias estimation.
Parameter estimation and forecasting for multiplicative log-normal cascades
NASA Astrophysics Data System (ADS)
Leövey, Andrés E.; Lux, Thomas
2012-04-01
We study the well-known multiplicative log-normal cascade process in which the multiplication of Gaussian and log normally distributed random variables yields time series with intermittent bursts of activity. Due to the nonstationarity of this process and the combinatorial nature of such a formalism, its parameters have been estimated mostly by fitting the numerical approximation of the associated non-Gaussian probability density function to empirical data, cf. Castaing [Physica D 46, 177 (1990)]. More recently, alternative estimators based upon various moments have been proposed by Beck [Physica D 193, 195 (2004)] and Kiyono [Phys. Rev. E 76, 041113 (2007)]. In this paper, we pursue this moment-based approach further and develop a more rigorous generalized method of moments (GMM) estimation procedure to cope with the documented difficulties of previous methodologies. We show that even under uncertainty about the actual number of cascade steps, our methodology yields very reliable results for the estimated intermittency parameter. Employing the Levinson-Durbin algorithm for best linear forecasts, we also show that estimated parameters can be used for forecasting the evolution of the turbulent flow. We compare forecasting results from the GMM and Kiyono's procedure via Monte Carlo simulations. We finally test the applicability of our approach by estimating the intermittency parameter and forecasting volatility for a sample of financial data from stock and foreign exchange markets.
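The moment-based idea can be sketched in its simplest form: for a cascade sample that is a product of k independent log-normal multipliers exp(N(0, λ²)), the variance of ln x equals kλ², so a method-of-moments estimate of the intermittency parameter is Var(ln x)/k. This is a deliberately simplified stand-in for the paper's GMM procedure (which handles uncertainty in k and uses a full set of moment conditions):

```python
import math, random

def simulate_cascade(n_series, k_steps, lam2, seed=1):
    """Each sample is a product of k log-normal multipliers exp(N(0, lam2))."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_series):
        logx = sum(rng.gauss(0.0, math.sqrt(lam2)) for _ in range(k_steps))
        out.append(math.exp(logx))
    return out

def moment_estimate_lam2(samples, k_steps):
    """Method of moments: Var(ln x) = k * lam2 for the cascade above."""
    logs = [math.log(x) for x in samples]
    m = sum(logs) / len(logs)
    var = sum((l - m) ** 2 for l in logs) / (len(logs) - 1)
    return var / k_steps

# Recover the intermittency parameter from simulated cascade data.
x = simulate_cascade(20000, 8, lam2=0.05)
est = moment_estimate_lam2(x, 8)
```

With 20,000 samples the estimate lands close to the true λ² = 0.05; the GMM machinery in the paper exists precisely because real data do not come with k known.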
Developing a robust methodology for assessing the value of weather/climate services
NASA Astrophysics Data System (ADS)
Krijnen, Justin; Golding, Nicola; Buontempo, Carlo
2016-04-01
Increasingly, scientists involved in providing weather and climate services are expected to demonstrate the value of their work for end users in order to justify the costs of developing and delivering these services. This talk will outline different approaches that can be used to assess the socio-economic benefits of weather and climate services, including, among others, willingness to pay and avoided costs. The advantages and limitations of these methods will be discussed and relevant case-studies will be used to illustrate each approach. The choice of valuation method may be influenced by different factors, such as resource and time constraints and the end purposes of the study. In addition, there are important methodological differences which will affect the value assessed. For instance the ultimate value of a weather/climate forecast to a decision-maker will not only depend on forecast accuracy but also on other factors, such as how the forecast is communicated to and consequently interpreted by the end-user. Thus, excluding these additional factors may result in inaccurate socio-economic value estimates. In order to reduce the inaccuracies in this valuation process we propose an approach that assesses how the initial weather/climate forecast information can be incorporated within the value chain of a given sector, taking into account value gains and losses at each stage of the delivery process. By this we aim to more accurately depict the socio-economic benefits of a weather/climate forecast to decision-makers.
Socioeconomic Impact Assessment: Communications Industry. Phase III. Technology Forecast.
1979-02-02
Some add-on devices, such as automatic answering systems, have already penetrated the home market substantially. In the future, however, major changes ...equipment. This class includes garage door openers, wireless microphones, cordless telephones, and radio and TV receivers. These devices can... 1.2.2 Switching Devices. The first automatic switching devices which began to replace operator-switched telephone traffic in the early
Wind-Friendly Flexible Ramping Product Design in Multi-Timescale Power System Operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Mingjian; Zhang, Jie; Wu, Hongyu
With increasing wind power penetration in the electricity grid, system operators are recognizing the need for additional flexibility, and some are implementing new ramping products as a type of ancillary service. However, wind is generally thought of as causing the need for ramping services, not as being a potential source for the service. In this paper, a multi-timescale unit commitment and economic dispatch model is developed to consider the wind power ramping product (WPRP). An optimized swinging door algorithm with dynamic programming is applied to identify and forecast wind power ramps (WPRs). Designed around the positive characteristics of WPRs, the WPRP is then integrated into the multi-timescale dispatch model, which considers new objective functions, ramping capacity limits, active power limits, and flexible ramping requirements. Numerical simulations on the modified IEEE 118-bus system show the potential effectiveness of WPRP in increasing the economic efficiency of power system operations with high levels of wind power penetration. It is found that WPRP not only reduces the production cost by using fewer ramping reserves scheduled by conventional generators, but also possibly enhances the reliability of power system operations. Moreover, wind power forecasts play an important role in providing high-quality WPRP service.
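The swinging door algorithm mentioned above is a piecewise-linear compression scheme: a segment is extended as long as all points fit within a tolerance band ("doors") around the chord from the segment start, and ramps can then be flagged from segment slopes. A simplified sketch, not the paper's optimized dynamic-programming variant; the tolerance and ramp-rate threshold below are illustrative:

```python
def swinging_door_segments(y, epsilon):
    """Simplified swinging door: split y into piecewise-linear segments
    whose points stay within +/- epsilon of the chord from the start."""
    segs, start = [], 0
    up, lo = float("inf"), float("-inf")
    for i in range(1, len(y)):
        dt = i - start
        up = min(up, (y[i] + epsilon - y[start]) / dt)   # upper door slope
        lo = max(lo, (y[i] - epsilon - y[start]) / dt)   # lower door slope
        if lo > up:  # the two doors crossed: close the segment
            segs.append((start, i - 1))
            start = i - 1
            up = (y[i] + epsilon - y[start]) / (i - start)
            lo = (y[i] - epsilon - y[start]) / (i - start)
    segs.append((start, len(y) - 1))
    return segs

def find_ramps(y, epsilon, min_rate):
    """Flag segments whose average rate of change exceeds min_rate."""
    return [(s, e) for s, e in swinging_door_segments(y, epsilon)
            if e > s and abs(y[e] - y[s]) / (e - s) >= min_rate]
```

On a flat-rise-flat power trace the middle segment is the detected ramp; in the paper these identified ramps feed the WPRP terms in the dispatch model.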
A Progressive Damage Methodology for Residual Strength Predictions of Notched Composite Panels
NASA Technical Reports Server (NTRS)
Coats, Timothy W.; Harris, Charles E.
1998-01-01
The translaminate fracture behavior of carbon/epoxy structural laminates with through-penetration notches was investigated to develop a residual strength prediction methodology for composite structures. An experimental characterization of several composite materials systems revealed a fracture resistance behavior that was very similar to the R-curve behavior exhibited by ductile metals. Fractographic examinations led to the postulate that the damage growth resistance was primarily due to fractured fibers in the principal load-carrying plies being bridged by intact fibers of the adjacent plies. The load transfer associated with this bridging mechanism suggests that a progressive damage analysis methodology will be appropriate for predicting the residual strength of laminates with through-penetration notches. A progressive damage methodology developed by the authors was used to predict the initiation and growth of matrix cracks and fiber fracture. Most of the residual strength predictions for different panel widths, notch lengths, and material systems were within about 10% of the experimental failure loads.
Development of a Neural Network-Based Renewable Energy Forecasting Framework for Process Industries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Soobin; Ryu, Jun-Hyung; Hodge, Bri-Mathias
2016-06-25
This paper presents a neural network-based forecasting framework for photovoltaic power (PV) generation as a decision-supporting tool to employ renewable energies in the process industry. The applicability of the proposed framework is illustrated by comparing its performance against other methodologies such as linear and nonlinear time series modelling approaches. A case study of an actual PV power plant in South Korea is presented.
QPF verification using different radar-based analyses: a case study
NASA Astrophysics Data System (ADS)
Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.
2009-09-01
Verification of QPF in NWP models has always been challenging, not only for knowing which scores best quantify a particular skill of a model but also for choosing the most appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can provide conclusions that are not in agreement with those obtained by the "eyeball" method. Consequently, QPF can provide valuable information to forecasters in spite of having poor scores. On the other hand, there are difficulties in knowing the "truth", so different results can be achieved depending on the procedures used to obtain the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain a better knowledge of the skills of a forecasting system. In particular, a short-range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied in a local convective precipitation event that took place in the NE Iberian Peninsula on October 3rd 2008. For this purpose, a variety of verification methods (dichotomous, recalibration and object-oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.
Forecasting influenza in Hong Kong with Google search queries and statistical model fusion.
Xu, Qinneng; Gel, Yulia R; Ramirez Ramirez, L Leticia; Nezafati, Kusha; Zhang, Qingpeng; Tsui, Kwok-Leung
2017-01-01
The objective of this study is to investigate the predictive utility of online social media and web search queries, particularly Google search data, to forecast new cases of influenza-like illness (ILI) in general outpatient clinics (GOPC) in Hong Kong. To mitigate the impact of sensitivity to self-excitement (i.e., fickle media interest) and other artifacts of online social media data, in our approach we fuse multiple offline and online data sources. Four individual models: generalized linear model (GLM), least absolute shrinkage and selection operator (LASSO), autoregressive integrated moving average (ARIMA), and deep learning (DL) with feedforward neural networks (FNN) are employed to forecast ILI-GOPC both one week and two weeks in advance. The covariates include Google search queries, meteorological data, and previously recorded offline ILI. To our knowledge, this is the first study that introduces deep learning methodology into surveillance of infectious diseases and investigates its predictive utility. Furthermore, to exploit the strength of each individual forecasting model, we use statistical model fusion via Bayesian model averaging (BMA), which allows a systematic integration of multiple forecast scenarios. For each model, an adaptive approach is used to capture the recent relationship between ILI and covariates. DL with FNN appears to deliver the most competitive predictive performance among the four considered individual models. Combining all four models in a comprehensive BMA framework allows further improvement of such predictive evaluation metrics as root mean squared error (RMSE) and mean absolute predictive error (MAPE). Nevertheless, DL with FNN remains the preferred method for predicting the locations of influenza peaks. The proposed approach can be viewed as a feasible alternative for forecasting ILI in Hong Kong or other countries where ILI has no constant seasonal trend and influenza data resources are limited. The proposed methodology is easily tractable and computationally efficient.
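The BMA fusion step can be sketched as likelihood-based weighting: each model's weight is proportional to the Gaussian likelihood of its recent one-step residuals, and the combined forecast is the weighted average. This is a simplified illustration (equal model priors, a fixed error standard deviation), not the paper's exact BMA implementation:

```python
import math

def bma_weights(residuals_by_model, sigma=1.0):
    """Approximate BMA weights: score each model by the Gaussian
    log-likelihood of its recent one-step forecast residuals
    (equal model priors, fixed error s.d. sigma), then normalize."""
    logl = {m: sum(-0.5 * (r / sigma) ** 2 for r in res)
            for m, res in residuals_by_model.items()}
    mx = max(logl.values())                      # subtract max for stability
    w = {m: math.exp(v - mx) for m, v in logl.items()}
    z = sum(w.values())
    return {m: v / z for m, v in w.items()}

def bma_forecast(forecasts, weights):
    """Combined forecast: weighted average of the individual forecasts."""
    return sum(weights[m] * f for m, f in forecasts.items())
```

A model with smaller recent residuals (here hypothetically the DL model) receives the larger weight, mirroring the abstract's finding that fusion leans on the best-performing component while still hedging across the others.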
Why didn't Box-Jenkins win (again)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pack, D.J.; Downing, D.J.
This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time series length, (2) time-series information (autocorrelation) content, (3) time-series outliers or structural changes, (4) averaging results over time series, and (5) forecast time origin choice. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time series extrapolations will frequently fail regardless of the methodology employed to produce them.
A New Objective Technique for Verifying Mesoscale Numerical Weather Prediction Models
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Manobianco, John; Lane, John E.; Immer, Christopher D.
2003-01-01
This report presents a new objective technique to verify predictions of the sea-breeze phenomenon over east-central Florida by the Regional Atmospheric Modeling System (RAMS) mesoscale numerical weather prediction (NWP) model. The Contour Error Map (CEM) technique identifies sea-breeze transition times in objectively analyzed grids of observed and forecast wind, verifies the forecast sea-breeze transition times against the observed times, and computes the mean post-sea-breeze wind direction and speed to compare the observed and forecast winds behind the sea-breeze front. The CEM technique is superior to traditional objective verification techniques and previously used subjective verification methodologies because it is automated, requiring little manual intervention; it accounts for both spatial and temporal scales and variations; it accurately identifies and verifies the sea-breeze transition times; and it provides verification contour maps and simple statistical parameters for easy interpretation. The CEM uses a parallel lowpass boxcar filter and a high-order bandpass filter to identify the sea-breeze transition times in the observed and model grid points. Once the transition times are identified, the CEM fits a Gaussian histogram function to the actual histogram of transition time differences between the model and observations. The fitted parameters of the Gaussian function subsequently explain the timing bias and variance of the timing differences across the valid comparison domain. Once the transition times are all identified at each grid point, the CEM computes the mean wind direction and speed during the remainder of the day for all times and grid points after the sea-breeze transition time. The CEM technique performed quite well when compared to independent meteorological assessments of the sea-breeze transition times and results from a previously published subjective evaluation.
The algorithm correctly identified a forecast or observed sea-breeze occurrence or absence 93% of the time during the two-month evaluation period from July and August 2000. Nearly all failures in CEM were the result of complex precipitation features (observed or forecast) that contaminated the wind field, resulting in a false identification of a sea-breeze transition. A qualitative comparison between the CEM timing errors and the subjectively determined observed and forecast transition times indicates that the algorithm performed very well overall. Most discrepancies between the CEM results and the subjective analysis were again caused by observed or forecast areas of precipitation that led to complex wind patterns. The CEM also failed on a day when the observed sea-breeze transition affected only a very small portion of the verification domain. Based on the results of CEM, the RAMS tended to predict the onset and movement of the sea-breeze transition too early and/or too quickly. The domain-wide timing biases provided by CEM indicated an early bias on 30 out of 37 days when both an observed and forecast sea breeze occurred over the same portions of the analysis domain. These results are consistent with previous subjective verifications of the RAMS sea-breeze predictions. A comparison of the mean post-sea-breeze winds indicates that RAMS has a positive wind-speed bias for all days, which is also consistent with the early bias in the sea-breeze transition time, since the higher wind speeds resulted in a faster inland penetration of the sea breeze compared to reality.
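When the error distribution is modeled as Gaussian, fitting that Gaussian to the model-minus-observed transition times reduces (for a moment fit) to their sample mean, which is the timing bias, and standard deviation, which is the spread. A minimal sketch of that step, not the CEM code itself; the times in the usage example are invented:

```python
def timing_bias_and_spread(model_times, observed_times):
    """Moment fit of a Gaussian to transition-time differences:
    returns (bias, spread), where bias = mean(model - observed),
    spread = population standard deviation of the differences.
    A negative bias means the model transitions too early."""
    diffs = [m - o for m, o in zip(model_times, observed_times)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / n
    return mean, var ** 0.5
```

For instance, a model that places every transition one hour before the observation yields a bias of -1.0 and a spread of 0.0, matching the early bias the CEM diagnosed for RAMS.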
Space station integrated wall design and penetration damage control
NASA Technical Reports Server (NTRS)
Coronado, A. R.; Gibbins, M. N.; Wright, M. A.; Stern, P. H.
1987-01-01
A methodology was developed to allow a designer to optimize the pressure wall, insulation, and meteoroid/debris shield system of a manned spacecraft for a given spacecraft configuration and threat environment. The threat environment consists of meteoroids and orbital debris, as specified for an arbitrary orbit and expected lifetime. An overall probability of no penetration is calculated, as well as contours of equal threat that take into account spacecraft geometry and orientation. Techniques, tools, and procedures for repairing an impacted and penetrated pressure wall were developed and tested. These techniques are applied from the spacecraft interior and account for the possibility of performing the repair in a vacuum. Hypervelocity impact testing was conducted to: (1) develop and refine appropriate penetration functions, and (2) determine the internal effects of a penetration on personnel and equipment.
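The overall probability of no penetration in such meteoroid/debris analyses is conventionally computed from a Poisson model of impact counts: with an expected number of penetrating impacts N over the exposure, the probability of zero penetrations is exp(-N). A minimal sketch under that assumption; the flux, area, and duration in the usage example are illustrative, not values from the study:

```python
import math

def probability_of_no_penetration(flux, area_m2, years):
    """Poisson impact model: with expected penetration count
    N = flux * area * time, the probability of zero penetrations
    is PNP = exp(-N). `flux` is the flux of particles capable of
    penetrating the shield/wall system, in impacts per m^2 per year."""
    expected_hits = flux * area_m2 * years
    return math.exp(-expected_hits)
```

For example, a penetrating flux of 1e-6 impacts/m^2/yr over 100 m^2 for 10 years gives N = 1e-3 and a PNP of about 0.999; the design trade is to size the shielding so this number meets the mission requirement.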
Identified EM Earthquake Precursors
NASA Astrophysics Data System (ADS)
Jones, Kenneth, II; Saxton, Patrick
2014-05-01
Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance making it a high quality insulator. Penetrative flow could not be corroborated as well, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February, 2013. 
The antennae are mobile, and observations were noted for recurrence, duration, and frequency response. At the Southern California field sites, one loop antenna was positioned for omni-directional reception and also detected a strong First Schumann Resonance; however, additional Schumann Resonances were absent. At the Timpson, TX field sites, loop antennae were positioned for directional reception, due to seismicity induced by hydraulic fracturing activity currently conducted by the oil and gas industry. Two strong signals, one moderately strong signal, and approximately 6-8 weaker signals were detected in the immediate vicinity. The three stronger signals were mapped by a biangulation technique, followed by a triangulation technique for confirmation. This was the first antenna mapping technique ever performed for determining possible earthquake epicenters. Six and a half months later, Timpson experienced two M4 (M4.1 and M4.3) earthquakes on September 2, 2013, followed by a M2.4 earthquake three days later, all occurring at a depth of five kilometers. The Timpson earthquake activity now has a cyclical rate, and a forecast was given to the proper authorities. As a result, the Southern California and Timpson, TX field results led to an improved design and construction of a third prototype antenna. With a loop antenna array, a viable communication system, and continuous monitoring, a full fracture cycle can be established and observed in real time. In addition, field data could be reviewed quickly for assessment, leading to a much improved earthquake forecasting capability. The EM precursors determined by this method appear to surpass all prior precursor claims, and the general public will finally receive long-overdue forecasting.
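The bearing-intersection ("biangulation") idea behind the antenna mapping can be sketched numerically. The abstract gives no coordinates or angle conventions, so everything below is hypothetical: two stations, bearings measured counterclockwise from east, and a plain two-line intersection solve.

```python
import numpy as np

def intersect_bearings(p1, theta1, p2, theta2):
    """Estimate a source location from two station positions and bearings.

    p1, p2: (x, y) station coordinates in km; theta1, theta2: bearing angles
    in degrees, counterclockwise from east (an assumed convention -- the
    abstract does not specify the one used in the field).
    """
    d1 = np.array([np.cos(np.radians(theta1)), np.sin(np.radians(theta1))])
    d2 = np.array([np.cos(np.radians(theta2)), np.sin(np.radians(theta2))])
    # Solve p1 + t*d1 = p2 + s*d2 for the scalars t and s.
    A = np.column_stack([d1, -d2])
    t, s = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t * d1

# Two hypothetical stations 10 km apart, both detecting the same signal.
epicenter = intersect_bearings((0.0, 0.0), 45.0, (10.0, 0.0), 135.0)
```

A third station's bearing could be intersected pairwise with the others, which is presumably the role of the triangulation confirmation step.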
NASA Astrophysics Data System (ADS)
Pinson, Pierre
2016-04-01
The operational management of renewable energy generation in power systems and electricity markets requires forecasts in various forms, e.g., deterministic or probabilistic, continuous or categorical, depending upon the decision process at hand. Besides, such forecasts may also be necessary at various spatial and temporal scales, from high temporal resolutions (in the order of minutes) and very localized for an offshore wind farm, to coarser temporal resolutions (hours) and covering a whole country for day-ahead power scheduling problems. As of today, weather predictions are a common input to forecasting methodologies for renewable energy generation. Since for most decision processes, optimal decisions can only be made if accounting for forecast uncertainties, ensemble predictions and density forecasts are increasingly seen as the product of choice. After discussing some of the basic approaches to obtaining ensemble forecasts of renewable power generation, it will be argued that space-time trajectories of renewable power production may or may not necessitate post-processing of ensemble forecasts for relevant weather variables. Example approaches and test case applications will be covered, e.g., looking at the Horns Rev offshore wind farm in Denmark, or gridded forecasts for the whole of continental Europe. Eventually, we will illustrate some of the limitations of current frameworks for forecast verification, which actually make it difficult to fully assess the quality of post-processing approaches used to obtain renewable energy predictions.
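As a rough illustration of why ensembles are "the product of choice", the sketch below converts a synthetic ensemble of wind-power forecasts into quantile (density) forecasts and scores them with the pinball loss, a standard quantile-forecast verification score. All data are invented; this is neither Horns Rev data nor the author's methodology.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ensemble: 50 members of hourly wind-power forecasts for 24 hours,
# in per-unit of installed capacity (purely illustrative numbers).
ensemble = np.clip(rng.normal(0.6, 0.15, size=(50, 24)), 0.0, 1.0)

# A density forecast can be summarized by ensemble quantiles per lead time.
levels = [0.1, 0.5, 0.9]
quantiles = np.quantile(ensemble, levels, axis=0)   # shape (3, 24)

def pinball_loss(q_forecast, tau, observed):
    """Average pinball (quantile) loss of a tau-quantile forecast."""
    diff = observed - q_forecast
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

# Score each quantile forecast against synthetic observations.
observed = np.clip(rng.normal(0.6, 0.15, size=24), 0.0, 1.0)
scores = [pinball_loss(quantiles[i], tau, observed)
          for i, tau in enumerate(levels)]
```

Lower pinball scores indicate sharper, better-calibrated quantile forecasts, which is one of the verification ideas the talk's last point touches on.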
NASA Astrophysics Data System (ADS)
Matos, José P.; Schaefli, Bettina; Schleiss, Anton J.
2017-04-01
Uncertainty affects hydrological modelling efforts from the very measurements (or forecasts) that serve as inputs to the more or less inaccurate predictions that are produced. Uncertainty is truly inescapable in hydrology and yet, due to the theoretical and technical hurdles associated with its quantification, it is at times still neglected or estimated only qualitatively. In recent years the scientific community has made a significant effort towards quantifying this hydrologic prediction uncertainty. Despite this, most of the developed methodologies can be computationally demanding, are complex from a theoretical point of view, require substantial expertise to be employed, and are constrained by a number of assumptions about the model error distribution. These assumptions limit the reliability of many methods in the case of errors that exhibit non-normality, heteroscedasticity, or autocorrelation. The present contribution builds on a non-parametric data-driven approach that was developed for uncertainty quantification in operational (real-time) forecasting settings. The approach is based on the concept of Pareto optimality and can be used as a standalone forecasting tool or as a postprocessor. By virtue of its non-parametric nature and general operating principle, it can be applied directly and with ease to predictions of streamflow, water stage, or even accumulated runoff. Also, it is a methodology capable of coping with high heteroscedasticity and seasonal hydrological regimes (e.g. snowmelt- and rainfall-driven events in the same catchment). Finally, the training and operation of the model are very fast, making it a tool particularly adapted to operational use. To illustrate its practical use, the uncertainty quantification method is coupled with a process-based hydrological model to produce statistically reliable forecasts for an Alpine catchment located in Switzerland. 
Results are presented and discussed in terms of their reliability and resolution.
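The Pareto-optimality machinery itself is not detailed in the abstract; the sketch below instead shows a much simpler postprocessor in the same non-parametric, data-driven spirit: predictive intervals built from empirical quantiles of past forecast errors, binned by forecast magnitude so that heteroscedasticity (error spread growing with flow) is captured without any distributional assumption. Data, bin counts, and quantile levels are all invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic archive of past point forecasts and observations with
# heteroscedastic errors: the spread grows with flow magnitude.
past_fc = rng.uniform(10.0, 200.0, 5000)               # m3/s
past_obs = past_fc + rng.normal(0.0, 0.1 * past_fc)    # error scales with flow

def error_band(new_fc, past_fc, past_obs, lo=0.05, hi=0.95, n_bins=5):
    """Non-parametric predictive interval for a new point forecast, taken
    from empirical error quantiles within the forecast-magnitude bin."""
    edges = np.quantile(past_fc, np.linspace(0.0, 1.0, n_bins + 1))
    errors = past_obs - past_fc
    k = np.clip(np.searchsorted(edges, new_fc, side="right") - 1, 0, n_bins - 1)
    in_bin = (past_fc >= edges[k]) & (past_fc <= edges[k + 1])
    q_lo, q_hi = np.quantile(errors[in_bin], [lo, hi])
    return new_fc + q_lo, new_fc + q_hi

low_band = error_band(20.0, past_fc, past_obs)    # low-flow forecast
high_band = error_band(180.0, past_fc, past_obs)  # high-flow forecast
```

Because the intervals come straight from the archived errors, the band around the high-flow forecast is automatically wider than the one around the low-flow forecast, which is the behavior a parametric homoscedastic model would miss.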
PAI-OFF: A new proposal for online flood forecasting in flash flood prone catchments
NASA Astrophysics Data System (ADS)
Schmitz, G. H.; Cullmann, J.
2008-10-01
The Process Modelling and Artificial Intelligence for Online Flood Forecasting (PAI-OFF) methodology combines the reliability of physically based, hydrologic/hydraulic modelling with the operational advantages of artificial intelligence: extremely low computation times and straightforward operation. The basic principle of the methodology is to portray process models by means of artificial neural networks (ANNs). We propose to train ANN flood forecasting models with synthetic data that reflect the possible range of storm events. To this end, establishing PAI-OFF requires first setting up a physically based hydrologic model of the considered catchment and - optionally, if backwater effects have a significant impact on the flow regime - a hydrodynamic flood routing model of the river reach in question. Both models are subsequently used for simulating all meaningful and flood-relevant storm scenarios, which are obtained from a catchment-specific meteorological data analysis. This provides a database of corresponding input/output vectors, which is then completed by generally available hydrological and meteorological data characterizing the catchment state prior to each storm event. This database subsequently serves for training both a polynomial neural network (PoNN), portraying the rainfall-runoff process, and a multilayer feedforward network (MLFN), which mirrors the hydrodynamic flood wave propagation in the river. These two ANN models replace the hydrological and hydrodynamic models in the operational mode. After presenting the theory, we apply PAI-OFF - essentially consisting of the coupled "hydrologic" PoNN and "hydrodynamic" MLFN - to the Freiberger Mulde catchment in the Erzgebirge (Ore Mountains) in East Germany (3000 km²). Both the demonstrated computational efficiency and the prediction reliability underline the potential of the new PAI-OFF methodology for online flood forecasting.
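The core PAI-OFF idea, replacing a slow process model with a fast data-driven portrait trained on synthetic scenarios, can be caricatured in a few lines. Here the "process model" is an invented polynomial runoff function and the surrogate is plain least-squares polynomial regression rather than a PoNN/MLFN; only the offline-simulate/fit, online-evaluate workflow is faithful to the methodology.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "process model": runoff as a function of rainfall depth r (mm) and
# antecedent soil moisture s (fraction). Coefficients are purely invented.
def process_model(r, s):
    return 0.003 * r**2 + 0.5 * r * s + 2.0 * s

# Offline: run the (notionally slow) model over the range of storm scenarios.
r = rng.uniform(0.0, 100.0, 2000)
s = rng.uniform(0.0, 1.0, 2000)
q = process_model(r, s)

# Fit a degree-2 polynomial surrogate -- the "portrait" of the model.
X = np.column_stack([np.ones_like(r), r, s, r**2, r * s, s**2])
coef, *_ = np.linalg.lstsq(X, q, rcond=None)

# Online: evaluate the cheap surrogate instead of the process model.
def surrogate(r, s, coef=coef):
    return coef @ np.array([1.0, r, s, r**2, r * s, s**2])
```

In the real system the training database also carries pre-event catchment-state variables, and the surrogate is an ANN, but the division of labor between an offline simulation phase and a near-instantaneous online phase is the same.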
John Butnor; Brian Roth; Kurt Johnsen
2005-01-01
Tree root systems are commonly evaluated via labor-intensive, destructive, time-consuming excavations. Ground-penetrating radar (GPR) can be used to detect and monitor roots if there is sufficient electromagnetic contrast with the surrounding soil matrix. This methodology is commonly used in civil engineering for non-destructive testing of concrete as well as road and...
1988-03-01
remaining penetrators. The overall effectiveness of the model is measured by the total value extracted by a given number of penetrators that have values V1, V2, V3, ..., Vk and Vb1, Vb2, Vb3, ..., Vbk respectively (see Fig. 5.1).
Communicating uncertainty in hydrological forecasts: mission impossible?
NASA Astrophysics Data System (ADS)
Ramos, Maria-Helena; Mathevet, Thibault; Thielen, Jutta; Pappenberger, Florian
2010-05-01
Cascading uncertainty in meteo-hydrological modelling chains for forecasting and integrated flood risk assessment is an essential step to improve the quality of hydrological forecasts. Although the best methodology to quantify the total predictive uncertainty in hydrology is still debated, there is common agreement that one must avoid uncertainty misrepresentation and miscommunication, as well as misinterpretation of information by users. Several recent studies point out that uncertainty, when properly explained and defined, is no longer unwelcome among emergency response organizations, users of flood risk information, and the general public. However, efficient communication of uncertain hydro-meteorological forecasts is far from being a resolved issue. This study focuses on the interpretation and communication of uncertain hydrological forecasts based on (uncertain) meteorological forecasts and (uncertain) rainfall-runoff modelling approaches to decision-makers such as operational hydrologists and water managers in charge of flood warning and scenario-based reservoir operation. An overview of the typical flow of uncertainties and risk-based decisions in hydrological forecasting systems is presented. The challenges related to the extraction of meaningful information from probabilistic forecasts and the test of its usefulness in assisting operational flood forecasting are illustrated with the help of two case-studies: 1) a study on the use and communication of probabilistic flood forecasting within the European Flood Alert System; 2) a case-study on the use of probabilistic forecasts by operational forecasters from the hydroelectricity company EDF in France. These examples show that attention must be paid to initiatives that promote or reinforce the active participation of expert forecasters in the forecasting chain. 
The practice of face-to-face forecast briefings, focusing on sharing how forecasters interpret, describe and perceive the model output forecasted scenarios, is essential. We believe that the efficient communication of uncertainty in hydro-meteorological forecasts is not a mission impossible. Questions remaining unanswered in probabilistic hydrological forecasting should not neutralize the goal of such a mission, and the suspense kept should instead act as a catalyst for overcoming the remaining challenges.
Financial forecasts accuracy in Brazil's social security system.
Silva, Carlos Patrick Alves da; Puty, Claudio Alberto Castelo Branco; Silva, Marcelino Silva da; Carvalho, Solon Venâncio de; Francês, Carlos Renato Lisboa
2017-01-01
Long-term social security statistical forecasts produced and disseminated by the Brazilian government aim to provide accurate results that would serve as background information for optimal policy decisions. These forecasts are being used as support for the government's proposed pension reform that plans to radically change the Brazilian Constitution insofar as Social Security is concerned. However, the reliability of official results is uncertain since no systematic evaluation of these forecasts has ever been published by the Brazilian government or anyone else. This paper aims to present a study of the accuracy and methodology of the instruments used by the Brazilian government to carry out long-term actuarial forecasts. We base our research on an empirical and probabilistic analysis of the official models. Our empirical analysis shows that the long-term Social Security forecasts are systematically biased in the short term and have significant errors that render them meaningless in the long run. Moreover, the low level of transparency in the methods impairs replication of the results published by the Brazilian government, and the use of outdated data compromises forecast results. In the theoretical analysis, based on a mathematical modeling approach, we discuss the complexity and limitations of macroeconomic forecasting through the computation of confidence intervals. We demonstrate the problems related to error measurement inherent to any forecasting process. We then extend this exercise to the computation of confidence intervals for Social Security forecasts. This mathematical exercise raises questions about the degree of reliability of the Social Security forecasts.
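To make the confidence-interval exercise concrete, here is a minimal sketch under a deliberately crude assumption: one-step forecast errors are independent with standard deviation σ, so the h-step standard error grows like σ√h and the band widens with the horizon. The official actuarial models are far richer than this; the units and numbers below are hypothetical.

```python
import math

def forecast_ci(point_forecast, sigma_1step, horizon, z=1.96):
    """95% confidence band for an h-step-ahead forecast, assuming
    independent one-step errors (illustrative random-walk assumption)."""
    half_width = z * sigma_1step * math.sqrt(horizon)
    return point_forecast - half_width, point_forecast + half_width

# Hypothetical deficit forecast with a one-year error std of 0.4 (% of GDP):
lo1, hi1 = forecast_ci(10.0, 0.4, horizon=1)
lo30, hi30 = forecast_ci(10.0, 0.4, horizon=30)
```

Even under this optimistic error model the 30-year band is √30 ≈ 5.5 times wider than the 1-year band, which is the intuition behind the paper's point that long-run point forecasts without interval estimates convey false precision.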
An experimental study of summertime coastal fog and its inland penetration in Northern California
NASA Astrophysics Data System (ADS)
Lucena Kreppel Paes, P.; Torres, P.; Faloona, I. C.; Torregrosa, A.; Gultepe, I.
2012-12-01
The occurrence and continental inundation of marine stratocumulus and fog along the California Coast during summer has been linked to many environmental concerns including redwood ecosystem vitality, air traffic control, power grid load balancing, and radiative climate forcing. An exploratory study was initiated this past summer at the Bodega Marine Laboratory and Pepperwood Preserve, a large nature reserve located 40 km inland in Sonoma County, in order to investigate fog formation, persistence, and penetration through the orographic gap in the Pacific coastal mountain range. Analysis of the synoptic patterns and in-situ meteorological observations, including visibility and boundary layer depth, is presented with the aim of improving fog forecasts and elucidating the principal physical parameters that control summertime fog formation and dissipation along the Northern California Coast.
NASA Astrophysics Data System (ADS)
Liubartseva, Svitlana; Coppini, Giovanni; Ciliberti, Stefania Angela; Lecci, Rita
2017-04-01
In operational oil spill modeling, MEDSLIK-II (De Dominicis et al., 2013) focuses on the reliability of the oil drift and fate predictions routinely fed by an operational oceanographic and atmospheric forecasting chain. Uncertainty calculations enhance oil spill forecast efficiency, supplying probability maps to quantify the propagation of various uncertainties. Recently, we have developed a methodology that allows users to evaluate the variability of the oil drift forecast caused by uncertain data on the initial oil spill conditions (Liubartseva et al., 2016). One of the key methodological aspects is a reasonable choice of the way parameters are perturbed. In the case of the starting oil spill location and time, these scalars might be treated as independent random parameters. If we want to perturb the underlying ocean currents and wind, we have to deal with deterministic vector parameters. To a first approximation, we suggest rolling forecasts as a set of perturbed ocean currents and wind. This approach does not need any extra hydrodynamic calculations, and it is quick enough to be performed in web-based applications. The capabilities of the proposed methodology are explored using the Black Sea Forecasting System (BSFS) recently implemented by Ciliberti et al. (2016) for the Copernicus Marine Environment Monitoring Service (http://marine.copernicus.eu/services-portfolio/access-to-products). BSFS horizontal resolution is 1/36° in the zonal and 1/27° in the meridional direction (ca. 3 km). Vertical domain discretization is represented by 31 unevenly spaced vertical levels. Atmospheric wind data are provided by European Centre for Medium-Range Weather Forecasts (ECMWF) forecasts, at 1/8° (ca. 12.5 km) horizontal and 6-hour temporal resolution. A great variety of probability patterns controlled by different underlying flows is represented, including the cyclonic Rim Current, flow bifurcations in anticyclonic eddies (e.g., Sevastopol and Batumi), northwestern shelf circulation, etc. 
Uncertainty imprints in the oil mass balance components are also analyzed. This work is conducted in the framework of the REACT Project funded by Fondazione CON IL SUD/Brains2South. References Ciliberti, S.A., Peneva, E., Storto, A., Kandilarov, R., Lecci, R., Yang, C., Coppini, G., Masina, S., Pinardi, N., 2016. Implementation of Black Sea numerical model based on NEMO and 3DVAR data assimilation scheme for operational forecasting, Geophys. Res. Abs., 18, EGU2016-16222. De Dominicis, M., Pinardi, N., Zodiatis, G., Lardner, R., 2013. MEDSLIK-II, a Lagrangian marine surface oil spill model for short term forecasting-Part 1: Theory, Geosci. Model Dev., 6, 1851-1869. Liubartseva, S., Coppini, G., Pinardi, N., De Dominicis, M., Lecci, R., Turrisi, G., Cretì, S., Martinelli, S., Agostini, P., Marra, P., Palermo, F., 2016. Decision support system for emergency management of oil spill accidents in the Mediterranean Sea, Nat. Hazards Earth Syst. Sci., 16, 2009-2020.
2017-06-01
importantly, it examines the methodology used to build the class IX block embarked on ship prior to deployment. The class IX block is defined as a repository...compared to historical data to evaluate model and simulation outputs. This thesis provides recommendations on improving the methodology implemented in...improving the level of organic support available to deployed units. More importantly, it examines the methodology used to build the class IX block
NASA Astrophysics Data System (ADS)
Ebardaloza, J. B. R.; Trogo, R.; Sabido, D. J.; Tongson, E.; Bagtasa, G.; Balderama, O. F.
2015-12-01
Corn farms in the Philippines are rainfed, so it is of utmost importance to choose the planting start date such that the critical growth stages that need water fall on dates when there is rain. Most farmers in the Philippines use superstitions and traditions as the basis for farming decisions such as when to start planting [1]. Before climate change, superstitions like planting after the feast day of a saint worked for them, but with the recent progression of climate change, farmers now recognize that there is a need for technological intervention [1]. The application discussed in this paper presents a solution that makes use of meteorological station sensors, a localized seasonal climate forecast, a localized weather forecast, and a crop simulation model to provide recommendations to farmers based on the crop cultivar, soil type, and fertilizer type used by farmers. It is critical that the recommendations given to farmers are not generic, as each farmer has different needs based on their cultivar, soil, fertilizer, planting schedule, and even location [2]. This application allows the farmer to inquire about whether it will rain in the next seven days, the best date to start planting based on the potential yield upon harvest, when to apply fertilizer and by how much, and when to water and by how much. Short messaging service (SMS) is the medium chosen for this application because, while mobile penetration in the Philippines is as high as 101%, smartphone penetration is only 15% [3]. SMS has been selected as it has been identified as the most effective way of reaching farmers with timely agricultural information and knowledge [4,5]. The recommendations, while derived from Automated Weather Station (AWS) sensor data, Weather Research and Forecasting (WRF) models, and DSSAT 4.5 [9], are translated into the local language of the farmers and into a format that is easily understood, as recommended in [6,7,8]. 
A pilot study has been started in May 2015 and the harvest of this pilot season will be September 2015.
The Second NWRA Flare-Forecasting Comparison Workshop: Methods Compared and Methodology
NASA Astrophysics Data System (ADS)
Leka, K. D.; Barnes, G.; the Flare Forecasting Comparison Group
2013-07-01
The Second NWRA Workshop to compare methods of solar flare forecasting was held 2-4 April 2013 in Boulder, CO. This is a follow-on to the First NWRA Workshop on Flare Forecasting Comparison, also known as the "All-Clear Forecasting Workshop", held in 2009 jointly with NASA/SRAG and NOAA/SWPC. For this most recent workshop, many researchers who are active in the field participated, and diverse methods were represented in terms of both the characterization of the Sun and the statistical approaches used to create a forecast. A standard dataset was created for this investigation, using data from the Solar Dynamics Observatory/Helioseismic and Magnetic Imager (SDO/HMI) vector magnetic field HARP series. For each HARP on each day, 6 hours of data were used, allowing for nominal time-series analysis to be included in the forecasts. We present here a summary of the forecasting methods that participated and the standardized dataset that was used. Funding for the workshop and the data analysis was provided by NASA/Living with a Star contract NNH09CE72C and NASA/Guest Investigator contract NNH12CG10C.
Geng, Xiaohua; Podlaha, Elizabeth J
2016-12-14
A new methodology is reported for shaping template-assisted electrodeposited Fe-rich Fe-Ni-Co nanowires to have a thin nanowire segment, using a coupled displacement reaction with a more noble elemental ion, Cu(II), while at the same time dealloying predominantly Fe from Fe-Ni-Co by the reduction of protons (H+), followed by a subsequent etching step. The displacement/dealloyed layer was sandwiched between two trilayers of Fe-Ni-Co to facilitate the characterization of the reaction front, or penetration length. The penetration length was found to be a function of the ratio of proton to Cu(II) concentration; a ratio of 0.5 was found to provide the largest penetration rate, and hence the longest thinned length of the nanowire. Altering the etching time affected the diameter of the thinned region. This methodology presents a new way to thin nanowire segments connected to larger nanowire sections and also introduces a way to study the propagation of a reaction front into a nanowire.
Earthquake cycles and physical modeling of the process leading up to a large earthquake
NASA Astrophysics Data System (ADS)
Ohnaka, Mitiyasu
2004-08-01
A thorough discussion is made on what the rational constitutive law for earthquake ruptures ought to be from the standpoint of the physics of rock friction and fracture on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology of forecasting large earthquakes, the entire process of one cycle for a typical, large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate- and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II), and the phase of rupture nucleation at the critical stage where an adequate amount of the elastic strain energy has been stored (phase III). Phase II plays a critical role in physical modeling of intermediate-term forecasting, and phase III in physical modeling of short-term (immediate) forecasting. The seismogenic layer and individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale-dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling, in order to construct realistic, unified scenarios with predictive capability. The scenario presented may be significant and useful as a necessary first step for establishing the methodology for forecasting large earthquakes.
Forecasting Influenza Epidemics in Hong Kong.
Yang, Wan; Cowling, Benjamin J; Lau, Eric H Y; Shaman, Jeffrey
2015-07-01
Recent advances in mathematical modeling and inference methodologies have enabled development of systems capable of forecasting seasonal influenza epidemics in temperate regions in real-time. However, in subtropical and tropical regions, influenza epidemics can occur throughout the year, making routine forecast of influenza more challenging. Here we develop and report forecast systems that are able to predict irregular non-seasonal influenza epidemics, using either the ensemble adjustment Kalman filter or a modified particle filter in conjunction with a susceptible-infected-recovered (SIR) model. We applied these model-filter systems to retrospectively forecast influenza epidemics in Hong Kong from January 1998 to December 2013, including the 2009 pandemic. The forecast systems were able to forecast both the peak timing and peak magnitude for 44 epidemics in 16 years caused by individual influenza strains (i.e., seasonal influenza A(H1N1), pandemic A(H1N1), A(H3N2), and B), as well as 19 aggregate epidemics caused by one or more of these influenza strains. Average forecast accuracies were 37% (for both peak timing and magnitude) at 1-3 week leads, and 51% (peak timing) and 50% (peak magnitude) at 0 lead. Forecast accuracy increased as the spread of a given forecast ensemble decreased; the forecast accuracy for peak timing (peak magnitude) increased up to 43% (45%) for H1N1, 93% (89%) for H3N2, and 53% (68%) for influenza B at 1-3 week leads. These findings suggest that accurate forecasts can be made at least 3 weeks in advance for subtropical and tropical regions.
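The epidemic model at the core of these forecast systems is a standard SIR; a minimal discrete-time version (without the ensemble adjustment Kalman filter or particle filter that does the actual state and parameter fitting) might look like the following, with invented parameters:

```python
def sir_simulate(beta, gamma, s0, i0, n, steps):
    """Discrete-time SIR model; returns the daily infectious counts."""
    s, i, r = float(s0), float(i0), float(n - s0 - i0)
    traj = [i]
    for _ in range(steps):
        new_inf = beta * s * i / n   # new infections this day
        new_rec = gamma * i          # new recoveries this day
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        traj.append(i)
    return traj

# Hypothetical epidemic: R0 = beta/gamma = 2 in a population of one million.
traj = sir_simulate(beta=0.5, gamma=0.25, s0=999990, i0=10, n=1_000_000,
                    steps=200)
peak_day = max(range(len(traj)), key=traj.__getitem__)
```

In the forecast systems described above, the filter repeatedly nudges the state (S, I) and parameters (β, γ) of many such simulations toward incoming surveillance data, and the adjusted ensemble is then integrated forward to predict peak timing and magnitude.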
Flight Departure Delay and Rerouting Under Uncertainty in En Route Convective Weather
NASA Technical Reports Server (NTRS)
Mukherjee, Avijit; Grabbe, Shon; Sridhar, Banavar
2011-01-01
Delays caused by uncertainty in weather forecasts can be reduced by improving traffic flow management decisions. This paper presents a methodology for traffic flow management under uncertainty in convective weather forecasts. An algorithm for assigning departure delays and reroutes to aircraft is presented. Departure delay and route assignment are executed at multiple stages, during which, updated weather forecasts and flight schedules are used. At each stage, weather forecasts up to a certain look-ahead time are treated as deterministic and flight scheduling is done to mitigate the impact of weather on four-dimensional flight trajectories. Uncertainty in weather forecasts during departure scheduling results in tactical airborne holding of flights. The amount of airborne holding depends on the accuracy of forecasts as well as the look-ahead time included in the departure scheduling. The weather forecast look-ahead time is varied systematically within the experiments performed in this paper to analyze its effect on flight delays. Based on the results, longer look-ahead times cause higher departure delays and additional flying time due to reroutes. However, the amount of airborne holding necessary to prevent weather incursions reduces when the forecast look-ahead times are higher. For the chosen day of traffic and weather, setting the look-ahead time to 90 minutes yields the lowest total delay cost.
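The multi-stage algorithm itself is not given in the abstract; as a toy illustration of the deterministic-window idea, the sketch below ground-delays any flight whose departure falls inside a forecast route closure, ignoring closures beyond the look-ahead horizon. Flight IDs, routes, and times are all hypothetical, and the real scheduler also reroutes flights and revisits decisions at later stages.

```python
def assign_departure_delays(flights, closures, look_ahead):
    """Ground-delay flights whose departure falls inside the forecast
    closure of their route. Closures starting beyond the look-ahead
    horizon are treated as too uncertain and ignored, mirroring the
    deterministic forecast window described above.

    flights:  list of (flight_id, departure_time, route)
    closures: dict mapping route -> (start, end) of forecast blockage
    All times in minutes from now. Toy, single-stage version only.
    """
    delays = {}
    for fid, dep, route in flights:
        delay = 0
        if route in closures:
            start, end = closures[route]
            if start <= dep < end and start <= look_ahead:
                delay = end - dep   # hold on the ground until the route opens
        delays[fid] = delay
    return delays

flights = [("AA1", 30, "J80"), ("AA2", 100, "J80"), ("AA3", 40, "J110")]
closures = {"J80": (20, 60)}        # route J80 forecast blocked, t = 20..60
delays = assign_departure_delays(flights, closures, look_ahead=90)
```

Shortening `look_ahead` below 20 would leave AA1 unprotected on the ground, turning its weather encounter into airborne holding instead, which is the delay trade-off the paper quantifies.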
An experimental system for flood risk forecasting and monitoring at global scale
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Alfieri, Lorenzo; Kalas, Milan; Lorini, Valerio; Salamon, Peter
2017-04-01
Global flood forecasting and monitoring systems are nowadays a reality and are being applied by a wide range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk-based forecasting, combining streamflow estimations with expected inundated areas and flood impacts. Finally, emerging technologies such as crowdsourcing and social media monitoring can play a crucial role in flood disaster management and preparedness. Here, we present some recent advances of an experimental procedure for near-real-time flood mapping and impact assessment. The procedure translates in near real time the daily streamflow forecasts issued by the Global Flood Awareness System (GloFAS) into event-based flood hazard maps, which are then combined with exposure and vulnerability information at global scale to derive risk forecasts. Impacts of the forecasted flood events are evaluated in terms of flood-prone areas, potential economic damage, and affected population, infrastructures, and cities. To increase the reliability of our forecasts we propose the integration of model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification and correction of impact forecasts. Finally, we present the results of preliminary tests which show the potential of the proposed procedure in supporting emergency response and management.
NASA Astrophysics Data System (ADS)
Sardinha-Lourenço, A.; Andrade-Campos, A.; Antunes, A.; Oliveira, M. S.
2018-03-01
Recent research on short-term water demand forecasting has shown that models using univariate time series based on historical data are useful and can be combined with other prediction methods to reduce errors. Water demand in drinking water distribution networks is largely repetitive in nature, which, under similar meteorological conditions and consumer profiles, allows the development of a heuristic forecast model that, combined with other autoregressive models, can provide reliable forecasts. In this study, a parallel adaptive weighting strategy for forecasting water consumption over the next 24-48 h, using univariate time series of potable water consumption, is proposed. Two Portuguese potable water distribution networks are used as case studies, where the only input data are water consumption and the national calendar. The strategy combines the Autoregressive Integrated Moving Average (ARIMA) method with a short-term forecast heuristic algorithm. Simulations with the model showed that the parallel adaptive weighting strategy can reduce the prediction error by 15.96% and the average error by 9.20%. This reduction is important in the control and management of water supply systems. The proposed methodology can be extended to other forecast methods, especially when multiple forecast models are available.
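A parallel adaptive weighting of two forecasters, such as ARIMA plus a calendar heuristic, is often implemented by weighting each model inversely to its recent error. A minimal sketch of that idea (the inverse-MAE weighting rule is an assumption for illustration, not the authors' exact scheme):

```python
def adaptive_weights(errors_a, errors_b, eps=1e-9):
    """Weights for two competing forecasters, inversely proportional to
    their mean absolute error over a recent window."""
    mae_a = sum(abs(e) for e in errors_a) / len(errors_a)
    mae_b = sum(abs(e) for e in errors_b) / len(errors_b)
    wa = 1.0 / (mae_a + eps)   # eps guards against a zero-error window
    wb = 1.0 / (mae_b + eps)
    total = wa + wb
    return wa / total, wb / total

def combine(forecast_a, forecast_b, wa, wb):
    # convex combination of the two forecast paths
    return [wa * a + wb * b for a, b in zip(forecast_a, forecast_b)]
```

Recomputing the weights at every forecast origin is what makes the combination "adaptive": the better-performing model in the recent past dominates the next 24-48 h forecast.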
Forecast Based Financing for Managing Weather and Climate Risks to Reduce Potential Disaster Impacts
NASA Astrophysics Data System (ADS)
Arrighi, J.
2017-12-01
There is a critical window of time in which to reduce the potential impacts of a disaster: after a forecast of heightened risk is issued and before the extreme event occurs. The concept of Forecast-based Financing focuses on this window of opportunity. Through advanced preparation during system set-up, tailored methodologies are used to 1) analyze a range of potential extreme event forecasts, 2) identify emergency preparedness measures that can be taken given the forecast lead time and its inherent uncertainty, and 3) develop standard operating procedures that are agreed on and tied to guaranteed funding sources, so that emergency measures led by the Red Cross or government actors can proceed when preparedness measures are triggered. This presentation gives a broad overview of the current state of theory and approaches used in developing forecast-based financing systems, with a specific focus on hydrologic events; case studies of successes and challenges in the various contexts where this approach is being piloted; and what remains to be explored and developed from a research perspective as the application of this approach continues to expand.
Does money matter in inflation forecasting?
NASA Astrophysics Data System (ADS)
Binner, J. M.; Tino, P.; Tepper, J.; Anderson, R.; Jones, B.; Kendall, G.
2010-11-01
This paper provides the most fully comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely, recurrent neural networks and kernel recursive least squares regression-techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best fitting US inflation forecasting models and are then compared to forecasts from a naïve random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists’ long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies.
High-Density Liquid-State Machine Circuitry for Time-Series Forecasting.
Rosselló, Josep L; Alomar, Miquel L; Morro, Antoni; Oliver, Antoni; Canals, Vincent
2016-08-01
Spiking neural networks (SNN) are the latest generation of neural networks, which try to mimic the real behavior of biological neurons. Although most research in this area is done through software applications, it is in hardware implementations that the intrinsic parallelism of these computing systems is exploited most efficiently. Liquid state machines (LSM) have arisen as a strategic technique for implementing recurrent SNN designs with a simple learning methodology. In this work, we show a new low-cost methodology for implementing high-density LSM using Boolean gates. The proposed method is based on the use of probabilistic computing concepts to reduce hardware requirements, thus considerably increasing the neuron count per chip. The result is a highly functional system that is applied to high-speed time series forecasting.
ERIC Educational Resources Information Center
Children's Television Workshop, New York, NY.
The findings of a 1973 study covering the performance of Sesame Street and The Electric Company in ghetto communities are reported briefly. The steps taken to repeat the methodology of earlier Sesame Street studies are described. Data are given on: penetration of Sesame Street among preschool children in Bedford Stuyvesant, East Harlem, Chicago,…
NASA Astrophysics Data System (ADS)
Clark, E.; Wood, A.; Nijssen, B.; Clark, M. P.
2017-12-01
Short- to medium-range (1- to 7-day) streamflow forecasts are important for flood control operations and for issuing potentially life-saving flood warnings. In the U.S., the National Weather Service River Forecast Centers (RFCs) issue such forecasts in real time, depending heavily on a manual data assimilation (DA) approach. Forecasters adjust model inputs, states, parameters and outputs based on experience and consideration of a range of supporting real-time information. Achieving high-quality forecasts from new automated, centralized forecast systems will depend critically on the adequacy of automated DA approaches to make analogous corrections to the forecasting system. Such approaches would further enable systematic evaluation of real-time flood forecasting methods and strategies. Toward this goal, we have implemented a real-time Sequential Importance Resampling particle filter (SIR-PF) approach to assimilate observed streamflow into simulated initial hydrologic conditions (states) for initializing ensemble flood forecasts. Assimilating streamflow alone in SIR-PF improves simulated streamflow and soil moisture during the model spin-up period prior to a forecast, with consequent benefits for forecasts. Nevertheless, it only consistently limits error in simulated snow water equivalent during the snowmelt season and in basins where precipitation falls primarily as snow. We examine how the simulated initial conditions with and without SIR-PF propagate into 1- to 7-day ensemble streamflow forecasts. Forecasts are evaluated in terms of reliability and skill over the 10-year period 2005-2015. The focus of this analysis is on how interactions between hydroclimate and SIR-PF performance impact forecast skill. To this end, we examine forecasts for 5 hydroclimatically diverse basins in the western U.S. Some of these basins receive most of their precipitation as snow, others as rain.
Some freeze throughout the mid-winter while others experience significant mid-winter melt events. We describe the methodology and present seasonal and inter-basin variations in DA-enhanced forecast skill.
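The SIR particle filter at the core of this approach propagates an ensemble of hydrologic states, weights each by the likelihood of the observed streamflow, and resamples. A minimal single-step sketch (a Gaussian observation likelihood and the `propagate` interface are assumptions for illustration, not the study's configuration):

```python
import math
import random

def _weighted_index(weights, u):
    # invert the cumulative distribution of normalized weights at u in [0, 1)
    acc = 0.0
    for i, wi in enumerate(weights):
        acc += wi
        if u <= acc:
            return i
    return len(weights) - 1

def sir_pf_step(particles, obs_flow, propagate, obs_sigma=1.0, rng=None):
    """One Sequential Importance Resampling update.

    propagate(state, rng) -> (new_state, simulated_flow)
    """
    rng = rng or random.Random(0)
    moved = [propagate(s, rng) for s in particles]
    # weight by a Gaussian likelihood of the streamflow observation
    w = [math.exp(-0.5 * ((obs_flow - q) / obs_sigma) ** 2) for _, q in moved]
    total = sum(w) or 1.0
    w = [x / total for x in w]
    # multinomial resampling proportional to the weights
    return [moved[_weighted_index(w, rng.random())][0] for _ in particles]
```

States that simulate flows close to the observation are duplicated, which is how the filter pulls initial conditions (e.g., soil moisture) toward values consistent with observed streamflow before a forecast is launched.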
Economic Models for Projecting Industrial Capacity for Defense Production: A Review
1983-02-01
macroeconomic forecast to establish the level of civilian final demand; all use the DoD Bridge Table to allocate budget category outlays to industries. Civilian...output table.' 3. Macroeconomic Assumptions and the Prediction of Final Demand All input-output models require as a starting point a prediction of final... macroeconomic forecast of GNP and its components and (2) a methodology to transform these forecast values of consumption, investment, exports, etc. into
[A method for forecasting the seasonal dynamic of malaria in the municipalities of Colombia].
Velásquez, Javier Oswaldo Rodríguez
2010-03-01
To develop a methodology for forecasting the seasonal dynamic of malaria outbreaks in the municipalities of Colombia. Epidemiologic ranges were defined by multiples of 50 cases for the six municipalities with the highest incidence, 25 cases for the municipalities that ranked 10th and 11th by incidence, 10 for the municipality that ranked 193rd, and 5 for the municipality that ranked 402nd. The specific probability values for each epidemiologic range appearing in each municipality, as well as the S/k value--the ratio between entropy (S) and the Boltzmann constant (k)--were calculated for each three-week set, along with the differences in this ratio between consecutive sets of weeks. These mathematical ratios were used to determine the values for forecasting the case dynamic, which were compared with the actual epidemiologic data from the period 2003-2007. The probability of the epidemiologic ranges appearing ranged from 0.019 to 1.00, while the differences in the S/k ratio between the sets of consecutive weeks ranged from 0.23 to 0.29. Three ratios were established to determine whether the dynamic corresponded to an outbreak. These ratios were corroborated with real epidemiological data from 810 Colombian municipalities. This methodology allows us to forecast the malaria case dynamic and outbreaks in the municipalities of Colombia and can be used in planning interventions and public health policies.
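For a discrete set of epidemiologic ranges with probabilities p, the entropy-to-Boltzmann-constant ratio reduces to S/k = -Σ p ln p. A minimal sketch of that computation and of the between-window differences (the windowing details are simplified relative to the paper's three-week sets):

```python
import math

def entropy_over_k(range_probs):
    """S/k = -sum(p * ln p) over the epidemiologic ranges observed in a
    window; terms with p == 0 contribute nothing by convention."""
    return -sum(p * math.log(p) for p in range_probs if p > 0)

def sk_differences(windows):
    """Differences of S/k between consecutive windows of range probabilities."""
    ratios = [entropy_over_k(w) for w in windows]
    return [abs(b - a) for a, b in zip(ratios, ratios[1:])]
```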
Parsons, Thomas E.; Segou, Margaret; Sevilgen, Volkan; Milner, Kevin; Field, Edward; Toda, Shinji; Stein, Ross S.
2014-01-01
We calculate stress changes resulting from the M = 6.0 West Napa earthquake on north San Francisco Bay area faults. The earthquake ruptured within a series of long faults that pose significant hazard to the Bay area, and we are thus concerned with potential increases in the probability of a large earthquake through stress transfer. We conduct this exercise as a prospective test because the skill of stress-based aftershock forecasting methodology is inconclusive. We apply three methods: (1) generalized mapping of regional Coulomb stress change, (2) stress changes resolved on Uniform California Earthquake Rupture Forecast faults, and (3) a mapped rate/state aftershock forecast. All calculations were completed within 24 h after the main shock and were made without benefit of known aftershocks, which will be used to evaluate the prospective forecast. All methods suggest that we should expect heightened seismicity on parts of the southern Rodgers Creek, northern Hayward, and Green Valley faults.
Forecasting Temporal Dynamics of Cutaneous Leishmaniasis in Northeast Brazil
Lewnard, Joseph A.; Jirmanus, Lara; Júnior, Nivison Nery; Machado, Paulo R.; Glesby, Marshall J.; Ko, Albert I.; Carvalho, Edgar M.; Schriefer, Albert; Weinberger, Daniel M.
2014-01-01
Introduction: Cutaneous leishmaniasis (CL) is a vector-borne disease of increasing importance in northeastern Brazil. It is known that sandflies, which spread the causative parasites, have weather-dependent population dynamics. Routinely gathered weather data may be useful for anticipating disease risk and planning interventions. Methodology/Principal Findings: We fit time series models using meteorological covariates to predict CL cases in a rural region of Bahía, Brazil from 1994 to 2004. We used the models to forecast CL cases for the period 2005 to 2008. Models accounting for meteorological predictors reduced mean squared error in one-, two-, and three-month-ahead forecasts by up to 16% relative to forecasts from a null model accounting only for temporal autocorrelation. Significance: These outcomes suggest CL risk in northeastern Brazil might be partially dependent on weather. Responses to forecasted CL epidemics may include bolstering clinical capacity and disease surveillance in at-risk areas. Ecological mechanisms by which weather influences CL risk merit future research attention as public health intervention targets. PMID:25356734
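Forecast skill relative to a null model is typically reported, as in this abstract, as the percentage reduction in mean squared error. A minimal sketch of that comparison (function names are hypothetical):

```python
def mse(actual, predicted):
    # mean squared error of a forecast against observed case counts
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def pct_mse_reduction(actual, model_forecast, null_forecast):
    """Percentage reduction in MSE of a covariate model relative to a
    null (autocorrelation-only) model; positive means the model helps."""
    return 100.0 * (1.0 - mse(actual, model_forecast) / mse(actual, null_forecast))
```

A value of "up to 16%", as reported here, would mean the meteorological model's MSE is at best 84% of the null model's.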
NASA Astrophysics Data System (ADS)
Maslova, I.; Ticlavilca, A. M.; McKee, M.
2012-12-01
There has been an increased interest in wavelet-based streamflow forecasting models in recent years. Often overlooked in this approach are the circularity assumptions of the wavelet transform. We propose a novel technique for minimizing the wavelet decomposition boundary condition effect to produce long-term, up to 12 months ahead, forecasts of streamflow. A simulation study is performed to evaluate the effects of different wavelet boundary rules using synthetic and real streamflow data. A hybrid wavelet-multivariate relevance vector machine model is developed for forecasting the streamflow in real time for the Yellowstone River, Uinta Basin, Utah, USA. The inputs of the model utilize only the past monthly streamflow records. They are decomposed into components formulated in terms of wavelet multiresolution analysis. It is shown that the model accuracy can be increased by using the wavelet boundary rule introduced in this study. This long-term streamflow modeling and forecasting methodology would enable better decision-making and managing of water availability risk.
NASA Astrophysics Data System (ADS)
Narasimha Murthy, K. V.; Saravana, R.; Vijaya Kumar, K.
2018-02-01
Weather forecasting is an important issue in meteorology all over the world. The pattern and amount of rainfall are essential factors affecting agricultural systems. India experiences the precious Southwest monsoon season for four months, from June to September. The present paper describes an empirical study for modeling and forecasting the time series of Southwest monsoon rainfall patterns in North-East India. The Box-Jenkins Seasonal Autoregressive Integrated Moving Average (SARIMA) methodology has been adopted for model identification, diagnostic checking, and forecasting for this region. The study has shown that the SARIMA(0,1,1)(1,0,1)4 model is appropriate for analyzing and forecasting future rainfall patterns. The Analysis of Means (ANOM) is a useful alternative to the analysis of variance (ANOVA) for comparing groups of treatments to study the variations and critical comparisons of rainfall patterns in different months of the season.
Forecast of the United States telecommunications demand through the year 2000
NASA Astrophysics Data System (ADS)
Kratochvil, D.
1984-01-01
The telecommunications forecasts considered in the present investigation were developed in studies conducted by Kratochvil et al. (1983). The overall purpose of these studies was to forecast the potential U.S. domestic telecommunications demand for satellite-provided fixed communications voice, data, and video services through the year 2000, so that this information on service demand would be available to aid in NASA communications program planning. Aspects of forecasting methodology are discussed, taking into account forecasting activity flow, specific services and selected techniques, and an event/trend cross-impact model. Events, or market determinant factors, which are very likely to occur by 1995 and 2005, are presented in a table. It is found that the demand for telecommunications in general, and for satellite telecommunications in particular, will increase significantly between now and the year 2000. The required satellite capacity will surpass both the potential and actual capacities in the early 1990s, indicating a need for Ka-band at that time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendes, J.; Bessa, R.J.; Keko, H.
Wind power forecasting (WPF) provides important inputs to power system operators and electricity market participants. It is therefore not surprising that WPF has attracted increasing interest within the electric power industry. In this report, we document our research on improving statistical WPF algorithms for point, uncertainty, and ramp forecasting. Below, we provide a brief introduction to the research presented in the following chapters. For a detailed overview of the state-of-the-art in wind power forecasting, we refer to [1]. Our related work on the application of WPF in operational decisions is documented in [2]. Point forecasts of wind power are highly dependent on the training criteria used in the statistical algorithms that are used to convert weather forecasts and observational data to a power forecast. In Chapter 2, we explore the application of information theoretic learning (ITL) as opposed to the classical minimum square error (MSE) criterion for point forecasting. In contrast to the MSE criterion, ITL criteria do not assume a Gaussian distribution of the forecasting errors. We investigate to what extent ITL criteria yield better results. In addition, we analyze time-adaptive training algorithms and how they enable WPF algorithms to cope with non-stationary data and, thus, to adapt to new situations without requiring additional offline training of the model. We test the new point forecasting algorithms on two wind farms located in the U.S. Midwest. Although there have been advancements in deterministic WPF, a single-valued forecast cannot provide information on the dispersion of observations around the predicted value. We argue that it is essential to generate, together with (or as an alternative to) point forecasts, a representation of the wind power uncertainty.
Wind power uncertainty representation can take the form of probabilistic forecasts (e.g., probability density function, quantiles), risk indices (e.g., prediction risk index) or scenarios (with spatial and/or temporal dependence). Statistical approaches to uncertainty forecasting basically consist of estimating the uncertainty based on observed forecasting errors. Quantile regression (QR) is currently a commonly used approach in uncertainty forecasting. In Chapter 3, we propose new statistical approaches to the uncertainty estimation problem by employing kernel density forecast (KDF) methods. We use two estimators in both offline and time-adaptive modes, namely, the Nadaraya-Watson (NW) and Quantile-copula (QC) estimators. We conduct detailed tests of the new approaches using QR as a benchmark. One of the major issues in wind power generation is sudden and large changes of wind power output over a short period of time, namely ramping events. In Chapter 4, we perform a comparative study of existing definitions and methodologies for ramp forecasting. We also introduce a new probabilistic method for ramp event detection. The method starts with a stochastic algorithm that generates wind power scenarios, which are passed through a high-pass filter for ramp detection and estimation of the likelihood of ramp events to happen. The report is organized as follows: Chapter 2 presents the results of the application of ITL training criteria to deterministic WPF; Chapter 3 reports the study on probabilistic WPF, including new contributions to wind power uncertainty forecasting; Chapter 4 presents a new method to predict and visualize ramp events, comparing it with state-of-the-art methodologies; Chapter 5 briefly summarizes the main findings and contributions of this report.
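After high-pass filtering, ramp event detection is often reduced to flagging changes in power output that exceed a threshold over a short window. A toy sketch of such a magnitude-over-window rule (this is one of several ramp definitions the report compares, not its specific probabilistic method):

```python
def detect_ramps(power, window, threshold):
    """Flag ramp events: time indices t where the absolute change in power
    over the preceding `window` steps meets or exceeds `threshold`."""
    return [t for t in range(window, len(power))
            if abs(power[t] - power[t - window]) >= threshold]
```

Applied to each scenario from a stochastic wind power generator, the fraction of scenarios flagged at a given time provides a simple estimate of the likelihood of a ramp event.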
NASA Technical Reports Server (NTRS)
Nelson, J. M.; Lempriere, B. M.
1987-01-01
A program to develop a methodology for detecting and locating meteoroid and debris impacts and penetrations of a wall configuration currently specified for use on the space station is documented. Testing consisted of penetrating and non-penetrating hypervelocity impacts on single and dual plate test configurations, including a prototype 1.22 m x 2.44 m x 3.56 mm (4 ft x 8 ft x 0.140 in) aluminum waffle grid backwall with multilayer insulation and a 0.063-in shield. Acoustic data were gathered with transducers and associated data acquisition systems and stored for later analysis with a multichannel digitizer. Preliminary analysis of test data included sensor evaluation, impact repeatability, first waveform arrival, and Fourier spectral analysis.
Cockpit automation - In need of a philosophy
NASA Technical Reports Server (NTRS)
Wiener, E. L.
1985-01-01
Concern has been expressed over the rapid development and deployment of automatic devices in transport aircraft, due mainly to the human interface and particularly the role of automation in inducing human error. The paper discusses the need for coherent philosophies of automation, and proposes several approaches: (1) flight management by exception, which states that as long as a crew stays within the bounds of regulations, air traffic control and flight safety, it may fly as it sees fit; (2) exceptions by forecasting, where the use of forecasting models would predict boundary penetration, rather than waiting for it to happen; (3) goal-sharing, where a computer is informed of overall goals, and subsequently has the capability of checking inputs and aircraft position for consistency with the overall goal or intentions; and (4) artificial intelligence and expert systems, where intelligent machines could mimic human reason.
Emerging technologies for the changing global market
NASA Technical Reports Server (NTRS)
Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt
1993-01-01
This project served to define an appropriate methodology for effective prioritization of the technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi-quantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.
Data Mining for Financial Applications
NASA Astrophysics Data System (ADS)
Kovalerchuk, Boris; Vityaev, Evgenii
This chapter describes Data Mining in finance by discussing financial tasks, specifics of methodologies and techniques in this Data Mining area. It includes time dependence, data selection, forecast horizon, measures of success, quality of patterns, hypothesis evaluation, problem ID, method profile, attribute-based and relational methodologies. The second part of the chapter discusses Data Mining models and practice in finance. It covers use of neural networks in portfolio management, design of interpretable trading rules and discovering money laundering schemes using decision rules and relational Data Mining methodology.
Airfreight forecasting methodology and results
NASA Technical Reports Server (NTRS)
1978-01-01
A series of econometric behavioral equations was developed to explain and forecast the evolution of airfreight traffic demand for the total U.S. domestic airfreight system, the total U.S. international airfreight system, and the total scheduled international cargo traffic carried by the top 44 foreign airlines. The basic explanatory variables used in these macromodels were the real gross national products of the countries involved and a measure of relative transportation costs. The results of the econometric analysis reveal that the models explain more than 99 percent of the historical evolution of freight traffic. The long term traffic forecasts generated with these models are based on scenarios of the likely economic outlook in the United States and 31 major foreign countries.
High-Resolution Hydrological Sub-Seasonal Forecasting for Water Resources Management Over Europe
NASA Astrophysics Data System (ADS)
Wood, E. F.; Wanders, N.; Pan, M.; Sheffield, J.; Samaniego, L. E.; Thober, S.; Kumar, R.; Prudhomme, C.; Houghton-Carr, H.
2017-12-01
For decision-making at the sub-seasonal and seasonal time scale, hydrological forecasts with a high temporal and spatial resolution are required by water managers. So far such forecasts have been unavailable due to 1) lack of availability of meteorological seasonal forecasts, 2) coarse temporal resolution of meteorological seasonal forecasts, requiring temporal downscaling, and 3) lack of consistency between observations and seasonal forecasts, requiring bias correction. The EDgE (End-to-end Demonstrator for improved decision making in the water sector in Europe) project commissioned by the ECMWF (C3S) created a unique dataset of hydrological seasonal forecasts derived from four global climate models (CanCM4, FLOR-B01, ECMF, LFPW) in combination with four global hydrological models (PCR-GLOBWB, VIC, mHM, Noah-MP), resulting in 208 forecasts for any given day. The forecasts provide a daily temporal and 5-km spatial resolution, and are bias corrected against E-OBS meteorological observations. The forecasts are communicated to stakeholders via Sectoral Climate Impact Indicators (SCIIs), created in collaboration with the end-user community of the EDgE project (e.g. the percentage of ensemble realizations above the 10th percentile of monthly river flow, or below the 90th). Results show skillful forecasts for discharge from 3 to 6 months (the latter in northern Europe due to snow); for soil moisture, up to three months due to precipitation forecast skill and short initial-condition memory; and for groundwater, greater than 6 months (lowest skill in western Europe). The SCIIs are effective in communicating both forecast skill and uncertainty. Overall the new system provides an unprecedented ensemble for seasonal forecasts with significant skill over Europe to support water management. The consistency in both the GCM forecasts and the LSM parameterization ensures a stable and reliable forecast framework and methodology, even if additional GCMs or LSMs are added in the future.
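An SCII such as "the percentage of ensemble realizations above the 10th percentile of monthly river flow" can be sketched as a simple exceedance count against an empirical climatological percentile (the percentile estimator below is a simplification; operational systems use more careful quantile estimation):

```python
def scii_exceedance(ensemble, climatology, pct=10):
    """Percentage of ensemble members above the pct-th empirical
    percentile of a climatological sample of monthly river flows."""
    ranked = sorted(climatology)
    # crude empirical percentile: the value at rank ~ pct% of the sample
    idx = max(0, int(round(len(ranked) * pct / 100.0)) - 1)
    threshold = ranked[idx]
    return 100.0 * sum(1 for m in ensemble if m > threshold) / len(ensemble)
```

Reporting the indicator as a percentage of the 208-member ensemble communicates both the forecast signal and its uncertainty in a single number.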
NASA Astrophysics Data System (ADS)
Yildiz, Baran; Bilbao, Jose I.; Dore, Jonathon; Sproul, Alistair B.
2018-05-01
Smart grid components such as smart home and battery energy management systems, high penetration of renewable energy systems, and demand response activities require accurate electricity demand forecasts for the successful operation of the electricity distribution networks. For example, in order to optimize residential PV generation and electricity consumption and plan battery charge-discharge regimes by scheduling household appliances, forecasts need to target and be tailored to individual household electricity loads. The recent uptake of smart meters allows easier access to electricity readings at very fine resolutions; hence, it is possible to utilize this source of available data to create forecast models. In this paper, models which predominantly use smart meter data alongside weather variables, or smart meter based models (SMBM), are implemented to forecast individual household loads. Well-known machine learning models such as artificial neural networks (ANN), support vector machines (SVM) and Least-Squares SVM are implemented within the SMBM framework and their performance is compared. The analysed household stock consists of 14 households from the state of New South Wales, Australia, each with at least a year's worth of 5-minute resolution data. In order for the results to be comparable between different households, our study first investigates household load profiles according to their volatility and reveals the relationship between load standard deviation and forecast performance. The analysis extends previous research by evaluating forecasts over four different data resolutions (5, 15, 30 and 60 min), each analysed for four different horizons (1, 6, 12 and 24 h ahead). Both data resolution and forecast horizon proved to have a significant impact on forecast performance, and the obtained results provide important insights for the operation of various smart grid applications.
Finally, it is shown that the load profiles of some households vary significantly across different days; as a result, a single model for the entire period may give limited performance. By use of a pre-clustering step, similar daily load profiles are grouped together according to their standard deviation, and instead of applying one SMBM to the entire data-set of a particular household, separate SMBMs are applied to each cluster. This preliminary clustering step increases the complexity of the analysis; however, it results in significant improvements in forecast performance.
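The pre-clustering step can be sketched as binning daily profiles by their standard deviation before fitting one model per bin (equal-width binning is an assumption for illustration; the paper's clustering may differ):

```python
import statistics

def cluster_days_by_std(daily_profiles, n_bins=3):
    """Group daily load profiles into n_bins clusters by their standard
    deviation, so a separate forecast model can be fit per cluster."""
    stds = [statistics.pstdev(day) for day in daily_profiles]
    lo, hi = min(stds), max(stds)
    width = (hi - lo) / n_bins or 1.0   # fallback when all stds are equal
    clusters = {i: [] for i in range(n_bins)}
    for day, s in zip(daily_profiles, stds):
        idx = min(int((s - lo) / width), n_bins - 1)
        clusters[idx].append(day)
    return clusters
```

At forecast time, a new day would first be assigned to a cluster (e.g., by its recent volatility) and then scored with that cluster's SMBM.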
Model documentation renewable fuels module of the National Energy Modeling System
NASA Astrophysics Data System (ADS)
1995-06-01
This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it relates to the production of the 1995 Annual Energy Outlook (AEO95) forecasts. The report catalogs and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described. The RFM consists of six analytical submodules that represent each of the major renewable energy resources -- wood, municipal solid waste (MSW), solar energy, wind energy, geothermal energy, and alcohol fuels. The RFM also reads in hydroelectric facility capacities and capacity factors from a data file for use by the NEMS Electricity Market Module (EMM). The purpose of the RFM is to define the technological, cost, and resource size characteristics of renewable energy technologies. These characteristics are used to compute a levelized cost to be competed against other similarly derived costs from other energy sources and technologies. The competition of these energy sources over the NEMS time horizon determines the market penetration of these renewable energy technologies. The characteristics include available energy capacity, capital costs, fixed operating costs, variable operating costs, capacity factor, heat rate, construction lead time, and fuel product price.
System-wide emissions implications of increased wind power penetration.
Valentino, Lauren; Valenzuela, Viviana; Botterud, Audun; Zhou, Zhi; Conzelmann, Guenter
2012-04-03
This paper discusses the environmental effects of incorporating wind energy into the electric power system. We present a detailed emissions analysis based on comprehensive modeling of power system operations with unit commitment and economic dispatch for different wind penetration levels. First, by minimizing cost, the unit commitment model decides which thermal power plants will be utilized based on a wind power forecast, and then, the economic dispatch model dictates the level of production for each unit as a function of the realized wind power generation. Finally, knowing the power production from each power plant, the emissions are calculated. The emissions model incorporates the effects of both cycling and start-ups of thermal power plants in analyzing emissions from an electric power system with increasing levels of wind power. Our results for the power system in the state of Illinois show significant emissions effects from increased cycling and particularly start-ups of thermal power plants. However, we conclude that as the wind power penetration increases, pollutant emissions decrease overall due to the replacement of fossil fuels.
Linden, Ariel
2018-05-11
Interrupted time series analysis (ITSA) is an evaluation methodology in which a single treatment unit's outcome is studied serially over time and the intervention is expected to "interrupt" the level and/or trend of that outcome. ITSA is commonly evaluated using methods that may produce biased results if model assumptions are violated. In this paper, treatment effects are alternatively assessed by using forecasting methods to closely fit the preintervention observations and then forecast the post-intervention trend. A treatment effect may be inferred if the actual post-intervention observations diverge from the forecasts by some specified amount. The forecasting approach is demonstrated using the effect of California's Proposition 99 for reducing cigarette sales. Three forecast models are fit to the preintervention series: linear regression (REG), Holt-Winters (HW) non-seasonal smoothing, and autoregressive integrated moving average (ARIMA). Forecasts are then generated into the post-intervention period, and the actual observations are compared with the forecasts to assess intervention effects. The preintervention data were fit best by HW, followed closely by ARIMA. REG fit the data poorly. The actual post-intervention observations were above the forecasts in HW and ARIMA, suggesting no intervention effect, but below the forecasts in REG (suggesting a treatment effect), thereby raising doubts about any definitive conclusion of a treatment effect. In a single-group ITSA, treatment effects are likely to be biased if the model is misspecified. Therefore, evaluators should consider using forecast models to accurately fit the preintervention data and generate plausible counterfactual forecasts, thereby improving causal inference of treatment effects in single-group ITSA studies. © 2018 John Wiley & Sons, Ltd.
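The counterfactual-forecast idea can be sketched in a few lines. Here the non-seasonal Holt-Winters (Holt linear trend) method is implemented directly, with invented data and smoothing parameters rather than the paper's cigarette-sales series:

```python
def holt_fit_forecast(series, alpha=0.8, beta=0.2, horizon=5):
    """Holt's (non-seasonal) linear trend smoothing, then h-step forecasts."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + (h + 1) * trend for h in range(horizon)]

# Declining preintervention outcome (arbitrary illustrative numbers).
pre  = [120.0, 118.0, 115.0, 113.0, 110.0, 108.0]
post = [101.0, 98.0, 95.0, 92.0, 89.0]     # observed after the intervention

forecast = holt_fit_forecast(pre, horizon=len(post))
# Observations falling below the counterfactual suggest a treatment effect.
divergence = [obs - fc for obs, fc in zip(post, forecast)]
```

In the paper's spirit, the inference rests on whether `divergence` exceeds some prespecified threshold, not on a within-sample regression fit.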
Price of gasoline: forecasting comparisons [Box-Jenkins, econometric, and regression methods]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bopp, A.E.; Neri, J.A.
Gasoline prices are simulated using three popular forecasting methodologies: a Box-Jenkins type method, an econometric method, and a regression method. One-period-ahead and 18-period-ahead comparisons are made. For the one-period-ahead comparison, a Box-Jenkins type time-series model simulated best, although all did well. However, for the 18-period simulation, the econometric and regression methods perform substantially better than the Box-Jenkins formulation. A rationale for and implications of these results are discussed. 11 references.
Volcanic Eruption Forecasts From Accelerating Rates of Drumbeat Long-Period Earthquakes
NASA Astrophysics Data System (ADS)
Bell, Andrew F.; Naylor, Mark; Hernandez, Stephen; Main, Ian G.; Gaunt, H. Elizabeth; Mothes, Patricia; Ruiz, Mario
2018-02-01
Accelerating rates of quasiperiodic "drumbeat" long-period earthquakes (LPs) are commonly reported before eruptions at andesite and dacite volcanoes, and promise insights into the nature of fundamental preeruptive processes and improved eruption forecasts. Here we apply a new Bayesian Markov chain Monte Carlo gamma point process methodology to investigate an exceptionally well-developed sequence of drumbeat LPs preceding a recent large vulcanian explosion at Tungurahua volcano, Ecuador. For more than 24 hr, LP rates increased according to the inverse power law trend predicted by material failure theory, and with a retrospectively forecast failure time that agrees with the eruption onset within error. LPs resulted from repeated activation of a single characteristic source driven by accelerating loading, rather than a distributed failure process, showing that similar precursory trends can emerge from quite different underlying physics. Nevertheless, such sequences have clear potential for improving forecasts of eruptions at Tungurahua and analogous volcanoes.
Forecasting influenza in Hong Kong with Google search queries and statistical model fusion
Ramirez Ramirez, L. Leticia; Nezafati, Kusha; Zhang, Qingpeng; Tsui, Kwok-Leung
2017-01-01
Background The objective of this study is to investigate the predictive utility of online social media and web search queries, particularly Google search data, to forecast new cases of influenza-like illness (ILI) in general outpatient clinics (GOPC) in Hong Kong. To mitigate the impact of sensitivity to self-excitement (i.e., fickle media interest) and other artifacts of online social media data, our approach fuses multiple offline and online data sources. Methods Four individual models: generalized linear model (GLM), least absolute shrinkage and selection operator (LASSO), autoregressive integrated moving average (ARIMA), and deep learning (DL) with feedforward neural networks (FNN) are employed to forecast ILI-GOPC both one week and two weeks in advance. The covariates include Google search queries, meteorological data, and previously recorded offline ILI. To our knowledge, this is the first study that introduces deep learning methodology into surveillance of infectious diseases and investigates its predictive utility. Furthermore, to exploit the strengths of the individual forecasting models, we use statistical model fusion via Bayesian model averaging (BMA), which allows a systematic integration of multiple forecast scenarios. For each model, an adaptive approach is used to capture the recent relationship between ILI and covariates. Results DL with FNN appears to deliver the most competitive predictive performance among the four individual models considered. Combining all four models in a comprehensive BMA framework further improves predictive evaluation metrics such as root mean squared error (RMSE) and mean absolute percentage error (MAPE). Nevertheless, DL with FNN remains the preferred method for predicting the locations of influenza peaks. Conclusions The proposed approach can be viewed as a feasible alternative for forecasting ILI in Hong Kong or other countries where ILI has no constant seasonal trend and influenza data resources are limited.
The proposed methodology is easily tractable and computationally efficient. PMID:28464015
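The BMA fusion step can be approximated with a small sketch. Here the model weights come from the Gaussian likelihood of each model's recent forecast errors with equal priors, a common simplification of full BMA (the marginal likelihoods the method properly requires are replaced by this plug-in); all forecasts and errors below are hypothetical:

```python
import math

def bma_weights(errors_by_model, sigma=1.0):
    """BMA-style weights: posterior mass proportional to the Gaussian
    likelihood of each model's recent forecast errors (equal priors)."""
    logliks = {
        m: sum(-0.5 * (e / sigma) ** 2 for e in errs)
        for m, errs in errors_by_model.items()
    }
    top = max(logliks.values())                  # subtract max to avoid underflow
    raw = {m: math.exp(ll - top) for m, ll in logliks.items()}
    z = sum(raw.values())
    return {m: w / z for m, w in raw.items()}

def fuse(forecasts, weights):
    """Weighted combination of the individual point forecasts."""
    return sum(weights[m] * f for m, f in forecasts.items())

# Hypothetical one-week-ahead ILI forecasts and recent errors per model.
errors = {"GLM": [3.0, -2.5, 2.8], "LASSO": [1.5, -1.2, 1.0],
          "ARIMA": [1.0, 0.8, -0.9], "DL": [0.4, -0.5, 0.3]}
w = bma_weights(errors)
combined = fuse({"GLM": 210.0, "LASSO": 205.0, "ARIMA": 202.0, "DL": 200.0}, w)
```

The model with the smallest recent errors (here the DL stand-in) dominates the combination, matching the paper's finding that the best single model carries most of the BMA weight.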
Alteration of Box-Jenkins methodology by implementing genetic algorithm method
NASA Astrophysics Data System (ADS)
Ismail, Zuhaimy; Maarof, Mohd Zulariffin Md; Fadzli, Mohammad
2015-02-01
A time series is a set of values sequentially observed through time. The Box-Jenkins methodology is a systematic method of identifying, fitting, checking and using integrated autoregressive moving average time series models for forecasting. The Box-Jenkins method is appropriate for medium to long time series (at least 50 observations). When modeling such series, the difficulty lies in choosing the correct model order at the identification stage and in finding the right parameter estimates. This paper presents the development of a genetic algorithm heuristic to solve the identification and estimation problems in Box-Jenkins modeling. Data on international tourist arrivals to Malaysia were used to illustrate the effectiveness of the proposed method. The forecasts generated by the proposed model outperformed the single traditional Box-Jenkins model.
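The genetic-algorithm search over model orders can be sketched as follows. The fitness function here is a synthetic stand-in whose optimum is placed at (2, 1, 1): in the paper it would be a model-selection criterion (such as AIC from a fitted Box-Jenkins model), so this example only demonstrates the search mechanics, not the statistical fitting:

```python
import random
random.seed(1)

# Minimal GA over ARIMA orders (p, d, q). The fitness below is a synthetic
# surface standing in for a real criterion like negative AIC.
def fitness(ind):
    p, d, q = ind
    return -((p - 2) ** 2 + (d - 1) ** 2 + (q - 1) ** 2)   # maximise

def random_individual():
    return (random.randint(0, 5), random.randint(0, 2), random.randint(0, 5))

def crossover(a, b):
    cut = random.randint(1, 2)                  # single-point crossover
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.2):
    bounds = (5, 2, 5)                          # max value per gene
    return tuple(random.randint(0, hi) if random.random() < rate else g
                 for g, hi in zip(ind, bounds))

def ga(pop_size=24, generations=40):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]            # truncation selection
        children = [mutate(crossover(random.choice(elite),
                                     random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

best = ga()     # order (p, d, q) found by the search
```

Because elites are always retained, the best candidate order is monotonically non-worsening across generations, which is the property the paper relies on to beat manual identification.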
ERIC Educational Resources Information Center
Cuadra, Ernesto; Crouch, Luis
Student promotion, repetition, and dropout rates constitute the basic data needed to forecast future enrollment and new resources. Information on student flow is significantly related to policy formulation aimed at improving internal efficiency, because dropping out and grade repetition increase per pupil cost, block access to eligible school-age…
NASA Astrophysics Data System (ADS)
Siegert, Stefan
2017-04-01
Initialised climate forecasts on seasonal time scales, run several months or even years ahead, are now an integral part of the battery of products offered by climate services world-wide. The availability of seasonal climate forecasts from various modeling centres gives rise to multi-model ensemble forecasts. Post-processing such seasonal-to-decadal multi-model forecasts is challenging: 1) because the cross-correlation structure between multiple models and observations can be complicated, 2) because the amount of training data to fit the post-processing parameters is very limited, and 3) because the forecast skill of numerical models tends to be low on seasonal time scales. In this talk I will review new statistical post-processing frameworks for multi-model ensembles. I will focus particularly on Bayesian hierarchical modelling approaches, which are flexible enough to capture commonly made assumptions about collective and model-specific biases of multi-model ensembles. Despite the advances in statistical methodology, it turns out to be very difficult to outperform the simplest post-processing method, which just recalibrates the multi-model ensemble mean by linear regression. I will discuss reasons for this, which are closely linked to the specific characteristics of seasonal multi-model forecasts. I will explore possible directions for improvements, for example using informative priors on the post-processing parameters, and jointly modelling forecasts and observations.
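The hard-to-beat baseline, recalibrating the multi-model ensemble mean by linear regression, is simple enough to sketch directly (the training data below are invented):

```python
def fit_linear(x, y):
    """Ordinary least squares for y = a + b * x: the simple recalibration
    that more elaborate post-processing struggles to beat on seasonal scales."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Hypothetical training data: multi-model ensemble-mean forecasts of a
# seasonal temperature anomaly vs. the verifying observations.
ens_mean = [0.2, 0.5, -0.1, 0.8, 0.3, -0.4]
obs      = [0.1, 0.35, -0.15, 0.55, 0.2, -0.35]

a, b = fit_linear(ens_mean, obs)
recalibrated = [a + b * f for f in ens_mean]
```

A slope below one shrinks overconfident forecasts toward climatology, which is exactly the behaviour the talk attributes to this baseline.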
Hybrid Forecasting of Daily River Discharges Considering Autoregressive Heteroscedasticity
NASA Astrophysics Data System (ADS)
Szolgayová, Elena Peksová; Danačová, Michaela; Komorniková, Magda; Szolgay, Ján
2017-06-01
It is widely acknowledged in the hydrological and meteorological communities that there is a continuing need to improve the quality of quantitative rainfall and river flow forecasts. A hybrid (combined deterministic-stochastic) modelling approach is proposed here that combines the advantages of modelling the system dynamics with a deterministic model and modelling the deterministic forecasting error series with a data-driven model in parallel. Since the processes to be modelled are generally nonlinear and the model error series may exhibit nonstationarity and heteroscedasticity, GARCH-type nonlinear time series models are considered here. The fitting, forecasting and simulation performance of such models has to be explored on a case-by-case basis. The goal of this paper is to test and develop an appropriate methodology for model fitting and forecasting, applicable to daily river discharge forecast error data, from the GARCH family of time series models. We concentrated on verifying whether the use of a GARCH-type model is suitable for modelling and forecasting a hydrological model error time series on the Hron and Morava Rivers in Slovakia. For this purpose we verified the presence of heteroscedasticity in the simulation error series of the KLN multilinear flow routing model; then we fitted the GARCH-type models to the data and compared their fit with that of an ARMA-type model. We produced one-step-ahead forecasts from the fitted models and again provided comparisons of the models' performance.
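The core GARCH(1,1) recursion used to model such heteroscedastic forecast-error series can be written directly. The parameters below are illustrative only, not estimates from the Hron or Morava data; in practice they are obtained by maximum likelihood:

```python
def garch11_variance(errors, omega=0.05, alpha=0.1, beta=0.85):
    """Conditional variance path of a GARCH(1,1):
    sigma2_t = omega + alpha * e_{t-1}^2 + beta * sigma2_{t-1}.
    Initialised at the unconditional variance omega / (1 - alpha - beta)."""
    sigma2 = [omega / (1 - alpha - beta)]
    for e in errors[:-1]:
        sigma2.append(omega + alpha * e ** 2 + beta * sigma2[-1])
    return sigma2

# A calm stretch followed by a burst of large model errors
# (the heteroscedastic pattern the paper tests for).
err = [0.1, -0.2, 0.1, 2.0, -1.8, 2.2, 0.3, -0.1]
var_path = garch11_variance(err)
```

The conditional variance rises after the burst of large errors and decays slowly afterwards (persistence alpha + beta = 0.95), which is what distinguishes a GARCH fit from a constant-variance ARMA error model.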
A novel hybrid forecasting model for PM₁₀ and SO₂ daily concentrations.
Wang, Ping; Liu, Yong; Qin, Zuodong; Zhang, Guisheng
2015-02-01
Air-quality forecasting in urban areas is difficult because of the uncertainties in describing both the emission and meteorological fields. The use of incomplete information in the training phase restricts practical air-quality forecasting. In this paper, we propose a hybrid artificial neural network and a hybrid support vector machine, which effectively enhance the forecasting accuracy of the artificial neural network (ANN) and support vector machine (SVM) by revising the error term of the traditional methods. The hybrid methodology can be described in two stages. First, we applied the ANN or SVM forecasting system with historical data and exogenous parameters, such as meteorological variables. Then, the forecasting target was revised by a Taylor expansion forecasting model using the residual information of the error term from the previous stage. The innovation of this approach is that it makes full use of the residual information even when the input variables are incomplete. The proposed method was evaluated by experiments using a 2-year dataset of daily PM₁₀ (particles with a diameter of 10 μm or less) concentrations and SO₂ (sulfur dioxide) concentrations from four air pollution monitoring stations located in Taiyuan, China. The theoretical analysis and experimental results demonstrate that the forecasting accuracy of the proposed model is very promising. Copyright © 2014 Elsevier B.V. All rights reserved.
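The two-stage structure (a base forecast, then a revision driven by the residual series) can be sketched minimally. The base model below is a simple trend extrapolation standing in for the paper's ANN/SVM stage, and the correction is a first-order (Taylor-style) extrapolation of the recent residuals; all numbers are hypothetical:

```python
def base_forecast(history):
    """Stand-in for the ANN/SVM stage: persistence-plus-trend forecast.
    (The paper uses an ANN or SVM with meteorological covariates.)"""
    return history[-1] + (history[-1] - history[-2])

def revised_forecast(history, past_residuals):
    """Second stage: revise the base forecast with a first-order
    (Taylor-style) extrapolation of the recent residual series."""
    correction = past_residuals[-1] + (past_residuals[-1] - past_residuals[-2])
    return base_forecast(history) + correction

pm10 = [80.0, 86.0, 90.0, 97.0]       # hypothetical daily PM10 values
residuals = [3.0, 4.0]                # recent base-model errors (obs - forecast)
plain = base_forecast(pm10)           # → 104.0
revised = revised_forecast(pm10, residuals)   # → 109.0
```

Because the base model has been persistently under-predicting (positive, growing residuals), the second stage pushes the forecast upward, which is the mechanism by which the hybrid recovers information the incomplete inputs missed.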
A Hybrid Model for Predicting the Prevalence of Schistosomiasis in Humans of Qianjiang City, China
Wang, Ying; Lu, Zhouqin; Tian, Lihong; Tan, Li; Shi, Yun; Nie, Shaofa; Liu, Li
2014-01-01
Backgrounds/Objective Schistosomiasis is still a major public health problem in China, despite the fact that the government has implemented a series of strategies to prevent and control the spread of the parasitic disease. Advanced warning and reliable forecasting can help policymakers adjust and implement strategies more effectively, leading to the control and elimination of schistosomiasis. Our aim is to explore the application of a hybrid forecasting model to track the trends of the prevalence of schistosomiasis in humans, which provides a methodological basis for predicting and detecting schistosomiasis infection in endemic areas. Methods A hybrid approach combining the autoregressive integrated moving average (ARIMA) model and the nonlinear autoregressive neural network (NARNN) model was used to forecast the prevalence of schistosomiasis over the next four years. Forecasting performance was compared among the hybrid ARIMA-NARNN model, the single ARIMA model and the single NARNN model. Results The modelling mean square error (MSE), mean absolute error (MAE) and mean absolute percentage error (MAPE) of the ARIMA-NARNN model were 0.1869×10−4, 0.0029 and 0.0419, with corresponding testing errors of 0.9375×10−4, 0.0081 and 0.9064, respectively. These error values generated with the hybrid model were all lower than those obtained from the single ARIMA or NARNN model. The forecast values were 0.75%, 0.80%, 0.76% and 0.77% for the next four years, demonstrating no downward trend. Conclusion The hybrid model has high prediction accuracy for the prevalence of schistosomiasis, which provides a methodological basis for future schistosomiasis monitoring and control strategies in the study area. It is worth attempting to utilize the hybrid detection scheme in other schistosomiasis-endemic areas, and for other infectious diseases. PMID:25119882
NASA Astrophysics Data System (ADS)
Baklanov, Alexander; Smith Korsholm, Ulrik; Nuterman, Roman; Mahura, Alexander; Pagh Nielsen, Kristian; Hansen Sass, Bent; Rasmussen, Alix; Zakey, Ashraf; Kaas, Eigil; Kurganskiy, Alexander; Sørensen, Brian; González-Aparicio, Iratxe
2017-08-01
The Environment - High Resolution Limited Area Model (Enviro-HIRLAM) is developed as a fully online integrated numerical weather prediction (NWP) and atmospheric chemical transport (ACT) model for research and forecasting of joint meteorological, chemical and biological weather. The integrated modelling system is developed by the Danish Meteorological Institute (DMI) in collaboration with several European universities. It is the baseline system in the HIRLAM Chemical Branch and is used in several countries and different applications. The development was initiated at DMI more than 15 years ago. The model is based on the HIRLAM NWP model with online integrated pollutant transport and dispersion, chemistry, aerosol dynamics, deposition and atmospheric composition feedbacks. To make the model suitable for chemical weather forecasting in urban areas, the meteorological part was improved by implementing urban parameterisations. The dynamical core was improved by implementing a locally mass-conserving semi-Lagrangian numerical advection scheme, which improves forecast accuracy and model performance. The current version (7.2), in comparison with previous versions, has a more advanced and cost-efficient chemistry, an aerosol multi-compound approach, and aerosol feedbacks on radiation (direct and semi-direct) and on cloud microphysics (first and second indirect effects). Since 2004, Enviro-HIRLAM has been used for different studies, including operational pollen forecasting for Denmark since 2009 and operational forecasting of atmospheric composition with downscaling for China since 2017. Following the main research and development strategy, further model developments will be extended towards the new NWP platform, HARMONIE. Different aspects of online coupling methodology, research strategy and possible applications of the modelling system, and fit-for-purpose model configurations for the meteorological and air quality communities are discussed.
An application of ensemble/multi model approach for wind power production forecast.
NASA Astrophysics Data System (ADS)
Alessandrini, S.; Decimi, G.; Hagedorn, R.; Sperati, S.
2010-09-01
Wind power forecasts for the 3-day-ahead period are becoming increasingly useful and important for reducing grid-integration problems and for energy price trading as wind power penetration grows. The accuracy of these forecasts is therefore one of the most important requirements for a successful application. The wind power forecast is based on mesoscale meteorological models that provide the 3-day-ahead wind data. A Model Output Statistics correction is then performed to reduce systematic errors caused, for instance, by a poor representation of surface roughness or topography in the meteorological models. The corrected wind data are then fed into the wind farm power curve to obtain the power forecast. These computations require historical time series of measured wind data (from an anemometer located in the wind farm or on the nacelle) and power data in order to perform the statistical analysis on the past. For this purpose a neural network (NN) is trained on the past data and then applied in the forecast task. Since anemometer measurements are not always available in a wind farm, a different approach has also been adopted: training the NN to link the forecasted meteorological data directly to the power data. The normalized RMSE forecast error is lower in most cases with this second approach. We examined two wind farms, one located on flat terrain in Denmark and one in a mountainous area of southern Italy (Sicily). In both cases we compare the performance of a prediction based on meteorological data from a single model with that obtained using two or more models (RAMS, ECMWF deterministic, LAMI, HIRLAM). It is shown that the multi-model approach reduces the day-ahead normalized RMSE forecast error by at least 1% compared with the single-model approach. Moreover, the use of a deterministic global model (e.g. the ECMWF deterministic model) seems to reach accuracy similar to that of the mesoscale models (LAMI and RAMS). Finally, we focused on the possibility of using the ensemble model (ECMWF) to estimate the accuracy of the hourly, three-day-ahead power forecast. Contingency diagrams between the RMSE of the deterministic power forecast and the spread of the wind forecast ensemble members have been produced. From this first analysis it seems that ensemble spread can be used as an indicator of forecast accuracy, at least for the first day ahead: low spreads often correspond to low forecast errors. For longer forecast horizons the correlation between RMSE and ensemble spread decreases, becoming too low to be used for this purpose.
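The spread-skill check described in this abstract can be sketched directly: compute the ensemble spread for each forecast and correlate it with the deterministic forecast error (all values below are invented):

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spread(members):
    """Ensemble spread: standard deviation of the member forecasts."""
    m = sum(members) / len(members)
    return (sum((x - m) ** 2 for x in members) / len(members)) ** 0.5

# Hypothetical day-ahead wind ensembles (m/s) and deterministic errors.
ensembles = [[5.0, 5.2, 4.9], [6.0, 7.5, 4.8], [3.0, 3.1, 2.9],
             [8.0, 10.0, 6.1], [5.5, 5.6, 5.4]]
abs_error = [0.3, 1.2, 0.2, 2.0, 0.25]   # |forecast - observed|, power proxy

spreads = [spread(e) for e in ensembles]
r = pearson(spreads, abs_error)          # low spread pairs with low error
```

A strongly positive `r`, as in this toy sample, is what would justify using spread as a day-ahead accuracy indicator; the abstract reports that this relationship weakens at longer horizons.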
Zhao, Xin; Han, Meng; Ding, Lili; Calin, Adrian Cantemir
2018-01-01
The accurate forecasting of carbon dioxide emissions is critical for policy makers to take proper measures to establish a low carbon society. This paper discusses a hybrid of the mixed data sampling (MIDAS) regression model and a back propagation (BP) neural network (the MIDAS-BP model) to forecast carbon dioxide emissions. The analysis uses mixed frequency data to study the effects of quarterly economic growth on annual carbon dioxide emissions. The forecasting ability of MIDAS-BP is remarkably better than that of the MIDAS, ordinary least squares (OLS), polynomial distributed lags (PDL), autoregressive distributed lags (ADL), and autoregressive moving average (ARMA) models. The MIDAS-BP model is suitable for forecasting carbon dioxide emissions for both the short and longer term. This research is expected to improve the methodology for forecasting carbon dioxide emissions by improving forecast accuracy. Empirical results show that economic growth has both negative and positive effects on carbon dioxide emissions that last 15 quarters. Carbon dioxide emissions are also affected by their own changes within 3 years. Therefore, there is a need for policy makers to explore alternative ways to develop the economy, especially applying new energy policies to establish a low carbon society.
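The MIDAS ingredient, collapsing quarterly regressors into an annual one with a parsimonious lag polynomial, can be sketched with the exponential Almon weighting commonly used in MIDAS regressions (the theta parameters below are illustrative, not estimates from the paper):

```python
import math

def almon_weights(n_lags, theta1, theta2):
    """Exponential Almon lag polynomial used in MIDAS regressions:
    w_k proportional to exp(theta1 * k + theta2 * k^2), normalised to one."""
    raw = [math.exp(theta1 * k + theta2 * k ** 2) for k in range(1, n_lags + 1)]
    s = sum(raw)
    return [w / s for w in raw]

def midas_aggregate(quarterly_growth, theta1=-0.1, theta2=-0.05):
    """Collapse high-frequency (quarterly) regressors into one
    low-frequency (annual) regressor with declining weights."""
    w = almon_weights(len(quarterly_growth), theta1, theta2)
    return sum(wk * xk for wk, xk in zip(w, quarterly_growth))

# Four quarters of hypothetical GDP growth feeding an annual CO2 model.
weights = almon_weights(4, -0.1, -0.05)
x = midas_aggregate([1.8, 1.6, 1.5, 1.4])
```

The aggregated regressor `x` then enters the annual emissions equation; in the paper's hybrid, a BP neural network replaces the linear mapping from `x` to emissions.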
Skin Penetration Enhancement by Natural Oils for Dihydroquercetin Delivery.
Čižinauskas, Vytis; Elie, Nicolas; Brunelle, Alain; Briedis, Vitalis
2017-09-12
Natural oils are commonly used in topical pharmaceutical formulations as emulsifiers, stabilizers or solubility enhancers. They are presented as safe and inert components, mainly used for formulation purposes. It is confirmed that natural oils can affect the skin penetration of various substances, with fatty acids mainly responsible for this effect. Current understanding lacks reliable scientific data on the penetration of natural oils into the skin and their skin penetration enhancement potential. In the current study, fatty acid content analysis was used to determine the principal fatty acids in soybean, olive, avocado, sea-buckthorn pulp, raspberry seed and coconut oils. Time-of-flight secondary ion mass spectrometry bioimaging was used to determine the distribution of these fatty acids in human skin ex vivo after application of the oils. Skin penetration enhancement ratios were determined for a prospective antioxidant compound, dihydroquercetin. The results demonstrated skin penetration of fatty acids from all oils tested. Only soybean and olive oils significantly increased the skin distribution of dihydroquercetin and can be used as skin penetration enhancers. However, no correlation could be determined between the fatty acid composition and skin penetration enhancement using currently available methodological approaches. This indicates that potential chemical penetration enhancement should be evaluated during the formulation of topically applied products containing natural oils.
Potential economic value of drought information to support early warning in Africa
NASA Astrophysics Data System (ADS)
Quiroga, S.; Iglesias, A.; Diz, A.; Garrote, L.
2012-04-01
We present a methodology to estimate the economic value of advanced climate information for food production in Africa under climate change scenarios. The results aim to facilitate better choices in water resources management. The methodology includes 4 sequential steps. First, two contrasting management strategies (with and without early warning) are defined. Second, the associated impacts of the management actions are estimated by calculating the effect of drought on crop productivity under climate change scenarios. Third, the optimal management option is calculated as a function of the drought information and the risk aversion of potential information users. Finally, we use these optimal management simulations to compute the economic value of enhanced water allocation rules to support stable food production in Africa. Our results show how a timely response to climate variations can help reduce losses in food production. The proposed framework is developed within the Dewfora project (Early warning and forecasting systems to predict climate related drought vulnerability and risk in Africa), which aims to improve knowledge on drought forecasting, warning and mitigation, to advance the understanding of climate-related vulnerability to drought, and to develop a prototype operational forecasting system.
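Steps three and four, choosing the optimal action given the forecast and then valuing the information, follow the classic cost-loss framing. A minimal sketch with hypothetical losses, action costs and mitigation effect (none of these numbers come from the Dewfora study):

```python
def expected_loss(act, drought_prob, loss_if_drought, cost_of_action,
                  mitigation=0.6):
    """Expected loss of a strategy: acting costs something up front but
    reduces drought losses by a mitigation factor (values hypothetical)."""
    damage = loss_if_drought * ((1 - mitigation) if act else 1.0)
    return drought_prob * damage + (cost_of_action if act else 0.0)

def value_of_information(true_prob, loss=100.0, cost=10.0):
    """Economic value of the forecast: the uninformed user decides once
    from climatological odds (p = 0.5) and sticks to that decision; the
    informed user chooses the best action given the forecast probability."""
    act_uninformed = (expected_loss(True, 0.5, loss, cost)
                      < expected_loss(False, 0.5, loss, cost))
    baseline = expected_loss(act_uninformed, true_prob, loss, cost)
    informed = min(expected_loss(True, true_prob, loss, cost),
                   expected_loss(False, true_prob, loss, cost))
    return baseline - informed

voi_wet = value_of_information(0.05)   # → 7.0: the warning lets the user
                                       # skip a costly action in a wet year
```

The value is never negative and is largest when the forecast disagrees with climatology, which is why early warning pays off most in anomalous years.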
NASA Astrophysics Data System (ADS)
Concha Larrauri, P.
2015-12-01
Orange production in Florida has experienced a decline over the past decade. Hurricanes in 2004 and 2005 greatly affected production, almost to the same degree as the strong freezes of the 1980s. The spread of the citrus greening disease after the hurricanes has also contributed to the reduction in Florida orange production. The occurrence of hurricanes and diseases cannot easily be predicted, but the additional effects of climate on orange yield can be studied and incorporated into existing production forecasts that are based on physical surveys, such as the October citrus forecast issued every year by the USDA. Specific climate variables occurring before and after the October forecast is issued can affect flowering, orange drop rates, growth, and maturation, and can contribute to the forecast error. Here we present a methodology to incorporate local climate variables to predict the error of the USDA's orange production forecast, and we study the local effects of climate on yield in different counties in Florida. This information can help farmers anticipate what to expect during the orange production cycle, and can help supply chain managers better plan their strategies.
NASA Astrophysics Data System (ADS)
Higgins, S. M. W.; Du, H. L.; Smith, L. A.
2012-04-01
Ensemble forecasting on a lead time of seconds over several years generates a large forecast-outcome archive, which can be used to evaluate and weight "models". Challenges that arise as the archive becomes smaller are investigated: in weather forecasting one typically has only thousands of forecasts; however, those launched 6 hours apart are not independent of each other, nor is it justified to mix seasons with different dynamics. Seasonal forecasts, as from ENSEMBLES and DEMETER, typically have fewer than 64 unique launch dates; decadal forecasts fewer than eight; and long-range climate forecasts arguably none. It is argued that one does not weight "models" so much as entire ensemble prediction systems (EPSs), and that the marginal value of an EPS will depend on the other members in the mix. The impact of using different skill scores is examined in the limit of both very large forecast-outcome archives (thereby evaluating the efficiency of the skill score) and very small forecast-outcome archives (illustrating fundamental limitations due to sampling fluctuations and memory in the physical system being forecast). It is shown that blending with climatology (J. Bröcker and L.A. Smith, Tellus A, 60(4), 663-678, (2008)) tends to increase the robustness of the results; a new kernel dressing methodology (simply ensuring that the expected probability mass tends to lie outside the range of the ensemble) is also illustrated. Fair comparisons using seasonal forecasts from the ENSEMBLES project illustrate the importance of these results with fairly small archives. The robustness of these results across the range of small, moderate and huge archives is demonstrated using imperfect models of perfectly known nonlinear (chaotic) dynamical systems. The implications these results hold for distinguishing the skill of a forecast from its value to a user of the forecast are discussed.
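The climatology-blending device mentioned in this abstract has a one-line core: mix the ensemble-based probabilities with the climatological distribution so that no outcome ever receives zero mass. The tercile categories and the blending weight below are illustrative:

```python
import math

def blend(forecast_pdf, climatology_pdf, alpha):
    """Blend an ensemble-based probability forecast with climatology:
    p = alpha * forecast + (1 - alpha) * climatology. Keeping alpha < 1
    guards against overconfident ensembles in small archives."""
    return [alpha * f + (1 - alpha) * c
            for f, c in zip(forecast_pdf, climatology_pdf)]

def ignorance(pdf, outcome_bin):
    """Ignorance (logarithmic) score of a categorical probability forecast."""
    return -math.log2(pdf[outcome_bin])

# Tercile forecast (below / near / above normal) vs. climatology.
forecast = [0.0, 0.1, 0.9]        # overconfident ensemble: no mass on 'below'
climatology = [1 / 3, 1 / 3, 1 / 3]
blended = blend(forecast, climatology, alpha=0.7)

# If 'below normal' verifies, the raw forecast scores infinitely badly;
# the blended forecast keeps a finite ignorance score.
score_blended = ignorance(blended, 0)
```

This is the robustness mechanism the abstract credits to blending: a single surprising outcome can no longer destroy the skill estimate of an EPS in a small forecast-outcome archive.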
Environmental forecasting and turbulence modeling
NASA Astrophysics Data System (ADS)
Hunt, J. C. R.
This review describes the fundamental assumptions and current methodologies of the two main kinds of environmental forecast; the first is valid for a limited period of time into the future and over a limited space-time ‘target’, and is largely determined by the initial and preceding state of the environment, such as the weather or pollution levels, up to the time when the forecast is issued and by its state at the edges of the region being considered; the second kind provides statistical information over long periods of time and/or over large space-time targets, so that they only depend on the statistical averages of the initial and ‘edge’ conditions. Environmental forecasts depend on the various ways that models are constructed. These range from those based on the ‘reductionist’ methodology (i.e., the combination of separate, scientifically based, models for the relevant processes) to those based on statistical methodologies, using a mixture of data and scientifically based empirical modeling. These are, as a rule, focused on specific quantities required for the forecast. The persistence and predictability of events associated with environmental and turbulent flows and the reasons for variation in the accuracy of their forecasts (of the first and second kinds) are now better understood and better modeled. This has partly resulted from using analogous results of disordered chaotic systems, and using the techniques of calculating ensembles of realizations, ideally involving several different models, so as to incorporate in the probabilistic forecasts a wider range of possible events. The rationale for such an approach needs to be developed. However, other insights have resulted from the recognition of the ordered, though randomly occurring, nature of the persistent motions in these flows, whose scales range from those of synoptic weather patterns (whether storms or ‘blocked’ anticyclones) to small scale vortices. 
These eigenstates can be predicted from the reductionist models or may be modeled specifically, for example, in terms of 'self-organized' critical phenomena. It is noted how in certain applications of turbulence modeling its methods are beginning to resemble those of environmental simulations, because of the trend to introduce 'on-line' controls of turbulent flows in advanced engineering fluid systems. In real-time simulations, for both local environmental processes and these engineering systems, maximum information is needed about the likely flow patterns in order to optimize both the assimilation of limited real-time data and the use of limited real-time computing capacity. It is concluded that philosophical studies of how scientific models develop and of the concept of determinism in science are helpful in considering these complex issues.
Forecasting production in Liquid Rich Shale plays
NASA Astrophysics Data System (ADS)
Nikfarman, Hanieh
Production from Liquid Rich Shale (LRS) reservoirs is taking center stage in the exploration and production of unconventional reservoirs. Production from the low and ultra-low permeability LRS plays is possible only through multi-fractured horizontal wells (MFHW's). There is no existing workflow that is applicable to forecasting multi-phase production from MFHW's in LRS plays. This project presents a practical and rigorous workflow for forecasting multiphase production from MFHW's in LRS reservoirs. There has been much effort in developing workflows and methodology for forecasting in tight/shale plays in recent years. The existing workflows, however, are applicable only to single phase flow, and are primarily used in shale gas plays. These methodologies do not apply to the multi-phase flow that is inevitable in LRS plays. To account for complexities of multiphase flow in MFHW's the only available technique is dynamic modeling in compositional numerical simulators. These are time consuming and not practical when it comes to forecasting production and estimating reserves for a large number of producers. A workflow was developed, and validated by compositional numerical simulation. The workflow honors physics of flow, and is sufficiently accurate while practical so that an analyst can readily apply it to forecast production and estimate reserves in a large number of producers in a short period of time. To simplify the complex multiphase flow in MFHW, the workflow divides production periods into an initial period where large production and pressure declines are expected, and the subsequent period where production decline may converge into a common trend for a number of producers across an area of interest in the field. Initial period assumes the production is dominated by single-phase flow of oil and uses the tri-linear flow model of Erdal Ozkan to estimate the production history. 
Commercial software readily available can simulate flow and forecast production in this period. In the subsequent period, dimensionless rate and dimensionless time functions are introduced that help identify the transition from the initial period into the subsequent period. The production trends, in terms of the dimensionless parameters, converge for a range of rock permeability and stimulation intensity. This helps forecast production beyond the transition to the end of well life. This workflow is applicable to a single fluid system.
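The two-period idea can be sketched numerically: a steep early decline, then a shared tail expressed in dimensionless time. The exponential/hyperbolic forms and all parameter values below are illustrative stand-ins, not Ozkan's tri-linear model or the workflow's actual correlations.

```python
import numpy as np

def forecast_rate(t, q_i, D_i, t_switch, b_tail=0.5):
    """Two-period rate forecast (illustrative, not the tri-linear model).

    Initial period: steep exponential decline dominated by single-phase flow.
    Subsequent period: a hyperbolic trend in dimensionless time t_D = t/t_switch,
    scaled by the rate at the transition so the two segments join continuously.
    """
    t = np.asarray(t, dtype=float)
    q_switch = q_i * np.exp(-D_i * t_switch)            # rate at the transition
    early = q_i * np.exp(-D_i * t)                      # initial-period decline
    t_D = t / t_switch                                  # dimensionless time
    late = q_switch / (1.0 + b_tail * D_i * t_switch * (t_D - 1.0)) ** (1.0 / b_tail)
    return np.where(t <= t_switch, early, late)

t = np.linspace(0.0, 3650.0, 100)                       # ten years, in days
q = forecast_rate(t, q_i=500.0, D_i=0.004, t_switch=365.0)
```

Because the tail is written in t_D, producers with different transition times collapse onto the same dimensionless trend, which is the convergence property the workflow exploits.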
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yishen; Zhou, Zhi; Liu, Cong
2016-08-01
As more wind power and other renewable resources are being integrated into the electric power grid, forecast uncertainty brings operational challenges for power system operators. In this report, different operational strategies for uncertainty management are presented and evaluated. A comprehensive and consistent simulation framework is developed to analyze the performance of different reserve policies and scheduling techniques under uncertainty in wind power. Numerical simulations are conducted on a modified version of the IEEE 118-bus system with a 20% wind penetration level, comparing deterministic, interval, and stochastic unit commitment strategies. The results show that stochastic unit commitment provides a more reliable schedule without large increases in operational costs. Moreover, decomposition techniques, such as load shift factor and Benders decomposition, can help in overcoming the computational obstacles to stochastic unit commitment and enable the use of a larger scenario set to represent forecast uncertainty. In contrast, deterministic and interval unit commitment tend to give higher system costs as more reserves are scheduled to address forecast uncertainty; however, these approaches require a much lower computational effort. Choosing a proper lower bound for the forecast uncertainty is important for balancing reliability and system operational cost in deterministic and interval unit commitment. Finally, we find that the introduction of zonal reserve requirements improves reliability, but at the expense of higher operational costs.
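The deterministic-versus-stochastic trade-off can be illustrated with a deliberately tiny example. The unit data, load, scenario set, and penalty below are invented for illustration, not taken from the IEEE 118-bus study.

```python
from itertools import product

# Toy one-hour unit-commitment comparison: two thermal units, three wind scenarios.
units = [  # (no-load cost $, marginal cost $/MWh, capacity MW)
    (1000.0, 20.0, 300.0),   # baseload unit
    (200.0, 80.0, 150.0),    # peaking unit
]
load = 400.0                                               # MW, fixed demand
scenarios = [(0.25, 50.0), (0.50, 100.0), (0.25, 150.0)]   # (probability, wind MW)
VOLL = 5000.0                                              # $/MWh unserved-energy penalty

def dispatch_cost(on, wind):
    """Merit-order dispatch of the committed units against the net load."""
    net = max(load - wind, 0.0)
    cost = sum(u[0] for u, s in zip(units, on) if s)       # no-load costs
    for nl, mc, cap in sorted((u for u, s in zip(units, on) if s),
                              key=lambda u: u[1]):          # cheapest first
        take = min(net, cap)
        cost += mc * take
        net -= take
    return cost + VOLL * net                               # penalize any shortfall

def expected_cost(on):
    return sum(p * dispatch_cost(on, w) for p, w in scenarios)

# Stochastic UC commits against the whole scenario set ...
stoch = min(product([0, 1], repeat=2), key=expected_cost)
# ... while deterministic UC commits against the expected wind only,
# and then must face the actual scenarios with that commitment.
mean_wind = sum(p * w for p, w in scenarios)
det = min(product([0, 1], repeat=2), key=lambda on: dispatch_cost(on, mean_wind))
```

Here the deterministic schedule skips the peaker because the mean wind covers the load, and pays a large expected shortfall penalty in the low-wind scenario; the stochastic schedule commits both units, mirroring the report's finding that stochastic UC buys reliability cheaply.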
A hybrid spatiotemporal drought forecasting model for operational use
NASA Astrophysics Data System (ADS)
Vasiliades, L.; Loukas, A.
2010-09-01
Drought forecasting plays an important role in the planning and management of natural resources and water resource systems in a river basin. Early and timely forecasting of a drought event can help in taking proactive measures and setting out drought mitigation strategies to alleviate the impacts of drought. Spatiotemporal data mining is the extraction of unknown and implicit knowledge, structures, spatiotemporal relationships, or patterns not explicitly stored in spatiotemporal databases. As one of the data mining techniques, forecasting is widely used to predict the unknown future based upon the patterns hidden in the current and past data. This study develops a hybrid spatiotemporal scheme for integrated spatial and temporal forecasting. Temporal forecasting is achieved using feed-forward neural networks, and the temporal forecasts are extended to the spatial dimension using a spatial recurrent neural network model. The methodology is demonstrated for an operational meteorological drought index, the Standardized Precipitation Index (SPI), calculated at multiple timescales. 48 precipitation stations and 18 independent precipitation stations, located in the Pinios river basin in the Thessaly region, Greece, were used for the development and spatiotemporal validation of the hybrid spatiotemporal scheme. Several quantitative temporal and spatial statistical indices were considered for the performance evaluation of the models. Furthermore, qualitative statistical criteria based on contingency tables between observed and forecasted drought episodes were calculated. The results show that the lead time of forecasting for operational use depends on the SPI timescale. The hybrid spatiotemporal drought forecasting model could be used operationally for forecasting up to three months ahead for short SPI timescales (e.g. 3-6 months) and up to six months ahead for large SPI timescales (e.g. 24 months). The above findings could be useful in developing a drought preparedness plan in the region.
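The SPI underlying the forecasts can be sketched at a 3-month timescale. The version below uses a rank-based (empirical) normalization rather than the per-month gamma fit of the operational index, and the precipitation series is synthetic.

```python
import numpy as np
from statistics import NormalDist

def empirical_spi(precip, scale=3):
    """Empirical SPI: aggregate precipitation over `scale` months, then map
    each sum through its empirical CDF to a standard-normal quantile.
    (Operational SPI fits a gamma distribution per calendar month; this
    rank-based version is a simplified illustration.)"""
    p = np.asarray(precip, dtype=float)
    sums = np.convolve(p, np.ones(scale), mode="valid")   # rolling 3-month totals
    n = len(sums)
    ranks = sums.argsort().argsort() + 1                  # ranks 1..n (no ties expected)
    cdf = (ranks - 0.5) / n                               # plotting positions in (0, 1)
    nd = NormalDist()
    return np.array([nd.inv_cdf(c) for c in cdf])         # z-scores: SPI values

rng = np.random.default_rng(0)
monthly = rng.gamma(shape=2.0, scale=40.0, size=120)      # 10 years of monthly totals
spi3 = empirical_spi(monthly, scale=3)                    # SPI-3 series
```

Negative SPI values mark drier-than-normal periods (an SPI below about -1 is commonly read as drought onset), which is the target the neural networks are trained to forecast.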
1979-10-01
expansion, relative stopping power according to one or another formula or test method, penetration, ricochet, and other fragments. In the past, the solution...the data gathered for each test round, in the following documents: a. "Ammunition For Law Enforcement: Part II, Data Obtained for Bullets Penetrating...high velocity testing , chamber pressures exceeded those permissible in standard handguns. For safety, then, Mann test barrels were used. At this point
Application of SeaWinds Scatterometer and TMI-SSM/I Rain Rates to Hurricane Analysis and Forecasting
NASA Technical Reports Server (NTRS)
Atlas, Robert; Hou, Arthur; Reale, Oreste
2004-01-01
Results provided by two different assimilation methodologies involving data from passive and active space-borne microwave instruments are presented. The impact of the precipitation estimates produced by the TRMM Microwave Imager (TMI) and Special Sensor Microwave/Imager (SSM/I) in a previously developed 1D variational continuous assimilation algorithm for assimilating tropical rainfall is shown on two hurricane cases. Results on the impact of the SeaWinds scatterometer on the intensity and track forecast of a mid-Atlantic hurricane are also presented. This work is the outcome of a collaborative effort between NASA and NOAA and indicates the substantial improvement in tropical cyclone forecasting that can result from the assimilation of space-based data in global atmospheric models.
An evolving-requirements technology assessment process for advanced propulsion concepts
NASA Astrophysics Data System (ADS)
McClure, Erin Kathleen
The following dissertation investigates the development of a methodology suitable for the evaluation of advanced propulsion concepts. At early stages of development, both the future performance of these concepts and their requirements are highly uncertain, making it difficult to forecast their future value. Developing advanced propulsion concepts requires a huge investment of resources. The methodology was developed to enhance decision-makers' understanding of the concepts, so that they can mitigate the risks associated with developing such concepts. A systematic methodology to identify potential advanced propulsion concepts and assess their robustness is necessary to reduce the risk of developing advanced propulsion concepts. Existing advanced design methodologies have evaluated the robustness of technologies or concepts to variations in requirements, but they are not suitable for evaluating a large number of dissimilar concepts. Variations in requirements have been shown to impact the development of advanced propulsion concepts, and any method designed to evaluate these concepts must incorporate the possible variations of the requirements into the assessment. In order to do so, a methodology was formulated to be capable of accounting for two aspects of the problem. First, it had to systematically identify a probabilistic distribution for the future requirements. Such a distribution would allow decision-makers to quantify the uncertainty introduced by variations in requirements. Second, the methodology must be able to assess the robustness of the propulsion concepts as a function of that distribution. This dissertation describes these enabling elements in depth and proceeds to synthesize them into a new method, the Evolving Requirements Technology Assessment (ERTA).
As a proof of concept, the ERTA method was used to evaluate and compare advanced propulsion systems that will be capable of powering a hurricane tracking, High Altitude, Long Endurance (HALE) unmanned aerial vehicle (UAV). The use of the ERTA methodology to assess HALE UAV propulsion concepts demonstrated that potential variations in requirements do significantly impact the assessment and selection of propulsion concepts. The proof of concept also demonstrated that traditional forecasting techniques, such as the cross impact analysis, could be used to forecast the requirements for advanced propulsion concepts probabilistically. "Fitness", a measure of relative goodness, was used to evaluate the concepts. Finally, stochastic optimizations were used to evaluate the propulsion concepts across the range of requirement sets that were considered.
Long-term effects of health factor modification in Milwaukee County.
Shi, Lu; van Meijgaard, Jeroen; Fielding, Jonathan E
2013-01-01
We use the UCLA Health Forecasting Tool to forecast the 2011-2050 health trends in Milwaukee County. We first simulate a baseline scenario (S-1) that assumes no health behavior change, and compare this with three simulated intervention scenarios: expansion of Quitline reach to enhance smoking cessation (S-2), an increased penetration of diabetes screening (S-3) and construction of additional recreational facilities (S-4). We compared the disease-free life years (DFLY) gained from each intervention scenario by 2050 on a year-by-year and cumulative basis. Simulation results show that increasing access to recreational facilities achieves the greatest gain in DFLYs for every year from 2011 to 2050. By 2050, the cumulative DFLY gain is 22 393, 5956 and 41 396 for S-2, S-3, and S-4, respectively. The cost-effectiveness ratios for Quitline expansion, diabetes screening, and recreational facility construction are $1802, $1285, and $1322, per DFLY gained, respectively.
NASA Astrophysics Data System (ADS)
Mohammed, Touseef Ahmed Faisal
Since 2000, renewable electricity installations in the United States (excluding hydropower) have more than tripled. Renewable electricity has grown at a compounded annual average of nearly 14% per year from 2000-2010. Wind, Concentrated Solar Power (CSP) and solar Photo Voltaic (PV) are the fastest growing renewable energy sectors. In 2010 in the U.S., solar PV grew over 71% and CSP grew by 18% from the previous year. Globally, renewable electricity installations have more than quadrupled from 2000-2010. Solar PV generation grew by a factor of more than 28 between 2000 and 2010. The number of CSP and solar PV installations on the distribution grid is increasing. These PV installations transmit electrical current from the load centers back toward the generating stations, but the transmission and distribution grids were designed for uni-directional flow of electrical energy from generating stations to load centers. This causes imbalances in the voltage and switchgear of the electrical circuitry. With the continuous rise in PV installations, analysis of voltage profiles and penetration levels remains an active area of research. Standard distributed photovoltaic (PV) generators represented in simulation studies do not reflect the exact location and variability properties, such as the distance between interconnection points and substations, voltage regulators, solar irradiance, and other environmental factors. Quasi-static simulations assist in peak-load planning an hour and a day ahead, as they give a time-sequence analysis that helps in generation allocation. Simulation models can be daily, hourly or yearly depending on the duty cycle and dynamics of the system. High penetration of PV into the power grid changes the voltage profile and power flow dynamically in the distribution circuits due to the inherent variability of PV. There are a number of modeling and simulation tools available for the study of such high-penetration PV scenarios.
This thesis will specifically utilize OpenDSS, an open-source Distribution System Simulator developed by the Electric Power Research Institute, to simulate the grid voltage profile with a large-scale PV system under quasi-static time series, considering variations of PV output in seconds and minutes and the average daily load variations. A 13-bus IEEE distribution feeder model with distributed residential- and commercial-scale PV at different buses is utilized for the simulation studies. Time-series simulations are discussed for various modes of operation considering dynamic PV penetration at different time periods in a day. In addition, this thesis demonstrates simulations taking into account the presence of moving cloud cover for solar forecasting studies.
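At its simplest, a quasi-static time-series study amounts to sweeping PV penetration against an hourly load shape and flagging reverse-flow hours. The load and irradiance shapes below are generic assumptions for illustration only; they do not use the OpenDSS API or the IEEE 13-bus feeder data.

```python
import numpy as np

hours = np.arange(24)
# Assumed shapes (per-unit): midday-peaking clear-sky PV and a two-peak feeder load.
pv_shape = np.clip(np.sin((hours - 6) / 12 * np.pi), 0, None)
load = (0.6
        + 0.25 * np.exp(-((hours - 19) ** 2) / 8)    # evening peak
        + 0.15 * np.exp(-((hours - 8) ** 2) / 6))    # morning shoulder

def reverse_flow_hours(penetration):
    """Quasi-static sweep: net feeder load each hour with PV sized as a
    fraction of peak load; reverse flow when PV output exceeds the load."""
    pv = penetration * load.max() * pv_shape
    return int(np.sum(load - pv < 0))

# Hours of reverse power flow at increasing penetration levels.
counts = {p: reverse_flow_hours(p) for p in (0.2, 0.5, 1.0, 1.5)}
```

Even this caricature reproduces the qualitative result that motivates the thesis: reverse-flow hours appear only once penetration is high, because midday PV output overtakes the midday load trough.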
Nonlinear problems in data-assimilation : Can synchronization help?
NASA Astrophysics Data System (ADS)
Tribbia, J. J.; Duane, G. S.
2009-12-01
Over the past several years, operational weather centers have initiated ensemble prediction and assimilation techniques to estimate the error covariance of forecasts in the short and medium range. The ensemble techniques used are based on linear methods. This technique has been shown to be a useful indicator of skill in the linear range, where forecast errors are small relative to climatological variance. While this advance has been impressive, there are still ad hoc aspects of its use in practice, like the need for covariance inflation, which are troubling. Furthermore, to be of utility in the nonlinear range, an ensemble assimilation and prediction method must be capable of giving probabilistic information for the situation where a probability density forecast becomes multi-modal. A prototypical, simplest example of such a situation is the planetary-wave regime transition, where the pdf is bimodal. Our recent research shows how the inconsistencies and extensions of linear methodology can be consistently treated using the paradigm of synchronization, which views the problems of assimilation and forecasting as that of optimizing the forecast model state with respect to the future evolution of the atmosphere.
2013-01-01
Background In Japan, a shortage of physicians, who serve a key role in healthcare provision, has been pointed out as a major medical issue. Healthcare workforce policy planners should consider future dynamic changes in physician numbers. The purpose of this study was to propose a physician supply forecasting methodology by applying system dynamics modeling to estimate future absolute and relative numbers of physicians. Method We constructed a forecasting model using a system dynamics approach. Forecasting of the number of physicians was performed for all clinical physicians and for OB/GYN specialists. Moreover, we conducted an evaluation of the sufficiency of the number of physicians and a sensitivity analysis. Result & conclusion As a result, it was forecast that the number of physicians would increase during 2008–2030 and the shortage would resolve by 2026 for all clinical physicians. However, the shortage of OB/GYN specialists would not resolve within the period covered. This suggests a need for measures for reconsidering the allocation system of new entry physicians to resolve maldistribution between medical departments and, in addition, for increasing the overall number of clinical physicians. PMID:23981198
Ishikawa, Tomoki; Ohba, Hisateru; Yokooka, Yuki; Nakamura, Kozo; Ogasawara, Katsuhiko
2013-08-27
In Japan, a shortage of physicians, who serve a key role in healthcare provision, has been pointed out as a major medical issue. Healthcare workforce policy planners should consider future dynamic changes in physician numbers. The purpose of this study was to propose a physician supply forecasting methodology by applying system dynamics modeling to estimate future absolute and relative numbers of physicians. We constructed a forecasting model using a system dynamics approach. Forecasting of the number of physicians was performed for all clinical physicians and for OB/GYN specialists. Moreover, we conducted an evaluation of the sufficiency of the number of physicians and a sensitivity analysis. As a result, it was forecast that the number of physicians would increase during 2008-2030 and the shortage would resolve by 2026 for all clinical physicians. However, the shortage of OB/GYN specialists would not resolve within the period covered. This suggests a need for measures for reconsidering the allocation system of new entry physicians to resolve maldistribution between medical departments and, in addition, for increasing the overall number of clinical physicians.
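The stock-and-flow core of such a system dynamics model can be sketched in a few lines. The initial stock, entry and exit rates, and demand level below are invented round numbers, not the study's calibrated Japanese data.

```python
# Minimal stock-and-flow sketch of physician supply:
# physicians(t+1) = physicians(t) + entries - exits.
def project_supply(stock, entries_per_year, exit_rate, demand, years):
    """Integrate the stock forward and return the trajectory plus the first
    year in which supply meets demand (None if it never does)."""
    history, met_year = [stock], None
    for year in range(1, years + 1):
        stock = stock + entries_per_year - exit_rate * stock
        history.append(stock)
        if met_year is None and stock >= demand:
            met_year = year
    return history, met_year

history, met_year = project_supply(
    stock=280_000, entries_per_year=8_000, exit_rate=0.02,
    demand=320_000, years=22)
```

With these assumed rates the stock converges toward entries/exit_rate (here 400,000), so the model both forecasts the year the shortage resolves and exposes the levers (entry allocation, exit rate) that the abstract's sensitivity analysis varies.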
System learning approach to assess sustainability and ...
This paper presents a methodology that combines the power of an Artificial Neural Network and Information Theory to forecast variables describing the condition of a regional system. The novelty and strength of this approach lie in the application of Fisher information, a key method in Information Theory, to preserve trends in the historical data and prevent overfitting of projections. The methodology was applied to demographic, environmental, food and energy consumption, and agricultural production in the San Luis Basin regional system in Colorado, U.S.A. These variables are important for tracking conditions in human and natural systems. However, available data are often so far out of date that they limit the ability to manage these systems, and techniques like regression and simulation are not sufficient on their own, because system characteristics must be modeled in ways that risk oversimplifying complex dynamics. Results indicate that the approaches developed provide viable tools for forecasting outcomes with the aim of assisting management toward sustainable trends. This methodology is also applicable for modeling different scenarios in other dynamic systems.
Hui, Xiaoying; Lamel, Sonia; Qiao, Peter; Maibach, Howard I
2013-03-01
Cutaneously directed chemical warfare agents can elicit significant morbidity and mortality. The optimization of prophylactic and therapeutic interventions counteracting these agents is crucial, and the development of decontamination protocols and methodology for post-dermal-exposure risk assessments would additionally be applicable to common industrial and consumer dermatotoxicants. Percutaneous (PC) penetration is often considered a simple one-step diffusion process but presently is understood to consist of at least 15 steps. The systemic exposure to an agent depends on multiple factors, and the second part of this review covers absorption and excretion kinetics, wash and rub effects, and skin substantivity and transfer, among others. Importantly, the partitioning behavior and diffusion through the stratum corneum (SC) of a wide physicochemical array of compounds shows that many compounds have approximately the same diffusion coefficient, which determines their percutaneous absorption in vivo. After accounting for anatomical variation of the SC, the penetration flux value of a substance depends mainly on its SC/vehicle partition coefficient. Additionally, the SC acts as a 'reservoir' for topically applied molecules, and tape stripping methodology can quantify the chemical remaining in the SC, which can predict the total molecular penetration in vivo. The determination of ideal decontamination protocols is of utmost importance to reduce morbidity and mortality. However, even expeditious standard washing procedures after dermal chemical exposure often fail to remove chemicals. The second part of this overview continues to review percutaneous penetration, extending insights into the complexities of penetration, decontamination and potential newer assays that may be of practical importance. Copyright © 2012 John Wiley & Sons, Ltd.
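The partitioning argument reduces, at steady state, to Fick's first law across the SC: flux J = Kp x Cv, with the permeability coefficient Kp built from the partition coefficient, diffusion coefficient, and SC thickness. The numbers below are invented for illustration, not measured values from the review.

```python
# Back-of-envelope steady-state flux sketch (Fick's first law across the SC):
#   J = Kp * Cv,   Kp = K_sc/veh * D / L
# All values are assumed, illustrative inputs.
K_partition = 0.8        # SC/vehicle partition coefficient (dimensionless)
D_sc = 1e-13             # diffusion coefficient within the SC, m^2/s
L_sc = 15e-6             # SC diffusion path length, m
C_vehicle = 10.0         # concentration of the agent in the vehicle, g/m^3

Kp = K_partition * D_sc / L_sc    # permeability coefficient, m/s
J = Kp * C_vehicle                # steady-state flux, g/(m^2 s)
```

Because D is roughly similar across many compounds, Kp (and hence J) is dominated by the partition coefficient, which is the review's central point.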
Verifying and Postprocesing the Ensemble Spread-Error Relationship
NASA Astrophysics Data System (ADS)
Hopson, Tom; Knievel, Jason; Liu, Yubao; Roux, Gregory; Wu, Wanli
2013-04-01
With the increased utilization of ensemble forecasts in weather and hydrologic applications, there is a need to verify their benefit over less expensive deterministic forecasts. One such potential benefit of ensemble systems is their capacity to forecast their own forecast error through the ensemble spread-error relationship. The paper begins by revisiting the limitations of the Pearson correlation alone in assessing this relationship. Next, we introduce two new metrics to consider in assessing the utility of an ensemble's varying dispersion. We argue there are two aspects of an ensemble's dispersion that should be assessed. First, and perhaps more fundamentally: is there enough variability in the ensemble's dispersion to justify the maintenance of an expensive ensemble prediction system (EPS), irrespective of whether the EPS is well-calibrated or not? To diagnose this, the factor that controls the theoretical upper limit of the spread-error correlation can be useful. Secondly, does the variable dispersion of an ensemble relate to a variable expectation of forecast error? Representing the spread-error correlation in relation to its theoretical limit can provide a simple diagnostic of this attribute. A context for these concepts is provided by assessing two operational ensembles: 30-member Western US temperature forecasts for the U.S. Army Test and Evaluation Command and 51-member Brahmaputra River flow forecasts of the Climate Forecast and Applications Project for Bangladesh. Both of these systems utilize a postprocessing technique based on quantile regression (QR) under a step-wise forward selection framework, leading to ensemble forecasts with both good reliability and sharpness. In addition, the methodology utilizes the ensemble's ability to self-diagnose forecast instability to produce calibrated forecasts with informative skill-spread relationships.
We will describe both ensemble systems briefly, review the steps used to calibrate the ensemble forecast, and present verification statistics using error-spread metrics, along with figures from operational ensemble forecasts before and after calibration.
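The spread-error correlation and a practical stand-in for its upper limit can be estimated as below. The "perfect model" benchmark, in which each day's error really is drawn with standard deviation equal to that day's spread, is one simple Monte-Carlo way to approximate the attainable limit; the data are synthetic, not the operational ensembles.

```python
import numpy as np

rng = np.random.default_rng(1)

def spread_error_corr(spread, error):
    """Pearson correlation between ensemble spread and absolute forecast error."""
    return np.corrcoef(spread, np.abs(error))[0, 1]

# Day-to-day varying spread over 2000 synthetic forecast cases.
spread = rng.uniform(0.5, 3.0, size=2000)

# Perfect-model benchmark: errors drawn with std equal to the spread; the
# resulting correlation approximates the upper limit for this spread climate.
perfect_err = rng.normal(0.0, spread)
upper = spread_error_corr(spread, perfect_err)

# A mis-calibrated system (extra spread-independent noise) falls below it.
noisy_err = rng.normal(0.0, 0.5 * spread + 1.5)
actual = spread_error_corr(spread, noisy_err)
```

Reporting `actual` relative to `upper`, rather than alone, mirrors the paper's point: a modest raw correlation can still be close to the best achievable for a given spread climatology.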
A probabilistic neural network based approach for predicting the output power of wind turbines
NASA Astrophysics Data System (ADS)
Tabatabaei, Sajad
2017-03-01
Finding authentic predictive tools that account for the uncertainty of wind speed forecasts is highly required as wind power sources penetrate strongly. Traditional predictive models that generate point forecasts are no longer considered trustworthy. Thus, the present paper aims at utilising the concept of prediction intervals (PIs) to assess the uncertainty of wind power generation in power systems. Besides, this paper uses a recently introduced non-parametric approach called lower upper bound estimation (LUBE) to build the PIs, since the forecasting errors cannot be modelled properly by applying probability distribution functions. In the proposed LUBE method, a PI combination-based fuzzy framework is used to overcome the performance instability of the neural networks (NNs) used in LUBE. In comparison to other methods, this formulation more suitably satisfies the PI coverage probability and PI normalised average width (PINAW). Since this non-linear problem has high complexity, a new heuristic-based optimisation algorithm comprising a novel modification is introduced to solve the aforesaid problems. Based on data sets taken from a wind farm in Australia, the feasibility and satisfying performance of the suggested method have been investigated.
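The two interval-quality criteria named above are straightforward to compute; a sketch follows, with invented observations and a symmetric interval around a hypothetical point forecast (real LUBE intervals are produced directly by the NN, not by widening a point forecast).

```python
import numpy as np

def picp(lower, upper, y):
    """PI coverage probability: fraction of observations inside the interval."""
    return np.mean((y >= lower) & (y <= upper))

def pinaw(lower, upper, y):
    """PI normalised average width: mean width divided by the target's range."""
    return np.mean(upper - lower) / (np.max(y) - np.min(y))

# Toy wind-power observations and intervals (illustrative numbers only).
y = np.array([10.0, 14.0, 9.0, 22.0, 16.0, 12.0])
fc = np.array([11.0, 13.0, 10.0, 18.0, 15.0, 13.0])   # hypothetical point forecast
lower, upper = fc - 3.0, fc + 3.0

coverage = picp(lower, upper, y)   # should meet or exceed the nominal level
width = pinaw(lower, upper, y)     # smaller is sharper
```

LUBE-style training trades these two off: widening the intervals raises coverage but worsens PINAW, which is why both enter the objective.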
Solving the Meteorological Challenges of Creating a Sustainable Energy System (Invited)
NASA Astrophysics Data System (ADS)
Marquis, M.
2010-12-01
Global energy demand is projected to double from 13 TW at the start of this century to 28 TW by the middle of the century. This translates into obtaining 1000 MW (1 GW, the amount produced by an average nuclear or coal power plant) of new energy every single day for the next 40 years. The U.S. Department of Energy has conducted three feasibility studies in the last two years identifying the costs, challenges, impacts, and benefits of generating large portions of the nation’s electricity from wind and solar energy in the next two decades. The 20% Wind by 2030 report found that the nation could meet one-fifth of its electricity demand from wind energy by 2030. The second report, the Eastern Wind Integration and Transmission Study, considered similar costs, challenges, and benefits, but for 20% wind energy in the Eastern Interconnect only, with a target date of 2024. The third report, the Western Wind and Solar Integration Study, considered the operational impact of up to 35% penetration of wind, photovoltaics (PV), and concentrating solar power (CSP) on the power system operated by the WestConnect group, with a target date of 2017. All three studies concluded that it is technically feasible to obtain these high penetration levels of renewable energy, but that increased balancing-area cooperation or coordination, increased utilization of transmission and, in some cases, new transmission construction, and improved weather forecasts are needed. Current energy systems were designed for dispatchable fuels, such as coal, natural gas and nuclear energy. Fitting weather-driven renewable energy into today's energy system is like fitting a square peg into a round hole. If society chooses to meet a significant portion of new energy demand from weather-driven renewable energy, such as wind and solar energy, a number of obstacles must be overcome. Some of these obstacles are meteorological and climatological issues that are amenable to scientific research.
For variable renewable energy sources to reach high penetration levels, electric system operators and utilities need better atmospheric observations, models, and forecasts. Current numerical weather prediction models have not been optimized to help the nation use renewable energy. Improved meteorological observations (e.g., wind-turbine hub-height wind speeds, surface direct and diffuse solar radiation), as well as observations through a deeper layer of the atmosphere for assimilation into NWP models, are needed. Particularly urgent is the need for improved forecasts of ramp events. Longer-term predictions of renewable resources, on the seasonal to decadal scale, are also needed. Improved understanding of the variability and co-variability of wind and solar energy, as well as their correlations with large-scale climate drivers, would assist decision-makers in long-term planning. This talk will discuss the feasibility and benefits of developing enhanced weather forecasts and climate information specific to the needs of a growing renewable energy infrastructure.
Corridor-based forecasts of work-zone impacts for freeways.
DOT National Transportation Integrated Search
2011-08-09
This project developed an analysis methodology and associated software implementation for the evaluation of : significant work zone impacts on freeways in North Carolina. The FREEVAL-WZ software tool allows the analyst : to predict the operational im...
Transportation Sector Module - NEMS Documentation
2017-01-01
Documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model.
Confocal laser scanning microscopy to estimate nanoparticles' human skin penetration in vitro.
Zou, Ying; Celli, Anna; Zhu, Hanjiang; Elmahdy, Akram; Cao, Yachao; Hui, Xiaoying; Maibach, Howard
2017-01-01
With rapid development of nanotechnology, there is increasing interest in nanoparticle (NP) application and its safety and efficacy on human skin. In this study, we utilized confocal laser scanning microscopy to estimate NP skin penetration. Three different-sized polystyrene NPs marked with red fluorescence were applied to human skin, and Calcium Green 5N was used as a counterstain. Dimethyl sulfoxide (DMSO) and ethanol were used as alternative vehicles for NPs. Tape stripping was utilized as a barrier-damaged skin model. Skin biopsies dosed with NPs were incubated at 4°C or 37°C for 24 hours and imaged using confocal laser scanning microscopy. NPs were localized in the stratum corneum (SC) and hair follicles without penetrating the epidermis/dermis. Barrier alteration with tape stripping and change in incubation temperature did not induce deeper penetration. DMSO enhanced NP SC penetration but ethanol did not. Except with DMSO vehicle, these hydrolyzed polystyrene NPs did not penetrate intact or barrier-damaged human "viable" epidermis. For further clinical relevance, in vivo human skin studies and more sensitive analytic chemical methodology are suggested.
NASA Astrophysics Data System (ADS)
Ovchinnikov, I. I.; Snezhkina, O. V.; Ovchinnikov, I. G.
2017-11-01
The task of modeling the kinetics of chloride-containing medium penetration into reinforced concrete construction elements that have partially damaged anti-corrosion protective coatings is discussed. As a result of the damage, chlorides penetrate the construction element via local surface areas, which leads to a non-uniform distribution of chlorides. The kinetics of chloride penetration is described by a diffusion equation, which was solved using the CONDUCT software package of Professor S. Patankar. The methodology used to solve the diffusion equation is described. The results of the evaluation of the concentration field in the axial section of a centrally reinforced cylindrical construction element are given. The chloride diffusion was symmetrical about the axis; the medium was applied through a central ring area equal to one third of the side surface area, while the rest of the surface was isolated. It was shown that the evaluation methodology and its algorithm allow one to evaluate the concentration field of chlorides in reinforced concrete structural elements under local or asymmetrical action of the chloride-containing medium. The example given illustrates that after a certain time interval the critical concentration of chlorides develops even in protected areas located far from the initial damaged area. This means that the corrosion destruction of reinforced elements develops not only in the immediate damage area, but also further away from it.
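The governing diffusion equation (Fick's second law, dC/dt = D d2C/dx2) can be sketched with an explicit finite-difference scheme. This is a one-dimensional simplification of the paper's axisymmetric problem, with illustrative material parameters, and does not use the CONDUCT package.

```python
import numpy as np

# 1-D explicit finite-difference sketch of chloride ingress through an
# exposed (coating-damaged) surface patch. All parameters are illustrative.
D = 1e-12                        # m^2/s, chloride diffusion coefficient in concrete
L = 0.05                         # m, cover depth
nx = 51
dx = L / (nx - 1)
dt = 0.4 * dx * dx / D           # explicit stability requires dt <= dx^2 / (2 D)

C = np.zeros(nx)                 # relative chloride concentration, 0..1
C[0] = 1.0                       # surface concentration at the damaged area

years = 10
steps = int(years * 365.25 * 24 * 3600 / dt)
for _ in range(steps):
    # interior update: C_i += r * (C_{i+1} - 2 C_i + C_{i-1}), r = D dt / dx^2
    C[1:-1] += D * dt / dx**2 * (C[2:] - 2 * C[1:-1] + C[:-2])
    C[0] = 1.0                   # exposed boundary held at the surface value
    # C[-1] is left untouched: a zero-concentration far boundary
```

After ten simulated years the front has advanced on the order of sqrt(D t), about 18 mm here, which is why chlorides eventually reach regions far from the damaged patch, as the paper's axisymmetric example shows.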
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baggu, Murali; Giraldez, Julieta; Harris, Tom
In an effort to better understand the impacts of high penetrations of photovoltaic (PV) generators on distribution systems, Arizona Public Service and its partners completed a multi-year project to develop the tools and knowledge base needed to safely and reliably integrate high penetrations of utility- and residential-scale PV. Building upon the APS Community Power Project-Flagstaff Pilot, this project investigates the impact of PV on a representative feeder in northeast Flagstaff. To quantify and catalog the effects of the estimated 1.3 MW of PV that will be installed on the feeder (both smaller units at homes and large, centrally located systems), high-speed weather and electrical data acquisition systems and digital 'smart' meters were designed and installed to facilitate monitoring and to build and validate comprehensive, high-resolution models of the distribution system. These models are being developed to analyze the impacts of PV on distribution circuit protection systems (including coordination and anti-islanding), predict voltage regulation and phase balance issues, and develop volt/VAr control schemes. This paper continues from a paper presented at the 2014 IEEE PVSC conference that described feeder model evaluation and high penetration advanced scenario analysis, specifically feeder reconfiguration. This paper presents results from Phase 5 of the project. Specifically, the paper discusses tool automation; interconnection assessment methodology and cost benefit analysis.
A 30-day-ahead forecast model for grass pollen in north London, United Kingdom.
Smith, Matt; Emberlin, Jean
2006-03-01
A 30-day-ahead forecast method has been developed for grass pollen in north London. The total period of the grass pollen season is covered by eight multiple regression models, each covering a 10-day period running consecutively from 21 May to 8 August. This means that three models were used for each 30-day forecast. The forecast models were produced using grass pollen and environmental data from 1961 to 1999 and tested on data from 2000 and 2002. Model accuracy was judged in two ways: the number of times the forecast model was able to successfully predict the severity (relative to the 1961-1999 dataset as a whole) of grass pollen counts in each of the eight forecast periods on a scale of 1 to 4; the number of times the forecast model was able to predict whether grass pollen counts were higher or lower than the mean. The models achieved 62.5% accuracy in both assessment years when predicting the relative severity of grass pollen counts on a scale of 1 to 4, which equates to six of the eight 10-day periods being forecast correctly. The models attained 87.5% and 100% accuracy in 2000 and 2002, respectively, when predicting whether grass pollen counts would be higher or lower than the mean. Attempting to predict pollen counts during distinct 10-day periods throughout the grass pollen season is a novel approach. The models also employed original methodology in the use of winter averages of the North Atlantic Oscillation to forecast 10-day means of allergenic pollen counts.
Ensemble-based methods for forecasting census in hospital units.
Koestler, Devin C; Ombao, Hernando; Bender, Jesse
2013-05-30
The ability to accurately forecast census counts in hospital departments has considerable implications for hospital resource allocation. In recent years several different methods have been proposed for forecasting census counts; however, many of these approaches do not use available patient-specific information. In this paper we present an ensemble-based methodology for forecasting the census under a framework that simultaneously incorporates both (i) arrival trends over time and (ii) patient-specific baseline and time-varying information. The proposed model for predicting census has three components, namely: current census count, number of daily arrivals, and number of daily departures. To model the number of daily arrivals, we use a seasonality-adjusted Poisson Autoregressive (PAR) model where the parameter estimates are obtained via conditional maximum likelihood. The number of daily departures is predicted by modeling the probability of departure from the census using logistic regression models that are adjusted for the amount of time spent in the census and incorporate both patient-specific baseline and time-varying covariate information. We illustrate our approach using neonatal intensive care unit (NICU) data collected at Women & Infants Hospital, Providence, RI, which consist of 1001 consecutive NICU admissions between April 1, 2008 and March 31, 2009. Our results demonstrate statistically significant improvements in prediction accuracy for 3-, 5-, and 7-day census forecasts and increased precision of our forecasting model compared to a forecasting approach that ignores patient-specific information. Forecasting models that utilize patient-specific baseline and time-varying information make the most of data typically available and have the capacity to substantially improve census forecasts.
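The three-component structure described above (current census + daily arrivals − daily departures) can be sketched as a small Monte-Carlo simulation. The seasonal Poisson arrival rate and the logistic departure coefficients below are invented illustrative values, not estimates from the NICU data, and the `arrival_rate` and `departure_prob` helpers are hypothetical stand-ins for the fitted PAR and logistic models.

```python
import math, random

# Toy sketch of a census forecast with three components:
# current census count, Poisson daily arrivals, logistic daily departures.
# All coefficients are made up for illustration.

def arrival_rate(day_of_week, base=4.0, weekday_boost=1.5):
    """Seasonality-adjusted Poisson mean: more admissions on weekdays."""
    return base + (weekday_boost if day_of_week < 5 else 0.0)

def departure_prob(days_in_census, b0=-2.0, b1=0.15):
    """Logistic departure model: longer stays are more likely to depart."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * days_in_census)))

def forecast_census(census_days, day_of_week, horizon=3, sims=1000, seed=0):
    """Monte-Carlo mean census; census_days = current length of stay per patient."""
    rng = random.Random(seed)
    totals = 0
    for _ in range(sims):
        stays = list(census_days)
        for h in range(horizon):
            # departures: each patient leaves with model-given probability
            stays = [s + 1 for s in stays if rng.random() > departure_prob(s)]
            # arrivals: Poisson draw by inversion of the CDF
            lam = arrival_rate((day_of_week + h) % 7)
            k, p, target = 0, math.exp(-lam), rng.random()
            cum = p
            while cum < target:
                k += 1
                p *= lam / k
                cum += p
            stays.extend([0] * k)
        totals += len(stays)
    return totals / sims

mean_census = forecast_census(census_days=[2, 5, 1, 10, 3], day_of_week=0)
```

The point of the sketch is the framework, not the numbers: patient-specific covariates enter through the departure model, while arrival seasonality enters through the Poisson rate.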
Impact and Penetration Simulations for Composite Wing-like Structures
NASA Technical Reports Server (NTRS)
Knight, Norman F.
1998-01-01
The goal of this research project was to develop methodologies for the analysis of wing-like structures subjected to impact loadings. Low-speed impact causing either no damage or only minimal damage and high-speed impact causing severe laminate damage and possible penetration of the structure were to be considered during this research effort. To address this goal, an assessment of current analytical tools for impact analysis was performed. The analytical tools for impact and penetration simulation were assessed with regard to accuracy, modeling, and damage modeling, as well as robustness, efficiency, and usage in a wing design environment. Following the qualitative assessment, selected quantitative evaluations were to be performed using the leading simulation tools. Based on this assessment, future research thrusts for impact and penetration simulation of composite wing-like structures were identified.
NASA Technical Reports Server (NTRS)
1975-01-01
The economic benefits of improved ocean condition, weather and ice forecasts by SEASAT satellites to the exploration, development and production of oil and natural gas in the offshore regions are considered. The results of case studies which investigate the effects of forecast accuracy on offshore operations in the North Sea, the Celtic Sea, and the Gulf of Mexico are reported. A methodology for generalizing the results to other geographic regions of offshore oil and natural gas exploration and development is described.
Clark, M.R.; Gangopadhyay, S.; Hay, L.; Rajagopalan, B.; Wilby, R.
2004-01-01
A number of statistical methods that are used to provide local-scale ensemble forecasts of precipitation and temperature do not contain realistic spatial covariability between neighboring stations or realistic temporal persistence for subsequent forecast lead times. To demonstrate this point, output from a global-scale numerical weather prediction model is used in a stepwise multiple linear regression approach to downscale precipitation and temperature to individual stations located in and around four study basins in the United States. Output from the forecast model is downscaled for lead times up to 14 days. Residuals in the regression equation are modeled stochastically to provide 100 ensemble forecasts. The precipitation and temperature ensembles from this approach have a poor representation of the spatial variability and temporal persistence. The spatial correlations for downscaled output are considerably lower than observed spatial correlations at short forecast lead times (e.g., less than 5 days) when there is high accuracy in the forecasts. At longer forecast lead times, the downscaled spatial correlations are close to zero. Similarly, the observed temporal persistence is only partly present at short forecast lead times. A method is presented for reordering the ensemble output in order to recover the space-time variability in precipitation and temperature fields. In this approach, the ensemble members for a given forecast day are ranked and matched with the rank of precipitation and temperature data from days randomly selected from similar dates in the historical record. The ensembles are then reordered to correspond to the original order of the selection of historical data. Using this approach, the observed intersite correlations, intervariable correlations, and the observed temporal persistence are almost entirely recovered. This reordering methodology also has applications for recovering the space-time variability in modeled streamflow. © 2004 American Meteorological Society.
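The reordering step described above (commonly known as the Schaake shuffle) reduces, for a single station and forecast day, to rank-matching the ensemble against a historical trajectory. The sketch below uses made-up numbers purely for illustration.

```python
# Minimal sketch of the rank-reordering ("Schaake shuffle") step: the
# ensemble members are re-arranged so that their ranks follow the ranks of
# values drawn from similar dates in the historical record, restoring the
# observed space-time structure. Data values are illustrative only.

def reorder(ensemble, historical):
    """Re-order `ensemble` so its ranks match the ranks of `historical`."""
    assert len(ensemble) == len(historical)
    sorted_members = sorted(ensemble)
    # indices of historical values from smallest to largest
    hist_ranks = sorted(range(len(historical)), key=lambda i: historical[i])
    out = [None] * len(ensemble)
    for rank, i in enumerate(hist_ranks):
        out[i] = sorted_members[rank]   # give slot i the member of that rank
    return out

ens = [0.2, 3.1, 7.5, 1.4]       # raw downscaled ensemble (no structure)
hist = [10.0, 2.0, 30.0, 5.0]    # analog historical trajectory
shuffled = reorder(ens, hist)
```

After the shuffle, `shuffled` contains exactly the original ensemble values, but ordered so that its largest value sits where the historical trajectory had its largest value, and so on; applied across stations and lead times, this recovers intersite correlation and persistence.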
Objective calibration of numerical weather prediction models
NASA Astrophysics Data System (ADS)
Voudouri, A.; Khain, P.; Carmona, I.; Bellprat, O.; Grazzini, F.; Avgoustoglou, E.; Bettems, J. M.; Kaufmann, P.
2017-07-01
Numerical weather prediction (NWP) and climate models use parameterization schemes for physical processes, which often include free or poorly confined parameters. Model developers normally calibrate the values of these parameters subjectively to improve the agreement of forecasts with available observations, a procedure referred to as expert tuning. A practicable objective multivariate calibration method built on a quadratic meta-model (MM), previously applied to a regional climate model (RCM), has been shown to be at least as good as expert tuning. Based on these results, an approach to implementing the methodology in an NWP model is presented in this study. Challenges in transferring the methodology from RCM to NWP are not restricted to the use of higher resolution and different time scales: the sensitivity of NWP model quality to the model parameter space has to be clarified, and the overall procedure has to be optimized in terms of the computing resources required for the calibration of an NWP model. Three free model parameters, affecting mainly the turbulence parameterization schemes, were originally selected with respect to their influence on variables associated with daily forecasts, such as daily minimum and maximum 2 m temperature as well as 24 h accumulated precipitation. Preliminary results indicate that the calibration is both affordable in terms of computer resources and meaningful in terms of improved forecast quality. In addition, the proposed methodology has the advantage of being a replicable procedure that can be applied when an updated model version is launched and/or when the same model implementation is customized for different climatological areas.
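The quadratic meta-model idea above can be illustrated in one dimension: run the expensive model at a few parameter values, fit a parabola to the resulting error scores, and take its vertex as the calibrated value. The `model_error` function below is a hypothetical stand-in for running and scoring an NWP forecast; the true method is multivariate.

```python
# One-parameter sketch of quadratic meta-model calibration: fit a parabola
# through three (parameter, error) evaluations at equal spacing h and return
# the parameter value at its vertex. All numbers are illustrative.

def vertex(x1, h, y0, y1, y2):
    """Vertex of the parabola through (x1-h, y0), (x1, y1), (x1+h, y2)."""
    return x1 + 0.5 * h * (y0 - y2) / (y0 - 2.0 * y1 + y2)

def model_error(p):
    # hypothetical stand-in for an expensive NWP run plus skill scoring;
    # here the "true" optimum is placed at p = 0.7 by construction
    return (p - 0.7) ** 2 + 0.1

calibrated = vertex(0.5, 0.5,
                    model_error(0.0), model_error(0.5), model_error(1.0))
```

Because the stand-in error surface is exactly quadratic, the vertex recovers the optimum from just three evaluations; for a real model the surface is only approximately quadratic, which is why the meta-model is fitted over many runs and parameters.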
NASA Astrophysics Data System (ADS)
Lunter, Dominique; Daniels, Rolf
2014-12-01
A methodology that employs confocal Raman microscopy (CRM) on ex vivo skin samples is proposed for the investigation of drug content and distribution in the skin. To this end, the influence of the penetration enhancers propylene glycol and polyoxyethylene-23-lauryl ether on the penetration and permeation of procaine as a model substance was investigated. The drug content of skin samples that had been incubated with semisolid formulations containing one of these enhancers was examined after skin segmentation. The experiments showed that propylene glycol did not affect the procaine content that was delivered to the skin, whereas polyoxyethylene-23-lauryl ether led to higher procaine contents and deeper penetration. Neither substance was found to influence the permeation rate of procaine. It is thereby shown that CRM can provide additional information on drug penetration and permeation. Furthermore, the method was found to enhance the depth from which Raman spectra can be collected and to improve the depth resolution compared to previously proposed methods.
Biggerstaff, Matthew; Alper, David; Dredze, Mark; Fox, Spencer; Fung, Isaac Chun-Hai; Hickmann, Kyle S; Lewis, Bryan; Rosenfeld, Roni; Shaman, Jeffrey; Tsou, Ming-Hsiang; Velardi, Paola; Vespignani, Alessandro; Finelli, Lyn
2016-07-22
Early insights into the timing of the start, peak, and intensity of the influenza season could be useful in planning influenza prevention and control activities. To encourage development and innovation in influenza forecasting, the Centers for Disease Control and Prevention (CDC) organized a challenge to predict the 2013-14 United States influenza season. Challenge contestants were asked to forecast the start, peak, and intensity of the 2013-2014 influenza season at the national level and at any or all Health and Human Services (HHS) region level(s). The challenge ran from December 1, 2013 to March 27, 2014; contestants were required to submit 9 biweekly forecasts at the national level to be eligible. The selection of the winner was based on expert evaluation of the methodology used to make the prediction and the accuracy of the prediction as judged against the U.S. Outpatient Influenza-like Illness Surveillance Network (ILINet). Nine teams submitted 13 forecasts for all required milestones. The first forecast was due on December 2, 2013; 3/13 forecasts received correctly predicted the start of the influenza season within 1 week, 1/13 predicted the peak within 1 week, 3/13 predicted the peak ILINet percentage within 1%, and 4/13 predicted the season duration within 1 week. For the prediction due on December 19, 2013, the number of forecasts that correctly predicted the peak week increased to 2/13, the peak percentage to 6/13, and the duration of the season to 6/13. As the season progressed, the forecasts became more stable and were closer to the season milestones. Forecasting has become technically feasible, but further efforts are needed to improve forecast accuracy so that policy makers can reliably use these predictions. CDC and challenge contestants plan to build upon the methods developed during this contest to improve the accuracy of influenza forecasts.
Medina, Daniel C.; Findley, Sally E.; Guindo, Boubacar; Doumbia, Seydou
2007-01-01
Background Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with diarrhea, acute respiratory infection, and malaria. With the increasing awareness that the aforementioned infectious diseases impose an enormous burden on developing countries, public health programs therein could benefit from parsimonious general-purpose forecasting methods to enhance infectious disease intervention. Unfortunately, these disease time-series often i) suffer from non-stationarity; ii) exhibit large inter-annual plus seasonal fluctuations; and, iii) require disease-specific tailoring of forecasting methods. Methodology/Principal Findings In this longitudinal retrospective (01/1996–06/2004) investigation, diarrhea, acute respiratory infection of the lower tract, and malaria consultation time-series are fitted with a general-purpose econometric method, namely the multiplicative Holt-Winters, to produce contemporaneous on-line forecasts for the district of Niono, Mali. This method accommodates seasonal, as well as inter-annual, fluctuations and produces reasonably accurate median 2- and 3-month horizon forecasts for these non-stationary time-series, i.e., 92% of the 24 time-series forecasts generated (2 forecast horizons, 3 diseases, and 4 age categories = 24 time-series forecasts) have mean absolute percentage errors circa 25%. 
Conclusions/Significance The multiplicative Holt-Winters forecasting method: i) performs well across diseases with dramatically distinct transmission modes and hence it is a strong general-purpose forecasting method candidate for non-stationary epidemiological time-series; ii) obliquely captures prior non-linear interactions between climate and the aforementioned disease dynamics thus, obviating the need for more complex disease-specific climate-based parametric forecasting methods in the district of Niono; furthermore, iii) readily decomposes time-series into seasonal components thereby potentially assisting with programming of public health interventions, as well as monitoring of disease dynamics modification. Therefore, these forecasts could improve infectious diseases management in the district of Niono, Mali, and elsewhere in the Sahel. PMID:18030322
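The multiplicative Holt-Winters method used above is compact enough to sketch directly. The hand-rolled version below uses a short season length and invented smoothing constants and counts purely for illustration; it is not fitted to the Niono series.

```python
# Minimal multiplicative Holt-Winters sketch: level, trend, and seasonal
# factors are smoothed through the series, then extrapolated. Smoothing
# constants and the toy series are illustrative only.

def holt_winters_forecast(y, m, alpha=0.4, beta=0.1, gamma=0.3, horizon=2):
    """Return `horizon` forecasts for series y with season length m."""
    level = sum(y[:m]) / m                               # mean of season 1
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / m ** 2      # season-to-season slope
    season = [y[i] / level for i in range(m)]            # initial factors
    for t in range(len(y)):
        s = season[t % m]
        prev_level = level
        level = alpha * y[t] / s + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % m] = gamma * y[t] / level + (1 - gamma) * s
    return [(level + (h + 1) * trend) * season[(len(y) + h) % m]
            for h in range(horizon)]

# three "years" of a strongly seasonal series (season length 4 for brevity)
series = [10, 20, 30, 20, 12, 24, 36, 24, 14, 28, 42, 28]
fcst = holt_winters_forecast(series, m=4)
```

The forecasts continue both the upward inter-annual trend and the within-season shape, which is the property the abstract exploits: the method needs no disease-specific or climate-based tailoring.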
Traffic flow forecasting using approximate nearest neighbor nonparametric regression
DOT National Transportation Integrated Search
2000-12-01
The purpose of this research is to enhance nonparametric regression (NPR) for use in real-time systems by first reducing execution time using advanced data structures and imprecise computations and then developing a methodology for applying NPR. Due ...
Ocaña-Peinado, Francisco M; Valderrama, Mariano J; Bouzas, Paula R
2013-05-01
The problem of developing a 2-week-ahead forecast of atmospheric cypress pollen levels is tackled in this paper by developing a principal component multiple regression model involving several climatic variables. The efficacy of the proposed model is validated by means of an application to real data on Cupressaceae pollen concentration in the city of Granada (southeast Spain). The model was applied to data from 11 consecutive years (1995-2005), with 2006 being used to validate the forecasts. Based on the work of different authors, factors such as temperature, humidity, hours of sunshine, and wind speed were incorporated into the model. This methodology explains approximately 75-80% of the variability in the airborne Cupressaceae pollen concentration.
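The regression core of such a model can be sketched as ordinary least squares solved via the normal equations. In the paper the predictors would be principal-component scores of the climatic variables; in the sketch below two made-up standardized predictors stand in for them, and the data are constructed so the true coefficients are known.

```python
# OLS via the normal equations (X'X) b = X'y, solved with Gaussian
# elimination. The design matrix and target are illustrative: y is generated
# exactly from y = 1 + 2*x1 + 2*x2, so OLS should recover (1, 2, 2).

def ols(X, y):
    """Least-squares coefficients for rows X (with leading 1s) and target y."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    c = [sum(r[i] * t for r, t in zip(X, y)) for i in range(k)]
    for i in range(k):                 # Gauss-Jordan elimination (no pivoting)
        piv = A[i][i]
        A[i] = [v / piv for v in A[i]]
        c[i] /= piv
        for r in range(k):
            if r != i and A[r][i]:
                f = A[r][i]
                A[r] = [a - f * b for a, b in zip(A[r], A[i])]
                c[r] -= f * c[i]
    return c

X = [[1, 0.0, 1.0], [1, 1.0, 0.0], [1, 2.0, 1.0], [1, 3.0, 0.0]]
y = [3.0, 3.0, 7.0, 7.0]     # exactly 1 + 2*x1 + 2*x2 for each row
beta = ols(X, y)
```

Replacing the raw columns of `X` with leading principal-component scores is what turns this into the principal component regression the abstract describes; the solver itself is unchanged.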
Model documentation report: Residential sector demand module of the national energy modeling system
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This report documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Residential Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, and FORTRAN source code. This reference document provides a detailed description for energy analysts, other users, and the public. The NEMS Residential Sector Demand Module is currently used for mid-term forecasting purposes and energy policy analysis over the forecast horizon of 1993 through 2020. The model generates forecasts of energy demand for the residential sector by service, fuel, and Census Division. Policy impacts resulting from new technologies, market incentives, and regulatory changes can be estimated using the module. 26 refs., 6 figs., 5 tabs.
Evaluating Downscaling Methods for Seasonal Climate Forecasts over East Africa
NASA Technical Reports Server (NTRS)
Roberts, J. Brent; Robertson, Franklin R.; Bosilovich, Michael; Lyon, Bradfield; Funk, Chris
2013-01-01
The U.S. National Multi-Model Ensemble seasonal forecasting system is providing hindcast and real-time data streams to be used in assessing and improving seasonal predictive capacity. The NASA / USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of NMME forecasts specifically for use in impact modeling within hub regions including East Africa, the Hindu Kush-Himalayan (HKH) region and Mesoamerica. One of the participating models in NMME is the NASA Goddard Earth Observing System (GEOS5). This work will present an intercomparison of downscaling methods using the GEOS5 seasonal forecasts of temperature and precipitation over East Africa. The current seasonal forecasting system provides monthly averaged forecast anomalies. These anomalies must be spatially downscaled and temporally disaggregated for use in application modeling (e.g. hydrology, agriculture). There are several available downscaling methodologies that can be implemented to accomplish this goal. Selected methods include both a non-homogeneous hidden Markov model and an analogue-based approach. A particular emphasis will be placed on quantifying the ability of different methods to capture the intermittency of precipitation within both the short and long rain seasons. Further, the ability to capture spatial covariances will be assessed. Both probabilistic and deterministic skill measures will be evaluated over the hindcast period.
Forecasting of Radiation Belts: Results From the PROGRESS Project.
NASA Astrophysics Data System (ADS)
Balikhin, M. A.; Arber, T. D.; Ganushkina, N. Y.; Walker, S. N.
2017-12-01
The overall goal of the PROGRESS project, funded in the frame of the EU Horizon 2020 programme, is to combine first-principles-based models with systems science methodologies to achieve reliable forecasts of the geo-space particle radiation environment. PROGRESS incorporates three themes: the propagation of the solar wind to L1, the forecast of geomagnetic indices, and the forecast of fluxes of energetic electrons within the magnetosphere. One of the important aspects of the PROGRESS project is the development of statistical wave models for magnetospheric waves that affect the dynamics of energetic electrons, such as lower-band chorus, hiss, and equatorial noise. The error reduction ratio (ERR) concept has been used to optimise the set of solar wind and geomagnetic parameters for the organisation of statistical wave models for these emissions. The resulting sets of parameters and statistical wave models will be presented and discussed. However, the ERR analysis also indicates that the combination of solar wind and geomagnetic parameters accounts for only part of the variance of the emissions under investigation (lower-band chorus, hiss, and equatorial noise). In addition, advances in the forecast of fluxes of energetic electrons achieved by the PROGRESS project, exploiting empirical models and the first-principles IMPTAM model, are presented.
Kern, Jordan D; Patino-Echeverri, Dalia; Characklis, Gregory W
2014-08-19
Due to their operational flexibility, hydroelectric dams are ideal candidates to compensate for the intermittency and unpredictability of wind energy production. However, more coordinated use of wind and hydropower resources may exacerbate the impacts dams have on downstream environmental flows, that is, the timing and magnitude of water flows needed to sustain river ecosystems. In this paper, we examine the effects of increased (i.e., 5%, 15%, and 25%) wind market penetration on prices for electricity and reserves, and assess the potential for altered price dynamics to disrupt reservoir release schedules at a hydroelectric dam and cause more variable and unpredictable hourly flow patterns (measured in terms of the Richards-Baker Flashiness (RBF) index). Results show that the greatest potential for wind energy to impact downstream flows occurs at high (∼25%) wind market penetration, when the dam sells more reserves in order to exploit spikes in real-time electricity prices caused by negative wind forecast errors. Nonetheless, compared to the initial impacts of dam construction (and the dam's subsequent operation as a peaking resource under baseline conditions) the marginal effects of any increased wind market penetration on downstream flows are found to be relatively minor.
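The Richards-Baker Flashiness (RBF) index used above has a simple definition: the sum of absolute period-to-period changes in flow divided by total flow. A sketch, with invented flow series:

```python
# Richards-Baker Flashiness index: sum of absolute flow changes divided by
# total flow. Higher values indicate a flashier (more variable) hydrograph.
# The two hourly flow series below are illustrative only.

def rb_flashiness(q):
    """RBF index of a flow series q; 0.0 for a perfectly steady series."""
    if len(q) < 2 or sum(q) == 0:
        return 0.0
    return sum(abs(a - b) for a, b in zip(q[1:], q[:-1])) / sum(q)

steady = [100.0] * 8                                       # constant release
peaking = [20.0, 180.0, 20.0, 180.0, 20.0, 180.0, 20.0, 180.0]
```

With these numbers `rb_flashiness(steady)` is 0 while the peaking pattern scores 1.4, which is the sense in which greater reserve sales at high wind penetration can make downstream flows "flashier".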
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Mingjian; Zhang, Jie; Feng, Cong
One of the biggest concerns associated with integrating a large amount of renewable energy into the power grid is the ability to handle large ramps in the renewable power output. For the sake of system reliability and economics, it is essential for power system operators to better understand the ramping features of renewable generation, load, and netload. An optimized swinging door algorithm (OpSDA) is used and extended to accurately and efficiently detect ramping events. For wind power ramp detection, a process of merging 'bumps' (segments with a different changing direction) into adjacent ramping segments is included to improve the performance of the OpSDA method. For solar ramp detection, ramping events that occur in both clear-sky and measured (or forecasted) solar power are removed to account for the diurnal pattern of solar generation. Ramping features are extracted and extensively compared between load and netload under different renewable penetration levels (9.77%, 15.85%, and 51.38%). Comparison results show that (i) netload ramp events with shorter durations and smaller magnitudes occur more frequently as the renewable penetration level increases, and the total number of ramping events also increases; and (ii) different ramping characteristics are observed in load and netload even at a low renewable penetration level.
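The swinging door segmentation that OpSDA builds on can be sketched compactly: successive points are absorbed into the current linear segment while they fit inside a tolerance "door", and each closed segment is a candidate ramp whose sign is given by its slope. This is a simplified sketch of the basic algorithm, not the optimized OpSDA variant; the series and tolerance are illustrative.

```python
# Simplified swinging-door segmentation: maintain upper/lower admissible
# slopes from the segment start; when they cross, close the segment at the
# previous point and start a new one there.

def swinging_door(values, epsilon):
    """Return contiguous (start, end) index pairs of linear segments."""
    segments, start = [], 0
    up, low = float("inf"), float("-inf")
    for i in range(1, len(values)):
        dx = i - start
        up = min(up, (values[i] + epsilon - values[start]) / dx)
        low = max(low, (values[i] - epsilon - values[start]) / dx)
        if low > up:                      # doors crossed: close at i - 1
            segments.append((start, i - 1))
            start = i - 1
            up = (values[i] + epsilon - values[start]) / (i - start)
            low = (values[i] - epsilon - values[start]) / (i - start)
    segments.append((start, len(values) - 1))
    return segments

power = [0, 1, 2, 3, 10, 20, 30, 29, 28, 27]   # ramp-up then slow ramp-down
segs = swinging_door(power, epsilon=1.0)
```

Each returned segment shares its endpoint with the next, so a post-processing pass (as in OpSDA) can merge same-direction segments and score the resulting ramps by duration and magnitude.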
Automated system for smoke dispersion prediction due to wild fires in Alaska
NASA Astrophysics Data System (ADS)
Kulchitsky, A.; Stuefer, M.; Higbie, L.; Newby, G.
2007-12-01
Community climate models have enabled the development of specific environmental forecast systems. The University of Alaska (UAF) smoke group was created to adapt a smoke forecast system to the Alaska region. The US Forest Service (USFS) Missoula Fire Science Lab had developed a smoke forecast system based on the Weather Research and Forecasting (WRF) model including chemistry (WRF/Chem). Following the successful experience of the USFS, which runs their model operationally for the contiguous U.S., we developed a similar system for Alaska in collaboration with scientists from the USFS Missoula Fire Science Lab. Wildfires are a significant source of air pollution in Alaska because the climate and vegetation favor annual summer fires that burn huge areas. An extreme case occurred in 2004, when an area larger than Maryland (more than 25,000 km2) burned. Small smoke particles with diameters less than 10 μm can penetrate deep into the lungs, causing health problems. Smoke also creates a severe restriction on air transport and has a substantial economic impact. The smoke dispersion forecast system for Alaska was developed at the Geophysical Institute (GI) and the Arctic Region Supercomputing Center (ARSC), both at the University of Alaska Fairbanks (UAF). It will help the public plan activities a few days in advance to avoid dangerous smoke exposure. The availability of modern high-performance supercomputers at ARSC allows us to create and run a high-resolution, WRF-based smoke dispersion forecast for the entire State of Alaska. The core of the system is a Python program that manages the independent pieces. Our adapted Alaska system performs the following steps:
- Calculate the medium-resolution weather forecast using WRF/Met.
- Adapt the near real-time satellite-derived wildfire location and extent data received via direct broadcast from UAF's "Geographic Information Network of Alaska" (GINA).
- Calculate fuel moisture using WRF forecasts and National Fire Danger Rating System (NFDRS) fuel maps.
- Calculate smoke emission components using a first-order fire emission model.
- Model the smoke plume rise, yielding a vertical distribution that accounts for one-dimensional (vertical) concentrations of smoke constituents in the atmosphere above the fire.
- Run WRF/Chem at high resolution for the forecast.
- Use standard graphical tools to provide accessible smoke dispersion products.
The system runs twice each day at ARSC. The results are freely available from a dedicated wildfire smoke web portal at ARSC.
Projected electric power demands for the Potomac Electric Power Company. Volume 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Estomin, S.; Kahal, M.
1984-03-01
This three-volume report presents the results of an econometric forecast of peak and electric power demands for the Potomac Electric Power Company (PEPCO) through the year 2002. Volume I describes the methodology, the results of the econometric estimations, the forecast assumptions and the calculated forecasts of peak demand and energy usage. Separate sets of models were developed for the Maryland Suburbs (Montgomery and Prince George's counties), the District of Columbia and Southern Maryland (served by a wholesale customer of PEPCO). For each of the three jurisdictions, energy equations were estimated for residential and commercial/industrial customers for both summer and winter seasons. For the District of Columbia, summer and winter equations for energy sales to the federal government were also estimated. Equations were also estimated for street lighting and energy losses. Noneconometric techniques were employed to forecast energy sales to the Northern Virginia suburbs, Metrorail and federal government facilities located in Maryland.
Gomez-Elipe, Alberto; Otero, Angel; van Herp, Michel; Aguirre-Jaime, Armando
2007-01-01
Background The objective of this work was to develop a model to predict malaria incidence in an area of unstable transmission by studying the association between environmental variables and disease dynamics. Methods The study was carried out in Karuzi, a province in the Burundi highlands, using time series of monthly notifications of malaria cases from local health facilities, data from rain and temperature records, and the normalized difference vegetation index (NDVI). Using autoregressive integrated moving average (ARIMA) methodology, a model showing the relation between monthly notifications of malaria cases and the environmental variables was developed. Results The best forecasting model (R2adj = 82%, p < 0.0001 and 93% forecasting accuracy in the range ± 4 cases per 100 inhabitants) included the NDVI, mean maximum temperature, rainfall and number of malaria cases in the preceding month. Conclusion This model is a simple and useful tool for producing reasonably reliable forecasts of the malaria incidence rate in the study area. PMID:17892540
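The predictive structure described above, last month's cases plus environmental covariates, can be sketched as a lagged regression. This is an illustrative stand-in for the authors' ARIMA model, not a reproduction of it; all coefficients and data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120  # ten years of monthly records

# Synthetic environmental drivers (stand-ins for NDVI, max temperature, rainfall)
ndvi = rng.uniform(0.2, 0.8, n)
tmax = rng.uniform(25.0, 35.0, n)
rain = rng.uniform(0.0, 300.0, n)

# Synthetic incidence driven by last month's cases and the environment
cases = np.zeros(n)
for t in range(1, n):
    cases[t] = 0.5 * cases[t - 1] + 3.0 * ndvi[t] + 0.1 * tmax[t] + 0.01 * rain[t]

# Design matrix: lagged cases plus the three environmental covariates
X = np.column_stack([cases[:-1], ndvi[1:], tmax[1:], rain[1:]])
y = cases[1:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

one_step = X @ coef  # in-sample one-month-ahead predictions
```

Because the synthetic series is noise-free, least squares recovers the generating coefficients exactly; with real surveillance data the fit would of course carry residual error.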
NASA Technical Reports Server (NTRS)
Martino, J. P.; Lenz, R. C., Jr.; Chen, K. L.; Kahut, P.; Sekely, R.; Weiler, J.
1979-01-01
A cross impact model of the U.S. telecommunications system was developed. It was necessary to prepare forecasts of the major segments of the telecommunications system, such as satellites, telephone, TV, CATV, radio broadcasting, etc. In addition, forecasts were prepared of the traffic generated by a variety of new or expanded services, such as electronic check clearing and point of sale electronic funds transfer. Finally, the interactions among the forecasts were estimated (the cross impacts). Both the forecasts and the cross impacts were used as inputs to the cross impact model, which could then be used to simulate the future growth of the entire U.S. telecommunications system. By varying the inputs, technology changes or policy decisions with regard to any segment of the system could be evaluated in the context of the remainder of the system. To illustrate the operation of the model, a specific study was made of the deployment of fiber optics throughout the telecommunications system.
Wind speed time series reconstruction using a hybrid neural genetic approach
NASA Astrophysics Data System (ADS)
Rodriguez, H.; Flores, J. J.; Puig, V.; Morales, L.; Guerra, A.; Calderon, F.
2017-11-01
Currently, electric energy is used in practically all modern human activities. Most of the energy produced comes from fossil fuels, causing irreversible damage to the environment. Lately, nations have made an effort to produce energy using clean methods, such as solar and wind energy, among others. Wind energy is one of the cleanest alternatives. However, wind speed is not constant, which makes the planning and operation of electric power systems difficult. Knowing in advance the amount of raw material (wind speed) available for energy production allows the energy to be generated by the power plant to be estimated, supporting maintenance planning, operational management, and optimal operational cost. For these reasons, forecasting wind speed becomes a necessary task. The forecasting process uses past observations of the variable to be forecast (wind speed). To measure wind speed, weather stations use devices called anemometers, but due to poor maintenance, connection errors, or natural wear, they may produce false or missing data. In this work, a hybrid methodology is proposed that uses a compact genetic algorithm with an artificial neural network to reconstruct wind speed time series. The proposed methodology reconstructs the time series using an ANN defined by a compact genetic algorithm.
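A compact genetic algorithm replaces an explicit population with a probability vector over bit values. The sketch below optimizes a toy OneMax fitness rather than the paper's ANN-definition encoding; the virtual population size and iteration count are arbitrary choices.

```python
import numpy as np

def compact_ga(fitness, n_bits, virtual_pop=50, iters=2000, seed=1):
    """Minimal compact GA: evolve a probability vector instead of a population."""
    rng = np.random.default_rng(seed)
    p = np.full(n_bits, 0.5)  # probability that each bit is 1
    for _ in range(iters):
        # Sample two candidate solutions from the current probability model
        a = (rng.random(n_bits) < p).astype(int)
        b = (rng.random(n_bits) < p).astype(int)
        winner, loser = (a, b) if fitness(a) >= fitness(b) else (b, a)
        # Shift probabilities toward the winner where the two disagree
        p = np.clip(p + (winner - loser) / virtual_pop, 0.0, 1.0)
    return (p > 0.5).astype(int), p

# Toy objective: maximize the number of ones (OneMax)
best, p = compact_ga(lambda bits: bits.sum(), n_bits=16)
```

In the paper's setting, the bit string would instead encode the ANN configuration used to reconstruct missing wind speed values, with reconstruction error as the fitness.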
Forecasting daily patient volumes in the emergency department.
Jones, Spencer S; Thomas, Alun; Evans, R Scott; Welch, Shari J; Haug, Peter J; Snow, Gregory L
2008-02-01
Shifts in the supply of and demand for emergency department (ED) resources make the efficient allocation of ED resources increasingly important. Forecasting is a vital activity that guides decision-making in many areas of economic, industrial, and scientific planning, but has gained little traction in the health care industry. There are few studies that explore the use of forecasting methods to predict patient volumes in the ED. The goals of this study are to explore and evaluate the use of several statistical forecasting methods to predict daily ED patient volumes at three diverse hospital EDs and to compare the accuracy of these methods to the accuracy of a previously proposed forecasting method. Daily patient arrivals at three hospital EDs were collected for the period January 1, 2005, through March 31, 2007. The authors evaluated the use of seasonal autoregressive integrated moving average, time series regression, exponential smoothing, and artificial neural network models to forecast daily patient volumes at each facility. Forecasts were made for horizons ranging from 1 to 30 days in advance. The forecast accuracy achieved by the various forecasting methods was compared to the forecast accuracy achieved when using a benchmark forecasting method already available in the emergency medicine literature. All time series methods considered in this analysis provided improved in-sample model goodness of fit. However, post-sample analysis revealed that time series regression models that augment linear regression models by accounting for serial autocorrelation offered only small improvements in terms of post-sample forecast accuracy, relative to multiple linear regression models, while seasonal autoregressive integrated moving average, exponential smoothing, and artificial neural network forecasting models did not provide consistently accurate forecasts of daily ED volumes. 
This study confirms the widely held belief that daily demand for ED services is characterized by seasonal and weekly patterns. The authors compared several time series forecasting methods to a benchmark multiple linear regression model. The results suggest that the existing methodology proposed in the literature, multiple linear regression based on calendar variables, is a reasonable approach to forecasting daily patient volumes in the ED. However, the authors conclude that regression-based models that incorporate calendar variables, account for site-specific special-day effects, and allow for residual autocorrelation provide a more appropriate, informative, and consistently accurate approach to forecasting daily ED patient volumes.
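The regression-based approach the authors favor, calendar variables as predictors, can be sketched with day-of-week indicators. The weekday volumes below are invented for illustration; a real model would add month, holiday, and site-specific special-day terms, plus a residual autocorrelation structure.

```python
import numpy as np

# Hypothetical mean ED arrivals by day of week (Mon..Sun), invented for the sketch
weekday_effect = np.array([110., 95., 92., 90., 96., 120., 130.])
days = np.arange(8 * 7)                       # eight weeks of daily history
volumes = weekday_effect[days % 7]            # purely weekly pattern, no noise

# Calendar design matrix: one indicator column per day of week
X = (days[:, None] % 7 == np.arange(7)).astype(float)
coef, *_ = np.linalg.lstsq(X, volumes, rcond=None)

# Forecast the next 7 days from the fitted calendar effects
future = np.arange(56, 63)
forecast = (future[:, None] % 7 == np.arange(7)).astype(float) @ coef
```

With a pure weekly signal the fitted coefficients are exactly the weekday means, which is why calendar regression is such a strong baseline for ED volumes.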
Non-linear forecasting in high-frequency financial time series
NASA Astrophysics Data System (ADS)
Strozzi, F.; Zaldívar, J. M.
2005-08-01
A new methodology based on state space reconstruction techniques has been developed for trading in financial markets. The methodology has been tested using 18 high-frequency foreign exchange time series. The results are in apparent contradiction with the efficient market hypothesis, which states that no profitable information about future movements can be obtained by studying the past price series. In our (off-line) analysis, a positive gain was obtained in all of these series. The trading methodology is quite general and may be adapted to other financial time series. Finally, the steps for its on-line application are discussed.
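State space reconstruction in this spirit can be sketched with a Takens delay embedding and a nearest-neighbour predictor. The embedding dimension, unit delay, and sinusoidal stand-in series are assumptions for the sketch; the paper's actual trading rules are not reproduced here.

```python
import numpy as np

series = np.sin(0.2 * np.arange(500))  # deterministic stand-in for a price series
m = 3                                  # embedding dimension, unit delay

# Delay vectors [x(t-2), x(t-1), x(t)] and their one-step continuations
N = len(series)
idx = np.arange(m - 1, N - 1)
emb = np.column_stack([series[idx - (m - 1 - j)] for j in range(m)])
targets = series[idx + 1]

# Forecast the final point using its nearest neighbour in reconstructed space
query = emb[-1]
dists = np.linalg.norm(emb[:-1] - query, axis=1)
nn = np.argmin(dists)
prediction = targets[nn]
truth = targets[-1]
```

For a deterministic signal the nearest neighbour's continuation tracks the true next value closely; a trading rule would then act on the sign or size of the predicted move.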
DOE Office of Scientific and Technical Information (OSTI.GOV)
This report documents the objectives and the conceptual and methodological approach used in the development of the Coal Production Submodule (CPS). It provides a description of the CPS for model analysts and the public. The Coal Market Module provides annual forecasts of prices, production, and consumption of coal.
Experimental droughts with rainout shelters: A methodological review
USDA-ARS?s Scientific Manuscript database
Forecast increases in the frequency, intensity and duration of droughts with climate change may have extreme and extensive ecological consequences. There are currently hundreds of published, ongoing and new drought experiments worldwide aimed to assess ecosystem sensitivities to drought and identify...
First Coast Guard district traffic model report
DOT National Transportation Integrated Search
1997-11-01
The purpose of this report was to describe the methodology used in developing the First Coast Guard District (CGD1) Traffic Model and to document the potential National Distress System (NDS) voice and data traffic forecasted for the year 2001. The ND...
Advances in Landslide Hazard Forecasting: Evaluation of Global and Regional Modeling Approach
NASA Technical Reports Server (NTRS)
Kirschbaum, Dalia B.; Adler, Robert; Hone, Yang; Kumar, Sujay; Peters-Lidard, Christa; Lerner-Lam, Arthur
2010-01-01
A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that exhibit a high potential for landslide activity by combining a calculation of landslide susceptibility with satellite-derived rainfall estimates. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale landslide forecasting efforts, it requires several modifications before it can be fully realized as an operational tool. The evaluation finds that the landslide forecasting may be more feasible at a regional scale. This study draws upon a prior work's recommendations to develop a new approach for considering landslide susceptibility and forecasting at the regional scale. This case study uses a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America: Guatemala, Honduras, El Salvador and Nicaragua. A regional susceptibility map is calculated from satellite and surface datasets using a statistical methodology. The susceptibility map is tested with a regional rainfall intensity-duration triggering relationship and results are compared to the global algorithm framework for the Hurricane Mitch event. The statistical results suggest that this regional investigation provides one plausible way to approach some of the data and resolution issues identified in the global assessment, providing more realistic landslide forecasts for this case study. Evaluation of landslide hazards for this extreme event helps to identify several potential improvements of the algorithm framework, but also highlights several remaining challenges for the algorithm assessment, transferability and performance accuracy. Evaluation challenges include representation errors from comparing susceptibility maps of different spatial resolutions, biases in event-based landslide inventory data, and limited nonlandslide event data for more comprehensive evaluation.
Additional factors that may improve algorithm performance accuracy include incorporating additional triggering factors such as tectonic activity, anthropogenic impacts and soil moisture into the algorithm calculation. Despite these limitations, the methodology presented in this regional evaluation is both straightforward to calculate and easy to interpret, making results transferable between regions and allowing findings to be placed within an inter-comparison framework. The regional algorithm scenario represents an important step in advancing regional and global-scale landslide hazard assessment and forecasting.
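A rainfall intensity-duration triggering relationship of the kind tested here can be sketched as a Caine-style power law combined with a susceptibility class. The coefficients a and b, the class coding, and the sample cells below are placeholders, not the study's fitted values.

```python
import numpy as np

def landslide_nowcast(susceptibility, intensity, duration, a=10.0, b=-0.4):
    """Flag cells where rainfall exceeds an I-D threshold and terrain is susceptible.

    I > a * D**b is a generic intensity-duration curve; a and b here are
    illustrative placeholders, not values fitted to the Hurricane Mitch data.
    """
    threshold = a * duration ** b          # mm/h threshold for each cell
    return (susceptibility >= 2) & (intensity > threshold)

susc = np.array([1, 2, 3, 3])              # e.g. 1 = low, 2 = moderate, 3 = high
inten = np.array([8.0, 8.0, 8.0, 1.0])     # rainfall intensity, mm/h
dur = np.array([24.0, 24.0, 24.0, 24.0])   # storm duration, h
alerts = landslide_nowcast(susc, inten, dur)
```

Only cells that are both susceptible and above the rainfall threshold are flagged, mirroring how the susceptibility map gates the rainfall trigger in the regional framework.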
Crespi, Francesco; Cattini, Stefano; Donini, Maurizio; Bandera, Andrea; Rovati, Luigi
2016-01-30
Near-infrared spectroscopy (NIRS) is a non-invasive technique that monitors changes in oxygenation of haemoglobin. The absorption spectra of near-infrared light differ for the oxygenated (HbO2) and deoxygenated (Hb) states of haemoglobin, so that these two states can be directly monitored. Different methodologies report different basal values of HbO2 and Hb absolute concentrations in brain. Here, we attempt to calculate basal HbO2 levels in rat CNS via evaluation of the influence of exogenous oxygen or exogenous carbon dioxide on the NIRS parameters measured in vivo. Furthermore, the possibility that changes of haemoglobin oxygenation in rat brain as measured by NIRS might be a useful index of brain penetration of chemical entities has been investigated. Different compounds from different chemical classes were selected on the basis of parallel ex vivo and in vivo pharmacokinetic (PK/PD) studies of brain penetration and overall pharmacokinetic profile. It appeared that NIRS might contribute to assessing brain penetration of chemical entities, i.e. significant changes in NIRS signals could be related to brain exposure; conversely, the lack of significant changes in relevant NIRS parameters could be indicative of low brain exposure. This work proposes a further innovation in preclinical NIRS applications, i.e. a "chemical" NIRS (chNIRS) approach for determining penetration of drugs into the animal brain. Therefore, chNIRS could become a non-invasive methodology for studying neurobiological processes and psychiatric diseases preclinically, but also a translational strategy from preclinical to clinical investigations. Copyright © 2015 Elsevier B.V. All rights reserved.
Kayen, R.E.
1997-01-01
Uncompacted artificial-fill deposits on the east side of San Francisco Bay suffered severe levels of soil liquefaction during the Loma Prieta earthquake of 17 October 1989. Damaged areas included maritime-port facilities, office buildings, and shoreline transportation arteries, ranging from 65 to 85 km from the north end of the Loma Prieta rupture zone. Typical of all these sites, which represent occurrences of liquefaction-induced damage farthest from the rupture zone, are low cone penetration test and Standard Penetration Test resistances in zones of cohesionless silty and sandy hydraulic fill, and underlying soft cohesive Holocene and Pleistocene sediment that strongly amplified ground motions. Postearthquake investigations at five study sites using standard penetration tests and cone penetration tests provide a basis for evaluation of the Arias intensity-based methodology for assessment of liquefaction susceptibility. © 1997 Kluwer Academic Publishers.
Transdermal delivery of biomacromolecules using lipid-like nanoparticles
NASA Astrophysics Data System (ADS)
Bello, Evelyn A.
The transdermal delivery of biomacromolecules, including proteins and nucleic acids, is challenging, owing to their large size and the penetration-resistant nature of the stratum corneum. Thus, an urgent need exists for the development of transdermal delivery methodologies. This research focuses on the use of cationic lipid-like nanoparticles (lipidoids) for the transdermal delivery of proteins, and establishes an in vitro model for the study. The lipidoids used were first combinatorially designed and synthesized; afterwards, they were employed for protein encapsulation in a vesicular system. A skin penetration study demonstrated that lipidoids enhance penetration depth in a pig skin model, overcoming the barrier that the stratum corneum presents. This research has successfully identified active lipidoids capable of efficiently penetrating the skin; therefore, loading proteins into lipidoid nanoparticles will facilitate the transdermal delivery of proteins. Membrane diffusion experiments were used to confirm the results. This research has confirmed that lipidoids are a suitable material for transdermal protein delivery enhancement.
Online Analysis of Wind and Solar Part I: Ramping Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.
2012-01-31
To facilitate wider penetration of renewable resources without compromising system reliability, which is challenged by the limited predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.
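One simple way to express "additional capacity requirements caused by forecast uncertainty" is as percentiles of historical forecast errors. This is a hedged sketch of the idea, not the PNNL/CAISO tool's algorithm; the error distribution and coverage level below are invented.

```python
import numpy as np

def capacity_requirement(forecast_errors, coverage=0.95):
    """Down- and up-regulation capacity covering a given share of forecast errors."""
    lo, hi = np.percentile(forecast_errors,
                           [(1 - coverage) / 2 * 100, (1 + coverage) / 2 * 100])
    return lo, hi  # MW of downward and upward capacity

rng = np.random.default_rng(3)
errors = rng.normal(0.0, 100.0, 10000)  # net load minus forecast, MW (synthetic)
down, up = capacity_requirement(errors)
```

An operational tool would condition these percentiles on lead time, hour of day, and the current wind/solar forecast rather than pooling all errors together.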
The Wind Integration National Dataset (WIND) toolkit (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caroline Draxl: NREL
2014-01-01
Regional wind integration studies require detailed wind power output data at many locations to perform simulations of how the power system will operate under high penetration scenarios. The wind datasets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, as well as being time synchronized with available load profiles. As described in this presentation, the WIND Toolkit fulfills these requirements by providing a state-of-the-art national (US) wind resource, power production and forecast dataset.
NASA Technical Reports Server (NTRS)
Cleary, B.; Pearson, R. W.; Greenwood, S. W.; Kaplan, L.
1978-01-01
The extent of the threat to the US helicopter industry posed by a determined effort by foreign manufacturers, European companies in particular, to supply their own domestic markets and also to penetrate export markets, including the USA, is assessed. Available data on US and world markets for civil and military uses are collated and presented in both graphic and tabular form showing the past history of production and markets and, where forecasts are available, anticipated future trends. The data are discussed on an item-by-item basis and inferences are drawn in as much depth as appears justified.
Distortion Representation of Forecast Errors for Model Skill Assessment and Objective Analysis
NASA Technical Reports Server (NTRS)
Hoffman, Ross N.; Nehrkorn, Thomas; Grassotti, Christopher
1996-01-01
We study a novel characterization of errors for numerical weather predictions. In its simplest form we decompose the error into a part attributable to phase errors and a remainder. The phase error is represented in the same fashion as a velocity field and will be required to vary slowly and smoothly with position. A general distortion representation allows for the displacement and a bias correction of forecast anomalies. In brief, the distortion is determined by minimizing the objective function by varying the displacement and bias correction fields. In the present project we use a global or hemispheric domain, and spherical harmonics to represent these fields. In this project we are initially focusing on the assessment application, restricted to a realistic but univariate 2-dimensional situation. Specifically we study the forecast errors of the 500 hPa geopotential height field for forecasts of the short and medium range. The forecasts are those of the Goddard Earth Observing System data assimilation system. Results presented show that the methodology works, that a large part of the total error may be explained by a distortion limited to triangular truncation at wavenumber 10, and that the remaining residual error contains mostly small spatial scales.
Smirnova, Alexandra; deCamp, Linda; Chowell, Gerardo
2017-05-02
Deterministic and stochastic methods relying on early case incidence data for forecasting epidemic outbreaks have received increasing attention during the last few years. In mathematical terms, epidemic forecasting is an ill-posed problem due to instability of parameter identification and limited available data. While previous studies have largely estimated the time-dependent transmission rate by assuming specific functional forms (e.g., exponential decay) that depend on a few parameters, here we introduce a novel approach for the reconstruction of nonparametric time-dependent transmission rates by projecting onto a finite subspace spanned by Legendre polynomials. This approach enables us to effectively forecast future incidence cases, a clear advantage over recovering the transmission rate at finitely many grid points within the interval where the data are currently available. In our approach, we compare three regularization algorithms: variational (Tikhonov's) regularization, truncated singular value decomposition (TSVD), and modified TSVD, in order to determine the stabilizing strategy that is most effective in terms of reliability of forecasting from limited data. We illustrate our methodology using simulated data as well as case incidence data for various epidemics, including the 1918 influenza pandemic in San Francisco and the 2014-2015 Ebola epidemic in West Africa.
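Projection onto a finite subspace spanned by Legendre polynomials can be sketched with NumPy's Legendre module. The quadratic "transmission rate" below is a synthetic stand-in, and the regularization step the paper compares (Tikhonov, TSVD, modified TSVD) is omitted here.

```python
import numpy as np
from numpy.polynomial import legendre

# Hypothetical time-dependent transmission rate on a normalized interval [-1, 1]
t = np.linspace(-1.0, 1.0, 200)
beta_true = 0.4 + 0.3 * t - 0.25 * t**2   # quadratic, so degree 2 suffices

# Least-squares projection onto the Legendre subspace of degree <= 2
coeffs = legendre.legfit(t, beta_true, deg=2)
beta_hat = legendre.legval(t, coeffs)
```

Representing the rate by a handful of global coefficients, rather than by its values at every grid point, is what makes extrapolation beyond the data window (and hence forecasting) possible.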
Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Shumway, Allison; McNamara, Daniel E.; Williams, Robert; Llenos, Andrea L.; Ellsworth, William L.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.
2017-01-01
We produce a one‐year 2017 seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes that updates the 2016 one‐year forecast; this map is intended to provide information to the public and to facilitate the development of induced seismicity forecasting models, methods, and data. The 2017 hazard model applies the same methodology and input logic tree as the 2016 forecast, but with an updated earthquake catalog. We also evaluate the 2016 seismic‐hazard forecast to improve future assessments. The 2016 forecast indicated high seismic hazard (greater than 1% probability of potentially damaging ground shaking in one year) in five focus areas: Oklahoma–Kansas, the Raton basin (Colorado/New Mexico border), north Texas, north Arkansas, and the New Madrid Seismic Zone. During 2016, several damaging induced earthquakes occurred in Oklahoma within the highest hazard region of the 2016 forecast; all of the 21 moment magnitude (M) ≥4 and 3 M≥5 earthquakes occurred within the highest hazard area in the 2016 forecast. Outside the Oklahoma–Kansas focus area, two earthquakes with M≥4 occurred near Trinidad, Colorado (in the Raton basin focus area), but no earthquakes with M≥2.7 were observed in the north Texas or north Arkansas focus areas. Several observations of damaging ground‐shaking levels were also recorded in the highest hazard region of Oklahoma. The 2017 forecasted seismic rates are lower in regions of induced activity due to lower rates of earthquakes in 2016 compared with 2015, which may be related to decreased wastewater injection caused by regulatory actions or by a decrease in unconventional oil and gas production. Nevertheless, the 2017 forecasted hazard is still significantly elevated in Oklahoma compared to the hazard calculated from seismicity before 2009.
Monitoring and seasonal forecasting of meteorological droughts
NASA Astrophysics Data System (ADS)
Dutra, Emanuel; Pozzi, Will; Wetterhall, Fredrik; Di Giuseppe, Francesca; Magnusson, Linus; Naumann, Gustavo; Barbosa, Paulo; Vogt, Jurgen; Pappenberger, Florian
2015-04-01
Near-real time drought monitoring can provide decision makers valuable information for use in several areas, such as water resources management, or international aid. Unfortunately, a major constraint in current drought outlooks is the lack of reliable monitoring capability for observed precipitation globally in near-real time. Furthermore, drought monitoring systems require a long record of past observations to provide mean climatological conditions. We address these constraints by developing a novel drought monitoring approach in which monthly mean precipitation is derived from short-range ECMWF probabilistic forecasts and then merged with the long term precipitation climatology of the Global Precipitation Climatology Centre (GPCC) dataset. Merging the two makes available a real-time global precipitation product out of which the Standardized Precipitation Index (SPI) can be estimated and used for global or regional drought monitoring work. This approach provides stability in that it bypasses problems of latency (lags) in having local rain-gauge measurements available in real time, and lags in satellite precipitation products. Seasonal drought forecasts can also be prepared using the same methodology, based upon two data sources used to provide initial conditions (GPCC and the ECMWF ERA-Interim reanalysis (ERAI)) combined with either the current ECMWF seasonal forecast or a climatology based upon ensemble forecasts. Verification of the forecasts as a function of lead time revealed a reduced impact on skill for: (i) long lead times using different initial conditions, and (ii) short lead times using different precipitation forecasts. The memory effect of initial conditions was found to be 1 month lead time for the SPI-3, 3 to 4 months for the SPI-6 and 5 months for the SPI-12. Results show that dynamical forecasts of precipitation provide added value, with skill similar to or better than that of climatological forecasts.
In some cases, particularly for long SPI time scales, it is very difficult to improve on the use of climatological forecasts. However, results presented regionally and globally pinpoint several regions in the world where drought onset forecasting is feasible and skilful.
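The Standardized Precipitation Index central to this system can be sketched in simplified form. Operational SPI fits a gamma distribution to the precipitation record and transforms it to a standard normal; plain standardization below keeps the sketch short and dependency-free, and the synthetic record is not GPCC data.

```python
import numpy as np

def spi_normal(precip):
    """Simplified SPI: standardize precipitation totals against their climatology.

    The operational index instead fits a gamma distribution and maps its CDF
    onto a standard normal; this shortcut illustrates only the standardization.
    """
    return (precip - precip.mean()) / precip.std()

rng = np.random.default_rng(4)
monthly = rng.gamma(shape=2.0, scale=40.0, size=360)  # 30 years of monthly totals
spi = spi_normal(monthly)
drought_months = spi < -1.0  # moderate-or-worse drought under common thresholds
```

Accumulating the totals over 3, 6, or 12 months before standardizing yields the SPI-3, SPI-6, and SPI-12 variants whose predictability the abstract compares.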
A multi-source data assimilation framework for flood forecasting: Accounting for runoff routing lags
NASA Astrophysics Data System (ADS)
Meng, S.; Xie, X.
2015-12-01
In the flood forecasting practice, model performance is usually degraded due to various sources of uncertainties, including the uncertainties from input data, model parameters, model structures and output observations. Data assimilation is a useful methodology to reduce uncertainties in flood forecasting. For short-term flood forecasting, an accurate estimation of the initial soil moisture condition will improve the forecasting performance. The time delay of runoff routing is another important consideration for forecasting performance. Moreover, observation data of hydrological variables (including ground observations and satellite observations) are becoming easily available. The reliability of short-term flood forecasting could be improved by assimilating multi-source data. The objective of this study is to develop a multi-source data assimilation framework for real-time flood forecasting. In this data assimilation framework, the first step is assimilating the up-layer soil moisture observations to update model state and generated runoff based on the ensemble Kalman filter (EnKF) method, and the second step is assimilating discharge observations to update model state and runoff within a fixed time window based on the ensemble Kalman smoother (EnKS) method. This smoothing technique is adopted to account for the runoff routing lag. Using such an assimilation framework for the soil moisture and discharge observations is expected to improve the flood forecasting. To isolate the effectiveness of this dual-step assimilation framework, we designed a dual-EnKF algorithm in which the observed soil moisture and discharge are assimilated separately without accounting for the runoff routing lag. The results show that the multi-source data assimilation framework can effectively improve flood forecasting, especially when the runoff routing has a distinct time lag. 
Thus, this new data assimilation framework holds great potential in operational flood forecasting by merging observations from ground measurement and remote sensing retrievals.
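The EnKF update at the core of the first assimilation step can be sketched for a scalar state with perturbed observations. The soil-moisture numbers below are invented, and the EnKS smoothing over a routing-lag window is not shown.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 300

# Forecast ensemble of a single state (e.g. up-layer soil moisture), plus one obs
forecast = rng.normal(0.30, 0.05, N)   # prior ensemble: mean 0.30, spread 0.05
obs, obs_err = 0.36, 0.05              # observation and its error std

# EnKF analysis step with perturbed observations (observation operator H = 1)
Pf = forecast.var(ddof=1)              # sample forecast error variance
K = Pf / (Pf + obs_err**2)             # Kalman gain
perturbed = obs + rng.normal(0.0, obs_err, N)
analysis = forecast + K * (perturbed - forecast)
```

The analysis ensemble is pulled toward the observation and its spread shrinks, which is exactly the uncertainty reduction the framework exploits before the EnKS step redistributes the correction over the routing-lag window.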
NASA Astrophysics Data System (ADS)
Delaney, C.; Hartman, R. K.; Mendoza, J.; Whitin, B.
2017-12-01
Forecast informed reservoir operations (FIRO) is a methodology that incorporates short to mid-range precipitation and flow forecasts to inform the flood operations of reservoirs. The Ensemble Forecast Operations (EFO) alternative is a probabilistic approach of FIRO that incorporates ensemble streamflow predictions (ESPs) made by NOAA's California-Nevada River Forecast Center (CNRFC). With the EFO approach, release decisions are made to manage forecasted risk of reaching critical operational thresholds. A water management model was developed for Lake Mendocino, a 111,000 acre-foot reservoir located near Ukiah, California, to evaluate the viability of the EFO alternative to improve water supply reliability but not increase downstream flood risk. Lake Mendocino is a dual use reservoir, which is owned and operated for flood control by the United States Army Corps of Engineers and is operated for water supply by the Sonoma County Water Agency. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has suffered from water supply reliability issues since 2007. The EFO alternative was simulated using a 26-year (1985-2010) ESP hindcast generated by the CNRFC. The ESP hindcast was developed using Global Ensemble Forecast System version 10 precipitation reforecasts processed with the Hydrologic Ensemble Forecast System to generate daily reforecasts of 61 flow ensemble members for a 15-day forecast horizon. Model simulation results demonstrate that the EFO alternative may improve water supply reliability for Lake Mendocino yet not increase flood risk for downstream areas. The developed operations framework can directly leverage improved skill in the second week of the forecast and is extendable into the S2S time domain given the demonstration of improved skill through a reliable reforecast of adequate historical duration and consistent with operationally available numerical weather predictions.
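The EFO decision rule, releasing just enough water that the forecasted risk of reaching a critical threshold stays within tolerance, can be sketched as follows. The storage, flood-pool volume, inflow traces, and 10% tolerance are illustrative assumptions, not Lake Mendocino operating values.

```python
import numpy as np

def efo_release(storage_af, ensemble_inflow_af, flood_pool_af, risk_tolerance=0.10):
    """Smallest release keeping the forecasted risk of flood-pool encroachment
    within tolerance. One projected storage per ensemble trace; values in
    acre-feet are illustrative placeholders."""
    projected = storage_af + ensemble_inflow_af
    for release in np.arange(0.0, 50000.0, 500.0):
        risk = np.mean(projected - release > flood_pool_af)
        if risk <= risk_tolerance:
            return release
    return 50000.0

# 20 hypothetical forecast traces of cumulative inflow over the horizon
inflows = np.array([5000., 8000., 12000., 20000., 35000.] * 4)
release = efo_release(storage_af=90000., ensemble_inflow_af=inflows,
                      flood_pool_af=111000.)
```

Because only the wettest traces threaten the flood pool, the rule releases far less than a deterministic worst-case rule would, which is how EFO preserves water supply without raising flood risk.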
Comparison of Observation Impacts in Two Forecast Systems using Adjoint Methods
NASA Technical Reports Server (NTRS)
Gelaro, Ronald; Langland, Rolf; Todling, Ricardo
2009-01-01
An experiment is being conducted to compare directly the impact of all assimilated observations on short-range forecast errors in different operational forecast systems. We use the adjoint-based method developed by Langland and Baker (2004), which allows these impacts to be efficiently calculated. This presentation describes preliminary results for a "baseline" set of observations, including both satellite radiances and conventional observations, used by the Navy/NOGAPS and NASA/GEOS-5 forecast systems for the month of January 2007. In each system, about 65% of the total reduction in 24-h forecast error is provided by satellite observations, although the impact of rawinsonde, aircraft, land, and ship-based observations remains significant. Only a small majority (50-55%) of all observations assimilated improves the forecast, while the rest degrade it. It is found that most of the total forecast error reduction comes from observations with moderate-size innovations providing small to moderate impacts, not from outliers with very large positive or negative innovations. In a global context, the relative impacts of the major observation types are fairly similar in each system, although regional differences in observation impact can be significant. Of particular interest is the fact that while satellite radiances have a large positive impact overall, they degrade the forecast in certain locations common to both systems, especially over land and ice surfaces. Ongoing comparisons of this type, with results expected from other operational centers, should lead to more robust conclusions about the impacts of the various components of the observing system as well as about the strengths and weaknesses of the methodologies used to assimilate them.
Potential for malaria seasonal forecasting in Africa
NASA Astrophysics Data System (ADS)
Tompkins, Adrian; Di Giuseppe, Francesca; Colon-Gonzalez, Felipe; Namanya, Didas; Friday, Agabe
2014-05-01
As monthly and seasonal dynamical prediction systems have improved their skill in the tropics over recent years, there is now the potential to use these forecasts to drive dynamical malaria modelling systems to provide early warnings in epidemic and meso-endemic regions. We outline a new pilot operational system that has been developed at ECMWF and ICTP. It uses a precipitation bias correction methodology to seamlessly join the monthly ensemble prediction system (EPS) and seasonal (System 4) forecast systems of ECMWF. The resulting temperature and rainfall forecasts for Africa are then used to drive the recently developed ICTP malaria model known as VECTRI. The resulting coupled system of ECMWF climate forecasts and VECTRI thus produces predictions of malaria prevalence rates and transmission intensity across Africa. The forecasts are filtered to highlight the regions and months in which the system has particular value due to high year-to-year variability. In addition to epidemic areas, these also include meso- and hyper-endemic regions which undergo considerable variability in the onset months. We demonstrate the limits of the forecast skill as a function of lead time, showing that for many areas the dynamical system can add one to two months of additional warning time over a system based on environmental monitoring. We then evaluate the past forecasts against district-level case data in Uganda and show that when interventions can be discounted, the system can show significant skill at predicting interannual variability in transmission intensity up to 3 or 4 months ahead at the district scale. The prospects for an operational implementation are briefly discussed.
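Bias corrections that seamlessly join two forecast systems are commonly built with quantile mapping; the sketch below is a generic empirical quantile-mapping routine under that assumption, not the specific ECMWF/ICTP implementation:

```python
import bisect

def quantile_map(value, model_climatology, obs_climatology):
    """Empirical quantile mapping: locate the value's quantile in the
    model climatology and return the same quantile of the observed
    climatology.  A generic sketch with tiny toy climatologies."""
    model_sorted = sorted(model_climatology)
    obs_sorted = sorted(obs_climatology)
    q = min(bisect.bisect_left(model_sorted, value) / len(model_sorted), 1 - 1e-9)
    return obs_sorted[int(q * len(obs_sorted))]

# A model that is wet-biased by a factor of two is mapped back onto
# the observed precipitation distribution.
print(quantile_map(6, [2, 4, 6, 8, 10], [1, 2, 3, 4, 5]))  # → 3
```

Applying the same mapping to both the EPS and seasonal climatologies is one way to remove the discontinuity where the two systems are joined.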
Deodhar, Suruchi; Bisset, Keith; Chen, Jiangzhuo; Barrett, Chris; Wilson, Mandy; Marathe, Madhav
2016-01-01
Public health decision makers need access to high resolution situation assessment tools for understanding the extent of various epidemics in different regions of the world. In addition, they need insights into the future course of epidemics by way of forecasts. Such forecasts are essential for planning the allocation of limited resources and for implementing several policy-level and behavioral intervention strategies. The need for such forecasting systems became evident in the wake of the recent Ebola outbreak in West Africa. We have developed EpiCaster, an integrated Web application for situation assessment and forecasting of various epidemics, such as Flu and Ebola, that are prevalent in different regions of the world. Using EpiCaster, users can assess the magnitude and severity of different epidemics at highly resolved spatio-temporal levels. EpiCaster provides time-varying heat maps and graphical plots to view trends in the disease dynamics. EpiCaster also allows users to visualize data gathered through surveillance mechanisms, such as Google Flu Trends (GFT) and the World Health Organization (WHO). The forecasts provided by EpiCaster are generated using different epidemiological models, and the users can select the models through the interface to filter the corresponding forecasts. EpiCaster also allows the users to study epidemic propagation in the presence of a number of intervention strategies specific to certain diseases. Here we describe the modeling techniques, methodologies and computational infrastructure that EpiCaster relies on to support large-scale predictive analytics for situation assessment and forecasting of global epidemics. PMID:27796009
Methodology for Designing Operational Banking Risks Monitoring System
NASA Astrophysics Data System (ADS)
Kostjunina, T. N.
2018-05-01
The research looks at principles of designing an information system for monitoring operational banking risks. The proposed design methodology enables automation of the collection of data on information security incidents in the banking network, serving as the basis for an integrated approach to the creation of an operational risk management system. The system can operate remotely, ensuring the tracking and forecasting of various operational events in the bank network. A structure for a content management system is described.
Optimization of Automobile Crush Characteristics: Technical Report
DOT National Transportation Integrated Search
1975-10-01
A methodology is developed for the evaluation and optimization of societal costs of two-vehicle automobile collisions. Costs considered in a Figure of Merit include costs of injury/mortality, occupant compartment penetration, collision damage repairs...
Physician supply forecast: better than peering in a crystal ball?
Roberfroid, Dominique; Leonard, Christian; Stordeur, Sabine
2009-01-01
Background Anticipating physician supply to tackle future health challenges is a crucial but complex task for policy planners. A number of forecasting tools are available, but the methods, advantages and shortcomings of such tools are not straightforward and not always well appraised. Therefore this paper had two objectives: to present a typology of existing forecasting approaches and to analyse the methodology-related issues. Methods A literature review was carried out in electronic databases Medline-Ovid, Embase and ERIC. Concrete examples of planning experiences in various countries were analysed. Results Four main forecasting approaches were identified. The supply projection approach defines the necessary inflow to maintain or to reach in the future an arbitrary predefined level of service offer. The demand-based approach estimates the quantity of health care services used by the population in the future to project physician requirements. The needs-based approach involves defining and predicting health care deficits so that they can be addressed by an adequate workforce. Benchmarking health systems with similar populations and health profiles is the last approach. These different methods can be combined to perform a gap analysis. The methodological challenges of such projections are numerous: most often static models are used and their uncertainty is not assessed; valid and comprehensive data to feed into the models are often lacking; and a rapidly evolving environment affects the likelihood of projection scenarios. As a result, the internal and external validity of the projections included in our review appeared limited. Conclusion There is no single accepted approach to forecasting physician requirements. The value of projections lies in their utility in identifying the current and emerging trends to which policy-makers need to respond. 
A genuine gap analysis, an effective monitoring of key parameters and comprehensive workforce planning are key elements to improving the usefulness of physician supply projections. PMID:19216772
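The supply projection approach described in the review can be sketched as a simple stock-and-flow recursion. The rates below are hypothetical, and the static-rate assumption is precisely the kind of simplification the authors criticize:

```python
def project_supply(initial_fte, inflow_per_year, attrition_rate, years):
    """Stock-and-flow supply projection: next year's workforce equals
    the survivors of this year's stock plus new entrants.  Rates are
    held static over the horizon (a common but criticized simplification)."""
    supply = [float(initial_fte)]
    for _ in range(years):
        supply.append(supply[-1] * (1 - attrition_rate) + inflow_per_year)
    return supply

# Hypothetical: 1,000 FTE physicians, 50 graduates/year, 3% annual attrition.
print([round(s) for s in project_supply(1000, 50, 0.03, 3)])
```

A gap analysis would compare this supply path against an independently projected demand or needs path year by year.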
Olshansky, S Jay; Goldman, Dana P; Zheng, Yuhui; Rowe, John W
2009-01-01
Context: The aging of the baby boom generation, the extension of life, and progressive increases in disability-free life expectancy have generated a dramatic demographic transition in the United States. Official government forecasts may, however, have inadvertently underestimated life expectancy, which would have major policy implications, since small differences in forecasts of life expectancy produce very large differences in the number of people surviving to an older age. This article presents a new set of population and life expectancy forecasts for the United States, focusing on transitions that will take place by midcentury. Methods: Forecasts were made with a cohort-components methodology, based on the premise that the risk of death will be influenced in the coming decades by accelerated advances in biomedical technology that either delay the onset and age progression of major fatal diseases or that slow the aging process itself. Findings: Results indicate that the current forecasts of the U.S. Social Security Administration and U.S. Census Bureau may underestimate the rise in life expectancy at birth for men and women combined, by 2050, from 3.1 to 7.9 years. Conclusions: The cumulative outlays for Medicare and Social Security could be higher by $3.2 to $8.3 trillion relative to current government forecasts. This article discusses the implications of these results regarding the benefits and costs of an aging society and the prospect that health disparities could attenuate some of these changes. PMID:20021588
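A cohort-components projection advances a population forward by age-specific survival. The sketch below is a bare-bones, one-sex version with no fertility or migration model and invented rates, far simpler than the forecasts in the article:

```python
def project_cohorts(pop_by_age, survival, years):
    """One-sex cohort-component projection with no migration and a fixed
    youngest cohort (i.e. no fertility model): each year the population
    is aged forward under age-specific survival probabilities."""
    pop = list(pop_by_age)
    for _ in range(years):
        aged = [0.0] * len(pop)
        for age in range(len(pop) - 1):
            aged[age + 1] = pop[age] * survival[age]
        aged[-1] += pop[-1] * survival[-1]   # open-ended oldest age group
        aged[0] = pop[0]                     # constant entry cohort (sketch only)
        pop = aged
    return pop

# Hypothetical 3-group population with invented survival probabilities.
print(project_cohorts([100, 100, 100], [0.99, 0.95, 0.80], 1))
```

The article's central point maps directly onto the `survival` vector: small upward adjustments to survival at older ages compound into large changes in the projected size of the oldest groups.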
Confocal laser scanning microscopy to estimate nanoparticles’ human skin penetration in vitro
Elmahdy, Akram; Cao, Yachao; Hui, Xiaoying; Maibach, Howard
2017-01-01
Objective With rapid development of nanotechnology, there is increasing interest in nanoparticle (NP) application and its safety and efficacy on human skin. In this study, we utilized confocal laser scanning microscopy to estimate NP skin penetration. Methods Three different-sized polystyrene NPs marked with red fluorescence were applied to human skin, and Calcium Green 5N was used as a counterstain. Dimethyl sulfoxide (DMSO) and ethanol were used as alternative vehicles for NPs. Tape stripping was utilized as a barrier-damaged skin model. Skin biopsies dosed with NPs were incubated at 4°C or 37°C for 24 hours and imaged using confocal laser scanning microscopy. Results NPs were localized in the stratum corneum (SC) and hair follicles without penetrating the epidermis/dermis. Barrier alteration with tape stripping and change in incubation temperature did not induce deeper penetration. DMSO enhanced NP SC penetration but ethanol did not. Conclusion Except with DMSO vehicle, these hydrolyzed polystyrene NPs did not penetrate intact or barrier-damaged human “viable” epidermis. For further clinical relevance, in vivo human skin studies and more sensitive analytic chemical methodology are suggested. PMID:29184403
VMT Mix Modeling for Mobile Source Emissions Forecasting: Formulation and Empirical Application
DOT National Transportation Integrated Search
2000-05-01
The purpose of the current report is to propose and implement a methodology for obtaining improved link-specific vehicle miles of travel (VMT) mix values compared to those obtained from existent methods. Specifically, the research is developing a fra...
A methodology for incorporating fuel price impacts into short-term transit ridership forecasts.
DOT National Transportation Integrated Search
2009-08-01
Anticipating changes to public transportation ridership demand is important to planning for and meeting : service goals and maintaining system viability. These changes may occur in the short- or long-term; : extensive academic work has focused on bet...
Methodology to estimate particulate matter emissions from certified commercial aircraft engines.
DOT National Transportation Integrated Search
2009-01-01
Today, about one-fourth of U.S. commercial service airports, : including 41 of the busiest 50, are either in nonattainment : or maintenance areas per the National Ambient : Air Quality Standards. U.S. aviation activity is forecasted : to triple by 20...
DEVELOPMENT AND EVALUATION OF PM 2.5 SOURCE APPORTIONMENT METHODOLOGIES
The receptor model called Positive Matrix Factorization (PMF) has been extensively used to apportion sources of ambient fine particulate matter (PM2.5), but the accuracy of source apportionment results currently remains unknown. In addition, air quality forecast model...
Projecting long term medical spending growth.
Borger, Christine; Rutherford, Thomas F; Won, Gregory Y
2008-01-01
We present a dynamic general equilibrium model of the U.S. economy and the medical sector in which the adoption of new medical treatments is endogenous and the demand for medical services is conditional on the state of technology. We use this model to prepare 75-year medical spending forecasts and a projection of the Medicare actuarial balance, and we compare our results to those obtained from a method that has been used by government actuaries. Our baseline forecast predicts slower health spending growth in the long run and a lower Medicare actuarial deficit relative to the previous projection methodology.
NASA Astrophysics Data System (ADS)
Cervone, G.; Clemente-Harding, L.; Alessandrini, S.; Delle Monache, L.
2016-12-01
A methodology based on Artificial Neural Networks (ANN) and an Analog Ensemble (AnEn) is presented to generate 72-hour deterministic and probabilistic forecasts of power generated by photovoltaic (PV) power plants, using input from a numerical weather prediction model and computed astronomical variables. ANN and AnEn are used individually and in combination to generate forecasts for three solar power plants located in Italy. The computational scalability of the proposed solution is tested using synthetic data simulating 4,450 PV power stations. The NCAR Yellowstone supercomputer is employed to test the parallel implementation of the proposed solution, ranging from 1 node (32 cores) to 4,450 nodes (141,140 cores). Results show that a combined AnEn + ANN solution yields the best results, and that the proposed solution is well suited for massive-scale computation.
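The analog ensemble step can be sketched independently of the ANN component: rank archived forecasts by similarity to the current one and take the matching observations as the predictive ensemble. The data below are invented:

```python
def analog_ensemble(current_forecast, past_forecasts, past_observations, k=3):
    """Analog Ensemble (AnEn) sketch: rank archived NWP forecasts by
    Euclidean similarity to the current forecast and return the k
    matching observed power values as the predictive ensemble."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    ranked = sorted(range(len(past_forecasts)),
                    key=lambda i: dist(past_forecasts[i], current_forecast))
    return [past_observations[i] for i in ranked[:k]]

# Invented predictor vectors (e.g. irradiance, cloud cover) and observed power.
past_fc = [(0, 0), (1, 1), (5, 5), (1, 0)]
past_obs = [10, 20, 50, 15]
print(analog_ensemble((1, 1), past_fc, past_obs, k=2))  # → [20, 15]
```

The returned values form an empirical predictive distribution: their mean gives a deterministic forecast and their spread a probabilistic one, which is what makes AnEn embarrassingly parallel across stations.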
Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael
2017-09-01
The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of < 0.1% applying current EUCAST CBPs. Error rates were > 0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to < 0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability. © The Author 2017. 
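The categorization-error probability for a single strain under normal methodological variation can be sketched as follows; the parameters are illustrative stand-ins for the paper's fitted mixture components and QC-derived variation:

```python
from statistics import NormalDist

def categorization_error(true_mu_mm, method_sd_mm, breakpoint_mm, susceptible=True):
    """Probability that methodological variation pushes an observed
    inhibition zone diameter across the clinical breakpoint (CBP).
    Illustrative parameters only."""
    observed = NormalDist(true_mu_mm, method_sd_mm)
    if susceptible:   # truly susceptible strain misread as non-susceptible
        return observed.cdf(breakpoint_mm)
    return 1 - observed.cdf(breakpoint_mm)   # truly resistant strain misread

# A truly susceptible strain 3 mm above the breakpoint, with 1.5 mm method SD:
print(round(categorization_error(22, 1.5, 19), 5))  # ≈ 0.02275
```

Summing such probabilities over the fitted mixture components, weighted by their prevalence, gives the population-level error rate; a zone of methodological uncertainty is then a diameter band around the CBP within which no call is made.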
Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Operational hydrological forecasting in Bavaria. Part II: Ensemble forecasting
NASA Astrophysics Data System (ADS)
Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.
2009-04-01
In part I of this study, the operational flood forecasting system in Bavaria and an approach to identify and quantify forecast uncertainty were introduced. The approach is split into the calculation of an empirical 'overall error' from archived forecasts and the calculation of an empirical 'model error' based on hydrometeorological forecast tests, where rainfall observations were used instead of forecasts. The 'model error' can be superimposed on the spread of a hydrometeorological ensemble forecast, which is especially useful in upstream catchments where forecast uncertainty depends strongly on the current predictability of the atmosphere. In Bavaria, two meteorological ensemble prediction systems are currently tested for operational use: the 16-member COSMO-LEPS forecast and a poor man's ensemble composed of DWD GME, DWD Cosmo-EU, NCEP GFS, Aladin-Austria, and MeteoSwiss Cosmo-7. The determination of the overall forecast uncertainty depends on the catchment characteristics: 1. Upstream catchments with high influence of the weather forecast: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. b) Corresponding to the characteristics of the meteorological ensemble forecast, each resulting forecast hydrograph can be regarded as equally likely. c) The 'model error' distribution, with parameters dependent on the hydrological case and lead time, is added to each forecast timestep of each ensemble member. d) For each forecast timestep, the overall error distribution (i.e., pooled across the 'model error' distributions of all ensemble members) is calculated. e) From this distribution, the uncertainty range at a desired level (here: the 10% and 90% percentiles) is extracted and drawn as a forecast envelope. f) As the mean or median of an ensemble forecast does not necessarily exhibit a meteorologically sound temporal evolution, a single hydrological forecast termed the 'lead forecast' is chosen and shown in addition to the uncertainty bounds.
This can be either an intermediate forecast between the extremes of the ensemble spread or a manually selected forecast based on a meteorologist's advice. 2. Downstream catchments with low influence of the weather forecast: In downstream catchments with strong human impact on discharge (e.g. by reservoir operation) and large influence of upstream gauge observation quality on forecast quality, the 'overall error' may in most cases be larger than the combination of the 'model error' and the ensemble spread. Therefore, the overall forecast uncertainty bounds are calculated differently: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. Here, additionally, the corresponding inflow hydrographs from all upstream catchments must be used. b) As for an upstream catchment, the uncertainty range is determined by combining the 'model error' and the ensemble member forecasts. c) In addition, the 'overall error' is superimposed on the 'lead forecast'. For reasons of consistency, the lead forecast must be based on the same meteorological forecast in the downstream and all upstream catchments. d) From the resulting two uncertainty ranges (one from the ensemble forecast and 'model error', one from the 'lead forecast' and 'overall error'), the envelope is taken as the most prudent uncertainty range. In sum, the uncertainty associated with each forecast run is calculated and communicated to the public in the form of 10% and 90% percentiles. As in part I of this study, the methodology as well as the usefulness (or otherwise) of the resulting uncertainty ranges will be presented and discussed using typical examples.
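The percentile-envelope construction for a single forecast timestep can be sketched as follows, assuming (for illustration only) a normal 'model error' distribution rather than the empirical one used operationally:

```python
import random

def forecast_envelope(member_forecasts, model_error_sd, n_samples=9999, seed=1):
    """Percentile envelope for one forecast timestep: superimpose a
    normal 'model error' on each ensemble member, pool the samples,
    and extract the 10% and 90% percentiles."""
    rng = random.Random(seed)
    per_member = n_samples // len(member_forecasts)
    pooled = sorted(m + rng.gauss(0, model_error_sd)
                    for m in member_forecasts
                    for _ in range(per_member))
    return (pooled[int(0.10 * len(pooled))],
            pooled[int(0.90 * len(pooled))])

# Hypothetical discharge members (m^3/s) with a 5 m^3/s model-error SD.
lo, hi = forecast_envelope([100.0, 110.0, 125.0], 5.0)
print(round(lo, 1), round(hi, 1))
```

Repeating this per timestep traces out the forecast envelope that is drawn around the chosen 'lead forecast'.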
Perimeter Barrier Selection Guide
DOT National Transportation Integrated Search
1989-05-01
This document provides a methodology to determine the magnitude of the threat from attack vehicles to the perimeter of a facility. The threat is determined by the penetration tolerance and the maximum speed attainable. After the threat is defined thi...
Diversity modelling for electrical power system simulation
NASA Astrophysics Data System (ADS)
Sharip, R. M.; Abu Zarim, M. A. U. A.
2013-12-01
This paper considers diversity of generation and demand profiles against different future energy scenarios and evaluates these on a technical basis. Compared to previous studies, this research applied a forecasting concept based on possible growth rates drawn from publicly available electricity distribution scenarios concerning the UK. These scenarios were created by different bodies considering aspects such as environment, policy, regulation, economics and technology. In line with these scenarios, forecasting is on a long-term timescale (at ten-year intervals from 2020 until 2050) in order to create possible generation-mix and demand profiles to be used as appropriate boundary conditions for the network simulation. The network considered is a segment of a rural low-voltage (LV) network populated with a mixture of different housing types. The profiles for the 'future' energy and demand have been successfully modelled by applying a forecasting method. The network results under these profiles show, for the cases studied, that even though the power produced by each micro-generation unit is often in line with the demand requirements of an individual dwelling, no problems arise from high penetration of micro-generation and demand-side management for the dwellings considered. The results obtained highlight the technical issues and changes in energy delivery and management for rural customers under the future energy scenarios.
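The decadal growth-rate forecasting described above amounts to compound scenario projection; the sketch below uses hypothetical scenario names and annual growth rates, not those of the UK scenario bodies:

```python
def project_scenarios(base_demand_mw, scenarios, start=2020, end=2050, step=10):
    """Compound-growth projection of demand (or generation capacity) for
    each named scenario at decadal checkpoints.  Names and annual growth
    rates are hypothetical."""
    years = range(start, end + 1, step)
    return {name: {yr: base_demand_mw * (1 + rate) ** (yr - start) for yr in years}
            for name, rate in scenarios.items()}

paths = project_scenarios(100.0, {"green_push": 0.02, "slow_progression": 0.01})
print({yr: round(mw, 1) for yr, mw in paths["green_push"].items()})
```

Each scenario path then supplies the boundary condition (total demand or micro-generation penetration) for one run of the network simulation.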
Jones, Michael L.; Shuter, Brian J.; Zhao, Yingming; Stockwell, Jason D.
2006-01-01
Future changes to climate in the Great Lakes may have important consequences for fisheries. Evidence suggests that Great Lakes air and water temperatures have risen and the duration of ice cover has lessened during the past century. Global circulation models (GCMs) suggest future warming and increases in precipitation in the region. We present new evidence that water temperatures have risen in Lake Erie, particularly during summer and winter, in the period 1965-2000. GCM forecasts coupled with physical models suggest lower annual runoff, less ice cover, and lower lake levels in the future, but the certainty of these forecasts is low. Assessment of the likely effects of climate change on fish stocks will require an integrative approach that considers several components of habitat rather than water temperature alone. We recommend using mechanistic models that couple habitat conditions to population demographics to explore integrated effects of climate-caused habitat change and illustrate this approach with a model for Lake Erie walleye (Sander vitreus). We show that the combined effect on walleye populations of plausible changes in temperature, river hydrology, lake levels, and light penetration can be quite different from that which would be expected based on consideration of only a single factor.
Economic analysis for transmission operation and planning
NASA Astrophysics Data System (ADS)
Zhou, Qun
2011-12-01
Restructuring of the electric power industry has caused dramatic changes in the use of the transmission system. Increasing congestion as well as the necessity of integrating renewable energy introduce new challenges and uncertainties to transmission operation and planning. Accurate short-term congestion forecasting facilitates market traders' bidding and trading activities. The cost sharing and recovery issue is a major impediment to long-term transmission investment for integrating renewable energy. In this research, a new short-term forecasting algorithm is proposed for predicting congestion, locational marginal prices (LMPs), and other power system variables based on the concept of system patterns. The advantage of this algorithm relative to standard statistical forecasting methods is that structural aspects underlying power market operations are exploited to reduce the forecasting error. The advantage relative to previously proposed structural forecasting methods is that data requirements are substantially reduced. Forecasting results based on a NYISO case study demonstrate the feasibility and accuracy of the proposed algorithm. Moreover, a negotiation methodology is developed to guide transmission investment for integrating renewable energy. Built on Nash bargaining theory, the negotiation of investment plans and payment rates can proceed between renewable generation and transmission companies for cost sharing and recovery. The proposed approach is applied to Garver's six-bus system. The numerical results demonstrate the fairness and efficiency of the approach, which can hence serve as guidance for renewable energy investors. The results also shed light on policy-making for renewable energy subsidies.
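A Nash bargaining selection over candidate payment rates can be sketched as maximizing the product of the parties' utility gains over their disagreement points; the utility functions below are toys, not the models used in the dissertation:

```python
def nash_bargain(payment_rates, gen_utility, trans_utility, d_gen=0.0, d_trans=0.0):
    """Nash bargaining over a discrete set of candidate payment rates:
    among rates acceptable to both parties, pick the one maximizing the
    product of utility gains over the disagreement points."""
    feasible = [r for r in payment_rates
                if gen_utility(r) > d_gen and trans_utility(r) > d_trans]
    return max(feasible,
               key=lambda r: (gen_utility(r) - d_gen) * (trans_utility(r) - d_trans))

# Toy utilities: the renewable generator prefers a low payment rate,
# the transmission company a high one.
rates = [0.1 * i for i in range(1, 10)]
best = nash_bargain(rates, lambda r: 1 - r, lambda r: r)
print(round(best, 1))  # r*(1-r) is maximized at r = 0.5
```

The product-of-gains objective is what gives the Nash solution its fairness property: neither party can be pushed to its disagreement point without the objective collapsing to zero.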
Code of Federal Regulations, 2010 CFR
2010-01-01
... the forecast, including the methodology used to project loads, rates, revenue, power costs, operating expenses, plant additions, and other factors having a material effect on the balance sheet and on financial... regional office will consult with the Power Supply Division in the case of generation projects for...
Research needs for developing a commodity-driven freight modeling approach.
DOT National Transportation Integrated Search
2003-01-01
It is well known that better freight forecasting models and data are needed, but the literature does not clearly indicate which components of the modeling methodology are most in need of improvement, which is a critical need in an era of limited rese...
Commercial Demand Module - NEMS Documentation
2017-01-01
Documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Commercial Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.
ERIC Educational Resources Information Center
Peisachovich, Eva Hava; Nelles, L. J.; Johnson, Samantha; Nicholson, Laura; Gal, Raya; Kerr, Barbara; Celia, Popovic; Epstein, Iris; Da Silva, Celina
2017-01-01
Numerous forecasts suggest that professional-competence development depends on human encounters. Interaction between organizations, tasks, and individual providers influence human behaviour, affect organizations' or systems' performance, and are a key component of professional-competence development. Further, insufficient or ineffective…
Day-Ahead Crude Oil Price Forecasting Using a Novel Morphological Component Analysis Based Model
Zhu, Qing; Zou, Yingchao; Lai, Kin Keung
2014-01-01
As a typical nonlinear and dynamic system, the crude oil price movement is difficult to predict, and its accurate forecasting remains the subject of intense research activity. Recent empirical evidence suggests that multiscale data characteristics in the price movement are another important stylized fact. The incorporation of a mixture of data characteristics in the time-scale domain during the modelling process can lead to significant performance improvement. This paper proposes a novel morphological component analysis based hybrid methodology for modeling the multiscale heterogeneous characteristics of the price movement in the crude oil markets. Empirical studies in two representative benchmark crude oil markets reveal the existence of a multiscale heterogeneous microdata structure. The significant performance improvement of the proposed algorithm incorporating the heterogeneous data characteristics, against benchmark random walk, ARMA, and SVR models, is also attributed to the innovative methodology proposed to incorporate this important stylized fact during the modelling process. Meanwhile, work in this paper offers additional insights into the heterogeneous market microstructure, with economically viable interpretations. PMID:25061614
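A decomposition-based hybrid forecast of this kind can be sketched with a much simpler split; the moving-average decomposition below stands in for morphological component analysis, which instead separates components by their sparsity in different dictionaries:

```python
def decompose(series, window=5):
    """Split a series into a smooth (trailing moving-average) trend and a
    residual component; a simple stand-in for MCA-style decomposition."""
    trend = [sum(series[max(0, i - window + 1): i + 1])
             / (i - max(0, i - window + 1) + 1)
             for i in range(len(series))]
    resid = [s - t for s, t in zip(series, trend)]
    return trend, resid

def hybrid_forecast(series, window=5):
    """Model each component at its own scale, then recombine:
    slope persistence for the trend, mean reversion (to zero) for the
    residual - the divide-and-conquer pattern of decomposition hybrids."""
    trend, _ = decompose(series, window)
    return trend[-1] + (trend[-1] - trend[-2]) + 0.0

print(hybrid_forecast([50.1, 50.8, 51.2, 52.0, 52.4, 53.1]))
```

In the paper, each separated component is fitted with a model suited to its data characteristics (e.g. a trend model vs. a volatility model) before the component forecasts are summed.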
Thermal sensation prediction by soft computing methodology.
Jović, Srđan; Arsić, Nebojša; Vilimonović, Jovana; Petković, Dalibor
2016-12-01
Thermal comfort in open urban areas is an important factor from an environmental point of view. It is therefore necessary to meet demands for suitable thermal comfort during urban planning and design. Thermal comfort can be modeled based on climatic parameters and other factors. These factors are variable, changing throughout the year and over the course of a day, so there is a need to establish an algorithm for thermal comfort prediction from the input variables. The prediction results could be used for planning when urban areas are used. Since this is a highly nonlinear task, a soft computing methodology was applied in this investigation to predict thermal comfort. The main goal was to apply an extreme learning machine (ELM) for forecasting physiological equivalent temperature (PET) values. Temperature, pressure, wind speed and irradiance were used as inputs. The prediction results were compared with some benchmark models. Based on the results, ELM can be used effectively for forecasting PET. Copyright © 2016 Elsevier Ltd. All rights reserved.
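The core of an extreme learning machine is compact: a random, untrained hidden layer plus output weights solved in closed form by least squares. A minimal sketch on synthetic data (not the PET dataset of the paper):

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, seed=0):
    """Extreme learning machine: the hidden layer is random and never
    trained; only the output weights are solved, by least squares."""
    r = np.random.default_rng(seed)
    W = r.normal(size=(X.shape[1], n_hidden))
    b = r.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                 # random nonlinear features
    beta = np.linalg.pinv(H) @ y           # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Synthetic stand-in for the PET regression: a smooth function of two inputs.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]
pred = elm_predict(X, *elm_fit(X, y))
print(float(np.mean((pred - y) ** 2)))   # training MSE, should be tiny
```

The absence of iterative hidden-layer training is what makes ELM fast to fit, at the cost of needing enough random hidden units to span the target function.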
Day-ahead crude oil price forecasting using a novel morphological component analysis based model.
Zhu, Qing; He, Kaijian; Zou, Yingchao; Lai, Kin Keung
2014-01-01
As a typical nonlinear and dynamic system, the crude oil price movement is difficult to predict, and its accurate forecasting remains the subject of intense research activity. Recent empirical evidence suggests that multiscale data characteristics in the price movement are another important stylized fact. The incorporation of a mixture of data characteristics in the time-scale domain during the modelling process can lead to significant performance improvement. This paper proposes a novel morphological component analysis based hybrid methodology for modeling the multiscale heterogeneous characteristics of the price movement in the crude oil markets. Empirical studies in two representative benchmark crude oil markets reveal the existence of a multiscale heterogeneous microdata structure. The significant performance improvement of the proposed algorithm incorporating the heterogeneous data characteristics, against benchmark random walk, ARMA, and SVR models, is also attributed to the innovative methodology proposed to incorporate this important stylized fact during the modelling process. Meanwhile, work in this paper offers additional insights into the heterogeneous market microstructure, with economically viable interpretations.
RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system
Jensen, Tue V.; Pinson, Pierre
2017-01-01
Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation. PMID:29182600
RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system
NASA Astrophysics Data System (ADS)
Jensen, Tue V.; Pinson, Pierre
2017-11-01
Shi, Yuan; Liu, Xu; Kok, Suet-Yheng; Rajarethinam, Jayanthi; Liang, Shaohong; Yap, Grace; Chong, Chee-Seng; Lee, Kim-Sung; Tan, Sharon S Y; Chin, Christopher Kuan Yew; Lo, Andrew; Kong, Waiming; Ng, Lee Ching; Cook, Alex R
2016-09-01
With its tropical rainforest climate, rapid urbanization, and changing demography and ecology, Singapore experiences endemic dengue; the last large outbreak in 2013 culminated in 22,170 cases. In the absence of a vaccine on the market, vector control is the key approach for prevention. We sought to forecast the evolution of dengue epidemics in Singapore to provide early warning of outbreaks and to facilitate the public health response to moderate an impending outbreak. We developed a set of statistical models using least absolute shrinkage and selection operator (LASSO) methods to forecast the weekly incidence of dengue notifications over a 3-month time horizon. This forecasting tool was updated weekly and used a variety of data streams, including recent case data, meteorological data, vector surveillance data, and population-based national statistics. The forecasting methodology was compared with alternative approaches that have been proposed to model dengue case data (seasonal autoregressive integrated moving average and step-down linear regression) by fielding them on the 2013 dengue epidemic, the largest on record in Singapore. Operationally useful forecasts were obtained at a 3-month lag using the LASSO-derived models. Based on the mean average percentage error, the LASSO approach provided more accurate forecasts than the other methods we assessed. We demonstrate its utility in Singapore's dengue control program by providing a forecast of the 2013 outbreak for advance preparation of outbreak response. Statistical models built using machine learning methods such as LASSO have the potential to markedly improve forecasting techniques for recurrent infectious disease outbreaks such as dengue. Shi Y, Liu X, Kok SY, Rajarethinam J, Liang S, Yap G, Chong CS, Lee KS, Tan SS, Chin CK, Lo A, Kong W, Ng LC, Cook AR. 2016. Three-month real-time dengue forecast models: an early warning system for outbreak alerts and policy decision support in Singapore.
Environ Health Perspect 124:1369-1375; http://dx.doi.org/10.1289/ehp.1509981.
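The direct multi-horizon setup described in this abstract, predicting notifications roughly 12 weeks ahead from recent case, weather, and vector data streams, can be sketched as a design-matrix builder. This is a minimal illustration only: the four-lag window and the two input streams are assumptions, and the paper's LASSO models would then be fit on such a matrix.

```python
def make_lagged_design(cases, weather, horizon=12, n_lags=4):
    """Direct h-step-ahead design matrix: predict cases[t + horizon] from
    the most recent n_lags values of each data stream observed at time t."""
    X, y = [], []
    for t in range(n_lags - 1, len(cases) - horizon):
        # features: recent window of each stream, ending at the current week t
        row = cases[t - n_lags + 1: t + 1] + weather[t - n_lags + 1: t + 1]
        X.append(row)
        y.append(cases[t + horizon])  # target: incidence `horizon` weeks later
    return X, y
```

One such matrix per forecast horizon (1 to 12 weeks) gives the set of direct forecasting models the abstract describes, each re-estimated as new weekly data arrive.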
Prototype methodology for obtaining cloud seeding guidance from HRRR model data
NASA Astrophysics Data System (ADS)
Dawson, N.; Blestrud, D.; Kunkel, M. L.; Waller, B.; Ceratto, J.
2017-12-01
Weather model data, along with real-time observations, are critical to determine whether atmospheric conditions are prime for super-cooled liquid water during cloud seeding operations. Cloud seeding groups can either use operational forecast models or run their own model on a computer cluster. A custom weather model provides the most flexibility, but is also expensive. For programs with smaller budgets, openly available operational forecasting models are the de facto method for obtaining forecast data. The new High-Resolution Rapid Refresh (HRRR) model (3 x 3 km grid size), developed by the Earth System Research Laboratory (ESRL), provides hourly model runs with 18 forecast hours per run. While the model cannot be fine-tuned for a specific area or edited to provide cloud-seeding-specific output, model output is openly available on a near-real-time basis. This presentation focuses on a prototype methodology for using HRRR model data to create maps which aid in near-real-time cloud seeding decision making. The R programming language is utilized to run a script on a Windows® desktop/laptop computer either on a schedule (such as every half hour) or manually. The latest HRRR model run is downloaded from NOAA's Operational Model Archive and Distribution System (NOMADS). A GRIB-filter service, provided by NOMADS, is used to obtain surface and mandatory pressure level data for a subset domain, which greatly cuts down on the amount of data transfer. Then, a set of criteria, identified by the Idaho Power Atmospheric Science Group, is used to create guidance maps. These criteria include atmospheric stability (lapse rates), dew point depression, air temperature, and wet bulb temperature. The maps highlight potential areas where super-cooled liquid water may exist, reasons as to why cloud seeding should not be attempted, and wind speed at flight level.
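The GRIB-filter subsetting step can be illustrated with a small request builder. The abstract's workflow uses R; this is a hedged Python sketch, and the endpoint path, file naming, and parameter names are modeled on the public NOMADS filter interface, so verify them against the live service before use.

```python
from urllib.parse import urlencode

def hrrr_filter_url(date, cycle, fhr, var_flags, lon_w, lon_e, lat_s, lat_n):
    """Build a NOMADS GRIB-filter request for a spatial/variable subset of
    one HRRR surface file (assumed endpoint and parameter names)."""
    base = "https://nomads.ncep.noaa.gov/cgi-bin/filter_hrrr_2d.pl"
    params = {
        "file": f"hrrr.t{cycle:02d}z.wrfsfcf{fhr:02d}.grib2",
        "dir": f"/hrrr.{date}/conus",
        "subregion": "",                       # enable bounding-box subsetting
        "leftlon": lon_w, "rightlon": lon_e,
        "bottomlat": lat_s, "toplat": lat_n,
    }
    for v in var_flags:                        # e.g. "var_TMP", "var_DPT"
        params[v] = "on"
    return base + "?" + urlencode(params)
```

Downloading only the needed variables over a small bounding box is what keeps the half-hourly desktop workflow feasible on a modest connection.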
FUSION++: A New Data Assimilative Model for Electron Density Forecasting
NASA Astrophysics Data System (ADS)
Bust, G. S.; Comberiate, J.; Paxton, L. J.; Kelly, M.; Datta-Barua, S.
2014-12-01
There is a continuing need within the operational space weather community, both civilian and military, for accurate, robust data-assimilative specifications and forecasts of the global electron density field, as well as derived RF application product specifications and forecasts obtained from the electron density field. The spatial scales of interest range from a hundred to a few thousand kilometers horizontally (synoptic large-scale structuring) and meters to kilometers (small-scale structuring that causes scintillations). RF space weather applications affected by electron density variability on these scales include navigation, communication, and geo-location at RF frequencies ranging from hundreds of Hz to GHz. For many of these applications, the necessary forecast time periods range from nowcasts to 1-3 hours. For more "mission planning" applications, necessary forecast times can range from hours to days. In this paper we present a new ionosphere-thermosphere (IT) specification and forecast model being developed at JHU/APL based upon the well-known data assimilation algorithms Ionospheric Data Assimilation Four Dimensional (IDA4D) and Estimating Model Parameters from Ionospheric Reverse Engineering (EMPIRE). This new forecast model, "Forward Update Simple IONosphere model Plus IDA4D Plus EMPIRE" (FUSION++), ingests data from observations related to electron density, winds, electric fields, and neutral composition and provides improved specification and forecast of electron density. In addition, the new model provides improved specification of winds, electric fields, and composition. We will present a short overview and derivation of the methodology behind FUSION++, some preliminary results using real observational sources, example derived RF application products such as HF bi-static propagation, and initial comparisons with independent data sources for validation.
Voukantsis, Dimitris; Karatzas, Kostas; Kukkonen, Jaakko; Räsänen, Teemu; Karppinen, Ari; Kolehmainen, Mikko
2011-03-01
In this paper we propose a methodology consisting of specific computational intelligence methods, i.e. principal component analysis and artificial neural networks, in order to inter-compare air quality and meteorological data, and to forecast the concentration levels for environmental parameters of interest (air pollutants). We apply these methods to data monitored in the urban areas of Thessaloniki and Helsinki, in Greece and Finland, respectively. For this purpose, we applied the principal component analysis method in order to inter-compare the patterns of air pollution in the two selected cities. Then, we proceeded with the development of air quality forecasting models for both studied areas. On this basis, we formulated and employed a novel hybrid scheme in the selection process of input variables for the forecasting models, involving a combination of linear regression and artificial neural network (multi-layer perceptron) models. The latter were used for forecasting the daily mean concentrations of PM₁₀ and PM₂.₅ for the next day. Results demonstrated an index of agreement between measured and modelled daily averaged PM₁₀ concentrations of between 0.80 and 0.85, while the kappa index for the forecasting of the daily averaged PM₁₀ concentrations reached 60% for both cities. Compared with previous corresponding studies, these statistical parameters indicate an improved performance of air quality parameter forecasting. It was also found that the performance of the models for forecasting the daily mean concentrations of PM₁₀ was not substantially different between the two cities, despite the major differences of the two urban environments under consideration.
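The index of agreement quoted above (0.80 to 0.85) is commonly Willmott's d; a minimal computation follows, under the assumption that this is the statistic intended.

```python
def index_of_agreement(obs, pred):
    """Willmott's index of agreement d in [0, 1]; 1 means a perfect match.
    d = 1 - sum((P-O)^2) / sum((|P - Obar| + |O - Obar|)^2)."""
    obar = sum(obs) / len(obs)
    num = sum((p - o) ** 2 for o, p in zip(obs, pred))
    den = sum((abs(p - obar) + abs(o - obar)) ** 2 for o, p in zip(obs, pred))
    return 1.0 - num / den
```

Unlike a plain correlation, d penalizes bias as well as scatter, which is why it is a common headline metric for air-quality model evaluation.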
NASA Astrophysics Data System (ADS)
Saleh, Firas; Ramaswamy, Venkatsundar; Georgas, Nickitas; Blumberg, Alan F.; Pullen, Julie
2016-07-01
This paper investigates the uncertainties in hourly streamflow ensemble forecasts for an extreme hydrological event using a hydrological model forced with short-range ensemble weather prediction models. A state-of-the-art, automated, short-term hydrologic prediction framework was implemented using GIS and a regional-scale hydrological model (HEC-HMS). The hydrologic framework was applied to the Hudson River basin (~36,000 km²) in the United States using gridded precipitation data from the National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) and was validated against streamflow observations from the United States Geological Survey (USGS). Finally, 21 precipitation ensemble members of the latest Global Ensemble Forecast System (GEFS/R) were forced into HEC-HMS to generate a retrospective streamflow ensemble forecast for an extreme hydrological event, Hurricane Irene. The work shows that ensemble stream discharge forecasts provide improved predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared with deterministic forecasts. The uncertainties in weather inputs may result in false warnings and missed river flooding events, reducing the potential to effectively mitigate flood damage. The findings demonstrate how errors in the ensemble median streamflow forecast and time of peak, as well as the ensemble spread (uncertainty), are reduced 48 h pre-event by utilizing the ensemble framework. The methodology and implications of this work benefit efforts of short-term streamflow forecasts at regional scales, notably regarding the peak timing of an extreme hydrologic event when combined with a flood threshold exceedance diagram. Although the modeling framework was implemented on the Hudson River basin, it is flexible and applicable in other parts of the world where atmospheric reanalysis products and streamflow data are available.
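The value of the 21-member ensemble over a single deterministic run comes from summarizing the member trajectories at each time step; a minimal per-step median-and-spread summary is sketched below (illustrative only, not the paper's HEC-HMS pipeline).

```python
from statistics import median

def ensemble_summary(members):
    """Per-time-step median and spread (max minus min) across ensemble
    members; `members` is a list of equally long discharge time series."""
    steps = list(zip(*members))          # transpose: one tuple per time step
    med = [median(s) for s in steps]     # central forecast
    spread = [max(s) - min(s) for s in steps]  # crude uncertainty band
    return med, spread
```

Comparing the median against a flood threshold while watching the spread shrink as the event approaches is the exceedance-diagram logic the abstract describes.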
Foreign currency rate forecasting using neural networks
NASA Astrophysics Data System (ADS)
Pandya, Abhijit S.; Kondo, Tadashi; Talati, Amit; Jayadevappa, Suryaprasad
2000-03-01
Neural networks are increasingly being used as a forecasting tool in many forecasting problems. This paper discusses the application of neural networks in predicting daily foreign exchange rates between the USD, GBP, and DEM. We approach the problem from a time-series analysis framework, where future exchange rates are forecasted solely using past exchange rates. This relies on the belief that past prices and future prices are closely related and interdependent. We present the result of training a neural network with historical USD-GBP data. The methodology used is explained, as well as the training process. We discuss the selection of inputs to the network, and present a comparison of using the actual exchange rates and the exchange rate differences as inputs. Price and rate differences are the preferred way of training neural networks in financial applications. Results of both approaches are presented together for comparison. We show that the network is able to learn the trends in the exchange rate movements correctly, and present the results of the prediction over several periods of time.
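The preference for rate differences over raw levels as network inputs can be shown with a small transform pair: train on differences, then rebuild predicted levels from the last observed rate. This is a generic sketch, not the paper's network.

```python
def to_differences(rates):
    """First differences of an exchange-rate series, the form commonly
    preferred over raw levels when training a network."""
    return [b - a for a, b in zip(rates, rates[1:])]

def from_differences(start, diffs):
    """Invert: rebuild the level series from a start value and differences."""
    out = [start]
    for d in diffs:
        out.append(out[-1] + d)
    return out
```

Differencing removes the slowly drifting level, so the network sees a roughly stationary signal instead of having to memorize the absolute price range.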
Nonlinear Prediction Model for Hydrologic Time Series Based on Wavelet Decomposition
NASA Astrophysics Data System (ADS)
Kwon, H.; Khalil, A.; Brown, C.; Lall, U.; Ahn, H.; Moon, Y.
2005-12-01
Traditionally, forecasting and characterization of hydrologic systems are performed using many techniques. Stochastic linear methods such as AR and ARIMA, and nonlinear ones such as tools based on statistical learning theory, have been extensively used. The difficulty common to all methods is determining sufficient and necessary information and predictors for a successful prediction. Relationships between hydrologic variables are often highly nonlinear and interrelated across temporal scales. A new hybrid approach is proposed for the simulation of hydrologic time series combining the wavelet transform and a nonlinear model, drawing on the merits of both. The wavelet transform is adopted to decompose a hydrologic nonlinear process into a set of mono-component signals, which are simulated by the nonlinear model. The hybrid methodology is formulated in a manner to improve the accuracy of long-term forecasting. The proposed hybrid model yields much better results in terms of capturing and reproducing the time-frequency properties of the system at hand. Prediction results are promising when compared to traditional univariate time series models. An application demonstrating the plausibility of the proposed methodology is provided, and the results show that a wavelet-based time series model can be used for simulating and forecasting hydrologic variables reasonably well. This will ultimately serve the purpose of integrated water resources planning and management.
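The decomposition step can be illustrated with a one-level Haar transform, the simplest wavelet: the series splits into a smooth approximation and a detail signal, each of which would then be modeled separately and recombined. This is a minimal stand-in; the wavelet family and decomposition depth used by the authors are not specified in the abstract.

```python
import math

def haar_decompose(x):
    """One-level Haar wavelet transform of an even-length series:
    pairwise averages (approximation) and pairwise contrasts (detail)."""
    s = 1 / math.sqrt(2)
    approx = [(a + b) * s for a, b in zip(x[::2], x[1::2])]
    detail = [(a - b) * s for a, b in zip(x[::2], x[1::2])]
    return approx, detail

def haar_reconstruct(approx, detail):
    """Invert the one-level transform exactly."""
    s = 1 / math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) * s, (a - d) * s])
    return out
```

Forecasting each mono-component signal with its own (possibly nonlinear) model and summing the reconstructions is the hybrid scheme the abstract outlines.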
DOT National Transportation Integrated Search
1979-06-01
Contents: A form of utility function for the UMOT model; An analysis of transportation/land use interactions; Toward a methodology to shape urban structure; Approaches for improving urban travel forecasts; Quasi-dynamic urban location models with end...
Methodologies used to estimate and forecast vehicle miles traveled (VMT) : final report.
DOT National Transportation Integrated Search
2016-07-01
Vehicle miles traveled (VMT) is a measure used in transportation planning for a variety of purposes. It measures the amount of travel for all vehicles in a geographic region over a given period of time, typically a one-year period. VMT is calculated ...
How Many Will Choose? School Choice and Student Enrollment Planning.
ERIC Educational Resources Information Center
Chan, Tak C.
1993-01-01
Enrollment planning is the basis of all school system planning. Focuses on assessing the impact of a choice plan on student enrollment planning. Issues involved include home schooling, school employees' choice, and private kindergarten programs. Administrators are advised to evaluate existing forecasting methodologies. (MLF)
Forecasting of Hourly Photovoltaic Energy in Canarian Electrical System
NASA Astrophysics Data System (ADS)
Henriquez, D.; Castaño, C.; Nebot, R.; Piernavieja, G.; Rodriguez, A.
2010-09-01
The Canarian Archipelago faces problems similar to those of most insular regions: it lacks endogenous conventional energy resources and is not connected to continental electrical grids. A consequence of the "insular fact" is the existence of isolated electrical systems that are very difficult to interconnect due to the considerable sea depths between the islands. Currently, the Canary Islands have six isolated electrical systems, a single utility generating most of the electricity (burning fuel), a recently arrived TSO (REE), and still a low implementation of Renewable Energy Sources (RES). The low level of RES deployment is a consequence of two main facts: the weakness of the stand-alone grids (from 12 MW in El Hierro up to only 1 GW in Gran Canaria) and the lack of space to install RES systems (more than 50% of the land is protected for environmental reasons). To increase the penetration of renewable energy generation, such as solar or wind energy, it is necessary to develop tools to manage it. The penetration of non-manageable sources into weak grids like the Canarian ones poses a significant problem for the grid operator. There are currently 104 MW of PV connected to the islands' grids (Dec. 2009) and an additional 150 MW under licensing. This power presents a serious challenge for the operation and stability of the electrical system. ITC, together with the local TSO (Red Eléctrica de España, REE), started in 2008 an R&D project to develop a PV energy prediction tool for the six Canarian insular electrical systems. The objective is to supply reliable information for the hourly forecast of the generation dispatch programme and to predict daily solar radiation patterns, in order to help program spinning reserves. ITC has approached the task of weather forecasting using different numerical models (MM5 and WRF) in combination with MSG (Meteosat Second Generation) images.
From the online data recorded at several monitored PV plants and meteorological stations, PV nominal power and the energy produced by every plant in the Canary Islands are estimated using a series of theoretical and statistical energy models.
Characterizing and analyzing ramping events in wind power, solar power, load, and netload
Cui, Mingjian; Zhang, Jie; Feng, Cong; ...
2017-04-07
Here, one of the biggest concerns associated with integrating a large amount of renewable energy into the power grid is the ability to handle large ramps in the renewable power output. For the sake of system reliability and economics, it is essential for power system operators to better understand the ramping features of renewables, load, and netload. An optimized swinging door algorithm (OpSDA) is used and extended to accurately and efficiently detect ramping events. For wind power ramp detection, a process of merging 'bumps' (that have a different changing direction) into adjacent ramping segments is included to improve the performance of the OpSDA method. For solar ramp detection, ramping events that occur in both clear-sky and measured (or forecasted) solar power are removed to account for the diurnal pattern of solar generation. Ramping features are extracted and extensively compared between load and netload under different renewable penetration levels (9.77%, 15.85%, and 51.38%). Comparison results show that (i) netload ramp events with shorter durations and smaller magnitudes occur more frequently when the renewable penetration level increases, and the total number of ramping events also increases; and (ii) different ramping characteristics are observed in load and netload even with a low renewable penetration level.
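A drastically simplified stand-in for ramp detection: segment the power series into monotonic runs (plateaus do not break a run, loosely analogous to the bump-merging step) and keep runs whose total change exceeds a threshold. This is illustrative only; the actual OpSDA formulation is an optimization over swinging-door segments, not this greedy scan.

```python
def detect_ramps(power, min_magnitude):
    """Return (start_index, end_index, change) for each monotonic segment
    of `power` whose total change is at least `min_magnitude`."""
    ramps = []
    start = 0
    for i in range(1, len(power) + 1):
        # close the current segment at a direction change or at the end
        if i == len(power) or (i >= 2 and
                (power[i] - power[i - 1]) * (power[i - 1] - power[i - 2]) < 0):
            change = power[i - 1] - power[start]
            if abs(change) >= min_magnitude:
                ramps.append((start, i - 1, change))
            start = i - 1
    return ramps
```

Running such a detector on load and on netload (load minus renewables) separately is what enables the penetration-level comparison the abstract reports.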
A comparison of the domestic satellite communications forecast to the year 2000
NASA Technical Reports Server (NTRS)
Poley, W. A.; Lekan, J. F.; Salzman, J. A.; Stevenson, S. M.
1983-01-01
The methodologies and results of three NASA-sponsored market demand assessment studies are presented and compared. Forecasts of future satellite addressable traffic (both trunking and customer premises services) were developed for the three main service categories of voice, data and video and subcategories thereof for the benchmark years of 1980, 1990 and 2000. The contractor results are presented on a service by service basis in two formats: equivalent 36 MHz transponders and basic transmission units (voice: half-voice circuits, data: megabits per second and video: video channels). It is shown that while considerable differences exist at the service category level, the overall forecasts by the two contractors are quite similar. ITT estimates the total potential satellite market to be 3594 transponders in the year 2000 with data service comprising 54 percent of this total. The WU outlook for the same time period is 2779 transponders with voice services accounting for 66 percent of the total.
NASA Technical Reports Server (NTRS)
Keitz, J. F.
1982-01-01
The impact of more timely and accurate weather data on airline flight planning with the emphasis on fuel savings is studied. This volume of the report discusses the results of Task 1 of the four major tasks included in the study. Task 1 compares flight plans based on forecasts with plans based on the verifying analysis from 33 days during the summer and fall of 1979. The comparisons show that: (1) potential fuel savings conservatively estimated to be between 1.2 and 2.5 percent could result from using more timely and accurate weather data in flight planning and route selection; (2) the Suitland forecast generally underestimates wind speeds; and (3) the track selection methodology of many airlines operating on the North Atlantic may not be optimum resulting in their selecting other than the optimum North Atlantic Organized Track about 50 percent of the time.
Staid, Andrea; Watson, Jean -Paul; Wets, Roger J. -B.; ...
2017-07-11
Forecasts of available wind power are critical in key electric power systems operations planning problems, including economic dispatch and unit commitment. Such forecasts are necessarily uncertain, limiting the reliability and cost effectiveness of operations planning models based on a single deterministic or "point" forecast. A common approach to address this limitation involves the use of a number of probabilistic scenarios, each specifying a possible trajectory of wind power production, with associated probability. We present and analyze a novel method for generating probabilistic wind power scenarios, leveraging available historical information in the form of forecasted and corresponding observed wind power time series. We estimate non-parametric forecast error densities, specifically using epi-spline basis functions, allowing us to capture the skewed and non-parametric nature of error densities observed in real-world data. We then describe a method to generate probabilistic scenarios from these basis functions that allows users to control for the degree to which extreme errors are captured. We compare the performance of our approach to the current state of the art considering publicly available data associated with the Bonneville Power Administration, analyzing aggregate production of a number of wind farms over a large geographic region. Finally, we discuss the advantages of our approach in the context of specific power systems operations planning problems: stochastic unit commitment and economic dispatch. Here, our methodology is embodied in the joint Sandia – University of California Davis Prescient software package for assessing and analyzing stochastic operations strategies.
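The scenario-generation idea can be sketched by resampling from an empirical pool of historical forecast errors and adding the draws to a point forecast. This replaces the paper's epi-spline error densities with a crude empirical stand-in, and ignores temporal error correlation; it only shows the overall shape of the approach.

```python
import random

def sample_scenarios(point_forecast, historical_errors, n_scenarios, seed=0):
    """Draw probabilistic wind-power trajectories by adding errors resampled
    from a historical forecast-error pool to a point forecast; production is
    floored at zero since negative output is not physical."""
    rng = random.Random(seed)
    scenarios = []
    for _ in range(n_scenarios):
        traj = [max(0.0, f + rng.choice(historical_errors))
                for f in point_forecast]
        scenarios.append(traj)
    return scenarios
```

Each trajectory would then enter a stochastic unit commitment model as one weighted scenario in place of the single deterministic forecast.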
NASA Astrophysics Data System (ADS)
Rodriguez-Camino, Ernesto; Voces, José; Sánchez, Eroteida; Navascues, Beatriz; Pouget, Laurent; Roldan, Tamara; Gómez, Manuel; Cabello, Angels; Comas, Pau; Pastor, Fernando; Concepción García-Gómez, M.°; José Gil, Juan; Gil, Delfina; Galván, Rogelio; Solera, Abel
2016-04-01
This presentation first briefly describes the current use of weather forecasts and climate projections delivered by AEMET for water management in Spain. The potential use of seasonal climate predictions for water management, in particular dam management, is then discussed in more depth, using a pilot experience carried out by a multidisciplinary group coordinated by AEMET and the DG for Water of Spain. This initiative is being developed in the framework of the national implementation of the GFCS and the European project EUPORIAS. The main components of this experience include meteorological and hydrological observations, and an empirical seasonal forecasting technique that provides an ensemble of water reservoir inflows. These forecasted inflows feed a prediction model for the dam state that has been adapted for this purpose. The full system is being tested retrospectively, over several decades, for selected water reservoirs located in different Spanish river basins. The assessment includes an objective verification of the probabilistic seasonal forecasts using standard metrics, and the evaluation of the potential social and economic benefits, with special attention to drought and flooding conditions. The methodology for implementing these seasonal predictions in the decision-making process is being developed in close collaboration with the final users participating in this pilot experience.
Evaluation of the 29-km Eta Model for Weather Support to the United States Space Program
NASA Technical Reports Server (NTRS)
Manobianco, John; Nutter, Paul
1997-01-01
The Applied Meteorology Unit (AMU) conducted a year-long evaluation of NCEP's 29-km mesoscale Eta (meso-eta) weather prediction model in order to identify added value to forecast operations in support of the United States space program. The evaluation was stratified over warm and cool seasons and considered both objective and subjective verification methodologies. Objective verification results generally indicate that meso-eta model point forecasts at selected stations exhibit minimal error growth in terms of RMS errors and are reasonably unbiased. Conversely, results from the subjective verification demonstrate that model forecasts of developing weather events such as thunderstorms, sea breezes, and cold fronts are not always as accurate as implied by the seasonal error statistics. Sea-breeze case studies reveal that the model generates a dynamically consistent thermally direct circulation over the Florida peninsula, although at a larger scale than observed. Thunderstorm verification reveals that the meso-eta model is capable of predicting areas of organized convection, particularly during the late afternoon hours, but is not capable of forecasting individual thunderstorms. Verification of cold fronts during the cool season reveals that the model is capable of forecasting a majority of cold frontal passages through east central Florida to within ±1 h of observed frontal passage.
Automated Discovery and Modeling of Sequential Patterns Preceding Events of Interest
NASA Technical Reports Server (NTRS)
Rohloff, Kurt
2010-01-01
The integration of emerging data manipulation technologies has enabled a paradigm shift in practitioners' abilities to understand and anticipate events of interest in complex systems. Example events of interest include outbreaks of socio-political violence in nation-states. Rather than relying on human-centric modeling efforts that are limited by the availability of subject matter experts (SMEs), automated data processing technologies have enabled the development of innovative automated complex system modeling and predictive analysis technologies. We introduce one such emerging modeling technology: the sequential pattern methodology. We have applied the sequential pattern methodology to automatically identify patterns of observed behavior that precede outbreaks of socio-political violence such as riots, rebellions, and coups in nation-states. The sequential pattern methodology is a groundbreaking approach to automated complex system model discovery because it generates easily interpretable patterns based on direct observations of sampled factor data for a deeper understanding of societal behaviors that is tolerant of observation noise and missing data. The discovered patterns are simple to interpret and mimic human identification of observed trends in temporal data. Discovered patterns also provide an automated forecasting ability: we discuss an example of using discovered patterns coupled with a rich data environment to forecast various types of socio-political violence in nation-states.
High-quality weather data for grid integration studies
NASA Astrophysics Data System (ADS)
Draxl, C.
2016-12-01
As variable renewable power penetration levels increase in power systems worldwide, renewable integration studies are crucial to ensure continued economic and reliable operation of the power grid. In this talk we will shed light on requirements for grid integration studies as far as wind and solar energy are concerned. Because wind and solar plants are strongly impacted by weather, high-resolution and high-quality weather data are required to drive power system simulations. Future data sets will have to push the limits of numerical weather prediction to yield these high-resolution data sets, and wind data will have to be time-synchronized with solar data. Current wind and solar integration data sets will be presented. The Wind Integration National Dataset (WIND) Toolkit is the largest and most complete grid integration data set publicly available to date. A meteorological data set, wind power production time series, and simulated forecasts created using the Weather Research and Forecasting Model run on a 2-km grid over the continental United States at a 5-min resolution are now publicly available for more than 126,000 land-based and offshore wind power production sites. The Solar Integration National Dataset (SIND) is time-synchronized with the WIND Toolkit, and will allow for combined wind-solar grid integration studies. The National Solar Radiation Database (NSRDB) is a similar high temporal- and spatial-resolution database of 18 years of solar resource data for North America and India. Grid integration studies are also carried out in various countries, which aim at increasing their wind and solar penetration through combined wind and solar integration data sets. We will present a multi-year effort to directly support India's 24x7 energy access goal through a suite of activities aimed at enabling large-scale deployment of clean energy and energy efficiency.
Another current effort is the North-American-Renewable-Integration-Study, with the aim of providing a seamless data set across borders for a whole continent, to simulate and analyze the impacts of potential future large wind and solar power penetrations on bulk power system operations.
Fuzzy Multi-Objective Transportation Planning with Modified S-Curve Membership Function
NASA Astrophysics Data System (ADS)
Peidro, D.; Vasant, P.
2009-08-01
In this paper, the S-Curve membership function methodology is used in a transportation planning decision (TPD) problem. An interactive method for solving multi-objective TPD problems with fuzzy goals, available supply and forecast demand is developed. The proposed method attempts simultaneously to minimize the total production and transportation costs and the total delivery time with reference to budget constraints and available supply, machine capacities at each source, as well as forecast demand and warehouse space constraints at each destination. We compare in an industrial case the performance of S-curve membership functions, representing uncertainty goals and constraints in TPD problems, with linear membership functions.
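The modified S-curve used in this line of work can be sketched as a logistic membership function. A minimal sketch follows; the constants B, C, and gamma follow the commonly cited Vasant parameterization, and the normalization of the goal value to its aspiration/tolerance bounds is an assumption for illustration, not this paper's exact formulation:

```python
import math

def s_curve(x, x_lo, x_hi, B=1.0, C=0.001001001, gamma=13.8135):
    """Modified S-curve membership: ~0.999 at the aspiration bound x_lo,
    decaying smoothly to ~0.001 at the tolerance bound x_hi."""
    if x <= x_lo:
        return 0.999
    if x >= x_hi:
        return 0.001
    u = (x - x_lo) / (x_hi - x_lo)   # normalize goal value to [0, 1]
    return B / (1.0 + C * math.exp(gamma * u))
```

In a transportation planning setting, x would be a cost or delivery-time goal value; gamma controls how sharply satisfaction drops across the tolerance interval, which is what distinguishes the S-curve from a linear membership function.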
NASA Lewis Research Center Futuring Workshop
NASA Technical Reports Server (NTRS)
Boroush, Mark; Stover, John; Thomas, Charles
1987-01-01
On October 21 and 22, 1986, the Futures Group ran a two-day Futuring Workshop on the premises of NASA Lewis Research Center. The workshop had four main goals: to acquaint participants with the general history of technology forecasting; to familiarize participants with the range of forecasting methodologies; to acquaint participants with the range of applicability, strengths, and limitations of each method; and to offer participants some hands-on experience by working through both judgmental and quantitative case studies. Among the topics addressed during this workshop were: information sources; judgmental techniques; quantitative techniques; merger of judgment with quantitative measurement; data collection methods; and dealing with uncertainty.
DOT National Transportation Integrated Search
2017-06-01
This project developed a methodology to simulate and analyze roadway traffic patterns : and expected penetration and timing of electric vehicles (EVs) with application directed : toward the requirements for electric vehicle supply equipment (EVSE) si...
Drilled Shaft Foundations for Noise Barrier Walls and Slope Stabilization
DOT National Transportation Integrated Search
2002-12-01
This research project is focused on two primary objectives. The first objective relates to the development of a methodology for using the SPT (Standard Penetration Test) results to design the laterally loaded drilled shafts. The second objective aims...
NASA Technical Reports Server (NTRS)
Reynolds, David; Rasch, William; Kozlowski, Daniel; Burks, Jason; Zavodsky, Bradley; Bernardet, Ligia; Jankov, Isidora; Albers, Steve
2014-01-01
The Experimental Regional Ensemble Forecast (ExREF) system is a tool for the development and testing of new Numerical Weather Prediction (NWP) methodologies. ExREF is run in near-realtime by the Global Systems Division (GSD) of the NOAA Earth System Research Laboratory (ESRL) and its products are made available through a website, an ftp site, and via the Unidata Local Data Manager (LDM). The ExREF domain covers most of North America and has 9-km horizontal grid spacing. The ensemble has eight members, all employing WRF-ARW. The ensemble uses a variety of initial conditions from LAPS and the Global Forecasting System (GFS) and multiple boundary conditions from the GFS ensemble. Additionally, a diversity of physical parameterizations is used to increase ensemble spread and to account for the uncertainty in forecasting extreme precipitation events. ExREF has been a component of the Hydrometeorology Testbed (HMT) NWP suite in the 2012-2013 and 2013-2014 winters. A smaller domain covering just the West Coast was created to minimize bandwidth consumption for the NWS. This smaller domain has been and is being distributed to the National Weather Service (NWS) Weather Forecast Office and California Nevada River Forecast Center in Sacramento, California, where it is ingested into the Advanced Weather Interactive Processing System (AWIPS I and II) to provide guidance on the forecasting of extreme precipitation events. This paper will review the cooperative effort employed by NOAA ESRL, NASA SPoRT (Short-term Prediction Research and Transition Center), and the NWS to facilitate the ingest and display of ExREF data utilizing the AWIPS I and II D2D and GFE (Graphical Forecast Editor) software.
Within GFE is a very useful verification software package called BoiVer that allows the NWS to compare the 6-hr QPF from all operational NWP models, along with the ExREF mean 6-hr QPF, against the River Forecast Center's 4-km gridded QPE, so that forecasters can build confidence in the use of the ExREF in preparing their rainfall forecasts. Preliminary results will be presented.
Uncertainty estimation of long-range ensemble forecasts of snowmelt flood characteristics
NASA Astrophysics Data System (ADS)
Kuchment, L.
2012-04-01
Long-range forecasts of snowmelt flood characteristics with a lead time of 2-3 months have important significance for regulating flood runoff and mitigating flood damage on almost all large Russian rivers. At the same time, the application of current forecasting techniques based on regression relationships between runoff volume and indexes of river basin conditions can lead to serious forecast errors, resulting in large economic losses caused by wrong flood regulation. The forecast errors can be caused by complicated processes of soil freezing and soil moisture redistribution, too high a rate of snowmelt, large liquid precipitation before snowmelt, or a large difference between meteorological conditions during the lead-time period and climatological ones. Analysis of economic losses has shown that the largest damages could, to a significant extent, be avoided if decision makers had an opportunity to take predictive uncertainty into account and could use more cautious strategies in runoff regulation. Development of a methodology for long-range ensemble forecasting of spring/summer floods based on distributed physically based runoff generation models has created, in principle, a new basis for improving hydrological predictions as well as for estimating their uncertainty. This approach is illustrated by forecasting of the spring-summer floods in the Vyatka River and Seim River basins. The application of the physically based models of snowmelt runoff generation gives an essential improvement in the statistical estimates of the deterministic forecasts of flood volume in comparison with the forecasts obtained from the regression relationships. These models were also used for probabilistic forecasts, assigning meteorological inputs during the lead-time periods from the available historical daily series and from series simulated using a weather generator and a Monte Carlo procedure.
The weather generator consists of stochastic models of daily temperature and precipitation. The performance of the probabilistic forecasts was estimated by ranked probability skill scores. The application of Monte Carlo simulations using the weather generator gave better results than using the historical meteorological series.
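The ranked probability skill score used above to rate the probabilistic forecasts can be sketched in a few lines; the category layout and climatological probabilities below are illustrative, not the study's data:

```python
def rps(probs, obs_cat):
    """Ranked probability score: squared distance between cumulative
    forecast probabilities and the cumulative observation indicator."""
    cum_f, cum_o, score = 0.0, 0.0, 0.0
    for k, p in enumerate(probs):
        cum_f += p
        cum_o += 1.0 if k == obs_cat else 0.0
        score += (cum_f - cum_o) ** 2
    return score

def rpss(forecasts, observations, clim):
    """Skill relative to a climatological forecast: 1 is perfect, <= 0 no skill."""
    rps_f = sum(rps(f, o) for f, o in zip(forecasts, observations))
    rps_c = sum(rps(clim, o) for o in observations)
    return 1.0 - rps_f / rps_c
```

For example, a single three-category forecast [0.6, 0.3, 0.1] verifying in the lowest category scores roughly 0.69 against a uniform climatology.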
NASA Astrophysics Data System (ADS)
Salvage, R. O.; Neuberg, J. W.
2016-09-01
Prior to many volcanic eruptions, an acceleration in seismicity has been observed, suggesting the potential for this as a forecasting tool. The Failure Forecast Method (FFM) relates an accelerating precursor to the timing of failure by an empirical power law, with failure being defined in this context as the onset of an eruption. Previous applications of the FFM have used a wide variety of accelerating time series, often generating questionable forecasts with large misfits between data and the forecast, as well as the generation of a number of different forecasts from the same data series. Here, we show an alternative approach applying the FFM in combination with a cross correlation technique which identifies seismicity from a single active source mechanism and location at depth. Isolating a single system at depth avoids additional uncertainties introduced by averaging data over a number of different accelerating phenomena, and consequently reduces the misfit between the data and the forecast. Similar seismic waveforms were identified in the precursory accelerating seismicity to dome collapses at Soufrière Hills volcano, Montserrat in June 1997, July 2003 and February 2010. These events were specifically chosen since they represent a spectrum of collapse scenarios at this volcano. The cross correlation technique generates a five-fold increase in the number of seismic events which could be identified from continuous seismic data rather than using triggered data, thus providing a more holistic understanding of the ongoing seismicity at the time. The use of similar seismicity as a forecasting tool for collapses in 1997 and 2003 greatly improved the forecasted timing of the dome collapse, as well as improving the confidence in the forecast, thereby outperforming the classical application of the FFM. 
We suggest that focusing on a single active seismic system at depth allows a more accurate forecast of some of the major dome collapses from the ongoing eruption at Soufrière Hills volcano, and provides a simple addition to the well-used methodology of the FFM.
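The classical FFM with rate exponent alpha = 2 predicts that the inverse event rate decays linearly to zero at the failure time, so the forecast step reduces to a line fit through inverse rates. A minimal sketch on synthetic data (not the Montserrat catalogues):

```python
def ffm_failure_time(times, rates):
    """Fit 1/rate = a + b*t by least squares and return the root -a/b:
    the forecast failure (eruption/collapse) time for exponent alpha = 2."""
    inv = [1.0 / r for r in rates]
    n = len(times)
    mt = sum(times) / n
    mi = sum(inv) / n
    b = sum((t - mt) * (y - mi) for t, y in zip(times, inv)) / \
        sum((t - mt) ** 2 for t in times)
    a = mi - b * mt
    return -a / b

# synthetic accelerating event rates with true failure at t = 10
times = [0, 1, 2, 3, 4, 5, 6, 7, 8]
rates = [1.0 / (0.5 * (10 - t)) for t in times]
```

The paper's point carries over directly: mixing events from several source mechanisms scatters the inverse-rate points off a single line, while a cross-correlation-filtered single family keeps the fit (and hence the forecast) tight.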
Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.
ERIC Educational Resources Information Center
Moore, Gwendolyn B.; And Others
The report describes three advanced technologies--robotics, artificial intelligence, and computer simulation--and identifies the ways in which they might contribute to special education. A hybrid methodology was employed to identify existing technology and forecast future needs. Following this framework, each of the technologies is defined,…
Agri-Manpower Forecasting and Educational Planning
ERIC Educational Resources Information Center
Ramarao, D.; Agrawal, Rashmi; Rao, B. V. L. N.; Nanda, S. K.; Joshi, Girish P.
2014-01-01
Purpose: Developing countries need to plan the growth or expansion of education so as to provide the required trained manpower for different occupational sectors. The paper assesses the supply and demand of professional manpower in Indian agriculture, and the demands are translated into educational requirements. Methodology: The supply is assessed from the…
76 FR 30605 - Assessment and Collection of Regulatory Fees For Fiscal Year 2011
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-26
... the Wireless Telecommunications Bureau's Numbering Resource Utilization Forecast and Annual CMRS... compute their fee payment using the standard methodology \\32\\ that is currently in place for CMRS Wireless... Commission, Regulatory Fees Fact Sheet: What You Owe--Commercial Wireless Services for FY 2010 at 1 (released...
Using GIS Tools and Environmental Scanning to Forecast Industry Workforce Needs
ERIC Educational Resources Information Center
Gaertner, Elaine; Fleming, Kevin; Marquez, Michelle
2009-01-01
The Centers of Excellence (COE) provide regional workforce data on high growth, high demand industries and occupations for use by community colleges in program planning and resource enhancement. This article discusses the environmental scanning research methodology and its application to data-driven decision making in community college program…
The IDEA model: A single equation approach to the Ebola forecasting challenge.
Tuite, Ashleigh R; Fisman, David N
2018-03-01
Mathematical modeling is increasingly accepted as a tool that can inform disease control policy in the face of emerging infectious diseases, such as the 2014-2015 West African Ebola epidemic, but little is known about the relative performance of alternate forecasting approaches. The RAPIDD Ebola Forecasting Challenge (REFC) tested the ability of eight mathematical models to generate useful forecasts in the face of simulated Ebola outbreaks. We used a simple, phenomenological single-equation model (the "IDEA" model), which relies only on case counts, in the REFC. Model fits were performed using a maximum likelihood approach. We found that the model performed reasonably well relative to other more complex approaches, with performance metrics ranked on average 4th or 5th among participating models. IDEA appeared better suited to long- than short-term forecasts, and could be fit using nothing but reported case counts. Several limitations were identified, including difficulty in identifying epidemic peak (even retrospectively), unrealistically precise confidence intervals, and difficulty interpolating daily case counts when using a model scaled to epidemic generation time. More realistic confidence intervals were generated when case counts were assumed to follow a negative binomial, rather than Poisson, distribution. Nonetheless, IDEA represents a simple phenomenological model, easily implemented in widely available software packages that could be used by frontline public health personnel to generate forecasts with accuracy that approximates that which is achieved using more complex methodologies.
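The IDEA model is indeed a single equation: incidence at epidemic generation t is I(t) = (R0 / (1+d)^t)^t, where the control parameter d discounts the basic reproduction number each generation. A minimal sketch with assumed parameter values (the fitting machinery of the paper is omitted):

```python
import math

def idea_incidence(R0, d, t):
    """IDEA model: incidence at epidemic generation t, with R0 discounted
    by a factor (1 + d) per generation."""
    return (R0 / (1.0 + d) ** t) ** t

def idea_peak_generation(R0, d):
    """Analytic maximizer of ln I(t): t* = ln R0 / (2 ln(1 + d))."""
    return math.log(R0) / (2.0 * math.log(1.0 + d))
```

With the illustrative values R0 = 2 and d = 0.05, the analytic peak falls near generation 7.1, and the discrete incidence series indeed peaks at generation 7; the difficulty of pinning this peak down from noisy early data is one of the limitations the abstract notes.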
Methodology for Measuring PM 2.5 Separator Characteristics Using an Aerosizer
A method is presented that enables the measurement of the particle size separation characteristics of an inertial separator in a rapid fashion. Overall penetration is determined for discrete particle sizes using an Aerosizer (Model LD, TSI, Incorporated, Particle Instruments/Am...
Drilled Shaft Foundations for Noise Barrier Walls and Slope Stabilization : Executive Summary
DOT National Transportation Integrated Search
2002-12-01
This research project is focused on two primary objectives. The first objective relates to the development of a methodology for using the SPT (Standard Penetration Test) results to design the laterally loaded drilled shafts. The second objective aims...
Penetrating abdominal injuries: management controversies
Butt, Muhammad U; Zacharias, Nikolaos; Velmahos, George C
2009-01-01
Penetrating abdominal injuries have been traditionally managed by routine laparotomy. New understanding of trajectories, potential for organ injury, and correlation with advanced radiographic imaging has allowed a shift towards non-operative management of appropriate cases. Although a selective approach has been established for stab wounds, the management of abdominal gunshot wounds remains a matter of controversy. In this chapter we describe the rationale and methodology of selecting patients for non-operative management. We also discuss additional controversial issues, as related to antibiotic prophylaxis, management of asymptomatic thoracoabdominal injuries, and the use of colostomy vs. primary repair for colon injuries. PMID:19374761
WIRE: Weather Intelligence for Renewable Energies
NASA Astrophysics Data System (ADS)
Heimo, A.; Cattin, R.; Calpini, B.
2010-09-01
Renewable energies such as wind and solar will play an important, even decisive, role in mitigating and adapting to the projected dramatic consequences of climate change for our society and environment. Due to shrinking fossil resources, the transition to ever larger renewable energy shares is unavoidable. But, as wind and solar energy depend strongly on highly variable weather processes, increased penetration rates will also lead to strong fluctuations in the electricity grid which need to be balanced. Proper and specific forecasting of 'energy weather' is a key component of this. It is therefore timely to address scientifically the requirements for providing the best possible weather information for forecasting the energy production of wind and solar power plants, from the next minutes up to several days ahead. Toward these aims, Weather Intelligence will first include developing dedicated post-processing algorithms coupled with weather prediction models and with past and/or online measurement data, especially remote sensing observations. Second, it will contribute to investigating the difficult relationship between highly intermittent, weather-dependent power production and concurrent capacities such as transport and distribution of this energy to the end users. Selecting, or where necessary developing, surface-based and satellite remote sensing techniques well adapted to supplying relevant information to the specific post-processing algorithms for short-term solar and wind energy production forecasts is a major task with big potential. It will lead to improved energy forecasts and help to increase the efficiency of renewable energy production while contributing to improved management, and presumably design, of the energy grids.
The second goal will raise new challenges, as it will require, first, that energy producers and distributors define the required input data and the new technologies dedicated to the management of power plants and electricity grids, and second, that the meteorological measurement community deliver suitable, short-term, high-quality forecasts to fulfill these requests, with emphasis on highly variable weather conditions and spatially distributed energy production often located in complex terrain. This topic has been submitted as a new COST Action under the title "Short-Term High Resolution Wind and Solar Energy Production Forecasts".
Assessment of Hydrologic Response to Variable Precipitation Forcing: Russian River Case Study
NASA Astrophysics Data System (ADS)
Cifelli, R.; Hsu, C.; Johnson, L. E.
2014-12-01
NOAA Hydrometeorology Testbed (HMT) activities in California have involved deployment of advanced sensor networks to better track atmospheric river (AR) dynamics and inland penetration of high water vapor air masses. Numerical weather prediction models and decision support tools have been developed to provide forecasters a better basis for forecasting heavy precipitation and consequent flooding. The HMT also involves a joint project with the California Department of Water Resources (CA-DWR) and the Scripps Institution of Oceanography (SIO) as part of CA-DWR's Enhanced Flood Response and Emergency Preparedness (EFREP) program. The HMT activities have included development and calibration of a distributed hydrologic model, the NWS Office of Hydrologic Development's (OHD) Research Distributed Hydrologic Model (RDHM), to prototype the distributed approach for flood and other water resources applications. HMT has applied RDHM to the Russian-Napa watersheds for research assessment of gap-filling weather radars for precipitation and hydrologic forecasting and for establishing a prototype to inform both the NWS Monterey Forecast Office and the California Nevada River Forecast Center (CNRFC) of RDHM capabilities. In this presentation, a variety of precipitation forcings generated with and without gap-filling radar and rain gauge data are used as input to RDHM to assess the hydrologic response for selected case study events. Both the precipitation forcing and the hydrologic model are run at different spatial and temporal resolutions in order to examine the sensitivity of runoff to the precipitation inputs. Based on the timing of the events and the variations of spatial and temporal resolution, the parameters which dominate the hydrologic response are identified. The assessment is implemented at two USGS stations (Ukiah near Russian River and Austin Creek near Cazadero) that are minimally influenced by managed flows, so that an objective evaluation can be derived.
The results are assessed using statistical metrics, including daily Nash scores, Pearson correlation, and sub-daily timing errors.
Sampling strategies based on singular vectors for assimilated models in ocean forecasting systems
NASA Astrophysics Data System (ADS)
Fattorini, Maria; Brandini, Carlo; Ortolani, Alberto
2016-04-01
Meteorological and oceanographic models need observations, not only as a ground truth against which to verify the quality of the models, but also to keep the model forecast error acceptable: through data assimilation techniques, which merge measured and modelled data, the natural divergence of numerical solutions from reality can be reduced or controlled, and a more reliable solution, called an analysis, is computed. Although this concept is valid in general, its application, especially in oceanography, raises many problems for three main reasons: the difficulty ocean models have in reaching an acceptable state of equilibrium, the high cost of measurements, and the difficulties in realizing them. The performance of data assimilation procedures depends on the particular observation network in use, well beyond the background quality and the assimilation method used. In this study we present some results concerning the great impact of the dataset configuration, in particular measurement positions, on the overall forecasting reliability of an ocean model. The aim is to identify operational criteria to support the design of marine observation networks at regional scale. In order to identify the observation network that minimizes the forecast error, a methodology based on singular vector decomposition of the tangent linear model is proposed. Such a method can give strong indications on the local error dynamics. In addition, to avoid redundancy in the information contained in the data, a minimal distance among data positions has been chosen on the basis of a spatial correlation analysis of the hydrodynamic fields under investigation. This methodology has been applied to the choice of data positions starting from simplified models, such as an idealized double-gyre model and a quasi-geostrophic one.
Model configurations and data assimilation are based on available ROMS routines, where a variational assimilation algorithm (4D-Var) is included as part of the code. These first applications have provided encouraging results in terms of increased predictability time and reduced forecast error, also improving the quality of the analysis used to recover the real circulation patterns from a first guess quite far from the real state.
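The singular-vector idea can be sketched compactly: the leading right singular vector of the tangent linear propagator marks the initial-condition direction of fastest error growth, and its largest-magnitude components suggest candidate observation sites. The propagator below is a toy diagonal matrix for illustration, not a ROMS-derived operator:

```python
def leading_singular_vector(M, iters=200):
    """Leading right singular vector of M via power iteration on M^T M."""
    m, n = len(M), len(M[0])
    v = [1.0 / n] * n
    for _ in range(iters):
        Mv = [sum(M[i][j] * v[j] for j in range(n)) for i in range(m)]
        w = [sum(M[i][j] * Mv[i] for i in range(m)) for j in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# toy tangent linear propagator: initial errors in component 2 grow fastest
M = [[1.0, 0.0, 0.0],
     [0.0, 2.0, 0.0],
     [0.0, 0.0, 5.0]]
v = leading_singular_vector(M)
site = max(range(len(v)), key=lambda j: abs(v[j]))  # candidate sensor location
```

In the paper's setting the propagator comes from the linearized ocean model, and the minimal-distance criterion would then thin out sites whose fields are strongly correlated.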
A Damage-Dependent Finite Element Analysis for Fiber-Reinforced Composite Laminates
NASA Technical Reports Server (NTRS)
Coats, Timothy W.; Harris, Charles E.
1998-01-01
A progressive damage methodology has been developed to predict damage growth and residual strength of fiber-reinforced composite structure with through penetrations such as a slit. The methodology consists of a damage-dependent constitutive relationship based on continuum damage mechanics. Damage is modeled using volume averaged strain-like quantities known as internal state variables and is represented in the equilibrium equations as damage induced force vectors instead of the usual degradation and modification of the global stiffness matrix.
Optimal Design of Sheet Pile Wall Embedded in Clay
NASA Astrophysics Data System (ADS)
Das, Manas Ranjan; Das, Sarat Kumar
2015-09-01
Sheet pile walls are a type of flexible earth retaining structure used in waterfront offshore structures, river protection works, and temporary supports in foundations and excavations. Economy is an essential part of good engineering design and needs to be considered explicitly in obtaining an optimum section. By considering an appropriate embedment depth and sheet pile section it may be possible to achieve better economy. This paper describes the optimum design of both cantilever and anchored sheet pile walls penetrating clay using a simple optimization tool, Microsoft Excel® Solver. The detailed methodology and its application are presented with examples for cantilever and anchored sheet piles. The effects of soil properties, depth of penetration, and variation of the ground water table on the optimum design are also discussed. Such a study will help professionals when designing sheet pile walls penetrating clay.
NASA Astrophysics Data System (ADS)
Palchak, David
Electrical load forecasting is a tool that has been utilized by distribution designers and operators as a means for resource planning and generation dispatch. The techniques employed in these predictions are proving useful in the growing market of consumer, or end-user, participation in electrical energy consumption. These predictions are based on exogenous variables, such as weather, and time variables, such as day of week and time of day as well as prior energy consumption patterns. The participation of the end-user is a cornerstone of the Smart Grid initiative presented in the Energy Independence and Security Act of 2007, and is being made possible by the emergence of enabling technologies such as advanced metering infrastructure. The optimal application of the data provided by an advanced metering infrastructure is the primary motivation for the work done in this thesis. The methodology for using this data in an energy management scheme that utilizes a short-term load forecast is presented. The objective of this research is to quantify opportunities for a range of energy management and operation cost savings of a university campus through the use of a forecasted daily electrical load profile. The proposed algorithm for short-term load forecasting is optimized for Colorado State University's main campus, and utilizes an artificial neural network that accepts weather and time variables as inputs. The performance of the predicted daily electrical load is evaluated using a number of error measurements that seek to quantify the best application of the forecast. The energy management presented utilizes historical electrical load data from the local service provider to optimize the time of day that electrical loads are being managed. Finally, the utilization of forecasts in the presented energy management scenario is evaluated based on cost and energy savings.
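The forecasting approach described, an artificial neural network fed weather and time-of-day variables, can be sketched with a tiny one-hidden-layer network trained by stochastic gradient descent. Everything below is an assumption for illustration: the synthetic "campus load" data, the feature encoding, and all dimensions; the thesis's actual network and data differ.

```python
import math, random

random.seed(1)

def features(temp, hour):
    """Weather plus cyclic time-of-day encoding, typical for load forecasters."""
    return [temp / 40.0,
            math.sin(2 * math.pi * hour / 24),
            math.cos(2 * math.pi * hour / 24)]

# synthetic load: daily cycle plus temperature sensitivity (illustrative only)
data = []
for _ in range(300):
    temp = random.uniform(-5, 35)
    hour = random.uniform(0, 24)
    load = 10 + 0.2 * temp + 3 * math.sin(2 * math.pi * hour / 24)
    data.append((features(temp, hour), load))

H = 6  # hidden units, tanh activation
W1 = [[random.gauss(0, 0.5) for _ in range(3)] for _ in range(H)]
b1 = [0.0] * H
W2 = [random.gauss(0, 0.5) for _ in range(H)]
b2 = 0.0

def predict(x):
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return sum(w * hi for w, hi in zip(W2, h)) + b2, h

def mse():
    return sum((predict(x)[0] - y) ** 2 for x, y in data) / len(data)

lr = 0.01
loss0 = mse()
for _ in range(200):                      # epochs of stochastic gradient descent
    for x, y in data:
        out, h = predict(x)
        err = out - y
        for i in range(H):
            grad_h = err * W2[i] * (1 - h[i] ** 2)   # backprop through tanh
            W2[i] -= lr * err * h[i]
            for j in range(3):
                W1[i][j] -= lr * grad_h * x[j]
            b1[i] -= lr * grad_h
        b2 -= lr * err
loss1 = mse()
```

The error-measurement step the abstract mentions corresponds to comparing `loss1`-style metrics across candidate feature sets and horizons.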
NASA Astrophysics Data System (ADS)
Delaney, C.; Mendoza, J.; Jasperse, J.; Hartman, R. K.; Whitin, B.; Kalansky, J.
2017-12-01
Forecast informed reservoir operations (FIRO) is a methodology that incorporates short- to mid-range precipitation and flow forecasts to inform the flood operations of reservoirs. The Ensemble Forecast Operations (EFO) alternative is a probabilistic approach to FIRO that incorporates 15-day ensemble streamflow predictions (ESPs) made by NOAA's California-Nevada River Forecast Center (CNRFC). With the EFO approach, release decisions are made to manage the forecasted risk of reaching critical operational thresholds. A water management model was developed for Lake Mendocino, a 111,000 acre-foot reservoir located near Ukiah, California, to conduct a mock operational trial of the EFO alternative for 2017. Lake Mendocino is a dual-use reservoir, which is owned and operated for flood control by the United States Army Corps of Engineers and is operated for water supply by the Sonoma County Water Agency. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has suffered from water supply reliability issues since 2007. The operational trial utilized real-time ESPs prepared by the CNRFC and observed flow information to simulate hydrologic conditions in Lake Mendocino and a 50-mile downstream reach of the Russian River to the City of Healdsburg. Results of the EFO trial demonstrate a 6% increase in reservoir storage at the end of the trial period (May 10) relative to observed conditions. Additionally, model results show no increase in flows above flood stage for points downstream of Lake Mendocino. Results of this investigation and other studies demonstrate that the EFO alternative may be a viable flood control operations approach for Lake Mendocino and warrants further investigation through additional modeling and analysis.
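The EFO decision logic, releasing water when the forecasted risk of reaching a critical threshold grows too large, can be sketched as a simple ensemble water balance. All volumes, the release capacity, and the risk tolerance below are illustrative assumptions, not Lake Mendocino's actual rule curve:

```python
def forecasted_risk(storage, member_inflows, max_release, threshold):
    """Fraction of ensemble members whose simulated end-of-horizon storage
    exceeds the critical threshold, releasing at capacity throughout."""
    exceed = 0
    for inflows in member_inflows:          # one inflow trace per ESP member
        s = storage
        for q in inflows:                   # daily water balance
            s = max(s + q - max_release, 0.0)
        if s > threshold:
            exceed += 1
    return exceed / len(member_inflows)

def release_decision(risk, tolerance=0.1):
    """Pre-release when forecasted risk exceeds the tolerated probability."""
    return risk > tolerance
```

For example, with 2 of 10 members forecasting threshold exceedance, the risk is 0.2 and a 10% tolerance would trigger a release; if all members stay below threshold, water is retained for supply, which is how the approach recovers storage relative to fixed rule curves.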
Smart Irrigation From Soil Moisture Forecast Using Satellite And Hydro -Meteorological Modelling
NASA Astrophysics Data System (ADS)
Corbari, Chiara; Mancini, Marco; Ravazzani, Giovanni; Ceppi, Alessandro; Salerno, Raffaele; Sobrino, Josè
2017-04-01
Increased water demand and climate change impacts have recently enhanced the need to improve water resources management, even in those areas which traditionally have an abundant supply of water. The highest consumption of water is devoted to irrigation for agricultural production, and so it is in this area that efforts have to be focused to study possible interventions. The SIM project, funded by the EU in the framework of the WaterWorks2014 Water Joint Programming Initiative, aims at developing an operational tool for real-time forecast of crop irrigation water requirements, to support parsimonious water management and to optimize irrigation scheduling by providing real-time and forecasted soil moisture behavior at high spatial and temporal resolutions, with forecast horizons from a few days up to thirty days. This study discusses advances in coupling a satellite-driven soil water balance model with meteorological forecasts as support for precision irrigation, comparing different case studies in Italy, the Netherlands, China, and Spain, characterized by different climatic conditions, water availability, crop types, irrigation techniques, and water distribution rules. Herein, we present the applications at two operating farms producing vegetables in the South of Italy, where semi-arid climatic conditions hold, and at two maize fields in Northern Italy, a more water-rich environment with flood irrigation. The system combines state-of-the-art mathematical models and new technologies for environmental monitoring, merging ground observed data with Earth observations. A discussion of the methodological approach is presented, comparing for a reanalysis period the forecast system outputs with observed soil moisture and crop water needs, demonstrating the reliability of the forecasting system and its benefits. The real-time visualization of the implemented system is also presented through web dashboards.
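A soil-water-balance irrigation scheduler of the general kind described can be sketched as a single-bucket model driven by forecast rain and crop evapotranspiration, irrigating whenever storage drops below a trigger level. The trigger, dose, and all values below are assumptions for illustration, not the SIM project's calibrated model:

```python
def schedule_irrigation(s0, capacity, trigger, rain_fc, et_fc, dose):
    """March a single soil-moisture bucket (mm) over the forecast horizon and
    return per-day irrigation applied whenever storage falls below trigger."""
    s, plan = s0, []
    for rain, et in zip(rain_fc, et_fc):
        s = min(s + rain - et, capacity)    # excess above capacity drains away
        irr = dose if s < trigger else 0.0  # management-allowed-depletion rule
        s = min(s + irr, capacity)
        plan.append(irr)
    return plan
```

Running this with a forecast that includes a rain event naturally skips irrigation around it, which is the water-saving behavior the forecast horizon buys.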
Hoover, Stephen; Jackson, Eric V.; Paul, David; Locke, Robert
2016-01-01
Summary Background Accurate prediction of future patient census in hospital units is essential for patient safety, health outcomes, and resource planning. Forecasting census in the Neonatal Intensive Care Unit (NICU) is particularly challenging due to limited ability to control the census and clinical trajectories. The fixed average census approach, using the average census from the previous year, is a forecasting alternative used in clinical practice, but has limitations due to census variations. Objective Our objectives are to: (i) analyze the daily NICU census at a single health care facility and develop census forecasting models, (ii) explore models with and without patient data characteristics obtained at the time of admission, and (iii) evaluate accuracy of the models compared with the fixed average census approach. Methods We used five years of retrospective daily NICU census data for model development (January 2008 – December 2012, N=1827 observations) and one year of data for validation (January – December 2013, N=365 observations). Best-fitting models of ARIMA and linear regression were applied to various 7-day prediction periods and compared using error statistics. Results The census showed a slightly increasing linear trend. Best-fitting models included a non-seasonal model, ARIMA(1,0,0), seasonal ARIMA models, ARIMA(1,0,0)x(1,1,2)7 and ARIMA(2,1,4)x(1,1,2)14, as well as a seasonal linear regression model. The proposed forecasting models resulted on average in a 36.49% improvement in forecasting accuracy compared with the fixed average census approach. Conclusions Time series models provide higher prediction accuracy under different census conditions compared with the fixed average census approach. The presented methodology is easily applicable in clinical practice, can be generalized to other care settings, supports short- and long-term census forecasting, and informs staff resource planning. PMID:27437040
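ARIMA(1,0,0), the non-seasonal model above, is simply a mean-reverting AR(1). A sketch of fitting it by lag-1 least squares and producing iterated forecasts that decay toward the mean census (synthetic data below, not the study's; real work would use a package such as statsmodels):

```python
import random

def fit_ar1(x):
    """Least-squares estimates of the mean and lag-1 coefficient for AR(1)."""
    m = sum(x) / len(x)
    num = sum((x[t] - m) * (x[t - 1] - m) for t in range(1, len(x)))
    den = sum((x[t - 1] - m) ** 2 for t in range(1, len(x)))
    return m, num / den

def forecast_ar1(m, phi, last, steps=7):
    """Iterated forecasts revert geometrically toward the mean census."""
    out, level = [], last
    for _ in range(steps):
        level = m + phi * (level - m)
        out.append(level)
    return out

# synthetic daily census: AR(1) around a mean of 30 beds
random.seed(0)
census, level = [], 30.0
for _ in range(1000):
    level = 30.0 + 0.7 * (level - 30.0) + random.gauss(0, 2)
    census.append(level)
m, phi = fit_ar1(census)
```

The contrast with the fixed average census approach is visible in `forecast_ar1`: for small horizons the forecast exploits the current census level, and only at long horizons does it collapse to the mean, which is all the fixed-average method ever uses.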
Hybrid vs Adaptive Ensemble Kalman Filtering for Storm Surge Forecasting
NASA Astrophysics Data System (ADS)
Altaf, M. U.; Raboudi, N.; Gharamti, M. E.; Dawson, C.; McCabe, M. F.; Hoteit, I.
2014-12-01
Recent storm surge events caused by hurricanes in the Gulf of Mexico have motivated efforts to accurately forecast water levels. Toward this goal, a parallel architecture has been implemented based on a high resolution storm surge model, ADCIRC. However, the accuracy of the model notably depends on the quality and the recentness of the input data (mainly winds and bathymetry), model parameters (e.g. wind and bottom drag coefficients), and the resolution of the model grid. Given all these uncertainties in the system, the challenge is to build an efficient prediction system capable of providing accurate forecasts far enough ahead of time for the authorities to evacuate the areas at risk. We have developed an ensemble-based data assimilation system to frequently assimilate available data into the ADCIRC model in order to improve the accuracy of the model. In this contribution we study and analyze the performances of different ensemble Kalman filter methodologies for efficient short-range storm surge forecasting, the aim being to produce the most accurate forecasts at the lowest possible computing cost. Using Hurricane Ike meteorological data to force the ADCIRC model over a domain including the Gulf of Mexico coastline, we implement and compare the forecasts of the standard EnKF, the hybrid EnKF and an adaptive EnKF. The last two schemes have been introduced as efficient tools for enhancing the behavior of the EnKF when implemented with small ensembles by exploiting information from a static background covariance matrix. Covariance inflation and localization are implemented in all these filters. Our results suggest that both the hybrid and the adaptive approaches provide significantly better forecasts than those resulting from the standard EnKF, even when implemented with much smaller ensembles.
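The blending of a sample covariance with a static background covariance that distinguishes the hybrid scheme from the standard EnKF can be illustrated on a toy scalar state. Everything below (the "surge height" value, variances, ensemble size, and the blending weight alpha) is invented for illustration and is not the ADCIRC configuration from the abstract:

```python
import random, statistics

random.seed(0)
truth = 2.0                                    # toy "surge height" state
obs_err = 0.3
y = truth + random.gauss(0, obs_err)           # one noisy water-level observation

ensemble = [random.gauss(1.2, 0.5) for _ in range(10)]   # small prior ensemble

def enkf_update(ens, y, r_var, static_var=None, alpha=1.0):
    """Stochastic (perturbed-observation) EnKF analysis step for a directly
    observed scalar state. When static_var is given, the hybrid variant
    blends the ensemble sample variance with a static background variance;
    alpha=1 recovers the standard EnKF."""
    p = statistics.variance(ens)
    if static_var is not None:
        p = alpha * p + (1 - alpha) * static_var
    k = p / (p + r_var)                        # Kalman gain
    return [x + k * (y + random.gauss(0, r_var ** 0.5) - x) for x in ens]

standard = enkf_update(ensemble, y, obs_err ** 2)
hybrid = enkf_update(ensemble, y, obs_err ** 2, static_var=0.4, alpha=0.5)
print(statistics.fmean(standard), statistics.fmean(hybrid))
```

Each member moves a fraction k of the way toward a perturbed copy of the observation, so the analysis ensemble tightens around the data; the static term keeps the gain sensible when ten members poorly sample the true spread.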
Capan, Muge; Hoover, Stephen; Jackson, Eric V; Paul, David; Locke, Robert
2016-01-01
Accurate prediction of future patient census in hospital units is essential for patient safety, health outcomes, and resource planning. Forecasting census in the Neonatal Intensive Care Unit (NICU) is particularly challenging due to limited ability to control the census and clinical trajectories. The fixed average census approach, which uses the average census from the previous year, is a forecasting alternative used in clinical practice, but has limitations due to census variations. Our objectives are to: (i) analyze the daily NICU census at a single health care facility and develop census forecasting models, (ii) explore models with and without patient data characteristics obtained at the time of admission, and (iii) evaluate accuracy of the models compared with the fixed average census approach. We used five years of retrospective daily NICU census data for model development (January 2008 - December 2012, N=1827 observations) and one year of data for validation (January - December 2013, N=365 observations). Best-fitting models of ARIMA and linear regression were applied to various 7-day prediction periods and compared using error statistics. The census showed a slightly increasing linear trend. Best-fitting models included a non-seasonal model, ARIMA(1,0,0), seasonal ARIMA models, ARIMA(1,0,0)x(1,1,2)7 and ARIMA(2,1,4)x(1,1,2)14, as well as a seasonal linear regression model. The proposed forecasting models yielded, on average, a 36.49% improvement in forecasting accuracy compared with the fixed average census approach. Time series models provide higher prediction accuracy under different census conditions compared with the fixed average census approach. The presented methodology is easily applicable in clinical practice, can be generalized to other care settings, supports short- and long-term census forecasting, and informs staff resource planning.
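The core comparison in the abstract, a simple time-series model against a fixed-average benchmark for a 7-day horizon, can be sketched with an AR(1) fit, i.e. the non-seasonal ARIMA(1,0,0) case. The census series below is synthetic (trend plus noise), not the paper's NICU data, and the full seasonal ARIMA machinery is deliberately omitted:

```python
import random, statistics

random.seed(1)
# Synthetic daily census: slight linear trend + noise, as a stand-in for data
census = [40 + 0.01 * t + random.gauss(0, 2) for t in range(365 * 2)]
train, test = census[:-7], census[-7:]

# Fit AR(1): x_t - m = phi * (x_{t-1} - m) + e_t, via lag-1 autocorrelation
m = statistics.fmean(train)
num = sum((train[t] - m) * (train[t - 1] - m) for t in range(1, len(train)))
den = sum((x - m) ** 2 for x in train)
phi = num / den

# 7-day-ahead forecasts: recursive mean reversion from the last observation
last = train[-1]
ar_forecast = [m + phi ** (h + 1) * (last - m) for h in range(7)]

# Benchmark: fixed average census from the previous year
fixed_avg = statistics.fmean(train[-365:])

mae_ar = statistics.fmean(abs(f - o) for f, o in zip(ar_forecast, test))
mae_fix = statistics.fmean(abs(fixed_avg - o) for o in test)
print(round(mae_ar, 2), round(mae_fix, 2))
```

Comparing the two mean absolute errors mirrors the paper's error-statistics evaluation; with a trending series the fixed average systematically lags, which is exactly the limitation the abstract points out.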
Estimating Consequences of MMOD Penetrations on ISS
NASA Technical Reports Server (NTRS)
Evans, H.; Hyde, James; Christiansen, E.; Lear, D.
2017-01-01
The threat from micrometeoroid and orbital debris (MMOD) impacts on space vehicles is often quantified in terms of the probability of no penetration (PNP). However, for large spacecraft, especially those with multiple compartments, a penetration may have a number of possible outcomes. The extent of the damage (diameter of hole, crack length or penetration depth), the location of the damage relative to critical equipment or crew, crew response, and even the time of day of the penetration are among the many factors that can affect the outcome. For the International Space Station (ISS), a Monte-Carlo style software code called Manned Spacecraft Crew Survivability (MSCSurv) is used to predict the probability of several outcomes of an MMOD penetration-broadly classified as loss of crew (LOC), crew evacuation (Evac), loss of escape vehicle (LEV), and nominal end of mission (NEOM). By generating large numbers of MMOD impacts (typically in the billions) and tracking the consequences, MSCSurv allows for the inclusion of a large number of parameters and models as well as enabling the consideration of uncertainties in the models and parameters. MSCSurv builds upon the results from NASA's Bumper software (which provides the probability of penetration and critical input data to MSCSurv) to allow analysts to estimate the probability of LOC, Evac, LEV, and NEOM. This paper briefly describes the overall methodology used by NASA to quantify LOC, Evac, LEV, and NEOM with particular emphasis on describing in broad terms how MSCSurv works and its capabilities and most significant models.
Predicting the Consequences of MMOD Penetrations on the International Space Station
NASA Technical Reports Server (NTRS)
Hyde, James; Christiansen, E.; Lear, D.; Evans
2018-01-01
The threat from micrometeoroid and orbital debris (MMOD) impacts on space vehicles is often quantified in terms of the probability of no penetration (PNP). However, for large spacecraft, especially those with multiple compartments, a penetration may have a number of possible outcomes. The extent of the damage (diameter of hole, crack length or penetration depth), the location of the damage relative to critical equipment or crew, crew response, and even the time of day of the penetration are among the many factors that can affect the outcome. For the International Space Station (ISS), a Monte-Carlo style software code called Manned Spacecraft Crew Survivability (MSCSurv) is used to predict the probability of several outcomes of an MMOD penetration-broadly classified as loss of crew (LOC), crew evacuation (Evac), loss of escape vehicle (LEV), and nominal end of mission (NEOM). By generating large numbers of MMOD impacts (typically in the billions) and tracking the consequences, MSCSurv allows for the inclusion of a large number of parameters and models as well as enabling the consideration of uncertainties in the models and parameters. MSCSurv builds upon the results from NASA's Bumper software (which provides the probability of penetration and critical input data to MSCSurv) to allow analysts to estimate the probability of LOC, Evac, LEV, and NEOM. This paper briefly describes the overall methodology used by NASA to quantify LOC, Evac, LEV, and NEOM with particular emphasis on describing in broad terms how MSCSurv works and its capabilities and most significant models.
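The Monte-Carlo outcome-classification idea behind MSCSurv can be sketched in a few lines: sample a damage size and a location factor per simulated penetration, map them to one of the four outcome classes, and tally probabilities over many trials. The thresholds, rates, and lognormal damage model below are invented placeholders, not NASA's actual models:

```python
import random

random.seed(42)
OUTCOMES = ("NEOM", "Evac", "LEV", "LOC")

def simulate_penetration():
    """One hypothetical MMOD penetration: draw a hole size and whether the
    damage is near the crew, then classify the outcome. Purely illustrative."""
    hole_mm = random.lognormvariate(0.0, 0.8)   # damage size (toy model)
    near_crew = random.random() < 0.15          # damage-location factor
    if hole_mm < 1.0:
        return "NEOM"                           # nominal end of mission
    if hole_mm < 5.0:
        return "LEV" if near_crew else "Evac"
    return "LOC" if near_crew else "Evac"

n = 100_000
counts = {o: 0 for o in OUTCOMES}
for _ in range(n):
    counts[simulate_penetration()] += 1

probs = {o: c / n for o, c in counts.items()}
print(probs)
```

The real code tracks billions of impacts and many more parameters, but the structure is the same: probabilities of LOC, Evac, LEV, and NEOM emerge as outcome frequencies over the sampled impact population.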
Performance of hybrid and single-frequency impulse GPR antennas on USGA sporting greens
USDA-ARS?s Scientific Manuscript database
The utility of employing ground-penetrating radar (GPR) technologies for environmental surveys can vary, depending upon the physical properties of the site. Environmental conditions can fluctuate, altering soil properties. Operator proficiency and survey methodology will also influence GPR findings....
DOT National Transportation Integrated Search
2016-06-30
The overarching objective of this research is the development of a systematic methodology of employing GPR, including instruments, subsequent data processing and interpretation that can be used regularly as part of a roadway pavement and bridge evalu...
Linking seasonal climate forecasts with crop models in Iberian Peninsula
NASA Astrophysics Data System (ADS)
Capa, Mirian; Ines, Amor; Baethgen, Walter; Rodriguez-Fonseca, Belen; Han, Eunjin; Ruiz-Ramos, Margarita
2015-04-01
Translating seasonal climate forecasts into agricultural production forecasts could help to establish early warning systems and to design crop management adaptation strategies that take advantage of favorable conditions or reduce the effect of adverse conditions. In this study, we use seasonal rainfall forecasts and crop models to improve predictability of wheat yield in the Iberian Peninsula (IP). Additionally, we estimate economic margins and production risks associated with extreme scenarios of seasonal rainfall forecast. This study evaluates two methods for disaggregating seasonal climate forecasts into daily weather data: 1) a stochastic weather generator (CondWG), and 2) a forecast tercile resampler (FResampler). Both methods were used to generate 100 (with FResampler) and 110 (with CondWG) weather series/sequences for three scenarios of seasonal rainfall forecasts. Simulated wheat yield is computed with the crop model CERES-wheat (Ritchie and Otter, 1985), which is included in the Decision Support System for Agrotechnology Transfer (DSSAT v.4.5, Hoogenboom et al., 2010). Simulations were run at two locations in northeastern Spain where the crop model was calibrated and validated with independent field data. Once simulated yields were obtained, an assessment of farmers' gross margins under different seasonal climate forecasts was carried out to estimate production risks under different climate scenarios. This methodology allows farmers to assess the benefits and risks of a seasonal weather forecast in IP prior to the crop growing season. The results of this study may have important implications for both the public (agricultural planning) and private (decision support to farmers, insurance companies) sectors. Acknowledgements Research by M. Capa-Morocho has been partly supported by a PICATA predoctoral fellowship of the Moncloa Campus of International Excellence (UCM-UPM) and MULCLIVAR project (CGL2012-38923-C02-02) References Hoogenboom, G. et al., 2010.
The Decision Support System for Agrotechnology Transfer (DSSAT). Version 4.5 [CD-ROM]. University of Hawaii, Honolulu, Hawaii. Ritchie, J.T., Otter, S., 1985. Description and performance of CERES-Wheat: a user-oriented wheat yield model. In: ARS Wheat Yield Project. ARS-38. Natl Tech Info Serv, Springfield, Missouri, pp. 159-175.
NASA Astrophysics Data System (ADS)
Dommanget, Etienne; Bellier, Joseph; Ben Daoud, Aurélien; Graff, Benjamin
2014-05-01
Compagnie Nationale du Rhône (CNR) has been granted the concession to operate the Rhone River from the Swiss border to the Mediterranean Sea since 1933 and carries out three interdependent missions: navigation, irrigation and hydropower production. Nowadays, CNR generates one quarter of France's hydropower electricity. The convergence of public and private interests around optimizing the management of water resources throughout the French Rhone valley led CNR to develop hydrological models dedicated to discharge seasonal forecasting. Indeed, seasonal forecasting is a major issue for CNR and water resource management, in order to optimize long-term investments of the produced electricity, plan dam maintenance operations and anticipate low-water periods. Seasonal forecasting models have been developed for the Genissiat dam. With an installed capacity of 420 MW, Genissiat dam is the first of the 19 CNR hydropower plants. Discharge forecasting at Genissiat dam is strategic since its inflows account for 20% of the total Rhone average discharge and consequently 40% of the total Rhone hydropower production. Forecasts are based on hydrological statistical models. Discharges on the main Rhone River tributaries upstream of Genissiat dam are forecasted from 1 to 6 months ahead using multiple linear regressions. Input data for these regressions are identified depending on river hydrological regimes and periods of the year. For the melting season, from spring to summer, snow water equivalent (SWE) data are of major importance. SWE data are calculated from the Crocus model (Météo France) and SLF's model (Switzerland). CNR hydro-meteorological forecasters assess meteorological trends in precipitation for the coming months. These trends are used to stochastically generate precipitation scenarios in order to complement the regression data set.
This probabilistic approach builds decision-making support for CNR's water resource management team and provides them with seasonal forecasts and their confidence intervals. After a presentation of the CNR methodology, results for the years 2011 and 2013 will illustrate the ability of CNR's seasonal forecasting models. These years are of particular interest for water resource management since they were, respectively, unusually dry and unusually snowy. Model performance will be assessed against historical climatology using the CRPS skill score.
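The regression step described above, forecasting seasonal inflow from snow water equivalent and a precipitation scenario, can be sketched with ordinary least squares solved by hand. The data, coefficients, and units below are invented stand-ins, not CNR's Genissiat models:

```python
import random

random.seed(4)
# Toy training set: spring-summer inflow regressed on SWE and precipitation
n = 40
swe = [random.uniform(100, 400) for _ in range(n)]      # snow water equivalent, mm
precip = [random.uniform(50, 200) for _ in range(n)]    # precipitation scenario, mm
inflow = [0.8 * s + 1.5 * p + random.gauss(0, 30) for s, p in zip(swe, precip)]

def ols2(x1, x2, y):
    """Least squares for y = b0 + b1*x1 + b2*x2 via the normal equations,
    solved with Gaussian elimination on the 3x3 system."""
    m = len(y)
    cols = [[1.0] * m, x1, x2]
    A = [[sum(a * b for a, b in zip(cols[i], cols[j])) for j in range(3)]
         for i in range(3)]
    v = [sum(c * yy for c, yy in zip(cols[i], y)) for i in range(3)]
    for i in range(3):                       # forward elimination
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            A[r] = [a - f * b for a, b in zip(A[r], A[i])]
            v[r] -= f * v[i]
    beta = [0.0] * 3
    for i in range(2, -1, -1):               # back substitution
        beta[i] = (v[i] - sum(A[i][j] * beta[j] for j in range(i + 1, 3))) / A[i][i]
    return beta

b0, b1, b2 = ols2(swe, precip, inflow)
forecast = b0 + b1 * 250 + b2 * 120          # one scenario forecast
print(round(b1, 2), round(b2, 2), round(forecast, 1))
```

Running the same regression under several stochastic precipitation scenarios, as the abstract describes, would turn the single point forecast into a distribution with a confidence interval.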
Post-processing of multi-model ensemble river discharge forecasts using censored EMOS
NASA Astrophysics Data System (ADS)
Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian
2014-05-01
When forecasting water levels and river discharge, ensemble weather forecasts are used as meteorological input to hydrologic process models. As hydrologic models are imperfect and the input ensembles tend to be biased and underdispersed, the output ensemble forecasts for river runoff typically are biased and underdispersed, too. Thus, statistical post-processing is required in order to achieve calibrated and sharp predictions. Standard post-processing methods such as Ensemble Model Output Statistics (EMOS) that have their origins in meteorological forecasting are now increasingly being used in hydrologic applications. Here we consider two sub-catchments of River Rhine, for which the forecasting system of the Federal Institute of Hydrology (BfG) uses runoff data that are censored below predefined thresholds. To address this methodological challenge, we develop a censored EMOS method that is tailored to such data. The censored EMOS forecast distribution can be understood as a mixture of a point mass at the censoring threshold and a continuous part based on a truncated normal distribution. Parameter estimates of the censored EMOS model are obtained by minimizing the Continuous Ranked Probability Score (CRPS) over the training dataset. Model fitting on Box-Cox transformed data allows us to take account of the positive skewness of river discharge distributions. In order to achieve realistic forecast scenarios over an entire range of lead-times, there is a need for multivariate extensions. To this end, we smooth the marginal parameter estimates over lead-times. In order to obtain realistic scenarios of discharge evolution over time, the marginal distributions have to be linked with each other. To this end, the multivariate dependence structure can either be adopted from the raw ensemble like in Ensemble Copula Coupling (ECC), or be estimated from observations in a training period. 
The censored EMOS model has been applied to multi-model ensemble forecasts issued on a daily basis over a period of three years. For the two catchments considered, this resulted in well calibrated and sharp forecast distributions over all lead-times from 1 to 114 h. Training observations tended to be better indicators for the dependence structure than the raw ensemble.
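The shape of the censored EMOS predictive distribution, a point mass at the censoring threshold plus a normal tail above it, can be written down directly. The parameters below are illustrative, and the CRPS-minimization fitting step described in the abstract is not reproduced here:

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def censored_normal_cdf(x, mu, sigma, c):
    """CDF of a normal distribution censored below at threshold c: a point
    mass of size Phi((c-mu)/sigma) sits at c, and the normal CDF applies
    for x >= c. A sketch of the forecast distribution's shape only."""
    if x < c:
        return 0.0
    return phi((x - mu) / sigma)

mu, sigma, c = 2.0, 1.0, 0.5       # illustrative runoff parameters
mass_at_c = censored_normal_cdf(c, mu, sigma, c)
print(round(mass_at_c, 4), round(censored_normal_cdf(4.0, mu, sigma, c), 4))
```

The jump of the CDF at the threshold is the forecast probability that runoff lies at or below the censoring level, which is exactly the mixture structure the abstract describes.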
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin Wilde, Principal Investigator
2012-12-31
ABSTRACT: Application of Real-Time Offsite Measurements in Improved Short-Term Wind Ramp Prediction Skill. Improved forecasting performance immediately preceding wind ramp events is of preeminent concern to most wind energy companies, system operators, and balancing authorities. The value of near real-time hub height-level wind data and more general meteorological measurements to short-term wind power forecasting is well understood. For some sites, access to onsite measured wind data - even historical - can reduce forecast error in the short-range to medium-range horizons by as much as 50%. Unfortunately, valuable free-stream wind measurements at tall towers are not typically available at most wind plants, thereby forcing wind forecasters to rely upon wind measurements below hub height and/or turbine nacelle anemometry. Free-stream measurements can be appropriately scaled to hub-height levels, using existing empirically-derived relationships that account for surface roughness and turbulence. But there is large uncertainty in these relationships for a given time of day and state of the boundary layer. Alternatively, forecasts can rely entirely on turbine anemometry measurements, though such measurements are themselves subject to wake effects that are not stationary. The void in free-stream hub-height level measurements of wind can be filled by remote sensing (e.g., sodar, lidar, and radar). However, the expense of such equipment may not be sustainable. There is a growing market for traditional anemometry on tall tower networks, maintained by third parties to the forecasting process (i.e., independent of forecasters and the forecast users). This study examines the value of offsite tall-tower data from the WINDataNOW Technology network for short-horizon wind power predictions at a wind farm in northern Montana.
The presentation shall describe successful physical and statistical techniques for its application and the practicality of its application in an operational setting. It shall be demonstrated that, when used properly, the real-time offsite measurements materially improve wind ramp capture and prediction statistics when compared to traditional wind forecasting techniques and to a simple persistence model.
Shi, Yuan; Liu, Xu; Kok, Suet-Yheng; Rajarethinam, Jayanthi; Liang, Shaohong; Yap, Grace; Chong, Chee-Seng; Lee, Kim-Sung; Tan, Sharon S.Y.; Chin, Christopher Kuan Yew; Lo, Andrew; Kong, Waiming; Ng, Lee Ching; Cook, Alex R.
2015-01-01
Background: With its tropical rainforest climate, rapid urbanization, and changing demography and ecology, Singapore experiences endemic dengue; the last large outbreak in 2013 culminated in 22,170 cases. In the absence of a vaccine on the market, vector control is the key approach for prevention. Objectives: We sought to forecast the evolution of dengue epidemics in Singapore to provide early warning of outbreaks and to facilitate the public health response to moderate an impending outbreak. Methods: We developed a set of statistical models using least absolute shrinkage and selection operator (LASSO) methods to forecast the weekly incidence of dengue notifications over a 3-month time horizon. This forecasting tool used a variety of data streams and was updated weekly, including recent case data, meteorological data, vector surveillance data, and population-based national statistics. The forecasting methodology was compared with alternative approaches that have been proposed to model dengue case data (seasonal autoregressive integrated moving average and step-down linear regression) by fielding them on the 2013 dengue epidemic, the largest on record in Singapore. Results: Operationally useful forecasts were obtained at a 3-month lag using the LASSO-derived models. Based on the mean average percentage error, the LASSO approach provided more accurate forecasts than the other methods we assessed. We demonstrate its utility in Singapore’s dengue control program by providing a forecast of the 2013 outbreak for advance preparation of outbreak response. Conclusions: Statistical models built using machine learning methods such as LASSO have the potential to markedly improve forecasting techniques for recurrent infectious disease outbreaks such as dengue. Citation: Shi Y, Liu X, Kok SY, Rajarethinam J, Liang S, Yap G, Chong CS, Lee KS, Tan SS, Chin CK, Lo A, Kong W, Ng LC, Cook AR. 2016. 
Three-month real-time dengue forecast models: an early warning system for outbreak alerts and policy decision support in Singapore. Environ Health Perspect 124:1369–1375; http://dx.doi.org/10.1289/ehp.1509981 PMID:26662617
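LASSO's defining behavior for this kind of multi-stream forecasting, shrinking coefficients of uninformative predictors to exactly zero, can be shown with a small coordinate-descent implementation. The synthetic "dengue" data and predictor names are invented; this is not the Singapore model, just the penalized-regression mechanic it relies on:

```python
import random

random.seed(7)

def lasso_cd(X, y, lam, iters=200):
    """LASSO via cyclic coordinate descent with soft-thresholding.
    Assumes roughly standardized feature columns; a minimal stand-in for
    a full LASSO pipeline, not a model-selection procedure."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # Partial residual excluding feature j
            r = [y[i] - sum(beta[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            # Soft-threshold update: small correlations snap to exactly 0
            if rho > lam:
                beta[j] = (rho - lam) / z
            elif rho < -lam:
                beta[j] = (rho + lam) / z
            else:
                beta[j] = 0.0
    return beta

# Synthetic weekly case counts driven by two real predictors plus one
# irrelevant predictor the penalty should drop
n = 120
x1 = [random.gauss(0, 1) for _ in range(n)]   # e.g. lagged cases (standardized)
x2 = [random.gauss(0, 1) for _ in range(n)]   # e.g. mean temperature
x3 = [random.gauss(0, 1) for _ in range(n)]   # irrelevant noise predictor
y = [2.0 * a + 1.0 * b + random.gauss(0, 0.5) for a, b in zip(x1, x2)]
X = list(map(list, zip(x1, x2, x3)))

beta = lasso_cd(X, y, lam=0.2)
print([round(b, 2) for b in beta])
```

The automatic zeroing of weak predictors is what makes LASSO attractive when many candidate data streams (case, meteorological, vector, population) are fed into a weekly-updated forecast.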
The weather roulette: assessing the economic value of seasonal wind speed predictions
NASA Astrophysics Data System (ADS)
Christel, Isadora; Cortesi, Nicola; Torralba-Fernandez, Veronica; Soret, Albert; Gonzalez-Reviriego, Nube; Doblas-Reyes, Francisco
2016-04-01
Climate prediction is an emerging and highly innovative research area. For the wind energy sector, predicting the future variability of wind resources over the coming weeks or seasons is especially relevant to quantify operation and maintenance logistic costs or to inform energy trading decisions with potential cost savings and/or economic benefits. Recent advances in climate predictions have already shown that probabilistic forecasting can improve current prediction practices, which are based on the use of retrospective climatology and the assumption that what happened in the past is the best estimation of future conditions. Energy decision makers now have access to this new set of climate services, but are they willing to use them? Our aim is to properly explain the potential economic benefits of adopting probabilistic predictions, compared with current practice, by using the weather roulette methodology (Hagedorn & Smith, 2009). This methodology is a diagnostic tool created to communicate the skill and usefulness of a forecast in the decision-making process in a more intuitive and relevant way, by providing an economically and financially oriented assessment of the benefits of using a particular forecast system. We have selected a region relevant to energy stakeholders where the predictions of the EUPORIAS climate service prototype for the energy sector (RESILIENCE) are skillful. In this region, we have applied the weather roulette to compare the overall prediction success of RESILIENCE's predictions and climatology, illustrating it as an effective interest rate, an economic term that is easier for energy stakeholders to understand.
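The weather-roulette diagnostic itself is simple to state: stake capital across outcome categories in proportion to the forecast probabilities, with payout odds set by climatology, and report the geometric-mean wealth growth as an effective interest rate. The tercile probabilities and outcomes below are invented, not RESILIENCE output:

```python
import math

climatology = [1 / 3, 1 / 3, 1 / 3]    # tercile climatology sets the odds

# (forecast probabilities, observed wind-speed tercile) for a toy series
seasons = [
    ([0.6, 0.3, 0.1], 0),
    ([0.5, 0.3, 0.2], 0),
    ([0.2, 0.5, 0.3], 1),
    ([0.3, 0.4, 0.3], 2),
]

log_growth = 0.0
for probs, obs in seasons:
    # Stake fraction probs[k] on category k; payout per unit is 1/clim prob,
    # so the wealth multiplier is the forecast/climatology probability ratio
    wealth_ratio = probs[obs] / climatology[obs]
    log_growth += math.log(wealth_ratio)

effective_rate = math.exp(log_growth / len(seasons)) - 1
print(f"effective interest rate per forecast: {effective_rate:.1%}")
```

A positive rate means the probabilistic system would, on average, grow capital against a climatology bookmaker, which is the intuitive framing the methodology was designed to give stakeholders.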
Assessment of precursory information in seismo-electromagnetic phenomena
NASA Astrophysics Data System (ADS)
Han, P.; Hattori, K.; Zhuang, J.
2017-12-01
Previous statistical studies showed that there were correlations between seismo-electromagnetic phenomena and sizeable earthquakes in Japan. In this study, utilizing Molchan's error diagram, we evaluate whether these phenomena contain precursory information and discuss how they can be used in short-term forecasting of large earthquake events. In practice, for a given series of precursory signals and related earthquake events, each prediction strategy is characterized by the leading time of alarms, the length of the alarm window, and the alarm radius (area) and magnitude. The leading time is the time between a detected anomaly and the alarm that follows it, and the alarm window is the duration that an alarm lasts. The alarm radius and magnitude are the maximum predictable distance and minimum predictable magnitude of earthquake events, respectively. We introduce the modified probability gain (PG') and the probability difference (D') to quantify forecasting performance and to explore the optimal prediction parameters for a given electromagnetic observation. The above methodology is first applied to ULF magnetic data and GPS-TEC data. The results show that the earthquake predictions based on electromagnetic anomalies are significantly better than random guesses, indicating the data contain potentially useful precursory information. Meanwhile, we reveal the optimal prediction parameters for both observations. The methodology proposed in this study could also be applied to other pre-earthquake phenomena to find out whether there is precursory information, and then, on this basis, to explore the optimal alarm parameters for practical short-term forecasting.
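The Molchan-style bookkeeping, miss rate versus fraction of time under alarm, and the resulting probability gain can be computed from any alarm strategy. The anomaly and event days and the 30-day alarm window below are invented for illustration, not the paper's ULF or GPS-TEC data:

```python
# Alarms are windows opened after each detected anomaly; an event counts as
# predicted ("hit") if it falls inside an alarm window.
anomalies = [10, 55, 120, 200, 260]   # days with detected EM anomalies (toy)
events = [18, 140, 205, 300]          # days with sizeable earthquakes (toy)
total_days = 365
window = 30                           # each alarm lasts 30 days

alarm_days = set()
for a in anomalies:
    alarm_days.update(range(a, a + window))

hits = sum(1 for e in events if e in alarm_days)
miss_rate = 1 - hits / len(events)
tau = len(alarm_days) / total_days    # fraction of time under alarm
prob_gain = (hits / len(events)) / tau   # > 1 means better than random alarming
print(miss_rate, round(tau, 3), round(prob_gain, 2))
```

On a Molchan error diagram each strategy is a point (tau, miss rate); random guessing lies on the diagonal, so a probability gain above 1 is the signature of genuine precursory information.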
NASA Astrophysics Data System (ADS)
dos Santos, A. F.; Freitas, S. R.; de Mattos, J. G. Z.; de Campos Velho, H. F.; Gan, M. A.; da Luz, E. F. P.; Grell, G. A.
2013-09-01
In this paper we consider an optimization problem applying the metaheuristic Firefly algorithm (FY) to weight an ensemble of rainfall forecasts from daily precipitation simulations with the Brazilian developments on the Regional Atmospheric Modeling System (BRAMS) over South America during January 2006. The method is addressed as a parameter estimation problem to weight the ensemble of precipitation forecasts carried out using different options of the convective parameterization scheme. Ensemble simulations were performed using different choices of closures, representing different formulations of dynamic control (the modulation of convection by the environment) in a deep convection scheme. The optimization problem is solved as an inverse problem of parameter estimation. The application and validation of the methodology is carried out using daily precipitation fields, defined over South America and obtained by merging remote sensing estimations with rain gauge observations. The quadratic difference between the model and observed data was used as the objective function to determine the best combination of the ensemble members to reproduce the observations. To reduce the model rainfall biases, the set of weights determined by the algorithm is used to weight members of an ensemble of model simulations in order to compute a new precipitation field that represents the observed precipitation as closely as possible. The validation of the methodology is carried out using classical statistical scores. The algorithm has produced the best combination of the weights, resulting in a new precipitation field closest to the observations.
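The optimization problem described above, choosing convex weights for ensemble members so the weighted precipitation field best matches observations under a quadratic objective, can be sketched compactly. For brevity a plain random search over the weight simplex stands in for the Firefly metaheuristic, and the member forecasts and observations are invented numbers:

```python
import random

random.seed(3)
# Toy daily rainfall observations and three ensemble members' forecasts
obs = [5.0, 0.0, 12.0, 3.0, 8.0]
members = [
    [6.0, 1.0, 10.0, 2.0, 9.0],
    [2.0, 0.0, 15.0, 5.0, 4.0],
    [9.0, 3.0, 7.0, 1.0, 12.0],
]

def objective(w):
    """Quadratic model-observation misfit of the weighted ensemble field."""
    blend = [sum(wk * m[i] for wk, m in zip(w, members))
             for i in range(len(obs))]
    return sum((b - o) ** 2 for b, o in zip(blend, obs))

def random_simplex():
    """Uniform random point on the weight simplex (weights sum to 1)."""
    cuts = sorted(random.random() for _ in range(len(members) - 1))
    pts = [0.0] + cuts + [1.0]
    return [pts[i + 1] - pts[i] for i in range(len(members))]

best_w = [1 / 3] * 3                 # start from equal weighting
best_f = objective(best_w)
for _ in range(5000):
    w = random_simplex()
    f = objective(w)
    if f < best_f:
        best_w, best_f = w, f
print([round(w, 2) for w in best_w], round(best_f, 2))
```

Any metaheuristic, Firefly included, is doing the same thing at heart: searching the weight space for the combination whose blended field minimizes this misfit.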
How do I know if my forecasts are better? Using benchmarks in hydrological ensemble prediction
NASA Astrophysics Data System (ADS)
Pappenberger, F.; Ramos, M. H.; Cloke, H. L.; Wetterhall, F.; Alfieri, L.; Bogner, K.; Mueller, A.; Salamon, P.
2015-03-01
The skill of a forecast can be assessed by comparing the relative proximity of both the forecast and a benchmark to the observations. Example benchmarks include climatology or a naïve forecast. Hydrological ensemble prediction systems (HEPS) are currently transforming the hydrological forecasting environment but in this new field there is little information to guide researchers and operational forecasters on how benchmarks can be best used to evaluate their probabilistic forecasts. In this study, it is identified that the forecast skill calculated can vary depending on the benchmark selected and that the selection of a benchmark for determining forecasting system skill is sensitive to a number of hydrological and system factors. A benchmark intercomparison experiment is then undertaken using the continuous ranked probability score (CRPS), a reference forecasting system and a suite of 23 different methods to derive benchmarks. The benchmarks are assessed within the operational set-up of the European Flood Awareness System (EFAS) to determine those that are 'toughest to beat' and so give the most robust discrimination of forecast skill, particularly for the spatial average fields that EFAS relies upon. Evaluating against an observed discharge proxy, the benchmark found to have the most utility for EFAS, and to best avoid naïve skill across different hydrological situations, is meteorological persistency. This benchmark uses the latest meteorological observations of precipitation and temperature to drive the hydrological model. Hydrological long term average benchmarks, which are currently used in EFAS, are very easily beaten by the forecasting system and their use produces much naïve skill. When decomposed into seasons, the advanced meteorological benchmarks, which make use of meteorological observations from the past 20 years at the same calendar date, have the most skill discrimination.
They are also good at discriminating skill in low flows and for all catchment sizes. Simpler meteorological benchmarks are particularly useful for high flows. Recommendations for EFAS are to move to routine use of meteorological persistency, an advanced meteorological benchmark and a simple meteorological benchmark in order to provide a robust evaluation of forecast skill. This work provides the first comprehensive evidence on how benchmarks can be used in the evaluation of skill in probabilistic hydrological forecasts and which benchmarks are most useful for skill discrimination and avoidance of naïve skill in a large scale HEPS. It is recommended that all HEPS use the evidence and methodology provided here to evaluate which benchmarks to employ, so that forecasters can trust their skill evaluation and have confidence that their forecasts are indeed better.
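The benchmark comparison at the heart of this study reduces to a CRPS-based skill score: skill = 1 − CRPS(system)/CRPS(benchmark). A sketch using the standard empirical CRPS estimator for ensembles follows; the ensembles are synthetic draws, not EFAS output:

```python
import random, statistics

random.seed(5)

def ensemble_crps(ens, y):
    """Empirical CRPS of an ensemble forecast for a scalar observation:
    E|X - y| - 0.5 * E|X - X'|, averaged over ensemble members."""
    n = len(ens)
    t1 = statistics.fmean(abs(x - y) for x in ens)
    t2 = sum(abs(a - b) for a in ens for b in ens) / (n * n)
    return t1 - 0.5 * t2

obs = 3.2                                                 # observed discharge proxy
forecast = [random.gauss(3.0, 0.5) for _ in range(20)]    # forecasting system
benchmark = [random.gauss(2.0, 1.5) for _ in range(20)]   # e.g. a climatology draw

crps_f = ensemble_crps(forecast, obs)
crps_b = ensemble_crps(benchmark, obs)
skill = 1 - crps_f / crps_b          # > 0: system beats the benchmark
print(round(crps_f, 3), round(crps_b, 3), round(skill, 3))
```

The study's point is visible in this formula: the easier the benchmark (larger CRPS), the more "skill" the same system appears to have, which is why a tough benchmark such as meteorological persistency matters.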
…relationships and can be utilized to provide seasonal forecasts of tropical cyclones. Details of methodologies… thunderstorm systems (called mesoscale convective complexes [MCCs]) often produce an inertially stable, warm… they considered hurricanes and intense hurricanes that occurred anywhere within these water boundaries…
Technical Processing Librarians in the 1980's: Current Trends and Future Forecasts.
ERIC Educational Resources Information Center
Kennedy, Gail
1980-01-01
This review of recent and anticipated advances in library automation technology and methodology includes a review of the effects of OCLC, MARC formatting, AACR2, and increasing costs, as well as predictions of the impact on library technical processing of networking, expansion of automation, minicomputers, specialized reference services, and…
Application of Classification Methods for Forecasting Mid-Term Power Load Patterns
NASA Astrophysics Data System (ADS)
Piao, Minghao; Lee, Heon Gyu; Park, Jin Hyoung; Ryu, Keun Ho
An automated methodology based on data mining techniques is presented for the prediction of customer load patterns from long-duration load profiles. The proposed approach in this paper consists of three stages: (i) data preprocessing: noise and outliers are removed and the continuous attribute-valued features are transformed to discrete values, (ii) cluster analysis: k-means clustering is used to create load pattern classes and the representative load profiles for each class and (iii) classification: we evaluated several supervised learning methods in order to select a suitable prediction method. According to the proposed methodology, power load measured from an AMR (automatic meter reading) system, as well as customer indexes, were used as inputs for clustering. The output of clustering was the classification of representative load profiles (or classes). To evaluate the forecasting of load patterns, the classification methods were applied to a set of high-voltage customers of the Korean power system; class labels derived from clustering, together with other features, were used as input to build the classifiers. Lastly, the results of our experiments are presented.
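The clustering-then-classification pipeline can be sketched with a plain k-means over synthetic 24-hour load profiles, with the classification stage reduced to nearest-centroid assignment. The profile shapes and cluster count are invented, and real use would apply the supervised learners the paper evaluates rather than nearest-centroid:

```python
import random

random.seed(2)

def dist2(a, b):
    """Squared Euclidean distance between two profiles."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=20):
    """Plain k-means for small fixed-length profile vectors."""
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: dist2(p, centroids[c]))
            clusters[j].append(p)
        for j, cl in enumerate(clusters):
            if cl:
                centroids[j] = [sum(col) / len(cl) for col in zip(*cl)]
    return centroids

# Synthetic 24-hour load profiles: "daytime-peak" vs "flat" customers
def profile(peak):
    return [1.0 + (peak if 9 <= h <= 17 else 0.0) + random.gauss(0, 0.1)
            for h in range(24)]

data = [profile(2.0) for _ in range(15)] + [profile(0.0) for _ in range(15)]
centroids = kmeans(data, k=2)

def classify(p):
    """Assign a new profile to the nearest representative load class."""
    return min(range(len(centroids)), key=lambda c: dist2(p, centroids[c]))

labels = [classify(p) for p in data]
print(labels)
```

The centroids play the role of the paper's representative load profiles per class; new customers' profiles are then labeled against those classes.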
Probabilistic population aging
2017-01-01
We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio, and make the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures also take into account the expected number of years people have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675
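The difference between a conventional and a prospective old-age dependency ratio (OADR) can be made concrete: the prospective version replaces the fixed old-age threshold of 65 with the first age at which remaining life expectancy (RLE) falls below 15 years. The age pyramid and RLE schedule below are invented toy data, and the probabilistic-forecast layer of the paper is not reproduced:

```python
# Toy age pyramid and remaining-life-expectancy schedule (illustrative only)
pop = {a: 1000 - 8 * a for a in range(0, 101)}
rle = {a: max(0.0, 88 - 0.9 * a) for a in range(0, 101)}

def oadr(old_age):
    """Old-age dependency ratio: population at or above old_age divided by
    the working-age population (20 up to old_age)."""
    old = sum(pop[a] for a in range(old_age, 101))
    working = sum(pop[a] for a in range(20, old_age))
    return old / working

conventional = oadr(65)                       # fixed threshold at age 65

# Prospective threshold: first age with fewer than 15 expected years left
prospective_age = next(a for a in range(101) if rle[a] < 15)
prospective = oadr(prospective_age)
print(prospective_age, round(conventional, 3), round(prospective, 3))
```

Because the RLE-based threshold sits above 65 when longevity is high, the prospective ratio is smaller and, as the abstract reports, moves more slowly over time than the conventional one.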
Operational hydrological forecasting in Bavaria. Part I: Forecast uncertainty
NASA Astrophysics Data System (ADS)
Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.
2009-04-01
In Bavaria, operational flood forecasting has been in place since the disastrous flood of 1999. Nowadays, forecasts based on rainfall information from about 700 raingauges and 600 rivergauges are calculated and issued for nearly 100 rivergauges. With the added experience of the 2002 and 2005 floods, awareness grew that the standard deterministic forecast, neglecting the uncertainty associated with each forecast, is misleading, creating a false feeling of unambiguousness. As a consequence, a system to identify, quantify and communicate the sources and magnitude of forecast uncertainty has been developed, which is presented in part I of this study. In this system, the use of ensemble meteorological forecasts plays a key role, which is presented in part II. In developing the system, several constraints stemming from the range of hydrological regimes and operational requirements had to be met. Firstly, operational time constraints obviate the variation of all components of the modeling chain as would be done in a full Monte Carlo simulation. Therefore, an approach was chosen in which only the most relevant sources of uncertainty are dynamically considered, while the others are jointly accounted for by static error distributions from offline analysis. Secondly, the dominant sources of uncertainty vary over the wide range of forecasted catchments: in alpine headwater catchments, typically a few hundred square kilometers in size, rainfall forecast uncertainty is the key factor for forecast uncertainty, with a magnitude that changes dynamically with the prevailing predictability of the atmosphere. In lowland catchments encompassing several thousand square kilometers, forecast uncertainty in the desired range (usually up to two days) depends mainly on upstream gauge observation quality, routing and unpredictable human impacts such as reservoir operation.
The determination of forecast uncertainty comprised the following steps: a) From comparison of gauge observations with several years of archived forecasts, overall empirical error distributions, termed 'overall error', were derived for each gauge for a range of relevant forecast lead times. b) The error distributions vary strongly with the hydrometeorological situation, so a subdivision into the hydrological cases 'low flow', 'rising flood', 'flood' and 'flood recession' was introduced. c) For the sake of numerical compression, theoretical distributions were fitted to the empirical distributions using the method of moments. Here, the normal distribution was generally best suited. d) Further data compression was achieved by representing the distribution parameters as a function (second-order polynomial) of lead time. In general, the 'overall error' obtained from the above procedure is most useful in regions where large human impact occurs and where the influence of the meteorological forecast is limited. In upstream regions, however, forecast uncertainty is strongly dependent on the current predictability of the atmosphere, which is contained in the spread of an ensemble forecast. Including this dynamically in the estimation of hydrological forecast uncertainty requires prior elimination of the contribution of the weather forecast to the 'overall error'. This was achieved by calculating long series of hydrometeorological forecast tests in which rainfall observations were used instead of forecasts. The resulting error distribution is termed 'model error' and can be applied to hydrological ensemble forecasts, where ensemble rainfall forecasts are used as forcing. The concept is illustrated by examples (good and bad ones) covering a wide range of catchment sizes, hydrometeorological regimes and qualities of hydrological model calibration. The methodology for combining the static and dynamic shares of uncertainty is presented in part II of this study.
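Steps c) and d), fitting normal distributions to the empirical errors by the method of moments and then compressing the parameters into second-order polynomials of lead time, can be sketched as follows. The synthetic error samples (spread growing linearly with lead time) are an assumption for illustration; the real system does this per gauge and per hydrological case.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic forecast errors for lead times of 6..48 h: spread grows with lead.
lead_times = np.arange(6, 54, 6)
errors = {lt: rng.normal(0.0, 0.1 + 0.02 * lt, size=500) for lt in lead_times}

# c) Method of moments for a normal distribution: mu = sample mean,
#    sigma = sample standard deviation, computed per lead time.
mu = np.array([errors[lt].mean() for lt in lead_times])
sigma = np.array([errors[lt].std(ddof=1) for lt in lead_times])

# d) Compress further: represent each distribution parameter as a
#    second-order polynomial of lead time (np.polyfit, highest degree first).
mu_poly = np.polyfit(lead_times, mu, 2)
sigma_poly = np.polyfit(lead_times, sigma, 2)

def error_distribution(lead_time):
    """Reconstructed (mu, sigma) of the forecast error at any lead time."""
    return np.polyval(mu_poly, lead_time), np.polyval(sigma_poly, lead_time)

m24, s24 = error_distribution(24.0)  # error distribution at a 24 h lead
```

The polynomial compression lets the operational system evaluate an error distribution at any lead time from six stored coefficients rather than a full table of empirical quantiles.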
Windblown Dust Deposition Forecasting and Spread of Contamination around Mine Tailings.
Stovern, Michael; Guzmán, Héctor; Rine, Kyle P; Felix, Omar; King, Matthew; Ela, Wendell P; Betterton, Eric A; Sáez, Avelino Eduardo
2016-02-01
Wind erosion, transport and deposition of windblown dust from anthropogenic sources, such as mine tailings impoundments, can have significant effects on the surrounding environment. The lack of vegetation and the vertical protrusion of the mine tailings above the neighboring terrain make the tailings susceptible to wind erosion. Modeling the erosion, transport and deposition of particulate matter from mine tailings is a challenge for many reasons, including heterogeneity of the soil surface, vegetative canopy coverage, dynamic meteorological conditions and topographic influences. In this work, a previously developed Deposition Forecasting Model (DFM) that is specifically designed to model the transport of particulate matter from mine tailings impoundments is verified using dust collection and topsoil measurements. The DFM is initialized using data from an operational Weather Research and Forecasting (WRF) model. The forecast deposition patterns are compared to dust collected by inverted-disc samplers and determined through gravimetric, chemical composition and lead isotopic analysis. The DFM is capable of predicting dust deposition patterns from the tailings impoundment to the surrounding area. The methodology and approach employed in this work can be generalized to other contaminated sites from which dust transport to the local environment can be assessed as a potential route for human exposure.
Wang, K W; Deng, C; Li, J P; Zhang, Y Y; Li, X Y; Wu, M C
2017-04-01
Tuberculosis (TB) affects people globally and is being reconsidered as a serious public health problem in China. Reliable forecasting is useful for the prevention and control of TB. This study proposes a hybrid model combining an autoregressive integrated moving average (ARIMA) model with a nonlinear autoregressive (NAR) neural network for forecasting the incidence of TB from January 2007 to March 2016. Prediction performance was compared between the hybrid model and the ARIMA model. The best-fit hybrid model combined an ARIMA (3,1,0) × (0,1,1)12 model with an NAR neural network with four delays and 12 neurons in the hidden layer. The ARIMA-NAR hybrid model, which exhibited lower mean square error, mean absolute error, and mean absolute percentage error (0.2209, 0.1373, and 0.0406, respectively) in modelling performance, produced more accurate forecasts of TB incidence than the ARIMA model. This study shows that developing and applying the ARIMA-NAR hybrid model is an effective way to fit the linear and nonlinear patterns of time-series data, and the model could be helpful in the prevention and control of TB.
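The hybrid idea, a linear time-series stage whose residuals are then modelled by a nonlinear autoregressive stage, can be sketched in plain NumPy. This is not the paper's ARIMA (3,1,0) × (0,1,1)12 / NAR configuration: a least-squares AR fit stands in for ARIMA, a quadratic regression on lagged residuals stands in for the NAR network, and the monthly series is synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic monthly incidence: trend + seasonality + a mildly nonlinear term.
t = np.arange(120)
y = (5 + 0.02 * t + np.sin(2 * np.pi * t / 12)
     + 0.2 * np.sin(2 * np.pi * t / 12) ** 2
     + 0.05 * rng.standard_normal(120))

def embed(x, p):
    """Lag matrix: row t is (x[t], ..., x[t+p-1]), target x[t+p]."""
    X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
    return X, x[p:]

p = 13                                  # one year of lags plus one
X, target = embed(y, p)

# Linear stage (stand-in for ARIMA): least-squares AR(p) fit with intercept.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, target, rcond=None)
linear_fit = A @ coef
resid = target - linear_fit

# Nonlinear stage (stand-in for the NAR network): quadratic regression on
# four lagged residuals captures structure the linear model missed.
Xr, rt = embed(resid, 4)
B = np.column_stack([np.ones(len(Xr)), Xr, Xr ** 2])
rcoef, *_ = np.linalg.lstsq(B, rt, rcond=None)
hybrid_fit = linear_fit[4:] + B @ rcoef

mse_linear = np.mean((target[4:] - linear_fit[4:]) ** 2)
mse_hybrid = np.mean((target[4:] - hybrid_fit) ** 2)
```

The hybrid's fitted error can only improve on the linear stage in-sample, which is the mechanism (if not the magnitude) behind the error reductions the study reports.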
Bayesian Probabilistic Projections of Life Expectancy for All Countries
Raftery, Adrian E.; Chunn, Jennifer L.; Gerland, Patrick; Ševčíková, Hana
2014-01-01
We propose a Bayesian hierarchical model for producing probabilistic forecasts of male period life expectancy at birth for all the countries of the world from the present to 2100. Such forecasts would be an input to the production of probabilistic population projections for all countries, which is currently being considered by the United Nations. To evaluate the method, we did an out-of-sample cross-validation experiment, fitting the model to the data from 1950–1995, and using the estimated model to forecast for the subsequent ten years. The ten-year predictions had a mean absolute error of about 1 year, about 40% less than the current UN methodology. The probabilistic forecasts were calibrated, in the sense that (for example) the 80% prediction intervals contained the truth about 80% of the time. We illustrate our method with results from Madagascar (a typical country with steadily improving life expectancy), Latvia (a country that has had a mortality crisis), and Japan (a leading country). We also show aggregated results for South Asia, a region with eight countries. Free publicly available R software packages called bayesLife and bayesDem are available to implement the method. PMID:23494599
NASA Technical Reports Server (NTRS)
1977-01-01
A demonstration experiment is being planned to show that improvements in frost and freeze prediction are possible using timely Synchronous Meteorological Satellite temperature measurements, and that this information can affect Florida citrus growers' operations and decisions. An economic experiment was designed that will monitor citrus growers' decisions, actions, costs and losses, together with meteorological forecasts and actual weather events, and will establish the economic benefits of improved temperature forecasts. A summary is given of the economic experiment, the results obtained to date, and the work that remains to be done. Specifically, the following are described in detail: the experiment design; the data collection methodology and procedures; the sampling plan; data reduction techniques; cost and loss models; the establishment of frost severity measures; data obtained from citrus growers, the National Weather Service, and the Federal Crop Insurance Corp.; the resulting protection costs and crop losses for the control-group sample; extrapolation of the control-group results to the Florida citrus industry; and the method for normalizing these results to a normal or average frost season so that they may be compared with anticipated similar results from test-group measurements.
New technique for ensemble dressing combining Multimodel SuperEnsemble and precipitation PDF
NASA Astrophysics Data System (ADS)
Cane, D.; Milelli, M.
2009-09-01
The Multimodel SuperEnsemble technique (Krishnamurti et al., Science 285, 1548-1550, 1999) is a postprocessing method for the estimation of weather forecast parameters that reduces direct model output errors. It differs from other ensemble analysis techniques by using an adequate weighting of the input forecast models to obtain a combined estimation of meteorological parameters. Weights are calculated by least-squares minimization of the difference between the model and the observed field during a so-called training period. Although it can be applied successfully to continuous parameters like temperature, humidity, wind speed and mean sea level pressure (Cane and Milelli, Meteorologische Zeitschrift, 15, 2, 2006), the Multimodel SuperEnsemble also gives good results when applied to precipitation, a parameter that is quite difficult to handle with standard post-processing methods. Here we present our methodology for Multimodel precipitation forecasts, applied to a wide spectrum of results over Piemonte's very dense non-GTS weather station network. We focus particularly on an accurate statistical method for bias correction and on ensemble dressing in agreement with the observed forecast-conditioned precipitation PDF. Acknowledgement: this work is supported by the Italian Civil Defence Department.
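The core of the SuperEnsemble, weights obtained by least-squares minimization of model-minus-observation differences over a training period, reduces to a small linear algebra exercise. The three synthetic "models" and their biases below are assumptions for illustration; the anomaly-about-training-mean formulation follows the Krishnamurti et al. construction.

```python
import numpy as np

rng = np.random.default_rng(4)

# Training period: observations and three model forecasts with different errors.
obs = 10 + 2 * np.sin(np.linspace(0, 6, 200))
models = np.column_stack([obs + 1.0 + 0.3 * rng.standard_normal(200),   # warm bias
                          obs - 0.5 + 0.3 * rng.standard_normal(200),   # cold bias
                          obs + 0.8 * rng.standard_normal(200)])        # noisy

# SuperEnsemble weights: least-squares fit of observed anomalies on the
# model anomalies about their respective training-period means.
obs_mean, mod_mean = obs.mean(), models.mean(axis=0)
w, *_ = np.linalg.lstsq(models - mod_mean, obs - obs_mean, rcond=None)

def superensemble(new_forecasts):
    """Weighted combination of model anomalies added back to the observed mean."""
    return obs_mean + (new_forecasts - mod_mean) @ w

mse_single = np.mean((models - obs[:, None]) ** 2, axis=0)
mse_super = np.mean((superensemble(models) - obs) ** 2)
```

Because the anomaly formulation removes each model's mean bias and the regression downweights the noisier members, the combined estimate beats every individual model over the training period.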
NASA Astrophysics Data System (ADS)
Lamouroux, Julien; Testut, Charles-Emmanuel; Lellouche, Jean-Michel; Perruche, Coralie; Paul, Julien
2017-04-01
The operational production of data-assimilated biogeochemical states of the ocean is one of the challenging core projects of the Copernicus Marine Environment Monitoring Service. In that framework - and with the April 2018 CMEMS V4 release as a target - Mercator Ocean is in charge of improving the realism of its global ¼° BIOMER coupled physical-biogeochemical (NEMO/PISCES) simulations, analyses and re-analyses, and of developing an effective capacity to routinely estimate the biogeochemical state of the ocean through the implementation of biogeochemical data assimilation. The primary objectives are to enhance the representation of the seasonal cycle in the real-time and reanalysis systems, and to provide better control of the production in the equatorial regions. The assimilation of BGC data will rely on a simplified version of the SEEK filter, in which the error statistics do not evolve with the model dynamics. The associated forecast error covariances are based on the statistics of a collection of 3D ocean state anomalies. The anomalies are computed from a multi-year numerical experiment (a free run without assimilation) with respect to a running mean, in order to estimate the 7-day-scale error on the ocean state at a given period of the year. These forecast error covariances thus rely on a fixed-basis, seasonally variable ensemble of anomalies. This methodology, which is currently implemented in the "blue" component of the CMEMS operational forecast system, is now being adapted to the biogeochemical part of the operational system. Regarding observations - and as a first step - the system shall rely on the CMEMS GlobColour Global Ocean surface chlorophyll concentration products, delivered in NRT. The objective of this poster is to provide a detailed overview of the implementation of the aforementioned data assimilation methodology in the CMEMS BIOMER forecasting system.
Focus shall be put on (1) the assessment of the capabilities of this data assimilation methodology to provide satisfying statistics of the model variability errors (through space-time analysis of dedicated representers of satellite surface Chla observations), (2) the dedicated features of the data assimilation configuration that have been implemented so far (e.g. log-transformation of the analysis state, multivariate Chlorophyll-Nutrient control vector, etc.) and (3) the assessment of the performances of this future operational data assimilation configuration.
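The fixed-basis, seasonally variable ensemble of anomalies described above can be sketched as follows. The 3-point "ocean state", the 31-day running mean and the ±30-day pooling window are illustrative choices, not the BIOMER configuration; the point is the construction: anomalies of a free run about a running mean, pooled by day-of-year across years, yield a static seasonal error covariance.

```python
import numpy as np

rng = np.random.default_rng(5)

# Multi-year free run of a 3-point state (e.g. surface chlorophyll at 3 sites):
# seasonal cycle plus weather-scale variability, daily values over 4 years.
days = np.arange(4 * 365)
seasonal = np.sin(2 * np.pi * days / 365)[:, None] * np.array([1.0, 0.6, 0.3])
state = seasonal + 0.1 * rng.standard_normal((len(days), 3))

def running_mean(x, window=31):
    k = np.ones(window) / window
    return np.column_stack([np.convolve(x[:, j], k, mode="same")
                            for j in range(x.shape[1])])

# Anomalies of the free run about its running mean approximate the
# short-scale error on the state at a given time of year.
anomalies = state - running_mean(state)

# Fixed-basis, seasonally variable ensemble: pool anomalies from a +/-30-day
# window around the target day-of-year, across all years of the free run.
def seasonal_error_covariance(day_of_year, half_window=30):
    sel = np.abs((days % 365) - day_of_year) <= half_window
    return np.cov(anomalies[sel].T)

P = seasonal_error_covariance(180)   # forecast error covariance for mid-year
```

In a SEEK-type filter this covariance (in practice a low-rank basis of such anomalies) replaces a dynamically evolved error estimate, which is what makes the scheme affordable at global ¼° resolution.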
Data Assimilation and Regional Forecasts Using Atmospheric InfraRed Sounder (AIRS) Profiles
NASA Technical Reports Server (NTRS)
Chou, Shih-Hung; Zavodsky, Bradley; Jedlovec, Gary
2009-01-01
In data-sparse regions, remotely sensed observations can be used to improve analyses, which in turn should lead to better forecasts. One such source comes from the Atmospheric Infrared Sounder (AIRS), which, together with the Advanced Microwave Sounding Unit (AMSU), provides temperature and moisture profiles with an accuracy comparable to that of radiosondes. The purpose of this paper is to describe a procedure to optimally assimilate AIRS thermodynamic profiles, obtained from the version 5.0 Earth Observing System (EOS) science team retrieval algorithm, into a regional configuration of the Weather Research and Forecasting (WRF) model using WRF-Var. The paper focuses on the development of background error covariances for the regional domain and background field type, a methodology for ingesting AIRS profiles as separate over-land and over-water retrievals with different error characteristics, and the utilization of level-by-level quality indicators to select only the highest quality data. The assessment of the impact of the AIRS profiles on WRF-Var analyses will focus on intelligent use of the quality indicators, optimized tuning of the WRF-Var, and comparison of analysis soundings to radiosondes. The analyses will be used to conduct a month-long series of regional forecasts over the continental U.S. The long-term impact of AIRS profiles on forecasts will be assessed against verifying radiosonde and stage IV precipitation data.
Data Assimilation and Regional Forecasts using Atmospheric InfraRed Sounder (AIRS) Profiles
NASA Technical Reports Server (NTRS)
Zabodsky, Brad; Chou, Shih-Hung; Jedlovec, Gary J.
2009-01-01
In data-sparse regions, remotely sensed observations can be used to improve analyses, which in turn should lead to better forecasts. One such source comes from the Atmospheric Infrared Sounder (AIRS), which, together with the Advanced Microwave Sounding Unit (AMSU), provides temperature and moisture profiles with an accuracy comparable to that of radiosondes. The purpose of this poster is to describe a procedure to optimally assimilate AIRS thermodynamic profiles, obtained from the version 5.0 Earth Observing System (EOS) science team retrieval algorithm, into a regional configuration of the Weather Research and Forecasting (WRF) model using WRF-Var. The poster focuses on the development of background error covariances for the regional domain and background field type, a methodology for ingesting AIRS profiles as separate over-land and over-water retrievals with different error characteristics, and the utilization of level-by-level quality indicators to select only the highest quality data. The assessment of the impact of the AIRS profiles on WRF-Var analyses will focus on intelligent use of the quality indicators, optimized tuning of the WRF-Var, and comparison of analysis soundings to radiosondes. The analyses are used to conduct a month-long series of regional forecasts over the continental U.S. The long-term impact of AIRS profiles on forecasts will be assessed against NAM analyses and stage IV precipitation data.
FPGA-Based Stochastic Echo State Networks for Time-Series Forecasting.
Alomar, Miquel L; Canals, Vincent; Perez-Mora, Nicolas; Martínez-Moll, Víctor; Rosselló, Josep L
2016-01-01
Hardware implementation of artificial neural networks (ANNs) allows exploiting the inherent parallelism of these systems. Nevertheless, they require a large amount of resources in terms of area and power dissipation. Recently, Reservoir Computing (RC) has arisen as a strategic technique to design recurrent neural networks (RNNs) with simple learning capabilities. In this work, we show a new approach to implement RC systems with digital gates. The proposed method is based on the use of probabilistic computing concepts to reduce the hardware required to implement different arithmetic operations. The result is the development of a highly functional system with low hardware resources. The presented methodology is applied to chaotic time-series forecasting.
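A conventional floating-point echo state network for chaotic time-series forecasting looks like the sketch below; the paper's actual contribution, mapping this scheme onto stochastic-computing arithmetic in an FPGA, is not reproduced here, only the underlying Reservoir Computing recipe. Reservoir size, spectral radius, and the logistic-map driver are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Chaotic driver: the logistic map in its chaotic regime (r = 3.9).
n = 2000
u = np.empty(n)
u[0] = 0.4
for t in range(n - 1):
    u[t + 1] = 3.9 * u[t] * (1 - u[t])

# Reservoir: fixed random recurrent weights, rescaled to spectral radius 0.9.
N = 200
Win = 0.5 * rng.uniform(-1, 1, N)
W = rng.uniform(-0.5, 0.5, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Drive the reservoir: states[t] has seen inputs u[0..t-1].
states = np.zeros((n, N))
for t in range(1, n):
    states[t] = np.tanh(W @ states[t - 1] + Win * u[t - 1])

# Readout: ridge regression from reservoir state to the next value,
# discarding an initial washout while transients die out.
washout, lam = 100, 1e-6
X, y = states[washout:], u[washout:]
Wout = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)

pred = X @ Wout
mse = np.mean((y - pred) ** 2)
```

Only the linear readout is trained; the recurrent weights stay fixed, which is exactly the property that makes RC attractive for low-resource hardware implementations like the one proposed.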
Evaluating the Impact of AIRS Observations on Regional Forecasts at the SPoRT Center
NASA Technical Reports Server (NTRS)
Zavodsky, Bradley
2011-01-01
The NASA Short-term Prediction Research and Transition (SPoRT) Center collaborates with operational partners of different sizes and operational goals to improve forecasts using targeted projects and data sets. Modeling and data assimilation (DA) activities focus on demonstrating the utility of NASA data sets and capabilities within operational systems. SPoRT has successfully assimilated Atmospheric Infrared Sounder (AIRS) radiance and profile data. A collaborative project is underway with the Joint Center for Satellite Data Assimilation (JCSDA) to use AIRS profiles to better understand the impact of AIRS radiances assimilated within the Gridpoint Statistical Interpolation (GSI) system, in hopes of engaging the operational DA community in a reassessment of assimilation methodologies to assimilate hyperspectral radiances more effectively.
NASA Technical Reports Server (NTRS)
1980-01-01
The U.S./Canada wheat/barley exploratory experiment is discussed with emphasis on labeling, machine processing using P1A, and the crop calendar. Classification and the simulated aggregation test used in the U.S. corn/soybean exploratory experiment are also considered. Topics covered regarding the foreign commodity production forecasting project include the acquisition, handling, and processing of U.S. and foreign agricultural data as well as meteorological data. The accuracy assessment methodology, multicrop sampling and aggregation technology development, frame development, the yield project interface, and classification for area estimation are also examined.
Space Monitoring Data Center at Moscow State University
NASA Astrophysics Data System (ADS)
Kalegaev, Vladimir; Bobrovnikov, Sergey; Barinova, Vera; Myagkova, Irina; Shugay, Yulia; Barinov, Oleg; Dolenko, Sergey; Mukhametdinova, Ludmila; Shiroky, Vladimir
The Space Monitoring Data Center of Moscow State University provides operational information on the radiation state of near-Earth space. The Internet portal http://swx.sinp.msu.ru/ gives real-time access to data characterizing the level of solar activity and the geomagnetic and radiation conditions in the magnetosphere and heliosphere. Operational data coming from space missions (ACE, GOES, ELECTRO-L1, Meteor-M1) at L1, LEO and GEO, and from the Earth's surface, are used to represent the geomagnetic and radiation state of the near-Earth environment. An on-line database of measurements is also maintained to allow quick comparison between current conditions and conditions experienced in the past. Models of the space environment working in autonomous mode are used to generalize the information obtained from observations to the whole magnetosphere. Interactive applications and operational forecasting services are built on the basis of these models. They automatically generate alerts on particle flux enhancements above threshold values, both for SEP and for relativistic electrons, using data from LEO orbits. Special forecasting services give short-term forecasts of SEP penetration into the Earth's magnetosphere at low altitudes, as well as of relativistic electron fluxes at GEO. The velocities of recurrent high-speed solar wind streams at the Earth's orbit are predicted 3-4 days in advance on the basis of automatic estimation of the coronal hole areas detected in images of the Sun received from the SDO satellite. By means of a neural network approach, online forecasting of the Dst and Kp indices 0.5-1.5 hours ahead is carried out, based on the solar wind and interplanetary magnetic field measured by the ACE satellite. A visualization system allows experimental and model data to be represented in 2D and 3D.
Calibration of Ocean Forcing with satellite Flux Estimates (COFFEE)
NASA Astrophysics Data System (ADS)
Barron, Charlie; Jan, Dastugue; Jackie, May; Rowley, Clark; Smith, Scott; Spence, Peter; Gremes-Cordero, Silvia
2016-04-01
Predicting the evolution of ocean temperature in regional ocean models depends on estimates of surface heat fluxes and upper-ocean processes over the forecast period. Within the COFFEE project (Calibration of Ocean Forcing with satellite Flux Estimates), real-time satellite observations are used to estimate shortwave, longwave, sensible, and latent air-sea heat flux corrections to a background estimate from the prior day's regional or global model forecast. These satellite-corrected fluxes are used to prepare a corrected ocean hindcast and to estimate flux error covariances to project the heat flux corrections over a 3-5 day forecast. In this way, satellite remote sensing is applied not only to inform the initial ocean state but also to mitigate errors in surface heat flux and model representations affecting the distribution of heat in the upper ocean. While traditional assimilation of sea surface temperature (SST) observations re-centers ocean models at the start of each forecast cycle, COFFEE endeavors to appropriately partition and reduce forecast error among the various surface heat flux and ocean dynamics sources. A suite of experiments in the southern California Current demonstrates a range of COFFEE capabilities, showing the impact on forecast error relative to a baseline three-dimensional variational (3DVAR) assimilation using operational global or regional atmospheric forcing. Experiment cases combine different levels of flux calibration with assimilation alternatives. The cases use the original fluxes, apply full satellite corrections during the forecast period, or extend hindcast corrections into the forecast period. Assimilation is either baseline 3DVAR or standard strong-constraint 4DVAR, with work proceeding on a 4DVAR expanded to include a weak-constraint treatment of the surface flux errors. Covariance of flux errors is estimated from the recent time series of forecast and calibrated flux terms.
While the California Current examples are shown, the approach is equally applicable to other regions. These approaches within a 3DVAR application are anticipated to be useful for global and larger regional domains where a full 4DVAR methodology may be cost-prohibitive.
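The projection of recent flux corrections into the forecast period can be caricatured in a few lines. The 30-day window, the constant background bias, and simple persistence of the mean correction are illustrative assumptions, not the COFFEE implementation, which works with covariances between the individual flux terms.

```python
import numpy as np

rng = np.random.default_rng(7)

# Recent time series of background (model) and satellite-calibrated net heat
# fluxes (W m^-2); the background carries a roughly constant bias.
days = np.arange(30)
truth = 200 + 30 * np.sin(2 * np.pi * days / 30)
calibrated = truth + 5 * rng.standard_normal(30)       # satellite estimate
background = truth - 25 + 5 * rng.standard_normal(30)  # biased model flux

# Hindcast flux correction and its recent statistics: the mean correction is
# persisted over the 3-5 day forecast, with its variance as the uncertainty.
correction = calibrated - background
proj_correction = correction.mean()
proj_variance = correction.var(ddof=1)

forecast_flux = background[-1] + proj_correction       # corrected forecast flux
```

Persisting a calibrated correction in this way is what lets the forecast benefit from satellite flux information even after real-time observations stop at the analysis time.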
Satellite-based Calibration of Heat Flux at the Ocean Surface
NASA Astrophysics Data System (ADS)
Barron, C. N.; Dastugue, J. M.; May, J. C.; Rowley, C. D.; Smith, S. R.; Spence, P. L.; Gremes-Cordero, S.
2016-02-01
Model forecasts of upper ocean heat content and variability on diurnal to daily scales are highly dependent on estimates of heat flux through the air-sea interface. Satellite remote sensing is applied to not only inform the initial ocean state but also to mitigate errors in surface heat flux and model representations affecting the distribution of heat in the upper ocean. Traditional assimilation of sea surface temperature (SST) observations re-centers ocean models at the start of each forecast cycle. Subsequent evolution depends on estimates of surface heat fluxes and upper-ocean processes over the forecast period. The COFFEE project (Calibration of Ocean Forcing with satellite Flux Estimates) endeavors to correct ocean forecast bias through a responsive error partition among surface heat flux and ocean dynamics sources. A suite of experiments in the southern California Current demonstrates a range of COFFEE capabilities, showing the impact on forecast error relative to a baseline three-dimensional variational (3DVAR) assimilation using Navy operational global or regional atmospheric forcing. COFFEE addresses satellite-calibration of surface fluxes to estimate surface error covariances and links these to the ocean interior. Experiment cases combine different levels of flux calibration with different assimilation alternatives. The cases may use the original fluxes, apply full satellite corrections during the forecast period, or extend hindcast corrections into the forecast period. Assimilation is either baseline 3DVAR or standard strong-constraint 4DVAR, with work proceeding to add a 4DVAR expanded to include a weak constraint treatment of the surface flux errors. Covariance of flux errors is estimated from the recent time series of forecast and calibrated flux terms. While the California Current examples are shown, the approach is equally applicable to other regions. 
These approaches within a 3DVAR application are anticipated to be useful for global and larger regional domains where a full 4DVAR methodology may be cost-prohibitive.
Prediction of ENSO episodes using canonical correlation analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnston, A.G.; Ropelewski, C.F.
Canonical correlation analysis (CCA) is explored as a multivariate linear statistical methodology with which to forecast fluctuations of the El Nino/Southern Oscillation (ENSO) in real time. CCA is capable of identifying critical sequences of predictor patterns that tend to evolve into subsequent patterns that can be used to form a forecast. The CCA model is used to forecast the 3-month mean sea surface temperature (SST) in several regions of the tropical Pacific and Indian oceans for projection times of 0 to 4 seasons beyond the immediately forthcoming season. The predictor variables, representing the climate situation in the four consecutive 3-month periods ending at the time of the forecast, are (1) quasi-global seasonal mean sea level pressure (SLP) and (2) SST in the predicted regions themselves. Forecast skill is estimated using cross-validation, and persistence is used as the primary skill control measure. Results indicate that a large region in the eastern equatorial Pacific (120°-170°W longitude) has the highest overall predictability, with excellent skill realized for winter forecasts made at the end of summer. CCA outperforms persistence in this region under most conditions, and does noticeably better with SST included as a predictor in addition to SLP. It is demonstrated that better forecast performance at the longer lead times would be obtained if some significantly earlier (i.e., up to 4 years) predictor data were included, because the ability to predict the lower-frequency ENSO phase changes would increase. The good performance of the current system at shorter lead times appears to be based largely on the ability to predict ENSO evolution for events already in progress. Forecasting of the eastern tropical Pacific SST using CCA is now done routinely on a monthly basis for 0-, 1-, and 2-season leads at the Climate Analysis Center.
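A minimal CCA forecast cycle, fitting canonical pattern pairs on a training period and then mapping a new predictor field to a predictand estimate, can be sketched with a whitening-plus-SVD construction. The synthetic "SLP" and "SST" fields linked by a single shared mode are assumptions; the real system uses quasi-global fields and cross-validated skill estimates.

```python
import numpy as np

rng = np.random.default_rng(8)

# Training data: predictor fields X (e.g. SLP at 10 gridpoints) and predictand
# Y (e.g. SST in 4 regions), coupled through one shared climate mode.
n, px, py = 300, 10, 4
mode = rng.standard_normal(n)
load_x = rng.uniform(0.5, 1.0, px) * np.sign(rng.standard_normal(px))
load_y = rng.uniform(0.5, 1.0, py) * np.sign(rng.standard_normal(py))
X = np.outer(mode, load_x) + 0.3 * rng.standard_normal((n, px))
Y = np.outer(mode, load_y) + 0.3 * rng.standard_normal((n, py))

def cca_fit(X, Y, k=1, eps=1e-8):
    """CCA via whitening + SVD; returns what is needed to forecast Y from X."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Cxx = Xc.T @ Xc / len(X) + eps * np.eye(px)
    Cyy = Yc.T @ Yc / len(Y) + eps * np.eye(py)
    Cxy = Xc.T @ Yc / len(X)
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx)).T    # whitening transforms
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy)).T
    U, s, Vt = np.linalg.svd(Wx.T @ Cxy @ Wy)
    a = Wx @ U[:, :k]                                # canonical patterns for X
    b = Wy @ Vt.T[:, :k]                             # canonical patterns for Y
    v = Yc @ b
    B = np.linalg.lstsq(v, Yc, rcond=None)[0]        # variates back to Y space
    return {"Xm": X.mean(0), "Ym": Y.mean(0), "a": a, "r": s[:k], "B": B}

def cca_predict(model, Xnew):
    """Forecast: project X onto its canonical patterns, damp by the canonical
    correlations, and map back into predictand space."""
    u = (Xnew - model["Xm"]) @ model["a"]
    return model["Ym"] + (u * model["r"]) @ model["B"]

model = cca_fit(X, Y)
Yhat = cca_predict(model, X)
skill = 1 - np.mean((Y - Yhat) ** 2) / np.mean((Y - Y.mean(0)) ** 2)
```

Damping the canonical variates by their correlations is what keeps the forecast from overshooting when the predictor-predictand link is imperfect, the same reason persistence is a meaningful control in the paper's skill assessment.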
Using volcanic tremor for eruption forecasting at White Island volcano (Whakaari), New Zealand
NASA Astrophysics Data System (ADS)
Chardot, Lauriane; Jolly, Arthur D.; Kennedy, Ben M.; Fournier, Nicolas; Sherburn, Steven
2015-09-01
Eruption forecasting is a challenging task because of the inherent complexity of volcanic systems. Despite remarkable efforts to develop complex models to explain volcanic processes prior to eruptions, the material Failure Forecast Method (FFM) is one of the very few techniques that can provide a forecast time for an eruption. However, the method requires testing and automation before being used as a real-time eruption forecasting tool at a volcano. We developed an automatic algorithm to issue forecasts from episodes of increasing volcanic tremor recorded by Real-time Seismic Amplitude Measurement (RSAM) at one station, and optimised this algorithm for the period August 2011-January 2014, which comprises the recent unrest period at White Island volcano (Whakaari), New Zealand. A detailed residual analysis was paramount for selecting the most appropriate model explaining the RSAM time evolutions. In a hindsight simulation, four out of the five small eruptions reported during this period occurred within a failure window forecast by our optimised algorithm, and the probability of an eruption on a day within a failure window was 0.21, which is 37 times higher than the probability of an eruption on any day during the same period (0.0057). Moreover, the forecasts were issued a few hours prior to the eruptions, which is important from an emergency management point of view. Whereas the RSAM time evolutions preceding these four eruptions have a similar goodness-of-fit with the FFM, their spectral characteristics are different. The duration-amplitude distributions of the precursory tremor episodes support the hypothesis that several processes were likely occurring prior to these eruptions. We propose that slow rock failure and fluid flow processes are plausible candidates for the tremor source of these episodes. This hindsight exercise can be useful for future real-time implementation of the FFM at White Island.
A similar methodology could also be tested at other volcanoes even if only a limited network is available.
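The inverse-rate extrapolation at the heart of the FFM can be sketched as follows; the synthetic data and the classic linearised (alpha = 2) form are illustrative assumptions, not the study's optimised algorithm:

```python
# Sketch of the inverse-rate Failure Forecast Method (FFM): in the alpha = 2
# case, the inverse of the precursor rate (here, RSAM) decays linearly in
# time, and its extrapolated zero-crossing gives the forecast failure time.
# The tremor series below is synthetic, not data from White Island.

def ffm_forecast(times, rates):
    """Least-squares line through (t, 1/rate); returns forecast failure time."""
    inv = [1.0 / r for r in rates]
    n = len(times)
    mt = sum(times) / n
    mi = sum(inv) / n
    slope = sum((t - mt) * (y - mi) for t, y in zip(times, inv)) / \
            sum((t - mt) ** 2 for t in times)
    intercept = mi - slope * mt
    return -intercept / slope  # time at which 1/rate extrapolates to zero

# Accelerating synthetic tremor: rate = 100 / (10 - t), so failure at t = 10.
times = [0, 2, 4, 6, 8]
rates = [100 / (10 - t) for t in times]
print(round(ffm_forecast(times, rates), 3))  # → 10.0
```

In practice the forecast is re-issued as each new RSAM sample arrives, which is what makes the residual analysis and failure-window calibration described above essential.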
Updating of states in operational hydrological models
NASA Astrophysics Data System (ADS)
Bruland, O.; Kolberg, S.; Engeland, K.; Gragne, A. S.; Liston, G.; Sand, K.; Tøfte, L.; Alfredsen, K.
2012-04-01
Operationally, the main purpose of hydrological models is to provide runoff forecasts. The quality of the model state and the accuracy of the weather forecast, together with the model quality, define the runoff forecast quality. Input and model errors accumulate over time and may leave the model in a poor state. Usually model states can be related to observable conditions in the catchment. Updating these states, knowing their relation to observable catchment conditions, directly influences the forecast quality. Norway is internationally at the forefront of hydropower scheduling on both short and long time horizons. The inflow forecasts are fundamental to this scheduling. Their quality directly influences the producers' profit as they optimize hydropower production to market demand and at the same time minimize spill of water and maximize available hydraulic head. The quality of the inflow forecasts strongly depends on the quality of the models applied and the quality of the information they use. In this project the focus has been on improving the quality of the model states upon which the forecast is based. Runoff and snow storage are two observable quantities that reflect the model state and are used in this project for updating. Generally, the methods used can be divided into three groups: the first re-estimates the forcing data in the updating period; the second alters the weights in the forecast ensemble; and the third directly changes the model states. The uncertainty related to the forcing data through the updating period is due both to uncertainty in the actual observations and to how well the gauging stations represent the catchment with respect to temperature and precipitation. The project looks at methodologies that automatically re-estimate the forcing data and tests the results against the observed response. Model uncertainty is reflected in a joint distribution of model parameters estimated using the DREAM algorithm.
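The third group of updating methods (directly changing model states) can be sketched as a simple nudging step; the weight and the numbers below are illustrative assumptions, not the project's implementation:

```python
# Minimal sketch of direct state updating: relax a model state (here, snow
# storage) toward an observed estimate. The nudging weight is a hypothetical
# tuning parameter; operational systems would derive it from the relative
# uncertainties of model and observation.

def nudge_state(modeled, observed, weight=0.5):
    """Shift a model state part of the way toward its observed counterpart."""
    return modeled + weight * (observed - modeled)

snow_model = 120.0   # mm snow water equivalent carried by the model
snow_obs = 90.0      # mm estimated from, e.g., satellite snow-cover data
print(nudge_state(snow_model, snow_obs))  # → 105.0
```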
The Impact of the Assimilation of AIRS Radiance Measurements on Short-term Weather Forecasts
NASA Technical Reports Server (NTRS)
McCarty, Will; Jedlovec, Gary; Miller, Timothy L.
2009-01-01
Advanced spaceborne instruments have the ability to improve the horizontal and vertical characterization of temperature and water vapor in the atmosphere through the explicit use of hyperspectral thermal infrared radiance measurements. The incorporation of these measurements into a data assimilation system provides a means to continuously characterize a three-dimensional, instantaneous atmospheric state necessary for the time integration of numerical weather forecasts. Measurements from the National Aeronautics and Space Administration (NASA) Atmospheric Infrared Sounder (AIRS) are incorporated into the gridpoint statistical interpolation (GSI) three-dimensional variational (3D-Var) assimilation system to provide improved initial conditions for use in a mesoscale modeling framework mimicking that of the operational North American Mesoscale (NAM) model. The methodologies for the incorporation of the measurements into the system are presented. Though the measurements have been shown to have a positive impact in global modeling systems, the measurements are further constrained in this system as the model top is physically lower than the global systems and there is no ozone characterization in the background state. For a study period, the measurements are shown to have positive impact on both the analysis state as well as subsequently spawned short-term (0-48 hr) forecasts, particularly in forecasted geopotential height and precipitation fields. At 48 hr, height anomaly correlations showed an improvement in forecast skill of 2.3 hours relative to a system without the AIRS measurements. Similarly, the equitable threat and bias scores of precipitation forecasts of 25 mm (6 hr)^-1 were shown to be improved by 8% and 7%, respectively.
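The bias and equitable threat scores quoted above are computed from a standard 2x2 forecast/observation contingency table; this sketch uses made-up counts, not the study's verification data:

```python
# Standard categorical verification scores for a precipitation threshold:
# bias = (hits + false alarms) / (hits + misses), and the equitable threat
# score (ETS) corrects the threat score for hits expected by chance.
# The counts below are illustrative.

def bias_and_ets(hits, misses, false_alarms, correct_negatives):
    total = hits + misses + false_alarms + correct_negatives
    bias = (hits + false_alarms) / (hits + misses)
    hits_random = (hits + misses) * (hits + false_alarms) / total
    ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    return bias, ets

b, e = bias_and_ets(hits=50, misses=25, false_alarms=25, correct_negatives=900)
print(round(b, 3), round(e, 3))  # → 1.0 0.47
```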
NASA Astrophysics Data System (ADS)
Mateus, Pedro; Miranda, Pedro M. A.; Nico, Giovanni; Catalão, João.; Pinto, Paulo; Tomé, Ricardo
2018-04-01
Very high resolution precipitable water vapor maps obtained by the Sentinel-1A synthetic aperture radar (SAR), using the SAR interferometry (InSAR) technique, are here shown to have a positive impact on the performance of severe weather forecasts. A case study of deep convection which affected the city of Adra, Spain, on 6-7 September 2015 is successfully forecast by the Weather Research and Forecasting model initialized with InSAR data assimilated by the three-dimensional variational technique, with improved space and time distributions of precipitation, as observed by the local weather radar and rain gauge. This case study is exceptional because it consisted of two severe events 12 hr apart, with a timing that allows for the assimilation of both the ascending and descending satellite images, one for the initialization of each event. The same methodology applied to the network of Global Navigation Satellite System observations in Iberia, at the same times, failed to reproduce the observed precipitation, although it too improved the forecast skill, in a more modest way. The impact of precipitable water vapor data is shown to result from a direct increment of convective available potential energy, associated with important adjustments in the low-level wind field, favoring its release in deep convection. It is suggested that InSAR images, complemented by dense Global Navigation Satellite System data, may provide a new source of water vapor data for weather forecasting, since their sampling frequency could reach the subdaily scale by merging different SAR platforms, or when future geosynchronous radar missions become operational.
Doos, Lucy; Packer, Claire; Ward, Derek; Simpson, Sue; Stevens, Andrew
2016-01-01
Objectives Forecasting can support rational decision-making around the introduction and use of emerging health technologies and prevent investment in technologies that have limited long-term potential. However, forecasting methods need to be credible. We performed a systematic search to identify the methods used in forecasting studies to predict future health technologies within a 3–20-year timeframe. Identification and retrospective assessment of such methods potentially offer a route to more reliable prediction. Design Systematic search of the literature to identify studies reporting methods of forecasting in healthcare. Participants Not applicable; this was a literature-based study. Data sources The authors searched MEDLINE, EMBASE, PsycINFO and grey literature sources, and included articles published in English that reported their methods and a list of identified technologies. Main outcome measure Studies reporting methods used to predict future health technologies within a 3–20-year timeframe with an identified list of individual healthcare technologies. Commercially sponsored reviews, long-term futurology studies (with over 20-year timeframes) and speculative editorials were excluded. Results 15 studies met our inclusion criteria. The majority of studies (13/15) consulted experts, either alone or in combination with other methods such as literature searching. Only 2 studies used more complex forecasting tools such as scenario building. Conclusions The methodological fundamentals of formal 3–20-year prediction are consistent but vary in their details. Further research needs to be conducted to ascertain whether the predictions made were accurate and whether accuracy varies by the methods used or by the types of technologies identified. PMID:26966060
WIEBE, DOUGLAS J.; HOLENA, DANIEL N.; DELGADO, M. KIT; McWILLIAMS, NATHAN; ALTENBURG, JULIET; CARR, BRENDAN G.
2018-01-01
Trauma centers need objective feedback on performance to inform quality improvement efforts. The Trauma Quality Improvement Program recently published recommended methodology for case mix adjustment and benchmarking performance. We tested the feasibility of applying this methodology to develop risk-adjusted mortality models for a statewide trauma system. We performed a retrospective cohort study of patients ≥16 years old at Pennsylvania trauma centers from 2011 to 2013 (n = 100,278). Our main outcome measure was observed-to-expected mortality ratios (overall and within blunt, penetrating, multisystem, isolated head, and geriatric subgroups). Patient demographic variables, physiology, mechanism of injury, transfer status, injury severity, and pre-existing conditions were included as predictor variables. The statistical model had excellent discrimination (area under the curve = 0.94). Funnel plots of observed-to-expected identified five centers with lower than expected mortality and two centers with higher than expected mortality. No centers were outliers for management of penetrating trauma, but five centers had lower and three had higher than expected mortality for blunt trauma. It is feasible to use Trauma Quality Improvement Program methodology to develop risk-adjusted models for statewide trauma systems. Even with smaller numbers of trauma centers than are available in national datasets, it is possible to identify high and low outliers in performance. PMID:28541852
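The observed-to-expected comparison behind the funnel plots can be sketched as follows; the counts and the approximate Poisson-style control limits are illustrative, not the TQIP specification:

```python
# Sketch of an observed-to-expected (O/E) mortality ratio with approximate
# 95% funnel limits around 1.0. Assumes a Poisson-like variance for the
# expected deaths; counts are illustrative, not registry data.
import math

def oe_ratio(observed_deaths, expected_deaths):
    ratio = observed_deaths / expected_deaths
    half_width = 1.96 / math.sqrt(expected_deaths)  # funnel limits around 1.0
    return ratio, 1 - half_width, 1 + half_width

ratio, lo, hi = oe_ratio(observed_deaths=30, expected_deaths=50)
print(ratio < lo)  # centre flagged as lower-than-expected mortality → True
```

Centers whose ratio falls below the lower limit or above the upper limit are the low and high outliers discussed in the abstract; the limits narrow as expected deaths (i.e., center volume) grow.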
Traffic model for the satellite component of UMTS
NASA Technical Reports Server (NTRS)
Hu, Y. F.; Sheriff, R. E.
1995-01-01
An algorithm for traffic volume estimation for satellite mobile communications systems has been developed. This algorithm makes use of worldwide databases of demographic and economic data. In order to provide such an estimation, the effects of competing services have been considered so that likely market demand can be forecast. Different user groups within the predicted market have been identified according to expectations in the quality of service and mobility requirements. The number of users in each group is calculated taking into account the gross potential market, the penetration rate of the identified services, and the profitability of providing such services via satellite.
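The user-number calculation described above reduces, in its simplest form, to a product of market size, service penetration, and the satellite-served share; all figures below are illustrative assumptions, not the paper's database values:

```python
# Sketch of the per-group user estimate: gross potential market, scaled by
# the penetration rate of the service and by the fraction that can be
# profitably served via satellite. Numbers are made up for illustration.

def satellite_users(gross_potential, penetration_rate, satellite_share):
    return gross_potential * penetration_rate * satellite_share

# e.g. 2 million potential mobile users in a region, 15% service
# penetration, 10% of whom are profitably served via satellite:
print(satellite_users(2_000_000, 0.15, 0.10))
```

A full traffic model would repeat this per user group and per region, then aggregate to system-level traffic volume.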
NASA Astrophysics Data System (ADS)
Gafurov, O.; Gafurov, D.; Syryamkin, V.
2018-05-01
The paper analyses a field of computer science formed at the intersection of such areas of natural science as artificial intelligence, mathematical statistics, and database theory, which is referred to as "data mining" (the discovery of knowledge in data). The theory of neural networks is applied along with classical methods of mathematical analysis and numerical simulation. The paper describes the technique protected by a patent of the Russian Federation for the invention “A Method for Determining Location of Production Wells during the Development of Hydrocarbon Fields” [1–3] and implemented using the geoinformation system NeuroInformGeo. The technique has no analogues in domestic or international practice. The paper gives an example comparing the forecast of oil reservoir quality made by a geophysicist interpreter using standard methods with the forecast of oil reservoir quality made using this technology. The technical result demonstrates increased efficiency, effectiveness, and ecological compatibility in the development of mineral deposits, and the discovery of a new oil deposit.
Near-field hazard assessment of March 11, 2011 Japan Tsunami sources inferred from different methods
Wei, Y.; Titov, V.V.; Newman, A.; Hayes, G.; Tang, L.; Chamberlin, C.
2011-01-01
The tsunami source is the origin of the subsequent transoceanic water waves and thus the most critical component in modern tsunami forecast methodology. Although impractical to quantify directly, a tsunami source can be estimated by different methods based on a variety of measurements provided by deep-ocean tsunameters, seismometers, GPS, and other advanced instruments, some in real time and some after the event. Here we assess these different sources of the devastating March 11, 2011 Japan tsunami by model-data comparison for generation, propagation, and inundation in the near field of Japan. This study provides a comparative analysis to further understand the advantages and shortcomings of the different methods that may potentially be used in real-time warning and forecasting of tsunami hazards, especially in the near field. The model study also highlights the critical role of deep-ocean tsunami measurements for high-quality tsunami forecasts; their combination with land GPS measurements may lead to better understanding of both the earthquake mechanism and the tsunami generation process.
Time series modelling of global mean temperature for managerial decision-making.
Romilly, Peter
2005-07-01
Climate change has important implications for business and economic activity. Effective management of climate change impacts will depend on the availability of accurate and cost-effective forecasts. This paper uses univariate time series techniques to model the properties of a global mean temperature dataset in order to develop a parsimonious forecasting model for managerial decision-making over the short-term horizon. Although the model is estimated on global temperature data, the methodology could also be applied to temperature data at more localised levels. The statistical techniques include seasonal and non-seasonal unit root testing with and without structural breaks, as well as ARIMA and GARCH modelling. A forecasting evaluation shows that the chosen model performs well against rival models. The estimation results confirm the findings of a number of previous studies, namely that global mean temperatures increased significantly throughout the 20th century. The use of GARCH modelling also shows the presence of volatility clustering in the temperature data, and a positive association between volatility and global mean temperature.
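As a minimal illustration of the univariate time series approach, the sketch below fits an AR(1) model by least squares and issues a one-step forecast on a synthetic anomaly series; the paper's full ARIMA/GARCH treatment additionally handles unit roots, structural breaks, and conditional volatility:

```python
# Minimal AR(1) sketch for a temperature anomaly series: estimate
# x[t] = c + phi * x[t-1] by least squares, then forecast one step ahead.
# The series below is synthetic, not the paper's global temperature data.

def fit_ar1(series):
    x = series[:-1]
    y = series[1:]
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    phi = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
          sum((a - mx) ** 2 for a in x)
    c = my - phi * mx
    return c, phi

def forecast_next(series):
    c, phi = fit_ar1(series)
    return c + phi * series[-1]

anomalies = [0.10, 0.12, 0.11, 0.15, 0.14, 0.18, 0.17, 0.21]
print(round(forecast_next(anomalies), 3))
```

A GARCH layer, as used in the paper, would additionally model the variance of the AR residuals to capture the volatility clustering reported in the results.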
NASA Astrophysics Data System (ADS)
Gastón, Martín; Fernández-Peruchena, Carlos; Körnich, Heiner; Landelius, Tomas
2017-06-01
The present work describes the first version of a new procedure to forecast Direct Normal Irradiance (DNI), the #hashtdim, which combines ground measurements with Numerical Weather Predictions. The system is centred on generating predictions at very short lead times. It combines the outputs of the Numerical Weather Prediction model HARMONIE with an adaptive methodology based on machine learning. The DNI predictions are generated at 15-minute and hourly temporal resolutions and are updated every 3 hours. Each update offers forecasts for the next 12 hours: the first nine hours are generated at 15-minute temporal resolution, while the last three hours have hourly temporal resolution. The system is evaluated at a site with an operational BSRN station in southern Spain (PSA station). The #hashtdim has been implemented in the framework of the Direct Normal Irradiance Nowcasting methods for optimized operation of concentrating solar technologies (DNICast) project, under the European Union's Seventh Framework Programme for research, technological development and demonstration.
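The mixed-resolution forecast horizon described above can be enumerated directly; this is an illustrative sketch of the lead-time grid, not code from the #hashtdim system:

```python
# Lead times (in minutes) issued at each 3-hourly update: 15-minute steps
# out to 9 hours, then hourly steps for hours 10-12, as described above.

def horizon_minutes():
    leads = list(range(15, 9 * 60 + 1, 15))          # 15 min ... 9 h
    leads += list(range(10 * 60, 12 * 60 + 1, 60))   # 10 h, 11 h, 12 h
    return leads

leads = horizon_minutes()
print(len(leads), leads[0], leads[-1])  # → 39 15 720
```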
Electron Flux Models for Different Energies at Geostationary Orbit
NASA Technical Reports Server (NTRS)
Boynton, R. J.; Balikhin, M. A.; Sibeck, D. G.; Walker, S. N.; Billings, S. A.; Ganushkina, N.
2016-01-01
Forecast models were derived for energetic electrons at all energy ranges sampled by the third-generation Geostationary Operational Environmental Satellites (GOES). These models were based on Multi-Input Single-Output Nonlinear Autoregressive Moving Average with Exogenous inputs methodologies. The model inputs include the solar wind velocity, density and pressure, the fraction of time that the interplanetary magnetic field (IMF) was southward, the IMF contribution of a solar wind-magnetosphere coupling function proposed by Boynton et al. (2011b), and the Dst index. As such, this study has deduced five new 1 h resolution models for the low-energy electrons measured by GOES (30-50 keV, 50-100 keV, 100-200 keV, 200-350 keV, and 350-600 keV) and extended the existing >800 keV and >2 MeV Geostationary Earth Orbit electron flux models to forecast at a 1 h resolution. All of these models were shown to provide accurate forecasts, with prediction efficiencies ranging between 66.9% and 82.3%.
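The prediction efficiency quoted above is commonly defined as one minus the ratio of mean squared forecast error to the variance of the observations; this sketch uses made-up observation/forecast pairs, not GOES data:

```python
# Prediction efficiency (PE), expressed as a percentage: PE = 100% for a
# perfect forecast, 0% for a forecast no better than the observed mean.
# The observation/forecast pairs below are illustrative.

def prediction_efficiency(observed, forecast):
    n = len(observed)
    mean_obs = sum(observed) / n
    mse = sum((o - f) ** 2 for o, f in zip(observed, forecast)) / n
    var_obs = sum((o - mean_obs) ** 2 for o in observed) / n
    return 100.0 * (1.0 - mse / var_obs)

obs = [2.0, 4.0, 6.0, 8.0]
fc = [2.5, 3.5, 6.5, 7.5]
print(prediction_efficiency(obs, fc))  # → 95.0
```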
Geologic framework for the national assessment of carbon dioxide storage resources
Warwick, Peter D.; Corum, Margo D.
2012-01-01
The 2007 Energy Independence and Security Act (Public Law 110–140) directs the U.S. Geological Survey (USGS) to conduct a national assessment of potential geologic storage resources for carbon dioxide (CO2) and to consult with other Federal and State agencies to locate the pertinent geological data needed for the assessment. The geologic sequestration of CO2 is one possible way to mitigate its effects on climate change. The methodology used for the national CO2 assessment (Open-File Report 2010-1127; http://pubs.usgs.gov/of/2010/1127/) is based on previous USGS probabilistic oil and gas assessment methodologies. The methodology is non-economic and intended to be used at regional to subbasinal scales. The operational unit of the assessment is a storage assessment unit (SAU), composed of a porous storage formation with fluid flow and an overlying sealing unit with low permeability. Assessments are conducted at the SAU level and are aggregated to basinal and regional results. This report identifies and contains geologic descriptions of SAUs in separate packages of sedimentary rocks within the assessed basin and focuses on the particular characteristics, specified in the methodology, that influence the potential CO2 storage resource in those SAUs. Specific descriptions of the SAU boundaries as well as their sealing and reservoir units are included. Properties for each SAU such as depth to top, gross thickness, net porous thickness, porosity, permeability, groundwater quality, and structural reservoir traps are provided to illustrate geologic factors critical to the assessment. Although assessment results are not contained in this report, the geologic information included here will be employed, as specified in the methodology, to calculate a statistical Monte Carlo-based distribution of potential storage space in the various SAUs. Figures in this report show SAU boundaries and cell maps of well penetrations through the sealing unit into the top of the storage formation. 
Wells sharing the same well borehole are treated as a single penetration. Cell maps show the number of penetrating wells within one square mile and are derived from interpretations of incompletely attributed well data, a digital compilation that is known not to include all drilling. The USGS does not expect to know the location of all wells and cannot guarantee the amount of drilling through specific formations in any given cell shown on cell maps.
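The Monte Carlo-based distribution of potential storage mentioned above can be sketched volumetrically; the input ranges below are illustrative assumptions, not the USGS methodology's values:

```python
# Sketch of a Monte Carlo storage estimate for one storage assessment unit:
# storage volume = area x net porous thickness x porosity x storage
# efficiency, with each input drawn from an (illustrative) uniform range.
import random

def sample_storage_km3(rng):
    area = rng.uniform(1000.0, 2000.0)        # km^2
    thickness = rng.uniform(0.05, 0.15)       # km, net porous thickness
    porosity = rng.uniform(0.10, 0.25)
    efficiency = rng.uniform(0.01, 0.04)      # fraction of pore space usable
    return area * thickness * porosity * efficiency

rng = random.Random(42)
draws = sorted(sample_storage_km3(rng) for _ in range(10_000))
p10, p50, p90 = draws[1000], draws[5000], draws[9000]
print(p10 < p50 < p90)  # → True
```

Reporting percentiles of the sampled distribution (rather than a single number) is what makes the assessment probabilistic.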
NASA Technical Reports Server (NTRS)
Lapenta, William M.; Wohlman, Richard; Bradshaw, Tom; Burks, Jason; Jedlovec, Gary; Goodman, Steve; Darden, Chris; Meyer, Paul
2003-01-01
The NASA Short-term Prediction Research and Transition (SPoRT) Center seeks to accelerate the infusion of NASA Earth Science Enterprise (ESE) observations, data assimilation and modeling research into NWS forecast operations and decision-making. To meet long-term program expectations, it is not sufficient simply to give forecasters sophisticated workstations or new forecast products without fully assessing the ways in which they will be utilized. Close communication must be established between the research and operational communities so that developers have a complete understanding of user needs. In turn, forecasters must obtain a more comprehensive knowledge of the modeling and sensing tools available to them. A major goal of the SPoRT Program is to develop metrics and conduct assessment studies with NWS forecasters to evaluate the impacts and benefits of ESE experimental products on forecast skill. At a glance the task seems relatively straightforward. However, performing assessment of experimental products in an operational environment is demanding. Given the tremendous time constraints placed on NWS forecasters, it is imperative that forecaster input be obtained in a concise, unobtrusive manner. Great care must also be taken to ensure that forecasters understand their participation will eventually benefit them and WFO operations in general. Two requirements of the assessment plan developed under the SPoRT activity are that it 1) can be implemented within the WFO environment; and 2) provides tangible results for BOTH the research and operational communities. Supplemental numerical quantitative precipitation forecasts (QPF) were chosen as the first experimental SPoRT product to be evaluated during a Pilot Assessment Program conducted 1 May 2003 within the Huntsville, AL National Weather Service Forecast Office. Forecast time periods were broken up into six-hour bins ranging from zero to twenty-four hours. Data were made available for display in AWIPS on an operational basis so they could be efficiently incorporated into the forecast process. The methodology used to assess the value of experimental QPFs compared to available operational products is best described as a three-tier approach involving both forecasters and research scientists. Tier one is a web-based survey completed by duty forecasters on the aviation and public desks. The survey compiles information on how the experimental product was used in the forecast decision-making process. Up to 6 responses per twenty-four hours can be compiled during a precipitation event. Tier two consists of an event post mortem and experimental product assessment performed daily by the NASA/NWS Liaison. Tier three is a detailed breakdown/analysis of specific events targeted by either the NWS SOO or SPoRT team members. The task is performed by both NWS and NASA research scientists and may be conducted once every couple of months. The findings from the Pilot Assessment Program will be reported at the meeting.
Taghizadeh, S Mojtaba; Moghimi-Ardakani, Ali; Mohamadnia, Fatemeh
2015-03-01
A series of drug-in-adhesive transdermal drug delivery systems (patches) with different chemical penetration enhancers were designed to deliver drug through the skin as the site of application. The objective of our effort was to study the influence of various chemical penetration enhancers on skin permeation rate and adhesion properties of a transdermal drug delivery system using Box-Behnken experimental design. The response surface methodology based on a three-level, three-variable Box-Behnken design was used to evaluate the interactive effects on the dependent variables, namely the rate of skin permeation and the adhesion properties (peel strength and tack value). Levulinic acid, lauryl alcohol, and Tween 80 were used as penetration enhancers (patch formulations containing 0-8% of each chemical penetration enhancer). Buprenorphine was used as a model penetrant drug. The results showed that incorporation of 20% chemical penetration enhancer into the mixture led to maximum skin permeation flux of buprenorphine from abdominal rat skin, while the adhesion properties decreased. The skin flux in the presence of levulinic acid (1.594 μg/cm(2) h) was also higher than with Tween 80 (1.473 μg/cm(2) h) or lauryl alcohol (0.843 μg/cm(2) h), and when these enhancers were mixed together an additive effect was observed. Moreover, it was found that each enhancer increased the tack value, while levulinic acid and lauryl alcohol improved the peel strength but Tween 80 reduced it. These findings indicated that the best chemical skin penetration enhancer for the buprenorphine patch was levulinic acid. Among the designed formulations, the one which contained 12% (wt/wt) enhancers exhibited the highest efficiency.
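A three-level, three-variable Box-Behnken design of the kind used above can be enumerated directly; the sketch below generates the standard coded design matrix (factor labels and centre-point count are illustrative):

```python
# Standard three-factor Box-Behnken design in coded units (-1, 0, +1):
# a 2^2 factorial on each pair of factors with the third held at its
# centre level, plus replicated centre points.
from itertools import combinations, product

def box_behnken_3(n_center=3):
    runs = []
    for i, j in combinations(range(3), 2):       # each pair of factors
        for a, b in product((-1, 1), repeat=2):  # 2^2 factorial on the pair
            point = [0, 0, 0]
            point[i], point[j] = a, b
            runs.append(point)
    runs.extend([[0, 0, 0]] * n_center)          # centre-point replicates
    return runs

design = box_behnken_3()
print(len(design))  # → 15 (12 edge points + 3 centre points)
```

Each row would then be mapped from coded levels to actual enhancer concentrations (e.g., 0-8% for each enhancer) before running the permeation and adhesion experiments.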
The use of satellite data assimilation methods in regional NWP for solar irradiance forecasting
NASA Astrophysics Data System (ADS)
Kurzrock, Frederik; Cros, Sylvain; Chane-Ming, Fabrice; Potthast, Roland; Linguet, Laurent; Sébastien, Nicolas
2016-04-01
As an intermittent energy source, the injection of solar power into electricity grids requires irradiance forecasting in order to ensure grid stability. On time scales of more than six hours ahead, numerical weather prediction (NWP) is recognized as the most appropriate solution. However, the current representation of clouds in NWP models is not sufficiently precise for an accurate forecast of solar irradiance at ground level. Dynamical downscaling does not necessarily increase the quality of irradiance forecasts. Furthermore, incorrectly simulated cloud evolution is often the cause of inaccurate atmospheric analyses. In non-interconnected tropical areas, the large amplitudes of solar irradiance variability provide abundant solar yield but present significant problems for grid safety. Irradiance forecasting is particularly important for solar power stakeholders in these regions, where PV electricity penetration is increasing. At the same time, NWP is markedly more challenging in tropical areas than in mid-latitudes due to the special characteristics of tropical homogeneous convective air masses. Numerous data assimilation methods and strategies have evolved and been applied to a large variety of global and regional NWP models in recent decades. Assimilating data from geostationary meteorological satellites is an appropriate approach. Indeed, models converting radiances measured by satellites into cloud properties already exist. Moreover, data are available at high temporal frequencies, which enables pertinent modelling of cloud cover evolution for solar energy forecasts. In this work, we present a survey of different approaches which aim at improving cloud cover forecasts using the assimilation of geostationary meteorological satellite data into regional NWP models. Various approaches have been applied to a variety of models and satellites and in different regions of the world. Current methods focus on the assimilation of cloud-top information derived from infrared channels. For example, such information has been directly assimilated by modifying the water vapour profile in the initial conditions of the WRF model in California using GOES satellite imagery. In Europe, the assimilation of cloud-top height and relative humidity has been performed in an indirect approach using an ensemble Kalman filter; in this case Meteosat SEVIRI cloud information was assimilated into the COSMO model. Although such methods generally provide improved cloud cover forecasts in mid-latitudes, the major limitation is that only clear-sky or completely cloudy cases can be considered. Indeed, fractional clouds produce a measured signal that mixes cold clouds and the warmer Earth surface. If the model's initial state is directly forced by cloud properties observed by satellite, the changed model fields have to be smoothed in order to avoid numerical instability. Other crucial aspects that influence forecast quality in the case of satellite radiance assimilation are channel selection and the treatment of bias and error. These promising satellite data assimilation methods for regional NWP have not yet been explicitly applied and tested under tropical conditions. Therefore, a deeper understanding of the benefits of such methods is necessary to improve irradiance forecast schemes.
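The ensemble Kalman filter update mentioned above can be sketched for a single scalar state; this is a simplified deterministic shift (a full stochastic EnKF would also perturb the observation per member), and all numbers are illustrative:

```python
# Minimal scalar ensemble Kalman filter update: each ensemble member is
# shifted toward the observation by the Kalman gain, with the background
# error variance estimated from the ensemble spread. Illustrative only;
# a stochastic EnKF would add perturbed observations per member.
import statistics

def enkf_update(ensemble, observation, obs_error_var):
    var_b = statistics.variance(ensemble)       # background (sample) variance
    gain = var_b / (var_b + obs_error_var)      # scalar Kalman gain
    return [x + gain * (observation - x) for x in ensemble]

background = [4.0, 5.0, 6.0]   # km, ensemble of cloud-top heights
analysis = enkf_update(background, observation=6.0, obs_error_var=1.0)
print(analysis)  # → [5.0, 5.5, 6.0]
```

Note how the update both moves the ensemble mean toward the observation and shrinks the spread, which is the mechanism by which satellite cloud information constrains the analysis.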
Extensions and applications of ensemble-of-trees methods in machine learning
NASA Astrophysics Data System (ADS)
Bleich, Justin
Ensemble-of-trees algorithms have emerged to the forefront of machine learning due to their ability to generate high forecasting accuracy for a wide array of regression and classification problems. Classic ensemble methodologies such as random forests (RF) and stochastic gradient boosting (SGB) rely on algorithmic procedures to generate fits to data. In contrast, more recent ensemble techniques such as Bayesian Additive Regression Trees (BART) and Dynamic Trees (DT) focus on an underlying Bayesian probability model to generate the fits. These new probability model-based approaches show much promise versus their algorithmic counterparts, but also offer substantial room for improvement. The first part of this thesis focuses on methodological advances for ensemble-of-trees techniques with an emphasis on the more recent Bayesian approaches. In particular, we focus on extensions of BART in four distinct ways. First, we develop a more robust implementation of BART for both research and application. We then develop a principled approach to variable selection for BART as well as the ability to naturally incorporate prior information on important covariates into the algorithm. Next, we propose a method for handling missing data that relies on the recursive structure of decision trees and does not require imputation. Last, we relax the assumption of homoskedasticity in the BART model to allow for parametric modeling of heteroskedasticity. The second part of this thesis returns to the classic algorithmic approaches in the context of classification problems with asymmetric costs of forecasting errors. First, we consider the performance of RF and SGB more broadly and demonstrate their superiority to logistic regression for applications in criminology with asymmetric costs. Next, we use RF to forecast unplanned hospital readmissions upon patient discharge with asymmetric costs taken into account. 
Finally, we explore the construction of stable decision trees for forecasts of violence during probation hearings in court systems.
An EPR methodology for measuring the London penetration depth for the ceramic superconductors
NASA Technical Reports Server (NTRS)
Rakvin, B.; Mahl, T. A.; Dalal, N. S.
1990-01-01
The use of electron paramagnetic resonance (EPR) as a quick and easily accessible method for measuring the London penetration depth, lambda, of the high-Tc superconductors is discussed. The method utilizes the broadening of the EPR signal of a free radical adsorbed on the surface of the sample, due to the emergence of the magnetic flux lattice. The second moment of the EPR signal below Tc is fitted to the Brandt equation for a simple triangular lattice. The precision of this method compares quite favorably with those of more standard methods such as muon spin rotation (mu+SR), neutron scattering, and magnetic susceptibility.
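The fit to the Brandt equation can be sketched by inverting the commonly cited triangular-lattice relation <dB^2> ≈ 0.00371 Φ0²/λ⁴ (Φ0 the flux quantum) for lambda; the measured second moment below is an illustrative number, not data from the paper:

```python
# Sketch of extracting the London penetration depth lambda from the second
# moment of the field distribution via the Brandt relation for a triangular
# vortex lattice: <dB^2> ≈ 0.00371 * Phi0^2 / lambda^4. Illustrative input.
import math

PHI0 = 2.067833848e-15  # magnetic flux quantum, Wb (T·m^2)

def penetration_depth(second_moment_T2):
    """Invert the Brandt relation; second moment in T^2, lambda in metres."""
    return (0.00371 * PHI0 ** 2 / second_moment_T2) ** 0.25

# A second moment of ~1 gauss^2 = 1e-8 T^2:
lam = penetration_depth(1e-8)
print(round(lam * 1e9), "nm")
```

Because lambda enters to the fourth power, even a rough second-moment measurement from the EPR linewidth constrains lambda fairly tightly.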
Mortality prediction using TRISS methodology in the Spanish ICU Trauma Registry (RETRAUCI).
Chico-Fernández, M; Llompart-Pou, J A; Sánchez-Casado, M; Alberdi-Odriozola, F; Guerrero-López, F; Mayor-García, M D; Egea-Guerrero, J J; Fernández-Ortega, J F; Bueno-González, A; González-Robledo, J; Servià-Goixart, L; Roldán-Ramírez, J; Ballesteros-Sanz, M Á; Tejerina-Alvarez, E; Pino-Sánchez, F I; Homar-Ramírez, J
2016-10-01
To validate Trauma and Injury Severity Score (TRISS) methodology as an auditing tool in the Spanish ICU Trauma Registry (RETRAUCI). A prospective, multicenter registry evaluation was carried out. Thirteen Spanish Intensive Care Units (ICUs). Individuals with traumatic disease and available data admitted to the participating ICUs. Predicted mortality using TRISS methodology was compared with that observed in the pilot phase of the RETRAUCI from November 2012 to January 2015. Discrimination was evaluated using receiver operating characteristic (ROC) curves and the corresponding areas under the curves (AUCs) (95% CI), with calibration using the Hosmer-Lemeshow (HL) goodness-of-fit test. A value of p<0.05 was considered significant. Predicted and observed mortality. A total of 1405 patients were analyzed. The observed mortality rate was 18% (253 patients), while the predicted mortality rate was 16.9%. The area under the ROC curve was 0.889 (95% CI: 0.867-0.911). Patients with blunt trauma (n=1305) had an area under the ROC curve of 0.887 (95% CI: 0.864-0.910), and those with penetrating trauma (n=100) presented an area under the curve of 0.919 (95% CI: 0.859-0.979). In the global sample, the HL test yielded a value of 25.38 (p=0.001): 27.35 (p<0.0001) in blunt trauma and 5.91 (p=0.658) in penetrating trauma. TRISS methodology underestimated mortality in patients with low predicted mortality and overestimated mortality in patients with high predicted mortality. TRISS methodology in the evaluation of severe trauma in Spanish ICUs showed good discrimination, with inadequate calibration - particularly in blunt trauma. Copyright © 2015 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.
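The predicted mortality in such an audit comes from the TRISS logistic survival model. A hedged sketch, using the commonly cited MTOS coefficients for blunt trauma (registries such as RETRAUCI may recalibrate these):

```python
# Hedged sketch of the TRISS survival model: Ps = 1 / (1 + exp(-b)),
# where b combines the Revised Trauma Score (RTS), Injury Severity
# Score (ISS) and an age index. Coefficients are the commonly cited
# MTOS values for blunt trauma, not necessarily those of any registry.
import math

def triss_survival(rts, iss, age_55_or_over,
                   b0=-0.4499, b_rts=0.8085, b_iss=-0.0835, b_age=-1.7430):
    b = b0 + b_rts * rts + b_iss * iss + b_age * (1.0 if age_55_or_over else 0.0)
    return 1.0 / (1.0 + math.exp(-b))

# Young patient, perfect RTS (7.8408), modest anatomical injury:
ps = triss_survival(rts=7.8408, iss=9, age_55_or_over=False)
predicted_mortality = 1.0 - ps
```

Summing `predicted_mortality` over all patients gives the expected number of deaths that the observed count is audited against; discrimination (ROC/AUC) and calibration (Hosmer-Lemeshow) then assess the model's ranking and its fit across risk strata separately.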
A methodology for long range prediction of air transportation
NASA Technical Reports Server (NTRS)
Ayati, M. B.; English, J. M.
1980-01-01
The paper describes a methodology for long-range projection of aircraft fuel requirements. A new concept of social and economic factors for the future aviation industry, which provides an estimate of predicted fuel usage, is presented; it includes air traffic forecasts and lead times for producing new engines and aircraft types. An air transportation model is then developed in terms of an abstracted set of variables that represents the entire aircraft industry on a macroscale. This model was evaluated by testing its required output variables against a model based on historical data from the past decades.
Francisco Rodríguez y Silva; Armando González-Cabán
2016-01-01
We propose an economic analysis using utility, productivity, and efficiency theories to provide fire managers a decision support tool to determine the most efficient fire management program levels. By incorporating managers' accumulated fire suppression experience (capitalized experience) in the analysis we help fire managers...
Are Education Cost Functions Ready for Prime Time? An Examination of Their Validity and Reliability
ERIC Educational Resources Information Center
Duncombe, William; Yinger, John
2011-01-01
This article makes the case that cost functions are the best available methodology for ensuring consistency between a state's educational accountability system and its education finance system. Because they are based on historical data and well-known statistical methods, cost functions are a particularly flexible and low-cost way to forecast what…
ERIC Educational Resources Information Center
Moeletsi, M. E.; Mellaart, E. A. R.; Mpandeli, N. S.; Hamandawana, H.
2013-01-01
Purpose: New innovative ways of communicating agrometeorological information are needed to help farmers, especially subsistence/small-scale farmers, to cope with the high climate variability experienced in most parts of southern Africa. Design/methodology/approach: The article introduces an early warning system for farmers. It utilizes short…
Model documentation: Electricity Market Module, Electricity Fuel Dispatch Submodule
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report documents the objectives, analytical approach and development of the National Energy Modeling System Electricity Fuel Dispatch Submodule (EFD), a submodule of the Electricity Market Module (EMM). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.
ERIC Educational Resources Information Center
Marasulov, Akhmat; Saipov, Amangeldi; ?rymbayeva, Kulimkhan; Zhiyentayeva, Begaim; Demeuov, Akhan; Konakbaeva, Ulzhamal; Bekbolatova, Akbota
2016-01-01
The aim of the study is to examine the methodological and theoretical bases for constructing a development mechanism for an integrated model of specialist training and the teacher's conceptual-theoretical activity. Using the methods of generalization of teaching experience, pedagogical modeling and forecasting, the authors determine the urgent problems…
Ronald F Billings; William W. Upton
2010-01-01
An operational system to forecast infestation trends (increasing, static, declining) and relative population levels (high, moderate, low) of the southern pine beetle (SPB), Dendroctonus frontalis, has been implemented in the Southern and Eastern United States. Numbers of dispersing SPB and those of a major predator (the clerid beetle, ...
The 2017 Meteor Shower Activity Forecast for Earth Orbit
NASA Technical Reports Server (NTRS)
Moorhead, Althea; Cooke, Bill; Moser, Danielle
2017-01-01
Most meteor showers will display typical activity levels in 2017. Perseid activity is expected to be higher than normal but less than in 2016; rates may reach 80% of the peak ZHR in 2016. Despite this enhancement, the Perseids rank 4th in flux for 0.04-cm-equivalent meteoroids: the Geminids (GEM), Daytime Arietids (ARI), and Southern delta Aquariids (SDA) all produce higher fluxes. Aside from heightened Perseid activity, the 2017 forecast includes a number of changes. In 2016, the Meteoroid Environment Office used 14 years of shower flux data to revisit the activity profiles of meteor showers included in the annual forecast. Both the list of showers and the shape of certain major showers have been revised. The names and three-letter shower codes were updated to match those in the International Astronomical Union (IAU) Meteor Data Center, and a number of defunct or insignificant showers were removed. The most significant of these changes are the increased durations of the Daytime Arietid (ARI) and Geminid (GEM) meteor showers. This document is designed to supplement spacecraft risk assessments that incorporate an annual averaged meteor shower flux (as is the case with all NASA meteor models). Results are presented relative to this baseline and are weighted to a constant kinetic energy. Two showers - the Daytime Arietids (ARI) and the Geminids (GEM) - attain flux levels approaching that of the baseline meteoroid environment for 0.1-cm-equivalent meteoroids. This size is the threshold for structural damage. These two showers, along with the Quadrantids (QUA) and Perseids (PER), exceed the baseline flux for 0.3-cm-equivalent particles, which is near the limit for pressure vessel penetration. Please note, however, that meteor shower fluxes drop dramatically with increasing particle size. 
As an example, the Arietids contribute a flux of about 5 x 10^-6 meteoroids m^-2 hr^-1 in the 0.04-cm-equivalent range, but only 1 x 10^-8 meteoroids m^-2 hr^-1 for the 0.3-cm-equivalent and larger size regime. Thus, a PNP risk assessment should use the flux and flux enhancements corresponding to the smallest particle capable of penetrating a component, because the flux at this size will be the dominant contributor to the risk.
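Under the usual Poisson impact assumption, such fluxes translate into a probability of no penetration (PNP) as follows; the exposed area and exposure duration below are hypothetical:

```python
# Sketch of a PNP calculation: if impacts arrive as a Poisson process
# with a given flux, then PNP = exp(-flux * area * time). The flux value
# is the 0.3-cm-equivalent Arietid figure quoted in the abstract; the
# 10 m^2 area and 24 h exposure are illustrative assumptions.
import math

def pnp(flux_per_m2_hr: float, area_m2: float, hours: float) -> float:
    """Probability that zero penetrating impacts occur during exposure."""
    return math.exp(-flux_per_m2_hr * area_m2 * hours)

p = pnp(1e-8, 10.0, 24.0)  # very close to 1 for this small fluence
```

This is why the smallest penetrating size dominates the risk: the exponent scales linearly with flux, and flux rises steeply as particle size decreases.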
United States Registered Nurse Workforce Report Card and Shortage Forecast: A Revisit.
Zhang, Xiaoming; Tai, Daniel; Pforsich, Hugh; Lin, Vernon W
This is a reevaluation of registered nurse (RN) supply and demand from 2016 to 2030 using a previously published workforce forecast model and grading methodology with more recent workforce data. There will be a shortage of 154 018 RNs by 2020 and 510 394 RNs by 2030; the South and West regions will have higher shortage ratios than the Northeast and Midwest regions. This reflects a nearly 50% overall improvement compared with the authors' prior study, and the low-performing states have improved from 18 "D" and 12 "F" grades as published earlier to 13 "D" and 1 "F" in this study. Although progress has been made, efforts to foster the pipelines for improving the nursing workforce need to be continued.
Man-made Boards Technology Trends based on TRIZ Evolution Theory
NASA Astrophysics Data System (ADS)
Yu, Huiling; Fan, Delin
China is one of the world's largest manufacturers and consumers of man-made boards. A systematic and efficient method of foreseeing future technology trends and their evolutionary potential is a key task that can help companies guide their planning and allocate their resources. Application of the law of evolution with an S-shaped curve could contribute essentially to the accuracy of long-term forecasts. This research seeks to determine the current stage and position on the S-curve of man-made board technology in China based on TRIZ evolution theory, and introduces a methodology that combines patent analysis and technology life-cycle forecasting to find a niche space for man-made board technology development in China.
Prediction of high-energy radiation belt electron fluxes using a combined VERB-NARMAX model
NASA Astrophysics Data System (ADS)
Pakhotin, I. P.; Balikhin, M. A.; Shprits, Y.; Subbotin, D.; Boynton, R.
2013-12-01
This study is concerned with the modelling and forecasting of energetic electron fluxes that endanger satellites in space. By combining data-driven predictions from the NARMAX methodology with the physics-based VERB code, it becomes possible to predict electron fluxes with a high level of accuracy and across a radial distance from inside the local acceleration region to out beyond geosynchronous orbit. The model coupling also makes it possible to avoid accounting for seed electron variations at the outer boundary. Conversely, combining a convection code with the VERB and NARMAX models has the potential to provide even greater accuracy in forecasting that is not limited to geostationary orbit but makes predictions across the entire outer radiation belt region.
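The NARX core of the NARMAX idea can be sketched in a few lines (not the authors' code, which also includes moving-average noise terms and structure selection): regress the next output on polynomial terms in lagged outputs and inputs.

```python
# Minimal NARX sketch: fit y[t] ~ a*y[t-1] + b*u[t-1] + c*y[t-1]*u[t-1] + d
# by least squares. The synthetic system below has known coefficients,
# which the fit recovers; real NARMAX adds noise modelling and term
# selection (e.g. forward regression with error-reduction ratios).
import numpy as np

def fit_narx(y, u):
    Y = np.asarray(y, float)
    U = np.asarray(u, float)
    X = np.column_stack([Y[:-1], U[:-1], Y[:-1] * U[:-1], np.ones(len(Y) - 1)])
    coef, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)
    return coef

rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, 200)          # hypothetical driver (e.g. solar wind input)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.5 * y[t-1] + 0.3 * u[t-1] + 0.1 * y[t-1] * u[t-1]
coef = fit_narx(y, u)                # recovers [0.5, 0.3, 0.1, 0.0]
</```

Because the synthetic system lies exactly in the model class and is noise-free, least squares recovers the coefficients essentially exactly; with real flux data, the noise terms and structure selection that full NARMAX adds become essential.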
An overview of the 1984 Battelle outside users payload model
NASA Astrophysics Data System (ADS)
Day, J. B.; Conlon, R. J.; Neale, D. B.; Fischer, N. H.
1984-10-01
The methodology and projections from a model of the market for non-NASA, non-DOD, reimbursable payloads from the non-Soviet-bloc countries over the 1984-2000 time period are summarized. High and low forecast ranges were made based on demand forecasts by industrial users, NASA estimates, and other publications. The launches were assumed to be allotted to either the Shuttle or the Ariane. The greatest demand for launch services is expected to come from communications and materials processing payloads, with the latter either becoming a large user or remaining a research item. The number of Shuttle payload equivalents over the reference time span is projected at 84-194, reflecting the large variance that depends on the progress of materials processing operations.
NASA Astrophysics Data System (ADS)
Vilhelmsen, Troels N.; Ferré, Ty P. A.
2016-04-01
Hydrological models are often developed to forecast future behavior in response to natural or human-induced changes in the stresses affecting hydrologic systems. Commonly, these models are conceptualized and calibrated based on existing data and information about the hydrological conditions. However, most hydrologic systems lack sufficient data to constrain models with adequate certainty to support robust decision making. Therefore, a key element of a hydrologic study is the selection of additional data to improve model performance. Given the nature of hydrologic investigations, it is not practical to select data sequentially, i.e., to choose the next observation, collect it, refine the model, and then repeat the process. Rather, for timing and financial reasons, measurement campaigns include multiple wells or sampling points. There is a growing body of literature aimed at defining the expected data worth based on existing models. However, these studies are almost all limited to identifying single additional observations. In this study, we present a methodology for simultaneously selecting multiple potential new observations based on their expected ability to reduce the uncertainty of the forecasts of interest. The methodology is based on linear estimates of the predictive uncertainty, and it can be used to determine the optimal combinations of measurements (location and number) to reduce the uncertainty of multiple predictions. The outcome of the analysis is an estimate of the optimal sampling locations and the optimal number of samples, as well as a probability map showing the locations within the investigated area that are most likely to provide useful information about the forecasts of interest.
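The linear-estimate approach can be illustrated with a first-order, second-moment calculation: the expected forecast-variance reduction from a candidate observation follows from sensitivities alone, before any measurement is actually made. The matrices below are small hypothetical examples, not the paper's model:

```python
# Hedged sketch of linear data-worth analysis: posterior forecast variance
# given an observation Jacobian J, prior parameter covariance C,
# observation-error covariance R, and forecast sensitivity vector g.
import numpy as np

def forecast_variance(J, C, R, g):
    """g' C_post g, with C_post = C - C J'(J C J' + R)^-1 J C."""
    if J is None or len(J) == 0:
        return float(g @ C @ g)
    J = np.atleast_2d(J)
    C_post = C - C @ J.T @ np.linalg.inv(J @ C @ J.T + R) @ J @ C
    return float(g @ C_post @ g)

C = np.diag([1.0, 2.0])        # prior parameter uncertainty (hypothetical)
g = np.array([1.0, 1.0])       # forecast sensitivity to the two parameters
prior_var = forecast_variance(None, C, None, g)
# Candidate observation sensitive only to the more uncertain parameter:
post_var = forecast_variance(np.array([[0.0, 1.0]]), C, np.eye(1) * 0.1, g)
worth = prior_var - post_var   # expected variance reduction
```

Selecting multiple observations simultaneously amounts to stacking candidate rows into J and searching over combinations for the largest joint reduction, which is why interactions between candidate observations matter.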
Muñoz, Antonio-Román; Márquez, Ana Luz; Real, Raimundo
2015-01-01
The rapid ecological shifts that are occurring due to climate change present major challenges for managers and policymakers and, therefore, are one of the main concerns for environmental modelers and evolutionary biologists. Species distribution models (SDM) are appropriate tools for assessing the relationship between species distribution and environmental conditions, and are therefore customarily used to forecast the biogeographical response of species to climate change. A serious limitation of species distribution models when forecasting the effects of climate change is that they normally assume that species behavior and climatic tolerances will remain constant through time. In this study, we propose a new methodology, based on fuzzy logic, useful for incorporating the potential capacity of species to adapt to new conditions into species distribution models. Our results demonstrate that it is possible to include different behavioral responses of species when predicting the effects of climate change on species distribution. Favorability models offered in this study show two extremes: one considering that the species will not modify its present behavior, and another assuming that the species will take full advantage of the possibilities offered by an increase in environmental favorability. This methodology may provide a more realistic approach to the assessment of the consequences of global change on species' distribution and conservation. Overlooking the potential of species' phenotypical plasticity may under- or overestimate the predicted response of species to changes in environmental drivers and its effects on species distribution. Using this approach, we could reinforce the science behind conservation planning in the current situation of rapid climate change. PMID:26120426
An operational procedure for rapid flood risk assessment in Europe
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Kalas, Milan; Salamon, Peter; Bianchi, Alessandra; Alfieri, Lorenzo; Feyen, Luc
2017-07-01
The development of methods for rapid flood mapping and risk assessment is a key step to increase the usefulness of flood early warning systems and is crucial for effective emergency response and flood impact mitigation. Currently, flood early warning systems rarely include real-time components to assess potential impacts generated by forecasted flood events. To overcome this limitation, this study describes the benchmarking of an operational procedure for rapid flood risk assessment based on predictions issued by the European Flood Awareness System (EFAS). Daily streamflow forecasts produced for major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in terms of flood-prone areas, economic damage and affected population, infrastructures and cities. An extensive testing of the operational procedure has been carried out by analysing the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-based and report-based flood extent data, while modelled estimates of economic damage and affected population are compared against ground-based estimations. Finally, we evaluate the skill of risk estimates derived from EFAS flood forecasts with different lead times and combinations of probabilistic forecasts. Results highlight the potential of the real-time operational procedure in helping emergency response and management.
NASA Astrophysics Data System (ADS)
Baklanov, A.; Mahura, A.; Sørensen, J. H.
2003-06-01
There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second during the operation stage. The suggested methodology is applied on an example of the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. It is found that the temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators made it possible to identify the hypothetically impacted geographical regions and territories. In cases of atmospheric transport toward the most populated areas, forecasts of possible consequences during phases of the high and medium potential risk levels, based on a unit hypothetical release (e.g. 1 Bq), are performed. The analysis showed that possible deposition fractions of 10^-11 Bq/m^2 over the Kola Peninsula, and 10^-12 - 10^-13 Bq/m^2 for the remote areas of Scandinavia and Northwest Russia, could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of a nuclear, chemical or biological nature.
The Development of New Solar Indices for use in Thermospheric Density Modeling
NASA Technical Reports Server (NTRS)
Tobiska, W. Kent; Bouwer, S. Dave; Bowman, Bruce R.
2006-01-01
New solar indices have been developed to improve thermospheric density modeling for research and operational purposes. Out of 11 new and 4 legacy indices and proxies, we have selected three (F10.7, S10.7, and M10.7) for use in the new JB2006 empirical thermospheric density model. In this work, we report on the development of these solar irradiance indices. The rationale for their use, their definitions, and their characteristics, including the ISO 21348 spectral category and sub-category, wavelength range, solar source temperature region, solar source feature, altitude region of terrestrial atmosphere absorption at unit optical depth, and terrestrial atmosphere thermal processes in the region of maximum energy absorption, are described. We also summarize, for each solar index, the facility and instrument(s) used to observe the solar emission, the time frame over which the data exist, the measurement cadence, the data latency, and the research as well as operational availability. The new solar indices are provided in forecast (http://SpaceWx.com) as well as real-time and historical (http://sol.spacenvironment.net/jb2006/) time frames. We describe the forecast methodology, compare results with actual data for active and quiet solar conditions, and compare improvements in F10.7 forecasting with legacy High Accuracy Satellite Drag Model (HASDM) and NOAA SEC forecasts.
NASA Technical Reports Server (NTRS)
Chou, Shih-Hung; Zavodsky, Brad; Jedlovec, Gary J.
2009-01-01
In data-sparse regions, remotely sensed observations can be used to improve analyses and produce improved forecasts. One such source comes from the Atmospheric InfraRed Sounder (AIRS), which together with the Advanced Microwave Sounding Unit (AMSU) represents one of the most advanced space-based atmospheric sounding systems. The purpose of this paper is to describe a procedure to optimally assimilate high-resolution AIRS profile data into a regional configuration of the Advanced Research WRF (ARW) version 2.2 using WRF-Var. The paper focuses on the development of background error covariances for the regional domain and background type, and on an optimal methodology for ingesting AIRS temperature and moisture profiles as separate overland and overwater retrievals with different error characteristics. The AIRS thermodynamic profiles are derived from the version 5.0 Earth Observing System (EOS) science team retrieval algorithm and contain information about the quality of each temperature layer. The quality indicators were used to select the highest quality temperature and moisture data for each profile location and pressure level. The analyses were then used to conduct a month-long series of regional forecasts over the continental U.S. The long-term impacts of AIRS profiles on forecasts were assessed against verifying NAM analyses and stage IV precipitation data.
Paschalidou, Anastasia K; Karakitsios, Spyridon; Kleanthous, Savvas; Kassomenos, Pavlos A
2011-02-01
In the present work, two types of artificial neural network (NN) models using the multilayer perceptron (MLP) and the radial basis function (RBF) techniques, as well as a model based on principal component regression analysis (PCRA), are employed to forecast hourly PM10 concentrations in four urban areas (Larnaca, Limassol, Nicosia and Paphos) in Cyprus. The model development is based on a variety of meteorological and pollutant parameters corresponding to the 2-year period between July 2006 and June 2008, and the model evaluation is achieved through the use of a series of well-established evaluation instruments and methodologies. The evaluation reveals that the MLP NN models display the best forecasting performance, with R^2 values ranging between 0.65 and 0.76, whereas the RBF NNs and the PCRA models reveal a rather weak performance, with R^2 values between 0.37-0.43 and 0.33-0.38, respectively. The derived MLP models are also used to forecast Saharan dust episodes with remarkable success (probability of detection ranging between 0.68 and 0.71). On the whole, the analysis shows that the models introduced here could provide local authorities with reliable and precise predictions and alarms about air quality if used on an operational basis.
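Two of the standard evaluation instruments used to compare such models, R^2 for hourly concentrations and probability of detection (POD) for episode alarms, can be sketched as follows (the observation and forecast series are hypothetical, not the Cyprus data):

```python
# Sketch of two forecast-evaluation measures: coefficient of
# determination R^2 and probability of detection (hits / (hits + misses))
# for binary episode flags. All series below are illustrative.

def r_squared(obs, pred):
    mean = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

def pod(obs_events, forecast_events):
    hits = sum(1 for o, f in zip(obs_events, forecast_events) if o and f)
    misses = sum(1 for o, f in zip(obs_events, forecast_events) if o and not f)
    return hits / (hits + misses)

obs = [40.0, 55.0, 80.0, 120.0, 60.0]       # hypothetical hourly PM10
pred = [42.0, 50.0, 85.0, 110.0, 65.0]
r2 = r_squared(obs, pred)
detection = pod([0, 0, 1, 1, 0], [0, 1, 1, 0, 0])  # one hit, one miss
```

POD alone rewards over-warning, which is why operational evaluations pair it with false-alarm measures; the abstract's "series of well-established evaluation instruments" reflects exactly this need for complementary scores.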
Kroezen, Marieke; Van Hoegaerden, Michel; Batenburg, Ronald
2018-02-01
Health workforce (HWF) planning and forecasting is faced with a number of challenges, most notably a lack of consistent terminology, a lack of data, limited model-, demand-based- and future-based planning, and limited inter-country collaboration. The Joint Action on Health Workforce Planning and Forecasting (JAHWF, 2013-2016) aimed to move forward on the HWF planning process and support countries in tackling the key challenges facing the HWF and HWF planning. This paper synthesizes and discusses the results of the JAHWF. It is shown that the JAHWF has provided important steps towards improved HWF planning and forecasting across Europe, among others through the creation of a minimum data set for HWF planning and the 'Handbook on Health Workforce Planning Methodologies across EU countries'. At the same time, the context-sensitivity of HWF planning was repeatedly noticeable in the application of the tools through pilot- and feasibility studies. Further investments should be made by all actors involved to support and stimulate countries in their HWF efforts, among others by implementing the tools developed by the JAHWF in diverse national and regional contexts. Simultaneously, investments should be made in evaluation to build a more robust evidence base for HWF planning methods. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Delaney, C.; Hartman, R. K.; Mendoza, J.; Evans, K. M.; Evett, S.
2016-12-01
Forecast informed reservoir operations (FIRO) is a methodology that incorporates short- to mid-range precipitation or flow forecasts to inform the flood operations of reservoirs. Previous research and modeling for flood control reservoirs has shown that FIRO can reduce flood risk and increase water supply for many reservoirs. The risk-based method of FIRO presents a unique approach that incorporates flow forecasts made by NOAA's California-Nevada River Forecast Center (CNRFC) to model and assess the risk of meeting or exceeding identified management targets or thresholds. Forecasted risk is evaluated against set risk tolerances to set reservoir flood releases. A water management model was developed for Lake Mendocino, a 116,500 acre-foot reservoir located near Ukiah, California. Lake Mendocino is a dual-use reservoir, owned and operated for flood control by the United States Army Corps of Engineers and operated for water supply by the Sonoma County Water Agency. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has been plagued with water supply reliability issues since 2007. FIRO is applied to Lake Mendocino by simulating daily hydrologic conditions from 1985 to 2010 in the Upper Russian River from Lake Mendocino to the City of Healdsburg, approximately 50 miles downstream. The risk-based method is simulated using a 15-day, 61-member streamflow hindcast by the CNRFC. Model simulation results of risk-based flood operations demonstrate a 23% increase in average end-of-water-year (September 30) storage levels over current operations. Model results show no increase in the occurrence of flood damages for points downstream of Lake Mendocino. This investigation demonstrates that FIRO may be a viable flood control operations approach for Lake Mendocino and warrants further investigation through additional modeling and analysis.
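The risk-based decision rule can be sketched as an ensemble exceedance test; the member values, threshold, and tolerance below are hypothetical, not Lake Mendocino operating data:

```python
# Sketch of a risk-based FIRO rule: estimate the probability of exceeding
# a management threshold from an ensemble forecast, and trigger a flood
# release only when that probability exceeds the set risk tolerance.

def exceedance_probability(ensemble_storage, threshold):
    """Fraction of ensemble members at or above the threshold."""
    return sum(1 for s in ensemble_storage if s >= threshold) / len(ensemble_storage)

def flood_release_needed(ensemble_storage, threshold, risk_tolerance):
    return exceedance_probability(ensemble_storage, threshold) > risk_tolerance

# 10 hypothetical ensemble members of forecast storage (thousand acre-feet):
members = [98, 101, 105, 108, 110, 112, 115, 118, 121, 125]
needed = flood_release_needed(members, threshold=116.5, risk_tolerance=0.25)
```

The water-supply benefit arises because releases are withheld whenever the forecast risk stays below tolerance, rather than being made unconditionally whenever storage enters the flood pool.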
NASA Astrophysics Data System (ADS)
Bonaccorso, Brunella; Cancelliere, Antonino
2015-04-01
In the present study, two probabilistic models for short- to medium-term drought forecasting able to include information provided by teleconnection indices are proposed and applied to the Sicily region (Italy). Drought conditions are expressed in terms of the Standardized Precipitation-Evapotranspiration Index (SPEI) at different aggregation time scales. More specifically, a multivariate approach based on the normal distribution is developed in order to estimate: 1) transition probabilities to future SPEI drought classes, and 2) SPEI forecasts at a generic time horizon M, both as functions of past values of the SPEI and the selected teleconnection index. To this end, SPEI series at 3-, 4- and 6-month aggregation time scales for the Sicily region are extracted from the Global SPEI database, SPEIbase, available at the Web repository of the Spanish National Research Council (http://sac.csic.es/spei/database.html), and averaged over the study area. In particular, SPEIbase v2.3, with a spatial resolution of 0.5° lat/lon and temporal coverage between January 1901 and December 2013, is used. A preliminary correlation analysis is carried out to investigate the link between the drought index and different teleconnection patterns, namely the North Atlantic Oscillation (NAO), the Scandinavian (SCA) and the East Atlantic-West Russia (EA-WR) patterns. Results of this analysis indicate a stronger influence of the NAO on drought conditions in Sicily than of the other teleconnection indices. Then, the proposed forecasting methodology is applied, and the forecasting skill of the proposed models is quantitatively assessed through the application of a simple score approach and of performance indices. Results indicate that inclusion of the NAO index generally enhances model performance, thus confirming the suitability of the models for short- to medium-term forecasting of drought conditions.
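Under the multivariate normal assumption, the SPEI at lead M conditioned on the current value is itself normal, which is what makes class-transition probabilities tractable. A minimal sketch with an illustrative lag-M correlation (the value of rho below is hypothetical, not estimated from the Sicily series):

```python
# Sketch of normal-based drought forecasting: for standard-normal margins
# with lag-M correlation rho, SPEI(t+M) | SPEI(t)=x is N(rho*x, 1-rho^2).
# Class-transition probabilities follow from the conditional CDF.
from statistics import NormalDist

def conditional_forecast(x_now: float, rho: float):
    """Mean and std of SPEI(t+M) given SPEI(t) = x_now."""
    return rho * x_now, (1 - rho ** 2) ** 0.5

def prob_drought(x_now: float, rho: float, threshold: float = -1.0):
    """P(SPEI(t+M) <= threshold | SPEI(t) = x_now)."""
    mu, sd = conditional_forecast(x_now, rho)
    return NormalDist(mu, sd).cdf(threshold)

p_from_drought = prob_drought(-1.5, rho=0.7)  # currently in moderate drought
p_from_normal = prob_drought(0.0, rho=0.7)    # currently near normal
```

Conditioning additionally on a teleconnection index such as the NAO extends the same construction to a trivariate normal, shifting the conditional mean by the index's regression contribution.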
Forecast of Frost Days Based on Monthly Temperatures
NASA Astrophysics Data System (ADS)
Castellanos, M. T.; Tarquis, A. M.; Morató, M. C.; Saa-Requejo, A.
2009-04-01
Although frost can cause considerable crop damage and mitigation practices against forecasted frost exist, frost forecasting technologies have not changed for many years. The paper reports a new method to forecast the monthly number of frost days (FD) for several meteorological stations in the Community of Madrid (Spain), based on the successive application of two models. The first is a stochastic model, autoregressive integrated moving average (ARIMA), that forecasts the monthly minimum absolute temperature (tmin) and the monthly average of minimum temperature (tminav) following the Box-Jenkins methodology. The second model relates these monthly temperatures to the distribution of minimum daily temperature during the month. Three ARIMA models were identified for the time series analyzed, with a seasonal period of one year. They share the same seasonal behavior (moving average differenced model) and differ in the non-seasonal part: an autoregressive model (Model 1), a moving average differenced model (Model 2), and an autoregressive and moving average model (Model 3). The results also indicate that the minimum daily temperature (tdmin) at the meteorological stations studied followed a normal distribution each month, with a very similar standard deviation across years. The standard deviation obtained for each station and each month could be used as a risk index for cold months. Applying Model 1 to predict minimum monthly temperatures gave the best FD forecast. This procedure provides a tool for crop managers and crop insurance companies to assess the risk of frost frequency and intensity, so that they can take steps to mitigate frost damage and estimate the damage frost would cause. This research was supported by Comunidad de Madrid Research Project 076/92. The cooperation of the Spanish National Meteorological Institute and the Spanish Ministerio de Agricultura, Pesca y Alimentación (MAPA) is gratefully acknowledged.
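The second step above can be sketched as follows. This is an assumption-laden illustration, not the paper's code: given a forecast of the monthly mean of daily minimum temperature (tminav) and the per-station monthly standard deviation, the expected number of frost days is the number of days times the normal probability that the daily minimum falls below 0 °C. The January values used are hypothetical.

```python
import math

def expected_frost_days(tminav_forecast, sigma, n_days=30):
    """Expected frost days: n_days * P(tdmin < 0 degC) under a normal model."""
    z = (0.0 - tminav_forecast) / (sigma * math.sqrt(2.0))
    p_frost = 0.5 * (1.0 + math.erf(z))   # normal CDF at 0 degC
    return n_days * p_frost

# Hypothetical January forecast: mean daily minimum 1.5 degC, sd 3.0 degC.
fd = expected_frost_days(1.5, 3.0, n_days=31)
```

The same per-month standard deviation doubles as the risk index the abstract mentions: a larger sigma widens the tail below 0 °C and raises the expected frost-day count for the same mean.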
Doos, Lucy; Packer, Claire; Ward, Derek; Simpson, Sue; Stevens, Andrew
2016-03-10
Forecasting can support rational decision-making around the introduction and use of emerging health technologies and prevent investment in technologies that have limited long-term potential. However, forecasting methods need to be credible. We performed a systematic search to identify the methods used in forecasting studies to predict future health technologies within a 3-20-year timeframe. Identification and retrospective assessment of such methods potentially offer a route to more reliable prediction. We systematically searched the literature to identify studies reporting methods of forecasting in healthcare; no human participants were involved. The authors searched MEDLINE, EMBASE, PsychINFO and grey literature sources, and included articles published in English that reported their methods and a list of identified technologies. Eligible studies reported methods used to predict future health technologies within a 3-20-year timeframe, with an identified list of individual healthcare technologies. Commercially sponsored reviews, long-term futurology studies (with over 20-year timeframes) and speculative editorials were excluded. 15 studies met our inclusion criteria. The majority of studies (13/15) consulted experts, either alone or in combination with other methods such as literature searching. Only 2 studies used more complex forecasting tools such as scenario building. The methodological fundamentals of formal 3-20-year prediction are consistent but vary in detail. Further research is needed to ascertain whether the predictions made were accurate and whether accuracy varies by the methods used or by the types of technologies identified. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
State–Space Forecasting of Schistosoma haematobium Time-Series in Niono, Mali
Medina, Daniel C.; Findley, Sally E.; Doumbia, Seydou
2008-01-01
Background Much of the developing world, particularly sub-Saharan Africa, exhibits high levels of morbidity and mortality associated with infectious diseases. The incidence of Schistosoma sp.—which are neglected tropical diseases exposing and infecting more than 500 and 200 million individuals in 77 countries, respectively—is rising because of 1) numerous irrigation and hydro-electric projects, 2) steady shifts from nomadic to sedentary existence, and 3) ineffective control programs. Notwithstanding the colossal scope of these parasitic infections, less than 0.5% of Schistosoma sp. investigations have attempted to predict their spatial and or temporal distributions. Undoubtedly, public health programs in developing countries could benefit from parsimonious forecasting and early warning systems to enhance management of these parasitic diseases. Methodology/Principal Findings In this longitudinal retrospective (01/1996–06/2004) investigation, the Schistosoma haematobium time-series for the district of Niono, Mali, was fitted with general-purpose exponential smoothing methods to generate contemporaneous on-line forecasts. These methods, which are encapsulated within a state–space framework, accommodate seasonal and inter-annual time-series fluctuations. Mean absolute percentage error values were circa 25% for 1- to 5-month horizon forecasts. Conclusions/Significance The exponential smoothing state–space framework employed herein produced reasonably accurate forecasts for this time-series, which reflects the incidence of S. haematobium–induced terminal hematuria. It obliquely captured prior non-linear interactions between disease dynamics and exogenous covariates (e.g., climate, irrigation, and public health interventions), thus obviating the need for more complex forecasting methods in the district of Niono, Mali. Therefore, this framework could assist with managing and assessing S. 
haematobium transmission and intervention impact, respectively, in this district and potentially elsewhere in the Sahel. PMID:18698361
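The on-line exponential smoothing forecasts and the MAPE skill measure reported above can be sketched minimally. This is our illustration, not the study's code: the case-count series and smoothing constant are hypothetical, and a full state-space treatment with seasonal components (e.g. an ETS model) would be used in practice.

```python
import numpy as np

def ses_one_step(y, alpha=0.3):
    """One-step-ahead forecasts from simple exponential smoothing."""
    level = y[0]
    forecasts = []
    for obs in y[1:]:
        forecasts.append(level)               # forecast issued before obs
        level = alpha * obs + (1 - alpha) * level
    return np.array(forecasts)

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Hypothetical monthly case counts.
cases = np.array([120., 135., 150., 160., 140., 130., 155., 170.])
f = ses_one_step(cases)
err = mape(cases[1:], f)
```

Longer horizons are handled the same way: with no trend or seasonal state, the h-step forecast equals the last level, and MAPE is computed per horizon as in the abstract's 1- to 5-month evaluation.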
Real-time short-term forecast of water inflow into Bureyskaya reservoir
NASA Astrophysics Data System (ADS)
Motovilov, Yury
2017-04-01
In recent years, a methodology for operational optimization in hydrosystems incorporating forecasts of the hydrological situation has been developed, using the Bureya reservoir as an example. Improving the accuracy of forecasts of water inflow into the reservoir for planning the water and energy regime was one of the main goals of the research. The Bureya River is the second-largest left tributary of the Amur after the Zeya River, with a watershed of 70.7 thousand square kilometers and a 723-km-long course. A variety of natural conditions, from plains in the southern part to mountainous areas in the north, determines significant spatio-temporal variability in runoff generation patterns and river regime. The Bureyskaya hydropower plant (HPP), with a watershed area of 65.2 thousand square kilometers, is a key station in the Russian Far Eastern energy system, providing its reliable operation. With a spacious reservoir, Bureyskaya HPP makes a significant contribution to the protection of the Amur region from catastrophic floods. A physically-based distributed model of runoff generation based on the ECOMAG (ECOlogical Model for Applied Geophysics) hydrological modeling platform has been developed for the Bureya River basin. The model describes the processes of interception of rainfall/snowfall by the canopy, snow accumulation and melt, soil freezing and thawing, water infiltration into unfrozen and frozen soil, evapotranspiration, the thermal and water regime of soil, and overland, subsurface, ground and river flow. The model's governing equations are derived from integration of the basic hydro- and thermodynamic equations of vertical water and heat transfer in snowpack and frozen/unfrozen soil, horizontal water flow under and over catchment slopes, etc. The model setup for the Bureya River basin included watershed and river network schematization with a GIS module by DEM analysis, meteorological time-series preparation, and model calibration and validation against historical observations.
The results showed good model performance compared to observed inflow data for the Bureya reservoir and a high diagnostic potential of the data-modeling system of runoff formation. With this system, the following flowchart for short-range forecasting of inflow into the Bureya reservoir, with a forecast correction technique using continuously updated hydrometeorological data, has been developed: 1 - the weather observations and forecasts database is renewed daily via the Internet; 2 - runoff is calculated daily from the beginning of the current year to the current date; 3 - a short-range (up to 7 days) forecast is generated based on the weather forecast. The idea underlying the model assimilation of newly obtained hydrometeorological information to adjust short-range hydrological forecasts lies in the assumption that forecast errors are persistent. The difference between calculated and observed streamflow at the forecast release date is then "scattered" with specific weights over the calculated streamflow for the forecast lead time. During 2016, this method for forecasting inflow into the Bureya reservoir up to 7 days ahead was tested in online mode, and a satisfactory short-range inflow forecast success rate was obtained. Tests of the developed method have shown strong sensitivity to the results of short-term precipitation forecasts.
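The error-persistence correction described above can be sketched as follows. This is our illustration under stated assumptions: the geometric decay weights and the flow values are hypothetical, standing in for whatever weighting scheme the operational system uses to "scatter" the release-date error over the lead time.

```python
import numpy as np

def correct_forecast(simulated, observed_at_release, sim_at_release,
                     decay=0.7):
    """Add the release-date error to lead-time flows with fading weights.

    simulated: array of simulated flows for lead days 1..N.
    """
    error = observed_at_release - sim_at_release
    weights = decay ** np.arange(1, len(simulated) + 1)  # fades with lead
    return simulated + error * weights

# Hypothetical 7-day simulated inflow (m^3/s) and release-date mismatch.
sim = np.array([420., 450., 480., 500., 510., 505., 495.])
corrected = correct_forecast(sim, observed_at_release=460.,
                             sim_at_release=430.)
```

With a +30 m³/s mismatch at release time, day 1 is raised by 21 m³/s and day 7 by only about 2.5 m³/s, reflecting the assumption that forecast-error inertia fades over the horizon.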
EU pharmaceutical expenditure forecast.
Urbinati, Duccio; Rémuzat, Cécile; Kornfeld, Åsa; Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Mzoughi, Olfa; Toumi, Mondher
2014-01-01
With constant incentives for healthcare payers to contain their pharmaceutical budgets, forecasting has become critically important. Some countries have, for instance, developed pharmaceutical horizon scanning units. The objective of this project was to build a model to assess the net effect of the entrance of new patented medicinal products versus medicinal products going off-patent, with a defined forecast horizon, on selected European Union (EU) Member States' pharmaceutical budgets. This model took into account population ageing, as well as current and future country-specific pricing, reimbursement, and market access policies (the project was performed for the European Commission; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). In order to have a representative heterogeneity of EU Member States, the following countries were selected for the analysis: France, Germany, Greece, Hungary, Poland, Portugal, and the United Kingdom. A forecasting period of 5 years (2012-2016) was chosen to assess the net pharmaceutical budget impact. A model for generics and biosimilars was developed for each country. The model estimated a separate and combined effect of the direct and indirect impacts of the patent cliff. A second model, estimating the sales development and the risk of development failure, was developed for new drugs. New drugs were reviewed individually to assess their clinical potential and translate it into commercial potential. The forecast was carried out according to three perspectives (healthcare public payer, society, and manufacturer), and several types of distribution chains (retail, hospital, and combined retail and hospital). Probabilistic and deterministic sensitivity analyses were carried out. According to the model, all countries experienced drug budget reductions except Poland (+€41 million). 
Savings were expected to be the highest in the United Kingdom (-€9,367 million), France (-€5,589 million), and, far behind them, Germany (-€831 million), Greece (-€808 million), Portugal (-€243 million), and Hungary (-€84 million). The main source of savings came from the cardiovascular, central nervous system, and respiratory areas and from biosimilar entries. Oncology, immunology, and inflammation, in contrast, led to additional expenditure. The model was particularly sensitive to the time to market of branded products, generic prices, generic penetration, and the distribution of biosimilars. The results of this forecast suggested a decrease in pharmaceutical expenditure over the studied period. The model was sensitive to pharmaceutical policy decisions.
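The off-patent side of the model can be illustrated with a stylized budget-impact calculation. The parameter names and numbers below are hypothetical, not the study's inputs: direct savings come from substitution to discounted generics, and indirect savings from the brand price cut after loss of exclusivity.

```python
def annual_savings(originator_sales, generic_penetration, generic_discount,
                   brand_price_cut):
    """Stylized yearly savings after patent expiry of one molecule."""
    generic_share = originator_sales * generic_penetration
    brand_share = originator_sales * (1.0 - generic_penetration)
    direct = generic_share * generic_discount     # switch to cheaper generics
    indirect = brand_share * brand_price_cut      # brand price erosion
    return direct + indirect

# Hypothetical molecule: EUR 200M sales, 60% generic uptake at a 50%
# discount, 15% mandated brand price cut after loss of exclusivity.
s = annual_savings(200e6, 0.60, 0.50, 0.15)
```

Varying penetration, discount, and price-cut inputs per country is exactly what drives the cross-country differences reported above, and sampling them from distributions yields the probabilistic sensitivity analysis.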
Assessing the effect of increased managed care on hospitals.
Mowll, C A
1998-01-01
This study uses a new relative-risk methodology developed by the author to assess and compare certain performance indicators and determine a hospital's relative degree of financial vulnerability, based on its location, to the effects of increased managed care market penetration. The study also compares nine financial measures to determine whether hospitals in states with a high degree of managed care market penetration experience lower levels of profitability, liquidity, debt service, and overall viability than hospitals in low managed care states. A Managed Care Relative Financial Risk Assessment methodology composed of nine measures of hospital financial and utilization performance is used to develop a high managed care state Composite Index and to determine the Relative Financial Risk and the Overall Risk Ratio for hospitals in a particular state. Additionally, the financial performance of hospitals in the five highest managed care states is compared to that of hospitals in the five lowest states. While data from Colorado and Massachusetts indicate that hospital profitability diminishes as the level of managed care market penetration increases, the overall study results indicate that hospitals in high managed care states demonstrate a better cash position and higher profitability than hospitals in low managed care states. Hospitals in high managed care states are, however, more heavily indebted in relation to equity and have weaker debt service coverage capacity. Moreover, the overall financial health and viability of hospitals in high managed care states is superior to that of hospitals in low managed care states.
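The composite-index comparison above can be sketched arithmetically. This is a stylized illustration, not the study's data or exact method: the measure names, values, and the convention of averaging per-measure ratios against the high-managed-care composite are all assumptions.

```python
# Hypothetical high-managed-care composite benchmark (3 of the 9 measures).
high_mc_composite = {"operating_margin": 0.040,
                     "days_cash_on_hand": 80.0,
                     "debt_service_coverage": 2.5}

# Hypothetical values for hospitals in one state.
state_values = {"operating_margin": 0.030,
                "days_cash_on_hand": 100.0,
                "debt_service_coverage": 2.0}

def overall_risk_ratio(state, composite):
    """Average of per-measure ratios of state values to the composite."""
    ratios = [state[k] / composite[k] for k in composite]
    return sum(ratios) / len(ratios)

r = overall_risk_ratio(state_values, high_mc_composite)
```

A ratio near 1.0 would indicate performance in line with the high-managed-care benchmark; how each measure's direction maps to "risk" (e.g. higher debt meaning higher risk) would need the study's own sign conventions.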
Francisco Rodríguez y Silva; Armando González-Cabán
2013-01-01
The abandonment of land, the high energy load generated and accumulated by vegetation covers, climate change and interface scenarios in Mediterranean forest ecosystems are demanding serious attention to forest fire conditions. This is particularly true when dealing with the budget requirements for undertaking protection programs related to the state of current and...
ERIC Educational Resources Information Center
Potter, Norman R.; Dieterly, Duncan L.
The literature review was undertaken to establish the current status of the methodology for forecasting and assessing technology and for quantizing human resource parameters with respect to the impact of incoming technologies. The review of 140 selected documents applicable to the study was undertaken with emphasis on the identification of methods…
Development of a Statistical Validation Methodology for Fire Weather Indices
Brian E. Potter; Scott Goodrick; Tim Brown
2003-01-01
Fire managers and forecasters must have tools, such as fire indices, to summarize large amounts of complex information. These tools allow them to identify and plan for periods of elevated risk and/or wildfire potential. This need was once met using simple measures like relative humidity or maximum daily temperature (e.g., Gisborne, 1936) to describe fire weather, and...
It has been reported that ambient ozone (O3), either alone or in concurrence with acid rain precursors, accounts for up to 90% of U.S. crop losses resulting from exposure to all major air pollutants. Crop damage due to O3 exposure is of particular concern as...
1982-04-25
the Directorate of Programs (AFLC/XRP), and the Directorate of Logistics Plans and Programs, Aircraft/Missiles Program Division of the Air Staff...OWRM). The P-18 Exhibit/Budget Estimate Submission (BES), a document developed by AFLC/LOR, is reviewed by AFLC/XRP, and is presented to HQ USAF
Araújo, Ricardo de A
2010-12-01
This paper presents a hybrid intelligent methodology to design increasing translation-invariant morphological operators applied to Brazilian stock market prediction (overcoming the random walk dilemma). The proposed Translation Invariant Morphological Robust Automatic phase-Adjustment (TIMRAA) method consists of a hybrid intelligent model composed of a Modular Morphological Neural Network (MMNN) with a Quantum-Inspired Evolutionary Algorithm (QIEA), which searches for the best time lags to reconstruct the phase space of the phenomenon generating the time series and determines the initial (sub-optimal) parameters of the MMNN. Each individual of the QIEA population is further trained by the Back Propagation (BP) algorithm to improve the MMNN parameters supplied by the QIEA. For each prediction model generated, the method also uses a behavioral statistical test and a phase-fix procedure to adjust time-phase distortions observed in stock market time series. Furthermore, an experimental analysis is conducted with the proposed method on four Brazilian stock market time series, and the achieved results are discussed and compared to results found with random walk models and the previously introduced Time-delay Added Evolutionary Forecasting (TAEF) and Morphological-Rank-Linear Time-lag Added Evolutionary Forecasting (MRLTAEF) methods. Copyright © 2010 Elsevier Ltd. All rights reserved.
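The time-lag phase-space reconstruction that the QIEA searches over can be sketched as a delay embedding. This is a simplified illustration under stated assumptions: the lag set is fixed here rather than evolved, the sine series is a toy stand-in for prices, and the morphological network built on top of these delay vectors is omitted.

```python
import numpy as np

def delay_embed(series, lags):
    """Build delay vectors [x[t-l] for l in lags] and targets x[t]."""
    max_lag = max(lags)
    rows, targets = [], []
    for t in range(max_lag, len(series)):
        rows.append([series[t - l] for l in lags])   # predictor vector
        targets.append(series[t])                    # value to predict
    return np.array(rows), np.array(targets)

series = np.sin(np.linspace(0.0, 20.0, 200))   # toy stand-in for prices
X, y = delay_embed(series, lags=[1, 2, 5])
```

In the paper's setting, the evolutionary algorithm scores candidate lag sets by the forecasting error of the model trained on the resulting (X, y) pairs, keeping the set that best reconstructs the generating dynamics.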
"Total Deposition (TDEP) Maps" | Science Inventory | US EPA
The presentation provides an update on the use of a hybrid methodology that relies on measured values from national monitoring networks and modeled values from CMAQ to produce maps of total deposition for use in critical loads and other ecological assessments. Additionally, deposition values from the hybrid approach are compared with deposition estimates from other methodologies. The National Exposure Research Laboratory (NERL) Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of the EPA's mission to protect human health and the environment. AMAD's research program is engaged in developing and evaluating predictive atmospheric models on all spatial and temporal scales for forecasting air quality and for assessing changes in air quality and air pollutant exposures, as affected by changes in ecosystem management and regulatory decisions. AMAD is responsible for providing a sound scientific and technical basis for regulatory policies based on air quality models to improve ambient air quality. The models developed by AMAD are being used by EPA, NOAA, and the air pollution community in understanding and forecasting not only the magnitude of the air pollution problem, but also in developing emission control policies and regulations for air quality improvements.
Assessing methods for developing crop forecasting in the Iberian Peninsula
NASA Astrophysics Data System (ADS)
Ines, A. V. M.; Capa Morocho, M. I.; Baethgen, W.; Rodriguez-Fonseca, B.; Han, E.; Ruiz Ramos, M.
2015-12-01
Seasonal climate prediction may allow predicting crop yield to reduce the vulnerability of agricultural production to climate variability and its extremes. It has already been demonstrated that seasonal climate predictions at the European (or Iberian) scale from ensembles of global coupled climate models have some skill (Palmer et al., 2004). The limited predictability that the atmosphere exhibits in mid-latitudes, and therefore over the Iberian Peninsula (IP), can be managed by a probabilistic approach based on terciles. This study presents an application to the IP of two methods for linking tercile-based seasonal climate forecasts with crop models to improve crop predictability. Two methods were evaluated and applied for disaggregating seasonal rainfall forecasts into daily weather realizations: 1) a stochastic weather generator and 2) a forecast tercile resampler. Both methods were evaluated in a case study in which the impacts of two seasonal rainfall forecasts (a wet and a dry forecast, for 1998 and 2015 respectively) on rainfed wheat yield and irrigation requirements of maize in the IP were analyzed. Simulated wheat yield and irrigation requirements of maize were computed with the crop models CERES-wheat and CERES-maize, which are included in the Decision Support System for Agrotechnology Transfer (DSSAT v.4.5, Hoogenboom et al., 2010). Simulations were run at several locations in Spain where the crop model was calibrated and validated with independent field data. These methodologies would allow quantifying the benefits and risks of a seasonal climate forecast for potential users such as farmers, agroindustry and insurance companies in the IP. We would therefore be able to establish early warning systems and design crop management adaptation strategies that take advantage of favorable conditions or reduce the effect of adverse ones. References: Palmer, T. et al., 2004. Development of a European multimodel ensemble system for seasonal-to-interannual prediction (DEMETER).
Bulletin of the American Meteorological Society, 85(6): 853-872.
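A forecast tercile resampler, one of the two disaggregation methods above, can be sketched minimally. This is our illustration with hypothetical data: historical seasons are binned into terciles of seasonal rainfall, and whole seasons are then resampled with the forecast tercile probabilities, so the selected seasons' daily weather can drive the crop model.

```python
import numpy as np

rng = np.random.default_rng(0)
seasonal_rain = rng.gamma(4.0, 50.0, size=30)      # 30 historical seasons (mm)
q1, q2 = np.quantile(seasonal_rain, [1 / 3, 2 / 3])
terciles = [np.where(seasonal_rain <= q1)[0],                          # dry
            np.where((seasonal_rain > q1) & (seasonal_rain <= q2))[0], # normal
            np.where(seasonal_rain > q2)[0]]                           # wet

p_forecast = [0.2, 0.3, 0.5]   # hypothetical dry/normal/wet forecast

def resample_seasons(n=500):
    """Draw historical season indices according to forecast probabilities."""
    cats = rng.choice(3, size=n, p=p_forecast)
    return np.array([rng.choice(terciles[c]) for c in cats])

years = resample_seasons()
```

Running the crop model over the resampled seasons yields a yield distribution conditioned on the seasonal forecast, which is what lets users quantify the benefits and risks mentioned above.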
NASA Technical Reports Server (NTRS)
Zavodsky, Bradley T.; Case, Jonathan L.; Molthan, Andrew L.
2012-01-01
The Short-term Prediction Research and Transition (SPoRT) Center is a collaborative partnership between NASA and operational forecasting partners, including a number of National Weather Service forecast offices. SPoRT provides real-time NASA products and capabilities to help its partners address specific operational forecast challenges. One challenge that forecasters face is using guidance from local and regional deterministic numerical models configured at convection-allowing resolution to help assess a variety of mesoscale/convective-scale phenomena such as sea-breezes, local wind circulations, and mesoscale convective weather potential on a given day. While guidance from convection-allowing models has proven valuable in many circumstances, the potential exists for model improvements by incorporating more representative land-water surface datasets, and by assimilating retrieved temperature and moisture profiles from hyper-spectral sounders. In order to help increase the accuracy of deterministic convection-allowing models, SPoRT produces real-time, 4-km CONUS forecasts using a configuration of the Weather Research and Forecasting (WRF) model (hereafter SPoRT-WRF) that includes unique NASA products and capabilities including 4-km resolution soil initialization data from the Land Information System (LIS), 2-km resolution SPoRT SST composites over oceans and large water bodies, high-resolution real-time Green Vegetation Fraction (GVF) composites derived from the Moderate-resolution Imaging Spectroradiometer (MODIS) instrument, and retrieved temperature and moisture profiles from the Atmospheric Infrared Sounder (AIRS) and Infrared Atmospheric Sounding Interferometer (IASI). NCAR's Model Evaluation Tools (MET) verification package is used to generate statistics of model performance compared to in situ observations and rainfall analyses for three months during the summer of 2012 (June-August). 
Detailed analyses of specific severe weather outbreaks during the summer will be presented to assess the potential added-value of the SPoRT datasets and data assimilation methodology compared to a WRF configuration without the unique datasets and data assimilation.
NASA Astrophysics Data System (ADS)
Aoi, S.; Yamamoto, N.; Suzuki, W.; Hirata, K.; Nakamura, H.; Kunugi, T.; Kubo, T.; Maeda, T.
2015-12-01
In the 2011 Tohoku earthquake, in which the huge tsunami claimed a great many lives, the initial tsunami forecast, based on hypocenter information estimated using seismic data on land, was greatly underestimated. From this lesson, NIED is now constructing S-net (Seafloor Observation Network for Earthquakes and Tsunamis along the Japan Trench), which consists of 150 ocean bottom observatories with seismometers and pressure gauges (tsunamimeters) linked by fiber optic cables. To take full advantage of S-net, we develop a new methodology for real-time tsunami inundation forecasting using ocean bottom observation data and construct a prototype system that implements the developed forecasting method for the Pacific coast of Chiba prefecture (Sotobo area). We employ a database-based approach because inundation is a strongly non-linear phenomenon and its calculation costs are rather heavy. We prepare a tsunami scenario bank in advance by constructing the possible tsunami sources and calculating the tsunami waveforms at S-net stations, coastal tsunami heights, and tsunami inundation on land. To calculate the inundation for the target Sotobo area, we construct a 10-m-mesh precise elevation model with coastal structures. Based on sensitivity analyses, we construct a tsunami scenario bank that efficiently covers the possible tsunami scenarios affecting the Sotobo area. A real-time forecast is carried out by selecting from the tsunami scenario bank the several possible scenarios that best explain the real-time tsunami data observed at S-net. An advantage of our method is that tsunami inundations are estimated directly from the actual tsunami data without any source information, which may have large estimation errors. In addition to the forecast system, we develop Web services, APIs, and smartphone applications, and refine them through social experiments, to provide real-time tsunami observation and forecast information in an easy-to-understand way, with the aim of urging people to evacuate.
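The scenario-selection step above can be sketched as a nearest-neighbor search over the precomputed bank. This is entirely illustrative: the bank size, station count, and root-mean-square misfit criterion are assumptions, with random waveforms standing in for simulated pressure-gauge records.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical bank: 1000 scenarios x 6 stations x 120 time samples.
bank = rng.normal(size=(1000, 6, 120))
true_idx = 42
# "Observed" waveforms: the true scenario plus small measurement noise.
observed = bank[true_idx] + rng.normal(scale=0.05, size=(6, 120))

def select_scenarios(obs, bank, k=3):
    """Return the k scenarios with smallest RMS misfit to the observations."""
    misfit = np.sqrt(((bank - obs) ** 2).mean(axis=(1, 2)))
    return np.argsort(misfit)[:k]

best = select_scenarios(observed, bank)
```

Because each bank entry already carries precomputed coastal heights and inundation maps, reporting the inundation of the best-matching scenarios gives a forecast directly from the offshore data, with no source inversion in the real-time path.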
NASA Astrophysics Data System (ADS)
Michael, A. J.; Field, E. H.; Hardebeck, J.; Llenos, A. L.; Milner, K. R.; Page, M. T.; Perry, S. C.; van der Elst, N.; Wein, A. M.
2016-12-01
After the Mw 5.8 Pawnee, Oklahoma, earthquake of September 3, 2016, the USGS issued a series of aftershock forecasts for the next month and year. These forecasts were aimed at the emergency response community, those making decisions about well operations in the affected region, and the general public. The forecasts were generated manually using methods planned for automatically released Operational Aftershock Forecasts. The underlying method is from Reasenberg and Jones (Science, 1989), with improvements recently published in Page et al. (BSSA, 2016), implemented in a Java graphical user interface and presented in a template that is under development. The methodological improvements include initial models based on the tectonic regime as defined by Garcia et al. (BSSA, 2012) and the inclusion of both uncertainty in the clustering parameters and natural random variability. We did not utilize the time-dependent magnitude-of-completeness model from Page et al. because it applies only to teleseismic events recorded by NEIC. The parameters for Garcia's Generic Active Continental Region underestimated the modified-Omori decay parameter and underestimated the aftershock rate by a factor of 2, and the sequence following the Mw 5.7 Prague, Oklahoma, earthquake of November 6, 2011 was about 3 to 4 times more productive than the Pawnee sequence. The high productivity of these potentially induced sequences is consistent with an increase in productivity in Oklahoma since 2009 (Llenos and Michael, BSSA, 2013) and makes a general tectonic model inapplicable to sequences in this region. Soon after the mainshock occurred, the forecasts relied on the sequence-specific parameters. After one month, the Omori decay parameter p is less than one, implying a very long-lived sequence. However, the decay parameter is known to be biased low at early times due to secondary aftershock triggering, and the p-value determined early in the sequence may be inaccurate for long-term forecasting.
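The Reasenberg-Jones rate model underlying these forecasts can be sketched as follows. The parameter values here are generic placeholders, not the Oklahoma sequence-specific fits discussed above: the rate of aftershocks of magnitude ≥ M at time t after a mainshock of magnitude Mm is 10^(a + b(Mm − M)) / (t + c)^p, and expected counts come from integrating that rate over the forecast window.

```python
import math

def rj_rate(t, M, Mm, a=-1.67, b=0.91, p=1.08, c=0.05):
    """Reasenberg-Jones rate (per day) of aftershocks >= M, t days after Mm."""
    return 10.0 ** (a + b * (Mm - M)) / (t + c) ** p

def expected_count(t1, t2, M, Mm, n=10000):
    """Expected number of aftershocks >= M in [t1, t2] days (trapezoid rule)."""
    dt = (t2 - t1) / n
    ts = [t1 + i * dt for i in range(n + 1)]
    vals = [rj_rate(t, M, Mm) for t in ts]
    return dt * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

# Hypothetical: M >= 3 aftershocks in the week after a Mw 5.8 mainshock.
n_week = expected_count(0.0, 7.0, 3.0, 5.8)
```

Treating counts as Poisson with this expected value gives the probability statements in the public forecasts; the abstract's point is that generic (a, p) values misfit induced Oklahoma sequences, so sequence-specific fits were substituted once enough aftershocks were recorded.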
Super Ensemble-based Aviation Turbulence Guidance (SEATG) for Air Traffic Management (ATM)
NASA Astrophysics Data System (ADS)
Kim, Jung-Hoon; Chan, William; Sridhar, Banavar; Sharman, Robert
2014-05-01
Super Ensemble (ensemble of ten turbulence metrics from time-lagged ensemble members of weather forecast data)-based Aviation Turbulence Guidance (SEATG) is developed using Weather Research and Forecasting (WRF) model and in-situ eddy dissipation rate (EDR) observations equipped on commercial aircraft over the contiguous United States. SEATG is a sequence of five procedures including weather modeling, calculating turbulence metrics, mapping EDR-scale, evaluating metrics, and producing final SEATG forecast. This uses similar methodology to the operational Graphic Turbulence Guidance (GTG) with three major improvements. First, SEATG use a higher resolution (3-km) WRF model to capture cloud-resolving scale phenomena. Second, SEATG computes turbulence metrics for multiple forecasts that are combined at the same valid time resulting in an time-lagged ensemble of multiple turbulence metrics. Third, SEATG provides both deterministic and probabilistic turbulence forecasts to take into account weather uncertainties and user demands. It is found that the SEATG forecasts match well with observed radar reflectivity along a surface front as well as convectively induced turbulence outside the clouds on 7-8 Sep 2012. And, overall performance skill of deterministic SEATG against the observed EDR data during this period is superior to any single turbulence metrics. Finally, probabilistic SEATG is used as an example application of turbulence forecast for air-traffic management. In this study, a simple Wind-Optimal Route (WOR) passing through the potential areas of probabilistic SEATG and Lateral Turbulence Avoidance Route (LTAR) taking into account the SEATG are calculated at z = 35000 ft (z = 12 km) from Los Angeles to John F. Kennedy international airports. 
As a result, the WOR takes a total of 239 minutes, including 16 minutes within SEATG areas having a 40% potential for moderate turbulence, while the LTAR takes a total of 252 minutes of travel time, with about 5% additional fuel consumed to entirely avoid the moderate SEATG regions.
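The deterministic/probabilistic combination described above can be sketched in a few lines: given EDR-scaled metrics from several time-lagged members, the deterministic product is an ensemble mean and the probabilistic product is an exceedance fraction. The array shapes, random test data, and the moderate-turbulence EDR threshold of 0.22 are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

# Hypothetical EDR-scaled turbulence metrics: (members, metrics, ny, nx).
rng = np.random.default_rng(0)
edr = rng.gamma(shape=2.0, scale=0.1, size=(6, 10, 4, 5))

# Deterministic SEATG: mean over members and metrics at each grid point.
seatg_det = edr.mean(axis=(0, 1))

# Probabilistic SEATG: fraction of member-metric pairs exceeding a
# moderate-turbulence threshold (EDR ~ 0.22 is a commonly used convention).
MODERATE = 0.22
seatg_prob = (edr >= MODERATE).mean(axis=(0, 1))

print(seatg_det.shape, float(seatg_prob.min()), float(seatg_prob.max()))
```

A route planner like the LTAR described above would then treat grid cells where `seatg_prob` exceeds a chosen probability (e.g. 40%) as regions to avoid.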
NASA Astrophysics Data System (ADS)
Gelmini, A.; Gottardi, G.; Moriyama, T.
2017-10-01
This work presents an innovative computational approach for the inversion of wideband ground penetrating radar (GPR) data. The retrieval of the dielectric characteristics of sparse scatterers buried in a lossy soil is performed by combining a multi-task Bayesian compressive sensing (MT-BCS) solver and a frequency hopping (FH) strategy. The developed methodology is able to benefit from the regularization capabilities of the MT-BCS as well as to exploit the multi-chromatic informative content of GPR measurements. A set of numerical results is reported in order to assess the effectiveness of the proposed GPR inverse scattering technique, as well as to compare it to a simpler single-task implementation.
Methods for Improving In Vitro and In Vivo Boar Sperm Fertility.
Funahashi, H
2015-07-01
The fertility of boar spermatozoa changes after ejaculation, both in vivo and in vitro. During processing for in vitro fertilization (IVF), although spermatozoa are induced to undergo capacitation, resulting in a high penetration rate, polyspermic penetration remains a persistent obstacle observed with high incidence. For artificial insemination (AI), we still need a large number of spermatozoa and lose the majority of them in the female reproductive tract. The fertility of cryopreserved boar spermatozoa is still impaired by the freezing and thawing process. In the present brief review, factors affecting the fertility of boar sperm during IVF, AI, and cryopreservation are discussed in the context of discovering methodologies to improve it. © 2015 Blackwell Verlag GmbH.
NASA Technical Reports Server (NTRS)
Farr, Rebecca A.; Chang, Chau-Lyan; Jones, Jess H.; Dougherty, N. Sam
2015-01-01
Classic tonal screech noise created by under-expanded supersonic jets; Long Penetration Mode (LPM) supersonic phenomenon - under-expanded counter-flowing jet in supersonic free stream - demonstrated in several wind tunnel tests - modeled in several computational fluid dynamics (CFD) simulations; Discussion of LPM acoustic feedback and fluid interactions - analogous to the aero-acoustic interactions seen in screech jets; Lessons learned: applying certain methodologies to LPM - developed and successfully demonstrated in the study of screech jets - discussion of mechanically induced excitation in fluid oscillators in general; Conclusions - the large body of work done on jet screech and other aero-acoustic phenomena can have direct application to the study and applications of LPM cold-flow jets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Fei; Nagarajan, Adarsh; Baggu, Murali
This paper evaluated the impact of the smart inverter Volt-VAR function on voltage-reduction energy savings and power quality in electric power distribution systems. A methodology to implement voltage-reduction optimization was developed by controlling the substation LTC and capacitor banks and having smart inverters participate through their autonomous Volt-VAR control. In addition, a power quality scoring methodology was proposed and utilized to quantify the effect on distribution system power quality. All of these methodologies were applied to a utility distribution system model to evaluate the voltage-reduction energy savings and power quality under various PV penetrations and smart inverter densities.
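The autonomous Volt-VAR control mentioned above is conventionally a piecewise-linear droop curve mapping terminal voltage to reactive power. A minimal sketch follows; the breakpoints and the 0.44 p.u. reactive limit loosely follow the IEEE 1547-2018 default category B shape and are illustrative, not the settings used in the paper's study.

```python
def volt_var_q(v_pu, q_max_pu=0.44):
    """Piecewise-linear autonomous Volt-VAR curve (illustrative breakpoints).
    Returns reactive power in per-unit of rated VA; positive = injection."""
    # (voltage p.u., Q p.u.) breakpoints with a deadband around 1.0 p.u.
    pts = [(0.92, q_max_pu), (0.98, 0.0), (1.02, 0.0), (1.08, -q_max_pu)]
    if v_pu <= pts[0][0]:
        return pts[0][1]
    if v_pu >= pts[-1][0]:
        return pts[-1][1]
    for (v1, q1), (v2, q2) in zip(pts, pts[1:]):
        if v1 <= v_pu <= v2:
            # Linear interpolation between adjacent breakpoints.
            return q1 + (q2 - q1) * (v_pu - v1) / (v2 - v1)

print(volt_var_q(1.00))            # inside deadband -> 0.0
print(round(volt_var_q(1.05), 3))  # high voltage -> absorbs VARs (-0.22)
```

In a voltage-reduction scheme like the one described, lowering the LTC setpoint pushes feeder voltages down while inverters on this curve release VAR absorption, flattening the voltage profile along the feeder.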
Cloud Forecasting and 3-D Radiative Transfer Model Validation using Citizen-Sourced Imagery
NASA Astrophysics Data System (ADS)
Gasiewski, A. J.; Heymsfield, A.; Newman Frey, K.; Davis, R.; Rapp, J.; Bansemer, A.; Coon, T.; Folsom, R.; Pfeufer, N.; Kalloor, J.
2017-12-01
Cloud radiative feedback mechanisms are one of the largest sources of uncertainty in global climate models. Variations in local 3-D cloud structure impact the interpretation of NASA CERES and MODIS data for top-of-atmosphere radiation studies over clouds. Much of this uncertainty results from lack of knowledge of cloud vertical and horizontal structure. Surface-based data on 3-D cloud structure from a multi-sensor array of low-latency ground-based cameras can be used to intercompare radiative transfer models based on MODIS and other satellite data with CERES data to improve 3-D cloud parameterizations. Closely related, forecasting of solar insolation and associated cloud cover on time scales out to 1 hour and with spatial resolution of 100 meters is valuable for stabilizing power grids with high solar photovoltaic penetrations. Cloud-advection-based solar insolation forecasting data, obtained from a bottom-up perspective with the spatial resolution and latency needed to predict high-ramp-rate events, are strongly correlated with cloud-induced fluctuations. The development of grid management practices for improved integration of renewable solar energy thus also benefits from a multi-sensor camera array. The data needs for both 3-D cloud radiation modeling and solar forecasting are being addressed using a network of low-cost upward-looking visible-light CCD sky cameras positioned at 2 km spacing over an area 30-60 km in size, acquiring imagery at 30-second intervals. Such cameras can be manufactured in quantity, deployed by citizen volunteers at a marginal cost of 200-400, and operated unattended using existing communications infrastructure. A trial phase to understand the potential utility of up-looking multi-sensor visible imagery is underway within this NASA Citizen Science project.
To develop the initial data sets necessary to optimally design a multi-sensor cloud camera array, a team of 100 citizen scientists using self-owned PDA cameras is being organized to collect distributed cloud data sets suitable for MODIS-CERES cloud radiation science and solar forecasting algorithm development. A low-cost and robust sensor design suitable for large-scale fabrication and long-term deployment has been developed during the project prototyping phase.
Online Analysis of Wind and Solar Part II: Transmission Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian
2012-01-31
To facilitate wider penetration of renewable resources without compromising system reliability, given concerns arising from the limited predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.
The IEA/ORAU Long-Term Global Energy- CO2 Model: Personal Computer Version A84PC
Edmonds, Jae A.; Reilly, John M.; Boden, Thomas A. [CDIAC]; Reynolds, S. E. [CDIAC]; Barns, D. W.
1995-01-01
The IBM A84PC version of the Edmonds-Reilly model has the capability to calculate both CO2 and CH4 emission estimates by source and region. Population, labor productivity, end-use energy efficiency, income effects, price effects, resource base, technological change in energy production, environmental costs of energy production, market-penetration rate of energy-supply technology, solar and biomass energy costs, synfuel costs, and the number of forecast periods may be interactively inspected and altered, producing a variety of global and regional CO2 and CH4 emission scenarios for 1975 through 2100. Users are strongly encouraged to see our instructions for downloading, installing, and running the model.
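Among the adjustable inputs listed above is the market-penetration rate of energy-supply technologies. Long-term energy models conventionally represent such penetration paths as a logistic S-curve; the sketch below shows that standard form over the model's 1975-2100 horizon. The midpoint year, rate, and cap are invented for illustration and are not the model's actual parameterization.

```python
import math

def penetration(year, t_mid=2030.0, rate=0.15, cap=1.0):
    """Logistic market share of a new energy-supply technology:
    cap / (1 + exp(-rate * (year - t_mid))). All parameters illustrative."""
    return cap / (1.0 + math.exp(-rate * (year - t_mid)))

# Share is near zero at the start of the horizon, half at the midpoint
# year, and approaches the cap by the end of the horizon.
for y in (1975, 2030, 2100):
    print(y, round(penetration(y), 3))
```

Varying `t_mid` and `rate` interactively is what lets a user of such a model generate faster- or slower-transition emission scenarios.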
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Andrew; Wiser, Ryan
2012-05-18
We estimate the long-run economic value of variable renewable generation with increasing penetration using a unique investment and dispatch model that captures long-run investment decisions while also incorporating detailed operational constraints and hourly time resolution over a full year. High time resolution and the incorporation of operational constraints are important for estimating the economic value of variable generation, as is the use of a modeling framework that accommodates new investment decisions. The model is herein applied with a case study that is loosely based on California in 2030. Increasing amounts of wind, photovoltaics (PV), and concentrating solar power (CSP) with and without thermal energy storage (TES) are added one at a time. The marginal economic value of these renewable energy sources is estimated and then decomposed into capacity value, energy value, day-ahead forecast error cost, and ancillary services. The marginal economic value, as defined here, is primarily based on the combination of avoided capital investment cost and avoided variable fuel and operations and maintenance costs from other power plants in the power system. Though the model only captures a subset of the benefits and costs of renewable energy, it nonetheless provides unique insights into how the value of that subset changes with technology and penetration level. Specifically, in this case study implementation of the model, the marginal economic value of all three solar options is found to exceed the value of a flat block of power (as well as wind energy) by $20-30/MWh at low penetration levels, largely due to the high capacity value of solar at low penetration. Because the value of CSP per unit of energy is found to be high with or without thermal energy storage at low penetration, we find little apparent incremental value to thermal storage at low solar penetration in the present case study analysis.
The marginal economic value of PV and CSP without thermal storage is found to drop considerably (by more than $70/MWh) as the penetration of solar increases toward 30% on an energy basis. This is due primarily to a steep drop in capacity value followed by a decrease in energy value. In contrast, the value of CSP with thermal storage drops much less dramatically as penetration increases. As a result, at solar penetration levels above 10%, CSP with thermal storage is found to be considerably more valuable relative to PV and CSP without thermal storage. The marginal economic value of wind is found to be largely driven by energy value, and is lower than that of solar at low penetration. The marginal economic value of wind drops at a relatively slower rate with penetration, however. As a result, at high penetration, the value of wind can exceed the value of PV and CSP without thermal storage. Though some of these findings may be somewhat unique to the specific case study presented here, the results: (1) highlight the importance of an analysis framework that addresses long-term investment decisions as well as short-term dispatch and operational constraints, (2) can help inform long-term decisions about renewable energy procurement and supporting infrastructure, and (3) point to areas where further research is warranted.
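The decomposition described above (capacity value plus energy value plus ancillary services, less day-ahead forecast error cost) can be written out directly. The dollar figures below are invented solely to illustrate the mechanism the abstract reports, namely that the drop in solar value with penetration is dominated by the capacity-value term; they are not the study's results.

```python
def marginal_value(capacity, energy, forecast_err_cost, ancillary):
    """Marginal economic value in $/MWh: value components add,
    day-ahead forecast error is a cost and subtracts."""
    return capacity + energy - forecast_err_cost + ancillary

# Hypothetical PV at low vs. high solar penetration ($/MWh).
pv_low  = marginal_value(capacity=35.0, energy=55.0, forecast_err_cost=1.5, ancillary=0.5)
pv_high = marginal_value(capacity=4.0,  energy=42.0, forecast_err_cost=2.5, ancillary=0.5)

# The value decline is dominated by the collapse in capacity value.
print(round(pv_low - pv_high, 1))  # 45.0
```

CSP with thermal storage retains its capacity-value term at high penetration because stored heat lets it generate during net-load peaks, which is why its value curve declines much less steeply in the study.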